Meta recently made some major improvements to hand tracking on the Quest platform. One of the newer affordances is “direct touch” on 2D UI elements. You can see a simple demo here where I’m interacting with native Quest UI while doing some WebXR dev (using Babylon JS of course!)
This got me thinking: is there a way to interact with a 2D GUI using hand tracking in Babylon JS? By default, hand tracking uses a ray and pointer with a pinch gesture to simulate the click. It would be very interesting to build UI that we could just tap with a fingertip.
I know we can easily code these interactions with 3D meshes and with some of the 3D/holographic GUI elements, but this question is mainly about the 2D GUI.
Update: To be clear about what I mean: I’m talking about using hand tracking inside an immersive WebXR session to touch and interact with 2D GUI controls from an Advanced Dynamic Texture on the surface of a mesh.
For example, if I reach out and touch the close button with hand tracking, nothing happens.
3D touch does not work outside of XR mode, so maybe not in the way you are showing. Those are OS hands in your video, not WebXR hands.
There is ultimately a mesh behind the 2D GUI, so maybe it could be made so that buttons can be clicked. Draggable stuff like sliders might be tricky.
I think I was unclear, so I updated the question. The video I linked was an example of direct touch in a place where it already works: the Quest Home environment. I know that WebXR hand tracking only works in immersive mode, so I’m not suggesting anything about flat 3D mode outside of an immersive session. (Actually, screen-space UI already works there, since Meta simply simulates tap/pointer events.)
I also know that I can just use collision detection to find out when two 3D objects are overlapping. I’m talking very specifically about 2D GUI controls on an Advanced Dynamic Texture on the surface of a mesh. See the image below for an example of the type of UI I’m talking about.
If I reach out to tap the close button, nothing will happen.
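For reference, the kind of setup I mean looks roughly like this (just a sketch, not a full playground; `panel` and `closeButton` are placeholder names):

```typescript
// Rough sketch of the setup in question: a 2D GUI drawn onto a mesh.
const panel = BABYLON.MeshBuilder.CreatePlane("panel", { width: 0.4, height: 0.3 }, scene);

// Advanced Dynamic Texture with a simple close button on it.
const adt = BABYLON.GUI.AdvancedDynamicTexture.CreateForMesh(panel);
const closeButton = BABYLON.GUI.Button.CreateSimpleButton("close", "Close");
closeButton.width = "120px";
closeButton.height = "60px";
closeButton.onPointerClickObservable.add(() => panel.setEnabled(false));
adt.addControl(closeButton);

// In an immersive session with hand tracking, the ray + pinch gesture clicks this button,
// but reaching out and tapping it with a fingertip does nothing.
```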
This morning I had a couple of ideas of how this could work.
Keep the existing ray/pointer system in place, but add a way to detect when the hand or finger is very close to the button/control (parent mesh), then just use notifyObservers to fire the click event. This is a bit of a hacky idea, but it could let us keep using the current hand/ray system for far interactions while simulating touch interactions for nearby objects.
Less ideal: add hidden meshes around the shape of each 2D button and fire the click observer when the hand intersects one of them. This is a bit more obvious, but it also adds extra work for every interactive 2D control.
I’m going to start tinkering with the first idea. I just need to find out if there is a way for the control to measure the distance to a hand that is casting a ray.
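Here’s a very rough, untested sketch of the first idea, reusing `panel` and `closeButton` from the snippet above. It assumes `xr` is the default XR experience helper, and the 2 cm threshold is just a guess:

```typescript
// Idea 1 sketch (untested): watch the index fingertip and, when it gets close enough
// to the GUI mesh, fire the button's click observable directly via notifyObservers.
const handTracking = xr.baseExperience.featuresManager.enableFeature(
  BABYLON.WebXRFeatureName.HAND_TRACKING, "latest", { xrInput: xr.input }
) as BABYLON.WebXRHandTracking;

const TOUCH_THRESHOLD = 0.02; // metres; arbitrary, would need tuning
let touching = false;

handTracking.onHandAddedObservable.add((hand) => {
  const tip = hand.getJointMesh(BABYLON.WebXRHandJoint.INDEX_FINGER_TIP);
  scene.onBeforeRenderObservable.add(() => {
    // Crude: uses the panel's centre as a stand-in for the button's position.
    const distance = BABYLON.Vector3.Distance(tip.getAbsolutePosition(), panel.getAbsolutePosition());
    if (distance < TOUCH_THRESHOLD && !touching) {
      touching = true;
      // Simulate the tap; the existing far ray/pinch interaction keeps working as before.
      closeButton.onPointerClickObservable.notifyObservers(
        new BABYLON.GUI.Vector2WithInfo(BABYLON.Vector2.Zero(), 0)
      );
    } else if (distance >= TOUCH_THRESHOLD) {
      touching = false;
    }
  });
});
```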
In the MRTK documentation I found a mention of WebXRNearInteraction:
The WebXRNearInteraction feature is enabled by default on the Babylon playground, and allows for touch interactions when using hands or motion controllers. Touch-supported objects will be able to dynamically react to 3D input, creating a more immersive experience than what basic buttons can provide.
…
To provide support for Near Interaction to an arbitrary AbstractMesh, simply set the isNearPickable and/or isNearGrabbable properties.
This works well for meshes, but not for the 2D controls on an Advanced Dynamic Texture. Is there a way to have these controls opt into this behavior?
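To make the contrast concrete (just a sketch, reusing the names from earlier):

```typescript
// Per the quoted docs: a plain mesh opts into near/touch interaction with one flag.
const box = BABYLON.MeshBuilder.CreateBox("box", { size: 0.1 }, scene);
box.isNearPickable = true; // a fingertip tap now fires pointer events on this mesh

// But I don't see an equivalent for a 2D control on an Advanced Dynamic Texture;
// nothing like `closeButton.isNearPickable`, as far as I can tell.
```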
If this is possible, I think it would be extremely useful. Advanced Dynamic Texture is such a powerful feature and extending it to support another input paradigm would make it even better.
WebXR hand tracking has recently gotten a lot better on the Quest platform, making it much more reliable and useful.
I’m also thinking about the future of WebXR input on Apple Vision Pro. Based on the limited information Apple has released so far, it seems like hand tracking is the only real input we will have available. It doesn’t have controllers, and I don’t see WebXR getting eye tracking anytime soon, which leaves us with head-locked gaze input and hand tracking. Being able to use near interaction could make ADT a powerhouse for 2D UI in immersive apps on that device.
Awesome! If this works out, then hand tracking on Quest and Vision just got a whole lot more useful. I can’t wait to adapt my GUIs to direct touch. Thank you!
I made an account on the forum just to say: holy moly, f*** yeah!! I was looking for this specific feature and can’t believe support was merged literally yesterday.