Hand tracking - Detecting mesh intersections

Hello Everyone,

I’ve been struggling to get WebXR hand tracking and mesh intersections to work. I want to detect when the user touches something.

Would anyone be able to point me in the right direction?

This is what I have tried so far. I want the sphere to disappear when touched:

The hands appear correctly, and I can get physics working too, but I cannot get events to trigger when touching something.

My end goal is to be able to touch objects or detect gestures.

Does @RaananW or anyone else know how to do this?

We have a near interaction feature especially for that:

When it is enabled (and when the meshes are flagged as near-pickable) you will be able to get pointer events.
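Roughly something like this, assuming the default XR experience helper and a mesh called sphere (an untested sketch, so check the API names against the Babylon.js version you are on):

```javascript
// Untested sketch – enable hand tracking + near interaction,
// then make the sphere near-pickable so touches raise pointer events.
const xr = await scene.createDefaultXRExperienceAsync();
const fm = xr.baseExperience.featuresManager;

// Hand tracking renders the hand meshes and tracks the joints
fm.enableFeature(BABYLON.WebXRFeatureName.HAND_TRACKING, "latest", {
    xrInput: xr.input,
});

// Near interaction turns finger proximity into pointer events
fm.enableFeature(BABYLON.WebXRFeatureName.NEAR_INTERACTION, "latest", {
    xrInput: xr.input,
});

// The mesh has to opt in to near picking
sphere.isNearPickable = true;

// Near touches then arrive as ordinary pointer events
scene.onPointerObservable.add((pointerInfo) => {
    if (pointerInfo.type === BABYLON.PointerEventTypes.POINTERDOWN &&
        pointerInfo.pickInfo && pointerInfo.pickInfo.pickedMesh === sphere) {
        sphere.dispose(); // make the touched sphere disappear
    }
});
```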

Otherwise, you can get the finger’s position on each frame and check whether the finger’s mesh intersects the mesh you want to touch.
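That manual check could look something like this (again an untested sketch, reusing the xr helper, scene, and sphere from above):

```javascript
// Untested sketch – per-frame intersection test against the index
// fingertip joint mesh. Assumes `xr`, `scene`, and `sphere` from above.
const handTracking = xr.baseExperience.featuresManager.enableFeature(
    BABYLON.WebXRFeatureName.HAND_TRACKING, "latest", { xrInput: xr.input });

handTracking.onHandAddedObservable.add((hand) => {
    const tip = hand.getJointMesh(BABYLON.WebXRHandJoint.INDEX_FINGER_TIP);
    scene.onBeforeRenderObservable.add(() => {
        if (sphere.isEnabled() && tip.intersectsMesh(sphere, false)) {
            sphere.setEnabled(false); // "touched" – hide the sphere
        }
    });
});
```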

Thanks for letting me know about near interaction. I’ve read that documentation, but I’m still struggling to implement it.

Are there any examples of it in use? I searched the playground but couldn’t find anything that worked.

“Otherwise, you can get the finger’s position on each frame and check whether the finger’s mesh intersects the mesh you want to touch.”

Is what I was trying to do in my playground example what you are describing above? If so, did I do something wrong?

This example shows how to use near interaction with a controller:

Babylon.js Playground (babylonjs-playground.com)

If you enable hand support (the hand tracking feature), you will be able to press the button using your finger.

To get the depth between the touch point and the button, you can use button.getPressDepth(position).
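An untested sketch of how that could be wired up, assuming a GUI3DManager-managed TouchHolographicButton and a tracked hand from the hand-tracking feature above:

```javascript
// Untested sketch – a touchable 3D button plus a per-frame press-depth
// readout. `hand` is assumed to come from onHandAddedObservable above.
const manager = new BABYLON.GUI.GUI3DManager(scene);
const button = new BABYLON.GUI.TouchHolographicButton("touchButton");
manager.addControl(button);
button.position = new BABYLON.Vector3(0, 1.3, 0.5);

scene.onBeforeRenderObservable.add(() => {
    const tip = hand.getJointMesh(BABYLON.WebXRHandJoint.INDEX_FINGER_TIP);
    // Assumption: getPressDepth takes a world-space position and returns
    // how far that point has pushed into the button's touch volume
    const depth = button.getPressDepth(tip.absolutePosition);
    console.log("press depth:", depth);
});
```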