Picking up a mesh using hand tracking only

I’m trying to create the following effect in webxr using hand tracking only:

  1. When the object is picked with the hand-tracking ray pointer (by pinching), the object flies to the hand so that it can be inspected closely (i.e. the object’s mesh is parented to the hand mesh). When the pinch is released, the object flies back to its original position.
  2. In addition to pinching, I’d also like to enable grabbing, i.e. when the hand mesh intersects the object mesh AND the hand makes a grasping motion (fingers curl inward), I’d like to attach the object’s mesh to the hand. On ungrasping (fingers curl outward), I’d like to detach the object’s mesh from the hand.

Any pointers or sample code you can point me to that does this? I’m using WebXRExperienceHelper.

cc @RaananW

Many thanks in advance. Cheers.

Oh, fun! I love these kinds of interactions with objects in XR.

Our 6DoF gizmo can help with some of those interactions, but a custom implementation will be needed if you want the object to “zoom in” to your hand. It really depends on what you want to do with the object once you grab it.
If all you want to do is look around the object (no scaling, just grab it and rotate it with your hand), you can use parenting. The first select (from pointer selection) would start an animation that brings the object right to your hand; when that animation is done, set the object’s parent to be the actual hand/controller. Make sure to keep a reference to the object’s transformation before you start the animation, so you know where to animate it back to once you release it.
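A rough, untested sketch of that flow (not a drop-in solution): I’m writing against `WebXRDefaultExperience` here for brevity; with a bare `WebXRExperienceHelper` you’d wire up your own `WebXRInput` the same way. The frame counts, helper names like `setupPinchGrab`, and the “animate in world space” simplification are all mine, not Babylon API:

```ts
import {
  AbstractMesh, Animation, PointerEventTypes, Scene, TransformNode,
  WebXRDefaultExperience, Vector3,
} from "@babylonjs/core";

function setupPinchGrab(scene: Scene, xr: WebXRDefaultExperience, targetMesh: AbstractMesh) {
  let originalPosition: Vector3 | null = null;
  let originalParent: TransformNode | null = null;
  let activeHandNode: TransformNode | null = null;

  // Remember the latest controller's grip/pointer node so we know what to parent to.
  xr.input.onControllerAddedObservable.add((controller) => {
    activeHandNode = (controller.grip ?? controller.pointer) as TransformNode;
  });

  scene.onPointerObservable.add((pointerInfo) => {
    // Pinch-select on the mesh: fly it to the hand, then parent it there.
    if (
      pointerInfo.type === PointerEventTypes.POINTERDOWN &&
      pointerInfo.pickInfo?.pickedMesh === targetMesh &&
      activeHandNode
    ) {
      originalParent = targetMesh.parent as TransformNode | null;
      originalPosition = targetMesh.getAbsolutePosition().clone();
      targetMesh.setParent(null); // animate in world space for simplicity
      Animation.CreateAndStartAnimation(
        "flyToHand", targetMesh, "position", 60, 20,
        targetMesh.position.clone(), activeHandNode.getAbsolutePosition().clone(),
        Animation.ANIMATIONLOOPMODE_CONSTANT, undefined,
        () => targetMesh.setParent(activeHandNode) // setParent keeps the world transform
      );
    }

    // Pinch released: detach and fly back to the stored transform.
    if (pointerInfo.type === PointerEventTypes.POINTERUP && originalPosition) {
      targetMesh.setParent(null);
      Animation.CreateAndStartAnimation(
        "flyBack", targetMesh, "position", 60, 20,
        targetMesh.position.clone(), originalPosition,
        Animation.ANIMATIONLOOPMODE_CONSTANT, undefined,
        () => targetMesh.setParent(originalParent)
      );
      originalPosition = null;
    }
  });
}
```

You’d probably also want to store and restore the rotation, and add an easing function to the animations, but that’s the general shape of it.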
Regarding finger gestures, this is where the hand meshes and joint data come in handy. You will need to implement the gestures yourself, but all of the parts of the hand are available to you, and all you need to do is check whether the state you are looking for has been reached. For example, you can measure the distance between the index finger tip and the thumb tip to detect a pinch (which is already built into WebXR), or the distance of all fingertips to the base of the hand to see if you are making a fist. The information is available to you, but Babylon doesn’t ship any implementation of specific gestures.
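Something along these lines (again a sketch, not tested): the threshold values are guesses in meters and will need tuning per device, and `isPinching`/`isFist`/`setupGrabGesture` are my own helper names:

```ts
import {
  AbstractMesh, Scene, Vector3, WebXRDefaultExperience, WebXRFeatureName,
  WebXRHand, WebXRHandJoint, WebXRHandTracking,
} from "@babylonjs/core";

const PINCH_THRESHOLD = 0.02; // thumb tip to index tip, in meters (guess, tune it)
const FIST_THRESHOLD = 0.07;  // fingertips to wrist, in meters (guess, tune it)

function isPinching(hand: WebXRHand): boolean {
  const thumb = hand.getJointMesh(WebXRHandJoint.THUMB_TIP).getAbsolutePosition();
  const index = hand.getJointMesh(WebXRHandJoint.INDEX_FINGER_TIP).getAbsolutePosition();
  return Vector3.Distance(thumb, index) < PINCH_THRESHOLD;
}

function isFist(hand: WebXRHand): boolean {
  const wrist = hand.getJointMesh(WebXRHandJoint.WRIST).getAbsolutePosition();
  // All non-thumb fingertips close to the wrist ≈ fingers curled into a grasp.
  return [
    WebXRHandJoint.INDEX_FINGER_TIP,
    WebXRHandJoint.MIDDLE_FINGER_TIP,
    WebXRHandJoint.RING_FINGER_TIP,
    WebXRHandJoint.PINKY_FINGER_TIP,
  ].every((joint) =>
    Vector3.Distance(hand.getJointMesh(joint).getAbsolutePosition(), wrist) < FIST_THRESHOLD
  );
}

function setupGrabGesture(scene: Scene, xr: WebXRDefaultExperience, targetMesh: AbstractMesh) {
  const handTracking = xr.baseExperience.featuresManager.enableFeature(
    WebXRFeatureName.HAND_TRACKING, "latest", { xrInput: xr.input }
  ) as WebXRHandTracking;

  handTracking.onHandAddedObservable.add((hand) => {
    // Poll the gesture every frame.
    scene.onBeforeRenderObservable.add(() => {
      const touching = hand.handMesh ? hand.handMesh.intersectsMesh(targetMesh) : false;
      if (touching && isFist(hand) && !targetMesh.parent) {
        // Grab: hand intersects the object while the fingers are curled.
        targetMesh.setParent(hand.getJointMesh(WebXRHandJoint.WRIST));
      } else if (!isFist(hand) && targetMesh.parent) {
        // Ungrasp: detach (or animate back as in the previous snippet).
        targetMesh.setParent(null);
      }
    });
  });
}
```

You may want to add a bit of hysteresis (different thresholds for entering and leaving the fist state) so the object doesn’t flicker between attached and detached at the boundary.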

If you want to share a playground of what you have done so far, or what doesn’t work, I will be happy to look at the code and provide more pointers!
