XR hands and finger tracking

Hi Babylon.js team. I'm grateful for Babylon.js, it changed my life forever. I was just wondering: I can't seem to find an answer in the Babylon.js WebXR docs on how to get the finger positions. I know there are things to consider before getting their position, like whether I'm using the default hand mesh. I know it's probably easy to understand, but I really can't find my way here and I'm kind of stuck. Should I use something like handT = getEnabledFeature(hand_tracking), or should I get the hands inside xr.input.onControllerAddedObservable.add and go from there? I have a vague idea like hand.get(bonename), but I don't think that's the bone mesh I'm getting, because all I can see on it is .jointName and .addEventListener.

I am also willing to pay for courses related to this matter. Thank you so much!

@RaananW


Hello 🙂

That observable is for controllers; there is a separate observable for hands:
onHandAddedObservable


Here is the way I usually go about getting the finger joints:

// XR context
const xr = await scene.createDefaultXRExperienceAsync();
// Enable the hand-tracking feature
const handsFeature = xr.baseExperience.featuresManager.enableFeature(BABYLON.WebXRFeatureName.HAND_TRACKING, "latest", {
    xrInput: xr.input
});
// Get joints
handsFeature.onHandAddedObservable.add((hand) => {
    const side = hand.xrController.inputSource.handedness; // "left" or "right"
    if (side) {
        const joint = hand.getJointMesh(jointName); // joint.position follows the finger articulation
    }
});

For the joint name argument, use for example BABYLON.WebXRHandJoint.INDEX_FINGER_TIP.
You can find the complete list here
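As a quick example of what you can do once you have the joint meshes: a common trick is to detect a pinch by measuring the distance between the thumb tip and the index finger tip. This is a hedged sketch, not an official API: the helper names (`distance`, `isPinching`) and the 0.02 m threshold are my own assumptions, and you would tune the threshold for your app. Only `getJointMesh` and `WebXRHandJoint` come from Babylon.js itself.

```javascript
// Sketch: pinch detection from two joint positions.
// Works on any objects with x/y/z fields, e.g. the .position of the
// meshes returned by hand.getJointMesh(...).
function distance(a, b) {
    const dx = a.x - b.x, dy = a.y - b.y, dz = a.z - b.z;
    return Math.sqrt(dx * dx + dy * dy + dz * dz);
}

// 0.02 m (2 cm) is an assumed threshold; adjust to taste.
function isPinching(thumbTip, indexTip, threshold = 0.02) {
    return distance(thumbTip, indexTip) < threshold;
}

// Intended Babylon.js usage inside the hand-tracking callback
// (shown as comments because it needs a live XR session):
// const thumb = hand.getJointMesh(BABYLON.WebXRHandJoint.THUMB_TIP);
// const index = hand.getJointMesh(BABYLON.WebXRHandJoint.INDEX_FINGER_TIP);
// scene.onBeforeRenderObservable.add(() => {
//     if (isPinching(thumb.position, index.position)) { /* grab, click, ... */ }
// });

console.log(isPinching({ x: 0, y: 0, z: 0 }, { x: 0.01, y: 0, z: 0 })); // true
```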

++ 🙂
Tricotou


I love you guys, this community is awesome! Thanks a bunch! I thought it wouldn't work because I couldn't see the autocomplete, it didn't suggest options like .onHandAddedObservable, but it works. Right now I'm looking into the ray my hand is pointing; I want to learn how to detect whether that ray touches something in the scene.
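For the ray question: with the default XR experience, the pointer-selection feature is enabled by default and forwards the hand ray's picks through `scene.onPointerObservable`, so you usually don't need to cast a ray yourself. Here's a hedged sketch: the `describeHit` helper and the mesh name are my own illustrative inventions; the Babylon wiring is shown in comments because it needs a live scene.

```javascript
// Sketch: summarize a Babylon.js pickInfo-shaped result.
// pickInfo is expected to have .hit (boolean) and .pickedMesh (with a name).
function describeHit(pickInfo) {
    if (pickInfo && pickInfo.hit && pickInfo.pickedMesh) {
        return "hit " + pickInfo.pickedMesh.name;
    }
    return "no hit";
}

// Intended Babylon.js usage (needs a live scene, so shown as comments):
// scene.onPointerObservable.add((pointerInfo) => {
//     if (pointerInfo.type === BABYLON.PointerEventTypes.POINTERDOWN) {
//         console.log(describeHit(pointerInfo.pickInfo));
//     }
// });

console.log(describeHit({ hit: true, pickedMesh: { name: "sphere" } })); // "hit sphere"
console.log(describeHit(null)); // "no hit"
```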
