Clarification of HandFeature options & non-GLB Custom Hands

Since hands are inherently part of any “Hand” feature, there REALLY needs to be a clear distinction when handedness might also be involved.

I want to use my own hand meshes, as specified in my options below. I got the balls for hands. My meshes did get into the scene, but were stuck at (0, 0, 0) & not used. Without the disableDefaultHandMesh option I get the default meshes in place of the balls (and my unused meshes, of course).

    xrInput: defaultExperience.input,
    jointMeshes: {
        disableDefaultHandMesh : true,
        handMeshes: {
            right: new Hands.HandMeshRigged_R('rHand', System.Scene),
            left : new Hands.HandMeshRigged_L('lHand', System.Scene)
        }
    }

Looking at the source code, it seems the handedness of the controller might also be involved, but as I said before, it can be very tricky to figure out whether left & right refer to the hands themselves or to the handedness of the coordinate system.

Adding @RaananW

You just read my mind :slight_smile:

This PR - XR Hand tracking docs by RaananW · Pull Request #127 · BabylonJS/Documentation - which was JUST merged (and written today), would have helped for sure.

right and left here refer to the handedness and not the system. We don’t reference rightHandedSystem in public code.

Ok, now that I have looked at the doc: I did not specify any rigMapping. Adding that, and removing disableDefaultHandMesh, I now get my un-assigned hand meshes and the balls.

I am subclassing WebXRHandTracking to override _onXRFrame() & hook the joint info for further evaluation of hand states beyond just posing. Technically, you do not need to replace the hand meshes to do this, but that might also depend on what happens to the state (hint: think E.T.).
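To make that concrete: the kind of per-frame joint evaluation I mean can be sketched without any Babylon types at all. This is my own illustrative helper (the names and the 15 mm threshold are mine, not part of any API), detecting a thumb-to-index pinch from raw joint positions:

```typescript
// Hypothetical helper: evaluate a pinch from raw joint positions, the kind of
// per-frame check an _onXRFrame() override could run. Plain math, no Babylon types.
interface JointPos { x: number; y: number; z: number; }

function distance(a: JointPos, b: JointPos): number {
    const dx = a.x - b.x, dy = a.y - b.y, dz = a.z - b.z;
    return Math.sqrt(dx * dx + dy * dy + dz * dz);
}

// WebXR joint poses are in meters; ~15 mm is a plausible pinch threshold.
function isPinching(thumbTip: JointPos, indexTip: JointPos, threshold = 0.015): boolean {
    return distance(thumbTip, indexTip) < threshold;
}
```

An _onXRFrame() override would feed the tracked thumb-tip and index-finger-tip positions into a check like this each frame.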

Here is a static method of that class to initialize the feature. Is there anything that should cause the non-default hands to be used?

public static EnableHands(defaultExperience : BABYLON.WebXRDefaultExperience) : void {
    // the 25 WebXR joint names ("wrist", "thumb-metacarpal", ...), elided here
    const baseRigMapping = [ /* ... */ ];

    const handsFeature = <HandTracking> defaultExperience.baseExperience.featuresManager.enableFeature(
        HandTracking.Name, "latest", {
            xrInput: defaultExperience.input,
            jointMeshes: {
                handMeshes: {
                    right: new Hands.HandMeshRigged_R('rHand', System.Scene),
                    left : new Hands.HandMeshRigged_L('lHand', System.Scene)
                },
                rigMapping: {
                    right: => `${joint}R`),
                    left : => `${joint}L`)
                }
            }
        }
    );
}

Slight correction: your doc does still have disableDefaultHandMesh: true for custom meshes, so I put it back. Both ways, I get the ball hands. I am thinking about a PG where I just go and get the published mesh .glb files and use them. You should be able to tell them apart from the ones the system gets, because I could change the material.

Problem is, I would have to do the enabling in a callback, since it is not just a mesh subclass. Messy code, not really good for a test.

Guess I could also just give it 2 cubes as the meshes. If something horrible happens, then it is trying to use the cubes. If I get the ball hands, then I have reproduced my case: the custom meshes are ignored.

Do you see your mesh and the spheres, or just the spheres? You can set the spheres to be invisible using the invisible flag in jointMeshes; this way you should only see your hand meshes.

I saw both, but they are stuck at (0, 0, 0). No juice; will try tomorrow morning (lunch for you).

The spheres are created first and are updated on each frame. If a mesh and rig mapping are provided, the spheres get their transformation updated, and that transformation is then used to update the transform node with the corresponding name from the mapping.
If they are not moving, there might be an exception thrown or some error in the loop. If you want to share a playground, I might be able to help better.
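A toy model of that flow (a paraphrase for illustration only, not the actual Babylon source): each frame, the joint sphere transforms are copied to the rig's transform nodes via the name mapping.

```typescript
// Toy model of the per-frame sync described above. Plain objects stand in for
// Babylon's TransformNodes; only positions are shown, not rotations.
interface Transform { x: number; y: number; z: number; }

function syncRig(
    jointTransforms: Map<string, Transform>,  // joint sphere transforms, keyed by joint name
    rigMapping: Map<string, string>,          // joint name -> rig node name
    rigNodes: Map<string, Transform>          // the rig's transform nodes, keyed by name
): void {
    for (const [joint, pose] of jointTransforms) {
        const nodeName = rigMapping.get(joint);
        const node = nodeName ? rigNodes.get(nodeName) : undefined;
        if (node) {                           // silently skip joints with no mapped node
            node.x = pose.x; node.y = pose.y; node.z = pose.z;
        }
    }
}
```

This also hints at the failure mode in the thread: if the mapped node names never match anything in the custom mesh's rig, nothing is updated and the mesh stays at the origin.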

I did not try the invisible flag, since from your last post I now know what the problem is. I had already tried remote debugging; there is nothing being thrown. I then searched the file, and nothing was found in this format: mesh.skeleton.bones[i].

WebXRHand.updateFromXRFrame() is specifically written for that alien GLB format. It would also be difficult for me to subclass WebXRHand, since WebXRHandTracking._attachHand() instantiates it, and it is a private method.

If you were to refactor to allow both, I could provide my JS file and a minimal html file which loads it. PM me for any arrangements. I actually grabbed @PatrickRyan's left-handed files before Xmas & put this aside. Tuesday, I modified them & they now look very different.

I generated a left-handed, inline javascript file, but coming from Blender that meant a conversion. I think 2 lefts might actually make a right. If not, I do have the option of creating a right-handed file straight through Blender. Someone put a right-handed option into the JSON generator. It wasn't very much work, since Blender is Z-up right-handed already; it just does slightly less. I never thought I would need it in my javascript generator till now. I never copied it, so I want to see how this fails first before adding it.
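For what it's worth, the axis conversion itself is tiny: swapping any two axes is a reflection, so exchanging Y and Z both moves "up" from Blender's Z to Babylon's Y and flips right-handed to left-handed in one step. A sketch of that one common convention (a real exporter may differ in forward-axis handling):

```typescript
interface Vec3 { x: number; y: number; z: number; }

// Swap Y and Z: moves "up" from Z to Y and, because swapping two axes is a
// reflection (determinant -1), also flips right-handed to left-handed.
function blenderToLeftHandedYUp(v: Vec3): Vec3 {
    return { x: v.x, y: v.z, z: v.y };
}
```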

The changes to the hands, while visually great, are done mostly with Blender modifiers. I could start over from scratch & be done in under an hour.

There is no difference between the right-handed and the left-handed meshes. I might remove them for 5.0 final, but at the moment they are referenced this way in 4.2, so I can't quite delete them. We initially thought this would be needed, but it wasn't; we were able to make it work with a single model. Since GLB is right-handed by nature, it is a right-handed model.

It is written for a mesh with skeletons and bones and transformation nodes. I guess you refer to the transformation nodes vs. direct bone modification? I can add that, but I will need a mesh to test with and make sure it works.
Oh, and GLB is not an alien format :slight_smile: it is very much native for us.

regarding left-handed vs. right-handed (systems, not handedness!) - it shouldn’t matter. Babylon takes care of this under the hood. Once you load a model, it is loaded and displayed correctly, no matter the system you are using.

Ok, let me work on the html. I was generating with PBR materials, but only setting emissive. My scene has an environment, so I cannot tell if one is required. Just going to re-gen as std materials to be self-contained. The JS file is 2,526 kb, currently.

Let me know how to get them to you.

If you can zip it and upload it somewhere, that would be great. Do you mean a .babylon file?

No, though I will generate one as well. It is much easier to just say new than screw with a data file, since it is synchronous.

    handMeshes: {
        right: new Hands.HandMeshRigged_R('rHand', System.Scene),
        left : new Hands.HandMeshRigged_L('lHand', System.Scene)
    }

Hands.HandMeshRigged_x is a subclass of BABYLON.Mesh, but I can actually make it a subclass of any BABYLON.Mesh subclass. The file is very readable. To date, I have subclassed up to 4 levels below BABYLON.Mesh.

The .babylon version will be generated using the right-handed option, as a backup, in case things are backwards in the JS version.

The conversation continued in a PM, but am happy to report that non-GLB meshes, think a .babylon file, now work as well.

There was also a change for child meshes of the hand meshes. While I think that was so that .GLB-based hands could also support parts of hands with different materials, I am using it to add a wrist-control bracelet child to hold touchable buttons, sliders, or knobs, which are always wanted at the ready.

Here is a picture so far (from a desktop). The orange in the hands will only briefly show when a touch is made. The controls of the bracelet are on the inside of the hands; it just has too many problems when they are on the outside of the wrist. The colors on the bracelet are to change for sure.

@RaananW, thanks for the changes!
