Use custom mesh for WebXR InputProfile/Controller

Hey guys, I'm stuck implementing my own controller model for WebXR.

I tried it with the online repository and also with local profiles…
Forcing a profile or using the doNotLoadControllerMeshes option did not help… or I did not use it correctly.
I just can’t find a way to use my own mesh as a controller :sweat:

Here is a rough PG that may help in understanding what I’m trying to achieve:

I’m trying to use the “generic-hand” model/mesh from the webxr-input-profiles GitHub repository, but with the “oculus-touch” profile. The goal is to later use the oculus-touch-v3 profile with custom meshes (like hands, or hands that grab a gun, etc.).

Of course I read WebXR Controllers Support | Babylon.js Documentation, but I kinda don’t get the part about how to set the mesh :sweat_smile:

Summoning the almighty @RaananW for help :raised_hands:


You got me right before I was about to call it a day :slight_smile:
So, two different things: forcing a profile is not what you are looking for if you want to swap the model for a different mesh later.

This is a demo showing how you can use doNotLoadControllerMeshes to load your own meshes:
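That approach boils down to something like the following — a minimal sketch, assuming the default XR experience helper; the function name, root URL and file name are placeholders, not the assets from the demo:

```javascript
// Sketch: skip the repository/local profile meshes and attach a custom model
// once each motion controller is ready. rootUrl/fileName point at your own .glb.
async function setupCustomControllerMeshes(scene, rootUrl, fileName) {
  const xr = await scene.createDefaultXRExperienceAsync({
    inputOptions: { doNotLoadControllerMeshes: true },
  });

  xr.input.onControllerAddedObservable.add((controller) => {
    controller.onMotionControllerInitObservable.add(async () => {
      // Load one copy of the custom model per controller.
      const result = await BABYLON.SceneLoader.ImportMeshAsync(
        "", rootUrl, fileName, scene);
      // grip tracks the physical controller; pointer is the aim ray.
      result.meshes[0].parent = controller.grip || controller.pointer;
    });
  });

  return xr;
}
```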

Forcing a profile is something you could use during development, but it can be very misleading, as the orientation of the model will still be based on the other profile’s orientation. You can use it to see what your experience looks like with a different profile, but it should not stay fixed in production.
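For reference, forcing a profile is a single input option — a sketch, assuming the `forceInputProfile` option of the WebXR input options; `xrOptionsForDev` is a made-up helper name:

```javascript
// Sketch: build XR options that force one profile for every controller.
// Development use only, for the orientation reasons described above.
function xrOptionsForDev(profileId) {
  return { inputOptions: { forceInputProfile: profileId } };
}

// usage (in a real scene):
// const xr = await scene.createDefaultXRExperienceAsync(xrOptionsForDev("oculus-touch"));
```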


Thank you for your blazing fast response! :smiley:

I checked your PG, invested some more time, and came up with this result (which is hopefully helpful for others too :nerd_face:):

Everything works now! Thank you very much :hugs:

What would you recommend if I want to swap the meshes (e.g. grabbing a gun)? I would maybe store the webXrInputSource in a variable and attach other meshes to it via parent.
And another question: is there a way to check which input profile is used? I’m just wondering how to deal with button actions that I define for the Quest 2 when a user connects with a controller that only has one button or so :sweat_smile:
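On the second question: once the motion controller is initialized, it exposes a `profileId` you can branch on. A sketch — the scheme names and the exact profile strings checked for are assumptions:

```javascript
// Sketch: pick a button mapping based on the motion controller's profileId.
// Profile ids follow the webxr-input-profiles naming (e.g. "oculus-touch-v3").
function pickButtonScheme(profileId) {
  if (profileId.startsWith("oculus-touch")) {
    return "full";         // A/B/X/Y, triggers, thumbsticks
  }
  if (profileId === "generic-trigger") {
    return "trigger-only"; // only a select/trigger button available
  }
  return "fallback";       // unknown hardware: stick to the basics
}

// In Babylon.js the id is available after motion controller init:
// controller.onMotionControllerInitObservable.add((mc) => {
//   const scheme = pickButtonScheme(mc.profileId);
// });
```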

Btw:
I also needed to use the RightHandedSystem, otherwise the models would appear mirrored… :sweat_smile: I don’t get exactly why… Maybe it’s from the .glb that I load :thinking:

Parenting to the pointer or the grip nodes of the WebXR input source works best for “grabbing” things.
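A minimal sketch of that parenting approach — `grab` and `release` are made-up helper names, assuming `controller` is the `WebXRInputSource` from `onControllerAddedObservable`:

```javascript
// Sketch: "grab" a mesh by parenting it to the controller's grip node,
// falling back to the pointer (aim ray) if no grip is available.
function grab(controller, mesh) {
  mesh.parent = controller.grip || controller.pointer;
  mesh.position.setAll(0); // snap to the hand; adjust the offset as needed
}

function release(mesh) {
  // setParent(null) keeps the current world position,
  // unlike a plain `mesh.parent = null`.
  mesh.setParent(null);
}
```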

Hmm… that should not happen, but it does seem like you are rotating them on the Y axis. Any reason for that?
Also, don’t forget that the XR environment should provide most if not all of those values for you. They are sent in RHS coordinates, and we convert them when needed. That might be the case here. If you load the models and view them in both LHS and RHS, do they look the same?

Alright!

The rotation here is just a “comfort” thing! Without it, the fingers would point forward, parallel to the ground (like a zombie holds its hands when walking). I felt it’s more natural to have them mapped to the controller thumbs-up. So it has nothing to do with the RightHandedSystem issue :upside_down_face:

If you remove the “scene.useRightHandedSystem = true;” line in my code, you will see the hands mixed up :sweat_smile: I also checked the raw .glb file in the Sandbox… and there everything is fine… it’s a bit confusing!

Hmm, I hope I got this right… I tried with scene.useRightHandedSystem set to both false and true.
With true, they are swapped and do not look the same.