I simply want to enable the hand tracking feature at runtime using the features manager.
Hand tracking is initialized through a 3D button click.
Hand tracking appears to be technically enabled (it is listed in enabledFeatures on the features manager), but the onHandAddedObservable of the WebXRHandTracking feature never fires and the hand meshes are never loaded.
Here is a playground example: https://playground.babylonjs.com/#x7y4h8#51
Any tips on how to get it to work at runtime (inside the XR session)?
pinging our xr master @RaananW
The hand tracking feature enables the use of hands as an input method, with hand meshes rendered and interaction simulated at runtime.
That’s an interesting one. You will need to enable the feature beforehand, because the session needs to be initialized with the feature. The decision whether or not to use hands belongs to the user (and the UA), not to the developer.
Good to know. Actually, having the control lie with the user is probably even more useful for me.
So far it works flawlessly!
Very happy to hear that!
Please share your experience (and demo!) whenever (and of course if) you can, so we can improve the hand support.
Sure, this is all part of my bachelor thesis and I’ll probably be able to share it in some form.
I do have one more question: is there a way to observe whether the user switches from one control method to another (motion controller <-> hand tracking)?
I want to detach pointer selection when hand tracking is being used.
I’ve already found a solution.
For anyone looking:
You can use the WebXRInput observables onControllerAddedObservable / onControllerRemovedObservable.
In the callback, check whether the controller has .hand defined, and you are good to go.