I want to get started with Babylon.js WebXR and control it with hand gestures.
When I receive hand data via onHandAddedObservable, there is no trackedMeshes property that should expose the joint data, so I cannot control anything with gestures.
The sample in the Babylon.js docs also throws an error saying trackedMeshes does not exist.
If anyone knows how to solve this, please let me know.
Thank you for your reply.
In the playground, trackedMeshes comes back as undefined around line 209.
What I want to achieve is this:
I want to control an object when I close my hand into a fist ("rock", as in rock-paper-scissors). Is this feasible?
I'm not sure where this demo is in our documentation - trackedMeshes was removed during development of the feature. It would be great to know which playground is referenced so that I can remove it.
Regarding getting specific meshes - an XRHand has two functions you can use: getJointMesh and getHandPartMeshes.
getJointMesh will give you the corresponding mesh, if available. You can ask for any of the hand joints - XRHandJoint | Babylon.js Documentation. getHandPartMeshes will give you a whole section of the hand (for example, the entire index finger).
That’s not a playground created by us - someone else created it, and you found it via the playground search (the search and all playgrounds are public).
Not really, as I don’t know exactly what you are trying to achieve. If you are really looking for hand gestures - I am sorry, this is something you will need to implement yourself, as we don’t have a gesture tool integrated in the framework (yet). If you are looking for a way to get a certain part of the hand, it is exactly as documented:
xrHandFeature.onHandAddedObservable.add((newHand) => {
    // get the mesh for the index finger tip
    console.log(newHand.getJointMesh(XRHandJoint.INDEX_FINGER_TIP));
});
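On the feasibility question: a fist ("rock") gesture can be recognized from the joint positions yourself. Below is a minimal sketch of such a detector, assuming you sample the fingertip and wrist positions each frame (for example from the meshes returned by getJointMesh) and pass them in as plain { x, y, z } objects. The isFist helper, the joint-name keys, and the distance threshold are all illustrative, not part of the Babylon.js API.

```javascript
// Fingertip joints to check; the string keys are illustrative labels
// for positions you collect yourself (e.g. from getJointMesh(...).position).
const FINGER_TIPS = [
  "index-finger-tip",
  "middle-finger-tip",
  "ring-finger-tip",
  "pinky-finger-tip",
];

// Euclidean distance between two { x, y, z } points.
function distance(a, b) {
  const dx = a.x - b.x, dy = a.y - b.y, dz = a.z - b.z;
  return Math.sqrt(dx * dx + dy * dy + dz * dz);
}

// joints: map from joint name to { x, y, z } world position.
// threshold: max tip-to-wrist distance (in metres) to count a finger as curled.
function isFist(joints, threshold = 0.08) {
  const wrist = joints["wrist"];
  return FINGER_TIPS.every(
    (tip) => joints[tip] && distance(joints[tip], wrist) < threshold
  );
}

// Fabricated example: the index finger is extended, so this is not a fist.
const openHand = {
  "wrist": { x: 0, y: 0, z: 0 },
  "index-finger-tip": { x: 0, y: 0.15, z: 0 },
  "middle-finger-tip": { x: 0.01, y: 0.05, z: 0 },
  "ring-finger-tip": { x: 0.02, y: 0.05, z: 0 },
  "pinky-finger-tip": { x: 0.03, y: 0.05, z: 0 },
};
console.log(isFist(openHand)); // false — index tip is far from the wrist
```

In practice you would tune the threshold per device and require the gesture to hold for a few consecutive frames before toggling your object's state, so tracking jitter doesn't trigger it.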