I am running into an error specifically when I try to add XR hand tracking in Safari on the Apple Vision Pro:
“undefined is not an object (evaluating ‘this._xrSessionManager.session.enabledFeatures.indexOf’)”
What’s strange is that the hand tracking example from the Playground runs just fine, and my code also works fine in Chrome and the Quest browser.
Not sure what the best way to go about debugging this is, but here is the relevant code and ES6 imports.
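Roughly, it follows the standard Babylon.js hand-tracking setup, something like this simplified sketch (placeholder names, not my exact code):

```javascript
import {
  Engine,
  Scene,
  FreeCamera,
  HemisphericLight,
  MeshBuilder,
  Vector3,
  WebXRFeatureName,
} from "@babylonjs/core";

// Placeholder canvas id and scene contents; stand-ins for the real project.
const canvas = document.getElementById("renderCanvas");
const engine = new Engine(canvas, true);
const scene = new Scene(engine);

const camera = new FreeCamera("camera", new Vector3(0, 1.6, -3), scene);
camera.attachControl(canvas, true);
new HemisphericLight("light", new Vector3(0, 1, 0), scene);
const ground = MeshBuilder.CreateGround("ground", { width: 10, height: 10 }, scene);

const xr = await scene.createDefaultXRExperienceAsync({
  floorMeshes: [ground], // also used by the default teleportation feature
});

// Enabling the hand-tracking feature; the enabledFeatures error quoted above
// is thrown from inside Babylon's feature handling on Safari.
xr.baseExperience.featuresManager.enableFeature(
  WebXRFeatureName.HAND_TRACKING,
  "latest",
  { xrInput: xr.input }
);

engine.runRenderLoop(() => scene.render());
```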
A question (as I couldn’t quite get the emulator to work correctly): can you select using a pinch gesture? If you can, try selecting the floor (the mesh you defined as the floor) for 3-4 seconds and check whether the teleportation circle shows up. It would be great to know if that works.
@RaananW Pinch does not appear to trigger a select (the ray cast doesn’t change to blue like it does on other platforms), and the ray currently comes straight out of the center of the hand mesh instead of from between the thumb and pointer finger as it does on the Quest. There is no teleportation circle showing up, either.
This data is vendor-specific and is defined by the browser. That means that Apple defined the grip location differently than Meta did.
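To see the difference, something like this (a rough sketch, assuming the default XR experience helper is stored in `xr`) logs both poses per frame:

```javascript
// Compare the two spaces the browser reports for each hand: the pointer mesh
// follows the targetRay space (what the selection ray should use), while the
// grip placement is left to the vendor, so Apple and Meta can legitimately differ.
xr.input.onControllerAddedObservable.add((controller) => {
  scene.onBeforeRenderObservable.add(() => {
    const handedness = controller.inputSource.handedness;
    const rayPos = controller.pointer.position;                        // targetRay space
    const gripPos = controller.grip ? controller.grip.position : null; // vendor-defined, may be absent
    console.log(handedness, "targetRay:", rayPos.toString(), "grip:", gripPos && gripPos.toString());
  });
});
```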
I don’t have a device, and the emulator doesn’t support the hands feature, so I sadly can’t test it at the moment. I’m still trying to figure out what the best course of action is.
I wonder if this simple scene works. It should change colors when you pinch (blue) and when you release (yellow). If that works, it means they do support native WebXR events but don’t support the hands fully. I will be able to patch Babylon if I know this one works. I would expect the hands to show and move correctly as well.
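The scene is essentially this (a simplified sketch of the test, not the exact Playground code):

```javascript
import { MeshBuilder, StandardMaterial, Color3, WebXRFeatureName } from "@babylonjs/core";

// Recolor a box on the *native* selectstart/selectend events, bypassing
// Babylon's pointer layer. Assumes an existing "scene" and a "ground" mesh.
const box = MeshBuilder.CreateBox("indicator", { size: 0.3 }, scene);
box.position.set(0, 1.3, 1);
const mat = new StandardMaterial("indicatorMat", scene);
mat.diffuseColor = Color3.Yellow(); // released state
box.material = mat;

const xr = await scene.createDefaultXRExperienceAsync({ floorMeshes: [ground] });
xr.baseExperience.featuresManager.enableFeature(
  WebXRFeatureName.HAND_TRACKING,
  "latest",
  { xrInput: xr.input }
);

// Hook the raw XRSession; these events come straight from the browser, not Babylon.
xr.baseExperience.sessionManager.onXRSessionInit.add((session) => {
  session.addEventListener("selectstart", () => (mat.diffuseColor = Color3.Blue()));
  session.addEventListener("selectend", () => (mat.diffuseColor = Color3.Yellow()));
});
```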
The hands show up and track well, and a ray cast appears through the center of the hand from the wrist. However, nothing happens on pinch. Here is the log:
Hmmm. That means Apple didn’t implement the most basic WebXR events. selectstart and selectend are not Babylon events; they’re pure WebXR.
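For reference, the framework-free version of the same check looks roughly like this (the session mode and requested features here are assumptions about what the device accepts):

```javascript
// Pure WebXR, no Babylon involved: if these listeners never fire on pinch, the
// browser itself is not emitting select events. Rendering setup (XRWebGLLayer,
// frame loop) is omitted for brevity; call this from a user gesture, e.g. a button click.
async function testNativeSelect() {
  const session = await navigator.xr.requestSession("immersive-vr", {
    optionalFeatures: ["hand-tracking"],
  });
  session.addEventListener("selectstart", () => console.log("native selectstart"));
  session.addEventListener("selectend", () => console.log("native selectend"));
}
```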
OK. I’ll think of a better debugging Playground and will share it here.
I was experimenting with visionOS 1.1 beta just now. It seems like selectstart and selectend have now been implemented. The emulator, though, does not support hand tracking.
Has anyone tried 1.1 beta? Any experience with proper hand tracking?