Ok, so I have been working on a WebXR application for some time now, and must first congratulate the team on making Babylon.js so accessible and feature-rich. It has been a blast learning it, and the compile/deploy time for testing is amazing compared to my previous Unity attempts.
But I really wanted to add haptics to my world: small vibrations when the user picks something up. I simply cannot make it work when my Quest 3 is connected over a Link cable. I have tried both Oculus Link and SteamVR as the OpenXR runtime, with no difference. When my controllers connect, I print info about them:
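Roughly like this (a simplified sketch; `logControllerInfo` is just an illustrative name, and `xrInput` would be `xr.input` from `createDefaultXRExperienceAsync`):

```javascript
// Simplified sketch of how I log controller info as controllers connect.
// The helper name is just for illustration.
function logControllerInfo(xrInput) {
  xrInput.onControllerAddedObservable.add((controller) => {
    // Wait for the motion controller (and its input profile) to be resolved.
    controller.onMotionControllerInitObservable.add((motionController) => {
      console.log("profileId:", motionController.profileId);
      console.log("handedness:", controller.inputSource.handedness);
    });
  });
}
```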
In createDefaultXRExperienceAsync's inputOptions I have tried forceInputProfile: 'oculus-touch-v2' as well as 'meta-quest-touch-plus' (which was suggested to me by Gemini), but neither changed anything; the debug output I showed above reports them both as oculus-touch. Now I don't know whether that profile even has haptics, or if something else is wrong.
Btw, this is how I try to trigger the pulse vibration:
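Simplified to its core, it looks like this (`tryPulse` is just an illustrative wrapper around Babylon's `motionController.pulse(value, duration)`, which as far as I understand returns a promise):

```javascript
// Simplified version of what I do on pickup. `tryPulse` is just an
// illustrative wrapper; if the runtime exposes no haptic actuator,
// the pulse seems to end up a no-op.
async function tryPulse(motionController, intensity = 0.5, durationMs = 100) {
  if (!motionController) return false;
  try {
    // Babylon's pulse() resolves with whether the pulse was played.
    return await motionController.pulse(intensity, durationMs);
  } catch (e) {
    console.warn("haptic pulse failed:", e);
    return false;
  }
}
```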
Thanks a lot, it means a lot to hear that! This is exactly what Babylon is trying to achieve.
Now - haptics! That's fun. I can tell you that Oculus didn't implement everything for linked headsets. Babylon's pulse function will work only if the underlying WebXR runtime supports it and exposes the function; otherwise it is a no-op, which seems to be the case here. I don't have a setup using Link at the moment, so I sadly can't test it. What does the function return? Is there any exception, or does it fail silently? If it fails silently, it might just be another thing that is not supported on Link (like hand support, which saddens me a lot).
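One thing you can check is whether the browser exposes a haptic actuator at all on the controller's native gamepad object. A quick sketch (`hasHaptics` is just an example name; `inputSource` would be the native XRInputSource, i.e. the `inputSource` property of a Babylon WebXRInputSource):

```javascript
// Sketch: does the runtime expose haptics for this controller?
// Per the WebXR Gamepads Module, haptics surface as gamepad.hapticActuators;
// if that array is missing or empty, a pulse can only be a no-op.
function hasHaptics(inputSource) {
  const gp = inputSource && inputSource.gamepad;
  return !!(gp && gp.hapticActuators && gp.hapticActuators.length > 0);
}
```

If this returns false over Link but true on the headset's own browser, that would confirm it is the runtime and not your code.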
Ah, yes, that was the conclusion I came to as well. Chatting with Gemini, it seemed to indicate that SteamVR's OpenXR implementation was better, but I have not gotten haptics working through it either (and I now somehow get appalling frame rates in VR when I use it, requiring fiddling with the rendering resolution).
Here's hoping Valve's new headset might be more optimized for WebXR; if so, I might switch to that instead.
As for hand support, I was unaware it was not supported. I see on startup it claims that I have rejected hand tracking so many times that it doesn't want to ask again, although I have never actually been asked to allow hand tracking. It is a shame if hand tracking isn't working either; it was something I also wanted to experiment with in my Babylon.js VR world/playground.
I am actually surprised at how performant it is, although I need to work on ironing out some leaks that are likely the cause of the framerate eventually dropping below the target 72 fps after some 10-15 minutes of creating/removing/moving and interacting with my world. Btw, I even have a fully working Pac-Man arcade now that I can play; it's streaming the game from a MAME server capture over WebSockets.