I’m currently migrating a WebVR app to WebXR. It’s working great with both an Oculus Quest and an HTC Vive, but I’ve run into some problems with Chrome on Android in “Google Cardboard” mode (meaning no controllers).
WebXR’s input source for the touch screen is the “screen” type. Every time you touch the screen a new input source is generated, and it is removed when you lift your finger. My guess is that the cardboard button works the same way.
What I do in pointer-selection is translate this touch into a pointer event.
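To make that concrete, here is a rough sketch of the behaviour in plain WebXR (not Babylon-specific; xrSession stands for an already-active XRSession):

xrSession.addEventListener("inputsourceschange", (event) => {
    // A screen touch (and, presumably, the cardboard button) adds a transient
    // input source with targetRayMode "screen"; lifting the finger removes it again.
    event.added.forEach((source) => console.log("added:", source.targetRayMode));
    event.removed.forEach((source) => console.log("removed:", source.targetRayMode));
});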
If you use a headset (like Daydream), the input is not gaze but a tracked pointer, and then you can treat it like any other motion-controller device. You can get the main button using getMainComponent(), or query the component types available (like touchpad or thumbstick). This would be the right place to read about it - https://doc.babylonjs.com/how_to/webxr_controllers_support#motion-controllers
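For reference, a minimal sketch of reading the main button from a motion controller (using the xrHelper from later in this thread; the exact components depend on the device profile):

xrHelper.input.onControllerAddedObservable.add((inputSource) => {
    inputSource.onMotionControllerInitObservable.add((motionController) => {
        // The main component is the primary button/trigger of the profile.
        const mainComponent = motionController.getMainComponent();
        mainComponent.onButtonStateChangedObservable.add((component) => {
            if (component.changes.pressed && component.pressed) {
                console.log("main button pressed");
            }
        });
        // Other component ids (e.g. touchpad, thumbstick) can be listed:
        console.log(motionController.getComponentIds());
    });
});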
As you can see from my console.log comments in the playground, I’m pretty sure the input source accessible via xrHelper.input.onControllerAddedObservable.add is of type gaze.
But it’s still not clear to me how to detect the button click. Intuitively I’d think there would be a function like inputSource.onMotionControllerInitObservable.add (maybe inputSource.onGazeControllerInitObservable.add), but I don’t see any such function.
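To illustrate what I’m checking, a simplified sketch along the lines of my playground logging (not the exact playground code):

xrHelper.input.onControllerAddedObservable.add((inputSource) => {
    // Logs "gaze" in cardboard mode on Chrome/Android, which is why I expected
    // a gaze-specific observable to hang off inputSource.
    console.log(inputSource.inputSource.targetRayMode);
    console.log(inputSource.motionController); // likely undefined for gaze input
});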
Thanks for your help. It’s amazing to me how responsive you guys are. I know you must be getting a million questions like these.
Very interesting. It actually makes sense when I think about it - this is a classic gaze-type controller.
The “select” event on the session is your friend. I’ll need to check tomorrow why it doesn’t register correctly, but this console output is very helpful already. Will keep you updated.
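In plain WebXR terms that’s roughly the following (a sketch; the session here is the native XRSession that Babylon exposes):

// The native XRSession fires "select" for transient inputs like gaze or screen taps.
const session = xrHelper.baseExperience.sessionManager.session;
session.addEventListener("select", (event) => {
    console.log("select from", event.inputSource.targetRayMode);
});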
Very nice tip! I was able to get it working in Chrome on Android. Here’s some code, in case it helps others looking for a similar solution.
xrHelper.input.onControllerAddedObservable.add((inputSource) => {
    // The onselect handler receives an XRInputSourceEvent; naming it "event"
    // avoids shadowing the outer inputSource parameter.
    xrHelper.baseExperience.sessionManager.session.onselect = (event) => {
        // Note that this gets triggered by any selection, including those
        // made with motion-controller buttons. If those buttons are
        // dealt with elsewhere, you'll need to check for gaze here. Otherwise,
        // the actOnButton function will get called twice.
        if (event.inputSource.targetRayMode.toUpperCase().indexOf("GAZE") !== -1) {
            // Strictly speaking, event.inputSource.targetRayMode === "gaze". I'm
            // being a bit paranoid here, just in case something changes in the future.
            // I figure regardless of future changes, it should still have the substring
            // "gaze" in it.
            actOnButton();
        }
    };
});
If I can ask - does the pointer-selection module work correctly? When pressing, do you see a ray from the headset towards the objects in front of you? This is implemented and should work out of the box. You should get the pointer down, up, and move events, just like a mouse click. If it doesn’t - would you be able to quickly debug this? I just want to know if line 301 in PointerSelection returns or continues (i.e. whether the pick was found correctly) - Babylon.js/WebXRControllerPointerSelection.ts at master · BabylonJS/Babylon.js · GitHub
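For reference, those pointer events can be picked up on the scene like any other pointer input - a rough sketch (assuming a scene variable and the BABYLON global, as in a playground):

scene.onPointerObservable.add((pointerInfo) => {
    switch (pointerInfo.type) {
        case BABYLON.PointerEventTypes.POINTERDOWN:
            console.log("pointer down", pointerInfo.pickInfo && pointerInfo.pickInfo.pickedMesh);
            break;
        case BABYLON.PointerEventTypes.POINTERUP:
            console.log("pointer up");
            break;
        case BABYLON.PointerEventTypes.POINTERMOVE:
            console.log("pointer move");
            break;
    }
});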
Thanks
I can’t find my cardboard-with-button, so I can’t test it. I also wonder what the Samsung Gear registers as. Will have to try this out.
Hi @RaananW. I should say that my app doesn’t use PointerSelection. I cast a ray from the camera to identify the picking location. I then detect trigger buttons, pad movements, and clicks to navigate based on that location (basically a simplified version of the teleportation system).
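Roughly what I do, as a simplified sketch (the real navigation logic is more involved):

// Cast a ray from the XR camera to find the current picking location.
const ray = xrHelper.baseExperience.camera.getForwardRay(100);
const pickInfo = scene.pickWithRay(ray);
if (pickInfo && pickInfo.hit) {
    // pickInfo.pickedPoint is then used as the navigation target.
    console.log(pickInfo.pickedPoint);
}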
I do see something that might be related to the pointer, though: