Detecting headset button press with WebXR on Android (Cardboard)

I’m currently migrating a WebVR app to WebXR. It’s working great with both an Oculus Quest and HTC Vive. But I’ve run into some problems with Chrome on Android in “Google Cardboard” mode (meaning no controllers).

I think I’m probably missing something obvious, but I can’t figure out how to detect when the user clicks the magnet select button on the cardboard headset (described here in case you’re not familiar: https://www.quora.com/How-does-the-magnet-select-button-for-Android-Cardboard-work-and-how-does-the-required-magnet-interface-with-the-touch-screen-of-the-phone).

I see some references to “gaze” in the documentation:

  1. Introduction to WebXR - Babylon.js Documentation
  2. WebXR Input and Controller support - Babylon.js Documentation

But no concrete examples.

Thanks for your help.

In case it helps, I do see some sort of gaze-related data here when using WebXR on Chrome/Android:

xrHelper.baseExperience.sessionManager.session.inputSources[0]
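
For reference, this is roughly how I'm poking at it in the console (just a sketch, using the names from the default XR experience helper):

    // Quick console check: inspect the first input source once the XR session is running.
    const session = xrHelper.baseExperience.sessionManager.session;
    const firstSource = session.inputSources[0];
    if (firstSource) {
        console.log(firstSource.targetRayMode); // reports "gaze" here on Chrome/Android
        console.log(firstSource.handedness);    // "none" for a gaze-style source
    }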

But I’m still not sure how to detect a headset-button press. Thanks again!

WebXR’s input source for a touch screen is the “screen” type. Every time you touch the screen a new input source is generated, and it is removed when you lift your finger. My guess is that the cardboard button works like that.
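
A minimal sketch of watching that happen, using plain WebXR session events (nothing Babylon-specific):

    // Each touch adds a transient input source with targetRayMode === "screen"
    // and removes it again when the touch ends.
    const session = xrHelper.baseExperience.sessionManager.session;
    session.addEventListener("inputsourceschange", (event) => {
        event.added.forEach((source) => console.log("added:", source.targetRayMode));
        event.removed.forEach((source) => console.log("removed:", source.targetRayMode));
    });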

What I do in the pointer-selection feature is emulate that touch as a pointer event.

If you use a headset with a controller (like Daydream) it is not gaze but a tracked pointer, and then you can treat it like any other motion-controller device. You can get the main button using getMainComponent(), or query the component types that are available (like touchpad or thumbstick). This would be the right place to read about it -
https://doc.babylonjs.com/how_to/webxr_controllers_support#motion-controllers
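
Roughly like this (a sketch against the default experience helper; the exact component set depends on the controller profile):

    xrHelper.input.onControllerAddedObservable.add((inputSource) => {
        inputSource.onMotionControllerInitObservable.add((motionController) => {
            // Only fires for tracked-pointer sources that expose a motion controller,
            // so a pure gaze source (Cardboard) will never reach this callback.
            const mainComponent = motionController.getMainComponent();
            mainComponent.onButtonStateChangedObservable.add((component) => {
                if (component.pressed) {
                    console.log("main button pressed");
                }
            });
        });
    });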

BTW - I very much enjoy your questions, as they give me pointers as to what documentation is missing. Please (!!) do ask as many as you can :slight_smile:

Famous last words! :slight_smile:

I’ve tried to come up with a playground example to illustrate my problem: https://playground.babylonjs.com/#WIF7NN#21

As you can see from my console.log comments in the playground, I’m pretty sure the input source accessible via xrHelper.input.onControllerAddedObservable.add is of type “gaze”.

But it’s still not clear to me how to detect the button click. Intuitively I’d expect a function like inputSource.onMotionControllerInitObservable.add (or maybe inputSource.onGazeControllerInitObservable.add), but I don’t see any such function firing here.

Thanks for your help. It’s amazing to me how responsive you guys are. I know you must be getting a million questions like these.


:smiley:

Very interesting. It actually makes sense when I think about it: this is a classic gaze-type controller.
The “select” event on the session is your friend. I’ll need to check tomorrow why it doesn’t register correctly, but this console output is very helpful already. Will keep you updated.

The “select” event on the session is your friend.

Very nice tip! I was able to get it working in Chrome on Android. Here’s some code, in case it helps others looking for a similar solution.

xrHelper.input.onControllerAddedObservable.add((inputSource) => {
    xrHelper.baseExperience.sessionManager.session.onselect = (event) => {
        // Note that this gets triggered by any selection, including those
        // made with motion-controller buttons. If those buttons are
        // dealt with elsewhere, you'll need to check for gaze here. Otherwise,
        // the actOnButton function will get called twice.

        if (event.inputSource.targetRayMode.toUpperCase().indexOf("GAZE") !== -1) {
            // Note that really event.inputSource.targetRayMode === "gaze". I'm
            // being a bit paranoid here, just in case something changes in the future.
            // I figure regardless of future changes, it should still contain the
            // substring "gaze".
            actOnButton();
        }
    };
});

Thanks again!


If I can ask - does the pointer-selection module work correctly? When pressing, do you see a ray from the headset towards the objects in front of you? This is implemented and should work out of the box. You should get the pointer down, up, and move events, just like a mouse click. If it doesn’t, would you be able to quickly debug it? I just want to know whether line 301 in PointerSelection returns or continues (i.e., whether the pick was found correctly) - Babylon.js/WebXRControllerPointerSelection.ts at master · BabylonJS/Babylon.js · GitHub

Thanks :slight_smile:

I can’t find my cardboard-with-button, so I can’t test it myself right now. I also wonder what the Samsung Gear registers as. Will have to try this out :slight_smile:

Hi @RaananW. I should say that my app doesn’t use PointerSelection. I cast a ray from the camera to identify the picking location. I then detect trigger buttons, pad movements, and clicks to navigate based on that location (basically a simplified version of the teleportation system).
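
Roughly like this (a sketch, not the exact app code; navigateToward is a made-up placeholder for my navigation logic):

    // Cast a ray straight out of the active camera and pick along it.
    const ray = scene.activeCamera.getForwardRay(100); // 100 = arbitrary ray length
    const pickInfo = scene.pickWithRay(ray);
    if (pickInfo && pickInfo.hit) {
        navigateToward(pickInfo.pickedPoint); // placeholder for the navigation step
    }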

I do see something that might be related to the pointer, though:

See the slight line to the right of the center point? That shows up automatically in each eye (i.e., it’s not something my code does).

If there’s a playground I could use to test line 301, I’m happy to do it.

Thanks.

Yes, that’s the pointer select. Have you tried using the pointer events? They should work, and they can also detect a click (pointer down).
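
For example, something along these lines (just a sketch; with the pointer-selection feature enabled, a headset “select” should arrive here like a mouse click):

    scene.onPointerObservable.add((pointerInfo) => {
        if (pointerInfo.type === BABYLON.PointerEventTypes.POINTERDOWN) {
            const pick = pointerInfo.pickInfo;
            if (pick && pick.hit) {
                console.log("picked:", pick.pickedMesh ? pick.pickedMesh.name : null);
            }
        }
    });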