WebXR input for Cardboard

Hi @RaananW et al,

I am building a WebXR app in Babylon and want to support Cardboard input, especially interaction with GUI controls such as the Slider.

Following a discussion over here https://forum.babylonjs.com/t/using-onpointerobservable-xr-cardboard/14485 I worked out how to respond to a “click”, i.e. the user touching the screen, by doing this…

xr.baseExperience.sessionManager.session.onselect = (event) => {
    if (event.type === 'select' && event.inputSource.targetRayMode === 'gaze') {
        const pointerRay = xr.baseExperience.camera.getForwardRay();
        // do stuff with the gaze ray
    }
};
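For example, the ray can then be used for a regular scene pick (a rough sketch, assuming scene is the Babylon scene; the logging is just illustrative):

// pick whatever the gaze ray hits when the screen is touched
const pickInfo = scene.pickWithRay(pointerRay);
if (pickInfo && pickInfo.hit) {
    console.log('Gaze-selected mesh:', pickInfo.pickedMesh ? pickInfo.pickedMesh.name : null);
}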

However, I want the Cardboard input to work like any other WebXR controller, i.e. as a pointer: touching the screen becomes pointer down, releasing becomes pointer up, and the pointer location is the centre of the screen, as in gaze (as above). What would be the best way to implement this, or are there plans to have Cardboard input work like this in the core WebXR support? Should I use code similar to the above and just invoke pointer events manually? What about drag operations, as required for a GUI Slider?

Hi Mark,

Cardboard is considered a gaze device (hence the gaze target ray mode), so, unless that has changed since the last time I checked, it doesn’t have any controller data created when touching the screen (as in AR/screen mode).
It does, however, react to the very basic XR events, namely select (and with it selectstart and selectend). You can use those events just like pointer events if you want general interaction with the scene (i.e. to know the screen was touched), but you will probably not know exactly where the user touched.

It would be interesting to check whether a new XRInputSource is created each time you touch the screen. That would be the only way to get the touch location. Otherwise, the only location-based pointer you have is the gaze, which can be enabled using the pointer selection feature.
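For reference, enabling the pointer selection feature with gaze forced on looks roughly like this (a sketch; double-check the option names such as forceGazeMode and timeToSelect against your Babylon version):

const xr = await scene.createDefaultXRExperienceAsync();
xr.baseExperience.featuresManager.enableFeature(
    BABYLON.WebXRFeatureName.POINTER_SELECTION,
    'latest',
    {
        xrInput: xr.input,
        // keep using the gaze pointer even if controller data shows up
        forceGazeMode: true,
        // optional dwell time (ms) before a gaze "click" fires
        timeToSelect: 1500
    },
    true,
    true
);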

Hi Raanan,
Thanks for the reply. I don’t care at all where the user touches the screen, just that they have touched it; it is a trigger. This is how Cardboard works in Unity, for instance. The position is always directly ahead of the camera, i.e. the gaze direction. Touching the screen is just like pulling the trigger on a controller: for a controller the ray is determined by the orientation of the controller, and for Cardboard it is the orientation of the camera, so I don’t see why they should be handled any differently. I got sidetracked on other issues today so didn’t get to progress it, but I will pursue it soon.
Thanks again for your input.


In that case, emulate pointer events using the select (start and end) events.
I will be happy to add simple pointer support for touch/select events when in gaze mode; you can create an issue on GitHub for that. It will probably take a little while, as I have a backlog the size of… well, a long list of tasks I need to finish 🙂
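Something like this should be a reasonable starting point (a rough sketch, assuming xr is the default experience helper and scene the Babylon scene; simulatePointerDown/Move/Up feed the same pointer observables the GUI listens to, and the per-frame move is what lets a Slider drag while the screen is held):

const session = xr.baseExperience.sessionManager.session;
let screenHeld = false;

// pick along the camera's forward ray (the gaze direction)
const pickFromGaze = () => {
    const ray = xr.baseExperience.camera.getForwardRay();
    return scene.pickWithRay(ray);
};

session.addEventListener('selectstart', (event) => {
    if (event.inputSource.targetRayMode !== 'gaze') { return; }
    screenHeld = true;
    const pick = pickFromGaze();
    if (pick) { scene.simulatePointerDown(pick); }
});

session.addEventListener('selectend', (event) => {
    if (event.inputSource.targetRayMode !== 'gaze') { return; }
    screenHeld = false;
    const pick = pickFromGaze();
    if (pick) { scene.simulatePointerUp(pick); }
});

// while the screen is held, keep feeding pointer moves so drags (e.g. a GUI Slider) keep updating
scene.onBeforeRenderObservable.add(() => {
    if (screenHeld) {
        const pick = pickFromGaze();
        if (pick) { scene.simulatePointerMove(pick); }
    }
});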


Did we find a solution for this?

I did answer here: Reticle based XR control - #7 by RaananW.
