Reticle-based XR control

Hi Team,


I am seeking guidance on how to implement teleportation and pointer events in my VR experience, using a reticle click as the trigger. As I don't have an XR controller due to budget constraints, I would like to trigger teleportation and click events based on the reticle's position and a tap, i.e. the center pointer in VR mode, driven by the gyroscope of a regular mobile device.

Similar to the YouTube VR controls.


There’s an XR emulation extension for Chrome you can use to test.

No @carolhmj, I am not asking about that. I am asking about the default cursor that appears at the center of the screen. I want the same functionality as YouTube VR: with the gyro-driven center reticle as the pointer, the user can click with a simple tap anywhere on the screen.

With Ray, RayHelper, and decals you should be able to create anything you want.

Is the pointer selection feature not working well? Isn’t it exactly what you are looking for?

Hi @RaananW,

I want to implement a gaze input event that triggers when the user taps anywhere on the screen. Currently, the gaze input event only triggers when the user stares at an element for a few seconds. Instead, I want the user to be able to tap anywhere on the screen to teleport to the position the gaze/reticle is targeting, if it’s targeting the floor.

In a VR scenario, how would people click on the screen?
Google Cardboard had a few different solutions to support input, but they were all very much off (like a small hole in the bottom to allow tapping the screen). Daydream (deprecated, but it did replace Cardboard) did what we are doing: a delay-based click based on gaze.

Of course, you can implement your own solution. The pointer selection feature would be a perfect starting point for that. But if I remember correctly, we don’t have pointer input when using gaze input.

I am looking for the same solution YouTube uses for VR: the same click event, where touching any part of the screen is treated as a click at the center gaze cursor. Something like that is what I’m expecting.

Anyway, I will try a custom solution.

Thanks a lot @RaananW.

The WebXR select event(s) will trigger when you touch the screen (unless blocked by the browser), so you can listen for them. Babylon exposes the session (via the XR session manager), so it’s only a matter of registering these events and reacting accordingly.
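As a minimal sketch of that idea: tie the WebXR `select` event (a screen tap) to a gaze-based teleport. This assumes `xr` is an already-started `WebXRDefaultExperience`, and the floor mesh name (`"floor"`) and the naive position update are placeholders for your own setup, not the one true implementation.

```javascript
// Sketch: tap anywhere (WebXR "select") -> teleport to where the gaze ray hits.
// `scene`, `xrCamera`, and `floorName` are assumed to come from your own setup.
function makeTapToTeleport(scene, xrCamera, floorName) {
  return function onSelect() {
    // The reticle direction is simply the camera's forward ray.
    const ray = xrCamera.getForwardRay(100);
    // Only consider the floor mesh as a teleport target.
    const pick = scene.pickWithRay(ray, (mesh) => mesh.name === floorName);
    if (pick && pick.hit) {
      // Simplistic teleport: move the XR camera above the picked point.
      xrCamera.position.x = pick.pickedPoint.x;
      xrCamera.position.z = pick.pickedPoint.z;
    }
    return pick;
  };
}

// Registering it (the session must already exist at this point):
// xr.baseExperience.sessionManager.session.addEventListener(
//   "select", makeTapToTeleport(scene, xr.baseExperience.camera, "floor"));
```

A real teleport would probably also snap `y` to the floor height and keep the feature's fade/rotation behavior, but the pick-then-move shape is the core of it.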

Hi @RaananW,

This is my code

So I am trying to enable teleportation whenever someone's gaze targets the floor mesh. Instead of stare-and-point to teleport, I want a tap (anywhere on the screen) to teleport.

Following this: Using onPointerObservable XR Cardboard - #15 by RaananW

`xr.baseExperience.sessionManager.session.onselect(e => console.log(e))`

sessionManager doesn’t contain any `session`, so it’s giving undefined. Is there anything I am missing or doing wrong here?

Thanks in advance.

The session only exists once it has started, so you will need to run this code after entering XR. You can either use the observables on the XR session manager, or the XR state observable on the XR experience helper.
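A small sketch of the first option: the session manager's `onXRSessionInit` observable fires with the native `XRSession`, so inside its callback the session is guaranteed to exist and you can register the `select` listener safely. `xr` is assumed to be a `WebXRDefaultExperience` created elsewhere.

```javascript
// Sketch: register the "select" listener only once the XRSession exists,
// instead of reading sessionManager.session before entering XR (undefined).
function registerSelect(xr, onSelect) {
  xr.baseExperience.sessionManager.onXRSessionInit.add((session) => {
    // `session` is the native XRSession; "select" fires on a screen tap.
    session.addEventListener("select", onSelect);
  });
}
```

The other route would be watching `onStateChangedObservable` on the experience helper and reacting when the state becomes `IN_XR`; either way the point is that registration happens after session creation, and only once.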

Hi @RaananW ,

I have one playground running here.

I am trying to teleport to the gaze position on screen tap, based on the headset's gaze position, but it is not working properly.
I also want the gaze pointer to be enabled all the time, instead of only on stare-and-point.

As I don’t have a gaze device that will allow me to test it, it will be very hard for me to debug.
Having said that, you should only register `onselect` once (or use `addEventListener("select")`) after WebXR initializes. It will trigger regardless of which controller XR creates under the hood.

Thanks for your reply.
This is what I tried, to control things based on gaze and screen tap:

But I am a little confused about how to get all the default pointer events, similar to the default (point-and-stare) behavior. Here I am doing everything inside `onselect` (which I don’t think is the correct way; I need your suggestion). I want all the default functionality (pointer focus events) to work here too.

You can emulate the pointer events yourself: since you are already running the callback yourself, call simulatePointerDown (or Up, or Move). Those functions exist on the scene.
Check the pointer selection code (for example here - Babylon.js/WebXRControllerPointerSelection.ts at 9e0bd9ff72aba11f090236a0690c4ca2be34249a · BabylonJS/Babylon.js · GitHub, and further down)
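As a sketch of what that could look like: feed a gaze pick through Babylon's normal pointer pipeline, so `onPointerObservable` subscribers see an ordinary down/up pair as if a controller had clicked. `simulatePointerDown`/`simulatePointerUp` are the Scene methods mentioned above; the camera and ray length here are assumptions about your setup.

```javascript
// Sketch: turn the current gaze pick into a simulated "click" that goes
// through the scene's regular pointer event pipeline.
function simulateGazeClick(scene, camera) {
  // Pick along the center gaze ray.
  const ray = camera.getForwardRay(100);
  const pick = scene.pickWithRay(ray);
  if (pick && pick.hit) {
    // Down + up on the same pick result behaves like a tap/click.
    scene.simulatePointerDown(pick);
    scene.simulatePointerUp(pick);
  }
  return pick;
}
```

You could call this from the `select` handler, so a screen tap produces the same pointer events that point-and-stare selection would.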

How can we simulate a pointer pick?

A pick would be a pointer down, depending on the use case, TBH.