I am seeking guidance on how to implement teleportation and pointer events in my VR experience, using reticle click as the trigger. As I do not possess an XR controller due to budget constraints, I would like to explore the option of enabling teleportation and click events based on reticle position and click, specifically the center pointer in VR mode based on the gyro in a regular mobile device.
No @carolhmj, I am not asking that. I am asking about the default cursor that comes by default at the center. I want the same functionality as YouTube VR: based on the gyro-driven center reticle, we should be able to click with a simple tap anywhere on the screen.
I want to implement a gaze input event that triggers when the user taps anywhere on the screen. Currently, the gaze input event only triggers when the user focuses/stares at an element for a few seconds. I want the user to be able to tap anywhere on the screen to teleport to the position the gaze/reticle is targeting, if it is targeting the floor.
In a VR scenario, how would people click on the screen?
Google Cardboard had a few different solutions to support input, but they were all very much off (like a small hole at the bottom to allow tapping the screen). Daydream (deprecated, but it did replace Cardboard) did what we are doing: a delay-based click driven by gaze.
Of course - you can implement your own solution. The pointer-selection feature would be a perfect starting point for that. But if I remember correctly, we don't have pointer input when using gaze input.
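Enabling that feature could look roughly like this (an untested sketch; `featuresManager` and `xrInput` would come from your `WebXRDefaultExperience`, and `featureName` is a parameter only to keep the snippet self-contained - in real code it would be `BABYLON.WebXRFeatureName.POINTER_SELECTION`):

```javascript
// Sketch: enable Babylon's pointer-selection feature as a starting point
// for a custom gaze/tap solution. featureName is injected only to keep
// this snippet self-contained; in practice it would be
// BABYLON.WebXRFeatureName.POINTER_SELECTION.
function enablePointerSelection(featuresManager, featureName, xrInput) {
  return featuresManager.enableFeature(featureName, "stable", {
    xrInput,
    // let every input source (including transient screen-tap inputs) select
    enablePointerSelectionOnAllControllers: true,
  });
}
```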
I am searching the same solution like how youtube is doing with the VR . Like same click event, any part of of screen u touch it will consider center gaze as cursor pointer. Some thing similar I am expecting.
The WebXR select event(s) will trigger when you touch the screen (unless blocked by the browser), so you can listen to them. Babylon exposes the session (in the xr session manager), so it’s only a matter of registering these events and reacting accordingly.
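Something like this, for example (an untested sketch; `session` would be the native XRSession from `xr.baseExperience.sessionManager.session` after entering XR, and `onTap` is a hypothetical callback of your own):

```javascript
// Sketch: react to screen taps via the WebXR "select" event.
// `session` is the native XRSession (exposed by Babylon's session manager);
// `onTap` is a hypothetical callback of your own.
function listenForTaps(session, onTap) {
  const handler = (evt) => onTap(evt);
  session.addEventListener("select", handler);
  // return an unsubscribe function so the listener can be cleaned up later
  return () => session.removeEventListener("select", handler);
}
```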
So, actually, I am trying to enable teleportation whenever someone gazes at the floor mesh. Instead of stare-and-point to teleport, I want a tap (anywhere on the screen) to teleport.
The session only exists after it has started, so you will need to run this code after entering XR. You can either use the observables on the XR session manager, or the XR state observable on the XR experience helper.
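As a sketch of the second option (the `IN_XR` constant is passed in only to keep the helper self-contained; in Babylon it would be `BABYLON.WebXRState.IN_XR`, and `stateObservable` would be the helper's `onStateChangedObservable`):

```javascript
// Sketch: run a callback only once the XR state reaches IN_XR.
// stateObservable mirrors Babylon's observable API (an object with add()).
function onEnterXR(stateObservable, IN_XR, callback) {
  stateObservable.add((state) => {
    if (state === IN_XR) callback(); // session exists from here on
  });
}
```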
I am trying to teleport to the gaze position on screen tap, based on the headset's gaze position, but it is not working properly.
And I want the gaze pointer enabled at all times, instead of stare-and-point.
As I don’t have a gaze device that will allow me to test it, it will be very hard for me to debug.
Having said that - you should only register the onselect handler once (or use addEventListener("select") after WebXR initializes). It will trigger regardless of what controller XR creates under the hood.
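A minimal guard could look like this (just a sketch; `onXRSessionInit` mirrors Babylon's observable on the session manager, and `handleSelect` is your own handler):

```javascript
// Sketch: attach the "select" handler exactly once per session, no matter
// how many times the init observable fires or how many input sources XR
// creates under the hood.
function registerSelectOnce(sessionManager, handleSelect) {
  const registered = new WeakSet();
  sessionManager.onXRSessionInit.add((session) => {
    if (registered.has(session)) return; // already wired for this session
    registered.add(session);
    session.addEventListener("select", handleSelect);
  });
}
```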
But I am a little confused: how do I get all the default pointer events to work the same way they do by default (point and stare)?
Here I am doing everything inside the onselect handler (which I don't think is the correct way; I need your suggestion). I want all the default functionality (pointer focus events) here as well.
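Roughly, the flow I am after looks like this (only a sketch of the decision logic; `pickForward`, `isFloor`, and `moveTo` are hypothetical placeholders that in Babylon would wrap `scene.pickWithRay(camera.getForwardRay())`, a floor-mesh check, and moving the XR camera):

```javascript
// Sketch: on a screen tap ("select"), pick along the camera's forward ray
// and teleport only when the gaze is on the floor.
// pickForward(), isFloor(), moveTo() are hypothetical injected helpers.
function teleportOnTap(pickForward, isFloor, moveTo) {
  const pick = pickForward();
  if (pick && pick.hit && isFloor(pick.pickedMesh)) {
    moveTo(pick.pickedPoint); // jump to the gazed point on the floor
    return true;
  }
  return false; // gaze was not on the floor: ignore the tap
}
```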