I have an existing BJS app that I am trying to make work with WebXR.
I converted the fullscreen UI control panel into a UI texture applied to a plane:
AdvancedDynamicTexture.CreateFullscreenUI("UI") becomes AdvancedDynamicTexture.CreateForMesh(plane)
And used the approach here:
to keep the controls facing the camera. So far so good.
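For context, the conversion looks roughly like this (a minimal sketch, not my exact code; the plane name, sizes, and the billboard-mode line are assumptions, and it assumes the usual BABYLON / BABYLON.GUI globals and an existing scene, as in the Playground):

```ts
// Sketch: move the fullscreen UI onto a plane (names and sizes are illustrative).
const plane = BABYLON.MeshBuilder.CreatePlane("uiPlane", { width: 1, height: 1 }, scene);
plane.position = new BABYLON.Vector3(0, 1.5, 2);

// Instead of BABYLON.GUI.AdvancedDynamicTexture.CreateFullscreenUI("UI"):
const adt = BABYLON.GUI.AdvancedDynamicTexture.CreateForMesh(plane, 1024, 1024);

const button = BABYLON.GUI.Button.CreateSimpleButton("btn", "Click me");
button.width = "300px";
button.height = "80px";
button.color = "white";
button.background = "green";
adt.addControl(button);

// One common way to keep the panel facing the camera (the linked approach may differ):
plane.billboardMode = BABYLON.Mesh.BILLBOARDMODE_ALL;
```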
But when I enter immersive mode on my Quest 2, the control panel moves back and up, and it ends up too small to use.
Here’s a playground that shows the plane that I am putting my controls on:
Is there a better way to get 2D GUI controls to work in VR?
What is going on here with respect to my size and position in VR vs the size and position of my control panel? I admit, I am extremely confused.
Thanks and I hope I have more or less given enough info.
I think the reason the control panel is not being positioned properly is that a new camera is activated in immersive mode. I tried to figure out which camera is being used and to position the control panel from that camera's position and rotation, but I was unsuccessful.
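For reference, the direction I was attempting looked roughly like this (only a sketch, assuming the default XR experience helper and the plane from above; the offset values are guesses):

```ts
// Sketch: reparent the UI plane to the WebXR camera when an immersive session starts.
// (Run inside an async createScene / async function.)
const xr = await scene.createDefaultXRExperienceAsync();

xr.baseExperience.onStateChangedObservable.add((state) => {
    if (state === BABYLON.WebXRState.IN_XR) {
        // In immersive mode the active camera is the WebXR camera.
        plane.parent = xr.baseExperience.camera;
        plane.position = new BABYLON.Vector3(0, -0.3, 1.5); // offsets are guesses
        plane.rotation = BABYLON.Vector3.Zero();
    } else if (state === BABYLON.WebXRState.NOT_IN_XR) {
        plane.parent = null;
    }
});
```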
I have not figured out effective methods of debugging in immersive mode, like using the Inspector. Yet.
WebXR doesn't support fullscreen UI. It is technically possible (though not yet integrated), but it makes little sense unless you want to display HUD details. You can't interact with a fullscreen UI.
The better approach is either to create a "menu" based on a plane (for example) and attach the advanced dynamic texture to it, or to use the 3D GUI elements. Those can be positioned wherever makes the most sense for your use case: connected to a controller, 1 unit away from the user, as a child of the WebXR camera, and so on. A rough sketch of both options is below.
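Something along these lines (just a sketch; the sizes and offsets are arbitrary, and it assumes an async context with an existing scene):

```ts
// Option 1: plane-based menu attached to the WebXR camera.
const xr = await scene.createDefaultXRExperienceAsync();
const menuPlane = BABYLON.MeshBuilder.CreatePlane("menu", { size: 0.5 }, scene);
// Add your 2D controls to this texture as usual.
const menuTexture = BABYLON.GUI.AdvancedDynamicTexture.CreateForMesh(menuPlane, 512, 512);
menuPlane.parent = xr.baseExperience.camera;
menuPlane.position = new BABYLON.Vector3(0, -0.2, 1); // roughly 1 unit in front of the user

// Option 2: 3D GUI elements via the GUI3DManager.
const manager = new BABYLON.GUI.GUI3DManager(scene);
const panel = new BABYLON.GUI.StackPanel3D();
manager.addControl(panel);
panel.position = new BABYLON.Vector3(0, 1.5, 1.5);

const holoButton = new BABYLON.GUI.HolographicButton("toggle");
panel.addControl(holoButton);
holoButton.text = "Toggle";
```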