I would like to create some sites that have different interfaces when used on a desktop (mouse/kbd), a touch screen (phone/tablet) and a VR device.
Poking around, I don’t see a clear way of detecting which of these situations I’m in. Initially, I expect to be in non-VR (desktop or touch), and I can (I assume) detect entering VR from the VR camera/helpers. (Eventually, WebXR will hopefully support in-VR navigation, but it doesn’t yet.)
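For the 2D side, the best I’ve come up with so far is plain browser feature detection, something like the sketch below (a coarse-pointer media query plus a WebXR capability check; this isn’t Babylon-specific, so I may be missing a built-in way):

```ts
// Rough sketch of the detection I'm considering (browser APIs, not Babylon):
// a coarse pointer usually means a touch-first device; a fine pointer means mouse/kbd.
async function detectEnvironment() {
  const isTouch =
    window.matchMedia?.("(pointer: coarse)").matches ||
    navigator.maxTouchPoints > 0;

  // WebXR capability check: should I even offer an enter-VR path?
  const vrSupported = navigator.xr
    ? await navigator.xr.isSessionSupported("immersive-vr")
    : false;

  return {
    initialUI: isTouch ? "touch" : "desktop", // which 2D UI to build first
    vrSupported,
  };
}
```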
So, I’d like to know which form of 2D screen I’m on (to create the right variation of the UI), then put up the enter-VR button (via the helper) and switch the UI once the user enters VR.
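For the VR switch itself, my rough sketch is to use the default XR experience helper and react to its state observable; I’m assuming `onStateChangedObservable` on the base experience is the right hook, but please correct me if there’s a better pattern:

```ts
import { Scene, WebXRState } from "@babylonjs/core";

// Sketch: create the default XR helper (which adds the enter-VR overlay button)
// and swap UI modes when the XR session state changes.
async function setupXRUI(scene: Scene, setUIMode: (mode: "2d" | "vr") => void) {
  const xr = await scene.createDefaultXRExperienceAsync({
    // floorMeshes, uiOptions, etc. would go here
  });

  // If WebXR isn't available, baseExperience won't be usable; stay on the 2D UI.
  if (!xr.baseExperience) {
    return;
  }

  xr.baseExperience.onStateChangedObservable.add((state) => {
    if (state === WebXRState.IN_XR) {
      setUIMode("vr"); // user just entered the headset
    } else if (state === WebXRState.NOT_IN_XR) {
      setUIMode("2d"); // back to the desktop/touch UI
    }
  });
}
```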
Thanks for any pointers or guidance. I’m not asking for help with the UIs themselves, just pointers to the “Babylon way” of organizing the control flow and code structure.