VR Menu for Merge-Headset

I’m trying to create a VR application for the Merge headset (https://mergevr.com/).
This system uses simple mechanical buttons that press on the top left and top right of the phone’s screen, simulating basic touch interaction.

What is the best way to build a GUI for this in VR (two non-stereoscopic buttons rendered on top of the VR output, similar to the “Exit VR” button)?

My current state is similar to the example below (an approach with full-VR buttons; parenting the UI to the camera would work, but it is not a very clean solution, as the GUI is rendered in stereo instead of once):
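For reference, the camera-parented approach looks roughly like this (a sketch with my own names, assuming babylon.js and babylon.gui are loaded as the global `BABYLON`):

```javascript
// Sketch of the current approach: a GUI plane parented to the camera,
// so the buttons follow the view. Drawback: the plane is part of the
// 3D scene, so it is rendered once per eye (stereo), not as a flat overlay.
function createCameraAttachedUi(scene, camera) {
  // A small plane floating in front of the camera.
  const plane = BABYLON.MeshBuilder.CreatePlane("uiPlane", { size: 1 }, scene);
  plane.parent = camera;
  plane.position = new BABYLON.Vector3(0, 0.6, 2); // up and in front of the view

  // Draw Babylon GUI controls onto the plane's texture.
  const texture = BABYLON.GUI.AdvancedDynamicTexture.CreateForMesh(plane);
  const button = BABYLON.GUI.Button.CreateSimpleButton("restart", "Restart");
  button.width = "300px";
  button.height = "100px";
  button.onPointerUpObservable.add(() => {
    // restart logic would go here
  });
  texture.addControl(button);
  return plane;
}
```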

But on my phone the touch interactions don’t seem to work in VR mode.
(Test: how could this game be restarted in VR mode on a phone using touch?)

Is this the right approach in general, and what would be needed to enable regular touch interactions in VR mode?

Thanks in advance.

@RaananW Can you help with this?

I’ve never heard of Merge, but it looks very nice. Kind of like Daydream, I assume?
The phone you are using is important. If it supports WebVR, you can use the WebVR experience helper, but if it doesn’t (as in the case of iOS), it will fall back to the device orientation camera, which is a bit limited due to the lack of API support. But it still works! :slight_smile:

I don’t know how the touch registers on the device itself. The GearVR, for example, registered itself as a “controller”, so you could use the controller API. Daydream also has an external controller. If Merge registers the touch as regular touch events, you can use the Pointer API we have integrated in Babylon. If it registers as a gamepad, you can use our Gamepad API. Is there any documentation or API definition for the device itself? Maybe I will be able to help there.
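If the Merge buttons do come through as ordinary touch events, hooking them up is just the regular pointer observable. A sketch (the `scene` and callback are assumed to exist in your app):

```javascript
// Sketch: reacting to touch/click via Babylon's Pointer API.
// Merge's mechanical buttons tap the screen, so if the browser reports
// them as normal touch events, this observable should fire for them too.
function registerTouchHandler(scene, onTap) {
  const observer = scene.onPointerObservable.add((pointerInfo) => {
    if (pointerInfo.type === BABYLON.PointerEventTypes.POINTERDOWN) {
      // pointerInfo.event is the underlying DOM event, so clientX/clientY
      // can be used to tell the left trigger from the right one.
      onTap(pointerInfo.event.clientX, pointerInfo.event.clientY);
    }
  });
  return observer; // keep it around to remove the handler later
}
```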

BTW, you can always use the gaze support in the VR experience helper to use the direction you are looking at as a pointer. You can read about it here: Use the WebVR experience helper - Babylon.js Documentation
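Enabling gaze is a one-liner on the helper. Roughly (a sketch, assuming a `scene` already exists):

```javascript
// Sketch: default VR experience with gaze-based interaction enabled.
// enableInteractions() turns on the gaze pointer, so the direction
// you are looking at acts as a pointer over the scene's meshes.
function setupVrWithGaze(scene) {
  const vrHelper = scene.createDefaultVRExperience();
  vrHelper.enableInteractions();
  return vrHelper;
}
```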

If you have some more information about the device itself, I might be able to help further.

Thank you for the fast reply.
I attached an image to make the intended gui more clear.
The Merge device is actually just a “dumb case” without any external controller; the buttons really do use a simple mechanical system to trigger the touch input, so you can test the interaction directly on your phone. My device is a Samsung Galaxy S8; I also tested it on a Huawei P30 Lite, where touch interactions in VR seemed to work. But I think there must be an easier and more robust way, especially as I only need the buttons rendered mono, outside the stereo view.
For example, maybe there is a way to hook into the VR fullscreen mode so that the entire document goes fullscreen instead of only the canvas. Then I could simply place fixed HTML buttons on top of the canvas to enable this kind of interaction.
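The “fullscreen the document, not the canvas” idea would look something like this (a sketch using the standard Fullscreen API; the overlay buttons themselves would just be absolutely-positioned HTML):

```javascript
// Sketch: put the whole page into fullscreen so absolutely-positioned
// HTML buttons stay visible on top of the rendering canvas.
function enterFullscreenWithOverlay() {
  const root = document.documentElement;
  if (root.requestFullscreen) {
    root.requestFullscreen(); // whole document: canvas + overlay buttons
  } else if (root.webkitRequestFullscreen) {
    root.webkitRequestFullscreen(); // older WebKit fallback
  }
}
```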
Or maybe there is a way to build an additional non-stereo gui on top of the stereo view.
Any ideas?

Ok, I finally figured out a way.
The solution was to use a simple VRDeviceOrientationFreeCamera without WebVR. I then implemented my own fullscreen button, which shows the entire document, including two simple HTML buttons, on top of the canvas.
I will take a look at the WebVR API to see if I can find a solution that includes WebVR.
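The solution above can be sketched as follows (element ids and handlers are my own assumptions for illustration):

```javascript
// Sketch of the described solution: a VRDeviceOrientationFreeCamera
// (stereo rendering driven by device orientation, no WebVR session),
// a custom fullscreen button, and two plain HTML buttons over the canvas.
function setupMergeVr(engine, canvas) {
  const scene = new BABYLON.Scene(engine);
  const camera = new BABYLON.VRDeviceOrientationFreeCamera(
    "vrCam", new BABYLON.Vector3(0, 1.7, 0), scene);
  camera.attachControl(canvas, true);

  // Custom fullscreen button: fullscreens the whole document, so the
  // HTML overlay buttons remain visible over the stereo view.
  document.getElementById("enterVr").addEventListener("click", () => {
    document.documentElement.requestFullscreen();
  });

  // The two mono overlay buttons, positioned where the Merge case's
  // mechanical triggers press the screen (top left / top right).
  document.getElementById("leftBtn").addEventListener("click", () => {
    // left trigger action
  });
  document.getElementById("rightBtn").addEventListener("click", () => {
    // right trigger action
  });
  return scene;
}
```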
