I recently got a Canon dual fisheye VR lens, and I’m building a little gallery page to show the photos. The gallery consists of:
A PhotoDome to show the image (180-degree SBS stereo, cross-eyed)
Some floating buttons (meshes made from Discs and whatnot) to let the user jump to the previous or next photo and exit the immersive environment.
I’m testing in my Quest 2.
Everything looks great… until the texture for a photo loads in the dome. The photo itself looks fine and renders in 3D as intended.
But for the buttons, I’m getting “double vision” – my left and right eyes just refuse to converge them into a single 3D image.
I’m pretty sure BabylonJS isn’t changing the buttons – I can close one eye at a time and click a button to hide and show the dome, and the button in each eye seems stationary.
So, I’m guessing my brain is playing tricks and is latching onto the photo (60mm IPD according to Canon) for convergence, then getting confused by the buttons. I’ve tried both 58mm and 63mm (hardware setting on the Quest 2) with no improvement.
It’s also possible my custom photo post-processing scripts are wrong, which might make the two images less aligned than they should be – misaligned enough that, while my brain can still converge them, the disparity no longer matches the way the L/R views of the buttons are rendered.
Has anyone else played with mixing real 3D photos with meshes and seen similar issues?
Is there any way to control the render offsets of the buttons so I can see if there’s a setting that matches the photos?
This viewMatrix tweak seems to be working, but the on-screen meshes drift a bit when I move my head (they’re in billboard mode, so they also rotate, but that’s beside the point). I can live with this, since the only elements of this environment are the photo dome and the buttons.
I still feel like I’m missing something – I shouldn’t have to mess with the viewMatrix directly (I also get a TypeScript error because the property is read-only). Is there a better solution?
(I also have to set the camera position back to the default so the user can’t move around, only look around. There’s probably a better way to do that as well.)
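For reference, the core of my tweak boils down to shifting each eye’s view matrix sideways by half the IPD mismatch. Here’s a standalone sketch of just that math (not my exact code – it assumes Babylon’s row-major matrix layout, where the translation lives in elements 12–14, and the sign per eye may need flipping):

```typescript
// Sketch of the per-eye offset math behind the mitigation (my own helpers,
// not a Babylon API). Each eye absorbs half of the difference between the
// headset IPD and the IPD the photo was shot with.

type Mat4 = number[]; // 16 elements, row-major (Babylon's convention)

// e.g. headsetIpdMm = 63 (a Quest 2 hardware setting), photoIpdMm = 60 (Canon)
function eyeShiftMm(headsetIpdMm: number, photoIpdMm: number): number {
  return (headsetIpdMm - photoIpdMm) / 2;
}

// Apply a view-space lateral shift (in scene units) to a view matrix.
// In a row-major layout a view-space x translation is just an addition
// to element 12; which sign each rig camera needs is left to the caller.
function shiftViewMatrixX(view: Mat4, shift: number): Mat4 {
  const out = view.slice(); // don't mutate the input
  out[12] += shift;
  return out;
}
```

In the scene I apply this (converted from mm to scene units) to `camera.rigCameras[0]` and `[1]` with opposite signs; exactly where to hook that in without fighting the read-only property is part of my question.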
The buttons are just floating discs, not on a UI layer.
I did also have a “Loading…” message on the UI layer (which, not surprisingly, only appears between photos). The mitigation above actually caused that message to show double vision, so I had to move it to a floating Plane.
I’ll see what I can do about a repo.
The issue also presents with photos taken on my Vuze XR, so I suspect it will be a common problem for content produced by most stereo cameras, not something specific to the Canon. I haven’t tried photos from my EVO yet; I’ll give that a shot too. And since it appears to be an optical illusion, maybe some people are just better than me at dealing with scenes that represent more than one IPD.
I wasn’t able to unwire everything enough yet to have a reproduction (the code is strewn around a bunch of Vue and TypeScript files), but I do have a live demo!
If you go here, you can see some of my 3D photos taken at Burning Man. By default, the immersive mode does use the mitigation I mentioned above to adjust the IPD:
But if you go here, the IPD adjustment is temporarily disabled, so the camera rigs are at their default offsets:
For me, at least, the second one gives double vision of the nav buttons (just above eye level) as soon as the dome texture loads, while the first one is able to reasonably correct for it.
The photos are downloadable (WebP, for the best quality/size compromise), so if you’d like to view them in some other context – to see whether I’ve butchered the equirectangular projection or other adjustments from the RAW file – please feel free to do so.
Hey, sorry! Just catching up on older topics. Did you ever manage to get it to work?
Changing the offset as you did will make your images look correct, but everything else will be displayed incorrectly, as the view matrix is based on data provided to us by WebXR. It might be possible to apply the right changes when rendering the image only, while keeping the rest of the scene rendering as it should, but a reproduction here would be more than helpful.
BTW – for some reason the website is unreachable from my Quest. I have no idea why. The internet connection is working, and I’ve had no issue with any other website except this one… I’ll keep on trying.
I created a simple PG that reproduces the problem. The BJS logo appears blurry, and if you close the left or the right eye you can see that the logo is not in the same position.
The photo dome is a great way of displaying stereo images (or videos!) to a VR user, but it is not a good way to create a skybox for a scene. That’s not the way it works, and not the way it is meant to be used. Adding objects at its center will make everything look “weird” – those objects will actually be placed in 3D at the positions you set. The reason it is blurry is that you put it 180 units away from the headset.
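For reference, the dome’s radius comes from the `size` option at construction time, so you control that distance yourself. A minimal sketch (the URL and numbers are placeholders):

```typescript
import { PhotoDome, Scene } from "@babylonjs/core";

// Sketch: the dome's radius is set by the `size` option at construction
// (here 1000 units), and side-by-side stereo is enabled via imageMode.
// The texture path is a placeholder.
function createStereoDome(scene: Scene): PhotoDome {
  const dome = new PhotoDome(
    "photoDome",
    "textures/my-vr180-photo.webp",
    { resolution: 32, size: 1000 },
    scene
  );
  dome.imageMode = PhotoDome.MODE_SIDEBYSIDE;
  return dome;
}
```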
My goal is to build an interactive “Panorama Tour” in WebXR. I want to use a stereoscopic VR180/360 image as the immersive background and place interactive 3D elements (like video planes, images, and GUI buttons) in front of the user so they can interact with the scene.
Since you mentioned that PhotoDome isn’t meant to be used as a skybox with 3D objects placed inside it (which explains the blurriness and depth clash), what would be the recommended approach to achieve this?
Since there’s no native stereo skybox yet, do you have any quick pointers on how I could hack this together myself?
Would using two spheres with layer masks (one per eye) or a custom shader be the way to go? Any rough ideas or architectural hints would be awesome so I can try building a workaround!
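To make the question concrete, here’s roughly what I have in mind for the two-sphere / layer-mask approach – completely untested, and the mask values, sphere setup, and the assumption that `rigCameras[0]`/`[1]` are the left/right eyes are all my guesses:

```typescript
import {
  MeshBuilder, StandardMaterial, Texture, Scene, WebXRCamera,
} from "@babylonjs/core";

// Hypothetical per-eye mask bits, outside the default 0x0FFFFFFF mesh mask.
const LEFT_MASK = 0x10000000;
const RIGHT_MASK = 0x20000000;

// One inside-out sphere per eye, each showing half of a side-by-side image.
function makeEyeSphere(scene: Scene, url: string, left: boolean) {
  const sphere = MeshBuilder.CreateSphere(
    left ? "skyLeft" : "skyRight",
    { diameter: 1000, sideOrientation: 1 /* BACKSIDE: render the inside */ },
    scene
  );
  const mat = new StandardMaterial(left ? "matLeft" : "matRight", scene);
  const tex = new Texture(url, scene);
  tex.uScale = 0.5;               // each eye gets half of the SBS image
  tex.uOffset = left ? 0 : 0.5;
  mat.emissiveTexture = tex;
  mat.disableLighting = true;     // the photo is already lit
  sphere.material = mat;
  sphere.infiniteDistance = true; // behave like a skybox
  sphere.layerMask = left ? LEFT_MASK : RIGHT_MASK;
  return sphere;
}

// Let each rig camera see the default meshes plus its own eye's sphere.
function assignEyeMasks(xrCamera: WebXRCamera) {
  xrCamera.rigCameras[0].layerMask = 0x0fffffff | LEFT_MASK;
  xrCamera.rigCameras[1].layerMask = 0x0fffffff | RIGHT_MASK;
}
```

Is this the right general shape, or am I off in the weeds?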
It’s a very good question, to which I don’t have a fast answer. When I address this I will dive into the main differences between the two implementations and get to the reason behind the “incorrect” projection. There seem to be a few flags set on the skybox (for example infiniteDistance and ignoreCameraMaxZ), there is the texture type, and of course there is the stereo effect, which we will need to maintain. But this is how I would approach it: implement a stereo skybox based on the existing, working skybox.