IPD mismatch with Canon VR lens and PhotoDome?

I recently got a Canon dual fisheye VR lens, and I’m building a little gallery page to show the photos. The gallery consists of:

  • A PhotoDome to show the image (180-degree SBS stereo, cross-eyed)
  • Some floating buttons (meshes made from Discs and whatnot) to let the user jump to the previous or next photo and exit the immersive environment.
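For reference, the setup is roughly this (a simplified sketch with placeholder names, values, and URL — not the actual code):

```javascript
// Sketch of the gallery setup (placeholder names/URL — not the real code).
const dome = new BABYLON.PhotoDome(
	"photoDome",
	"photos/example-sbs.webp",          // 180° side-by-side stereo photo
	{ resolution: 64, size: 1000 },
	scene
);
dome.imageMode = BABYLON.PhotoDome.MODE_SIDEBYSIDE; // one half per eye
dome.halfDome = true;                               // 180° content

// Nav buttons: plain disc meshes, billboarded toward the camera.
const nextButton = BABYLON.MeshBuilder.CreateDisc("next", { radius: 0.1 }, scene);
nextButton.billboardMode = BABYLON.Mesh.BILLBOARDMODE_ALL;
nextButton.position.set(0.3, 1.8, 2);
```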

I’m testing in my Quest 2.

Everything looks great… until a photo texture loads into the dome. The photo itself looks fine and renders in 3D as intended.

But for the buttons, I’m getting “double vision” – my left and right eye just refuse to converge them into a 3D view.

I’m pretty sure BabylonJS isn’t changing the buttons – I can close one eye at a time and click a button to hide and show the dome, and the button in each eye seems stationary.

So, I’m guessing my brain is playing tricks and is latching onto the photo (60mm IPD according to Canon) for convergence, then getting confused by the buttons. I’ve tried both 58mm and 63mm (hardware setting on the Quest 2) with no improvement.

It’s also possible my custom photo post-processing scripts are wrong, which might leave the two images less aligned than they should be – still aligned enough that my brain can converge them, but too different from the way the L/R views of the buttons are rendered.

Has anyone else played with mixing real 3D photos with meshes and seen similar issues?

Is there any way to control the render offsets of the buttons so I can see if there’s a setting that matches the photos?

(I’m using the latest version of BabylonJS)

Ok, so I made some headway by manually adjusting the L/R camera offset during each frame:

	scene.onBeforeRenderObservable.add(() => {
		if (!xrCamera) return
		// Pin the rig in place so the user can look around but not move
		xrCamera.position = initialPosition
		const leftEye = xrCamera.rigCameras[0]
		const rightEye = xrCamera.rigCameras[1]
		// Nudge the x-translation of each eye's view matrix to adjust the baseline
		const offset = 0.3
		leftEye.getViewMatrix().m[12] = offset
		rightEye.getViewMatrix().m[12] = -offset
	})
This seems to be working, but the on-screen meshes drift a bit when I move my head (they’re in billboard mode, so they also rotate, but that’s beside the point). I can live with that, since the only elements in this environment are the photo dome and the buttons.

I still feel like I’m missing something and I shouldn’t have to mess with the viewMatrix directly (I also get a TS error that it’s “read-only”). Is there a better solution?
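In case it helps anyone, here’s why poking index 12 does anything at all. Babylon stores matrices row-major in a flat 16-element array with the translation in elements 12–14, and it multiplies row vectors on the left, so `m[12]` is the x-translation of the view transform. A minimal plain-JS sketch (no Babylon needed):

```javascript
// Transform a point [x, y, z] by a 4x4 row-major matrix,
// row-vector convention (p' = p * M), like Babylon does.
function transformPoint(m, [x, y, z]) {
	return [
		x * m[0] + y * m[4] + z * m[8] + m[12],
		x * m[1] + y * m[5] + z * m[9] + m[13],
		x * m[2] + y * m[6] + z * m[10] + m[14],
	];
}

// Identity matrix, then set the x-translation slot (index 12),
// just like the onBeforeRenderObservable snippet above does.
const view = [
	1, 0, 0, 0,
	0, 1, 0, 0,
	0, 0, 1, 0,
	0, 0, 0, 1,
];
view[12] = 0.3;

console.log(transformPoint(view, [0, 0, 0])); // → [0.3, 0, 0]
```

So writing `offset` into one eye’s view matrix and `-offset` into the other’s shifts everything horizontally in each eye’s view, which is effectively an IPD change.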

(I also have to set the camera position back to the default so the user can’t move around, only look around. There’s probably a better way to do that as well.)
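The closest thing to a cleaner movement lock I’ve found is to not enable teleportation at all and keep pinning the position (still a sketch, assuming the default XR experience helper; the eye-height value is a placeholder):

```javascript
// Sketch: "look around, but don't move" (assumes the default XR helper).
const xr = await scene.createDefaultXRExperienceAsync({
	disableTeleportation: true, // no teleport rays/snapping at all
});

// Pin the headset's world position every frame; orientation still tracks.
const lockedPosition = new BABYLON.Vector3(0, 1.6, 0); // assumed eye height
scene.onBeforeRenderObservable.add(() => {
	xr.baseExperience.camera.position.copyFrom(lockedPosition);
});
```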

That seems a pretty specific issue, but maybe @RaananW has some idea?


I’ll comment on the buttons – are you using a fullscreen UI? This looks like the effect of the (still incorrect) fullscreen UI support in XR.

Can you share a reproduction of the scene? In the Playground, if possible?

The buttons are just floating discs, not on a UI layer.

I did also have a “Loading…” message on the UI layer (which only appears, not surprisingly, between photos). The mitigation above actually caused that to create double vision, so I had to move that to a floating Plane.
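For the record, the floating Plane is just a DynamicTexture with the text drawn onto it, something like this (simplified sketch with placeholder names and sizes):

```javascript
// Sketch: a floating "Loading…" plane instead of fullscreen UI text.
const plane = BABYLON.MeshBuilder.CreatePlane(
	"loading", { width: 1, height: 0.25 }, scene);
plane.position.set(0, 1.6, 2); // placeholder placement in front of the user

const tex = new BABYLON.DynamicTexture(
	"loadingTex", { width: 512, height: 128 }, scene);
tex.hasAlpha = true;
// x = null centers the text horizontally
tex.drawText("Loading…", null, 90, "bold 64px sans-serif", "white", "transparent", true);

const mat = new BABYLON.StandardMaterial("loadingMat", scene);
mat.diffuseTexture = tex;
mat.emissiveColor = BABYLON.Color3.White(); // readable without scene lights
plane.material = mat;
```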

I’ll see what I can do about a repro.

The issue also presents when viewing photos made in my Vuze XR, so I suspect it’s going to be a common problem for content produced in most stereo cameras, not just specific to the Canon. I haven’t tried photos from my EVO yet, I’ll give that a shot too. And since it appears to be an optical illusion, maybe some people are just better than me at dealing with scenes with more than one IPD represented.

If that’s the case, it does sound like it could be a common problem. Would be great to see the reproduction.

I wasn’t able to untangle everything enough yet to build a reproduction (the code is strewn across a bunch of Vue and TypeScript files), but I do have a live demo!

If you go here, you can see some of my 3D photos taken at Burning Man. By default, the immersive mode does use the mitigation I mentioned above to adjust the IPD:

But if you go here, the IPD adjustment is temporarily disabled, so the camera rigs are at their default offsets:

For me, at least, the second one gives double vision of the nav buttons (just above eye level) as soon as the dome texture loads, while the first one corrects for it reasonably well.

The photos are downloadable (WebP, for the best quality/size compromise), so if you’d like to view them in some other context to check whether I’ve butchered the equirectangular projection or other adjustments from the RAW files, please feel free.

Hey, sorry! Just catching up on older topics. Did you ever manage to get it to work?
Changing the offset as you did will make your images look correct, but the rest will be displayed incorrectly, as the view matrix is based on data provided to us by WebXR. It might be possible to make the right changes when rendering the image only, while keeping the rest of the scene rendering as it should, but a reproduction here would be more than helpful.

BTW – for some reason the website is unreachable from my Quest. I have no idea why. The internet connection is working, and I’ve had no issue with any other website, just this one… I’ll keep trying.