For rendering the (transmissive) CubeTexture of a mesh I needed to implement my own camera. This works fine in standard mode, but in WebXR mode using camera.position seems to be the wrong approach, since in 3D the texture then looks as if it were painted onto the glass material.
How does one access the position of the camera origin currently being rendered, which should alternate between the left and right eye on every rendering pass, or stay constant in non-stereo environments?
If you want to get the currently rendered camera you can use the scene.onBeforeCameraRenderObservable observable. You can also get both eye cameras from the WebXR camera's rigCameras array.
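For illustration, a minimal sketch of the rigCameras route; xrHelper is just a placeholder for a default XR experience helper created elsewhere (e.g. with scene.createDefaultXRExperienceAsync()):
// Sketch only: assumes xrHelper was created elsewhere, e.g. via scene.createDefaultXRExperienceAsync()
const xrCamera = xrHelper.baseExperience.camera; // the WebXRCamera
// During an active stereo session, rigCameras holds one camera per eye
xrCamera.rigCameras.forEach((eyeCam, i) => {
    console.log("eye", i, eyeCam.globalPosition.toString());
});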
Thanks. For reference, here is the code, which seems to do the job:
if (scene.scene_cam == undefined || scene.scene_cam == null) {
    // cache the camera of the current render pass (left/right eye in XR, the main camera otherwise)
    scene.onBeforeCameraRenderObservable.add((c) => { // update needed for stereo
        scene.scene_cam = c;
    });
}
and then using scene.scene_cam in the texture rendering.
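For completeness, a rough sketch of how the cached camera could feed the texture math; myGlassMaterial and the cameraEyePosition uniform are made-up names, and this assumes the effect is driven by a ShaderMaterial:
// Sketch, not the actual implementation: push the per-eye position into a custom ShaderMaterial
myGlassMaterial.onBindObservable.add(() => {
    if (scene.scene_cam) {
        // globalPosition belongs to the eye camera currently being rendered
        myGlassMaterial.setVector3("cameraEyePosition", scene.scene_cam.globalPosition);
    }
});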