WebXR detect initial rendering of the scene

Hello again!

I have an issue where the contents of my scene are rendered a bit farther away when the XRCamera (immersive-ar, unbounded) renders them than where they appear with the Universal or ArcRotate cameras. Positioning the camera early does not help; the XRCamera starts at the same position and orientation regardless. Then I realized that once the scene is rendered in WebXR (after it orients itself with the room and positions the 3D elements), I can reposition the XRCamera. That is enough for me, but I cannot find the correct observers or events that fire when the 3D elements are rendered.

I have tried the following (and more) but couldn’t find it. I’m probably missing something:
SessionManager.onXRReady
SessionManager.onXRFrameObservable
SessionManager.onXRSessionInit
SessionManager.onXRReferenceSpaceInitialized
SessionManager.onXRReferenceSpaceChanged
WebXRExperienceHelper.onInitialXRPoseSetObservable
WebXRExperienceHelper.onStateChangedObservable

I’ll be very happy for any guidance.

Thank you! Cheers!

Hello @Caner, how are you doing?

This is probably a question for @RaananW. However, most folks on the team are already on holiday and will probably be back in early February.

While we wait for Raanan, I will do some investigation to see if I can help you with that question.


Much appreciated. Happy new year to the dearest community of Babylon JS :heart:


Regarding the scene being rendered at a different position and orientation, I figured out that the initial transformation can be applied with XRRigidTransform.

Since setting the XRCamera transformation does not work until AR placement is completed, I tried to listen to the aforementioned observables so that I could apply the transformation once placement completes. Then I found that XRRigidTransform allows applying a pre-transformation instead:

sessionManager.referenceSpace =
  sessionManager.referenceSpace.getOffsetReferenceSpace(
    new XRRigidTransform(
      offsetPosition.applyRotationQuaternion(offsetQuaternion),
      offsetQuaternion
    )
  );

The quaternion needs to be applied to the position vector as shown, since XRRigidTransform applies the orientation before the position.
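To make that ordering concrete, here is a minimal, self-contained sketch of the math (plain TypeScript, no Babylon.js APIs; all names are illustrative). An XRRigidTransform maps a point as rotate-then-translate, so to express an offset in the pre-rotation frame you must hand the transform the already-rotated offset:

```typescript
// Why the offset position must be pre-rotated: XRRigidTransform applies
// its orientation BEFORE its position (point -> rotate(point) + position).
// To get "translate, then rotate" semantics, pass rotate(offsetPosition)
// as the position. Names below are illustrative, not Babylon.js APIs.

type Vec3 = { x: number; y: number; z: number };
type Quat = { x: number; y: number; z: number; w: number };

// Rotate a vector by a unit quaternion: v' = q * v * q^-1
function rotate(v: Vec3, q: Quat): Vec3 {
  const { x, y, z, w } = q;
  // t = 2 * cross(q.xyz, v)
  const tx = 2 * (y * v.z - z * v.y);
  const ty = 2 * (z * v.x - x * v.z);
  const tz = 2 * (x * v.y - y * v.x);
  // v' = v + w*t + cross(q.xyz, t)
  return {
    x: v.x + w * tx + (y * tz - z * ty),
    y: v.y + w * ty + (z * tx - x * tz),
    z: v.z + w * tz + (x * ty - y * tx),
  };
}

function add(a: Vec3, b: Vec3): Vec3 {
  return { x: a.x + b.x, y: a.y + b.y, z: a.z + b.z };
}

// What XRRigidTransform(position, orientation) does to a point:
function rigidTransform(point: Vec3, position: Vec3, orientation: Quat): Vec3 {
  return add(rotate(point, orientation), position); // rotate first, then translate
}

// Example: 90° rotation around Y, with an offset of (0, 0, -2)
// expressed in the pre-rotation frame.
const q: Quat = { x: 0, y: Math.SQRT1_2, z: 0, w: Math.SQRT1_2 };
const offset: Vec3 = { x: 0, y: 0, z: -2 };
const p: Vec3 = { x: 1, y: 0, z: 0 };

// Desired result: translate the point first, then rotate.
const desired = rotate(add(p, offset), q);
// Achieved by pre-rotating the offset before handing it to the transform:
const achieved = rigidTransform(p, rotate(offset, q), q);
```

This is exactly what `offsetPosition.applyRotationQuaternion(offsetQuaternion)` accomplishes in the snippet above.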

This solves my issue, but being able to track AR placement completion with observables would still be useful.

Thanks a lot, happy new year again!

The reason for that is that in AR the camera transformation is not copied from the original camera. There is reasoning behind it (mainly the unbounded space and the way you expect AR to work), but I won’t go too deep into it here.

However, there is a solution! :slight_smile:

The camera itself has an onXRCameraInitializedObservable, which can be used to set the camera’s transformation once its first XR pose has been applied.
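For reference, a minimal sketch of the registration pattern. The observable name above is the real Babylon.js API (in a project you would call `xr.baseExperience.camera.onXRCameraInitializedObservable.addOnce(...)` on the actual WebXRCamera); the stand-in observable and target position below are assumptions purely to keep the example self-contained:

```typescript
// Stand-in observable illustrating the addOnce pattern used by
// Babylon.js observables; NOT the real Babylon.js implementation.
class MiniObservable<T> {
  private observers: Array<{ cb: (data: T) => void; once: boolean }> = [];

  add(cb: (data: T) => void, once = false): void {
    this.observers.push({ cb, once });
  }

  // Register a callback that fires only on the first notification.
  addOnce(cb: (data: T) => void): void {
    this.add(cb, true);
  }

  notifyObservers(data: T): void {
    const current = this.observers;
    this.observers = current.filter((o) => !o.once); // drop one-shot observers
    for (const o of current) o.cb(data);
  }
}

// Stand-in for the XR camera: just the fields this sketch touches.
interface FakeXRCamera {
  position: { x: number; y: number; z: number };
}

const onXRCameraInitialized = new MiniObservable<FakeXRCamera>();
let initCount = 0;

onXRCameraInitialized.addOnce((camera) => {
  // Place the camera once its first XR pose has been applied.
  camera.position = { x: 0, y: 1.6, z: -2 }; // assumed target position
  initCount++;
});

const cam: FakeXRCamera = { position: { x: 0, y: 0, z: 0 } };
onXRCameraInitialized.notifyObservers(cam);
onXRCameraInitialized.notifyObservers(cam); // no-op: addOnce already removed it
```

Using addOnce (rather than add) means the repositioning runs exactly once, no matter how often the observable fires afterwards.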

A side note - I wouldn’t change the camera’s position in AR, nor would I teleport in AR. I would adjust the world to the AR scenario. I like your solution; it would be great to know in which scenario you need this behavior, to see if I can integrate it somehow into the AR init process.
