WebXR and Offscreen Canvas

The following test of Offscreen Canvas with WebXR [Offscreen Canvas] seems to work fine on my Oculus Quest.

I also tried the BabylonJS Offscreen Canvas example on my Quest and it worked fine (but in 2d…).

I was wondering if BabylonJS, which has support for both WebXR and Offscreen Canvas, can support both together?


Hi @mlz,

Babylon WebXR does support a custom canvas (even an offscreen one) when creating the context. If you are using the default WebXR experience helper, the relevant option is outputCanvasOptions:

https://doc.babylonjs.com/api/classes/babylon.webxrdefaultexperienceoptions#outputcanvasoptions
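A minimal sketch of what passing a custom (offscreen) canvas through those options might look like. This is an assumption based on the WebXRDefaultExperienceOptions docs linked above, not tested code; only the option names come from the docs, everything else is hypothetical:

```typescript
// Hypothetical sketch: hand an OffscreenCanvas to Babylon's default XR
// experience via outputCanvasOptions. Untested; the option shape is assumed
// from the WebXRDefaultExperienceOptions documentation.
const OffscreenCanvasCtor = (globalThis as any).OffscreenCanvas;

// Fall back to null where OffscreenCanvas is unavailable (e.g. in Node).
const offscreen = OffscreenCanvasCtor ? new OffscreenCanvasCtor(1024, 1024) : null;

const xrOptions = {
  outputCanvasOptions: {
    // Babylon normally expects an HTMLCanvasElement here; passing an
    // OffscreenCanvas instead is exactly the experiment being discussed.
    canvasElement: offscreen,
  },
};

// Usage (browser only, hypothetical):
// const xr = await scene.createDefaultXRExperienceAsync(xrOptions);
```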

I have to admit I haven’t tested it yet, but there is always a first time!


Hi @RaananW,

I tried some tests on my code, which uses createDefaultXRExperienceAsync. What I don’t know how to address is that createDefaultXRExperienceAsync has some parts that use the document (for the VR button), which does not seem to be allowed in a web worker.

I really hope there is a solution to this. I was amazed that OffscreenCanvas was well supported in the Oculus Browser (though only in 2D), and having the rendering done in a web worker would help ensure a smooth framerate on the Quest.

So, using createDefaultXRExperienceAsync, how should I split my code between the webworker and the main thread?

That’s a good question.

What exactly do you need in the worker? You will need to initialize XR on the main thread and reference the offscreen canvas when doing so; apart from that, you can do everything else in the worker. I guess you might need to combine XR’s pose data with the worker. Want to share what exactly you want to offload?
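To make that split concrete, here is a minimal sketch of such an architecture, assuming the main thread owns the XR session and forwards pose data to a render worker. All names and message shapes below are hypothetical, not Babylon API:

```typescript
// Hypothetical message protocol: the main thread initializes XR and posts
// pose updates; the worker owns the OffscreenCanvas and the Babylon scene.

type MainToWorker =
  | { kind: "resize"; width: number; height: number }
  | {
      kind: "pose";
      position: [number, number, number];
      quaternion: [number, number, number, number];
    };

// Stand-in for whatever the worker drives with the pose (e.g. a camera).
interface CameraLike {
  position: [number, number, number];
  quaternion: [number, number, number, number];
}

// Worker-side handler: apply incoming pose messages to the camera.
function handleMainMessage(msg: MainToWorker, camera: CameraLike): void {
  if (msg.kind === "pose") {
    camera.position = msg.position;
    camera.quaternion = msg.quaternion;
  }
}

// In the browser, the main thread would do roughly:
//   const offscreen = canvas.transferControlToOffscreen();
//   worker.postMessage({ kind: "init", canvas: offscreen }, [offscreen]);
// and then post a "pose" message from each XR frame callback.
```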

From what I understand from this video: [https://www.youtube.com/watch?v=zYrZNROQzKQ&feature=youtu.be]

I was hoping to be able to use WebXR without the DOM, so I should be able to put the whole BabylonJS engine and WebXRExperience in the webworker.

In [WebXR Device API]:

The user-agent has an immersive XR device (null or XR device) which is initially null and represents the active XR device from the list of immersive XR devices. This object MAY live on a separate thread and be updated asynchronously.

Also, since we cannot use the DOM freely while in immersive-vr, we might as well put the WebXRExperience in a web worker that has its own thread.

Oh, that would be an interesting experiment - initializing XR in a webworker.

I was trying to find window references in XR and found one in the session manager. I submitted a PR as a fix, so you can use XR in a web worker - [XR] Remove all window references from XR classes by RaananW · Pull Request #8901 · BabylonJS/Babylon.js · GitHub

Thanks, I’ll look at it.

Since I wanted my web workers in TypeScript, I looked around and found this example on GitHub [GitHub - easyCZ/typescript-webpack-web-worker-example: Example configuration for working with Web Workers in typescript]; combined with this adjustment (putting globalObject: 'this' in the webpack output config) [https://github.com/webpack/webpack/issues/6642], it works well with Babylon.js.
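The webpack adjustment mentioned above, shown as a config fragment. The filenames are hypothetical; only the globalObject: 'this' line is the fix being referenced:

```typescript
// webpack.config.ts (fragment) — hypothetical sketch; only the
// globalObject: 'this' line is the actual fix discussed above.
export default {
  output: {
    filename: "[name].js",
    // webpack's default runtime references `window`, which does not exist in
    // a web worker; `this` works in both the window and the worker scope.
    globalObject: "this",
  },
};
```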

This could be a good starting point for a TypeScript experiment.

Hi @RaananW,

I tried it and got the code to compile, but createDefaultXRExperienceAsync resulted in an error at runtime. It is not clear whether navigator.xr is available to the web worker [Navigator: xr property - Web APIs | MDN]:

The read-only xr property provided by the Navigator or WorkerNavigator interface returns an XR object which can be used to access the WebXR Device API.

I dug deeper into WorkerNavigator.xr and found this [OffscreenCanvas in a Worker support for webxr · Issue #1102 · immersive-web/webxr · GitHub].

As you previously indicated, there are some possible workarounds. An interesting one would be to have Babylon.js run in a web worker and have the main thread handle the XR objects and send messages to the worker (using something like Comlink). But that seems like a lot of work. This could provide an increased framerate, but more importantly a stable framerate, which matters even more for immersive VR in my opinion (a constant 72 Hz is better than 120 Hz with spikes down to 30 Hz).

I can only assume that web workers (still) don’t have the XR namespace. Looks like it, from the GitHub issue you linked.

The way I see it is this - XR on the main thread is (generally) just like keyboard events that are sent to a web worker: XR updates the position according to the device, and that can be broadcast to the web worker, which in turn updates Babylon. But this is not supported by our XR implementation.
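Sketching that "keyboard event" style broadcast: the pose read on the main thread could be packed into a flat Float32Array and posted to the worker as a transferable, so no DOM or XR objects cross the boundary. The helper names below are hypothetical:

```typescript
// Hypothetical helpers: serialize a pose (position + orientation quaternion)
// into a Float32Array so it can be posted to a worker as a transferable,
// avoiding per-frame object allocation and structured-clone cost.

function packPose(
  position: [number, number, number],
  quaternion: [number, number, number, number]
): Float32Array {
  const buf = new Float32Array(7);
  buf.set(position, 0);
  buf.set(quaternion, 3);
  return buf;
}

function unpackPose(buf: Float32Array): {
  position: [number, number, number];
  quaternion: [number, number, number, number];
} {
  return {
    position: [buf[0], buf[1], buf[2]],
    quaternion: [buf[3], buf[4], buf[5], buf[6]],
  };
}

// Main thread (browser, hypothetical): inside the XRSession frame callback,
//   const pose = frame.getViewerPose(refSpace);
//   const buf = packPose(...); worker.postMessage(buf, [buf.buffer]);
```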

I agree about the constant FPS! Having said that - rendering is what usually takes long. XR itself is not so intensive.

[GitHub - GoogleChromeLabs/comlink: Comlink makes WebWorkers enjoyable.] could help synchronize a WebXR class with the web worker. The only part that seems obscure to me is how to handle the offscreen canvas for VR: do we need a canvas for each eye, or a single double-wide canvas? Also, syncing with the XR loop and the session frame updates would have to be addressed.

The canvas needed is a single canvas. Its size will be changed by Babylon to fit the needed proportions.

Babylon has callbacks for each state change in XR (including a frame observable). Hooking up to those will probably be the right way to go, but it means the render loop would run on the main thread. If you want to run the render loop in the worker, initiate the XR session yourself, run a simple update loop on the main thread, and emulate XR in the worker (it would consume immutable objects received asynchronously from the main thread). Dunno if that makes sense! Just a suggestion.
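One way to "emulate XR in the worker" along those lines is a latest-wins mailbox: the main thread posts immutable pose snapshots, and the worker's render loop always consumes the newest one, dropping stale snapshots rather than queueing them. A minimal sketch, with all names hypothetical:

```typescript
// Hypothetical "latest wins" mailbox: if the main thread posts pose snapshots
// faster than the worker renders, intermediate snapshots are dropped instead
// of queued, keeping latency low and the framerate stable.
class PoseMailbox<T> {
  private latest: T | null = null;

  // Called from the worker's message handler: overwrite any unconsumed snapshot.
  post(snapshot: T): void {
    this.latest = snapshot;
  }

  // Called once per rendered frame: take the newest snapshot, if any.
  take(): T | null {
    const s = this.latest;
    this.latest = null;
    return s;
  }
}
```

Queueing every pose instead would make the worker render increasingly stale frames whenever it falls behind, which is exactly the framerate spike we want to avoid.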

To gain full advantage, it seems that the render loop must be on the worker thread, based on the Offscreen Canvas video by David. The VR button requirement seems to imply that the XR session must be started on the main thread. This is quite puzzling, since the WebXR doc seems to suggest that the session could be started in a worker, but a worker has no DOM (so no button available)!

I got the impression that the XR object doesn’t exist in a worker, so it would be impossible to run it there anyway. That is why I suggested this architecture.

Sometimes standards simply don’t match (or it is too early to match them). I’m sure it will eventually be possible, but it seems to me it won’t work at the moment.