Strange performance with WebXR, AmmoJS and Scene.animate

I am trying to fix some performance issues on a low end VR device (Lenovo Mirage running Chrome 84) and have noticed something weird.

Performance is fine when running in the browser rather than in VR mode (scene.animate takes about 5.87 ms), but when I switch to WebXR everything takes much longer, including things I wouldn't expect to be affected by VR mode at all: scene.animate now takes 17.89 ms (no other changes, just switching to WebXR).
This is browser mode, scene.animate = 5.87ms

And switching to webXR, scene.animate = 17.89 ms

The time is all internal to AmmoJSPlugin.executeStep.

Just wondering why having XR enabled/active would make such a difference to Ammo’s internals?
Or is DevTools just showing a general slowdown?

Update:
It is NOT WebXR per se. It has something to do with rendering my second camera (as in this thread: WebXR render canvas and streaming - #24 by MarkM). When I start XR I add another render function to the render loop which essentially changes the activeCamera, calls render(false, true), then changes the activeCamera back. If I disable this function the performance is fine and scene.animate reports approx 5 ms.
You can see the second scene.render in the screenshot in the previous post. It does not cause scene.animate to be called (as expected, since the second parameter ignoreAnimations is true), but it causes a blow-out in the time for scene.animate in the PREVIOUS render?? (Unless that is just how DevTools reports it.)
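For reference, the camera-swap pattern described above looks roughly like this. The real code uses a Babylon Scene; here `scene` is a tiny stand-in object so the pattern can be shown self-contained, and the camera names are hypothetical:

```javascript
// Stand-in for a Babylon Scene, just enough to show the camera-swap pattern.
// In the real project these are BABYLON.Scene and Camera instances.
const scene = {
    activeCamera: "xrCamera",
    renderedWith: [],
    // Stand-in for Scene.render(updateCameras, ignoreAnimations)
    render(updateCameras, ignoreAnimations) {
        this.renderedWith.push({ camera: this.activeCamera, ignoreAnimations });
    },
};

// The extra render function added to the render loop when XR starts:
// swap in the desktop ("spectator") camera, render without re-running
// animations, then restore the XR camera.
function renderSpectatorView(spectatorCamera) {
    const previous = scene.activeCamera;   // usually the XR camera
    scene.activeCamera = spectatorCamera;  // switch to the desktop camera
    scene.render(false, true);             // ignoreAnimations = true
    scene.activeCamera = previous;         // restore the XR camera
}

renderSpectatorView("spectatorCamera");
```

In the real project this function would be registered with `engine.runRenderLoop(...)` alongside the main render function.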

I feel like it is high time I tried to recreate this in a PG :slight_smile:

Update: I can get an improved render time for the second camera (<5 ms) by replacing the call to

    scene.render(false, true);

with

    scene._bindFrameBuffer();
    scene._renderForCamera(scene.activeCamera);

but the time for AmmoJSPlugin.executeStep still blows out (>17 ms).
There also seems to be a long idle time between attempts to render a frame. Is this some sort of vsync? It is adding to the problem.

Update:
I have created a PG to test the rendering of the desktop/spectator view while Ammo is active in WebXR. It all works fine with a good frame rate on the actual device. However, with my setup here I cannot verify that the original canvas is being rendered, but I suspect it is. It must be something else in my project. Ugh.

PG: https://playground.babylonjs.com/#1XBS59#1

Pinging @RaananW

I can confirm that it is the call to scene._bindFrameBuffer() that causes the drop in frame rate, not the rendering. I get the result I want (the scene is rendered in the headset and on the desktop canvas) but the frame rate plummets. It is obviously not the right way to do it, so any guidance @RaananW would be welcome.

I also tried the following, but no luck. The AmmoJSPlugin.executeStep time still goes from ~6 ms in non-WebXR to >19 ms in WebXR:

    // Temporarily unbind the XR render target so the canvas framebuffer can be used
    var target = engine._currentRenderTarget;
    if (target) {
        engine.unBindFramebuffer(target);
        engine._currentRenderTarget = null;
    }

    // Render the active camera to the default (canvas) framebuffer
    scene._bindFrameBuffer();
    scene._renderForCamera(scene._activeCamera);

    // Restore the XR render target for the next XR frame
    engine._currentRenderTarget = target;

Hey Mark, would you be able to share the project, or a minimal reproduction of it? It would be great to be able to debug the issue.

I can assume this happens because the physics loop is being executed twice, once for each render. But it would be great to see the code and go over the render loop step by step.

Thanks!!

Hi Raanan, the physics engine was a red herring; it happens in scenes where I do not have a physics engine. It has to do with attempting to render the desktop canvas, as per this thread: WebXR render canvas and streaming - #24 by MarkM

The playground I linked here makes a few attempts at doing it, but it works in the PG.
PG: https://playground.babylonjs.com/#1XBS59#2
I will now download this, create a standalone project outside the PG, and see if it happens there. The project is quite complex, with other bits (React, WebRTC, etc.) which I don't think are interfering, so I will try to isolate it in an external project. It has something to do with binding/unbinding frame buffers, because if I comment that out there is no problem.

Update: I just downloaded the PG above and ran it on a device (I had to comment out the line that awaits Ammo). In non-VR the render time is about 9 ms/frame and in VR it is about 13 ms/frame, so that's good. I then enabled the renderPOV_3 function, which only added about 1.5 ms, so again all good. I cannot see that the canvas is being rendered correctly but I assume it is; the preview in Chrome DevTools doesn't update, but when I am in my project I can stream the canvas contents OK. It is pointing to something odd going on. I will try disabling things in my project to find out what is happening.

But I think if this is the way to switch between cameras and buffers to render a non-VR view concurrently with a VR one, and it works, as it appears to in this PG, then the problem is mine :frowning:

Update: I can confirm it is the call to engine._bindUnboundFramebuffer(null); that triggers my problem. No idea why.


Update: It is some sort of conflict/timing issue that is set up when I get the stream from the canvas.
If I comment out the line

    stream = canvas.captureStream(25);

in my project, the problem does not occur. I can drop the capture frame rate to 5 and the problem either isn't there or isn't noticeable. I thought I should be able to reproduce this in the PG but I couldn't, even with a high capture frame rate (75). In my project the mere act of declaring the stream, even without using it, causes the problem. Perhaps the solution (if I want a higher capture frame rate) is to use a second canvas for WebXR, which is what I tried in the first place but couldn't get going.
At least some progress.
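One possible workaround, assuming the constant-rate capture is what conflicts with XR rendering: pass `0` to `captureStream()`, which (per the Media Capture from DOM Elements spec) means frames are only captured when `requestFrame()` is called on the track, and throttle those calls from the render loop. The wiring below is a hedged sketch (the names `canvas` and `engine` are assumed); the throttle helper itself is plain JS and runnable:

```javascript
// Returns a predicate that is true at most once per `minIntervalMs`,
// given a monotonic clock value in milliseconds.
function makeFrameThrottle(minIntervalMs) {
    let last = -Infinity;
    return (nowMs) => {
        if (nowMs - last >= minIntervalMs) {
            last = nowMs;
            return true;
        }
        return false;
    };
}

// In the browser it would be wired up roughly like this (not runnable in Node):
// const stream = canvas.captureStream(0);            // 0 = capture only on requestFrame()
// const track = stream.getVideoTracks()[0];
// const shouldCapture = makeFrameThrottle(1000 / 5); // ~5 fps
// engine.runRenderLoop(() => {
//     scene.render();
//     if (shouldCapture(performance.now())) track.requestFrame();
// });

// Demonstrate the throttle at a 200 ms (5 fps) interval:
const shouldCapture = makeFrameThrottle(200);
const captured = [0, 50, 100, 200, 250, 400].filter(shouldCapture);
```

Whether manual capture avoids the conflict is untested here; it simply moves the capture cost to a point you control in the render loop.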


I just tried again to render the XR session with a different canvas than the one the engine was created with, and that is not working (if it worked, I think my main problem would be resolved). I checked the renderTarget and it is set to the xrCanvas I create the experience with, and a context is created, but it still renders to the engine canvas and not in the headset.

For example, this does not work: if I change line 55 to use renderCanvas then XR works, but if I use xrCanvas then XR does not render in the headset. (Chrome 84)
test.zip (2.6 KB)


I am so sorry to be stupid with XR but @RaananW will be back on Monday so bear with us until then :slight_smile:

I'll have to debug this. We should be able to add the XR camera and the free camera to the activeCameras array, but there seems to be an issue when one is rendering to a render output texture (XR) and the other to the canvas.

Feels like a bug on our side.
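For readers following along, the approach being described would look something like the fragment below. This is a hedged sketch of intended usage, not a confirmed fix (the thread is precisely about this not working yet); `scene`, `freeCamera`, and the `xrHelper` returned by `createDefaultXRExperienceAsync` are assumed to exist:

```javascript
// Intended multi-camera setup: when scene.activeCameras is non-empty,
// Babylon renders the scene once per camera in the array. Here one camera
// targets the XR render texture and the other the default canvas.
scene.activeCameras = [
    xrHelper.baseExperience.camera, // WebXRCamera -> XR framebuffer
    freeCamera,                     // desktop/spectator view -> canvas
];
```

The reported issue is that the two cameras write to different outputs (a render output texture for XR, the canvas for the free camera), and that combination is what appears to be broken.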


Do you want me to enter an issue on GitHub for it?

Oh, yes please. I'll have to dive into the code.
