Hi Raanan, the physics engine was a red herring; it happens in scenes where I do not have a physics engine. It has to do with attempting to render the desktop canvas, as per this thread: WebXR render canvas and streaming - #24 by MarkM
The playground I linked here is (after a few attempts) my take on doing that, and it works in the PG.
I will now download it and create a standalone project outside the PG to see if it happens there. The project is quite complex, with other parts (React, WebRTC, etc.) that I don't think are interfering, but I will try to isolate the issue in an external project. It has something to do with binding/unbinding framebuffers, because if I comment that out the problem goes away.
Update: I just downloaded that PG and ran it on a device (I had to comment out the line that awaits Ammo). In non-VR the render time is about 9 ms/frame and in VR it is about 13 ms/frame, so that's good. I then enabled the renderPOV_3 function, which only added about 1.5 ms, so again all good. I cannot see whether the canvas is being rendered correctly, but I assume it is: the preview in Chrome DevTools doesn't update, but when I am in my project I can stream the canvas contents OK. This points to something odd going on. I will try disabling things in my project to find out what is happening.
But if this is the way to switch between cameras and buffers for rendering a non-VR view concurrently with a VR one, and it works (as it appears to in this PG), then the problem is mine.
Update: I can confirm it is the call to engine._bindUnboundFramebuffer(null) that triggers my problem. No idea why.
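For context, this is roughly the shape of the secondary-view pattern from the PG. It is only a sketch: the names `xr`, `povCamera`, and `xrCamera` are placeholders for my own setup, and `engine._bindUnboundFramebuffer` is an internal (underscore-prefixed) engine method, so it could change between Babylon versions. The guard flag is there because calling `scene.render()` from inside `onAfterRenderObservable` would otherwise re-trigger the observer and recurse.

```javascript
// Sketch only: render the scene a second time to the page canvas (non-VR POV)
// after each XR frame. Assumes Babylon.js with `scene`, `engine`, an XR
// experience `xr`, a desktop camera `povCamera`, and the XR camera active.
let renderingPOV = false;

scene.onAfterRenderObservable.add(() => {
    // Avoid infinite recursion: this observer fires for our own render() too.
    if (renderingPOV || xr.baseExperience.state !== BABYLON.WebXRState.IN_XR) {
        return;
    }
    renderingPOV = true;

    const xrCamera = scene.activeCamera;

    // This is the call that triggers my problem: switch from the XR layer's
    // framebuffer back to the default framebuffer (the page canvas).
    engine._bindUnboundFramebuffer(null);

    // Render once more from the desktop camera, then restore the XR camera.
    scene.activeCamera = povCamera;
    scene.render();
    scene.activeCamera = xrCamera;

    renderingPOV = false;
});
```

This block needs a browser with WebXR and a running Babylon scene, so it can't be run standalone; it is just to show where the _bindUnboundFramebuffer(null) call sits in the frame.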
Update: It is some sort of conflict/timing issue that gets set up when I get the stream from the canvas.
If I comment out the line
stream = canvas.captureStream(25);
in my project, the problem does not occur. I can drop the capture frame rate to 5 and the problem is either gone or not noticeable. I thought I should be able to reproduce this in the PG, but I couldn't, even with a high capture frame rate (75). In my project, the mere act of declaring the stream, even without using it, causes the problem. Perhaps the solution (if I want a higher capture frame rate) is to use a second canvas for the WebXR session, which is what I tried in the first place but couldn't get going.
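For anyone following along, this is the standard browser API involved; the canvas id here is a placeholder for whatever the engine was created with. The argument to captureStream is the maximum capture frame rate, which is the knob I'm varying (25 triggers the problem in my project, 5 does not).

```javascript
// HTMLCanvasElement.captureStream(frameRate) returns a MediaStream whose
// video track mirrors the canvas, capped at `frameRate` fps.
// "renderCanvas" is a placeholder id for the engine's canvas.
const canvas = document.getElementById("renderCanvas");

// Declaring this (even without using the stream) is enough to trigger
// the conflict in my project at 25 fps; at 5 fps it does not appear.
const stream = canvas.captureStream(25);

// In my project the stream is then handed to WebRTC, roughly like:
// peerConnection.addTrack(stream.getVideoTracks()[0], stream);
```

This needs a browser DOM, so it's illustrative only.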
At least some progress.
I just tried again to render the XR session with a canvas different from the one the engine was created with, and that is not working (if it worked, I think my main problem would be resolved). I checked the renderTarget and it is set to the XR canvas I create the experience with, and a context is created, but it still renders to the engine canvas and not in the headset.
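For reference, this is the shape of what I was attempting. I'm assuming `outputCanvasOptions.canvasElement` on `createDefaultXRExperienceAsync` is the intended way to point the XR session at a second canvas; the canvas creation here is hypothetical.

```javascript
// Sketch: ask Babylon's default XR experience to use a second canvas
// (xrCanvas) instead of the canvas the engine was created with.
const xrCanvas = document.createElement("canvas"); // hypothetical second canvas
document.body.appendChild(xrCanvas);

const xr = await scene.createDefaultXRExperienceAsync({
    outputCanvasOptions: {
        canvasElement: xrCanvas
    }
});

// Observed behaviour: the session's renderTarget reports xrCanvas and a
// context is created on it, yet rendering still goes to the engine canvas
// and nothing appears in the headset.
```

Browser/WebXR only, so again not runnable standalone; posting it in case the shape of the call is the mistake.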