Get a MediaStream from a Babylon Camera?

I would like to set up a camera in a scene and get an HTML5 MediaStream of it so I can stream out using WebRTC.

I know I can probably get the stream from the visible <canvas>, but what about an arbitrary Babylon camera? So, for example, could I stream from a fixed POV while players run around?
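For reference, the visible-canvas route I have in mind is something like this minimal sketch, assuming the usual WebRTC signaling happens elsewhere and the render canvas has the id "renderCanvas":

```ts
// Capture a 30 fps MediaStream straight from the canvas Babylon renders into.
const canvas = document.getElementById("renderCanvas") as HTMLCanvasElement;
const stream = canvas.captureStream(30);

// Hand the video track to a peer connection (offer/answer exchange omitted).
const pc = new RTCPeerConnection();
stream.getVideoTracks().forEach((track) => pc.addTrack(track, stream));
```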

Edit: I see there is support for multiple cameras via scene.activeCameras, and I found the docs on rendering a scene to a video. Am I on the right track?
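For context, the recording path from the docs looks roughly like this (a sketch, assuming `engine` is the running Babylon Engine; note it produces a downloadable file rather than a live stream):

```ts
import { VideoRecorder } from "@babylonjs/core";

if (VideoRecorder.IsSupported(engine)) {
    const recorder = new VideoRecorder(engine);
    // Records whatever the engine draws to its canvas for 5 seconds,
    // then triggers a download of capture.webm.
    recorder.startRecording("capture.webm", 5);
}
```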

Edit 2: OK, I just looked at the source of VideoRecorder; it calls engine.getRenderingCanvas(), which I assume is the visible canvas. But I see that Camera has an outputRenderTarget, so maybe there is a way to make a camera render to an off-screen canvas?
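Something like the following is what I was picturing (purely a sketch; `fixedCamera` is a hypothetical name, and I have no idea yet whether the resulting texture can be turned into a stream):

```ts
import { RenderTargetTexture } from "@babylonjs/core";

// Hypothetical: redirect the fixed camera into an off-screen render target
// instead of the default framebuffer.
const rtt = new RenderTargetTexture("povTarget", { width: 1280, height: 720 }, scene);
fixedCamera.outputRenderTarget = rtt;
```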

Pinging @sebavan

Unfortunately, I cannot find any way in the spec to stream a WebGLTexture.

Only canvases can be streamed. This would then require two different canvases, and so two different engines, with no ability to share resources between them, which breaks pretty much everything.

See 245894 - WebGL Shared Resources - chromium - Monorail, which is closed as won't fix.

Thanks @sebavan,

DynamicTexture keeps its own canvas. So if I set the outputRenderTarget on a camera to a DynamicTexture, would that cause the camera to render to a separate canvas? Then I could take that canvas and stream it?

Nope, you cannot render the scene to a separate canvas without duplicating all the resources (engine, scene…). The canvas used in DynamicTexture is for a 2D context injecting the info into Babylon (by copy).
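To illustrate the direction of that copy, the DynamicTexture canvas is something you draw into on the CPU side and then upload to the GPU, never the other way around:

```ts
import { DynamicTexture } from "@babylonjs/core";

const dt = new DynamicTexture("label", { width: 512, height: 512 }, scene);
const ctx = dt.getContext(); // a regular 2D context, not the WebGL one

ctx.fillStyle = "white";
ctx.fillText("drawn on the CPU side", 20, 60);
dt.update(); // copies the 2D canvas INTO a WebGL texture, not out of it
```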

The only way to share it would be to dump the content of the render-target framebuffer into a separate image that you would then draw onto a separate canvas. This is pretty expensive, and you would probably not be able to reach real time.
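A sketch of that expensive path, assuming `rtt` is the camera's RenderTargetTexture and `ctx2d` belongs to the second canvas you would captureStream() from (keep in mind the readback stalls the GPU, and the rows may come back vertically flipped):

```ts
import { RenderTargetTexture } from "@babylonjs/core";

async function blitRenderTarget(
    rtt: RenderTargetTexture,
    ctx2d: CanvasRenderingContext2D,
    width: number,
    height: number
): Promise<void> {
    const pixels = await rtt.readPixels(); // GPU -> CPU copy, the slow part
    if (!pixels) {
        return;
    }
    const image = new ImageData(new Uint8ClampedArray(pixels.buffer), width, height);
    ctx2d.putImageData(image, 0, 0);
}
```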

Maybe you could try, on each frame, drawing your shared camera to the canvas, sharing that to WebRTC, then drawing the second camera on top of it. This way, even if two renders happen (they would need to anyway), all the resources would be shared.
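A sketch of that idea, with hypothetical `fixedCamera` / `playerCamera` names; captureStream(0) plus requestFrame() ensures only the shared camera's render gets captured:

```ts
const canvas = engine.getRenderingCanvas() as HTMLCanvasElement;
const stream = canvas.captureStream(0); // 0 fps: frames are pushed manually
const track = stream.getVideoTracks()[0] as CanvasCaptureMediaStreamTrack;

engine.runRenderLoop(() => {
    // 1. Render the shared/fixed POV and push that exact frame to WebRTC.
    scene.activeCamera = fixedCamera;
    scene.render();
    track.requestFrame();

    // 2. Render the local player's view; this is what remains visible on screen.
    scene.activeCamera = playerCamera;
    scene.render();
});
```

Since both renders run on the same engine, every GPU resource (meshes, textures, shaders) stays shared.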