Exact frame rendering of an animation into a video

Hi, I want to render a complex scene with animations into a video and I’m not sure which approach I should take. I know there is the built-in VideoRecorder in Babylon.js, but it uses the MediaRecorder, which I don’t want to use. With a complex scene on a slow rendering device, the MediaRecorder won’t record the scene at a smooth 60 FPS (at least it didn’t in my experiments). So what I want to achieve is that every single frame gets rendered into the video and no frame gets skipped.

So far, I’ve only had success with running the engine in deterministic mode and pausing the engine after each frame to fetch the image data from the canvas. But somehow the solution looks very hacky and I’m not sure if there is a better approach. I’ve also tried fetching the media stream, but so far had no luck with it.

Here is my playground: Frame Rendering | Babylon.js Playground (babylonjs.com), and here is an example with the native video recording feature using the MediaRecorder on a “heavy” scene (the video gets downloaded after a few seconds): Slow video rendering | Babylon.js Playground (babylonjs.com)

Any input would be helpful, thanks :slight_smile:

Hello, yeah, unfortunately if the device is not performant enough, encoding whilst rendering will be too slow and end up lagging :frowning:

The 3 approaches I can think of are:

  1. Use WebRTC to stream the canvas to a separate server for rendering.
  2. Use a worker thread to encode data from the canvas in an async way.
  3. Use WebCodecs to encode (rough sketch below).
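For 3., a rough sketch of what the WebCodecs path could look like (assuming a browser with VideoEncoder support; the canvas reference and the 60 fps target are placeholders, and the encoded chunks still need to be muxed into a container by a separate library):

// Rough sketch only: encode one canvas frame per step with WebCodecs.
const chunks = [];
const encoder = new VideoEncoder({
    output: (chunk) => chunks.push(chunk), // still has to be muxed into webm/mp4
    error: (e) => console.error(e)
});
encoder.configure({ codec: "vp8", width: canvas.width, height: canvas.height, framerate: 60 });

function encodeFrame(frameIndex) {
    // VideoFrame timestamps are in microseconds.
    const frame = new VideoFrame(canvas, { timestamp: (frameIndex * 1e6) / 60 });
    encoder.encode(frame, { keyFrame: frameIndex % 60 === 0 });
    frame.close();
}

// ... call encodeFrame() once per rendered frame, then: await encoder.flush();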

Please let me know your findings :slight_smile: as we could try to repurpose some of it in the framework if that works.

Hi @subesokun, you might take a look at ccapture.js

I tried this a couple of years ago and it seems to address the timing problem. I’m not quite sure how it works under the hood, but using it was a matter of calling capturer.capture() every frame in runRenderLoop() until you are done recording.
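From what I remember, the basic usage looked roughly like this (a sketch from memory; doneRecording is just a placeholder flag):

// CCapture.js sketch: it hooks the clock so every frame gets captured, even if rendering is slow.
const capturer = new CCapture({ format: "webm", framerate: 60 });
capturer.start();

engine.runRenderLoop(() => {
    scene.render();
    capturer.capture(engine.getRenderingCanvas());
    if (doneRecording) {
        capturer.stop();
        capturer.save(); // triggers the download of the captured video
        engine.stopRenderLoop();
    }
});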

Another thing: are you using engine.getDeltaTime() in your animations for your slow rendering device? The sleep() call in your slow sample is blocking the render loop; it recorded fine on my machine without that. Maybe you were simulating a slow device.
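For example (a sketch, mesh and rotationSpeed are just placeholders), something like this keeps the motion correct no matter how long a frame takes:

// Frame-rate independent animation: scale the motion by the time the last frame took.
scene.onBeforeRenderObservable.add(() => {
    const dtSeconds = engine.getDeltaTime() / 1000;
    mesh.rotation.y += rotationSpeed * dtSeconds; // radians per second * seconds
});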
–Andy


It is possible to create an animated GIF frame by frame, but you would then need a proprietary converter to change it to other formats.

In code: animation gif frame by frame (link to the documentation)

With the Inspector: (screenshot)

I actually think that unless this is going to be a facility in your scene, this is the best approach. Just do not have a render loop, and call render when finished with the previous frame.
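Something along these lines (a sketch; saveFrame() stands in for whatever you do with each image, and the engine should be created with preserveDrawingBuffer: true so the readback is reliable):

// Sketch: no runRenderLoop; render exactly one frame, save it, only then move on to the next.
async function renderToFrames(totalFrames) {
    for (let i = 0; i < totalFrames; i++) {
        engine.beginFrame();
        scene.render();
        engine.endFrame();

        const blob = await new Promise((resolve) =>
            engine.getRenderingCanvas().toBlob(resolve, "image/png"));
        await saveFrame(i, blob); // hypothetical helper: feed an encoder, download, upload, ...
    }
}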

The biggest problem is that the BJS animation system cannot be told what time it is, AFAIK. It is hard-coded to the system time. When I made my own animation system, I specifically put that feature in, and never did any frame-based animation, only strictly time-based interpolation.

This method also allows you to do things that cannot otherwise be done, like:

  • UHD resolution
  • sound integration (I use ffmpeg to actually merge audio with video tracks in addition to optimizing for streaming)

When the motion picture industry moves beyond draft quality for animation, I am pretty sure they still push it off to a server farm for rendering. Not being able to tell the BJS system what time it is makes it kind of a mess to make promotional material.

Thanks everyone for the useful hints!

@sebavan Approach 2.) sounds interesting. Do you have any examples that show how I can access the canvas data from a worker thread? And what do you mean by 3.) exactly? My idea was that I render the canvas into a 2D image and then pass each frame to a native video encoding library via WASM. It seems like I can also access the canvas directly via WASM: web-sys: canvas hello world - The wasm-bindgen Guide (rustwasm.github.io)

@westonsoftware Yes, I’ve checked out ccapture.js, but since the last release was 3+ years ago I don’t really want to use it.

In my playground I’m using the exact same core functionality of Babylon.js as this GIF recorder functionality; I’ve just removed all the logic that is not required in my case.

Regarding the render loop and timing issues: because I’ve enabled the deterministicLockstep feature on the engine with lockstepMaxSteps: 1, I can be sure that the stepId will always increase by one for each render call. That way I can calculate the current time like this:

let now = engine.isDeterministicLockStep()
    ? scene.getDeterministicFrameTime() * scene.getStepId()
    : performance.now();

So far I’ve had no timing issues with this approach and could pause the render loop after each frame without messing up my animations. You just need to make sure that all your animations are calculated based on this timestamp and don’t consume performance.now() or similar directly. I’d also not recommend changing any global clock.
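To illustrate that last point, this is roughly how the animation logic consumes the computed timestamp instead of the wall clock (a sketch, box is just a placeholder mesh):

// Sketch: advance the animation state from the deterministic timestamp only.
scene.onBeforeRenderObservable.add(() => {
    const now = engine.isDeterministicLockStep()
        ? scene.getDeterministicFrameTime() * scene.getStepId()
        : performance.now();

    // Example: a 4 second orbit that depends only on `now`, never on performance.now().
    const t = (now % 4000) / 4000;
    box.position.x = Math.cos(t * 2 * Math.PI) * 3;
    box.position.z = Math.sin(t * 2 * Math.PI) * 3;
});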

You could use createImageBitmap on the canvas and transfer the result over?

I tried to get my head around the ImageBitmap object, but I’m not sure how to use it. So the idea here would be to get the ImageBitmap from the 3D canvas, write it into a 2D canvas via drawImage, and then read the raw image data from the 2D context via getImageData?

For example, if I want to use the libvpx encoder, I need to somehow convert the current frame into a YUV image and then pass it into the encoder.
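For reference, the per-pixel conversion from the canvas RGBA data to planar I420 (YUV 4:2:0, BT.601) would be roughly this sketch (it assumes even width and height and does only a crude chroma subsampling):

// Sketch: convert RGBA pixels (e.g. from getImageData) to an I420 buffer for libvpx.
function rgbaToI420(rgba, width, height) {
    const ySize = width * height;
    const cSize = (width / 2) * (height / 2);
    const yuv = new Uint8Array(ySize + 2 * cSize);

    for (let y = 0; y < height; y++) {
        for (let x = 0; x < width; x++) {
            const i = (y * width + x) * 4;
            const r = rgba[i], g = rgba[i + 1], b = rgba[i + 2];

            yuv[y * width + x] = 0.299 * r + 0.587 * g + 0.114 * b; // Y

            // One U/V pair per 2x2 block (top-left pixel only, for simplicity).
            if (x % 2 === 0 && y % 2 === 0) {
                const c = (y / 2) * (width / 2) + x / 2;
                yuv[ySize + c] = -0.169 * r - 0.331 * g + 0.5 * b + 128;        // U
                yuv[ySize + cSize + c] = 0.5 * r - 0.419 * g - 0.081 * b + 128; // V
            }
        }
    }
    return yuv; // Y plane, then U plane, then V plane
}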

Yup, the ImageBitmap is only a nice way to share the data with a worker so it doesn’t slow down your main thread as much, but then you’d need to get the data as you mentioned.
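In case it helps anyone following along, this is the shape of that pipeline as I understand it (a sketch; the worker file name and what happens with the pixels afterwards are placeholders):

// Main thread (sketch): grab the WebGL canvas as an ImageBitmap and transfer it to a worker.
const worker = new Worker("encoder-worker.js");

async function pushFrame(frameIndex) {
    const bitmap = await createImageBitmap(engine.getRenderingCanvas());
    worker.postMessage({ frameIndex, bitmap }, [bitmap]); // transferred, not copied
}

// encoder-worker.js (sketch): draw the bitmap into an OffscreenCanvas to get at the raw pixels.
let ctx = null;
self.onmessage = ({ data }) => {
    const { bitmap } = data;
    if (!ctx) {
        ctx = new OffscreenCanvas(bitmap.width, bitmap.height).getContext("2d");
    }
    ctx.drawImage(bitmap, 0, 0);
    bitmap.close();
    const rgba = ctx.getImageData(0, 0, ctx.canvas.width, ctx.canvas.height).data;
    // rgba can now go through rgbaToI420() / into the WASM encoder.
};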