How would you implement frame interpolation in BJS

So by frame interpolation I mean: if you’re running at 30 fps, inferring the in-between frames so it looks like 60 fps. (though the input latency won’t change)

Babylon already has a GBuffer (geometry buffer) which provides a velocity texture.

so it should be fairly easy to:

  • render a frame
  • extrapolate the next frame using the velocity texture
  • skip rendering on the next frame and display the extrapolated frame instead
  • then just ping-pong between render and don’t-render
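Roughly, the loop I have in mind would look something like this (just a sketch; `extrapolateFromVelocity` and `presentRTT` are hypothetical helpers, not actual Babylon API):

```ts
// Sketch only: alternate real renders with showing an extrapolated frame.
let renderReal = true;

engine.runRenderLoop(() => {
  if (renderReal) {
    scene.render(); // real frame (also fills the GBuffer's velocity texture)
    extrapolateFromVelocity(); // hypothetical: predict the next frame into an RTT
  } else {
    presentRTT(); // hypothetical: just display the precomputed RTT
  }
  renderReal = !renderReal;
});
```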

I see two complications here, however.

1: How do I stop all calculations in BJS and essentially just put an RTT on display for one frame?

AFAIK, to display the RTT I’d have to call scene.render(), and that does a whole bunch of unnecessary work when all I want is to display one precalculated RTT.

2: How would I extrapolate from velocity lol

The best method I can think of right now is, in the fragment shader, to check the neighbours’ velocities to see whether they contribute to the current pixel.

BUT, depending on how many neighbours I sample, there’s a cap on the maximum velocity I can react to. (so if I sample 3 neighbours, I can only extrapolate movements of up to 3 pixels correctly)
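To make the gather idea concrete, here’s a toy CPU version in 1D (my own sketch, not actual shader code): each output pixel scans neighbours within `radius` and takes the colour of any neighbour whose velocity lands on it — which also demonstrates the radius cap.

```ts
// Toy 1D model of the gather approach: each output pixel looks at
// neighbours within `radius` and takes the colour of any neighbour whose
// velocity carries it here. Movements larger than `radius` are missed.
function extrapolate1D(
  color: number[],
  velocity: number[], // pixels moved per frame (integers, for simplicity)
  radius: number
): number[] {
  const out = color.slice(); // fallback: keep the old colour
  for (let x = 0; x < color.length; x++) {
    for (let d = -radius; d <= radius; d++) {
      const src = x - d;
      if (src < 0 || src >= color.length) continue;
      if (velocity[src] === d && d !== 0) out[x] = color[src];
    }
  }
  return out;
}

// A bright pixel at index 2 moving +2 px/frame, radius 3:
const moved = extrapolate1D([0, 0, 1, 0, 0, 0], [0, 0, 2, 0, 0, 0], 3);
// moved[4] is now 1 -- the pixel was carried 2 px forward.
```

With a velocity of 5 and the same radius of 3, the pixel isn’t carried anywhere — exactly the cap described above.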

Also, large and sudden one-shot changes will look weird, since they’re supposed to take only one frame. (if extrapolated, it’ll look like the movement happened twice forwards and once backwards)

I would love to hear everyone’s thoughts regarding this : )

That sounds like a fun experiment!

Another challenge beyond one-shot changes: when turning the view, even slowly, you will lack pixel information and it will result in visual artifacts on one side of the screen :thinking:

If you render a larger texture and display only a cropped version you can avoid this problem though, as long as you cap the player’s camera rotation speed!

If you render your scene as a cubemap, you might even be able to handle 1 shot changes but that might also look weird haha


it will result in visual artifacts on one side of the screen

True! Though I expect it won’t be too bad, since in the worst case it’ll just stay static when it should change? (so objects might appear to shrink or elongate a tiny bit?)

display only a cropped version

though if it does get too bad this would for sure solve it : D

If you render your scene as a cubemap, you might even be able to handle 1 shot changes

even if we render it as a cubemap, it still won’t know when/where not to interpolate, yeah?

I think you underestimate how good our brains are at detecting visual anomalies :wink: But then it all depends on how you do it, so who knows ^^

Ah, I understand what you mean. The cubemap idea was for a camera that can only rotate, not translate: you would have the entire information at all times, so you can interpolate as much as you want. That falls apart the moment there is a translation, and indeed one-shot changes are still an issue :thinking:

Now that I am thinking about it, there is now support for scene.customRenderFunction, which should allow you to get rid of the unnecessary stuff.


I think you underestimate how good our brains are at detecting visual anomalies

lmao true : )

The cubemap idea was for a camera that can only rotate and not translate

I wasn’t just talking about the camera, I was talking about like, what if a mesh in the scene suddenly jumps 5 units at once (say as part of an undo-redo action) : O

there is now support for scene.customRenderFunction that should allow you to get rid of the unecessary stuff.

oh, you mean alternate between the normal render and a custom render? if that’s possible, that would solve the 1st issue : D

Good point! Then teleportation must be forbidden :laughing:

Something like that, yes, but then I am not sure how you would send the RTT directly to the screen with BabylonJS without relying on a dummy camera and a screen plane :thinking:


teleportation must be forbidden

nooooo ;^;

but then I am not sure how you would send the RTT directly to the screen with BabylonJS without relying on a dummy camera and screen plane

a screen plane is basically just a postprocess, so yeah, I would make a simple PP that takes in a texture and displays it : )
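something like this, I guess (untested sketch; the shader and sampler names are mine, and `extrapolatedRTT` would be the texture from the extrapolation step):

```ts
// Untested sketch: a pass-through post-process that just displays a texture.
BABYLON.Effect.ShadersStore["showRttFragmentShader"] = `
  varying vec2 vUV;
  uniform sampler2D rttSampler;
  void main(void) { gl_FragColor = texture2D(rttSampler, vUV); }
`;

const pp = new BABYLON.PostProcess("showRtt", "showRtt", null, ["rttSampler"], 1.0, camera);
pp.onApply = (effect) => effect.setTexture("rttSampler", extrapolatedRTT);
```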


That definitely works, but we are talking about removing unnecessary steps haha. Wouldn’t it be so much nicer to just copy the RTT to the swapchain, no need for rasterizing screen planes and all ^^ Don’t know if that’s possible with WebGL though :thinking:


oh yeah truu that would be faster : O
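(for reference, WebGL2 at least does have gl.blitFramebuffer for exactly this kind of copy; rough untested sketch, where `rttFramebuffer` would be whatever FBO the RTT lives in:)

```ts
// Untested sketch: copy an offscreen framebuffer to the canvas (WebGL2 only).
gl.bindFramebuffer(gl.READ_FRAMEBUFFER, rttFramebuffer);
gl.bindFramebuffer(gl.DRAW_FRAMEBUFFER, null); // null = default framebuffer
gl.blitFramebuffer(
  0, 0, width, height, // source rectangle
  0, 0, width, height, // destination rectangle
  gl.COLOR_BUFFER_BIT, gl.NEAREST
);
```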