Can Babylon add texture support for NV12 VideoFrame data?

When migrating to WebGPU, the ImageBitmap stream of data we receive from a third-party streaming service no longer works. The raw stream from that service is encoded as NV12. Would it be possible to add RawTexture support for NV12?

Reference


Does it not mean you have two input streams?

Also, you could use this code or something similar for decoding it: A fragment shader to convert NV12 to RGB. · GitHub
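The heart of it is standard YUV-to-RGB math. A rough sketch of such a fragment shader registered in Babylon's shader store (not the gist's exact code; the sampler names yTexture / uvTexture, the full-range assumption, and the BT.601 coefficients are just for illustration):

import * as BABYLON from "@babylonjs/core";

// Fragment shader for a ShaderMaterial named "nv12": samples the luma plane and the
// interleaved chroma plane as two textures and converts the result to RGB.
BABYLON.Effect.ShadersStore["nv12FragmentShader"] = `
    precision highp float;
    varying vec2 vUV;
    uniform sampler2D yTexture;  // full-resolution Y (luma) plane, single channel
    uniform sampler2D uvTexture; // half-resolution interleaved UV (chroma) plane

    void main(void) {
        float y = texture2D(yTexture, vUV).r;
        vec2 uv = texture2D(uvTexture, vUV).rg - vec2(0.5);
        // BT.601, full-range YUV -> RGB (limited-range video needs an extra offset/scale).
        gl_FragColor = vec4(
            y + 1.402 * uv.y,
            y - 0.344136 * uv.x - 0.714136 * uv.y,
            y + 1.772 * uv.x,
            1.0
        );
    }
`;

A matching nv12VertexShader that passes the UVs through as vUV is still needed to drive this from a ShaderMaterial.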

I guess it is a bit specific to inject everywhere a texture can be used.


We only get one of two input stream varieties, depending on whether the client is running WebGPU. Since an ImageBitmap is not consumable by the RawTexture update method under WebGPU, I will need to handle the fallout. Thanks for the reply and the reference, @sebavan.

The growing prevalence of the NV12 format, thanks to its memory efficiency in video streaming, might make a case for a RawTexture.createNV12Texture function, or perhaps a RawTexture.createLuminanceChromaTexture that could handle other, similar image formats.
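In the meantime I could probably build something like that in user land. A rough sketch of what I mean (the function name is made up, and it assumes the NV12 bytes have already been copied out of the VideoFrame, e.g. with VideoFrame.copyTo, into one tightly packed buffer):

import * as BABYLON from "@babylonjs/core";

// Hypothetical createLuminanceChromaTexture-style helper: splits a packed NV12 buffer
// into a single-channel Y texture and a two-channel interleaved UV texture.
function createNV12Textures(
    nv12: Uint8Array,
    width: number,
    height: number,
    scene: BABYLON.Scene
): { y: BABYLON.RawTexture; uv: BABYLON.RawTexture } {
    const ySize = width * height;
    const yPlane = nv12.subarray(0, ySize);
    const uvPlane = nv12.subarray(ySize, ySize + ySize / 2); // U and V interleaved, half resolution

    const y = new BABYLON.RawTexture(
        yPlane, width, height,
        BABYLON.Constants.TEXTUREFORMAT_R,
        scene, false, false,
        BABYLON.Texture.BILINEAR_SAMPLINGMODE,
        BABYLON.Constants.TEXTURETYPE_UNSIGNED_BYTE
    );
    const uv = new BABYLON.RawTexture(
        uvPlane, width / 2, height / 2,
        BABYLON.Constants.TEXTUREFORMAT_RG,
        scene, false, false,
        BABYLON.Texture.BILINEAR_SAMPLINGMODE,
        BABYLON.Constants.TEXTURETYPE_UNSIGNED_BYTE
    );
    return { y, uv };
}

The two textures could then be bound to the yTexture / uvTexture samplers of a ShaderMaterial using the fragment shader sketched above, and refreshed on each new frame with RawTexture.update() instead of being recreated.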


Why not use a regular texture and update its content with an ImageBitmap in this case?

RawTexture is only meant for raw binary data.
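For example, a DynamicTexture is one lean way to do it (a slightly different route from updateURL): draw each ImageBitmap into its 2D context and push the update. A sketch, with placeholder names and a hard-coded frame size:

import * as BABYLON from "@babylonjs/core";

declare const scene: BABYLON.Scene; // the current scene

// Regular dynamic texture sized to the stream; no mipmaps since it changes every frame.
const streamTexture = new BABYLON.DynamicTexture(
    "streamTexture",
    { width: 1280, height: 720 },
    scene,
    false
);

function onFrame(bitmap: ImageBitmap): void {
    const ctx = streamTexture.getContext();
    ctx.drawImage(bitmap, 0, 0); // copy the decoded frame into the backing canvas
    streamTexture.update();      // upload the canvas to the GPU texture
    bitmap.close();              // release the ImageBitmap once copied
}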

I'm trying for the leanest solution. A regular Texture seems to have considerably more going on.

So I tried the regular Texture route but was met with a black plane.

The following two callbacks yield a black screen with no errors:

frameCb: (frame) => {
  // eslint-disable-next-line @typescript-eslint/ban-ts-comment
  //@ts-ignore
  createImageBitmap(frame).then((bitmap) => {
    this.hyperbeamTextureTarget?.updateURL("", bitmap);
  });
}

and

frameCb: (frame) => {
  // eslint-disable-next-line @typescript-eslint/ban-ts-comment
  //@ts-ignore
  this.hyperbeamTextureTarget?.updateURL("", frame);
}

I am going to try creating a basic WebGPU texture and then wrapping it in a Babylon texture.
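Something like this for the WebGPU side (a sketch, assuming I can get at the GPUDevice and a decoded VideoFrame; the Babylon wrapping is the part I'm unsure about):

declare const device: GPUDevice;   // assumed to be the engine's GPUDevice
declare const frame: VideoFrame;   // current decoded frame from the stream

// Plain GPUTexture big enough for the frame. COPY_DST and RENDER_ATTACHMENT are
// required usages for copyExternalImageToTexture destinations.
const gpuTexture = device.createTexture({
    size: [frame.displayWidth, frame.displayHeight],
    format: "rgba8unorm",
    usage:
        GPUTextureUsage.TEXTURE_BINDING |
        GPUTextureUsage.COPY_DST |
        GPUTextureUsage.RENDER_ATTACHMENT,
});

// Copies the frame into the texture; the browser converts the contents to the destination format.
device.queue.copyExternalImageToTexture(
    { source: frame },
    { texture: gpuTexture },
    [frame.displayWidth, frame.displayHeight]
);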

Once I create the basic WebGPU texture, what is the best way to wrap it in Babylon?

It doesn’t seem that Babylon exposes the GPU device via the engine.

What pattern should I follow if I need to do my own direct WebGPU device interaction? I see _device as a private property but there is no getDevice() method.
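The only thing I can see so far is casting around the private field, which doesn't feel supported and may break between versions (sketch below, assuming _device really does hold the GPUDevice):

import * as BABYLON from "@babylonjs/core";

declare const scene: BABYLON.Scene;

// Not public API: _device is private, so this cast may break with any Babylon update.
const engine = scene.getEngine();
let device: GPUDevice | undefined;
if (engine instanceof BABYLON.WebGPUEngine) {
    // eslint-disable-next-line @typescript-eslint/no-explicit-any
    device = (engine as any)._device as GPUDevice;
}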

@Evgeni_Popov
Any ideas why the updateURL approach using a regular texture does not work?

frameCb: (frame) => {
  // eslint-disable-next-line @typescript-eslint/ban-ts-comment
  //@ts-ignore
  this.hyperbeamTextureTarget?.updateURL("", frame);
}

Note: The frame object being fed into that update callback is a valid ImageBitmap.

Are you able to reproduce in the playground? It’s hard to know what’s going on with just a snippet of code.

This PG does work, both in WebGL and WebGPU:


Thanks, @Evgeni_Popov. I will take another shot at it. I may be able to capture a video of the debug session, but I won’t be able to authenticate a PG with the third-party service.

updateURL should work correctly; the repro will definitely help.
