Hello, I have been looking at the basic compute shader examples and attempting to adapt one of them. The example I am using is the simple texture copy included in the documentation, and I would like to modify it to take a VideoTexture as input.
So far I have been unable to get the texture to update or play, even after changing the code to dispatch the kernel each frame. In another example, the WGSL shaders used an external_texture as input, but this doesn't appear to be available in the ComputeShader class.
Using the CopyTextureToTexture class I can copy the buffer into the input texture as below, but I would like to avoid creating an unnecessary buffer, if possible.
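Roughly, the per-frame copy looks like the sketch below (not my exact code; all names are placeholders, and it assumes Babylon.js's CopyTextureToTexture, whose copy(source, dest) returns false while its internal effect is still compiling):

```javascript
// Sketch: copy the current video frame into the compute shader's input
// texture every frame, then run the kernel. Assumes Babylon.js's
// CopyTextureToTexture helper; parameter names are placeholders.
function copyVideoEveryFrame(scene, copier, videoTexture, inputTexture, cs, width, height) {
    scene.onBeforeRenderObservable.add(() => {
        // copy() returns false while its internal effect is still
        // compiling, so only dispatch once the copy has happened.
        if (copier.copy(videoTexture, inputTexture)) {
            // 8x8 workgroups assumed in the WGSL kernel.
            cs.dispatch(Math.ceil(width / 8), Math.ceil(height / 8), 1);
        }
    });
}

// Typical call, with `copier = new BABYLON.CopyTextureToTexture(engine)`:
// copyVideoEveryFrame(scene, copier, videoTexture, inputTexture, cs, 512, 512);
```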
Thanks. Also, is there any overhead incurred from copying the video texture to an RTT using the CopyTextureToTexture class? Or should I write a small EffectWrapper to leverage ExternalTexture and copy via the GPU?
Copying a texture does add some cost, since it means an extra GPU copy, but probably not much.
You can use an EffectWrapper to read the external texture and write to another texture. This is essentially the same as using the compute shader to write to a storage texture. In both cases, you avoid copying into an intermediate texture.
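A sketch of that approach, assuming Babylon.js's EffectWrapper / EffectRenderer API (identifiers here are illustrative, not from your code): sample the video in a fragment shader and render a fullscreen quad straight into the target texture, so no intermediate copy is needed.

```javascript
// Sketch: blit a VideoTexture into a render target each frame via an
// EffectWrapper. Assumes the global BABYLON namespace; all parameter
// names are placeholders.
function setupVideoBlit(engine, scene, videoTexture, targetTexture) {
    const wrapper = new BABYLON.EffectWrapper({
        engine,
        name: "videoToTarget",
        // The default EffectWrapper vertex shader provides vUV.
        fragmentShader: `
            varying vec2 vUV;
            uniform sampler2D videoSampler;
            void main(void) {
                gl_FragColor = texture2D(videoSampler, vUV);
            }`,
        samplerNames: ["videoSampler"],
    });
    const renderer = new BABYLON.EffectRenderer(engine);

    scene.onBeforeRenderObservable.add(() => {
        if (wrapper.effect.isReady()) {
            // Re-bind the video texture each frame so the latest frame
            // is sampled, then render into the target texture.
            wrapper.effect.setTexture("videoSampler", videoTexture);
            renderer.render(wrapper, targetTexture);
        }
    });
    return { wrapper, renderer };
}
```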
Understood, and the setTexture call works with a VideoTexture on WebGL. Any idea why the uniforms are not being passed in correctly? Ideally, I would like to create one blur shader instance and call it multiple times in the same update, but in the playground the uniforms are not being passed, and in my React app they error (uniform1i: location is not for current program).
Do I need to modify the viewport and framebuffer as in this thread?
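For reference, one pattern for reusing a single blur wrapper across several passes in the same update is to set the per-pass uniforms from onApplyObservable, which fires only after the wrapper's program is bound, so the uniform locations belong to the current program (setting them while another program is bound is a likely cause of that uniform1i error). A sketch, with all names and pass fields (source, target, dirX, dirY) made up for illustration:

```javascript
// Sketch: run one blur EffectWrapper several times in a single update,
// binding each pass's uniforms from onApplyObservable so they are set
// against the currently bound program. Assumes Babylon.js Effect /
// Observable semantics; all identifiers are illustrative.
function runBlurPasses(renderer, blurWrapper, passes) {
    for (const pass of passes) {
        const observer = blurWrapper.onApplyObservable.add(() => {
            blurWrapper.effect.setTexture("textureSampler", pass.source);
            blurWrapper.effect.setFloat2("direction", pass.dirX, pass.dirY);
        });
        renderer.render(blurWrapper, pass.target);
        // Detach so the next pass's observer replaces this one.
        blurWrapper.onApplyObservable.remove(observer);
    }
}
```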