Hello!
This is the first time I’m asking a question on this forum, so I hope I’m doing it correctly.
So I’m working on some code that needs compute shaders to read camera information, like a deferred renderer would. I would like to use the position, normal, and color textures.
First I tried using the GeometryBufferRenderer to get the normals and positions, which worked fine.
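This is roughly what that part looks like (a simplified sketch rather than my exact Playground code; the WGSL is just a placeholder, and the *_TEXTURE_TYPE constants / texture indices may differ between Babylon versions):

```js
// Minimal WGSL that just reads the two G-buffer textures (placeholder logic)
const wgslSource = `
    @group(0) @binding(0) var normalTex : texture_2d<f32>;
    @group(0) @binding(1) var positionTex : texture_2d<f32>;

    @compute @workgroup_size(8, 8, 1)
    fn main(@builtin(global_invocation_id) id : vec3<u32>) {
        let n = textureLoad(normalTex, vec2<i32>(id.xy), 0);
        let p = textureLoad(positionTex, vec2<i32>(id.xy), 0);
        _ = n; _ = p;
    }
`;

// Enable the geometry buffer renderer and ask for the position texture
const gbr = scene.enableGeometryBufferRenderer();
gbr.enablePosition = true;

// Look up the normal / position textures inside the G-buffer
const gBuffer = gbr.getGBuffer();
const normalTex = gBuffer.textures[gbr.getTextureIndex(BABYLON.GeometryBufferRenderer.NORMAL_TEXTURE_TYPE)];
const positionTex = gBuffer.textures[gbr.getTextureIndex(BABYLON.GeometryBufferRenderer.POSITION_TEXTURE_TYPE)];

// Bind them as read-only inputs of the compute shader
const cs = new BABYLON.ComputeShader("camInfo", engine, { computeSource: wgslSource }, {
    bindingsMapping: {
        normalTex: { group: 0, binding: 0 },
        positionTex: { group: 0, binding: 1 },
    },
});
cs.setTexture("normalTex", normalTex, false);   // false = no sampler binding
cs.setTexture("positionTex", positionTex, false);
```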
However, I couldn’t figure out how to get the camera’s output as a read-only input texture in a compute shader. Whenever I tried to bind it to the compute shader, I got a usage validation error:

(TextureUsage::(TextureBinding|RenderAttachment)) includes writable usage and another usage in the same synchronization scope.
While validating render pass usage.

I thought about cloning/copying the texture into another one with the proper usage flags, but I wasn’t able to, and I also don’t know how to change a texture’s usage flags after it has been created.
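To show what I mean, this is roughly the kind of attempt that produces that error (again simplified: `colorRT` just stands in for wherever the camera’s color output ends up, and `cs` is a compute shader set up like the one above but with a `colorTex` binding):

```js
// Have the camera render its color output into a render target texture
const colorRT = new BABYLON.RenderTargetTexture(
    "cameraColor",
    { width: engine.getRenderWidth(), height: engine.getRenderHeight() },
    scene
);
colorRT.activeCamera = camera;
colorRT.renderList = scene.meshes;
scene.customRenderTargets.push(colorRT);

// Binding it as a plain sampled texture is where the usage error shows up
cs.setTexture("colorTex", colorRT, false);
```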
Then I found out about the PrePassRenderer, and I thought it would give me the data I was looking for, but I still wasn’t able to access its textures, even though I can see that they exist in the Inspector.
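This is more or less what I tried with the PrePassRenderer (simplified again; `cs` is the same compute shader idea as above, with a `colorTex` binding):

```js
// Enable the pre-pass renderer
const prePass = scene.enablePrePassRenderer();
prePass.markAsDirty();

// Try to pull the color texture out of the pre-pass render target
// (getIndex can return -1 if no effect has requested that texture type yet)
const prePassRT = prePass.getRenderTarget();
const colorIndex = prePass.getIndex(BABYLON.Constants.PREPASS_COLOR_TEXTURE_TYPE);
const colorTex = prePassRT.textures[colorIndex];

// Binding it to the compute shader is the part I can't get to work
cs.setTexture("colorTex", colorTex, false);
```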
I put together this Playground that shows what I’m trying to do:
Babylon.js Playground
How can I bind the camera’s color output to a compute shader as a read-only texture?
Thank you