Chaining WebGPU Post-Processing

I’m having a lot of trouble trying to chain my custom WebGPU post-processes. Basically, I just want to separate my shaders and apply them one on top of the other when they’re toggled (and for organization’s sake), but I can’t figure out how to get the sampler into WGSL.

I should mention I’m new to shaders, but looking at the docs it seems that I need a sampler in order to read the colors from the previously applied PostProcess. I couldn’t find a WGSL counterpart in the guide that explains how to access it.
Here’s my code so far; I believe my only issue is getting the sceneSampler into my WGSL:

let starshader = new PostProcess(...)
camera.attachPostProcess(starshader)

// nebulashader should apply on top of starshader, via mix()
let nebulashader = new PostProcess("NebulaShader", "nebulas", ["seed"], ["sceneSampler"],
    1, camera, undefined, engine, true, undefined, undefined, undefined,
    undefined, undefined, undefined, ShaderLanguage.WGSL)
nebulashader.onApply = (effect) => {
    effect.setFloat("seed", 1)
    effect.setTextureFromPostProcess("sceneSampler", nebulashader)
}
camera.attachPostProcess(nebulashader)

In my nebulasPixelShader, whenever I try something like uniform sceneSampler : sampler, it errors with “sampler cannot be used as the type of a structure member”.

In WebGPU, you also need a sampler in addition to a texture to be able to sample from this texture.

Here’s how to do it:

The chaining is automatic: just use textureSampler in your shader, and it will automatically be bound to the output of the previous post-process.
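If it helps, here’s a sketch of what the WGSL side typically looks like with Babylon’s naming convention, where the auto-bound sampler shares the texture’s name with a “Sampler” suffix (treat the exact declarations as an assumption and verify against the Babylon.js WGSL shader docs):

```wgsl
// Babylon binds the previous pass's output to "textureSampler" automatically;
// the companion sampler is exposed under the same name + "Sampler" suffix
// (assumed convention - check the Babylon.js WGSL documentation).
varying vUV: vec2f;

var textureSamplerSampler: sampler;
var textureSampler: texture_2d<f32>;

@fragment
fn main(input: FragmentInputs) -> FragmentOutputs {
    let sceneColor = textureSample(textureSampler, textureSamplerSampler, fragmentInputs.vUV);
    fragmentOutputs.color = sceneColor; // mix() your nebula color with sceneColor here
}
```

With this, you don’t need a custom sceneSampler at all: drop it from the samplers list and from the onApply callback.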


A follow-up to this: I’ve been encountering an issue where only the top-most post-process is rendered on each .render() call, and subsequent calls render the post-processes below it, as if each post-process only samples the texture directly below it.

I’m unsure how to replicate it in the playground since I’m rendering frames on-demand, but my code is something like this: Babylon.js Playground

You can see it occurring on my website here (click ‘Render’ on the right).

The PG does not work because you have to use the same number for both loops; otherwise you get some errors in the console.

Yes, that’s how it works.

The first post-process P1 samples from its texture T1 (which has been filled by a rendering of the scene) and generates its result in texture T2, which is the texture of post-process P2.
The second post-process P2 samples from its texture T2 and generates its result in T3.
…etc…
The last post-process Pn samples from its texture Tn and generates the final output on screen.
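To make that data flow concrete, here is a tiny framework-free JavaScript sketch (all names hypothetical, no Babylon involved) where each “texture” is just an array and each pass reads only the previous pass’s output:

```javascript
// Simulated post-process chain: each pass transforms the previous output.
const scene = [10, 128, 200]; // pretend texture T1: the rendered scene

const passes = [
  (texel) => Math.floor(texel / 2), // P1 reads T1, writes T2
  (texel) => texel + 5,             // P2 reads T2, writes the final output
];

function runChain(input, passes) {
  let current = input;
  for (const pass of passes) {
    current = current.map(pass); // a pass only ever sees the previous result
  }
  return current;
}

console.log(runChain(scene, passes)); // logs [ 10, 69, 105 ]
```

The point of the sketch: P2 never sees the original scene, only P1’s result, which is exactly why rendering a single frame advances each pass by “one step” in the chain.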
