WebGPU WGSL Vertex shader definition for PostProcess

Hey there!

Right now I'm trying to create a post-process shader with WebGPU. Using some examples from the docs I initialized the fragment shader and it works just fine, but when I add a vertex shader definition it fails with the error shown in the attached screenshot.

Do I need to include both the sceneUboDeclaration and meshUboDeclaration uniform definitions for a post-process vertex shader, or should something very different be used in this case?

Also, I'll put this here:

When I try to pass a RawTexture to a compute shader, it throws a strange error.

Looking for suggestions.

Just remove the vertexUrl option of the PostProcess constructor; there's already a default vertex URL used by the class. You also don't have to set the uniforms / uniformBuffers parameters.
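For reference, a minimal sketch of what that could look like (the shader name, camera, engine and option values are illustrative; check the PostProcess constructor signature for your Babylon.js version, including where the WGSL shader-language flag goes if your fragment shader is WGSL):

```ts
// Hypothetical post-process: only the fragment shader is provided, so the
// default vertex shader shipped with the PostProcess class is used.
const pp = new BABYLON.PostProcess(
    "myPostProcess",   // name
    "myShader",        // fragmentUrl (shader registered in the shader store)
    null,              // parameters (uniforms) - not needed here
    null,              // samplers - not needed here
    1.0,               // options (ratio)
    camera,            // camera the post-process is attached to
    undefined,         // samplingMode
    engine             // engine
    // no vertexUrl: the class falls back to its built-in vertex shader
);
```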

Generating mipmaps is not supported for integer textures; you should pass "false" for the generateMipMaps parameter.
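A rough sketch of what that might look like, assuming an RGBA unsigned-integer texture created with the standard RawTexture constructor (the sizes, data and the compute-shader binding name are made up):

```ts
// Hypothetical integer texture: mipmap generation is disabled, since it is
// not supported for integer formats.
const width = 256;
const height = 256;
const data = new Uint32Array(width * height * 4); // RGBA, one uint per channel

const rawTexture = new BABYLON.RawTexture(
    data,
    width,
    height,
    BABYLON.Constants.TEXTUREFORMAT_RGBA_INTEGER,   // integer format
    scene,
    false,                                          // generateMipMaps: must be false for integer textures
    false,                                          // invertY
    BABYLON.Texture.NEAREST_SAMPLINGMODE,           // integer textures can't be filtered
    BABYLON.Constants.TEXTURETYPE_UNSIGNED_INTEGER  // uint texel type
);

// Binding it to a compute shader (binding name is illustrative):
computeShader.setTexture("inputTex", rawTexture, false);
```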

Thank you for the answer!

About the vertex shader: I was trying to get the view direction in the vertex shader and pass it down to the fragment shader.
About mipmaps: got it, I thought a falsy value would be enough.
Can I also clarify one detail?
I saw your PR about adding creationFlags to the raw 3D texture, but unfortunately I can't pass the texture to the compute shader; I'm getting the error shown in the attached screenshot.

Please tell me it's possible to do :slight_smile:

You won’t have access to the view direction, as a post-process is a 2D rendering: you don’t have access to anything 3D in the vertex shader (for a post-process, the geometry being rendered is a simple 2D plane the size of the screen). You’ll have to calculate it yourself on the JavaScript side and pass it as a uniform to your fragment shader (or calculate it in the fragment shader).
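If it helps, a rough sketch of the JavaScript side (the uniform names "invProjection" / "invView" are made up; they have to match what your fragment shader declares, and be listed in the PostProcess uniforms if your setup requires it):

```ts
// Hypothetical: send the inverse view/projection matrices every frame so the
// fragment shader can reconstruct a per-pixel view direction from the UVs.
postProcess.onApplyObservable.add((effect) => {
    const invProjection = BABYLON.Matrix.Invert(camera.getProjectionMatrix());
    const invView = BABYLON.Matrix.Invert(camera.getViewMatrix());
    effect.setMatrix("invProjection", invProjection);
    effect.setMatrix("invView", invView);
});
```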

Rendering in a 3D texture is gated by a special flag in Chrome, as it is not yet officially supported (--enable-dawn-features=allow_unsafe_apis). On the Babylon.js side, we don’t yet support it either, but it’s on our roadmap.

That’s sad.