Getting depthSampler using custom shaderMaterials and custom postProcess

Hello everyone,
I am currently working on recreating the fluid renderer. I work with a lot of particle systems and move the particles inside the vertex shader, so I need to write my own shader. At the same time, the particles should still look like a fluid. So far I have followed the technical implementation from: Babylon.js docs

My problem:
I can’t figure out how the depth buffer is filled and how it can be reused in the post-process.
Here is a small example showing that the depth is calculated correctly and “saved” in gl_FragDepth.

However, I’m not able to use this depth in the post-process.
Can you explain how I can use the depth calculated in the fragment shader within the post-process?
Or is there a completely different/better way (still with custom post-processes and shaders)?
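For what it’s worth, the usual wiring in Babylon.js looks roughly like this. This is a minimal setup sketch with made-up names (`"depthPP"`, `"./myShader"`), assuming the standard `DepthRenderer` / `PostProcess` API; it is not the code from the question:

```javascript
// The depth renderer fills a render target texture with per-fragment depth.
const depthRenderer = scene.enableDepthRenderer(camera, /* storeNonLinearDepth */ false);
const depthMap = depthRenderer.getDepthMap();

// Declare an extra sampler ("depthSampler") when creating the post-process...
const postProcess = new BABYLON.PostProcess(
    "depthPP",          // name (made up for this sketch)
    "./myShader",       // fragment shader url / ShadersStore key (made up)
    [],                 // uniforms
    ["depthSampler"],   // extra samplers besides the default textureSampler
    1.0,                // size ratio
    camera
);

// ...and bind the depth map to that sampler on every apply.
postProcess.onApply = (effect) => {
    effect.setTexture("depthSampler", depthMap);
};
```

Inside the post-process fragment shader, the depth should then be readable as `texture2D(depthSampler, vUV).r`.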

Small edit:
First, thank you for your responses!
I want to add one or two things:
I want to recreate the fluid renderer or the bilateral filtering. I think in both cases it is executed as a post-process! @roland I’m aware that the post-process is in 2D, but I thought that is exactly the point of filling a depth texture: so that we get the depth of each fragment during post-processing.
The bilateral filtering is done like this:

And here, the depth is somehow encoded in the color? (textureSampler.r)
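On the encoding question: when the depth target is a float texture, the depth value can simply live in the red channel, which would explain why the filter reads `textureSampler.r`. On 8-bit color targets, a depth value has to be split across the channels instead. Below is a generic sketch of the classic RGBA packing, in plain JavaScript for illustration; this is not necessarily the exact scheme Babylon uses:

```javascript
// Classic trick for storing a high-precision depth value in an 8-bit RGBA
// color target: split the float into four successively finer "digits".
// Generic illustration only, not Babylon's exact encoding.

const fract = (x) => x - Math.floor(x);

// Pack a depth value d in [0, 1) into four channel values in [0, 1].
function packDepth(d) {
    const enc = [d, d * 255, d * 65025, d * 16581375].map(fract);
    return [
        enc[0] - enc[1] / 255,
        enc[1] - enc[2] / 255,
        enc[2] - enc[3] / 255,
        enc[3],
    ];
}

// Recover the depth: a dot product with the inverse scale factors.
function unpackDepth([r, g, b, a]) {
    return r + g / 255 + b / 65025 + a / 16581375;
}

// Round-trip is (near-)lossless in float math; on the GPU each channel is
// additionally quantized to 8 bits, which is what the scheme is built for.
console.log(unpackDepth(packDepth(0.372)));
```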

The fluid rendering uses a depthSampler, but I just don’t get where it comes from.
(Babylon.js/packages/dev/core/src/Shaders/fluidRenderingRender.fragment.fx at master · BabylonJS/Babylon.js · GitHub)
Here, @Eric.J showed that the depthMap works for meshes but not for a PCS. But a PCS also creates a mesh behind the scenes, doesn’t it? And it is flagged with forceDepthWrite, so in my opinion it should also write to the depthMap?
The problem is that we must create a lot of PCSs, because we need to load them one after another and must be able to control them individually.



@sebavan might know


I cannot get the depth of the PointsCloudSystem, but I can get the depth of a mesh like this. Hope this helps.


Actually, the depthMap also works for a PCS: if I create the PCS with a StandardMaterial and set its pointsCloud property to true, I get an image like the one below.

But if hasVertexAlpha is true, or if I use your custom ShaderMaterial, the depthMap is empty. Maybe someone else can help.
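For reference, the working setup described above would look roughly like this. This is a sketch reconstructed from the description (assumed names, not the actual playground code):

```javascript
// A PointsCloudSystem whose mesh gets a StandardMaterial with
// pointsCloud = true, so the depth renderer can pick it up.
const pcs = new BABYLON.PointsCloudSystem("pcs", 2, scene);
pcs.addPoints(10000, (particle) => {
    particle.position = new BABYLON.Vector3(
        Math.random(), Math.random(), Math.random()
    );
});

pcs.buildMeshAsync().then((mesh) => {
    const mat = new BABYLON.StandardMaterial("pcsMat", scene);
    mat.pointsCloud = true;   // render the mesh as points
    mesh.material = mat;
    // As discussed above: with hasVertexAlpha = true, or with a custom
    // ShaderMaterial, the depth map apparently stays empty.
});
```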


Thank you very much!
This is a great hint! It means the problem is neither the mesh nor the PCS. It might be that the StandardMaterial shader is what writes to the depthMap. Maybe I can find a hint in the Git repository, and I may have to write the depth on my own :slight_smile: I have to look into this a bit deeper, but it helps a lot!

It starts here:

Okay, this is also great! I will have a look at it, thank you!


I got a working example. I somehow mixed both examples, but finally,

helped me to fix the problem. It works without using hasVertexAlpha. Here is a working example:

In the end, one must set the material for the depth renderer (depthRenderer.setMaterialForRendering). Don’t ask me how it calculates the depth or knows which depth it should use (I think it uses gl_FragColor), but it works. It also works for multiple meshes, and if a mesh’s visibility is 0 it is not considered, which is also important for me.
Thanks for the support, everyone :slight_smile:
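For readers landing here later, the fix described above looks roughly like this. All names are assumptions for illustration: `pcsMesh` stands for the mesh built by the PCS, and `"pcsDepth"` for a hypothetical depth-only shader registered in the ShadersStore:

```javascript
// Give the depth renderer a dedicated material for this mesh, so the depth
// map is rendered with a shader that outputs usable depth, instead of the
// mesh's display ShaderMaterial (which left the depth map empty).
const depthRenderer = scene.enableDepthRenderer(camera);

// Hypothetical depth-only ShaderMaterial; "pcsDepth" is assumed to exist
// in BABYLON.Effect.ShadersStore.
const depthMaterial = new BABYLON.ShaderMaterial("pcsDepthMat", scene, "pcsDepth", {
    attributes: ["position"],
    uniforms: ["worldViewProjection"],
});

// The depth renderer now uses depthMaterial (instead of pcsMesh.material)
// when drawing pcsMesh into the depth map; as noted above, the stored depth
// comes from what this shader writes to gl_FragColor.
depthRenderer.setMaterialForRendering(pcsMesh, depthMaterial);
```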


Congrats! :slight_smile: :muscle: