ProceduralTexture rendered in afterRender causes buffer error

PG: https://playground.babylonjs.com/#DHJVBN#3
Sorry, I failed to reproduce the bug, but there is a difference: uncommenting line 56 makes the proceduralTexture render.

When the proceduralTexture is not rendering, as you can see, the depth buffer changes before the noise effect renders.

When the proceduralTexture is rendering, the depth buffer changes before my custom effect renders.


In my situation, this delayed change gave my meshes an incorrect depth buffer.
So is this delay expected behavior?

You can change the depth test state used by the engine when rendering the procedural texture by adding an observer to onBeforeGenerationObservable:

```js
proceduralTexture.onBeforeGenerationObservable.add(() => {
    engine.setDepthBuffer(false);
});
```

You should reset the state once the texture has been rendered:

```js
proceduralTexture.onGeneratedObservable.add(() => {
    engine.setDepthBuffer(true);
});
```
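To make the intent of these two observers concrete, here is a minimal, self-contained sketch of the pattern using a mock `Observable`, `engine`, and `proceduralTexture` (none of these are the real Babylon.js implementations; they are stand-ins that only model the depth-buffer flag and the before/after generation hooks):

```javascript
// Minimal mock of Babylon.js's Observable, just enough for this sketch.
class Observable {
  constructor() { this.observers = []; }
  add(cb) { this.observers.push(cb); }
  notifyObservers(data) { this.observers.forEach((cb) => cb(data)); }
}

// Hypothetical engine: only tracks the depth-buffer enable flag.
const engine = {
  depthBuffer: true,
  setDepthBuffer(enabled) { this.depthBuffer = enabled; },
};

// Hypothetical procedural texture exposing the two observables
// discussed above; render() fires them around the generation step.
const proceduralTexture = {
  onBeforeGenerationObservable: new Observable(),
  onGeneratedObservable: new Observable(),
  render(log) {
    this.onBeforeGenerationObservable.notifyObservers();
    log.push(`generate (depthBuffer=${engine.depthBuffer})`);
    this.onGeneratedObservable.notifyObservers();
  },
};

// Disable the depth buffer while the texture generates, restore it after.
proceduralTexture.onBeforeGenerationObservable.add(() => engine.setDepthBuffer(false));
proceduralTexture.onGeneratedObservable.add(() => engine.setDepthBuffer(true));

const log = [];
proceduralTexture.render(log);
console.log(log[0]);             // generate (depthBuffer=false)
console.log(engine.depthBuffer); // true (restored after generation)
```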

I have found the cause in my situation.
Babylon.js is not the only thing controlling the frame buffer, so the depth buffer gets reset every frame. In the Playground it works fine because the depth buffer is inherited from the previous frame.
So I have to restore the depth buffer to the previous frame's state when Babylon.js starts rendering. I simply used proceduralTexture.onBeforeGenerationObservable; it is really ugly, but it does work.

The playground doesn’t do anything specific regarding the depth buffer: it is cleared at the beginning of each frame by the Scene.render function. In any case, you should be able to do the same thing on your side(?)
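In real Babylon.js code this "clear at the start of each frame" approach would hang off something like scene.onBeforeRenderObservable together with the engine's clear call; the snippet below is only an illustrative mock of the frame loop (all names are hypothetical) showing why clearing at the top of the frame stops stale depth data from an external renderer leaking into the scene:

```javascript
// Mock of a shared depth buffer: another renderer dirties it between
// frames, and our frame loop clears it first thing each frame.
// All names are illustrative stand-ins, not the real Babylon.js API.
const depthBuffer = { cleared: false, dirtiedBy: null };

function externalRenderer() {
  // Another system sharing the frame buffer leaves stale depth data.
  depthBuffer.cleared = false;
  depthBuffer.dirtiedBy = "external";
}

function renderFrame() {
  // Equivalent of Scene.render clearing buffers at the start of the
  // frame, before any meshes or procedural textures are drawn.
  depthBuffer.cleared = true;
  depthBuffer.dirtiedBy = null;
  // ... scene rendering would happen here ...
}

externalRenderer();
renderFrame();
console.log(depthBuffer.cleared); // true: the stale depth data from the
                                  // external renderer never reaches us
```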


Yes, this is what I should have done, rather than the accidental workaround I found before. Thanks for pointing it out! It is much better than using proceduralTexture.onBeforeGenerationObservable, though I still have to set all the needed properties and then set them back to trigger the dirty flag.