Gradient for smooth scene boundaries

Hello community,

I’m wondering how to create a smooth transition so the scene doesn’t look like it simply ends. Imagine the camera is in the middle of the scene, above a plane of e.g. 10x10 units. Several objects are placed on the plane. In this setup you can clearly see all the objects and the edges of the plane. Is it possible to create a gradient that affects the whole scene, for all vertices farther than, say, 7 units from the center? I hope it’s clear what I mean.

My first thought was volumetric fog. The built-in fog is influenced by the camera position, which is not what I am looking for. Using a particle system is too expensive imo. The best solution for me would be a shader that gradually reduces the opacity of vertices that are farther than some distance X from the center.
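To make that more concrete, here is a rough sketch of the idea (not my playground code; the material setup, names and numbers are just examples) using a ShaderMaterial that fades fragments out based on their world-space distance from the center:

```js
// Sketch only: fade a material out by world-space distance from the scene center.
const fadeMat = new BABYLON.ShaderMaterial("fadeMat", scene, {
    vertexSource: `
        precision highp float;
        attribute vec3 position;
        uniform mat4 world;
        uniform mat4 worldViewProjection;
        varying vec3 vWorldPos;
        void main() {
            vWorldPos = (world * vec4(position, 1.0)).xyz;
            gl_Position = worldViewProjection * vec4(position, 1.0);
        }
    `,
    fragmentSource: `
        precision highp float;
        varying vec3 vWorldPos;
        uniform float maxDist; // the "X" mentioned above, e.g. 7
        void main() {
            float dist = length(vWorldPos.xz); // distance to the center on the plane
            float alpha = 1.0 - smoothstep(maxDist, maxDist + 2.0, dist); // fade out over 2 units
            gl_FragColor = vec4(0.8, 0.3, 0.3, alpha); // flat color just for the sketch
        }
    `
}, {
    attributes: ["position"],
    uniforms: ["world", "worldViewProjection", "maxDist"],
    needAlphaBlending: true
});
fadeMat.setFloat("maxDist", 7);
```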

This is my approach so far: https://playground.babylonjs.com/#AM5JBF#1

It keeps the fog static with respect to camera zoom, but it doesn’t work for panning. And it doesn’t allow a gradient to full transparency.

Best

Do you mean something like this?

https://playground.babylonjs.com/#AM5JBF#2

Thanks @Evgeni_Popov!

My first general question is:

Is it even possible to implement a shader that affects the whole scene, or can this only be done per material? Otherwise I would have to add the shader snippet to every material (see PG), which I would like to avoid.

Regarding your shader snippet:

Is it necessary to use a color for the gradient? If the scene contains a skybox, or a non-uniform background in general, that would lead to problems (see PG). It would be more flexible to decrease the opacity instead.

The gradient decreases from 0 to dist, am I right? Is it possible to define a start and an end instead, so that the gradient starts at e.g. 10 and gradually decreases the opacity until it is fully transparent at 12?

https://playground.babylonjs.com/#AM5JBF#4
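Regarding the start/end question, something like this is what I have in mind (hypothetical uniform names, assuming a material like the `fadeMat` sketch above):

```js
// Hypothetical start/end uniforms: fully opaque up to fadeStart,
// fully transparent at fadeEnd, with a smooth falloff in between.
// In the fragment shader this would be something like:
//     float alpha = 1.0 - smoothstep(fadeStart, fadeEnd, length(vWorldPos.xz));
fadeMat.setFloat("fadeStart", 10);
fadeMat.setFloat("fadeEnd", 12);
```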

You can use a post process to apply a shader to the whole scene, but it’s a 2D process that comes after the scene is rendered: you would need to enable the geometry buffer renderer or the prepass renderer to get the 3D position corresponding to each pixel.
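Roughly, that route could look like this (sketch only: the shader, uniform and sampler names are made up, and you should check whether the position texture is stored in world or view space for your setup):

```js
// Sketch: full-screen post process that reads each pixel's position from the
// geometry buffer and fades the rendered scene out by distance from the center.
BABYLON.Effect.ShadersStore["distanceFadeFragmentShader"] = `
    precision highp float;
    varying vec2 vUV;
    uniform sampler2D textureSampler;   // the rendered scene
    uniform sampler2D positionSampler;  // positions from the G-buffer
    uniform float fadeStart;
    uniform float fadeEnd;
    void main() {
        vec4 color = texture2D(textureSampler, vUV);
        vec3 pos = texture2D(positionSampler, vUV).xyz; // assumed usable as a world position here
        float fade = smoothstep(fadeStart, fadeEnd, length(pos.xz));
        gl_FragColor = mix(color, vec4(0.0), fade); // fade toward the clear color
    }
`;

const gbr = scene.enableGeometryBufferRenderer();
gbr.enablePosition = true; // make the G-buffer store positions

const pp = new BABYLON.PostProcess("distanceFade", "distanceFade",
    ["fadeStart", "fadeEnd"], ["positionSampler"], 1.0, camera);

const posIndex = gbr.getTextureIndex(BABYLON.GeometryBufferRenderer.POSITION_TEXTURE_TYPE);
pp.onApply = (effect) => {
    effect.setFloat("fadeStart", 10);
    effect.setFloat("fadeEnd", 12);
    effect.setTexture("positionSampler", gbr.getGBuffer().textures[posIndex]);
};
```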

You can use the alpha channel and apply blending, but I’m not sure it is what you want, as the blending will be done with whatever is below (which can be the ground):

It’s better if the ground also uses the material:
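i.e. something along these lines (assuming the plane mesh is called `ground` and the shared fade material is `fadeMat`):

```js
ground.material = fadeMat; // so the plane fades out with the same gradient as the boxes
```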

See PGs above.

Thanks again.

I already thought about a post process, but as you mentioned it is just a 2D process. I assume that using the geometry buffer or prepass renderer is more expensive than using the shader on the materials? Then I will stick with your solution. I only have to use the shader for objects that fall inside the gradient range, so that should be fine.

Your second PG looks good :slight_smile: But the boxes shouldn’t lose their depth / show through each other. Is it possible to fix this without losing the opacity gradient?

EDIT:

I noticed that enabling needDepthPrePass solves the problem, but it definitely reduces the fps. Is there maybe another option? https://playground.babylonjs.com/#AM5JBF#7
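For reference, the relevant change is just this line (assuming `fadeMat` is the shared material):

```js
// Renders a depth-only pass first so the transparent boxes occlude each other
// correctly, at the cost of rendering each mesh a second time.
fadeMat.needDepthPrePass = true;
```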

Not really. As the alpha channel is used, the objects are now considered transparent and are not handled the same way as opaque objects: they don’t write to the z-buffer and they are sorted before being rendered. needDepthPrePass = true helps with the sorting process.


I see. Okay that’s fine. I am really happy with the result :slight_smile: