Hi everyone!
I am currently using a cheap custom-made outline for my game models, inspired by this Unity tutorial, and you can see it here in the NME.
It looks like this in action; it basically works by inspecting color changes instead of actual normals.
However, I’ve been planning on improving this for a while now. There are a couple of problems:
- I want to exclude some meshes from the effect
- The edges are a bit rough
As you can probably guess, problem no. 1 is the tough one. I currently have a separate UI camera to ensure that the post-processing effect does not affect any UI elements. But today I hit another problem: I wanted to exclude the characters’ eyes from the effect, but if I put them on a separate camera with a different layer mask, they are drawn above everything else. Yikes.
This led me into a rabbit hole of trying to find a solution for this:
- This thread described how to share a depth value between two cameras, but I had problems integrating the solutions into the game, and they seemed a bit hacky. Also, the playgrounds of those solutions, like EnableDepthRenderer to texture | Babylon.js Playground and Layers, depth and postprocess | Babylon.js Playground (this one has my original shader), do not work with WebGPU, and the latter breaks on window resize…
- I also tried the edge renderer, but it seems a bit wonky since it just flips the normals
- I also tried the new official edge post-process, and it looks really good, but it has the same problem as my original shader: I would need to pick which meshes it affects. There is also a bonus problem with things that move via shaders: I wobble my foliage (like trees) a bit, so there is a visible offset between the mesh and its outline. This also seems to take more processing power than my original shader.
So, here are the final questions I would very, very much like some solutions for:
- How do I most elegantly filter meshes out of a post-process, while also tackling the render indexing problem? Can the new node render graph help with this? Here is one of my biggest gripes: I had to exclude the phone dials from the main camera so they would not get the post-processing effect, but this causes the render indexing issue:

- How can I improve my current node material post-process? Any ideas would be welcome! For example, I would very much like to make the edges smoother.
The most controllable way to draw outlines is to duplicate the model, render the copy back-face only, and expand it outward along the normal in clip space. On this basis, you can achieve an outline with a constant screen-space width. The method used in Genshin Impact is similar to this.
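If it helps to see the math, here is a minimal sketch of the clip-space expansion step in plain TypeScript (illustrative names only, not engine or shader code): the offset is scaled by `w` so that it survives the perspective divide unchanged, which is what gives the constant screen-space width.

```typescript
// Sketch of the inverted-hull vertex expansion described above.
// Inputs are assumed already transformed to clip space.
type Vec4 = [number, number, number, number];

function expandAlongNormal(
  posClip: Vec4,                  // vertex position in clip space
  normalClipXY: [number, number], // clip-space normal, xy components
  widthNdc: number                // desired outline width in NDC units
): Vec4 {
  // Normalize the 2D normal so the offset direction is unit length.
  const len = Math.hypot(normalClipXY[0], normalClipXY[1]) || 1;
  const nx = normalClipXY[0] / len;
  const ny = normalClipXY[1] / len;
  // Multiplying by w cancels the later perspective divide,
  // keeping the outline width constant in screen space.
  return [
    posClip[0] + nx * widthNdc * posClip[3],
    posClip[1] + ny * widthNdc * posClip[3],
    posClip[2],
    posClip[3],
  ];
}
```

In a real implementation this would live in the outline pass’s vertex shader, applied only to the back-face duplicate.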
Ah yes I forgot to mention that I am well aware of this tactic, but it is not viable for me. It adds another workflow to Blender and more performance strain since the meshes are basically duplicated. Also since I cannot fully merge all the meshes into one because of dynamic clothing etc. it will become a real hassle that a solo dev might not be able to do.
Yes, post-processing outlines rely on changes in normals or depth to compute gradients and guess whether a fragment is an edge. If you want to implement it this way, you can refer to the approach mentioned in this article.
To make the edges smoother, you can consider using a blur to reduce the jagged feeling.
Article: https://www.vertexfragment.com/ramblings/unity-postprocessing-sobel-outline/
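For reference, the Sobel operation the article describes boils down to two 3×3 convolutions whose combined magnitude spikes at edges. A minimal sketch in plain TypeScript, operating on a `number[][]` luminance grid (illustrative, not shader code — the real thing runs per fragment on the depth/normal buffer):

```typescript
// Sobel kernels for horizontal and vertical gradients.
const KX = [[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]];
const KY = [[-1, -2, -1], [0, 0, 0], [1, 2, 1]];

// Gradient magnitude at pixel (x, y) of a grayscale image.
function sobelMagnitude(img: number[][], x: number, y: number): number {
  let gx = 0, gy = 0;
  for (let j = -1; j <= 1; j++) {
    for (let i = -1; i <= 1; i++) {
      // Clamp sampling at the image border (like CLAMP_TO_EDGE).
      const yy = Math.min(Math.max(y + j, 0), img.length - 1);
      const xx = Math.min(Math.max(x + i, 0), img[0].length - 1);
      gx += img[yy][xx] * KX[j + 1][i + 1];
      gy += img[yy][xx] * KY[j + 1][i + 1];
    }
  }
  return Math.hypot(gx, gy);
}
```

Thresholding this magnitude gives the edge mask; blurring the mask afterwards is one way to soften the jaggies.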
Thank you for the article! I ran into the Sobel operator in my investigation as well, and this explains it nicely.
Now, I think the biggest issue is the exclusion of meshes… I would really like to find an elegant solution for Babylon that works with both WebGL and WebGPU…
The article gives input on this problem as well:
Excluding from the Outline
Occasionally you may need to exclude geometry from the outline effect. In Realms, we exclude both water and grass from the outline but approach it in different ways.
For forward effects, we make use of a custom OutlineOcclusionCamera which can be configured to render certain geometry. This camera writes the depth values of the selected geometries to a _OcclusionDepthMap which is provided to the Sobel shader. The implementation of this camera is demonstrated in the GitHub repository.
For deferred effects, such as the grass shader, we set a signal flag in the form of setting the .w of the normal vector to 0.0, which we then interpret in the Sobel shader as meaning “ignore this fragment.” It is crude, but gets the job done.
I think this kind of functionality would be doable with Babylon as well, but it seems like a bit of a hassle to me!
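The deferred “signal flag” trick quoted above boils down to something like the following (a plain TypeScript sketch with illustrative names; in practice both sides would be shader code reading/writing the G-buffer):

```typescript
// Sketch of the article's exclusion flag: store a bit in the w
// component of the normal written to the normal buffer, and have
// the Sobel pass skip any fragment whose flag is 0.
type GBufferNormal = [number, number, number, number];

// Writer side: pack the exclusion flag alongside the normal.
function encodeNormal(
  n: [number, number, number],
  excluded: boolean
): GBufferNormal {
  return [n[0], n[1], n[2], excluded ? 0.0 : 1.0];
}

// Reader side (Sobel pass): only outline unflagged fragments.
function shouldOutline(sample: GBufferNormal): boolean {
  return sample[3] !== 0.0;
}
```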
You just need to render the objects that don’t need outlining to a render target and pass it into your outline pass. Use UVs computed from the screen coordinates to sample the texture and do a Sobel operation. You can then compare against a threshold value to determine whether to return the outline color or the scene texture.
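My reading of this approach, sketched as a plain TypeScript function mirroring the fragment-shader branch (names are illustrative, and the exact mask semantics are my assumption — here the mask render target marks fragments to exclude from outlining):

```typescript
// Per-fragment compositing: pick the outline color or the original
// scene color, skipping fragments covered by the exclusion mask.
type RGBA = [number, number, number, number];

function compositeOutline(
  edgeStrength: number, // Sobel magnitude at this fragment
  maskValue: number,    // ~1 where an excluded mesh was rendered
  threshold: number,
  outlineColor: RGBA,
  sceneColor: RGBA
): RGBA {
  // Excluded meshes were drawn into the mask render target;
  // wherever the mask is set, pass the scene through untouched.
  if (maskValue > 0.5) return sceneColor;
  return edgeStrength > threshold ? outlineColor : sceneColor;
}
```

On the Babylon side, the mask would come from a render target whose render list contains only the excluded meshes, bound as an extra sampler on the outline post-process.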
You could have used the stencil texture, if WebGL supported reading from it… I think the solution from @KallkaGo should work, though.