Detect if normals are "correct" or need to be flipped?

Hi guys,

Sometimes I import models into Babylon that have wrong normals (for example, they render “black” when the light is turned on). Is it possible to detect when they are wrong and need to be flipped?

P.S.: I’m using a right-handed system.

Best regards,

MrGroove

Hi,
I suppose you know about the Inspector’s debug section? The ‘Display normals’ and ‘Render vertex normals’ debugging options on the selected mesh should help with that, I guess.
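In case it helps, the Inspector can also be opened from code (a minimal sketch, assuming the `@babylonjs/inspector` package and an existing `scene`):

```ts
import { Scene } from "@babylonjs/core";
import "@babylonjs/inspector"; // registers the debug layer UI

// Opens the Inspector; once you select a mesh, the "Display normals" and
// "Render vertex normals" toggles are under its DEBUG section.
async function openInspector(scene: Scene): Promise<void> {
    await scene.debugLayer.show({ embedMode: true });
}
```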


I’m looking for an algorithmic way. Possibly by detecting the side orientation?

I’m not an expert on computational geometry, but as far as I know there are no “perfect” solutions for the general case, only approximations and heuristics. If you can make some assumptions about the geometry, then there might be a way; I’ve implemented one approach for terrain that could only consist of walls, simplistic slopes, and flat tiles, but it likely won’t work without these restrictions. One other basic idea would be defining the “center” of a model and then having the normals point “outwards” (see the sketch below), which works in many simple cases but isn’t a general solution for complex meshes either.
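Here is a minimal sketch of that centroid heuristic in Babylon (the function name and the majority vote are my own; it assumes a roughly convex, closed mesh and will be fooled by strongly concave ones):

```ts
import { Mesh, Vector3, VertexBuffer } from "@babylonjs/core";

// Heuristic: if most geometric face normals point toward the vertex
// centroid instead of away from it, assume the mesh is inside out and flip.
function flipIfInsideOut(mesh: Mesh): boolean {
    const positions = mesh.getVerticesData(VertexBuffer.PositionKind);
    const indices = mesh.getIndices();
    if (!positions || !indices) {
        return false;
    }

    // Vertex centroid as a cheap stand-in for the model "center".
    const centroid = Vector3.Zero();
    const vertexCount = positions.length / 3;
    for (let v = 0; v < vertexCount; v++) {
        centroid.addInPlaceFromFloats(positions[v * 3], positions[v * 3 + 1], positions[v * 3 + 2]);
    }
    centroid.scaleInPlace(1 / vertexCount);

    let inward = 0;
    let outward = 0;
    for (let f = 0; f < indices.length; f += 3) {
        const a = Vector3.FromArray(positions, indices[f] * 3);
        const b = Vector3.FromArray(positions, indices[f + 1] * 3);
        const c = Vector3.FromArray(positions, indices[f + 2] * 3);

        // Geometric face normal implied by the winding order.
        // Note: this assumes CCW front faces; adjust for your scene's handedness.
        const normal = Vector3.Cross(b.subtract(a), c.subtract(a));
        const faceCenter = a.add(b).add(c).scaleInPlace(1 / 3);

        if (Vector3.Dot(normal, faceCenter.subtract(centroid)) >= 0) {
            outward++;
        } else {
            inward++;
        }
    }

    if (inward > outward) {
        mesh.flipFaces(true); // reverse winding and negate stored normals
        return true;
    }
    return false;
}
```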

If you need something more advanced, I’d google for “point cloud normal vector estimation” algorithms.

One paper I found this way describes some approaches worth looking into, if it matches your problem: ResearchGate overview

Hi @MrGroove just checking in, was your question answered? :smiley:

@MrGroove :

I have been dealing with this issue both with importing into BJS and exporting from BJS. I have had success with fixing up the normals in Fusion 360 before importing. There is an option to repair imported meshes by “wrapping” them.

I have also tried doing the same with algorithms and have found it to be some pretty heavy lifting! I tried shooting rays at my models to determine what the outside was (roughly like the sketch below), and it worked until it didn’t. :melting_face:
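Roughly, that ray idea looks like this (a sketch only; `sampleOutsideFacing`, the sample count, and the interpretation of the result are made up here, and it assumes a closed mesh without holes):

```ts
import { Mesh, Ray, Vector3 } from "@babylonjs/core";

// Shoot rays from random points outside the bounding sphere toward the
// mesh; the first face hit should have a stored normal facing back toward
// the ray origin. Returns the fraction of hits that face outward.
function sampleOutsideFacing(mesh: Mesh, samples = 64): number {
    const sphere = mesh.getBoundingInfo().boundingSphere;
    const radius = sphere.radiusWorld * 2;
    let facingOut = 0;
    let hits = 0;

    for (let i = 0; i < samples; i++) {
        // Random direction, pushed outside the mesh, aimed back at it.
        const dir = new Vector3(Math.random() - 0.5, Math.random() - 0.5, Math.random() - 0.5).normalize();
        const origin = sphere.centerWorld.add(dir.scale(radius));
        const ray = new Ray(origin, dir.scale(-1), radius * 2);

        const pick = ray.intersectsMesh(mesh);
        if (pick.hit) {
            hits++;
            const n = pick.getNormal(true); // world-space normal at the hit
            if (n && Vector3.Dot(n, dir) > 0) {
                facingOut++; // stored normal points back toward the outside
            }
        }
    }
    return hits > 0 ? facingOut / hits : 1;
}
```

If the returned fraction is well below 0.5, the normals are probably flipped; as noted above, though, this works until it doesn’t (thin walls and open meshes break the assumption).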

What format are your models?

Hi.

I also have the same problem. I need a way to detect whether the imported asset is based on a left- or right-handed system.

Thanks.

What format are your assets in? If you aren’t using a format with a defined handedness (like glTF, which is always right-handed), this is a difficult task :slight_smile:

How do I interpret the colors? Or rather: what color or coloring indicates that something is wrong?

Also, if vertex normals are rendered as expected, will this be sufficient to tell that normals are correct?

Hi,
The subject is kind of complex, and hard to handle with just the tools of the Inspector. In essence, the colors of the normals display in the Inspector can help you detect whether there’s something wrong with your normals/normal interpolation, but they won’t tell you the source of the issue. Very basically, the color coding of the normals/normal-orientation display in the BJS Inspector is: cold colors (blue, green) are back-facing normals; warm colors (yellow, orange) are front-facing. The spectrum in between the two shows the interpolation.
Now, the thing is that normals and normal interpolation depend on a number of factors. The first one, I believe, is front and back faces. Connect two polys, one facing front and one facing back, and you will create ‘a breach’ in the UVs. This ‘breach’ can be handled with a threshold/interpolation. In general, your 3D app will allow you to create ‘smoothing’ between normals, and this smoothing/interpolation usually works with a limit on the angle between two connected polys. There are a number of methods to create this type of smoothing/interpolation, and the projection mode used for projecting the texture on a group of polygons will be accounted for in the method used. While typing this sentence, I realize this doesn’t really help a lot in practice ;)
I’m not a very good teacher, but I do play a lot with UVs and interpolation to try to minimize the number of textures and materials I use in a scene. Fact is that playing with UVs and normals, eventually affecting only parts of a model, can create very different looks using the same texture or even the same material. Now, honestly, I don’t find this easy to achieve from within BJS. On the other hand, honestly, I don’t think it’s the purpose of a 3D engine. My recommendation would be to work it from your 3D app every time you can. If that’s not possible, because of transforms and export in BJS, you would probably need to go through a full regeneration of normals and then (in the case of complex objects) accept that there will be a compromise/difference between the original, manually set normals/UVs/interpolation and the regenerated version from BJS.
I’m not sure this helps you a lot, but I’m afraid I do not have a one-size-fits-all solution for working this perfectly in BJS. Maybe someone else has…? cc @PatrickRyan (for the better teacher/guru info :grin:… if he has some time)

Thanks @mawa for the great explanation :slight_smile: And I have to say, holy shit, I was totally not aware of the complexity normals have. Your second paragraph actually makes me worried because, well, it is a bit like that Futurama episode where they eat these super tasty snack thingies that turn out to be the babies of the Omicronians…

Is my understanding of the color coding right: direction means world direction? And then, say we have a mesh with zero rotation. Can we say that all faces of the mesh that point towards the Mesh.forward vector should be colored orange-ish? (And so on)

And the colors themselves (from above), are they like vertex colors or so? Are they made from the 3 vertices of a triangle?

Honestly (dare I say it), I’m not sure :zipper_mouth_face:
I would assume it is local, which would make more sense to me. But I’d rather have someone from the team answer. As I said, I only use these tools from the Inspector to help detect an issue. I can see wrong interpolation through the change in spectrum, and I can see a vertex normal possibly being flipped, causing the problem in some cases. Like I know CSG and sometimes OBJ imports can do some weird things in this respect (flipping the last vertex normal). Though when this happens, I usually go back to my 3D app and try to fix it from there.
For the explanations about the color display in the Inspector, you should wait for the expert advice. I’m pretty sure someone will kick in later today. Meanwhile, have a good one :sunglasses:


@Joe_Kerr, while looking at the color representation of normals can give you a general understanding of a surface, it does put a large mental task on you to decipher what you are seeing. Let me back up a bit and describe the display so that we can set a baseline.

Mesh normals are an array of vectors, one per vertex, stored as one vertex parameter (others are position and UV) to help describe a mesh. Each vertex is given a vector that indicates which way the triangulated face will be pointing (the front face). This vector is typically described in the range of -1 to 1 and can generally be understood by looking at the vector. A value of (0, 1, 0) would be pointing directly up, while (0, -1, 0) would be pointing down, and a value of (1, 1, 0) would be a 45-degree vector coplanar with the X-Y plane. When we display the normals on the mesh, we remap the range of -1 to 1 to a range of 0 to 1 so they can be displayed as a color, since any value below 0 would be clipped when viewed as a color. This means that blue (0, 0, 1), as displayed in our normal debug display, would actually be the vector (-1, -1, 1), since the values of 0 in X and Y would have been remapped from -1 to 0. As a result, a lot of the color display will not give you very good information about what the actual vectors are unless you are able to discern between colors that are very close. However, large changes in color, like @mawa mentioned, can give you an indication of faces that are pointing in very different directions.
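In code, that remapping is just `color = normal * 0.5 + 0.5` per component; a tiny sketch of both directions (the actual debug shader may differ in details):

```ts
import { Vector3 } from "@babylonjs/core";

// Debug-display remap: each normal component goes from [-1, 1] to [0, 1].
const normalToColor = (n: Vector3) => n.scale(0.5).addInPlaceFromFloats(0.5, 0.5, 0.5);

// Inverse: read a vector back out of a displayed color.
const colorToNormal = (c: Vector3) => c.scale(2).addInPlaceFromFloats(-1, -1, -1);

// The example above: pure blue decodes to (-1, -1, 1).
console.log(colorToNormal(new Vector3(0, 0, 1)).toString());
```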

Other complications include that the debug display is in local space, not world space. You can tell because if you turn on the debug display and then rotate the mesh, the colors do not change on the model. This means a conversion is needed to determine the world-space vector of a face. The other thing to remember here is that the handedness of the scene determines the winding order of the mesh’s triangle list, which in turn determines which is the front face. If you reverse the winding order of a mesh, the entire mesh is effectively reversed, so front faces become back faces and vice versa.
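For reference, converting a local-space normal to world space can look like this (a sketch; with non-uniform scaling you would strictly need the inverse-transpose of the world matrix):

```ts
import { Mesh, Vector3 } from "@babylonjs/core";

// TransformNormal applies rotation and scale but ignores translation,
// which is what a direction vector needs.
function normalToWorld(mesh: Mesh, localNormal: Vector3): Vector3 {
    return Vector3.TransformNormal(localNormal, mesh.getWorldMatrix()).normalize();
}
```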

This is a lot of words to describe the concept, so it will be helpful if I show a few examples. Fortunately, I have on hand a mesh that I tinkered with for another forum thread about booleans, and it shows some large changes in the normals right next to each other. This is the Stanford bunny with some booleans cut out of the side:

Turning on the debug normal display shows some large changes in color which indicate a drastic change in the normal:

You can also determine when faces are likely pointing in the same direction in different parts of the mesh like this where the inside of the boolean has similar colors to the back of the bunny, so these faces are likely pointing in similar directions:

Like I say, this is normalized color, so the vectors have been remapped to 0–1, and they are displayed in local space, so understanding where a normal points in world space from this display alone is very difficult, as it gives us no information about the actual rotation of the mesh in world space. However, using the Render vertex normals debug display can help you understand the normal directions without the mental gymnastics of translating a color to a vector:

The other thing that you will see with this display is whether your vertices are split or averaged. You can also see this in your mesh: if the edges of your triangles are well defined by the light, your vertices are split and each one has its own normal perpendicular to the triangle face. If your mesh is smooth and you can’t see any faceting in the lighting, then your vertices have been merged and have one normal averaged from all surrounding triangle faces. As you can see below, each of the intersections of the triangles has 6 normals being displayed. The one centered in the image is the easiest to see:

This is because six triangles all meet at a single point, and each triangle has its own vertex there, with its own normal perpendicular to the triangle it belongs to, all overlapping in that location. If the mesh had “smoothed” or “averaged” normals, there would be only one vertex at that location, with a normal direction averaged from each connected triangle.
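If you do want averaged normals at runtime, Babylon can recompute them from the geometry (a sketch; note that `ComputeNormals` only averages across vertices that are actually shared, so split vertices stay split unless you merge them first, e.g. with `mesh.forceSharedVertices()`):

```ts
import { Mesh, VertexBuffer, VertexData } from "@babylonjs/core";

// Recompute smoothed normals from positions and indices.
function recomputeNormals(mesh: Mesh): void {
    const positions = mesh.getVerticesData(VertexBuffer.PositionKind);
    const indices = mesh.getIndices();
    if (!positions || !indices) {
        return;
    }
    const normals: number[] = [];
    VertexData.ComputeNormals(positions, indices, normals);
    mesh.setVerticesData(VertexBuffer.NormalKind, normals, true);
}
```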

So if you are trying to determine the direction of face normals, I would always suggest using the Render vertex normals debug view. However, if you are just trying to get a general sense of the normals of a mesh, to determine whether they look reasonable or whether areas that should be pointing in the same or opposite directions in fact are, then the Display normals debug view is very good for that. This is because, as you can see from the examples above, on a complex mesh with a lot of vertices, rendering vertex normals can actually obscure portions of the mesh, which can slow down your inspection.

I hope this explanation helps you understand what you are seeing and when to use each debug display. However, if I missed anything that you have questions about, please feel free to ping back.


Wow, super helpful @PatrickRyan! Thanks for taking the time to explain, and in such detail :slight_smile:


@PatrickRyan all your posts should be collected into a book!!!


Or a game… :grin:
