Has anything changed in how Babylon.js handles normal maps?

@oglu, could you share some broken Maya assets? I guess this could help @Drigax create the fix?

Yes, I will upload them.
It could take two days, I'm on vacation.


No rush, thanks a ton. Have a great vacation!

Hey @oglu,

I took a quick look at Maya normal mapping, and this does look kinda suspicious:
Blinn:

Arnold Standard surface:

Babylon:

glTF:

It looks like our normal R and G channels are getting inverted somewhere when importing a StandardMaterial? I'm not totally sure what may be going on here, but I'd start by investigating whether we are modifying the normal texture somewhere in the import process.

Here are my source assets for reference:
Forum_has-there-anything-changed-how-babylonjs-handle-normalmaps_8047.zip (117.7 KB)

@Drigax Yes, that's the first issue.
The second one is more complicated; I will upload something later.

Yay! I have struggled for days with this, and I don't think it's just Maya. When I was generating my own normal maps it was nearly impossible to tell from the way the lighting was reacting whether it was offsetting the surface inward or outward. I thought it was the way I was generating the normal maps, but the math checked out, so I just kinda moved on.

Maybe it's the UV orientation thing I was struggling with that @oglu is mentioning? Or maybe it's completely unrelated. Just thinking there might be a solution coming out of this for me as well.

I don't really understand the second issue too well; I'm not too well versed in UV "shells". However, if we rotate our UV orientation, wouldn't that break our normal mapping?

Your second issue with the stone structures looks similar to the first issue, Maya seems to interpret the UV mapping as Green = Up, Red = Right for both shaders, while our Standard Material appears to be rendered with Green = Down, Red = Left, making the surface appear Z inverted.

@sebavan, does any of this sound familiar in the StandardMaterial shader? So far I haven't found much difference in how we parse a StandardMaterial vs PBRMetallicRoughnessMaterial vs PBRMaterial; in-engine, the assigned textures appear unmodified:

Let me add @bghgary and @PatrickRyan, who did an amazing job on the loader, so they'll have a better idea than me on this topic :slight_smile:

Here is the scene showing the issue.

We had the same issue years ago in some other engines. UV shells had to be rotated upwards to get good normal map results. Modern shaders don't have this issue anymore.

The only thing I did to make it work was to rotate the UV shells and rebake.
That's a fix on the content creation side. I hope we can get this working without this hack,
because it's not always possible to rotate UV shells.

Those normal maps do look wrong in the Maya viewport because I inverted them to look "correct" for the export.
blinn_normal_issue.zip (3.9 MB)

@oglu, I wanted to take a moment to clearly explain what is happening with normal textures, with some samples to hopefully clear up any confusion. I am aiming for a complete end-to-end explanation here for anyone to be able to follow, so bear with me for any information you already know.

First let's start with an explanation of what the normal map formats are and what they look like.

  • OpenGL expects the first pixel in the texture to be at the bottom (lower-left pixel) and can be thought of as bottom up
  • DirectX expects the first pixel in the texture to be at the top (upper-left pixel) and can be thought of as top down
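To make the "bottom up" vs "top down" distinction concrete, here is a minimal sketch in plain JavaScript (no Babylon dependency; `flipRows` is a hypothetical helper name) showing that converting between the two pixel orderings is just a vertical flip of the image rows:

```javascript
// Flip the row order of a flat, row-major RGBA buffer.
// This converts between bottom-up (OpenGL-style) and
// top-down (DirectX-style) pixel layouts.
function flipRows(pixels, width, height) {
  const rowBytes = width * 4; // 4 bytes per RGBA pixel
  const out = new Uint8ClampedArray(pixels.length);
  for (let y = 0; y < height; y++) {
    const src = y * rowBytes;
    const dst = (height - 1 - y) * rowBytes;
    out.set(pixels.subarray(src, src + rowBytes), dst);
  }
  return out;
}

// 1x2 image: first row red, second row green.
const img = new Uint8ClampedArray([255, 0, 0, 255, 0, 255, 0, 255]);
const flipped = flipRows(img, 1, 2);
console.log(flipped[0], flipped[1]); // green row is now first: 0 255
```

The same data read with the opposite convention looks upside down, which is exactly the ambiguity the two formats create.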

To see what that looks like, we can consider this normal map comparison:

To help identify what normal format you have by looking at the texture, you need to understand if details on the texture are embossed (stands proud of the surface) or debossed (does not stand proud of the surface). As an example, the left shapes in each texture above are embossed and the right shapes in each are debossed. Next look at the tones in the map and assume that the lighter tones in the normal texture (the light greens) are being cast from a light either directly above or below the texture. If you know your details are embossed and the lightest tones are at the top suggesting the light is positioned above your texture, you have an OpenGL format texture. If the lightest tones on an embossed detail are on the bottom, suggesting the light is below the texture, you have a DirectX format texture.

Truly, the only difference between the two formats at the file level is that the Y coordinate is inverted, positive being up in OpenGL and down in DirectX. If you assume the R, G, and B channels map to the coordinate system X, Y, and Z, you can see that we only need to change the G channel of the texture to convert between the formats. And the only operation you need to do is an invert of the tones in the G channel if you want to convert in an image editing package. Or you could convert directly in the shader by including a one minus pixel color for the G channel of the texture. And in our Node Material Editor, we have parameters on the perturb normal node that allow you to invert Y in the normal texture to effectively convert between the two formats.
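Since only the G channel differs between the two formats, the conversion can be sketched in a few lines of plain JavaScript (`invertGreen` is a hypothetical helper name, not a Babylon API; it is the same one-minus operation you would do in a shader or image editor):

```javascript
// Convert a normal map between OpenGL and DirectX formats by
// inverting only the G (Y) channel of flat RGBA pixel data.
function invertGreen(pixels) {
  const out = Uint8ClampedArray.from(pixels);
  for (let i = 1; i < out.length; i += 4) {
    out[i] = 255 - out[i]; // one-minus on the G channel only
  }
  return out;
}

// A tilted normal has its Y component flipped; R, B, and alpha
// are left untouched.
const converted = invertGreen(new Uint8ClampedArray([128, 200, 255, 255]));
console.log(converted); // G becomes 255 - 200 = 55
```

Note that a perfectly flat normal pixel (128, 128, 255) is essentially unchanged by the conversion, which is why flat areas of a map look identical in both formats.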

Now let’s look at the requirements used by software:

When you are creating your textures, you can specify the format for the normal map before baking, but you need to align with your final use case. With that in mind, whether you are targeting an offline rendering engine like Arnold or a real-time rendering engine like Babylon.js, you need to know which format is expected by the renderer’s shaders and how your file formats may impact that choice. So let’s look at those:

This would seem to be a simple conversion from OpenGL to DirectX formats when using a glTF, but there is another issue that complicates this matter.

  • glTF uses a right-handed coordinate system
  • Babylon.js uses a left-handed coordinate system

This is important for this reason:

This is a glTF file with two planes in it exported from Maya. They were both assigned an Arnold standard surface shader with an OpenGL format normal map. When the file is loaded, the plane on the right has its material replaced.

When you load that glTF into Babylon.js, we take care of converting the file to match the handedness of the scene for you. However, what you see here is that the plane on the right has been assigned a Node Material with a DirectX format normal texture. It may be a little confusing why the normal looks incorrect when we just said that Babylon materials expect a DirectX format normal but this has to do with how the glTF was loaded. Babylon.js materials still expect a DirectX format normal, but when we loaded the glTF, we inverted the tangent space in Y so that it conforms with the OpenGL convention of glTF. That means that any normal texture that is applied to the mesh, no matter if it is a material from the original file or one that is created in Babylon.js needs to be authored in the OpenGL format to render correctly.

Now it may be easy to assume that if I just save my files in the .babylon format, all would be fixed and I could save all my normal maps in DirectX format. You would be correct only if you do not change any textures in the materials assigned to your meshes and just use the assets from the .babylon file directly. If you assign a new texture to a material on a mesh from a .babylon file, or you assign an entirely new material such as a node material, you will get an unexpected result. The image below shows a .babylon file imported with the same OpenGL format normal from above on the left, and on the right a node material that assigns a DirectX format normal to the plane.

So what happened? The plane on the left looks incorrect, as it should, but the one on the right looks completely wrong. This stems from the origin of Babylon.js as the spiritual successor to an older engine that @Deltakosh wrote, which was a DirectX engine. Much as mentioned before: just as DirectX formatted normal textures are read from the upper-left pixel rather than the lower-left pixel as in OpenGL, UV space in DirectX is also read from the upper left instead of the lower left.

To illustrate this, I created a simple graph to show how the UVs are stored in a glTF file versus .babylon file:

When we export a .babylon file, we don’t change anything about the texture files as that could cause some extra problems when updating or editing the textures, but we know on load that the UV space is DirectX with the textures likely being in OpenGL format. So when we read in the textures we store them in memory inverted in Y. You can see this if you inspect a texture in your scene loaded from a .babylon file. You will notice an indication as to whether it is “Stored as Inverted on Y” in the list of general properties.

So how do we fix this? There are three ways we can work around this issue when adding new textures to a loaded .babylon file: one art fix and two code fixes. The art fix would be to author your normal textures in DirectX format and save your textures inverted in Y. You could do this on export from your texturing tool like Substance, or you could manually invert them in Y in an image editor. This could be very disruptive to your art pipeline, so it may not be the right solve.

The code fixes are simpler. One is to invert your texture when you load it by using the invertY parameter available in BABYLON.Texture, which is the easiest solve. However, if you are loading textures through Node Material rather than in your JavaScript, that won't work. This leads us to the other code solve, which is to add a one-minus-Y operation to the UVs fed to your texture, as you can see below.
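For anyone who wants to see the one-minus-Y trick as plain math rather than a node graph, here is a minimal sketch in plain JavaScript (`flipV` is a hypothetical helper name; in the Node Material Editor the same operation is a OneMinus on the V input of the texture's UV):

```javascript
// Flip the V coordinate of a flat [u0, v0, u1, v1, ...] UV array.
// u stays the same; v becomes 1 - v, which moves the texture
// origin between the top-left and bottom-left conventions.
function flipV(uvs) {
  const out = uvs.slice();
  for (let i = 1; i < out.length; i += 2) {
    out[i] = 1 - out[i];
  }
  return out;
}

console.log(flipV([0, 0, 1, 0.25])); // [0, 1, 1, 0.75]
```

Applying this to the UVs sampled by the normal texture has the same visual effect as loading the texture with invertY.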

This will correct your texture inversion issue, and as you can see, it fixes the rendering. Again, the planes are loaded from a .babylon file; the left one was assigned an OpenGL format normal texture in Maya, and the right uses a DirectX format normal assigned with a Node Material that has the UV space inverted in Y.

So that you can experiment on your own, here is a playground with the scene referenced here. https://playground.babylonjs.com/#YCCU8U

Once we have added PBR to Node Material, it will be easier to make any of these convention switches, but at least I hope that this will shed light on what is happening and why. It is a deep topic which is made more complex by multiple file formats with their own conventions, but there are tools available in engine to switch to whatever you need. In a way we can say that while Babylon.js was originally designed based on DirectX principles, it has since become more convention agnostic as there are plenty of tools available to make your assets work correctly so long as you know where you are coming from and where you are going. Hope this helps!


And here is the doc version:
https://doc.babylonjs.com/how_to/normal_maps


Thank you so much for your explanation! I never knew of this difference between normal formats between the graphics APIs.


I was just about to say that sometimes @PatrickRyan's posts just have to be saved into the docs :slight_smile:


Thanks, Patrick, for the insights. But that doesn't help with the UV shell rotation issue with Blinn shaders. Is that a limitation of the Blinn shader itself? Because it's working with the PBR one.

Ok, so that's why 1-y worked on my normal map generator. That makes so much sense. Thanks for the extensive explanation of OpenGL vs DirectX normals and UV alignment.

Always knew they were flipped but never really thought about it to this amount.
