Recreating PBRMaterial in NME

Hey everyone, this is my first post to the forums, so I hope I don't waste too much of your time!

Long story short, we've been trying to recreate the PBRMaterial in NME. We need to recreate features such as lightmap application, and do so as accurately as possible relative to the reference PBRMaterial. We're also looking to replicate how PBRMaterial displays HDR lightmaps stored as RGBD data.

You can probably tell we've gotten pretty far already (Playground; left is PBRMaterial, right is our NME material).

Here’s an imgur album of relevant screenshots with notes: Imgur: The magic of the Internet

The scene is a loaded .gltf file; the materials and textures came in as-is. The only difference is that both have the lightmap loaded and displaying, and the lightmap is tagged to use RGBD. The left camera is PBRMaterial; the right is our NME shader. The .env is a converted .hdr HDR map that we've used in the past without issue.

We ran into plenty of roadblocks. One is that empty texture nodes do wonky things when left blank: they'll usually break the PBR block, so I had to include multiple failsafes to make sure blank texture nodes aren't influencing the results. Another quirk is that bump strength needs to be exaggerated greatly on our metal material to match the reference, while the wood material does not.

We did some digging into the PBRMaterial code and how it applies lightmaps, and I believe we have a faithful NME setup. Unfortunately, no matter which combination of linear or gamma conversions we try, we always end up with coloring that's just slightly off. Even worse, the NME material does not respond accurately to image processing (like exposure).

Most of our findings are detailed in the imgur album.

My main concern right now is whether we're doing the lightmap application correctly, or whether we still need some tweaking to get it matching exactly. The AO and bump mapping situations are manageable, but the lightmap needs to be as accurate as possible; the closer the better!


Promising results! I don't have a direct answer, but I would triple-check that all of your gamma 2.2 textures are being linearized before you feed them to the shader, as the shader expects linearized source assets. If a source asset is gamma 2.2, it needs to be flagged to be converted to linear. If it's linear to begin with, you don't want to linearize it. It looks like you are not gamma correcting the lightmap, which tells me your incoming lightmap is not gamma corrected. But I'd double-check that your lightmap is in fact not gamma corrected.
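The round trip being described can be sketched numerically. This is a minimal plain-JS sketch of the pow(2.2) approximation that the "Convert to linear space" texture flag performs per channel (`toLinear`/`toGamma` are illustrative names of mine, not Babylon API):

```javascript
// Minimal sketch of the gamma-2.2 <-> linear conversion applied per channel.
// (toLinear/toGamma are illustrative names, not Babylon API.)
function toLinear(c) {
  return Math.pow(c, 2.2);      // gamma-encoded -> linear
}

function toGamma(c) {
  return Math.pow(c, 1 / 2.2);  // linear -> gamma-encoded (~pow 0.4545)
}

// A mid-grey of 0.5 in gamma space is a much darker ~0.218 in linear space,
// which is why a skipped (or doubled) conversion visibly shifts colors.
console.log(toLinear(0.5).toFixed(3));           // "0.218"
console.log(toGamma(toLinear(0.5)).toFixed(3));  // "0.500"
```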

-Anupam

We've tried pretty much every combination of convert-to-linear and convert-to-gamma, and the results were pretty inaccurate. It seems like the RGB data of the lightmap is already linear, but the alpha (HDR data) was not, so that alone got run through a Power Of block with exponent 0.4545 before being applied to the lighting, which led us to the current result. It's the closest we've been able to get while using the same textures for both examples.
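For context on what the RGBD data encodes, here is a minimal plain-JS sketch of the divide-by-alpha idea behind RGBD. The `fromRGBD` function here is an illustration of the core scheme only, not the exact Babylon shader helper (which may also fold in a color-space conversion):

```javascript
// RGBD stores an HDR color as LDR rgb plus a shared divisor in alpha:
// decoded = rgb / a, so a small alpha encodes a bright HDR value.
// (fromRGBD here is an illustrative sketch, not the exact Babylon helper.)
function fromRGBD(r, g, b, a) {
  return [r / a, g / a, b / a];
}

// alpha 0.25 turns a stored 0.5 grey into an HDR intensity of 2.0
console.log(fromRGBD(0.5, 0.5, 0.5, 0.25)); // [2, 2, 2]
```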

Added Popov, our NME guru.


Since I can’t include it in the main post due to new user limitations I have to post the link to the NME material here: Babylon.js Node Material Editor

The lighting output of the PBRMetallicRoughness block is in gamma space with image processing applied.

As you want to make additional computations over the lighting, you should instead use the other outputs, which are in linear space: simply sum all of them (you can omit some if you don't use the corresponding feature; for example, omit sheenDir and sheenInd if you don't use sheen).

Once you have summed them all, you can factor in the emissive / lightmap contribution, but you must make sure those are in linear space! If those textures are in gamma space (emissive most probably will be; for the lightmap I don't know, but I assume so), you should check the "Convert to linear space" switch for them:
(screenshot)
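The compositing described above can be sketched on plain rgb triples (the real work happens per fragment in the node graph). The `composite` helper is an illustration of mine, not Babylon API; the additive lightmap application mirrors PBRMaterial's default behavior (it switches to a multiply when the lightmap is used as a shadowmap), so treat that detail as an assumption to verify against your material's settings:

```javascript
// Sketch of manual compositing: sum the linear lighting outputs of the
// PBRMetallicRoughness block, then fold in the (linear!) lightmap.
function composite(lightingOutputs, linearLightmap) {
  const sum = [0, 0, 0];
  for (const [r, g, b] of lightingOutputs) {
    sum[0] += r; sum[1] += g; sum[2] += b;
  }
  // additive lightmap contribution (multiplicative when used as a shadowmap)
  return [
    sum[0] + linearLightmap[0],
    sum[1] + linearLightmap[1],
    sum[2] + linearLightmap[2],
  ];
}

// e.g. diffuseDir + diffuseInd + specularDir, then the lightmap on top
const out = composite(
  [[0.1, 0.1, 0.1], [0.2, 0.2, 0.2], [0.05, 0.05, 0.05]],
  [0.3, 0.3, 0.3]
);
console.log(out);
```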

Also, don’t do that:
(screenshot)

The alpha channel is not affected by the gamma/linear space conversion; it is always a linear value that should be used as-is. In any case, using the "Convert to linear space" switch is the way to go.

Also, taken from the doc:

A note about image processing and manual compositing: Note that the composited lighting output of the PBRMetallicRoughness block also adds image processing from the scene. If you desire to add additional components to the standard lighting setup, you will want to do the compositing yourself, using the separated components. The outputs of the separated components are in linear color space. This is important because if you desire to calculate scene image processing in your manual composite, you'll need the ImageProcessing block. This block assumes input values in gamma color space by default and runs an internal conversion to a linear color space output. You will need to turn this conversion off in the ImageProcessing block properties to pass linear through without a conversion.

So, you should disable the linear space conversion for this block as the input will already be in linear space:
(screenshot)
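To see numerically why that switch matters, here is a tiny plain-JS sketch (illustrative only): if the ImageProcessing block receives an already-linear value but still runs its default gamma-to-linear conversion, the value is linearized twice and comes out too dark.

```javascript
// Double-linearization sketch: feeding an already-linear value through a
// gamma->linear conversion darkens it incorrectly.
const toLinearSpace = (c) => Math.pow(c, 2.2); // the block's default input conversion

const alreadyLinear = 0.5;                     // output of the manual composite
const doubleConverted = toLinearSpace(alreadyLinear);

console.log(doubleConverted.toFixed(3));       // "0.218", wrongly darkened
console.log(alreadyLinear);                    // 0.5, what we actually want
```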

Thanks for the thorough response! When you break it down like that, it does make sense and it does click.

I followed your advice and do notice a difference… but I think I’ll let the screenshots speak for themselves: NME Updates - Album on Imgur

https://playground.babylonjs.com/#C5CGXI#17

I must be doing something wrong here… I did try leaving the lightmap in gamma space, but converting to linear seems to be closer (though only if the scene is tonemapped).

You should:

  • pass the albedo tint color as a linear value (so don’t call toGammaSpace() in the PG), that way you don’t need to flag the AlbedoTint block with “Convert to linear space”
  • set “Force irradiance in fragment” for the Reflection block. As you have some bump mapping, having the irradiance computed in the vertex shader (which is the default in node materials) makes a small difference compared to when it is done in the fragment shader (which is the default for PBRMaterial when created from a gltf/glb file).
  • normalize the output of the World normal block. It is not normalized by default, to save some perf and because most of the time there’s no scaling on mesh nodes. But in your case you do have some scaling, so you need to normalize.
  • I removed the vertex data tangents from the sphere/plane so that the rendering with the PBRMaterial is exactly the same as with the node material (the PerturbNormal block computes the tangents itself; for some reason, passing the pre-existing tangents to this block as an input does not lead to the same result as PBRMaterial).
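The normalization point above can be illustrated with a tiny plain-JS sketch (`normalize` and `dot` are helpers of mine, not Babylon API): a scaled world normal is no longer unit length, so lighting dot products computed with it are skewed until it is renormalized.

```javascript
// Why scaled meshes need the World normal output normalized: scaling the
// mesh scales the interpolated normal too, which inflates N·L terms.
function normalize([x, y, z]) {
  const len = Math.hypot(x, y, z);
  return [x / len, y / len, z / len];
}

const dot = (a, b) => a[0] * b[0] + a[1] * b[1] + a[2] * b[2];

const scaledNormal = [0, 0, 2];  // normal after a 2x mesh scale
const light = [0, 0, 1];

console.log(dot(scaledNormal, light));            // 2, twice too bright
console.log(dot(normalize(scaledNormal), light)); // 1, correct
```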

https://playground.babylonjs.com/#C5CGXI#33


Ah! Thanks so much!