NodeMaterials, Env Maps, Tonemapping and Exposure

So it’s taken me a little while to figure out a good repro in the PG for the transparency issue I mentioned in my first post, mainly because it took me a while to notice that my PG was actually wrong in the same way as the screenshot I posted above.

To recap:

When loading GLTFs using the transmission extension, they screw up the colours of whatever’s behind them.

And here’s a good playground to test this on:

This actually has nothing to do with NME materials, so it seems like it’s a general issue?

When you load up the PG you’ll see this:

There are two chrome spheres, one glass sphere and a glass block in front of them.

The glass sphere is set up using the standard process of creating a PBRMaterial, making it transparent, and setting its index of refraction and roughness. This seems to be a cheaper, PBR-lite way of doing glass that Babylon uses, where it is rendered with alpha blending instead of the multi-pass approach for ‘proper’ PBR glass, so it can have some noticeable artefacts. Because its reflections are alpha-blended rather than additive, they behave incorrectly when the background behind them is brighter than the reflection itself:

Which is fine, I don’t think there’s a better way of rendering glass as just an alpha blended material.

However, when we import a GLTF with the transmission extension like the glass box, Babylon sets up that material in a slightly cleverer way, which I don’t fully know how to replicate in code. It seems like it actually gets rendered as an opaque object that resolves and samples the opaqueSceneTexture containing the objects that were rendered to the scene before it. This allows it to draw physically correct glass that can both attenuate and add to the colour behind it - as well as being able to sample opaqueSceneTexture at lower mips to fake frosted glass. Very cool.

But it seems like there’s a gamma issue when opaqueSceneTexture gets sampled by the glass box material. When you load the scene by default, it looks like the box is just kind of dark, but it should actually be fully transmissive just like the glass sphere: its tint colour is full white, so we should really be seeing only its reflection.

In fact the behaviour is obviously wrong when you adjust exposure:

The glass box gets disproportionately darker when exposure is decreased, but when exposure is increased it actually becomes luminous and appears to add light to the scene.

Actually, the transmission extension creates a TransmissionHelper that renders all opaque objects in the scene to a texture (so the glass box is excluded) and then sets this texture as the refraction texture of all transmissive materials (so, of the glass box).

You can’t replicate it in code because this class is hidden from the outside. The current implementation of the transmission extension is a kind of “work in progress” - I’m not sure it will stay as it is, and I have heard that we may use another method (like screen space refraction) at some point…

The current problem is that this opaque texture is generated using the current settings of the scene, notably the image processing settings: it is rendered with tone mapping + exposure applied and converted to gamma space. This texture is then converted back to linear space when it is used in the refraction code, and in the end we apply the image processing to the final colour a second time.
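The mismatch can be sketched numerically. This is a toy model with exposure only (ignoring tone mapping) and a plain 2.2 gamma, just to show why the double application goes wrong:

```javascript
// Simple 2.2 gamma encode/decode.
const gamma = (c) => Math.pow(c, 1 / 2.2);
const degamma = (c) => Math.pow(c, 2.2);

const exposure = 0.5;    // scene exposure applied by image processing
const sceneColor = 0.8;  // linear colour of an opaque object behind the glass

// What happens today: the opaque texture already has exposure + gamma baked in...
const opaqueTexel = gamma(sceneColor * exposure);

// ...the refraction code converts it back to linear (undoing the gamma but NOT
// the exposure), and image processing then applies exposure a second time.
const current = gamma(degamma(opaqueTexel) * exposure);

// What it should be: the linear colour processed exactly once.
const correct = gamma(sceneColor * exposure);

// With exposure < 1 it is applied twice, so the glass comes out too dark;
// with exposure > 1 the same error makes the glass appear luminous.
console.log(current, correct);
```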

To get the right output, I think we should:

  1. generate the opaque texture without applying the image processing and in linear mode
  2. when rendering the mesh with the refraction (opaque) texture, we should use the texture “as is” (as it is already in linear space) and apply the image processing to the final color.

Point 2 is easy, as it is essentially what we are currently doing: we simply need to set gammaSpace = false on the opaque texture to avoid the conversion to linear space in the refraction code.

Point 1 requires a bit of thinking…


Yes, so it sounds like everything is forward-rendered at the moment with tonemapping built into the fragment shader of each object, so your render target is always LDR and in gamma space? I wasn’t sure if image processing was applying just the exposure controls, or the tonemapper too.

I think that’s what confused me initially with the NME materials as well somewhat - I would expect the forward pass to draw into a linear, HDR floating point target, and then get tonemapped later down the line, after any other postprocess effects have been carried out. (Not too surprised, I suppose, since Three.js does this as well.)

Not sure what the main advantages of this approach are in the current pipeline - again, I don’t know much about it yet - besides being able to swap tonemapping operations per object. And unless there’s a depth prepass, presumably you also risk computing the tonemapping multiple times per pixel where overlapping geometry gets rendered back to front.

Having the option to run a tonemapping pass on the image at the end of the rendering pipeline would let you do more things in linear space - alpha blending, blurring, bloom, SSAO, SSR if you choose to add it, etc. - which is where these effects should be done. In terms of risk, I guess it would add an extra resolve of the scene that you don’t have to do currently, and if you were rendering objects without a background (i.e. applying tonemapping to fewer than all pixels on the screen) you’d be doing slightly more work; but for heavier scenes with overlapping geometry, as mentioned above, it might be a saving.

It’s already possible, the image processing can be applied as a post process instead of being applied for each draw.

So, there’s a somewhat hacky way to make your PG work:

It sets an image processing post process and sets the gamma space property of the opaque texture to false (see line 34).

We will have to devise a more robust way to do it (that will work even if not using a separate post process for image processing):


Oh that’s cool, that nearly works!

Only issue I can see at the moment is that some colours get much more washed out when you apply the new postprocess.



I think maybe the rendertarget is getting clipped and values higher than 1 aren’t being tonemapped correctly?

That’s because, by default, the image processing post process is using an RGBA8 texture type. You should use something like half float or float instead:


Amazing, I was hoping there was an easy way to do that!

One last question I think - what would be the correct way to combine this with a DefaultRenderingPipeline? At the minute if I add the camera to a rendering pipeline, it screws up the colour space again:

The default rendering pipeline is already using an image processing post process internally, so you should not create a new one but use the one from the rendering pipeline:

Note also that scene.imageProcessingConfiguration.applyByPostProcess = true is done by the image processing post process, so there is no need to do it explicitly.


PR is on its way:


Ahh, I’m starting to understand how this works now :slight_smile:

That’s awesome. There’s only one thing left in that case - I actually didn’t notice this in your earlier example until I replicated it in my own scene. It’s a bit subtle in this example, but hopefully you can see it with the NME Brightness driven up to 2.

No glass:

Through glass:

It seems like there’s still a clamp happening when the glass reads/writes scene colour. Is there a setting that can be tweaked to avoid this?

It is fixed in the PR:

Without the PR (but with the current code in the PG):

With the PR:

The difference is because in the PR the default texture type for the opaque texture is now HALF_FLOAT, whereas it is RGBA8 currently. You can fix it like this without the PR:


Fabulous, that does the trick - excellent glass :star_struck:

Thank you for digging into this @Evgeni_Popov - May the 4th be with you! :confetti_ball: