HDRI lighting is too desaturated

Hello, I have been doing some simple shader tests, and I noticed that Babylon’s HDRI IBL rendering seems quite off compared to other renderers (like Blender and Three.js).

I created a simple untextured model for testing purposes. It has very simple PBR materials (only color and roughness are changed; no texturing, no SSS, no clearcoat).

I then added a sun light and tested it in Blender Cycles, Blender Eevee, Babylon, and Three:

All four look identical (except for some very minor shadow differences). That’s great!

But then I removed the sun light and added in HDRI lighting. Here are the results:

Now things look very different! Eevee looks too bright and washed out, and Babylon looks even worse: the colors are desaturated, and the shading is too flat. Three is the only one that looks good; it closely matches Cycles.

Here is a zip file which contains the code for Babylon and Three:

Test.zip (2.3 MB)

After unzipping it, you can simply open up a local webserver and then go to http://localhost/Babylon/ or http://localhost/Three/

And here is the HDRI file that I am using (from HDRI Haven). Of course I pre-processed it into a .env file for usage with Babylon:

wide_street_01_1k.zip (1.0 MB)
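For reference, loading a prefiltered .env file as the scene’s IBL source in Babylon looks roughly like this (a sketch assuming an existing `scene`; the filename matches the HDRI above, and the skybox size is a placeholder):

```javascript
// Load the prefiltered environment and use it for image-based lighting.
const envTexture = BABYLON.CubeTexture.CreateFromPrefilteredData(
    "wide_street_01_1k.env",
    scene
);
scene.environmentTexture = envTexture;

// Optionally show the same environment as a skybox.
scene.createDefaultSkybox(envTexture, true /* PBR skybox */, 1000);
```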

Lastly, here is the .blend file (which requires the above HDRI file):

Blend.zip (1.5 MB)

adding @sebavan

I am pretty sure it is related to the high values stored in this file. We generate our IBL mip maps the way Filament and some other renderers do, which helps us store a prefiltered format (.env).

As a result, we chose not to fully support very high dynamic range, where all the math adapted to that case might not be as efficient in the general one.

This adds an extra step for the sun: extracting it as a separate punctual light.

You can find a more detailed explanation here: Using High-Contrast, Image-Based Lighting in Babylon.js | by Babylon.js | Medium

@PatrickRyan can also provide more feedback on this one.


@Pauan, any time you get an image from HDRI Haven that is boasting 15+ EVs, you will likely have a couple of problematic pixels in the image. These are meant to be used as light sources for ray tracers, but when you are using a rasterizing engine, you need to convert the file, as with the env in our case. Our env uses an RGBD method: we store a divisor in the alpha channel that sets the value of the RGB pixel as a percentage of the way between 0 and our maximum value, which I think was around 50-60K. In our conversions, we already clamp values to combat the super high ones, but it may be worth some manual manipulation here.
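As a simplified illustration of the RGBD idea described above (a float-precision sketch only, not Babylon’s exact encoder, which also quantizes to 8 bits, applies a transfer curve, and uses its own maximum range):

```javascript
// Simplified RGBD: store a divisor in alpha so each RGB channel fits
// in [0, 1]; decoding divides the divisor back out.
function encodeRGBD(r, g, b) {
    const maxChannel = Math.max(r, g, b, 1); // never brighten LDR pixels
    const d = 1 / maxChannel;                // divisor stored in alpha
    return [r * d, g * d, b * d, d];
}

function decodeRGBD([r, g, b, d]) {
    return [r / d, g / d, b / d];
}
```

Once the alpha channel is quantized to 8 bits, the smallest storable divisor bounds the largest recoverable value, which is why extreme pixels like the sun samples discussed below are problematic.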

In the case of this image, there are two problematic pixels in the sun, one topping out at 173,000 and the other topping out at 105,000. To illustrate which pixels need to be evaluated, I dropped a levels adjustment on the image and pushed the white values toward black which will highlight the brightest pixels in the image:

You can see here what happens when precomputing the HDRI in Lys. The irradiance shows you how blown out the map is, and when looking at the specular preview at mip 5, we are getting serious blow out in pixels:

When dropping the unedited environment in the sandbox, you can see we are really blowing out the metallic reflections (top row):

Level adjusting the two pixels to just under 70,000 and dropping that hdr in the sandbox returns a better result with less harsh transitions in the rough metallic spheres (top right):

Now, when you look at the overall light in the dielectric spheres (middle and bottom rows), you aren’t seeing the overall brightness of the environment in the materials. When you have such a harsh key light, you should really be using a punctual light as the key light in the scene. Rely on your IBL to add bounced light from your environment, but I don’t think it’s a great idea to try to simulate bright, high-noon sun with just IBL. You really need a directional light so that you can match the intensity you are trying to achieve, while also getting the benefit of shadows, which we don’t generate from the IBL.
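The split described above — IBL for bounced environment light, a punctual directional light for the sun — can be sketched like this (direction, intensity, and shadow-map size are placeholder values to be matched to the HDRI’s sun):

```javascript
// IBL supplies ambient/bounced light from the environment.
scene.environmentTexture = BABYLON.CubeTexture.CreateFromPrefilteredData(
    "wide_street_01_1k.env",
    scene
);
scene.environmentIntensity = 1.0;

// A punctual directional light acts as the sun key light and casts shadows.
const sun = new BABYLON.DirectionalLight(
    "sun",
    new BABYLON.Vector3(-0.5, -1, -0.3), // placeholder: aim at the HDRI sun
    scene
);
sun.intensity = 5; // placeholder: tune to match the HDRI's sun brightness

const shadows = new BABYLON.ShadowGenerator(1024, sun);
// shadows.addShadowCaster(mesh) for each mesh that should cast shadows.
```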

The blog post we did that @sebavan referenced shows how to achieve what you are looking for when using these high-EV HDRIs in your scenes.


@PatrickRyan Thanks, that’s a great explanation. Of course I am aware that real-time engines cannot work the same as a ray-tracing engine; they will always be an approximation. I included Cycles simply as a base reference; the real comparison is with Eevee and Three.

However… I tested it again with two more HDRIs:

You are correct that a low-EV HDRI (forest_slope) looks a lot better; however, Three works fantastically with all the HDRIs, and it does so with faster performance than Babylon.

In addition, almost all HDRIs have 15+ EVs; the only HDRIs below 15 are very low-contrast ones, which severely limits the type of lighting that can be done, since you are limited to overcast. Even a medium-contrast indoor HDRI has 27 EVs.

Since other engines handle high-EV HDRIs correctly and Babylon does not, that seems like a bug in Babylon. Other engines do not require manual editing or hacky tricks to work around this issue (which, of course, destroys the realism of the HDRI).

Since one of the purposes of Babylon is for things like product advertising and configurators (which benefit greatly from realistic lighting), I’m surprised that your solution is to hack the HDRIs to make them less realistic.

So perhaps there is something that could be improved in Babylon to align it closer to what other realtime engines (like Three or Eevee) do.

Are you speaking of the generation or the runtime performance?

I mean the runtime performance. I re-rendered every frame nonstop for 10 seconds, and used the browser profiling tools to see how much time was spent.
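A simplified sketch of that kind of stress loop (the numbers below came from the browser’s profiler, not from this code; this blocking version just counts frames rendered in a fixed window):

```javascript
// Render as many frames as possible for durationMs and report the
// average CPU cost per frame. `scene` is any object with a render()
// method (e.g. a Babylon or Three wrapper).
function stressTest(scene, durationMs = 10000) {
    const start = performance.now();
    let frames = 0;
    while (performance.now() - start < durationMs) {
        scene.render();
        frames++;
    }
    return { frames, msPerFrame: (performance.now() - start) / frames };
}
```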

Babylon used up 1,726ms (out of 10 seconds), whereas Three used up 1,424ms, which makes Three ~20% faster.

It’s not a big difference, and the testing was not rigorous (so it should be taken with a large grain of salt), though it does show that Three is able to achieve accurate HDRIs without hurting their performance relative to Babylon.

Incidentally, Three also has a much lower startup time, but that’s to be expected since it’s a much smaller and more barebones library.

The 20% you are seeing comes from the CPU, so it is not related to IBL in this case; it comes from the options in Babylon being more open. You could potentially freeze your objects to get the perf back.

Now on the IBL technique, the one used in Three is not fully common: Environment Lighting

As the math has not been fully validated so far, and some of it incurs small errors in between mips, we chose to stay on the standard path for now, but this is definitely something we are closely following.

In the meantime, you could turn on real-time filtering in Babylon, which bypasses the prefiltering and computes the lighting per pixel.


Right, I retested it more rigorously (stress testing by re-rendering 50 times per frame), and these were the results:

  • HDRI Babylon: dropped 50% of frames
  • Sun Babylon: dropped 55% of frames
  • HDRI Three: dropped 90% of frames
  • Sun Three: dropped 80% of frames

So Babylon is doing quite a lot better than Three (in terms of GPU), which is very impressive (especially since I think Babylon has overall higher quality than Three).

Interestingly, it’s faster to use an HDRI in Babylon (compared to a sun light), but the opposite in Three. I assume this is because of the difference in the IBL algorithms.

That’s fair, this is a tricky area which has a lot of tradeoffs. I’m glad to hear that you are aware of the problem and that it will get fixed eventually.

The ideal situation is for all the engines to agree (more or less), so that it’s easy to move scenes between engines (e.g. from Blender to Babylon). It will take a long time to reach that future, but every little bit of progress helps.

That worked really well!

I just set material.realTimeFiltering = true; on all the materials, and with no other changes the colors are now correctly saturated (even with very high EVs):

The shading is still a little flat, but even Eevee breaks with that specific HDRI, so I’m very happy with the result.

I think the documentation should definitely mention realTimeFiltering, since I think most users will expect HDR to “just work”. It is ~7 times slower, but for many use cases that’s a reasonable trade-off for the extra realism.
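Applying the setting scene-wide looks roughly like this (a sketch; `realTimeFilteringQuality` and the quality constant are the knobs Babylon exposes alongside `realTimeFiltering`):

```javascript
// Enable per-pixel IBL filtering on every PBR material in the scene.
for (const material of scene.materials) {
    if (material instanceof BABYLON.PBRMaterial) {
        material.realTimeFiltering = true;
        // Higher quality = more samples per pixel = slower but smoother.
        material.realTimeFilteringQuality =
            BABYLON.Constants.TEXTURE_FILTERING_QUALITY_HIGH;
    }
}
```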


Glad to hear about all this!!! And yes, we’ll definitely stay on top of this issue.