Path-tracing in BabylonJS

@Evgeni_Popov and @sebavan

This is a reply mainly to @Evgeni_Popov but I included @sebavan as well because it pertains to your question about luminosity and the E channel.

Thank you for the suggestion to use the RGBA data instead of the E channel. I tried this initially with the three.js renderer too, but I ran into the issue of lost precision by the time the data was converted from RGBE to RGBA. As you know, the A channels are all 255 at that point, so I have to look at the RGB part - the problem is that many pixels elsewhere in the GPU-friendly RGBA image will also be 255,255,255,255 - especially clouds, snow, white signs, and the corona around the sun disk - so I can’t rely on sampling everything that is white with full opacity.

From inspecting the raw data, I noticed that only the exact center pixel (or maybe 2 pixels) of the Sun’s disk has a higher E channel than every other pixel in the image, even the rest of the small sun disk. If I can indeed narrow it down to 1 or 2 absolute winners in this bright-pixel competition, then I can create an exact 3D SunDirection vector that will be sampled from inside the path tracer (with, of course, some randomness added in so that the shadows aren’t artificially razor sharp, like old 80’s ray tracers, ha). The problem with using RGBA data is that I don’t know which of the many 255,255,255,255 pixels is actually the center of the Sun.

When I loop over the raw RGBE data, only 1 or 2 array elements of the large Uint8Array have an E channel higher than the rest - sometimes narrowly, by a single digit, like 143 for the absolute brightest pixel vs. 142 for the runners-up elsewhere in the image. I guess because these numbers are exponents and the scale is not linear, that one step can make a big difference. I’m not too familiar with loading and converting RGBE files, but that’s my guess.
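For anyone following along, the Radiance RGBE convention decodes roughly like this (a hedged sketch of one common variant; decodeRGBE is an illustrative helper, not a Babylon.js or three.js API). It shows why a single step in E matters so much:

```javascript
// Decodes one RGBE pixel (Uint8 values) to linear float RGB, per the
// Radiance .hdr convention. The exponent E is shared by all three
// channels, so a difference of 1 in E roughly doubles the brightness
// even when the 8-bit RGB values look identical.
function decodeRGBE(r, g, b, e) {
  if (e === 0) return [0, 0, 0]; // special case: pure black
  const scale = Math.pow(2, e - 136); // 136 = 128 bias + 8 mantissa bits
  return [r * scale, g * scale, b * scale];
}

// Two pixels that look equally "white" as 8-bit RGB, but whose shared
// exponents differ by one:
const sunCenter = decodeRGBE(255, 255, 255, 143);
const runnerUp  = decodeRGBE(255, 255, 255, 142);
console.log(sunCenter[0] / runnerUp[0]); // the center is 2x brighter
```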

Apologies if I didn’t understand your luminosity RGB suggestions correctly and it is in fact possible to get the very brightest winning pixel with just the RGB of the final image. I don’t even know if I was using the RGBA data incorrectly somehow - maybe the winning pixel has 255,255,255 (or similar high numbers) and everything else has 254,254,254 or so, which would appear just as white to the human eye but mean much more in terms of exponential brightness if it were converted back. Again, I’m not very experienced with this file format and its conversion calculations, so sorry if I’m not understanding your suggestions as you intended.

I’m hoping we can discover a solution that has minimal impact on the source code. Thanks again for your time and expert input. :slight_smile:

The data are in float format, so looking for the max value should work:
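(The playground code itself isn’t reproduced here; a minimal sketch of the idea, assuming the pixel data has been read back into a Float32Array of RGBA values - the names are illustrative, not the actual PG code:)

```javascript
// Sketch only: scan a Float32Array of RGBA pixels for the brightest
// texel by luminance, then overwrite it with pure red so it shows up
// as a dot when the texture is displayed.
function findAndMarkBrightest(data /* Float32Array, RGBA order */) {
  let maxLum = -Infinity, maxIndex = 0;
  for (let i = 0; i < data.length; i += 4) {
    // Rec. 709 luminance weights; float values can exceed 1.0
    const lum = 0.2126 * data[i] + 0.7152 * data[i + 1] + 0.0722 * data[i + 2];
    if (lum > maxLum) { maxLum = lum; maxIndex = i; }
  }
  data[maxIndex] = 1; data[maxIndex + 1] = 0; data[maxIndex + 2] = 0;
  return maxIndex / 4; // pixel index of the winner
}
```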

This will put a big red dot at the location where the max value has been found.



Oh wow, this is great! Thanks so much for clarifying. I guess I totally had the wrong ‘big picture’ of how these RGBE files are converted and stored inside the Babylon.js library. I have definitely learned something new today, ha!

Ok I will see if I can change my sun direction finding algo in my .js setup code and follow your example. If we can in fact use the RGB float data in the end, will there even be any changes necessary to the Babylon source? If so, that would be the best of both worlds.

I will start working on this and hopefully soon we can have beautiful hdr / glTF renderings. I’ll let you guys know if I need more help, but hopefully everything can fall into place now.

Thanks again for the clarification and the very helpful playground example! :smiley:


No need to change anything if you can use what I did in the PG. The .hdr is loaded as a cube in the PG, though. If you load it as a 2D texture in your 3js code, you will need to load it in a different way (I think a new BABYLON.Texture("xxx.hdr", scene) should work).


Awesome, I will try to use as much as I can from your PG example.
If you don’t mind, I have a couple of further questions about actually loading and displaying the hdr images inside my pathtracing shader:

Just to clarify for everyone reading this thread, here’s how I did it in the past with three.js: I would have a URL path to the hdr file on the server or disk, then call three.js’ hdrLoader.load(url, etc.) and loop over the pixel data to find the brightest pixel. (Thanks again to @Evgeni_Popov for showing me a better way to do this looping/finding step with his helpful PG example!) I would also get the width and height of the original hdr image in addition to this pixel data.

After my sunDirection-finding algo was complete, I sent the vector over as a uniform (i.e. vec3 uSunDirection) to the path tracing shader so the diffuse materials in the scene can directly sample the Sun very accurately. But the end users also need to be able to actually see the hdr image in the background, so I just sent the hdr over as a normal GPU texture. I had incorrectly assumed that it was in 8-bit format (0 to 255) by the time it was handed over to the GPU to be used as the background image. But as I just learned today, this hdr texture is indeed handed over to the GPU in float format, which can have values way outside the 0-255 range when it is sampled as a texture2D in the shader. Do I have this basic idea right?
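The brightest-pixel-to-sun-direction step above can be sketched roughly like this (pure JS; the equirectangular mapping, the y-up convention, and the function name are all my assumptions for illustration, not the demo’s actual code):

```javascript
// Hypothetical mapping from the winning pixel's (x, y) position in an
// equirectangular HDR to a unit-length world-space sun direction.
// Conventions (y-up, longitude measured from +x) are assumptions and
// would need to match the shader's own background lookup.
function pixelToDirection(x, y, width, height) {
  const u = (x + 0.5) / width;           // 0..1 across the image
  const v = (y + 0.5) / height;          // 0..1 down the image
  const phi = u * 2 * Math.PI - Math.PI; // longitude: -PI..PI
  const theta = v * Math.PI;             // latitude: 0 (top) .. PI (bottom)
  return [
    Math.sin(theta) * Math.cos(phi),
    Math.cos(theta),
    Math.sin(theta) * Math.sin(phi),
  ];
}
```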

So I’m thinking if I happen to sample the texture in just the right spot, it will give an rgb value of something like rgb(240000.0, 199000.0, 56000.0), which is totally fine because I down-weight these diffuse-surface Sun ray samplings in the end anyway (per Monte Carlo importance sampling) and also apply tone mapping to the final overall path-traced image, which brings every single pixel into the 0-255 range for correct monitor output.
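As a concrete illustration of how tone mapping pulls such huge values back into displayable range, here is a generic Reinhard-style curve (an assumption for illustration, not necessarily the demo’s actual operator):

```javascript
// A simple Reinhard-style tone map: compresses any HDR value >= 0 into
// the 0..1 range, asymptotically approaching 1 for huge inputs.
function reinhard(c) {
  return c / (1 + c);
}

// Scale to an 8-bit value for monitor output.
function toDisplayByte(c) {
  return Math.round(reinhard(c) * 255);
}

console.log(toDisplayByte(240000.0)); // a raw sun sample still lands at 255
```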

So essentially I was just sampling the large hdr texture whenever a ray hit the sky (a ray t value of INFINITY in the shader), which ‘paints’ the hdr image as the infinitely far away interior of an imaginary sphere around the entire scene. Each background pixel is calculated by a 3D ray direction-to-2D texture UV lookup function in my shaders. This means that I don’t really need an actual cube or sphere or any base geometry for the hdr image. Is there a way I can get the RGBA data (as you have shown), the hdr width, and the hdr height, without having to go through the HDRCubeTexture function?
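For illustration, a JS version of such a direction-to-UV lookup might look like this (the real one lives in GLSL; the equirectangular conventions and the function name here are assumptions, not the demo’s actual shader code):

```javascript
// Maps a 3D ray direction (assumed y-up, roughly unit length) to a 2D
// UV coordinate on an equirectangular environment texture.
function directionToUV(dx, dy, dz) {
  const phi = Math.atan2(dz, dx);                          // -PI..PI around the sphere
  const theta = Math.acos(Math.min(1, Math.max(-1, dy)));  // 0..PI down from +y
  return [(phi + Math.PI) / (2 * Math.PI), theta / Math.PI];
}
```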

This is where it gets tricky, because from what I can see in the Babylon.js source, the necessary readPixels() function flows from, and is baked inside, the HDRCubeTexture() function. Which is totally understandable, because when rasterizing traditionally you have to apply the image to some kind of background geometry, like a giant cube or sphere that is infinitely far away, in order to have it surround the scene in all directions. But in our new ray tracing use case, I just need the hdr as a normal 2D texture to sample from, using only the 3D ray direction lookup in the shader.

I think the final stage will work no problem: as you mentioned, I could simply load the hdr as a new BABYLON.Texture(), which would correctly (in my case) hand the image over to the GPU as a normal 2D texture without creating a cube.

But I’m hoping we can somehow bypass the hdr cube texture function and still get the RGBA pixel data, the width, and the height of the image. Those 3 things I absolutely need in order for everything to work correctly.

Sorry for being long-winded about all this, but I wanted to be as precise as possible about the needs of the pathtracing shader in regards to hdr files. Again, thanks for your awesome help!


Using Texture to load the .hdr as a 2D texture does work:


Fantastic! This is exactly what I was hoping for! I will work on getting this in the Babylon Path Tracing Renderer as soon as I can.

I’ll let you know if I have any other questions, but hopefully I can do everything correctly now. Thanks again for your time and expertise! :smiley:


Hello @Evgeni_Popov ,

If it’s ok with you, I must once again lean on you for support with a loading issue I encountered.

I have already built all the infrastructure in the .js and .glsl code to (theoretically) load, decode, find the brightest pixel area, pass to the GPU, and sample from an HDR inside our GPU path tracer. All the plumbing is there, ha.

The problem is that when viewing the final path tracing scene, the background is either black or blown out with oversaturated values - I can’t make out any kind of picture from the hdr. It must be junk lying around in GPU memory or something.

Here’s a quick link to the scene:
Non-Working HDR Environment demo (debugging, W.I.P.)

I went back to the .js source where BABYLON.Texture() is used and tried to add some console.log() debug output. It’s not even firing that part of the code. When I then type hdrTexture in the browser console (hdrTexture is the variable that holds the hdr texture object to be handed over to the GPU and sampled as a normal texture2D sampler), the console reports that hdrTexture is undefined. The graphical output kind of makes sense in this situation, because nothing is getting passed to the GPU, yet I am erroneously trying to sample it in the path tracing shader.

The question becomes, why is the async () => callback not firing? Could you please take a look at the .js file where all the hdr loading action happens?

Loading section of the HDR demo

Maybe I’m missing something, or doing something that happened to work with three.js but doesn’t fit Babylon’s pattern of dealing with float textures. As you can see from my code, I used your very handy bright-pixel finding routine (credit to you coming in the comments in the source code soon!) with a few minor changes, just because of how I want to look at the data on screen. But my minor changes shouldn’t have broken anything - I can’t even get the callback to fire or print any console output at all, lol. It’s as if the hdr texture never existed.

Thanks once more for taking a look at this. We’re so close!

Your callback should not be async here :slight_smile: but that still does not explain why it does not fire. I’ll let @Evgeni_Popov check this part.


Ah, ok thank you. I must admit I don’t really have a firm grasp on async, await, etc. - I have been so deep with path tracing for the last 6 years that all these new javascript features have passed me by, ha. :slight_smile:


In fact you do need the async keyword, as we are using an await construct to get the pixel data.

Are you sure you don’t have a 404 in the browser console? It’s the only explanation I can come up with for why you don’t see the first console.log (or pathTracingScene is not ok)…


There are no 404 errors that I can see. Just to test, I misspelled the hdr path name on purpose, then I got the 404 right away.

This is an odd situation, because it works fine in your playground example - it’s nearly the same code. Could it be that the PG has a createScene() function that returns the scene, whereas I do not? Or could it be that by the time the big hdr is loaded from the server, the window to get it into the GPU has somehow passed? I’m afraid I’m not familiar with the network inner workings.

p.s. as you can see from my screenshot, it is 93 degrees outside in September here in Texas, lol! :grinning_face_with_smiling_eyes:

It seems there was a problem with .hdr loading and/or the loading notification in 4.2 - if you test my PG in 4.2 you will see it does not work.

I guess you should switch to 5.0.


Yay! At some point I was going to ask you guys if I needed to upgrade to the latest version of Babylon, and how often. I haven’t been keeping an eye on the main development page as much as I should. It’s good to find out that this was the problem. Going to upgrade now!..

Whoo hoo! That was it! As you can see, it is loading as expected.
The model not loading was due to the same issue -
I not only had to update Babylon.js to 5.0, I had to update babylon.glTFFileLoader.min.js as well. When all that was done:

I still have some firefly kinks to work out in the shader. I remember from when I did this in the three.js renderer that this particular demo (using an hdr) was the most finicky and sensitive in terms of suppressing hot pixels, or fireflies - simply because the range of luminosity inside the hdr texture is so great. For now, I have turned off refractive caustics so it’s not so visually distracting, but I’m confident that soon I will be able to work out the firefly issues so we can have focused refractive hotspots from glass glTF models.
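For readers following along, one common firefly-suppression technique (an assumption about the general approach, not this demo’s actual shader code) is to clamp each sample’s luminance to a ceiling before accumulating it, trading a little energy loss for far fewer hot pixels:

```javascript
// Clamps an RGB sample's luminance to maxLum, preserving hue.
// A hypothetical helper for illustration; real path tracers often do
// this in GLSL per-sample, before averaging into the accumulation buffer.
function clampFirefly(r, g, b, maxLum) {
  const lum = 0.2126 * r + 0.7152 * g + 0.0722 * b; // Rec. 709 luminance
  if (lum <= maxLum) return [r, g, b];              // dim samples pass through
  const s = maxLum / lum;                           // scale factor < 1
  return [r * s, g * s, b * s];
}
```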

Here’s the working demo minus refractive caustics:

Working HDR demo

Again, thanks so much for helping me solve the loading issue! :smiley:

p.s. it says I can’t post more than 3 times in a row on this thread, which is why I keep editing this last post, lol.

To all: I am still fiddling with the sun brightness, color bleeding, caustics, firefly suppression, etc., so the end HDR result will change for the better over the coming days - it just requires a lot of finesse and fine-tuning on my end. HDR path tracing is the hardest case for my custom shaders (a lot of conflicts and tradeoffs to juggle). Thanks for your patience while I work on this, but I’m so excited to have this displaying correctly now! :slight_smile:

Hey everyone,
On the new HDR Environment demo, I just now added an HDR image picker to the gui as well as an HDR exposure controller and Sun power slider!
Could a Babylon forum admin please grant me access to be able to post more than 3 times in a row? I have to keep adding to this post, ha ha. :grinning_face_with_smiling_eyes:

Or, since I didn’t start this thread, could maybe @PichouPichou or @sebavan or @Evgeni_Popov please post a short reply sentence below, so that it will break my 3-post-in-a-row sequence? Thanks!


This is Amazing !!! :wink:


Thank you! And thanks for breaking the 3-post streak - Ha! I was in limbo for a couple of days :grinning_face_with_smiling_eyes:


