Babylon.js equivalent to Three.js Lightformer?

Hello everyone, how are you? I hope you are all doing well.

Recently, I have been studying how to generate more realistic scenes with BabylonJS (mainly for product visualization). While researching, I ended up finding a very interesting Twitter thread with some tips on how to create beautiful scenes using WebGL:

One of the points I found most interesting was the use of HDRI environments. From what I read, an environment made specifically for your scene is much better than a pre-made one (like those we download from HDRI Haven, etc.).

And then I came across this Three.js feature called Lightformer, which lets you create an HDR environment in real time using specific shapes that emit light in the scene itself, as you can see here:

I found this concept extremely interesting because it doesn't require external software to create great lighting for the scene.

My question is: Is there a native way to do the same with Babylon.js? I tried to search the documentation for something similar, but I couldn't find it.

If not, what would be the alternatives?

Thank you in advance for your great help!

Also, this is my current result with Babylon.js. Any tips for a non-artistic developer? :grinning:

Hello :slight_smile:

Did you have a look at Reflection Probes?

Reflection probes are used to dynamically generate cube maps that can then be used as reflection textures, for instance.

You have to be cautious with reflection probes, as they actually generate 6 textures per frame (one per face).

It's basically a real-time cube environment generator, so I guess it's the closest native Babylon.js equivalent so far, at least as far as I know :stuck_out_tongue:
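For reference, here's a minimal sketch of wiring a probe into a PBR material. The mesh names, sizes, and the emissive panel are just placeholders for whatever "light shapes" you want to build:

```ts
import {
    Scene, ReflectionProbe, RenderTargetTexture,
    PBRMaterial, StandardMaterial, Color3,
    MeshBuilder, Vector3
} from "@babylonjs/core";

// Minimal sketch: emissive "light shapes" are rendered into a probe, and the
// resulting cube map is used as the reflection texture of a PBR material.
function setupReflectionProbe(scene: Scene): void {
    // An emissive panel standing in for a light shape (placeholder)
    const panel = MeshBuilder.CreatePlane("lightPanel", { size: 2 }, scene);
    panel.position = new Vector3(0, 3, 0);
    const panelMat = new StandardMaterial("panelMat", scene);
    panelMat.emissiveColor = Color3.White();
    panel.material = panelMat;

    // The product we want the reflections on
    const product = MeshBuilder.CreateSphere("product", { diameter: 1 }, scene);
    const productMat = new PBRMaterial("productMat", scene);
    productMat.metallic = 1.0;
    productMat.roughness = 0.1;
    product.material = productMat;

    // 256px per face; the probe renders 6 faces, so keep the size modest
    const probe = new ReflectionProbe("probe", 256, scene);
    probe.renderList?.push(panel);      // only the light shapes go into the probe
    probe.position = product.position;  // capture from the product's point of view
    productMat.reflectionTexture = probe.cubeTexture;

    // If the light shapes never move, render the probe once instead of every frame
    probe.refreshRate = RenderTargetTexture.REFRESHRATE_RENDER_ONCE;
}
```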


Hi @Tricotou, thank you for this information. I'll check Reflection Probes; they seem awesome! Is there any way to save a Reflection Probe to a .env file?

@Tricotou I just checked the following thread, and it seems Reflection Probes do not work for lighting: Reflection with png or jpg - #10 by PatrickRyan

@TiagoSilvaPereira, the issue with environment reflections is that we have to respond to the material roughness by passing a specific mip value for the reflection texture. So when we process an HDRI into an .env file, we generate a mip chain, which effectively blurs the image as we reduce the image size per mip. Then we pass the corresponding mip level as determined by the material roughness, which renders a reflection at the correct level of blur.

What I am seeing in the examples you attached above is basically area lights that aren't part of a mipped environment. Punctual lights in a scene already handle the blurring of reflection based on the material roughness properly, so there's no need to precompute the mip chain. While there hasn't traditionally been area light support in Babylon, I will ping @srzerbetto for his thoughts on the matter.

In terms of working only with an environment and not dynamically moving lights, I disagree with the premise of the original post that found HDRIs don't look good. The example shown in the post was not balanced for the scene, so it obviously looks blown out. However, with care, you can tailor even a found HDRI to your scene. There's nothing to say you can't bring a found HDRI into an image editor and adjust the exposure (EVs) or change the color balance of the image. A lot of the examples we create do use HDRIs from Polyhaven because it's an easy way to bring good light to the scene. However, you can author your own, particularly if you are using an indoor lighting setup.

In your favorite DCC tool, place a spherical camera at the world center and set up your lights as you would want to light your object. Make sure your lights render in your ray tracer and then render from your spherical camera. Creating a room around your lights will allow for some extra detail in your lights. You can see in our default sandbox env that I have a simple cube for the room and a cylinder for a table in the environment. I modeled the light heads and stands to add some realism to the environment, but just enough detail to let the brain connect to other lighting that we've seen through traditional photography.

This way I can set up whatever lights I want and render a specific lighting setup for a specific model. This is cheaper than trying to be dynamic at runtime because we will already need an IBL for PBR to work correctly, so I can author my lights however I like and not worry about perf. The only reason to really want to calculate lights at runtime is if you need them to be dynamic in some way. If the lights don't move, an IBL will be cheaper.

Then you just need your shadow-casting light as a punctual light, but it can be used to cast shadow only and not add to lighting by culling the light from meshes. This will make the scene cheaper and rely on your specifically authored environment to do the heavy lifting for lighting.
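On the Babylon side, that setup could look something like the following sketch. The studio.env path and mesh names are placeholders, and includedOnlyMeshes is just one possible way to cull the shadow light from the meshes that already get their lighting from the IBL:

```ts
import {
    Scene, CubeTexture, DirectionalLight, ShadowGenerator,
    MeshBuilder, Vector3
} from "@babylonjs/core";

// Sketch: the authored IBL does the lighting, one punctual light exists only
// for shadows. "studio.env" and the mesh names are placeholders.
function setupAuthoredLighting(scene: Scene): void {
    // The authored, prefiltered environment carries all of the lighting
    scene.environmentTexture = CubeTexture.CreateFromPrefilteredData("textures/studio.env", scene);
    scene.environmentIntensity = 1.0;

    const product = MeshBuilder.CreateSphere("product", { diameter: 1 }, scene);
    const ground = MeshBuilder.CreateGround("ground", { width: 10, height: 10 }, scene);

    // Punctual light used for shadow casting; limit its influence so it does
    // not double-light the PBR meshes already lit by the IBL
    const shadowLight = new DirectionalLight("shadowLight", new Vector3(-1, -2, -1), scene);
    shadowLight.includedOnlyMeshes = [ground]; // one way to "cull" the light from the product

    const shadowGenerator = new ShadowGenerator(1024, shadowLight);
    shadowGenerator.addShadowCaster(product);
    shadowGenerator.useBlurExponentialShadowMap = true;
    ground.receiveShadows = true;
}
```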

I hope this makes sense, but feel free to ping back with more questions.


@PatrickRyan Thank you very much for the detailed answer. I read several of your answers in the other threads and learned a lot about HDR, etc. In fact, after looking at the code examples in the link I provided, I noticed that the secret is that Area Lights are being used.

@sebavan told me that this feature is being developed, so I'll be eagerly awaiting it. Although I'm getting satisfactory results using predefined HDRs, I'd like to offer an option where my clients can configure the scene's lighting themselves.

One question: Is there any way to edit the .exr file and view it in real time in the Babylon.js scene? For example, while I modify the parameters in Photoshop and save the file, the changes would already be updated in Babylon. Right now, I always go through the process of saving, opening it in IBL Baker, adjusting the parameters, and only then viewing it in Babylon, which is a bit of a slow process.

Another question: In the Twitter thread I shared, I saw an example of using a video for lighting. I was wondering how this could be done, and the solution I came up with was to create some area lights in front of the video (not too many so as not to have performance issues), use raycasting to get the closest color in the video, and change the color of these area lights accordingly. Would this be the best solution?

Thank you very much!


Also, I'm playing with a lot of Babylon.js settings and I'm pretty satisfied with the current result, although I feel I have so much to learn:



When I say "there is a lot to improve", I'm talking about myself, not Babylon :grinning: I just noticed that the previous statement was ambiguous (I also edited my previous response).

@TiagoSilvaPereira, there is one path you can use for iteration purposes that will speed things up. If you save your HDRI in the Radiance .hdr format, you can ask Babylon to prefilter the image on load and create your environment. In this way you can iterate on your image, change as much as you like, and then save over the old file. When reloading the page, Babylon will prefilter the .hdr file for you. There is obviously a cost to this, as we are converting the equirectangular image to a cube and then creating the mip chain, but for development it's a bearable cost. Then, once you have final approval on the lighting, you can use the longer path to generate an .env file, which will be the lowest-cost path from load to render.
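As a minimal sketch, assuming the edited file is saved as textures/studio.hdr (a placeholder path), the prefilter-on-load path looks roughly like this:

```ts
import { Scene, HDRCubeTexture } from "@babylonjs/core";

// Sketch of the iteration path described above. The last constructor argument
// asks Babylon to prefilter the texture on load so it can drive PBR directly.
function loadHdrEnvironment(scene: Scene): void {
    scene.environmentTexture = new HDRCubeTexture(
        "textures/studio.hdr", // placeholder path to the edited HDRI
        scene,
        512,     // cube resolution per face
        false,   // noMipmap
        true,    // generateHarmonics
        false,   // gammaSpace
        true     // prefilterOnLoad
    );
}
```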

For a video light source, there are a couple of problems. The first is that your video is likely not to have a high dynamic range, which means your values will not accurately reflect the light in your scene.

There are a couple of things you can do to add color from a video to a material. The first would be a projection texture, which would just add color to your surface, projected from a particular angle. The second would be a custom node material which passes the video as a texture to the base color contribution in your PBRMetallicRoughness block. Going this route would allow you to apply a curve to the video to push its values out of their low-dynamic-range source. The PBR block assumes linear color space for the base color input, so make sure to toggle convert to linear space on the Texture block before applying the curve. You could use a triplanar projection approach on the video texture to project only on one axis. Here's an example of a triplanar projection node material which demonstrates the technique. You only need to do one projection instead of three in your case.
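For the first (projection texture) option, a rough sketch could look like this; clip.mp4 is a placeholder video, and the node material route would instead be wired up in the Node Material Editor:

```ts
import { Scene, SpotLight, VideoTexture, Vector3, Tools } from "@babylonjs/core";

// Sketch of the projection-texture idea: a spot light projects the video's
// colors onto whatever it hits. "clip.mp4" is a placeholder asset.
function addVideoProjection(scene: Scene): void {
    const spot = new SpotLight(
        "videoSpot",
        new Vector3(0, 5, -5),              // position
        new Vector3(0, -1, 1).normalize(),  // direction
        Tools.ToRadians(60),                // cone angle
        2,                                  // falloff exponent
        scene
    );

    // Browsers generally only autoplay muted video, so keep that in mind
    const video = new VideoTexture("videoTex", "textures/clip.mp4", scene);
    spot.projectionTexture = video;
}
```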


Awesome, @PatrickRyan. I think this may work very well for development purposes. I could check the file for changes and automatically reload the page (or maybe dispose() and recreate the environment texture).
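Something like this rough sketch is what I have in mind, with the same placeholder studio.hdr path as above and the file-watching part left out:

```ts
import { Scene, HDRCubeTexture } from "@babylonjs/core";

// Rough sketch: drop the old environment and load the edited .hdr again,
// using a cache-busting query string so the browser fetches the new file.
function reloadEnvironment(scene: Scene): void {
    scene.environmentTexture?.dispose();
    scene.environmentTexture = new HDRCubeTexture(
        `textures/studio.hdr?v=${Date.now()}`,
        scene,
        512, false, true, false, true
    );
}
```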

Thank you very much for your detailed response.
