Baked lighting and dynamic objects

Does anyone have suggestions for how to approximate lighting for dynamic objects moving through a scene where most lighting has been baked to a lightmap?

Light probes? Lighting volumes? Anything like that?

Babylon currently does not have Light probes, only ReflectionProbes - but they do not capture light.

Your best bet would be to use a baked lightmap for static objects, and a procedural HDR reflectionTexture (I’m working on this, expect the first prototype soon) to light dynamic objects with PBRMaterial.

Otherwise, use the StandardMaterial workflow with the four kinds of Lights (but it has a limit of 9 lights per scene/material) and procedural Shadows.

Thanks ecoin, I noticed that Babylon doesn’t have light probes and it surprised me a bit so now I’m looking for alternatives.

That’s an interesting idea for generating a reflection texture that can be used for image based lighting. I think it might be a bit heavy for a scene where there are a lot of dynamic objects, because I assume you’d have to render a reflection cubemap for each dynamic object and update it as the object moves around?

No, since a procedural HDR is effectively a local scene environment texture, you can pre-generate them as clusters, similar to Unity’s prebaked Light probes.

For example, divide a 10x10 scene into a grid of 100 points; then you have 100 HDRs, one for each point. As your dynamic object moves, swap its reflectionTexture to the closest HDR. Multiple objects can thus share the same HDR.
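To make the lookup concrete, here is a minimal sketch in plain JavaScript (the grid size, spacing, helper name, and the hdrTextures array are all assumptions for illustration, not part of the thread):

```javascript
// Sketch: pick the nearest pre-baked HDR for a dynamic object's position.
// Assumes a 10x10 scene covered by a 10x10 grid of probes (spacing = 1 unit),
// stored row-major in a hypothetical `hdrTextures` array of pre-filtered HDRs.
const GRID_SIZE = 10; // probes per axis
const SPACING = 1;    // world units between probe points
const ORIGIN = 0;     // world-space coordinate of probe (0, 0)

function probeIndexFor(x, z) {
  // Snap each axis to the nearest probe point and clamp to the grid bounds.
  const col = Math.min(GRID_SIZE - 1, Math.max(0, Math.round((x - ORIGIN) / SPACING)));
  const row = Math.min(GRID_SIZE - 1, Math.max(0, Math.round((z - ORIGIN) / SPACING)));
  return row * GRID_SIZE + col;
}

// Per frame, for each dynamic mesh (Babylon-style usage, hypothetical wiring):
// mesh.material.reflectionTexture = hdrTextures[probeIndexFor(mesh.position.x, mesh.position.z)];
```

Since many objects near each other snap to the same index, they share one texture, which keeps the cost independent of the object count.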

The procedural HDR can be captured without the dynamic objects; it will not be 100% accurate, but for the purpose of lighting, it’s enough. There is no limit on the number of lights, and it takes care of global illumination automatically. You only have to deal with shadows.

ReflectionProbes have a roughly equal or higher cost, because they use real-time prefiltering and render the scene into a RenderTargetTexture each frame, once per cube face (6 textures). A procedural HDR, by contrast, creates only 1 texture, and prefilters only once (unless you include dynamic objects and update it each frame).

Though, I’m not sure how to handle transitions between HDR swaps, if it’s not regenerated each frame. Need to consult @PatrickRyan for some shading magic.
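One simple idea for the transition (just a sketch of the weighting math, not something settled in this thread): crossfade between the two nearest probes, driven by the object’s position along the segment between them. How that weight is applied engine-side (shader blend, eased swap over a few frames, etc.) is left open:

```javascript
// Sketch: a distance-based crossfade weight between two probe points.
// `aX` and `bX` are 1D positions of the nearest and second-nearest probes;
// the returned weight in [0, 1] could drive a blend between their two HDRs.
function blendWeight(objX, aX, bX) {
  // Normalized position of the object along the segment from probe A to B.
  const t = (objX - aX) / (bX - aX);
  const clamped = Math.min(1, Math.max(0, t));
  // Smoothstep so the transition eases in and out instead of popping.
  return clamped * clamped * (3 - 2 * clamped);
}
```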


Sounds promising, would love to see your results.

I was thinking of various tricks, such as using a hemispheric light per dynamic object (with the includedOnlyMeshes setting on the light) and setting the ground and diffuse color based on some sort of lookup, to approximate indirect GI, and then using realtime lights for direct lighting with some sort of lighting manager that looks for the closest lights per object.
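The "closest lights per object" part of that lighting manager could look something like this (plain JavaScript; the light and position objects are simplified stand-ins for Babylon's, and maxLights is an assumption matching a per-material light limit):

```javascript
// Sketch: pick the N closest lights for a dynamic object so only those are
// enabled for its material (e.g. via each light's includedOnlyMeshes list in
// Babylon -- the exact engine wiring is left out here).
function closestLights(objectPos, lights, maxLights) {
  return lights
    .map((light) => {
      const dx = light.x - objectPos.x;
      const dz = light.z - objectPos.z;
      return { light, distSq: dx * dx + dz * dz }; // squared distance, no sqrt needed
    })
    .sort((a, b) => a.distSq - b.distSq)
    .slice(0, maxLights)
    .map((entry) => entry.light);
}
```

Running this per object each frame (or on a timer) keeps every material under the light limit while still picking up nearby realtime lights as the object moves.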

Would be nice if the engine supported something out of the box though.

@Falagard, this article talks about real time filtering of reflection probes to take dynamic elements into account for lighting which may be of interest to you for your scene.


Yes, ReflectionProbes should work for most cases.

However, its lighting is not accurate, and it does not capture effects, such as Glow, without some hacks.

You can see the difference in these two examples, using the same environment texture:
Example PG lighting using ReflectionProbes.

Example PG lighting using HDR directly (more accurate lighting).

For apps where accurate lighting is important, like architecture and design visualizations, this matters.

@ecoin I agree there are tradeoffs for real time filtering due to the fact that we need to prefilter the mip chain on the reflection probe every frame… which is very heavy. But if you have dynamic objects that need to move in the scene and get accounted for in the IBL, we need to do something to render the light bouncing off those objects. As your example shows, the scene without real-time filtering enabled has IBL that does not account for the magenta and purple spheres, which would certainly affect the specular reflections of the main sphere. We can look to improve the accuracy of the tonemapping into the real time filtering, but to get it to be quick, we need to sacrifice on texture size a bit.

The original question mentioned dynamic objects in the scene, so I thought it was worth mentioning this path for completeness.

I agree, there are different uses for ReflectionProbes - they’re better suited for games, such as RPGs, where you have a small number of moving objects and getting high FPS is more important than accurate lighting.

My HDR approach would solve the static scene use cases, like product view configurators, where accurate lighting is more important than FPS.

P.S. In the above ReflectionProbes example I forgot to remove the purple and pink spheres from the render list, but it produces a similar orange sphere regardless (here is the PG).

This is not really true; your setup is not correct in this scene, as you created the probe in SDR. You need to create the probe with float rendering enabled and in linear space to ensure correct usage, as you would with an HDR texture.

They then actually use a pretty similar process to render: https://playground.babylonjs.com/#FEEK7G#387


Thanks Seb for the correction. I didn’t know ReflectionProbe could now capture HDR texture floats, because the PG was taken directly from the official docs, and I misunderstood it from your old post.

But is my assumption that ReflectionProbes cannot capture light still correct?

For example, if I have a box.material.lightmapTexture = texture.hdr, and the box is added to probe.renderList, will the probe capture floats from the box’s lightmap or only LDR RGB values?


I do completely agree the doc deserves an update :slight_smile: @PirateJC ?

And you are totally right that they are not light probes; they only capture what they see, like a metallic mirror, so they are handy for replacing a reflection texture. They can basically be used to compute indirect radiance and irradiance, but won’t capture analytical lights’ data.

On the box example side, the data would be captured including the lightmap HDR data. Basically, the “return to gamma” part of the rendering process is skipped, as we know it is intended to be used in a PBR material, which will do it at the end of the rendering.
