I have a static scene and want to use Babylon’s ShadowGenerator to create a lightmapTexture for each mesh in the scene. I do not want to use external tools like Blender, because of the difficulty of getting consistent results when exporting the scene to GLB and the complex DevOps setup it requires, since this process must be automated.
The short answer is you cannot, as light mapping would require techniques close to raytracing, and potentially a custom UV layout as well. @CraigFeldspar did an experiment a while back that I hope he will revive.
In the context of Babylon, shadows can be rendered only once if everything is static, by forcing the shadowMap.refreshRate to render a single time. This avoids the expensive per-frame shadow pass and only incurs a small startup cost. Looking into the resulting lights in PlayCanvas, it is probably close to what they do in the thread you shared as well, since we can see shadow aliasing and other artifacts of real-time shadow algorithms.
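Something like this is all it takes (a sketch, assuming an existing ShadowGenerator built from a directional light):

```ts
// Playground-style sketch (BABYLON is the global from babylon.js).
// Assumes an existing ShadowGenerator and the DirectionalLight it was created from.
function freezeStaticShadows(
  shadowGenerator: BABYLON.ShadowGenerator,
  light: BABYLON.DirectionalLight
): void {
  const shadowMap = shadowGenerator.getShadowMap();
  if (shadowMap) {
    // Render the shadow map once at startup, then reuse it every frame.
    shadowMap.refreshRate = BABYLON.RenderTargetTexture.REFRESHRATE_RENDER_ONCE;
  }
  // Since nothing moves, stop recomputing the light's shadow frustum too.
  light.autoUpdateExtends = false;
}
```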
Yes, I’ve seen this in the docs and intend to use it, because my scene is static. The problem I have is that in a typical 3D apartment model, there are way more than 9 light sources.
This looks very interesting; I will need to look into how it can be automated.
Something to consider in the static apartment case is that most of the light sources do not light the full flat, or not by much, so you could use our included-only meshes or excluded meshes lists to control which meshes receive which light; in most cases 4 lights would then be enough. This usually improves performance greatly.
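For example (a sketch; the lamp and mesh names are made up):

```ts
// Sketch: scope a light to only the meshes it actually reaches.
function scopeLightToRoom(
  scene: BABYLON.Scene,
  roomMeshes: BABYLON.AbstractMesh[]
): BABYLON.PointLight {
  const lamp = new BABYLON.PointLight(
    "kitchenLamp",
    new BABYLON.Vector3(2, 2.5, 1),
    scene
  );
  // Only the listed meshes are lit by this light; everything else ignores it.
  lamp.includedOnlyMeshes.push(...roomMeshes);
  // The inverse also works: keep the light global and exclude far-away meshes:
  // lamp.excludedMeshes.push(someDistantMesh);
  return lamp;
}
```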
Unfortunately that’s not the case anymore: these days people use track lights, ceiling LEDs, wall lights, etc. (think art gallery), which can easily add up to 12 light sources in a single room. I can only fake a single light source for things like chandeliers, and even that is not accurate if the chandelier is multicolored.
And the point of having lights is so people can visualize what it will look like in the real world with different light combinations. That is the reason the app exists in the first place: there is currently nothing on the web that lets you do that without spending weeks learning CG.
This is the problem I had when renovating my apartment, and something I want to solve, not just for the sake of realism; it has practical use. For example, you can see whether the light covers the entire kitchen counter, so you can position it correctly and tell the builders to install the fixtures in the correct locations.
Hi. You need a lightmap baker, like in Unity, Unreal, PlayCanvas, and other game engines. Maybe you can do it with something like https://github.com/pmndrs/react-three-lightmap or another ready-made solution on the backend. Either way, you need to prepare your assets’ UVs for baking, and you cannot treat many light sources as physically correct lights; they are only approximations. For truly accurate light calculation you need a dedicated program that computes lighting offline, not in real time. You can bake approximate lighting and display it in a game engine, but you can’t use game engines for real-life decisions like “position it correctly and tell builders to install them in the correct locations”.
I don’t use this light baker. In my pipeline I bake in something like Blender/3ds Max, or in software like Unity or Unreal, because right now those are the better solutions for realistic lighting. Baking shadows on the frontend is too expensive and makes no sense on the web.
I know I’m aiming to achieve something that has not been done on the web, all in the browser. Baking with Blender is only a last resort, because it’s not easy to set up in an automated workflow.
The entire scene is user generated, so you do not have the luxury of custom fine-tuned shaders and node materials to adjust by hand.
If I (a CG newbie) can pull this off, it will be a great example of why people should choose Babylon over competitors like Three.js or PlayCanvas, especially with the coming support for WebGPU.
Hi Patrick, I have reread your post several times and looked into the PG, but I still cannot figure out how your hack works (the NodeMaterial screenshot is too small to make anything out).
Please help me clarify some points:
Does this hack work on arbitrary mesh shapes (like user-imported glTF meshes)?
You mentioned the PG only used the x axis. Does this mean that if I use the y and z axes as well, I can get full cube maps that can be applied to any mesh of any size? (The PG does not work if you change the ground plane to a box.)
How does this hack work in layman’s terms? (So I can automate it in code for any scene.)
In my current setup, all visible meshes are Mesh instances with UVs set dynamically to maintain correct material dimensions (one Material with Textures is shared between many Mesh instances of different sizes; the meshes adapt their UVs to the materials on the fly).
So your approach seems like a perfect solution to this dynamic lightmap challenge, because I have a Material-centric architecture.
@ecoin, you can open the node material editor from any node material in the inspector by clicking on the pencil icon next to the material name:
This will show you the full graph and allow you to edit the shader with a live connection to the scene.
Beyond that, here are the clarifications:
If you pass the node material to any mesh, the shader will work, but you will also need to grab the PBR textures from the loaded glTF material and assign them to the node material so that you get the original textures with the added hack light.
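As a rough sketch of that texture hand-off (the “albedoTex” block name is a placeholder; match it to whatever your graph actually uses):

```ts
// Sketch: copy the albedo texture from a loaded glTF PBR material into a
// Texture block of the node material, then swap the materials.
function applyHackMaterial(
  mesh: BABYLON.AbstractMesh,
  nodeMaterial: BABYLON.NodeMaterial
): void {
  const gltfMat = mesh.material as BABYLON.PBRMaterial;
  const albedoBlock = nodeMaterial.getBlockByName("albedoTex") as BABYLON.TextureBlock;
  if (gltfMat?.albedoTexture && albedoBlock) {
    albedoBlock.texture = gltfMat.albedoTexture as BABYLON.Texture;
  }
  mesh.material = nodeMaterial;
}
```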
I would not think of this shader as a cube map at all. This is basically a PBR shader with a world position passed to it to add a color to pixels on the mesh based on their relative distance from the passed world position while also accounting for some naive dot calculations. You would need to add a number of calculations per light point you need to pass to the shader (i.e. one position for every “light source” that could affect your mesh). If you need to make it procedurally scalable, you can create a node material via code which would give you some control over automating how many lights can affect any given mesh.
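To make the math concrete, each light’s contribution boils down to a distance falloff scaled by a naive Lambert dot product. Here is a CPU-side illustration of what the graph computes per pixel (the names and the linear falloff are illustrative, not the exact shader):

```ts
// Illustration only: the real work happens per pixel in the node material.
function fakeLightContribution(
  pixelWorldPos: BABYLON.Vector3,
  pixelNormal: BABYLON.Vector3,   // normalized surface normal
  lightPos: BABYLON.Vector3,
  lightColor: BABYLON.Color3,
  radius: number                  // distance at which the light fades to zero
): BABYLON.Color3 {
  const toLight = lightPos.subtract(pixelWorldPos);
  const dist = toLight.length();
  // Linear falloff to zero at `radius`.
  const attenuation = Math.max(0, 1 - dist / radius);
  // Naive Lambert term so surfaces facing away from the light stay dark.
  const lambert = Math.max(0, BABYLON.Vector3.Dot(pixelNormal, toLight.normalize()));
  return lightColor.scale(attenuation * lambert);
}
// The shader sums one such contribution per light affecting the mesh.
```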
In layman’s terms, we are adding color per pixel of the rendered mesh based on the distance from the pixel to a position in the scene. If your light sources are separate meshes, you can use the mesh position as the source. If you have one mesh with several light sources, place null transforms (Empties in Blender) in the mesh so you can query them by name to get their positions within a more complex glTF. Again, your node material needs to handle one set of operations per light affecting the mesh, so this could get out of control, but the operations are relatively cheap so you should be able to handle them within reason.
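Automating that lookup could be as simple as this (a sketch; the “lightSource_N” transform naming and the “lightPosN” input block names are assumptions, not part of the PG):

```ts
// Sketch: gather positions from null transforms named "lightSource_0",
// "lightSource_1", ... and feed them into matching Vector3 input blocks.
function wireLightPositions(
  scene: BABYLON.Scene,
  nodeMaterial: BABYLON.NodeMaterial
): void {
  for (let i = 0; ; i++) {
    const node = scene.getTransformNodeByName(`lightSource_${i}`);
    if (!node) break;
    const input = nodeMaterial.getBlockByName(`lightPos${i}`) as BABYLON.InputBlock;
    if (input) {
      input.value = node.getAbsolutePosition();
    }
  }
}
```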
This node material ignores UVs for the purposes of adding color to the rendered pixels, so any changes you make to the scale of the UVs should not affect the distance calculations.
I tweaked the playground and shader to use cubes instead of planes. Again, this is naive fake lighting, so the light sources will all act like point lights. If you need something like a spot or area light, you will need to update the shader to create those shapes manually.
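If it helps as a starting point, a fake spot light is usually just the point-light math gated by a cone test against the fixture’s axis (same illustrative style as above; the names and cutoff are made up):

```ts
// Sketch: 1 inside the cone, 0 outside; smooth the edge for soft borders.
function spotFactor(
  pixelWorldPos: BABYLON.Vector3,
  lightPos: BABYLON.Vector3,
  spotDirection: BABYLON.Vector3, // normalized axis the fixture points along
  cosCutoff: number               // e.g. Math.cos(Math.PI / 6) for a 30° half-angle
): number {
  const toPixel = pixelWorldPos.subtract(lightPos).normalize();
  const alignment = BABYLON.Vector3.Dot(spotDirection, toPixel);
  return alignment > cosCutoff ? 1 : 0;
}
```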