I’ve been working on a Light Mapper for Web3D / BabylonJS. It’s still a work in progress, but the basics are there, and I’ll be adding to it over the next few months.
This project allows you to create light maps for BabylonJS meshes within the browser. The system is built in C++ and compiled to WebAssembly using the Emscripten toolchain.

It works by first UV unwrapping each mesh and packing the charts into a single atlas using the xatlas library. It then builds a bounding volume hierarchy (BVH) for the whole scene using a C++ BVH library. The BVH is packed linearly in depth-first order in a memory-efficient way, which makes it suitable for linear, stackless traversal on the GPU.

To generate the actual light map, a GPU-based path tracer was written using WebGL2. It bounces light rays around the scene, using the BVH to speed up intersection tests, reflecting off surfaces until a light is hit (this is simplified somewhat, but basically true). All the scene data (the BVH, materials, meshes, etc.) is packed into GL textures, which are read by the fragment shader that does the actual path tracing. The UV maps / atlases (along with position and normal information) are copied to a vertex buffer object for rendering: the first ray for each pixel starts from the interpolated vertex position and propagates along the interpolated normal. Finally, there are a few simple filters, including a Gaussian blur, implemented in WebGL fragment shaders, for cleaning up the resulting light maps.
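To illustrate the idea behind the stackless traversal: if nodes are stored in depth-first order, a ray that hits a node's box simply steps to the next array element, and a ray that misses jumps over the whole subtree via a precomputed "miss" offset. This is a minimal CPU sketch under assumed node layout (the `FlatNode` struct and field names are made up for illustration, not the project's actual format):

```cpp
#include <algorithm>
#include <limits>
#include <vector>

// Hypothetical flattened BVH node, stored in depth-first order. On a hit,
// the next node to visit is simply index + 1; on a miss, we jump to
// `missOffset` (the start of the next sibling subtree). -1 terminates.
struct FlatNode {
    float minB[3], maxB[3]; // axis-aligned bounding box
    int   missOffset;       // node to jump to when the ray misses this box
    int   primIndex;        // >= 0 for leaves, -1 for interior nodes
};

// Standard slab test for ray / AABB intersection.
static bool intersectAABB(const float o[3], const float invD[3],
                          const FlatNode& n) {
    float tMin = 0.0f, tMax = std::numeric_limits<float>::max();
    for (int a = 0; a < 3; ++a) {
        float t0 = (n.minB[a] - o[a]) * invD[a];
        float t1 = (n.maxB[a] - o[a]) * invD[a];
        if (t0 > t1) std::swap(t0, t1);
        tMin = std::max(tMin, t0);
        tMax = std::min(tMax, t1);
    }
    return tMin <= tMax;
}

// Stackless traversal: walk the flat array, either stepping to the next
// node (hit) or skipping the subtree (miss). No stack is needed, which is
// what makes this layout friendly to a fragment shader reading nodes
// sequentially out of a texture.
std::vector<int> traverse(const std::vector<FlatNode>& nodes,
                          const float origin[3], const float dir[3]) {
    float invD[3] = {1.0f / dir[0], 1.0f / dir[1], 1.0f / dir[2]};
    std::vector<int> hitPrims;
    int i = 0;
    while (i != -1 && i < (int)nodes.size()) {
        const FlatNode& n = nodes[i];
        if (intersectAABB(origin, invD, n)) {
            if (n.primIndex >= 0) hitPrims.push_back(n.primIndex);
            i = (n.primIndex >= 0) ? n.missOffset : i + 1;
        } else {
            i = n.missOffset;
        }
    }
    return hitPrims;
}
```

The same loop translates almost line-for-line into GLSL, with the node array fetched from a texture instead of a `std::vector`.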
Let me know what you think / what you want it to do!
Sorry, yes, it’s a bit of a mess right now; this will improve very soon.
Clone the repo, then install the dependencies:

npm install

Pull down and set up Emscripten:

npm run installemsdk
npm run useemsdk

Build the C++ part of the project (with optimizations):

npm run buildprod

Run the example:

npm run test
This will load the Cornell Box scene and run the light mapper at a 2048×2048 map size with 1000 samples (which might take a while depending on your system).
I’ve figured it out. The blinds have morph targets, and the default values keep them closed.
The patterned texture we can see is not the blinds texture but the window texture. The blinds are in front of the windows, and some light passes through them, hence the patterned light map.
It would be cool to have an option to use the lights from the glb.
Edit: I’ve just realised that it’s only the demo that loads a glb. The baking library isn’t meant to be used only with glbs. Maybe it could be implemented as a helper class.
Yeah, it would be cool in general to be able to take the lights from a Babylon scene and auto-convert them to lights in the baker renderer, so you didn’t really have to think about it.
Also a few more things are coming really soon:
Firstly, emissive materials: you’ll be able to set up area lights by applying an emissive factor to materials in the model.
Second, multiple light maps: the scene will be divided into several light maps by area, which can be loaded dynamically at runtime, allowing much larger areas to be light mapped.
Third, I’m experimenting with building a CNN-based autoencoder for AI noise removal on the output map. This will be cool because, from the papers I’m reading, you can get an insane level of quality from just a few samples (that 10 seconds could become 1 second).
A while ago I was working on a 3D product configurator with someone, and they really wanted to do light bakes in the browser or an Electron app. I ended up just doing all the light maps in Blender because I didn’t want to build something like this myself.
If you geared it towards high-quality product lighting, it could be a big deal.
Anyway, very inspirational. I’m excited to see where this goes.