Material lightmap texture atlas support

Yo @Deltakosh / @sebavan … Do you think we can add support for lightmap texture atlases?

Right now, lightmapTexture seems to only support a single full-size texture for the lightmap texture2D call. Unity exports lightmaps as a texture atlas: it packs as many material lightmap textures as it can into a single lightmap texture for all the materials in the scene (provided the atlas is large enough; otherwise it splits them across more than one atlas).

The point is, I don't think the shaders support sampling just a portion of the lightmapTexture.

So basically all Unity-exported lightmaps are useless.

Can we please add support for lightmap texture atlases? (Or, if it's already in there, how do I use it?)

Use the uv2 set for the lightmap:

https://playground.babylonjs.com/#89D6W1

In this sample, the plane with the checkerboard is using the normal uv coordinates, whereas the other one is using the uv2 coordinates calculated so that only a part of the lightmap is used (namely, the part with a yellow/red/purple gradient).
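In code, the playground's approach amounts to writing a custom uv2 set onto the mesh (a sketch, not the playground's exact code; `mesh` is assumed to already exist and the values are illustrative — it is a fragment, not runnable standalone):

```javascript
// Illustrative uv2 set for a 4-vertex plane whose lightmap happens to sit
// in the top-right quarter of the atlas (values made up for the example).
const uv2 = [0.5, 0.5,  1.0, 0.5,  0.5, 1.0,  1.0, 1.0];
mesh.setVerticesData(BABYLON.VertexBuffer.UV2Kind, uv2);
```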

I see where you're going… you are manually altering the uv2 coordinates of the geometry, I think…

I would have to alter the uv2 coordinates on EVERY geometry exported from Unity… I don't think that would be very good… Dunno… will have to play with that idea…

I think Unity handles it in the shader… probably with fract or something.

Unity must put the lightmap coordinates somewhere too…

As far as I know, using a specific set of uv coordinates is the standard way to support lightmaps.

https://docs.unity3d.com/Manual/LightingGiUvs-GeneratingLightmappingUVs.html

It seems that’s what Unity is doing, it is putting the coordinates in uv2, so it should be easy for you: just export uv2 as well as uv when you export a mesh.

I believe I am using the Unity uv2 channel in my glTF export:

// Build the glTF accessors; uv2 (Unity's lightmap UVs) is exported as the second texcoord set.
AccessorId aPosition = null, aNormal = null, aTangent = null, aTexcoord0 = null, aTexcoord1 = null, aColor0 = null;

aPosition = ExportAccessor(meshObj.vertices, true, "vertices");
if (meshObj.normals.Length != 0) aNormal = ExportAccessor(meshObj.normals, true, "normals");
if (meshObj.tangents.Length != 0) aTangent = ExportAccessor(meshObj.tangents, true, "tangents");
if (meshObj.uv.Length != 0) aTexcoord0 = ExportAccessor(GLTFSchemaAddons.FlipTexCoordArray(meshObj.uv), "uv");

if (meshObj.uv2.Length != 0) aTexcoord1 = ExportAccessor(meshObj.uv2, "uv2");
else if (meshObj.uv.Length != 0) aTexcoord1 = ExportAccessor(meshObj.uv, "uv2"); // fall back to uv when there is no uv2

But I don't think the uv2s (mesh.uv2) are offset with the lightmap texture atlas coordinates. I was under the impression that needed to be handled in the shader… But you are saying the physical mesh.uv2 should be offset with the lightmap texture atlas coordinates, if Unity is not already doing that for us in mesh.uv2… Is that right?

Yes, I think uv2 holds the lightmap coordinates you need. So, if you already export them, then you only need to set coordinatesIndex = 1 on the Babylon.js side on the lightmap textures to get the right display.
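On the Babylon.js side that is just texture configuration; a sketch (a fragment, assuming `material` and an already-loaded `lightmapTexture`):

```javascript
lightmapTexture.coordinatesIndex = 1;   // sample the lightmap with the uv2 set
material.lightmapTexture = lightmapTexture;
material.useLightmapAsShadowmap = true; // optional: modulate the lighting rather than add to it
```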

I had to actually apply renderer.lightmapScaleOffset to mesh.uv2 after the light baking process.
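For reference, Unity's renderer.lightmapScaleOffset is a Vector4 with the scale in xy and the offset in zw, so the fix amounts to this per-vertex transform (a plain JavaScript sketch of the math, not the actual exporter code):

```javascript
// Apply a lightmap scale/offset [sx, sy, ox, oy] to a flat [u, v, u, v, ...] array.
function applyLightmapScaleOffset(uv2, so) {
  const out = new Array(uv2.length);
  for (let i = 0; i < uv2.length; i += 2) {
    out[i] = uv2[i] * so[0] + so[2];         // u' = u * sx + ox
    out[i + 1] = uv2[i + 1] * so[1] + so[3]; // v' = v * sy + oy
  }
  return out;
}

// A half-scale offset into the upper-right quadrant of the atlas maps
// the unit square [0,0]..[1,1] onto [0.5,0.5]..[1,1]:
console.log(applyLightmapScaleOffset([0, 0, 1, 1], [0.5, 0.5, 0.5, 0.5]));
```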

Also, I had some mesh instance and shared material issues… basically, if a mesh is using a lightmap via its material.lightmapTexture, it cannot be an instanced mesh.

Working pretty well now… Unity-baked lighting looks beautiful out of the box on complex, detailed scenes… great-looking baked shadows too.


Yo @Evgeni_Popov … What about higher-quality lightmap texture formats, like RGBM-encoded PNG files? Or would we have to modify the shader to decode the RGBM-packed color after the texture2D call for lightmapTexture?

I don’t think Babylon supports RGBM encoding, but it does support RGBD.

After you create a texture by loading an RGBD-encoded PNG file, you can do:

texture.isRGBD = true;
RGBDTextureTools.ExpandRGBDTexture(texture);

Then you can use your texture as usual; there's no need to decode it in the shader code, IF the GPU supports half/full float textures. If not, ExpandRGBDTexture will do nothing and you will indeed need to handle the decoding in the shader. To help with that, you have the fromRGBD (decode) and toRGBD (encode, in case you ever need that) functions in the ShadersInclude/helperFunctions.fx file.
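The gist of the packing can be sketched in plain JavaScript (a simplified model of RGBD for illustration, not Babylon's exact shader code, which also deals with gamma): the alpha channel stores a divisor that brings HDR values back above 1.0 on decode.

```javascript
// Simplified RGBD pack/unpack: alpha holds a divisor d so that rgb * d fits in [0, 1].
function toRGBD(r, g, b) {
  const maxComp = Math.max(r, g, b, 1e-6);
  const d = Math.min(1, Math.max(1 / 255, 1 / maxComp));
  return [r * d, g * d, b * d, d];
}

function fromRGBD([r, g, b, d]) {
  return [r / d, g / d, b / d];
}

// An HDR color like (2, 1, 0.5) round-trips through the LDR encoding:
console.log(fromRGBD(toRGBD(2, 1, 0.5)));
```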

Yo @Evgeni_Popov … What about a skybox CubeTexture made from six-sided PNGs encoded as RGBD? Is that supported as well? I would love to encode the six-sided HDR skybox textures as RGBD :slight_smile:

As far as I can see, you can't use RGBD-encoded files for reflection textures in the standard material.

However, you can easily support it by updating the material from https://nme.babylonjs.com/?#AT7YY5#6, which is a re-creation of the standard material in the NME (node material editor).

For PBR materials, you simply have to set reflectionTexture.isRGBD = true (reflectionTexture being the reflection texture defined either at the material level or at the scene level) and the shader will decode the value after sampling the cube texture.


Also, lightmaps should support RGBD natively on the PBR material :wink:

Do we still need some sort of RGBDTextureTools.ExpandRGBDTexture(texture) call?

How do we do that for a CubeTexture?

Or does the PBR shader need to use fromRGBD?

Just set material.yourTexture.isRGBD = true (yourTexture can be the reflection, refraction, or lightmap texture; reflection and refraction can be cube textures) and the shader will do the decoding; no need to call ExpandRGBDTexture.
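In code form (a fragment; the textures are assumed to be created and assigned already):

```javascript
// reflection/refraction can be cube textures; the lightmap is a 2D texture.
// isRGBD tells the shader to decode the RGBD packing after sampling.
material.reflectionTexture.isRGBD = true;
material.refractionTexture.isRGBD = true;
material.lightmapTexture.isRGBD = true;
```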

Calling it is a plus, because interpolation will then happen on linear data and not on RGBD data, which could otherwise show artifacts; but it will only convert if it can.


FYI… Babylon.js now supports RGBD for reflection, refraction, and lightmaps automatically when setting texture.isRGBD = true for standard materials, just like PBR materials :slight_smile:

No need to call ExpandRGBDTexture (unless you want to, as @sebavan says, if artifacts are a problem for you).
