Soo… after a lot of time has passed I got far enough into the source code to make it happen for my use case!
TLDR first because this is going to be a big post:
I extended the PBRMaterial with a second and third normal map. They are applied on top of each other.
I would be glad if someone could look into my code to give me feedback on how to improve on it.
As far as I have seen, you could do something similar with the PBRCustomMaterial, but I need to change the maps during the lifetime of the material, and I couldn’t figure out how to do that with the existing methods.
One normal map:
Three normal maps combined:
I don’t have a playground because it uses TypeScript, custom classes, etc., but you can find a demo project with the full source code on my GitHub.
So here is what I have done. This is going to be very technical:
After digging a lot in the PBRCustomMaterial, PBRMaterial and PBRBaseMaterial I finally understood it well enough to replicate what PBRCustomMaterial does.
I added two textures for detail normal maps. They can be scaled independently from the main bumpMap because they use separate uniforms for their texture matrices. I also added a multiplier for each texture, so you can use the same texture as a detail texture with different strengths on multiple materials.
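To make the shape of this concrete, here is a minimal sketch of the extra state such an extension carries. All names (`DetailNormalSlot`, `detailBumpMatrix1`, etc.) are my illustration, not the actual Babylon.js API or the extension’s real identifiers:

```typescript
// Illustrative sketch: per-slot detail-map state with independent uniforms.
interface DetailNormalSlot {
  // Stand-in for a BABYLON.Texture reference; tiling lives on the texture.
  texture: { name: string; uScale: number; vScale: number } | null;
  // Per-material multiplier so one texture can be reused at different strengths.
  strength: number;
}

class DetailNormalState {
  slot1: DetailNormalSlot = { texture: null, strength: 1.0 };
  slot2: DetailNormalSlot = { texture: null, strength: 1.0 };

  // Each slot gets its own uniform names, so its texture matrix
  // (and therefore its tiling) stays independent of the main bumpMap.
  uniformNames(slotIndex: 1 | 2): string[] {
    return [`detailBumpMatrix${slotIndex}`, `vDetailBumpInfos${slotIndex}`];
  }
}
```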
Whenever one of these textures changes, the defines for the shader are rebuilt. The custom NameResolve also checks whether the shader was already built with that specific set of defines and, if so, makes the engine use the cached version.
AttachAfterBind sets the textures, matrices, and infos for the textures once they are ready. I wanted to use the same method PBRBaseMaterial uses, but there is no way to access the UniformBuffer without duplicating a huge part of the material. Since these values can be set on the effect directly, I went with that solution.
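What the hook does can be sketched like this: push the detail-map values straight onto the effect rather than through the material’s UniformBuffer. The `EffectLike` interface below is a stand-in for the real `Effect` class, and the uniform names are my own:

```typescript
// Illustrative stand-in for the subset of BABYLON.Effect used here.
interface EffectLike {
  setFloat2(name: string, x: number, y: number): void;
  setMatrix(name: string, matrix: number[]): void;
}

// Bind one detail map directly on the effect, as the AttachAfterBind
// hook would. Skips everything until the texture is actually loaded.
function bindDetailMap(
  effect: EffectLike,
  slot: number,
  textureMatrix: number[],
  coordinatesIndex: number,
  level: number, // the per-material strength multiplier
  isReady: boolean
): void {
  if (!isReady) return;
  effect.setMatrix(`detailBumpMatrix${slot}`, textureMatrix);
  effect.setFloat2(`vDetailBumpInfos${slot}`, coordinatesIndex, level);
}
```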
Also, in my other project I use a custom garbage collection that relies on hasTexture(), so I updated that method to check for the detail textures as well.
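The override itself is small; a sketch of the pattern, with illustrative class and property names rather than the actual extension’s code:

```typescript
// Illustrative base: the stock material only knows about its own textures.
class BaseMaterial {
  protected bumpTexture: object | null = null;
  hasTexture(texture: object): boolean {
    return texture === this.bumpTexture;
  }
}

// Extended material: hasTexture() also reports the detail maps, so
// external bookkeeping (e.g. texture garbage collection) can see them.
class DetailNormalMaterial extends BaseMaterial {
  detailNormalTexture1: object | null = null;
  detailNormalTexture2: object | null = null;

  hasTexture(texture: object): boolean {
    return (
      super.hasTexture(texture) ||
      texture === this.detailNormalTexture1 ||
      texture === this.detailNormalTexture2
    );
  }
}
```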
For the shader I updated the bumpFragment. I could have written an updated fragment into the shader part store or replaced the current one, but I couldn’t find a reason to at the moment, since it is used as an include in the fragment shader code anyway.
What I did here is use the same method the regular tangent-space normal uses. It is applied to the already altered normal from the regular bumpMap. This is a bit on the heavy side processing-wise, but it is exactly the blending method I need in my case.
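The blending can be illustrated outside the shader. This is the same math in plain TypeScript, as a sketch under my own simplifications (a fixed tangent frame, strength applied only to the x/y offsets); the real bumpFragment code works per pixel in GLSL:

```typescript
type Vec3 = [number, number, number];

function normalize(v: Vec3): Vec3 {
  const len = Math.hypot(v[0], v[1], v[2]);
  return [v[0] / len, v[1] / len, v[2] / len];
}

// One tangent-space perturbation step: texel is the raw normal-map
// sample in [0,1], decoded to [-1,1] and scaled by its strength.
function perturbNormal(n: Vec3, tangent: Vec3, bitangent: Vec3, texel: Vec3, strength: number): Vec3 {
  const tx = (texel[0] * 2 - 1) * strength;
  const ty = (texel[1] * 2 - 1) * strength;
  const tz = texel[2] * 2 - 1;
  // TBN transform built around the *current* (possibly already perturbed) normal.
  return normalize([
    tangent[0] * tx + bitangent[0] * ty + n[0] * tz,
    tangent[1] * tx + bitangent[1] * ty + n[1] * tz,
    tangent[2] * tx + bitangent[2] * ty + n[2] * tz,
  ]);
}

// Chaining: the main bump map perturbs first, then each detail map
// perturbs the result of the previous step.
function blendNormals(base: Vec3, tangent: Vec3, bitangent: Vec3, samples: { texel: Vec3; strength: number }[]): Vec3 {
  return samples.reduce((n, s) => perturbNormal(n, tangent, bitangent, s.texel, s.strength), base);
}
```

A flat sample of (0.5, 0.5, 1) decodes to (0, 0, 1) and leaves the normal untouched, which is why unused detail slots don’t distort the surface.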
By the way… the UV coordinates for the other maps are calculated in the vertex shader code… but since they use vMainUV1 for the first set of UVs, and that is a varying, I used it in the fragment shader directly. Works in my use case.
Current limitations:
- It only works for tangent space normal maps
- I might have missed some edge cases where the shader could crash. I am not a pro at shader design at the moment
- I think I still forgot something to make the bumpFunctions accessible when the regular bumpMap is not set
- The detail normals always use the first UV set
- Parallax occlusion is only calculated on the main bumpMap. I would have to alter the bumpFunctions to change that
This works surprisingly well for my case but as mentioned above it does not cover all possible uses. If yours is similar feel free to use it.
As it is, I can’t make a pull request or anything because too much is missing. That said, you can surely use this to improve on it and fully flesh it out. Sadly I don’t have enough time on my hands to work through all of it.
I would still be glad if at least one detail map became a default feature, but now I can use this in the meantime.
Also: This was a great practice on building my own material extensions.
PS: Did you know it is really easy to make a CustomProceduralTexture with a pixel shader that calculates a normal map from a bump map? I accidentally learned that while researching all this stuff.
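The core of that trick is just finite differences on the height map, packed back into RGB. Here is the same math in plain TypeScript (the pixel shader does this per pixel); the function name and parameters are my own illustration:

```typescript
// Compute the normal-map texel for one pixel of a height map.
// heights is a row-major grid of values in [0,1]; edges are clamped.
function heightToNormal(heights: number[][], x: number, y: number, strength: number): [number, number, number] {
  const h = heights.length;
  const w = heights[0].length;
  const at = (xi: number, yi: number) =>
    heights[Math.min(h - 1, Math.max(0, yi))][Math.min(w - 1, Math.max(0, xi))];
  // Central differences approximate the surface slope in x and y.
  const dx = (at(x + 1, y) - at(x - 1, y)) * strength;
  const dy = (at(x, y + 1) - at(x, y - 1)) * strength;
  const len = Math.hypot(dx, dy, 1);
  // Normalize (-dx, -dy, 1) and pack the unit normal from [-1,1] into [0,1].
  return [(-dx / len) * 0.5 + 0.5, (-dy / len) * 0.5 + 0.5, (1 / len) * 0.5 + 0.5];
}
```

A flat height map yields the neutral texel (0.5, 0.5, 1), i.e. the unperturbed normal.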