Dynamic Material based on UV coordinates

Hello Community, first time I post here.

I feel like I already know the answer, which is writing my own shader, but I wanted to confirm my suspicion.

For my chosen workflow I use Blender to set up a level, which I then import via the Babylon mesh importer.
In Blender, I’m using a texturing method called a color palette, so I can give most meshes the same material (a 32x32 texture) to save on performance.

But because that isn’t enough for me and I’m obsessed with optimizations, I wrote a shader in Blender that changes the settings [roughness, metallic, transmission, emission strength] based on the UV coordinates on the palette (with some math).

I can’t yet upload images, so you’ll have to trust me that it works. The math is:
roughness = (y % 0.5) * 2
metallic = (y > 0.5) * 2 * (x - 0.5)
transmission = (1 - (2 * x)^4) * (y < 0.5)
emission = ((3 * x)^3 - 0.5) * (y < 0.5)
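In plain JavaScript, that math would look roughly like this (the comparisons act as 0-or-1 factors, just like Blender’s “greater than” math nodes):

```js
// Sketch of the palette math above; x and y are UV coordinates in [0, 1].
// Boolean comparisons become 0/1 multipliers, mirroring the Blender node math.
function paletteParams(x, y) {
    return {
        roughness: (y % 0.5) * 2,
        metallic: (y > 0.5 ? 1 : 0) * 2 * (x - 0.5),
        transmission: (1 - Math.pow(2 * x, 4)) * (y < 0.5 ? 1 : 0),
        emission: (Math.pow(3 * x, 3) - 0.5) * (y < 0.5 ? 1 : 0),
    };
}
```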

It works in Blender, but not so much in my Babylon project.
Everything is very emissive, so bright it hurts my eyes.

I guess the shader isn’t automatically part of the material, and it just chooses some values based on whatever was last used when exported?
One material can only ever have one value per RMTE channel, right?
So I would have to recreate the shader in code?
In that case there’s a lot of research ahead of me; I have no idea how to achieve these effects yet.

Thanks in advance.

I don’t know of any exchange formats that will include custom shaders on export. Babylon has custom shader capabilities and a great shader editor tool, the Node Material Editor (NME), to create the shader within:

You can export the shader code, or even the JS code required to set it up via nodes.
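For example, once the material is saved as a snippet, loading it in a scene is a one-liner (the snippet ID here is a placeholder):

```js
// Load a node material saved from the NME by its snippet ID (ID is a placeholder).
const nodeMat = await BABYLON.NodeMaterial.ParseFromSnippetAsync("#XXXXXX", scene);
mesh.material = nodeMat;
```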


As @shaderbytes said, you can’t export custom shaders from Blender to Babylon.js. During the export, your shader has probably been replaced by a texture. You will have to write a custom shader or a material plugin if you want to support the same thing in Babylon.
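For reference, a minimal ShaderMaterial sketch of the idea; the shader names, the palette texture, and the emission formula used here are illustrative, not the exact Blender graph:

```js
// A rough sketch only: a ShaderMaterial that derives an emission factor
// from the UVs, echoing one of the formulas above. Names are illustrative.
BABYLON.Effect.ShadersStore["paletteVertexShader"] = `
    precision highp float;
    attribute vec3 position;
    attribute vec2 uv;
    uniform mat4 worldViewProjection;
    varying vec2 vUV;
    void main() {
        vUV = uv;
        gl_Position = worldViewProjection * vec4(position, 1.0);
    }`;

BABYLON.Effect.ShadersStore["paletteFragmentShader"] = `
    precision highp float;
    varying vec2 vUV;
    uniform sampler2D paletteTexture;
    void main() {
        vec3 base = texture2D(paletteTexture, vUV).rgb;
        // emission = ((3x)^3 - 0.5) * (y < 0.5), clamped to [0, 1]
        float emission = clamp(pow(3.0 * vUV.x, 3.0) - 0.5, 0.0, 1.0)
                       * (vUV.y < 0.5 ? 1.0 : 0.0);
        gl_FragColor = vec4(base + base * emission, 1.0);
    }`;

const mat = new BABYLON.ShaderMaterial("palette", scene, {
    vertex: "palette",
    fragment: "palette",
}, {
    attributes: ["position", "uv"],
    uniforms: ["worldViewProjection"],
    samplers: ["paletteTexture"],
});
mat.setTexture("paletteTexture", new BABYLON.Texture("palette.png", scene));
```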

Ah thank you very much, that’s a neat tool.

I can mostly work with it, although there are some weird things happening.
When plugging a float into the alpha channel of the fragment output, be it 0 or 1 or any other number, the texture gets weird: faces on the back get drawn before the ones on the front.
I wanted to use this for glass materials that are see-through.

Any idea?

I’m also not quite sure how to include glow layers in this. I saw in the wiki how to include meshes, but for me, the glowy part is on the same mesh as the non-glowy part. Is this not possible?

EDIT: post content removed because the info was incorrect; the posts after this correctly cover what I mentioned here… sorry

:wink:

In fact, the problem is that connecting something to the alpha input means the system considers the mesh to be transparent, so the mesh won’t write to the z-buffer, and you may get artifacts because of that (see https://doc.babylonjs.com/features/featuresDeepDive/materials/advanced/transparent_rendering).

If the material will be used for glass, then you should be OK, because the mesh should be a simple plane or a thin object.
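If you do need alpha on a closed mesh, one workaround from that doc page is a depth pre-pass (a sketch):

```js
// Render the mesh's depth first, so hidden back faces are rejected
// before the transparent color pass.
mesh.material.needDepthPrePass = true;

// Alternative: keep depth writes enabled for the transparent material.
// mesh.material.forceDepthWrite = true;
```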

Regarding the glow layer, you can’t enable it in the NME; you will have to create a playground for that. I think it will be easier to help you on this part if you can provide a repro in the Playground.

I think I’m on to something. I kept following the tutorials on transmissive materials, and they mentioned something about having to add a second render pass.
If that is the case, I’m going to drop this idea and just use the metallic/roughness on this palette, while putting the emissive and transmissive ones on their own materials.

It makes no sense to add a second render pass for normal materials just because some of them are glass/glowing. I’m sure I’d end up with worse performance if I did that.

@shaderbytes Yeah, the UVs in Blender are a bit reversed: top left is (x=0, y=1) and bottom right is (x=1, y=0).
That, and the Playground also seems to display it wrong; maybe the normals are the wrong way around, but it was displaying correctly in production.

Thanks for the help!

No, actually that is correct; the standard for UVs is that bottom left is (0, 0). Babylon.js follows this standard in its code outside of the NME tool, as would almost all other engines. Somehow in the NME tool, though, the x polarity is flipped, which requires a small tweak to see the expected UV polarities. But said tweak should not be exported; as mentioned, outside the tool the UV polarities work as expected.

EDIT: sorry, it’s flip Y, not X… links to explanations can be seen below

I’m not sure I understand this one?

One discrepancy between the NME and the Playground is that the textures you load directly in texture blocks are not Y-inverted, whereas in the Playground they are, because you generally don’t pass a value for the invertY parameter of the constructor, and this parameter is true by default.

But I don’t remember having a problem with the x coordinate?
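For example, to get the same orientation in the Playground as in the NME texture block:

```js
// Pass invertY = false explicitly; it defaults to true in the Playground.
const tex = new BABYLON.Texture(
    "textures/palette.png", // URL (illustrative)
    scene,
    false, // noMipmap
    false  // invertY
);
```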

Oh sorry, yes, Patrick did explain this; I went and checked the thread I discussed it in:

TL;DR: I incorrectly thought the x flip worked, but it was not correct; it required a y flip and rotating the camera 180°. (It does matter which corner the letters a, b, c, d etc. are in, in each block of Patrick’s example.)

Ok, so I had the wrong axis, sorry @Shirhix. But anyway, an issue with unexpected behaviour is still at hand regardless of me getting the two axes muddled: there is inconsistent behaviour between the NME and the Babylon engine outside of it. Patrick did say talks were happening about how to iron this out. Quote:

@Shirhix, as @Evgeni_Popov mentioned there is no direct path to the glow layer in node material, but there are still ways to mix the glow layer post process and node material. Basically, the glow layer needs to know which bits of your node material output need to glow, so there is a simple trick for doing that.

Basically, you need to be able to output the final color of your shader, and then just the pixels that glow, with a simple flag in the shader. First you need to tell the glow layer to allow the mesh to reference its own material for the glow effect, using glowLayer.referenceMeshToUseItsOwnMaterial(mesh). Then, you can enable the flag to render only the pixels that will glow in the glow layer’s onBeforeRenderMeshToEffect, which will send just the glow pixels to the post process. Then you flip the flag back in onAfterRenderMeshToEffect to pass the final color of the mesh for rendering. The glow layer will then be mixed with the base render.
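Roughly in code (the input block name “glowMask” stands for whatever flag you created in your node material):

```js
const glowLayer = new BABYLON.GlowLayer("glow", scene);
// Let the glow layer render the mesh with the mesh's own (node) material.
glowLayer.referenceMeshToUseItsOwnMaterial(mesh);

const glowMask = mesh.material.getBlockByName("glowMask");
glowLayer.onBeforeRenderMeshToEffect.add(() => {
    glowMask.value = 1; // output only the glowing pixels for the glow pass
});
glowLayer.onAfterRenderMeshToEffect.add(() => {
    glowMask.value = 0; // restore the full shader output for the main render
});
```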

There is a simple playground showing the effect, as well as a tutorial video walking through one way to set up a node material to be able to use the glow layer.

Hope this helps with your project.


Good to know thank you.
I’ll come back to the glow part later.

I managed to get the roughness and metallic working directly from the Blender import.
What I did differently this time: instead of doing math on the UVs, I used a texture to define where, and how much, things should be rough and metallic.
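For anyone finding this later, the equivalent setup in code would be roughly this (file names are illustrative; the Blender exporter wires this up for you):

```js
const pbr = new BABYLON.PBRMaterial("palettePBR", scene);
pbr.albedoTexture = new BABYLON.Texture("palette.png", scene);
// ORM-style map: roughness in the green channel, metallic in the blue channel.
pbr.metallicTexture = new BABYLON.Texture("palette_orm.png", scene);
pbr.useRoughnessFromMetallicTextureAlpha = false;
pbr.useRoughnessFromMetallicTextureGreen = true;
pbr.useMetallnessFromMetallicTextureBlue = true;
```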
