Multi-channel material

How is this multi-channel material achieved? I know it is implemented by combining multiple textures, but I don’t know how to implement the code, because Babylon.js’s basic texture channel only takes a single texture. Can it be achieved through a material plugin, or by post-processing the shader? I know NME can do it, but I currently don’t know how to use it; there are too few documents and tutorials available.

If you do not need to control the mix dynamically at runtime, the easiest way is to bake the multi-channel base textures into one texture, then attach it to the material.
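For the offline-bake route, the core operation is just compositing the layers into one pixel buffer before the scene ever sees them. A minimal sketch in plain JS (function name and the simple non-premultiplied "over" blend are my own; in practice you would usually bake in an image editor instead):

```javascript
// Composite an overlay RGBA buffer over a base RGBA buffer ("over" blending),
// producing the single baked texture you then attach to the material.
// Both buffers are flat RGBA arrays of the same length (4 bytes per pixel).
function bakeOver(base, overlay) {
  const out = new Uint8ClampedArray(base.length);
  for (let i = 0; i < base.length; i += 4) {
    const a = overlay[i + 3] / 255; // overlay alpha, 0..1
    for (let c = 0; c < 3; c++) {
      out[i + c] = overlay[i + c] * a + base[i + c] * (1 - a);
    }
    out[i + 3] = Math.max(base[i + 3], overlay[i + 3]);
  }
  return out;
}
```

The baked pixels could then be turned into a texture with `BABYLON.RawTexture.CreateRGBATexture(pixels, width, height, scene)` and assigned to e.g. `material.diffuseTexture`.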
If you do need to change the mix at runtime, there are four ways in Babylon.js:

  • ShaderMaterial gives you full control over the raw shader; you decide how the object is rendered, pixel by pixel, but you lose some Babylon.js features such as lighting and shadows unless you implement them yourself
  • Node Materials, like ShaderMaterial, give you full control, but without the need to hand-code a shader; the shader is constructed from the node graph automatically at runtime
  • Material Plugins let you inject custom shader code into an existing material, but they require some knowledge of Babylon.js internals
  • Dynamic Textures let a texture change at runtime, so instead of complex shader code you can bake the texture on the fly, at the cost of some performance, since each dynamic texture is backed by its own canvas
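As a concrete illustration of the ShaderMaterial route, here is a sketch (all names, texture URLs, and the pulse effect are my own; it assumes an existing Babylon `scene`) that blends two diffuse textures by the red channel of a mask texture, with a time-based pulse for a live-status look:

```javascript
const mixVertexShader = `
precision highp float;
attribute vec3 position;
attribute vec2 uv;
uniform mat4 worldViewProjection;
varying vec2 vUV;
void main() {
  vUV = uv;
  gl_Position = worldViewProjection * vec4(position, 1.0);
}`;

const mixFragmentShader = `
precision highp float;
varying vec2 vUV;
uniform sampler2D textureA;
uniform sampler2D textureB;
uniform sampler2D maskTexture;
uniform float time;
void main() {
  float m = texture2D(maskTexture, vUV).r;  // where the second texture shows
  m *= 0.5 + 0.5 * sin(time * 4.0);         // pulse for a "live status" effect
  gl_FragColor = mix(texture2D(textureA, vUV), texture2D(textureB, vUV), m);
}`;

// Plain-JS reference of what GLSL mix(a, b, m) does per channel:
function mixRGBA(a, b, m) {
  return a.map((v, i) => v * (1 - m) + b[i] * m);
}

if (typeof BABYLON !== "undefined") {
  BABYLON.Effect.ShadersStore["mixVertexShader"] = mixVertexShader;
  BABYLON.Effect.ShadersStore["mixFragmentShader"] = mixFragmentShader;
  const mat = new BABYLON.ShaderMaterial("mixMat", scene,
    { vertex: "mix", fragment: "mix" },
    { attributes: ["position", "uv"],
      uniforms: ["worldViewProjection", "time"],
      samplers: ["textureA", "textureB", "maskTexture"] });
  mat.setTexture("textureA", new BABYLON.Texture("base.png", scene));    // placeholder URL
  mat.setTexture("textureB", new BABYLON.Texture("alert.png", scene));   // placeholder URL
  mat.setTexture("maskTexture", new BABYLON.Texture("mask.png", scene)); // placeholder URL
  scene.registerBeforeRender(() => mat.setFloat("time", performance.now() / 1000));
}
```

Note this material will not receive scene lighting or shadows; that is the trade-off mentioned above.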

I mainly use it for device status management, which requires dynamic control: it needs not only color differentiation but also real-time dynamic effects.

Here, I made an example of how you can achieve this in NME: https://nodematerial-editor.babylonjs.com/#8HGSZL#1 (updated with more controls)


That looks so cool as some type of scanning effect!

Looks really good @fuyutami!!! I guess we should add it to the docs.

Thanks! That would be cool :smiley:

Thank you very much, but I don’t know NME. I wrote some simple code using Claude Code instead.