I’ve done some searching but the forum search keeps telling me “you’ve done this operation too many times”…
I’m looking for advice on the right way to do the following:
I previously made a procedural material creator. It was done in JavaScript on a canvas and worked pretty well. However, with complex materials and large texture sizes it was very slow - too slow for users IMO.
I want to try to do the same thing using shaders to build the textures, so the work can be done on the GPU.
Basically it works like Quixel Mixer but with procedurally created base textures. It creates several images (albedo, metallic, roughness) and a heightmap. It then mixes multiple procedural textures together, using the heightmap to determine which of two base materials’ texels to use at each point.
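On the canvas version, the per-pixel blend was essentially something like the sketch below (the function name and the blendRange parameter are illustrative, not my actual code - the texel whose height is greater wins, with a small range to soften the crossover):

```javascript
// Illustrative per-pixel height blend between two base materials.
// Each sample holds the channels described above (albedo, metallic,
// roughness, height), all normalized to [0, 1].
function heightBlend(a, b, blendRange = 0.1) {
  // t -> 0 where A sits higher, t -> 1 where B sits higher.
  const t = Math.min(1, Math.max(0,
    (b.height - a.height) / blendRange + 0.5));
  const lerp = (x, y) => x + (y - x) * t;
  return {
    albedo: lerp(a.albedo, b.albedo),
    metallic: lerp(a.metallic, b.metallic),
    roughness: lerp(a.roughness, b.roughness),
    height: lerp(a.height, b.height),
  };
}

// Where material A clearly sits higher, its channels win outright:
const mixed = heightBlend(
  { albedo: 0.2, metallic: 1.0, roughness: 0.3, height: 0.9 },
  { albedo: 0.8, metallic: 0.0, roughness: 0.7, height: 0.1 },
);
// t clamps to 0 here, so mixed.albedo === 0.2
```

Doing this per pixel in JavaScript is exactly the part that gets slow at large texture sizes, which is why I want to move it into a shader.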
I would like to use NME to do this, so - is it possible to have multiple texture inputs and outputs from a procedural node material created in NME?
If not, how would I do a sequence of mixing, say, 4 materials in a row to create a final material to be placed on a 3D surface? The composition only needs to be done once per change in the materials (not constantly every frame).
I can render the same procedural image multiple times to different render targets (for each of the mixed material’s albedo, metallic, roughness, and heightmap textures) but it would be good to do it all in one go.
@joelutting, I think I am following what you are asking, but could you give me a little more context about what you mean by:
is it possible to have multiple texture inputs and outputs from a procedural node material created in NME?
Can you explain more about what you mean by multiple outputs?
Yes - thanks for responding.
So what I want to do is take in 2 source materials (each material having a texture for albedo, metallic, and roughness, plus a heightmap), and output a ‘mixed’ material - so I would output several textures for the new material: new albedo, roughness, metallic, and height texture maps.
From there, the normal and AO maps are generated from the heightmap to produce the final full material (and possibly other textures, such as refraction in the case of a transparent material, for example).
So I would like to create a shader, ideally in NME, that outputs to several different textures at once.
I could create several shaders and produce each output texture individually, but ideally I’d like to generate all output textures in a single shader.
The other part of the question is that I need to do this mix many times in a row - mix 4 materials together, one after the other. How would I do that in Babylon.js - i.e. set up the input and output textures, call a ‘render()’ function, then take the output and mix it with the next material? I can do this easily on a canvas, but I’m not sure how it works in WebGL/Babylon.
@PatrickRyan Is this clear?
@joelutting, I think I understand what you are saying, but my question lies in why you are taking the path it sounds like you are taking. I understand mixing textures to create other textures, but typically those textures and generated textures would be fed to lighting calculations within the same shader. So your input textures, and the textures generated from them, would form a material to be assigned to a mesh, with all textures contributing to the final color output on the mesh. You could duplicate this material and replace the texture inputs to assign it to a different mesh as much as needed.

In this example, I am mixing three different base color, normal, and ORM texture sets based on the UV coordinates of the mesh. You can add as many texture blocks as needed and mix them in whatever way you need. In this case I am choosing a set based on UV coordinate, but you can do anything you want in terms of how your textures are mixed.
But it sounds like your desired output is the actual mixed textures themselves rather than a material to assign to a mesh. This is where I was running into confusion and thought I was lacking context. If you indeed just want the textures mixed as you want them, there are some extra steps you need to take. First, you will want to be using NME in procedural texture mode:
This will create a node material that is intended to output a texture at the end. You will notice that some nodes are not available in this mode, mostly those dealing with mesh data, as there is no concept of a mesh here. You can import textures and mix them as you like, but the output is always a single texture. So if you want one node material to mix multiple textures, you will need to create a graph that does all of the mixes you want, and then put a switch at the end so only one mix is passed through at a time. Then you pass all textures to the node material, set your switch value, and await the texture creation before calling it again with the switch changed to output the second texture, and so on.
If you need to batch process a lot of textures, this is just a matter of setting up your textures in a 2D array and iterating through it, calling your function to generate a texture for each set and awaiting the output before changing the switch and going again. You will be left with all of your textures in memory once they are created.
Node material was not designed to output more than one texture at a time so there isn’t a way to do that. But you can certainly generate multiple textures from one node material by changing parameters and calling createProceduralTexture multiple times on the same node material.
I hope this answers your question, but let me know if I am off base with what you are trying to accomplish.
@PatrickRyan Thanks - yes that answers my question - I thought that might be the case.
For context, I am building a material creator for my software (called Mindless Creator, an older version of which is here: http://mc.mindlessbrain.com). I haven’t been working on this for a while as contract programming work has got in the way, but I want to get back into it by making a better version of an old material creator from a much older version of this software, and figure out how to make it faster than it used to be.
Now I just have to figure out how to render textures in a row and use each output as an input to the next texture. I’m pretty sure I did this ages ago when I was still using Three.js, but I can’t remember… or know how Babylon.js does a similar thing.
I assume setting up the textures and calling some sort of render() function…?