I’m a newbie with NME. I want to create a texture that takes two textures as input and outputs a single blended texture based on a float input (0.0–1.0). I want to use the resulting texture as a PBRMaterial.albedoTexture.
@inteja, the thing you want to keep in mind for the procedural texture mode of NME is that the texture is built in screen space and does not care about mesh.uv at this point. Once the texture is created in memory, you can assign it to any material, at which point it will map to the mesh.uv coordinates. So the shader will look like this:
We need to pass screen.position to the vertex output, which should be set up by default when switching to procedural texture mode, so you can normally leave this part alone. Then we need to use screen.position to drive UV space for your textures. Since screen space spans -1 to 1, we first remap that to 0 to 1: multiply screen.position by 0.5 and then add 0.5, which gets us into the right range with fewer operations than using the remap node.
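In plain code, the multiply/add pair is just this (illustrative TypeScript only, the function name is made up for the example; the graph does it with a Multiply and an Add node):

// Screen-space coordinate in [-1, 1] -> UV coordinate in [0, 1]
function screenToUV(coord: number): number {
    return coord * 0.5 + 0.5;
}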
The next part is to split the remapped screen.position so that we can invert Y on the texture if needed. This is because the engine maps UV space to DirectX conventions, whereas other tools/formats may require OpenGL conventions. The flipY boolean is exposed to the inspector so you can simply invert the texture if needed to match the UV convention of any mesh you have. For example, a Babylon-generated cube would not need flipY, but a glTF mesh would. If you are using this texture with meshes that are all of the same type, you will want to either bake in the inversion or remove it rather than leaving the lerp in, since those are extra per-pixel operations whose result never changes.
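In code terms, the boolean-driven lerp is equivalent to this (again illustrative TypeScript, not part of the graph; in NME it is a Lerp node with flipY as the gradient input):

// Conditionally invert the V coordinate to switch between
// DirectX-style and OpenGL-style UV conventions
function applyFlipY(v: number, flipY: boolean): number {
    return flipY ? 1.0 - v : v;
}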
Then we merge the result back together and use it for the UV input on your Texture nodes. If you want to pass textures to these nodes in code, make sure you give them unique names. The lerp that handles the blend of the textures does not need anything more than the float, since the float produces the same value that passing it through a black-to-white gradient would. If you need to limit the range, I would either change the min/max of the slider or use a remap node if you need to pass a 0-1 range from code. Using the gradient node just for a simple black-to-white gradient is too expensive, as there is a branch in the logic of that node.
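The blend itself is just standard linear interpolation; per channel it works out to this (illustrative TypeScript, with made-up names):

// texFade = 0.0 returns a (texture 1), texFade = 1.0 returns b (texture 2)
function blend(a: number, b: number, texFade: number): number {
    return a + (b - a) * texFade;
}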
Hopefully this helps unblock you, but feel free to reach out if you have more questions.
Thanks so much @PatrickRyan! Awesome response as always. Give me a chance to digest this and I’ll get back to you if I have any trouble, but it looks perfect!
So, just a question on usage. I’ve got something like the following so far, but I’m not getting any output, and I’m not sure how to access the “texFade” input to dynamically set the cross-fade amount.
import { NodeMaterial, Texture } from "@babylonjs/core";

const nodeMat = new NodeMaterial("nodeMat", scene);
nodeMat.loadAsync("path/to/nodeMaterial.json").then(() => {
    nodeMat.build(true);
    // Assign the source textures to the two uniquely named Texture blocks
    nodeMat.getTextureBlocks()[0].texture = new Texture("path/to/texture1.png", scene);
    nodeMat.getTextureBlocks()[1].texture = new Texture("path/to/texture2.png", scene);
    // Bake the node graph into a 256x256 procedural texture
    const proceduralOutputTex = nodeMat.createProceduralTexture(256, scene) as Texture;
    // myMat is my existing PBRMaterial
    myMat.albedoTexture = proceduralOutputTex;
});
I receive the following error: [.WebGL-0x7f8f3d85aa00]RENDER WARNING: there is no texture bound to the unit 0
EDIT
The texture issue was a race condition in my code, but the question of how to set “texFade” is still open.
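I’m guessing something along these lines, assuming “texFade” is the name of a plain float InputBlock in the graph, but I haven’t confirmed it works:

import { InputBlock } from "@babylonjs/core";

// Look the block up by the name it was given in the editor
const texFade = nodeMat.getBlockByName("texFade") as InputBlock;
texFade.value = 0.5; // 0.0 = texture 1, 1.0 = texture 2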