Node editor: Triplanar mapping lacks texture scale control

My example node graph:

Just to preface this a bit: I’m quite used to working with node materials in a few 3D packages, but I’m just a 3D generalist dumdum, helpless when it comes to actual shader code, or code in general for that matter.

The TriPlanarTexture block seems to lack any means of scaling the texture. Its required input, the ImageSourceBlock, has Scale U/V parameters, but they have no effect. I’m also not convinced that it maps correctly to a mesh with no UVs, but that might just be because I’ve hooked up some of the “utility” nodes wrong. It looks okay on the preview meshes, though.

I am looking for a way to add triplanar bumpiness, via a normal map, to a set of complex meshes without UVs. Since that proved impossible due to the lack of texture scale control, I turned to the noises in the node material. They seem to be triplanar by nature, so I reckoned the Clouds noise could do it for me in a pinch, but it lacks all scale control as well. The noise also seems to map weirdly to a mesh with no UVs, but again, that might just be me hooking up the noodles incorrectly.

Sorry that this turned out to be a longer read and a bit of a rant but I’d really like to be able to participate in the visuals of our projects.

Welcome aboard!

The way to change how the texture is looked up in a triplanar block is to modify the input position:

You can scale and offset the texture simply by scaling the position and adding an offset to it.
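To make that concrete, here’s a minimal sketch in plain JavaScript of what a triplanar lookup does and where the scale/offset lands. This is an illustration of the technique, not actual Babylon.js code; `sampleTexture` is a hypothetical stand-in for the real texture read.

```javascript
// Hypothetical texture lookup: returns a value for 2D coordinates.
// A simple checker pattern so the example is runnable on its own.
function sampleTexture(u, v) {
  return (Math.floor(u) + Math.floor(v)) % 2 === 0 ? 0 : 1;
}

// Triplanar lookup driven purely by world position and normal - no mesh uvs.
function triplanarSample(position, normal, scale = 1, offset = { x: 0, y: 0, z: 0 }) {
  // This is the node-graph trick: transform the position BEFORE the lookup.
  const p = {
    x: position.x * scale + offset.x,
    y: position.y * scale + offset.y,
    z: position.z * scale + offset.z,
  };

  // Blend weights from the absolute components of the surface normal.
  const ax = Math.abs(normal.x), ay = Math.abs(normal.y), az = Math.abs(normal.z);
  const sum = ax + ay + az;
  const wx = ax / sum, wy = ay / sum, wz = az / sum;

  // One planar projection per axis, blended by the weights.
  return wx * sampleTexture(p.y, p.z)
       + wy * sampleTexture(p.x, p.z)
       + wz * sampleTexture(p.x, p.y);
}
```

So in the node graph, a Scale (multiply) and an Offset (add) applied to the position before it enters the triplanar block tile and shift the texture, respectively.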

The triplanar block is not using the uv coordinates of a mesh for the calculations, so there’s no difference between a mesh with or without uv coordinates when you apply a node material using a triplanar block (assuming you are not using uv coordinates for something else in the material!).

Same thing for the noise block: it is not using uv for the calculations (but it can, if you use uv as an input or as part of some calculations for the input).

Thanks for the warm welcome and solving my problem so quickly! :raised_hands:
I can make the home stretch from here but I’ll bug you a bit with the UV/no UV thing:

I did a quick test with the version of the example project you posted in your answer.
I used Blender’s default cube as the preview mesh, and then the very same mesh with its default UV channel deleted before export.
It should give the same result, but it doesn’t, so have I set up the PerturbNormal node wrong somehow?

The PerturbNormal block is using the uv, so if they are not present, the rendering will indeed be different.

However, if you unlink the perturbNormal input of PBRMetallicRoughness, you should see no difference with or without uvs.

Yeah, that’s true. No difference between the results when no normal map is being used.
Sadly, the whole point of my inquiry was to get a triplanarly projected normal map to work in a PBR material. Do you know if there’s a way of doing that with the node editor?

The PerturbNormal block does need a uv input; there’s no way around that.

You can use the triplanar block to read from a normal map, though, but you will need a texture. For example:
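For reference, the blending that happens when a triplanar block is fed a normal map can be sketched like this in plain JavaScript. This is a simplified swizzle-and-blend illustration, not the engine’s actual shader; `sampleNormalMap` is a hypothetical stand-in for the texture read.

```javascript
function normalize(v) {
  const len = Math.hypot(v.x, v.y, v.z);
  return { x: v.x / len, y: v.y / len, z: v.z / len };
}

// Hypothetical normal-map read: returns a tangent-space normal in [-1, 1].
// A flat normal map always returns (0, 0, 1).
function sampleNormalMap(u, v) {
  return { x: 0, y: 0, z: 1 };
}

// Blend three axis-aligned projections of the normal map into one
// world-space normal, weighted by the surface normal (simplified swizzle).
function triplanarNormal(position, normal) {
  const ax = Math.abs(normal.x), ay = Math.abs(normal.y), az = Math.abs(normal.z);
  const sum = ax + ay + az;
  const wx = ax / sum, wy = ay / sum, wz = az / sum;

  const nx = sampleNormalMap(position.y, position.z); // projection along X
  const ny = sampleNormalMap(position.x, position.z); // projection along Y
  const nz = sampleNormalMap(position.x, position.y); // projection along Z

  // Swizzle each sample so its "up" axis matches the projection axis,
  // keep the sign of the surface normal, then blend and renormalize.
  return normalize({
    x: wx * nx.z * Math.sign(normal.x) + wy * ny.x + wz * nz.x,
    y: wx * nx.x + wy * ny.z * Math.sign(normal.y) + wz * nz.y,
    z: wx * nx.y + wy * ny.y + wz * nz.z * Math.sign(normal.z),
  });
}
```

With a flat normal map this reproduces the surface normal, which is the sanity check you’d expect from any normal-blending scheme.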

Node material:

I gathered as much. Alas, that still fulfills only two parts of my three-component goal:

  1. normal map
  2. projected
  3. to a mesh with no UVs

I find it a bit odd that you can have a triplanar texture projected onto a mesh without UVs, but if you do the same with a tangent-space normal map, there’s no way to take it into account in lighting.
Maybe I’ve just been spoiled by other editors ¯\_(ツ)_/¯

Thanks for your time and even though I didn’t get what I was aiming for, I really do appreciate the help!

Actually, it should work if your mesh has normals + tangents: the uvs are only used to create a cotangent frame when the mesh is missing normals or tangents.
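A quick sketch of why that is: with normals and tangents present, the tangent frame (TBN) is available directly, so no uvs are needed to derive a cotangent frame. This follows the glTF convention (tangent stored as a vec4, with `w` holding the handedness sign); the code is illustrative plain JavaScript, not engine code.

```javascript
function cross(a, b) {
  return {
    x: a.y * b.z - a.z * b.y,
    y: a.z * b.x - a.x * b.z,
    z: a.x * b.y - a.y * b.x,
  };
}

// tangent is {x, y, z, w}; w is +1 or -1 (glTF handedness sign).
// Transforms a tangent-space normal (from a normal map) to world space
// using only the mesh normal and tangent - no uvs involved.
function tangentSpaceToWorld(mapNormal, normal, tangent) {
  const b = cross(normal, tangent);
  const bitangent = { x: b.x * tangent.w, y: b.y * tangent.w, z: b.z * tangent.w };

  // world = T * n.x + B * n.y + N * n.z
  return {
    x: tangent.x * mapNormal.x + bitangent.x * mapNormal.y + normal.x * mapNormal.z,
    y: tangent.y * mapNormal.x + bitangent.y * mapNormal.y + normal.y * mapNormal.z,
    z: tangent.z * mapNormal.x + bitangent.z * mapNormal.y + normal.z * mapNormal.z,
  };
}
```

(Worth noting, as an aside: tangent generation generally requires a UV map, so depending on the exporter, meshes exported without UVs may be missing tangents as well.)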

The objects we use in the preview pane of the NME do have normals + tangents, so this material does work even if we set the uv input of PerturbNormal to (0,0):

Well hey, I got it working on some of my meshes now! At least some of the time.

I have no idea what the affecting factors here are, I gotta look into that.
I have exported all my test meshes with Blender’s glTF exporter, with some meshes originating from Maya and some primitives created directly in Blender. Some work and some don’t, with no apparent consistency.

I’ll probably make a post-mortem post here if I can pinpoint what makes the normals not work in this instance, and what has sometimes messed up even the albedo part of the projection.

Thank you so much for your help!