NME Texture UV Planar Projection

Hi,
I am trying to do a planar projection through NME.
Have a look at this snippet: Babylon.js Node Material Editor

If you choose the Sphere preview, you will see weird artifacts when using the texture, but with the gradient everything looks fine.
Do you know what could cause this issue?

Best regards,
Martin


pinging @Evgeni_Popov

I’m not sure I understand…

The gradient takes a scalar as input, so when linking a vec2, it will only take vec2.x. If you link the y value to the gradient input, you will see the display changing on the sphere.

Hi,
thanks for clarifying the gradient input! I am not yet too familiar with programming shaders.
I have set up another simpler example:
https://nme.babylonjs.com/#VSGA00#2

Here I used the x and z positions as the green and red colors, and when they drive the fragment output directly, they show clean gradients. However, if I use them as the uv input for a texture, the texture is distorted on the sphere. On the rotating shader ball it is even kind of flickering. It seems the texture is somehow related to the mesh faces?


It seems to be an artifact of the texture sampling: if you display the x/z directly it is ok.

https://nme.babylonjs.com/#VSGA00#4


Clamping u and v helps a little.

Hi,
did you try out other objects besides the sphere, such as the rotating shader ball? Those artifacts sometimes seem even stronger there, and when you look closely, they seem to be “walking” over the mesh faces. This is all because of the sampling, right? If I had the option of nearest neighbour or linear filtering, as in the Inspector, the outcome would probably look different?


Concerning your approach of using the xyz of the mesh position instead of the world position: it looks better!
Maybe I should clarify what a colleague and I would like to achieve. We want to project planar (later triplanar) onto multiple meshes, avoiding the mesh uv and using a texture atlas instead. Therefore, we need the world position. But those texture artifacts currently distort the material.

That’s “expected”: it comes from the geometry not being tessellated enough.

You can experiment with this PG:

You will see that raising the number of segments helps with the artifacts, but some others may come from the sampling method (notably at the poles).
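To try that quickly in a playground, something along these lines should do (a sketch; nodeMaterial stands for the material built from the NME snippet):

// Sketch: a denser sphere to compare against the default preview mesh.
// "nodeMaterial" is assumed to be the material built from the NME snippet.
const denseSphere = BABYLON.MeshBuilder.CreateSphere("denseSphere", {
    diameter: 2,
    segments: 128, // raise this and the texture artifacts shrink
}, scene);
denseSphere.material = nodeMaterial;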

Just realized that the issue is related to Fract. In this example I left it out on the v axis (GREEN) and kept it on the u axis (RED):
https://playground.babylonjs.com/#2SN0M8#1

We have also tried to use Mod instead of Fract. Same result. Isn’t this weird?
Normally the uv of the texture would get position values like 0, 0.5, 0.99, 1, 1.5, 2, etc.;
with fract it would just become 0, 0.5, 0.99, 0, 0.5, 0 etc., so the output should be the same?

It’s not the same, because if you use fract you do the wrapping into 0…1 yourself before the GPU interpolates the uv.

So, for example, if the values to interpolate between are 0.4 and 1.2: when you use fract, the GPU will interpolate between 0.4 and 0.2, whereas without fract it would interpolate between 0.4 and 1.2 (and when going over 1 the sampler would wrap around to 0). So it does not give the same result in the end.
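A quick numeric sketch of that difference (plain TypeScript, not Babylon code), using the 0.4 / 1.2 values above sampled at the midpoint of the edge:

// fract as defined in GLSL: the fractional part of x.
function fract(x: number): number {
    return x - Math.floor(x);
}

const a = 0.4, b = 1.2, t = 0.5; // the two vertex values and the midpoint of the edge

// fract applied per vertex (what happens when the node runs in the vertex shader):
const perVertex = fract(a) * (1 - t) + fract(b) * t; // lerp(0.4, 0.2) -> 0.3

// wrapping applied after interpolation (fract moved to the fragment side):
const perFragment = fract(a * (1 - t) + b * t); // fract(0.8) -> 0.8

console.log(perVertex, perFragment); // 0.3 vs 0.8, clearly not the same result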


Now it makes sense to me! Since the fract is calculated per vertex on the GPU, the interpolation then happens on the already-wrapped values, I guess. Hope we find a good workaround and that we can share it!

I would think the workaround is to not use fract?

I guess, yes :smiley:

We were trying to do planar mapping with a texture atlas instead of multiple uvs. So instead of having e.g. 3 textures and one material, we would have one texture and one material, but would assign the uv through an index. Probably too much effort for too little effect.

Hi,
sorry that I have to dig up this topic again, but I am still surprised by how Babylon’s node material handles texture sampling.
I have made comparisons with threejs as well as Blender. Both allow me to do planar mapping while using fract/mod/floor to limit the planar uv within bounds of e.g. 0 to 0.5 instead of 0 to 1.
When I look into the exported shader I guess the most important lines are the following ones:

#ifdef UVTRANSFORM0
uniform mat4 textureTransform;
//Texture
//VectorMerger
vec2 xy = vec2(u_Float, u_Float1);
//Texture
#ifdef UVTRANSFORM0
transformedUV = vec2(textureTransform * vec4(xy.xy, 1.0, 0.0));
#ifdef UVTRANSFORM0
vec4 tempTextureRead = texture2D(TextureSampler, transformedUV);

Maybe it is worth having another look at it.

threejs via https://dusanbosnjak.com/test/nodes/:

babylonjs (texture sampling is linear) - snippet id 9WKHVC#1

blender - not webgl

There are two things:

  • the filtering used by the NME for texture samplers is trilinear, whereas in your screenshots of Threejs / Blender you are using nearest filtering
  • the NME tries to do as much work as possible in the vertex shader (as opposed to the fragment shader) to improve performance, but in your case that is not what you want, because the fract * mul operation should be done in the fragment shader to get the result you seek. To my knowledge there’s currently no way to force the NME to favor the fragment shader over the vertex shader when it has a choice. @Deltakosh maybe we should do something about that?

You can do what you want with a PBRCustomMaterial, though:
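A minimal sketch of that approach (untested; it assumes the standard PBRCustomMaterial injection points and that the PBR fragment shader exposes vPositionW and surfaceAlbedo, names that may differ between versions):

// Sketch: planar projection computed per fragment with a PBRCustomMaterial,
// so fract() runs after interpolation instead of per vertex.
const mat = new BABYLON.PBRCustomMaterial("planarMat", scene);
mat.metallic = 0;
mat.roughness = 1;

// Bind the atlas as an extra sampler (the file name is just a placeholder).
const atlas = new BABYLON.Texture("atlas.png", scene);
mat.AddUniform("atlasSampler", "sampler2D", atlas);

mat.Fragment_Custom_Albedo(`
    vec2 planarUV = fract(vPositionW.xz) * 0.5; // e.g. limit to a 0..0.5 atlas tile
    surfaceAlbedo = texture2D(atlasSampler, planarUV).rgb;
`);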


Thanks a lot for the clarification. Now I finally understand. It actually makes sense to keep fract in the vertex shader for performance reasons. Having the option to use it in the fragment shader would be a good addition for the user.

Concerning the filtering mode, I just realised that I had to reload the NME after setting it to nearest in the Inspector.
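For reference, the sampling mode can also be changed from code on the texture behind the NME Texture block, which should take effect without a reload (a sketch; it assumes the material has a single texture block):

// Sketch: switch the texture used by the node material to nearest filtering.
const textureBlock = nodeMaterial.getTextureBlocks()[0];
textureBlock.texture?.updateSamplingMode(BABYLON.Texture.NEAREST_SAMPLINGMODE);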

I will have a look at the PBRCustomMaterial. Would it be even more performant than a node material if I set it up the same way?

Performance would be about the same, as it is doing the same thing, if using the PBR block in the NME.

However, you need to know GLSL to use the PBRCustomMaterial, and you don’t have the same freedom as in the NME because you can inject your code only at certain specific locations.

Thanks a lot! I will give it a try :slight_smile:

I tried to work a bit with PBRCustomMaterial, but I already fail at assigning the correct variables. NME is a much more convenient solution.

It would be very helpful if there were an option to use fract in the fragment shader instead of the vertex shader!
Currently I am using around 10 to 15 copies of a node material and assigning textures for each. I have also tried to place multiple texture nodes into one material, but it does not make a great performance difference. Therefore I am still considering the solution of using one material with a texture atlas. I hope the performance win outweighs the loss of moving the work from the vertex to the fragment shader.

I hope it is not too much work and worth the investment.

I will let @Deltakosh answer regarding doing all the computations in the fragment shader, it may not be straightforward to enable such a mode…


I think we could think of exposing a way to flag a node as fragment-only.

So far, if you look at mod for instance:
Babylon.js/modBlock.ts at 74e37d81733c99ef79c64a36674e8fe9dcc7ee89 · BabylonJS/Babylon.js (github.com)

This means the block will stay in whichever section (vertex or fragment) it ends up in and will not force the system to dispatch it to a specific one.

That property can be changed by code:

node.target = BABYLON.NodeMaterialBlockTargets.Fragment;
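Applied to an existing material, that could look like this (a sketch; the block name depends on your graph):

// Sketch: load the NME snippet, then force the wrapping block into the fragment shader.
BABYLON.NodeMaterial.ParseFromSnippetAsync("9WKHVC#1", scene).then((nodeMat) => {
    const fractBlock = nodeMat.getBlockByName("Fract"); // use the name of your Fract/Mod block
    if (fractBlock) {
        fractBlock.target = BABYLON.NodeMaterialBlockTargets.Fragment;
        nodeMat.build(); // rebuild so the new target takes effect
    }
});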

The other option is to force the shader to do the operation in the fragment shader by plugging a fragment-only node before it (like the DerivativeBlock), and yes, this is really hacky :smiley: