Animated displacement

Sorry for the delay, I’m ill in bed, but I’m better today, well enough to read your messages.
Thank you @carolhmj @PatrickRyan and @Blake for your input, it is super helpful.
To show the final (kind of) effect we want to achieve, I created a mockup scene in MAYA that you can check here:
AnimatedDisplacement_andOpacity_Test
AnimatedDisplacement_andOpacity.zip (12.5 KB)
So you can see we need to displace the geometry with one texture AND use another texture for opacity, both of them with different texture coordinates but with the same animation.
Hope that makes sense.
Cheers.

Oh no, hope you feel better soon D: This effect is pretty straightforward to achieve in NME, take a look at this mockup: Babylon.js Node Material Editor (babylonjs.com)

3 Likes

I also hope you feel better quickly, @joie! And @carolhmj is right on the money with her NME graph. There is one addition I would make for performance reasons, which is a discard node. What this node does is throw out any pixel whose value falls beneath the cutoff value. This matters because if you have a lot of chunks of mesh that are invisible, you still pay to render those pixels even though they render nothing. What happens is that the pixels behind any pixel with an alpha value are rendered first, then a second pass is done to render the pixels with alpha on top. For a pixel with an alpha of 0, we don’t render anything, but we still have to go through all of the pixels that have an alpha value whether we render anything or not.

What the discard block is doing is saying “skip rendering for any pixel below 0.0001”, which should largely be the pixels that have a value of 0 in the alpha channel. That way we don’t even consider those pixels on the second pass of rendering. This is a way to limit overdraw on meshes where you don’t need much in the way of translucency but are focused on eliminating unwanted pixels. It helps you render faster and lets you spend that performance on other calculations in your frame.
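For anyone building the same graph in code rather than in the editor, here is a minimal sketch of just the alpha/discard part, assuming the NodeMaterial block API (TextureBlock, InputBlock, DiscardBlock); the rest of the graph (UVs, texture assignment, fragment/vertex outputs) is omitted:

const nodeMaterial = new BABYLON.NodeMaterial("beltMat", scene);

const opacityTex = new BABYLON.TextureBlock("opacityTexture"); // texture + UVs wired elsewhere

const cutoff = new BABYLON.InputBlock("cutoff");
cutoff.value = 0.0001; // pixels with alpha below this are skipped entirely

const discard = new BABYLON.DiscardBlock("discard");
opacityTex.a.connectTo(discard.value);   // the alpha channel drives the test
cutoff.output.connectTo(discard.cutoff);

nodeMaterial.addOutputNode(discard);     // discard acts as an output node, like fragmentOut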

2 Likes

Thank you all for your very appreciated help on this.

I’ve made the final mesh for this and laid out its UVs so the textures will hopefully do what we need.
So, here is the setup I’ve come up with.

I have decades of experience with nodal procedural shading editors, so it’s pretty natural for me. But man, the NME’s nodes are so low-level that it is very difficult for me to understand your graphs! LOL

For example, thanks @PatrickRyan for the discard node trick, but I see it’s not connected to anything else, so I guess it is an output node somehow.

Thank @carolhmj too for your mockup, that definitely helped a LOT!

Anyway, you can check it out here.

My problem is that I can’t preview the real mesh inside the editor. There is the option to upload a glTF for the preview window, but it keeps saying “uploading waiting…” and never shows anything, so I’m quite blind here. You can find the glTF + .bin + textures here:

Correa_glTF.zip (1.3 MB)

Could anyone check it and tell me if everything is OK?

Oh, one more thing: the node “Anim”, which is responsible for moving the displacement + opacity textures along, should be animatable from outside, in the BabylonJS scene.

Remember, I’m NOT a coding guy AT ALL, so be patient with me.

Cheers.

Hey! Sorry I didn’t see this earlier :sweat_smile: I think we talked about the .gltf loading in another topic, right?

About this, you can animate NME properties from code, you just need to access the blocks through code and set their values. :smiley:

@joie, if you are trying to load a custom glTF file into the preview window in NME, you need to use the glb (binary) format rather than separate glTF and bin files.

You can get a glb either by exporting as a glb, or you can use the shell extension that @bghgary wrote, which gives you a right-click context menu item to convert between a glTF file and a glb file. I use this utility often and honestly it saves a lot of work, especially if you don’t have access to the source file the glTF was derived from.

And, yes, the discard node is an output node like fragment or vertex out. It simply applies to the fragment output with no extra wiring.

The one thing I noticed in looking at your displacement is that your mesh has thickness, so you need to be able to isolate your vertices so that some have displacement and others do not. The simplest way is to assign a vertex color to any vertex that needs displacement. Something simple like (1, 0, 0) works, so you can multiply the red channel of the vertex color by the displacement amount, which will then only affect vertices that are colored red. Note that the glTF format does not respect welded seams across UV boundaries, so your sample glTF has separate meshes for inside, outside, top, and bottom. You will therefore need the displacement on the top/bottom meshes to move not along Y but along local Z, and only for the vertices that meet up with the face that also displaces.

This is where I would use a separate color like (0, 1, 0), and then you could either use the multiply trick from above or use a logical equal and pass the correct axis to displace along.
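As a rough illustration of the multiply trick (a sketch only, assuming a code-built graph; in the real graph the displacement amount would come from the displacement texture rather than a constant):

const vertexColor = new BABYLON.InputBlock("color");
vertexColor.setAsAttribute("color");              // per-vertex color attribute

const splitColor = new BABYLON.ColorSplitterBlock("splitColor");
vertexColor.output.connectTo(splitColor.rgba);

const displacement = new BABYLON.InputBlock("displacementAmount");
displacement.value = 0.1;                         // placeholder for the texture-driven amount

const maskedDisplacement = new BABYLON.MultiplyBlock("maskedDisplacement");
splitColor.r.connectTo(maskedDisplacement.left);  // red = 1 displaces, red = 0 stays put
displacement.output.connectTo(maskedDisplacement.right);
// maskedDisplacement.output then feeds whatever offsets the vertex position.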

This is getting so complex…

Let’s see…

About the isolation of the displacement @PatrickRyan is mentioning, I tried to do it as I usually do in offline rendering pipelines: by forcing the textures to show in only a limited range of the UVs. So in the U direction the texture is used over 100% of the range, but in V it is used over just 90%, leaving the last 10% blank.
The UVs of the object are designed precisely for that purpose, so it should work right away. Please see here:

You can see the displacement is not affecting that 10% of the regular plane. It should work with the final OBJ of course. I don’t know where the problem is.

About the vertices not being welded, my solution would be to have two sets of vertices at the very same location, so one row would have displacement and the other wouldn’t.

1 Like

Hi there @PatrickRyan
I still need to connect, somehow, an animation from one object in the scene TO the attribute named “Anim” inside the shader.
Is there any way to do that?
Thanks!

Hi there @PatrickRyan and all the good people in the forum:

About the last comment from my colleague @joie: the case here is that we need to connect the “Anim” attribute of the Animation block (at the far left of this graph) with a locator (helper) in our scene.

It would be something like this pseudo-code:

...
locator.scaling.x = API_TO_GET_VALUE_OF_Anim;
...

How can it be done?

Thanks for your time.

P.S. This is the “Anim” parameter we’re talking about:

You can use getBlockByName in the NodeMaterial to access and change the values of any input block: Node Material Get and Update Values | Babylon.js Playground (babylonjs.com)
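For reference, a minimal sketch of that approach, assuming the node material is already applied to a mesh (beltMesh and locator are hypothetical names here, and the locator carries the animation):

const animBlock = beltMesh.material.getBlockByName("Anim");

scene.onBeforeRenderObservable.add(() => {
    // Copy whichever animated property you chose on the locator into the block.
    animBlock.value = locator.scaling.x;
});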

3 Likes

Thanks, @carolhmj; it worked like a charm!

2 Likes

Hi there, @carolhmj and @PatrickRyan:

I don’t know if it would be better to open a new issue about the following.

Please look at this PG, where basically we have the infamous timing belt my colleague @joie has been talking about for a long time, which is pressed by a trivial “actuator” by means of a morph target/blend shape.

The case is that, if we activate/uncomment the code lines implementing the shader in order to have a ripple effect (please do so in the PG code), the morph target/blend shape is overridden.

Any thoughts about having both morphtarget and shader working together on the same geometry?

Thank you for your time!

That’s a simple fix, just add the MorphTarget node to the Node Material: Morph targets Node Material | Babylon.js Playground (babylonjs.com) :smiley: We need to add this info to the docs page, I’ll do that rn.

Edit: Add info abt node material in morph targets docs by carolhmj · Pull Request #691 · BabylonJS/Documentation (github.com)

3 Likes

Since I’m not a coding guy by any means, please tell me what I should do with that node.

Do I leave it disconnected, as it is by default?
Thanks!

If you check the node material in the example, you’ll see it is connected to the starting position/normal/etc and passes the values through to worldPosition, worldNormal, etc :slight_smile:

3 Likes

I didn’t know I could open the inspector in a playground, thanks for the tip.

But in my material (here) I have several position nodes and all that, and I don’t really know where to connect the morphTarget node (several parts of that graph are like black magic to me).

May I ask you to check that graph and help me put the morphTarget in context?

Thanks in advance.

You should connect it in every place where you have a mesh.position, mesh.normal, mesh.tangent or mesh.uv, and its output goes to where the input was previously going (so it’s like “intercepting” the values to update them with their morph target values). You can have multiple Morph Target nodes, so you can place them wherever is best for you.
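To illustrate that interception for the position attribute only (a sketch assuming a code-built graph; normal, tangent and uv follow the same pattern):

const position = new BABYLON.InputBlock("position");
position.setAsAttribute("position");

const morph = new BABYLON.MorphTargetsBlock("morphTargets");
position.output.connectTo(morph.position);

const world = new BABYLON.InputBlock("world");
world.setAsSystemValue(BABYLON.NodeMaterialSystemValues.World);

const worldPos = new BABYLON.TransformBlock("worldPos");
morph.positionOutput.connectTo(worldPos.vector); // morphed position, not the raw attribute
world.output.connectTo(worldPos.transform);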

1 Like

@joie, if you want an example to follow, we have the morph target node in the node material used in this playground example.

4 Likes

Hi there @PatrickRyan
Since I don’t really know what I’m doing, I’ve just followed what @carolhmj said and connected the morphTarget node to every single node using mesh.position, mesh.normal, mesh.tangent and mesh.uv.
But the morph target still doesn’t work in combination with the shader.
Could you please take a look at the updated playground here?
Thank you for your support and help with this.
Cheers.

@joie, I am only guessing at the way the scene is supposed to work, but I believe that this is what you intended? There are a couple of things at play here.

First, you had two morph target nodes in your graph and only one of them was getting the updates for the motion in the UVs, so your animation wasn’t working. You either need to wire all of your UVs to the same UV output from a morph target node with the updated UVs, or you can simply not use the UV output of the morph target node at all. Since you probably don’t care about the UV layout of your morph target, in your case you can use the UVs of the original mesh and not bother with the calculation of the morph target UVs. If your morph targets manipulated your UV layout, you would want to use this output, but in your case you don’t need it.

The second thing is that you have your alpha wired to both the alpha of the fragmentOut and to a discard output node. If you are discarding the same pixels, you don’t need the alpha wired; you will achieve the same result and it’s cheaper. This is primarily because once you wire alpha, the entire mesh is placed in the transparent queue for rendering, so your scene will render first without the transparent mesh and then again to add in the transparent mesh. This adds draw calls to your scene. Using just the discard node will leave your mesh in the opaque queue and just skip rendering some pixels on the mesh.

The other thing you have to handle with a mesh in the transparent queue is sorting. When you have overlapping front faces in a transparent mesh, the renderer does not know which should be in front of the others without a depth sort. You would need to enable depth sorting on your mesh to have it render correctly, but depth sorting comes at a cost, as you now also have to calculate the depth from the camera’s position to every part of the mesh to make sure you are rendering correctly. From the standpoint of the material, you simply need to force depth writing, but this does add cost to your render every frame.
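In code, those options look roughly like this (a sketch with hypothetical names beltMesh and beltMaterial; the cheapest path is still to keep the mesh opaque and rely on the discard node alone, i.e. don’t wire the texture alpha into fragmentOut.a):

// Force depth writing on the transparent material:
beltMaterial.forceDepthWrite = true;

// And/or sort the mesh facets by depth every frame (extra per-frame cost):
beltMesh.mustDepthSortFacets = true;
scene.onBeforeRenderObservable.add(() => beltMesh.updateFacetData());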

Let me know if you have more questions.

4 Likes