glTF - Can you animate at the vertex level and export it from 3ds Max?

Can you animate at the vertex level, using no bones/skin, and export it from 3ds Max?

I have successfully exported animations at the object level and from a skinned object with bones, but when I tried to animate at the vertex level, it doesn't seem to export at all.

It just might be that this is a feature that isn’t supported and that’s fine too.

Thanks!
~Jeff

Hi. You can't use Alembic-type animations in Babylon. You can use morph targets: Use Morph Targets - Babylon.js Documentation


Thanks for the reply, Kvasss! Ah, it's as I thought. I did see in the docs that you could use morph targets. However, that would be a really inefficient and limited way to animate verts.

Thanks again for your help!
~Jeff

I agree Alembic animation is cool, but for now Alembic files are too large for the web to make sense. Users will leave your webpage if they have to wait too long. Morph targets, though, are also cool and well suited to the web.

The best approach depends on your needs I guess.

  • If your animation can be done using math, you could create a shader (see the sketch just after this list).
  • If this isn't possible, you could create any number of morph meshes, save only the per-vertex differences, and manually change the positions at runtime. This should save you some space.
  • Alternatively, you can use Vertex Animation Textures as shown here: Vertex Animation Textures
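
For the first option, here is a minimal sketch of the idea in Babylon.js, assuming an existing scene, engine, and mesh. The shader name "wind" and the wave numbers are illustrative assumptions, not anything canonical:

    // Sway every vertex on the GPU with a sine wave driven by a time uniform.
    BABYLON.Effect.ShadersStore["windVertexShader"] = `
        precision highp float;
        attribute vec3 position;
        uniform mat4 worldViewProjection;
        uniform float time;
        void main(void) {
            vec3 p = position;
            // Displace each vertex sideways with a wave travelling up the mesh.
            p.x += 0.1 * sin(p.y * 4.0 + time * 2.0);
            gl_Position = worldViewProjection * vec4(p, 1.0);
        }`;
    BABYLON.Effect.ShadersStore["windFragmentShader"] = `
        precision highp float;
        void main(void) {
            gl_FragColor = vec4(0.2, 0.6, 0.2, 1.0);
        }`;

    const windMaterial = new BABYLON.ShaderMaterial("wind", scene,
        { vertex: "wind", fragment: "wind" },
        { attributes: ["position"], uniforms: ["worldViewProjection", "time"] });
    mesh.material = windMaterial;

    // Advance the time uniform each frame; all vertex math then runs on the GPU.
    let t = 0;
    scene.registerBeforeRender(() => {
        t += engine.getDeltaTime() / 1000;
        windMaterial.setFloat("time", t);
    });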

Yeah, I have actually used Alembic animations in other mobile engines, and it works great and is easy to set up. For example, on a low-poly mesh (since we worry about memory) I can easily put a space-warp modifier on the mesh to create a loopable wave, or wind that passes over vegetation, etc. All of which are a pain to do with bones and morph targets. You just have to be conscious of the number of verts you are playing with; again, this should only be used on low-poly objects. If that's the case, the memory taken isn't bad at all, and it's way more performant than a skinned rig in terms of processor load.

Hey Raggar. All good suggestions. However, I am looking for something that doesn't require a custom shader and works within the 3ds Max pipeline. The vertex animated textures are super interesting!

Thanks!
~Jeff

WebGL uses different magic: shaders. In a vertex shader you can control every vertex programmatically on every frame, and for this purpose the Babylon team created the Node Material Editor. It's a very powerful technology. I understand you may experience some cognitive dissonance when moving from one technology to another, but for working with GPUs, Alembic is not the best solution. This is just my opinion)

Also take a look at Shadertoy: https://www.shadertoy.com/
And if you want to move forward in gamedev, there are no other options) only hardcore, only shaders) Look at the NVIDIA demo where they shoot balls one by one: this is what happens on your CPU, and the picture takes a very long time to draw. After that they shoot all the balls at once from a mega cannon in a single moment: this is how the GPU works. And the bridge between you and this power is shaders.

Well, suggestions 1 and 3 both require a custom shader.
Suggestion 2 doesn’t, but I’m not sure how you would simplify the exportation pipeline.
Let’s say you have a looped animation of 1 second, all done in Max using modifiers or manual vertex manipulation.
Now, for each of the 60 frames, you'll have to export the mesh and compare the current vertex positions with those of frame 0. (We compare to frame 0 instead of the previous frame to make the animation frame-independent.)
This difference is now our delta, and it can be applied by updating the vertex positions programmatically.
A simple pseudo-JSON string/file example could look like this:

    frame0:
        vertex0: {0, 10, 0}
        ...
        vertex100: {7, 4, 120}

    frame60:
        vertex0: {0, -24, 0}
        ...
        vertex100: {-14, 0, 214}

You could optimize this JSON by using arrays for basically everything and perhaps some compression, but you’d still need the values of a vector3 for each vertex every frame (assuming every vertex changes every frame).
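
To make the runtime side concrete, here's a rough sketch of my own (not from any existing exporter) of applying such deltas in Babylon.js. It assumes deltaFrames was parsed from a file like the one above, and that the mesh was created as updatable:

    // deltaFrames[f] is a flat array [dx0, dy0, dz0, dx1, ...] of offsets
    // from frame 0; frameRate matches the 60 exported frames.
    const frameRate = 60;
    const basePositions = Array.from(
        mesh.getVerticesData(BABYLON.VertexBuffer.PositionKind));
    const positions = basePositions.slice();

    let elapsed = 0;
    scene.registerBeforeRender(() => {
        elapsed += engine.getDeltaTime() / 1000;
        // Map time onto the stored frames and lerp between the two nearest
        // ones, so skipped keyframes would still play back smoothly.
        const frame = (elapsed * frameRate) % (deltaFrames.length - 1);
        const f0 = Math.floor(frame);
        const t = frame - f0;
        const a = deltaFrames[f0];
        const b = deltaFrames[f0 + 1];
        for (let i = 0; i < positions.length; i++) {
            positions[i] = basePositions[i] + a[i] + (b[i] - a[i]) * t;
        }
        mesh.updateVerticesData(BABYLON.VertexBuffer.PositionKind, positions);
    });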

The biggest issue is having to export every single frame and do the delta calculations. I’m not sure how to do this in an intuitive way without having to add it to the exporter.

But your solution is similar to morph targets. You can't calculate transitions between vertex positions if you have arrays of different lengths: how do you plan to interpolate the positions if you don't know where your vertex is in the array? P.S. And your intuitive way of working with these arrays is morph targets))))
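
For comparison, the morph target route looks roughly like this in Babylon.js (a minimal sketch; mesh and curledMesh are placeholders and must share identical vertex counts and ordering, which is exactly why mismatched arrays break interpolation):

    // Attach a morph target manager and build a target from a deformed copy.
    const manager = new BABYLON.MorphTargetManager();
    mesh.morphTargetManager = manager;
    const curled = BABYLON.MorphTarget.FromMesh(curledMesh, "curl", 0);
    manager.addTarget(curled);

    // Animate the influence (0 = base pose, 1 = fully curled).
    scene.registerBeforeRender(() => {
        curled.influence = 0.5 + 0.5 * Math.sin(performance.now() / 500);
    });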

Hi @Jefro5,

Unfortunately, glTF is currently limited to morph target and node/skeletal animation techniques. I’m not sure if @PatrickRyan has any suggestions for implementing these effects in-engine, but you may want to look at vertex shading for recreating your wind/wave effects?


@Jefro5, I know you aren’t a fan of the shader route, but for procedural animations like you are describing, looping animation over something like vegetation, the shader is really the easiest way to do this. And you can use the new Node Material Editor (NME) to create them visually like you are used to in your DCC packages. You will likely want to still set up your assets in Max to empower the shader by using vertex color to allow you to isolate motion on your mesh.

As an example, I created a shader in NME to play a displacement animation per vertex using the red channel in the vertex color to record the final position in 0 - 1 range and then animated between them in the shader. We also created an overview video for the technique to show how we are using NME for animation.
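
For a rough idea of what such a node graph computes, here is a hand-written vertex shader approximation (my sketch, not the actual NME graph): the red channel of the vertex color gates how far each vertex may move, so vertex painting in Max isolates the motion:

    // Wire this up with a ShaderMaterial as in the earlier sketch,
    // adding "color" and "normal" to the attributes list.
    BABYLON.Effect.ShadersStore["vcDisplaceVertexShader"] = `
        precision highp float;
        attribute vec3 position;
        attribute vec3 normal;
        attribute vec4 color;
        uniform mat4 worldViewProjection;
        uniform float time;
        void main(void) {
            // color.r in [0, 1]: unpainted vertices stay put, painted ones move.
            float amount = color.r * 0.1 * sin(time * 2.0);
            gl_Position = worldViewProjection * vec4(position + normal * amount, 1.0);
        }`;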

Procedural animation in the shader takes the animation off the CPU (which bone, node, and morph target require) and places that all on the GPU which can open up resources to do other things for you. And as @Drigax mentioned, glTF does not have a notion of per vertex animation as you would need to store all keyframes for all vertices in your mesh.

On a super-low density mesh, this may not be a lot with short clips, but you can see the potential for that type of data to blow up the size of the file as the complexity of animation and mesh density increases. For example, with bone animation, rotating a joint saves the curve for that joint, and if you have hundreds of vertices skinned to that joint, they all take their motion from that one curve. Without the joint, we would have to store hundreds of curves, one for each vertex.
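
To put rough numbers on that: raw per-vertex keys for a 1,000-vertex mesh at 60 fps cost about 1,000 vertices × 60 frames × 3 floats × 4 bytes ≈ 720 KB per second of animation before compression, while a single bone's rotation curve over the same 60 frames is on the order of a kilobyte.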

We will have more examples of vertex animation through NME closer to the launch of 4.1, but I would give it a try because it really can save you time and energy by reusing the same shader across multiple assets without needing to animate each one separately. And with some extra inputs, you can really vary the motion across assets. Hope this helps in some way.

Thanks for the links, I’ll check it out!

Yeah thanks it seems that might be the only viable option.

Hi Patrick. I have done what you mention in your example in Unity with different games, and it works well. Don't know why I didn't think to use it here.

Yeah, when I used vertex animation in the past, I made sure the models didn't have many verts and the animation loops were short. I believe I could also tell the tool how many keyframes to skip to further reduce the memory.


Well, I am not sure if I should open a new topic or not, but it is related in a sense. Once again I am using 3ds Max with glTF as the output file.

I tried using morph targets, which work surprisingly well. I have a parent object with many duplicate (non-instanced) children linked to it, all of which have morph targets referencing the same target (I pasted a morph target modifier onto the many objects at once, creating a shared reference). The parent animates to rotate the cards, and the morph targets curl using the same keyframe range. They share the same animation groups: one animation group goes from frames 0-15 and the second from 16-31.

The odd thing is that when you play the first animation group, it plays both the parent rotation and the morph target animation in sync, as it should. When I play the second animation group, the parent rotation plays fine, but the morph target doesn't start playing until the rotation is over, when it should play in sync with the rotation. It would seem that the morph target doesn't play frame 16 as frame 0, but rather frame 16 as frame 16, hence the offset. To be clear, animation groups play as if they always start at frame 0, even if they are somewhere else down the timeline: relative vs. absolute index. It seems the morph target animation plays using an absolute index?
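
In case it helps anyone poking at a similar offset, one thing worth checking (purely a guess on my part, assuming the imported morph animation kept absolute frame numbers) is to dump the key frames of each targeted animation in the group and then re-normalize the group so every animation shares one frame range; "secondClip" is a placeholder name:

    // Inspect where each animation's keys actually start and end.
    const group = scene.getAnimationGroupByName("secondClip");
    group.targetedAnimations.forEach((ta) => {
        const keys = ta.animation.getKeys();
        console.log(ta.animation.name, "keys:",
            keys[0].frame, "to", keys[keys.length - 1].frame);
    });
    // If the morph influence keys sit at absolute frames 16-31 while the
    // rotation keys were rebased to 0-15, normalizing may realign them.
    group.normalize(group.from, group.to);
    group.start(true);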