@Jefro5, I know you aren't a fan of the shader route, but for the kind of procedural animation you are describing (looping motion over something like vegetation), a shader really is the easiest way to do it. And you can use the new Node Material Editor (NME) to build it visually, much like you are used to in your DCC packages. You will likely still want to set up your assets in Max to drive the shader, using vertex color to isolate motion to specific parts of your mesh.
As an example, I created a shader in NME that plays a displacement animation per vertex: it uses the red channel of the vertex color to record the final position in a 0-1 range, then animates between the two positions in the shader. We also created an overview video for the technique to show how we are using NME for animation.
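The per-vertex blend that shader performs can be sketched in plain JavaScript. This is a hypothetical sketch of the math only, not NME or Babylon.js API; `basePos`, `targetPos`, and the looping phase function are assumptions standing in for what the node graph would wire up:

```javascript
// Sketch of the blend a vertex shader would perform per vertex.
// Each vertex has a base position and a displaced target position; the
// red channel of the vertex color (0-1) gates how much motion that
// vertex receives, so painted regions can be isolated from the animation.
function animateVertex(basePos, targetPos, redChannel, time) {
  // A looping 0-1 phase driven by time (a sine wave remapped to 0-1).
  const phase = 0.5 + 0.5 * Math.sin(time);
  // Scale the phase by the red channel so black-painted vertices
  // (r = 0) stay still and white-painted ones (r = 1) move fully.
  const t = phase * redChannel;
  // Linear interpolation between the two recorded positions.
  return basePos.map((b, i) => b + (targetPos[i] - b) * t);
}

// A vertex painted black (r = 0) never moves, whatever the time:
console.log(animateVertex([0, 0, 0], [0, 1, 0], 0, Math.PI / 2)); // [0, 0, 0]
```

In the real material this interpolation runs on the GPU for every vertex each frame; the sketch just shows why one painted channel is enough to control the whole motion.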
Procedural animation in the shader moves the work off the CPU (which bone, node, and morph target animation all require) and onto the GPU, which can free up resources for other things. And as @Drigax mentioned, glTF has no notion of per-vertex animation; you would need to store keyframes for every vertex in your mesh.
On a very low-density mesh with short clips this may not amount to much, but you can see how that kind of data blows up the file size as animation complexity and mesh density increase. With bone animation, for example, rotating a joint stores only the curve for that joint, and if hundreds of vertices are skinned to it, they all take their motion from that one curve. Without the joint, we would have to store hundreds of curves, one for each vertex.
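A quick back-of-the-envelope comparison makes the blow-up concrete. The numbers below are illustrative assumptions (30 fps, a 2-second clip, keys stored as vec3s of 32-bit floats), not what any exporter actually writes:

```javascript
// Rough storage comparison: one joint curve vs. a curve per vertex.
// All figures are illustrative; a real rotation key would be a
// quaternion (4 floats), but the orders of magnitude are the same.
const keyframes = 30 * 2;   // 60 keys for a 2-second clip at 30 fps
const bytesPerKey = 3 * 4;  // vec3 of 32-bit floats = 12 bytes
const vertices = 500;       // vertices skinned to the joint

const withJoint = keyframes * bytesPerKey;            // one curve total
const perVertex = vertices * keyframes * bytesPerKey; // one curve per vertex

console.log(withJoint); // 720 bytes
console.log(perVertex); // 360000 bytes, 500x larger
```

The ratio is simply the vertex count, which is why baking per-vertex keys into a file scales so badly compared to a skinned joint.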
We will have more examples of vertex animation through NME closer to the launch of 4.1, but I would give it a try: it really can save you time and energy, since you can reuse the same shader across multiple assets without animating each one separately. And with a few extra inputs, you can really vary the motion from asset to asset. Hope this helps in some way.