Hi everyone,
It’s been a while! I’m looking for a few things for a potential new project.
I need to animate a character in WebGL with facial animations. Because of the number of bones and blendshapes that can come with it, and because I need something efficient enough to run on most mobiles, I wanted to use this technology (which I already used for another project): Vertex Animation Tool | Unreal Engine 4.27 Documentation
This is a script for 3ds Max that bakes textures (one for the morphing, one for the normals) to store complex animations. That way, I could have the animated face running off a few generated textures.
I have found that Babylon.js supports VATs (Baked Texture Animations | Babylon.js Documentation (babylonjs.com)), but not in exactly the same way: there, you first serialize to JSON an animation that already runs in Babylon.js.
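From the doc page, the flow looks roughly like this (my sketch from reading it, method names as I found them there; the animation has to already play in Babylon.js before it can be baked):

```ts
// Sketch of the Babylon.js baked-texture-animation flow from the doc page.
const baker = new BABYLON.VertexAnimationBaker(scene, mesh);
const ranges = [{ from: 0, to: 100, name: "face" }]; // hypothetical range

baker.bakeVertexData(ranges).then((vertexData) => {
    // This is the JSON step I mentioned: the baked data can be saved...
    const json = baker.serializeBakedVertexDataToJSON(vertexData);
    // ...and reloaded later with baker.loadBakedVertexDataFromJSON(json).

    // The baked data becomes the texture that drives the shader.
    const vatTexture = baker.textureFromBakedVertexData(vertexData);
    const manager = new BABYLON.BakedVertexAnimationManager(scene);
    manager.texture = vatTexture;
    manager.setAnimationParameters(ranges[0].from, ranges[0].to, 0, 30);
    mesh.bakedVertexAnimationManager = manager;

    scene.registerBeforeRender(() => {
        manager.time += engine.getDeltaTime() / 1000;
    });
});
```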
So my question is: are the two compatible? Is there a way to generate the JSON from 3ds Max, or maybe just convert the textures it generates into JSON?
Or maybe someone sees a completely different solution?
I’m not sure I understand the differences between vertex animation and morph targets from the Unreal doc…
It seems vertex animation is mostly used for particle systems, but for regular meshes it's no different from morph targets. And as they state, you can't use vertex animation for rigged meshes; you must resort to morph targets.
So, in your case, I think you should really use morph targets; it will be the most efficient way to do facial animations for your rigged meshes.
VATs bake bone transformations (matrices), so they are not suitable for your needs.
As I understand it, VATs are used with particle systems to scatter animated meshes with baked animations.
When doing a complex animated character, we can have somewhere between 50 and 150 blendshapes, and we still need to add bones on top of that. I really don't think WebGL can handle this on low-end smartphones, as it would consume a lot of memory (depending on the complexity of the mesh). That's why I wanted to bake the vertex animation (either to JSON or to a texture), so the animation depends only on the baked vertex data and nothing else.
I will have the full animation in 3ds Max or Maya and will export it for use in WebGL. I don't need it to be modified in WebGL. That's why I was looking at VATs.
No, in Babylon.js VATs are for rigged meshes and store bone matrices, not vertex positions (so VAT is not really the appropriate name here).
That’s what morph targets are: a list of vertex positions (and/or other attributes like UVs, if you want) stored in textures.
OK, so VAT won’t help me here, as the goal is to have only a static mesh animated by the shader.
The textures from Unreal store the position of every vertex for each frame. So if you used blendshapes, it's as if you had one blendshape per frame and played them all in sequence to get the whole animation. The goal is to use a static mesh and get the whole animation from the texture, so you don't need skeletal meshes for this.
I guess there is nothing built into Babylon for this, and it would need to be developed.
With a little work (creating one morph target for each vertex set) you could use morph targets: I create 3 spheres that will be used as morph targets (there's a convenient method, BABYLON.MorphTarget.FromMesh, to create a morph target from a mesh, but you can also create one from a list of vertex positions). Then I switch from one target to the other by updating the influence property of each target.
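In code, that looks something like this (a minimal sketch; the three spheres are just placeholders with identical vertex counts):

```ts
// One morph target per vertex set, animated by giving full
// influence to a single target at a time.
const base = BABYLON.MeshBuilder.CreateSphere("base", { diameter: 1 }, scene);
const manager = new BABYLON.MorphTargetManager(scene);
base.morphTargetManager = manager;

// Three deformed spheres standing in for three baked "frames".
const targets = [0.5, 1.5, 2].map((d, i) => {
    const m = BABYLON.MeshBuilder.CreateSphere("s" + i, { diameter: d }, scene);
    const target = BABYLON.MorphTarget.FromMesh(m, "frame" + i);
    manager.addTarget(target);
    m.dispose(); // the target keeps its own copy of the vertex data
    return target;
});

// Advance one baked "frame" per render tick (just for the demo).
let frame = 0;
scene.onBeforeRenderObservable.add(() => {
    targets.forEach((t, i) => (t.influence = i === frame ? 1 : 0));
    frame = (frame + 1) % targets.length;
});
```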
On the shader side, it uses texture fetches to update the position (and the normal/UV in my example).
Mmmm, yes, but wouldn't one image (or two, if I have one for the normals) weigh less than blendshapes for every frame?
For a 1-minute animation I would need 1800 blendshapes (60 s × 30 fps), or a texture of (number of vertices) × 1800 texels.
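To put numbers on it (my own back-of-envelope, assuming a 10,000-vertex face and RGBA half-float texels, neither of which comes from anything official):

```ts
// Rough size of the baked textures for a 1-minute animation.
const vertices = 10_000;  // assumed mesh density
const frames = 60 * 30;   // 1 minute at 30 fps = 1800
const bytesPerTexel = 8;  // RGBA16F
const textures = 2;       // positions + normals
const bytes = vertices * frames * bytesPerTexel * textures;
console.log((bytes / 1024 / 1024).toFixed(0) + " MB"); // ≈ 275 MB
```

So the baked frame rate and the vertex count are the obvious levers if that gets too heavy for mobile.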
On the other hand, the Unreal MAXScript creates blendshapes (morphers) for every frame before generating the texture. So what I need to find is a way to export that data and use it properly with Babylon.js.
Indeed, that would create a lot of morph targets, and we currently can't dispose of the vertex (normal, UV, …) arrays stored in each target, even once the data has been copied to a texture and is no longer needed (we have a mode in the MorphTargetManager that does not use textures to send data to the shaders, so we need to keep those arrays alive for that case).
Your use case is simpler than what the morph target manager handles: the manager allows multiple targets to influence a vertex, whereas in your case there are no influences to blend; each frame simply uses a new vertex set. So it would probably be best to make a specific implementation for that.
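For instance (a hypothetical sketch, not an existing Babylon.js feature): pack one vertex set per row of a float texture and let a custom shader pick the row for the current frame, with no influences and no CPU-side arrays kept alive:

```ts
// Hypothetical helper: width = vertex count, height = frame count,
// one RGBA texel per vertex, one row per baked frame.
function bakePositionsTexture(
    scene: BABYLON.Scene,
    framePositions: Float32Array[], // x,y,z per vertex, one array per frame
    vertexCount: number
): BABYLON.RawTexture {
    const frameCount = framePositions.length;
    const data = new Float32Array(vertexCount * 4 * frameCount);
    framePositions.forEach((positions, frame) => {
        for (let v = 0; v < vertexCount; v++) {
            const src = v * 3;
            const dst = (frame * vertexCount + v) * 4;
            data[dst + 0] = positions[src + 0];
            data[dst + 1] = positions[src + 1];
            data[dst + 2] = positions[src + 2];
            data[dst + 3] = 1;
        }
    });
    return new BABYLON.RawTexture(
        data,
        vertexCount,  // width: one texel per vertex
        frameCount,   // height: one row per frame
        BABYLON.Engine.TEXTUREFORMAT_RGBA,
        scene,
        false,        // no mipmaps
        false,        // no invertY
        BABYLON.Texture.NEAREST_SAMPLINGMODE,
        BABYLON.Engine.TEXTURETYPE_FLOAT
    );
}
```

The vertex shader would then fetch its position using the vertex index and the current frame number, the same kind of texture fetch the morph target path already does.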