I hope you’re all doing well.
I created a sweater model in Marvelous Designer, applied an animation preset, and exported it as an FBX file. I imported the file into 3ds Max and the animation was working fine. Then I tried to export it as a GLB using the Babylon plug-in, but it didn’t work: it only shows the static model, not the animation.
Do you guys have any clue about what is wrong?
Hello, not without a repro, but I bet the animations are either not supported or not exported?
Did you check our doc on that topic?
How to export 3DS MAX scene as glTF | Babylon.js Documentation
The animation is working in 3ds Max.
When I animate the model in 3ds Max and export using Babylon, it works fine. But an FBX that I exported from Marvelous Designer doesn’t work for some reason.
Yes, but unfortunately this is beyond our control. You may have to look into how different the animations are when loaded from your FBX.
I ran a test using a character animation that I downloaded from Mixamo. It is an FBX sequence too. I imported the FBX file into 3ds Max. This FBX sequence has a lot of keyframes; the one that I imported from Marvelous Designer has none. The one from Mixamo worked fine: I was able to create an animation in a GLB file. The log showed me a lot of errors, but in the end the animation worked fine. So, I took a look at the documentation and I was not able to find anything that could help me. My animation works in the 3ds Max viewport, but the GLB file created using the Babylon plug-in is static. I don’t know what to do. It is not a simple animation that I can replicate using keyframes; it is a simulation. So, my best option (only option, actually) is to use Marvelous Designer. But for some reason, the FBX file that I get from Marvelous, which works normally in 3ds Max, results in a static mesh when I export it using the plug-in.
Guys… I tried Alembic and the result is the same. I created a simple sim in Houdini and exported it as an Alembic file. I opened the Alembic file in 3ds Max; the animation runs fine in the 3ds Max viewport. But when I try to export using the Babylon plug-in, the result is just a static frame. I mainly use the plug-in to export animations, 'cause static meshes I can export directly from Substance. So, this is frustrating.
I understand it can be frustrating, but not all types of animation are supported. Let me add @Guillaume_Pelletier and @PatrickRyan, who might have an idea of why it is not supported in your case.
In the meantime, could you share an asset that reproduces the behavior?
@Lucio_Freitas, the issue you are seeing here is that Marvelous Designer runs a simulation on each vertex, so what you are seeing in the Max viewport is that each vertex has a new Vector3 position stored for every frame. As you can imagine, this is a TON of data to run the simulation.
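To give a sense of scale, here is a quick back-of-envelope estimate; the vertex count, frame rate, and duration are hypothetical numbers, not taken from the actual asset:

```python
# Rough size estimate for a per-vertex cached simulation. All figures here
# are assumptions for illustration, not measurements from the sweater asset.
vertices = 20_000          # assumed vertex count of the cloth mesh
floats_per_vertex = 3      # x, y, z position per vertex
bytes_per_float = 4        # 32-bit float
fps = 30                   # cached simulation frames per second
seconds = 10               # length of the animation

total_bytes = vertices * floats_per_vertex * bytes_per_float * fps * seconds
print(f"{total_bytes / 1_000_000:.0f} MB of raw position data")  # → 72 MB
```

Compare that with a skinned animation, which stores a handful of bone transforms per frame instead of 20,000 positions.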
What is supported by glTF for animations is skinned animation (a mesh skinned to a skeleton), node animation (an entire mesh object with animations on translation, rotation, or scale), and morph target animation (a mesh that has one or more targets deformed from the base mesh, so that it only holds data for the start and end positions of each morph target, allowing an interpolation between them). No other animation type is supported by glTF at the moment.
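As a debugging aid, here is a minimal sketch (Python standard library only; `sweater.glb` is a placeholder path) that reads the JSON chunk of a .glb and reports whether the exporter wrote any animations at all:

```python
import json
import struct

def glb_animation_summary(path):
    """Read the JSON chunk of a binary glTF file and list its animations."""
    with open(path, "rb") as f:
        # GLB header: magic 'glTF', version, total length (all little-endian).
        magic, version, length = struct.unpack("<4sII", f.read(12))
        assert magic == b"glTF", "not a binary glTF file"
        # First chunk must be the JSON chunk (type 0x4E4F534A == 'JSON').
        chunk_len, chunk_type = struct.unpack("<II", f.read(8))
        assert chunk_type == 0x4E4F534A, "first chunk must be JSON"
        gltf = json.loads(f.read(chunk_len))
    anims = gltf.get("animations", [])
    if not anims:
        return "no animations in this file"
    return [(a.get("name", f"anim_{i}"), len(a.get("channels", [])))
            for i, a in enumerate(anims)]

# e.g. glb_animation_summary("sweater.glb")
```

If this returns "no animations in this file" for your export, the simulation data never made it past the exporter, which would match what you are seeing in the viewer.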
The reason for limiting the types of animation in glTF is that the goal was a runtime file that was as small as possible. Node, skin, and morph animations are fairly small because they limit the amount of data needed to displace vertices in your mesh. This is why the Mixamo model works and your simulation does not. The Mixamo model has a mesh skinned to a skeleton, which holds rotation and some translation data. We only hold that data per bone, and the skin is just a per-vertex value that shows an influence between the bones that affect its position. This is basically a texture lookup for the mesh, so none of that data is changing per vertex, per frame.
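The skinning math itself shows why the data stays small. A deliberately simplified sketch (translation-only "bones" instead of the full 4×4 bone matrices real skinning uses, with made-up numbers): per frame, only the bone data changes, while the per-vertex weights are authored once and never animate.

```python
# Minimal linear-blend-skinning sketch. Real engines use full 4x4 bone
# matrices; translation-only bones keep the arithmetic readable here.
def skin_vertex(rest_pos, influences, bone_offsets):
    """Blend per-bone offsets using the vertex's fixed skin weights."""
    x, y, z = rest_pos
    out = [0.0, 0.0, 0.0]
    for bone_index, weight in influences:   # weights sum to 1.0
        ox, oy, oz = bone_offsets[bone_index]
        out[0] += weight * (x + ox)
        out[1] += weight * (y + oy)
        out[2] += weight * (z + oz)
    return tuple(out)

# One animation frame stores a new offset per BONE, not per vertex:
bone_offsets = [(0.0, 1.0, 0.0), (0.0, 0.0, 0.0)]  # 2 bones this frame
influences = [(0, 0.75), (1, 0.25)]                # static per-vertex weights
print(skin_vertex((1.0, 0.0, 0.0), influences, bone_offsets))
# → (1.0, 0.75, 0.0)
```

A vertex-cache simulation, by contrast, replaces `rest_pos` itself with a new value every frame, which is exactly the data glTF has no channel for.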
Right now, we don’t support any type of mesh streaming, which is what would have to happen with a mesh simulation like that. The best thing to do with Marvelous Designer would be to simulate your cloth over your model, but then bake that down as a static mesh with folds and volume mapping the simulation on your base model. Then skin that static mesh to a skeleton so that you can deform that mesh with your character. But you would lose any kind of flow of your cloth over your base mesh that is derived from the simulation. You could also make use of morph targets for a little motion in your cloth and blend skinned and morph target animations, but it won’t get near the quality of a full simulation.
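For the morph target route, the per-vertex interpolation boils down to `base + Σ weight × (target − base)`. A tiny sketch with made-up fold targets (the target names and numbers are purely illustrative):

```python
# Morph-target interpolation sketch: each target stores only its deformed
# positions once, and a single animated weight per target moves the mesh.
def morph_vertex(base, targets, weights):
    """base + sum(weight * (target - base)), per component."""
    return tuple(
        b + sum(w * (t[i] - b) for t, w in zip(targets, weights))
        for i, b in enumerate(base)
    )

base = (0.0, 0.0, 0.0)
fold_open = (0.0, 0.2, 0.0)   # hypothetical "cloth fold" target position
fold_side = (0.1, 0.0, 0.0)   # hypothetical sideways drape target
print(morph_vertex(base, [fold_open, fold_side], [0.5, 1.0]))
```

Animating just the two weights over time gives a little cloth motion for the cost of two extra copies of the mesh, rather than one copy per frame.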
As with all real-time engines, you can do some simple physics-based cloth simulations for something like a cape or scarf that can be represented with a small number of vertices, but that simulation can get taxing on low-end devices.
Sorry there isn’t a better answer here, but glTF doesn’t support mesh streaming, and while we are interested in that for Babylon, it is still in the idea phase on our team, as there are several pipeline concerns to work through first.
So, I think the same applies to Houdini.
Thanks anyway, @PatrickRyan.
@Lucio_Freitas, yes, any simulation from any DCC package will hit the same wall. Even if you are trying to use something like a bend deformer in Max, those will not bake out to glTF because it is a simulation in the DCC tool. You need to bake any simulation down to the mesh to be able to export it out to glTF.
Thanks, man! I really appreciate it.