I may have painted myself into a corner on something. I’ve spent some time setting up a system that manipulates the vertex points of a plane in real-time. It works great driven by code, but now I want to bake that animation and ultimately export it to glTF.
I found a demo posted on the forum in the past couple of days that shows, in an oversimplified way, what I’d like to do.
Basically, that demo moves the vertex points of a ground plane programmatically. I’d like to capture this movement as keyframes and then be able to export it. In my situation, I’ll have probably a dozen or so planes, each with its 4 corner points moving a few times per second, and I expect maybe a few minutes of animation per recording session. (Big picture: I’m doing 2D cutout-style character animation driven by motion capture data.)
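For context, here’s a minimal sketch of the capture step I have in mind, in plain JS. In the real scene the positions would come from `mesh.getVerticesData(BABYLON.VertexBuffer.PositionKind)` inside a render-loop observer; the `VertexRecorder` class and its names here are just mine, not Babylon API:

```javascript
// Minimal keyframe-recorder sketch. In the actual scene I'd call
// sampleFrame() from scene.onBeforeRenderObservable, passing in the
// flat position array returned by mesh.getVerticesData(...).
class VertexRecorder {
  constructor(fps = 60) {
    this.fps = fps;
    this.keys = []; // one entry per sampled frame
  }
  // timeSec: seconds since recording started
  // positions: flat [x0, y0, z0, x1, ...] array (4 vertices => length 12)
  sampleFrame(timeSec, positions) {
    this.keys.push({ frame: timeSec * this.fps, values: positions.slice() });
  }
}

const rec = new VertexRecorder();
rec.sampleFrame(0.0, [0, 0, 0,  1, 0, 0,  0, 0, 1,  1, 0, 1]);
rec.sampleFrame(0.5, [0, 0.2, 0,  1, 0, 0,  0, 0, 1,  1, 0, 1]);
console.log(rec.keys.length); // 2
console.log(rec.keys[1].frame); // 30
```

So the question is really what to do with `keys` afterward so the exporter understands it.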
I had assumed I could create a few dozen animation tracks targeting the vertex positions directly, but from what I’m reading, that may not be possible.
Can someone steer me in the right direction? Ultimately, I want Babylon.js to handle the real-time capture, programmatic playback, and editing, and then hand the result off to Blender for rendering.
If I had to guess based on what I’m reading, morph targets are what I should be looking into for this?
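To make the morph-target idea concrete, here’s a rough sketch (plain JS) of how I imagine baking captured poses into targets: one target per captured pose, with a stepped influence track switching each target on at its frame. In Babylon I’d presumably build a `MorphTargetManager` and animate each target’s `influence`; the function name `bakeToMorphTargets` and the data shape below are just mine:

```javascript
// Sketch: turn captured vertex poses into morph-target-style data.
// basePositions: flat xyz array for the rest pose.
// poses: array of { frame, values } captured during recording.
// Each pose becomes one target holding absolute positions; its
// influence steps to 1 at its own frame and back to 0 at the next.
function bakeToMorphTargets(basePositions, poses) {
  return poses.map((pose, i) => ({
    name: `pose_${i}`,
    positions: pose.values.slice(),
    influenceKeys: [
      { frame: pose.frame, value: 1 },
      ...(i + 1 < poses.length
        ? [{ frame: poses[i + 1].frame, value: 0 }]
        : []),
    ],
  }));
}

const targets = bakeToMorphTargets(
  [0, 0, 0,  1, 0, 0],
  [
    { frame: 0,  values: [0, 0, 0,  1, 0, 0] },
    { frame: 30, values: [0, 0.2, 0,  1, 0, 0] },
  ]
);
console.log(targets.length); // 2
console.log(targets[1].name); // "pose_1"
```

With a few minutes of recording that could mean a lot of targets per plane, though, which is part of why I’m asking whether this is the sane route.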
I guess my fallback option would be to construct an actual rigged skeleton, but that seems like overkill for something I don’t plan on editing or re-animating later.