I have a human figure mesh that I need to animate with a stream of data.
The issue is that the data I am receiving is around 16 Vector3s per frame, each giving the position in space of a “joint” or point on the mesh, not a transformation matrix defining the location, rotation and scale of a bone, which is how Babylon Animations are set up.
For example, there is a point on the nose, connected to points on each of the ears; the positions of all three points combined presumably determine the rotation and position (and scale…) of the head, but I don’t have the information to transform a bone (or rather, to set its transformation) directly. Similarly, the next point connects to both hips separately, rather than to a central spine.
So I’m looking for advice on how to handle this. I need to create the mesh and skeleton so they can receive this data and react to it, i.e. weight-paint the mesh and so on.
Some more specific questions I have:
- Can I compute a bone’s full transformation matrix in Babylon given just its start and end points?
- Do I even need to do this, or can I treat the points individually, letting each one pull the mesh around in space according to the weight painting?
- If these conversions into matrices are needed, would they be very heavy performance-wise when run on every frame?
- Can I even animate this way, streaming movement data to a skeleton, i.e. without using Babylon’s Animations?
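To make the first question concrete, here is a rough sketch of the kind of per-frame math I’m imagining (plain arrays rather than Babylon’s `Vector3`/`Matrix`, just to illustrate). Two points only pin down the bone’s direction, so a reference `up` vector is assumed to resolve the roll; the convention that a bone’s rest pose points along +Y is also an assumption and may differ per engine/rig.

```javascript
// Basic vector helpers (stand-ins for Babylon's Vector3 methods).
function sub(a, b) { return [a[0] - b[0], a[1] - b[1], a[2] - b[2]]; }
function cross(a, b) {
  return [a[1] * b[2] - a[2] * b[1],
          a[2] * b[0] - a[0] * b[2],
          a[0] * b[1] - a[1] * b[0]];
}
function normalize(v) {
  const len = Math.hypot(v[0], v[1], v[2]);
  return [v[0] / len, v[1] / len, v[2] / len];
}

// Build an orthonormal basis whose Y axis runs from `start` to `end`.
// `up` is an arbitrary reference that fixes the roll, which two points
// alone cannot determine; it must not be parallel to the bone direction.
function boneBasisFromPoints(start, end, up = [0, 0, 1]) {
  const y = normalize(sub(end, start)); // bone direction
  const x = normalize(cross(y, up));    // perpendicular to bone and up
  const z = cross(x, y);                // completes the basis (x × y = z)
  return [x, y, z];
}
```

The three axes would then become the rotation part of the bone’s matrix, with `start` as the translation and the distance between the points as a Y scale if the rig allows stretching. Whether this is cheap enough to run on ~16 bones every frame is part of what I’m asking.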