Animating a face in VR

Hi,

I would like to develop a WebVR Babylon application where a user in VR would be able to have a conversation with an avatar showing emotions.

There are a number of Daz3D characters that I can import into Unity, including their blendshapes (or morphs, as they are called in DAZ). It looks like Babylon.js is able to support morphs, but I read that there is a limitation of 16 attributes per mesh, which would apparently limit the number of emotions to 4. (ref: Use Morph targets - Babylon.js Documentation, bottom section on limitations)

I am curious to better understand this limitation, as there are many more emotion targets I would like to support. Would there be a way to increase the limit?

Thanks

Yep, it looks like that is the case, but if you only use the position attribute (avoiding normals + tangents + uvs), each target takes up 1 of the 16 slots instead of 4. One off-the-top-of-my-head workaround is to have multiple meshes that can blend between everything to fill in the gaps, but that seems a little bit hacky to me. Maybe @PatrickRyan would have a different suggestion.
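Roughly, an untested sketch of position-only targets with the standard MorphTargetManager API (smilePositions / frownPositions are placeholders for whatever deformed position arrays your exporter produces):

```javascript
// Position-only morph targets: by never calling setNormals / setUVs / setTangents,
// each target should only consume a single vertex attribute slot.
var manager = new BABYLON.MorphTargetManager(scene);
mesh.morphTargetManager = manager;

function addPositionOnlyTarget(name, deformedPositions) {
    var target = new BABYLON.MorphTarget(name, 0); // influence starts at 0
    target.setPositions(deformedPositions);        // positions only
    manager.addTarget(target);
    return target;
}

// smilePositions / frownPositions: Float32Arrays of the deformed vertex positions
var smile = addPositionOnlyTarget("smile", smilePositions);
var frown = addPositionOnlyTarget("frown", frownPositions);

// Blend emotions by driving influences between 0 and 1.
smile.influence = 0.7;
frown.influence = 0.1;
```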


I use my own CPU-based morphing and haven't hit a limit yet, though it is more difficult.

Staying with the built-in morph targets, I see no need for tangents, so those could be skipped. You could also swap targets out with removeTarget().
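A rough, untested sketch of what that swapping could look like (smile, frown, blinkLeft, blinkRight are placeholders for targets built as in the snippet above):

```javascript
// Keep the full library of targets in a plain array and only register the few
// that the current expression needs, to stay under the attribute budget.
var allTargets = [smile, frown, blinkLeft, blinkRight /* , ... */];

function activateTargets(manager, wanted) {
    // Remove everything currently registered...
    for (var i = manager.numTargets - 1; i >= 0; i--) {
        manager.removeTarget(manager.getTarget(i));
    }
    // ...then add back only what is needed right now.
    wanted.forEach(function (t) { manager.addTarget(t); });
}

activateTargets(manager, [smile, blinkLeft]);
```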


Also no need for UVs.


@JCPalmer: Is your tool available publicly?

@trevordev is right here. You must stick with the WebGL limitation of 16 attributes max, but if you do not need normals or UVs then you can have more “emotions”.
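If you want to check the actual limit your hardware reports, something like this should work (untested sketch; engine is assumed to be your Babylon Engine instance):

```javascript
// WebGL only guarantees 16 vertex attributes; most GPUs report exactly that.
console.log("Babylon caps:", engine.getCaps().maxVertexAttribs);

// The same value straight from a raw WebGL context:
var gl = document.createElement("canvas").getContext("webgl");
console.log("MAX_VERTEX_ATTRIBS:", gl.getParameter(gl.MAX_VERTEX_ATTRIBS));
```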

Thank you guys very much for your insights! I understand the idea of not using UVs or tangents to increase the number of possible emotions. The issue is that I am hoping to match the number of blendshapes offered by ARKit, which is about 50, so this idea still falls short of the goal. I could focus on only a few blendshapes, but the result is likely to feel unnatural.

@JCPalmer, I would be curious to learn more about your approach. Not sure I am following what you mean by CPU based morphing. Does that mean you do procedural mesh deformations? That approach would not work in this case since each emotion is a custom deformation.

@Deltakosh, the short answer is no. The longer answer is that it depends on Blender and my JavaScript exporter, which generates subclasses of classes in my QueuedInterpolation animation library. It is a very integrated workflow, something I am really good at. I am not a big fan of the “throw dead cats over the wall” or glue-job approach.

There are old copies of this stuff in Extensions, but I have moved way beyond them. I cannot really support others, since I am willing to drop compatibility at any time, which I do whenever I notice something works better in a different part of the workflow, for example.

@redoracle75, no, I do not do procedural deformations, but I do perform compound / derived deformations. An example is this old test scene, where I developed emotions / visemes. All the sliders down the left side control a different morph target, which is exported out of Blender. The dropdown holds all the currently defined emotions / visemes saved in the library. You can check Expression Dev. and work on new ones.

You can do much of this in the BJS implementation as well. You can have, say, a key for mouth open, one for mouth wide, two for the eyes, etc. You can blend them at different degrees just the same as I do; it will just happen on the GPU, on the fly.

Another thing I do is have morph target groups, so that on a single mesh I can have independent morphing of different parts. It saves a lot of memory as well, since each key is not the whole mesh. Here is a very recent scene, where I have Face, Eyes, Left Hand, Right Hand, & Breasts (females only) morph groups.
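In rough Playground-style form, the general CPU-side pattern boils down to something like this (an untested sketch of the idea only, not the actual QueuedInterpolation code; the *Deltas arrays are placeholders exported from your modelling tool):

```javascript
// CPU-based morphing: blend per-key position deltas on the CPU, then push the
// result back into the mesh's position buffer.
mesh.markVerticesDataAsUpdatable(BABYLON.VertexBuffer.PositionKind, true);
var basePositions = mesh.getVerticesData(BABYLON.VertexBuffer.PositionKind).slice();

// Each key: a Float32Array of deltas (same length as basePositions) plus a weight.
var keys = {
    mouthOpen: { deltas: mouthOpenDeltas, weight: 0 },
    mouthWide: { deltas: mouthWideDeltas, weight: 0 }
};

function applyMorphs() {
    var result = basePositions.slice();
    for (var name in keys) {
        var key = keys[name];
        if (key.weight === 0) continue;
        // A morph group would simply restrict this loop to the vertex range of that group.
        for (var i = 0; i < result.length; i++) {
            result[i] += key.deltas[i] * key.weight;
        }
    }
    mesh.updateVerticesData(BABYLON.VertexBuffer.PositionKind, result);
}

// A compound expression is just several keys blended at different degrees.
keys.mouthOpen.weight = 0.6;
keys.mouthWide.weight = 0.3;
applyMorphs();
```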

@JCPalmer, thank you for sharing more details about your approach. It makes sense to me and it is encouraging to see that there are multiple ways to achieve avatar emotions in WebGL.


Same here, I would be interested to know ways to do it. Small PG examples would be nice too.