Is it possible to use non-relative shape keys (Blender) for morph target animations?

Hello guys,

I wonder if we can use shape keys (morph targets) based on evaluation time instead of relative ones. If I create a morph target animation in Blender using relative shape keys, the animation works fine in Babylon.js, but with non-relative shape keys the animation doesn’t work.

Can you give me a hint whether this is possible, or what I have to do to make it work? I export my files to glb, by the way.

Thanks

Pinging @PirateJC

Hey @samevision - If I’m understanding your question correctly, you have a Blender file with shape key animation. When you animate the shape keys in Blender and export them as glTF, the animations come in just fine, correct?

But you’re wondering if you can animate the shape keys (morph targets) directly in Babylon? The answer is yes, absolutely. It’s actually incredibly similar to how you do it in Blender: you just animate the influence of each shape key over time.
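To make that concrete, here is a minimal sketch (plain JavaScript, helper names are my own, not a Babylon API) of what that animation computes: a keyframed track of influence values that gets sampled each frame and written into the morph target.

```javascript
// Sample a keyframed influence track at time t (linear interpolation).
// keys: array of { frame, value }, sorted by frame.
function influenceAt(keys, t) {
  if (t <= keys[0].frame) return keys[0].value;
  if (t >= keys[keys.length - 1].frame) return keys[keys.length - 1].value;
  for (let i = 0; i < keys.length - 1; i++) {
    const a = keys[i], b = keys[i + 1];
    if (t >= a.frame && t <= b.frame) {
      const f = (t - a.frame) / (b.frame - a.frame);
      return a.value + (b.value - a.value) * f;
    }
  }
}

// A track that fades a shape key in and back out over 60 frames.
// In a Babylon render loop you would write the sampled value into the target,
// e.g. mesh.morphTargetManager.getTarget(0).influence = influenceAt(track, frame);
const track = [
  { frame: 0, value: 0 },
  { frame: 30, value: 1 },
  { frame: 60, value: 0 },
];
```

In practice you can let Babylon do the sampling for you with a `BABYLON.Animation` keyed on the target’s `influence` property; the sketch just spells out what that animation evaluates.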

This question is funny because I’m working on a new video series about this very subject right now. It will be releasing soon.

Here’s a playground and a simple asset with 1 morph target that’s animated in a loop through code.

Hope this answers your question!


Hey @PirateJC,

thank you for your response.

Unfortunately that isn’t my issue. I still want to export the shape keys from Blender and just use them in Babylon. In Blender there is a “Relative” checkbox in the shape keys panel. If this checkbox is checked, the animation works fine in Babylon. If you uncheck the box, you can choose different interpolations, e.g. linear (like in the video of my previous post). This animation doesn’t work in Babylon, but I would like to use these non-relative shape keys. Is this possible somehow?

While trying to make this work I experimented with morph targets in Babylon as well. My playgrounds looked similar to yours. Your upcoming video will be interesting for me too, but unfortunately it doesn’t solve the problem in this case.

Best

Ah I see. Interesting. Unfortunately I’m not super familiar with Relative shape keys.

I wonder if @JCPalmer might have any insight?

I do not do the glb exporter. The short answer is no, or it would have exported it. Currently, in the .babylon exporter, I only export the keys and no animation, so that is not an alternative.

Unless I am mistaken about what non-relative keys are, I am not even sure it can be done directly after load. Not to bore you, but I know that I can do it with my own Blender JS generator / animation system. I am not all that familiar with the morph manager in BJS, so @PirateJC, here is what he is really asking:

Can you go from one key / state to the next? In my work, I just tell it the key / state I wish to go to, whether that be an exported key or a composite of keys, and the system takes into account the state it is currently in and morphs from that state to the requested one directly. I am not even limited to putting all my keys in order, like Blender.

I thought if you had 3 keys like he does (Cube.22, Cube.21, & Cube.24), in the morph manager you can only specify what percentage you wish of each at any given time. You can make an infinite number of composite poses from that, but to go from key 2 to key 3 you would need an animation (or several) where the frames were like:

0, 1, 0
0, .9, .1
0, .8, .2
...
0, .1, .9
0, 0, 1

And I am not even sure this would look right in all spots in the intermediate frames.
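The frame table above can be generated mechanically. A small sketch (my own helper, not a Babylon API), producing one weight row per step while fading one target out and the next in:

```javascript
// Build crossfade frames from target fromIdx to target toIdx.
// Returns steps + 1 rows; each row is a full weight array for numKeys targets.
function crossfadeFrames(fromIdx, toIdx, numKeys, steps) {
  const frames = [];
  for (let s = 0; s <= steps; s++) {
    const w = new Array(numKeys).fill(0);
    const f = s / steps;
    w[fromIdx] = 1 - f; // outgoing key fades out
    w[toIdx] = f;       // incoming key fades in
    frames.push(w);
  }
  return frames;
}
```

With 3 keys, `crossfadeFrames(1, 2, 3, 10)` yields exactly the rows listed above, from `[0, 1, 0]` through `[0, .9, .1]` down to `[0, 0, 1]`.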

Non-relative shape keys are just absolute shape keys, so you morph from one mesh shape to another over a specific time. That’s very handy if you want to morph through multiple steps / mesh shapes linearly. All my attempts to achieve a smooth transition with relative shape keys failed.
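For what it’s worth, that absolute, time-based behaviour can be approximated on top of relative morph targets by mapping an evaluation time onto a crossfade between the two neighbouring keys. A sketch (the helper name is made up, not a Babylon API):

```javascript
// Map an absolute evaluation time t in [0, numKeys - 1] onto relative
// weights: the mesh blends linearly from key i to key i + 1 as t advances.
function absoluteToWeights(t, numKeys) {
  const i = Math.min(Math.max(Math.floor(t), 0), numKeys - 2);
  const f = t - i;
  const w = new Array(numKeys).fill(0);
  w[i] = 1 - f;
  w[i + 1] = f;
  return w;
}
```

Each frame, the returned array would be copied into the corresponding morph target influences, giving a linear walk through the key shapes in order.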

I realized that the Blender glTF exporter ignores all absolute shape keys and only exports the relative ones as animation. So it’s generally not possible with glb.

I was able to solve my problem with a different approach. Instead of normal animations I used bones which are parented to vertex groups. The really nice side effect is that I have just one mesh and one draw call with this method. Since I still want to give my parts different materials, I wonder how to access the vertex groups. No submeshes are created for the vertex groups, although they are definitely exported with this branch of the glTF exporter (GitHub - scurest/glTF-Blender-IO at custom-data). They are defined at meshes => primitives => attributes in my glTF.

Is there a way to create submeshes while importing the model?
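As a starting point for inspecting such a file: application-specific attributes in glTF 2.0 are the ones whose names start with an underscore. A sketch of pulling them out of the parsed JSON (the `_VERTEXGROUP` name and the sample data are made up for illustration; the fork may name its attribute differently):

```javascript
// Collect non-standard vertex attributes from every primitive of a glTF asset.
// Per the glTF 2.0 spec, custom attribute names must start with "_".
function customAttributes(gltf) {
  const found = [];
  for (const mesh of gltf.meshes || []) {
    for (const prim of mesh.primitives || []) {
      for (const name of Object.keys(prim.attributes || {})) {
        if (name.startsWith("_")) found.push(name);
      }
    }
  }
  return found;
}

// Minimal stand-in for a parsed .gltf file (hypothetical data):
const gltf = {
  meshes: [
    { primitives: [{ attributes: { POSITION: 0, NORMAL: 1, _VERTEXGROUP: 2 } }] }
  ]
};
```

The attribute’s accessor index then points at the per-vertex data, which is where the vertex group membership would live.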

@bghgary can probably guide us here.

I’m not sure I can. There are some terms being thrown around that I don’t know, which I think are maybe Blender terms? I don’t know Blender. For example, what are vertex groups?

Is there any way you can provide a playground or a model that we can look at?

No. Submeshes are sequentially & contiguously ordered. There is no requirement that a vertex group be either.

I am really curious to know how to handle this.

I created a mesh and exported it via the modified glTF exporter (GitHub - scurest/glTF-Blender-IO at custom-data). It’s just a cube with a vertex group of the four vertices at the top. @bghgary vertex groups are just groups of vertices :grinning: In Blender you can define them to easily access the exact same vertices, or, as in my case, to parent a bone to them to animate just those vertices. This works perfectly in Babylon, btw. Here you can download the mesh: https://raw.githubusercontent.com/samevision/static/master/vertex_groups.gltf

This issue on GitHub (Export Vertex Groups · Issue #1232 · KhronosGroup/glTF-Blender-IO · GitHub) relates to the same problem. The author was able to access the indices via “content.children[i].geometry.attributes” in three.js.

Knowing the indices of the vertex group would be a very good start. But the next question is how to apply a specific material to these vertices. MultiMaterial uses submeshes, but as you mentioned, @JCPalmer, submeshes are sequential, so I have to state the range of affected indices. This would lead to problems if some indices in between are not affected. I imagine the only solution right now is to create multiple submeshes like:

  • Material A: Index 0 - 10
  • Material B: Index 10 - 20
  • Material A: Index 20 - 30

Or is there a more efficient way to solve this problem? Are submeshes expensive?
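The run-splitting in the list above can be computed mechanically from a per-triangle material assignment. A sketch (my own helper names; each triangle contributes 3 indices):

```javascript
// Collapse consecutive triangles that share a material into contiguous
// index ranges, one range per future submesh.
function materialRuns(materialPerTriangle) {
  const runs = [];
  for (let t = 0; t < materialPerTriangle.length; t++) {
    const m = materialPerTriangle[t];
    const last = runs[runs.length - 1];
    if (last && last.material === m) {
      last.indexCount += 3; // extend the current run
    } else {
      runs.push({ material: m, indexStart: t * 3, indexCount: 3 });
    }
  }
  return runs;
}
```

Each run could then back a `BABYLON.SubMesh(materialIndex, verticesStart, verticesCount, indexStart, indexCount, mesh)`. Interleaved materials simply produce more runs, and therefore more submeshes and draw calls.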

I created a playground with the cube that includes the vertex group. I assume the right place would be “root.geometries[0]._vertexBuffers”, but it is not there. Can we modify it to show the vertex groups there? https://playground.babylonjs.com/#4HLCQK#42

EDIT:

I realized that creating a submesh with its own material doesn’t reduce draw calls. So it doesn’t matter if I have two meshes with one material each, or one mesh with two submeshes with one material each. Is that correct? If yes, what is the purpose of using MultiMaterial? If the draw calls are not reduced, I have much more freedom if I keep meshes separated by material.

First, this fork of the glTF exporter is 168 commits behind the official repo. Sooner or later, this is going to bite you.

2nd, if you wish to use bones, making sure each bone is weight painted right is all you need to do. If you have multiple materials on the mesh, it is going to get broken into multiple meshes on export, but both meshes will use the same skeleton, so they “move as one”. Think you might be making this harder than it needs to be.

Multi-materials are implemented very differently across platforms. This is probably why the glTF format does not handle them. There is even another restriction implied by having contiguous vertices in BJS’s submesh-based implementation: two materials cannot share the same vertex. In your cube example, you would have to double up the 4 verts on the top in order to have a different material on the sides. This is exactly what I do with the .babylon format exporter. Blender does not need to do this, which clearly demonstrates they approach multi-materials differently.
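That vertex doubling can be sketched as follows: rebuild the vertex buffer so that each material gets its own copy of any vertex it touches (plain JavaScript, my own names; positions only, but the same applies to normals and UVs):

```javascript
// Duplicate vertices shared between materials so every vertex belongs to
// exactly one material, as a submesh-based multi-material setup requires.
function splitSharedVertices(positions, indices, materialPerTriangle) {
  const outPositions = [];
  const outIndices = [];
  const seen = new Map(); // "material:originalVertex" -> new vertex index
  for (let t = 0; t < materialPerTriangle.length; t++) {
    for (let k = 0; k < 3; k++) {
      const v = indices[t * 3 + k];
      const key = materialPerTriangle[t] + ":" + v;
      let ni = seen.get(key);
      if (ni === undefined) {
        // First time this material touches this vertex: copy it.
        ni = outPositions.length / 3;
        outPositions.push(positions[3 * v], positions[3 * v + 1], positions[3 * v + 2]);
        seen.set(key, ni);
      }
      outIndices.push(ni);
    }
  }
  return { positions: outPositions, indices: outIndices };
}
```

Two triangles sharing an edge but using different materials end up with 6 vertices instead of 4, mirroring the doubled-up cube top described above.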

As for why Blender, BJS, & others allow multiple materials on the same mesh: it is not all about performance. Ease of access when there are fewer meshes is probably the primary reason. There are some savings, though. A world matrix only needs to be computed once for a mesh with multiple materials. That is just a cost of using that format.

Thanks for your advice!

I always thought multi-materials were also a method to increase performance - but good to know. Since that was the whole reason I wanted to use multi-materials, I will stick with separate meshes per material and group them under a TransformNode.
