Sandbox fails to play my glb with large numbers of morph targets

Hello!!!
I’m investigating a system to display transient animations of finite element meshes.
The mesh topology is invariant across all frames, but the position of each vertex changes every frame, as do the vertex colors (which represent stresses, strains, etc.) and the vertex normals.
I’m generating a .glb with a morph target for each animation frame and then using an identity-matrix keyframe output (frame i drives only morph target i) to animate. Currently I’m only animating deformations, with constant colors and normals.
This was working well until I increased the size of the animation (the Sandbox worked nicely at 1,000 frames and failed at 10,000 frames). The .glb size for 10,000 frames is ~500MB.
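For anyone who wants to poke at a smaller version of the file outside the Sandbox, a Playground-style sketch along these lines should work. It makes several assumptions (the .glb is hosted at a reachable URL, it contains a single mesh with a morphTargetManager, and there is one morph target per animation frame), and it drives the targets manually instead of playing the baked keyframes:

```js
const createScene = async function () {
    const scene = new BABYLON.Scene(engine);

    const camera = new BABYLON.ArcRotateCamera("cam", Math.PI / 2, Math.PI / 3, 10, BABYLON.Vector3.Zero(), scene);
    camera.attachControl(canvas, true);
    new BABYLON.HemisphericLight("light", new BABYLON.Vector3(0, 1, 0), scene);

    // "model.glb" and its URL are placeholders for the FE mesh export.
    await BABYLON.SceneLoader.AppendAsync("https://example.com/", "model.glb", scene);

    const mesh = scene.meshes.find((m) => m.morphTargetManager);
    const manager = mesh.morphTargetManager;

    // Drive the morph targets manually, one per rendered frame,
    // instead of playing the baked weight animation.
    let frame = 0;
    scene.onBeforeRenderObservable.add(() => {
        for (let i = 0; i < manager.numTargets; i++) {
            manager.getTarget(i).influence = i === frame ? 1 : 0;
        }
        frame = (frame + 1) % manager.numTargets;
    });

    return scene;
};
```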

I’m trying to figure out whether this type/size of model is too big for Babylon.js, whether there are size limits on the Sandbox, or something else. My end goal is to develop a viewer for finite element meshes and results data.

I don’t have a Playground for this since I’m using the Sandbox, but I can share the .glb if necessary, although it’s 500MB… I can also share much smaller .glbs with a lower number of frames that work quite nicely in the Sandbox.

Thanks in advance,

Doug

Welcome aboard!

What is the error you get? I think the file should still work, even if it is 500MB.

Are you able to share the file somewhere so that we can have a look?

Thank you!!!
I get an “Error code : Out of memory” error.
Chrome memory usage grows to around 5GB just before the error occurs. I have 64GB of installed memory and nothing much else running.

I’ve shared the file below, from my Google Drive. It compresses down to 42MB!!

The file you sent has a reference to an image that doesn’t exist, so some tools are failing on it. Once I removed the bad image reference, I was able to open it with https://gltf.report (three.js based), but the Babylon.js Sandbox still runs out of memory. We should definitely investigate why.

Thanks. I know about the bad image reference. I have two approaches for vertex colors in my test bed: one uses vertex color values and the other uses a texture, which is what the missing image file is for. The error has been there for a while, but it seemed benign so I never investigated; Babylon displays smaller versions of this model despite the same error.

Nonetheless, I’ll take a look at the image error.

May I ask if this problem has been solved? I’m running into a similar issue: for a model with a large number of morph target animations, Babylon.js uses much more GPU memory than three.js.

I think three.js also uses a 2D array texture to store the morph data, like Babylon.js, so it should not be different on the GPU side(?). Would you have a link to a three.js repro so we can compare?
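As a side note for the comparison, and treating the property names below as assumptions about recent Babylon.js versions rather than confirmed API, you can check on the Babylon.js side whether the morph data actually ends up in a texture:

```js
// Assumed API: MorphTargetManager.EnableTextureStorage (static) and
// isUsingTextureForTargets (getter) in recent Babylon.js versions.
const manager = mesh.morphTargetManager;
console.log("targets:", manager.numTargets);
console.log("stored in a texture:", manager.isUsingTextureForTargets);

// Forcing attribute-based storage (set before loading the model) would be
// one way to rule the texture path in or out when comparing with three.js:
// BABYLON.MorphTargetManager.EnableTextureStorage = false;
```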

The problem raised in this thread is about CPU memory, not GPU memory. It is not fixed yet, as it is related to how we store the animation data.
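To give a rough sense of why the CPU side can blow up, here is a back-of-the-envelope estimate. The numbers are assumptions based on the 10,000-frame case described above, not measurements from the file:

```js
// A glTF "weights" sampler stores one weight per morph target per keyframe.
const targets = 10000;      // one morph target per animation frame (assumed)
const keyframes = 10000;    // one keyframe per frame (assumed)
const bytesPerFloat = 4;

const rawBytes = targets * keyframes * bytesPerFloat;
console.log((rawBytes / 1024 ** 2).toFixed(0) + " MiB of raw weight data"); // ≈ 381 MiB

// If the runtime expands this into per-target keyframe objects
// (one JS object per target per keyframe), object overhead can easily
// multiply that figure several times, which would be consistent with the
// multi-GB CPU usage reported above.
```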

I’ve tested again and it seems that the instances cause the issue.
I was using the files at GitHub - Hypnosss/trashbin for testing.
1000.glb is the original model with 1000 morph target animations; 1000multi3.glb is the same model with some instances created in Blender (with Alt+D).

I just drop the models into https://gltf-viewer.donmccurdy.com/ and https://sandbox.babylonjs.com/.
GPU memory usage is basically the same until I drop 1000multi3.glb into the Sandbox.

I’m not sure how you check the GPU memory used by a process on Windows. In Task Manager, I can see the GPU time percentage but not the amount of memory used.