Should I use Draco to compress the glb files exported by babylonjs.serializers.min.js?

Hello friends, I've run into some problems when handling glb models.
I exported some glb files from OBJ models with babylonjs.serializers.min.js, like these:


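For reference, the export step was done roughly like this; a minimal sketch, assuming a browser page that includes babylon.js and babylonjs.serializers.min.js and a scene that already contains the meshes imported from the OBJ file (the file name is just one of the tiles above):

```
// Serialize the current scene to a .glb and trigger a browser download.
// Assumes `scene` already holds the meshes imported from the OBJ model.
BABYLON.GLTF2Export.GLBAsync(scene, "Tile_+002_+002_L23_00113100").then((glb) => {
    glb.downloadFiles(); // saves Tile_+002_+002_L23_00113100.glb
});
```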
Then, I use this command to compress them:
gltf-pipeline -i Tile_+002_+002_L23_00113100.glb -o Tile_+002_+002_L23_00113100-draco.glb -d --draco.compressionLevel 10
The result is:

I found that only about ten percent of the size was removed. That is a much lower compression ratio than the examples on the Draco official website! How can I find out what went wrong?
Also, if I can't reduce the model size any further, should I use the Draco glb (still about 90% of the original size) even though it adds a decoding step at load time, or should I just use the original glb exported by Babylon.js?
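(For the decode side: as far as I understand, the Babylon.js glTF loader handles KHR_draco_mesh_compression itself through BABYLON.DracoCompression, fetching the decoder from the Babylon CDN by default; the URLs can be overridden to self-host it. A rough sketch, where the local decoder paths are just placeholders, assuming the glTF loader (babylonjs.loaders) is registered:)

```
// Optional: point the Draco decoder at self-hosted files instead of the
// default CDN URLs. These paths are placeholders, not real locations.
BABYLON.DracoCompression.Configuration = {
    decoder: {
        wasmUrl: "/libs/draco_wasm_wrapper_gltf.js",
        wasmBinaryUrl: "/libs/draco_decoder_gltf.wasm",
        fallbackUrl: "/libs/draco_decoder_gltf.js"
    }
};

// Loading the compressed tile: the loader decodes the Draco buffers itself,
// so the extra cost is paid once at load time.
BABYLON.SceneLoader.AppendAsync("", "Tile_+002_+002_L23_00113100-draco.glb", scene);
```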
The third question is: when I adjust the --draco.compressionLevel parameter from 0 to 10, the file size is not inversely proportional to the level; a glb compressed at a lower level can end up smaller than one compressed at a higher level. And the compression time is not proportional to the level either:


Why is that? And which compression level should I choose?
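(One way I can think of to choose is to just measure every level on a representative tile; a rough Node sketch, assuming gltf-pipeline's processGlb export and its dracoOptions.compressionLevel option:)

```
// Compress the same glb at every Draco level and print the resulting size
// and time, then pick the best trade-off by hand.
const fs = require("fs");
const { processGlb } = require("gltf-pipeline");

async function compareLevels(inputPath) {
    const glb = fs.readFileSync(inputPath);
    for (let level = 0; level <= 10; level++) {
        const start = Date.now();
        const results = await processGlb(glb, {
            dracoOptions: { compressionLevel: level }
        });
        const seconds = ((Date.now() - start) / 1000).toFixed(1);
        console.log(`level ${level}: ${results.glb.length} bytes, ${seconds} s`);
    }
}

compareLevels("Tile_+002_+002_L23_00113100.glb");
```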

cc @bghgary about draco compression


This isn’t really a Babylon.js question. I would suggest you ask this on the Khronos glTF forum.

That said, you might consider using gltfpack (from meshoptimizer) instead of Draco.


Thanks for your reply :+1:. I will try gltfpack to reduce the size.

This optimization tool is really effective!
After running “gltfpack -i Tile_+003_+002_L23_00112000.glb -o pack-tc.glb -c -tc”, the size dropped from 1236 KB to 183 KB!
