Amazing tool to reduce the size of gltf files

Hi there.
I just found an amazing tool to compress gltf files using gltf transform.
You can compress a file using this tool: https://optimizeglb.com/
I have tested it on some of my files and achieved a 60-90% size reduction. It’s an amazing achievement. Some of my files went from 100 MB down to 3 MB.
The source code is here: GitHub - donmccurdy/glTF-Transform: glTF 2.0 SDK for JavaScript and TypeScript, on Web and Node.js.
Is there a way to implement this in the Babylon.js exporter?


You may also check the tool which I created recently - https://glb.babylonpress.org/
While Draco compression is not implemented there yet (it will be soon, hopefully), it has many more options and settings for optimizing geometry and textures, plus a viewer and pixel comparison.


Amazing!! Could you guide me on how to implement this tool? I would like to have it in the Babylon glTF exporter.

The optimized GLB is exported after all transforms directly from glTF-Transform as:

    const glb = await io.writeBinary(doc);
    // Then one may convert it to a URL
    const assetBlob = new Blob([glb]);
    const assetUrl = URL.createObjectURL(assetBlob);

The Babylon glTF exporter only performs a lossless, uncompressed export; it doesn’t apply any optimizations.

If you don’t mind, is it possible to show a simple example in a Playground, or would it require installing Node.js?

It is much easier to do in a repo.
I believe Playground integration is also possible, but awkward.
Could you tell me more exactly what you are trying to achieve and what the desired input/output variables are?

At https://3dhouseplanner.com/ I have a glTF exporter, but the downloaded files end up very large because of the PBR texture images, and because of heavy geometry when lots of objects are added to the scene.

The first thing you can do is convert all textures to WebP format (and perform any other optimization that glTF-Transform provides).

To catch the export and pass it to glTF-Transform for further processing:

    let options = {
        shouldExportNode: function (node) {
            return node !== camera1 && node !== hdrSkyBox; // just for example
        },
    };

    const exportScene = await GLTF2Export.GLBAsync(this._scene, "fileName", options);
    const blob = exportScene.glTFFiles["fileName" + ".glb"];

    const arr = new Uint8Array(await blob.arrayBuffer());
    const io = new WebIO().registerExtensions(ALL_EXTENSIONS);
    const doc = await io.readBinary(arr);

Then one may perform any optimizations from gltf-transform before the final export.

Great, and I also need to install all the dependencies from the library to make it work, right?
I don’t have experience with npm installs.
Where does your library come from?

Is it from the previous link I sent?

    import { Document, NodeIO } from '@gltf-transform/core';
    import { ALL_EXTENSIONS } from '@gltf-transform/extensions';
    import draco3d from 'draco3dgltf';
    import { resample, prune, dedup, draco, textureCompress } from '@gltf-transform/functions';
    import sharp from 'sharp';

    let options = {
        shouldExportNode: function (node) {
            return node !== camera1 && node !== hdrSkyBox; // just for example
        },
    };

    const exportScene = await GLTF2Export.GLBAsync(this._scene, "fileName", options);
    const blob = exportScene.glTFFiles["fileName" + ".glb"];

    const arr = new Uint8Array(await blob.arrayBuffer());
    const io = new WebIO().registerExtensions(ALL_EXTENSIONS);
    const doc = await io.readBinary(arr);
    const transf = await document.transform(
        resample(),
        prune(),
        dedup(),
        draco(),
        textureCompress({
            encoder: sharp,
            targetFormat: 'webp',
            resize: [1024, 2024],
        })
    );


If so, how do I export transf?

A. Change NodeIO to WebIO, otherwise it will not work in the browser (sharp is also useless there).
B. Don’t use Draco at first. It requires an additional, complicated setup with WASM files.
C. If there are no animations in your scene, you don’t need resample.
D. dedup should go before prune.
E. After the line const doc = await io.readBinary(arr) you may use:

    await doc.transform(
        dedup(),
        prune(),
        textureCompress({
            targetFormat: 'webp',
            resize: [1024, 2024],
        })
    );

    const glb = await io.writeBinary(doc);
    // Then one may convert it to a URL and trigger a download
    const assetBlob = new Blob([glb]);
    const assetUrl = URL.createObjectURL(assetBlob);
    const link = document.createElement("a");
    link.href = assetUrl;
    link.download = "SomeName" + ".glb";
    link.click();

Let me know if you have any questions.


Good afternoon, @labris.

Your tool is really terrific! Congratulations and thanks! I’m seeing a reduction of around 86% on my latest project assets, and that’s without Draco, which are very good numbers.

Anyway, in our usual workflow we work with GLBs with non-embedded textures (obtained with the handy gltf-pipeline), in order to have them browser-cacheable and also shareable between geometries. That way it is also easier to apply an encryption pass to the GLB.

Do you think it would be possible to have this feature, exporting GLBs without embedded textures, in a future release of your web tool?

Best regards.


Thank you for your kind words!

What kind of export format do you mean?
GLTF+bin+separate textures?
Or GLB cleaned from any texture references?

There should be no problem adding other export formats. A new panel with user settings will be added this month. Meanwhile, the next update (tomorrow) will bring EXT_mesh_gpu_instancing to the available functions. Since you mentioned that you use shareable geometries, you may consider using this feature in your workflow.
I’ll announce all functional changes in the main thread - GLB Optimizer for Geometry and Texture Conversion (WEBP and KTX2)

Draco compression will be implemented too, sooner or later :)
