KTX Test & Research

Hello BabylonJS fellows! I wish you all the best of health.

I'd like to share with you the results of our tests with KTX textures.
We ran several tests on several models. At first, we thought KTX would let us shrink our texture files, but it turns out this is not always the case. Here are the file sizes before and after the KTX transformation for the models we used:
DISHWASHER: 8.1 MB => 8.9 MB
DISHWASHER light: 3.2 MB => 5.2 MB
HEADSET: 24.6 MB => 14.7 MB
SCANWATCH: 123.4 MB => 101.5 MB

We used this command-line pipeline to compress the models, as advised by @bghgary in another thread:

gltf-transform weld scene.glb scene1.glb
gltf-transform draco scene1.glb scene2.glb
gltf-transform etc1s scene2.glb scene3.glb --slots "baseColorTexture" -v
gltf-transform uastc scene3.glb scene4.glb --zstd 22 --slots "!baseColorTexture" -v

We find that for models that are already lightweight, the file size tends to increase instead. In our use case, the models we work with will usually be fairly well designed and optimized in the first place, so we're fairly sure KTX will mostly make them heavier.

That said, as can be read in many forums, including this one, KTX brings big runtime improvements in GPU memory usage and caching, and thus in rendering quality and smoothness.

The question we're asking now is whether it's really worth using KTX textures, since for us and our client, load time is very important!

Have you had the same results and reflections about KTX on your side?
Any thought from the BabylonJS community will be appreciated :wink:

Adding @bghgary back to the thread :slight_smile: I guess it is always a tradeoff between load time and run time, unfortunately.


It's certainly a trade-off between download size, transcode speed, and GPU memory. KTX (version 2) will almost always yield a smaller texture on the GPU and will require some transcoding time on the CPU. For small/simple textures, it may end up being a bigger download. I don't have any production data about them, though.

Are the numbers listed gzipped?
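If you want to check quickly, something like this works (a sketch; the random placeholder file stands in for your real GLB — already-compressed payloads such as PNG/JPG/KTX behave similarly and barely shrink):

```shell
# Create a 1 MiB placeholder of incompressible (random) data, then gzip it.
# Swap in your real .glb to measure the actual savings.
head -c 1048576 /dev/urandom > sample.glb
gzip -kf -9 sample.glb        # -k keeps the original, -9 is max compression
wc -c sample.glb sample.glb.gz
```

With a real model, geometry and JSON usually compress noticeably, while texture-heavy GLBs hardly do.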


This is tied to the workflow, but I like to try to optimize the image's dimensions PRIOR to settling on the format.

  • Either create or obtain the texture data at large dimensions.
  • Create your export however you normally do it.
  • Open the scene in a browser tab.
  • Repeat, decreasing the dimensions and comparing browser tabs, until the drop-off in quality is unacceptable.

You may have already done this, but if not, it will get your size down wherever your textures are unnecessarily big.

When people make textures they tend to make them really big, which is good, since you cannot easily upscale images. When I get images from MakeHuman, eye textures are 1k x 1k, which is overkill. The tongue texture is also 1k x 1k.

This might be great for a high-detail still render from Blender, but I cannot tell much of a difference at 256 x 256. You can still apply any format / compression on top of this.
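To automate the "repeat at smaller dimensions" step, a small loop can generate all the candidates in one go (a sketch assuming ImageMagick's `convert` is installed; `texture.png` is a placeholder path):

```shell
# Generate progressively smaller copies of a texture for side-by-side comparison.
for size in 1024 512 256 128; do
  convert texture.png -resize "${size}x${size}" "texture_${size}.png"
done
```

Then load each variant in its own browser tab and stop at the smallest one that still looks acceptable.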


So you use UASTC compression for all textures except the base color one.
While UASTC is recommended for normal textures, one may save some space by compressing all the other textures (metallic/roughness, etc.) with ETC1S as well.
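In gltf-transform terms, that would look roughly like this (a sketch; file names are placeholders, and the slot patterns reuse the flags from the pipeline earlier in the thread):

```shell
# ETC1S (smaller) for everything except normal maps; UASTC (higher quality) for normals only.
gltf-transform etc1s scene.glb tmp.glb --slots "!normalTexture" -v
gltf-transform uastc tmp.glb out.glb --slots "normalTexture" --zstd 22 -v
```

Worth comparing the visual result against the all-UASTC version before committing to it.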


Thanks a lot for all your answers. This is awesome!

@bghgary the numbers are not gzipped. Actually, I wasn't aware we could gzip a glTF or GLB?

@JCPalmer the DISHWASHER light (3.2 MB => 5.2 MB) is a model where we used our custom texture compressor before converting to KTX. Unfortunately, it is also the example where the size increased the most.

@labris thanks we will try that!


You may try also some recommendations from here - 3D-Formats-Guidelines/KTXArtistGuide.md at main · KhronosGroup/3D-Formats-Guidelines · GitHub


This all depends on the server. Some (many?) servers configure compression based on the file extension. For best performance, you don't want to zip PNG / JPG / KTX content (since they are either already compressed or don't compress well), but you do want to zip geometry (i.e., mesh attributes), animation, and JSON content. This is why GLB supports referencing external files. The GLB can be configured to store the geometry, animation, and JSON in one request with gzip enabled. The remaining image assets can be served in separate requests without gzip, as compressing them again would be counter-productive. The image assets can also be downloaded in parallel to speed things up.

While a single self-contained GLB is convenient, it is not optimal for a server serving the files.
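One quick way to see what your server actually does per file type is to inspect the response headers (a sketch; the URL is a placeholder for your own asset):

```shell
# If the server compresses this response, a "Content-Encoding: gzip" header appears.
curl -sI -H "Accept-Encoding: gzip" "https://example.com/assets/scene.glb" \
  | grep -i "content-encoding"
```

Run it once against the GLB and once against a KTX/PNG asset to confirm only the former is compressed.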

Okay to try this, but I will caution that ETC1S correlates the different channels of a texture and thus usually produces artifacts for textures with uncorrelated channels (e.g., occlusion/roughness/metalness and normal map textures). There's a similar issue with using JPG for these types of textures. Make sure the result is good enough for your purposes.