Scene server-side compression or not!

Hi all,

First of all: great community, helpful stuff, and I’ve learned a lot.

I’m almost done with the functional aspects of my demo/project and now I’m working on the non-functional part before going to production.

So I’ve been looking into server-side compression for scene files. I haven’t made it work on CloudFront yet, but in the meantime I ran some tests. Here are my results:

I uploaded my scene.babylon (JSON) file, 50 MB, and it takes about 45 seconds to download along with everything else in my scene, and it works OK.

Now, as I haven’t managed to get server-side compression working for the .babylon file yet, I decided to re-upload a manually gzipped version, set the Content-Encoding: gzip metadata, and then viewed the new file in Chrome again.

Well, yes, the file is now 8 MB, but the overhead of gunzipping locally seems to take longer, so the total time of download + gunzip is ~60 seconds!
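
(For anyone wanting to reproduce this kind of measurement, here is a minimal, untested sketch of timing the load in the browser; the bucket URL and file name are placeholders, not my actual setup.)

```ts
import { Engine, Scene, SceneLoader } from "@babylonjs/core";

// Untested sketch: time the scene load end to end. Because the browser
// gunzips a Content-Encoding: gzip response transparently, this covers
// download + decompression + parsing in one number.
async function timeSceneLoad(canvas: HTMLCanvasElement): Promise<Scene> {
  const engine = new Engine(canvas, true);
  const scene = new Scene(engine);

  const start = performance.now();
  // Placeholder bucket URL and file name.
  await SceneLoader.AppendAsync("https://my-bucket.s3.amazonaws.com/", "scene.babylon", scene);
  console.log(`scene load took ${((performance.now() - start) / 1000).toFixed(1)} s`);

  return scene;
}
```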

So it seems that, after all, while gzip saves network costs, it doesn’t help (and in this case harms) the total load time of the page.

Happy to learn otherwise!
PS> all HTML/CSS/JS files are compressed on the server side, and I believe that, given their small size, gzip works better for them.

I am wondering if, in your case, a format like glTF with Draco compression could help with overall latency, as several requests would be done in parallel, and if you have HTTP/2 support on your server this might help parallelize loading? Also, maybe you could split off some parts of the scene so you have several .babylon and .gltf files???
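
Something along these lines, as a rough untested sketch (the root URL and file names are made up, and it assumes the scene has already been split into parts):

```ts
import { Scene, SceneLoader } from "@babylonjs/core";
import "@babylonjs/loaders"; // registers the glTF loader (needed for the .gltf parts)

// Untested sketch: load several scene parts in parallel. With HTTP/2 the
// requests share one connection, so the downloads overlap instead of queueing.
async function loadSceneParts(scene: Scene): Promise<void> {
  const rootUrl = "https://assets.example.com/"; // placeholder URL
  const parts = ["terrain.babylon", "props.gltf", "characters.gltf"]; // made-up file names

  await Promise.all(
    parts.map((file) => SceneLoader.AppendAsync(rootUrl, file, scene))
  );
}
```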

Yeah, I am kinda okay with the overall scene loading time being a minute tops, end to end.
I was just fiddling with server-side compression and came to this realization (happy to be proven wrong).

Yes, I have about 5-6 other external .babylon files that are already loading in parallel too, but my main scene was mostly created in the editor… I can definitely split it further from the editor, but I’m too lazy now :slight_smile:

1 Like

OK, so I got to the bottom of it. Sharing it back to the community here.

I’m using Amazon S3 public website hosting for my scene. The trick for me was to:
1- gzip the file using 7z in ‘Fastest’ compression mode.
2- rename the resulting file from ‘scene.babylon.gz’ to ‘scene.babylon’. Obviously the file is still gzipped ===> this one was the game changer for me; it’s silly, but I overlooked it earlier. My code still references ‘scene.babylon’ as usual, and the browser, as most of us know, will detect the Content-Encoding header and gunzip the file.
3- upload to S3 and set Content-Encoding: gzip on the object.

The file is now 11 MB, down from a horrible 55 MB scene; it downloaded and decompressed super fast and worked like a charm.
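
If you’d rather script it than do it by hand, a rough equivalent with the AWS SDK for JavaScript v3 could look like this (untested sketch; the bucket name and region are placeholders):

```ts
import { readFileSync } from "node:fs";
import { gzipSync, constants } from "node:zlib";
import { S3Client, PutObjectCommand } from "@aws-sdk/client-s3";

// Untested sketch: gzip the scene and upload it under its ORIGINAL name,
// with Content-Encoding: gzip so browsers decompress it transparently.
async function uploadGzippedScene(): Promise<void> {
  const s3 = new S3Client({ region: "us-east-1" }); // placeholder region

  // Step 1: compress (Z_BEST_SPEED ≈ the ‘Fastest’ mode used in 7z).
  const body = gzipSync(readFileSync("scene.babylon"), {
    level: constants.Z_BEST_SPEED,
  });

  // Steps 2 + 3: keep the original key name and set the encoding metadata.
  await s3.send(
    new PutObjectCommand({
      Bucket: "my-scene-bucket", // placeholder bucket name
      Key: "scene.babylon",      // not ‘scene.babylon.gz’ — code keeps referencing it as usual
      Body: body,
      ContentType: "application/json",
      ContentEncoding: "gzip",   // the crucial metadata
    })
  );
}

uploadGzippedScene().catch(console.error);
```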

5 Likes

Thanks a lot for sharing the solution :slight_smile:

In case anyone is digging into this sort of question like I am:

  • @7aw’s tip is in line with the info I had found in this article from David Rousset.
  • and here is an automation script for Azure which seems directly inspired by the previous source. (Not tested yet.)

Interesting tool; unfortunately, I don’t know how to use it. A tutorial or something would help a lot. :star_struck:

I’ll be sharing an article soon with all the steps and code on AWS.

1 Like

I’m waiting for it :raised_hands:

There you go!
Enjoy and let me know if you have questions.

https://aws.amazon.com/blogs/networking-and-content-delivery/serving-compressed-webgl-websites-using-amazon-cloudfront-amazon-s3-and-aws-lambda/

4 Likes

Awesome article, thanks for sharing!

1 Like

Here is the LinkedIn post :slight_smile:

1 Like