Just updated from 7.18.0 to 7.20.0, and I’m suddenly getting a bunch of errors in my Webpack build process. The issue seems to be that there’s a lot of shader-related code in chunks whose names are set to null for some reason.
I think they should be in my “common” chunk, as I use babylonjs across a few pages, but for some reason they seem to be getting split out into individual shader chunks.
I haven’t changed any code / configurations, and it works fine with 7.18.0, so it’s something to do with one of the updates since then.
Any ideas?
Edit: Just tested 7.19.1 and it’s broken there as well.
Here’s the full Webpack config just in case it’s of any use.
Edit 2: Updated my webpack config, which has got it building again by no longer relying on the chunk names being non-null, but it still means there are now an extra 18 JS files being built with the shader code…
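For reference, here’s a simplified sketch of the relevant change (the folder/naming scheme is illustrative, not my exact config):

```js
// webpack.config.js (excerpt) - don't assume chunk.name is set; the new
// shader chunks have name === null, so fall back to the chunk ID instead.
module.exports = {
  // ...rest of config unchanged
  output: {
    chunkFilename: (pathData) =>
      pathData.chunk.name
        ? `js/${pathData.chunk.name}.[contenthash].js`
        : `js/chunks/[id].[contenthash].js`,
  },
};
```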
The code is of course all minified, so it’s hard to read, but the culprit pieces of code start with e.g. backgroundPixelShaderWGSL, rgbdEncodePixelShaderWGSL, backgroundVertexShader, and pbrVertexShaderWGSL, so it’s definitely shader-related. There seem to be around 18 of these problematic classes / chunks being generated with a null name.
I’d appreciate any ideas from anyone - I’m of course happy to edit my webpack config a bit if necessary, but it is unfortunate for updates to break backwards compatibility without any explanation - I spent an hour thinking it must be something I’d changed in my own code before working out it was the Babylon.js update. Although I guess that will also teach me to be more careful updating packages!
Thank you, I appreciate your super fast replies! And it’s totally understandable that sometimes things need changing and updating - it was more the lack of warning I was mentioning. I didn’t mean to sound quite so accusatory though, sorry about that - I was just slightly stressed out as I have a meeting about this project tomorrow! (of course now that I realise that it’s due to the babylon update I can just downgrade back to 7.18.0 for now if I can’t get it working properly with the new version!)
Luckily I’ve now got it building again successfully (I changed my webpack config to not rely on the chunk name being non-null - I’ve added a link to that in the top post just in case it’s useful). The issue is just that I suddenly have an extra 18 JS files for those shader chunks, and having so many extra web requests for the user’s device to make will likely slow things down compared to having just the one before.
Let me know if I can give any more info to help debug this - I do appreciate your super fast support! And I’ll keep trying to see if I can work out where the issue is…
Having multiple chunks is important for tree shaking. You do not want to download shader code (which can be a lot of data) if you are not using those shaders.
Oh - my assumption was that the shader chunks getting individual JS files were shaders that would all be necessary, as they’re all still there after bundling with tree shaking. But I’ve just done some testing, and indeed, as you say, only a few of them were actually downloaded on page load, which is of course good if it reduces the download size (it does seem to have reduced the size of my main chunk by about 50KB - although that is less than the 130KB of shaders being used from those new chunks). Apologies for that, I should have checked before posting.
I’m assuming then the reason why they’re all being generated is that the bundler doesn’t know which shaders the model I import will be using, and so they’re all made available?
If so, then I guess that only really leaves the issue of why their chunk names are all set to null, which is what broke my build process (as I was putting them in different folders based on name)…
The bundler doesn’t know exactly when a shader will be needed, but if it is referenced somewhere in your code, it will be included in the bundle and its shaders will be chunked. Also, the chunks are split per language - glsl and wgsl (for WebGPU) - so you don’t download shader code for an API you aren’t using (as a user). It is true that it will generate a few more chunks, but the overall package size will be lower, and shader code will be downloaded on demand.
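To illustrate (a simplified sketch, not Babylon’s actual source - the paths are made up): each shader sits behind a dynamic import, which is what makes the bundler emit it as its own on-demand chunk:

```js
// Illustrative only: the bundler turns each dynamic import into a separate
// chunk, and the browser only fetches the variant for the engine in use.
async function ensureBackgroundShader(isWebGPU) {
  if (isWebGPU) {
    await import("./ShadersWGSL/background.fragment.js"); // hypothetical path
  } else {
    await import("./Shaders/background.fragment.js"); // hypothetical path
  }
}
```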
Regarding the name - we decided not to add webpackChunkName to the code, as it is bundler-specific. If Rollup were to suddenly come up with a different way to name chunks, it would cause issues that we simply don’t want to have. Don’t trust a chunk to have an explicit name; use its ID, or define its name using a “name” function in splitChunks in your webpack configuration ( SplitChunksPlugin | webpack)
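For example, something along these lines (a sketch only - the regex and the naming scheme are placeholders you’d adapt to your setup):

```js
// webpack.config.js (excerpt) - give the shader chunks your own stable names
// via splitChunks instead of relying on names assigned by the library.
module.exports = {
  optimization: {
    splitChunks: {
      chunks: "all",
      cacheGroups: {
        babylonShaders: {
          test: /[\\/]node_modules[\\/]@babylonjs[\\/].*Shaders/,
          // derive a name per module so each shader keeps its own chunk
          name: (module) =>
            "shader-" +
            module.identifier().split(/[\\/]/).pop().replace(/\.js$/, ""),
        },
      },
    },
  },
};
```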
It is curious - while my main chunk decreased in size by 52KB, the new shader chunks add up to 335KB, so there must be something in there that wasn’t being included before and isn’t actually used. And the shaders it downloads on page load for the scene I create come to 132KB, meaning the amount the user downloads has actually gone up (132KB fetched vs. 52KB saved from the main chunk = an 80KB net increase), even without taking into account shaders that are loaded later on in specific situations (e.g. adding a background HDRI).
In any case, I can live with an 80KB increase for now. I’ll have a look at some point to see if I can work out why it increased…
Thanks for taking a look at and answering my slightly-panicked post!
That’s interesting. The reason behind the larger chunk sizes is the new WGSL shaders, which didn’t exist before. However, there shouldn’t have been such a large increase in the size of the shaders that you actually use. I’ll run a few tests tomorrow or next week to see if I can reproduce it and find the cause.
Thanks!
Oh, BTW - for future reference - you can avoid chunking the shaders entirely if you actively import the shaders you are using as a side effect. We will need to document it eventually, but it is possible to import “the shader I am using” and avoid the chunk altogether. Another way is to import the material you are using straight from the index file, but that might increase your package size.
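Something like this (the exact module paths are examples based on @babylonjs/core’s layout - double-check them for your version):

```js
// Side-effect imports: pulling the shaders you know you need into your main
// bundle means no separate chunk is generated for them.
import "@babylonjs/core/Shaders/background.vertex";
import "@babylonjs/core/Shaders/background.fragment";
// WGSL (WebGPU) variants live in a parallel folder:
import "@babylonjs/core/ShadersWGSL/background.vertex";
import "@babylonjs/core/ShadersWGSL/background.fragment";
```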
Just wanted to let you know that I am working on documentation to fully explain how to control the chunk behavior with both webpack and rollup (and maybe parcel if I have the time). I am off next week and it is not yet done, but it is in the works.