I am using multiple canvases and followed this documentation: Babylon.js docs
Everything worked until a Chrome update last month (I'm not sure which one). Now I get enormous memory consumption in the render loop, on the order of 1 GB/s.
It doesn’t happen on Safari with WebGPU enabled.
It doesn’t happen if I use the normal (WebGL) engine. I can also “fix” it by attaching the working canvas to the HTML body, but then it’s not really a “working” canvas anymore.
Does anyone have an idea what could cause this? Does WebGPU require that the canvas be part of the DOM? If so, I guess it’s something that needs to be fixed in Dawn?
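For context, here is a minimal sketch of what I mean by the setup, following the multi-canvas docs (assuming the Playground-style global BABYLON namespace; the canvas id `view1` and the function name are just placeholders, not part of my actual project):

```ts
declare const BABYLON: any; // UMD/global build, as in the Playground

async function setupViews() {
    // The "working" canvas the engine renders into. It is never attached to
    // the DOM -- this is the configuration that leaks for me on recent Chrome.
    // Appending it to document.body makes the leak go away, but defeats the purpose.
    const workingCanvas = document.createElement("canvas");

    const engine = new BABYLON.WebGPUEngine(workingCanvas);
    await engine.initAsync();

    const scene = new BABYLON.Scene(engine);
    const camera = new BABYLON.FreeCamera("cam", new BABYLON.Vector3(0, 1, -5), scene);

    // The visible canvases live in the page and are registered as views.
    const viewCanvas = document.getElementById("view1") as HTMLCanvasElement;
    engine.registerView(viewCanvas, camera);
    engine.inputElement = viewCanvas;

    engine.runRenderLoop(() => {
        scene.render(); // memory climbs rapidly here on recent Chrome builds
    });
}

setupViews();
```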
Are you able to create a reproduction in the Playground or elsewhere? We can do some tests, and if it is a bug in Dawn, we will contact the developers in the Matrix channel.
I created a little sample project based on the project from the Babylon.js documentation.
This sample project consumes memory less aggressively than my project, but still quite aggressively. The current setup has adaptToDeviceRatio enabled as well as antialiasing. If you disable the latter, it consumes memory more slowly. (A sketch of the engine creation follows below.)
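Roughly how the sample project constructs the engine (a sketch; I'm assuming the option names `antialias` and `adaptToDeviceRatio` from recent Babylon.js versions, and the helper name is hypothetical):

```ts
declare const BABYLON: any; // UMD/global build, as in the Playground

// Hypothetical helper mirroring the sample project's engine creation.
async function createReproEngine(workingCanvas: HTMLCanvasElement) {
    const engine = new BABYLON.WebGPUEngine(workingCanvas, {
        adaptToDeviceRatio: true, // scale the render size by devicePixelRatio
        antialias: true,          // turning this off makes memory grow more slowly
    });
    await engine.initAsync();
    return engine;
}
```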
I checked with nvidia-smi on my computer (RTX 3080), and the memory does go up to a point (~6.5 GB used), but then always resets to ~5 GB used (which is roughly the memory in use before I started your repro).
We (or Dawn) probably create a temporary texture, and because the browser/Dawn doesn’t have to release it at the exact moment it is destroyed but can choose to delay freeing the memory, this behavior is not abnormal. Have you already gotten an “out of memory” error on your end? That would indicate a real memory leak.
Hi, thanks for testing it! Yes, on my MacBook M2 Pro I get an “out of memory” error/warning after a while.
I have disabled the feature for now, so I am no longer using a “working” canvas, and I don’t run into the “out of memory” warning.
I have let this example run for 5 minutes now and haven’t gotten the error, so I guess we can close this thread until I can provide an easier way to reproduce it. Most likely I missed another dependency…