In the WebGPU API, a GPUTexture
can be created and sent to another worker thread per the spec, so it should be possible to do real multi-threaded rendering like:
- main thread: create a GPUTexture (with depth enabled) for each worker, split the scene into parts by mesh, initialize, dispatch events (frame render, input, camera changes), and composite the render results using alpha and depth
- worker threads: initialize their scene parts, handle events, render into their GPUTexture
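To make the compositing step concrete, here is a minimal CPU-side sketch of what "composite render results with alpha and depth" would mean per pixel. `compositeLayers` is a hypothetical helper (not a Babylon.js or WebGPU API), and plain typed arrays stand in for the workers' color/depth textures; a real implementation would do this in a fullscreen shader pass. Note how the per-layer depth test only handles opaque-style blending cleanly, which is exactly why the transparent-mesh question below is tricky.

```javascript
// Hypothetical sketch of depth-aware compositing of per-worker layers.
// Each worker is assumed to return a color buffer (RGBA, 0-255) and a
// depth buffer (0 = near, 1 = far). For every pixel, the fragment with
// the smallest depth wins, blended by its alpha over what is already there.
function compositeLayers(layers, width, height) {
  const outColor = new Uint8ClampedArray(width * height * 4);
  const outDepth = new Float32Array(width * height).fill(1.0);
  for (const { color, depth } of layers) {
    for (let i = 0; i < width * height; i++) {
      // Keep only fragments nearer than what has been composited so far.
      if (depth[i] < outDepth[i]) {
        const a = color[i * 4 + 3] / 255;
        for (let c = 0; c < 3; c++) {
          outColor[i * 4 + c] =
            color[i * 4 + c] * a + outColor[i * 4 + c] * (1 - a);
        }
        outColor[i * 4 + 3] = 255;
        outDepth[i] = depth[i];
      }
    }
  }
  return { color: outColor, depth: outDepth };
}
```

This works regardless of the order in which worker results arrive for opaque fragments, but a translucent fragment that arrives before a farther opaque one behind it would be blended incorrectly, so transparency needs extra care.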
Questions:
- Is this possible with babylon.js?
- Is it possible to do something similar on WebGL using OffscreenCanvas? Can depth be correctly handled?
- Would this scale the performance of a scene with lots of meshes (draw calls) near linearly with the number of threads, on a CPU with enough cores?
- Are post processes required to be handled on main thread?
- Can animations be handled in worker threads?
- How would this compare to Snapshot Rendering?
- Are transparent meshes supported by this approach?