I've been working on a bunch of projects that can toggle between the WebGPU and WebGL engines, which has given me a chance to form first impressions of the WebGPU branch. Aaaaand so far… WebGL is better. Cool, WebGPU has compute shaders and a bunch of other jazz that's supposed to bring us closer to the metal and give us more performance and power, but so far it's been noticeably less performant and has a bunch of quirks.

It really shows in a scene heavy on GPU particles; I was super surprised to see WebGL just hands-down outperform the WebGPU engine in this case. Specifically, WebGPU was hard-locked at around 85-90 fps while WebGL running the same scene was holding 144 with room to spare. And it wasn't even an obnoxious number of particles, I think it was only 10-20k on screen at max, and I have the cache for them turned way, way down, like 400-2000 max per system (granted there are 12+ systems going at once, but this is all very "normal").
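For context, here's roughly the kind of toggle-plus-particles setup I'm talking about. This is just a minimal sketch, assuming a Babylon.js-style API (the post doesn't name the engine, so the `WebGPUEngine`/`Engine` classes, the texture path, and the capacity/emit numbers here are placeholders, not my actual project values):

```ts
// Minimal sketch, assuming Babylon.js (@babylonjs/core). Adjust class names and
// values to whatever engine/scene you actually use; the texture URL and the
// particle numbers below are placeholders.
import {
  Engine,
  WebGPUEngine,
  Scene,
  ArcRotateCamera,
  Vector3,
  GPUParticleSystem,
  Texture,
} from "@babylonjs/core";

async function createEngine(canvas: HTMLCanvasElement, useWebGPU: boolean) {
  // Toggle between the two backends; the same scene code runs on either one.
  if (useWebGPU && (await WebGPUEngine.IsSupportedAsync)) {
    const engine = new WebGPUEngine(canvas);
    await engine.initAsync();
    return engine;
  }
  return new Engine(canvas, true); // WebGL fallback
}

async function run() {
  const canvas = document.getElementById("renderCanvas") as HTMLCanvasElement;
  const engine = await createEngine(canvas, /* useWebGPU */ true);
  const scene = new Scene(engine);
  const camera = new ArcRotateCamera("cam", Math.PI / 2, Math.PI / 3, 30, Vector3.Zero(), scene);
  scene.activeCamera = camera;

  // Roughly the setup described above: a dozen GPU particle systems,
  // each capped at a couple thousand particles.
  for (let i = 0; i < 12; i++) {
    const ps = new GPUParticleSystem(`ps${i}`, { capacity: 2000 }, scene);
    ps.particleTexture = new Texture("textures/flare.png", scene); // placeholder asset
    ps.emitter = new Vector3((i - 6) * 2, 0, 0);
    ps.emitRate = 500;
    ps.start();
  }

  engine.runRenderLoop(() => {
    scene.render();
    // Same FPS readout on both backends, so the comparison is apples to apples.
    document.title = `${engine.getFps().toFixed(0)} fps`;
  });
}

run();
```

With a setup like that, the only thing changing between runs is the `useWebGPU` flag, which is what makes the fps gap so surprising.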
This is making me think that either we're using the engine wrong in our projects, or WebGPU isn't all it's cracked up to be, or there's something holding the WebGPU engine back from outperforming the WebGL one in most cases.
Either way, I wanted to start some discussion on this. Why would the "saving grace" of web rendering tech appear to be worse than something that's nearly 20 years old now? (Holy crap, WebGL is almost 20 years old…)