Hey folks,
As I work with WebGPU compute shaders a lot for procedural geometry generation on the GPU, I find it helpful to have a base template playground to start from. The first step of procedural generation is usually getting the points / vertices into the right positions, so the “Hello world” example just sets the positions of a vertex array in parallel and then displays the resulting positions. The same buffer is used directly for drawing on the GPU, so no CPU copy is needed between updates.
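In case it helps to picture it, here’s a minimal sketch of what such a kernel can look like: one invocation per point, writing a Fibonacci-sphere position into a tightly packed f32 storage buffer (array<f32> rather than array<vec3f>, which would pad each element to 16 bytes and break the 12-byte vertex stride). The binding layout and names here are my illustration, not necessarily the playground’s exact code:

```ts
// Hypothetical WGSL kernel: one Fibonacci-sphere position per invocation.
const computeSource = /* wgsl */ `
struct Params { count : u32 }

@group(0) @binding(0) var<storage, read_write> positions : array<f32>;
@group(0) @binding(1) var<uniform> params : Params;

@compute @workgroup_size(64)
fn main(@builtin(global_invocation_id) id : vec3<u32>) {
    let i = id.x;
    if (i >= params.count) { return; }

    // Fibonacci sphere: walk latitudes top to bottom, rotating by the
    // golden angle each step so points spread evenly.
    let n = f32(params.count);
    let y = 1.0 - 2.0 * (f32(i) + 0.5) / n;   // latitude from +1 down to -1
    let r = sqrt(max(0.0, 1.0 - y * y));      // ring radius at that latitude
    let theta = 2.39996323 * f32(i);          // golden angle in radians

    positions[3u * i + 0u] = r * cos(theta);
    positions[3u * i + 1u] = y;
    positions[3u * i + 2u] = r * sin(theta);
}`;
```

With a workgroup size of 64, dispatching Math.ceil(count / 64) workgroups (e.g. via BABYLON.ComputeShader’s dispatch) covers every point, and the early-return guard handles the remainder.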
So I’ve created this lightweight sandbox where you can start editing WGSL code right away and see the effect on the output positions.
The geometry output is stored in a PointCloudMeshRenderer, my simple wrapper around BABYLON.PointsCloudSystem that supports a dynamic vertex count. It’s also a normal 3D object, so it can be transformed within the scene like any other mesh.
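I won’t claim this is the wrapper’s exact internals, but the zero-copy part presumably works along these lines, following the pattern from Babylon’s WebGPU compute samples: create the storage buffer with vertex-usage flags, then bind the very same GPU buffer as the mesh’s position vertex buffer. Here engine and mesh (the point cloud mesh from buildMeshAsync) are assumed to be in scope:

```ts
// Sketch: share one GPU buffer between the compute pass and the draw call.
const count = 100_000;
const positionsBuffer = new BABYLON.StorageBuffer(
    engine,
    count * 3 * 4, // 3 floats per point, 4 bytes each
    BABYLON.Constants.BUFFER_CREATIONFLAG_READWRITE |
    BABYLON.Constants.BUFFER_CREATIONFLAG_VERTEX
);

// After the compute shader has been pointed at positionsBuffer, swap the
// mesh's position data for the same underlying GPU buffer; no readback needed.
mesh.setVerticesBuffer(new BABYLON.VertexBuffer(
    engine,
    positionsBuffer.getBuffer(),        // reuse the DataBuffer as-is
    BABYLON.VertexBuffer.PositionKind,
    false,                              // not updatable from the CPU
    false,                              // don't postpone creation
    3                                   // stride: 3 floats per vertex
));
```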
If you read this far: Try dropping the above PG code into ChatGPT / Deepseek / whatever and ask it to give you an updated createScene function with something other than the built-in Fibonacci sphere as the position function, then share it here!
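To seed ideas (this one is hand-written, not from an LLM), here’s a drop-in variant of the kernel sketched above that traces a spiral around a torus instead of the Fibonacci sphere, with the same assumed bindings and packed-f32 layout:

```ts
// Hypothetical alternate position function: a spiral winding around a torus.
const torusSpiralSource = /* wgsl */ `
struct Params { count : u32 }

@group(0) @binding(0) var<storage, read_write> positions : array<f32>;
@group(0) @binding(1) var<uniform> params : Params;

@compute @workgroup_size(64)
fn main(@builtin(global_invocation_id) id : vec3<u32>) {
    let i = id.x;
    if (i >= params.count) { return; }

    let t = f32(i) / f32(params.count);
    let u = t * 6.2831853 * 48.0;   // 48 windings around the tube
    let v = t * 6.2831853;          // one turn around the main ring
    let R = 1.0;                    // major radius
    let r = 0.35;                   // minor (tube) radius

    positions[3u * i + 0u] = (R + r * cos(u)) * cos(v);
    positions[3u * i + 1u] = r * sin(u);
    positions[3u * i + 2u] = (R + r * cos(u)) * sin(v);
}`;
```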
For fun, here’s a ChatGPT take on the Babylon logo [playground]:
I’ll probably add similar Hello World examples for generating other vertex buffers and index buffers in the coming weeks.
Let me know if you have improvement ideas! I’d be interested in contributing this to the docs if others are likely to find it valuable.