WebGPU: Mesh.setVerticesBuffer doesn't work as I expect it to

Hello,

I am trying to use compute shaders to compute the positions of a large particle system. I have read the boids WebGPU example.

In the code below, I am trying to update the mesh's vertex positions through a VertexBuffer passed to setVerticesBuffer, but it doesn't work as expected: the mesh just disappears.
See the commented-out mesh.setVerticesBuffer(positionsVertexBuffer, false); line.

import * as BABYLON from 'babylonjs';

export async function SimulateBabylon() {

    const canvas = document.createElement('canvas');
    document.body.style.margin = '0';
    document.body.style.overflow = 'hidden'; // Prevents scrollbars
    document.documentElement.style.margin = '0';
    canvas.style.width = '100vw';
    canvas.style.height = '100vh';
    canvas.style.display = 'block'; // Ensures the canvas fills the width/height without margins

    document.body.appendChild(canvas);
    const engine = new BABYLON.WebGPUEngine(canvas, { adaptToDeviceRatio: true, antialias: true });
    await engine.initAsync();

    const scene = new BABYLON.Scene(engine);

    const camera = new BABYLON.ArcRotateCamera("camera", Math.PI / 2, Math.PI / 2, 10, BABYLON.Vector3.Zero(), scene);
    camera.attachControl(canvas, true);

    new BABYLON.HemisphericLight("light", new BABYLON.Vector3(1, 1, 0), scene);

    const mesh = new BABYLON.Mesh("custom", scene);

    const vertexData = new BABYLON.VertexData();

    const vertexCount = 4;
    const positions = [0, 0, 0, 1, 0, 0, 0, 1, 0, 1, 1, 0]; // 4 vertices with x, y, z
    const indices = [0, 1, 2, 1, 3, 2]; // 2 triangles with 3 vertices each, in a square shape

    vertexData.positions = positions;
    vertexData.indices = indices;
    vertexData.applyToMesh(mesh);

    const material = new BABYLON.StandardMaterial("material", scene);
    material.diffuseColor = new BABYLON.Color3(1, 0, 0); // RGB for red
    material.wireframe = false;
    material.backFaceCulling = false;
    mesh.material = material;
    mesh.sideOrientation = BABYLON.Mesh.DOUBLESIDE;

    engine.runRenderLoop(() => {
        scene.render();
    });

    window.addEventListener('resize', () => {
        engine.resize();
    });

    // set up webgpu buffer for storing positions
    const positionsStorageBuffer = new BABYLON.StorageBuffer(engine, vertexCount * 4 * 3, BABYLON.Constants.BUFFER_CREATIONFLAG_VERTEX | BABYLON.Constants.BUFFER_CREATIONFLAG_WRITE);
    const positionsVertexBuffer = new BABYLON.VertexBuffer(engine, positionsStorageBuffer.getBuffer(), BABYLON.VertexBuffer.PositionKind, false, false, 3, true, 0, 3);

    positionsStorageBuffer.update(new Float32Array(positions));

    // index buffer
    const indexBuffer = new BABYLON.StorageBuffer(engine, indices.length * 4, BABYLON.Constants.BUFFER_CREATIONFLAG_INDEX | BABYLON.Constants.BUFFER_CREATIONFLAG_WRITE);
    indexBuffer.update(new Uint32Array(indices));

    async function animate() {
        engine.runRenderLoop(async () => {
            // mesh.setVerticesBuffer(positionsVertexBuffer, false); // uncomment this line to try to update the mesh with the new positions. However, this doesn't work, the mesh just disappears
            // mesh.setIndexBuffer(indexBuffer.getBuffer(), vertexCount, indices.length); // this one too
            scene.render();
        });
    }

    await animate();
}

Am I missing something?

The compute shader that will process the storage buffer will come later.

Sorry, my question was not clear enough.

The problem is that I expect the following lines of code:

const vertexCount = 4;
const positions = [0, 0, 0, 1, 0, 0, 0, 1, 0, 1, 1, 0]; // 4 vertices with x, y, z
const indices = [0, 1, 2, 1, 3, 2]; // 2 triangles with 3 vertices each, in a square shape

// set up webgpu buffer for storing positions
const positionsStorageBuffer = new BABYLON.StorageBuffer(engine, vertexCount * 4 * 3, BABYLON.Constants.BUFFER_CREATIONFLAG_VERTEX | BABYLON.Constants.BUFFER_CREATIONFLAG_WRITE);
const positionsVertexBuffer = new BABYLON.VertexBuffer(engine, positionsStorageBuffer.getBuffer(), BABYLON.VertexBuffer.PositionKind, false, false, 3, true, 0, 3);
positionsStorageBuffer.update(new Float32Array(positions));

// index buffer
const indexBuffer = new BABYLON.StorageBuffer(engine, indices.length * 4, BABYLON.Constants.BUFFER_CREATIONFLAG_INDEX | BABYLON.Constants.BUFFER_CREATIONFLAG_WRITE);
indexBuffer.update(new Uint32Array(indices));

mesh.setIndexBuffer(indexBuffer.getBuffer(), vertexCount, indices.length);
mesh.setVerticesBuffer(positionsVertexBuffer, false);

To be equivalent to

const vertexCount = 4;
const positions = [0, 0, 0, 1, 0, 0, 0, 1, 0, 1, 1, 0]; // 4 vertices with x, y, z
const indices = [0, 1, 2, 1, 3, 2]; // 2 triangles with 3 vertices each, in a square shape

vertexData.positions = positions;
vertexData.indices = indices;
vertexData.applyToMesh(mesh);

However, only the latter displays correctly. Why? What do I need to change in the StorageBuffer solution to make it equivalent to the applyToMesh solution?

You have to dispatch a compute pass on the GPU to fill the VertexBuffer. It doesn’t make sense to create a VertexBuffer bound to a StorageBuffer w/o making any changes to the buffer on the GPU.

You can create the VertexBuffer directly from the array:

    const positionsVertexBuffer = new BABYLON.VertexBuffer(
        engine, new Float32Array(positions),
        BABYLON.VertexBuffer.PositionKind);

    mesh.setVerticesBuffer(positionsVertexBuffer, true);

Thanks @roland
Of course it doesn’t make sense to create a VertexBuffer bound to a StorageBuffer w/o making changes. The idea right now is just to check how to use a buffer that is already on the GPU to display the mesh. The computations will come next.

I’ve tested your solution of adding a dispatch call with a dummy compute shader.

    const dummyCs = new BABYLON.ComputeShader("dummyCs", engine, {
        computeSource: `
            @group(0) @binding(0) var<storage, read_write> data: array<f32>;

            @compute @workgroup_size(1)
            fn main(@builtin(global_invocation_id) id: vec3<u32>) {
                let i = id.x;
                data[i] = data[i] + 0.001;
            }
        `
    }, {
        bindingsMapping: {
            "positions": { group: 0, binding: 0 },
        }
    });
    dummyCs.setStorageBuffer("positions", positionsStorageBuffer);

    async function animate() {
        engine.runRenderLoop(() => {
            dummyCs.dispatch(vertexCount * 3, 1, 1);
            mesh.setVerticesBuffer(positionsVertexBuffer, false);
            mesh.setIndexBuffer(indexBuffer.getBuffer(), vertexCount, indices.length);
            scene.render();
        });
    }

This should slowly move the square off the screen. But still, the mesh does not appear. Can you think of anything else that might be missing?

Create a PG please. It’s a bit time consuming to keep copying your pasted code into the PG to test it. Thank you!

@roland here it is: Mesh.setVerticesBuffer webGPU test | Babylon.js Playground (babylonjs.com)

This PG reads the storage buffer computed by the compute shader and applies the data to the mesh:

Using the vertex buffer directly:

Please note I started with WebGPU shaders not so long ago; hopefully I’m not giving you any false information :stuck_out_tongue:

I like the second solution, without the CPU data transfer, thanks @roland :slight_smile:
I have noted that the biggest problem was the arguments to new BABYLON.VertexBuffer().

I would like to use indices computed by a compute shader too. What needs to change in the setIndexBuffer call to make it work, without having to call vertexData.applyToMesh(mesh);?

Found it! See Mesh.setVerticesBuffer webGPU test | Babylon.js Playground (babylonjs.com)

The problem was the datatype for the index buffer: we should use Uint16Array instead of Uint32Array. There is no need to call vertexData.applyToMesh(mesh);.

Now all the computation is done on the GPU.
@roland by the way, there is no need to call any dispatch to see the square on the screen.