Adding shadows to custom mesh

Hey it’s me again.

I’ve got my code up and running with my custom shader, but I was wondering whether it’s possible to add shadows using the ShadowGenerator, or if I’d have to implement my own shadow mapping.

I’m currently passing the vertices in through a StorageBuffer and re-using them, with transforms applied, in my vertex shader via vertexID. This seems to give about double the performance of regular instancing.

// The mesh is just a shell: its position buffer is a one-float placeholder,
// since the real vertex data is pulled from storage buffers in the shader.
boidMesh = new Mesh("custom", scene);
boidMesh.setVerticesData(VertexBuffer.PositionKind, [0]);
boidMesh.isUnIndexed = true;

// Each boid draws numVerts vertices of the base pyramid mesh.
const numVerts = pyramidMesh.vertices.length / 4;
boidMesh.subMeshes[0].verticesCount = numBoids * numVerts;
boidMat.setUInt("numVertices", numVerts);

// Upload the base mesh's positions and normals as storage buffers.
boidVerticesBuffer = new StorageBuffer(engine, pyramidMesh.vertices.byteLength);
boidVerticesBuffer.update(pyramidMesh.vertices);
boidMat.setStorageBuffer("boidVertices", boidVerticesBuffer);
boidNormalsBuffer = new StorageBuffer(engine, pyramidMesh.normals.byteLength);
boidNormalsBuffer.update(pyramidMesh.normals);
boidMat.setStorageBuffer("boidNormals", boidNormalsBuffer);

boidMesh.material = boidMat;

The issue I’m running into is when I try to add shadows to this mesh: the shadow pass tries to read from the bound vertex buffer, which only holds the single placeholder value (a length of 1).

Vertex range (first: 0, count: 576) requires a larger buffer (6912) than the bound buffer size (4) of the vertex buffer at slot 0 with stride 12.

Is there a way to set up the ShadowGenerator to work with my custom mesh, or will I have to generate my own shadow map and use that?

Source code is here if that helps: GitHub - jtsorlinis/BoidsWebGPU


You will need to create your own depth shader material, as your use case is quite uncommon and WebGPU-specific. You can then instruct the shadow generator to use this material when rendering the mesh into the shadow texture by calling shadowGenerator.getShadowMap().setMaterialForRendering(boidMesh, depthMaterial).
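A minimal sketch of what that could look like, assuming the depth material reuses the same storage buffers and uniforms as the main boid material. The "boidDepth" shader name and its sources are hypothetical placeholders (the depth shader would need the same vertexID-based vertex pulling as the main shader); setMaterialForRendering on the shadow map's RenderTargetTexture is the actual Babylon.js API:

```javascript
// Hypothetical depth-only material that pulls positions from the same
// storage buffer as the main boid shader ("boidDepth" shaders not shown).
const depthMaterial = new ShaderMaterial("boidDepth", scene, "boidDepth", {
    attributes: [VertexBuffer.PositionKind],
    uniforms: ["viewProjection"],
});
depthMaterial.setStorageBuffer("boidVertices", boidVerticesBuffer);
depthMaterial.setUInt("numVertices", numVerts);

// Use this material only when boidMesh is rendered into the shadow map;
// the regular material is restored for the main pass.
shadowGenerator.getShadowMap().setMaterialForRendering(boidMesh, depthMaterial);
```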


Okay thanks for that. Will look into doing that :blush:

Hey @jtsorlinis, I’ve already run into you on GitHub, and now I’ve got some boids swimming around in my underwater game. Very exciting! But I’m getting a bug similar to what you mentioned here whenever the boids are visible on screen.

In my case it isn’t shadow generation but depth testing that causes this error.

The bug only happens when:

  • At least one boid is in frame AND
  • Post processing is enabled (I do some fancy depth map combination & raymarching there).

The errors look like this:

localhost/:1 Vertex range (first: 0, count: 150000) requires a larger buffer (1800000) than the bound buffer size (4) of the vertex buffer at slot 0 with stride 12.
 - While encoding [RenderBundleEncoder (unlabeled)].Draw(150000, 1, 0, 0).
 - While calling [RenderBundleEncoder (unlabeled)].Finish([RenderBundleDescriptor]).

babylonjs.js?v=53b33638:203934 BJS - [23:43:53]: WebGPU uncaptured error (1): [object GPUValidationError] - Vertex range (first: 0, count: 150000) requires a larger buffer (1800000) than the bound buffer size (4) of the vertex buffer at slot 0 with stride 12.
 - While encoding [RenderBundleEncoder (unlabeled)].Draw(150000, 1, 0, 0).
 - While calling [RenderBundleEncoder (unlabeled)].Finish([RenderBundleDescriptor]).

I’m using DepthRenderer in one of my postprocessing actions:

let depthTexture = scene.enableDepthRenderer(this.main_camera).getDepthMap()._texture;

When I set material.disableDepthWrite = true on the boid material, the error goes away, but unfortunately I’m relying on Z data to combine some raymarched isosurfaces with the normal poly world, so I need the depth data.
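For reference, the depth renderer’s depth map is also a RenderTargetTexture, so the setMaterialForRendering hook mentioned in the earlier reply exists on it as well. Whether the depth renderer actually honors the material swap for its depth pass would need to be verified; this is only a sketch, and boidDepthMaterial is a hypothetical depth material mirroring the boid shader’s storage-buffer setup:

```javascript
// Hypothetical: assign a custom depth material for the boid mesh on the
// depth renderer's render target, analogous to the shadow-map case.
const depthMap = scene.enableDepthRenderer(this.main_camera).getDepthMap();
depthMap.setMaterialForRendering(boidMesh, boidDepthMaterial); // placeholder material
```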

I’m in a little over my head with the buffer concepts here; do you know how I can get depth testing working? Maybe you’ve made progress on this for shadows as well?

N.b., I’m okay with a slightly less performant solution; I could probably live with fewer scummy particles in the water :slight_smile:

Thanks again!

Chris