Cannot render mesh's edges without indices

Summary: How can I render a mesh’s edges without using indices?

To reproduce: Playground
Notes: You can toggle two flags to see the effect: use_indices and render_edges (around lines 20 and 21 of the playground)

Details:

I am a beginner, and I learned from here that setting mesh._unIndexed = true might increase performance. In my case the indices just range from 0 to num_verts-1, so it is not necessary to use them. I must use WebGPU, because only WebGPU supports compute shaders, and my vertex positions come from the output of a compute shader.

Firstly, we can see the scene renders correctly without indices and without edge rendering.

However, when we turn on edge rendering, it seems that indices are required. By setting use_indices = false and render_edges = true (around lines 20 and 21), we can see that the edges are not rendered, and the console throws a warning: Calling [RenderPassEncoder “MainRenderPass”].Draw with an index count of 0 is unusual. I think my code is correct, because if we additionally set use_indices = true, the edges appear. I don’t want to set indices because they are redundant.

So my question is, can we render the edges with the following flags set?

var use_indices = false 
var render_edges = true

Hello,

You’re right, rendering edges with use_indices=false can be tricky in WebGPU. Here’s why and how to potentially tackle it:

The Challenge:

When rendering edges, the GPU needs to identify which vertices connect to form those edges.
Normally, indices define these connections. Each group of three indices tells the GPU which vertices form a triangle. By analyzing these triangles, the GPU can identify connected edges.
Without indices (unindexed rendering), the GPU lacks this explicit information about vertex connectivity.
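
For example, with a hypothetical two-triangle quad (the numbers are purely illustrative of the data layout):

const positions = [0,0,0,  1,0,0,  1,1,0,  0,1,0]; // 4 unique vertices
const indices   = [0,1,2,  0,2,3];                 // 2 triangles sharing edge 0-2
// Unindexed, the same quad needs 6 position entries, and the shared edge
// can only be discovered by comparing coordinates.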

Potential Solutions:

1. Line Strips:

This approach leverages a specific primitive type called “line strip.” It defines a sequence of connected vertices, where each vertex is connected to the previous one in the list.
By constructing a line strip for each edge, you can instruct the GPU to render those connected vertices as lines.

This might require modifying your code to build line strips from your vertex data; see the sketch after this list.

2. Geometry Shaders (if supported):

If your WebGPU implementation supports geometry shaders, you can leverage them to analyze vertex positions on-the-fly and generate line segments representing edges directly in the shader.
This approach eliminates the need for pre-defined connectivity information.

However, not all WebGPU implementations may have full geometry shader support, so check your specific environment.

3. Pre-process Vertices and Connectivity:

As a last resort, you could pre-process your vertex data on the CPU. This involves identifying connected edges in your mesh data and creating a separate list of vertices specifically for edge rendering.

While this adds an extra step on the CPU, it avoids the need for explicit indices during rendering.

Choosing the Right Approach:

Line strips are generally a simpler approach, but require modifying your code to build them.
Geometry shaders offer a more flexible solution but might not be universally supported.
Pre-processing adds complexity but eliminates the need for indices during rendering.
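
A minimal CPU-side sketch of the line-strip / pre-processing idea (assuming positions is a flat array of unindexed triangle vertices; the function name is illustrative):

function edgesFromUnindexedPositions(positions, scene) {
  const toVec = (i) => new BABYLON.Vector3(
    positions[i * 3], positions[i * 3 + 1], positions[i * 3 + 2]);
  const lines = [];
  // Every 3 consecutive vertices form one triangle in an unindexed mesh.
  for (let v = 0; v + 2 < positions.length / 3; v += 3) {
    const a = toVec(v), b = toVec(v + 1), c = toVec(v + 2);
    lines.push([a, b, c, a]); // one closed strip per triangle: a -> b -> c -> a
  }
  return BABYLON.MeshBuilder.CreateLineSystem("edges", { lines }, scene);
}

Note that this draws every triangle edge (shared edges twice), unlike the EdgesRenderer, which uses epsilon to keep only edges where adjacent faces meet at a sufficient angle.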

I hope one of these solutions helps you.

Thanks for the quick reply!

However, if I am right, it seems that geometry shaders are not supported on any web platform (WebGL, WebGL2, or WebGPU; see here); otherwise I could use geometry shaders directly instead of working around them with a compute shader.

Moreover, the code does not run on WebGL2 / WebGL either. You can test with the following flags:

var use_indices = false 
var render_edges = true

So I think this is not only a problem with WebGPU, but also a problem with WebGL2 / WebGL.

Do you have any idea?

I don’t want to be that person but I think that @Dennisleon is a bot :slight_smile:
I had to remove several links to unrelated websites from their answer.

And the wording is suspiciously close to what ChatGPT could write.

So my response now:

Unfortunately the EdgesRenderer needs face definitions to get the edges of… each face :slight_smile:

So in your case if you want to render the edges you MUST have indices.

That being said, I’m not sure I understand why you say it is faster with no indices. It is usually the opposite: indices help avoid recomputing the same vertices many times. (You can have only 8 vertices and use indices to still create a box; without indices you are forced to use 36 vertices, or a bit fewer if you use a triangle fan.)
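
To make the counts concrete (corner positions and winding are illustrative):

// Indexed box: 8 corner positions shared by all 12 triangles.
const positions = [
  0,0,0, 1,0,0, 1,1,0, 0,1,0,   // 4 back corners
  0,0,1, 1,0,1, 1,1,1, 0,1,1,   // 4 front corners
];
const indices = [
  0,1,2, 0,2,3,  4,6,5, 4,7,6,  // back, front
  0,4,5, 0,5,1,  2,6,7, 2,7,3,  // bottom, top
  0,3,7, 0,7,4,  1,5,6, 1,6,2,  // left, right
];
// Unindexed, the same box needs 36 position entries (12 triangles x 3 corners),
// with each corner duplicated in every triangle that touches it.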

@Deltakosh Hi, you mentioned that 8 points can define a box. Is this how the edge rendering works? I think this code snippet may be relevant, because it seems to be making a square, but what does the square have to do with rendering an edge? I also found that the edge renderer depends on some shader code, but how does an edge renderer work in general? Have we already applied the ShaderMaterial somewhere to let it render edges?

I don’t know the answer, but I have a few observations.

Using convertToUnIndexedMesh() appears to change the indices to contain the values 0 to n-1. It doesn’t appear to remove the indices.
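
A quick way to observe this in a playground (the box is just an example mesh):

const box = BABYLON.MeshBuilder.CreateBox("box", {}, scene);
box.convertToUnIndexedMesh();
console.log(box.getIndices()); // still an array [0, 1, ..., n-1], not null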

I did notice that enableEdgesRendering() takes optional parameters, the second of which uses vertices.

enableEdgesRendering(epsilon?: number, checkVerticesInsteadOfIndices?: boolean, options?: IEdgesRendererOptions): AbstractMesh

Note the default epsilon is 0.95, so you could try

enableEdgesRendering(0.95,true)

And see if that works for you. I couldn’t see how to remove indices altogether, so I couldn’t test this.

In another thread, I was trying to figure out why normals couldn’t use the same indices that mesh triangles use (I thought it was a bug). Turns out that that is not a thing and normals are offset-aligned with the positionData.

Edit: looking at the enableEdgeRendering function call, I’m guessing the edge rendering code uses epsilon to define triangles without an edge between them.

epsilon: number

defines the maximal distance between two angles to detect a face

I tried all sorts of things in your Playground and still couldn’t get edges to draw without indices. I even tried mesh.edgesRenderer with new BABYLON.EdgesRenderer and played with the options, especially { useAlternateEdgeFinder: false }, because the EdgesRenderer docs say

checkVerticesInsteadOfIndices: boolean
bases the edges detection on vertices vs indices. Note that this parameter is not used if options.useAlternateEdgeFinder = true

I couldn’t get EdgesRenderer to ever display edges.
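
For reference, the calls I tried looked roughly like this (the parameter values were just experiments):

mesh.enableEdgesRendering(0.95, true, { useAlternateEdgeFinder: false });
mesh.edgesWidth = 4;
mesh.edgesColor = new BABYLON.Color4(0, 0, 0, 1);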

Here’s a slightly modified Playground that sets mesh._unIndexed = !use_indices, and sets color to black.

But still no luck getting edges to render without indices.

@HiGreg Yeah, I really appreciate that. It seems to be very difficult to render edges without the use of indices; introducing indices would be the easiest way to make it work. I guess the best approach would be to rewrite the whole EdgesRenderer, but there are still many parts of the code that I do not quite understand. :sweat_smile:

EdgesRenderer is based on the assumption that you can provide a face index: Babylon.js/packages/dev/core/src/Rendering/edgesRenderer.ts at 5aabb87adeae59d5f6fc06387c7c10312dac6e34 · BabylonJS/Babylon.js (github.com)

Hence the need for indices :frowning:

== begin incorrect interpretation ==

Don’t the if statements on lines 833, 846, and 859 preclude the need for the source mesh to have indices? And it looks to me like faces (constructed by choosing only facet edges whose adjacent facets meet at an angle greater than a value calculated from epsilon) seem to be found by the algorithm. The facets considered come from the source mesh’s indices only if (this._checkVerticesInsteadOfIndices) is true, while the facets come directly from the positions data (without indices) if (this._checkVerticesInsteadOfIndices) is false.

== end incorrect interpretation==

Oh, I see it. Lines 788, 789, and 790 require the source’s indices. Do you think this is necessary? Why not get the positions directly from the positions array instead of indirectly through the indices array? A lot of other code depends on the indices, but maybe a motivated contributor could refactor out the use of the indices array.

It’s not clear to me what checkVerticesInsteadOfIndices actually does, though I do see it in the code. Thank you for the pointer to the code, it helped me understand a lot!

I think I finally found a good way to efficiently render a mesh’s edges. The playground may give some hints. The basic idea is to leverage buffers that are written by a compute shader: the edge renderer can read those buffers directly and sidestep the heavy CPU computation. The standard implementation processes all edges on the CPU side (e.g., the createLine function), which is undesirable for performance. We just need to implement lines 42~62 in a compute shader so that everything runs on the GPU and becomes fast.

Besides, after reading the source code, the edges are actually rendered as triangles, and the variable _linesNormals does not contain real normals; it is more of a helper attribute that identifies the other end of each line segment. See the vertex shader for more details, where line 36 rotates the direction vector so that it points along the width direction; it is then used to widen the triangle slightly by offsetting the vertex coordinates.
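
For anyone trying to reproduce this, here is a rough sketch of the buffer setup (untested; it assumes engine is a WebGPU engine and positionsBuffer / edgeVertsBuffer are BABYLON.StorageBuffer instances, and the per-triangle edge expansion is a placeholder for what the playground actually does, since the real EdgesRenderer emits widened triangles rather than plain line endpoints):

const edgeComputeSrc = `
@group(0) @binding(0) var<storage, read> positions : array<f32>;
@group(0) @binding(1) var<storage, read_write> edgeVerts : array<f32>;

fn readVert(i : u32) -> vec3<f32> {
  return vec3<f32>(positions[i * 3u], positions[i * 3u + 1u], positions[i * 3u + 2u]);
}
fn writeVert(i : u32, v : vec3<f32>) {
  edgeVerts[i * 3u] = v.x; edgeVerts[i * 3u + 1u] = v.y; edgeVerts[i * 3u + 2u] = v.z;
}

@compute @workgroup_size(64)
fn main(@builtin(global_invocation_id) id : vec3<u32>) {
  let tri = id.x;
  // One thread per triangle; 9 floats of positions per unindexed triangle.
  if (tri >= arrayLength(&positions) / 9u) { return; }
  let a = readVert(tri * 3u);
  let b = readVert(tri * 3u + 1u);
  let c = readVert(tri * 3u + 2u);
  // Emit the 3 edges of this triangle as 6 line endpoints.
  writeVert(tri * 6u, a);      writeVert(tri * 6u + 1u, b);
  writeVert(tri * 6u + 2u, b); writeVert(tri * 6u + 3u, c);
  writeVert(tri * 6u + 4u, c); writeVert(tri * 6u + 5u, a);
}`;

const cs = new BABYLON.ComputeShader("edges", engine, { computeSource: edgeComputeSrc }, {
  bindingsMapping: {
    positions: { group: 0, binding: 0 },
    edgeVerts: { group: 0, binding: 1 },
  },
});
cs.setStorageBuffer("positions", positionsBuffer); // output of the existing compute pass
cs.setStorageBuffer("edgeVerts", edgeVertsBuffer); // sized numTriangles * 6 * 3 * 4 bytes
cs.dispatch(Math.ceil(numTriangles / 64), 1, 1);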
