Hi,
The mesh using normals from VertexData.ComputeNormals looks weird — is anything wrong with it?
https://playground.babylonjs.com/#WGBIS5
Thanks!
From my point of view it looks nice (probably because I don't know how it should look).
Could you kindly provide a reference for the desired look?
I think the issue here is with the quality of the triangles. The normal of a vertex is computed from the triangles that contain that vertex, and having triangles that are too "stretched" leads to poor-quality normals:
The same mesh shape can be achieved in a variety of ways combining vertices, edges and faces, and those different ways are the possible "topologies". Some topologies are better than others, like this example in Blender:
When triangular facets share vertices by having the same index the normal for the vertex is the average of facet normals, see Vertex Normals | Babylon.js Documentation
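The averaging described above can be sketched in plain JavaScript. This is a hypothetical `computeAveragedNormals` helper illustrating the idea, not Babylon's actual implementation (BJS is left-handed, so sign conventions may differ): each facet normal is a cross product of two edges, every vertex accumulates the normals of the facets that reference its index, and the sum is normalized.

```javascript
// Sketch of averaged vertex normals (the idea behind VertexData.ComputeNormals).
// positions is a flat [x, y, z, ...] array, indices groups vertices into facets.
function computeAveragedNormals(positions, indices) {
  const normals = new Array(positions.length).fill(0);
  for (let i = 0; i < indices.length; i += 3) {
    const [a, b, c] = [indices[i] * 3, indices[i + 1] * 3, indices[i + 2] * 3];
    // two edge vectors of the facet
    const e1 = [positions[b] - positions[a], positions[b + 1] - positions[a + 1], positions[b + 2] - positions[a + 2]];
    const e2 = [positions[c] - positions[a], positions[c + 1] - positions[a + 1], positions[c + 2] - positions[a + 2]];
    // facet normal = e1 x e2
    const n = [
      e1[1] * e2[2] - e1[2] * e2[1],
      e1[2] * e2[0] - e1[0] * e2[2],
      e1[0] * e2[1] - e1[1] * e2[0],
    ];
    // every vertex of the facet accumulates the facet normal
    for (const v of [a, b, c]) {
      normals[v] += n[0];
      normals[v + 1] += n[1];
      normals[v + 2] += n[2];
    }
  }
  // normalize the accumulated sums to get the averaged direction
  for (let i = 0; i < normals.length; i += 3) {
    const len = Math.hypot(normals[i], normals[i + 1], normals[i + 2]) || 1;
    normals[i] /= len; normals[i + 1] /= len; normals[i + 2] /= len;
  }
  return normals;
}

// A unit quad in the XY plane, two facets sharing vertices 0 and 2:
// both facets face +Z, so every averaged vertex normal comes out as (0, 0, 1).
const positions = [0, 0, 0,  1, 0, 0,  1, 1, 0,  0, 1, 0];
const indices = [0, 1, 2, 0, 2, 3];
console.log(computeAveragedNormals(positions, indices));
```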
Converting to a flat shaded mesh gives a different look to the mesh https://playground.babylonjs.com/#WGBIS5#1 which may be what you need.
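The core idea behind flat shading can be sketched in plain JavaScript with a hypothetical `unshareVertices` helper (an illustration, not the real `convertToFlatShadedMesh`): each facet gets its own copies of its three vertices, so no vertex is shared between facets and each facet can carry its own un-averaged normal.

```javascript
// Sketch: duplicate every referenced vertex so facets no longer share indices.
// positions is a flat [x, y, z, ...] array, indices groups vertices into facets.
function unshareVertices(positions, indices) {
  const flatPositions = [];
  const flatIndices = [];
  indices.forEach((idx, i) => {
    // copy this vertex's coordinates for this facet alone
    flatPositions.push(positions[idx * 3], positions[idx * 3 + 1], positions[idx * 3 + 2]);
    flatIndices.push(i); // indices become the trivial sequence 0, 1, 2, ...
  });
  return { flatPositions, flatIndices };
}

const { flatPositions, flatIndices } = unshareVertices(
  [0, 0, 0,  1, 0, 0,  1, 1, 0,  0, 1, 0], // 4 shared vertices
  [0, 1, 2, 0, 2, 3]                        // 2 facets sharing vertices 0 and 2
);
console.log(flatPositions.length / 3); // 6 vertices after unsharing: each facet owns 3
```

This is also why flat shading increases the vertex count, which is the trade-off mentioned later in the thread.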
Maybe this old explanation can help with understanding:
BTW, it's impressive that 6-year-old PG examples from the former forum still work with the current version of BJS.
That's proof of really effective backward compatibility!
Hi all,
To make it clearer, I changed the light direction to zero to compare the looks.
This is what I need:
https://playground.babylonjs.com/#WGBIS5#4
And this doesn't look good:
https://playground.babylonjs.com/#WGBIS5#5
The reason I have this issue is that I'd like to merge all my meshes into an SPS, and the SPS must be provided with normals to build the mesh (if I remove line 29 in the PG below, errors happen; is it a bug?)
https://playground.babylonjs.com/#WGBIS5#6
Thanks.
Why SPS? SPS is for repeating one mesh many times, not for a single mesh.
Not a bug, SPS calculates its normals from the supplied mesh normals.
My scene has huge meshes, and I'm trying to use an SPS to improve the performance. When I add all the meshes into the SPS, it looks very different from the original; that's why I think the normals cause this issue.
Yes man!!
If the problem is that the meshes are too big, I don't think adding them all to an SPS will help, because that will create an even bigger mesh…
It might help though to go the opposite direction and break each mesh into smaller meshes; that way the parts that are offscreen aren't rendered as much.
Check this out for optimizing your scene: Optimizing Your Scene | Babylon.js Documentation (babylonjs.com)
The documentation on Optimizing Your Scene has a section on how to optimize a "Scene with large number of meshes", but it would be good to also have a section on how to optimize a "Scene with large meshes". IDK enough about it to volunteer to add it myself though…
One needs to define more exactly what a "large mesh" actually is.
Is it one inseparable mesh, or a group of meshes, or is it just too big for the camera to fit in the view?
There could be different approaches for different cases of "large meshes".
But in all cases the first question is "What should the user experience at any given moment, and how can this be achieved with minimal losses?"
Hi all,
Sorry I didn't describe it clearly; actually I have a huge number of meshes, and I'm trying an SPS to reduce the draw calls.
I'd like to come back to the normals:
https://playground.babylonjs.com/#WGBIS5#4
In this PG, I removed the normals of vertexData in line 29, but it looks fine.
So I wonder: where does BABYLON compute the normals in this case?
A mesh without normals should be black, right? @Deltakosh @jerome
Thanks.
Hi @sebavan ,
You hit my original concern:
Since the normals computed in the shader (PG4) and by VertexData.ComputeNormals (PG5) look so different,
that's why I suspect the normals are not correct.
Thanks
@Freeman when you call
BABYLON.VertexData.ComputeNormals(positions, indices, normals);
and set the mesh normals using
vertexData.normals = normals;
the function calculates the normals based on the algorithm described in earlier posts, and the shader uses these values to render the mesh.
Should you bypass these steps, the shader uses its own (different) algorithm to calculate the normals.
The ComputeNormals function has access to all the facet triangles that share a particular vertex index, and so it calculates the average of the shared facet normals and applies this to the vertex.
A shader can only access one facet triangle at a time, so a shared vertex normal will end up being that of the last facet accessed, and these normals used by the renderer are not stored by the mesh on the CPU side.
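The contrast between the two behaviours can be sketched with plain numbers (hypothetical values, not Babylon code): for two facets folded 90° apart that share a vertex, overwriting keeps only the last facet's normal, while averaging blends the two.

```javascript
// Two facet normals meeting at a shared vertex:
const facetNormals = [[0, 0, 1], [0, 1, 0]]; // facet A faces +Z, facet B faces +Y

// "Last write wins": assigning each facet normal straight into the shared
// vertex slot leaves only the normal of the last facet processed.
const overwritten = facetNormals[facetNormals.length - 1];

// Averaging (the shared-vertex behaviour of ComputeNormals): sum, then normalize.
const sum = [0 + 0, 0 + 1, 1 + 0];
const len = Math.hypot(...sum);
const averaged = sum.map(v => v / len);

console.log(overwritten); // [0, 1, 0]
console.log(averaged);    // [0, ~0.707, ~0.707] — halfway between the two facets
```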
Now, because of the way the SPS is built, the algorithm used expects to find the normals stored in the passed mesh;
SPS.buildMesh()
finds non-existent normals at the step where it needs them and so produces an error long before it gets to calling the shader.
One way to solve your problem is, as I said, to use convertToFlatShadedMesh.
However, as this increases the number of indices and maybe the number of vertices, there is another way, which is to correct your mesh data - coming in the next post.
When you look at the data in your positions array, some z values are 12.5 and some are 12.3. The difference, though minor, is enough to produce different averaged normals at different vertices, and the difference between these averaged normals is enough to produce different rendering for different facet triangles.
By making all the z values 12.5, your mesh lies completely in the XY plane, so all the facet normals point along the Z axis, and so do their averages.
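A minimal sketch of that correction (a hypothetical `snapZ` helper; positions is the usual flat [x, y, z, …] array):

```javascript
// Snap every z value in a flat positions array to a single plane, so that
// all facet normals (and therefore their averages) point the same way.
function snapZ(positions, z) {
  const fixed = positions.slice(); // leave the original array untouched
  for (let i = 2; i < fixed.length; i += 3) fixed[i] = z;
  return fixed;
}

// Middle vertex drifted to z = 12.3; snapping puts it back at 12.5.
console.log(snapZ([0, 0, 12.5,  1, 0, 12.3,  1, 1, 12.5], 12.5));
```

After snapping, running ComputeNormals again gives identical normals at every vertex, and the uneven shading disappears.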