Ray tracing for nested meshes

Hey everyone, so I was ray tracing nested meshes, and the distances that I am getting via the multiPickWithRay function are not what I am visually seeing.

Here are the attached screenshots:


This is case 1, where I hit the navel region and I am getting the body closer to the camera, which is how it looks.


This is case 2, where I again hit the navel region, but now the torso (hoodie) is fitted to the body below and a light ray is traced. However, as the console shows, the distance of the hoodie is larger than the distance of the body.


The light ray path.


I traced a ray at the biceps and the hoodie came back as the top mesh.

Please note that I am using morph targets to fit the torso (hoodie) on the model.
So, I would like to know what I am missing here.


Above is the function used for ray tracing.


Models made me laugh ngl.

If you want to raycast and hit only certain meshes with a ray, you can use a predicate function to decide which meshes are valid.

(method) BABYLON.Scene.multiPickWithRay(ray: BABYLON.Ray, predicate?: (mesh: BABYLON.AbstractMesh) => boolean, trianglePredicate?: BABYLON.TrianglePickingPredicate): BABYLON.PickingInfo[]
Launch a ray to try to pick a mesh in the scene

@param ray — Ray to use

@param predicate — Predicate function used to determine eligible meshes. Can be set to null. In this case, a mesh must be enabled, visible and with isPickable set to true

@param trianglePredicate — defines an optional predicate used to select faces when a mesh intersection is detected

@returns — an array of PickingInfo
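
For example, a minimal sketch (assuming the global BABYLON namespace as in the Playground; the "body" mesh name is hypothetical):

const picks = scene.multiPickWithRay(ray, (mesh) => {
    // Only meshes named "body" are eligible; the hoodie is skipped.
    return mesh.name === "body";
});
// Sort by distance if you need the hit closest to the ray origin first.
picks?.sort((a, b) => a.distance - b.distance);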

Turns out it was a problem with the bounding box information. What's happening is that the bounding info is not getting updated after I vary the influence of the morph targets and refresh the bounding info with mesh.refreshBoundingInfo. I don't know why this strange behaviour occurs, but when I cloned the mesh and re-added it to the scene, it somehow works. Strange. Any insights on this?

You should pass true for the second parameter of the refreshBoundingInfo call if you want the morph to be taken into account in the calculation. But be aware that it is slower, because the morph calculation must be done on the CPU to update the bounding info.
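
For example (a sketch; the first argument applies the skeleton, the second applies the morph targets):

mesh.refreshBoundingInfo(false, true); // evaluate the morph on the CPU when recomputing the bounds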


Hey, thanks for the tip. It worked like a charm. But a question: if I export the scene after refreshing the bounding box with the above approach, the bounding info in the exported file is still the old one. How can I get around this? I want this new bounding info, with the morph applied, to be set and exported as the default from the scene.

I think the bounding info is reconstructed at load time according to the geometry data. You will have to export the modified morphed data if you want to get the corresponding bounding info (or call refreshBoundingInfo after loading).
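
For instance, a sketch of the reload path (URL and file name are hypothetical):

const result = await BABYLON.SceneLoader.ImportMeshAsync("", "/models/", "avatar.glb", scene);
for (const mesh of result.meshes) {
    if (mesh.morphTargetManager) {
        // The exported file only carries the original geometry, so rebuild
        // the bounds with the morph applied after loading.
        mesh.refreshBoundingInfo(false, true);
    }
}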


And how can I export the mesh with the morphed data? Also, I want the morph targets to be frozen and stripped, to reduce the mesh size.

You can call const positions = mesh.getPositionData(true, true) to get the transformed vertices, which you can set by calling mesh.setVerticesData("position", positions). You can do the same thing for the normals, by calling getNormalsData(true, true).

Not sure what you mean by “freeze morph targets”, but if you want to get rid of morph targets once you have baked the positions/normals, you can simply set mesh.morphTargetManager = null.
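
Putting the two replies together, a minimal baking sketch (assuming a Babylon.js version where getNormalsData behaves correctly; see the bug discussed later in this thread):

// Read the morphed data first: both getters need the unmodified buffers.
const positions = mesh.getPositionData(true, true); // apply skeleton + morph
const normals = mesh.getNormalsData(true, true);
// Then overwrite the vertex buffers with the baked values.
mesh.setVerticesData("position", positions);
mesh.setVerticesData("normal", normals);
// The morphs are now baked in, so the manager can be dropped.
mesh.morphTargetManager = null;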


If I set the normals, the hoodie loses its structure, as shown in the image below. Any insights on this?

Check that getNormalsData returns appropriate data (meaning, not all 0, or not all (0,1,0), for example). If it seems OK, I guess you will have to provide a repro in the Playground for us to have a look.
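
A quick sanity check along those lines (a sketch):

const normals = mesh.getNormalsData(true, true);
if (normals) {
    const allZero = normals.every((v) => v === 0);
    console.log("normal count:", normals.length / 3, "all zero:", allZero);
}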

Okay, thanks for the tip. Also, as I can see, morphing takes time. Is there a way to handle this asynchronously?

It does return appropriate data:

Evidently,

Above is my execution order, and the normals are getting updated automatically. I viewed the normal map in the sandbox and can confirm the same.

You should call getNormalsData before replacing the vertices, because this method needs to access the unmodified vertex data.

Once you have replaced the vertices/normals, don’t call refreshBoundingInfo with true/true, or it will apply the morph deformations to data that is already transformed.

I’m not sure what you mean; setVerticesData doesn’t recompute the normals. The normals you viewed in the sandbox are the original normals of your mesh, not modified by the morph.

  1. No, the normal map shown in the sandbox came out different.
  2. I think I am getting confused by the flow. So, just to summarize, you mean to compute the vertex data (positions, normals) from the morphed mesh and set it.

Also, just to confirm: if I update the vertices and normals, will the bounding info be refreshed automatically? But the reverse is not true if I only use refreshBoundingInfo().


Scenario (1)

Okay, so I am trying to figure this out properly (scenario 1).

  1. I applied the morph targets.

  2. Now I needed to apply the new position data based on the applied morphs, so I called getPositionData and set the vertices.

  3. After this, I set the morphTargetManager to null.

(Scenario 2) This seems like half the story to me, though, since the bounding box didn’t get updated in the above process. It only updated once I called refreshBoundingInfo() with false/false and then set morphTargetManager = null. Check the image below for reference.


Scenario (2)
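
In code, scenario 2 boils down to something like this (a sketch; positions is assumed to come from getPositionData as in scenario 1):

mesh.setVerticesData("position", positions);
// Recompute the bounds from the already transformed vertices. Note the
// default false/false arguments: passing true/true here would apply the
// morph a second time to data that is already baked.
mesh.refreshBoundingInfo();
mesh.morphTargetManager = null;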

Now I am trying to understand what role the normals played, because my normals got updated, which I can confirm after exporting this and uploading it to the sandbox.

Try scenario 1 but call mesh.refreshBoundingInfo() at the end and let us know if that fixes it.

Hey, so this doesn’t fix the problem.

Can you explain to me how the bounding box and the vertex and normal updates are connected?

The bounding box is recalculated by refreshBoundingInfo using the vertex data and looking for the min/max of the coordinates (normals are not involved in this calculation).

In fact, you don’t need to call refreshBoundingInfo yourself, because it is done automatically when you call setVerticesData for the “position” kind.
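
Concretely (a sketch), writing the "position" kind refreshes the bounds on its own:

mesh.setVerticesData("position", positions); // triggers an automatic bounds update
const bb = mesh.getBoundingInfo().boundingBox;
console.log("new bounds:", bb.minimum.toString(), bb.maximum.toString());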

However, there’s a bug with the normals when calling getNormalsData; this PR will fix it:

Once the PR is merged, this PG will work as expected:

Currently, after 2s, when we bake the positions/normals, we can see that the vertices are OK but not the normals (the lighting is wrong, but you can see that the bounding box is correct after the baking, without calling refreshBoundingInfo). With the PR, the mesh and lighting are exactly the same before/after the baking.

So can I interpret it as follows:

The refreshBoundingInfo with true/true recalculates the bounding box but never updates the vertex positions.

So, if I want to update the mesh with the current morphs, I need to get the vertices after the morphs are applied, update the vertices (same for the normals), and then the bounding box will be recalculated automatically based on the new vertex positions.

And so, I don’t need to compute the bounding box again.

But this raises one question: I thought vertices and normals are correlated, so the normals should have been updated after updating the vertices. (This actually happened in scenario 2, described above.) This is why, without my manually updating the normal kind, the normals got updated, which brought me to the conclusion that vertices and normals are correlated.

Can you clarify this please?