I am working on a project where I need to pick animated meshes. I read that, for performance reasons, the animation is done on the GPU while picking uses the CPU, so the pick can't see the animated data. Is there any workaround? At the moment of picking, the mesh is not animating: it can animate into different poses, and then the user should be able to pick it. Since the mesh should remain animatable after the pick, I would like to avoid something like baking. Is there any way to do that?
Pinging @Deltakosh, but I think it would hurt performance too much if the normals were also recomputed on the CPU. Currently, only the positions are recomputed to take into account the skeleton/morph targets that may exist.
You can try to get the 3 vertices of the picked face (pickInfo.faceId) and compute the normal from them (the updated vertex positions can be retrieved from mesh._positions and the face indices from mesh.getIndices()).
The updated positions are in mesh._positions, not in mesh.getVerticesData("position") (which are the initial positions, without the skeleton/morph modifications applied).
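A minimal sketch of that per-face normal computation, assuming positions is a flat [x, y, z, x, y, z, ...] array (e.g. the data behind mesh._positions flattened into one buffer) and indices comes from mesh.getIndices(). The faceNormal helper and the sample triangle are made up for illustration; this is not an engine API:

```javascript
// Compute the unit normal of face `faceId` from a flat position
// buffer and an index buffer (3 indices per triangular face).
function faceNormal(positions, indices, faceId) {
  // Vertex indices of the face's three corners
  const i0 = indices[faceId * 3];
  const i1 = indices[faceId * 3 + 1];
  const i2 = indices[faceId * 3 + 2];

  // Each vertex occupies 3 floats, so the vertex index is scaled by 3
  const read = (i) => [positions[i * 3], positions[i * 3 + 1], positions[i * 3 + 2]];
  const [p0, p1, p2] = [read(i0), read(i1), read(i2)];

  // Two edge vectors of the triangle
  const e1 = [p1[0] - p0[0], p1[1] - p0[1], p1[2] - p0[2]];
  const e2 = [p2[0] - p0[0], p2[1] - p0[1], p2[2] - p0[2]];

  // Their cross product is perpendicular to the face
  const n = [
    e1[1] * e2[2] - e1[2] * e2[1],
    e1[2] * e2[0] - e1[0] * e2[2],
    e1[0] * e2[1] - e1[1] * e2[0],
  ];

  // Normalize to unit length
  const len = Math.hypot(n[0], n[1], n[2]);
  return [n[0] / len, n[1] / len, n[2] / len];
}

// Example: a single triangle in the XY plane -> normal points along +Z
const positions = [0, 0, 0,  1, 0, 0,  0, 1, 0];
const indices = [0, 1, 2];
const n = faceNormal(positions, indices, 0); // -> [0, 0, 1]
```

Note that the winding order of the indices decides which side the normal points to, the same convention the renderer uses for back-face culling.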
Sure, I noticed that you mentioned that. But I tried it with the initial pose first, so the mesh isn't modified by the skeleton, yet the vertex positions are still not correct. Is this the right way to get them?
const v1 = new Vector3(
    positions[indices[faceId * 3]],
    positions[indices[faceId * 3 + 1]],
    positions[indices[faceId * 3 + 2]]
);
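One likely issue with the snippet above: each of the three expressions reads a single float from a different vertex, rather than the x, y, and z of the first vertex. With a flat [x, y, z, ...] buffer, the vertex index itself must also be scaled by 3. A minimal sketch of the usual indexing, using plain arrays instead of Vector3 and made-up sample data:

```javascript
// Read corner `k` (0, 1 or 2) of face `faceId` from a flat
// [x, y, z, ...] position buffer via the index buffer.
function faceVertex(positions, indices, faceId, k) {
  const vi = indices[faceId * 3 + k]; // vertex index of corner k
  // The vertex index is scaled by 3 to locate its x/y/z floats
  return [positions[vi * 3], positions[vi * 3 + 1], positions[vi * 3 + 2]];
}

// One triangle whose vertices are (1,2,3), (4,5,6), (7,8,9)
const positions = [1, 2, 3,  4, 5, 6,  7, 8, 9];
const indices = [0, 1, 2];

const v1 = faceVertex(positions, indices, 0, 0); // -> [1, 2, 3]
const v2 = faceVertex(positions, indices, 0, 1); // -> [4, 5, 6]
```

With Babylon's Vector3, the three returned components would simply become the constructor arguments for each vertex.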