Most performant way to check depth order of GUI buttons and meshes


I wonder what the most efficient way is to check whether a mesh is behind or in front of a 2D GUI button. Each of my GUI buttons is linked to a mesh. The mesh is disabled because its only purpose is to position the GUI button in 3D space; thanks to the mesh I can get the Vector3 position without any problems. My idea would be to cast a ray between the GUI position and the camera position and check whether it hits another mesh. Since I have many different buttons, I can imagine that calculating the rays all the time will be expensive. Is there another approach to solve this? What would be best practice?
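In Babylon.js the ray test itself would typically go through scene.pickWithRay; as a hedged sketch of the underlying geometry, here is the camera-to-button segment tested against a sphere standing in for an occluding mesh (all names here are illustrative, not Babylon API):

```javascript
// Geometry behind the ray idea: does anything sit on the segment from the
// camera to the button's 3D anchor? A sphere stands in for the blocking mesh;
// in real code scene.pickWithRay with a predicate would replace this test.

const sub = (a, b) => ({ x: a.x - b.x, y: a.y - b.y, z: a.z - b.z });
const dot = (a, b) => a.x * b.x + a.y * b.y + a.z * b.z;
const len = (v) => Math.sqrt(dot(v, v));

// True when the sphere (center, radius) blocks the camera -> anchor segment.
function isOccludedBySphere(camera, anchor, center, radius) {
  const dir = sub(anchor, camera);
  const dist = len(dir);
  const unit = { x: dir.x / dist, y: dir.y / dist, z: dir.z / dist };
  // Distance along the ray of the point closest to the sphere center.
  const t = dot(sub(center, camera), unit);
  if (t < 0 || t > dist) return false; // sphere is not between the two points
  const closest = {
    x: camera.x + unit.x * t,
    y: camera.y + unit.y * t,
    z: camera.z + unit.z * t,
  };
  return len(sub(center, closest)) <= radius;
}
```

The per-button cost concern stands: a real pick runs this kind of test against every candidate mesh, which is why doing it for many buttons on every frame adds up.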


Adding @msDestiny14

Hi there! Is this the scenario you are trying to check?
By using a fullscreen GUI, the button will always render on top even if there is a mesh in front of it.

The other scenario, which now that I think about it may be what you mean, is a GUI that is textured onto a mesh. In that case a ray would be a good idea to see if there is any object in between.

If you just want to see which one is closer to the camera along a given direction, I’d imagine you’d use the “look vector” (not sure what it’s specifically called) of the camera and calculate the distance of each mesh relative to it.
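That comparison boils down to a dot product: project each position onto the camera's forward direction and compare the results. A plain-JS sketch (in Babylon the forward direction could come from camera.getForwardRay().direction; the function name here is made up):

```javascript
// Depth of a point along the camera's viewing direction: the dot product of
// (point - cameraPos) with the normalized forward vector. Whichever of two
// positions has the larger value is farther away along that direction.

function depthAlongView(cameraPos, forward, point) {
  const d = {
    x: point.x - cameraPos.x,
    y: point.y - cameraPos.y,
    z: point.z - cameraPos.z,
  };
  const flen = Math.sqrt(forward.x ** 2 + forward.y ** 2 + forward.z ** 2);
  return (d.x * forward.x + d.y * forward.y + d.z * forward.z) / flen;
}
```

Note this only ranks distances along the view axis; it says nothing about whether one object actually covers the other on screen.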

Please let me know which scenario seems correct. :slight_smile:


Thanks for your response. I want to know if a GUI button is occluded. Of course that is not directly possible, since the buttons are not in the 3D space.

The easiest solution would be to enable my helper meshes and test mesh.isOccluded on each frame. But since these meshes exist only to determine the correct positions of the buttons, I don’t want to spend draw calls on them.
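For reference, Babylon's hardware occlusion queries are enabled through two properties on the mesh. A minimal sketch, with a plain object standing in for the mesh so it runs standalone, and the constant values inlined (in real code use the named BABYLON.AbstractMesh constants, which these values mirror as far as I know):

```javascript
// Hedged sketch of Babylon's hardware occlusion queries: the engine issues a
// query against the mesh's bounding volume and updates mesh.isOccluded each
// frame. A plain object stands in for the mesh; prefer the named
// BABYLON.AbstractMesh constants over these raw values in real code.

const OCCLUSION_TYPE_OPTIMISTIC = 1;        // BABYLON.AbstractMesh.OCCLUSION_TYPE_OPTIMISTIC
const OCCLUSION_ALGORITHM_CONSERVATIVE = 1; // BABYLON.AbstractMesh.OCCLUSION_ALGORITHM_TYPE_CONSERVATIVE

function enableOcclusionQuery(mesh) {
  mesh.occlusionType = OCCLUSION_TYPE_OPTIMISTIC;
  mesh.occlusionQueryAlgorithmType = OCCLUSION_ALGORITHM_CONSERVATIVE;
  return mesh;
}
```

One caveat: the query itself still renders the bounding volume, so this may not fully avoid the draw-call cost being saved here.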

So I just want to check whether a specific vector is occluded. I saw some posts from you, @Evgeni_Popov, saying that this could be done by checking the z-buffer. I found a few different approaches from you:



Which approach do you suggest from a performance perspective? Or do you have something else in mind?
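For what it's worth, the z-buffer comparison itself (the part that would run after depthRenderer.getDepthMap().readPixels() in Babylon) can be sketched like this, assuming a row-major buffer of normalized depths, one float per pixel, and an already-projected pixel position and depth for the button (both assumptions, not Babylon specifics):

```javascript
// Compare the button's projected depth against the depth stored in the
// buffer at its pixel: if some surface was rendered closer to the camera,
// the button's 3D anchor is occluded. Buffer layout and the epsilon are
// assumptions; adapt them to the actual depth renderer output.

function isPointOccluded(depthBuffer, width, px, py, pointDepth, eps = 1e-3) {
  const stored = depthBuffer[py * width + px];
  return stored + eps < pointDepth;
}
```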

Is the shader method even possible with the GUI? I wouldn’t be able to apply the ShaderMaterial to the buttons, so I would have to switch to sprites, which would cause additional draw calls?


I use rendering on demand to improve the performance of my scene - something similar to this:

Since I want to animate my buttons, I decided to create DOM elements placed above the canvas and project the 3D positions onto them. That works very well.
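The projection step for placing the DOM elements can be sketched as below (Babylon offers BABYLON.Vector3.Project for the same job; the column-major 4x4 view-projection matrix here is an assumption):

```javascript
// Project a world-space point to CSS pixel coordinates: transform by the
// combined view-projection matrix (column-major, as WebGL uses), perform the
// perspective divide, then map NDC [-1, 1] to the canvas size. DOM y grows
// downward, hence the flip.

function toScreen(p, viewProj, width, height) {
  const m = viewProj;
  const w = m[3] * p.x + m[7] * p.y + m[11] * p.z + m[15];
  const ndcX = (m[0] * p.x + m[4] * p.y + m[8] * p.z + m[12]) / w;
  const ndcY = (m[1] * p.x + m[5] * p.y + m[9] * p.z + m[13]) / w;
  return {
    left: (ndcX * 0.5 + 0.5) * width,
    top: (1 - (ndcY * 0.5 + 0.5)) * height,
  };
}
```

The resulting left/top values can then be applied to each button element's style on every rendered frame.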

But this method doesn’t allow me to use the shader solution to adjust the opacity of buttons behind a mesh.

Moving the mouse (in the above-mentioned playground) gives me 6000 absolute FPS. Adding “depthMap.readPixels(0, 0, buffer)” to the onViewMatrixChangedObservable causes an FPS drop to 90. This is very, very expensive. Is there any other method to check whether a Vector3 is behind or in front of a mesh?
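One way to soften that cost (a sketch of a general pattern, not Babylon API) is to decouple invalidation from the read: mark the depth data dirty when the view matrix changes, and perform at most one readPixels every N frames in the render loop:

```javascript
// Throttled read: onViewMatrixChangedObservable only sets a dirty flag; the
// render loop calls onFrame() once per frame and triggers the expensive read
// at most once every `interval` frames. readFn would wrap the actual
// depthMap.readPixels call.

function makeThrottledReader(readFn, interval) {
  let dirty = false;
  let countdown = 0;
  return {
    invalidate() { dirty = true; },
    onFrame() {
      if (countdown > 0) { countdown--; return; }
      if (dirty) {
        dirty = false;
        countdown = interval;
        readFn();
      }
    },
  };
}
```

Newer Babylon versions also expose readPixels in an asynchronous form that returns a Promise, which avoids stalling the GPU pipeline; whether that applies depends on the engine version in use.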


I’m still trying to figure out the best way to render buttons behind and in front of meshes. I have tried several solutions, but all of them have their limitations:

  • Bjs GUI: the fullscreen GUI renders on top of the scene, so I won’t be able to adjust the opacity of buttons behind meshes
  • DOM elements: same problem as with the Bjs fullscreen GUI. On camera movement I can ray cast to check whether there is a mesh between the camera and a button and adjust the opacity depending on whether a mesh is hit, but this is too expensive for multiple buttons.
  • Sprites

The best solution is definitely sprites, since the depth can be handled by the GPU. To make this as efficient as possible, I want to create the buttons as instances and give them a shader. I tried to recreate this shader (+ instance support) in NME, but something must be wrong:

I can’t call setDepthFunction for my instances - this might be the problem. I would really appreciate it if you could guide me here, @Evgeni_Popov.
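In case it helps narrow things down: as far as I understand, the depth comparison is configured on the material, which instances share with their source mesh, so it would be set once rather than per instance. A minimal sketch with a plain object standing in for the ShaderMaterial, using WebGL's gl.LESS value (0x0201), which Babylon's Constants.LESS mirrors:

```javascript
// Sketch: instances share their source mesh's material, so depth state is
// configured once on that material rather than per instance. 0x0201 is
// WebGL's gl.LESS; a plain object stands in for the ShaderMaterial here.

function configureDepth(material) {
  material.depthFunction = 0x0201;    // BABYLON.Constants.LESS
  material.disableDepthWrite = false; // keep writing depth so buttons sort against meshes
  return material;
}
```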