I read through some articles and came across an interesting solution for placing objects inside each other without visible intersection: creating a cutout shader for doors and windows
I believe clipping planes are a similar approach, but as far as I know they work at scene level and you can create at most four of them. A house with several doors and windows would probably need more than that.
I am very curious to know if we can use this in bjs.
I am not quite sure how to implement it. Mesh A is placed over mesh B, and all vertices of mesh B that are inside mesh A should be invisible. Imagine placing a window inside a wall. Usually I would use CSG to cut a hole into the wall where the window is placed, but since CSG is calculated on the CPU it is very expensive, especially if you want to make it more dynamic, like moving the window to another position, because then you have to rerun CSG every time. Since the window is transparent you can't just leave the wall's vertices there, so a shader that fakes a hole would be great.
As I understand it, that's not how they do it: the door has no vertices in between its sides, but that is exactly the part which is removed from the wall.
I think they simply have a texture that represents the door (more precisely, the space between the sides of the door); they project it onto the wall, and where the texture is projected they discard the fragments.
So every vertex that is hit by the texture projection gets discarded, right? But wouldn't this mean that a mesh behind the door is affected as well if it is hit by the projection? Or is it possible to limit the projection to specific meshes? And how is it even possible to make the door project a texture onto other meshes?
No, it's not the vertices that are discarded but the fragments: I think the clip method they use in the Unity shader code is the discard instruction of the fragment shader. It means that the pixel will not be written (recall that this instruction runs in the fragment shader), and the z-buffer won't be updated either.
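To make the discard semantics concrete, here is a minimal CPU-side sketch in plain JavaScript (hypothetical names, not actual shader code): a discarded fragment writes neither the color buffer nor the z-buffer.

```javascript
// CPU-side sketch of fragment-shader `discard` semantics.
// All names (shadeFragment, maskTexel, ...) are illustrative.
function shadeFragment(frag, maskTexel, colorBuffer, depthBuffer) {
  // Equivalent of `if (mask.g > 0.5) discard;` in GLSL:
  if (maskTexel.g > 0.5) return; // no color write, no z-buffer update
  // Otherwise, the regular depth test and write:
  if (frag.z < depthBuffer[frag.index]) {
    depthBuffer[frag.index] = frag.z;
    colorBuffer[frag.index] = frag.color;
  }
}

const color = ["background"];
const depth = [1.0];

// Fragment hit by the cutout mask: discarded, buffers stay untouched.
shadeFragment({ index: 0, z: 0.5, color: "wall" }, { g: 1 }, color, depth);
const afterDiscard = color[0]; // still "background"

// Fragment outside the mask: written normally.
shadeFragment({ index: 0, z: 0.5, color: "wall" }, { g: 0 }, color, depth);
```

Because the z-buffer is not touched for discarded fragments, whatever is behind the hole remains visible through it.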
No, they won't be affected if they don't use this material: the discard happens in the fragment shader, so it is tied to a material. Set this material on the wall but not on any other objects.
See the projectionTexture of the SpotLight: this texture is passed to the fragment shader and handled with the right math to appear projected onto the mesh(es) the spot light affects:
The projected texture is:
You could imagine having a specific material that uses the same math to project a texture and discards the pixel if the texture has some specific value or meets any other condition (like alpha < 0.5, for example).
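As a rough sketch of that idea in plain JavaScript (illustrative names, not the actual Babylon.js shader code): take the fragment's position in the light's clip space, do the perspective divide, remap [-1, 1] to uv space [0, 1], sample the mask, and discard below the alpha threshold.

```javascript
// Illustrative sketch: decide whether a fragment should be discarded,
// given its position already transformed into the light's clip space.
function shouldDiscard(clipPos, sampleAlpha) {
  // Perspective divide, then remap clip space [-1, 1] to uv [0, 1]:
  const u = (clipPos[0] / clipPos[3]) * 0.5 + 0.5;
  const v = (clipPos[1] / clipPos[3]) * 0.5 + 0.5;
  // Equivalent of `if (texture2D(mask, uv).a < 0.5) discard;` in GLSL:
  return sampleAlpha(u, v) < 0.5;
}

// A mask that is transparent in the middle (the "hole"):
const mask = (u, v) =>
  u > 0.25 && u < 0.75 && v > 0.25 && v < 0.75 ? 0.0 : 1.0;

const centerDiscarded = shouldDiscard([0, 0, 0, 1], mask);   // true
const borderDiscarded = shouldDiscard([0.9, 0, 0, 1], mask); // false
```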
To illustrate this, I have hacked the standard material code in the PG above to discard fragments where the rgb value in the spot light's projected texture is (0, 1, 0):
I have also added a cube so that you can see it stays visible where the pixels are discarded. And because it doesn't have the customized standard material that I set on the ground, its pixels are not discarded.
Thank you for the explanation. This solution definitely works for me. I will apply the spot light only to the meshes I want to "cut". While creating a PG I noticed two things:
The shader breaks if I use a dynamic texture instead of the image. I would like to create rectangles or other shapes dynamically via canvas. Setting the fillColor to green should be the same as loading an image with green values.
Can I use this shader snippet with the PBR material?
Your problem comes from the sampling mode: as it is bilinear (even trilinear) by default, it performs an average between red and the background color (black) at the border, leading to a color different from full red. You should disable mipmapping and use the nearest filter to be safe:
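To see why bilinear filtering breaks an exact color comparison, here is a 1-D sketch in plain JavaScript (illustrative only, not Babylon.js code) of sampling a row of texels [black, red, red, black] between texel centers:

```javascript
// Red channel of a 4-texel row: [black, red, red, black].
const texels = [0, 255, 255, 0];

// Nearest filtering: pick the closest texel, no blending.
function nearest(u) {
  const x = Math.min(texels.length - 1, Math.floor(u * texels.length));
  return texels[x];
}

// Bilinear filtering: linear blend of the two neighboring texels.
function bilinear(u) {
  const x = u * texels.length - 0.5;
  const i = Math.floor(x);
  const t = x - i;
  const a = texels[Math.max(0, i)];
  const b = texels[Math.min(texels.length - 1, i + 1)];
  return a * (1 - t) + b * t;
}

const nearestSample = nearest(0.3);   // 255: full red, equality test passes
const bilinearSample = bilinear(0.3); // ~178.5: averaged with black, fails
```

In Babylon.js terms, the fix from the quote maps (as far as I know) to creating the DynamicTexture with `generateMipMaps = false` and calling `updateSamplingMode(BABYLON.Texture.NEAREST_SAMPLINGMODE)` on it.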
I have to say that I really like this solution. I created a helper that calculates the position relative to the target object, so I can easily remove some wall to place a window or something: https://playground.babylonjs.com/#9UU80W#12
But there are some issues I didn't have in mind before. Since the wall isn't flat and the spotlight projects the texture at a specific angle, the spotlight affects the two sides of the wall differently. This leads to an offset on the back side of the wall if the angle and position of the spotlight are configured to fit the front side.
My first idea was to reduce the angle. This makes the offset smaller, but first, it never reaches zero, and second, the spotlight has to move very far away to keep the shape of the target while using a small angle. This also changes the color of the material:
You need an orthographic projection instead of a perspective projection to get rid of the offset problem.
You are currently not able to update the projection matrix yourself; I think I will make a PR for that. In the meantime, you can update some private variables to use your own matrix:
See line 62 / 63. The width/height values I pass to BABYLON.Matrix.OrthoLH are values you should compute yourself depending on the target location: I have tried different values and 75/75 fit quite well with the setup of the PG.
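The reason the orthographic projection fixes the offset can be sketched with simplified 1-D math in plain JavaScript (made-up names, not `BABYLON.Matrix.OrthoLH` itself): the orthographic projected coordinate does not depend on the fragment's depth, so both faces of the wall receive the same cut, while a perspective projection divides by depth.

```javascript
// Simplified 1-D projection math (illustrative only).
function orthoX(x, z, width) {
  return x * (2 / width); // z is ignored entirely
}
function perspX(x, z, fovScale) {
  return (x * fovScale) / z; // divides by depth
}

const front = 10;
const back = 10.3; // the two faces of a 0.3-unit-thick wall

// Orthographic: both faces project to the same coordinate -> same cut.
const orthoSame = orthoX(2, front, 75) === orthoX(2, back, 75); // true
// Perspective: the coordinates differ -> offset between the faces.
const perspSame = perspX(2, front, 1) === perspX(2, back, 1);   // false
```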
Regarding the aliasing problem, it will be hard to fix: you won't ever be able to fit the window exactly inside the wall, so there will be some rendering artifacts at the border. You can try FXAA + multisamples (see my PG above) to soften them a bit. The best fix would be to add a window frame to hide the borders.
I played around with some values and you were right: 75/75 doesn't work for all sizes. It behaves very differently for different width and height combinations of the target object. Can I calculate the width and height dynamically from the given size of the target object? I can't figure it out.
The dynamic texture is now fully used (as in your first PG) and the projection is computed so that this texture matches the glass window: see line 61
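One way to derive such values (a plain-JavaScript assumption on my part, not necessarily what the PG does at line 61) is to take the orthographic width/height from the target's world-space bounding extents, so the projected texture exactly covers the target:

```javascript
// Hypothetical helper: derive the orthographic projection size from a
// target's world-space bounding box (min/max as [x, y, z] arrays).
function orthoSizeFromBounds(min, max) {
  return {
    width: max[0] - min[0],  // extent along the projection's x axis
    height: max[1] - min[1], // extent along the projection's y axis
  };
}

// A 2 x 3 window, 0.3 units thick:
const size = orthoSizeFromBounds([-1, -1.5, 0], [1, 1.5, 0.3]);
// size.width === 2, size.height === 3
```

In Babylon.js, the extents could come from `mesh.getBoundingInfo().boundingBox.minimumWorld` / `maximumWorld`, expressed in the light's frame.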
Note lines 24-27: I had to perform the projection myself instead of calling the existing function because we need to check that the projected uv coordinates are in the 0…1 range
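That range check can be sketched like this in plain JavaScript (illustrative names): fragments whose projected uv falls outside the texture must not be discarded, otherwise the cut would extend past the texture's borders.

```javascript
// Only apply the cutout test when the projected uv is inside [0, 1]^2.
function insideProjection(u, v) {
  return u >= 0 && u <= 1 && v >= 0 && v <= 1;
}

function shouldDiscard(u, v, sampleAlpha) {
  return insideProjection(u, v) && sampleAlpha(u, v) < 0.5;
}

// A fully transparent mask would otherwise discard everything:
const transparentEverywhere = () => 0.0;
const inRange = shouldDiscard(0.5, 0.5, transparentEverywhere);  // true
const outRange = shouldDiscard(1.5, 0.5, transparentEverywhere); // false
```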
That's because iOS (Safari) uses WebGL1, not WebGL2: the texture function does not exist in WebGL1; you must use texture2D (line 26 in the PG).
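One possible workaround (a hypothetical helper, not an existing Babylon.js API) is to patch the sampler call in the shader source depending on the WebGL version before compiling it:

```javascript
// Hypothetical shim: WebGL1 (GLSL ES 1.00) only knows texture2D, while
// WebGL2 (GLSL ES 3.00) uses the overloaded texture() function.
function adaptSamplerCalls(glsl, isWebGL2) {
  return isWebGL2 ? glsl : glsl.replace(/\btexture\(/g, "texture2D(");
}

const src = "vec4 c = texture(maskSampler, vUV);";
const gl1 = adaptSamplerCalls(src, false);
// gl1 === "vec4 c = texture2D(maskSampler, vUV);"
const gl2 = adaptSamplerCalls(src, true); // unchanged
```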
Unfortunately I am not able to get it working in my local project. Creating the custom material is fine, but after assigning the material to the mesh it crashes:
This also happens in the PG if the material is assigned without creating the spotlight. Is it possible to avoid this? Otherwise it always crashes whenever no spotlight with a projection texture is available in the scene, but sometimes no hole is needed and therefore no spotlight either.
I already tried waiting until the spotlight and the projection texture are ready, but I still receive the same error.
Thanks again, your help is very much appreciated. I really don't want to bother you with this topic anymore, but I noticed two problems. I hope these will be the last ones.
1
I noticed that the error in my local project doesn't come from the missing identifier in the shader but from having a directional light in the scene. Just adding a directional light breaks it: https://playground.babylonjs.com/#9UU80W#52
2
Imagine having a room with four walls, merged into one single mesh. The initial idea was to place the spotlight inside the room so it can project the texture from the inside out. Due to the orthographic projection, the projection goes through the whole scene, so it cuts the walls on both sides. https://playground.babylonjs.com/#9UU80W#54
(I adapted the projection matrix and the position and direction of the spotlight to cut the right hole depending on the rotation and position of the target object. That works fine, but position and direction obviously can't solve this problem.)