Generated from OpenStreetMap data, I have XYZ and UV values and a texture material.
I generate a BABYLON.Mesh("custom", …) for, e.g., an OSM building.
Now I have many buildings with different colors.
So I use subMeshes and a MultiMaterial. A city is visible, fine.
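Roughly, the setup looks like this (a simplified, untested sketch, not my real code; the arrays, the counts and the two example materials are placeholders for the OSM-derived data):

const city = new BABYLON.Mesh("city", scene);

const vertexData = new BABYLON.VertexData();
vertexData.positions = positions;  // flat [x, y, z, ...] array built from the OSM footprints
vertexData.indices = indices;
vertexData.uvs = uvs;              // first UV set, for the facade texture
vertexData.applyToMesh(city);

const multiMat = new BABYLON.MultiMaterial("cityMat", scene);
multiMat.subMaterials.push(brickMat, concreteMat);  // one material per building color/texture
city.material = multiMat;

// one SubMesh per building: (materialIndex, verticesStart, verticesCount, indexStart, indexCount, mesh)
city.subMeshes = [];
new BABYLON.SubMesh(0, 0, vertexCountA, 0, indexCountA, city);
new BABYLON.SubMesh(1, vertexCountA, vertexCountB, indexCountA, indexCountB, city);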
BUT! Some of the buildings have windows, done with a second, transparent texture.
And in this case there are two sets of UV values!
(As there are many combinations of textures, I can’t make one mixed texture.)
With ThreeJS, I solved this with a workaround:
I used both sets of UVs, duplicated the XYZs, and added two subMeshes and two materials.
But that doubles the GPU load too.
I assume a shader material should be used, but I do not know shaders.
Babylon certainly offers more than one solution to this.
Is there a multi-texture material?
If a shader is the way, could someone show me how? A playground would be heaven.
Materials with two textures in a MultiMaterial should be no problem.
Do I just have to add the two sets of UV values to the UV array?
… and, to add some more confusion to your thinking, I can tell you that some users have already built cities with many buildings using the Solid Particle System (SPS).
Not sure it fits your needs, but I had to mention this possibility too.
As OSM has “real” buildings, each has a different footprint. Cloning or instancing will not work. Could I make “particles” of different shapes? Different materials? Two textures merged?
But a wood of trees could be an SPS. I tried clones, instances, and an SPS with trees, mainly to check which is the fastest solution. And the fastest was still the custom mesh.
You can build solid particles of different shapes and apply to each a different part of the same texture, like you would do with a texture atlas or a sprite sheet.
Mmh… maybe you could put the different images (textures) into a single one, side by side, so it would look like an atlas. Then you could assign each image to the required solid particles.
Anyway, it would probably be far faster because the SPS renders everything in a single draw call. That's one of its reasons for existing, actually.
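Something like this, as a rough, untested sketch (buildingMeshes and atlasMaterial stand for whatever you build from your OSM data):

const sps = new BABYLON.SolidParticleSystem("buildings", scene);
buildingMeshes.forEach((m) => sps.addShape(m, 1));  // one particle per distinct building shape
const spsMesh = sps.buildMesh();
buildingMeshes.forEach((m) => m.dispose());         // the source meshes are no longer needed

sps.initParticles = () => {
  for (let i = 0; i < sps.nbParticles; i++) {
    // (x, y) = lower-left and (z, w) = upper-right corner of this particle's region in the atlas
    sps.particles[i].uvs = new BABYLON.Vector4(0.0, 0.0, 0.5, 1.0);
  }
};
sps.initParticles();
sps.setParticles();

spsMesh.material = atlasMaterial;  // a StandardMaterial whose diffuseTexture is the atlas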
I compared SPS with my single mesh (using subMeshes) and it was not faster.
I can't place both in one texture because there are many different combinations of two textures. And maybe I did not make it clear: I need to use two textures on the same particle/subMesh/face!
Think of a street texture with a partly transparent arrow texture on top of it. I am sure a shader is needed, because other solutions (in Java) used one successfully.
OK, so AFAIK, BJS only has one diffuse texture channel. You cannot set a second diffuse texture on the same face. You can, however, set them on different faces by assigning each face a custom material.
A solution would be to place two meshes on top of each other, but I am not sure how viable that would be in your case. There are also the terrain material and the mix material (https://doc.babylonjs.com/extensions/terrain). Both of them are used to mix textures together by blending them.
You would simply paint the color channel (in a mix map) where you want each texture to show, and it would be drawn there. Also, AFAIK, this works mostly on planes, and it applies to all faces of that plane. I also think this might not be a solution for what you need to do.
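To illustrate the idea, a minimal sketch (it assumes the babylonjs-materials library is loaded, and the texture file names are placeholders):

const terrain = new BABYLON.TerrainMaterial("terrain", scene);
terrain.mixTexture = new BABYLON.Texture("mixMap.png", scene);       // the R/G/B channels decide which texture shows where
terrain.diffuseTexture1 = new BABYLON.Texture("street.png", scene);  // drawn where the mix map is red
terrain.diffuseTexture2 = new BABYLON.Texture("arrow.png", scene);   // drawn where the mix map is green
terrain.diffuseTexture3 = new BABYLON.Texture("grass.png", scene);   // drawn where the mix map is blue
mesh.material = terrain;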
Now, reading your post: I would simply make all the windows a different material and set the transparent texture there. Why is this not possible in your case? Also, why not mix everything into a texture atlas?
This is the idea of a texture atlas: add all the combinations there, and which image lands on which face is decided by the UVs. This is probably your best bet here.
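For example, with a hypothetical little helper just to show the idea:

// remap a face's 0..1 UVs into the atlas cell at (col, row) of a cols x rows grid
function toAtlasUV(u, v, col, row, cols, rows) {
  return [(col + u) / cols, (row + v) / rows];
}
// e.g. map this face onto the image stored in column 2, row 1 of a 4 x 4 atlas:
const [au, av] = toAtlasUV(0.25, 0.75, 2, 1, 4, 4);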
Hm, the mix material seems to be almost what I need. Only there is no way to set the diffuse textures according to extra UV values.
I will try to get the Java shader and add it here, to clarify what I mean.
Using two meshes for two textures is what I already did in ThreeJS. It worked fine, but it is not the best FPS solution.
The window IS a different, transparent material.
I assume I can't use an atlas because the combinations are undefined and very numerous, almost endless. Even if the two textures to mix are the same, the UV values may be different. And I forgot: the base textures are modified with RGB values too, which causes even more combinations. Could I do it by generating different textures for >10000 buildings?
I think somewhere along the line the actual issue @DerKarlos asked about has got lost. I myself got a little confused about adding transparent windows over an existing texture, as at first I thought the transparency was to be transferred to the first texture.
I now think the issue is that there are two textures, A and B. B has a main feature, and this feature is to be applied over A. Apart from the main feature, the rest of B is transparent. The question is how to apply A and then B to a mesh. Unfortunately, the only way I know is, as DerKarlos has already said, to use two meshes, one overlaying the other. The uv2 vertexData array does exist and would need to be used via a shader, but I do not know how.
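A sketch of that two-mesh overlay (untested; base and windowTexture are placeholder names, and the texture's coordinatesIndex property is what makes the overlay sample the uv2 set without a custom shader):

const overlay = base.clone("overlay");          // the clone shares the geometry, so uv and uv2 are both there

const windowMat = new BABYLON.StandardMaterial("windows", scene);
windowMat.diffuseTexture = windowTexture;
windowMat.diffuseTexture.hasAlpha = true;       // discard the transparent part of texture B
windowMat.useAlphaFromDiffuseTexture = true;
windowMat.diffuseTexture.coordinatesIndex = 1;  // sample this texture with the SECOND UV set (uv2)
windowMat.zOffset = -1;                         // small depth offset against z-fighting (value/sign may need tuning)
overlay.material = windowMat;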
That's different textures on different faces, using an atlas.
I need two textures on the same face.
That's faceUV, to select a part of the texture.
I have UVs, but only to place the second texture.
Thank you anyway
Thank you, JohnK, interesting; I see new features of Babylon every day.
I think decals may be usable in a way:
The first texture should be on the mesh as I do it now already.
The second has to be sized and positioned, window by window (no repeat as with tiled textures), roughly like this:
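(A rough, untested sketch; the position, normal and size values are made up and would come from the OSM window data.)

const windowMat = new BABYLON.StandardMaterial("windowDecal", scene);
windowMat.diffuseTexture = windowTexture;   // the transparent window texture
windowMat.diffuseTexture.hasAlpha = true;
windowMat.zOffset = -2;                     // draw the decal on top of the wall

const decal = BABYLON.MeshBuilder.CreateDecal("window", wallMesh, {
  position: new BABYLON.Vector3(2, 3, 0),   // a point on the wall
  normal:   new BABYLON.Vector3(0, 0, -1),  // facing outward from the wall
  size:     new BABYLON.Vector3(1, 1.5, 1), // width, height and projection depth of the decal
});
decal.material = windowMat;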
I am not sure how "happy" the GPU would be with that,
and will head for the shader or use two (sub)meshes.
These are the shader files, merging a texture with alpha onto a second texture.
How do I use them? I will try with the Babylon documentation.
Could someone make me a Babylon playground?
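Something along these lines is what I imagine, pieced together from the ShaderMaterial documentation (untested, and the sampler/texture names are made up): a second texture, sampled with uv2, blended over a base texture, sampled with uv, using the second texture's alpha.

BABYLON.Effect.ShadersStore["twoTexVertexShader"] = `
  precision highp float;
  attribute vec3 position;
  attribute vec2 uv;
  attribute vec2 uv2;
  uniform mat4 worldViewProjection;
  varying vec2 vUV;
  varying vec2 vUV2;
  void main() {
    vUV = uv;
    vUV2 = uv2;
    gl_Position = worldViewProjection * vec4(position, 1.0);
  }`;

BABYLON.Effect.ShadersStore["twoTexFragmentShader"] = `
  precision highp float;
  varying vec2 vUV;
  varying vec2 vUV2;
  uniform sampler2D baseSampler;
  uniform sampler2D overlaySampler;
  void main() {
    vec4 base = texture2D(baseSampler, vUV);
    vec4 over = texture2D(overlaySampler, vUV2);
    gl_FragColor = vec4(mix(base.rgb, over.rgb, over.a), base.a);
  }`;

const twoTexMat = new BABYLON.ShaderMaterial("twoTex", scene, { vertex: "twoTex", fragment: "twoTex" }, {
  attributes: ["position", "uv", "uv2"],
  uniforms: ["worldViewProjection"],
  samplers: ["baseSampler", "overlaySampler"],
});
twoTexMat.setTexture("baseSampler", facadeTexture);
twoTexMat.setTexture("overlaySampler", windowTexture);
buildingMesh.material = twoTexMat;  // the mesh must carry both uv and uv2 vertex data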
Yes, I understand. What I had in mind, if you used an SPS, was to stick a planar quad particle, textured with the arrow on a transparent background, onto the street (or wall, or whatever you want) solid particle. This is really an easy operation, simpler than coding a dedicated shader.
Solid particles allow you to mix the vertex colors (the particle color) with the texture directly.
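In code, roughly (a quick, untested sketch; streetMesh and atlasMaterial stand for your own mesh and material):

const sps = new BABYLON.SolidParticleSystem("streets", scene);
const quad = BABYLON.MeshBuilder.CreatePlane("quad", { size: 1 }, scene);
sps.addShape(streetMesh, 1);   // particle 0: the street (or wall) shape
sps.addShape(quad, 1);         // particle 1: the arrow overlay quad
const spsMesh = sps.buildMesh();
quad.dispose();
streetMesh.dispose();

sps.initParticles = () => {
  const street = sps.particles[0];
  const arrow = sps.particles[1];
  arrow.rotation.x = Math.PI / 2;                        // lay the quad flat on the street
  arrow.position = street.position.add(new BABYLON.Vector3(0, 0.01, 0)); // just above it
  arrow.uvs = new BABYLON.Vector4(0.5, 0.0, 1.0, 1.0);   // the arrow region of the shared texture
  street.color = new BABYLON.Color4(0.8, 0.8, 0.8, 1);   // tint the street via the vertex colors
};
sps.initParticles();
sps.setParticles();
spsMesh.material = atlasMaterial;  // one texture, with hasAlpha = true for the arrow part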
Well, I know that you are trying another approach that fits better with your needs and skills. I just mentioned this to show you that many different ways can be used with the tools provided by BJS. Good luck and have fun with your project!
Sorry, I am late. I had some other priorities.
That example looks so nice and compact. So I made a test to use it,
but did not have much success. My knowledge of shaders is miserable :-/
It looks like the shader code I got has an older interface:
// per vertex input
in vec3 VertexPosition;
instead of
// Attributes
attribute vec3 position;
etc. And there is so much interface code. I do have Java code, but it is just too much (search for "default"):
I can't expect someone to do all the work of making me an example with my shader code.
So I will use my first solution, duplicating the triangles.
Yes, my skills really seem too bad even to be sure about what you are offering with the SPS.
As long as you mean to use two different particles for the two textures, I do understand.
(In this case, subMeshes would be about the same, and also a single draw call.)
Otherwise I don't see what to do and how to do it without an example, sorry.