What would be the best way to create a Material that’s “mostly” the same as a SimpleMaterial or StandardMaterial, but supports a texture that’s sampled from a 2D texture array?
In other words, I’m trying to do something like this, but have the material support lighting and fog and the usual standard material features.
I understand that I could make a ShaderMaterial if I define the entire vertex/fragment shaders I need, but since the Simple/Standard material shaders are dynamically created, I guess that’s not something I can copy/paste.
Alternatively, CustomMaterial looks like it might be made for this use case, but I can’t find any docs for it; this post seems to be the only place where its functionality is described.
Is there a better approach I could use, or something already built into Babylon?
Thanks, it looks like material plugins could solve this.
But after a lot of hacking, I can’t seem to get the 2D array texture data bound into the shader. None of the docs or samples use textures, and there don’t appear to be any shader insertion points related to samplers, so it’s kind of a guessing game how to proceed.
Naively, it seems like this demo might work: I declared a ubo entry of type sampler2DArray and passed the texture to that uniform by calling uniformBuffer.setTexture inside bindForSubMesh.
As you can see in the demo, this compiles without errors, but the texture data never actually arrives; the sampler reads all zeros. Can anyone tell what’s wrong here?
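For reference, here is a minimal sketch of how I’d expect the sampler side of a material plugin to look, assuming Babylon’s MaterialPluginBase hooks (getSamplers, getCustomCode, bindForSubMesh). The sampler name "arrayTex", the varying "vLayerF", and the use of vDiffuseUV are my own guesses, not anything from the docs; the idea is that the sampler is declared via getSamplers and in the shader source itself, rather than inside the uniform buffer:

```javascript
// Sketch of the MaterialPluginBase overrides involved (shown as a plain
// object here so the shapes are visible; in Babylon this would be a class
// extending BABYLON.MaterialPluginBase). All names are illustrative.
const terrainArrayPluginSketch = {
  // Register the sampler with the effect. A sampler2DArray is an opaque
  // type and (my assumption) cannot be a member of a uniform buffer, which
  // may be why declaring it in the ubo compiles but never receives data.
  getSamplers(samplers) {
    samplers.push("arrayTex");
  },

  // Inject the sampler declaration and the lookup into the fragment shader
  // using the plugin injection points.
  getCustomCode(shaderType) {
    if (shaderType === "fragment") {
      return {
        CUSTOM_FRAGMENT_DEFINITIONS: `
          uniform highp sampler2DArray arrayTex;
        `,
        // vLayerF and vDiffuseUV are assumed to exist; vDiffuseUV is the
        // StandardMaterial diffuse UV varying when a diffuse texture is set.
        CUSTOM_FRAGMENT_MAIN_END: `
          gl_FragColor.rgb *= texture(arrayTex, vec3(vDiffuseUV, vLayerF)).rgb;
        `,
      };
    }
    return null;
  },

  // Bind the actual Texture2DArray (e.g. a RawTexture2DArray) each frame.
  bindForSubMesh(uniformBuffer, scene, engine, subMesh) {
    uniformBuffer.setTexture("arrayTex", this._texture);
  },
};
```

With the sampler registered this way, uniformBuffer.setTexture("arrayTex", ...) should route the texture to the effect rather than to a ubo slot, but I’m not certain this is the intended pattern.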
The follow-up question is then: how do I attach an array of texture indices to a mesh, so that different meshes can share this same material, and each mesh gets drawn with different layers out of the sampler? (My use case is a texture atlas for drawing lots of terrain with only a few materials/draw calls.)
I guess this is the marquee use case for Texture2DArray, but I couldn’t find any examples to work from. Just hacking around, I guess maybe I can:

1. attach indices to the mesh with mesh.setVerticesData
2. then declare that as an attribute on the material plugin
3. then copy that attribute to a varying float in the vertex shader
4. and finally use the float as a texture lookup in the fragment shader.
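The steps above could be sketched roughly like this, assuming the same plugin hooks as before. The attribute name "texIndex" and varying name "vTexIndexF" are made up for illustration, and the mesh/texture setup is assumed to happen elsewhere:

```javascript
// Sketch of the per-mesh texture-index plumbing, step by step.
// Step 1 happens on the mesh, outside the plugin:
//   mesh.setVerticesData("texIndex", layerIndices, false, 1);
const layerIndexPluginSketch = {
  // Step 2: declare the attribute so Babylon wires it into the effect.
  getAttributes(attributes) {
    attributes.push("texIndex");
  },

  // Steps 3 and 4: copy the attribute to a varying in the vertex shader,
  // then use it as the layer coordinate of the texture2DArray lookup.
  getCustomCode(shaderType) {
    if (shaderType === "vertex") {
      return {
        CUSTOM_VERTEX_DEFINITIONS: `
          attribute float texIndex;
          varying float vTexIndexF;
        `,
        CUSTOM_VERTEX_MAIN_END: `
          vTexIndexF = texIndex;
        `,
      };
    }
    return {
      CUSTOM_FRAGMENT_DEFINITIONS: `
        varying float vTexIndexF;
        uniform highp sampler2DArray arrayTex;
      `,
      // vDiffuseUV is assumed to be available, as in the earlier sketch.
      CUSTOM_FRAGMENT_MAIN_END: `
        gl_FragColor.rgb = texture(arrayTex, vec3(vDiffuseUV, vTexIndexF)).rgb;
      `,
    };
  },
};
```

Since the layer index is constant across each mesh, a per-instance attribute (or even a plain float uniform) might be a cheaper alternative to a full per-vertex buffer, but I haven’t tried that.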
Is this the right approach? I hacked it out in a playground, and it seems to be more or less working, but I don’t know if this is the “right way” to do it.
That means you were in WebGL1 mode, or at least your current WebGL mode did not support uniform buffers. In that case, the code for the “vertex” and “fragment” keys is injected into the shader code in place of the uniform buffer declaration.
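Concretely, a plugin’s getUniforms can return both forms, and Babylon picks one depending on uniform-buffer support. A minimal sketch (the uniform name "myScale" and the "MYPLUGIN" define are placeholders):

```javascript
// Sketch of a plugin's getUniforms() with a WebGL1 fallback: the "ubo"
// entries are used when uniform buffers are supported; otherwise the
// "vertex"/"fragment" strings are injected in place of the ubo declaration.
function getUniforms() {
  return {
    // WebGL2 path: declared inside the material's uniform buffer.
    ubo: [{ name: "myScale", size: 1, type: "float" }],
    // WebGL1 path: plain uniform declarations, injected verbatim.
    vertex: `
      #ifdef MYPLUGIN
        uniform float myScale;
      #endif
    `,
    fragment: `
      #ifdef MYPLUGIN
        uniform float myScale;
      #endif
    `,
  };
}
```

This is also why a sampler bound through the ubo path can behave differently across WebGL versions: samplers are never part of the uniform buffer itself and have to be declared separately either way.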