How to pass an array of vectors to CustomProceduralTexture fragment shader


I’m using CustomMaterial, where I’m able to pass an array of vectors as uniforms (e.g. setFloatArray3), or even manage UBOs (bindUniformBlock/Buffer), in the material’s effect (this.onBindObservable.add(() => this.updateUniforms(this.getEffect()));), use it in its shader parts (this.Fragment_Before_FragColor, etc…), and finally access it in the vertex/fragment shader.
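For context, setFloatArray3 takes a flat array of floats (three per vector), so an array of vectors has to be flattened before binding. A minimal sketch, independent of Babylon.js (the uniform name myVectors in the commented bind callback is a made-up example):

```javascript
// Flatten an array of [x, y, z] vectors into the flat Float32Array
// layout that Effect.setFloatArray3 expects (3 floats per vector).
function flattenVec3Array(vectors) {
  const flat = new Float32Array(vectors.length * 3);
  for (let i = 0; i < vectors.length; i++) {
    flat[i * 3 + 0] = vectors[i][0];
    flat[i * 3 + 1] = vectors[i][1];
    flat[i * 3 + 2] = vectors[i][2];
  }
  return flat;
}

// Then, inside the material's bind callback (hypothetical uniform name):
// this.onBindObservable.add(() => {
//   this.getEffect().setFloatArray3("myVectors", flattenVec3Array(data));
// });
```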

This material also uses a CustomProceduralTexture as its diffuseTexture, but there I can only pass an array of floats using this.setFloats, and that array can only be 1024 floats long, which is really not enough for my use case.

So my question is: does CustomProceduralTexture have a method similar to the ones on CustomMaterial’s effect, or some other mechanism by which I can pass large arrays of data to its fragment shader?

Thank you in advance!

Welcome aboard!

I’m afraid there’s no UBO support in the CustomProceduralTexture class and I’m not sure it would be an easy task to add it…

Let’s see what the experts (@sebavan?) have to tell about this.

I would recommend storing your data in a texture, as the maximum length for a uniform array in WebGL can be rather small: webgl - What is the max size for an array to be uploaded using gl.uniform1fv? - Stack Overflow.
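To illustrate the data-texture approach, here is a sketch (plain JavaScript, independent of Babylon.js) of packing an arbitrary float array into the RGBA texel layout a float data texture would take, with the tail zero-padded:

```javascript
// Pack N floats into RGBA texels (4 floats per texel) for a data
// texture of the given width; the height is derived from the data
// length, and the unused tail of the buffer stays zero.
function packFloatsToRGBA(data, width) {
  const texels = Math.ceil(data.length / 4);          // one RGBA texel holds 4 floats
  const height = Math.ceil(texels / width);           // rows needed at this width
  const packed = new Float32Array(width * height * 4); // zero-initialized
  packed.set(data);
  return { packed, width, height };
}
```

The resulting buffer could then be handed to a Babylon RawTexture with a float type; the exact factory method and its sampling/type flags should be checked against the Babylon.js documentation.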


Wow, thank you for such a fast response.

I have one additional question though. Suppose I use a texture to pass data into the shader, let’s say Babylon’s RawTexture, and I structure it as an array, meaning it has a width equal to the array length and a height of 1 (I then use texelFetch to access the data directly, so I don’t need mipmapping).

Now, will Babylon always end up padding the texture to a square, so that I waste a lot of memory (say, a 2048x1 texture padded to 2048x2048)? Or does Babylon automatically use something like Rectangle Texture - OpenGL Wiki under the hood if it detects that the dimensions don’t match? If not, is there some way to turn that on?

Thanks again for your time!

No, Babylon won’t make your texture POT (Power Of Two) in WebGL2. However, it will in WebGL1, because WebGL1 does not support non-POT textures.

You normally don’t want to use W x 1 textures, as there is a limit on the width/height of a texture that depends on the GPU. You should instead use something like 256x256 (or smaller, depending on your needs; you can also reuse the same texture for multiple sets of data) and use modulo arithmetic in your shader to read from the texture (or hardcode the lookup, knowing the width of the texture in advance).
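The modulo addressing mentioned above can be sketched as follows; the same arithmetic is what you would write in GLSL around texelFetch:

```javascript
// Map a linear element index to (x, y) texel coordinates in a
// fixed-width data texture. The GLSL equivalent would be roughly:
//   ivec2 uv = ivec2(i % WIDTH, i / WIDTH);
//   vec4 v = texelFetch(dataSampler, uv, 0);
function indexToTexel(i, width) {
  return { x: i % width, y: Math.floor(i / width) };
}
```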