For one of our current projects, we need a single-channel texture (like a grayscale heightmap) with a relatively high bit depth, so we have more than 255 different values. We use these textures in a custom shader to do some fancy things with them.
I've seen that when creating a RenderTargetTexture we can set the format to BABYLON.Constants.TEXTUREFORMAT_RED_INTEGER, which is exactly what we want. However, in our use case we have to create the textures on the server (or sometimes a user might even upload a handcrafted texture).
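For reference, this is roughly how I imagined setting up such a render target (a minimal sketch; the options-object overload, the TEXTURETYPE_UNSIGNED_SHORT choice for 16-bit data, and the nearest-sampling requirement for integer textures are my assumptions from reading the docs):

```ts
import { Constants, RenderTargetTexture, Scene } from "@babylonjs/core";

// Sketch: a 16-bit single-channel unsigned-integer render target.
// Integer textures can't be linearly filtered, so nearest sampling is used.
function createR16UIntTarget(scene: Scene): RenderTargetTexture {
    return new RenderTargetTexture("heightRT", 512, scene, {
        generateMipMaps: false,
        type: Constants.TEXTURETYPE_UNSIGNED_SHORT,   // 16 bits per texel
        format: Constants.TEXTUREFORMAT_RED_INTEGER,  // single integer R channel
        samplingMode: Constants.TEXTURE_NEAREST_SAMPLINGMODE,
    });
}
```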
With GIMP it is possible to export .png files in R16 UInt sRGB format. However, when trying to use them, Babylon converts them to RGBA8 unsigned byte. When hinting the format as BABYLON.Constants.TEXTUREFORMAT_RED_INTEGER in the Texture constructor, the texture becomes black and a warning is logged:
INVALID_VALUE: texImage2D: invalid internalformat
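Roughly what the failing loading code looks like (a sketch; the file path is hypothetical, the parameter positions are from the Texture constructor docs):

```ts
import { Constants, Texture, Scene } from "@babylonjs/core";

// Sketch: load the 16-bit PNG and hint the format via the
// last constructor parameter (format).
function loadHeightmap(scene: Scene): Texture {
    return new Texture(
        "textures/heightmap_r16.png", // hypothetical path to the GIMP export
        scene,
        true,                         // noMipmap
        false,                        // invertY
        Constants.TEXTURE_NEAREST_SAMPLINGMODE,
        null,                         // onLoad
        null,                         // onError
        null,                         // buffer
        false,                        // deleteBuffer
        Constants.TEXTUREFORMAT_RED_INTEGER // format hint -> black texture + warning
    );
}
```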
Looking a bit deeper into the Babylon code, I found the following section inside ThinEngine._createTextureBase:
gl.texImage2D(gl.TEXTURE_2D, 0, internalFormat, texelFormat, gl.UNSIGNED_BYTE, img as any);
I'm not familiar at all with WebGL development, but from searching online I've read that the internalFormat must be compatible with the format and type parameters. Since the type is hardcoded to gl.UNSIGNED_BYTE in that call, I guess the format hinting cannot work for 16-bit data?
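If I read the WebGL2 spec tables correctly, uploading R16UI data would instead need a combination like this (plain WebGL sketch, not Babylon code):

```ts
// Sketch: the texImage2D combination that, per the WebGL2 spec tables,
// is valid for 16-bit unsigned-integer single-channel data.
function uploadR16UI(
    gl: WebGL2RenderingContext,
    width: number,
    height: number,
    pixels: Uint16Array // one uint16 per texel
): void {
    gl.texImage2D(
        gl.TEXTURE_2D,
        0,                 // mip level
        gl.R16UI,          // sized internal format
        width,
        height,
        0,                 // border, must be 0
        gl.RED_INTEGER,    // format
        gl.UNSIGNED_SHORT, // type -- not the hardcoded gl.UNSIGNED_BYTE
        pixels
    );
}
```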
Am I correct with this assumption? And would it even be possible to support R16 UInt .png files?
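As a workaround I was thinking about decoding the PNG myself and handing the raw pixels to Babylon via a RawTexture, something like the sketch below (decodeR16Png is a hypothetical decoder; I assume one would need a library such as UPNG.js for the actual 16-bit PNG decoding):

```ts
import { Constants, RawTexture, Scene } from "@babylonjs/core";

// Hypothetical decoder: returns one uint16 per texel from a 16-bit gray PNG.
declare function decodeR16Png(bytes: ArrayBuffer): {
    width: number;
    height: number;
    data: Uint16Array;
};

async function loadR16Heightmap(url: string, scene: Scene): Promise<RawTexture> {
    const bytes = await (await fetch(url)).arrayBuffer();
    const { width, height, data } = decodeR16Png(bytes);
    return new RawTexture(
        data,
        width,
        height,
        Constants.TEXTUREFORMAT_RED_INTEGER,
        scene,
        false, // generateMipMaps
        false, // invertY
        Constants.TEXTURE_NEAREST_SAMPLINGMODE,
        Constants.TEXTURETYPE_UNSIGNED_SHORT
    );
}
```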
Playground with a test .png file:
Slightly unrelated topic:
When creating a RenderTargetTexture with type=Constants.TEXTURETYPE_UNSIGNED_INT, the inspector shows the type as "unsigned byte". I'm not sure if this is just a display error or if the texture was actually created as unsigned byte.
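A small sketch of what I mean, plus how I tried to check the underlying type (I also noticed in the sources that Constants.TEXTURETYPE_UNSIGNED_INT seems to have the value 0, the same as TEXTURETYPE_UNSIGNED_BYTE, so maybe that already explains it?):

```ts
import { Constants, RenderTargetTexture, Scene } from "@babylonjs/core";

function checkRttType(scene: Scene): void {
    const rtt = new RenderTargetTexture("rt", 256, scene, {
        type: Constants.TEXTURETYPE_UNSIGNED_INT,
    });
    // This constant appears to be 0 in the sources, the same value
    // as Constants.TEXTURETYPE_UNSIGNED_BYTE.
    console.log(Constants.TEXTURETYPE_UNSIGNED_INT);
    // The type actually stored on the underlying internal texture:
    console.log(rtt.getInternalTexture()?.type);
}
```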