Using .png images in "R16 UInt sRGB" format

For one of our current projects, we need a single-channel texture (like a grayscale heightmap) with a relatively high bit depth, so we have more than the 256 different values an 8-bit channel can represent. We process these textures in a custom shader to do some fancy things with them :slight_smile:

I've seen that when creating a RenderTargetTexture we can set the format to BABYLON.Constants.TEXTUREFORMAT_RED_INTEGER, which is exactly what we want. However, in our use case we have to create the textures on the server (or sometimes a user might even upload a handcrafted texture).
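For context, creating such a render target would look roughly like this; the positional parameters follow the RenderTargetTexture constructor as we understand it (type at position 6, format at position 12), so treat it as a sketch rather than our exact code:

const rtt = new BABYLON.RenderTargetTexture(
    "heightRTT",                                    // name (made up for this example)
    { width: 1024, height: 1024 },                  // size
    scene,
    false,                                          // generateMipMaps
    true,                                           // doNotChangeAspectRatio
    BABYLON.Constants.TEXTURETYPE_UNSIGNED_SHORT,   // type: 16 bits per channel
    false,                                          // isCube
    BABYLON.Texture.NEAREST_SAMPLINGMODE,           // integer textures cannot be filtered
    true,                                           // generateDepthBuffer
    false,                                          // generateStencilBuffer
    false,                                          // isMulti
    BABYLON.Constants.TEXTUREFORMAT_RED_INTEGER     // format: single red integer channel
);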

With GIMP it is possible to export .png files in R16 UInt sRGB format. However, when trying to use them, Babylon converts them to RGBA8 unsigned byte format. When trying to hint the format as BABYLON.Constants.TEXTUREFORMAT_RED_INTEGER in the constructor, the texture becomes black and a warning is logged:

INVALID_VALUE: texImage2D: invalid internalformat
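For completeness, the call that produces this warning looks roughly like the following; the file name is made up, and the format hint is passed as the tenth positional parameter of the Texture constructor (please double-check against your Babylon version):

const heightTex = new BABYLON.Texture(
    "textures/height_r16.png",                      // hypothetical file exported from GIMP
    scene,
    false,                                          // noMipmap
    false,                                          // invertY
    BABYLON.Texture.NEAREST_SAMPLINGMODE,
    null,                                           // onLoad
    null,                                           // onError
    null,                                           // buffer
    false,                                          // deleteBuffer
    BABYLON.Constants.TEXTUREFORMAT_RED_INTEGER     // format hint -> texture turns black, warning above is logged
);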

Looking a bit deeper into the Babylon code, I found the following line inside ThinEngine._createTextureBase:
gl.texImage2D(gl.TEXTURE_2D, 0, internalFormat, texelFormat, gl.UNSIGNED_BYTE, img as any);
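To spell out which argument is which (per the WebGL texImage2D overload that takes a DOM source):

// gl.texImage2D(target, level, internalformat, format, type, source)
// so in the line above:
//   internalFormat   -> internalformat (would need to be gl.R16UI for a 16-bit unsigned red texture)
//   texelFormat      -> format         (gl.RED_INTEGER in our case)
//   gl.UNSIGNED_BYTE -> type, hardcoded (a 16-bit upload would need gl.UNSIGNED_SHORT)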

I'm not familiar at all with WebGL development, but from searching a bit online I've read that the internalFormat must always form a valid combination with the format and type parameters (in WebGL 1 it even has to be the same value as format). Since the type is hardcoded to gl.UNSIGNED_BYTE here, I guess the format hinting is not working?

Am I correct with this assumption? And would it even be possible to support R16_UINT .png files?

Playground with a test .png file:

Slightly unrelated topic:
When creating a RenderTargetTexture with type = Constants.TEXTURETYPE_UNSIGNED_INT, the inspector shows that the type is 'unsigned byte'. I'm not sure if this is just a display error or if the texture was actually created as unsigned byte.
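A quick sanity check I could run on our side (assuming the internal texture exposes its type, which it seems to do):

const check = new BABYLON.RenderTargetTexture("check", 256, scene, false, true, BABYLON.Constants.TEXTURETYPE_UNSIGNED_INT);
console.log(check.getInternalTexture()?.type);   // compare against the Constants.TEXTURETYPE_* values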

Good question for @sebavan or @Evgeni_Popov, but please be patient as they are on vacation for now.

I can't seem to find a direct reference about this, but I don't think browsers support loading 16-bit textures from PNGs.

image - Can I use 16 bit per channel for WebGL textures - Stack Overflow
Looking to access 16-bit image data in Javascript/WebGL - Stack Overflow

Thanks for the links @bghgary. If that's true, I think it will be impossible for now :frowning:

In addition to that, I've found out that there are also some limitations within WebGL 2.0. According to this table https://www.khronos.org/registry/webgl/specs/latest/2.0/#TEXTURE_TYPES_FORMATS_FROM_DOM_ELEMENTS_TABLE it seems that it is only possible to create R16F textures (a single red channel with a 16-bit float value), not R16UI (a single red channel with a 16-bit unsigned integer value). Not sure if this would be a deal breaker, but for now I don't know if standard image formats even support this.
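In other words, going through a DOM element or ImageBitmap, the closest combination the table allows would be something like this (a raw WebGL sketch, not Babylon code):

// allowed by the DOM-upload table: 16-bit float red channel
gl.texImage2D(gl.TEXTURE_2D, 0, gl.R16F, gl.RED, gl.FLOAT, imageBitmap);
// not listed in the table, so it is rejected: 16-bit unsigned integer red channel
gl.texImage2D(gl.TEXTURE_2D, 0, gl.R16UI, gl.RED_INTEGER, gl.UNSIGNED_SHORT, imageBitmap);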

Furthermore, the Khronos specification states:

When the data source is a DOM element (HTMLImageElement, HTMLCanvasElement, or HTMLVideoElement), or is an ImageBitmap, ImageData, or OffscreenCanvas object, commonly each channel's representation is an unsigned integer type of at least 8 bits. Converting such representation to signed integers or unsigned integers with more bits is not clearly defined. For example, when converting RGBA8 to RGBA16UI, it is unclear whether or not the intention is to scale up values to the full range of a 16-bit unsigned integer. Therefore, only converting to unsigned integer of at most 8 bits, half float, or float is allowed.

Which seems to me to be exactly what you stated before: getting the image data through a DOM element or ImageBitmap will most likely result in 8 bits per channel...

Assuming you mean normalized values, there is EXT_texture_norm16, but it's not widely supported and not supported in Babylon.js at the moment.
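For reference, checking for the extension in raw WebGL would look roughly like this (Babylon.js itself does not expose it, as noted):

const gl = canvas.getContext("webgl2");
const norm16 = gl && gl.getExtension("EXT_texture_norm16");
if (norm16) {
    // the extension adds normalized 16-bit internal formats such as norm16.R16_EXT,
    // usable with format gl.RED and type gl.UNSIGNED_SHORT when uploading from a Uint16Array
}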
