I came across this question from last year asking about 16-bit PNG support. One of the sources in the answer says that WebGL doesn’t support loading 16-bit textures. Is it possible to load a texture from a PNG with 16-bit channels now with the advent of WebGPU?
I don’t know what browsers do with a 16-bit PNG…
Do you have an example of a 16-bit PNG file that we could use to test?
Here is one I made in GIMP. I’m not sure if it will be compressed to 8-bit when I upload it here though.
In case it is compressed, could you add it to a zip and upload that instead?
I think it’s ok:
However, I don’t see an RGBA16_UNORM format in the list of texture formats supported by WebGPU (nor by WebGL2, for that matter), which means we could not use 16-bit color textures in our existing shaders (which expect to read floating-point values)…
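For illustration, the missing unorm normalization can always be done manually on the CPU. This is just a sketch under the assumption that `samples` is a `Uint16Array` of decoded 16-bit channel values (obtained from some JS-side PNG decoder, which is not shown here); the output could then be uploaded as a float texture and read as floats in existing shaders:

```javascript
// Manually perform the [0, 65535] -> [0.0, 1.0] mapping that an
// RGBA16_UNORM texture format would do in hardware on sampling.
// `samples` is assumed to hold raw 16-bit channel values.
function normalizeUnorm16(samples) {
  const out = new Float32Array(samples.length);
  for (let i = 0; i < samples.length; i++) {
    out[i] = samples[i] / 65535; // full precision is preserved in float32
  }
  return out;
}
```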
I asked the question on the WebGPU Matrix channel, for confirmation.
I got the answer: these formats are not supported in WebGPU, probably because Metal does not support them.
So you will have to use rgba16float instead, but creating the texture directly from a .png won’t work; you will have to read the data another way. I think we plan to support (at least partially) the .exr format, to be able to read 16/32-bit float data and create textures from it.
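"Reading the data another way" could look like this sketch: assuming you have already decoded the raw 16-bit samples yourself (with a JS PNG decoder, not shown), you can pack them as half floats suitable for an rgba16float texture upload. The conversion routine below is a standard float32-to-binary16 bit manipulation, not anything WebGPU-specific:

```javascript
// Convert a float32 value to its IEEE 754 binary16 bit pattern
// (round-toward-zero on the mantissa, which is fine for this use).
function float32ToFloat16(val) {
  const f32 = new Float32Array(1);
  const u32 = new Uint32Array(f32.buffer);
  f32[0] = val;
  const x = u32[0];
  const sign = (x >>> 16) & 0x8000;
  let exp = (x >>> 23) & 0xff;
  let mant = x & 0x7fffff;
  if (exp === 0xff) return sign | 0x7c00 | (mant ? 0x200 : 0); // Inf / NaN
  exp = exp - 127 + 15; // rebias exponent
  if (exp >= 0x1f) return sign | 0x7c00;  // overflow -> Inf
  if (exp <= 0) {                          // subnormal or zero
    if (exp < -10) return sign;
    mant = (mant | 0x800000) >> (1 - exp);
    return sign | (mant >> 13);
  }
  return sign | (exp << 10) | (mant >> 13);
}

// Normalize 16-bit unorm samples and pack them as half floats;
// the resulting Uint16Array can be fed to GPUQueue.writeTexture
// for a texture created with format "rgba16float".
function unorm16ToFloat16Pixels(samples) {
  const out = new Uint16Array(samples.length);
  for (let i = 0; i < samples.length; i++) {
    out[i] = float32ToFloat16(samples[i] / 65535);
  }
  return out;
}
```

Note that going through half floats loses a few bits of precision compared to the original 16-bit integer samples; if that matters, rgba32float is the lossless (but heavier) option.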
Thanks so much for looking into this. Do you have a timeline for .exr support?
Also, what is the WebGPU matrix channel? It sounds like a useful resource.
I stand corrected, it’s Vulkan that does not support them!
No, no timeline for .exr support; it is not a high priority. But contributions are always welcome!
WebGPU matrix channel: https://app.element.io/#/room/#WebGPU:matrix.org
Dawn matrix channel: https://app.element.io/#/room/#webgpu-dawn:matrix.org
Those are valuable resources for WebGPU and Dawn (the implementation of WebGPU in Chrome) indeed!
You could try the .hdr format instead, but in this case it would be full float, not half.
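For reference, .hdr (Radiance) stores each pixel as three 8-bit mantissas sharing one 8-bit exponent, so decoding to full float is simple. A minimal sketch of the per-pixel decode, assuming the classic `[r, g, b, e]` byte layout (file parsing and RLE decompression not shown):

```javascript
// Decode one RGBE pixel (Radiance .hdr) to float32 RGB.
// e === 0 encodes black; otherwise each channel byte is scaled by
// 2^(e - 136), i.e. e - (128 exponent bias + 8 mantissa bits).
function rgbeToFloat(r, g, b, e) {
  if (e === 0) return [0, 0, 0];
  const scale = Math.pow(2, e - 136);
  return [r * scale, g * scale, b * scale];
}
```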