Disclaimer: This might be a bug, but maybe it’s just me.
I want to use a texture containing normalized integers (uint16). Everything works as expected when I use non-normalized values, but those only support nearest filtering, not the linear interpolation I need.
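To make the constraint concrete, here is roughly what the working non-normalized path looks like in plain WebGL2 (an illustrative sketch with made-up names, not the actual Playground code):

```ts
// Illustrative sketch of the non-normalized path that works today.
function createUint16IntegerTexture(
    gl: WebGL2RenderingContext,
    data: Uint16Array,
    width: number,
    height: number
): WebGLTexture {
    const tex = gl.createTexture()!;
    gl.bindTexture(gl.TEXTURE_2D, tex);
    gl.pixelStorei(gl.UNPACK_ALIGNMENT, 2); // uint16 rows are 2-byte aligned
    // R16UI is a non-normalized unsigned-integer format: the shader reads raw
    // integers through a usampler2D, and only NEAREST filtering is valid.
    gl.texImage2D(gl.TEXTURE_2D, 0, gl.R16UI, width, height, 0, gl.RED_INTEGER, gl.UNSIGNED_SHORT, data);
    gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MIN_FILTER, gl.NEAREST); // LINEAR would be rejected here
    gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MAG_FILTER, gl.NEAREST);
    return tex;
}
```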
I got it to work with moderngl in Python, but maybe WebGL2 has its limitations here.
Find a PG example here.
The example works because it uses non-normalized integer textures and nearest interpolation.
To switch to normalized integers (and, eventually, linear interpolation), you have to replace the places marked with !!! comments (sometimes just a value, sometimes the entire line of code; you get the idea).
Maybe I’m doing something wrong, but it could also be that Babylon.js internally selects the wrong texture format.
I believe this is only possible via the EXT_texture_norm16 extension, which is supported by only a limited number of browsers. Babylon.js doesn’t support this extension at all at the moment.
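For reference, in plain WebGL2 the normalized path would have to go through that extension, roughly like this (a sketch assuming the extension is present; this is not something Babylon.js exposes today, and the function name is made up):

```ts
// Sketch of the normalized path, assuming EXT_texture_norm16 is available.
function createUint16NormalizedTexture(
    gl: WebGL2RenderingContext,
    data: Uint16Array,
    width: number,
    height: number
): WebGLTexture | null {
    const ext = gl.getExtension("EXT_texture_norm16");
    if (!ext) {
        return null; // extension not supported in this browser
    }
    const tex = gl.createTexture()!;
    gl.bindTexture(gl.TEXTURE_2D, tex);
    gl.pixelStorei(gl.UNPACK_ALIGNMENT, 2); // uint16 rows are 2-byte aligned
    // R16_EXT is a normalized format: the shader reads it through a regular
    // sampler2D as floats in [0, 1], and LINEAR filtering is allowed.
    gl.texImage2D(gl.TEXTURE_2D, 0, ext.R16_EXT, width, height, 0, gl.RED, gl.UNSIGNED_SHORT, data);
    gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MIN_FILTER, gl.LINEAR);
    gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MAG_FILTER, gl.LINEAR);
    return tex;
}
```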
What the code currently does with your !!! setup is select RGBA16UI as the internal format, which is not normalized.
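For anyone reading along, a setup along these lines is the kind of thing that ends up as RGBA16UI (an illustrative sketch, not the exact Playground code; the helper name and constant choices are assumptions):

```ts
import { Constants, RawTexture, Scene, Texture } from "@babylonjs/core";

// Illustrative sketch: a raw uint16 texture created with an integer pixel format
// gets an unsigned-integer internal format (e.g. RGBA16UI), which is not
// normalized, so only NEAREST sampling (and a usampler2D in the shader) works.
function createRawUint16Texture(scene: Scene, data: Uint16Array, width: number, height: number): RawTexture {
    return new RawTexture(
        data,
        width,
        height,
        Constants.TEXTUREFORMAT_RGBA_INTEGER, // integer pixel format
        scene,
        false,                                // generateMipMaps
        false,                                // invertY
        Texture.NEAREST_SAMPLINGMODE,         // LINEAR is not valid for integer textures
        Constants.TEXTURETYPE_UNSIGNED_SHORT  // uint16 components
    );
}
```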
I can only encourage everyone (in particular people outside of the core team) to contribute. The core team is very supportive and gives great guidance! Thanks a lot, guys! That was fun!