Disclaimer: This might be a bug, but maybe it’s just me.
I want to use a texture containing normalized integers (uint16). Everything works as expected when I use non-normalized integer values, but those only support nearest filtering, not the linear interpolation I need.
I got it to work with moderngl in Python, but maybe WebGL2 has its limitations here.
Find a PG example here.
The example works because it uses non-normalized integer textures and nearest interpolation.
To switch to normalized integers (and eventually linear interpolation), replace the parts marked with !!! comments (sometimes just a value, sometimes the entire line of code; you get the idea).
Maybe I’m doing something wrong, but it could also be that Babylon.js selects the wrong internal texture format.
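For reference, here is my (possibly wrong) understanding of the underlying WebGL2 side: non-normalized integer formats like R16UI are core WebGL2 but only allow nearest filtering, while 16-bit normalized formats like R16 are not in core WebGL2 at all and require the EXT_texture_norm16 extension (desktop OpenGL, which moderngl sits on top of, has GL_R16 natively, which would explain why it works in Python). A minimal plain-WebGL2 sketch of both paths, not Babylon.js code, just to illustrate what I mean:

```js
// Plain WebGL2 sketch (not Babylon.js) — assumes a browser environment.
const gl = document.createElement("canvas").getContext("webgl2");
const data = new Uint16Array([0, 16384, 32768, 65535]); // 2x2, single channel

// Non-normalized integers: R16UI is core WebGL2, but only NEAREST filtering is legal.
const uintTex = gl.createTexture();
gl.bindTexture(gl.TEXTURE_2D, uintTex);
gl.texImage2D(gl.TEXTURE_2D, 0, gl.R16UI, 2, 2, 0, gl.RED_INTEGER, gl.UNSIGNED_SHORT, data);
gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MIN_FILTER, gl.NEAREST);
gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MAG_FILTER, gl.NEAREST);

// Normalized integers: 16-bit unorm (R16) is not core WebGL2, only available via extension.
const ext = gl.getExtension("EXT_texture_norm16");
if (ext) {
  const normTex = gl.createTexture();
  gl.bindTexture(gl.TEXTURE_2D, normTex);
  gl.texImage2D(gl.TEXTURE_2D, 0, ext.R16_EXT, 2, 2, 0, gl.RED, gl.UNSIGNED_SHORT, data);
  // Normalized formats are filterable, so LINEAR is allowed here.
  gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MIN_FILTER, gl.LINEAR);
  gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MAG_FILTER, gl.LINEAR);
}
```

If that understanding is correct, the question is probably whether Babylon.js can pick such a normalized internal format (when the extension is available) instead of the integer one.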
Thanks a lot for looking into this issue.