Shaders: using textures to pass calculated values, and de-normalizing them

Hi. I’m using textures to try to pass some data to a shader. In short, every pixel of “hexPointsImage” should have a value assigned in its red and green channels.

This is the code for my fragment shader:

#ifdef GL_ES
    precision mediump float;
#endif

varying vec2 vUV;
varying vec4 vPosition;

uniform float height;
uniform float width;
uniform sampler2D textureSampler;
uniform sampler2D hexPointsImage;

void main(void) {
    // Read the per-pixel data and de-normalize: an 8-bit channel stored as
    // byte b is sampled as b / 255.0, so multiply by 255 and round to get
    // the original integer back (plain truncation can land one unit low).
    vec4 hexagonData = texture2D(hexPointsImage, vUV);
    int gridX = int(hexagonData.r * 255.0 + 0.5);
    int gridY = int(hexagonData.g * 255.0 + 0.5);

    if (gridX == 0 && gridY == 0) {
        // Brighten the pixels whose stored grid coordinates are (0, 0).
        gl_FragColor = texture2D(textureSampler, vUV) + vec4(0.2, 0.2, 0.2, 1.0);
    } else {
        gl_FragColor = texture2D(textureSampler, vUV);
    }
}

This doesn’t seem to work, so there’s probably something wrong in the code.

Note: I’m multiplying by 255 because I assume the image (which is a 2D texture) stores its values normalized to the 0–1 range, so a stored byte b comes back as b / 255.0.

textureSampler is the image that contains the base texture.

hexPointsImage is the image I use to store some specific values, which I then use in the if condition.

Both images have the same size.

You should disable filtering when creating the texture bound to hexPointsImage, or else you won’t read the exact value you put there but an interpolated value.

Or you can use texelFetch instead of texture2D.

Note that texelFetch takes integer coordinates between 0 and the texture width/height minus 1, not normalized coordinates between 0 and 1.
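For example (a minimal sketch of what I mean, reusing the uniforms from the shader above; it requires a GLSL ES 3.00 / WebGL2 shader):

// texelFetch reads one exact texel; no filtering or interpolation is involved.
ivec2 texel = ivec2(vUV * vec2(width, height)); // integer coords, 0 .. size-1
// (clamp to size-1 if vUV can reach exactly 1.0)
vec4 hexagonData = texelFetch(hexPointsImage, texel, 0); // mip level 0
int gridX = int(hexagonData.r * 255.0 + 0.5);
int gridY = int(hexagonData.g * 255.0 + 0.5);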

I guess my question can be summarized as: is this the right way to extract a value from a texture that has the same size as the main texture?

vec4 hexagonData = texture2D(hexPointsImage, vUV);
int gridX = int(hexagonData.r * 255.0);
int gridY = int(hexagonData.g * 255.0);

The values in this case are encoded in the red and green channels of the image. I’m assuming the data varies from 0 to 255 because I generate the texture with an HTML canvas, filling it with fillStyle = “rgb(x, y, 0)”, but maybe that is not true?
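Roughly like this (an illustrative sketch of the generation step; the names and the per-pixel loop are assumptions, not my exact code):

// Encode one (gridX, gridY) pair per pixel into the red/green channels.
const canvas = document.createElement("canvas");
canvas.width = width;   // same size as the main texture
canvas.height = height;
const ctx = canvas.getContext("2d");
for (let y = 0; y < height; y++) {
    for (let x = 0; x < width; x++) {
        const gridX = 0; // placeholder: the computed value, an integer 0..255
        const gridY = 0;
        ctx.fillStyle = "rgb(" + gridX + ", " + gridY + ", 0)";
        ctx.fillRect(x, y, 1, 1);
    }
}

Each channel ends up as an 8-bit integer, so in the shader the sampled value should be that integer divided by 255.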

(hexPointsImage has the same size as the textureSampler image, which is the one I use to set gl_FragColor.)

Thanks. I’ve downloaded the texture (I’m generating it with a canvas) and I don’t see that it’s blurred (zooming into the pixels with GIMP, for example). But maybe you are talking about something else? (The Babylon texture object?)

I think texelFetch is not supported on my machine.

Yes, I’m talking about the Babylon.js texture object: when you create it, there’s a sampling-mode parameter that you should set to nearest (Constants.TEXTURE_NEAREST_NEAREST) to disable linear interpolation and mipmapping.

Otherwise, when doing texture2D(hexPointsImage, vUV), the GPU will interpolate between the neighbours of vUV and you won’t get the value you expect.
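Something like this (a minimal sketch, assuming the data was drawn into a canvas named hexCanvas and that a ShaderMaterial is used; those names are illustrative):

// Create the texture with nearest sampling and no mipmaps so that
// texture2D returns the exact texel values that were stored.
const hexPointsTexture = new BABYLON.Texture(
    hexCanvas.toDataURL(), // source; a plain URL works too
    scene,
    true,                  // noMipmap
    false,                 // invertY
    BABYLON.Constants.TEXTURE_NEAREST_NEAREST // samplingMode
);
shaderMaterial.setTexture("hexPointsImage", hexPointsTexture);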

texelFetch is supported in WebGL2 but not in WebGL1.
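You can check which one you’re running on (a quick sketch; engine is your Babylon engine instance):

// Babylon reports the context version it managed to create (1 or 2).
if (engine.webGLVersion < 2) {
    console.log("WebGL1 only: texelFetch is not available in the shader");
}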
