How to use vertex attributes with type unsigned byte (maybe a bug..?)

I have been trying to use custom attribute data on a mesh, and I need to pass a uint8 as a per-vertex attribute.

In the example, I am creating a vertex buffer with data type UNSIGNED_BYTE.
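
Roughly what I mean (a minimal sketch; I am going from memory on Babylon's positional VertexBuffer constructor, so double-check the argument order against the docs for your version, and the data values are just illustrative):

    // (engine, data, kind, updatable, postponeInternalCreation, stride,
    //  instanced, offset, size, type)
    const data = new Uint8Array([0, 1, 2, 3]); // one uint8 per vertex
    const buffer = new BABYLON.VertexBuffer(
        engine, data, "custom_data", false, false, undefined,
        false, 0, 1, BABYLON.VertexBuffer.UNSIGNED_BYTE);
    mesh.setVerticesBuffer(buffer);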

It all goes well until I try to use the attribute in the shader.
As soon as I try to assign the custom data to the out varying that goes from the vertex to the fragment shader, I do not see anything being rendered.
No errors in the console either.

But if, for example, I assign a constant value to the out varying, things work fine… very weird!

This makes nothing render (but there are no compilation errors):

    #version 300 es
    precision highp float;

    // Attributes
    in vec3 position;
    in uint custom_data;

    // Uniforms
    uniform mat4 worldViewProjection;

    // Varying
    flat out uint v_custom_data;

    void main(void) {
        v_custom_data = custom_data; // <---
        gl_Position = worldViewProjection * vec4(position, 1.0);
    }

This is OK:

    #version 300 es
    precision highp float;

    // Attributes
    in vec3 position;
    in uint custom_data;

    // Uniforms
    uniform mat4 worldViewProjection;

    // Varying
    flat out uint v_custom_data;

    void main(void) {
        v_custom_data = uint(255); // <---
        gl_Position = worldViewProjection * vec4(position, 1.0);
    }

I do not think that WebGL supports int attributes:
WebGL: How to Use Integer Attributes in GLSL - Stack Overflow
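
For reference, GLSL ES 1.00 only allows float attributes, so the usual WebGL1 workaround is to upload the bytes anyway and let vertexAttribPointer convert them to float on the way in (a minimal sketch; program/buffer names are illustrative):

    // WebGL1: the shader declares "attribute float custom_data;".
    // Each unsigned byte arrives as a float: 0..255 with normalized = false,
    // or scaled to 0.0..1.0 with normalized = true.
    const loc = gl.getAttribLocation(program, "custom_data");
    gl.bindBuffer(gl.ARRAY_BUFFER, buffer);
    gl.enableVertexAttribArray(loc);
    gl.vertexAttribPointer(loc, 1, gl.UNSIGNED_BYTE, false, 0, 0);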

It does work in WebGPU when you set size to 1 (size being the number of components, not the size of the buffer):

It does work in WebGL in this PG:

However, we set the “position” buffer, which is a type known to the engine. Maybe we don’t support custom int buffers in WebGL… We’ll have to take a look.

@Deltakosh I think your link deals with WebGL, not WebGL2.
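
WebGL2 does support integer attributes, but they have to be bound with gl.vertexAttribIPointer rather than gl.vertexAttribPointer; if the pointer setup and the shader declaration disagree, the attribute values are undefined, which would fit the “nothing renders, no errors” symptom. A minimal sketch (buffer/program names are illustrative):

    // WebGL2: matches "in uint custom_data;" in a #version 300 es shader.
    const loc = gl.getAttribLocation(program, "custom_data");
    gl.bindBuffer(gl.ARRAY_BUFFER, buffer);
    gl.enableVertexAttribArray(loc);
    // Note the "I" variant: the data stays integral, and there is no
    // "normalized" parameter.
    gl.vertexAttribIPointer(loc, 1, gl.UNSIGNED_BYTE, 0, 0);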

Ha yeah, that may be why I was remembering it ;D