Custom Texture Channel Packing - PBR Material


Just wondering if it's possible to map texture channels to specific shader inputs. I am aware of the ORM map, but if you wanted to pack the textures differently, say, is it possible to remap them? I was looking through the API but couldn't see a way to access the channels of a texture.

A custom shader would work, I believe, but from what I can see it may need a rewrite of the PBR implementation and how it handles each channel.

Is this the correct assumption to make?


Upon further reading and investigation I can see that the most optimal channels have been chosen in the PBRMetallicRoughnessMaterial, so I guess packing textures differently isn't really needed.

That said, I could see cases where you could in theory pack, say, a roughness texture into the blue channel of a normal map and reconstruct the blue channel within the shader. Maybe the node-based shader editor will allow for this flexibility eventually?
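For illustration, that reconstruction could be sketched like this in plain JavaScript (a hypothetical helper, not a Babylon API): a unit-length tangent-space normal satisfies x² + y² + z² = 1, so z can be rebuilt from the red and green channels, leaving blue free for roughness.

```javascript
// Hypothetical sketch, not a Babylon API: unpack a texture where the
// red/green channels hold the normal's x/y and blue holds roughness.
function unpackNormalAndRoughness(r, g, b) {
  // Map stored x/y from [0, 1] back to [-1, 1]
  const x = r * 2 - 1;
  const y = g * 2 - 1;
  // Rebuild z from the unit-length constraint (z assumed positive,
  // as it is for tangent-space normal maps)
  const z = Math.sqrt(Math.max(0, 1 - x * x - y * y));
  return { normal: [x, y, z], roughness: b };
}
```

The same math would live in the fragment shader; the sketch just shows that no information is lost as long as z is known to be non-negative.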

Yes this is correct. A custom shader or the new Node Material :slight_smile:
How To use the Node Material - Babylon.js Documentation

Hi @Deltakosh, I actually think this would be a useful addition. Looking at the PBR shader, there’s this line:

#ifdef ALBEDO
    vec4 albedoTexture = texture2D(albedoSampler, vAlbedoUV + uvOffset);
    #if defined(ALPHAFROMALBEDO) || defined(ALPHATEST)
        alpha *= albedoTexture.a;
    #endif

    surfaceAlbedo *= toLinearSpace(albedoTexture.rgb);
    surfaceAlbedo *= vAlbedoInfos.y;
#endif

With just a small change, we could handle texture packing using grayscale images:

#ifdef ALBEDO
    vec4 albedoTexture = texture2D(albedoSampler, vAlbedoUV + uvOffset);
    vec3 albedoColor = albedoTexture.rgb;
    #if defined(ALBEDO_CHANNEL) // 0, 1, 2
        albedoColor = albedoColor[albedoChannel];
    #endif
    #if defined(ALPHAFROMALBEDO) || defined(ALPHATEST)
        alpha *= albedoTexture.a;
    #endif

    surfaceAlbedo *= toLinearSpace(albedoColor);
    surfaceAlbedo *= vAlbedoInfos.y;
#endif

What do you think?

EDIT: Usage would be something like:

const tex = new BABYLON.Texture('myCombinedTexture.png');
pbrMaterial.albedoTexture = tex;
pbrMaterial.albedoChannels = [TEXTURE_CHANNELS.R];
pbrMaterial.normalTexture = tex;

What is it supposed to do?

albedoColor is a vec3; I don't think albedoColor[...] is valid in a shader.

Even if it were, it would return a single float, which you can't assign to a vec3.

It is valid, since a packed channel is grayscale and would only result in a float from 0 to 1. I'm thinking of a scenario where your diffuse texture is a neutralized gray image, and a user can use a color picker to add the color. So the final color may look like:

// albedoColor is a float like 0.78...
vec3 finalColor = albedoColor * vec3(1.0, 0., 0.);
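On the CPU, that tinting looks like this (a hypothetical helper for illustration only, not part of any API):

```javascript
// Hypothetical sketch: tint a single packed grayscale albedo value with a
// user-picked RGB color, mirroring `albedoValue * pickedColor` in the shader.
function tintGrayscale(albedoValue, pickedColor) {
  return pickedColor.map(c => albedoValue * c);
}
```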

I stand corrected, it is valid, but it gives you a float. So you would need at least to do:

albedoColor = vec3(albedoColor[albedoChannel]);

Yes, and it can also be used as a multiplier when combined with a vec3 color.

How would this translate in the shader?

Maybe instead of ALBEDO_CHANNEL (or NORMAL_CHANNEL) being a number, it should be something like:

#define NORMAL_CHANNEL gba

then in the shader:

normal = vec3(normal.NORMAL_CHANNEL);

However, it won't work as-is, because when the swizzle has fewer than three components you need to supply at the end of vec3(…) as many zeros as necessary to reach 3 components (another define with something like ",0"?):

In the albedo example we would have:

albedoColor = vec3(albedoColor.ALBEDO_CHANNEL ALBEDO_CHANNEL_ZEROS);

(Maybe filling with something other than 0?)
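The swizzle-plus-padding idea can be emulated on the CPU (hypothetical helper name; in the real shader the padding would come from a define like ALBEDO_CHANNEL_ZEROS):

```javascript
// Hypothetical sketch: pull the channels named by a swizzle string out of an
// RGBA pixel and pad with zeros up to three components.
function swizzlePad3(pixel, swizzle) {
  const index = { r: 0, g: 1, b: 2, a: 3 };
  const out = [...swizzle].map(ch => pixel[index[ch]]);
  while (out.length < 3) out.push(0); // the ",0" padding the define would add
  return out;
}
```

A one-channel swizzle like "g" then yields a padded triplet, while a full "gba" swizzle passes three channels through unchanged.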

Trying to understand how it could fit in the existing shaders in Babylon…

I haven’t thought it all the way through, but you can reference a uniform array and convert the results into a vec3. The DEFINES just acts to filter out the logic, but let’s say you input an array of [1,2,3] for the normal map. Since this is a normal map, we can assume that we use three channels, but we could also send in the array length as a uniform.

    int channels[3];
    channels = int[](1, 2, 3); // This would be the uniform
    vec4 iColor = vec4(0., 1., 0., 0.); // This is the input texture sample
    vec3 col = vec3(iColor[channels[0]], iColor[channels[1]], iColor[channels[2]]);

    // Output to screen
    fragColor = vec4(col, 1.0); // This outputs red

So really the trick is just allowing the user to specify a list of channels and then in the shader we can convert that into any kind of vec3 we want. It’s already doing that for the ORM model on the metallic/roughness texture, but it could be a lot more flexible and allow for some clever texture packing schemes to boost load times.
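The channel lookup in the snippet above can be mirrored in plain JavaScript (hypothetical helper, not a Babylon API):

```javascript
// Hypothetical sketch: `channels` lists which source channel (0-3) feeds each
// output component, just like the uniform int array would in GLSL.
function remapChannels(pixel, channels) {
  return channels.map(i => pixel[i]);
}
```

With pixel [0, 1, 0, 0] and channels [1, 2, 3] this yields [1, 0, 0], matching the red output in the shader snippet.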