IBL of Diffuse BRDF

I’ve been learning about PBR recently. When using IBL to compute the diffuse term, I ran into some confusion. While working out the spherical coordinates, I found that most articles use a z-up convention, but three.js uses y-up, with the azimuth measured from +z. I tried to convolve the envMap based on that convention, but I got the wrong image. Can someone explain it to me?

result:

Here are the modifications I made:

vec3 sphericalEnvmapToDirection(vec2 tex) {
  float theta = PI * (1.0 - tex.t);
  float phi = 2.0 * PI * (0.5 - tex.s);
  // return vec3(sin(theta) * cos(phi), sin(theta) * sin(phi), cos(theta));
  return vec3(sin(theta) * sin(phi), cos(theta), sin(theta) * cos(phi));
}

vec2 directionToSphericalEnvmap(vec3 dir) {
  // float phi = atan(dir.y, dir.x);
  // float theta = acos(dir.z);
  float phi = atan(dir.x, dir.z);
  float theta = acos(dir.y);
  float s = 0.5 - phi / (2.0 * PI);
  float t = 1.0 - theta / PI;
  return vec2(s, t);
}
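For reference, here is a small Python port of the two functions above (my own sketch, not code from any engine) that checks the uv → direction → uv round trip for this y-up, azimuth-from-+z convention, away from the poles where phi is undefined:

```python
import math

# Hypothetical Python port of the two GLSL functions above
# (y-up, azimuth measured from +z), used to sanity-check the round trip.

def spherical_envmap_to_direction(s, t):
    theta = math.pi * (1.0 - t)          # polar angle from +y
    phi = 2.0 * math.pi * (0.5 - s)      # azimuth from +z
    return (math.sin(theta) * math.sin(phi),
            math.cos(theta),
            math.sin(theta) * math.cos(phi))

def direction_to_spherical_envmap(x, y, z):
    phi = math.atan2(x, z)
    theta = math.acos(y)
    s = 0.5 - phi / (2.0 * math.pi)
    t = 1.0 - theta / math.pi
    return (s, t)

# Round trip: uv -> direction -> uv should reproduce the same uv.
for s, t in [(0.1, 0.3), (0.25, 0.5), (0.7, 0.8), (0.9, 0.4)]:
    d = spherical_envmap_to_direction(s, t)
    s2, t2 = direction_to_spherical_envmap(*d)
    assert abs(s - s2) < 1e-9 and abs(t - t2) < 1e-9
```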

and in the sampling loop:

vec3 prefilterEnvMapDiffuse(in sampler2D envmapSampler, in vec2 tex) {
  float px = t2p(tex.x, width);
  float py = t2p(tex.y, height);
  
  vec3 normal = sphericalEnvmapToDirection(tex);
  mat3 normalTransform = getNormalFrame(normal);
  vec3 result = vec3(0.0);
  uint N = uint(samples);
  for(uint n = 0u; n < N; n++) {
    vec3 random = random_pcg3d(uvec3(px, py, n));
    float phi = 2.0 * PI * random.x;
    float theta = asin(sqrt(random.y));
    // vec3 posLocal = vec3(sin(theta) * cos(phi), sin(theta) * sin(phi), cos(theta));
    vec3 posLocal = vec3(sin(theta) * sin(phi), cos(theta), sin(theta) * cos(phi));
    vec3 posWorld = normalTransform * posLocal;
    vec2 uv = directionToSphericalEnvmap(posWorld);
    vec3 radiance = textureLod(envmapSampler, uv, mipmapLevel).rgb;
    result += radiance;
  }
  result = result / float(N);
  return result;
}
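A side note on why the loop above can average radiance with no explicit cosine or pdf factor: theta = asin(sqrt(xi)) is cosine-weighted hemisphere sampling with pdf(omega) = cos(theta)/pi, so in the Monte Carlo estimator of (1/pi) ∫ L(omega) cos(theta) dOmega the cosine and pdf cancel, leaving a plain average. A minimal Python sketch (my own check, not the shader) that verifies the sampling distribution:

```python
import math
import random

# theta = asin(sqrt(xi)) draws cosine-weighted hemisphere samples,
# i.e. pdf(omega) = cos(theta)/pi. Under that pdf, the expected value
# of cos(theta) is 2/3, which we verify by Monte Carlo below.

random.seed(42)
N = 200_000
mean_cos = sum(math.cos(math.asin(math.sqrt(random.random())))
               for _ in range(N)) / N

# E[cos(theta)] = integral over hemisphere of cos^2(theta)/pi dOmega = 2/3
assert abs(mean_cos - 2.0 / 3.0) < 0.01
```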

This is the axis definition used in most pages and literature:

I made an example for anyone to debug: demo
To see the results, click the play button in Time Control.

If you want to modify the shader, select the shader node and click the "edit code" button.

cc @PatrickRyan

@KallkaGo, the best thing to do would be to set up a Playground using our PBRMaterial on a sphere, and then create your own shader for a second sphere that mirrors what our PBRMaterial is doing. You can even set it up using our PBRRoughnessMetallic node in NME and pass only the component you care about (such as the diffuse indirect component, which is just the IBL contribution to the diffuse color). This way you are using the same coordinate system as our shader and can refer back to what our shader is doing as an example. The reason it is difficult to discuss this topic using external tools is that we may, and probably do, have different coordinate-system and handedness conventions. Once you have that, we can help you with any gaps in your approach, so that we are all working in the same context without needing to figure out where the possible convention conflicts are.


Thanks for your reply. The original implementation of the IBL diffuse BRDF was based on three.js, because I am familiar with three.js. I also referred to the approach in Learn OpenGL, whose spherical-coordinate convention is the same as the one on Wikipedia. By the way, three.js has a class called Spherical: it takes +y as the reference axis for the polar angle and +z as the reference axis for the azimuth. However, the wiki page doesn’t seem to say that theta and phi must be measured from any particular axis, so I used a world-space right-handed system to test. I’m sorry, I am not very familiar with the Babylon.js API, so reproducing this in a Playground would be troublesome. Instead I used an external online tool similar to Unity’s Shader Graph for the demo; that way we only need to worry about shader-level issues.

I solved it. Let me share the process.

The wiki and most pages are based on coordinate system A in the figure above: the polar angle (theta) is measured from +z and the azimuth angle (phi) from +x. With coordinate system B, theta is measured from +y and phi from +z.

So the corresponding xyz components in the figure below also change accordingly.
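Concretely, for the same (theta, phi) the two conventions just permute the components cyclically: (x, y, z) in B equals (y, z, x) in A. A quick Python check of that claim (my own sketch, using the A and B definitions described above):

```python
import math

# Convention A: z-up, theta from +z, phi from +x.
def dir_a(theta, phi):
    return (math.sin(theta) * math.cos(phi),
            math.sin(theta) * math.sin(phi),
            math.cos(theta))

# Convention B: y-up, theta from +y, phi from +z.
def dir_b(theta, phi):
    return (math.sin(theta) * math.sin(phi),
            math.cos(theta),
            math.sin(theta) * math.cos(phi))

# For any (theta, phi), B is a cyclic permutation of A:
# (x, y, z)_B == (y, z, x)_A
for theta, phi in [(0.3, 1.1), (1.2, 4.0), (2.5, 5.7)]:
    ax, ay, az = dir_a(theta, phi)
    bx, by, bz = dir_b(theta, phi)
    assert abs(bx - ay) < 1e-12
    assert abs(by - az) < 1e-12
    assert abs(bz - ax) < 1e-12
```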

But the key to the problem is that, when using Monte Carlo integration and importance sampling to precompute the diffuse BRDF in IBL, you must transform the sampled direction vector (around the normal) from local space to world space. There are two ways to construct the local-to-world transformation matrix. The first is to build a TRS (translation/rotation/scale) matrix. The second is to build it from basis vectors: to transform coordinates from local frame A to world frame B, the matrix columns are A's basis vectors expressed in B (the local axes written in world coordinates), filled in column by column, each column corresponding to one local axis.

But I ignored this step when constructing the transformation matrix. Suppose there is a patch facing me; in other words, the patch normal points exactly opposite to my view direction. Then, from my perspective, its local coordinate system should look like this:

So when using spherical coordinates defined in coordinate system B, I must construct the basis-vector transformation matrix in the same form: the normal lies on the Y axis, so the column order in the matrix must be T, N, B.
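getNormalFrame itself is not shown in the thread, so here is a hypothetical Python sketch of such a helper for the y-up convention, with the matrix columns ordered (T, N, B) so that local +Y maps onto the world-space normal:

```python
import math

def cross(a, b):
    return (a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0])

def normalize(v):
    l = math.sqrt(sum(c * c for c in v))
    return tuple(c / l for c in v)

# Hypothetical getNormalFrame-style helper (not the original shader code):
# builds an orthonormal frame whose columns are (T, N, B), so the
# local +Y axis maps onto the world-space normal n.
def get_normal_frame(n):
    # Pick a helper axis that is not parallel to n.
    up = (0.0, 0.0, 1.0) if abs(n[2]) < 0.999 else (1.0, 0.0, 0.0)
    t = normalize(cross(up, n))
    b = cross(n, t)
    return (t, n, b)  # columns T, N, B

# Multiply the column-matrix by a local vector: x*T + y*N + z*B.
def transform(frame, local):
    t, n, b = frame
    return tuple(t[i] * local[0] + n[i] * local[1] + b[i] * local[2]
                 for i in range(3))

# Sanity check: local (0, 1, 0) must map to the world normal.
normal = normalize((0.3, 0.8, -0.5))
frame = get_normal_frame(normal)
world = transform(frame, (0.0, 1.0, 0.0))
assert all(abs(world[i] - normal[i]) < 1e-12 for i in range(3))
```

If the normal were instead placed in the Z column (the z-up A convention), the same sampled local directions would land in the wrong place, which matches the wrong image shown earlier.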

If there is something wrong with my understanding, please point it out. Thank you!

Here is how we do it in Babylon: Babylon.js/packages/dev/core/src/Shaders/ShadersInclude/hdrFilteringFunctions.fx at 96a9a2f0fae461a85822f1a768fd8bc15e06fdc1 · BabylonJS/Babylon.js · GitHub, if that helps.
