I tried to visualize the depth map at https://www.babylonjs-playground.com/#RL5CX0#4. It seems clear that the depth values are linear from 0 to 1; 0 represents the near plane and 1 represents the far plane.

In my app, there are two draw calls. In the first one, I write the depth value with gl_FragDepth = xxx;, and I hope the second draw call can use the depth value written by the first to perform the depth test.

However, with gl_FragDepth = xxx;, the second scene is only rendered when xxx is larger than 0.996, which is almost 1.0. If the value is smaller than 0.996, the second draw call is occluded.

It seems the value used for the depth test is not linear from zero to one. I have not turned on logarithmic depth, BTW.

So my question is: what is the actual value used for the depth test? And how can I convert a depth value in the range [near, far] into the appropriate [0, 1] value for the depth test?
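For reference, here is a small sketch (my own, not from the thread) of how a standard WebGL perspective projection maps a view-space distance d in [near, far] to the non-linear value stored in the depth buffer; the function name and parameter values are assumptions for illustration:

```javascript
// Maps a positive view-space distance d (near <= d <= far) to the value a
// standard WebGL perspective projection writes to the depth buffer.
// NDC z is in [-1, 1] and the viewport transform remaps it to [0, 1].
function depthBufferValue(d, near, far) {
  const zNdc = (far + near) / (far - near) - (2 * far * near) / ((far - near) * d);
  return zNdc * 0.5 + 0.5;
}

// With near = 0.1 and far = 100, a point halfway to the far plane already
// lands above 0.996 in the depth buffer, which matches the observation
// that only fragments with gl_FragDepth near 1.0 pass the depth test.
const mid = depthBufferValue(50, 0.1, 100);
```

Note how most of the [0, 1] range is spent close to the near plane; that hyperbolic distribution is why the written 0.996 threshold appears so close to 1.0.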

Indeed, depth values are not linear. Depending on your use case, you'll need to convert them to linear.
IIRC, the spaces are not the same between WebGPU and WebGL, so the conversion needs to be slightly different.
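To illustrate that difference (a sketch with my own function names, not Babylon.js source): WebGL's NDC z spans [-1, 1] while WebGPU's clip-space z spans [0, 1], so recovering a linear view-space distance from a stored depth value differs between the two:

```javascript
// Recover a linear view-space distance from a depth-buffer value in [0, 1].

// WebGL convention: NDC z is in [-1, 1] and the viewport transform stores
// it as [0, 1], so first undo that remapping.
function linearizeDepthWebGL(depth, near, far) {
  const zNdc = depth * 2 - 1; // back to NDC [-1, 1]
  return (2 * near * far) / (far + near - zNdc * (far - near));
}

// WebGPU convention: clip-space z is already in [0, 1], so no remapping.
function linearizeDepthWebGPU(depth, near, far) {
  return (near * far) / (far - depth * (far - near));
}
```

Both functions return near for a stored depth of 0 and far for a stored depth of 1; only the intermediate mapping differs.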

Here gl_Position.z should be in clip space, in the range [0, 1]; depthValues.x is minZ and depthValues.y is minZ + maxZ. Do you know what this formula means?
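If I'm reading the formula correctly (a guess on my part, not confirmed in the thread), (z + minZ) / (minZ + maxZ) is simply a linear remap of a z value from [-minZ, maxZ] onto [0, 1]:

```javascript
// Hypothetical reading of the formula: with depthValues.x = minZ and
// depthValues.y = minZ + maxZ, this linearly remaps z from the interval
// [-minZ, maxZ] onto [0, 1].
function depthMetric(z, minZ, maxZ) {
  return (z + minZ) / (minZ + maxZ);
}
```

Under that reading, z = -minZ maps to 0 and z = maxZ maps to 1, giving a linear depth metric rather than the hyperbolic one a perspective projection produces.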