Incorrect rendering in VR on Oculus Quest etc. using logarithmicDepthBuffer?

In the following example https://www.babylonjs-playground.com/#W2N9TD#1
I am using a logarithmic depth buffer to work with objects at astronomical scales. (I’m simulating the solar system, and I need to scale things so that the moon is far enough away to avoid parallax issues, while the other planets are still rendered correctly.)
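For reference, the relevant part is just the logarithmic depth flag on the material. This is a minimal sketch of the setup (the `scene` and `planetMesh` variables are assumed, not the exact playground code):

```javascript
// Sketch: enable the logarithmic depth buffer on a planet's material.
// With this flag, Babylon.js writes log-scaled depth in the fragment shader,
// which spreads precision over huge near/far ranges.
const mat = new BABYLON.StandardMaterial("planetMat", scene);
mat.useLogarithmicDepth = true;
planetMesh.material = mat;
```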

When planets overlap, I expect the closer one to be in front, and so it is (planet1 and planet2) if the distances are not too great. When I move both planets further out, to distances in the 1E7 range, the relative distances are ignored and the planets are rendered the wrong way round — but only in VR on an Oculus Quest (or probably other Android browsers). On a desktop, or before entering VR, it’s correct.

Now I’m guessing this is a problem with floating point precision somewhere. But what I would like to know is:

  • what exactly is the limitation on the Oculus — is it the shader precision on the GPU, or something else?
  • where is the available precision documented, if anywhere?
  • is there a workaround (a custom material, a different material?)
  • or should I report this as a bug?

Thanks

Hello and welcome!

@benaadams recently introduced the perfect feature for that: Reverse Depth Buffer (z-buffer)

To use it you only need to flag your engine with `engine.useReverseDepthBuffer = true`
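In context, that’s a one-line change when creating the engine (a configuration sketch; the `canvas` variable is assumed):

```javascript
// Sketch: flip the engine to a reversed depth buffer, which maps
// far = 0.0 and near = 1.0, spreading float precision more evenly
// across the depth range than the conventional mapping.
const engine = new BABYLON.Engine(canvas, true);
engine.useReverseDepthBuffer = true;
```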

If I add that line into my playground, then both examples render back to front all the time (even when not in VR). So it doesn’t seem to work…

You have to stop using the log depth in this case :slight_smile:

Thanks. But when I remove the logarithmic depth buffer, the original incorrect behaviour is still there, and in fact is now visible in VR on the desktop.

Yeah, I was afraid of that. We do not have a lot of simple options for you then.

You will have to scale down your world then, unfortunately, as the Oculus Quest has a less precise depth buffer.


That’s what I was afraid of. Thanks anyway.
BTW, where can I find the detailed definitions of things like the precision of the Oculus depth buffer?

If you reduce scaling by a power of 10, then it should work with useReverseDepthBuffer? (Don’t know about on the Oculus.)

The issue I think you are hitting is that the float data just doesn’t contain enough precision for these distances: there are only so many digits of precision available, and they have to stretch from the most significant digit of the position (very far away) all the way down to the small differences between nearby vertices (close together).

| Floating point bit depth | Largest value | Smallest value | Decimal digits of precision |
| --- | --- | --- | --- |
| 32-bit float | 3.4028237 × 10^38 | 1.175494 × 10^-38 | 7.22 |
| 16-bit float | 6.55 × 10^4 | 6.10 × 10^-5 | 3.31 |
| 14-bit float | 6.55 × 10^4 | 6.10 × 10^-5 | 3.01 |
| 11-bit float | 6.50 × 10^4 | 6.10 × 10^-5 | 2.1 |
| 10-bit float | 6.50 × 10^4 | 6.10 × 10^-5 | 1.8 |

The IEEE 754 standard specifies a 32-bit float (binary32) as having:

  • Sign bit: 1 bit
  • Exponent width: 8 bits
  • Significand precision: 24 bits (23 explicitly stored)

This gives from 6 to 9 significant decimal digits of precision.
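You can see that precision limit directly in JavaScript: `Math.fround` rounds a number to the nearest 32-bit float, and at magnitudes around 1e7 the spacing between representable float32 values is already a whole unit, so sub-unit differences in distance simply vanish.

```javascript
// At ~1e7, adjacent 32-bit floats are 1.0 apart, so a 0.25
// difference in distance is lost when stored as float32.
const a = Math.fround(10000000.25); // rounds to 10000000
const b = Math.fround(10000000.0);
console.log(a === b); // true: the two distances collapse to one value

// Near the origin the same 0.25 survives fine.
console.log(Math.fround(100.25) === 100.25); // true
```

This is exactly why two planets whose distances differ by a small fraction at the 1E7 scale can end up at the same depth and sort the wrong way round.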


I understand that, thanks. But what I don’t know is what the Oculus Quest actually supports. I’m guessing it’s only 16-bit floats, whereas the desktop supports 32-bit. But it would be nice to know for sure…