Hi!
Not sure if bug or intended behaviour.
I was expecting that if the system supports float render targets (32-bit), scene.enableDepthRenderer(...)
would create a depth renderer whose depth map is a 32-bit float texture.
It seems to default to half float (16-bit) even when rendering to full float (32-bit) is possible.
Playground: https://playground.babylonjs.com/#348H8F#1
(Check the console output in the playground.)
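Roughly what the playground checks, for reference (a sketch, assuming the usual engine caps flags and getDepthMap()/getInternalTexture() from the current API):

```ts
const caps = scene.getEngine().getCaps();
// Both of these report true on my setup:
console.log("textureFloatRender:", caps.textureFloatRender);
console.log("textureHalfFloatRender:", caps.textureHalfFloatRender);

const depthRenderer = scene.enableDepthRenderer();
// Logs TEXTURETYPE_HALF_FLOAT (2) rather than TEXTURETYPE_FLOAT (1):
console.log("depth map texture type:", depthRenderer.getDepthMap().getInternalTexture()?.type);
```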
Source code for enableDepthRenderer: https://github.com/BabylonJS/Babylon.js/blob/efcd60d556d7792d32d186a95103281df81f0eae/src/Rendering/depthRendererSceneComponent.ts#L41-L48
Here it checks for half float render support before full float render support, so the render target ends up 16-bit when it could be 32-bit. Is this intended?
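What I expected, roughly (just a sketch with the two branches from the linked source swapped, so full float is preferred when both caps are available):

```ts
// Sketch: prefer 32-bit float, fall back to 16-bit half float, else unsigned byte.
let textureType = Constants.TEXTURETYPE_UNSIGNED_INT;
if (engine.getCaps().textureFloatRender) {
    textureType = Constants.TEXTURETYPE_FLOAT;       // full 32-bit float
} else if (engine.getCaps().textureHalfFloatRender) {
    textureType = Constants.TEXTURETYPE_HALF_FLOAT;  // 16-bit half float
}
```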
The docs are a bit vague about whether it defaults to 16-bit or 32-bit:
> Note: By default, generated texture uses float components thanks to WebGL OES_texture_float extension. If this extension is not supported, Babylon.js reverts back to byte component which means less precision for depth values.
https://doc.babylonjs.com/how_to/how_to_use_depthrenderer_to_get_depth_values