scene.enableDepthRenderer defaults to half float even if the system supports full float

Hi!

Not sure if this is a bug or intended behaviour.
I was expecting that if the system supports float rendering (32-bit), scene.enableDepthRenderer(...) would create a DepthRenderer whose depth map is a 32-bit float texture.

Instead it seems to default to half float (16-bit) even when rendering to full float (32-bit) is possible.

Playground: https://playground.babylonjs.com/#348H8F#1 (see the console output).
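
In short, this is what the playground checks (a minimal sketch with assumed variable names, written as if inside a playground createScene where scene and engine are available):

```js
// Create the depth renderer and grab its render target.
const depthRenderer = scene.enableDepthRenderer();
const depthMap = depthRenderer.getDepthMap();

// The engine reports that rendering to full float is supported...
console.log("textureFloatRender:", engine.getCaps().textureFloatRender); // true on my system

// ...but the generated depth map still comes back as half float.
console.log("depth map is full float:",
    depthMap.textureType === BABYLON.Constants.TEXTURETYPE_FLOAT); // logs false
```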

Source code for enableDepthRenderer: https://github.com/BabylonJS/Babylon.js/blob/efcd60d556d7792d32d186a95103281df81f0eae/src/Rendering/depthRendererSceneComponent.ts#L41-L48
It checks for half float render support before full float render support, so the render target ends up 16-bit when it could be 32-bit. Is this intended?
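
Paraphrasing those lines (a sketch based on my reading of the linked code, not a copy-paste), the type selection boils down to:

```js
// Half float is tested first, so it wins whenever both paths are available.
let textureType = Constants.TEXTURETYPE_UNSIGNED_BYTE;
if (engine.getCaps().textureHalfFloatRender) {
    textureType = Constants.TEXTURETYPE_HALF_FLOAT;
} else if (engine.getCaps().textureFloatRender) {
    textureType = Constants.TEXTURETYPE_FLOAT;
}
// textureType is then used for the DepthRenderer's render target.
```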

The docs are also a bit vague about whether it defaults to 16-bit or 32-bit:

> Note: By default, generated texture uses float components thanks to WebGL OES_texture_float extension. If this extension is not supported, Babylon.js reverts back to byte component which means less precision for depth values.

https://doc.babylonjs.com/how_to/how_to_use_depthrenderer_to_get_depth_values

pinging @Evgeni_Popov

It’s not a bug: 16-bit floats are prioritized over 32-bit floats to save some texture resources.

But a parameter to force 32-bit floats is easy to add, here’s the PR:
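
Usage would then look something like this (the exact parameter names and positions are assumptions here, check the PR for the real signature):

```js
// Ask for the 32-bit float path explicitly when creating the depth renderer.
const depthRenderer = scene.enableDepthRenderer(
    scene.activeCamera,
    /* storeNonLinearDepth */ false,
    /* force32bitsFloat */ true
);

// With the flag set, the depth map should be a full 32-bit float texture
// whenever the engine supports rendering to float.
console.log(depthRenderer.getDepthMap().textureType === BABYLON.Constants.TEXTURETYPE_FLOAT);
```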


Thanks for the quick response and action.

I think this added flexibility is great! You all rock :+1: