First post in the new camp, hooray!
Do data textures such as metallic / roughness, normal, ao, height maps need to be manually de-gammaed, using something like “texture.gammaSpace = false;” ?
Let’s assume these textures come out of Substance Painter or are downloaded from CGTextures.com PBR Resources.
Or maybe gamma is taken care of by the PBR shader in Babylon and I can just pass the textures in without worrying about any gamma (linear workflow) issues?
Thanks in advance!
I have the feeling that it’s automatic (I can’t find where it’s set, probably somewhere around these files), because I haven’t seen any artefacts due to this yet (except once, about lightmaps in PBRMaterial, but it was fixed soon after).
Note that, according to the doc, one texture may have to be forced to
gammaSpace = false: the environmentTexture. I’m not sure whether this has to be set if you’re using the .env format.
(actually, in Shaders folder rather than Materials I think: Babylon.js/pbr.fragment.fx at master · BabylonJS/Babylon.js · GitHub)
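A minimal sketch of that setting (assuming a scene already exists; the file path is a placeholder, and CreateFromPrefilteredData is the usual way to load a .env file):

```javascript
// Load a prefiltered .env file; its data is stored linearly and handled automatically:
scene.environmentTexture = BABYLON.CubeTexture.CreateFromPrefilteredData(
    "textures/environment.env", scene);

// For other environment sources you may have to force the flag yourself:
// scene.environmentTexture.gammaSpace = false;
```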
Got it!.. I think
Need a dev to confirm, but does the normalize function get pixels from the texture in linear space?
Yep, this is the goal of the
texture.gammaSpace property. There is no way to detect it from the files, so you have to provide it manually.
OK so I was totally wrong
Let me add some details on this one.
In the PBR material, albedo, reflectivity and emissive textures are known to be in gamma space, and this cannot be changed by code.
In the shader we rely on toLinearSpace to convert them. @Vinc3r, normalize is mostly used on vectors to keep their direction while giving them a length of one.
That said, lightmap, reflection, refraction and environment textures can be configured, as they might be stored directly in linear space (for HDR, for instance). So you can use the gammaSpace flag for them.
Hope that helps,
The other data textures, such as metallic/roughness, normal and ao maps, are always considered to be in linear space and no transformation is applied to them.
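As an aside, the toLinearSpace conversion the shader relies on boils down to a power curve. Here is a sketch, assuming Babylon’s usual 2.2 power approximation rather than the exact piecewise sRGB curve:

```javascript
// Approximate sRGB <-> linear conversion, per channel in [0, 1].
const toLinearSpace = (c) => Math.pow(c, 2.2);
const toGammaSpace = (c) => Math.pow(c, 1 / 2.2);

// A mid-gray sRGB value becomes darker once in linear space:
console.log(toLinearSpace(0.5).toFixed(3)); // ≈ 0.218
// Round-tripping restores the original value:
console.log(toGammaSpace(toLinearSpace(0.5)).toFixed(3)); // ≈ 0.500
```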
Thanks a lot folks!
This clears up my doubts. Color space has always made me a bit paranoid.
So the dummy version:
HDR textures, such as lightMap and environmentTexture, can be flagged with gammaSpace = false if the data is stored linearly;
sRGB textures (usually in formats like jpg, png, etc.) don’t need the gammaSpace flag and are taken care of by the shader.
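Sketched in code (hypothetical texture paths; assumes a PBRMaterial named pbr in an existing scene):

```javascript
// sRGB color texture (png/jpg): leave gammaSpace at its default of true,
// the shader converts it to linear space for you.
pbr.albedoTexture = new BABYLON.Texture("textures/albedo.png", scene);

// A lightmap baked and saved in linear space: flag it so the shader
// does not apply an extra gamma-to-linear conversion.
const lightmap = new BABYLON.Texture("textures/lightmap.png", scene);
lightmap.gammaSpace = false;
pbr.lightmapTexture = lightmap;
```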
As soon as possible, I will add these details somewhere in the docs.
(also, in case people don’t know this great pdf, here’s the link: The Beginners Explanation of Gamma Correction and Linear Workflow)
That pdf is a great source of reference. When I was doing a lot of off-line rendering arch-viz type of work, it helped me a lot with getting the lighting right.
Is this still true @Deltakosh? We upload a lot of HDRs and sometimes we get bad results depending on this gamma property. So we would need a way to detect it, if possible?
I know the gamma space information is there in ktx2 files but I don’t think it exists in most of the files (at least not in png, jpg, hdr, …).
Regarding .hdr files, those files are normally used for linear-space data, so the loader automatically sets
gammaSpace = false. Same thing for .env files. But for other types of files,
gammaSpace = true by default and it’s up to you to set it to
false when you know your data are in linear space.
Yes, we also set gammaSpace = false by default for HDR and DDS. But when we make the transformation to ENV with
EnvironmentTextureTools.CreateEnvTextureAsync, the rendering using the new ENV file can be much darker than with the HDR file. And in that case, setting gammaSpace = true makes it look more like the original HDR file.
CreateEnvTextureAsync is expecting the input texture to be in gamma space, so it does a gamma-to-linear conversion (but not in all cases…) which is wrong if the input is not in gamma space… We could easily check that the texture is already in linear space and skip this conversion, but it would be a breaking change… @sebavan what do you think?
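That darkening is exactly what an extra gamma-to-linear conversion does to data that is already linear. A quick sketch, using the common 2.2 power approximation:

```javascript
const toLinear = (c) => Math.pow(c, 2.2); // gamma -> linear, per channel

const alreadyLinear = 0.8;                        // a value that is already linear
const doubleConverted = toLinear(alreadyLinear);  // wrongly converted again

console.log(doubleConverted.toFixed(3)); // ≈ 0.612, noticeably darker
// Setting gammaSpace = true afterwards re-applies the inverse curve,
// which is why it brings the image back toward the original HDR look.
```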
CreateEnvTextureAsync expects the input to be linear if Float or Half Float which is what happens for any DDS or HDR textures. We only convert to linear if the input is a byte texture (6 faces png for instance).
Then loading an ENV or HDR should work by default without any setup, because we know what they are storing.
DDS is way more random but usually in linear space if storage is float or half float.
For us this is mostly HDR.
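In code, the conversion step looks roughly like this (a sketch; hdrTex stands for an already-loaded, prefiltered HDR cube texture):

```javascript
// Produces the .env payload; float/half-float inputs are taken as linear:
const envBuffer = await BABYLON.EnvironmentTextureTools.CreateEnvTextureAsync(hdrTex);
// envBuffer is an ArrayBuffer you can save as a .env file and later reload
// with BABYLON.CubeTexture.CreateFromPrefilteredData.
```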
Let me show you the differences I got. Here is the rendering:
- with the original HDR
- with the ENV file obtained with gammaSpace = false, which is way darker
- with the ENV file with gammaSpace = true, which is closer to the original HDR
Plus, in both cases we can see some weird artifacts between the HDR and ENV renderings, especially if you look at the sphere on the top right. I played with all the parameters of the
HDRCubeTexture constructor but can’t get rid of these artifacts yet. Maybe this is linked to the darkening issue?
I can send you the HDR if needed.
Thanks for helping
Those artefacts are strange… Are you able to setup a repro in the PG?
After working on a playground: https://playground.babylonjs.com/#0HZNW3#2
I realized that the issue does not actually seem to come from the transformation to ENV, but from the
prefilterOnLoad parameter used when loading the HDR.
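For context, a sketch of the loading call in question (parameter order as in recent Babylon versions; double-check against the API docs, and the path is a placeholder):

```javascript
const hdrTex = new BABYLON.HDRCubeTexture(
    "textures/environment.hdr", scene, 512,
    false,  // noMipmap
    true,   // generateHarmonics
    false,  // gammaSpace: .hdr data is linear
    true    // prefilterOnLoad: prefilter on the fly for PBR IBL
);
```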