What's happening in imageProcessing.fragment.fx?

I’m looking at imageProcessing.fragment.fx’s code and can’t understand its inner workings. The code looks like this:

#ifdef IMAGEPROCESSING
	#ifndef FROMLINEARSPACE
		// Need to move to linear space for subsequent operations.
		result.rgb = toLinearSpace(result.rgb);
	#endif

	result = applyImageProcessing(result);
#else
	// In case where the input is in linear space we at least need to put it back in gamma.
	#ifdef FROMLINEARSPACE
		result = applyImageProcessing(result);
	#endif
#endif

I don’t understand what’s happening in the outermost else block. If image processing is disabled, applyImageProcessing should not be called in any case. The comment above the line suggests result.rgb = toGammaSpace(result.rgb), but not result = applyImageProcessing(result).

Does anybody know what’s happening there and also why?

Thanks.

I would agree with you on this one.

Let’s wait for @Deltakosh or @sebavan input.

Hum yeah this looks like a bad copy/paste

I think you are right

So, it is all normal, but I agree the code looks a bit confusing…

Basically, if any image processing filter is active, we first need to convert to linear space if we are not already in it (the ifndef), to then be able to apply the image processing filters, and finally move back to gamma space here: Babylon.js/imageProcessingFunctions.fx at f04a29233c2215a5c1f6b679ab8bf6b6ca13f4a7 · BabylonJS/Babylon.js · GitHub

Conversely, if no effects are applied, we only need the opposite: go back to gamma space (+ clamp) if the previous part of the pipeline is in linear space (the ifdef vs. the ifndef of the first case).

Why not simply do it here? Basically, I chose to reuse the code from Babylon.js/imageProcessingFunctions.fx at f04a29233c2215a5c1f6b679ab8bf6b6ca13f4a7 · BabylonJS/Babylon.js · GitHub, which does the conversion and the clamp, so if at any point I change the conversion, I can do it in only one place.
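To make the branching concrete, here is a JavaScript model of the four #ifdef combinations (not the shader itself): toLinearSpace/toGammaSpace stand in for the GLSL helpers using the common pow(2.2) approximation, and applyImageProcessing is assumed to end by converting back to gamma space and clamping, as the linked imageProcessingFunctions.fx does.

```javascript
const toLinearSpace = (c) => Math.pow(c, 2.2);
const toGammaSpace = (c) => Math.pow(c, 1 / 2.2);

// Assumption: mirrors the linked helper, which applies the effects and then
// converts back to gamma space and clamps to [0, 1].
function applyImageProcessing(c, effects = []) {
  let out = c;
  for (const effect of effects) out = effect(out);
  return Math.min(Math.max(toGammaSpace(out), 0), 1);
}

function shade(input, { IMAGEPROCESSING = false, FROMLINEARSPACE = false } = {}, effects = []) {
  let result = input;
  if (IMAGEPROCESSING) {
    if (!FROMLINEARSPACE) result = toLinearSpace(result); // move to linear first
    result = applyImageProcessing(result, effects);       // effects, then back to gamma + clamp
  } else if (FROMLINEARSPACE) {
    // No effects active: calling applyImageProcessing here just reuses its
    // final "back to gamma + clamp" step.
    result = applyImageProcessing(result);
  }
  return result;
}

// With no effects, a gamma-space input round-trips through linear unchanged:
console.log(shade(0.5, { IMAGEPROCESSING: true }).toFixed(3)); // "0.500"
```

This also shows why the "confusing" else branch is harmless: with IMAGEPROCESSING undefined, applyImageProcessing degenerates to the gamma conversion the comment describes.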

Hope that can help, even if I completely failed on the clarity side 🙂

@sebavan Thanks, it’s much clearer now, even though the code seems to confuse even @Deltakosh 🙂 However, I’ve been digging through the code for a reason: I thought this bit might be the culprit. Will reproduce in a playground and write again.

Yup, please repro in the PG. I was detailing the intent, but the realization might be fully wrong 🙂

Ok, so that means that if IMAGEPROCESSING is not defined, then none of the TONEMAPPING, VIGNETTE, CONTRAST, etc are defined either, so that we don’t execute the corresponding code when we are in the else part of the #ifdef IMAGEPROCESSING test?

[…] Ok, looking at the code it’s the other way around:

defines.IMAGEPROCESSING = defines.VIGNETTE || defines.TONEMAPPING ||
   defines.CONTRAST || defines.EXPOSURE || defines.COLORCURVES || 
   defines.COLORGRADING;

So IMAGEPROCESSING is left undefined only when none of the image-processing-related defines are set, which clears things up (for me)!
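A standalone sketch of that relationship (plain booleans standing in for the shader #defines): IMAGEPROCESSING is set as soon as any single sub-effect is on, and unset only when all of them are off.

```javascript
function computeImageProcessingDefine(defines) {
  return Boolean(
    defines.VIGNETTE || defines.TONEMAPPING || defines.CONTRAST ||
    defines.EXPOSURE || defines.COLORCURVES || defines.COLORGRADING
  );
}

// One sub-effect is enough to turn the whole block on...
console.log(computeImageProcessingDefine({ CONTRAST: true })); // true
// ...and only the empty set leaves it off.
console.log(computeImageProcessingDefine({}));                 // false
```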

I don’t have a (saved) playground repro, since it’s reproducible with inspector alone. Just load a playground, go to the inspector, add default pipeline, yank contrast and exposure to the maximum and then disable image processing.

My expectation is that the rendering should go back to “normal” contrast and exposure, i.e. the first and last images should be mostly identical, but it seems the output is still being post-processed. Something similar can be observed with other image processing parameters.

And while we are at it: what’s with the color change in the background after enabling image processing, shouldn’t it be identical too if I don’t change any parameters?

So all the behaviors you see are normal, despite being annoying 🙁

Basically, when enabling the post process, the rendering now works in linear space before being processed in full screen. So the clear color should now be set up in linear space to see the same visual on screen:

Babylon.js Playground
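To make that concrete, here is a minimal sketch of the conversion involved, assuming the pow(c, 2.2) approximation that Babylon.js uses for its toLinearSpace/toGammaSpace helpers:

```javascript
const GAMMA = 2.2;
const toLinearSpace = (c) => Math.pow(c, GAMMA);
const toGammaSpace = (c) => Math.pow(c, 1 / GAMMA);

// A mid-grey clear color authored in gamma space...
const gammaGrey = 0.5;
// ...has to be converted when the scene renders in linear space,
const linearGrey = toLinearSpace(gammaGrey);
// so that the final linear -> gamma pass shows the same value on screen.
console.log(toGammaSpace(linearGrey).toFixed(3)); // "0.500"
```

Setting the clear color without this conversion is what produces the visible background shift after enabling the post process.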

Unfortunately, we cannot change this default for back-compat reasons. Also, please note that in post-process mode, the clear color will also get modified by your image processing configuration.

That said, in most scenes a skybox is present for the background, so it should have less of an impact in those cases.

Now, ImageProcessing has a kind of adaptive way to be used either as a post process or directly as part of the forward loop. This helps a lot in WebGL1 to prevent aliasing artifacts, as there is no MSAA on offscreen textures. Also, if the models are small and do not fill the full screen, it might even be more performant.

Basically changing the contrast and exposure is a global scene setup which can either be applied by the post process if present or directly in the forward loop depending on the use case.

So when setting it up in the UI, despite disabling the post process, you still have global values applied by the meshes themselves:

https://playground.babylonjs.com/#QZ8JBZ#1

You can actually see the setup in the inspector scene Material Image Processing section:

Tech-wise, the configuration is shared by default via scene.imageProcessingConfiguration, no matter if it is used through a post process or not.
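A simplified model of that sharing (a standalone sketch with hypothetical shapes, not the actual Babylon.js source): the scene owns one configuration object, and materials and post processes reference the same instance by default, which is why the values keep applying even when the post process goes away.

```javascript
class ImageProcessingConfiguration {
  constructor() {
    this.isEnabled = true;
    this.contrast = 1.0;
    this.exposure = 1.0;
  }
}

// The scene owns one configuration; materials and post processes point at it.
const scene = { imageProcessingConfiguration: new ImageProcessingConfiguration() };
const material = { imageProcessing: scene.imageProcessingConfiguration };
const postProcess = { imageProcessingConfiguration: scene.imageProcessingConfiguration };

// Cranking up contrast via the scene is visible everywhere...
scene.imageProcessingConfiguration.contrast = 2.0;
console.log(material.imageProcessing.contrast); // 2

// ...because it is literally the same object, so removing the post process
// does not reset anything the forward loop (the meshes) reads.
console.log(material.imageProcessing === postProcess.imageProcessingConfiguration); // true
```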

@sebavan: couldn’t we do something like this (the changes in my PR)?

I have checked that with those changes the use case from @rassie does work, but maybe it breaks something else…

In my understanding, the problem comes from the fact that when image processing is disabled, the ImageProcessingPostProcess instance is removed from the pipeline, so the #defines of the effect can’t be updated (they are updated by a call to ImageProcessingPostProcess.imageProcessingConfiguration.prepareDefines) and retain their values from when image processing was enabled.

The PR simply does not remove the ImageProcessingPostProcess instance, but instead sets the ImageProcessingPostProcess.imageProcessingConfiguration.isEnabled property to true/false depending on the imageProcessingEnabled property.
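A toy model of the bug described above (a standalone sketch; prepareDefines here is a stand-in for the real ImageProcessingConfiguration.prepareDefines, simplified to a single sub-effect):

```javascript
// Defines are only refreshed when prepareDefines runs; once the post process
// is removed from the pipeline, nothing calls it any more.
function prepareDefines(defines, config) {
  defines.CONTRAST = config.isEnabled && config.contrast !== 1;
  defines.IMAGEPROCESSING = defines.CONTRAST; // simplified: one sub-effect
}

const config = { isEnabled: true, contrast: 2 };
const defines = {};
prepareDefines(defines, config); // runs while the post process exists
console.log(defines.IMAGEPROCESSING); // true

// "Disable image processing" by removing the post process: prepareDefines is
// no longer called, so the defines keep their stale values.
config.contrast = 1; // user resets contrast in the UI
console.log(defines.IMAGEPROCESSING); // still true (stale)

// The PR's approach: keep the post process and toggle isEnabled instead,
// so prepareDefines runs again and the defines are refreshed.
config.isEnabled = false;
prepareDefines(defines, config);
console.log(defines.IMAGEPROCESSING); // false
```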

I see your point, but in this case the output might end up in linear space, as the scene would think there is a post process where it is actually disabled. I would prefer not to make the change in the pipeline, if you can revert it.

Maybe we could simply reset the values when we disable the post process in the inspector (scene.imageProcessingConfiguration = new …)? At least it would not be confusing from this part?

It’s not a problem only with the inspector.

Take this PG for eg: https://www.babylonjs-playground.com/#Y3C0HQ#388

I have added a “Enable image processing” checkbox that simply sets the imageProcessingEnabled property:

  • at start, uncheck the box
  • now change contrast / exposure / tone mapping => as expected, no change in the display
  • check the “Enable image processing” box => still no change, which is not right. You have to update the contrast/exposure/tone mapping values to trigger a redisplay
  • now that the display takes the values of contrast/exposure/tone mapping into account, uncheck the box again => no change in the display, which is not right either

If the changes from the PR are not ok, maybe the easiest thing would be to remove the imageProcessingEnabled property altogether, as it is not doing anything anyway: if users do not want to use the image processing effects, they just have to reset the values to their defaults.

Ok, I better understand the issue, thanks a lot!!!

Let’s go with your PR 😉


Actually wait…

It is because in this PG we access the post process configuration manually instead of the global one. Here is the working version: https://www.babylonjs-playground.com/#Y3C0HQ#390

I do not think we need the change anymore, any thoughts @Evgeni_Popov ?

I think the use of scene.imageProcessingConfiguration and defaultPipeline.imageProcessing is very confusing…

Which one should be used? How do they relate to each other?

If you disable image processing with scene.imageProcessingConfiguration.isEnabled = false in your PG, you end up with oversaturated colors. You also need to add defaultPipeline.imageProcessingEnabled = false to come back to the right colors…

The problem I think is that you have an image processing instance both in defaultPipeline and in the scene, and they both apply when you render the scene.

Does scene.imageProcessingConfiguration always exist? In that case, maybe defaultPipeline.imageProcessing should be removed?

Or the other way around, if using a defaultPipeline, is it possible to remove scene.imageProcessingConfiguration and drive the image processing effects through defaultPipeline?

So no matter if you use scene.imageProcessingConfiguration or material.imageProcessing… or renderingPipeline.imageProcessing… they all use by default the same one coming from the scene.

We can override them on demand for really particular cases but in practice in most cases the image processing step is identical no matter what object you work with.

I totally agree that it can be confusing and I wonder what is the best way to convey that.

Usually all the access should go through the scene one, and all the rest is here mainly for back compat 🙁

isEnabled is probably the worst part but removing it is a breaking change.

I am literally open to any idea to make it less confusing!

So, it seems the problem is to synchronize the scene.imageProcessingConfiguration.isEnabled property with defaultPipeline.imageProcessingEnabled: if one is changed, the other should be too.

I have updated the PR to do this, let me know what you think.

Note: it does fix the OP use case and the PGs you and I did.
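The synchronization could be sketched like this (a hypothetical standalone model; the property names mirror Babylon.js, but this is not the actual PR code): the pipeline flag simply forwards to the scene configuration, so the two can never disagree.

```javascript
class ImageProcessingConfiguration {
  constructor() { this.isEnabled = true; }
}

class DefaultRenderingPipeline {
  constructor(scene) { this._scene = scene; }
  // Forward both get and set to the scene configuration so that changing
  // either flag is visible through the other.
  get imageProcessingEnabled() {
    return this._scene.imageProcessingConfiguration.isEnabled;
  }
  set imageProcessingEnabled(value) {
    this._scene.imageProcessingConfiguration.isEnabled = value;
  }
}

const scene = { imageProcessingConfiguration: new ImageProcessingConfiguration() };
const pipeline = new DefaultRenderingPipeline(scene);

pipeline.imageProcessingEnabled = false;
console.log(scene.imageProcessingConfiguration.isEnabled); // false
scene.imageProcessingConfiguration.isEnabled = true;
console.log(pipeline.imageProcessingEnabled); // true
```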
