Enabling HDR in defaultPipeline disables AA

Attaching a defaultPipeline with hdr = true seems to disable antialiasing.

Full description
I believe this is a bug since the documentation for the hdr flag to defaultPipeline says:

The HDR value should be true as long as possible, unless you’re targetting cheap fallback for low end devices. This value allow one of the half float or float texture type, depending on the GPU. Also, some effects (like bloom) will be more accurate.

This very much suggests that you should always keep HDR on for the best image quality.
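For context, attaching the pipeline with the flag in question looks roughly like this (a minimal sketch assuming an existing Babylon.js `scene` with an active camera; the second constructor argument is the hdr flag):

```javascript
// Sketch: attach a DefaultRenderingPipeline with hdr enabled.
const pipeline = new BABYLON.DefaultRenderingPipeline(
    "default",            // name
    true,                 // hdr: use half-float/float render textures when supported
    scene,                // the scene to attach to
    [scene.activeCamera]  // cameras the pipeline applies to
);
```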

However, if you open the associated playground you will notice that the animated gizmo flickers horribly, at least for me on Windows 10 running Chrome and the new Edge browser.

Here is a screenshot of it with hdr turned on (default) and off (modified pg):
[screenshots: gizmo-hdr-on, gizmo-hdr-off]

I have also made my own more basic playground. Here are the results with hdr = true:

Here is the exact same playground with hdr = false:

Notice the drastic difference. This seems to affect the edges against the background (same result with a transparent clearColor, by the way), the edges against the plane, and also the intersection edge between the spheres, which ends up looking kinda funky.

I noticed in our code that after setting hdr = false, the “samples” slider in the Inspector for the defaultPipeline no longer affected the scene quality. Does the type of AA that samples controls depend on the hdr setting?

I thought that might be the reason and manually set samples = 4 in the two playgrounds above. In the playground from the documentation that mostly fixed the flickering edges, but the result is still slightly better with hdr disabled.

However, in my playground things got weird. It no longer reproduces by simply changing the hdr flag; sometimes it is broken even with it off until you rotate the camera, for example. It just feels glitchy there. Any clues as to what is going on?

However, when saving it with hdr = true and samples = 4 and then reloading it, AA appears to work every time, even with hdr on. But changing the values and pressing play causes it to get weird again.

PG with hdr on and samples = 4:

OK, so it turns out that this is perhaps a documentation issue rather than a bug. Setting hdr = true doesn’t seem to break AA; rather, with it set to false you get AA by default, while with it on you need to set samples to something like 4 to get about the same result. Is that correct?

That doesn’t explain our own code, though, where we have samples set to 4 but still get bad results on edges unless we disable hdr. This is maddening.

I still get the strange results in my PG, though, where things seem to break if you toggle the hdr setting on and off during runs. But perhaps that is some kind of issue with the PG environment?

When using hdr = true, the default rendering pipeline always uses post processes to render the different effects. That means the scene is rendered into an offscreen texture, and default AA does not apply to this texture; it applies only to the rendering framebuffer. If you want antialiasing in those offscreen textures you need to set a value for samples, which will enable MSAA.

When you use hdr = false, the default rendering pipeline does not use post processes if you only use image processing effects: in that case it renders directly to the framebuffer, so AA applies. Note, however, that as soon as you enable another effect (like bloom), the pipeline must switch to post-process rendering and you will lose AA even with hdr = false.
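Putting the explanation above into code, the fix for the hdr case is a one-liner (a sketch assuming an existing `scene`; `samples` is the DefaultRenderingPipeline property mentioned above):

```javascript
// With hdr = true the scene renders into an offscreen texture, so the
// default framebuffer AA never applies. Enable MSAA on the pipeline instead:
const pipeline = new BABYLON.DefaultRenderingPipeline("default", true, scene, scene.cameras);
pipeline.samples = 4; // MSAA sample count for the pipeline's render targets
```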

Big thanks for the details @Evgeni_Popov.

Regarding the fact that it renders to an offscreen texture: can that affect how the pixel values are composited when brought back? In our case (which I can’t reproduce in a simple PG) we get problems with:

  • A) Semi-transparent objects rendered against our transparent (0, 0, 0, 1) background get very odd color shifting when brightly lit (blue turns more cyan).
  • B) Semi-transparent objects get very dull / translucent against the transparent background, but drastically shift colors when moved over rendered opaque pixels.
  • C) Antialiasing of very bright materials fails and seems to render against a black background rather than a transparent one.

A & B are mostly solved / reduced / hidden either by setting hdr to false or by setting {premultipliedAlpha: false} when creating the Engine.

C sounds very much like we are getting a premultiplied-alpha render that is then composited using straight alpha, with the typical dark fringes.
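The dark-fringe symptom can be illustrated with plain numbers (not Babylon.js API, just the compositing arithmetic): take a white pixel with 50% coverage on an antialiased edge, composited over a black background.

```javascript
// A white edge pixel at 50% coverage, in the two alpha conventions:
const straight = { r: 1.0, a: 0.5 };                     // straight alpha
const premul   = { r: straight.r * straight.a, a: 0.5 }; // premultiplied: r = 0.5

// Correct "source over" compositing onto a black background (bg = 0):
const overStraight = straight.r * straight.a; // 0.5  (straight formula: r * a + bg * (1 - a))
const overPremul   = premul.r;                // 0.5  (premultiplied formula: r + bg * (1 - a))

// The bug: a premultiplied pixel composited with the straight-alpha formula
// gets multiplied by alpha a second time:
const doubleMultiplied = premul.r * premul.a; // 0.25 -> a visibly darker fringe
```

The edge pixel that should render at 0.5 brightness comes out at 0.25, which matches the dark halo around bright materials described in C.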

So the fact that hdr = true results in an offscreen texture, combined with the premultiplied-alpha-like symptoms showing up in A, B & C, sounds like a large piece of the puzzle. Any suggestions on where I could dig deeper, either by experimenting on our side or by reading / testing things in the Babylon source?

Honestly it’s hard to help on this matter without a PG, as all those transparency / blending issues can be a headache to debug…

Have you tried to use the ALPHA_PREMULTIPLIED or ALPHA_PREMULTIPLIED_PORTERDUFF blending values during your testing? It can help in some cases…
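Trying those blend modes is a one-line change per material (a sketch; `material` stands in for the transparent material in question, and `alphaMode` is the Material property the constants apply to):

```javascript
// Try the premultiplied blend modes on the affected material:
material.alphaMode = BABYLON.Engine.ALPHA_PREMULTIPLIED;
// or:
material.alphaMode = BABYLON.Engine.ALPHA_PREMULTIPLIED_PORTERDUFF;
```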

Yeah, tell me about it. :-/

That is why I have been working hard on trying to minimize it to a PG-based test case, but so far unsuccessfully. And since we are loading a custom format for the objects, materials and scene graph, we can’t use that code in the PG either, even though it is open source (to our partners, at least).

Haven’t tried ALPHA_PREMULTIPLIED_PORTERDUFF specifically before since it wasn’t exposed in the Inspector, but no difference there either.

The thing is that the models I know have this issue are not using texture mapping; they just use an albedo color (PBRMaterial) and an alpha value.

I recently learned why setting clearColor to, for example, new Color4(1, 0, 0, 0) produces very weird color tints on transparent objects: clearColor is defined in WebGL to be a premultiplied-alpha color. In other words, (r, g, b, 0) is invalid for all values other than r = g = b = 0, with undefined results.
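The premultiplied constraint can be made concrete with a small helper (plain illustration code, not part of the Babylon.js API): converting a straight-alpha color into a valid premultiplied one multiplies RGB by alpha, which shows why a "transparent red" cannot survive.

```javascript
// Convert a straight-alpha [r, g, b, a] color into premultiplied form:
function premultiply([r, g, b, a]) {
    return [r * a, g * a, b * a, a];
}

// A fully transparent red is not representable: alpha 0 forces RGB to 0,
// so (1, 0, 0, 0) is simply not a valid premultiplied color.
const transparentRed = premultiply([1, 0, 0, 0]);   // [0, 0, 0, 0]
const halfRed        = premultiply([1, 0, 0, 0.5]); // [0.5, 0, 0, 0.5]
```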

I was pondering whether the material shaders (or WebGL itself) also expect premultiplied color values in some places, like how they need toLinearSpace() for PBRMaterial. But that can’t be it, right? Because we don’t supply a Color4, we supply a Color3 plus an alpha value…?

The same actually goes for our textured materials that use opacity, since we always use an opacity texture rather than an alpha channel in the albedo texture. The engine can’t assume that the albedo texture is premultiplied with the opacity texture, right?

By the way, here is an example render of the strange color drift we get with `hdr` enabled. The leftmost one is using a defaultPipeline with hdr enabled and 4 samples of AA (the rest default settings). The middle one has hdr disabled in the defaultPipeline. The rightmost one is using our old rendering code in Three.js as a reference, which is also about how it looks in our other rendering engines and in the raytracer we use in our Windows application.

[screenshots: man_babylon_hdr-off, man_babylon_hdr-on, man_three-js]

If the odd colors ring any bells, then please let me know @Evgeni_Popov; otherwise I will just continue testing things and scratching my head until I either go bald or figure it out, whichever comes first. :wink:

The left one definitely seems to apply an additional gamma-space transformation compared to the other two pictures. Make sure your colors are all in linear space when feeding the PBR material. But it could also be a difference due to lighting, because the computation is not the same between PBR and default material…

Comparing the middle and the left, maybe the difference comes only from the fact that the PBR shaders differ between Babylon.js and Three.js. Or it could be the blending mode: try using ALPHA_SCREENMODE, it generally makes things brighter than ALPHA_COMBINE:

(left: combine, right: screenmode)

No pre-multiplying takes place in the standard / PBR material except if you use ALPHA_PREMULTIPLIED / ALPHA_PREMULTIPLIED_PORTERDUFF for the alphaMode property.

Even if you can’t share code, maybe you can share a live link? That could help looking for the problem.

Both the middle and the left one are using PBRMaterial, as we are only using PBR. We do use toLinearSpace() on the albedo colors; found that out the hard way as well, when we couldn’t match Three’s colors during the early stages of the switch to Babylon.

But one thing just came to mind. I have been assuming that the colors for PBRMaterial should always be in linear space, no matter the hdr setting on the DefaultPipeline. That is correct, right? The reason I’m asking is that you need to use gamma space for the clearColor if hdr is off, but linear space if hdr is on, right?

Like I did in this PG; otherwise the background color doesn’t match. How come that color needs to change between linear and gamma space but not the material’s colors? Are there any additional “gotchas” like that when switching hdr on and off?

That would be much appreciated. I will get back to you when we go live / preview with the switch to Babylon. I hope we can contribute things back in the future as thanks for all your assistance. =)

Yes, that’s right. You should see turning hdr on as a switch to use linear space for the whole pipeline; when it is off, we are in legacy mode. The only “glitch” is that even in legacy mode PBR expects its color inputs to be in linear space, because all the shader computations assume the colors are in that space.

In fact, the “right” way of doing things is to use hdr on + PBR materials and hdr off + standard materials: doing so makes everything smoother because you are either always in linear space or always in gamma space. Still, a user may mix PBR/non-PBR and hdr on/off, and we must support that in some way.
