How to Override Materials

Hi All,

I have a general question.

Say, for example, I want to write a shader that applies to all the meshes in the scene. What is the best way to do that?

I used an approach where, in the render target texture’s before-render callback, I go through its render list, save the old material of each mesh and apply a new one, then restore the originals in the after-render callback.
But I think this is costly if there are many meshes.

In three.js it is possible to set a property called scene.overrideMaterial, which bypasses the existing materials and applies the override material instead.
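
For reference, this is the three.js pattern I mean (a minimal sketch, assuming a normal material as the override):

  // three.js: while overrideMaterial is set, every mesh is drawn with it
  scene.overrideMaterial = new THREE.MeshNormalMaterial();
  renderer.render(scene, camera);
  scene.overrideMaterial = null; // back to the per-mesh materials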

Is it possible to do this in Babylon.js?

Thanks!

We have a different approach, but it will work as well: just go through your meshes and add an observer to mesh.onBeforeRenderObservable.

Inside this observer (which could be the same for all meshes, of course), just store the old material (so you can restore it during onAfterRenderObservable) and set the material to the one you want.
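
Something like this (a minimal sketch, assuming overrideMaterial is the material you want to force):

  scene.meshes.forEach((mesh) => {
      let savedMaterial = null;

      mesh.onBeforeRenderObservable.add(() => {
          // Keep the original material so it can be restored right after the draw
          savedMaterial = mesh.material;
          mesh.material = overrideMaterial;
      });

      mesh.onAfterRenderObservable.add(() => {
          mesh.material = savedMaterial;
      });
  });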

@Deltakosh, yeah, I am doing something similar now.
Thanks for your answer!

@spidersharma, could you post a sample of your code? I’m having the same issue.

Something like this:

this.renderTargetDepthNormal_.onBeforeRender = () => {
    // Swap in the override material for every mesh in the render list,
    // keeping the original material on the mesh so it can be restored afterwards.
    for (let index = 0; index < this.renderTargetDepthNormal_.renderList.length; index++) {
        const mesh = this.renderTargetDepthNormal_.renderList[index];
        (mesh as any).savedMaterial_ = mesh.material;
        mesh.material = this.depthMaterial_;
    }
};

this.renderTargetDepthNormal_.onAfterRender = () => {
    // Restore the original materials once the render target has been rendered.
    for (let index = 0; index < this.renderTargetDepthNormal_.renderList.length; index++) {
        const mesh = this.renderTargetDepthNormal_.renderList[index];
        mesh.material = (mesh as any).savedMaterial_;
    }
};

I see. So you have to render it to a texture; it’s not possible to do it as another pass of the post-processing system? I was thinking about something like this:

  const renderSceneBase = new BABYLON.PassPostProcess('Scene copy', 1.0, this.camera);

  const magicPass = new BABYLON.PassPostProcess('SomeMagic', 1.0, this.camera);
  magicPass.onApply = (e) => {
    /// Apply the shader on all meshes
    this.scene.meshes.forEach(function (m) {
      m.savedMaterial = m.material;
      m.material = magicMaterial;
    });
  };
  magicPass.onAfterRender = () => {
    this.scene.meshes.forEach(function (m) {
      m.material = m.savedMaterial;
    });
  };

  const finalPass = new BABYLON.PostProcess(
    'Final pass',
    'finalpass',
    [
       ....
    ],
    [
      'baseTexture',
      'magicTexture',
    ],
    1.0,
    this.camera,
    0,
    this.engine
  );
  finalPass.onApplyObservable.add((effect) => {
    effect.setTextureFromPostProcess('baseTexture', renderSceneBase);
    effect.setTextureFromPostProcess('magicTexture', magicPass);
  });

but this doesn’t work properly. The scene is rendered with the original materials. Something seems to be reset (if I comment out the after callback, it works, but then the original texture pass is rendered with the swapped materials, of course). Am I doing something wrong with the passes, or does this only work as a render to texture?
Thanks!

I created a sample project for overriding materials: https://www.babylonjs-playground.com/#BYYJ4A#2. I still can’t figure out what is wrong. Any ideas?

In your example you are changing the materials inside postProcess.onApply, but this is too late.

The post process gets the RTT used to render the scene before onApply runs. You have to change the materials in scene.onBeforeRender, for instance, when you want to use your post process.

In a nutshell: the scene is not re-rendered when using a post process; the existing render is reused.
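
For instance, something along these lines (a minimal sketch, assuming overrideMaterial is the replacement material):

  scene.onBeforeRenderObservable.add(() => {
      // Runs before the scene is rendered into the post process input texture
      scene.meshes.forEach((m) => {
          (m as any).savedMaterial = m.material;
          m.material = overrideMaterial;
      });
  });

  scene.onAfterRenderObservable.add(() => {
      // By now the post process chain has already consumed the render
      scene.meshes.forEach((m) => {
          m.material = (m as any).savedMaterial;
      });
  });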

I see. So besides using onBeforeRender, I have to render a pass to a texture and use it in the postprocess pipeline, inserting the RTT as a texture, similarly to what is done with the depth buffer in my sample, right?

Also, is it worth using a MultiRenderTarget for this, doing both the color pass and the overlay effect pass there, and feeding the post processing with the two textures?

I tried to find code samples for multipass rendering, but unfortunately couldn’t find any. Once I figure this out I’ll make a nice sample and post it. Thanks for all the help @Deltakosh!
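
For the record, I guess the render-to-texture part would look roughly like this (a minimal sketch; effectRTT is just my name for the render target that gets the swapped materials through its onBeforeRender / onAfterRender callbacks, as in my earlier code):

  const effectRTT = new BABYLON.RenderTargetTexture('effectRTT', 1024, this.scene);
  this.scene.customRenderTargets.push(effectRTT);

  // Render the same meshes into the texture; the material swap is hooked
  // to effectRTT.onBeforeRender / effectRTT.onAfterRender as shown earlier.
  this.scene.meshes.forEach((m) => effectRTT.renderList.push(m));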

Absolutely correct

But you cannot use a MultiRenderTarget, as you can’t change the material in that case.

Okay, I finally got it working: the effect is rendered to a RenderTargetTexture, and a post process mixes a PassPostProcess of the scene with that texture in another shader. I’ll post a playground demo later.
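
The mixing step looks roughly like this (a minimal sketch, not the final code; it assumes effectRTT is the render target holding the effect pass, as in the earlier sketch, and this.camera as before):

  // Fragment shader blending the normal scene with the effect texture.
  // textureSampler is bound automatically to the previous pass output.
  BABYLON.Effect.ShadersStore['mixFragmentShader'] = `
      precision highp float;
      varying vec2 vUV;
      uniform sampler2D textureSampler;
      uniform sampler2D effectTexture;
      void main(void) {
          vec4 base = texture2D(textureSampler, vUV);
          vec4 fx = texture2D(effectTexture, vUV);
          gl_FragColor = mix(base, fx, 0.5);
      }`;

  const mixPass = new BABYLON.PostProcess('mix', 'mix', [], ['effectTexture'], 1.0, this.camera);
  mixPass.onApplyObservable.add((effect) => {
      effect.setTexture('effectTexture', effectRTT);
  });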

Replacing the materials in the before/after callbacks makes the FPS drop by half, though. Perhaps it recompiles the shader or recreates the objects? Are there any optimizations to avoid that? I was considering making instances of the meshes, one set for the color scene and one for the effects scene; would that be faster?

Well, you render the scene twice :slight_smile: Instances will help for sure.

No, it’s not caused by the second render itself; it’s caused by the second render swapping the materials. Here’s how to reproduce it: https://www.babylonjs-playground.com/#BYYJ4A#5

Toggle changeTexture on the first line between true and false and see the difference in FPS. If needed, increase the number of spheres for a more noticeable effect. With changeTexture = false I get 60 FPS; with changeTexture = true I get ~40 FPS. The shader is trivial, and if you apply it to the objects statically the performance stays at 60 FPS.

Thanks for all the help, @Deltakosh. I’ll write a How to and send a PR to the Documentation on RenderTargetTexture + PostProcessing once I get all this working properly.

I see no difference on my computer. Can you run a profile on yours to see where the CPU time is spent?

The more spheres you have, the more noticeable the effect is; here’s a variation with 1k spheres: https://www.babylonjs-playground.com/#BYYJ4A#7.

Profiles from Chrome are attached. The same effect happens on Firefox. I’m running Linux, if that matters (who knows what drivers do). Nothing jumped out at me, except that the second render (the one to the framebuffer, not to the texture) is the one that takes longer.

noreplacematerial.zip (4.0 MB)
replacematerial.zip (3.1 MB)

OK, gotcha, this is definitely due to shader resync. It could be worth trying another way, as the two main contributors are linked to material preparation:

What about creating a clone of each sphere and alternating between the clone and the main one?

Extra tip: commenting out the onAfterRender callback makes the application run at 60 FPS, even with the onBeforeRender still running (versus ~22 FPS with the code uncommented).

So the problem really doesn’t seem to be linked to the RTT, but to something happening while restoring the original material for the second pass. I tried a renderTarget.renderList[i].material = shader2Material in onAfterRender, with a shader similar to the one in the code, and there seems to be no performance hit.

Perhaps it’s StandardMaterial that reprocesses something when it is reassigned?

Good that you found it, thanks :slight_smile: Is this something solvable, or is it difficult to fix properly?

Yes, I could clone as a workaround, although it’d consume more memory. I don’t even need to alternate then; I can just add the clones to the RTT.
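
Something like this, I suppose (a minimal sketch, assuming the depth material and render target from my earlier code; the layerMask keeps the clone out of the main camera render, while meshes in an explicit renderList should be rendered regardless of layerMask):

  // The clone permanently carries the override material and only shows up in the RTT
  const depthClone = sphere.clone('sphereDepth');
  depthClone.material = depthMaterial;
  depthClone.layerMask = 0x10000000;           // not rendered by the default camera
  renderTarget.renderList.push(depthClone);    // only the clone goes into the RTT

  // No onBeforeRender / onAfterRender material swap needed anymore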

Would instances work, i.e. the same object with a different material? Or would it not make any difference in the end?

Thanks a lot for all the patience.

Using mesh instances helps tremendously:

https://www.babylonjs-playground.com/#BYYJ4A#8

Back to 60 fps.

Note that you must change the material of the base mesh (the one the instances are created from; I called it sphereBase in the code). That’s why I break early in the onBeforeRender / onAfterRender loops: there’s no need to change the material of the instanced spheres, as it won’t be taken into account anyway.

For mesh instances, the final world matrix is reconstructed from 4 vectors passed as mesh attributes; that’s why you need an #ifdef INSTANCES check in the custom vertex shader.
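
Roughly, the relevant bits look like this (a minimal sketch, not the exact playground code; it assumes the override material is a ShaderMaterial, and the instancesDeclaration / instancesVertex includes provide the #ifdef INSTANCES branches mentioned above):

  // Vertex shader: finalWorld comes from the world0..world3 instance attributes
  // when INSTANCES is defined, and from the world uniform otherwise.
  BABYLON.Effect.ShadersStore['depthVertexShader'] = `
      precision highp float;
      attribute vec3 position;
      uniform mat4 viewProjection;
      #include<instancesDeclaration>
      void main(void) {
          #include<instancesVertex>
          gl_Position = viewProjection * finalWorld * vec4(position, 1.0);
      }`;

  // Placeholder output instead of the real effect, just to keep the sketch short
  BABYLON.Effect.ShadersStore['depthFragmentShader'] = `
      precision highp float;
      void main(void) {
          gl_FragColor = vec4(1.0, 0.0, 0.0, 1.0);
      }`;

  const depthMaterial = new BABYLON.ShaderMaterial('depth', scene, 'depth', {
      attributes: ['position'],
      uniforms: ['world', 'viewProjection'],
  });

  // Instances always render with the material of their source mesh,
  // so only sphereBase.material needs to be swapped in the RTT callbacks.
  const sphereBase = BABYLON.MeshBuilder.CreateSphere('sphereBase', { diameter: 1 }, scene);
  for (let i = 0; i < 1000; i++) {
      const inst = sphereBase.createInstance('sphere' + i);
      inst.position.x = (i % 10) - 5;
      inst.position.z = Math.floor(i / 10) - 50;
  }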

@Evgeni_Popov was faster than me. Instances are the key here, as you want to minimize the time spent by the CPU resyncing the materials.