Orthographic Camera Snapshot

I’m planning to take some snapshots with an orthographic camera and display them at the HTML level, but I’m running into a problem: the rendering seems to happen at the wrong time.

This is my orthographic camera configuration:

    this.snapShotCamera = new ArcRotateCamera('snapShotCamera', 0, 0, 1, 
     Vector3.Zero(), this.scene);
    this.snapShotCamera.mode = Camera.ORTHOGRAPHIC_CAMERA;
    this.snapShotCamera.radius = 1;
    this.snapShotCamera.minZ = 0;
    this.snapShotCamera.maxZ = 5;
    this.snapShotCamera.orthoLeft = -1;
    this.snapShotCamera.orthoRight = 1;
    this.snapShotCamera.orthoTop = 1;
    this.snapShotCamera.orthoBottom = -1;

This is the function that takes the snapshots:

    private materialSnapShot() {
      const { scene, snapShotCamera, sphere, assets } = this;
      for (let i = 0; i < materialList.length; i++) {
        if (i !== this.curIndex) {
          const item = materialList[i];
          const fboTexture = new RenderTargetTexture(`${item.name}-preview`, 256, scene, {
            generateMipMaps: false,
            type: Constants.TEXTURETYPE_HALF_FLOAT,
            samples: 8,
          });
          fboTexture.activeCamera = snapShotCamera;
          fboTexture.level = 0;
          fboTexture.clearColor = new Color4(0, 0, 0, 1);

          const cloneSphere = sphere!.clone(item.name);

          const sphereMaterial = new PBRMaterial('sphereMaterial', scene);
          sphereMaterial.metallic = 1;
          sphereMaterial.roughness = 1;
          sphereMaterial.metallicTexture = assets.get(item.ormTexture)?.data as Texture;
          sphereMaterial.albedoTexture = assets.get(item.albedoTexture)?.data as Texture;
          sphereMaterial.bumpTexture = assets.get(item.normalTexture)?.data as Texture;
          sphereMaterial.useRoughnessFromMetallicTextureAlpha = false;
          sphereMaterial.useRoughnessFromMetallicTextureGreen = true;
          sphereMaterial.useMetallnessFromMetallicTextureBlue = true;
          cloneSphere.material = sphereMaterial;
          fboTexture.renderList?.push(cloneSphere);
          fboTexture.render();
          cloneSphere.isVisible = false;
        }
      }
    }
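Since the end goal is to display the snapshots at the HTML level, here is one possible sketch of the read-back step: copy the render target's pixels into a 2D canvas and export a data URL for an `<img>` element. The helper names (`flipRowsVertically`, `rttToDataUrl`) are my own, not Babylon API; it also assumes an 8-bit RGBA read-back, so with the `TEXTURETYPE_HALF_FLOAT` target above you may get float data instead and need a conversion first.

```typescript
// Pure helper: reverse the row order of a tightly packed RGBA buffer.
// WebGL read-back is vertically flipped relative to canvas coordinates.
function flipRowsVertically(pixels: Uint8Array, width: number, height: number): Uint8Array {
  const rowSize = width * 4;
  const out = new Uint8Array(pixels.length);
  for (let y = 0; y < height; y++) {
    out.set(pixels.subarray(y * rowSize, (y + 1) * rowSize), (height - 1 - y) * rowSize);
  }
  return out;
}

// Browser-side part (assumes a DOM): draw the pixels into a canvas
// and export a data URL that can be assigned to an <img> src.
async function rttToDataUrl(
  rtt: { readPixels(): Promise<ArrayBufferView> | null },
  size = 256
): Promise<string> {
  const raw = (await rtt.readPixels()!) as Uint8Array;
  const flipped = flipRowsVertically(raw, size, size);
  const canvas = (globalThis as any).document.createElement("canvas");
  canvas.width = canvas.height = size;
  const ctx = canvas.getContext("2d");
  ctx.putImageData(
    new (globalThis as any).ImageData(new Uint8ClampedArray(flipped.buffer as ArrayBuffer), size, size),
    0,
    0
  );
  return canvas.toDataURL("image/png");
}
```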

My goal is to render just one frame, for performance, but I get a black texture. If I instead add the RenderTargetTexture to scene.customRenderTargets and keep the clone visible, rendering is handed over to Babylon and I get the snapshot I expect. Note that I call the snapshot function inside the scene.onReadyObservable callback, and everything works fine with a perspective camera. Am I overlooking something?

Can you share a PG? The onReady observable will make sure shaders and textures are loaded. Don’t forget the option to check the render targets as well:

Okay, here: https://playground.babylonjs.com/#VI4MD4#1

cc @Evgeni_Popov

@KallkaGo
The material needs to be defined before calling onReadyObservable; otherwise, textures and shaders are not loaded:

PG:

Even though the material is defined before onReadyObservable, I still don’t get the expected snapshot.

You must check that the render target is ready for rendering before calling render:
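A minimal sketch of that readiness check: defer `fboTexture.render()` until the target reports it is ready. The polling helper `runWhenReady` is my own generic, framework-free utility, not Babylon API; check `isReadyForRendering()` against your Babylon version's RenderTargetTexture documentation.

```typescript
// Poll a readiness predicate and run the action once it returns true.
function runWhenReady(isReady: () => boolean, action: () => void, intervalMs = 16): void {
  if (isReady()) {
    action();
    return;
  }
  setTimeout(() => runWhenReady(isReady, action, intervalMs), intervalMs);
}

// Usage sketch (assumes fboTexture is the RenderTargetTexture built above):
// runWhenReady(() => fboTexture.isReadyForRendering(), () => fboTexture.render());
```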


Thanks for the answer, this is very useful!