Convert JPEG or HDR to ENV

@Sonny_Cirasuolo yup you are fully right… I knew it was too easy… The current PG is not prefiltering, let me modify it and come back to you.


@sebavan Awesome, thanks. I applied the changes you made to environmentTextureTools in the PR, and now I am seeing a difference when I change gammaSpace to true. I also made sure to allow mipmap generation on the EquiRectangularCubeTexture. At this point the skyboxes are looking very similar and the lighting seems pretty close as well; I just seem to be losing some roughness/metallic data. Is it the prefiltering that is missing?

Attached below are two photos: one shows the hdr->env result, and the other, shinier one is jpg->env.
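
For reference, a minimal sketch of creating an EquiRectangularCubeTexture with mipmaps enabled and the gamma-space flag set, assuming the standard constructor order (url, scene, size, noMipmap, gammaSpace, onLoad, onError); the helper name is illustrative, not the exact code from this post:

    import { EquiRectangularCubeTexture, Scene } from '@babylonjs/core';

    // Sketch only: wrap the texture load in a promise so it can be awaited.
    const createEquirectCubeTexture = (scene: Scene, url: string, size = 1024) =>
        new Promise<EquiRectangularCubeTexture>((resolve, reject) => {
            const texture = new EquiRectangularCubeTexture(
                url,
                scene,
                size,
                false, // noMipmap: leave false so the mip chain is generated
                true,  // gammaSpace: the source JPG is gamma encoded
                () => resolve(texture),
                (message) => reject(new Error(message ?? 'Failed to load equirectangular texture'))
            );
        });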



Here is what would work, with the prefiltering: https://playground.babylonjs.com/#9Y5589#3

The useless lines can be removed after this PR: fix filtering texture output setup by sebavan · Pull Request #13662 · BabylonJS/Babylon.js · GitHub


@sebavan Thanks! I just tried this, and it seems the roughness/metallic issue is more pronounced after the prefiltering process. Here are two photos comparing a jpg before and after the prefilter.

Before prefiltering only some surfaces were too shiny; now everything looks like it has a glaze on it.



Can you repro in the playground?

If you look at the latest one I shared, the ground roughness is now OK. You need to keep both of these lines:

        // BOTH OF THE FOLLOWING LINES WILL BE USELESS AFTER THE PR :-)
        scene.environmentTexture.gammaSpace = false;
        scene.environmentTexture.lodGenerationScale = 0.8;

after prefiltering, until the next release (which will include the PR above).
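
For context, here is a minimal sketch of where those two lines sit when the generated .env is loaded back into a scene (assuming it is loaded with CubeTexture.CreateFromPrefilteredData):

    import { CubeTexture, Scene } from '@babylonjs/core';

    // Sketch only: load the generated .env and apply the temporary workaround.
    const applyEnvironment = (scene: Scene, envUrl: string) => {
        scene.environmentTexture = CubeTexture.CreateFromPrefilteredData(envUrl, scene);

        // Both lines become unnecessary once the PR above ships in a release.
        scene.environmentTexture.gammaSpace = false;
        scene.environmentTexture.lodGenerationScale = 0.8;
    };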

I will reproduce it in the playground when I have some time, but here is the function I am using to prefilter the jpg.

const prefilterEquirectangularCubeTextureAsync = async (texture: EquiRectangularCubeTexture, engine: ThinEngine) => {
    return await new Promise<EquiRectangularCubeTexture>((resolve, reject) => {
        const filtering = new HDRFiltering(engine);
        filtering.prefilter(texture, () => {
            texture.gammaSpace = false;
            texture.lodGenerationScale = 0.08;
            console.log('prefiltering done');
            resolve(texture);
        });
    });
};

and I modified my createEnvTextureFromJpg to this:

    public async createEnvTextureFromJpg(filepath: string, callback?: () => void) {
        //OLD
        console.log('888 createEnvTextureFromJpg', filepath);

        const jpgCubeTexture = await Materials.createEquiRectangularTexture(
            this._scene,
            filepath,
            1024,
            true //this is confusing -- needs to be set to true if you will be using it with PBR materials
        ).catch((e) => console.error(e));
        const envTextureArgs = {
            imageType: envExportImageTypes[this._envOptions.imageTypeIndex].imageType,
            imageQuality: this._envOptions.imageQuality,
        };
        if (!jpgCubeTexture) return;

        const prefilteredJpgCubeTexture = await Materials.prefilterEquirectangularCubeTextureAsync(
            jpgCubeTexture,
            this._engine
        );

        return EnvironmentTextureTools.CreateEnvTextureAsync(prefilteredJpgCubeTexture, envTextureArgs).then(
            (buffer: ArrayBuffer) => {
                const blob = new Blob([buffer], { type: 'octet/stream' });
                console.log('CreateEnvTextureAsync: ', buffer, blob);
                // TODO: DEBUGGING REMOVE THIS
                // Tools.Download(blob, `envTextBlob ${Date.now()}`);
                return blob;
            }
        );
    }
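
For completeness, a hedged usage sketch of the method above; the converter object and file names are placeholders, and Tools.Download is the same helper that is commented out inside the method:

    import { Tools } from '@babylonjs/core';

    // Hypothetical usage: "converter" stands in for whatever class
    // createEnvTextureFromJpg actually lives on in this project.
    const blob = await converter.createEnvTextureFromJpg('skybox.jpg');
    if (blob) {
        Tools.Download(blob, `skybox-${Date.now()}.env`);
    }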

You have 0.08 instead of 0.8 ???


Oh wow, thank you for catching that. It works great now. Thanks for all of your help; we should be launching this feature soon. I will drop a link when it launches, in case anyone wants to see AI skybox creation working.


Hi @sebavan, if I use EquiRectangularCubeTexture to convert JPG files to .env files, do I have to use HDRFiltering for the prefiltering? Could HDRFiltering be covered in the documentation so we know how to use it when we need it? Additionally, could the Babylon.js Texture Tools site support JPG? I think these functions could be integrated into it.

Why is _lodGenerationScale 0.8? Is there any basis for it? Is it a fixed value?

This only works with a cube texture. As for 0.8, it is somewhat arbitrary but fits with how we generate the data: we only use 80% of the texture's mip map chain, so on a 256 texture we only have meaningful data down to the 4*4 face size, keeping a bit of definition compared to a completely monochrome face.
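
To make that concrete, here is a small sketch of one way to read the 80% figure (my own back-of-the-envelope interpretation; the exact rounding inside Babylon.js may differ):

    // Rough estimate of the smallest mip face that still carries meaningful data
    // when only `lodGenerationScale` of the mip chain is used.
    const smallestMeaningfulFace = (faceSize: number, lodGenerationScale = 0.8): number => {
        const mipSteps = Math.log2(faceSize);                    // e.g. 256 -> 8 steps below the base level
        const unusedSteps = mipSteps * (1 - lodGenerationScale); // the last 20% of the chain is skipped
        return 2 ** Math.ceil(unusedSteps);                      // round up to a power-of-two face size
    };

    console.log(smallestMeaningfulFace(256)); // 4 -> the 4*4 face mentioned above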

What if it's a 512 texture? Or a 1024 texture?