@sebavan Okay so I have been working on this with @Evan_Frohlich. Here is where I am right now.
We have a pipeline working that converts HDRs into .env files locally on the client's machine before the result is uploaded and distributed to all of our users. The function looks like this:
public async createEnvTextureFromHDR(filepath: string, callback: () => void) {
  console.log('888 createEnvTextureFromHDR', filepath);
  const hdrCubeTexture = await Materials.createHDRCubeTextureAsync(
    this._engine,
    filepath,
    1024,  // resolution
    false, // gamma
    true,  // prefilterOnLoad
    true   // generateHarmonics
  ).catch((e) => console.error('createHDRCubeTextureError', e));
  const envTextureArgs = {
    imageType: envExportImageTypes[this._envOptions.imageTypeIndex].imageType,
    imageQuality: this._envOptions.imageQuality,
  };
  if (!hdrCubeTexture) return;
  return EnvironmentTextureTools.CreateEnvTextureAsync(hdrCubeTexture, envTextureArgs).then(
    (buffer: ArrayBuffer) => {
      const blob = new Blob([buffer], { type: 'application/octet-stream' });
      console.log('CreateEnvTextureAsync: ', buffer, blob);
      // TODO: DEBUGGING REMOVE THIS
      // Tools.Download(blob, `envTextBlob ${Date.now()}`);
      return blob;
    }
  );
}
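For sanity-checking the blob locally before uploading, something along these lines should work. This is a rough sketch and not part of the pipeline above; previewEnvBlob is just a name I made up, and the forced '.env' extension is needed because a blob URL has no file suffix:

import { CubeTexture, Scene } from '@babylonjs/core';

// Rough sketch: load the generated .env back into a scene to eyeball the result.
// forcedExtension is required because a blob URL has no ".env" suffix.
const previewEnvBlob = (scene: Scene, blob: Blob) => {
  const url = URL.createObjectURL(blob);
  const envTexture = CubeTexture.CreateFromPrefilteredData(url, scene, '.env');
  scene.environmentTexture = envTexture;
  scene.createDefaultSkybox(envTexture, true);
};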
You can see that we are using createHDRCubeTextureAsync, which is defined here.
const createHDRCubeTextureAsync = async (
  scene: Scene | ThinEngine,
  filepath: string,
  resolution = 512,
  gamma: boolean,
  prefilterOnLoad: boolean,
  generateHarmonics: boolean
) => {
  return await new Promise<HDRCubeTexture>((resolve, reject) => {
    const texture = new HDRCubeTexture(
      filepath,
      scene,
      resolution,
      true, // noMipmap
      generateHarmonics,
      gamma,
      prefilterOnLoad,
      () => resolve(texture),
      () => reject(new Error(`Failed to load HDR cube texture: ${filepath}`))
    );
  });
};
We are actually using a resolution of 1024 instead of 512 for the cube map so that we get better-looking results, and we are using WebP instead of PNG for the images. At 1024 resolution, the resulting .env file is much smaller with WebP (~0.5 MB) than with PNG (~8 MB).
If there are any downsides to using WebP, I would love to know.
The results I get from this seem to be on par with, or better than, what I get from the IBL texture tool, with a much smaller file size.
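For context, envExportImageTypes in the snippets above is just a small lookup on our side, not a Babylon type; the only part Babylon cares about is the imageType/imageQuality object handed to CreateEnvTextureAsync. A simplified sketch of it:

// Simplified sketch of our own lookup (not a Babylon API). "image/webp" and
// "image/png" are the image types CreateEnvTextureAsync can embed in the .env.
const envExportImageTypes = [
  { label: 'WebP', imageType: 'image/webp' },
  { label: 'PNG', imageType: 'image/png' },
];

// What actually gets passed to EnvironmentTextureTools.CreateEnvTextureAsync.
// imageQuality is a 0..1 value and only matters for lossy formats like WebP.
const envTextureArgs = {
  imageType: envExportImageTypes[0].imageType, // WebP -> ~0.5 MB at 1024 vs ~8 MB for PNG
  imageQuality: 0.9,
};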
The other part of the pipeline we are trying to solve is going straight from JPG to .env. To do this I am currently using these two functions:
public async createEnvTextureFromJpg(filepath: string, callback?: () => void) {
  console.log('888 createEnvTextureFromJpg', filepath);
  const jpgCubeTexture = await Materials.createEquiRectangularTexture(
    this._scene,
    filepath,
    1024, // resolution
    false // gamma
  ).catch((e) => console.error(e));
  const envTextureArgs = {
    imageType: envExportImageTypes[this._envOptions.imageTypeIndex].imageType,
    imageQuality: this._envOptions.imageQuality,
  };
  if (!jpgCubeTexture) return;
  return EnvironmentTextureTools.CreateEnvTextureAsync(jpgCubeTexture, envTextureArgs).then(
    (buffer: ArrayBuffer) => {
      const blob = new Blob([buffer], { type: 'application/octet-stream' });
      console.log('CreateEnvTextureAsync: ', buffer, blob);
      // TODO: DEBUGGING REMOVE THIS
      // Tools.Download(blob, `envTextBlob ${Date.now()}`);
      return blob;
    }
  );
}
With createEquiRectangularTexture defined here:
const createEquiRectangularTexture = async (scene: Scene, filepath: string, resolution = 512, gamma: boolean) => {
  return await new Promise<EquiRectangularCubeTexture>((resolve, reject) => {
    const texture = new EquiRectangularCubeTexture(
      filepath,
      scene,
      resolution,
      true, // noMipmap
      gamma,
      () => resolve(texture),
      () => reject(new Error(`Failed to load equirectangular cube texture: ${filepath}`))
    );
  });
};
The resulting .env file from this conversion seems to be missing some information and is similar to the results I was getting from the HDR conversion before I set prefilterOnLoad to true. Reflections come through and the skybox changes, but the roughness and metallic values on my materials behave as binary rather than as a spectrum (very shiny or very rough, with no in-between). The jpg → env version also takes much longer to load than the hdr → env one. The other thing I noticed is that specular anti-aliasing breaks with the non-prefiltered (jpg → env) version.
The obvious solution is to convert the JPG to an HDR before the conversion to the .env file begins, but if there is a way to avoid that and prefilter the EquiRectangularCubeTexture directly, I would love to know; a sketch of what I was considering trying is below. I would also love to know what exactly is happening behind the scenes, if anyone has any insight.
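The idea would be to run Babylon's HDRFiltering pass manually on the equirectangular cube texture once it has loaded, which is (as far as I understand) what HDRCubeTexture does internally when prefilterOnLoad is true. This is untested, and I have not confirmed that HDRFiltering behaves correctly on an EquiRectangularCubeTexture:

// Untested sketch: prefilter the cube texture manually with HDRFiltering,
// mirroring what HDRCubeTexture does internally when prefilterOnLoad is true.
import { HDRFiltering, ThinEngine, EquiRectangularCubeTexture } from '@babylonjs/core';

const prefilterCubeTexture = (engine: ThinEngine, texture: EquiRectangularCubeTexture) => {
  const filtering = new HDRFiltering(engine);
  return new Promise<EquiRectangularCubeTexture>((resolve) => {
    filtering.prefilter(texture, () => resolve(texture));
  });
};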
I saw the PR at the end of this thread, and if that is the solution, then great. I just wanted to post this so others can see where I ended up.
Photos for reference: the first two are hdr → env and the next two are jpg → env. Thanks!