Project: Progressively Loaded HDR Environments - To 16K and Beyond

Sebavan!

I’m very happy you replied and that you like this direction. While I am primarily focused on delivering a highly specialized product at my company, I am also convinced that some things are best done in the context of the community around babylon.js, and I would love to find ways to collaborate with it effectively.

Raising the overall fidelity of the experiences babylon.js makes possible is a goal best pursued as a community, since we all benefit from the improvements to our various specialized applications.

While I have your attention, let me step back and tell you where I intend to go with this.

What I am deliberately pioneering with our Bionic Trader line, beginning with a game we call TRADE, is what we might call “The Streaming App”.

At the heart is a framework I’ve built around babylon.js that I call GLSG (Global Liquidity Scene Graph).

I described it in my other post about TRADE.

GLSG allows me to define a Scene, which is a hierarchy of SceneElements. The Scene encapsulates a single babylon.js scene, and SceneElement extends BABYLON.TransformNode.

The reason I decided to “wrap” babylon’s internal scene graph like this becomes apparent when I tell you that each SceneElement can be bound to a SceneElementPresenter, which is updated with realtime data by one or more ActiveModels.

I’ve built this simple, extensible MVP framework focused on bringing multiple streams of high-frequency realtime data into the babylon.js world.
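To make that concrete, here is a simplified sketch of the shape I’m describing. It is not the actual GLSG code; the method names and generics are just illustrative:

```ts
// Simplified sketch of the GLSG shape, not the real implementation.
import { Scene as BabylonScene, TransformNode } from "@babylonjs/core";

// Source of realtime data (websocket ticks, order book deltas, ...).
abstract class ActiveModel<T> {
  abstract subscribe(onData: (data: T) => void): void;
}

// Holds the latest data snapshot for one visual element.
// An ActiveModel drives it: model.subscribe((d) => presenter.updateData(d));
class SceneElementPresenter<T> {
  public data?: T;
  public updateData(data: T): void {
    this.data = data;
  }
}

// A node in the wrapped scene graph, driven by its presenter each frame.
abstract class SceneElement extends TransformNode {
  protected presenter?: SceneElementPresenter<unknown>;
  public bind(presenter: SceneElementPresenter<unknown>): void {
    this.presenter = presenter;
  }
  public abstract onRender(): void; // push presenter state into meshes/materials
}

// Encapsulates a single babylon.js scene and its hierarchy of SceneElements.
class Scene {
  public elements: SceneElement[] = [];
  constructor(public readonly babylonScene: BabylonScene) {
    babylonScene.onBeforeRenderObservable.add(() => {
      this.elements.forEach((e) => e.onRender());
    });
  }
}
```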

TRADE is the first example. Consider why I call this a “streaming app”: the entire experience is rendered onto a single fullscreen canvas, and the JavaScript code sent to each player’s browser is good for nothing without the realtime data feeds we use to “drive” the experience. Without the on-demand loading of the assets we serve, you would not even be able to see anything in the scene, because we “stream” in the lighting.

The GLSG application is something like a domain-specific media player, where the “CODEC” decodes 3D meshes and associated assets, along with coded behaviors, instead of simply decoding colored pixels.

I believe this is the way forward, and that it offers major advantages to those who take hold of this model and extend it.

My ideas here around Progressively Loaded HDR Environments stem from a larger idea I have which illustrates the power of the streaming app paradigm.

https://hdrihaven.com/hdris/

Here we have this wonderful library of over 400 16K HDR Panoramas. Some are even 24K.

Let’s devise a workflow that processes this entire library into a cloud-hosted set of .PHE environments, so that my users might “subscribe” to different environments. The entire set could be browsed in real-time: imagine swiping left and right to see your entire PBR environment crossfade from one look to another (all we are doing is progressively loading the lower-res images in the set). See one you like, tap, and we load in the rest.
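On the client, the progressive part is simple. Here is a rough sketch; the URLs and resolution tiers are placeholders, and the real thing would crossfade rather than hard-swap:

```ts
// Sketch: step the scene's IBL environment through progressively larger .env
// files. File names and tier sizes are placeholders for the proposed .PHE set.
import { Scene, CubeTexture } from "@babylonjs/core";

const TIERS = [
  "phe/venice_sunset_256.env", // tiny preview, enough to light the scene
  "phe/venice_sunset_2k.env",
  "phe/venice_sunset_16k.env",
];

function loadEnv(url: string, scene: Scene): Promise<CubeTexture> {
  return new Promise((resolve, reject) => {
    const tex = new CubeTexture(
      url, scene,
      null,                  // extensions
      false,                 // noMipmap
      null,                  // files
      () => resolve(tex),    // onLoad
      (msg) => reject(msg),  // onError
      undefined,             // format
      true                   // prefiltered data (.env)
    );
  });
}

async function loadProgressiveEnvironment(scene: Scene): Promise<void> {
  for (const url of TIERS) {
    const tex = await loadEnv(url, scene);
    const previous = scene.environmentTexture;
    scene.environmentTexture = tex; // every PBR material sharpens one step
    previous?.dispose();
  }
}
```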

Do you see how this is such a huge advantage over publishers that must “ship” a DVD, for example, or who must “submit” all new apps and their content to an “app store”?

The main roadblock I have right now is that the .env creation process runs on a single thread in my local browser. We would need to port it to a Lambda function, and it would be good to make it multi-core: perhaps break a 16K image into 64 2K tiles and spin up 65 functions, 64 to process and 1 to stitch the result back to 16K.
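For reference, the browser-side conversion step I’m talking about is essentially the EnvironmentTextureTools.CreateEnvTextureAsync path; a minimal sketch of it (file names are placeholders) looks like this, and it is this piece that would move into a Lambda:

```ts
// Sketch of today's browser-side .env conversion step. Assumes
// "environment.dds" is a prefiltered cube map exported from Lys.
import { Engine, Scene, CubeTexture, EnvironmentTextureTools, Tools } from "@babylonjs/core";

async function convertToEnv(canvas: HTMLCanvasElement): Promise<void> {
  const engine = new Engine(canvas, false);
  const scene = new Scene(engine);

  // Load the prefiltered .dds produced by Lys.
  const dds = CubeTexture.CreateFromPrefilteredData("environment.dds", scene);
  await new Promise<void>((resolve) => scene.executeWhenReady(() => resolve()));

  // Pack it into Babylon's .env container (RGBD-encoded mips plus a JSON manifest).
  const buffer = await EnvironmentTextureTools.CreateEnvTextureAsync(dds);
  Tools.Download(new Blob([buffer]), "environment.env");
}
```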
Likewise with the .HDR to .DDS process using Lys. Depending on how their licensing works, I could automate this by making Lys run in the cloud automatically, processing from one cloud folder to another, where the .env converter would then run.

Also, I’m considering blurring the lower-resolution images so they provide adequate lighting without being blocky. That way blockiness is ruled out of the format entirely; the low-res tiers simply lack sharpness rather than showing visible blocks.

The dream scenario is that I press a button and, some hours later, I’ve got over 400 .PHE environments sitting in an S3 bucket, browsable in realtime PBR 3D.

I’ll start by implementing this first PHE locally, according to the plan I presented yesterday.

Oh, and it strikes me that we can consider this PHE to be a sort of “streaming mip-mapping.”

Cool.

PS: Related idea. What if .env could be the basis for the first practical 360-degree HDR streaming video codec? If we can cloud-automate all of this, it would be possible to render out high-res HDR videos as a series of HDR panos (V-Ray does this easily, for example), process each one into .env, and then encode the series of .env frames using H.265 or whichever codec best supports an alpha channel. The codec would see just a regular LDR video with an alpha channel. Then we only need shader code on the client to reconstruct the HDR environment for display in realtime, reading the light values from the alpha channel of the video. We map the resulting video to the skybox and we have a real animated HDR environment. Imagine an HDR-rendered sunset causing the lighting of your entire scene to change realistically in real-time.
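The reconstruction itself is just the RGBD trick that .env already uses. A simplified sketch is below; it ignores the gamma and range handling the real Babylon helpers perform, and in practice the decode would live in a fragment shader sampling each video frame:

```ts
// Simplified sketch of the RGBD encoding idea behind .env: the alpha channel
// stores a divisor, so LDR-friendly RGBA expands back to HDR. Not the exact
// Babylon scheme (which also handles gamma, quantization, and range limits).

// Encode a linear HDR color into an LDR-friendly RGBA; alpha stores the divisor.
function toRGBD(r: number, g: number, b: number): [number, number, number, number] {
  const maxChannel = Math.max(r, g, b, 1e-6);
  const d = Math.min(1, 1 / maxChannel); // scale everything into [0, 1]
  return [r * d, g * d, b * d, d];
}

// Decode back to linear HDR by dividing by the stored alpha.
function fromRGBD(r: number, g: number, b: number, a: number): [number, number, number] {
  return [r / a, g / a, b / a];
}
```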