3D Texture as Pseudo GI Light?

I’m doing an experiment where I take a reflection probe and move its position incrementally through a volume, then pass that cube texture to a shader that averages it down to one pixel for that location. I then read that value back and add it to my buffer.
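In rough Babylon-ish code, the loop I mean looks something like this (simplified sketch; the averaging here is a CPU readback stand-in for the shader that reduces the cube to one pixel, and `scene` is just my scene):

```ts
const res = 16;                                     // voxels per axis
const voxels = new Uint8Array(res * res * res * 4); // RGBA per voxel
const probe = new BABYLON.ReflectionProbe("giProbe", 32, scene);
probe.renderList = scene.meshes;

// Wait one frame so the probe has re-rendered at its new position.
const nextFrame = () =>
    new Promise<void>((resolve) => scene.onAfterRenderObservable.addOnce(() => resolve()));

// CPU stand-in for the "reduce the cube map to one pixel" shader.
async function averageCube(cube: BABYLON.RenderTargetTexture) {
    let r = 0, g = 0, b = 0, n = 0;
    for (let face = 0; face < 6; face++) {
        const px = (await cube.readPixels(face)) as Uint8Array;
        for (let i = 0; i < px.length; i += 4) { r += px[i]; g += px[i + 1]; b += px[i + 2]; n++; }
    }
    return { r: r / n, g: g / n, b: b / n };
}

async function captureVolume(min: BABYLON.Vector3, size: BABYLON.Vector3) {
    for (let z = 0; z < res; z++)
    for (let y = 0; y < res; y++)
    for (let x = 0; x < res; x++) {
        // Move the probe to the center of this voxel.
        probe.position.set(
            min.x + ((x + 0.5) / res) * size.x,
            min.y + ((y + 0.5) / res) * size.y,
            min.z + ((z + 0.5) / res) * size.z);
        await nextFrame();
        const c = await averageCube(probe.cubeTexture);
        const i = (x + y * res + z * res * res) * 4;
        voxels[i] = c.r; voxels[i + 1] = c.g; voxels[i + 2] = c.b; voxels[i + 3] = 255;
    }
}
```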

I have a prototype that is using a pseudo 3D texture, but I would like to use a real 3D buffer and sampler, as I think that would be easier to control.

The problems I am having with that are: one, it does not seem like the reflection probe sees fog, which I was hoping to use to make the emissive colors on meshes fall off during the baking. Two, there is not much information on visualizing 3D textures for debugging, and I was hoping that someone had an example somewhere. Basically, something like two planes I can slide through the volume that sample the 3D texture at that plane?

Oh, and is it possible to force more levels of mipmaps? Ideally I would just be able to do something like mip32 and reduce a 32x32 image to one pixel that way?
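(For reference, if my math is right, a full mip chain on a 32x32 face already bottoms out at 1x1, so really it's just the last mip level I'm after:)

```ts
// Full mip chain length for a square texture: log2(size) + 1 levels.
const size = 32;
const mipCount = Math.floor(Math.log2(size)) + 1; // 6 levels: 32, 16, 8, 4, 2, 1
const lastMip = mipCount - 1;                     // level 5 is the 1x1 mip
```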

We can see in this PG that the fog is taken into account in the probe cube textures:

With fog:
image

Without fog:
image

It seems there are different ways to visualize 3D textures:

I’m not sure that something exists for Babylon.js, though…


I’ve got a way to do the volume rendering, but WebGL does not like it. Maybe I’ll try to work up that plane slice example.

Must have done the fog wrong, thanks!

image

welp that was ez…

It seems that I’m not building or sampling the 3d texture correctly.

Essentially, after the bake process is done, the slice planes should display at the very least red in their centers, not what is currently showing. It also does not make sense to me what they are displaying, so I assume that either (one) I’m making the 3D texture incorrectly, or (two) I’m sampling it incorrectly afterward.
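For reference, this is roughly how I’m building the texture from the baked buffer (using the `voxels` / `res` names from my earlier sketch; the RawTexture3D parameter order here is from memory, so double-check it against the docs):

```ts
// Build the 3D texture from the baked voxel buffer.
// Assumed data layout: index = (x + y * res + z * res * res) * 4, RGBA bytes.
const voxelTexture = new BABYLON.RawTexture3D(
    voxels,                                 // Uint8Array of length res * res * res * 4
    res, res, res,                          // width, height, depth
    BABYLON.Engine.TEXTUREFORMAT_RGBA,
    scene,
    false,                                  // generateMipMaps
    false,                                  // invertY
    BABYLON.Texture.TRILINEAR_SAMPLINGMODE
);
voxelTexture.wrapU = BABYLON.Texture.CLAMP_ADDRESSMODE;
voxelTexture.wrapV = BABYLON.Texture.CLAMP_ADDRESSMODE;
voxelTexture.wrapR = BABYLON.Texture.CLAMP_ADDRESSMODE;
```

If the index math doesn’t match the layout the sampler expects (x fastest, then y, then z), the slices come out scrambled, which is exactly the kind of thing I suspect.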

The scene to bake is just a red emissive sphere and a black plane to block the red sphere from below while collecting the data.

The slices are there to try to debug the 3D texture after it’s created.
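The slice planes use a small ShaderMaterial that maps the fragment’s world position into 0..1 volume coordinates and samples the 3D texture there, roughly like this (WebGL2 only, since sampler3D needs GLSL ES 3.00; `voxelTexture` / `volumeMin` / `volumeSize` are my own names and the boilerplate may need tweaking):

```ts
BABYLON.Effect.ShadersStore["sliceVertexShader"] = `
precision highp float;
attribute vec3 position;
uniform mat4 world;
uniform mat4 worldViewProjection;
varying vec3 vWorldPos;
void main() {
    vWorldPos = (world * vec4(position, 1.0)).xyz;
    gl_Position = worldViewProjection * vec4(position, 1.0);
}`;

BABYLON.Effect.ShadersStore["sliceFragmentShader"] = `
precision highp float;
precision highp sampler3D;
uniform sampler3D voxelTex;
uniform vec3 volumeMin;   // world-space min corner of the baked volume
uniform vec3 volumeSize;  // world-space extents of the baked volume
varying vec3 vWorldPos;
void main() {
    // Map the fragment's world position into 0..1 volume coordinates.
    vec3 uvw = (vWorldPos - volumeMin) / volumeSize;
    gl_FragColor = vec4(texture(voxelTex, uvw).rgb, 1.0);
}`;

const sliceMat = new BABYLON.ShaderMaterial("sliceMat", scene, "slice", {
    attributes: ["position"],
    uniforms: ["world", "worldViewProjection", "volumeMin", "volumeSize"],
    samplers: ["voxelTex"],
});
sliceMat.setTexture("voxelTex", voxelTexture);
sliceMat.setVector3("volumeMin", new BABYLON.Vector3(-5, 0, -5));
sliceMat.setVector3("volumeSize", new BABYLON.Vector3(10, 10, 10));
sliceMat.backFaceCulling = false;

// A plane to drag through the volume (make one per axis if you want).
const slicePlane = BABYLON.MeshBuilder.CreatePlane("slice", { size: 10 }, scene);
slicePlane.material = sliceMat;
```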

If I can get this to line up, then I’ll be able to start doing runtime-baked pseudo GI for some scenes that would benefit from it, so I’m really interested in figuring this out.

Any advice would be appreciated.

https://playground.babylonjs.com/#8U8YXP#5 scene with some fixes

*Update: Never mind, got it. I forgot to make both the sphere and ground plane double-sided, so I was getting funky data.


This is pretty much the expected output. Now to apply it.

So it’s not the best method, but it’s not the worst either!

But this is going to be fun to use with my Doomish editor that I’ll be posting soon. It will be really cool to come up with some SDF methods to tailor and animate sections of the 3D texture, for things like glitching lights on walls.


@Evgeni_Popov can you think of a way to make this process faster other than multiple probes going at once?

Right now, if I set the 3D texture to something like 32x32x32, this takes around 10 minutes to complete, which is not ideal. Lower resolutions are quite fast, but the time factor goes up so quickly it’s kind of nuts.

I’m also wondering if it would be better to blast a ton of raycasts from my sample positions instead of using a probe, and have those raycasts bounce once and record the colors that the rays collide with.
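Something like this is what I have in mind (totally untested sketch; it only records the first hit, no bounce, and grabbing the emissive color off the hit material is obviously a big simplification compared to actually shading the hit point):

```ts
// Untested idea: sample one voxel by firing rays instead of rendering a probe.
function sampleVoxelWithRays(origin: BABYLON.Vector3, rayCount = 64, maxDist = 20): BABYLON.Color3 {
    let r = 0, g = 0, b = 0;
    for (let i = 0; i < rayCount; i++) {
        // Roughly random direction (good enough for a sketch).
        const dir = new BABYLON.Vector3(
            Math.random() * 2 - 1, Math.random() * 2 - 1, Math.random() * 2 - 1).normalize();
        const hit = scene.pickWithRay(new BABYLON.Ray(origin, dir, maxDist));
        if (hit?.hit && hit.pickedMesh?.material) {
            // Assumes a StandardMaterial with an emissiveColor on the hit mesh.
            const mat = hit.pickedMesh.material as BABYLON.StandardMaterial;
            const falloff = 1 - hit.distance / maxDist; // crude stand-in for the fog falloff
            r += mat.emissiveColor.r * falloff;
            g += mat.emissiveColor.g * falloff;
            b += mat.emissiveColor.b * falloff;
        }
    }
    return new BABYLON.Color3(r / rayCount, g / rayCount, b / rayCount);
}
```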

Honestly any input on this from another brain would be really nice.

If you are not limited by the GPU, you could do the processing several times in parallel, as they are all independent.

For example, have 4 probes and call CaptureEmissiveVolume for each probe.

You are currently limited by the browser, which updates the display only at 60fps. If you are able to fit 4 captures in one frame, you will cut the time by 4. Of course, that will depend on your GPU.
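Something along these lines, assuming your CaptureEmissiveVolume can be told which z-slab of the volume to fill (the signature here is just a guess, adapt it to your actual function):

```ts
// Split the volume into 4 z-slabs and bake them concurrently, one probe each.
// Assumes CaptureEmissiveVolume(probe, zStart, zEnd) fills only that slab of the buffer.
const probes = [0, 1, 2, 3].map((i) => {
    const p = new BABYLON.ReflectionProbe("giProbe" + i, 32, scene);
    p.renderList = scene.meshes;
    return p;
});

const slab = res / probes.length;
await Promise.all(
    probes.map((p, i) => CaptureEmissiveVolume(p, i * slab, (i + 1) * slab))
);
```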

That’s kinda what I thought was the only option too. OK cool, thanks for confirming that.