I have found myself with the makings of a GPU-based cubeTexture editor, which is pretty cool… I am hoping to extend some of its capabilities right now and then add “filters” and some more generators. Grabbing the probe's contents might save me a bunch of compilation time, because right now I render incrementally per face, and that is kind of heavy on some of the generators (it still takes only about 2-3 seconds to render six 2048px images with a huge number of cycles on the noise generators).
It would be really nice if I could just apply my shaders to a cube dedicated to a render probe that dumps its contents to a cubeTexture on request.
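A minimal sketch of that "dedicated probe" idea, assuming the Babylon.js `ReflectionProbe` API (`renderList`, `cubeTexture`, `refreshRate`); `shaderMat` stands in for whatever `ShaderMaterial` the generators produce, and `makeGeneratorProbe` and `faceByteLength` are made-up helper names:

```javascript
// Sketch: dedicate a box to a ReflectionProbe so the procedural shader
// renders straight into the probe's internal cubeTexture.

const PROBE_SIZE = 2048; // one face edge, in pixels

// Pure helper: bytes needed to read back one RGBA8 face,
// handy for sizing readPixels buffers.
function faceByteLength(size) {
  return size * size * 4; // width * height * 4 channels
}

function makeGeneratorProbe(scene, shaderMat) {
  // Box that exists only to be rendered by the probe's six internal cameras.
  // In practice you would move it out of the main camera's view (or use
  // layer masks) so it never shows up in the regular render.
  const box = BABYLON.MeshBuilder.CreateBox("probeBox", { size: 1 }, scene);
  box.material = shaderMat;

  const probe = new BABYLON.ReflectionProbe("genProbe", PROBE_SIZE, scene);
  probe.renderList.push(box);
  // Render all six faces once, on demand, instead of every frame.
  probe.refreshRate = BABYLON.RenderTargetTexture.REFRESHRATE_RENDER_ONCE;

  // probe.cubeTexture is a RenderTargetTexture: it can be assigned as a
  // reflection/environment texture, or read back face by face.
  return probe;
}
```

Setting the refresh rate to render-once is what would make this "on request" rather than a per-frame cost.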
@sebavan so what you are saying is to try rtt.readPixels? I think I was able to get the blob to come up with the DumpFramebuffer tool, but I am not sure how to convert that to a cubeTexture. Is this where RawCubeTexture comes into play?
readPixels reads the data back from a probe into a native array, and yes, you could then use a RawCubeTexture to create a new cube texture from the modified RGBA array values.
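A rough sketch of that round trip, assuming `RenderTargetTexture.readPixels(faceIndex)` returns the face as RGBA8 data and assuming the `RawCubeTexture` constructor shape shown below; `tintChannel` is a made-up example filter and `probeToRawCubeTexture` a hypothetical helper name:

```javascript
// Pure helper: scale one RGBA channel in place (0 = R, 1 = G, 2 = B, 3 = A).
// Stands in for whatever "filter" you want to run on the CPU side.
function tintChannel(rgba, channel, factor) {
  for (let i = channel; i < rgba.length; i += 4) {
    rgba[i] = Math.min(255, Math.round(rgba[i] * factor));
  }
  return rgba;
}

async function probeToRawCubeTexture(scene, probe, size) {
  const faces = [];
  for (let face = 0; face < 6; face++) {
    // Pull one face back from the GPU as a typed array of RGBA values.
    const data = await probe.cubeTexture.readPixels(face);
    // Example modification: boost the red channel by 20%.
    faces.push(tintChannel(new Uint8Array(data.buffer), 0, 1.2));
  }
  // Rebuild a cube texture from the six modified faces.
  return new BABYLON.RawCubeTexture(
    scene,
    faces,
    size,
    BABYLON.Engine.TEXTUREFORMAT_RGBA,
    BABYLON.Engine.TEXTURETYPE_UNSIGNED_BYTE
  );
}
```

The resulting RawCubeTexture is independent of the probe, so the probe can keep re-rendering without touching the snapshot.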
You can take a look at the environment texture tools, where we recreate cube textures from other cube textures by using a post process in between.
I dropped all the face color stuff, and it still did it. Then I dropped the emissiveColor and not the face colors, and it still did it. Then I dropped both and it stopped. Odd; this does not seem like normal behavior.