Dealing with 3D volumes and RenderTargetTexture

I’ve hit a brick wall here!

I want to apply a custom filter to a 3D volume once, and then use the processed 3D volume in another shader where I do more processing. I figured out that this can be done using RenderTargetTexture, but something is going wrong.

Here is a playground to illustrate what I’m trying to do:

Note: the first shader is not exactly what I have; I just want to know how to pass a filtered volume to another shader correctly.

RenderTargetTexture does not support rendering to a 3D texture, only to a single texture, a 2DArray, or a cube texture: the layers parameter is the number of textures in the 2DArray.
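Since the render target ends up as a 2DArray rather than a true 3D texture, the second shader has to sample it as a sampler2DArray, picking a layer from the normalized depth coordinate instead of letting sampler3D interpolate across slices. A minimal sketch of that mapping (the helper name is mine, not a Babylon.js API):

```javascript
// Map a normalized depth coordinate w in [0, 1] to a layer index of a
// 2DArray texture with `layers` slices (nearest slice, no interpolation).
// Note: unlike sampler3D, a sampler2DArray does not filter across layers.
function layerForDepth(w, layers) {
  // Clamp so w === 1.0 still lands on the last layer.
  return Math.min(layers - 1, Math.max(0, Math.floor(w * layers)));
}
```

For example, with a 64-layer target, w = 0.5 selects layer 32; the GLSL side would do the same computation before calling `texture(volumeArray, vec3(uv, layer))`.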

The MultiRenderTarget extended support PR adds support for rendering to multiple slices of a 2DArray or of a 3D texture at once: you have to bind the slice(s) of the 2DArray/3D texture you want to render to; you can’t render to the whole 2DArray/3D texture. However, rendering to a slice of a 3D texture is currently bugged in ANGLE (so in Chrome, Firefox, and Safari at least), and it only works if you choose OpenGL as the backend, which is not the default setting.
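In raw WebGL2 terms, “bind the slice you want to render to” corresponds to `gl.framebufferTextureLayer`: a whole 3D texture cannot be attached as a color target, only one z-slice at a time, so filtering a full volume means looping over slices. A sketch of that loop (`framebufferTextureLayer` is the real WebGL2 call; the surrounding function and parameter names are mine), presumably the path the ANGLE bug above affects:

```javascript
// Attach each z-slice of a 3D texture as the color target in turn,
// issuing one draw call per slice. This is the mechanism that the
// MultiRenderTarget PR wraps for you.
function renderToVolumeSlices(gl, fbo, tex3d, depth, drawSlice) {
  gl.bindFramebuffer(gl.DRAW_FRAMEBUFFER, fbo);
  for (let slice = 0; slice < depth; ++slice) {
    // Attach slice `slice` (mip level 0) of the 3D texture as COLOR_ATTACHMENT0.
    gl.framebufferTextureLayer(gl.DRAW_FRAMEBUFFER, gl.COLOR_ATTACHMENT0,
                               tex3d, 0, slice);
    drawSlice(slice); // run the filter pass for this slice
  }
}
```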

To make sure I understand: I have to replace the RenderTargetTexture object with a MultiRenderTarget object, adjust its settings accordingly, and then use the MultiRenderTarget object as the sampler3D texture in the second shader. Is that right?

Here’s a PG that should explain how to use the new feature, but it will only work after the PR is merged:
