Creating a 3D RenderTargetTexture

Is it possible to create a 3D RenderTargetTexture?

The closest I can get is creating a 2D texture array by specifying the number of “layers” in the size parameter of RenderTargetTexture’s constructor.

Welcome aboard!

Rendering to 3D textures is not currently supported. You can indeed use a 2D texture array instead.

Thanks @Evgeni_Popov, follow up question then: How can I detect in the (frag) shader which layer of the array I’m rendering to?

You will have to pass this information yourself to your shader(s). You can use the renderTargetTexture.onBeforeRenderObservable notification to know the layer index:

renderTargetTexture.onBeforeRenderObservable.add((layer) => {
    // `layer` is the 0-based index of the texture in the texture array
});

Curious. I didn’t know the onBeforeRenderObservable passed any arguments to subs - TIL!

And when rendering to a cube, you get the face index through the notification.

I wonder if @erichlof might find this useful in his real-time ray trace stuff?

This is a cool feature! I hadn’t thought of using it in this way, thanks for the ping!

Normally the reverse situation is encountered when ray tracing (actually ray marching) through a 3D volume. One can imagine ray marching through an atmospheric cloud, or rendering medical data by ray marching through successive 2D layers of scans of the human patient (maybe MRI or head scan). The data would either be stored as a 3D texture, or as many 2D textures stacked on top of each other in a 2D texture array.
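To make the ray-marching idea above concrete, here is a minimal CPU-side sketch (plain JavaScript, no Babylon.js) of front-to-back compositing through a volume stored as stacked 2D layers. The function name and the toy data layout are illustrative only; a real implementation would sample a texture array in a shader.

```javascript
// Toy sketch: march a single ray through a volume stored as stacked 2D
// layers (e.g. scan slices). `volume` is layer -> row -> column of density;
// we sample a fixed (x, y) for simplicity and composite front-to-back.
function marchRay(volume, steps) {
  const depth = volume.length;     // number of 2D layers in the stack
  let transmittance = 1;           // fraction of light still unabsorbed
  let accumulated = 0;             // opacity accumulated along the ray
  for (let i = 0; i < steps; i++) {
    const w = (i + 0.5) / steps;   // normalized depth along the ray
    const layer = Math.min(Math.floor(w * depth), depth - 1);
    const density = volume[layer][0][0];
    const absorbed = 1 - Math.exp(-density / steps);
    accumulated += transmittance * absorbed;
    transmittance *= 1 - absorbed;
  }
  return accumulated;
}
```

With a uniform density of 1, this converges to the analytic result 1 − e⁻¹, which is a handy sanity check for the compositing loop.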

Volumetric path tracing for medical imaging

Around the 1:30 time mark, you can see how the data is saved as an array of 2D textures or ‘layers’, which is then path traced (ray marched) through on the GPU. But as I mentioned, the reverse process is happening here - the data is already saved as a 2D texture array or 3D texture, which is then accessed in the shader when rendering. What the OP was asking about on this topic, if I’m not mistaken, is actually writing to the 3D texture (or 2D texture array as backup). But that might still be useful somehow! :slight_smile:


Thanks for the in-depth and thoughtful response! Always learn something from your posts :grin::sunglasses:

@Evgeni_Popov, I got this working using texture arrays, but could the “3D RenderTargetTexture” idea be translated to a feature request?

It’s unclear if it is actually feasible in WebGL without geometry shaders. If the 3D texture’s layers can be bound and rendered to separately (like a texture array), then I suppose it could work? If so, the upshot would be saving some instructions in the fragment shader, because the array approach requires additional shader code to explicitly blend between layers.
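To illustrate the extra work mentioned above, here is a minimal CPU-side sketch (plain JavaScript) of the blend a hardware-filtered 3D texture would give you for free: given a normalized depth coordinate, find the two layers to sample and the mix factor between them. The function name is hypothetical; in a shader this would be done per-fragment before two `texture()` fetches and a `mix()`.

```javascript
// Manual "3D" sampling over a 2D texture array: for a normalized depth
// coordinate w in [0, 1], return the two layer indices to sample and the
// interpolation factor t between them (texel-centered, clamped to edges).
function layerBlend(w, layerCount) {
  const z = Math.min(Math.max(w * layerCount - 0.5, 0), layerCount - 1);
  const lo = Math.floor(z);               // nearer layer
  const hi = Math.min(lo + 1, layerCount - 1); // farther layer
  return { lo, hi, t: z - lo };           // blend: mix(sample(lo), sample(hi), t)
}
```

For example, with 4 layers a depth of 0.5 lands exactly between layers 1 and 2 (t = 0.5), while depths at the edges clamp to the first or last layer.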

Adding @sebavan / @Deltakosh for the possibility to add this as a feature request (I don’t know if it is supported by WebGL).

Yup, rendering to a 3D texture should be possible in Babylon.js, and you would use the exact same code as for a texture array. Indeed, each slice would be indexed the same way, and fetching would be more efficient.

Anybody want to create a PR for it?

It looks like there are no takers, so I’ll give it a shot before the end of the week :slight_smile: