Texture3D custom node in NME

Is it possible to build a custom node for NME that can read from a Texture3D?

I would like to be able to supply a 3D texture generated in TypeScript, and then have a node much like the current Texture one, except with a Vector3 uvw input.
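On the TypeScript side, something like this is what I mean (a minimal sketch using RawTexture3D, which Babylon.js already exposes for building 3D textures from raw data; the NME block to sample it with a uvw input is the missing piece):

```typescript
import { Constants, RawTexture3D, Scene, Texture } from "@babylonjs/core";

// Sketch: build an 8x8x8 RGBA volume on the CPU and upload it as a 3D texture.
// What's missing is an NME block that samples it with a Vector3 uvw input.
function createLightfieldTexture(scene: Scene): RawTexture3D {
    const size = 8;
    const data = new Uint8Array(size * size * size * 4);
    for (let z = 0; z < size; z++) {
        for (let y = 0; y < size; y++) {
            for (let x = 0; x < size; x++) {
                const i = 4 * (x + size * (y + size * z));
                data[i + 0] = (x / (size - 1)) * 255; // r encodes u
                data[i + 1] = (y / (size - 1)) * 255; // g encodes v
                data[i + 2] = (z / (size - 1)) * 255; // b encodes w
                data[i + 3] = 255;
            }
        }
    }
    return new RawTexture3D(
        data, size, size, size,
        Constants.TEXTUREFORMAT_RGBA, scene,
        false, false, Texture.TRILINEAR_SAMPLINGMODE
    );
}
```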

My use case is loading voxel “lightfields”. If you’ve ever played Minecraft, you might be familiar with its fairly primitive but effective flood-fill volumetric lighting. I’m attempting to create something similar, but applied to meshes that aren’t necessarily voxels themselves. I’m really keen to use the NME to build this material rather than going full-on custom shader.

I see that there has been some discussion about adding a built-in Texture3D block, which would definitely be preferred for my use case, but since we now have custom node support in Babylon.js, I’m curious whether I can just add this myself.

Before I waste too much time diving in and giving it a go (trial-and-error style), can anyone give me any pointers on getting started, or a reason this might not be possible with a custom JSON node?

Cheers!

Hey there! Adding @Evgeni_Popov and @sebavan to the topic (Seb is OOF, so it might take him a while to see this).

You won’t be able to use a custom block, because there is no way to pass a sampler3D to it: we don’t support sampler3D yet.
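To make the limitation concrete, a custom block is defined by a JSON description along these lines (a rough sketch; the field names follow the documented custom-block format, but double-check them against the docs). The parameter types are all value types, so there is nothing like a sampler3D to declare:

```typescript
// Rough shape of a custom block definition (sketch; verify against the docs).
// inParameters/outParameters only accept value types (Float, Vector2/3/4,
// Color3/4, Matrix...), so a sampler3D input simply cannot be declared.
const sampleUvwBlock = {
    name: "SampleUVW",
    comments: "Would sample a 3D texture at uvw - not currently possible",
    target: "Fragment",
    inParameters: [{ name: "uvw", type: "Vector3" }],
    outParameters: [{ name: "result", type: "Color4" }],
    functionName: "sampleUVW",
    code: [
        "void sampleUVW(vec3 uvw, out vec4 result) {",
        "    result = vec4(uvw, 1.0); // placeholder: no sampler3D to read from",
        "}"
    ]
};
```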

I think adding a Texture3D block would be a good idea! However, I don’t know how we could import data into it statically (meaning inside the NME, as we do for 2D textures), as I don’t think there is a 3D/volumetric picture file format… You would only be able to set data programmatically, which restricts the value of the editor a bit, because you would not be able to see the result of the material in the preview…

Thanks for the info!

I agree that the direct value of Texture3D in the NME editor is limited; the advantage for my use case is interoperability with the rest of the node workflow. A lot of what I want to build can be achieved much more easily in the node editor than in a hand-written shader (for example, I’m still keen to use the standard light node, just multiplied and manipulated by the 3D texture).

Might try prototyping this by packing the volume into a flattened 2D texture and seeing how it feels. The big loss is the lack of hardware interpolation when sampling along w, but I might be able to work around that with something like the sketch below; we’ll see.
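The lookup math I have in mind is roughly this (sketched in TypeScript for clarity; in the material it would be the same arithmetic in shader code). Fetching two adjacent slices and mixing them is the manual stand-in for hardware interpolation along w:

```typescript
// Map a uvw lookup into a 2D atlas where the volume's depth slices are laid
// out side by side along the x axis (sketch; the layout choice is arbitrary).
function uvwToAtlas(u: number, v: number, w: number, depth: number) {
    // Continuous slice coordinate along w, clamped to the valid slice range.
    const slice = Math.min(Math.max(w * depth - 0.5, 0), depth - 1);
    const slice0 = Math.floor(slice);
    const slice1 = Math.min(slice0 + 1, depth - 1);
    const t = slice - slice0; // blend factor replacing hardware w-interpolation

    // Each slice occupies a 1/depth-wide column of the atlas.
    const uv0 = { u: (slice0 + u) / depth, v };
    const uv1 = { u: (slice1 + u) / depth, v };
    return { uv0, uv1, t }; // fetch both, then mix(sample0, sample1, t)
}
```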

Edit: also, regarding the preview, when the node editor is used from the Inspector the results can be previewed in the Playground, so that provides a pretty decent way to test.

I’m tempted to say the primary use case is not static files but a 3D texture generated in TypeScript, with mipmaps generated afterwards? Maybe this is more about providing a TypeScript callback that iterates over the 3D texture (uvw) and returns an RGBA value? Most of the utility here is in generating tiled 3D noises, right? When WebGPU becomes available, this could be done faster using WGSL?
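Something like this, say (a hypothetical helper just to make the callback idea concrete; the names and signature are made up):

```typescript
// Hypothetical helper: fill a 3D RGBA volume from a per-voxel callback.
type VoxelCallback = (u: number, v: number, w: number) => [number, number, number, number];

function fillVolume(size: number, fn: VoxelCallback): Uint8Array {
    const data = new Uint8Array(size * size * size * 4);
    for (let z = 0; z < size; z++) {
        for (let y = 0; y < size; y++) {
            for (let x = 0; x < size; x++) {
                // Sample at voxel centers, normalized to [0, 1).
                const [r, g, b, a] = fn((x + 0.5) / size, (y + 0.5) / size, (z + 0.5) / size);
                data.set([r * 255, g * 255, b * 255, a * 255], 4 * (x + size * (y + size * z)));
            }
        }
    }
    return data;
}

// e.g. a cheap tileable pattern; the result can feed RawTexture3D as above.
const volume = fillVolume(32, (u, v, w) => {
    const n = 0.5 + 0.5 * Math.sin(2 * Math.PI * u) * Math.sin(2 * Math.PI * v) * Math.sin(2 * Math.PI * w);
    return [n, n, n, 1];
});
```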

Yes, most usages will probably generate the 3D texture by some programmatic means.

WebGPU won’t change anything for this one: 3D textures are already supported, so there’s no specific added value from WebGPU. Maybe you were thinking of compute shaders? In that case, yes, they could be used to generate the 3D texture on the GPU with more freedom than doing it with fragment shaders.

Maybe you were thinking of compute shaders? In that case, yes, they could be used to generate the 3D texture on the GPU with more freedom than doing it with fragment shaders.

Yes, I was thinking a compute shader might be faster at generating the texture in GPU memory than the CPU. Not a big deal; I probably created confusion by mentioning WebGPU, so ignore that part.
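For completeness, the compute route could look roughly like this (a heavily hedged sketch: it assumes a recent Babylon.js where RawTexture3D accepts a storage creation flag and ComputeShader can bind it as a storage texture, which is worth verifying before relying on it):

```typescript
import { ComputeShader, Constants, RawTexture3D, Scene, Texture } from "@babylonjs/core";

// WGSL that writes one value per voxel of a 3D storage texture.
const wgsl = /* wgsl */ `
@group(0) @binding(0) var dst : texture_storage_3d<rgba8unorm, write>;

@compute @workgroup_size(4, 4, 4)
fn main(@builtin(global_invocation_id) id : vec3<u32>) {
    let dims = textureDimensions(dst);
    if (any(id >= dims)) { return; }
    let uvw = vec3<f32>(id) / vec3<f32>(dims);
    textureStore(dst, vec3<i32>(id), vec4<f32>(uvw, 1.0)); // placeholder pattern
}`;

function generateVolumeOnGpu(scene: Scene, size: number): RawTexture3D {
    // Zeroed initial data; the compute pass overwrites it. Assumption: the
    // trailing creationFlags argument marks the texture as storage-capable.
    const tex = new RawTexture3D(
        new Uint8Array(size * size * size * 4), size, size, size,
        Constants.TEXTUREFORMAT_RGBA, scene,
        false, false, Texture.TRILINEAR_SAMPLINGMODE,
        Constants.TEXTURETYPE_UNSIGNED_BYTE,
        Constants.TEXTURE_CREATIONFLAG_STORAGE
    );
    const cs = new ComputeShader("gen3d", scene.getEngine(), { computeSource: wgsl }, {
        bindingsMapping: { dst: { group: 0, binding: 0 } },
    });
    cs.setStorageTexture("dst", tex);
    cs.dispatchWhenReady(size / 4, size / 4, size / 4);
    return tex;
}
```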