NodeMaterialEditor and epsilon

With respect to avoiding texture edges/boundaries, what should I be using for epsilon?

This is a tricky question. It depends :slight_smile:
Can you share a NME where you need an epsilon?

This is a pretty iterative operation

I’m not skilled enough “yet” to share an example, but I can try to describe one example of what I’m thinking.

I realize Babylon doesn’t support 3D volume textures; is this because iOS devices don’t support them? I’ve been trying to think of a workaround, and here’s my idea: generate a 3D volume texture, likely a cube containing periodic Worley noise, and load it into Babylon as a 2D texture. For example, a 16x16x16 3D volume texture would translate to a 64x64 2D texture. Using this 2D texture, and leveraging uv/uv2 to create a uvw, I use the z/w coordinate to find the two adjacent 16x16 tiles, sample them both, and lerp between the two values.

I’m pretty confident that if the sub-tiles in the 2D texture are sampled using a straight modulo to get the sub-tile coordinates, there will be artifacts. Instead, I’m thinking the uv coordinates need to be offset by “epsilon” towards the center of the tile to avoid visual artifacts/seams in the rendering.
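A minimal sketch of that tile-atlas sampling, assuming a 16x16x16 volume packed as a 4x4 grid of 16x16 tiles in a 64x64 texture. All the names here (`sampleVolume`, `sampleTile`) and the half-texel epsilon choice are mine, and a real implementation would live in shader code, but it shows why the inset avoids bleeding across tile boundaries:

```typescript
// Sample a 16x16x16 volume packed as a 4x4 grid of 16x16 tiles in a
// 64x64 2D texture (the layout described above).
const VOL = 16;            // volume resolution per axis
const TILES = 4;           // tiles per row/column in the atlas (TILES^2 = VOL)
const ATLAS = VOL * TILES; // 64

// Atlas-space origin (in texels) of the tile holding a given z-slice.
function tileOrigin(slice: number): [number, number] {
  return [(slice % TILES) * VOL, Math.floor(slice / TILES) * VOL];
}

// Bilinear sample of one tile, with uv inset by a half-texel "epsilon"
// so filtering never reads across a tile boundary.
function sampleTile(atlas: Float32Array, slice: number, u: number, v: number): number {
  const eps = 0.5 / VOL; // half a texel, in tile-local uv units
  const cu = Math.min(Math.max(u, eps), 1 - eps);
  const cv = Math.min(Math.max(v, eps), 1 - eps);
  const [ox, oy] = tileOrigin(slice);
  // texel-space coordinates, centered on texel centers
  const x = cu * VOL - 0.5, y = cv * VOL - 0.5;
  const x0 = Math.floor(x), y0 = Math.floor(y);
  const fx = x - x0, fy = y - y0;
  const tx = (dx: number, dy: number) =>
    atlas[(oy + Math.min(y0 + dy, VOL - 1)) * ATLAS + ox + Math.min(x0 + dx, VOL - 1)];
  return (tx(0, 0) * (1 - fx) + tx(1, 0) * fx) * (1 - fy)
       + (tx(0, 1) * (1 - fx) + tx(1, 1) * fx) * fy;
}

// Trilinear lookup: sample the two slices bracketing w and lerp.
function sampleVolume(atlas: Float32Array, u: number, v: number, w: number): number {
  const z = w * VOL - 0.5;
  const z0 = ((Math.floor(z) % VOL) + VOL) % VOL; // wrap, since the volume is periodic
  const z1 = (z0 + 1) % VOL;
  const fz = z - Math.floor(z);
  return sampleTile(atlas, z0, u, v) * (1 - fz) + sampleTile(atlas, z1, u, v) * fz;
}
```

Without the `eps` clamp, a bilinear fetch near a tile edge would blend in texels from the neighboring tile, i.e. from a completely unrelated z-slice, which is exactly the seam artifact described above.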

If someone could hold my hand where needed, I’d be happy to build this feature for Babylon; I think it could be really useful for many people from a 3D/volume texturing perspective. It would also help a lot with power consumption, in certain cases, where the 3D texture can be static.

Given that 3D volume textures can be combined with each other at different scales to create larger ranges of non-repeating/random-looking values, the opportunity for creating very high “levels of detail” (i.e. multiple fractal octaves; see Texturing and Modeling: A Procedural Approach) at a lower level of power consumption is huge. Combine this with the NME, and a pretty incredible set of textures could be created. Periodic volume textures can also be generated in JavaScript, on the fly, eliminating the need for large downloads.
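To illustrate the octave-combination idea, here is a generic fBm (fractal Brownian motion) sketch, not Babylon code; `noise3` is a toy periodic stand-in for what would really be a volume-texture lookup:

```typescript
// Toy periodic stand-in for a volume-texture noise lookup (period 1 on
// each axis). NOT real noise; it just keeps the sketch self-contained.
function noise3(x: number, y: number, z: number): number {
  return 0.5 + 0.25 * Math.sin(2 * Math.PI * x) * Math.cos(2 * Math.PI * y)
             + 0.25 * Math.sin(2 * Math.PI * z);
}

// Combine several octaves of the same periodic noise at different
// scales to get a larger range of non-repeating-looking values.
function fbm(x: number, y: number, z: number,
             octaves = 4, lacunarity = 2, gain = 0.5): number {
  let sum = 0, amp = 1, freq = 1, norm = 0;
  for (let i = 0; i < octaves; i++) {
    sum += amp * noise3(x * freq, y * freq, z * freq);
    norm += amp;
    amp *= gain;        // each octave contributes half the amplitude...
    freq *= lacunarity; // ...at twice the frequency
  }
  return sum / norm; // normalized back into [0, 1]
}
```

Because each octave only rescales the same periodic texture, the GPU keeps re-reading one small (cache-friendly) volume instead of fetching a huge unique texture, which is where the power-consumption argument comes from.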

The simplification is also worth mentioning: one potentially no longer has to reckon w/ the complexities of uv unwrapping a mesh, baking, etc. “Cool” texturing becomes easier.

Babylon.js does support 3D volume textures :slight_smile: NME does not, though

Would it be worthwhile to have NME support 3D volume textures?

Sure thing. What would it look like? So far, volume textures are user created; they are not loaded from a file format

Maybe a custom monochrome bitmap format that simply stores 1 byte per voxel in three dimensions, along w/ classes for serialize/deserialize of the format? Each byte would be a signed value in [-128, 127], mapped to [-1, 1]. The 3D bitmap could then be indexed by uv.x, uv.y, and uv2.x, or mesh.position, in the NodeMaterial editor? Trilinear sampling along w/ mipmapping?
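For illustration only, a hypothetical minimal container along those lines might look like the sketch below. None of these names (`Volume3D`, `serialize`, `deserialize`) are a proposed Babylon.js API, and the single-byte-per-dimension header caps sizes at 255:

```typescript
// Hypothetical minimal container for a monochrome 3D bitmap: a 4-byte
// header (width, height, depth, reserved) followed by one signed byte
// per voxel. Illustrative only; not a real or proposed format.
interface Volume3D { w: number; h: number; d: number; data: Int8Array; }

function serialize(v: Volume3D): Uint8Array {
  const out = new Uint8Array(4 + v.w * v.h * v.d);
  out[0] = v.w; out[1] = v.h; out[2] = v.d; out[3] = 0; // each dimension must fit in a byte
  // reinterpret the signed voxel bytes as unsigned for storage
  out.set(new Uint8Array(v.data.buffer, v.data.byteOffset, v.data.length), 4);
  return out;
}

function deserialize(buf: Uint8Array): Volume3D {
  const [w, h, d] = [buf[0], buf[1], buf[2]];
  const data = new Int8Array(buf.slice(4, 4 + w * h * d).buffer);
  return { w, h, d, data };
}

// Map a stored signed byte [-128, 127] to a float in [-1, 1].
const toFloat = (b: number) => Math.max(b / 127, -1);
```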

Correct me if I’m wrong, but I think the max size would be 256x256x256?

And we could have code which generates two default textures for improvedperlin3d and worley3d? Both being periodic/tiled/repeating textures? These can then be mixed/recombined at different scales to create larger texture ranges on the gpu.

See sections 5.5 and 5.6

@Deltakosh am I making sense with the above?

I do not plan to add a special format for it (too many already exist). My question was more about how the node should work. Something similar to the reflectionTextureNode, I guess?

@Deltakosh I’d be happy w/ a Texture3D node, which might be simpler to implement? In addition, there is more I can do w/ a Texture3D node (ie more degrees of freedom).

That being said, a ReflectionTexture3D block would be awesome to have as well.

I do not plan to add a special format for it
Understood. I’m not an expert on all the formats. Whatever we use, I’m inclined to think it needs to be an 8-bit integer per pixel/voxel. In some cases, being able to leverage the GPU cache will be a priority, which makes compactness important.

Should I put something on the backlog?

@Deltakosh One other perspective to throw in the mix. From a procgen perspective, the Texture3D will be incredibly useful for “mesh shading” of “meshlets”, when WebGPU arrives in a few years. Think automatic LOD subdivision, displacement, texturing, etc.

Interestingly, meshoptimizer has a WebAssembly build. Given there is a Rust crate, and Rust can auto-generate TypeScript bindings for WebAssembly, meshoptimizer fits nicely w/ Babylon.js.

see: “mesh shading”

But you would be OK with providing the texture by code, right? You won’t be able to pick it like a regular 2D image in NME

Sure, I can provide TypeScript code which generates the periodic 3D volume textures for periodic improved Perlin and periodic Worley noise. The periodic Worley noise will be based on the periodic improved Perlin noise.
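For reference, a sketch of standard periodic Worley (cellular) noise is below. Note it jitters feature points with a plain integer hash rather than the Perlin-based approach described above, just to stay self-contained; all names (`PERIOD`, `hash3`, `worley`) are mine:

```typescript
// Periodic Worley (cellular) noise: one feature point per lattice cell,
// jittered by a hash; cell indices are wrapped modulo the period so the
// resulting texture tiles on every axis.
const PERIOD = 8;

// Deterministic per-cell hash in [0, 1). Wrapping before hashing is
// what makes the noise periodic.
function hash3(x: number, y: number, z: number): number {
  const wrap = (n: number) => ((n % PERIOD) + PERIOD) % PERIOD;
  let h = (wrap(x) * 73856093) ^ (wrap(y) * 19349663) ^ (wrap(z) * 83492791);
  h = (h ^ (h >>> 13)) >>> 0;
  return (h % 1024) / 1024;
}

// F1 Worley: distance to the nearest feature point, clamped to [0, 1].
function worley(x: number, y: number, z: number): number {
  const xi = Math.floor(x), yi = Math.floor(y), zi = Math.floor(z);
  let best = Infinity;
  // search the 27 neighboring cells for the nearest feature point
  for (let dz = -1; dz <= 1; dz++)
    for (let dy = -1; dy <= 1; dy++)
      for (let dx = -1; dx <= 1; dx++) {
        const cx = xi + dx, cy = yi + dy, cz = zi + dz;
        const fx = cx + hash3(cx, cy, cz); // feature point inside cell
        const fy = cy + hash3(cy, cz, cx);
        const fz = cz + hash3(cz, cx, cy);
        best = Math.min(best, Math.hypot(fx - x, fy - y, fz - z));
      }
  return Math.min(best, 1);
}
```

Baking `worley` into a PERIOD³ volume is then just a triple loop over voxel centers.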

For the periodic volumes, what 3D volume texture format should I use?

In addition, the Worley volume, if large, could take a while to compute. Perhaps we should simply bundle a couple of pre-generated volume texture files? Using Brotli compression, I imagine they would squash quite nicely.

Where in git should I put the texture code?

For reference, an example of periodic perlin noise to generate a periodic volume texture.

In our case px == py == pz
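Since the linked example isn’t reproduced here, a rough sketch of tileable improved Perlin noise with a single period `P` (i.e. px == py == pz collapsed to one constant) might look like the following; the permutation-table shuffle and all names are mine:

```typescript
// Periodic "improved" Perlin noise, after Ken Perlin's reference
// implementation, with lattice coordinates wrapped modulo P so the
// result tiles (px == py == pz == P).
const P = 16;
const perm = new Uint8Array(512);
(() => {
  // deterministic shuffle of 0..255, doubled for overflow-free indexing
  const p = Array.from({ length: 256 }, (_, i) => i);
  let s = 1234567;
  for (let i = 255; i > 0; i--) {
    s = (Math.imul(s, 1103515245) + 12345) >>> 0;
    const j = s % (i + 1);
    [p[i], p[j]] = [p[j], p[i]];
  }
  for (let i = 0; i < 512; i++) perm[i] = p[i & 255];
})();

const fade = (t: number) => t * t * t * (t * (t * 6 - 15) + 10);
const lerp = (a: number, b: number, t: number) => a + t * (b - a);
function grad(hash: number, x: number, y: number, z: number): number {
  const h = hash & 15;
  const u = h < 8 ? x : y;
  const v = h < 4 ? y : h === 12 || h === 14 ? x : z;
  return ((h & 1) === 0 ? u : -u) + ((h & 2) === 0 ? v : -v);
}

function periodicPerlin(x: number, y: number, z: number): number {
  const xi = Math.floor(x), yi = Math.floor(y), zi = Math.floor(z);
  const xf = x - xi, yf = y - yi, zf = z - zi;
  const u = fade(xf), v = fade(yf), w = fade(zf);
  // wrap lattice coordinates so the noise repeats with period P
  const wrap = (n: number) => ((n % P) + P) % P;
  const h = (i: number, j: number, k: number) =>
    perm[perm[perm[wrap(xi + i)] + wrap(yi + j)] + wrap(zi + k)];
  return lerp(
    lerp(lerp(grad(h(0,0,0), xf,   yf,   zf),   grad(h(1,0,0), xf-1, yf,   zf),   u),
         lerp(grad(h(0,1,0), xf,   yf-1, zf),   grad(h(1,1,0), xf-1, yf-1, zf),   u), v),
    lerp(lerp(grad(h(0,0,1), xf,   yf,   zf-1), grad(h(1,0,1), xf-1, yf,   zf-1), u),
         lerp(grad(h(0,1,1), xf,   yf-1, zf-1), grad(h(1,1,1), xf-1, yf-1, zf-1), u), v),
    w);
}
```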

@Deltakosh I forgot to mention, at risk of stating the obvious, that periodic 3D volume Worley noise can be generated using the periodic Perlin noise above.

The potential VolumeTextureNode would work like the ReflectionNode (I guess), but you will have to provide code to generate the volume texture, like here:

3d texture volumetric shader test | Babylon.js Playground (babylonjs.com)

(Check line #22)