I was hoping to take a series of 2D arrays of float values, generated outside Babylon, and use them as inputs for a number of vertex AND fragment shaders built in the NME. The row and column dimensions of the arrays are known, and the arrays would be updated dynamically.
My initial thought on a strategy was to normalize the values, then somehow convert the array into a greyscale pseudotexture (is this possible, and how?) where each (row, col, float) entry becomes a pixel that controls the shade of the surrounding space. This was because I am used to working with maps, and because I know maps can be stretched over a surface. Also, if it is a texture, I can easily update that one texture to update a number of materials simultaneously that use the same pseudotexture map in their material. But I would still have to convert the numeric values into a texture somehow (how?), and then the engine would basically have to convert that back into numeric values when using the map.
It feels inefficient to go that route instead of just directly using the values and array positions in a Node Material. If I wanted to use an array of values directly (say a 64x64 array of floats between 0 and 1), what would be the best way to set something like this up in a Node Material so I could use it as a mask or control texture?
The best way would be to pass a texture. You could create it as a RawTexture (the example creates an RGB texture, but you can create any type of texture: Red, RGBA, etc.) and update it with RawTexture.update. When reading the texture, you should use a “nearest” filter, to be sure you read the exact value from it and not an interpolated one.
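Something along these lines (a rough sketch only; the block name "DataMap" and the `scene` / `nodeMaterial` variables are placeholders for your own setup, and the float texture type assumes float texture support on the target device):

```js
// Wrap a 64x64 array of floats (0..1) in a single-channel (R) RawTexture
// so it can be sampled from an NME graph.
const size = 64;
const data = new Float32Array(size * size); // fill from your external 2D array, row-major

const gridTexture = new BABYLON.RawTexture(
    data,
    size,
    size,
    BABYLON.Engine.TEXTUREFORMAT_R,         // one value per "cell"
    scene,
    false,                                  // no mipmaps
    false,                                  // no Y flip
    BABYLON.Texture.NEAREST_SAMPLINGMODE,   // read exact cell values, no interpolation
    BABYLON.Engine.TEXTURETYPE_FLOAT
);

// Plug it into the node material (assumes an ImageSource block named "DataMap" in the graph).
nodeMaterial.getBlockByName("DataMap").texture = gridTexture;

// Later, whenever the array changes:
// data[row * size + col] = newValue;
gridTexture.update(data);
```

Every material that shares that ImageSource / texture will pick up the new values on the next frame, so one `update` call refreshes them all.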
@Evgeni_Popov, your “nearest” filter comment got me thinking: what if I did want it to slowly interpolate between texture channel pixel values as part of the update? Ideally with enough control that I could set the interpolation time to fade on/off between full update states.
Brute-force processing the full texture data array takes a lot of resources, comparing pixel A (before) to pixel A (new state) and then interpolating, and often only a few pixels change at a time. I was curious whether there are other ways built into Babylon that I may not be aware of.
You could read two texels (in the NME, you can use a single ImageSource block for your texture and two Texture blocks to perform two readings from it) and apply a Lerp to the two values. The gradient value can be a uniform that you update each frame. I’m not sure I understood your need correctly, though (?)
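If it helps, the code side of driving that gradient could look something like this (a sketch under assumptions: your graph Lerps the two texture reads using a Float InputBlock as the gradient, and the block name "fade", the duration, and the `startFade` helper are purely illustrative):

```js
// Assumes the NME graph lerps the "previous state" and "new state" readings
// with a Float InputBlock named "fade" as the Lerp gradient.
const fadeBlock = nodeMaterial.getBlockByName("fade");

const fadeDuration = 1500;          // ms to blend from the previous values to the new ones
let fadeStart = performance.now();

// Call this right after pushing new data into the "new state" texture.
function startFade() {
    fadeStart = performance.now();
}

scene.onBeforeRenderObservable.add(() => {
    const t = Math.min((performance.now() - fadeStart) / fadeDuration, 1);
    fadeBlock.value = t;            // 0 = previous values, 1 = new values
});
```

Once the fade reaches 1 you can copy the “new state” data into the “previous state” texture, so the next update starts the blend from the settled values rather than reprocessing the whole array on the CPU.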