Texture array creation from PNG giving tiny tiles

I have a large tile set in a PNG image file. The tileset is 8 by 8 tiles, each being 256 by 256, so the overall texture is 2048 by 2048.
I am trying to import this file, then create a texture array I can sample to display a specific tile as a wrapping (repeating) pattern: chainmail etc…
I have managed to get the tiles into the texture array, and when I look at the surfaces really closely I can see that the correct patterns have been selected, but they are coming out incredibly small. If I am really close I can get a nice image, and I have to scale the texture coordinates hugely to see them. But moving further away, the texture just dissolves into fuzz. Using the same texture coordinates on a single texture produces one huge tile, not a load of tiny ones. I am obviously missing something in relation to these textures.
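For reference, this is the kind of tiling I am after, sketched in plain JavaScript (the helper name is made up for illustration; in the real project the equivalent maths runs in the shader): scale the UV by the repeat count, then wrap with the fractional part so one tile repeats across the surface.

```javascript
// Scale the incoming UV by the number of repeats, then wrap with the
// fractional part so one 256x256 tile repeats across the surface.
function wrapTileUV(u, v, repeats) {
  const su = u * repeats;
  const sv = v * repeats;
  // Keep only the position inside the current tile copy, like fract() in GLSL
  return [su - Math.floor(su), sv - Math.floor(sv)];
}

console.log(wrapTileUV(0.3125, 0.6875, 8)); // [ 0.5, 0.5 ]
```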

Forgot to mention that the problem is that, to get the size of texture I want, it does not seem to resize properly without dissolving into fuzz, as if I were expanding a tiny texture up hugely rather than the reverse.

The trouble is that as I increase the size and move away, the texture sampling stays the same as it was up close. So instead of having an image like the first pic when I am far enough away, it just goes white. It does not seem to be adjusting the texture reads to account for my distance, which is very weird. In the first picture I am right in front of a 1 by 1 cube, and it has hundreds of these tiles across it, so the texture scaling is definitely doing something very weird. I have obviously set something up very wrongly.
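If I understand minification right (this is my rough mental model, not verified against the engine source), the GPU estimates how many texels one screen pixel spans and picks a mip level of roughly log2 of that, so with hugely scaled UVs the far-away pixels need a fairly deep mip level to exist. A toy calculation with illustrative numbers:

```javascript
// Rough sketch of mip level selection during minification: the GPU
// estimates texels-per-pixel from the UV derivatives and picks
// approximately lod = log2(texelsPerPixel).
function mipLevel(texelsPerPixel) {
  return Math.max(0, Math.log2(texelsPerPixel));
}

// Up close, ~1 texel per screen pixel: full-resolution level 0.
console.log(mipLevel(1)); // 0
// Far away with hugely scaled UVs, one pixel may span 64 texels,
// so level 6 of the mip chain is needed to avoid aliasing.
console.log(mipLevel(64)); // 6
```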

Solved. I had the sampling mode set to NEAREST_NEAREST; obviously I misunderstood what this does.

It causes the texture to appear as if it is about 40 times smaller! Is this a bug?

Changing it to LINEAR_LINEAR sorted it out.
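For anyone hitting this later, my understanding (an assumption on my part, not checked against the engine source) is that NEAREST_NEAREST samples a single texel with no mip filtering, so once the pattern is minified to less than one texel per pixel it aliases into noise or a solid colour instead of converging to the average grey. A toy 1D demonstration:

```javascript
// Toy 1D texture: alternating black/white texels, like a fine chainmail pattern.
const texture = Array.from({ length: 64 }, (_, i) => (i % 2 === 0 ? 0 : 255));

// Nearest sampling while minifying by 2: every screen pixel lands on an
// even texel, so the whole pattern collapses to solid black.
const nearestMinified = [];
for (let x = 0; x < 32; x++) nearestMinified.push(texture[x * 2]);

// Averaging neighbouring texels (what a mip level stores) keeps the
// mid-grey the pattern should converge to at distance.
const averagedMinified = [];
for (let x = 0; x < 32; x++) {
  averagedMinified.push((texture[x * 2] + texture[x * 2 + 1]) / 2);
}

console.log(nearestMinified[0], nearestMinified[10]); // 0 0 -- pattern gone
console.log(averagedMinified[0]); // 127.5 -- correct mid-grey
```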


If you can create a repro in a playground, maybe someone can help answer this question. It’s a bit hard to follow what you are talking about without code.

I will give producing an example a go in a few days. I can't post the actual project as it is pretty vast. I have tested the NEAREST_NEAREST filter on a mesh with a 2D array texture and it seems fine there. But I am building an SDF and sphere-tracing tool, so the UVs are entirely calculated in the fragment shader based on the shape type, and those are not being scaled correctly with that sampler, whereas with LINEAR_LINEAR it is fine.
Odd to get such a completely different result.
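To give an idea of what I mean by UVs calculated in the fragment shader, here is a sketch in plain JavaScript (all names are made up for illustration, not my actual shader code): project the sphere-traced hit point onto a plane, scale, wrap, and pick a layer of the 2D texture array.

```javascript
// Sketch of the per-fragment UV computation my sphere tracer does:
// project the hit point onto the XZ plane, scale, wrap, and choose
// which tile of the 8x8 set to sample via the array layer index.
function sdfSurfaceUV(hitX, hitY, hitZ, uvScale, layer) {
  const u = hitX * uvScale;
  const v = hitZ * uvScale; // planar projection onto the XZ plane
  return {
    u: u - Math.floor(u), // wrap, like fract() in the shader
    v: v - Math.floor(v),
    layer, // which layer of the texture array to sample
  };
}

console.log(sdfSurfaceUV(1.25, 0, 3.5, 2, 12)); // { u: 0.5, v: 0, layer: 12 }
```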

Tried to reproduce the problem in the playground but it is not happening; not sure why, it seems to work fine there.
Anyway, if you are interested, here is a link to the example, which creates a nice RawTexture2DArray from a single PNG by splitting it into small chunks and storing them in the array, then applies the array to an SDF shape for display.
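The core of the splitting step looks roughly like this (a standalone sketch, not the playground code verbatim): copy each tile row by row out of the decoded RGBA buffer into one contiguous buffer, which is the layer-after-layer layout a 2D texture array expects.

```javascript
// Split a decoded RGBA image into tiles laid out one after another:
// layer 0's pixels, then layer 1's, and so on, the contiguous layout
// a 2D texture array (e.g. BABYLON.RawTexture2DArray) takes.
function splitIntoTiles(pixels, imageWidth, tileSize, tilesX, tilesY) {
  const bytesPerPixel = 4; // RGBA
  const out = new Uint8Array(tilesX * tilesY * tileSize * tileSize * bytesPerPixel);
  let dst = 0;
  for (let ty = 0; ty < tilesY; ty++) {
    for (let tx = 0; tx < tilesX; tx++) {
      for (let row = 0; row < tileSize; row++) {
        // Byte offset of this tile row inside the big image
        const src =
          ((ty * tileSize + row) * imageWidth + tx * tileSize) * bytesPerPixel;
        out.set(pixels.subarray(src, src + tileSize * bytesPerPixel), dst);
        dst += tileSize * bytesPerPixel;
      }
    }
  }
  return out;
}

// Tiny check: a 4x4 image split into four 2x2 tiles. Pixel (x, y) is
// encoded as the value y * 4 + x in all four channels for easy inspection.
const img = new Uint8Array(4 * 4 * 4);
for (let i = 0; i < 16; i++) img.fill(i, i * 4, i * 4 + 4);
const layers = splitIntoTiles(img, 4, 2, 2, 2);
console.log(layers[0], layers[4], layers[8], layers[12]); // 0 1 4 5 -- the top-left tile
```

In the real thing `pixels` comes from drawing the PNG to a canvas and calling `getImageData`, and the result feeds `new BABYLON.RawTexture2DArray(...)` with width 256, height 256 and depth 64 (check the docs for the exact argument order; I am quoting it from memory).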