I noticed that when I call readPixels() on a 3D texture, it only returns a buffer containing the top 2D slice of pixel data. In the linked example, I create a 2x2x2 3D texture and fill it iteratively with data. I then log the data array and see the expected 2 x 2 x 2 x 4 = 32 data points. After the data is fed to a RawTexture3D, I read the texture's pixels but only see the first 16 data points (i.e. the first depth slice).
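Here is a minimal sketch of what the linked example does (hedged: the `scene` variable is assumed to exist, and the exact RawTexture3D constructor arguments and whether readPixels() returns a Promise may vary by Babylon.js version):

```js
const size = 2;
// 2 x 2 x 2 voxels, 4 channels each = 32 values
const data = new Uint8Array(size * size * size * 4);
for (let i = 0; i < data.length; i++) {
    data[i] = i; // fill iteratively so each value is identifiable
}
console.log(data); // all 32 values, as expected

const texture = new BABYLON.RawTexture3D(
    data,
    size, size, size,
    BABYLON.Constants.TEXTUREFORMAT_RGBA,
    scene
);

texture.readPixels().then((pixels) => {
    console.log(pixels); // only the first 16 values (one 2x2 slice) come back
});
```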
I’m not sure whether this should be considered a bug or a feature request. On one hand, the readPixels() method seems oriented toward 2D textures, since it has no depth parameter. On the other hand, a method named readPixels() that doesn’t read all the pixels by default isn’t doing what you’d expect it to.