Okay… so let's say I have a sphere, and on this sphere I have a shader that I am using to generate a set of data.
Now can I take this information from the GPU and pass it back to a canvas or a buffer?
Since the sphere's UVs technically run from 0 to 1, I could essentially map the color at each UV point (scaled by the texture resolution) back into a flat image for later use, if I can get the information to bounce back.
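To illustrate that UV-to-texel mapping, here's a minimal sketch in plain Python (all names and the 256×256 resolution are my own assumptions, not from any engine API):

```python
# Map a UV coordinate in [0, 1] to a byte offset in a flat RGBA buffer,
# i.e. "uv points divided by resolution" flattened into an image.
WIDTH, HEIGHT = 256, 256  # hypothetical texture resolution

def uv_to_index(u, v, width=WIDTH, height=HEIGHT):
    # Scale UV to pixel coordinates, clamping u = 1.0 / v = 1.0 to the edge.
    x = min(int(u * width), width - 1)
    y = min(int(v * height), height - 1)
    return (y * width + x) * 4  # 4 bytes per RGBA pixel

print(uv_to_index(0.0, 0.0))  # offset of the first pixel
print(uv_to_index(1.0, 1.0))  # offset of the last pixel
```

The same arithmetic applies in reverse when you later sample that flat image back onto the sphere.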
The reason why I ask is I am trying to generate noise normals on a shape and then store that data back onto a dynamic texture.
I figured this could be accomplished with an RTT (render-to-texture) as long as I get the sample spaces correct.
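For the normal-baking part, here's a CPU sketch of what the RTT pass would produce: for every texel (u, v), evaluate a noise-based normal and pack it RGB-encoded into a flat buffer. Everything here is a stand-in (the resolution, the placeholder noise function, the names are mine); on the GPU the equivalent work happens in the fragment shader, and you'd pull the result back with something like WebGL's `readPixels` into a `Uint8Array`.

```python
import math

RES = 64  # bake resolution (assumption)

def noise_height(u, v):
    # Placeholder "noise": a smooth periodic bump standing in for Perlin/simplex.
    return 0.1 * math.sin(2 * math.pi * u * 3) * math.cos(2 * math.pi * v * 3)

def bake_normals(res=RES, eps=1e-3):
    buf = bytearray(res * res * 4)  # flat RGBA image
    for y in range(res):
        for x in range(res):
            u, v = x / (res - 1), y / (res - 1)
            # Finite-difference the height field to get a tangent-space normal.
            dhdu = (noise_height(u + eps, v) - noise_height(u - eps, v)) / (2 * eps)
            dhdv = (noise_height(u, v + eps) - noise_height(u, v - eps)) / (2 * eps)
            nx, ny, nz = -dhdu, -dhdv, 1.0
            inv = 1.0 / math.sqrt(nx * nx + ny * ny + nz * nz)
            nx, ny, nz = nx * inv, ny * inv, nz * inv
            i = (y * res + x) * 4
            # Encode [-1, 1] components into [0, 255] — the usual normal-map packing.
            buf[i:i + 4] = bytes((int((nx * 0.5 + 0.5) * 255),
                                  int((ny * 0.5 + 0.5) * 255),
                                  int((nz * 0.5 + 0.5) * 255),
                                  255))
    return buf

pixels = bake_normals()
```

The "sample spaces correct" part mostly comes down to the vertex shader of the bake pass: instead of projecting the sphere, you output the mesh's UV remapped to clip space (roughly `uv * 2 - 1`), so each fragment lands at its own texel in the render target.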