I want to use a custom shader to implement general-purpose calculations (hundreds of thousands of matrix operations; compute shaders are not an option for me because they cannot be enabled in some environments).
I searched for relevant information:
There is an article describing how to implement GPGPU in WebGL.
I have also seen some PGs showing the power of GPU picking.
As a newbie, I don't know how to reproduce this in Babylon (the simplest possible PG would be best), and I need to be able to read the results back to the CPU for processing.
Maybe you could use a procedural texture with a fragment shader, and then retrieve the results by reading the pixels of the texture? It's a bit hacky, but it would probably work.
As long as you know the dimensions of your texture, you can treat the UV coordinate as a global invocation id, like for compute shaders!
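Something like this might serve as a starting point (untested sketch; the name "gpgpu", the input RawTexture `myRawTexture`, and the doubling computation are all placeholders):

```typescript
// Fragment shader, run once per texel of the procedural texture.
// vUV is provided by Babylon's procedural texture pipeline.
BABYLON.Effect.ShadersStore["gpgpuPixelShader"] = `
precision highp float;
varying vec2 vUV;
uniform sampler2D inputData; // RawTexture holding the input matrix

void main(void) {
    // Recover an integer "invocation id" from the UV coordinate
    // (shown for illustration, unused below).
    ivec2 id = ivec2(vUV * 100.0);
    vec4 value = texture2D(inputData, vUV);
    gl_FragColor = value * 2.0; // placeholder per-element computation
}
`;

const tex = new BABYLON.CustomProceduralTexture("gpgpu", "gpgpu", 100, scene);
tex.setTexture("inputData", myRawTexture); // your 100x100 input

// Read the result back to the CPU once the texture has been generated.
tex.onGeneratedObservable.addOnce(() => {
    // In recent Babylon versions readPixels returns a Promise.
    tex.readPixels()?.then((buffer) => {
        console.log(buffer); // Uint8Array by default
    });
});
```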
Thanks for the answer; I believe what you describe is completely feasible, but I don't understand how the fragment shader matches each pixel of the texture one by one.
In my understanding, the steps are something like this:
Suppose I have a 100x100 matrix; I can save it into a RawTexture.
Then I need to design a fragment shader that reads the pixel at each coordinate of the RawTexture and computes the result. I am very confused at this step:
the fragment shader may not run exactly 100x100 times, and it does not seem to have an integer index corresponding to the pixel coordinates in the RawTexture (I searched and found that gl_FragCoord can apparently be converted to one); don't I need strictly 100x100 invocations for this to be correct?
The vertex shader also seems to need to be defined, but I don't know how: just 4 vertices forming a quad, or 100x100 vertices mapped one-to-one? My guess is sketched below.
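This is my guess at what the two shaders might look like (hard-coding the 100x100 size just for illustration; this may well be wrong):

```typescript
// Vertex shader: only 4 vertices forming a full-screen quad; rasterization
// then produces exactly one fragment per texel of a 100x100 render target.
const vertexSrc = `
precision highp float;
attribute vec2 position; // (-1,-1), (1,-1), (-1,1), (1,1)
void main(void) {
    gl_Position = vec4(position, 0.0, 1.0);
}
`;

// Fragment shader: gl_FragCoord.xy holds the pixel center, e.g. (0.5, 0.5)
// for the first texel, so flooring it gives the integer matrix index.
const fragmentSrc = `
precision highp float;
uniform sampler2D matrixData; // the 100x100 RawTexture
void main(void) {
    vec2 index = floor(gl_FragCoord.xy);      // integer coords in 0..99
    vec2 uv = (index + 0.5) / 100.0;          // back to UV for sampling
    gl_FragColor = texture2D(matrixData, uv); // result for this cell
}
`;
```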
I also gather I need a RenderTargetTexture to render into, and I have questions here too:
do I need to create a mesh? I am only using this for calculations; it seems like it should be able to run without one?
Sorry for so many questions, I've been stuck on this concept for a long time...
Thank you very much, it looks like procedural textures can indeed be used for calculations.
In my project, I need to render a point cloud and rectangle-select with the mouse to obtain the coordinates of the target points (many points); my input is a depth map.
So here's what I did (hoping for better advice), roughly as sketched below:
Convert the depth map to xyz coordinates in the vertex shader.
Compute the color corresponding to the intensity in the fragment shader.
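Roughly like this (heavily simplified; my real unprojection uses the camera intrinsics, and all names here are just for illustration):

```typescript
// Vertex + fragment shaders for the point cloud, used via a ShaderMaterial.
BABYLON.Effect.ShadersStore["pointCloudVertexShader"] = `
precision highp float;
attribute vec2 uv;                 // one vertex per depth-map texel
uniform sampler2D depthMap;
uniform mat4 worldViewProjection;
varying float vIntensity;

void main(void) {
    vec4 texel = texture2D(depthMap, uv);
    float depth = texel.r;
    // Unproject the texel to xyz (simplified pinhole model).
    vec3 xyz = vec3((uv - 0.5) * depth, depth);
    vIntensity = texel.g;
    gl_Position = worldViewProjection * vec4(xyz, 1.0);
    gl_PointSize = 2.0;
}
`;

BABYLON.Effect.ShadersStore["pointCloudFragmentShader"] = `
precision highp float;
varying float vIntensity;

void main(void) {
    // Map intensity to a grayscale color (my real code uses a color ramp).
    gl_FragColor = vec4(vec3(vIntensity), 1.0);
}
`;

const mat = new BABYLON.ShaderMaterial("pc", scene, "pointCloud", {
    attributes: ["uv"],
    uniforms: ["worldViewProjection"],
});
mat.setTexture("depthMap", depthTexture);
mat.pointsCloud = true; // render the mesh as points
```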
My goal is to get all the xyz coordinates and point indices inside the rectangular selection, and with your example I think I would do this:
I can't write to a texture from the vertex shader, so I need to find a way to save the xyz values to a texture in the fragment shader (this is what stumps me).
Then use a procedural texture to process the xyz data and extract the required points.
I have a PG that simply simulates a point cloud (but in reality the CPU does not know the xyz coordinates).
I don't know how to save the xyz values from the shader so that the procedural texture can process them.
By the way, to get to the bottom of this: can the same thing be achieved with plain custom shaders? Procedural textures seem to be a high-level wrapper, and I really want to learn the lower-level concepts.
I am writing a similar shader; the following is the PG.
Using the vertex shader to compute the positions is fast, but I don't know of any way to get them back from the GPU.
I don't know how your point cloud is implemented in your project, but if each particle can have a unique index, then you can associate each index with a pixel in a texture. You would then compute the positions of the points inside the fragment shader (cursed, I know ^^) in exactly the same way as in your vertex shader.
Filtering to keep only the points lying in a plane can then be done GPU-side or CPU-side, depending on what you want. If you do it on the GPU, you can write 1 to the texture when the point is in the plane and 0 otherwise; see the sketch below.
On the CPU, you would just filter your point data with a distance-to-plane function.
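For example, something like this as the procedural texture's fragment shader (very rough sketch; the xyz reconstruction has to mirror whatever your vertex shader does, and texSize / plane / threshold are uniforms you would set yourself):

```typescript
BABYLON.Effect.ShadersStore["pointFilterPixelShader"] = `
precision highp float;
uniform sampler2D depthMap; // same input your vertex shader samples
uniform vec2 texSize;       // e.g. (100.0, 100.0): one texel per point index
uniform vec4 plane;         // plane equation: dot(plane.xyz, p) + plane.w = 0
uniform float threshold;

void main(void) {
    // One fragment per point: gl_FragCoord maps the point index to a texel.
    vec2 uv = (floor(gl_FragCoord.xy) + 0.5) / texSize;
    float depth = texture2D(depthMap, uv).r;
    // Reconstruct xyz exactly as in your vertex shader (placeholder math).
    vec3 xyz = vec3((uv - 0.5) * depth, depth);
    float dist = abs(dot(plane.xyz, xyz) + plane.w);
    // Store the position, plus a 0/1 selection mask in the alpha channel.
    gl_FragColor = vec4(xyz, dist < threshold ? 1.0 : 0.0);
}
`;
```

Then you read the pixels back and keep the entries whose alpha is 1. Note that you would want a floating-point texture type for the output, otherwise the xyz values get quantized to 8 bits.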
Thank you. I searched for relevant information, and it seems that with a WebGL2 transform feedback buffer you can get updated vertices directly from the vertex shader, without any magic in the fragment shader, but there doesn't seem to be much more information about it.
I'm trying to write the relevant code and hope to make progress soon. Thanks again!
I didn't know about those! This looks cleaner than using a fragment shader, but I don't think the logic behind them is exposed by the engine; you might need to use WebGL2 directly, along the lines of the sketch below.
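An untested sketch of what the raw WebGL2 calls look like (all names are mine; the doubling in the vertex shader stands in for your real computation):

```typescript
const canvas = document.createElement("canvas");
const gl = canvas.getContext("webgl2")!;

const vsSource = `#version 300 es
in vec3 inPosition;
out vec3 outPosition;               // captured by transform feedback
void main() {
    outPosition = inPosition * 2.0; // placeholder computation
}`;

// A fragment shader is still required to link, but it never runs here.
const fsSource = `#version 300 es
precision highp float;
out vec4 color;
void main() { color = vec4(0.0); }`;

function compile(type: number, source: string): WebGLShader {
    const shader = gl.createShader(type)!;
    gl.shaderSource(shader, source);
    gl.compileShader(shader);
    return shader;
}

const program = gl.createProgram()!;
gl.attachShader(program, compile(gl.VERTEX_SHADER, vsSource));
gl.attachShader(program, compile(gl.FRAGMENT_SHADER, fsSource));
// Declare which varyings to capture BEFORE linking.
gl.transformFeedbackVaryings(program, ["outPosition"], gl.SEPARATE_ATTRIBS);
gl.linkProgram(program);
gl.useProgram(program);

// Input buffer: the source points (x, y, z per vertex).
const input = new Float32Array([0, 0, 1, 1, 2, 3]); // 2 points as an example
const inBuf = gl.createBuffer();
gl.bindBuffer(gl.ARRAY_BUFFER, inBuf);
gl.bufferData(gl.ARRAY_BUFFER, input, gl.STATIC_DRAW);
const loc = gl.getAttribLocation(program, "inPosition");
gl.enableVertexAttribArray(loc);
gl.vertexAttribPointer(loc, 3, gl.FLOAT, false, 0, 0);

// Output buffer that receives outPosition for every vertex.
const outBuf = gl.createBuffer();
gl.bindBuffer(gl.ARRAY_BUFFER, outBuf);
gl.bufferData(gl.ARRAY_BUFFER, input.byteLength, gl.STATIC_READ);
gl.bindBuffer(gl.ARRAY_BUFFER, null);
gl.bindBufferBase(gl.TRANSFORM_FEEDBACK_BUFFER, 0, outBuf);

// Run the vertex shader once per point; skip rasterization entirely.
gl.enable(gl.RASTERIZER_DISCARD);
gl.beginTransformFeedback(gl.POINTS);
gl.drawArrays(gl.POINTS, 0, input.length / 3);
gl.endTransformFeedback();
gl.disable(gl.RASTERIZER_DISCARD);

// Read the computed positions back to the CPU.
gl.bindBufferBase(gl.TRANSFORM_FEEDBACK_BUFFER, 0, null);
const results = new Float32Array(input.length);
gl.bindBuffer(gl.ARRAY_BUFFER, outBuf);
gl.getBufferSubData(gl.ARRAY_BUFFER, 0, results);
console.log(results); // [0, 0, 2, 2, 4, 6]
```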
Thank you all for your enthusiastic replies.
I am working hard to learn: I'll first try to complete the calculations in raw WebGL, and then do the same thing in Babylon.js.
For compatibility, I'll stay with the WebGL1 API (even though the WebGL2 approach looks cleaner).