How to use RenderTargetTexture to get the result of a custom material

I just started learning Babylon.js and want to implement the following:

I have some point clouds (in practice there may be millions of points). The first step is to implement a point-selection feature.
I want to run the selection test on the GPU and save the results into a texture that the CPU can read back, so that I can quickly determine whether each point is inside the selection rectangle.

Below is a minimal (and almost certainly wrong) example I wrote with Vue 3 + Babylon.js; it's not clear to me how to use RenderTargetTexture and PostProcess correctly.

<template>
  <canvas ref="renderCanvas"></canvas>
</template>

<script setup>
import { ref, onMounted } from 'vue'
import * as BABYLON from 'babylonjs'

const renderCanvas = ref(null)

onMounted(() => {
  const canvas = renderCanvas.value

  const engine = new BABYLON.Engine(canvas, true)
  const scene = new BABYLON.Scene(engine)

  // Create camera and light
  const light = new BABYLON.PointLight('Point', new BABYLON.Vector3(5, 10, 5), scene)
  const camera = new BABYLON.ArcRotateCamera('Camera', 1, 0.8, 3, new BABYLON.Vector3(0, 0, 0), scene)
  camera.attachControl(canvas, true)

  const axes = new BABYLON.Debug.AxesViewer(scene, 1)

  // Create point cloud
  const points = []
  for (let i = 0; i < 10000; i++) {
    points.push(new BABYLON.Vector3(Math.random() * 10, Math.random() * 10, Math.random() * 10))
  }

  const vertexData = new BABYLON.VertexData()
  const positions = []
  points.forEach((point) => {
    positions.push(point.x, point.y, point.z)
  })
  vertexData.positions = positions

  // Create shader material
  const customMaterial = new BABYLON.ShaderMaterial('customShader', scene, {
    vertexSource: `
      precision highp float;
      attribute vec3 position;
      uniform mat4 worldViewProjection;

      void main(void) {
        gl_PointSize = 3.0;
        gl_Position = worldViewProjection * vec4(position, 1.0);
      }
    `,
    fragmentSource: `
      precision highp float;
      void main(void) {
        gl_FragColor = vec4(1.0, 0.0, 1.0, 1.0);
      }
    `
  }, {
    attributes: ['position'],
    uniforms: ['worldViewProjection'],
    needAlphaBlending: true,
    needAlphaTesting: true
  })

  customMaterial.pointsCloud = true // render the mesh as a point cloud
  customMaterial.pointSize = 3 // point size in pixels

  const pointCloud = new BABYLON.Mesh('pointCloud', scene)
  vertexData.applyToMesh(pointCloud)
  pointCloud.material = customMaterial

  //??? how to render to texture with shader material
  const renderTarget = new BABYLON.RenderTargetTexture(
    'renderTarget',
    { width: 100, height: 100 },
    scene
  )

  // do simple calculation
  const modifyMaterial = new BABYLON.ShaderMaterial('modifyShader', scene, {
    vertexSource: `
      precision highp float;
      attribute vec3 position;
      uniform mat4 worldViewProjection;

      void main(void) {
        gl_Position = worldViewProjection * vec4(position + vec3(1.0, 1.0, 1.0), 1.0);
      }
    `,
    fragmentSource: `
      precision highp float;
      void main(void) {
        gl_FragColor = vec4(1.0, 1.0, 1.0, 1.0);
      }
    `
  }, {
    attributes: ['position'],
    uniforms: ['worldViewProjection']
  })

  renderTarget.onBeforeBindObservable.add(() => {
    // scene.postProcessManager.directRender([modifyMaterial], renderTarget)
  })

  //??? how to get texture data, position
  setTimeout(async () => {
    const textureData = await renderTarget.readPixels() // returns a Promise in recent versions
    console.log(textureData)
  }, 2000)

  // Run engine
  engine.runRenderLoop(() => {
    scene.render()
  })
})
</script>

<style>
canvas {
  width: 100%;
  height: 100%;
  display: block;
}
</style>
Any help is greatly appreciated!

Try this:

   renderTarget.setMaterialForRendering(pointCloud, customMaterial);
   renderTarget.refreshRate = BABYLON.RenderTargetTexture.REFRESHRATE_RENDER_ONCE;

I tried your method with some modifications, and there seems to be some progress.
But the returned values are 51, 51, 77, 255, 51, 51, 77, 255…

This is the demo

The render target texture was using a pretty low res (100x100), which made the points “smush” together. If you raise the resolution and bring the camera back a bit, you can see it properly rendered in the render target using the Inspector:

Render Points Calculation To Texture | Babylon.js Playground


Thanks, I kind of get it now. It seems my understanding of saving calculation results into a texture was completely wrong.

I set the width and height to 100 in the hope of doing general-purpose GPU computing.

I would like the texture to match the points one to one, similar to a ComputeShader in Unity.

In the meantime, I'm wondering whether Babylon.js can achieve something similar in WebGL, or whether it can only be done using WebGPU.
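For what it's worth, a one-texel-per-point layout is possible in plain WebGL: instead of projecting through the camera matrix, the vertex shader can position point i at the center of texel i. Below is a small CPU-side sketch of that arithmetic, mirroring what the shader would compute from a per-vertex index attribute (the helper name `texelCenterNDC` is illustrative, not a Babylon API):

```javascript
// Map point index i to the NDC center of texel i in a width x height texture.
// A vertex shader could do the same math from a per-vertex "index" attribute
// and write the result to gl_Position, so point i always lands on texel i.
function texelCenterNDC(i, width, height) {
  const col = i % width             // texel column
  const row = Math.floor(i / width) // texel row
  return {
    // (texel + 0.5) / size maps to [0, 1]; then * 2 - 1 maps to NDC [-1, 1]
    x: ((col + 0.5) / width) * 2 - 1,
    y: ((row + 0.5) / height) * 2 - 1
  }
}

// In a 100x100 texture, point 0 maps to the center of the first texel:
console.log(texelCenterNDC(0, 100, 100)) // { x: -0.99, y: -0.99 }
```

With this layout a 100x100 texture holds results for exactly 10,000 points, which matches the point count in the original example.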

You want to color the points using the texture instead of the pixel shader?

Oh no, I want to compute in the shader whether each point is inside a rectangular box, output the result to a texture, and read it back on the CPU side.

I searched for relevant information and found a similar thread:


What’s a simple way to do a compute shader without WebGPU?

Compute shaders are not supported in WebGL. You can do an RTT pass and fill your texture from the fragment shader.
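To make the RTT-pass idea concrete: the per-point test itself is just a matrix transform plus a bounds check, which a shader would express in GLSL by outputting 1.0 or 0.0 instead of returning a boolean. Here is a plain-JavaScript reference of that test, assuming a column-major 4x4 matrix as GLSL uses (the names `transformToNDC` and `insideSelection` are illustrative, not Babylon APIs):

```javascript
// Multiply a column-major 4x4 matrix by a 3D point (w = 1), as GLSL does.
function transformToNDC(m, p) {
  const x = m[0] * p.x + m[4] * p.y + m[8] * p.z + m[12]
  const y = m[1] * p.x + m[5] * p.y + m[9] * p.z + m[13]
  const w = m[3] * p.x + m[7] * p.y + m[11] * p.z + m[15]
  return { x: x / w, y: y / w } // perspective divide -> NDC
}

// True if the projected point falls inside a screen-space rectangle
// given in NDC coordinates (the "selection box").
function insideSelection(m, p, rect) {
  const ndc = transformToNDC(m, p)
  return ndc.x >= rect.minX && ndc.x <= rect.maxX &&
         ndc.y >= rect.minY && ndc.y <= rect.maxY
}

// With an identity matrix, NDC equals the input point's x/y:
const identity = [1, 0, 0, 0, 0, 1, 0, 0, 0, 0, 1, 0, 0, 0, 0, 1]
console.log(insideSelection(identity, { x: 0.2, y: 0.1, z: 0 },
  { minX: -0.5, maxX: 0.5, minY: -0.5, maxY: 0.5 })) // true
```

In the actual RTT pass you would pass the camera's view-projection matrix (and the rectangle bounds) as uniforms rather than building the matrix by hand.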

I am too new to this to know how to implement it. It would be great if there were relevant examples.
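One piece that can be sketched without a full playground is the CPU side of the readback. Assuming one texel per point and a fragment shader that writes 255 into the red channel for selected points and 0 otherwise (an assumption about the shader, not something shown in this thread), the RGBA byte buffer that `readPixels()` resolves to could be decoded like this (`decodeSelection` is an illustrative helper, not a Babylon API):

```javascript
// Decode an RGBA8 buffer (as returned by RenderTargetTexture.readPixels())
// into one boolean per point, assuming the fragment shader wrote the
// selection result into the red channel: 255 = selected, 0 = not selected.
function decodeSelection(pixels, pointCount) {
  const selected = new Array(pointCount)
  for (let i = 0; i < pointCount; i++) {
    selected[i] = pixels[i * 4] > 127 // red channel of texel i
  }
  return selected
}

// Example with a fake 2-texel buffer: point 0 selected, point 1 not.
const fake = new Uint8Array([255, 0, 0, 255, 0, 0, 0, 255])
console.log(decodeSelection(fake, 2)) // [ true, false ]
```

Note that an 8-bit RGBA texture gives you four bytes per point, so a single channel is more than enough for a yes/no selection result.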