How to pass an array of vectors to a shader via a vertex buffer?

Hi. When I pass a float or a vec3 to the shader, it works well:

const buffer = new BABYLON.VertexBuffer(...)
customMesh.setVerticesBuffer(buffer, false)

But if I want to pass an array of vec3 per vertex, with data like this:


the web console shows "Invalid Format". I don't know how to pass the array of vectors. Could anyone help me? Thank you.

You may not be able to pass a nested array; try the flat array [1,-1,1,1,1,1,1,0,1] instead of [[1,-1,1],[1,1,1],[1,0,1]].
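As a plain-JavaScript sketch of that suggestion, `Array.prototype.flat()` does the flattening in one step:

```javascript
// Nested per-vertex data: one [r, g, b] triple per vertex
const nested = [[1, -1, 1], [1, 1, 1], [1, 0, 1]];

// Flatten one level so the buffer receives a plain list of floats
const flat = nested.flat();

// Vertex buffers expect a typed array of the flattened values
const typed = new Float32Array(flat);
```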

But if I pass [1,-1,1,1,1,1,1,0,1], how does the vertex shader know the type is array<vec3<f32>, 3>?
I can't read the data accurately.

This seems to be the answer you're looking for.

Maybe I didn't express it clearly enough. I know that to pass a vec3 to vertexInputs I can just pass a one-dimensional array, but I don't know how to pass more data per vertex (maybe 48 floats or more). A fake demo looks like this:

// make fake SH data: shDegree vec3 coefficients per vertex
const vertexCount = 6
const shDegree = 16
const sh = new Array(vertexCount).fill(null).map(() => {
  const item = []
  for (let i = 0; i < shDegree; i++) {
    const r = Math.random()
    const g = Math.random()
    const b = Math.random()
    item.push([r, g, b])
  }
  return item
})

// flatten the SH data into a plain float array
const data = []
for (let i of sh) {
  for (let j of i) {
    data.push(...j)
  }
}

// add buffer
const buffer = new BABYLON.VertexBuffer(...)
customMesh.setVerticesBuffer(buffer, false)

The fake shader looks like this:

attribute sh: array<vec3<f32>, 16>;
varying vColor: vec3<f32>;

fn compute_color_from_sh(position: vec3<f32>, sh: array<vec3<f32>, 16>) -> vec3<f32> {
  // ...
}

fn main(input: VertexInputs) -> FragmentInputs {
  vertexOutputs.position = scene.viewProjection * vec4<f32>(vertexInputs.position, 1.0);
  vertexOutputs.vColor = compute_color_from_sh(vertexInputs.position, vertexInputs.sh);
}

I am not good at this; my suggestion is to try the stride, size, and type parameters of VertexBuffer.
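A minimal sketch of that suggestion for the simple one-vec3-per-vertex case (the attribute name "sh" and the `engine`/`customMesh` variables are assumed from the earlier snippets, not spelled out in this thread): the sixth VertexBuffer constructor argument is the stride in floats.

```javascript
// Pack one vec3 per vertex into a flat Float32Array (stride = 3 floats).
const colors = new Float32Array([
  1, -1, 1,   // vertex 0
  1,  1, 1,   // vertex 1
  1,  0, 1,   // vertex 2
]);

if (typeof BABYLON !== "undefined") {
  // "sh" is the attribute name declared in the shader; the arguments are
  // (engine, data, kind, updatable, postponeInternalCreation, stride).
  const buffer = new BABYLON.VertexBuffer(engine, colors, "sh", false, false, 3);
  customMesh.setVerticesBuffer(buffer, false);
}
```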

thank you for your reply. I’m not very familiar with this either, I can only try every possible method at the moment.

Can you share a playground, so that people can help you better?

Good advice, I’ll prepare a playground now.

You can't pass that much data through the attribute mechanism. An attribute can be at most a vec4, and the number of attributes a vertex shader supports is very limited (often fewer than 18).

Maybe you can use a texture instead, and pass an offset inside this texture through a single attribute?
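A hedged sketch of the texture idea (the layout and the `shTexture` name are my own choices, not something from this thread): pack one SH coefficient per RGBA texel, one row per vertex, then fetch by vertex index and coefficient index in the shader.

```javascript
// Fake SH data: 6 vertices x 16 vec3 coefficients each.
const vertexCount = 6;
const shDegree = 16;

// One coefficient per RGBA texel (alpha unused):
// texture width = shDegree, height = vertexCount.
const texData = new Float32Array(vertexCount * shDegree * 4);
for (let v = 0; v < vertexCount; v++) {
  for (let c = 0; c < shDegree; c++) {
    const o = (v * shDegree + c) * 4;
    texData[o + 0] = Math.random(); // r
    texData[o + 1] = Math.random(); // g
    texData[o + 2] = Math.random(); // b
    texData[o + 3] = 1;             // unused alpha
  }
}

if (typeof BABYLON !== "undefined") {
  // Float texture with nearest sampling; in a WGSL shader you would read it
  // with textureLoad(shTexture, vec2<i32>(coeffIndex, vertexIndex), 0).
  const shTexture = BABYLON.RawTexture.CreateRGBATexture(
    texData, shDegree, vertexCount, scene, false, false,
    BABYLON.Texture.NEAREST_SAMPLINGMODE, BABYLON.Constants.TEXTURETYPE_FLOAT);
}
```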


I see that raw WebGPU can pass data structures like:

struct PointInput {
    @location(0) scale: vec3<f32>,
    sh: array<vec3<f32>, 16>,
};

@binding(0) @group(0) var<storage, read> points: array<PointInput>;

fn vs_points(@builtin(vertex_index) vertex_index: u32) -> PointOutput {
    // ...
}

Is it possible to support something like the above, or is there a reason this is limited?

Yes, you can use storage buffers to retrieve some data, but it will only work in WebGPU.

If you want your code to work in WebGL, you can use a texture instead.

Thank you, I found setStorageBuffer in the API documentation. It's perfect!
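For readers landing here later, a minimal sketch of that storage-buffer route (assuming a WebGPU engine and a ShaderMaterial named `material`; the binding name "points" is just an example, not from this thread). Note that in WGSL storage buffers an array of `vec3<f32>` has 16-byte element alignment, so padding each coefficient to a vec4 is the simplest layout:

```javascript
// Flattened SH data, padded to vec4 per coefficient to match the
// 16-byte element alignment of array<vec3<f32>> in WGSL storage buffers.
const vertexCount = 6;
const shDegree = 16;
const shData = new Float32Array(vertexCount * shDegree * 4);
for (let i = 0; i < shData.length; i += 4) {
  shData[i + 0] = Math.random(); // r
  shData[i + 1] = Math.random(); // g
  shData[i + 2] = Math.random(); // b
  // shData[i + 3] stays 0 (padding)
}

if (typeof BABYLON !== "undefined") {
  // Create a storage buffer and bind it to the shader's
  // var<storage, read> declaration (WebGPU engines only).
  const storageBuffer = new BABYLON.StorageBuffer(engine, shData.byteLength);
  storageBuffer.update(shData);
  material.setStorageBuffer("points", storageBuffer);
}
```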