Particle Vertex Shader / Screen or View Position

I’d like to use position or screen coords as my uv for a particle texture, but cannot seem to use mesh.position in the NME. The docs suggest you can only customize a fragment shader, so I don’t know how to manually pass these values in to the fragment shader. Any suggestions?

[Image: example particle NME with mesh.position wired to the uv input (Babylon.js Node Material Editor)]

Hello,

I agree we should at least be able to get the world position of the particle, but so far this is not something we expose.

Would you want to create an issue so we can track it?

cc @Evgeni_Popov FYI

Note that you can’t get the particle position for the time being, but screen coords are the same thing as particle.uv, which is available.

I’m possibly using the wrong terminology in referring to screen coords. particle.uv goes from 0 to 1 per particle, whereas what I’m trying to achieve is https://nme.babylonjs.com/#Q24HCH#4.

Is there a workaround I could use for 4.1? Possibly passing the position in manually via a uniform and then adding particle.uv to it?

I’ll create an issue for adding support in babylon.

My bad, I was thinking “post process” and not “particles” when I replied about the uv.

So, you would like to apply the effect shown in https://nme.babylonjs.com/#Q24HCH#4 to each particle of a particle system? In that case I think we do need to provide the particle position in the NME; I can’t see how you could do it otherwise.

Agree we need to expose world pos

If I need to get something working before 4.2, how can I pass the position in manually? I don’t see a way to set a uniform per particle. Am I mistaken that you can’t customize the vertex shader for particles via the ShaderStore?

you can overwrite the particle.vertex.fx in the shader store to set the world pos as a varying
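A minimal sketch of that idea: register a custom particle vertex shader string in Babylon’s `Effect.ShadersStore` that adds a varying carrying the position. The shader store key, the varying name, and the trimmed-down shader body below are illustrative assumptions; in practice you would start from the stock particles vertex shader source and only add the varying lines.

```javascript
// Hedged sketch: a particle vertex shader that forwards the clip-space
// position to the fragment shader as a varying. The body is heavily
// abbreviated; copy the real built-in shader and add only the marked lines.
const customParticleVertex = `
attribute vec3 position;
attribute vec2 offset;
attribute vec4 color;
uniform mat4 view;
uniform mat4 projection;
varying vec2 vUV;
varying vec4 vColor;
varying vec4 vClipPos; // added: pass clip-space position to the fragment shader

void main(void) {
    // ... stock billboard expansion elided ...
    vec4 viewPosition = view * vec4(position, 1.0);
    gl_Position = projection * viewPosition;
    vClipPos = gl_Position; // added: divide by .w in the fragment shader for NDC
    vUV = offset;
    vColor = color;
}`;

// In a Babylon app, register it so particle systems can pick it up
// (key name is an assumption — check the engine's shader store keys):
if (typeof BABYLON !== "undefined") {
    BABYLON.Effect.ShadersStore["particlesVertexShader"] = customParticleVertex;
}
```

The matching fragment shader then declares the same `varying vec4 vClipPos;` and can derive screen-space UVs from it.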

Well, it’s not so straightforward, because not all code paths in the vertex shader compute the world pos.

In one of the code paths (the main one), we would need to compute it as invView * viewPosition, but invView is not available (except if you use clipping planes)…

As a temporary workaround it can be computed with inverse(view) * viewPosition, but that means a matrix inversion per vertex. Alternatively, enable a clipping plane, configured so that it does not actually clip anything, just to make invView available.
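The identity behind that workaround can be checked numerically. A minimal sketch, with the simplifying assumption that the view matrix is a pure translation (a real Babylon view matrix also contains rotation, which is why the shader needs a true inverse or the engine-provided invView):

```javascript
// Tiny numeric check of worldPos = inverse(view) * viewPosition.
function mulMatVec(m, v) {
  // m: 4x4 row-major matrix, v: [x, y, z, w]
  return m.map(row => row.reduce((sum, mij, j) => sum + mij * v[j], 0));
}

const cam = [2, 1, 5]; // hypothetical camera position
// For a translation-only camera: view = translate(-cam), invView = translate(+cam)
const view    = [[1, 0, 0, -cam[0]], [0, 1, 0, -cam[1]], [0, 0, 1, -cam[2]], [0, 0, 0, 1]];
const invView = [[1, 0, 0,  cam[0]], [0, 1, 0,  cam[1]], [0, 0, 1,  cam[2]], [0, 0, 0, 1]];

const worldPos  = [3, 4, -2, 1];
const viewPos   = mulMatVec(view, worldPos);    // what the vertex shader already has
const recovered = mulMatVec(invView, viewPos);  // inverse(view) * viewPosition
// recovered equals worldPos: [3, 4, -2, 1]
```

Doing this inversion per vertex in GLSL is the cost being discussed; the clipping-plane trick trades that for the engine uploading invView once per frame.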

@Prodigal: do you need world space or screen space?

Looking at the material https://nme.babylonjs.com/#Q24HCH#4, it is screen coords that are needed.

Couldn’t we provide a block that maps gl_FragCoord and another for the screen width/height? That way, one could calculate the screen coords in any type of shader (material, post process, particle).

I can make either work, but screen coords are the easiest: the effect should just “reveal” an underlying, unmoving texture in pieces with each particle.

This is getting close by overriding the default shader and passing the world pos through to the fragment shader: Babylon.js Playground

I have updated your PG so that it does what you did in the nme:

https://www.babylonjs-playground.com/#PXHKLH#6

Note that what you called vWorldPos is in fact the coordinates in clip/NDC space, not the world coordinates.
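To make that distinction concrete, here is a sketch of turning that clip-space value into 0..1 screen UVs, mirroring what the fragment shader would do with `vec2 uv = (clipPos.xy / clipPos.w) * 0.5 + 0.5;` (names are illustrative):

```javascript
// Clip space -> NDC (perspective divide) -> 0..1 screen UVs.
function clipToScreenUV(clip) {
  const ndcX = clip[0] / clip[3]; // NDC is in [-1, 1] after the divide by w
  const ndcY = clip[1] / clip[3];
  return [ndcX * 0.5 + 0.5, ndcY * 0.5 + 0.5];
}

// A point on the camera axis lands at the center of the screen:
clipToScreenUV([0, 0, 0.5, 1]); // -> [0.5, 0.5]
```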

@Evgeni_Popov: Let’s add the glFragCoords and the screen size blocks!

Thank you for the updated shader as well as the correction to my terminology!

PR on its way:


Once again impressed with the turnaround on this, and excited for 4.2 release!