Shader sanity check

I need a sanity check. I’m making a starfield shader as a post-process effect. It’s for a loading screen, so it shouldn’t involve too much asset gathering. The effect has to fade out in a particular way at the end of loading, so it has to take place in the current game scene.

Since I don’t have any geometry to work with, I’ve generated a random starfield. I then loop through the stars and project them into screen space, and if the current pixel is within a certain range of a star, it gets drawn. I currently have the PG set up to try to debug the math involved in the screen projection.

Because it’s a post-process shader, it seems like I have to pass in the camera projection matrix and then multiply it by the camera position and the camera direction to get a useful value, but I know that getting a vec4 out of this still isn’t useful. Is there a way to create a mat4 from a vec4?


My value CS should be showing something other than black, but it is not. V definitely shows some change in values when the camera is moved, and I’ve tried multiplying the value by various factors just to check that the numbers aren’t too small.

I’m not able to visualise the math for all of this in my head at the moment so I’m kind of just assuming that it’s working in this context.

Or is there just a better way to do this entirely? I could create a starfield material that generates its own geometry and render it, but this effect has to overlay the world so I’ll just end up doing a render target and complicating everything during the loading process.

How would you do this?

Hi @Antidamage

I only get a black screen when I run your PG. Taking a quick look at your code, I’ve found a few things:

  • Try to avoid overly generic names like CameraVector. Favor more explicit naming like CameraDirection instead; it makes the code easier to read.
  • This code: CameraPosition * CameraVector. It doesn’t make sense, or I’ve missed something.
  • vec4 v: same here, use a more explicit name to hint at what you’re trying to do.

I would not use cameraPosition/direction. I would instead assume the camera is at (0,0,0), looking toward +Z. This implies the view matrix (and its inverse) are identity, so you can just use the projection matrix.
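To make that concrete, here’s a minimal sketch in plain JavaScript (no Babylon; the function and variable names are made up for illustration) of projecting a camera-space star position through just a left-handed projection matrix, assuming the camera sits at the origin looking down +Z:

```javascript
// Build a left-handed perspective matrix (row-major 4x4, p' = M * p),
// matching Babylon's default handedness. near/far/fov are assumptions.
function perspectiveLH(fovY, aspect, near, far) {
  const f = 1 / Math.tan(fovY / 2);
  return [
    [f / aspect, 0, 0, 0],
    [0, f, 0, 0],
    [0, 0, far / (far - near), -near * far / (far - near)],
    [0, 0, 1, 0], // w takes the view-space z for the perspective divide
  ];
}

function transform(m, [x, y, z, w]) {
  return m.map(row => row[0] * x + row[1] * y + row[2] * z + row[3] * w);
}

// Camera-space star position -> screen UV in [0,1], or null if behind the camera
function projectToUV(proj, star) {
  const [cx, cy, , cw] = transform(proj, [star.x, star.y, star.z, 1]);
  if (cw <= 0) return null;          // behind the camera: discard
  return { u: cx / cw * 0.5 + 0.5,   // NDC [-1,1] -> UV [0,1]
           v: cy / cw * 0.5 + 0.5 };
}

const proj = perspectiveLH(Math.PI / 3, 16 / 9, 0.1, 100);
console.log(projectToUV(proj, { x: 0, y: 0, z: 10 })); // star dead ahead -> { u: 0.5, v: 0.5 }
```

In a fragment shader this would be the same multiply plus divide-by-w per star; the `cw <= 0` check is what culls stars behind the view.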
Did you try with procedural noise instead of a uniform array?
For the distance to a 2D point, I would use smoothstep with the screen-space radius. Being pixel perfect is difficult: screen resolutions are insane and pixels are really tiny. I’d try to render small 2D disks instead.
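The smoothstep-disk idea can be sketched like this in plain JavaScript (the GLSL version is identical except for types; `radius`/`feather` are hypothetical names):

```javascript
// GLSL-style smoothstep: 0 below edge0, 1 above edge1, smooth in between
function smoothstep(edge0, edge1, x) {
  const t = Math.min(Math.max((x - edge0) / (edge1 - edge0), 0), 1);
  return t * t * (3 - 2 * t);
}

// Soft disk: full brightness inside `radius`, fading out over `feather`
// (all units in screen-space pixels or UVs, whichever the distance uses)
function diskAlpha(dist, radius, feather) {
  return 1 - smoothstep(radius - feather, radius + feather, dist);
}

console.log(diskAlpha(0, 3, 1));  // center of the star -> 1
console.log(diskAlpha(10, 3, 1)); // well outside -> 0
```

A feather of one or two pixels is usually enough to hide the aliasing that an exact inside/outside test produces.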

Before we get into it, to reiterate: my question is about converting an array of world coordinates to screen coordinates in a post-process shader. Alternatively, a much better way of doing a good starfield is welcome, e.g. should I just use a render target and do it the normal way?

scene.activeCamera.getProjectionMatrix() appears to return an unchanging constant value, so I assume it needs to be multiplied by a world matrix in order to get the camera position into the scene as well. I can actually just use the projection matrix, though; I’m just trying things out. I would still like to know how to get the worldViewProjection matrix in a post-process shader, because the standard material uniform names don’t seem to produce anything. I mentioned this in my first post and I’ve asked about it before on the forums. I guess it’s impossible, and the camera-relative world matrix isn’t available from Babylon outside of materials?

here is a quick test I did to see what’s needed/missing:

starfield | Babylon.js Playground

One thing I forgot is to properly handle particles behind the view. Don’t forget to check z < 0; in this case, I multiply the result by 0 when behind.


Ooh that’s a smart way of doing it. I think you’re on to something with using a plane. Cheers!

That effect is really cute!


I ended up using a render target as I realised I wanted to add a number of other effects. Had some good progress:

The requirement is for this effect to fade into a different scene, so I’ve set that up in a post-process effect.

Is there a way to get meshes to ONLY draw in the render target? Currently it fades out from the starfield loading effect but the meshes and background I’m including in it are still in the world. Is my only option to hugely offset the meshes in the render target and use a second camera?

The other issue I’m having is that the nebula should be using alpha but as far as the instanced stars are concerned it ignores them. Do I need to do something extra here?


Hey there, one way to get the meshes to only render to the render target is to give them a different layerMask than the camera. For instance on your playground, on lines 44-50 I set the meshes’ layer masks to 2 and the camera’s to 1. Then it fades to black at the end like you want I think. :slightly_smiling_face:
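The layer mask check is just a bitwise AND, which a plain-JS sketch can show (stand-in objects, not real Babylon classes; the mask values match the ones mentioned above):

```javascript
// Babylon draws a mesh in the main pass only when the camera's and the
// mesh's layerMask share at least one bit.
const mainCamera = { layerMask: 0x1 };  // camera left on layer 1
const starMesh   = { layerMask: 0x2 };  // mesh moved to layer 2
const normalMesh = { layerMask: 0x1 };  // ordinary scene mesh

const isVisibleTo = (camera, mesh) => (camera.layerMask & mesh.layerMask) !== 0;

console.log(isVisibleTo(mainCamera, starMesh));   // false: skipped in the main pass
console.log(isVisibleTo(mainCamera, normalMesh)); // true
```

The render target still draws the starfield meshes because they sit in its explicit renderList, so only the main camera pass is affected by the mismatched masks.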

PS, I don’t recall ever seeing a playground like this where you have to press play to get it to start. Usually they just start automatically after they’re done loading. Color me curious LOL. :thinking:

Thanks! That was what I was after. Good progress:

I just need to fix that translucency now. Any ideas? Do the stars need something special or does the cloud plane?


Hmm, I’m not sure what you mean. Is it just the star beams that should be translucent, or the whole star-field? :thinking: It doesn’t look like the Alpha uniform is being used in the shader though; that jumped out at me.

PS, I made the code that sets the layer masks probably overly general purpose: for every custom render texture in the scene, it sets the layerMask of each mesh in its renderList to 2. But on second thought, in this case it’s probably simpler to just do it for the one render texture that’s created, since it looks like that’s the only one. :slightly_smiling_face:

Got it. The cloud image wasn’t refreshing properly, had to rename it. Result:

I am very pleased. Thanks for all the help everyone!


Awesome, it looks very nice with the translucent nebula clouds and the fade to clouds at the end! :slightly_smiling_face:

PS, if you move the code that assigns the shader strings to the ShadersStore up before the shader is created, then it works the first time after the page loads. That’s why the screen was blank at first until play was pressed: the shader was created before its code was assigned to the shader store. That was bugging me LOL. :beers:
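The ordering issue can be mimicked with stand-in objects (this is not Babylon’s real implementation, just a sketch of the look-up-at-construction behavior; the real names are Effect.ShadersStore and BABYLON.PostProcess):

```javascript
// A post process resolves its fragment source from the store once, at
// construction time, so registering the shader afterwards is too late.
const ShadersStore = {};

function createPostProcess(name) {
  // one-time lookup at construction, like the real constructor
  return { source: ShadersStore[name + 'FragmentShader'] ?? null };
}

// Wrong order: create first, register after -> blank screen
const broken = createPostProcess('starfield');
ShadersStore['starfieldFragmentShader'] = 'void main() { /* ... */ }';
console.log(broken.source); // null

// Right order: the store already has the source when the effect is built
const working = createPostProcess('starfield');
console.log(working.source !== null); // true
```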


Aha, thank you :slight_smile:


When I implemented this into my app instead of the playground I started getting z-fighting on the clouds that wasn’t there before.

What are some things I can check to resolve this?


You can try giving the source star mesh a renderingGroupId of 0 and the source cloud mesh a renderingGroupId of 1, that way the stars should always be drawn before the clouds. :slightly_smiling_face:
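In plain-JS terms, renderingGroupId only changes the order meshes are submitted in: lower ids draw first (a sketch with stand-in objects, not real Babylon meshes):

```javascript
// Groups render in ascending renderingGroupId, so stars (group 0) are
// always drawn before clouds (group 1), regardless of scene order.
const meshes = [
  { name: 'cloud', renderingGroupId: 1 },
  { name: 'stars', renderingGroupId: 0 },
];

const drawOrder = [...meshes]
  .sort((a, b) => a.renderingGroupId - b.renderingGroupId)
  .map(m => m.name);

console.log(drawOrder); // ['stars', 'cloud']
```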

It’s not that, it’s the clouds fighting the clouds.


Hmm, in the past when I had z-fighting between instances, I solved it by making sure each instance had a different z or y position. That’s a little harder here where they’re random, but I think looping through and making sure none of the positions are too close would work. Maybe others will have a better idea though. :thinking:
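The loop-and-separate idea could look something like this (plain JS; the function name and threshold are made up, and it mutates the position objects in place):

```javascript
// Sort cloud positions by z, then push any pair that is closer than
// minGap apart so no two planes can z-fight.
function separateAlongZ(positions, minGap) {
  const sorted = [...positions].sort((a, b) => a.z - b.z);
  for (let i = 1; i < sorted.length; i++) {
    if (sorted[i].z - sorted[i - 1].z < minGap) {
      sorted[i].z = sorted[i - 1].z + minGap;
    }
  }
  return sorted;
}

const out = separateAlongZ([{ z: 5 }, { z: 5.01 }, { z: 9 }], 0.5);
console.log(out.map(p => p.z)); // [5, 5.5, 9]
```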

It’s odd because they’re distributed over quite a big range.

I wondered if the z-buffer has a lower bit depth in our app or something, or if I can manually push out the depth. It’s a shame render groups bring things forward rather than push them back like in UE.

The renderingGroupId just affects the order the meshes are drawn in; it doesn’t push any mesh positions backward or forward. I might be misunderstanding you though… :thinking:

Hmm, you could try manually setting the y and z positions of two clouds to be the same, or really close, and see if that creates the issue? Maybe that can help narrow it down. :slightly_smiling_face:

Yes, that’s what I mean by push back or forward. Bringing stuff forward is less useful since then you have to offset the entire scene to push something back in the render list. It’s just a different approach.

I’ve been randomising the rotation a little bit just to help avoid flat planes fighting and it seemed to make no difference. That makes me think it’d be better solved by looking at how the depth buffer is configured. I just don’t know where to start.


I don’t know, but it sounds like something that @sebavan or @Evgeni_Popov will know about. It might have to wait until tomorrow, after the weekend though. :slightly_smiling_face: