I am working on a team-based survivor game and have a question about controlling non-visual scene assets, possibly with animated control maps.
For illustrative purposes, think of a terrain with a blight slowly creeping across it. I want to use an image sequence to direct the blight’s spread across the world. The spread would physically transform the terrain by controlling how dynamic assets are spawned or replaced once they pass the blight edge, while also using dynamically spawned particle effects to mark the area that is actively advancing.
I was thinking of using a node material to control the blight’s advancing effects, but I am having trouble wrapping my head around how to use the node editor to spawn new emitters at the blight’s current edge and remove old emitters in areas already covered by blight. An image-based approach seems best to me, though I could be wrong and am open to suggestions, because I can send image blobs out of a central server via JSON to sync the blight’s advance state and its visual cues across multiple players who are collaboratively running or interacting with the blight to slow its progress.
How would I go about plumbing up a node material to do that… take in a greyscale or single-channel RGB image sequence denoting the freshness of the blight (white = new, black = old), and use it to spawn a series of emitters along the edge of new blight and remove the old ones? If this is a painful process, what are the alternatives, considering this spread will be dynamically generated?
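For context, here is roughly how I pictured pushing the control image from the server into the scene. This is only a sketch of my intent: the message shape, the texture size, and the `socket` WebSocket are all assumptions on my part, not working code.

```javascript
// Sketch only: the server sends the blight control map as a base64 RGBA blob inside a JSON message.
// Assumed message shape: { width: 256, height: 256, pixels: "<base64 RGBA bytes>" }
const controlTexture = BABYLON.RawTexture.CreateRGBATexture(
    new Uint8Array(256 * 256 * 4), 256, 256, scene,
    false, false, BABYLON.Texture.NEAREST_SAMPLINGMODE);

socket.onmessage = (event) => {
    const msg = JSON.parse(event.data);
    const bytes = Uint8Array.from(atob(msg.pixels), (c) => c.charCodeAt(0));
    controlTexture.update(bytes); // refresh the map each time the server advances the blight
    // controlTexture would then feed whatever node material / node geometry drives the spread
};
```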
Did you try working with a grid? (It could be just a JS dict object.)
That way you can work with X_Y keys as cells, and you are not limited by your image size when handling your “blight spreading”.
A few years ago I did a grid-based spread like this in Python:
It looks complicated, but it is actually very simple (see the sketch after this list):
Start with 1 cell
If a cell is newly born, randomly add new cells next to it where no neighbor exists yet (4 directions)
Age each cell until it dies
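Here is that logic as a minimal JS sketch (the ages, probabilities, and key format are just illustrative):

```javascript
// Minimal sketch of the dict-based spread: keys are "x_y" strings, values are cell ages.
const cells = { "0_0": 0 };                              // start with 1 cell
const NEIGHBORS = [[1, 0], [-1, 0], [0, 1], [0, -1]];    // 4 directions
const MAX_AGE = 10;                                      // cells die past this age
const BIRTH_CHANCE = 0.3;                                // chance for a newborn cell to seed a neighbor

function step() {
    for (const key of Object.keys(cells)) {
        const [x, y] = key.split("_").map(Number);
        // Newborn cells may spawn into empty neighboring cells
        if (cells[key] === 0) {
            for (const [dx, dy] of NEIGHBORS) {
                const nKey = `${x + dx}_${y + dy}`;
                if (!(nKey in cells) && Math.random() < BIRTH_CHANCE) {
                    cells[nKey] = 0;
                }
            }
        }
        // Age every cell and remove it once it dies
        cells[key] += 1;
        if (cells[key] > MAX_AGE) {
            delete cells[key];
        }
    }
}
```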
I just gave it a try using the BabylonJS Particle System, here you go:
NB 1: If you are unlucky you might have to trigger Run several times, since cells can die quickly at the beginning with these params.
NB 2: Here I limited the spread to a square so the dict doesn’t grow without bound, but in your case you would limit it using the ground mesh.
I’m only updating particles, but you could toggle some assets ON/OFF as well.
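For the assets ON/OFF part, I mean something like this (a sketch, assuming you keep one healthy and one blighted mesh per cell in lookups keyed the same way as the grid):

```javascript
// Sketch: cells that exist in the grid show the blighted asset, the others show the healthy one.
function syncAssets(cells, healthyMeshes, blightedMeshes) {
    for (const key of Object.keys(healthyMeshes)) {
        const blighted = key in cells;
        healthyMeshes[key].setEnabled(!blighted);   // hide the healthy asset once the blight arrives
        blightedMeshes[key].setEnabled(blighted);   // show the blighted replacement
    }
}
```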
@Calsa, you can leverage the same piece of terrain both for your visualization of the blight through the shader and as an input to spawn meshes on the surface based on either texture or vertex color. Node Geometry Editor can be used to spawn meshes onto another mesh to generate a new mesh. This graveyard generator is an example of passing several assets to a scene to be positioned on a dynamically generated ground plane. You can also just pass any mesh to a node geometry graph and use that to instantiate meshes on the surface (either at vertices, or randomly across faces).
Let’s say you are generating trees on your terrain. You can use the ground mesh surface for positioning in world space so the trees sit on the terrain, and you can also pass a texture to define where the trees can spawn. If you know the creep in the image sequence, you can generate your tree meshes in “waves” so that each time you grow your blight, you regenerate those tree assets but instead use downed trees or burnt trees or whatever your assets now look like. Regenerating the mesh is very quick. If you spam the button in the PG above, you can see that the graveyard ground is generated with a custom displacement in one node geometry and then passed to a second one, which generates all of the elements on the ground without much of a delay at all.
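On the JS side, that regeneration loop could look roughly like this. Treat it as a sketch: the snippet id and block names are placeholders, and I am assuming your graph exposes Mesh input blocks that you look up the same way you would look up blocks on a NodeMaterial.

```javascript
// Sketch: load a scatter graph once, then rebuild it each time the blight advances.
// (Assumes this runs inside an async createScene or similar.)
const scatterGraph = await BABYLON.NodeGeometry.ParseFromSnippetAsync("#YOUR_SNIPPET"); // placeholder id

// Assumed: the graph has a Mesh block named "terrain" (the surface to scatter on)
// and a Mesh block named "asset" (the tree / burnt tree to instantiate).
function regenerate(terrainMesh, assetMesh) {
    scatterGraph.getBlockByName("terrain").mesh = terrainMesh;
    scatterGraph.getBlockByName("asset").mesh = assetMesh;
    scatterGraph.build();
    return scatterGraph.createMesh("blightWave", scene);
}

let trees = regenerate(ground, healthyTreeMesh);
// ...later, when the blight creeps forward:
trees.dispose();
trees = regenerate(ground, burntTreeMesh);
```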
You can mask this regeneration with a particle system or other technique as determined by your art target. If you need to spawn “emitters” across your terrain, you can mix NGE and mesh emitters by spawning small triangle meshes across your terrain and then using the generated mesh as a mesh emitter. This will emit particles from vertices and you will have a mesh of isolated triangles scattered across your terrain. You can simply not render the triangles, but you still get the vertex positions as emitters and you get the benefit of one emitter rather than many.
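Wiring the generated scatter mesh up as an emitter is the easy part; a minimal sketch (the texture URL and counts are placeholders):

```javascript
// Sketch: use the scattered triangle/quad mesh as a mesh emitter and keep it out of the render.
const ps = new BABYLON.ParticleSystem("blightEdge", 2000, scene);
ps.particleTexture = new BABYLON.Texture("textures/flare.png", scene); // placeholder texture
ps.particleEmitterType = new BABYLON.MeshParticleEmitter(scatterMesh);
ps.emitter = scatterMesh;       // positions the system at the scatter mesh
scatterMesh.isVisible = false;  // the triangles never render, but still provide emission points
ps.start();
```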
I hope this helps you think outside the box on this one, but feel free to ping with more questions.
The concept in this quote makes sense, but say I have a dynamically refreshed raw texture and want to spawn emitters at a specific point on a terrain map only if a specific channel (R, G, or B) is 255 (full white), with no emitter there otherwise. What would that setup look like?
I am having trouble figuring out how to use a map itself as the control for the placement of meshes, particle emitters, or really anything across the terrain surface. Emitters are the end-state goal, so they would be the most helpful to see, but any sort of basic example of a texture serving as a distribution map for meshes/emitters would be amazing.
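On the CPU side I can picture something like the sketch below (the pixel-to-world mapping is just my assumption); what I can’t see is how to express the equivalent check inside the node graphs.

```javascript
// Sketch: walk the raw RGBA data and collect world positions wherever the red channel is exactly 255.
function emitterPointsFromMap(pixels, width, height, terrainSize) {
    const points = [];
    for (let y = 0; y < height; y++) {
        for (let x = 0; x < width; x++) {
            const r = pixels[(y * width + x) * 4]; // red channel of this texel
            if (r === 255) {
                // Map texel coordinates onto the terrain's XZ extents (assumed centered at the origin)
                points.push(new BABYLON.Vector3(
                    (x / width - 0.5) * terrainSize,
                    0,
                    (y / height - 0.5) * terrainSize));
            }
        }
    }
    return points;
}
```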
@Calsa, I will work up an example for you. Give me a little time to build it since I am currently out of office. I will get it to you as soon as possible, however.
I don’t want you burning out trying to help the community and no longer having as much fun with Babylon as I do. You deserve guilt-free time off to recharge, and on top of that it’s the holidays for many of us.
@Calsa, here’s a quick and dirty example of using node geometry to generate a ground plane displaced by noise, then instantiate quads randomly across the faces of the ground mesh to be used as a mesh emitter for particles.
What is happening is that the ground plane is created in the first node geometry. This can be any mesh, so you will pass whatever ground mesh you have and it does not have to be node geometry. That ground mesh is passed to a second node geometry which is instantiating quads across random faces of the ground mesh. I used quads for speed, but you can pass a single triangle mesh if you want to reduce face count. This second generated mesh is used as the mesh emitter for the particle system. The particle system will choose random vertices to emit particles from.
In this example, I kept the quads small and didn’t align them to the surface normal, but you could do that if you want the particles to hug the ground plane. I also left the quads with an emissive green color so you can see them and understand how the distribution is working. You would obviously cull the emission mesh from the render loop in your final version.
To smooth out the growth action, every time the button is clicked, I create a new particle system and emitter mesh, and at the same time stop the previous particle system. This allows particles currently in the system to die. After a timeout that accounts for the particle life, I dispose of the previous particle system and mesh emitter to clean up the scene.
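In code, that hand-off looks something like this (a sketch; the capacity, texture, and lifetime values are illustrative):

```javascript
// Sketch: start a new system on the new emitter mesh, let the old one drain, then dispose it.
let currentSystem = null;
let currentEmitterMesh = null;

function advanceBlight(newEmitterMesh) {
    const next = new BABYLON.ParticleSystem("blight", 2000, scene);
    next.particleTexture = new BABYLON.Texture("textures/flare.png", scene); // placeholder texture
    next.particleEmitterType = new BABYLON.MeshParticleEmitter(newEmitterMesh);
    next.emitter = newEmitterMesh;
    next.maxLifeTime = 2; // seconds, illustrative
    next.start();

    if (currentSystem) {
        const old = currentSystem;
        const oldMesh = currentEmitterMesh;
        old.stop(); // stop spawning; existing particles live out their lifetime
        setTimeout(() => { old.dispose(); oldMesh.dispose(); }, old.maxLifeTime * 1000);
    }
    currentSystem = next;
    currentEmitterMesh = newEmitterMesh;
}
```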
The amount of ground that is covered with emitter mesh is determined by the texture passed to the node geometry. With each click of the button, I offset the texture UVs and generate a new mesh. When the node geometry graph picks a position to instantiate, it checks whether the red channel of the texture at that UV coordinate is less than 0.5; if so, a quad is instantiated, otherwise nothing is created. This allows us to control where the emitter mesh is created.
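The JS side of that per-click step is small. A sketch, assuming the second node geometry exposes a Vector2 input block named "uvOffset" that is added to the sampled UV (the red-channel comparison itself lives inside the graph), and reusing the advanceBlight hand-off from the previous snippet:

```javascript
// Sketch: nudge the control texture's UV offset and regenerate the emitter mesh on each click.
// nodeGeo is assumed to be the second node geometry from the PG, already loaded and parsed.
const offset = new BABYLON.Vector2(0, 0);

button.onPointerUpObservable.add(() => {                  // assumed Babylon GUI button
    offset.x += 0.05;                                      // creep the mask a little further (illustrative step)
    nodeGeo.getBlockByName("uvOffset").value = offset;     // assumed GeometryInputBlock named "uvOffset"
    nodeGeo.build();
    const newEmitterMesh = nodeGeo.createMesh("emitterMesh", scene);
    advanceBlight(newEmitterMesh);                         // hand-off from the previous sketch
});
```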
I hope this helps explain a potential direction for you, but please feel free to ping back with questions.