I’ve been experimenting with the ParticleSystem for fire visualization, and I’m wondering if it is possible to influence particle emission with an image texture. An example of what I’m trying to achieve can be seen in this YouTube video.
I’ve considered creating my own particle world where particles are monitored through an array, but I worry about performance. Thoughts?
Thanks for your time!
Ping @PatrickRyan for the particles
@3D_Wave_Design, welcome to the community! I’m interested to hear more about what you want to do with making a particle world, because there is a lot left to interpretation. If you are thinking of a 3D world of meshes, we do not have mesh particles in the engine yet, so growing your own forest with particles controlled by the texture on a mesh isn’t possible. We also haven’t implemented emission from a custom mesh, as all of our emitters are known primitive shapes.
We want to implement both of these features in the future, but we do have concerns about them while many devices are still limited to WebGL 1.0 and thus unable to take advantage of GPU particles. We are also limited to a single thread on the web, which constrains some of the features we want to add purely based on the ability to use them while still doing other things in the scene.
@Deltakosh is trying to lead the charge toward a solution for the single-threaded web trap, and @sebavan has been following closely on the heels of Google’s efforts on WebGPU to integrate it into Babylon.js. You can see his low-poly forest populated from a texture in this thread about the Google I/O presentation. That one isn’t done with a particle system but rather thousands of individual meshes.
In the meantime, if you want to spawn billboard particles based on a texture, you will need to do some extra work: read the texture, determine where particles in the emitter are allowed to spawn, and then kill any spawned particles that fall outside those limits. I hope this explanation helps a little, and I’d be interested to hear more about what you are trying to do so we can identify where we may be able to add some support to make the job easier.
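The read-the-texture approach above could be sketched roughly like this. This is a minimal, untested sketch of only the pure logic: it assumes you have already pulled RGBA pixel data out of your mask image (for example via a hidden canvas and `getImageData`), and the helper names are mine, not Babylon.js API. The commented wiring uses `ParticleSystem.startPositionFunction`, which lets you override where each particle spawns.

```javascript
// Collect normalized (u, v) coordinates for every mask pixel above a threshold,
// so particles only ever spawn where the mask is bright.
function buildSpawnPoints(pixels, width, height, threshold) {
  const points = [];
  for (let y = 0; y < height; y++) {
    for (let x = 0; x < width; x++) {
      const brightness = pixels[(y * width + x) * 4]; // red channel of RGBA data
      if (brightness > threshold) {
        points.push({ u: x / width, v: y / height });
      }
    }
  }
  return points;
}

// Pick a random allowed point and map it onto an emitter plane of a given size,
// centered on the origin.
function randomSpawnPosition(points, planeWidth, planeDepth) {
  const p = points[Math.floor(Math.random() * points.length)];
  return {
    x: (p.u - 0.5) * planeWidth,
    y: 0,
    z: (p.v - 0.5) * planeDepth,
  };
}

// In Babylon.js this could be wired up roughly like (sketch, not verified):
// particleSystem.startPositionFunction = (worldMatrix, positionToUpdate) => {
//   const s = randomSpawnPosition(points, 10, 10);
//   BABYLON.Vector3.TransformCoordinatesFromFloatsToRef(
//     s.x, s.y, s.z, worldMatrix, positionToUpdate);
// };
```

Precomputing the allowed points once avoids rejection-sampling (spawn then kill) on every frame, which should be cheaper for dense masks.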
@PatrickRyan I think I understand the limitations of the current ParticleSystem based on what you said. I was going through Babylon.js’ documentation and noticed properties such as ‘layerMask’ and ‘textureMask’, and thought one of them might change certain particles’ alpha within the emitBox or something along those lines, if it was referencing an image texture. Food for thought now that I’m rambling: I wonder if a solution like that could borrow from what you have done with the procedural noise texture influence on particles, which is a greyscale image.
If I have any breakthroughs, I’ll make sure to post to this thread. My goal is to visualize a growing wildfire, which I was hoping could leverage an image sequence for particle positioning, or masking.
@3D_Wave_Design, I understand the confusion with the property names. Layer mask is used to group cameras so that each renders only what is in its layer, and texture mask is used to filter out colors within a particle’s texture. We would need to add features to the emitter to achieve what you want to do with a texture.
Knowing more of what you are trying to do, there is something else you can try. If you have an animated sprite for your fire, you could slowly move your emitter through your scene to “grow” your fire while emitting particles that never die and have no movement on them. That way you can have several billboards stacked to feel like fire burning while the emitter moves away to burn other things.
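The “traveling emitter” idea above could be sketched like this. Assuming a simple straight burn path (the helper function is mine, not engine API): particles get an effectively infinite lifetime and zero emit power so they pile up as stacked billboards, while the emitter point itself creeps forward to grow the fire.

```javascript
// Linearly interpolate the emitter along a straight path over `durationMs`,
// clamping at the end so the emitter stops once the path is fully burned.
function emitterPositionAt(start, end, durationMs, elapsedMs) {
  const t = Math.min(elapsedMs / durationMs, 1);
  return {
    x: start.x + (end.x - start.x) * t,
    y: start.y + (end.y - start.y) * t,
    z: start.z + (end.z - start.z) * t,
  };
}

// Babylon.js wiring could look roughly like this (sketch, not verified):
// particleSystem.minLifeTime = particleSystem.maxLifeTime = 1e6; // effectively immortal
// particleSystem.minEmitPower = particleSystem.maxEmitPower = 0; // no movement
// scene.onBeforeRenderObservable.add(() => {
//   const p = emitterPositionAt({x: 0, y: 0, z: 0}, {x: 20, y: 0, z: 0}, 30000, elapsed);
//   particleSystem.emitter = new BABYLON.Vector3(p.x, p.y, p.z);
// });
```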
We also have the ability to create sub emitters, which spawn from particles in a system. You could utilize this to create a few fire emitters that grow organically, moving in random directions. You could further use procedural textures to determine which direction the emitters travel, which would give you a fully organic solution for growing your fire.
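The noise-driven direction idea above could be sketched as follows. This assumes you can sample your procedural noise texture down to a value in [0, 1]; the step helper is hypothetical glue, not engine API, and the commented wiring uses Babylon.js’ SubEmitter/subEmitters API in rough outline only.

```javascript
// Map a grayscale noise sample (0..1) to a heading angle and step the emitter
// across the XZ ground plane, so travel direction comes from the noise texture.
function stepEmitter(position, noiseValue, stepSize) {
  const angle = noiseValue * 2 * Math.PI; // 0..1 noise -> 0..2π heading
  return {
    x: position.x + Math.cos(angle) * stepSize,
    y: position.y,
    z: position.z + Math.sin(angle) * stepSize,
  };
}

// Babylon.js wiring could look roughly like this (sketch, not verified):
// const sub = new BABYLON.SubEmitter(smallFireSystem);
// mainFireSystem.subEmitters = [[sub]];
// ...then each frame, sample the noise texture and call stepEmitter on the
// parent system's emitter position to wander the fire organically.
```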
Hope this helps as well.
@PatrickRyan Thanks for the detailed response! I’ll look into sub emitters in the interim.