There’s an expectation about the SPS (Solid Particle System) that is as old as the SPS itself: the ability to add solid particles to the system once it has been built.
Well, while waiting for the team to merge the PR, the feature is now on stage and it’s called the “expandable SPS”.
Usage:
// create an expandable SPS with 500 particles
var sps = new BABYLON.SolidParticleSystem('sps', scene, {expandable: true});
sps.addShape(model, 500);
sps.buildMesh();
// ... further in the code
// add 60 more particles
sps.addShape(otherModel, 30);
sps.addShape(thirdModel, 30);
sps.buildMesh(); // again, that simple
// and you can do it as many times as you need
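Note that buildMesh() only rebuilds the underlying mesh; the usual SPS workflow still applies afterwards. A minimal sketch, assuming the standard updateParticle()/setParticles() pattern (the placement logic below is purely illustrative, not part of the feature):
// position every particle, including the freshly added ones
sps.updateParticle = function(particle) {
    particle.position.x = particle.idx * 2; // hypothetical placement: spread the particles along x
    return particle;
};
sps.setParticles(); // applies updateParticle to each particle and updates the mesh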
Under the hood, the differences with a fixed SPS are:
the dynamic array storing the just-created particles is never freed (so more memory is used)
a new VertexData, hence a new VertexBuffer, is created each time sps.buildMesh() is called, so the garbage collector may trigger whenever it wants afterwards.
Particles can also be removed from an expandable SPS with removeParticles():
var sps = new BABYLON.SolidParticleSystem("sps", scene, {expandable: true});
var model = BABYLON.MeshBuilder.CreateBox("m", {}, scene);
sps.addShape(model, 1000);
sps.buildMesh();
// ... further in the code
sps.removeParticles(700, 999); // removes the last 300 particles
sps.removeParticles(0, 9); // then removes the first 10
sps.buildMesh(); // just like when adding particles after the SPS creation,
// the call to buildMesh() is required
PG (once the build is merged): Babylon.js Playground
Each click removes the 100 first and the 2 last particles from the SPS until fewer than 102 particles remain.
Open the console to check the remaining number.
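For reference, a click handler doing what this PG does might look like the sketch below (this is an assumption, not the actual playground code; it calls buildMesh() after each removal so the indices and counters stay consistent):
scene.onPointerDown = function() {
    if (sps.nbParticles < 102) {
        return;                              // not enough particles left
    }
    sps.removeParticles(0, 99);              // remove the 100 first particles
    sps.buildMesh();
    var last = sps.nbParticles - 1;
    sps.removeParticles(last - 1, last);     // then the 2 last ones
    sps.buildMesh();
    console.log(sps.nbParticles + " particles remaining");
};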
There are still two features to add to the expandable SPS and then it will be done (before writing the doc, my favorite part).
I think it might then be usable to build some simple vortex engine or some dynamic maze generator.
the user can now generate particles and store them aside for further use
formerly stored or removed particles can be inserted back into the SPS on demand
// let's create an expandable SPS
var stock = [];
var sps = new BABYLON.SolidParticleSystem("sps", scene, {expandable: true});
var boxes = BABYLON.MeshBuilder.CreateBox("b", {}, scene);      // model for the boxes
var sphere = BABYLON.MeshBuilder.CreateSphere("s", {}, scene);  // model for the spheres
sps.addShape(boxes, 1000); // let's add 1000 boxes
sps.addShape(sphere, 500, {storage: stock}); // let's store the 500 spheres aside in the stock array
sps.buildMesh();
// further in the code, let's remove 500 boxes from the SPS
var removed = sps.removeParticles(0, 499);
sps.buildMesh();
// further on, let's restore the spheres and the removed boxes
sps.insertParticlesFromArray(stock);
sps.insertParticlesFromArray(removed);
sps.buildMesh();
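If the counts above are right, a quick sanity check after the last buildMesh() could look like this (a sketch assuming the behavior described above, where {storage} keeps the particles out of the mesh until they are inserted):
console.log(stock.length);    // 500: the spheres stored aside at creation time
console.log(sps.nbParticles); // 1500: 500 remaining boxes + 500 re-inserted boxes + 500 inserted spheres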
Dynamic test: https://playground.babylonjs.com/#X1T859#3
Every 3 frames, a new particle is cloned from a stock array and inserted into the SPS until it reaches 300 particles; then the particles are removed one by one, then re-inserted one by one, and so on…
As you can see on the FPS meter, the removal process consumes more CPU than the addition one, which is expected because a removal is a deletion AND a re-creation (so an insertion) process.
This test shows how the underlying memory allocations and collections behave (quite well on my machine).
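A loop producing this kind of grow/shrink cycle could look roughly like the sketch below (an assumption, not the actual playground code; it moves one particle every 3 frames between the SPS and a side array):
var aside = [];      // removed particles waiting to be re-inserted
var growing = false;
var frame = 0;
scene.registerBeforeRender(function() {
    frame++;
    if (frame % 3) {
        return;                                      // act once every 3 frames only
    }
    if (growing) {
        sps.insertParticlesFromArray([aside.pop()]); // re-insert one previously removed particle
        if (aside.length === 0) { growing = false; } // everything is back: shrink again
    } else {
        // removeParticles() returns the removed particles: keep the last one aside
        aside = aside.concat(sps.removeParticles(sps.nbParticles - 1, sps.nbParticles - 1));
        if (sps.nbParticles <= 2) { growing = true; } // almost empty: grow again
    }
    sps.buildMesh();                                 // a rebuild is required after each change
    sps.setParticles();
});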
It looks like there’s still a tiny bug with the normals in the insertion process… investigating.
[EDIT] The same test using addShape() instead of insertParticlesFromArray() works as well: https://playground.babylonjs.com/#X1T859#4
I’ve just had an idea about another (long-requested) feature to add to the SPS to keep its rank in the friendly competition with CPU/GPU particles and instances.
Not sure that I will be able to achieve it though, but implementation ideas are starting to come to mind. I’ll probably give it a try in the coming weeks.