I’m developing a 2D environment using plane meshes and textures to simulate sprites. I’ve been searching the docs and it’s not clear to me how to animate the textures.
I want to render frames on a mesh from a spritesheet image.
What would be the proper way to animate a texture attached to a mesh?
Is there a native solution?
Should I do it by switching the UVs manually?
Should I use node materials for the UV updates to improve performance?
Sounds like you are doing a 2.5D project. You can totally adapt some of the sprite materials code to your purposes, or whip up a UV solution fairly easily.
If you set up the UVs of the plane correctly you can just use offsets to jump to the correct frame of your texture. It will take some coordinate remapping. You can use the standard material and write some support methods to adjust the UVs. The alternatives are a custom shader/material or some creative usage of the dynamic texture class.
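For the standard-material route, one simple variant is to scale the texture down to one cell of the sheet and move its offsets per frame. A minimal sketch, assuming a regular grid spritesheet (the `frameColumns`/`frameRows`/`setFrame` names and the `spritesheet.png` path are just for illustration):

```js
// Sketch of the offset idea on a StandardMaterial, assuming a regular grid sheet.
const plane = BABYLON.MeshBuilder.CreatePlane("sprite", { size: 1 }, scene);
const mat = new BABYLON.StandardMaterial("spriteMat", scene);
const tex = new BABYLON.Texture("spritesheet.png", scene);

const frameColumns = 8;
const frameRows = 4;

// Show only one cell of the sheet...
tex.uScale = 1 / frameColumns;
tex.vScale = 1 / frameRows;
mat.diffuseTexture = tex;
plane.material = mat;

// ...and move the offsets to pick a frame.
function setFrame(frameIndex) {
  const col = frameIndex % frameColumns;
  const row = Math.floor(frameIndex / frameColumns);
  tex.uOffset = col / frameColumns;
  // Depending on the sheet layout (and the texture's invertY setting),
  // you may need 1 - (row + 1) / frameRows here instead.
  tex.vOffset = row / frameRows;
}
```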
The sprite map would not be a good example, because it uses pure shader code and texture buffers to control the animations and rendering, which is probably not what you would need to do normally.
I did not fully support instances and other things with it; if I finished up the one PR that converts it to a custom material, it would be easier to work with for that kind of usage.
But yes, for all intents and purposes, he could just do a 1x1 sprite map.
What I would do is draw the quad procedurally, so you know what the UV array is (*). Then it’s easy to update the quad’s UV layout in an update loop using setVerticesData(BABYLON.VertexBuffer.UVKind, uvs).
(*) it could also work with Babylon.js’s built-in MeshBuilder.CreatePlane
In my opinion it’s fairly cost-efficient. Could a custom shader that does the UV computation be better? Maybe, but with this geometry-based approach, if many animated objects share a sprite sheet, they can also share a Material, which is interesting.
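Not the exact code from that post, but a rough sketch of the geometry-update idea, assuming an 8x4 grid sheet (`frameToUVs` is an illustrative helper, and the UV corner order may need adjusting to match your quad):

```js
// Sketch of the geometry-update approach, assuming an 8x4 grid spritesheet.
const plane = BABYLON.MeshBuilder.CreatePlane("sprite", { size: 1 }, scene);
const cols = 8;
const rows = 4;

// Build the UV array for one frame (illustrative helper).
// Adjust the corner order if your quad's vertex layout differs.
function frameToUVs(frameIndex) {
  const col = frameIndex % cols;
  const row = Math.floor(frameIndex / cols);
  const u0 = col / cols, u1 = (col + 1) / cols;
  const v0 = 1 - (row + 1) / rows, v1 = 1 - row / rows;
  return [u0, v0, u1, v0, u1, v1, u0, v1];
}

let frame = 0;
scene.onBeforeRenderObservable.add(() => {
  // In a real project you would advance the frame from elapsed time,
  // not on every render.
  frame = (frame + 1) % (cols * rows);
  plane.setVerticesData(BABYLON.VertexBuffer.UVKind, frameToUVs(frame), true);
});
```

Creating the UV buffer once as updatable and then calling updateVerticesData with the same kind avoids rebuilding the buffer on every change.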
Hi, using a third-party library is not an option. The only dependency of the project will be Babylon.js; that keeps the bundle lighter and avoids extra maintenance work.
Is it even possible to update the UVs for animations with shaders?
I’ve been playing with the Node Material Editor and, although I can modify the UV parameters, those values are not present in the exported shader txt file.
Maybe by coding the shader myself I’ll find the way, but I have concerns about how to load that code into the sprite at runtime: I have to set some values in the shader text, it needs to be compiled, and I’m not sure that’s the best for performance, because it could cause lag spikes if several sprites are spawned in the same frame. (I don’t know yet; I will have to test.)
After the sprites spawn, using shaders to update the animations should be better than doing it manually on the CPU. There could be tons of sprites in a scene, and even if they are out of the camera’s view their animations should keep advancing.
Just keep a running global time and base the animations off of that, with a “local time offset” so they don’t all play back at the same position. Then you only need to update the ones that are being rendered.
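Something like this, as a sketch (the `timeOffsetMs` field and `currentFrame` helper are made up for illustration):

```js
// Sketch of the "global clock + per-sprite offset" idea; names are illustrative.
const frameDurationMs = 100; // how long each frame is shown
const frameCount = 8;        // frames in the clip

function currentFrame(sprite) {
  // One shared clock for every sprite...
  const globalTimeMs = performance.now();
  // ...plus a per-sprite offset so they don't all play in sync.
  const localTimeMs = globalTimeMs + sprite.timeOffsetMs;
  return Math.floor(localTimeMs / frameDurationMs) % frameCount;
}

// Only sprites that are actually rendered need their UVs (or uniforms) refreshed;
// off-screen ones stay consistent because the frame is derived from the clock.
```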
Here’s a demo where the UVs are generated in the fragment shader, based on the value of an “index” uniform.
It moves the “evaluate the UV” computation from the update loop to the shader, and you no longer have to update the quad geometry at runtime.
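This isn’t the demo’s code, but the idea looks roughly like this with a ShaderMaterial (the “sprite” shader name, the grid size, and the uniform names are assumptions):

```js
// Rough sketch: compute the frame UVs in the fragment shader from an "index" uniform.
BABYLON.Effect.ShadersStore["spriteVertexShader"] = `
  precision highp float;
  attribute vec3 position;
  attribute vec2 uv;
  uniform mat4 worldViewProjection;
  varying vec2 vUV;
  void main() {
    vUV = uv;
    gl_Position = worldViewProjection * vec4(position, 1.0);
  }`;

BABYLON.Effect.ShadersStore["spriteFragmentShader"] = `
  precision highp float;
  varying vec2 vUV;
  uniform sampler2D spriteSheet;
  uniform float frameIndex;
  const float COLS = 8.0;   // assumed sheet layout
  const float ROWS = 4.0;
  void main() {
    float col = mod(frameIndex, COLS);
    float row = floor(frameIndex / COLS);
    vec2 cell = vec2(1.0 / COLS, 1.0 / ROWS);
    // Row 0 at the top of the sheet; flip if your texture is loaded the other way.
    vec2 uv = vec2((col + vUV.x) * cell.x, 1.0 - (row + 1.0 - vUV.y) * cell.y);
    gl_FragColor = texture2D(spriteSheet, uv);
  }`;

const spriteMat = new BABYLON.ShaderMaterial("spriteMat", scene,
  { vertex: "sprite", fragment: "sprite" },
  { attributes: ["position", "uv"],
    uniforms: ["worldViewProjection", "frameIndex"],
    samplers: ["spriteSheet"] });
spriteMat.setTexture("spriteSheet", new BABYLON.Texture("spritesheet.png", scene));
spriteMat.setFloat("frameIndex", 0); // update this per frame instead of the geometry
```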
But if I wanted to display an army of 100 000 sprites, I feel like I’d rather update 100 000 geometries sharing one material (you can merge the meshes with the Solid Particle System, which is very efficient) than have 100 000 non-updated geometries that each have their own material and therefore can’t be merged.
If you only need a small squad of 100 sprites, it makes no difference at all performance-wise. IMO the geometry-update approach is way easier to implement and understand.
Thanks! That’s pretty cool. I will implement it in the engine.
I don’t think users will create 100k sprites; the 100-sprite scenario is more realistic. I will use the geometry update with shaders and a material per sprite, which allows users to customize it.
I honestly really really like using the Solid Particle System for rendering sprites. With some tricky setup you can have different systems for different sets of particles that are all very customizable and controllable.
You can do some of the most amazing things with it: render planets and asteroids, do blaster effects, rain, sprites, trees, birds, etc. It’s really one of my favorite systems to work with. Can’t remember the original author off the top of my head, but they killed it with this feature.
I’m pretty busy lately and don’t have much time to keep working on the engine and test so many things; I’m already implementing the method @SvenFrankson proposed.
Regarding complexity, performance, and sprite customization, what do you think are the pros and cons of using the SPS instead of a Mesh/ShaderMaterial?
With the SPS you will be responsible for the shader bindings, tracking the entities, the update loop, etc.
It basically places all the control at your fingertips; the SPS is just the link that generates the mesh data needed to render the quads you put the sprites on.
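For example, a rough SPS sketch for the sprite case, assuming a shared spritesheet material like the ones above and a hypothetical currentFrame() helper (both are assumptions, not SPS features):

```js
// Sketch: many sprite quads in one Solid Particle System sharing one material.
const sps = new BABYLON.SolidParticleSystem("sprites", scene, { updatable: true });
const quad = BABYLON.MeshBuilder.CreatePlane("quad", { size: 1 }, scene);
sps.addShape(quad, 1000); // 1000 sprite quads in a single mesh
quad.dispose();

const mesh = sps.buildMesh();
mesh.material = sharedSpriteMaterial; // assumed shared spritesheet material

const cols = 8, rows = 4;
sps.updateParticle = (particle) => {
  const frame = currentFrame(particle); // hypothetical helper (global clock + offset)
  const col = frame % cols;
  const row = Math.floor(frame / cols);
  // Per-particle UV rectangle (u0, v0, u1, v1) into the shared sheet.
  particle.uvs.x = col / cols;
  particle.uvs.y = 1 - (row + 1) / rows;
  particle.uvs.z = (col + 1) / cols;
  particle.uvs.w = 1 - row / rows;
  return particle;
};

scene.onBeforeRenderObservable.add(() => sps.setParticles());
```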
Now that WebGPU is running, though, there is for sure a better solution. It just has not been whipped up for us yet.
I’m trying to make all the “play animation” logic rely on the shader (frame start, frame end, delay in ms, and loop)… let’s see if I get lucky; GLSL is totally new for me. I feel that’s the best solution.
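Something in this direction, sketched out (the uniform names and grid size are assumptions; the values would be fed from JS with ShaderMaterial.setFloat):

```glsl
// Sketch of a fragment shader that owns the whole play-animation logic.
precision highp float;
varying vec2 vUV;
uniform sampler2D spriteSheet;
uniform float timeMs;       // global clock, updated once per frame from JS
uniform float frameStart;   // first frame of the clip
uniform float frameEnd;     // last frame of the clip
uniform float frameDelayMs; // how long each frame is shown
uniform float loopAnim;     // 1.0 = loop, 0.0 = clamp on the last frame
const float COLS = 8.0;     // assumed sheet layout
const float ROWS = 4.0;

void main() {
  // You would probably also pass a per-sprite start time so clips don't all begin at t = 0.
  float clipLength = frameEnd - frameStart + 1.0;
  float elapsed = floor(timeMs / frameDelayMs);
  float local = loopAnim > 0.5 ? mod(elapsed, clipLength) : min(elapsed, clipLength - 1.0);
  float frame = frameStart + local;

  float col = mod(frame, COLS);
  float row = floor(frame / COLS);
  vec2 cell = vec2(1.0 / COLS, 1.0 / ROWS);
  vec2 uv = vec2((col + vUV.x) * cell.x, 1.0 - (row + 1.0 - vUV.y) * cell.y);
  gl_FragColor = texture2D(spriteSheet, uv);
}
```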
After that, I have to make the shader material completely ignore scene lights; sprites shouldn’t be affected by scene lighting.
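For what it’s worth, a hand-written ShaderMaterial like the sketch above already ignores scene lights unless you add lighting code to it. If a StandardMaterial ends up being used somewhere, one common unlit recipe is roughly this (a sketch, not from this thread; spriteTexture is assumed):

```js
// Sketch: making a StandardMaterial unlit for sprites.
const mat = new BABYLON.StandardMaterial("unlitSprite", scene);
mat.disableLighting = true;          // ignore scene lights entirely
mat.emissiveTexture = spriteTexture; // draw the sheet at full brightness
// Transparency (hasAlpha / opacityTexture) is handled separately from lighting.
```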