I want to build a complex character made of sprites: a body with nodes attached to it, such as arms, a head, and legs.
Is there a way to do this in native Babylon.js?
If not, what would be the best way to implement it?
I’m considering three approaches:
1. Implement a node system for sprites, updating each node’s transform frame by frame according to the body transform. Problem: it can hurt performance if there are many nodes on screen.
2. Use the physics engine to attach nodes to the body. Problem: I’d be forcing the user to use the physics engine.
3. Associate the sprite body with a rectangle mesh and attach rectangle mesh nodes to the body; each rectangle mesh would contain a sprite. Problem: it can lead to complex scene compositions.
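The first approach (a node system updated from the body transform) can be sketched roughly like this. This is a minimal illustration in plain JavaScript with hypothetical names, not Babylon.js API: in a real scene each `SpriteNode` would wrap a `BABYLON.Sprite` and copy the computed values into its position and angle every frame.

```javascript
// Sketch of option 1: a parent/child transform hierarchy for sprites.
// Plain objects stand in for Babylon.js sprites.
class SpriteNode {
  constructor(offsetX, offsetY) {
    this.offsetX = offsetX; // offset relative to the parent, in local space
    this.offsetY = offsetY;
    this.x = 0;             // computed world position
    this.y = 0;
    this.rotation = 0;      // computed world rotation (radians)
    this.children = [];
  }

  attach(child) {
    this.children.push(child);
    return child;
  }

  // Propagate the body transform down to every attached node, once per frame.
  update(parentX = 0, parentY = 0, parentRot = 0) {
    const cos = Math.cos(parentRot);
    const sin = Math.sin(parentRot);
    this.x = parentX + this.offsetX * cos - this.offsetY * sin;
    this.y = parentY + this.offsetX * sin + this.offsetY * cos;
    this.rotation = parentRot; // nodes simply inherit the body's rotation here
    for (const child of this.children) {
      child.update(this.x, this.y, this.rotation);
    }
  }
}

// Usage: a body at (10, 5) with an arm attached one unit to its right.
const body = new SpriteNode(10, 5);
const arm = body.attach(new SpriteNode(1, 0));
body.update();                  // no rotation: arm lands at (11, 5)
console.log(arm.x, arm.y);      // prints: 11 5
body.update(0, 0, Math.PI / 2); // rotating the root rotates the whole hierarchy
```

The performance concern is exactly this recursive `update` call running per node per frame, which is why batching or pre-baked animation may be preferable for large node counts.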
Another method would be to pre-calculate the sprite movements and later retrieve them from a single graphic and simply exchange the images. Like in this example from @Pryme8
Sorry, I’m not sure if I’m wrong, but in that example all graphics seem to be individual sprites.
Where is the composition of an actor by more than one sprite?
If you’re using pixel art, make sure to set the textures to use the “nearest” sampling mode, so that it doesn’t blur the pixels.
This setup assumes that the opacity in each texture is either fully transparent or fully opaque, because the “greater than” node checks each pixel in the layer for non-zero alpha: if the pixel is non-transparent, it is drawn on top of the existing pixels.
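The per-pixel logic the “greater than” node performs can be sketched in plain JavaScript (the node material does this on the GPU; the threshold value here is an assumption):

```javascript
// Sketch of the layering logic: layers are evaluated bottom to top, and a
// layer's pixel replaces what is below it only when its alpha passes the test.
const ALPHA_THRESHOLD = 0.5; // assumed cutoff between transparent and opaque

function compositePixel(layers) {
  let result = { r: 0, g: 0, b: 0, a: 0 };
  for (const pixel of layers) {
    if (pixel.a > ALPHA_THRESHOLD) {
      result = pixel; // non-transparent pixel: draw it on top
    }
  }
  return result;
}

// Usage: a red base layer under an overlay whose pixel is transparent here,
// so the base pixel shows through.
const base = { r: 1, g: 0, b: 0, a: 1 };
const overlay = { r: 0, g: 1, b: 0, a: 0 };
console.log(compositePixel([base, overlay])); // the red base pixel wins
```

This is also why semi-transparent art breaks the setup: any alpha between the two extremes gets snapped to one side of the threshold rather than blended.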
Depending on the art style you’re going for, you could also add support for different damage states by simply adding an overlay texture on top of each layer, which you can have appear only over the transparent pixels in the layer.
Finally, if you want the character to flash white after taking damage, and other effects like that, it’s very simple to add.
I have not looked into how sprites work in Babylon.js, but I know that in other 3D renderers sprites are just textures rendered to quads. So I would assume the node material also works for sprites, but if someone could correct me on that, that would be great.
As for using spritesheets, that is definitely possible as well.
In the graph, you can see a UV node. This node outputs a Vector2, ranging from 0 to 1 in both X and Y. So to use a spritesheet, you just need to multiply the UV value by the sprite’s width and height (as fractions of the sheet), then offset it by the bottom-left corner of the sprite in the sheet.
Then to animate it at runtime, you just need to update 2 Vector2 Input blocks, one controlling the bottom left corner position, the other controlling the size.
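The UV remapping described above can be sketched as a small helper. The names here are illustrative, not Babylon.js API; in the node material, `offset` and `size` would be the two Vector2 Input blocks you update at runtime:

```javascript
// Remap a quad's 0-1 UV into the sub-rectangle of one sprite in a sheet.
// offset = bottom-left corner of the sprite, in normalized sheet space (0-1)
// size   = sprite width/height as a fraction of the whole sheet
function spriteUV(u, v, offset, size) {
  return {
    u: offset.x + u * size.x, // scale first, then shift into the cell
    v: offset.y + v * size.y,
  };
}

// Usage: a 4x4 sheet, so each cell is 0.25 wide and tall. Animating is just
// pointing the offset at a different cell each frame.
const cellSize = { x: 0.25, y: 0.25 };
const frame2 = { x: 0.5, y: 0.0 }; // third cell in the bottom row
console.log(spriteUV(0, 0, frame2, cellSize)); // { u: 0.5, v: 0 }
console.log(spriteUV(1, 1, frame2, cellSize)); // { u: 0.75, v: 0.25 }
```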
For tutorials on how to create a node material, you can also look up videos on shader graph for Unity, as many of the same nodes are available in both.
If that’s possible and I can use animated textures in the same way as sprites, I will go with that option for the engine. The important thing here is being able to build it in code; the node material editor would be a secondary option.
@sebavan What do you think about using textures and node materials instead of sprites for the 2D game engine side? @Evgeni_Popov Would using textures avoid the need to rearrange sprite depths (texture depths, in this case) every time the camera distance changes?