I've started using Babylon.js, but I don't know much about bones yet.
I want to simulate the brush I use for painting.
In this case, I would like to put bones in the bristles of the brush so that they bend when they come into contact with a flat surface. What is the way to do this with Babylon.js?
Does anyone know of related tutorials or references?
I want to make a brush model like the one in this presentation.
Hmm? Have you tried using a physics impostor soft body for the brush bristles? Link to soft body documentation. You could then attach a movement constraint (a physics joint, in Babylon terms) between the impostor and the handle of the brush. Also, if you want the brush strands to interact individually, you can use a rope impostor and attach a rope movement constraint to each individual strand.
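To give a feel for what the physics engine would be doing for you, here is a toy model of a single bristle bending against the ground. This is my own simplification for illustration, not Babylon.js's actual impostor API: a straight bristle hangs from the handle tip, and once the handle gets closer to the ground than the bristle's length, the excess length lies flat and the tip slides sideways.

```javascript
// Toy model of one bristle: a straight strand of length `length` hangs from
// the handle at (handleX, handleY). If the handle is closer to the ground
// plane (y = groundY) than the bristle length, the bristle cannot stay
// vertical: the bent part lies flat and the tip moves sideways, preserving
// the bristle's total length. Returns the tip position [x, y].
function bristleTip(handleX, handleY, length, groundY = 0) {
  const free = handleY - groundY; // vertical room below the handle
  if (free >= length) {
    return [handleX, handleY - length]; // no contact: hangs straight down
  }
  // In contact: the excess length lies flat along the ground.
  const flat = length - free;
  return [handleX + flat, groundY];
}
```

A soft-body or rope impostor generalizes this to many linked points with proper collision response, so you get the bend "for free" instead of deriving it by hand.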
Thank you for your advice.
I didn't know whether it would be better to calculate the brush movement on my own or whether there was another good way. I feel like I've found a clue to the solution.
Apply some colors to the bottoms of the spheres with collision detection against the ground, and mix the colors… it could work!
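For the color-mixing part of this idea, a naive linear RGB blend might be enough to start with (a real paint simulation might want a pigment model instead; this is just a sketch):

```javascript
// Naive color mixing for paint deposited at a collision point:
// linearly blend the incoming brush color with the color already on the canvas.
// `existing` and `incoming` are [r, g, b] arrays in 0-255;
// `amount` in [0, 1]: 0 keeps the existing color, 1 replaces it entirely.
function mixColors(existing, incoming, amount) {
  return existing.map((c, i) => Math.round(c + (incoming[i] - c) * amount));
}
```

Each sphere-ground collision would call this once per affected texel, with `amount` driven by pressure or contact time.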
That’s a good idea!
Now I know that I should use a soft body, so I'll start working from there.
It seems feasible to paste a texture onto the canvas according to the shape of the brush, since I have done that with OpenGL ES before.
I will proceed step by step, starting with what can be done, as advised.
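For the stamping part, here is a rough sketch of what "pasting a texture according to the brush shape" could look like on the CPU. The `canvas` and `stamp` objects here are my own stand-ins for ImageData, not a real canvas API; real code would use `ctx.getImageData()`/`putImageData()` or simply `drawImage()`:

```javascript
// Stamp a brush-shaped alpha mask onto a pixel buffer at (x, y).
// `canvas` is { width, height, data } with RGBA bytes (like ImageData);
// `stamp` is { width, height, data } where data holds alpha values in [0, 1].
// Each covered texel is blended toward `color` ([r, g, b]) by the mask alpha.
function stampBrush(canvas, stamp, x, y, color) {
  for (let sy = 0; sy < stamp.height; sy++) {
    for (let sx = 0; sx < stamp.width; sx++) {
      const cx = x + sx, cy = y + sy;
      if (cx < 0 || cy < 0 || cx >= canvas.width || cy >= canvas.height) continue;
      const a = stamp.data[sy * stamp.width + sx]; // mask alpha in [0, 1]
      const i = (cy * canvas.width + cx) * 4;
      for (let ch = 0; ch < 3; ch++) {
        canvas.data[i + ch] = Math.round(canvas.data[i + ch] * (1 - a) + color[ch] * a);
      }
      canvas.data[i + 3] = Math.max(canvas.data[i + 3], Math.round(a * 255));
    }
  }
}
```

With the real 2D canvas API, `drawImage()` with `globalCompositeOperation` does the same job in one call; this just shows the per-texel blend.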
Thank you for sharing the link!
I don't know if it's Babylon.js, but zack was making something amazing!
Thanks! This is using THREE.js rather than Babylon.js, but the volumetrics system uses plain WebGL at its core, so it should be possible to bring it into Babylon.js. If you're interested, here is an overview of what's going on:
- Calculate position of brush in object space of the mesh that you want to paint
- Create a new WebGL canvas that will contain the “paint”. The larger the canvas, the better the resolution you’ll get for detecting intersections.
- Set up a vertex shader for the canvas. The shader should get the mesh’s vertex coordinates and UV coordinates as inputs. It should use the UV coordinates (rather than vertex coordinates) as the output to gl_Position; that way the resulting canvas can be applied as a texture to the model and match the UVs.
- The mesh’s vertex position should be passed through the vertex shader as a varying vec3 so it is accessible to the fragment shader.
- The fragment shader should get the brush’s position in the mesh’s object space as an input uniform (along with any other brush information, like color).
- The fragment shader calculates a signed distance field for the brush shape. If the fragment’s distance function is less than 0 (inside the shape) for the varying input position offset by the brush position, then it should result in being colored in, otherwise it should be transparent.
- The resulting canvas can then be composited with another canvas (with drawImage) to accumulate the changes over time
- Finally, the resulting canvas can then be applied as a texture to the mesh.
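To make the SDF step concrete, here is the core test from the fragment shader sketched in plain JavaScript. The spherical brush shape and the RGBA convention are my assumptions for the sketch; the real shader would evaluate this per fragment in GLSL:

```javascript
// Signed distance from point p ([x, y, z]) to a sphere of the given radius
// centered at `center`. Negative means the point is inside the brush shape.
function sphereSDF(p, center, radius) {
  const dx = p[0] - center[0];
  const dy = p[1] - center[1];
  const dz = p[2] - center[2];
  return Math.sqrt(dx * dx + dy * dy + dz * dz) - radius;
}

// Mimics the fragment shader: given the varying object-space position and the
// brush uniforms, return an RGBA texel. Inside the brush it is painted opaque;
// outside it is fully transparent, so compositing leaves the canvas untouched.
function shadeTexel(objectSpacePos, brushPos, brushRadius, brushColor) {
  if (sphereSDF(objectSpacePos, brushPos, brushRadius) < 0) {
    return [...brushColor, 255]; // inside the SDF: colored in
  }
  return [0, 0, 0, 0]; // outside: transparent
}
```

Because the vertex shader wrote UV coordinates to gl_Position, each texel shaded this way already sits at the right place in the mesh's texture, so the compositing and texture-apply steps need no further mapping.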
The relevant source files are here:
Best of luck creating your brush! I hope it comes out well!