Are there any other solutions for achieving a decal effect in Babylon?

I could only find two solutions for achieving a decal effect in the Babylon documentation.

  1. By Decal: Babylon.js docs

Example : https://playground.babylonjs.com/#N10DXG#17

However, the result of this approach is not ideal. The texture is projected onto the object, but it becomes distorted on certain curved surfaces, as shown in the image below:

  2. By Dynamic Texture: Babylon.js docs

Example: https://playground.babylonjs.com/#XMEL56#7

This is the effect that I need. However, it is limited by the model's UV layout, and it requires creating a NodeMaterial setup in advance. It also becomes more complex when dealing with multiple decal patterns.

Ideally, I want to repeat a pattern on a clothing model and allow the user to freely apply a pattern of any size at specific positions on the model. Example in the image below:

So, is there any other solution available to easily achieve the above requirements? Is it possible to achieve them by implementing the following steps?

Step 1: obtain the position/index/normal data of the mesh by pointing the mouse at the centre of the pattern
Step 2: create a new mesh from this data and place it at the mouse cursor's position
Step 3: apply the decal pattern onto this new mesh

This may be because the texture is projected along the viewing direction. Try using the normal of the mesh at the projection point instead.

cc @PatrickRyan in case he has some advice for you.


@Dino_Fung, to me it seems like decals are not ideal for creating tiling textures for garments. This is mostly due to the fact that you can utilize the UV unwrap of the garment to align textures as you would when cutting the pattern out of actual cloth. When doing a decal projection, even into texture space, you are creating a planar projection which ignores the UV space of the mesh. This approach seems more suited to dynamic textures to create your tiling texture. If you unwrap your garment like a clothing pattern you can lay the pieces in UV space to correctly orient the pattern as you want. I would create a separate UV set for this purpose and not worry about overlapping UV islands because this is specifically about laying the tiling base color texture. If you need things like ambient occlusion, use a second UV set that has every island atlased as normal with correct padding.

To create the texture, you can simply grab the UV coordinates from the picking info to locate the place the user picked and then iterate through the texture offset from the original position to create your tiling pattern. You can redraw your texture every frame to allow your user to click and drag the image on the mesh to simulate dragging the pattern around. This was the approach used for the skateboard customizer on the dynamic texture page you linked above.

I hope this helps reframe the issue for you, but please feel free to ping back with questions.


Thank you for your suggestion. I am actually implementing it with Dynamic Textures at the moment, but I hope that my expected approach (quoted below) can be supported in Babylon in the future.

> Step 1: obtain the position/index/normal data of the mesh by pointing the mouse at the centre of the pattern
> Step 2: create a new mesh from this data and place it at the mouse cursor's position
> Step 3: apply the decal pattern onto this new mesh

@Dino_Fung, I guess I am not understanding the steps of your expected approach.

  1. You say obtain data of positions/indices/normals etc. by pointing the cursor at the center of the pattern. What is the pattern in this instance? The pattern - to me - seems to be a texture that you need to repeat. What mesh data are you referring to here? Do you mean that you click on the shirt mesh and gather the mesh data at that pick point?
  2. What is the new mesh you want to create? Is it a clone of the shirt mesh? Is it a section of the mesh? Why are you duplicating the mesh and placing it at the cursor position? If you are duplicating the shirt, are you placing it in the exact same position as the original? If so, what is important about the cursor position? Or are you talking about a projection mesh for the decal? If that’s the case, you can simply use the surface normal from the pick position to orient the projection mesh to ensure that the planar projection is as perpendicular as possible. Of course, that also assumes there are no drastic changes in surface normal near your pick position, which would then fall victim to typical issues surrounding planar projection.
  3. You mention applying the decal pattern onto the new mesh, which makes me think you are talking about a duplicate of the shirt and not a projection mesh. This is where I am having trouble following your desired flow. What do you mean here?

If there is a user scenario here that we can’t achieve with our current feature set, I would love to dig into it to see if there is a need we can fill. But I am not understanding the desired flow enough to see if this is a missing feature or if this is just an issue of chaining our current features together to achieve this flow from the user’s perspective. Right now, I can’t see where there is something in any of these flows that we don’t support in some way. Coming up with the right approach is always the heart of the problem, but I think from what your end goal appears to be that we are not missing any features for the individual steps.


Thank you for your response, and sorry for any confusion caused. Please allow me to provide a revised version.

In this case, I want to create a textured design that looks like embroidery with thickness.

As shown in the image below, when I want to add a design to the red-framed area of the clothing, I can drag the design onto the clothing.

At this point, I want to obtain the positions, indices, normals, and other data based on the coverage area of the design on the clothing.

Using this data, I can generate a new mesh and use this design as the texture of the new Mesh.

Finally, I can place the new Mesh at the position where the mouse is released.

In this way, I hope to obtain a textured design that looks like embroidery with thickness.

I hope this can express my issue clearly and help you understand the problem I am facing. Thank you.

That’s exactly what a mesh decal does: it creates a new mesh by extracting the geometry data from the mesh the decal is projected onto.

You can see it in this PG: when you click on the cat, a new decal is created and it is available in the node list of the inspector.


@Dino_Fung, for something like embroidery, I think the best path is to still use texture for a decal or a dynamic texture, but in this case you will want to create both a base color texture AND a normal texture. The reason for the normal texture would be to give you some reaction to lighting which implies the embroidery stitches are proud of the textile surface.

The other options would be to displace the shirt mesh in the shape of the embroidery by adding a displacement texture, or to add a mesh with thickness representing the embroidery on top of the shirt mesh. Displacement means you would need a mesh so high resolution, to accommodate displacement in any location, that the perf would be terrible. Otherwise, you would get uneven displacement around complex embroidered shapes, so this option really does not work for your use. If there is only a very small patch of the shirt where the embroidery could be positioned, you may be able to get away with extracting that area and making it very high resolution. Still, this seems like an unnecessary cost.

If you had a mesh of the embroidery with thickness already built in, you could ray cast to the shirt mesh and position at the pick point oriented with the surface normal. However, since the shirt is not flat, you will likely have vertices from the added mesh that do not touch the shirt mesh. You would then need a projection step to move each vertex a specific amount to feel like the mesh conforms to the shirt. For dragging the embroidery around the shirt, you would likely need to disregard this projection for perf considerations, but you would need to reproject the mesh to conform with every move of the embroidery mesh.

Both of those seem to be complex solutions that do not offer great perf without major concessions. The only downside of using a normal texture as a decal or in a dynamic texture would be that if you rotate your mesh so the embroidery sits on a glancing angle to the shirt, you won’t see any disruption of the shirt silhouette. But that is pretty common in real-time graphics where we are baking detail to a normal texture. The light playing off the surface of the embroidery reinforced by a normal texture will likely be enough to sell the concept to your users.

In this case, I would still follow the approach from above, but you would be modifying two textures - base color and normal - at the same time. Hope this helps!


Hi Evgeni-Popov, thank you for your suggestion. However, the pattern is still distorted when using a new mesh. Example in the image below:


It seems that using a texture is the only way to solve this problem perfectly, but that approach depends on the model’s existing UV mapping.

Hi @PatrickRyan, thanks, your suggestion is really good, but the scenario we are facing is relatively complex: we are dealing not only with models that we have designed ourselves, but also with models uploaded by anyone.

We hope to build a product that enables anyone to apply their desired pattern (embroidery) to their own model. Ideally, we can find a solution that does not depend on the original model’s UV mapping.