Scale mesh while keeping accurate UVs

Hello everyone,

To make clear what I am looking for, I’ll describe my desired result: I want to scale objects without stretching the textures. For scaling I don’t use mesh.scaling; instead I move specific vertices around. This technique is important to me because it avoids a stretched mesh. As you can imagine, moving some vertices to another position while keeping the same UVs will lead to distortions in the texture.

My first approach was to adjust the uScale & vScale of the texture, but this can’t work since UVs only have two dimensions: scaling the width could be handled by uScale and the depth by vScale, but what should we do if I adjust the height as well? Or if a face is rotated?
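(To illustrate with the usual playground scene; the texture path and the scale values here are made up:)

    // A texture has exactly two tiling knobs, so at most two axes can be
    // compensated this way.
    const tex = new BABYLON.Texture("textures/grid.png", scene)
    tex.uScale = 2    // could compensate a width scale of 2...
    tex.vScale = 1.5  // ...and a depth scale of 1.5,
    // but there is no third knob for a height scale, and a rotated face
    // samples u and v differently, so per-face compensation fails here.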

My next approach was to unwrap the mesh every time I scale it. A while ago I created a function to unwrap meshes via cube projection, but it calls mesh.convertToFlatShadedMesh(), which creates new indices. I can’t use this because I need to know the indices to be able to use my scale() function. Removing the mesh.convertToFlatShadedMesh() call keeps the indices, but the unwrap isn’t correct anymore.
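(For context, the general idea of cube projection, not my exact playground function, looks roughly like this; it assumes flat-shaded data where every vertex belongs to exactly one face, which is exactly the conversion I want to avoid:)

    const cubeProjectUVs = (positions, normals) => {
        // Pick the dominant axis of each vertex normal and use the two
        // remaining position components as that vertex's UV coordinates.
        const uvs = new Array((positions.length / 3) * 2)
        for (let v = 0; v < positions.length / 3; v++) {
            const x = positions[v * 3], y = positions[v * 3 + 1], z = positions[v * 3 + 2]
            const nx = Math.abs(normals[v * 3]), ny = Math.abs(normals[v * 3 + 1]), nz = Math.abs(normals[v * 3 + 2])
            if (nx >= ny && nx >= nz) { uvs[v * 2] = z; uvs[v * 2 + 1] = y } // X-facing
            else if (ny >= nz)        { uvs[v * 2] = x; uvs[v * 2 + 1] = z } // Y-facing
            else                      { uvs[v * 2] = x; uvs[v * 2 + 1] = y } // Z-facing
        }
        return uvs
    }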

I created a pg so you can see it: https://playground.babylonjs.com/#IR9BL6#25

The result is partly achieved: the box scales and the texture is still not stretched. But unfortunately not for the whole mesh :slightly_frowning_face:

Is it possible to fix this? Can my cube projection function be adjusted so it doesn’t need the flat-shaded mesh conversion?

Or is it possible to adjust the UVs according to the adjusted vertex positions? That would be the best solution imo, so I don’t have to create entirely different UVs. It would be nice to keep the initial UVs but adjust them to get a nice unstretched texture on my scaled mesh.

Thanks for your help!

Best

EDIT:

Soooo. After hours of testing I am still not able to get it done. I got rid of a lot of code to keep it clean: I removed the cubeProjection() function so the UV adjustment happens inside scale(), and tried to add some code to it. Unfortunately the result is even worse. :sweat_smile:

https://playground.babylonjs.com/#IR9BL6#34

I am not sure if I am getting closer or not. Is what I want to achieve even possible?


Let’s ask @PatrickRyan how he would approach it?


@samevision, I am not sure I completely understand the limitations of the ultimate problem you are trying to solve, because we are using a cube here. But for a solution with a cube, I would approach it like this:

What we have is a node material where we are passing different scales to the UV texture based on the facing direction of the cube by dotting the surface normal with the cardinal axis directions and taking the absolute value. This will give us a mask for both faces along each axis - X, Y, and Z. We then pass the scale for width, depth, and height from your code into floats in the shader and multiply them by U and V from UV space. This will give us a tile in U and V. The exact method for multiplying your width/depth/height scales by U and V will vary based on the UV unwrap of the mesh, but you can see I was able to do a little hacking to get them to work.

Once you have the three sets of UVs that scale correctly, those UVs are passed to three texture nodes sharing the same image source. The texture will only map correctly to the face along a specific axis, but the masks we made in the step above are now multiplied with each of the textures mapped to a particular plane (xy, xz, and zy), which gives you a correct texture on the appropriate face and black on the other faces. We then add all three versions together to get the final cube.
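In plain JavaScript, the per-pixel logic described above is roughly this (a sketch only; the names and the threshold value are illustrative, not the exact node graph):

    const step = (edge, x) => (x >= edge ? 1 : 0)

    // normal is the unit surface normal; texXY/texXZ/texZY stand in for the
    // colors sampled with the three scaled UV sets (treated as scalars here).
    const blendByAxis = (normal, texXY, texXZ, texZY) => {
        // abs(dot(normal, axis)) is ~1 on faces perpendicular to that axis;
        // step() hardens it into a 0/1 mask covering both faces per axis.
        const maskX = step(0.99, Math.abs(normal.x)) // faces along X, the zy plane
        const maskY = step(0.99, Math.abs(normal.y)) // faces along Y, the xz plane
        const maskZ = step(0.99, Math.abs(normal.z)) // faces along Z, the xy plane
        // Each texture is correct only on its own plane; the masks black out
        // the rest, and adding the three terms assembles the final cube.
        return texXY * maskZ + texXZ * maskY + texZY * maskX
    }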

Note that there are a few hacks in here, like the remap node to get the inverse of the range coming from your code scale (which is likely due to how the face is UV unwrapped), so some work will be needed to set up your asset with UVs that are optimal for this approach.

But I hope this gives you an idea of how we could approach the problem. If this isn’t what you are looking for, please feel free to ping us back. Here is the updated playground. Note that I commented out the line adjusting your UVs in code so that the faces stay mapped to UV space as they originally were, which this method requires.


Hey @PatrickRyan,

thank you very much for your help. This is a whole new approach, interesting! It’s much more difficult to understand than just moving UV coordinates around, but I’ll do my best to follow you.

First I tested how it behaves on models other than a cube, since the cube is just for testing purposes and of course I will use more complex models later. Even though most models will be shaped similarly to a cube, the code should handle slight differences too. The masks for each axis rely on a float you set to almost 1. So faces with a normal that doesn’t match a mask will be black, right?

(1) So I decreased the value to 0.9 to catch these skewed faces. That works until I scale the model. The skewed faces can’t handle the scaling: https://playground.babylonjs.com/#IR9BL6#55

(2) I noticed that the texture is stretched. As you can see, the squares are not 100% symmetrical. I assumed it was caused by the model, but it also happens on a normal cube in NME. Even though the model is correctly unwrapped in Blender, it gets stretched.

(3) Another thing I noticed is that the texture is inverted on my glTF model. Is it because I use a right-handed system? I adjusted some nodes for multiplying the scales and was able to flip the textures :slight_smile: But the skewed faces are still a problem, and the bottom gets stretched. Only the top behaves correctly (it would be nice if the texture didn’t move around that much, but that’s the next step I think): https://playground.babylonjs.com/#IR9BL6#56

(4) Is it possible to translate the code into a function once it’s working? I would like to use the PBRMaterial and just include some code in my scale() function so I won’t need a NodeMaterial. Or am I forced to go with NodeMaterial?

I really want to understand this so I can solve such problems on my own in the future. I also dug further into my initial approach: I was able to merge my scale() and cubeProjection() functions so the model gets unwrapped on scaling. It looks good, but the problem here is that it only works for models that are converted to flat shading: https://playground.babylonjs.com/#IR9BL6#48

I don’t get how it is done when I export a model from Blender. In Blender I can create a cube and scale single UV islands as I want. If I load the model in BJS, the texture looks exactly like it does in Blender. But how is that possible if the model is not converted to flat shading? Why can I manipulate single UV islands in Blender and export them, yet I can’t adjust single UV coordinates in BJS without influencing other UV coordinates unless the model is converted to flat shading?

I really appreciate your help! Please enlighten me :slight_smile:

This method is going to need to be tailored to whatever your final model is. This one works for a cube, but once you start making changes to the model, you will need to make some changes to the shader, particularly where the masks are generated. The issue here is that if you are moving the vertices along the three axes independently, you will need to pass different tiling values to textures along UV space.

  1. The reason the shader breaks when you change the sides into a more pyramid-like shape is that the dots with the cardinal axes are no longer 1 and -1. The step node is there to make sure that we block out just the faces. You can change the step value, but it will quickly break down on more complex models. If you still have a simple truncated pyramid, you could use a vector that matches the normals of the face to create your masks, but again, without knowing exactly what you are trying to achieve I can only guess at the right method for you.
  2. The textures being stretched has to do with the UVs. If your UVs are changing when you change the position of the vertices, you will see this type of warping along the edge of the two face triangles. Other than that, there is a section of the shader that needs to calculate the correct tiling amount to pass into the mesh UVs to drive the texture. If the UV unwrap changes (different mesh or through code) that section will need to be adjusted to compensate for the new UVs.
  3. The issue with flipping is that glTF uses an OpenGL orientation of UV space and Babylon was originally written using a DirectX orientation of UV space (Babylon is much older than glTF). So when you load in a glTF model, we do a conversion so that the mesh and textures appear as authored (both handedness and UV orientation). We have parameters to invert textures that are loaded outside of the glTF and applied to the glTF. In the shader, you can simply add a one-minus operation to the V component of the mesh UVs to make a linked texture appear correctly.
  4. You can pass a lot of this to a function in terms of calculating vectors, scales, etc. However, our standard PBRMaterial shader is not equipped to handle multiple scales based on axis, so you will need a custom shader (see the sketch after this list). You can certainly add PBR nodes into any node material so that you can take advantage of PBR rendering. You can read more about our PBR nodes and see our video tutorials here.
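For point 4, here is a minimal sketch of what driving the custom shader from your scale() function could look like, assuming the node material exposes float inputs named "widthScale", "heightScale", and "depthScale" (the block names are hypothetical):

    const applyScaleToMaterial = (nodeMaterial, width, height, depth) => {
        // InputBlock values can be set from code at runtime; the names must
        // match whatever the node material actually defines.
        nodeMaterial.getBlockByName("widthScale").value = width
        nodeMaterial.getBlockByName("heightScale").value = height
        nodeMaterial.getBlockByName("depthScale").value = depth
    }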

The other thing that may be helpful in some cases, and which takes some of this complexity away, would be a triplanar projection node material. This is helpful for projecting textures onto a mesh even if that mesh isn’t optimized for the projected texture.

You can see on the left sphere, there is distortion around the pole of the sphere. The center sphere and shader ball on the right are using triplanar projection, and it does not matter what the mesh is doing underneath.

[Image: the left sphere shows distortion around its pole; the center sphere and the shader ball on the right use triplanar projection]

Granted, you will see some sliding of pixels across faces that have a very shallow angle to the projection vector, much like you would with an object in front of a projector in real life. But it’s a technique that may be useful.
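If you would rather not build the projection in NME yourself, the babylonjs-materials library also ships a ready-made TriPlanarMaterial; a minimal sketch, assuming the usual playground scene and mesh (the texture path is just an example):

    const triplanar = new BABYLON.TriPlanarMaterial("triplanar", scene)
    const tex = new BABYLON.Texture("textures/grid.png", scene)
    triplanar.diffuseTextureX = tex // projection along X
    triplanar.diffuseTextureY = tex // projection along Y
    triplanar.diffuseTextureZ = tex // projection along Z
    triplanar.tileSize = 1.5        // world-space tiling, independent of the mesh UVs
    mesh.material = triplanar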

I don’t have much experience modifying mesh UVs with code, as I normally want to leave that work to a DCC tool like Blender. Dynamically changing your UVs in code comes with a lot of caveats. Are you still splitting the UVs the same way as you were in Blender, or are you creating new UVs in your code? If you are moving the UVs that you assigned in Blender, are you moving them in the right direction? For example, even with your original test cube you can see that the faces are not all unwrapped in the same orientation, so you need to know which direction to offset each UV to keep alignment. With the image of your Blender unwrap above, you have your sides in pairs reflected across the bottom, which means you need to know which UV you have before you can know which direction to offset it.

A lot of this comes down to carefully planning your UV space around the displacement you want to add in code. And the more complex the model, the worse this all gets. Not sure if I helped much here, but this approach will get very complex very quickly.


Thanks again for this information!

I made further tests regarding your node material approach and I also thought about using triplanar projection. Anyway, I still can’t resist trying to manipulate the UVs while scaling. I know that it can get tricky, but my initial goal is to create a working function for simple objects like cubes. It’s working well so far, but I am stuck at determining the rotation of a UV island. The information is inside the glTF file, but how am I able to access this data? Is it possible to grab this information when it’s loaded? Or where exactly does it get stored? I assume @bghgary is the right person to ask here? :slight_smile:

I noticed that 3ds Max offers exactly what I am looking for. It’s called “Preserve UVs”.

EDIT:

I dug deeper into some glTF files and parsed the binary data. I noticed that there is no information about the rotation of UV islands; there are only UV coordinates. So I have to determine the orientation of each face in UV space from the order of its indices? Is that correct? How am I able to extract this information?
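(For reference, this is all the UV-related data Babylon exposes after a mesh is loaded; any notion of island orientation has to be derived from it. A minimal sketch, assuming mesh is already loaded:)

    const uvs = mesh.getVerticesData(BABYLON.VertexBuffer.UVKind)
    const indices = mesh.getIndices()
    // Each face is a triplet of indices, and each index owns one (u, v)
    // pair; glTF stores nothing else about how an island is oriented.
    const faceUVs = (faceIndex) =>
        [0, 1, 2].map((corner) => {
            const i = indices[faceIndex * 3 + corner]
            return { u: uvs[i * 2], v: uvs[i * 2 + 1] }
        })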

I gave it a try, but it’s neither working properly nor a sophisticated solution. Does anyone have an idea how to get the rotation/orientation of each UV island?

const getUvOrientation = (faceIndex, axis) => {
    // The three vertex indices of this face.
    const is = [indices[faceIndex * 3], indices[faceIndex * 3 + 1], indices[faceIndex * 3 + 2]]
    const iMin = Math.min(...is)
    const iMax = Math.max(...is)
    // Vector between the UVs of the lowest- and highest-numbered vertex,
    // used as a rough guess at the island's dominant direction.
    const uv1 = { x: uvs[iMin * 2], y: uvs[iMin * 2 + 1] }
    const uv3 = { x: uvs[iMax * 2], y: uvs[iMax * 2 + 1] }
    const vector = { x: uv3.x - uv1.x, y: uv3.y - uv1.y }
    const direction = Math.abs(vector.x) > Math.abs(vector.y) ? "x" : "y"
    // Map the scaled world axis to a UV axis. Note that as written the
    // result only depends on axis ("y" maps to v, everything else to u);
    // direction never changes the outcome.
    let orientation
    if (direction === "x" && axis === "x") orientation = "x"
    else if (direction === "x" && axis === "y") orientation = "y"
    else if (direction === "x" && axis === "z") orientation = "x"
    else if (direction === "y" && axis === "x") orientation = "x"
    else if (direction === "y" && axis === "y") orientation = "y"
    else if (direction === "y" && axis === "z") orientation = "x"
    return orientation
}

const setVertexUv = (index, ratio, orientation) => {
    // Stretch either the u or the v coordinate of a single vertex.
    if (orientation === "x") uvs[index * 2] *= ratio
    else if (orientation === "y") uvs[index * 2 + 1] *= ratio
}

I encountered the same problem as you when using node materials. Have you solved it? I haven’t found a reliable solution yet. If you have, please share it; thank you very much!

@lxq100 can you provide a repro?