UV Map flipped for Imported Mesh from Blender file.glb

I used a separate Polygon for each Wall section, without Holes, because I need to dynamically do CSG for windows, doors, etc.

I can argue that this is not a subjective matter because, according to the Babylon docs, it is supposed to follow OpenGL’s standard. So I uploaded a normal texture from the same Babylon docs to test, and it does not look like the original uploaded texture with default settings. I had to set invertY: false to make the normal direction correct, but that flipped the texture upside down, as posted above with screenshots, which you can easily repro in a PG.

Here is the root cause of it all:


As I mentioned previously

Could you supply a PG showing your construction of the wall please?

Why? Is it related to this issue?

I cannot easily put it in a PG because it was abstracted into React Components, and a lot of the code related to wall construction logic uses external npm libraries.

It’s similar to this Babylon tutorial, except that each Polygon is built without rotation (MeshBuilder.CreatePolygon() with default settings). The Top and Bottom walls correspond to the Top and Bottom Polygon faces.

The logic related to UV is as posted.


I wanted to see how the walls were constructed from CreatePolygon and it is too difficult to work out from just a description.

The following PG shows two ways that the inner walls can be formed: the first, using just a translation of the outer wall, reflects the text on the inner wall; the second (green one), using a rotation and a translation, does not reflect the text on the inner wall.

Choice of construction methods affects the texture display.


I see. Well, I do not have a UV translation; my inner/outer faces map the same way, only one winds clockwise and the other counterclockwise between polygons. So the order does not matter, because it maps the same way as MeshBuilder.CreateCylinder (with a UV cutoff for invisible wall faces).

The way I see it, fixing this issue now, at least for my case, would mean setting Texture invertY = false and refactoring all of the UV work done previously with Babylon-created meshes, including this wall tool. And that is a lot of work.
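For what it’s worth, much of such a refactor can be mechanical. A minimal sketch (plain JS, no Babylon dependency; the helper name is mine, not an engine API) of converting a Babylon-style flat UV array between the two vertical conventions:

```javascript
// Sketch: convert a flat [u0, v0, u1, v1, ...] UV array between the
// DirectX (v = 0 at top) and OpenGL (v = 0 at bottom) conventions by
// remapping every v to 1 - v. The u values are untouched.
function flipUVsVertically(uvs) {
  return uvs.map((value, i) => (i % 2 === 1 ? 1 - value : value));
}

// A unit quad listed bottom-left, bottom-right, top-right, top-left:
const uvs = [0, 0, 1, 0, 1, 1, 0, 1];
flipUVsVertically(uvs); // same corners, opposite vertical convention
```

Since the mapping is its own inverse, applying it twice returns the original array, which makes it safe to toggle during experimentation.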

During wall construction, I noticed a lot of unpredictable results when mapping Texture UVs to wall faces. My trigonometry calculations on paper and Stack Overflow solutions never seemed to match Babylon. It was very time consuming, because I had to blindly experiment with different values until I hit gold… (and this from someone who aced trigonometry in school).

Now I understand why.

That’s why I wanted to ask pros in here if there is a central place in Babylon where I can flip this UV map switch. But apparently there is none :frowning:


Unfortunately I think you must be the first person to work on applying a texture with text to multiple models from Babylon.js together with imports.

In the few years I have been working with Babylon.js, I think I have only once had a text reflection issue, and at the time I did not think about the underlying cause; I just solved it by flipping values in the faceUV.

It will be difficult, I think, to find anyone who has thought about text direction except perhaps on planes, boxes or GUI.


Yes, I agree. But remember, though, that statistically only a small fraction (usually under 2%) of the people who encounter a problem will report it.

In my case, I have no other options. Everything in the app is user generated, edited, and updated on the fly. This means I cannot use ad hoc one-off quick fixes like most people do. Everything has to follow a standardized pattern to produce consistent results.

This complexity is multiplied by the fact that I need to automatically export the entire scene to Blender for realistic rendering, then import the results back into Babylon for an interactive VR experience (not just static snapshot images).

I am aiming for the kind of interactive realism that @PatrickRyan achieved, but in a newbie-friendly way, so that even my mom (who barely knows how to use YouTube) can do it.

The texture orientation in Babylon is taken from the DirectX standard, which places UV (0,0) in the upper-left corner of UV space. This was a decision made when the engine was created nine years ago, as the engine was inspired by a different DirectX engine that @Deltakosh also wrote. The glTF 1.0 specification was released just over six years ago, and it went with the OpenGL orientation, with UV (0,0) in the lower left of UV space.

In all honesty, these are arbitrary decisions that each platform made. I remember being asked by my teammates contributing to the spec which orientation I would go with. I mentioned that a lot of DCC tools follow the OpenGL standard so that was my preference at the time. This was before I was working on the Babylon.js engine but if I had to make the decision again, I would likely still support OpenGL standards for glTF.

So what you are running into is not a bug. This was an intentional decision made at the start of both our engine and the glTF spec. We do support OpenGL orientation for UV space through the texture inversion flag but unless we know where the mesh is coming from, we have no idea what the UV space orientation is. In the case of our glTF loader, we know that the glTF file has OpenGL UV orientation, and so we can set the flag correctly on all textures loaded with the glTF. But if we have no frame of reference for UV space of a mesh, we can’t possibly know what to do with the textures we load.
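The loader behavior described above can be summarized in a few lines. This is a pure-JS illustration only (the helper is hypothetical, not a Babylon API): the glTF loader knows its textures are authored in OpenGL UV space and leaves them un-inverted, while textures loaded with no frame of reference default to Babylon’s DirectX convention.

```javascript
// Hypothetical helper summarizing the default-orientation rule described
// in the post above. "gltf" sources are known to be OpenGL UV space
// (v = 0 at bottom), so no inversion; anything else is assumed to target
// a Babylon mesh in DirectX UV space (v = 0 at top), so invert.
function defaultInvertY(source) {
  return source === "gltf" ? false : true;
}

defaultInvertY("gltf");    // false: keep the OpenGL orientation
defaultInvertY("unknown"); // true: assume a DirectX-space Babylon mesh
```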

And this isn’t something we can just change as backwards compatibility is a foundational principle of the engine and a change like this would break nearly every project made with Babylon.js. And the flag at the texture level is how we enable flexibility in whichever UV orientation is used.


@PatrickRyan please explain why the following does not demonstrate that uv (0, 0) is the bottom-left corner of the image when mapping onto a mesh.

Looking at the use of stored images on BJS StandardMaterials.

The images used are the two test textures linked in the original post (inline previews omitted).

By examining the vertexData or by using the PG export tool to save as a .babylon file the positions and uv data are

positions = [-0.5, -0.5, 0,   0.5 ,-0.5, 0,   0.5, 0.5, 0,   -0.5, 0.5, 0]

uvs = [0, 0,   1, 0,   1, 1,   0, 1]

both of which have the order bottom left, bottom right, top right, top left

The result is the PG plane with the texture applied (screenshot in the original post).

I am really confused

EDIT: OK, I have worked it out; we are both using u and v, but not for the same thing. My u, v are measured from the bottom-left corner of the image mapped onto a vertex in 3D space, where up and horizontal match the up and horizontal of the image. Whereas your u, v are measured from the upper-left corner onto the projection of the vertex onto the canvas, which has (0, 0) at the top-left corner. So my u = your u and my v = 1 - your v.
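The two conventions differ only along the v axis, so converting between them is a one-liner (a sketch; the function name is mine):

```javascript
// Sketch: a bottom-left-origin (u, v) pair expressed in the
// top-left-origin convention. Only v changes; u is shared.
function bottomLeftToTopLeft([u, v]) {
  return [u, 1 - v]; // the mapping is its own inverse
}

bottomLeftToTopLeft([0, 0]); // [0, 1]: the bottom-left corner, seen from a top-left origin
```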

So, depending on a reader’s background and experience, the use of u and v in the documentation could be confusing for them.

However, the text direction issues in this topic are about a reflection about the vertical, not the horizontal, so I would still say they arise because of mesh construction rather than a glTF and Babylon.js difference.

@JohnK, this topic is really complicated, so it’s easy to get confused about it. The thing you are running into is a default setting that is throwing off your expectation here. The positions and UVs of the plane in your scene are as you describe, but if you inspect the loaded texture, you will note that we have inverted Y by default which is shown under “Stored as inverted in Y” so it is stored in memory inverted.

To help illustrate what is going on, I quickly put together a playground so we can see what happens with textures loaded through glTF as opposed to loaded through code, and how the meshes treat those textures.

What you see is a loaded glTF plane in the upper left and a Babylon plane in the lower left. The upper right is a clone of the glTF plane and the lower right is a clone of the Babylon plane. The two on the left are using the texture as loaded by our glTF scene loader. The two on the right are using a texture loaded using BABYLON.Texture. Note that we do not specify whether to invert the texture or not in the loaded texture, we just go with whatever is default and what the engine expects.

You can see that the glTF planes (top row) have a different UV space from the Babylon planes (bottom row) and we know that the glTF spec uses OpenGL orientation and the Babylon engine uses DirectX orientation. We also loaded all assets using default settings as much as possible. What this tells us is that when we load a texture through a glTF, we know that we do not need to invert the texture to match with the OpenGL space and so textures are set to invertY = false. When we load a texture straight into the scene, however, the texture loader assumes this texture will be used on a Babylon mesh, and so the default for the texture loader is to set invertY to true. You can see from the inspector that the loaded UV sample texture is stored inverted:
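The matching rule in that paragraph can be modeled in a few lines (plain JS, no Babylon dependency; the function is illustrative only, not engine code):

```javascript
// Illustrative model: a texture displays upright when its effective
// orientation matches the UV space of the mesh it is applied to.
// invertY = true re-orients an image for DirectX-style UVs (v = 0 at top);
// invertY = false leaves it in OpenGL orientation (v = 0 at bottom).
function displaysUpright(meshUVSpace, textureInvertY) {
  const textureSpace = textureInvertY ? "dx" : "opengl";
  return meshUVSpace === textureSpace;
}

displaysUpright("opengl", false); // glTF plane + glTF-loaded texture: upright
displaysUpright("dx", true);      // Babylon plane + default texture load: upright
displaysUpright("dx", false);     // mismatch: the image appears flipped
```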

(inspector screenshot omitted)

If you want to dive further into this we do go into this in our docs around normal maps. I wrote this doc a long time ago, but it still should apply to this discussion.


Thanks for taking time to confirm it, Patrick! I linked to the same docs above twice, but people didn’t believe it :joy:


@PatrickRyan thank you for your detailed explanation which clarifies much for me. I would be grateful if you would check whether what I am saying below is correct or not.

When reading this, from our docs around normal maps, I missed the word stored in “To illustrate this, here is a simple graph to show how the UVs are stored in a glTF file versus a .babylon file”. I was thinking about how they were viewed online rather than how they were stored.

I still believe there is an error in this diagram

since top left and bottom right of the glTf cannot both be (0, 1)

For clarity, and hopefully less confusion, I am going to use the terms readable and upside down for images/textures rather than coordinates. By readable I mean that, when viewed directly on the web, English text reads as normal; a rotation of 180° makes them upside down. By viewable I mean that the orientation displayed in 3D matches what is seen on the web.

The following two are readable (image links in the original post).

The following one (from inside a glTF) is upside down; the link is in the original post. (It was difficult to find an image with text inside a glTF.)

Taking a plane created in Babylon.js, and also saving it as a glTF, the following occurs when default settings are used: applying the images as a material after the plane is created or imported, the orientation of the image on the plane is preserved, i.e. readable stays readable and upside down remains upside down.

When the image is within a glTF model and the whole model is imported, the image is displayed as readable, although it is upside down within the glTF.

In the case of upside-down images needing to be readable on the plane, this can be dealt with on importing or by using UV scales.

However @ecoin’s issues are not that straightforward. When importing models with textures, or models and textures separately, there is no way of knowing prior to importing how the models were constructed or how the UV mapping onto vertices has been done. For example, the PG below shows a cube created within Babylon.js and one imported from a glTF. If the imported cube had had its texture set before exporting, Babylon.js would display the cube with its texture as saved.

When meshes and textures are being mixed and matched from a range of sources, there can be no single way of importing them that will force the textures all to be readable, nor can the user of the app necessarily know before importing whether a texture needs adjusting. It is therefore necessary for the user to have a simple way of adjusting the texture per model, perhaps with reflectX and reflectY buttons. Of course, this means that for each texture adjustment there needs to be a separate texture cloned from the base texture. The maximum number of stored materials would be 4: base, reflectX, reflectY, reflectXY.
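The four variants could be driven by plain texture-transform presets rather than four image files. A sketch (names are mine, and it assumes Babylon-style uScale/vScale/uOffset/vOffset semantics, where a negative scale plus a matching offset of 1 mirrors the texture along that axis while keeping it inside the 0..1 UV range):

```javascript
// Hypothetical presets for the base/reflectX/reflectY/reflectXY idea.
// A negative scale mirrors the texture; the offset of 1 shifts the
// mirrored image back into the 0..1 UV range.
function reflectionPresets() {
  return {
    base:      { uScale:  1, vScale:  1, uOffset: 0, vOffset: 0 },
    reflectX:  { uScale: -1, vScale:  1, uOffset: 1, vOffset: 0 },
    reflectY:  { uScale:  1, vScale: -1, uOffset: 0, vOffset: 1 },
    reflectXY: { uScale: -1, vScale: -1, uOffset: 1, vOffset: 1 },
  };
}

// Each cloned texture would copy one preset onto the base texture,
// giving at most four stored variants, as described above.
Object.keys(reflectionPresets()).length; // 4
```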

For example https://playground.babylonjs.com/#KZDXTB#20

However, this PG raises a new issue due to LHS vs RHS: the imported cube and the created cube are both at position.x = 1.5 but move in opposite directions, and rotate in opposite directions when both angles of rotation are increasing.

This should cover most but not all situations. It would not cover models such as the wall as it is currently constructed nor I believe merged meshes.

I hope this shows that I have understood more of the issues @ecoin faces, and that it puts him and me more in agreement about the issues, even if not the solutions :blush:


We definitely came to an agreement. I realized our comparisons would never align because we were talking about different things. And these things are constructed in different ways in Babylon :joy:

Your reflect proposal sounds good in general. I believe it will make Babylon a lot more user-friendly for newbies like me. It will save their hair :wink: because I was pulling mine out a lot over this issue.

For my use case though, unfortunately reflect is not an option (nor is any custom hack solution, for that matter) because of:

  • performance: the app cannot store 4x the possible materials; it already has too many dynamic materials to sustain 60 FPS with PBR, each having 3 textures
  • intuitiveness: if the user has to manually adjust a texture on each upload, then they might as well use Blender… :wink:

So I only have one choice left: refactor the UV mapping logic for all Babylon-created Meshes (I should get started :melting_face:)


Hey team, thanks to @PatrickRyan who FORCED ME against my will, here is a PR to fix that:
Add a flag to force Babylon to use OpenGL convention for UVs by deltakosh · Pull Request #11962 · BabylonJS/Babylon.js (github.com)

I will let Patrick comment on his idea, I was only the poor dude forced to write it down.

I’m not sure how anyone can FORCE a Sith Overlord to do anything against their will. I think there is some misdirection happening here which would be right on brand for the Sith!

In any case, we looked at how we may be able to make the whole process a step easier because while we do support both UV orientations, it was a bit painful to have to track the invertY status on all textures in your scene and which orientation each of the meshes needed. So this is a global set on the scene, much like the useRightHandedSystem accessor for setting a scene to be right-handed, which will convert all meshes in the scene to use the same orientation in UV space. This will also mean that all textures in the scene will also orient the same way and so will be able to be passed freely between meshes without the need to invert them based on the individual mesh UV space.

We will need to do some testing on it once the linked PR is in, and we welcome any testing you are able to do so that it is ready for the next release.


Looking at this line I’m a little skeptical:

Plz let me know once the Sith’s changes are pushed to the PG, and I will test it.

Here is a PG I did before, with a Texture transform setup for testing.

Texture transforms are the same for both glTF and non-glTF objects. With my PR you will be able to apply the same transform to a glTF or a BABYLON-loaded object.

Texture transforms are not linked to the object, as you can apply various textures to the same object over time.

@JohnK, you are correct that there are typos in the diagram. This is my fault and I have corrected them and a PR is up on the docs.

In looking at your test case using the readable and upside-down images, the first two PGs make sense because you are matching up expectations in UV space. Breaking it down:

  • PG one shows a Babylon MeshBuilder plane (DX-orientation UV space) with a default BABYLON.Texture import (defaults to invertY = true to orient the image to DX space)
  • PG two shows the same: a Babylon MeshBuilder plane (DX-orientation UV space) with a default BABYLON.Texture import (defaults to invertY = true to orient the image to DX space)

But PG three is a little misleading and may be skewing your thought process:

  • A Babylon MeshBuilder plane (DX-orientation UV space) exported to glTF and then imported into the scene (now using an OGL orientation for UV space) with a default BABYLON.Texture import (defaults to invertY = true to orient the image to DX space). This should not work, and the image should be inverted as the spaces don’t match. But take a look at the UVs for the plane as they were imported:

This shows that the UVs were exported in DX space, so even though we think the model should be using OGL space, the UVs themselves are inverted to DX, and so when we apply a DX-orientation image (remember, by default when we import images using BABYLON.Texture we invertY unless you specifically set invertY to false) the image looks correct.

We need to look specifically at the export behavior, but I think the assumption here may be off. I don’t see a lot of users creating primitives in Babylon to export to glTF and then placing them back in Babylon, so I haven’t noticed it happening.

And you are right about the complexities of models coming from different sources. The engine has no way of knowing how the model was authored and so it can only make base assumptions. It is up to the engineer/artist to come up with a system that allows textures to be passed correctly between models. I am hoping that the flag we posted about earlier will help some of the reflection gymnastics we are having to do with textures in the scene. And I hope this helps you flesh out your mental model of how we are dealing with UV space. I will look into the export behavior I mentioned and if it’s wrong we will get a fix in.


I know that you are busy sorting out your configurator, but if you find time, have a look at the reflection function.
