Does glTF support UV coordinates, or am I doing something wrong? Or is there something wrong with the exporter? Maybe it’s an older version or something.
Ok, can you share your scene and your export logs? There is a known issue in glTF where certain UV transformations aren’t fully supported, I wanted to rule this out before investigating deeply.
When I export the scene with the “Write textures” option, the texture is properly applied. But I want to create the material in code and apply it to meshes. When I export the scene with “Write textures” on, then create a material with the same diffuse texture and overwrite the default one, it still doesn’t apply properly. That’s weird, because it’s the same material with the same texture, yet it looks different.
Both cases where I’ve noticed this behavior were when I imported an FBX file into 3ds Max and then exported the scene. I’ve tried creating a test model inside 3ds Max (without importing external files), and it works as it should; nothing is wrong.
Hi @nogalo, I was busy last week resolving a regression in the Maya exporters, sorry for the slow reply.
I’m not sure I’m able to reproduce what you’re encountering. Using the latest exporter, I can configure a physical material with Drawer Texture(Final).png as the base color texture and export:
Can you help me understand how you create materials in code? If you can provide a BabylonJS PG that reproduces this process, it should help us root cause what’s going on.
I’m sorry, I don’t quite understand, can you add some examples or sample scenes that explain exactly what functionality you want? I can add a flag to the exporter that doesn’t invert our mesh UV coordinates at export, but I don’t fully understand why we would want this other than as a workaround for improper behavior.
Hey guys. To be honest, this is still an issue for me. For a while now I haven’t been using glTF/glb formats in my scenes, for the reasons above. And I would like to use them, as they support PBRMaterials and I can apply Draco compression easily, which would help a lot.
Basically the issue is the same. If I export the model with the texture on it, it usually works properly: the texture is exported and applied to the mesh as intended. But if I apply a new texture to the same mesh dynamically in code, the UVs are completely off. I’ve had this issue in many projects so far.
So as I mentioned before, if I export a model as glb with the texture, it initially looks okay. But if I try to use that same texture and apply it to the existing material, it suddenly looks wrong, like the UVs are messed up.
You can see in the PG that it looks okay initially. Uncomment line 21 to assign a new texture (basically the same texture).
You can try loading the .babylon version of this (uncomment line 11 and mat2), and you can see that it doesn’t have this issue. Is it a PBR vs Standard issue?
I don’t know what is going on, but this happens every time I use this approach. I basically cannot manipulate materials in my projects, which I often need to do, so I am going with the .babylon format. But at the moment I have a specific project that would benefit greatly from glb and PBRMaterials.
var texture = new BABYLON.Texture("https://i.imgur.com/aitxVV3.png", scene, true, false); // noMipmap = true, invertY = false
it does work.
The problem is not PBR but glTF: when a texture is loaded from a glTF file, the invertY flag of the texture constructor is set to false to comply with the spec. So you should do the same thing if you want to be compatible.
The two objects you are applying the same texture to do not have the same UVs. Babylon textures define (0,0) as the top left of the image; WebGL and glTF define (0,0) as the bottom left. Therefore, for an image to be used across models authored in the two conventions, it needs to be inverted. (The alternative is inverting the glTF model’s texture coordinates to the Babylon UV format, which would dramatically slow down load times.)
When we use a texture imported directly into Babylon, we interpret the texture data as inverted by default, in order to fit Babylon’s DirectX-like texture space convention.
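For intuition, the slow alternative above (rewriting the mesh UVs instead of the texture) amounts to remapping every V component as v' = 1 - v. Here’s a minimal sketch in plain JavaScript; the helper name is mine, and in Babylon you would read and write the actual buffer with mesh.getVerticesData / mesh.setVerticesData on the UV kind:

```javascript
// Flip the V component of an interleaved [u0, v0, u1, v1, ...] UV array,
// converting between top-left-origin and bottom-left-origin conventions.
// The helper name is illustrative, not a Babylon API.
function flipUVsVertically(uvs) {
  return uvs.map((value, i) => (i % 2 === 1 ? 1 - value : value));
}

// (0.25, 0.0) in one convention becomes (0.25, 1.0) in the other:
console.log(flipUVsVertically([0.25, 0.0, 0.5, 1.0])); // → [0.25, 1, 0.5, 0]
```

Applying this flip once at load time is exactly the per-vertex cost the previous post warns about on large models.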
EDIT:
I would recommend inverting the textures used on your glTF assets by cloning the texture (we use an internal data structure to contain the actual texture buffers); the clone just gives us another way to interpret the same buffers:
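A rough sketch of that idea, re-creating the texture over the same image source rather than mutating the shared buffer. The helper is hypothetical; with the real API the constructor you’d pass is BABYLON.Texture, matching the (url, scene, noMipmap, invertY) signature shown earlier in this thread:

```javascript
// Hypothetical helper: build a second texture over the same image source,
// differing only in how the V axis is interpreted. `TextureCtor` stands in
// for BABYLON.Texture with the (url, scene, noMipmap, invertY) signature.
function cloneWithInvertY(tex, TextureCtor, invertY) {
  return new TextureCtor(tex.url, tex.scene, tex.noMipmap, invertY);
}
```

So a glTF-compatible copy of a Babylon-loaded texture would be something like cloneWithInvertY(tex, BABYLON.Texture, false).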
Thank you guys. This solved my problem. Actually, a few days ago I encountered this exact issue on a different project. After some debugging, I figured out that I needed to invert the textures (because they were obviously flipped on the model). I didn’t know why, and I didn’t know why it had never been the case before. But I did it and it worked.
This now completely explains it to me (that top-left/bottom-left detail was crucial; it’s kind of obvious now that I know it -.-). This helps me a lot and enables a much better workflow for my project. (In this project it wasn’t so obvious that the textures were flipped, as the mesh and baked textures are much more complex, so it didn’t occur to me that I needed to do this.) And now I even know that my “accidental” implementation in one of my projects won’t work in every case (as I’m using different formats there).
Thanks again, I really appreciate it. All the best