Yes, but only at the mesh/UV(W) level, not on a material/texture you create in BJS (which would apply to everything using it). The UVs defined on the mesh/submesh/selection determine how the mesh (UVW) receives the material/texture, together with the initial settings of that material/texture (scaling, offset and projection mode). Sure, you can change these initial settings in the material itself, but if you apply the same material to several meshes/submeshes, the change will of course apply to all of them, each according to the UVs set on the mesh.
Here’s a basic example showing this. I project the same material/texture at default scaling (1,1,1) on two identical planes which have different UVs. On the right plane, the UVs are generated to fit the object in planar projection mode. On the left plane, I stretched my texture before generating the UVs.
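To make the “initial settings” part concrete, here is a minimal sketch of how a texture’s scale/offset combine with the UVs stored on the mesh before sampling. The `uScale`/`vScale`/`uOffset`/`vOffset` properties are the real ones exposed by `BABYLON.Texture`; the function itself is illustrative, not Babylon’s actual implementation:

```javascript
// Hypothetical sketch: how a mesh UV is combined with a texture's
// scale/offset settings before the texture is sampled.
function applyTextureTransform(uv, tex) {
  // tex mimics BABYLON.Texture's uScale/vScale/uOffset/vOffset properties
  return {
    u: uv.u * tex.uScale + tex.uOffset,
    v: uv.v * tex.vScale + tex.vOffset,
  };
}

// With default settings (scale 1, offset 0) the mesh UVs pass through
// unchanged -- which is why two planes with different UVs receive the
// very same texture differently:
const defaults = { uScale: 1, vScale: 1, uOffset: 0, vOffset: 0 };
applyTextureTransform({ u: 0.25, v: 0.75 }, defaults); // → { u: 0.25, v: 0.75 }
```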
I’ll let @paleRider see if this is useful for us. But as far as I (pseudo-)understand what you are doing, I guess the two planes already have different initial UVs?
If by ‘initial’ you mean that I set them on the mesh before generating the export in the 3D app (here c4d), then yes, that’s exactly what I did here.
Otherwise, if you want to redo your UVs in BJS on an imported mesh, I believe that’s possible, but you should get ready for a bit more code fiddling and start with the link provided by @JohnK. To be honest, I never did this; I don’t have this need, nor the level of patience. I’m a simple designer. I like to keep things simple and done right from the start, whenever possible of course.
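For anyone who does go down that road: redoing UVs in code roughly means reading the UV buffer, rewriting it, and pushing it back. The Babylon calls mentioned in the comments (`getVerticesData`/`setVerticesData` with `BABYLON.VertexBuffer.UVKind`) are real API; the helper itself is a made-up sketch:

```javascript
// Hypothetical helper: rescale a flat UV array ([u0, v0, u1, v1, ...])
// like the one returned by mesh.getVerticesData(BABYLON.VertexBuffer.UVKind).
// After editing, you would push it back with
// mesh.setVerticesData(BABYLON.VertexBuffer.UVKind, uvs, true).
function scaleUVs(uvs, uFactor, vFactor) {
  const out = new Array(uvs.length);
  for (let i = 0; i < uvs.length; i += 2) {
    out[i] = uvs[i] * uFactor;         // u component
    out[i + 1] = uvs[i + 1] * vFactor; // v component
  }
  return out;
}

scaleUVs([0, 0, 1, 1], 1, 0.5); // → [0, 0, 1, 0.5]
```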
Aha, you cheated! LOL
Ok, let me explain.
In MAYA, I can deform the UVs in order to stretch the texture, that’s fine with .babylon format.
The complex thing is that the model has to be non-uniformly scaled during an animation, and the UVs should deform accordingly.
I’m afraid the .babylon format allows for UV deformation but NOT during animation, and that’s why we need to scale the already-created UVs by the scaling factor in order to achieve this (I think).
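The compensation idea stated above can be sketched in one line. This is just the arithmetic, under the assumption that the texture’s `vScale` is kept in sync with the mesh’s Y scaling (in Babylon you could drive this every frame from something like `scene.onBeforeRenderObservable`; the function name is made up):

```javascript
// Hypothetical sketch: if the mesh is scaled by `meshScaleY` on Y during
// the animation, keeping the texture's vScale at the same factor keeps
// texels square instead of letting them stretch with the geometry.
function compensatedVScale(baseVScale, meshScaleY) {
  return baseVScale * meshScaleY;
}

// e.g. a mesh stretched to 3x its height → vScale goes from 1 to 3
compensatedVScale(1, 3); // → 3
```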
I’m not a coding guy, so I rely on what I would do on my DCC, based on my knowledge, which can be right or wrong of course.
I see. Just like that, this sounds to me like it would be something to eventually handle with a node material. But then again, I’m not an expert on this.
Though with this new information, let’s see what @JohnK has to say.
Also cc @carolhmj who might be able to find the right person to answer this question in this period of Season’s Holidays. Might require a bit of patience.
Nah, there’s always a way and nearly always more than one.
It’s just that it’s not the best time to get the most immediate and accurate answers.
I believe you know you can also animate materials in BJS (though I still think a node material would be better and more ‘sexy’).
Please avoid that. It only creates confusion and will not make for better or faster replies. All it does is dilute the topic into bits and pieces of information. As I said, with just a bit of patience, I’m sure you’ll find your ‘hero’. Meanwhile, maybe you could work on some other parts of the project.
I’m asking in other threads not because I’m in a hurry, but in order to keep each topic focused.
Another way to do this is to use triplanar textures, but I thought it would be easier not to mix topics in the same thread. Although, if you think that is the best way, a moderator may merge both threads into one.
Nah, if it is really for two different methods, I guess it’s fine. No worries and I hope you will soon be oriented in the direction that matches your need.
…it is mandatory to scale both the geometry and the texture:
...
var scaling = value / _BOARD_MAX_SIZE;
_board[board].mesh.scaling.y = scaling;                      // scale the geometry…
_board[board].mesh.material.albedoTexture.vScale = scaling;  // …and the texture by the same factor
...
Failing to do it that way (trivially, by commenting out the last line in the previous code) results in the stretching of the texture, which can’t “follow” the scaled geometry…
Of course, all this causes the other board (bottom-right) to show a stretched texture once we start to “tweak” the first one, as they share the same material.
So, facing this scenario, we thought the only way to accomplish the intended behavior was by means of a triplanar PBR material, but sadly the resulting scene can’t be exported to glTF, as that format doesn’t support triplanar at the moment, and we need that glTF to be shown via AR on model-viewer.
UV animation would be another way, but it can only be done on a per-material basis, so we couldn’t have several meshes sharing the same material there.
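One classic workaround for the shared-material side effect, sketched below with plain stand-in objects (in Babylon you would call the real `material.clone("someName")` instead of the hypothetical helper), is to give each board its own material clone before tweaking, so changing one board’s `vScale` cannot affect the other. The trade-off is more material state to manage per mesh:

```javascript
// Illustrative stand-in for material cloning; in Babylon.js you would use
// _board[n].mesh.material.clone("boardMat_n") instead of this helper.
function cloneMaterial(mat) {
  return { albedoTexture: { vScale: mat.albedoTexture.vScale } };
}

const shared = { albedoTexture: { vScale: 1 } };
const boardA = { material: shared };
const boardB = { material: shared };

// Give boardB its own copy before tweaking boardA:
boardB.material = cloneMaterial(shared);
boardA.material.albedoTexture.vScale = 3;

boardB.material.albedoTexture.vScale; // → 1 (unaffected by the tweak)
```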
Hi there, @Evgeni_Popov, and thank you for your time:
As you can see in this PG, the problem is obviously that we need a triplanar-fashion approach in order to avoid stretching of the textures on the YZ and XZ planes.
BJS triplanar materials are not a solution here, as they are not a glTF-compliant feature.
Any advice on the API to access those UVs? A BJS mesh can have up to six UV sets, right?
If you want to update the uvs by hand for each face independently, you need to make sure that each face (triangle) has its own 3 vertices, not shared with any other face.
By default, a cube (created with BABYLON.MeshBuilder.CreateBox) shares some vertices between multiple faces. So, you should call daCube.convertToFlatShadedMesh() to make sure each face has 3 vertices of its own.
From there, you can update the PG to compute the right uv coordinates for each vertex, depending on which face it belongs to:
I have also added the scaling in Z to show it works for all axes.
Now, this is very specific to the cube. If you want to do it for other shapes, it will be much more complicated (is it even possible in the general case?).
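The per-face idea above can be sketched like this (a hedged sketch, not the actual PG code): after `convertToFlatShadedMesh()` every face owns its vertices, so each vertex can be planar-projected along its face’s dominant normal axis, using the two other world-space coordinates as UVs. `positions` and `normals` stand for the flat arrays returned by `mesh.getVerticesData(BABYLON.VertexBuffer.PositionKind / NormalKind)`; the function name and `scale` parameter are made up:

```javascript
// Hypothetical per-face planar mapping for a flat-shaded box
// ("triplanar by hand"): pick the dominant normal axis of each vertex's
// face, then use the vertex's other two world-space coordinates as UVs.
function boxFaceUVs(positions, normals, scale) {
  const uvs = new Array((positions.length / 3) * 2);
  for (let i = 0, j = 0; i < positions.length; i += 3, j += 2) {
    const nx = Math.abs(normals[i]);
    const ny = Math.abs(normals[i + 1]);
    const nz = Math.abs(normals[i + 2]);
    // World-space position of the vertex under the current scaling:
    const x = positions[i] * scale.x;
    const y = positions[i + 1] * scale.y;
    const z = positions[i + 2] * scale.z;
    if (nx >= ny && nx >= nz) {
      uvs[j] = z; uvs[j + 1] = y;   // +/-X faces → project onto ZY plane
    } else if (ny >= nz) {
      uvs[j] = x; uvs[j + 1] = z;   // +/-Y faces → project onto XZ plane
    } else {
      uvs[j] = x; uvs[j + 1] = y;   // +/-Z faces → project onto XY plane
    }
  }
  return uvs;
}

// One vertex of a +Z face of a unit cube, with the mesh scaled 2x on Y:
boxFaceUVs([0.5, 0.5, 0.5], [0, 0, 1], { x: 1, y: 2, z: 1 }); // → [0.5, 1]
```

The resulting array would then be written back with the real API call `mesh.setVerticesData(BABYLON.VertexBuffer.UVKind, uvs, true)` (the `true` makes the buffer updatable for subsequent animation frames).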
If it is the Babylon.js viewer, you should be able to add some custom code to replace the material with a triplanar material after loading (I don’t know much about the viewer, but I suppose it is possible).
If it is not the Babylon.js viewer then I don’t see a solution as you are limited by what the glTF format supports.
May I add this to the conversation?
Well, the triplanar material we need is a PBR one, not the standard non-PBR one. So we are stuck, and the only way to get that kind of triplanar is by building it via NME.
Cheers.
I agree with you that WebXR is the simplest way, but sadly we have a lot of experience developing native AR apps for Android (ARCore) and iOS (ARKit), and at the moment the only way we’ve found to do it on the web, with acceptable performance, is via the model-viewer web component, which also covers iOS with its amazing client-side, on-the-fly glTF-to-USDZ converter.
I mean that, in our opinion, WebXR is at the moment a promising project but far from being a solution for commercial-grade web apps, especially on less powerful devices.