Babylon project help

Hello,
I have been hired for a customizer project and I am thinking about how I could do something like this.
I saw that I can separate geometries on a plane like this. So I would use the plane, add one geometry to it, and let the user add an image on that geometry, whose coordinates could then be copied and placed on the real mesh.
The question with this is: how do I find the face of the mesh where the image should be placed?
Since I have never worked with coordinates, I am wondering whether this is a good idea or whether it can even be done like that.
I would appreciate any answer, and I am open to hearing other approaches.

Hi and Welcome to the Community,
I saw your example (video) and it looks like this is simply using a dynamicTexture.
The texture is applied to the can using the UV of the mesh.
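
For reference, a minimal Playground-style sketch of that approach (the mesh, names and sizes below are illustrative, not taken from the video; `engine` and `canvas` are the usual Playground globals):

```ts
// A DynamicTexture wrapped onto a cylinder ("can") through the mesh UVs.
// Everything here is a placeholder setup, not the code from the example video.
const createScene = () => {
    const scene = new BABYLON.Scene(engine);
    const camera = new BABYLON.ArcRotateCamera("cam", Math.PI / 2, Math.PI / 3, 4, BABYLON.Vector3.Zero(), scene);
    camera.attachControl(canvas, true);
    new BABYLON.HemisphericLight("light", new BABYLON.Vector3(0, 1, 0), scene);

    const can = BABYLON.MeshBuilder.CreateCylinder("can", { height: 2, diameter: 1 }, scene);

    // Whatever is drawn on this texture lands on the can according to its UV layout.
    const label = new BABYLON.DynamicTexture("label", { width: 1024, height: 512 }, scene);
    const mat = new BABYLON.StandardMaterial("canMat", scene);
    mat.diffuseTexture = label;
    can.material = mat;

    // drawText clears to white, draws the text and pushes the canvas to the GPU texture.
    label.drawText("Hello can", 120, 300, "bold 96px Arial", "black", "white", true);

    return scene;
};
```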

The problem I see with this example is that they have changed the ratio of the texture by constraining it into a square. I don't find this very smart, and it's also not very good in terms of UX: the user needs to stretch the texture vertically in the 2D square view so that, once applied to the size/ratio of the can, the image is correct. I don't know why they did that. I would have just used the real size/ratio of the cylinder's texture, like an unwrap, maybe adding some helpers such as 4 lines to mark front, rear, right and left. It would make it a lot easier for the user to keep the correct ratio and position elements.

There are other methods as well. Some prefer to work with decals. Though I think that if it is for production, a dynamicTexture is the better way. I would simply show the 2D unwrapped texture (the same one that will be used for production) and add an overlay of some helpers/rulers to show where the texture wraps (e.g. in the case of a box).
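
To give an idea of those "4 lines" helpers, here is a hedged sketch that draws quarter marks straight onto the dynamic texture's 2D context (assuming the 1024×512 `label` texture from the earlier sketch, on a cylinder where U wraps around the can). Guides drawn this way would of course need to be kept out of the actual production file:

```ts
// Hypothetical helper overlay: quarter marks for front / right / rear / left,
// drawn on the same 1024x512 DynamicTexture ("label") as above.
const guideCtx = label.getContext();
guideCtx.strokeStyle = "rgba(255, 0, 0, 0.5)";
guideCtx.fillStyle = "red";
guideCtx.lineWidth = 2;
["front", "right", "rear", "left"].forEach((name, i) => {
    const x = (i * 1024) / 4;   // U wraps around the cylinder, so quarters = sides
    guideCtx.beginPath();
    guideCtx.moveTo(x, 0);
    guideCtx.lineTo(x, 512);
    guideCtx.stroke();
    guideCtx.fillText(name, x + 6, 20);
});
label.update();                 // flush the canvas to the GPU texture
```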

Others might have a different approach to this. I guess the important thing is that you first define your workflow; the UX and 3D representation can still be worked out from there, using one method or another.

You might want to do a search in the forum. As I said, there are similar projects that have been made or are WIP. It might help you decide which approach and method you are most likely to want for your project.

Meanwhile, again welcome to the BJS forum and have a great day :sunglasses:

I would unwrap the mesh; I also need to work with only the active geometry because of production limitations.
Which means I should unwrap the mesh and show only the active geometry of that mesh. Next to it I could show a live 3D display.
I want the user to interact only with the unwrapped geometry, so I would need to take the coordinates of the images and copy them onto the mesh.
Since I don't see that dynamicTexture uses faces, will the image find the right face on its own if I just provide a position?

First things first: your approach of having the user see and modify the unwrap (and then the unwrap of the actual editable portion) completely makes sense to me (as a user and as someone who, years ago, participated in one of the first print platforms). I'm sure this is a manageable and efficient way to do it (both on the UX side and the production side).

Next, obviously, since the unwrap is based on the UVs of (the editable portion of) the mesh, it will certainly wrap correctly. The only part that is missing and will not show by default (as you mentioned) is the faces (and/or the rulers for production). Detecting faces (for whatever faces there might be in the model) doesn't seem to me like an appropriate approach, neither for showing to the user nor for use as markers for production. I don't know how many items you have, but I think I would create a default/ruler texture for each (and then show it as an overlay to the user). You could use the NME (node material editor) for that, but you could also just overlay the ruler texture.
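
One possible reading of that "overlay" idea, as a sketch only (the ruler image path is a placeholder): keep the guides on their own plane in the 2D editor view, so they are visible to the user but never baked into the production texture.

```ts
// Assumes "label" is the production DynamicTexture and "scene" the editor scene.
const artworkMat = new BABYLON.StandardMaterial("artworkMat", scene);
artworkMat.diffuseTexture = label;                 // what the user edits / what gets printed
const artworkPlane = BABYLON.MeshBuilder.CreatePlane("artwork", { width: 2, height: 1 }, scene);
artworkPlane.material = artworkMat;

// Placeholder ruler/guide image with transparency (one per product type).
const rulerTex = new BABYLON.Texture("textures/ruler_overlay.png", scene);
rulerTex.hasAlpha = true;
const rulerMat = new BABYLON.StandardMaterial("rulerMat", scene);
rulerMat.diffuseTexture = rulerTex;
rulerMat.useAlphaFromDiffuseTexture = true;
rulerMat.disableLighting = true;
rulerMat.emissiveColor = BABYLON.Color3.White();

const rulerPlane = artworkPlane.clone("rulerOverlay");
rulerPlane.material = rulerMat;
rulerPlane.position.z -= 0.01;                     // nudge toward the editor camera
rulerPlane.setEnabled(true);                       // can be toggled from the UI
```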

Of course, this is my opinion only. You should give it a bit of time; others may want to kick in and share their own experience.

I think this approach is too risky when it comes to production. I wouldn't transform any coordinates and would really have the user work with and approve the 2D texture that will also be used for production.
You could e.g. use masks (or, as I said, NME) for the limitations, and have an option to show either the entire unwrap or only the editable parts.
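
A rough sketch of what such a mask could look like with a plain canvas compositing operation on the dynamic texture (the mask file name is a placeholder: opaque where printing is allowed, transparent elsewhere; in a real editor this would be re-applied after every edit):

```ts
// Clip the current texture content to the editable (printable) area.
const maskImg = new Image();
maskImg.crossOrigin = "anonymous";
maskImg.onload = () => {
    const maskCtx = label.getContext();
    const size = label.getSize();
    // Keep only the pixels that fall inside the opaque part of the mask.
    maskCtx.globalCompositeOperation = "destination-in";
    maskCtx.drawImage(maskImg, 0, 0, size.width, size.height);
    maskCtx.globalCompositeOperation = "source-over";
    label.update();
};
maskImg.src = "textures/mask_editable_area.png";   // placeholder asset
```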

Sorry, I didn't fully understand that.
Are you suggesting having an unwrapped mesh for editing, or putting a 2D image of the product and editing it as an image?
However, to have a 3D preview of the product I will have to put the images on the mesh, which means I will still have to detect the faces.

Sorry, I'm not a native English speaker, and the subject might also be a bit too complex to discuss without an example.
Do you know how to use the playground?
If by any chance you could import one or two sample models into the playground, we could use that as a start. It would make things a lot easier…

Meanwhile, I will search and see if I can find a base for editing a dynamicTexture with images. It might not be user-imported images, but that doesn't really matter at this point. Or a version from the NME, maybe?
Let me cc some people from the team in case they can recall something we could use as a base: cc @carolhmj cc @Evgeni_Popov.
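
In the meantime, a rough base for the image part could be as simple as drawing any loaded image onto the dynamic texture (all names and the URL below are placeholders):

```ts
// Draw an image (user upload, URL, drag & drop...) onto a DynamicTexture
// at a given position and size, then push it to the GPU.
function placeImageOnTexture(
    texture: BABYLON.DynamicTexture,
    url: string,
    x: number, y: number,
    width: number, height: number
) {
    const img = new Image();
    img.crossOrigin = "anonymous";      // required for cross-origin images
    img.onload = () => {
        texture.getContext().drawImage(img, x, y, width, height);
        texture.update();
    };
    img.src = url;
}

// e.g. drop a logo roughly in the middle of a 1024x512 texture
placeImageOnTexture(label, "https://example.com/logo.png", 384, 128, 256, 256);
```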

I imported a mesh in this playground.

Great. So, can you confirm this is for production/print? Do you also have the 2D image templates for production showing the editable parts? And then, do you only have clothing, or do you have other objects as well? Is it for merchandising items? If so, could you add one more object of a different type? Are you creating the models, or at least editing them (this one, for example, is from Sketchfab)?

This is for production.

I don't have a 2D image of the editable parts for clothing, but wherever the seams are, the image must be limited. I separate all the geometries that should be editable in the mesh in Blender (that's how I limit them). But some products have particular areas where printing is allowed.

I have different types of products (from office items to clothing); the software is actually for a manufacturer of goods which has its own web shop.

Yes, it is supposed to be for merchandising.

I have little experience in Blender, so I download models from Sketchfab and edit them.

I already have 2 objects ready for use, but I don't know how to import them since they are on my computer, not on a server. Maybe I could push them to GitHub and access them in Babylon.

Thanks for your answers, and sorry if you feel I'm bothering you instead of just replying to the 3D tech aspect. See, the thing is, I'm not a dev. I'm essentially a PM and art/creative director, so for me, before tech or anything else, the project needs to be set on tracks and follow a number of 'must haves' and 'nice to haves'. In my opinion and from my experience, the first must-have for a platform that allows a user (end user or company) to edit and order merchandising items is: what you see/order is what you get.

If so, and as you say, there are constraints for each object. There's a printable area and other printing constraints depending on the technique (whether it is clothing with screen printing, flocking or embroidery, or a metallic pen with screen printing or engraving, etc.).

A manufacturer necessarily has all the templates for production with the printable area, size and other constraints. Your client should deliver these to you as a starting point. My approach (and I believe the only true approach for this) is to start from there and do some 'reverse design'.

What is delivered in the end is the real thing, and the client will instantly notice whether or not it matches the request/order. So, to me, it's essential that the generated file perfectly matches the template and constraints, and that what the user sees and approves is the exact same thing (both in 2D and, as closely as possible, in the 3D representation). I think the user should approve the 2D version, which will be the 'production file' (possibly at a lower resolution).

That doesn't really matter. Ultimately, it's only to make sure that the mesh UV coordinates match what is required for the texture (according to the display and the constraints for production). And if they don't, edit them, but that should be fairly easy.

Yes, you can use GitHub. Check the link I gave you above for importing external assets into the PG. You can do so from GH.

It is described here - Using External Assets In the Playground | Babylon.js Documentation
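
Following that documentation, a hedged example of what the import could look like once the files are on GitHub (the repository path and file name are placeholders; the file has to be reachable raw, e.g. via raw.githubusercontent.com or a CDN like jsDelivr):

```ts
BABYLON.SceneLoader.ImportMesh(
    "",                                                        // empty string = load all meshes
    "https://raw.githubusercontent.com/<user>/<repo>/main/",   // placeholder root URL
    "cup.glb",                                                 // placeholder file name
    scene,
    (meshes) => {
        console.log("Imported meshes:", meshes.map((m) => m.name));
    }
);
```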

I updated the playground. You are right. For the beginning we have a few products to work on, before scaling. You can see the geometries that I separated from the mesh; I use their names to get the mesh where the image should be added. I should be more aware of that, since it's very important for the end product.
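
For illustration, the name-based lookup described above could look roughly like this (the names and prefix are hypothetical, not taken from the actual playground):

```ts
// Retrieve one separated, editable geometry by its Blender name...
const printArea = scene.getMeshByName("cup_print_area");
console.log("Editable mesh:", printArea && printArea.name);

// ...or collect every editable part via a shared naming convention
const editableParts = scene.meshes.filter((m) => m.name.startsWith("print_"));
editableParts.forEach((mesh) => {
    const mat = new BABYLON.StandardMaterial(mesh.name + "_mat", scene);
    mat.diffuseTexture = label;     // the user-edited DynamicTexture from the earlier sketches
    mesh.material = mat;
});
```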

Well, hopefully it is a case of a simple template area :wink: As with all these types of tools, the initial idea seems straightforward but the reality is nearly always much more complex.

Even for something like a t-shirt: the UV maps of a shirt would (ideally) be the flattened-out pieces of material used to manufacture it, but that doesn't relate to printing on a shirt that is already manufactured (everything is already sewn together).

E.g. you can heat-press an image onto an already-manufactured shirt across a seam, no problem; in 3D software, this splits the image across multiple UV islands.

Hopefully that is not the case, because then it would require 3D viewport projection of an image and not just simple 2D editing. It would also require some further logic for how to communicate that projection from the web tool to the manufacturer, etc…
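
If that projection case ever did become necessary, decals are the usual Babylon mechanism for projecting an image onto a mesh at a picked point, regardless of UV islands. A purely illustrative sketch (the mesh name and texture URL are placeholders):

```ts
// Project a placeholder logo onto the "shirt" mesh wherever the user clicks.
scene.onPointerDown = () => {
    const pick = scene.pick(scene.pointerX, scene.pointerY);
    const normal = pick.getNormal(true);
    if (pick.hit && pick.pickedMesh && pick.pickedPoint && normal) {
        const decal = BABYLON.MeshBuilder.CreateDecal("print", pick.pickedMesh, {
            position: pick.pickedPoint,
            normal: normal,
            size: new BABYLON.Vector3(0.3, 0.3, 0.3)
        });
        const logoTex = new BABYLON.Texture("textures/logo.png", scene);  // placeholder
        logoTex.hasAlpha = true;
        const decalMat = new BABYLON.StandardMaterial("printMat", scene);
        decalMat.diffuseTexture = logoTex;
        decalMat.zOffset = -2;      // avoid z-fighting with the underlying surface
        decal.material = decalMat;
    }
};
```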

Anyway, you are correct in this. The defined templates for manufacturing the prints need to be understood upfront, before building any such tooling.

Don't you discourage our new members :grin:
The t-shirt is clearly not the best example to start with. I also didn't mention :zipper_mouth_face: the matter of sizes with clothing, where an 'S' or an 'XXL' t-shirt is not just a simple upscale (but shh, don't say it… not just now :face_with_hand_over_mouth:)

It's OK, I gotta understand how to do it first, before starting.
I would do the templates with the manufacturer; that's not the problem at all. But the software here would be the problem, since it's not that easy.

That’s just one of the cases.

What would be the solution here (e.g. this cup, which has an insert geometry for printing from the manufacturer)? How would this cup be done (so I get some understanding of the functionality)?

Where is the cup, so I can see what you mean…?

Huh. I can't see the cup. Are you sure you updated the link to the PG?
Can you share the link again, please…

And then, according to my TZ, it's time for me to prepare dinner, so I will likely not reply before tomorrow.
Anyway, I think gathering these templates from the manufacturer (let's start with 2 or 3 different objects; my advice would be to exclude clothing for now) would be the first step.
Then you can share those and your base models with us, and we can continue the discussion from there… If that's OK with you?

Outside of the software part of this (as mentioned, that requires 3D viewport projection)… the results of such things are normally handled by the app taking a screenshot of the viewport.

That screenshot then serves as a reference for the manufacturer. So you then supply both the source image and the screenshot.
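
A minimal sketch of that screenshot step in Babylon (assuming the Playground's global `engine` and the current active camera):

```ts
// Capture the current 3D view as a base64 PNG to attach to the order
// alongside the 2D production file.
const camera = scene.activeCamera;
if (camera) {
    BABYLON.Tools.CreateScreenshotUsingRenderTarget(engine, camera, 1024, (data) => {
        console.log("Screenshot ready, data URL length:", data.length);
        // e.g. send "data" to the backend or offer it as a download
    });
}
```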
