Hi, I’ve been trying to export a RawTexture image through code but haven’t managed yet. Could anyone give me a push in the right direction, or share an example of this? Thanks!
Hello! Where are you getting your RawTexture from? If it’s from an array of data, you could try to export this array instead. Or you could draw this texture in a canvas, get the image data from there, and then export it.
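For the first option, here’s a minimal sketch (assuming `values` is your 0–255 array; the variable name is just for illustration):
// Wrap the raw bytes in a Blob and hand the browser a downloadable URL
const blob = new Blob([new Uint8Array(values)], { type: "application/octet-stream" });
const url = URL.createObjectURL(blob);
// e.g. set `url` as the href of an <a download> element to save the file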
My main source is an array of 0–255 values. I transform this into a red/black texture using a RawTexture, like so:
import { RawTexture, Engine, Texture } from "@babylonjs/core";

// Single-channel (red) texture built from the 0–255 byte array
const texture = new RawTexture(
    new Uint8Array(ARRAY_OF_0_TO_255_VALUES), // placeholder for the source array
    this.size.width,
    this.size.height,
    Engine.TEXTUREFORMAT_R, // one byte per pixel, red channel only
    this.scene,
    false, // generateMipMaps
    false, // invertY
    Texture.TRILINEAR_SAMPLINGMODE,
);
How exactly would I draw this onto a canvas, for example?
You can use the Canvas API methods for that. Here’s an example: Saving img data | Babylon.js Playground (babylonjs.com)
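In short, the playground reads the texture’s pixels back and draws them into a 2D canvas. A minimal sketch of that approach (to run inside an async function; note that readPixels returns a promise in current Babylon.js versions):
const pixels = await texture.readPixels(); // RGBA bytes, width * height * 4
const { width, height } = texture.getSize();
const canvas = document.createElement("canvas");
canvas.width = width;
canvas.height = height;
const ctx = canvas.getContext("2d");
// Copy the pixel buffer into the canvas, then export it as a PNG data URL
const imageData = new ImageData(new Uint8ClampedArray(pixels.buffer), width, height);
ctx.putImageData(imageData, 0, 0);
const dataUrl = canvas.toDataURL("image/png");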
Perfect, exactly what I was looking for, thank you!
How do we generate the image for the texture if we change the UV scaling of the texture? In the playground it’s not working.
Let’s say, after the following line:
mat.diffuseTexture = texture;
if we add:
texture.uScale = 5;
It changes the texture on the sphere, but the exported image still remains the same.
Any idea about how we can achieve this?
Any help from @Evgeni_Popov would be really helpful.
Thanks
Do you want to upscale the image itself?
This does not scale the texture itself, only the UV mapping the mesh uses to sample it.
I am not sure what you are trying to achieve, as the UV scaling should be exported if you serialize in the Babylon format.
I am trying to do the same thing as is done in the playground, and it works fine.
i.e., getting an image from the texture using the following steps, as in the playground:
- Reading the texture using readPixels()
- Putting the pixels into a canvas and creating an image file
Up to this point, everything works fine and I get the image of the texture.
The issue comes when I apply some scaling to the texture, like horizontal/vertical scaling. The scene gets updated and the texture looks correct for the given scaling value, but when I perform the above two steps after applying the scale, the generated image is the same as before (i.e., the image without any scaling).
That is my issue: why does the generated image remain the same even though the texture changes in the scene? Any solutions to this?
@carolhmj No, I do not want to do any image manipulation. I just want to manipulate the texture (scaling, rotation, etc.) and then generate an image from it, like it’s done in the playground in this question. Is that the way it should be done, or am I missing something?
@sebavan are you saying that UV scaling affects the mesh and not the texture? In that case, how can I generate an image from this? Also, by exporting, do you mean we can generate an image from the serialized data? A documentation link or a small code snippet would be helpful, whenever your busy schedule allows.
Note: the generated texture is a DynamicTexture
Can you share a Playground with what you’re trying to do?
UV scaling does not change any data in the texture, so it will always be the same when read back.
The mesh data are not updated either; instead, a computation is applied in the shader to modify the UVs before fetching the texture data.
This is the playground link https://playground.babylonjs.com/#VWFSD6
This is the same code as that of the accepted answer from @carolhmj
The only change is the additional line at line #61, which changes the uScale of the texture. The change is reflected on the sphere, but not in the canvas. Why?
And how do we convert this scaled texture to an image?
@sebavan I get what you are trying to say, and sorry if I conveyed my message wrongly. But I am still unable to find a solution to the problem: how can I reflect the scaled texture in the generated image as well? Because the image generated in the canvas before scaling is identical to the image generated after scaling.
Maybe this kind of setup can give you your desired result:
@Mahesh_Pradhan why and how would you reuse this texture?
@Forsaken
Thank you for the code in the playground link that you mentioned. But I need to apply the texture to a model and then create an image out of the applied/generated texture, not turn the whole scene into a texture.
@sebavan
What I am trying to do is load a model, apply some texture to it, and save the texture details and its image in the backend. The next time a user comes, I can show the texture image among the saved materials, and the user can choose it to apply the texture to the model. The reuse/customization part comes in when the user wants to scale/rotate the UV settings to get a fit to his satisfaction; I would then save those settings for the texture as well. That’s my whole use case.
You can just save the UV transformation then, and apply the same transform the next time the scene is opened.
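For example, a minimal sketch (these are the standard UV-transform properties on Texture; the backend part is up to you):
// Capture the user's UV customization so it can be persisted
const uvSettings = {
    uScale: texture.uScale,
    vScale: texture.vScale,
    uOffset: texture.uOffset,
    vOffset: texture.vOffset,
    wAng: texture.wAng, // rotation around the w axis, in radians
};
// ...save uvSettings to the backend, then when the scene is reopened:
Object.assign(restoredTexture, uvSettings);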
@carolhmj
Yeah, that’s what I am doing, but I need to show users how the material would look after the texture gets applied. So I need to generate an image when the user has finished his customization and tries to save the material. The next time the user visits, the texture image appears in their collection, so they can see how the material looks before selecting it and applying it to the model.
The issue is about generating an image that shows how the texture actually looks.
Isn’t that exactly what my playground does? When you have the texture you want, you apply it to a scene like the playground and it then exports the (customized) texture as a picture.
How the texture looks depends entirely on the target mesh and its UVs, independently of the setup.
Why not apply the material to a plane to let the user see it?
If you cannot use the material, something like this could work: javascript - Scale image in Canvas with offset origin - Stack Overflow
And if you really want the texture, you could read it back from the canvas in the question above.
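A rough sketch of that canvas-only route (assuming `img` is an HTMLImageElement holding the unscaled texture image and uScale = 5; both are my assumptions for illustration):
// Emulate uScale by drawing the image as a repeating pattern,
// compressed so uScale copies fit across the output canvas
const uScale = 5;
const out = document.createElement("canvas");
out.width = img.width;
out.height = img.height;
const ctx = out.getContext("2d");
const pattern = ctx.createPattern(img, "repeat");
ctx.scale(1 / uScale, 1); // shrink horizontally so the tiles repeat
ctx.fillStyle = pattern;
ctx.fillRect(0, 0, out.width * uScale, out.height);
// out.toDataURL("image/png") now holds the horizontally tiled image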
To do it in Babylon, you will need to render a RenderTargetTexture with a fullscreen plane that uses the texture with your setup.
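Here is a hedged sketch of that idea (the names, the 512×512 size, and the camera setup are my assumptions): render a screen-covering plane with the customized texture into a RenderTargetTexture, then read it back as shown earlier.
import { RenderTargetTexture, FreeCamera, MeshBuilder, StandardMaterial, Vector3, Camera } from "@babylonjs/core";

// Offscreen target that will receive the baked, UV-transformed texture
const rtt = new RenderTargetTexture("bake", { width: 512, height: 512 }, scene, false);

// A plane spanning -1..1, textured with the customized texture
const plane = MeshBuilder.CreatePlane("fullscreen", { size: 2 }, scene);
const bakeMat = new StandardMaterial("bakeMat", scene);
bakeMat.emissiveTexture = texture; // the texture with uScale/uOffset applied
bakeMat.disableLighting = true;    // output the texture colors unchanged
plane.material = bakeMat;

// Orthographic camera framing exactly the -1..1 plane, so it fills the target
const bakeCam = new FreeCamera("bakeCam", new Vector3(0, 0, -1), scene);
bakeCam.mode = Camera.ORTHOGRAPHIC_CAMERA;
bakeCam.orthoLeft = -1;
bakeCam.orthoRight = 1;
bakeCam.orthoTop = 1;
bakeCam.orthoBottom = -1;

rtt.activeCamera = bakeCam;
rtt.renderList = [plane];
rtt.render(); // after this, rtt.readPixels() returns the baked image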