P5js canvas on babylonjs mesh

  1. is this possible?
  2. is it performant?

hand_transport

I ask because in the above example I wrote a p5js sketch that outputs image files, then textured the babylonjs meshes with those files, and I’m wondering if it’s possible to do that more automatically (or to have dynamic textures).

Doesn’t p5js have wayyy more 2D drawing methods than babylonjs so it’s much better for procedural texture design?

I’m a little technically out of my depth here, which is why I’m asking the community about it.

//

Here’s me trying to do this and failing: Glitch

I’m whacking my head against this trying to understand it: Dynamic Textures | Babylon.js Documentation

Update: no p5js yet, but I got a drawing onto the mesh using just canvas methods; going to try to add a dynamic drawing now.

an example of an animated dynamic texture would be way helpful if anyone has it
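A minimal sketch of the kind of animated dynamic texture being asked about, assuming a standard Babylon setup where a `scene` and a `mesh` already exist (all names here are placeholders, not taken from any particular playground):

```javascript
// Pure helper: position on a circle at time t (seconds); easy to test headlessly.
function orbit(t, cx, cy, r) {
  return { x: cx + r * Math.cos(t), y: cy + r * Math.sin(t) };
}

// Attach an animated DynamicTexture to a mesh; call this from your scene setup.
function attachAnimatedTexture(scene, mesh, size) {
  const tex = new BABYLON.DynamicTexture("animTex", { width: size, height: size }, scene);
  const ctx = tex.getContext(); // a regular 2D canvas context
  const mat = new BABYLON.StandardMaterial("animMat", scene);
  mat.diffuseTexture = tex;
  mesh.material = mat;

  let t = 0;
  scene.registerBeforeRender(() => {
    t += scene.getEngine().getDeltaTime() / 1000;
    ctx.fillStyle = "#000";
    ctx.fillRect(0, 0, size, size);
    const p = orbit(t, size / 2, size / 2, size * 0.4);
    ctx.fillStyle = "#0ff";
    ctx.beginPath();
    ctx.arc(p.x, p.y, size * 0.08, 0, Math.PI * 2);
    ctx.fill();
    tex.update(); // re-upload the canvas pixels to the GPU
  });
}
```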

You can update any texture for a material channel at runtime. Also, think of it this way: Babylon doesn’t know or care what the source of the texture is. As long as it’s a valid texture, it will work.

Dynamic texture in the Babylon API is just their way of providing a canvas you can edit more conveniently. Nothing stops you from doing your own canvas work, or using 3rd-party tools that output a base64 blob, etc.

About performance with constant updating, I can’t give you specifics. They should create an advanced section of the docs with tests and best practices for these things, as I’m guessing there is room for error here; if done incorrectly, could you get memory leaks in the engine or the browser itself? Not sure.

You will probably find HtmlElementTexture useful: HtmlElementTexture | Babylon.js Documentation (babylonjs.com)

When you’re constructing it, you can simply pass in a canvas element, and the image data from that canvas will be uploaded to the GPU as a texture. Whenever your canvas changes, you should call update() on the texture, which will re-upload the data.

Be warned that for larger textures this can be pretty expensive. One thing you might want to do is stagger your uploads so they only happen every few frames.
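The advice above (canvas in, `update()` per change, staggered uploads) can be sketched like this; `myCanvas`, `engine`, `scene`, and `mesh` are assumed to come from your own setup:

```javascript
// Pure helper: upload only every N frames to limit canvas -> GPU bandwidth.
function shouldUpload(frameIndex, everyN) {
  return frameIndex % everyN === 0;
}

// Wrap an existing canvas in an HtmlElementTexture and stagger the uploads.
function attachCanvasTexture(engine, scene, canvas, mesh) {
  const tex = new BABYLON.HtmlElementTexture("canvasTex", canvas, { engine, scene });
  const mat = new BABYLON.StandardMaterial("canvasMat", scene);
  mat.diffuseTexture = tex;
  mesh.material = mat;

  let frame = 0;
  scene.registerBeforeRender(() => {
    frame++;
    if (shouldUpload(frame, 5)) {
      tex.update(); // re-uploads the canvas pixels to the GPU
    }
  });
}
```

The `5` is an arbitrary stagger interval; tune it against your texture size and target framerate.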

This is how the texture inspector in Babylon works: you can check out the source code here.

Let me know if you still have questions!

@shaderbytes also.

Have y’all found either an example of p5js integration OR (more likely) an example of dynamic drawing to the canvas (as a texture on a babylonjs mesh)?

I tried diving into the playgrounds and didn’t find one; e.g. even a flickering canvas texture on a babylonjs mesh would be awesome.

@DarraghBurke how do real-time engines handle video smoothly? It has to be the same logic, right? There is no way a GPU can draw a new texture without receiving it. I would think you can’t have frame skipping for smooth video playback, so is it a case of expensive, but not impossible?

@saitogroup I don’t know anything about p5js. I do have experience updating materials at runtime with a few real-time engines, using static assets or dynamic content. When you use an HTML canvas locally you can get the raw pixel data (I’m sure this is what Babylon moves around when required), but when using 3rd-party tools (on another domain), or when having to send the image to another domain, you then have to use the canvas toDataURL(), which provides the data as a DOMString using base64. Many tools accept such a string as an image input (they do the conversion internally). I have not tested this with Babylon myself yet.
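A hedged sketch of that base64 round trip: serialize a canvas to a data URL, then turn it back into a Babylon texture on the receiving side. The function and texture names are placeholders, and `BABYLON.Texture.CreateFromBase64String` is assumed to accept a full data URL:

```javascript
// Serialize a canvas to a base64-encoded PNG data URL (browser API).
function canvasToDataUrl(canvas) {
  return canvas.toDataURL("image/png");
}

// Pure helper: quick sanity check on the receiving side.
function isDataUrl(s) {
  return typeof s === "string" && s.startsWith("data:image/");
}

// Turn the data URL back into a texture.
function textureFromDataUrl(dataUrl, scene) {
  return BABYLON.Texture.CreateFromBase64String(dataUrl, "b64Tex", scene);
}
```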

As a known optimization, sprite sheets (atlased sprite frames) are often used, with the UV position of the mesh manipulated to provide a material animation. This has no overhead of needing to update the texture on the GPU, but it is obviously limited to smaller sizes. Many particle effects are handled like this.
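The UV-shifting trick above can be sketched as follows, assuming `texture` was loaded from a sprite sheet laid out as a simple cols × rows grid (the grid layout and frame order are assumptions):

```javascript
// Pure helper: map a frame index to UV offsets on a cols x rows atlas.
function frameToUV(frame, cols, rows) {
  const i = frame % (cols * rows);
  return {
    uOffset: (i % cols) / cols,
    vOffset: Math.floor(i / cols) / rows,
  };
}

// Animate a material by shifting UVs over the atlas; no per-frame GPU upload.
function animateSpriteSheet(texture, cols, rows, fps) {
  texture.uScale = 1 / cols; // show one cell of the atlas at a time
  texture.vScale = 1 / rows;
  let frame = 0;
  setInterval(() => {
    const { uOffset, vOffset } = frameToUV(frame++, cols, rows);
    texture.uOffset = uOffset;
    texture.vOffset = vOffset;
  }, 1000 / fps);
}
```

Depending on how the sheet was exported, you may need to flip the v direction.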

Just put together an example for you: Animated Canvas Displayed on Texture | Babylon.js Playground (babylonjs.com)

The code here could definitely stand to be cleaned up, but it demonstrates the basic idea. You should be able to hook up your p5js canvas to an HtmlElementTexture in a similar way. Let me know if you have trouble with it!

Texture data (for 2D textures) is uploaded to the GPU using the texImage2D API call: WebGLRenderingContext.texImage2D() - Web APIs | MDN (mozilla.org)

This method supports multiple sources, including Canvas elements and Video elements. (Some browsers may not support direct upload from video elements, in which case Babylon actually creates a canvas and paints the video image onto that before uploading.)

I can’t say how this is implemented under the hood; for that, you’d have to dive into the browser source code. Luckily, Chromium and Gecko are both open source, so you can definitely look into it if you’re interested. I would assume that there are probably different optimizations for video and canvas textures to make the upload to the GPU as fast as possible, but I haven’t looked at the code myself.

This is awesome (!!!); looking at it carefully now.

https://playground.babylonjs.com/#STVYXT#12

So is something like this very expensive? Like, could I have 300 meshes doing this? (In VR, mind you, on a Quest.)

It kind of is, as every frame you would need to update the texture content, requiring huge bandwidth to upload the memory to the GPU.

Maybe you could have one large canvas with all the texture data, and then each mesh could have its own UV coordinates mapping to a particular part of the texture. If your textures are low resolution enough, it might work. It’s hard to say if this would be performant; it’s probably highly hardware-specific. I think your best bet would be to put a quick test together in the PG and see how it runs :slight_smile:
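One way to sketch that shared-atlas idea: remap each mesh’s UVs into its own tile of a cols × rows grid on the big texture. The grid layout and function names are assumptions, not from the thread:

```javascript
// Pure helper: remap a uv in [0,1] into tile i of a cols x rows grid.
function remapUV(u, v, i, cols, rows) {
  const col = i % cols;
  const row = Math.floor(i / cols);
  return [(col + u) / cols, (row + v) / rows];
}

// Rewrite a mesh's UVs so it samples only tile i of the shared texture.
function assignTile(mesh, i, cols, rows) {
  const uvs = mesh.getVerticesData(BABYLON.VertexBuffer.UVKind);
  for (let k = 0; k < uvs.length; k += 2) {
    const [u, v] = remapUV(uvs[k], uvs[k + 1], i, cols, rows);
    uvs[k] = u;
    uvs[k + 1] = v;
  }
  mesh.updateVerticesData(BABYLON.VertexBuffer.UVKind, uvs);
}
```

All meshes then share one material whose texture is the single big canvas, so there is only one canvas → GPU upload per update.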

So, is this not something that people commonly do? Like make big worlds and shove a bunch of animated textures into them? You would think this would be a common artistic effect… (especially for a cyberpunk world) :stuck_out_tongue:

It sounds like it might be worth trying to take 1 animated texture and apply it to all the blocks in the image above, because it seems like making multiple animated textures is going to annihilate the frame rate; correct?

https://playground.babylonjs.com/#STVYXT#16 500 textured cubes, and the frame rate is fine; it must be that creating lots of different canvases is what destroys the frame rate…

Yes, the number of cubes will have limited impact (unless you get to really high numbers, in which case you might want to look into Thin Instances). The bottleneck will be the canvas → GPU upload, which is mostly dependent on how often you do the upload and the resolution. In my PG the upload is every 10ms, which is a totally arbitrary number - you could definitely do it at a different rate. For a flickering screen in a cyberpunk world, you probably don’t need a very high animation framerate.

Anyway, all of this is because we’re talking about dynamic textures coming from a canvas. If you have a premade animated texture that you want to use, that would probably be a lot simpler - maybe @sebavan can weigh in on that.

Hello. I was able to do this. I made a p5.Graphics object and drew on it, then used this graphic’s canvas as input to Babylon’s dynamic texture. It works perfectly. You can pretty much do anything from p5 and have it as a texture on meshes and update them per frame. I cannot make a PG. Here’s some code; ignore the variable names, and notice the update() called per frame.
[screenshot of the code attached]

let me know if you need elaboration.
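The screenshot isn’t readable here, so this is a hedged reconstruction of the approach described (draw into a p5.Graphics, copy its canvas into a Babylon DynamicTexture each frame); `scene`, `mesh`, and all names are placeholders:

```javascript
// Pure helper: horizontal oscillation for the demo circle.
function wobbleX(ms, size) {
  return size / 2 + 100 * Math.sin(ms / 500);
}

// Draw with p5 into an offscreen p5.Graphics, then copy it into a
// Babylon DynamicTexture and upload it every frame.
function attachP5Texture(scene, mesh, size) {
  const pg = createGraphics(size, size); // p5.Graphics with its own canvas
  const tex = new BABYLON.DynamicTexture("p5Tex", { width: size, height: size }, scene);
  const mat = new BABYLON.StandardMaterial("p5Mat", scene);
  mat.diffuseTexture = tex;
  mesh.material = mat;

  scene.registerBeforeRender(() => {
    pg.background(20);
    pg.fill(0, 255, 200);
    pg.circle(wobbleX(millis(), size), size / 2, 60);
    // copy the p5 canvas into the DynamicTexture's canvas, then upload
    tex.getContext().drawImage(pg.elt, 0, 0);
    tex.update();
  });
}
```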

Very cool. I’ve been interested in this as well, so I used your example as a starting point to create some PG examples. Thanks.

Simple example:

Animated example:

Very cool. Check out this game I am porting to Babylon, where you can even interact with the mesh textures, which are PGraphics, in real time (repo). Opens up endless opportunities for cool things.
