Creating 2D portrait generation

Hey,
I am about to create a 2D character portrait renderer.
The idea is to have a set of layers with face parts (noses, eyes, mouths), each with an alpha channel, and simply blend them together.
In principle it’s simply a texture blending problem.
But…

  • there would be dozens of such portraits displayed on the screen
  • there would be a dozen layers, each containing a set of body parts

What approach do you suggest?

  • Render every portrait on the server and send a ready image (maybe with caching)?
  • Send the resources to the client and make a custom shader?
  • other?

Maybe someone has encountered a similar problem and has a ready-to-use solution in mind?
What I need is, in principle, a character generator that can work for different fantasy races.

The closest equivalent would be the system used in Crusader Kings 2/3.

I would definitely go the custom shader way to composite things together, or maybe render each layer separately, relying on the GPU blend modes.

Are the portraits animating? (Basically, should they re-render in real time?)


Portraits are supposed to be static; all updates will come from the character wearing different clothes, gaining scars, or getting older.
Another consideration is whether it would be possible to use them in HTML.
The current GUI is in the HTML DOM, not in a canvas, so this would be a benefit.


Yup, so render each layer one after the other using your desired blend mode. This would take less than a frame to render, so no big deal :slight_smile: and you could export the output.
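For reference, the per-pixel math the GPU performs when layers are drawn back to front with standard "source-over" alpha blending can be sketched in plain JavaScript. This is only an illustration of the blending formula, not Babylon.js code; `blendPixel` and `compositeLayers` are hypothetical names.

```javascript
// Source-over blend of one RGBA pixel over another (values 0..255):
//   out = src * srcAlpha + dst * (1 - srcAlpha)
function blendPixel(dst, src) {
  const a = src[3] / 255;
  return [
    Math.round(src[0] * a + dst[0] * (1 - a)),
    Math.round(src[1] * a + dst[1] * (1 - a)),
    Math.round(src[2] * a + dst[2] * (1 - a)),
    Math.min(255, src[3] + Math.round(dst[3] * (1 - a))),
  ];
}

// Composite one pixel coordinate across all layers, bottom layer first
// (e.g. skin, then eyes, then nose, ...); starts from fully transparent.
function compositeLayers(layers) {
  return layers.reduce((acc, layer) => blendPixel(acc, layer), [0, 0, 0, 0]);
}
```

A fully transparent part slot leaves the layers below untouched, while an opaque part replaces them, which is exactly the behavior you want when a layer's set simply omits a body part.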


So correct me if I misunderstood, you suggest:

a) Client-side rendering/blending of layers with a custom shader that selects a part from each layer (kinda)
b) Rendering & exporting somewhere to some other render target, so the DOM part of the app can reuse it (how?)
c) Profit: use portraits both in the HTML GUI and within the canvas.

a) YES
b) export only if you need to save it as a PNG or pass it to a server, as shown for instance in Render Scenes To .png Files | Babylon.js Documentation
c) Enjoy :slight_smile:

Are you certain that, without reloading the page, I can render the texture for a single object and then show it in the same page?

For this to be possible I need to URL-encode the image (and somehow get it out of the render-to-texture step),
then use the image like this:
<img src="<image_just_rendered>">

Every solution that requires sending the image to a server would be pretty inefficient.
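Indeed, no server round-trip is needed: a base64 `data:` URL can be built entirely on the client and assigned straight to an `<img>`'s `src`. In the browser, `canvas.toDataURL("image/png")` does all of this in one call; the sketch below spells out just the encoding step with a hypothetical `pngBytes` buffer, so the names are illustrative only.

```javascript
// Turn raw PNG bytes into a data URL usable as <img src="...">.
// btoa() exists in browsers, Buffer in Node; both yield the same base64 text.
function toPngDataUrl(pngBytes) {
  const base64 = typeof Buffer !== "undefined"
    ? Buffer.from(pngBytes).toString("base64")
    : btoa(String.fromCharCode(...pngBytes));
  return "data:image/png;base64," + base64;
}

// usage (hypothetical element id):
// document.getElementById("portrait").src = toPngDataUrl(pngBytes);
```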

Yes, you can first render into a render target texture and then use it as a regular texture later in the frame; this is actually how shadows or glow layers work.

Forgive me for being obtuse. I know there are renderTargets; the question is:
can I take the renderTarget and transform it into a URL-encoded image that can be set as an HTML attribute in a tag?

I haven’t noticed any API function for that.

Oh, you want to put it in a tag afterwards :slight_smile: I thought you were trying to use it as a texture.

The process is then a bit different. To get the base64-encoded PNG, the easiest way is to rely on the Babylon.js screenshot tools, but they are meant to capture your whole scene, not only the content of a render target.

There is a Tools.DumpFramebuffer which is meant to do exactly this: as soon as your render to the RTT has been done, you can call it to dump the framebuffer content into a base64-encoded string.
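The flow could be sketched like this. `attachPortrait` is a hypothetical helper; `dumpFn` stands in for `BABYLON.Tools.DumpFramebuffer` (width, height, engine, successCallback), which I am assuming hands the callback a base64-encoded image string — verify the exact callback payload against the Babylon.js docs for your version. The dump must happen right after the RTT has been rendered, hence the `onAfterRenderObservable` hook.

```javascript
// Hook the RTT's post-render event once, dump the framebuffer, and feed
// the resulting base64 image string to an <img> element in the DOM.
function attachPortrait(rtt, engine, imgElement, dumpFn) {
  rtt.onAfterRenderObservable.addOnce(() => {
    const size = rtt.getSize(); // { width, height } of the render target
    dumpFn(size.width, size.height, engine, (dataUrl) => {
      imgElement.src = dataUrl; // now reusable by the plain HTML DOM GUI
    });
  });
}
```

In a real scene you would pass `BABYLON.Tools.DumpFramebuffer` as `dumpFn` and a real `<img>` element; parameterizing the dump function also makes the flow easy to unit test.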

What I have done before is copying a dynamic texture (canvas) to a new canvas HTML element outside Babylon (for debugging purposes).

// create a new canvas HTML element, if it does not yet exist
var debugCanvas = document.createElement('canvas');

// insert it somewhere in your DOM and size it in line with
// the original canvas (from Babylon's texture)

var debugCanvasContext = debugCanvas.getContext('2d');
debugCanvasContext.drawImage(ctx.canvas, 0, 0);

where ctx.canvas is the canvas of the 2d context of the dynamic Babylon texture