Dynamically created Texture Atlas

Hi team!

@Evgeni_Popov @sebavan @Deltakosh @RaananW @carolhmj
@mawa @labris

I’d like to propose a feature that allows creating texture atlases dynamically.

My first very quick thoughts about the interface are:

interface TextureAtlasEntry {
  id: string
  idx: number
  x: number
  y: number
  w: number
  h: number
  uvs: number[]
}

interface TextureAtlas {
  getTexture(): BABYLON.Texture
  getDimensions(): { w: number, h: number } // actual dimensions of the ADT texture
  length(): number // number of entries in the texture atlas

  putImage(id: string, image: HTMLImageElement): number
  putTexture(id: string, texture: BABYLON.Texture): number
  putCanvas(id: string, canvas: HTMLCanvasElement): number

  getById(id: string): TextureAtlasEntry
  getByIndex(idx: number): TextureAtlasEntry
  entries(): TextureAtlasEntry[]

  crop(): void
  pack(): void
}

const atlas = new TextureAtlas(w, h) // constructor(w: number, h: number)

The put methods should add the textures to an ADT sequentially. If there is an overflow (no room for the next entry), it will throw an error, or it could resize the texture. Maybe all put methods could have a nullable options parameter so the user could copy part of the source image/canvas/texture to the atlas:

interface TextureAtlasEntryOptions {
  sx: number // source x
  sy: number // source y
  dx: number // destination x
  dy: number // destination y
}
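To make the sequential put behavior concrete, here is a minimal sketch (all names hypothetical, not part of the proposed public API) of the bookkeeping a put method could do: a simple "shelf" allocator that fills the current row left to right, starts a new row when the next entry no longer fits, and throws on overflow, as described above.

```typescript
// Hypothetical sketch of the allocation behind putImage/putTexture/putCanvas.
// Entries fill a row left to right; a new row starts when the next entry no
// longer fits; running out of vertical space throws, as proposed above.
interface TextureAtlasEntry {
  id: string;
  idx: number;
  x: number;
  y: number;
  w: number;
  h: number;
  uvs: number[]; // [u0, v0, u1, v1] of the entry's rectangle in the atlas
}

class ShelfAllocator {
  private entries: TextureAtlasEntry[] = [];
  private cursorX = 0;
  private cursorY = 0;
  private rowHeight = 0;

  constructor(private atlasW: number, private atlasH: number) {}

  allocate(id: string, w: number, h: number): TextureAtlasEntry {
    if (this.cursorX + w > this.atlasW) {
      // start a new shelf/row
      this.cursorX = 0;
      this.cursorY += this.rowHeight;
      this.rowHeight = 0;
    }
    if (this.cursorY + h > this.atlasH || w > this.atlasW) {
      throw new Error(`Atlas overflow while adding '${id}'`);
    }
    const entry: TextureAtlasEntry = {
      id,
      idx: this.entries.length,
      x: this.cursorX,
      y: this.cursorY,
      w,
      h,
      uvs: [
        this.cursorX / this.atlasW,
        this.cursorY / this.atlasH,
        (this.cursorX + w) / this.atlasW,
        (this.cursorY + h) / this.atlasH,
      ],
    };
    this.cursorX += w;
    this.rowHeight = Math.max(this.rowHeight, h);
    this.entries.push(entry);
    return entry;
  }
}
```

The actual pixel copy (drawImage onto the ADT's 2D context, or a texture-to-texture blit) would then target the rectangle returned here.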

getTexture() will return a Texture for the material.

crop() will crop the ADT to optimal dimensions, removing any removable blank space without rearranging the entries.

pack() will rearrange the entries for the best fit. I would start with a simple and fast algorithm (I found this quite interesting: Simple rectangle packing / Volodymyr Agafonkin | Observable).

You can then set the texture as the material’s texture, get the atlas entry, which returns the UVs, and set those UVs on the mesh.
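That last step could look like this (helper name hypothetical): expand one atlas entry into per-vertex UVs for a quad and write them to the mesh.

```typescript
// Hypothetical helper: expand one atlas entry into the 4 UV pairs of a quad,
// in the vertex order bottom-left, bottom-right, top-right, top-left.
function entryToQuadUVs(
  entry: { x: number; y: number; w: number; h: number },
  atlasW: number,
  atlasH: number
): number[] {
  const u0 = entry.x / atlasW;
  const v0 = entry.y / atlasH;
  const u1 = (entry.x + entry.w) / atlasW;
  const v1 = (entry.y + entry.h) / atlasH;
  return [u0, v0, u1, v0, u1, v1, u0, v1];
}

// In Babylon.js the result would then be applied with, e.g.:
// mesh.setVerticesData(BABYLON.VertexBuffer.UVKind, uvs, true);
```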

If interested, I will gladly implement this feature.

Let me know your thoughts, hints, ideas… Thank you!


The idea came from this topic:

Have you seen this?

I like the idea!
What do you think the main use cases for this are?


Yes. But this is something different. The main idea behind my proposal is to create the textures dynamically in code, mostly using a 2D canvas, which has pretty enhanced drawing functionality, or by shaders, or by combining the two. You could also get AI-generated images from an API and put them in a texture atlas, for example.

Thank you!

@labris this might be the answer to your question


You would still need to update the meshes’ UVs, and this is doing that along with creating the textures (I believe with a dt). Maybe just look at the underlying code and you could create a modified version to support what you need?

You can’t use a texture atlas with just the atlas; there are underlying modifications to the vertex data that you will need to do as well. But if you are looking to just generate an atlas by itself, without the mappings, then take a look at the code responsible for making the texture atlas for the packer.

Doing it without a 2D canvas would be the most efficient. A simple enough shader should be able to copy/crop/scale/rotate to anywhere in another texture in order to pack efficiently without extra GPU copies.

This shader could be an enhancement of the copyTextureToTexture tools we already have and be consumed by your atlas texture.

But the hard part would still be there: UV remapping without an extra lookup.


@Pryme8 @sebavan

I believe you don’t get my intentions yet :slight_smile:

This kind of approach would allow the user to create textures dynamically in code and to use them MAINLY with parametric meshes. With parametric meshes there is no remapping of UVs, because you are defining the UVs when creating the meshes. So this is not about loading a textured mesh and creating a texture atlas from its textures.

As an example, just imagine you would like to create a visualization of connections between BabylonJS forum users, and you want to display the avatar + username, add a badge with the number of connections, and draw lines between these entities. You can create a texture atlas with all the user avatars, draw the username and the badge on each entry using canvas2d, and put everything into one texture atlas. You can then create one parametric mesh which uses one big texture to display all of this in one draw call.

Another example is the image wall:


I get what you are saying XD, you can totally reuse the code I’m referencing, but if you wanna reinvent the wheel, have at it! I still love you, lol. Sorry if I’m miscommunicating, but the process for making a texture atlas and a single mesh with a ton of different UVs would still be the same thing, just going about the steps in a different order.

The main question would be: do you need to update it dynamically as images come in, or are all the images ready when you go to create the atlas?

The idea ends up being the same thing though!

If I get a chance I’ll whip up a PG for you. One other thing, though: at 5000 images it will probably not be one texture atlas. Even if you do 64 x 64, you could only fit 4096 of your badges on a 4k texture, and that is probably too low of a resolution. The example that you referenced is only 16 images, which is not hard; you are basically just making a bunch of meshes with the same UV coords and turning them into a single mesh buffer. Things get more technical when you need to render 5000 different images.

So you will probably end up with more than one atlas. You can still have it in one draw call, but it will require a secondary vertex buffer of which atlas each entry is assigned to as you move into multiple atlas references.
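The per-entry atlas assignment is plain bookkeeping; a minimal sketch (names and signature hypothetical), whose result would feed the secondary vertex buffer mentioned above, one value per vertex of each image's quad:

```typescript
// Hypothetical sketch: spill images into as many atlases as needed and record,
// per image, which atlas it landed in. With 64px tiles on a 4096px texture,
// each atlas holds (4096 / 64)^2 = 4096 images, matching the estimate above.
function assignAtlases(
  imageCount: number,
  tileSize: number,
  atlasSize: number
): number[] {
  const perAtlas = Math.floor(atlasSize / tileSize) ** 2;
  return Array.from({ length: imageCount }, (_, i) => Math.floor(i / perAtlas));
}
```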


Yes, but this is not a good solution, because it is cherry-picked to the single use case of one dynamic mesh with one UV set, and that is almost never how atlasing is used. Atlasing is about creating a single material to be shared by more than one object to reduce draw calls, and UVs do have to be adjusted in these mesh objects for the atlas, whether it’s loaded mesh objects or parametric ones. This is the standard use scenario.

If you are going to create an atlasing solution, it should work for any use case where atlasing makes sense. It’s not that difficult, to be honest. Unity Engine has such a tool in their code base and I have used it personally (very long ago, I admit).

When meshes share a material like this after being atlased and they are static, you could also join the mesh objects into one afterwards if you like.

Ok, I get the point. Thanks!

@Pryme8 @sebavan

So what about adding this functionality to the GreasedLineTools class? GreasedLine could heavily benefit from a dynamically created texture atlas. For example, a game needs to display some badges above the players and it has hundreds of players. This can be easily done with GreasedLine (yes, sorry, I’m biased towards it and pushing to use it everywhere it makes a bit of sense :smiley:). One could draw and decorate the texts using canvas2d, push them to the atlas and draw them using GRL. We don’t even need pack() in this case.

I believe this functionality can be coded in a ‘few’ lines reusing existing BJS code.