How to cache a texture buffer in memory?

Hello,

I want to load a buffer from a .png or .jpg file and store it in memory.
That buffer will be used to generate textures and use them in SpriteManagers in any of the app’s scenes.
The buffer should be cached so it can be reused across scenes, avoiding reloading it between scenes.

How should I do it?

I’ve found the topic “Texture class does not use the buffer argument and can accept a dataURI in place of a url”, but it is not working for me.

This is the code I use to load the buffer into a raw texture:

@sebavan some idea what’s wrong here?

Thanks in advance.

Hello!

From your PG:


But the console is not! :stuck_out_tongue:


@sebavan allow me to handle other people’s console errors for you! I bet you are maxed out by some other tasks! :see_no_evil:


@Khanon first of all, you should check your console whenever something is not working for you. That’s the most crucial step in debugging.

Voila!

The first error instantly reveals what issue we are facing here!

Now let’s try to figure out for ourselves what the heck is wrong:

We get this:
image

65580 bytes

Let’s count! width * height * 4 bytes per pixel (RGBA) = 800 * 533 * 4 = 1 705 600 bytes
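The same check as a quick code sketch (the 800 × 533 dimensions are taken from the screenshot above):

const width = 800
const height = 533
const bytesPerPixel = 4 // RGBA
console.log(width * height * bytesPerPixel) // 1705600 — far more than the 65580 bytes we actually got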

The buffer is much smaller than the one needed to create the RawTexture. That’s why we get the first error telling us:
image

We can be sure that the loadBuffer function is buggy. Let’s fetch the binary data in a slightly more professional way. Fetch a Blob:

async function binaryLoad(url) {
    const data = await fetch(url)
    const blob = await data.blob()
    return blob
}

We get this, and it reveals that your file is a JPEG, not a PNG:
image
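You can verify this yourself from the Blob (a small sketch reusing the binaryLoad above, run inside an async function):

const blob = await binaryLoad(fileUrl)
console.log(blob.type) // "image/jpeg" — a JPEG despite the png in the URL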

fileUrl = 'https://images.rawpixel.com/image_png_800/czNmcy1wcml2YXRlL3Jhd3BpeGVsX2ltYWdlcy93ZWJzaXRlX2NvbnRlbnQvZnJob3JzZV9nYWxsb3BfY2FudGVyX21hcmUtaW1hZ2Utcm01MDMtbDBqOXJmcmgucG5n.png' ← png in filename

We could use this code to fetch the binary data without using a Blob, but as we all know, JPEG and PNG are not raw file formats, so we won’t use this approach. I’m happy to announce we won’t use the Blob approach either (you could, but you’d have to decode it through a 2D canvas).

async function binaryLoad(url) {
    const data = await fetch(url)
    const buffer = await data.arrayBuffer() // still JPEG/PNG-encoded bytes, not raw pixels
    return buffer
}

So we will simply load the texture directly. Let’s change the fileUrl to a real png:

const fileUrl = "https://playgrounds.babylonjs.xyz/flowers.png"
const texture = new BABYLON.Texture(fileUrl)

Let Babylon.js do the image decoding, and when the scene is ready we can get the raw pixel data like this:

scene.onReadyObservable.addOnce(async () => {
   const pixelData = await texture.readPixels()
})

Your texture is stored in memory. You can do whatever you want with it.
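For example, if you wanted to cache that pixel data and rebuild a texture from it in another scene later, a minimal sketch (inside the same callback) could look like this — otherScene is a hypothetical second scene, pixelData is the value read above:

const { width, height } = texture.getSize()
const cachedTexture = BABYLON.RawTexture.CreateRGBATexture(
    pixelData, // ArrayBufferView returned by readPixels()
    width,
    height,
    otherScene
)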

Always be sure to use the correct file extension for a given image when creating a Texture.


The answer to your original question is:

    // four RGB pixels, 3 bytes each
    const rawData = new Uint8Array([
        0, 240, 232,
        236, 0, 242,
        0, 240, 232,
        0, 37, 245,
    ])

    const texture = new BABYLON.RawTexture(
        rawData,
        rawData.length / 3, // width: 4 pixels
        1, // height
        BABYLON.Engine.TEXTUREFORMAT_RGB,
        scene,
        false, // generateMipMaps
        true, // invertY
        BABYLON.Engine.TEXTURE_LINEAR_LINEAR
    )
    texture.wrapU = BABYLON.RawTexture.WRAP_ADDRESSMODE
    texture.name = 'color-texture'
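To actually see it, you could for example assign it to a material (just a sketch, not part of the original answer):

const material = new BABYLON.StandardMaterial('color-strip-material', scene)
material.diffuseTexture = texture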

Hello, it seems I uploaded a wrong version to the playground; there wasn’t any error in the console when I published the message…

No problem. Post the correct PG and I’ll check it.

My first approach is wrong; Babylon should do the job of decoding.

If I associate the Texture with the Engine instead of the Scene, I think I can use that Texture in any SpriteManager of any Scene.

That would solve the problem, and I’d store the texture in memory instead of the buffer.

const texture = new Texture(asset.defitinion.url, this.babylon.engine, this.spriteProps.textureOptions)

You can create the texture by whatever method you prefer (from a URI, blob, canvas, raw data, …) and assign the same texture to different materials on different scenes this way:
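For example (a minimal sketch; url, engine, scene1 and scene2 are placeholders):

const texture = new BABYLON.Texture(url, engine) // bound to the engine, not to a single scene

const material1 = new BABYLON.StandardMaterial('mat1', scene1)
material1.diffuseTexture = texture

const material2 = new BABYLON.StandardMaterial('mat2', scene2)
material2.diffuseTexture = texture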


Yes, in my case:

const texture = new Texture(url, engine, textureOptions)

// Scene 1
let spriteManager1 = new SpriteManager(
      name,
      null,
      this.spriteProps.maxAllowedSprites,
      { width, height },
      scene1
)
spriteManager1.texture = texture

// Scene 2
let spriteManager2 = new SpriteManager(
      name,
      null,
      this.spriteProps.maxAllowedSprites,
      { width, height },
      scene2
)
spriteManager2.texture = texture

I’ll do that, I will store the texture in memory instead of the buffer.

Thanks for the answers!


Please mark an answer as a solution. Thanks!

I don’t get why you changed the topic name, pushing my answers out of scope… :see_no_evil: