Sampling Pixel Brightness: need an alternative

Hey guys, I'm currently experimenting with an adaptive UI, where the UI's font color changes based on the average pixel brightness in a given region.

Sample Code

// Is executed in scene.registerAfterRender
    testCanvasRegionForPixelBrightness() {
        if (!this.ADAPTIVE_UI) return;
        const canvas: HTMLCanvasElement = document.getElementById('renderCanvas') as HTMLCanvasElement;
        const context = canvas.getContext('2d', { willReadFrequently: true });

        const testOverlay = document.getElementById('sidebar-wrapper-id');
        const divBounds = testOverlay.getBoundingClientRect();
        const imageData = context.getImageData(divBounds.left, divBounds.top, divBounds.width, divBounds.height).data;
        
        this.UIWORKER.postMessage({imageData: imageData, downSample: 4});
        this.UIWORKER.onmessage = function (e) {
            const { normalizedBrightness } = e.data;
            if (normalizedBrightness > (0.4 / 4)) $(testOverlay).css('color', 'black');
            else $(testOverlay).css('color', 'lightgray');
        }
    }

Worker

self.onmessage = function (e) {
    const { imageData, downSample } = e.data;
    let totalBrightness = 0;

    for (let i = 0; i < imageData.length; i += 4 * downSample) {
        const pixelBrightness = (imageData[i] + imageData[i + 1] + imageData[i + 2]) / 3;
        totalBrightness += pixelBrightness;
    }
    
    // Note: the sum only includes every downSample-th pixel, but the divisor counts
    // every pixel, so the result comes out roughly downSample times smaller
    // (which is why the main thread compares against 0.4 / 4).
    const average = totalBrightness / (imageData.length / 4);
    const normalizedBrightness = average / 255;
    postMessage({ normalizedBrightness });
  };

I've never done anything like this before, so I don't even know if this is the correct or most efficient way to sample pixel brightness in a region (maybe shaders, but I don't speak the language).
A poke in the right direction is much appreciated. The method does work, but it is very slow, probably because I'm constantly fetching the imageData. I've limited it to fire every 20th frame, but I'm still seeing frame performance drop from 140 fps to 35 fps.

This is a common way to detect image brightness. The only real option here is to reduce the amount of data processed, for example by increasing the downSample value.
It also seems possible to use scene.onAfterRenderObservable with coroutines (instead of scene.registerAfterRender) to spread the load across several frames (though I don't know how that will play with the worker).
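Something along these lines might work, a rough sketch assuming Babylon's Observable.runCoroutineAsync from the coroutines docs and reusing the canvas, overlay and worker names from your post:

// Untested sketch: spread the read-back and the worker hand-off over separate frames.
// 'renderCanvas', 'sidebar-wrapper-id' and the worker are taken from the post above.
function* sampleBrightnessCoroutine(worker: Worker) {
    const canvas = document.getElementById('renderCanvas') as HTMLCanvasElement;
    const context = canvas.getContext('2d', { willReadFrequently: true });
    const overlay = document.getElementById('sidebar-wrapper-id');

    while (true) {
        const bounds = overlay.getBoundingClientRect();
        // Frame 1: the expensive pixel read-back.
        const pixels = context.getImageData(bounds.left, bounds.top, bounds.width, bounds.height).data;
        yield; // hand the next frame back to the renderer
        // Frame 2: off-load the averaging to the worker.
        worker.postMessage({ imageData: pixels, downSample: 4 });
        // Wait a number of frames before sampling again.
        for (let i = 0; i < 20; i++) yield;
    }
}

scene.onAfterRenderObservable.runCoroutineAsync(sampleBrightnessCoroutine(UIWORKER));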

Have you tried without the worker?

Also might be related: javascript - getImageData - Web workers - How can I reduce garbage collection? - Stack Overflow


Thanks for the responses, I’ll try them out.

Yes. The worker gave around a 20% performance increase. However, that thread you linked is 6 years old; that issue has since been patched, and I'm not sending the entire ImageData object, just the clamped array. Sending the entire ImageData object had issues with cloning, and cleanup was delayed.
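If cloning is still a cost, the underlying buffer can also be transferred rather than copied (a sketch, assuming the same bounds, context and worker as in the first snippet):

// Transferring imageData.buffer moves ownership to the worker, so no structured-clone
// copy is made on the main thread (the array is no longer usable here afterwards).
const imageData = context.getImageData(bounds.left, bounds.top, bounds.width, bounds.height).data;
this.UIWORKER.postMessage({ imageData, downSample: 4 }, [imageData.buffer]);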


So I think I found either a bug, or I'm using Babylon.js incorrectly.

My current setup has 3 canvases. The first, "mainCanvas", is created in JavaScript and assigned to the engine.

const mainCanvas = document.createElement('canvas');
const engine = new BABYLON.Engine(mainCanvas, true);

The second canvas is on the HTML page itself:

const canvas = document.getElementById('renderCanvas');
camera.attachControl(canvas, true);
guiCamera.attachControl(canvas, true);
engine.inputElement = canvas;
engine.registerView(canvas, [camera, guiCamera]);

The third canvas also has a separate camera and is attached to the engine via registerView().
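Roughly like this (a sketch; 'extraViewCanvas' and 'extraCamera' are placeholder names for my actual ones):

// Third view: another on-page canvas driven by its own camera via registerView.
const extraViewCanvas = document.getElementById('extraViewCanvas') as HTMLCanvasElement;
engine.registerView(extraViewCanvas, extraCamera);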

When my code grabs the canvas in my test function

  testCanvasRegionForPixelBrightness() {
        if (!this.ADAPTIVE_UI) return;
        const canvas: HTMLCanvasElement = document.getElementById('renderCanvas') as HTMLCanvasElement;
        const context = canvas.getContext('2d', { willReadFrequently: true });

        const testOverlay = document.getElementById('sidebar-wrapper-id');
        const divBounds = testOverlay.getBoundingClientRect();
        const imageData = context.getImageData(divBounds.left, divBounds.top, divBounds.width, divBounds.height).data;
        
        this.UIWORKER.postMessage({imageData: imageData, downSample: 4});
        this.UIWORKER.onmessage = function (e) {
            const { normalizedBrightness } = e.data;
            // Note: css('color') returns the computed color (usually an "rgb(...)" string),
            // so these string comparisons against 'black' / 'lightgray' may never match.
            const currentStyle = $(testOverlay).css('color');
            if (normalizedBrightness > (0.4 / 4) && currentStyle !== 'black') {
                $(testOverlay).css('color', 'black');
            }
            if (!(normalizedBrightness > (0.4 / 4)) && currentStyle !== 'lightgray') {
                $(testOverlay).css('color', 'lightgray');
            }
        }
    }

I found out through the profiler that the low frame rate is caused by a repaint of the canvas. It's repainting the canvas at 16777215 × 16777215.
I added an if statement to only update the CSS if the current color is different, but that isn't the issue; it's most definitely the size of the canvas.

According to Babylon.js, the canvas is close to my desktop resolution of 1440p, which leads me to believe the issue might be the mainCanvas attached to the engine. No size was specified, so it must be defaulting to the maximum size WebGL can support.
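If the default size really is the culprit, one thing to try (an untested guess; the 2560 × 1440 numbers are just placeholders for my monitor) would be giving mainCanvas an explicit size before creating the engine:

// Give the detached working canvas an explicit backing size instead of relying on defaults.
const mainCanvas = document.createElement('canvas');
mainCanvas.width = 2560;
mainCanvas.height = 1440;
const engine = new BABYLON.Engine(mainCanvas, true);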

However, as a test I removed the extra canvases and the registerView calls and did the following:

const canvas = <HTMLCanvasElement>document.getElementById("renderCanvas");
const engine = new BABYLON.Engine(canvas, true);
scene.activeCameras = [camera, guiCamera];

When I now fetch the canvas context

const canvas: HTMLCanvasElement = document.getElementById('renderCanvas') as HTMLCanvasElement;
const context = canvas.getContext('2d', { willReadFrequently: true });

The context is now null and I have no idea why. Is there a method in Babylon.js to get the imageData?

Edit: After much testing and reading more of the Babylon docs, it seems there was a better way of doing this that doesn't involve touching the canvas at all.

 async testCanvasRegionForPixelBrightness() {
        if (!this.ADAPTIVE_UI) return;
        const testOverlay = document.getElementById('sidebar-wrapper-id');
        const divBounds = testOverlay.getBoundingClientRect();
        const pixelData = await this._engine.readPixels(divBounds.left, divBounds.top, divBounds.width, divBounds.height);
        this.UIWORKER.postMessage({pixelData, divBounds});
        this.UIWORKER.onmessage = function (e) {
            const { normalizedBrightness } = e.data;
            const currentStyle = $(testOverlay).css('color');
            if (normalizedBrightness > (0.4 / 4) && currentStyle !== 'black') {
                $(testOverlay).css('color', 'black');
            }
            if (!(normalizedBrightness > (0.4 / 4)) && currentStyle !== 'lightgray') {
                $(testOverlay).css('color', 'lightgray');
            }
        }
    }

Worker

self.onmessage = async function (e) {
    const { pixelData, divBounds }: { pixelData: Uint8Array, divBounds: DOMRect } = e.data;
    
    let totalBrightness = 0;

    const numSamples = 100;

    for (let sample = 0; sample < numSamples; sample++) {
        const haltonX = haltonSeq(sample, 2);
        const haltonY = haltonSeq(sample, 3);

        const randomX = Math.floor(haltonX * divBounds.width) + divBounds.left;
        const randomY = Math.floor(haltonY * divBounds.height) + divBounds.top;

        if (randomX >= divBounds.left && randomX < divBounds.left + divBounds.width && 
        randomY >= divBounds.top && randomY < divBounds.top + divBounds.height) {
            const localX = randomX - divBounds.left;
            const localY = randomY - divBounds.top;

            const index = (localY * divBounds.width + localX) * 4;
            const pixelBrightness = (pixelData[index] + pixelData[index + 1] + pixelData[index + 2]) / 3;
                totalBrightness += pixelBrightness;
        }
    }

    const average = totalBrightness / numSamples;
    const normalizedBrightness = average / 255;
    postMessage({ normalizedBrightness });

};

/**
 * Butchered quasi-Monte Carlo method
 * https://en.wikipedia.org/wiki/Quasi-Monte_Carlo_method
 */
function haltonSeq(index: number, base: number): number {
    let result = 0;
    let f = 1 / base;
    let i = index;
    while (i > 0) {
        result += f * (i % base);
        i = Math.floor(i / base);
        f = f / base;
    }
    return result;
}
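For reference, the first few values show how quickly the sequence spreads over [0, 1):

// Base 2 gives 0.5, 0.25, 0.75, 0.125, 0.625, ... and base 3 gives 1/3, 2/3, 1/9, 4/9, 7/9, ...
// so the (x, y) sample points cover the sampled region evenly without clumping.
for (let i = 1; i <= 5; i++) {
    console.log(haltonSeq(i, 2), haltonSeq(i, 3));
}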

I just need to fine-tune how sensitively it reacts to the pixel brightness.
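One idea for that tuning (just a thought, not tested): add a bit of hysteresis so the color doesn't flicker when the brightness hovers around the threshold. The 0.12 / 0.08 values below are made up and would need adjusting:

// Sketch: two thresholds instead of one, so small brightness changes near the
// cut-off don't keep toggling the text color every sample.
let useDarkText = false; // true when the background is bright enough for dark text
function pickColor(normalizedBrightness: number): string {
    if (!useDarkText && normalizedBrightness > 0.12) useDarkText = true;
    else if (useDarkText && normalizedBrightness < 0.08) useDarkText = false;
    return useDarkText ? 'black' : 'lightgray';
}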

Can you share a repro project somewhere, like jsFiddle or GitHub?