I’m currently trying to figure out how to handle GUI elements that the user interacts with via pointer events (e.g. buttons) when using an offscreen canvas / web workers, but I’ve had no luck so far.
I know that for mesh picking I can use scene.pick(). That doesn’t work for GUI elements, however, and I don’t see any picking/hit-test method available on AdvancedDynamicTexture. Do I need to write my own control-picking logic, or is there a feature already in the library? What is the usual approach in such cases?
I would not be very happy exposing this function (even though there is “nothing to hide”), but I can suggest this: we have a way to simulate pointer events. simulatePointerUp/Down/Move are used extensively throughout the framework and are probably the solution you are looking for. They are part of the scene.
I already tried it with scene.simulatePointer*(), but it didn’t work:
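Roughly what I tried from the worker (just a sketch; the surrounding message handling is my own, and I pass an empty PickingInfo since there is no mesh hit for GUI-only clicks):

```ts
import { PickingInfo, Scene } from "@babylonjs/core";

declare const scene: Scene; // the scene living in the worker

// Replay a pointer press that was forwarded from the main thread.
// The GUI controls never react to this, presumably because no hit
// test against the AdvancedDynamicTexture happens along the way.
const pickInfo = new PickingInfo();
scene.simulatePointerDown(pickInfo, { pointerId: 1 });
scene.simulatePointerUp(pickInfo, { pointerId: 1 });
```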
The doc string for simulatePointerUp also explicitly states that it’s for simulating pointer events on a mesh. So I guess that’s the desired behavior, or am I missing something here?
When simulating pointer events, you need to do the picking yourself and pass the pick info.
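Something along these lines (a sketch; x and y are whatever pointer coordinates you have available):

```ts
// Do the pick against the scene yourself, then hand the
// resulting PickingInfo to the simulated pointer events.
const pickResult = scene.pick(x, y);
if (pickResult) {
    scene.simulatePointerDown(pickResult);
    scene.simulatePointerUp(pickResult);
}
```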
But this won’t work with fullscreen UI, which is what I assume you are trying to achieve?
In that case we will need to expose some sort of picking method for the ADT. It’s neither the private _doPicking nor the private _translateToPicking, so we’ll have to think about it.
Yes, I’m using a fullscreen UI. A simplified version of what I’m trying to achieve is here:
It’s a draggable pop-up box showing information about the picked mesh.
I have very little understanding of Babylon’s inner mechanics, so this might (probably) be a stupid question, but would it be possible to create a virtual DeviceInput, i.e. a mock/proxy of WebDeviceInputSystem that delegates the events on the canvas via the web worker’s message bus?
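Conceptually something like this (just a sketch of the bridge I have in mind; the message shape and names are mine, and the GUI hit-testing question would remain open):

```ts
// Main thread: forward pointer events on the visible canvas to the worker.
canvas.addEventListener("pointerdown", (e) => {
    worker.postMessage({
        type: "pointerdown",
        x: e.offsetX,
        y: e.offsetY,
        pointerId: e.pointerId,
    });
});

// Worker: replay the forwarded event against the scene.
self.onmessage = ({ data }) => {
    if (data.type === "pointerdown") {
        const pick = scene.pick(data.x, data.y);
        scene.simulatePointerDown(pick, { pointerId: data.pointerId });
    }
};
```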