So from what I can see, clicking on the sphere returns picking information that shows either a hit on the ground (if your cursor is over the ground mesh) or null if it's not. The sphere is not pickable through scene.onPointerObservable, so that's at least a good sign. As for why the behavior changed, I'm not sure. I'll have to investigate and see what's happening.
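For reference, here's roughly how I'm checking the pick result (a playground-style sketch using the BABYLON global; the mesh setup is just from my local repro, not necessarily yours):

```ts
// Playground-style sketch: log what the pointer observable reports on click.
scene.onPointerObservable.add((pointerInfo) => {
    if (pointerInfo.type === BABYLON.PointerEventTypes.POINTERDOWN) {
        const pick = pointerInfo.pickInfo;
        // Clicking over the sphere reports the ground mesh (or no hit), never the sphere.
        console.log(pick && pick.hit ? pick.pickedMesh?.name : "no hit");
    }
});
```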
Hello, @PolygonalSun sorry to bother, but do you have any news here?
Just wanted to know whether you guys are going to change it back to the original behavior, or whether I should just deal with the new one somehow.
Sorry about the delay in response. I haven't been able to figure out the root cause on this one. As far as I've found, whatever caused this change in behavior happened during one of the early 5.0 alpha pushes (prior to our repo refactor). I'll keep digging, and once I have something, I'll let you know.
Hey, I just wanted to give an update on this. After talking with the team, I found that the issue is related to when the pointer's event information is handled versus when the utilityLayer's scene handles things. As it turns out, the utilityLayer does everything within the prePointer observable, while pointer handling is "released" after that point, so there's a bit of a timing mismatch. While I can address that, there's another issue I have to figure out before I can release a fix: when the mouse is clicked on the active scene and released on an element in the utilityLayer scene, the pointerup event may not be received properly. That's what I'm currently working on fixing.
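To make the ordering concrete, here's a rough illustration (not the actual engine code, just two observers showing the sequence the mismatch lives in):

```ts
// Playground-style sketch: prePointer observers run before the main scene's
// pointer observers, and the utilityLayer does its work in the earlier stage.
scene.onPrePointerObservable.add((info) => {
    // The utilityLayer's scene handles picking around this stage.
    console.log("prePointer:", info.type);
});
scene.onPointerObservable.add((info) => {
    // The main scene's pointer handling runs after the prePointer stage,
    // which is where the timing mismatch shows up.
    console.log("pointer:", info.type);
});
```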
Interesting reading, I wouldn't have thought of that. Thinking back on some issues I had in my projects with multiple utilityLayers, it kind of rings a bell now. I hope you'll find a solution for it; it would definitely be a nice fix.
Actually, from my understanding @Evgeni_Popov is about to look into the management of utilityLayers. We might just add this thread to it (sorry for the additional load).
I think the problem I encountered in my scene is related to this: when I trigger the gizmo in another utility layer (moving an object) and then switch back to the main layer (where the gizmo isn't), I can click on the invisible gizmo and keep dragging my object, even though the gizmo is correctly in the other layer. It only happens after I've triggered the gizmo by moving the object (in the other layer).
Maybe this can help solve the issue, or at least point to something related.
Thanks for all your work!
I’ll just say that we resolved the issue in our project by getting rid of the utility layer altogether.
We use a separate rendering group to render our gizmos on top of other objects and that’s really all we needed.
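Here's a minimal sketch of that setup (gizmoRoot is just a stand-in for whatever node parents your gizmo meshes):

```ts
// Put the gizmo meshes in a higher rendering group so they draw after (and on top of)
// everything in group 0. The depth buffer is cleared between rendering groups by
// default, so the gizmo isn't occluded by regular scene geometry.
const GIZMO_RENDERING_GROUP = 1;
gizmoRoot.renderingGroupId = GIZMO_RENDERING_GROUP;
gizmoRoot.getChildMeshes().forEach((mesh) => {
    mesh.renderingGroupId = GIZMO_RENDERING_GROUP;
});
```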
Maybe this’ll help you
Unfortunately I can't easily use that method in my scene; I fixed it by disposing of and recreating the gizmo and all its components whenever the viewport changes.
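Roughly what my workaround looks like (variable names are from my own code, and in this sketch I rebuild the utility layer along with the gizmo):

```ts
// Dispose of the gizmo and its utility layer, then recreate them when the viewport changes.
let utilityLayer: BABYLON.UtilityLayerRenderer | null = null;
let gizmo: BABYLON.PositionGizmo | null = null;

function resetGizmo(scene: BABYLON.Scene, target: BABYLON.AbstractMesh | null) {
    gizmo?.dispose();
    utilityLayer?.dispose();
    utilityLayer = new BABYLON.UtilityLayerRenderer(scene);
    gizmo = new BABYLON.PositionGizmo(utilityLayer);
    gizmo.attachedMesh = target;
}
```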
Still hoping for a patch for this later on, anyway.
Hey, sorry it took so long to find a fix for this. The issue was an incredibly subtle one to track down, but I am currently testing a fix. If I don't find anything weird in the edge cases, all that's left is to write a quick test for it and create the PR. I'll let you know when the PR is live.