I'm encountering some odd behaviour with gizmos and wonder whether it's a bug or an issue in my code.
- I use the engine's `registerView` feature to render a single scene to multiple canvases.
- Each view is registered with a different canvas and a different camera.
- Only one view is created as interactive: its canvas is set as `engine.inputElement` and its camera is the one being controlled.
Situation on screenshot:
- It shows the Dude model with spheres created at the positions of his bones.
- View at the top is created first.
- View at the top (when created as interactive):
→ Always works for picking meshes for gizmos.
→ Breaks for selecting gizmo axes as soon as the camera is moved*.
- View at the bottom (when created as interactive — as in the attached screenshot):
→ Always works for selecting gizmo axes.
→ Breaks for picking meshes as soon as the camera is moved*.
*You can still select the mesh or axis, but only by clicking in a completely different place from where it is rendered on the canvas you interact with.
My naive interpretation of the issue:
It seems as if the raycast for picking a mesh to attach the gizmo to is based on one camera, while the raycast for picking gizmo elements (axes etc.) is based on another.
What I tried:
1. Setting both cameras as controllable — this does fix the issue, but it defeats the point of having one static view.
2. Setting a utility layer on the gizmoManager and pointing its render camera at the controllable one — no effect.
3. Initializing the gizmoManager at different points in time relative to view creation (before/after) and to scene rendering (before the first render, after the first render, long after the first render) — no effect.
4. Setting the camera that is supposed to be controlled as the active camera — no effect (well, apart from the gizmo not working at all this time).
5. Setting the camera that is supposed to be controlled as `cameraToUseForPointers` on the scene — again, this only breaks things completely.
(These are not two stacked screenshots — it's a single scene rendered to two canvases.)