I’m currently trying to design a system that can render things that are located on the BJS canvas (i.e., in the 3D scene) on the UI layer (widgets created in HTML and styled via CSS), as well as possibly the other way around.
Specific tasks that I’d want to support are:
- Render animated sprites as part of the UI, where the sprites are normally part of the scene and assembled manually, using fairly complex logic that I’d rather reuse from my 3D code
- Create 2D text planes that are styled via CSS and then place them somewhere in the scene, for example near sprites/meshes, as billboards rather than regular “spatial” planes
Essentially, I’d like there to be a way to communicate between the 3D and the UI layers so that I may render things on the UI that are also available in the scene, and vice versa.
I’ve thought about the following (potential) solutions:
- Create another scene for each sprite/animation that is to be rendered, and put a canvas where the sprites should appear in the UI. Problem: there could be many of them (possible performance degradation?), it seems complicated, and I want the sprites to blend in with the UI background (i.e., the scene background would have to be transparent?)
- As for the other direction, the best I could come up with is creating actual (3D) text planes in code and then somehow applying the same CSS styles to them, if that’s even possible, so that they look reasonably close to the widgets I’ve envisioned (and created via CSS)
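To make the first idea concrete, here’s a rough sketch of what I mean by a per-widget scene with a transparent background. The function name, the canvas, and the sprite-sheet URL are just placeholders; it assumes Babylon.js is loaded and that a `clearColor` with alpha 0 is enough to let the CSS background show through:

```javascript
// Hypothetical helper: render an animated sprite into a small UI canvas with a
// transparent scene background so it blends with the CSS behind it.
// Assumes the Babylon.js global (BABYLON) and a sprite sheet URL.
function createSpriteWidget(BABYLON, canvas, spriteSheetUrl) {
  const engine = new BABYLON.Engine(canvas, true);
  const scene = new BABYLON.Scene(engine);
  // Alpha 0 => transparent background, so only the sprite is visible.
  scene.clearColor = new BABYLON.Color4(0, 0, 0, 0);

  const camera = new BABYLON.FreeCamera("cam", new BABYLON.Vector3(0, 0, -5), scene);
  camera.setTarget(BABYLON.Vector3.Zero());

  // One manager per widget scene; cell size and count are placeholders.
  const manager = new BABYLON.SpriteManager("mgr", spriteSheetUrl, 1, 64, scene);
  const sprite = new BABYLON.Sprite("s", manager);
  sprite.playAnimation(0, 9, true, 100); // loop cells 0..9, 100 ms per frame

  engine.runRenderLoop(() => scene.render());
  return { engine, scene, sprite };
}
```

My worry is exactly that calling this once per UI sprite means one engine/scene/render loop per widget.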
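For the second direction, an alternative I’ve been wondering about is skipping textured planes entirely: keep the label as a real HTML/CSS element and reposition it every frame at the mesh’s projected screen coordinates. A rough sketch (the function name and label element are hypothetical; it assumes a running Babylon scene, engine and camera):

```javascript
// Hypothetical sketch: anchor an HTML/CSS-styled label to a mesh by projecting
// the mesh's world position into screen space each frame.
// Assumes the Babylon.js global (BABYLON) plus an existing scene/engine/camera.
function attachHtmlLabel(BABYLON, scene, engine, camera, mesh, labelElement) {
  labelElement.style.position = "absolute"; // visual styling stays in CSS classes

  scene.onBeforeRenderObservable.add(() => {
    // World position -> screen position for the current camera/viewport.
    const screenPos = BABYLON.Vector3.Project(
      mesh.getAbsolutePosition(),
      BABYLON.Matrix.Identity(),
      scene.getTransformMatrix(),
      camera.viewport.toGlobal(engine.getRenderWidth(), engine.getRenderHeight())
    );
    labelElement.style.left = `${screenPos.x}px`;
    labelElement.style.top = `${screenPos.y}px`;
  });
}
```

That would give me real CSS styling “near” a mesh, though it lives on the DOM layer rather than truly in the scene (no occlusion by other meshes), which may or may not be acceptable.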
Maybe there’s a standard way to accomplish this? I’m thankful for any input!