@ericwood73 Just wanted to say that I somehow completely missed this topic when it came out (because I was in my winter/summer retreat in Australia)… but then: thank you for this amazing contribution. My only hope is that this will not influence the BJS team to enter ‘lazy mode’ on improvements to the BJS GUI.
us being lazy??? No way
GUI remains a super important cornerstone of our investments (just think about projected UI or support for XR)
Merged my friend! Thanks a lot!
DOM Overlay appears to be just for AR on Android mobile devices (it doesn’t work on Meta Quest or other HMDs). There has been some conversation about using WebXR Layers for rendering HTML in WebXR, but that seems to be a long way off, as it was last considered for the spec in 2022 and stagnated due to security concerns. Meta also has a new proposal that might work, as long as you are okay with a modal browser window centered in your canvas.
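For anyone who hasn’t tried it, here is a minimal sketch of how a DOM Overlay session is requested with the plain WebXR API. The element id and the fallback handling are illustrative assumptions, not anything from HtmlMesh:

```ts
// Minimal sketch of requesting a WebXR DOM Overlay session (AR on Android Chrome).
// The element id "xr-overlay" is an assumption for illustration only.
async function startArWithDomOverlay() {
  const overlayRoot = document.getElementById("xr-overlay");
  const xr = (navigator as any).xr; // cast keeps this compiling without WebXR type definitions

  if (!xr || !(await xr.isSessionSupported("immersive-ar"))) {
    console.warn("immersive-ar is not supported on this device");
    return undefined;
  }

  // dom-overlay is requested as an optional feature so the session still starts
  // on browsers (e.g. on Quest) that don't implement it.
  const session = await xr.requestSession("immersive-ar", {
    optionalFeatures: ["dom-overlay"],
    domOverlay: { root: overlayRoot },
  });

  // domOverlayState is only populated when the feature was actually granted.
  console.log("DOM overlay type:", session.domOverlayState?.type ?? "not available");
  return session;
}
```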
I suggested on another thread that we might be able to pick up the Three.js implementation of HTMLMesh, which works in WebXR because it renders the canvas as a texture on a plane. Sorry to cross-post, I just realized that this thread might have been the better place for that conversation.
Thanks. I’ll take a look. I didn’t realize we had access to a canvas in WebXR.
When I made that PR, was I supposed to update the version? I need this change to come in when npm i is run, but it looks like it’s not there as of yet.
Update: never mind, I put in 1.0.48, which was what the package.json said it was at. But it looks like 1.0.49 on npm had the fix!
tl;dr:
- manually draw the HTML to a hidden canvas, and convert the hidden canvas to a base64-encoded PNG that gets used as a texture on a plane mesh;
- catch events on the plane mesh and translate them to “synthetic” mouse or input events that get sent to the HTML for interaction (rough sketches of both steps appear below)
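To make the first bullet concrete, here is a minimal Babylon.js sketch of the texture half. It assumes something has already painted the HTML into an off-screen canvas (the manual drawing Three.js does); `paintHtmlToCanvas` is a hypothetical placeholder, not part of Babylon.js or HtmlMesh:

```ts
import { Mesh, MeshBuilder, Scene, StandardMaterial, Texture } from "@babylonjs/core";

// Hypothetical stand-in for the manual DOM drawing that Three.js's HTMLMesh performs.
declare function paintHtmlToCanvas(element: HTMLElement): HTMLCanvasElement;

export function createHtmlPlane(element: HTMLElement, scene: Scene): Mesh {
  const canvas = paintHtmlToCanvas(element);

  // Same trick the Three.js path ends up doing: serialize the canvas to a
  // base64-encoded PNG and load it as an ordinary texture.
  const dataUrl = canvas.toDataURL("image/png");
  const texture = Texture.CreateFromBase64String(dataUrl, "htmlTexture", scene);

  // Size the plane to match the canvas aspect ratio (1 world unit wide here).
  const aspect = canvas.height / canvas.width;
  const plane = MeshBuilder.CreatePlane("htmlPlane", { width: 1, height: aspect }, scene);

  const material = new StandardMaterial("htmlPlaneMat", scene);
  material.diffuseTexture = texture;
  material.emissiveTexture = texture; // keep it readable without scene lighting
  plane.material = material;
  return plane;
}
```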
Here’s how I understand Three.js’s code:
- Create a canvas from the HTML by manually drawing all the HTML elements onto a canvas that isn’t attached to the DOM (all the way down to drawing the checkmarks in checkboxes)
- The canvas is rendered as a base64-encoded PNG using canvas.toDataURL, which is then used as a texture on a plane mesh.
- This is buried deep in the Three.js code: CanvasTexture → Texture superclass constructor → Texture converts the input “image” (a.k.a. the canvas) to a Source → Source.toJSON uses ImageUtils to create an image → finally the real magic happens when ImageUtils calls canvas.toDataURL and the canvas is converted to a base64-encoded PNG.
- Interaction happens by catching events on the mesh and converting them to new MouseEvents and InputEvents that get sent to the underlying DOM element(s) (see the sketch after this list).
- I have to admit that I don’t understand why this code is emitting events directly on window too, rather than letting events bubble.
- Catch change events on the HTML and re-render the canvas when the HTML changes.
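And here is a rough Babylon.js sketch of the interaction half, assuming the plane was created like the sketch above and the source element is still in the document. Dispatching a single synthetic event on the root element is a simplification; Three.js’s HTMLMesh hit-tests individual descendants:

```ts
import { AbstractMesh, PointerEventTypes, Scene } from "@babylonjs/core";

// Sketch: forward pointer-down picks on the plane to the source HTML element
// as synthetic MouseEvents.
export function forwardPointerEvents(scene: Scene, plane: AbstractMesh, element: HTMLElement): void {
  scene.onPointerObservable.add((pointerInfo) => {
    if (pointerInfo.type !== PointerEventTypes.POINTERDOWN) return;

    const pick = pointerInfo.pickInfo;
    if (!pick?.hit || pick.pickedMesh !== plane) return;

    // UV coordinates of the hit point on the plane, in [0, 1].
    const uv = pick.getTextureCoordinates();
    if (!uv) return;

    // Map UVs to pixel coordinates inside the element (V is flipped vs. screen Y).
    const rect = element.getBoundingClientRect();
    const clientX = rect.left + uv.x * rect.width;
    const clientY = rect.top + (1 - uv.y) * rect.height;

    // "Synthetic" event, as described above; bubbles so listeners up the tree see it.
    element.dispatchEvent(new MouseEvent("click", { clientX, clientY, bubbles: true }));
  });
}
```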
Version 1.0.50 is up and has your changes.
Yeah, looking at this code, I don’t think it actually works in XR. The forums seem to confirm this. Even if it did, it’s pretty hacky and probably won’t support 99% of the use cases. I’m investigating a real solution, but there’s a lot to figure out and I don’t have anything to announce yet.
Here’s a demo of it working on my Meta Quest 3.
To be clear, I think your current implementation is better – it’s not a hacky manual drawing of content, and your approach is going to work for all websites and content in a very reliable manner. But after hours of research, this is my best guess of what can be done for VR / AR with the current state of the WebXR specs and implementations.
I hope you find something better though.
Happy to collaborate on a WebXR solution if you’re interested.
HtmlMesh 1.1.2 uses Babylon 7 as a peer dependency. Note that I did not find any backward-compatibility issues, but if you are using Babylon 7 and don’t want annoying peer-dependency warnings, this is your version. HtmlMesh will use Babylon 7 moving forward, although again, I don’t anticipate any backward-compatibility issues.
Amazing!
BOOKMARKED!
@Deltakosh what about making a category that contains topics hand-picked by the Babylon.js team? Topics that are really useful, like this one. I mean, all the tips are useful, but the ones like this are really #1 and push the possibilities of Babylon.js light years into the future.
I missed this topic by 4 months…
rotate the cube so it doesn’t face the camera. I bet a lot of users missed the PDF content and the other ones.
This is really super exciting!
We could think about adding a section in the doc for sure!! @PirateJC
Adding @PatrickRyan who now owns the docs.
@ericwood73 I just discovered this demo and I have to say it’s awesome.
The potential for VR is huge
++
Tricotou
Thanks. I don’t want to dampen your excitement, but this solution will not currently work in VR, as it requires a DOM layer. The DOM Overlays proposal for WebXR will allow this to work in overlay mode only. There is a demo from Three.js that we could easily port that renders HTML to a canvas, but it is extremely limited in what it supports. I’m exploring other solutions, but they are quite involved and may not be practical. I am interested in understanding the market for this. I’d be grateful if you could DM me with your use case to help with my analysis.
Ah ok, yeah, sorry, I might have read the topic a bit quickly. I just discovered it and did not read 100% of the 58 previous messages, just saw it talking about the Metaverse, Quest 3, etc…
Sure for DM, no problem
++
Tricotou