Hi!
I have a bit of a wild question: I’m still looking into image tracking for WebXR (or something like it). I found the mebjas/html5-qrcode library, which is based on zxing, and it works in combination with WebXR mode.
The one thing I still have to work around is that it needs a video stream on the page; by default it renders the stream into a div identified by an elementId.
But to make this fit the experience better, it would be amazing if I could render that div onto a texture in the BabylonJS scene instead of on the page next to the scene. Is something like that possible? I was thinking maybe a DynamicTexture, but I honestly have no idea where to start with that.
Any tips, tricks, or comments are greatly appreciated, thanks!
CSS3DRenderer is what you want. See: YouTube videos on a mesh (port of CSS3DRenderer.js)
If you care about the tech details: it basically adds the HTML page as a DOM element (an iframe) underneath your canvas, then punches a hole in the canvas using a mask, the size and shape of your DOM element, that writes to the depth buffer but not the color buffer. It then uses CSS transforms to warp the DOM element to match the camera perspective.
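The "hole punch" part can be sketched in plain Babylon.js. This is a minimal illustration, not the actual port: it assumes Babylon.js is loaded as the global `BABYLON`, and the names (`cssHoleMask`, `domSizeToSceneUnits`, the pixels-per-unit scale) are all made up for the example.

```javascript
// Illustrative scale helper: convert the DOM element's pixel size to
// scene units (the 100 px-per-unit factor is an arbitrary assumption).
function domSizeToSceneUnits(pxWidth, pxHeight, pxPerUnit = 100) {
  return { width: pxWidth / pxPerUnit, height: pxHeight / pxPerUnit };
}

// Sketch of the depth-only mask described above. It is only defined here,
// not called; a real setup would also keep it aligned with the iframe.
function createHolePunchMask(scene, pxWidth, pxHeight) {
  const { width, height } = domSizeToSceneUnits(pxWidth, pxHeight);
  const mask = BABYLON.MeshBuilder.CreatePlane(
    "cssHoleMask", { width, height }, scene);
  const mat = new BABYLON.StandardMaterial("cssHoleMat", scene);
  // Write depth but not color: the mesh occludes scene geometry behind it,
  // while the transparent canvas pixels reveal the DOM element underneath.
  mat.disableColorWrite = true;
  mat.disableLighting = true;
  mask.material = mat;
  return mask;
}
```

The key trick is `disableColorWrite`: the mask still participates in depth testing, so 3D objects in front of it draw normally, but the canvas stays see-through where the DOM element sits.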
If you can get the video into a video element somehow, you can actually use a VideoTexture directly. But if you need to stream video from a site like YouTube, you need the CSS3DRenderer.
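A minimal VideoTexture sketch, assuming Babylon.js as the global `BABYLON` and an existing `<video>` element already playing (e.g. fed by `getUserMedia`); the mesh/texture names and sizes are illustrative:

```javascript
// Illustrative helper: size a plane to match the video's aspect ratio.
function planeSizeForVideo(videoWidth, videoHeight, planeWidth = 1.6) {
  return { width: planeWidth, height: planeWidth * (videoHeight / videoWidth) };
}

// Put a playing <video> element onto a plane in the scene.
// Defined only; calling it requires a browser with Babylon.js loaded.
function applyVideoToPlane(scene, videoElement) {
  const size = planeSizeForVideo(
    videoElement.videoWidth || 16, videoElement.videoHeight || 9);
  const plane = BABYLON.MeshBuilder.CreatePlane("videoPlane", size, scene);
  const mat = new BABYLON.StandardMaterial("videoMat", scene);
  // VideoTexture accepts an HTMLVideoElement (or a URL) as its source.
  mat.diffuseTexture = new BABYLON.VideoTexture(
    "videoTex", videoElement, scene, true);
  mat.emissiveColor = BABYLON.Color3.White(); // show the video unlit
  plane.material = mat;
  return plane;
}
```

The texture updates itself from the video each frame, so no manual per-frame copying is needed.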
Hi! Thanks for the explanation. I totally forgot about CSS3DRenderer; I’ve looked into it in the past. I need this to also work on HoloLens, though, so I don’t think that’s an option.
But the VideoTexture seems like my best shot. I don’t know how I missed that, thanks!
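One possible way to combine the two, as a rough sketch: let html5-qrcode drive the camera and scanning in a hidden div, then reuse the `<video>` element it injects as the source of a Babylon VideoTexture. The `reader` div id is an assumption, and the selector for the injected video may need adjusting per library version; this is untested on HoloLens.

```javascript
// Illustrative helper: build html5-qrcode's camera constraint object.
function cameraConfig(preferRear = true) {
  return { facingMode: preferRear ? "environment" : "user" };
}

// Sketch only: start scanning in #reader, then show its video in the scene.
// Assumes Babylon.js (BABYLON) and html5-qrcode (Html5Qrcode) are loaded.
async function startScannerOnTexture(scene) {
  const scanner = new Html5Qrcode("reader");
  await scanner.start(
    cameraConfig(),
    { fps: 10 },
    (decodedText) => console.log("QR:", decodedText),
    () => {} // ignore per-frame decode failures
  );
  // html5-qrcode inserts a <video> into the target div while scanning.
  const video = document.querySelector("#reader video");
  // Hide the div rather than removing it, so scanning keeps running.
  document.getElementById("reader").style.visibility = "hidden";
  const plane = BABYLON.MeshBuilder.CreatePlane("qrPlane", { size: 1 }, scene);
  const mat = new BABYLON.StandardMaterial("qrMat", scene);
  mat.diffuseTexture = new BABYLON.VideoTexture("qrTex", video, scene, true);
  plane.material = mat;
  return plane;
}
```

Hiding with `visibility: hidden` (instead of removing the element) is deliberate: the library still needs its video element alive to keep decoding frames.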