Thanks for the replies! So from what I understand, currently there is no tracking of an image / label in the real scene? And if I need marker tracking I will have to use BabylonAR, right?
Correct! WebXR has no image recognition (and also no real way of using the camera information while in AR). This will require an external implementation (by external I mean non-WebXR).
Thank you all for the replies! Basically, I want at least to be able to track a label on a bottle through my camera. Do you think it is possible to start working towards such functionality? Can you provide some kind of development roadmap? I guess I can try to help in this process.
Best, Panagiotis
It's definitely possible, though I don't think you're looking for either a SLAM or a marker tracker. If your bottle's label is singular enough, you could theoretically turn it into a marker of sorts; but more likely and more generally, what you're probably looking for is some form of feature matching.
As far as a roadmap goes, I don't know of any active work to implement this right now; but it's an awesome idea, and I'd be happy to offer what support/advice I can to the effort! At a broad guess, the steps to make this happen are as follows:
- Confirm that the object you want to track is trackable. (Does it have good, recognizable features? Is it sufficiently matte and opaque?)
- Follow the tutorial linked above to understand and implement feature matching (presumably in C++).
- If you need 3D pose estimation and not just screen space tracking, follow the OpenCV tutorial for real-time pose estimation of textured objects to add that capability to your implementation.
- Compile all the above for WebAssembly and add a TypeScript interface, a la BabylonAR.
- Profile and optimize the implementation to achieve the desired performance.
Steps 1 through 4 should take a little time, but they shouldn't be too hard because there aren't any unsolved problems; it's just following in the footsteps of existing work. Step 5 is where it gets interesting: depending on the use case, the target device, and the performance characteristics of the baseline implementation, Step 5 could range from fairly straightforward to nightmarishly difficult. That's where the fun begins.
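To make Step 2 a little more concrete, here is a rough sketch of what ORB feature matching could look like directly in the browser with opencv.js (OpenCV compiled to WebAssembly). It assumes an opencv.js build that includes the features2d module (ORB and BFMatcher); the canvas IDs are placeholders, and Step 3's pose estimation (solvePnP) may need a build that also exposes calib3d.
function matchLabel() {
    // Load the reference label and the current camera frame from canvases
    // (the canvas IDs are placeholders) and convert to grayscale for ORB.
    const refRgba = cv.imread('referenceCanvas');
    const frameRgba = cv.imread('frameCanvas');
    const reference = new cv.Mat();
    const frame = new cv.Mat();
    cv.cvtColor(refRgba, reference, cv.COLOR_RGBA2GRAY);
    cv.cvtColor(frameRgba, frame, cv.COLOR_RGBA2GRAY);

    // Detect ORB keypoints and compute binary descriptors for both images.
    const orb = new cv.ORB();
    const kpRef = new cv.KeyPointVector();
    const kpFrame = new cv.KeyPointVector();
    const desRef = new cv.Mat();
    const desFrame = new cv.Mat();
    const noMask = new cv.Mat();
    orb.detectAndCompute(reference, noMask, kpRef, desRef);
    orb.detectAndCompute(frame, noMask, kpFrame, desFrame);

    // Brute-force Hamming matcher with cross-check suits ORB's binary descriptors.
    const matcher = new cv.BFMatcher(cv.NORM_HAMMING, true);
    const matches = new cv.DMatchVector();
    matcher.match(desRef, desFrame, matches);
    console.log('matches found:', matches.size());

    // opencv.js objects live in WASM memory and must be freed explicitly.
    [refRgba, frameRgba, reference, frame, desRef, desFrame, noMask,
        kpRef, kpFrame, matches, orb, matcher].forEach((o) => o.delete());
}
The match count (and, later, a homography or pose check on the matched points) is what tells you whether the label is actually in view.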
+1
I hope BabylonAR can be used for marker / image tracking on React Native, like ViroReact's approach with out-of-the-box components. I also wonder if BabylonNative is going to try to develop native capabilities for that.
I just saw that PlayCanvas has a PR ready for image tracking?
When can we expect it in Babylon @RaananW ?
Awesome, I very much like the fact that more and more drafts are being proposed.
I will add that to my long todo list, I do hope to be able to test that very soon!
Ohhhhh yes !!!
Coming very soon
Here - [XR] AR image tracking feature by RaananW · Pull Request #9512 · BabylonJS/Babylon.js (github.com)
Well, what can I say! This is great!
I have started working with OpenCV and then found ARnft, which also seems to be working: GitHub - webarkit/ARnft at babylonjs-renderer
Now we have many options to check! Is any example code available for checking this feature?
This specific code example was in the PR I submitted yesterday - [XR] AR image tracking feature by RaananW · Pull Request #9512 · BabylonJS/Babylon.js (github.com), but I can share a playground later today or tomorrow. And of course the documentation page will be updated as soon as I get the chance.
By the way @RaananW, do you know if it is possible to enable the immersive-ar in a common desktop environment using a USB camera?
AFAIK it is not possible. I remember looking into it - the classes required for an immersive-AR experience (like the image detection or hit test) don't exist in your desktop browser, so even if you could emulate it, it wouldn't have worked.
I wonder, however, if an emulated Android device on your desktop will work, if connected to your webcam. Haven't tried that yet.
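If you want to check programmatically before trying to enter XR (useful for failing gracefully on desktop), here is a minimal sketch - navigator.xr.isSessionSupported and Babylon's WebXRSessionManager.IsSessionSupportedAsync are the real APIs, the logging around them is just illustrative.
async function isArAvailable() {
    // Raw WebXR check - navigator.xr is usually undefined on desktop browsers.
    if (navigator.xr) {
        const supported = await navigator.xr.isSessionSupported('immersive-ar');
        console.log('immersive-ar supported (raw WebXR):', supported);
    }
    // Babylon's helper wraps the same check and resolves to a boolean.
    return await BABYLON.WebXRSessionManager.IsSessionSupportedAsync('immersive-ar');
}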
I just installed Android Studio, Google AR services, and the latest Chrome on an emulated Pixel 4 (Android 11) device, with the back camera set to use my webcam0. But the WebXR Incubations flag is not available in Chrome, and Babylon says that "immersive-ar WebXR session mode is not available in your browser". I also tried ARCore Elements, but it does not work either.
I guess something is wrong with setting up AR in the emulator, unfortunately. The same happens even if I use the VirtualScene for the back camera.
By the way, it seems that the web camera does not work even in the camera app, where it just crashes. The camera app only works with the VirtualScene.
Hi @RaananW,
is it possible to share your code in playground? Unfortunately, I have not managed to make the image tracking work.
Want to share what you tried? The problem with sharing code here is that my images are hosted locally, so my example won't work on your device (due to the missing images). Share your code and we can work on it together.
I get the following error running Babylon.js v5.0.0-alpha.6 (on Chrome 87.0.4280.88 with all XR flags enabled).
babylon.js:16 Uncaught (in promise) Error: required feature not compatible
at e.enableFeature (babylon.js:16)
at Function.CreateScene ((index):102)
The feature is BABYLON.WebXRFeatureName.IMAGE_TRACKING as used in the following:
// get the features manager
const fm = xr.baseExperience.featuresManager;
const imageTracking = fm.enableFeature(BABYLON.WebXRFeatureName.IMAGE_TRACKING, 'latest', {
    images: [{ src: 'example.png', estimatedRealWorldWidth: 0.1 }]
});
What browser are you using to test the code? This only works in Chrome Canary (Android) with the incubation features enabled in the flags.
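In the meantime, here is a rough sketch of how the pieces could fit together once the feature is available to you, based on the API in the PR. The image URL, estimated width, and marker mesh are placeholders, so treat it as an illustration rather than the shared playground.
async function createTrackingScene(engine) {
    const scene = new BABYLON.Scene(engine);
    new BABYLON.HemisphericLight('light', new BABYLON.Vector3(0, 1, 0), scene);

    // Placeholder mesh that will follow the tracked image.
    const marker = BABYLON.MeshBuilder.CreatePlane('marker', { size: 0.1 }, scene);
    marker.rotationQuaternion = new BABYLON.Quaternion();

    const xr = await scene.createDefaultXRExperienceAsync({
        uiOptions: { sessionMode: 'immersive-ar' }
    });

    const imageTracking = xr.baseExperience.featuresManager.enableFeature(
        BABYLON.WebXRFeatureName.IMAGE_TRACKING, 'latest', {
            images: [{ src: 'https://example.com/label.png', estimatedRealWorldWidth: 0.1 }]
        });

    // Re-pose the marker mesh whenever the tracked image's pose updates.
    imageTracking.onTrackedImageUpdatedObservable.add((image) => {
        image.transformationMatrix.decompose(marker.scaling, marker.rotationQuaternion, marker.position);
    });

    return scene;
}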