@ryantrem The not-in-the-right-place part is on me; let me elaborate on what I’m currently trying.
I’m still figuring out a way to get some form of image tracking working, and I thought maybe I could just use react-native-camera’s QR code tracking. So I’m currently working on a little proof of concept for that.
So that’s basically what I have: a proof of concept. I can track a QR code and use it as a starting point. It doesn’t work well at all yet, but it works enough to show whether the concept is viable.
Unfortunately this approach doesn’t work well enough (specifically the swap from react-native-camera to Babylon XR), but the QR code tracking itself does. So I’m hoping that once camera support is added (Babylon React Native snapshot without 3D objects - #3 by Peter) I can use Firebase ML (@react-native-firebase/ml - npm) to process the images or stream coming from it and do the QR code tracking that way.
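The loop I have in mind looks roughly like this. This is only a sketch of the shape of it: `takeSnapshot` and the detector type stand in for whatever the future camera support and the ML library actually expose, they are not real APIs from Babylon React Native or @react-native-firebase/ml.

```typescript
// Hypothetical sketch — takeSnapshot and QrDetector are placeholders,
// not actual Babylon React Native or Firebase ML APIs.

interface Point { x: number; y: number }
interface QrDetection { value: string; corners: Point[] }

// Placeholder for whatever barcode detector the ML library provides.
type QrDetector = (imageUri: string) => Promise<QrDetection[]>;

// Center of the detected code in screen coordinates,
// which would become the anchor point for the AR content.
function qrCenter(d: QrDetection): Point {
  const n = d.corners.length;
  const sum = d.corners.reduce(
    (acc, c) => ({ x: acc.x + c.x, y: acc.y + c.y }),
    { x: 0, y: 0 }
  );
  return { x: sum.x / n, y: sum.y / n };
}

// Grab one frame from the camera feed and hand it to the detector.
async function trackQrOnce(
  takeSnapshot: () => Promise<string>, // placeholder for future camera support
  detect: QrDetector
): Promise<Point | null> {
  const frame = await takeSnapshot();
  const codes = await detect(frame);
  return codes.length > 0 ? qrCenter(codes[0]) : null;
}
```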
My question is basically whether this has a chance of success, and whether the translation from screen space to the AR world will be consistent across different phones.
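For context on the consistency part: the screen-to-world mapping is just camera math, so as far as I understand it should agree across phones as long as each platform reports correct intrinsics (field of view and resolution). A minimal, dependency-free sketch of that math (my own illustration of the idea, not Babylon’s actual API — in Babylon you’d use something like `scene.createPickingRay` instead):

```typescript
// Illustration only: pinhole-camera math behind screen -> world rays.
// Assumes a vertical field of view `fovY` in radians.

interface Vec3 { x: number; y: number; z: number }

// Convert a screen pixel to a normalized view-space ray direction.
// The same pixel maps to the same direction on any phone, provided the
// platform reports the correct field of view and screen resolution.
function screenToViewRay(
  px: number, py: number,
  width: number, height: number,
  fovY: number
): Vec3 {
  // Normalized device coordinates in [-1, 1]; y is flipped because
  // screen y grows downward.
  const ndcX = (2 * px) / width - 1;
  const ndcY = 1 - (2 * py) / height;
  const tanHalf = Math.tan(fovY / 2);
  const aspect = width / height;
  // Direction in camera space (camera looks down -z in this convention).
  const dir: Vec3 = { x: ndcX * tanHalf * aspect, y: ndcY * tanHalf, z: -1 };
  const len = Math.hypot(dir.x, dir.y, dir.z);
  return { x: dir.x / len, y: dir.y / len, z: dir.z / len };
}
```

Intersecting that ray with the detected AR plane would then give the world position of the QR code.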
However, you saying you’re already doing this with good results in a production app answers that already, so thanks!