New WebAR solution with Babylon.js

Hi! I’m alex :slightly_smiling_face: and I’m new here.

I’ve been working on a WebAR engine called encantar.js. You can use it in combination with Babylon.js to create Augmented Reality experiences. It’s open source, easy to use, and GPU-accelerated, and it runs anywhere (it’s not WebXR; it’s Augmented Reality built from scratch with computer vision).

At present, the engine features image tracking. There is a Babylon.js demo that you can try on your mobile device or on your PC. I created this account to share it with you guys!

I hope you find it useful! :slightly_smiling_face:

15 Likes

Hi @alemart! This is a pretty impressive project for a first post on the forum!
Congrats and welcome!

2 Likes

Looks pretty impressive!

I currently use AR.js with marker tracking in my project, and I wonder how this compares and what the differences are. I see it’s “just” image tracking (no marker tracking); presumably that results in slightly lower tracking quality than AR.js marker tracking.
I also see that the demo’s input video is quite low resolution (320x568). It would be interesting to see how much increasing it affects performance, so that the markers can be tracked from slightly further away…

Totally rooting for you with this project! :partying_face: Having basically only AR.js for open-source AR tracking isn’t great, especially considering it’s no longer in development…

1 Like

Welcome to the community!

Absolutely amazing! I had a look at the code and it’s very well written and commented too!

I can’t agree more!

KUDOS @alemart !

2 Likes

Thank you guys, I appreciate it :slightly_smiling_face:

I keep the resolution low by default so that it runs on all kinds of devices for all kinds of audiences, but you can use the API to increase the resolution of the tracking, of the camera, and of the rendered scene. Performance is affected by several factors, such as GPU upload times.
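To give a rough sense of why the camera resolution affects GPU upload times (a standalone illustration, not the encantar.js API): per-frame upload cost scales with the pixel count, so doubling both dimensions quadruples the data moved to the GPU each frame.

```javascript
// Rough cost model: each video frame is uploaded to the GPU, so per-frame
// work scales with the number of pixels (RGBA = 4 bytes per pixel).
function uploadBytesPerFrame(width, height) {
  return width * height * 4; // bytes per RGBA frame
}

// The demo's resolution vs. the same aspect ratio at double the size:
const low = uploadBytesPerFrame(320, 568);   // 727,040 bytes (~0.7 MB)
const high = uploadBytesPerFrame(640, 1136); // 2,908,160 bytes (~2.9 MB)

console.log(high / low); // 4
```

At 30 fps, that difference is roughly 22 MB/s vs. 87 MB/s of texture uploads, before any tracking work, which is why a higher resolution can hurt on low-end devices.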

2 Likes

Do you plan to continue updating this project over time? (Not asking you to see the future :smile: just short term: do you consider it finished and done with, or might you continue working on it a bit?)
Just trying to decide whether to spend some time testing a port of one of my projects to it from AR.js!

Sounds great!

Within my possibilities, yes. I dream of expanding it, but I’ve been doing it without support.

1 Like

excellent!

Hi,

I liked the idea. I tested it here and enjoyed trying to get the ball into the basketball hoop in a different way.

:smiley:

@alemart I’ve purchased your system and it works perfectly at frontal angles but fails at narrow ones. I think it is related to the angle transform, since the markers remain perfectly attached to the reference image.

My impression is that the transformation from the tracker to the camera orientation is slightly off: at those narrow angles the camera is rotated less than it should be. This effect is very visible in the Babylon demos you provide.

Regards

Thank you for your feedback. I know the cause of it and I’m already looking into it.

It is not that it is “wrong”; rather, the underlying mathematical model can be enhanced. In a nutshell, the plan is to add a more sophisticated model. Tracking was improved in the latest release (0.4.0), and I intend to do more work on it still.
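For readers curious about the math involved: tracking a planar image is classically modeled with a homography, from which the camera pose is recovered. This is textbook computer vision, not necessarily the exact model encantar.js uses:

```latex
% Pinhole camera with intrinsics K viewing a planar target (Z = 0).
% The homography H maps target-plane coordinates to image coordinates:
H \simeq K \,[\, \mathbf{r}_1 \;\; \mathbf{r}_2 \;\; \mathbf{t} \,]
% r1, r2 are the first two columns of the rotation R; t is the translation.
% Decomposing H yields the camera pose. At steep (grazing) angles the
% estimate becomes ill-conditioned, which is why refining the model
% improves tracking precisely in that regime.
```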

@alemart will we be informed on new releases? :blush:

I can share updates with you guys here :slightly_smiling_face:

2 Likes

A new version of encantar.js is out with improved tracking, particularly on steep angles!

@Escobar

5 Likes

Do you want to marry me? :grinning_face_with_smiling_eyes:

1 Like

Hello, Alemart. I wanted to try it, but I did not find the /dist folder or encantar.js in alemart/encantar-js: GPU-accelerated Augmented Reality for the web. Is that expected?
Is this instruction still current? Set up a web server - encantar.js: GPU-accelerated Augmented Reality for the web

No, but I’m glad to see that you’re enchanted by my magic! :laughing:

I’d like to ask everyone who is interested in this project or who is already using it to support it by purchasing a copy or by becoming a sponsor.

If you’re able to tell others about the project, that also helps.

I dream of expanding this work, but having support is essential. If you’re like me and would also like to see the project expand, that’s how you can help.

Purchase your copy from the website in order to support the project, or get a source release and run the build script via npm.

1 Like

I already purchased it a couple of months ago, and I encourage everyone to do the same!

2 Likes

Thank you!

1 Like

This is really impressive. I took a quick look at the docs and see that this is based on speedy-vision. I’m working on a project that needs to track the pose of the device/user. Is this API able to do that? (Practically all AR APIs can; my question is more about how reliably it can track pose over a larger distance, let’s say if I walk 10–15 feet.)
example:
YouTube (based on ORBSlam)
GitHub - raulmur/ORB_SLAM2: Real-Time SLAM for Monocular, Stereo and RGB-D Cameras, with Loop Detection and Relocalization Capabilities

Also, does this require a depth camera/IMU, or will it work with just a simple monocular webcam?
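As general background on the monocular question (standard computer-vision facts, not a claim about encantar.js specifically): a plain RGB webcam suffices for image tracking because the target’s known physical size fixes the scale, whereas pure monocular SLAM recovers pose only up to an unknown global scale:

```latex
% Pinhole projection of a world point X:
\mathbf{x} \simeq K \,[\, R \;|\; \mathbf{t} \,]\, \mathbf{X}
% Scaling the scene and the translation together (X -> sX, t -> st) leaves
% the image unchanged, so monocular SLAM has a global scale ambiguity;
% a target of known size, an IMU, or a depth sensor resolves s.
```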