p3d.in offers a very cool markerless AR feature: you just point your camera around for a while so the software can build a picture of your surroundings. This is nothing new, they have had it for about a year already.
I would LOVE to be able to use something like this, but so far I haven't found out how they do it.
You can see it from the back, and get a closer look
You can see it from the bottom, pointing to the ceiling
This is way better than what I've seen so far in the playground, where you can't really get closer, see it from the sides or the back, and so on: https://playground.babylonjs.com/#F41V6N#32
Try placing yourself behind it, or getting closer; it just doesn't happen. It's as if the example above only takes the gyroscope into account, not the actual position of the phone.
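The behaviour I'm describing is the difference between rotation-only (3DoF) tracking and full positional (6DoF) tracking. As a rough sketch of what I mean in Babylon.js terms (the camera names here are my own placeholders, and I'm assuming the playground uses something like a device-orientation camera):

```javascript
// Rotation-only (3DoF): the camera follows the gyroscope, so turning the
// phone works but walking toward the model changes nothing. This is the
// behaviour the playground example appears to show.
function makeGyroCamera(scene) {
  return new BABYLON.DeviceOrientationCamera(
    "gyroCam", new BABYLON.Vector3(0, 0, 0), scene);
}

// Full 6DoF: a WebXR immersive-ar session lets the phone's AR stack track
// position in the room as well as orientation, so you can walk around and
// behind the model. Needs a WebXR-capable browser on an AR-capable phone.
async function makeArCamera(scene) {
  const xr = await scene.createDefaultXRExperienceAsync({
    uiOptions: { sessionMode: "immersive-ar" },
  });
  return xr.baseExperience.camera; // WebXRCamera, driven by the device pose
}
```

With the second setup the camera position comes from the device's world tracking instead of just its gyroscope.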
With p3d.in - Link Redesigned you can get close from any angle and see every detail, and it's very stable too. It takes a few seconds at first for the app to gather enough information about the room you're in, but once that's done, it's done.
The example from the playground doesn’t analyze the room itself.
Please just try both, the difference is very clear.
Please note that in the shared example the sphere is about 2 meters across and sits 5 meters away from you; to have something you can walk around directly, you should reduce its size and bring it closer.
I hadn't noticed that the playground examples use such big, far-away models.
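To see why the distance matters: orbiting an object physically means walking a circle whose radius is your distance to it. A quick back-of-the-envelope check (the 5 m figure is from the shared example; 0.5 m is just an illustrative closer distance):

```javascript
// Length of the circle you'd have to walk to go all the way around an
// object at a given distance. At 5 m that's over 31 m, which rarely fits
// in a room; at 0.5 m it's about 3 m.
function orbitPathLengthM(distanceM) {
  return 2 * Math.PI * distanceM;
}

console.log(orbitPathLengthM(5).toFixed(1));   // "31.4" – playground sphere
console.log(orbitPathLengthM(0.5).toFixed(1)); // "3.1"  – smaller, closer object
```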
The phone I used for testing was a Samsung S20 Pro; it worked fine.
My next step will be to build a surface-recognition feature, or something similar, so I can place objects around the room. Three.js has some examples, so porting shouldn't be an issue.
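For the surface-recognition part, Babylon.js already exposes the WebXR hit-test API through its features manager, so it may not even need porting from Three.js. A minimal sketch, assuming a WebXR-capable browser on an AR phone; the marker mesh, its size, and the helper name are my own placeholders:

```javascript
// Pure helper: a hit pose arrives as a flat 16-element matrix with the
// translation components at indices 12..14 (the layout both WebXR and
// Babylon use for their flattened matrices).
function hitMatrixToPosition(m) {
  return { x: m[12], y: m[13], z: m[14] };
}

// Sketch: snap a placeholder mesh to whatever real-world surface the
// phone is pointing at, using Babylon's WebXR hit-test feature.
async function setupArPlacement(scene) {
  const xr = await scene.createDefaultXRExperienceAsync({
    uiOptions: { sessionMode: "immersive-ar" },
  });

  // Continuous hit testing against the surfaces the AR stack has recognised.
  const hitTest = xr.baseExperience.featuresManager.enableFeature(
    BABYLON.WebXRHitTest, "latest");

  // Placeholder marker that follows the forward ray's intersection point.
  const marker = BABYLON.MeshBuilder.CreateSphere(
    "marker", { diameter: 0.3 }, scene);
  marker.isVisible = false;

  hitTest.onHitTestResultObservable.add((results) => {
    marker.isVisible = results.length > 0;
    if (results.length) {
      const p = hitMatrixToPosition(results[0].transformationMatrix.m);
      marker.position.set(p.x, p.y, p.z);
    }
  });
}
```

From there, placing an object would just be a matter of cloning the mesh at the marker's position on a tap.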