AR teleportation on mobile

Hi,
I’m playing with AR; basically I reused everything I had for VR and just changed the session mode to “immersive-ar”.
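Roughly all that changed, a minimal sketch assuming Babylon’s default XR experience helper and ES6 imports (`ground` is a placeholder for the floor mesh I already use for VR teleportation):

```ts
import { AbstractMesh, Scene, WebXRDefaultExperience } from "@babylonjs/core";

// Same setup as my VR scene, only the session mode (and reference space) changed.
// `ground` is a placeholder for the floor mesh I was already passing in for VR teleportation.
async function createArExperience(scene: Scene, ground: AbstractMesh): Promise<WebXRDefaultExperience> {
    return scene.createDefaultXRExperienceAsync({
        uiOptions: {
            sessionMode: "immersive-ar",       // was "immersive-vr"
            referenceSpaceType: "local-floor",
        },
        floorMeshes: [ground], // still provided, exactly as in the VR setup
    });
}
```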
And to my surprise, I discovered that the teleport mesh shows up on mobile. Here’s a recorded session:

Teleportation doesn’t work, though. Was it supposed to work, and how do I make it work?
(it works out of the box on Quest 3)
Last but not least, does it make sense at all? :slight_smile: I.e., would touchscreen navigation, the same as on the non-immersive web, make more sense in AR?

It seems like you are initializing two sessions (but entering only one). Teleportation is initialized pre-session so that it works out of the box; that might be the reason it shows up. Also, in the AR session, make sure you are not providing ground meshes; that way teleportation will never trigger.
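Something along these lines (a sketch; either don’t pass floorMeshes for the AR session, or disable the feature outright):

```ts
import { Scene, WebXRDefaultExperience } from "@babylonjs/core";

// AR variant without teleportation: no floor meshes means the feature has nothing to target,
// and disabling it explicitly makes the intent obvious.
async function createArExperienceWithoutTeleport(scene: Scene): Promise<WebXRDefaultExperience> {
    const xr = await scene.createDefaultXRExperienceAsync({
        uiOptions: { sessionMode: "immersive-ar" },
        disableTeleportation: true, // or simply omit floorMeshes
    });
    // Equivalent at runtime, if the experience was created elsewhere:
    // xr.baseExperience.featuresManager.disableFeature(WebXRFeatureName.TELEPORTATION);
    return xr;
}
```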

Yes, AR and VR sessions; I can enter and exit any time.
I guess that answers my questions with a no, thanks :slight_smile:

… and then I implemented teleportation.
Here’s what happens: I touch the screen, a new XRController gets connected and xrHelper.input.onControllerAddedObservable triggers; on touch end, onControllerRemovedObservable fires.
That xrController.inputSource.profiles array contains only one element, “generic-touchscreen”.
A little timeout check, some copy & paste from WebXRControllerTeleportation.ts, and voilà! :wink:
Makes me think - I could use a public teleportForward() method.
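Roughly what my version looks like, a sketch rather than the exact code (the 500 ms threshold and the pick predicate are my own choices, and `ground` is a placeholder for the mesh I allow teleporting onto):

```ts
import {
    AbstractMesh,
    Ray,
    Scene,
    Vector3,
    WebXRDefaultExperience,
} from "@babylonjs/core";

// Tap-to-teleport for the transient "generic-touchscreen" input source in an AR session.
// `xr` is the WebXRDefaultExperience helper; `ground` is the mesh we allow teleporting onto.
function enableTapTeleport(xr: WebXRDefaultExperience, scene: Scene, ground: AbstractMesh): void {
    const TAP_TIMEOUT_MS = 500; // longer touches are treated as something other than a teleport tap

    xr.input.onControllerAddedObservable.add((controller) => {
        if (controller.inputSource.profiles.indexOf("generic-touchscreen") === -1) {
            return; // only handle the touchscreen "controller"
        }

        const touchStart = Date.now();
        const ray = new Ray(Vector3.Zero(), Vector3.Forward());
        let lastPick: Vector3 | null = null;

        // While the finger is down, keep casting the controller's pointer ray at the ground.
        const observer = scene.onBeforeRenderObservable.add(() => {
            controller.getWorldPointerRayToRef(ray);
            const pick = scene.pickWithRay(ray, (mesh) => mesh === ground);
            if (pick?.hit && pick.pickedPoint) {
                lastPick = pick.pickedPoint.clone();
            }
        });

        // Touch end: the transient input source is removed again.
        xr.input.onControllerRemovedObservable.addOnce((removed) => {
            if (removed !== controller) {
                return;
            }
            scene.onBeforeRenderObservable.remove(observer);
            if (!lastPick || Date.now() - touchStart > TAP_TIMEOUT_MS) {
                return; // no ground hit, or too long for a tap
            }
            // Move the XR camera above the tapped point, keeping the real-world viewer height.
            const camera = xr.baseExperience.camera;
            camera.position.set(lastPick.x, lastPick.y + camera.realWorldHeight, lastPick.z);
        });
    });
}
```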

There is a good reason why AR teleportation is not the best idea :slightly_smiling_face:
In AR you should bring the models to you, not teleport towards the model. I am pretty sure nothing technically stops you from doing it, though.

Oh I hear you :slight_smile:
But are there any out-of-the-box controls in Babylon that I could use?

Controls for what? Model positioning? You can use one of our gizmos, if it makes sense for your use case:
Gizmos | Babylon.js Documentation (babylonjs.com)
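Something like this, for instance (a minimal sketch; `model` stands in for whatever mesh you want the user to move):

```ts
import { AbstractMesh, GizmoManager, Scene } from "@babylonjs/core";

// Minimal GizmoManager setup: drag handles for repositioning a single mesh.
function attachPositionGizmo(scene: Scene, model: AbstractMesh): GizmoManager {
    const gizmoManager = new GizmoManager(scene);
    gizmoManager.positionGizmoEnabled = true;   // translation arrows
    gizmoManager.rotationGizmoEnabled = false;  // flip these on as needed
    gizmoManager.scaleGizmoEnabled = false;
    gizmoManager.attachableMeshes = [model];    // restrict picking to this mesh
    gizmoManager.attachToMesh(model);           // attach right away instead of waiting for a click
    return gizmoManager;
}
```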

Well, I don’t know what just yet; I’m just playing with it :slight_smile:
There’s this article on Medium that contemplates whether it makes sense to mix VR and AR: The Martian Skiff Scenario: an XR Thought Experiment | by Babylon.js | Medium
So I’m experimenting along the lines of my comment there, quoting:

Recently I’ve also been looking into AR, like: I’m showing off to a friend, we’re looking at a virtual building on a real table through our smartphones. Then I take out my mobile headset and Bluetooth game controller, we switch to the VR experience and step into the building.

This is a rather down-to-earth scenario involving folks like architects, BIM managers, investors and real-estate agents. So the same scene does have all the VR and AR features, yet it’s not the same experience. It’s switching the experience, and sure, it should be as seamless and easy as possible.

Now, while I’m reasonably sure that this scenario makes sense, I have no idea what kind of controls and/or navigation makes sense.
I’ve tried placing an anchor with a hit test, but I don’t like it.
Furthermore, unbounded doesn’t seem to work well on Quest 3. The common denominator for Quest and mobile seems to be local-floor. How well local-floor works on mobile mostly depends on where the camera is pointing at the moment :slight_smile: But models do remain in the same place, and it feels better without the intermediate step of placing an anchor first.
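For reference, the hit-test-plus-anchor placement I tried looks roughly like this (a sketch assuming the default experience helper; `model` is a placeholder for the node being placed):

```ts
import {
    TransformNode,
    WebXRAnchorSystem,
    WebXRDefaultExperience,
    WebXRFeatureName,
    WebXRHitTest,
} from "@babylonjs/core";

// Pin `model` to the real world at the first successful hit-test result.
function placeOnFirstHit(xr: WebXRDefaultExperience, model: TransformNode): void {
    const fm = xr.baseExperience.featuresManager;
    const hitTest = fm.enableFeature(WebXRFeatureName.HIT_TEST, "latest") as WebXRHitTest;
    const anchors = fm.enableFeature(WebXRFeatureName.ANCHOR_SYSTEM, "latest") as WebXRAnchorSystem;

    // Once the anchor exists, parenting the model to it keeps it in place in the real world.
    anchors.onAnchorAddedObservable.add((anchor) => {
        anchor.attachedNode = model;
    });

    const observer = hitTest.onHitTestResultObservable.add((results) => {
        if (!results.length) {
            return; // keep waiting until a real-world surface is found
        }
        hitTest.onHitTestResultObservable.remove(observer);
        anchors.addAnchorPointUsingHitTestResultAsync(results[0]).catch(console.error);
    });
}
```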

And I have a feeling that for mobile, some traditional touch-screen panning and zooming would work best.
Except I can’t do that with the AR WebXRCamera, can I?
Alright, I’ll try gizmos, thanks!