AR teleportation on mobile

Hi,
I’m playing with AR; I basically reused everything I had for VR and just changed the session mode to “immersive-ar”.
And to my surprise, I discovered that the teleport mesh shows up on mobile. Here, I recorded a session:

Teleportation doesn’t work, though. Was it supposed to work, and how do I make it work?
(It works out of the box on Quest 3.)
Last but not least, does it make sense at all? :slight_smile: I.e., would touchscreen navigation, the same as on the non-immersive web, make more sense in AR?

It seems like you are initializing two sessions (but entering only one). Teleportation is initialized pre-session to work out of the box; that might be the reason why it shows up. Also, in the AR session, make sure you are not providing ground meshes; that way teleportation will never trigger.
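
For example, something along these lines — a minimal sketch, assuming the AR experience gets its own createDefaultXRExperienceAsync call and `scene` is your existing Babylon.js Scene:

```typescript
// Minimal sketch: keep teleportation out of the AR session entirely.
const arHelper = await scene.createDefaultXRExperienceAsync({
  uiOptions: { sessionMode: "immersive-ar" },
  disableTeleportation: true, // or simply don't pass any floorMeshes
});
```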

Yes, AR and VR sessions; I can enter and exit any time.
I guess that answers my questions with a no, thanks :slight_smile:


… and then I implemented teleportation.
Here’s what happens: I touch the screen, a new XRController gets connected, and xrHelper.input.onControllerAddedObservable triggers; on touch end, onControllerRemovedObservable fires.
The xrController.inputSource.profiles array contains only one element, “generic-touchscreen”.
A little timeout check, some copy & paste from WebXRControllerTeleportation.ts, and voilà! :wink:
Makes me think - I could use a public teleportForward() method.
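
Roughly something like this — a sketch of the idea, not my exact code; TAP_TIMEOUT_MS and the floorMeshes list are placeholders, and the camera move is a simplified version of what WebXRControllerTeleportation.ts does:

```typescript
// Minimal sketch of tap-to-teleport for the "generic-touchscreen" profile.
// Assumes an existing WebXRDefaultExperience (xrHelper) and an array of
// teleportable floor meshes; TAP_TIMEOUT_MS is an arbitrary threshold.
import {
  AbstractMesh, Ray, Vector3,
  WebXRDefaultExperience, WebXRInputSource,
} from "@babylonjs/core";

const TAP_TIMEOUT_MS = 300; // touches shorter than this count as a teleport tap

function enableTapTeleport(xrHelper: WebXRDefaultExperience, floorMeshes: AbstractMesh[]) {
  const touchStart = new Map<WebXRInputSource, number>();

  // Touch start: a transient "generic-touchscreen" controller is added.
  xrHelper.input.onControllerAddedObservable.add((controller) => {
    if (controller.inputSource.profiles.includes("generic-touchscreen")) {
      touchStart.set(controller, Date.now());
    }
  });

  // Touch end: the same controller is removed again.
  xrHelper.input.onControllerRemovedObservable.add((controller) => {
    const started = touchStart.get(controller);
    touchStart.delete(controller);
    if (started === undefined || Date.now() - started > TAP_TIMEOUT_MS) {
      return; // long press, or not a touchscreen controller
    }
    // Pick along the pointer ray, like WebXRControllerTeleportation does.
    const ray = new Ray(Vector3.Zero(), Vector3.Forward());
    controller.getWorldPointerRayToRef(ray);
    const scene = xrHelper.baseExperience.sessionManager.scene;
    const pick = scene.pickWithRay(ray, (mesh) => floorMeshes.includes(mesh));
    if (pick?.hit && pick.pickedPoint) {
      // Keep the user's real-world height above the picked point.
      const camera = xrHelper.baseExperience.camera;
      camera.position.set(
        pick.pickedPoint.x,
        pick.pickedPoint.y + camera.realWorldHeight,
        pick.pickedPoint.z
      );
    }
  });
}
```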


There is a good reason why AR teleportation is not the best idea :slightly_smiling_face:
In AR you should bring the models to you, not teleport towards the model. Though I am pretty sure nothing technically stops you from doing it.

Oh, I hear you :slight_smile:
But are there any out-of-the-box controls in Babylon that I could use?

Controls for what? Model positioning? You can use one of our gizmos if it makes sense for your use case:
Gizmos | Babylon.js Documentation (babylonjs.com)
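
For example, attaching a BoundingBoxGizmo — a minimal sketch, where `model` is assumed to be your already-loaded mesh:

```typescript
// Minimal sketch: a bounding-box gizmo for rotating/scaling a model.
import { BoundingBoxGizmo, Color3, UtilityLayerRenderer } from "@babylonjs/core";

const gizmo = new BoundingBoxGizmo(
  Color3.FromHexString("#0984E3"), // handle color
  UtilityLayerRenderer.DefaultUtilityLayer
);
gizmo.attachedMesh = model;
gizmo.setEnabledScaling(true); // show the corner scale handles
```

For grabbing and moving the mesh itself in XR, you would typically pair this with a drag behavior such as SixDofDragBehavior.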

Well, I don’t know what just yet, I’m just playing with it :slight_smile:
There’s this article on Medium that contemplates whether it makes sense to mix VR and AR: The Martian Skiff Scenario: an XR Thought Experiment | by Babylon.js | Medium
So I’m experimenting along the lines of my comment there, quoting:

Recently I’ve also been looking into AR. Like: I’m showing off to a friend; we’re looking at a virtual building on a real table through our smartphones. Then I take out my mobile headset and Bluetooth game controller, we switch to the VR experience, and step into the building.

This is a rather down-to-earth scenario involving folks like architects, BIM managers, investors, and real-estate agents. So the same scene does have all the VR and AR features, yet it’s not the same experience. It’s switching the experience, and sure, that should be as seamless and easy as possible.

Now, while I’m reasonably sure that this scenario makes sense, I have no idea what kind of controls and/or navigation makes sense.
I’ve tried placing an anchor with a hit test, but I don’t like it.
Furthermore, the unbounded reference space doesn’t seem to work well on Quest 3. The common denominator for Quest and mobile seems to be local-floor. How well local-floor works on mobile mostly depends on where the camera points at the moment :slight_smile: But models do remain in the same place, and it feels better without the intermediate step of placing an anchor first.
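
For reference, requesting local-floor looks roughly like this — a sketch, assuming an existing Babylon.js scene:

```typescript
// Minimal sketch: "local-floor" as the common reference space for both
// Quest and mobile AR, instead of placing an anchor first.
const xrHelper = await scene.createDefaultXRExperienceAsync({
  uiOptions: {
    sessionMode: "immersive-ar",
    referenceSpaceType: "local-floor", // "unbounded" was unreliable on Quest 3
  },
});
```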

And I have a feeling that for mobile, some traditional touch-screen panning and zooming would do best.
Except I can’t do that with the AR WebXRCamera, can I?
Alright, I’ll try gizmos, thanks!

I just had a bunch of kids over, and I’m afraid the gizmo is a failure. I mean, it’s definitely better than what I had, and is probably as good as it gets with a universal UI. But specifically in VR/AR, kids can’t really use it.
Just how intuitive a UI element is, is best observed with children. They either figure it out on their own, or, once shown how to use it, they pick it up right away.
Well, that’s just not the case with the gizmo. I’ve watched a 12-year-old trying this and that, and eventually giving up in frustration. Slight frustration, but still not child’s play.
I think that two-hand (finger?) gestures may do better, something like around 5:14 in this video:

Babylon provides you with everything you need to do exactly that. You have full control of the hand and finger positions, and you can implement gesture support based on finger proximity (for example). We provide basic extendable tools for the reason you mentioned - sometimes it is enough, sometimes it isn’t. And when it isn’t, we want to be sure you are able to do the work you want to.
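
For example, a finger-proximity check could look roughly like this — a minimal sketch, assuming hand tracking is available (e.g. on Quest); PINCH_DISTANCE is an arbitrary value:

```typescript
// Minimal sketch: pinch detection from thumb/index fingertip proximity.
import {
  Vector3, WebXRFeatureName, WebXRHandJoint, WebXRHandTracking,
} from "@babylonjs/core";

const PINCH_DISTANCE = 0.02; // meters between thumb tip and index tip

const handTracking = xrHelper.baseExperience.featuresManager.enableFeature(
  WebXRFeatureName.HAND_TRACKING,
  "latest",
  { xrInput: xrHelper.input }
) as WebXRHandTracking;

handTracking.onHandAddedObservable.add((hand) => {
  scene.onBeforeRenderObservable.add(() => {
    const thumb = hand.getJointMesh(WebXRHandJoint.THUMB_TIP);
    const index = hand.getJointMesh(WebXRHandJoint.INDEX_FINGER_TIP);
    const pinching =
      Vector3.Distance(thumb.getAbsolutePosition(), index.getAbsolutePosition()) <
      PINCH_DISTANCE;
    // ...drive a two-hand gesture state machine from per-hand pinch state
  });
});
```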

I am not sure what went wrong with the gizmos (or what version of the gizmos you were using), so if you want to share that, it would be great. If it is a bug, we will make sure to fix it. If it is by design, maybe we can see if creating a different gizmo for VR is viable.

Oh, nothing went wrong with the gizmo (BoundingBoxGizmo); it’s just an inappropriate UI element in VR, simply because a VR controller isn’t a mouse.
I think.
So I had six kids, 8-12 years old; three were “playing VRSpace”, two on Quest 3 and Quest 1, and the third on a PC. “Playing VRSpace” included using the VRSpace world editor to search through 600k+ free models on Sketchfab, fetch some, and build their own world. I was observing in-world on my own PC (also playing with them), through Quest 3 casting, and in the real world.
One kid was moving some cartoon character around, repeating “how do I, how do I…” for maybe 10 minutes, and eventually gave up, leaving the object rotated at a steep angle. That was the first VR experience of a 12-year-old, so no wonder something went wrong, but everything that goes right is a big win, right?
The virtual keyboard, for instance, was a win. There was an “aaargh” of frustration as soon as it popped up, but using it went flawlessly. And overall, the 3D GUI is just fine.
Anyway, I’ll continue with my VR UI experiments and, sure, let you know.
