Implementing a custom pinch gesture in AR

Hi all, I’m working on an AR project and I would like to use a screen pinch gesture to scale a single model in the scene.

AFAIK pinch gestures are not built into the Web APIs, so I’m using the onPointerObservable to get the pointerdown and pointerup events and reusing this code from the MDN docs:

[Pinch zoom gestures - Web APIs | MDN]

which relies on comparing the locations of the two pointer events (one from each finger) and checking whether they’re moving closer together or further apart.
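
Roughly, my adaptation looks like this (a simplified sketch of that MDN approach on top of Babylon’s observable; model stands in for the mesh I want to scale):

```javascript
// Simplified sketch of the MDN pointer-cache approach, moved onto Babylon's
// onPointerObservable. scene, BABYLON and model are the usual playground
// globals / the mesh to scale.
const eventCache = []; // active pointer events, one entry per finger
let prevDiff = -1;     // distance between the two fingers on the previous move

scene.onPointerObservable.add((pointerInfo) => {
    const ev = pointerInfo.event;
    switch (pointerInfo.type) {
        case BABYLON.PointerEventTypes.POINTERDOWN: {
            eventCache.push(ev);
            break;
        }
        case BABYLON.PointerEventTypes.POINTERMOVE: {
            // refresh the cached event for this finger
            const index = eventCache.findIndex((e) => e.pointerId === ev.pointerId);
            if (index >= 0) {
                eventCache[index] = ev;
            }
            if (eventCache.length === 2) {
                const curDiff = Math.hypot(
                    eventCache[0].clientX - eventCache[1].clientX,
                    eventCache[0].clientY - eventCache[1].clientY
                );
                if (prevDiff > 0) {
                    // > 1 means the fingers moved apart, < 1 means they pinched in
                    model.scaling.scaleInPlace(curDiff / prevDiff);
                }
                prevDiff = curDiff;
            }
            break;
        }
        case BABYLON.PointerEventTypes.POINTERUP: {
            const index = eventCache.findIndex((e) => e.pointerId === ev.pointerId);
            if (index >= 0) {
                eventCache.splice(index, 1);
            }
            if (eventCache.length < 2) {
                prevDiff = -1;
            }
            break;
        }
    }
});
```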

The issue is that when I log these events in AR, they don’t seem to have a location property (what would be clientX / clientY on a screen); it’s always set to 0 no matter where on the screen I press.

Any idea what property of the PointerInfo object to use to make this gesture work?

Many thanks!

Adding @RaananW and @syntheticmagus for the XR part

For more info, these pointer observables are being added to the scene. I’m now trying to add them to the canvas, but nothing is registering so far.

Hi there, after a lot of debugging I’ve found one problem that’s stopping me.

To make the pinch work, the code needs to register two “POINTERDOWN” events and compare their locations (by storing them in a cache) before a “POINTERUP” event with the same ID is registered.

The problem I’m having is that this doesn’t seem possible: as soon as the second touch event happens, it acts as if the first finger has been removed from the screen (on my device at least, Chrome on Android).

Here’s a playground demonstrating what I mean. When two fingers are pressed, the sphere should turn green, but it never does (and the console logs back this up):

https://playground.babylonjs.com/#F41V6N#509
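
In essence the check is something like this (a simplified sketch, with sphere and greenMaterial standing in for the actual playground objects):

```javascript
// Simplified version of the test: the sphere should turn green while two
// pointers are down at the same time.
const activePointers = new Set();

scene.onPointerObservable.add((pointerInfo) => {
    const id = pointerInfo.event.pointerId;
    if (pointerInfo.type === BABYLON.PointerEventTypes.POINTERDOWN) {
        activePointers.add(id);
        console.log("pointer down -", id, "active:", activePointers.size);
    } else if (pointerInfo.type === BABYLON.PointerEventTypes.POINTERUP) {
        activePointers.delete(id);
        console.log("pointer up -", id, "active:", activePointers.size);
    }
    if (activePointers.size >= 2) {
        sphere.material = greenMaterial; // this never happens in AR on my device
    }
});
```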

I’d really appreciate it if anyone could let me know whether:

  • this technique is flawed and there’s another way to calculate a pinch gesture
  • WebXR is somehow affecting the pointer events

Thanks again :slight_smile:

@PolygonalSun could you have a look?

We are detecting it internally for zooming on the arc rotate camera, so there is definitely a solution, I guess :slight_smile:

For further info, turning AR off here means the double touch works:

https://playground.babylonjs.com/#F41V6N#510

so it’s definitely something with the XR… ominous… :cloud_with_lightning:

No worries, @RaananW will be back on Monday and can have a look.

XR doesn’t support pointer events. We emulate them for your convenience, but they are not full pointer events.

WebXR uses the select and squeeze events, which are of type XRInputSourceEvent (WebXR Device API). These interfaces are not pointer interfaces (think about the number of devices XR and XR input work on and you will understand why). We convert them to pointer events to unify the pointer system and to let you interact with 3D elements in the scene, but screen coordinates are not provided in this event and therefore cannot be used. What can be done is a hit test (which is what we are doing).
It will be a little harder to calculate a pinch using the hit test results, but as they are practically two parallel lines, you might be able to calculate the distance between their normals, which might help? This is not tested at all, and I assume there are performance implications.
Another thing to test is whether the select event on Android provides you with the screen coordinates. As this is not cross-device, we cannot support it in the framework itself, but maybe you could create a custom Android experience if you used the events directly (they are also provided in the pointer observable for your convenience).
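
Something like this (untested) would let you check what the emulated event actually carries on your device:

```javascript
// Untested sketch: log what the emulated pointer event exposes in AR, to see
// whether this particular device provides usable screen coordinates.
scene.onPointerObservable.add((pointerInfo) => {
    if (pointerInfo.type !== BABYLON.PointerEventTypes.POINTERDOWN) {
        return;
    }
    const ev = pointerInfo.event; // the emulated pointer event
    console.log("id:", ev.pointerId, "clientX/Y:", ev.clientX, ev.clientY);
    // the ray used for picking is available as well
    if (pointerInfo.pickInfo && pointerInfo.pickInfo.ray) {
        console.log("ray origin:", pointerInfo.pickInfo.ray.origin.toString());
    }
});
```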

Thanks a lot for the response, that’s all clear.

In terms of there not being screen coordinates, I agree that there are other measurements that could be compared (from the hit test or rays), but the main issue with Babylon’s pointers is that I can’t track more than one at a time: touching the screen with two fingers only registers the first pointer event’s position and ID, so I can’t find a way to compare them. I’m not sure if this is a restriction of WebXR or of the emulated pointer events.

That’s interesting. So just to understand - are you getting the two pointer events but they have the same ID, or is the 2nd one not registered at all?

So I’m logging the pointer down, pointer up, and pointer move events, and the id (via pointerInfo.event.pointerId).

This is a shot of the console logs to give an idea of what goes on when I first do a single press (line 1), then press again without releasing the first (line 3), then release the first (where the id changes, lines 4/5), then release the second finger (end):
[screenshot of console logs]

I’m not sure what the yellow warning is, but it happens on the second press.

Basically the second press is only registered after the first press is released, so that final ‘pointer down - 201’ should really be happening on line 3.

Hope that’s clear :confused: … again, hopefully this playground demonstrates the issue when opened in AR (i.e. the ball never turns green with multiple touches):
https://playground.babylonjs.com/#F41V6N#509

I’ll have to debug this later, but it does seem like XR (or more precisely, the device you are on) only supports one pointer input at a time. Or that we are doing something wrong when the second one pops up :slight_smile:

I’ll get back to you on this one.

Just checking if anything’s been found - really eager to start pinching my AR models to resize them before I place them :pinching_hand:

Took a little while, but I found the issue :slight_smile:

It’s not a bug but a feature! (and it really is a feature…) The default behavior, for performance reasons, is to only trigger pointer events on one controller. You can enable pointer events on all of them:

AR screen pinch debugging | Babylon.js Playground (babylonjs.com)

Now you can get the pointer down events of more than one finger.
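
From memory (so double-check against the playground above for the exact code), the relevant bit is re-enabling the pointer selection feature with the option that turns it on for all input sources:

```javascript
// From memory, not the exact playground code: pointer selection is limited to
// a single input source by default; re-enable the feature with the
// "all controllers" option so every touch point emits pointer events.
const xr = await scene.createDefaultXRExperienceAsync({
    uiOptions: {
        sessionMode: "immersive-ar",
    },
});

xr.baseExperience.featuresManager.enableFeature(
    BABYLON.WebXRFeatureName.POINTER_SELECTION,
    "latest",
    {
        xrInput: xr.input,
        enablePointerSelectionOnAllControllers: true,
    }
);
```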

Now - you will notice that the pointer event doesn’t have screenX and screenY, as WebXR doesn’t provide us with this information. As you want to measure the distance between the two fingers, you can use the ray’s origin in the pointer info and measure the distance between those two points on each frame.
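
Untested, but something along these lines (model standing in for the mesh you want to scale):

```javascript
// Untested sketch: keep the latest ray origin per pointerId and use the change
// in distance between the two origins to scale a mesh (model is a stand-in).
const rayOrigins = new Map(); // pointerId -> latest ray origin (Vector3)
let previousDistance = null;

scene.onPointerObservable.add((pointerInfo) => {
    const id = pointerInfo.event.pointerId;
    if (pointerInfo.type === BABYLON.PointerEventTypes.POINTERUP) {
        rayOrigins.delete(id);
        previousDistance = null;
        return;
    }
    if (pointerInfo.pickInfo && pointerInfo.pickInfo.ray) {
        rayOrigins.set(id, pointerInfo.pickInfo.ray.origin.clone());
    }
});

scene.onBeforeRenderObservable.add(() => {
    if (rayOrigins.size !== 2) {
        previousDistance = null;
        return;
    }
    const [a, b] = [...rayOrigins.values()];
    const distance = BABYLON.Vector3.Distance(a, b);
    if (previousDistance) {
        // > 1 means the fingers moved apart, < 1 means they pinched in
        model.scaling.scaleInPlace(distance / previousDistance);
    }
    previousDistance = distance;
});
```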

That’s great! Thanks a lot. Now on to the next challenge of distance calculations - I might even try a rotation.

I’ll post where I get to back here :iphone: :pinching_hand: :tornado:

Hey again, it’s going pretty well thanks to the advice. I just wanted to ask about the nature of rays in AR (as I’m using the pickInfo.ray.origin coordinates to compare, as you mentioned).

My question is: how are the origin’s coordinates calculated? Is it based on the camera (which, in AR, is the actual camera of the phone…?), and if so, would that mean that pressing the same spot on the phone screen gives different coordinates depending on where the phone’s camera is looking?

I think this might be the cause of a little underlying glitchiness/jumpiness I’m getting, but I wanted to make sure my mental model of the touch/ray and the 3D/AR scene in real space was right :brain:

Many thanks!

That’s a great question!! :slight_smile:
XR provides us with the specific point in space that was pressed. If you want to know the specifics, it’s all in the spec: WebXR Device API

Since this is an XRInputSource object, it provides the target ray space, which can be used to update its position in space, relative to your view (to your reference point). This is used as the origin of the ray. So - yes, this changes when the camera changes (and in real time, while touching!), so you need to calculate the distance on each frame.

And this is me thinking out loud - theoretically you should be able to project this touch point to 2D screen coordinates. It’ll be a fun experiment! I’ll add an issue on GitHub and see if I find the time to add this function.
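
For reference, the projection itself would be along these lines (a sketch; point is the world-space touch point, e.g. the ray’s origin):

```javascript
// Sketch: project a world-space point (e.g. the touch ray's origin) back to
// 2D screen coordinates. point is a BABYLON.Vector3; scene and engine are the
// usual playground globals.
const projected = BABYLON.Vector3.Project(
    point,
    BABYLON.Matrix.Identity(),  // the point is already in world space
    scene.getTransformMatrix(), // view * projection
    scene.activeCamera.viewport.toGlobal(
        engine.getRenderWidth(),
        engine.getRenderHeight()
    )
);
console.log("screen coords:", projected.x, projected.y);
```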
