Implementing custom pinch gesture in AR

Thanks a lot for the clarification, that explains a lot

So this is happening, I believe (it's being calculated on every move event at least), but it needs to be compared to the previous value so we can get the difference in distance between the touch pointers. I think that difference will be a mix of the movement of the pointers and the movement of the camera.
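To make the "compare to the previous value" step concrete, here is a minimal sketch of the distance-delta part of a pinch, independent of Babylon.js. The function name and the pointer shape are illustrative, not from any API:

```javascript
// Hypothetical helper: given the two pointer positions from the previous
// move event and the current one, return the pinch scale factor.
function pinchDelta(prev, curr) {
  // prev / curr: [{x, y}, {x, y}] — screen positions of the two pointers
  const dist = (pts) => Math.hypot(pts[0].x - pts[1].x, pts[0].y - pts[1].y);
  // Ratio > 1 means the fingers moved apart (scale up), < 1 means together.
  return dist(curr) / dist(prev);
}
```

Multiplying the mesh scale by this ratio on each move event gives the basic pinch; smoothing or thresholding the ratio is what damps the wobble described below.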

Here's where I am so far. The behaviour for pinch is pretty accurate, but it noticeably keeps firing with different values while the fingers remain touching the screen (giving a rapid scale up and down that makes the box wobble). The rotate gesture sort of works, though it's much more glitchy, and to be honest I'm not sure my algorithm for working out the direction of the fingers (clockwise/anticlockwise) is correct. But here it is for posterity :slight_smile:
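Since the clockwise/anticlockwise logic is the uncertain part, one common way to get the rotation direction is to take the vector between the two fingers on consecutive frames and compute the signed angle between those vectors; the sign of the 2D cross product encodes the direction. A sketch (names are illustrative):

```javascript
// Signed rotation of the inter-finger vector between two frames.
// prev / curr: [{x, y}, {x, y}] — the two pointer positions.
function rotationDelta(prev, curr) {
  const vx0 = prev[1].x - prev[0].x, vy0 = prev[1].y - prev[0].y;
  const vx1 = curr[1].x - curr[0].x, vy1 = curr[1].y - curr[0].y;
  const cross = vx0 * vy1 - vy0 * vx1; // z-component of the 2D cross product
  const dot = vx0 * vx1 + vy0 * vy1;
  // Signed angle in radians; positive is anticlockwise in a y-up frame.
  // Screen coordinates grow downward in y, so flip the sign there.
  return Math.atan2(cross, dot);
}
```

Accumulating this per-frame angle onto the mesh's rotation avoids having to classify "clockwise vs anticlockwise" as a separate step.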

Thanks again for looking into it :pray:


Following up on this - I have enabled passing the projected screen coordinates to screenX and screenY of the pointer event:

[XR] multi-pointer screen coordinates projection by RaananW · Pull Request #11495 · BabylonJS/Babylon.js (github.com)

If you want to do it manually, you can do this:

AR screen pinch debugging | Babylon.js Playground (babylonjs.com)
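For reference, the core math behind the "manual" route is a standard world-to-screen projection: transform the point by the camera's view-projection matrix into clip space, do the perspective divide, then map NDC [-1, 1] to pixels. This is a plain-math sketch of that last step (it mirrors what `BABYLON.Vector3.Project` does internally; the function name here is illustrative):

```javascript
// clip: {x, y, z, w} — a point already multiplied by the view-projection
// matrix, i.e. in clip space. width/height: viewport size in pixels.
function clipToScreen(clip, width, height) {
  const ndcX = clip.x / clip.w; // perspective divide
  const ndcY = clip.y / clip.w;
  return {
    screenX: (ndcX * 0.5 + 0.5) * width,
    // Screen y grows downward, so the NDC y axis is flipped.
    screenY: (1 - (ndcY * 0.5 + 0.5)) * height,
  };
}
```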


This is great, but one last thing: it isn't working in Babylon 4.2 (unfortunately the version my project is currently built on). I hadn't noticed until now as I was using the playgrounds for all the testing…
By not working, I mean the second touch gesture isn’t registered.

I know the setting you changed exists in 4.2 (“enablePointerSelectionOnAllControllers: true”). Is there some other reason you know of that would stop this working? I'm getting the warning mentioned here (Coldt not find a matching controller - Warning on Adroid · Issue #9938 · BabylonJS/Babylon.js · GitHub), but the notes on that page state it was just an unnecessary warning, not a real problem, so I'm not sure how connected it is?

Both of those changes were introduced in 5.0, so 4.2 will not support them.
I don't remember if I fixed an issue with dual pointers in screen mode, but I think it was missing in 4.2. I'll have to debug this and check.

5.0 is backwards compatible with 4.2, so upgrading should be a breeze. We do recommend using our preview builds, so this could be a quick solution as well :slight_smile:


Sorry to resurrect this thread, but I am trying a similar pinch-to-zoom gesture and I'm wondering: is there a way to set enablePointerSelectionOnAllControllers: true as an option when using the WebXRExperienceHelper instead of the WebXRDefaultExperience?

We need to use the WebXR helper because we are dynamically enabling and detaching features depending on the state of the app and device compatibility.

Thank you so much

@RaananW can you help him?

So I did get it working, but I'm a bit confused how.
I looked at the source code on GitHub for 5.0.0-beta.3, which is our target (we'll update soon).
I saw that the XRInput is constructed in only one place in the whole codebase, in the defaultXRExperience here

So it seemed simple enough: when using the XR helper alone with BABYLON.WebXRExperienceHelper.CreateAsync(...), we have to create the xrInput ourselves when the pointer-selection feature is added.

So I did this:

this._pointers = this._xrHelper.featuresManager.enableFeature(BABYLON.WebXRControllerPointerSelection.Name, 'latest', {
  enablePointerSelectionOnAllControllers: true,
  xrInput: new BABYLON.WebXRInput(this._xrHelper.sessionManager, this._xrHelper.camera, undefined),
});

After that, I saw in the onPointerObservable events that pointerDown (and every other pointer state) was raising two pointers on a single touch.
So it seemed the xrHelper somehow already had an xrInput, and creating a new one was duplicating all the events.

So I removed the enableFeature code and left the scene.onPointerObservable events in, without registering the pointer feature, using the default XR helper, and never setting an option like enablePointerSelectionOnAllControllers anywhere…
And it started to work: pointers were getting unique IDs regardless of multitouch.
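Once each touch gets a unique pointer ID, the remaining bookkeeping is small. A minimal sketch (plain JS, no Babylon dependency; names are illustrative) of tracking active pointers by ID and only treating the input as a pinch while exactly two are down:

```javascript
// pointerId -> {x, y}; populated from pointer down/move, cleared on up.
const activePointers = new Map();

// Returns true when a two-finger gesture is currently possible.
function onPointer(type, pointerId, x, y) {
  if (type === 'down' || type === 'move') {
    activePointers.set(pointerId, { x, y });
  } else if (type === 'up') {
    activePointers.delete(pointerId);
  }
  return activePointers.size === 2;
}
```

In a Babylon scene this state would typically be fed from `scene.onPointerObservable`, reading the event's pointerId, but the Map-based gating itself is framework-agnostic.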

So it all works and that’s great, but I’m confused how.
How does the xrHelper eventually attach its own xrInput if it doesn't construct one?

As you said, the only place a new XR Input is created is in the default experience.

Try creating an XR input right after creating the XRExperienceHelper, and pass a reference to the already-created one (to reuse it in different features, where needed). Something like this:

const xrHelper = await BABYLON.WebXRExperienceHelper.CreateAsync(...);
const xrInput = new BABYLON.WebXRInput(xrHelper.sessionManager, xrHelper.camera, undefined);

Afterwards, reference this xrInput when enabling the feature. There might be a point in your code (or an issue in ours?) where the feature is recreated, which would force a new xrInput without disposing the old one.
