Checking for pinch gestures in React Native

Hi!

I’m trying to figure out how to catch a pinch gesture in React Native. I found this thread (Implementing custom pinch gesture in AR - #8 by sebavan) and tried to implement it in React Native. What’s the correct way of enabling pointer events in TypeScript React Native?

    pointerSelectionOptions: {
        enablePointerSelectionOnAllControllers: true
    }

doesn’t work, since the TypeScript type wants a whole lot more options:

  • enablePointerSelectionOnAllControllers
  • disablePointerUpOnTouchOut
  • forceGazeMode
  • xrInput

The first three are just booleans, so I could easily figure those out, but the xrInput is a little confusing to me. I’m passing this through scene.createDefaultXRExperienceAsync, and I don’t see how I’d already have an xrInput when XR hasn’t been ‘created’ yet?

Or am I just going about this wrong?
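
My best guess (untested, and possibly wrong) is that these options belong to the pointer-selection feature, which can also be enabled after the experience is created, at which point xr.input actually exists:

    import { WebXRFeatureName } from "@babylonjs/core";

    const xr = await scene.createDefaultXRExperienceAsync({
        uiOptions: { sessionMode: "immersive-ar" },
    });

    // By this point the XR experience exists, so xr.input is available
    // and can be handed to the pointer-selection feature:
    xr.baseExperience.featuresManager.enableFeature(
        WebXRFeatureName.POINTER_SELECTION,
        "stable",
        {
            xrInput: xr.input,
            enablePointerSelectionOnAllControllers: true,
        }
    );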

cc @ryantrem

So I think I’ve found the solution, I keep forgetting that I can just use other React Native libraries in combination with Babylon :smiley:
So just using PinchGestureHandler | React Native Gesture Handler is working out great.
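
In case it helps anyone later, here’s roughly what that looks like (a minimal sketch, assuming EngineView from @babylonjs/react-native and a model mesh held by the component; not my exact code):

    import React, { useRef } from "react";
    import {
        PinchGestureHandler,
        PinchGestureHandlerGestureEvent,
        PinchGestureHandlerStateChangeEvent,
        State,
    } from "react-native-gesture-handler";
    import { EngineView } from "@babylonjs/react-native";
    import { Camera, Mesh } from "@babylonjs/core";

    export const PinchableView = (props: { camera: Camera; model: Mesh }) => {
        // Scale the model had when the current pinch began
        const baseScale = useRef(1);

        const onPinch = (e: PinchGestureHandlerGestureEvent) => {
            // nativeEvent.scale is relative to where the fingers started
            props.model.scaling.setAll(baseScale.current * e.nativeEvent.scale);
        };

        const onStateChange = (e: PinchGestureHandlerStateChangeEvent) => {
            if (e.nativeEvent.oldState === State.ACTIVE) {
                // Pinch ended: commit the final scale as the new baseline
                baseScale.current = props.model.scaling.x;
            }
        };

        return (
            <PinchGestureHandler onGestureEvent={onPinch} onHandlerStateChange={onStateChange}>
                <EngineView camera={props.camera} displayFrameRate={false} />
            </PinchGestureHandler>
        );
    };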

If you only need your solution to work in a React Native app, then using a React Native gesture recognition library is fine.

If you want your code to be portable to the browser as well, then check out this other thread: How to enable touch input for ArcRotateCamera on Babylon React Native - #14 by ryantrem

Also, @PolygonalSun is doing more work right now to make more of the Babylon.js input system work with Babylon Native, so when that work is done it might be sufficient to simply use an ArcRotateCamera, which already supports pinch zoom.
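
For the plain browser case, that really is just (a standard, non-XR sketch, assuming scene and canvas are in scope):

    import { ArcRotateCamera, Vector3 } from "@babylonjs/core";

    // ArcRotateCamera ships with pinch-to-zoom; attaching controls is enough
    const camera = new ArcRotateCamera(
        "camera", Math.PI / 2, Math.PI / 3, 5, Vector3.Zero(), scene
    );
    camera.attachControl(canvas, true);
    camera.pinchPrecision = 200; // larger value = slower pinch zoom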

1 Like

Hi! Thanks for elaborating. I wanted to use pinch in AR, though. If I wanted to make it work in the browser as well, is there another camera I could use? Or could ArcRotateCamera still catch the events even when I’m in AR?

Not a camera… ArcRotateCamera handles input and moves the camera relative to some point in space (e.g. the center of a model). In AR, the camera is controlled by the pose of the physical device in physical space, so I think what you want is some component that handles input and uses it to manipulate an actual model (translate, rotate, or scale it). As far as I know, Babylon.js does not have anything like that built in, but @Deltakosh may know for sure.

Let me summon our XR overlord @RaananW

1 Like

There is a very interesting thread where we discuss the exact same thing -

TL;DR: AR allows you to receive pointer events for both fingers, and you can use those two points to do your magic. Since we don’t currently provide the screen coordinates, the best way would be to check the distance (in 3D) between the two XR input points (the two fingers) and use the deltas to scale a model, for example.
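
To make that concrete, here is a rough sketch of the idea (assuming xr is the default experience helper and model is the mesh to scale; on a touch screen in AR, each finger shows up as a transient input source):

    import { Vector3, WebXRInputSource } from "@babylonjs/core";

    const fingers: WebXRInputSource[] = [];
    let startDistance = 0;
    let startScale = 1;

    xr.input.onControllerAddedObservable.add((source) => fingers.push(source));
    xr.input.onControllerRemovedObservable.add((source) => {
        fingers.splice(fingers.indexOf(source), 1);
        startDistance = 0; // pinch over, reset the baseline
    });

    scene.onBeforeRenderObservable.add(() => {
        if (fingers.length < 2) {
            return;
        }
        const distance = Vector3.Distance(
            fingers[0].pointer.position,
            fingers[1].pointer.position
        );
        if (!startDistance) {
            // First frame with two fingers down: capture the baseline
            startDistance = distance;
            startScale = model.scaling.x;
            return;
        }
        // Scale the model by how much the finger distance has changed
        model.scaling.setAll(startScale * (distance / startDistance));
    });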

I also mention there that it is possible to get the screen coordinates, it just takes a bit of computation that might slow down the AR render loop. If that’s the missing piece, I can show you how to project the 3D point to screen space.
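
The computation itself is Babylon’s standard world-to-screen projection; roughly (a sketch, assuming scene and engine are in scope):

    import { Matrix, Vector3 } from "@babylonjs/core";

    // Project a world-space point to screen-space pixels
    function worldToScreen(point: Vector3): Vector3 {
        return Vector3.Project(
            point,
            Matrix.IdentityReadOnly, // point is already in world space
            scene.getTransformMatrix(), // combined view * projection matrix
            scene.activeCamera!.viewport.toGlobal(
                engine.getRenderWidth(),
                engine.getRenderHeight()
            )
        );
    }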

1 Like

Hi again! Actually, I could do with some help converting 3D-space points to 2D coordinates; I haven’t been able to do a deep dive into the matrices involved.

As for my progress, I had another idea where I suspended a transparent plane in the scene and parented it to the camera so it stayed static relative to the screen, then used the texture coordinates of rays hitting it in place of the pointer coordinates. I had a lot of help from this thread:

HOWEVER, the next issue I faced was that I couldn’t get multiple points at the same time with this technique; it just duplicated the first coordinate. So still no pinch…

So that leaves another possible avenue to get screen coordinates, if there’s a way to get multiple texture coordinates from different rays at the same time. Or maybe some sort of multi-mesh grid, though that seems a bit wasteful.
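
For reference, the plane trick looked roughly like this (a sketch, not my exact code):

    import { MeshBuilder, PointerEventTypes, Vector3 } from "@babylonjs/core";

    // An invisible plane parented to the camera, so it stays fixed on screen
    const proxy = MeshBuilder.CreatePlane("screenProxy", { size: 2 }, scene);
    proxy.parent = scene.activeCamera;
    proxy.position = new Vector3(0, 0, 1); // just in front of the near plane
    proxy.visibility = 0.001; // nearly invisible but still pickable

    scene.onPointerObservable.add((pointerInfo) => {
        if (pointerInfo.type !== PointerEventTypes.POINTERMOVE) {
            return;
        }
        if (pointerInfo.pickInfo?.pickedMesh === proxy) {
            // UV on the plane stands in for a normalized screen coordinate...
            const uv = pointerInfo.pickInfo.getTextureCoordinates();
            // ...but with two fingers down, both events reported the same uv.
        }
    });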

I’ll find time later today to create a demo for this.

1 Like

Just following up on this:

[XR] multi-pointer screen coordinates projection by RaananW · Pull Request #11495 · BabylonJS/Babylon.js (github.com)

This PR adds the ability to project to screen space and get the data in the pointer event.

If you want to do it manually, here is the general gist:

AR screen pinch debugging | Babylon.js Playground (babylonjs.com)

You will get both pointers’ screen coordinates and will be able to calculate the delta in screen space.
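
Once you have per-pointer screen coordinates, the pinch math itself is plain 2D (an illustrative sketch; how you obtain and feed in the coordinates depends on the pointer event):

    import { Vector2 } from "@babylonjs/core";

    const touches = new Map<number, Vector2>();
    let startDistance = 0;

    // Call from your pointer handler with each pointer's id and screen position;
    // remember to delete the entry (and reset startDistance) on pointer up.
    function updatePinch(pointerId: number, screenPos: Vector2): number | null {
        touches.set(pointerId, screenPos);
        if (touches.size !== 2) {
            startDistance = 0;
            return null;
        }
        const [a, b] = [...touches.values()];
        const distance = Vector2.Distance(a, b);
        if (!startDistance) {
            startDistance = distance; // baseline when the second finger lands
            return null;
        }
        return distance / startDistance; // >1 = spread, <1 = pinch
    }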

1 Like

Hi @HedwigAR @Chrisor9, just checking in to see if either of you still needs any help :slight_smile:

2 Likes

No, thanks! I see I forgot to mark a solution; just did that now :grinning_face_with_smiling_eyes:

3 Likes

Thanks! <3

2 Likes