[WebXR] Get 2D touch coordinates when using immersive-ar

Hi everyone,
I’m trying to implement typical touch input to manipulate objects loaded in the scene (e.g. pinch-to-zoom for scaling a mesh) in an immersive-ar environment with Babylon.js.

I tried using both scene.onPointerObservable and scene.onPointerDown, but they always return screenX, screenY, and all the other coordinates equal to zero. I also tried DOM events like “pointerdown” and “touchstart”, but they are never triggered.
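For reference, this is roughly the kind of thing I tried (simplified sketch; `scene` and `canvas` are my Babylon scene and the render canvas):

```javascript
// Listening via the scene's pointer observable:
scene.onPointerObservable.add((pointerInfo) => {
    if (pointerInfo.type === BABYLON.PointerEventTypes.POINTERDOWN) {
        // In immersive-ar these always come back as 0
        console.log(pointerInfo.event.clientX, pointerInfo.event.clientY);
    }
});

// Plain DOM listeners, which never fire once the AR session starts:
canvas.addEventListener("pointerdown", (e) => {
    console.log(e.clientX, e.clientY);
});
```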

All these approaches work well in a classical 3D scene but stop working as soon as the immersive-ar session starts.

Is there a way to make touch input work properly in immersive-ar?

Thanks in advance.
Davide


Hi @Davide , welcome to the forum!

This topic was briefly discussed here - Enable Gestures in AR - Questions - Babylon.js

The AR camera does not support moving towards or away from the object, as that would contradict the concept of AR - zooming means physically getting closer, and rotating means moving around the object.

What you can do is add a gizmo to the mesh so that the mesh itself can be rescaled and transformed. Instead of zooming in, you just make the mesh bigger. You can use any of the gizmos here - Gizmos | Babylon.js Documentation
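For example, something like this should work (minimal sketch, assuming your mesh is in a variable called `mesh`):

```javascript
// Enable a bounding-box gizmo so the user can scale/rotate the mesh directly
const gizmoManager = new BABYLON.GizmoManager(scene);
gizmoManager.boundingBoxGizmoEnabled = true;
// Keep the gizmo on this mesh instead of re-attaching on every pointer pick
gizmoManager.usePointerToAttachGizmos = false;
gizmoManager.attachToMesh(mesh);
```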

Thanks RaananW.

My goal is not to zoom the camera in/out, because that would not make sense in an AR environment. Instead, I would like to retrieve the touch coordinates and use them to implement a mechanism for scaling the anchored mesh placed by the user (e.g. use the pinch gesture and the distance between the fingers to scale a specific object).

But the main problem is that I cannot access the coordinates of the touched point while in the immersive-ar view.
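Just to be concrete, assuming I could read the 2D coordinates of each touch, the pinch-to-scale logic I have in mind would be roughly this (hypothetical sketch; `anchoredMesh` is the mesh placed by the user):

```javascript
// Track active touches by pointer id and scale the mesh by the ratio of
// the current pinch distance to the previous one.
const activeTouches = new Map();
let previousPinchDistance = null;

function onTouchDown(id, x, y) {
    activeTouches.set(id, { x, y });
}

function onTouchMove(id, x, y) {
    activeTouches.set(id, { x, y });
    if (activeTouches.size === 2) {
        const [a, b] = [...activeTouches.values()];
        const distance = Math.hypot(a.x - b.x, a.y - b.y);
        if (previousPinchDistance !== null) {
            // Grow/shrink proportionally to how the fingers moved apart/together
            anchoredMesh.scaling.scaleInPlace(distance / previousPinchDistance);
        }
        previousPinchDistance = distance;
    }
}

function onTouchUp(id) {
    activeTouches.delete(id);
    previousPinchDistance = null;
}
```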

Have you tried the gizmos? Want to share a playground?