Hi all, I’m working on an AR project and I would like to use a screen pinch gesture to scale a single model in the scene.
AFAIK pinch gestures aren't built into the Pointer Events API, so I'm using the
onPointerObservable to get the pointerdown and pointerup events and reusing the pinch-zoom code from the MDN docs,
which relies on comparing the locations of the two pointer events (one from each finger) and checking whether they're moving closer together or further apart.
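For context, this is roughly the pattern I mean (a minimal sketch of the MDN-style approach, not the exact MDN code; the function and variable names here are just illustrative):

```javascript
// Cache of active pointers, keyed by pointerId.
const pointerCache = new Map();
let prevDistance = -1;

// Distance between two pointer events, using their screen locations.
function distance(a, b) {
  return Math.hypot(a.clientX - b.clientX, a.clientY - b.clientY);
}

function onPointerDown(ev) {
  pointerCache.set(ev.pointerId, ev);
}

// On each move, compare the current two-finger distance with the
// previous one: +1 means the fingers moved apart (scale up),
// -1 means they moved together (scale down), 0 means no reading yet.
function onPointerMove(ev) {
  pointerCache.set(ev.pointerId, ev);
  if (pointerCache.size === 2) {
    const [p1, p2] = [...pointerCache.values()];
    const dist = distance(p1, p2);
    const direction = prevDistance > 0 ? Math.sign(dist - prevDistance) : 0;
    prevDistance = dist;
    return direction;
  }
  return 0;
}

function onPointerUp(ev) {
  pointerCache.delete(ev.pointerId);
  if (pointerCache.size < 2) prevDistance = -1; // reset when a finger lifts
}
```

This is exactly why the missing location data breaks everything: with clientX/clientY stuck at 0, the distance never changes.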
The issue is that when I log these events in AR, they don't seem to have any location data: the properties that would be
clientX/clientY on a screen are always 0, no matter where on the screen I press.
Any idea which property of the PointerInfo object I should use to make this gesture work?