Hello,
I’m trying to read pointer events in an AR WebXR experience on mobile in order to detect multi-touch gestures. I tried using a WebXRInput source, but I can only get the pointer position on down/up, when the controller is added/removed. I need to be able to read the updated touch position(s) each frame. Hope this makes sense.
Every finger that touches the screen will create a new controller for you. An existing controller stays available only while that finger is still touching the screen. This is how the pointer system works in AR mode on mobile.
TBH I never tested a multi-touch scenario, but there shouldn’t be a problem using the pointer emulation system (on by default) and checking the pointer id to distinguish individual fingers.
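Since pointer emulation maps each finger to DOM pointer events with a unique `pointerId`, one way to read every finger's position per frame is to keep a small map updated from those events and sample it in the render loop. A minimal sketch of that idea (the `TouchTracker` name and structure are my own, not part of any framework API):

```javascript
// Track the latest position of every active finger, keyed by pointerId.
class TouchTracker {
  constructor() {
    this.active = new Map(); // pointerId -> { x, y }
  }
  down(id, x, y) {
    this.active.set(id, { x, y });
  }
  move(id, x, y) {
    // Only update fingers we have seen go down (ignores hover pointers).
    if (this.active.has(id)) this.active.set(id, { x, y });
  }
  up(id) {
    this.active.delete(id);
  }
}

const tracker = new TouchTracker();

// Wire the tracker to DOM pointer events when running in a browser.
if (typeof window !== "undefined") {
  window.addEventListener("pointerdown", (e) =>
    tracker.down(e.pointerId, e.clientX, e.clientY)
  );
  window.addEventListener("pointermove", (e) =>
    tracker.move(e.pointerId, e.clientX, e.clientY)
  );
  window.addEventListener("pointerup", (e) => tracker.up(e.pointerId));
  window.addEventListener("pointercancel", (e) => tracker.up(e.pointerId));
  // Read tracker.active inside your render loop each frame.
}
```

Reading `tracker.active` in the frame loop gives you all current finger positions without depending on controller add/remove timing.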
I hope this made sense somehow
Thanks!
I ended up using regular touch events instead, listening to ‘touchmove’, which did the trick. I was just curious whether I could get something similar from a WebXRInput source.
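For anyone following the same route, the ‘touchmove’ approach can be sketched like this: each event exposes all active fingers via `e.touches`, and `identifier` stays stable for the lifetime of each finger. The helper name here is hypothetical, just to keep the event-reading logic separable:

```javascript
// Collect the current position of every active finger from a TouchEvent-like object.
function readTouches(e) {
  const positions = new Map(); // identifier -> { x, y }
  for (const t of e.touches) {
    positions.set(t.identifier, { x: t.clientX, y: t.clientY });
  }
  return positions;
}

if (typeof window !== "undefined") {
  window.addEventListener(
    "touchmove",
    (e) => {
      e.preventDefault(); // keep the browser from scrolling/zooming
      const fingers = readTouches(e);
      // feed fingers (Map of identifier -> { x, y }) into your gesture logic
    },
    { passive: false } // required so preventDefault() takes effect
  );
}
```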
Really appreciate the fast response times here 😀