Clicking mesh in React Native

Hi all!

I’m trying to add some interaction to my project: I want to check whether a user clicks on a mesh, but I’m having trouble figuring this out on native mobile.
I know how to check whether there was a click with DeviceSourceManager.getDeviceSource, but how do I then get the position of that click? Or is there another way to detect a click on a mesh in the scene? I’ve tried

        scene.onPointerObservable.add((pointerInfo) => {
            // fires for all pointer events
            console.log("POINTER INFO: ", pointerInfo.type);
        });

        scene.onPointerDown = function () {
            // left mouse click
            console.log("POINTER DOWN");
        };
Both gave me nothing on React Native. Any thoughts?

Adding @ryantrem, the brilliant mind behind Babylon React Native.


Wow, @ryantrem sorry to do this to you again but I think I just found what I was looking for in this thread here: DeviceSourceManager for touch inputs - #3 by Deltakosh
No idea how I missed that before, so according to that I can do this kind of thing:

        const touchDevice = deviceSourceManager.getDeviceSource(BABYLON.DeviceType.Touch);
        if (touchDevice) {
            // a touch device only exists while a touch is active
        }
I was thinking that I could get that from the event, but I guess I have to save the ‘device’ to get the input data from it.
So touchDevice.getInput(BABYLON.PointerInput.Horizontal) gives the horizontal screen position and touchDevice.getInput(BABYLON.PointerInput.Vertical) gives the vertical one, I’m guessing.
I had tried touchDevice.getInput(PointerInput.LeftClick), which just gives a number, so that’s what got me confused.
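For what it’s worth, here’s a minimal sketch of that polling pattern. The manager and the DeviceType/PointerInput enum values are stubbed out here so the snippet runs standalone; in a real app you’d use BABYLON.DeviceSourceManager attached to your engine and poll inside something like scene.onBeforeRenderObservable.

```javascript
// Stub enum values (the real ones live on BABYLON.DeviceType / BABYLON.PointerInput).
const PointerInput = { Horizontal: 0, Vertical: 1, LeftClick: 2 };
const DeviceType = { Touch: 3 };

// Stub standing in for BABYLON.DeviceSourceManager with one active touch.
const deviceSourceManager = {
  _touchSource: {
    getInput(input) {
      const values = {
        [PointerInput.Horizontal]: 120, // screen X of the touch
        [PointerInput.Vertical]: 240,   // screen Y of the touch
        [PointerInput.LeftClick]: 1,    // 1 while pressed
      };
      return values[input];
    },
  },
  getDeviceSource(type) {
    return type === DeviceType.Touch ? this._touchSource : null;
  },
};

// Poll the touch device source; returns null when no touch is active,
// since touch "devices" only exist while a finger is down.
function readTouch() {
  const touchDevice = deviceSourceManager.getDeviceSource(DeviceType.Touch);
  if (!touchDevice) {
    return null;
  }
  return {
    x: touchDevice.getInput(PointerInput.Horizontal),
    y: touchDevice.getInput(PointerInput.Vertical),
    down: touchDevice.getInput(PointerInput.LeftClick) === 1,
  };
}

console.log(readTouch()); // → { x: 120, y: 240, down: true }
```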


The goal is for scene.onPointerObservable etc. to work as expected in Babylon [React] Native, and @PolygonalSun is working on that (any updates/ETA to share @PolygonalSun?). Until that work is done though, you do need to use DeviceSourceManager.

You can get the screen coordinates of the click as you suggested above (touchDevice.getInput with Horizontal/Vertical). Also, calling touchDevice.getInput with LeftClick just returns 0 if the pointer’s left button is up and 1 if it is down (in the case of touch, I don’t think it really has meaning, since touch “devices” only exist while in a left-click-down state).

Once you have the X/Y screen coordinate of the touch, if you want to know if this was on a mesh you can call scene.pick. You can find some example code here: Pointer position for raycast · Issue #214 · BabylonJS/BabylonReactNative · GitHub
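As a sketch of that scene.pick flow: the scene below is a stub whose pick() reports a hit inside a fixed rectangle, standing in for a real Babylon scene (where scene.pick raycasts from the camera through that pixel). The returned object mimics the hit/pickedMesh shape of a PickingInfo.

```javascript
// Stub scene: pick(x, y) "hits" when the point falls on our pretend mesh,
// which occupies screen rect x:[100,200], y:[200,300].
const scene = {
  pick(x, y) {
    const onMesh = x >= 100 && x <= 200 && y >= 200 && y <= 300;
    return { hit: onMesh, pickedMesh: onMesh ? { name: "box" } : null };
  },
};

// Given the X/Y screen coordinates read from the touch device source,
// return the name of the mesh under the touch, or null on a miss.
function meshUnderTouch(x, y) {
  const pickInfo = scene.pick(x, y);
  return pickInfo.hit ? pickInfo.pickedMesh.name : null;
}

console.log(meshUnderTouch(150, 250)); // touch lands on the mesh → "box"
console.log(meshUnderTouch(10, 10));   // touch misses → null
```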

This kind of input handling will be easier once the input system work is completed by @PolygonalSun.


Alright! Thanks for taking the time to write that up. No worries, I can absolutely make do with the DeviceSourceManager :smiley:

Yeah, regarding the work to get things like scene.onPointerObservable working: the groundwork on the JS side should be ready and live (as of 5.0.0 alpha 62), and I have a draft PR (Modify Playground to use JS InputManager by PolygonalSun · Pull Request #953 · BabylonJS/BabylonNative) up with my work on getting everything to work on the native side. Right now, I’m testing to make sure there’s no loss of functionality; once it’s been tested and polished on all platforms, I’ll publish the PR and get it merged. When it’s merged, all that’s left is to integrate it into Babylon React Native.

Ideally, I’d like to have things merged by the middle of December but with the holidays coming up, my timeline could change.
