Using image tracking in Babylon React Native

Hi!

I’m quite hyped about the image tracking that was just added to BabylonNative and BabylonReactNative. I’ve been trying to get it working for a while now, but I can’t seem to get anything out of the system.

This is how I’ve got it set up now:

  import { useEffect } from "react";
  import { WebXRFeatureName, WebXRImageTracking, IWebXRTrackedImage } from "@babylonjs/core";

  useEffect(() => {
      if (context.featuresManager && context.scene !== undefined) {
          console.log("SET TRACKING IMAGES");
          // Enable the image tracking feature with the list of images to look for.
          const webXRImageTrackingModule = context.featuresManager.enableFeature(
              WebXRFeatureName.IMAGE_TRACKING,
              "latest",
              {
                  images: [
                      {
                          src: "https://developers.google.com/ar/images/augmented-images-earth.jpg",
                          estimatedRealWorldWidth: 0.2
                      },
                      {
                          src: "https://raw.githubusercontent.com/KhronosGroup/glTF-Sample-Models/master/2.0/DragonAttenuation/screenshot/screenshot_large.png",
                          estimatedRealWorldWidth: 0.2
                      }
                  ]
              }
          ) as WebXRImageTracking;
          // Fires once per image the system is able to track.
          webXRImageTrackingModule.onTrackableImageFoundObservable.add(event => {
              console.log("IMAGE TRACKABLE " + event.id);
          });
          // Fires for images the system cannot track.
          webXRImageTrackingModule.onUntrackableImageFoundObservable.add(event => {
              console.log("IMAGE UNTRACKABLE " + event);
          });
          // Fires each time a tracked image's pose is updated.
          webXRImageTrackingModule.onTrackedImageUpdatedObservable.add((imageObject: IWebXRTrackedImage) => {
              console.log("SHOULD BE SHOWING SOMETHING " + imageObject.id);
          });
      }
  }, [context.featuresManager]);

However, I only get the first console message, “SET TRACKING IMAGES”; none of the events ever trigger. Is there a way for me to find out what’s going wrong? I already changed one of the images to the one suggested by Google here: Add dimension to images | ARCore | Google Developers

I’m using these versions:

  "@babylonjs/core": "5.2.0",
  "@babylonjs/loaders": "5.2.0",
  "@babylonjs/react-native": "1.1.0",

Thanks! :smiley:

Adding @ryantrem and @bghgary to the thread


@AlexTran?


Hi Peter, awesome to hear you’re trying out the image tracking feature :smiley:!

I was able to get these images loading in the Babylon React Native Playground using your code directly, so hopefully we can figure out what is going on in your application.

One important requirement of the image tracking feature is that it must be attached before calling context.baseExperience.enterXRAsync. The reason is that the current WebXR image tracking spec only supports tracking images declared at session initialization time. From your code it looks like you are waiting for the context object to be instantiated, so if the useEffect that sets the context object also calls enterXRAsync, this chunk of code is likely running after session init. What I would suggest is moving this code into the same useEffect/useCallback where you initialize the XR session.

Here is the code I tested in the BabylonReactNative Playground app for reference:

  import { useCallback } from "react";
  import { Vector3, WebXRFeatureName, WebXRImageTracking, IWebXRTrackedImage } from "@babylonjs/core";

  const onToggleXr = useCallback(() => {
    (async () => {
      if (xrSession) {
        await xrSession.exitXRAsync();
      } else {
        if (rootNode !== undefined && scene !== undefined) {
          const xr = await scene.createDefaultXRExperienceAsync({ disableDefaultUI: true, disableTeleportation: true });
          // Enable image tracking BEFORE entering the XR session: the WebXR spec
          // requires the tracked image list to be declared at session init time.
          const webXRImageTrackingModule = xr.baseExperience.featuresManager.enableFeature(
            WebXRFeatureName.IMAGE_TRACKING,
            "latest",
            {
                images: [
                    {
                        src: "https://developers.google.com/ar/images/augmented-images-earth.jpg",
                        estimatedRealWorldWidth: 0.2
                    },
                    {
                        src: "https://raw.githubusercontent.com/KhronosGroup/glTF-Sample-Models/master/2.0/DragonAttenuation/screenshot/screenshot_large.png",
                        estimatedRealWorldWidth: 0.2
                    }
                ]
            }
          ) as WebXRImageTracking;
          webXRImageTrackingModule.onTrackableImageFoundObservable.add(event => {
              console.log("IMAGE TRACKABLE " + event.id);
          });
          webXRImageTrackingModule.onUntrackableImageFoundObservable.add(event => {
              console.log("IMAGE UNTRACKABLE " + event);
          });
          webXRImageTrackingModule.onTrackedImageUpdatedObservable.add((imageObject: IWebXRTrackedImage) => {
              console.log("SHOULD BE SHOWING SOMETHING " + imageObject.id);
          });

          // Only now start the AR session; the image list above is baked into it.
          const session = await xr.baseExperience.enterXRAsync('immersive-ar', 'unbounded', xr.renderTarget);
          setXrSession(session);
          session.onXRSessionEnded.add(() => {
            setXrSession(undefined);
            setTrackingState(undefined);
          });

          setTrackingState(xr.baseExperience.camera.trackingState);
          xr.baseExperience.camera.onTrackingStateChanged.add((newTrackingState) => {
            setTrackingState(newTrackingState);
          });

          // TODO: Figure out why getFrontPosition stopped working
          //box.position = (scene.activeCamera as TargetCamera).getFrontPosition(2);
          // Place the root node 1 unit in front of the camera, facing it.
          const cameraRay = scene.activeCamera!.getForwardRay(1);
          rootNode.position = cameraRay.origin.add(cameraRay.direction.scale(cameraRay.length));
          rootNode.rotate(Vector3.Up(), Math.PI);
        }
      }
    })();
  }, [rootNode, scene, xrSession]);
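
As a side note, once these events fire you will usually want to pin content to the tracked image rather than just log. Here is a minimal sketch of that pattern, assuming a trackedNode TransformNode you create yourself (trackedNode and its name are illustrative, not part of the code above; webXRImageTrackingModule and scene are the ones from the snippet):

  import { TransformNode, Quaternion } from "@babylonjs/core";

  // Hypothetical node holding whatever content should follow the image.
  const trackedNode = new TransformNode("trackedImageRoot", scene);
  trackedNode.rotationQuaternion = Quaternion.Identity(); // decompose() writes into this
  trackedNode.setEnabled(false); // keep hidden until the image is actually seen

  webXRImageTrackingModule.onTrackedImageUpdatedObservable.add((imageObject: IWebXRTrackedImage) => {
      // Copy the tracked pose (scale / rotation / translation) onto the node.
      imageObject.transformationMatrix.decompose(
          trackedNode.scaling,
          trackedNode.rotationQuaternion!,
          trackedNode.position
      );
      // `emulated` means the image is not currently in view and the pose is a guess.
      trackedNode.setEnabled(!imageObject.emulated);
  });

Disabling the node while emulated is set keeps your content from floating at a stale pose after the image leaves the camera view.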

Hi! Thanks for the quick and thorough response! This is indeed happening after the AR initialisation, so that could be it! Tomorrow is a national holiday here, but I’ll check the day after :smile:

Thanks again!


Hi! It does detect now, and I also get the trackable etc. events. Thanks! :smiley:
