Hey, I am looking to build a platform to support app development for AR glasses. The assumption is that the sensor data can be collected from the glasses over a USB connection to a mobile phone.
What approach could I look at implementing? One idea I have in mind is to build a wrapper over Babylon React Native that swaps the default mobile sensors for the sensors from the glasses.
Or I could try something like https://cloud.google.com/immersive-stream/xr, where all the sensor data is sent to the cloud and only the rendered frames are returned. Would I be able to run BabylonJS on a server to render AR and send back frames?
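To make the second option concrete, here is a rough sketch of the round trip I have in mind: the glasses' pose goes up to the server, a rendered frame comes back. Everything in it is hypothetical — `PoseSample`, `FramePacket`, and `renderFrame` are placeholder names I made up, not a real Immersive Stream or Babylon API:

```typescript
// Sketch of the cloud-rendering round trip (all names are placeholders).

interface PoseSample {
  timestampMs: number;
  position: [number, number, number];            // from the glasses' tracking
  orientation: [number, number, number, number]; // quaternion (x, y, z, w)
}

interface FramePacket {
  timestampMs: number;   // echoes the pose timestamp, for latency tracking
  jpegBytes: Uint8Array; // encoded frame to composite on the glasses
}

// Server side: turn one pose into one rendered frame.
// A real implementation would drive a Babylon scene here instead.
function renderFrame(pose: PoseSample): FramePacket {
  const fakeFrame = new Uint8Array(16); // placeholder for encoded pixels
  return { timestampMs: pose.timestampMs, jpegBytes: fakeFrame };
}

// Client side: stream poses up, composite frames as they come back.
const pose: PoseSample = {
  timestampMs: 42,
  position: [0, 1.6, 0],
  orientation: [0, 0, 0, 1],
};
const frame = renderFrame(pose);
console.log(frame.timestampMs); // matching timestamps let latency be measured per frame
```

The open question for me is the server side of that `renderFrame` step: whether Babylon can run headless and produce encoded frames fast enough for this loop.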
Hello and welcome to the Babylon community!
cc @bghgary who is the most qualified to respond
Babylon Native can in theory run almost anywhere and can be extended or modified to do almost anything, so I think what you are asking is theoretically possible, but I don't think anyone has tried it, aside from our minimal support for HoloLens. I'm not sure Babylon React Native will work for the server scenario, since it only runs on Android, iOS, and Windows.
So if I use the data from external sensors, such as the camera and IMU on the AR glasses, received through the USB connection with the mobile, I can make Babylon behave according to those sensors instead of the built-in mobile sensors. Right?
In that case, would I be altering this code over here?
Probably, yes, but it depends on what you want to do with those sensors. AFAIK, Babylon.js doesn't inherently have any direct usage of AR cameras or IMUs except maybe through WebXR. If you are looking to replace the existing WebXR capabilities, you will need to modify the NativeXR plugin and the XR dependency. If you just want direct access to the sensors, then you can provide your own plugin and expose custom interfaces for the JavaScript side to access.
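For the "direct access" route, the JavaScript-facing contract could be as simple as a polled interface that your native plugin registers on the global scope. A minimal sketch, with the caveat that every name here (`ImuSample`, `ExternalSensors`, `readImu`) is made up for illustration — your plugin would define the real contract, and the native side is stubbed with canned data so the sketch is self-contained:

```typescript
// Hypothetical JS-side contract for a custom Babylon Native sensor plugin.

interface ImuSample {
  timestampMs: number;
  orientation: [number, number, number, number]; // quaternion (x, y, z, w)
}

interface ExternalSensors {
  readImu(): ImuSample; // latest sample from the USB-connected glasses
}

// Stand-in for the object the native plugin would inject into JavaScript.
const externalSensors: ExternalSensors = {
  readImu: () => ({ timestampMs: 7, orientation: [0, 0, 0, 1] }),
};

// In your render loop you would copy each sample onto the camera, e.g.
// setting rotationQuaternion on a Babylon FreeCamera from the glasses' IMU,
// instead of letting Babylon read the phone's own sensors.
const sample = externalSensors.readImu();
console.log(sample.orientation); // identity quaternion from the stub
```

The design choice is between this kind of polling from the JS render loop and having the native side push samples via a callback; either way, the plugin is what isolates Babylon from which physical sensor the data came from.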