Hey, I am looking to build a platform to support app development for AR glasses. The assumption is that the sensor data can be collected over USB from glasses connected to a mobile phone.
What approach should I look at implementing? One idea I have in mind is to build a wrapper over Babylon React Native that swaps the default mobile sensors for the sensors on the glasses.
Or I could try something like https://cloud.google.com/immersive-stream/xr, where all the sensor data is sent to the cloud and only rendered frames are returned. Would I be able to run Babylon.js on a server to render the AR scene and send back frames?
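To make the remote-rendering idea concrete, here is roughly the round trip I am imagining per frame. This is only a sketch of the data flow; every type and function name here is a placeholder I made up, not an Immersive Stream XR or Babylon API:

```typescript
// Sketch of one cloud-rendering round trip: the phone uploads a sensor
// sample, the server renders a frame, and an encoded frame comes back.
// All shapes below are hypothetical placeholders.

interface SensorSample {
  timestampMs: number;
  pose: number[]; // e.g. orientation quaternion + position from the glasses
}

interface EncodedFrame {
  timestampMs: number; // sample timestamp echoed back for client-side reprojection
  bytes: Uint8Array;   // compressed frame data
}

// Stand-in for the server-side work: rendering (e.g. Babylon.js against a
// headless GL context) followed by video encoding.
function renderFrame(sample: SensorSample): EncodedFrame {
  const bytes = new Uint8Array(4); // placeholder for an encoded frame
  return { timestampMs: sample.timestampMs, bytes };
}

// One iteration of the loop; end-to-end latency hinges on this round trip,
// which is why the timestamp travels with the frame.
const frame = renderFrame({ timestampMs: 16, pose: [0, 0, 0, 1, 0, 0, 0] });
console.log(frame.timestampMs); // 16
```

The open question for me is the `renderFrame` part: whether Babylon.js can produce real frames headlessly on a server, or whether that needs something else entirely.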
Hello and welcome to the Babylon community!
cc @bghgary who is the most qualified to respond
Babylon Native can in theory run almost anywhere and can be extended/modified to do almost anything, so I think it is theoretically possible to do what you are asking but I don’t think anyone has tried, besides our minimal support for HoloLens. I’m not sure Babylon React Native will work for the server scenario since it can only run on Android, iOS, and Windows.
So if I use data from the external sensors on the AR glasses (such as the camera and IMU), received over USB on the mobile, I can make Babylon respond to those sensors instead of the phone's built-in sensors. Right?
In that case, would I be altering this code over here?
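Just to illustrate what I mean by feeding the glasses' sensors in: something like the following, where the incoming IMU sample is cleaned up before each frame and the Babylon-specific step is left as a comment. The `GlassImuSample` type and the USB source are assumptions on my side, not a real API:

```typescript
// Hypothetical shape of an orientation sample arriving from the glasses
// over USB. Field layout is an assumption.
interface GlassImuSample {
  x: number;
  y: number;
  z: number;
  w: number;
}

// Normalize the quaternion before handing it to the renderer; raw sensor
// samples can drift slightly from unit length.
function toUnitQuaternion(s: GlassImuSample): GlassImuSample {
  const len = Math.hypot(s.x, s.y, s.z, s.w);
  return { x: s.x / len, y: s.y / len, z: s.z / len, w: s.w / len };
}

// In the render loop, instead of the DeviceOrientationCamera reading the
// phone's IMU, the idea would be roughly:
//   camera.rotationQuaternion = new BABYLON.Quaternion(q.x, q.y, q.z, q.w);
const sample: GlassImuSample = { x: 0, y: 0, z: 0, w: 2 };
const q = toUnitQuaternion(sample);
console.log(q.w); // 1
```

Is replacing the camera input like that the right layer to hook into, or is there a lower-level sensor abstraction I should override instead?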