Apple Vision Pro // AI

I’ll phrase this as questions below, but the feature request is “As much as possible!” –

What support is there in Babylon for WebXR using the Apple Vision Pro?

What support is there for using AI to assist in Babylon development for WebXR?


Would be awesome!

visionOS can display classic windows (2D surfaces), volumes (3D objects), and full spaces (3D worlds).

Can a web canvas + Babylon engine be displayed as a visionOS “volume” (for content) or a visionOS “full space” (for the whole scene) directly, instead of through a Safari 2D window?

I’m mostly interested in WebXR (3D, 360, and also AR/MR) on the AVP. There are hints that Safari will support WebXR by the time the AVP releases. If it does, I’m not sure whether it matters what tools are used to create the WebXR content. However, one selling point of the AVP is that it is intended to work seamlessly with all other Apple devices, and that could require new features in Babylon.

The question should actually be: “What support will the Apple Vision Pro have for WebXR?”. We support the WebXR standard fully; the issue we’ve had for years was Apple not supporting it properly :sweat_smile:
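To make “we support the WebXR standard fully” concrete, here is a minimal sketch of how a page would feature-detect WebXR and spin up Babylon’s built-in XR helper. It assumes Babylon.js is loaded and a `scene` already exists; `navigator.xr.isSessionSupported` is the standard WebXR API, and `createDefaultXRExperienceAsync` is Babylon’s own helper. The function name `enterXRIfSupported` is just for this example, and the XR object is passed in as a parameter so the logic can be exercised outside a browser:

```javascript
// Sketch: feature-detect WebXR, then let Babylon set up the session.
// `scene` is a BABYLON.Scene; `xr` is navigator.xr (or undefined when
// the browser -- e.g. an older Safari -- doesn't expose WebXR at all).
async function enterXRIfSupported(scene, xr) {
  if (!xr) {
    return { supported: false, reason: "WebXR not available" };
  }
  // Ask the browser whether it can run an immersive VR session.
  const ok = await xr.isSessionSupported("immersive-vr");
  if (!ok) {
    return { supported: false, reason: "immersive-vr not supported" };
  }
  // Babylon's default helper wires up the XR session, camera,
  // controllers, and the "enter XR" UI button for us.
  const experience = await scene.createDefaultXRExperienceAsync({
    uiOptions: { sessionMode: "immersive-vr" },
  });
  return { supported: true, experience };
}
```

In a real page you would call `enterXRIfSupported(scene, navigator.xr)` after creating the engine and scene; on a headset browser that supports WebXR, the helper renders an “enter XR” button on the canvas. The point is that nothing here is vendor-specific -- if Safari on the Vision Pro implements the standard, this same code should work there.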

I’m not sure what the request is here? :slight_smile: You can try using any existing LLM to assist you with development; for example, I use Copilot a lot and it’s very helpful, but we can’t exactly control how any model will work. We did have some experiments already with Cortex models, @sebavan worked on that, and I’m currently analyzing the feasibility of adding a chatbot helper to our docs page. But these are things outside our core framework, as we shouldn’t change anything inside the core engine for AI.

AVP & WebXR

There are several WWDC videos about Safari and WebXR. Here’s one that not only indicates they intend to support WebXR; toward the end of the video, they even reference Babylon.js!

I cannot tell how much, if anything, Babylon needs to add (or change) to fully support the possibilities and requirements of Safari and Apple. From this video, it sounds as if Apple may be pushing new features into the WebXR specification. Apple also seems to want all its devices to work seamlessly together, which might open new possibilities for Babylon to generate native code that runs on other Apple devices.

AI –

The possibilities are many for AI to interact with Babylon in new ways. Text-to-code on top of Babylon is one obvious example. Another might be Babylon features that simplify adding AI to applications: for example, enhancing avatars with the ability to understand speech and generate spoken replies (AWS avatars do that now, I think, in Sumerian, which is powered by Babylon), or AI-powered navigation through complex websites.