WebXR Room Mesh Access

I know I’m relentless @RaananW , but it seems like Oculus is exposing Room Mesh on the Quest 3 now. Wonder if that could get rolling in Babylon!


Well, at least we already have depth sensing! 🙂

The next feature I wanted to work on was the WebXR mesh detection feature (WebXR Mesh Detection Module). We have it implemented, but currently only for Babylon Native. I assume this is what he is referring to here?


That’s the one! Just confirmed the Quest 3 supports the “mesh-detection” session feature.


Want to try this - Babylon.js Playground (babylonjs.com)? I don’t have a Quest 3 (yet) to test with, and it is not (yet) enabled on the Quest Pro.
It might fail! And I want to know how and why. Thanks!!!

Hi @RaananW , I was able to open up this PG in Quest 3 and there weren’t any errors (I don’t think!), but I also don’t see any representation of the room mesh. Am I supposed to? Or how could I access the meshes themselves?

Do you have OS version 60 already? I still haven’t gotten the update…
I was also told that a flag needs to be enabled, but I assume it’s the experimental flag that is needed.

The PR should (!!) generate a mesh on your screen, based on the canned room. If everything is expected to be working (OS 60, flag is on) and it doesn’t, something is wrong: either I messed something up, or the extension did.

I will have to wait for version 60 to be able to test that.

Working! Link updated in the PR


Really cool @RaananW ! I used your example and I do have a grid appearing like in your photo, but that room mesh seems to be crooked and scaled strangely relative to my actual real-world space. Did you have to do anything special to get the generated room mesh with the grid material to map nicely to your real-world surroundings?

That’s interesting. No, I haven’t done anything other than using this example. Are you using the playground linked in the PR? Can you share a screenshot or the mesh itself? Would be great to try and reproduce this.

Hello. I saw the pull request in the Babylon.js What’s New… However, I can’t seem to find any instructions, tutorial, or implementation example.

I’m trying to access the Meta Quest 3 Depth API, Mesh API, Scene API, etc. with WebXR and Babylon.

Any suggestions?
@RaananW

Hi!

This is the feature - Babylon.js docs. There is not a lot of functionality to this feature; it provides meshes detected by the underlying system.

Playground - https://playground.babylonjs.com/#CB9E35#7
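For anyone landing here from search, this is roughly how the feature gets enabled in a Babylon.js WebXR session. This is a sketch, not the playground's exact code: I'm assuming `xr` is a `WebXRDefaultExperience` from `scene.createDefaultXRExperienceAsync`, and the `generateMeshes` option name is from my reading of the API, so check it against the current docs.

```javascript
// Sketch: enabling WebXR mesh detection in Babylon.js.
// Assumptions: `xr` is a WebXRDefaultExperience, `BABYLON` is the Babylon.js
// namespace, and the `generateMeshes` option asks the feature to build
// renderable meshes from the detected vertex data.
function enableMeshDetection(xr, BABYLON) {
  const featuresManager = xr.baseExperience.featuresManager;
  const meshDetector = featuresManager.enableFeature(
    BABYLON.WebXRFeatureName.MESH_DETECTION,
    "latest",
    {
      // Build actual Babylon meshes from the detected room geometry.
      generateMeshes: true,
    }
  );
  // Log every room mesh the underlying system reports.
  meshDetector.onMeshAddedObservable.add((meshData) => {
    console.log("room mesh added:", meshData.id);
  });
  return meshDetector;
}
```

From there you can apply your own material (the playground uses a grid material) to each generated mesh as it arrives.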


Thanks for this. I notice that when I hold the Oculus scene re-anchor button, the mesh moves too. Do you have a recommendation for keeping the mesh in its proper location rather than having it move with the scene? I'm not even sure how to detect that XR scene reset.

I’m not sure what scene re-anchoring means, but the feature itself has three observables: onMeshAddedObservable, onMeshRemovedObservable, and onMeshUpdatedObservable, which will provide you with new mesh data if the mesh’s data itself was updated. Would that help in your case?
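In case it helps, here is a minimal sketch of wiring up all three observables to keep a local map of the detected meshes. The observable names are the feature's; the payload's `id` field is my assumption about the vertex-data shape, so verify it against the docs.

```javascript
// Sketch: keeping a local registry of detected room meshes via the
// feature's three observables. `meshDetector` is assumed to expose
// onMeshAddedObservable, onMeshRemovedObservable, and
// onMeshUpdatedObservable; the `id` field on the payload is an assumption.
function trackRoomMeshes(meshDetector) {
  const known = new Map(); // mesh id -> latest mesh data

  meshDetector.onMeshAddedObservable.add((m) => known.set(m.id, m));
  // On update, replace the stored entry with the fresh data.
  meshDetector.onMeshUpdatedObservable.add((m) => known.set(m.id, m));
  meshDetector.onMeshRemovedObservable.add((m) => known.delete(m.id));

  return known;
}
```

If re-anchoring triggers onMeshUpdatedObservable, this is the place to re-apply your own alignment logic.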

I’m referring to the common XR feature.
Example: on the Meta Quest right controller, if you hold the Meta logo button for a few seconds, the XR experience re-orients itself so that forward and the menus are right in front of you.

This functionality also happens when using Babylon.js in the Meta Quest browser.

The problem is that this also changes the location of the real-life scene mesh, and it no longer aligns with the real-world geometry. How do we detect when this re-orientation occurs and then re-align the scene mesh to match the real-world geometry?

The mesh should then be updated and the observable should trigger. If it doesn’t, then either we are doing something wrong, or the Quest is not informing us of the update. WebXR defines a timestamp (lastChangedTime) for the last time a mesh was updated by the underlying system. I expect the Quest browser to provide us with an update.
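One way to react only to genuine geometry changes is to compare that lastChangedTime per mesh yourself. Here's a plain-JavaScript sketch of that bookkeeping; it has no Babylon dependency, and exactly how the timestamp is surfaced on the mesh data Babylon hands you is an assumption to verify.

```javascript
// Sketch: detect whether a mesh was actually updated since we last
// handled it, keyed on the WebXR lastChangedTime timestamp. Pure
// bookkeeping; where lastChangedTime lives on Babylon's mesh data is
// an assumption to check against the implementation.
function makeUpdateDetector() {
  const lastSeen = new Map(); // mesh id -> last timestamp we handled

  return function wasUpdated(id, lastChangedTime) {
    const prev = lastSeen.get(id);
    lastSeen.set(id, lastChangedTime);
    // New mesh, or a strictly newer timestamp, counts as an update.
    return prev === undefined || lastChangedTime > prev;
  };
}
```

Calling `wasUpdated` inside an onMeshUpdatedObservable handler would let you run re-alignment only when the underlying system really changed the mesh.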