Did this turn out to be a permission issue? I’m trying to detect planes using Chrome 80, and the XRSession object is very different from the Babylon interface. It seems updateWorldTrackingState no longer exists. Does this match what you’ve seen?
I also tried the pattern in this playground of storing the object returned from `xrFeatureManager.enableFeature(WebXRFeatureName.PLANE_DETECTION, 'latest')` (`xrPlanes` in the example), but it does not have the `onPlaneAddedObservable` shown in the playground.
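For reference, the pattern being described looks roughly like this. This is a hedged sketch based on the Babylon.js feature manager API; `xrHelper` is assumed to be a `WebXRDefaultExperience` created elsewhere (e.g. via `scene.createDefaultXRExperienceAsync()`), and the observable only exists on versions where the plane detection feature is actually implemented.

```javascript
// Sketch: enabling plane detection through the Babylon.js feature manager.
// `xrHelper` is assumed to be a WebXRDefaultExperience created elsewhere.
const featuresManager = xrHelper.baseExperience.featuresManager;

const xrPlanes = featuresManager.enableFeature(
  BABYLON.WebXRFeatureName.PLANE_DETECTION,
  "latest"
);

// On versions where the feature is implemented, detected planes are
// surfaced through observables like this one:
xrPlanes.onPlaneAddedObservable.add((plane) => {
  console.log("plane detected:", plane.polygonDefinition.length, "points");
});
```

If the returned object has no `onPlaneAddedObservable`, that is consistent with the implementation being out of date for the browser's interface, as discussed below.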
No, Chrome 80 implemented a totally different interface, so the existing implementation is obsolete.
I will find time in the next few weeks to implement the new plane detection and hit test. Hopefully sooner rather than later.
Cool. The work you’ve done so far is great. Lots of fun playing around with the XR support. Thanks!
@RaananW thanks for your reply and the playground example! It's definitely a lot of fun to play around with. But I'm still curious about activating the "immersive-ar" mode with stereo rendering. Immersive VR automatically starts with stereo rendering, but immersive AR does not; it shows the "normal" view without splitting it into a right and a left eye view. Is there a Chrome flag that has to be activated, or anything else I'm missing?
An immersive AR session usually just uses the handheld camera, not a headset like VR. This is the device's capability, not Babylon's decision.
We simply ask - how many screens? the device answers, and we say - hmmm. 2? Man, now we need to render the entire scene twice!
Jokes aside: without two cameras it will be difficult (almost impossible) to render the 3D world in stereo-split headset mode as you would in VR. Your phone's camera sits in one spot on the device and does not correspond to your eyes' locations.
That was exactly the answer I secretly did not want to hear. I thought the WebXR API would be a step further and would provide a workaround for exactly these use cases. After all, only the camera environment would have to be rendered additionally, everything else should be identical to the VR environment, right?
I would like to develop an AR web app for smartphones that mirrors the functionality of an AR headset, especially the advantage of not having to hold the smartphone in your hand. Do you see any possibility of realizing this?
yes
But I have no idea how it will look like.
Use WebXR in vr mode, load the camera to a render target texture, and show it to both eyes. All possible on android.
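A minimal sketch of that workaround, assuming Babylon.js on Android Chrome: run the session as VR (so both eyes are rendered) and put the phone's camera feed on a background plane via a video texture. All names and constraint values here are illustrative, not a complete or tested implementation; Babylon's `VideoTexture.CreateFromWebCamAsync` wraps `getUserMedia`, and the exact constraints it honors may vary by version.

```javascript
// Sketch: show the rear camera behind the 3D scene while in a WebXR VR
// session, so both eye views contain the camera image (pass-through style).
async function addCameraBackground(scene) {
  // Grab the camera as a video texture (Babylon wraps getUserMedia).
  // The constraint values below are illustrative assumptions.
  const videoTexture = await BABYLON.VideoTexture.CreateFromWebCamAsync(
    scene,
    { maxWidth: 1280, maxHeight: 720 }
  );

  // A large plane placed behind the scene content; both XR eye cameras
  // will render it, giving a stereo view of the (mono) camera feed.
  const background = BABYLON.MeshBuilder.CreatePlane(
    "cameraFeed",
    { width: 16, height: 9 },
    scene
  );
  background.position.z = 10;

  const mat = new BABYLON.StandardMaterial("cameraFeedMat", scene);
  mat.emissiveTexture = videoTexture;
  mat.disableLighting = true;
  background.material = mat;
}
```

Note the limitation raised above still applies: the feed is monocular, so both eyes see the same camera image and the real-world background has no true depth.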
That sounds like a good idea. I’ll definitely look into it. The combination with markers should also not be a problem.
I will post an update if I get something going in the next few weeks. Thanks!
Any updates???
Do we have pointer-down trigger functionality when rendering AR using Babylon.js?
Sorry, not sure what you mean?
Above you mentioned that in AR we have the following limitations:
- Pointer down and up are not triggered, so you cannot click buttons ATM.
- Plane detection and anchors are not yet supported.
So I'm asking if there is any update on the "pointer down and up" functionality?
Oh, sorry
Pointer events now work correctly in AR mode. Anchors are not yet implemented, and the plane detection API is deprecated (still waiting on an update).
The new hit test feature is implemented ([WIP] New hit test by RaananW · Pull Request #7790 · BabylonJS/Babylon.js · GitHub) and working on latest chrome (mobile)
By plane detection API, do you mean WebXR hit test?
No, I mean plane detection. As I wrote, hit test is implemented.
Is there any demo where I can see the hit test implemented?
I am confused about plane detection and hit test. Could you please elaborate on the difference?
Hit test is ray casting into the real world.
Plane detection is detecting the geometry of floors and walls. They can be combined, but they are not ATM.
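To make the distinction concrete, here is a hedged sketch of the hit test feature from the PR linked above. `xrHelper` is assumed to be a `WebXRDefaultExperience` and `marker` an existing mesh with a `rotationQuaternion`; observable and property names follow the Babylon.js hit test feature but may have changed since this thread.

```javascript
// Sketch: hit test returns poses where a ray from the device intersects
// real-world surfaces; it does not build persistent plane geometry.
const hitTest = xrHelper.baseExperience.featuresManager.enableFeature(
  BABYLON.WebXRFeatureName.HIT_TEST,
  "latest"
);

hitTest.onHitTestResultObservable.add((results) => {
  if (results.length) {
    // Place the marker mesh at the first intersection pose.
    results[0].transformationMatrix.decompose(
      undefined,
      marker.rotationQuaternion,
      marker.position
    );
  }
});
```

Plane detection, by contrast, would hand you the detected floor/wall polygons themselves rather than single ray intersection poses.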
For a demo you can try this - https://playground.babylonjs.com/#KVZI50#64
A short warning - it’s unstructured. I’m still working on a better demo
This demo is not working properly. Two cubes appear on the screen when touched and then move abruptly, sometimes with the motion of the screen.
I am using Chrome 80 and enabled all the webxr flags.
I tried in Chrome 81 as well but I am not able to run the demo there.
Yes, there are a few issues with it. This is the page I am using during development.
This is, however, the way to use it. You can create your own demo as well.
Documentation and better demos are coming soon(ish)