The ray SHOULD be invisible, at least in the first one. If it isn’t, I need to check why.
Could it be the teleportation ray and not the selection ray?
Oh, and just in case this is not obvious… OMG IT IS WORKING!
Ah, you are right: the picking ray is invisible, the teleportation ray is visible.
It’s hard to tell where you are looking on the color cube or the surrounding slider. One refinement that would be really nice is some sort of signifier showing where the user is looking. Apple uses a sort of on-hover decorator for their UI elements to denote where the person is looking (you can see that in the video above, on the home screen and in webpages). It seems like that interaction might be hard to enforce across all of Babylon, so the easier solution might be some sort of translucent dot that acts as a gaze signifier.
Not hard, impossible. Apple made it impossible. I only know where you are looking when you tap. So beforehand there is no way to mark what you are looking at, since WebXR doesn’t (yet) support eye tracking.
Apple found an interesting way to emulate eye tracking, but it would have been much better if they had made the eyes a tracked pointer (and not a transient controller), so the eyes would behave like your hands, at least in terms of pointer events: constant move, down on tap, up when done tapping.
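To illustrate the difference being discussed: in WebXR terms, a Quest-style hand or controller is a persistent `tracked-pointer` input source whose target ray updates every frame, while the Vision Pro exposes a `transient-pointer` source that only exists for the duration of a pinch. A minimal sketch (plain WebXR vocabulary, not a Babylon API; the helper name is made up for illustration):

```javascript
// Classify a WebXR XRInputSource by its targetRayMode.
// "tracked-pointer" (e.g. Quest hands/controllers): the source persists and
// its ray updates continuously, so move/down/up pointer events are possible.
// "transient-pointer" (Vision Pro gaze + pinch): the source is created at
// selectstart and removed at selectend, so there is no ray before the tap.
function describeTargetRay(inputSource) {
  switch (inputSource.targetRayMode) {
    case "tracked-pointer":
      return "persistent ray: constant move, down on tap, up when done";
    case "transient-pointer":
      return "transient ray: exists only between selectstart and selectend";
    default:
      return "other: " + inputSource.targetRayMode;
  }
}
```

This is why there is nothing to point with before the tap: the transient source simply does not exist yet.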
@apowers313 To add to that, it seems like a very intentional choice from Apple for security reasons. You can’t even get the gaze position yourself when developing natively; you just define how you want things to react when they are gazed over. Not sure how they will rectify that moving forward, or if WebXR can change to accommodate this, but time will tell.
Ada Rose Cannon is Apple’s W3C WebXR rep. You can probably look at the issues in this repo to infer how she is thinking about it.
New WebKit blog post about AVP Safari hand input. It would have been helpful a little sooner, but better late than never.
Is there a way to bypass this controller implementation, which closely resembles the way the Apple Vision Pro works, and just make it work like the Oculus Quest 3? That would be more intuitive inside a VR environment, I think.
I mean, instead of the crosshair and pinching to “click” on elements, just have two hands with rays and “click” on pinch.
Probably by forcing the Quest 3 hand tracking to load instead of the Vision Pro one, but how?
The quick answer is: no, Apple doesn’t provide us with the data to offer a different experience. No rays from the hands, and no ray data from the hands either. My first implementation actually expected that to work, but it failed miserably because of the select data we get from the device.
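For context, a hedged sketch of why a hand-ray approach fails there: on visionOS Safari, the hand input sources carry joint data for rendering, but the select events arrive on a separate transient-pointer source that only appears while a pinch is active. So any code driving selection from a ray has to pick that transient source when it exists. The helper below is purely illustrative (made-up name, not a Babylon internal):

```javascript
// Given the session's current input sources, return the one whose ray
// should drive selection. On Vision Pro, selection arrives on a
// transient-pointer source created at pinch start; hand sources never
// deliver usable ray/select data. (Illustrative helper, not a Babylon API.)
function selectionSource(inputSources) {
  // Prefer the transient-pointer source if one exists (a pinch is active).
  const transient = inputSources.find(
    (s) => s.targetRayMode === "transient-pointer"
  );
  if (transient) return transient;
  // Otherwise fall back to a persistent ray source (e.g. Quest controller/hand).
  return (
    inputSources.find((s) => s.targetRayMode === "tracked-pointer") || null
  );
}
```

Between pinches on the Vision Pro, this returns `null`: there is simply no ray the engine could draw or pick with.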
Actually, I created this small playground and ran it on the Apple Vision Pro in immersive mode: https://playground.babylonjs.com/#ZI9AK7#4163
It seems that in an older version of Babylon.js the hand tracking was almost working. I mean, the hand motion was OK and there was also a ray coming out of the hand (one at a time), but there was no pinch detection, and when a pinch happened the ray would move a bit together with the fingers (not so stable). See my video for version 5.71.1: https://www.youtube.com/watch?v=6NEa0xqwtI4
In the latest version of Babylon the implementation is much different, mostly in the way you have described above in this long thread. It is almost nice, because it simulates the way the headset works, but with this behavior it is really hard to hover over objects, and when you pinch your hand and hold it, the cursor goes all over the screen. It works best with “Head tracking”. “Eye tracking” is really hard to control, at least with my eyes, and with “Wrist” and “Finger” tracking the cursor is not so accurate in the directions it selects. See my video for version 7.26.3: https://www.youtube.com/watch?v=Gh3pJYMso9U
So, to bypass all these experimental trackings from Apple, it would be really nice to have an alternative control scheme for the headset, like the one in the older versions, but with pinching working as well, of course.
I am a developer, but not specialized enough in this 3D world to go deep into the inner workings of Babylon.js. If I can provide some feedback to make this happen, I would be happy to.
I totally get that, but almost working is sadly not fully working. Apple does not provide us with the data to fully support this kind of ray. The only way to get the events working correctly is the way Apple wants us to work. It’s a sad reality with Apple…