Hi,
I am using the scene.createDefaultXRExperienceAsync
API and testing on an Oculus Quest 2. How can I access the controllers pose?
Thanks,
Mark
Pinging @RaananW (just FYI, he is on vacation, so please bear with us ;))
I have code for that. I have stopped for the day, though. I got my info from:
You need to get your controllers through an onControllerAddedObservable, because they do not show up until you enter XR.
Then I just started looking around at what I could get using my IDE's IntelliSense.
Ok, here is the code to get the controllers. You can get the position / rotationQuaternion in the scene's before-render callback, if you need it every frame. Hands are also controllers; I have a different process for them, so there is code to filter them out.
// somewhere in your code
let leftController : BABYLON.AbstractMesh;
let rightController : BABYLON.AbstractMesh;

BABYLON.WebXRDefaultExperience.CreateAsync(scene, options).then((defaultExperience) => {
    defaultExperience.input.onControllerAddedObservable.add((controller : BABYLON.WebXRInputSource) => {
        // hands are controllers too; skip this code when it is a hand
        const isHand = controller.inputSource.hand;
        if (isHand) return;

        controller.onMotionControllerInitObservable.add((motionController : BABYLON.WebXRAbstractMotionController) => {
            const isLeft = motionController.handedness === 'left';
            controller.onMeshLoadedObservable.add((mesh : BABYLON.AbstractMesh) => {
                if (isLeft) leftController = mesh;
                else rightController = mesh;
            });
        });
    });
});
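For reference, here is a minimal sketch of reading that pose every frame, assuming the scene, leftController, and rightController variables from the snippet above and using the scene's onBeforeRenderObservable:

// minimal sketch: read the controller pose once per frame
scene.onBeforeRenderObservable.add(() => {
    if (!rightController || !rightController.rotationQuaternion) return;
    // world-space position of the controller mesh
    const pos = rightController.absolutePosition;
    // orientation of the controller mesh
    const rot = rightController.rotationQuaternion;
    // ... use pos / rot here, e.g. to place or aim another mesh
});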
Thank you all for your prompt replies!
@JCPalmer I am looking for a way to get the rotation of the WebXR hands. My goal is to attach a mesh to the palm of the hand, e.g. a sword, so that it rotates with the palm of my hand. I have figured out how to get the position from the hand joints, but I am lost on how to get the rotation. You mentioned in your response above that you have a different process for hands. I would be curious what that is.
I am not sure my process for hands would be of any use. I tried to work on hands too early, when the tracking data was still excrement.
I was trying to detect gestures (like finger snapping), place buttons along the forearms, & implement touch for my own 3D controls. Along the way, I transformed the hands from GLB to embedded, a saner way to deal with skeletons than GLB imo.
It failed for reasons beyond just bad tracking data. Not knowing where the elbow is greatly limits accurate control placement. Snap detection was the only thing that always worked, given the data at the time.
For you, I would get the positions of two of those spheres that correspond to joints, which are there even if not shown. The difference between the position of the one at the wrist and the one just before the middle finger forms a direction, which might be converted to a rotation; a sketch of that idea is below.
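Here is a minimal sketch of that idea, assuming the WebXR hand-tracking feature is enabled, that hand is a BABYLON.WebXRHand obtained from that feature, and that the sword mesh and the exact joints chosen are placeholders:

// minimal sketch: orient a mesh along the wrist -> middle-finger direction
// assumes `hand` is a BABYLON.WebXRHand from the hand-tracking feature
// and `sword` is the mesh to keep aligned with the palm
const wristJoint = hand.getJointMesh(BABYLON.WebXRHandJoint.WRIST);
const middleJoint = hand.getJointMesh(BABYLON.WebXRHandJoint.MIDDLE_FINGER_METACARPAL);

scene.onBeforeRenderObservable.add(() => {
    const wristPos = wristJoint.absolutePosition;
    const middlePos = middleJoint.absolutePosition;

    // place the mesh at the wrist and aim it along the palm direction
    sword.position.copyFrom(wristPos);
    sword.lookAt(middlePos); // converts the wrist -> middle-finger direction into an orientation
});

lookAt points the mesh's local +Z axis at the target, so the sword model may need a fixed rotation offset (or a parent TransformNode) depending on how it was authored.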