Are there any existing models for WebXR hands available?

A-Frame has a hand controller component, hand-controls – A-Frame, that renders a basic hand; you can configure it to be low poly or high poly, change the color, and it even animates into a pointing hand (trigger up, grip down) or a fist (both trigger and grip down).
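For reference, this is roughly how that component is used (a sketch; hand-controls is normally declared as HTML attributes on an a-entity, but the same thing can be done from JavaScript):

const leftHand = document.createElement('a-entity');
leftHand.setAttribute('hand-controls', 'hand: left; handModelStyle: lowPoly; color: #ffcccc');
document.querySelector('a-scene').appendChild(leftHand);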

I would like to know if there is something similar in the Babylon.js community I can use to build off of, or do I need to build my own … requiring me to learn how to model hand meshes and learn skeleton and bone animations first in order to accomplish this?

Thanks!


pinging @RaananW

Hand meshes already exist; I am working on integrating them. They will be ready in a week or two (I need to clear some time :slight_smile:).

Unless something is totally off, we will have them for the 4.2 release.


That is awesome and exciting to hear!

I feel like being able to make pointing gestures and to pick up and throw things will be basic activities that many immersive experiences will need.

What will the API look like? Will it be possible to plug in other hand meshes and utilize custom animations, for example loading a gun or holding a sword?

We are planning both simple gesture support and a very advanced one. Info coming “soon” :slight_smile:

On an unrelated matter, here is a random tweet - https://twitter.com/babylonjs/status/1308851589290037250

So is this related to XR hand tracking? The reason I ask is that you would not really need gestures if you were tracking the wearer's actual hands. Either way, I would recommend that the skeleton be very friendly to the one expected for hand tracking, so as not to dig a compatibility hole for yourself.

Edit:
Also, I am doing A LOT with hands. I am going with a Kinect V2 skeleton with its hands replaced by MakeHuman hands. I have standardized on the T-pose, specifically for the hands, as shown from above in BJS:

The reason for the T-pose is that the bend of each finger is accomplished without any kind of helper rig: since each finger is aligned with an axis, the result is very exact poses. I also place the prior bone, K2-Hand.L or K2-Hand.R in my case, exactly on the X axis. All this means that with a minimum of poses I can fabricate any gesture in real time. Shown from Blender along with the list of poses:

To accomplish the “any pose” part, I break out each pose by finger at load time, and can then animate / interpolate each finger independently. Final general-purpose gestures can be achieved, like the ones shown here. That scene is very old, from when I was still using morphing for hands.

I also use the interpolator to fabricate composites of individual poses for a finger at various percentages, then store the result under the same name for each finger.
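In plain Babylon terms (an illustrative sketch with assumed names such as fingerBones, restPose, and fistPose, rather than my actual pipeline), blending one finger between two stored poses looks something like this:

// Slerp each bone of one finger between two stored poses (arrays of quaternions)
function blendFinger(fingerBones, restPose, fistPose, t) {
  fingerBones.forEach((bone, i) => {
    const q = BABYLON.Quaternion.Slerp(restPose[i], fistPose[i], t);
    bone.setRotationQuaternion(q, BABYLON.Space.LOCAL);
  });
}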

Not sure how much of this you can use, but the T-pose is definitely your friend. If you have stuff in Blender, I can generate in-line mesh subclasses for the geometry, but I am not doing a PR. For meshes at this level, you would not want some dorky load process & callback, though I suppose you could always store the file as text.


I didn’t mean actual hand tracking of a person’s real hands. As an experimental feature on the Quest, real hand tracking is still very wonky for me. I find it hard to control, and there is no vibration feedback or the sense that you’re actually grabbing onto something.

I’m referring to using the Oculus Quest controllers, such as in a first-person shooter game, where the player needs to see his own hands and be able to grab guns or ammo and load them, etc., as well as make a handful of gestures: “punch”, “point”, “thumbs-up”, that kind of thing I guess.

I’m a newb, so I haven’t yet explored animations with bones/skeletons, so what you’re talking about is still beyond me. Looks like the short answer is: “first learn Blender modeling, Blender animation, how to export them to Babylon.js (what format is best, .babylon or glTF?), and how to trigger the animations when controller buttons are pressed”. Guess I have a lot of videos to watch. :slight_smile:

Oh! Interesting idea. I guess we will be able to use the hand meshes that are used for hand tracking and emulate their movement when the controller is touched/pressed. I doubt this will arrive for 4.2, but I will be happy to investigate further after the release. Want to create an issue for this?

I’d be happy to. WebXR option to display hand mesh instead of controllers · Issue #9089 · BabylonJS/Babylon.js · GitHub


I’ve got a Quest 2 on pre-order. I am primarily interested in hand tracking. I will probably be generating the geometry / skeleton and trying to get tracking to work late this month.

Hi, I was looking for this feature when I came across this request. Looking at GitHub, it seems like this may be coming in version 5. Is this still the case? Is there a timeframe for when it may be available in the alpha version?
Thanks

It came very late in the last version. The API has changed since, so that will not work anymore. It is almost fixed in 5.0 alpha. Although I use custom meshes, I got them from the default ones. You do not have to make your own. It is now documented.

The API change does have the side effect of failing when you leave immersive mode and return, though.

Hi. I am also looking for this feature. It would be very good for the user experience to have the same feeling as the Oculus “Echo VR” game, for example. The ability to see hands moving according to the trigger buttons pressed is very immersive.
Can you tell me what I can do to have hand meshes move according to the buttons pressed on the controller? Thanks a lot.

This is a requested feature that can be tracked here - WebXR option to display hand mesh instead of controllers · Issue #9089 · BabylonJS/Babylon.js (github.com)

I didn’t get a chance to look into it yet


It’s been a little while, just checking in. It looks like a lot of progress has been made on the

BABYLON.WebXRFeatureName.HAND_TRACKING

It’s a very cool-looking hand model! I would like to borrow that hand tracking feature’s mesh as the hand displayed in non-hand-tracking mode (i.e., in lieu of the regular Quest 1 controllers).
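For anyone else landing here, enabling the feature itself looks roughly like this as far as I can tell from the docs (xr is the result of createDefaultXRExperienceAsync, and 'latest' just requests the newest version of the feature):

const handTracking = xr.baseExperience.featuresManager.enableFeature(
  BABYLON.WebXRFeatureName.HAND_TRACKING,
  'latest',
  { xrInput: xr.input }
);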

Here is my plan:

  1. I will hide the default controller profile mesh.
  2. I will then load the model for the hand-tracking feature and parent it to the position of the hand controllers.
  3. I will add some observers to the component button values and trigger/grip squeeze intensities and bend the bones in the hands accordingly to approximate the pose of making a fist or pointing etc.
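Very roughly, I picture step 3 looking something like the sketch below ('xr-standard-trigger' and 'xr-standard-squeeze' are the standard WebXR component ids, and bendFingers is a made-up helper that would rotate the finger bones of whatever hand mesh I end up loading):

xr.input.onControllerAddedObservable.add((controller) => {
  controller.onMotionControllerInitObservable.add((motionController) => {
    const trigger = motionController.getComponent('xr-standard-trigger');
    const squeeze = motionController.getComponent('xr-standard-squeeze');
    // trigger bends the index finger toward a pointing pose
    trigger?.onButtonStateChangedObservable.add((component) => {
      bendFingers(['index'], component.value); // value runs from 0 to 1
    });
    // squeeze curls the remaining fingers toward a fist
    squeeze?.onButtonStateChangedObservable.add((component) => {
      bendFingers(['middle', 'ring', 'pinky'], component.value);
    });
  });
});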

I’m a bit stuck on step one. I know that I can use this option to prevent any controller meshes from loading at all:

const xr = await scene.createDefaultXRExperienceAsync({
  inputOptions: {
    doNotLoadControllerMeshes: true,
  },
});

However, I don’t think that’s what I want because that prevents the onModelLoadedObservable and onMeshLoadedObservable callbacks from executing. And I rely on the loaded mesh to get the position and rotation of the hand:

inputSource.grip.onAfterWorldMatrixUpdateObservable.add((grip) => {
  // get pos and rotation of this grip thing,
  // it is my controller's hand pos and rot
});

Unless there is a better way of getting the hand positions and rotations, I need to keep the default model loading and just hide it after it has loaded. I also parent a little hand menu to the left controller.

Next, I tried hiding the hand controller model inside the onModelLoadedObservable callback. The problem there is that it is not obvious which mesh I need to hide. There seems to be more than one mesh/transform node, and there is a hierarchy of different buttons and parts. So what I did was get the thing called ‘grip’, e.g. named ‘controller-0-tracked-pointer-right-grip’.

I then mapped over all its child mesh descendants and set their visibility to zero or called setEnabled(false) on them. For some reason, even though the properties are correctly set on all the children, they are still visible in the scene. They did disappear if I put in a bit of delay using setTimeout, but not without the delay. It seemed like there might be a race condition somewhere, as if setting visibility or setEnabled to false before the children were fully loaded was unable to correctly hide them. :man_shrugging:
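Roughly, the hiding attempt looks like this (a sketch; scene and motionController are whatever is in scope in my real code, and the grip node name is the one from my scene):

motionController.onModelLoadedObservable.add(() => {
  const grip = scene.getTransformNodeByName('controller-0-tracked-pointer-right-grip');
  // setting these right away does not hide anything for me...
  grip?.getChildMeshes(false).forEach((mesh) => mesh.setEnabled(false));
  // ...but the same loop inside a setTimeout does, which is why I suspect a race
});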

However, this delay isn’t a good idea because the timing is arbitrary, and, as I said, I parent a little menu to my left hand; if I hide all the descendants of the ‘grip’ after an arbitrary delay, the menu bound to the hand gets hidden too.

Do y’all have any suggestions?