Are there any existing models for WebXR hands available?

A-Frame has a hand controller component, hand-controls – A-Frame, that renders a basic hand which you can configure to be low poly or high poly and recolor, and it even animates to a pointing hand (trigger up, grip down) or a fist (both trigger and grip down).

I would like to know if there is something similar in the Babylon.js community that I can build off of, or do I need to build my own, which would require me to first learn how to model hand meshes and how skeleton and bone animations work?

Thanks!

pinging @RaananW

Hand meshes already exist; I am working on integrating them. They will be ready in a week or two (I need to clear some time 🙂).

Unless something is totally off, we will have them for the 4.2 release.

That is awesome and exciting to hear!

I feel like pointing gestures and being able to pick things up and throw them will be basic activities that many immersive experiences will need.

What will the API look like? Will it be possible to plug in other hand meshes and utilize custom animations, for example loading a gun or holding a sword?

We are planning both simple gesture support and a very advanced one. Info coming “soon” 🙂

On an unrelated matter, here is a random tweet - https://twitter.com/babylonjs/status/1308851589290037250

So is this related to XR hand tracking? The reason I ask is that you would not really need gestures if you were tracking the wearer’s actual hands. Either way, I would recommend that the skeleton be very friendly to the one expected for hand tracking, so as to not dig a compatibility hole for yourself.
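
For reference, the WebXR Hand Input spec defines 25 named joints per hand, so a skeleton that maps cleanly onto those names should avoid that hole. A quick TypeScript sketch of the list (the constant name is just mine):

```ts
// Joint names defined by the WebXR Hand Input spec (25 per hand).
// A hand skeleton that maps cleanly onto these avoids compatibility
// problems if real hand tracking is added later.
const WEBXR_HAND_JOINTS: string[] = [
  "wrist",
  "thumb-metacarpal", "thumb-phalanx-proximal",
  "thumb-phalanx-distal", "thumb-tip",
  ...["index-finger", "middle-finger", "ring-finger", "pinky-finger"].flatMap(
    (finger) => [
      `${finger}-metacarpal`,
      `${finger}-phalanx-proximal`,
      `${finger}-phalanx-intermediate`,
      `${finger}-phalanx-distal`,
      `${finger}-tip`,
    ]
  ),
];
```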

Edit:
Also, I am doing A LOT with hands. I am going with a KinectV2 skeleton whose hands are replaced with MakeHuman hands. I have standardized on the TPOSE, specifically for the hands, as shown from above in BJS.

The reason for the TPOSE is that the bend of each finger can be accomplished without some kind of helper rig, because each finger is aligned with an axis, which results in very exact poses. I also place the prior bone, K2-Hand.L or K2-Hand.R in my case, exactly on the X axis. All this means that with a minimum of poses I can fabricate any gesture in real time. Shown from Blender along with the list of poses.

To accomplish the “any pose” part, I break out each pose by finger at load time, and can then animate / interpolate each finger independently. Final general-purpose gestures can be achieved, like the ones shown here. That scene is very old, from when I was still using morphing for hands.

I also use the interpolator to fabricate composites of individual poses for a finger at various percentages, then store the result under the same name for each finger.
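
Boiled down, the per-finger blending looks something like this (a sketch; the FingerPose shape and how poses get captured at load time are simplified, the Babylon calls are Quaternion.Slerp and Bone.setRotationQuaternion):

```ts
import { Bone, Quaternion, Space } from "@babylonjs/core";

// Simplified per-finger pose: one local rotation per bone of that finger,
// captured relative to the TPOSE at load time.
type FingerPose = Quaternion[];

// Blend one finger between two stored poses and apply the result to its
// bones. t = 0 gives poseA, t = 1 gives poseB; each finger can be driven
// with its own t, independently of the others.
function blendFinger(bones: Bone[], poseA: FingerPose, poseB: FingerPose, t: number): void {
  for (let i = 0; i < bones.length; i++) {
    const rotation = Quaternion.Slerp(poseA[i], poseB[i], t);
    bones[i].setRotationQuaternion(rotation, Space.LOCAL);
  }
}
```

The composites are just the output of a blend like this stored back under a pose name and reused later.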

Not sure how much of this you can use, but the TPOSE is definitely your friend. If you have stuff in Blender, I can generate in-line mesh sub-classes for the geometry, but I am not doing a PR. For meshes at this level, you would not want some dorky load process & callback, though I suppose you could always store the file as text.

I didn’t mean actual hand tracking of a person’s real hands. As an experimental feature on the Quest, real hand tracking is still very wonky for me. I find it hard to control, and there is no vibration feedback or the sense that you’re actually grabbing onto something.

I’m referring to using the Oculus Quest controllers, such as in a first-person shooter game, where the player needs to see their own hands and be able to grab guns or ammo, load them, etc., as well as make a handful of gestures: “punch”, “point”, “thumbs up”, that kind of thing.

I’m a newb, so I haven’t yet explored animations with bones/skeletons, and what you’re talking about is still beyond me. Looks like the short answer is: “first learn Blender modeling, Blender animation, how to export to Babylon.js (what format is best, .babylon or glTF?), and how to trigger the animations when controller buttons are pressed”. Guess I have a lot of videos to watch. 🙂

Oh! Interesting idea. I guess we will be able to use the hand meshes that are used for hand tracking and emulate their movement when the controller is touched/pressed. I doubt this will arrive for 4.2, but I will be happy to investigate further after the release. Want to create an issue for this?
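
The controller side of that would look roughly like this; just a sketch, assuming the actual hand animation (animation group, bone blending, or morph targets) is driven by a callback you provide, since that part is not built in:

```ts
import { WebXRDefaultExperience } from "@babylonjs/core";

// Sketch: read trigger/squeeze values from each controller and forward them
// as 0..1 curl weights for the corresponding hand. How the weight actually
// drives the hand mesh is left to the rest of the scene.
function wireHandAnimations(
  xr: WebXRDefaultExperience,
  onCurl: (hand: string, finger: "index" | "grip", amount: number) => void
): void {
  xr.input.onControllerAddedObservable.add((controller) => {
    controller.onMotionControllerInitObservable.add((motionController) => {
      const handedness = controller.inputSource.handedness; // "left" | "right" | "none"

      const trigger = motionController.getComponent("xr-standard-trigger");
      trigger?.onButtonStateChangedObservable.add((component) => {
        onCurl(handedness, "index", component.value); // 0 released .. 1 fully pressed
      });

      const squeeze = motionController.getComponent("xr-standard-squeeze");
      squeeze?.onButtonStateChangedObservable.add((component) => {
        onCurl(handedness, "grip", component.value);
      });
    });
  });
}
```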

I’d be happy to. WebXR option to display hand mesh instead of controllers · Issue #9089 · BabylonJS/Babylon.js · GitHub

I’ve got a Quest 2 on pre-order. I am primarily interested in hand tracking. I will probably be generating the geometry / skeleton and trying to get tracking to work late this month.

Hi, I was looking for this feature when I came across this request. Looking at GitHub, it seems like this may be coming in version 5. Is this still the case? Is there a timeframe for when it may be available in the alpha version?
Thanks

It came very late in the last version. The API has changed since, so that will not work anymore. It is almost fixed in the 5.0 alpha. Although I use custom meshes, I got them from the default ones, so you do not have to make your own. It is now documented.
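
For anyone finding this now, the documented setup looks roughly like the sketch below; note the default hand meshes are glTF files, so the glTF loader has to be registered:

```ts
import { Scene, WebXRFeatureName } from "@babylonjs/core";
import "@babylonjs/loaders/glTF"; // default hand meshes are glTF, so the loader must be registered

// Enable the built-in hand-tracking feature, which loads the default hand
// meshes for you, so there is no need to model your own.
async function enableHands(scene: Scene): Promise<void> {
  const xr = await scene.createDefaultXRExperienceAsync({
    uiOptions: { sessionMode: "immersive-vr" },
  });

  xr.baseExperience.featuresManager.enableFeature(
    WebXRFeatureName.HAND_TRACKING,
    "latest",
    { xrInput: xr.input } // ties the feature to the XR input sources
  );
}
```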

The API change does have the side effect of failing when you leave immersive mode and return, though.

Hi. I am also looking for this feature. It would be very good for the user experience to have the same feeling as in the Oculus “Echo VR” game, for example. The ability to see hands moving according to the trigger buttons pressed is very immersive.
Can you tell me what I can do to have hand meshes move according to the buttons pressed on the controller? Thanks a lot.

This is a requested feature that can be tracked here - WebXR option to display hand mesh instead of controllers · Issue #9089 · BabylonJS/Babylon.js (github.com)

I haven’t gotten a chance to look into it yet.
