Hand tracking and GUI

A few days ago, with the help of @RaananW, I was able to set up a space with objects and manipulate them entirely with my hands using the Oculus hand tracking system. Then I wanted to go a little further by adding some GUI objects, like a color picker and a virtual keyboard. Both work very well using the pointer beams, but I wonder if there is a way to use my hands directly on those objects as well.
Is it possible to define some kind of physics on these graphical interface objects so that I can interact with them directly with my hands?
If so, what kind of interaction events should I be listening for?


Hi Gustavo_Font,

I think it’s definitely possible to implement yourself (by watching things like finger proximity, etc.) but I think at least some of it might be provided with Babylon already. I personally haven’t used things like the TouchHolographicButton yet, but it sounds similar to what you’re looking for. Will that work for your scenario?
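For reference, here is a minimal sketch of how a `TouchHolographicButton` might be wired up together with WebXR hand tracking, so the button can be pressed with a fingertip instead of a pointer beam. This assumes the `BABYLON` and `BABYLON.GUI` globals (e.g. from the CDN builds); the button label, its position, and the `scene` parameter are placeholders, not anything specific to this thread:

```javascript
// Sketch: a near-interactable 3D button plus hand tracking.
// Assumes the BABYLON / BABYLON.GUI globals; names are placeholders.
async function setupTouchButton(scene) {
  // The 3D GUI manager hosts holographic controls in the scene.
  const manager = new BABYLON.GUI.GUI3DManager(scene);

  // TouchHolographicButton supports "near" interaction, i.e. pressing
  // it directly with a tracked fingertip rather than a pointer beam.
  const button = new BABYLON.GUI.TouchHolographicButton("touchButton");
  manager.addControl(button);
  button.text = "Press me";
  button.position = new BABYLON.Vector3(0, 1.3, 1);
  button.onPointerDownObservable.add(() => {
    console.log("button pressed");
  });

  // Enable hand tracking on the default WebXR experience so the
  // tracked fingertips can drive the near interaction.
  const xr = await scene.createDefaultXRExperienceAsync();
  xr.baseExperience.featuresManager.enableFeature(
    BABYLON.WebXRFeatureName.HAND_TRACKING,
    "latest",
    { xrInput: xr.input }
  );

  return button;
}
```

On a device without hand tracking, the same button should still respond to the regular controller beams, so nothing is lost by opting in.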


I did not know about the TouchHolographicButton, thank you @syntheticmagus. I will look into what it is and how it works on an Oculus, and then I will report back.


@Gustavo_Font hi, just checking in! Did the TouchHolographicButton work for you? :smiley:

Hi, I finally implemented a GUI keyboard for that project. I did not get to test the TouchHolographicButton, but I think I am going to try it in another project now that you remind me.
I hope to have news soon :slight_smile:
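In case it helps anyone landing here: a minimal sketch of a 2D GUI virtual keyboard connected to an input field. This assumes the `BABYLON.GUI` global; the control names and sizes are placeholders, not details from the project described above:

```javascript
// Sketch: Babylon.js 2D GUI virtual keyboard wired to an InputText.
// Assumes the BABYLON.GUI global; control names are placeholders.
function setupKeyboard() {
  // Fullscreen UI layer that hosts the 2D controls.
  const ui = BABYLON.GUI.AdvancedDynamicTexture.CreateFullscreenUI("UI");

  // Text field that will receive the keystrokes.
  const input = new BABYLON.GUI.InputText("input");
  input.width = "300px";
  input.height = "40px";
  ui.addControl(input);

  // Default layout; connect() routes key presses from the
  // keyboard into the input field while it has focus.
  const keyboard = BABYLON.GUI.VirtualKeyboard.CreateDefaultLayout();
  keyboard.connect(input);
  ui.addControl(keyboard);

  return { input, keyboard };
}
```

Note this is the 2D GUI keyboard (rendered on a fullscreen or mesh-attached texture), so in XR it would still be driven by the pointer beams rather than near touch.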


I’m excited to see the results! :star_struck: