AR/XR Gesture & User Input Integration

We are seeking an experienced BabylonJS developer to help our team on a project that displays a terrain hologram in a room or on a desk, with zoom, translation, and rotation capabilities. We have designed the interface, but we lack the bandwidth to build the advanced user input features ourselves, so we need assistance implementing them for both traditional touch and XR environments.

Responsibilities:

  1. Touch Input Integration: Replace the existing mouse-and-keyboard interface with touch input for multi-touch-capable screens, including gestures to rotate, translate, and zoom (see the first sketch after this list).
  2. XR Gesture Implementation: Develop gesture-based input for XR environments (e.g., AR glasses or VR headsets) that lets users perform the same operations as the touch interface: zooming, rotating, and translating the map (see the second sketch after this list).
  3. Collaborate with our team to ensure seamless integration with the existing interfaces without degrading performance.
  4. Ensure compatibility across a range of devices, including desktops, tablets, and XR hardware.
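
To make the touch scope concrete, here is a minimal sketch of how the gestures might be wired up with BabylonJS pointer observables: single-finger drag translates, two-finger pinch scales, and two-finger twist rotates. This is illustrative only; `terrainMesh` and the screen-to-world factor are placeholder assumptions, not part of our existing codebase.

```typescript
import { Scene, PointerEventTypes, Mesh, Vector3, IPointerEvent } from "@babylonjs/core";

// Minimal multi-touch gesture handler. Tracks active pointers by pointerId
// so mouse, pen, and touch all flow through the same observable.
function attachTouchGestures(scene: Scene, terrainMesh: Mesh): void {
  const pointers = new Map<number, { x: number; y: number }>();

  const distance = () => {
    const [a, b] = [...pointers.values()];
    return Math.hypot(b.x - a.x, b.y - a.y);
  };
  const angle = () => {
    const [a, b] = [...pointers.values()];
    return Math.atan2(b.y - a.y, b.x - a.x);
  };

  let lastDistance = 0;
  let lastAngle = 0;

  scene.onPointerObservable.add((info) => {
    const e = info.event as IPointerEvent;
    switch (info.type) {
      case PointerEventTypes.POINTERDOWN:
        pointers.set(e.pointerId, { x: e.clientX, y: e.clientY });
        if (pointers.size === 2) {
          // Second finger down: snapshot the pinch baseline.
          lastDistance = distance();
          lastAngle = angle();
        }
        break;
      case PointerEventTypes.POINTERMOVE: {
        const prev = pointers.get(e.pointerId);
        if (!prev) break;
        if (pointers.size === 1) {
          // Single-finger drag: translate in the horizontal plane.
          // 0.01 is an arbitrary screen-to-world factor; tune per scene.
          terrainMesh.position.addInPlace(
            new Vector3((e.clientX - prev.x) * 0.01, 0, -(e.clientY - prev.y) * 0.01)
          );
        }
        pointers.set(e.pointerId, { x: e.clientX, y: e.clientY });
        if (pointers.size === 2 && lastDistance > 0) {
          const d = distance();
          const a = angle();
          terrainMesh.scaling.scaleInPlace(d / lastDistance); // pinch = zoom
          terrainMesh.rotation.y += a - lastAngle;            // twist = rotate
          lastDistance = d;
          lastAngle = a;
        }
        break;
      }
      case PointerEventTypes.POINTERUP:
        pointers.delete(e.pointerId);
        break;
    }
  });
}
```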
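
For the XR side, a comparable sketch below leans on Babylon's default WebXR experience helper plus its built-in drag and scale behaviors, so pinch gestures from tracked hands drive the same mesh manipulations as controllers. Again, `terrainMesh` is a placeholder name, and the behavior choices are one possible approach rather than a prescribed design.

```typescript
import {
  Scene, Mesh, WebXRFeatureName,
  SixDofDragBehavior, MultiPointerScaleBehavior
} from "@babylonjs/core";

// Sketch: enable immersive AR with hand tracking, then attach Babylon's
// built-in behaviors so a pinch-grab moves the hologram and a two-hand
// grab scales it.
async function enableXRGestures(scene: Scene, terrainMesh: Mesh): Promise<void> {
  const xr = await scene.createDefaultXRExperienceAsync({
    uiOptions: { sessionMode: "immersive-ar" },
  });

  // Hand tracking makes pinch gestures emit the same pointer events that
  // controllers do, so the mesh behaviors below work for both input types.
  xr.baseExperience.featuresManager.enableFeature(
    WebXRFeatureName.HAND_TRACKING,
    "latest",
    { xrInput: xr.input }
  );

  // One-hand pinch-and-drag translates (and can rotate) the hologram;
  // grabbing with both hands scales it.
  terrainMesh.addBehavior(new SixDofDragBehavior());
  terrainMesh.addBehavior(new MultiPointerScaleBehavior());
}
```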

Requirements:

  • Proven experience with BabylonJS.
  • Strong understanding of user input mechanisms (mouse, keyboard, touch, XR gestures).
  • Experience in developing for AR/XR environments, including hand tracking and gesture recognition.
  • Ability to deliver clean, maintainable code with attention to performance in web-based 3D environments.
  • Familiarity with holographic displays or terrain rendering is a plus.

Preferred Qualifications:

  • Experience working with WebXR or similar APIs.
  • Previous work with multi-touch interfaces and gesture-based interaction.
  • Understanding of 3D interaction design for XR.

Just a word from me: Working with @Guillaume_Pelletier will be a great privilege for anyone. This guy rocks big time!
