XR Experience Design - UX with hands


Current VR and XR headsets such as the Oculus Quest 2 and Pro, the Pico 4 and the Focus 3 offer hand tracking as a standard interaction and thus as an alternative to controllers. Hand-based operation is becoming increasingly relevant, especially for mixed reality applications. For XR applications in particular, which show the user their actual surroundings thanks to the passthrough function, it makes sense to use an interaction that is as natural as possible.

I've put together some information and links that have helped me in current production. For about three months, my team and I have been working on a serious game in virtual reality that uses hand tracking as the standard interaction for operating the GUI and for navigation. Various gestures are built into the game to help the player move freely despite being in a seated position.

XR control concept with IR sensor

XR interaction with the hands had its first big rise back in 2017, mainly due to the first version of the LeapMotion: a compact little infrared tracker that plugs directly into the VR headset via USB. This makes it relatively easy to transfer the tracking data to a hand rig in Unity or Unreal, onto which a 3D model is superimposed to match the rig. With this additional device, users can interact in virtual reality and mixed reality with their hands instead of a controller. The newer version is called UltraLeap and has been built directly into some devices (e.g. Varjo headsets).
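
To make the idea of "transferring tracking data to a rig" more concrete, here is a minimal sketch, not tied to any specific SDK: per-frame joint data is copied onto the matching bones of a hand rig, and the skinned hand mesh then follows the rig. The data structures and names are illustrative assumptions, not a real tracker API.

```python
from dataclasses import dataclass

@dataclass
class TrackedJoint:
    name: str          # e.g. "index_tip"
    position: tuple    # (x, y, z) in headset space
    rotation: tuple    # quaternion (x, y, z, w)

@dataclass
class Bone:
    name: str
    position: tuple = (0.0, 0.0, 0.0)
    rotation: tuple = (0.0, 0.0, 0.0, 1.0)

def apply_hand_frame(joints: list[TrackedJoint], rig: dict[str, Bone]) -> None:
    """Copy each tracked joint onto the matching rig bone for this frame."""
    for joint in joints:
        bone = rig.get(joint.name)
        if bone is None:
            continue  # the rig has no bone for this joint
        bone.position = joint.position
        bone.rotation = joint.rotation
```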

Interaction with the fourth dimension

Sounds crazy at first, but this also started a few years ago. However, one should be under no illusion: it does not yet work particularly effectively or stably. The intuitive interface operation demonstrated by Tom Cruise in Minority Report cannot yet be achieved, not even approximately. There is simply not enough software and hardware support yet. In addition, the corresponding user awareness has not developed: only through very frequent, cross-application establishment of gestures does an awareness of this kind of control emerge.

Every smartphone user knows, for example, that swiping from top to bottom refreshes an application; usually the scroll position must be at the top for this to work. This NUI "refresh gesture" has become common knowledge. The same applies to the "scroll" gesture (swiping from left to right and vice versa) and to operating the so-called drawer to open general menus: you pull down from the top edge of the screen, and on Apple devices also up from the bottom, to quickly reach simple toggles such as mute, airplane mode, Wi-Fi, etc.

On VR, AR and XR devices there are no standard gestures, except for the pinch gesture (bringing thumb and index finger together) that became established with the HoloLens. Cross-device hand tracking, however, was not natively integrated until 2021. Meta's approach, termed Hand Tracking 2.0, tracks the hands using the cameras on the headset (the same cameras that sense the environment) and is proving to be very effective. More on this, especially from a technical perspective, in the talk by Robert Wang (Director of Research Science @ Meta). Don't worry, you only have to watch the first 13 minutes.

The presentation helps to understand the process of hand detection with AI enhancement.
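
As a rough illustration of the pinch gesture mentioned above, here is a minimal sketch that detects a pinch from tracked fingertip positions. The threshold value and the data layout are assumptions for illustration, not taken from any particular SDK.

```python
import math

# Illustrative threshold: fingertips closer than ~1.5 cm count as a pinch.
PINCH_THRESHOLD_M = 0.015

def is_pinching(thumb_tip: tuple, index_tip: tuple) -> bool:
    """Return True if thumb and index fingertip are close enough to count as a pinch."""
    return math.dist(thumb_tip, index_tip) < PINCH_THRESHOLD_M

# Example: positions in meters, in the same tracking space.
print(is_pinching((0.10, 1.20, 0.30), (0.11, 1.20, 0.30)))  # True (~1 cm apart)
```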

Operating interfaces with your hands

From a UX/UI perspective, hand tracking belongs to the Natural User Interface (NUI) category as a "natural" way of operating interfaces. Although hand tracking has been possible since 2015, operating applications with the hands has hardly been used in VR and XR, which may be due to the lack of precision of earlier implementations on the one hand and the lack of use cases on the other. A few guidelines for designing hand-operated interfaces:

  • The "physical" design of interactive elements in VR should offer special usage possibilities.
  • Every interactive object should respond to any movement.
  • Necessary gestures should be clearly described by text-based prompts.
  • Interactive elements should be scaled appropriately.
  • Place text and images on slightly curved, concave surfaces (see the placement sketch below).
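
For the last point, here is a minimal sketch of placing UI panels on a concave arc around a seated user. The radius, arc width and panel height are illustrative values, not guidelines from any SDK.

```python
import math

def curved_panel_positions(count: int, radius: float = 1.2, arc_deg: float = 60.0,
                           height: float = 1.5) -> list[tuple]:
    """Place `count` UI panels on a concave arc facing the user at the origin."""
    positions = []
    for i in range(count):
        # Spread panels evenly across the arc, centered in front of the user.
        t = 0.5 if count == 1 else i / (count - 1)
        angle = math.radians((t - 0.5) * arc_deg)
        x = radius * math.sin(angle)
        z = radius * math.cos(angle)  # +z = forward
        positions.append((x, height, z))
    return positions

# Example: three panels within a 60-degree arc at 1.2 m distance.
print(curved_panel_positions(3))
```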

"No interface" is the best interface, and interaction with the hands always feels more natural. Still, implementing hand-operated applications should be carefully planned, so a pro and con evaluation is worthwhile.

Pro: When does hand tracking make sense?

For short interactions and productivity applications, hand tracking can be ideal. But this form of interaction can also be effective and straightforward in games, fitness applications, and marketing and trade show applications.

The "Ok, do handtracking" list

  • Reducing hardware complexity (no additional controllers needed)
  • Objects that are "briefly" interacted with
  • For gears, cranks, buttons and switches
  • Gestures and "power moves" (global effects)
  • To trigger animations

Con: When does hand tracking not make sense?

As soon as it comes to grasping and holding virtual objects for longer periods, you should rely on the more precise controllers, unless a glove is available for haptic feedback.

The "Don't do handtracking" list

  • Shooter, held ranged & firearms
    • Pistols
    • Rifles
  • Cut and thrust weapons, held melee weapons
    • Sword
    • Axe
    • Spear
  • Climbing and gripped climbing
  • Precise drawing and modeling

Gloves for haptic / perceptual grabbing

There should be a solution that makes longer grasping possible, right? Actually, there is. If you paid close attention while watching Minority Report, you'll have noticed that Tom Cruise wears gloves for interaction. Several gloves are already available that use exoskeletons, or at times more compact plastic builds around a glove, to let users combine interaction with haptic feedback.

This means when something is grabbed in the virtual world, it feels real to the user because the glove simulates resistance.

The Manus Quantum Metagloves, priced at $5,999, support high-precision finger tracking through absolute fingertip positioning.

As already outlined with the allusion to Minority Report, haptic gloves allow very precise capture of finger and hand movements in real time. This means that modeling, drawing, or even operating complex robotic systems or machines is entirely possible; even medical applications can benefit from this. Unfortunately, the purchase price ($1,000 to $7,000) of these gadgets is currently still too high for the consumer market.

XR Experiences with hand tracking

Meta's First Hand demo demonstrates the effectiveness of Hand Tracking 2.0.

Official hand tracking demo using the Presence Platform Interaction SDK

Controlling VR with gestures instead of controller buttons is still rather experimental. As already described in the course of this post, this is due to users' expectations and the availability of devices that support hand tracking. That means you should anticipate and plan for a learning curve right from the start.

However, there is a very limited number of gestures that users can remember. Therefore, when developing motion controls, it makes sense to build on a small base set of gestures, or to combine gestures for advanced functions.

An even better approach is to develop specific prompts (e.g., a rendering of a gesture) that users respond to, so that a specific pose or movement does not have to be learned by heart.

Since people express their body language physically in different ways, gestures often don't work the same for everyone. One person overstretches his hand when grasping, while another turns his wrist in slightly.

Through testing, comparing, and observing, I've found that the best way is to capture the user's own gestures: show the user the gesture, prompt them to repeat it, and then calibrate against the gesture the user actually made.
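
A minimal sketch of what such a calibration step could look like, assuming the runtime exposes fingertip positions relative to the wrist. The feature vector (fingertip-to-wrist distances) and the tolerance are illustrative choices, not the project's actual implementation.

```python
import math

def hand_features(fingertips: list[tuple], wrist: tuple) -> list[float]:
    """Reduce a hand pose to fingertip-to-wrist distances (simple, per-hand features)."""
    return [math.dist(tip, wrist) for tip in fingertips]

def calibrate_gesture(samples: list[list[float]]) -> list[float]:
    """Average several repetitions of the prompted gesture into a per-user template."""
    count = len(samples)
    return [sum(values) / count for values in zip(*samples)]

def matches(template: list[float], current: list[float], tolerance: float = 0.02) -> bool:
    """True if every fingertip distance is within `tolerance` meters of the template."""
    return all(abs(t - c) < tolerance for t, c in zip(template, current))

# Usage: prompt the user to repeat the gesture a few times, then match live poses.
# template = calibrate_gesture([hand_features(f, w) for f, w in recorded_repetitions])
# if matches(template, hand_features(live_fingertips, live_wrist)): trigger_action()
```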

The future of hand tracking & gesture control

The steady improvement in the precision of hand tracking can increasingly be used to establish gestures. Mixed reality use cases in particular are ideal for this, and it does not only depend on headsets: research already demonstrates the effectiveness of gesture control on smartphones. In a direct comparison with the HoloLens 2, smartphone-based HGR (Hand Gesture Recognition) proved to be just as effective.

The VR action roguelite Ghost Signal: A Stellaris Game will support Hand Tracking 2.0 when it releases in early 2023!

AI development made significant progress in 2022 in particular; native hand tracking support, for example, has become standard. Assuming more virtual content like Ghost Signal is now actually designed to be operated with hand tracking, there is not much standing in the way of the next evolutionary stage of operable interfaces.

The increasing availability of eye tracking will also play a significant role in interaction with virtual content; not only for interacting, but also for effective UX testing without taking the user out of the experience.

But there is still a long way to go from the current state to hand tracking controls as comfortable as, say, mouse and keyboard. What can be said with certainty: we won't be sitting in a completely empty room in the future, and office furnishings will largely remain. The fact is, a comfortable seated position and good support are very important for working productively over longer periods.

Sources and references

MagicLeap / UltraLeap
Manus
Meta