Virtual Reality
February 15, 2022
5 minutes

Oculus Quest Hand Tracking

by Dejan Gajsek

Back in late 2019, the introduction of Oculus Quest hand tracking shook the virtual reality (VR) industry. Using your own fleshy hands instead of controllers, in real time? Sounds like a dream! While it's still early days for hand tracking, there is considerable hype among VR app developers, as it may open the door to exciting new possibilities in mixed reality technology.

Oculus Quest (now called Meta Quest) hand-tracking support seeks to give VR developers more freedom to create and express themselves through natural hand gestures and simple interactions. Over time, app developers may be able to leverage this feature to make virtual reality a more accessible, realistic experience for users.

Most AR headsets and more advanced VR headsets, such as the Oculus Quest, strive for so-called six degrees of freedom (6DOF), where you can move your hands and body freely through your space.

The Oculus Quest ships with Oculus Touch controllers, but with hand tracking enabled, the movements of your bare hands are tracked by the sensors in the headset itself. As a standalone headset there are no wires to deal with, and since your hands are the input device, there's no need to grab hold of the controllers either. It's one of the most exciting features of the device, and it's not surprising that both Quest models (64GB and 128GB) are sold out from the Oculus Store. The same goes for the Oculus Quest 2 (also known as Meta Quest 2), which improved the accuracy and reliability of hand recognition.

It's also why we'd recommend getting the Quest 2 rather than the outdated Oculus Rift or HTC Vive headsets.

Oculus, or rather Meta, is trying to become a platform for virtual reality content just like iOS or Android, and the best way to do that is to design and build experiences that feel familiar to newcomers and veteran dwellers in virtual reality alike.

To experience this groundbreaking feature, try a few of the VR hand tracking games where you use your real hands instead of controllers.

In this article, we'll explore the impact of the Oculus Quest hand tracking feature, look at current best practices, and consider how you can get a grip on this potentially groundbreaking technology to create a better user interface (UI).

Consider this article practical advice for developing your UX/UI design. Keep in mind that hand tracking is still in its early days and there's no "one size fits all" approach. Apart from the limitations of the system itself, the virtual experience of an application differs from user to user. That said, this content is a good starting point.

Defining Hand Tracking

So, the obvious question:

What is Oculus Quest controller-free hand tracking?

Hand tracking is a feature on the Oculus Quest head-mounted display (HMD) that enables people to use their hands as a viable input method when using the VR device. Users can make simple gestures with their hands, such as pinching, holding, or dragging to perform tasks or actions in the VR environment.

The critical aspect is that the tracking is fully articulated: the system can determine where your hands are in virtual space and what every finger and finger joint is doing at any moment. This design facilitates natural hand interactions, which give the user a heightened sense of presence and a more engaging and immersive experience.

Tracked motion controllers and the HoloLens 1 don't provide fully articulated hand data; they can't tell you where each of your fingers is pointing. Similarly, Valve's Knuckles controllers are not hand tracking technology, because they only measure finger curl along a single axis.

You can see good examples of fully articulated hand tracking in the Leap Motion and HoloLens 2, in smartphone SDKs such as MediaPipe and ManoMotion, and in VR gloves such as the Manus Prime and VRgluv.

Now, Oculus Quest hand tracking joins that list, which presents excellent opportunities for app developers.

New Hardware Demands New Paradigms

Good things happen when the product and design teams at Facebook Reality Labs put their heads together. What originally started as a research project has resulted in an innovative new paradigm for virtual reality input.

Design paradigms are the foundational concepts that underpin how we interact with software. In VR, this landscape changes dramatically, and hand tracking pushes that evolution further still.

The software behind Oculus Quest hand tracking incorporates deep learning, allowing the computer to determine the position of the user’s fingers, using only the Quest’s native monochrome cameras. The technology creates a group of 3D points that map to the user’s hands and fingers, enabling it to represent movement to an accurate degree in the VR environment.
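Meta hasn't published the internals of the Quest's tracking pipeline, but you can get a feel for the "group of 3D points" idea using MediaPipe, one of the smartphone hand-tracking SDKs mentioned earlier. Here is a minimal sketch, with a regular webcam standing in for the headset cameras:

```python
# Illustrative only: the Quest's internal hand-tracking model is not public.
# MediaPipe Hands exposes a similar idea: a set of 21 3D landmarks per hand.
import cv2
import mediapipe as mp

mp_hands = mp.solutions.hands

cap = cv2.VideoCapture(0)  # any webcam stands in for the headset cameras
with mp_hands.Hands(max_num_hands=2, min_detection_confidence=0.5) as hands:
    ok, frame = cap.read()
    if ok:
        # MediaPipe expects RGB images
        results = hands.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
        for hand in results.multi_hand_landmarks or []:
            # Each hand is a list of 21 normalized (x, y, z) keypoints:
            # the wrist plus four joints for each finger.
            tip = hand.landmark[mp_hands.HandLandmark.INDEX_FINGER_TIP]
            print(f"Index fingertip at ({tip.x:.2f}, {tip.y:.2f}, {tip.z:.2f})")
cap.release()
```

Each detected hand comes back as 21 keypoints (wrist plus four joints per finger), which is conceptually close to the fully articulated hand model a VR runtime needs.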

While the new Oculus feature is a leap forward for mixed reality, it poses a challenge for developers and users alike. All unexplored technology is untested, and therefore, it forces designers to think outside the box, as they effectively must come up with new ways of designing software for this unique hardware.

And let's not forget, the Oculus Quest 2 (by far the most popular virtual reality headset) has seen numerous improvements since its general release in October 2020, most notably easier integration with a development computer, first through Oculus Link and then, in a later software update, through Air Link. These improvements made Oculus Quest and Quest 2 owners extremely happy, since they expanded the features of the base model. An Oculus Quest 3 is also rumored for release in 2022.

The last major shift of this magnitude was when the world moved to a mobile-centric reality, swapping their desktop computers for smartphones and tablets. Developers had to forget about the mouse-based environment, and instead, think about touchscreens, as clicking was replaced by swiping.

Now, in the realm of VR development, app designers must figure out how to get the most out of the Oculus Quest technology using hands instead of controllers.



The Three Primary Ways of Interacting with Hand Tracking

When using Oculus Quest hand tracking, there are three ways you can interact with the virtual environment:

    1. Direct Manipulation
    2. Hand Rays
    3. Gesture Recognition

    Let's take a closer look at each one.

    Direct Manipulation

    Direct manipulation is a virtual reality input model where the user reaches out with their hands to touch and interact with virtual objects. Objects behave as they would in reality, so it's a fun and easy way of learning how to control a virtual world.

    You can press buttons, pick up objects, scroll windows, and activate 2D content as if it were a virtual touchscreen. Direct manipulation is a near input model, which means it is best when you want to interact with content within arm's reach.
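    To make this concrete, here is a minimal, engine-agnostic sketch of how a direct-manipulation button press can be detected from a tracked fingertip; the Button class and its thresholds are hypothetical, not part of the Oculus SDK:

```python
# Minimal sketch of direct manipulation: treat a button as "pressed" when the
# tracked index fingertip pushes past its front plane. Hypothetical names and
# thresholds, not Quest SDK API.
import numpy as np

class Button:
    def __init__(self, center, normal, radius=0.03, press_depth=0.01):
        self.center = np.asarray(center, dtype=float)   # metres, world space
        self.normal = np.asarray(normal, dtype=float)    # points toward the user
        self.radius = radius
        self.press_depth = press_depth

    def is_pressed(self, fingertip_position):
        offset = np.asarray(fingertip_position, dtype=float) - self.center
        depth = -np.dot(offset, self.normal)             # how far "into" the button
        lateral = np.linalg.norm(offset + depth * self.normal)
        return lateral < self.radius and depth > self.press_depth

button = Button(center=[0.0, 1.2, 0.5], normal=[0.0, 0.0, -1.0])
print(button.is_pressed([0.0, 1.2, 0.52]))  # fingertip slightly past the surface -> True
```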

    Challenges of Direct Manipulation

    While technology advances rapidly, there are some persistent issues with direct manipulation:

    1. Jitter is a small, rapid misalignment between the virtual hands and your actual hands. On a lot of AR platforms, the virtual hands are represented in abstract ways, such as shapes, clouds, or sparkles. This design is intentional: it hides the jitter effect so users don't pick up on any obvious jittering. (A simple smoothing sketch follows this list.)
    2. Drift is the feeling that you are moving in the virtual world, even while you stand still. Objects appear to move around you for no reason because of a constant offset of where the computer thinks your hand is compared to its real location. Bad lighting or too much light coming into the headset is a primary cause of drift issues.
    3. Grotesque Teleports are when a part of the virtual hands, like a finger, randomly appears inside the hand, or even completely disjointed somewhere else in the room. It may only last for a frame, as these errors happen when the technology misreads the environment and makes bizarre misjudgments. (It's also a really good name for a symphonic progressive rock band.)
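    Designers usually can't eliminate these tracking errors at the source, but smoothing the raw hand positions is a common, generic mitigation for jitter, at the cost of a little added latency. A minimal sketch, not tied to any particular SDK:

```python
# Minimal sketch: exponential smoothing of a tracked fingertip position to
# reduce visible jitter. Generic technique, not Quest SDK API; a lower alpha
# means smoother but laggier hands.
import numpy as np

def smooth_positions(raw_positions, alpha=0.3):
    smoothed = []
    current = np.asarray(raw_positions[0], dtype=float)
    for p in raw_positions:
        current = alpha * np.asarray(p, dtype=float) + (1.0 - alpha) * current
        smoothed.append(current.copy())
    return smoothed

# Noisy samples around a stationary fingertip
noisy = [[0.10, 1.20, 0.50], [0.11, 1.19, 0.51], [0.09, 1.21, 0.49]]
for p in smooth_positions(noisy):
    print(p)
```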

    The Importance of Interaction Resolution

    Together, these three problems are the key factors that determine interaction resolution: the minimum object scale that users can comfortably engage with in the virtual world without encountering detrimental visual or performance issues.

    If you have an externally tracked controller, like a Quest controller, you have incredible accuracy. You may find it possible to comfortably interact with items the size of a pinhead in the virtual environment. However, with hand tracking, the interaction resolution is not as good because of the issues above. Realistically, any object smaller than an inch will be hard to engage with.

    The quality and user interface of Oculus Quest hand tracking are impressive, but the technology still has room for improvement. It will get better over time; for now, though, there are limitations that designers must accept.

    Besides object size, you must also think about the length of your users' arms when designing a virtual world. With direct manipulation, you can only interact with objects you can reach. More to the point, consider your shortest-armed user and make sure not to overpack your virtual world with elements that may be out of reach.

    Hand Rays

    A hand ray is a virtual reality design concept where the user can “shoot” a beam from their hand at a distant object, and then use gestures to exercise control over that object from afar.

    (Image source: microsoft.github.io)

    This feature enables you to interact with the user interface in many ways, such as flipping switches, pushing buttons, or picking up items from across the room. You even get a form of natural haptic feedback as you touch your finger and thumb together to activate control of an object.

    Hand rays are typically anchored relative to your head or shoulder position, which adds a degree of stability, while your arms and hands adjust the angle through a broader range of motion.
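    A minimal sketch of the idea, assuming the ray is anchored near the shoulder or head and aimed through the hand (the exact anchoring the Quest runtime uses is an implementation detail):

```python
# Minimal sketch of a hand ray: the ray is anchored near the shoulder (or head)
# and points through the tracked hand, then is tested against a distant target.
# The anchoring and the spherical target are illustrative, not Quest SDK API.
import numpy as np

def hand_ray(anchor_position, hand_position):
    origin = np.asarray(anchor_position, dtype=float)
    direction = np.asarray(hand_position, dtype=float) - origin
    return origin, direction / np.linalg.norm(direction)

def ray_hits_sphere(origin, direction, center, radius):
    # True if the ray passes within `radius` of a spherical target in front of it.
    to_center = np.asarray(center, dtype=float) - origin
    along = np.dot(to_center, direction)
    closest = to_center - along * direction
    return along > 0 and np.linalg.norm(closest) <= radius

origin, direction = hand_ray(anchor_position=[0.2, 1.4, 0.0], hand_position=[0.3, 1.3, 0.4])
print(ray_hits_sphere(origin, direction, center=[0.5, 1.1, 1.2], radius=0.15))  # True
```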

    Challenges of Hand Rays

    Typically, hand rays don't feel as precise in a virtual environment as a mouse pointer does on a computer: small hand movements are amplified over distance, so distant targets need to be larger and more forgiving.

    Gesture Recognition

    Gesture recognition is a virtual reality design concept where the computer analyzes the pose of the user's hand, based on the position and shape of the fingers and palm, and then triggers a corresponding action. For example, the computer may recognize common signs, like the peace symbol or a rock 'n' roll sign (a small pose-matching sketch appears below).

    (Image: hand gesture recognition. Source: Medium / Vincent Mühler)
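    Here is a minimal sketch of pose-based gesture recognition built on fingertip keypoints; the "peace sign" pose and the distance threshold are illustrative choices, not values from the Oculus SDK:

```python
# Minimal sketch of pose-based gesture recognition from fingertip keypoints.
# The pose (index and middle extended, others curled) and the threshold are
# illustrative, not taken from the Quest SDK.
import numpy as np

FINGERS = ["thumb", "index", "middle", "ring", "pinky"]

def extended_fingers(fingertips, palm_center, threshold=0.09):
    """Return the fingers whose tip is far enough from the palm to count as extended."""
    palm = np.asarray(palm_center, dtype=float)
    return {
        name for name, tip in zip(FINGERS, fingertips)
        if np.linalg.norm(np.asarray(tip, dtype=float) - palm) > threshold
    }

def is_peace_sign(fingertips, palm_center):
    return extended_fingers(fingertips, palm_center) == {"index", "middle"}

palm = [0.0, 1.2, 0.5]
tips = [[0.03, 1.22, 0.5],   # thumb curled
        [0.00, 1.32, 0.5],   # index extended
        [0.02, 1.31, 0.5],   # middle extended
        [0.01, 1.24, 0.5],   # ring curled
        [0.00, 1.23, 0.5]]   # pinky curled
print(is_peace_sign(tips, palm))  # True
```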

    Challenges of Gesture Recognition

    For designers, gestures can be tricky to deal with, as there is a lot of room for error. You must consider the following:

    1. Unintentional gesticulations through subconscious movement or common body language. People moving their hands naturally while talking might cause the software to do something in response.
    2. Intentional operations in your world may infringe on the user's ability to use certain poses for other reasons. For example, if the user can activate a button in the world, but also uses a similar pushing movement elsewhere, it can confuse the system.
    3. Intentional gestures with inherent meaning, like the OK sign or a thumbs up, already carry an ingrained meaning in many cultures. Designers should be careful with these in their user interface, because the user expects a close one-to-one mapping between the meaning they already understand and the meaning the software will perceive.
    4. Intentional meaningless gestures are specific poses or movements that don't have any preconceived cultural meaning and are only performed intentionally. For example, Quest uses a deliberate pinch made while looking at your palm as a system gesture, which is unlikely to be confused with any natural body language.

    When are gestures justified?

    Designers must incorporate certain gestures to improve the user experience of their Oculus Quest hand tracking application. A prominent example of an essential gesture is the system escape. Users must have access to this ability at all times.

    When the inherent meaning of a gesture matches the outcome, it's worth including the gesture in your design. For example, if the user wants to take a screenshot, they can bring the tip of each thumb toward the forefinger of the opposite hand to create a square shape, as if they are looking through the frame of a camera.

    A further example of a justifiable gesture in a hand tracking application is when the gesture is the sole interaction in the interface. For instance, in a virtual darts game, the only interaction the user would be doing is the motion of throwing a dart.

    Oculus Quest Design Workshop

    Watch the free on-demand workshop about designing UI for hand tracking. The workshop is hosted by Eric Carter, former principal game designer at Oculus Studios and former design engineer behind HoloLens 2.

    Alternative Hand Tracking Ideas

    The latest developments in Oculus Quest hand tracking have got the VR/AR community talking, and many UI designers are chomping at the bit to push the boundaries of this feature.

    In doing so, they’ll quickly find the truth:

    We need new design paradigms.

    Virtual reality remains limited in many ways, as designers have yet to discover the paradigms that will maximize the potential of hand-tracking technology.

    Here are just some ideas that you can use when designing Oculus Quest hand tracking applications:

    • Poking
    • Tapping
    • Squeezing
    • Pinching
    • Dragging
    • Scratching
    • Throwing
    • Pulling
    • Pushing
    • Petting
    • Snapping
    • Finger walking
    • Finger skateboards
    • Finger painting
    • Flicking
    • Sign language
    • Crushing
    • Spinning top

    The list goes on and is only limited by the imagination of the designer. To get a feel for the possibilities of new paradigms in VR design, consider the examples below.

    1. Swipe Keyboard enables you to swipe your hands around a virtual keyboard. The software guesses what you are trying to write using machine learning technology, and the completed text appears on the screen as you continue swiping.
    2. Flying Hands makes it possible for you to interact with items that are far away. It is like a modified hand ray, where your hands are at the end of the ray, allowing near-field direct object manipulation on objects that are at a distance.
    3. Range/Size Amplification is another distance tool where the virtual reality hands are much bigger than the user's actual hands. This allows for large-scale manipulation, so you can control huge objects and entire scenes with relatively little movement (see the sketch after this list).
    4. Hand Throwing is a fun concept where you can throw your hand away from you, and then use gestures to crawl the detached hand around the floor or other surfaces to reach objects.
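    As a flavour of how simple some of these paradigms are underneath, here is a minimal sketch of the range/size amplification idea (idea 3 above); the gain value is an arbitrary illustration:

```python
# Minimal sketch of range/size amplification: virtual hand movement is the real
# movement scaled up around a pivot, so small physical motions drive large
# virtual ones. The gain value is illustrative.
import numpy as np

def amplified_hand_position(real_hand, pivot, gain=4.0):
    real_hand = np.asarray(real_hand, dtype=float)
    pivot = np.asarray(pivot, dtype=float)
    return pivot + gain * (real_hand - pivot)

# A 10 cm reach from the shoulder pivot becomes a 40 cm reach in the scene.
print(amplified_hand_position(real_hand=[0.2, 1.4, 0.1], pivot=[0.2, 1.4, 0.0]))
```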

    Some of these ideas may seem novel at best, but Daniel Beauchamp, the head of VR/AR at Shopify, explains that this is an essential road to evolution.

    “One of the best ways to unlock new and powerful ideas is to build upon silly ideas,” says Beauchamp.

    How to Start Designing Hand Tracking Applications

    To get to the groundbreaking ideas in VR/AR, you must first go through a lot of bad ideas. As you do this, think about how you can make improvements to hand tracking faster. After you set up your Oculus Quest for development, switch your mind into researcher mode.

    1. Start small and silly.

    Instead of trying to brainstorm a winning product with hand tracking instantly, you should play around with the technology, coming up with fun ideas that users enjoy. In doing this, you can get familiar with the capabilities of the tool, and may stumble upon interactions and concepts that you can apply to a more purposeful project.

    Beauchamp claims he wishes more VR developers took this approach, as the little ideas can lead to huge breakthroughs.

    “Don’t put the burden on yourself to build out a whole game or build out a whole product. Build many small things, no matter how silly they may seem. You’ll be surprised at just how much you learn.”

    2. Get some physical models.

    If someone walks into your office and sees a pair of fake hands on your desk, they may think you’re taking the silly approach a little too far. However, VR app developers should consider getting themselves some quality physical models.

    It's much easier to formulate ideas and explain your concepts to other team members when you have a set of hands. You can draw some axes on the back of the hands, which can make it easier to demonstrate different motions and suggested gestures with greater accuracy.

    3. Communication is crucial.

    It's vital to discuss design concepts and new ideas. Your vision for a virtual reality environment or application will only come to be if you talk openly about it with your team members.

    Make sure you have regular meetings or open platforms and project management tools that facilitate free-flowing discussion so you can bounce ideas around, and get the feedback needed to sculpt a rough brainwave into a polished concept.

    4. Experiment with rapid visualization tools

    Everybody in design knows that the faster you can get your designs visible, the better. Here are a few tools that UI designers can use to hone their skills in designing VR environments:

    1. Vectary takes 2D vectors and gives them depth to make them 3D. You can do this to quickly create a 3D world yourself, without needing a software engineer or high-level coding skills.
    2. Blocks by Google allows you to enter a VR world and use basic building blocks to construct shapes. You can then export your finished models directly into your game engine.
    3. Tilt Brush by Google is a similar tool to Blocks that allows you to create more complex, organic shapes through paint tools.

    5. Invest in Debug Tools

    Avoid the temptation to dive into VR design without a good debugging tool. You need to think about how you can improve the system and user interface, and so, a debugging tool is a vital aspect that will help you identify and eliminate issues quickly.

    6. Cut down your iteration time

    Time is often a significant constraint in VR design. Even if you're on top of the debugging issues, you must look for ways to reduce your iteration time.

    If you're using Oculus Link, you can use Quest hand tracking on your PC instead of building an app and deploying it to the Quest for testing. Through Oculus Link, your VR app will play live through Unity or Unreal.

    An alternative to this is Leap Motion, a camera that you can place on your desk or headset and that will enable everyone on the team to access hand tracking.

    Another idea is to configure ways of simulating hands through mouse or keyboard functions. Doing this means people in the team can quickly play with simple ideas on the fly, which speeds up the design process.
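    A minimal sketch of that idea: map the mouse cursor onto a plane at arm's reach and treat a held mouse button as a pinch. All names and the mapping are hypothetical, just enough for desk testing without a headset:

```python
# Minimal sketch of a mouse-driven hand simulator for desk testing, so teammates
# without a headset can drive a fake hand. The mapping and names are hypothetical.
from dataclasses import dataclass

@dataclass
class SimulatedHand:
    position: tuple          # world-space (x, y, z) in metres
    is_pinching: bool

def hand_from_mouse(mouse_x, mouse_y, mouse_button_down,
                    screen_width=1920, screen_height=1080, depth=0.4):
    # Map the cursor to a plane 0.4 m in front of the user, roughly arm's reach.
    x = (mouse_x / screen_width - 0.5) * 0.6          # ~60 cm of lateral travel
    y = (0.5 - mouse_y / screen_height) * 0.4 + 1.2   # centred around chest height
    return SimulatedHand(position=(x, y, depth), is_pinching=mouse_button_down)

print(hand_from_mouse(960, 540, mouse_button_down=True))
```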

    7. User Testing

    Using VR newbies for beta testing can prove unproductive. As they have no experience with the hardware, it's hard for them to get past the wow factor. Instead, you should use people with some VR experience, because they can give the negative feedback and constructive criticism that help you improve your app design.

    While many experts may have some bias toward certain paradigms or technology, they are more useful for testing and optimization assistance. To combat any bias, ask your testers to rely only on what they see or hear in the virtual environment. By playing dumb and acting solely on what the software tells them to do, users can identify flaws in the UI.

    The Future of Hand Tracking Design

    Oculus Quest hand tracking presents immense opportunities from education to enterprise. As virtual reality becomes ever-more integrated into the modern world, more companies want to tap into the raw potential of this technology.

    For now, there are still some kinks in the design. Still, UI designers and app developers can come together to create incredibly engaging and immersive VR environments for a diverse range of real-world uses.

    Communication within design teams is crucial, as is external communication with users, especially when it comes to testing and optimization. Developers must enlist the help of experienced VR users during testing, and leverage advanced debugging tools to reduce iteration times.

    Performance matters most, but fast production is also vital as the competition heats up to become the company that creates for VR what the "swipe" motion is for smartphones.

    You can learn more about designing for hands from Oculus or practice hand tracking techniques in Microsoft Design Labs Hands Playground for Quest and HoloLens 2.

    Download XR Development with Unity Course Program
