Meta drops hand tracking

At the time of writing this brief manual, Meta’s hand tracking update has reached version 2.2. It went relatively unnoticed, perhaps because everyone is busy programming their very useful Second Life clone. However, Meta has truly dropped a bomb.

Compared to the previous update, Meta’s system no longer loses tracking, and, most importantly, the two hands are now tracked even when they touch or overlap each other, enabling a whole array of actions that were previously impossible. In short, you can now shake hands with your virtual friend, assuming you’ve managed to make one in here.

The fundamental point that doesn’t seem to spark much interest is the ergonomics of this kind of progress. Entering the virtual world with full control of one’s hands allows for greater inclusivity. Without resorting to usability wizardry terms, I’m simply saying that getting Grandma to use the headset becomes a bit easier. If you’d like it a bit more academic: during a fall, our instinctive reaction is to extend our hands forward. You don’t need to learn commands; as Meta puts it, it’s something visceral, an innate interaction.

So why are there so few experiments around, then? Does no one care to use these new tools to improve other people’s lives? Why is everyone still on Second Life? The answer exists, but the Metrohive study doesn’t know it; and surely those who dismiss the metaverse as wobbly web applications of questionable utility don’t care for it.

After this necessary introduction, let’s be clear: this article simply aims to be a compendium, or at least attempts to be one, so that you don’t get lost among technologies, guidelines, and development kits that all bear the same name.

These kits all have the same name

Oculus Integration and the Interaction SDK come in the same package! When you download Oculus Integration from the Asset Store, that package contains the entire family of Oculus SDKs, including the Voice SDK, the Movement SDK, and the Interaction SDK. Interaction is just one part of the Oculus Integration family, and it contains the code and components for hand tracking.

When you browse the package in Unity (or in that other, less recommended engine for VR that you foolishly chose to use), you will always see the OVR prefix in front of the classes.
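To give you an idea of what those OVR-prefixed classes look like in practice, here is a minimal sketch: a small MonoBehaviour that reads the index pinch from an OVRHand component (the one sitting on the hand anchors of the OVRCameraRig). OVRHand and its pinch methods are real parts of Oculus Integration; the logger itself is just an illustration, not official sample code.

```csharp
using UnityEngine;

// Minimal sketch: reading pinch state from an OVRHand component.
// Assumes an OVRHand (e.g. on the OVRCameraRig hand anchors) is assigned in the Inspector.
public class PinchLogger : MonoBehaviour
{
    [SerializeField] private OVRHand hand;   // OVR-prefixed class from Oculus Integration

    private bool wasPinching;

    private void Update()
    {
        // Skip frames where optical tracking has lost the hand.
        if (hand == null || !hand.IsTracked)
            return;

        bool isPinching = hand.GetFingerIsPinching(OVRHand.HandFinger.Index);
        float strength  = hand.GetFingerPinchStrength(OVRHand.HandFinger.Index);

        if (isPinching && !wasPinching)
            Debug.Log($"Index pinch started (strength {strength:0.00})");

        wasPinching = isPinching;
    }
}
```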

The XR Interaction Toolkit is Unity’s own SDK, with its own specific classes, and it is similar to Oculus Integration. They perform the same function, so if you’ve chosen to use both, either you know very well what you’re doing, or you don’t know at all. Just know that with this toolkit you can develop your product not only for Meta but also for multiple platforms.

But be aware: the XR Interaction Toolkit also has its own package for hand tracking, called XR Hands. It works well, but not as well as Meta’s in-house SDK. If you want to play it safe, use Meta’s kit. That said, going multiplatform currently wouldn’t make much sense anyway, since the Quest is the only device with good hand tracking available at the moment.
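For comparison, this is roughly what reading a joint looks like with Unity’s XR Hands package (com.unity.xr.hands). It assumes an XRHandSubsystem is already running (for example through OpenXR with hand tracking enabled); the component itself is only a sketch.

```csharp
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR.Hands;

// Minimal sketch of reading a joint pose through Unity's XR Hands package,
// the cross-platform counterpart to Meta's kit. Assumes an XRHandSubsystem is running.
public class IndexTipReader : MonoBehaviour
{
    private XRHandSubsystem handSubsystem;

    private void Start()
    {
        // Grab the first running hand subsystem, if any.
        var subsystems = new List<XRHandSubsystem>();
        SubsystemManager.GetSubsystems(subsystems);
        if (subsystems.Count > 0)
            handSubsystem = subsystems[0];
    }

    private void Update()
    {
        if (handSubsystem == null)
            return;

        XRHand rightHand = handSubsystem.rightHand;
        if (!rightHand.isTracked)
            return;

        XRHandJoint indexTip = rightHand.GetJoint(XRHandJointID.IndexTip);
        if (indexTip.TryGetPose(out Pose pose))
        {
            // The pose is in XR origin space; use it to drive cursors, signifiers, etc.
            Debug.Log($"Right index tip at {pose.position}");
        }
    }
}
```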

Copy and paste the code

From Meta’s App Lab, you can download two demos completely free of charge. The first one is called “First Hand,” and it’s a demonstration of what the Interaction SDK offers in its examples.

The second one is “Move Fast,” and in its absolute uselessness it shows you just how responsive the headset’s optical tracking has become. It doesn’t capture every movement perfectly, but it’s absolutely astonishing.

After trying out the demos, you can explore the actual project. Here’s the link to the GitHub repository: https://github.com/oculus-samples/Unity-FirstHand. You may notice that the project is not the complete app, just a small part of it. That part, however, is enough to understand how motion and gesture recognition have been implemented. For locomotion, you need to look in the “Example scenes” folder of the Interaction SDK examples.

For the rest, Meta’s documentation provides detailed explanations of the classes and functions in the development kit. The code within each example can be reused, and there’s a thorough section explaining the creation of poses and gesture recognition.

Hand tracking guidelines for Noobs

Below are some of the guidelines and best practices that Meta shared some months ago on its YouTube channel. These guidelines map directly onto the functions implemented in the kit’s classes, and they apply to both hand-tracked experiences and mixed reality applications.

Providing continuous feedback

Let’s remember that the user no longer has controllers, so goodbye buttons and sticks. This means that interactions must be extremely intuitive. I repeat: hands do not have buttons, so the user doesn’t know how or when to interact with the system. To put it simply, if I press a button I immediately see something happen: a gun fires, a menu opens, and so on. Without buttons, how do I understand which actions the system designer allows? We need to guide the user through affordance. I promised no elitist terms, and I too hate the tendency to make things incomprehensible, but “affordance” is quite charming.
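A concrete way to provide that continuous feedback is to treat pinch strength as a ramp rather than a binary click. The sketch below, assuming an OVRHand and some highlight Renderer assigned in the Inspector, fades a cursor colour in as the pinch closes, so the user sees the system reacting before the gesture even completes. Only OVRHand comes from the SDK; the rest is illustrative.

```csharp
using UnityEngine;

// Sketch of continuous feedback: as the pinch closes, a highlight fades in,
// so the user sees the system responding before the gesture completes.
public class PinchFeedback : MonoBehaviour
{
    [SerializeField] private OVRHand hand;
    [SerializeField] private Renderer cursorRenderer;       // hypothetical highlight object
    [SerializeField] private Color idleColor = Color.white;
    [SerializeField] private Color activeColor = Color.cyan;

    private void Update()
    {
        if (hand == null || !hand.IsTracked || cursorRenderer == null)
            return;

        // 0 = fingers apart, 1 = fully pinched; use the whole ramp, not just the end state.
        float strength = hand.GetFingerPinchStrength(OVRHand.HandFinger.Index);
        cursorRenderer.material.color = Color.Lerp(idleColor, activeColor, strength);
    }
}
```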

Affordance

I asked ChatGPT, and it says: “Affordance,” or “capability to offer,” refers to the property of an object or environment that suggests how it should be used or how it can be interacted with. In other words, affordance is what an object appears to suggest you can do with it, based on its shape, position, and other visible characteristics.

Clear, isn’t it? We all know how a door works, so we don’t have to learn how to open one. In our case, you simply need to inform the user that a certain action is possible by providing a visual cue (a signifier) and then give feedback while that action is being performed correctly.

If certain actions are tied to specific gestures in a broader context (e.g., opening the menu by pinching index finger and thumb), the designer must guide the user step by step, confirming each action with success feedback. This is commonly referred to as a tutorial, and it’s crucial.
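As a sketch of that kind of guided gesture, here is a hypothetical “pinch to open the menu” component: the gesture only fires after a short dwell time, and a UnityEvent is the hook where you would put the success feedback of your tutorial step. Only OVRHand and its pinch methods come from the SDK; the dwell time and the event are my own placeholders.

```csharp
using UnityEngine;
using UnityEngine.Events;

// Sketch of a guided "pinch to open the menu" gesture with success feedback.
public class PinchMenuGesture : MonoBehaviour
{
    [SerializeField] private OVRHand hand;
    [SerializeField] private float dwellSeconds = 0.3f;  // filters out accidental pinches
    [SerializeField] private UnityEvent onMenuGesture;   // e.g. open the menu + play a success cue

    private float heldTime;
    private bool fired;

    private void Update()
    {
        if (hand == null || !hand.IsTracked)
        {
            heldTime = 0f;
            fired = false;
            return;
        }

        if (hand.GetFingerIsPinching(OVRHand.HandFinger.Index))
        {
            heldTime += Time.deltaTime;
            if (!fired && heldTime >= dwellSeconds)
            {
                fired = true;
                onMenuGesture?.Invoke();   // the tutorial step confirms "you did it" here
            }
        }
        else
        {
            heldTime = 0f;
            fired = false;
        }
    }
}
```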

Creating constraints

The complete freedom of interaction that hand tracking gives can be overwhelming. It’s important to constrain these actions, for example with magnetic snap points that lock objects to set positions in the virtual space. Constraints are particularly useful for managing windows and UIs: instead of letting the user scroll virtual windows endlessly, implement a limiter on that range. Meta explains how, in their demo, they stopped the user from enlarging an object beyond a certain point, because it caused confusion.
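A constraint can be as simple as clamping whatever the interaction produces. The sketch below keeps an object’s scale and height inside designer-chosen limits after the interactors have done their work for the frame; the numbers are purely illustrative, not the values from Meta’s demo.

```csharp
using UnityEngine;

// Sketch of a simple constraint: whatever the grab/resize interaction does to this
// object, its scale and height stay inside designer-chosen limits.
public class ScaleAndHeightClamp : MonoBehaviour
{
    [SerializeField] private float minScale = 0.5f;
    [SerializeField] private float maxScale = 2.0f;   // don't let users blow objects up without limit
    [SerializeField] private float minY = 0.7f;       // roughly waist height
    [SerializeField] private float maxY = 1.8f;

    private void LateUpdate()
    {
        // Clamp after the interactors have moved or scaled the object this frame.
        float s = Mathf.Clamp(transform.localScale.x, minScale, maxScale);
        transform.localScale = Vector3.one * s;

        Vector3 p = transform.position;
        p.y = Mathf.Clamp(p.y, minY, maxY);
        transform.position = p;
    }
}
```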

Ergonomics and distances

When designing a hand-tracking application, it is crucial to place interactions near the user’s waist. This lets the arms rest, keeping roughly a 90-degree angle at the elbow, which allows a relaxed posture and a neutral stance. Remember that using the hands for interaction is physically demanding: the user shouldn’t have to reach too far to pick up an object or interact with the virtual world.

When designing the experience, take into account the space where each object will be placed. If an object is too far away, you can use what’s called “distance grab,” while for menus you can use raycasting. The distance our arms need to cover for a given action matters for the usability of the experience. Meta divides it into three levels: the first level is for the operations performed most frequently, while the third is for actions defined as occasional, placed at a greater distance.
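As a rough illustration of keeping frequent interactions in the comfortable zone, this sketch re-anchors a panel in front of the head, a bit below eye level and well within arm’s reach. The offsets are my own placeholder numbers, not Meta’s levels; the head transform would typically be the CenterEyeAnchor of the OVRCameraRig.

```csharp
using UnityEngine;

// Sketch of the "keep frequent interactions close and low" rule: a panel is
// re-anchored in front of the head, dropped toward elbow/waist height and
// kept within arm's reach.
public class ComfortZonePlacement : MonoBehaviour
{
    [SerializeField] private Transform head;                  // e.g. CenterEyeAnchor on the OVRCameraRig
    [SerializeField] private Transform panel;                 // the UI or interactable to place
    [SerializeField] private float forwardDistance = 0.45f;   // well inside arm's reach
    [SerializeField] private float heightOffset = -0.35f;     // below eye level, toward the waist

    private void LateUpdate()
    {
        if (head == null || panel == null)
            return;

        // Flatten the forward vector so the panel doesn't dive when the user looks down.
        Vector3 forward = Vector3.ProjectOnPlane(head.forward, Vector3.up).normalized;
        panel.position = head.position + forward * forwardDistance + Vector3.up * heightOffset;
        panel.rotation = Quaternion.LookRotation(forward, Vector3.up);
    }
}
```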

Raycasting and distance grab

The user’s neutral posture is crucial in terms of ergonomics, which is why there are methods that make it easier to interact with distant objects. “Distance grab” lets you emulate grabbing an object even if it’s beyond your physical reach. “Raycasting” is slightly different: think of it as a laser pointer that lets you interact with panels and buttons outside your immediate reach. Remember, however, that these methods should still follow the rules of affordance and signifiers for good usability.
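Here is a minimal, hand-rolled version of that pointer idea: a ray cast from OVRHand.PointerPose with the index pinch acting as the “click.” PointerPose and the pinch methods come from Oculus Integration; the LineRenderer signifier, the 5-metre range and the OnRayActivate message are assumptions of mine. The Interaction SDK ships its own, far more complete ray interactors, so treat this as an illustration of the concept rather than the way to do it.

```csharp
using UnityEngine;

// Sketch of hand raycasting: a ray from the hand's pointer pose, with pinch as "click".
public class HandRaySelector : MonoBehaviour
{
    [SerializeField] private OVRHand hand;
    [SerializeField] private LineRenderer rayVisual;   // the visual signifier for the ray
    [SerializeField] private float maxDistance = 5f;

    private void Update()
    {
        if (hand == null || !hand.IsTracked)
            return;

        Transform pointer = hand.PointerPose;
        Ray ray = new Ray(pointer.position, pointer.forward);
        Vector3 end = pointer.position + pointer.forward * maxDistance;

        if (Physics.Raycast(ray, out RaycastHit hit, maxDistance))
        {
            end = hit.point;
            // Pinch acts as the "click" on whatever the ray is pointing at.
            if (hand.GetFingerIsPinching(OVRHand.HandFinger.Index))
                hit.collider.SendMessage("OnRayActivate", SendMessageOptions.DontRequireReceiver);
        }

        if (rayVisual != null)
        {
            rayVisual.SetPosition(0, pointer.position);
            rayVisual.SetPosition(1, end);
        }
    }
}
```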

These paragraphs are a quick, essential summary of the guidelines provided directly by Meta at this address: https://developer.oculus.com/resources/hands-design-bp/. There is another interesting read about the Shared anchor space here.
