Use your hands in your virtual and augmented reality experiences
Finger tracking enables users to see virtual avatars of their hands that precisely mirror their real hand and finger movements. In many use cases, this is key to truly interacting with the model and even touching* virtual objects, thanks to millimeter-accurate tracking of finger position and movement.
*touch = the sensation can be rendered through haptic feedback, depending on the selected device. Visual feedback is also possible through 3D collision detection.
Feel each part of your model at your fingertips
TechViz finger tracking allows you to add external sensors such as tracking gloves and force-feedback devices. While interacting with your CAD model without controllers, you will be able to feel vibrations or encounter physical resistance when grabbing virtual objects.
How does finger tracking work?
Finger tracking happens when the augmented or virtual reality headset, external cameras, or dedicated finger-tracking sensors capture your hands' and fingers' position, speed, and orientation. The tracking data is then processed by your VR software into a real-time representation of your fingers and their interactions with the CAD model inside the virtual world.
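As an illustration, the per-frame data such a pipeline delivers can be modeled as a simple pose structure, which the software then tests against virtual geometry for collision-based touch feedback. This is a minimal hypothetical sketch, not TechViz's actual API; all names here are invented for the example.

```python
from dataclasses import dataclass
import math

@dataclass
class Joint:
    # 3D position of one tracked finger joint, in meters
    x: float
    y: float
    z: float

@dataclass
class HandPose:
    # one pose sample delivered by the tracking system each frame
    joints: list      # list of Joint (e.g. the five fingertips)
    timestamp: float  # capture time, in seconds

def fingertip_touches(pose, center, radius):
    """Toy 3D collision test: does any tracked joint fall inside
    a spherical virtual object? A positive result is what would
    trigger visual or haptic touch feedback."""
    for j in pose.joints:
        if math.dist((j.x, j.y, j.z), center) <= radius:
            return True
    return False

# a fingertip 1 cm from the object's center, object radius 2 cm
pose = HandPose(joints=[Joint(0.01, 0.0, 0.0)], timestamp=0.0)
print(fingertip_touches(pose, (0.0, 0.0, 0.0), 0.02))  # True
```

In a real system this test would run against the full CAD mesh every frame, but the principle is the same: tracked joint positions are compared against virtual geometry to detect contact.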
The AR/VR system natively includes finger tracking
The tracking is performed by the head-mounted display's own cameras; this is called inside-out tracking. It compares pixel differences between successive camera frames to determine your fingers' movements.
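The pixel-difference idea can be sketched in a few lines: subtract two consecutive grayscale frames and average the coordinates of the pixels that changed. This is a toy illustration of the principle, not a real tracker (production systems use far more robust computer-vision models).

```python
def motion_centroid(prev_frame, curr_frame, threshold=30):
    """Locate movement between two grayscale frames (lists of pixel
    rows, values 0-255) by averaging the coordinates of pixels whose
    intensity changed by more than `threshold`. Returns (x, y) or
    None if nothing moved."""
    moved = [(x, y)
             for y, (prow, crow) in enumerate(zip(prev_frame, curr_frame))
             for x, (p, c) in enumerate(zip(prow, crow))
             if abs(p - c) > threshold]
    if not moved:
        return None
    n = len(moved)
    return (sum(x for x, _ in moved) / n, sum(y for _, y in moved) / n)

# a bright "fingertip" moves from column 1 to column 3 in a 1-row frame
prev = [[0, 255, 0, 0]]
curr = [[0, 0, 0, 255]]
print(motion_centroid(prev, curr))  # (2.0, 0.0)
```

The centroid lands between the old and new fingertip positions; tracking the centroid across frames gives the direction and speed of motion.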
This is the case for several VR headsets, such as the HTC Vive Focus 3 and the Meta Quest 2.
The AR/VR system relies on the Ultraleap SDK
The immersive virtual reality system features Ultraleap’s optical tracking software.
It can either be integrated into the head-mounted display's own cameras (as in the Varjo XR-3 and VR-3) or provided by an external tracking module (as with the Pimax Vision 5K and 8K and the Pico Neo 3).