Ultraleap has launched Hyperion, replacing its Gemini software with a new product developed using insights gained from over a decade of machine learning.
The software offers new tracking modes that let customers tailor hand tracking performance to their use case: a low-power mode that runs hand tracking with reduced power consumption, and a high-performance mode that delivers accurate finger mapping with low latency when processing power is unrestricted.
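To make the trade-off concrete, here is a minimal sketch of how an application might choose between such modes at startup. The mode names and the selection heuristic are illustrative assumptions, not Ultraleap's published API; Hyperion's actual configuration interface may differ.

```python
from enum import Enum, auto

# Hypothetical mode names mirroring the two modes described above;
# the real Hyperion API may expose these differently.
class TrackingMode(Enum):
    LOW_POWER = auto()         # reduced power draw, e.g. for mobile XR headsets
    HIGH_PERFORMANCE = auto()  # accurate finger mapping with low latency

def pick_tracking_mode(battery_powered: bool, cpu_headroom: float) -> TrackingMode:
    """Choose a mode from the platform's power and compute budget.

    cpu_headroom is a 0.0-1.0 estimate of spare processing capacity.
    """
    if battery_powered or cpu_headroom < 0.3:
        return TrackingMode.LOW_POWER
    return TrackingMode.HIGH_PERFORMANCE

# Example: a tethered desktop rig with plenty of headroom.
print(pick_tracking_mode(battery_powered=False, cpu_headroom=0.8))
# -> TrackingMode.HIGH_PERFORMANCE
```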
Hyperion can now track microgesture interactions within XR down to the millimetre, picking up subtle finger movements that require minimal effort from the user.
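As an illustration of why millimetre-level accuracy matters for microgestures, the sketch below implements a simple pinch detector over fingertip positions. The coordinate format and the 6 mm threshold are assumptions for illustration, not Hyperion's implementation.

```python
import math

Point = tuple[float, float, float]  # x, y, z in millimetres

def is_pinch(thumb_tip: Point, index_tip: Point, threshold_mm: float = 6.0) -> bool:
    """Report a pinch when the thumb and index fingertips come within
    a few millimetres of each other.

    Millimetre-level tracking is what makes such low-effort gestures
    practical: at centimetre accuracy this threshold would sit inside
    the noise floor.
    """
    return math.dist(thumb_tip, index_tip) < threshold_mm

# Example frame: fingertips roughly 4.4 mm apart -> pinch detected.
print(is_pinch((10.0, 52.0, 300.0), (12.5, 50.0, 297.0)))  # True
```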
Hyperion also offers improved hand tracking while the user is holding an object, a key benefit for mixed reality: when the user grips a physical prop, such as a welding torch, hand tracking remains robust.
Hyperion enables direct access to the Leap Motion Controller 2's stereo IR camera hardware, expanding its use to other computer vision tasks such as depth sensing, 3D scanning, and object tracking. The Leap Motion Controller 2 camera can also track AR markers (also known as fiducial markers), enabling tracking of any object they are attached to.
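As a sketch of the kind of pipeline this direct camera access enables, the example below runs OpenCV's standard ArUco fiducial detector over a raw IR frame. How the frame is fetched from the device is elided, since Hyperion's camera-access API is not documented here; the sketch assumes the image is already available as an 8-bit NumPy array.

```python
import cv2
import numpy as np

# Stand-in for a raw IR frame obtained via Hyperion's direct camera
# access; here just a blank 8-bit grayscale image of a plausible size.
ir_frame = np.zeros((400, 640), dtype=np.uint8)

# Standard OpenCV ArUco detection (requires opencv-contrib-python >= 4.7).
dictionary = cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_4X4_50)
detector = cv2.aruco.ArucoDetector(dictionary, cv2.aruco.DetectorParameters())

corners, ids, _rejected = detector.detectMarkers(ir_frame)
if ids is not None:
    for marker_id, marker_corners in zip(ids.flatten(), corners):
        # marker_corners holds the four image-space corner points; with
        # the camera intrinsics these can be lifted to a 6-DoF pose.
        print(f"marker {marker_id}: corners {marker_corners.reshape(4, 2)}")
else:
    print("no markers in frame")
```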
With Hyperion, Ultraleap can further train its hand tracking models for specific use cases and custom-tune them within days rather than months, increasing a customer's speed to market.
It is also compatible with a wide range of hardware, including integrated XR headset cameras used for SLAM and peripherals such as Ultraleap's Leap Motion Controller 2. It will also support the requirements of the next generation of AR/MR devices with substantially smaller form factors, for example event-based sensors such as Prophesee's breakthrough GenX320 Metavision sensor.
Ultraleap CEO Tom Carter said: “Thanks to our machine learning and computer vision capabilities, Ultraleap Hyperion allows users to unlock the Leap Motion Controller 2’s broader computer vision capabilities, enabling a number of new tracking possibilities for our partners and customers. I’m excited to see what will be developed with these new computer vision capabilities and to work with our partners to create the next generation of spatial interactivity!”