Unity’s XR Toolkit Enhances Eye and Hand Tracking

The latest release of Unity’s XR Interaction Toolkit (XRI) substantially bolsters hand and eye tracking, a development likely to benefit both commercial and enterprise applications. Unity shipped the previous version of the toolkit, the 2.3.0 upgrade, early in the year, introducing hand- and eye-tracking functionality, and the most recent edition advances both technologies. XRI 2.4.0 now has an upgraded XR Gaze capability that allows eye tracking to be combined with controller inputs. Although this may appear to be a bit of a step back, the company describes the function as more of a “fallback” for when controllers are not actually available.

Thanks to the new feature, users can control a cursor with their eyes while selecting things with the controller buttons. Unity has also announced “additions and improvements” to the toolkit’s hand-tracking functionality. Version 2.4.0 adds a new tracking function that provides a library of predefined hand poses to help with building XR apps. It also incorporates dynamic visuals, which should make the user’s virtual hands and fingers more believable. The major changes in release 2.4.0 include:

  • Visual improvements. The XR Interactor Line Visual has been modified so that it automatically conforms to the currently active user-selected interactable. This makes using the Ray Interactor feel more natural and streamlines the user experience. Beyond the graphical updates, the performance of the line visual and ray interactor has been improved by optimising most of the line computation maths for the Burst compiler; to take advantage of these optimisations, users will need to add the Burst package to their project. Release 2.4.0 also introduces the XR Transform Stabiliser, which applies refined stabilisation methods to eliminate pose jitter, making ray-based aiming and selection easier (see the first sketch after this list). The prefabs that comprise the Starter Assets have been updated to match.
  • Gaze and aim assistance. Expanding on the gaze interaction capabilities introduced in the previous version, XR Gaze Assistance automatically switches to eye gaze for primary aiming and selection when the user is off-screen or pointing off-screen. With this module, the user can aim and make selections using both eye gaze and a joystick (a configuration sketch follows this list). XR Gaze Assistance also functions as an aiming aid: a thrown object’s trajectory is automatically adjusted so that it lands closer to the target being looked at.
  • Hand interactions. The new XR Input Modality Manager handles dynamic switching between tracked hands and controllers, and the sample prefabs included in the package have been updated to use it (a sketch of reacting to these switches follows this list). Unity has revamped the Hands Interaction Demo with fresh visuals that respond to the strength of the user’s press. The demo’s XR Origin Hands (XR Rig) prefab has been upgraded to use a separate prefab for each hand visual, each featuring affordances that draw attention to the fingers as they are used. The XR Device Simulator has also received a significant upgrade that allows hand tracking to be simulated: a set of common, predefined hand poses makes in-editor testing of hand-based interactions easier.
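
As a concrete illustration of the stabiliser mentioned above, here is a minimal sketch of routing a Ray Interactor through a stabilised transform. XRTransformStabilizer and XRRayInteractor.rayOriginTransform ship with the toolkit, but the stabiliser property names used here (targetTransform, positionStabilization, angleStabilization) are our reading of the 2.4.0 scripting API, and in practice this wiring is usually done in the Inspector rather than in code.

```csharp
using UnityEngine;
using UnityEngine.XR.Interaction.Toolkit;

// Minimal sketch: aim a ray interactor from a stabilised pose instead of
// the raw, jittery controller pose. Property names are assumed from the
// 2.4.0 API -- verify them against the XRI scripting reference.
public class StabilisedRaySetup : MonoBehaviour
{
    [SerializeField] XRRayInteractor rayInteractor;
    [SerializeField] Transform trackedController; // raw pose source

    void Start()
    {
        // Intermediate object whose pose follows the controller, smoothed.
        var stabilised = new GameObject("Stabilised Ray Origin");
        var stabilizer = stabilised.AddComponent<XRTransformStabilizer>();
        stabilizer.targetTransform = trackedController;  // pose to smooth (assumed name)
        stabilizer.positionStabilization = 0.25f;        // positional smoothing, metres (assumed)
        stabilizer.angleStabilization = 20f;             // angular smoothing, degrees (assumed)

        // Point the ray at the world from the stabilised pose.
        rayInteractor.rayOriginTransform = stabilised.transform;
    }
}
```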
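For the gaze fallback, a sketch along these lines could wire the assistance component to a gaze interactor at runtime. XRGazeAssistance and XRGazeInteractor are toolkit components, but the property names used here (gazeInteractor, fallbackDivergence) and the divergence value are assumptions to check against the package documentation.

```csharp
using UnityEngine;
using UnityEngine.XR.Interaction.Toolkit;

// Minimal sketch: enable eye-gaze fallback for aiming and selection.
// These fields are normally assigned in the Inspector; the names here
// are our reading of the 2.4.0 API.
public class GazeFallbackSetup : MonoBehaviour
{
    [SerializeField] XRGazeInteractor gazeInteractor;

    void Start()
    {
        var assistance = gameObject.AddComponent<XRGazeAssistance>();
        assistance.gazeInteractor = gazeInteractor; // eye-gaze source (assumed name)
        assistance.fallbackDivergence = 60f;        // degrees off-axis before gaze takes over (assumed)
    }
}
```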
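And to react when the modality manager swaps between tracked hands and controllers, a script can subscribe to its mode-change events. The event names below (trackedHandModeStarted, motionControllerModeStarted) reflect our understanding of the 2.4.0 XRInputModalityManager and should be verified against the scripting reference.

```csharp
using UnityEngine;
using UnityEngine.XR.Interaction.Toolkit.Inputs;

// Minimal sketch: log whenever XRI switches input modality, e.g. to swap
// tutorial hints between hand gestures and controller buttons.
public class ModalityLogger : MonoBehaviour
{
    [SerializeField] XRInputModalityManager modalityManager;

    void OnEnable()
    {
        modalityManager.trackedHandModeStarted.AddListener(OnHands);
        modalityManager.motionControllerModeStarted.AddListener(OnControllers);
    }

    void OnDisable()
    {
        modalityManager.trackedHandModeStarted.RemoveListener(OnHands);
        modalityManager.motionControllerModeStarted.RemoveListener(OnControllers);
    }

    void OnHands() => Debug.Log("Switched to hand tracking");
    void OnControllers() => Debug.Log("Switched to motion controllers");
}
```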

The new Climb Locomotion Provider and Climb Interactable in version 2.4.0 make it possible for players to grip onto and pull themselves up a series of climbable objects. These can be used to build ladders, climbing walls, and even monkey bars in any orientation (a setup sketch appears below). The Climb Sample prefab and a Climb Provider instance have been added to the Starter Assets’ XR Origin Preconfigured sample for use in the demo scene. Furthermore, interactables now have a new focus state. When an interactable is selected, it becomes the focal point of the user’s attention and remains so until either another interactable is selected or the user deselects it by choosing an inactive element. A focused object can then be modified in a new user-interface panel, for example by changing its size or colour (a sketch of listening for focus changes also appears below). The release also reworks the toolkit settings to make them easier to access and manage, and updates the interaction layers, which now need to be selected manually.
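
A minimal sketch of making an object climbable might look like the following. It assumes a ClimbProvider already sits on the XR Origin (the Starter Assets demo scene ships one) and that this GameObject has a Collider serving as the grab area; the climbProvider field name is our assumption about the 2.4.0 API.

```csharp
using UnityEngine;
using UnityEngine.XR.Interaction.Toolkit;

// Minimal sketch: turn a ladder rung into a climbable grab point.
public class LadderRungSetup : MonoBehaviour
{
    [SerializeField] ClimbProvider climbProvider; // locomotion provider on the XR Origin

    void Start()
    {
        // Requires a Collider on this GameObject to define the grab area.
        var rung = gameObject.AddComponent<ClimbInteractable>();
        rung.climbProvider = climbProvider; // moves the XR Origin while held (assumed name)
    }
}
```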
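Listening for the new focus state can then drive that kind of properties panel. Here is a minimal sketch that toggles a hypothetical panel while an object holds focus; the focusEntered/focusExited events follow our reading of the 2.4.0 interactable API, and propertiesPanel is a placeholder for whatever UI your app uses.

```csharp
using UnityEngine;
using UnityEngine.XR.Interaction.Toolkit;

// Minimal sketch: show a (hypothetical) properties panel while an
// interactable is focused, hide it when focus moves elsewhere.
public class FocusPanel : MonoBehaviour
{
    [SerializeField] XRBaseInteractable interactable;
    [SerializeField] GameObject propertiesPanel; // placeholder UI for resizing/recolouring

    void OnEnable()
    {
        interactable.focusEntered.AddListener(OnFocusEntered);
        interactable.focusExited.AddListener(OnFocusExited);
    }

    void OnDisable()
    {
        interactable.focusEntered.RemoveListener(OnFocusEntered);
        interactable.focusExited.RemoveListener(OnFocusExited);
    }

    void OnFocusEntered(FocusEnterEventArgs args) => propertiesPanel.SetActive(true);
    void OnFocusExited(FocusExitEventArgs args) => propertiesPanel.SetActive(false);
}
```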

Major industry players such as Microsoft and Magic Leap are increasingly using the hand- and eye-tracking capabilities of XR to help create a more efficient workplace, and the enhancements in this release should prove extremely valuable to developers building cutting-edge apps for those platforms. The additions are also a major benefit to users who find conventional hardware controllers difficult to use, making Unity more inclusive. XRI 2.4.0 is a big step forward in bringing eye and hand tracking to applications, and we at Unity Developers are excited.