Eye tracking

Tutorial · Intermediate · 30 mins · Unity Technologies


Eye tracking is a feature of the Magic Leap 2 that allows you to track the user’s eye movements, blinks, fixation point, and other data. Using this information, you can make improvements to the user experience, such as activating or highlighting an object that the user looks at. This data can also be used for research purposes, such as determining what the user looks at or spends time focusing on in a given experience.

In this tutorial, you’ll learn how to implement eye tracking in your applications and respond to what the user looks at.


1. Overview

What is eye tracking?


The Magic Leap 2 has a capability called eye tracking that can detect the user's eye movements, blinks, fixation point, and other eye-related data.


In the example scene provided by Magic Leap, you can see some of the information captured by the device, including left and right eye center, gaze, fixation point, pupil size, and confidence values.



Why include eye tracking?


Information about the user’s eye movements can be utilized to enhance the user's experience by activating or emphasizing an object that the user looks at. Eye tracking can also be employed for research purposes, such as identifying what the user focuses on during an experience.



What you’ll learn in this tutorial


In this tutorial, you’ll learn how to implement eye tracking in your application and respond to what the user looks at.



You’ll also review an example where eye tracking is used alongside other features like meshing, occlusion, mapping, and spatial anchors to save sticky notes around the room. The sticky notes then automatically come closer to the user when the user looks at them.


2. Before you begin

Before you begin working through this tutorial, you should make sure your development environment is set up for Magic Leap 2 development and download the example prototype project.


Set up your development environment


If this is your first time developing in Unity for the Magic Leap 2, make sure you have your development environment configured properly using one of the following guided options:




Download and open the example prototype


This tutorial will guide you through the implementation of this feature in your own application using an example prototype. If you want to explore the code of the prototype yourself, download the accompanying prototype project and open it in Unity.


3. Basic eye tracking setup

Run the Custom Fit app


You can dramatically increase the accuracy of eye tracking by running the Custom Fit calibration on your Magic Leap 2 headset before you run the application.


Add an XR Rig to the Unity scene


Just like with any Magic Leap 2 project, the first step is to add the XR Rig prefab to the scene from the Magic Leap SDK package (Packages > Magic Leap SDK > Runtime > Tools > Prefabs) and then delete the Main Camera GameObject.



Enable the Eyes Controller GameObject


The XR Rig prefab provided by Magic Leap comes with an Eyes Controller GameObject that is configured to work just like another controller. You can use the fixation point to hover over, select, or activate objects just like you would with a controller. However, the Eyes Controller GameObject is inactive by default.


Expand the XR Rig GameObject, locate the Eyes Controller GameObject, and set it as active in the Inspector.


At the time of writing (September 2023), the eye-tracking functionality provided with the XR Interaction Toolkit is not fully supported on the Magic Leap 2 headset. We provide a modified script, named GazeRayInput.cs, that inherits from the XR Controller (Action-based) class. This script provides the same functionality but links the XR controller functions with the native eye-tracking features of the Magic Leap 2 SDK.


Download the script and copy it to your project assets folder.


Remove the current XR Controller component from the Eyes Controller and replace it with the modified script.



Notice that the Eyes Controller GameObject uses eyesData for input instead of left- or right-hand data. The Eyes Controller comes with an XR Ray Interactor component, a Line Renderer component, and an XR Interactor Line Visual component, just like a default controller in Unity’s XR Interaction Toolkit (XRI).
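If you want to read the eye data directly in a script instead of going through XRI, Unity’s generic XR input API exposes it through the eyesData feature usage. The following is a minimal sketch, assuming eye tracking permission has already been granted and tracking has been started; the device lookup and available feature values can vary by SDK version:

```csharp
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR;

public class EyeDataReader : MonoBehaviour
{
    private InputDevice eyesDevice;

    private void Update()
    {
        // Find the eye-tracking input device once it becomes available.
        if (!eyesDevice.isValid)
        {
            var devices = new List<InputDevice>();
            InputDevices.GetDevicesWithCharacteristics(
                InputDeviceCharacteristics.EyeTracking, devices);
            if (devices.Count == 0)
            {
                return;
            }
            eyesDevice = devices[0];
        }

        // eyesData bundles gaze, fixation point, eye openness, and related values.
        if (eyesDevice.TryGetFeatureValue(CommonUsages.eyesData, out Eyes eyes) &&
            eyes.TryGetFixationPoint(out Vector3 fixationPoint))
        {
            Debug.Log($"Fixation point: {fixationPoint}");
        }
    }
}
```

The Eyes struct also offers accessors such as TryGetLeftEyePosition and TryGetLeftEyeOpenAmount if you need per-eye values rather than the combined fixation point.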


When you enter Play mode, you’ll see additional rays coming from the XR Rig, representing the user’s eye gaze.



However, you will not see these rays if you build the application on your device, since the application requires the user to grant eye tracking permission first.


Request permission and initiate eye tracking


The user must explicitly grant permission for eye tracking at runtime since eye tracking is one of the device’s Dangerous (Runtime) permissions. To determine which permission is required for which feature, you can refer to the documentation on the Magic Leap 2 Developer Portal.


Create a new script named “EyeTracking” and add the code below, which performs the following actions:


  • Subscribes to the permission events and requests permission on Awake.

  • Starts eye tracking if permission is granted.

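A minimal version of such a script, sketched from the Magic Leap 2 Unity SDK permissions API (class and method names may differ slightly between SDK versions), could look like this:

```csharp
using UnityEngine;
using UnityEngine.XR.MagicLeap;

public class EyeTracking : MonoBehaviour
{
    private readonly MLPermissions.Callbacks permissionCallbacks = new MLPermissions.Callbacks();

    private void Awake()
    {
        // Subscribe to the permission events before requesting the permission.
        permissionCallbacks.OnPermissionGranted += OnPermissionGranted;
        permissionCallbacks.OnPermissionDenied += OnPermissionDenied;
        permissionCallbacks.OnPermissionDeniedAndDontAskAgain += OnPermissionDenied;

        // Eye tracking is a Dangerous permission, so it must be requested at runtime.
        MLPermissions.RequestPermission(MLPermission.EyeTracking, permissionCallbacks);
    }

    private void OnDestroy()
    {
        permissionCallbacks.OnPermissionGranted -= OnPermissionGranted;
        permissionCallbacks.OnPermissionDenied -= OnPermissionDenied;
        permissionCallbacks.OnPermissionDeniedAndDontAskAgain -= OnPermissionDenied;
    }

    private void OnPermissionGranted(string permission)
    {
        // Start the native eye-tracking feature once the user has granted permission.
        InputSubsystem.Extensions.MLEyes.StartTracking();
    }

    private void OnPermissionDenied(string permission)
    {
        Debug.LogError($"{permission} was denied; eye tracking will be unavailable.");
    }
}
```

Attach the script to any GameObject in your scene, such as the XR Rig.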

With the script above in your scene, the application should request permission for eye tracking and, if granted, you should see red lines at the center of your vision in each eye, tracking your eye movements.



4. Respond to the user’s fixation point

With the Eyes Controller activated and set up just like a regular controller, you can interact with interactable objects just like you would with an XRI controller. Now, when the user looks at an object, the device behaves in the same way that it does when the user hovers over it with a controller. When the user focuses on an object for an extended period of time, you can treat this action as a selection or activation event.


To set up a simple interactable object for your eyes controller functionality to interact with, follow these instructions:


1. In the XR Ray Interactor component for the Eyes Controller GameObject, enable the Hover to Select property, and set the Hover Time to Select property to a value between 1 and 2 seconds.


With this property enabled, users can use their eyes to select an object by looking at it for long enough.



2. Disable the Keep Selected Target property. This ensures the user can deselect an object by looking away from it.



3. Add a simple 3D object to your scene and add an XR Simple Interactable component to it. Increase the size of the collider around the object so that you don’t need to look directly at the object to activate it.



4. In the XR Simple Interactable component, expand Interactable Events and add some simple actions to the Hover or Select events.


In the example below, the cube’s material changes on Hover Entered and reverts on Hover Exited, and the cube changes into a sphere on Select Entered, which triggers once the user has gazed at it for the Hover Time to Select duration. You don’t need any custom scripts for this functionality.
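If you’d rather wire up the hover events in code than in the Inspector, a minimal sketch might look like the following. It assumes an XR Simple Interactable on the same GameObject and a hover material that you assign yourself in the Inspector:

```csharp
using UnityEngine;
using UnityEngine.XR.Interaction.Toolkit;

[RequireComponent(typeof(XRSimpleInteractable), typeof(Renderer))]
public class GazeMaterialSwap : MonoBehaviour
{
    [SerializeField] private Material hoverMaterial; // shown while the user gazes at the object
    private Material defaultMaterial;
    private Renderer objectRenderer;

    private void Awake()
    {
        objectRenderer = GetComponent<Renderer>();
        defaultMaterial = objectRenderer.material;

        // Swap materials when the eye gaze ray enters or exits the interactable.
        var interactable = GetComponent<XRSimpleInteractable>();
        interactable.hoverEntered.AddListener(_ => objectRenderer.material = hoverMaterial);
        interactable.hoverExited.AddListener(_ => objectRenderer.material = defaultMaterial);
    }
}
```

Because the Eyes Controller is just another XRI controller, the same hoverEntered and hoverExited events fire whether the interactor is a hand controller or the user’s gaze.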



5. Test this functionality in Play mode by rotating the XR Rig to look toward the interactable object and observing how it responds.



6. Build the application to your device and test the gaze interactions. If you find the visible red lines tiring on your eyes, disable the Line Renderer component.



5. Eye tracking in context

Now that you know how to set up eye tracking, let’s check out an example of how this feature can be combined with other interactive functionality.


In this example, the user can use the Spaces app on the Magic Leap 2 to localize into a particular space. Then, the user can place sticky notes around the room, which stick to meshes in the environment and are occluded by real objects. If the user leaves and re-enters that space, the sticky notes are reloaded in the environment just as the user left them. When the user focuses on one of those sticky notes, the note moves closer to the user for easier viewing and editing.
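The “note moves closer when gazed at” behavior can be sketched with the same XRI hover events used earlier. This is a hypothetical simplification of the prototype’s logic, not its actual script; the field names and the camera reference are assumptions you would adapt to your own scene:

```csharp
using UnityEngine;
using UnityEngine.XR.Interaction.Toolkit;

[RequireComponent(typeof(XRSimpleInteractable))]
public class GazeAttract : MonoBehaviour
{
    [SerializeField] private Transform userCamera;        // assign the XR Rig's camera
    [SerializeField] private float viewingDistance = 0.5f; // meters in front of the user
    [SerializeField] private float moveSpeed = 2f;         // meters per second

    private Vector3 homePosition;
    private bool isGazedAt;

    private void Awake()
    {
        homePosition = transform.position;

        // Track whether the user's gaze ray is currently on this note.
        var interactable = GetComponent<XRSimpleInteractable>();
        interactable.hoverEntered.AddListener(_ => isGazedAt = true);
        interactable.hoverExited.AddListener(_ => isGazedAt = false);
    }

    private void Update()
    {
        // Glide toward the user while gazed at, and back home otherwise.
        Vector3 target = isGazedAt
            ? userCamera.position + userCamera.forward * viewingDistance
            : homePosition;
        transform.position = Vector3.MoveTowards(
            transform.position, target, moveSpeed * Time.deltaTime);
    }
}
```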


6. Next steps and additional resources

In this tutorial, you learned about eye tracking on the Magic Leap 2. You can learn more about these features with the following resources:






You may also want to learn more about the other features highlighted in the example prototype:




Otherwise, feel free to go back to the overview page, where you can explore other tutorials and resources.

