After completing this section, the developer should:
Understand how to set up their Unreal project to use the Movement SDK, including the use of Meta Horizon Link.
Understand how to set up permissions for eye tracking, face tracking, or body tracking.
To set up a Movement SDK feature (body, face, or eye tracking) in your Unreal project for Meta Quest, you must first configure your project for Unreal development.
Unreal project setup
Prerequisites
To use Movement SDK for Unreal, the following are required:
A Meta Quest Pro headset for eye tracking and visual-based face tracking.
A Meta Quest 2, Meta Quest 3, Meta Quest 3S, or Meta Quest Pro headset for body tracking and audio-based face tracking.
Unreal 5.0 or newer installed (Unreal 5.4 recommended).
An installed version of the Meta XR plugin for Unreal.
A Meta Quest device with developer mode enabled. See Mobile Device Setup for instructions.
If you are not familiar with setting up an Unreal project for Meta Quest, follow the Setting up your development environment tutorial to create and configure a basic project that runs on your headset.
After completing the tutorial, confirm the following:
The Meta XR Project Setup Tool has been run with all required and recommended rules applied.
Note: The OpenXR, OpenXREyeTracker, OpenXRHandTracking, OpenXRMsftHandInteraction, and OpenXRViveTracker plugins are not compatible with Movement SDK and should be disabled. Disable these in Edit > Plugins by searching for each plugin name and unchecking the Enabled box. The Meta XR plugin provides the OpenXR compatibility.
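Disabling these plugins in the editor records the change in your project's .uproject file. As a rough sketch, the relevant entries in the Plugins array look like this (your file will contain other entries as well, and the simplest path is still to toggle them in Edit > Plugins):

```json
{
  "Plugins": [
    { "Name": "OpenXR", "Enabled": false },
    { "Name": "OpenXREyeTracker", "Enabled": false },
    { "Name": "OpenXRHandTracking", "Enabled": false },
    { "Name": "OpenXRMsftHandInteraction", "Enabled": false },
    { "Name": "OpenXRViveTracker", "Enabled": false }
  ]
}
```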
OculusXRMovement module
The OculusXRMovement module provides the tracking services for body, face, and eye tracking. It is included in the Meta XR Plugin, so it requires no separate download or installation and is available as soon as the plugin is installed.
For technical details about the animation nodes and configuration options provided by this module, see OculusXRMovement plugin reference.
To enable the tracking features of the Meta XR plugin, go to Edit > Project Settings.
Scroll down to the Plugins header and select the Meta XR subheading.
Under Mobile, scroll down and enable the tracking features you want to use:
Plugins menu with body, eye, and face tracking selected.
Under General, make sure that XR API is set to Meta XR with OVRPlugin.
Note: The tracking feature settings (body, face, and eye tracking) are only visible when XR API is set to Meta XR with OVRPlugin. These settings do not appear under the Epic Native OpenXR (Recommended) backend.
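Changes made in Project Settings are serialized to Config/DefaultEngine.ini. As an illustrative sketch of what the resulting file can look like (the key names below are assumptions based on the OculusXR runtime settings class; edit these values through Project Settings rather than by hand):

```ini
[/Script/OculusXRHMD.OculusXRHMDRuntimeSettings]
; XR API set to Meta XR with OVRPlugin (key and value names are illustrative)
XrApi=OVRPluginOpenXR
bBodyTrackingEnabled=True
bFaceTrackingEnabled=True
bEyeTrackingEnabled=True
```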
Set up Android permissions
Your application requires the following permissions to use the tracking features in Movement SDK:
Body tracking: com.oculus.permission.BODY_TRACKING
Face tracking: com.oculus.permission.FACE_TRACKING and android.permission.RECORD_AUDIO (required because face tracking uses the device microphone as a data source on all Quest devices)
Eye tracking: com.oculus.permission.EYE_TRACKING
Note: These permissions are automatically added to the Android manifest when you enable the corresponding tracking features in Project Settings. The steps below describe how to request runtime user consent using Unreal Engine’s Blueprint API. See the Unreal Engine Android permissions documentation for more details.
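In the packaged APK, these settings surface as standard uses-permission entries in the Android manifest. The resulting fragment looks roughly like the following (shown for reference only; you should not need to add these by hand):

```xml
<!-- Added automatically when the corresponding tracking features are enabled -->
<uses-permission android:name="com.oculus.permission.BODY_TRACKING" />
<uses-permission android:name="com.oculus.permission.FACE_TRACKING" />
<uses-permission android:name="android.permission.RECORD_AUDIO" />
<uses-permission android:name="com.oculus.permission.EYE_TRACKING" />
```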
In your GameMode (if you are using the VR Template, this is called VRGameMode) or GameInstance, create a variable named Permissions.
Change the variable Type to String and make it an Array container by selecting the icon to the right of the variable type in the Variable Detail panel.
Compile and manually add the relevant permissions to the Permissions variable (the permissions list will be empty by default).
After the BeginPlay event, add a Request Android Permissions node and connect the Permissions variable to the node's Permissions input to trigger the permission request at startup.
Compile and save the GameMode (or GameInstance).
If you are using PCVR and want to use Meta Horizon Link for play-in-editor:
Open your Meta Horizon Link app and go to Settings > Developer.
Enable Developer Runtime Features.
Enable Eye Tracking over Meta Horizon Link.
Enable Natural Facial Expressions over Meta Horizon Link.
Before continuing, verify that your project is configured correctly:
Select Save All in your project.
Play in the editor by selecting VR Preview while using Meta Horizon Link, or build, deploy, and run on your headset.
What’s next
You have configured your Unreal project for Movement SDK and set up the required tracking permissions. Continue to the Code Walkthrough to implement body, face, and eye tracking in your project.