Meta XR Interaction SDK Essentials provides the core implementations of all the Meta XR interaction models, along with necessary shaders, materials, and prefabs.
Note: Some features are only supported for Meta XR Interaction SDK with Meta XR Core SDK. Meta XR Interaction SDK Essentials with Unity XR does not support the full set of Interaction features, but it does offer the possibility of cross-platform support. To learn how to get started with Interaction SDK with Meta XR Core SDK, check out Getting Started with Interaction SDK.
Project setup
Installing Interaction SDK also installs OpenXR through the XR Hands package dependency, but OpenXR must still be enabled.
Navigate to Edit > Project Settings and select XR Plug-in Management.
Navigate to XR Plug-in Management > OpenXR.
Under Interaction Profiles, click the + icon to add and enable your intended profiles, e.g., Oculus Touch Controller Profile and Meta Quest Touch Pro Controller Profile.
Under OpenXR Feature Groups enable:
Hand Tracking Subsystem
Meta Hand Tracking Aim
Meta Quest Support (Android only)
Navigate to XR Plug-in Management > Project Validation.
The Project Validation tool checks your project settings and applies the settings required by your configured dependencies.
Add the rig
In Interaction SDK, the rig is a predefined collection of GameObjects that enables you to see your virtual environment and initiate actions, like a grab, a teleport, or a poke. The rig is contained in a prefab called UnityXRInteractionComprehensive. It requires a working camera rig and adds support for hands, controllers, and controller-driven hands to your scene.
Instead of manually adding these prefabs to the scene, use the Interaction SDK Quick Actions.
Delete the default Main Camera if it exists because Interaction SDK uses its own camera rig.
Right-click in the Hierarchy and select Interaction SDK > Add UnityXR Interaction Rig.
If Fix All is enabled in the Unity XR Interaction Rig dialog, click it to create a camera rig.
Click Create to add the UnityXR Interaction Rig to the scene.
Open the Link desktop application on your computer.
Put on your headset, and, when prompted, enable Link.
On your development machine, in Unity Editor, select the Play button.
In your headset, you can see your hands in the app.
Test your Interaction with an APK
Build your project into an .apk to test it on the headset.
Make sure your headset is connected to your development machine.
In Unity Editor, select File > Build Profiles.
Click Open Scene List to open the Scene List window.
Add your scene to the Scene List by dragging it from the Project panel or by clicking Add Open Scenes.
Click Build and Run to generate an .apk and run it on your headset. In the File Explorer that opens, select a location to save the .apk to and give it a name. The build process may take a few minutes.
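If you prefer the command line, or want to reinstall an .apk you have already built, you can sideload it with adb instead of using Build and Run. The .apk path and package name below are placeholders for illustration; substitute your own.

```shell
# Confirm the headset is connected and authorized for USB debugging
adb devices

# Install (or reinstall with -r) the build; replace the path with your .apk
adb install -r Builds/MyInteractionApp.apk

# Launch the app; replace com.example.myapp with your project's package name
# (com.unity3d.player.UnityPlayerActivity is Unity's default activity)
adb shell am start -n com.example.myapp/com.unity3d.player.UnityPlayerActivity
```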
Key differences between ISDK with Meta XR Core SDK and with Unity XR
Interaction SDK was built to run on the Meta XR Core SDK, which now supports Unity XR.
However, Unity XR cannot access every Meta device feature. To access them all, you must install the Meta XR Core SDK, because some Interaction SDK tools depend on it.
Data sources
There is a single comprehensive sample scene available in the Unity XR package samples, but this does not limit how Unity XR can be integrated. Many of the Interaction SDK Core SDK sample scenes would work just the same if their hand and HMD data sources were swapped to Unity XR sources.
FromUnityXRHandDataSource and FromUnityXRHmdDataSource are MonoBehaviours that take the OpenXR data provided through XR Hands or the XROrigin and translate it into the Core SDK data format that Interaction SDK expects.
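As a rough illustration of the kind of data these components consume, the following sketch reads a palm pose directly from the XR Hands subsystem. This is standalone example code against the public com.unity.xr.hands API, not the Interaction SDK data-source implementation itself; the class name is hypothetical.

```csharp
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR.Hands;

// Hypothetical example component: logs the tracked right palm pose each frame.
// FromUnityXRHandDataSource consumes this same XR Hands data and converts it
// into the Core SDK format, but its internals are not shown here.
public class PalmPoseLogger : MonoBehaviour
{
    XRHandSubsystem m_Subsystem;

    void Update()
    {
        if (m_Subsystem == null)
        {
            // Find the running XR Hands subsystem, if one has started.
            var subsystems = new List<XRHandSubsystem>();
            SubsystemManager.GetSubsystems(subsystems);
            if (subsystems.Count == 0)
                return;
            m_Subsystem = subsystems[0];
        }

        var hand = m_Subsystem.rightHand;
        if (!hand.isTracked)
            return;

        // Joint poses are reported relative to the XR origin.
        var palm = hand.GetJoint(XRHandJointID.Palm);
        if (palm.TryGetPose(out Pose pose))
            Debug.Log($"Right palm position: {pose.position}");
    }
}
```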