Meta XR Simulator Interaction
Updated: Jul 2, 2024
Outdated XR Simulator Version
This information applies to an older version of the XR Simulator. For new projects, use the Standalone XR Simulator, which supports any OpenXR application.

On Meta Quest devices, users interact with apps using controllers or hands. Meta XR Simulator offers three primary methods to simulate this interaction: keyboard and mouse input, an Xbox controller, and Meta Quest controllers. This page also covers additional input features, including Point-and-Click input and Movement Tracking Controls.
The default method is keyboard and mouse input. You can use the WASD keys to move around, and the mouse to change your view. Refer to the Input Bindings panel for full key binding information.
Keyboard and mouse input also supports Point-and-Click input, where the controller ray follows your mouse cursor. This is useful for interacting with UI elements and aiming at specific targets. For details, see Point-and-Click input.
An Xbox controller can also serve as an input device. Once connected, you can use it directly with Meta XR Simulator. Refer to the Input Bindings panel for full key binding information.
Keyboard and mouse input and Xbox controllers are commonly used for app interactions, but both are limited in 3D space. Meta Quest controllers, by contrast, simplify spatial actions such as grabbing an object and placing it on a table edge.
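Whichever input method you choose, the simulator delivers it to your app through the standard OpenXR input path, so your app-side code does not change. The following C snippet is a minimal sketch, not taken from this page, of a grab action bound to the Touch controller's squeeze value; the function name setup_grab_action and the assumption that instance is an already-created XrInstance are illustrative.

```c
#include <openxr/openxr.h>
#include <string.h>

/* Sketch: create a "grab" action and suggest a Touch controller binding.
 * Meta XR Simulator drives this same OpenXR input path, so simulated
 * controllers exercise the binding without a headset. */
XrResult setup_grab_action(XrInstance instance, XrActionSet* out_set, XrAction* out_grab)
{
    XrActionSetCreateInfo set_info = { XR_TYPE_ACTION_SET_CREATE_INFO };
    strcpy(set_info.actionSetName, "gameplay");
    strcpy(set_info.localizedActionSetName, "Gameplay");
    XrResult res = xrCreateActionSet(instance, &set_info, out_set);
    if (XR_FAILED(res)) return res;

    XrActionCreateInfo action_info = { XR_TYPE_ACTION_CREATE_INFO };
    action_info.actionType = XR_ACTION_TYPE_FLOAT_INPUT;
    strcpy(action_info.actionName, "grab");
    strcpy(action_info.localizedActionName, "Grab Object");
    res = xrCreateAction(*out_set, &action_info, out_grab);
    if (XR_FAILED(res)) return res;

    /* Suggest a binding on the left Touch controller's squeeze value. */
    XrPath profile, squeeze_left;
    xrStringToPath(instance, "/interaction_profiles/oculus/touch_controller", &profile);
    xrStringToPath(instance, "/user/hand/left/input/squeeze/value", &squeeze_left);

    XrActionSuggestedBinding binding = { *out_grab, squeeze_left };
    XrInteractionProfileSuggestedBinding suggested = {
        XR_TYPE_INTERACTION_PROFILE_SUGGESTED_BINDING };
    suggested.interactionProfile = profile;
    suggested.suggestedBindings = &binding;
    suggested.countSuggestedBindings = 1;
    return xrSuggestInteractionProfileBindings(instance, &suggested);
}
```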
The Input Bindings panel displays the keys or buttons assigned to each simulated action, such as movement, head rotation, and controller interactions like grabbing. You can customize these bindings by remapping keys or buttons to suit your preferences.
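To see how a remapped binding reaches your app, here is a companion sketch of per-frame polling, assuming the session, action set, and grab action from the previous snippet. Whatever key, Xbox button, or simulated squeeze the Input Bindings panel maps to the action is reported through the same call.

```c
#include <openxr/openxr.h>

/* Sketch: sync and read the grab action once per frame. */
void poll_grab(XrSession session, XrActionSet action_set, XrAction grab_action)
{
    XrActiveActionSet active = { action_set, XR_NULL_PATH };
    XrActionsSyncInfo sync_info = { XR_TYPE_ACTIONS_SYNC_INFO };
    sync_info.countActiveActionSets = 1;
    sync_info.activeActionSets = &active;
    xrSyncActions(session, &sync_info);

    XrActionStateGetInfo get_info = { XR_TYPE_ACTION_STATE_GET_INFO };
    get_info.action = grab_action;

    XrActionStateFloat state = { XR_TYPE_ACTION_STATE_FLOAT };
    if (XR_SUCCEEDED(xrGetActionStateFloat(session, &get_info, &state)) &&
        state.isActive && state.currentState > 0.5f) {
        /* The simulated squeeze is past the halfway point: treat as a grab. */
    }
}
```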
Movement Tracking Controls
The Movement Tracking Controls section of the Inputs panel allows you to play back recorded tracking data to test body, face, and eye tracking in Meta XR Simulator. This is useful for validating tracking-dependent features without a physical headset.
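If your app consumes eye tracking through the standard XR_EXT_eye_gaze_interaction extension, the played-back data would be expected to arrive through the usual OpenXR input path. The following is a sketch under that assumption; whether a given simulator version exposes this extension is not confirmed here, and create_gaze_space is an illustrative name. It assumes the extension was enabled at instance creation and that instance, session, and action_set already exist.

```c
#include <openxr/openxr.h>
#include <string.h>

/* Sketch: create a pose action bound to the eye-gaze interaction profile,
 * then wrap it in an action space for per-frame location queries. */
XrSpace create_gaze_space(XrInstance instance, XrSession session, XrActionSet action_set)
{
    XrActionCreateInfo action_info = { XR_TYPE_ACTION_CREATE_INFO };
    action_info.actionType = XR_ACTION_TYPE_POSE_INPUT;
    strcpy(action_info.actionName, "eye_gaze");
    strcpy(action_info.localizedActionName, "Eye Gaze");
    XrAction gaze_action;
    xrCreateAction(action_set, &action_info, &gaze_action);

    XrPath profile, gaze_pose;
    xrStringToPath(instance, "/interaction_profiles/ext/eye_gaze_interaction", &profile);
    xrStringToPath(instance, "/user/eyes_ext/input/gaze_ext/pose", &gaze_pose);

    XrActionSuggestedBinding binding = { gaze_action, gaze_pose };
    XrInteractionProfileSuggestedBinding suggested = {
        XR_TYPE_INTERACTION_PROFILE_SUGGESTED_BINDING };
    suggested.interactionProfile = profile;
    suggested.suggestedBindings = &binding;
    suggested.countSuggestedBindings = 1;
    xrSuggestInteractionProfileBindings(instance, &suggested);

    /* Locate this space against a reference space each frame with
     * xrLocateSpace to read the played-back gaze pose. */
    XrActionSpaceCreateInfo space_info = { XR_TYPE_ACTION_SPACE_CREATE_INFO };
    space_info.action = gaze_action;
    space_info.poseInActionSpace.orientation.w = 1.0f; /* identity pose */
    XrSpace gaze_space = XR_NULL_HANDLE;
    xrCreateActionSpace(session, &space_info, &gaze_space);
    return gaze_space;
}
```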