Unity's built-in VR support
In general, your Unity project must include a camera object that can render two stereoscopic views, one for each eye of the VR headset. Since Unity 5.1, support for VR headsets has been built into Unity for various devices across several platforms.
You can simply use a standard camera component, like the one attached to the default Main Camera when you create a new scene. As we'll see, enabling Virtual Reality Supported in the XR Player Settings tells Unity to render stereoscopic camera views and run your project on a VR headset (HMD). In Player Settings, you then choose which specific virtual reality SDK(s) to use when the project is built. The SDK talks to the device's runtime drivers and underlying hardware. Unity's support for VR devices is collected in the XR namespace of classes, and is documented as follows:
- XR Settings: Global XR-related settings including a list of supported devices in the build, and eye textures for the loaded device. See https://docs.unity3d.com/ScriptReference/XR.XRSettings.html.
- XR Device: Query the capabilities of the current device such as the refresh rate and tracking space type. See https://docs.unity3d.com/ScriptReference/XR.XRDevice.html.
- XR Input Tracking: Access the VR positional tracking data, including the position and rotation of individual nodes. See https://docs.unity3d.com/ScriptReference/XR.InputTracking.html.
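To see how these three classes fit together, here is a minimal sketch of a script that logs basic XR state using the legacy UnityEngine.XR API described above. The class name XrInfoLogger is my own; all the API calls are from the documented XRSettings, XRDevice, and InputTracking classes:

```csharp
using UnityEngine;
using UnityEngine.XR;

// Attach to any GameObject in a scene with Virtual Reality Supported enabled.
public class XrInfoLogger : MonoBehaviour
{
    void Start()
    {
        // XR Settings: global state for the loaded VR device
        Debug.Log("VR enabled: " + XRSettings.enabled);
        Debug.Log("Loaded device: " + XRSettings.loadedDeviceName);
        Debug.Log("Eye texture: " + XRSettings.eyeTextureWidth +
                  "x" + XRSettings.eyeTextureHeight);

        // XR Device: capabilities of the current device
        Debug.Log("Refresh rate: " + XRDevice.refreshRate + " Hz");
        Debug.Log("Tracking space: " + XRDevice.GetTrackingSpaceType());
    }

    void Update()
    {
        // XR Input Tracking: positional data for an individual node (the head)
        Vector3 headPos = InputTracking.GetLocalPosition(XRNode.Head);
        Quaternion headRot = InputTracking.GetLocalRotation(XRNode.Head);
        Debug.Log("Head at " + headPos + " facing " + headRot.eulerAngles);
    }
}
```

In practice you would log these values once or display them in a debug UI rather than every frame, but the sketch shows which class to query for which kind of information.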
Controller buttons, triggers, touchpads, and thumbsticks can also be mapped generically to Unity's Input system. For example, the OpenVR hand controller mappings can be found here: https://docs.unity3d.com/Manual/OpenVRControllers.html.
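Once a controller axis is mapped in the Input Manager (Edit | Project Settings | Input), you read it like any other input. A minimal sketch, assuming you have defined axes named "Trigger" and "TouchpadX" yourself and bound them to the joystick axis numbers from the OpenVR mapping table (the axis names here are hypothetical, not built in):

```csharp
using UnityEngine;

public class ControllerInputReader : MonoBehaviour
{
    void Update()
    {
        // "Trigger" and "TouchpadX" are assumed Input Manager entries,
        // bound to the controller's trigger and touchpad joystick axes.
        float trigger = Input.GetAxis("Trigger");
        float padX = Input.GetAxis("TouchpadX");

        if (trigger > 0.5f)
            Debug.Log("Trigger: " + trigger + ", touchpad X: " + padX);
    }
}
```

The benefit of this generic mapping is that the same script works across devices; only the Input Manager bindings change per platform.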