Intro

Hi folks, apologies for my lack of activity. I’m going to change the format slightly for this blog article and write a tutorial. In this article, I will outline the setup of a modern XR development environment in Unity 2020 LTS. The example project will use Unity XR Plugin Management, OpenXR, the new Unity Input System, as well as Unity Action Based Input. Exciting right?! But first, let’s go through a quick overview of OpenXR, what it is, and how it applies to Unity.

UPDATE:

Goal: show how to track devices and do the following with actions:

  • Add input actions for the controllers
  • Controller input
    • Position and rotation
    • Joystick axes
    • Buttons
    • Trigger (0-1)
  • Add a custom input
  • Generate a strongly typed class example

Input mappings: https://docs.unity3d.com/Manual/xr_input.html

Bug? Trigger reads non-zero at rest (> 1)

OpenXR

OpenXR is an open standard that is developed and maintained by Khronos. You may recognize that name, as they brought us the OpenGL, WebGL, and Vulkan standards, among others. The specification is publicly available, and the goal is to simplify XR development across platforms. Unity is supporting the OpenXR standard.

The XR ecosystem has become very fragmented over the last couple of years, especially around headsets and input systems. So Unity supporting OpenXR is a big deal, as it allows developers to seamlessly target a wide range of AR/VR devices with a single Unity project. Amazing! But there are a couple of caveats, so you should read through the docs before deciding whether it is right for your project in its current state.

Project Setup

Let's get the project set up for the rest of this article. Open Edit -> Project Settings; in the resulting dialog, choose XR Plugin Management and click the Install XR Plugin Management button.

Now enable OpenXR and the MockHMD (which will allow you to develop XR without a headset).

Next, you will need to fix all of the incompatibilities for your project: click the ! icon, then click the Fix All button.

After this is done, you will need to make sure the correct Interaction Profiles are selected in the project settings. Interaction profiles are device layouts in the new Unity Input System, which we will discuss later.

OK, we are almost done with setup, but there are a couple more things to do. Next, we are going to install the Unity XR Interaction Toolkit. Open Window -> Package Manager. In the resulting window, click the little cog icon in the top right and choose the Advanced Project Settings menu option. In the window that opens, check the Enable Preview Packages checkbox, then close it.

In the Package Manager window, select Unity Registry so that all Unity packages are shown.

Now search for and install the package named XR Interaction Toolkit.

Once this is installed, expand the Samples section in the same window and import both Default Input Actions and XR Device Simulator.

OK, we are ready to go! Next up, we will create an XR Rig and discuss input handling.

Add Action Based OpenXR Rig

To handle our input, navigate to the default input action presets imported with the Default Input Actions sample.

Then click on the Left and Right controller defaults, and click the Add to ActionBasedController default button in the Inspector window.

Once this is complete, create an XR Rig in your scene by right-clicking anywhere in the scene Hierarchy and choosing XR -> XR Rig (Action Based) from the resulting menu.

Setup Custom Actions

  1. Right-click in the Project window and choose Create -> Input Actions to create a new Input Actions asset.
  2. Open the asset in the Actions editor and enable Auto-Save.

Create Action Maps

  1. In the Actions editor, create three action maps: VR HMD, VR LHand, and VR RHand.

Create Actions For Each Action Map

First, for the head, we need to track the Position and Rotation. So we create two actions and name them Position and Rotation. Both of these are value-type actions that will be read from the VR headset device every time the state changes. For the Position action, we should set the properties to Action Type -> Value and Control Type -> Vector3. For the Rotation action, we should set the properties to Action Type -> Value and Control Type -> Quaternion.

Now we have the actions set up, but we need to bind them to device interfaces. In this case, we are going to bind to the OpenXR device interface – this will give us a rather beautiful abstraction. So, open the Properties -> Binding Path dropdown and select XR HMD -> centerEyePosition.

And now do the same for the Rotation action, this time selecting XR HMD -> centerEyeRotation.
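To sanity-check the bindings, here is a minimal sketch of a component that reads those two actions and applies them to its transform. The component and field names are my own; in the Inspector you would point the two InputActionProperty fields at the Position and Rotation actions created above.

```csharp
using UnityEngine;
using UnityEngine.InputSystem;

// Minimal sketch: reads the HMD Position and Rotation actions and
// applies them to this GameObject's transform every frame.
public class HmdPoseReader : MonoBehaviour
{
    // Assign these in the Inspector to the Position and Rotation
    // actions from the VR HMD action map.
    [SerializeField] private InputActionProperty position;
    [SerializeField] private InputActionProperty rotation;

    private void OnEnable()
    {
        position.action.Enable();
        rotation.action.Enable();
    }

    private void OnDisable()
    {
        position.action.Disable();
        rotation.action.Disable();
    }

    private void Update()
    {
        // Value actions with Vector3/Quaternion control types read
        // directly into the corresponding structs.
        transform.SetPositionAndRotation(
            position.action.ReadValue<Vector3>(),
            rotation.action.ReadValue<Quaternion>());
    }
}
```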

Hands

Now do something similar for the hands: create Position and Rotation actions using the same process as we did for the HMD. The difference is the device bindings; here we use the XR left and right controllers (devicePosition and deviceRotation).
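The same action-based pattern extends beyond poses. As a sketch, suppose you also add to a hand's action map a float-valued Trigger action (Action Type -> Value, bound to the controller's trigger control) and a button-style Select action (bound to one of the controller buttons) – both of these actions are assumptions, not created above – then they can be read like this:

```csharp
using UnityEngine;
using UnityEngine.InputSystem;

// Sketch only: assumes a float "Trigger" action and a button "Select"
// action exist on one of the hand action maps.
public class ControllerInputReader : MonoBehaviour
{
    [SerializeField] private InputActionProperty trigger; // assumed action
    [SerializeField] private InputActionProperty select;  // assumed action

    private void OnEnable()
    {
        trigger.action.Enable();
        select.action.Enable();
    }

    private void OnDisable()
    {
        trigger.action.Disable();
        select.action.Disable();
    }

    private void Update()
    {
        // Triggers report an analog value from 0 (at rest) to 1 (fully pressed).
        float pull = trigger.action.ReadValue<float>();

        if (select.action.WasPressedThisFrame())
            Debug.Log($"Select pressed, trigger at {pull:F2}");
    }
}
```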

Strongly Typed

The Input System can also generate a strongly typed C# wrapper class from an Input Actions asset: select the asset in the Project window, check Generate C# Class in the Inspector, and click Apply. This gives you compile-time-checked access to every action map and action, instead of looking them up by string.
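As a sketch of the strongly typed approach: assuming the Input Actions asset is named XRActions and Generate C# Class is ticked in its import settings, the Input System produces a wrapper class of the same name with one property per action map ("VR LHand" becomes VRLHand). The class and member names here are assumptions based on the action maps created earlier.

```csharp
using UnityEngine;

// Sketch only: XRActions is the hypothetical class generated from an
// Input Actions asset named "XRActions" via Generate C# Class.
public class HandPoseLogger : MonoBehaviour
{
    private XRActions actions;

    private void Awake() => actions = new XRActions();

    private void OnEnable() => actions.Enable();
    private void OnDisable() => actions.Disable();

    private void Update()
    {
        // Strongly typed access: no string lookups, typos fail at compile time.
        Vector3 leftPos = actions.VRLHand.Position.ReadValue<Vector3>();
        Quaternion leftRot = actions.VRLHand.Rotation.ReadValue<Quaternion>();
        Debug.Log($"Left hand at {leftPos}, rotation {leftRot.eulerAngles}");
    }
}
```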

Smooth Movement

  • Read the joystick input and move the rig around
  • Drive a controller with a collider and body
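The two bullets above can be sketched as a single component, assuming a Vector2 "Move" action bound to a controller joystick (a 2D axis control) – the action, field, and class names are mine, not part of the project so far. CharacterController gives the rig a capsule collider and body, and SimpleMove applies gravity for us:

```csharp
using UnityEngine;
using UnityEngine.InputSystem;

// Sketch only: assumes a Vector2 "Move" action bound to a controller
// joystick. Moves a CharacterController (collider + body) on the rig.
[RequireComponent(typeof(CharacterController))]
public class SmoothMover : MonoBehaviour
{
    [SerializeField] private InputActionProperty move; // assumed action
    [SerializeField] private float speed = 2f;         // metres per second

    private CharacterController body;

    private void Awake() => body = GetComponent<CharacterController>();

    private void OnEnable() => move.action.Enable();
    private void OnDisable() => move.action.Disable();

    private void Update()
    {
        // Map the joystick's x/y onto the horizontal plane.
        Vector2 axis = move.action.ReadValue<Vector2>();
        Vector3 motion = new Vector3(axis.x, 0f, axis.y) * speed;

        // SimpleMove expects a speed vector and applies gravity itself.
        body.SimpleMove(motion);
    }
}
```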

https://docs.unity3d.com/Packages/com.unity.inputsystem@1.1/manual/ActionAssets.html

Notes

This article requires Unity 2020.x. The packages used are experimental, so please let me know if anything changes:

  • OpenXR Support
  • Input System
  • XR Interaction Toolkit

Currently, only Windows x64 is supported. See Unity's OpenXR announcement: https://blog.unity.com/technology/unity-xr-platform-updates

References

Unity Input System | How to Use Input Action Assets https://www.youtube.com/watch?v=mvuXOyKz7k4