Gestures are input events based on human hands.
There are two types of devices that raise gesture input events in the Mixed Reality Toolkit (MRTK):
Windows Mixed Reality devices such as HoloLens. These report pinching motions ("Air Tap") and tap-and-hold gestures.
WindowsMixedRealityDeviceManager wraps the Unity XR.WSA.Input.GestureRecognizer to consume Unity's gesture events from HoloLens devices.
Touch screen devices.
UnityTouchController wraps the Unity Touch class that supports physical touch screens.
Both of these input sources use the Gesture Settings profile to translate Unity's Touch and Gesture events respectively into MRTK's Input Actions. This profile can be found under the Input System Settings profile.
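As a sketch of how an application might consume these gesture events once the profile has mapped them to input actions, a script can implement MRTK's IMixedRealityGestureHandler interface. The class name GestureLogger below is hypothetical, and this is a minimal illustration rather than a complete setup (the object must have focus, or the handler must be registered globally via the input system, to receive events):

```csharp
using Microsoft.MixedReality.Toolkit.Input;
using UnityEngine;

// Hypothetical example component: logs MRTK gesture events
// (e.g. tap-and-hold) after the Gesture Settings profile has
// translated them into MRTK input actions.
public class GestureLogger : MonoBehaviour, IMixedRealityGestureHandler
{
    public void OnGestureStarted(InputEventData eventData)
    {
        // The mapped input action is carried on the event data.
        Debug.Log($"Gesture started: {eventData.MixedRealityInputAction.Description}");
    }

    public void OnGestureUpdated(InputEventData eventData)
    {
        Debug.Log($"Gesture updated: {eventData.MixedRealityInputAction.Description}");
    }

    public void OnGestureCompleted(InputEventData eventData)
    {
        Debug.Log($"Gesture completed: {eventData.MixedRealityInputAction.Description}");
    }

    public void OnGestureCanceled(InputEventData eventData)
    {
        Debug.Log($"Gesture canceled: {eventData.MixedRealityInputAction.Description}");
    }
}
```

Because the Gesture Settings profile performs the mapping, the same handler code receives events whether they originate from a HoloLens air tap or a touch-screen gesture.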