# What are Gestures?

**Gestures** are input events based on human **hands**.

There are two types of devices that raise gesture input events in the Mixed Reality Toolkit (MRTK):

* Windows Mixed Reality devices such as HoloLens, which report pinching motions ("Air Tap") and tap-and-hold gestures.

  [`WindowsMixedRealityDeviceManager`](https://microsoft.github.io/MixedRealityToolkit-Unity/api/Microsoft.MixedReality.Toolkit.WindowsMixedReality.Input.WindowsMixedRealityDeviceManager.html) wraps the [Unity XR.WSA.Input.GestureRecognizer](https://docs.unity3d.com/ScriptReference/XR.WSA.Input.GestureRecognizer.html) to consume Unity's gesture events from **HoloLens** devices.

* Touch screen devices.

  [`UnityTouchController`](https://microsoft.github.io/MixedRealityToolkit-Unity/api/Microsoft.MixedReality.Toolkit.Input.UnityInput.html) wraps the [Unity Touch class](https://docs.unity3d.com/ScriptReference/Touch.html), which supports physical touch screens.

Both of these input sources use the *Gesture Settings* profile to translate Unity's touch and gesture events, respectively, into MRTK's [Input Actions](https://microsoft.github.io/MixedRealityToolkit-Unity/Documentation/Input/InputActions.html). This profile can be found under the *Input System Settings* profile.

![Gesture Profile Settings](https://1227696974-files.gitbook.io/~/files/v0/b/gitbook-x-prod.appspot.com/o/spaces%2F-LoMjAI1irMPA8c0ezFQ%2Fuploads%2Fgit-blob-4260c4ad5525020c1c12b587929c22c90aa0ffdb%2FGestureProfile.png?alt=media)
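Once a gesture has been mapped to an input action, a script can react to it by implementing MRTK's `IMixedRealityGestureHandler` interface and registering with the input system. The following is a minimal sketch, assuming an MRTK 2.x scene; the class name and logging behavior are illustrative, not part of the toolkit:

```csharp
using Microsoft.MixedReality.Toolkit;
using Microsoft.MixedReality.Toolkit.Input;
using UnityEngine;

// Illustrative sketch: logs gesture events routed through the MRTK
// input system. Attach to any GameObject in an MRTK-configured scene.
public class GestureLogger : MonoBehaviour, IMixedRealityGestureHandler
{
    private void OnEnable()
    {
        // Register globally so gesture events are received even when
        // this object is not the current focus target.
        CoreServices.InputSystem?.RegisterHandler<IMixedRealityGestureHandler>(this);
    }

    private void OnDisable()
    {
        CoreServices.InputSystem?.UnregisterHandler<IMixedRealityGestureHandler>(this);
    }

    public void OnGestureStarted(InputEventData eventData)
    {
        // MixedRealityInputAction is the Input Action that the
        // Gesture Settings profile mapped this gesture to.
        Debug.Log($"Gesture started: {eventData.MixedRealityInputAction.Description}");
    }

    public void OnGestureUpdated(InputEventData eventData) { }

    public void OnGestureCompleted(InputEventData eventData)
    {
        Debug.Log($"Gesture completed: {eventData.MixedRealityInputAction.Description}");
    }

    public void OnGestureCanceled(InputEventData eventData) { }
}
```

Because the handler is registered globally rather than via focus, it will receive gestures raised by either input source described above, whichever actions the active *Gesture Settings* profile assigns them to.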
