In this lesson, you will learn about the basic concepts of Mixed Reality and explore the applications of Mixed Reality in different industries.
Read through the questions below. If you feel comfortable with the answers, feel free to skip to the project section or the next chapters.
The first revolution in computing happened with the creation of mainframe computers: computers that, at times, occupied a whole room. Mainframes were used by large organizations such as NASA for critical data-processing applications.
The second wave of computing was defined by the personal computer (PC) becoming widely available.
We believe the third wave of computing will include many devices that manage data, including IoT sensors and Mixed Reality devices.
We have more data than ever before. To process the data and make informed decisions, we need access to the data at the right time and in the right place. Mixed Reality can bring that data into our context: the real world.
Design & Prototyping: Enables real-time collaborative iteration of 3D physical and virtual models across cross-functional teams and stakeholders.
Training & Development: Provides instructors with better tools to facilitate teaching and coaching sessions. It offers trainees an enhanced and engaging learning experience through 3D visualizations and interactivity.
Geospatial Planning: Enables the assessment and planning of indoor and outdoor environments (e.g., future construction sites, new store locations, interior designs) and removes the need for manual execution.
Sales Assistance: Improves the effectiveness of individuals in sales-oriented roles by providing tools such as 3D catalogs and virtual product experiences that increase customer engagement and strengthen buyer confidence.
Field Service: Improves the first-visit resolution and customer satisfaction of customer support issues. It is typically used for complex products that would otherwise require a field visit. It can serve as a platform for targeted up-sell opportunities, as well.
Productivity & Collaboration: Transforms the space around you into a shared augmented workplace. Remote users can collaborate, search, brainstorm, and share content as if they were in the same room.
Augmented Reality (AR) is defined as a technology that superimposes a computer-generated image on a user's view of the real world, providing a composite view. Augmented Reality experiences are not limited to visual additions to our world. You can create augmented experiences that add only audio to your physical world, or both audio and visuals.
Augmented Reality experiences are also not limited to headsets like HoloLens. Today, millions of mobile devices have depth-sensing capabilities to augment your real world with digital information.
Virtual Reality (VR) is when you are fully immersed in a virtual world by wearing a headset. In Virtual Reality you lose visual connection to the real world. Virtual Reality applications are great for training and for simulations where users benefit from total immersion that replicates a real-life situation. Some examples include training for firefighters and emergency-room healthcare providers, and flight simulations.
Mixed reality is the merging of real and virtual worlds to produce new environments and visualizations, where physical and digital objects co-exist and interact in real time.
We think of Mixed Reality as a spectrum from the physical world, to an augmented world, to a fully immersive virtual world, and all the possibilities in between.
Introduction to Mixed Reality Applications and Development
Short link:
You can jump directly into setting up your first project on the .
Medical
Museums and Libraries
In this project we will set up our development environment for Mixed Reality development with Unity3D.
Check your knowledge by answering the questions below before you move into the project. Feel free to skip sections you feel comfortable with. Make sure you read through the first download section to confirm you have all the necessary modules.
Let’s review some key concepts, which will help you as you begin to explore editing scripts for mixed reality development.
In Unity, areas of the game that a player can interact with are generally made up of one or more Scenes. Small games may only use one Scene; large ones could have hundreds.
Every Unity project you create comes with a SampleScene that has a light and a camera.
You can create a new scene by right-clicking under the Assets tab and selecting Create > Scene. Organizing scenes under a Scenes folder is purely for organizational purposes.
You can use scenes to organize navigation inside your application or to add different levels to a game.
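For example, here is a minimal sketch of switching scenes from a script using Unity's SceneManager. The scene name "Level2" is just a placeholder; any scene you load must be added to the Build Settings.

```csharp
using UnityEngine;
using UnityEngine.SceneManagement;

// Hypothetical helper: call LoadLevel() from a button or game event to switch scenes.
public class SceneSwitcher : MonoBehaviour
{
    // Placeholder scene name; the scene must be listed in File > Build Settings.
    [SerializeField] private string sceneName = "Level2";

    public void LoadLevel()
    {
        SceneManager.LoadScene(sceneName);
    }
}
```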
Every object in the game world exists as a GameObject in Unity. GameObjects are given specific features by giving them appropriate components which provide a wide range of different functionality.
When you create a new GameObject, it comes with a Transform component already attached. This component controls the GameObject’s positional properties in the 3D (or 2D) gamespace. You need to add all other components manually in the Inspector.
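As a quick illustration, the sketch below uses standard Unity APIs to create a GameObject, position it through its Transform, and attach an extra component from code (the Rigidbody is just an arbitrary example component):

```csharp
using UnityEngine;

public class GameObjectExample : MonoBehaviour
{
    private void Start()
    {
        // Every GameObject comes with a Transform component.
        GameObject cube = GameObject.CreatePrimitive(PrimitiveType.Cube);
        cube.transform.position = new Vector3(0f, 0f, 2f);
        cube.transform.localScale = Vector3.one * 0.25f;

        // All other functionality is added as extra components,
        // either in the Inspector or from code as shown here.
        cube.AddComponent<Rigidbody>();
    }
}
```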
Prefabs are a great way to configure and store GameObjects for re-use in your game. They act as templates, storing the components and properties of a specific GameObject and enabling you to create multiple instances of it within a Scene.
All copies of the Prefab template in a Scene are linked. This means that if you change the object values for the health potion Prefab, for example, each copy of that Prefab within the Scene will change to match it. However, you can also make specific instances of the GameObject different from the default Prefab settings.
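In code, creating instances of a prefab looks like the sketch below. The healthPotionPrefab field is a hypothetical example; you would assign your own prefab asset in the Inspector.

```csharp
using UnityEngine;

public class PrefabSpawner : MonoBehaviour
{
    // Assign a prefab asset (for example, a health potion) in the Inspector.
    [SerializeField] private GameObject healthPotionPrefab;

    private void Start()
    {
        // Each call creates an independent instance linked to the same prefab template.
        Instantiate(healthPotionPrefab, new Vector3(-0.5f, 0f, 1f), Quaternion.identity);
        Instantiate(healthPotionPrefab, new Vector3(0.5f, 0f, 1f), Quaternion.identity);
    }
}
```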
Unity Introduction.
Before you get started with developing for Mixed Reality with Unity, make sure to check everything in the list below and follow the instructions for each download.
Not following the instructions for a specific download might result in errors while developing or building your application. Before you try to debug, check the list and the detailed instructions.
Install the most recent version of Windows 10 Education or Pro Education so your PC's operating system matches the platform for which you are building mixed reality applications.
You can check your Windows version by typing "about" in the Windows search bar and selecting About your PC as shown in the below image.
You can learn more about upgrading your Windows 10 Home to Pro at aka.ms/WinHome2Pro.
We need to install and enable Hyper-V, which does not work on Windows Home. Make sure to upgrade to Education, Pro Education, Pro or Enterprise versions.
Go to the https://unity3d.com/get-unity/download page and download Unity Hub instead of the Unity Editor.
Do not use beta software, in general, until you feel very comfortable with debugging, the software itself, and finding your way around GitHub issues and Stack Overflow. Don't learn this lesson the hard way! I have already tried that for your benefit and/or my optimism.
Unity Hub allows you to download multiple Unity Editors and organize your projects in one place. Since Unity upgrades are not backward compatible, you have to open a project with the same Unity version it was created with. You can update a project to the latest Unity version, but that usually requires a lot of debugging. The easiest way to get going with a project is to keep the same version. I will show you how to debug and update your projects later in this chapter.
You will need to download the Windows development related modules along with your Unity Editor. Make sure Universal Windows Platform Build Support and Windows Build Support are checked while downloading the Unity Editor through Unity Hub, or add them afterwards by modifying the install.
You can add modules or check if you have them in your editor by clicking on the hamburger button for the Unity Editor version and checking the above module check-boxes.
If you would like to build for an Android or iOS mobile device, make sure the related modules are checked as well.
You can download Visual Studio by adding Microsoft Visual Studio 2019 module to your Unity Editor as shown in previous step or download it at aka.ms/VSDownloads.
Make sure to download Mixed Reality related modules along with Visual Studio.
You can always add the necessary workflows to Visual Studio after download:
Go to Edit > Preferences.
Change the color scheme under General, if it is available.
You can change the default script editor under External Tools > External Script Editor; the drop-down lists the editors currently available on your computer.
Developing for Mixed Reality using Unity3D
You can clone this repository, delete its history, and start a new git project by running the script below. You need to create your own GitHub repo first. Replace the URL with your own GitHub project URL.
Or by running the following Git commands:
You can check your remote origin and upstream by copying and pasting the following into your terminal:
You can remove the upstream at any time by running:
If you are using the HoloLens Seed project, you do not need to follow this step. The Seed project already comes with MRTK. Still, it's good to know how to import the MRTK assets for your future projects.
Examples
Extensions
Foundation
Tools
In your Unity project, select Assets tab and select Import Package > Custom Package from the drop down.
Navigate to MRTK downloaded folders to select and import them into your project.
Once you have the MRTK assets imported, a new tab called Mixed Reality Toolkit will appear in your Unity editor. Navigate to the new tab and select Add Scene and Configure from the dropdown menu. In your Scene Hierarchy, new MixedRealityToolkit and MixedRealityPlayspace objects will appear.
MixedRealityPlayspace now includes your Main Camera, and the camera is configured for Mixed Reality applications. The camera background is set to black so it renders as transparent, and the MixedRealityInputModule, EventSystem, and GazeProvider components are added to your camera.
You can create a new scene to compare the camera settings that have been changed by MRTK.
You might be prompted to select a configuration. You can choose the default MRTK configuration, or if you are developing for a HoloLens device, you can choose the configuration for the appropriate version.
In this section, you will learn the Unity3D interface, tools, and keyboard shortcuts.
The Unity Editor has four main sections:
This is where you can edit the current Scene by selecting and moving objects in the 3D space for the game. In this kit, the game level is contained in one Scene.
This is a list of all the GameObjects in a Scene. Every object in your game is a GameObject. These can be placed in a parent-child hierarchy, which lets you group objects — this means that when the parent object is moved, all of its children will move at the same time.
This displays all settings related to the currently selected object. You will explore this window more during the walkthrough.
This is where you manage your Project Assets. Assets are the media files used in a Project (for example, images, 3D models and sound files). The Project window acts like a file explorer, and it can be used to explore and create folders on your computer. When the walkthrough asks you to find an Asset at a given file path, use this window.
TIP: If your Editor layout doesn’t match the image above, use the layout drop-down menu at the top right of the toolbar to select Default.
The toolbar includes a range of useful tool buttons to help you design and test your game.
Play is used to test the Scene which is currently loaded in the Hierarchy window, and enables you to try out your game live in the Editor.
Pause, as you have probably guessed, allows you to pause the game playing in the Game window. This helps you spot visual problems or gameplay issues that you wouldn’t otherwise see.
Step is used to walk through the paused Scene frame by frame. This works really well when you're looking for changes in the game world that are hard to spot in real time.
These tools move and manipulate the GameObjects in the Scene view. You can click on the buttons to activate them, or use a shortcut key.
You can use this tool to move your Scene around in the window. You can also use middle click with the mouse to access the tool.
This tool enables you to select items and move them individually.
Select items and rotate them with this tool.
Tool to scale your GameObjects up and down.
This tool does lots of things. Essentially, it combines moving, scaling and rotation into a single tool that’s specialized for 2D and UI.
This tool enables you to move, rotate, or scale GameObjects, but is more specialized for 3D.
Another useful shortcut is the F key, which enables you to focus on a selected object. If you forget where a GameObject is in your Scene, select it in the Hierarchy. Then, move your cursor over the Scene view and press F to center it.
When you’re in the Scene view, you can also do the following:
Left click to select your GameObject in the Scene.
Middle click and drag to move the Scene view’s camera using the hand tool.
Turn on your HoloLens device.
Tap your wrist (HoloLens 2) or make a bloom gesture (HoloLens 1) to open the Start menu.
Open Settings > Update & Security.
Select the For Developers tab on the right-hand panel.
On your Project panel select Assets > MixedRealityToolkit.Examples > Demos.
Select the folder you want to see an example of (e.g., HandTracking, EyeTracking, ...).
Open the Scenes folder, then double-click a scene to open it.
You can press play to try out the scene in your editor window.
The Settings app on Android includes a screen called Developer options that lets you configure system behaviors that help you profile and debug your app performance. For example, you can enable debugging over USB, capture a bug report, enable visual feedback for taps, flash window surfaces when they update, use the GPU for 2D graphics rendering, and more.
On Android 4.1 and lower, the Developer options screen is available by default. On Android 4.2 and higher, you must enable this screen. To enable developer options, tap the Build Number option 7 times. You can find this option in one of the following locations, depending on your Android version:
Android 9 (API level 28) and higher: Settings > About Phone > Build Number
Android 8.0.0 (API level 26) and Android 8.1.0 (API level 27): Settings > System > About Phone > Build Number
Android 7.1 (API level 25) and lower: Settings > About Phone > Build Number
At the top of the Developer options screen, you can toggle the options on and off (figure 1). You probably want to keep this on. When off, most options are disabled except those that don't require communication between the device and your development computer.
Before you can use the debugger and other tools, you need to enable USB debugging, which allows Android Studio and other SDK tools to recognize your device when connected via USB. To enable USB debugging, toggle the USB debugging option in the Developer Options menu. You can find this option in one of the following locations, depending on your Android version:
Android 9 (API level 28) and higher: Settings > System > Advanced > Developer Options > USB debugging
Android 8.0.0 (API level 26) and Android 8.1.0 (API level 27): Settings > System > Developer Options > USB debugging
Android 7.1 (API level 25) and lower: Settings > Developer Options > USB debugging
The rest of this page describes some of the other options available on this screen.
Mixed Reality Toolkit (MRTK) provides a set of components and features to accelerate cross-platform Mixed Reality application development in Unity. MRTK includes:
UI and interaction building blocks.
Tools.
Example Scenes.
Short link:
Introduction to Mixed Reality Applications and Development.
Introduction to Mixed Reality Developer Tools and 3D Concepts.
Working with Hand Interactions and Controllers.
Eye and Head Gaze Tracking.
Spatial Visualization using Bing Maps.
Working with REST APIs.
Azure Spatial Anchors and Backend Services.
Displaying Spatial Anchors on a map.
Working with QR codes.
Working with Scene Understanding.
Getting Started with AI.
Project Discussion and Case Studies.
The HoloLens Seed project is a GitHub repository configured for Windows Mixed Reality development. The repo includes the Mixed Reality Toolkit and .gitignore files.
You can create a new project from the seed instead of downloading the different assets and setting up your git project. To use the seed project, you can clone it and set up your development environment, or download the repository content directly.
Whenever there is a new update for the included packages, this repo will be updated with the latest version. You can automatically get the latest packages by adding the seed repo as your upstream and pulling from it.
First, you need to download MRTK by going to the MRTK GitHub page and navigating to the Releases tab. Scroll down to the Assets section and download the tools:
For more advice on moving GameObjects in the Scene view, see in the Manual.
You can learn more about the components at: .
1 ) Make sure you have imported Microsoft.MixedReality.Toolkit.Unity.Foundation as a custom asset or through NuGet.
2 ) In the Unity Package Manager (UPM), install the following packages:
Android: AR Foundation Version 2.1.4, ARCore XR Plugin Version 2.1.2
iOS: AR Foundation Version 2.1.4, ARKit XR Plugin Version 2.1.2
3 ) Enabling the Unity AR camera settings provider.
The following steps presume use of the MixedRealityToolkit object. Steps required for other service registrars may be different.
1. Select the MixedRealityToolkit object in the scene hierarchy.
2. Select Copy and Customize to Clone the MRTK Profile to enable custom configuration.
3. Select Clone next to the Camera Profile.
4. In the Inspector panel, navigate to the Camera System section and expand the Camera Settings Providers section.
5. Click Add Camera Settings Provider and expand the newly added New camera settings entry.
6. Select the Unity AR Camera Settings provider from the Type drop down.
There are no additional steps after switching the platform for Android.
Unchecking Strip Engine Code is the short term solution to an error in Xcode #6646. We are working on a long term solution.
In the Unity menu, select File > Build Settings... to open the Build Settings window.
In the Build Settings window, select Universal Windows Platform and click the Switch Platform button.
Click Project Settings in the Build Settings window, or in the Unity menu select Edit > Project Settings..., to open the Project Settings window.
In the Project Settings window, select Player > XR Settings to expand the XR Settings.
In the XR Settings, check the Virtual Reality Supported checkbox to enable virtual reality, then click the + icon and select Windows Mixed Reality to add the Windows Mixed Reality SDK.
Your projects settings might have been configured by Mixed Reality Toolkit.
Optimize the XR Settings as follows:
Set Windows Mixed Reality Depth Format to 16-bit depth.
Check the Windows Mixed Reality Enable Depth Sharing checkbox.
Set Stereo Rendering Mode to Single Pass Instanced.
In the Project Settings window, select Player > Publishing Settings to expand the Publishing Settings. Scroll down to the Capabilities section and check the SpatialPerception checkbox.
Save your project and open the Build Settings. Click the Build button, not Build and Run. When prompted, create a new folder (e.g., HoloLensBuild) and select it as the folder to build your files into.
When your build is done, your file explorer will automatically open to the build folder you just created.
Common issues to consider while developing for Mixed Reality
Since a Mixed Reality application might have access to the user's video stream, developers might be able to save or share private information about the user. Be careful not to save any sensitive data or images anywhere other than the user's device. Never send sensitive information to any backend.
An iris scan is a more accurate identification method than a fingerprint. Since iris scan data can be used to identify and sign in a user, it should never leave the user's device. HoloLens 2 does not send the iris scan to the cloud and does not give access to the data.
Eye tracking, while a very useful tool for making your application more accessible, can also be used to collect data about the user's attention, and potentially to manipulate it.
Unity versions are not backward compatible. If you decide to open a project in a newer version, Unity will try to update your project automatically, but it is not guaranteed that the newer version will work with your imported assets. There might be incompatibilities between your assets or code and the new version.
Let's take the latest version in the image below, 2019.3.5f:
2019 is the year the Unity version was released. Major changes are issued once a year, and they can break your application. For now, stick to the same year version unless you are creating a new application from scratch. We will talk about how to update your project to the latest version in the following lessons.
3 is the third iteration in 2019. When a version updates from 2 to 3, there can be minor breaking changes. Make sure to read the changelog before updating your project from 2 to 3.
.5f is for bug fixes. These are usually small fixes that do not break your code or the APIs you use. Feel free to update your project from 2019.3.4f to 2019.3.5f.
In Unity Hub, under the Projects tab, you can select the Unity version drop-down for your application and select a newer version of Unity. Unity will confirm your choice before updating your project. It is a good idea to save a version of your project as a new branch on GitHub, in case you need to revert.
Here is a detailed article about the subject:
Introduction to Mixed Reality Developer Tools and 3D Concepts
Short link: aka.ms/UnityMixedRealityDeveloperTools
In this section, we will go through the developer tools and how to get started with debugging our applications.
The second part of the course focuses on creating and using 3D assets in your applications.
Mixed Reality getting started resources
Windows Mixed Reality Docs: aka.ms/MixedRealityDocs
Mixed Reality Curriculum Youtube Playlist: aka.ms/MixedRealityCurriculumVideos
Mixed Reality Resources Repository: aka.ms/MixedRealityResourcesRepository
HoloLens Seed Project Repository: aka.ms/HoloLensSeedProject
Code Samples: aka.ms/MixedRealityUnitySamples
Mixed Reality Development Tools to install: https://aka.ms/HoloLensToolInstalls
Eliminate Texture Confusion: Bump, Normal and Displacement Maps: https://www.pluralsight.com/blog/film-games/bump-normal-and-displacement-maps
Normal vs. Displacement Mapping & Why Games Use Normals: https://cgcookie.com/articles/normal-vs-displacement-mapping-why-games-use-normals
Live editing WebGL shaders with Firefox Developer Tools: https://hacks.mozilla.org/2013/11/live-editing-webgl-shaders-with-firefox-developer-tools/
Reuse a model instance instead of creating a new model wherever you can.
A 3D model is a digital representation of a real-world object. Representing a 3D object requires you to get to know the parts that make up the 3D object.
Polygonal modeling is an approach for modeling objects by representing or approximating their surfaces using polygon meshes.
Objects created with polygon meshes must store different types of elements. These include vertices, edges, faces, polygons and surfaces.
The more edges and faces a model has, the more detailed it looks. On the other hand, a high polygon count reduces the performance of your app, because the calculations needed to render the model are expensive.
Debugging is the process of finding and resolving defects or problems within a computer program that prevent correct operation of the software.
Debugging tactics can involve:
Interactive debugging.
Control flow analysis.
Monitoring at the application or system level.
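In Unity, the simplest starting point for interactive debugging is logging to the Console; here is a minimal sketch using only standard Unity APIs. You can also attach Visual Studio's C# debugger to the Unity Editor and set breakpoints, as covered in the debugging links in the resources section.

```csharp
using UnityEngine;

public class DebugExample : MonoBehaviour
{
    private void Start()
    {
        // Appears in the Unity Console window and in player/device logs.
        Debug.Log("Scene started");
    }

    private void Update()
    {
        if (transform.position.y < -10f)
        {
            // Warnings and errors are color-coded in the Console.
            Debug.LogWarning($"{name} fell below the floor at {transform.position}");
        }
    }
}
```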
Mixed Reality Toolkit (MRTK) supports in-editor input simulation. Simply run your scene by clicking Unity's Play button, then use these keys to simulate input.
Press W, A, S, D keys to move the camera.
Hold the right mouse button and move the mouse to look around.
To bring up the simulated hands, press the Space bar (right hand) or the left Shift key (left hand).
To keep simulated hands in the view, press the T or Y key.
To rotate simulated hands, press Q or E (horizontal) / R or F (vertical).
In this section we will install the Windows Mixed Reality developer tools and learn how to use them.
The HoloLens Emulator lets you test holographic applications on your PC without a physical HoloLens. It also includes the HoloLens development toolset.
You can download the latest HoloLens Emulator update here: bit.ly/emulator2.
Before installing the emulator, make sure your PC meets the following hardware requirements:
Windows 10 Home Edition does not support Hyper-V or the HoloLens Emulator. The HoloLens 2 Emulator requires the Windows 10 October 2018 update or later.
Developer Tools and 3D assets resources
Asset creation tools: https://github.com/Yonet/MixedRealityResources#asset-creation-tools.
Asset Libraries: https://github.com/Yonet/MixedRealityResources#asset-libraries.
Debugging C# code in Unity: https://docs.unity3d.com/Manual/ManagedCodeDebugging.html
Unity IL2CPP debugging: https://aka.ms/AA7qap4.
Hand interaction is a very natural way to interact with 3D models. Since we interact with and modify real objects using our hands, a new user can start interacting with your application without having to learn its interface first.
Working with Hand Interactions.
Short link: aka.ms/UnityHandInteractions
In this section, we will look into hand interactions as an input for our application.
Hand interactions are currently available only for HoloLens 2 and Oculus devices.
In the project section, we will create our first hand interactions to scale, move, and rotate objects.
The Windows Device Portal for HoloLens lets you configure and manage your device remotely over Wi-Fi or USB. The Device Portal is a web server on your HoloLens that you can connect to from a web browser on your PC. The Device Portal includes many tools that will help you manage your HoloLens and debug and optimize your apps.
Turn on your HoloLens device.
Tap your wrist (HoloLens 2) or make a bloom gesture (HoloLens 1) to open the Start menu.
Open Settings > Update & Security.
Select the For Developers tab on the right-hand panel.
Enable "Use developer features" by toggling the on/off button.
Scroll down in the For Developers settings and enable "Device Portal".
Go back to the main Settings page by clicking "Home" on the left-hand panel and select "Network & Internet" settings.
Select the "Wi-Fi" tab on the left, if it is not already selected.
Select the Wi-Fi network you are connected to and click on "Advanced Options".
Scroll down and write down the IPV4 address.
You will type this IP address into your browser to reach your device portal.
You might see a connection Alert as shown below:
Go ahead and click the Advanced button, then click Proceed to <your IP address> (unsafe).
Congrats, you made it to your device portal.
Click Views on the right hand panel and select "Live Preview" to see the camera view of your HoloLens.
You can turn off PV camera if you would like to share or record what you are seeing through your HoloLens but do not want to capture your environment.
You can see the videos you recorded or the screenshots you snapped by asking Cortana here, in the Videos and Photos section, if you enabled voice commands.
The HandInteractionExamples.unity example scene contains various types of interactions and UI controls that highlight articulated hand input.
To try the hand interaction scene, first open the HandInteractionExamples scene under Assets\MixedRealityToolkit.Examples\Demos\HandTracking\Scenes\HandInteractionExamples
This example scene uses TextMesh Pro. If you receive a prompt asking you to import TMP Essentials, select the Import TMP Essentials button. Some of the MRTK examples use TMP Essentials for improved text rendering. After you select Import TMP Essentials, Unity will then import the package.
After Unity completes the import, close the TMP Importer window and reload the scene. You can reload the scene by double-clicking the scene in the Project window.
After the scene is reloaded, press the Play button.
To grab and move an object, first ensure that the Manipulation Handler (Script) and Near Interaction Grabbable (Script) components are added to the object. The Manipulation Handler (Script) allows you to manipulate an object, while the Near Interaction Grabbable (Script) allows the object to respond to near hand interactions.
To add the scripts to the object, first select the object in the Hierarchy window. In the Inspector window, click Add Component and search for each script. Once found, select the script to add to the object.
With the object selected, in the Inspector window, navigate to the Manipulation Handler (Script) component to modify the component's parameters.
You can move an object using one or two hands. This setting is dependent on the Manipulation Type parameter. The Manipulation Type can be limited to either:
One Handed Only
Two Handed Only
One and Two Handed
Select the preferred Manipulation Type so that the user is restricted to one of the available manipulation types.
You can now test grabbing and moving the object using the in-editor simulation. Press the Play button to enter Game mode. Once in Game mode, hold the space bar to bring up the hand and use the mouse to grab and move the object.
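If you prefer to add and configure these components from code instead of the Inspector, the following sketch assumes the MRTK 2.x ManipulationHandler and NearInteractionGrabbable classes; verify the property and enum names against the MRTK version you imported.

```csharp
using Microsoft.MixedReality.Toolkit.Input;
using Microsoft.MixedReality.Toolkit.UI;
using UnityEngine;

public class MakeGrabbable : MonoBehaviour
{
    private void Start()
    {
        // A collider is required so hand rays and fingertips can hit the object.
        if (GetComponent<Collider>() == null)
        {
            gameObject.AddComponent<BoxCollider>();
        }

        // Allows the object to respond to near (grab) hand interactions.
        gameObject.AddComponent<NearInteractionGrabbable>();

        // Allows the object to be moved; restrict to one- or two-handed as needed.
        var handler = gameObject.AddComponent<ManipulationHandler>();
        handler.ManipulationType = ManipulationHandler.HandMovementType.OneAndTwoHanded;
    }
}
```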
You can configure an object to play a sound when the user touches it by adding a touch trigger event to the object.
To be able to trigger touch events, the object must have the following components:
Collider component, preferably a Box Collider
Near Interaction Touchable (Script) component
Hand Interaction Touch (Script) component
To add audio feedback, first add an Audio Source component to the object. The audio source component enables you to play audio back in the scene. In the Hierarchy window, select the object and click Add Component in the Inspector window. Search for Audio Source to add the Audio Source component.
Once the Audio Source component has been added to the object, in the Inspector window, change the Spatial Blend property to 1 to enable spatial audio.
Next, with the object still selected, click Add Component and search for the Near Interaction Touchable (Script). Once found, select the component to add to the object. Near interactions come in the form of touches and grabs, which are interactions that occur when the user is in close proximity to an object and interacts with their hands.
After the Near Interaction Touchable (Script) is added to the object, click the Fix Bounds and Fix Center buttons. This will update the Local Center and Bounds properties of the Near Interaction Touchable (Script) to match the BoxCollider.
With the object still selected, click Add Component and search for the Hand Interaction Touch (Script). Once found, select the component to add to the object.
To make audio play when the object is touched, you will need to add an On Touch Started event to the Hand Interaction Touch (Script) component. In the Inspector window, navigate to the Hand Interaction Touch (Script) component and click the small + icon to create a new On Touch Started () event.
Drag the object to receive the event and define AudioSource.PlayOneShot as the action to be triggered. PlayOneShot will play the audio clip.
Next, assign an audio clip to the trigger. You can find audio clips provided by MRTK by navigating to Assets > MixedRealityToolkit.SDK > StandardAssets > Audio. Once you've found a suitable audio clip, assign the audio clip to the Audio Clip field.
You can now test the touch interaction using the in-editor simulation. Press the Play button to enter Game mode. Once in Game mode, hold the spacebar to bring up the hand and use the mouse to touch the object and trigger the sound effect.
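For reference, a similar touch-to-sound behavior can be scripted. The sketch below assumes MRTK 2.x's IMixedRealityTouchHandler interface and uses NearInteractionTouchableVolume, a volume-based alternative to the planar Near Interaction Touchable used above; treat the exact type names as assumptions to verify.

```csharp
using Microsoft.MixedReality.Toolkit.Input;
using UnityEngine;

[RequireComponent(typeof(AudioSource))]
public class TouchSound : MonoBehaviour, IMixedRealityTouchHandler
{
    // Assign an audio clip (for example, one of the MRTK audio assets) in the Inspector.
    [SerializeField] private AudioClip touchClip;

    private AudioSource audioSource;

    private void Start()
    {
        audioSource = GetComponent<AudioSource>();
        audioSource.spatialBlend = 1f; // fully spatialized audio

        // Touch needs a collider plus a touchable component registered on the object.
        if (GetComponent<Collider>() == null)
        {
            gameObject.AddComponent<BoxCollider>();
        }
        gameObject.AddComponent<NearInteractionTouchableVolume>();
    }

    public void OnTouchStarted(HandTrackingInputEventData eventData)
    {
        // Plays the clip once each time a finger starts touching the object.
        audioSource.PlayOneShot(touchClip);
    }

    public void OnTouchCompleted(HandTrackingInputEventData eventData) { }

    public void OnTouchUpdated(HandTrackingInputEventData eventData) { }
}
```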
Mixed Reality Toolkit is equipped with a variety of button prefabs that you could add to your project. A prefab is a pre-configured GameObject stored as a Unity Asset and can be reused throughout your project.
You can find button prefabs available in MRTK by navigating to MixedRealityToolkit.SDK > Features > UX > Interactable > Prefabs.
In this project, you will learn how to change the color of a cube when a button is pressed.
First, select the button of your choice from the Project window and drag into the Hierarchy window.
Change the button's Transform Position so that it's positioned in front of the camera, at X = 0, Y = 0, and Z = 0.5.
Next, right click on an empty spot in the Hierarchy window and click 3D Object > Cube.
With the Cube object still selected, in the Inspector window, change the Transform Position so that the cube is located near but not overlapping the button. In addition, resize the cube by changing the Transform Scale.
In the Hierarchy window, select the button. In the Inspector window, navigate to the Interactable (Script) component.
In the Events section, expand the Receivers section.
Click the Add Event button to create a new event receiver of Event Receiver Type InteractableOnPressReceiver.
For the newly created InteractableOnPressReceiver event, change the Interaction Filter to Near and Far.
From the Hierarchy window, click and drag the Cube GameObject into the Event Properties object field for the On Press() event to assign the Cube as a receiver of the On Press () event.
Next, click the action dropdown (currently assigned No Function) and select MeshRenderer > Material material. This action will set the Cube's material property to change when the button is pressed.
Now, assign a color for the Cube to change to when the button is pressed. Click the small circle icon next to the Material field (currently assigned None (Material)) to open the Select Material window.
MRTK provides a variety of materials that can be used in your projects. In the search bar, search for MRTK_Standard and select your color of choice.
Now that the event is configured for when the button is pressed, you now need to configure an event that occurs when the button is released. For the On Release () event, click and drag the Cube GameObject into the Event Properties.
Next, click the action dropdown (currently assigned No Function) and select MeshRenderer > Material material. This action will set the Cube's material property to change when the button is released.
Now, assign a color for the Cube to change to when the button is released. Click the small circle icon next to the Material field (currently assigned None (Material)) to open the Select Material window and search for MRTK_Standard. Select your choice of color.
Now that both the On Press () and On Release () events are configured for the button, press Play to enter Game mode and test the button in the in-editor simulator.
To press the button, press the space bar + mouse scroll forward.
To release the button, press the space bar + mouse scroll backward.
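As an alternative to selecting MeshRenderer > Material in the event dropdowns, you can wire the events to a small helper script. The sketch below is hypothetical and uses only standard Unity APIs: attach it to the Cube, then pick ColorOnPress.SetPressedColor for On Press () and ColorOnPress.SetReleasedColor for On Release ().

```csharp
using UnityEngine;

// Hypothetical helper: attach to the Cube and wire the public methods
// to the button's On Press () and On Release () events in the Interactable component.
public class ColorOnPress : MonoBehaviour
{
    [SerializeField] private Color pressedColor = Color.red;
    [SerializeField] private Color releasedColor = Color.white;

    private MeshRenderer meshRenderer;

    private void Awake()
    {
        meshRenderer = GetComponent<MeshRenderer>();
    }

    public void SetPressedColor()
    {
        // Changes the instance material color while the button is pressed.
        meshRenderer.material.color = pressedColor;
    }

    public void SetReleasedColor()
    {
        meshRenderer.material.color = releasedColor;
    }
}
```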
You can organize any objects in Unity into a grid by using an object collection script. In this example, you will learn how to organize nine 3D objects into a 3 x 3 grid.
First, configure your Unity scene for the Mixed Reality Toolkit. Next, in the Hierarchy window, right click in an empty space and select Create Empty. This will create an empty GameObject. Name the object CubeCollection.
In the Inspector window, position CubeCollection so that the collection displays in front of the user (for example, X = 0, Y = -0.2, Z = 2).
With CubeCollection still selected, in the Hierarchy window, create a child Cube object. Change the scale of the object to x = .25, y = .25, z = .25.
Duplicate the child Cube object 8 times so that there is a total of 9 Cube child objects within the CubeCollection object.
In the Hierarchy window, select CubeCollection. In the Inspector window, click Add Component and search for the Grid Object Collection (Script). Once found, select the component to add to the object.
Configure the Grid Object Collection (Script) component by changing the Sort Type property to Child Order. This will ensure that the child objects (the 9 Cube objects) are sorted in the order you placed them under the parent object.
Click Update Collection to apply the new configuration.
You can adjust the parameters within the Grid Object Collection (Script) component to further customize the grid. For example, you could change the number of rows to 2 by changing the value in the Num Rows properties. Be sure to click Update Collection to apply the new configuration.
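The same grid can also be built from code. The sketch below assumes MRTK 2.x's GridObjectCollection API; the SortType, Rows, and UpdateCollection members are assumptions to verify against your MRTK version.

```csharp
using Microsoft.MixedReality.Toolkit.Utilities;
using UnityEngine;

public class CubeGridBuilder : MonoBehaviour
{
    private void Start()
    {
        // Parent object that will hold the grid.
        var collection = new GameObject("CubeCollection");
        collection.transform.position = new Vector3(0f, -0.2f, 2f);

        // Create 9 child cubes.
        for (int i = 0; i < 9; i++)
        {
            GameObject cube = GameObject.CreatePrimitive(PrimitiveType.Cube);
            cube.transform.SetParent(collection.transform, false);
            cube.transform.localScale = Vector3.one * 0.25f;
        }

        // Lay the children out in a 3 x 3 grid, sorted by child order.
        var grid = collection.AddComponent<GridObjectCollection>();
        grid.SortType = CollationOrder.ChildOrder;
        grid.Rows = 3;
        grid.UpdateCollection();
    }
}
```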
To rotate and scale an object, first ensure that the Manipulation Handler (Script) and Near Interaction Grabbable (Script) components are added to the object. The Manipulation Handler (Script) allows you to manipulate an object, while the Near Interaction Grabbable (Script) allows the object to respond to near hand interactions.
To add the scripts to the object, first select the object in the Hierarchy window. In the Inspector window, click Add Component and search for each script. Once found, select the script to add to the object.
With the object selected, in the Inspector window, navigate to the Manipulation Handler (Script) component to modify the component's parameters.
You can rotate an object using one or two hands. This setting is dependent on the Manipulation Type parameter. The Manipulation Type can be limited to either:
One Handed Only
Two Handed Only
One and Two Handed
Select Two Handed Only for Manipulation Type so that the user can only manipulate the object with two hands.
To limit the two handed manipulation to rotating and scaling, change Two Handed Manipulation Type to Rotate Scale.
To limit whether the object can be rotated on the x, y or z axis, change Constraint on Rotation to your preferred axis.
You can now test rotating and scaling the object using the in-editor simulation. Press the Play button to enter Game mode. Once in Game mode, press T and Y on the keyboard to toggle both hands. This will permanently display both hands in Game mode. Press the space bar to move the right hand and use left mouse click + Shift to move the left hand. While either controlling the left or right hand, use the mouse to rotate and scale the object.
Gestures are input events based on human hands.
There are two types of devices that raise gesture input events in Mixed Reality Toolkit(MRTK):
Windows Mixed Reality devices such as HoloLens. This describes pinching motions ("Air Tap") and tap-and-hold gestures.
The WindowsMixedRealityDeviceManager wraps the Unity XR.WSA.Input.GestureRecognizer to consume Unity's gesture events from HoloLens devices.
Touch screen devices.
The UnityTouchController wraps the Unity Touch class that supports physical touch screens.
Both of these input sources use the Gesture Settings profile to translate Unity's Touch and Gesture events respectively into MRTK's Input Actions. This profile can be found under the Input System Settings profile.
Bounding boxes make it easier and more intuitive to manipulate objects with one hand for both near and far interaction by providing handles that can be used for scaling and rotating. A bounding box will show a cube around the hologram to indicate that it can be interacted with. The bounding box also reacts to user input.
You can add a bounding box to an object by adding the BoundingBox.cs script as a component of the object.
To add the Bounding Box (Script) component to an object, first select the object in the Hierarchy window. In the Inspector window, click Add Component and search for Bounding Box.
Select the Bounding Box script to apply the component to the object. The bounding box is only visible in Game mode. Press play to view the bounding box. By default, the HoloLens 1st gen style is used.
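Adding the same behavior from code is a one-liner per component; here is a minimal sketch assuming the MRTK 2.x BoundingBox class.

```csharp
using Microsoft.MixedReality.Toolkit.UI;
using UnityEngine;

public class AddBoundingBox : MonoBehaviour
{
    private void Start()
    {
        // BoundingBox uses the object's collider to size its handles.
        if (GetComponent<BoxCollider>() == null)
        {
            gameObject.AddComponent<BoxCollider>();
        }

        // Draws grab, rotate, and scale handles around the object (visible in Game mode).
        gameObject.AddComponent<BoundingBox>();
    }
}
```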
To reflect the MRTK bounding box style, you need to change the parameters inside the Handles section of the Bounding Box (Script) component.
You can change the color of the handles by assigning a material to the Handle Material property.
In the Handles section, click the circle icon to open the Select Material window.
In the Select Material window, search for BoundingBoxHandleWhite. Once found, select to assign the color to the handle material.
When you press play, the handle colors for the bounding box will be white.
You can change the color of the handles when an object is grabbed by assigning a material to the Handle Grabbed Material property.
In the Handles section, click the circle icon to open the Select Material window.
In the Select Material window, search for BoundingBoxHandleBlueGrabbed. Once found, select to assign the color to the handle material.
When you press play, grab one of the handles of the bounding box. The color of the handle will change to blue.
You can change the scale handles in corners by assigning a scale handle prefab in the Scale Handle Prefab and Scale Handle Slate Prefab (for 2D slate) parameters.
First, assign a prefab to the Scale Handle Prefab. In the Handles section, click the circle icon to open the Select GameObject window.
In the Select GameObject window, switch to the Assets tab and search for MRTK_BoundingBox_ScaleHandle. Once found, select to assign the prefab to the scale handle.
Next, assign a prefab to the Scale Handle Slate Prefab. In the Handles section, click the circle icon to open the Select GameObject window.
In the Select GameObject window, switch to the Assets tab and search for MRTK_BoundingBox_ScaleHandle_Slate. Once found, select to assign the prefab to the scale handle.
When you press play, grab one of the handles of the bounding box to see the change in how the scale handles look.
You can change the rotation handles by assigning a rotation handle prefab in the Rotation Handle Prefab parameter.
In the Handles section, click the circle icon to open the Select GameObject window.
In the Select GameObject window, switch to the Assets tab and search for MRTK_BoundingBox_RotateHandle. Once found, select it to assign the prefab to the rotation handle.
When you press play, grab one of the handles of the bounding box to see the change in how the rotation handles look.
MRTK uses what are known as Solvers to allow UI elements to follow the user or other game objects in the scene. The Radial View solver is a tag-along component that keeps a particular portion of a GameObject within the user's view.
You can make a button follow your hand by adding the Radial View (Script) component to the object.
First, drag a button prefab from MixedRealityToolkit.SDK > Features > UX > Interactable > Prefabs to the Hierarchy window.
In the Hierarchy window, select the button prefab. In the Inspector window, click Add Component. Search for Radial View. Once found, select to add the component to the button.
When you add the Radial View (Script) component to the button, the Solver Handler (Script) component is added as well because it is required by the Radial View (Script).
The Solver Handler (Script) component needs to be configured so that the button follows the user's hand. First, change Tracked Target Type to Hand Joint. This will enable you to define which hand joint the button follows.
Next, for the Solver Handler (Script) component, change Tracked Handness to Right. This setting determines which hand is tracked.
There are over 20 hand joints available for tracking. Still inside the Solver Handler (Script) component, change Tracked Hand Joint to Wrist so that the button tracks the user's wrist.
Now that the hand tracking is configured, you need to configure the Radial View (Script) component to further define where the button is located and how it is viewed in relation to the user. First, change Reference Direction to Facing World Up. This parameter determines which direction the button faces.
Next, in the Radial View (Script) component, change the Min Distance and Max Distance to 0. The Min and Max Distance parameters determine how far the button should be kept from the user. As a reminder, the unit of measurement in Unity is meters. Therefore, a Min Distance of 1 would push the button away to ensure it is never closer than 1 meter to the user.
Now that the button is configured to follow your right wrist, press Play to enter Game mode and test the solver in the in-editor simulator. Press and hold the space bar to bring up the hand. Move the mouse cursor around to move the hand, and click and hold the left mouse button to rotate the hand:
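The equivalent setup in code looks roughly like the sketch below; the SolverHandler and RadialView member names follow MRTK 2.x and should be treated as assumptions to verify against your installed version.

```csharp
using Microsoft.MixedReality.Toolkit.Utilities;
using Microsoft.MixedReality.Toolkit.Utilities.Solvers;
using UnityEngine;

public class FollowRightWrist : MonoBehaviour
{
    private void Start()
    {
        // RadialView requires a SolverHandler on the same object.
        var handler = gameObject.AddComponent<SolverHandler>();
        handler.TrackedTargetType = TrackedObjectType.HandJoint; // follow a hand joint
        handler.TrackedHandness = Handedness.Right;               // track the right hand
        handler.TrackedHandJoint = TrackedHandJoint.Wrist;        // specifically the wrist

        var radialView = gameObject.AddComponent<RadialView>();
        radialView.MinDistance = 0f; // distances are in meters
        radialView.MaxDistance = 0f;
    }
}
```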
Eye and Head Gaze Tracking.
Code Samples: https://aka.ms/MixedRealityUnitySamples
A good visualization allows users to understand data better by seeing the data points in the right context. Check out some of the examples below to see what a visualization provides that you would have a hard time understanding just from the raw data points.
Small arms and ammunition import and export interactive visualization:
using .
Compare Covid-19 Data tab and map tab to see the difference it makes in your perception:
Wind and weather visualizations:
Chrome experiments with Globe:
With spatial data you can discover growth insights, manage facilities and networks, and provide location information to customers. Without considering spatial components and how they relate to your business, your risk of poor results increases.
Spatial analysis allows you to solve complex location-oriented problems and better understand where and what is occurring in your world. It goes beyond mere mapping to let you study the characteristics of places and the relationships between them. Spatial analysis lends new perspectives to your decision-making.
Spatial Visualization using Bing Maps with HoloLens 2 and Windows Mixed Reality Headsets.
Shortlink: aka.ms/UnityBingMapsVisualizationLesson
This project is for HoloLens 2 and Windows Mixed Reality Headsets.
In this project, we will create a 3D Map visualization using Bing Maps Unity SDK: aka.ms/BingMapsUnitySDK.
Outings, a sample app created with the Bing Maps SDK, can be found in the Microsoft Store for PC and HoloLens 1: aka.ms/OutingsHoloLens1
We will build the app shown in the video below for HoloLens 2. You can also build it for Windows Mixed Reality headsets and use hand controllers instead of hand gestures.
Maps SDK, a Microsoft Garage project, provides a control to visualize a 3D map in Unity. The map control handles streaming and rendering of 3D terrain data with world-wide coverage. Select cities are rendered at a very high level of detail. Data is provided by Bing Maps.
The map control has been optimized for mixed reality applications and devices including the HoloLens, HoloLens 2, Windows Immersive headsets, HTC Vive, and Oculus Rift. Soon the SDK will also be provided as an extension to the Mixed Reality Toolkit (MRTK).
A Bing Maps developer key is required to enable the mapping functionality of the SDK.
Sign in to the Bing Maps Dev Center.
For new accounts, follow the instructions at Creating a Bing Maps Account.
Select My keys under My Account, and select the option to create a new key.
Provide the following required information to create a key:
Application name: The name of the application.
Key type: Basic or Enterprise. Key types are explained here.
Application type: Select Other Public Mobile App.
Click the Create button. The new key displays in the list of available keys. This key will be used later when setting up the Unity project.
See the Understanding transactions page for more details about transaction accounting.
HoloLens 2 and Windows Mixed Reality Headset project using Bing Maps SDK
In this project we will create a 3D map visualization as shown in the video below:
Follow along with the next steps, or answer the questions below to see if you can skip some of them.