In this lesson, you will learn about the basic concepts of Mixed Reality and explore its applications in different industries.
Read through the questions below. If you feel comfortable with the answers, feel free to skip ahead to the project section or the next chapters.
Augmented Reality (AR) is defined as a technology that superimposes a computer-generated image on a user's view of the real world, providing a composite view. Augmented Reality experiences are not limited to visual additions to our world: you can create augmented experiences that add only audio to your physical world, or both audio and visuals.
Augmented Reality experiences are also not limited to headsets like HoloLens. Today, millions of mobile devices have depth-sensing capabilities to augment your real world with digital information.
Virtual Reality (VR) is when you are completely immersed in a virtual world by wearing a headset. In Virtual Reality you lose your visual connection to the real world. Virtual Reality applications are great for training and for simulations where users benefit from total immersion that replicates a real-life situation. Some examples include training for firefighters and emergency room healthcare providers, and flight simulations.
In this project we will set up our development environment for Mixed Reality development with Unity3D.
Check your knowledge by answering the questions below before you move into the project. Feel free to skip sections you feel comfortable with, but make sure you read through the first download section to confirm you have all the necessary modules.
Mixed Reality Toolkit (MRTK) provides a set of components and features to accelerate cross-platform Mixed Reality application development in Unity. MRTK includes:
UI and interaction building blocks.
Tools.
Example Scenes.
You can learn more about the components at: aka.ms/MRTKGuides.
The first revolution in computing happened with the creation of mainframe computers: computers that, at times, occupied a whole room. Mainframes were used by large organizations such as NASA for critical data-processing applications.
The second wave of computing is defined by the Personal Computer (PC) becoming widely available.
We believe the third wave of computing will include many more devices that manage data, including IoT sensors and Mixed Reality devices.
We have more data than ever before. To process that data and make informed decisions, we need access to it at the right time and in the right place. Mixed Reality can bring that data into our context: the real world.
Design & Prototyping: Enables real-time collaborative iteration of 3D physical and virtual models across cross-functional teams and stakeholders.
Training & Development: Provides instructors with better tools to facilitate teaching and coaching sessions. It offers trainees an enhanced and engaging learning experience through 3D visualizations and interactivity.
Geospatial Planning: Enables the assessment and planning of indoor and outdoor environments (e.g., future construction sites, new store locations, interior designs), removing the need for manual execution.
Sales Assistance: Improves the effectiveness of individuals in sales-oriented roles by providing tools such as 3D catalogs and virtual product experiences that increase customer engagement and strengthen buyer confidence.
Field Service: Improves the first-visit resolution and customer satisfaction of customer support issues. It is typically used for complex products that would otherwise require a field visit. It can serve as a platform for targeted up-sell opportunities, as well.
Productivity & Collaboration: Transforms the space around you into a shared augmented workplace. Remote users can collaborate, search, brainstorm, and share content as if they were in the same room.
Unity Introduction.
Before you get started with developing for Mixed Reality for Unity, make sure to check everything in the below list and follow the instructions for each download.
Not following the instructions for a specific download might result in errors while developing or building your application. Before you try to debug, check the list and the detailed instructions.
Install the most recent version of Windows 10 Education or Pro Education so your PC's operating system matches the platform for which you are building mixed reality applications.
You can check your Windows version by typing "about" in the Windows search bar and selecting About your PC as shown in the below image.
You can learn more about upgrading your Windows 10 Home to Pro at aka.ms/WinHome2Pro.
We need to install and enable Hyper-V, which does not work on Windows Home. Make sure to upgrade to Education, Pro Education, Pro or Enterprise versions.
Go to the https://unity3d.com/get-unity/download page and download Unity Hub instead of the Unity Editor.
In general, do not use beta software until you feel very comfortable with debugging, with the software itself, and with finding your way around GitHub issues and Stack Overflow. Don't learn this lesson the hard way! I have already tried that, for your benefit and/or out of my own optimism.
Unity Hub allows you to download multiple Unity Editors and organize your projects in one place. Since Unity upgrades are not backward compatible, you should open a project with the same Unity version it was created with. You can update a project to the latest Unity version, but that usually requires a lot of debugging; the easiest way to get going with a project is to keep the same version. I will show you how to update and debug your projects later in this chapter.
You will need to download the Windows development-related modules along with your Unity Editor. Make sure Universal Windows Platform Build Support and Windows Build Support are checked while downloading the Unity Editor through Unity Hub, or add them afterwards by modifying the install.
You can add modules or check if you have them in your editor by clicking on the hamburger button for the Unity Editor version and checking the above module check-boxes.
If you would like to build for an Android or iOS mobile device, make sure the related modules are checked as well.
You can download Visual Studio by adding the Microsoft Visual Studio 2019 module to your Unity Editor as shown in the previous step, or download it at aka.ms/VSDownloads.
Make sure to download the Mixed Reality related workloads along with Visual Studio.
You can always add the necessary workloads to Visual Studio after download:
In this section, you will learn the Unity3D interface, tools, and keyboard shortcuts.
The Unity Editor has four main sections:
This is where you can edit the current Scene by selecting and moving objects in the 3D space for the game. In this kit, the game level is contained in one Scene.
This is a list of all the GameObjects in a Scene. Every object in your game is a GameObject. These can be placed in a parent-child hierarchy, which lets you group objects — this means that when the parent object is moved, all of its children will move at the same time.
This displays all settings related to the currently selected object. You will explore this window more during the walkthrough.
This is where you manage your Project Assets. Assets are the media files used in a Project (for example, images, 3D models and sound files). The Project window acts like a file explorer, and it can be used to explore and create folders on your computer. When the walkthrough asks you to find an Asset at a given file path, use this window.
TIP: If your Editor layout doesn’t match the image above, use the layout drop-down menu at the top right of the toolbar to select Default.
The toolbar includes a range of useful tool buttons to help you design and test your game.
Play is used to test the Scene which is currently loaded in the Hierarchy window, and enables you to try out your game live in the Editor.
Pause, as you have probably guessed, allows you to pause the game playing in the Game window. This helps you spot visual problems or gameplay issues that you wouldn’t otherwise see.
Step is used to walk through the paused Scene frame by frame. This works really well when you’re looking for live changes in the game world that it would be helpful to see in real time.
These tools move and manipulate the GameObjects in the Scene view. You can click on the buttons to activate them, or use a shortcut key.
You can use this tool to move your Scene around in the window. You can also use middle click with the mouse to access the tool.
This tool enables you to select items and move them individually.
Select items and rotate them with this tool.
Tool to scale your GameObjects up and down.
This tool does lots of things. Essentially, it combines moving, scaling and rotation into a single tool that’s specialized for 2D and UI.
This tool enables you to move, rotate, or scale GameObjects, but is more specialized for 3D.
Another useful shortcut is the F key, which enables you to focus on a selected object. If you forget where a GameObject is in your Scene, select it in the Hierarchy. Then, move your cursor over the Scene view and press F to center it.
When you’re in the Scene view, you can also do the following:
Left click to select your GameObject in the Scene.
Middle click and drag to move the Scene view’s camera using the hand tool.
For more advice on moving GameObjects in the Scene view, see Scene View Navigation in the Manual.
Introduction to Mixed Reality Applications and Development
Short link: aka.ms/UnityIntroToMixedReality
In the Project section, you will set up your first Mixed Reality project using Unity and the Mixed Reality Toolkit.
You can jump directly into setting up your first project in the How to get started with mixed reality development using Unity3D section.
Let’s review some key concepts, which will help you as you begin to explore editing scripts for mixed reality development.
In Unity, areas of the game that a player can interact with are generally made up of one or more Scenes. Small games may only use one Scene; large ones could have hundreds.
Every Unity project you create comes with a SampleScene that has a light and a camera.
You can create a new scene by right-clicking in the Assets tab and selecting Create > Scene. Organizing scenes under a Scenes folder is purely for organizational purposes.
You can use scenes to organize navigation inside your application or to add different levels to a game.
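If you later want to switch between scenes from code (for example, to move from a menu into a level), Unity's SceneManager can load a scene by name. Below is a minimal sketch; the scene name and key binding are made up for illustration, and any scene loaded this way must first be added to the Build Settings scene list.

using UnityEngine;
using UnityEngine.SceneManagement;

public class LevelLoader : MonoBehaviour
{
    // Loads a scene by name; call this from a menu button or another script.
    public void LoadLevel(string sceneName)
    {
        SceneManager.LoadScene(sceneName);
    }

    void Update()
    {
        // Example only: press L to jump into a hypothetical "Level1" scene.
        if (Input.GetKeyDown(KeyCode.L))
        {
            LoadLevel("Level1");
        }
    }
}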
Every object in the game world exists as a GameObject in Unity. GameObjects are given specific features by giving them appropriate components which provide a wide range of different functionality.
When you create a new GameObject, it comes with a Transform component already attached. This component controls the GameObject’s positional properties in the 3D (or 2D) gamespace. You need to add all other components manually in the Inspector.
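As a quick illustration (the object name, values, and components below are made up), here is a sketch of creating a GameObject from code, positioning it through its Transform, parenting it, and adding components the same way you would in the Inspector:

using UnityEngine;

public class GameObjectExample : MonoBehaviour
{
    void Start()
    {
        // A new GameObject always comes with a Transform component attached.
        GameObject hologram = new GameObject("Hologram");

        // The Transform controls position, rotation, and scale in 3D space.
        hologram.transform.position = new Vector3(0f, 1.5f, 2f);
        hologram.transform.rotation = Quaternion.Euler(0f, 45f, 0f);
        hologram.transform.localScale = Vector3.one * 0.5f;

        // Parent it under this object; children move with their parent.
        hologram.transform.SetParent(transform);

        // Every other component has to be added explicitly.
        hologram.AddComponent<BoxCollider>();
        hologram.AddComponent<AudioSource>();
    }
}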
Prefabs are a great way to configure and store GameObjects for re-use in your game. They act as templates, storing the components and properties of a specific GameObject and enabling you to create multiple instances of it within a Scene.
All copies of the Prefab template in a Scene are linked. This means that if you change the object values for the health potion Prefab, for example, each copy of that Prefab within the Scene will change to match it. However, you can also make specific instances of the GameObject different from the default Prefab settings.
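Building on the health potion example, the sketch below shows how a Prefab might be instantiated several times from a script. The field name is invented; you would assign your own Prefab asset to it in the Inspector.

using UnityEngine;

public class PotionSpawner : MonoBehaviour
{
    // Assign a Prefab asset to this field in the Inspector.
    public GameObject healthPotionPrefab;

    void Start()
    {
        // Create three instances of the Prefab in a row.
        for (int i = 0; i < 3; i++)
        {
            Vector3 position = new Vector3(i * 1.5f, 0f, 0f);
            GameObject potion = Instantiate(healthPotionPrefab, position, Quaternion.identity);

            // Individual instances can still override the Prefab defaults.
            potion.transform.localScale = Vector3.one * (1f + 0.25f * i);
        }
    }
}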
Mixed reality is the merging of real and virtual worlds to produce new environments and visualizations, where physical and digital objects co-exist and interact in real time.
We think of Mixed Reality as a spectrum from the physical world, to an augmented world, to a fully immersive virtual world, and all the possibilities in between.
Go to Edit > Preferences.
Change the color scheme under General, if it is available.
You can change the default script editor under External Tools: the External Script Editor drop-down lists the editors currently available on your computer.
The HoloLens Seed project is a GitHub repository that is preconfigured for Windows Mixed Reality development. The repo includes the Mixed Reality Toolkit and a .gitignore file.
You can create a new project from the seed instead of downloading the different assets and setting up your git project yourself. To use the seed project, you can get a GitHub account and set up your development environment, or directly download the repository content.
You can clone the seed repository, delete its history, and start a new git project of your own by running the GitHub commands below. You need to create your own GitHub repo first and replace the placeholder URL with your own GitHub project URL.
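As a rough sketch, the sequence of commands typically looks like the following; both repository URLs are placeholders you would replace with the seed repo and your own project.

git clone <seed-repo-url> MyMixedRealityProject
cd MyMixedRealityProject
rm -rf .git                      # delete the seed repository's history
git init
git add .
git commit -m "Initial commit from the HoloLens Seed project"
git remote add origin <your-github-project-url>
git push -u origin master        # or main, depending on your default branch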
Whenever there is a new update for the Mixed Reality Toolkit or Azure Spatial Anchors packages, this repo will be updated with the latest version. You can automatically get the latest packages by adding the seed repo as your upstream and pulling from it.
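As a sketch, adding the upstream and pulling from it would look like this (the seed repository URL and branch name are placeholders):

git remote add upstream <seed-repo-url>
git pull upstream master         # use the seed repo's default branch name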
You can check your remote origin and upstream by copying and pasting the following into your terminal:
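The standard git command for this is:

git remote -v                    # lists origin and upstream, if configured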
You can remove the upstream anytime by running:
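Again, the standard git command:

git remote remove upstream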
The Settings app on Android includes a screen called Developer options that lets you configure system behaviors that help you profile and debug your app performance. For example, you can enable debugging over USB, capture a bug report, enable visual feedback for taps, flash window surfaces when they update, use the GPU for 2D graphics rendering, and more.
On Android 4.1 and lower, the Developer options screen is available by default. On Android 4.2 and higher, you must enable this screen. To enable developer options, tap the Build Number option 7 times. You can find this option in one of the following locations, depending on your Android version:
Android 9 (API level 28) and higher: Settings > About Phone > Build Number
Android 8.0.0 (API level 26) and Android 8.1.0 (API level 27): Settings > System > About Phone > Build Number
Android 7.1 (API level 25) and lower: Settings > About Phone > Build Number
At the top of the Developer options screen, you can toggle developer options on and off. You probably want to keep them on. When off, most options are disabled except those that don't require communication between the device and your development computer.
Before you can use the debugger and other tools, you need to enable USB debugging, which allows Android Studio and other SDK tools to recognize your device when connected via USB. To enable USB debugging, toggle the USB debugging option in the Developer Options menu. You can find this option in one of the following locations, depending on your Android version:
Android 9 (API level 28) and higher: Settings > System > Advanced > Developer Options > USB debugging
Android 8.0.0 (API level 26) and Android 8.1.0 (API level 27): Settings > System > Developer Options > USB debugging
Android 7.1 (API level 25) and lower: Settings > Developer Options > USB debugging
In your Project panel, select Assets > MixedRealityToolkit.Examples > Demos.
Select the folder you want to see an example of, e.g. HandTracking, EyeTracking, and so on.
Open the Scenes folder, select a scene, and double-click to open it.
You can press Play to try out the scene in your editor window.
Turn on your HoloLens device.
Tap your wrist (HoloLens 2) or make a bloom gesture (HoloLens 1) to open the Start menu.
Open Settings > Update & Security.
Select the For developers tab in the right-hand panel.
If you are using the HoloLens Seed project, you do not need to follow this step; the seed project already comes with MRTK. Still, it's good to know how to import the MRTK assets for your future projects.
First, download MRTK by going to its GitHub page and navigating to the Releases tab. Scroll down to the Assets section and download the following packages:
Examples
Extensions
Foundation
Tools
In your Unity project, select the Assets tab and choose Import Package > Custom Package from the drop-down.
Navigate to the downloaded MRTK packages, select them, and import them into your project.
Once you have the MRTK assets imported, a new tab called Mixed Reality Toolkit will appear in your Unity editor. Navigate to the new tab and select Add to Scene and Configure from the drop-down menu. In your Scene Hierarchy, new MixedRealityToolkit and MixedRealityPlayspace objects will appear.
MixedRealityPlayspace now includes your Main Camera, and the camera is configured for Mixed Reality applications. The camera background is set to black so it renders as transparent, and MixedRealityInputModule, EventSystem, and GazeProvider components are added to your camera.
You can create a new scene to compare the camera settings that MRTK has changed.
You might be prompted to select a configuration. You can choose the default MRTK configuration or, if you are developing for a HoloLens device, choose the configuration for the appropriate version.
In the Unity menu, select File > Build Settings... to open the Build Settings window.
In the Build Settings window, select Universal Windows Platform and click the Switch Platform button.
Click on Project Settings in the Build Settings window, or in the Unity menu select Edit > Project Settings..., to open the Project Settings window.
In the Project Settings window, select Player > XR Settings to expand the XR Settings.
In the XR Settings, check the Virtual Reality Supported checkbox to enable virtual reality, then click the + icon and select Windows Mixed Reality to add the Windows Mixed Reality SDK.
Your projects settings might have been configured by Mixed Reality Toolkit.
Optimize the XR Settings as follows:
Set Windows Mixed Reality Depth Format to 16-bit depth.
Check the Windows Mixed Reality Enable Depth Sharing checkbox.
Set Stereo Rendering Mode to Single Pass Instanced.
In the Project Settings window, select Player > Publishing Settings to expand the Publishing Settings. Scroll down to the Capabilities section and check the SpatialPerception checkbox.
Save your project and open up the Build Settings window. Click on the Build button, not Build and Run. When prompted, create a new folder (e.g. HoloLensBuild) and select your new folder to build your files into.
When your build is done, your file explorer will automatically open to the build folder you just created.
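If you find yourself repeating these clicks for every build, most of the same settings can also be applied from an editor script. The sketch below is only an illustration, not part of the official steps: the menu path and output folder name are invented, it assumes the file lives in an Editor folder inside Assets, and it does not add the Windows Mixed Reality SDK to the XR list, which still has to be done in the XR Settings as described above.

using UnityEditor;
using UnityEngine;

public static class HoloLensBuildHelper
{
    [MenuItem("Tools/Build for HoloLens (sketch)")]   // invented menu path
    public static void Build()
    {
        // Switch the active platform to Universal Windows Platform.
        EditorUserBuildSettings.SwitchActiveBuildTarget(BuildTargetGroup.WSA, BuildTarget.WSAPlayer);

        // XR Settings: enable VR support and use Single Pass Instanced rendering.
        PlayerSettings.virtualRealitySupported = true;
        PlayerSettings.stereoRenderingPath = StereoRenderingPath.Instancing;

        // Publishing Settings: enable the SpatialPerception capability.
        PlayerSettings.WSA.SetCapability(PlayerSettings.WSACapability.SpatialPerception, true);

        // Build the scenes listed in Build Settings into a HoloLensBuild folder.
        string[] scenes = System.Array.ConvertAll(EditorBuildSettings.scenes, s => s.path);
        BuildPipeline.BuildPlayer(scenes, "HoloLensBuild", BuildTarget.WSAPlayer, BuildOptions.None);
    }
}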
1) Make sure you have imported Microsoft.MixedReality.Toolkit.Unity.Foundation as a custom asset or through NuGet.
2) In the Unity Package Manager (UPM), install the following packages:
3) Enable the Unity AR camera settings provider.
The following steps presume use of the MixedRealityToolkit object. Steps required for other service registrars may be different.
1. Select the MixedRealityToolkit object in the scene hierarchy.
2. Select Copy and Customize to Clone the MRTK Profile to enable custom configuration.
3. Select Clone next to the Camera Profile.
4. Navigate the Inspector panel to the camera system section and expand the Camera Settings Providers section.
5. Click Add Camera Settings Provider and expand the newly added New camera settings entry.
6. Select the Unity AR Camera Settings provider from the Type drop down.
There are no additional steps after switching the platform for Android.
Common issues to consider while developing for Mixed Reality
Since a Mixed Reality application might have access to the user's video stream, developers might be able to save or share private information about the user. Be careful not to save any sensitive data or images anywhere other than the user's device. Never send sensitive information to any backend.
An iris scan is a more accurate identification method than a fingerprint. Since iris scan data can be used to identify and sign in a user, it should never leave the user's device. HoloLens 2 does not send the iris scan to the cloud and does not give access to the data.
Eye tracking, while a very useful tool for making your application more accessible, can also be used to collect data about the user's attention and might be used to manipulate it.
Unity versions are not backward compatible. If you decide to open a project in a newer version, Unity will try to update your project automatically, but it is not guaranteed that the newer version will work with your imported assets. There might be incompatibilities between your assets or your code and the new version.
Let's take the latest version in the image below, 2019.3.5f:
2019: the year the Unity version was developed. These changes are issued once a year and may include major changes that will break your application. Stick to the same year version unless you are creating a new application from scratch. We will talk about how to update your project to the latest version in the following lessons.
3: the 3rd iteration within 2019. When a version updates from 2 to 3, there may be minor breaking changes. Make sure to read the changelog before updating your project from 2 to 3.
.5f: the bug-fix number. These are usually small fixes that do not break your code or the APIs you use. Feel free to update your project from 2019.3.4f to 2019.3.5f.
In Unity Hub, under the Projects tab, you can select the Unity version drop-down for your application and choose a newer version of Unity. Unity will confirm your choice before updating your project. It is a good idea to save a version of your project as a new branch on GitHub, in case you need to revert.
Medical
Museums and Libraries
Unchecking Strip Engine Code is the short-term solution to an error in Xcode. We are working on a long-term solution.
Here is a detailed article about the subject:
Android: AR Foundation Version 2.1.4, ARCore XR Plugin Version 2.1.2
iOS: AR Foundation Version 2.1.4, ARKit XR Plugin Version 2.1.2
Mixed Reality getting started resources
Windows Mixed Reality Docs: aka.ms/MixedRealityDocs
Mixed Reality Curriculum Youtube Playlist: aka.ms/MixedRealityCurriculumVideos
Mixed Reality Resources Repository: aka.ms/MixedRealityResourcesRepository
HoloLens Seed Project Repository: aka.ms/HoloLensSeedProject
Code Samples: aka.ms/MixedRealityUnitySamples
Mixed Reality Development Tools to install: https://aka.ms/HoloLensToolInstalls
Eliminate Texture Confusion: Bump, Normal and Displacement Maps: https://www.pluralsight.com/blog/film-games/bump-normal-and-displacement-maps
Normal vs. Displacement Mapping & Why Games Use Normals: https://cgcookie.com/articles/normal-vs-displacement-mapping-why-games-use-normals
Live editing WebGL shaders with Firefox Developer Tools: https://hacks.mozilla.org/2013/11/live-editing-webgl-shaders-with-firefox-developer-tools/