As of now, we do not know exactly what shape the Metaverse will take. That does not matter, either. What matters is that someday, a global network of spatially organized, predominantly 3D content will be available to all without restriction, for use in all human endeavors — a new and profoundly transformational medium, enabled by major innovations in hardware, human-computer interface, network infrastructure, creator tools and digital economies.
Lesson 1: Introduction to Mixed Reality Applications and Development.
Lesson 2: Introduction to Mixed Reality Developer Tools and 3D Concepts.
Lesson 3: Working with Hand Interactions and Controllers.
Lesson 4: Eye and Head Gaze Tracking.
Lesson 5: Spatial Visualization using Bing Maps.
Lesson 6: Working with REST APIs.
Lesson 7: Azure Spatial Anchors and Backend Services.
Lesson 8: Displaying Spatial Anchors on a map.
Lesson 9: Working with QR codes.
Lesson 10: Working with Scene Understanding.
Lesson 11: Getting Started with AI.
Lesson 12: Project Discussion and Case Studies.
01 - Introduction to Mixed Reality
Introduction to Mixed Reality Applications and Development
Overview
In this lesson, you will learn about the basic concepts of Mixed Reality and explore the applications of Mixed Reality in different industries.
Mixed Reality Curriculum
Learn Mixed Reality development using Azure Mixed Reality Services
Mixed Reality Curriculum:
WebXR Lessons:
Unity Lessons:
How to use MRTK Visual Profiler?
How to add visual feedback?
How to create 3D models with splines?
How to upload 3D models to your project?
How to use simplified joint data access?
Why is hand interaction important?
Hand interaction is a very natural way to interact with 3D models. Since we interact with and modify real objects using our hands, a new user can start interacting with your application without first having to learn its interface.
How to make an object respond to input events?
What could go wrong?
Common issues working with developer tools and 3D objects
How to start debugging performance issues?
How to enable eye calibration?
How to log for debugging purposes?
How to add the Mixed Reality Toolkit (MRTK) Diagnostic System to your project?
How to create 3D models using Autodesk 3dsMax?
How to set up eye-tracking?
How to add hand interactions to an object?
Where to find pre-made 3D models?
How to create polygon models?
How to use eye-tracking to select an object?
How to create your own models using Maquette?
Using eye movement without a delay to select an object.
How to add Manipulation Handler to your object?
Having a small bounding box.
How to set up head tracking?
How to monitor performance of your app?
How to visualize eye tracking data?
What could go wrong?
How to organize your buttons into a grid view?
Working with 3D Objects
In the Project section, you will set up your first Mixed Reality project using Unity and the Mixed Reality Toolkit.
This book is designed as a collection of classes that starts from basic concepts and builds a project over time. Each lesson can also be used as an individual workshop. Each class follows the structure below:
Core concepts and discussion points.
Project step-by-step walk-through.
What could go wrong. A section to discuss the common mistakes and issues.
Further reading resources.
Each class has questions as sections and builds the corresponding part of the project. If you feel you can correctly answer a question, feel free to move on to the next question or the next class.
In this lesson, you will learn about the basic concepts of Mixed Reality and explore the applications of Mixed Reality in different industries.
Read through the questions below. If you feel comfortable with the answers, feel free to skip to the next section or chapter.
Why is Mixed Reality important?
The first revolution in computing happened with the creation of mainframe computers: computers that, at times, occupied a whole room. Mainframes were used by large organizations such as NASA for critical data-processing applications.
The second wave of computing is defined by Personal Computers (PCs) becoming widely available.
We believe the third wave of computing will include many devices for managing data, including IoT sensors and Mixed Reality devices.
We have more data than ever before. To process that data and make informed decisions, we need access to it at the right time and in the right place. Mixed Reality can bring that data into our context: the real world.
How to change preferences in Unity?
Go to Edit > Preferences.
Change the color scheme under General, if it is available.
You can change the default script editor by selecting External Tools > External Script Editor.
Will mixed reality replace our phones and Personal Computers?
How to deploy your app to a HoloLens?
How do I decide if I need to develop for Virtual Reality or Augmented Reality?
What is HoloLens Emulator?
The HoloLens Emulator lets you test holographic applications on your PC without a physical HoloLens. It also includes the HoloLens development toolset.
How to Get Started with Mixed Reality Development Using Unity?
Unity Introduction.
How to create a new scene?
On the Project panel, right click and select Create > Scene.
What are some use cases for Mixed Reality applications?
How to open MRTK example scenes?
On your Project panel select Assets > MixedRealityToolkit.Examples > Demos.
Select the folder you want to see an example of, e.g. HandTracking or EyeTracking.
Bounding boxes make it easier and more intuitive to manipulate objects with one hand for both near and far interaction by providing handles that can be used for scaling and rotating. A bounding box will show a cube around the hologram to indicate that it can be interacted with. The bounding box also reacts to user input.
You can add a bounding box to an object by adding the BoundingBox.cs script as a component of the object.
To add the Bounding Box (Script) component to an object, first select the object in the Hierarchy window. In the Inspector window, click Add Component and search for Bounding Box.
Select the Bounding Box script to apply the component to the object. The bounding box is only visible in Game mode. Press play to view the bounding box. By default, the HoloLens 1st gen style is used.
To reflect the MRTK bounding box style, you need to change the parameters inside the Handles section of the Bounding Box (Script) component.
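If you prefer to do this setup from code rather than the Inspector, the walkthrough above can be sketched as a small Unity C# script. This is a minimal sketch assuming MRTK 2.x, where `BoundingBox` lives in the `Microsoft.MixedReality.Toolkit.UI` namespace and exposes `HandleMaterial` and `HandleGrabbedMaterial` properties:

```csharp
using Microsoft.MixedReality.Toolkit.UI;
using UnityEngine;

public class BoundingBoxSetup : MonoBehaviour
{
    // Assign these in the Inspector, e.g. the BoundingBoxHandleWhite and
    // BoundingBoxHandleBlueGrabbed materials shipped with MRTK.
    public Material handleMaterial;
    public Material handleGrabbedMaterial;

    private void Start()
    {
        // Adding BoundingBox.cs as a component draws the interactive cube
        // around the object once the scene enters Game mode.
        var boundingBox = gameObject.AddComponent<BoundingBox>();
        boundingBox.HandleMaterial = handleMaterial;
        boundingBox.HandleGrabbedMaterial = handleGrabbedMaterial;
    }
}
```

Attaching this script to any GameObject mirrors the Add Component steps above; the result is only visible after pressing Play, as with the Inspector workflow.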
Change Handle Color
You can change the color of the handles by assigning a material to the Handle Material property.
In the Handles section, click the circle icon to open the Select Material window.
In the Select Material window, search for BoundingBoxHandleWhite. Once found, select to assign the color to the handle material.
When you press play, the handle colors for the bounding box will be white.
Change Handle Color When Object is Grabbed
You can change the color of the handles when an object is grabbed by assigning a material to the Handle Grabbed Material property.
In the Handles section, click the circle icon to open the Select Material window.
In the Select Material window, search for BoundingBoxHandleBlueGrabbed. Once found, select to assign the color to the handle material.
When you press play, grab one of the handles of the bounding box. The color of the handle will change to blue.
Change Scale Handles
You can change the scale handles in corners by assigning a scale handle prefab in the Scale Handle Prefab and Scale Handle Slate Prefab (for 2D slate) parameters.
First, assign a prefab to the Scale Handle Prefab. In the Handles section, click the circle icon to open the Select GameObject window.
In the Select GameObject window, switch to the Assets tab and search for MRTK_BoundingBox_ScaleHandle. Once found, select to assign the prefab to the scale handle.
Next, assign a prefab to the Scale Handle Slate Prefab. In the Handles section, click the circle icon to open the Select GameObject window.
In the Select GameObject window, switch to the Assets tab and search for MRTK_BoundingBox_ScaleHandle_Slate. Once found, select to assign the prefab to the scale handle slate.
When you press play, grab one of the handles of the bounding box to see how the scale handles look.
Change Rotation Handles
You can change the rotation handles by assigning a rotation handle prefab in the Rotation Handle Prefab parameter.
In the Handles section, click the circle icon to open the Select GameObject window.
In the Select GameObject window, switch to the Assets tab and search for MRTK_BoundingBox_RotateHandle. Once found, select to assign the prefab to the rotation handle.
When you press play, grab one of the handles of the bounding box to see how the rotation handles look.
How to add audio feedback?
You can configure an object to play a sound when the user touches an object by adding a trigger touch event to the object.
To be able to trigger touch events, the object must have the following components:
Collider component, preferably a Box Collider
Near Interaction Touchable (Script) component
Hand Interaction Touch (Script) component
To add audio feedback, first add an Audio Source component to the object. The audio source component enables you to play audio back in the scene. In the Hierarchy window, select the object and click Add Component in the Inspector window. Search for Audio Source to add the Audio Source component.
Once the Audio Source component has been added to the object, in the Inspector window, change the Spatial Blend property to 1 to enable spatial audio.
Next, with the object still selected, click Add Component and search for the Near Interaction Touchable (Script). Once found, select the component to add it to the object. Near interactions come in the form of touches and grabs: interactions that occur when the user is in close proximity to an object and interacts with their hands.
After the Near Interaction Touchable (Script) is added to the object, click the Fix Bounds and Fix Center buttons. This will update the Local Center and Bounds properties of the Near Interaction Touchable (Script) to match the BoxCollider.
With the object still selected, click Add Component and search for the Hand Interaction Touch (Script). Once found, select the component to add to the object.
To make audio play when the object is touched, you will need to add an On Touch Started event to the Hand Interaction Touch (Script) component. In the Inspector window, navigate to the Hand Interaction Touch (Script) component and click the small + icon to create a new On Touch Started () event.
Drag the object to receive the event and define AudioSource.PlayOneShot as the action to be triggered. PlayOneShot will play the audio clip.
Next, assign an audio clip to the trigger. You can find audio clips provided by MRTK by navigating to Assets > MixedRealityToolkit.SDK > StandardAssets > Audio. Once you've found a suitable audio clip, assign the audio clip to the Audio Clip field.
You can now test the touch interaction using the in-editor simulation. Press the Play button to enter Game mode. Once in Game mode, hold the spacebar to bring up the hand and use the mouse to touch the object and trigger the sound effect.
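The same touch-to-audio behavior can also be wired up from code. The sketch below is an alternative to the demo Hand Interaction Touch (Script): it implements MRTK 2.x's `IMixedRealityTouchHandler` interface directly, and assumes the object already has a Box Collider and a Near Interaction Touchable component as described above:

```csharp
using Microsoft.MixedReality.Toolkit.Input;
using UnityEngine;

// Plays a sound when the object is touched. The GameObject also needs a
// collider and a NearInteractionTouchable component for touch events to fire.
public class TouchAudioFeedback : MonoBehaviour, IMixedRealityTouchHandler
{
    // Assign a clip, e.g. from MixedRealityToolkit.SDK/StandardAssets/Audio.
    public AudioClip touchSound;

    private AudioSource audioSource;

    private void Start()
    {
        audioSource = gameObject.AddComponent<AudioSource>();
        audioSource.spatialBlend = 1.0f; // 1 = fully spatialized 3D sound
    }

    public void OnTouchStarted(HandTrackingInputEventData eventData)
    {
        audioSource.PlayOneShot(touchSound);
    }

    public void OnTouchUpdated(HandTrackingInputEventData eventData) { }
    public void OnTouchCompleted(HandTrackingInputEventData eventData) { }
}
```

Implementing the handler interface avoids configuring the On Touch Started () event in the Inspector; either approach produces the same behavior in the in-editor simulation.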
04 - Eye and Head Gaze
Eye and Head Gaze Tracking.
Concepts and Discussion
What's the difference between eye and head gaze?
What is 6 Degrees of Freedom?
What are good use cases for eye or head tracking?
What are the security concerns with using eye-tracking?
How to get permission to use eye-tracking?
How to set up eye-tracking?
How to simulate eye-tracking in the Unity editor?
How to enable eye calibration?
How to use eye-tracking for selection of an object?
With spatial data you can discover growth insights, manage facilities and networks, and provide location information to customers. If you do not consider spatial components and how they relate to your business, you increase the risk of poor results.
Spatial analysis allows you to solve complex location-oriented problems and better understand where and what is occurring in your world. It goes beyond mere mapping to let you study the characteristics of places and the relationships between them. Spatial analysis lends new perspectives to your decision-making.
How to get permission to use eye-tracking?
Using eye gaze to influence the user.
Project
Design & Prototyping: Enables real-time collaborative iteration of 3D physical and virtual models across cross-functional teams and stakeholders.
Training & Development: Provides instructors with better tools to facilitate teaching and coaching sessions. It offers trainees an enhanced and engaging learning experience through 3D visualizations and interactivity.
Geospatial Planning: Enables the assessment and planning of indoor and outdoor environments (e.g. future construction sites, new store locations, interior designs), removing the need for manual execution.
Sales Assistance: Improves the effectiveness of individuals in sales-oriented roles by providing tools such as 3D catalogs and virtual product experiences that increase customer engagement and strengthen buyer confidence.
Field Service: Improves first-visit resolution and customer satisfaction for customer support issues. It is typically used for complex products that would otherwise require a field visit. It can also serve as a platform for targeted up-sell opportunities.
Productivity & Collaboration: Transforms the space around you into a shared augmented workplace. Remote users can collaborate, search, brainstorm, and share content as if they were in the same room.
3rd Wave of Computing
The External Tools > External Script Editor drop-down lists the editors currently available on your computer.
Name your scene and drag it under the Scenes folder for organization purposes.
Creating a new Unity Scene.
Every new Scene comes with a light and a camera. We will modify the camera later for our Mixed Reality project.
New scene camera.
Open the Scenes folder, select a scene, and double-click to open it.
You can press play to try out the scene in your editor window.
What are some key concepts for working with Unity?
Let’s review some key concepts, which will help you as you begin to explore editing scripts for mixed reality development.
Scenes
In Unity, areas of the game that a player can interact with are generally made up of one or more Scenes. Small games may only use one Scene; large ones could have hundreds.
Every Unity project you create comes with a SampleScene that has a light and a camera.
SampleScene with a light and camera.
You can create a new scene by right-clicking under the Assets tab and selecting Create > Scene. Organizing scenes under a Scenes folder is purely for organizational purposes.
You can use scenes to organize navigation inside your application or to add different levels to a game.
GameObjects and components
Every object in the game world exists as a GameObject in Unity. GameObjects are given specific features by giving them appropriate components which provide a wide range of different functionality.
When you create a new GameObject, it comes with a Transform component already attached. This component controls the GameObject’s positional properties in the 3D (or 2D) gamespace. You need to add all other components manually in the Inspector.
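As a minimal illustration of these ideas (the component choices here are arbitrary), a script can use the built-in Transform and attach further components at runtime:

```csharp
using UnityEngine;

public class ComponentBasics : MonoBehaviour
{
    private void Start()
    {
        // Every GameObject already has a Transform; move it 2 meters forward.
        transform.position = new Vector3(0f, 0f, 2f);

        // All other components are added explicitly, either in the
        // Inspector or in code via AddComponent.
        gameObject.AddComponent<MeshRenderer>();
        gameObject.AddComponent<BoxCollider>();
    }
}
```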
Prefabs
Prefabs are a great way to configure and store GameObjects for re-use in your game. They act as templates, storing the components and properties of a specific GameObject and enabling you to create multiple instances of it within a Scene.
All copies of the Prefab template in a Scene are linked. This means that if you change the object values for the health potion Prefab, for example, each copy of that Prefab within the Scene will change to match it. However, you can also make specific instances of the GameObject different to the default Prefab settings.
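For example (the health potion is just an illustration carried over from the text above), instantiating a Prefab from a script creates linked copies of the template:

```csharp
using UnityEngine;

public class PotionSpawner : MonoBehaviour
{
    // Assign the health potion Prefab asset in the Inspector.
    public GameObject healthPotionPrefab;

    private void Start()
    {
        // Each Instantiate call creates an instance of the Prefab template.
        // Changing the Prefab asset later updates all non-overridden copies.
        for (int i = 0; i < 3; i++)
        {
            Instantiate(healthPotionPrefab,
                        new Vector3(i, 0f, 2f),
                        Quaternion.identity);
        }
    }
}
```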
What is Debugging?
Debugging is the process of finding and resolving defects or problems within a computer program that prevent correct operation of the software.
Tap your wrist (HoloLens 2) or make a bloom gesture (HoloLens 1) to open the Windows menu.
Windows menu.
Open the Settings > Update & Security.
Select the For Developers tab on the right-hand panel.
How to choose performant 3D models for your application?
Bump Maps
Reuse a model instance instead of creating a new model wherever you can.
How to simulate input interactions in Unity editor?
Mixed Reality Toolkit (MRTK) supports in-editor input simulation. Simply run your scene by clicking Unity's play button, then use these keys to simulate input.
Press W, A, S, D keys to move the camera.
Hold the right mouse button and move the mouse to look around.
To bring up the simulated hands, press the Space bar (right hand) or left Shift key (left hand).
To keep the simulated hands in view, press the T or Y key.
To rotate the simulated hands, press Q or E (horizontal) or R or F (vertical).
What is the difference between Augmented Reality, Virtual Reality and Mixed Reality?
Augmented Reality (AR) is defined as a technology that superimposes a computer-generated image on a user's view of the real world, thus providing a composite view. Augmented Reality experiences are not limited to visual additions to our world. You can create augmented experiences that add only audio to your physical world, or both audio and visuals.
Augmented Reality experiences are also not limited to headsets like HoloLens. Today, millions of mobile devices have depth-sensing capabilities that can augment your real world with digital information.
Virtual Reality (VR) is when you are fully immersed in a virtual world by wearing a headset. In Virtual Reality you visually lose connection to the real world. Virtual Reality applications are great for training and simulations where users benefit from total immersion to replicate a real-life situation. Some examples include training for firefighters, emergency room healthcare providers, and flight simulations.
What is Mixed Reality?
Mixed reality is the merging of real and virtual worlds to produce new environments and visualizations, where physical and digital objects co-exist and interact in real time.
Mixed Reality Experiences
We think of Mixed Reality as a spectrum from the physical world, to an augmented world, to a fully immersive virtual world, and all the possibilities in between.
Mixed Reality Spectrum
What makes a 3D model? What are Polygons, Splines, Vertices, Meshes and Materials?
A 3D model is a digital representation of a real-world object. Representing a 3D object requires you to get to know the parts that make up the 3D object.
Polygonal modeling is an approach for modeling objects by representing or approximating their surfaces using polygon meshes.
Example of triangle mesh.
Objects created with polygon meshes must store different types of elements. These include vertices, edges, faces, polygons and surfaces.
Why are polygons important?
The more edges and faces a model has, the more detail it can show. On the other hand, a high polygon count will reduce the performance of your app, because the calculations needed to render the model are expensive.
How to build and deploy your project for Windows Mixed Reality Headset?
Project
In this section we will install Windows Mixed Reality developer tools and learn about how to use them.
Windows Device Portal
What are Gestures?
Gestures are input events based on human hands.
There are two types of devices that raise gesture input events in Mixed Reality Toolkit(MRTK):
Windows Mixed Reality devices such as HoloLens. This describes pinching motions ("Air Tap") and tap-and-hold gestures.
Both of these input sources use the Gesture Settings profile to translate Unity's Touch and Gesture events respectively into MRTK's Input Actions. This profile can be found under the Input System Settings profile.
Mixed Reality Toolkit (MRTK) provides a set of components and features to accelerate cross-platform Mixed Reality application development in Unity. MRTK includes:
UI and interaction building blocks.
Tools.
Example Scenes.
You can learn more about these components in the MRTK documentation.
How to deploy to HoloLens Emulator?
How to run the Mixed Reality Toolkit (MRTK) Hand Interaction examples in the Unity Editor?
MRTK hand interactions example scene.
The HandInteractionExamples.unity example scene contains various types of interactions and UI controls that highlight articulated hand input.
To try the hand interaction scene, first open the HandInteractionExamples scene under Assets\MixedRealityToolkit.Examples\Demos\HandTracking\Scenes\HandInteractionExamples
This example scene uses TextMesh Pro. If you receive a prompt asking you to import TMP Essentials, select the Import TMP Essentials button. Some of the MRTK examples use TMP Essentials for improved text rendering. After you select Import TMP Essentials, Unity will then import the package.
Importing TMP Essentials
After Unity completes the import, close the TMP Importer window and reload the scene. You can reload the scene by double-clicking the scene in the Project window.
After the scene is reloaded, press the Play button.
02 - Mixed Reality Developer Tools and Concepts
Introduction to Mixed Reality Developer Tools and 3D Concepts
In this section, we will go through the developer tools and how to get started with debugging our applications.
The second part of the course focuses on creating and using 3D assets in your applications.
How to grab and move an object?
To grab and move an object, first ensure that the Manipulation Handler (Script) and Near Interaction Grabbable (Script) components are added to the object. The Manipulation Handler (Script) allows you to manipulate the object, while the Near Interaction Grabbable (Script) allows the object to respond to near hand interactions.
To add the scripts to the object, first select the object in the Hierarchy window. In the Inspector window, click Add Component and search for each script. Once found, select the script to add to the object.
Add Manipulation Handler Script component
With the object selected, in the Inspector window, navigate to the Manipulation Handler (Script) component to modify the component's parameters.
Manipulation Handler Parameters
You can move an object using one or two hands. This setting is dependent on the Manipulation Type parameter. The Manipulation Type can be limited to either:
One Handed Only
Two Handed Only
One and Two Handed
Select the preferred Manipulation Type so that the user is restricted to use one of the available manipulation types.
You can now test grabbing and moving the object using the in-editor simulation. Press the Play button to enter Game mode. Once in Game mode, hold the space bar to bring up the hand and use the mouse to grab and move the object.
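If you would rather configure this from code, the steps above can be sketched in a script. This assumes MRTK 2.x, where these components live in the `Microsoft.MixedReality.Toolkit.UI` and `Microsoft.MixedReality.Toolkit.Input` namespaces:

```csharp
using Microsoft.MixedReality.Toolkit.Input;
using Microsoft.MixedReality.Toolkit.UI;
using UnityEngine;

public class GrabbableSetup : MonoBehaviour
{
    private void Start()
    {
        // NearInteractionGrabbable lets articulated hands grab the
        // object at close range (the object also needs a collider).
        gameObject.AddComponent<NearInteractionGrabbable>();

        // ManipulationHandler moves, rotates, and scales the object
        // while it is held; restrict it to the preferred mode here.
        var handler = gameObject.AddComponent<ManipulationHandler>();
        handler.ManipulationType =
            ManipulationHandler.HandMovementType.OneAndTwoHanded;
    }
}
```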
In the Unity menu, select File > Build Settings... to open the Build Settings window.
In the Build Settings window, select Universal Windows Platform and click the Switch Platform button.
How to set up your project for iOS or Android [Experimental]?
1 ) Make sure you have imported Microsoft.MixedReality.Toolkit.Unity.Foundation as a custom asset or through NuGet.
2 ) In the Unity Package Manager (UPM), install the following packages:
How to add Mixed Reality Toolkit(MRTK) to a project?
If you are using the HoloLens Seed project, you do not need to follow this step; the Seed project already comes with MRTK. Still, it's good to know how to import the MRTK assets for your future projects.
First, you need to download MRTK from its GitHub page by navigating to the Releases tab. Scroll down to the Assets section and download the tools:
Examples
Project
In this project, we will set up our development environment for Mixed Reality development with Unity3D.
Check your knowledge by answering the questions below before you move into the project. Feel free to skip sections you feel comfortable with. Make sure you read through the first download section to confirm you have all the necessary modules.
How to build your scene for Android and iOS Devices?
How to set up the HoloLens 2 Emulator?
You can download the latest version here:
Can the HoloLens Emulator run on my device?
Before installing the emulator, make sure your PC meets the following hardware requirements:
How to make your buttons follow your hand?
MRTK uses what are known as Solvers to allow UI elements to follow the user or other game objects in the scene. The Radial View solver is a tag-along component that keeps a particular portion of a GameObject within the user's view.
You can make a button follow your hand by adding the Radial View (Script) component to the object.
What could go wrong?
Common issues to consider while developing for Mixed Reality
What are some of the security issues with Mixed Reality Applications?
Since a Mixed Reality application might have access to the user's video stream, developers might be able to save or share private information about the user. Be careful not to save any sensitive data or images anywhere other than the user's device, and never send sensitive information to any backend.
How to organize your objects into a grid view?
You can organize any objects in Unity into a grid by using an Object Collection script. In this example, you will learn how to organize six 3D objects into a 3 x 3 grid.
First, configure your Unity scene for the Mixed Reality Toolkit. Next, in the Hierarchy window, right click in an empty space and select Create Empty. This will create an empty GameObject. Name the object CubeCollection.
In the Inspector window, position CubeCollection so that the collection displays in front of the user (example, X = 0, Y = -0.2, Z = 2).
05 - Map Visualization
Spatial Visualization using Bing Maps on HoloLens 2 and Windows Mixed Reality headsets.
Overview
What are some good spatial visualizations for Mixed Reality?
A good visualization allows users to understand data better by seeing the data points in the right context. Check out some of the examples below to see what the visualization provides that would be hard to understand from the raw data points alone.
Small arms and ammunition import and export interactive visualization:
What is Bing Maps SDK?
Maps SDK, a Microsoft Garage project, provides a control to visualize a 3D map in Unity. The map control handles streaming and rendering of 3D terrain data with worldwide coverage. Select cities are rendered at a very high level of detail. Data is provided by Bing Maps.
The map control has been optimized for mixed reality applications and devices including the HoloLens, HoloLens 2, Windows Immersive headsets, HTC Vive, and Oculus Rift. Soon the SDK will also be provided as an extension to the
First, drag a button prefab from MixedRealityToolkit.SDK > Features > UX > Interactable > Prefabs to the Hierarchy window.
In the Hierarchy window, select the button prefab. In the Inspector window, click Add Component. Search for Radial View. Once found, select to add the component to the button.
When you add the Radial View (Script) component to the button, the Solver Handler (Script) component is added as well because it is required by the Radial View (Script).
The Solver Handler (Script) component needs to be configured so that the button follows the user's hand. First, change Tracked Target Type to Hand Joint. This will enable you to define which hand joint the button follows.
Next, for the Solver Handler (Script) component, change Tracked Handness to Right. This setting determines which hand is tracked.
There are over 20 hand joints available for tracking. Still inside the Solver Handler (Script) component, change Tracked Hand Joint to Wrist so that the button tracks the user's wrist.
Now that the hand tracking is configured, you need to configure the Radial View (Script) component to further define where the button is located and how it is viewed in relation to the user. First, change Reference Direction to Facing World Up. This parameter determines which direction the button faces.
Next, in the Radial View (Script) component, change the Min Distance and Max Distance to 0. The Min and Max Distance parameters determine how far the button is kept from the user. As a reminder, the unit of measurement in Unity is meters, so a Min Distance of 1 would push the button away to ensure it is never closer than 1 meter to the user.
Now that the button is configured to follow your right wrist, press Play to enter Game mode and test the solver in the in-editor simulator. Press and hold the space bar to bring up the hand. Move the mouse cursor around to move the hand, and click and hold the left mouse button to rotate the hand:
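The same wrist-following setup can also be expressed in code. This is a minimal sketch assuming MRTK 2.x namespaces and property names (note that SolverHandler keeps the historical "TrackedHandness" spelling seen in the Inspector):

```csharp
using Microsoft.MixedReality.Toolkit.Utilities;
using Microsoft.MixedReality.Toolkit.Utilities.Solvers;
using UnityEngine;

public class FollowWrist : MonoBehaviour
{
    private void Start()
    {
        // SolverHandler picks what to track: here, the wrist joint
        // of the right hand.
        var solverHandler = gameObject.AddComponent<SolverHandler>();
        solverHandler.TrackedTargetType = TrackedObjectType.HandJoint;
        solverHandler.TrackedHandness = Handedness.Right;
        solverHandler.TrackedHandJoint = TrackedHandJoint.Wrist;

        // RadialView keeps the button pinned relative to the tracked joint.
        var radialView = gameObject.AddComponent<RadialView>();
        radialView.ReferenceDirection = RadialViewReferenceDirection.FacingWorldUp;
        radialView.MinDistance = 0f; // 0 keeps the button directly on the joint
        radialView.MaxDistance = 0f;
    }
}
```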
Mixed Reality Toolkit is equipped with a variety of button prefabs that you could add to your project. A prefab is a pre-configured GameObject stored as a Unity Asset and can be reused throughout your project.
You can find button prefabs available in MRTK by navigating to MixedRealityToolkit.SDK > Features > UX > Interactable > Prefabs.
In this project, you will learn how to change the color of a cube when a button is pressed.
First, select the button of your choice from the Project window and drag into the Hierarchy window.
Change the button's Transform Position so that it's positioned in front of the camera, at x = 0, y = 0, and z = 0.5.
Next, right click on an empty spot in the Hierarchy window and click 3D Object > Cube.
With the Cube object still selected, in the Inspector window, change the Transform Position so that the cube is located near but not overlapping the button. In addition, resize the cube by changing the Transform Scale.
In the Hierarchy window, select the button. In the Inspector window, navigate to the Interactable (Script) component.
In the Events section, expand the Receivers section.
Click the Add Event button to create a new event receiver of Event Receiver Type InteractableOnPressReceiver.
For the newly created InteractableOnPressReceiver event, change the Interaction Filter to Near and Far.
From the Hierarchy window, click and drag the Cube GameObject into the Event Properties object field for the On Press() event to assign the Cube as a receiver of the On Press () event.
Next, click the action dropdown (currently assigned No Function) and select MeshRenderer > Material material. This action will set the Cube's material property to change when the button is pressed.
Now, assign a color for the Cube to change to when the button is pressed. Click the small circle icon next to the Material field (currently assigned None (Material)) to open the Select Material window.
MRTK provides a variety of materials that can be used in your projects. In the search bar, search for MRTK_Standard and select your color of choice.
Now that the event is configured for when the button is pressed, you now need to configure an event that occurs when the button is released. For the On Release () event, click and drag the Cube GameObject into the Event Properties.
Next, click the action dropdown (currently assigned No Function) and select MeshRenderer > Material material. This action will set the Cube's material property to change when the button is released.
Now, assign a color for the Cube to change to when the button is released. Click the small circle icon next to the Material field (currently assigned None (Material)) to open the Select Material window and search for MRTK_Standard. Select your choice of color.
Now that both the On Press () and On Release () events are configured for the button, press Play to enter Game mode and test the button in the in-editor simulator.
To press the button, press the space bar + mouse scroll forward.
To release the button, press the space bar + mouse scroll backward.
Click Project Settings in the Build Settings window, or from the Unity menu select Edit > Project Settings..., to open the Project Settings window.
Project settings.
In the Project Settings window, select Player > XR Settings to expand the XR Settings.
Player XR settings.
In the XR Settings, check the Virtual Reality Supported checkbox to enable virtual reality, then click the + icon and select Windows Mixed Reality to add the Windows Mixed Reality SDK.
XR settings Mixed Reality Supported checkbox.
Your project settings might already have been configured by the Mixed Reality Toolkit.
Optimize the XR Settings as follows:
Set Windows Mixed Reality Depth Format to 16-bit depth.
Check the Windows Mixed Reality Enable Depth Sharing checkbox.
Set Stereo Rendering Mode to Single Pass Instanced.
Optimization settings for XR.
In the Project Settings window, select Player > Publishing Settings to expand the Publishing Settings. Scroll down to the Capabilities section and check the SpatialPerception checkbox.
Spatial Perception enabled.
Save your project and open the Build Settings. Click the Build button, not Build and Run. When prompted, create a new folder (e.g., HoloLensBuild) and select it as the destination for your build files.
Click on Build button, not Build and Run.
Build your project into a new folder by clicking build button.
When your build is done, your file explorer will automatically open to the build folder you just created.
Build settings.
Switch the platform to Universal Windows Platform.
3) Enabling the Unity AR camera settings provider.
The following steps presume use of the MixedRealityToolkit object. Steps required for other service registrars may be different.
1. Select the MixedRealityToolkit object in the scene hierarchy.
MixedReality Toolkit in Hierarchy panel.
2. Select Copy and Customize to Clone the MRTK Profile to enable custom configuration.
Copy and Customize to Clone the MRTK Profile.
3. Select Clone next to the Camera Profile.
Clone camera profile.
4. In the Inspector panel, navigate to the Camera System section and expand the Camera Settings Providers section.
Camera Settings Providers
5. Click Add Camera Settings Provider and expand the newly added New camera settings entry.
New camera settings expanded view.
6. Select the Unity AR Camera Settings provider from the Type drop down.
Unity AR Camera Settings.
Android:
AR Foundation Version: 2.1.4
ARCore XR Plugin Version: 2.1.2
iOS:
AR Foundation Version: 2.1.4
ARKit XR Plugin Version: 2.1.2
Extensions
Foundation
Tools
Download MRTK Releases.
Add MRTK assets into your project
In your Unity project, select the Assets tab and choose Import Package > Custom Package from the dropdown.
Navigate to the downloaded MRTK packages, select them, and import them into your project.
Once the MRTK assets are imported, a new tab called Mixed Reality Toolkit will appear in your Unity editor. Navigate to the new tab and select Add Scene and Configure from the dropdown menu. New MixedRealityToolkit and MixedRealityPlayspace objects will appear in your Scene Hierarchy.
MixedRealityPlayspace now includes your Main Camera, and the camera is configured for Mixed Reality applications. The camera background is set to black so it renders as transparent, and the MixedRealityInputModule, EventSystem, and GazeProvider components are added to your camera.
You can create a new scene to compare the camera settings that MRTK has changed.
You might be prompted to select a configuration. You can choose the default MRTK configuration, or if you are developing for a HoloLens device, choose the configuration for the appropriate version.
There are no additional steps after switching the platform for Android.
Under the Optimization header, uncheck Strip Engine Code.
Unchecking Strip Engine Code is a short-term workaround for Xcode error #6646. We are working on a long-term solution.
64-bit Windows 10 Pro, Enterprise, or Education
Windows 10 Home Edition does not support Hyper-V or the HoloLens Emulator. The HoloLens 2 Emulator requires the Windows 10 October 2018 update or later.
An iris scan is a more accurate identification method than a fingerprint. Since iris scan data can be used to identify and sign in a user, it should never leave the user's device. HoloLens 2 does not send the iris scan to the cloud and does not give access to the data.
Why is eye tracking important for users privacy?
While eye tracking is a very useful tool for making your application more accessible, it can also be used to collect data about the user's attention, and potentially to manipulate it.
Can I open my Unity project in the current version if it was originally saved in an older version?
Unity versions are not backward compatible. If you open a project in a newer version, Unity will try to update your project automatically, but it is not guaranteed that the newer version will work with your imported assets. There might be incompatibilities between the new version and your assets or your code.
What does the Unity Versioning mean and when is it safe to update the Unity version?
Let's take the latest version in the image below, 2019.3.5f:
2019: the year this Unity version was developed. Major releases are issued once a year and may include breaking changes. Stick to the same year version unless you are creating a new application from scratch for now. We will talk about how to update your project to the latest version in the following lessons.
3: the 3rd iteration in 2019. When a version updates from 2 to 3, there may be minor breaking changes in code. Make sure to read the changelog before updating your project from 2 to 3.
.5f: the bug-fix release. These usually contain a few fixes that do not break your code or the APIs you use. Feel free to update your project from 2019.3.4f to 2019.3.5f.
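The year/iteration/patch scheme above can be illustrated with plain shell string-splitting; the version string 2019.3.5f1 below is purely a hypothetical example of the format:

```shell
# Split a Unity version string into its year, iteration, and patch parts.
version="2019.3.5f1"        # hypothetical example version string
year="${version%%.*}"       # text before the first dot -> 2019
rest="${version#*.}"        # text after the first dot  -> 3.5f1
iteration="${rest%%.*}"     # text before the next dot  -> 3
patch="${rest#*.}"          # remaining patch component -> 5f1
echo "year=$year iteration=$iteration patch=$patch"
```

Per the guidance above, a change in `patch` is a safe update, while a change in `year` or `iteration` deserves a changelog read first.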
How to update my Unity version to a newer one?
In Unity Hub, under the Projects tab, you can select the Unity version dropdown for your application and select a newer version of Unity. Unity will confirm your choice before updating your project. It is a good idea to save a version of your project as a new branch on GitHub, in case you need to revert.
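For example, you might snapshot the project in a backup branch before upgrading. The sketch below demonstrates the idea in a throwaway repository (the branch name and file contents are made up for illustration); in practice you would run only the git checkout commands inside your own project repository:

```shell
# Demo setup: a throwaway repo standing in for your Unity project.
cd "$(mktemp -d)"
git init -q
git config user.email "[email protected]"   # placeholder identity for the demo
git config user.name "Demo"
echo "m_EditorVersion: 2019.3.4f1" > ProjectVersion.txt
git add . && git commit -qm "Project at Unity 2019.3.4f1"

# Create a backup branch before the upgrade...
git checkout -qb backup-2019.3.4f
# ...then return to the original branch to perform the upgrade in Unity Hub.
git checkout -q -
```

If the upgrade goes wrong, checking out the backup branch restores the pre-upgrade state.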
With CubeCollection still selected, in the Hierarchy window, create a child Cube object. Change the scale of the object to x = .25, y = .25, z = .25.
Cube transforms.
Duplicate the child Cube object 8 times so that there is a total of 9 Cube child objects within the CubeCollection object.
Duplicated cubes collection.
In the Hierarchy window, select CubeCollection. In the Inspector window, click Add Component and search for the Grid Object Collection (Script). Once found, select the component to add to the object.
Add Grid Object Collection Script component
Configure the Grid Object Collection (Script) component by changing the Sort Type property to Child Order. This will ensure that the child objects (the 9 Cube objects) are sorted in the order you placed them under the parent object.
Change sort type.
Click Update Collection to apply the new configuration.
Update Collection.
You can adjust the parameters within the Grid Object Collection (Script) component to further customize the grid. For example, you could change the number of rows to 2 by changing the value in the Num Rows properties. Be sure to click Update Collection to apply the new configuration.
Change number of rows.
Grid layout of boxes.
Empty CubeCollection object.
Position attributes of the CubeCollection.
This project is for HoloLens 2 and Windows Mixed Reality Headsets.
Outings, a sample app created with the Bing Maps SDK, can be found on the Microsoft Store for PC and HoloLens 1: aka.ms/OutingsHoloLens1
Outings Immersive App.
What will we build?
We will build the app shown in the video below for HoloLens 2. You can also run it on a Windows Mixed Reality headset and use hand controllers instead of hand gestures.
The HoloLens Seed project is a GitHub repository that is configured for Windows Mixed Reality development. The repo includes the Mixed Reality Toolkit and .gitignore files.
You can create a new project from the seed instead of downloading the different assets and setting up your git project yourself. To use the seed project, you can get a GitHub account and set up your development environment, or directly download the repository content.
Download the Seed project from GitHub.
Setup
You can clone this repository, delete its history, and start a new git project by running the script below. You need to create your own GitHub repo first. Replace the placeholders with your own GitHub project URL.
Or by running the following git commands:
How to update your project to the latest seed?
Whenever there is a new update to the included packages, this repo will be updated with the latest versions. You can automatically get the latest packages by adding the seed repo as your upstream and pulling from it.
You can check your remote origin and upstream settings by copying and pasting the relevant git commands into your terminal.
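As a sketch of the idea (the remote name `upstream` is just a convention; the URL is the seed repository from the clone step, and the snippet is demonstrated in a throwaway repo so it can be run anywhere):

```shell
# Demo setup: a fresh repo standing in for your project checkout.
cd "$(mktemp -d)" && git init -q

# List currently configured remotes (empty in a fresh repo).
git remote -v

# Add the seed repository as the "upstream" remote.
git remote add upstream https://github.com/Yonet/HoloLensUnitySeedProject.git
git remote -v

# Later, to pull in the latest seed changes (requires network access):
#   git fetch upstream
#   git merge upstream/master
```

In your own project, run only the `git remote` commands; `git remote -v` shows both `origin` and `upstream` once they are configured.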
Eliminate Texture Confusion: Bump, Normal and Displacement Maps:
Normal vs. Displacement Mapping & Why Games Use Normals:
Live editing WebGL shaders with Firefox Developer Tools:
How to enable Developer Mode on an Android Device?
The Settings app on Android includes a screen called Developer options that lets you configure system behaviors that help you profile and debug your app performance. For example, you can enable debugging over USB, capture a bug report, enable visual feedback for taps, flash window surfaces when they update, use the GPU for 2D graphics rendering, and more.
Enable developer options and USB debugging
On Android 4.1 and lower, the Developer options screen is available by default. On Android 4.2 and higher, you must enable this screen. To enable developer options, tap the Build Number option 7 times. You can find this option in one of the following locations, depending on your Android version:
Android 9 (API level 28) and higher: Settings > About Phone > Build Number
Android 8.0.0 (API level 26) and Android 8.1.0 (API level 27): Settings > System > About Phone > Build Number
Android 7.1 (API level 25) and lower: Settings > About Phone > Build Number
At the top of the Developer options screen, you can toggle the options on and off (figure 1). You probably want to keep this on. When off, most options are disabled except those that don't require communication between the device and your development computer.
Before you can use the debugger and other tools, you need to enable USB debugging, which allows Android Studio and other SDK tools to recognize your device when connected via USB. To enable USB debugging, toggle the USB debugging option in the Developer Options menu. You can find this option in one of the following locations, depending on your Android version:
Android 9 (API level 28) and higher: Settings > System > Advanced > Developer Options > USB debugging
Android 8.0.0 (API level 26) and Android 8.1.0 (API level 27): Settings > System > Developer Options > USB debugging
Android 7.1 (API level 25) and lower: Settings > Developer Options > USB debugging
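Once USB debugging is enabled, a quick way to confirm the connection from your development machine is the `adb` command-line tool. A minimal sketch, assuming the Android platform-tools are on your PATH (the snippet is guarded so it degrades gracefully when adb is not installed):

```shell
# Check whether adb is available, then list connected devices.
if command -v adb >/dev/null 2>&1; then
  # Authorized devices appear in the list with the state "device";
  # "unauthorized" means you still need to accept the prompt on the phone.
  status=$(adb devices)
else
  status="adb not found; install Android platform-tools first"
fi
echo "$status"
```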
The rest of this page describes some of the other options available on this screen.
How to rotate and scale an object?
To rotate and scale an object, first ensure that the Manipulation Handler (Script) and Near Interaction Grabbable (Script) components are added to the object. The Manipulation Handler (Script) allows you to manipulate an object, while the Near Interaction Grabbable (Script) allows the object to respond to near hand interactions.
To add the scripts to the object, first select the object in the Hierarchy window. In the Inspector window, click Add Component and search for each script. Once found, select the script to add to the object.
Manipulation Handler Script
With the object selected, in the Inspector window, navigate to the Manipulation Handler (Script) component to modify the component's parameters.
Manipulation Handler parameters
You can rotate an object using one or two hands. This setting is dependent on the Manipulation Type parameter. The Manipulation Type can be limited to either:
One Handed Only
Two Handed Only
One and Two Handed
Select Two Handed Only for Manipulation Type so that the user can only manipulate the object with two hands.
To limit the two handed manipulation to rotating and scaling, change Two Handed Manipulation Type to Rotate Scale.
To limit whether the object can be rotated on the x, y or z axis, change Constraint on Rotation to your preferred axis.
You can now test rotating and scaling the object using the in-editor simulation. Press the Play button to enter Game mode. Once in Game mode, press T and Y on the keyboard to toggle the left and right simulated hands, which keeps both hands visible in Game mode. Press the space bar to move the right hand, and use left mouse click + Shift to move the left hand. While controlling either hand, use the mouse to rotate and scale the object.
The Windows Device Portal for HoloLens lets you configure and manage your device remotely over Wi-Fi or USB. The Device Portal is a web server on your HoloLens that you can connect to from a web browser on your PC. The Device Portal includes many tools that will help you manage your HoloLens and debug and optimize your apps.
How to setup device portal?
What are some examples of Mixed Reality Applications?
What do I need to download for Mixed Reality development with Unity for HoloLens?
Before you get started with developing for Mixed Reality for Unity, make sure to check everything in the below list and follow the instructions for each download.
Not following the instructions for a specific download might result in errors while developing or building your application. Before you try to debug, check the list and the detailed instructions.
Windows 10
Install the most recent version of Windows 10 so your PC's operating system matches the platform for which you are building mixed reality applications.
You can check your Windows version by typing "about" in the Windows search bar and selecting About your PC as shown in the below image.
You can learn more about upgrading your Windows 10 Home to Pro at .
We need to install and enable Hyper-V, which does not work on Windows Home. Make sure to upgrade to the Education, Pro Education, Pro, or Enterprise versions.
Unity
Go to the Unity download page and download the Unity Hub instead of the Unity Editor.
In general, do not use beta software until you feel very comfortable with debugging, the software itself, and your way around GitHub issues and Stack Overflow. Don't learn this lesson the hard way! I have tried it for your benefit and/or my optimism.
Unity Hub allows you to download multiple Unity Editors and organize your projects in one place. Since Unity upgrades are not backward compatible, you have to open projects with the same Unity version they were created with. You can update a project to the latest Unity version, but that usually requires a lot of debugging. The easiest way to get going with a project is to keep the same version. I will show you how to update and debug your projects later in this chapter.
You will need to download the Windows development related modules along with your Unity Editor. Make sure Universal Windows Platform Build Support and Windows Build Support are checked while downloading the Unity Editor through Unity Hub, or add them afterwards by modifying the install.
You can add modules, or check whether you already have them, by clicking the hamburger button for the Unity Editor version and checking the module check-boxes shown above.
If you would like to build for an Android or iOS mobile device, make sure the related modules are checked as well.
Visual Studio
You can download Visual Studio by adding the Microsoft Visual Studio 2019 module to your Unity Editor, as shown in the previous step, or download it separately.
Make sure to download the Mixed Reality related modules along with Visual Studio.
You can always add the necessary workflows to Visual Studio after download:
How to get started with Unity3D Editor interface?
In this section, you will learn the Unity3D interface, tools, and keyboard shortcuts.
The Unity Editor has four main sections:
Scene view
This is where you can edit the current Scene by selecting and moving objects in the 3D space for the game. In this kit, the game level is contained in one Scene.
03 - Hand Interactions and Controllers
Working with Hand Interactions.
Short link:
Overview
In this section, we will look into the hand interactions as an input in our application.
# Clone the seed project
git clone --depth=1 https://github.com/Yonet/HoloLensUnitySeedProject.git
cd HoloLensUnitySeedProject
# Remove the history from the repo
rm -rf .git
# Recreate the repo from the current content only
git init
git add .
git commit -m "Initial commit"
# Push to the GitHub remote repo, overwriting its history
git remote add origin [email protected]:<YOUR ACCOUNT>/<YOUR REPOS>.git
git push -u --force origin master
Will mixed reality replace our phones and personal computers?
How do I decide if I need to develop for Virtual Reality or Augmented Reality?
Changing preferences in Unity3D
How to check or enable "Hyper-V" on your PC
How to install or update HoloLens Emulator?
Hierarchy window
This is a list of all the GameObjects in a Scene. Every object in your game is a GameObject. These can be placed in a parent-child hierarchy, which lets you group objects — this means that when the parent object is moved, all of its children will move at the same time.
Inspector window
This displays all settings related to the currently selected object. You will explore this window more during the walkthrough.
Project window
This is where you manage your Project Assets. Assets are the media files used in a Project (for example, images, 3D models and sound files). The Project window acts like a file explorer, and it can be used to explore and create folders on your computer. When the walkthrough asks you to find an Asset at a given file path, use this window.
TIP: If your Editor layout doesn’t match the image above, use the layout drop-down menu at the top right of the toolbar to select Default.
Going back to default editor layout.
Unity Editor Toolbar
Unity Editor Toolbar.
The toolbar includes a range of useful tool buttons to help you design and test your game.
Play Buttons
Play
Play is used to test the Scene which is currently loaded in the Hierarchy window, and enables you to try out your game live in the Editor.
Pause
Pause, as you have probably guessed, allows you to pause the game playing in the Game window. This helps you spot visual problems or gameplay issues that you wouldn’t otherwise see.
Step
Step is used to walk through the paused Scene frame by frame. This works really well when you’re looking for live changes in the game world that it would be helpful to see in real time.
Manipulating objects
These tools move and manipulate the GameObjects in the Scene view. You can click on the buttons to activate them, or use a shortcut key.
Hand Tool
Hand Tool Keyboard Shortcut: Q
You can use this tool to move your Scene around in the window. You can also use middle click with the mouse to access the tool.
Move Tool
Move Tool Keyboard Shortcut: W
This tool enables you to select items and move them individually.
Rotate Tool
Rotate Tool Keyboard Shortcut: E
Select items and rotate them with this tool.
Scale Tool
Scale Tool Keyboard Shortcut: R
Tool to scale your GameObjects up and down.
Rect Transform Tool
Rect Transform Tool Keyboard Shortcut: T
This tool does lots of things. Essentially, it combines moving, scaling and rotation into a single tool that’s specialized for 2D and UI.
Rotate, Move or Scale
Rotate, Move or Scale Tool Keyboard Shortcut: Y
This tool enables you to move, rotate, or scale GameObjects, but is more specialized for 3D.
Focusing on GameObject
Focusing on a GameObject Keyboard Shortcut: F
Another useful shortcut is the F key, which enables you to focus on a selected object. If you forget where a GameObject is in your Scene, select it in the Hierarchy. Then, move your cursor over the Scene view and press F to center it.
Navigating with the mouse
When you’re in the Scene view, you can also do the following:
Left click to select your GameObject in the Scene.
Middle click and drag to move the Scene view’s camera using the hand tool.
For more advice on moving GameObjects in the Scene view, see Scene View Navigation in the Manual.
Unity3D Editor Interface
Hand interactions are currently available only for HoloLens 2 and Oculus devices.
In the project section, we will create our first hand interactions to scale, move, and rotate objects.