
Mixed Reality Docs

How to use MRTK Visual Profiler?

How to monitor performance of your app?

Working with 3D Objects

How to log for debugging purposes?

How to add MRTK(Mixed Reality Toolkit) Diagnostic System to your project?

Where to find pre-made 3D models?

How to upload 3D models to your project?

How to create your own models using Maquette?

How to create polygon models?

How to create 3D models with splines?

How to create 3D models using Autodesk 3dsMax?

How to add hand interactions to an object?

How to add Manipulation Handler to your object?

How to make an object respond to input events?

How to add visual feedback?

How to organize your buttons into a grid view?

How to use simplified joint data access?

What could go wrong?

Mixing scaling and moving.

Having a small bounding box.

Project

How to get permission to use eye-tracking?

How to setup eye-tracking?

How to simulate eye-tracking in the Unity editor?

How to enable eye calibration?

How to use eye-tracking to select an object?

How to use eye-tracking for infinite scroll?

How to visualize eye tracking data?

How to setup head tracking?

Using eye movement without a delay to select an object.

Using eye gaze to influence the user.


01 - Introduction to Mixed Reality

Introduction to Mixed Reality Applications and Development

Overview

In this lesson, you will learn about the basic concepts of Mixed Reality and explore the applications of Mixed Reality in different industries.

In the Project section, you will set up your first Mixed Reality project using Unity and the Mixed Reality Toolkit.

Short link: aka.ms/UnityIntroToMixedReality

You can jump directly into setting up your first project in the How to get started with mixed reality development using Unity3D section.

  • Concepts

  • Project

  • What could go wrong?

  • Resources

Concepts

In this lesson, you will learn about the basic concepts of Mixed Reality and explore the applications of Mixed Reality in different industries.

Mixed Reality Curriculum

Learn Mixed Reality development using Azure Mixed Reality Services

How to use this book?

This book is designed as a collection of classes that start from basic concepts and build a project over time. Each lesson can also be used as an individual workshop. Each class follows the structure below:

  • Core concepts and discussion points.

  • Project step-by-step walk-through.

  • What could go wrong. A section to discuss the common mistakes and issues.

  • Further reading resources.

Hope you enjoy developing your mixed reality application!

Unity3D Lessons

Links

Read through the questions below. If you feel comfortable with the answers, feel free to skip to the project section or the next chapters.

  • What is Mixed Reality?

  • What is the difference between Augmented Reality, Virtual Reality and Mixed Reality?

  • Why is Mixed Reality important?

  • Will mixed reality replace our phones and PCs?

  • How do I decide if I need to develop for Virtual Reality or Augmented Reality?

  • What are some use cases for Mixed Reality applications?

  • What are some examples of Mixed Reality Applications?

  • What is Mixed Reality Toolkit (MRTK)?

Mixed Reality Curriculum: aka.ms/MixedRealityCurriculum

WebXR Lessons: www.learnwebxr.dev

Unity Lessons: aka.ms/MixedRealityUnityLessons

AI Lessons: www.learnaiml.dev

Unreal Lessons: aka.ms/MixedRealityUnrealLessons

Each class has questions as sections and builds the corresponding part of the project. If you feel you can correctly answer a question, feel free to move on to the next question or the next class.

If you have any questions, suggestions or improvements, please submit an issue here: https://github.com/Yonet/AzureMixedRealityDocs/issues.

We welcome your contributions. If you would like to contribute, check out the contributing section.

Lesson 1: Introduction to Mixed Reality Applications and Development.

Lesson 2: Introduction to Mixed Reality Developer Tools and 3D Concepts.

Lesson 3: Working with Hand Interactions.

Lesson 4: Eye and Head Gaze Tracking.

Lesson 5: Spatial Visualization using Bing Maps.

Lesson 6: Working with REST APIs.

Lesson 7: Azure Spatial Anchors and Backend Services.

Lesson 8: Displaying Spatial Anchors on a map.

Lesson 9: Working with QR codes.

Lesson 10: Working with Spatial Awareness and Scene Understanding.

Lesson 11: Getting Started with AI.

Lesson 12: Project Discussion and Case Studies.

Short link: aka.ms/MixedRealityCurriculum

Mixed Reality Curriculum Playlist: https://aka.ms/MixedRealityCurriculumVideos

Code Samples: https://aka.ms/MixedRealityUnitySamples

Github: https://github.com/Yonet/AzureMixedRealityDocs

Slack Channel: https://holodevelopers.slack.com/archives/G012X50UVML

What is the difference between Augmented Reality, Virtual Reality and Mixed Reality?

Augmented Reality (AR) is defined as a technology that superimposes a computer-generated image on a user's view of the real world, providing a composite view. Augmented Reality experiences are not limited to visual additions to our world. You can create augmented experiences that add only audio to your physical world, or both audio and visuals.

Augmented Reality experiences are also not limited to headsets like HoloLens. Today, millions of mobile devices have depth-sensing capabilities to augment your real world with digital information.

Virtual Reality (VR) is when you are fully immersed in a virtual world by wearing a headset. In Virtual Reality you lose the visual connection to the real world. Virtual Reality applications are great for training and for simulations where users benefit from total immersion to replicate a real-life situation. Some examples include training for firefighters, emergency room healthcare providers and flight simulations.

Will mixed reality replace our phones and Personal Computers?

What are some examples of Mixed Reality Applications?

Medical

  • Incubator for Medical Mixed and Extended Reality at Stanford

  • Take a Ride on a Root Canal: The VR Tooth Tour Story

Museums and Libraries

  • Mont-Saint-Michel: The historic 3D model comes to life

  • Apollo 11 Mission Unreal Engine Experience

  • Dutch National Museum HoloLens Experience

  • Paris Museum HoloLens Experience

  • OneDome - Unreal Garden HoloLens Exhibition

  • Bouluvard HoloLens App

  • Dinosaur Passage: A Hololens Museum Experience

  • Smithsonian Apollo 11 Module VR

  • Making of Apollo 11 Module Experience

  • Google I/O 2018 AR demo

  • SF Moma movement controlled experience

  • Google Augmented Reality museum experience

  • MoMa Jackson Pollack

  • DaVinci AR Layers Google IO 2018

  • Petersen Automotive Museum: a HoloLens experience

  • Mixed Reality Museum Tour Solution with HoloLens

  • Experimenting with Mixed Reality in Museums

  • Explore WWII’s French Resistance in mixed reality

  • Mixed Reality Museum in Kyoto: A unique insight into centuries-old Japanese artwork

  • Catalina HoloLens Experience

  • Museum Next: How Museums are using Augmented Reality

  • Museum Next: Virtual Reality is a big trend in museums, but what are the best examples of museums using VR?

  • Augmenting Museum Experiences with Mixed Reality

What is the Metaverse?

As of now, we do not know exactly what shape the Metaverse will take. That does not matter, either. What matters is that someday, a global network of spatially organized, predominantly 3D content will be available to all without restriction, for use in all human endeavors — a new and profoundly transformational medium, enabled by major innovations in hardware, human-computer interface, network infrastructure, creator tools and digital economies.

Please read Tony Parisi's article The Seven Rules of the Metaverse to learn the basic concepts and language.

Unity Lessons

Developing for Mixed Reality using Unity3D

What are some use cases for Mixed Reality applications?

  • 6 Use Cases for Enterprise Mixed Reality

  • 5 Use Cases Of Augmented Reality That Boosted Businesses’ Sales

Project

In this project, we will set up our development environment for Mixed Reality development with Unity3D.

Check your knowledge by answering the questions below before you move into the project. Feel free to skip the sections you feel comfortable with. Make sure you read through the first download section to confirm you have all the modules necessary.

  • What do I need to download for Unity Development?

  • How to get started with Unity3D Editor interface?

  • What are some key concepts for working with Unity?

  • How to Get Started with Mixed Reality Development Using Unity?

  • How to get started with HoloLens Seed Project?

  • How to change preferences in Unity?

  • How to add Mixed Reality Toolkit (MRTK) to a project?

  • How to open MRTK example scenes?

  • How to enable Developer Mode in HoloLens?

  • How to enable Developer Mode on an Android device?

  • How to build your project for HoloLens?

  • How to deploy your app for HoloLens?

  • How to set up your project for iOS and Android [Experimental]?

  • How to build and deploy your project for Android?

  • How to build and deploy your project for Windows Mixed Reality Headset?

Short link: aka.ms/MixedRealityUnityLessons

Lesson 1: Introduction to Mixed Reality Applications and Development.

Lesson 2: Introduction to Mixed Reality Developer Tools and 3D Concepts.

Lesson 3: Working with Hand Interactions and Controllers.

Lesson 4: Eye and Head Gaze Tracking.

Lesson 5: Spatial Visualization using Bing Maps.

Lesson 6: Working with REST APIs.

Lesson 7: Azure Spatial Anchors and Backend Services.

Lesson 8: Displaying Spatial Anchors on a map.

Lesson 9: Working with QR codes.

Lesson 10: Working with Scene Understanding.

Lesson 11: Getting Started with AI.

Lesson 12: Project Discussion and Case Studies.

Microsoft Mesh

Why is Mixed Reality important?

The first revolution in computing happened with the creation of mainframe computers: computers that, at times, occupied a whole room. Mainframes were used by large organizations such as NASA for critical, data-heavy applications.

The second wave of computing is defined by Personal Computers (PCs) becoming widely available.

We believe the third wave of computing will include many more devices to manage data, including IoT sensors and Mixed Reality devices.

We have more data than ever before. To be able to process the data and make informed decisions, we need access to the data at the right time and in the right place. Mixed Reality can bring that data into our context, the real world.

Design & Prototyping: Enables real-time collaborative iteration of 3D physical and virtual models across cross-functional teams and stakeholders.

Training & Development: Provides instructors with better tools to facilitate teaching/coaching sessions. It offers trainees an enhanced and engaging learning experience through 3D visualizations and interactivity.

Geospatial Planning: Enables the assessment and planning of indoor and outdoor environments (e.g. future construction sites, new store locations, interior designs), removing the need for manual execution.

Sales Assistance: Improves the effectiveness of individuals in sales-oriented roles by providing tools such as 3D catalogs and virtual product experiences that increase customer engagement and strengthen buyer confidence.

Field Service: Improves the first-visit resolution and customer satisfaction of customer support issues. It is typically used for complex products that would otherwise require a field visit. It can serve as a platform for targeted up-sell opportunities, as well.

Productivity & Collaboration: Transforms the space around you into a shared augmented workplace. Remote users can collaborate, search, brainstorm and share content as if they were in the same room.

How to Get Started with Mixed Reality Development Using Unity?

Unity Introduction.

How to create a new scene?

  • On the Project panel, right click and select Create > Scene.

  • Name your scene and drag it into the Scenes folder to keep your project organized.

Every new Scene comes with a light and a camera. We will modify the camera later for our Mixed Reality project.

How to get started with Unity3D Editor interface?

In this section, you will learn Unity3D interface, tools and keyboard shortcuts.

The Unity Editor has four main sections:

Scene view

This is where you can edit the current Scene by selecting and moving objects in the 3D space of your game. A game level is often contained in a single Scene.

Hierarchy window

This is a list of all the GameObjects in a Scene. Every object in your game is a GameObject. These can be placed in a parent-child hierarchy, which lets you group objects — this means that when the parent object is moved, all of its children will move at the same time.

Inspector window

This displays all settings related to the currently selected object. You will explore this window more during the walkthrough.

Project window

This is where you manage your Project Assets. Assets are the media files used in a Project (for example, images, 3D models and sound files). The Project window acts like a file explorer, and it can be used to explore and create folders on your computer. When the walkthrough asks you to find an Asset at a given file path, use this window.

TIP: If your Editor layout doesn’t match the image above, use the layout drop-down menu at the top right of the toolbar to select Default.

Unity Editor Toolbar

The toolbar includes a range of useful tool buttons to help you design and test your game.

Play Buttons

Play

Play is used to test the Scene which is currently loaded in the Hierarchy window, and enables you to try out your game live in the Editor.

Pause

Pause, as you have probably guessed, allows you to pause the game playing in the Game window. This helps you spot visual problems or gameplay issues that you wouldn’t otherwise see.

Step

Step is used to walk through the paused Scene frame by frame. This works really well when you’re looking for live changes in the game world that it would be helpful to see in real time.

Manipulating objects

These tools move and manipulate the GameObjects in the Scene view. You can click on the buttons to activate them, or use a shortcut key.

Hand Tool

You can use this tool to move your Scene around in the window. You can also use middle click with the mouse to access the tool.

Move Tool

This tool enables you to select items and move them individually.

Rotate Tool

Select items and rotate them with this tool.

Scale Tool

This tool scales your GameObjects up and down.

Rect Transform Tool

This tool does lots of things. Essentially, it combines moving, scaling and rotation into a single tool that’s specialized for 2D and UI.

Rotate, Move or Scale

This tool enables you to move, rotate, or scale GameObjects, but is more specialized for 3D.

Focusing on GameObject

Another useful shortcut is the F key, which enables you to focus on a selected object. If you forget where a GameObject is in your Scene, select it in the Hierarchy. Then, move your cursor over the Scene view and press F to center it.

Navigating with the mouse

When you’re in the Scene view, you can also do the following:

  • Left click to select your GameObject in the Scene.

  • Middle click and drag to move the Scene view’s camera using the hand tool.

For more advice on moving GameObjects in the Scene view, see Scene View Navigation in the Manual.

What do I need to download for Mixed Reality development with Unity for HoloLens?

Before you get started with developing for Mixed Reality for Unity, make sure to check everything in the below list and follow the instructions for each download.

Not following the instructions for a specific download might result in errors while developing or building your application. Before you try to debug, check the list and the detailed instructions.

Windows 10

You can check your Windows version by typing "about" in the Windows search bar and selecting About your PC, as shown in the image below.

We need to install and enable Hyper-V, which does not work on Windows Home. Make sure to upgrade to Education, Pro Education, Pro or Enterprise versions.

Unity

In general, do not use beta software until you feel very comfortable with debugging, the software itself, and finding your way around GitHub issues and Stack Overflow. Don't learn this lesson the hard way! I have tried that for your benefit and/or my optimism.

Unity Hub allows you to download multiple Unity Editors and organize your projects in one place. Since Unity upgrades are not backward compatible, you have to open a project with the same Unity version it was created with. You can update a project to the latest Unity version, but that usually requires a lot of debugging. The easiest way to get going with a project is to keep the same version. I will show you how to update and debug your projects later in this chapter.

You will need to download the Windows development related modules along with your Unity Editor. Make sure Universal Windows Platform Build Support and Windows Build Support are checked while downloading the Unity Editor through Unity Hub, or add them afterwards by modifying the install.

You can add modules or check if you have them in your editor by clicking on the hamburger button for the Unity Editor version and checking the above module check-boxes.

If you would like to build for an Android or iOS mobile device, make sure the related modules are checked as well.

Visual Studio

Make sure to download Mixed Reality related modules along with Visual Studio.

You can always add the necessary workloads to Visual Studio after download.

What are some key concepts for working with Unity?

Let’s review some key concepts, which will help you as you begin to explore editing scripts for mixed reality development.

Scenes

In Unity, areas of the game that a player can interact with are generally made up of one or more Scenes. Small games may only use one Scene; large ones could have hundreds.

Every Unity project you create comes with a SampleScene that has a light and a camera.

You can create a new scene by right-clicking under the Assets tab and selecting Create > Scene. Organizing scenes under a Scenes folder is only for organization purposes.

You can use scenes to organize navigation inside your application or to add different levels to a game.

GameObjects and components

Every object in the game world exists as a GameObject in Unity. GameObjects are given specific features by attaching appropriate components, which provide a wide range of different functionality.

When you create a new GameObject, it comes with a Transform component already attached. This component controls the GameObject’s positional properties in the 3D (or 2D) gamespace. You need to add all other components manually in the Inspector.

Prefabs

Prefabs are a great way to configure and store GameObjects for re-use in your game. They act as templates, storing the components and properties of a specific GameObject and enabling you to create multiple instances of it within a Scene.

All copies of the Prefab template in a Scene are linked. This means that if you change the object values for the health potion Prefab, for example, each copy of that Prefab within the Scene will change to match it. However, you can also make specific instances of the GameObject different from the default Prefab settings.

What is Mixed Reality?

Mixed reality is the merging of real and virtual worlds to produce new environments and visualizations, where physical and digital objects co-exist and interact in real time.

We think of Mixed Reality as a spectrum from the physical world, to an augmented world, to a fully immersive virtual world, and all the possibilities in between.

How do I decide if I need to develop for Virtual Reality or Augmented Reality?

Install the most recent version of Windows 10 Education or Pro Education so your PC's operating system matches the platform for which you are building mixed reality applications.

You can learn more about upgrading your Windows 10 Home to Pro at aka.ms/WinHome2Pro.

Go to the https://unity3d.com/get-unity/download page and download the Unity Hub instead of the Unity Editor.

You can download Visual Studio by adding the Microsoft Visual Studio 2019 module to your Unity Editor as shown in the previous step, or download it at aka.ms/VSDownloads.

How to change preferences in Unity?

  • Go to Edit > Preferences.

  • Change the color scheme under General, if it is available.

  • You can change the default script editor under External Tools; the External Script Editor drop-down lists the editors currently available on your computer.

What is Mixed Reality Toolkit(MRTK)?

Mixed Reality Toolkit (MRTK) provides a set of components and features to accelerate cross-platform Mixed Reality application development in Unity. MRTK includes:

  • UI and interaction building blocks.

  • Tools.

  • Example Scenes.

You can learn more about the components at aka.ms/MRTKGuides.

How to enable Developer Mode in HoloLens?

  • Turn on your HoloLens device.

  • Tap your wrist (HoloLens 2) or make a bloom gesture (HoloLens 1) to bring up the Start menu.

  • Open Settings > Update & Security.

  • Select the For developers tab on the side panel and turn on Developer Mode.

How to enable Developer Mode on an Android Device?

The Settings app on Android includes a screen called Developer options that lets you configure system behaviors that help you profile and debug your app performance. For example, you can enable debugging over USB, capture a bug report, enable visual feedback for taps, flash window surfaces when they update, use the GPU for 2D graphics rendering, and more.

Enable developer options and USB debugging

On Android 4.1 and lower, the Developer options screen is available by default. On Android 4.2 and higher, you must enable this screen. To enable developer options, tap the Build Number option 7 times. You can find this option in one of the following locations, depending on your Android version:

  • Android 9 (API level 28) and higher: Settings > About Phone > Build Number

  • Android 8.0.0 (API level 26) and Android 8.1.0 (API level 27): Settings > System > About Phone > Build Number

  • Android 7.1 (API level 25) and lower: Settings > About Phone > Build Number

At the top of the Developer options screen, you can toggle the options on and off. You probably want to keep this on. When off, most options are disabled, except those that don't require communication between the device and your development computer.

Before you can use the debugger and other tools, you need to enable USB debugging, which allows Android Studio and other SDK tools to recognize your device when connected via USB. To enable USB debugging, toggle the USB debugging option in the Developer Options menu. You can find this option in one of the following locations, depending on your Android version:

  • Android 9 (API level 28) and higher: Settings > System > Advanced > Developer Options > USB debugging

  • Android 8.0.0 (API level 26) and Android 8.1.0 (API level 27): Settings > System > Developer Options > USB debugging

  • Android 7.1 (API level 25) and lower: Settings > Developer Options > USB debugging

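Once USB debugging is on, you can confirm that your development machine actually sees the device. The sketch below uses adb (the Android Debug Bridge, which ships with the Android platform tools); adb is not covered in the steps above, so treat this as a supplementary check, with a guard so it degrades gracefully if adb is not installed:

```shell
# Check whether adb (Android platform tools) is available, then list
# connected devices. A device shows as "unauthorized" until you accept
# the USB-debugging prompt on its screen.
if command -v adb >/dev/null 2>&1; then
  adb devices
else
  echo "adb not found: install the Android platform tools first"
fi
```

If your device is listed as `device` (not `unauthorized` or `offline`), Android Studio and other SDK tools will be able to deploy to it.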

How to get started with HoloLens Seed Project?

Setup

You can clone this repository, delete its history, and start a new git project by running the script below. You need to create your own github repo first. Replace <your-project-name> with your own project name.

git clone --depth=1 https://github.com/Yonet/HoloLensUnitySeedProject.git <your-project-name>

Or by running the git commands below:

# Clone the seed project and move into it
git clone --depth=1 https://github.com/Yonet/HoloLensUnitySeedProject.git
cd HoloLensUnitySeedProject

# Remove the history from the repo
rm -rf .git

# Recreate the repo from the current content only
git init
git add .
git commit -m "Initial commit"

# Push to the github remote repo, ensuring you overwrite history
git remote add origin git@github.com:<YOUR ACCOUNT>/<YOUR REPOS>.git
git push -u --force origin master

How to update your project to latest seed?

git remote add upstream https://github.com/Yonet/HoloLensUnitySeedProject.git
git pull upstream master

You can check your remote origin and upstream by copying and pasting this into your terminal:

git remote -v

You can remove the upstream at any time by running:

git remote remove upstream

The HoloLens Seed project is a github repository that is configured for Windows Mixed Reality development. The repo includes the Mixed Reality Toolkit and .gitignore files.

You can create a new project from the seed instead of downloading the different assets and setting up your git project yourself. To be able to use the seed project, you can get a github account and set up your development environment, or directly download the repository content.

Whenever there is a new update for the Mixed Reality Toolkit or Azure Spatial Anchors packages, this repo will be updated with the latest version. You can automatically get the latest packages by adding the seed repo as your upstream and pulling from it.
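The upstream workflow above can be wrapped so it is safe to run more than once (`git remote add` errors if the remote already exists). Below is a hedged sketch, not part of the seed project itself; it builds a scratch repository purely for demonstration, and in a real project you would skip the first line and run the rest from your project root:

```shell
# Scratch demo: create a throwaway repo, then add the seed repo as
# "upstream" only if it is not already configured.
demo=$(mktemp -d) && cd "$demo" && git init -q .

if git remote get-url upstream >/dev/null 2>&1; then
  echo "upstream already configured"
else
  git remote add upstream https://github.com/Yonet/HoloLensUnitySeedProject.git
  echo "upstream added"
fi
git remote -v    # verify origin and upstream remotes
```

After the remote is configured, `git pull upstream master` (as shown above) fetches the latest seed packages into your project.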

How to add Mixed Reality Toolkit(MRTK) to a project?

If you are using the HoloLens Seed project, you do not need to follow this step; the seed project already comes with MRTK. Still, it's good to know how to import the MRTK assets for your future projects.

First, you need to download MRTK by going to its github page, aka.ms/MRTKGithub, and navigating to the Releases tab. Scroll down to the Assets section and download the packages:

  • Examples

  • Extensions

  • Foundation

  • Tools

Add MRTK assets into your project

In your Unity project, select the Assets tab and select Import Package > Custom Package from the drop-down.

Navigate to the downloaded MRTK packages to select and import them into your project.

  • You might be prompted to select a configuration. You can choose the default MRTK configuration, or if you are developing for a HoloLens device, you can choose the configuration for the appropriate version.

Once you have the MRTK assets imported, a new menu called Mixed Reality Toolkit will appear in your Unity editor. Navigate to the new menu and select Add Scene and Configure. In your Scene Hierarchy, new MixedRealityToolkit and MixedRealityPlayspace objects will appear.

MixedRealityPlayspace now includes your Main Camera, and the camera is configured for Mixed Reality applications. The camera background is black to render as transparent, and MixedRealityInputModule, EventSystem and GazeProvider components are now added to your camera.

You can create a new scene to compare the camera settings that were changed by MRTK.

How to build your project for HoloLens?

  • In the Unity menu, select File > Build Settings... to open the Build Settings window.

  • In the Build Settings window, select Universal Windows Platform and click the Switch Platform button.

  • Click Project Settings in the Build Settings window, or in the Unity menu select Edit > Project Settings..., to open the Project Settings window.

  • In the Project Settings window, select Player > XR Settings to expand the XR Settings.

  • In the XR Settings, check the Virtual Reality Supported checkbox to enable virtual reality, then click the + icon and select Windows Mixed Reality to add the Windows Mixed Reality SDK.

Your project settings might have already been configured by the Mixed Reality Toolkit.

  • Optimize the XR Settings as follows:

    • Set Windows Mixed Reality Depth Format to 16-bit depth.

    • Check the Windows Mixed Reality Enable Depth Sharing checkbox.

    • Set Stereo Rendering Mode to Single Pass Instanced.

  • In the Project Settings window, select Player > Publishing Settings to expand the Publishing Settings. Scroll down to the Capabilities section and check the SpatialPerception checkbox.

Save your project and open the Build Settings window. Click the Build button, not Build and Run. When prompted, create a new folder (e.g. HoloLensBuild) and select it to build your files into.

When your build is done, your file explorer will automatically open to the build folder you just created.

How to open MRTK example scenes?

  • On your Project panel select Assets > MixedRealityToolkit.Examples > Demos.

  • Select the folder you want to see an example of, e.g. HandTracking or EyeTracking.

  • Open the Scenes folder, select a scene and double-click to open it.

  • You can press play to try out the scene in your editor window.

How to deploy your app to a HoloLens?

How to build your scene for Android and iOS Devices?

There are no additional steps after switching the platform for Android.

How to build and deploy your project for Windows Mixed Reality Headset?

Unchecking Strip Engine Code is the short-term solution to an error in Xcode (issue #6646). We are working on a long-term solution.

How to set up your project for iOS or Android[Experimental]?

1) Make sure you have imported Microsoft.MixedReality.Toolkit.Unity.Foundation as a custom asset or through NuGet.

2) In the Unity Package Manager (UPM), install the following packages:

Android

  • AR Foundation, version 2.1.4

  • ARCore XR Plugin, version 2.1.2

iOS

  • AR Foundation, version 2.1.4

  • ARKit XR Plugin, version 2.1.2

3) Enable the Unity AR camera settings provider.

The following steps presume use of the MixedRealityToolkit object. Steps required for other service registrars may be different.

  1. Select the MixedRealityToolkit object in the scene hierarchy.

  2. Select Copy and Customize to clone the MRTK profile and enable custom configuration.

  3. Select Clone next to the Camera Profile.

  4. In the Inspector panel, navigate to the camera system section and expand the Camera Settings Providers section.

  5. Click Add Camera Settings Provider and expand the newly added New camera settings entry.

  6. Select the Unity AR Camera Settings provider from the Type drop-down.

02 - Mixed Reality Developer Tools and Concepts

Introduction to Mixed Reality Developer Tools and 3D Concepts

Overview

In this section, we will go through the developer tools and how to get started with debugging our applications.

Second part of the course is focused on creating and using 3D assets in your applications.

Short link: aka.ms/UnityMixedRealityDeveloperTools

  • Concepts

  • Project

  • What could go wrong?

  • Resources

Resources

Mixed Reality getting started resources

Windows Mixed Reality Docs: aka.ms/MixedRealityDocs

Mixed Reality Curriculum Youtube Playlist: aka.ms/MixedRealityCurriculumVideos

Mixed Reality Resources Repository: aka.ms/MixedRealityResourcesRepository

HoloLens Seed Project Repository: aka.ms/HoloLensSeedProject

Code Samples: aka.ms/MixedRealityUnitySamples

Mixed Reality Development Tools to install: https://aka.ms/HoloLensToolInstalls

Eliminate Texture Confusion: Bump, Normal and Displacement Maps: https://www.pluralsight.com/blog/film-games/bump-normal-and-displacement-maps

Normal vs. Displacement Mapping & Why Games Use Normals: https://cgcookie.com/articles/normal-vs-displacement-mapping-why-games-use-normals

Live editing WebGL shaders with Firefox Developer Tools: https://hacks.mozilla.org/2013/11/live-editing-webgl-shaders-with-firefox-developer-tools/

WebXR Emulator: https://chrome.google.com/webstore/detail/webxr-api-emulator/mjddjgeghkdijejnciaefnkjmkafnnje?hl=en

Concepts

What makes a 3D model? What are Polygons, Splines, Vertices, Meshes and Materials?

A 3D model is a digital representation of a real-world object. Representing a 3D object requires understanding the parts that make it up.

Polygonal modeling is an approach for modeling objects by representing or approximating their surfaces using polygon meshes.

Objects created with polygon meshes must store different types of elements. These include vertices, edges, faces, polygons and surfaces.

Why are polygons important?

The more edges and faces a model has, the more detail it can show. On the other hand, a high polygon count will reduce the performance of your app, because the calculations needed to render the model are expensive.
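To make this concrete, here is a minimal sketch of how a mesh is defined in Unity: an array of vertices plus an array of triangle indices, where every group of three indices forms one triangular face. The class and component names are standard Unity API; the quad built here is purely illustrative.

```csharp
using UnityEngine;

// Builds a 1 x 1 quad from 4 vertices and 2 triangles (6 indices).
// Attach to a GameObject that has a MeshFilter and MeshRenderer.
public class QuadBuilder : MonoBehaviour
{
    void Start()
    {
        var mesh = new Mesh();
        mesh.vertices = new Vector3[]
        {
            new Vector3(0, 0, 0), // bottom-left
            new Vector3(1, 0, 0), // bottom-right
            new Vector3(0, 1, 0), // top-left
            new Vector3(1, 1, 0)  // top-right
        };
        // Each group of three indices is one triangular face.
        mesh.triangles = new int[] { 0, 2, 1, 2, 3, 1 };
        mesh.RecalculateNormals();
        GetComponent<MeshFilter>().mesh = mesh;
    }
}
```

Every extra face adds more index and vertex data that the GPU must process each frame, which is why polygon count matters for performance.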


What could go wrong?

Common issues to consider while developing for Mixed Reality

What are some of the security issues with Mixed Reality Applications?

Since a Mixed Reality application might have access to the user's video stream, developers might be able to save or share private information about the user. Be careful not to save any sensitive data or images anywhere other than the user's device. Never send sensitive information to any backend.

Why is eye scan data sensitive information?

An iris scan is a more accurate identification method than a fingerprint. Since iris scan data can be used to identify and sign in a user, it should never leave the user's device. HoloLens 2 does not send the iris scan to the cloud and does not give access to the data.

Why is eye tracking important for users privacy?

Eye tracking, while a very useful tool for making your application more accessible, can also be used to collect data about the user's attention and might be used to manipulate it.

Can I open my Unity project in the current version if it was originally saved in an older version?

Unity versions are not backward compatible. If you open a project in a newer version, Unity will try to update it automatically, but it is not guaranteed that the newer version will work with your imported assets. There might be incompatibilities between your assets or your code and the new version.

What does the Unity Versioning mean and when is it safe to update the Unity version?

Let's take the latest version in the image below, 2019.3.5f:

  • 2019: the year the Unity version was developed. Major versions are issued once a year and may include changes that will break your application. Stick to the same year version unless you are creating a new application from scratch. We will talk about how to update your project to the latest version in the following lessons.

  • 3: the 3rd iteration in 2019. When a version updates from 2 to 3, there may be minor breaking changes. Make sure to read the changelog before updating your project from 2 to 3.

  • .5f: bug fixes. Usually these are small fixes that do not break your code or the APIs being used. Feel free to update your project from 2019.3.4f to 2019.3.5f.

How to update my Unity version to a newer one?

In the Unity Hub, under the Projects tab, you can select the Unity version drop-down for your application and choose a newer version of Unity. Unity will confirm your choice before updating your project. It is a good idea to save a version of your project as a new branch on GitHub, in case you need to revert.

What could go wrong with Unity NuGet packages?

Here is a detailed article about the subject:

https://www.what-could-possibly-go-wrong.com/unity-and-nuget/

What is Debugging?

Debugging is the process of finding and resolving defects or problems within a computer program that prevent correct operation of the software.

Debugging tactics can involve:

  • Interactive debugging

  • Control flow analysis

  • Unit testing

  • Integration testing

  • Log file analysis

  • Monitoring at the application or system level

  • Memory dumps

  • Profiling

How to choose performant 3D models for your application?

  • Reuse a model instance instead of creating a new model wherever you can.

Project

In this section we will install Windows Mixed Reality developer tools and learn about how to use them.

How to set-up HoloLens 2 development environment?

How to simulate input interactions in Unity editor?

Mixed Reality Toolkit (MRTK) supports in-editor input simulation. Simply run your scene by clicking Unity's Play button, then use these keys to simulate input.

  • Press W, A, S, D keys to move the camera.

  • Hold the right mouse button and move the mouse to look around.

  • To bring up the simulated hands, press the Space bar (right hand) or the left Shift key (left hand).

  • To rotate the simulated hands, press Q or E (horizontal) or R or F (vertical).

What is HoloLens Emulator?

The HoloLens Emulator lets you test holographic applications on your PC without a physical HoloLens. It also includes the HoloLens development toolset.


To keep the simulated hands in the view, press the T or Y key.


What could go wrong?

Common issues working with developer tools and 3D objects

How to start debugging performance issues?

How to deploy to HoloLens Emulator?

03 - Hand Interactions and Controllers

Working with Hand Interactions.

Overview

In this section, we will look into the hand interactions as an input in our application.

Hand interactions are currently available only on HoloLens 2 and Oculus devices.

In the project section, we will create our first hand interactions to scale, move and rotate objects.

Concepts

Short link:

aka.ms/UnityHandInteractions

Why is hand interaction important?

Hand interaction is a very natural way to interact with 3D models. Since we interact with and modify real objects with our hands, a new user can start using your application without first having to learn its interface.


How to set-up HoloLens 2 Emulator

Can the HoloLens Emulator run on my device?

Before installing the emulator, make sure your PC meets the following hardware requirements:

Windows 10 Home Edition does not support Hyper-V or the HoloLens Emulator. The HoloLens 2 Emulator requires the Windows 10 October 2018 update or later.

How to check or enable "Hyper-V" settings?

How to install or update the Emulator?

You can download the latest HoloLens Emulator update here: bit.ly/emulator2

In the BIOS, the following features must be supported and enabled:

  • Hardware-assisted virtualization

  • Second Level Address Translation (SLAT)

  • Hardware-based Data Execution Prevention (DEP)

What is HoloLens Device Portal?

The Windows Device Portal for HoloLens lets you configure and manage your device remotely over Wi-Fi or USB. The Device Portal is a web server on your HoloLens that you can connect to from a web browser on your PC. The Device Portal includes many tools that will help you manage your HoloLens and debug and optimize your apps.

How to setup device portal?

  • Turn on your HoloLens device.

  • Tap your wrist (HoloLens 2) or make a bloom gesture (HoloLens 1) to bring up the Windows menu.

  • Open the Settings > Update & Security.

  • Select the For Developers tab on the right-hand panel.

  • Enable "Use developer features" by toggling the on/off button.

  • Scroll down at the For Developer settings to enable "Device Portal".

  • Go back to all settings page by clicking "Home" on the left hand panel and select "Network & Internet" settings.

  • Select "Wifi" tab on the left, if it is not already selected.

  • Select the Wi-Fi network you are connected to and click Advanced Options.

  • Scroll down and write down the IPv4 address.

  • Type this IP address into your browser to reach your device portal.

  • You might see a connection Alert as shown below:

  • Go ahead and click the Advanced button, then click Proceed to <your IP address> (unsafe).

  • Congrats, you made it to your device portal.

  • Click Views on the right hand panel and select "Live Preview" to see the camera view of your HoloLens.

  • You can turn off the PV camera if you would like to share or record what you are seeing through your HoloLens but do not want to capture your environment.

  • You can see the videos recorded or screenshots you snapped by asking Cortana here, in the Videos and Photos section, if you enabled voice commands.

Project

How to run the MRTK Hand Interaction examples in Unity editor?
How to organize your objects into a grid view?
How to add manipulation handler to your object?
How to grab and move an object?
How to rotate and scale an object?
How to make an object respond to input events?
How to add audio feedback?
How to add visual feedback?
How to place an object onto a surface?
How to style bounding box?
How to add button prefabs to your project?
How to make your buttons follow your hand?
How to use simplified joint data access?

How to run the Mixed Reality Toolkit (MRTK) Hand Interaction examples in the Unity Editor?

The HandInteractionExamples.unity example scene contains various types of interactions and UI controls that highlight articulated hand input.

To try the hand interaction scene, first open the HandInteractionExamples scene under Assets\MixedRealityToolkit.Examples\Demos\HandTracking\Scenes\HandInteractionExamples

This example scene uses TextMesh Pro. If you receive a prompt asking you to import TMP Essentials, select the Import TMP Essentials button. Some of the MRTK examples use TMP Essentials for improved text rendering. After you select Import TMP Essentials, Unity will then import the package.

After Unity completes the import, close the TMP Importer window and reload the scene. You can reload the scene by double-clicking the scene in the Project window.

After the scene is reloaded, press the Play button.

Resources

Developer Tools and 3D assets resources

What are Gestures?

Gestures are input events based on human hands.

There are two types of devices that raise gesture input events in the Mixed Reality Toolkit (MRTK):

  • Windows Mixed Reality devices such as HoloLens. This describes pinching motions ("Air Tap") and tap-and-hold gestures.

  • Touch screen devices.

Asset creation tools: https://github.com/Yonet/MixedRealityResources#asset-creation-tools

Asset Libraries: https://github.com/Yonet/MixedRealityResources#asset-libraries

Debugging C# code in Unity: https://docs.unity3d.com/Manual/ManagedCodeDebugging.html

Unity IL2CPP debugging: https://aka.ms/AA7qap4

WindowsMixedRealityDeviceManager wraps the Unity XR.WSA.Input.GestureRecognizer to consume Unity's gesture events from HoloLens devices.

UnityTouchController wraps the Unity Touch class that supports physical touch screens.

Both of these input sources use the Gesture Settings profile to translate Unity's Touch and Gesture events respectively into MRTK's Input Actions. This profile can be found under the Input System Settings profile.

How to organize your objects into a grid view?

You can organize any objects in Unity into a grid by using an object collection script. In this example, you will learn how to organize nine 3D objects into a 3 x 3 grid.

First, configure your Unity scene for the Mixed Reality Toolkit. Next, in the Hierarchy window, right click in an empty space and select Create Empty. This will create an empty GameObject. Name the object CubeCollection.

In the Inspector window, position CubeCollection so that the collection displays in front of the user (for example, X = 0, Y = -0.2, Z = 2).

With CubeCollection still selected, in the Hierarchy window, create a child Cube object. Change the scale of the object to x = .25, y = .25, z = .25.

Duplicate the child Cube object 8 times so that there is a total of 9 Cube child objects within the CubeCollection object.

In the Hierarchy window, select CubeCollection. In the Inspector window, click Add Component and search for the Grid Object Collection (Script). Once found, select the component to add to the object.

Configure the Grid Object Collection (Script) component by changing the Sort Type property to Child Order. This will ensure that the child objects (the 9 Cube objects) are sorted in the order you placed them under the parent object.

Click Update Collection to apply the new configuration.

You can adjust the parameters within the Grid Object Collection (Script) component to further customize the grid. For example, you could change the number of rows to 2 by changing the value in the Num Rows properties. Be sure to click Update Collection to apply the new configuration.
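The same layout can also be built from code. The sketch below assumes the MRTK 2.x GridObjectCollection API (the SortType, Rows, and UpdateCollection members); verify the names against your MRTK version.

```csharp
using Microsoft.MixedReality.Toolkit.Utilities;
using UnityEngine;

// Creates 9 cubes under this object and lays them out in a 3 x 3 grid.
public class CubeGrid : MonoBehaviour
{
    void Start()
    {
        for (int i = 0; i < 9; i++)
        {
            var cube = GameObject.CreatePrimitive(PrimitiveType.Cube);
            cube.transform.localScale = Vector3.one * 0.25f;
            cube.transform.SetParent(transform, false);
        }

        var grid = gameObject.AddComponent<GridObjectCollection>();
        grid.SortType = CollationOrder.ChildOrder; // keep placement order
        grid.Rows = 3;
        grid.UpdateCollection(); // apply the layout, like the Update Collection button
    }
}
```

Calling UpdateCollection() from code plays the same role as clicking Update Collection in the Inspector.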

How to rotate and scale an object?

To rotate and scale an object, first ensure that the Manipulation Handler (Script) and Near Interaction Grabbable (Script) components are added to the object. The Manipulation Handler (Script) allows you to manipulate an object, while the Near Interaction Grabbable (Script) allows the object to respond to near hand interactions.

To add the scripts to the object, first select the object in the Hierarchy window. In the Inspector window, click Add Component and search for each script. Once found, select the script to add to the object.

With the object selected, in the Inspector window, navigate to the Manipulation Handler (Script) component to modify the component's parameters.

You can rotate an object using one or two hands. This setting is dependent on the Manipulation Type parameter. The Manipulation Type can be limited to either:

  • One Handed Only

  • Two Handed Only

  • One and Two Handed

Select Two Handed Only for Manipulation Type so that the user can only manipulate the object with two hands.

To limit the two handed manipulation to rotating and scaling, change Two Handed Manipulation Type to Rotate Scale.

To limit whether the object can be rotated on the x, y or z axis, change Constraint on Rotation to your preferred axis.

You can now test rotating and scaling the object using the in-editor simulation. Press the Play button to enter Game mode. Once in Game mode, press T and Y on the keyboard to toggle both hands. This will permanently display both hands in Game mode. Press the space bar to move the right hand and use left mouse click + Shift to move the left hand. While either controlling the left or right hand, use the mouse to rotate and scale the object.
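If you prefer to configure these components from code, the sketch below assumes the MRTK 2.x ManipulationHandler property and enum names (ManipulationType, TwoHandedManipulationType); verify them against your MRTK version.

```csharp
using Microsoft.MixedReality.Toolkit.Input;
using Microsoft.MixedReality.Toolkit.UI;
using UnityEngine;

// Makes this object rotatable and scalable with two hands only.
public class TwoHandSetup : MonoBehaviour
{
    void Start()
    {
        // Needed so the object responds to near hand interactions.
        gameObject.AddComponent<NearInteractionGrabbable>();

        var handler = gameObject.AddComponent<ManipulationHandler>();
        handler.ManipulationType =
            ManipulationHandler.HandMovementType.TwoHandedOnly;
        handler.TwoHandedManipulationType =
            ManipulationHandler.TwoHandedManipulation.RotateScale;
    }
}
```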

How to grab and move an object?

To grab and move an object, first ensure that the Manipulation Handler (Script) and Near Interaction Grabbable (Script) components are added to the object. The Manipulation Handler (Script) allows you to manipulate an object, while the Near Interaction Grabbable (Script) allows the object to respond to near hand interactions.

To add the scripts to the object, first select the object in the Hierarchy window. In the Inspector window, click Add Component and search for each script. Once found, select the script to add to the object.

With the object selected, in the Inspector window, navigate to the Manipulation Handler (Script) component to modify the component's parameters.

You can move an object using one or two hands. This setting is dependent on the Manipulation Type parameter. The Manipulation Type can be limited to either:

  • One Handed Only

  • Two Handed Only

  • One and Two Handed

Select the preferred Manipulation Type so that the user is restricted to use one of the available manipulation types.

You can now test grabbing and moving the object using the in-editor simulation. Press the Play button to enter Game mode. Once in Game mode, hold the space bar to bring up the hand and use the mouse to grab and move the object.

How to add button prefabs to your project?


Mixed Reality Toolkit is equipped with a variety of button prefabs that you could add to your project. A prefab is a pre-configured GameObject stored as a Unity Asset and can be reused throughout your project.

You can find button prefabs available in MRTK by navigating to MixedRealityToolkit.SDK > Features > UX > Interactable > Prefabs.

In this project, you will learn how to change the color of a cube when a button is pressed.

First, select the button of your choice from the Project window and drag into the Hierarchy window.

Change the button's Transform Position so that it's positioned in front of the camera to x = 0, y = 0, and z = 0.5

Next, right click on an empty spot in the Hierarchy window and click 3D Object > Cube.

With the Cube object still selected, in the Inspector window, change the Transform Position so that the cube is located near but not overlapping the button. In addition, resize the cube by changing the Transform Scale.

In the Hierarchy window, select the button. In the Inspector window, navigate to the Interactable (Script) component.

In the Events section, expand the Receivers section.

Click the Add Event button to create a new event receiver of Event Receiver Type InteractableOnPressReceiver.

For the newly created InteractableOnPressReceiver event, change the Interaction Filter to Near and Far.

From the Hierarchy window, click and drag the Cube GameObject into the Event Properties object field for the On Press() event to assign the Cube as a receiver of the On Press () event.

Next, click the action dropdown (currently assigned No Function) and select MeshRenderer > Material material. This action will set the Cube's material property to change when the button is pressed.

Now, assign a color for the Cube to change to when the button is pressed. Click the small circle icon next to the Material field (currently assigned None (Material)) to open the Select Material window.

MRTK provides a variety of materials that can be used in your projects. In the search bar, search for MRTK_Standard and select your color of choice.

Now that the event is configured for when the button is pressed, you now need to configure an event that occurs when the button is released. For the On Release () event, click and drag the Cube GameObject into the Event Properties.

Next, click the action dropdown (currently assigned No Function) and select MeshRenderer > Material material. This action will set the Cube's material property to change when the button is released.

Now, assign a color for the Cube to change to when the button is released. Click the small circle icon next to the Material field (currently assigned None (Material)) to open the Select Material window and search for MRTK_Standard. Select your choice of color.

Now that both the On Press () and On Release () events are configured for the button, press Play to enter Game mode and test the button in the in-editor simulator.

To press the button, press the space bar + mouse scroll forward.

To release the button, press the space bar + mouse scroll backward.
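Events can also be wired up from code. As a simpler alternative to the press and release receivers configured above, the sketch below assumes MRTK's Interactable component and its OnClick event; the two public fields are hypothetical wiring you would assign in the Inspector.

```csharp
using Microsoft.MixedReality.Toolkit.UI;
using UnityEngine;

// Changes the cube's material color whenever the MRTK button is clicked.
public class ButtonColorChanger : MonoBehaviour
{
    public Interactable button;   // the MRTK button prefab (assign in Inspector)
    public Renderer cubeRenderer; // the cube's MeshRenderer (assign in Inspector)

    void Start()
    {
        button.OnClick.AddListener(() =>
            cubeRenderer.material.color = Random.ColorHSV());
    }
}
```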

How to make your buttons follow your hand?


MRTK uses what are known as Solvers to allow UI elements to follow the user or other game objects in the scene. The Radial View solver is a tag-along component that keeps a particular portion of a GameObject within the user's view.

You can make a button follow your hand by adding the Radial View (Script) component to the object.

First, drag a button prefab from MixedRealityToolkit.SDK > Features > UX > Interactable > Prefabs to the Hierarchy window.

In the Hierarchy window, select the button prefab. In the Inspector window, click Add Component. Search for Radial View. Once found, select to add the component to the button.

When you add the Radial View (Script) component to the button, the Solver Handler (Script) component is added as well because it is required by the Radial View (Script).

The Solver Handler (Script) component needs to be configured so that the button follows the user's hand. First, change Tracked Target Type to Hand Joint. This will enable you to define which hand joint the button follows.

Next, for the Solver Handler (Script) component, change Tracked Handness to Right. This setting determines which hand is tracked.

There are over 20 hand joints available for tracking. Still inside the Solver Handler (Script) component, change Tracked Hand Joint to Wrist so that the button tracks the user's wrist.

Now that the hand tracking is configured, you need to configure the Radial View (Script) component to further define where the button is located and how it is viewed in relation to the user. First, change Reference Direction to Facing World Up. This parameter determines which direction the button faces.

Next, in the Radial View (Script) component, change the Min Distance and Max Distance to 0. The Min and Max Distance parameters determine how far the button should be kept from the user. As a reminder, the unit of measurement in Unity is meters. Therefore, a Min Distance of 1 would push the button away to ensure it is never closer than 1 meter to the user.

Now that the button is configured to follow your right wrist, press Play to enter Game mode and test the solver in the in-editor simulator. Press and hold the space bar to bring up the hand. Move the mouse cursor around to move the hand, and click and hold the left mouse button to rotate the hand:
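The same solver setup can be scripted. The sketch below assumes the MRTK 2.x SolverHandler and RadialView property names (note that MRTK spells the property TrackedHandness); verify them against your MRTK version.

```csharp
using Microsoft.MixedReality.Toolkit.Utilities;
using Microsoft.MixedReality.Toolkit.Utilities.Solvers;
using UnityEngine;

// Makes this object (e.g. a button) follow the user's right wrist.
public class FollowWrist : MonoBehaviour
{
    void Start()
    {
        var handler = gameObject.AddComponent<SolverHandler>();
        handler.TrackedTargetType = TrackedObjectType.HandJoint;
        handler.TrackedHandness = Handedness.Right;
        handler.TrackedHandJoint = TrackedHandJoint.Wrist;

        // RadialView keeps the object within view of the tracked target.
        var radial = gameObject.AddComponent<RadialView>();
        radial.MinDistance = 0f;
        radial.MaxDistance = 0f;
        radial.ReferenceDirection = RadialViewReferenceDirection.FacingWorldUp;
    }
}
```

Adding RadialView in the editor adds SolverHandler automatically; from code, adding SolverHandler first keeps the configuration explicit.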

How to add audio feedback?


You can configure an object to play a sound when the user touches an object by adding a trigger touch event to the object.

To be able to trigger touch events, the object must have the following components:

  • Collider component, preferably a Box Collider

  • Near Interaction Touchable (Script) component

  • Hand Interaction Touch (Script) component

To add audio feedback, first add an Audio Source component to the object. The audio source component enables you to play audio back in the scene. In the Hierarchy window, select the object and click Add Component in the Inspector window. Search for Audio Source to add the Audio Source component.

Once the Audio Source component has been added to the object, in the Inspector window, change the Spatial Blend property to 1 to enable spatial audio.

Next, with the object still selected, click Add Component and search for the Near Interaction Touchable (Script). Once found, select the component to add to the object. Near interactions come in the form of touches and grabs - which is an interaction that occurs when the user is within close proximity to an object and uses hand interaction.

After the Near Interaction Touchable (Script) is added to the object, click the Fix Bounds and Fix Center buttons. This will update the Local Center and Bounds properties of the Near Interaction Touchable (Script) to match the BoxCollider.

With the object still selected, click Add Component and search for the Hand Interaction Touch (Script). Once found, select the component to add to the object.

To make audio play when the object is touched, you will need to add an On Touch Started event to the Hand Interaction Touch (Script) component. In the Inspector window, navigate to the Hand Interaction Touch (Script) component and click the small + icon to create a new On Touch Started () event.

Drag the object to receive the event and define AudioSource.PlayOneShot as the action to be triggered. PlayOneShot will play the audio clip.

Next, assign an audio clip to the trigger. You can find audio clips provided by MRTK by navigating to Assets > MixedRealityToolkit.SDK > StandardAssets > Audio. Once you've found a suitable audio clip, assign the audio clip to the Audio Clip field.

You can now test the touch interaction using the in-editor simulation. Press the Play button to enter Game mode. Once in Game mode, hold the spacebar to bring up the hand and use the mouse to touch the object and trigger the sound effect.
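Instead of wiring the On Touch Started event in the Inspector, you can implement the touch handler in code. The sketch below assumes MRTK's IMixedRealityTouchHandler interface; the object still needs a collider, a Near Interaction Touchable component, and an Audio Source, and the audio clip field is hypothetical wiring you would assign in the Inspector.

```csharp
using Microsoft.MixedReality.Toolkit.Input;
using UnityEngine;

// Plays a one-shot sound when the object is touched.
[RequireComponent(typeof(AudioSource))]
public class TouchSound : MonoBehaviour, IMixedRealityTouchHandler
{
    public AudioClip touchClip; // assign an MRTK audio clip in the Inspector
    private AudioSource source;

    void Start()
    {
        source = GetComponent<AudioSource>();
        source.spatialBlend = 1f; // fully spatialized audio
    }

    public void OnTouchStarted(HandTrackingInputEventData eventData)
    {
        source.PlayOneShot(touchClip);
    }

    public void OnTouchUpdated(HandTrackingInputEventData eventData) { }
    public void OnTouchCompleted(HandTrackingInputEventData eventData) { }
}
```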

How to style Bounding Box?

Bounding boxes make it easier and more intuitive to manipulate objects with one hand for both near and far interaction by providing handles that can be used for scaling and rotating. A bounding box will show a cube around the hologram to indicate that it can be interacted with. The bounding box also reacts to user input.

You can add a bounding box to an object by adding the BoundingBox.cs script as a component of the object.

To add the Bounding Box (Script) component to an object, first select the object in the Hierarchy window. In the Inspector window, click Add Component and search for Bounding Box.

Select the Bounding Box script to apply the component to the object. The bounding box is only visible in Game mode. Press play to view the bounding box. By default, the HoloLens 1st gen style is used.

To reflect the MRTK bounding box style, you need to change the parameters inside the Handles section of the Bounding Box (Script) component.

Change Handle Color

You can change the color of the handles by assigning a material to the Handle Material property.

In the Handles section, click the circle icon to open the Select Material window.

In the Select Material window, search for BoundingBoxHandleWhite. Once found, select to assign the color to the handle material.

When you press play, the handle colors for the bounding box will be white.

Change Handle Color When Object is Grabbed

You can change the color of the handles when an object is grabbed by assigning a material to the Handle Grabbed Material property.

In the Handles section, click the circle icon to open the Select Material window.

In the Select Material window, search for BoundingBoxHandleBlueGrabbed. Once found, select to assign the color to the handle material.

When you press play, grab one of the handles of the bounding box. The color of the handle will change to blue.

Change Scale Handles

You can change the scale handles in corners by assigning a scale handle prefab in the Scale Handle Prefab and Scale Handle Slate Prefab (for 2D slate) parameters.

First, assign a prefab to the Scale Handle Prefab. In the Handles section, click the circle icon to open the Select GameObject window.

In the Select GameObject window, switch to the Assets tab and search for MRTK_BoundingBox_ScaleHandle. Once found, select to assign the prefab to the scale handle.

Next, assign a prefab to the Scale Handle Slate Prefab. In the Handles section, click the circle icon to open the Select GameObject window.

In the Select GameObject window, switch to the Assets tab and search for MRTK_BoundingBox_ScaleHandle_Slate. Once found, select to assign the prefab to the scale handle.

When you press play, grab one of the handles of the bounding box to see the change in how the scale handles look.

Change Rotation Handles

You can change the rotation handles by assigning a rotation handle prefab in the Rotation Handle Prefab parameter.

In the Handles section, click the circle icon to open the Select GameObject window.

In the Select GameObject window, switch to the Assets tab and search for MRTK_BoundingBox_RotateHandle. Once found, select to assign the prefab to the rotation handle.

When you press play, grab one of the handles of the bounding box to see the change in how the rotation handles look.

04 - Eye and Head Gaze

Eye and Head Gaze Tracking.

Concepts and Discussion

What's the difference between eye and head gaze?

What is 6 Degrees of Freedom?

What are good use cases for eye or head tracking?

What are the security concerns with using eye-tracking?

How to get permission to use eye-tracking?

How to setup eye-tracking?

How to simulate eye-tracking in the Unity editor?

How to enable eye calibration?

How to use eye-tracking for selection of an object?

How to use eye-tracking for infinite scroll?

How to visualize eye tracking data?

How to setup head tracking?

What could go wrong?

Further Reading

Resources

Code Samples:

https://aka.ms/MixedRealityUnitySamples

Why is spatial data important?

With spatial data you can discover growth insights, manage facilities and networks, and provide location information to customers. Without considering spatial components and how they relate to your business, your risks and possibility of poor results will increase.

Spatial analysis allows you to solve complex location-oriented problems and better understand where and what is occurring in your world. It goes beyond mere mapping to let you study the characteristics of places and the relationships between them. Spatial analysis lends new perspectives to your decision-making.


Concepts

Why is spatial data important?
What are some good spatial visualizations for Mixed Reality?
What is Bing Maps SDK?

05 - Map Visualization

Spatial Visualization using Bing Map using HoloLens 2 and Windows Mixed Reality Headsets.

Overview

This project is for HoloLens 2 and Windows Mixed Reality Headsets.

What will we build?

We will build the app shown in the video below for HoloLens 2. You can also build it for a Windows Mixed Reality Headset and use hand controllers instead of hand gestures.

Shortlink: aka.ms/UnityBingMapsVisualizationLesson

In this project, we will create a 3D map visualization using the Bing Maps Unity SDK: aka.ms/BingMapsUnitySDK

Outings, a sample app created with the Bing Maps SDK, can be found on the Microsoft Store for PC and HoloLens 1: aka.ms/OutingsHoloLens1

What could go wrong?

What is AR/VR/MR/XR?
Will mixed reality replace our phones and personal computers?

What are some good spatial visualizations for Mixed Reality?

A good visualization allows users to understand data better by seeing the data points in the right context. Check out the examples below to see what a visualization provides that would be hard to grasp just by looking at the raw data points.

Small arms and ammunition import and export interactive visualization: https://armsglobe.chromeexperiments.com/

Compare the Covid-19 data tab and map tab to see the difference it makes in your perception: https://ncov2019.live/

Wind and weather visualizations: https://www.windy.com/

Chrome experiments with Globe: https://experiments.withgoogle.com/chrome/globe
LogoIntroducing Microsoft Mesh | Here can be anywhere.
Microsoft Mesh

Resources

Mixed Reality Unity Lessons link.
3rd Wave of Computing
Creating a new Unity Scene.
New scene camera.
Unity3D Editor Interface
Going back to default editor layout.
Unity Editor Toolbar.
Hand Tool Keyboard Shortcut: Q
Move Tool Keyboard Shortcut: W
Rotate Tool Keyboard Shortcut: E
Scale Tool Keyboard Shortcut: R
Rect Transform Tool Keyboard Shortcut: T
Rotate, Move or Scale Tool Keyboard Shortcut: Y
Focusing on an GameObject Keyboard Shortcut: F
Check your Windows Version in your System Settings under About.
Download Unity Hub instead of the Unity Editor.
Unity Hub Editor Installs
Unity Hub Projects Page
Check Universal Windows Platform Build Support and Windows Build Support modules for Unity Editor.
Unity Android Build Support Modules.
SampleScene with a light and camera.
Creating a new scene.
Mixed Reality Experiences
Mixed Reality Spectrum
UI building blocks
Mixed Reality Toolkit Examples
MRTK Examples Hub
Mixed Reality Toolkit Documentation.
Windows menu.
Update & Security Settings.
For Developers Settings.
Download Seed project from github.
Download MRTK Releases.
Build settings.
Switch platform to Universal Windows Platform.
Project settings.
Player XR settings.
XR settings Mixed Reality Supported checkbox.
Optimization settings for XR.
Spatial Perception enabled.
Build your project into a new folder by clicking build button.
MRTK Examples.
Project Configurator Settings.
Optimization header, uncheck Strip Engine Code.
MixedReality Toolkit in Hierarchy panel.
Copy and Customize to Clone the MRTK Profile.
Clone camera profile.
Camera Settings Providers
New camera settings expanded view.
Unity AR Camera Settings.
Mixed Reality Developer Tools.
Windows Mixed Reality Documentation.
Curriculum Youtube Playlist.
HoloLens Seed Project Repository.
Code Sample Repository.
Tools to install link.
WebXR Emulator Extension.
Example of triangle mesh.
Windows Device Portal
Unity Hand Interactions link.
Hand interactions on HoloLens 2
Windows menu.
Update & Security Settings.
For Developers Settings.
Device Portal Toggle.
Network and Internet Settings.
Wifi Advanced Options.
Your connection is not private alert.
Device Portal.
Live Preview on Windows Device Portal.
MRTK hand interactions example scene.
Importing TMP Essentials
Gesture Profile Settings
Grid layout of boxes.
Empty CubeCollection object.
Position attributes of the CubeCollection.
Cube transforms.
Duplicated cubes collection.
Add Grid Object Collection Script component
Change sort type.
Update Collection.
Change number of rows.
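The captions above walk through arranging duplicated cubes under an empty CubeCollection parent with MRTK's Grid Object Collection component. The same setup can also be done from script; the following is a minimal sketch assuming MRTK 2.x (property names such as `Rows` and `CellWidth` may differ in other versions, and the Inspector route shown in the captions works just as well):

```csharp
using Microsoft.MixedReality.Toolkit.Utilities;
using UnityEngine;

// Minimal sketch: attach to the empty CubeCollection parent object.
// Assumes MRTK 2.x is imported into the project.
public class CubeGridSetup : MonoBehaviour
{
    private void Start()
    {
        // Add the Grid Object Collection component to the parent of the cubes.
        var grid = gameObject.AddComponent<GridObjectCollection>();

        // Sort children by their order in the Hierarchy panel.
        grid.SortType = CollationOrder.ChildOrder;

        // Change the number of rows and the cell spacing,
        // then rebuild the layout (same as pressing Update Collection).
        grid.Rows = 3;
        grid.CellWidth = 0.25f;
        grid.CellHeight = 0.25f;
        grid.UpdateCollection();
    }
}
```

Calling `UpdateCollection()` after changing properties mirrors the Update Collection button in the Inspector; the layout does not refresh on its own.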
Manipulation Handler Script
Manipulation Handler parameters
Manipulation Type
Rotate and Scale
Rotation Constraint
Rotate interaction
Add Manipulation Handler Script component
Manipulation Handler Parameters
Manipulation type
Add object and action
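The steps above add a Manipulation Handler and set its parameters in the Inspector. A scripted equivalent, as a rough sketch assuming MRTK 2.x (where near grabs also require a Collider plus a NearInteractionGrabbable component), might look like:

```csharp
using Microsoft.MixedReality.Toolkit.Input;
using Microsoft.MixedReality.Toolkit.UI;
using UnityEngine;

// Minimal sketch: attach to the object that should be grabbable.
// Assumes MRTK 2.x; API names may differ in other versions.
public class GrabbableSetup : MonoBehaviour
{
    private void Start()
    {
        // A collider and NearInteractionGrabbable are required for near (grab) input.
        gameObject.AddComponent<BoxCollider>();
        gameObject.AddComponent<NearInteractionGrabbable>();

        var handler = gameObject.AddComponent<ManipulationHandler>();

        // Manipulation Type: allow both one- and two-handed manipulation.
        handler.ManipulationType = ManipulationHandler.HandMovementType.OneAndTwoHanded;

        // Two hands can move, rotate and scale the object.
        handler.TwoHandedManipulationType =
            ManipulationHandler.TwoHandedManipulation.MoveRotateScale;

        // "Add object and action": hook a listener to a manipulation event.
        handler.OnManipulationStarted.AddListener(
            eventData => Debug.Log("Manipulation started"));
    }
}
```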
Bing Maps SDK Visualization
Bing Maps Visualization Example.
Outings Immersive App.
QR code for Introduction to Mixed Reality
Curriculum Link QR Code
Virtual Reality Headset

Code Samples:

Microsoft Mesh hands-on demo (docsmsft)
Mesh Demo video
Getting started with Unity development.
How do I decide if I need to develop for Virtual Reality or Augmented Reality?
Changing preferences in Unity3D
Deploying your HoloLens 2 application (docsmsft)
https://aka.ms/MixedRealityUnitySamples

What is Bing Maps SDK?

Maps SDK, a Microsoft Garage project, provides a control to visualize a 3D map in Unity. The map control handles streaming and rendering of 3D terrain data with world-wide coverage, and select cities are rendered at a very high level of detail. Data is provided by Bing Maps.

The map control has been optimized for mixed reality applications and devices, including the HoloLens, HoloLens 2, Windows Mixed Reality immersive headsets, HTC Vive, and Oculus Rift. Soon the SDK will also be provided as an extension to the Mixed Reality Toolkit (MRTK).

Bing Maps SDK
Mixed Reality Toolkit (MRTK).
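As a rough illustration of the control described above (assuming the Maps SDK package is imported; `MapRenderer` with its `Center` and `ZoomLevel` properties is the SDK's main entry point, but verify the names against the SDK documentation, and note a MapSession with a Bing Maps developer key is also needed for map data to stream):

```csharp
using Microsoft.Geospatial;
using Microsoft.Maps.Unity;
using UnityEngine;

// Minimal sketch: adds a map to the scene and centers it on Seattle.
// Assumes the Bing Maps SDK (Maps SDK, a Microsoft Garage project)
// is imported and a MapSession with a valid developer key exists.
public class MapSetup : MonoBehaviour
{
    private void Start()
    {
        var map = gameObject.AddComponent<MapRenderer>();

        // Center the map and zoom in far enough to load
        // the high-detail data available for select cities.
        map.Center = new LatLon(47.6062, -122.3321); // Seattle
        map.ZoomLevel = 15f;
    }
}
```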
Setting up your HoloLens 2 development environment (docsmsft)
HoloLens 2 Emulator Overview (docsmsft)
HoloLens Emulator Overview
How to check or enable "Hyper-V" on your PC
How to install or update HoloLens Emulator?
HoloLens 2 Bing Maps project end product.