Mixed Reality Docs


Microsoft Mesh

What is the Metaverse?

As of now, we do not know exactly what shape the Metaverse will take. That does not matter, either. What matters is that someday, a global network of spatially organized, predominantly 3D content will be available to all without restriction, for use in all human endeavors — a new and profoundly transformational medium, enabled by major innovations in hardware, human-computer interface, network infrastructure, creator tools and digital economies.

Tony Parisi

Please read the article The Seven Rules of the Metaverse to learn the basic concepts and language.

Unity Lessons

Developing for Mixed Reality using Unity3D

Short link: aka.ms/MixedRealityUnityLessons

Mixed Reality Unity Lessons link.
  • Lesson 1: Introduction to Mixed Reality Applications and Development.

  • Lesson 2: Introduction to Mixed Reality Developer Tools and 3D Concepts.

  • Lesson 3: Working with Hand Interactions and Controllers.

  • Lesson 4: Eye and Head Gaze Tracking.

  • Lesson 5: Spatial Visualization using Bing Maps.

  • Lesson 6: Working with REST APIs.

  • Lesson 7: Azure Spatial Anchors and Backend Services.

  • Lesson 8: Displaying Spatial Anchors on a map.

  • Lesson 9: Working with QR codes.

  • Lesson 10: Working with Scene Understanding.

  • Lesson 11: Getting Started with AI.

  • Lesson 12: Project Discussion and Case Studies.

01 - Introduction to Mixed Reality

Introduction to Mixed Reality Applications and Development

Short link: aka.ms/UnityIntroToMixedReality

Overview

In this lesson, you will learn about the basic concepts of Mixed Reality and explore the applications of Mixed Reality in different industries.

Mixed Reality Curriculum

Learn Mixed Reality development using Azure Mixed Reality Services

  • Mixed Reality Curriculum: aka.ms/MixedRealityCurriculum

  • WebXR Lessons: www.learnwebxr.dev

  • Unity Lessons: aka.ms/MixedRealityUnityLessons

How to use MRTK Visual Profiler?

How to add visual feedback?

How to create 3D models with splines?

How to upload 3D models to your project?

How to use simplified joint data access?

Why is hand interaction important?

Hand interaction is a very natural way to interact with 3D models. Since we interact with and modify real objects using our hands, a new user can start using your application without having to learn its interface first.

How to make an object respond to input events?

What could go wrong?

Common issues working with developer tools and 3D objects

How to start debugging performance issues?

How to enable eye calibration?

How to log for debugging purposes?

How to add the MRTK (Mixed Reality Toolkit) Diagnostic System to your project?

How to create 3D models using Autodesk 3dsMax?

How to setup eye-tracking?

How to add hand interactions to an object?

Where to find pre-made 3D models?

How to create polygon models?

How to use eye-tracking to select an object?

How to create your own models using Maquette?

Using eye movement without a delay to select an object.

How to add Manipulation Handler to your object?

Having a small bounding box.

How to setup head tracking?

How to monitor performance of your app?

How to visualize eye tracking data?

What could go wrong?

How to organize your buttons into a grid view?

Working with 3D Objects

In the Project section, you will set up your first Mixed Reality project using Unity and the Mixed Reality Toolkit.

You can jump directly into setting up your first project on the How to get started with mixed reality development using Unity3D section.

  • Concepts

  • Project

  • What could go wrong?

aka.ms/UnityIntroToMixedReality
QR code for Introduction to Mixed Reality
  • AI Lessons: www.learnaiml.dev

  • Unreal Lessons: aka.ms/MixedRealityUnrealLessons

    How to use this book?

    This book is designed as a collection of classes that starts from basic concepts and builds a project over time. Each lesson can also be used as an individual workshop. Each class follows this structure:

    • Core concepts and discussion points.

    • Project step-by-step walk-through.

    • What could go wrong: a section discussing common mistakes and issues.

    • Further reading resources.

    Each class has questions as sections and builds the corresponding part of the project. If you feel you can correctly answer a question, feel free to move on to the next question or the next class.

    If you have any questions, suggestions, or improvements, please submit an issue here: https://github.com/Yonet/AzureMixedRealityDocs/issues.

    We welcome your contributions. If you would like to contribute, check out the contributing section to learn how.

    We hope you enjoy developing your mixed reality application!

    Unity3D Lessons

    • Lesson 1: Introduction to Mixed Reality Applications and Development.

    • Lesson 2: Introduction to Mixed Reality Developer Tools and 3D Concepts.

    • Lesson 3: Working with Hand Interactions.

    • Lesson 4: Eye and Head Gaze Tracking.

    • Lesson 5: Spatial Visualization using Bing Maps.

    • Lesson 6: Working with REST APIs.

    • Lesson 7: Azure Spatial Anchors and Backend Services.

    • Lesson 8: Displaying Spatial Anchors on a map.

    • Lesson 9: Working with QR codes.

    • Lesson 10: Working with Spatial Awareness and Scene Understanding.

    • Lesson 11: Getting Started with AI.

    • Lesson 12: Project Discussion and Case Studies.

    Links

    Short link: aka.ms/MixedRealityCurriculum

    Curriculum Link QR Code

    Mixed Reality Curriculum Playlist: https://aka.ms/MixedRealityCurriculumVideos.

    Code Samples: https://aka.ms/MixedRealityUnitySamples.

    GitHub: https://github.com/Yonet/AzureMixedRealityDocs

    Slack Channel: https://holodevelopers.slack.com/archives/G012X50UVML

    aka.ms/MixedRealityCurriculum
    www.learnwebxr.dev
    aka.ms/MixedRealityUnityLessons

    Concepts

    In this lesson, you will learn about the basic concepts of Mixed Reality and explore the applications of Mixed Reality in different industries.

    Read through the questions below. If you feel comfortable with the answers, feel free to skip to the next section or chapter.

    Why is Mixed Reality important?

    The first revolution in computing happened with the creation of mainframe computers: computers that, at times, occupied a whole room. Mainframes were used by large organizations such as NASA for critical data-processing applications.

    The second wave of computing is defined by the Personal Computer (PC) becoming widely available.

    We believe the third wave of computing will involve many devices that manage data, including IoT sensors and Mixed Reality devices.

    We have more data than ever before. To process that data and make informed decisions, we need access to it at the right time and in the right place. Mixed Reality can bring that data into our context: the real world.

    How to change preferences in Unity?

    • Go to Edit > Preferences.

    • Change the color scheme under General, if it is available.

    • You can change the default editor by selecting External Tools > External Script Editor. The drop-down lists the editors currently available on your computer.

    Will mixed reality replace our phones and Personal Computers?


    How to deploy your app to a HoloLens?

    How do I decide if I need to develop for Virtual Reality or Augmented Reality?

    What is HoloLens Emulator?

    The HoloLens Emulator lets you test holographic applications on your PC without a physical HoloLens. It also includes the HoloLens development toolset.

    How to Get Started with Mixed Reality Development Using Unity?

    Unity Introduction.

    How to create a new scene?

    • On the Project panel, right-click and select Create > Scene.

    What are some use cases for Mixed Reality applications?

    How to open MRTK example scenes?

    • On your Project panel select Assets > MixedRealityToolkit.Examples > Demos.

    • Open the folder for the feature you want to see an example of, e.g. HandTracking or EyeTracking.

    Resources

    • Code Samples: https://aka.ms/MixedRealityUnitySamples

    Concepts

    How to set-up HoloLens 2 development environment?

    What is the difference between Augmented Reality, Virtual Reality and Mixed Reality?
  • Why is Mixed Reality important?

  • Will mixed reality replace our phones and PCs?

  • How do I decide if I need to develop for Virtual Reality or Augmented Reality?

  • What are some use cases for Mixed Reality applications?

  • What are some examples of Mixed Reality Applications?

  • What is the Mixed Reality Toolkit (MRTK)?

    What is Mixed Reality?
    6 Use Cases for Enterprise Mixed Reality
    5 Use Cases Of Augmented Reality That Boosted Businesses’ Sales
    https://aka.ms/MixedRealityUnitySamples
    MRTK input system.
    MRTK input events.

    Concepts

    • What is Debugging?

    • What makes a 3D model? What are Polygons, Splines, Vertices, Meshes and Materials?

    • How to choose performant 3D models?

    How to style Bounding Box?

    Bounding boxes make it easier and more intuitive to manipulate objects with one hand for both near and far interaction by providing handles that can be used for scaling and rotating. A bounding box will show a cube around the hologram to indicate that it can be interacted with. The bounding box also reacts to user input.

    You can add a bounding box to an object by adding the BoundingBox.cs script as a component of the object.

    To add the Bounding Box (Script) component to an object, first select the object in the Hierarchy window. In the Inspector window, click Add Component and search for Bounding Box.

    Select the Bounding Box script to apply the component to the object. The bounding box is only visible in Game mode. Press play to view the bounding box. By default, the HoloLens 1st gen style is used.

    To reflect the MRTK bounding box style, you need to change the parameters inside the Handles section of the Bounding Box (Script) component.
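The same setup can also be done from code. Below is a minimal sketch, assuming MRTK v2 is imported into the project; verify the component names against your MRTK version.

```csharp
using Microsoft.MixedReality.Toolkit.Input;
using Microsoft.MixedReality.Toolkit.UI;
using UnityEngine;

// Attach to any GameObject with a Collider to give it an MRTK bounding box.
public class AddBoundingBox : MonoBehaviour
{
    private void Start()
    {
        // BoundingBox draws the cube outline and the scale/rotate handles.
        gameObject.AddComponent<BoundingBox>();

        // NearInteractionGrabbable lets articulated hands grab the handles directly.
        gameObject.AddComponent<NearInteractionGrabbable>();
    }
}
```

Adding the component from code uses the same default (HoloLens 1st gen) handle style; the Handles parameters described below still apply.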

    Change Handle Color

    You can change the color of the handles by assigning a material to the Handle Material property.

    In the Handles section, click the circle icon to open the Select Material window.

    In the Select Material window, search for BoundingBoxHandleWhite. Once found, select to assign the color to the handle material.

    When you press play, the handle colors for the bounding box will be white.

    Change Handle Color When Object is Grabbed

    You can change the color of the handles when an object is grabbed by assigning a material to the Handle Grabbed Material property.

    In the Handles section, click the circle icon to open the Select Material window.

    In the Select Material window, search for BoundingBoxHandleBlueGrabbed. Once found, select to assign the color to the handle material.

    When you press play, grab one of the handles of the bounding box. The color of the handle will change to blue.

    Change Scale Handles

    You can change the scale handles in corners by assigning a scale handle prefab in the Scale Handle Prefab and Scale Handle Slate Prefab (for 2D slate) parameters.

    First, assign a prefab to the Scale Handle Prefab. In the Handles section, click the circle icon to open the Select GameObject window.

    In the Select GameObject window, switch to the Assets tab and search for MRTK_BoundingBox_ScaleHandle. Once found, select to assign the prefab to the scale handle.

    Next, assign a prefab to the Scale Handle Slate Prefab. In the Handles section, click the circle icon to open the Select GameObject window.

    In the Select GameObject window, switch to the Assets tab and search for MRTK_BoundingBox_ScaleHandle_Slate. Once found, select to assign the prefab to the scale handle.

    When you press play, grab one of the handles of the bounding box to see how the scale handles look.

    Change Rotation Handles

    You can change the rotation handles by assigning a rotation handle prefab in the Rotation Handle Prefab parameter.

    In the Handles section, click the circle icon to open the Select GameObject window.

    In the Select GameObject window, switch to the Assets tab and search for MRTK_BoundingBox_RotateHandle. Once found, select to assign the prefab to the rotation handle.

    When you press play, grab one of the handles of the bounding box to see how the rotation handles look.

    How to add audio feedback?

    You can configure an object to play a sound when the user touches an object by adding a trigger touch event to the object.

    To be able to trigger touch events, the object must have the following components:

    • Collider component, preferably a Box Collider

    • Near Interaction Touchable (Script) component

    • Hand Interaction Touch (Script) component

    To add audio feedback, first add an Audio Source component to the object. The audio source component enables you to play audio back in the scene. In the Hierarchy window, select the object and click Add Component in the Inspector window. Search for Audio Source to add the Audio Source component.

    Once the Audio Source component has been added to the object, in the Inspector window, change the Spatial Blend property to 1 to enable spatial audio.

    Next, with the object still selected, click Add Component and search for the Near Interaction Touchable (Script). Once found, select the component to add it to the object. Near interactions come in the form of touches and grabs: interactions that occur when the user is within close proximity to an object and uses hand interaction.

    After the Near Interaction Touchable (Script) is added to the object, click the Fix Bounds and Fix Center buttons. This will update the Local Center and Bounds properties of the Near Interaction Touchable (Script) to match the BoxCollider.

    With the object still selected, click Add Component and search for the Hand Interaction Touch (Script). Once found, select the component to add to the object.

    To make audio play when the object is touched, you will need to add an On Touch Started event to the Hand Interaction Touch (Script) component. In the Inspector window, navigate to the Hand Interaction Touch (Script) component and click the small + icon to create a new On Touch Started () event.

    Drag the object to receive the event and define AudioSource.PlayOneShot as the action to be triggered. PlayOneShot will play the audio clip.

    Next, assign an audio clip to the trigger. You can find audio clips provided by MRTK by navigating to Assets > MixedRealityToolkit.SDK > StandardAssets > Audio. Once you've found a suitable audio clip, assign the audio clip to the Audio Clip field.

    You can now test the touch interaction using the in-editor simulation. Press the Play button to enter Game mode. Once in Game mode, hold the spacebar to bring up the hand and use the mouse to touch the object and trigger the sound effect.
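If you prefer wiring this up in code, the touch event can be handled with MRTK's IMixedRealityTouchHandler interface. The sketch below is an illustrative alternative to the Hand Interaction Touch (Script) used above, assuming MRTK v2; the Collider, NearInteractionTouchable, and AudioSource setup from the walkthrough is still required.

```csharp
using Microsoft.MixedReality.Toolkit.Input;
using UnityEngine;

// Plays a one-shot sound when a hand touches this object.
// Requires a Collider, a NearInteractionTouchable, and an AudioSource
// (with Spatial Blend set to 1) on the same GameObject.
[RequireComponent(typeof(AudioSource))]
public class TouchAudioFeedback : MonoBehaviour, IMixedRealityTouchHandler
{
    public AudioClip touchSound;   // assign an MRTK audio clip in the Inspector
    private AudioSource audioSource;

    private void Awake() => audioSource = GetComponent<AudioSource>();

    public void OnTouchStarted(HandTrackingInputEventData eventData)
    {
        if (touchSound != null)
        {
            // PlayOneShot plays the clip without interrupting other audio.
            audioSource.PlayOneShot(touchSound);
        }
    }

    public void OnTouchUpdated(HandTrackingInputEventData eventData) { }
    public void OnTouchCompleted(HandTrackingInputEventData eventData) { }
}
```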


    04 - Eye and Head Gaze

    Eye and Head Gaze Tracking.

    • Concepts and Discussion

    • What's the difference between eye and head gaze?

    • What is 6 Degrees of Freedom?

    • What are good use cases for eye or head tracking?

    • What are the security concerns with using eye-tracking?

    • How to get permission to use eye-tracking?

    • How to setup eye-tracking?

    • How to simulate eye-tracking in the Unity editor?

    • How to enable eye calibration?

    • How to use eye-tracking for selection of an object?

    • How to use eye-tracking for infinite scroll?

    • How to visualize eye tracking data?

    • How to setup head tracking?

    • What could go wrong?

    • Further Reading

    How to simulate eye-tracking in the Unity editor?

    What could go wrong?

    Mixing scaling and moving.

    How to use eye-tracking for infinite scroll?

    Resources

    • Code Samples: https://aka.ms/MixedRealityUnitySamples

    Why is spatial data important?

    With spatial data you can discover growth insights, manage facilities and networks, and provide location information to customers. Without considering spatial components and how they relate to your business, you increase the risk of poor results.

    Spatial analysis allows you to solve complex location-oriented problems and better understand where and what is occurring in your world. It goes beyond mere mapping to let you study the characteristics of places and the relationships between them. Spatial analysis lends new perspectives to your decision-making.

    How to get permission to use eye-tracking?

    Using eye gaze to influence the user.

    Project

    Design & Prototyping: Enables real-time collaborative iteration of 3D physical and virtual models across cross-functional teams and stakeholders.

    Training & Development: Provides instructors with better tools to facilitate teaching/coaching sessions. It offers trainees an enhanced and engaging learning experience through 3D visualizations and interactivity.

    Geospatial Planning: Enables the assessment and planning of indoor and outdoor environments (e.g. future construction sites, new store locations, interior designs) and removes the need for manual execution.

    Sales Assistance: Improves the effectiveness of individuals in sales-oriented roles by providing tools such as 3D catalogs and virtual product experiences that increase customer engagement and strengthen buyer confidence.

    Field Service: Improves the first-visit resolution and customer satisfaction of customer support issues. It is typically used for complex products that would otherwise require a field visit. It can serve as a platform for targeted up-sell opportunities, as well.

    Productivity & Collaboration: Transforms the space around you into a shared augmented workplace. Remote users can collaborate, search, brainstorm, and share content as if they were in the same room.

    3rd Wave of Computing
    Name your scene and drag it under the Scenes folder for organization purposes.
    Creating a new Unity Scene.

    Every new Scene comes with a light and a camera. We will modify the camera later for our Mixed Reality project.

    New scene camera.
    Open the Scenes folder, then double-click a scene to open it.
  • You can press play to try out the scene in your editor window.

  • MRTK Examples.
    Resources
    Why is hand interaction important?
    What are gestures?
    Hand interactions on HoloLens 2

    What are some key concepts for working with Unity?

    Let’s review some key concepts, which will help you as you begin to explore editing scripts for mixed reality development.

    Scenes

    In Unity, areas of the game that a player can interact with are generally made up of one or more Scenes. Small games may only use one Scene; large ones could have hundreds.

    Every Unity project you create comes with a SampleScene that has a light and a camera.

    SampleScene with a light and camera.

    You can create a new scene by right-clicking under the Assets tab and selecting Create > Scene. Organizing scenes under a Scenes folder is purely for organizational purposes.

    You can use scenes to organize navigation inside your application or to add different levels to a game.

    GameObjects and components

    Every object in the game world exists as a GameObject in Unity. GameObjects are given specific features by giving them appropriate components which provide a wide range of different functionality.

    When you create a new GameObject, it comes with a Transform component already attached. This component controls the GameObject’s positional properties in the 3D (or 2D) gamespace. You need to add all other components manually in the Inspector.

    Prefabs

    Prefabs are a great way to configure and store GameObjects for re-use in your game. They act as templates, storing the components and properties of a specific GameObject and enabling you to create multiple instances of it within a Scene.

    All copies of the Prefab template in a Scene are linked. This means that if you change the object values for the health potion Prefab, for example, each copy of that Prefab within the Scene will change to match it. However, you can also make specific instances of the GameObject different from the default Prefab settings.
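The concepts above can be sketched in a small script. The healthPotionPrefab field is a hypothetical example asset you would assign in the Inspector.

```csharp
using UnityEngine;

// Demonstrates the concepts above: every object is a GameObject,
// behaviour comes from components, and prefabs are instantiated templates.
public class SceneSetupExample : MonoBehaviour
{
    public GameObject healthPotionPrefab;  // assign a prefab asset in the Inspector

    private void Start()
    {
        // Create a GameObject; it comes with a Transform automatically.
        GameObject cube = GameObject.CreatePrimitive(PrimitiveType.Cube);
        cube.transform.position = new Vector3(0f, 1f, 2f);

        // Add extra functionality through components.
        cube.AddComponent<Rigidbody>();

        // Instantiate two copies of the prefab template.
        Instantiate(healthPotionPrefab, new Vector3(-1f, 1f, 2f), Quaternion.identity);
        Instantiate(healthPotionPrefab, new Vector3(1f, 1f, 2f), Quaternion.identity);
    }
}
```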

    What is Debugging?

    Debugging is the process of finding and resolving defects or problems within a computer program that prevent correct operation of the software.

    Debugging tactics can involve:

    • Interactive debugging.

    • Control flow analysis.

    • Monitoring at the application or system level.
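A simple starting point in Unity is logging: the Debug class writes to the Console window in the editor and to the player log on device. A minimal sketch (the messages are illustrative):

```csharp
using UnityEngine;

// Logs messages at the three severity levels Unity provides.
public class LoggingExample : MonoBehaviour
{
    private void Start()
    {
        Debug.Log("Scene loaded: " + gameObject.scene.name);
        Debug.LogWarning("This model has a high polygon count.");
        Debug.LogError("Failed to load spatial anchor.");
    }
}
```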

    How to enable Developer Mode in HoloLens?

    • Turn on your HoloLens device.

    • Tap your wrist (HoloLens 2) or make a bloom gesture (HoloLens 1) to open the Windows menu.

    Windows menu.
    • Open Settings > Update & Security.

    • Select the For Developers tab on the right-hand panel.

    How to choose performant 3D models for your application?

    Bump Maps

    • Reuse a model instance instead of creating a new model wherever you can.

    How to simulate input interactions in Unity editor?

    Mixed Reality Toolkit (MRTK) supports in-editor input simulation. Simply run your scene by clicking Unity's Play button, then use these keys to simulate input.

    • Press W, A, S, D keys to move the camera.

    • Hold the right mouse button and move the mouse to look around.

    • To bring up the simulated hands, press the Space bar (right hand) or the left Shift key (left hand).

    • To keep the hands in the view, press the T or Y key.

    • To rotate the simulated hands, press Q or E (horizontal) or R or F (vertical).

    What is the difference between Augmented Reality, Virtual Reality and Mixed Reality?

    Augmented Reality (AR) is defined as a technology that superimposes a computer-generated image on a user's view of the real world, thus providing a composite view. Augmented Reality experiences are not limited to visual additions to our world. You can create augmented experiences that add only audio to your physical world, or both audio and visuals.

    Augmented Reality experiences are also not limited to headsets like HoloLens. Today, millions of mobile devices have depth-sensing capabilities to augment your real world with digital information.

    Virtual Reality (VR) is when you are completely immersed in a virtual world by wearing a headset. In Virtual Reality you visually lose connection to the real world. Virtual Reality applications are great for training and for simulations where users benefit from total immersion to replicate a real-life situation. Some examples include training for firefighters, emergency room healthcare providers, and flight simulations.

    What is Mixed Reality?

    Mixed reality is the merging of real and virtual worlds to produce new environments and visualizations, where physical and digital objects co-exist and interact in real time.

    Mixed Reality Experiences

    We think of Mixed Reality as a spectrum from the physical world, to an augmented world, to a fully immersive virtual world, and all the possibilities in between.

    Mixed Reality Spectrum

    What makes a 3D model? What are Polygons, Splines, Vertices, Meshes and Materials?

    A 3D model is a digital representation of a real-world object. Representing a 3D object requires getting to know the parts that make up the 3D object.

    Polygonal modeling is an approach for modeling objects by representing or approximating their surfaces using polygon meshes.

    Example of triangle mesh.

    Objects created with polygon meshes must store different types of elements. These include vertices, edges, faces, polygons and surfaces.

    hashtag
    Why are polygons important?

    The more edges and faces a model has, the more detail the model can show. On the other hand, a high polygon count will reduce the performance of your app, because the calculations needed to render the model are expensive.
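You can check a model's polygon budget from code. This sketch logs the triangle count of every mesh under an object, using Unity's standard Mesh API:

```csharp
using UnityEngine;

// Logs the triangle count of each mesh so you can spot expensive models early.
public class PolygonBudget : MonoBehaviour
{
    private void Start()
    {
        foreach (MeshFilter filter in GetComponentsInChildren<MeshFilter>())
        {
            // Each triangle is three indices in the triangles array.
            int triangles = filter.sharedMesh.triangles.Length / 3;
            Debug.Log(filter.name + ": " + triangles + " triangles");
        }
    }
}
```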

    How to build and deploy your project for Windows Mixed Reality Headset?


    Project

    In this section we will install the Windows Mixed Reality developer tools and learn how to use them.

    Windows Device Portal

    What are Gestures?

    Gestures are input events based on human hands.

    There are two types of devices that raise gesture input events in the Mixed Reality Toolkit (MRTK):

    • Windows Mixed Reality devices such as HoloLens. This describes pinching motions ("Air Tap") and tap-and-hold gestures.

    WindowsMixedRealityDeviceManager wraps the Unity XR.WSA.Input.GestureRecognizer to consume Unity's gesture events from HoloLens devices.

    • Touch screen devices.

    The UnityTouchDeviceManager wraps Unity's touch input to support physical touch screens.

    Both of these input sources use the Gesture Settings profile to translate Unity's Touch and Gesture events respectively into MRTK's input actions. This profile can be found under the Input System Settings profile.
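A script can receive these gesture events by implementing MRTK's IMixedRealityGestureHandler interface. A minimal sketch, assuming MRTK v2 (the handler fires for this object while it has input focus, or globally if registered as a global input handler):

```csharp
using Microsoft.MixedReality.Toolkit.Input;
using UnityEngine;

// Logs the gesture events translated by the Gesture Settings profile.
public class GestureLogger : MonoBehaviour, IMixedRealityGestureHandler
{
    public void OnGestureStarted(InputEventData eventData)
        => Debug.Log("Gesture started: " + eventData.MixedRealityInputAction.Description);

    public void OnGestureUpdated(InputEventData eventData) { }

    public void OnGestureCompleted(InputEventData eventData)
        => Debug.Log("Gesture completed");

    public void OnGestureCanceled(InputEventData eventData)
        => Debug.Log("Gesture canceled");
}
```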

    Project

    • How to run the MRTK Hand Interaction examples in Unity editor?

    • How to organize your objects into a grid view?

    • How to add manipulation handler to your object?

    Resources

    Developer Tools and 3D assets resources

    • Asset creation tools: https://github.com/Yonet/MixedRealityResources#asset-creation-tools.

    • Asset Libraries: https://github.com/Yonet/MixedRealityResources#asset-libraries.

    • Debugging C# code in Unity:

    • Unity IL2CPP debugging: .

    What is the Mixed Reality Toolkit (MRTK)?

    Mixed Reality Toolkit (MRTK) provides a set of components and features to accelerate cross-platform Mixed Reality application development in Unity. MRTK includes:

    • UI and interaction building blocks.

    UI building blocks
    • Tools.

    • Example Scenes.

    You can learn more about the components in the MRTK documentation.

    How to deploy to HoloLens Emulator?

    How to run the MRTK (Mixed Reality Toolkit) Hand Interaction examples in the Unity Editor?

    MRTK hand interactions example scene.

    The HandInteractionExamples.unity example scene contains various types of interactions and UI controls that highlight articulated hand input.

    To try the hand interaction scene, first open the HandInteractionExamples scene under Assets\MixedRealityToolkit.Examples\Demos\HandTracking\Scenes\HandInteractionExamples

    This example scene uses TextMesh Pro. If you receive a prompt asking you to import TMP Essentials, select the Import TMP Essentials button. Some of the MRTK examples use TMP Essentials for improved text rendering. After you select Import TMP Essentials, Unity will then import the package.

    Importing TMP Essentials

    After Unity completes the import, close the TMP Importer window and reload the scene. You can reload the scene by double-clicking the scene in the Project window.

    After the scene is reloaded, press the Play button.

    02 - Mixed Reality Developer Tools and Concepts

    Introduction to Mixed Reality Developer Tools and 3D Concepts

    Short link: aka.ms/UnityMixedRealityDeveloperTools

    Mixed Reality Developer Tools.

    Overview

    In this section, we will go through the developer tools and how to get started with debugging our applications.

    The second part of the course focuses on creating and using 3D assets in your applications.

    How to grab and move an object?

    To grab and move an object, first ensure that the Manipulation Handler (Script) and Near Interaction Grabbable (Script) components are added to the object. The Manipulation Handler (Script) allows you to manipulate an object, while the Near Interaction Grabbable (Script) allows the object to respond to near hand interactions.

    To add the scripts to the object, first select the object in the Hierarchy window. In the Inspector window, click Add Component and search for each script. Once found, select the script to add to the object.

    Add Manipulation Handler Script component

    With the object selected, in the Inspector window, navigate to the Manipulation Handler (Script) component to modify the component's parameters.

    Manipulation Handler Parameters

    You can move an object using one or two hands. This setting is dependent on the Manipulation Type parameter. The Manipulation Type can be limited to either:

    • One Handed Only

    • Two Handed Only

    • One and Two Handed

    Select the preferred Manipulation Type to restrict the user to the manipulation types you allow.

    You can now test grabbing and moving the object using the in-editor simulation. Press the Play button to enter Game mode. Once in Game mode, hold the space bar to bring up the hand and use the mouse to grab and move the object.
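The same components can be added from code. A minimal sketch, assuming MRTK v2 (verify the property and enum names against your MRTK version):

```csharp
using Microsoft.MixedReality.Toolkit.Input;
using Microsoft.MixedReality.Toolkit.UI;
using UnityEngine;

// Adds grab-and-move behaviour to an object from code instead of the Inspector.
public class MakeGrabbable : MonoBehaviour
{
    private void Start()
    {
        // Required so near (articulated hand) interactions can grab the object.
        gameObject.AddComponent<NearInteractionGrabbable>();

        // ManipulationHandler moves and rotates the object while it is grabbed.
        var handler = gameObject.AddComponent<ManipulationHandler>();
        handler.ManipulationType = ManipulationHandler.HandMovementType.OneAndTwoHanded;
    }
}
```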

    Concepts

    Bing Maps SDK Visualization
    • Why is spatial data important?

    • What are some good spatial visualizations for Mixed Reality?

    How to build your project for HoloLens?

    • In the Unity menu, select File > Build Settings... to open the Build Settings window.

    • In the Build Settings window, select Universal Windows Platform and click the Switch Platform button.

How to set up your project for iOS or Android [Experimental]?

    1 ) Make sure you have imported Microsoft.MixedReality.Toolkit.Unity.Foundation as a custom asset or through NuGet.

    2 ) In the Unity Package Manager (UPM), install the following packages:

    How to add Mixed Reality Toolkit(MRTK) to a project?

If you are using the HoloLens Seed project, you do not need to follow this step; the Seed project already comes with MRTK. Still, it's good to know how to import the MRTK assets for your future projects.

First, download MRTK by going to its GitHub page: and navigating to the Releases tab. Scroll down to the Assets section and download the tools:

    • Examples

    Project

In this project we will set up our development environment for Mixed Reality development with Unity3D.

Check your knowledge by answering the questions below before you move into the project. Feel free to skip sections you feel comfortable with. Make sure you read through the first download section to confirm you have all the necessary modules.

    How to build your scene for Android and iOS Devices?

How to set up the HoloLens 2 Emulator

    You can download the latest here:.

    hashtag
    Can the HoloLens Emulator run on my device?

    Before installing the emulator, make sure your PC meets the following hardware requirements:

    How to make your buttons follow your hand?

    hashtag
    How to make your buttons follow your hand?

    MRTK uses what are known as Solvers to allow UI elements to follow the user or other game objects in the scene. The Radial View solver is a tag-along component that keeps a particular portion of a GameObject within the user's view.

    You can make a button follow your hand by adding the Radial View (Script) component to the object.

    What could go wrong?

    Common issues to consider while developing for Mixed Reality

    hashtag
    What are some of the security issues with Mixed Reality Applications?

Since a Mixed Reality application might have access to the user's video stream, developers might be able to save or share private information about the user. Be careful not to save any sensitive data or images anywhere other than the user's device. Never send sensitive information to any backend.

    How to organize your objects into a grid view?

You can organize any objects in Unity into a grid by using an Object collection script. In this example, you will learn how to organize 9 3D objects into a 3 x 3 grid.

    First, configure your Unity scene for the Mixed Reality Toolkit. Next, in the Hierarchy window, right click in an empty space and select Create Empty. This will create an empty GameObject. Name the object CubeCollection.

    In the Inspector window, position CubeCollection so that the collection displays in front of the user (example, X = 0, Y = -0.2, Z = 2).

    05 - Map Visualization

Spatial visualization using Bing Maps on HoloLens 2 and Windows Mixed Reality headsets.

    hashtag
    Overview

    Shortlink:

    circle-exclamation

    What are some good spatial visualizations for Mixed Reality?

A good visualization allows users to understand data better by presenting data points in the right context. Check out some of the examples below to see what a visualization provides that you would have a hard time understanding from the raw data points alone.

    • Small arms and ammunition import and export interactive visualization:


    What is Bing Maps SDK?

Maps SDK, a Microsoft Garage project, provides a control to visualize a 3D map in Unity. The map control handles streaming and rendering of 3D terrain data with worldwide coverage. Select cities are rendered at a very high level of detail. Data is provided by Bing Maps.

    The map control has been optimized for mixed reality applications and devices including the HoloLens, HoloLens 2, Windows Immersive headsets, HTC Vive, and Oculus Rift. Soon the SDK will also be provided as an extension to the

    First, drag a button prefab from MixedRealityToolkit.SDK > Features > UX > Interactable > Prefabs to the Hierarchy window.

    In the Hierarchy window, select the button prefab. In the Inspector window, click Add Component. Search for Radial View. Once found, select to add the component to the button.

    When you add the Radial View (Script) component to the button, the Solver Handler (Script) component is added as well because it is required by the Radial View (Script).

    The Solver Handler (Script) component needs to be configured so that the button follows the user's hand. First, change Tracked Target Type to Hand Joint. This will enable you to define which hand joint the button follows.

Next, in the Solver Handler (Script) component, change Tracked Handedness to Right. This setting determines which hand is tracked.

There are over 20 hand joints available for tracking. Still inside the Solver Handler (Script) component, change Tracked Hand Joint to Wrist so that the button tracks the user's wrist.

    Now that the hand tracking is configured, you need to configure the Radial View (Script) component to further define where the button is located and how it is viewed in relation to the user. First, change Reference Direction to Facing World Up. This parameter determines which direction the button faces.

Next, in the Radial View (Script) component, change the Min Distance and Max Distance to 0. The Min and Max Distance parameters determine how far the button is kept from the user. As a reminder, the unit of measurement in Unity is meters; therefore, a Min Distance of 1 would push the button away so that it is never closer than 1 meter to the user.

    Now that the button is configured to follow your right wrist, press Play to enter Game mode and test the solver in the in-editor simulator. Press and hold the space bar to bring up the hand. Move the mouse cursor around to move the hand, and click and hold the left mouse button to rotate the hand:

    Unit testingarrow-up-right
    Integration testingarrow-up-right
    Log file analysisarrow-up-right
    applicationarrow-up-right
    systemarrow-up-right
    Memory dumpsarrow-up-right
    Profilingarrow-up-right
    How to grab and move an object?arrow-up-right
    How to rotate and scale an object?arrow-up-right
    How to make an object respond to input events?arrow-up-right
    How to add audio feedback?arrow-up-right
    How to add visual feedback?arrow-up-right
    How to place an object onto a surface?arrow-up-right
    How to style bounding box?arrow-up-right
    How to add button prefabs to your project?arrow-up-right
    How to make your buttons follow your hand?arrow-up-right
    How to use simplified joint data access?arrow-up-right
    https://docs.unity3d.com/Manual/ManagedCodeDebugging.htmlarrow-up-right
    https://aka.ms/AA7qap4arrow-up-right

    What are some key concepts for working with Unity?

  • How to Get Started with Mixed Reality Development Using Unity?

  • How to get started with HoloLens Seed Project?

  • How to change preferences in Unity?

  • How to add Mixed Reality Toolkit(MRTK) to a project?

  • How to open MRTK example scenes?

  • How to enable Developer Mode in HoloLens?

  • How to enable Developer Mode on an Android device?

  • How to build your project for HoloLens?

  • How to deploy your app for HoloLens?

  • How to set up your project for iOS and Android [Experimental]?

  • How to build and deploy your project for Android?

  • How to build and deploy your project for Windows Mixed Reality Headset?

  • What do I need to download for Unity Development?

  • How to get started with Unity3D Editor interface?

    Compare Covid-19 Data tab and map tab to see the difference it makes in your perception: https://ncov2019.live/arrow-up-right

  • Wind and weather visualizations: https://www.windy.com/arrow-up-right

  • Chrome experiments with Globe: https://experiments.withgoogle.com/chrome/globearrow-up-right

  • https://armsglobe.chromeexperiments.com/arrow-up-right
    Outings Garage Projectarrow-up-right
    MapsBing-SDKarrow-up-right

    How to add button prefabs to your project?

    hashtag
    How to add button prefabs to your project?

    Mixed Reality Toolkit is equipped with a variety of button prefabs that you could add to your project. A prefab is a pre-configured GameObject stored as a Unity Asset and can be reused throughout your project.

    You can find button prefabs available in MRTK by navigating to MixedRealityToolkit.SDK > Features > UX > Interactable > Prefabs.

    In this project, you will learn how to change the color of a cube when a button is pressed.

    First, select the button of your choice from the Project window and drag into the Hierarchy window.

    Change the button's Transform Position so that it's positioned in front of the camera to x = 0, y = 0, and z = 0.5

    Next, right click on an empty spot in the Hierarchy window and click 3D Object > Cube.

    With the Cube object still selected, in the Inspector window, change the Transform Position so that the cube is located near but not overlapping the button. In addition, resize the cube by changing the Transform Scale.

    In the Hierarchy window, select the button. In the Inspector window, navigate to the Interactable (Script) component.

    In the Events section, expand the Receivers section.

    Click the Add Event button to create a new event receiver of Event Receiver Type InteractableOnPressReceiver.

    For the newly created InteractableOnPressReceiver event, change the Interaction Filter to Near and Far.

    From the Hierarchy window, click and drag the Cube GameObject into the Event Properties object field for the On Press() event to assign the Cube as a receiver of the On Press () event.

    Next, click the action dropdown (currently assigned No Function) and select MeshRenderer > Material material. This action will set the Cube's material property to change when the button is pressed.

    Now, assign a color for the Cube to change to when the button is pressed. Click the small circle icon next to the Material field (currently assigned None (Material)) to open the Select Material window.

    MRTK provides a variety of materials that can be used in your projects. In the search bar, search for MRTK_Standard and select your color of choice.

    Now that the event is configured for when the button is pressed, you now need to configure an event that occurs when the button is released. For the On Release () event, click and drag the Cube GameObject into the Event Properties.

    Next, click the action dropdown (currently assigned No Function) and select MeshRenderer > Material material. This action will set the Cube's material property to change when the button is released.

    Now, assign a color for the Cube to change to when the button is released. Click the small circle icon next to the Material field (currently assigned None (Material)) to open the Select Material window and search for MRTK_Standard. Select your choice of color.

Now that both the On Press () and On Release () events are configured for the button, press Play to enter Game mode and test the button in the in-editor simulator.

    To press the button, press the space bar + mouse scroll forward.

    To release the button, press the space bar + mouse scroll backward.

    hello.sh
    # Ain't no code for that yet, sorry
    echo 'You got to trust me on this, I saved the world'

    Resources

    Concepts
    Project
    What could go wrong?
  • Click Project Settings in the Build Settings window, or in the Unity menu select Edit > Project Settings..., to open the Project Settings window.

  • Project settings.
    • In the Project Settings window, select Player > XR Settings to expand the XR Settings.

    Player XR settings.
    • In the XR Settings, check the Virtual Reality Supported checkbox to enable virtual reality, then click the + icon and select Windows Mixed Reality to add the Windows Mixed Reality SDK.

    XR settings Mixed Reality Supported checkbox.
    circle-info

Your project settings might have already been configured by the Mixed Reality Toolkit.

    • Optimize the XR Settings as follows:

      • Set Windows Mixed Reality Depth Format to 16-bit depth.

      • Check the Windows Mixed Reality Enable Depth Sharing checkbox.

      • Set Stereo Rendering Mode to Single Pass Instanced.

    Optimization settings for XR.
    • In the Project Settings window, select Player > Publishing Settings to expand the Publishing Settings. Scroll down to the Capabilities section and check the SpatialPerception checkbox.

    Spatial Perception enabled.

Save your project and open up the Build Settings window. Click the Build button, not Build and Run. When prompted, create a new folder (for example, HoloLensBuild) and select it to build your files into.

    triangle-exclamation

Click the Build button, not Build and Run.

    Build your project into a new folder by clicking build button.

    When your build is done, your file explorer will automatically open to the build folder you just created.

    Build settings.
Switch platform to Universal Windows Platform.
    3 ) Enabling the Unity AR camera settings provider.

    The following steps presume use of the MixedRealityToolkit object. Steps required for other service registrars may be different.

    1. Select the MixedRealityToolkit object in the scene hierarchy.

    MixedReality Toolkit in Hierarchy panel.

    2. Select Copy and Customize to Clone the MRTK Profile to enable custom configuration.

    Copy and Customize to Clone the MRTK Profile.

    3. Select Clone next to the Camera Profile.

    Clone camera profile.

    4. Navigate the Inspector panel to the camera system section and expand the Camera Settings Providers section.

    Camera Settings Providers

    5. Click Add Camera Settings Provider and expand the newly added New camera settings entry.

    New camera settings expanded view.

    6. Select the Unity AR Camera Settings provider from the Type drop down.

    Unity AR Camera Settings.

Android

    • AR Foundation Version: 2.1.4

    • ARCore XR Plugin Version: 2.1.2

    iOS

    • AR Foundation Version: 2.1.4

    • ARKit XR Plugin Version: 2.1.2

    Extensions
  • Foundation

  • Tools

  • Download MRTK Releases.

    hashtag
    Add MRTK assets into your project

In your Unity project, select the Assets menu and choose Import Package > Custom Package from the dropdown.

    Navigate to MRTK downloaded folders to select and import them into your project.

Once you have the MRTK assets imported, a new tab called Mixed Reality Toolkit will appear in your Unity editor. Navigate to the new tab and select Add Scene and Configure from the dropdown menu. In your Scene Hierarchy, new MixedRealityToolkit and MixedRealityPlayspace objects will appear.

MixedRealityPlayspace now includes your Main Camera, and the camera is configured for Mixed Reality applications. The camera background is set to black so it renders as transparent, and the MixedRealityInputModule, EventSystem, and GazeProvider components are now added to your camera.

    circle-info

You can create a new scene to compare the camera settings that have been changed by MRTK.

• You might be prompted to select a configuration. You can choose the default MRTK configuration, or if you are developing for a HoloLens device, choose the configuration for the appropriate version.

    aka.ms/MRTKGithubarrow-up-right
    iOS Project Configurator Settings.
    Project Configurator Settings.
    circle-info

    There are no additional steps after switching the platform for Android.

    Optimization header, uncheck Strip Engine Code.
    circle-info

    Unchecking Strip Engine Code is the short term solution to an error in Xcode #6646arrow-up-right. We are working on a long term solution.

    64-bit Windows 10 Pro, Enterprise, or Education

  • circle-exclamation

    Windows 10 Home Edition does not support Hyper-V or the HoloLens Emulator. The HoloLens 2 Emulator requires the Windows 10 October 2018 update or later.

    hashtag
    How to check or enable "Hyper-V" settings?

    hashtag
    How to install or update the Emulator?

    HoloLens Emulator updatearrow-up-right
    bit.ly/emulator2arrow-up-right
    hashtag
    Why is eye scan data sensitive information?

An iris scan is a more accurate identification method than a fingerprint. Since iris scan data can be used to identify and sign in a user, it should never leave the user's device. HoloLens 2 does not send the iris scan to the cloud and does not give access to the data.

    hashtag
    Why is eye tracking important for users privacy?

Eye tracking, while a very useful tool for making your application more accessible, can also be used to collect data about the user's attention and might be used to manipulate it.

    hashtag
    Can I open my unity project in the current version, if it is originally saved in an older version?

Unity versions are not backward compatible. If you open a project in a newer version, Unity will try to update your project automatically, but it is not guaranteed that the newer version will work with your imported assets. There might be incompatibilities between your assets or your code and the new version.

    hashtag
    What does the Unity Versioning mean and when is it safe to update the Unity version?

    Let's take the latest version in the image below, 2019.3.5f:

• 2019: the year the Unity version was developed. Year versions are issued once a year, and major changes between them can break your application. Stick to the same year version unless you are creating a new application from scratch. We will talk about how to update your project to the latest version in the following lessons.

    • 3: the 3rd iteration in 2019. When a version updates from 2 to 3, there can be minor breaking changes. Make sure to read the changelog before updating your project from 2 to 3.

    • .5f: bug fixes. These are usually small fixes that do not break your code or the APIs being used. Feel free to update your project from 2019.3.4f to 2019.3.5f.
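The version-string breakdown above can be sketched in shell; the version string here is just an example (Unity's real strings end in a build suffix such as 5f1):

```shell
ver="2019.3.5f1"        # example Unity version string
year=${ver%%.*}         # "2019": yearly release stream; changes can break projects
rest=${ver#*.}
minor=${rest%%.*}       # "3": iteration within the year; read the changelog first
patch=${rest#*.}        # "5f1": bug-fix build; generally safe to update to
echo "year=$year minor=$minor patch=$patch"
```

Running it prints `year=2019 minor=3 patch=5f1`, mirroring the three-part reading described above.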

    hashtag
    How to update my Unity version to a newer one?

In Unity Hub, under the Projects tab, you can select the Unity version dropdown for your application and choose a newer version of Unity. Unity will confirm your choice before updating your project. It is a good idea to save a version of your project as a new branch on GitHub, in case you need to revert.

    hashtag
    What could go wrong with Unity NuGet packages?

    Here is a detailed article about the subject: https://www.what-could-possibly-go-wrong.com/unity-and-nuget/arrow-up-right

    With CubeCollection still selected, in the Hierarchy window, create a child Cube object. Change the scale of the object to x = .25, y = .25, z = .25.
    Cube transforms.

    Duplicate the child Cube object 8 times so that there is a total of 9 Cube child objects within the CubeCollection object.

    Duplicated cubes collection.

    In the Hierarchy window, select CubeCollection. In the Inspector window, click Add Component and search for the Grid Object Collection (Script). Once found, select the component to add to the object.

    Add Grid Object Collection Script component

    Configure the Grid Object Collection (Script) component by changing the Sort Type property to Child Order. This will ensure that the child objects (the 9 Cube objects) are sorted in the order you placed them under the parent object.

    Change sort type.

    Click Update Collection to apply the new configuration.

    Update Collection.

You can adjust the parameters within the Grid Object Collection (Script) component to further customize the grid. For example, you could change the number of rows to 2 by changing the value of the Num Rows property. Be sure to click Update Collection to apply the new configuration.

    Change number of rows.
    Grid layout of boxes.
    Empty CubeCollection object.
    Position attributes of the CubeCollection.
    This project is for HoloLens 2 and Windows Mixed Reality Headsets.

    In this project, we will create a 3D Map visualization using Bing Maps Unity SDKarrow-up-right: aka.ms/BingMapsUnitySDKarrow-up-right.

    Bing Maps Visualization Example.

Outings, a sample app created with the Bing Maps SDK, can be found on the Microsoft Store for PC and HoloLens 1: aka.ms/OutingsHoloLens1arrow-up-right

    Outings Immersive App.

    hashtag
What will we build?

We will build the app shown in the video below for HoloLens 2. You can also build it for a Windows Mixed Reality headset and use hand controllers instead of hand gestures.

    aka.ms/UnityBingMapsVisualizationLessonarrow-up-right
    simulated handsarrow-up-right
    UnityTouchControllerarrow-up-right
    Unity Touch classarrow-up-right
    Input Actionsarrow-up-right
    Gesture Profile Settings
    aka.ms/MRTKGuidesarrow-up-right
    Mixed Reality Toolkit Examples
    MRTK Examples Hub
    Mixed Reality Toolkit Documentation.
    What is Bing Maps SDK?
    Mixed Reality Toolkit (MRTK).arrow-up-right
    Bing Maps SDK
    Creating a new scene.
    Update & Security Settings.
    For Developers Settings.
    Displacement Maps
    Virtual Reality Headset
    Manipulation type
    Add object and action

    How to get started with HoloLens Seed Project?

HoloLens Seed projectarrow-up-right is a GitHub repository that is configured for Windows Mixed Reality development. The repo includes the Mixed Reality Toolkit and .gitignore files.

You can create a new project from the seed instead of downloading the different assets and setting up your git project yourself. To use the seed project, you can get a github accountarrow-up-right and set up your development environment, or directly download the repository content.

    Download Seed project from github.

    hashtag
    Setup

You can clone this repository, delete its history, and start a new git project by running the script below. You need to create your own GitHub repo first. Replace with your own GitHub project URL.

Or by running the following git commands:
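Since the commands themselves aren't shown here, the sketch below demonstrates the history-reset flow with a local throwaway repository standing in for the seed's GitHub URL; directory names and the git identity are placeholders you would replace with your own:

```shell
# Sketch only: "./seed" stands in for the HoloLens Seed repo URL, and
# "myproject" for your own project; identity values are placeholders.
set -e
export GIT_AUTHOR_NAME=demo GIT_AUTHOR_EMAIL=demo@example.com
export GIT_COMMITTER_NAME=demo GIT_COMMITTER_EMAIL=demo@example.com
mkdir seed
git -C seed init -q
git -C seed commit -q --allow-empty -m "seed history"

git clone -q ./seed myproject            # 1. clone the seed
rm -rf myproject/.git                    # 2. delete the seed's history
git -C myproject init -q                 # 3. start a fresh repository
git -C myproject add -A
git -C myproject commit -q --allow-empty -m "Initial commit from seed"
# 4. point it at your own GitHub repo (URL is a placeholder):
#    git -C myproject remote add origin https://github.com/<you>/<repo>.git
git -C myproject log --oneline           # a single fresh commit
```

With the real seed you would clone its GitHub URL in step 1 and finish with `git push -u origin` to publish the fresh history to your own repo.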

    hashtag
    How to update your project to latest seed?

Whenever there is a new update for or packages, this repo will be updated with the latest version. You can automatically get the latest packages by adding the seed repo as your upstream and pulling from it.

You can check your remote origin and upstream by copying and pasting the following into your terminal:

    You can remove the upstream anytime by running:
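A sketch of the whole upstream workflow, demonstrated against a local repository standing in for the seed's GitHub URL (names and paths are placeholders; the pull branch name is an assumption that depends on the seed's default branch):

```shell
# Sketch only: "seed-upstream" is a local stand-in for the seed repo URL.
set -e
export GIT_AUTHOR_NAME=demo GIT_AUTHOR_EMAIL=demo@example.com
export GIT_COMMITTER_NAME=demo GIT_COMMITTER_EMAIL=demo@example.com
git init -q seed-upstream
git -C seed-upstream commit -q --allow-empty -m "seed"

git clone -q ./seed-upstream work                  # your working clone ("origin")
git -C work remote add upstream ../seed-upstream   # add the seed as "upstream"
git -C work remote -v                              # check origin and upstream
git -C work fetch -q upstream                      # then e.g.: git pull upstream master
git -C work remote remove upstream                 # remove the upstream anytime
git -C work remote                                 # only "origin" remains
```

The same three subcommands (`remote add`, `remote -v`, `remote remove`) apply unchanged when upstream is the real seed repository on GitHub.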

    Resources

    Mixed Reality getting started resources

    • Windows Mixed Reality Docs: aka.ms/MixedRealityDocsarrow-up-right

Windows Mixed Reality Documentation.
    • Mixed Reality Curriculum Youtube Playlist: aka.ms/MixedRealityCurriculumVideosarrow-up-right

    Curriculum Youtube Playlist.
    • Mixed Reality Resources Repository:

    • HoloLens Seed Project Repository:

    • Code Samples:

    • Mixed Reality Development Tools to install:

• Eliminate Texture Confusion: Bump, Normal and Displacement Maps:

    • Normal vs. Displacement Mapping & Why Games Use Normals:

    • Live editing WebGL shaders with Firefox Developer Tools:

    How to enable Developer Mode on an Android Device?

    The Settings app on Android includes a screen called Developer options that lets you configure system behaviors that help you profile and debug your app performance. For example, you can enable debugging over USB, capture a bug report, enable visual feedback for taps, flash window surfaces when they update, use the GPU for 2D graphics rendering, and more.

    hashtag
    Enable developer options and USB debugging

    On Android 4.1 and lower, the Developer options screen is available by default. On Android 4.2 and higher, you must enable this screen. To enable developer options, tap the Build Number option 7 times. You can find this option in one of the following locations, depending on your Android version:

    • Android 9 (API level 28) and higher: Settings > About Phone > Build Number

• Android 8.0.0 (API level 26) and Android 8.1.0 (API level 27): Settings > System > About Phone > Build Number

    • Android 7.1 (API level 25) and lower: Settings > About Phone > Build Number

    At the top of the Developer options screen, you can toggle the options on and off (figure 1). You probably want to keep this on. When off, most options are disabled except those that don't require communication between the device and your development computer.

    Before you can use the debugger and other tools, you need to enable USB debugging, which allows Android Studio and other SDK tools to recognize your device when connected via USB. To enable USB debugging, toggle the USB debugging option in the Developer Options menu. You can find this option in one of the following locations, depending on your Android version:

    • Android 9 (API level 28) and higher: Settings > System > Advanced > Developer Options > USB debugging

• Android 8.0.0 (API level 26) and Android 8.1.0 (API level 27): Settings > System > Developer Options > USB debugging

    • Android 7.1 (API level 25) and lower: Settings > Developer Options > USB debugging
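Once USB debugging is enabled, you can verify the connection from your development machine with adb, which ships with the Android SDK platform-tools. The sketch below only runs adb if it is installed, and otherwise just prints the command to use:

```shell
# Verify the USB connection once Developer options and USB debugging are on.
# adb is assumed to come from the Android SDK platform-tools install.
if command -v adb >/dev/null 2>&1; then
  adb devices   # the device should be listed as "device", not "unauthorized"
else
  echo 'adb devices   # lists connected devices once USB debugging is enabled'
fi
```

If the device shows as "unauthorized", accept the USB debugging prompt on the phone's screen and run `adb devices` again.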

    The rest of this page describes some of the other options available on this screen.

    How to rotate and scale an object?

To rotate and scale an object, first ensure that the Manipulation Handler (Script) and Near Interaction Grabbable (Script) components are added to the object. The Manipulation Handler (Script) allows you to manipulate the object, while the Near Interaction Grabbable (Script) allows the object to respond to near hand interactions.

    To add the scripts to the object, first select the object in the Hierarchy window. In the Inspector window, click Add Component and search for each script. Once found, select the script to add to the object.

    Manipulation Handler Script

    With the object selected, in the Inspector window, navigate to the Manipulation Handler (Script) component to modify the component's parameters.

    Manipulation Handler parameters

    You can rotate an object using one or two hands. This setting is dependent on the Manipulation Type parameter. The Manipulation Type can be limited to either:

    • One Handed Only

    • Two Handed Only

    • One and Two Handed

    Select Two Handed Only for Manipulation Type so that the user can only manipulate the object with two hands.

    To limit the two handed manipulation to rotating and scaling, change Two Handed Manipulation Type to Rotate Scale.

    To limit whether the object can be rotated on the x, y or z axis, change Constraint on Rotation to your preferred axis.

    You can now test rotating and scaling the object using the in-editor simulation. Press the Play button to enter Game mode. Once in Game mode, press T and Y on the keyboard to toggle both hands. This will permanently display both hands in Game mode. Press the space bar to move the right hand and use left mouse click + Shift to move the left hand. While either controlling the left or right hand, use the mouse to rotate and scale the object.


    What is HoloLens Device Portal?

    The Windows Device Portal for HoloLens lets you configure and manage your device remotely over Wi-Fi or USB. The Device Portal is a web server on your HoloLens that you can connect to from a web browser on your PC. The Device Portal includes many tools that will help you manage your HoloLens and debug and optimize your apps.

    hashtag
    How to setup device portal?

    What are some examples of Mixed Reality Applications?

    Medical

WebXR Emulator: https://chrome.google.com/webstore/detail/webxr-api-emulator/mjddjgeghkdijejnciaefnkjmkafnnje?hl=enarrow-up-right

    aka.ms/MixedRealityResourcesRepositoryarrow-up-right
    aka.ms/HoloLensSeedProjectarrow-up-right
    aka.ms/MixedRealityUnitySamplesarrow-up-right
    https://aka.ms/HoloLensToolInstallsarrow-up-right
    https://www.pluralsight.com/blog/film-games/bump-normal-and-displacement-mapsarrow-up-right
    https://cgcookie.com/articles/normal-vs-displacement-mapping-why-games-use-normalsarrow-up-right
    https://hacks.mozilla.org/2013/11/live-editing-webgl-shaders-with-firefox-developer-tools/arrow-up-right
    HoloLens Seed Project Repository.
    Code Sample Repository.
    Tools to install link.
WebXR Emulator Extension.
    Turn on your HoloLens device.
  • Tap your wrist (HoloLens 2) or make a bloom gesture (HoloLens 1) to open the Windows menu.

  • Windows menu.
    • Open the Settings > Update & Security.

    Update & Security Settings.
    • Select For Developers tab on the right hand panel.

    For Developers Settings.
• Enable "use developer features" by toggling the on/off button.

• Scroll down in the For Developers settings to enable "Device Portal".

    Device Portal Toggle.
• Go back to the main settings page by clicking "Home" on the left-hand panel and select "Network & Internet" settings.

    Network and Internet Settings.
    • Select "Wifi" tab on the left, if it is not already selected.

    • Select the wifi you are connected to and click on "Advanced Options".

    Wifi Advanced Options.
    • Scroll down and write down the IPV4 address.

• You will type this IP address into your browser to reach your device portal.

    • You might see a connection Alert as shown below:

    Your connection is not private alert.
    • Go ahead and click Advanced button and click Proceed to <your IP address>(unsafe).

    • Congrats, you made it to your device portal.

    Device Portal.
    • Click Views on the right hand panel and select "Live Preview" to see the camera view of your HoloLens.

    Live Preview on Windows Device Portal.
• You can turn off the PV camera if you would like to share or record what you are seeing through your HoloLens but do not want to capture your environment.

• You can see the videos you recorded or the screenshots you snapped in the Videos and Photos section here, or by asking Cortana if you enabled voice commands.
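The portal address from the steps above can be sketched as below. The IP is a placeholder for the IPV4 address you wrote down; the USB loopback address is an assumption based on the default port the HoloLens USB connection forwards:

```shell
# Placeholder IP; substitute the IPV4 address from your Wi-Fi advanced options.
HOLOLENS_IP="192.168.1.42"
echo "https://${HOLOLENS_IP}"    # Device Portal over Wi-Fi
echo "http://127.0.0.1:10080"    # Device Portal over USB (default forwarded port)
```

Open the printed address in your PC's browser; the certificate warning described above is expected on first connection.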

    Museums and Libraries
    • Mont-Saint-Michel: The historic 3D model comes to lifearrow-up-right

    • Apollo 11 Mission Unreal Engine Experiencearrow-up-right

    • Dutch National Museum HoloLens Experiencearrow-up-right

    Incubator for Medical Mixed and Extended Reality at Stanfordarrow-up-right
    Take a Ride on a Root Canal: The VR Tooth Tour Storyarrow-up-right
    Mixed Reality Toolkitarrow-up-right
    Azure Spatial Anchorsarrow-up-right
    Manipulation Type
    Rotate and Scale
    Rotation Constrain
    Rotate interaction
    Logo

    What do I need to download for Mixed Reality development with Unity for HoloLens?

Before you get started with Mixed Reality development for Unity, make sure to check everything in the list below and follow the instructions for each download.

    triangle-exclamation

Not following the instructions for a specific download might result in errors while developing or building your application. Before you try to debug, check the list and the detailed instructions.

    hashtag
    Windows 10

    Install the most recent version of or so your PC's operating system matches the platform for which you are building mixed reality applications.

    You can check your Windows version by typing "about" in the Windows search bar and selecting About your PC as shown in the below image.

    You can learn more about upgrading your Windows 10 Home to Pro at .

    circle-info

    We need to install and enable Hyper-V, which does not work on Windows 10 Home. Make sure to upgrade to the Education, Pro Education, Pro or Enterprise version.

    hashtag
    Unity

    Go to the https://unity3d.com/get-unity/download page and download Unity Hub instead of the Unity Editor.

    circle-exclamation

    Do not use beta software in general before you feel very comfortable with debugging, the software itself, and your way around GitHub issues and Stack Overflow. Don't learn this lesson the hard way! I have tried that for your benefit and/or my optimism.

    Unity Hub allows you to download multiple Unity Editors and organize your projects in one place. Since Unity upgrades are not backward compatible, you have to open a project with the same Unity version it was created with. You can update a project to the latest Unity version, but that usually requires a lot of debugging. The easiest way to get going with a project is to keep the same version. I will show you how to debug when updating your projects later in this chapter.
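
    Since the version matters, it helps to check which editor a project needs before opening it in Unity Hub. Unity records the editor version in ProjectSettings/ProjectVersion.txt at the project root, so you can read it from a terminal. A minimal sketch — the DemoProject folder and version number below are made-up stand-ins for your own project:

    ```shell
    # Sketch: find the Unity version a project was created with.
    # Unity writes it to ProjectSettings/ProjectVersion.txt inside the project.
    # "DemoProject" is a hypothetical stand-in for your own project folder;
    # here we create one so the example is self-contained.
    mkdir -p DemoProject/ProjectSettings
    printf 'm_EditorVersion: 2019.4.40f1\n' > DemoProject/ProjectSettings/ProjectVersion.txt

    # Read the version back before picking an editor in Unity Hub:
    grep m_EditorVersion DemoProject/ProjectSettings/ProjectVersion.txt
    # m_EditorVersion: 2019.4.40f1
    ```

    Install the matching editor version through Unity Hub's Installs page and the project will open without an upgrade prompt.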

    You will need to download the Windows development related modules along with your Unity Editor. Make sure Universal Windows Platform Build Support and Windows Build Support are checked while downloading the Unity Editor through Unity Hub, or add them afterwards by modifying the install.

    You can add modules, or check whether your editor already has them, by clicking the hamburger button for the Unity Editor version and checking the module check-boxes shown above.

    If you would like to build for an Android or iOS mobile device, make sure the related modules are checked as well.

    hashtag
    Visual Studio

    You can download Visual Studio by adding the Microsoft Visual Studio 2019 module to your Unity Editor as shown in the previous step, or download it at aka.ms/VSDownloads.

    circle-exclamation

    Make sure to download Mixed Reality related modules along with Visual Studio.

    You can always add the necessary workflows to Visual Studio after download:

    How to get started with Unity3D Editor interface?

    In this section, you will learn the Unity3D interface, tools and keyboard shortcuts.

    The Unity Editor has four main sections:

    hashtag
    Scene view

    This is where you can edit the current Scene by selecting and moving objects in the 3D space for the game. In this kit, the game level is contained in one Scene.

    03 - Hand Interactions and Controllers

    Working with Hand Interactions.

    Short link: aka.ms/UnityHandInteractions

    hashtag
    Overview

    In this section, we will look into hand interactions as an input in our application.

    Deploying your HoloLens 2 applicationMicrosoftLearnchevron-right
    Getting started with Unity development.
    git clone --depth=1 https://github.com/Yonet/HoloLensUnitySeedProject.git <your-project-name>
    # Clone the seed project
    git clone --depth=1 https://github.com/Yonet/HoloLensUnitySeedProject.git
    
    # Remove the history from the repo
    rm -rf .git
    
    # Recreate the repo from the current content only
    git init
    git add .
    git commit -m "Initial commit"
    
    # Push to the GitHub remote repo, ensuring you overwrite history
    git remote add origin git@github.com:<YOUR ACCOUNT>/<YOUR REPOS>.git
    git push -u --force origin master
    git remote add upstream https://github.com/Yonet/HoloLensUnitySeedProject.git
    git pull upstream master
    git remote -v
    git remote remove upstream
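
    The history-stripping steps above can be sanity-checked locally before you push anything. This sketch runs the same sequence in a throwaway directory — the path and commit messages are made up — to confirm that deleting the .git folder really discards the seed project's history:

    ```shell
    # Throwaway demo: deleting .git discards all prior history.
    mkdir -p /tmp/seed-demo && cd /tmp/seed-demo
    git init -q
    git -c user.email=demo@example.com -c user.name=demo \
        commit -q --allow-empty -m "old seed history"
    rm -rf .git                       # drop the seed project's history
    git init -q
    git -c user.email=demo@example.com -c user.name=demo \
        commit -q --allow-empty -m "Initial commit"
    git log --oneline                 # only the new "Initial commit" remains
    ```

    If the log still shows the seed project's commits, the rm -rf .git step was run in the wrong directory.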
    Paris Museum HoloLens Experiencearrow-up-right
    OneDome - Unreal Garden HoloLens Exhibitionarrow-up-right
    Bouluvard HoloLens Apparrow-up-right
    Dinosaur Passage: A Hololens Museum Experiencearrow-up-right
    Smithsonian Apollo 11 Module VRarrow-up-right
    Making of Apollo 11 Module Experiencearrow-up-right
    Google I/O 2018 AR demoarrow-up-right
    SF Moma movement controlled experiencearrow-up-right
    Google Augmented Reality museum experiencearrow-up-right
    MoMa Jackson Pollackarrow-up-right
    DaVinci AR Layers Google IO 2018arrow-up-right
    Petersen Automotive Museum: a HoloLens experiencearrow-up-right
    Mixed Reality Museum Tour Solution with HoloLensarrow-up-right
    Experimenting with Mixed Reality in Museumsarrow-up-right
    Explore WWII’s French Resistance in mixed realityarrow-up-right
    Mixed Reality Museum in Kyoto: A unique insight into centuries-old Japanese artworkarrow-up-right
    Catalina HoloLens Experiencearrow-up-right
    Museum Next: How Museums are using Augmented Realityarrow-up-right
    Museum Next: Virtual Reality is a big trend in museums, but what are the best examples of museums using VR?arrow-up-right
    Augmenting Museum Experiences with Mixed Realityarrow-up-right
    Will mixed reality replace our phones and personal computers?
    How do I decide if I need to develop for Virtual Reality or Augmented Reality?
    Changing preferences in Unity3D
    How to check or enable "Hyper-V" on your PC
    How to install or update HoloLens Emulator?
    hashtag
    Hierarchy window

    This is a list of all the GameObjects in a Scene. Every object in your game is a GameObject. These can be placed in a parent-child hierarchy, which lets you group objects — this means that when the parent object is moved, all of its children will move at the same time.

    hashtag
    Inspector window

    This displays all settings related to the currently selected object. You will explore this window more during the walkthrough.

    hashtag
    Project window

    This is where you manage your Project Assets. Assets are the media files used in a Project (for example, images, 3D models and sound files). The Project window acts like a file explorer, and it can be used to explore and create folders on your computer. When the walkthrough asks you to find an Asset at a given file path, use this window.

    TIP: If your Editor layout doesn’t match the image above, use the layout drop-down menu at the top right of the toolbar to select Default.

    Going back to default editor layout.

    hashtag
    Unity Editor Toolbar

    Unity Editor Toolbar.

    The toolbar includes a range of useful tool buttons to help you design and test your game.

    hashtag
    Play Buttons

    hashtag
    Play

    Play is used to test the Scene which is currently loaded in the Hierarchy window, and enables you to try out your game live in the Editor.

    hashtag
    Pause

    Pause, as you have probably guessed, allows you to pause the game playing in the Game window. This helps you spot visual problems or gameplay issues that you wouldn’t otherwise see.

    hashtag
    Step

    Step is used to walk through the paused Scene frame by frame. This works really well when you're looking for changes in the game world that are hard to catch in real time.

    hashtag
    Manipulating objects

    These tools move and manipulate the GameObjects in the Scene view. You can click on the buttons to activate them, or use a shortcut key.

    hashtag
    Hand Tool

    Hand Tool Keyboard Shortcut: Q

    You can use this tool to move your Scene around in the window. You can also use the middle mouse button to access the tool.

    hashtag
    Move Tool

    Move Tool Keyboard Shortcut: W

    This tool enables you to select items and move them individually.

    hashtag
    Rotate Tool

    Rotate Tool Keyboard Shortcut: E

    Select items and rotate them with this tool.

    hashtag
    Scale Tool

    Scale Tool Keyboard Shortcut: R

    Tool to scale your GameObjects up and down.

    hashtag
    Rect Transform Tool

    Rect Transform Tool Keyboard Shortcut: T

    This tool does lots of things. Essentially, it combines moving, scaling and rotation into a single tool that’s specialized for 2D and UI.

    hashtag
    Rotate, Move or Scale

    Rotate, Move or Scale Tool Keyboard Shortcut: Y

    This tool enables you to move, rotate, or scale GameObjects, but is more specialized for 3D.

    hashtag
    Focusing on GameObject

    Focusing on a GameObject Keyboard Shortcut: F

    Another useful shortcut is the F key, which enables you to focus on a selected object. If you forget where a GameObject is in your Scene, select it in the Hierarchy. Then, move your cursor over the Scene view and press F to center it.

    hashtag
    Navigating with the mouse

    When you’re in the Scene view, you can also do the following:

    • Left click to select your GameObject in the Scene.

    • Middle click and drag to move the Scene view’s camera using the hand tool.

    For more advice on moving GameObjects in the Scene view, see Scene View Navigationarrow-up-right in the Manual.

    Unity3D Editor Interface
    circle-info

    Hand interactions are currently available only for HoloLens 2 and Oculus devices.

    In the project section, we will create our first hand interactions to scale, move and rotate objects.

    • Concepts

    • Project

    • What could go wrong?

    aka.ms/UnityHandInteractionsarrow-up-right
    Unity Hand Interactions link.
    Windows 10 Education arrow-up-right
    Pro Educationarrow-up-right
    aka.ms/WinHome2Proarrow-up-right
    https://unity3d.com/get-unity/downloadarrow-up-right
    aka.ms/VSDownloadsarrow-up-right
    Check your Windows Version in your System Settings under About.
    Download Unity Hub instead of the Unity Editor.
    Unity Hub Editor Installs
    Unity Hub Projects Page
    Check Universal Windows Platform Build Support and Windows Build Support modules for Unity Editor.
    Unity Android Build Support Modules.
    HoloLens 2 Emulator OverviewMicrosoftLearnchevron-right
    HoloLens Emulator Overview
    What is AR/VR/MR/XR?
    HoloLens 2 Bing Maps project end product.
    Resources
    Setting up your HoloLens 2 development environmentMicrosoftLearnchevron-right