What is Spatial Mapping?

Spatial Awareness is one of the core features of HoloLens. It allows the device to understand the real world around you.

The Spatial Awareness system builds a collection of meshes that represent the shapes in your environment, allowing developers to create convincing interactions between holograms and real-world objects.
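
For reference, here is a minimal sketch of how those meshes can be reached from code. It assumes MRTK v2's CoreServices API with a mesh observer configured in the Spatial Awareness profile; member names may vary slightly between MRTK versions.

```csharp
using Microsoft.MixedReality.Toolkit;
using Microsoft.MixedReality.Toolkit.SpatialAwareness;
using UnityEngine;

public class SpatialMeshLogger : MonoBehaviour
{
    private void Start()
    {
        // Ask the Spatial Awareness system for its mesh observer
        // (assumes one is configured in the MRTK profile).
        var observer = CoreServices.GetSpatialAwarenessSystemDataProvider<IMixedRealitySpatialAwarenessMeshObserver>();
        if (observer == null)
        {
            Debug.LogWarning("No spatial mesh observer found - is Spatial Awareness enabled in the MRTK profile?");
            return;
        }

        // Each observed mesh is wrapped in a SpatialAwarenessMeshObject
        // that carries a GameObject with a MeshFilter and collider.
        foreach (var meshObject in observer.Meshes.Values)
        {
            Debug.Log($"Spatial mesh {meshObject.Id}: {meshObject.Filter.sharedMesh.vertexCount} vertices");
        }
    }
}
```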

The most common use case is placing holograms on surfaces. If we ignore the room's surfaces, we get strange results: holograms end up behind walls or pass through physical objects. This is bad for two reasons.

  • First, it is very uncomfortable for users, because their eyes constantly fight to focus between the real surface and the hologram.

  • Second, it breaks the illusion that our 3D objects are placed in the real world.

We recommend using spatial awareness to place holograms on top of, or attached to, a surface. Enabling spatial mapping in our apps lets us place holograms in a more natural way, and Scene Understanding can achieve even better results; we'll dive into that in a minute.
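
As a concrete illustration, the sketch below snaps a hologram onto whatever real surface the user is looking at by raycasting against the spatial mesh. It assumes MRTK's default setup, where spatial meshes get colliders on the "Spatial Awareness" physics layer (index 31 by default); the layer index and the SurfacePlacement component name are our assumptions, not part of the lesson.

```csharp
using UnityEngine;

// Hypothetical component: attach to a hologram to place it on the surface the user is gazing at.
public class SurfacePlacement : MonoBehaviour
{
    // MRTK assigns spatial meshes to the "Spatial Awareness" layer, index 31 by default
    // (configurable in the Spatial Awareness profile).
    [SerializeField] private LayerMask spatialMeshLayers = 1 << 31;
    [SerializeField] private float maxDistance = 5f;

    private void Update()
    {
        Transform head = Camera.main.transform;

        // Cast along the user's head gaze and look for a hit on the spatial mesh.
        if (Physics.Raycast(head.position, head.forward, out RaycastHit hit, maxDistance, spatialMeshLayers))
        {
            // Snap to the surface and align the hologram's up direction with the surface normal.
            transform.position = hit.point;
            transform.rotation = Quaternion.FromToRotation(Vector3.up, hit.normal);
        }
    }
}
```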

Make scanning a part of your experience!

Even though HoloLens is constantly scanning your environment, make sure that your app has gathered enough spatial information for the experience to be a great one. So incorporate scanning moments into the flow of your app. These moments can be disguised by asking the user to move around an object or to look in a certain direction, but always remember to keep them short and fun.

Now back to Scene Understanding. If you use Scene Understanding in your apps, you can create magical moments where holograms are automatically placed in the room. For example, you could place a holographic screen on a wall, or even better, sit a virtual character on the user's real sofa!

On HoloLens you can download room data from the device to your PC. This allows you to import the room data as a 3D model into Unity and other development tools and simulate the user's environment. Most importantly, you can test your spatially aware app in different settings and make tweaks to the model inside the editor.

Whenever you spawn objects in your app, make sure that their positions respect the room's bounds. Spawning visible objects behind a real wall is a bad idea, especially if you are using spatial mapping with an occlusion shader: the occlusion shader will hide the object, since it sits behind a real one. Always check that spawned objects stay within the bounds of the room.

Constantly showing the spatial mesh in your app, despite looking cool, can be very noisy and distracting for the user. That's why we recommend showing the mesh only when the user needs to know that it's there, and even then using pulses, so that its appearance is subtle and elegant.

Our research has taught us that users expect holograms to behave like real objects. For example, users stay away from holograms with sharp edges, and they avoid walking through holograms. Make sure that your holograms respect their surroundings, because your users will expect them to do so.
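
To make the "only show the mesh when needed" advice concrete, here is a small sketch that switches the spatial mesh between a visible material (for scanning moments) and a pure occlusion material (for normal use). It assumes an MRTK v2 mesh observer; the helper class name is ours, not MRTK's.

```csharp
using Microsoft.MixedReality.Toolkit;
using Microsoft.MixedReality.Toolkit.SpatialAwareness;

public static class SpatialMeshDisplay
{
    // Show the mesh while scanning; otherwise keep it invisible
    // but still occluding holograms that sit behind real surfaces.
    public static void SetScanningMode(bool scanning)
    {
        var observer = CoreServices.GetSpatialAwarenessSystemDataProvider<IMixedRealitySpatialAwarenessMeshObserver>();
        if (observer == null) return;

        observer.DisplayOption = scanning
            ? SpatialAwarenessMeshDisplayOptions.Visible     // render the visible (wireframe) material
            : SpatialAwarenessMeshDisplayOptions.Occlusion;  // render only the occlusion material
    }
}
```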

In this chapter we've covered some fundamental recommendations for spatially aware apps:

  • Use Scene Understanding to place objects on specific surfaces (a minimal query sketch follows this list).

  • Make scanning a part of your user experience.

  • Test your app in different environments by importing mesh data into your development tool.

  • Spawn objects within the room bounds.

  • Only show the spatial mesh when needed.

  • Make holograms respect their surroundings.
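
For the Scene Understanding recommendation above, a heavily simplified query might look like the sketch below. It assumes the Microsoft.MixedReality.SceneUnderstanding SDK; exact type and member names can differ between SDK versions, so treat this as an outline rather than a drop-in snippet.

```csharp
using System.Threading.Tasks;
using Microsoft.MixedReality.SceneUnderstanding;

public static class WallFinder
{
    // Query the Scene Understanding runtime for wall quads within a 10 m radius of the user.
    public static async Task<Scene> QueryWallsAsync()
    {
        if (await SceneObserver.RequestAccessAsync() != SceneObserverAccessStatus.Allowed)
        {
            return null; // The user (or device) denied scene access.
        }

        var settings = new SceneQuerySettings
        {
            EnableSceneObjectQuads = true,   // flat placement surfaces (walls, floors, platforms)
            EnableSceneObjectMeshes = false, // watertight meshes are not needed for simple placement
            EnableWorldMesh = false
        };

        Scene scene = await SceneObserver.ComputeAsync(settings, 10.0f);

        foreach (SceneObject sceneObject in scene.SceneObjects)
        {
            if (sceneObject.Kind == SceneObjectKind.Wall)
            {
                // Each wall exposes one or more quads whose extents tell you how much free
                // surface is available, e.g. to decide whether a holographic screen fits on it.
            }
        }

        return scene;
    }
}
```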

Spatially aware apps can be incredibly powerful, and if they're designed properly they can truly blend holograms with reality.
