Concepts

How to enable AR and Hit-test?

Similar to the VR experience, you can add an ARButton to enable AR experiences. Additionally, you can specify the required and optional AR features your experience will use.

import { ARButton } from "/jsm/webxr/ARButton";

document.body.appendChild(ARButton.createButton(renderer, { requiredFeatures: ["hit-test"] }));

Add a controller: getController returns a Group representing the so-called target ray space of the controller. Use this space for visualizing 3D objects that support the user in pointing tasks like UI interaction.

	controller = renderer.xr.getController(0);
	controller.addEventListener("select", onSelect);
	scene.add(controller);

When a surface is hit, we will add a reticle to the scene to indicate the surface.

	// Hit-test indicator: a flat ring placed on detected surfaces
	// (RingGeometry is the current name for the deprecated RingBufferGeometry)
	reticle = new Mesh(new RingGeometry(0.15, 0.2, 32).rotateX(-Math.PI / 2), new MeshBasicMaterial());
	reticle.matrixAutoUpdate = false; // its pose comes straight from the hit-test result
	reticle.visible = false;
	scene.add(reticle);

Let's define the onSelect event handler that we attached to the controller. When the select event fires, the user has decided to place the object, and we create it at the chosen location.

Finally, in our render function, we check on every XRFrame whether we have a hit-test source, and if so, display the reticle on the detected surface.

To run the code on your device, grant access to the camera when prompted.

function onSelect() {
	if (reticle.visible) {
		const mesh = new Mesh(geometry, phongMaterial);
		mesh.position.setFromMatrixPosition(reticle.matrix);
		mesh.scale.y = Math.random() * 2 + 1;
		scene.add(mesh);
	}
}
function render(timestamp: number, frame: any) {
	if (frame) {
		earth.visible = false;
		const referenceSpace = renderer.xr.getReferenceSpace();
		const session = renderer.xr.getSession();
		if (hitTestSourceRequested === false) {
			session.requestReferenceSpace("viewer").then((referenceSpace) => {
				session.requestHitTestSource({ space: referenceSpace }).then((source) => {
					hitTestSource = source;
				});
			});

			session.addEventListener("end", () => {
				hitTestSourceRequested = false;
				hitTestSource = null;
			});
			hitTestSourceRequested = true;
		}
		if (hitTestSource) {
			const hitTestResults = frame.getHitTestResults(hitTestSource);
			if (hitTestResults.length) {
				const hit = hitTestResults[0];
				reticle.visible = true;
				reticle.matrix.fromArray(hit.getPose(referenceSpace).transform.matrix);
			} else {
				reticle.visible = false;
			}
		}
	}
	renderer.render(scene, camera);
}

How to load a 3D Model

Only a few loaders (e.g. ObjectLoader) are included by default with three.js — others should be added to your app individually.

Once you've imported a loader, you're ready to add a model to your scene. Syntax varies among loaders; when using another format, check the examples and documentation for that loader.
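For glTF, a minimal sketch of basic GLTFLoader usage follows. The `loadModel` helper name and the model path are illustrative, not from the original; the loader and scene are passed in so the wiring is explicit.

```javascript
// Hypothetical helper sketching basic GLTFLoader usage.
// `loader` is a GLTFLoader instance, `scene` is a three.js Scene,
// and `url` is a placeholder model path.
function loadModel(loader, scene, url) {
	loader.load(
		url,
		function (gltf) {
			// onLoad: add the loaded scene graph to our scene
			scene.add(gltf.scene);
		},
		undefined, // onProgress callback (optional)
		function (error) {
			// onError: always log failures so missing models are visible
			console.error(error);
		}
	);
}

// In the browser: loadModel(new GLTFLoader(), scene, "models/scene.gltf");
```
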

Change the onSelect function to load and place the model instead of the sphere mesh we were placing previously.

We can also add event callbacks to the LoadingManager.

See the GLTFLoader documentation for further details.

Exercise

Troubleshooting

You've spent hours modeling an artisanal masterpiece, you load it into the webpage, and — oh no! 😭 It's distorted, miscolored, or missing entirely. Start with these troubleshooting steps:

  1. Check the JavaScript console for errors, and make sure you've used an onError callback when calling .load() to log the result.

  2. View the model in another application. For glTF, drag-and-drop viewers are available for three.js and babylon.js. If the model appears correctly in one or more applications, file a bug against three.js. If the model cannot be shown in any application, we strongly encourage filing a bug with the application used to create the model.

  3. Try scaling the model up or down by a factor of 1000. Many models are scaled differently, and large models may not appear if the camera is inside the model.

  4. Try adding and positioning a light source. The model may be hidden in the dark.

  5. Look for failed texture requests in the network tab, like C:\\Path\To\Model\texture.jpg. Use paths relative to your model instead, such as images/texture.jpg — this may require editing the model file in a text editor.

import { GLTFLoader } from '/jsm/loaders/GLTFLoader.js';

//Model loader
const manager = new LoadingManager();
const loader = new GLTFLoader(manager).setPath("/assets/models/AyaSofia/");
let modelLoaded = false;
function onSelect() {
	if (reticle.visible && !modelLoaded) {
		loader.load(
			"GM_poly.gltf",
			function (gltf) {
				gltf.scene.children[0].position.setFromMatrixPosition(reticle.matrix);
				scene.add(gltf.scene);
				modelLoaded = true;
			},
			undefined,
			function (error) {
				console.error(error);
			}
		);
	}
}
manager.onStart = function (url, itemsLoaded, itemsTotal) {
	console.log("Started loading file: " + url + ".\nLoaded " + itemsLoaded + " of " + itemsTotal + " files.");
};

manager.onLoad = function () {
	console.log("Loading complete!");
};

manager.onProgress = function (url, itemsLoaded, itemsTotal) {
	console.log("Loading file: " + url + ".\nLoaded " + itemsLoaded + " of " + itemsTotal + " files.");
};

manager.onError = function (url) {
	console.error("There was an error loading " + url);
};

What could go wrong?

If you are running into issues, check out the scenarios below that might be causing the problem.

Serving your site using http

You will not be able to see your site if you are serving it over http. The WebXR APIs require a secure https server.

What is XRReferenceSpaceType?

XRReferenceSpaceType defines how much your user can move in your experience. Each type is backed by an interface:

viewer (interface: XRReferenceSpace)
A tracking space whose native origin tracks the viewer's position and orientation. This is used for environments in which the user can physically move around, and is supported by all instances of XRSession, both immersive and inline, though it's most useful for inline sessions. It's particularly useful when determining the distance between the viewer and an input, or when working with offset spaces. Otherwise, typically, one of the other reference space types will be used more often.

local (interface: XRReferenceSpace)
A tracking space whose native origin is located near the viewer's position at the time the session was created. The exact position depends on the underlying platform and implementation. The user isn't expected to move much, if at all, beyond their starting position, and tracking is optimized for this use case. For devices with six degrees of freedom (6DoF) tracking, the local reference space tries to keep the origin stable relative to the environment.

local-floor (interface: XRReferenceSpace)
Similar to the local type, except the starting position is placed in a safe location for the viewer to stand, where the value of the y axis is 0 at floor level. If that floor level isn't known, the user agent will estimate the floor level. If the estimated floor level is non-zero, the browser is expected to round it in such a way as to avoid fingerprinting (likely to the nearest centimeter).

bounded-floor (interface: XRBoundedReferenceSpace)
Similar to the local type, except the user is not expected to move outside a predetermined boundary, given by the boundsGeometry in the returned object.

unbounded (interface: XRReferenceSpace)
A tracking space which allows the user total freedom of movement, possibly over extremely long distances from their origin point. The viewer isn't tracked at all; tracking is optimized for stability around the user's current position, so the native origin may drift as needed to accommodate that need.

Which Devices are Compatible with WebXR?

WebXR-compatible devices include fully immersive 3D headsets (VR headsets) with motion and orientation tracking; augmented reality glasses, like HoloLens and MagicLeap, which overlay graphics atop the real-world scene passing through the frames; and AR-capable (ARCore- and ARKit-supported) handheld phones, which augment reality by capturing the world with a camera and adding computer-generated imagery to that scene.

iOS devices such as iPhone and iPad currently do not support the WebXR APIs in Safari or Chrome, but the WebXR Viewer app is available from the App Store: https://apps.apple.com/us/app/webxr-viewer/id1295998056

Virtual Reality WebXR Navigation and Teleportation Demo

Augmented Reality on Android Devices
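An XRReferenceSpaceType is requested from the active XRSession. As a hedged sketch, the `getReferenceSpace` helper below and its fallback logic are illustrative, not part of the spec:

```javascript
// Illustrative helper: request a preferred XRReferenceSpaceType,
// falling back to a more widely supported one if it is rejected.
async function getReferenceSpace(session, preferred = "local-floor", fallback = "local") {
	try {
		return await session.requestReferenceSpace(preferred);
	} catch (err) {
		// The preferred type isn't supported on this device/session.
		return session.requestReferenceSpace(fallback);
	}
}
```

With three.js you would typically let `renderer.xr` manage the reference space, but requesting one directly is useful for hit testing, as in the viewer-space request shown earlier.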

Which Browsers support WebXR?

Different browsers are implementing the WebXR APIs on different timelines. Currently, Chrome and the new Edge browser have the WebXR APIs turned on by default, and some of the features are behind experimental flags.

You can check the current support status at CanIUse.com.

How to try out experimental features in Chrome and Edge

You can turn on experimental flags by navigating to chrome://flags/ or edge://flags/, searching for the flag you want to enable, and choosing Enabled from the drop-down menu.

What is the Lifecycle of a WebXR Application?

The basic steps most WebXR applications will go through are:

  1. Query to see if the desired XR mode is supported.

  2. If support is available, advertise XR functionality to the user.

  3. A user-activation event indicates that the user wishes to use XR.

  4. Request an immersive session from the device.

  5. Use the session to run a render loop that produces graphical frames to be displayed on the XR device.

  6. Continue producing frames until the user indicates that they wish to exit XR mode.

  7. End the XR session.
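The steps above can be sketched against the standard WebXR Device API. Here `xrSystem` stands in for `navigator.xr` (parameterized for clarity), and the function name is illustrative:

```javascript
// Illustrative sketch of the WebXR application lifecycle.
async function runXRLifecycle(xrSystem, mode = "immersive-vr") {
	// 1. Query whether the desired XR mode is supported.
	const supported = await xrSystem.isSessionSupported(mode);
	if (!supported) return null; // step 2 would hide or disable the XR button

	// 3.-4. After a user-activation event, request an immersive session.
	const session = await xrSystem.requestSession(mode);

	// 5.-6. The render loop (e.g. renderer.setAnimationLoop) produces
	// frames until the user exits; 7. the "end" event fires when it ends.
	session.addEventListener("end", () => console.log("XR session ended"));
	return session;
}
```
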

How to enable VR?

There are a few changes we need to make to turn our experience into a VR experience. Enable XR on the renderer and append the VRButton to your HTML body:

import { VRButton } from "/jsm/webxr/VRButton";
renderer.xr.enabled = true;
document.body.appendChild(VRButton.createButton(renderer));

Run the VR experience on your phone

If you are on an iOS

WebXR Device APIs

What are WebXR Device APIs?

WebXR is a group of standards, implemented by browsers, which are used together to support rendering 3D scenes on hardware designed for mixed reality. Mixed reality devices either present virtual worlds (virtual reality, or VR) or add graphical imagery to the real world (augmented reality, or AR).

The WebXR Device API implements the core of the WebXR feature set: managing the selection of output devices, rendering the 3D scene to the chosen device at the appropriate frame rate, and managing input such as controllers and hand interactions.

The WebXR Device APIs replace the deprecated WebVR API, which was designed with only VR devices in mind. With the addition of new AR headsets and AR-capable handheld devices, the WebVR API was deprecated in favor of the WebXR Device APIs, which include the AR modules.
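Because some older browsers expose only the deprecated API, a simple feature check distinguishes the two. This `detectXRSupport` helper is illustrative, not from the original:

```javascript
// Illustrative feature detection for WebXR vs. the deprecated WebVR API.
// `nav` is passed in (normally `navigator`) so the check is testable.
function detectXRSupport(nav) {
	if (nav.xr) return "webxr";            // WebXR Device API
	if (nav.getVRDisplays) return "webvr"; // deprecated WebVR API
	return "none";
}
```
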

  • Introduction

  • WebXR Device APIs:

  • Immersive Devices

  • VR Headsets for Mobile Devices.

  • Oculus Quest

  • Augmented Reality(AR) or Mixed Reality(MR) Headsets

  • WebXR with Mobile Devices

  • Pokemon Go

  • WebVR vs WebXR

  • Virtual Reality on the Web

  • Mozilla Hello WebXR Demo:

  • WebXR Features (11:30): Gamepad API

  • Input Profiles Library

  • WebAR Module

  • Hit Test

  • How to Get Started Building WebXR Experiences

  • WebGL

  • WebXR Libraries

  • ThreeJS:

  • A-Frame:

  • BabylonJS:

  • React 360:

  • PlayCanvas:

  • More on A-Frame

  • A-Frame Hello World Code

  • Future of WebXR APIs

  • WebXR Accessibility and DOM Overlay API

  • Lighting Estimation Using Computer Vision

  • Anchors

  • Layers

  • Hand Interactions

  • WebXR Resources (30:10): How to Get Involved with Immersive Web Working and Community Groups

ThreeJS glTF loader demo

How to debug and test your WebXR Application with Chrome Dev Tools?

Chrome Dev Tools Debugging an AR app

Project

Create your first AR & VR applications on the Web

In this section, we will turn our 3D experience from the last section into an immersive experience. We will develop on our local device, a laptop or desktop, and run the code locally.

hashtag
Prerequisites

You can follow along with the tutorial using an online editor. If you want to learn how to develop in your local environment, please follow the checklist below.

3D on the Web Resources

Android
iOS
Samsung
HoloLens
Oculus
HP

An introduction to WebXR APIs and feature samples