
Mobile AR/MR



In this chapter:

  • Introduction to Mixed Reality
  • Web-based API
  • Tracking: motion, location and outside-inside
  • Light estimation
  • Anchor
  • Spatial Mapping

Introduction to Mixed Reality

Web-based API
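On the web, AR sessions are created through the WebXR Device API. Below is a minimal sketch of session setup; browser support for immersive AR varies, and the optional feature names shown correspond to the WebXR modules used later in this chapter, so treat this as illustrative rather than production code.

```typescript
// Minimal sketch: starting a web-based AR session via the WebXR Device API.
// Assumes a WebXR-capable browser (types available via @types/webxr).

async function startAR(): Promise<XRSession | null> {
  const xr = (navigator as any).xr as XRSystem | undefined;
  if (!xr || !(await xr.isSessionSupported("immersive-ar"))) {
    console.warn("Immersive AR is not supported in this browser.");
    return null;
  }

  // Must be called from a user gesture, e.g. a button click.
  const session = await xr.requestSession("immersive-ar", {
    optionalFeatures: ["hit-test", "anchors", "light-estimation"],
  });

  // WebXR renders through WebGL: attach an XR-compatible GL layer.
  const gl = document
    .createElement("canvas")
    .getContext("webgl", { xrCompatible: true }) as WebGLRenderingContext;
  session.updateRenderState({ baseLayer: new XRWebGLLayer(session, gl) });

  return session;
}
```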

Tracking: motion, location and outside-inside

Tracking

AR relies on computer vision to see the world and recognise the objects in it. The first step in the computer vision process is getting visual information about the surrounding environment to the "brain" inside the device. In immersive technologies, this process of scanning, recognising, segmenting, and analysing environmental information is called tracking. For AR, tracking happens in one of two ways: inside-out tracking and outside-in tracking.

Outside-In Tracking

With outside-in tracking, the cameras or sensors aren't housed within the AR device itself. Instead, they're mounted elsewhere in the space, typically on walls or stands, so they have an unobstructed view of the AR device. They then feed information to the AR device directly or through a computer.

Inside-Out Tracking

With inside-out tracking, cameras and sensors are built right into the body of the device. Smartphones are the most obvious example of this type of tracking: they have cameras for seeing and processors for thinking in one wireless, battery-powered, portable device. On the AR headset side, Microsoft's HoloLens is another device that uses inside-out tracking.

  • Motion tracking: accelerometer, gyroscope & camera
  • Location-based AR: magnetometer & GPS

Simultaneous Localisation and Mapping (SLAM)

This is the process by which technologies like robots and smartphones analyse, understand, and orient themselves to the physical world. SLAM requires data-collecting hardware such as cameras, depth sensors, light sensors, gyroscopes, and accelerometers.
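On a smartphone, the result of this SLAM-style pipeline is exposed to applications as a pose per frame. A short sketch of how a WebXR app would read that pose (assuming a running immersive-ar session like the one above):

```typescript
// Sketch: reading the tracked device pose each frame. The pose is the
// output of the inside-out tracking process described above.

async function trackPose(session: XRSession): Promise<void> {
  // A 'local' space keeps coordinates relative to where tracking began.
  const refSpace = await session.requestReferenceSpace("local");

  session.requestAnimationFrame(function onFrame(_time, frame) {
    const pose = frame.getViewerPose(refSpace);
    if (pose) {
      const { x, y, z } = pose.transform.position; // metres
      console.log(`device at (${x.toFixed(2)}, ${y.toFixed(2)}, ${z.toFixed(2)})`);
    }
    frame.session.requestAnimationFrame(onFrame); // keep the loop running
  });
}
```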

Concurrent Odometry and Mapping (COM)

COM tells a smartphone where it is located in space in relation to the world around it. It does this by capturing visually distinct features in your environment, called feature points. A feature point can be the edge of a chair, a light switch on a wall, the corner of a rug, or anything else that is likely to stay visible and consistently placed in your environment. Any high-contrast visual can serve as a feature point.
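Feature points, and the planes fitted through them, are what an AR hit test intersects when the user taps the screen to place content. A hedged sketch with WebXR's hit-test module (assuming the session requested the 'hit-test' feature):

```typescript
// Sketch: hit testing against the device's map of feature points and
// planes, using the WebXR hit-test module.

async function placeReticle(session: XRSession, refSpace: XRReferenceSpace) {
  // Cast a ray from the centre of the user's view.
  const viewerSpace = await session.requestReferenceSpace("viewer");
  const hitSource = await session.requestHitTestSource!({ space: viewerSpace });

  session.requestAnimationFrame(function onFrame(_time, frame) {
    const hits = frame.getHitTestResults(hitSource);
    if (hits.length > 0) {
      const hitPose = hits[0].getPose(refSpace);
      // hitPose.transform is where the ray met real-world geometry;
      // draw a placement reticle or virtual object there.
    }
    frame.session.requestAnimationFrame(onFrame);
  });
}
```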

Light estimation
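Light estimation means the AR system analyses the camera image to estimate the real scene's lighting (overall intensity, colour, and light direction) so that virtual objects can be shaded to match their surroundings. A minimal sketch using the WebXR light-estimation module (one possible API for this; ARCore and ARKit expose equivalent data natively):

```typescript
// Sketch: sampling estimated real-world lighting via the WebXR
// light-estimation module and feeding it to a renderer.

async function applyLighting(session: XRSession): Promise<void> {
  const probe = await session.requestLightProbe!();

  session.requestAnimationFrame(function onFrame(_time, frame) {
    const estimate = frame.getLightEstimate!(probe);
    if (estimate) {
      const dir = estimate.primaryLightDirection;         // dominant light direction
      const rgb = estimate.primaryLightIntensity;         // its RGB intensity
      const sh = estimate.sphericalHarmonicsCoefficients; // ambient light (SH)
      // Feed dir/rgb/sh into the virtual scene's light setup here.
    }
    frame.session.requestAnimationFrame(onFrame);
  });
}
```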

Anchor
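An anchor pins virtual content to a fixed real-world position. Because the device's map of the room keeps improving, the system updates the anchor's pose over time, and the attached content stays put instead of drifting. A sketch with the WebXR anchors module (assuming the 'anchors' feature and a hit-test result as above):

```typescript
// Sketch: creating an anchor at a hit-test result and re-resolving
// its pose every frame (WebXR anchors module).

const anchors = new Set<XRAnchor>();

async function anchorAtHit(hit: XRHitTestResult): Promise<void> {
  const anchor = await hit.createAnchor!();
  if (anchor) anchors.add(anchor);
}

function drawAnchoredContent(frame: XRFrame, refSpace: XRReferenceSpace): void {
  for (const anchor of anchors) {
    // Resolve the anchor to a pose each frame: the system nudges it as
    // its understanding of the environment improves.
    const pose = frame.getPose(anchor.anchorSpace, refSpace);
    if (pose) {
      // render the attached virtual object at pose.transform
    }
  }
}
```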

Spatial Mapping

HoloLens
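On HoloLens, spatial mapping builds a triangle mesh of the surfaces around the user, which applications use for occlusion, physics, and placement. The closest (and much simpler) web counterpart is plane detection; a sketch, assuming a browser that implements the WebXR plane-detection module:

```typescript
// Sketch: enumerating detected real-world planes with the WebXR
// plane-detection module, a lightweight cousin of HoloLens spatial
// mapping (which produces full surface meshes, not flat polygons).

function listDetectedPlanes(frame: XRFrame, refSpace: XRReferenceSpace): void {
  const planes = (frame as any).detectedPlanes as Set<XRPlane> | undefined;
  if (!planes) return; // feature not granted or not supported

  for (const plane of planes) {
    const pose = frame.getPose(plane.planeSpace, refSpace);
    if (pose) {
      // plane.polygon holds the boundary vertices in the plane's own space.
      console.log(plane.orientation, plane.polygon.length, pose.transform.position);
    }
  }
}
```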

Related links:

  • Tracking in AR
  • Environmental understanding: feature points and plane-finding
  • Light Estimation
  • Anchors
  • Microsoft HoloLens: Spatial Mapping
  • What is Mixed Reality?
  • Mixed Reality in the Workspace, Mark Billinghurst
  • Introduction to Augmented Reality, Mark Billinghurst
  • Developing AR and VR experiences with Unity, Mark Billinghurst
  • AR Interaction, Mark Billinghurst
  • 3D graphics libraries
  • WebGL