3D Data Visualisation

In this chapter:

  • Getting Started with Data Visualisation

  • Story Concept for Data

  • Interaction Techniques for Data Visualisation

  • Data & 3D printing

Interaction techniques are methods used to accomplish a given task via the interface. They include both hardware and software components. The software components are responsible for translating information from the input devices into system actions that are then displayed to the user. Many of the techniques can be implemented with a variety of different devices; the interaction concept and implementation details are what make them unique.
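
As a minimal illustration of the software side of an interaction technique, the sketch below (plain Python; the controller, scene and data-point names are hypothetical, not tied to any engine) translates a controller pose and trigger press into a ray-cast selection of the nearest data point. Any input device that reports a pose could drive the same technique.

    import math

    def ray_cast_select(origin, direction, points, max_angle_deg=5.0):
        """Return the data point closest to the controller ray, or None.

        origin, direction -- controller pose: position and unit forward vector.
        points            -- list of (x, y, z) data-point positions in world space.
        A point is selectable when it lies within max_angle_deg of the ray.
        """
        best, best_angle = None, math.radians(max_angle_deg)
        for p in points:
            to_p = [p[i] - origin[i] for i in range(3)]
            dist = math.sqrt(sum(c * c for c in to_p)) or 1e-9
            # angle between the ray and the direction towards the point
            cos_a = sum(direction[i] * to_p[i] for i in range(3)) / dist
            angle = math.acos(max(-1.0, min(1.0, cos_a)))
            if angle < best_angle:
                best, best_angle = p, angle
        return best

    def on_trigger_pressed(controller, scene):
        """Hypothetical event handler: a trigger press selects the point under the ray."""
        hit = ray_cast_select(controller.position, controller.forward, scene.data_points)
        if hit is not None:
            scene.highlight(hit)  # the system action that is displayed back to the user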

article 0: Go through this PowerPoint to get a quick overview of the material below.

Data visualisation: basics

Getting into data visualization: where do I start?

Story Concept for Data

article 3: Periodic Table of Visualisation Methods

article 4: Data Visualisations - a game of decisions by Andy Kirk

Example Data Visualisation in VR

Interaction Techniques for Data Visualisation

article 1: Designing Virtual Reality Data Visualisations - Ana Asnes Becker

In this talk, Ana will show a few examples of VR data visualizations that make great use of the medium, discuss some of the challenges of designing for VR, and walk us through the tools used at WSJ to create a browser-based VR data interactive of the Nasdaq.

article 2: Next Reality - Data Visualisation news

A.I. Experiments: Visualising High-Dimensional Space

Wayfinding cues in VR

  • Field of View (FOV): a larger FOV (above roughly 40-80 degrees) reduces the head movement needed; small FOVs can lead to cybersickness.

  • Motion Cues: peripheral vision provides strong motion cues (direction, velocity and orientation during movement), yet additional vestibular cues (inertia and balance, usually related to embodied self-motion) are needed as well.

  • Multisensory Output: tactile maps (maps whose contours are raised so they can be sensed by touch as well as sight). Tactile cues can aid in the formation and use of spatial memory. The use of audio for wayfinding is still an open question.

  • Presence: influential factors include sensory immersion, proprioception and immersion in the experience as a whole. Including the user's own virtual body also enhances presence.

  • Search Strategies: novice users depend strongly on landmarks, whereas skilled users make use of cues like paths (such as a coastline). Deliberate search strategies, for example a bird's-eye view (temporarily moving to a height above the ground), can increase effectiveness; see the sketch below.
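
A minimal sketch of the bird's-eye-view strategy mentioned above (plain Python; the camera object, its position attribute and the overview height are assumptions, not part of any specific engine): the viewpoint is temporarily raised above the ground to survey the environment and later restored.

    class BirdsEyeView:
        """Temporarily lift the viewpoint to an overview height and restore it."""

        def __init__(self, height=50.0):  # assumed overview height above the ground plane
            self.height = height
            self._saved_position = None

        def enter(self, camera):
            """Raise the camera, remembering where it was."""
            self._saved_position = tuple(camera.position)
            x, _, z = camera.position
            camera.position = (x, self.height, z)  # keep x/z, lift along the up axis

        def leave(self, camera):
            """Return the camera to the saved ground-level viewpoint."""
            if self._saved_position is not None:
                camera.position = self._saved_position
                self._saved_position = None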

Environment-Centered Wayfinding Cues

Environment-centered wayfinding cues refer to the conscious design of the virtual world to support wayfinding.

  • Environment legibility: think of paths (linear), edges (enclosing), districts (quickly identifiable), nodes (gathering points) and landmarks (static objects). Great examples come from urban planning and from wayfinding design for traffic hubs such as Schiphol by Paul Mijksenaar.

  • Landmarks: easily distinguishable objects; global landmarks provide directional cues, while local landmarks support decision-making by providing information.

  • Maps: the most common wayfinding aid in daily life, but complex to design for virtual environments. A map need not be a spatial representation; it can also place categorisation and hierarchical structure at its core. Consider the trade-off between environmental clutter and a neat but empty world.

  • Compasses: provide directional cues and combine well with maps.

  • Signs: see the Paul Mijksenaar video under 'environment legibility'.

  • Trails: help users retrace their steps; see the sketch after this list.

  • Reference Objects: well-known objects such as chairs and human figures that help users judge size in virtual reality.
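
A minimal sketch of the trail cue (plain Python; the class and its names are hypothetical): user positions are sampled at a minimum spacing and kept as a breadcrumb list that a renderer could draw, so users can retrace their steps.

    import math

    class Trail:
        """Record breadcrumb positions along the user's path."""

        def __init__(self, min_spacing=1.0):
            self.min_spacing = min_spacing  # drop a crumb at most every min_spacing units
            self.breadcrumbs = []

        def update(self, position):
            """Call once per frame with the user's current world position."""
            if not self.breadcrumbs or math.dist(position, self.breadcrumbs[-1]) >= self.min_spacing:
                self.breadcrumbs.append(tuple(position))

    # A renderer could then draw a small marker at every breadcrumb, e.g.
    # for crumb in trail.breadcrumbs: draw_marker(crumb)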

Reference Frames

  • Virtual-World Reference Frame: matches the layout of the virtual environment and includes geographic directions (e.g. north) and global distances (e.g. meters, inches), independent of how the user is oriented, positioned or scaled.

  • Real-World Reference Frame: defined by the real-world physical space and independent of any user motion (virtual or physical). For example, as a user virtually flies forward, their physical body stays in the same place in the real-world reference frame.

  • Torso Reference Frame: useful for interaction because of proprioception, for example for steering in the direction the body is facing.

  • Hand Reference Frame: defined by the position and orientation of the user's hands; hand-centric judgements occur when holding an object in the hand. It is especially important when using a phone, tablet or VR controller (see the transform sketch after this list).

  • Head Reference Frame: based on the point between the two eyes (the cyclopean eye) and a reference direction perpendicular to the forehead, as used in pilots' head-up displays.

  • Eye Reference Frames: defined by the positions of the eyeballs.
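
As a rough illustration of how these reference frames relate computationally (plain Python; the hand-frame example values are assumptions): a point expressed in a local frame, such as the hand reference frame, is mapped into its parent frame by applying the local frame's orientation and origin, and chaining such transforms converts between any two of the frames listed above.

    def transform_point(rotation, translation, point):
        """Map a point from a local reference frame into its parent frame.

        rotation    -- 3x3 matrix whose columns are the local frame's axes
                       expressed in the parent frame.
        translation -- origin of the local frame expressed in the parent frame.
        """
        rotated = [sum(rotation[r][c] * point[c] for c in range(3)) for r in range(3)]
        return tuple(rotated[i] + translation[i] for i in range(3))

    # Example: a point 10 cm along the hand's forward (local z) axis, with the
    # hand at (1, 1.2, -0.5) in the virtual world, rotated 90 degrees about the
    # vertical axis.
    hand_rotation = [[0, 0, 1],
                     [0, 1, 0],
                     [-1, 0, 0]]
    hand_position = (1.0, 1.2, -0.5)
    marker_in_hand_frame = (0.0, 0.0, 0.1)
    print(transform_point(hand_rotation, hand_position, marker_in_hand_frame))  # (1.1, 1.2, -0.5)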

article 2: Moving in a Box: Improving Spatial Orientation in Virtual Reality using Simulated Reference Frames.

Navigational Search in VR: Do Reference Frames Help?

Dataviz & 3D printing

Making Data Matter: Voxel-Printing for the Digital Fabrication of Data across Scales and Domains

Overview of other Prototyping Tools (2014)

Prototyping Tools (VVVV, MAX-MSP, BUG, Dragonfly, Phidgets)

Immersive Vis and 3D Interaction for Volume Data Analysis
