
Testing (II): methods for testing



User testing


1. Cognitive Walkthrough

Cognitive Walkthrough is a formal method for evaluating a user interface without involving users.

  • Focuses on first-time use

  • Task-oriented: requires tasks and walkthrough scenarios

  • Asks whether users will be able to follow the scenario: can you tell a believable story?

  • Evaluators must be aware of user capabilities

Figure: Stages of Action Model (Norman, 2001)
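To make the walkthrough concrete, the sketch below records each step of a scenario together with the four questions commonly asked at every step of a cognitive walkthrough and flags the steps where the "believable story" breaks down. All names and the example task are illustrative, not part of any standard tool.

```python
from dataclasses import dataclass, field

# The four questions commonly asked at every step of a cognitive walkthrough;
# the data layout around them is an invented, minimal format.
WALKTHROUGH_QUESTIONS = [
    "Will the user try to achieve the right effect?",
    "Will the user notice that the correct action is available?",
    "Will the user associate the correct action with the intended effect?",
    "If the correct action is performed, will the user see progress?",
]

@dataclass
class WalkthroughStep:
    action: str                                                 # what the user must do
    answers: list = field(default_factory=lambda: [None] * 4)   # True / False / None per question
    notes: str = ""

def report(task_name: str, steps: list) -> None:
    """Print every step whose 'believable story' breaks down (a 'no' answer)."""
    print(f"Cognitive walkthrough: {task_name}")
    for i, step in enumerate(steps, start=1):
        for question, answer in zip(WALKTHROUGH_QUESTIONS, step.answers):
            if answer is False:
                print(f"  Step {i} ({step.action}) fails: {question}  Notes: {step.notes}")

# Example: first-time user teleporting to a doorway in a VR scene
steps = [
    WalkthroughStep("point the controller at the doorway", [True, False, True, True],
                    "teleport arc is not visible until the thumbstick is pushed"),
    WalkthroughStep("press the thumbstick to teleport", [True, True, True, True]),
]
report("Enter the next room", steps)
```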

2. Heuristic Evaluation VR

Heuristic or guidelines-based expert evaluation is a method in which several usability experts separately evaluate a UI design by applying a set of heuristics or design guidelines that are either general enough to apply to any UI or are tailored for 3D UIs in particular. No representative users are involved. Below, a few different heuristic setups are presented. Have a look at them and then choose the one that suits your needs.
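As a minimal sketch of how the separate expert ratings can be merged afterwards, the Python below collects each evaluator's independent findings and ranks the heuristics by mean severity. The heuristic texts, severity scale and findings are invented placeholders; substitute the heuristic set you pick from the resources at the end of this page.

```python
from collections import defaultdict
from statistics import mean

# Placeholder heuristics; replace with the VR-specific set you select.
heuristics = {
    "H1": "User can tell where they are able to move or teleport",
    "H2": "System feedback is immediate and matches head/hand movement",
    "H3": "Interface text is readable at the distance it is shown",
}

# Each expert evaluates the UI separately: (heuristic id, severity 0-4, note)
expert_findings = {
    "expert_A": [("H2", 3, "noticeable lag when rotating the head"),
                 ("H3", 2, "menu text too small at 2 m")],
    "expert_B": [("H2", 4, "lag causes discomfort"),
                 ("H1", 1, "teleport targets unclear in the dark scene")],
}

# Merge findings per heuristic and rank by mean severity
merged = defaultdict(list)
for expert, findings in expert_findings.items():
    for hid, severity, note in findings:
        merged[hid].append((severity, f"{expert}: {note}"))

for hid, items in sorted(merged.items(), key=lambda kv: -mean(s for s, _ in kv[1])):
    severities = [s for s, _ in items]
    print(f"{hid} ({heuristics[hid]}): mean severity {mean(severities):.1f}")
    for _, note in items:
        print(f"    {note}")
```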

3. Formative Evaluation

Formative Evaluation is an observational, empirical evaluation method, applied while the design is still evolving, that assesses user interaction by iteratively placing representative users in task-based scenarios in order to identify usability problems, as well as to assess the design's ability to support user exploration, learning and task performance. It can be done in either a formal or an informal way.
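A lightweight way to support formative sessions is to log every observed usability incident per participant and task, so that problems can be compared across design iterations. The sketch below is illustrative only; the column names and severity scale are assumptions, not a prescribed format.

```python
import csv
from datetime import datetime

# Minimal observation log for a formative session: one row per usability
# incident observed while a representative user works through a task scenario.
FIELDS = ["timestamp", "participant", "task", "observation", "severity"]

def log_incident(path, participant, task, observation, severity):
    """Append one observed usability problem to a CSV log."""
    with open(path, "a", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=FIELDS)
        if f.tell() == 0:            # write the header once, on first use
            writer.writeheader()
        writer.writerow({
            "timestamp": datetime.now().isoformat(timespec="seconds"),
            "participant": participant,
            "task": task,
            "observation": observation,
            "severity": severity,    # e.g. 1 (cosmetic) .. 4 (task-blocking)
        })

# Example use during a session
log_incident("formative_round1.csv", "P03", "place object on shelf",
             "released grip too early; object fell through the shelf", 3)
```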

4. Summative Evaluation

Summative evaluations do what their name suggests: they "sum up" or statistically compare two or more different configurations of a user interface design, components of the design, or specific interaction techniques, by having representative users try out each version while evaluators collect quantitative and qualitative information. They can be applied informally, usually collecting just qualitative data, or more formally, also collecting quantitative data (time on task, error rate). They specifically differ from formative evaluations in that they must compare more than one design.

Research involving users is crucial for virtual environments, perhaps more so than for other types of interfaces, because the technology is relatively new and varied, which makes genuine VR expertise scarce. For this reason, summative evaluations are particularly important: they help to compare specific I/O combinations and/or interaction techniques. They typically take place after user interface designs are complete and compare specific differences between configurations. As such, they are most appropriate for late-stage prototypes in which general usability has been established but specific interaction or interface questions persist. Comparing 3D UIs requires a consistent set of user task scenarios. In short: summative evaluation requires representative users, is generic, and yields qualitative and/or quantitative data.
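As a worked illustration of the quantitative side, the sketch below compares time on task for two hypothetical interaction techniques with an independent-samples t-test. The data are invented, and it assumes SciPy is installed.

```python
from statistics import mean, stdev
from scipy.stats import ttest_ind   # assumes SciPy is available

# Invented example data: time on task (seconds) per participant for two
# selection techniques being compared in a summative study.
ray_casting = [12.4, 15.1, 11.8, 14.0, 13.3, 16.2, 12.9, 14.7]
direct_grab = [10.2, 11.5, 9.8, 12.1, 10.9, 11.0, 12.4, 10.5]

for name, data in [("ray casting", ray_casting), ("direct grab", direct_grab)]:
    print(f"{name}: mean {mean(data):.1f} s, sd {stdev(data):.1f} s")

# Independent-samples t-test: is the difference in time on task significant?
result = ttest_ind(ray_casting, direct_grab)
print(f"t = {result.statistic:.2f}, p = {result.pvalue:.3f}")   # small p suggests a reliable difference
```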

5. Task Analysis

User research should focus on collecting the following five types of data, which you will use later during the task analysis phase (a minimal way to record them in code is sketched after this list):

  • Trigger: What prompts users to start their task?

  • Desired Outcome: How will users know when the task is complete?

  • Base Knowledge: What will the users be expected to know when starting the task?

  • Required Knowledge: What do the users actually need to know in order to complete the task?

  • Artifacts: What tools or information do the users utilize during the course of the task?
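A minimal sketch of how these five types of data can be recorded per task follows; the field names mirror the list above, and the example values are invented.

```python
from dataclasses import dataclass

@dataclass
class TaskResearchRecord:
    """One task's user-research data, following the five categories above."""
    task: str
    trigger: str             # what prompts users to start the task
    desired_outcome: str     # how users know the task is complete
    base_knowledge: str      # what users are expected to know at the start
    required_knowledge: str  # what users actually need to know to finish
    artifacts: list          # tools or information used along the way

record = TaskResearchRecord(
    task="Invite a friend into a shared VR room",
    trigger="Friend appears online in the social panel",
    desired_outcome="Friend's avatar is visible and audible in the room",
    base_knowledge="How to open the wrist menu",
    required_knowledge="Where the invite button lives inside the social panel",
    artifacts=["wrist menu", "social panel", "notification sound"],
)
print(record)
```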

According to the UXPA's Usability Body of Knowledge site, the process of task analysis can be broken down into the following steps (a small tree sketch of the decomposition follows the list):

  • Identify the task to be analyzed: Pick a persona and scenario for your user research, and repeat the task analysis process for each one. What is that user’s goal and motivation for achieving it?

  • Break this goal (high-level task) down into subtasks: You should have around 4–8 subtasks after this process. If you have more, then it means that your identified goal is too high-level and possibly too abstract. As Don Norman (1998) said, users are notoriously bad at clearly articulating goals: e.g., ”I want to be a good mom” – where do you even begin? Each subtask should be specified in terms of objectives. Put together, these objectives should cover the whole area of interest—i.e., help a user achieve a goal in full.

  • Draw a layered task diagram of each subtask and ensure it is complete: You can use any notation you like for the diagram, since there is no real standard here. Larry Marine shares some helpful advice on the notation he uses.

  • Write the story: A diagram is not enough. Many of the nuances, motivations and reasons behind each action are simply lost, because a diagram only depicts the actions, not the reasons behind them. Make sure you accompany your diagram with a full narrative that focuses on the whys.

  • Validate your analysis: Once you’re happy with your work, review the analysis with someone who was not involved in the decomposition, but who knows the tasks well enough to check for consistency. This person can be another team member working on the same project, but you could also enlist the help of actual users and stakeholders for this purpose.
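As a small illustration of the second and third steps, the decomposition can be kept as a simple goal, subtask and action tree and printed as a layered outline. The goal, subtasks and actions below are invented examples.

```python
# Illustrative hierarchical task decomposition: a goal broken into subtasks,
# each subtask into concrete actions. Structure and names are invented.
task_tree = {
    "goal": "Set up a multiplayer VR session with a friend",
    "subtasks": [
        {"name": "Put on and calibrate the headset",
         "actions": ["adjust straps", "set floor height", "confirm guardian boundary"]},
        {"name": "Find the friend in the social panel",
         "actions": ["open wrist menu", "open friends tab", "select friend"]},
        {"name": "Send and confirm the invite",
         "actions": ["press invite", "wait for acceptance notification"]},
        {"name": "Join the shared room together",
         "actions": ["accept room transition", "verify friend's avatar is present"]},
    ],
}

def print_tree(tree):
    """Print the layered decomposition as an indented outline."""
    print(tree["goal"])
    for i, sub in enumerate(tree["subtasks"], start=1):
        print(f"  {i}. {sub['name']}")
        for action in sub["actions"]:
            print(f"     - {action}")

print_tree(task_tree)
```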


Resources

  • Cognitive walkthrough procedure
  • Heuristic Evaluation process
  • Comprehensible Heuristics for VR systems (Murtza, Youmans & Monroe, 2018)
  • Online VR Heuristic Evaluation Tool
  • Heuristics specified for Virtual Reality (2004, accessed 18 Jul 2018)
  • Case study: Evaluation of the therapist manual and interfaces of the Rutgers Ankle Rehabilitation System (RARS)
  • Task Analysis
  • Virtually Usable: A Review of Virtual Reality Usability Evaluation Methods (Dana Martens, 2016)
  • Basics of Usability Testing
  • Tools for prototyping & testing (e.g. Role-Playing, Processing and Advanced Interface Technology)
  • Heuristic Evaluation for Gameful Design (PDF)
  • Testing in 1989