IT: system control
Last updated
article 0: Go through this PowerPoint to get a quick overview of the material below.
article 0: Foundational principles of Cybernetics, Paul Pangaro
System Control can be defined as the user task in which commands are issued to:
(1) request the system to perform a particular function
(2) change the mode of interaction
(3) change the system state.
The key word in this definition is command. In selection, manipulation, and travel tasks, the user typically specifies not only what should be done but also how it should be done, more or less directly controlling the action. In system control tasks, the user typically specifies only what should be done and leaves it up to the system to determine the details. System control is therefore considered an explicit action instead of an implicit one.
The Future of Cybernetics, Paul Pangaro: Man-Machine Symbiosis
Meaningful Human Control over Autonomous Systems: A Philosophical Account
In 2D interfaces, system control is supported by the use of a specific interaction style, such as pull-down menus, text-based command lines, or tool palettes. Many of these interaction styles have also been adapted to 3D UIs to provide a range of system control elements, which may be highly suitable for desktop-based 3D UIs. 2D methods may also be appropriate in handheld AR applications, where the application often relies on screen-based (touch) input. But for immersive applications in particular, WIMP (windows-icons-menus-pointer)-style interaction may not always be effective.
In immersive VR, users have to deal with 6-DOF input as opposed to 2-DOF on the desktop. Traditional 2D system control methods and non-conventional 3D system control techniques might be combined, so that hybrid interaction techniques come into being. Two categories of factors influence the effectiveness of all techniques:
Human Factors (see chapter Human Factors, concerning perceptual, cognitive, and ergonomic issues)
System Factors
article 1: The information about menus below comes from this article
command & control cube
TULIP menu
collapsible cylindrical trees
ToolFinger
Spin Menu
generalized 3D carousel view
WIMP-solutions: pop-up and pull-down menus
Windows, Widgets & Menus
Floating menus
Virtual Windtunnel Menus
Tear-off Palette
Ring menus
Spin menus
3D fade-up menu
1. Users and User Tasks
navigation
locomotion
selection
manipulation
user tasks
2. User Interface Input Mechanisms
tracking user location & orientation
speech recognition & natural language input
interface mechanisms in general
pointers, clicks & props, such as magic wands, flying mice, space balls, and real-world props
data gloves & gestural recognition
3. Virtual Model
system information
user representation & presentation
agent representation & behaviour
virtual surrounding & setting
4. User Interface Presentation Components
visual feedback & graphical presentation
haptic feedback & force and tactile presentation
aural feedback & acoustic presentation
environmental feedback and other presentation.
The taxonomy of 3D menus consists of the following categories: 1. Intention of Use 2. Appearance and Structure 3. Placement 4. Invocation of non-visible menus 5. Interaction and I/O setting 6. Usability (see Chapter on Testing)
1. Intention of Use
Intention of use: What does the designer want the user to choose from and for which purpose?
Number of displayed items in the menu
Hierarchical nature of menu // Temporary option menus: The menu is only invoked for a short time and vanishes after the selection.
Hierarchical nature of menu // Single menus: Basically the same as the first type, but displayed for a longer time or even visible all the time. The number of selectable items can be greater than with the first menu type, and arbitrary items can also be displayed. This type includes toolbars and tool palettes.
Hierarchical nature of menu // Menu systems: This is the same as the second type but extended to contain a submenu for each entry (if appropriate). That is, menu systems are menu hierarchies with a depth of 2. This is exemplified by the revolving stage/rondel.
Hierarchical nature of menu // Menu hierarchies: These menus allow an arbitrary number of items, which are arranged in an arbitrary number of submenus (depth of hierarchy >= 3). This type resembles well-known menu solutions from traditional desktop environments. (Cascading, tree-structure)
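The four hierarchical types above differ only in the depth of the underlying item tree. A minimal sketch of that classification (hypothetical names, not code from any of the cited systems):

```python
from dataclasses import dataclass, field

@dataclass
class MenuItem:
    label: str
    children: list["MenuItem"] = field(default_factory=list)

def depth(item: MenuItem) -> int:
    """Depth of the menu tree rooted at this item (a leaf has depth 0)."""
    if not item.children:
        return 0
    return 1 + max(depth(c) for c in item.children)

def classify(root: MenuItem) -> str:
    """Map tree depth onto the taxonomy's hierarchical categories."""
    d = depth(root)
    if d <= 1:
        return "temporary option menu / single menu"  # flat list of items
    if d == 2:
        return "menu system"    # one level of submenus (e.g. the rondel)
    return "menu hierarchy"     # depth >= 3, cascading / tree-structured
```

For example, a flat list of commands classifies as a temporary option or single menu, while adding one level of submenus turns it into a menu system.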
2. Appearance and Structure
Geometric structure and layout have a significant influence on memorability and interaction speed. Relevant properties are the geometric structure and the structural layout: acyclic or cyclic lists, matrices, free arrangements, and layouts following the geometric structure. Size and spacing of menu items play an important role in selection, space consumption, and usability of the menus. The type of displayed data is an important property as well. We distinguish between menu options appearing as:
3D-objects
Text entries
Images
Images and text combined
3D-objects and text combined
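To illustrate the structural layouts mentioned above, a cyclic (ring) layout and a matrix layout can be sketched as simple position generators (hypothetical helper names; 2D coordinates for brevity):

```python
import math

def ring_layout(n_items: int, radius: float = 1.0):
    """Positions for a cyclic (ring) layout: items evenly spaced on a circle."""
    return [
        (radius * math.cos(2 * math.pi * i / n_items),
         radius * math.sin(2 * math.pi * i / n_items))
        for i in range(n_items)
    ]

def matrix_layout(n_items: int, cols: int, spacing: float = 1.0):
    """Positions for a matrix layout: row-major grid with uniform spacing."""
    return [(spacing * (i % cols), -spacing * (i // cols))
            for i in range(n_items)]
```

The choice between the two trades selection speed (ring menus equalize the distance to every item) against space consumption (matrices pack more items per screen area).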
3. Placement
surround-fixed
display-fixed
world-fixed windows
According to the extended placement options by Bowman et al., menus can be placed in the following ways:
world-referenced (most desktop VR menus).
object-referenced (e.g. combo box).
head-referenced (e.g. look-at-menu).
body-referenced (e.g. TULIP).
device-referenced (e.g. tool menu of the responsive workbench).
PIP tool-palette.
fade-up menu.
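These placement options all come down to interpreting the menu's offset in a different reference frame. A minimal sketch, assuming tracked frame origins are available and simplifying poses to positions (hypothetical names, not an API from any cited system):

```python
def menu_world_position(offset, reference, frames):
    """Resolve a menu offset to world coordinates.

    reference: one of "world", "object", "head", "body", "device".
    frames: dict mapping each non-world reference to that frame's origin
    in world coordinates (in a real system, this is live tracking data).
    """
    if reference == "world":
        return offset                  # world-referenced: offset is absolute
    origin = frames[reference]         # tracked origin of the chosen frame
    return (origin[0] + offset[0],
            origin[1] + offset[1],
            origin[2] + offset[2])
```

A head-referenced menu, for instance, keeps the same offset while the head frame's origin changes, so the menu follows the user's view; a world-referenced menu stays put as the user moves.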
4. Invocation of non-visible menus (by animation for example)
Selecting an icon or other miniature.
Context dependent activation related to either an object, other menu (for submenus) or some specific background.
Free activation at an arbitrary point (menu hidden).
No action, i.e. the menu is persistently visible.
5. Interaction and I/O setting
Have a look at the Tinmith video (around 2006)
Hand-held menus These improve upon the previously mentioned solutions by allowing a virtual menu (usually an object palette) to be controlled with one hand, whilst the other is selecting items from it. (CHIMP, tear-off Palette, ring menu.)
Prop-based ‘physical menus’ Usually, 3D widgets and especially menus are attached to physical surfaces, which inherently provide means of constraining the interaction and providing natural feedback to the user. Usually, the position of the physical surfaces is tracked to allow for appropriate visual representations in space.
Glove-based menus (Tinmith, FingARtips, TULIP menu) One of the problems resulting from the usage of instrumented gloves for gestural and spatial input is that hands can be too encumbered to use other tools.
Gestures are: mimic gesture, symbolic gestures, sweeping, sign language, speech-connected hand gesture, surface-based gestures and whole-body interaction
Speech recognition enhanced menus The problem of encumbered hands motivated the development of the hands-off interaction technique [32] for menu display, which involves the presentation of menu items as a 2D overlay onto the 3D world. Textual menu items are displayed on a view plane which moves relative to the user, and the items are selected via speech recognition. Other menu solutions employ speech recognition as an alternative input channel in addition to graphical selection. (3D Palette, Billinghurst et al.)
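The speech-selection step can be sketched as matching the recognizer's text output against the displayed textual items. This is a hypothetical simplification (real systems constrain the recognizer with a grammar rather than comparing strings after the fact):

```python
def select_by_speech(recognized: str, items: list[str]):
    """Return the index of the spoken menu item, or None if nothing matches.

    recognized: the recognizer's text output for one utterance.
    items: the textual menu items currently shown on the view plane.
    """
    spoken = recognized.strip().lower()
    for i, label in enumerate(items):
        if label.lower() == spoken:
            return i
    return None
```

Because selection is by name rather than by pointing, the technique leaves both hands free for manipulation tasks.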
Pen-and-tablet menus An interface for creating virtual scenes using a tracked tablet and digitizing pen. (3D Palette)
Tool and object palette This was attached to the so-called Personal Interaction Panel, which could be used in various VR settings.
Workbench Menus The responsive workbench and similar configurations are very attractive for direct manipulation. Typically, menus are used by means of a toolbox containing various 3D-icons. Interaction is done with the stylus or by pinching with the gloves.
Command & control cube 3D equivalent of the quick keyboard hotkey mechanism known from WIMP interfaces.
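The hotkey analogy can be sketched as quantizing the hand's displacement from the invocation point into a 3x3x3 grid, so each of the 27 cells acts as a spatial shortcut (hypothetical names and cell size; the actual implementation may differ):

```python
def ccc_cell(displacement, cell_size=0.05):
    """Quantize a 3D hand displacement (metres) into cube coordinates
    in {-1, 0, 1}^3; (0, 0, 0) is the neutral centre cell."""
    def axis(d):
        if d > cell_size / 2:
            return 1
        if d < -cell_size / 2:
            return -1
        return 0
    return tuple(axis(d) for d in displacement)
```

Like a keyboard hotkey, the selection requires no visual search once the cell positions are memorized, which is the main appeal of the technique.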
Spin Menu Items are arranged on a portion of a circle and controlled by rotating the wrist in the horizontal plane. Since at most 9–11 items can be displayed on a ring, hierarchical spin menus are suggested with crossed, concentric, or stacked layouts.
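The wrist-to-item mapping can be sketched as follows (hypothetical names and parameters; the published Spin Menu may use a different mapping):

```python
def spin_menu_index(wrist_yaw_deg, n_items, arc_deg=120.0):
    """Map a wrist yaw angle (centred on 0) to the highlighted item index.

    The n_items are spread over arc_deg degrees of the circle; the angle
    is clamped to the arc so over-rotation sticks at the first/last item.
    """
    half = arc_deg / 2
    yaw = max(-half, min(half, wrist_yaw_deg))   # clamp to the arc
    slot = (yaw + half) / arc_deg * n_items      # 0 .. n_items
    return min(int(slot), n_items - 1)
```

Clamping to a limited arc matters ergonomically: the 9–11 item bound above follows from how far the wrist can comfortably rotate, which is also why hierarchical spin menus are needed for larger item sets.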
Menus with body-relative interaction can for example be attached to the user’s body and thus take advantage of proprioception during operation. (CHIMP look-at-menu)
Toolspaces and glances Instead of switching contexts between 2D and 3D interaction, this approach relies entirely on 3D widgets, which are stored on the so-called toolspaces attached to the user’s virtual body.
6. Usability (see Chapter on Testing)
Evaluation criteria (selection speed, error rate, efficiency, user comfort, ease of use and learning)
Comparison (different layouts, selection methods, menu solutions)