
ISO 9241-430:2021
ISO 9241-430:2021 Ergonomics of human-system interaction – Part 430: Recommendations for the design of non-touch gestural input for the reduction of biomechanical stress
CDN $115.00
Description
This document provides guidance on the design, selection and optimization of non-contacting hand and arm gestures for human-computer interaction. It addresses the assessment of usability and fatigue associated with different gesture set designs and provides recommendations for approaches to evaluating the design and selection of gestures. This document also provides guidance on the documentation of the process for selecting gesture sets.
This document applies to gestures expressed by humans. It does not consider the technology for detecting gestures or the system response when interpreting a gesture. Non-contacting hand gestures can be used for input in a variety of settings, including the workplace and public settings, and when using fixed screens or mobile, virtual reality, augmented reality or mixed reality devices.
Some limitations of this document are:
-   The scope is limited to non-contacting gestures and does not include other forms of inputs. For example, combining gesture with speech, gaze or head position can reduce input error, but these combinations are not considered here.
-   The scope is limited to non-contacting arm, hand and finger gestures, either unilateral (one-handed) or bilateral (two-handed).
-   The scope assumes that all technological constraints are surmountable. Therefore, there is no consideration of technological limitations in interpreting ultra-rapid gestures, or gestures performed by people with different skin tones or wearing different colours or patterns of clothing.
-   The scope is limited to UI-based command-and-control human-computer interaction (HCI) tasks and does not include gaming scenarios, although the traversal of in-game menus and navigation of UI elements is within scope.
-   The scope does not include HCI tasks for which a clearly superior input method exists. For example, speech input is superior to gesture input for entering text.
-   The scope includes virtual reality (VR), augmented reality (AR) and mixed reality (MR) and the use of head-mounted displays (HMDs).
-   The scope does not include the discoverability of gestures but does include the learnability and memorability of gestures. It is assumed that product documentation and tutorials will adequately educate end users about which gestures are possible. Therefore, assessing gesture discoverability is not a primary goal of the recommendations in this document.
Edition: 1
Published Date: 2021-12-06
Status: PUBLISHED
Pages: 12
Format: Secure PDF
Secure PDF details:
- Save your file locally or view it via a web viewer
- Viewing permissions are restricted to the purchaser
- Device limit: 3
- Printing: limited to one (1) copy

Related Documents
- ISO 9687:2015 Dentistry – Graphical symbols for dental equipment (CDN $273.00)
- ISO 15416:2025 Automatic identification and data capture techniques – Bar code print quality test specification – Linear symbols (CDN $312.00)
- ISO 1990:1982 Fruits – Nomenclature – First list (CDN $115.00)
- ISO 16840:2023 Wheelchair seating – Part 14: Concepts related to managing external forces to maintain tissue integrity (CDN $115.00)