
Interaction Styles & Devices


Touch / multi-touch
Handwriting
Voice
Gesture
Eye-tracking
Virtual / Augmented Reality (VR & AR)

Touch / multi-touch
Pros

More natural (direct control)


More convenient (display is input)
More screen real estate (no physical buttons)
Multi-touch -> Multiuser

Cons

Fat finger and occlusion


Ambiguous actions
Cost
Reduced screen brightness
No haptic feedback

When to use
Other inputs not practical

When not to use


High precision work

Standards/Guidelines
Follow consistent behavior patterns
Large target sizes (see the sketch below)
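
A minimal sketch of the large-target guideline, assuming a web UI in TypeScript: 44 CSS px approximates the 44pt minimum from Apple's HIG, and Material Design recommends 48dp (both sources are in the references). The padding-based expansion is one illustrative approach, not the only one.

```ts
// Enforce a minimum touch-target size in a web UI.
// 44 CSS px approximates Apple's 44pt minimum; Material Design uses 48dp.
const MIN_TARGET_PX = 44;

function ensureMinimumHitArea(el: HTMLElement): void {
  const rect = el.getBoundingClientRect();
  // Pad symmetrically until the hit area meets the minimum in each axis.
  const padX = Math.max(0, (MIN_TARGET_PX - rect.width) / 2);
  const padY = Math.max(0, (MIN_TARGET_PX - rect.height) / 2);
  if (padX > 0 || padY > 0) {
    el.style.padding = `${padY}px ${padX}px`;
  }
}

document.querySelectorAll<HTMLElement>("button, a").forEach(ensureMinimumHitArea);
```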

Handwriting
Pros
Natural/familiar
Useful for non-linear input
Pen device or touch

Cons

Messy penmanship ("doctor's handwriting")


Small writing space
Attention-heavy
Requires available & reliable recognition software
Continuous (cursive) text is harder to recognize

Handwriting
When to use
Mathematics learning
Note taking
Mobile devices (+ pen)

When not to use


Other attention-heavy activities
Small screens w/o pen
Long passage/data input

Standards/Guidelines
Large writing area for touch input
Tested & reliable recognition software
Supports correction of recognition errors (see the sketch below)
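
A hedged sketch of the error-correction guideline: the recognizer below is a hypothetical stub standing in for a real handwriting engine, and the n-best interaction shown is one common pattern, not a specific product's API.

```ts
// N-best correction flow: surface ranked alternatives so the user can
// repair a misrecognition with one tap instead of rewriting the word.
type Stroke = Array<[number, number]>; // (x, y) points of one pen stroke

interface RecognitionResult {
  candidates: string[]; // ranked best-first
}

// Hypothetical stand-in for a real handwriting-recognition backend.
function recognize(_strokes: Stroke[]): RecognitionResult {
  return { candidates: ["hello", "hells", "hallo"] };
}

function acceptWithCorrection(
  result: RecognitionResult,
  pick: (alternatives: string[]) => string
): string {
  // Offer at most the top five candidates to keep the choice cheap.
  return pick(result.candidates.slice(0, 5));
}

console.log(acceptWithCorrection(recognize([]), (alts) => alts[0]));
```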

Voice
Pros
Hands free
Fast and accurate for text
Little/no training

Cons
Trouble with accents; weak support for some languages
Limited vocabulary / applicability
Computationally expensive

When to use
Small devices (smart watches)
Text-only input

When not to use


Large operation set

Standards/Guidelines
Other information
Semantics / natural language processing (see the dictation sketch below)
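
A minimal dictation sketch using the browser Web Speech API; the API is real but unevenly supported (often behind a webkit prefix), and language coverage varies, which echoes the cons above.

```ts
// Dictation with the Web Speech API (browser support varies).
const SpeechRecognitionCtor =
  (window as any).SpeechRecognition ?? (window as any).webkitSpeechRecognition;

const recognizer = new SpeechRecognitionCtor();
recognizer.lang = "en-US";         // accuracy drops for unsupported accents/languages
recognizer.interimResults = false; // deliver only final transcripts
recognizer.onresult = (event: any) => {
  const transcript: string = event.results[0][0].transcript;
  console.log("Heard:", transcript);
};
recognizer.start();
```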

Gesture
Pros
Pairable with other input modalities
Touch gesture or near-field gesture (NFG)
Natural
Pen-style tools or body only
Wearable devices

Cons
Transition from 2D pointing to 3D motion
Fatigue
Mental demand on gesture recall
False positive commands
Limited gesture-viewing field
Limited by standard point-and-click UIs

Gesture
When to use

Sterile environments
Hand/arm disabilities (+ pen tool)
3D models/graphics
AR/VR immersive environments

When not to use


Poor tracking/sensing environments
Standard UI
Narrow paths/menus

Standards/Guidelines

Begin with a clean slate


Larger (or expanding) targets (NFG)
Middle of the display (NFG)
Obvious/intuitive gesture commands
Provide feedback (see the swipe sketch below)
Robust tracking system
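
A small sketch of the feedback guideline, assuming a web UI with pointer events: a horizontal swipe is recognized only past a movement threshold, which also reduces false-positive commands. The 60 px threshold is an assumed value to tune per device.

```ts
// Recognize a horizontal swipe from pointer events and give immediate feedback.
const SWIPE_MIN_PX = 60; // assumed threshold; tune per device
let startX = 0;

window.addEventListener("pointerdown", (e: PointerEvent) => {
  startX = e.clientX;
});

window.addEventListener("pointerup", (e: PointerEvent) => {
  const dx = e.clientX - startX;
  if (Math.abs(dx) < SWIPE_MIN_PX) return; // ignore jitter: fewer false positives
  console.log(dx > 0 ? "swipe right" : "swipe left"); // feedback to the user
});
```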

Eye-tracking
Pros
Hands free
Easy to learn
Extra input channel

Cons
Deliberate eye-control is mentally demanding
Accuracy challenged by head movement and saccadic eye movements
Involuntary gaze can trigger unintended actions (the "Midas touch" problem; see the dwell sketch below)

When to use
Motor disabilities
Hands-free environments

When not to use


High precision tasks
Complicated action set
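
The Midas-touch con above is commonly mitigated with dwell-time selection: a target activates only after gaze rests on it for a fixed interval. A sketch with an assumed GazeSample shape and an assumed 800 ms dwell; no particular tracker's API is implied.

```ts
// Dwell-time selection: activate only after sustained gaze on one target.
interface GazeSample { targetId: string | null; timestamp: number; }

const DWELL_MS = 800; // assumed dwell threshold
let dwellTarget: string | null = null;
let dwellStart = 0;

function onGaze(sample: GazeSample): void {
  if (sample.targetId !== dwellTarget) {
    dwellTarget = sample.targetId; // gaze moved: restart the dwell timer
    dwellStart = sample.timestamp;
  } else if (dwellTarget && sample.timestamp - dwellStart >= DWELL_MS) {
    console.log("activate", dwellTarget);
    dwellTarget = null;            // require a fresh dwell for the next action
  }
}

// Example: two samples 900 ms apart on the same target trigger activation.
onGaze({ targetId: "ok-button", timestamp: 0 });
onGaze({ targetId: "ok-button", timestamp: 900 });
```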

Virtual / Augmented Reality (VR & AR)


Pros
Digital real-world overlay
Full environment immersion
Combines with other input modalities
Adds information to the environment
Vision-deficit assistance (VDR)

Cons
Tracking accuracy (technique-dependent)
Ergonomics (size, weight, power, ruggedness)
Portability
Alignment (registration) of AR objects to the real-world environment (see the projection sketch below)
Limited FOV (depends on tracking technique)
Display problems (color depth, resolution, contrast, luminance)

Virtual / Augmented Reality (VR & AR)


When to use

Medical training tool


Virtual instructions
Information visualization
Production sets for entertainment (e.g. theatre)
Gaming
Military aid
Navigation
Many more

When not to use


Non-user-present environments (e.g. flying) (AR only)
Long-duration use
Poor tracking environments (depends on techniques used)
No access to wireless or data networks

Virtual / Augmented Reality (VR & AR)


Standards/Guidelines
Information enhances the real world
Supports collaborative work
No special apparatus
Supports the need to naturally display 3D images
Portable & minimally invasive (handheld device, glasses)
Interface supports:
  Manipulation of virtual objects
  Drawing paths or trajectories
  Assigning quantitative values & text input
  Direct manipulation of physical objects
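
The registration con boils down to projecting virtual geometry through the tracked camera so overlays land on the right pixels. A toy sketch of the pinhole projection step; the intrinsic values are illustrative assumptions, and a real AR framework supplies the pose and intrinsics.

```ts
// Pinhole projection: map a 3D point in camera coordinates to a pixel.
type Vec3 = [number, number, number];

// Assumed camera intrinsics (focal lengths and principal point).
const fx = 800, fy = 800, cx = 320, cy = 240;

function projectToScreen(pCamera: Vec3): [number, number] | null {
  const [x, y, z] = pCamera;
  if (z <= 0) return null; // behind the camera: nothing to draw
  return [fx * (x / z) + cx, fy * (y / z) + cy];
}

// Draw the virtual overlay at this pixel each frame after pose tracking.
console.log(projectToScreen([0.1, -0.05, 2.0]));
```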

Head-to-Head: Eye tracking vs Gesture vs Traditional
[Comparison slides; see Canare, Chaparro, & He (2015) in the references]

Other Interaction Styles & Devices


Sip-n-puff
Brain-computer interface (BCI)
Multi-modal
Sensory substitution/augmentation

Thought Question
You want to design a user interface for a specific population (your choice).
What device(s) are best for that population, and what user interface design
considerations do you have to make?

References
Touch/Multitouch: https://developer.apple.com/library/ios/documentation/UserExperience/Conceptual/MobileHIG/
Touch/Multitouch: https://www.google.com/design/spec/layout/metrics-keylines.html
Touch/Multitouch: Boring, S., Ledo, D., Chen, X. A., Marquardt, N., Tang, A., & Greenberg, S. (2012, September). The fat thumb: using the
thumb's contact size for single-handed mobile interaction. In Proceedings of the 14th international conference on Human-computer
interaction with mobile devices and services (pp. 39-48). ACM.
Touch/Multitouch: Jung, E. S., & Im, Y. (2015). Touchable area: An empirical study on design approach considering perception size and touch
input behavior. International Journal of Industrial Ergonomics, 49, 21-30.
Handwriting: Anthony, L., Yang, J., & Koedinger, K. R. (2006, July). Towards the application of a handwriting interface for mathematics
learning. In Multimedia and Expo, 2006 IEEE International Conference on (pp. 2077-2080). IEEE.
Handwriting: Bharath, A., & Madhvanath, S. (2008, January). FreePad: a novel handwriting-based text input for pen and touch interfaces. In
Proceedings of the 13th international Conference on intelligent User interfaces (pp. 297-300). ACM.
Handwriting (PPT): Snowdon, J. L. (2003). Pen Computing: Challenges and Applications. New York.
Handwriting: Subrahmonia, J., & Zimmerman, T. (2000). Pen computing: Challenges and applications. In Pattern Recognition, 2000.
Proceedings. 15th International Conference on (Vol. 2, pp. 60-66). IEEE.
Voice: Katangur, A. K., Akkaladevi, S., & Osei, H. (2013). Voice remote computer control using speech recognition through PSTN. Journal of
Applied Global Research, 6(18).
Voice: Murakami, T., Tani, S., Matsuda, A., Takemoto, K., Shindo, A., & Inada, H. (2012). A basic study on application of voice recognition input
to an electronic nursing record system-evaluation of the function as an input interface. Journal of medical systems, 36(3), 1053-1058.
Gesture: Dhawale, P., Masoodian, M., & Rogers, B. (2006, July). Bare-hand 3D gesture input to interactive systems. In Proceedings of the 7th
ACM SIGCHI New Zealand chapter's international conference on Computer-human interaction: design centered HCI (pp. 25-32). ACM.
Gesture: Wigdor, D., & Wixon, D. (2011). Brave NUI world: designing natural user interfaces for touch and gesture. Elsevier.
Eye Tracking: Sesin, A., Adjouadi, M., Cabrerizo, M., Ayala, M., & Barreto, A. (2008). Adaptive eye-gaze tracking using neural-network-based
user profiles to assist people with motor disability. J Rehabil Res Dev, 45(6), 801-817.
Eye Tracking: Zhai, S. (2003). What's in the eyes for attentive input. Communications of the ACM, 46(3), 34-39.
AR: Azuma, R. T. (1997). A survey of augmented reality. Presence, 6(4), 355-385.
AR: Van Krevelen, D. W. F., & Poelman, R. (2010). A survey of augmented reality technologies, applications and limitations. International
Journal of Virtual Reality, 9(2), 1.
AR: Zhou, F., Duh, H. B. L., & Billinghurst, M. (2008, September). Trends in augmented reality tracking, interaction and display: A review of ten
years of ISMAR. In Proceedings of the 7th IEEE/ACM International Symposium on Mixed and Augmented Reality (pp. 193-202). IEEE
Computer Society.

References
Head-to-head: Canare, D., Chaparro, B., & He, J. (2015). A comparison of gaze-based and gesture-based input for a point-and-click task. In
Universal Access in Human-Computer Interaction. Access to Interaction (pp. 15-24). Springer International Publishing.
Device Comparison Discussion: Hinckley, K., & Wigdor, D. (2002). Input technologies and techniques. The human-computer interaction
handbook: fundamentals, evolving technologies and emerging applications, 151-168.
Multi-Modal: Jaimes, A., & Sebe, N. (2007). Multimodal human-computer interaction: A survey. Computer Vision and Image Understanding,
108(1), 116-134.
Multi-Modal: Nigay, L., & Coutaz, J. (1993, May). A design space for multimodal systems: concurrent processing and data fusion. In
Proceedings of the INTERACT'93 and CHI'93 Conference on Human Factors in Computing Systems (pp. 172-178). ACM.
Tongue Operated: Ghovanloo, M. (2007). Tongue operated assistive technologies. In 2007 29th Annual International Conference of the IEEE
Engineering in Medicine and Biology Society (pp. 4376-4379). IEEE.
Tongue Operated: Huo, X., & Wang, J. (2008). Introduction and preliminary evaluation of the Tongue Drive System: wireless tongue-operated
assistive technology for people with little or no upper-limb function. Journal of Rehabilitation Research and Development, 45(6), 921.
Tongue Operated: Yousefi, B., Huo, X., Veledar, E., & Ghovanloo, M. (2011). Quantitative and comparative assessment of learning in a
tongue-operated computer input device. Information Technology in Biomedicine, IEEE Transactions on, 15(5), 747-757.
BCI: Wolpaw, J. R., McFarland, D. J., Neat, G. W., & Forneris, C. A. (1991). An EEG-based brain-computer interface for cursor control.
Electroencephalography and clinical neurophysiology, 78(3), 252-259.
BCI: Schettini, F., Riccio, A., Simione, L., Liberati, G., Caruso, M., Frasca, V., ... & Cincotti, F. (2015). Assistive device with conventional,
alternative, and brain-computer interface inputs to enhance interaction with the environment for people with amyotrophic lateral sclerosis: a
feasibility and usability study. Archives of physical medicine and rehabilitation, 96(3), S46-S53.
Sensory Aug: Bach-y-Rita, P. (1967). Sensory plasticity. Acta Neurologica Scandinavica, 43(4), 417-426.
Sensory Aug: Bach-y-Rita, P. (2001). Nonsynaptic diffusion neurotransmission in the brain: functional considerations. Neurochemical research,
26(8), 871-873.
Sensory Aug: Bach-y-Rita, P., Kaczmarek, K. A., Tyler, M. E., & Garcia-Lara, J. (1998). Form perception with a 49-point electrotactile stimulus
array on the tongue: A technical note. Journal of rehabilitation research and development, 35, 427-430.
