array of new mobile location-based services. For example, users' real-time geospatial
information can be incorporated into mobile permission marketing [15] to create a
new location-based mobile marketing service.
Mobile AR systems are the most compatible platforms for geospatial data, as they are
designed to register virtual information to locations in space far more precisely than
the typical geographic information system (GIS). An example is the use of AR to
tightly integrate medical 3D data (e.g., CAT scans, MRI images) with the patient's
body during surgery [1, 29]. This capability creates the potential for location-based
services that provide an additional dimension to existing information systems and
services: the guidance of the user's mobile attention to any spatial location for
alerts, navigation, or object retrieval.
At the user level, mobile interfaces that can continuously guide users place demands
on user attention. However, despite the rapid growth of mobile telephony and the
mobile Internet, research concerning m-commerce interfaces is still in the early stages
[17, p. 98]. Information-rich mobile applications of AR systems begin to push up
against a fundamental human factors limitation: the limited attention capacity of
the human cognitive system. For example, cell phones split attention between virtual
information (i.e., a caller talking about a different spatial context) and the demands of
the user's physical environment. These attention demands of mobile interfaces such
as cellular phones appear to contribute to automobile accidents [28, 33].
If AR interfaces are to guide user attention in real time, then a fundamental interface
issue needs to be addressed: How can an AR system successfully manage and guide
visual attention to places in the environment where critical information or objects
are present, even when they are not within the visual field? To describe the problem
another way: What does a 3D omnidirectional cursor look like? This question is part
of a larger set of issues that we refer to as attention management and augmentation
in mobile AR and VR interfaces.
Object Search
A warehouse worker uses a mobile AR information system to manage inventory, and
is searching for a specific box in an aisle stocked with dozens of virtually identical
boxes. According to the records of the warehouse's integrated inventory information
system, the box is stored on a shelf behind the user. What is the most efficient
way to signal the location to the user?
Procedural Cueing During Training
A trainee repair technician uses an AR system to learn a sequence of procedural steps
where parts and tools are used to repair complex manufacturing equipment. How
can the computer best indicate which tool and part to select next in the procedural
sequence, especially when the parts and tools may be distributed throughout a large
workspace?
Spatial Navigation
A service repair technician with a personal digital assistant (PDA) equipped with
GPS is looking for a specific building and piece of equipment in a large office complex
with many similar buildings. The building is around the corner down the street. What
is the fastest way to signal a walking path to the front door of the building?
Attention Management
ATTENTION IS ONE OF THE MOST LIMITED MENTAL RESOURCES [30]. Attention is used to
focus the human cognitive capacity on a certain sensory input so that the brain can
concentrate on processing information of interest. Attention is primarily directed
internally, from the top down according to the current goals, tasks, and larger dispositions of the user. Attention, especially visual attention, can also be cued by the
environment. For example, attention can be user driven (i.e., "find the screwdriver"),
collaborator driven ("use this scalpel now"), or system driven ("please use this tool
for the next step").
Attention management is a central human-computer interaction issue in the design
of interfaces and devices [12, 24]. For example, the attention demands of current interfaces such as cellular phones and PDAs may play a significant role in automobile
accidents [28, 33]. The scenarios from the previous section illustrate various cases
where attention must be guided, augmented, or managed by the AR system or by a
remotely communicating user.
When communicating with remote users, the indexical cues of interpersonal communication are not available or are presented in a diminished modality, so finger-pointing
and eye gazing are useless, and linguistic references to "this," "that," and "over there"
are even more ambiguous than in direct communication.
approaches the target from the viewer's direction. The curvatures of the starting and
ending points are specified in the application.
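The exact curve family is not specified here; one common way to realize a path that leaves the viewer along the view direction and bends toward the target, with adjustable end curvatures, is a cubic Hermite curve. The sketch below makes that assumption, with the tangent vectors m0 and m1 standing in for the application-specified curvatures:

```python
def hermite_point(p0, p1, m0, m1, t):
    """Point on a cubic Hermite curve at parameter t in [0, 1].

    p0, p1 -- start (viewer) and end (target) positions
    m0, m1 -- end tangents; their directions and magnitudes control how
              the path leaves the viewer and approaches the target
    """
    h00 = 2 * t**3 - 3 * t**2 + 1   # Hermite basis functions
    h10 = t**3 - 2 * t**2 + t
    h01 = -2 * t**3 + 3 * t**2
    h11 = t**3 - t**2
    return [h00 * a + h10 * c + h01 * b + h11 * d
            for a, b, c, d in zip(p0, p1, m0, m1)]

def sample_path(p0, p1, m0, m1, n):
    """Sample n points along the path; funnel patterns would be drawn
    at frames placed on these points."""
    return [hermite_point(p0, p1, m0, m1, i / (n - 1)) for i in range(n)]
```

By construction the curve starts exactly at p0 and ends exactly at p1, so the first pattern sits at the viewer and the last at the target.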
The orientation of each pattern along the visual path is obtained by spherical linear
interpolation of the up direction of the source frame and the up direction of the target
frame, so as to transition from an alignment with the view frame to an upright alignment with the target. Spherical linear interpolation was introduced to the computer
graphics community by Shoemake [32]; it differs from linear interpolation in that
the angle between each interval is constant, that is, the orientations of the
patterns change smoothly. The formula used is:
Φ(t) = Φ1 · sin((1 − t)θ) / sin(θ) + Φ2 · sin(tθ) / sin(θ),
where Φ1 and Φ2 are the source and target orientations and θ is the angle between them.
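The slerp formula can be implemented directly; the sketch below assumes the two orientations are given as same-length unit vectors or quaternions:

```python
import math

def slerp(phi1, phi2, t):
    """Spherical linear interpolation (Shoemake [32]) between two unit
    orientations phi1 and phi2 at parameter t in [0, 1]."""
    dot = sum(a * b for a, b in zip(phi1, phi2))
    dot = max(-1.0, min(1.0, dot))          # guard acos against rounding
    theta = math.acos(dot)                  # angle between the orientations
    if math.sin(theta) < 1e-9:              # nearly parallel: slerp degenerates
        return [a + t * (b - a) for a, b in zip(phi1, phi2)]
    w1 = math.sin((1 - t) * theta) / math.sin(theta)
    w2 = math.sin(t * theta) / math.sin(theta)
    return [w1 * a + w2 * b for a, b in zip(phi1, phi2)]
```

Because the weights are sines of fixed fractions of θ, equal steps in t sweep equal angles, which is what keeps the pattern orientations along the funnel changing smoothly.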
Figure 4. Example of the Attentional Funnel Drawing the Attention of the User to an Object on
the Shelf (the Box)
Methodology
A within-subjects experiment was conducted to test the performance of the attention funnel design against other conventional attention-direction techniques: visual
highlighting and verbal cues. The experiment had one independent variable, the method
used for directing attention, with three alternatives: (1) the attention funnel, (2) visual
highlight techniques, and (3) a control condition consisting of a simple linguistic cue
common in current mobile phones (i.e., "look for the red box").
Participants
Fourteen paid participants drawn from a university student population participated
in the study.
Measurements
Search Time, Error, and Variability
Search time in milliseconds was measured as the time it took for participants to grab
a target object from among the 48 objects following the onset of an audio cue tone.
The end of the search time was triggered by the pressure sensor on the thumb of the
glove when the user touched the target object. An error was logged for cases when
participants selected the wrong object.
Mental Workload
Participants' perceived task workload in each condition was measured using the NASA
Task Load Index after each experimental condition [9].
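As an aside on scoring, a common convention is the unweighted "raw TLX" variant, which is simply the mean of the six subscale ratings (the full procedure in [9] additionally weights the subscales by pairwise comparisons); a minimal sketch:

```python
SUBSCALES = ("mental demand", "physical demand", "temporal demand",
             "performance", "effort", "frustration")

def raw_tlx(ratings):
    """Raw (unweighted) TLX score: the mean of the six subscale
    ratings, each on the instrument's 0-100 scale."""
    if len(ratings) != len(SUBSCALES):
        raise ValueError("expected one rating per subscale")
    if not all(0 <= r <= 100 for r in ratings):
        raise ValueError("ratings must lie in [0, 100]")
    return sum(ratings) / len(ratings)
```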
Procedure
Participants entered a training environment where they were introduced and trained
to use each interface (audio, visual highlight, attention funnel). They then began the
experiment. Each subject experienced the interface treatment conditions (audio, visual
highlight, and attention funnel) and each object search trial in a randomized order. For
each condition, participants were cued to find and touch one of the 48 objects in the
environment as quickly and accurately as possible. Each participant completed 24 trials,
balanced such that 12 trials involved searching for a random selection of primitive
objects and 12 trials involved randomly selected general everyday objects.
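A balanced-but-randomized trial list of this kind can be sketched as follows; the object names and the even 8-per-condition split are hypothetical placeholders, not the study's actual pools or counterbalancing scheme:

```python
import random

CONDITIONS = ("verbal cue", "visual highlight", "attention funnel")

def build_trials(primitive_pool, everyday_pool, rng):
    """24 trials per participant: 12 primitive-object and 12 everyday-object
    searches, shuffled, with the three cueing conditions spread evenly."""
    targets = ([("primitive", rng.choice(primitive_pool)) for _ in range(12)] +
               [("everyday", rng.choice(everyday_pool)) for _ in range(12)])
    rng.shuffle(targets)
    conditions = list(CONDITIONS) * 8     # 8 trials per condition
    rng.shuffle(conditions)
    return [(cond, kind, obj) for cond, (kind, obj) in zip(conditions, targets)]
```

Passing in a seeded `random.Random` keeps each participant's sequence reproducible while still randomizing order.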
Results
A general linear model repeated measures analysis of variance (ANOVA) was conducted to test the effect of the interface metaphors on the different performance indicators. There
was a significant effect of interface type on search time, F(2,13) = 10.031, p < 0.001,
and on search time consistency (i.e., smallest standard deviation), F(2,13) = 23.066,
p < 0.001. The attention funnel interface clearly allows subjects to find objects in the
least amount of time and with the most consistency (mean [M] = 4473.75 milliseconds
[ms], standard deviation [SD] = 1064.48) compared to the visual highlight interface
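For illustration, the F-statistic of a one-way repeated-measures ANOVA of this kind can be computed in a few lines; this is a minimal sketch, and the data in the test are made up, not the study's:

```python
def rm_anova(data):
    """One-way repeated-measures ANOVA.

    data -- list of equal-length rows, one row per subject and one
            column per condition (here: cueing technique)
    Returns (F, df_conditions, df_error).
    """
    n, k = len(data), len(data[0])          # subjects, conditions
    grand = sum(sum(row) for row in data) / (n * k)
    # condition (treatment) sum of squares
    cond_means = [sum(row[j] for row in data) / n for j in range(k)]
    ss_cond = n * sum((m - grand) ** 2 for m in cond_means)
    # subject sum of squares, removed from the error term because each
    # subject experiences every condition (within-subjects design)
    subj_means = [sum(row) / k for row in data]
    ss_subj = k * sum((m - grand) ** 2 for m in subj_means)
    ss_total = sum((x - grand) ** 2 for row in data for x in row)
    ss_err = ss_total - ss_cond - ss_subj
    df_cond, df_err = k - 1, (k - 1) * (n - 1)
    return (ss_cond / df_cond) / (ss_err / df_err), df_cond, df_err
```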
Discussion
When compared to standard cueing techniques such as visual highlighting and audio
cueing, we found that the attention funnel decreased the visual search time by 22
percent overall (approximately 28 percent for the visual search phase alone), and by 14
percent over the next fastest technique, as shown in Figure 6. While increased speed is valuable
in some applications of AR, such as medical emergencies and other high-risk applications, it may be critical that the system support the user's consistent performance. The
Figure 7. Mental Workload Measured by NASA TLX [9] for Each Experimental Condition
attention funnel had a very robust effect on making the user search consistently, with a
significantly lower standard deviation compared with the other two cueing techniques.
The interface increased users' consistency by an average of 65 percent, and by 56 percent
over the next best interface.
A key criterion for a mobile interface is the need for minimal attention demand. In
cases where AR environments are used for emergency services, repair work, and other
time-critical and attention-demanding applications, search time may require costly
mental effort. The effects of interface type on mental workload are illustrative, as
shown in Figure 7. Cueing users with only audio, which involved holding the object
description in memory, required additional mental workload. Visual highlighting techniques,
which demand less of memory, also demanded additional mental workload, possibly because
of the uncertainty of where to search. The attention funnel, which placed limited demand on memory and which directed search immediately and continuously, provided
an 18 percent decrease in mental workload.
In summary, the attention funnel led to faster search and retrieval times, greater
consistency of performance, and decreased mental workload when compared to verbal
cueing and visual highlighting techniques.
Limitations
THE ATTENTION FUNNEL WAS DESIGNED as a unique interface technique for directing and
guiding users' attention to any location in the full 4π steradians around the user. The approach is unique and
patent pending. As indicated above, current techniques used in 3D games and simulations, such as the highlighting of 3D objects, are not feasible in real-world scenes.
No virtual 3D model will preexist for most real-world objects such as buildings,
packages, tools, and so on, even if the location is known using global positioning or
RFID tags.
As there is no standard, we tested the attention funnel against the most commonly
used AR techniques [3]. This presents a limitation to the current study, as the logical
comparison is a set of possible or unknown techniques, which have not been implemented. We are currently implementing and exploring other possible cueing techniques
such as 3D arrows, lines, and so on.
Furthermore, an ideal test of the attention funnel would take place in complex,
outdoor environments with fully mobile individuals cued to find objects within and
far outside of reach. This would add ecological validity to the findings.
With the success of AR systems, designers will seek to add potentially rich, even
unlimited, layers of virtual information onto physical space. As AR systems are used
in various real, demanding, mobile applications such as manufacturing assembly,
warehousing, tourism, navigation, training, and distant collaboration, interface techniques appropriate to the AR medium will be needed to manage the mobile user's
limited attention, improve user performance, and limit cognitive demands for optimal
spatial performance. The AR attention funnel paradigm represents an example of cognitive engineering interface techniques for which there is no real-world equivalent,
and which is specifically adapted for users of AR systems navigating and working in
information- and object-rich environments.
Future Work
WE ARE CURRENTLY IMPLEMENTING the attention funnel technique on other mobile devices, including handheld devices such as PDAs and cell phones. The attention funnel
can be overlaid on a live video stream captured by a handheld camera, while spatial
location of the user can be determined using GPS, digital compass, or triangulation
of cellular or RFID signals. Figure 8 illustrates the implementation of the attention
funnel technique on a tablet PC. The attention funnel technique has some important
implications for the usability of location-based consumer information systems. As an example, the attention funnel can be used to display navigation information generated
by commercial GISs (e.g., Microsoft MapPoint, Google Maps) from a first-person
perspective, as illustrated in Figure 9. The attention funnel technique can also be used
to display location-based touring alert information to a mobile user via a PDA or cell
phone (e.g., the location of a shop or restaurant can be cued by an attention funnel
displayed on the screen of a PDA).
Acknowledgments: The authors acknowledge the assistance of Betsy McKeon, Amanda Hart,
and Mark Rosen in the preparation of this paper. They also appreciate the suggestions and recommendations provided by the three anonymous reviewers on an earlier version of this paper.
This project is one element of the Mobile Infospaces project, supported in part by a grant
from the National Science Foundation (CISE 0222831). Any opinions, findings, and conclusions
or recommendations expressed in this material are those of the authors and do not necessarily
reflect the views of the National Science Foundation.
REFERENCES
1. Bajura, M.; Fuchs, H.; and Ohbuchi, R. Merging virtual objects with real world: Seeing
ultrasound imagery within the patient. Computer Graphics, 26, 2 (1992), 203–210.
2. Bimber, O. Video see-through AR on consumer cell phones. In Proceedings of the Third
IEEE and ACM International Symposium on Mixed and Augmented Reality. Los Alamitos, CA:
IEEE Computer Society Press, 2004, pp. 252–253.
3. Biocca, F.; Tang, A.; Owen, C.; and Xiao, F. Attention funnel: Omnidirectional 3D cursor
for mobile augmented reality platforms. In R. Grinter, T. Rodden, P. Aoki, E. Cutrell, R. Jeffries, and G. Olson (eds.), Proceedings of the ACM CHI 2006 Conference on Human Factors
in Computing Systems. New York: ACM Press, 2006, pp. 1115–1122.
4. Brown, D.; Stripling, R.; and Coyne, J. Augmented reality for urban skills training. In
Proceedings of IEEE Virtual Reality Conference 2006. Los Alamitos, CA: IEEE Computer
Society Press, 2006, pp. 249–252.
5. Caudell, T., and Mizell, D. Augmented reality: An application of heads-up display technology to manual manufacturing processes. In Ralph H. Sprague Jr. (ed.), Proceedings of the
Twenty-Fifth Annual Hawaii International Conference on System Sciences. Los Alamitos, CA:
IEEE Computer Society Press, 1992, pp. 659–669.
6. Fang, X.; Chan, S.; Brzezinski, J.; and Xu, S. Moderating effects of task type on wireless
technology acceptance. Journal of Management Information Systems, 22, 3 (Winter 2005–6),
123–157.
7. Feiner, S.; MacIntyre, B.; and Seligmann, D. Knowledge-based augmented reality. Communications of the ACM, 36, 7 (1993), 52–62.
8. Feiner, S.; Webster, A.; Krueger, T.; MacIntyre, B.; and Keller, E. Architectural anatomy.
Presence: Teleoperators and Virtual Environments, 4, 3 (1995), 318–325.
9. Hart, S. Development of NASA-TLX (task load index): Results of empirical and theoretical research. In P. Hancock and N. Meshkati (eds.), Human Mental Workload. Amsterdam:
North-Holland, 1988, pp. 239–250.
10. Hearn, D., and Baker, M.P. Computer Graphics, C Version. Upper Saddle River, NJ:
Prentice Hall, 1996.
11. Hochberg, J. Representation of motion and space in video and cinematic displays. In K.
Boff, L. Kaufman, and J. Thomas (eds.), Handbook of Perception and Human Performance,
vol. 1. New York: Wiley, 1986, pp. 22.1–22.64.
12. Horvitz, E.; Kadie, C.; Paek, T.; and Hovel, D. Models of attention in computing and
communication: From principles to applications. Communications of the ACM, 46, 3 (2003),
52–59.
13. Jebara, T.; Eyster, C.; Weaver, J.; Starner, T.; and Pentland, A. Stochasticks: Augmenting
the billiards experience with probabilistic vision and wearable computers. In Proceedings of the
First International Symposium on Wearable Computers. Los Alamitos, CA: IEEE Computer
Society Press, 1997, pp. 138–145.
14. Julier, S.; Baillot, Y.; Lanzagorta, M.; Brown, D.; and Rosenblum, L. BARS: Battlefield
augmented reality system. Paper presented at the NATO Symposium on Information Processing
Techniques for Military Systems, Istanbul, Turkey, October 2000.
15. Kavassalis, P.; Spyropoulou, N.; Drossos, D.; Mitrokostas, E.; Gikas, G.; and Hatzistamatiou, A. Mobile permission marketing: Framing the market inquiry. International Journal
of Electronic Commerce, 8, 1 (Fall 2003), 55–79.
16. Klinker, G.; Stricker, D.; and Reiners, D. Augmented reality for exterior construction
applications. In W. Barfield and T. Caudell (eds.), Fundamentals of Wearable Computers and
Augmented Reality. Mahwah, NJ: Lawrence Erlbaum, 2001, pp. 379–427.
17. Lee, Y., and Benbasat, I. A framework for the study of customer interface design for mobile
commerce. International Journal of Electronic Commerce, 8, 3 (Spring 2004), 79–102.
18. Livingston, M.; Brown, D.; Julier, S.; and Schmidt, G. Military applications of augmented
reality. Paper presented at the NATO Human Factors and Medicine Panel Workshop on Virtual
Media for Military Applications, West Point, June 2006.
19. Livingston, M.; Rosenblum, L.; Julier, S.; Brown, D.; Baillot, Y.; Swan, E., II; Gabbard,
J.; and Hix, D. An augmented reality system for military operations in urban terrain. Paper
presented at the Interservice/Industry Training, Simulation and Education Conference, Orlando,
FL, December 2002.
20. Livingston, M.; Swan, E., II; Julier, S.; Baillot, Y.; Brown, D.; Rosenblum, L.; Gabbard,
J.; and Höllerer, T. Evaluating system capabilities and user performance in the battlefield
augmented reality system. Paper presented at the Performance Metrics for Intelligent Systems
Workshop, Gaithersburg, MD, August 2004.
21. Loomis, J.; Golledge, R.; and Klatzky, R. Navigation system for the blind: Auditory
display modes and guidance. Presence: Teleoperators and Virtual Environments, 7, 2 (1998),
193–203.
22. Mann, S. Telepointer: Hands-free completely self contained wearable visual augmented
reality without headwear and without any infrastructural reliance. In Proceedings of Fourth
International Symposium on Wearable Computers. Los Alamitos, CA: IEEE Computer Society
Press, 2000, pp. 177–178.
23. Marston, J.; Loomis, J.; Klatzky, R.; Golledge, R.; and Smith, E. Evaluation of spatial
displays for navigation without sight. ACM Transactions on Applied Perception, 3, 2 (2006),
110–124.
24. McCrickard, D., and Chewar, C. Attentive user interface: Attuning notication design to
user goals and attention costs. Communications of the ACM, 46, 3 (2003), 67–72.
25. Middlebrooks, J., and Green, D. Sound localization by human listeners. Annual Review
of Psychology, 42 (1991), 135–159.
26. Ohshima, T.; Satoh, K.; Yamamoto, H.; and Tamura, H. AR2 hockey system: A collaborative mixed reality system. Transactions of the Virtual Reality Society of Japan, 3, 2 (1998),
55–60.
27. Owen, C.; Tang, A.; and Xiao, F. ImageTclAR: A blended script and compiled code
development system for augmented reality. Paper presented at STARS2003: The International
Workshop on Software Technology for Augmented Reality Systems, Tokyo, Japan, 2003.
28. Redelmeier, D.A., and Tibshirani, R.J. Association between cellular telephone calls and
motor vehicle collisions. New England Journal of Medicine, 336, 7 (1997), 453–458.
29. Rolland, J.; Wright, D.; and Kancherla, A. Towards a novel augmented-reality tool to
visualize dynamic 3D anatomy. In K. Morgan, H. Hoffman, D. Stredney, and S. Weghorst (eds.),
Proceedings of Medicine Meets Virtual Reality 5. 1997, pp. 337–348.
30. Shiffrin, R. Visual processing capacity and attentional control. Journal of Experimental
Psychology: Human Perception and Performance, 5, 1 (1979), 522–526.
31. Shinn-Cunningham, B. Localizing sounds in rooms. Paper presented at the ACM SIGGRAPH and EUROGRAPHICS Campfire: Acoustic Rendering for Virtual Environments,
Snowbird, UT, May 2001.
32. Shoemake, K. Animating rotation with quaternion curves. Computer Graphics, 19, 3
(1985), 245–254.
33. Strayer, D.L., and Johnston, W. Driven to distraction: Dual-task studies of simulated driving and conversing on a cellular phone. Psychological Science, 12, 6 (2001), 462–466.
34. Tang, A.; Owen, C.; Biocca, F.; and Mou, W. Experimental evaluation of augmented
reality in object assembly task. In Proceedings of the First IEEE and ACM International Symposium on Mixed and Augmented Reality. Los Alamitos, CA: IEEE Computer Society Press,
2002, pp. 265–266.
35. Tang, A.; Owen, C.; Biocca, F.; and Mou, W. Comparative effectiveness of augmented
reality in object assembly. In V. Bellotti, T. Erickson, G. Cockton, and P. Korhonen (eds.),
Proceedings of ACM CHI 2003, Conference on Human Factors in Computing Systems. New
York: ACM Press, 2003, pp. 73–80.
36. Tang, A.; Owen, C.; Biocca, F.; and Mou, W. Performance evaluation of augmented reality for directed assembly. In A. Nee and S. Ong (eds.), Virtual Reality and Augmented Reality
Applications in Manufacturing. London: Springer-Verlag, 2004, pp. 301–322.
37. Zwass, V. Management information systems: Beyond the current paradigm. Journal of
Management Information Systems, 1, 1 (Summer 1984), 3–10.