Design and the Analytic Method: The Future of Human Computer Interaction (HCI)
Laurian C. Hobby
Center for HCI, Computer Science
Virginia Tech
660 McBryde
Blacksburg, Virginia 24060
lhobby@vt.edu
Abstract. When designing or choosing an analytic method, the designer should
choose a method based on the criteria of having a good cost benefit tradeoff,
ease of use, and reproducibility of results. Even these criteria can lead a design
method astray if the method does not have the right focus of attention. HCI’s future in
design depends on shifting the focus away from the user alone to the work context,
which incorporates both the user and the work being done.
Technomethodology is only one of many possible design methods that attempts
to meet these criteria and has the proper focus. This paper explores these issues
and how a merging of social and tangible computing can create a superior
analytic design methodology that will guide HCI into the future of design.
1 Introduction
The world and its systems are complex. To understand them correctly requires the
inquisitive mind to analyze their structure and content in detail. To be analytic is to
think in terms of “elemental parts or basic principles” [10], or to be a logician of
sorts. However, the analytic process is much more than that: it is coming to the
right conclusion by breaking the problem into manageable parts in a structured,
methodical way. To further understand how to think analytically, it is beneficial to
think of its counterpart, how to think empirically, and what the two say about each
other. Empiricism, to define it, is the use of metrics and measures to accept or reject a
hypothesis. In comparing the two approaches, it may seem that an empirical method is
sterile. This is because the rules of analytic methods can often be ill-defined, whereas
the rules of empirical methods are more likely rigorous and sturdy. A typical example
would be to think of the evaluation of software: to use a checklist of needed
characteristics would be an analytic method, whereas to take a measurement of how
many people were able to complete a task with a piece of software would be an
empirical method. There are benefits and costs of using an analytic method for design,
therefore, the process used to select and develop an analytic method should be
carefully selected. How this method compares to general design methods, and how it
can help to initiate or inspire formal design methods, should also be explored. The rest
of this paper will investigate these ideas in depth and come to a set of conclusions.
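The checklist-versus-measurement contrast above can be made concrete with a small sketch. The checklist items and completion data below are hypothetical, chosen only to show the two evaluation styles side by side:

```python
# A toy contrast between the two evaluation styles described above. The
# checklist items and the task log are invented for illustration.

# Analytic: inspect the software against a checklist of needed characteristics.
checklist = {
    "undo supported": True,
    "help available": False,
    "consistent menus": True,
}
analytic_verdict = all(checklist.values())  # fails if any characteristic is missing

# Empirical: measure how many people completed a task with the software.
completions = [True, True, False, True, False, True, True, True]
completion_rate = sum(completions) / len(completions)

print(f"analytic pass: {analytic_verdict}, completion rate: {completion_rate:.0%}")
```

The analytic verdict follows from inspecting properties of the artifact itself, while the empirical rate follows from measuring people using it; the rest of this paper is concerned with when the former style is the right choice.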
2 What Makes a Good Analytic Method?
When it comes to the creation of a software system, the criteria for choosing an
analytic method are unclear. It is easy to apply general rules to a software system’s
design, such as generalizability, precision, and realism [21]; but for selecting an
analytic method for assessment, the criteria need to be more specific. Typical
examples of analytic methods to choose from (although they mostly, if not only,
focus on one phase of design, e.g., evaluation) are usability inspections [23, 24],
model-based analysis [17, 25], and claims analysis [32, 7] (for more information on
all three see [29]). When to use which method, and what criteria to apply in order to
create a good analytic method, are still unclear in the current literature. However, the
primary criteria for a good analytic method are listed and discussed in the next three
subsections in order to help clarify and ground what makes a superior analytic
method.
2.1 Criterion 1: The Cost-Benefit Tradeoff
The cost-benefit tradeoff, when viewed at a high level, amounts to asking whether the
analytic method is cost effective, i.e., are the benefits gained from using this method
equal to or greater than the cost of using it? The costs and the benefits are computed
and compared. As long as the comparison is in favor of the benefits, the analytic
method may be considered a good one. An example of a software design project
where the cost-benefit ratio was in favor of performing the analytic method was the
Goals, Operators, Methods, and Selection Rules (GOMS) project. Bonnie John and
her colleagues used the GOMS methodology on a project whose goal was to make
phone operators’ call handling faster. A new
software system was proposed to replace the original system. John was able to
accurately predict that the new computing system was going to cause the phone
operator to be slower with the proposed system than with the original one. She was
able to do this at minimal cost because breaking down the tasks the phone operators
would perform with the new system required only a short amount of time and effort
[17]. Thus, the costs were low compared to the cost of creating bad software. This is
an example where the cost of
performing the cost-benefit analysis was low; however, the benefits of understanding
that the new system would not improve the situation were high. When the cost of
applying an analytic method results in significant benefits, then it fits the criterion for
being a superior analytic method.
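The kind of prediction John made can be illustrated with a simplified Keystroke-Level Model (KLM) calculation, the simplest member of the GOMS family (her actual analysis used a more sophisticated GOMS variant). The operator durations below are the commonly cited KLM estimates; the two task sequences are invented for illustration and are not the actual phone-operator tasks:

```python
# A hypothetical sketch of a Keystroke-Level Model (KLM) estimate, the kind of
# analytic prediction behind the GOMS project described above. The operator
# durations are the commonly cited KLM estimates; both task sequences below
# are invented for illustration.

# Standard KLM operator durations in seconds.
OPERATOR_TIMES = {
    "K": 0.28,  # keystroke (average skilled typist)
    "P": 1.10,  # point with a mouse
    "H": 0.40,  # home hands between keyboard and mouse
    "M": 1.35,  # mental preparation
}

def predict_time(sequence):
    """Sum operator durations for a sequence like 'MKKKKK' to predict task time."""
    return sum(OPERATOR_TIMES[op] for op in sequence)

# Original workstation: one mental step, then five keystrokes.
original = predict_time("MKKKKK")
# Proposed workstation: extra mental steps and mouse travel on every call.
proposed = predict_time("MKKHPKM")

print(f"original: {original:.2f}s, proposed: {proposed:.2f}s")
```

A per-call slowdown of a couple of seconds, multiplied across thousands of calls a day, is exactly the kind of cost this analytic method can surface before any software is built.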
2.2 Criterion 2: The Ease of Use
2.3 Criterion 3: Reproducible
The last and perhaps the most important criterion for a superior analytic method is for
the process to produce repeatable results. This is similar to having ecological validity
[21] in that the method should be rigorous enough to be used by different people over
different periods of time in different settings and to still be able to produce reliable
repeatable results. Although this seems to be the holy grail of design, it is specifically
the most important one for an analytic method because of the high variability within
the current methods. This may be due to the fact that designers naturally bring in their
own biases and expectations into analytic design. For example, usability analysis is
the analytic method where the operation of a created computer system is tested and
problems with the software (example: the user is not able to find a box to click) are
reported and fixed. Rolf Molich, Meghan Ede, Klaus Kaasgaard, and Barbara
Karyukin showed that evaluating the same software at nine different organizations
resulted in 310 different usability problems being reported; however, only two of those
problems were the same [Molich]. Although this example is an extreme case, it shows
the high variability that can exist in using analytic methods, and because of this, an
analytic method that is able to produce repeatable results will be a superior one. (For
more information on variability see [15, 16].)
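The variability at issue can be sketched as a simple set-overlap calculation. The team names and problem IDs below are hypothetical; the real study compared free-text problem descriptions, which is far harder to match up:

```python
# Hypothetical illustration of the evaluator effect reported by Molich and
# colleagues: each team reports a set of usability problems, and we measure
# how little the sets overlap. The problem IDs are invented.
teams = {
    "team A": {"p1", "p2", "p7", "p9"},
    "team B": {"p2", "p3", "p4"},
    "team C": {"p2", "p5", "p6", "p8"},
}

all_reported = set().union(*teams.values())
shared_by_all = set.intersection(*teams.values())

print(f"{len(all_reported)} distinct problems reported, "
      f"{len(shared_by_all)} found by every team")
```

A reproducible analytic method would drive the shared set toward the full set; the wide gap between the two is what the criterion in this subsection is meant to close.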
3 The Evolution of Analytic and Design Methods
Given the criteria above, which design method should a designer choose? Is the use
of an analytic model inherently what makes a design method good? Clearly, selecting
a design method at random will not do, because the chosen methodology will likely
have the wrong focus. To explore all these issues, a look into the past is the best way
to investigate how these questions and distinctions have arisen and what answers
they can lead us to for the future of design: an analytic method that meets the criteria
and has the right focus.
3.1 The History of the Analytic Method and Design
All design methods are ultimately based on an analytic method. Design methods may
have analytic components of their own, as seen in the examples below, but this is not
necessary. For example, the Commercial Off-The-Shelf (COTS) method of design is
considered an empirical design method [2], but it should be recognized that COTS
design was likely founded on an analytic method. A second example would be to
consider an analytic method such as Pattern Languages [4]. Pattern languages are
similar to COTS in that the designer can create a design using patterns that have been
defined before; however, to create those patterns in the first place an analysis would
have to have been performed to quantify the patterns in existence. Thus, it seems, all
design methods being employed today, whether empirical, analytical, or a mixture of
both, can trace their lineage to one or more analytic methods. It must also be
recognized that this traceability does not automatically mean that the design
method(s) will produce good design results. The focus, or the paradigm, upon which
the design method was originally established is the main criterion for whether or not
the design method will be successful in today’s computing environment.
The first design method to be widely accepted within the computing industry was the
Waterfall Model. During this period of design evolution the hardware platform was
considered to be augmented by the software. The software merely helped the
hardware get its job done. So this model was based upon the idea that the computer
hardware was the locus of attention and computation, instead of the human being.
Within this method, the designer was guided through the steps of how to gather
requirements, derive requirements, and then construct software in an iterative fashion.
Although this method was widely adopted, it later drew significant criticism for its
design limitations, which resulted in an eventual paradigm shift within the software
engineering community. Nevertheless, the Waterfall Model is
important as it was the first to incorporate an analytic phase into the design process.
In the analysis phase the designer would create a requirements document that would
specify the needs and costs of creating a piece of software. Even with the
incorporation of an analytic phase, however, the indiscriminate use of a method
whose analysis may be focused in the wrong place is one of the main factors behind
this model's failure to apply to all classes of software design problems.
Hence, a second paradigm shift arose where the focus of the resulting design was on
the human user and not computer hardware. It is generally agreed that this shift was
mostly due to the influx of personal computers into the academic and business
environments. This shift resulted in design methodologies such as the Spiral Model
and Evolutionary Prototyping. What these new methods had in common was that they
incorporated the input/output behaviors of the human users more closely as the
computer systems themselves were designed. There has also been a gradual trend
toward involving the user more and more in the design process.
It is important to note that this paradigm may also have started to fail. It has been
postulated by Andy Crabtree [8, 9] that the reason for this failure is its reliance on
formal methodologies to guide the designer: “The problem, then, is that the
understanding of work generated through the use of formal design methods tend to be
very abstract, focusing more on what should be done rather than on what is actually
needed to be done in a particular work setting”. Formal methods of design [22] were
revolutionary in their time because there was a need for accurate, shareable, and
traceable information in design teams [28]. Here, however, history may be
repeating itself. By not using an analytic method properly, as earlier with the Waterfall
Model, the design methods from the second paradigm were, and are still, failing to
produce good repeatable results. Evidence of this can be seen with the increase in new
design methods, especially in HCI, such as those listed above, an example being
Usability Engineering. Specific methodologies such as participatory design [6] are
starting to become used more frequently in design. This has led many designers, such
as Crabtree, Button, and Dourish [5, 8] to conjecture that we are on the cusp of a third
paradigm shift within the HCI design field.
In summary, software design methods have shifted their focus from the computer
hardware, to the human user, and are still failing to produce good software design
solutions for all types of design problems. The proposed new paradigm is focused on
two changes: first, the work context, and second, a much greater use of an analytic
model for design.
3.2 A Third Paradigm Shift
The beginning of the third paradigm shift started with the publishing of Lucy
Suchman’s work on plans and situated action [30]. She proposed a reinvention of the
understanding of plans and human problem-solving, one that involved situated
action: i.e., plans are not executed into actions; instead, actions are guided by
moment-by-moment responses to “the rational accountability of action” [31]. She
showed how the previous, sterile understanding of how plans are executed was
unrealistic when compared to the reality of the chaotic and sinuous circumstances that
users of computer systems really work within. With this new understanding, along
with a change in technology to focus on social and tangible computing, the field of
design began to focus more upon process instead of product (to use Andy Crabtree’s
terms [8]). Other terms used in this third paradigm are ‘work’ versus ‘system’, where
‘process’ and ‘work’ denote analytic approaches to design whose focus is on the work
context (i.e., the user and the work) instead of on the user alone. (For more
information on the above see [13].)
Crabtree argues that there are two forms of design: the old (product-oriented)
methods of the first and second paradigms, and the new (process-oriented) methods
of the third. Crabtree also argues that although product cannot be disregarded, it
should not be the ruling paradigm for how to design. The product-oriented view of
design, in essence, is rooted
in the belief that software is a product that stands on its own; that is, that the
context of use is well understood and that all requirements can be elicited before the
creation of the software: “The productoriented perspective then, ignores or is
insensitive to, the interactional dynamics of analysis and design. That is, to the human
process of learning and communication through which design solutions are produced
in the collaborations and cooperative work of the parties to system development” [8].
Table 1 highlights some important factors distinguishing the two orientations to
design. Here Crabtree uses the term ‘referent system’ to mean the work setting that
includes the people and the work being done there.
Table 1. The Product- vs. Process-Oriented view of how to design for referent systems.

Product-Oriented View:
1. The referent system is chosen with a view towards developing the software system.
2. The referent system is described in terms of information processing.
3. The software developer is outside the referent system.
4. The interaction between software and its environment is predefined as part of the development.

Process-Oriented View:
1. The referent system is chosen with a view towards designing the work process supported by the software system.
2. The referent system is described in terms of work and its processes.
3. The software developer becomes a part of the referent system in the process of coming to understand the use context.
4. The interaction between software and its environment is tailored by users to the actual needs arising from their work.
The new paradigm emerges with a process-oriented view that focuses on how to work
with, and within, the referent system [14]. The product-oriented view cannot be
discarded completely, however, because some level of design decomposition must
occur in order to understand the referent system. This means that an indivisible
combination of the two should be used, with the process-oriented view leading the
design approach. This call for a new design methodology is one
that seeks to merge the analytic method with work context/referent system [26]; i.e., to
merge technical design with social understandings. Dourish proposes that “the most
fruitful place to forge these relationships is at a foundational level, one that attempts to
take sociological insights into the process and fabric of design” [13]. Wendy Mackay
and Anne-Laure Fayard propose a triangulation across disciplines that merges natural
and social science with engineering, design, and the fine arts. By drawing on all
three, Mackay and Fayard believe, the best possible design method can be created
[19]. It should be noted that this background of social and tangible computing
provided thus far is not enough to guide us successfully with this merging, because
even with computing involved in every physical medium, the systems that are created
will remain reductionist until the design principles for internal representation are
changed. The next section will discuss one proposed design methodology that
attempts to focus on work context and fits the above criteria.
4 Technomethodology
Technomethodology is one attempt to merge the social with the technical. It is based
on the theory of ethnomethodology and resulted from a call to the HCI community to
turn towards the area of the social sciences [5]. Ethnomethodology, in reference to
Technomethodology, is the method of detailing what actions and interactions occur
during the work process. It is accomplished by doing field work investigations, where
the designers move into the work setting to understand the organization from the
“inside”, and/or by gaining insight into the organization of activities and interactions
within the workplace by other means. By incorporating ethnomethodology into the
design of technological systems, a hybrid design methodology has arisen:
technomethodology. Technomethodology is able to address Suchman’s and Dourish’s
call for a change in the conceptual model and internal representations within design. It
is able to do this because, as Dourish says, it is “an approach that deals not so much
with this technology or that form of work, but rather more generally with interactive
technology per se and the generally operative social process that underpins any
sociological account of behavior.” Thus, by moving away from the rules that guide
formal methods, Technomethodology is able to ground itself fully into the work
context in two ways: (1) by not relying only on observations from the work setting,
but by using ethnomethodology’s insight that the organization of action is
improvised, spontaneous, and natural; and (2) by relating the understandings gained
in (1) to fundamental principles of design, such as abstraction, function, substitution,
identity, and representation, instead of to a specific system or setting [12, 9].
Thus, with technomethodology, sociological understanding has been forged into the
core of software design principles. Technomethodology generally meets the criteria
listed in Section Two. It scores well on the cost-benefit criterion because almost any
design emanating from it should have the correct focus, and thus inherently be a
better design. However, since it is still a new analytic method, the users and the
results of technomethodology have not yet been widely tested. Therefore, it is
premature to predict whether or not it will be easy to use at any level of design.
Ethnomethodology, on the other hand, has been fairly successful at producing
reproducible results (see examples in [13] and [5]). As Technomethodology is
founded on ethnomethodology and is an extension of it that refocuses design
methods, it can be argued that criteria two and three are currently being met by
ethnomethodology and can thus be extended to technomethodology. To summarize,
Technomethodology is one attempt at merging the social sciences with design by
understanding moment-by-moment human actions and reactions and by relating those
understandings to the foundational principles of design. Therefore, with the correct
focus for design, the work context, and by meeting the criteria from Section Two,
Technomethodology is a good first step toward creating a superior design.
5 Conclusions
Analytic and design methods are areas of HCI research that continue to evolve at a
rapid rate. By remodeling design with the criteria proposed in Section Two and with
the focus proposed in Section Three, a whole new era of design is starting to emerge.
Technomethodology is one attempt at promoting this evolution, and although
it has its critics, it provides a strong starting position. The important point of design
methodologies is to continuously question what makes a good design and to try and
find the method that matches those criteria. As computing changes and evolves it is
important to reevaluate what is the focus of the design method. It has evolved from the
computer, to the user, and hopefully, now to the work context. We should expect more
paradigm shifts as the work is performed. What is needed is the ability to change and
to evolve existing methodologies into more useful and usable theories. This has been
done recently by integrating the analytic method into the design process to the point
where, in some cases, the analytic method has become synonymous with the design
method. The power of the analytic method is its ability to inspire and initiate
design methods and with the right focus there is no foreseeable end to the science of
design.
References
1. Bertelsen, O. W., Bødker, S.: Activity Theory. In: HCI Models, Theories, and Frameworks.
Carroll, J. M. (Eds). Morgan Kaufmann Publishers, San Francisco (2003) 291 – 324
2. Bianchi, A., Caivano, D., Conradi, R., Jaccheri, L., Torchiano, M., Visaggio, G.: Chapter 13:
COTS Products Characterization: Proposal and Empirical Assessment. In: Empirical
Methods and Studies in Software Engineering – Experiences from ESERNET Project,
Conradi, R., Inge, A. (Eds). Springer-Verlag LNCS, (2003) 20
3. Blackwell, A., Green, T.: Notational Systems – The Cognitive Dimensions of Notations
Framework. In: HCI Models, Theories, and Frameworks, Carroll, J. M. (Eds). Morgan
Kaufmann Publishers, San Francisco (2003) 103 – 133
4. Borchers, J.: A Pattern Approach to Interaction Design. In: ACM DIS 2002 – Designing
Interactive Systems. ACM, (2002) 369 – 378
5. Button, G., Dourish, P.: Technomethodology: Paradoxes and Possibilities. In: Proceedings of
the 1996 Conference on Human Factors in Computing Systems. ACM, Vancouver Canada
(1996) 19 – 26
6. Carroll, J. M.: Dimensions of Participation: Elaborating Herbert H. Simon’s “Science of
Design”. In: Les Sciences de la Conception (Science of Design), The International
Conference in Honour of Herbert Simon, Jacques Perrin (Eds). INSA de Lyon, (2002)
7. Carroll, J. M.: Making Use: Scenario-Based Design of Human-Computer Interactions. MIT
Press (2000)
8. Crabtree, A.: Designing Collaborative Systems: A Practical Guide to Ethnography.
Springer-Verlag, London (2003)
9. Crabtree, A.: Taking Technomethodology Seriously: Hybrid Change in the
Ethnomethodology-Design Relationship. In: European Journal of Information Systems, Vol.
13. (2004) 195 – 209
10. Dictionary.com: ‘analytic’. January 5th, 2004
11. Dix, A.: Upside-Down A’s and Algorithms – Computational Formalism and Theory. In: HCI
Models, Theories, and Frameworks. Carroll, J. M. (Eds). Morgan Kaufmann Publishers, San
Francisco (2003) 381 – 430
12. Dourish, P., Button, G.: On “Technomethodology”: Foundational Relationships between
Ethnomethodology and System Design. In: Human Computer Interaction. (1998) 13(4)
395 – 432
13. Dourish, P.: Where the Action Is: The Foundations of Embodied Interaction. MIT Press,
Cambridge Massachusetts (2001)
14. Grudin, J., Grinter, R.: Ethnography and Design. In: Computer-Supported Cooperative
Work. (1995) 3(1) 55 – 59
15. Hartson, R., Andre, T., Williges, R.: Criteria for Evaluating Usability Evaluation Methods.
In: International Journal of Human-Computer Interaction. Lawrence Erlbaum Associates,
(2001) 13(4) 373 – 410
16. Hix, D., Hartson, R.: Developing User Interfaces: Ensuring Usability Through Product &
Process. Wiley, USA (1993)
17. John, B. E., Kieras, D. E.: Using GOMS for User Interface Design and Evaluation: Which
Technique? In: ACM Transactions on Computer-Human Interaction. ACM, (1996) 3(4)
287 – 319
18. Keith, S., Blandford, A., Fields, R., Theng, Y. L.: An Investigation into the application of
Claims Analysis to Evaluate Usability of a Digital Library Interface. In: Proc. JCDL
Workshop on Usability, Blandford, A., Buchanan G. (Eds). (2002)
19. Mackay, W. E., Fayard, A. L.: HCI, Natural Science and Design: A Framework for
Triangulation Across Disciplines. In: ACM DIS’97 – Proceedings of the Conference on
Designing Interactive Systems. ACM, (1997) 223 – 234
20. MacKenzie, S.: Motor Behavior Models for Human-Computer Interaction. In: HCI Models,
Theories, and Frameworks. Carroll, J. M. (Eds). Morgan Kaufmann Publishers, San
Francisco (2003) 27 – 54
21. McGrath, J.: Methodology Matters: Doing Research in the Behavioral and Social Sciences.
In: Readings in Human-Computer Interaction: Toward the Year 2000, Baecker, R., Grudin,
J., Buxton, W., Greenberg, S. (Eds.). Morgan Kaufmann Publishers, San Francisco (1994)
152 – 169
22. Monk, A., Gilbert, N.: Chapter 2 Formal Methods. In: Perspectives on HCI: Diverse
Approaches. Academic Press, London (1995) 9 – 43
23. Nielsen, J.: Usability Inspection Methods. In: Conference Companion CHI’94 (Boston,
MA, 24 – 28 April). ACM, (1994) 413 – 414
24. Nielsen, J., Molich, R.: Heuristic Evaluation of User Interfaces. In: Proc. ACM CHI’90
Conf (Seattle, WA, 1 – 5 April). ACM, (1990) 249 – 256
25. Payne, S., Green, T.: Task-Action Grammars: A Model of the Mental Representation of
Task Languages. Human-Computer Interaction, (1986) 2(2) 93 – 133
26. Pejtersen, A. M., Hertzum, M., Andersen, P. B., Bødker, S.: Bridging the Gap between Field
Studies and Design. In: NordiCHI 2002. ACM, Aarhus Denmark (2002)
27. Pierce, J. R.: An Introduction to Information Theory: Symbols, Signals and Noise. Dover
Publications, New York (1980)
28. Randall, D., Rouncefield, M.: Tutorial 7: Fieldwork for System Design. Computer
Supported Cooperative Work. ACM, Chicago (2004)
29. Rosson, M. B., Carroll, J. M.: Usability Engineering. Morgan Kaufmann Publishers, San
Francisco (2002)
30. Suchman, L.: Plans and Situated Actions: The Problem of Human-Machine
Communication. Cambridge University Press, Cambridge (1987)
31. Suchman, L: Writing and Reading: A Response to Comments on Plans and Situated
Actions. In: The Journal of the Learning Sciences. Lawrence Erlbaum Associates, (2003)
12(2) 299 – 306
32. Sutcliffe, A. G., Carroll, J. M.: Designing Claims for Reuse in Interactive Systems Design.
Int. J. Human-Computer Stud., 50(3) (1999) 213 – 241
33. Winograd, T.: From Programming Environments to Environments for Designing. In:
Communications of the ACM. ACM, (1995) 38(6) 65 – 74