different domain (e.g., land navigation). DS had difficulty understanding the instructional
techniques they witnessed (e.g., discussing minutes of angle in ballistics) as exemplars of a more
general instructional strategy (e.g., explaining the why, encouraging deliberate thought, and
awareness of context dependence of skills). This is not surprising given that exposure to only one
exemplar is not likely to promote generalization and abstraction. Hence, we recommend that
instructor education include applications of OBTE across multiple domains of skill and learning.
It also is clear that additional support can be provided to participants in a field-based train-the-trainer course soon after they complete the course. For instance, elements such as take-home
materials that explore application to different domains could be provided, perhaps employing
multimedia-based vignettes that challenge the instructors to apply lessons learned (Brunyé,
Riccio, Sidman, Darowski, & Diedrich, 2006). In addition, such materials and exercises could be
employed in workshops, following marksmanship training for example, that promote student
application to domains outside of marksmanship (Sidman et al., 2009). The critical point is to
provide support that gives instructors opportunities to apply what they have learned about OBTE
to a variety of domains. Consistent with OBTE, the approach to further instructor education
would be to introduce additional domains that provide instructors with general strategies and
considerations rather than an apparent script for instruction in the new domain. Some general
strategies are reiterated below:
Where possible, instructors should allow Soldiers to solve problems. The emphasis
should be on asking leading questions instead of telling Soldiers the solution. From the
perspective of OBTE, this strategy gradually builds skills of thinking and problem
solving, and reinforces expectations of accountability rather than dependence on their
superiors.
The easiest change with a potential for significant impact is to create an environment that
fosters communication with Soldiers, rather than communication directed at Soldiers or
the absence of communication. From the perspective of OBTE, to the extent that
Soldiers feel that they can ask questions, and make some mistakes, they will come to
better understand combat application, master skills, and grow in confidence.
Where possible, peer coaches should be utilized to overcome the limitations of the instructor-student ratio. Although peer coaches will no doubt provide inappropriate guidance on
occasion, this may be outweighed by the benefit of Soldiers coming to see themselves as
active participants in problem solving and discovery. This is especially true with the high
student-to-instructor ratios that are likely to be the biggest perceived obstacle to full
implementation of OBTE. To the extent that instruction is viewed not only as the transfer
of knowledge, but as an opportunity for collaborative problem solving, then the role of
the peer coach becomes critical.
It is important to note that inculcation of a mindset consistent with OBTE requires that the work
environment be open to and supportive of change (e.g., Bandura, 1995; Burke & Hutchins,
2007; Rasmussen, 1997). As noted by Dean et al. (2009), DS indicated that they felt somewhat
constrained by the nature of the programs of instruction and associated resources. It is likely that
gaps in transfer of OBTE were rooted, at least in part, in perceived constraints on rifle
marksmanship training and the extent to which the DS believed they could deviate from the
common practices. A key challenge, therefore, is the extent to which programs of instruction can
be made more flexible and DS can be empowered to believe that they can take initiative to
more fully implement OBTE. Hence, there is a critical need to educate commanders and
supporting units about the value of OBTE as a service system. Without a command climate that
fosters agility in instructional units, an approach like OBTE is not likely to be successful
(Appendix C; Haskins, 2009; Schwitters, 2009).
11.2.3 Further Verification and Validation of OBTE
There is a broad range of best practices for peer review, verification, and validation in Service
System Development (CMMI Product Team, 2009, pp. 454-462). As in the current investigation,
it is important to employ a multifaceted and multidisciplinary approach to these activities to avoid
sub-optimized solutions and unintended consequences. Consideration of a portfolio of
methodologies, and even some exploration with alternative methods, also helps an organization
find the right level of effort and detail for its process improvement (Garcia & Turner, 2006). In
the present context, a specific aim was to identify ways in which the behavioral and social
sciences can provide guidance to verification and validation of an instructional service system.
There are two general characteristics of our approach in this respect: (a) develop grounded theory
for OBTE; and (b) promulgate scientific inquiry into OBTE as a process over time conducted
within a diverse community of practice.
In our view, identification and development of theoretical foundations is critically important in
service system development because of the diversity of methodologies that are available and that
may have to be employed for verification and validation. Theoretical coherence arguably is the
only way to identify convergence among otherwise incommensurate sources of evidence. From
the outset, we intentionally included in the research team a diversity of subdisciplines and
theoretical commitments, mostly within the discipline of scientific psychology. The grounded
theory that emerged in this investigation came to be dominated by theoretical commitments allied
with the interrelated lineages of social learning theory, situated cognition, motivation and
emotion, ecological psychology, and dynamical systems theory in the social and behavioral
sciences. We are explicit about this bias wherever appropriate. More generally, we believe that
reflection and candor about theoretical biases should be a characteristic of systems engineering
applied to the integration and development of capabilities that have an impact on behavioral and
social phenomena (Flyvbjerg, 2001; Godfrey-Smith, 2003; Schram & Caterino, 2006; see also,
Epilogue in this monograph).
We believe that the value of a theoretical commitment (e.g., coherence and directedness)
outweighs the potential disadvantage of narrowed vigilance in the development of grounded
theory. At the same time, we recognize that involvement of a broader scientific community can
mitigate the potential problem of narrowness. This is important beyond the value of debate,
skepticism, and alternative sources of evidence. It emphasizes an aspect of science that is
consistent with the intent of peer review, verification, and validation in systems engineering.
Scientific and technical understanding becomes more refined and elaborate over time with the
accumulation of evidence. One should avoid presumptive judgments based on point estimates
and isolated comparisons with questionable generalizability in domains as broad as training and
education or in specific applications that are replete with uncontrollable sources of variance.
In this regard, there is much to be learned from other evidence-based service systems and
associated scientific disciplines about how to use and how not to use scientific evidence in social
decision-making (American Association for the Advancement of Science, 1993/2009; Bailar,
1997; Best, Trochim, Haggerty, Moor, & Norman, 2008; Foster & Huber, 1998; Glasziou &
Haynes, 2005; Kohn, Corrigan, & Donaldson, 1999; Mislevy & Riconscente, 2006; National
Research Council, 2009; Pellegrino, Chudowsky, & Glaser, 2001; Sackett, Rosenberg, Gray,
Haynes, & Richardson, 1996; Swales, 2000). A socially aware and scientifically based approach
to verification and validation should strive for theoretical coherence, juxtaposition of
complementary and opposing perspectives, traceability to programs of research that span decades,
empirical evidence from multiple methods that are replicable, and consideration of sources of
variability and uniqueness in empirical findings to ensure that conclusions are credible,
transferable, dependable, and confirmable (cf., Denzin & Lincoln, 2003; Foucault, 1966/2002;
Godfrey-Smith, 2003; Kuhn, 1962/1970; Popper, 1959). It should not be limited by narrow
conceptions of hypothesis testing (cf., Flyvbjerg, 2001; Henkel & Morrison, 1970/2006; Kline,
2004; A. Ryan, 1959; Schram & Caterino, 2006).
With respect to verification and validation, as well as scientific influence on these endeavors, an
important contribution of this investigation is to stress the importance of productive dialog about
OBTE within a diverse community of stakeholders. The intent of this dialog should not be to
prescribe the use of particular instructional methods or techniques but to provide some scientific
guidance about the most fruitful topics of conversation and innovation by instructors (cf., James,
1899/1907, pp. 7-11). In particular, we believe this investigation can stimulate productive dialog
because of the definition of OBTE in terms of instructor behavior and instructor-student
interactions and, more specifically, because of the measures of instructor and student behavior
that enable OBTE to be verifiable. They provide one topic of conversation for forums in which
there is peer-to-peer sharing of information about best practice in training and education (e.g.,
Costanza, Leibrecht, Cooper, & Sanders, 2009). In the context of continuous verification and
validation, we believe such conversations should include scientists as well as instructors,
instructional designers, course developers, quality assurance personnel, and commanders in
instructional programs. The most radical departure would be to facilitate contributions from
stakeholders in theater (Riccio, d'Echert, et al., 2006; Riccio, Lerario, et al., 2006).
There is dialog and debate about OBTE in a diverse community of stakeholders, and it appears to
be growing (AWG, 2009). Figure 6 suggests a challenge in achieving efficient sharing of issues
and lessons learned about OBTE. Stakeholders are widely dispersed. Decentralized collaboration
and the resulting lessons learned thus are not readily apparent in a timely fashion to decision
makers. Potential solutions to this problem are emerging in the nascent Army Training Network
(ATN) that builds on the recently revised Army Field Manual, FM 7-0 Training for Full
Spectrum Operations, and transforms FM 7-1 into a Virtual Field Manual (Davis, 2009). One
use of this forum would be for continuous peer review that is central to verification and validation
in CMMI Service System Development (CMMI Product Team, 2009). A scientific approach to
verification and validation can help ATN establish a topic of peer-to-peer discussion that is more
likely to be on point, efficient, systematic, and actionable for OBTE in particular and for good
training in general.
Figure 6: Needs for distributed peer-to-peer collaboration about OBTE (after Devens, 2009)
11.3 References
American Association for the Advancement of Science (2009). Benchmarks for scientific literacy.
New York: Oxford University Press. Retrieved April, 2009, from American Association for
the Advancement of Science, Project 2061, http://tinyurl.com/c9mrcr. (Original work
published 1993)
Asymmetric Warfare Group (2009). U.S. Army Asymmetric Warfare Group workshop on
Outcomes-based Training and Education. Applied Physics Laboratory, Johns Hopkins
University, Laurel, MD.
Bailar, J. (1997). The promise and problems of meta-analysis. New England Journal of Medicine,
337(8), 559-561.
Bandura, A. (Ed.) (1995). Self-efficacy in changing societies. Cambridge, UK: Cambridge
University Press.
Best, A., Trochim, W., Haggerty, J., Moor, G., & Norman, C. (2008). Systems thinking for
knowledge integration: New models for policy-research collaboration. In L. McKee, E.
Ferlie & P. Hyde (Eds.), Organizing and reorganizing: Power and change in health care
organizations (pp. 154-166). New York, NY: Palgrave Macmillan.
Bransford, J. D., Brown, A. L., & Cocking, R. R. (2000). How people learn: Brain, mind,
experience, & school. Washington, DC: National Academy Press.
Brunyé, T., Riccio, G., Sidman, J., Darowski, A., & Diedrich, F. (2006). Enhancing warrior ethos
in initial entry training. Proceedings of the 50th Annual Meeting of the Human Factors and
Ergonomics Society, San Francisco, CA.
Burke, L.A., & Hutchins, H.M. (2007). Training transfer: An integrative literature review. Human
Resource Development Review, 6, 263-296.
CMMI Product Team (2009). CMMI for services, version 1.2 (CMU/SEI-TR-2009-001; ESC-TR-2009-001). Pittsburgh, PA: Carnegie Mellon University.
Cornell-d'Echert, B. (2009a). An introduction to outcomes-based training and education. Fort
Meade, MD: Asymmetric Warfare Group.
Cornell-d'Echert, B. (2009b). Outcomes-based training and education: Implementation guide.
Fort Meade, MD: Asymmetric Warfare Group.
Costanza, M., Leibrecht, B., Cooper, W., & Sanders, W. (2009). Peer-to-peer training
facilitator's guide (ARI Research Product, in press). Alexandria, VA: U.S. Army Research
Institute for the Behavioral and Social Sciences.
Darwin, M. (2008). Outcomes-based training and education: fostering adaptability in full
spectrum operations (Briefing, December 2008). Fort Meade, MD: Asymmetric Warfare
Group.
Davis, J. (2008, January 21). Call for Army Training Network (ATN) training products. Retrieved
March, 2009 from US Army Combined Arms Center, Blog Network, Army Training
Network, http://tinyurl.com/alxknw.
Dean, C., Diedrich, F., Artis, S., Horn, Z., Jefferson, T., & Riccio, G. (2008). Outcomes-based
training and education: Student measures for rifle marksmanship. Report to the Asymmetric
Warfare Group (Contract W9113M-06-D-0005). Vienna, VA: Wexford-CACI, Inc.
Devens, M. (2009, March). Introductory remarks. Presentation at the US Army Asymmetric
Warfare Group workshop on Outcomes-based Training and Education, Applied Physics
Laboratory, Johns Hopkins University, Laurel, MD.
Deci, E.L. & Ryan, R.M. (1985). Intrinsic motivation and self-determination in human behavior.
New York: Plenum.
Deci, E.L. & Ryan, R.M. (2008). Self-determination theory: A macrotheory of human motivation,
development, and health. Canadian Psychology, 49, 182-185.
Flyvbjerg, B. (2001). Making social science matter: Why social inquiry fails and how it can
succeed again. Cambridge, UK: Cambridge University.
Foster, K., & Huber, P. (1998). Judging science: Scientific knowledge and the federal courts.
Cambridge, MA: MIT.
Foucault, M. (2002). The order of things: An archaeology of the human sciences. New York:
Routledge Classics. (Original work published 1966).
Garcia, S., & Turner, R. (2006). CMMI(R) survival guide: Just enough process improvement.
Upper Saddle River, NJ: Addison-Wesley.
Glasziou, P., & Haynes, B. (2005). The paths from research to improved health outcomes. ACP
Journal Club, 142(2), A8-10.
Godfrey-Smith, P. (2003). Theory and reality. Chicago, IL: University of Chicago Press.
Haskins, C. (2009, March). Development of outcomes based training. Presentation at the US
Army Asymmetric Warfare Group workshop on Outcomes-based Training and Education,
Applied Physics Laboratory, Johns Hopkins University, Laurel MD.
Henkel, R. & Morrison, D. (Eds.) (2006). The significance test controversy: A reader.
Piscataway, NJ: Aldine Transaction. (Original work published 1970)
James, W. (1907). Talks to teachers on psychology: And to students on some of life's ideals.
New York, NY: Henry Holt and Company. (Original work published 1899)
Kline, R.B. (2004). Beyond significance testing: Reforming data analysis methods in behavioral
research. Washington, DC: APA.
Kohn, L., Corrigan, J., & Donaldson, M. (Eds.) (1999). To err is human: Building a safer health
system. Washington, DC: National Academy Press.
Kuhn, T. (1970). The structure of scientific revolutions. Chicago: University of Chicago
Press. (Original work published 1962)
Mislevy, R. J., & Riconscente, M. M. (2006). Evidence-centered assessment design: Layers,
concepts, and terminology. In S. Downing & T. Haladyna (Eds.), Handbook of Test
Development (pp. 61-90). Mahwah, NJ: Erlbaum.
National Research Council (2009). Strengthening forensic science in the United States: A path
forward. Retrieved February, 2009 from National Academies Press,
http://www.nap.edu/catalog/12589.html.
Pellegrino, J., Chudowsky, N., & Glaser, R. (Eds.) (2001). Knowing what students know: The
science and design of educational assessment. Washington, DC: National Academy Press.
Popper, K. (1959). The logic of scientific discovery. New York: Basic Books.
Rasmussen, J. (1997). Risk management in a dynamic society: a modeling problem. Safety
Science, 27, 183-213.
Riccio, G., d'Echert, B.C., Lerario, M., Pound, D., Brunyé, T., & Diedrich, F. (2006). Enhancing
Joint Task Force cognitive leadership skills. Report to the Army Research Institute for the
Behavioral and Social Sciences, contract number W74V8H-06-P-0186. Vienna, VA: The
Wexford Group International.
Riccio, G., Lerario, M., Cornell-d'Echert, B., Pound, D., Brunyé, T., & Diedrich, F. (2006).
Training a Joint and expeditionary mindset. Report to the Army Research Institute for the
Behavioral and Social Sciences, contract number W74V8H-06-P-0189. Vienna, VA: The
Wexford Group International.
Ryan, T. (1959). Multiple comparisons in psychological research. Psychological Bulletin, 56, 26-47.
Sackett, D.L., Rosenberg, W.M., Gray, J.A., Haynes, R.B., & Richardson, W.S. (1996). Evidence
based medicine: What it is and what it isn't. British Medical Journal, 312(7023), 71-72.
Schwartz, D.L., Lin, X., Brophy, S., & Bransford, J.D. (1999). Toward the development of
flexibly adaptive instructional designs. In C.M. Reigeluth (Ed.), Instructional design theories
and models: Volume II. Hillsdale, NJ: Erlbaum.
Schwitters, J. (2009, March). Command imperative for change. Presentation at the US Army
Asymmetric Warfare Group workshop on Outcomes-based Training and Education, Applied
Physics Laboratory, Johns Hopkins University, Laurel, MD.
Schram, S., & Caterino, B. (2006). Making political science matter: Debating knowledge,
research, and method. New York: New York University.
Sidman, J., Riccio, G., Semmens, R., Geyer, A., Dean, C., & Diedrich, F. (2009). Reshaping
Army institutional training: Current training. Final report to the Army Research Institute
for the Behavioral and Social Sciences, contract number W74V8H-04-D-0047 DO 0010.
Swain, R. (2005). Changes in instructional system design (ISD): Improving training product
delivery to United States Army Soldiers. USAWC Strategic Research Project. Carlisle
Barracks, PA: U.S. Army War College.
Swales, J. (2000). The troublesome search for evidence: Three cultures in need of integration.
Journal of the Royal Society of Medicine, 93, 402-407.
Tobias, S. & Duffy, T. (Eds.) (2009). Constructivist instruction: Success or failure? New York:
Routledge.
TABLE OF CONTENTS
page
Prologue: A Programmatic View of the Inquiry into Outcomes-Based Training & Education.......1
Historicity of our Research on OBTE ..........................................................................................1
The Approach and Lessons Learned from the Research..............................................................3
Documentation of the Research ...................................................................................................4
Section I. Development of Stakeholder Requirements for OBTE..............................................6
Chapter 1. Preparation for Full Spectrum Operations ......................................................................7
1.1 Requirements of Full Spectrum Operations ...........................................................................8
1.2 Outcomes-Based Training and Education (OBTE)..............................................................10
1.2.1 Exemplar of OBTE: Combat Applications Training Course........................................11
1.2.2 OBTE as a Multifaceted Instructional System .............................................................12
1.3 An Appraisal of Instruction with Respect to OBTE ............................................................13
1.3.1 A Systems Engineering Framework for Integration and Development of OBTE ........13
1.3.2 Preparation for Validation and Verification .................................................................14
1.4 References ............................................................................................................................17
Chapter 2. Formative Measures for Instructors ..............................................................................20
2.1 Development of Formative Measures ..................................................................................20
2.1.1 The COMPASS Methodology ......................................................................................20
2.1.2 Development of Measures for OBTE ...........................................................................21
2.2 Description of Formative Measures .....................................................................................21
2.2.1 Results of the COMPASS Process................................................................................21
2.2.2 Elaboration on the Description of Measures.................................................................23
2.3 OBTE Performance Measures: Planning for Training.........................................................23
2.3.1 Define Outcomes ..........................................................................................................23
2.3.2 Create a Positive Learning Environment ......................................................................25
2.3.3 Create the Parameters of Learning................................................................................27
2.4 OBTE Performance Indicators: Training Execution............................................................28
2.4.1 Communicate the Parameters of Learning....................................................................28
2.4.2 Training Emphasizes Broad Combat or Mission Success ............................................29
2.4.3 Customize Instruction When Possible Based on Constraints/Conditions ....................31
2.4.4 Facilitates Learning of Concepts ..................................................................................32
2.4.5 Creates a positive learning environment.......................................................................34
2.4.6 Instructors Utilize Measures of Effectiveness & Self-Evaluation ................................36
2.4.7 Uses scenarios to facilitate learning..............................................................................38
2.4.8 Instructors exhibit intangible attributes in own actions ................................................40
2.4.9 Hotwashes and Mini-AAR............................................................................................42
2.5 Uses of the Measures ...........................................................................................................43
2.5.1 Formative Measures for Instructors ..............................................................................44
2.5.2 Quality Assurance and Instructor Education ................................................................44
2.5.3 Continuous Improvement of Assessments....................................................................45
2.5.4 Program Evaluation and Organizational Change..........................................................46
2.6 References ............................................................................................................................46
Chapter 15. Five ways OBTE can enable the Army Leader Development Strategy....................242
15.1 Background ......................................................................................................................242
15.2 An Emerging Consensus ..................................................................................................244
15.2.1 What Part to Balance?...............................................................................................244
15.2.2 Improving Training, by Design ................................................................................245
15.2.3 Increased Use of dL and Dependence on Self-Development ...................................246
15.2.4 Future Orientation, Unknown Requirements............................................................247
15.2.5 The Quality Instructor Challenge .............................................................................247
15.2.6 Purpose and Design are Key .....................................................................................248
15.2.7 A Natural Advantage ................................................................................................249
15.2.8 Task Specialization or Generalized Competency .....................................................249
15.3 Conclusion........................................................................................................................251
15.4 References ........................................................................................................................252
Epilogue. Integration of Leader Development, Education, Training, and Self-Development .....254
Toward Values-Based Standards for Army Doctrinal Requirements ......................................254
Nested Standards and Quality Assurance.................................................................................256
Needs and Opportunities for Staff & Faculty Development ....................................................259
A Role for Science and Measurement .................................................................................259
Toward Best Practices in Instructor Education....................................................................260
Critical Considerations for Further Scientific Investigation ....................................................263
The Necessity of Long-Term Studies ..................................................................................263
False Dichotomy of Objective-Subjective ...........................................................................264
Clarity About What Is Evaluated.........................................................................................265
Next Steps ............................................................................................................................266
References ................................................................................................................................268
Section IV. Appendices...............................................................................................................270
Appendix A. OBTE Principles & Practices: Instructor Measures................................................271
A.1 Genesis of Formative Measures for Instructors ................................................................271
A.2 Principles of Outcomes-Based Training & Education ......................................................272
A.3 Guide to Using Measures of Instructor Behavior..............................................................276
A.4 Complete Menu of Instructor Measures............................................................................279
Appendix B. OBTE Principles & Practices: Student Measures ...................................................318
B.1 Guide to Using Measures of Student Behavior .................................................................318
B.2 Complete Menu of Student Measures ...............................................................................319
Appendix C: A Commander's View of Outcomes-Based Training and Education .....................340
Summary ..................................................................................................................................340
Definition .............................................................................................................................340
Description...........................................................................................................................340
Elements of OBTE. ..................................................................................................................341
Developing the Outcomes....................................................................................................341
Developing the Training Plan ..............................................................................................341
Conducting Training ............................................................................................................342
How Training is Assessed....................................................................................................344
Conclusion................................................................................................................................344