

Guidelines for Conducting a High-Quality Mixed-Methods Dissertation

by

Rebecca Glover-Kudon, PhD


University Health Center
The University of Georgia

for

American Evaluation Association Annual Conference

November 11, 2010

Your comments on this document are welcome and invited!

Is the information useful?
Is the information clearly presented?
What’s missing that would be helpful for students/faculty?

Please submit any comments, suggestions, questions, or corrections to the author: rebglover@yahoo.com.

About this document

Using a “frequently asked questions” format, this paper summarizes literature on essential components of
mixed-methods research. Graduate students and faculty mentors may find this document useful in
establishing focused dialogue around mutual expectations for theses and dissertations, possibly including
assessment criteria. The author assumes acceptance of mixed-methods research as a legitimate form of
inquiry and, therefore, avoids discussion of epistemological differences between quantitative and qualitative
research traditions.

About the Author

Rebecca Glover-Kudon holds a doctoral degree in Health Promotion and Behavior with a Graduate
Certificate in Interdisciplinary Qualitative Studies. For her dissertation, she conducted a mixed-methods
study with a sequential explanatory design. An evaluator with fifteen years’ experience managing and
conducting program evaluation projects for local and national health organizations, Dr. Glover-Kudon is
currently involved with evaluation projects benefiting UGA’s University Health Center and the Centers for
Disease Control and Prevention.

Acknowledgements

The idea for this paper germinated during graduate study in the Interdisciplinary Qualitative Studies
Department (QUAL) of the University of Georgia. Special thanks go to Drs. Trisha Reeves, Jodi Kaufman,
Jude Preissle, and Kathy Roulston for nurturing novice qualitative researchers and being so generous with
their time and intellectual energies.

Abstract

Mixed-methods research has emerged as a distinct methodology. As a result, graduate students in the
evaluation field are increasingly interested in learning how to combine quantitative and qualitative methods
in a single study. Because faculty mentors have typically specialized in either quantitative or qualitative
methods, they may lack experience in conducting mixed-methods research and feel reluctant to guide
students’ mixed-methods endeavors. This paper summarizes the literature on how to conduct and produce
high-quality mixed-methods research and suggests criteria for assessment. Specifically, the paper details
core elements of mixed-methods research proposals including study design and procedures, explicit
rationale for data mixing, and standard notation for data prioritization, sequencing, and integration.
Features and challenges of various designs are also discussed. The intended audience for this paper and
presentation is graduate students contemplating or conducting mixed-methods studies and the faculty
members who advise them.

What is mixed-methods research?

Mixed-methods1 is a research paradigm characterized by the use of both quantitative and qualitative
methods in a single study (Creswell, Fetters, & Ivankova, 2004; Creswell, 2003; Morse, 2003). An
essential requirement of mixed-methods research is that different forms of data are “integrated, related, or
mixed at some stage of the research process” (Creswell et al., 2004, p. 7). Rooted in pragmatism (Johnson
& Onwuegbuzie, 2004; Maxcy, 2003), mixed-methods research involves overcoming shortcomings of one
method with the strengths of the other. Therefore, a central tenet of mixed-methods research is that sole
use of one method will not adequately address a given research problem. Both methods, in complement to
each other, are viewed as necessary to achieve more comprehensive understanding.

How is mixed-methods research different from more traditional approaches?

While mixed-methods research has many similarities to traditional research approaches, it involves the
additional steps of identifying a theoretical lens that influences methodological choices (e.g., advocacy,
feminism) (Crotty, 1998), deciding whether the two types of data have equivalency or whether one type has
dominance, specifying how data collection will occur (i.e., sequentially or concurrently), and deciding when
and where data integration will occur (Hanson, Creswell, Plano Clark, Petska, & Creswell, 2005).

To what degree is mixed-methods research accepted?

Creswell & Plano Clark (2007) present a 12-item checklist describing three levels of acceptance within a
discipline. Minimal acceptance is suggested, for example, by graduate students’ using mixed-methods for
dissertation-related research and discussion in journals about the potential use of mixed-methods.
Hallmarks of moderate acceptance include having discipline leaders advocate for mixed-methods research
and funders support mixed-methods research projects. Major acceptance is indicated by publications,
often in special issue format, on the use of mixed-methods to advance the specific discipline.
Institutionalizing mixed-methods research as part of graduate study by offering mixed-methods research
courses is further evidence of major acceptance (Creswell & Plano Clark, 2007; Plano Clark, 2005).
Similarly, faculty mentoring of graduate students’ mixed-methods research likely falls at the distal, major-acceptance end of the continuum.

What are the strengths of mixed-methods research?

In general, the major strength of mixed-methods research is that the shortcomings of either quantitative or
qualitative research as a mono-method can be offset by the strengths of the other. For example, quotes,
pictures, or narrative resulting from qualitative inquiry can supplement statistics to convey meaning and
offer additional insight, whereas numbers generated from quantitative research can augment pictures and
words to add precision, scope, and generalizability. With greater methodological flexibility, a deeper, more
comprehensive understanding is possible, undergirded by more credible evidence resulting from
convergence and corroboration (Johnson & Onwuegbuzie, 2004).

1 Multi-method research refers to using both quantitative and qualitative methods in a complementary series of
projects examining the same general topic (Morse, 2003). In mixed-methods research, versus multi-methods
research, “the less dominant strategies do not have to be a complete study in themselves” (Morse, 2003, p. 195).

What are the challenges associated with mixed-methods research?

Mixed-methods research requires an advanced skill set with a strong foundation in both quantitative and
qualitative methods. Some experts recommend undertaking mixed-methods research only after gaining
sufficient experience with both methodological approaches separately (Creswell & Plano Clark, 2007).
Given the prerequisite skills, mixed-methods research can be especially difficult for the lone researcher
without access to a research team (e.g., doctoral students). In general, mixed-methods research is also
considered more burdensome in terms of labor, time, and resources (Collins, Onwuegbuzie, & Sutton,
2006; Creswell & Plano Clark, 2007; Johnson & Onwuegbuzie, 2004)—three commodities commonly in
scarce supply among graduate students.

Aside from logistical considerations, there are other common roadblocks as well. Mixed-methods
researchers may encounter resistance from methodological purists who strive to uphold the epistemological
tenets of either quantitative or qualitative research (Johnson & Onwuegbuzie, 2004). Faculty members
serving on dissertation committees, in particular, may take a cautionary stance given their responsibility of
ensuring the quality and rigor of research projects conducted by students who are typically novice
researchers. In addition, because faculty members typically received specialized training in either form of
inquiry, but rarely both, they may not be comfortable or believe themselves sufficiently experienced to guide
students’ research endeavors and serve as mentors.

Specific mixed-methods designs, described below, also have inherent challenges. Divergent results are
possible when, for example, attempting concurrent triangulation. Obtaining approval from Institutional
Review Boards (IRBs) is sometimes problematic if reviewers are skeptical of research designs that are
emergent, a characteristic of sequential explanatory and exploratory research (Creswell & Plano Clark, 2007).
What are the steps involved in mixed-methods research?

Experts have identified as many as 13 steps involved in conducting mixed-methods research (Collins,
Onwuegbuzie, & Sutton, 2006). While some steps are similar to those in mono-method studies, others are
distinctive to mixed-methods. Before undertaking mixed-methods research, an investigator must: clarify the
rationale and purpose for data mixing, determine data prioritization, identify appropriate implementation
sequencing, and pinpoint where and how data integration will occur (Creswell et al., 2004; Creswell &
Plano Clark, 2007; Punch, 1998).

Rationale and Purpose

The terms ‘rationale’ and ‘purpose’ for mixing quantitative and qualitative data are not consistently defined
in the literature. Some researchers seem to use these terms interchangeably and synonymously with
‘function’ or ‘justification’ (Bryman, 2006; Creswell, 2002, 2003), whereas others differentiate according to
their relative generality or specificity (Collins et al., 2006). Despite confusing nomenclature, the intent is to
explicate why mixed-methods are needed for a given research problem and specify the relationship
between the quantitative and qualitative forms of inquiry (Collins et al., 2006). Greene, Caracelli, &
Graham (1989) identified five general purposes: triangulation, complementarity, development,
expansion, and initiation (Bryman, 2006; Collins et al., 2006). Bryman (2006) advocates a more detailed
schema that allows for multiple purposes. In an explanatory study, for example, a researcher may be
interested in furthering understanding about unexpected findings, outliers, or clusters, giving voice to
respondents, or achieving greater completeness by augmenting “dry” quantitative results with more vivid,
illustrative quotes from respondents.

As Bryman (2006) discovered in conducting a review of mixed-methods studies, researchers’ stated
rationales for mixing quantitative and qualitative data do not always match what occurs in practice. As
Bryman asserts, one reason for this may be that inadequate attention is given to the exact purpose of
combining numerical and text data. A second explanation is that once a variety of data are available, plans
for using the data become emergent, evolving well beyond what was originally envisioned.

Data Prioritization

In mixed-methods research, the researcher must decide which type of data has methodological dominance,
or, alternatively, whether quantitative and qualitative data have equivalence. The literature also refers to
priority given to data as emphasis or weight. This decision results from a confluence of factors including
the epistemological foundation of the researcher’s discipline, the theoretical drive (Morse, 2003)
underpinning the research (e.g., either inductive or deductive), and, most importantly, the research purpose
and research questions framing the study (e.g., testing theory or generating theory).

Implementation Sequence

The temporal relationship between quantitative and qualitative processes in a mixed-methods study is
referred to in the literature as implementation, sequence, or timing (Creswell & Plano Clark, 2007; Creswell
et al., 2004). Timing refers to the order of collecting, analyzing, and interpreting the different forms of
data. These processes are described as either concurrent or sequential. Concurrent implementation is
characterized by the simultaneous collection, analysis, and interpretation of data in a single phase.
Sequential timing involves two phases, one informing the other. Whether qualitative or quantitative data
collection and analysis occur first depends entirely on the overall purpose: for example, is the intent to
explain (QUAN first) or explore (QUAL first)?

Data Integration

Mixed-methods research requires the integration, or mixing, of quantitative and qualitative data at some
point during the research process. Researchers should clearly explain and depict in a visual model where
data integration is to occur (Creswell, 2002; Ivankova, Creswell, & Stick, 2006). In addition, researchers
should clarify whether it is methods, data, or findings that are being combined (Punch, 1998). Quantitative
and qualitative components can be connected during the design phase by having different types of
research questions (e.g., embedded design), as well as at the end during interpretation (e.g., triangulation).
With sequential designs, data are also integrated at an intermediate point, so that findings from one phase
can inform the other.

How are mixed-methods studies represented?

Morse (1991) originally developed the standard notation system that describes data prioritization and
methodological sequencing. This notation system, now in widespread use among mixed-methods
researchers, prescribes the use of capital letters (i.e., either QUAN or QUAL) to indicate the dominant
method, while lowercase letters indicate the secondary method (i.e., either quan or qual). It is important to
note that four-letter abbreviations are used in each case to promote the “equal stature” of both quantitative
and qualitative methods (Creswell & Plano Clark, 2007). How the mixed-methods study is conducted is
represented either by a plus sign (+) for concurrent implementation of methods or a single-headed arrow
(→) for sequential implementation. Plano Clark (2005) expanded the notation system to include “methods
that are embedded within other methods” (Creswell & Plano Clark, 2007, p. 41). Embedded designs are
represented with parentheses.
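
To make the notation concrete, the brief Python sketch below simply composes notation strings from the decisions described above (dominance, timing, embedding). It is offered as an illustrative aid only; the function and parameter names are hypothetical conveniences of this summary, not part of Morse's (1991) or Plano Clark's (2005) published work.

    # Illustrative sketch: encodes the notation conventions summarized above.
    # Function and parameter names are hypothetical, not an established tool.
    def mm_notation(dominant: str, secondary: str, timing: str = "concurrent",
                    embedded: bool = False, equal_priority: bool = False) -> str:
        """Compose a mixed-methods notation string.

        dominant, secondary: "quan" or "qual".
        timing: "concurrent" (+) or "sequential" (arrow); ignored when embedded.
        embedded: show the secondary method in parentheses, e.g., QUAN(qual).
        equal_priority: capitalize both methods to signal equal stature.
        """
        dom = dominant.upper()  # capital letters mark the dominant method
        sec = secondary.upper() if equal_priority else secondary.lower()
        if embedded:
            return f"{dom}({sec})"  # embedded designs use parentheses
        symbol = "+" if timing == "concurrent" else "\u2192"  # + (concurrent) or → (sequential)
        return f"{dom} {symbol} {sec}"

    print(mm_notation("qual", "quan", equal_priority=True))  # QUAL + QUAN (triangulation)
    print(mm_notation("quan", "qual", "sequential"))         # QUAN → qual (explanatory)
    print(mm_notation("qual", "quan", "sequential"))         # QUAL → quan (exploratory)
    print(mm_notation("quan", "qual", embedded=True))        # QUAN(qual) (embedded)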

The notation system is used in combination with a graphical depiction of the specific mixed-methods
design. Ivankova and colleagues (2006) advanced earlier works describing how to represent mixed-
methods research (Creswell et al., 2003; Tashakkori & Teddlie, 1998) by providing ten definitive steps for
creating visual models to depict various mixed-methods designs. In addition to using standard notation
regarding dominance and implementation sequence, researchers should also: draw boxes for each stage
of qualitative and quantitative data collection, analysis, and interpretation, specify data collection and
analysis procedures for each method, and identify anticipated products and outcomes for each quantitative
and qualitative procedure. Additionally, visual models should be simple, no greater than one page with
either a horizontal or vertical layout, use concise language, and have a clear descriptive title. An example
of a visual model for a sequential explanatory mixed-methods design is found in Appendix A and is taken
from the author’s dissertation.

What are the most common mixed-methods designs?

There are numerous ways to conduct mixed-methods research (Creswell et al., 2004; Creswell & Plano
Clark, 2007; Johnson & Onwuegbuzie, 2004; Morse, 1991, 2003). The four major typologies according to
Creswell and Plano Clark (2007) are the Triangulation Design, the Embedded Design, the Exploratory
Design, and the Explanatory Design. Each design has a distinct purpose, an inherent theoretical drive
(Morse, 2003), an explicit emphasis on the data (either dominance of one type or equivalence of both
types), a prescribed sequencing for implementing data collection and analysis, and a typical pattern for
data integration. Each design also carries its own strengths and challenges. The four major designs are
summarized in Table 1 below.

Table 1: Mixed-Methods Designs2

Triangulation (also called Concurrent Triangulation design3)
   Notation: QUAL + QUAN
   Purpose: To gather different, yet complementary data on the same topic in order to validate, confirm, or corroborate evidence4
   Procedures: Separate data collection and analysis, occurring at the same time
   Mixing: Converges during interpretation; may also involve transforming data into the other type during analysis
   Strengths: Intuitive, efficient; works well if a research team is available
   Challenges: High skill set required for both types of data; reconciling possibly discrepant results

Embedded (also called Nested Model, when qual and QUAN components answer different questions5; also called Data Transformation Design Model, when qual data are converted to numerical format6)
   Notation: QUAN(qual) or QUAL(quan)
   Purpose: To supplement a larger study of one type with data from another type, on a different question
   Procedures: One- or two-phase data collection, with different types of questions; one question is clearly supplemental to the other
   Mixing: During the design phase
   Strengths: Works well with a QUAN intervention study with an experimental design; less time-intensive because of unequal data priority; appealing to primarily quantitative disciplines
   Challenges: Difficult to integrate results on different types of questions; difficult to differentiate this mixed-methods design from others; fewer examples available in the literature

Exploratory (also called Sequential Exploratory Design7; also called Instrument Design, when QUAN is prioritized8)
   Notation: QUAL → quan or qual → QUAN
   Purpose: To explore a phenomenon qualitatively in the absence of theoretical constructs and existing scales
   Procedures: Two-phase data collection, with the qualitative phase clearly informing the QUAN phase
   Mixing: Data connect between the qualitative and quantitative phases; qualitative data guide quantitative instrument development
   Strengths: “Straightforward to describe, implement, and report”9; appealing to quantitative disciplines; can be used in single-study or multi-phase research designs; can be used to develop quantitative instruments, test emerging theory, generalize findings, or develop taxonomy
   Challenges: Time-intensive; IRB may be hesitant to accept emergent quantitative procedures; may require a different sample of respondents

Explanatory (also called Sequential Explanatory Design10; if QUAN is dominant, also called Follow-up Explanations Model11; if QUAL is dominant, also called Participant Selection Model12)
   Notation: QUAN → qual or quan → QUAL
   Purpose: To explain or advance quantitative findings with qualitative data
   Procedures: Two distinct phases beginning with quantitative data collection and analysis, followed by qualitative data collection and analysis
   Mixing: Data connect between the quantitative and qualitative phases and, again, during interpretation13; QUAN results inform purposeful sampling for the qualitative phase; qualitative questions emerge from the QUAN phase
   Strengths: Straightforward implementation and reporting; can be done by a single researcher without a team; helps explain outlying cases, unexpected results, and group characteristics
   Challenges: Time-intensive; may require speculative decisions about sampling; IRB may be hesitant to accept emergent recruitment steps for the qualitative phase

2 The main source for this table is the text by Creswell & Plano Clark (2007). Supplemental material is also referenced.
3 Hanson et al., 2005
4 Creswell & Plano Clark, 2007; Morse, 1991
5 Creswell et al., 2004; Hanson et al., 2005
6 Creswell et al., 2004
7 Hanson et al., 2005
8 Creswell et al., 2004
9 Creswell & Plano Clark (2007), p. 78
10 Hanson et al., 2005
11 Creswell & Plano Clark (2007)
12 Creswell & Plano Clark (2007)
13 Ivankova et al., 2006; Onwuegbuzie & Teddlie, 2003

Which design is right for my study?

Experts recommend choosing one distinct mixed-methods design to have a solid framework to guide one’s
work, as well as to facilitate description and implementation (Creswell & Plano Clark, 2007). The primary
consideration in choosing a design is its fit with the research problem and corresponding research
questions (Creswell & Plano Clark, 2007; Johnson & Onwuegbuzie, 2004). A researcher should
also consider his/her skill set and academic preparation, especially if carrying out the study as the sole
investigator (e.g., dissertation research). Alternatively, the researcher should assess his/her ability to
assemble a team with the necessary expertise to conduct mixed-methods research. Feasibility is another
key factor when undertaking mixed-methods research, as it generally requires more time and financial
resources. A final, yet important, consideration is whether mixed-methods research is accepted by the
intended audience or primary consumer (e.g., dissertation committee) (Creswell & Plano Clark, 2007).
Assessing levels of acceptance within a discipline is described above; parallels can easily be drawn to an
individual’s relative degree of acceptance.

Are there guidelines for conducting mixed-methods research for dissertations?

Guidelines (i.e., assessment criteria) for conducting a mixed-methods dissertation are not in widespread
use. At present, assessment criteria are typically left solely to the purview of the student’s dissertation
committee, whose collective expertise in conducting mixed-methods research varies considerably.

Why should guidelines be considered?

There is growing interest in this topic in the field (Creswell, 2002; Sale & Brazil, 2004) and among
qualitative researchers specifically, who may or may not have mixed-method experience yet desire to stay
abreast of current methodological trends and participate in the scholarly discourse surrounding them. The
impetus for having assessment criteria, it seems, is 1) to preserve the integrity and epistemological tenets
of qualitative inquiry in a mixed-method research paradigm; 2) to help faculty members guide graduate
students who are increasingly interested in mixed-methods research, typically for their dissertation, to have
rigorous experiences and produce high-quality work; and 3) to provide direction to graduate students who
are contemplating or conducting mixed-methods research. Faculty broadly report varying levels of quality
among dissertations (Lovitts, 2005), which include studies that purportedly use mixed-methods designs but
are fundamentally lacking in their demonstration of core principles regarding how to conduct mixed-
methods research.

What are recommended assessment criteria for mixed-methods research?

Critical assessment criteria for mixed-method studies are not in widespread use in the field (Bryman 1988,
2006; Sale & Brazil, 2004), although several researchers offer suggestions for how to evaluate research
quality (Creswell, 2002, 2003; Creswell & Plano Clark, 2007; Sale & Brazil, 2004). One way is to use
criteria specific to each methodological approach (Creswell & Plano Clark, 2007; Sale & Brazil, 2004). In
that way, mixed-methods research can be assessed for rigor relative to its truth value, applicability,
consistency, and neutrality (Lincoln & Guba 1985, 1986; Sale & Brazil, 2004). The operationalization of
these research goals, however, varies according to the philosophical tenets and principles espoused by the
respective research tradition. Applicability, for example, refers to generalizability or external validity among
quantitative researchers, while to qualitative scholars it denotes transferability. Table 2 below provides
examples of standards as they apply separately to both quantitative and qualitative research. By and large,
the table excerpts findings presented by Sale and Brazil (2004) in their review of the literature on
assessment criteria for mixed-methods research.

Table 2: Assessment Criteria for Mixed-Methods Research

Truth Value
   Quantitative: internal validity (e.g., treatment group compared to control or comparison group)
   Qualitative: credibility (e.g., triangulation, member checks, negative case analysis)

Applicability
   Quantitative: external validity (e.g., sampling frame described with sampling procedures)
   Qualitative: transferability (e.g., study context, data collection & analysis adequately described)

Consistency
   Quantitative: reliability (e.g., Cronbach’s alphas reported for scales)
   Qualitative: dependability (e.g., audit of research process)

Neutrality
   Quantitative: objectivity (e.g., description of standardized, possibly “blind,” observation)
   Qualitative: confirmability (e.g., audit trail follow-up, bracketing, subjectivity statement)

A second approach to assessment criteria involves focusing on mixed-methods research as a distinct
methodology, rather than using standards unique to each form of inquiry. As Creswell and Plano Clark
(2007) assert, consensus about critical appraisal criteria for mixed-methods research does not exist and
debate is ongoing among scholars. As such, these researchers advocate the development of general
guidelines pertaining to whether the study meets criteria to classify it as mixed-methods, the degree of
methodological rigor, and, finally, the researcher’s level of understanding about the specific mixed-methods
design that undergirds the research effort.

For students engaged in dissertation research (and their faculty committee), a practical approach may work
best toward ensuring the quality of mixed-methods research projects. Borrowing heavily from the literature,
the following guidelines are offered for consideration. For ease of use, the guidelines are presented as
questions in a checklist format, which could later be converted into a rubric (an illustrative sketch of such a
conversion follows the checklist).

Each question is to be answered “Yes,” “No,” or “Unclear,” with space provided for comments.

Academic Preparation
   Has student had sufficient training and/or experience in quantitative methods? 1
   Has student had sufficient training and/or experience in qualitative methods? 1

Situational Factors
   Is proposed project realistic, given time and resources available? 2
   Is appropriate expertise available and represented on dissertation committee? 2

Research Proposal & Report
   Is ‘mixed-methods’ evident in the title? 2
   Is mixed-methods research defined? 3
   Do methods fit the research questions, and are both types of questions being posed? 2
   Is the general rationale for mixing data clearly presented (e.g., complementarity, expansion)? 2, 4, 5
   Is the specific purpose for conducting a mixed-methods study clear (e.g., triangulation, exploration, explanation)? 3, 4, 5
   Is a specific mixed-methods design presented and appropriately labeled? 2
   Is a visual model presented to show data collection and analysis procedures? 2, 3, 6
   Is standard notation used to represent the design? 3
   Is there a clear description of data prioritization, weight, dominance, emphasis, or equivalence? 2
   Is there a clear description of implementation sequence (e.g., sequential or concurrent)? 2
   Is there a clear description of plans/procedures for data integration (e.g., question formulation, data collection, data analysis, interpretation)? 4
   Are sampling strategies relevant to the specific mixed-methods design? 3
   Are both quantitative and qualitative data collected, as discussed in methods/procedures section? 2
   Are analysis procedures for both types of data adequately described, and do they fit the specific mixed-methods design? 2, 3
   Are validation procedures for both types of data discussed? 3
   Is the paper structured to match the specific mixed-methods design? 2, 3

1 Creswell & Plano Clark, 2007; 2 Creswell, 2002; 3 Creswell, 2003; 4 Bryman, 2006; 5 Collins, Onwuegbuzie, &
Sutton, 2006; 6 Ivankova, Creswell, & Stick, 2006
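
As noted above, the checklist could later be converted into a rubric. The short Python sketch below is one hypothetical way such a rubric might be encoded for committee use; the data structure, the item excerpts, and the example rating and comment are illustrative assumptions, not part of the cited literature.

    # Hypothetical sketch (not from the paper): the checklist as a simple rubric.
    from dataclasses import dataclass

    @dataclass
    class RubricItem:
        question: str            # one checklist question
        rating: str = "Unclear"  # "Yes", "No", or "Unclear"
        comments: str = ""

    rubric = [
        RubricItem("Is a specific mixed-methods design presented and appropriately labeled?"),
        RubricItem("Is standard notation used to represent the design?"),
        RubricItem("Is a visual model presented to show data collection and analysis procedures?"),
    ]

    # Example use (values are made up): a committee member records a rating and comment.
    rubric[0].rating = "Yes"
    rubric[0].comments = "Design is named and justified in the methods chapter."

    outstanding = [item.question for item in rubric if item.rating != "Yes"]
    print(f"{len(outstanding)} of {len(rubric)} checklist items still need attention")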

Helpful Resources

Creswell, J. W., & Plano Clark, V. L. (2007). Designing and Conducting Mixed Methods Research.
Thousand Oaks, CA: Sage.

Johnson, R. B. & Onwuegbuzie, A. J. (2004). Mixed Methods Research: A Research Paradigm Whose
Time Has Come. Educational Researcher, 33, 14-26.

Morse, J. (2003). Principles of mixed- and multi-method research design. In A. Tashakkori & C. Teddlie
(Eds.), Handbook of mixed methods in social and behavioral research. Thousand Oaks, CA: Sage.

Tashakkori, A., & Teddlie, C. (Eds.). (2003). Handbook of mixed methods in social and behavioral
research. Thousand Oaks, CA: Sage Publications.

Cited References

Bryman, A. (2006). Integrating quantitative and qualitative research: how is it done? Qualitative Research,
6, 97-113.

Collins, K. M. T., Onwuegbuzie, A. J., & Sutton, I. L. (2006). A model incorporating the rationale and
purpose for conducting mixed-methods research in special education and beyond. Learning
Disabilities: A Contemporary Journal, 4(1), 67-100.

Creswell, J. W. (2002). Educational research: Planning, conducting, and evaluating quantitative and
qualitative research. (Chapter 17). Upper Saddle River, NJ: Merrill Prentice-Hall.

Creswell, J. W. (2003). Research design: Qualitative, quantitative and mixed methods approaches (2nd
ed.). Thousand Oaks, CA: Sage Publications.

Creswell, J. W., Fetters, M. D., & Ivankova, N. V. (2004). Designing a mixed methods study in primary care.
Annals of Family Medicine, 2(1), 7-12.

Creswell, J. W., & Plano Clark, V. L. (2007). Designing and Conducting Mixed Methods Research.
Thousand Oaks, CA: Sage.

Creswell, J. W., Plano Clark, V., Gutmann, M., & Hanson, W. (2003). Advances in mixed method design. In
A. Tashakkori & C. Teddlie (Eds.), Handbook of mixed methods in social and behavioral research.
Thousand Oaks, CA: Sage.

Crotty, M. (1998). The foundations of social research: Meaning and perspective in the research process. St.
Leonards, Australia: Allen & Unwin.

Greene, J. C., Caracelli, V. J., & Graham, W. F. (1989). Toward a conceptual framework for mixed-method
evaluation designs. Educational Evaluation and Policy Analysis, 11, 255-274.

Hanson, W. E., Creswell, J. W., Plano Clark, V. L., Petska, K. S., & Creswell, J. D. (2005). Mixed methods
research designs in counseling psychology. Journal of Counseling Psychology, 52(2), 224-235.

Ivankova, N. V., Creswell, J. W., & Stick, S. L. (2006). Using Mixed-Methods Sequential Explanatory
Design: From Theory to Practice. Field Methods, 18, 3-20.

Johnson, R. B. & Onwuegbuzie, A. J. (2004). Mixed Methods Research: A Research Paradigm Whose
Time Has Come. Educational Researcher, 33, 14-26.

Lincoln, Y. S., & Guba, E. G. (1985). Naturalistic Inquiry. Newbury Park, CA: Sage Publications.

Lincoln, Y. S., & Guba, E. G. (1986). But is it rigorous? Trustworthiness and authenticity in naturalistic
evaluation. In D. D. Williams (Ed.) Naturalistic Evaluation. New Directions for Program Evaluation, 90
(pp. 78-84). San Francisco: Jossey-Bass.

Lovitts, B. E. (2005). How to Grade a Dissertation. Academe, 91, 18-23.

Maxcy, S. J. (2003). Pragmatic threads in mixed methods research in the social sciences: The search for
multiple modes of inquiry and the end of the philosophy of formalism. In A. Tashakkori & C. Teddlie
(Eds.), Handbook of mixed methods in social and behavioral research (pp. 51-89). Thousand Oaks,
CA: Sage.

Morse, J. (2003). Principles of mixed- and multi-method research design. In A. Tashakkori & C. Teddlie
(Eds.), Handbook of mixed methods in social and behavioral research. Thousand Oaks, CA: Sage.

Morse, J. M. (1991). Approaches to qualitative-quantitative methodological triangulation. Nursing
Research, 40, 120-123.

Onwuegbuzie, A., & Teddlie, C. (2003). A framework for analyzing data in mixed methods research. In A.
Tashakkori & C. Teddlie (Eds.), Handbook of mixed methods in social and behavioral research (pp.
351-384). Thousand Oaks, CA: Sage.

Plano Clark, V. L. (2005). Cross-disciplinary analysis of the use of mixed methods in physics education
research, counseling psychology, and primary care (Doctoral dissertation, University of Nebraska-
Lincoln, 2005). Dissertation Abstracts International, 66, 02A.

Punch, K. F. (1998). Introduction to Social Research: Quantitative and Qualitative Approaches. London:
Sage Publications. (Chapter 11: Mixed Methods and Evaluative Criteria)

Sale, J. E. M. & Brazil, K. (2004). A Strategy to Identify Critical Appraisal Criteria for Primary Mixed-Method
Studies. Quality & Quantity, 38, 351-365.

Tashakkori, A., & Teddlie, C. (1998). Mixed methodology: Combining qualitative and quantitative
approaches. Thousand Oaks, CA: Sage.

APPENDIX A: Example of a Sequential Explanatory Mixed-Methods Design

Phase: QUANTITATIVE Data Collection
   Procedures: Cross-sectional, Web-based survey of healthy female adult adoptees over 40
   Products: Numeric data

Phase: QUANTITATIVE Data Analysis
   Procedures: Data screening (univariate, multivariate normality); frequencies; structural equation modeling; deductive theory testing; scale refinement; parameter estimation; software: SPSS, Lisrel, Prelis
   Products: Missing data; outliers; descriptive statistics; covariance matrix; Cronbach’s alphas; factor loadings; structural path coefficients; residuals; fit indices; modification indices

Phase: Connecting QUAN & Qual Phases
   Procedures: Develop qualitative interview questions based on quantitative findings; purposeful selection of extreme cases to participate in qualitative interview
   Products: Extreme cases (n = 12) at both ends of the risk continuum (6 with low perceived risk, 6 with high); semi-structured qualitative data collection instrument

Phase: Qual Data Collection
   Procedures: In-depth telephone interviews and/or e-mail dialogue with 12 participants
   Products: Text data (e-mail correspondence, interview transcripts)

Phase: Qual Data Analysis
   Procedures: Coding and thematic analysis, according to conventions of constructivist grounded theory; inductive & deductive reasoning; Atlas-ti
   Products: Codes; themes; rich & thick description; illustrative quotes; suggested refinements to theoretical model

Phase: Integration of QUAN & Qual Findings
   Procedures: Interpretation of quantitative and qualitative findings (“combining inferences into a coherent whole”)
   Products: Refined theoretical explanation of female adult adoptees’ coping responses to the threat of breast cancer amidst ambiguous biological family medical history; implications; limitations; suggestions for future research

The author’s design of this figure borrows heavily from the work of Ivankova, Creswell, & Stick (2006). In the original figure, grey shading indicates where data are mixed.
