
Journal of Mixed Methods Research

http://mmr.sagepub.com/

Doing Mixed Methods Research Pragmatically: Implications for the Rediscovery of Pragmatism as a Research Paradigm

Martina Yvonne Feilzer

Journal of Mixed Methods Research 2010 4: 6, originally published online 22 October 2009
DOI: 10.1177/1558689809349691

The online version of this article can be found at:


http://mmr.sagepub.com/content/4/1/6

Published by:

http://www.sagepublications.com



Articles

Doing Mixed Methods Research Pragmatically: Implications for the Rediscovery of Pragmatism as a Research Paradigm

Martina Yvonne Feilzer¹

Journal of Mixed Methods Research
4(1) 6–16
© The Author(s) 2010
Reprints and permission: http://www.sagepub.com/journalsPermissions.nav
DOI: 10.1177/1558689809349691
http://jmmr.sagepub.com

Abstract
This article explores the practical relevance of pragmatism as a research paradigm through the
example of a piece of pragmatic research that not only used both quantitative and qualitative
research methods but also exploited the inherent duality of the data analyzed. Thus, the article
aims to make the case that pragmatism as a research paradigm supports the use of a mix of
different research methods as well as modes of analysis and a continuous cycle of abductive
reasoning while being guided primarily by the researcher’s desire to produce socially useful
knowledge. It will be argued that pragmatism can serve as a rationale for formal research design
as well as a more grounded approach to research.

Keywords
pragmatism, mixed methods research, duality of data

Mixed methods research has been hailed as a response to the long-lasting, circular, and remark-
ably unproductive debates discussing the advantages and disadvantages of quantitative versus
qualitative research as a result of the paradigm ‘‘wars’’. The main paradigms or worldviews
that traditionally are presented as being fundamentally opposed are those of positivism/postpo-
sitivism and constructivism/interpretivism (Creswell & Plano Clark, 2007; for a discussion of the
different interpretations of paradigm, see Morgan, 2007). In an (admittedly rather simplistic) nutshell, the positivist notion of a singular reality, the one and only truth that is out there waiting to
be discovered by objective and value-free inquiry underpins quantitative research methods. It is
contrasted with the idea that there is no such thing as a single objective reality and that ‘‘subjec-
tive inquiry is the only kind possible to do’’ and for that reason constructivists favor qualitative
research methods (Creswell & Plano Clark, 2007; Erlandson, Harris, Skipper, & Allen, 1993,
p. xi). Notwithstanding important advances made by feminist, postmodernist, poststructuralist,
and critical researchers, and many more nuanced positions within these broad frameworks, these
two paradigms are still dominating methodological textbooks and epistemological debates in so-
cial sciences (Hughes & Sharrock, 2007; Teddlie & Tashakkori, 2009).

¹Bangor University, Gwynedd, UK

Corresponding Author:
Martina Yvonne Feilzer, School of Social Sciences, Bangor University, Gwynedd LL57 2DG, UK
Email: m.feilzer@bangor.ac.uk


Proponents of mixed methods research strive for an integration of quantitative and qualitative
research strategies and thus, this approach does not fall comfortably within one or the other
worldview described above. As a consequence, researchers have attempted to construct an alter-
native framework that accommodates the diverse nature of such research (Creswell & Plano
Clark, 2007, pp. 26-28). However, there appears to be little agreement amongst mixed methods
researchers on the nature of this framework. Thus, whereas Creswell and Plano Clark (2007,
p. 26) describe three alternative stances on the paradigm issue, Greene, Benjamin, and Goodyear
(2001, p. 28) list four different frameworks for mixing methods, and although Tashakkori and Teddlie (1998) discussed only one framework in detail, they include another framework, namely
the transformative perspective, in their latest textbook on mixed methods research (Teddlie &
Tashakkori, 2009, p. 87). The approach most commonly associated with mixed methods research
(Teddlie & Tashakkori, 2009, p. 7), although clearly not the only one, is pragmatism, which
offers an alternative worldview to those of positivism/postpositivism and constructivism and
focuses on the problem to be researched and the consequences of the research (Brewer & Hunter,
1989, p. 74; Creswell & Plano Clark, 2007, p. 26; Miller, 2006; Tashakkori & Teddlie, 1998,
pp. 29-30).
In this article, I do not intend to review the evolution and development of pragmatism as a phi-
losophy. The history of pragmatism is a long one, and it is marked by transformations and a cer-
tain ‘‘muddledness’’ (Rorty, 1991, p. 64), leading one commentator to suggest that there are ‘‘in
effect, . . . two pragmatisms’’ (Mounce, 1997, p. 2), if not more. Instead, I aim to show how doing
mixed methods research pragmatically has highlighted the practical relevance of philosophical
pragmatism to research methodology, in particular, but not exclusively, to mixed methods re-
search. Thus, it may not only help synthesize the different interpretations of pragmatism as un-
derlying mixed methods research but also help rediscover pragmatism as a practically relevant
research paradigm for all types of research (Denscombe, 2008, pp. 273-275).
The article will proceed by outlining the author’s understanding of paradigms and pragmatism
as a paradigm; describing a pragmatic research project based on a mixed methods design; and
drawing conclusions on whether a pragmatic approach to problem solving in the social world
offers an alternative, flexible, and more reflexive guide to research design and grounded
research.

Pragmatism as a Research Paradigm


A paradigm could be regarded as an ‘‘accepted model or pattern’’ (Kuhn, 1962, p. 23), as an
organizing structure, a deeper philosophical position relating to the nature of social phenomena
and social structures. This use of paradigm relates it directly to research, as an epistemological
stance (for a discussion of other interpretations of paradigm, see Morgan, 2007). In this sense,
a paradigm directs research efforts; it serves to reassert itself to the exclusion of other paradigms
and to articulate the theories it already established (Kuhn, 1962, p. 24). Thus, from the objective
and measurable reality of positivism via the ‘‘contextualised causal understanding’’ of realism
(Greene et al., 2001, p. 29) to the subjective plurality of interpretivism, paradigms could be
interpreted as prescriptive and as requiring particular research methods and excluding others.
In that sense, a paradigm can constrain intellectual curiosity and creativity, blind researchers
to aspects of social phenomena, or even new phenomena and theories (Kuhn, 1962, p. 24),
and limit the ‘‘sociological imagination’’ (Mills, 1959).
The choice of social sciences research questions and methods, albeit sometimes dictated by
research funders, is a reflection of researchers’ epistemological understanding of the world,
even if it is not articulated or made explicit. Moreover, the interpretation of any research findings
will expose the researchers’ underlying philosophies, drawing on, and extending the notion that


‘‘all knowledge is knowledge from some point of view’’ (Fishman, 1978, p. 531; see also
Mounce, 1997, p. 14 on Peirce's pragmatism). For a discussion of pragmatism as nonparadigmatic, a position that assumes an independence of method and underlying theory, see, for example, Greene et al. (2001, p. 28), Greene and Caracelli (2002, p. 94), and Teddlie and Tashakkori
(2009, p. 97).
Pragmatism, when regarded as an alternative paradigm, sidesteps the contentious issues of
truth and reality, accepts, philosophically, that there are singular and multiple realities that are
open to empirical inquiry and orients itself toward solving practical problems in the ‘‘real world’’
(Creswell & Plano Clark, 2007, pp. 20-28; Dewey, 1925; Rorty, 1999). In that sense, pragmatism
allows the researcher to be free of mental and practical constraints imposed by the ‘‘forced
choice dichotomy between postpositivism and constructivism’’ (Creswell & Plano Clark,
2007, p. 27), and researchers do not have to ‘‘be the prisoner of a particular [research] method
or technique’’ (Robson, 1993, p. 291).
Pragmatists’ view of the measurable world relates more closely to an ‘‘existential reality’’
(Dewey, 1925, p. 40), a reference to an experiential world with different elements or layers,
some objective, some subjective, and some a mixture of the two. There are layers of the ‘‘stable
and the precarious’’ (Dewey, 1925, p. 40), layers of ‘‘completeness, order, recurrences which
make possible prediction and control, and singularities, ambiguities, uncertain possibilities, pro-
cesses going on to consequences as yet indeterminate’’ (Dewey, 1925, p. 47).
One of Dewey's (1925) contentions is that the main research paradigms of positivism and subjectivism derive from the same paradigm family, in that both seek to find ''the truth''—whether it is an objective truth or the relative truth of multiple realities (Dewey, 1925, p. 47). Both objective and subjective inquiry attempt to produce knowledge that best corresponds to, or represents, reality (Rorty, 1999, p. xxii). Thus, pragmatists are ''anti-dualists'' (Rorty, 1999, p. xix)
questioning the dichotomy of positivism and constructivism and calling for a convergence of
quantitative and qualitative methods, reiterating that they are not different at an epistemological
or ontological level and that they share many commonalities in their approaches to inquiry
(Hanson, 2008; Johnson & Onwuegbuzie, 2004). Hanson (2008, pp. 103-106) argues that the dis-
tinctions between phenomena as objective or subjective are primarily a result of political divi-
sions among social scientists combined with the development of distinctive skill sets for
quantitative and qualitative research (see also Jick, 1979; Morgan, 2007, pp. 60-61).
Pragmatists also hold an ‘‘antirepresentational view of knowledge’’ arguing that research
should no longer aim to most accurately represent reality, to provide an ‘‘accurate account
of how things are in themselves’’ but to be useful, to ‘‘aim at utility for us’’ (Rorty, 1999,
p. xxvi). The notion of utility raises some difficult questions about how such a concept could
be defined. Do we refer back to the consensus among our peers about the questions that are worth
asking (Morgan, 2007, p. 66) and risk the charge of conservatism? It may be more useful to suggest that the notion of utility calls for reflexive research practice. Thus, any inquiry begs the questions ''what is it for?'', ''who is it for?'', and ''how do the researchers' values influence the research?'', and it is these questions that need to be considered by researchers to make inquiry more than an attempt to ''mirror reality''.
At the level of translating epistemological concerns into research methodology and, finally, the choice of research methods, a pragmatic paradigm poses some methodological questions. If phenomena have different layers, how can these layers be measured or observed? Mixed methods research offers to plug this gap by using quantitative methods to measure some aspects of the phenomenon in question and qualitative methods for others. Mixed methods research has grown so much in popularity that this journal was founded, devoted to mapping mixed methods research studies and the extent to which they integrate the different research methodologies employed and, ultimately, to developing a strategy to achieve consistent integration


(Creswell & Tashakkori, 2007, p. 107). This would suggest that some mixed methods researchers
are struggling with true integration—in the sense of looking at phenomena from different per-
spectives and providing an enriched understanding (Jick, 1979, pp. 603-604)—and in its current
form a lot of mixed methods research is confining itself to a presentation of findings by juxta-
position, that is, putting the data derived through different methods alongside each other and discussing findings separately. Thus, it seems that some, if not most, empirical mixed methods research has not been able to transcend the forced dichotomy of quantitative and qualitative methods and data, which are still used and presented as ''totally or largely independent of each other'' (Bryman, 2007, p. 8).

Doing Research Pragmatically—The Crime Scene Study


This article, using the author’s recent research, the ‘‘Crime Scene study’’1, as an example,
attempts to show how the research process pointed to the practical relevance of pragmatism,
inductively reaffirming pragmatism as an alternative paradigmatic framework to positivism/
postpositivism and constructivism. It looks beyond its instrumental link to mixed methods
research to thinking about its philosophical basis and implications for the advancement of knowl-
edge and research methodology generally. The intention in this article is to concentrate on the
methodological and theoretical implications of the Crime Scene study’s research methodology
rather than discuss the substantive findings from the research which have been presented else-
where (Feilzer, 2007).
The Crime Scene study aimed to measure the impact of the provision of factual information
on crime and criminal justice to members of the public through a local newspaper column and to
explain any findings from the experimental part of the research. The intervention consisted of the
author writing 26 columns on crime and criminal justice for a weekly local newspaper, which
were published every week for a period of 6 months. The Crime Scene columns were written
specifically for a local audience with local concerns in mind and thus have to be understood
in a spatial and temporal context.
Measuring the impact of these columns required a naturalistic research design, which incor-
porated two different research strategies, quantitative and qualitative. The research methods that
were expected to complement each other to better understand the underlying processes of recep-
tion, take-up, and recollection of information were a natural quasi-experimental intervention, the
impact of which was measured by a large-scale survey, the ‘‘Oxford Public Opinion Survey’’
(OPOS), and in-depth interviews. Because the study was a quasi-experiment, participants were not randomly assigned to the experimental or comparison group, as is the process for a ''true'' experiment (Bachman & Schutt, 2003, p. 139); rather, the experimental and the comparison groups self-selected
through their choice of reading a particular local newspaper. The random allocation of partici-
pants was not possible as the study was designed to test the natural effect of providing informa-
tion about crime and the criminal justice system through a medium regularly accessed by
members of the public.
Using a multilevel sequential mixed design (Teddlie & Tashakkori, 2009, p. 151), the
different methods were meant to inform and supplement each other not only because they
addressed different aspects of the study (or different layers of the phenomenon) but also be-
cause they are taken from different research strategies. In Denscombe’s (2008, p. 272)
terms, methods were mixed to produce a more complete picture, to avoid the biases intrinsic
to the use of monomethod design, and as a way of building on, and developing, initial
findings.
The OPOS was designed as a quantitative exercise, resulting in data that were analyzed
statistically, controlling for confounding variables as far as possible,2 with the potential of


providing support for inferences of cause and effect. However, somewhat unexpectedly the
OPOS also resulted in a large amount of qualitative data as respondents commented on questions
or used the space for comments provided. Rather than dismissing these data as unwanted noise,
a pragmatic decision was made to exploit this unexpected data source.
The in-depth interviews were designed to follow the survey research sequentially to
explore in more detail the survey findings—a sequential mixed methods design. As the natural quasi-experiment explored whether or not the provision of information would have a measurable impact on readers of the local newspaper, the interview schedule for the second stage of the research design was not finalized until after the completion of the natural quasi-experiment and subsequent data analysis. Obviously, the outcome
of the natural quasi-experiment was unknown at the research design stage. However, both
potential outcomes (natural quasi-experimental intervention did or did not have an impact)
were simulated to start considering an interview schedule that would explore the eventual
findings in more detail. In-depth interviews were thought to be the method most suitable
for exploring the processes of take-up, reception, and retention of the information contained
in the columns.
The sequential or two-phase design provided the flexibility to adapt the second stage to the
findings from the first research stage, in this case the natural quasi-experiment. The self-reported
readership of the column had been low and it appeared that the column—the natural quasi-
experimental intervention—had no measurable aggregate, and little individual, effect on readers
of the local newspaper. As a consequence, the in-depth interviews were adapted to explore the
processes of transmitting factual information to members of the public, the low take-up of
the column, and the experience of filling in a fairly lengthy public opinion survey. In short, while
the natural quasi-experimental intervention and the OPOS were designed to establish whether
any effect had occurred, the interviews were trying to understand the processes by which the
resultant effect had occurred.
The interview data were analyzed in three separate stages, first ‘‘quantitatively,’’ reducing
(where possible) in-depth discursive answers to categorical responses to the questions posed
and analyzing them using SPSS. In a separate step, raw interview data were analyzed using
a qualitative, in-depth approach by grouping responses according to questions and emerging
themes. Finally, again going back to the raw interview material, interviews were scrutinized
by re-reading interview transcripts and listening to interviews again ‘‘looking out’’ for well-
rehearsed metaphors, slogans, or narratives used in the media, by politicians, or policymakers
speaking ‘‘on behalf’’ of the public. Thus, the analysis of the qualitative data looked for the
‘‘stable’’, ‘‘order and recurrence’’ in the interview data as well as the ‘‘precarious’’, the ambi-
guities and singularities.
Analyzing the interview data quantitatively as well as qualitatively has been described as
conversion mixed design by Teddlie and Tashakkori (2009, p. 151). The decision to analyze
the interview data quantitatively was not built into the research design but was made pragmat-
ically on the strength of noticing abductively (Morgan, 2007, p. 71) that the interview data had
qualities of stability and recurrence. Abductive reasoning refers to the logical connection made
by researchers between data and theory, often used for theorizing about surprising ‘‘events’’
(Teddlie & Tashakkori, 2009, p. 89). In abduction, researchers ‘‘move back and forth between
induction and deduction—first converting observations into theories and then assessing those
theories through action’’ (Morgan, 2007, p. 71). This requires reflection on ‘‘different
approaches to theory and data’’ and offers great opportunity to ‘‘work back and forth between
the kinds of knowledge [ . . . ] produced under the separate banners of Qualitative and Quanti-
tative Research’’ (Morgan, 2007, p. 71).
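The conversion step described above, reducing discursive answers to categorical responses that can then be analyzed statistically, can be sketched in code. This is a minimal, hypothetical illustration: the answers, categories, and keyword rules below are invented for the sketch, whereas in the study itself the coding was done by the researcher and the categorical data analyzed in SPSS.

```python
# Hypothetical sketch of "quantitizing" interview data: assigning each
# discursive answer to one broad category and tallying the categories.
# Categories and keyword rules are invented for illustration only.
from collections import Counter

def code_response(answer: str) -> str:
    """Assign a discursive answer to one broad category (illustrative rules)."""
    text = answer.lower()
    if "don't know" in text or "not sure" in text:
        return "dont_know"
    if any(phrase in text for phrase in ("never read", "did not read", "didn't read")):
        return "non_reader"
    if "every week" in text or "most weeks" in text:
        return "regular_reader"
    return "occasional_reader"

# Invented example answers standing in for in-depth interview transcripts
answers = [
    "I read the column most weeks, usually on a Sunday.",
    "To be honest I didn't read it, I only skim the sports pages.",
    "Not sure, I may have seen it once or twice.",
    "I glanced at it now and then when the headline caught my eye.",
]

coded = [code_response(a) for a in answers]
print(Counter(coded))
```

In practice, of course, such coding is interpretive rather than mechanical; the point of the sketch is only that once answers are reduced to categories, the ''stable'' and recurrent in the data become countable.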


Reading Between the Lines—Qualitative Data Emerging From a Quantitative Research Tool
The use of large-scale ‘‘tick-box’’ questionnaires is often favored by researchers employing
primarily quantitative research methods because it provides data that are precise and, arguably,
unambiguous. The implicit expectation is that survey respondents comprehend the questions
posed in the same way as the researchers/pollsters do, that they hold attitudes on all the issues
raised, and that they are willing to share these views with the researchers.
Thus, most research reports presenting findings from quantitative surveys do not provide
details of ‘‘unwanted noise’’ in the survey process, such as comments scribbled onto survey
forms and phone calls explaining why forms were not returned.3 In choosing to ignore this
type of data, researchers may unconsciously pledge allegiance to the positivist paradigm
(Morgan, 2007, pp. 63-64), neglect an important part of the research process, and unduly limit
the value of their data. The OPOS invited comments from respondents on its last page, but notes
were also found scribbled next to questions throughout the questionnaire. The number of com-
ments made on the survey forms was noticeably high and thus they were recorded during data
entry.
About a third of respondents felt the need to comment on some aspect of the survey. Some of
these comments were purely practical, for example, advising the researchers on a change of
address,4 but most were much more substantive, inquiring about the validity of the questions,
and commenting on, and qualifying, answers. More than a quarter (27%) of all survey respond-
ents made such substantive comments.
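Recording comment types during data entry lends itself to a simple tally. The sketch below is hypothetical: the records are invented (the respondent IDs merely mimic the study's labeling format), and it only shows how the shares of practical and substantive comments could be computed from such a log.

```python
# Illustrative tally of comments logged during survey data entry.
# Each record: (respondent_id, comment_type), where comment_type is
# "none", "practical" (e.g. a change of address), or "substantive".
records = [
    ("HiPpre132", "substantive"),
    ("Nopost029", "substantive"),
    ("HiPpre201", "practical"),
    ("Nopost114", "none"),
    ("HiPpre077", "none"),
    ("Nopost150", "none"),
]

total = len(records)
any_comment = sum(1 for _, kind in records if kind != "none")
substantive = sum(1 for _, kind in records if kind == "substantive")

# Shares of respondents commenting at all, and commenting substantively
print(f"commented at all: {any_comment / total:.0%}")
print(f"substantive:      {substantive / total:.0%}")
```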
The methodology literature suggests that survey respondents and interview participants, once they have agreed to take part in research, ''accept the framework of questions and try earnestly to work within that framework'' (Schuman & Presser, 1996, p. 299). And this certainly rings true for the ''silent'' majority of those taking part in the OPOS. However, the fact that a sizeable minority of respondents expressed doubts about particular questions or the survey methodology, or took the opportunity to qualify what could be interpreted as simplistic responses to very complex issues, raises questions about the appropriate interpretation of survey research.
Using these additional qualitative data that were volunteered by survey respondents enabled
some more general reflections on the limitations of survey questions. What do survey respond-
ents actually mean or what are they thinking about when answering Likert scale–type questions?
Two notes scribbled by two different respondents next to the same survey question provided
a stark reminder of how little survey questions reveal about a respondent’s reasoning. The par-
ticular question and answer categories are included in Box 1.

Box 1. Question 39 From the OPOS (Oxford Public Opinion Survey) Questionnaire

39. How effective do you think a prison sentence is in reducing an offender's likelihood of reconviction?
□ Very effective
□ Fairly effective
□ Not very effective
□ Not at all effective
□ Don't know


Both survey respondents ticked the same answer, namely that prison sentences were ‘‘not very
effective’’ in reducing an offender’s likelihood of reconviction; however, they provided very
different explanations for their choice through the scribbled comments next to the survey question.
One commented that prisons were not very effective ‘‘due to prisons being more like holiday
camps’’ (comment on survey form, HiPpre132), whereas the other respondent expressed very dif-
ferent sentiments.
I think our prisons are much too full and that due consideration on the primitive effects of
a prison sentence on the offender’s family/dependants are not adequately taken into
account. (Comment on survey form, Nopost029)
By acknowledging respondents’ comments, it was possible to illustrate not only that survey ques-
tions can be interpreted differently but also that the same answers can have very different explanatory
value. Thus, to regard survey results as accurate reflections of survey respondents’ attitudes and be-
liefs seems to ignore not only a considerable body of social psychological research and scholarship but
also survey respondents’ explicit concerns about the limitations of tick-box questionnaires.
The choice of including these ‘‘unwanted’’ and unexpected data based on a pragmatic under-
standing of research methods thus highlighted one of the key advantages of pragmatism as a par-
adigm, namely ‘‘allowing for new and deeper dimensions to emerge’’ (Jick, 1979, p. 604). As
indicated by the lack of references to such unwanted noise in the most established and largest
British social surveys, the British Social Attitudes Survey and the British Crime Survey,
many quantitative researchers dismiss such comments as unreliable and unrepresentative. Sim-
ilarly, many qualitative researchers would dismiss them because of their nature as responses to
a quantitative research tool. Embracing a pragmatist framework, it was possible to regard these
comments as data regardless of the type of method used to prompt them.
Combining the quantitative OPOS, the in-depth interviews, and contextual data facilitated inter-
pretation of each data set, two of them collected through fairly standard social scientific research
methods, and the last one in a more grounded, inductive research fashion (Bottoms, 2000, pp.
42-44; Bryman, 2004, p. 10 and chapter 19). Thus, analyzing the data sets abductively as well as
deductively and inductively, separately at first, then moving back and forth between the data sets
with the knowledge produced by each one, finally bringing them together, enabled the interpretation
of the data from a multidimensional perspective, each data set informed, questioned, and enhanced
by the others. Using separate analytical steps of ‘‘between-method’’ triangulation made it possible
to reflect on the added value of using more than one method to address the research questions (see
Jick, 1979 for an in-depth discussion of different forms of integration and triangulation).
The analysis of the data was situated by providing the spatial and temporal context to the
Crime Scene study. Although it is clear that small local studies suffer from limitations as regards
their generalizability, they do have their advantages. They offer insights into the structures and
processes involved in creating certain findings, aid appropriate interpretation, and enhance the validity of research and its meaningfulness (Erlandson et al., 1993, pp. 16-18). Arguably, only the
more detailed understanding of where, when, and how the research was implemented provides
the necessary insights into the processes that resulted in its particular findings. As some argue,
it is impossible to draw meaningful conclusions in the sense of time- and context-free
generalizations (Erlandson et al., 1993, pp. 14-19; Tashakkori & Teddlie, 1998, p. 10).

Reflecting on the Crime Scene Study—The Practicalities of Mixed Methods Research
Individual elements of the Crime Scene study were resource intensive: designing and implementing a public opinion survey and writing weekly columns for a local newspaper; designing and


implementing in-depth interviews; and doing observational work were all labor and time intensive. Moreover, they demanded that the author take on a number of different roles: pollster, (lay) statistician, interviewer, participant and non-participant observer, columnist, and ''ethnographer.'' Thus, mixed methods research of this kind requires researchers who have the ability and, more important, the inclination to do ''number crunching'' as well as to work with ''soft'' data.
The OPOS was designed to provide evidence on whether or not the experiment ‘‘worked,’’
that is, whether Oxford Times readers would be interested in the column, read it, and improve
their levels of knowledge of crime and criminal justice. In isolation, the survey research would
have been unable to offer much in the way of explaining the findings, their meanings, and how to
understand and thus interpret them (Greene et al., 2001, p. 27). Why did Oxford Times readers
choose not to read the column? Why did they not read the column avidly every week? And if they
did read them, why did it not increase their knowledge or change their views? The survey alone
would have provided little more than a snapshot, a photograph of a social experiment without the
subtitle enabling it to come to life. The limitations of survey research and ‘‘polls’’ have been
discussed elsewhere (Blumer, 1948; Bourdieu, 1993; Osborne & Rose, 1999), and restricting
the research to that one method alone would have limited the research’s value. The in-depth in-
terviews offered some useful insights into the ways in which the public respond to standardized
questionnaires on crime and criminal justice, thus helping to overcome some of the inherent
problems in interpreting one-dimensional survey research. Thus, the findings from the different
research methods are used in a coordinated way, to ‘‘illustrate, enhance, help to explain, or refine
the other set of findings’’ (Greene et al., 2001, p. 31).
There is a chance, of course, that a mixed methods design leads to heterogeneous results that
need to be interpreted carefully. Do such results undermine one or other of the methods used or
do they simply represent different dimensions of the interrogated phenomenon? As Greene et al.
(2001, p. 41) suggest, designing, analyzing, and interpreting mixed methods research requires
reflexivity and care. It became clear that so-called quantitative research methods such as large-scale public opinion surveys also capture qualitative data, whilst qualitative data can also be quantified. Thus, assuming logical contradictions and a duality of data based on a rigid interpretation of the main paradigms of positivism/postpositivism and constructivism is neither sensitive nor receptive to the complexity of the social world. Pragmatism as a rationale for mixed
methods research has proven to be a great tool to go beyond testing a particular idea and describ-
ing a status quo. Applying the reasoning of pragmatism has enabled me to make use of a valuable
source of data, the scribbled, spontaneous comments provided by OPOS respondents.
However, this still leaves open some questions. How does pragmatism reconcile the notions
of causality and relativism and subjectivity? Can those notions coexist in one paradigmatic
framework? The answer to those questions warrants a separate article, but I would tentatively
suggest that pragmatism can incorporate both ideas as it acknowledges the existence of structural
regularities that are moderated by the unpredictability of human nature. Thus, causal relation-
ships can apply ‘‘most of the time’’ unless the ‘‘human element’’ undermines and changes them.

What Guides Pragmatic Research?


So, does all of this suggest that the paradigm of pragmatism should be linked to mixed methods
research? Pragmatism does not require a particular method or methods mix and does not exclude
others. It does not expect to find unvarying causal links or truths but aims to interrogate a partic-
ular question, theory, or phenomenon with the most appropriate research method. Hanson (2008,
p. 107) regards validity, interpreted as the ‘‘relationship between theory and method,’’ the closest
possible match of theory and method, as the ‘‘paramount criteria for judging the legitimacy for
a method.’’ Importantly, in this interpretation, validity is not the same as truth in the scientific sense
of ‘‘correspondence to reality’’ (Rorty, 1991, p. 64; see also, Mounce, 1997, p. 98).
This type of validity, in contrast to the mechanistic validity of quantitative methods (‘‘the
logic of scientific method,’’ Rorty, 1991, p. 65), requires reflection on the question or theory
to which the data speak. Data are collected with the aim of answering a specific research question; however, as is all too familiar to many researchers, fieldwork does not necessarily turn out that way. Researchers have to be aware from the outset that the data collated as part of research may not ''fit'' the research question as well as desired and may point to uncertainties and a ''human element'' that were not considered at the design stage. However, this does not suggest that we can
abandon the original research question and ‘‘just answer another one’’ but that the findings need
reflection and abductive reasoning, and the research methods or the underlying theory need
sharpening or rethinking.
In a way, pragmatism is a commitment to uncertainty, an acknowledgement that any knowl-
edge ‘‘produced’’ through research is relative and not absolute, that even if there are causal re-
lationships they are ‘‘transitory and hard to identify’’ (Teddlie & Tashakkori, 2009, p. 93). This
commitment to uncertainty is not philosophical skepticism, which holds that we cannot know anything; rather, it is an appreciation that relationships, structures, and events that follow stable patterns are open to shifts and changes dependent on precarious and unpredictable occurrences (Mounce, 1997, pp. 99-101).
The acknowledgement of the unpredictable human element forces pragmatic researchers to be
flexible and open to the emergence of unexpected data. This means that, in Kuhn's (1962, 1970) terms, pragmatism as a paradigmatic constraint reminds researchers of their ''duty'' to be curious and adaptable.
Ultimately, pragmatism brushes aside the quantitative/qualitative divide and ends the para-
digm war by suggesting that the most important question is whether the research has helped
‘‘to find out what [the researcher] want[s] to know’’ (Hanson, 2008, p. 109). Are quantitative
and qualitative methods really that different or is their dichotomy politically motivated and
sociologically constructed (Hanson, 2008)? Pragmatists do not ''care'' which methods they use as long as the methods chosen have the potential to answer what it is one wants to know. Naturally, this is not an excuse for sloppy research, and pragmatic should never be confused with expedient (Denscombe, 2008, p. 274); rather, pragmatic research requires a good understanding of quantitative and qualitative methods and analyses that is transparent and replicable (as far as possible). In this respect, pragmatists should contribute to reaching an agreement on what constitutes good-quality social research (Hammersley, 2008, p. 177).
This discussion has relevance for mixed methods research in two ways. Pragmatism can be
used as a guide not only for top-down deductive research design but also for grounded inductive
or abductive research. It offers the chance to produce a ‘‘properly integrated methodology for the
social sciences’’ (Morgan, 2007, p. 73) in acknowledging the value of both quantitative and qual-
itative research methods and the knowledge produced by such research in furthering our under-
standing of society and social life. Pragmatism may thus enable researchers to enjoy the
complexity and messiness of social life and revive a flagging sociological imagination.

Acknowledgments
My thanks go to Howard Davis for his invaluable comments on a draft of this article and to Richard Young
for supporting me and providing inspiration during the research at the heart of the discussion. I am very
grateful for the constructive comments of six anonymous reviewers and the editors on earlier versions of
the article.

Declaration of Conflicting Interests


The author declared no potential conflicts of interests with respect to the authorship and/or publication of
this article.

Funding
The author received funding from the Nuffield Foundation to carry out the research on which the article is based.

Notes
1. The Crime Scene study was funded by the Nuffield Foundation.
2. For example, a Solomon four-group design was used to control for the interaction effect of pretesting
participants.
3. The British Crime Survey, for example, comes with its own Technical Report; however, no reference is
made to comments or other data volunteered by the interviewees except for a detailed account of how
verbatim offence descriptions need to be coded into the appropriate offence category (see, e.g., Grant,
Bolling, & Sexton, 2006). Similarly, every issue of the British Social Attitudes Survey contains a tech-
nical report. However, again in the three issues sampled, including the introduction to the first survey,
there was no mention of information volunteered by survey respondents over and above the categorical
answers to specific survey questions (Jowell, 1984; Park, Curtice, Thomson, Bromley, & Phillips, 2004;
Park, Curtice, Thomson, Phillips, & Johnson, 2007).
4. Questionnaires were coded; the code could be related back to respondents’ names and addresses for the
follow-up survey. Contact details were deleted once the survey process was complete and data analysis
was carried out on anonymous survey data.

References
Bachman, R., & Schutt, R. (2003). The practice of research in criminology and criminal justice (2nd ed.).
Thousand Oaks, CA: SAGE.
Blumer, H. (1948). Public opinion and public opinion polling. American Sociological Review, 13, 542-549.
Bottoms, A. (2000). The relationship between theory and research in Criminology. In R. D. King & E. Wincup
(Eds.), Doing research on crime and justice (pp. 15-60). Oxford, UK: Oxford University Press.
Bourdieu, P. (1993). Public opinion does not exist (R. Nice, Trans.). In P. Bourdieu (Ed.), Sociology in
question (pp. 149-158). London: SAGE.
Brewer, J., & Hunter, A. (1989). Multimethod research: A synthesis of styles. Newbury Park, CA: SAGE.
Bryman, A. (2004). Social research methods (2nd ed.). Oxford, UK: Oxford University Press.
Bryman, A. (2007). Barriers to integrating quantitative and qualitative research. Journal of Mixed Methods Research, 1, 8-22.
Creswell, J. W., & Plano Clark, V. L. (2007). Designing and conducting mixed methods research. Thousand
Oaks, CA: SAGE.
Creswell, J. W., & Tashakkori, A. (2007). Editorial: Developing publishable mixed methods manuscripts. Journal of Mixed Methods Research, 1(2), 107-111.
Denscombe, M. (2008). Communities of practice: A research paradigm for the mixed methods approach. Journal of Mixed Methods Research, 2, 270-283.
Dewey, J. (1925). Experience and nature. Whitefish, MT: Kessinger.
Erlandson, D. A., Harris, E. L., Skipper, B. L., & Allen, S. D. (1993). Doing naturalistic inquiry: A guide to
methods. Newbury Park, CA: SAGE.
Feilzer, M. (2007). Criminologists making news? Providing factual information on crime and criminal
justice through a weekly newspaper column. Crime, Media, Culture, 3, 285-304.
Fishman, M. (1978). Crime wave as ideology. Social Problems, 25, 531-543.
Grant, C., Bolling, K., & Sexton, M. (2006). 2005-6 British crime survey (England and Wales): Technical
Report, Volume I. London: Home Office.
Greene, J., Benjamin, L., & Goodyear, L. (2001). The merits of mixing methods in evaluation. Evaluation,
7, 25-44.
Greene, J., & Caracelli, V. (2002). Making paradigmatic sense of mixed methods practice. In A. Tashakkori & C. Teddlie (Eds.), Handbook of mixed methods in social and behavioural research (pp. 91-111). London: SAGE.
Hammersley, M. (2008). Questioning qualitative inquiry: Critical essays. London: SAGE.
Hanson, B. (2008). Wither qualitative/quantitative? Grounds for methodological convergence. Quality & Quantity, 42, 97-111.
Hughes, J. A., & Sharrock, W. W. (2007). Theory and methods in sociology. Basingstoke, UK: Palgrave
Macmillan.
Jick, T. D. (1979). Mixing qualitative and quantitative methods: Triangulation in action. Administrative Science Quarterly, 24, 602-611.
Johnson, R. B., & Onwuegbuzie, A. J. (2004). Mixed methods research: A research paradigm whose time has come. Educational Researcher, 33(7), 14-26.
Jowell, R. (1984). Introducing the survey. In R. Jowell & C. Airey (Eds.), British social attitudes: The 1984
report (pp. 1-10). Aldershot, UK: Gower.
Kuhn, T. S. (1962). The structure of scientific revolutions (1st ed.). Chicago: University of Chicago Press.
Kuhn, T. S. (1970). The structure of scientific revolutions (2nd ed.). Chicago: University of Chicago Press.
Miller, S. (2006). Mixed methods as methodological innovations: Problems and prospects. Methodological
Innovations Online, 1, 1-7.
Mills, C. W. (1959). The sociological imagination. New York: Oxford University Press.
Morgan, D. L. (2007). Paradigms lost and pragmatism regained. Journal of Mixed Methods Research, 1,
48-76.
Mounce, H. O. (1997). The two pragmatisms: From Peirce to Rorty. London: Routledge.
Osborne, T., & Rose, N. (1999). Do the social sciences create phenomena?: The example of public opinion
research. British Journal of Sociology, 50, 367-396.
Park, A., Curtice, J., Thomson, K., Bromley, C., & Phillips, M. (Eds.). (2004). British social attitudes: The
21st report. London: SAGE.
Park, A., Curtice, J., Thomson, K., Phillips, M., & Johnson, M. (Eds.). (2007). British social attitudes: The
23rd report. London: SAGE.
Robson, C. (1993). Real world research. Oxford, UK: Blackwell.
Rorty, R. (1991). Objectivity, relativism and truth: Philosophical papers (Vol. 1). Cambridge, UK: Cambridge University Press.
Rorty, R. (1999). Philosophy and social hope. London: Penguin Books.
Schuman, H., & Presser, S. (1996). Questions and answers in attitude surveys: Experiments in question form, wording and context. London: SAGE.
Tashakkori, A., & Teddlie, C. (1998). Mixed methodology: Combining qualitative and quantitative
approaches. Thousand Oaks, CA: SAGE.
Teddlie, C., & Tashakkori, A. (2009). Foundations of mixed methods research. Thousand Oaks, CA:
SAGE.

Bio
Martina Feilzer is a lecturer in criminology and criminal justice, School of Social Sciences, Bangor
University, United Kingdom.
