Pat Bazeley
Research Support P/L
PO Box 2005
Bowral NSW 2576
Australia
Email: pat@researchsupport.com.au
Abstract
In a context of increasing emphasis on academic performance and accountability, data
from a structured survey in which academics elaborated on eight different attributes of
high-performing researchers were used to build a conceptual model of research
performance. From these data, research performance was seen to comprise two basic
components, with six secondary level dimensions and a range of potential indicators.
Four essential (necessary and sufficient) dimensions, relating to the research activity
component of research performance were: engagement, task orientation, research
practice and intellectual processes. Two alternative dimensions (of which at least one is
necessary) relating to the performance, or making research visible, component of
research performance were: dissemination, and collegial engagement. Research
performance was seen to occur within conditions provided by an institutional context
(education and training; opportunity and resources), and to bring about a range of
outcomes (product, impact and reputation).
Aim/Questions
The purpose of this project was to develop an empirically-based, theoretical concept of
research performance which could then be used to inform thinking about useful
indicators of research performance at a local, if not national level.
It is recognised that indicators at a national level must employ a degree of
expediency—the cost should not outweigh the benefits. At the local level, however (at
least in Australia), expedient national performance indicators, designed only to compare
whole institutions, are employed inappropriately to assess the ‘research active status’ of
individual academics and as a basis for distribution of funding to departments (Bazeley,
2006a). While the results of this study are designed to inform work potentially leading
to operationalisation and measurement of research performance, that consequential
stage is not developed here.
Specific questions guiding this foundational (conceptual) part of the study,
therefore, were:
(1) How are high-performing researchers described by Australian academics?
(2) What are the dimensions of research performance and how are they structured?
[Table: descriptors given for each researcher attribute; * = number of people using the listed descriptor for the identified researcher attribute.]
Overall, quality and ability were seen to be most evident in creativity, innovation
and originality supported by a high level of research skills and personal application.
Productivity and recognition, in contrast, were evidenced more by dissemination of work,
especially through publications, with recognition being aided also by networking and
making oneself known. These four attributes attracted fuller descriptions than the
remaining four, and the nature of the descriptions given reinforces the notion of a
significant quality-quantity divide in observations of research performance, with
attributes such as benefit, satisfaction and approachability being independent of this
dichotomy.
[Figure: dimensional model of research performance. Task orientation (organised-disciplined; careful-thorough; task finisher; commitment-persistence; problem solver; confident) and intellectual processes (creative-innovative; analytic thinker; intellectual curiosity-open mind) are shown among the dimensions conjoined into research activity, which together with the performing dimensions constitutes research performance.]
Legend:
each labelled element is an ontological dimension
+ logical OR (substitutable dimensions)
* logical AND (conjunction of non-causal necessary conditions)
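The AND/OR structure of the model — four essential research-activity dimensions that must all be present, plus at least one of the two alternative performing dimensions — can be read as a simple Boolean condition. A minimal illustrative sketch; the function name and string labels are mine, not the study's:

```python
# Essential (necessary and sufficient) dimensions of the research
# activity component: all four are required (logical AND).
ACTIVITY = {"engagement", "task orientation",
            "research practice", "intellectual processes"}

# Alternative dimensions of the performing component: at least one
# is required (logical OR).
PERFORMING = {"dissemination", "collegial engagement"}


def satisfies_model(dimensions: set) -> bool:
    """True when the set of dimensions present satisfies the model:
    every activity dimension (AND) plus at least one performing
    dimension (OR)."""
    return ACTIVITY <= dimensions and bool(PERFORMING & dimensions)
```

On this reading, an academic exhibiting all four activity dimensions but neither performing dimension would not count as performing research, mirroring the model's claim that research must also be made visible.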
Discussion
Data from a cross-section of academics across a variety of universities have been
analysed to explore academic perceptions of the characteristics of high-performing
researchers and to build a dimensional, conceptual model of research performance. The
dimensions constituting research activity and performing, identified through the
interpretive conceptual analysis of the qualitative data, provide a basis for challenging,
extending and re-building outcome-focused models currently driving research
assessment exercises. One can argue, for example, that engagement, task orientation,
research skills and intellectual processes will be most evident in the quality (rather than
quantity) of output from research, and hence these dimensions and their indicators can
be used to give more specific meaning to that term. Furthermore, the dimensions and
indicators identified in this study could have particular relevance at the local and
individual level of assessment, when the ‘research active’ status of an academic,
department or centre is being determined for workload or other purposes. This is
something that is mismanaged and discriminatory in the current system in Australia, if
not elsewhere (Bazeley, 2006a).
To the extent that the concept of research performance that has been developed
in this study is based on empirical data, the items included have been shaped by the
questions asked and the form in which they were asked, and perhaps also by the
Australian academic culture of the mid-1990s when the data were collected. The danger inherent in brief,
self-report questionnaires is that respondents will ‘dash off’ answers without deep
reflection, and this is apparent here, for example, in the relative level of emphasis on
simply ‘having publications’ as a desirable characteristic of researchers. This contrasts,
to some extent, with Åkerlind’s (2008) finding of an equivalent emphasis being given to
personal and real-world benefits stemming from research, and with data gathered in a
non-assessment context which found that professional and public audiences were at
least as important as, if not more important than, academic audiences for researchers working in
social science, professional and humanities disciplines (Bazeley, 2006a).
That the empirical data were drawn from a cross-disciplinary sample of academics
with different levels of personal research expertise and from established, middle-
ranking and new universities contributes to the potential universality of the concept.
What the researcher brings to the data analysis and conceptualisation, to intuitively add
to the breadth of that perspective, is almost 40 years of research experience, 20 of
them spent in a cross-disciplinary and cross-national developmental role. Thus, the
concept of research performance is intended to be universal at the dimensional level. At
the indicator level, however, those in different disciplines (and at different levels of
maturity) will quite possibly require different emphases to reflect the dimension for
their discipline (Åkerlind, 2008). This issue will be explored in a further article, along
with the possible influence of gender and qualification (as a proxy for experience).
A number of interesting dilemmas were faced in building and proposing this
conceptual model. The equation between volume of published output and impact on the
research community has long been a matter of contention. I have proposed that products
of research (such as publications of various kinds), impact and indicators of enhanced
reputation should be viewed as consequences of performance—yet these three things
are what most measures of performance attempt to assess. When Steele (2004, p. 67)
suggested that publication had become more important than dissemination, he went on
to say that the results of research have ‘often been disseminated well before the
publication. The publication is for the accreditation and tenure.’ While the act of
publishing is a form of dissemination (unlike citations, invitations or awards), and was
seen as such in the process of coding the data, it is important, nevertheless, to
distinguish conceptually between the action and its outcome, as this model is designed
to do.
Perhaps my most controversial decision was to view collegial engagement as an
alternative dimension of the performing component of research performance. The
reason for doing so is well presented in a description given of an active researcher by a
female senior scientist: ‘Full of ideas, willing to communicate those ideas – often at the
expense of conventional research articles, sharing ideas with students or colleagues is a
higher priority.’ One is reminded that much of George Herbert Mead’s seminal work
has been published by and through his students, so that the impact he has had on social
psychology and related social sciences has come as much, if not more, through that
mentoring/teaching/inspiring route as through direct publication. Collegial
communities of practice and micro communities of knowledge, with their vital
interpersonal communication channels, are a primary means of converting tacit
knowledge, generated through knowledge creation projects, into explicit knowledge—
yet these are being eroded in the new competitive and isolating environment of
universities (Moss & Kubacki, 2007).
The necessity for ‘hands-on’ engagement points to one further issue in what
makes for research performance. Many researchers regarded as highly productive
‘spend little time doing research themselves; [they] focus on money to employ people
and getting names on publications’ (male, senior scientist). Apart from the significant
number of respondents who mentioned active personal involvement as being important,
some years ago Frost and Stablein (1992) pointed to the significance of ‘handling your
own rat’ for excellence in research. The professor who finds that their assistant or junior
colleague, who was responsible for carrying out a major part of the research process,
has disappeared shortly before a report is due can be placed in a very awkward
situation! Desirable qualities for a researcher, such as being intensively engaged in
one's research, can, like any obsession, have negative side effects, such as the
complaint registered in these data that such people are often neglectful of their other
duties (and often of their families as well).
Further substantive analyses of these data will identify whether interpretations of
the dimensions in the model vary in relation to discipline, gender, and experience, as
noted above. To empirically test the necessity for each of the dimensions and to refine
the indicators outlined above, further data will need to be gathered. Intensive case
studies of a purposive sample of academics at different points in their research career
would provide a useful first assessment, would also assist in seeing more clearly how
desirable behaviours might be evidenced in work done, and would help to clarify
potential issues around the nature and extent of the role played by the published paper in
effective academic communication and advancement of research. Measures based on a
refined set of indicators might then be developed and tested in discriminant analyses
with groups selected to vary by broad consensus in their level of research performance.
Acknowledgements
Initial work on this study was supported by the Australian Research Council. Further analytic work was
undertaken while I was Visiting International Fellow at the Institute for Social Research at the University
of Surrey in 2006.
References
Åkerlind, G. S. (2008). An academic perspective on research and being a researcher: an integration of the
literature. Studies in Higher Education, 33(1), 17-31.
Archer, L. (2008). Younger academics’ constructions of ‘authenticity’, ‘success’ and professional
identity. Studies in Higher Education, 33(4), 385-403.
Batterham, R. (2004). Measuring excellence: A chief scientist perspective. In National Academies Forum
(22 June), Measuring excellence in research and research training, pp.3-8. Canberra: The
Academy of Science.
Bazeley, P. (2006a). Research dissemination in Creative Arts, Humanities and the Social Sciences.
Higher Education Research and Development, 25(3), 215-229.
Bazeley, P. (2006b). The contribution of computer software to integrating qualitative and quantitative
data and analyses. Research in the Schools, 13(1), 63-73.
Bazeley, P. (2009a). Mixed methods data analysis. In S. Andrew & E. Halcomb (Eds.), Mixed methods
research for nursing and the health sciences (pp. 84-118). Chichester, UK: Wiley-Blackwell.
Bazeley, P. (2009b). Integrating analyses in mixed methods research [Editorial]. Journal of Mixed
Methods Research, 3(3), 203-207.
Bourdieu, P. (2001). Homo academicus. Cambridge: Polity Press.
Bowden, J., Green, P., Cherry, N., & Usher, R. (2005). Academics’ ways of understanding success in
research activities. In J. Bowden & P. Green (Eds.), Doing developmental phenomenography
(pp. 128-144). Melbourne: RMIT University Press.
Boyatzis, R. E. (1998). Transforming qualitative information: thematic analysis and code development.
Thousand Oaks, CA: Sage.
Brew, A. (2001). Conceptions of research: a phenomenographic study. Studies in Higher Education, 26,
271-285.
Bruce, C., Pham, B., & Stoodley, I. (2004). Constituting the significance and value of research: views
from information technology academics and industry professionals. Studies in Higher Education,
29, 219-238.
Creamer, E. G. (1998). Assessing faculty publication productivity: issues of equity (Vol. 26). Washington
D.C.: The George Washington University, Graduate School of Education and Human
Development.
Frost, P. J., & Stablein, R. E. (1992). Doing exemplary research. Newbury Park, CA: Sage.
Goertz, G. (2006). Social science concepts: a user's guide. Princeton, NJ: Princeton University Press.
Marton, F. (1981). Phenomenography - describing conceptions of the world around us. Instructional
Science, 10, 177-200.
McNay, I. (2003). Assessing the assessment: an analysis of the UK Research Assessment Exercise, 2001,
and its outcomes, with special reference to research in education. Science and Public Policy,
30(1), 47-54.
Moss, G., & Kubacki, K. (2007). Researchers in higher education: a neglected focus of study? Journal of
Further and Higher Education, 31(3), 297-310.
Nussbaum, M. C. (1992). Human functioning and social justice: in defence of Aristotelian essentialism.
Political Theory, 20(2), 202-246.
Steele, C. (2004). Changing research practices in the digital information and communication
environment. In National Academies Forum (22 June), Measuring excellence in research and
research training, pp. 61-71. Canberra: The Academy of Science.
Steele, C., Butler, L., & Kingsley, D. (2006). The publishing imperative: the pervasive influence of
publication metrics. Learned Publishing, 19, 277-290.
Tight, M. (2004). Research into higher education: an a-theoretical community of practice? Higher
Education Research and Development, 23(4), 395-411.