
The current issue and full text archive of this journal is available at www.emeraldinsight.com/0737-8831.htm

Evidence-based librarianship: the EBL process


Jonathan Eldredge
Health Sciences Library and Informatics Center, University of New Mexico, School of Medicine, Albuquerque, New Mexico, USA
Abstract
Purpose – The paper seeks to describe the EBL process in sufficient detail that readers can apply it to their own professional practice.
Design/methodology/approach – The paper takes the form of a narrative literature review.
Findings – The EBL process can be summarized through its five steps: formulate a clearly defined, relevant, and answerable question; search for an answer in both the published and unpublished literature, plus any other authoritative resources, for the best available evidence; critically appraise the evidence; assess the relative value of expected benefits and costs of any decided-upon action plan; and evaluate the effectiveness of the action plan.
Originality/value – References for readers to pursue more in-depth research into any particular step or a specific aspect of the EBL process are provided. The EBL process assists librarians in applying the best available evidence to answering the more important questions facing their practice, their institutions, and the profession. This evidence can become the basis for making sound decisions.
Keywords – Evidence-based practice, Librarianship, Librarians
Paper type – Literature review

Received 4 May 2006 Accepted 25 May 2006

Background
What is EBL? A broad international consensus includes in its definition of EBL the use of the EBL process as a means of integrating the best available evidence into making important decisions. Wide agreement also surrounds the steps in the EBL process, as well as the view that certain forms of evidence are better than others for answering different types of EBL questions. Agreement over certain particulars of EBL begins to thin when librarians must reconcile the course prescribed by the evidence with the particular social, cultural, or political circumstances in which they function. These areas of divergence in defining EBL also correspond with operationalizing steps 4 and 5 in the EBL process, as this article will explain. Librarians are dedicated to serving their communities, and these communities can vary greatly in their needs and values. As Crumley and Koufogiannakis (2002) remind us, librarianship has grown out of the social sciences. This observation reflects the need for librarians to be aligned closely with their communities' expectations. Thus, while elements of a common EBL definition, such as the EBL process, enjoy broad international consensus, a more thorough and visionary definition statement has eluded our profession. Given the diversity of national practices and contexts, perhaps it is inappropriate even to expect a definition of EBL that commands wide international consensus. With these considerations in mind, the author wants readers to understand exactly how he defines EBL, in the event that some specific transnational adaptations are in order when reading this article:

Library Hi Tech, Vol. 24 No. 3, 2006, pp. 341-354. © Emerald Group Publishing Limited 0737-8831. DOI 10.1108/07378830610692118


Evidence-based librarianship (EBL) provides a process for integrating the best available scientifically-generated evidence into making important decisions. EBL seeks to combine the use of the best available research evidence with a pragmatic perspective developed from working experiences in librarianship. EBL actively supports increasing the proportion of more rigorous applied research studies so the results can be available for making informed decisions.


This definition requires two distinctions. First, EBL reflects both the efforts of practitioners, who consume the results of research in making those decisions, and the efforts of applied researchers, who strive to produce the research evidence intended for use by practitioners. This article on the EBL process has been written primarily for EBL practitioners. Second, EBL practice should not be confused with the vital roles of EBM librarians. These EBM librarians have played important roles in the evidence-based medicine (EBM) movement by assisting teams of researchers in solving clinical problems. In contrast, EBL focuses upon using the best available evidence to make sound decisions on library and informatics issues. Although many EBL librarians (including the author) have participated extensively in these EBM librarian roles, and there are shared elements between EBM and EBL, the types of questions and evidence utilized in EBL are distinct from those of EBM. In addition, while EBL originated within health sciences librarianship, the movement has expanded over the past two years to include all types of specialties within librarianship. When this article refers to librarianship, it also generally includes informatics and information science, since the questions and forms of evidence are very similar across these three highly interrelated fields. This definition of EBL also requires a working definition of science to ensure a more operational understanding of what EBL means in practice. Annals of Internal Medicine Editor Frank Davidoff (1995) indicates that:
Science is cognitive, involving accurate observation and clear description, hypothesis generation, data gathering and interpretation, and the creation of theory. But science is also a state of mind: skeptical, open, balanced, respectful of evidence, thorough, always on the alert for bias.

Timothy Ferris (1998) has defined science more metaphorically:


Science is not a collection of facts, any more than opera is a collection of notes. It's a process, a way of thinking, a method, based on a single insight – that the degree to which an idea seems true has nothing to do with whether it is true, and the way to distinguish factual ideas from false ones is to test them by experiment.

In the same vein, the role of the EBL process within EBL might be considered analogous to the role of the scientific method within science. With this definition and these distinctions and clarifications in mind, we can outline the EBL process as follows:
(1) formulate a clearly-defined, relevant, and answerable question;
(2) search for an answer in both the published and unpublished literature, plus any other authoritative resources, for the best available evidence;
(3) critically appraise the evidence;
(4) assess the relative value of expected benefits and costs of any decided-upon action plan; and
(5) evaluate the effectiveness of the action plan (Eldredge, 2000c).
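Purely as an illustrative aside, and not part of the article itself, the five steps above can be sketched as a simple checklist a practitioner might keep for each EBL question. The function name and structure here are invented for illustration only:

```python
# The five EBL steps, paraphrased from the outline above.
EBL_STEPS = [
    "Formulate a clearly defined, relevant, and answerable question",
    "Search published and unpublished literature for the best available evidence",
    "Critically appraise the evidence",
    "Assess expected benefits and costs of the decided-upon action plan",
    "Evaluate the effectiveness of the action plan",
]

def ebl_progress(completed_steps):
    """Return one status line per step: [x] for done, [ ] for pending."""
    return ["[{}] {}".format("x" if i < completed_steps else " ", step)
            for i, step in enumerate(EBL_STEPS)]

# A practitioner who has formulated a question and finished searching:
for line in ebl_progress(completed_steps=2):
    print(line)
```

Such a checklist is only a mnemonic; as the article stresses, each step demands professional judgment rather than mechanical completion.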

The EBL process
Formulate
The author has taught a continuing education course on EBL on 18 occasions. The course begins with the author asking each participant via e-mail in advance to pose a question they have had which relates to their work. Their assignment, specifically, asks them to: "Provide me with at least one focused, practical question directly related to librarianship that has arisen during the past year while fulfilling your job duties." With hardly any exceptions, participants have been able to quickly identify and articulate their questions. This experience in the author's course seems consistent with a large body of research involving other healthcare professionals, who identify questions frequently, at a rate of about one or two questions per patient encounter (McKnight and Peet, 2000; Gorman and Helfand, 1995; Forsetlund and Bjørndal, 2001; Ely et al., 2005). The author has collected technology-related questions from participants in his EBL courses, with the locations of the courses, of potential interest to Library Hi Tech readers, in Table I. Librarians devote much of their working hours to answering questions for their users. Librarians generally identify high-quality information resources, collect or make
Is online instruction alone more effective than online instruction combined with in-person class time? (Waterloo, Ontario, Canada)
How do we evaluate the potential impact of introducing virtual reference? (Waterloo, Ontario, Canada)
What is the most effective method for teaching doctors database searching skills? (Oxford, England, UK)
What personality characteristics in a librarian make her or him a good or bad searcher? (Oxford, England, UK)
Does training in accessing an online health information resource for consumers improve their health status on measurable outcomes such as blood pressure, weight loss or hospital admission rates? (Charlottesville, VA, USA)
Does patient education increase patient satisfaction? (San Diego, CA, USA)
Does weeding increase subsequent circulation of a collection? (Chicago, IL, USA)
How can we best measure usability of our library's web pages? (San Antonio, TX, USA)
Do students from health sciences universities where informatics instruction is integrated into the curriculum have better outcomes than students from universities where these skills are not fully integrated? (Denver, CO, USA)
What is the best way to implement access to electronic journals? (Miami, FL, USA)
What print reference books do we still need to buy now that many reference questions can be answered with free web resources? (Salt Lake City, UT, USA)
Why should libraries and database providers, including the National Library of Medicine, continue to put time and effort into developing sophisticated search systems (MeSH headings, subject terms, etc.) when the world is increasingly "google happy" with one or two words into any search box we provide? (Salt Lake City, UT, USA)
What statistics should be collected for electronic full-text journals? (Waterloo, Ontario, Canada)

Table I. Questions arising in EBL courses, with locations


them available to their users, train users how to find these resources, and assist users in our reference roles in formulating and then answering their questions. Yet, librarians find it more challenging than they might expect to formulate and refine their own questions. Time taken at the outset to formulate a question appropriately produces in the end more closely matched answers, and assists EBL librarians, while still in the first step, to anticipate the final step of evaluating the overall results of the EBL process. The author has found it helpful for participants in his course to work on their questions in small groups. This approach of discussing these questions in a group seems to produce more answerable questions, perhaps because the group process allows for clarifying the meaning of these questions amongst people with different experiences and perspectives. There are a number of resources to help readers with the first step in the EBL process (Booth, 2006; Medical Library Association, Research Section, Evidence-Based Librarianship Implementation Committee, 2001; Eldredge, 2000a). In addition, here are a few practical suggestions on identifying, formulating, and refining answerable EBL questions:
- Cultivate the habit of recognizing and recording questions related to our profession.
- Capture: recognize questions as they arise and, before they are forgotten, record them immediately, no matter how vague or in need of further refinement.
- Refine: during a quiet moment, ask colleagues for assistance in helping you refine and clarify what you really want to know, recalling Oxman and Guyatt's (1988) adage: "Fuzzy questions lead to fuzzy answers."
- Reframe: is there really another question behind your initial question? Experience suggests that the question you initially ask rarely continues to be the question you eventually pursue.
- Prioritize: determine how important this question is to you, your institution, or the profession; further determine the immediacy of the need to know an answer: today, tomorrow, or just some day? Clearly, not all questions that we formulate can deserve our full attention. Yet, when making important decisions, our emphasizing the EBL process increases the probability of yielding a high-quality answer.
- Courage: "Great discoveries are made when someone asks a new question rather than provides a new answer" (Shoppers Window, 1998).

Search for an answer in both the published and unpublished literature, plus any other authoritative resources, for the best available evidence
Winning (2004) has observed that "The multifaceted nature of librarianship and information science [...] means that the evidence base is contained in multiple and varied information resources" (p. 71). Winning builds upon others' previous descriptions and assessments of our evidence base to suggest some effective global strategies for searching it. Beverley (2004) also outlines practical tactics for extracting evidence efficiently from those databases with relevance to our field. The existing library literature can be accessed via several databases. This strategy presents additional challenges, as these databases have been demonstrated to index the library literature inconsistently, not only between themselves but also

within the same databases across different years (Eldredge, 2004a, b). Yerkey and Glogowski (1990) have noted that librarianship tends to scatter through literatures outside of librarianship, just as other literatures exert strong influences through their own scatter within the library literature. Librarians regularly constrict the scope of their searches for the best evidence to only the library literature. Yet, many times, by genericizing an EBL question, we can expand the subject coverage scope of our searches for the needed evidence outside the library literature (Eldredge, 2000b, 2004a, b). Crumley and Koufogiannakis (2001, 2002) have made the profound observation that most EBL questions can be assigned to one of six subject domains:
(1) reference;
(2) education;
(3) collections;
(4) management;
(5) information access and retrieval; and
(6) marketing/promotion.
They have further demonstrated that, in at least three of these six subject domains, the needed evidence will likely exist outside of the library literature. Librarians interested in adapting this approach should read their work. Unique characteristics of the library literature present challenges to finding the best available evidence, however. A major portion of our knowledge base resides in the realm of the grey literature of conference papers and posters, supplemented by oral histories within respective workplaces (Eldredge, 2000b; Genoni et al., 2004). Professional incentives exist that enable librarians to receive funding to present papers and display their posters at conferences, but equivalent incentives to publish in the peer-reviewed literature do not exist to a sufficient degree; this might explain this peculiar situation. This pattern seems laced with irony, given our professional role of assisting users from other subject domains or professional literatures in the effective extraction of information from their own knowledge bases. This aberrant pattern requires our utilizing unusual strategies for identifying needed forms of evidence.
Structured abstracts in both the published and unpublished segments of our knowledge base increase the likelihood that relevant evidence will be found for making decisions (Mulrow, 1987; Ad Hoc Working Group for Critical Appraisal of the Medical Literature, 1987; Haynes et al., 1990; Hartley et al., 1996; Hartley, 1997; Bayley et al., 2002; Bayley and Eldredge, 2003). The structured abstract format enables busy librarians to assess quickly whether a professional communication, either published or unpublished, contains needed evidence, and then to extract that evidence. For several years the Medical Library Association (MLA) has required that annual meeting papers and posters be summarized with structured abstracts (Shedlock et al., 2005) to aid the extraction of useful information. MLA has produced tools for assisting librarians in writing structured abstracts, available from its Research Section's home page (Medical Library Association, 2005). The innovation of MLA publishing structured abstracts from its annual meetings on its website builds upon its decades-long tradition of publishing abstracts of papers and posters as a service to annual meeting attendees. Retrieving relevant abstracts


from this gray literature for answering EBL questions still presents a fairly daunting challenge. The journal Hypothesis, usually in its summer issue each year, provides structured abstracts that summarize the papers and posters that have received research awards at the MLA Annual Meeting, which might be an efficient shortcut for busy librarians searching for the best evidence to answer their questions. Some papers and posters submitted for the three international EBL conferences also have used structured abstracts for summarizing these research reports (University of Sheffield, School of Health and Related Research, 2005; University of Alberta, Alberta Society of Evidence Based Librarians, 2003; Australian Library and Information Association, Evidence Based Librarianship Group, 2005). The leading research-oriented journals in health sciences librarianship employ structured abstracts for summarizing research reports: BMC Biomedical Digital Libraries, Health Information and Libraries Journal, Hypothesis, and Journal of the Medical Library Association. Emerald Group Publishing recently adopted the structured abstract requirement for all of its journals (De Verteuil, 2005).

Critically appraise the evidence
Once an EBL question has been formulated clearly in an answerable form, and the search for the best available evidence executed, the available evidence must be critically appraised. Booth and Brice (2004) are the undisputed masters of critical appraisal in EBL. Booth and Brice have developed the checklist shown in Table II to guide critical appraisal in the EBL process. The author strongly recommends that all librarians, even those still aspiring to be EBL practitioners rather than EBL researchers, become familiar with this checklist so they can apply it in evaluating any piece of relevant evidence produced by a search. Brice et al. (2005) recently guided audience members at an International Federation of Library Associations Annual Conference through a critical appraisal example, which might offer readers a more practical simulation. The EBL Levels of Evidence also serve as another helpful tool for appraising evidence. The Levels of Evidence shown in Table III assist the practitioner in evaluating any given piece of evidence, but also allow the busy practitioner to make quick comparisons between multiple pieces of evidence with conflicting results. The Levels of Evidence enable busy librarians to determine whether the original question is a Prediction, Intervention, or Exploration inquiry. They then direct the practitioner to a hierarchical schedule of methodologies, for judging which methodologies are most appropriate for answering each of these three types of queries and most likely to be free of bias (Eldredge, 2002). An inventory of research methods for librarianship and informatics also will serve as a companion guide for defining the varied research methods utilized in the field (Eldredge, 2004a, b). The Levels of Evidence place the systematic review at the top of the categories of evidence for all three types of queries. The systematic review consists of a scientific review of the literature intended to answer a focused question. Literature review search strategies for systematic reviews should be completely transparent, so that a colleague could replicate these searches to find the same cited literature. As the knowledge base of our profession grows and uses more sophisticated research methods, the possibility of conflicting results arises. The systematic review aids the busy EBL practitioner in reviewing the literature in order to make sound decisions. Two recent systematic

Is the study a close representation of the truth?
- Does the study address a clearly focused issue?
- Does the study position itself in the context of other studies?
- Is there a direct comparison that provides an additional frame of reference?
- Were those involved in collection of data also involved in delivering a service to the user group?

Are the results credible and repeatable?
- Were the methods used in selecting users appropriate and clearly described?
- Was the planned sample of users representative of all users (actual and eligible) who might be included in the study?
- What was the response rate and how representative was it of the population under study?
- Are the results complete and have they been analyzed in an easily interpretable way?
- Are any limitations in the methodology (that might have influenced results) identified and discussed?

Will the results help me in my own information practice?
- Can the results be applied to your local population?
- What are the implications of the study for your practice: in terms of current deployment of services? in terms of cost? in terms of the expectations and attitudes of your users?
- What additional information do you need to obtain locally to assist you in responding to the findings of the study?

Source: Booth and Brice (2004)

Table II. Booth and Brice's critical appraisal checklist

Each column lists methodologies from strongest to weakest evidence.

Prediction:
- Systematic review
- Meta-analysis
- Prospective cohort study
- Retrospective cohort study
- Descriptive study
- Case study

Intervention:
- Systematic review
- Meta-analysis
- RCTs
- Prospective cohort study
- Retrospective cohort study
- Descriptive survey
- Case study

Exploration:
- Systematic review
- Summing Up (a)
- Qualitative studies (b)
- Descriptive study
- Case study

Notes: (a) See Light and Pillemer (1984) for a comprehensive review of creative ways to synthesize exploratory research results; (b) qualitative studies include, but are not limited to, focus groups, ethnographic studies, naturalistic observations, and historical analyses
Source: © 2002 Jonathan Eldredge. Reprinted with permission from the author

Table III. EBL Levels of Evidence

reviews by two different teams of librarians, one team in the US and the other in the UK, critically examined the efficacy of clinical medical librarians (Wagner and Byrd, 2004; Winning and Beverley, 2003). Variations exist in how systematic reviews are conducted (Mulrow and Cook, 1998; Light and Pillemer, 1984; Booth and Farmer, 1999; Farmer et al., 1998), although all bring a rigor needed for sound decision making that is not normally found in traditional literature reviews.
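To illustrate how the Levels of Evidence in Table III support quick comparisons between studies with conflicting results, the following hedged sketch encodes each column as a ranked list, strongest method first. The column contents come from Table III; the function and its behavior are invented for illustration and are not part of the EBL literature:

```python
# Table III's three columns as ranked hierarchies (index 0 = strongest).
LEVELS_OF_EVIDENCE = {
    "prediction": ["systematic review", "meta-analysis", "prospective cohort study",
                   "retrospective cohort study", "descriptive study", "case study"],
    "intervention": ["systematic review", "meta-analysis", "RCTs",
                     "prospective cohort study", "retrospective cohort study",
                     "descriptive survey", "case study"],
    "exploration": ["systematic review", "summing up", "qualitative studies",
                    "descriptive study", "case study"],
}

def stronger_evidence(question_type, method_a, method_b):
    """Return whichever method ranks higher (lower index) for this question type."""
    hierarchy = LEVELS_OF_EVIDENCE[question_type]
    return min(method_a, method_b, key=hierarchy.index)

# For an intervention question, an RCT outranks a case study:
print(stronger_evidence("intervention", "case study", "RCTs"))  # prints: RCTs
```

Such a lookup only mirrors the hierarchy itself; the practitioner still judges study quality within each level, as the checklist in Table II requires.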


Assess the relative value of expected benefits and costs of any decided-upon action plan
At the outset, it should be noted that costs and benefits are by no means restricted to monetary amounts in this fourth step in the EBL process. Clark and Wilson (1961) expanded the concept of costs and benefits in their classic description of organizational incentives. They distinguished three types:
(1) Material. These are tangible rewards; that is, rewards that have a monetary value or can easily be translated into ones that have. This incentive most closely matches the traditional concepts of costs and benefits.
(2) Solidary. These derive in the main from associating, and include such rewards as socializing, congeniality, and the sense of group membership and identification.
(3) Purposive. These incentives derive in the main from the stated ends of the association [...] the incentive provided by belief in the organization's purposes.
By keeping in mind these three forms of costs and benefits (material, solidary, and purposive), librarians increase the likelihood that they will avoid a strict focus upon bottom-line financial considerations when making important decisions. This expanded concept of costs and benefits also keeps librarians attuned to local social or cultural contexts, which might need to be factored into the decision-making processes. Decision tree analyses appear to be highly adaptable tools for sorting out various possible scenarios and the comparative strengths of their corresponding evidence (Richardson and Detsky, 1995). Koufogiannakis and Crumley (2004) offer many practical steps for navigating through the adaptation and application of the relevant evidence in specific circumstances.
Their discussion of the politics and the major stakeholders reminds us that even the soundest analyses of the best available evidence might be in vain if a prudent political strategy does not accompany, and possibly modulate, the diffusion of these results into a specific cultural or social context for steps 4 and 5 in the EBL process. Their observations underscore why international (and even, to some extent, intra-national) variances in circumstances make it difficult to be too prescriptive about any local situation. Similarly, these local circumstances probably explain the inability of earnest international efforts to comprehensively define EBL. This article attempts to provide resources that might be adapted across various contexts, however. A personal anecdote offers insight into how EBL decisions need to incorporate more than just the research evidence, since exceptions sometimes need to be made for political reasons. In short, the evidence sometimes needs to be modulated to fit the local circumstances. The author worked with collection development colleagues at another library a number of years ago in implementing an evidence-based approach to selecting and canceling journals in their collection. As the author's consultations progressed, all of this library's journals were subjected to a formula that divided their costs by their use. A journal subscription costing $1,000 per year and having 100 uses within the past year, under this formula, would have a cost/benefit ratio of $10 per use. In this instance, it would be most economical to retain such a journal rather than to cancel it and pay royalty and administrative costs to access it as many times via document delivery/interlibrary loan. The collection development librarians found a single aberration related to their calculations. One relatively expensive journal subscription was used

only a few times a year. The problem was that a powerful faculty member, who brought in millions of research dollars, was a staunch supporter of this one journal subscription. This journal was only a little above the already high average cost of library subscriptions, so it was not a dramatic example. Should the collection development librarians strictly adhere to the EBL approach, despite the political ramifications of anticipated criticism of the library from this one faculty member? To invoke a popular question when confronting principled versus pragmatic considerations: "Is this a hill to die for?" The collection development librarians at that library adhered to their EBL formula overall, but decided to retain the one journal title for political reasons. On the other hand, and importantly, if this EBL approach had been so riddled with political considerations that the journal retention decisions included five or more exceptions, it probably would have destroyed the legitimacy of the EBL approach. Readers with an interest in other EBL applications to answering collection development questions should peruse Gallager et al.'s (2005) recent EBL investigation of retaining print versus electronic formats.

Evaluate the effectiveness of the action plan
Most librarians probably associate the terms "project evaluation" or "program evaluation" immediately with the overall concept of evaluation. These areas indeed are critically important at the institutional and, to a lesser extent, the professional levels, whereas evaluation in this context also incorporates the practitioner level of EBL. Although Booth (2004) has described similar levels of evaluation, a close reading of his chapter, given its limited application to the author's own situation, validates the perception that EBL practice varies between national contexts.
This last step in the EBL process will probably be open to the greatest amount of transnational variance as readers adapt it to their particular circumstances.
Practitioner level. EBL practice occurs mainly within an institutional context. Even if the term EBL never receives specific mention within this context, some sort of evidence enters into the decision-making processes of an institution. The individual EBL practitioner needs to evaluate the EBL process critically and, if at all possible, voice concerns about the integrity of that process in the course of discussions among co-workers. EBL practitioners need to ask whether they individually are cultivating the habit of identifying the many questions that occur in everyday practice. As members of committees or task forces within the institution, are they contributing their individual questions to the discussions? As members of their institutions, are these individuals contributing to the refinement of questions? Once a set of questions is put before the working group, are they helping to identify the most answerable and important of the identified questions? Continuing through the EBL process, are EBL practitioners identifying the most comprehensive and effective strategies for searching for the needed evidence? Are they critically appraising the evidence gathered? Are they alert to their own individual biases, the biases of other individuals, or the biases of the group (groupthink)? Are they considering the many costs and benefits, including the material, solidary, and purposive? Individual practitioners' roles are essential within the institutional EBL context, so their honest and full discussant roles should be cultivated by institutional leaders to ensure the collective wisdom of all individuals involved with any decision.


Beyond the institutional context, EBL practitioners need to pursue individual professional literature reading and research interests. Many questions that occur to individuals will have relevance to colleagues within their own specialty, even when the geographically proximal colleagues within their own institutions are not necessarily attuned to them. EBL practitioners should strive to pursue these specialty-related areas, such as collection development, library instruction, reference, distance services, etc., through professional searches or reviews of their specialty literature. If they pursue research, at even the narrative review level of investigation, do they communicate the results of their research? Are they aware of their own biases, and do they build corrective features into their research designs? While engaged in these specialty-oriented pursuits, the individual EBL librarian should still follow the steps of the EBL process.
Institutional level. When librarians consider the topic of evaluation, they normally think almost exclusively at the institutional level, with the activities of project or program evaluation, as already noted. The author has found Weiss's book (1998) on evaluation to have an orientation toward the subject that seems particularly well-suited to our field. Weiss's distinction between formative and summative evaluations warrants review. She defines these types of evaluation as:
Formative evaluations are designed to assist those who are developing projects in the early phases. The emphasis is on feedback to developers with an eye to improving the final product. Summative evaluation is meant for decisions about whether to continue or end a program, extend it to other locations or cut it back (p. 31).

Of direct relevance to practitioners in our field, particularly for those with prior knowledge and experience of evaluation, will be Burroughs and Wood's (2000) book Measuring the Difference. Burroughs and Wood use many practical examples, which they link with higher concepts of evaluation practice. Burroughs and Wood describe and illustrate four types of objectives used in evaluation:
(1) process;
(2) educational;
(3) behavioral and environmental; and
(4) program (Burroughs and Wood, 2000, pp. 18-19, Appendix D).
Project or program evaluation oftentimes returns to the first step of the EBL process by examining the clarity of the original question, and whether that question ever was answered in the project or program. An unclear or misdirected question at the outset will become starkly exposed at this end stage of the process. "If you do not know where you are going, you will certainly get there," the old proverb reminds us.
Professional level. As a profession, are we:
- Articulating the most relevant questions as a profession?
- Employing the most appropriate methodologies to answer these questions?
- Making the necessary distinctions between different forms of evidence?
Traditionally, our profession has addressed its broad evaluation issues through the editorial function. Journal editors in our profession play key roles in performing this function, as do editorial peer reviewers and members of the profession who write

commentaries or letters to the editor. The editorial function reflects on the profession's priorities and deficiencies. T. Scott Plutchak's recent (2005) editorial on the evidence base of our profession illustrates this essential editorial role. Sometimes these evaluation issues are handled by groups such as the MLA Research Policy Statement Task Force (Marshall, 2005), which has been charged with updating the 1995 MLA research policy (Medical Library Association, 1995). At other times, targeted research points to areas of investigation in the literature (Ioannidis, 2005).
New questions arise as the result of answering, or even just attempting to answer, existing EBL questions. The profession has an obligation to ensure that results are communicated, even if these results are not dramatic or new, to build a more solid foundation for our knowledge base. Otherwise, we distort our understanding of reality by focusing only on novel or dramatic research results. If 20 libraries embark on a new program, and 19 have negative or neutral results, whereas one library alone has a dramatic outcome deemed worthy of communicating widely, how does that distort our understanding of what approaches might work, or might not work?
Conclusion
The EBL process provides a framework for making important decisions based upon the best available evidence. Each of the five steps in the process requires librarians to integrate their professional experience in judging the relevance and appropriateness of the best evidence. While wide consensus surrounds the steps of the process, the interpretation and application of step 4 might be subject to widely varying local circumstances. Step 5 also might vary due to different values within respective professional associations.
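The distortion in the 20-libraries scenario described above can be made concrete with a small, purely hypothetical simulation. This is an illustrative sketch only: the distribution, the zero "true" effect, and all variable names are assumptions for demonstration, not data from any study. It contrasts the mean outcome across all 20 pilot programs with the single dramatic outcome that alone reaches the literature:

```python
# Illustrative sketch of selective reporting: 20 libraries pilot the same
# program, whose true average effect is assumed to be zero. If only the
# most dramatic result is written up, the published picture diverges
# sharply from the full body of evidence.
import random

random.seed(42)  # fixed seed so the sketch is reproducible

# Simulated outcomes (e.g. change in some service measure) for 20 pilots,
# drawn from a distribution centred on zero (no real effect).
outcomes = [random.gauss(mu=0.0, sigma=1.0) for _ in range(20)]

mean_all = sum(outcomes) / len(outcomes)  # what all 20 reports would show
mean_published = max(outcomes)            # what the one dramatic report shows

print(f"Mean across all 20 pilots:    {mean_all:+.2f}")
print(f"The single published outcome: {mean_published:+.2f}")
```

Because only the extreme result is communicated, the "published" figure necessarily overstates the average effect, which is precisely why communicating the 19 negative or neutral results matters.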
References
Ad Hoc Working Group for Critical Appraisal of the Medical Literature (1987), "A proposal for more informative abstracts of clinical articles", Annals of Internal Medicine, Vol. 106 No. 4, pp. 598-604.
Australian Library and Information Association, Evidence Based Librarianship Group (2005), Third International Evidence-Based Librarianship Conference, Brisbane, Australia, 16-19 October, available at: http://conferences.alia.org.au/ebl2005 (accessed 19 November 2004).
Bayley, L. and Eldredge, J. (2003), "The structured abstract: an essential tool for researchers", Hypothesis, Vol. 17 No. 1, pp. 1, 11-13, available at: http://research.mlanet.org
Bayley, L., Wallace, A. and Brice, A. (2002), "Evidence Based Librarianship Implementation Committee: research results, Dissemination Task Force recommendations", Hypothesis, Vol. 16 No. 1, pp. 6-8, available at: http://research.mlanet.org (accessed 16 May 2005).
Beverley, C. (2004), "Searching the library and information science literature", in Booth, A. and Brice, A. (Eds), Evidence-Based Practice for Information Professionals, Facet Publishing, London, pp. 89-103.
Booth, A. (2004), "Evaluating your performance", in Booth, A. and Brice, A. (Eds), Evidence-Based Practice for Information Professionals, Facet Publishing, London, pp. 127-37.
Booth, A. (2006), "Clear and present questions: formulating questions for evidence based practice", Library Hi Tech, Vol. 24 No. 3, pp. 355-68.
Booth, A. and Brice, A. (2004), "Appraising the evidence", in Booth, A. and Brice, A. (Eds), Evidence-Based Practice for Information Professionals, Facet Publishing, London, pp. 104-18.
Booth, A. and Farmer, J. (1999), "Between idea and reality", Library Association Record, Vol. 101 No. 2, p. 104.
Brice, A., Booth, A. and Bexon, N. (2005), "Evidence based librarianship: a case study in the social sciences", World Library and Information Congress: 71st International Federation of Library Associations General Conference and Council, 14-18 August, Oslo, Code 111-E, available at: www.ifla.org/IV/ifla71/programme.htm (accessed 10 September 2005).
Burroughs, C.M. and Wood, F.B. (2000), Measuring the Difference: Guide to Planning and Evaluating Health Information Outreach, National Network of Libraries of Medicine, Pacific Northwest Region, Seattle, WA and National Library of Medicine, Bethesda, MD, available at: http://nnlm.gov/evaluation (accessed 27 September 2005).
Clark, P.B. and Wilson, J.Q. (1961), "Incentive systems: a theory of organizations", Administrative Science Quarterly, Vol. 6 No. 2, pp. 129-66.
Crumley, E. and Koufogiannakis, D. (2001), "Developing evidence based librarianship in Canada: six aspects for consideration", Hypothesis, Vol. 15 No. 2, pp. 9-10, available at: http://research.mlanet.org (accessed 11 January 2005).
Crumley, E. and Koufogiannakis, D. (2002), "Developing evidence-based librarianship: practical steps for implementation", Health Information and Libraries Journal, Vol. 19 No. 2, pp. 61-70.
Davidoff, F. (1995), "The future of Annals", Annals of Internal Medicine, Vol. 122 No. 5, pp. 375-6.
De Verteuil, S. (2005), "Note from the publisher", Journal of Documentation, Vol. 61 No. 2, pp. 321-2.
Eldredge, J.D. (2000a), "Evidence-based librarianship: formulating EBL questions", Bibliotheca Medica Canadiana, Vol. 22 No. 2, pp. 74-7.
Eldredge, J.D. (2000b), "Evidence-based librarianship: searching for the needed EBL evidence", Medical Reference Services Quarterly, Vol. 19 No. 3, pp. 1-18.
Eldredge, J.D. (2000c), "Evidence-based librarianship: an overview", Bulletin of the Medical Library Association, Vol. 88 No. 4, pp. 289-302.
Eldredge, J.D. (2002), "Evidence-based librarianship: levels of evidence", Hypothesis, Vol. 16 No. 3, pp. 10-13, available at: http://research.mlanet.org (accessed 20 October 2004).
Eldredge, J.D. (2004a), "Inventory of research methods for librarianship and informatics", Journal of the Medical Library Association, Vol. 92 No. 1, pp. 83-90.
Eldredge, J.D. (2004b), "How good is the evidence base?", in Booth, A. and Brice, A. (Eds), Evidence-Based Practice for Information Professionals, Facet Publishing, London, pp. 36-48.
Ely, J.W., Osheroff, J.A., Chambliss, M.L., Ebell, M.H. and Rosenbaum, M.E. (2005), "Answering physicians' clinical questions: obstacles and potential solutions", Journal of the American Medical Informatics Association, Vol. 12 No. 2, pp. 217-24.
Farmer, J., Booth, A., Madge, B. and Forsythe, E. (1998), "What is the Health Libraries Group doing about research?", Health Libraries Review, Vol. 15, pp. 139-41.
Ferris, T. (1998), "Not rocket science", New Yorker, Vol. 74 No. 20, 20 July, pp. 4-5.
Forsetlund, L. and Bjørndal, A. (2001), "The potential for research-based information and public health: identifying unrecognized information needs", BMC Public Health, Vol. 1 No. 1, available at: www.biomedcentral.com/1471-2458/1/1 (accessed 19 August 2004).
Gallagher, J., Bauer, K. and Dollar, D.M. (2005), "Evidence-based librarianship: utilizing data from all available sources to make judicious print cancellation decisions", Library Collections, Acquisitions, & Technical Services, Vol. 29, pp. 169-79.
Genoni, P., Haddow, G. and Ritchie, A. (2004), "Why don't librarians use research?", in Booth, A. and Brice, A. (Eds), Evidence-Based Practice for Information Professionals, Facet Publishing, London, pp. 49-60.
Gorman, P.N. and Helfand, M. (1995), "Information seeking in primary care: how physicians choose which clinical questions to pursue and which to leave unanswered", Medical Decision Making, Vol. 15 No. 2, pp. 113-9.
Hartley, J. (1997), "Is it appropriate to use structured abstracts in social science journals?", Learned Publishing, Vol. 10 No. 4, pp. 313-7.
Hartley, J., Sydes, M. and Blurton, A. (1996), "Obtaining information accurately and quickly: are structured abstracts more efficient?", Journal of Information Science, Vol. 22 No. 5, pp. 349-56.
Haynes, R.B., Mulrow, C.D., Huth, E.J., Altman, D.G. and Gardner, M.J. (1990), "More informative abstracts revisited", Annals of Internal Medicine, Vol. 113 No. 1, pp. 69-76.
Ioannidis, J.P.A. (2005), "Contradicted and initially stronger effects in highly cited clinical research", Journal of the American Medical Association, Vol. 294 No. 2, pp. 218-28.
Koufogiannakis, D. and Crumley, E. (2004), "Applying evidence to your everyday practice", in Booth, A. and Brice, A. (Eds), Evidence-Based Practice for Information Professionals, Facet Publishing, London, pp. 119-26.
Light, R.J. and Pillemer, D.B. (1984), Summing Up: The Science of Reviewing Research, Harvard University Press, Cambridge, MA.
McKnight, M. and Peet, M. (2000), "Health care providers' information seeking: recent research", Medical Reference Services Quarterly, Vol. 19 No. 2, pp. 27-50.
Marshall, J.G. (2005), Personal communication, 14 March.
Medical Library Association (1995), Using Scientific Evidence to Improve Information Practice, Medical Library Association, Chicago, IL, available at: www.mlanet.org/research/science4.html (accessed 6 October 2005).
Medical Library Association (2005), Research Section homepage, available at: http://research.mlanet.org (accessed 2 October 2005).
Medical Library Association, Research Section, Evidence-Based Librarianship Implementation Committee (2001), "The most relevant and answerable research questions facing the practice of health sciences librarianship", Hypothesis, Vol. 15 No. 1, pp. 9-15, 17, available at: http://research.mlanet.org (accessed 8 June 2005).
Mulrow, C. and Cook, D. (1998), Systematic Reviews: Synthesis of Best Evidence for Health Care Decisions, American College of Physicians, Philadelphia, PA.
Mulrow, C.D. (1987), "The medical review article: state of the science", Annals of Internal Medicine, Vol. 106 No. 3, pp. 485-8.
Oxman, A.D. and Guyatt, G.H. (1988), "Guidelines for reading literature reviews", Canadian Medical Association Journal, Vol. 138 No. 8, pp. 697-703.
Plutchak, T.S. (2005), "Building a body of evidence", Journal of the Medical Library Association, Vol. 93 No. 2, pp. 193-5.
Richardson, W.S. and Detsky, A.S. (1995), "Users' guides to the medical literature: VII. How to use a clinical decision analysis: B. What are the results and will they help me in caring for my patients?", Journal of the American Medical Association, Vol. 273 No. 20, pp. 1610-3.
Shedlock, J., Ketchell, D. and Greenberg, C.J. (2005), "Call for MLA '06 contributed papers and posters", Medical Library Association News, August, No. 378, pp. 1, 24-5, available at: www.mlanet.org/publications/mlanews/index.html (accessed 1 October 2005).
Shoppers Window (1998), "Words of wisdom", Shoppers Window, April, p. 35.
University of Alberta, Alberta Society of Evidence Based Librarians (2005), Second International Evidence-Based Librarianship Conference: Improving Practice Through Research, Edmonton, Alberta, 4-6 June, available at: www.eblib.net (accessed 2 October 2005).
University of Sheffield, School of Health and Related Research (2005), First International Evidence Based Librarianship Conference, Sheffield, 3-4 September, available at: www.shef.ac.uk/scharr/eblib.html (accessed 2 October 2005).
Wagner, K.C. and Byrd, G.D. (2004), "Evaluating the effectiveness of clinical medical librarian programs: a systematic review of the literature", Journal of the Medical Library Association, Vol. 92 No. 1, pp. 14-33, available at: www.pubmedcentral.nih.gov (accessed 29 September 2005).
Weiss, C.H. (1998), Evaluation, 2nd ed., Prentice-Hall, Upper Saddle River, NJ.
Winning, A. (2004), "Identifying sources of evidence", in Booth, A. and Brice, A. (Eds), Evidence-Based Practice for Information Professionals, Facet Publishing, London, pp. 71-88.
Winning, M.A. and Beverley, C.A. (2003), "Clinical librarianship: a systematic review of the literature", Health Information and Libraries Journal, Vol. 20, Supplement 1, pp. 10-21.
Yerkey, A.N. and Glogowski, M. (1990), "Scatter of library and information science topics among bibliographic databases", Journal of the American Society for Information Science, Vol. 41, pp. 245-54.

Further reading
Booth, A. (1998), "Testing the lore of research", Library Association Record, Vol. 100 No. 12, p. 654.
Booth, A. and Haines, M. (1998), "Room for a review?", Library Association Record, Vol. 100 No. 8, pp. 411-2.
Wallace, A., Bayley, A. and Brice, A. (2001), "Evidence-Based Librarianship Implementation Committee report", Hypothesis, Vol. 15 No. 2, pp. 6-7, available at: http://research.mlanet.org (accessed 8 June 2005).

Corresponding author
Denise Koufogiannakis can be contacted at: denisekoufogiannakis@ualberta.ca