
Clinical learning

The development and validation of a knowledge, attitude
and behaviour questionnaire to assess undergraduate
evidence-based practice teaching and learning

Janice M Johnston, Gabriel M Leung, Richard Fielding, Keith Y K Tin & Lai-Ming Ho

Objectives Most evidence-based practice (EBP) educational assessment tools evaluated to date have focused on specific knowledge components or technical skills. Other important potential barriers to the adoption of EBP, such as attitudinal, perceptual and behavioural factors, have yet to be studied, especially in the undergraduate setting. Therefore, we developed and validated a knowledge, attitude and behaviour questionnaire designed to evaluate EBP teaching and learning in an undergraduate medical curriculum.

Methods We derived the questionnaire from a comprehensive literature review, informed by international and local experts and a Year 5 student focus group. We determined its factor structure and refined and validated the questionnaire according to the responses of a cohort of Year 5 and a combined group of Years 2 and 3 students using principal components factor analysis with varimax rotation. Factor reliability was computed using Cronbach's alpha coefficient. We assessed construct validity by correlating the factors with other measures of EBP activity and examined responsiveness through paired t-test of the pre/post factor mean scores.

Results A 43-item questionnaire was developed. Four factors were identified from both student groups. The overall questionnaire as well as each factor had high construct validity (Cronbach's alpha > 0.7 for each scale). No significant correlations were found between the 4 factors, confirming their orthogonality. Positive correlations, however, resulted between factor mean scores and other EBP activities. The responsiveness of the questionnaire was satisfactory.

Conclusion A reliable knowledge, attitude and behaviour measure of EBP teaching and learning appropriate for undergraduate medical education has been developed and validated.

Keywords education, medical, undergraduate/*methods; evidence based medicine/*education; curriculum; attitude; questionnaires; reproducibility of results; Hong Kong; literature review.

Medical Education 2003;37:992–1000

Department of Community Medicine, University of Hong Kong, Hong Kong

Correspondence: Gabriel M Leung MD, MFPHM, Department of Community Medicine, University of Hong Kong, Faculty of Medicine Building, 21 Sassoon Road, Pokfulam, Hong Kong. Tel.: 00 852 2819 9209; Fax: 00 852 2855 9528; E-mail: gmleung@hku.hk

Introduction

The evidence-based practice (EBP) approach to medical care has been recognised as a key competency for doctors.1,2 To provide optimal care, clinicians must be able to locate, interpret and apply current best evidence to a given clinical situation. However, they often lack confidence in this process and in their own decision-making skills.3,4

Previous research has shown that certain educational interventions are effective in improving cognitive and technical EBP skills.5,6 The main emphasis in this literature tends to be on specific aspects of EBP skill acquisition, such as asking clinical questions,6–8 searching for evidence,6–8 retrieving appropriate evidence8 and critical appraisal techniques.6 These skills are normally assessed through knowledge-based exercises such as pre/post performance tests,3,5 EBP reports9 and questionnaires examining computer literacy3 and self-assessed competencies.6,10

However, there is good evidence to indicate that undergraduates may be inhibited in adopting the EBP approach not only because they lack the aforementioned cognitive and technical skills, but also because of other equally important factors such as the lack of recognition, support and use of EBP methods by clinical teachers,10,11 time pressures,12 lack of peer group support,11 and an inability to adopt and use in a timely and efficient

© Blackwell Publishing Ltd. Medical Education 2003;37:992–1000


KAB questionnaire for evidence-based practice • J M Johnston et al. 993

manner evidence-based search and appraisal techniques.12 These latter attributes of attitudes, perceptions and behaviour cannot be assessed through existing modes of skill-based evaluation formats.

Key learning points

There is a lack of appropriate assessment tools for evidence-based practice activities and processes, especially in the undergraduate setting.

A reliable evaluation instrument, in the form of a knowledge, attitude and behaviour questionnaire, was developed and validated to measure the effectiveness of evidence-based practice teaching and learning activities.

Face, content, criterion and construct validity was confirmed, as was responsiveness of the questionnaire.

In addition, while much previous EBP research has focused on postgraduate and continuing medical education,5,10,13,14 very few reports have looked at the undergraduate learning environment. A growing body of literature exploring EBP teaching and learning confirms the research deficit in undergraduate EBP education and calls for further work in this area.3,6,9,15

We therefore developed and validated a knowledge, attitude and behaviour (KAB) questionnaire at the University of Hong Kong (HKU) designed to evaluate EBP teaching and learning in the undergraduate medical education setting.

Subjects and Methods

Subjects

The HKU undergraduate medical school curriculum follows a 5-year, system-oriented, problem-based learning/lecture hybrid curriculum. The first 2 years focus on preclinical activity and the final 3 on patient care and clinical clerkships. Clinical exposure and experience with EBP increases as students progress through the curriculum. All Year 5 students (n = 159) were recruited for the questionnaire development and pilot testing. All Years 2 and 3 students (n = 338) were recruited for the questionnaire validation.

Identifying issues important to EBP teaching and learning

The general content and specific items of the questionnaire were initially derived from a review of the published literature evaluating EBP teaching, educational interventions in medicine for undergraduates or postgraduate and practising clinicians, and EBP teaching evaluation questionnaires. A comprehensive search of Medline from 1966 to 2001 using the MeSH terms evidence-based medicine, education, medical-undergraduate, clinical clerkship and educational measurement yielded 66 papers. Thirteen of these papers report specific educational interventions and the evaluation of EBP skills such as asking clinical questions,7,8 acquiring evidence,8,16 appraising evidence16 and applying evidence in the clinical context.17 We found only 1 paper that reported the development and validation of a questionnaire to evaluate an educational intervention for practising doctors in a continuing medical education context.18

Questionnaire development

A 4-step approach was used to select items. Firstly, a thematic review of EBP educational assessment and specific questionnaire items10,18–20 described above identified issues which were assembled into 33 multinomial categorical response questions and 11 questions requiring estimates, dichotomised or other responses. These questions address:

1 knowledge (8 multinomial items, 1 item requiring an estimate);
2 attitude (11 multinomial items);
3 practice (7 multinomial items, 8 discrete response items, 1 open-ended item);
4 actual use of EBP (6 multinomial items, 1 item requiring an open-ended estimate), and
5 anticipated future use of EBP (1 multinomial item).

(See Table 1.)

Table 1 Content of questionnaire items grouped under 5 categories

1 Knowledge of EBP13,19,20
Content: asking clinical questions, acquiring evidence, appraising evidence, applying evidence to a clinical situation, and assessing treatment effectiveness in terms of patient outcomes
Question format: statements rated on a Likert scale (1 = strongly disagree, 6 = strongly agree)

2 Attitudes towards EBP6,13,20
Content: perceived need for information, willingness to practise EBP, perceived role of EBP in clinical practice, attitude about EBP's threat to clinical practice
Question format: statements rated on a Likert scale (1 = strongly disagree, 6 = strongly agree)

3 Practice of EBP6,13,19–21
Content: access and acquisition of evidence, application to patient care, influence of positive role models on EBP adoption, barriers to adopting EBP, contribution of EBP to clinical reasoning and learning
Question format: statements rated on a Likert scale (1 = never, 5 = every day; 1 = not at all, 6 = completely)

4 Actual use of EBP
Content: current proportion of clinical activity based on EBP principles, frequency of actual use of EBP, perceived need for EBP each day or week and for each patient encounter, overall use of EBP in the past year
Question format: mostly open-ended responses with some multiple choices

5 Future use of EBP19
Content: perceived future importance of EBP to medical practice, willingness to practise EBP in the future, usefulness of EBP in the future, potential barriers to the adoption of EBP currently and in the future
Question format: statements rated on a Likert scale (1 = very unwilling, 6 = very willing; 1 = completely useless, 6 = very useful; 1 = not at all, 6 = completely)

Secondly, a focus group of Year 5 students (n = 10) was convened to inform the questionnaire development process. Using semistructured questions, the focus group explored students' experiences of EBP teaching, their perceptions of the usefulness of EBP to patient care and learning, and the anticipated impact of EBP on their daily clinical practice (Table 2).

Thirdly, a panel of international experts comprising specialists in EBP, medical education, clinical medicine, psychometrics, and evaluation and measurement was asked to assess the preliminary questions and provide structured comments with respect to face and content validity, comprehensibility and comprehensiveness. The amended questionnaire was further refined by a local panel, the members of which had expertise in psychometrics, evaluation and measurement and were familiar with questionnaire development in the local

Cantonese-speaking context. While individual items were not formally rated, all feedback was incorporated through an iterative process from primary to secondary feedback. An additional 3 multinomial categorical items were generated as a result of this process.

Table 2 Year 5 focus group discussion questions

1 How do you define EBP/EBM?
2 Can you tell me about the challenges or problems that you encountered in your previous EBP learning?
3 What did you find most useful about EBP learning? Why? Tell me more
4 What did you find least useful about EBP learning? Why? Tell me more
5 What aspects of EBP learning do you think will be most useful in facilitating your care for your patients? (Break down into different areas, e.g. understanding the disease/illness, patient-related outcomes, clinical decision making) Why? Tell me more. What else? (Probe the students to consider different areas in their clinical practice)
6 Which aspect(s) in this EBP learning experience (focus on the tool they were using) did you find most useful in clinical practice? Why? Tell me more. What else? (Probe the students to consider different areas in their clinical practice)
7 Which aspect(s) in this EBP learning did you find least useful in clinical practice? Why? Tell me more. What else?
8 Do you think you will use EBM in your daily clinical practice as a result of this EBP learning? Why? Or: Why not?
9 Has your intention to use EBM in your daily clinical practice changed (increased or decreased)? Which aspects in this learning opportunity enhanced or discouraged your intention to integrate EBM in your clinical practice? Why? Tell me more
10 Do you have other opinions about this EBP learning experience?
11 What about the perceptions of others, patients, clinical staff, professors? What about the students' reactions to these perceptions?

Fourthly, the Year 5 (n = 10) undergraduate medical students were then asked to comment on the comprehensibility and relevance of the items in the questionnaire. Students were not specifically asked to advise on comprehensiveness of the items as their limited experience was considered likely to restrict their appraisal of the full range of relevant issues. The resultant preliminary draft questionnaire contained 36 multinomial categorical response questions and 11 questions requiring estimates, dichotomised or other responses. This questionnaire was then returned to the




expert panel for final reconfirmation of its face and content validity.

Questionnaire refinement

The preliminary 47-item questionnaire was administered to all Year 5 undergraduate students (n = 159) at the Faculty of Medicine, University of Hong Kong. The questionnaire contained 7 items using a 5-point scale and 29 items using a 6-point scale, adopting both Likert and adjectival scales (strongly agree–strongly disagree, not at all–completely, never–all the time, very difficult–very easy, completely unprepared–completely prepared, completely useless–very useful, very unwilling–very willing). Negative statements were recoded for the analysis. As none of the items were severely skewed, the data can be analysed as interval data without introducing bias.21 The 36 categorical response items were included in a principal components factor analysis with varimax rotation. Pairwise deletion of missing values, eigenvalues ≥ 1 and factor loading scores ≥ 0.4 were used to sort items into factors.18,22 The Scree plot was used to confirm the optimum number of factors to include in the final set. Five items did not reach the preset factor loading threshold and were excluded from the questionnaire. The revised questionnaire contained 42 items, 31 of which were multinomial. The 4 factors derived from the factor analysis included EBP knowledge (9 items), attitudes towards EBP (6 items), personal application and use of EBP (8 items) and future use of EBP (8 items). A summary score (the mean score of all the items in each factor) and Cronbach's alpha coefficient were calculated for each factor.

Evaluation of the final questionnaire

The final version of the questionnaire containing 43 items (with 31 categorical response items) was distributed, completed and collected during a whole class session for the Years 2 and 3 classes (n = 293) at the beginning of the academic year. The Year 2 students completed the same questionnaire again at another whole class session at the conclusion of the school year 8 months later.

The data were analysed according to the methods described above. Following the determination of the factor structure, Cronbach's alpha was used to assess the internal consistency of each factor.13,18,22,23 In order to establish the independence of the factors and to further examine factor construct validity, between-factor correlations were calculated.10,18 Comparisons between the factors and estimates of actual or perceived EBP activity were used to assess criterion validity.10,18 The students estimated:

1 the proportion of current medical practice that is based on evidence;
2 their perceived need for evidence, and
3 their perceived change in self-behaviour in looking up evidence compared to that of the previous year.

The respondents also reported how frequently they engaged in EBP and whether they considered themselves to be EBP practitioners. As this approach required multiple comparisons, only those correlations with P ≤ 0.01 were considered significant.18

Responsiveness, or the extent to which the instrument can detect change,10 was assessed on the basis of a pre/post comparison of the Year 2 students, who had undergone a full year of EBP teaching through a series of 6 learning modules during the academic year. Paired Student's t-test was used to compare differences between the pre/post mean factor scores.10,18 Effect size was calculated to assess sensitivity to change.21

Ethics approval

This study received approval from the Undergraduate Education Committee and the Faculty Ethics Committee, Faculty of Medicine, University of Hong Kong.

Results

A total of 158 of 159 Year 5 students (response rate 99%) completed the preliminary, 47-item questionnaire. Individual item response rates ranged from 97% to 100%. Students ranged in age from 21 to 29 years but 94% were either 23 or 24 years old. The male:female ratio was 2:1.

After the refinement procedure, the confirmatory factor analysis of the 31 categorical items in the revised questionnaire yielded 4 factors or subscales. The Scree plot supported a 4-factor solution explaining 50.7% of the variance. The 4 components derived were: factor 1 – EBP knowledge (9 items; Cronbach's alpha = 0.88); factor 2 – attitudes towards EBP (6 items; Cronbach's alpha = 0.79); factor 3 – personal application and use of EBP (8 items; Cronbach's alpha = 0.75), and factor 4 – future use of EBP (8 items; Cronbach's alpha = 0.76) (Table 3). Satisfactory Cronbach's alphas were noted for each factor. (Means, standard deviations and factor loadings for each item are detailed in the Appendix.)

The final version of the questionnaire was completed by Years 2 and 3 students (293/338; response rate 86%).




Table 3 Factor analysis for Year 5 and combined Years 2 and 3 students

Factor analysis for Year 5 students

Factor                                 Number of items   Mean scale score   SD     Variance explained by each factor   Cronbach's alpha
EBP knowledge                          9                 4.42               0.64   19.1                                0.88
Attitudes towards EBP                  6                 3.73               0.56   11.8                                0.79
Personal application and use of EBP    8                 2.62               0.54   10.4                                0.75
Future use of EBP                      8                 4.12               0.53   9.4                                 0.76
                                                                                  Total variance explained = 50.7

Factor analysis for Years 2 and 3 combined

Factor                                 Number of items   Mean scale score   SD     Variance explained by each factor   Cronbach's alpha
Future use of EBP                      9                 3.68               0.61   16.3                                0.88
Attitudes toward EBP                   6                 3.86               0.45   11.8                                0.74
EBP knowledge                          5                 4.63               0.58   11.8                                0.80
Personal application and use of EBP    6                 2.33               0.56   10.4                                0.71
                                                                                  Total variance explained = 50.4

SD = standard deviation.
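The factor-extraction and reliability procedure described in the Methods (principal components on the item correlation matrix, components retained at eigenvalue ≥ 1, varimax rotation, a 0.4 loading cut-off, and Cronbach's alpha per factor) can be sketched in a few lines. This is an illustrative reconstruction with simulated responses, not the authors' original analysis code; all function names and the simulated data are ours.

```python
import numpy as np

def varimax(loadings, max_iter=100, tol=1e-6):
    """Varimax rotation of a p x k loading matrix (Kaiser's method)."""
    p, k = loadings.shape
    rotation = np.eye(k)
    criterion = 0.0
    for _ in range(max_iter):
        rotated = loadings @ rotation
        # Gradient step for the varimax simplicity criterion
        b = loadings.T @ (rotated ** 3
                          - rotated @ np.diag((rotated ** 2).sum(axis=0)) / p)
        u, s, vt = np.linalg.svd(b)
        rotation = u @ vt
        if s.sum() < criterion * (1 + tol):
            break
        criterion = s.sum()
    return loadings @ rotation

def cronbach_alpha(items):
    """Internal consistency for items of shape (n_respondents, n_items)."""
    k = items.shape[1]
    item_var = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1.0 - item_var / total_var)

def extract_factors(responses, loading_cutoff=0.4):
    """PCA on the item correlation matrix; retain components with
    eigenvalue >= 1, varimax-rotate, then assign each item to the
    factor with its largest absolute loading if it meets the cutoff."""
    corr = np.corrcoef(responses, rowvar=False)
    eigvals, eigvecs = np.linalg.eigh(corr)
    order = np.argsort(eigvals)[::-1]
    kept = order[eigvals[order] >= 1.0]
    loadings = eigvecs[:, kept] * np.sqrt(eigvals[kept])
    rotated = varimax(loadings)
    factors = {}
    for item in range(rotated.shape[0]):
        f = int(np.argmax(np.abs(rotated[item])))
        if abs(rotated[item, f]) >= loading_cutoff:
            factors.setdefault(f, []).append(item)
    return rotated, factors

# Simulated Likert-style responses: 8 items driven by 2 latent traits
rng = np.random.default_rng(0)
latent = rng.normal(size=(300, 2))
responses = np.hstack([
    latent[:, [0]] + 0.5 * rng.normal(size=(300, 4)),
    latent[:, [1]] + 0.5 * rng.normal(size=(300, 4)),
])
rotated, factors = extract_factors(responses)
alphas = {f: cronbach_alpha(responses[:, items]) for f, items in factors.items()}
```

With this two-trait simulation, the Kaiser rule retains exactly two components and each block of four items loads cleanly on its own rotated factor, each with a high alpha.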

These students were predominately 20 (76%) or 21 (14%) years of age (range 19–24 years). The male:female ratio was 1:1. Individual item response rates ranged from 89% to 100%. Principal components analysis indicated 4 components, as did the Scree plot, which explained 44.8% of the variance. Although the 4 components derived were similar in content to the factors in the Year 5 group, the factor order changed and 3 items were found to have moved between factors. These were removed for the final Years 2 and 3 analysis. Factor 1, representing EBP knowledge, loaded lower for the Years 2 and 3 students, whereas their expected future use of EBP loaded higher (Table 3). Satisfactory Cronbach's alphas were noted for each factor.

Correlation between factor scores

The factors derived from the combined Years 2 and 3 group were found to be coherent and represented separate scales. As part of construct validation, interfactor correlation coefficients comparing the mean score for each of the 4 factors for each student were computed. No significant correlations were found between the 4 factors, confirming their orthogonality (Table 4).

Correlation between factors and other evidence-based activity

To further assess the criterion validity, a correlation analysis was performed between the factor scores and other evidence-based activity reported by the students. Future use of EBP was positively correlated with the students' perceptions of themselves as EBP practitioners, frequency of practising EBP and need for evidence per day. Personal application and use of EBP was also positively correlated with the students' perception of how frequently they engaged in EBP and how frequently they accessed evidence compared to during the previous year. Evidence-based practice knowledge was positively correlated with the need for evidence per week (Table 5).

Responsiveness

The Year 2 students completed the survey once at the beginning of the academic year and then again 8 months later, following 6 modules of EBP teaching. The pre/post assessment Year 2 mean factor scores calculated on the basis of the items identified in the combined Years 2 and 3 component structure were compared.




Table 4 Correlation matrix between the combined Years 2 and 3 factor scores

                                       Future use   Attitudes    EBP         Personal application
                                       of EBP       toward EBP   knowledge   and use of EBP
Future use of EBP                      1.000
Attitudes toward EBP                   0.059        1.000
EBP knowledge                          0.016        −0.007       1.000
Personal application and use of EBP    0.023        0.010        −0.038      1.000

F score = 1.118; P-values all greater than or equal to 0.01.
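The orthogonality check in Table 4 amounts to the correlation matrix of each student's four factor mean scores. A minimal sketch using simulated, independent scores drawn with the Table 3 means and standard deviations; because independence holds by construction here, the off-diagonal correlations come out near zero, as in the paper:

```python
import numpy as np

rng = np.random.default_rng(2)
n_students = 293  # the combined Years 2 and 3 respondents

# Simulated factor mean scores per student, drawn independently with the
# Table 3 (Years 2 and 3) means and SDs; the key names are placeholders
factor_scores = {
    "future_use": rng.normal(3.68, 0.61, n_students),
    "attitudes": rng.normal(3.86, 0.45, n_students),
    "knowledge": rng.normal(4.63, 0.58, n_students),
    "application": rng.normal(2.33, 0.56, n_students),
}

# 4 x 4 interfactor correlation matrix (students in rows, factors in columns)
matrix = np.corrcoef(np.column_stack(list(factor_scores.values())), rowvar=False)
off_diag = matrix[~np.eye(4, dtype=bool)]
```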

Table 5 Correlation matrix between the 4 factors and other evidence-based activity for the combined Years 2 and 3 students

Item                                      n     Future use   Attitudes     EBP         Personal application
                                                of EBP       towards EBP   knowledge   and use of EBP
Amount of current clinical activity       163   0.150        −0.031        0.015       0.061
based on evidence
Need for evidence per day                 135   0.196*       0.095         0.122       0.142
Need for evidence per week                157   0.123        0.080         0.160*      0.153
Current frequency of accessing            153   0.078        0.108         0.013       0.387*
evidence compared to previous year
Frequency of practising EBP               163   0.365*       0.024         0.011       0.513*
Self-rating as an EBP practitioner        149   0.353*       0.026         0.118       0.147

* Significant at P ≤ 0.01.
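The criterion-validity analysis behind Table 5 correlates each factor score with the students' self-reported EBP activity and, because of the multiple comparisons, accepts only correlations with P ≤ 0.01. A sketch with simulated data; the measure names below are placeholders, not the questionnaire's actual items:

```python
import numpy as np
from scipy.stats import pearsonr

def criterion_correlations(factor_scores, activity_measures, alpha=0.01):
    """Correlate one factor's mean scores with each self-reported EBP
    activity measure; the stricter alpha compensates for the multiple
    comparisons, as in the paper's criterion-validity check."""
    results = {}
    for name, values in activity_measures.items():
        r, p = pearsonr(factor_scores, values)
        results[name] = {"r": r, "p": p, "significant": bool(p <= alpha)}
    return results

# Simulated data: one activity genuinely related to the factor, one not
rng = np.random.default_rng(1)
factor_scores = rng.normal(4.0, 0.6, size=150)
activities = {
    "frequency_of_practising_EBP": factor_scores + 0.6 * rng.normal(size=150),
    "unrelated_measure": rng.normal(3.0, 1.0, size=150),
}
results = criterion_correlations(factor_scores, activities)
```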

Table 6 Year 2 pre/post-assessment comparison of mean differences in mean factor scores

Factor                                 n     Mean score        Mean score         Paired t-test   P-value   Effect size
                                             pre-assessment*   post-assessment*   (T score)
Future use of EBP                      108   3.93              3.98               −0.777          0.439     0.075
Attitudes towards EBP                  104   3.98              3.92               1.003           0.32      0.098
EBP knowledge                          108   4.60              4.83               −3.498          0.001     0.33
Personal application and use of EBP    107   2.46              2.48               −0.329          0.74      0.031

* Scored out of a maximum of 6.
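The responsiveness analysis in Table 6 applies a paired t-test to each student's pre/post factor mean scores and reports a standardised effect size. The effect-size definition below (mean change divided by the SD of the change scores, equivalently t/√n) is our assumption, although it is numerically consistent with the Table 6 values; the scores are hypothetical:

```python
import numpy as np
from scipy.stats import ttest_rel

def responsiveness(pre, post):
    """Paired t-test on pre/post factor mean scores plus a standardised
    effect size (mean change / SD of the change scores)."""
    pre = np.asarray(pre, dtype=float)
    post = np.asarray(post, dtype=float)
    t_stat, p_value = ttest_rel(post, pre)
    change = post - pre
    effect_size = change.mean() / change.std(ddof=1)
    return t_stat, p_value, effect_size

# Hypothetical factor mean scores for 8 students before and after teaching
pre = np.array([4.2, 4.5, 4.8, 4.4, 4.6, 4.7, 4.3, 4.9])
post = pre + np.array([0.3, 0.1, 0.4, 0.2, 0.3, 0.1, 0.2, 0.4])
t_stat, p_value, effect_size = responsiveness(pre, post)
```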

Evidence-based practice knowledge (factor 1) was significantly different at the post-assessment and demonstrated the largest effect size (Table 6).

Discussion

This paper has focused on the development and validation of a self-administered questionnaire to assess EBP knowledge, attitude, behaviour and perceptions in undergraduate medical students. To our knowledge it is the first to assess these non-skill-based attributes for EBP and also the first to assess EBP in the undergraduate setting.

Our findings indicate that the questionnaire has satisfactory reliability and validity. The development of the questionnaire was informed by issues identified as important in the literature by international and local experts and as relevant by the Year 5 focus group. The acceptability of the questionnaire to the students was demonstrated by the high response rates to individual items and by the year groups as a whole. While the




combined Years 2 and 3 student response rate was lower than that for Year 5, this most likely reflects the lack of direct patient contact for these students and therefore the lack of perceived direct relevance of some of the questions to their learning routine.

The derived factors for both the confirmatory Year 5 and Years 2 and 3 analyses appeared to be independent, coherent, consistent and multidimensional. They included knowledge of EBP, personal application and use of EBP, attitudes towards EBP and future use of EBP. Each factor had good internal consistency and construct validity. The proportion of explained variance for each factor was similar for Year 5 and combined Years 2 and 3. While the content of each factor was similar, some differences were noted and the 3 items loading on different scales between Year 5 and Years 2 and 3 were removed from the analysis to ensure coherence. Although the factor loading order varied between the Year 5 and combined Years 2 and 3 groups, the consistency of the factors suggests congruence between these 2 subpopulations. Such variation likely reflects the different educational and patient care experiences of these students. To account for these expected differences, we tested the questionnaire across multiple year groups to ensure robustness.

The combined Years 2 and 3 correlation matrix of the factors with other evidence-based activities supports this assessment. Three of the 4 scales (except for attitudes towards EBP) were associated with increased EBP activity as assessed by actual or anticipated behaviour. While their current need for evidence reflected their lack of patient contact, students did perceive the need to acquire and use evidence in the future. The mean difference between pre- and post-assessment and the regression of the pre- on the post-assessment for the combined Years 2 and 3 students suggested the questionnaire was capable of measuring change in EBP knowledge in undergraduate medical students.

Evidence-based practice and the evaluation of EBP teaching and learning is often reduced to one of its several components, such as asking good clinical questions, accessing and retrieving quality evidence, appraising the evidence and applying this evidence to a clinical situation.13 As changes in attitudes and perceptions as well as knowledge are important precursors to changes in behaviour, we set out to develop an instrument to assess change in students' EBP attitudes, perceptions, behaviour and knowledge. Our analysis demonstrates that this questionnaire supports the assessment of EBP teaching in the undergraduate medical curriculum and is useful in evaluating evidence-based learning and teaching.

Future research and development

We have presented preliminary results from a multistage validation of an EBP evaluation questionnaire for use in the undergraduate medical curriculum. This research has contributed to the development of tools to assess the effectiveness of undergraduate medical education. Further research should focus on assessing the effectiveness of the realisation of EBP in practice and on evaluating the generalisability of these findings in other settings.

Contributors

JMJ participated in questionnaire development, data analysis and the writing of the paper. GML conceived and designed the study and participated in questionnaire development, data analysis and the writing of the paper. RF participated in questionnaire development, data analysis and the writing of the paper. KYK undertook project co-ordination and data analysis. LMH participated in questionnaire development and data analysis.

Acknowledgements

We thank Dave Slawson, Mark Ebell, Howard Strasberg, Sunita Stewart and Wendy Lam for their invaluable input and advice during the drafting of the questionnaire; and Marie Chi for expert secretarial assistance in the preparation of the manuscript.

Funding

This study was funded by a Teaching Development Grant from the University of Hong Kong.

References

1 World Federation for Medical Education. In: Walton H, ed. Proceedings of the World Summit on Medical Education. Med Educ 1994;28 (1):1–3.
2 Muller S. Physicians for the twenty-first century: report of the project panel on the general professional education of the physician and college preparation for medicine. J Med Educ 1984;59:127–8, 155–67.
3 Barnett SH, Kaiser S, Morgan LK et al. An integrated programme for evidence-based medicine in medical school. Mt Sinai J Med 2000;67:163–8.
4 Poses RM. Money and mission? Addressing the barriers to evidence-based medicine. J Gen Intern Med 1999;14:262–4.
5 Smith CA, Ganschow PS, Reilly BM et al. Teaching residents evidence-based medicine skills: a controlled trial of effectiveness and assessment of durability. J Gen Intern Med 2000;15:710–5.




6 Ghali WA, Saitz R, Eskew AH, Gupta M, Quan H, Hershman WY. Successful teaching in evidence-based medicine. Med Educ 2000;34:18–22.
7 Ellis P, Green M, Kernan W. An evidence-based medicine curriculum for medical students: the art of asking focused clinical questions. Acad Med 2000;75:528.
8 Rosenberg WM, Deeks J, Lusher A, Snowball R, Dooley G, Sackett D. Improving searching skills and evidence retrieval. J R Coll Physicians Lond 1998;32:557–63.
9 Thomas PA, Cofrancesco J Jr. Introduction of evidence-based medicine into an ambulatory clinical clerkship. J Gen Intern Med 2001;16:244–9.
10 Green ML, Ellis PJ. Impact of an evidence-based medicine curriculum based on adult learning theory. J Gen Intern Med 1997;12:742–50.
11 Sackett DL, Parkes J. Teaching critical appraisal: no quick fixes. [Editorial.] Can Med Assoc J 1998;158:203–4.
12 Sackett DL, Straus SE. Finding and applying evidence during clinical rounds. The evidence cart. JAMA 1998;280:1336–8.
13 Bordley DR, Fagan M, Theige D. Evidence-based medicine: a powerful educational tool for clerkship education. Am J Med 1997;102:427–32.
14 Green ML. Evidence-based medicine training in graduate medical education: past, present and future. J Eval Clin Pract 2000;6:121–38.
15 Srinivasan M, Weiner M, Breitfeld PP, Brahmi F, Dickerson KL, Weiner G. Early introduction of an evidence-based medicine course to preclinical medical students. J Gen Intern Med 2002;17:58–65.
16 Wayland WC, Barry HC, Farquhar L, Holzman C, White A. Training medical students in evidence-based medicine: a community campus approach. Fam Med 1999;31:703–8.
17 Schoenfeld P, Cruess D, Peterson W. Effect of an evidence-based medicine seminar on participants' interpretations of clinical trials: a pilot study. Acad Med 2000;75:1212–4.
18 Taylor R, Reeves B, Mears R et al. Development and validation of a questionnaire to evaluate the effectiveness of evidence-based practice teaching. Med Educ 2001;35:544–7.
19 McAlister FA, Graham I, Karr GW, Laupacis A. Evidence-based medicine and the practising clinician. J Gen Intern Med 1999;14:236–42.
20 McColl A, Smith H, White P, Field J. General practitioners' perceptions of the route to evidence-based medicine: a questionnaire survey. BMJ 1998;316:361–5.
21 Streiner DL, Norman GR. Health Measurement Scales. A Practical Guide to their Development and Use. 2nd edn. Oxford: Oxford Medical Publications 1995;38,168.
22 Cork RD, Detmer WM, Friedman CP. Development and initial validation of an instrument to measure physicians' use of, knowledge about and attitudes toward computers. JAMA 1998;5:164–76.
23 Bland JM, Altman DG. Validating scales and indexes. BMJ 2002;321:606–7.

Received 21 August 2002; editorial comments to authors 12 December 2002; accepted for publication 26 March 2003




Appendix

Combined Years 2 and 3 factor scores for individual items included in each component

Statement                                                                 Mean (range 1–6)   SD      Factor loading score

Future use of evidence-based practice


Compared to 1 year ago, how useful do you believe evidence-based 4.30 0.913 0.623
medicine will be in your future practice as a doctor?
Compared to 1 year ago, how willing are you to practise evidence-based 4.15 0.871 0.589
medicine as a doctor in the future?
You personally appreciate the advantages of practising evidence-based medicine 3.99 0.680 0.456
Evidence-based medicine should be an integral part of the 3.95 0.825 0.556
undergraduate medical curriculum
Compared to 1 year ago, how much do you support the principles of 3.74 1.015 0.788
evidence-based medicine?
Compared to 1 year ago, how much do you support lifelong learning using 3.71 1.103 0.698
evidence-based medicine techniques?
How much do you consider the practice of evidence-based medicine a 3.23 1.009 0.739
routine part of your learning?
How much has the practice of evidence-based medicine changed the way you learn? 3.18 1.012 0.763
How easy or difficult has it been for you to practise evidence-based medicine 2.93 0.879 0.556
as a medical student in the last month?
Attitudes toward evidence-based practice
If evidence-based medicine is valid, then anyone can see patients 4.04 0.75 0.626
and do what doctors do
There is no reason for me personally to adopt evidence-based medicine because 4.02 0.57 0.622
it is just a fad (or fashion) that will pass with time
Evidence-based medicine is cook-book medicine that disregards clinical experience 3.92 0.622 0.732
Doctors, in general, should not practise evidence-based medicine because medicine 3.89 0.657 0.678
is about people and patients, not statistics
Evidence-based medicine ignores the art of medicine 3.79 0.729 0.703
Previous work experience is more important than research findings in 3.43 0.833 0.417
choosing the best treatment available for a patient
Evidence-based practice – knowledge
Evidence-based medicine requires the use of critical appraisal skills to 4.81 0.844 0.782
ensure the quality of all the research papers retrieved
Effective searching skills ⁄ easy access to bibliographic databases and 4.76 0.811 0.773
evidence sources are essential to practising evidence-based medicine
Critically appraised evidence should be appropriately applied to the 4.64 0.772 0.747
patient using clinical judgement and experience
The evidence-based medicine process requires the appropriate identification 4.57 0.754 0.684
and formulation of clinical questions
Practising evidence-based medicine increases the certainty that the 4.38 0.734 0.547
proposed treatment is effective
Personal application and use of evidence-based practice
How frequently do you access medical evidence from a textbook? 3.91 1.19 0.553
How frequently do you access medical evidence in general? 2.87 1.01 0.710
How frequently do you access medical evidence on the Internet 2.47 0.880 0.692
(excluding Medline and Cochrane Reviews)?
How frequently do you access medical evidence from original research papers? 1.96 0.784 0.713
How frequently do you access medical evidence from the Cochrane database? 1.40 0.664 0.565
How frequently do you access medical evidence from secondary sources such as 1.30 0.601 0.554
the ACP Journal Club, the Journal of Evidence-Based Medicine, POEMs
(Patient Oriented Evidence that Matters) or CATs (Critically Appraised Topics)?

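As the Methods note, negative statements are recoded before analysis and each factor is summarised as the mean of its item scores. A minimal scoring sketch along those lines; the item keys and responses below are hypothetical, not the questionnaire's actual item identifiers:

```python
def factor_score(responses, factor_items, reverse_coded=(), scale_max=6):
    """Mean score for one factor: reverse-code negatively worded items on
    the 1..scale_max Likert scale, then average the factor's items."""
    values = []
    for item in factor_items:
        value = responses[item]
        if item in reverse_coded:
            value = scale_max + 1 - value  # e.g. 6 -> 1 and 1 -> 6
        values.append(value)
    return sum(values) / len(values)

# Hypothetical attitude responses; 'cookbook' stands in for a negatively
# worded statement such as 'EBM is cook-book medicine'
responses = {"appreciate_advantages": 5, "integral_to_curriculum": 4, "cookbook": 2}
score = factor_score(
    responses,
    ["appreciate_advantages", "integral_to_curriculum", "cookbook"],
    reverse_coded={"cookbook"},
)
```

Here disagreeing with the negatively worded item (a raw 2) is recoded to 5, so the factor mean reflects a favourable attitude throughout.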
