The Objective Structured Clinical Examination (OSCE): AMEE Guide No. 81.
Part I: An historical and theoretical perspective
WEB PAPER
AMEE GUIDE
Abstract
The Objective Structured Clinical Examination (OSCE) was first described by Harden in 1975 as an alternative to the existing methods of assessing clinical performance (Harden et al. 1975). The OSCE was designed to improve the validity and reliability of assessment of performance, which was previously assessed using the long case and short case examinations. Since then the use of the OSCE has become widespread within both undergraduate and postgraduate clinical education. We recognise that the introduction of the OSCE into an existing assessment programme is a challenging process requiring a considerable amount of theoretical and practical knowledge. The two parts of this Guide are designed to assist all those who intend to implement the OSCE in their assessment systems. Part I addresses the theoretical aspects of the OSCE, exploring its historical development, its place within the range of assessment tools and its core applications. Part II offers more practical information on the process of implementing an OSCE, including guidance on developing OSCE stations, choosing scoring rubrics, training examiners and standardised patients, and managing quality assurance processes. Together we hope these two parts will act as a useful resource both for those choosing to implement the OSCE for the first time and also for those wishing to quality assure their existing OSCE programme.
. A well-designed OSCE can drive learning, and therefore can have a positive educational impact.

… comprehensive manual to help institutions with the practicalities of implementing the OSCE for the first time, though a lot of studies covering various aspects of the OSCE have been published in peer reviewed journals. This Guide will present an evidence-based perspective on setting up an OSCE for those new to the approach, and will also provide some guidance and thought to those who would like to revisit their programmes for quality assurance purposes.

The Guide consists of two parts; Part I focuses on the historical background and educational principles of the OSCE. Knowledge and understanding of these principles is essential before moving any further in designing and administering an OSCE. We hope that the contents of Part I will act as a suitable and informative introduction for the readers, enabling them eventually to understand and implement the ideas and practical advice given in Part II, which will describe the organisation and administration of the OSCE.
Correspondence: Dr Kamran Khan, Department of Anaesthetic, Mafraq Hospital, Mafraq, Abu Dhabi, UAE; email: Kamran.Khan950@gmail.com
ISSN 0142–159X print/ISSN 1466–187X online/13/091437–10 © 2013 Informa UK Ltd. e1437
DOI: 10.3109/0142159X.2013.818634
K. Z. Khan et al.
… evolving; from the assessment of knowledge in the period up to the 1960s, to the assessment of performance from the 1970s onwards (Epstein 2007; Hays 2008). A range of assessment tools have subsequently been developed for the assessment of performance, including the OSCE with standardised patients, workplace based assessments and assessments using computer enhanced simulations (Norcini & McKinley 2007).

Prior to the development of the OSCE in the 1970s by Harden (1975), combinations of 'short case' and 'long case' examinations were used in an attempt to assess performance. Frequently, these methods also included an oral or viva voce examination (Sood 2001). These assessment tools are still being used widely in different parts of the world and warrant some consideration before we move on to discuss the OSCE.

Short case examinations

In short case examinations candidates are asked to perform a brief and focused clinical examination of around five or six patients with specific clinical findings. Fifteen to twenty minutes per patient are usually allocated. Following this, the findings and diagnosis are discussed with a pair of examiners. The examiners award marks independently on the candidates' performance; however, this scoring is often unstructured (Ponnamperuma et al. 2009), with the same examiners examining the candidate throughout the entire time. Although the patients have real signs and symptoms, these patients often change between candidates, affecting the standardisation of the examination (Norcini 2002; Wass & van der Vleuten 2004). This format of examination has the advantage of having multiple, brief and observed real patient encounters similar to a clinician's outpatient experience, and is therefore considered a very good assessment tool for clinical examination skills (Ponnamperuma et al. 2009). However, the reproducibility and validity of these examinations is affected by unstructured questioning by the examiners, a lack of standardisation of patients between candidates, and a lack of ability to assess history taking (Walsh 2006).

Long case examinations

… history and perform a complete physical examination on a real patient, frequently chosen from the current in-patient or out-patient cohort. Candidates at different examination sittings may be allocated different patients with varying conditions and clinical signs. Candidates are asked to take a history and perform a complete physical examination in the first 30–45 min, often unobserved by the examiners (Ponnamperuma et al. 2009). Typically, unstructured questioning of the candidates follows this, which is usually focused on their clinical findings, diagnosis and management plan for the patients' problems. The candidates' interaction with the patient, including history taking, general communication and clinical skills, is not always observed. Most frequently the discussion is based on the theoretical aspects of the case in question, exploring the depth of candidates' understanding, i.e. their knowledge and the management plan for the patient (Sood 2001; Wass et al. 2001a; Wass & van der Vleuten 2004).

A number of drawbacks with this form of assessment have been identified; for example, the marks awarded for long cases are based upon unstructured questioning rather than checklists or standardised scoring sheets (Norman 2002). There is an inherent inter-case variability at separate sittings of the same examination (Norcini 2002; Wass & van der Vleuten 2004). Further, a single case does not allow the assessment of the candidates' performance across a range of areas, restricting the sampling of the 'test universe'. The content specificity of long cases makes the generalisation of the skills assessed using this technique nearly impossible (Wass et al. 2001b; Norcini 2002).

Despite the disadvantages described above, long case examinations have a high 'face validity', as the encounter between the candidate and the patient is realistic and tests the integrated interaction which is very close to actual daily practice (Wass & van der Vleuten 2004). Table 1 lists the advantages and disadvantages of long cases.

To overcome the above mentioned drawbacks, many modifications of the long case examination have been developed. The major focus of these modifications is on the observation of the candidates' performance during the long case. For instance, the Observed Long case (Newble 1991), the Leicester Assessment Package (Fraser et al. 1994), the Objective Structured Long Case Record (OSLER) …
Table 2. Assessment domains in the OSLER (Gleeson 1997). [Columns: Assessment domains; Specific criteria in each domain.]

The original OSCE

Prior to the development of the OSCE, the candidate's assessment could be affected by the patient's performance, …

Figure 1. Eighteen station OSCE as originally described by Harden (1979). Shaded stations had examiners present.
Station 1: Examine neck
Station 2: Questions on Station 1
Station 3: Take history from patient with abdominal pain
Station 4: Questions on Station 3
Station 5: Neurological examination of legs
Station 6: Questions on Station 5
Station 7: Examine chest
Station 8: Questions on Station 7
Station 9: Inspect photographs of patients and answer questions
Station 10: Read written summary of patient's history and patient's chest radiograph and answer questions
Station 11: Look at ward drug prescription chart and answer questions
Station 12: Look at written history of patient and fluid balance chart and answer questions
Station 13: Take history from patient with haematuria
Station 14: Questions on Station 13
Station 15: Take history from patient with acute onset of dyspnoea
Station 16: Questions on Station 15
Station 17: Examination of patient's breast
Station 18: Questions on Station 17
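The circuit mechanics implied by Figure 1, in which every candidate occupies a different station during each time slot and advances one station per slot, can be sketched as a simple modular rotation. This is a hypothetical illustration of the scheduling logic, not part of the original Guide; the function name and 18-station default are assumptions for the example.

```python
# Sketch of an OSCE circuit rotation: with as many candidates as
# stations, candidate c starts at station c and advances one station
# per time slot, so no station is ever double-booked.

def station_for(candidate: int, slot: int, n_stations: int = 18) -> int:
    """1-based station occupied by `candidate` (1-based) in `slot` (0-based)."""
    return (candidate - 1 + slot) % n_stations + 1

n = 18
for slot in range(n):
    occupied = {station_for(c, slot, n) for c in range(1, n + 1)}
    assert len(occupied) == n  # every station is occupied exactly once
```

After n slots every candidate has visited every station once, which is why the total examination time is simply the number of stations multiplied by the slot length.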
… of multiple stations around which students rotate and at which students perform and are assessed on specific tasks''. Hodder (1989) and van der Vleuten and Swanson (1990) support the view that many different types of test methods can be incorporated within the OSCE format.

Based on various descriptions of the OSCE in the literature, we propose a consolidated definition of the OSCE: ''An assessment tool based on the principles of objectivity and standardisation, in which the candidates move through a series of time-limited stations in a circuit for the purposes of assessment of professional performance in a simulated environment. At each station candidates are assessed and marked against standardised scoring rubrics by trained assessors''.

Some variants of the OSCE described in Table 3 use the original format of moving around assessment stations to assess a range of different outcomes.

What Does the OSCE Assess?

It is important to understand the relationship of competence and performance before exploring the details of what an OSCE can be used to assess. The OSCE has previously been described in the literature as a tool used for the assessment of competence. This is slightly misleading as, according to Dreyfus & Dreyfus (1980), competence is a point on the spectrum of performance. Any tool used for the assessment of performance should be able to grade the candidates as novice, advanced beginner, competent, proficient or expert in the tasks allocated to them. A key difference between the OSCE and Work Place Based Assessment (WPBA) is the setting in which performance is being assessed, not that the former assesses competence and the latter performance, as is commonly perceived (Boursicot et al. 2011). Nevertheless, the difference is very significant, since the performance of an individual on identical tasks can vary considerably depending on the context of the assessment (ten Cate et al. 2010). Therefore, for all practical purposes the OSCE should be seen as a tool for the assessment of performance within simulated environments. Competency is the composite of cognitive, psychomotor and affective skills as appropriate, while competence is an attribute of a person. Assessment of performance on competencies, for example the ability to communicate or to examine the respiratory system, allows the assessor to determine if the performer is competent, proficient or expert, etc. A detailed discussion on this topic is beyond the scope of this Guide; we have addressed it in our paper on competency, competence and performance (Khan & Ramachandran 2012).
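The idea of marking every candidate against the same standardised scoring rubric, as in the consolidated definition above, can be made concrete with a minimal sketch. The checklist items and mark weights below are invented purely for illustration; they are not taken from this Guide or any real station.

```python
# Minimal sketch of standardised rubric scoring at one OSCE station:
# every examiner marks the same weighted checklist, so scores are
# comparable across candidates and examiners.

CHECKLIST = {  # hypothetical items for a chest-examination station
    "washes hands": 1,
    "introduces self and gains consent": 2,
    "inspects and palpates correctly": 3,
    "auscultates all areas": 3,
    "presents findings clearly": 1,
}

def station_score(items_done: set) -> float:
    """Percentage of the checklist's total marks achieved."""
    achieved = sum(m for item, m in CHECKLIST.items() if item in items_done)
    return 100 * achieved / sum(CHECKLIST.values())

print(station_score({"washes hands", "auscultates all areas"}))  # 40.0
```

Because the criteria and weights are fixed in advance, two examiners observing the same performance should arrive at the same mark, which is the sense in which the rubric supports objectivity.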
The OSCE and its variants described earlier are only a few from a number of tools available to assess performance in health care. The diagram (Figure 3) describes the use of Workplace Based Assessments and Incognito Patients (Rethans et al. 2007) as additional methods for the competency based assessment of performance (CBAP) in different settings.

Figure 3 highlights that assessment at an OSCE station gives a 'snapshot' of candidates' demonstrated performance in a particular area in a simulated environment. With reference to …
Table 3 (excerpt). Variants of the OSCE.

Objective Structured Assessment of Technical Skills (OSATS) (Bodle et al. 2008; van Hove et al. 2010): Designed for objective skills assessment, consisting of a global rating scale and a procedure specific checklist. It is primarily used for feedback or for measuring progress of training in surgical specialities.

Objective Structured Video Examinations (OSVE) (Humphris & Kaney 2000; Vlantis et al. 2004): Videotaped recordings of patient–doctor encounters are shown to students simultaneously, and questions related to the video clip are asked. Written answers are marked in a standardised manner.

Team Objective Structured Clinical Examination (TOSCE) (Singleton et al. 1999): Formative assessment covering common consultations in general practice. A team of students visits each station in a group, performing one task each in sequence. The candidates are marked for their performance and feedback is provided. The team approach improves efficiency and encourages learning from peers.
Figure 3. Model for the competency based assessment of performance and examples of available assessment tools. Reproduced from Khan & Ramachandran (2012). [The model divides performance into Observed Performance — tools: the OSCE and its variations, assessment on part task trainers, assessment using high fidelity simulation, etc.; DOPS; Mini-CEX +/- CBD — and Actual Performance — tools: incognito patients; video review of performance.]
Figure 5. Factors influencing performance. Modified from Khan & Ramachandran (2012). [Factors shown: Environment (simulated or workplace); Attitudes (prejudice, professionalism etc.); Non-Clinical Skills (decision making, team working, planning etc.); Emotional State (anxiety, being observed etc.); Personality Traits (cautiousness, extroversion etc.); Knowledge and ability to apply knowledge (cognitive skills).]
Table 4. Affective, cognitive and psychomotor domains of learning from basic to advanced levels. [Columns: Range of difficulty; Affective; Cognitive; Psychomotor.]

We believe that the OSCE should be designed to assess certain competencies or skills which are not possible to assess using pen and paper or computer based testing methods. Typical examples of such skills could be the ability of …
Educational principles of the OSCE

The two major underlying principles of the OSCE are 'objectivity' and 'structure'. Objectivity predominantly depends on standardised scoring rubrics and the same, trained, examiner asking the same questions of every candidate. A well-structured OSCE station, on the other hand, has a standardised station design assessing a specific clinical task which is blueprinted against the curriculum. A well-designed OSCE has a high level of validity (Downing 2003), which in simple terms means that the OSCE assesses what it is designed to assess. At the same time, a well-designed OSCE has also been shown to demonstrate a high degree of reliability (Boursicot 2010), i.e. the examination results are reproducible with very little error. The concepts of validity, reliability, feasibility, and educational impact are explored in more detail below.

Test length has been shown to have the greatest influence in ensuring that candidates' overall performance is reliably assessed (Swanson et al. 1995).

(1) Standardised scoring rubrics
These ensure that examiners are marking candidates against the same criteria, thereby improving consistency of scoring between candidates and examiners (Smee 2003).

(2) Using trained examiners
There is some evidence suggesting that examiner training reduces examiner variation in scoring (Newble et al. 1980; van der Vleuten et al. 1989) and improves consistency of examiners' behaviour. Further, using different examiners for different stations can reduce individual assessor bias (Gormley 2011).

(3) Standardised patient performance
Poorly standardised patients who vary their performance …
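The influence of test length on reliability (Swanson et al. 1995) is commonly quantified with the Spearman–Brown prophecy formula, r_k = k·r / (1 + (k − 1)·r), where r is the reliability of a single station and k the factor by which the test is lengthened. The sketch below applies the standard formula; the single-station reliability of 0.15 is an assumed, illustrative value, not a figure from this Guide.

```python
def spearman_brown(r_single: float, k: float) -> float:
    """Projected reliability of a test lengthened k-fold,
    via the Spearman-Brown prophecy formula."""
    return (k * r_single) / (1 + (k - 1) * r_single)

# Assumed, illustrative reliability of a single OSCE station:
r1 = 0.15
for n_stations in (1, 5, 10, 20):
    print(n_stations, round(spearman_brown(r1, n_stations), 2))
# Lengthening the circuit (adding stations) raises overall reliability,
# with diminishing returns as the projected value approaches 1.
```

This is why multi-station OSCEs can achieve reproducible overall results even though any single station, taken alone, is a noisy measure.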
… the Practical Assessment of Clinical Examination Skills (PACES) examination as a component of its membership examinations (Wallace 2002).
(3) As a formative assessment tool in undergraduate medical education (Townsend et al. 2001).
(4) As a tool for the assessment of graduates seeking high-stakes licensure and certification to practise medicine (Hodges 2003). The Professional and Linguistic Assessments Board Part II [PLAB] in the United Kingdom (Tombleson et al. 2000), the Medical Council of Canada Qualifying Examination II (Reznick et al. 1992), and the Clinical Skills Assessment part of the United States Medical Licensure Examination are based on an OSCE model (Whelan 1999).
(5) As an educational tool to provide immediate feedback (Hodder et al. 1989; Brazeau et al. 2002).

Conclusions

Part I of this Guide has introduced the concept of the OSCE as a tool for performance-based assessment in simulated environments. This part has also described why the OSCE was developed and how it can meet the need for more structure and objectivity within the assessment of performance. We have stressed throughout this Guide that the OSCE should be used only when other methods of assessment cannot assess the competency in question, or in conjunction with other assessments of performance. It neither replaces the tools used for the assessment of knowledge nor is it a panacea for all the assessment needs in an educational programme. The challenges associated with the academic and practical aspects of implementing an OSCE, particularly when this examination format is novel to the institution, are addressed in Part II of this Guide. Readers equipped with the information in this part of the Guide will find it easier to understand the complex processes needed for the setup of OSCE programmes, described in the second part.

Declaration of interest: The authors report no conflicts of interest. The authors alone are responsible for the content and writing of the article.

Notes on contributors

Dr KAMRAN KHAN, MBBS, FRCA, FAcadMEd, MScMedEd, at the time of production of this manuscript was Senior Teaching Fellow, Manchester Medical School, and Consultant Anaesthetist, LTHTR, Preston, UK. During his career Dr Khan has developed a passion for medical education and acquired higher qualifications in this field alongside his parent speciality of Anaesthetics. He has held clinical academic posts, first at the University of Oxford and then at the University of Manchester. He was Associate Lead for 'Assessments' at the Manchester Medical School. He is currently working in the UAE as a Consultant Anaesthetist. He has extensively presented and published at national and international levels in his fields of academic interest.

Dr SANKARANARAYANAN RAMACHANDRAN, MBBS, PG Cert (MedEd), FHEA, is a Medical Education & Simulation Fellow, Lancashire Teaching Hospitals NHS Trust, Preston, UK. He qualified in India and underwent postgraduate training in General Medicine. He has been teaching and training undergraduate and postgraduate medical students throughout his career. He has completed a Postgraduate Certificate and is pursuing a Master's degree in Medical Education at the University of Manchester. He has a special interest in assessment, problem-based learning and research in medical education.

Dr KATHRYN GAUNT, MBChB, MRCP, PGD(MedEd), at the time of production of this manuscript was a Medical Education and Simulation Fellow, LTHTR, Preston, UK. Dr Gaunt graduated in 2002 from the University of Manchester Medical School, UK. She was a Specialty Registrar in Palliative Medicine and has a keen interest in medical education. She is currently pursuing a higher qualification in this field. She has recently taken time out of her training programme to work as a Medical Education and Simulation Fellow at the Lancashire Teaching Hospitals NHS Trust. Here she lectured, examined for OSCEs and was involved in research and development within the field. She has now returned to her training in Palliative Medicine.

Dr PIYUSH PUSHKAR, MBChB, works in Critical Care at the Aintree University Hospitals Trust, Liverpool, UK. Piyush qualified from the University of Edinburgh and has subsequently worked and trained in anaesthesia and critical care in Lancashire and Merseyside, as well as aeromedical retrieval medicine in Queensland, Australia. He also works as a volunteer doctor for Freedom from Torture in Manchester.

References

American Educational Research Association. 1999. Standards for educational and psychological testing. Washington, DC: American Educational Research Association.
Bloom BS. 1974. Taxonomy of educational objectives: The classification of educational goals; handbook 1: Cognitive domain. [S.l.], New York: Longman.
Bodle JF, Kaufmann SJ, Bisson D, Nathanson B, Binney DM. 2008. Value and face validity of objective structured assessment of technical skills (OSATS) for work based assessment of surgical skills in obstetrics and gynaecology. Med Teach 30:212–216.
Boursicot K. 2010. Structured assessments of clinical competence. Br J Hosp Med 71:342–344.
Boursicot K, Etheridge L, Setna Z, Sturrock A, Ker J, Smee S, Sambandam E. 2011. Performance in assessment: Consensus statement and recommendations from the Ottawa conference. Med Teach 33:370–383.
Brazeau C, Boyd L, Crosson J. 2002. Changing an existing OSCE to a teaching tool: The making of a teaching OSCE. Acad Med 77:932.
Carraccio C, Englander R. 2000. The Objective Structured Clinical Examination: A step in the direction of competency-based evaluation. Arch Pediat Adol Med 154:736–741.
Downing SM. 2003. Validity: On the meaningful interpretation of assessment data. Med Educ 37:830–837.
Downing SM. 2010. Validity: Establishing meaning for assessment data through scientific evidence. St George's Advanced Assessment Course, London.
Dreyfus S, Dreyfus H. 1980. A five stage model of the mental activities involved in directed skill acquisition. Research Paper. Department of Philosophy, University of California, Berkeley.
Epstein RM. 2007. Assessment in medical education. N Eng J Med 356:387–396.
Fraser RC, McKinley RK, Mulholland H. 1994. Consultation competence in general practice: Establishing the face validity of prioritised criteria in the Leicester assessment package. Br J Gen Prac 44:109–113.
Gleeson F. 1997. Assessment of clinical competence using the Objective Structured Long Examination Record (OSLER), AMEE Medical Education Guide No. 9. Med Teach 19:7–14.
Gormley G. 2011. Summative OSCEs in undergraduate medical education. Ulster Med J 80:127–132.
Gupta P, Dewan P, Singh T. 2010. Objective Structured Clinical Examination (OSCE) revisited. Indian Pediat 47:911–920.
Hamdy H, Prasad K, Williams R, Salih FA. 2003. Reliability and validity of the direct observation clinical encounter examination (DOCEE). Med Educ 37:205–212.
Hamdy H, Telmesani AW, Wardy NA, Abdel-Khalek N, Carruthers G, Hassan F, Kassab S, Abu-Hijleh M, Al-Roomi K, O'Malley K, et al. 2010. Undergraduate medical education in the Gulf Cooperation Council: A multi-countries study (Part 2). Med Teach 32:290–295.
Harden RM. 1979. How to assess clinical competence – An overview. Med Teach 1:289–296.
Harden RM. 1988. What is an OSCE? Med Teach 10:19–23.
Harden RM, Cairncross RG. 1980. Assessment of practical skills: The Objective Structured Practical Examination (OSPE). Studies High Educ 5:187–196.
Harden RM, Stevenson M, Downie WW, Wilson GM. 1975. Assessment of clinical competence using objective structured examination. BMJ 1:447–451.
Hays R. 2008. Assessment in medical education: Roles for clinical teachers. Clin Teach 5:23–27.
Hodder RV, Rivington RN, Calcutt LE, Hart IR. 1989. The effectiveness of immediate feedback during the Objective Structured Clinical Examination. Med Educ 23:184–188.
Hodges B. 2003. OSCE! Variations on a theme by Harden. Med Educ 37:1134–1140.
Hodges B, Hanson M, McNaughton N, Regehr G. 2002. Creating, monitoring, and improving a psychiatry OSCE. Acad Psychiat 26(3):134–161.
Humphris GM, Kaney S. 2000. The objective structured video exam for assessment of communication skills. Med Educ 34:939–945.
Jayawickramarajah PT. 1985. Oral examinations in medical education. Med Educ 19:290–293.
Jewett AE, Jones SL, Luneke SM, Robinson SM. 1971. Educational change through a taxonomy for writing physical education objectives. Quest 15:32–38.
Khan K, Ramachandran S. 2012. Conceptual framework for performance assessment: Competency, competence and performance in the context of assessments in healthcare – Deciphering the terminology. Med Teach 34:920–928.
Krathwohl DR. 1956. Taxonomy of educational objectives: The classification of educational goals; handbook 2: Affective domain. [S.l.], New York: Longmans.
McManus IC, Thompson M, Mollon J. 2006. Assessment of examiner leniency and stringency ('hawk-dove effect') in the MRCP(UK) clinical examination (PACES) using multi-facet Rasch modelling. BMC Medical Education 6:42.
Miller GE. 1990. The assessment of clinical skills/competence/performance. Acad Med 65:S63–S67.
Nestel D, Kneebone R, Nolan C, Akhtar K, Darzi A. 2011. Formative assessment of procedural skills: Students' responses to the Objective Structured Clinical Examination and the Integrated Performance Procedural Instrument. Assess Eval Higher Educ 36:171–183.
Newble D. 2004. Techniques for measuring clinical competence: Objective structured clinical examinations. Med Educ 38:199–203.
Newble DI. 1991. The observed long-case in clinical assessment. Med Educ 25:369–373.
Newble DI, Hoare J, Sheldrake PF. 1980. The selection and training of examiners for clinical examinations. Med Educ 14:345–349.
Norcini JJ. 2002. Death of a long case. BMJ 324:408–409.
Norcini JJ, McKinley DW. 2007. Assessment methods in medical education. Teaching Teach Educ 23:239–250.
Norman G. 2002. The long case versus objective structured clinical examinations. BMJ 324:748–749.
PMETB. 2007. Developing and maintaining an assessment system – a PMETB guide to good practice [Online]. London: PMETB. [Accessed 10 June 2012] Available from http://www.gmc-uk.org/assessment_good_practice_v0207.pdf_31385949.pdf
Ponnamperuma GG, Karunathilake IM, McAleer S, Davis MH. 2009. The long case and its modifications: A literature review. Med Educ 43:936–941.
Rethans J-J, Gorter S, Bokken L, Morrison L. 2007. Unannounced standardised patients in real practice: A systematic literature review. Med Educ 41:537–549.
Reznick R, Smee S, Rothman A, Chalmers A, Swanson D, Dufresne L, Lacombe G, Baumber J, Poldre P, Levasseur L. 1992. An objective structured clinical examination for the licentiate: Report of the pilot project of the Medical Council of Canada. Acad Med 67:487–494.
Roberts C, Newble D, Jolly B, Reed M, Hampton K. 2006. Assuring the quality of high-stakes undergraduate assessments of clinical competence. Med Teach 28:535–543.
Shumway JM, Harden RM. 2003. AMEE Guide No. 25: The assessment of learning outcomes for the competent and reflective physician. Med Teach 25:569–584.
Singleton A, Smith F, Harris T, Ross-Harper R, Hilton S. 1999. An evaluation of the Team Objective Structured Clinical Examination (TOSCE). Med Educ 33:34–41.
Smee S. 2003. ABC of learning and teaching in medicine: Skill based assessment. BMJ 326:703–706.
Smith LJ, Price DA, Houston IB. 1984. Objective structured clinical examination compared with other forms of student assessment. Arch Dis Child 59:1173–1176.
Sood R. 2001. Long case examination – Can it be improved? Indian Acad Clin Med 2:251–255.
Stokes JF. 1979. Take a clinical examination. BMJ 1:98–99.
Swanson DB, Norman GR, Linn RL. 1995. Performance-based assessment: Lessons from the health professions. Educ Res 24:5–11.
Ten Cate TJO, Snell L, Carraccio C. 2010. Medical competence: The interplay between individual ability and the health care environment. Med Teach 32:669–675.
Tombleson P, Fox RA, Dacre JA. 2000. Defining the content for the OSCE component of the PLABoard examination: Development of a blueprint. Med Educ 34:566–572.
Townsend AH, McIlvenny S, Miller CJ, Dunn EV. 2001. The use of an objective structured clinical examination (OSCE) for formative and summative assessment in a general practice clinical attachment and its relationship to final medical school examination performance. Med Educ 35:841–846.
Troncon LA, Dantas RO, Figueiredo JFC, Ferriolli E, Moriguti JLC, Martinelli ALC, Voltarelli JLC. 2000. A standardized structured long case examination of clinical competence of senior medical students. Med Teach 22:380–385.
Turner JL, Dankoski ME. 2008. Objective structured clinical exams: A critical review. Fam Med 40:574–578.
Van der Vleuten CPM, Swanson DB. 1990. Assessment of clinical skills with standardized patients: State of the art. Teach Learn Med 2:58–76.
Van der Vleuten CPM, Van Luyk SJ, Van Ballegooijen AMJ, Swanson DB. 1989. Training and experience of examiners. Med Educ 23:290–296.
Van Hove PD, Tuijthof GJM, Verdaasdonk EGG, Stassen LPS, Dankelman J. 2010. Objective assessment of technical surgical skills. Br J Surg 97:972–987.
Vlantis AC, Lee WC, Van Hasselt CA. 2004. The objective structured video examination of medical students. Med Educ 38:1181–1182.
Wallace J. 2002. Simulated patients and objective structured clinical examinations: Review of their use in medical education. Adv Psychiat Treat 8:342–348.
Walsh K. 2006. OSCE. BMJ Learning. Available from group.bmj.com/products/learning/medical-presentations-2/OSCEs.ppt [Accessed 13 August 2013].
Wass V, Jones R, Van der Vleuten C. 2001a. Standardized or real patients to test clinical competence? The long case revisited. Med Educ 35:321–325.
Wass V, Van der Vleuten C. 2004. The long case. Med Educ 38:1176–1180.
Wass V, Van der Vleuten C, Shatzer J, Jones R. 2001b. Assessment of clinical competence. Lancet 357:945–949.
Whelan GP. 1999. Educational Commission for Foreign Medical Graduates: Clinical skills assessment prototype. Med Teach 21:156–160.
Wilson GM. 1969. Examination of clinical examiners. Lancet 4:37–40.
Zayyan M. 2011. Objective Structured Clinical Examination: The assessment of choice. Oman Med J 26(4):219–222.