
HEALTH PROFESSION'S EDUCATION

ASSESSMENT IN HIGHER EDUCATION

JUNAID SARFRAZ KHAN


Department of Examinations, University of Health Sciences, Lahore – Pakistan

ABSTRACT
Assessment drives learning and influences the quality of learning by directing the approach students
take towards learning and by aligning the outcomes of the educational program with teaching methodologies
and the educational environment. Assessment needs to be recognized as a multidimensional entity,
not a singular activity or concept, that transcends three domains: cognitive, affective and psychomotor.
Any assessment requires activation of and access to different cognitive, affective and psychomotor
skills at multiple levels, and their application through a fusion, in a multidimensional collusion,
of stored memories, learned knowledge and behaviour, and acquired skills. Another dimension that
requires consideration is the context in which assessment takes place. The context of assessment can be
defined in terms of the environment in which assessment takes place, its background and stakes, as well as
the stakeholders involved. New formats and mediums are being used in all areas of education, both as
learning/teaching strategies and for assessment. Computerized, computer-aided or online teaching
and learning have paved the way for computer-assisted assessment techniques. Whether assessment
is formative or summative influences its design, approach and outcomes. To the administrator,
the results of assessment, either formative or summative, provide data that help establish current
policies or bring changes to them. To program developers, the same results establish the worth
of the program, or otherwise. To trainees, the scores or feedback help in understanding their deficiencies
in relation to the clearly predefined goals and objectives of the educational program. The public
places great emphasis on the nature of assessment and the outcomes related to it, since it is the public
that is going to use the product of medical education programs, and confidence in the product will be
related to their acceptance of the assessment and its outcomes. This paper identifies different formats
of assessment and their contextual relevance.
Keywords: Assessment, Formative assessment, Summative assessment, Bloom's Taxonomy, Context.

BACKGROUND

Assessment drives learning and influences the quality of learning by directing the approach students take towards learning and by aligning the outcomes of the educational program with teaching methodologies and the educational environment. In any educational program, students learn that which shall be assessed rather than what is required. Assessment, therefore, requires strategic planning whereby teaching and learning are driven through it to achieve the desired goals in a competency-oriented, outcomes-based educational program.

Assessment needs to be recognized as a multidimensional entity and not a singular activity or concept. Table 1 presents the taxonomy of educational learning and assessment divided into three domains: cognitive,

Table 1: Bloom's taxonomy of educational objectives.

Cognitive Domain
Knowledge: Recall data or information. Examples: multiple-choice test; recount facts or statistics; recall a process. Keywords: arrange, define, describe, label, list, recognize, relate, reproduce, select, state.
Comprehension: Ability to grasp the meaning of material. Examples: explain or interpret a given scenario or statement; suggest a treatment. Keywords: explain, reiterate, classify, give examples, illustrate, translate, review, report, discuss.
Application: Ability to use learned material in new and concrete situations. Examples: put a theory into practical effect; demonstrate; solve a problem. Keywords: use, apply, discover, manage, execute, solve, produce, implement, construct, change, prepare.
Analysis: Interpret organizational principles, structure, construction. Examples: identify the constituent parts and functions of a process; deconstruct a methodology. Keywords: analyze, break down, catalogue, compare, quantify, test, examine, experiment, relate, graph, diagram, plot.
Synthesis: Ability to put parts together to form a new whole. Examples: develop plans or procedures; integrate methods, resources, ideas. Keywords: develop, plan, build, create, design, revise, formulate, propose, establish, assemble.
Evaluation: Ability to judge the value of material for a given purpose. Examples: select the most effective solution; hire the most qualified candidate. Keywords: review, justify, assess, present a case for, defend, report on, investigate, direct, appraise, argue.

Affective Domain
Receiving: Awareness, willingness to hear, selected attention. Examples: listen to the teacher; take interest in learning; participate passively. Keywords: asks, chooses, describes, follows, gives, holds, identifies, locates, points to, selects, replies, uses.
Responding: React and participate actively. Examples: participates in class discussions; questions new ideals, concepts, models, etc. Keywords: answers, assists, aids, complies, discusses, greets, helps, performs, presents, reads, recites.
Valuing: Attach values and express personal opinions. Examples: decide the worth and relevance of ideas, experiences. Keywords: argue, challenge, debate, refute, confront, justify, persuade.
Organization: Reconcile internal conflicts; develop a value system. Examples: qualify and quantify personal views; state a personal position. Keywords: build, develop, formulate, defend, modify, relate, prioritize, reconcile, contrast, arrange.
Internalizing/characterizing values: Adopt a belief system and philosophy. Examples: shows self-reliance when working independently. Keywords: act, display, influence, solve, practice, propose, qualify, question.

Psychomotor Domain
Perception: The ability to use sensory cues to guide motor activity. Examples: detects non-verbal communication cues. Keywords: recognize, distinguish, notice, touch, hear, feel, etc.
Set: Readiness to act. Examples: mental, physical or emotional preparation before an experience. Keywords: arrange, prepare, get set, state, volunteer.
Guided response: The early stages in learning a complex skill, including imitation and trial and error. Examples: imitate or follow instruction; trial and error. Keywords: imitate, copy, follow, try.
Mechanism: Basic proficiency. Examples: competently respond to a stimulus for action. Keywords: make, perform, shape, complete.
Complex overt response: Skillful/expert proficiency. Examples: execute a complex process with expertise. Keywords: coordinate, fix, demonstrate.
Adaptation: Skills are well developed and the individual can modify movement patterns to fit special requirements. Examples: alter a response to reliably meet varying challenges. Keywords: adapts, alters, changes, rearranges, reorganizes, revises, varies.
Origination: Creating new movement patterns to fit a particular situation or specific problem. Examples: develop and execute new integrated responses and activities. Keywords: design, formulate, modify, redesign, troubleshoot.

Biomedica Vol. 30, Issue 1, Jan. – Mar., 2014
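The keyword column of Table 1 is often used when blueprinting test items: the action verb of an item hints at the cognitive level being tested. A minimal sketch of that heuristic follows; the keyword lists are abridged from the table, and the function name and item texts are invented for illustration.

```python
# Toy heuristic: guess the Bloom cognitive level of a test item from its
# action verbs, using (abridged) keyword lists from Table 1.
# Illustrative only; real blueprinting needs human judgement.

BLOOM_KEYWORDS = {
    "Knowledge": {"arrange", "define", "describe", "label", "list", "state"},
    "Comprehension": {"explain", "classify", "illustrate", "translate", "discuss"},
    "Application": {"use", "apply", "solve", "implement", "demonstrate"},
    "Analysis": {"analyze", "compare", "examine", "diagram", "plot"},
    "Synthesis": {"develop", "plan", "design", "formulate", "propose"},
    "Evaluation": {"justify", "assess", "defend", "appraise", "argue"},
}

def guess_level(item: str) -> str:
    """Return the first Bloom level whose keyword appears in the item."""
    words = [w.strip(".,:;") for w in item.lower().split()]
    for level, keywords in BLOOM_KEYWORDS.items():
        if any(w in keywords for w in words):
            return level
    return "Unclassified"

print(guess_level("Define the term 'formative assessment'."))  # Knowledge
print(guess_level("Justify your choice of antibiotic."))       # Evaluation
```

A real blueprint review would still need human judgement, since the same verb can serve different levels depending on the stem.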



Table 2: Requirements and concepts behind assessment.2

Cognitive domain. Activities: self-check quizzes; case studies; drill and practice; short-answer essays; project- or problem-based activities. Delivery considerations: web-enhanced materials supplementing classroom lectures; hybrid course with cognitive content on the web; multimedia simulations of challenging and key concepts. Assessment: project-based for higher cognitive skills; multiple-choice or short-essay questions; case studies.

Affective domain. Activities: goal setting; motivational videos; self-reflective writing in a journal; practice tutorials designed for student success. Delivery considerations: face-to-face meetings; streaming audio explanations and encouragement related to course content; interactive video, webcasts, conference calls. Assessment: self-assessment using a checklist; pre/post attitude survey; retention/success in the course.

Psychomotor domain. Activities: practice of the desired skill with feedback; arranging the sequences of an activity in correct order. Delivery considerations: face-to-face demonstrations; demonstration videos; pictures with audio and text explanations; interactive video demonstrations. Assessment: performance of the skill matches a set standard as observed by an instructor or designee.
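The self-check quizzes and multiple-choice assessments in the cognitive row of Table 2 are the simplest form of computer-assisted assessment: items are scored automatically and formative feedback is returned immediately. A minimal sketch follows; the quiz items and messages are invented for illustration.

```python
# Minimal computer-assisted assessment sketch: an auto-scored multiple-choice
# self-check quiz with immediate formative feedback. Items are hypothetical.

QUIZ = [
    ("Which domain covers motor skills?",
     ["Cognitive", "Affective", "Psychomotor"], 2),
    ("Formative assessment primarily provides:",
     ["Course grades", "Feedback", "Certification"], 1),
]

def score(responses):
    """Return (marks, feedback lines) for a list of chosen option indices."""
    marks, feedback = 0, []
    for (question, options, answer), chosen in zip(QUIZ, responses):
        if chosen == answer:
            marks += 1
            feedback.append(f"Correct: {question}")
        else:
            feedback.append(f"Review: {question} (answer: {options[answer]})")
    return marks, feedback

marks, notes = score([2, 0])
print(f"Score: {marks}/{len(QUIZ)}")
for line in notes:
    print(line)
```

The same scoring loop serves formative use (the feedback lines) and summative use (the mark), which is why the text treats the distinction as one of purpose rather than mechanism.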

affective and psychomotor, presented by Bloom.1 Table 2 presents the requirements and concepts behind assessment of the three domains and their sub-levels.2

The pitfall to avoid here is considering each layer within each domain separately rather than in a multi-layered, multidimensional manner. Any assessment requires activation of and access to different cognitive, affective and psychomotor skills at multiple levels, and their application through a fusion, in a multidimensional collusion, of stored memories, learned knowledge and behaviour, and acquired skills. Repeated activation and application of learned knowledge, behaviour and skills reinforces and improves them through the value of the experience gained in their application.

Another dimension that requires consideration here is the context in which assessment takes place. Context moulds assessment, learning through assessment, and the outcomes of assessment. The context of assessment can be defined in terms of the environment in which assessment takes place, its background and stakes, as well as the stakeholders involved.

New formats and mediums are being used in all areas of education, both as learning/teaching strategies and for assessment. Computerized, computer-aided or online teaching and learning have paved the way for computer-assisted assessment techniques. These have evolved from the very basic, resembling pen-and-paper tests, to formats using increasingly adaptive technology, multimedia and constructed responses, embedding virtual reality and simulations into the programs of both learning and assessment. This brings all three, that is, the learning experience, the assessment context and the learning through assessment, as close to reality as possible.

Whether assessment is formative or summative influences its design, approach and outcomes. Formative assessment is defined as "a range of formal and informal assessment procedures employed by teachers during the learning process in order to modify teaching and learning activities to improve student attainment".3 Summative assessment (or summative evaluation), by contrast, refers to the assessment of learning and summarizes the development of learners at a particular time.4 Since the purpose of formative assessment is to provide feedback, stakeholders approach it differently from summative assessment, in which the stakes are higher. What needs to be recognized is the power and potential of formative assessment in aligning educational strategies to achieve the outcomes of the program and to make summative assessment a success.

If formative assessment is rooted within the educational environment, depicting the outcomes of individual components of the program as close to reality as possible, it will be tremendously influential in driving learning in the right direction, especially at the right time. Formative assessment is not formative on account of being assessment, but by virtue of the feedback that is generated from it and presented to the students, or rather to all stakeholders. The analogy to consider here is that of the multiple test drives, pit checks and fine-tuning carried out by an entire Formula 1 team before the final, competitive drive, which involves the driver and the car alone, without the rest of the team, and whose stakes are so high that failure could represent considerable losses for the entire team (the stakeholders).

Assessment can therefore be classified on the basis




of its functionality. Airasian and Madaus5 classified assessment into the following categories:

1. Placement assessment:
Examples of placement assessment are entrance tests like the Medical Colleges Admission Test (MCAT), assessment of students at the beginning of a course to place them into groups based on their background knowledge, skills and behaviour level, or assessment during the course of the program to assign them to groups that may require different facilitatory or instructional approaches. To arrive at these decisions, a host of different tests and inquiries can be used, including simulated real-time performance measures, pen-and-paper tests, self-reports, peer reports, direct observation of behaviour, records of past achievements and experiences, and outcomes of counseling sessions.

2. Formative assessment:
This assessment typically does not count towards assigning course grades. Its purpose is to provide feedback to the stakeholders on the alignment of the strategic goals of the program and the progress towards those goals by the stakeholders. It therefore requires 360° feedback to be fully effective. As discussed previously, for formative assessment to be effective, it needs to be set in as real-time, objective and competency-oriented a setting as possible, assessing the program's strategic goals as realistically as can be. Only then can it guide instruction and learning in the direction where it shall culminate in achievement of the program goals, fully assessed through summative assessment at the end of the program.

3. Diagnostic assessment:
Whereas formative assessment aims to provide feedback and correction of deficiencies in instruction and learning in an educational program, the purpose of diagnostic assessment is to provide the stakeholders with a diagnosis of the problems or obstacles hindering the progress of instruction or learning in the right direction and at the right time, so that adequate remedial actions can be taken by the stakeholders concerned to achieve the strategic goals of the program. Diagnostic assessment is, therefore, a specialized assessment requiring specific tools like psychoanalysis, direct observation, etc.

4. Summative assessment:
Summative assessment is the final evidence of achievement of the cognitive and psychomotor gains and changes in behaviour that were intended in the educational program. It is used for assigning course grades and for certifying competency in the outcome-oriented, competency-driven higher education program. Summative assessment is important in providing feedback to all stakeholders that the outcomes have been achieved. In medical education, summative assessment certifies that the product of the medical education program is safe to progress to the next stage of competency development and, finally, to become an independently functioning health professional. This certification is important for public trust in the program and its products. Summative assessment, therefore, must assess the competencies of the product as close to the real environment as possible for that assessment to be sufficiently valid and reliable to foster trust in the product.

Assessment can also be classified by virtue of the interpretation of assessment procedures. That is, assessment can be norm-referenced or criterion-referenced.

5. Norm-referenced assessment:
This can be defined as a measure of performance (cognitive, psychomotor or behavioural skills, separately or in combination) interpreted in terms of an individual's standing in a known group relative to others within the group.

6. Criterion-referenced assessment:
This can be defined as a measure of performance (cognitive, psychomotor or behavioural skills) against predefined criteria, a reference or a measure. As an example, if the objective of an educational program was to train a typist to type 40 words per minute, a criterion-referenced test would measure the competence of the student against the yardstick or objective of 40 words typed per minute. Criterion-referenced assessment is therefore also called objective-referenced assessment.

Standards-based assessments in medical education fall within this category as well. They typically involve the use of checklists where the performance of candidates is measured against set criteria; pass and fail are not dependent on the relative standing of an individual student within the cohort but on achieving minimum safe standards. Most tests in medical education at present, however, are a mix of the two varieties; that is, they measure student competence against fixed, predefined criteria and objectives but also report on the relative standing of individuals within the cohort.

Our final distinction between the two categories is that whereas criterion-referenced tests are typically designed to measure the degree of competency or mastery achieved against predefined objectives, norm-referenced tests tell us of the



Figure: The interpretation continuum, from criterion-referenced tests (description of performance), through combined tests (dual interpretation), to norm-referenced tests (discrimination amongst individuals).
Table 3: Comparison of NRT and CRT.

Common characteristics of NRTs and CRTs:
- Both require specification of the achievement domain to be measured.
- Both require a relevant and representative sample of test items.
- Both use the same types of test items.
- Both use the same rules for item writing (except for item difficulty).
- Both are judged by the same qualities of goodness (validity and reliability).
- Both are useful in educational assessment.

Differences:
- NRT typically covers a large domain of learning skills, with just a few items measuring each specific task; CRT typically focuses on a delimited domain of learning tasks, with a relatively large number of items measuring each specific task.
- NRT emphasizes discrimination among individuals in terms of relative level of learning; CRT emphasizes description of what learning tasks individuals can and cannot perform.
- NRT favors items of average difficulty and typically omits very easy and very hard items; CRT matches item difficulty to learning tasks, without altering item difficulty or omitting easy or hard items.
- NRT interpretation requires a clearly defined group; CRT interpretation requires a clearly defined and delimited achievement domain.
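The contrast drawn in Table 3 can be made concrete by interpreting one set of raw scores both ways; the candidate names, scores and cut-off below are hypothetical.

```python
# The same raw scores interpreted two ways (all values hypothetical):
#  - criterion-referenced: pass/fail against a fixed cut-off
#  - norm-referenced: each candidate's rank within the cohort

scores = {"A": 72, "B": 55, "C": 88, "D": 55}
CUT_OFF = 60  # hypothetical minimum safe standard

# Criterion-referenced: each candidate judged against the cut-off,
# independently of how peers performed.
criterion = {name: ("pass" if s >= CUT_OFF else "fail")
             for name, s in scores.items()}

# Norm-referenced: standing relative to the group (1 = highest; ties share a rank).
def rank(name):
    return 1 + sum(1 for s in scores.values() if s > scores[name])

norm = {name: rank(name) for name in scores}

print(criterion)  # {'A': 'pass', 'B': 'fail', 'C': 'pass', 'D': 'fail'}
print(norm)       # {'A': 2, 'B': 3, 'C': 1, 'D': 3}
```

The combined tests described in the text simply report both readings: the pass/fail decision against the criterion and the rank within the cohort.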

relative standing of each individual within the group. Of note here is the arbitrary distinction between the two based on relative standing, either within the group or against a criterion. As already stated, it is perhaps more common these days to focus on both, with each test providing a description of the competency achieved and of the level to which it has been achieved within the group, and thereby information on how the group as a whole has achieved those objectives. This represents a continuum, as illustrated above.6

A comparison of norm-referenced tests (NRTs) and criterion-referenced tests (CRTs) is provided in Table 3.

Cronbach7 further classified assessment into two broad categories:

a) Measures of maximum performance:
These measures or tests assess the performance of individuals when they are maximally motivated to perform and achieve their highest.

b) Measures of typical performance:
These tests are designed to determine what is normal or typical in the routine performance of individuals.

In medical education, examples of the two performances can be derived from practice. Typical, routine performance is seen in the day-to-day, run-of-the-mill activities of health professionals, in activities that they consider routine, like working in the Out Patient Department (OPD) diagnosing a set of routine diseases. Maximum performance is observed when individuals are challenged by encounters that are other than normal or routine, when they have to perform at the best of their abilities to arrive at the desired outcomes. This may be a rare or challenging diagnosis, a particularly complicated surgical procedure, etc.

Of importance in this distinction is the position a test has on the continuum from routine to maximum. This will largely depend on the context in which the test is applied, the objectives of the test and the outcomes being measured. Secondly, the objectives of the program shall also determine how to shift the routine towards the maximum in the day-to-day activities of the practitioner



Figure: The continuum of assessment formats, from fixed-choice tests to complex-performance assessment.

Table 4: Summary of various categorizations of assessments in Higher Education.

Nature of assessment:
- Maximum performance. Function: determines what individuals can do when performing at their best. Illustrative instruments: aptitude tests, achievement tests.
- Typical performance. Function: determines what individuals will do under natural conditions. Illustrative instruments: attitude, interest and personality inventories; observational techniques; peer appraisal.

Form of assessment:
- Fixed-choice test. Function: efficient measurement of knowledge and skills; indirect indicator. Illustrative instruments: standardized multiple-choice test.
- Complex-performance assessment. Function: measurement of performance in contexts and on problems valued in their own right. Illustrative instruments: hands-on laboratory experiments, projects, essays, oral presentations.

Use in classroom instruction:
- Placement. Function: determines prerequisite skills, degree of mastery of course goals, and/or best mode of learning. Illustrative instruments: readiness tests, aptitude tests, pretests on course objectives, self-report inventories, observational techniques.
- Formative. Function: determines learning progress, provides feedback to reinforce learning, and corrects learning errors. Illustrative instruments: teacher-made tests, custom-made tests from textbook publishers, observational techniques.
- Diagnostic. Function: determines causes (intellectual, physical, emotional, environmental) of persistent learning difficulties. Illustrative instruments: published diagnostic tests, teacher-made diagnostic tests, observational techniques.
- Summative. Function: determines end-of-course achievement for assigning grades or certifying mastery of objectives. Illustrative instruments: teacher-made survey tests, performance rating scales, product scales.

Method of interpreting results:
- Criterion referenced. Function: describes student performance according to a specified domain of clearly defined learning tasks (e.g., adds single-digit whole numbers). Illustrative instruments: teacher-made tests, custom-made tests from test publishers, observational techniques.
- Norm referenced. Function: describes student performance according to relative position in some known group (e.g., ranks 10th in a classroom group of 30). Illustrative instruments: standardized aptitude and achievement tests, teacher-made survey tests, interest inventories, adjustment inventories.

in our case. This shift is paramount on the road to competence.

Another distinction applied to the methods of assessment is based on the continuum from fixed-choice tests to complex-performance assessment. At the far end of the continuum are the various formats of the Multiple Choice Question, also known as objective selected-response test items, including the Extended Matching and True/False varieties. These tests are highly efficient because students can respond to a large number of questions relatively quickly, thereby covering a large area of the curriculum over a short period of time with high validity, reliability, efficiency and feasibility. Objectivity and comprehensiveness are more important to the test results than the use of machines, though both certainly improve efficiency.

Major problems associated with fixed-choice tests are, firstly, the emphasis on low levels of knowledge at the expense of problem-solving and conceptual skills and, secondly, according to Resnick and Resnick,8 that such tests drive instruction towards the accumulation of facts and procedures rather than the construction of knowledge through understanding and analysis.

The last few decades have seen a paradigm shift in higher education in general towards standard-setting, quality control and quality assurance, and outcome-based, competency-oriented assessment. This paradigm shift has been reflected in assessment through the construction of multidimensional, multilevel and complex performance assessment techniques, including written essays, Objective Structured Clinical Examinations (OSCEs) and creative exercises that require analysis, comprehension and conjugation of various cognitive, psychomotor and affective elements.

Falling between these extremes are tests that require short answers, like the short-essay questions or structured-answer questions. Interestingly, none of the examples provided here against the categories of the continuum can be depicted as stereotypes. A long essay question, on account of the way it has been constructed, can very well fall short of assessing higher-order cognitive processes, while a short essay question, when constructed with care, can, through application and creativity in its design, extract the same in its response. The same can be said of fixed-choice selected-response test items, which, when provided with a multidimensional problem-solving scenario, may require higher-order thinking to elicit a response.

Complex-performance assessments can be built into authentic assessments in vitro, like the OSCE, or into the more authentic, real-time, in vivo workplace assessments.

One of the drawbacks of performance-based complex assessment models is the subjectivity they bring into the assessment process. Assessment of performance at levels of competence requires scoring by competent and qualified assessors. Training these assessors in applying objectivity and a criterion-referencing system in assessment can obviate a number of these concerns.

Table 4 provides a summary of the various categorizations of assessment used in higher education, with examples of the test instruments applied.6 Of particular note are the multifaceted nature of assessment and the multiple uses of the instruments, depending on how they are constructed and the context in which they are used.

REFERENCES
1. Bloom BS. Taxonomy of Educational Objectives, Handbook 1: The cognitive domain. New York: McKay; 1956.
2. Vinson C. Learning Domains and Delivery of Instruction [internet]. 2011 [cited 2012 Feb 13]. Available from: <http://pixel.fhda.edu/id/learning_domain.html>
3. Ministry of Education, New Zealand. Assessment: policy to practice. Wellington, New Zealand: Learning Media; 1994.
4. Wikipedia, the free encyclopedia. Summative assessment [internet]. 2011 [cited 2012 Feb 13]. Available from: <http://en.wikipedia.org/wiki/Summative_assessment>
5. Airasian PW, Madaus GJ. Functional types of student evaluation. Meas Eval Guid. 1972; 4: 221-233.
6. Linn RL, Miller MD. Measurement and Assessment in Teaching, 9th edn. New Jersey: Davis KM; 2005.
7. Cronbach LJ. Essentials of Psychological Testing, 5th edn. New York: Harper and Row; 1990.
8. Resnick LB, Resnick DP. Assessing the thinking curriculum: New tools for educational reform. In: Gifford BR, O'Connor MC (eds.). Changing Assessment: Alternative Views of Aptitude, Achievement and Instruction: 37-75. Boston: Kluwer Academic Publishers; 1992.

