

Assessment For Learning: Designing
Effective Learner Centred Assessment







Contributing Lecturer
David Jennings
UCD Teaching and Learning

Workbook

The aim of this workbook is to provide a series of resources, methods and approaches
to the area of assessment design from a student perspective, i.e. enabling assessment for
learning and supporting learner autonomy.

The workbook is not exhaustive, but attempts to focus on core issues and needs. The
additional literature and web references provide further readings and activities if so required.

Key areas covered include:
- Alignment and Effective Testing
- Principles of good formative assessment and feedback
- Assessment methods
- Peer and self assessment

Around each themed area you will find worksheets and activity lists, plus substantial
references to original and core literature.

You are free to edit, adapt and copy this workbook and present it to your students and
colleagues; however, attribution must be given to the original
authors (this work is licensed under the Creative Commons
Attribution Only Licence, see http://creativecommons.org/):
Jennings, D. 2014. Assessment For Learning: Designing Effective Learner
Centred Assessment. UCD Teaching and Learning, UCD, Ireland.

Further workbooks are available; for information contact David.Jennings@ucd.ie

Please note the materials in this workbook are based on the contents of UCD Teaching and
Learning's Open Educational Resources website; for further details and online activities visit:
www.ucdoer.ie
Table of Contents

Introducing Assessment for Learning ................................................................. 4
Types of Assessment ............................................................................................ 5
Refocusing Assessment for Learning ................................................................... 6
Alignment & Effective Testing .............................................................................. 7
Matching Learning Outcomes to Assessment Types ............................................ 8
Introducing Learning Contracts .......................................................................... 10
Principles of Good Formative Assessment and Feedback ................................... 12
Assessment Criteria ............................................................................................ 14
My Assessment Protocol ..................................................................................... 16
Assessment Methods .......................................................................................... 17
General Assessment Methods ............................................................................ 18
Less Familiar Assessment Methods ..................................................................... 19
Assessment Methods in Class ............................................................................. 20
Self and Peer Assessment ................................................................................... 22
Grading Group Work: Process vs Product ........................................................... 23
Designing Assessments ....................................................................................... 24
Example Assessment Plan ................................................................................... 25
Example Assessment Modes: Concept Maps ...................................................... 26
Example Assessment Modes: Oral Presentations ................................................ 27
Example Assessment Modes: Rubrics .................................................................. 28
Select Bibliography .............................................................................................. 29

Introducing Assessment for Learning

Some of the key purposes of assessment are: to enable the communication of the
achievement and subsequent status of students during their programme of learning;
to provide a means of self-evaluation and information pertaining to such; to identify
student placement within educational paths and/or programmes; to address the
evaluation and effectiveness of instructional programmes; and to simply motivate
the learner. The premise of assessing for learning is to provide a platform within
which the learner is clearly supported in achieving the designated outcomes.

Three Purposes of Assessment and the Assessment Shift (based on Mutch A. & Brown G., 2002)

Student Learning
- Provide feedback to improve student learning
- Motivate students
- Diagnosis of students' strengths and weaknesses

Certification
- To pass/fail a student
- To grade/rank
- To licence to proceed/practice

Quality Assurance
- Provide feedback to lecturers
- Improve teaching
- Monitor standards over time

Summative Assessment (Assessment of Learning)
Is the assessment which provides overall and finite evidence of the achievement of
students and of what they know, understand and can do, by assigning a value
(often quantitative) to what the student achieves.

Formative Assessment (Assessment for Learning)
Is the assessment that provides feedback to learners in order to help them learn, and
feedback to teachers to enable them to decide how a student's learning should be
taken forward.

Types of Assessment

There are four core types of assessment one may employ; only summative
assessment may be considered as not providing an immediate opportunity for the
learner to respond and reflect:

Diagnostic:
Used as a low-stakes assessment early on, this form may offer the learner an insight
into their own needs and goals pertaining to a particular module/session, and invoke
a level of preparedness for the activities and outcomes to be addressed.

Formative:
Used pro-actively as a means to assess learning, this form provides the ability to
engage the learner throughout their pedagogical journey. Ideally it espouses the
concept of feed-forward: enabling the learner to respond to their assessment
feedback in a positive (learned) manner, preparing them for the next stage/phase
of their programme.

Integrative (Crisp, 2012):
This form enables the learner to embrace their reflexive nature and captures the
capabilities associated with lifelong learning skills. A core component is that a
learner is rewarded for their meta-cognitive abilities rather than their declarative
knowledge.

Summative:
Used towards the end of a learning period, these assessments are collated to
determine whether a learner has fulfilled the specified learning outcomes and,
consequently, achieves accreditation.

Refocusing Assessment for Learning

Written exams are being replaced by more continuous assessment and coursework.
There is a move towards more student involvement and choice in assessment.
Course outlines have become more explicit about the expectations in assessment.

- Group assessment is more frequently used (in line with the shift in emphasis
within the curriculum from competition between students towards
collaborative learning between students.)
- An understanding of process is now seen as at least equally important as a
knowledge of facts. (In line with the general shift towards a process-based,
rather than product-based, curriculum.)
- Student focused 'learning outcomes' have begun to replace more teacher
orientated 'objectives'. The focus is more on what the student will learn
rather than what the teacher plans to teach. (This is in line with more student
led approaches in the curriculum generally).


Trends in Assessment (from → towards):
- Written exam → Coursework
- Tutor led → Student led
- Implicit criteria → Explicit criteria
- Competition → Collaboration
- Product assessment → Process assessment
- Objectives → Outcomes
- Content → Competencies


(Based on Brown G., Bull J. & Pendlebury M., 1997)

Alignment & Effective Testing

If the aims are unclear then the system falters. Clear and realistic outcomes provide
learners with a good guide of what is required to be learnt (and how this may be
achieved through suitable learning opportunities). They also provide the lecturer with a
direct guide and/or framework for how one may deliver and teach the programme.

Effective assessment methods and tasks are related to the learning outcomes and
the methods and opportunities employed in learning. If written criteria are too vague,
it is difficult both for the assessor to ensure consistency of judgment and for
students to fulfil the demands of the assessment task. Without close links between
feedback, criteria and the assessment tasks, lecturers cannot help students to
achieve the learning outcomes of a course or a programme.

An integrated approach to assessment design is required to maximise the
beneficial impact for the learner, the academic, and the evaluation of a programme.

Aims

Intended Learning Outcomes

Methods of Teaching & Learning
(lesson plans, labs, tutorials, autonomous learning,
in-class activities, assignments etc)

Assessment methods and tasks
(Incl Feedback/Feed forward)

Criteria

Marking & Feedback
Evaluation
(Summative)
Matching Learning Outcomes To Assessment Types

Different assessments drive different types of learning. This table offers a selection
of alternative modes of assessment, enabling students to work to their strengths
and thus providing an inclusive approach to the assessment regime.

Types of learning (learning outcomes), what is required from students, and examples of assessment:

Thinking critically and making judgments
- What is required from students? Development of arguments, reflection, judgment, evaluation
- Examples of assessment: Essay; Report; Book review; MCQ/SAQ

Solving problems / developing plans
- What is required from students? Identify problems, define problems, analyse data, review, design experiments, plan, apply information
- Examples of assessment: Problem scenario; Group work; Work-based problem; Analyse a case; Conference paper (or notes for a conference paper plus annotated bibliography)

Performing procedures and demonstrating techniques
- What is required from students? Take readings, use equipment, follow laboratory procedures, follow protocols, carry out instructions
- Examples of assessment: Demonstration; Role play; Make a video (write script and produce/make a video); Produce a poster; Lab report

Demonstrating knowledge and understanding (can be assessed in conjunction with the above types of learning)
- What is required from students? Recall, describe, report, identify, recognise, recount, relate, etc.
- Examples of assessment: Written examination; Oral examination; MCQs; Essays; Reports; Short answer questions; Mini tests

Managing / developing yourself
- What is required from students? Work co-operatively and independently, be self-directed, manage time, manage tasks
- Examples of assessment: Learning journal; Portfolio; Learning contracts; Self-evaluation; Group projects; Peer assessment

Designing, creating, performing
- What is required from students? Design, create, perform, produce, etc.
- Examples of assessment: Design project; Portfolio; Presentation; Performance

Assessing and managing information
- What is required from students? Information search and retrieval, investigate, interpret, review information
- Examples of assessment: Annotated bibliographies; Use of bibliographic software; Library research assignment; Data-based project

Communicating
- What is required from students? Written, oral, visual and technical skills
- Examples of assessment: Written presentation; Oral presentation; Discussions / debates / role plays; Group work
(Adapted from Nightingale et al., 1996)


Task:
Perform a spot check on the modes of assessment above:
- Are you aware of the types of assessment being used by your peers?
- Do you offer a range of assessment to account for student preference / ability /
learning style?

Introducing Learning Contracts

A learning contract is an agreement negotiated between a learner and a
supervisor to ensure that certain activities will be undertaken in order to
achieve an identified learning goal; specific evidence will be produced to
demonstrate that the goal has been reached.

Learning contracts have grown in popularity as part of the changing trend in
assessment methods from tutor-centred to more student-centred
approaches, and reflect the move towards more self-directed learning.
Knight (2002) describes how there are many alternative terms for learning
contracts, including 'learning agreements' and 'negotiable learning agreements'.

An essential component of Learning Contracts is that they centre around the
gaps in the knowledge of the student and what it is they need / wish to learn.

A learning contract usually has a written record of:

- A series of negotiated learning goals/objectives, set between the student and
the tutor/expert
- The strategies and resources by which these goals can be met
- The evidence which will be presented to show that the objectives have been
achieved, and how it will be assessed
- A time scale for completion


Task:

Take a moment to individually complete the example learning contract, applying it
to this session. Could you see your learners using this?

My Learning Contract

Student: _______________________________________________________
Assessor: _______________________________________________________
Date due: _______________________________________________________

Intended Outcomes
[e.g. Learning Goals identified by or in conjunction with the tutor]





Resources and Strategies
[e.g. What the learner will do to achieve these outcomes]




Evidence
[e.g. How the learner will demonstrate achieving the outcomes]




Assessment Criteria
[e.g. Negotiated or standard]





Principles of Good Formative Assessment and Feedback

Feedback plays an important role in teaching and learning - learners need prompt
feedback to learn effectively (Gibbs, 2007). In a meta-analysis of studies into
student achievement, feedback was reported to be the single most powerful
influence (Hattie, 1987), a finding supported by Black and Wiliam's (1998) review of
formative feedback on learning.

Consider the following queries and how you (may) address them (based on the
Re-engineering Assessment Practices in Higher Education project, http://www.reap.ac.uk,
accessed 01.14):


1. Help clarify what good performance is (goals, criteria, standards).
a. To what extent do students in your course have opportunities to engage
actively with goals, criteria and standards, before, during and after an
assessment task?

2. Encourage time and effort on challenging learning tasks.
a. To what extent do your assessment tasks encourage regular study in and out
of class and deep rather than surface learning?

3. Deliver high quality feedback information that helps learners self-correct.
a. What kind of teacher feedback do you provide, and in what ways does it help
students self-assess and self-correct?











4. Provide opportunities to act on feedback (to close any gap between current
and desired performance)
a. To what extent is feedback attended to and acted upon by students in your
course, and if so, in what ways?

5. Ensure that summative assessment has a positive impact on learning.
a. To what extent are your summative and formative assessments aligned, and do
they support the development of valued qualities, skills and understanding?

6. Encourage interaction and dialogue around learning (peer and teacher-
student).
a. What opportunities are there for feedback dialogue (peer and/or tutor-student)
around assessment tasks in your course?


















Assessment Criteria
Reliability and Ownership
The development of specific assessment criteria removes the notion that only an
expert may fully understand the process of achievement and attainment. It
enables the learner to chart their assessment process and removes the potential for
misunderstanding; in effect, it allows them to gain further insight into their own
acquisition of feedback and the assessment activity itself.

For the academic, the presence of clear and transparent assessment criteria will
enhance the consistency of the assessment protocol, both in what it purports to
measure and how it is subsequently measured, thus ensuring fairness and
objectivity.

The development of self and peer assessment may have a direct impact on the
objectivity and reliability of the process, by removing the single
assessor and enabling judgments and grades to be made communally.
However, this needs to be carefully moderated and managed in its deployment;
the question arises: how much moderation is required to engage the students to
take ownership and thus learn more in the process?





Assessments should be both valid and reliable. Validity describes the extent
to which assessment measures what it purports to measure, and reliability
that it achieves this consistently (Gronlund and Linn, 1990)
(Source: Jennings, D., McMahon, T. and Surgenor, P. 2013. Assessment in Practice. UCD Teaching and Learning, UCD, Ireland.)

Validity: Is the assessment assessing what it is meant to assess?
Reliability: Does the assessment method consistently assess what is being assessed?
Orsmond (2004) provides a framework for enabling groups to develop and come to
a consensus on the formation of criteria; these are structured written schedules and
structured group activities. The core focus is to provide a means by which the
collective cohort are able to take ownership and understand the process of devising
a set of criteria. The structured activities allow the learners the opportunity to reflect
on and review their assignments, individually, in pairs, triads etc. The impetus here is
to ensure they are able to comment constructively upon their own and their peers'
work in relation to the criteria: why they may have lost marks, and how they may do
better.

The addition of a constructive commentary to any proposed criteria or rubric
increases the objectivity and allows the marker to qualify and define why it may be
that a certain grade/mark was given. In this way judgements are made in a more
critically analytical manner and the active engagement in descriptive writing allays
any overly subjective commentary.

The Boud and Falchikov (1989a & b) studies indicated that there is a direct
correlation between what and how a student may mark and what and how a staff
member marks, and this in turn may be dependent on the structured implementation,
and clear and transparent use, of a set of criteria from which to work. An addendum
to this point is the fact that the process of engagement is equally if not more important
than the making of (assessment) judgments (Topping, 2003); herein lies the integral
value of self and peer assessment: learners invariably improve their grades over time
and perform better in summative examinations (Falchikov, 1995; Walser, 2009).
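
By way of illustration, the short sketch below computes the kind of correlation between tutor marks
and self-assessed marks that these studies examine. It is a minimal, hypothetical example: the marks
and the helper function are invented for this workbook, not data or code from the studies cited.

    # Illustrative only: invented marks, not data from Boud & Falchikov (1989a & b).
    # Computes the Pearson correlation between tutor marks and self-assessed marks.

    def pearson(xs, ys):
        n = len(xs)
        mean_x = sum(xs) / n
        mean_y = sum(ys) / n
        cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
        var_x = sum((x - mean_x) ** 2 for x in xs)
        var_y = sum((y - mean_y) ** 2 for y in ys)
        return cov / (var_x * var_y) ** 0.5

    tutor_marks = [62, 55, 71, 48, 80, 66]   # hypothetical tutor marks (%)
    self_marks = [65, 58, 70, 52, 78, 72]    # the same students' self-assessed marks (%)

    print(f"Correlation between tutor and self-assessed marks: {pearson(tutor_marks, self_marks):.2f}")

A coefficient close to 1 would indicate that the self-assessed marks rank and scale in much the same
way as the tutor's, which is the sense of "direct correlation" referred to above.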

Task:

Consider the following queries, paying particular heed to Q2. Reflect on your role and the
expectations of your learner/student (and yourself).

My Assessment protocol
1. How is my session/module assessed?


2. How long does this take (in hrs): to prepare, offer direction, offer feedback,
implement and correct?


3. Are there known issues re: assessment for the learners, e.g. parts they do not
like(!), parts they do not do, or do not do well in, etc.?


4. Does all the assessment map to the session objectives or module outcomes?
And are they fit for purpose?


5. Do they demonstrate the learner's ability to achieve said outcome/s?


6. Can these (outcomes) be achieved in another manner?


7. Are there additional outcomes as a result of the assessment protocol(!)?


8. Is there too much assessment? (and what is too much?)


9. Is the assessment formative even when it is summative?


Assessment Methods

Opportunities for innovation in assessment are boundless, though assessment
usually reverts to one (or all) of the 'Big Three' (essay questions, multiple choice
questions, or reports).

Continually using the same small range of assessment methods not only results in
the same skill set being assessed over and over, but also serves to disadvantage
those individuals who find these methods more challenging. Over-reliance
encourages a surface approach to learning, with the student focusing on strategies
to pass rather than mastering the subject matter.

The assessment methods should be aligned to ensure that the skills and abilities
developed by the students are assessed in a manner consistent with the design and
delivery of the course as a whole. The choice of assessment method should
therefore be influenced by the learning outcomes and the type of skills you are
seeking to engender in the learners. Thus a course designed to teach problem-
solving skills and group interaction should have a problem-solving-type assessment
rather than an essay on how a problem could be overcome.


Task:

Review the following;
- Note how many you have not heard of;
- Note how many you use in practice;
- And most importantly (of those in use), which methods enable assessment for
learning?
General Assessment Methods

Analyse a case
Annotated bibliography
Applied problem (authentic)
Applied task (authentic)
Artefacts
Aural examination
Book review
Case studies
Comment on an article's theoretical perspective
Competence checklists
Concept maps
Critical incident analysis
Demonstration
Devise an encyclopedia entry
Diagram sheets
Direct observation
Discussion/debate/critique
Dissertation
Draft a research bid to a realistic brief
Essay
Exhibition
Eye-witness testimonials
Field work
Group work
Journal (reflective, practical / oral, video)
Lab report
Learning contract
Letter of advice/approach to ....
Logs/diaries/reflective journals
Make a video (write script/produce/make a video)
Model construction
Multiple choice questions
Objective Structured Clinical Examinations
Observation of real/simulated professional practice
Open book
Oral examination / presentation
Participate in a 'Court of Enquiry'
Participative online discussion
Peer assessment
Performance
Portfolios/e-portfolio
Posters
Prepare a committee briefing paper
Prepare a manual for a particular audience
Present a case for an interest group
Problem scenario
Produce an A - Z of ...
Project (design, structure, outputs etc.)
Report
Research enquiry
Role play
Rubric design (for assessment)
Self-assessment
Short answer questions
Simulated interviews
Structured summaries
Two-part assessment
Viva/lab defence
Web page creation
Website review
Work-based assessment
Work-based problem
Write a newspaper article for a foreign newspaper
Write an answer to a client's question
Written examination
Written presentation (essay, report, reflective paper etc.)
Less Familiar Assessment Methods
Competence checklists: Used in a number of professions to ensure particular abilities have been
undertaken and assessed. A grid in which students identify when they observed an activity,
rehearsed it, estimated themselves ready to be assessed, the date the assessment took place,
the outcome, the tutor's name, and any further comments.

Case studies: Used to enable students to demonstrate skills learned in professional contexts in
other settings. This can involve requiring them to provide recommendations or solutions, or to
write their own case studies based on their own experiences.

Logs/diaries/reflective journals: All used where students are marked on practice and reflection.
Can range from simple logs/checklists to more detailed reflective journals.

Portfolios: Widely used to provide evidence of competence from practice. A good method to
help students assess their own level of competence, by asking them to select evidence
that best demonstrates their ability. Strong guidelines are required, though, to prevent it from
becoming a collection of random and irrelevant information.

Observation: This refers to the observation of skills in practice, watching a professional and
learning from the experience. Can be simple checklists or more detailed (requiring
subjective responses).

Artefacts: Tend to be physical products of students' professional practice (e.g. art work, models,
computer programmes, dental bridges etc.). Important to have clear criteria established
beforehand. A good idea to have sample artefacts before students begin, and frequent
checks throughout (so time and resources aren't wasted).

Eye-witness testimonials: Can form part of a portfolio, or a separate way to evidence technical
competence. May be a statement by a tutor or placement supervisor who has been
observing/responsible for the student.

In-tray exercises: Students are presented with a dossier of papers which they have an opportunity
to peruse before the question is presented. The dossier includes a range of information (some
relevant, some not, some a total red herring) but the student must use it to solve a real-world
problem. Can last for an hour or all day, and may include the option to consult with
other students if desired.

Objective Structured Clinical Examinations: Involve students undertaking a set of prescribed
tasks (for example, 9 in 90 minutes) at a series of assessment stations often placed around a
large room. These provide opportunities to demonstrate their skills in a range of areas in a
practical way.

Posters/presentations: Used by individuals or groups to demonstrate work undertaken individually
or collectively. Can be theoretical or reporting back on a practical activity. Useful because they
can be used in conjunction with peer assessment.

Orals: Can be used to interrogate the understanding that underlies practice. Serve to introduce
an element of performance into assessment, though due consideration should be given
to criteria and the weighting of marks.

Learning contracts: Used to involve students in setting their own learning goals. Generally have
four stages: entry profiling, needs analysis, action planning, evaluation. Levels of relevant
competence are set out at the beginning of the programme, and students then agree upon
how best to develop these to satisfy the outcomes.
Assessment Methods in Class
In-class quizzes/problem sheets
Develop quizzes/problem sheets for groups of students in class, where they can learn from other students and
monitor their progress against others.

One-minute test
Stop the class 5 minutes before the end (or at the beginning), asking: what is the most important thing you have
learned, and what important questions remain unanswered? Use the results to adapt the next lecture and clarify
questions next time. Some marks can be given for participation in this activity.

Muddiest point
Invite students to describe what they didn't understand and what they think might help.

Exam evaluations
Using a test you (have) use(d), invite students to evaluate how well it measures their knowledge or competencies.

Use of clickers, show of hands/cards
Similar to the quiz, but a more individual activity: use clickers in class (or a show of hands/cards) to answer
questions. If you follow this with a quick discussion in pairs, students will get feedback and learn other students'
rationale for their answers.

In-class discussions
Allow opportunity for quick in-class discussion in pairs on more complex material, or to discuss the application of
the material to their programme.

Application article
During the last 15 minutes of class, invite students to write a short news article about how a major point applies to a
real-world situation.

Using student-generated on-line material for in-class discussion
If students contribute on-line prior to a class, use some of their material/questions to refer to in the lectures, and
address common misconceptions, errors, etc.


Chain notes
Pass around a large envelope with a question about the class content. Each student writes a short answer, puts it
in the envelope, and passes it on.

In-class feedback on assignments/assessment
Give feedback to the whole class on common errors in continuous assessment assignments (or previous years' exams).

Student-generated test questions
Divide the class into groups and assign each group a topic on which they are each to write a question and answer
for the next test. Each student should be assured of getting at least one question right on the test!

Problem-solving activities
Use established, or create, on-line problem-solving activities. Students can attempt these multiple times to get
correct answers and as such learn from the process. A grade can be given for participation.

Participation in discussion threads
Provide prompts to aid the construction and evaluation of knowledge.

Use of wikis
To facilitate the development of shared information.

Use of blogs / journals
To capture individual reflections / commentaries on procedural matters.

Word clouds
Use clouds to create a synthesis from plenary discussions.

Concept mapping
Collaboratively design a map to explain, identify, evaluate a particular theme.

Others:
Opinion polls, application cards, paraphrasing, news reporting, buzz groups, brainstorms, syndicates,
fishbowls, caption gap, reading rounds, predictions, etc.


Self and Peer Assessment

Self Assessment is concerned with a learner making a judgment upon their own
work, from essays to presentations, from grades to comments. It is the latter that is
most productive, whereby an individual is able to reflect on a process (e.g. the
design choices and composition that led to the construction of a paper) rather than
simply an end product (e.g. a report).

Peer Assessment is the process by which learners make assessment decisions
upon others' work. This may be done anonymously, randomly, individually or by
group. However, it is most effectively done collectively, whereby a peer group assesses
a given body of work or part thereof, and thus demonstrates the consistency and
validity of the marks awarded (or the contrary).

Each mode may be used either formatively or summatively, and may be best served
in a mixed mode, with one often supporting the other. Thus a self assessment may
inform an ipsative approach that is related to a group activity which is assessed
communally; the latter may have milestones whereby feedback is provided that
informs the individual and group direction before a summative grade is produced
and assigned.

Suffice to say, if a learner has become involved in the assessment process, there is
an immediate benefit - their understanding of the assessment criteria and what it is
to achieve these becomes that much clearer, deepening both their autonomy and
learning experience. Whatever unfolds, one thing is certain: there is a lot more
potential for feedback (though it may not be from the expert) that enables an
opportunity for reflection (individually) and action (collectively) to respond to the
learning interventions in a positive manner (Elwood & Klenowski 2002; Shepard
2000; Tan 2008).
Grading Group Work: Process Vs Product

There are three core ways that a group endeavour may be graded:
A group mark/grade
- this is normally most effective when it is the end product being assessed; in
effect it assumes that the group are working collectively.

A division of the activity
- this may be applied where there are a number of discrete elements to an
assignment, enabling the individual/s to take responsibility for a particular
component. This may then be marked by a tutor or collectively, inviting peers
to assess one another's successful completion of activities.

A division of the whole mark
- in essence this is similar to the above; however, the difference is that there is
a mark for the product and a mark for the process. Thus the tutor may assess the
product, and the peers may assess one another on the process or decide
how they wish to divide the remaining marks amongst the group.

The opportunity that presents itself throughout is the addition of a self-
assessment element, by means of a diary of the process, a reflection on the
outcomes, a meta-analysis of the product, etc.
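
To make the division of the whole mark concrete, the sketch below combines a tutor-assessed
product mark with peer-assessed process marks for each group member. The 70/30 weighting and
all figures are hypothetical, invented purely for illustration rather than a prescribed scheme.

    # Hypothetical illustration of splitting a group grade into product and process components.
    # The 70/30 weighting and all marks are invented for the example.

    PRODUCT_WEIGHT = 0.7   # tutor-assessed group product
    PROCESS_WEIGHT = 0.3   # peer-assessed individual contribution to the process

    group_product_mark = 68          # tutor's mark for the shared product (%)
    peer_process_marks = {           # average process mark awarded to each member by their peers (%)
        "Student A": 80,
        "Student B": 60,
        "Student C": 70,
    }

    for student, process_mark in peer_process_marks.items():
        final = PRODUCT_WEIGHT * group_product_mark + PROCESS_WEIGHT * process_mark
        print(f"{student}: product {group_product_mark}, process {process_mark}, final {final:.1f}")

Each member shares the product mark but earns a different final grade according to how their
peers rated their contribution to the process.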

Task:

After reviewing the following page:
- Devise / revise (referring to pg 16, My Assessment Protocol) your assessment
for a given session/module
- Use the template (as a guide) after the next page
Designing Assessments


Seven questions that lecturers might ask when designing an assignment are:

1. What are the outcomes to be assessed?
2. What are the capabilities/skills (implicit or explicit) in the outcomes?
3. Is the method of assessment chosen consonant with the outcomes and skills?
4. Is the method relatively efficient in terms of student time and staff time?
5. What alternatives are there? What are their advantages and disadvantages?
6. Does the specific assessment task match the outcomes and skills?
7. Are the marking schemes or criteria appropriate?

Common weaknesses to avoid
- The tasks do not match the stated outcomes
- The criteria do not match the tasks or outcomes
- The criteria are not known to students
- Students do not understand the criteria
- Overuse of one mode of assessment such as written examinations, essays, or closed
problems
- Overload of students and staff
- Insufficient time for students to do the assignments
- Too many assignments with the same deadline
- Insufficient time for staff to mark the assignments or examinations
- Absence of well defined criteria, so consistency is difficult to achieve
- Unduly specific criteria which create a straitjacket for students and make marking
burdensome for lecturers
- Inadequate or superficial feedback provided to students
- Wide variations in marking between modules and assessors and within assessors (self-
consistency)
- Variations in assessment demands of different modules

(Based on Brown G., Bull J. & Pendlebury M., 1997.)

Example Assessment Plan
Module Learning Outcome/s:




Session Learning Objectives:






How will the objective/s be assessed (criteria / level):






Chosen Assessment method/s:
Scope of Assessment (what is covered and how?):
Opportunity for feedback / forward









Example Assessment Modes: Concept Maps

A concept map is a visual representation of knowledge. The process enables one to organize
and structure information and the relationships between concepts. This may be done in a wholly
graphical manner, i.e. using images, photos, colour etc. to highlight differing concepts and
their linkages, or by identifying the key concepts by name or title, enclosing them in a
visual box and then providing connecting navigation to lesser concepts. A traditional concept
map provides a hierarchical representation of the information from the top down, whereas a
mind map may radiate from a central single concept only. Suffice to say, when creating a spider
map, systems map, concept map, mind map, flow chart, visual plan etc., each performs a task that
no ordinary collection of notes may encompass in a single sheet: a personal visualization of
knowledge and, from our and the students' perspective, of their key learning gaps, i.e. what it is
they may wish to focus on, reflect on, review and develop. In this way they may be used as a
tool to support and enhance learning.

A method of assessing concept maps proposed by Novak and Gowin in 1984 is based on
the components and structure of the map. This system awards points for:
- valid propositions (1 point each)
- levels of hierarchy (5 points for each level)
- number of branchings (1 point for each branch)
- crosslinks (10 points for each valid cross-link)
- specific examples (1 point for each example).
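
A minimal sketch of how this point scheme might be tallied is given below; the component counts
for the example map are invented purely for illustration.

    # Tally a concept-map score using the Novak and Gowin (1984) point scheme described above.
    # The counts for the example map are invented for illustration.

    POINTS_PER_ITEM = {
        "valid_propositions": 1,   # 1 point each
        "hierarchy_levels": 5,     # 5 points for each level
        "branchings": 1,           # 1 point for each branch
        "valid_crosslinks": 10,    # 10 points for each valid cross-link
        "specific_examples": 1,    # 1 point for each example
    }

    example_map_counts = {         # hypothetical counts from one student's map
        "valid_propositions": 12,
        "hierarchy_levels": 3,
        "branchings": 7,
        "valid_crosslinks": 2,
        "specific_examples": 4,
    }

    total = sum(points * example_map_counts[item] for item, points in POINTS_PER_ITEM.items())
    print(f"Concept map score: {total}")   # 12 + 15 + 7 + 20 + 4 = 58
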
Example Assessment Modes: Oral Presentations


Criteria Comments
Content
Articulates material at appropriate level
Delivers material fit for a professional context
Displays a knowledge of topic
Demonstrates independent research

Delivery
Well rehearsed, maintains appropriate timing
Focused presentation, succinct and articulate
Clear organized sequence of arguments and ideas
Is precise, relevant and avoids repetition of ideas

Interaction (and Response)
Is lively, dynamic and engages the audience
Demonstrates evidence of higher order skills
Able to expand and articulate on ideas and concepts


Example Assessment Modes: Rubrics
Criteria and level descriptors (Unacceptable / Acceptable / Good / Exemplary):

Clarity of purpose
- Unacceptable: Assessment criteria unclear and/or have significant overlap
- Acceptable: Assessment criteria identifiable, but not clearly defined or are inappropriate
- Good: Criteria are clear, appropriate and distinct
- Exemplary: Each criterion is distinct, clearly articulated and fully appropriate for the assignment/s, session/s or module

Distinction between levels
- Unacceptable: Little or no distinction can be made between levels of achievement
- Acceptable: Some distinction made, but not clear how
- Good: Distinction between levels is apparent
- Exemplary: Each level is distinct and progresses in a clear and logical manner

Reliability
- Unacceptable: Cross-scoring among faculty and/or learners often results in significant differences
- Acceptable: Cross-scoring among faculty and/or learners occasionally produces inconsistent results
- Good: There is general agreement between different scorers when using the rubric
- Exemplary: Cross-scoring of assignments using the rubric results in consistent agreement amongst scorers

Transparent guidance
- Unacceptable: Rubric is not available to learners
- Acceptable: Rubric shared and provides some idea of assignment / expectations
- Good: Rubric clearly referenced, used to introduce an assignment and/or guide learners
- Exemplary: Rubric serves as primary reference point for discussion and guidance for assignments as well as evaluation of assignment/s

Support of meta-cognition (awareness of learning)
- Unacceptable: Rubric is not available to learners
- Acceptable: Rubric shared but not discussed as part of what is being learned through the assignment / course
- Good: Rubric is shared and identified as a tool for aiding learners to understand what they are learning through the assignment/course
- Exemplary: Rubric regularly referenced and used to help learners identify the skills / competencies / knowledge they are developing throughout the course (and assignment/s)

Engaging learners
- Unacceptable: Learners not party to the development or use of the rubric/s
- Acceptable: Learners offered the rubric and may choose to use it for self reflection / assessment
- Good: Learners discuss the design of the rubric, offer feedback / input, and are responsible for the use of rubrics in peer and/or self evaluation
- Exemplary: Faculty and learners jointly responsible for the design of rubric/s, and learners use them in peer and/or self evaluation

Scoring: 0-10 = Needs improvement; 11-15 = Workable; 16-20 = Good; 21-24 = Exemplary
(Based on Mullinix, B. (2003) Rubric for Rubrics: A Tool for Assessing the Quality and Use of Rubrics
in Education, Monmouth University.)
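
For illustration only, the sketch below totals the six criteria and maps the total onto the scoring
bands above. It assumes each criterion is scored from 1 (Unacceptable) to 4 (Exemplary); both that
scale and the individual scores are invented for the example.

    # Hypothetical totalling of the rubric-for-rubrics scores; the per-criterion
    # 1-4 scale is an assumption and the scores below are invented.

    BANDS = [(21, "Exemplary"), (16, "Good"), (11, "Workable"), (0, "Needs improvement")]

    scores = {
        "Clarity of purpose": 3,
        "Distinction between levels": 3,
        "Reliability": 2,
        "Transparent guidance": 4,
        "Support of meta-cognition": 2,
        "Engaging learners": 3,
    }

    total = sum(scores.values())
    band = next(label for threshold, label in BANDS if total >= threshold)
    print(f"Total: {total}/24 -> {band}")   # 17/24 -> Good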

Select Bibliography
Anderson et al. (2001) A Taxonomy for Learning, Teaching and Assessing: A Revision of Bloom's
Taxonomy of Educational Objectives. Longman, London.

Biggs, J. (2003) Enriching large class teaching in Teaching for Quality Learning at University. 2nd Ed.
Maidenhead: Open University Press/Society for Research into Higher Education.

Boud, D (1988) in Boud David (ed) Developing Student Autonomy in Learning (2nd Edition) Kogan
Page, London.

Boud, D. (1995) Enhancing Learning Through Self Assessment, Kogan Page

Boud, D & Molloy, E. Eds (2013) Feedback in Higher and Professional Education. Routledge

Boud, D & Falchikov, N. (1989) Quantitative studies of student self-assessment in higher education: a
critical analysis of findings. Higher Education, 18, 529-549.

Brown, S. (2000). Institutional Strategies for Assessment. In, Assessment Matters in Higher
Education: Choosing and Using Diverse Approaches. Buckingham: SRHE and Open University Press.

Brown G., Bull J., Pendlebury M (1997) Assessing Student Learning in Higher Education. London:
Routledge.

Crisp, G. (2012). Integrative assessment: Reframing assessment practice for current and future
learning. Assessment and Evaluation in Higher Education, 37(1), 33-43.

Jennings, D, 2013. An Introduction to Self and Peer Assessment. UCD Teaching and Learning, UCD,
Ireland.

Jennings, D, McMahon, T and Surgenor, P. 2013. Assessment in Practice. UCD
Teaching and Learning, UCD, Ireland.

Ketteridge, S., Marshall, S. & Fry, H. (2002) The effective academic: a handbook for enhanced academic
practice. Kogan Page.

Knight, P. (2002) Learning Contracts. In, Assessment for Learning in Higher Education. Birmingham:
SEDA series, pp. 147-156.

Knight, P.T. (2001). Complexity and Curriculum: a process approach to curriculum-making. Teaching
in Higher Education, 6(3), 369-381.

Mutch A, & Brown G (2002) Assessment Series No 2: A Guide for Heads of Department. York:
Learning and Teaching.

O'Neill, G., Huntley-Moore, S., Race, P. Eds. (2007) Case Studies of Good Practices in Assessment
of Student Learning in Higher Education. Dublin: AISHE.

Race, P. (2001) The Lecturer's Toolkit: A Practical Guide to Learning, Teaching and
Assessment. London: Kogan Page Ltd.

Rust, C., Price, M. & O'Donovan, B. (2003): Improving Students' Learning by Developing their
Understanding of Assessment Criteria and Processes, Assessment & Evaluation in Higher Education,
28:2, 147-164.

Stefani L.A.J. (1998) Assessment in Partnership with Learners, Assessment and Evaluation in Higher
Education 23 (4) pp 339-350

Tan, K.H.K. (2008). Qualitatively different ways of experiencing student self-assessment. Higher
Education Research & Development, 27(1), 15-29.

Topping, K. (1998) Peer assessment between students in colleges and universities. Review of
Educational Research 68: 249-276.

UNESCO. (2005). Guidelines for inclusion: Ensuring access to Education for All. Paris: UNESCO
http://unesdoc.unesco.org/images/0014/001402/140224e.pdf

Walser, T.M. (2009) An Action Research Study of Student Self-Assessment in Higher Education.
Innovative Higher Education, 34: 299-306.

Yorke, M. (2003) Formative assessment in higher education: Moves towards theory and the
enhancement of pedagogic practice. Higher Education, 45: 477-501.






End of Workbook
