
Developing Assessment Criteria and Rubrics

13 March 2014
Dr Geraldine O’Neill & Leone Gately
UCD Teaching & Learning
http://tinyurl.com/rubricsworkshop13March
Workshop Overview

• What and Why of Rubrics
• Types of Rubrics (Analytical vs. Holistic)
• Examples of Assessment Criteria & Rubrics
• Group Task: Create a Rubric for your Assessment
• Demo: how to use rubrics in Blackboard
• Guidelines for Developing Rubrics
What are Rubrics?
Why use Rubrics?
Why Use Rubrics?

• A way to provide feedback
• Defines the characteristics of a high-quality assignment
• Establishes a range of performance categories
• Helps students understand expectations
• Provides students with a way to evaluate their own performance (self-assessment, reflection)
• Takes the ‘guess-work’ out of grading
Objectives of Rubrics

• Reliability
• Validity
• Formative Assessment
• Transparency

(Refer to Principles & Purpose of Assessment in Appendices)

DIFFERENCE BETWEEN ANALYTIC AND HOLISTIC RUBRICS/CRITERIA

Analytic Grading (Sadler, 2009)

Criteria A + Criteria B + Criteria C = Correct Response

Criteria are preset by staff or with students. Analytic grading has developed over the last 50 years to address the issue of transparency and accountability to students.

Advantages:
• Reliability
• Transparency (detailed feedback to students)
• Objectivity

Disadvantages:
• Validity: no single correct answer in complex topics
• The sum of the parts is not always the whole
• Time consuming
Analytical Rating Scale

Criteria | Max. Score | Weighting | Student A Score | Weighted Score
A        | 10         | 10        | 5               | 5
B        | 10         | 10        | 2               | 2
C        | 10         | 20        | 8               | 16
D        | 10         | 10        | 5               | 5
E        | 10         | 10        | 4               | 4
F        | 10         | 20        | 7               | 14
G        | 10         | 10        | 4               | 4
H        | 10         | 10        | 8               | 8
Total    |            | 100%      |                 | 58%
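The weighted totals in the table above follow a simple rule: a criterion's contribution is the student's raw score scaled up to that criterion's weighting. A minimal sketch of the arithmetic (function and variable names are illustrative, not from the workshop materials):

```python
def weighted_total(rows):
    """Each row: (criterion, max_score, weighting, student_score).
    A criterion's weighted score = student_score * weighting / max_score."""
    return sum(student * weight / max_s for _, max_s, weight, student in rows)

# Rows from the Analytical Rating Scale table above.
rubric_rows = [
    ("A", 10, 10, 5), ("B", 10, 10, 2), ("C", 10, 20, 8), ("D", 10, 10, 5),
    ("E", 10, 10, 4), ("F", 10, 20, 7), ("G", 10, 10, 4), ("H", 10, 10, 8),
]
print(weighted_total(rubric_rows))  # 58.0, i.e. Student A's 58%
```

Note how criteria C and F count double: a score of 8/10 on C becomes 16 of the available 20 weighting points.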
Example: Analytical Rubric – Research (De Toro, 2007)

Levels of Performance

Criteria            | Weight | Level 1                         | Level 2                                 | Level 3
Number of Sources   | x1     | 1-4                             | 5-9                                     | 10-12
Historical Accuracy | x3     | Lots of historical inaccuracies | Few inaccuracies                        | No apparent inaccuracies
Organization        | x1     | Cannot tell from which source information came | Can tell with difficulty where information came from | Can easily tell which sources information was drawn from
Bibliography        | x1     | Bibliography contains very little information  | Bibliography contains most relevant information      | All relevant information is included
Holistic Grading

The whole work is matched to a grade, with rich and indicative grade descriptions.

Advantages:
• Encourages intuitive expert judgment
• Validity
• When used with support, students can develop the skill of self-judgment (become expert judges)

Disadvantages:
• Reliability
• Needs an expert judge
• Transparency

A balance of both holistic and analytic approaches is needed.
Examples of Assessment Criteria/Rubrics (see workshop handout)
• Group Participation (analytic rubric)
• Participation (holistic rubric)
• Design Project (analytic rubric)
• Critical Thinking (analytic rubric)
• Media and Design Elements (analytic rubric;
portfolio)
• Writing (holistic rubric; portfolio)
Comprehensive validity & reliability studies have been done on:

Crotwell Timmerman, B.E., Strickland, D.C., Johnson, R.L. and Payne, J.R. (2011) Development of a ‘universal’ rubric for assessing undergraduates’ scientific reasoning skills using scientific writing, Assessment & Evaluation in Higher Education, 36(5), 509–547.
Parts of a Rubric

• Criteria/Dimensions (rows)
  Elements that characterise good performance of the task

• Descriptors
  Specify the meaning of each criterion and describe the levels of performance

• Levels of Mastery/Scales (columns)
  Numerical (e.g. 1-5, or actual point values)
  or qualitative, e.g.:
  - exemplary, acceptable, unacceptable
  - distinguished, proficient, basic, unacceptable
  - novice, apprentice, expert
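The three parts above fit together as a grid: levels form the columns, criteria form the rows, and each cell holds a descriptor. A hypothetical sketch of that structure (names are illustrative, not from the workshop; the example descriptors are taken from the De Toro Organization row earlier):

```python
# Hypothetical representation of a rubric's parts.
rubric = {
    # Levels of Mastery / Scales (columns)
    "levels": ["novice", "apprentice", "expert"],
    # Criteria / Dimensions (rows), each with one descriptor per level
    "criteria": {
        "Organization": [
            "Cannot tell from which source information came",
            "Can tell with difficulty where information came from",
            "Can easily tell which sources information was drawn from",
        ],
    },
}

# Sanity check: every criterion needs a descriptor for every level.
for criterion, descriptors in rubric["criteria"].items():
    assert len(descriptors) == len(rubric["levels"]), criterion
```

The sanity check mirrors the grid property: a rubric is incomplete if any criterion lacks a descriptor at some level of mastery.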
Group Task:
Create a Rubric for your Assessment (45 mins)

Step 1: Choose an assessment method, e.g. essays, lab work, discussion boards, presentations, e-portfolios, blogs, etc.

Step 2: Identify 3 critical criteria you want to evaluate (rows).

Step 3: Identify a scale (levels of mastery) of at least 3 levels (columns).

Step 4: For each criterion, describe the skills/knowledge/behaviours that represent each level of quality (cells).
Rubrics in Blackboard

Demo: how to create a Rubric in Blackboard

Info on using Rubrics in Blackboard:
https://help.blackboard.com/en-us/Learn/9.1_SP_10_and_SP_11/Instructor/040_Student_Course_Experience/Student_Performance/Rubrics
Online Rubric Design Tools

• Blackboard: Creating a Rubric (Blackboard On Demand Video, 3:10)
• Blackboard: Grading with Rubrics (Blackboard On Demand Video, 3:10)
• Rubistar (free): http://rubistar.4teachers.org/
• Various online rubric tools: http://www.teach-nology.com/web_tools/rubrics/
Guidelines for Developing Rubrics

• Find, adapt and tweak existing templates
• Be clear on what you want to assess
• Have clear essential criteria, and a realistic number of criteria
• Write rubrics in clear language that students understand
• Make sure the marks allocated to each criterion correlate with the amount of time students spend on it
• Share rubrics with colleagues and students in advance
• Evaluate & revise
References

Bloxham, S. and Boyd, P. (2008) Developing Effective Assessment in Higher Education: A Practical Guide. Maidenhead: Open University Press McGraw-Hill.
Crotwell Timmerman, B.E., Strickland, D.C., Johnson, R.L. and Payne, J.R. (2011) Development of a ‘universal’ rubric for assessing undergraduates’ scientific reasoning skills using scientific writing, Assessment & Evaluation in Higher Education, 36(5), 509–547.
Hornby, W. (2003) Strategies for Streamlining Assessment: Case Studies from the Chalk Face. http://papers.ssrn.com/sol3/papers.cfm?abstract_id=405760
Kearney (2013) Improving engagement: the use of ‘Authentic self- and peer-assessment for learning’ to enhance the student learning experience, Assessment & Evaluation in Higher Education, 38(7), 875–891. DOI: 10.1080/02602938.2012.751963
Nicol, D. and Macfarlane-Dick, D. (2006) Formative assessment and self-regulated learning: a model and seven principles of good feedback practice, Studies in Higher Education, 31(2), 199–218.
Price, M., Carroll, J., O'Donovan, B. and Rust, C. (2011) ‘If I was going there I wouldn't start from here’: a critical commentary on current assessment practices, Assessment & Evaluation in Higher Education, 36(4), 479–492.
Sadler, D.R. (2010) Beyond feedback: developing student capability in complex appraisal, Assessment & Evaluation in Higher Education, 35(5), 535–550.
Sadler, D.R. (2009) Transforming holistic assessment and grading into a vehicle for complex learning. In G. Joughin (ed.), Assessment, Learning and Judgement in Higher Education. Springer Science+Business Media B.V.
Seymour, D. (2005) Learning Outcomes and Assessment: Developing assessment criteria for Masters-level dissertations, Brookes eJournal of Learning and Teaching, 1(2), 1–8. http://bejlt.brookes.ac.uk/paper/learning-outcomes-and-assessment-developing-assessment-criteria-for-masters-level-dissertations/
Appendices

Principle of ‘Validity’

Assessments should measure what they purport to measure and should align with the programme’s and module’s learning outcomes.

Resource: Curriculum Constructive Alignment
http://www.engsc.ac.uk/er/theory/constructive_alignment.asp
Principle of ‘Reliability’

Assessment tasks should generate comparable grades across time, across markers and across methods.
Formative Assessment

Formative assessment includes Assessment AS Learning, which occurs when students reflect on and monitor their progress to inform their future learning goals.

http://www.education.vic.gov.au/studentlearning/assessment/
Principle of ‘Practicability and Efficiency’

Assessment tasks should be practical for both staff and students in terms of the time needed for completion and marking, and they should be cost-effective.

Resource: Hornby (2003)


Principle of ‘Transparency’

Information, guidance, assessment criteria, rules and regulations on assessment should be clear, accurate, consistent and accessible to all students, staff and examiners.

Resources: Kearney (2013); Seymour (2005); Price et al. (2011)
Thanks for Your Participation

Dr Geraldine O’Neill             Leone Gately
T: 01 716 8575                   T: 01 716 8498
E: geraldine.m.oneill@ucd.ie     E: leone.gately@ucd.ie
