
Assessment Writing Best Practices

Overview:
• The following practices will allow you to maximize the reliability of your assessments.
By doing so, you’ll be able to compare mastery across units with greater confidence.
• Not surprisingly, corps members who have used these strategies have seen greater
consistency in mastery figures from unit to unit, and fewer of the puzzling
fluctuations that don’t actually correspond to variations in student understanding.

Best Practices

Rubric Consistency
What This Looks Like: Always using Montgomery County 10-point rubrics to evaluate writing and speaking, OR always using PALS rubrics for writing and speaking.
The MC speaking rubric can be found at the bottom of this page: http://dcspanishresources.pbworks.com/Speaking-Activities
The MC writing rubric can be found at the bottom of this page: http://dcspanishresources.pbworks.com/Writing-Activities
What This Doesn’t Look Like: Oscillating between use of Montgomery County and PALS rubrics to evaluate writing and speaking, OR not using rubrics to evaluate writing/speaking altogether.

Maintain Pacing as Set by LTP
What This Looks Like: On pace: being 0.5 units or less off from the timeline established by your long-term plan (if you need to adjust your LTP because you’ll fall more than 0.5 units short, revise it and resubmit it to Joaquin).
What This Doesn’t Look Like: Off pace: being more than 0.5 units off the pace of your most recently submitted LTP.

Consistent Approach to Assessing Mastery
What This Looks Like:
SKILLS: Consistent point allocation for each section of LSRW across all units (e.g., always making the L, S, R, and W sections worth 10 points each on every exam; see the illustrative calculation after this row).
CONTENT: If you track mastery by objective, from here on out make sure you write the same number of assessment items for each objective on your exams (e.g., decide to write 2 questions per objective on every exam).
*If you assess by indicator mastery, from here on out make sure you write the same number of assessment items for each verb of every indicator (e.g., for indicator 5A, if there are X verbs in the indicator, write X assessment items for each of 5A’s verbs).
What This Doesn’t Look Like:
Increasing/decreasing point totals for skills sections from unit to unit (e.g., making Listening worth 24 points on the Unit 1 exam and then making it worth 10 points on Unit 2).
Devoting X questions to indicator 4B and Y questions to indicator 4C.
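
The comparability point in the skills guidance above can be shown with a small, hypothetical calculation (the function, skill percentages, and point values below are invented for illustration and are not part of this document): if a student performs exactly the same on every skill, simply changing Listening from 24 points to 10 points still moves the overall mastery figure, so unit-to-unit comparisons break down.

```python
# Hypothetical illustration: changing a section's point total changes overall
# mastery even when per-skill performance is identical.

def overall_mastery(section_points, performance):
    """Percent mastery = points earned / points possible."""
    possible = sum(section_points.values())
    earned = sum(performance[s] * section_points[s] for s in section_points)
    return 100 * earned / possible

# Same per-skill performance (fraction of each section earned) on both exams...
performance = {"L": 0.9, "S": 0.5, "R": 0.8, "W": 0.6}

# ...but Unit 1 weights Listening at 24 points while Unit 2 uses 10 for every section.
unit1_points = {"L": 24, "S": 10, "R": 10, "W": 10}
unit2_points = {"L": 10, "S": 10, "R": 10, "W": 10}

print(overall_mastery(unit1_points, performance))  # ~75.2
print(overall_mastery(unit2_points, performance))  # 70.0
```
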
Writing LR questions that assess comprehension
What This Looks Like: Including listening/reading excerpts in Spanish and then writing the question/answer choices in English.
What This Doesn’t Look Like: Including listening/reading excerpts in Spanish and then writing the question/answer choices in Spanish.
Accurate Mastery Input Based on Attendance
What This Looks Like:
*Unit Exam Attendance Threshold: Entering mastery data for students who attend your class 40% or more of the time (see the sketch after this row).
*Absences on Unit Exam Days: On unit assessments, entering zeros for mastery until students take the make-up exam.
*Absences on Final Exam: Excluding data from students who never take the Final Exam.
What This Doesn’t Look Like: *Entering in data
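
A minimal sketch of the unit-exam data-entry rules above, assuming a hypothetical per-student record (the function name and example numbers are invented; only the 40% threshold and the zero-until-make-up rule come from this document). For the final exam, the guidance is instead to exclude, rather than zero out, students who never take it.

```python
# Hypothetical sketch of the unit-exam mastery data-entry rules.

ATTENDANCE_THRESHOLD = 0.40  # enter mastery data only for students at or above 40% attendance

def unit_exam_entry(attendance_rate, took_exam, exam_score=None):
    """Return the value to record for one student on a unit exam (None = leave out)."""
    if attendance_rate < ATTENDANCE_THRESHOLD:
        return None          # below the attendance threshold: don't enter mastery data
    if not took_exam:
        return 0             # absent on exam day: enter a zero until the make-up exam
    return exam_score        # otherwise record the student's actual score

print(unit_exam_entry(0.35, True, 82))  # None -> excluded from mastery data
print(unit_exam_entry(0.90, False))     # 0    -> placeholder until the make-up
print(unit_exam_entry(0.90, True, 82))  # 82
```
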
Alignment
What This Looks Like:
Rigor: Ensuring that the task you’re asking students to undertake matches the same level on Bloom’s taxonomy as the highest level addressed by the indicator or objective (e.g., if the indicator says students should “present” or “exchange” but you ask them only to “identify,” that is not as rigorous as it could be).
Find Bloom’s taxonomy here: http://www.nwlink.com/~Donclark/hrd/bloom.html
For an easier way to ensure alignment, check out the “Guide to Ensuring Bloom’s Alignment” here: http://dcspanishresources.pbworks.com/LTPs%2C-UPs%2C-and-Assessments
Content:
• Ensuring that you assess the same content (e.g., vocabulary) and in a similar fashion as you did on the exit slip/independent activity.
• Assessing mastery of content through aligned multiple-choice, fill-in-the-blank, or highly structured short-response questions (e.g., translate this statement from English into Spanish).
What This Doesn’t Look Like:
Rigor: Asking students to complete an assessment item that assesses their mastery at a different level on Bloom’s taxonomy than the curriculum recommends.
Content:
• Assessing content in a different way than you asked students to when you first introduced it, or assessing new vocabulary/structures that students might have only been cursorily exposed to.
• Assessing mastery of content through open-ended questions that are hard to score (e.g., “Write down 3 sentences describing what you like to do after school”). These questions force judgment calls about how you weigh grammar, spelling, etc., which inevitably leads to unreliable scoring. My suggestion: save open-ended questions for the writing section of your assessment, where you’ll be able to use a standardized rubric like PALS or MC to evaluate performance.
Consistent formatting
What This Looks Like: *Avoiding the tendency to switch up the layout/formatting of the exam unless you have good reason to (e.g., every test begins with LSRW and finishes with multiple-choice/short-response sections that assess content).
What This Doesn’t Look Like: *Switching up formatting in dramatic ways from exam to exam without good reason to do so.
Tracking and grading efficiency
What This Looks Like:
Using an alignment guide that allows you to easily tabulate mastery as you grade (download a blank “Assessment Alignment Guide Template” at this page: http://dcspanishresources.pbworks.com/Tracking-Student-Mastery).
Staying consistent with the way you group items on your assessment (e.g., grouping by objective or by indicator). This makes tracking and use of the alignment guide much more efficient (see the tally sketch after this row).
What This Doesn’t Look Like:
Not using an alignment guide.
Changing the way you group assessment items from assessment to assessment.
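
To make the tabulation idea concrete, here is a hypothetical sketch of the kind of per-objective tally an alignment guide supports (the objective codes, item map, and results below are invented, and this is not the actual Assessment Alignment Guide Template): because items are grouped by objective and each objective gets the same number of items, per-objective mastery falls out of a simple count.

```python
# Hypothetical per-objective mastery tally for one student.
from collections import defaultdict

# Which items assess which objective (e.g., 2 items per objective on every exam).
item_to_objective = {1: "1A", 2: "1A", 3: "1B", 4: "1B", 5: "2A", 6: "2A"}

# One student's correct/incorrect results by item number.
student_results = {1: True, 2: False, 3: True, 4: True, 5: False, 6: False}

correct = defaultdict(int)
total = defaultdict(int)
for item, objective in item_to_objective.items():
    total[objective] += 1
    correct[objective] += int(student_results[item])

for objective in sorted(total):
    print(objective, f"{100 * correct[objective] / total[objective]:.0f}%")
# 1A 50%, 1B 100%, 2A 0%
```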
