Abstract
This paper introduces instructors to a practical tool, the Test Assessment Questionnaire, which
helps students critically evaluate their course progress after the midterm exam. The tool has
three potential benefits: 1) it guides students toward more self-awareness in their studies; 2) it
can be used as part of assessment and assurance-of-learning efforts; and 3) it requires minimal
instructor time and effort. The tool was piloted in two sections of principles of macroeconomics
and was met with overwhelmingly positive feedback from students. There is some preliminary
evidence that the tool may also improve student performance on the final exam.
[1] Special thanks to participants of the 2010 Robert Morris University / McGraw-Hill Irwin Teaching Economics Conference for helpful comments. Warm gratitude is also extended to Christina Peters and Curtis Price for insightful suggestions.
Several studies have investigated student overconfidence. Walstad (2001) calls for further
investigation of the psychology of students, suggesting concepts from behavioral economics (e.g.
overconfidence) could be used to explain student behavior. Falchikov and Boud (1989) find that
students have grade expectations that are higher than the typical distribution for the course.
Additionally, students in the principles courses are found to be overconfident in their
understanding of material, as measured by their predictions of exam scores (Grimes 2002).
Nowell and Alston (2007) find that instructor grading practices can influence the degree of
overconfidence.
When faced with a student who is upset at receiving a lower-than-expected exam grade,
instructors may respond by telling the student to study more. For students who study for only
two hours the night before the exam and earn a D grade, this strategy surely has merit. However,
for other students, simply studying more may or may not result in a higher grade. The lack of
effect of study time on achievement has been documented by Becker (1982). This can be
frustrating to students who feel they are already "studying hard".[2]
Such students may start to blame the instructor for the disconnect between (perceived) effort and
desired grade. Students may label the instructor as “unfair” or the course simply “too hard”.
Grimes, Millea, and Woodruff (2004) find that the degree to which students accept personal
responsibility for performance affects their evaluation of teaching effectiveness and course
satisfaction. In course evaluations, students reward professors who increase achievement in the
contemporaneous course, not those who facilitated deep learning for subsequent courses (Carrell
and West 2010). Millea and Grimes (2002) conclude that instructors need not “water-down”
courses in order to receive favorable course evaluations. Instead, they can positively influence
evaluations by addressing negative student attitudes about forthcoming coursework.
Building on this premise, this paper introduces instructors to a practical tool, the Test
Assessment Questionnaire, which helps students critically evaluate their course progress after the
midterm exam. The tool has three potential benefits: 1) it guides students toward more self-awareness
in their studies; 2) it can be used as part of assessment and assurance-of-learning
efforts; and 3) it requires minimal instructor time and effort.
The Test Assessment Questionnaire guides students through an analysis of their midterm exam
mistakes.[3] Students are asked about their exam preparation activities as well as studying
habits in general.
[2] These are the students who might say "I thought I got an A" because "I studied really hard" and yet earn a lower grade.
[3] The Appendix contains the full Test Assessment Questionnaire. The concept of a test assessment is something I stumbled upon in graduate school. There was a handout called "Why did I miss that test question?" which was used by tutors in the Student Academic Skills Center at the University of Colorado at Boulder. The Test Assessment Questionnaire is based on that handout.
The Test Assessment Questionnaire has the potential to be a powerful student aid which requires
minimal additional instructor time expenditure and at the same time provides a complement to
departmental assessment activities. Preliminary findings are promising and student feedback on
the process is positive. Students were asked to evaluate the experience of using the test
assessment, and all responses were positive. Among the students whose final exam scores improved,
those who completed the assessment improved significantly more than those who did not.
First, students are asked to perform an analysis of their midterm exam.[7] They review each missed
exam question and determine why they think they answered it incorrectly. To facilitate this
analysis, eight categories of common types of mistakes are listed, along with an "other" option.
Students can choose from the following reasons: 1) didn't know a definition; 2) couldn't apply a
definition I knew; 3) didn't read the question carefully; 4) knew the answer but couldn't come up
with it during the exam; 5) didn't know how to set the problem up; 6) used the wrong formula; 7)
debated between two answers and chose the wrong one; 8) just didn't know the material; 9)
other.[8] Once mistakes have been categorized, students are asked to comment on any trend they
observe. For the majority of students, a clear pattern emerges.
The purpose of analyzing midterm exam mistakes is to lead the student to look at their
performance in a critical way. Over the years, students have come to me after a poorer-than-
expected performance on an exam and expressed that they do not know what they did wrong. In
my experience, such a student had rarely (if ever) actually critically reviewed their mistakes.
Instead of being fixated on the number of mistakes, students are directed to focus on the type of
mistake they are making. In my experience, this change in focus is a powerful tool for
motivating students to improve their learning. It charts a much clearer path for the student’s
future study activities. Some students observe that they mostly miss the graphing problems; they
immediately see that they need to spend more time with that part of the material. Other students
realize they miss questions from lectures on the days when they did not attend class.[9]
[4] The midterm exam was given two weeks before the deadline to drop the course without evaluation.
[5] Homework is given weekly throughout the semester.
[6] Meetings typically lasted 5 to 10 minutes. About one-third of the students chose to attend a meeting.
[7] The midterm exam was 30 multiple-choice questions and 3 short-answer, calculation-type questions.
[8] Whether their perceptions of why they missed a particular question are accurate is a question left for future research.
[9] While this may seem straightforward to instructors, it is often a profound realization for students.
In addition to building student self-awareness, when an instructor knows what type of questions a
particular student is missing, he/she can give the student specific advice on how to improve
their studying. For students who consistently miss definitional questions, the instructor might
suggest making flash cards. Students who find that they are debating between two answers and
choosing the incorrect one understand most of the material, but there is a nuance or detail they
have not picked up on. When this is explained, they seem to feel much better about the situation,
knowing that they "almost have it" and that with a little more attention to detail they will be
able to choose the correct answer.
There are at least two types of students for whom the economic content may not be the culprit
behind their lack of performance: students with inadequate math skills and students with test
anxiety. A sizeable number of my students report that they always do poorly when math is
involved. No doubt many readers have encountered similar students. This problem often persists
despite a math prerequisite for the course. In addition, several students self-identify on the test
assessment that they have test anxiety in general.[10] Both issues warrant further investigation but
are larger and deeper than the scope of this paper.
The next question asks students to compare their exam score to their weekly homework scores.
They are asked how heavily they rely on their notes and text when doing homework. The
purpose of this query is to lead students to examine whether they are looking up every single
answer on a homework assignment, or if they are thinking through the answers on their own and
simply using notes as a reference for clarifying the confusing points.
The next questions ask about exam preparation and course studying activities in general. The
answers provide insight into whether or not students are adequately preparing for the exam and
whether or not their personal study habits during the course are facilitating their learning. Some
students do not seem shy in reporting the reasons for their lack of performance on exams; a
majority of the students with failing midterm grades candidly explain the various reasons they
did not study much (e.g., studying for a different exam or having to work). Students are also asked
how many times they have been to office hours to clarify material. For most students the answer
is “zero”. This subtly reminds the student that there is assistance available, but it is up to them to
utilize it.
The final question asks students what grade they hope to earn in the course and to identify a new
study strategy for reaching that goal. Many students report that they will no longer wait until the
last minute to do their homework so that they can attend office hours and ask questions. Other
common strategies include doing the reading in advance, reviewing class notes more often for
short periods of time, and studying more than solely right before an exam.
3. Preliminary Results
Given the small sample size and the exploratory nature of this instrument, it is not possible to
draw firm conclusions about the benefits to student grades of using this Test Assessment.
[10] The Test Assessment does not ask specifically whether students suffer from test anxiety; to do so would be a violation of school policy regarding students with disabilities. However, many students volunteer in the "other" category that the reason they did poorly was test anxiety.
However, some of the initial results are promising and student feedback on the process is
positive.
Of the 67 students, 53 submitted a completed test assessment. Despite the fact that the
assessment was a required homework assignment, 21 percent of students did not complete one.
Lack of participation was fairly similar between the genders; 22 percent of males and 19 percent
of females did not submit the assignment.
Table 1 reports the mean scores for the midterm exam, final exam, and homework for the
students completing the assessment and those who did not complete the assessment. The reader
is reminded that the intent of this paper is not a rigorous control/treatment experiment, but
rather an introduction to a teaching tool. The table presents a very preliminary exploration of
any effect on students' progress on the final exam. A difference in means test (assuming unequal
variance) reveals no significant difference between the mean scores of students who completed
the test assessment and those who did not.
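Readers who wish to replicate this comparison with their own class data can compute the unequal-variance (Welch's) t-statistic from summary statistics alone. The sketch below is illustrative: the paper reports only means and sample sizes, so the common standard deviation of 12 points used in the example is an assumption, not a figure from the study.

```python
import math

def welch_t(mean1, sd1, n1, mean2, sd2, n2):
    # Welch's t-statistic: difference in means divided by the
    # unequal-variance standard error sqrt(s1^2/n1 + s2^2/n2)
    se = math.sqrt(sd1**2 / n1 + sd2**2 / n2)
    return (mean1 - mean2) / se

# Table 1 midterm means (n = 14 non-completers, n = 53 completers),
# with a hypothetical common SD of 12 points
t = welch_t(76.1, 12, 14, 70.3, 12, 53)  # t is approximately 1.61 under these assumed SDs
```

With raw score lists rather than summary statistics, `scipy.stats.ttest_ind(a, b, equal_var=False)` performs the same test directly.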
Table 2 reports the students' performance change between the midterm exam and the final.[11]
The students who did not complete the test assessment on average did 7.4 percentage points
worse on the final than on their midterm exam. On average, students who completed the
assessment marginally improved their final exam grade. Because some students scored worse on
the final exam and some scored better, the mean change in score is not entirely informative.
Looking first at all of the students who improved their final exam score, the students who
completed the assessment improved their scores significantly more. They increased their final
exam scores by an average of 9.9 percentage points above their midterm exam score while
students who did not complete the assessment increased their scores by an average of 2.9
percentage points. Of the students scoring worse on the final exam, there is no significant
difference between the groups.
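The split used in Table 2 (mean change among students who improved versus those who scored worse) can be sketched as follows. The score lists in the example are hypothetical, since individual-level data are not reported in the paper.

```python
def change_summary(midterms, finals):
    # Per-student change in percentage points (final minus midterm),
    # split into improvers and decliners as in Table 2
    changes = [f - m for m, f in zip(midterms, finals)]
    improved = [c for c in changes if c > 0]
    declined = [c for c in changes if c < 0]
    mean = lambda xs: sum(xs) / len(xs) if xs else 0.0
    return mean(improved), mean(declined)

# Hypothetical midterm and final scores for five students
up, down = change_summary([70, 80, 65, 90, 75], [78, 76, 75, 85, 80])
# up is the mean gain among improvers; down the mean loss among decliners
```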
[11] The final exam is cumulative.
[12] Readers can download an electronic copy of the Test Assessment Questionnaire at http://www.scribd.com/doc/34225467/Test-Assessment-Form or from http://www.katherinesauer.net/research.html.
time. Data on why students are missing particular questions can inform curriculum and teaching
discussions.
Perhaps most importantly, students are given a tool that concretely guides them into self-
awareness with respect to their studies. In this limited trial, they reported that the experience was
beneficial. In anonymous end-of-course evaluations, students are asked to respond to the
following open-ended statement: “Please comment on the experience of completing the Test
Assessment and/or meeting with me to discuss it.” While some students left the statement blank,
all students who chose to respond indicated that it was a beneficial or positive experience.
Several indicated that the technique was helpful for their study habits in general, not solely the
economics course.
The Test Assessment Questionnaire has the potential to be a powerful student aid which requires
minimal additional instructor time expenditure and at the same time provides a complement to
departmental assessment activities. While students respond favorably to the instrument, the
impact on student performance on the final exam needs further study.
5. References
Becker, W. E. 1982. The educational process and student achievement given uncertainty in
measurement. American Economic Review 72 (1): 229–36.
Carrell, S. E., and J. E. West. 2010. Does professor quality matter? Evidence from random
assignment of students to professors. Journal of Political Economy 118 (3): 409–32.
Falchikov, N., and D. Boud. 1989. Student self-assessment in higher education: A meta-analysis.
Review of Educational Research 59 (4): 395–430.
Grimes, P. W., M. J. Millea, and T. W. Woodruff. 2004. Grades – Who's to blame? Student
evaluation of teaching and locus of control. Journal of Economic Education 35 (2): 129–47.
Millea, M. J., and P. W. Grimes. 2002. Grade expectations and student evaluation of teaching.
College Student Journal 36 (4): 582–90.
Nowell, C., and R. M. Alston. 2007. I thought I got an A! Overconfidence across the economics
curriculum. Journal of Economic Education 38 (2): 131–42.
Table 1: Mean Scores

                     Did Not Complete     Completed
                     Assessment (n=14)    Assessment (n=53)    t-statistic
Midterm Exam Mean         76.1                 70.3               1.5769
Final Exam Mean           68.7                 70.4              -0.4235
Homework Mean             72.4                 74.1              -0.3239
6. Appendix
Test Assessment Questionnaire
1. Go through the questions you answered incorrectly on your exam. For each, choose the
reason you feel you got the answer wrong. Write the number of the question in the blank next to
the reason.
Didn't know a definition ______________________________________________
Couldn't apply a definition I knew ____________________________________
Didn't read the question carefully ____________________________________
Knew the answer but couldn't come up with it during the exam __________
Didn't know how to set the problem up _________________________________
Used the wrong formula ________________________________________________
Debated between two answers and chose the wrong one ___________________
Just didn't know the material _________________________________________
Other _________________________________________________________________
2. Do you notice any patterns with the type of question you missed? Explain.
3. How do your exam scores compare with your assignment scores? How heavily do you rely
on your notes/text for doing your homework? How heavily do you rely on your group members
during the in-class assignments?
5. How much time did you spend preparing for the exam? When did you seriously start studying
for it and how much time did you spend?
6. In a typical week, how much time do you spend:
- reading the text?
- doing homework?
- doing optional problem sets?
- reviewing the previous weeks’ in-class assignments?
7. About how many times have you been to office hours to ask questions/clarify material?
8. What letter grade do you hope to earn in this course? To achieve your goal, what is your
strategy for studying between now and the next exam?