NCEE 2009-4067
U.S. DEPARTMENT OF EDUCATION
The Institute of Education Sciences (IES) publishes practice guides in education
to bring the best available evidence and expertise to bear on the types of challenges
that cannot currently be addressed by a single intervention or program. Authors of
practice guides seldom conduct the types of systematic literature searches that are
the backbone of a meta-analysis, although they take advantage of such work when
it is already published. Instead, authors use their expertise to identify the most im-
portant research with respect to their recommendations and conduct a search of
recent publications to ensure that the research supporting the recommendations
is up-to-date.
Unique to IES-sponsored practice guides is that they are subjected to rigorous exter-
nal peer review through the same office that is responsible for independent reviews
of other IES publications. A critical task for peer reviewers of a practice guide is to
determine whether the evidence cited in support of particular recommendations is
up-to-date and that studies of similar or better quality that point in a different di-
rection have not been ignored. Because practice guides depend on the expertise of
their authors and their group decision making, the content of a practice guide is not
and should not be viewed as a set of recommendations that in every case depends
on and flows inevitably from scientific research.
The goal of this practice guide is to formulate specific and coherent evidence-based
recommendations for use by educators and education administrators to create the
organizational conditions necessary to make decisions using student achievement
data in classrooms, schools, and districts. The guide provides practical, clear in-
formation on critical topics related to data-based decision making and is based on
the best available evidence as judged by the panel. Recommendations presented in
this guide should not be construed to imply that no further research is warranted
on the effectiveness of particular strategies for data-based decision making.
IES PRACTICE GUIDE
Panel
Laura Hamilton (Chair)
RAND Corporation
Richard Halverson
University of Wisconsin–Madison
Sharnell S. Jackson
Chicago Public Schools
Ellen Mandinach
CNA Education
Jonathan A. Supovitz
University of Pennsylvania
Jeffrey C. Wayman
The University of Texas at Austin
Staff
Cassandra Pickens
Emily Sama Martin
Mathematica Policy Research
Jennifer L. Steele
RAND Corporation
This report was prepared for the National Center for Education Evaluation and Re-
gional Assistance, Institute of Education Sciences, under Contract ED-07-CO-0062
by the What Works Clearinghouse, operated by Mathematica Policy Research.
Disclaimer
The opinions and positions expressed in this practice guide are the authors’ and do
not necessarily represent the opinions and positions of the Institute of Education Sci-
ences or the U.S. Department of Education. This practice guide should be reviewed
and applied according to the specific needs of the educators and education agency
using it, and with the full realization that it represents the judgments of the review
panel regarding what constitutes sensible practice, based on the research available
at the time of publication. This practice guide should be used as a tool to assist in
decision making rather than as a “cookbook.” Any references within the document to
specific education products are illustrative and do not imply endorsement of these
products to the exclusion of other products that are not referenced.
September 2009
This report is in the public domain. While permission to reprint this publication is
not necessary, the citation should be:
Hamilton, L., Halverson, R., Jackson, S., Mandinach, E., Supovitz, J., & Wayman, J.
(2009). Using student achievement data to support instructional decision making
(NCEE 2009-4067). Washington, DC: National Center for Education Evaluation and
Regional Assistance, Institute of Education Sciences, U.S. Department of Education.
Retrieved from http://ies.ed.gov/ncee/wwc/publications/practiceguides/.
What Works Clearinghouse Practice Guide citations begin with the panel chair,
followed by the names of the panelists listed in alphabetical order.
Alternative formats
On request, this publication can be made available in alternative formats, such
as Braille, large print, audiotape, or computer diskette. For more information,
call the Alternative Format Center at 202–205–8113.
Using Student Achievement Data to
Support Instructional Decision Making
Contents
Introduction
The What Works Clearinghouse standards and their relevance to this guide
Overview
References
List of tables
Table 1. Institute of Education Sciences levels of evidence for practice guides

List of figures
Figure 1. Data use cycle

List of examples
Example 1. Examining student data to understand learning
Introduction

As educators face increasing pressure from federal, state, and local accountability policies to improve student achievement, the use of data has become more central to how many educators evaluate their practices and monitor students' academic progress.1 Despite this trend, questions about how educators should use data to make instructional decisions remain mostly unanswered. In response, this guide provides a framework for using student achievement data to support instructional decision making. These decisions include, but are not limited to, how to adapt lessons or assignments in response to students' needs, alter classroom goals or objectives, or modify student-grouping arrangements. The guide also provides recommendations for creating the organizational and technological conditions that foster effective data use. Each recommendation describes action steps for implementation, as well as suggestions for addressing obstacles that may impede progress. In adopting this framework, educators will be best served by implementing the recommendations in this guide together rather than individually.

The recommendations reflect both the expertise of the panelists and the findings from several types of studies, including studies that use causal designs to examine the effectiveness of data use interventions, case studies of schools and districts that have made data use a priority, and observations from other experts in the field. The research base for this guide was identified through a comprehensive search for studies evaluating academically oriented data-based decision-making interventions and practices. An initial search for literature related to data use to support instructional decision making in the past 20 years yielded more than 490 citations. Of these, 64 used experimental, quasi-experimental, and single-subject designs to examine whether data use leads to increases in student achievement. Among the studies ultimately relevant to the panel's recommendations, only six meet the causal validity standards of the What Works Clearinghouse (WWC).2

To indicate the strength of evidence supporting each recommendation, the panel relied on the WWC standards for determining levels of evidence, described below and in Table 1. It is important for the reader to remember that the level of evidence rating is not a judgment by the panel on how effective each of these recommended practices will be when implemented, nor is it a judgment of what prior research has to say about the effectiveness of these practices. The level of evidence ratings reflect the panel's judgment of the validity of the existing literature to support a causal claim that when these practices have been implemented in the past, positive effects on student academic outcomes were observed. They do not reflect judgments of the relative strength of these positive effects or the relative importance of the individual recommendations.

A strong rating refers to consistent and generalizable evidence that an intervention strategy or program improves outcomes.3

A moderate rating refers either to evidence from studies that allow strong causal conclusions but cannot be generalized with assurance to the population on which a recommendation is focused (perhaps because the findings have not been widely replicated) or to evidence from studies that are generalizable but have more causal ambiguity than that offered by experimental designs (e.g., statistical models of correlational data or group comparison designs for which equivalence of the groups at pretest is uncertain).

A low rating refers to evidence either from studies such as case studies and descriptive studies that do not meet the standards for moderate or strong evidence or from expert opinion based on reasonable extrapolations from research and theory. A low level of evidence rating indicates that the panel did not identify a body of research demonstrating effects of implementing the recommended practice on student achievement. The lack of a body of valid evidence may simply mean that the recommended practices are not feasible or are difficult to study in a rigorous, experimental fashion.4 In other cases, it means that researchers have not yet studied a practice or that there is weak or conflicting evidence of effectiveness. Policy interest in topics of current study thus can arise before a research base has accumulated on which recommendations can be based.

Under these circumstances, the panel examined the research it identified on the topic and combined findings from that research with its professional expertise and judgments to arrive at recommendations. However, that a recommendation has a low level of evidence should not be interpreted as indicating that the panel believes the recommendation is unimportant. The panel has decided that all five recommendations are important and, in fact, encourages educators to implement all of them to the extent that state and district resources and capacity allow.

Table 1. Institute of Education Sciences levels of evidence for practice guides

Low: In general, characterization of the evidence for a recommendation as low means that the recommendation is based on expert opinion derived from strong findings or theories in related areas and/or expert opinion buttressed by direct evidence that does not rise to the moderate or strong level. Low evidence is operationalized as evidence not meeting the standards for the moderate or strong level.

a. American Educational Research Association, American Psychological Association, and National Council on Measurement in Education (1999).
b. Ibid.

1. Knapp et al. (2006).
2. Reviews of studies for this practice guide applied WWC Version 1.0 standards. See Version 1.0 standards at http://ies.ed.gov/ncee/wwc/pdf/wwc_version1_standards.pdf.
3. Following WWC guidelines, improved outcomes are indicated by either a positive, statistically significant effect or a positive, substantively important effect size (i.e., greater than 0.25).
Using Student Achievement Data to Support Instructional Decision Making

Overview

Recent changes in accountability and testing policies have provided educators with access to an abundance of student-level data, and the availability of such data has led many to want to strengthen the role of data for guiding instruction and improving student learning. The U.S. Department of Education recently echoed this desire, calling upon schools to use assessment data to respond to students' academic strengths and needs.5 In addition, spurred in part by federal legislation and funding, states and districts are increasingly focused on building longitudinal data systems.6

… progress is a logical way to monitor continuous improvement and tailor instruction to the needs of each student. Armed with data and the means to harness the information data can provide, educators can make instructional changes aimed at improving student achievement, such as:

• prioritizing instructional time;8

• targeting additional individual instruction for students who are struggling with particular topics;9

• more easily identifying individual students' strengths and instructional interventions that can help students continue to progress;10

• gauging the instructional effectiveness of classroom lessons;11

• refining instructional methods;12 and …
Scope of the practice guide

The purpose of this practice guide is to help K–12 teachers and administrators use student achievement data to make instructional decisions intended to raise student achievement. The panel believes that the responsibility for effective data use lies with district leaders, school administrators, and classroom teachers and has crafted the recommendations accordingly.

This guide focuses on how schools can make use of common assessment data to improve teaching and learning. For the purpose of this guide, the panel defined common assessments as those that are administered in a routine, consistent manner by a state, district, or school to measure students' academic achievement.14 These include

• annual statewide accountability tests such as those required by No Child Left Behind;

• commercially produced tests—including interim assessments, benchmark assessments, or early-grade reading assessments—administered at multiple points throughout the school year to provide feedback on student learning;

• end-of-course tests administered across schools or districts; and

• interim tests developed by districts or schools, such as quarterly writing or mathematics prompts, as long as these are administered consistently and routinely to provide information that can be compared across classrooms or schools.

Annual and interim assessments vary considerably in their reliability and level of detail, and no single assessment can tell educators all they need to know to make well-informed instructional decisions. For this reason, the guide emphasizes the use of multiple data sources and suggests ways to use different types of common assessment data to support and inform decision making. The panel recognizes the value of classroom-specific data sources, such as tests or other student work, and the guide provides suggestions for how these data can be used to inform instructional decisions.

The use of data for school management purposes, rewarding teacher performance, and determining appropriate ways to schedule the school day is beyond the scope of this guide. Schools typically collect data on students' attendance, behavior, activities, coursework, and grades, as well as a range of administrative data concerning staffing, scheduling, and financing. Some schools even collect perceptual data, such as information from surveys or focus groups with students, teachers, parents, or community members. Although many of these data have been used to help inform instructional decision making, there is a growing interest among educators and policy advocates in drawing on these data sources to increase operational efficiency inside and outside of the classroom. This guide does not suggest how districts should use these data sources to implement data-informed management practices, but this omission should not be construed as a suggestion that such data are not valuable for decision making.

Status of the research

Overall, the panel believes that the existing research on using data to make instructional decisions does not yet provide conclusive evidence of what works to improve student achievement. There are a number of reasons for the lack of compelling evidence. First, rigorous experimental studies of some data-use practices are difficult or infeasible to carry out. For example, it would be impractical to structure a rigorous study investigating the effects of implementing a districtwide data system (recommendation 5) because it is difficult to establish an appropriate comparison that reflects what would have happened in the absence of that system. Second, data-based decision making is closely tied to educational technology. As new technologies are developed, there is often a lag before rigorous research can identify the impacts of those technologies. As a result, there is limited evidence on the effectiveness of the state of the art in data-based decision making. Finally, studies of data-use practices generally look at a bundle of elements, including training teachers on data use, data interpretation, and utilizing the software programs associated with data analysis and storage. Studies typically do not look at individual elements, making it difficult to isolate a specific element's contribution to effective use of data to make instructional decisions designed to improve student achievement.

This guide includes five recommendations that the panel believes are a priority to implement. However, given the status of the research, the panel does not have compelling evidence that these recommendations lead to improved student outcomes. As a result, all of the recommendations are supported by low levels of evidence. While the evidence is low, the recommendations reflect the panel's best advice—informed by experience and research—on how teachers and administrators can use data to make instructional decisions that raise student achievement. In other words, while this panel of experts believes these practices will lead to improved student achievement, the panel cannot point to rigorous research that proves the practices do improve student achievement.

Summary of the recommendations

The recommendations in this guide create a framework for effectively using data to make instructional decisions. This framework should include a data system that incorporates data from various sources, a data team in schools to encourage the use and interpretation of data, collaborative discussion sessions among teachers about data use and student achievement, and instruction for students about how to use their own achievement data to set and monitor educational goals. A central message of this practice guide is that effective data practices are interdependent among the classroom, school, and district levels. Educators should become familiar with all five recommendations and collaborate with other school and district staff to implement the recommendations concurrently, to the extent that state and district resources and capacity allow. However, readers who are interested in implementing data-driven recommendations in the classroom should focus on recommendations 1 and 2. Readers who wish to implement data-driven decision making at the school level should focus on recommendations 3 and 4. Readers who wish to bolster district data systems to support data-driven decision making should focus on recommendation 5. Finally, readers interested in technical information about studies that the panel used to support its recommendations will find such information in Appendix D.

To account for the context of each school and district, this guide offers recommendations that can be adjusted to fit their unique circumstances. Examples in this guide are intended to offer suggestions based on the experiences of schools and the expert opinion of the panel, but they should not be construed as the best or only ways to implement the guide's recommendations. The recommendations, described here briefly, also are listed with their levels of evidence in Table 2.

Table 2. Recommendations and corresponding levels of evidence

2. Teach students to examine their own data and set learning goals (Low)
4. Provide supports that foster a data-driven culture within the school (Low)

Recommendations 1 and 2 emphasize the use of data to inform classroom-level instructional decisions. Recommendation 1 suggests that teachers use data from multiple sources to set goals, make curricular and instructional choices, and allocate instructional time. It describes the data sources best suited for different types of instructional decisions and suggests that the use of data be part of a cycle of instructional inquiry aimed at ongoing instructional improvement. Building on the use of data to drive classroom-based instructional decisions, recommendation 2 provides guidance about how teachers can instruct students in using their own assessment data to develop personal achievement goals and guide learning. Teachers then can use these goals to better understand factors that may motivate student performance and can adjust their instruction accordingly.

The panel believes that effective data use at the classroom level is more likely to emerge when it is supported by a data-informed school and district culture. Recommendations 3, 4, and 5, therefore, focus on the organizational and technological conditions that support data use. Recommendation 3 suggests that school leaders establish a comprehensive plan for data use that takes into account multiple perspectives. It also emphasizes the need to establish organizational structures and practices that support the implementation of that plan.

The panel believes that effective data use depends on supporting educators who are using and interpreting data. Recommendation 4 offers suggestions about how schools and districts can prepare educators to use data effectively by emphasizing the importance of collaborative data use. These collaboration efforts can create or strengthen shared expectations and common practices regarding data use throughout a school.

Recommendation 5 points out that effective, sustainable data use requires a secure and reliable data-management system at the district level. It provides detailed suggestions about how districts or other educational entities, such as multidistrict collaboratives or charter management organizations, should develop and maintain a high-quality data system.

14. The panel recognizes that some schools do not fall under a district umbrella or are not part of a district. For the purposes of this guide, district is used to describe schools in partnership, which could be either a school district or a collaborative organization of schools. Technical terms related to assessments, data, and data-based decision making are defined in a glossary at the end of the recommendations.
Checklist for carrying out the recommendations

Recommendation 1. Make data part of an ongoing cycle of instructional improvement

• Collect and prepare a variety of data about student learning.

• Interpret data and develop hypotheses about how to improve student learning.

• Modify instruction to test hypotheses and increase student learning.

Recommendation 2. Teach students to examine their own data and set learning goals

Recommendation 4. Provide supports that foster a data-driven culture within the school

• Designate a school-based facilitator who meets with teacher teams to discuss data.

• Dedicate structured time for staff collaboration.

• Provide targeted professional development regularly.

Recommendation 5. Develop and maintain a districtwide data system

• Involve a variety of stakeholders in selecting a data system.
Recommendation 1. Make data part of an ongoing cycle of instructional improvement

Figure 1. Data use cycle (steps shown in the figure include: collect and prepare a variety of data about student learning; interpret data and develop hypotheses about how to improve student learning)

… assesses the impact on student achievement of using an inquiry cycle, or individual steps within that cycle, as a framework for data analysis, however, and the panel determined that the level of evidence to support this recommendation is low.

Brief summary of evidence to support the recommendation

The panel considers the inquiry cycle of gathering data, developing and testing hypotheses, and modifying instruction to be fundamental when using assessment data to guide instruction. Although no causal evidence is available to support the effectiveness of this cycle, the panel draws on studies that did not use rigorous designs for examples of the three-point cycle of inquiry—the underlying principle of this recommendation—and provides some detail on the context for those examples in Appendix D.

Each assessment type has advantages and limitations (e.g., high-stakes accountability tests may be subject to score inflation and may lead to perverse incentives).18 Therefore, the panel believes that multiple data sources are important because no single assessment provides all the information teachers need to make informed instructional decisions. For instance, as teachers begin the data-use process for the first time or begin a new school year, the accessibility and high-stakes importance of students' statewide, annual assessment results provide a rationale for looking closely at these data. Moreover, these annual assessment data can be useful for understanding broad areas of relative strengths and weaknesses among students, for identifying students or groups of students who may need particular support,19 and for setting schoolwide,20 classroom, grade-level, or department-level goals for students' annual performance.
To gain deeper insight into students' needs and to measure changes in students' skills during the academic year, teachers also can collect and prepare data from interim assessments that are administered consistently across a district or school at regular intervals throughout the year (see the box below).22 As with annual assessments, interim assessment results generally have the advantage of being comparable across classrooms, but the frequency of their administration means that teachers can use the data to evaluate their own instructional strategies and to track the progress of their current students in a single school year. For instance, data from a districtwide interim assessment could help illuminate whether the students who were struggling to convert fractions to decimals improved after receiving targeted small group instruction, or whether students' expository essays improved after a unit spent reading and analyzing expository writing.

Characteristics of interim assessments

• Administered routinely (e.g., each semester, quarter, or month) throughout a school year

• Administered in a consistent manner across a particular grade level and/or content area within a school or district

• May be commercial or developed in-house

• May be administered on paper or on a computer

• May be scored by a computer or a person

Finally, it is important to collect and prepare classroom performance data for examination, including examples and grades from students' unit tests, projects, classwork, and homework. The panel recommends using these classroom-level data sources, in conjunction with widely accessible nonachievement data such as attendance records and cumulative files,23 to interpret annual and interim assessment results (see the box on page 13). An important advantage of these data sources is that in most cases, they can be gathered quickly to provide teachers with immediate feedback about student learning. Depending on the assignment in question, they also can provide rich, detailed examples of students' academic performance, thereby complementing the results of annual or interim tests. For example, if state and interim assessments show that students have difficulty writing about literature, then examination of students' analytic essays, book reports, or reading-response journals can illuminate how students are accustomed to writing about what they read and can suggest areas in which students need additional guidance.24 An important disadvantage of classroom-level data is that the assignments, conditions, and scores are not generally comparable across classrooms. However, when teachers come together to examine students' work, this variability also can be an advantage, since it can reveal discrepancies in expectations and content coverage that teachers can take steps to remedy.

As teachers prepare annual, interim, and classroom-level data for analysis, they should represent the information in …
2. Interpret data and develop hypotheses source of the discrepancy. In all cases, they
about how to improve student learning. should use classroom and other data to
shed light on the particular aspects of the
Working independently or in teams, teach- skill with which students need extra help.
ers should interpret the data they have
collected and prepared. In interpreting As they triangulate data from multiple
the data, one generally useful objective sources, teachers should develop hypoth-
is to identify each class’s overall areas eses about ways to improve the achieve-
of relative strengths and weaknesses so ment patterns they see in the data. As the
that teachers can allocate instructional box on page 15 explains, good hypoth-
time and resources to the content that is eses emerge from existing data, identify
most pressing. Another useful objective is instructional or curricular changes likely
to identify students’ individual strengths to improve student learning, and can be
and weaknesses so that teachers can adapt tested using future assessment data. For
their assignments, instructional methods, example, existing data can reveal places in
and feedback in ways that address those which the school’s curriculum is not well
individual needs. For instance, teachers aligned with state standards. In those situ-
may wish to adapt students’ class project ations, teachers might reasonably hypoth-
assignments in ways that draw on stu- esize that reorganizing the curriculum to
dents’ individual strengths while encouraging them to work on areas for growth.

To gain deeper insight into students’ learning needs, teachers should examine evidence from the multiple data sources they prepared in action step 1.25 “Triangulation” is the process of using multiple data sources to address a particular question or problem and using evidence from each source to illuminate or temper evidence from the other sources. It also can be thought of as using each data source to test and confirm evidence from the other sources in order to arrive at well-justified conclusions about students’ learning needs. When multiple data sources (e.g., results from the annual state assessment and district interim assessment) show similar areas of student strength and weakness (as in Example 1), teachers can be more confident in their decisions about which skills to focus on. In contrast, when one test shows students struggling in a particular skill and another test shows them performing well in that skill, teachers need to look closely at the items on both tests to try to identify the

address previously neglected material will improve students’ mastery of the standards. In other cases, teachers may hypothesize that they need to teach the same content in different ways. Taking into account how they and their colleagues have previously taught particular skills can help teachers choose among plausible hypotheses. For instance, teachers may find that students have difficulty identifying the main idea of texts they read. This weak student performance may lead teachers to hypothesize that the skill should be taught differently. In talking to other teachers, they might choose a different teaching strategy, such as a discussion format in which students not only identify the main idea of a text but also debate its evidence and merits.

To foster such sharing of effective practices among teachers, the panel recommends that teachers interpret data collaboratively in grade-level or department-specific teams. In this way, teachers can begin to adopt some common instructional and assessment practices as well as common expectations for student performance.26 Collaboration also allows teachers to develop

25. Halverson, Prichett, and Watson (2007); Herman and Gribbons (2001); Lachat and Smith (2005); Wayman and Stringfield (2006).
26. Fiarman (2007); Halverson, Prichett, and Watson (2007); Halverson et al. (2007).
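The triangulation process described above can be sketched in code. This is a hypothetical illustration, not a tool from the guide; the skill names and the proficiency cutoff are invented for the example.

```python
# Hypothetical sketch of triangulating two data sources skill by skill.
# Skill names and the proficiency cutoff (70) are invented for illustration.

state_test = {"main idea": 58, "vocabulary": 81, "inference": 64}
interim_test = {"main idea": 55, "vocabulary": 85, "inference": 78}

CUTOFF = 70  # assumed proficiency threshold

def triangulate(a, b, cutoff=CUTOFF):
    """Classify each shared skill by whether the two sources agree."""
    results = {}
    for skill in a.keys() & b.keys():
        weak_a, weak_b = a[skill] < cutoff, b[skill] < cutoff
        if weak_a and weak_b:
            results[skill] = "confirmed weakness: prioritize instruction"
        elif not weak_a and not weak_b:
            results[skill] = "confirmed strength"
        else:
            results[skill] = "sources disagree: examine items on both tests"
    return results

for skill, verdict in sorted(triangulate(state_test, interim_test).items()):
    print(f"{skill}: {verdict}")
```

The disagreement branch mirrors the guide’s advice to look closely at the items on both tests before drawing a conclusion.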
Recommendation 1.
Make data part of an ongoing cycle of instructional improvement
The time it takes teachers to carry out their instructional changes will depend in part on the complexity of the changes. If teachers are delivering a discrete lesson plan or a series of lessons, then the change usually can be carried out quickly. Larger interventions take longer to roll out than smaller ones. For instance, a teacher whose intervention involves introducing more collaborative learning into the classroom may need time to teach her students to work efficiently in small group settings.

During or shortly after carrying out an instructional intervention, teachers should take notes on how students responded and how they as teachers might modify delivery of the intervention in future classes. These notes may not only help teachers reflect on their own practice but also prepare them to share their experiences and insights with other teachers.

To evaluate the effectiveness of the instructional intervention, teachers should return to action step 1 by collecting and preparing a variety of data about student learning. For instance, they can gather classroom-level data, such as students’ classwork and homework, to quickly evaluate student performance after the intervention.27 Teachers can use data from later interim assessments, such as a quarterly district test, to confirm or challenge their immediate, classroom-level evidence.

Finally, after triangulating data and considering the extent to which student learning did or did not improve in response to the intervention, teachers can decide whether to keep pursuing the approach in its current form, modify or extend the approach, or try a different approach altogether. It is important to bear in mind that not all instructional changes bear fruit immediately, so before discarding an instructional intervention as ineffective, teachers should give themselves and their students time to adapt to it.28

Potential roadblocks and solutions

Roadblock 1.1. Teachers have so much data that they are not sure where they should focus their attention in order to raise student achievement.

Suggested Approach. Teachers can narrow the range of data needed to solve a particular problem by asking specific questions and concretely identifying the data that will answer those questions. In addition, administrators can guide this process by setting schoolwide goals that help clarify the kinds of data teachers should be examining and by asking questions about how classroom practices are advancing those goals. For instance, if administrators have asked teachers to devote particular effort to raising students’ reading achievement, teachers may decide to focus attention on evidence from state, interim, and classroom assessments about students’ reading needs. Teachers should then triangulate data from multiple sources (as described earlier) to develop hypotheses about instructional changes likely to raise student achievement. Note that recommendation 3 describes how administrators, data facilitators, and other staff can help teachers use data in ways that are clearly aligned with the school’s medium- and long-term student achievement goals. Also, recommendation 4 describes how professional development and peer collaboration can help teachers become more adept at data preparation and triangulation.

Roadblock 1.2. Some teachers work in a grade level or subject area (such as early elementary and advanced high school grades) or teach certain subjects (such as social studies, music, science, or physical education) for which student achievement data are not readily available.
Suggested Approach. Part of the work of collaborative data use involves establishing shared learning goals and expectations across classrooms.29 District or school administrators can help this effort by providing an interim, schoolwide assessment, ideally linked to state standards, that allows the comparison of results across classrooms.30 Alternatively, teachers can collaborate to develop their own interim assessments. Some schools, for instance, develop interim writing prompts or other assessments that are administered throughout the school and scored using a common rubric.31 (Example 5 in recommendation 2 illustrates this approach.) Although in-house assessments may lack the validity of commercially developed tests, they nevertheless provide common metrics by which teachers can assess their students and share results with colleagues.32 Similarly, teachers of supplemental subjects such as art, music, and physical education can develop performance assessments linked to schoolwide student goals.33

Roadblock 1.3. Some schools or districts encourage staff to use data to identify students scoring just below proficiency on state tests and to focus disproportionate effort on helping them reach proficiency.

Suggested Approach. Teachers and principals in some schools have reported focusing extra resources on “bubble kids,” or students scoring immediately below a proficiency cut-off on a high-stakes assessment.34 The panel cautions against this practice because results from any single test are imprecise and always should be considered in conjunction with other data. Also, undue focus on students scoring near proficiency may lead schools to distribute instructional resources inappropriately.35 For instance, students scoring further from the cut score (in either direction) may have just as many—if not more—distinctive instructional needs as those scoring near the cut score. Instead of focusing mainly on students scoring just below proficiency on a particular assessment, educators should use data from multiple sources to identify and serve the needs of all students. When possible, additional resources and support should be directed toward students whose needs are the greatest. (See the What Works Clearinghouse guides on Response to Intervention for more suggestions on tiered student support.)

Roadblock 1.4. Some district leaders suggest that schools assign students to courses based solely on proficiency levels on the state accountability test.

Suggested Approach. Tests should be used for the purposes for which they have been validated; most existing assessments have not been validated for the purpose of making decisions about course placement. In addition, the professional standards for appropriate use of test scores in educational settings state that a single test score should not be used to make high-stakes decisions about individuals; instead, educators and administrators should consider multiple sources of information when assigning students to courses or programs.36 Proficiency on a state accountability test can provide one indicator of a student’s readiness or need for a specific instructional program, but other information, such as prior performance in similar courses, should be taken into account. Finally, educators should reconsider decisions about placement when new data become available.

29. Datnow, Park, and Wohlstetter (2007); Williams Rose (2006); Rossmiller and Holcomb (1993); Togneri (2003); Wayman, Cho, and Johnston (2007).
30. Wayman, Midgley, and Stringfield (2006).
31. See, for example, Fiarman (2007).
32. Shepard et al. (1996).
33. See, for example, Forman (2007).
34. Booher-Jennings (2005); Brunner et al. (2005); Hamilton et al. (2007); Long et al. (2008).
35. Booher-Jennings (2005).
36. AERA, APA, and NCME (1999).
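The multiple-measures principle in the approach above can be sketched in code. This is a hypothetical illustration; the indicator names and the two-of-three agreement rule are invented for the example, not drawn from the professional standards the guide cites.

```python
# Hypothetical sketch: course placement based on multiple indicators
# rather than a single test score. Indicator names and the agreement
# rule are invented for illustration.

def ready_for_advanced_course(indicators, required_agreement=2):
    """Recommend placement only when several measures agree."""
    positive = sum(1 for met in indicators.values() if met)
    return positive >= required_agreement

student = {
    "proficient on state accountability test": True,
    "strong prior performance in similar courses": True,
    "teacher recommendation": False,
}
print(ready_for_advanced_course(student))  # True: two of three indicators agree
```

The point of the sketch is only that no single indicator decides the outcome; a real placement process would also revisit the decision as new data arrive, as the guide recommends.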
Recommendation 2. Teach students to examine their own data and set learning goals

Teachers should provide students with explicit instruction on using achievement data regularly to monitor their own performance and establish their own goals for learning. This data analysis process—similar to the data use cycle for teachers described in recommendation 1—can motivate both elementary and secondary students by mapping out accomplishments that are attainable, revealing actual achievement gains and providing students with a sense of control over their own outcomes. Teachers can then use these goals to better understand factors that may motivate student performance and adjust their instructional practices accordingly.

Students are best prepared to learn from their own achievement data when they understand the learning objectives and when they receive data in a user-friendly format. Tools such as rubrics provide students with a clear sense of learning objectives, and data presented in an accessible and descriptive format can illuminate students’ strengths and weaknesses (see recommendation 5 for more information on reporting formats).37 Many practices around data rely on the assumption38 of a relationship between formative assessment and feedback use and student achievement. When combined with clear data, instructional strategies such as having students rework incorrect problems can enhance student learning.39

Level of evidence: Low

The panel judged the level of evidence supporting this recommendation to be low, based on two studies with causal designs that met WWC standards and drawing on additional examples of practices from qualitative and descriptive studies and on their own expertise. One randomized controlled trial that met WWC standards with reservations found positive effects of interventions that combined student analysis of data with other practices, such as teacher coaching, teacher professional development, and/or classroom management interventions; therefore, the panel could not attribute impacts to student data analysis alone.40 A second randomized controlled trial met WWC standards and reported positive effects of a web-based data tool for students, but the size and statistical significance of these effects could not be confirmed by the WWC; therefore, it does not provide the panel with strong causal evidence that having students examine their own data is an effective intervention.41

37. Black et al. (2003).
38. Black and Wiliam (1998) and Kluger and DeNisi (1996) examine the relationship between assessment and student learning in their respective meta-analyses on the topic. However, the studies included in those meta-analyses were outside the date range or otherwise outside the scope of the literature review for this guide, or they used noncausal designs that did not meet WWC evidence standards.
39. Clymer and Wiliam (2007).
40. Phillips et al. (1993).
41. May and Robinson (2007).

Brief summary of evidence to support the recommendation

Two randomized controlled trials that met WWC standards (one with and one without reservations) found positive effects of interventions in which students used their own assessment data. One study found that curriculum-based measurement interventions combined with student analysis
of their own assessment data and feedback from their teachers led to statistically significant gains in student achievement.42 A second study reported statistically significant gains in achievement for students given access to an interactive website reporting student test scores and providing advice for improving those scores. However, the WWC could not confirm the statistical significance of these gains.43 To add detail and specificity to this recommendation, and to supplement the information available in these two studies, the panel relied upon its own expertise and referred to several case studies and descriptive analyses for examples of feedback and for the information needed to construct sample feedback tools.

How to carry out this recommendation

1. Explain expectations and assessment criteria.

To interpret their own achievement data, students need to understand how their performance fits within the context of classroom-level or schoolwide expectations. Teachers should articulate the content knowledge or skills that they expect students to achieve throughout the school year, conveying goals for individual lessons and assignments, as well as goals for the unit and end-of-year performance. Teachers should explicitly describe the criteria that will be used to assess performance toward those goals.

For example, when teachers use a rubric to provide feedback (an example is provided in Example 2), teachers should introduce the rubric at the beginning of the assignment so that students know which criteria are important before they begin working on a task or assignment.44 Rubrics can provide useful feedback on complex skills such as writing an effective essay or term paper, delivering a persuasive speech, or executing a science experiment. Teachers also can have students assess a sample assignment using the rubric to help them better understand the criteria. Once the students’ actual assignments are completed and evaluated, students should receive the completed rubric from the teacher.

Because public school students in many grades are required to take annual standards-based accountability tests in selected subjects, teachers should help students understand the state standards they are expected to meet by regularly revisiting the standards throughout the year. For example, a 5th-grade teacher could spend a few minutes at the beginning of an instructional unit explaining that certain essential concepts in the lesson (e.g., literary devices such as similes) may appear on the annual test. Students could keep a running list of these standards-based concepts throughout the year, using the list as a basis for review before the annual test. Note that making students familiar with content standards is not the same as engaging in extensive practice using problems or tasks designed to mirror the format of a specific test. The latter may result in spurious test-score gains and is not recommended by the panel.45

2. Provide feedback to students that is timely, specific, well formatted, and constructive.

Providing students with thoughtful and constructive feedback on their progress may improve academic achievement.46 Feedback should be designed to help students understand their own strengths and weaknesses, explaining why they received the grades and scores they did and identifying the specific content areas the student should focus on
to improve their scores. Such feedback often has the following characteristics:

• Timely. Feedback should be rapid so that students still remember the task and the skills on which they were being assessed.47 The panel recommends that assessment data be returned to students within a week of collecting the assignment, and sooner when possible.

• Appropriately formatted. When providing feedback, teachers should select a mode of delivery (e.g., rubric based, handwritten, or typed) that best meets students’ needs based on their grade level, the subject area, and the assignment. Typed feedback, for example, may be appropriate in response to students’ larger projects, whereas handwritten feedback may suffice on short assignments and student journals or as supplemental feedback at the end of a rubric-based evaluation. Additionally, teachers’ feedback should be based on a shared understanding of expectations and scoring criteria.

• Specific and constructive. Regardless of the format, feedback should provide concrete information and suggestions for improvement.48 Feedback in the form of explanations, examples, and suggestions for additional practice is more concrete and easier for students to act on than a score or letter grade alone, and it may increase students’ confidence and motivate better performance.49 For this reason, teachers should avoid providing feedback that is exclusively focused on what should have been done or delivers vague praise without specifying why a particular piece of work is praiseworthy.50

3. Provide tools that help students learn from feedback.

Simply giving students assessment data that are accessible and constructive does not guarantee that they will know what to do with the data. Students need the time and tools to analyze the feedback; otherwise, they may simply glance at the overall score without considering why they achieved that score and what they could do to improve.

When providing feedback, teachers should set aside 10 to 15 minutes of classroom instructional time to allow students to interpret and learn from the data. It is important to undertake this reflection during class time, when the teacher can help students interpret feedback and strategize ways to improve their performance. During this time, teachers should have students individually review written feedback and ask questions about that feedback.

Teachers also can provide students with paper- or computer-based tools for interpreting feedback, such as the following:

• a template for listing strengths, weaknesses, and areas to focus on for a given task (see Example 3);51

• a list of questions for students to consider and respond to (e.g., “Can I beat my highest score in the next two weeks?” and “Which skills can I work harder on in the next two weeks?”);52

• worksheets to facilitate reflection about incorrect items (see Example 4);53

47. Black and Wiliam (1998); Stiggins (2007).
48. Black and Wiliam (1998); Brunner et al. (2005).
49. Clymer and Wiliam (2007); Schunk and Swartz (1992).
50. Black et al. (2003); Black and Wiliam (1998); Shepard (1995).
51. Stiggins (2007).
52. Phillips et al. (1993).
53. Stiggins (2007).
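A template of the kind listed above can be generated programmatically. This is a hypothetical sketch; the layout and prompts are invented for illustration, not reproduced from Example 3.

```python
# Hypothetical sketch of a printable reflection template of the kind
# described above (strengths / weaknesses / areas to focus on).
# The layout and prompts are invented, not taken from Example 3.

def reflection_template(task, prompts=("Strengths", "Weaknesses",
                                       "Areas to focus on")):
    """Build a plain-text fill-in form for a given task."""
    lines = [f"Task: {task}", "=" * (6 + len(task))]
    for prompt in prompts:
        lines.append(f"{prompt}:")
        lines.extend(["  - ________________"] * 2)  # blank lines to fill in
    return "\n".join(lines)

print(reflection_template("Persuasive essay draft"))
```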
Suggested Approach. The panel recognizes that instruction time is limited. However, time spent explaining assessment tools and strategies for analyzing feedback is essential to helping students understand their own achievement. Thus, it should be a natural, integral part of the teaching process—not an add-on activity. Incorporating time for students’ analysis of their own data into routine classroom activities may help students develop a habit of learning from feedback, making them more independent as the year progresses. Helping students understand assessment tools and analyze feedback also puts students at the vanguard of the school’s culture of data use.
Recommendation 3. Establish a clear vision for schoolwide data use

Schools must establish a strong culture of data use to ensure that data-based decisions are made frequently, consistently, and appropriately.58 This data culture should emphasize collaboration across and within grade levels and subject areas59 to diagnose problems and refine educational practices.60 Several factors (e.g., planning, leadership, implementation, and attitude) affect the success schools will have with developing and maintaining a data culture. Here, the panel suggests steps schools should take toward establishing their vision, while recognizing that following the suggestions does not guarantee that a strong culture will emerge.

A clear plan for schoolwide data use is essential to developing such a culture. Schools should establish a representative data team to help ensure that data activities are not imposed on educators, but rather are shaped by them.61 This team should develop a written data-use plan that is consistent with broader school and district goals, supports a common language related to data use and teaching and learning concepts, and establishes data use as one of the key responsibilities of an education professional.62

Level of evidence: Low

Believing that a clear vision for data use is essential to educators wishing to improve instruction through interpreting data, the panel drew from its own knowledge and the findings and examples in case studies and descriptive analyses to inform the development of this recommendation. No studies were identified that examine the effects of establishing a data team or creating a data-use plan on student achievement, so the panel judged the level of evidence supporting this recommendation as low.

Brief summary of evidence to support the recommendation

A strong culture of data use, conveyed through a clear schoolwide vision, is critical to ensure that data-based decisions are made routinely, consistently, and effectively. This point is conveyed in a number of studies that use qualitative designs to examine how schools and districts have implemented data use. Appendix D contains two examples of case studies the panel referenced when developing the action steps in this recommendation. One describes how a set of districts and schools has worked to develop achievement goals and to use student data to support progress toward those goals,63 whereas the other describes an example of how one school has its staff share responsibility for data use to avoid burnout.64 However, the panel identified no causal evidence linking the creation of a schoolwide culture or vision to improved student performance.
It is important to note that a data team is a committee of advisors on data use within the school. Additionally, the team represents the entire school community, so decisions should be made in collaboration with the different perspectives

65. Halverson and Thomas (2007); Hill, Lewis, and Pearson (2008); Moody and Dede (2008).
66. Bettesworth (2006).
67. Wayman, Cho, and Johnston (2007); Wayman, Midgley, and Stringfield (2006).
68. Waters and Marzano (2006); Wayman, Cho, and Johnston (2007).

Based on the data team’s discussions, as well as full staff input, the team’s administrator and teachers should write a plan that clearly articulates how the school will use data to support school-level goals for
improving student achievement.69 These goals, developed by school and district leadership, already exist in most schools. To create conditions for effective data use, the data team should briefly revisit the school’s goals to ensure that they are

• attainable, in that they are realistic given existing performance levels;

• measurable, in that they clearly express the parameters of achievement and can be supported by data70; and

• relevant, in that they take into account the specific culture and constraints of the school.71

For example, a school in which half the students can read at grade level may decide to set a long-term goal of having 75 percent of students reading on grade level within five years. It then would seem reasonable for the school to set ambitious but achievable annual goals to increase the share of students reading at grade level by 5 percentage points per year. If the data team determines that the goals do not meet the criteria of seeming attainable, measurable, and relevant, it may wish to establish short- and medium-term goals that do meet these criteria.

With the school’s goals identified and clarified, the data team should prepare a written plan specifying72

• specific actions for using data to make instructional decisions;

• staff and team members responsible for carrying out those actions;

• timelines for executing the actions; and

• how each action helps the school reach its long-term goals.

Example 6 provides a hypothetical plan for tying data use to school goals. The example illustrates how a data team might map a clear rationale from each action to the school’s larger goal of improved reading proficiency, and how each data team member might take responsibility for executing a portion of the larger plan. The panel encourages schools to develop other similar plans, including detailed lists of data-use responsibilities by staff role and timelines for using data, but provides this table as a sample of how an actionable plan might look.

The team should revisit the plan annually,73 using data to determine appropriate changes to meet the needs and goals of the school and its students. Revising the plan in this way mirrors the cycle of instructional improvement, further establishing a culture of data-based decision making throughout the school.

4. Provide ongoing data leadership.

Once the plan is developed, the data team should provide guidance on using data to support the school’s vision, with the ultimate aim of developing the capacity of all school staff to use data. At the outset, members of the data team should regularly interact with school staff about data and its uses, oftentimes serving as data facilitators (see recommendation 4). For example, team members can educate school staff, district representatives, or parents about the school’s vision for data use by having individual or small group meetings

69. Armstrong and Anthes (2001); Mason (2002); Togneri (2003).
70. Datnow, Park, and Wohlstetter (2007); Feldman and Tung (2001); Young (2006).
71. Halverson et al. (2007); Leithwood et al. (2007).
72. Datnow, Park, and Wohlstetter (2007).
73. Wayman, Cho, and Johnston (2007) recommend revisiting the plan frequently. The panel recommends doing so on at least an annual basis.
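The goal arithmetic in the reading example above (50 percent of students at grade level now, 75 percent in five years, so 5 percentage points per year) can be checked with a few lines of code. The function name is invented for illustration.

```python
# Hypothetical sketch of the goal arithmetic in the reading example:
# from 50 percent at grade level to 75 percent over five years.

def annual_targets(current_pct, goal_pct, years):
    """Evenly spaced yearly milestones from current level to goal."""
    step = (goal_pct - current_pct) / years
    return [current_pct + step * y for y in range(1, years + 1)]

print(annual_targets(50, 75, 5))  # [55.0, 60.0, 65.0, 70.0, 75.0]
```

A data team could compare such milestones against each year’s assessment results when deciding whether the long-term goal still meets its attainability criterion.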
Example 6. A sample plan for tying data use to school goals

Action: Plan and facilitate monthly grades 4–6 team meetings to review Ms. Sanders’s data displays and share best practices in mini-lessons co-planned by Mr. Johnson.
Rationale: Focus on areas of greatest student need; calibrate and elevate expectations among teachers; streamline instructional practices.
Responsible: Mike Thompson, grades 4–6 team leader.
Timeline: Hold first meeting by October 10; second by November 15.

Action: Plan and facilitate monthly grades 1–3 team meetings to review Ms. Sanders’s data displays and share best practices in mini-lessons co-planned by Mr. Johnson.
Rationale: Share practices that work; encourage vertical alignment between grades.
Responsible: Beth Miller, grades 1–3 team leader.

Action: Prepare well-chosen data graphs on PowerPoint (state or interim data updates) for monthly grade-level team meetings.
Rationale: Help teachers gain facility in using data; focus teachers’ attention and inquiry on areas of particular strengths and weaknesses in students’ reading skills.
Responsible: Erin Sanders, data facilitator.
Timeline: Carry out monthly; distribute examples at November data team meeting.

Action: Have teachers choose their favorite reading instructional strategy and prepare sample lessons and evidence of student work. Schedule teachers to present these during part of their grade-level team meetings.
Rationale: Share and standardize best practices among classrooms; encourage culture of instructional improvement; reinforce evidence-based practice.
Responsible: Lionel Johnson, reading coach.
Timeline: Bring schedule to November data team meeting; hold first session by October 10.

Action: Register and prepare data team for 4-day offsite workshop on interpreting assessment data, creating data displays, and helping teachers use data daily.
Rationale: Increase ability of data team to understand and use data; develop capacity for distributing leadership within the school.
Responsible: Samantha Roberts, assistant principal.
Timeline: October 15.
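A plan of this shape can also be kept in a simple data structure so the team can review responsibilities at a glance. This is a hypothetical sketch; the rows below paraphrase a few actions from the example, and the grouping logic is invented for illustration.

```python
# Hypothetical sketch: holding a few data-use plan rows (paraphrased
# from the example above) in a simple structure for quick review.

plan = [
    {"action": "Facilitate monthly grades 4-6 team meetings",
     "owner": "Mike Thompson", "due": "October 10"},
    {"action": "Prepare data graphs for grade-level team meetings",
     "owner": "Erin Sanders", "due": "monthly"},
    {"action": "Register data team for offsite data workshop",
     "owner": "Samantha Roberts", "due": "October 15"},
]

def actions_by_owner(plan):
    """Group each owner's responsibilities for quick review."""
    owners = {}
    for row in plan:
        owners.setdefault(row["owner"], []).append(row["action"])
    return owners

for owner, actions in actions_by_owner(plan).items():
    print(f"{owner}: {len(actions)} action(s)")
```

Keeping the plan in one structure makes the annual revision the guide recommends easier: rows can be added, reassigned, or retired without rewriting the whole document.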
focused on these topics. Team members also can

• provide resources and support for data analysis and interpretation, such as information about professional development sessions and access to necessary technologies;

• encourage educators to use data in their daily work by modeling data use strategies;

• create incentives to motivate staff to analyze data (e.g., “Staff Member of the Month” award for excellent data use, recognition in the school newsletter); and

• participate in grade- and subject-level meetings to ensure that structured collaboration time is used effectively (see recommendation 4).

staff use data in ways that advance school goals.74 “Distributed leadership,” a practice often hypothesized as an important characteristic of effective schools, is one way to accomplish this task.75

Potential roadblocks and solutions

Roadblock 3.1. School staff do not have time to develop an additional plan for how to use data.

Suggested Approach. To alleviate the pressure of creating a new plan, the plan for data use could be incorporated into an existing school improvement plan.76 Research also has described schools that viewed this effort as ultimately time efficient, describing their efforts as “making time to save time.”77

Roadblock 3.2. No one is qualified (or wants) to be on the data team.
Roadblock 3.3. The few data-savvy staff at the school are overwhelmed by questions and requests for assistance.78

Roadblock 3.4. The district does not have research and development staff to participate in the school-level data team.
Recommendation 4. Provide supports that foster a data-driven culture within the school

Schools and districts can make concrete changes that encourage data use within schools.81 These changes need to ensure that teachers, principals, and school and district staff have a thorough understanding of their roles in using data, and that they possess the knowledge and skills to use data appropriately. Schools and districts should invest in leadership, professional development, and structured time for collaboration.82 They also may need to invest in additional resources, including relevant technologies83 and specialized staff.84

Level of evidence: Low

Two studies that met WWC standards or that met WWC standards with reservations tested interventions that included coaching and feedback to help teachers interpret and make changes based on assessment data (the interventions included other practices as well).85 These interventions had no discernible effects on student achievement. Although one study also reported that teachers in the coaching group more frequently used pupil observations to modify lessons,86 this outcome was not measured in a way that allowed the authors or the WWC to compute the magnitude or statistical significance of any effect of this change on instructional practice. The panel also identified one correlational study that found a significant positive association between coaching and reading achievement (however, the study design does not permit causal inferences about the effect of coaching).87 Although these studies, supplemented by findings from qualitative analyses and their own expertise, helped the panel develop the steps under this recommendation, the level of evidence supporting this recommendation is low.

81. Knapp et al. (2006); Lachat and Smith (2005); Supovitz (2006); Supovitz and Klein (2003); Wayman, Cho, and Johnston (2007); Wayman and Stringfield (2006).
82. Datnow, Park, and Wohlstetter (2007); Lachat and Smith (2005); Supovitz and Klein (2003); Wayman, Cho, and Johnston (2007); Wayman and Stringfield (2006); Young (2006).
83. Wayman, Stringfield, and Yakimowski (2004).
84. Armstrong and Anthes (2001); Datnow, Park, and Wohlstetter (2007); Supovitz and Klein (2003); Wayman, Cho, and Johnston (2007).
85. Jones and Krouse (1988); Wesson (1991).
86. Jones and Krouse (1988).
87. Marsh et al. (2008).
88. Jones and Krouse (1988); Wesson (1991).
89. Jones and Krouse (1988).

Brief summary of evidence to support the recommendation

Although the panel believes that the steps under this recommendation are essential, and findings of numerous qualitative analyses report that supporting staff in data use is important, limited rigorous evidence exists to demonstrate that schoolwide supports for data use lead to achievement gains. Two studies tested interventions that included coaching and feedback to help teachers interpret and make changes based on assessment data.88 In both cases, the coaching was only one component of the intervention, and the intervention was compared with a competing intervention (as opposed to business as usual). One study compared the students of teachers who received coaching to use data to track student progress and make instructional changes with the students of teachers who received coaching on behavioral management.89 Another compared students of
teachers who received individual mentoring with students of teachers who received group mentoring.90 The studies found no discernible effects of the interventions that included a coaching component. The panel identified no rigorous studies identifying the effects on student achievement of other schoolwide supports for data use. To shape this recommendation, panelists relied on their own expertise as well as examples of data leadership and professional development opportunities drawn from noncausal studies and implementation guides.

How to carry out this recommendation

appropriately so that staff do not become too dependent on facilitators.

Data facilitators should meet at least monthly with grade- and subject-level teacher teams, although teacher teams should meet independently more frequently (see recommendation 1). During these meetings, data facilitators should

• model data use and interpretation, tying examples to the school's vision for data use and its learning goals;

• model how to transform daily classroom practices based on data-driven diagnoses of student learning issues;
need to provide the same level of guidance and support as indicated earlier.

2. Dedicate structured time for staff collaboration.

Encouraging teachers to work collaboratively with data helps make data use an established part of a school's culture.96 Collaborative data analysis can highlight achievement patterns across grade levels, departments, or schools97 and can engender the kind of consistency of instructional practices and expectations that often characterizes high-performing schools.98

Structured time should be set aside for teachers and school staff to collaboratively analyze and interpret their students' achievement data, and to identify instructional changes.99 This time also can be used for professional development on data use. Ideally, this structured time should occur a few times each week, depending on the individual school's needs. It is important that schools make these collaborative meetings a priority.

Because school schedule constraints vary, principals can explore different options for scheduling collaborative time. For example, one school has dedicated biweekly two-hour meetings for staff to examine student data and identify next instructional steps.100 Another school adjusted weekly class schedules to have a common break for teachers to examine data collaboratively.101

The collaborative team meetings should include the following components:

• Preparation. Prior to these meetings, educators should set an agenda that focuses on using the most updated data relative to a specific, timely topic. It is too overwhelming to attempt to address all student achievement concerns at once; targeted discussions are key to successful data meetings.

• Analysis. During these meetings, teachers should follow the cycle of inquiry, using data to state hypotheses about their teaching and learning practices and then testing those hypotheses (see recommendation 1).102

The skills that educators need in order to use data to identify achievement problems and develop instructional solutions are complex. To enhance data-literacy and data-use skills in a way that is consistent with school goals, it is essential that schools and districts provide ongoing professional development opportunities for

96. Feldman and Tung (2001).
97. Cromey and Hanson (2000).
98. Bigger (2006); Herman and Gribbons (2001); Huffman and Kalnin (2003); Lachat and Smith (2005); Wayman, Cho, and Johnston (2007).
99. Anderegg (2007); Bigger (2006); Cromey and Hanson (2000); Gentry (2005); Herman and Gribbons (2001); Huffman and Kalnin (2003); Ingram, Louis, and Schroeder (2004); Supovitz and Klein (2003); Wayman and Stringfield (2006).
100. Knapp et al. (2006).
101. Mandinach et al. (2005).
102. Armstrong and Anthes (2001).
administrators, principals, teachers,103 and classroom support specialists.104 Without school- and district-level support for these opportunities, analysis of data may be inconsistent and potentially ineffective.

The skills needed for effective data use range from data entry to data analysis to leadership; they also vary depending on professional roles (i.e., teacher, administrator, or technology support staff), content area and curriculum, experience with data analysis, and level of comfort with technology.105 For most staff, professional development should focus on how users will apply the data to their daily work and instructional planning, rather than on the functionality of the system.106 Staff with the specific role of maintaining the system, however, should receive specialized training that prepares them to maintain the system for all users.

Ideally, all staff, particularly principals, should be familiar with components of the data system, data culture, and data use. Table 3 highlights some potential professional development opportunities to prioritize for staff based on their roles with the data system and data use.

Training for data use often is synchronous with technology training. Creating staff confidence in, and comfort with, available data systems should increase the chance that data will be used regularly and well.107 Related technology training should be implemented in small doses, however, and occur close to implementation of the data system or related system enhancements.108 In this way, staff can more easily connect their training to daily activities109 and not become overwhelmed by training sessions. (See recommendation 5 for more details on preparing for implementation of technology systems.)

It is important to recognize that professional development responsibility does not end after the initial training of staff and deployment of the district's data system. Users also may require ongoing technical assistance, and additional trainings will be needed when introducing system enhancements. Professional development opportunities, therefore, should be continuous, offered at least monthly throughout the school year by staff experienced with assessment and data-literacy skills, technology use, and the development of cultures of effective data use. Professional development staff should consider offering online learning modules as refresher courses or self-paced, independent training opportunities after initial in-person training sessions to moderate costs and offer flexibility in handling scheduling challenges and varying levels of technology use.

Potential roadblocks and solutions

Roadblock 4.1. It is difficult to locate professional development that is specific to the needs of the school.

Suggested Approach. With the assistance of the data team and data facilitators, schools should determine their needs and discuss these with their professional development provider. In this way, schools can ensure that the provider teaches skills that meet the needs of school staff. If a session cannot be tailored to the needs of the school or district, schools should

103. Wayman, Cho, and Johnston (2007).
104. Feldman and Tung (2001).
105. Bigger (2006); Cromey and Hanson (2000); Herman and Gribbons (2001); Huffman and Kalnin (2003); Knapp et al. (2006); Lachat and Smith (2005); Wayman, Cho, and Johnston (2007).
106. Wayman and Cho (2008).
107. Supovitz and Klein (2003).
108. Arnold (2007); Cromey and Hanson (2000); Gentry (2005).
109. Anderegg (2007); Ingram, Louis, and Schroeder (2004); Wayman, Cho, and Johnston (2007).
Table 3. Potential professional development and training opportunities, by staff role*a (rows include "Fostering a culture of data-based decision making" and "Interpreting data in an educational context")

* Other staff can include data facilitators, classroom support specialists, administrative assistants, and counselors.
a. Examples of suggested professional development and training opportunities are drawn and adapted from Chrismer and DiBara (2006); Knapp et al. (2006); Marsh et al. (2008); McREL (2003); Nabors Oláh, Lawrence, and Riggan (2008); and Wayman, Cho, and Johnston (2007).
Recommendation 5. Develop and maintain a districtwide data system

Districts should develop and maintain high-quality data systems that enable all decision makers to access the necessary data in a timely fashion. A high-quality data system is comprehensive and integrated, linking disparate forms of data for reporting and analysis to a range of audiences.112 To help ensure that the relevant staff in a school district will rely on the data system to inform their decisions, district administrators should involve a variety of stakeholders when determining which functions the system should provide. Districts and schools need to secure financial and human resources to develop safeguards that ensure data are timely, relevant, and useful to educators.

Level of evidence: Low

Recognizing that it is difficult if not impossible to test the impacts of data systems on student achievement empirically, the panel based this recommendation on a combination of its expertise and its review of descriptive studies and case studies. The studies did not use a causal design that would provide evidence directly linking the use of an integrated data system with improved academic outcomes; hence, the level of evidence to support this recommendation is low.

Brief summary of evidence to support the recommendation

A high-quality, districtwide data system is necessary to provide teachers with the information they need to modify instruction and improve student achievement. To guide this recommendation, the panel referenced descriptive and other noncausal studies that (1) discussed how schools or districts collaboratively created and used data systems,113 (2) described the importance or provided examples of selecting a system that meets varied users' needs,114 (3) explained the successes and challenges schools and districts experienced when implementing their data systems,115 and (4) advocated the importance or gave examples of system maintenance and security relative to data quality.116 Appendix D provides details on the characteristics of data systems described in these studies.

How to carry out this recommendation

1. Involve a variety of stakeholders in selecting a data system.

Districts should establish a data-system advisory council that includes representatives from key stakeholder groups (see Table 4). These representatives should understand the importance of data use to make instructional decisions, possess leadership and time-management skills, and be able to effectively communicate

112. Mieles and Foley (2005); Wayman, Stringfield, and Yakimowski (2004).
113. Choppin (2002); Lachat and Smith (2005); Mieles and Foley (2005); Thorn (2001); Wayman, Cho, and Johnston (2007); Wayman and Conoly (2006); Wayman and Stringfield (2006); Wayman, Stringfield, and Yakimowski (2004).
114. Breiter and Light (2006); Brunner et al. (2005); Choppin (2002); Datnow, Park, and Wohlstetter (2007); Kerr et al. (2006); Long et al. (2008); Mieles and Foley (2005); Thorn (2001); Wayman and Cho (2008); Wayman, Cho, and Johnston (2007); Wayman, Stringfield, and Yakimowski (2004).
115. Long et al. (2008); Wayman, Cho, and Johnston (2007); Wayman, Stringfield, and Yakimowski (2004).
116. Long et al. (2008); Mason (2003); Mieles and Foley (2005); Wayman and Cho (2008); Wayman, Cho, and Johnston (2007).
information to other educators. Responsibilities could include the following:117

• developing roles and structures to oversee the district's commitment to data quality and use;

• providing guidance about the requirements and design of the data system;

• overseeing system development; and/or

• serving as the liaison between the council and its respective stakeholder groups.

Table 4 illustrates the needs that different stakeholder groups might have in using a districtwide data system.

Table 4. Stakeholder needs in using a districtwide data system

Administrators and principals: Compare rates of discipline referrals among different groups of students;a discuss student progress and classroom pedagogy with faculty.b
Counselors: Place students into correct classes based on prior performance and current schedule constraints; discuss student progress and needs with other building educators.
Information technology staff: Assess the interoperability of data systems; identify project scope; build strong project plans; establish standards; manage differentiated access by stakeholders; provide support, maintenance, and enhancements over time; identify challenges that might prevent or hinder systems from working together for timely access to information.
Support staff: Use attendance and assessment data to identify students for targeted interventions; work with faculty and administration on data use strategies and changing practice.c
Teachers: Identify student and class strengths and weaknesses; interact with other staff about student progress.d
Parents: Track immediate student outcomes and compare student performance over time.

The panel recommends that the data system advisory council meet frequently (at least bimonthly, and more frequently if possible). Meetings should focus on suggestions for improving the data system, addressing concerns from users about the data system, and identifying professional development needs.

Between meetings, members of the data system advisory council should solicit feedback from their respective stakeholder groups to better understand (1) how data are being used, (2) concerns users have about the system, and (3) how the system could be used in the future. The council should designate one or two of its district-employed members or identify a full-time individual to serve as project manager. These leaders should be tasked with overseeing system development and supporting the execution of the council's short- and long-term goals. In this way, troubleshooting and decisions regarding the

117. Mieles and Foley (2005); Thorn (2001); Wayman and Conoly (2006); Wayman, Stringfield, and Yakimowski (2004).
data system can be addressed in a timely, efficient manner outside of council meetings. Recognizing that these designated staff may have other responsibilities, administrators should adjust staff responsibilities to allow for sufficient time to execute project management tasks.

Sample existing and new data elements to consider121

• State assessment data
• Interim or benchmark assessment data
staff with new technology. This approach allows time for staff to adjust to the system, as well as flexibility to modify the system in response to user feedback. The rollout plan should be long range (e.g., spread out over the course of one academic year) and include specific plans (with activities and timelines) for maintenance, training, and end-user support.138 Further, these opportunities should be tightly linked with specific tasks that are immediately expected of the user, as per the district plan.139 It is easy to underestimate the time needed to prepare existing data and roll out the system, however, and the implementation plan would benefit from an inflated estimate of the rollout timeline.140

The plan also should include professional development and training opportunities tailored to staff needs by considering their technological skills, roles, responsibilities, and the content areas in which they work.141 Professional development about

138. Ibid.
139. Wayman and Cho (2008).
140. Mieles and Foley (2005).
141. Long et al. (2008); Mason (2003); McREL (2003); Wayman and Cho (2008). Wayman, Cho, and Johnston (2007) conclude that training should be tailored to staff roles (but do not discuss developing a formal training plan).
the data system should discuss data transparency and safety, system uses and capabilities, and ongoing opportunities for integrating data into instructional practice. (See recommendation 4 for more information about professional development.) The plan also should recognize that implementation responsibility does not end after initial training of staff and deployment of the system. Users may require ongoing technical assistance, and additional trainings will be needed when introducing system refinements and enhancements.

Potential roadblocks and solutions

Roadblock 5.1. The data system's technological components are challenging for staff who do not consider themselves technologically savvy or are skeptical of using new technologies.

Suggested Approach. The data system should not be implemented and used without accompanying training and support services. When the district is preparing to roll out its data system, the council should ensure that appropriate professional development and technology training sessions are available for a variety of skill levels (see recommendation 4 for more details).142 In this way, all stakeholders have the opportunity to learn about the data system and develop the skills necessary to utilize the system. District resources should be allocated to ensure that principals and data facilitators can support teachers' use of data within the school building,143 and a mechanism for providing assistance on an as-needed basis (e.g., a technology help desk) should be in place as soon as educators start using the system.

Roadblock 5.2. The implementation plan contains many technological requirements, but little information on how the system will be used.

Suggested Approach. Before purchasing or developing a data system, ensure that the implementation plan addresses system requirements as they relate to the teaching and learning goals of the district.144 Be very careful that educational goals are front and center in this plan; the district advisory council should never put technological requirements and considerations for a system before the educational goals the system supports. If the plan clearly articulates how the system relates to learning goals, users will better understand how the system will be used and why that use will support student achievement.145

Roadblock 5.3. A data system seems like a financial luxury to many individuals in the district.

Suggested Approach. For districts that prioritize, and indicate as a priority, the use of student data to meet educational improvement goals, a data system must equally be a priority. Ensure that the district's plan describes how a data system supports these goals in a way that clearly explains and illustrates the necessity of the system, in order to foster support for it.

142. Wayman and Cho (2008).
143. Kerr et al. (2006).
144. Wayman and Cho (2008); Wayman, Cho, and Johnston (2007); Wayman and Conoly (2006).
145. Breiter and Light (2006); Wayman and Cho (2008); Wayman, Cho, and Johnston (2007).
Glossary of terms as used in this report

Common assessments are those assessments administered in a routine, consistent manner across a state, district, or school. Under this definition, common assessments include annual statewide accountability tests and commercially produced tests, interim assessments, benchmark assessments, and end-of-course tests, as long as they are administered consistently and routinely to provide information that can be compared across classrooms and schools.

Correlational studies look for relationships among variables. Although correlational studies can suggest that a relationship between two variables exists, they do not support an inference that one variable causes a change in another.146

The cycle of inquiry is a process in which educators analyze data, such as demographic, perceptual, school process, and student achievement data, in order to understand how these elements are interrelated and what they suggest about students' learning needs. As a multistep process, the cycle of inquiry often involves analyzing data to better understand student needs, developing hypotheses about instructional practice, formulating and implementing action plans to improve student learning and achievement, and then once again analyzing data to evaluate student progress and inform next steps.147

Data are empirical pieces of information that educators can draw upon to make a variety of instructional and organizational decisions. By themselves, data are not evidence; it takes concepts, theories, and interpretive frames of reference to make sense of data.148 Education-related data may be student focused (e.g., demographics, attendance and behavior, performance on standardized tests) or administrative (e.g., financial and staffing information) in nature but are not limited to these types. Data are typically maintained by state and local education agencies, districts, schools, or teachers (see data warehouse).

Data-based decision making in education refers to teachers, principals, and administrators systematically collecting and analyzing various types of data, including demographic, administrative, process, perceptual, and achievement data, to guide a range of decisions to help improve the success of students and schools. Other common terms include data-driven decision making, data-informed decision making, and evidence-based decision making.

The data culture is a learning environment within a school or district that includes attitudes, values, goals, norms of behavior, and practices, accompanied by an explicit vision for data use by leadership, that characterize a group's appreciation for the importance and power that data can bring to the decision-making process. It also includes the recognition that data collection is a necessary part of an educator's responsibilities and that the use of data to influence and inform practice is an essential tool that will be used frequently.

The variables that make up a data system are known as data elements or data indicators.

A data facilitator is an individual charged with helping schools or districts use data effectively to make decisions. Often, data facilitators organize school-based data teams, lead practitioners in a collaborative inquiry process, help interpret data, or educate staff on using data to

146. Van Wagner (n.d.).
147. Halverson, Prichett, and Watson (2007); Herman and Gribbons (2001); Huffman and Kalnin (2003); Fiarman (2007).
148. Knapp et al. (2006).
improve instructional practices and student achievement.

The ability to ask and answer questions about collecting, analyzing, and making sense of data is known as data literacy. Widespread data literacy among teachers, administrators, and students is a salient characteristic of a data-driven school culture.

Data quality refers to the reliability and validity of collected data.

As school-based groups of educators who come together to analyze data and help one another use data effectively, data teams often include a school's principal, instructional leader(s), and several teachers. Such teams may lead teachers in using achievement data to identify and respond to students' learning needs through instructional modifications.

A data warehouse is a computer system that stores educational information from several sources and integrates it into a single electronic source. Data warehouses are designed to allow the manipulation, updating, and control of multiple databases that are connected to one another via individual student identification numbers. Capabilities of data warehouses often extend beyond data storage, however, and may include data management and reporting systems used for retrieving and analyzing data.149

programs, and other materials shape the context in which work is completed.

Formative assessment is a process that is intended to provide feedback to teachers and students at regular intervals during the course of instruction. The purpose of formative assessment is to influence the teaching and learning process so as to close the gap between current learning and a desired goal. Assessments used for formative purposes, often called formative assessments, are those that are "given in the classroom by the teacher for the explicit purpose of diagnosing where students are in their learning, where gaps in knowledge and understanding exist, and how to help teachers and students improve student learning. The assessment is embedded within the learning activity and linked directly to the current unit of instruction."151 However, because most assessments can be used in both formative and summative ways, the term formative refers less to a particular type of assessment than to the purposes for which the assessment is used.

A hypothesis is a "tentative assumption made in order to draw out and test its logical or empirical consequences."152 Within the cycle of inquiry, it is an evidence-based assumption about students' learning needs that teachers can test using instructional modifications and follow-up data about student performance.

149. Mieles and Foley (2005); Wayman, Stringfield, and Yakimowski (2004).
150. Spillane, Halverson, and Diamond (2004).
151. Perie, Marion, and Gong (2007), p. 3.
152. Merriam-Webster Online Dictionary (2009).
153. Perie, Marion, and Gong (2007).
Appendix A. Postscript from the Institute of Education Sciences

What is a practice guide?

The health care professions have embraced a mechanism for assembling and communicating evidence-based advice to practitioners about care for specific clinical conditions. Variously called practice guidelines, treatment protocols, critical pathways, best practice guides, or simply practice guides, these documents are systematically developed recommendations about the course of care for frequently encountered problems, ranging from physical conditions, such as foot ulcers, to psychosocial conditions, such as adolescent development.155

Practice guides are similar to the products of typical expert consensus panels in reflecting the views of those serving on the panel and the social decisions that come into play as the positions of individual panel members are forged into statements that all panel members are willing to endorse. Practice guides, however, are generated under three constraints that do not typically apply to consensus panels. The first is that a practice guide consists of a list of discrete recommendations that are actionable. The second is that those recommendations taken together are intended to be a coherent approach to a multifaceted problem. The third, which is most important, is that each recommendation is explicitly connected to the level of evidence supporting it, with the level represented by a grade (strong, moderate, or low).

The levels of evidence, or grades, are usually constructed around the value of particular types of studies for drawing causal conclusions about what works. Thus, one typically finds that a strong level of evidence is drawn from a body of randomized controlled trials, the moderate level from well-designed studies that do not involve randomization, and the low level from the opinions of respected authorities (see Table 1). Levels of evidence also can be constructed around the value of particular types of studies for other goals, such as the reliability and validity of assessments.

Practice guides also can be distinguished from systematic reviews or meta-analyses such as What Works Clearinghouse (WWC) intervention reviews or statistical meta-analyses, which employ statistical methods to summarize the results of studies obtained from a rule-based search of the literature. Authors of practice guides seldom conduct the types of systematic literature searches that are the backbone of a meta-analysis, although they take advantage of such work when it is already published. Instead, authors use their expertise to identify the most important research with respect to their recommendations, augmented by a search of recent publications to ensure that the research citations are up-to-date. Furthermore, the characterization of the quality and direction of the evidence underlying a recommendation in a practice guide relies less on a tight set of rules and statistical algorithms and more on the judgment of the authors than would be the case in a quality meta-analysis. Another distinction is that a practice guide, because it aims for a comprehensive and coherent approach, operates with more numerous and more contextualized statements of what works than does a typical meta-analysis.

Thus, practice guides sit somewhere between consensus reports and meta-analyses in the degree to which systematic processes are used for locating relevant research and characterizing its meaning.

155. Field and Lohr (1990).
Practice guides are more like consensus panel reports than meta-analyses in the breadth and complexity of the topic that is addressed. Practice guides are different from both consensus reports and meta-analyses in providing advice at the level of specific action steps along a pathway that represents a more-or-less coherent and comprehensive approach to a multifaceted problem.

Practice guides in education at the Institute of Education Sciences

IES publishes practice guides in education to bring the best available evidence and expertise to bear on the types of systemic challenges that cannot currently be addressed by single interventions or programs. Although IES has taken advantage of the history of practice guides in health care to provide models of how to proceed in education, education is different from health care in ways that may require that practice guides in education have somewhat different designs. Even within health care, where practice guides now number in the thousands, there is no single template in use. Rather, one finds descriptions of general design features that permit substantial variation in the realization of practice guides across subspecialties and panels of experts.156 Accordingly, the templates for IES practice guides may vary across practice guides and change over time and with experience.

156. American Psychological Association (2002).

The steps involved in producing an IES-sponsored practice guide are first to select a topic, which is informed by formal surveys of practitioners and requests. Next, a panel chair is recruited who has a national reputation and up-to-date expertise in the topic. Third, the chair, working in collaboration with IES, selects a small number of panelists to coauthor the practice guide. These are people the chair believes can work well together and have the requisite expertise to be a convincing source of recommendations. IES recommends that at least one of the panelists be a practitioner with experience relevant to the topic being addressed. The chair and the panelists are provided a general template for a practice guide along the lines of the information provided in this appendix. They also are provided with examples of practice guides. The practice guide panel works under a short deadline of six to nine months to produce a draft document. The expert panel members interact with and receive feedback from staff at IES during the development of the practice guide, but they understand that they are the authors and, thus, responsible for the final product.

One unique feature of IES-sponsored practice guides is that they are subjected to rigorous external peer review through the same office that is responsible for independent review of other IES publications. A critical task of the peer reviewers of a practice guide is to determine whether the evidence cited in support of particular recommendations is up-to-date and that studies of similar or better quality that point in a different direction have not been ignored. Peer reviewers also are asked to evaluate whether the evidence grade assigned to particular recommendations by the practice guide authors is appropriate. A practice guide is revised as necessary to meet the concerns of external peer reviews and gain the approval of the standards and review staff at IES. The process of external peer review is carried out independent of the office and staff within IES that instigated the practice guide.

Because practice guides depend on the expertise of their authors and their group decision making, the content of a practice guide is not and should not be viewed as a set of recommendations that in every case depends on and flows inevitably from scientific research. It is not only possible but also likely that two teams of recognized experts working independently to produce a practice guide on the same topic would generate products that differ in important respects. Thus, consumers of practice guides need to understand that they are, in effect, getting the advice of consultants. These consultants should, on average, provide substantially better advice than an individual school district might obtain on its own because the authors are national authorities who have to reach agreement among themselves, justify their recommendations in terms of supporting evidence, and undergo rigorous independent peer review of their product.

Institute of Education Sciences
Appendix B.
About the authors

Dr. Halverson is a former high school teacher, school technology specialist, curriculum director, and school administrator.
Appendix D.
Technical information on the studies

The body of research on how educators use data to make instructional decisions consists mainly of studies that do not use a causal design (such as qualitative and descriptive studies), as well as secondary analyses (such as literature reviews, meta-analyses, and implementation guides). Most of the literature consulted provides context for and examples of the recommended steps. In drawing from this research to formulate this guide, the panel developed recommendations that are accompanied by low evidence ratings, because few studies used causal designs testing the effectiveness of these recommendations. Of those studies that used causal designs, four met WWC standards with or without reservations.157 None of those four directly tested the effectiveness of the discrete practices recommended by the panel (i.e., the experimental condition in the studies combined a recommended practice with other aspects, which means that the panel cannot attribute effects observed in the studies to the practices they advise).

157. Jones and Krouse (1988); May and Robinson (2007); Phillips et al. (1993); Wesson (1991).

This appendix describes the content and findings of some of the studies the panel used to inform its recommendations. It highlights how schools have implemented and are using processes for making instructional changes based on student data and also discusses the findings of causal studies as they relate to the panel's recommendations. For each recommendation, this appendix also presents a summary of one or more key studies both to illustrate how the study supports the panel's recommendation and to provide further examples for the reader.

Recommendation 1.
Make data part of an ongoing cycle of instructional improvement

Level of evidence: Low

For this recommendation, the panel drew on its own expertise as well as examples within studies that used qualitative designs to describe how educators have implemented an inquiry cycle for data use. These resources provided needed details about the inquiry cycle, especially when, examining the available evidence, the panel determined that no studies rigorously tested the effect of using an inquiry cycle as a framework for data use on student achievement. One study, summarized below, illustrates how such a cycle can be implemented and indicates the types of data that teachers and administrators wish to use as they examine performance, develop hypotheses, and modify instruction.

Example of a study that describes districts that make data part of an ongoing cycle of instructional improvement.

In a combined case study of two groups of schools, Herman and Gribbons (2001) describe how the districts implemented an inquiry process, detailing the processes for assessing student performance, understanding areas of curriculum strengths and weaknesses, and making curricular changes to address those strengths and weaknesses. The researchers coached the schools through implementing an inquiry process designed to raise student achievement. Although the panel recognizes that coaching of this type will not be available to all schools or districts that implement an inquiry cycle for data use, this example illustrates one way that schools could implement such a cycle in the absence of coaching.

The researchers had the districts begin by assembling data from a variety of sources
(recommendation 1, action step 1). Available data were categorized as follows:

• achievement on state- and district-required tests;

• language proficiency;

• demographics;

• program participation (e.g., Title I, gifted, special education); and

• attendance and course history (in secondary schools).

To encourage study schools to initiate their inquiry processes and assist them with measuring student progress (recommendation 1, action step 2), the researchers asked schools to begin their data analysis by reflecting on three descriptive questions: (1) How are we doing? (2) Are we serving all students well? and (3) What are our relative strengths and weaknesses? Schools were given a report card, which summarized existing data in the categories listed above, as a tool for school administrators to communicate about the process and initiate discussions about needs and goals with staff and parents. Based on these initial measures, the schools developed hypotheses (recommendation 1, action step 2) about student achievement. For example, one secondary school noticed that most of the students had not come from the typical feeder school and had concerns about whether a discontinuity of curriculum for students not coming via the typical route might cause achievement problems. The school hypothesized that students who had attended the local middle school might have higher achievement on some measures than would students from a different background. The school then engaged in a comparison of the achievement of students who fed into the school from different locations.

After testing this hypothesis, the secondary school discovered that students being bused from more remote locations had particular problems in 10th-grade math achievement. Upon further discussion and analysis of this lesson from the data (recommendation 1, action step 3), the school discovered a potential curriculum problem. The school conducting the analysis used a nontraditional math sequence, which was aligned to the curriculum from the local middle school because it offered the first course in that sequence before sending students to high school, but students from other areas took a different course, resulting in a discontinuity of curriculum for those students. In fact, similarly bused students who attended the last year of middle school at the traditional feeder school did not have problems in 10th-grade math that were as severe as those of their bused peers who came from a different middle school. Therefore, the school decided to modify instruction (recommendation 1, action step 3) by providing a spring and summer course for students from nontraditional feeder schools who failed the first semester of math. The school also provided additional curriculum supports to help bring the students up to speed with their peers.

Finally, in keeping with the cyclical nature of the inquiry process, school staff assessed the effectiveness of the instructional modification by examining data from students who took the new course.

Recommendation 2.
Teach students to examine their own data and set learning goals

Level of evidence: Low

The panel identified two randomized experiments that met WWC standards (one of these with reservations) while testing the effectiveness of instructional practices
Table D1. Studies cited in recommendation 2 that meet WWC standards with or without reservations

Phillips et al. (1993)
  Population: General education classrooms in a southeastern, urban school district
  Grade: 2–5
  Intervention: (1) Curriculum-based measurement (CBM) combined with instructional recommendations and peer tutoring assignments. CBM consisted of biweekly assessments that provided information about trend scores and students to watch.
  Comparison: (3) Control group with which teachers used their conventional practices for planning and monitoring.
  Outcome: Number of digits correct on the Math Operations Test–Revised.
  Results: (1) vs. (2): +41, ns; (1) vs. (3): +107, sig; (2) vs. (3): +51, ns

May and Robinson (2007)a
  Population: Randomly selected districts in Ohio
  Grade: High school students and teachers
  Intervention: Personalized Assessment Reporting System (PARS), a report of the Ohio graduation test (OGT) for teachers, parents, and students with colorful and graphic summaries of student performance, and an interactive website with advice for students to improve their scores.
  Comparison: Standard OGT reports for teachers, parents, and students with less color and graphics. All districts (including treatment) could access a website of practice tests.
  Outcome: (1) OGT scaled scores and (2) OGT retake scores (among students failing at least one subtest on the first try)
  Results: (1) Authors report no significant difference between students in treatment and comparison districts. (2) PARS students were more likely than control students to retake the test and to score higher in math, science, and social studies.

ns = not significant
sig = statistically significant

a. May and Robinson (2007) did not report the means and standard deviations needed for the WWC to calculate effect sizes or confirm the statistical significance of the authors' claims.
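The note above turns on a simple computation: a standardized effect size is derived from each group's mean, standard deviation, and sample size, so a study that omits means and standard deviations leaves reviewers unable to recover it. As a hedged illustration only (the numbers below are invented, not taken from either study, and the WWC's own procedures add further adjustments), a pooled-standard-deviation effect size such as Cohen's d is computed like this:

```python
import math

def cohens_d(mean_t, sd_t, n_t, mean_c, sd_c, n_c):
    """Standardized mean difference between a treatment and a control group.

    Divides the raw mean difference by the pooled standard deviation;
    this is the quantity that cannot be recovered when a study reports
    only whether a difference was statistically significant.
    """
    pooled_sd = math.sqrt(
        ((n_t - 1) * sd_t**2 + (n_c - 1) * sd_c**2) / (n_t + n_c - 2)
    )
    return (mean_t - mean_c) / pooled_sd

# Hypothetical groups: treatment mean 75 (SD 10, n = 30) vs. control mean 70 (SD 10, n = 30).
d = cohens_d(75, 10, 30, 70, 10, 30)
print(round(d, 2))  # 0.5
```

Reporting only "significant" or "not significant" discards exactly these inputs, which is the gap the WWC flagged for May and Robinson (2007).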
that included student self-examination of assessment data among other elements.158 However, neither study tested the sole effect of student data use; rather, students' involvement with their own data was part of multifaceted interventions in which teachers and specialized staff also were using student data (Table D1). In the first study, there were large effects on student achievement, one of which was statistically significant. Authors of the second study also reported significant achievement effects, but the WWC could not confirm that finding because the study did not report the means and standard deviations used to calculate the effects.

158. May and Robinson (2007); Phillips et al. (1993).

In the first study, Phillips et al. (1993) compared two curriculum-based measurement (CBM) interventions, both of which included a student feedback component, to a non-CBM condition. The study reported large positive effects of both CBM interventions, but only the comparison of CBM combined with teacher feedback on instructional recommendations versus the non-CBM condition was statistically significant.159 Students analyzing their own performance in this study reportedly reflected on their data using questions such as "Can I beat my highest score in the next two weeks?" and "Which skills can I work harder on in the next two weeks?" Teacher feedback included instructing students on how to interpret their progress graphs and skills profiles as well as coaching students to ask questions about their data to diagnose areas for improvement.

159. Phillips et al. (1993).

The second experiment compared two school districts in Ohio, both of which released reports about student performance on an annual state test to teachers, parents, and students. An interactive website used by these districts also allowed students in the treatment condition to access directions on how to improve their scores and skills through online tutorials and question-and-answer sessions.160 Although the authors reported that students in the treatment condition were more likely than other students to retake the test after failing at least one subtest (and to have higher scores in math, science, and social studies when they did retake the test), the study did not report the means and standard deviations of the outcome measures, so the WWC was not able to verify statistical significance.

160. May and Robinson (2007).

To provide readers with a sense of how students use data and teachers provide feedback, the panel offers the following example from a study that used a less rigorous design.

Example of a study that describes how a teacher can explain expectations, provide timely and constructive feedback, and help students learn from that feedback.

Clymer and Wiliam's (2007) pilot study of a standards-based grading system in a suburban Pennsylvania 8th-grade classroom is closely related to the panel's first two suggested action steps in recommendation 2. The teacher in the study mapped 10 content standards to five marking periods and identified tasks and skills for students to improve their proficiency on each standard. The teacher then developed a performance-rating system using a colored "stoplight" to reflect beginning knowledge (red), developing knowledge (yellow), or mastery (green) of these standards. The colored categories translated into numeric scores at the end of each marking period and were aggregated to generate a student's overall grade in the course.

The teacher explained expectations (recommendation 2, action step 1) by sharing the content standards and corresponding ratings with the students and explaining that grades would be based on understanding of the material at the end of each marking period. Rather than assigning grades, the teacher provided feedback (recommendation 2, action step 2) to students with weekly reports on their progress toward each standard (using the colored stoplight) and helped students learn from that feedback (recommendation 2, action step 3) by encouraging them to revise their work or complete additional assignments to demonstrate better mastery in red and yellow areas. The panel considers this type of feedback to be both timely and constructive. The study also suggested that the teacher provided tools to help students learn from this feedback, but it did not describe the tools or feedback process in detail.

The authors reported that the class in the pilot study showed greater achievement gains in science over the course of a school year than did a similar class not participating in the pilot, although they caution that the design of the study means that these results may not be generalizable to other classrooms. When surveyed, students participating in the study also reported that receiving teacher feedback about how to correct their performance, as well as their accuracy, was helpful.

Recommendation 3.
Establish a clear vision for schoolwide data use

Level of evidence: Low

The panel used several studies with qualitative designs as resources for information on how some schools have implemented practices similar to those it recommends, and for concrete examples to clarify its suggested action steps. This section provides brief overviews of specific qualitative studies that showcase examples of how the recommended action steps have been implemented. No studies examined by the panel used a causal design to examine how establishing a vision for data use affects student achievement.

Examples of establishing and depending on schoolwide leadership for continuous data use.

A case study by Halverson et al. (2007) examined the practices of four schools recognized for their strong leadership in using data to make instructional decisions (while also recording student achievement gains). The researchers gathered data through structured interviews with principals and other school leaders as well as through observations of staff meetings and events relevant to data use.

In these four schools, principals and teachers met regularly to reflect on assessment results and to discuss how to modify practice. Administrators provided activities for teachers and principals to work together to discern patterns in the data and to develop hypotheses and courses of action to address perceived needs for instructional change. At several school-level faculty meetings throughout the year, staff revisited the goals. Faculty meetings around data occurred at least quarterly in study schools, and one school had weekly meetings focused on students' behavioral data. Staff involved in school-level data examination and instructional change decisions included principals, classroom teachers, special education teachers, and school psychologists. Some examples of methods that principals used to encourage their staff to take leadership for data use included scheduling small team meetings for all teachers in a given grade; inviting all staff to beginning- and end-of-year meetings at which the school used achievement data to assess progress; and asking teachers to use annual assessment data to identify areas in which the current curriculum had too much, or too little, emphasis on required concepts.

Example of how schools could approach defining teaching and learning concepts.
Wayman, Cho, and Johnston (2007) conducted a case study of how a school district uses, and could more efficiently use, data for instructional decisions. The authors indicated that districts or systems in which staff do not have a shared definition of teaching and learning will experience barriers and challenges to agreeing on learning goals, and they specifically advocated that educators should begin by answering four questions about data and instruction: "(1) What do we mean by learning and achievement? (2) How will we conduct and support teaching and learning? (3) How will we know teaching and learning when we see it? (4) What action will we take based on our results?" (p. 42). The panel provides these questions as examples but recognizes that the answers to these questions will vary widely as schools and districts respond in ways that account for their local circumstances.

Example of districts that develop a written plan to use data in support of articulated goals.

Datnow, Park, and Wohlstetter (2007) conducted case studies of eight urban schools from two public school districts and two charter school systems. The study districts were selected from a pool of 25 districts that were recommended by researchers and experts in the field as being at the forefront of using performance results for instructional decision making. The researchers selected two schools per district/system after receiving recommendations from district-level staff about which schools were most engaged in the process of using data to inform instruction. In each district, researchers interviewed staff from the central office, building-level staff at each school, and at least five teachers per school, for a total of 70 staff interviews over the course of three months in 2006. The researchers also conducted informal school and classroom observations and reviewed relevant documents.

In synthesizing the results from the eight schools, researchers identified that one practice the schools shared was their use of assessment data to set measurable goals for student, classroom, school, and system progress. The authors noted that setting goals for students is a "precondition for effective data-driven decisionmaking" (p. 20). Schools found the most success in defining goals that were focused and specific. For example, in one district, the goals for the year were (1) all students will score a 3 and at least two-thirds of students will score a 4 on the schoolwide writing assignment; (2) all students will be at grade level for reading in the spring, or at least two levels above where they were in the fall; and (3) all students will be at the proficient level on the math benchmark test by the spring. Staff and administrators from all levels (classroom, building, and system) were involved in goal-setting decisions.

The authors concluded that the eight schools used the goal-setting process as a starting point for developing a systemwide plan for data use, forming the foundation for a data culture that had buy-in from staff at all levels. Leaders at the system level across the study schools reported that explicitly stating their expectations for when and how educators would use assessment data was instrumental in encouraging staff to use data rather than intuition to shape instructional decisions. At the schools in public districts, system leaders experienced more challenges fostering staff buy-in than did leaders in charter systems; researchers and staff attributed this to the need to overcome institutional practices in the public districts that did not exist in charter schools.
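Goals as focused and specific as the ones described above can be checked mechanically against assessment data, which is part of what makes them workable anchors for data-driven decision making. The sketch below is purely illustrative (the rosters and cut points are invented, and the guide does not prescribe any particular tooling); it encodes the district's first two example goals as simple predicates over student records:

```python
def writing_goal_met(scores):
    """Goal 1: every student scores at least 3, and at least
    two-thirds of students score a 4, on the writing assignment."""
    return (
        all(s >= 3 for s in scores)
        and sum(1 for s in scores if s >= 4) >= (2 / 3) * len(scores)
    )

def reading_goal_met(spring_levels, fall_levels, grade_level):
    """Goal 2: every student reads at grade level in the spring,
    or at least two levels above his or her fall placement."""
    return all(
        spring >= grade_level or (spring - fall) >= 2
        for spring, fall in zip(spring_levels, fall_levels)
    )

# Invented example roster of six students' writing scores.
writing = [3, 4, 4, 3, 4, 4]
print(writing_goal_met(writing))  # True: all scored 3+, and 4 of 6 scored a 4

# Invented spring/fall reading levels for three students, grade level 5.
spring, fall = [5, 3, 6], [4, 1, 5]
print(reading_goal_met(spring, fall, grade_level=5))  # True: student 2 gained two levels
```

A check like this is only the bookkeeping step; the substantive work the study describes is the staff discussion of why a goal was or was not met.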
Table D2

School A
1. Once every month, the school day begins two hours later; teachers meet during this time to engage in the activities described in the lettered items below. The school makes up this accumulated time by extending the school year.
   a. School staff review district standards and realign the assessments they use accordingly.
   b. School staff continuously reevaluate this work and discuss and plan changes as needed.

School B
1. School staff are released early from school once per week for at least 45 minutes. This time is added to other days throughout the week.
2. The entire staff meets weekly for one hour before school. Staff decreased the "nuts and bolts" of the meetings and prioritized work related to assessment.
   a. Schools use allotted time to align curriculum across grades with the state standards. This process is driven by student assessment data.
   b. School staff continuously reevaluate this work and discuss and plan changes as needed.

School C
1. Same-grade teachers meet informally during weekly planning periods and formally every six weeks. To accommodate these planning periods, students in entire grades are sent to "specials" (e.g., gym, art classes). Time also is allotted at regularly scheduled staff meetings.
2. Teachers are released from teaching duties several days each year and are replaced by substitute teachers.
3. Teachers meet with the principal up to three times each year.
   a. Staff discuss students' progress according to the "developmental continuums" written by school staff.
   b. Teachers administer individual assessments to students.
   c. Staff discuss reports on assessment data from the district research department.

School D
1. Teachers request time to meet with each other during school hours; substitutes are hired to support this. In addition, teachers meet after school.
2. Teachers meet in "within-grade" and "subject area" teams during their planning hours once per week.
   a. Staff members share knowledge gained from professional development activities that addressed curriculum and assessment. They also discuss student mastery of standards and other outcomes and possible intervention strategies.
In a randomized trial that met WWC standards with reservations, Jones and Krouse (1988) randomly assigned student teachers to one of two groups that received coaching. One group received coaching on classroom management; the other received coaching on classroom management and on data use for making instructional changes. The data-use intervention included individualized coaching by supervisors on how the teachers could use assessment and behavioral data to track student progress and make changes in the classroom. Teachers in the data-use group reported more frequently using pupil observations to make instructional decisions, but the study authors make no claims about whether this difference was statistically significant, nor does the study include the information the WWC would need to calculate statistical significance. There was also no statistically significant difference in the reading and math outcomes of the students assigned to these two groups of teachers.

161. Jones and Krouse (1988); Wesson (1991).
162. Wesson (1991).

Example of a school/district study that designates structured time for data use.

Cromey and Hanson (2000) conducted a qualitative study of how schools use assessment data from multiple sources, aiming to identify characteristics of schools that make valuable use of their data. After interviewing district administrators, principals, teachers, and other building staff from nine schools about how they collect and use student assessment data, the researchers identified six characteristics of schools with well-developed assessment systems. The characteristic most applicable to recommendation 4, action step 2, is that these schools specifically allocate time for their staff to reflect collaboratively on how they will use student assessment data to guide their instructional decisions. Table D2, drawn from this study, describes the approaches four schools used to schedule collaboration time. Although the panel did
not have evidence that these approaches are effective for increasing student achievement, it reproduces this table here to provide an array of examples to readers.

Example of how a school/district provided targeted and regular professional development opportunities.

Anderegg's (2007) case study of data use in several Alaska school districts has findings relevant to the panel's third suggested action step for recommendation 4. The author explored several aspects of data use, including professional development around data use and analysis for teachers, school administrators, and district superintendents. A mixed-method approach was used to collect and analyze data: the author implemented a written survey in 53 districts, conducted follow-up telephone surveys, and studied paper records describing data use and school in-service plans at select sites.

Survey questions focused on professional development targeted toward "the use of data analysis methods and skills, such as finding patterns and/or systemic relationships between variables" (p. 171), although respondents also were given the opportunity to respond to open-ended questions on existing and desired professional development. The majority of respondents reported receiving some kind of data training, with 12 percent of administrators and four percent of teachers receiving training at least monthly. More than one-third of respondents reported never receiving such training. The study found that regular professional development (recommendation 4, action step 3) around data use and analysis is not widespread.

compared to only three percent of respondents who never received training. Administrators were less likely than teachers to show interest in more frequent training: only 14 percent of administrators reporting no training thought that this was insufficient.

Teachers, administrators, and superintendents proposed ways to improve professional development around data use and analysis. A majority of all respondents suggested that data training be focused on analysis to inform teachers' day-to-day implementation of "standards, curriculum, and instruction" and provide resources for doing so (p. 114). All three groups also addressed the frequency of data training: the majority of superintendents and administrators cited the need to engage in "ongoing discussions and analysis," and more than one-quarter of teachers suggested that they needed more time to analyze and discuss data and plan accordingly (p. 116). Sixty-three percent of superintendents cited the need for access to disaggregated data or training on "specific data analysis tools" (p. 89).

Given that this study was conducted in mostly rural Alaska school districts, the author cautions that these findings may not be representative of more urban districts or those in other states. Furthermore, this study does not present any evidence that frequent and targeted professional development leads to increased data use and analysis or that it will support the overall goal of creating a data-driven culture within a school.

Recommendation 5.
Develop and maintain a districtwide data system
thinking through the process of obtaining, launching, and maintaining a system, the panel drew examples from qualitative and descriptive studies of how other districts have approached the challenge of identifying the correct data system.

Example of how one school district involved stakeholders in the decision to build a data system, articulated requirements, and then implemented the new system.

Long et al. (2008) conducted an implementation study of a data warehouse in one school district by conducting interviews with staff at all levels. When this school district determined it should build (recommendation 5, action step 3) its own data warehouse to meet rising state and federal data needs, the district’s accountability and research department led the team that developed the new system. To involve stakeholders (recommendation 5, action step 1) in selecting the system and to articulate system requirements (recommendation 5, action step 2), that department began by assessing the needs of data users. Then, the team planned and staged implementation (recommendation 5, action step 4) of the system by building one system module at a time, a process that the developers reported “kept [the project] alive by not trying to design every part of the system at once” (p. 216). Some features of the final system include

• combining data from multiple sources, including assessment, demographic, school profile, and special program data;

• providing access to handouts, a statistics chat, and frequently asked questions;

• creating a graphing tool that enables users to examine assessment and demographic data from different periods of time and at different levels of aggregation. Users access the reporting features using predesigned queries and web-based reports; and

• providing access to instructional suggestions based on a student’s performance that teachers can link to from the area on students’ assessment data.

Example of how a group of districts involved stakeholders, articulated system requirements, and implemented new data systems (both built and bought).

Mieles and Foley (2005) conducted a case study focused on the implementation processes, successes, and challenges of data-warehouse technology. The study was based on interview data from educators and education-technology experts in eight urban school districts that were at different points in the process of implementing data warehouses. The eight districts involved stakeholders (recommendation 5, action step 1) in systems decisions by engaging staff from multiple levels. These stakeholders included superintendents, principals, school board members, experts at neighboring school districts, staff with expertise in instruction and assessment, and external vendors with technical expertise. Six of the districts convened planning committees staffed by stakeholders with different roles.

These committees articulated systems requirements (recommendation 5, action step 2) by developing needs assessments and planned for staged rollouts by coming to agreement on what data the system would collect and use, who would use it, and what systems would be replaced by the new approach. In the final product, the staff interviewed for the study had a range of formats and levels of access to reports that drew on the warehouse data. Particularly useful to these staff was the ability to “drill down” and explore the demographic and administrative data in the warehouse to look for patterns of how they might be associated with achievement. In some districts, the
capability to do so was limited by staff roles for security and confidentiality reasons. To address security concerns, some districts introduced or planned to introduce differentiated access to their data warehouse by staff role in order to protect privacy and provide security.

When planning and staging implementation (recommendation 5, action step 4), some districts participating in the study requested demonstrations or pilots and got feedback from users about system features before full implementation of a data warehouse. Most districts had implemented a data warehouse within a year of beginning their inquiry process, and all districts experienced ongoing modifications and expansions to the system after it was implemented based on increased capacity and growing demands from users. Districts not using external vendors found that cross-departmental communication and onsite support from internal staff for those using the data warehouse were essential to implementation. Some districts faced unexpectedly onerous challenges with cleaning and integrating data that originated from multiple sources and indicated that data dictionaries defining the values of variables were a successful long-term solution for some districts that began with data quality difficulties. After launching a data warehouse, all study districts discovered that they needed more time and resources than expected for data quality assurance, but they also found that high-quality data were essential to convincing staff to use the new system.

Example of a study advising a school district on how to proceed with its data-system decisions, including issues of which staff to involve in choosing system requirements and implementing the system.

The authors of this study, having examined the district’s capacities and needs, advised the district to involve stakeholders (recommendation 5, action step 1) from “every level of the district” (p. 11) in a conversation about what data mean and why they are important and useful to staff. Then, the authors advised the district to acquire an integrated computer data system, beginning with a clearly articulated understanding of system requirements (recommendation 5, action step 2). The authors advised that the final system should be intuitive, easy to use, and flexible to pull data from or export data to other systems or programs. This interoperability of systems and ease of use, when available together, could allow staff to overcome barriers that had previously prevented them from optimal use of student data to inform their decisions. The authors further recommended that the district carefully consider security needs for their data system as their data-based decision-making process evolved. Specific suggestions included development of policies to govern which staff should have access to which types of data, how and when staff should access data, and how the system would be encrypted or otherwise protected. In this study, the authors specifically advised the district to buy a data warehouse (recommendation 5, action step 3) to hold all of these data from multiple sources, based on their evaluation of the district, which showed that it needed a system immediately and did not have the technical capacity to build one.

Finally, they advised the district to plan an implementation (recommendation 5, action step 4) that consisted of a gradual rollout of new system pieces, beginning with those that “will provide the most value and immediate impact” (p. 52) in order to keep the implementation process moving while simultaneously gaining user buy-in.
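One data-quality practice described above, the data dictionaries that Mieles and Foley (2005) found helped districts define the valid values of their variables, can be sketched in a few lines of code. The fragment below is purely illustrative: every field name and code in it is hypothetical rather than drawn from the studies. It shows only the basic mechanism of screening records merged from multiple source systems against a central dictionary of permitted values.

```python
# Illustrative sketch of a minimal "data dictionary": a central map of
# each variable to its permitted values, used to flag records merged
# from multiple source systems. All field names and codes here are
# hypothetical, not taken from any district in the studies.

DATA_DICTIONARY = {
    "grade_level": {str(g) for g in range(1, 13)},    # "1" through "12"
    "test_subject": {"reading", "math", "science"},
    "program_code": {"GEN", "ELL", "SPED", "GT"},     # hypothetical codes
}

def validate_record(record):
    """Return (field, value) pairs in the record that violate the dictionary."""
    errors = []
    for field, allowed in DATA_DICTIONARY.items():
        value = record.get(field)
        if value not in allowed:
            errors.append((field, value))
    return errors

# Two records as they might arrive from source systems that code grade
# level differently -- the kind of inconsistency districts reported
# when integrating data from multiple sources.
clean = {"grade_level": "4", "test_subject": "math", "program_code": "GEN"}
dirty = {"grade_level": "04", "test_subject": "math", "program_code": "GEN"}

assert validate_record(clean) == []
assert validate_record(dirty) == [("grade_level", "04")]
```

Even a minimal check like this surfaces cross-system coding inconsistencies (here, two formats for grade level) early, the kind of cleanup the study districts reported needing more time and resources for than they expected.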
References

Cromey, A., & Hanson, M. (2000). An exploratory analysis of school-based student assessment systems. Oak Brook, IL: North Central Regional Educational Laboratory (NCREL).
Datnow, A., Park, V., & Wohlstetter, P. (2007). Achieving with data: How high-performing school systems use data to improve instruction for elementary students. Los Angeles, CA: University of Southern California, Center on Educational Governance.
Elmore, R. F. (2003). Doing the right thing, knowing the right thing to do: School improvement and performance-based accountability. Washington, DC: National Governors Association Center for Best Practices.
Feldman, J., & Tung, R. (2001). Using data-based inquiry and decision making to improve instruction. ERS Spectrum: Journal of School Research and Information, 19(3), 10–19.
Fiarman, S. E. (2007). Planning to assess progress: Mason Elementary School refines an instructional strategy. In K. P. Boudett & J. L. Steele (Eds.), Data wise in action: Stories of schools using data to improve teaching and learning (pp. 125–148). Cambridge, MA: Harvard Education Press.
Field, M. J., & Lohr, K. N. (Eds.). (1990). Clinical practice guidelines: Directions for a new program. Washington, DC: National Academy Press.
Forman, M. L. (2007). Developing an action plan: Two Rivers Public Charter School focuses on instruction. In K. P. Boudett & J. L. Steele (Eds.), Data wise in action: Stories of schools using data to improve teaching and learning (pp. 107–124). Cambridge, MA: Harvard Education Press.
Garrison, C., & Ehringhaus, M. (2009). Formative and summative assessment in the classroom. National Middle School Association. Retrieved April 15, 2009, from http://www.nmsa.org/Publications/WebExclusive/Assessment/tabid/1120/Default.aspx.
Gentry, D. R. (2005). Technology supported data-driven decision-making in an Oklahoma elementary school. Unpublished doctoral dissertation, University of Oklahoma, Norman, OK.
Halverson, R., Grigg, J., Prichett, R., & Thomas, C. (2007). The new instructional leadership: Creating data-driven instructional systems in schools. Journal of School Leadership, 17(2), 158–193.
Halverson, R., Prichett, R. B., & Watson, J. G. (2007). Formative feedback systems and the new instructional leadership. Madison, WI: University of Wisconsin.
Halverson, R., & Thomas, C. N. (2007). The roles and practices of student services staff as data-driven instructional leaders. In M. Mangin & S. Stoelinga (Eds.), Instructional teachers leadership roles: Using research to inform and reform (pp. 163–200). New York: Teachers College Press.
Hamilton, L. (2003). Assessment as a policy tool. Review of Research in Education, 27, 25–68.
Hamilton, L. S., Stecher, B. M., Marsh, J. A., McCombs, J. S., Robyn, A., Russell, J. L., et al. (2007). Standards-based accountability under No Child Left Behind: Experiences of teachers and administrators in three states. Santa Monica, CA: RAND Corporation.
Herman, J., & Gribbons, B. (2001). Lessons learned in using data to support school inquiry and continuous improvement: Final report to the Stuart Foundation. Los Angeles, CA: University of California, Center for the Study of Evaluation (CSE).
Hill, D., Lewis, J., & Pearson, J. (2008). Metro Nashville Public Schools student assessment staff development model. Nashville, TN: Vanderbilt University, Peabody College.
Huffman, D., & Kalnin, J. (2003). Collaborative inquiry to make data-based decisions in schools. Teaching and Teacher Education, 19(6), 569–580.
Ingram, D., Louis, K. S., & Schroeder, R. G. (2004). Accountability policies and teacher decision making: Barriers to the use of data to improve practice. Teachers College Record, 106(6), 1258–1287.
Jones, E. D., & Krouse, J. P. (1988). The effectiveness of data-based instruction
Means, B., Padilla, C., DeBarger, A., & Bakia, M. (2009). Implementing data-informed decision making in schools—teacher access, supports and use. Washington, DC: U.S. Department of Education.
Merriam-Webster Online Dictionary. (2009). Hypothesis. Retrieved April 22, 2009, from http://www.merriam-webster.com/dictionary/hypothesis.
Mid-Continent Research for Education and Learning (McREL). (2003). Sustaining school improvement: Data-driven decision making. Aurora, CO: Author.
Mieles, T., & Foley, E. (2005). Data warehousing: Preliminary findings from a study of implementing districts. Providence, RI: Annenberg Institute for School Reform.
Moody, L., & Dede, C. (2008). Models of data-based decision-making: A case study of the Milwaukee Public Schools. In E. B. Mandinach & M. Honey (Eds.), Data-driven school improvement: Linking data and learning (pp. 233–254). New York: Teachers College Press.
Nabors Oláh, L., Lawrence, N., & Riggan, M. (2008, March). Learning to learn from benchmark assessment data: How teachers analyze results. Paper presented at the annual meeting of the American Educational Research Association, New York.
Obama, B. (2009, March 10). Remarks by the president to the Hispanic Chamber of Commerce on a complete and competitive American education. Retrieved April 20, 2009, from http://www.whitehouse.gov/the_press_office/Remarks-of-the-President-to-the-United-States-Hispanic-Chamber-of-Commerce/.
Owings, C. A., & Follo, E. (1992). Effects of portfolio assessment on students’ attitudes and goal setting abilities in mathematics. Rochester, MI: Oakland University.
Perie, M., Marion, S., & Gong, B. (2007). A framework for considering interim assessments. Dover, NH: National Center for the Improvement of Educational Assessment.
Phillips, N. B., Hamlett, C. L., Fuchs, L. S., & Fuchs, D. (1993). Combining classwide curriculum-based measurement and peer tutoring to help general educators provide adaptive education. Learning Disabilities Research & Practice, 8(3), 148–156.
Ramnarine, S. (2004). Impacting student achievement through data-driven decision-making. MultiMedia & Internet @ Schools, 11(4), 33–35.
Rossmiller, R. A., & Holcomb, E. L. (1993, April). The Effective Schools process for continuous school improvement. Paper presented at the annual meeting of the American Educational Research Association, Atlanta, GA.
Schunk, D. H., & Swartz, C. W. (1992, April). Goals and feedback during writing strategy instruction with gifted students. Paper presented at the annual meeting of the American Educational Research Association, San Francisco, CA.
Shepard, L. A. (1995). Using assessment to improve learning. Educational Leadership, 52(5), 38–43.
Shepard, L. A., Flexer, R. J., Hiebert, E. H., Marion, S. F., Mayfield, V., & Weston, T. J. (1996). Effects of introducing classroom performance assessments on student learning. Educational Measurement: Issues and Practice, 15(3), 7–18.
Spillane, J. P., Halverson, R., & Diamond, J. B. (2004). Towards a theory of leadership practice: A distributed perspective. Journal of Curriculum Studies, 36(1), 3–34.
Stecker, P. M. (1993). Effects of instructional modifications with and without curriculum-based measurement on the mathematics achievement of students with mild disabilities (Doctoral dissertation, Vanderbilt University, 1993). Dissertation Abstracts International, 55(01A).
Stiggins, R. (2007). Assessment through the student’s eyes. Educational Leadership, 64(8), 22–26.
Supovitz, J. A. (2006). The case for district-based reform: Leading, building, and sustaining school improvement. Cambridge, MA: Harvard Education Press.
Supovitz, J. A., & Klein, V. (2003). Mapping a course for improved student learning: How innovative schools systematically