
IES PRACTICE GUIDE WHAT WORKS CLEARINGHOUSE

Using Student Achievement Data to


Support Instructional Decision Making

NCEE 2009-4067
U.S. DEPARTMENT OF EDUCATION
The Institute of Education Sciences (IES) publishes practice guides in education
to bring the best available evidence and expertise to bear on the types of challenges
that cannot currently be addressed by a single intervention or program. Authors of
practice guides seldom conduct the types of systematic literature searches that are
the backbone of a meta-analysis, although they take advantage of such work when
it is already published. Instead, authors use their expertise to identify the most im-
portant research with respect to their recommendations and conduct a search of
recent publications to ensure that the research supporting the recommendations
is up-to-date.

Unique to IES-sponsored practice guides is that they are subjected to rigorous exter-
nal peer review through the same office that is responsible for independent reviews
of other IES publications. A critical task for peer reviewers of a practice guide is to
determine whether the evidence cited in support of particular recommendations is
up-to-date and that studies of similar or better quality that point in a different di-
rection have not been ignored. Because practice guides depend on the expertise of
their authors and their group decision making, the content of a practice guide is not
and should not be viewed as a set of recommendations that in every case depends
on and flows inevitably from scientific research.

The goal of this practice guide is to formulate specific and coherent evidence-based
recommendations for use by educators and education administrators to create the
organizational conditions necessary to make decisions using student achievement
data in classrooms, schools, and districts. The guide provides practical, clear in-
formation on critical topics related to data-based decision making and is based on
the best available evidence as judged by the panel. Recommendations presented in
this guide should not be construed to imply that no further research is warranted
on the effectiveness of particular strategies for data-based decision making.
IES PRACTICE GUIDE

Using Student Achievement


Data to Support Instructional
Decision Making
September 2009
Panel
Laura Hamilton (Chair)
RAND Corporation

Richard Halverson
University of Wisconsin–Madison

Sharnell S. Jackson
Chicago Public Schools

Ellen Mandinach
CNA Education

Jonathan A. Supovitz
University of Pennsylvania

Jeffrey C. Wayman
The University of Texas at Austin

Staff
Cassandra Pickens
Emily Sama Martin
Mathematica Policy Research

Jennifer L. Steele
RAND Corporation

NCEE 2009-4067
U.S. DEPARTMENT OF EDUCATION
This report was prepared for the National Center for Education Evaluation and Re-
gional Assistance, Institute of Education Sciences, under Contract ED-07-CO-0062
by the What Works Clearinghouse, operated by Mathematica Policy Research.

Disclaimer
The opinions and positions expressed in this practice guide are the authors’ and do
not necessarily represent the opinions and positions of the Institute of Education Sci-
ences or the U.S. Department of Education. This practice guide should be reviewed
and applied according to the specific needs of the educators and education agency
using it, and with the full realization that it represents the judgments of the review
panel regarding what constitutes sensible practice, based on the research available
at the time of publication. This practice guide should be used as a tool to assist in
decision making rather than as a “cookbook.” Any references within the document to
specific education products are illustrative and do not imply endorsement of these
products to the exclusion of other products that are not referenced.

U.S. Department of Education


Arne Duncan
Secretary

Institute of Education Sciences


John Q. Easton
Director

National Center for Education Evaluation and Regional Assistance


John Q. Easton
Acting Commissioner

September 2009

This report is in the public domain. While permission to reprint this publication is
not necessary, the citation should be:

Hamilton, L., Halverson, R., Jackson, S., Mandinach, E., Supovitz, J., & Wayman, J.
(2009). Using student achievement data to support instructional decision making
(NCEE 2009-4067). Washington, DC: National Center for Education Evaluation and
Regional Assistance, Institute of Education Sciences, U.S. Department of Education.
Retrieved from http://ies.ed.gov/ncee/wwc/publications/practiceguides/.

What Works Clearinghouse Practice Guide citations begin with the panel chair,
followed by the names of the panelists listed in alphabetical order.

This report is available on the IES website at http://ies.ed.gov/ncee and http://ies.ed.gov/ncee/wwc/publications/practiceguides/.

Alternative formats
On request, this publication can be made available in alternative formats, such
as Braille, large print, audiotape, or computer diskette. For more information,
call the Alternative Format Center at 202–205–8113.
Using Student Achievement Data to
Support Instructional Decision Making
Contents
Introduction

The What Works Clearinghouse standards and their relevance to this guide

Overview

Scope of the practice guide

Status of the research

Summary of the recommendations

Checklist for carrying out the recommendations

Recommendation 1. Make data part of an ongoing cycle of instructional improvement

Recommendation 2. Teach students to examine their own data and set learning goals

Recommendation 3. Establish a clear vision for schoolwide data use

Recommendation 4. Provide supports that foster a data-driven culture within the school

Recommendation 5. Develop and maintain a districtwide data system

Glossary of terms as used in this report

Appendix A. Postscript from the Institute of Education Sciences

Appendix B. About the authors

Appendix C. Disclosure of potential conflicts of interest

Appendix D. Technical information on the studies

References

List of tables

Table 1. Institute of Education Sciences levels of evidence for practice guides
Table 2. Recommendations and corresponding levels of evidence
Table 3. Suggested professional development and training opportunities
Table 4. Sample stakeholder perspectives on data system use
Table 5. Considerations for built and purchased data systems
Table D1. Studies cited in recommendation 2 that meet WWC standards with or without reservations
Table D2. Scheduling approaches for teacher collaboration

List of figures

Figure 1. Data use cycle
Figure 2. Example of classroom running records performance at King Elementary School

List of examples

Example 1. Examining student data to understand learning
Example 2. Example of a rubric for evaluating five-paragraph essays
Example 3. Example of a student's worksheet for reflecting on strengths and weaknesses
Example 4. Example of a student's worksheet for learning from math mistakes
Example 5. Teaching students to examine data and goals
Example 6. Examples of a written plan for achieving school-level goals
Introduction

As educators face increasing pressure from federal, state, and local accountability policies to improve student achievement, the use of data has become more central to how many educators evaluate their practices and monitor students' academic progress.1 Despite this trend, questions about how educators should use data to make instructional decisions remain mostly unanswered. In response, this guide provides a framework for using student achievement data to support instructional decision making. These decisions include, but are not limited to, how to adapt lessons or assignments in response to students' needs, alter classroom goals or objectives, or modify student-grouping arrangements. The guide also provides recommendations for creating the organizational and technological conditions that foster effective data use. Each recommendation describes action steps for implementation, as well as suggestions for addressing obstacles that may impede progress. In adopting this framework, educators will be best served by implementing the recommendations in this guide together rather than individually.

The recommendations reflect both the expertise of the panelists and the findings from several types of studies, including studies that use causal designs to examine the effectiveness of data use interventions, case studies of schools and districts that have made data use a priority, and observations from other experts in the field. The research base for this guide was identified through a comprehensive search for studies evaluating academically oriented data-based decision-making interventions and practices. An initial search for literature related to data use to support instructional decision making in the past 20 years yielded more than 490 citations. Of these, 64 used experimental, quasi-experimental, and single subject designs to examine whether data use leads to increases in student achievement. Among the studies ultimately relevant to the panel's recommendations, only six meet the causal validity standards of the What Works Clearinghouse (WWC).2

To indicate the strength of evidence supporting each recommendation, the panel relied on the WWC standards for determining levels of evidence, described below and in Table 1. It is important for the reader to remember that the level of evidence rating is not a judgment by the panel on how effective each of these recommended practices will be when implemented, nor is it a judgment of what prior research has to say about the effectiveness of these practices. The level of evidence ratings reflect the panel's judgment of the validity of the existing literature to support a causal claim that when these practices have been implemented in the past, positive effects on student academic outcomes were observed. They do not reflect judgments of the relative strength of these positive effects or the relative importance of the individual recommendations.

A strong rating refers to consistent and generalizable evidence that an intervention strategy or program improves outcomes.3

1. Knapp et al. (2006).
2. Reviews of studies for this practice guide applied WWC Version 1.0 standards. See Version 1.0 standards at http://ies.ed.gov/ncee/wwc/pdf/wwc_version1_standards.pdf.
3. Following WWC guidelines, improved outcomes are indicated by either a positive, statistically significant effect or a positive, substantively important effect size (i.e., greater than 0.25).
A moderate rating refers either to evidence from studies that allow strong causal conclusions but cannot be generalized with assurance to the population on which a recommendation is focused (perhaps because the findings have not been widely replicated) or to evidence from studies that are generalizable but have more causal ambiguity than that offered by experimental designs (e.g., statistical models of correlational data or group comparison designs for which equivalence of the groups at pretest is uncertain).

A low rating refers to evidence either from studies such as case studies and descriptive studies that do not meet the standards for moderate or strong evidence or from expert opinion based on reasonable extrapolations from research and theory. A low level of evidence rating indicates that the panel did not identify a body of research demonstrating effects of implementing the recommended practice on student achievement. The lack of a body of valid evidence may simply mean that the recommended practices are not feasible or are difficult to study in a rigorous, experimental fashion.4 In other cases, it means that researchers have not yet studied a practice or that there is weak or conflicting evidence of effectiveness. Policy interest in topics of current study thus can arise before a research base has accumulated on which recommendations can be based.

Under these circumstances, the panel examined the research it identified on the topic and combined findings from that research with its professional expertise and judgments to arrive at recommendations. However, that a recommendation has a low level of evidence should not be interpreted as indicating that the panel believes the recommendation is unimportant. The panel has decided that all five recommendations are important and, in fact, encourages educators to implement all of them to the extent that state and district resources and capacity allow.

4. For more information, see the WWC Frequently Asked Questions page for practice guides, http://ies.ed.gov/ncee/wwc/references/idocviewer/doc.aspx?docid=15&tocid=3.

Table 1. Institute of Education Sciences levels of evidence for practice guides

Strong
In general, characterization of the evidence for a recommendation as strong requires both studies with high internal validity (i.e., studies whose designs can support causal conclusions) and studies with high external validity (i.e., studies that in total include enough of the range of participants and settings on which the recommendation is focused to support the conclusion that the results can be generalized to those participants and settings). Strong evidence for this practice guide is operationalized as
• A systematic review of research that generally meets WWC standards (see http://ies.ed.gov/ncee/wwc/) and supports the effectiveness of a program, practice, or approach with no contradictory evidence of similar quality; OR
• Several well-designed, randomized controlled trials or well-designed quasi-experiments that generally meet WWC standards and support the effectiveness of a program, practice, or approach, with no contradictory evidence of similar quality; OR
• One large, well-designed, randomized controlled, multisite trial that meets WWC standards and supports the effectiveness of a program, practice, or approach, with no contradictory evidence of similar quality; OR
• For assessments, evidence of reliability and validity that meets the Standards for Educational and Psychological Testing.a

Moderate
In general, characterization of the evidence for a recommendation as moderate requires studies with high internal validity but moderate external validity or studies with high external validity but moderate internal validity. In other words, moderate evidence is derived from studies that support strong causal conclusions but generalization is uncertain or studies that support the generality of a relationship but the causality is uncertain. Moderate evidence for this practice guide is operationalized as
• Experiments or quasi-experiments generally meeting WWC standards and supporting the effectiveness of a program, practice, or approach with small sample sizes and/or other conditions of implementation or analysis that limit generalizability and no contrary evidence; OR
• Comparison group studies that do not demonstrate equivalence of groups at pretest and, therefore, do not meet WWC standards but that (1) consistently show enhanced outcomes for participants experiencing a particular program, practice, or approach and (2) have no major flaws related to internal validity other than lack of demonstrated equivalence at pretest (e.g., only one teacher or one class per condition, unequal amounts of instructional time, highly biased outcome measures); OR
• Correlational research with strong statistical controls for selection bias and for discerning influence of endogenous factors and no contrary evidence; OR
• For assessments, evidence of reliability that meets the Standards for Educational and Psychological Testingb but with evidence of validity from samples not adequately representative of the population on which the recommendation is focused.

Low
In general, characterization of the evidence for a recommendation as low means that the recommendation is based on expert opinion derived from strong findings or theories in related areas and/or expert opinion buttressed by direct evidence that does not rise to the moderate or strong level. Low evidence is operationalized as evidence not meeting the standards for the moderate or strong level.

a. American Educational Research Association, American Psychological Association, and National Council on Measurement in Education (1999).
b. Ibid.

The What Works Clearinghouse standards and their relevance to this guide

In terms of the levels of evidence indicated in Table 1, the panel relied on WWC evidence standards to assess the quality of evidence supporting educational programs and practices. The WWC evaluates evidence for the causal validity of instructional programs and practices according to WWC standards. Information about these standards is available at http://ies.ed.gov/ncee/wwc/pdf/wwc_version1_standards.pdf. The technical quality of each study is rated and placed into one of three categories:

• Meets Evidence Standards for randomized controlled trials and regression discontinuity studies that provide the strongest evidence of causal validity.

• Meets Evidence Standards with Reservations for all quasi-experimental studies with no design flaws and randomized controlled trials that have problems with randomization, attrition, or disruption.

• Does Not Meet Evidence Screens for studies that do not provide strong evidence of causal validity.

Following the recommendations and suggestions for carrying out the recommendations, Appendix D presents more information on the research evidence that supports each recommendation.

The panel would like to thank Cassandra Pickens, Emily Sama Martin, Dr. Jennifer L. Steele, and Mathematica and RAND staff members who participated in the panel meetings, characterized the research findings, and drafted the guide. We also appreciate the help of the many WWC reviewers who contributed their time and expertise to the review process, and Sarah Wissel for her support of the intricate logistics of the project. In addition, we would like to thank Scott Cody, Kristin Hallgren, Dr. Shannon Monahan, and Dr. Mark Dynarski for their oversight and guidance during the development of the practice guide.

Dr. Laura Hamilton
Dr. Richard Halverson
Ms. Sharnell S. Jackson, Ed.M.
Dr. Ellen Mandinach
Dr. Jonathan A. Supovitz
Dr. Jeffrey C. Wayman

Using Student Achievement Data to Support Instructional Decision Making

Overview

Recent changes in accountability and testing policies have provided educators with access to an abundance of student-level data, and the availability of such data has led many to want to strengthen the role of data for guiding instruction and improving student learning. The U.S. Department of Education recently echoed this desire, calling upon schools to use assessment data to respond to students' academic strengths and needs.5 In addition, spurred in part by federal legislation and funding, states and districts are increasingly focused on building longitudinal data systems.6

Although accountability trends explain why more data are available in schools, the question of what to do with the data remains primarily unanswered. Data provide a way to assess what students are learning and the extent to which students are making progress toward goals. However, making sense of data requires concepts, theories, and interpretative frames of reference.7 Using data systematically to ask questions and obtain insight about student progress is a logical way to monitor continuous improvement and tailor instruction to the needs of each student. Armed with data and the means to harness the information data can provide, educators can make instructional changes aimed at improving student achievement, such as:

• prioritizing instructional time;8

• targeting additional individual instruction for students who are struggling with particular topics;9

• more easily identifying individual students' strengths and instructional interventions that can help students continue to progress;10

• gauging the instructional effectiveness of classroom lessons;11

• refining instructional methods;12 and

• examining schoolwide data to consider whether and how to adapt the curriculum based on information about students' strengths and weaknesses.13

5. American Recovery and Reinvestment Act of 2009; U.S. Department of Education (2009); Obama (2009).
6. Aarons (2009).
7. Knapp et al. (2006).
8. Brunner et al. (2005).
9. Brunner et al. (2005); Supovitz and Klein (2003); Wayman and Stringfield (2006).
10. Brunner et al. (2005); Forman (2007); Wayman and Stringfield (2006).
11. Halverson, Prichett, and Watson (2007); Supovitz and Klein (2003).
12. Halverson, Prichett, and Watson (2007); Fiarman (2007).
13. Marsh, Pane, and Hamilton (2006); Kerr et al. (2006).

Scope of the practice guide

The purpose of this practice guide is to help K–12 teachers and administrators use student achievement data to make instructional decisions intended to raise student achievement. The panel believes that the responsibility for effective data use lies with district leaders, school administrators, and classroom teachers and has crafted the recommendations accordingly.

This guide focuses on how schools can make use of common assessment data to improve teaching and learning. For the purpose of this guide, the panel defined common assessments as those that are administered in a routine, consistent manner by a state, district, or school to measure students' academic achievement.14 These include

• annual statewide accountability tests such as those required by No Child Left Behind;

• commercially produced tests—including interim assessments, benchmark assessments, or early-grade reading assessments—administered at multiple points throughout the school year to provide feedback on student learning;

• end-of-course tests administered across schools or districts; and

• interim tests developed by districts or schools, such as quarterly writing or mathematics prompts, as long as these are administered consistently and routinely to provide information that can be compared across classrooms or schools.

Annual and interim assessments vary considerably in their reliability and level of detail, and no single assessment can tell educators all they need to know to make well-informed instructional decisions. For this reason, the guide emphasizes the use of multiple data sources and suggests ways to use different types of common assessment data to support and inform decision making. The panel recognizes the value of classroom-specific data sources, such as tests or other student work, and the guide provides suggestions for how these data can be used to inform instructional decisions.

The use of data for school management purposes, rewarding teacher performance, and determining appropriate ways to schedule the school day is beyond the scope of this guide. Schools typically collect data on students' attendance, behavior, activities, coursework, and grades, as well as a range of administrative data concerning staffing, scheduling, and financing. Some schools even collect perceptual data, such as information from surveys or focus groups with students, teachers, parents, or community members. Although many of these data have been used to help inform instructional decision making, there is a growing interest among educators and policy advocates in drawing on these data sources to increase operational efficiency inside and outside of the classroom. This guide does not suggest how districts should use these data sources to implement data-informed management practices, but this omission should not be construed as a suggestion that such data are not valuable for decision making.

14. The panel recognizes that some schools do not fall under a district umbrella or are not part of a district. For the purposes of this guide, district is used to describe schools in partnership, which could be either a school district or a collaborative organization of schools. Technical terms related to assessments, data, and data-based decision making are defined in a glossary at the end of the recommendations.
Status of the research

Overall, the panel believes that the existing research on using data to make instructional decisions does not yet provide conclusive evidence of what works to improve student achievement. There are a number of reasons for the lack of compelling evidence. First, rigorous experimental studies of some data-use practices are difficult or infeasible to carry out. For example, it would be impractical to structure a rigorous study investigating the effects of implementing a districtwide data system (recommendation 5) because it is difficult to establish an appropriate comparison that reflects what would have happened in the absence of that system. Second, data-based decision making is closely tied to educational technology. As new technologies are developed, there is often a lag before rigorous research can identify the impacts of those technologies. As a result, there is limited evidence on the effectiveness of the state-of-the-art in data-based decision making. Finally, studies of data-use practices generally look at a bundle of elements, including training teachers on data use, data interpretation, and utilizing the software programs associated with data analysis and storage. Studies typically do not look at individual elements, making it difficult to isolate a specific element's contribution to effective use of data to make instructional decisions designed to improve student achievement.

This guide includes five recommendations that the panel believes are a priority to implement. However, given the status of the research, the panel does not have compelling evidence that these recommendations lead to improved student outcomes. As a result, all of the recommendations are supported by low levels of evidence. While the evidence is low, the recommendations reflect the panel's best advice—informed by experience and research—on how teachers and administrators can use data to make instructional decisions that raise student achievement. In other words, while this panel of experts believes these practices will lead to improved student achievement, the panel cannot point to rigorous research that proves the practices do improve student achievement.

Summary of the recommendations

The recommendations in this guide create a framework for effectively using data to make instructional decisions. This framework should include a data system that incorporates data from various sources, a data team in schools to encourage the use and interpretation of data, collaborative discussion sessions among teachers about data use and student achievement, and instruction for students about how to use their own achievement data to set and monitor educational goals. A central message of this practice guide is that effective data practices are interdependent among the classroom, school, and district levels. Educators should become familiar with all five recommendations and collaborate with other school and district staff to implement the recommendations concurrently, to the extent that state and district resources and capacity allow. However, readers who are interested in implementing data-driven recommendations in the classroom should focus on recommendations 1 and 2. Readers who wish to implement data-driven decision making at the school level should focus on recommendations 3 and 4. Readers who wish to bolster district data systems to support data-driven decision making should focus on recommendation 5. Finally, readers interested in technical information about studies that the panel used to support its recommendations will find such information in Appendix D.

To account for the context of each school and district, this guide offers recommendations that can be adjusted to fit their unique circumstances. Examples in this guide are intended to offer suggestions based on the experiences of schools and the expert opinion of the panel, but they should not be construed as the best or only ways to implement the guide's recommendations.
Table 2. Recommendations and corresponding levels of evidence

Recommendation Level of evidence

1. Make data part of an ongoing cycle of instructional improvement Low

2. Teach students to examine their own data and set learning goals Low

3. Establish a clear vision for schoolwide data use Low

4. Provide supports that foster a data-driven culture within the school Low

5. Develop and maintain a districtwide data system Low

Source: Authors’ compilation based on analysis described in text.

The recommendations, described here briefly, also are listed with their levels of evidence in Table 2.

Recommendations 1 and 2 emphasize the use of data to inform classroom-level instructional decisions. Recommendation 1 suggests that teachers use data from multiple sources to set goals, make curricular and instructional choices, and allocate instructional time. It describes the data sources best suited for different types of instructional decisions and suggests that the use of data be part of a cycle of instructional inquiry aimed at ongoing instructional improvement. Building on the use of data to drive classroom-based instructional decisions, recommendation 2 provides guidance about how teachers can instruct students in using their own assessment data to develop personal achievement goals and guide learning. Teachers then can use these goals to better understand factors that may motivate student performance and can adjust their instruction accordingly.

The panel believes that effective data use at the classroom level is more likely to emerge when it is supported by a data-informed school and district culture. Recommendations 3, 4, and 5, therefore, focus on the organizational and technological conditions that support data use. Recommendation 3 suggests that school leaders establish a comprehensive plan for data use that takes into account multiple perspectives. It also emphasizes the need to establish organizational structures and practices that support the implementation of that plan.

The panel believes that effective data use depends on supporting educators who are using and interpreting data. Recommendation 4 offers suggestions about how schools and districts can prepare educators to use data effectively by emphasizing the importance of collaborative data use. These collaboration efforts can create or strengthen shared expectations and common practices regarding data use throughout a school.

Recommendation 5 points out that effective, sustainable data use requires a secure and reliable data-management system at the district level. It provides detailed suggestions about how districts or other educational entities, such as multidistrict collaboratives or charter management organizations, should develop and maintain a high-quality data system.
Checklist for carrying out the recommendations

Recommendation 1. Make data part of an ongoing cycle of instructional improvement

☐ Collect and prepare a variety of data about student learning.
☐ Interpret data and develop hypotheses about how to improve student learning.
☐ Modify instruction to test hypotheses and increase student learning.

Recommendation 2. Teach students to examine their own data and set learning goals

☐ Explain expectations and assessment criteria.
☐ Provide feedback to students that is timely, specific, well formatted, and constructive.
☐ Provide tools that help students learn from feedback.
☐ Use students' data analyses to guide instructional changes.

Recommendation 3. Establish a clear vision for schoolwide data use

☐ Establish a schoolwide data team that sets the tone for ongoing data use.
☐ Define critical teaching and learning concepts.
☐ Develop a written plan that articulates activities, roles, and responsibilities.
☐ Provide ongoing data leadership.

Recommendation 4. Provide supports that foster a data-driven culture within the school

☐ Designate a school-based facilitator who meets with teacher teams to discuss data.
☐ Dedicate structured time for staff collaboration.
☐ Provide targeted professional development regularly.

Recommendation 5. Develop and maintain a districtwide data system

☐ Involve a variety of stakeholders in selecting a data system.
☐ Clearly articulate system requirements relative to user needs.
☐ Determine whether to build or buy the data system.
☐ Plan and stage the implementation of the data system.
Recommendation 1. Make data part of an ongoing cycle of instructional improvement

Teachers should adopt a systematic process for using data in order to bring evidence to bear on their instructional decisions and improve their ability to meet students' learning needs. The process of using data to improve instruction, the panel believes, can be understood as cyclical (see Figure 1). It includes a step for collecting and preparing data about student learning from a variety of relevant sources, including annual, interim, and classroom assessment data.15 After preparing data for examination, teachers should interpret the data and develop hypotheses about factors contributing to students' performance and the specific actions they can take to meet students' needs. Teachers then should test these hypotheses by implementing changes to their instructional practice. Finally, they should restart the cycle by collecting and interpreting new student performance data to evaluate their own instructional changes.16

Figure 1. Data use cycle
[A three-step cycle: collect and prepare a variety of data about student learning → interpret data and develop hypotheses about how to improve student learning → modify instruction to test hypotheses and increase student learning → back to collecting and preparing data.]

Because the data-use process is cyclical, teachers actually can begin at any point shown in Figure 1—that is, with a hypothesis they want to test, an instructional modification they want to evaluate, or a set of student performance data they want to use to inform their decisions. However, the panel has observed that teachers are sometimes asked to use existing student assessment data without receiving clear guidance on how to do so. Consequently, some teachers may find it useful to begin with the collection and preparation of data from a variety of sources, and this guide presents that as the first step in the process. Also, although the steps represent the ongoing nature of the cycle, teachers may find that they need a considerable amount of data collection and interpretation to form strong hypotheses about how to change their instruction.

15. Halverson, Prichett, and Watson (2007), Herman and Gribbons (2001), Huffman and Kalnin (2003), and Fiarman (2007) outline these components (in varied order) in their case studies of how the inquiry process was implemented in some school and district settings. Similarly, Abbott (2008) discusses using data to assess, plan, implement, and evaluate instructional changes as part of a larger framework schools should use to achieve accountability. Further detail under each component is based on panelist expertise.
16. Abbott (2008); Brunner et al. (2005); Halverson, Prichett, and Watson (2007); Kerr et al. (2006); Liddle (2000); Mandinach et al. (2005).
Level of evidence: Low

The panel drew on a group of qualitative and descriptive studies to formulate this recommendation, using the studies as sources of examples for how an inquiry cycle for data use can be implemented in an educational setting. No literature was located that assesses the impact on student achievement of using an inquiry cycle, or individual steps within that cycle, as a framework for data analysis, however, and the panel determined that the level of evidence to support this recommendation is low.

Brief summary of evidence to support the recommendation

The panel considers the inquiry cycle of gathering data, developing and testing hypotheses, and modifying instruction to be fundamental when using assessment data to guide instruction. Although no causal evidence is available to support the effectiveness of this cycle, the panel draws on studies that did not use rigorous designs for examples of the three-point cycle of inquiry—the underlying principle of this recommendation—and provides some detail on the context for those examples in Appendix D.

How to carry out this recommendation

1. Collect and prepare a variety of data about student learning.

To gain a robust understanding of students' learning needs, teachers need to collect data from a variety of sources. Such sources include but are not limited to annual state assessments, district and school assessments, curriculum-based assessments, chapter tests, and classroom projects. In most cases, teachers and their schools already are gathering these kinds of data, so carrying out data collection depends on considering the strengths, limitations, and timing of each data type and on preparing data in a format that can reveal patterns in student achievement. Moreover, by focusing on specific questions about student achievement, educators can prioritize which types of data to gather to inform their instructional decisions.17

Each assessment type has advantages and limitations (e.g., high-stakes accountability tests may be subject to score inflation and may lead to perverse incentives).18 Therefore, the panel believes that multiple data sources are important because no single assessment provides all the information teachers need to make informed instructional decisions. For instance, as teachers begin the data-use process for the first time or begin a new school year, the accessibility and high-stakes importance of students' statewide, annual assessment results provide a rationale for looking closely at these data. Moreover, these annual assessment data can be useful for understanding broad areas of relative strengths and weaknesses among students, for identifying students or groups of students who may need particular support,19 and for setting schoolwide,20 classroom, grade-level, or department-level goals for students' annual performance.

However, teachers also should recognize that significant time may have passed between the administration of these annual assessments and the beginning of the school year, and students' knowledge and skills may have changed during that time. It is important to gather additional information at the beginning of the year to supplement statewide test results. In addition, the panel cautions that overreliance on a single data source, such as a high-stakes accountability test, can lead to the overalignment of instructional practices with that test (sometimes called "teaching to the test"), resulting in false gains that are not reflected on other assessments of the same content.21

17. Bigger (2006); Cromey and Hanson (2000); Herman and Gribbons (2001); Huffman and Kalnin (2003); Lachat and Smith (2005); Supovitz (2006).
18. Koretz (2003); Koretz and Barron (1998).
19. Halverson, Prichett, and Watson (2007); Herman and Gribbons (2001); Lachat and Smith (2005); Supovitz and Klein (2003); Wayman and Stringfield (2006).
20. Halverson, Prichett, and Watson (2007).
21. Hamilton (2003); Koretz and Barron (1998).
To gain deeper insight into students' needs and to measure changes in students' skills during the academic year, teachers also can collect and prepare data from interim assessments that are administered consistently across a district or school at regular intervals throughout the year (see the box below).22 As with annual assessments, interim assessment results generally have the advantage of being comparable across classrooms, but the frequency of their administration means that teachers can use the data to evaluate their own instructional strategies and to track the progress of their current students in a single school year. For instance, data from a districtwide interim assessment could help illuminate whether the students who were struggling to convert fractions to decimals improved after receiving targeted small group instruction, or whether students' expository essays improved after a unit spent reading and analyzing expository writing.

Characteristics of interim assessments

• Administered routinely (e.g., each semester, quarter, or month) throughout a school year
• Administered in a consistent manner across a particular grade level and/or content area within a school or district
• May be commercial or developed in-house
• May be administered on paper or on a computer
• May be scored by a computer or a person

Finally, it is important to collect and prepare classroom performance data for examination, including examples and grades from students' unit tests, projects, classwork, and homework. The panel recommends using these classroom-level data sources, in conjunction with widely accessible nonachievement data such as attendance records and cumulative files,23 to interpret annual and interim assessment results (see the "Examples of classroom and other data" box below). An important advantage of these data sources is that in most cases, they can be gathered quickly to provide teachers with immediate feedback about student learning. Depending on the assignment in question, they also can provide rich, detailed examples of students' academic performance, thereby complementing the results of annual or interim tests. For example, if state and interim assessments show that students have difficulty writing about literature, then examination of students' analytic essays, book reports, or reading-response journals can illuminate how students are accustomed to writing about what they read and can suggest areas in which students need additional guidance.24 An important disadvantage of classroom-level data is that the assignments, conditions, and scores are not generally comparable across classrooms. However, when teachers come together to examine students' work, this variability also can be an advantage, since it can reveal discrepancies in expectations and content coverage that teachers can take steps to remedy.

22. Standards for testing in educational environments are discussed in more detail in American Educational Research Association (AERA), American Psychological Association (APA), and National Council on Measurement in Education (NCME) (1999).
23. The following studies provide examples of available data sources: Owings and Follo (1992); Halverson, Prichett, and Watson (2007); Jones and Krouse (1988); Supovitz and Klein (2003); Supovitz and Weathers (2004); Wayman and Stringfield (2006).
24. This example is drawn and adapted from a case study by Fiarman (2007).
Examples of classroom and other data

• Curriculum-based unit tests
• Class projects
• Classwork and homework
• Attendance records
• Records from parent meetings and phone calls
• Classroom behavior charts
• Individualized educational plans (IEPs)
• Prior data from students' cumulative folders

As teachers prepare annual, interim, and classroom-level data for analysis, they should represent the information in aggregate forms that address their own questions and highlight patterns of interest. For instance, if a teacher wanted to use four waves of interim test data to learn whether students who started the year with weaker mathematics skills were narrowing the gap with their peers, she could make a line graph tracking students' progress on the interim math assessments throughout the year. On the graph, she might create separate lines for students from each performance quartile on the previous year's state mathematics assessment (see Figure 2). Such a graph would allow her to compare the growth trajectories for each group, although she would need to be certain that each quartile group contained numerous students, thereby ensuring that results were not driven by one or two outliers. (Some data systems will include features that make graphing easier and more automatic. See recommendation 5 for more information on data systems.)

In general, preparing state and district data for analysis will be easier for teachers who have access to the kind of districtwide data systems described in recommendation 5, although these teachers still will need to maintain useful records of classroom-level data. Online gradebooks that allow teachers to prepare aggregate statistics by classroom, content area, or assignment type can be useful for identifying patterns in students' classroom-level performance and for identifying students whose classwork performance is inconsistent with their performance on annual or interim assessments.

Figure 2. Example of classroom running records performance at King Elementary School
Source: Supovitz and Klein (2003).
2. Interpret data and develop hypotheses about how to improve student learning.

Working independently or in teams, teachers should interpret the data they have collected and prepared. In interpreting the data, one generally useful objective is to identify each class's overall areas of relative strengths and weaknesses so that teachers can allocate instructional time and resources to the content that is most pressing. Another useful objective is to identify students' individual strengths and weaknesses so that teachers can adapt their assignments, instructional methods, and feedback in ways that address those individual needs. For instance, teachers may wish to adapt students' class project assignments in ways that draw on students' individual strengths while encouraging them to work on areas for growth.

To gain deeper insight into students' learning needs, teachers should examine evidence from the multiple data sources they prepared in action step 1.25 "Triangulation" is the process of using multiple data sources to address a particular question or problem and using evidence from each source to illuminate or temper evidence from the other sources. It also can be thought of as using each data source to test and confirm evidence from the other sources in order to arrive at well-justified conclusions about students' learning needs. When multiple data sources (e.g., results from the annual state assessment and district interim assessment) show similar areas of student strength and weakness (as in Example 1), teachers can be more confident in their decisions about which skills to focus on. In contrast, when one test shows students struggling in a particular skill and another test shows them performing well in that skill, teachers need to look closely at the items on both tests to try to identify the source of the discrepancy. In all cases, they should use classroom and other data to shed light on the particular aspects of the skill with which students need extra help.

As they triangulate data from multiple sources, teachers should develop hypotheses about ways to improve the achievement patterns they see in the data. As the "Forming testable hypotheses" box below explains, good hypotheses emerge from existing data, identify instructional or curricular changes likely to improve student learning, and can be tested using future assessment data. For example, existing data can reveal places in which the school's curriculum is not well aligned with state standards. In those situations, teachers might reasonably hypothesize that reorganizing the curriculum to address previously neglected material will improve students' mastery of the standards. In other cases, teachers may hypothesize that they need to teach the same content in different ways. Taking into account how they and their colleagues have previously taught particular skills can help teachers choose among plausible hypotheses. For instance, teachers may find that students have difficulty identifying the main idea of texts they read. This weak student performance may lead teachers to hypothesize that the skill should be taught differently. In talking to other teachers, they might choose a different teaching strategy, such as a discussion format in which students not only identify the main idea of a text but also debate its evidence and merits.

To foster such sharing of effective practices among teachers, the panel recommends that teachers interpret data collaboratively in grade-level or department-specific teams. In this way, teachers can begin to adopt some common instructional and assessment practices as well as common expectations for student performance.26

25. Halverson, Prichett, and Watson (2007); Herman and Gribbons (2001); Lachat and Smith (2005); Wayman and Stringfield (2006).
26. Fiarman (2007); Halverson, Prichett, and Watson (2007); Halverson et al. (2007).
Collaboration also allows teachers to develop a collective understanding of the needs of individual students in their school, so that they can work as an organization to provide support for all students.

3. Modify instruction to test hypotheses and increase student learning.

After forming hypotheses about students' learning needs, teachers must test their hypotheses by carrying out the instructional changes that they believe are likely to raise student achievement. The kinds of changes they choose to implement may include—but are not limited to—one or more of the following:

• allocating more time for topics with which students are struggling;

• reordering the curriculum to shore up essential skills with which students are struggling;

• designating particular students to receive additional help with particular skills (i.e., grouping or regrouping students);

• attempting new ways of teaching difficult or complex concepts, especially based on best practices identified by teaching colleagues;

• better aligning performance expectations among classrooms or between grade levels; and/or

• better aligning curricular emphasis among grade levels.

If the instructional modification was not developed collaboratively, teachers may nonetheless find it useful to seek feedback from peers before implementing it. This is particularly true if teachers have chosen to enact a large instructional change, such as a comprehensive new approach to algebra instruction or a reorganization of the mathematics curriculum sequence. Because curricular decisions are sometimes made at the school or district level, teachers may even want to make a case for curriculum reorganization with school or district leaders ahead of time.

Forming testable hypotheses

Situation: Based on data from your 3rd-grade class's assignments and assessments, it appears that more than half of the students struggle with subtraction. As their teacher, you ask yourself how they can better master subtraction skills. To answer this question, you hypothesize that the students' subtraction skills might improve if they were taught to use the "trade first" method for subtraction, in which students do their regrouping from the tens to ones column at the beginning, rather than at the end, of the problem. You determine that this hypothesis can be tested by (1) working with these students in a group to teach them the trade first method and (2) examining changes in their subtraction scores on the interim assessment.

Characteristics of testable hypotheses

• Identify a promising intervention or instructional modification (teaching the trade first method for subtraction) and an effect that you expect to see (improvement in the subtraction skills of struggling students)

• Ensure that the effect can be measured (students' subtraction scores on the interim assessment after they learn the trade first strategy)

• Identify the comparison data (students' subtraction scores on the interim assessment before they were taught the strategy)
The time it takes teachers to carry out their instructional changes will depend in part on the complexity of the changes. If teachers are delivering a discrete lesson plan or a series of lessons, then the change usually can be carried out quickly. Larger interventions take longer to roll out than smaller ones. For instance, a teacher whose intervention involves introducing more collaborative learning into the classroom may need time to teach her students to work efficiently in small group settings.

During or shortly after carrying out an instructional intervention, teachers should take notes on how students responded and how they as teachers might modify delivery of the intervention in future classes. These notes may not only help teachers reflect on their own practice but also prepare them to share their experiences and insights with other teachers.

To evaluate the effectiveness of the instructional intervention, teachers should return to action step 1 by collecting and preparing a variety of data about student learning. For instance, they can gather classroom-level data, such as students' classwork and homework, to quickly evaluate student performance after the intervention.27 Teachers can use data from later interim assessments, such as a quarterly district test, to confirm or challenge their immediate, classroom-level evidence.

Finally, after triangulating data and considering the extent to which student learning did or did not improve in response to the intervention, teachers can decide whether to keep pursuing the approach in its current form, modify or extend the approach, or try a different approach altogether. It is important to bear in mind that not all instructional changes bear fruit immediately, so before discarding an instructional intervention as ineffective, teachers should give themselves and their students time to adapt to it.28

Potential roadblocks and solutions

Roadblock 1.1. Teachers have so much data that they are not sure where they should focus their attention in order to raise student achievement.

Suggested Approach. Teachers can narrow the range of data needed to solve a particular problem by asking specific questions and concretely identifying the data that will answer those questions. In addition, administrators can guide this process by setting schoolwide goals that help clarify the kinds of data teachers should be examining and by asking questions about how classroom practices are advancing those goals. For instance, if administrators have asked teachers to devote particular effort to raising students' reading achievement, teachers may decide to focus attention on evidence from state, interim, and classroom assessments about students' reading needs. Teachers should then triangulate data from multiple sources (as described earlier) to develop hypotheses about instructional changes likely to raise student achievement. Note that recommendation 3 describes how administrators, data facilitators, and other staff can help teachers use data in ways that are clearly aligned with the school's medium- and long-term student achievement goals. Also, recommendation 4 describes how professional development and peer collaboration can help teachers become more adept at data preparation and triangulation.

Roadblock 1.2. Some teachers work in a grade level or subject area (such as early elementary and advanced high school grades) or teach certain subjects (such as social studies, music, science, or physical education) for which student achievement data are not readily available.

27. Forman (2007).
28. Elmore (2003).
Example 1. Examining student data to understand learning

Consider this hypothetical example . . . When the 4th- and 5th-grade teachers at Riverview Elementary School met after school in September for their first data meeting of the year, the data facilitator, Mr. Bradley, shared selected data about how students had performed on the previous year's standards-based state accountability test. [Action Step 1] Teachers quickly saw that in both grades, students' proficiency rates were higher in language arts than in mathematics, so they decided to look more closely at particular mathematics skills. Examining the results on each math content strand, the teachers found that although students were performing adequately in arithmetic, they struggled with geometry skills concerning shapes and measurement. [Action Step 2] This news was surprising because, consistent with state standards, teachers taught shapes and measurement in both the 4th and 5th grades.

Because students had already taken their first district-based interim assessment of the school year, the teachers also were able to use the district's data system to look at how students had performed in geometry on that assessment. [Action Step 1] Studying one graph, Ms. Irving, a 4th-grade teacher, observed that the content strand with which students struggled most was measuring perimeters of polygons. Since calculating perimeters was a matter of adding, and students had performed well on the addition strands of both the annual and interim tests, the teachers were perplexed. They decided to collect new data on students' geometry skills using questions from the supplemental workbooks of their standards-based math curriculum. [Action Step 2]

When teachers brought their students' workbook responses to the next data meeting, they gathered in small groups to examine the students' work and generate hypotheses. As they shared the classwork examples, they noticed a pattern. Students performed well on simple perimeter problems when the shapes were drawn for them, but on word problems that required them to combine shapes before adding, they largely faltered. [Action Step 2] The teachers hypothesized that students' difficulties were not with calculating perimeters, but with considering when and how to combine polygons in response to real-world problems. They further hypothesized that students would benefit from opportunities to apply basic geometry skills to novel situations.

Working together in grade-level teams, the teachers devised tasks for their students that would require them to use manipulatives and online interactive simulations to solve perimeter problems about floor plans and land use. [Action Step 3] The teachers agreed to deliver these lessons in their classrooms and report back on how the students responded.

At the next data meeting, teachers brought implementation notes and samples of student work from the hands-on perimeter lessons. [Action Step 1] Most reported that students were engaged in the lessons but needed additional practice. After readministering similar lessons two weeks later, most teachers found that their students were getting the hang of the task. On the next interim assessment, teachers were pleased to learn that the percentage of perimeter and area questions answered correctly had increased from 40 percent to 70 percent across the two grades. [Action Step 2]
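Strand-level summaries like the ones the Riverview teachers compared can be produced with very simple tooling. The following is a minimal sketch, not part of the panel's recommendations: it assumes hypothetical item-level records (student, content strand, correct or incorrect) exported from a district data system and computes the percentage of items answered correctly in each strand. Rerunning the same summary on a later assessment gives a quick check of whether results moved in the expected direction, as they did in this example.

```python
from collections import defaultdict

# Hypothetical item-level records exported from a district data system:
# (student_id, content_strand, answered_correctly)
item_results = [
    ("s01", "perimeter", False), ("s01", "addition", True),
    ("s02", "perimeter", True),  ("s02", "addition", True),
    ("s03", "perimeter", False), ("s03", "addition", True),
]

correct = defaultdict(int)
attempted = defaultdict(int)
for _, strand, is_correct in item_results:
    attempted[strand] += 1
    correct[strand] += int(is_correct)

# Percent of items answered correctly in each content strand.
for strand in sorted(attempted):
    pct = 100 * correct[strand] / attempted[strand]
    print(f"{strand}: {pct:.0f}% of items answered correctly")
```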


Suggested Approach. Part of the work of collaborative data use involves establishing shared learning goals and expectations across classrooms.29 District or school administrators can help this effort by providing an interim, schoolwide assessment, ideally linked to state standards, that allows the comparison of results across classrooms.30 Alternatively, teachers can collaborate to develop their own interim assessments. Some schools, for instance, develop interim writing prompts or other assessments that are administered throughout the school and scored using a common rubric.31 (Example 5 in recommendation 2 illustrates this approach.) Although in-house assessments may lack the validity of commercially developed tests, they nevertheless provide common metrics by which teachers can assess their students and share results with colleagues.32 Similarly, teachers of supplemental subjects such as art, music, and physical education can develop performance assessments linked to schoolwide student goals.33

Roadblock 1.3. Some schools or districts encourage staff to use data to identify students scoring just below proficiency on state tests and to focus disproportionate effort on helping them reach proficiency.

Suggested Approach. Teachers and principals in some schools have reported focusing extra resources on "bubble kids," or students scoring immediately below a proficiency cut-off on a high-stakes assessment.34 The panel cautions against this practice because results from any single test are imprecise and always should be considered in conjunction with other data. Also, undue focus on students scoring near proficiency may lead schools to distribute instructional resources inappropriately.35 For instance, students scoring further from the cut score (in either direction) may have just as many—if not more—distinctive instructional needs as those scoring near the cut score. Instead of focusing mainly on students scoring just below proficiency on a particular assessment, educators should use data from multiple sources to identify and serve the needs of all students. When possible, additional resources and support should be directed toward students whose needs are the greatest. (See the What Works Clearinghouse guides on Response to Intervention for more suggestions on tiered student support.)

Roadblock 1.4. Some district leaders suggest that schools assign students to courses based solely on proficiency levels on the state accountability test.

Suggested Approach. Tests should be used for the purposes for which they have been validated; most existing assessments have not been validated for the purpose of making decisions about course placement. In addition, the professional standards for appropriate use of test scores in educational settings state that a single test score should not be used to make high-stakes decisions about individuals; instead, educators and administrators should consider multiple sources of information when assigning students to courses or programs.36 Proficiency on a state accountability test can provide one indicator of a student's readiness or need for a specific instructional program, but other information, such as prior performance in similar courses, should be taken into account. Finally, educators should reconsider decisions about placement when new data become available.

29. Datnow, Park, and Wohlstetter (2007); Williams Rose (2006); Rossmiller and Holcomb (1993); Togneri (2003); Wayman, Cho, and Johnston (2007).
30. Wayman, Midgley, and Stringfield (2006).
31. See, for example, Fiarman (2007).
32. Shepard et al. (1996).
33. See, for example, Forman (2007).
34. Booher-Jennings (2005); Brunner et al. (2005); Hamilton et al. (2007); Long et al. (2008).
35. Booher-Jennings (2005).
36. AERA, APA, and NCME (1999).

Recommendation 2. Teach students to examine their own data and set learning goals

Teachers should provide students with explicit instruction on using achievement data regularly to monitor their own performance and establish their own goals for learning. This data analysis process—similar to the data use cycle for teachers described in recommendation 1—can motivate both elementary and secondary students by mapping out accomplishments that are attainable, revealing actual achievement gains and providing students with a sense of control over their own outcomes. Teachers can then use these goals to better understand factors that may motivate student performance and adjust their instructional practices accordingly.

Students are best prepared to learn from their own achievement data when they understand the learning objectives and when they receive data in a user-friendly format. Tools such as rubrics provide students with a clear sense of learning objectives, and data presented in an accessible and descriptive format can illuminate students' strengths and weaknesses (see recommendation 5 for more information on reporting formats).37 Many practices around data rely on the assumption38 of a relationship between formative assessment and feedback use and student achievement. When combined with clear data, instructional strategies such as having students rework incorrect problems can enhance student learning.39

Level of evidence: Low

The panel judged the level of evidence supporting this recommendation to be low, based on two studies with causal designs that met WWC standards and drawing on additional examples of practices from qualitative and descriptive studies and on their own expertise. One randomized controlled trial that met WWC standards with reservations found positive effects of interventions that combined student analysis of data with other practices, such as teacher coaching, teacher professional development, and/or classroom management interventions; therefore, the panel could not attribute impacts to student data analysis alone.40 A second randomized controlled trial met WWC standards and reported positive effects of a web-based data tool for students, but the size and statistical significance of these effects could not be confirmed by the WWC; therefore, it does not provide the panel with strong causal evidence that having students examine their own data is an effective intervention.41

Brief summary of evidence to support the recommendation

Two randomized controlled trials that met WWC standards (one with and one without reservations) found positive effects of interventions in which students used their own assessment data. One study found that curriculum-based measurement interventions combined with student analysis

37. Black et al. (2003).
38. Black and Wiliam (1998) and Kluger and DeNisi (1996) examine the relationship between assessment and student learning in their respective meta-analyses on the topic. However, the studies included in those meta-analyses were outside the date range or otherwise outside the scope of the literature review for this guide, or they used noncausal designs that did not meet WWC evidence standards.
39. Clymer and Wiliam (2007).
40. Phillips et al. (1993).
41. May and Robinson (2007).


of their own assessment data and feedback from their teachers led to statistically significant gains in student achievement.42 A second study reported statistically significant gains in achievement for students given access to an interactive website reporting student test scores and providing advice for improving those scores. However, the WWC could not confirm the statistical significance of these gains.43 To add detail and specificity to this recommendation, and to supplement the information available in these two studies, the panel relied upon its own expertise and referred to several case studies and descriptive analyses for examples of feedback and for information needed to construct sample feedback tools.

How to carry out this recommendation

1. Explain expectations and assessment criteria.

To interpret their own achievement data, students need to understand how their performance fits within the context of classroom-level or schoolwide expectations. Teachers should articulate the content knowledge or skills that they expect students to achieve throughout the school year, conveying goals for individual lessons and assignments, as well as goals for the unit and end-of-year performance. Teachers should explicitly describe the criteria that will be used to assess performance toward those goals.

For example, when teachers use a rubric to provide feedback (an example is provided in Example 2), teachers should introduce the rubric at the beginning of the assignment so that students know which criteria are important before they begin working on a task or assignment.44 Rubrics can provide useful feedback on complex skills such as writing an effective essay or term paper, delivering a persuasive speech, or executing a science experiment. Teachers also can have students assess a sample assignment using the rubric to help them better understand the criteria. Once the students' actual assignments are completed and evaluated, students should receive the completed rubric from the teacher.

Because public school students in many grades are required to take annual standards-based accountability tests in selected subjects, teachers should help students understand the state standards they are expected to meet by regularly revisiting the standards throughout the year. For example, a 5th-grade teacher could spend a few minutes at the beginning of an instructional unit explaining that certain essential concepts in the lesson (e.g., literary devices such as similes) may appear on the annual test. Students could keep a running list of these standards-based concepts throughout the year, using the list as a basis for review before the annual test. Note that making students familiar with content standards is not the same as engaging in extensive practice using problems or tasks designed to mirror the format of a specific test. The latter may result in spurious test-score gains and is not recommended by the panel.45

2. Provide feedback to students that is timely, specific, well formatted, and constructive.

Providing students with thoughtful and constructive feedback on their progress may improve academic achievement.46 Feedback should be designed to help students understand their own strengths and weaknesses, explaining why they received the grades and scores they did and identifying the specific content areas the student should focus on

42. Phillips et al. (1993).
43. May and Robinson (2007).
44. Lane et al. (1997).
45. Hamilton (2003).
46. May and Robinson (2007); Phillips et al. (1993).


Example 2. Example of a rubric for evaluating five-paragraph essays

Scoring levels: 1 Beginning, 2 Developing, 3 Accomplished, 4 Exemplary

Organization and Content

Introduction paragraph
1 Beginning: Central argument is unclear
2 Developing: Central argument is vaguely indicated
3 Accomplished: Central argument is clearly stated
4 Exemplary: Central argument is clearly stated in a way that commands attention

Body paragraphs
1 Beginning: None have clear main ideas; provide little to no evidence to support the central argument
2 Developing: Some have clear main ideas; provide weak or unconvincing evidence to support the central argument
3 Accomplished: All have clear main ideas; provide mostly convincing evidence to support the central argument
4 Exemplary: All have clear main ideas that are smoothly connected to other ideas in the essay; provide insightful and compelling evidence to support the central argument

Concluding paragraph
1 Beginning: Does not summarize main points of the essay; does not restate central argument
2 Developing: Summarizes some main points of the essay; restates central argument in a repetitive way
3 Accomplished: Summarizes main points of the essay accurately; restates central argument in a new way
4 Exemplary: Summarizes main points in a way that commands attention; restates central argument in a new and thought-provoking way

Overall organization
1 Beginning: Paragraph transitions are sudden and not smooth; organization of ideas is not clear
2 Developing: Paragraph transitions are sometimes awkward; ideas show some organization
3 Accomplished: Paragraph transitions are present; ideas are organized in a logical way
4 Exemplary: Paragraph transitions are seamless; ideas are organized in a logical and engaging way

Overall content
1 Beginning: Ideas seem unoriginal and/or unconvincing
2 Developing: Ideas seem somewhat reasonable
3 Accomplished: Ideas seem logical and convincing
4 Exemplary: Ideas seem unusually insightful or illuminating

Grammar and Usage

Paragraphing
1 Beginning: Does not use paragraph breaks and indentations to separate important ideas
2 Developing: Uses paragraph breaks and indentations inconsistently or in illogical places
3 Accomplished: Uses paragraph breaks and indentations consistently
4 Exemplary: Uses paragraph breaks consistently and very accurately

Capitalization
1 Beginning: Includes many capitalization errors
2 Developing: Includes several capitalization errors
3 Accomplished: Includes a few capitalization errors
4 Exemplary: Free of capitalization errors

Sentence structure
1 Beginning: Includes numerous fragments and/or run-on sentences
2 Developing: Includes occasional fragments and/or run-on sentences
3 Accomplished: Free of fragments and run-on sentences
4 Exemplary: Free of fragments and run-on sentences, and uses varied sentence structures

Punctuation
1 Beginning: Includes many punctuation errors
2 Developing: Includes several punctuation errors
3 Accomplished: Includes a few punctuation errors
4 Exemplary: Free of punctuation errors
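Because a rubric assigns a numeric level to each criterion, scored rubrics can also be treated as simple data. The sketch below is a hypothetical illustration rather than a tool recommended by the guide; the student's scores are invented, and the criterion names follow Example 2. It flags criteria scored at the Beginning or Developing level as candidate areas for growth, the kind of list a student might carry into the reflection worksheet in Example 3.

```python
# Hypothetical record of one scored essay, using the criteria in Example 2.
essay_scores = {
    "Introduction paragraph": 3,
    "Body paragraphs": 2,
    "Concluding paragraph": 3,
    "Overall organization": 2,
    "Overall content": 3,
    "Paragraphing": 4,
    "Capitalization": 3,
    "Sentence structure": 2,
    "Punctuation": 3,
}

# Criteria scored at 1 (Beginning) or 2 (Developing) become candidate
# "areas for growth" for the student's reflection sheet (see Example 3).
growth_areas = [criterion for criterion, score in essay_scores.items() if score <= 2]
print("Areas for growth:", ", ".join(growth_areas))
```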


to improve their scores. Such feedback often has the following characteristics:

• Timely. Feedback should be rapid so that students still remember the task and the skills on which they were being assessed.47 The panel recommends that assessment data be returned to students within a week of collecting the assignment, and sooner when possible.

• Appropriately formatted. When providing feedback, teachers should select a mode of delivery (e.g., rubric based, handwritten, or typed) that best meets students' needs based on their grade level, the subject area, and the assignment. Typed feedback, for example, may be appropriate in response to students' larger projects, whereas handwritten feedback may suffice on short assignments and student journals or as supplemental feedback at the end of a rubric-based evaluation. Additionally, teachers' feedback should be based on a shared understanding of expectations and scoring criteria.

• Specific and constructive. Regardless of the format, feedback should provide concrete information and suggestions for improvement.48 Feedback in the form of explanations, examples, and suggestions for additional practice is more concrete and easier for students to act on than a score or letter grade alone, and it may increase students' confidence and motivate better performance.49 For this reason, teachers should avoid providing feedback that is exclusively focused on what should have been done or delivers vague praise without often specifying why a particular piece of work is praiseworthy.50

3. Provide tools that help students learn from feedback.

Simply giving students assessment data that are accessible and constructive does not guarantee that they will know what to do with the data. Students need the time and tools to analyze the feedback; otherwise, they may simply glance at the overall score without considering why they achieved that score and what they could do to improve.

When providing feedback, teachers should set aside 10 to 15 minutes of classroom instructional time to allow students to interpret and learn from the data. It is important to undertake this reflection during class time, when the teacher can help students interpret feedback and strategize ways to improve their performance. During this time, teachers should have students individually review written feedback and ask questions about that feedback.

Teachers also can provide students with paper- or computer-based tools for interpreting feedback, such as the following:

• a template for listing strengths, weaknesses, and areas to focus on for a given task (see Example 3);51

• a list of questions for students to consider and respond to (e.g., "Can I beat my highest score in the next two weeks?" and "Which skills can I work harder on in the next two weeks?");52

• worksheets to facilitate reflection about incorrect items (see Example 4);53

47. Black and Wiliam (1998); Stiggins (2007).
48. Black and Wiliam (1998); Brunner et al. (2005).
49. Clymer and Wiliam (2007); Schunk and Swartz (1992).
50. Black et al. (2003); Black and Wiliam (1998); Shepard (1995).
51. Stiggins (2007).
52. Phillips et al. (1993).
53. Stiggins (2007).


Example 3. Example of a student’s worksheet for reflecting on strengths


and weaknesses
Areas of Strength and Areas for Growth
Topic: Writing a Five-Paragraph Essay
Based on: Rubric-based feedback from my last two essays
Name: Jane B. Student
Areas of Strength Areas for Growth
Organization and Content Organization and Content
• Stating main idea in first paragraph • Need to state main idea of each
• Restating main idea in conclusion body paragraph
• Choosing a topic I know well • Need to provide examples in each
body paragraph

Grammar and Usage Grammar and Usage


• Indenting paragraphs • Using quotations correctly
• Correctly capitalizing sentences and • Avoiding sentence fragments
proper nouns (example: “Because he wanted to.”)

• teacher-generated graphs that track student progress over time;54 and/or

• grids on which students can record baseline and interim scores to track gains over time in specific dimensions.55

For example, after returning test results to students at the beginning of the school year, a teacher might ask all students to identify specific strengths and weaknesses by analyzing their responses to specific questions on the test. She could then guide the students to submit in writing realistic improvement goals for two particular skills with weak scores. Students with no demonstrated weaknesses could be invited to select a topic for which enrichment could be provided. By helping students make data-based decisions about their own learning goals, the teacher would be emphasizing their responsibility for improving their own learning.

It also is possible to use reflective data tools in subjects such as math, for which rubrics are less common. For instance, Example 4 illustrates a worksheet students might use for understanding the errors they made on a mathematics test. The purpose of such a tool is for students to learn to diagnose their own errors, distinguishing careless mistakes from concepts that they still need to master.

4. Use students' data analyses to guide instructional changes.

Although data analysis tools help students learn from teacher feedback, they also provide valuable information that teachers can use to inform instruction. Teachers should collect and review students' goals and analyses to identify content areas and skills that need to be reinforced and factors that may motivate student learning. For example, teachers can

• review error worksheets (see Example 4) to identify concepts that need to be retaught;

54. Clymer and Wiliam (2007); Stecker (1993).
55. Lane et al. (1997).

Example 4. Example of a student’s worksheet for learning from


math mistakes
Learning from Math Mistakes
Test: Unit 2, Single-Variable Equations
Name: Joe A. Student
Correct
Answer Steps for Need to
Problem (from posttest Solving Reason review this
Number My Answer review) (fill in) Missed concept?
Order of
10 x = √21 x=3 Yes
operations
Dividing by a
18 x = 3/32 x = –3/2 Yes
fraction
Square
27 x=4 x = 4 or –4 No
roots

• organize small group instruction around the subsets of goals that students prioritized for themselves; and

• tally the concepts that students in the class identify as their weaknesses and provide full-class review on the most frequently mentioned weaknesses.

Potential roadblocks and solutions

Roadblock 2.1. Students view the feedback they receive as a reflection on their ability rather than an opportunity for focused improvement.

Suggested Approach. Teachers should give student feedback that is explanatory and provides students with a chance to improve.56 Teachers should emphasize the students' level of performance on a task in relation to the learning goals and avoid making global statements about the student's ability. Encouraging goal setting also is important because students may be more willing to view feedback as a source of useful information if there is a larger goal that they are working to achieve.57

Roadblock 2.2. Teachers within a school have different approaches to providing feedback to their students.

Suggested Approach. Although each teacher should engage with students in ways he or she finds effective, teachers may nevertheless benefit from professional development on how to provide concrete and constructive feedback that informs student learning through students' own data. Teachers should collaborate with peers to develop a shared understanding about what constitutes formative feedback, and how and when such feedback should be provided (see recommendation 1). Teachers may even benefit from inviting students to take part in these conversations and share how they use and respond to instructional feedback.

Roadblock 2.3. Teachers are concerned that they do not have enough instructional time to explain rubrics or help students analyze feedback.

56. Black et al. (2003); Black and Wiliam (1998); Shepard (1995); Wesson (1991).
57. Lee and Gavine (2003); Thurman and Wolfe (1999).


Example 5. Teaching students to examine data and goals

This story provides an example of how to implement all four action steps in this recommendation. The example focuses on language arts instruction, for which rubric-based assessment is commonplace (see Examples 2 and 3). However, it also is possible to use reflective data tools in subjects such as math, for which rubrics are less common (see Example 4).

At Southside Middle School, language arts teachers assign a five-paragraph essay prompt to students once per quarter as a schoolwide interim assessment. The language arts teachers jointly design a rubric (see Example 2) that they all use to assess and score the essays. Each quarter, after the essays are scored, they bring examples of strong and weak essays to their monthly data team meetings, at which they share the examples and discuss instructional strategies that might improve students' performance. Students, meanwhile, maintain the scored essays and rubrics in assessment portfolios, which they use to gauge their own progress over time.

In preparing her students for the quarterly writing assessment, Ms. Alvarez had her students reexamine a blank version of the rubric (see Example 2) and asked them to remind her of what each of the standards meant. She then provided a sample student essay and had students score it using the writing rubric. [Action Step 1] Next, students discussed in pairs how they rated the essay on each standard, and why. Finally, Ms. Alvarez walked students through how she would score the essay, asking students to weigh in on her reasoning as she talked.

When assessment day came, students wrote their five-paragraph essays in response to a new schoolwide prompt. Ms. Alvarez spent the next three afternoons evaluating student essays using the rubric, making notes on the rubric to clarify the marks she gave. [Action Step 2] She also followed each rubric with a summary note about the essay's strengths and weaknesses. When all essays were scored, she first returned them to the students without the marked rubrics. Ms. Alvarez had students reread their own essays and list what they considered to be the main strengths and weaknesses. Next, she returned the marked-up rubrics and had students read her feedback to decide how well her assessment matched their own self-assessment. If there were large discrepancies, she asked students to meet with her after class to discuss them. [Action Step 3] She then distributed a handout that students used to list their areas of strength and weakness (see Example 3). Using the teacher's rubric-based feedback as well as their own self-assessments, students recorded areas of strength and weakness they needed to consider in undertaking future writing tasks. Ms. Alvarez collected and reviewed the lists and realized that many students struggled with providing examples in the body of the essay. She then revised her lesson plans for the following day to spend more time reviewing this topic with her students. [Action Step 4]
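Action step 4 asks teachers to look across students' self-assessments for patterns worth reteaching. The sketch below is a hypothetical illustration of that tallying step, in the spirit of Ms. Alvarez's review; the student names and the wording of their areas for growth are invented. It counts how often each area appears so that the most frequently mentioned ones can be addressed with the whole class.

```python
from collections import Counter

# Hypothetical "areas for growth" collected from students' reflection sheets (Example 3).
growth_lists = {
    "Jane": ["providing examples in body paragraphs", "avoiding sentence fragments"],
    "Marcus": ["providing examples in body paragraphs", "using quotations correctly"],
    "Aisha": ["stating the main idea of each body paragraph",
              "providing examples in body paragraphs"],
}

# Count how many students flagged each area, then surface the most common ones
# as candidates for whole-class review.
tally = Counter(area for areas in growth_lists.values() for area in areas)
for area, count in tally.most_common(3):
    print(f"{count} student(s) flagged: {area}")
```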


Suggested Approach. The panel recognizes that instruction time is limited. However, time spent explaining assessment tools and strategies for analyzing feedback is essential to helping students understand their own achievement. Thus, it should be a natural, integral part of the teaching process—not an add-on activity. Incorporating time for students' analysis of their own data into routine classroom activities may help students develop a habit of learning from feedback, making them more independent as the year progresses. Helping students understand assessment tools and analyze feedback also puts students at the vanguard of the school's culture of data use.

Recommendation 3. Establish a clear vision for schoolwide data use

Schools must establish a strong culture of data use to ensure that data-based decisions are made frequently, consistently, and appropriately.58 This data culture should emphasize collaboration across and within grade levels and subject areas59 to diagnose problems and refine educational practices.60 Several factors (e.g., planning, leadership, implementation, and attitude) affect the success schools will have with developing and maintaining a data culture. Here, the panel suggests steps schools should take toward establishing their vision, while recognizing that following the suggestions does not guarantee that a strong culture will emerge.

A clear plan for schoolwide data use is essential to developing such a culture. Schools should establish a representative data team to help ensure that data activities are not imposed on educators, but rather are shaped by them.61 This team should develop a written data-use plan that is consistent with broader school and district goals, supports a common language related to data use and teaching and learning concepts, and establishes data use as one of the key responsibilities of an education professional.62

Level of evidence: Low

Believing that a clear vision for data use is essential to educators wishing to improve instruction through interpreting data, the panel drew from its own knowledge and the findings and examples in case studies and descriptive analyses to inform the development of this recommendation. No studies were identified that examine the effects of establishing a data team or creating a data-use plan on student achievement, so the panel judged the level of evidence supporting this recommendation as low.

Brief summary of evidence to support the recommendation

A strong culture of data use, conveyed through a clear schoolwide vision, is critical to ensure that data-based decisions are made routinely, consistently, and effectively. This point is conveyed in a number of studies that use qualitative designs to examine how schools and districts have implemented data use. Appendix D contains two examples of case studies the panel referenced when developing the action steps in this recommendation. One describes how a set of districts and schools has worked to develop achievement goals and to use student data to support progress toward those goals,63 whereas the other describes an example of how one school has its staff share responsibility for data use to avoid burnout.64 However, the panel identified no causal evidence linking the creation of a schoolwide culture or vision to improved student performance.

58. Datnow, Park, and Wohlstetter (2007); Williams Rose (2006).
59. Armstrong and Anthes (2001); Datnow, Park, and Wohlstetter (2007); Knapp et al. (2006).
60. Datnow, Park, and Wohlstetter (2007); Gentry (2005).
61. Anderson et al. (2006); Feldman and Tung (2001); Wayman, Cho, and Johnston (2007).
62. Datnow, Park, and Wohlstetter (2007); Williams Rose (2006); Rossmiller and Holcomb (1993); Wayman, Cho, and Johnston (2007).
63. Datnow, Park, and Wohlstetter (2007).
64. Copland (2003).


How to carry out this recommendation

1. Establish a schoolwide data team that sets the tone for ongoing data use.

Principals should establish a data team that will clarify and guide the school's vision for the most effective use of data.65 This team should include a balanced assortment of stakeholders who can solicit input from all aspects of the school, such as:

• a senior member of the school's administration (e.g., principal, assistant principal);

• two or three teachers representing various subjects and grade levels;

• one or two classroom support professionals (e.g., reading coaches); and/or

• if possible, a district-level staff member who works in research, evaluation, or assessment.

Principals should invite individuals who have knowledge—or have a desire to gain knowledge—of data analysis and interpretation. Some staff, especially those with statistics training or special education certification, may have experience with data analysis and interpretation.66 Principals also should consider staff with strong leadership skills and the ability to motivate fellow teachers, especially if these individuals express an interest in using data to improve student achievement.

It is important to note that a data team is a committee of advisors on data use within the school. Additionally, the team represents the entire school community, so decisions should be made in collaboration with the different perspectives represented within the school. It is not the role of team members to hold staff accountable for data use, manage or supervise data-related activities, or provide expert advice on data implementation and analysis. Instead, team members should clarify the school's data vision and model the use of data to make instructional decisions, encouraging other school staff to do the same.

2. Define critical teaching and learning concepts.

At its outset, the data team should develop a shared vocabulary for critical concepts related to education in general and data use in particular. The panel recommends that school staff agree about the definition of terms such as learning, data, evidence, and collaboration. Some educators, for example, may define data simply as test scores, whereas others may define it as any available information about a student. Developing a shared vocabulary will help minimize misunderstandings and conflicting assumptions among school staff.67

Some critical concepts to define68
• Achievement
• Collaboration
• Data
• Evidence
• Learning
• Progress

3. Develop a written plan that articulates activities, roles, and responsibilities.

Based on the data team's discussions, as well as full staff input, the team's administrator and teachers should write a plan that clearly articulates how the school will use data to support school-level goals for

65. Halverson and Thomas (2007); Hill, Lewis, and Pearson (2008); Moody and Dede (2008).
66. Bettesworth (2006).
67. Wayman, Cho, and Johnston (2007); Wayman, Midgley, and Stringfield (2006).
68. Waters and Marzano (2006); Wayman, Cho, and Johnston (2007).


improving student achievement.69 These goals, developed by school and district leadership, already exist in most schools. To create conditions for effective data use, the data team should briefly revisit the school's goals to ensure that they are

• attainable, in that they are realistic given existing performance levels;

• measurable, in that they clearly express the parameters of achievement and can be supported by data70; and

• relevant, in that they take into account the specific culture and constraints of the school.71

For example, a school in which half the students can read at grade level may decide to set a long-term goal of having 75 percent of students reading on grade level within five years. It then would seem reasonable for the school to set ambitious but achievable annual goals to increase the share of students reading at grade level by 5 percentage points per year. If the data team determines that the goals do not meet the criteria of seeming attainable, measurable, and relevant, it may wish to establish short- and medium-term goals that do meet these criteria.

With the school's goals identified and clarified, the data team should prepare a written plan specifying72

• specific actions for using data to make instructional decisions;

• staff and team members responsible for carrying out those actions;

• timelines for executing the actions; and

• how each action helps the school reach its long-term goals.

Example 6 provides a hypothetical plan for tying data use to school goals. The example illustrates how a data team might map a clear rationale from each action to the school's larger goal of improved reading proficiency, and how each data team member might take responsibility for executing a portion of the larger plan. The panel encourages schools to develop other similar plans, including detailed lists of data-use responsibilities by staff role and timelines for using data, but provides this table as a sample of how an actionable plan might look.

The team should revisit the plan annually,73 using data to determine appropriate changes to meet the needs and goals of the school and its students. Revising the plan in this way mirrors the cycle of instructional improvement, further establishing a culture of data-based decision making throughout the school.

4. Provide ongoing data leadership.

Once the plan is developed, the data team should provide guidance on using data to support the school's vision, with the ultimate aim of developing the capacity of all school staff to use data. At the outset, members of the data team should regularly interact with school staff about data and its uses, oftentimes serving as data facilitators (see recommendation 4). For example, team members can educate school staff, district representatives, or parents about the school's vision for data use by having individual or small group meetings

69. Armstrong and Anthes (2001); Mason (2002); Togneri (2003).
70. Datnow, Park, and Wohlstetter (2007); Feldman and Tung (2001); Young (2006).
71. Halverson et al. (2007); Leithwood et al. (2007).
72. Datnow, Park, and Wohlstetter (2007).
73. Wayman, Cho, and Johnston (2007) recommend revisiting the plan frequently. The panel recommends doing so on at least an annual basis.


Example 6. Example of a written plan for achieving school-level goals

Schoolwide Goal: Increase percentage of students reading on grade level 5 percentage points per year, to reach 75 percent in five years.

Action: Plan and facilitate monthly grades 4–6 team meetings to review Ms. Sanders's data displays and share best practices in mini-lessons co-planned by Mr. Johnson; plan and facilitate monthly grades 1–3 team meetings to do the same.
Path to goal: Focus on areas of greatest student need; calibrate and elevate expectations among teachers; streamline instructional practices; share practices that work; encourage vertical alignment between grades.
Team members: Mike Thompson, grades 4–6 team leader; Beth Miller, grades 1–3 team leader.
Timeline: Hold first meeting by October 10; second by November 15.

Action: Prepare well-chosen data graphs on PowerPoint (state or interim data updates) for monthly grade-level team meetings.
Path to goal: Help teachers gain facility in using data; focus teachers' attention and inquiry on areas of particular strengths and weaknesses in students' reading skills.
Team member: Erin Sanders, data facilitator.
Timeline: Carry out monthly; distribute examples at November data team meeting.

Action: Have teachers choose their favorite reading instructional strategy and prepare sample lessons and evidence of student work. Schedule teachers to present these during part of their grade-level team meetings.
Path to goal: Share and standardize best practices among classrooms; encourage culture of instructional improvement; reinforce evidence-based practice.
Team member: Lionel Johnson, reading coach.
Timeline: Bring schedule to November data team meeting; hold first session by October 10.

Action: Register and prepare data team for 4-day offsite workshop on interpreting assessment data, creating data displays, and helping teachers use data daily.
Path to goal: Increase ability of data team to understand and use data; develop capacity for distributing leadership within the school.
Team member: Samantha Roberts, assistant principal.
Timeline: October 15.
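The schoolwide goal in this plan implies a simple set of annual milestones. The sketch below is an illustrative calculation, not a required planning tool; it assumes the baseline and horizon used in the guide's example (half of students reading on grade level now, 75 percent within five years) and spreads the improvement evenly across years, which reproduces the 5-percentage-point annual targets a data team could check progress against.

```python
# Baseline and horizon follow the reading goal in recommendation 3 and Example 6:
# 50 percent of students reading on grade level now, 75 percent within five years.
def annual_targets(baseline_pct, goal_pct, years):
    step = (goal_pct - baseline_pct) / years  # 5 percentage points per year here
    return [baseline_pct + step * year for year in range(1, years + 1)]

for year, target in enumerate(annual_targets(50, 75, 5), start=1):
    print(f"Year {year}: {target:.0f} percent of students reading on grade level")
```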


focused on these topics. Team members also can

• provide resources and support for data analysis and interpretation, such as information about professional development sessions and access to necessary technologies;

• encourage educators to use data in their daily work by modeling data use strategies;

• create incentives to motivate staff to analyze data (e.g., "Staff Member of the Month" award for excellent data use, recognition in the school newsletter); and

• participate in grade- and subject-level meetings to ensure that structured collaboration time is used effectively (see recommendation 4).

Once staff members become comfortable with data use, team members will not need to provide the same level of guidance and support as indicated earlier.

The data team should meet monthly to monitor the school's progress in executing plan components and adhering to timelines. The meetings also can be used to share successes and challenges in integrating the school's vision for data use. Each month, one team member should be designated to set the agenda for the next meeting.

Maintaining a data team, or building data responsibilities into an existing team, may be a positive contribution to the school's data culture. Team members encourage and guide school staff in developing their capacity to use data effectively, transforming student performance data into information that can inform instruction. Both the team and associated capacity-building efforts help ensure that no one individual—such as a principal or a data-savvy grade-level team leader—is left to help all staff use data in ways that advance school goals.74 "Distributed leadership," a practice often hypothesized as an important characteristic of effective schools, is one way to accomplish this task.75

Potential roadblocks and solutions

Roadblock 3.1. School staff do not have time to develop an additional plan for how to use data.

Suggested Approach. To alleviate the pressure of creating a new plan, the plan for data use could be incorporated into an existing school improvement plan.76 Research also has described schools that viewed this effort as ultimately time efficient, describing their efforts as "making time to save time."77

Roadblock 3.2. No one is qualified (or wants) to be on the data team.

Suggested Approach. Consider the strengths and leadership skills of individuals in your school; many have related training and skills that will make them strong team members. For example, new teachers, or those who recently completed continuing education programs, may have applicable data knowledge if their programs provided training on the use of data to make instructional decisions. Similarly, some teachers and staff may be able to provide enthusiasm and leadership that inspire others to support the data-use process. Once qualified and interested staff are identified, consider encouraging participation in the data team by offering a small stipend from the principal's discretionary funds.

74. Copland (2003); Wayman, Cho, and Johnston (2007); Wayman and Stringfield (2006).
75. Halverson et al. (2007); Spillane, Halverson, and Diamond (2004).
76. Mason (2002); Rossmiller and Holcomb (1993).
77. Wayman, Brewer, and Stringfield (2009).


Roadblock 3.3. The few data-savvy staff at the school are overwhelmed by questions and requests for assistance.78

Suggested Approach. It is important for principals and district leaders to protect people's time by clearly defining roles and responsibilities in enforceable job descriptions.79 Principals also can encourage all members of the data team to train other educators to use and interpret data. Phasing data use into the entire school can help prevent staff burnout, deepen staff data literacy, and encourage schoolwide support and implementation of the data-based decision-making process.80

Roadblock 3.4. The district does not have research and development staff to participate in the school-level data team.

Suggested Approach. The size of a district may determine if research and development staff are present, or if there are enough research and development staff to participate in school-level data teams. If district staff cannot participate in school-level teams, however, the principal should ensure that any district-level message about data use is accurately presented to data team members.

78. Halverson and Thomas (2007).
79. Young (2006).
80. Means et al. (2009).

Recommendation 4. Provide supports that foster a data-driven culture within the school

Schools and districts can make concrete changes that encourage data use within schools.81 These changes need to ensure that teachers, principals, and school and district staff have a thorough understanding of their roles in using data, and that they possess the knowledge and skills to use data appropriately. Schools and districts should invest in leadership, professional development, and structured time for collaboration.82 They also may need to invest in additional resources, including relevant technologies83 and specialized staff.84

Level of evidence: Low

Two studies that met WWC standards or that met WWC standards with reservations tested interventions that included coaching and feedback to help teachers interpret and make changes based on assessment data (the interventions included other practices as well).85 These interventions had no discernible effects on student achievement. Although one study also reported that teachers in the coaching group more frequently used pupil observations to modify lessons,86 this outcome was not measured in a way that allowed the authors or the WWC to compute the magnitude or statistical significance of any effect of this change on instructional practice. The panel also identified one correlational study that found a significant positive association between coaching and reading achievement (however, the study design does not permit causal inferences about the effect of coaching).87 Although these studies, supplemented by findings from qualitative analyses and their own expertise, helped the panel develop the steps under this recommendation, the level of evidence supporting this recommendation is low.

Brief summary of evidence to support the recommendation

Although the panel believes that the steps under this recommendation are essential and findings of numerous qualitative analyses report that supporting staff in data use is important, limited rigorous evidence exists to demonstrate that schoolwide supports for data use lead to achievement gains. Two studies tested interventions that included coaching and feedback to help teachers interpret and make changes based on assessment data.88 In both cases, the coaching was only one component of the intervention, and the intervention was compared with a competing intervention (as opposed to business as usual). One study compared the students of teachers who received coaching to use data to track student progress and make instructional changes with the students of teachers who received coaching on behavioral management.89 Another compared students of

81. Knapp et al. (2006); Lachat and Smith (2005); Supovitz (2006); Supovitz and Klein (2003); Wayman, Cho, and Johnston (2007); Wayman and Stringfield (2006).
82. Datnow, Park, and Wohlstetter (2007); Lachat and Smith (2005); Supovitz and Klein (2003); Wayman, Cho, and Johnston (2007); Wayman and Stringfield (2006); Young (2006).
83. Wayman, Stringfield, and Yakimowski (2004).
84. Armstrong and Anthes (2001); Datnow, Park, and Wohlstetter (2007); Supovitz and Klein (2003); Wayman, Cho, and Johnston (2007).
85. Jones and Krouse (1988); Wesson (1991).
86. Jones and Krouse (1988).
87. Marsh et al. (2008).
88. Jones and Krouse (1988); Wesson (1991).
89. Jones and Krouse (1988).


teachers who received individual mentoring with students of teachers who received group mentoring.90 The studies found no discernible effects of the interventions that included a coaching component. The panel identified no rigorous studies identifying the effects on student achievement of other schoolwide supports for data use. To shape this recommendation, panelists relied on their own expertise as well as examples of data leadership and professional development opportunities drawn from noncausal studies and implementation guides.

How to carry out this recommendation

1. Designate a school-based facilitator who meets with teacher teams to discuss data.

Principals should provide data facilitators who encourage staff to use data systematically.91 Depending on the size and available resources of the school and district, data facilitators may be full-time teachers who provide coaching to other staff, district staff members who support multiple schools in data use, or a dedicated school-level staff person supporting all teachers in the building.

The data facilitator's role is complex, requiring not only expertise with data analysis but also an ability to train and encourage other staff in the data use process. Regardless of her or his role in the school or district, the data facilitator's responsibilities should be integrated into the regular work of the school's data team (see recommendation 3). It is important to recognize, however, that facilitators should not bear the sole responsibility for data interpretation and analysis. Instead, data facilitators can help staff obtain the knowledge and skills they need to use data appropriately so that staff do not become too dependent on facilitators.

Data facilitators should meet at least monthly with grade- and subject-level teacher teams, although teacher teams should meet independently more frequently (see recommendation 1). During these meetings, data facilitators should

• model data use and interpretation, tying examples to the school's vision for data use and its learning goals;

• model how to transform daily classroom practices based on data-driven diagnoses of student learning issues;

• assist staff with data interpretation by preparing data reports and related materials;92 and

• train and support staff on using data to improve instructional practices and student achievement.93

Learning from the expertise of a colleague may help teachers adjust their instructional approaches in ways that improve student achievement.94 However, data facilitators need to complement existing data-literacy capacity and encourage educators to increase their data literacy. Data literacy is necessary to develop and support a data culture,95 and overreliance on data facilitators can result in educators failing to develop the necessary knowledge and skills, which could lead them to misunderstand or misuse data. Once staff become comfortable with data use, however, it is likely that facilitators will not

90. Wesson (1991).
91. Wayman, Cho, and Johnston (2007); Wesson (1991).
92. Wayman, Cho, and Johnston (2007).
93. Chrismer and DiBara (2006); Knapp et al. (2006); Mid-Continent Research for Education and Learning (McREL) (2003); Wayman, Cho, and Johnston (2007).
94. Jones and Krouse (1988); Wesson (1991).
95. Knapp et al. (2006).


need to provide the same level of guidance and support as indicated earlier.

2. Dedicate structured time for staff collaboration.

Encouraging teachers to work collaboratively with data helps make data use an established part of a school's culture.96 Collaborative data analysis can highlight achievement patterns across grade levels, departments, or schools97 and can engender the kind of consistency of instructional practices and expectations that often characterizes high-performing schools.98

Structured time should be set aside for teachers and school staff to collaboratively analyze and interpret their students' achievement data, and to identify instructional changes.99 This time also can be used for professional development on data use. Ideally, this structured time should occur a few times each week, depending on the individual school's needs. It is important that schools make these collaborative meetings a priority.

Collaborative meeting participants can vary from school to school. Most frequently, data meetings occur among small groups of teachers in the same grade level or subject area. Other times, these meetings include some combination of teachers in the same grade level or subject area, a data facilitator, and/or other data team members.

Because school schedule constraints vary, principals can explore different options for scheduling collaborative time. For example, one school has dedicated biweekly two-hour meetings for staff to examine student data and identify next instructional steps.100 Another school adjusted weekly class schedules to have a common break for teachers to examine data collaboratively.101

The collaborative team meetings should include the following components:

• Preparation. Prior to these meetings, educators should set an agenda that focuses on using the most updated data relative to a specific, timely topic. It is too overwhelming to attempt to address all student achievement concerns at once; targeted discussions are key to successful data meetings.

• Analysis. During these meetings, teachers should follow the cycle of inquiry, using data to state hypotheses about their teaching and learning practices and then testing those hypotheses (see recommendation 1).102

• Action agenda. At the end of each meeting, educators should be prepared to enact a data-based action plan that examines and modifies their instruction to increase student achievement in the area of focus for the meeting.

3. Provide targeted professional development regularly.

The skills that educators need in order to use data to identify achievement problems and develop instructional solutions are complex. To enhance data-literacy and data-use skills in a way that is consistent with school goals, it is essential that schools and districts provide ongoing professional development opportunities for

96. Feldman and Tung (2001).
97. Cromey and Hanson (2000).
98. Bigger (2006); Herman and Gribbons (2001); Huffman and Kalnin (2003); Lachat and Smith (2005); Wayman, Cho, and Johnston (2007).
99. Anderegg (2007); Bigger (2006); Cromey and Hanson (2000); Gentry (2005); Herman and Gribbons (2001); Huffman and Kalnin (2003); Ingram, Louis, and Schroeder (2004); Supovitz and Klein (2003); Wayman and Stringfield (2006).
100. Knapp et al. (2006).
101. Mandinach et al. (2005).
102. Armstrong and Anthes (2001).


administrators, principals, teachers,103 and classroom support specialists.104 Without school- and district-level support for these opportunities, analysis of data may be inconsistent and potentially ineffective.

The skills needed for effective data use range from data entry to data analysis to leadership; they also vary depending on professional roles (i.e., teacher, administrator, or technology support staff), content area and curriculum, experience with data analysis, and level of comfort with technology.105 For most staff, professional development should focus on how users will apply the data to their daily work and instructional planning, rather than on the functionality of the system.106 Staff with the specific role of maintaining the system, however, should receive specialized training that prepares them to maintain the system for all users.

Ideally, all staff, particularly principals, should be familiar with components of the data system, data culture, and data use. Table 3 highlights some potential professional development opportunities to prioritize for staff based on their roles with the data system and data use.

Training for data use often is synchronous with technology training. Creating staff confidence in, and comfort with, available data systems should increase the chance that data will be used regularly and well.107 Related technology training should be implemented in small doses, however, and occur close to implementation of the data system or related system enhancements.108 In this way, staff can more easily connect their training to daily activities109 and not become overwhelmed by training sessions. (See recommendation 5 for more details on preparing for implementation of technology systems.)

It is important to recognize that professional development responsibility does not end after the initial training of staff and deployment of the district's data system. Users also may require ongoing technical assistance, and additional trainings will be needed when introducing system enhancements. Professional development opportunities, therefore, should be continuous, offered at least monthly throughout the school year by staff experienced with assessment and data-literacy skills, technology use, and the development of cultures of effective data use. Professional development staff should consider offering online learning modules as refresher courses or self-paced, independent training opportunities after initial in-person training sessions to moderate costs and offer flexibility in handling scheduling challenges and varying levels of technology use.

Potential roadblocks and solutions

Roadblock 4.1. It is difficult to locate professional development that is specific to the needs of the school.

Suggested Approach. With the assistance of the data team and data facilitators, schools should determine their needs and discuss these with their professional development provider. In this way, schools can ensure that the provider teaches skills that meet the needs of school staff. If a session cannot be tailored to the needs of the school or district, schools should

103. Wayman, Cho, and Johnston (2007).
104. Feldman and Tung (2001).
105. Bigger (2006); Cromey and Hanson (2000); Herman and Gribbons (2001); Huffman and Kalnin (2003); Knapp et al. (2006); Lachat and Smith (2005); Wayman, Cho, and Johnston (2007).
106. Wayman and Cho (2008).
107. Supovitz and Klein (2003).
108. Arnold (2007); Cromey and Hanson (2000); Gentry (2005).
109. Anderegg (2007); Ingram, Louis, and Schroeder (2004); Wayman, Cho, and Johnston (2007).


Table 3. Suggested professional development and training opportunitiesa


Information
Technology
Principals Teachers Other Staff* Staff

Avoiding common data analysis x x x


and interpretation mistakes

Data system use—avoiding x x x


common mistakes

Data system use—entering data x x

Data system use—maintenance x


and troubleshooting

Data system use—reporting x x x


capabilities

Data transparency and safety x x x x

Encouraging staff leadership x

Fostering a culture of x x
data-based decision making

Identifying needs for staff profes- x x


sional development opportunities

Interpreting data in an x x x
educational context

Organizing time for collaborative x x x


data discussions

Understanding and using x x x


the cycle of instructional
improvement

Using data to answer questions x x x


about student achievement

Using data to modify teaching x x x


and learning practices

* Other staff can include data facilitators, classroom support specialists, administrative assistants, and counselors.
a. Examples of suggested professional development and training opportunities are drawn and adapted from Chris-
mer and DiBara (2006); Knapp et al. (2006); Marsh et al. (2008); McREL (2003); Nabors Oláh, Lawrence, and Riggan
(2008); and Wayman, Cho, and Johnston (2007).


If a session cannot be tailored to the needs of the school or district, schools should consider using a “train-the-trainers” model.110 Schools should identify trainers, such as professional development staff within the district office, who can receive broad training on a particular product or issue related to data-based decision making for the school's data system. These staff can then adapt the training to fit the needs of the school or district and train other educators and staff members as necessary.111

Roadblock 4.2. Resources dedicated to creating staff capacity to use data often are shifted to other school priorities.

Suggested Approach. Data-based decision making is not an isolated topic within education, but rather one that benefits all subject areas and grades. Principals and district-level administrators should secure and distribute the financial resources necessary to match educators' needs for interpreting and interacting with data. When observing the structured collaboration meetings, school leaders should identify whether teachers and other school staff need additional professional development opportunities or materials, supplemental support services, or access to support personnel. Dedicating resources to data literacy will help support and reinforce a culture of data use, enabling educators to better help their students meet defined learning goals across all content areas.

110.  Wayman and Conoly (2006).
111.  Datnow, Park, and Wohlstetter (2007).

Recommendation 5. Develop and maintain a districtwide data system

Districts should develop and maintain high-quality data systems that enable all decision makers to access the necessary data in a timely fashion. A high-quality data system is comprehensive and integrated, linking disparate forms of data for reporting and analysis to a range of audiences.112 To help ensure that the relevant staff in a school district will rely on the data system to inform their decisions, district administrators should involve a variety of stakeholders when determining which functions the system should provide. Districts and schools need to secure financial and human resources to develop safeguards that ensure data are timely, relevant, and useful to educators.

Level of evidence: Low

Recognizing that it is difficult if not impossible to test the impacts of data systems on student achievement empirically, the panel based this recommendation on a combination of its expertise and its review of descriptive studies and case studies. The studies did not use a causal design that would provide evidence directly linking the use of an integrated data system with improved academic outcomes; hence, the level of evidence to support this recommendation is low.

Brief summary of evidence to support the recommendation

A high-quality, districtwide data system is necessary to provide teachers with the information they need to modify instruction and improve student achievement. To guide this recommendation, the panel referenced descriptive and other noncausal studies that (1) discussed how schools or districts collaboratively created and used data systems,113 (2) described the importance or provided examples of selecting a system that meets varied users' needs,114 (3) explained the successes and challenges schools and districts experienced when implementing their data systems,115 and (4) advocated the importance or gave examples of system maintenance and security relative to data quality.116 Appendix D provides details on the characteristics of data systems described in these studies.

How to carry out this recommendation

1. Involve a variety of stakeholders in selecting a data system.

Districts should establish a data-system advisory council that includes representatives from key stakeholder groups (see Table 4). These representatives should understand the importance of using data to make instructional decisions, possess leadership and time-management skills, and be able to effectively communicate information to other educators.

112.  Mieles and Foley (2005); Wayman, Stringfield, and Yakimowski (2004).
113.  Choppin (2002); Lachat and Smith (2005); Mieles and Foley (2005); Thorn (2001); Wayman, Cho, and Johnston (2007); Wayman and Conoly (2006); Wayman and Stringfield (2006); Wayman, Stringfield, and Yakimowski (2004).
114.  Breiter and Light (2006); Brunner et al. (2005); Choppin (2002); Datnow, Park, and Wohlstetter (2007); Kerr et al. (2006); Long et al. (2008); Mieles and Foley (2005); Thorn (2001); Wayman and Cho (2008); Wayman, Cho, and Johnston (2007); Wayman, Stringfield, and Yakimowski (2004).
115.  Long et al. (2008); Wayman, Cho, and Johnston (2007); Wayman, Stringfield, and Yakimowski (2004).
116.  Long et al. (2008); Mason (2003); Mieles and Foley (2005); Wayman and Cho (2008); Wayman, Cho, and Johnston (2007).


Table 4. Sample stakeholder perspectives on data system use

Staff Title: Example of Uses of Data System

Administrators and principals: Compare rates of discipline referrals among different groups of students;a discuss student progress and classroom pedagogy with faculty.b

Counselors: Place students into correct classes based on prior performance and current schedule constraints; discuss student progress and needs with other building educators.

Information technology staff: Assess the interoperability of data systems; identify project scope; build strong project plans; establish standards; manage differentiated access by stakeholders; provide support, maintenance, and enhancements over time; identify challenges that might prevent or hinder systems from working together for timely access to information.

Support staff: Use attendance and assessment data to identify students for targeted interventions; work with faculty and administration on data use strategies and changing practice.c

Teachers: Identify student and class strengths and weaknesses; interact with other staff about student progress.d

Parents: Track immediate student outcomes and compare student performance over time.

Students: Review scores on recent assessments and track progress on outcomes.

a. Choppin (2002).
b. Wayman and Stringfield (2006).
c. Choppin (2002); Wayman, Cho, and Johnston (2007).
d. Lachat and Smith (2005); Wayman, Cho, and Johnston (2007); Wayman and Stringfield (2006).

Responsibilities could include the following:117

• developing roles and structures to oversee the district's commitment to data quality and use;

• providing guidance about the requirements and design of the data system;

• overseeing system development; and/or

• serving as the liaison between the council and its respective stakeholder groups.

Table 4 illustrates the needs that different stakeholder groups might have in using a districtwide data system.

The panel recommends that the data system advisory council meet frequently (at least bimonthly, and more frequently if possible). Meetings should focus on suggestions for improving the data system, addressing concerns from users about the data system, and identifying professional development needs.

Between meetings, members of the data system advisory council should solicit feedback from their respective stakeholder groups to better understand (1) how data are being used, (2) concerns users have about the system, and (3) how the system could be used in the future. The council should designate one or two of its district-employed members or identify a full-time individual to serve as project manager. These leaders should be tasked with overseeing system development and supporting the execution of the council's short- and long-term goals.

117.  Mieles and Foley (2005); Thorn (2001); Wayman and Conoly (2006); Wayman, Stringfield, and Yakimowski (2004).

In this way, troubleshooting and decisions regarding the data system can be addressed in a timely, efficient manner outside of council meetings. Recognizing that these designated staff may have other responsibilities, administrators should adjust staff responsibilities to allow for sufficient time to execute project management tasks.

2. Clearly articulate system requirements relative to user needs.

It is critical for the council to work closely with a representative of each school's data team (described in recommendation 3), basing its suggestions for the system's requirements on the vision articulated by the data team. User needs should dictate system-requirement decisions in support of educational achievement, not vice versa.118

Sample existing and new data elements to consider121
• State assessment data
• Interim or benchmark assessment data
• Locally developed formative assessment data
• Attendance records
• Finance and scheduling information
• Student and teacher demographic data

The council should consider how the system requirements would account for the following:

• Access to system and data security. Staff in different roles will use data for different purposes and may, therefore, require varied levels of access. Council members should consider whether users need to have access to the system during nonschool hours or from outside the building.119

• Bandwidth requirements. Information technology staff should confirm that the quantity of data that can be carried from one point to another in a given time period (bandwidth) is sufficient for relevant and timely data use.120 Also, staff should consider the infrastructure they need to connect hardware, software, and users.

• Consistent student and teacher IDs. To enable users to access a complete picture of a student, an effective data system should include a consistent student ID that allows users to follow students over time and between schools, identify links between students and teachers for courses and curricula, and identify special programs in which the student participates. (An illustrative sketch of this kind of linkage appears later in this recommendation.)

• Consolidation of legacy systems. Most schools and districts have data systems that are already in use (legacy systems). As system requirements are articulated, the council should make decisions about which functions can be maintained by the legacy systems and which functions should be replaced by a new system.

118.  Abbott (2008); Breiter and Light (2006). Long et al. (2008) provide an example of one district successfully using a data system that was developed after assessing user needs. McREL (2003) advises that purposeful data collection begins by identifying user needs.
119.  Wayman, Stringfield, and Yakimowski (2004).
120.  Wayman, Cho, and Johnston (2007); Wayman, Stringfield, and Yakimowski (2004).
121.  Choppin (2002); Datnow, Park, and Wohlstetter (2007); Wayman, Stringfield, and Yakimowski (2004).


• Cost (initial and maintenance). The council needs to carefully analyze available resources, including skills necessary to develop and maintain a customized data system, financial and time limitations, staffing needs, initial and ongoing maintenance of data, professional development and training sessions, and system upgrades.122 The council also needs to discuss the human and financial resources available to purchase or build a system (see action step 3).123

• Data storage. Any data system should be flexible enough to incorporate multiple types of data.124 The council should consider the existing data that will need to be incorporated into the new system,125 as well as the new data that may be collected and stored in the same system. The data system must provide seamless access to a broad variety of data typically stored in disparate systems, such as disciplinary data, assessment data, student demographics, and grades. This access must be seamless to the user, offering the ability to examine varied types of data concurrently.126

• Data quality/accuracy and timeliness. Data that are inaccurate, untimely, or not specific will greatly inhibit educators' ability to make data-based decisions about teaching and learning practices.127 Common assessment data, for example, should be entered in the system immediately. At the outset, leaders should take seriously the need to clean existing data.128 Data errors can cause mistrust, and a good data inventory process can prevent major data quality problems.129

• Hosting. Servers that house data may be located either within a school district's data center or at an off-site hosting service, depending on the district's capacity to maintain the quality and speed of the connection through technological and human support.

• Interoperability. The capacity of a system to communicate and exchange data seamlessly with other systems (interoperability) is defined by a standard format for shared data, a set of naming conventions, and a set of rules for interaction among applications. Council members should consider existing data systems to avoid buying future add-ons to facilitate interaction between new and existing systems.130 In order to fit the new data system with other data-collection tools, it is important to select systems that are able to share data across databases. Flexibility will allow the district and schools to better adapt existing data to a new system and will facilitate shaping the data system as new needs emerge.

• Professional development for both end users and information technology (IT) staff. See action step 4 for more information.

122.  Wayman, Stringfield, and Yakimowski (2004).
123.  Long et al. (2008).
124.  Mandinach et al. (2005); Wayman, Cho, and Johnston (2007); Wayman, Stringfield, and Yakimowski (2004).
125.  McREL (2003); Wayman, Stringfield, and Yakimowski (2004).
126.  Wayman (2005); Wayman, Stringfield, and Yakimowski (2004).
127.  Choppin (2002); Wayman, Stringfield, and Yakimowski (2004).
128.  Knapp et al. (2006); Wayman, Stringfield, and Yakimowski (2004).
129.  Choppin (2002); Kerr et al. (2006); Light, Wexler, and Heinze (2005); Mieles and Foley (2005); Wayman, Cho, and Johnston (2007). Wayman, Stringfield, and Yakimowski (2004) also discuss the importance of data quality.
130.  Ramnarine (2004); Thorn (2001); Wayman, Cho, and Johnston (2007); Wayman, Stringfield, and Yakimowski (2004).


• Reporting. The presentation and reporting features of the system should be user-friendly and seamless, producing results that draw on data elements from multiple systems.131 Staff in different roles will use data for different purposes and may, therefore, require different reporting features and layouts. Some staff may be initially satisfied with a summary report in HTML or PDF formats but will likely require a flexible query tool that allows them to browse the data and manipulate the output.132 Additionally, system components should be flexible to account for changes in presentation requirements as staff confront new data or new questions.

• Routines and safeguards. Data quality can be compromised when too many people enter data into the system.133 To safeguard data, districts could limit data-entry permission to a small, specified number of people who are district certified for data entry.134 Alternatively, districts could consider providing varying levels of access for reading and entering data by role (e.g., enable teachers to access their students' data, but not that of other students, or permit principals to access data on all students from their building and enter data when appropriate). Most users—such as teachers, administrators, and support staff—should be granted access to viewing data or creating reports, but only trained or certified users—typically an IT person or designated district-level data administrator—should be allowed to enter and edit data. (An illustrative sketch of such role-based permissions appears later in this recommendation.)

The data-system advisory council leaders should develop a publicly available written document that specifies recommendations for system capabilities. The data needs of teachers, schools, and districts will likely evolve over time,135 so the panel recommends that system requirements be reviewed and revised frequently (at least annually) to ensure that the system continues to meet user needs.

3. Determine whether to build or buy the data system.

Considering the needs of stakeholders and district resource limitations (human and financial), the advisory council needs to recommend whether the district should purchase a data system from a vendor (buy) or develop the system internally (build) (see Table 5).136 Either approach may have hidden costs, such as additional time to build a personalized system or the need to buy add-ons so that an off-the-shelf purchase will better meet the articulated system requirements.

4. Plan and stage the implementation of the data system.

The council's written plan should address aspects critical to the system's success, such as maintenance and enhancement needs. Other critical implementation aspects include staged implementation, professional development sessions, and strategies to identify and solve problems.137 The implementation process should be guided by the council leaders, who should track it closely to identify areas for improvement.

131.  Breiter and Light (2006); Mieles and Foley (2005); Wayman, Stringfield, and Yakimowski (2004).
132.  Ramnarine (2004); Wayman, Cho, and Johnston (2007); Wayman, Stringfield, and Yakimowski (2004).
133.  Long et al. (2008); Mieles and Foley (2005).
134.  Wayman, Cho, and Johnston (2007).
135.  McREL (2003); Rossmiller and Holcomb (1993); Wayman, Cho, and Johnston (2007).
136.  Long et al. (2008); Wayman, Cho, and Johnston (2007); Wayman, Stringfield, and Yakimowski (2004).
137.  Wayman, Cho, and Johnston (2007).
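To make the consistent-ID requirement concrete for information technology staff, the following is a minimal illustrative sketch (in Python) of how a stable, districtwide student ID can link records that live in separate source systems—assessments, attendance, and demographics—into a single profile. All names, fields, and values shown are hypothetical; they are not drawn from any particular vendor system or from the studies cited in this guide.

Illustrative sketch (Python):

# Illustrative sketch only: linking disparate data sources by a consistent
# student ID. All identifiers, fields, and values are hypothetical.
from dataclasses import dataclass
from typing import Dict, List


@dataclass
class AssessmentRecord:
    student_id: str   # stable, districtwide ID
    test_name: str
    score: float


# Each "system" is represented here as a simple collection keyed by the same
# stable student_id (in practice these would be separate databases).
assessments: List[AssessmentRecord] = [
    AssessmentRecord("S001", "Grade 5 interim math", 62.0),
    AssessmentRecord("S001", "Grade 5 interim reading", 78.0),
    AssessmentRecord("S002", "Grade 5 interim math", 91.0),
]
attendance_rate: Dict[str, float] = {"S001": 0.93, "S002": 0.88}
demographics: Dict[str, Dict[str, str]] = {
    "S001": {"grade": "5", "school": "Lincoln Elementary"},
    "S002": {"grade": "5", "school": "Lincoln Elementary"},
}


def student_profile(student_id: str) -> Dict[str, object]:
    """Assemble one picture of a student from the separate sources."""
    return {
        "student_id": student_id,
        "demographics": demographics.get(student_id, {}),
        "attendance_rate": attendance_rate.get(student_id),
        "assessments": [r for r in assessments if r.student_id == student_id],
    }


if __name__ == "__main__":
    # A simple "report" that draws on data elements from multiple systems.
    print(student_profile("S001"))

Because every source uses the same ID, a reporting tool built on this kind of structure can follow a student across years and schools without manual matching.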
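Similarly, the access and data-entry safeguards described above can be illustrated with a second minimal sketch. The role names and rules below are hypothetical examples of the kind of differentiated permissions a district might specify; an actual system's access model would be defined by the advisory council and by the vendor or development team.

Illustrative sketch (Python):

# Illustrative sketch only: role-based viewing and data-entry permissions.
# Role names and rules are hypothetical, not features of a specific product.
from typing import Optional, Set

VIEW_ROLES: Set[str] = {"teacher", "principal", "support_staff", "data_administrator"}
EDIT_ROLES: Set[str] = {"data_administrator", "it_staff"}  # trained/certified users only


def can_view(role: str,
             requester_school: str,
             record_school: str,
             requester_teacher_id: Optional[str] = None,
             record_teacher_id: Optional[str] = None) -> bool:
    """Teachers see only their own students; principals see their building;
    district-level roles see all records."""
    if role not in VIEW_ROLES:
        return False
    if role == "teacher":
        return (requester_teacher_id is not None
                and requester_teacher_id == record_teacher_id)
    if role == "principal":
        return requester_school == record_school
    return True


def can_enter_data(role: str) -> bool:
    """Only designated, certified roles may enter or edit data."""
    return role in EDIT_ROLES


if __name__ == "__main__":
    print(can_view("teacher", "Lincoln", "Lincoln", "T-17", "T-17"))   # True
    print(can_view("teacher", "Lincoln", "Lincoln", "T-17", "T-03"))   # False
    print(can_view("principal", "Lincoln", "Washington"))              # False
    print(can_enter_data("teacher"))                                   # False
    print(can_enter_data("data_administrator"))                        # True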


Table 5. Considerations for built and purchased data systems

Level of control
Built systems: Building a data system allows districts to have more control over how they customize software and make repairs. Districts should be sure they have staff to fill the roles of technical project manager, business analyst, database administrator, quality assurance manager, and developer.
Bought systems: Prepackaged data system software can be challenging to customize and repair. However, vendors typically provide skilled technical consultants to create solutions and deploy modifications.

Cost
Built systems: An internally developed system may present lower initial costs. However, districts should take into account long-range costs, including the longer time it takes to develop, test, and implement a built system than to purchase one. Built systems may, therefore, be more costly.
Bought systems: Purchased systems typically involve an up-front cost that may not be recouped if the district changes systems or needs to purchase additional add-ons for customization. However, vendors often host the data externally, which could be a cost savings.

Hardware and software needs
Built systems: Internally hosted data systems require hardware and software to be purchased, maintained, and continuously supported by skilled technical staff.
Bought systems: Vendors of prepackaged systems typically offer options of additional hardware and software, as well as around-the-clock maintenance and support.

Training
Built systems: Internal staff can develop and deliver training and technical assistance about the data system that is targeted to the district's context and needs.
Bought systems: Professional development and related technology trainings for organization staff often are provided by the vendor; sometimes a train-the-trainer approach is implemented.

Efficiency
Built systems: District personnel often “reinvent the wheel,” learning lessons that have already been addressed by other districts or commercial vendors.
Bought systems: Vendors bring an economy of scale, having worked with numerous other districts on similar problems.

During early implementation, the council should arrange staged rollouts or pilot tests to mitigate the problem of overwhelming staff with new technology. This approach allows time for staff to adjust to the system, as well as flexibility to modify the system in response to user feedback. The rollout plan should be long range (e.g., spread out over the course of one academic year) and include specific plans (with activities and timelines) for maintenance, training, and end-user support.138 Further, these opportunities should be tightly linked with specific tasks that are immediately expected of the user, as per the district plan.139 It is easy to underestimate the time needed to prepare existing data and roll out the system, however, and the implementation plan would benefit from a generous estimate of the rollout timeline.140

The plan also should include professional development and training opportunities tailored to staff needs by considering their technological skills, roles, responsibilities, and the content areas in which they work.141

138.  Wayman, Cho, and Johnston (2007).
139.  Wayman and Cho (2008).
140.  Mieles and Foley (2005).
141.  Long et al. (2008); Mason (2003); McREL (2003); Wayman and Cho (2008). Wayman, Cho, and Johnston (2007) conclude that training should be tailored to staff roles (but do not discuss developing a formal training plan).


Professional development about the data system should discuss data transparency and safety, system uses and capabilities, and ongoing opportunities for integrating data into instructional practice. (See recommendation 4 for more information about professional development.) The plan also should recognize that implementation responsibility does not end after initial training of staff and deployment of the system. Users may require ongoing technical assistance, and additional trainings will be needed when introducing system refinements and enhancements.

Potential roadblocks and solutions

Roadblock 5.1. The data system's technological components are challenging for staff who do not consider themselves technologically savvy or are skeptical of using new technologies.

Suggested Approach. The data system should not be implemented and used without accompanying training and support services. When the district is preparing to roll out its data system, the council should ensure that appropriate professional development and technology training sessions are available for a variety of skill levels (see recommendation 4 for more details).142 In this way, all stakeholders have the opportunity to learn about the data system and develop the skills necessary to utilize the system. District resources should be allocated to ensure that principals and data facilitators can support teachers' use of data within the school building,143 and a mechanism for providing assistance on an as-needed basis (e.g., a technology help desk) should be in place as soon as educators start using the system.

Roadblock 5.2. The implementation plan contains many technological requirements, but little information on how the system will be used.

Suggested Approach. Before purchasing or developing a data system, ensure that the implementation plan addresses system requirements as they relate to the teaching and learning goals of the district.144 Be very careful that educational goals are front and center in this plan—the district advisory council should never put technological requirements and considerations for a system before the educational goals the system supports. If the plan clearly articulates how the system relates to learning goals, users will better understand how the system will be used and why that use will support student achievement.145

Roadblock 5.3. A data system seems like a financial luxury to many individuals in the district.

Suggested Approach. In districts that identify the use of student data to meet educational improvement goals as a priority, a data system must be a priority as well. Ensure that the district's plan describes how a data system supports these goals in a way that clearly explains and illustrates the necessity of the system, in order to foster support for it.

142.  Wayman and Cho (2008).
143.  Kerr et al. (2006).
144.  Wayman and Cho (2008); Wayman, Cho, and Johnston (2007); Wayman and Conoly (2006).
145.  Breiter and Light (2006); Wayman and Cho (2008); Wayman, Cho, and Johnston (2007).

Glossary of terms as sense of data.148 Education-related data
used in this report may be student focused (e.g., demograph-
ics, attendance and behavior, performance
on standardized tests) or administrative
Common assessments are those assess- (e.g., financial and staffing information) in
ments administered in a routine, consistent nature but are not limited to these types.
manner across a state, district, or school. Data are typically maintained by state and
Under this definition, common assessments local education agencies, districts, schools,
include annual statewide accountability or teachers (see data warehouse).
tests and commercially produced tests,
interim assessments, benchmark assess- Data-based decision making in educa-
ments, and end-of-course tests, as long as tion refers to teachers, principals, and
they are administered consistently and rou- administrators systematically collecting
tinely to provide information that can be and analyzing various types of data, in-
compared across classrooms and schools. cluding demographic, administrative, pro-
cess, perceptual, and achievement data, to
Correlational studies look for relation- guide a range of decisions to help improve
ships among variables. Although correla- the success of students and schools. Other
tional studies can suggest that a relation- common terms include data-driven deci-
ship between two variables exists, they do sion making, data-informed decision mak-
not support an inference that one variable ing, and evidence-based decision making.
causes a change in another.146
The data culture is a learning environ-
The cycle of inquiry is a process in which ment within a school or district that in-
educators analyze data—such as demo- cludes attitudes, values, goals, norms of
graphic, perceptual, school process, and behavior, and practices, accompanied by
student achievement data—in order to an explicit vision for data use by leader-
understand how these elements are inter- ship, that characterize a group’s apprecia-
related and what they suggest about stu- tion for the importance and power that
dents’ learning needs. As a multistep pro- data can bring to the decision-making
cess, the cycle of inquiry often involves process. It also includes the recognition
analyzing data to better understand stu- that data collection is a necessary part of
dent needs, developing hypotheses about an educator’s responsibilities and that the
instructional practice, formulating and use of data to influence and inform prac-
implementing action plans to improve stu- tice is an essential tool that will be used
dent learning and achievement, and then frequently.
once again analyzing data to evaluate stu-
dent progress and inform next steps.147 The variables that make up a data sys-
tem are known as data elements or data
Data are empirical pieces of information indicators.
that educators can draw upon to make a
variety of instructional and organizational A data facilitator is an individual charged
decisions. By themselves, data are not ev- with helping schools or districts use data
idence—it takes concepts, theories, and effectively to make decisions. Often, data
interpretive frames of references to make facilitators organize school-based data
teams, lead practitioners in a collab-
146.  Van Wagner (n.d.). orative inquiry process, help interpret
147.  Halverson, Prichett, and Watson (2007); Her-
data, or educate staff on using data to
man and Gribbons (2001); Huffman and Kalnin
(2003); Fiarman (2007). interpretive frames of reference to make


improve instructional practices and stu- programs, and other materials shape the
dent achievement. context in which work is completed.  

The ability to ask and answer questions Formative assessment is a process that
about collecting, analyzing, and making is intended to provide feedback to teach-
sense of data is known as data literacy. ers and students at regular intervals dur-
Widespread data literacy among teachers, ad- ing the course of instruction. The purpose
ministrators, and students is a salient char- of formative assessment is to influence
acteristic of a data-driven school culture. the teaching and learning process so as
to close the gap between current learn-
Data quality refers to the reliability and ing and a desired goal. Assessments used
validity of collected data. for formative purposes—often called for-
mative assessments—are those that are
As school-based groups of educators who “given in the classroom by the teacher for
come together to analyze data and help the explicit purpose of diagnosing where
one another use data effectively, data students are in their learning, where gaps
teams often include a school’s principal, in knowledge and understanding exist,
instructional leader(s), and several teach- and how to help teachers and students
ers. Such teams may lead teachers in using improve student learning. The assessment
achievement data to identify and respond is embedded within the learning activity
to students’ learning needs through in- and linked directly to the current unit of
structional modifications. instruction.”151 However, because most as-
sessments can be used in both formative
A data warehouse is a computer system and summative ways, the term formative
that stores educational information from refers less to a particular type of assess-
several sources and integrates it into a ment than to the purposes for which the
single electronic source. Data warehouses assessment is used.
are designed to allow the manipulation,
updating, and control of multiple data- A hypothesis is a “tentative assumption
bases that are connected to one another made in order to draw out and test its logi-
via individual student identification num- cal or empirical consequences.”152 Within
bers. Capabilities of data warehouses often the cycle of inquiry, it is an evidence-based
extend beyond data storage, however, and assumption about students’ learning needs
may include data management and report- that teachers can test using instructional
ing systems used for retrieving and ana- modifications and follow-up data about
lyzing data.149 student performance.

Distributed leadership articulates how Interim assessments are typically ad-


leadership work and tasks are shared and ministered on a school- or districtwide
supported by individuals and structures scale at regular intervals during a single
across an organization.150 The social dis- school year. Although the results from
tribution of leadership reflects how work interim assessments may be used at the
is shared, assigned, or taken up by formal teacher or student level, the assessment
or informal leaders; the situational dis- is typically designed to be aggregated
tribution of leadership explains how or- at a level beyond the classroom, such
ganizational structures such as policies, as the school or district level.153 Interim

149.  Mieles and Foley (2005); Wayman, String- 151.  Perie, Marion, and Gong (2007), p. 3.
field, and Yakimowski (2004). 152.  Merriam-Webster Online Dictionary (2009).
150.  Spillane, Halverson, and Diamond (2004). 153.  Perie, Marion, and Gong (2007).


assessments may be used in both forma- alignment processes. However, because


tive and summative ways. most assessments can be used in both
formative and summative ways, the term
Interoperability refers to the capacity of summative refers less to a particular type
a system to communicate and exchange of assessment than to the purposes for
data seamlessly with other systems, de- which the assessment is used. Assess-
fined by a standard format for shared data, ments that often are used in summative
a set of naming conventions, and a set of ways include state assessments, district
rules for interaction among applications. benchmark or interim assessments, end-
For the purposes of this guide, the term is of-unit or end-of-chapter tests, end-of-term
used in a technical-systems context. exams, and scores that are used for ac-
countability of schools (AYP) and students
Summative assessment is a process that (report card grades).154
establishes what students have and have
not accomplished at the culmination of a Triangulation is the process of using
specific unit of instruction, such as a cur- multiple data sources to address a par-
riculum unit, grading period, or school ticular question or problem and using
year. Rather than specifically informing the evidence from each source to illuminate
learning process as it takes place, summa- or temper evidence from other sources. It
tive assessment is intended to evaluate the also can be thought of as using each data
knowledge and skills of the test taker at a source to test and confirm evidence from
given point in time. Assessments used for other sources in order to arrive at a well-
summative purposes—often called sum- justified decision.
mative assessments—also may be used
to evaluate the effectiveness of programs,
school improvement goals, or curriculum 154.  Garrison and Ehringhaus (2009).

Appendix A. particular types of studies for drawing
Postscript from causal conclusions about what works.
Thus, one typically finds that a strong
the Institute of level of evidence is drawn from a body of
Education Sciences randomized controlled trials, the moder-
ate level from well-designed studies that
do not involve randomization, and the low
What is a practice guide? level from the opinions of respected au-
thorities (see Table 1). Levels of evidence
The health care professions have em- also can be constructed around the value
braced a mechanism for assembling and of particular types of studies for other
communicating evidence-based advice to goals, such as the reliability and validity
practitioners about care for specific clini- of assessments.
cal conditions. Variously called practice
guidelines, treatment protocols, critical Practice guides also can be distinguished
pathways, best practice guides, or simply from systematic reviews or meta-analyses
practice guides, these documents are sys- such as What Works Clearinghouse (WWC)
tematically developed recommendations intervention reviews or statistical meta-
about the course of care for frequently en- analyses, which employ statistical meth-
countered problems, ranging from physi- ods to summarize the results of studies
cal conditions, such as foot ulcers, to psy- obtained from a rule-based search of the
chosocial conditions, such as adolescent literature. Authors of practice guides sel-
development.155 dom conduct the types of systematic lit-
erature searches that are the backbone of
Practice guides are similar to the prod- a meta-analysis, although they take advan-
ucts of typical expert consensus panels tage of such work when it is already pub-
in reflecting the views of those serving lished. Instead, authors use their expertise
on the panel and the social decisions that to identify the most important research
come into play as the positions of individ- with respect to their recommendations,
ual panel members are forged into state- augmented by a search of recent publica-
ments that all panel members are willing tions to ensure that the research citations
to endorse. Practice guides, however, are are up-to-date. Furthermore, the character-
generated under three constraints that do ization of the quality and direction of the
not typically apply to consensus panels. evidence underlying a recommendation in
The first is that a practice guide consists a practice guide relies less on a tight set of
of a list of discrete recommendations that rules and statistical algorithms and more
are actionable. The second is that those on the judgment of the authors than would
recommendations taken together are in- be the case in a quality meta-analysis. An-
tended to be a coherent approach to a other distinction is that a practice guide,
multifaceted problem. The third, which is because it aims for a comprehensive and
most important, is that each recommen- coherent approach, operates with more
dation is explicitly connected to the level numerous and more contextualized state-
of evidence supporting it, with the level ments of what works than does a typical
represented by a grade (strong, moder- meta-analysis.
ate, or low).
Thus, practice guides sit somewhere be-
The levels of evidence, or grades, are tween consensus reports and meta-anal-
usually constructed around the value of yses in the degree to which systematic
processes are used for locating relevant
155.  Field and Lohr (1990). research and characterizing its meaning.

Practice guides are more like consensus expertise to be a convincing source of rec-
panel reports than meta-analyses in the ommendations. IES recommends that at
breadth and complexity of the topic that one least one of the panelists be a prac-
is addressed. Practice guides are different titioner with experience relevant to the
from both consensus reports and meta- topic being addressed. The chair and the
analyses in providing advice at the level panelists are provided a general template
of specific action steps along a pathway for a practice guide along the lines of the
that represents a more-or-less coherent information provided in this appendix.
and comprehensive approach to a multi- They also are provided with examples of
faceted problem. practice guides. The practice guide panel
works under a short deadline of six to nine
Practice guides in education at the months to produce a draft document. The
Institute of Education Sciences expert panel members interact with and re-
ceive feedback from staff at IES during the
IES publishes practice guides in educa- development of the practice guide, but they
tion to bring the best available evidence understand that they are the authors and,
and expertise to bear on the types of sys- thus, responsible for the final product.
temic challenges that cannot currently be
addressed by single interventions or pro- One unique feature of IES-sponsored prac-
grams. Although IES has taken advantage tice guides is that they are subjected to
of the history of practice guides in health rigorous external peer review through the
care to provide models of how to proceed same office that is responsible for inde-
in education, education is different from pendent review of other IES publications.
health care in ways that may require that A critical task of the peer reviewers of a
practice guides in education have some- practice guide is to determine whether
what different designs. Even within health the evidence cited in support of particular
care, where practice guides now number recommendations is up-to-date and that
in the thousands, there is no single tem- studies of similar or better quality that
plate in use. Rather, one finds descriptions point in a different direction have not been
of general design features that permit ignored. Peer reviewers also are asked to
substantial variation in the realization evaluate whether the evidence grade as-
of practice guides across subspecialties signed to particular recommendations by
and panels of experts.156 Accordingly, the the practice guide authors is appropriate.
templates for IES practice guides may vary A practice guide is revised as necessary to
across practice guides and change over meet the concerns of external peer reviews
time and with experience. and gain the approval of the standards and
review staff at IES. The process of external
The steps involved in producing an IES- peer review is carried out independent of
sponsored practice guide are first to select the office and staff within IES that insti-
a topic, which is informed by formal sur- gated the practice guide.
veys of practitioners and requests. Next, a
panel chair is recruited who has a national Because practice guides depend on the
reputation and up-to-date expertise in the expertise of their authors and their group
topic. Third, the chair, working in collabo- decision making, the content of a practice
ration with IES, selects a small number of guide is not and should not be viewed as a
panelists to coauthor the practice guide. set of recommendations that in every case
These are people the chair believes can depends on and flows inevitably from sci-
work well together and have the requisite entific research. It is not only possible but
also likely that two teams of recognized
156.  American Psychological Association (2002). experts working independently to produce

a practice guide on the same topic would its own because the authors are national
generate products that differ in important authorities who have to reach agreement
respects. Thus, consumers of practice among themselves, justify their recom-
guides need to understand that they are, mendations in terms of supporting evi-
in effect, getting the advice of consultants. dence, and undergo rigorous independent
These consultants should, on average, pro- peer review of their product.
vide substantially better advice than an
individual school district might obtain on Institute of Education Sciences

Appendix B. Dr. Halverson is a former high school teacher,
About the authors school technology specialist, curriculum di-
rector, and school administrator.

Panel Sharnell S. Jackson, Ed.M., is the retired


chief e-learning officer; state e-learning
Laura Hamilton, Ph.D., (Chair) is a se- director; and former K–9 literacy, math-
nior behavioral scientist at RAND Corpo- ematics, science, and technology educator
ration and an adjunct associate professor for the Chicago Public Schools. In these
in the University of Pittsburgh’s Learning positions, she was responsible for train-
Sciences and Policy program. Her research ing teachers and administrators in data
focuses on assessment, accountability, analysis and using student achievement
and the measurement of instructional data for school improvement purposes.
and leadership practices. She has directed She also worked to identify and manage
several large projects including a study innovative curriculum instruction solu-
of teachers’ and principals’ responses to tions, digital media content, collaborative
state standards–based accountability poli- communication tools, instructional assess-
cies. Dr. Hamilton has served on several ment management systems to support the
national panels, including the Center on data-inquiry processes, data-driven leader-
Education Policy’s Panel on High School ship skills, and customized online learning
Exit Examinations and its Panel on Student for students. Through her current work,
Achievement Under NCLB, the Brookings Ms. Jackson aims to minimize the data-re-
National Commission on Choice in K–12 porting burden on schools while maximiz-
Education, and the American Psychologi- ing data quality, data use, collaboration,
cal Association/ American Educational and 21st-century skills. She encourages
Research Association/National Council districts to organize for structured, col-
on Measurement in Education Joint Com- laborative work, using data systematically
mittee to Revise the Standards for Educa- to inform instructional and schoolwide im-
tional and Psychological Testing. She has provement, measure progress, understand
a strong background in psychometrics and individual learning needs, and motivate
quantitative analysis, as well as extensive and improve student-centered teaching
experience studying systemic reform and and learning.
data-driven decision making in schools.
Ellen Mandinach, Ph.D., is senior project
Richard Halverson, Ph.D., is associate director at CNA Education and the deputy
professor of educational leadership and and research director for the Regional
policy analysis at the University of Wis- Education Laboratory (REL) Appalachia.
consin School of Education, where he di- She has served as the interim director of
rects the Data-Driven Instruction Systems REL Appalachia and director for research
(DDIS) project. DDIS is a National Science for REL Northeast and Islands. Dr. Man-
Foundation–funded project that examines dinach has spent the past several years
how school leaders build local capacity for examining various aspects of data-driven
teachers to use data to inform teaching and decision making and is the lead investiga-
to improve student learning. The project is tor on a set of projects being conducted
examining how school leaders collect and across all the RELs to address pressing is-
distribute a wide range of achievement and sues around data-driven decision making.
behavioral data, provide organized oppor- She is the author of Data-Driven School Im-
tunities for reflection and design, and build provement: Linking Data and Learning and
formative feedback systems to measure is a member of the assessment planning
the success of internal program design. committee for the National Assessment of

Educational Progress Technology Literacy Staff


and the working group on the Assessment
and Teaching of 21st Century Skills for the Cassandra Pickens, M.S.Ed., is a project
Cisco/Intel/Microsoft Project. analyst for the What Works Clearinghouse
(WWC) at Mathematica Policy Research. She
Jonathan A. Supovitz, Ed.D., is an associ- has served as coordinator in several areas
ate professor at the Graduate School of Edu- of the WWC, including practice guides
cation at the University of Pennsylvania and and outreach and development. Ms. Pick-
a senior researcher at the Consortium for ens supported the panel in translating re-
Policy Research in Education. He is a mixed- search findings into practitioner-friendly
method researcher and has conducted a text. Prior to joining the WWC, Ms. Pickens
number of studies on the relationship be- worked in higher-education student devel-
tween data use and professional develop- opment and programming.
ment, teacher and leadership practice, and
student achievement. His current research Emily Sama Martin, M.P.P., is a human
focuses on how schools and districts use services researcher at Mathematica Policy
different forms of data to support the im- Research. She has served as both reviewer
provement of teaching and learning. Addi- and coordinator in several areas of the
tionally, Dr. Supovitz directs the evidence- WWC, including the Beginning Reading
based leadership strand of the University topic area. Ms. Sama Martin used her back-
of Pennsylvania’s mid-career doctoral pro- ground in education-related research to
gram in educational leadership. He teaches support the panel in analyzing evidence
courses on the policy and instructional and applying evidence to recommenda-
uses of assessment, evidence-based leader- tions. Before joining the WWC, Ms. Sama
ship, and organizational learning. Martin worked on a wide range of quanti-
tative and qualitative program evaluations
Jeffrey C. Wayman, Ph.D., is an assistant in the areas of early childhood, education,
professor at The University of Texas at welfare, nutrition, and disabilities.
Austin. His research on data-based deci-
sion making includes efficient structures Jennifer L. Steele, Ed.D., an associate
for creating data-informed school dis- policy researcher at the RAND Corporation,
tricts, effective leadership for data use, received her Ed.D. from Harvard Univer-
software that delivers student data to edu- sity. Her research focuses on teacher labor
cators, and systemic supports that enable markets, school reform, and data-driven
widespread teacher use of student data. decision making in schools. Dr. Steele is
Dr. Wayman has edited two special journal coeditor of Data Wise in Action: Stories of
issues focused on data use (for the Ameri- Schools Using Data to Improve Teaching
can Journal of Education and the Journal and Learning (2007) and the Harvard Edu-
of Education for Students Placed At Risk) cational Review Special Issue on Adolescent
and is currently directing a project funded Literacy (2008). Previously, she worked as
by the Spencer Foundation—The Data-In- a teacher at the elementary, high school,
formed District: Implementation and Ef- and community college levels and man-
fects of a Districtwide Data Initiative. Prior aged teacher recruitment and training for
to joining The University of Texas faculty, a private education company.
Dr. Wayman worked at The Johns Hopkins
University with the Center for Social Orga-
nization of Schools, at Colorado State Uni-
versity in the area of prevention research,
and as a junior high math teacher in Kan-
sas City and Salt Lake City.
Appendix C. evidence that is documented in the prac-
Disclosure of potential tice guide. In addition, the practice guide
undergoes independent external peer re-
conflicts of interest view prior to publication, with particular
focus on whether the evidence related to
Practice guide panels are composed of in- the recommendations in the practice guide
dividuals who are nationally recognized has been appropriately presented.
experts on the topics about which they
are rendering recommendations. The In- The professional engagements reported
stitute of Education Sciences expects that by each panel member that appear most
such experts will be involved profession- closely associated with the panel recom-
ally in a variety of matters that relate to mendations are noted here.
their work on the panel. Panel members
are asked to disclose their professional Jeffrey C. Wayman has no financial stake
involvements and to institute deliberative in any program or practice that is men-
processes that encourage critical exami- tioned in the practice guide. He is con-
nation of the views of panel members as ducting an efficacy study of the Acuity
they relate to the content of the practice formative assessment system, funded by
guide. The potential influence of panel CTB/McGraw-Hill. No specific discussion
members’ professional engagements is of the Acuity system took place in panel
further muted by the requirement that deliberations, and it is not referenced in
they ground their recommendations in this practice guide.

Appendix D.
Technical information on the studies

The body of research on how educators use data to make instructional decisions consists mainly of studies that do not use a causal design (such as qualitative and descriptive studies), as well as secondary analyses (such as literature reviews, meta-analyses, and implementation guides). Most of the literature consulted provides context for and examples of the recommended steps. In drawing from this research to formulate this guide, the panel developed recommendations that are accompanied by low evidence ratings, because few studies used causal designs testing the effectiveness of these recommendations. Of those studies that used causal designs, four met WWC standards with or without reservations.157 None of those four directly tested the effectiveness of the discrete practices recommended by the panel (i.e., the experimental condition in the studies combined a recommended practice with other aspects, which means that the panel cannot attribute effects observed in the studies to the practices they advise).

157. Jones and Krouse (1988); May and Robinson (2007); Phillips et al. (1993); Wesson (1991).

This appendix describes the content and findings of some of the studies the panel used to inform its recommendations. It highlights how schools have implemented and are using processes for making instructional changes based on student data and also discusses the findings of causal studies as they relate to the panel’s recommendations. For each recommendation, this appendix also presents a summary of one or more key studies both to illustrate how the study supports the panel’s recommendation and to provide further examples for the reader.

Recommendation 1.
Make data part of an ongoing cycle of instructional improvement

Level of evidence: Low

For this recommendation, the panel drew on its own expertise as well as examples within studies that used qualitative designs to describe how educators have implemented an inquiry cycle for data use. These resources provided needed details about the inquiry cycle, especially because, in examining the available evidence, the panel determined that no studies rigorously tested the effect of using an inquiry cycle as a framework for data use on student achievement. One study, summarized below, illustrates how such a cycle can be implemented and indicates the types of data that teachers and administrators wish to use as they examine performance, develop hypotheses, and modify instruction.

Example of a study that describes districts that make data part of an ongoing cycle of instructional improvement.

In a combined case study of two groups of schools, Herman and Gribbons (2001) describe how the districts implemented an inquiry process, detailing the processes for assessing student performance, understanding areas of curriculum strengths and weaknesses, and making curricular changes to address those strengths and weaknesses. The researchers coached the schools through implementing an inquiry process designed to raise student achievement. Although the panel recognizes that coaching of this type will not be available to all schools or districts that implement an inquiry cycle for data use, this example illustrates one way that schools could implement such a cycle in the absence of coaching.
The researchers had the districts begin by assembling data from a variety of sources (recommendation 1, action step 1). Available data were categorized as follows:

• achievement on state- and district-required tests;

• language proficiency;

• demographics;

• program participation (e.g., Title I, gifted, special education); and

• attendance and course history (in secondary schools).

To encourage study schools to initiate their inquiry processes and assist them with measuring student progress (recommendation 1, action step 2), the researchers asked schools to begin their data analysis by reflecting on three descriptive questions: (1) How are we doing? (2) Are we serving all students well? and (3) What are our relative strengths and weaknesses? Schools were given a report card, which summarized existing data in the categories listed above, as a tool for school administrators to communicate about the process and initiate discussions about needs and goals with staff and parents. Based on these initial measures, the schools developed hypotheses (recommendation 1, action step 2) about student achievement. For example, one secondary school noticed that most of the students had not come from the typical feeder school and had concerns about whether a discontinuity of curriculum for students not coming via the typical route might cause achievement problems. The school hypothesized that students who had attended the local middle school might have higher achievement on some measures than would students from a different background. The school then engaged in a comparison of the achievement of students who fed into the school from different locations.

After testing this hypothesis, the secondary school discovered that students being bused from more remote locations had particular problems in 10th-grade math achievement. Upon further discussion and analysis of this lesson from the data (recommendation 1, action step 3), the school discovered a potential curriculum problem. The school conducting the analysis used a nontraditional math sequence, which was aligned to the curriculum from the local middle school because it offered the first course in that sequence before sending students to high school, but students from other areas took a different course, resulting in a discontinuity of curriculum for those students. In fact, similarly bused students who attended the last year of middle school at the traditional feeder school did not have problems in 10th-grade math that were as severe as those of their bused peers who came from a different middle school. Therefore, the school decided to modify instruction (recommendation 1, action step 3) by providing a spring and summer course for students from nontraditional feeder schools who failed the first semester of math. The school also provided additional curriculum supports to help bring the students up to speed with their peers.

Finally, in keeping with the cyclical nature of the inquiry process, school staff assessed the effectiveness of the instructional modification by examining data from students who took the new course.

Recommendation 2.
Teach students to examine their own data and set learning goals

Level of evidence: Low
Table D1. Studies cited in recommendation 2 that meet WWC standards with or without reservations

Phillips et al. (1993)
Population: General education classrooms in a southeastern, urban school district
Grade: 2–5
Intervention: (1) Curriculum-based measurement (CBM) combined with instructional recommendations and peer tutoring assignments. CBM consisted of biweekly assessments that provided information about trend scores and students to watch. (2) CBM alone. (Both CBM conditions included student feedback.)
Comparison: (3) Control group with which teachers used their conventional practices for planning and monitoring.
Outcome: Number digits correct on Math Operations Test–Revised.
Results: (1) vs. (2): +41, ns; (1) vs. (3): +107, sig; (2) vs. (3): +51, ns

May and Robinson (2007)a
Population: Randomly selected districts in Ohio
Grade: High school students and teachers
Intervention: Personalized Assessment Reporting System (PARS), a report of the Ohio graduation test (OGT) for teachers, parents, and students with colorful and graphic summaries of student performance, and an interactive website with advice for students to improve their scores.
Comparison: Standard OGT reports for teachers, parents, and students with less color and graphics. All districts (including treatment) could access website of practice tests.
Outcome: (1) OGT scaled scores and (2) OGT retake scores (among students failing at least one subtest on first try)
Results: (1) Authors report no significant difference between students in treatment and comparison districts. (2) PARS students were more likely than control students to retake the test and to score higher in math, science, and social studies.

ns = not significant
sig = statistically significant
a. May and Robinson (2007) did not report the means and standard deviations needed for the WWC to calculate effect sizes or confirm the statistical significance of the authors’ claims.
The panel identified two randomized experiments that met WWC standards (one of these with reservations) while testing the effectiveness of instructional practices that included student self-examination of assessment data among other elements.158 However, neither study tested the sole effect of student data use; rather, students’ involvement with their own data was part of multifaceted interventions in which teachers and specialized staff also were using student data (Table D1). In the first study, there were large effects on student achievement, one of which was statistically significant. Authors of the second study also reported significant achievement effects, but the WWC could not confirm that finding because the study did not report the means and standard deviations used to calculate the effects.

158. May and Robinson (2007); Phillips et al. (1993).

In the first study, Phillips et al. (1993) compared two curriculum-based measurement (CBM) interventions, both of which included a student feedback component, to a non-CBM condition. The study reported large positive effects of both CBM interventions, but only the comparison of CBM combined with teacher feedback on instructional recommendations versus the non-CBM condition was statistically significant.159 Students analyzing their own performance in this study were reportedly reflecting on data using questions such as “Can I beat my highest score in the next two weeks?” and “Which skills can I work harder on in the next two weeks?” Teacher feedback included instructing students on how they can interpret their progress graphs and skills profiles as well as coaching students to ask questions about their data to diagnose areas for improvement.

159. Phillips et al. (1993).

The second experiment compared two school districts in Ohio, both of which released reports about student performance on an annual state test to teachers, parents, and students. An interactive website used by these districts also allowed students in the treatment condition to access directions on how to improve their scores and skills through online tutorials and question-and-answer sessions.160 Although the authors reported that students in the treatment condition were more likely than other students to retake the test after failing at least one subtest—and to have higher scores in math, science, and social studies when they did retake the test—the study did not report the means and standard deviations of the outcome measures, so the WWC was not able to verify statistical significance.

160. May and Robinson (2007).

To provide readers with a sense of how students use data and teachers provide feedback, the panel offers the following example from a study that used a less rigorous design.

Example of a study that describes how a teacher can explain expectations, provide timely and constructive feedback, and help students learn from that feedback.

Clymer and Wiliam’s (2007) pilot study of a standards-based grading system in a suburban Pennsylvania 8th-grade classroom is closely related to the panel’s first two suggested action steps in recommendation 2. The teacher in the study mapped 10 content standards to five marking periods and identified tasks and skills for students to improve their proficiency on each standard. The teacher then developed a performance-rating system using a colored “stoplight” to reflect beginning knowledge (red), developing knowledge (yellow), or mastery (green) of these standards. The colored categories translated into numeric scores at the end of each marking period and were aggregated to generate a student’s overall grade in the course.
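To make the aggregation concrete, the sketch below shows one way colored ratings could be translated into numeric scores and combined into a course grade. This is a hypothetical illustration only; the point values, averaging rule, and 0–100 scale are assumptions made for the example and are not the specific scheme reported by Clymer and Wiliam (2007).

```python
# Hypothetical sketch: converting "stoplight" ratings to an overall grade.
# The rating-to-point mapping, averaging rule, and 0-100 scale are assumptions
# for illustration, not the scheme used in the cited study.

RATING_POINTS = {"red": 1, "yellow": 2, "green": 3}  # beginning, developing, mastery

def marking_period_score(ratings):
    """Average one student's standard-by-standard ratings for a marking period."""
    points = [RATING_POINTS[r] for r in ratings]
    return sum(points) / len(points)

def overall_grade(period_scores):
    """Combine marking-period scores into an assumed 0-100 course grade."""
    return round(100 * sum(period_scores) / (3 * len(period_scores)))

# Example: one student's ratings on four standards across two marking periods.
period1 = ["green", "yellow", "yellow", "red"]
period2 = ["green", "green", "yellow", "yellow"]
print(overall_grade([marking_period_score(period1),
                     marking_period_score(period2)]))  # prints 75
```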

The teacher explained expectations (recommendation 2, action step 1) by sharing the content standards and corresponding ratings with the students and explaining that grades would be based on understanding of the material at the end of each marking period. Rather than assigning grades, the teacher provided feedback (recommendation 2, action step 2) to students with weekly reports on their progress toward each standard (using the colored stoplight) and helped students learn from that feedback (recommendation 2, action step 3) by encouraging them to revise their work or complete additional assignments to demonstrate better mastery in red and yellow areas. The panel considers this type of feedback to be both timely and constructive. The study also suggested that the teacher provide tools to help students learn from this feedback, but did not describe the tools or feedback process in detail.

The authors reported that the class in the pilot study showed greater achievement gains in science over the course of a school year than did a similar class not participating in the pilot, although they caution that the design of the study means that these results may not be generalizable to other classrooms. When surveyed, students participating in the study also reported that receiving teacher feedback about how to correct their performance, as well as their accuracy, was helpful.

Recommendation 3.
Establish a clear vision for schoolwide data use

Level of evidence: Low

The panel used several studies with qualitative designs as resources for information on how some schools have implemented practices similar to those they recommend, and for concrete examples to clarify its suggested action steps. This section provides brief overviews of specific qualitative studies that showcase examples of how the recommended action steps have been implemented. No studies examined by the panel used a causal design to examine how establishing a vision for data use affects student achievement.

Examples of establishing and depending on schoolwide leadership for continuous data use.

A case study by Halverson et al. (2007) examined the practices of four schools recognized for their strong leadership in using data to make instructional decisions (while also recording student achievement gains). The researchers gathered data through structured interviews with principals and other school leaders as well as through observations of staff meetings and events relevant to data use.

In these four schools, principals and teachers met regularly to reflect on assessment results and to discuss how to modify practice. Administrators provided activities for teachers and principals to work together to discern patterns in the data and to develop hypotheses and courses of action to address perceived needs for instructional change. At several school-level faculty meetings throughout the year, staff revisited the goals. Faculty meetings around data occurred at least quarterly in study schools, and one school had weekly meetings focused on students’ behavioral data. Staff involved in school-level data examination and instructional change decisions included principals, classroom teachers, special education teachers, and school psychologists. Some examples of methods that principals used to encourage their staff to take leadership for data use included scheduling small team meetings for all teachers in a given grade; inviting all staff to beginning and end-of-year meetings at which the school used achievement data to assess progress; and asking teachers to use annual assessment data to identify areas in which the current curriculum had too much, or too little, emphasis on required concepts.

Example of how schools could approach defining teaching and learning concepts.
Wayman, Cho, and Johnston (2007) conducted a case study of how a school district uses, and could more efficiently use, data for instructional decisions. The authors indicated that districts or systems in which staff do not have a shared definition of teaching and learning will experience barriers and challenges to agreeing on learning goals, and they specifically advocated that the educators should begin by answering four questions about data and instruction: “(1) What do we mean by learning and achievement? (2) How will we conduct and support teaching and learning? (3) How will we know teaching and learning when we see it? (4) What action will we take based on our results?” (p. 42). The panel provides these questions as examples but recognizes that the answers to these questions will vary widely as schools and districts respond in ways that account for their local circumstances.

Example of districts that develop a written plan to use data in support of articulated goals.

Datnow, Park, and Wohlstetter (2007) conducted case studies of eight urban schools from two public school districts and two charter school systems. The study districts were selected from a pool of 25 districts that were recommended by researchers and experts in the field as being at the forefront of using performance results for instructional decision making. The researchers selected two schools per district/system after receiving recommendations from district-level staff about which schools were most engaged in the process of using data to inform instruction. In each district, researchers interviewed staff from the central office, building-level staff at each school, and at least five teachers per school, for a total of 70 staff interviews over the course of three months in 2006. The researchers also conducted informal school and classroom observations and reviewed relevant documents.

In synthesizing the results from the eight schools, researchers identified that one practice the schools shared was their use of assessment data to set measurable goals for student, classroom, school, and system progress. The authors noted that setting goals for students is a “precondition for effective data-driven decisionmaking” (p. 20). Schools found the most success in defining goals that were focused and specific. For example, in one district, the goals for the year were (1) all students will score a 3 and at least two-thirds of students will score a 4 on the schoolwide writing assignment; (2) all students will be at grade level for reading in the spring, or at least two levels above where they were in the fall; and (3) all students will be at the proficient level on the math benchmark test by the spring. Staff and administrators from all levels (classroom, building, and system) were involved in goal-setting decisions.

The authors concluded that the eight schools used the goal-setting process as a starting point for developing a systemwide plan for data use, forming the foundation for a data culture that had buy-in from staff at all levels. Leaders at the system level across the study schools reported that explicitly stating their expectations for when and how educators would use assessment data was instrumental in encouraging staff to use data rather than intuition to shape instructional decisions. At the schools in public districts, system leaders experienced more challenges fostering staff buy-in than did leaders in charter systems; researchers and staff attributed this to the need to overcome institutional practices in the public districts that did not exist in charter schools.
Table D2. Scheduling approaches for teacher collaboration

School A
Time and planning strategies:
1. Once every month, the school day begins two hours later—teachers meet during this time to engage in the activities described under Activities below. School makes up this accumulated time by extending the school year.
Activities:
a. School staff review district standards and realign the assessments they use accordingly.
b. School staff continuously reevaluate this work and discuss and plan changes as needed.

School B
Time and planning strategies:
1. School staff is released early from school once per week for at least 45 minutes. This time is added to other days throughout the week.
2. The entire staff meets weekly for one hour before school. Staff decreased the “nuts and bolts” of the meetings and prioritized work related to assessment.
Activities:
a. Schools use allotted time to align curriculum across grades with the state standards. This process is driven by student assessment data.
b. School staff continuously reevaluate this work and discuss and plan changes as needed.

School C
Time and planning strategies:
1. Same-grade teachers meet informally during weekly planning periods and formally every six weeks. To accommodate these planning periods, students in entire grades are sent to “specials” (e.g., gym, art classes). Time also is allotted at regularly scheduled staff meetings.
2. Teachers are released from teaching duties several days each year and are replaced by substitute teachers.
3. Teachers meet with the principal up to three times each year.
Activities:
a. Staff discuss students’ progress according to the “developmental continuums” written by school staff.
b. Teachers administer individual assessments to students.
c. Staff discuss reports on assessment data from district research department.

School D
Time and planning strategies:
1. Teachers request time to meet with each other during school hours; substitutes are hired to support this. In addition, teachers meet after school.
2. Teachers meet in “within-grade” and “subject area” teams during their planning hours once per week.
Activities:
a. Staff members share knowledge gained from professional development activities that addressed curriculum and assessment. They also discuss student mastery of standards and other outcomes and possible intervention strategies.

Source: Cromey and Hanson (2000), p. 18.
Recommendation 4.
Provide supports that foster a data-driven culture within the school

Level of evidence: Low

The panel identified no causal studies meeting WWC standards that specifically examined the effectiveness of staff supports with respect to student outcomes. Two randomized trials of interventions that included coaching for teachers around data use along with other treatment condition aspects met WWC standards (one with and one without reservations). In both cases, however, the treatment condition incorporated many elements of which teacher support was just one, and neither reported a discernible effect on student achievement.161 The panel examined other studies, which did not use designs rigorous enough to meet WWC standards, and noted specific examples of how the recommended action steps have been implemented.

161. Jones and Krouse (1988); Wesson (1991).

In a randomized trial that met WWC standards with reservations, Jones and Krouse (1988) randomly assigned student teachers to one of two groups that received coaching. One group received coaching on classroom management; the other received coaching on classroom management and data use for making instructional changes. The data-use intervention included individualized coaching by supervisors on how the teachers could use assessment and behavioral data to track student progress and make changes in the classroom. Teachers in the data-use group reported more frequently using pupil observations to make instructional decisions, but the study authors make no claims about whether this difference was statistically significant, nor does the study include information the WWC would need to calculate statistical significance. There was also no statistically significant difference in the reading and math outcomes of the students assigned to these two groups of teachers.

Another randomized trial, which met WWC standards, compared the reading achievement of elementary school students with disabilities whose teachers used two types of progress monitoring (curriculum based versus teacher developed) and received two types of consultation from mentors (group and individual), for a total of four groups.162 Related to this recommendation was the author’s finding that students whose teachers had group consultation did not perform as well as those whose teachers had individual coaching, but the effect was not statistically significant, failing to provide the panel with strong causal support for recommending that teachers receive individual versus group consultation.

162. Wesson (1991).

To provide readers with a sense for how other schools designate structured time for data use and provide professional development to support staff data use, the panel offers the following examples from studies that used less rigorous designs.

Example of a school/district study that designates structured time for data use.

Cromey and Hanson (2000) conducted a qualitative study of how schools use assessment data from multiple sources, aiming to identify characteristics of schools that make valuable use of their data. After interviewing district administrators, principals, teachers, and other building staff from nine schools about how they collect and use student assessment data, the researchers identified six characteristics of schools with well-developed assessment systems. The characteristic most applicable to recommendation 4, action step 2, is that these schools specifically allocate time for their staff to reflect collaboratively on how they will use student assessment data to guide their instructional decisions. Table D2, drawn from this study, describes the approaches four schools used to schedule collaboration time.
Although the panel did not have evidence that these approaches are effective for increasing student achievement, they reproduce this table here to provide an array of examples to readers.

Example of how school/district provided targeted and regular professional development opportunities.

Anderegg’s (2007) case study of data use in several Alaska school districts has findings relevant to the panel’s third suggested action step for recommendation 4. The author explored several aspects of data use, including professional development around data use and analysis for teachers, school administrators, and district superintendents. A mixed-method approach was used to collect and analyze data. The author implemented a written survey in 53 districts, conducted follow-up telephone surveys, and studied paper records describing data use and school in-service plans at select sites.

Survey questions focused on professional development targeted toward “the use of data analysis methods and skills, such as finding patterns and/or systemic relationships between variables” (p. 171), although respondents also were given the opportunity to respond to open-ended questions on existing and desired professional development. The majority of respondents reported receiving some kind of data training, with 12 percent of administrators and four percent of teachers receiving training at least monthly. More than one-third of respondents reported never receiving such training. The study found that regular professional development (recommendation 4, action step 3) around data use and analysis is not widespread.

The study’s findings suggest that teachers would be interested in receiving more frequent professional development around data use and analysis. All of the teachers receiving data training at least monthly reported that such training was sufficient, compared to only three percent of respondents who never received training. Administrators were less likely than teachers to show interest in more frequent training—only 14 percent of administrators reporting no training thought that this was insufficient.

Teachers, administrators, and superintendents proposed ways to improve professional development around data use and analysis. A majority of all respondents suggested that data training be focused on analysis to inform teachers’ day-to-day implementation of “standards, curriculum, and instruction” and provide resources for doing so (p. 114). All three groups also addressed the frequency of data training—the majority of superintendents and administrators cited the need to engage in “ongoing discussions and analysis,” and more than one-quarter of teachers suggested that they needed more time to analyze and discuss data and plan accordingly (p. 116). Sixty-three percent of superintendents cited the need for access to disaggregated data or training on “specific data analysis tools” (p. 89).

Given that this study was conducted in mostly rural Alaska school districts, the author cautions that these findings may not be representative of more urban districts or those in other states. Furthermore, this study does not present any evidence suggesting that frequent and targeted professional development leads to increased data use and analysis and will support the overall goal of creating a data-driven culture within a school.

Recommendation 5.
Develop and maintain a districtwide data system

Level of evidence: Low

The panel identified no studies that used a rigorous design to test how developing and maintaining a data system impact student achievement.
To assist districts with thinking through the process of obtaining, launching, and maintaining a system, the panel drew examples from qualitative and descriptive studies of how other districts have approached the challenge of identifying the correct data system.

Example of how one school district involved stakeholders in the decision to build a data system, articulated requirements, and then implemented the new system.

Long et al. (2008) conducted an implementation study of a data warehouse in one school district by conducting interviews with staff at all levels. When this school district determined it should build (recommendation 5, action step 3) its own data warehouse to meet rising state and federal data needs, the district’s accountability and research department led the team that developed the new system. To involve stakeholders (recommendation 5, action step 1) in selecting the system and to articulate system requirements (recommendation 5, action step 2), that department began by assessing the needs of data users. Then, the team planned and staged implementation (recommendation 5, action step 4) of the system by building one system module at a time, a process that the developers reported “kept [the project] alive by not trying to design every part of the system at once” (p. 216). Some features of the final system include

• combining data from multiple sources, including assessment, demographic, school profile, and special program data;

• providing access to handouts, a statistics chat, and frequently asked questions;

• creating a graphing tool that enables users to examine assessment and demographic data from different periods of time and at different levels of aggregation. Users access the reporting features using predesigned queries and web-based reports; and

• providing access to instructional suggestions based on a student’s performance that teachers can link to from the area on students’ assessment data.

Example of how a group of districts involved stakeholders, articulated system requirements, and implemented new data systems (both built and bought).

Mieles and Foley (2005) conducted a case study focused on the implementation processes, successes, and challenges of data-warehouse technology. The study was based on interview data from educators and education-technology experts in eight urban school districts that were at different points in the process of implementing data warehouses. The eight districts involved stakeholders (recommendation 5, action step 1) in systems decisions by engaging staff from multiple levels. These stakeholders included superintendents, principals, school board members, experts at neighboring school districts, staff with expertise in instruction and assessment, and external vendors with technical expertise. Six of the districts convened planning committees staffed by stakeholders with different roles.

These committees articulated systems requirements (recommendation 5, action step 2) by developing needs assessments and planned for staged rollouts by coming to agreement on what data the system would collect and use, who would use it, and what systems would be replaced by the new approach. In the final product, the staff interviewed for the study had a range of formats and levels of access to reports that drew on the warehouse data. Particularly useful to these staff was the ability to “drill down” and explore the demographic and administrative data in the warehouse to look for patterns of how they might be associated with achievement.
In some districts, the capability to do so was limited by staff roles for security and confidentiality reasons. To address security concerns, some districts introduced or planned to introduce differentiated access to their data warehouse by staff role in order to protect privacy and provide security.
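The following sketch illustrates what role-differentiated access of this kind might look like in practice. It is a hypothetical example only; the staff roles, data categories, and rules shown are assumptions made for illustration and are not policies reported by Mieles and Foley (2005).

```python
# Hypothetical sketch of role-differentiated access to a student data warehouse.
# Roles, data categories, and rules below are illustrative assumptions only.

ACCESS_RULES = {
    "district_administrator": {"assessment", "demographics", "attendance", "program_participation"},
    "principal": {"assessment", "demographics", "attendance"},
    "classroom_teacher": {"assessment", "attendance"},
}

def can_view(role, data_category, own_student):
    """Return True if the given role may view the data category.

    In this sketch, teachers are additionally limited to their own students.
    """
    if role == "classroom_teacher" and not own_student:
        return False
    return data_category in ACCESS_RULES.get(role, set())

# Example checks
print(can_view("classroom_teacher", "assessment", own_student=True))    # True
print(can_view("classroom_teacher", "demographics", own_student=True))  # False
print(can_view("principal", "attendance", own_student=False))           # True
```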
When planning and staging implementation (recommendation 5, action step 4), some districts participating in the study requested demonstrations or pilots and got feedback from users about system features before full implementation of a data warehouse. Most districts had implemented a data warehouse within a year of beginning their inquiry process, and all districts experienced ongoing modifications and expansions to the system after it was implemented based on increased capacity and growing demands from users. Districts not using external vendors found that cross-departmental communication and onsite support from internal staff for those using the data warehouse were essential to implementation. Some districts faced unexpectedly onerous challenges with cleaning and integrating data that originated from multiple sources and indicated that data dictionaries defining the values of variables were a successful long-term solution for some districts that began with data quality difficulties. After launching a data warehouse, all study districts discovered that they needed more time and resources than expected for data quality assurance, but they also found that high-quality data were essential to convincing staff to use the new system.

Example of a study advising a school district on how to proceed with its data-system decisions, including issues of which staff to involve in choosing system requirements and implementing the system.

Wayman, Cho, and Johnston (2007), after being commissioned to conduct an in-depth case study of one district’s data use capacities and needs, advised the district to involve stakeholders (recommendation 5, action step 1) from “every level of the district” (p. 11), in a conversation about what data mean and why they are important and useful to staff. Then, the authors advised the district to acquire an integrated computer data system, beginning with a clearly articulated understanding of system requirements (recommendation 5, action step 2). The authors advised that the final system should be intuitive, easy to use, and flexible to pull data from or export data to other systems or programs. This interoperability of systems and ease of use, when available together, could allow staff to overcome barriers that had previously prevented them from optimal use of student data to inform their decisions. The authors further recommended that the district carefully consider security needs for their data system as their data-based decision-making process evolved. Specific suggestions included development of policies to govern which staff should have access to which types of data, how and when staff should access data, and how the system would be encrypted or otherwise protected. In this study, the authors specifically advised the district to buy a data warehouse (recommendation 5, action step 3) to hold all of these data from multiple sources, based on their evaluation of the district, which showed that it needed a system immediately and did not have the technical capacity to build one.

Finally, they advised the district to plan an implementation (recommendation 5, action step 4) that consisted of a gradual rollout of new system pieces, beginning with those that “will provide the most value and immediate impact” (p. 52) in order to keep the implementation process moving while simultaneously gaining user buy-in.
References

Aarons, D. I. (2009). Enthusiasm builds for data systems. Education Week, 28(34), 18–19.
Abbott, D. V. (2008). A functionality framework for educational organizations: Achieving accountability at scale. In E. Mandinach & M. Honey (Eds.), Data-driven school improvement: Linking data and learning (pp. 257–276). New York: Teachers College Press.
American Educational Research Association, American Psychological Association, & National Council on Measurement in Education. (1999). The standards for educational and psychological testing. Washington, DC: American Educational Research Association Publications.
American Psychological Association. (2002). Criteria for practice guideline development and evaluation. American Psychologist, 57(12), 1048–1051.
American Recovery and Reinvestment Act of 2009, S. 1, 111th Congress, 1st Session (2009).
Anderegg, C. C. (2007). Classrooms and schools analyzing student data: A study of educational practice (Doctoral dissertation, Pepperdine University, 2007). Dissertation Abstracts International, 68(02A), 184–538.
Anderson, J., Goertz, M. E., Goldwasser, M., Hovde, K., Massell, D., Mueller, J. A., et al. (2006). SchoolNet: A case study of implementation in three schools. Philadelphia, PA: Consortium for Policy Research in Education.
Armstrong, J., & Anthes, K. (2001). How data can help: Putting information to work to raise student achievement. American School Board Journal, 188(11), 38–41.
Arnold, J. G. (2007). School capacity for data-driven decision making and student achievement. Unpublished doctoral dissertation, University of South Carolina, Columbia, SC.
Bettesworth, L. R. (2006). Administrators’ use of data to guide decision-making. Unpublished doctoral dissertation, University of Oregon, Eugene, OR.
Bigger, S. L. (2006). Data-driven decision-making within a professional learning community: Assessing the predictive qualities of curriculum-based measurements to a high-stakes, state test of reading achievement at the elementary level. Unpublished doctoral dissertation, University of Pennsylvania, Philadelphia, PA.
Black, P., Harrison, C., Lee, C., Marshall, B., & Wiliam, D. (2003). Assessment for learning: Putting it into practice. Maidenhead, UK: Open University Press.
Black, P., & Wiliam, D. (1998). Assessment and classroom learning. Assessment in Education, 5(1), 7–74.
Booher-Jennings, J. (2005). Below the bubble: “Educational triage” and the Texas Accountability System. American Educational Research Journal, 42(2), 231–268.
Breiter, A., & Light, D. (2006). Data for school improvement: Factors for designing effective information systems to support decision-making in schools. Educational Technology and Society, 9(3), 206–217.
Brunner, C., Fasca, C., Heinze, J., Honey, M., Light, D., Mandinach, E., et al. (2005). Linking data and learning: The Grow Network study. Journal of Education for Students Placed At Risk, 10(3), 241–267.
Choppin, J. (2002, April). Data use in practice: Examples from the school level. Paper presented at the annual meeting of the American Educational Research Association, New Orleans, LA.
Chrismer, S. S., & DiBara, J. (2006). Formative Assessment of Student Thinking in Reading (FAST-R): An evaluation of the use of FAST-R in the Boston public schools. Cambridge, MA: Education Matters, Inc.
Clymer, J. B., & Wiliam, D. (2007). Improving the way we grade science. Educational Leadership, 64, 36–42.
Copland, M. A. (2003). Leadership of inquiry: Building and sustaining capacity for school improvement. Educational Evaluation and Policy Analysis, 25(4), 375–395.
Cromey, A., & Hanson, M. (2000). An exploratory analysis of school-based student assessment systems. Oak Brook, IL: North Central Regional Educational Laboratory (NCREL).
Datnow, A., Park, V., & Wohlstetter, P. (2007). Achieving with data: How high-performing school systems use data to improve instruction for elementary students. Los Angeles, CA: University of Southern California, Center on Educational Governance.
Elmore, R. F. (2003). Doing the right thing, knowing the right thing to do: School improvement and performance-based accountability. Washington, DC: National Governors Association Center for Best Practices.
Feldman, J., & Tung, R. (2001). Using data-based inquiry and decision making to improve instruction. ERS Spectrum: Journal of School Research and Information, 19(3), 10–19.
Fiarman, S. E. (2007). Planning to assess progress: Mason Elementary School refines an instructional strategy. In K. P. Boudett & J. L. Steele (Eds.), Data wise in action: Stories of schools using data to improve teaching and learning (pp. 125–148). Cambridge, MA: Harvard Education Press.
Field, M. J., & Lohr, K. N. (Eds.). (1990). Clinical practice guidelines: Directions for a new program. Washington, DC: National Academy Press.
Forman, M. L. (2007). Developing an action plan: Two Rivers Public Charter School focuses on instruction. In K. P. Boudett & J. L. Steele (Eds.), Data wise in action: Stories of schools using data to improve teaching and learning (pp. 107–124). Cambridge, MA: Harvard Education Press.
Garrison, C., & Ehringhaus, M. (2009). Formative and summative assessment in the classroom. National Middle School Association. Retrieved April 15, 2009, from http://www.nmsa.org/Publications/WebExclusive/Assessment/tabid/1120/Default.aspx.
Gentry, D. R. (2005). Technology supported data-driven decision-making in an Oklahoma elementary school. Unpublished doctoral dissertation, University of Oklahoma, Norman, OK.
Halverson, R., Grigg, J., Prichett, R., & Thomas, C. (2007). The new instructional leadership: Creating data-driven instructional systems in schools. Journal of School Leadership, 17(2), 158–193.
Halverson, R., Prichett, R. B., & Watson, J. G. (2007). Formative feedback systems and the new instructional leadership. Madison, WI: University of Wisconsin.
Halverson, R., & Thomas, C. N. (2007). The roles and practices of student services staff as data-driven instructional leaders. In M. Mangin & S. Stoelinga (Eds.), Instructional teachers leadership roles: Using research to inform and reform (pp. 163–200). New York: Teachers College Press.
Hamilton, L. (2003). Assessment as a policy tool. Review of Research in Education, 27, 25–68.
Hamilton, L. S., Stecher, B. M., Marsh, J. A., McCombs, J. S., Robyn, A., Russell, J. L., et al. (2007). Standards-based accountability under No Child Left Behind: Experiences of teachers and administrators in three states. Santa Monica, CA: RAND Corporation.
Herman, J., & Gribbons, B. (2001). Lessons learned in using data to support school inquiry and continuous improvement: Final report to the Stuart Foundation. Los Angeles, CA: University of California, Center for the Study of Evaluation (CSE).
Hill, D., Lewis, J., & Pearson, J. (2008). Metro Nashville Public Schools student assessment staff development model. Nashville, TN: Vanderbilt University, Peabody College.
Huffman, D., & Kalnin, J. (2003). Collaborative inquiry to make data-based decisions in schools. Teaching and Teacher Education, 19(6), 569–580.
Ingram, D., Louis, K. S., & Schroeder, R. G. (2004). Accountability policies and teacher decision making: Barriers to the use of data to improve practice. Teachers College Record, 106(6), 1258–1287.
Jones, E. D., & Krouse, J. P. (1988). The effectiveness of data-based instruction by student teachers in classrooms for pupils with mild learning handicaps. Teacher Education and Special Education, 11(1), 9–19.
Kerr, K. A., Marsh, J. A., Ikemoto, G. S., Darilek, H., & Barney, H. (2006). Strategies to promote data use for instructional improvement: Actions, outcomes, and lessons from three urban districts. American Journal of Education, 112(4), 496–520.
Kluger, A. N., & DeNisi, A. (1996). The effects of feedback interventions on performance: A historical review, a meta-analysis, and a preliminary feedback intervention theory. Psychological Bulletin, 119(2), 254.
Knapp, M. S., Swinnerton, J. A., Copland, M. A., & Monpas-Huber, J. (2006). Data-informed leadership in education. Seattle, WA: University of Washington, Center for the Study of Teaching and Policy.
Koretz, D. (2003). Using multiple measures to address perverse incentives and score inflation. Educational Measurement: Issues and Practice, 22(2), 18–26.
Koretz, D. M., & Barron, S. I. (1998). The validity of gains in scores on the Kentucky Instructional Results Information System (KIRIS). Santa Monica, CA: RAND Corporation.
Lachat, M. A., & Smith, S. (2005). Practices that support data use in urban high schools. Journal of Education for Students Placed At Risk, 10(3), 333–349.
Lane, C., Marquardt, J., Meyer, M. A., & Murray, W. (1997). Addressing the lack of motivation in the middle school setting. Chicago, IL: St. Xavier University, Master’s action research project.
Lee, D., & Gavine, D. (2003). Goal-setting and self-assessment in Year 7 students. Educational Research, 45(1), 49–59.
Leithwood, K., Louis, K. S., Anderson, S., & Wahlstrom, K. (2007). Review of research: How leadership influences student learning. Minneapolis, MN: University of Minnesota, Center for Applied Research and Educational Improvement.
Liddle, K. (2000). Data-driven success: How one elementary school mined assessment data to improve instruction. American School Board Journal. Retrieved April 19, 2009, from http://www.asbj.com/MainMenuCategory/Archive.aspx.
Light, D., Wexler, D. H., & Heinze, J. (2005). Keeping teachers in the center: A framework for data-driven decision-making. Technology and Teacher Education Annual, 1, 128.
Long, L., Rivas, L. M., Light, D., & Mandinach, E. B. (2008). The evolution of a homegrown data warehouse: TUSDStats. In E. B. Mandinach & M. Honey (Eds.), Data-driven school improvement: Linking data and learning (pp. 209–232). New York: Teachers College Press.
Mandinach, E. B., Honey, M., Light, D., Heinze, C., & Rivas, L. (2005, June). Creating an evaluation framework for data-driven decision-making. Paper presented at the National Educational Computing Conference, Philadelphia, PA.
Marsh, J. A., Pane, J. F., & Hamilton, L. S. (2006). Making sense of data-driven decision making in education: Evidence from recent RAND research (OP-170). Santa Monica, CA: RAND Corporation.
Marsh, J. A., McCombs, J. S., Lockwood, J. R., Martorell, F., Gershwin, D., Naftel, S., et al. (2008). Supporting literacy across the sunshine state: A study of Florida middle school reading coaches. Santa Monica, CA: RAND Corporation.
Mason, S. (2002). Turning data into knowledge: Lessons from six Milwaukee public schools. Madison, WI: Wisconsin Center for Education Research.
Mason, S. A. (2003, April). Learning from data: The role of professional learning communities. Paper presented at the annual meeting of the American Educational Research Association, Chicago, IL.
May, H., & Robinson, M. A. (2007). A randomized evaluation of Ohio’s Personalized Assessment Reporting System (PARS). Philadelphia, PA: Consortium for Policy Research in Education.
Means, B., Padilla, C., DeBarger, A., & Bakia, M. (2009). Implementing data-informed decision making in schools—teacher access, supports and use. Washington, DC: U.S. Department of Education.
Merriam-Webster Online Dictionary. (2009). Hypothesis. Retrieved April 22, 2009, from http://www.merriam-webster.com/dictionary/hypothesis.
Mid-Continent Research for Education and Learning (McREL). (2003). Sustaining school improvement: Data-driven decision making. Aurora, CO: Author.
Mieles, T., & Foley, E. (2005). Data warehousing: Preliminary findings from a study of implementing districts. Providence, RI: Annenberg Institute for School Reform.
Moody, L., & Dede, C. (2008). Models of data-based decision-making: A case study of the Milwaukee Public Schools. In E. B. Mandinach & M. Honey (Eds.), Data-driven school improvement: Linking data and learning (pp. 233–254). New York: Teachers College Press.
Nabors Oláh, L., Lawrence, N., & Riggan, M. (2008, March). Learning to learn from benchmark assessment data: How teachers analyze results. Paper presented at the annual meeting of the American Educational Research Association, New York.
Obama, B. (2009, March 10). Remarks by the president to the Hispanic Chamber of Commerce on a complete and competitive American education. Retrieved April 20, 2009, from http://www.whitehouse.gov/the_press_office/Remarks-of-the-President-to-the-United-States-Hispanic-Chamber-of-Commerce/.
Owings, C. A., & Follo, E. (1992). Effects of portfolio assessment on students’ attitudes and goal setting abilities in mathematics. Rochester, MI: Oakland University.
Perie, M., Marion, S., & Gong, B. (2007). A framework for considering interim assessments. Dover, NH: National Center for the Improvement of Educational Assessment.
Phillips, N. B., Hamlett, C. L., Fuchs, L. S., & Fuchs, D. (1993). Combining classwide curriculum-based measurement and peer tutoring to help general educators provide adaptive education. Learning Disabilities Research & Practice, 8(3), 148–156.
Ramnarine, S. (2004). Impacting student achievement through data-driven decision-making. MultiMedia & Internet @ Schools, 11(4), 33–35.
Rossmiller, R. A., & Holcomb, E. L. (1993, April). The Effective Schools process for continuous school improvement. Paper presented at the annual meeting of the American Educational Research Association, Atlanta, GA.
Schunk, D. H., & Swartz, C. W. (1992, April). Goals and feedback during writing strategy instruction with gifted students. Paper presented at the annual meeting of the American Educational Research Association, San Francisco, CA.
Shepard, L. A. (1995). Using assessment to improve learning. Educational Leadership, 52(5), 38–43.
Shepard, L. A., Flexer, R. J., Hiebert, E. H., Marion, S. F., Mayfield, V., & Weston, T. J. (1996). Effects of introducing classroom performance assessments on student learning. Educational Measurement: Issues and Practice, 15(3), 7–18.
Spillane, J. P., Halverson, R., & Diamond, J. B. (2004). Towards a theory of leadership practice: A distributed perspective. Journal of Curriculum Studies, 36(1), 3–34.
Stecker, P. M. (1993). Effects of instructional modifications with and without curriculum-based measurement on the mathematics achievement of students with mild disabilities (Doctoral dissertation, Vanderbilt University, 1993). Dissertation Abstracts International, 55(01A).
Stiggins, R. (2007). Assessment through the student’s eyes. Educational Leadership, 64(8), 22–26.
Supovitz, J. A. (2006). The case for district-based reform: Leading, building, and sustaining school improvement. Cambridge, MA: Harvard Education Press.
Supovitz, J. A., & Klein, V. (2003). Mapping a course for improved student learning: How innovative schools systematically use student performance data to guide improvement. Philadelphia, PA: University of Pennsylvania, Consortium for Policy Research in Education.
Supovitz, J. A., & Weathers, J. (2004). Dashboard lights: Monitoring implementation of district instructional reform strategies. Philadelphia, PA: University of Pennsylvania, Consortium for Policy Research in Education.
Thorn, C. A. (2001). Knowledge management for educational information systems: What is the state of the field? Education Policy Analysis Archives, 9(47), 17–36.
Thurman, R., & Wolfe, K. (1999). Improving academic achievement of underachieving students in a heterogeneous classroom. Chicago, IL: St. Xavier University, Master’s action research project.
Togneri, W. (2003). Beyond islands of excellence: What districts can do to improve instruction and achievement in all schools—A leadership brief. Washington, DC: Learning First Alliance.
U.S. Department of Education. (2009). Using ARRA funds to drive school reform and improvement. Retrieved April 24, 2009, from www.ed.gov/policy/gen/leg/recovery/guidance/uses.doc.
Van Wagner, K. (n.d.). About.com/psychology/correlational studies. Retrieved April 11, 2009, from http://psychology.about.com/od/researchmethods/a/correlational.htm.
Waters, J. T., & Marzano, R. J. (2006). School district leadership that works: The effect of superintendent leadership on student achievement. Denver, CO: Mid-Continent Research for Education and Learning (McREL).
Wayman, J. C. (2005). Involving teachers in data-driven decision making: Using computer data systems to support teacher inquiry and reflection. Journal of Education for Students Placed At Risk, 10(3), 295–308.
Wayman, J. C., Brewer, C., & Stringfield, S. (2009, April). Leadership for effective data use. Paper presented at the annual meeting of the American Educational Research Association, San Diego, CA.
Wayman, J. C., & Cho, V. (2008). Preparing educators to effectively use student data systems. In T. J. Kowalski & T. J. Lasley (Eds.), Handbook on data-based decision-making in education (pp. 89–104). New York: Routledge.
Wayman, J. C., Cho, V., & Johnston, M. T. (2007). The data-informed district: A district-wide evaluation of data use in the Natrona County School District. Austin, TX: The University of Texas.
Wayman, J. C., & Conoly, K. (2006). Managing curriculum: Rapid implementation and sustainability of a districtwide data initiative. ERS Spectrum, 24(2), 4–8.
Wayman, J. C., Midgley, S., & Stringfield, S. (2006). Leadership for data-based decision-making: Collaborative educator teams. In A. B. Danzig, K. M. Borman, B. A. Jones, & W. F. Wright (Eds.), Learner-centered leadership: Research, policy, and practice (pp. 189–206). Mahwah, NJ: Lawrence Erlbaum Associates.
Wayman, J. C., & Stringfield, S. (2006). Technology-supported involvement of entire faculties in examination of student data for instructional improvement. American Journal of Education, 112(4), 549–571.
Wayman, J. C., Stringfield, S., & Yakimowski, M. (2004). Software enabling school improvement through analysis of student data (Rep. No. 67). Baltimore, MD: Center for Research on the Education of Students Placed at Risk (CRESPAR).
Wesson, C. L. (1991). Curriculum-based measurement and two models of follow-up consultation. Exceptional Children, 57(3), 246–256.
Williams Rose, L. (2006). Middle Start schools striving for excellence: Steadily improving high-poverty schools in the Mid South Delta. New York: Academy for Educational Development.
Young, V. M. (2006). Teachers’ use of data: Loose coupling, agenda setting, and team norms. American Journal of Education, 112(4), 521–548.