
INSPIRE Evaluation Plan

Year 2: Toward a Comprehensive 4-Year Impact Study

Submitted to:
Bridget Jones
INSPIRE i3 Grant Director
Cabarrus County Schools
By The Evaluation Group
August 2015

Contact: Karyl Askew, PhD


Cell: 803-542-2343
Email: Karyl@evaluationgroup.com

INTRODUCTION
INSPIRE stands for Infusing Innovative STEM Practices Into Rigorous Education. INSPIRE is a four-year
U.S. Department of Education (USDOE) Investing in Innovation (i3) development grant awarded to
Cabarrus County Schools (CCS) in 2014. CCS will use the award in combination with key community
partnerships to develop and validate an innovative integrated K-12 STEM pipeline approach focused on
STEM course content and instructional redesign. The approach will be implemented in four schools with
a target population of approximately 3,500 students and 97 educators. A comprehensive program
evaluation, including three impact studies, will examine the effects of INSPIRE on STEM education,
student engagement, and student achievement.
What is program evaluation?
Program evaluation is a systematic inquiry that applies defensible criteria to establish the merit of a
program. The USDOE requires CCS to assess the effectiveness of its i3-supported practices through a
rigorous and independent program evaluation guided by approved measurable project objectives (p.3).
Evaluation findings will equip CCS program personnel to make data-informed decisions about the
implementation of their program.
Who is conducting the evaluation?
The Evaluation Group (TEG) will conduct a utilization-focused evaluation of the INSPIRE program. TEG is
an independent evaluation firm with demonstrated experience in planning, implementing, and
evaluating programs in education and other human services fields throughout the southeastern USA.
Adhering to the Guiding Principles of the American Evaluation Association, TEG works in close
collaboration with district staff to provide objective feedback in support of delivering a program of the
highest quality.
Who will be involved in the INSPIRE evaluation?
TEG welcomes and relies on a variety of stakeholders to develop and execute a successful program
evaluation. TEG works with program stakeholders to collaboratively plan evaluations, design
instruments, determine data collection protocols, and select reporting formats. Primary INSPIRE
stakeholders include groups of CCS individuals who will be directly impacted by the INSPIRE project:
INSPIRE Leadership Team, school-level administrators and teachers (within and outside of the INSPIRE
target schools), district-level administrators, K-12 students (within and outside of the INSPIRE target
schools), parents and legal guardians of CCS students, INSPIRE project partners, and Cabarrus
community members and businesses. Secondary INSPIRE stakeholders include groups or organizations
not directly involved in the INSPIRE project, but who stand to benefit from the initiative. Examples of
secondary stakeholders include: Institutions of Higher Education, STEM employers, and the broader
education field.

PURPOSE OF THIS DOCUMENT


Toward a comprehensive evaluation study, this document builds upon and elaborates the evaluation
plan described on pages 20-25 of the grant narrative and is intended to guide the implementation of
the INSPIRE program evaluation. This document has five main sections along with appendices.
Sections:
1. INSPIRE Project Description
2. INSPIRE Formative Evaluation
3. INSPIRE Summative Evaluation
4. INSPIRE Evaluation Calendar
5. INSPIRE Evaluation Deliverables
Appendices:
A. INSPIRE Data Collection Tools & Systems: Towards Timely and Accurate Reporting (Fall 2015)
B. TEG Third-Party Data Collection and Security Memorandum
C. NC Assessment used for the INSPIRE Impact Study by Level and School Year (SY)


PROJECT DESCRIPTION
INSPIRE is an innovative integrated K-12 STEM pipeline approach focused on STEM course content and
instructional redesign.
K-12 STEM PIPELINE
INSPIRE is unique in that it provides early, continuous engagement starting in kindergarten by
automatically placing low-income, minority students in elementary STEM magnet schools located in their
neighborhoods. INSPIRE elementary students who choose to remain in the pipeline enroll in integrated
STEM magnet programs at both the middle and high school levels. CCS students can also enter the
INSPIRE pipeline in grade 6 or 9 by applying for open slots (n = 66) not filled by continuing students.
The INSPIRE target sample will also grow annually with incoming cohorts of elementary students (n = 228).
In Fall 2014, a STEM class was added at JN Fries Middle School (n = 30) as the school transitions to
a full STEM magnet school. During the four-year i3 grant cycle, INSPIRE will be implemented in four
schools with a target population of approximately 3,500 students and 97 STEM educators (Table 1).

STEM COURSE CONTENT AND INSTRUCTIONAL REDESIGN


The INSPIRE project builds on CCS's prior achievements with a K-12 pipeline implementation and
experience with promising STEM practices. It was designed to advance the CCS model from a promising
strong theory to an evidence-based practice that can serve as a national model for STEM course content
and instructional redesign. Four key components of the project are depicted in the INSPIRE logic model
below. The theory underlying INSPIRE is that training teachers to implement STEM-based problem-based
learning (PBL), supported by personalized, tech-enabled instruction and connected to real-world
tethers, will significantly increase student interest and engagement in STEM, leading to higher
achievement in math and science. Table 2 and Table 3 below illustrate pre-award and planned post-award
implementation and Year 2 PBL lesson development goals. With the aim that teachers across
subject and grade levels will collaborate to create interdisciplinary PBL lessons (INSPIRE Performance
Measure 1.1), INSPIRE emphasizes lesson quality over the quantity of lessons produced.

Table 1. INSPIRE Target Population by Grant Year

School (STEM Teachers Served, Y1-Y4) | Initial Cohort, Spring 2014 | Incoming Cohort, Each Fall | Graduating Cohort, Each Spring
Coltrane Webb Elementary (23)        | 485 a | 88                                        | 104
Patriots Elementary (40)             | 893 a | 140                                       | 150
JN Fries Middle School (14)          | 297 a | Fall 2014: 98 b + 37 a + 30 c; then 135   | 108
Central Cabarrus High School (15)    | 243 a | Fall 2014: 71 b + 29 a; then 100          | 75 (Spring 2016)

Total STEM teachers served: 92
Incoming total by project year: Y1 = 1,918; Y2 = 493; Y3 = 463; Y4 = 463
Cumulative total by project year: Y1 = 1,918; Y2 = 2,411; Y3 = 2,874; Y4 = 3,337

a New cohorts or newly accepted applicants who will participate in the INSPIRE Impact Study.
b Continuing STEM pipeline students from a STEM feeder school.
c New class added at the middle school.

INSPIRE Logic Model

CONSTRUCTS AND INDICATORS

Construct 1: HIGH-QUALITY TEACHER PROFESSIONAL DEVELOPMENT AND SUPPORT
- # of INSPIRE trainings and institutes offered
- #, % of STEM teachers completing 10 hours of STEM PD per year
- #, % of teachers receiving STEM coaching
- #, % of teachers who meet with STEM Professional Learning Communities (PLCs) 4 times per month

Construct 2: STEM PROBLEM-BASED AND PROJECT-BASED LEARNING (PBL) INSTRUCTION
- # of PBL lessons developed
- #, % of PBL lessons that are interdisciplinary
- #, % of PBL lessons meeting five criteria for high quality
- #, % of teachers implementing PBL lessons

Construct 3: PERSONALIZED AND TECH-ENABLED INSTRUCTION
- #, % of PBL lessons incorporating technology
- #, % of STEM students reporting weekly use of technology with adaptive content

Construct 4: STUDENT REAL-WORLD STEM TETHERS
- #, % of STEM students participating in school-sponsored STEM events
- #, % of STEM students who participate in NASA Camps

SHORT-TERM OUTCOMES

INCREASED STEM SUPPORT FOR TEACHERS; IMPROVED TEACHER SKILLS: INSPIRE supplies high-quality
PBL-related trainings and STEM coaches to provide ongoing STEM PBL instructional support and sustain
innovative STEM course content and instructional practices.

INCREASED STEM PBL CLASSROOM INSTRUCTION: INSPIRE teachers develop rigorous lessons aligned with the
NC Common Core Standards and NC Essential Standards to support science, technology, engineering, and
mathematics (STEM) course content connected across all subjects.

INCREASED PERSONALIZED INSTRUCTION: INSPIRE teachers design STEM lessons and instructional practices
that connect PBL course content to tech-enabled personalized learning strategies through digital
content integration.

INCREASED USE OF REAL-WORLD STEM TETHERS: INSPIRE provides resources for teachers to integrate
real-world student tethers with STEM course content and instructional practices through
community-based STEM events.

INTERMEDIATE OUTCOMES

ENHANCED STUDENT INTEREST AND ENGAGEMENT IN STEM
- % of STEM students reporting favorable opinions about PBL instruction
- % of STEM students reporting favorable opinions about technology use
- % of STEM students reporting favorable opinions about STEM
- % of students retained in the STEM pipeline per year
- % of students aspiring to a STEM-related career

LONG-TERM IMPACT

IMPROVED STUDENT ACHIEVEMENT: Significant improvement in standardized achievement scores in math and
science, as evidenced in the QED (with elementary students) and RCT (with middle and high school
students) impact evaluations.

Key Evaluation Questions: To what extent have strategies been implemented with fidelity? Does INSPIRE
promote student engagement in STEM and improve student mathematics and science achievement?

Metrics and Measures: PD attendance sheets; STEM Attribute Implementation Rubric; NC Standardized
State Assessments; Discovery Ed Mathematics; INSPIRE Fidelity Index; key informant interviews;
educator surveys; educator focus groups; student surveys; student focus groups.

Table 2. INSPIRE Pre-award and Post-award Project Implementation

Key Strategy: Problem-based learning (PBL) lessons
Pre-award implementation:
- PBL lessons implemented in grades K-8.
- Isolated and inconsistent implementation of PBL lessons within grade levels.
- Limited interdisciplinary focus.
- Few opportunities for vertical alignment across grade levels.
- No systematic development, review, or documentation processes for PBL lessons.
Post-award implementation:
- PBL lessons implemented in grades K-12.
- Focus for PBL lessons on high quality, interdisciplinarity, and horizontal and vertical alignment.
- Development and documentation of a systematic PBL lesson design and review process.
- Creation of a database of lessons aligned to the Common Core that can be shared nationally and
  internationally.

Key Strategy: Tech-enabled personalized learning strategies
Pre-award implementation:
- Technology used to varying degrees by teachers at all grade levels to supplement instruction.
- Technology not systematically integrated into PBL lessons.
- Multiple devices, platforms, and software programs.
- Use of tablets in a 1:1 ratio only in grades 9-12.
- Limited integration of blended learning and use of data to personalize instruction.
- Use of technology to support learning beyond classroom instruction limited to grades 9-12.
Post-award implementation:
- Systematic connection of technology to PBL lessons in grades K-12.
- Use of Chromebooks and tablets in a 1:1 ratio to deliver personalized instruction with students in
  grades K-12.
- Instruction designed to be student-driven, competency-based, and hands-on.
- Face-to-face teaching time; real-time feedback on student needs.
- Teacher and student access to meaningful data.
- Technology to support the learning process beyond the classroom.

Key Strategy: Teacher development and support
Pre-award implementation:
- STEM-focused professional development offered during the academic year.
- Specialized, yet limited, PD opportunities (such as the 5-day STEMersion).
- Limited capacity of PLCs to offer in-class coaching and ongoing support for instruction,
  development of PBL lessons and assessments, and progress toward vertical and horizontal alignment
  of units.
Post-award implementation:
- Sustained year-round development and support.
- Initial transition support to PBL.
- Collaboration through weekly PLC meetings.
- Ongoing support structures (STEM coordinator and coaches provide lesson guidance, coordinate the
  K-12 pipeline, in-class coaching, and aligned formative assessments).
- Additional capacity for professional development (school in-service, two innovation showcases per
  year, an annual 5-day STEMersion, partnerships with STEM businesses).

Key Strategy: Real-world student tethers
Pre-award implementation:
- Limited and inconsistent use of tethers to connect STEM content to instructional practices.
- All schools have and offer STEM-focused extracurricular clubs.
- Schools sponsor students to attend STEM student competitions and annual field trips.
- Authentic assessment through senior capstone projects.
Post-award implementation:
- Using a community extension approach, real-world tethers are intentionally and systematically
  connected to STEM content and instructional practices.
- Increase in the number of STEM-related high school clubs.
- Additional funding to increase school-sponsored student involvement in STEM competitions.
- New quarterly STEM events (guest speakers) and a NASA summer camp offered by schools.
- Tethers incorporated into PBL lessons as a result of STEMersion community partnerships, STEM coach
  support, annual innovation showcases, and intentional curriculum design and planning using PLCs.

Table 3. INSPIRE PBL Lesson Development Performance and Goals by Project Year
Cells show the number of lessons (number and percent interdisciplinary 2).

Grade Level | PY1: Fall 2014 Actual | PY1: Spring 2015 Actual | PY1: Total    | PY2: Fall 2015 Projection 1 | PY2: Spring 2016 Projection | PY3
Elementary  | 49 (36 or 73%)        | 64 (57 or 89%)          | 113 (85 or 75%) | 49 (28 or 57%)            | 64 (57 or 89%)              | 7 per month
Middle      | 29 (16 or 55%)        | 30 (22 or 73%)          | 59 (38 or 64%)  | 29 (17 or 59%)            | 30 (22 or 73%)              | 3-4 per month
High        | 42 (19 or 45%)        | 47 (32 or 68%)          | 89 (51 or 57%)  | 42 (22 or 52%)            | 47 (32 or 68%)              | 5 per month
Total       | 120 (62 or 52%)       | 141 (111 or 79%)        | 261 (173 or 66%) | 120 (75 or 62%)          | 141 (111 or 79%)            | 9-10 per month

1 The projections for PBL lesson development are based on the actual lessons developed in the prior
year.
2 INSPIRE Performance Measure 1.1 specifies that the number of newly developed interdisciplinary PBL
lessons at each level will increase by at least 5 percentage points in Years 1-2 and 10 percentage
points in Years 3-4, or until no fewer than 75% of all developed PBL units are interdisciplinary.

FORMATIVE EVALUATION
Introduction
The INSPIRE formative evaluation is designed to equip CCS program personnel to make data-informed
decisions about program implementation. This section of the document presents the six major
evaluation questions that will guide the study. Four of the six major evaluation questions correspond to
the four INSPIRE core strategies. Subsidiary questions listed under each major evaluation question
specify the scope and focus of the evaluation activities and target samples, along with data
collection timelines.¹
1) How do INSPIRE school teams develop rigorous problem-based learning (PBL) curriculum
units that impact STEM education, student engagement, and student achievement in
Cabarrus County Schools (CCS)?
2) How do INSPIRE educators combine PBL, digital course content, and tech-enabled
personalized learning strategies to impact STEM education, student engagement, and
student achievement in CCS?
3) How does the INSPIRE approach to teacher development and support impact STEM
education in CCS?
4) How are real-world tethers being integrated into PBL instructional practices to impact STEM
education, student engagement, and student achievement in CCS?
5) How does INSPIRE impact underrepresented students' STEM engagement and achievement?
6) To what extent is INSPIRE implemented annually with fidelity?
Each question is addressed separately on the following pages.
Evaluation Strategies
The evaluation strategies make use of a combination of quantitative and qualitative data sources
(Appendix A). TEG will partner with the INSPIRE Lead Team to customize data collection instruments
and schedules.

¹ Data collection timelines represent recurring monthly or annual events. Within each subsidiary
question, superscripts (i.e., a-e) denote relations between the evaluation/data collection strategy
and the data collection timeline.

Major Evaluation Question 1: How do INSPIRE school teams develop rigorous PBL curriculum units that
impact STEM education, student engagement, and student achievement in Cabarrus County Schools?

Engaged personnel key: PD: Project Director; SC: STEM Coaches; EV: Evaluator.

What strategies emerge as best practices for designing PBL lessons aimed at connecting STEM course
content across subjects and levels?

1.1. What characteristics or elements define INSPIRE PBL?
- Strategy: INSPIRE Year-End Teacher Survey
- Personnel: SC, PD, EV (Survey: All Teachers)
- Timeline: June

1.2. What percentage of PBL lessons meets at least five of the six criteria for PBL quality?
(PM 1.1 and 1.2)
- Strategies: PBL Assessment of Quality Rubric; STEM Coach Log & Summary
- Personnel: SC, EV
- Timeline: Monthly (Sept-June)

1.3. How does the process for developing, aligning, assessing, testing, and refining PBL lessons
differ by team across schools and levels?
- Strategy: STEM Lead Team Focus Group (held during a Lead Team Meeting)
- Personnel: PD, SC, EV
- Timeline: March

1.4. What barriers or challenges do grade-level and school-level teams experience when working
toward horizontal and vertical alignment of the curriculum?
- Strategy: STEM Lead Team Focus Group (held during a Lead Team Meeting)
- Personnel: PD, SC, EV
- Timeline: March

1.5. Is the PBL development process implemented with fidelity? (PM 1.3)
- Strategies: NC STEM Attribute Application/Report a; INSPIRE Fidelity Index b
- Personnel: PD, EV
- Timeline: January a; July b

How do students' STEM knowledge, attitudes, beliefs, and behaviors change as they persist through
the INSPIRE pipeline? Do trends differ by subgroup?

1.6. How does students' affinity for PBL change as they persist through the pipeline, and what
aspects of the PBL process influence students' reported beliefs?
- Strategies: STEM Attitudes & Behaviors Survey (incoming students only) a; INSPIRE Year-End Student
  Survey b; STEM Persistence Focus Groups b
- Personnel: PD, SC, EV (Survey: All Students; FG: TBD)
- Timeline: October a; June b

1.7. How, if at all, does PBL contribute to the retention of students in the pipeline?
- Strategies: INSPIRE Year-End Student Survey; STEM Persistence Focus Groups
- Personnel: PD, SC, EV (Survey: All Students; FG: TBD)
- Timeline: June

Evaluation Question 2: How do INSPIRE educators combine PBL, digital course content, and
tech-enabled personalized learning strategies to impact STEM education, student engagement, and
student achievement in Cabarrus County Schools?

Engaged personnel key: PD: Project Director; SC: STEM Coaches; EV: Evaluator.

Within PBL curriculum units, how do educators use digital devices and adaptive digital course
content to personalize learning?

2.1. How do teachers and students report using digital devices and adaptive digital course content
(to support learning inside and outside of the classroom)? (PM 2.1 and 2.2)
- Strategies: INSPIRE Year-End Teacher Survey; INSPIRE Year-End Student Surveys
- Personnel: PD, SC, EV (Survey: All Teachers and Students Gr 4-12)
- Timeline: June

2.2. How, if at all, do digital devices and adaptive digital course content impact the rigor and
engagement of PBL lessons as perceived by teachers and students?
- Strategies: INSPIRE Year-End Teacher Survey; INSPIRE Year-End Student Surveys
- Personnel: PD, SC, EV (Survey: All Teachers and Students Gr 4-12)
- Timeline: June

2.3. How, if at all, does the adaptive digital course content used by educators align with common
core standards and the school curriculum?
- Strategy: INSPIRE Year-End Teacher Survey
- Personnel: PD, SC, EV (Survey: All Teachers)
- Timeline: June

2.4. What successes and challenges do educators face when using digital devices and adaptive
digital course content in PBL curriculum units to personalize learning for their students?
- Strategy: INSPIRE Year-End Teacher Survey
- Personnel: PD, SC, EV (Survey: All Teachers)
- Timeline: June

How do digital devices and adaptive digital course content impact students' STEM knowledge,
attitudes, beliefs, and behaviors?

2.5. Do digital devices increase student engagement (inside and outside of the classroom)?
- Strategies: INSPIRE Year-End Teacher Survey; INSPIRE Year-End Student Surveys
- Personnel: PD, SC, EV (Survey: All Teachers and Students Gr 4-12)
- Timeline: June

2.6. Does adaptive digital course content impact students' perceptions of ownership in the learning
process and support self-regulated behaviors?
- Strategies: INSPIRE Year-End Teacher Survey; INSPIRE Year-End Student Surveys
- Personnel: PD, SC, EV (Survey: All Students and Teachers)
- Timeline: June

Evaluation Question 3: How does the INSPIRE approach to teacher development and support impact STEM
education in Cabarrus County Schools?

Engaged personnel key: PD: Project Director; SC: STEM Coaches; EV: Evaluator.

To what degree do teachers utilize and find value in the resources offered as part of the INSPIRE
approach to teacher development and support?

3.1. How often do STEM teachers consult with STEM coaches? (PM 3.2)
- Strategies: STEM Coach Log (meetings/month) a; INSPIRE Year-End Teacher Survey b
- Personnel: SC, EV
- Timeline: Monthly a; June b

3.2. How often do teachers meet with their PLC? (PM 3.4)
- Strategy: PLC Attendance Logs (meetings/month)
- Personnel: SC, EV
- Timeline: Weekly

3.3. How many teachers participate in STEMersion? (PM 3.3)
- Strategy: Event Attendance Logs (hours required)
- Personnel: PD
- Timeline: September

3.4. How many teachers receive NASA summer camp training (relevant for a subsample)?
- Strategy: Event Attendance Logs (hours required)
- Personnel: PD
- Timeline: September

3.5. What other personnel, resources, or professional development training do STEM teachers engage?
(PM 3.3)
- Strategies: Event Attendance Logs (hours required); INSPIRE Professional Development Tool
- Personnel: PD
- Timeline: Ongoing

3.6. What professional development and support strategies do educators perceive to be effective at
increasing teacher efficacy and instructional practice?
- Strategy: INSPIRE Year-End Teacher Survey
- Personnel: PD, SC, EV
- Timeline: June

3.7. How, if at all, does the INSPIRE approach to teacher development and support transform the
teacher role from a transmitter to a facilitator of knowledge?
- Strategies: Walk-through with PBL Framework Rubric; STEM Coach Log
- Personnel: SC
- Timeline: Sept (baseline); Nov (mid-year); Apr (year-end)

3.8. Do teachers demonstrate an annual increase in proficiency on a personalized PBL-related
instructional practice goal? (PM 3.1)
- Strategies: Walk-through with PBL Framework Rubric; STEM Coach Log
- Personnel: SC
- Timeline: Sept (baseline); Nov (mid-year); Apr (year-end)

Question 4: How are real-world tethers being integrated into PBL instructional practices to impact
STEM education, student engagement, and student achievement in Cabarrus County Schools?

Engaged personnel key: PD: Project Director; SC: STEM Coaches; EV: Evaluator.

4.1. How and to what extent do educators integrate real-world tethers into PBL curriculum?
- Strategies: PBL Assessment of Quality Rubric; STEM Coach Log & Summary
- Personnel: SC, EV
- Timeline: Ongoing

4.2. What opportunities are provided for teachers to build a larger community of practice around
innovative STEM instruction? (PM 4.3)
- Strategies: INSPIRE Professional Development Tool a; STEM Lead Team Focus Group (held during a
  Lead Team Meeting) b; INSPIRE Year-End Teacher Survey c
- Personnel: PD, EV (Survey: All Teachers)
- Timeline: Ongoing a; March b; June c

4.3. How, if at all, do partnerships between classroom educators and STEM professionals contribute
to teachers' science efficacy and classroom instructional practice?
- Strategy: INSPIRE Year-End Teacher Survey
- Personnel: SC, PD, EV (Survey: All Teachers)
- Timeline: June

4.4. How do real-world tethers impact students' STEM attitudes, engagement outside of the classroom
(PM 4.4), aspirations (PM 4.1), and pipeline persistence (PM 4.2)?
- Strategies: INSPIRE Year-End Teacher Survey; INSPIRE Year-End Student Surveys; STEM Persistence
  Focus Groups
- Personnel: PD, SC, EV (Survey: All Students and Teachers; FG: TBD)
- Timeline: June

Question 5: How does INSPIRE impact underrepresented students' STEM engagement and achievement?

Engaged personnel key: PD: Project Director; SC: STEM Coaches; EV: Evaluator; DA: Data Analyst.

5.1. What factors influence traditionally underrepresented INSPIRE students' K-12 STEM pipeline
trajectories leading to retention or departure?
- Strategies: INSPIRE Year-End Student Survey; STEM Focus Groups
- Personnel: PD, SC, EV (Survey: Students Gr 4-12; FG: TBD)
- Timeline: June

5.2. Do INSPIRE students' K-12 STEM pipeline trajectories differ by race/ethnicity, gender, or
socioeconomic status?
- Strategy: Administrative Data (Enrollment Data; NC EOG Math, Grades 3 and 5)
- Personnel: PD, DA, EV
- Timeline: Oct-Jan

5.3. Does the percentage of low-income and minority INSPIRE students who meet the academic criteria
for advancing through the pipeline increase annually?
- Strategy: Administrative Data (Enrollment Data; NC EOG Math, Grades 3 and 5)
- Personnel: PD, DA, EV
- Timeline: Oct-Jan

5.4. What strategies are being employed to retain students in the STEM pipeline, specifically
academically struggling and underrepresented groups?
- Strategy: STEM Lead Team Focus Group (held during a Lead Team Meeting)
- Personnel: PD, SC, EV
- Timeline: March

Question 6: To what extent is INSPIRE implemented annually with fidelity?


The INSPIRE project will be evaluated, in part, on fidelity of implementation. Implementation fidelity is
the extent to which actual project implementation aligns with proposed project implementation. This
proposed implementation plan was described in the grant application and further specified in the i3
Management Plan submitted to the US DOE in March 2014. The INSPIRE Annual Fidelity Index (refer to
the INSPIRE NEi3 Fidelity Tool) will be used to monitor and measure implementation fidelity.
The INSPIRE Annual Fidelity Index has four constructs that align with the four INSPIRE core strategies
presented in the INSPIRE logic model:
1) High-Quality Teacher Professional Development and Support (4 components)
2) Problem-Based Learning (PBL) Instruction (4 components)
3) Personalized and Tech-Enabled Instruction (2 components)
4) Student Real-World STEM Tethers (2 components)
Fidelity scores are based on indicators of: 1) Reach, or the extent to which individuals
participate; 2) Dosage, or how much of the component was delivered; 3) Quality, or how well each
component was delivered; and 4) Responsiveness, or the extent to which participants are engaged
(Durlak & DuPre, 2008; Mowbray et al., 2003). The number of components within each construct ranges
from two to four. A fidelity score will be determined for each component and for each construct.
Thresholds (or targets) are established for each indicator based upon baseline data and the project
director's recommendations and project goals. In subsequent years, thresholds will be reevaluated with
the expectation that they increase incrementally as the program matures. These short-term benchmarks
will assess progress toward long-term program goals and inform replication efforts.
TEG can school fidelity for select elements or components of elements. TEG will include results as part
of the annual evaluation report and as part of the national evaluation reporting requirements. As part
of the formative reporting process, TEG will submit fidelity of implementation data on an ongoing basis
(quarterly or mid-year) as planned and as requested by the project director.


SUMMATIVE EVALUATION
The summative evaluation, to be concluded in Fall 2017, is designed to answer the confirmatory
question: "What is the impact of the INSPIRE program on student achievement?"
Description of all impact studies being conducted. The impact studies will address two primary
questions that examine the effect of INSPIRE on: 1) 2nd-grade math achievement for INSPIRE-zoned
elementary school students after three program years, and 2) math achievement for secondary school
magnet students after three program years. Accordingly, we propose two impact studies in which
selection bias is controlled by design, either through matching (Study 1) or through randomization of
interested magnet students to treatment and control groups (Study 2). The studies use recognized
assessments with proven reliability and validity, including the NC End-of-Grade (EOG) and
End-of-Course (EOC) tests, Discovery Education's (DE) Common Core Math Assessment, and the ACT
college readiness assessment. Appendix C details the standardized assessments to be used by grade level.
Impact Study 1 uses a three-year longitudinal single-cohort quasi-experimental design (QED) to assess
the impact of INSPIRE on math achievement at the end of 2nd grade. The study sample consists of a
single cohort of all kindergarten students (N = 228) entering two STEM elementary schools,
Coltrane-Webb Elementary and Patriots Elementary. The two target schools hold Title I status and serve
the district's most ethnically diverse students (Coltrane-Webb: 53% FRPL, 55% minority; Patriots: 34%
FRPL, 28% minority). Low-income minority students are zoned for each school, making their enrollment
compulsory and largely due to circumstance rather than choice. We will compare the outcomes of
INSPIRE-zoned students with similar students from four comparison elementary schools not offering a
STEM program (R.B. McAlister; Beverly Hills; Pitt School Road; Charles Boger). Propensity-score
adjustment (PSA) will be used to minimize selection bias and ensure that INSPIRE-zoned students and
control students are equated on key background and demographic variables, including a baseline
pretest measure of math ability in kindergarten (DE Common Core Math Assessment), age, ethnicity,
gender, and free/reduced lunch status. In a propensity-score adjustment, each student's probability of
receiving the treatment is estimated from observed covariates, and those estimated probabilities are
then used (for example, as matching, stratification, or weighting variables in the outcome
regression) to account for confounding when estimating the treatment effect.
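To make the adjustment concrete, here is a minimal, self-contained sketch of the propensity-score
logic using inverse-probability weighting on invented toy data. The student records, the single
`low_income` covariate, and the stratum-level propensity estimates are illustrative assumptions only;
the actual study will fit propensity scores (e.g., via logistic regression) from the full set of
baseline covariates listed above.

```python
# Toy illustration of propensity-score weighting (not the study's model):
# treatment is more common among low-income students, so a raw mean
# difference would be confounded; weighting by the inverse propensity
# rebalances the groups on the covariate.

# Each record: (low_income, treated, math_score) -- hypothetical values.
students = [
    (1, 1, 70), (1, 1, 72), (1, 0, 65),              # low-income stratum
    (0, 1, 80), (0, 0, 78), (0, 0, 82), (0, 0, 76),  # other stratum
]

def propensity(stratum):
    """P(treated | covariate), estimated as the stratum's treatment rate."""
    group = [s for s in students if s[0] == stratum]
    return sum(s[1] for s in group) / len(group)

def ipw_effect():
    """Difference in inverse-probability-weighted means (treated - control)."""
    num_t = den_t = num_c = den_c = 0.0
    for cov, treated, y in students:
        p = propensity(cov)
        if treated:
            w = 1 / p
            num_t += w * y; den_t += w
        else:
            w = 1 / (1 - p)
            num_c += w * y; den_c += w
    return num_t / den_t - num_c / den_c

print(round(ipw_effect(), 2))  # → 3.33
```

The same weighted comparison can be embedded in an outcome regression with the pretest and
demographic covariates, which is the "adjusted regression" sense of PSA used in this study.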
Impact Study 2 uses a two-cohort, individual-level, longitudinal randomized controlled trial (RCT) to
assess the effects of INSPIRE on secondary students' math and science achievement after two years of
treatment. Use of an RCT design effectively minimizes selection bias and ensures that the program and
control groups are equivalent at baseline in terms of background, demographic, and pre-program factors
such as motivation. In school years 2014/15 and 2015/16, a carefully orchestrated lottery process will
be used to randomly assign eligible 6th-grade and 9th-grade applicants to open INSPIRE slots at J.N.
Fries Middle School and Central Cabarrus High School (CCHS), respectively, or to a non-INSPIRE control
group in the middle school or high school for which the student is zoned (annual estimates:
INSPIRE = 66, control = 99, total N = 165). The target schools serve a significant number of minority
and low-income students (Fries: 25% FRPL, 35% minority; CCHS: 50% FRPL, 42% minority). Between-group
comparisons will be made on state standardized test scores in math and science. The study will use
Hierarchical Linear Modeling (HLM) and include individual and contextual covariates.
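The randomization step of such a lottery is straightforward to implement and audit. The sketch below
is a hypothetical illustration only (the helper name, fixed seed, and applicant IDs are invented, and
the district's actual lottery procedure may differ): eligible applicants are shuffled with a recorded
seed, the first 66 fill INSPIRE slots, and the remaining 99 form the zoned-school control group.

```python
# Hypothetical sketch of a slot-lottery randomization (illustrative only).
import random

def run_lottery(applicants, slots, seed=2015):
    """Randomly assign applicants: first `slots` entries of a seeded
    shuffle become the treatment group, the rest the control group."""
    rng = random.Random(seed)   # fixed, recorded seed makes the draw auditable
    pool = list(applicants)
    rng.shuffle(pool)
    return pool[:slots], pool[slots:]

# Invented applicant IDs matching the annual estimate (total N = 165).
applicants = [f"student_{i:03d}" for i in range(165)]
inspire, control = run_lottery(applicants, slots=66)
print(len(inspire), len(control))  # → 66 99
```

Because the seed is fixed and recorded, the assignment can be re-run to verify that no applicant was
moved between groups after the draw.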


INSPIRE EVALUATION CALENDAR: AUGUST 2015 - JUNE 2016

September 2015
- Evaluation Planning Meeting: YR1 Evaluation Data and YR2 Data Collection (held during a Lead Team
  Meeting)
- Data collection: Collection of STEM Student and Parent Consent Forms (incoming students only)
- Data collection: Initial PBL Walk-Through/Coaching Session, Student Rosters, and Teacher Rosters

October 2015
- Data collection: STEM Attitudes & Behaviors Survey (incoming students only)

November/December 2015
- Data collection: Mid-Year PBL Walk-Through/Coaching Session

January 2016
- Data collection: Annual Performance Report Data Collection & Verification
- Evaluation Planning Meeting: Mid-Year Data Collection Check-In (held during a Lead Team Meeting)

March 2016
- Data collection: STEM Lead Team Focus Group (held during a Lead Team Meeting)

April 2016
- Data collection: Year-End PBL Walk-Through/Coaching Session
- Instrument development: INSPIRE Survey and Focus Group Protocol

June 2016
- Data collection: INSPIRE Year-End Teacher Survey, INSPIRE Year-End Student Survey, and STEM
  Persistence Focus Groups (faculty, students, and coaches)
- Data collection: Verification and Closeout of all Ongoing Data Collection

Monthly Collection (August 2015 - June 2016)
- Data collection: PBL Framework & Assessment of Quality, STEM Coach Log & Summary, STEM Event
  Attendance & Survey, and Walk-Through with PBL Framework Rubric
- Data collection: INSPIRE Professional Development Tool and PLC Attendance Logs (ongoing)

INSPIRE EVALUATION DELIVERABLES


The evaluation deliverables listed in Tables 6 and 7 are subject to change based on the pace of
project implementation and at the request of the Project Director.

#   Deliverable Name | Scheduled Delivery
1   INSPIRE Year 2 Evaluation Plan | September 2015
2   INSPIRE STEM Events Snapshot | September 2015, November 2015, March 2016, June 2016
3   INSPIRE Retention Study Report | October 2015
4   INSPIRE NASA Camp Infographic & Data Report | October 2015
5   INSPIRE STEMersion Infographic | October 2015
6   INSPIRE Professional Development Snapshots | October 2015 - June 2016
7   STEM Attitudes & Behaviors Survey Real-time Data Reports | November 2015
8   INSPIRE Professional Learning Community (PLC) Snapshots | November 2015
9   INSPIRE Annual Performance Report: Year 2 | June 2016
10  INSPIRE Performance Measure Update: Year 2 | March 2016
11  INSPIRE Year-End Student Survey Real-time Data Reports | July 2016
12  INSPIRE Year-End Teacher Survey Real-time Data Reports | July 2016
13  INSPIRE Year 2 Evaluation Report (SY 2015-2016) | August 2016

Request for Additional Evaluation Services Disclaimer


TEG takes great pride in its work and strives to provide partners with high-caliber services and deliverables.
Through a highly collaborative process, we customize our evaluation plans to meet the unique needs of our
partners. Alterations to these plans may create challenges that affect the quality of our work; for that
reason, careful consideration is given to each request for additional services and deliverables. Evaluation
services and deliverables not identified in the evaluation plan, such as ad hoc reports, foundation
performance reports, and additional analyses, may be requested by the Project Director, in writing, and
submitted to the evaluator for review. TEG will consider additional requests through a formal review
process and will determine whether the additional services can be provided. We will do our best to
accommodate requests but may decline a request if it would negatively affect the timeliness or quality of
our service delivery. If TEG determines that the additional services and/or deliverables are feasible, TEG
will work with the Project Director to define the additional evaluation services needed and establish a fee
that both parties agree to before the services are delivered. An invoice for these services will be issued,
and the partner agrees to pay it before TEG begins work.


APPENDIX A
INSPIRE Data Collection Tools & Systems: Towards Timely and Accurate Reporting (Fall 2015)

PBL Professional Development

1. PBL Framework & Assessment of Quality Rubric
   Location/Access Point: Electronic MS Word file
   Maps to PM: 1.1, 1.2
   Description and Notes: The Framework serves as a common format for lesson planning; it is completed
   and self-assessed by teachers. Frameworks are collected and reviewed monthly by STEM Coaches.
   Ratings are reported to TEG for the U.S. DOE Annual Performance Report (APR) using the STEM Coach
   Log (3d). Of most relevance for evaluation reporting is consistent interpretation of the rubric.

2. STEM PDP Portal: Online Teacher and Coach Communication Log
   Maps to PM: CCS Internal Use Only
   Description and Notes: Used to facilitate communication between teachers and coaches. Teachers
   receive a personalized invitation and link to access the online tool. Coaches are provided with
   personalized links for each teacher via the STEM Coach Log (3d). Coaches can use the teacher's link to
   input feedback for the teacher. Afterwards, coaches may choose to establish a standard turnaround
   time (feedback uploaded within 2-3 business days after the coaching session) or follow up with an
   email to the teacher.

3. STEM Coach Log & Summary
   Components: a. Coaching Log; b. PBL Lesson Implementation; c. PBL Lesson Assessment; d. Teacher
   PBL Instruction Ratings; e. Coaching Detail Summary; f. PBL Lesson & Teaching Summary
   Location/Access Point: Google Doc for each school (shared with TEG and the Project Director)
   Maps to PM: 3.1, 3.2
   Description and Notes: New as of Year 2, this tool is designed to give coaches greater ownership of
   data. It combines the online STEM coaching log and the PBL Lesson Assessment spreadsheet and
   serves as a central data collection portal for site management data and for data required for federal
   reporting. A Google Doc is established for each school; the school's STEM Coach, the Project Director,
   and TEG have access to the tool. Coaches can submit requests to TEG to revise their teacher roster or
   tool as needed. Use the Coaching Log (3a) to record individualized teacher support and the PLC
   Meeting Log (4) to record whole-grade/whole-team consultations.

Meeting Attendance

4. PLC Meeting Log
   Location/Access Point: INSPIRE STEM resource page
   Maps to PM: 3.4
   Description and Notes: This online tool is used to track PLC meeting attendance. One session can be
   used by PLC leads to log attendance for all attendees. TEG will support coaches with data verification
   twice annually, in January and May.

5. INSPIRE PD Tracking Tool & Real-time Online Summary Report
   Location/Access Point: INSPIRE STEM resource page | Online Summary (password: inspire4ccs)
   Maps to PM: 3.3
   Description and Notes: This tool takes the place of CARE sheets for INSPIRE PD; refer to the
   instruction sheet for detailed use. TEG submits monthly attendance summaries to Bridget Jones that
   detail each school's progress, disaggregated by teacher. PD administrators should email a list of
   sessions to Bridget Jones. Events are entered into the system as requests are forwarded to TEG
   (please allow 24 hours for entry).

6. STEM Field Trip & School Event Survey
   Location/Access Point: INSPIRE STEM resource page
   Maps to PM: 4.3, 4.4
   Description and Notes: An online survey to be completed by event organizers (only one is required per
   event). It should be accompanied by student attendance records, which are maintained and submitted
   by STEM Coaches as events occur or on a quarterly basis.

INSPIRE Teacher & Student Surveys

7. INSPIRE Teacher Surveys
   Location/Access Point: Administered online by TEG
   Maps to PM: 2.2, 3.2, 4.3
   Description and Notes: The purpose of the survey is to collect formative feedback on best practices in
   Project/Problem-Based Learning (PBL) lesson development, instructional support, and professional
   development.

8. Student Online Surveys (Gr 4-12)
   Location/Access Point: URL submitted by TEG at the beginning of the data collection period
   Maps to PM: 4.1
   Description and Notes: Two versions are administered online to students who have parental consent.

9. Student Surveys (Gr K-3)
   Location/Access Point: Paper-based administration by STEM Coaches
   Maps to PM: CCS Formative Data
   Description and Notes: The guided protocol is customized for each elementary school. TEG scores and
   reports survey data. Mail completed surveys to: The Evaluation Group, 403 West Ponce de Leon
   Avenue, Suite 210, Decatur, Georgia 30030, C/O: Administrative Assistant.

INSPIRE STEM resource page: http://www.cabarrus.k12.nc.us/Page/33255. Ongoing technical support is provided by TEG.

APPENDIX B
TEG Third-Party Data Collection and Security Memorandum
The purpose of this memorandum is to provide you with information about the process by which we
protect data that is collected for the purpose of program evaluation conducted by The Evaluation
Group. We recognize that we often request sensitive data as a result of our services and have protocols
in place to protect data that has been collected. We take multiple steps to ensure that confidentiality is
maintained throughout the evaluation and request data in the least intrusive manner allowed by the
evaluation design.
General Use
1. Student and teacher-level data will only be used for purposes outlined in the evaluation
plan and for purposes of completing the Annual Performance Report (APR) or other reports
required by the funder;
2. On reports, student-level data will only be reported in aggregate to protect students. If the
n is less than 10, we will not report for that group;
3. Data will be stored in Qualtrics or Dropbox for up to one year and then archived for three
years before being deleted from storage.
Data Access and Security
1. Data stored in Qualtrics is password protected and encrypted. For more information, refer
to Qualtrics' published security documentation.
2. Downloaded copies of student data are stored on a password-protected Dropbox, which
provides 256-bit AES encryption and uses SSL/TLS secure tunnel for file transfers.
3. Only members of The Evaluation Group/Research Associates will have access to the data.

We take data security very seriously and ensure that the data we collect is protected to the best of our
ability. As required by the Family Educational Rights and Privacy Act (FERPA), we utilize reasonable
methods to ensure that data is secure and the confidentiality of respondents is protected. In the case
of a data breach, we have a dedicated IT Department and Data Architect who have the capability of
addressing these challenges immediately so that additional issues do not occur. We are happy to discuss
our data security protocol further, at any point throughout the evaluation. Please let us know if you
have any questions.

APPENDIX C
NC Assessments used for the INSPIRE Impact Study by Level and School Year (SY)

INSPIRE Impact Study Assessments by Grade

Elementary School (Coltrane-Webb & Patriots)
  K (SY 2014/15)
    Math: Discovery Education Core Math
    Science: N/A
  1 (SY 2015/16)
    Math: Dreambox Learning Assessment (August and year-end administration)
    Science: N/A
  2 (SY 2016/17)
    Math: Dreambox Learning Assessment (August and year-end administration)
    Science: N/A

Middle School (JN Fries)
  6 (SY 2014/15 & 2015/16)
    Math: NC EOG Math, 6th Grade; DE Benchmark, 6th Grade
    Science: NC State Final Exam, Science, 6th Grade
  7 (SY 2015/16 & 2016/17)
    Math: NC EOG Math, 7th Grade; DE Benchmark, 7th Grade
    Science: NC State Final Exam, Science, 7th Grade
  8 (SY 2016/17)
    Math: NC EOG Math, 8th Grade or NC EOC Math I; DE Benchmark, 8th Grade
    Science: NC EOG Science, 8th Grade; ACT-ASPIRE, Science

High School (Central Cabarrus)
  9 (SY 2014/15 & 2015/16)
    Math: NC EOC Math II
    Science: NC Final Exam (Earth and Environmental Science, Physical Science, Physics, or Chemistry)
  10 (SY 2015/16 & 2016/17)
    Math: NC EOC Math III
    Science: NC EOC Biology
  11 (SY 2016/17)
    Math: ACT, Math
    Science: ACT, Science

INSPIRE Impact Formative Assessments

Elementary School
  3 (Annual)
    Math: NC EOG Math, 3rd Grade
    Science: N/A
  5 (Annual)
    Math: NC EOG Math, 5th Grade
    Science: N/A

NOTES:
1. TEG proposes an annual October 1 - January 31 data collection schedule.
2. For all assessments, TEG requests raw scaled scores.
3. For all cases, the CCS Data Analyst (Brian Dos) will provide: stable student identifiers (tracking INSPIRE
student movement longitudinally across schools); demographic variables (age, ethnicity, gender, and
FRPL status); school affiliation; program flag (INSPIRE, INSPIRE-zoned, or Other); and program entry grade.