Submitted to:
Bridget Jones
INSPIRE i3 Grant Director
Cabarrus County Schools
By The Evaluation Group
August 2015
INTRODUCTION
INSPIRE stands for Infusing Innovative STEM Practices Into Rigorous Education. INSPIRE is a four-year
U.S. Department of Education (USDOE) Investing in Innovation (i3) development grant awarded to
Cabarrus County Schools (CCS) in 2014. CCS will use the award in combination with key community
partnerships to develop and validate an innovative integrated K-12 STEM pipeline approach focused on
STEM course content and instructional redesign. The approach will be implemented in four schools with
a target population of approximately 3,000 students and 97 educators. A comprehensive program
evaluation, including three impact studies, will examine the effects of INSPIRE on STEM education,
student engagement, and student achievement.
What is program evaluation?
Program evaluation is a systematic inquiry that applies defensible criteria to establish the merit of a
program. The USDOE requires CCS to assess the effectiveness of its i3-supported practices through a
rigorous and independent program evaluation guided by approved measurable project objectives (p.3).
Evaluation findings will equip CCS program personnel to make data-informed decisions about the
implementation of their program.
Who is conducting the evaluation?
The Evaluation Group (TEG) will conduct a utilization-focused evaluation of the INSPIRE program. TEG is
an independent evaluation firm with demonstrated experience in planning, implementing, and
evaluating programs in education and other human services fields throughout the southeastern USA.
Adhering to the Guiding Principles of the American Evaluation Association, TEG works in close
collaboration with district staff to provide objective feedback in support of delivering a program of the
highest quality.
Who will be involved in the INSPIRE evaluation?
TEG welcomes and relies on a variety of stakeholders to develop and execute a successful program
evaluation. TEG works with program stakeholders to collaboratively plan evaluations, design
instruments, determine data collection protocols, and select reporting formats. Primary INSPIRE
stakeholders include groups of CCS individuals who will be directly impacted by the INSPIRE project:
INSPIRE Leadership Team, school-level administrators and teachers (within and outside of the INSPIRE
target schools), district-level administrators, K-12 students (within and outside of the INSPIRE
schools), parents and legal guardians of CCS students, INSPIRE project partners, and Cabarrus
community members and businesses. Secondary INSPIRE stakeholders include groups or organizations
not directly involved in the INSPIRE project, but who stand to benefit from the initiative. Examples of
secondary stakeholders include: Institutions of Higher Education, STEM employers, and the broader
education field.
PROJECT DESCRIPTION
INSPIRE is an innovative integrated K-12 STEM pipeline approach focused on STEM course content and
instructional redesign.
K-12 STEM PIPELINE
INSPIRE is unique in that it provides early, continuous engagement starting in Kindergarten by automatically
placing low-income, minority students in elementary STEM magnet schools located in their
neighborhood. INSPIRE elementary students who choose to remain in the pipeline are enrolled in
integrated STEM magnet programs at both the middle and high school levels. CCS students can also enter
the INSPIRE pipeline in grades 6 or 9 by applying for open slots (n = 66) not filled by continuing students.
The INSPIRE target sample will also grow to include incoming cohorts of elementary students (n = 228)
annually. In Fall 2014, a STEM class will be added at JN Fries Middle (n = 30) as the school transitions to
a full STEM magnet school. During the four-year i3 grant cycle, INSPIRE will be implemented in four
schools with a target population of approximately 3,500 students and 97 STEM educators (Table 1).
Table 1. INSPIRE Target Population by School and Project Year (Y1-Y4)

School (STEM Teachers Served) | Y1 Incoming | Y2 Graduating / Incoming | Y3 Graduating / Incoming | Y4 Graduating / Incoming
Coltrane Webb Elementary (23) | 485 a | 104 / 88 | 104 / 88 | 104 / 88
Patriots Elementary (40) | 893 a | 150 / 140 | 150 / 140 | 150 / 140
JN Fries Middle School (14) | 297 a | 108 / 98 b + 37 a + 30 c | 108 / 135 | 108 / 135
Central Cabarrus High School (15) | 243 a | n/a / 71 b + 29 a | n/a / 100 | 75 / 100
Incoming Total (92) | 1,918 | 493 | 463 | 463
Cumulative Total | 1,918 | 2,411 | 2,874 | 3,337

a New cohorts or newly accepted applicants who will participate in the INSPIRE Impact Study.
b Continuing STEM pipeline students from a STEM feeder school.
c New class added at the middle school.
SHORT-TERM AND INTERMEDIATE OUTCOMES: ENHANCED STUDENT INTEREST AND ENGAGEMENT IN STEM
% of STEM students reporting favorable opinions about PBL instruction
% of STEM students reporting favorable opinions about technology use
% of STEM students reporting favorable opinions about STEM
% of students retained in the STEM pipeline per year
% of students aspiring to a STEM-related career

LONG-TERM IMPACT: IMPROVED STUDENT ACHIEVEMENT
Significant improvement in standardized achievement scores in Math and Science, as evidenced in QED (with elementary students) and RCT (with MS and HS students) impact evaluations

Key Evaluation Questions: To what extent have strategies been implemented with fidelity? Does INSPIRE promote student engagement in STEM and improve student mathematics and science achievement?

Metrics and Measures: PD attendance sheets; STEM Attribute Implementation Rubric; NC Standardized State Assessments; Discovery Ed Mathematics; INSPIRE Fidelity Index; key informant interviews; educator surveys; educator focus groups; student surveys; student focus groups.
PBL curriculum development

Pre-award Implementation
PBL lessons implemented in grades K-8.
Isolated and inconsistent implementation of PBL lessons within grade levels.
Limited interdisciplinary focus.
Few opportunities for vertical alignment across grade levels.
No systematic development, review, or documentation processes for PBL lessons.

Post-award Implementation
PBL lessons implemented in grades K-12.
Focus for PBL lessons on high-quality, interdisciplinary, horizontal and vertical alignment.
Development and documentation of a systematic PBL lesson design and review process.
Creation of a database of lessons aligned to the common core that can be shared nationally and internationally.

Tech-enabled personalized learning strategies
Systematic connection of technology to PBL lessons in grades K-12.
Use of Chromebooks and tablets in a 1:1 ratio to deliver personalized instruction with students in grades K-12.
Instruction designed to be student-driven, competency-based, and hands-on.
Face-to-face teaching time; real-time feedback on student needs.
Teacher and student access to meaningful data.
Technology to support the learning process beyond the classroom.

Teacher development and support
Sustained year-round development and support.
Initial transition support to PBL.
Collaboration through weekly PLC meetings.
Ongoing support structures (STEM coordinator and coaches provide lesson guidance, coordinate the K-12 pipeline, in-class coaching, and aligned formative assessments).
Additional capacity for professional development (school in-service, two innovation showcases per year, an annual 5-day STEMersion, partnerships with STEM businesses).
Table 3. INSPIRE PBL Lesson Development Performance and Goals by Project Year
Cells show the number of lessons developed, with the number and percent interdisciplinary 2 in parentheses.

Grade Level | PY1 Fall 2014 Actual | PY1 Spring 2015 Actual | PY1 Total | PY2 Fall 2015 Projection 1 | PY2 Spring 2016 Projection | PY3 Projection
Elementary | 49 (28 or 57%) | 64 (57 or 89%) | 113 (85 or 75%) | 49 (36 or 73%) | 64 (57 or 89%) | 7 per month
Middle | 29 (16 or 55%) | 30 (22 or 73%) | 59 (38 or 64%) | 29 (17 or 59%) | 30 (22 or 73%) | 3-4 per month
High | 42 (19 or 45%) | 47 (32 or 68%) | 89 (51 or 57%) | 42 (22 or 52%) | 47 (32 or 68%) | 5 per month
Total | 120 (62 or 52%) | 141 (111 or 79%) | 261 (173 or 66%) | 120 (75 or 62%) | 141 (111 or 79%) | 9-10 per month
1 The projections for PBL lesson development are based on the actual lessons developed in the prior
year.
2 INSPIRE Performance Measure 1.1 specifies that the percentage of newly developed PBL lessons at
each level that are interdisciplinary will increase by at least 5 percentage points in Years 1-2 and 10
percentage points in Years 3-4, or until no fewer than 75% of all developed PBL units are
interdisciplinary.
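Performance Measure 1.1 can be checked arithmetically. The sketch below is illustrative only: the function name is ours, and the inputs reuse the fall elementary counts reported for Project Year 1 (28 interdisciplinary lessons of 49) and projected for Project Year 2 (36 of 49).

```python
def interdisciplinary_share(interdisciplinary, total):
    """Percent of developed PBL lessons that are interdisciplinary."""
    return 100.0 * interdisciplinary / total

# Fall elementary counts: Year 1 actual vs. Year 2 projection
y1_share = interdisciplinary_share(28, 49)      # about 57%
y2_share = interdisciplinary_share(36, 49)      # about 73%
gain = y2_share - y1_share                      # percentage-point change

# PM 1.1: at least a 5-point gain in Years 1-2, or already at/above 75%
meets_pm_1_1 = gain >= 5 or y2_share >= 75
```

Here the projected gain of roughly 16 percentage points would exceed the 5-point target for Years 1-2.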
FORMATIVE EVALUATION
Introduction
The INSPIRE formative evaluation is designed to equip CCS program personnel to make data-informed
decisions about program implementation. This section of the document presents the six major
evaluation questions that will guide the study. Four of the six major evaluation questions correspond to
the four INSPIRE core strategies. Subsidiary questions listed under each major evaluation question
specify the scope and focus of the evaluation activities and target samples, along with data collection
timelines.1
1) How do INSPIRE school teams develop rigorous problem-based learning (PBL) curriculum
units that impact STEM education, student engagement, and student achievement in
Cabarrus County Schools (CCS)?
2) How do INSPIRE educators combine PBL, digital course content, and tech-enabled
personalized learning strategies to impact STEM education, student engagement, and
student achievement in CCS?
3) How does the INSPIRE approach to teacher development and support impact STEM
education in CCS?
4) How are real-world tethers being integrated into PBL instructional practices to impact STEM
education, student engagement, and student achievement in CCS?
5) How does INSPIRE impact underrepresented students' STEM engagement and achievement?
6) To what extent is INSPIRE implemented annually with fidelity?
Each question is addressed separately on the following pages.
Evaluation Strategies
The evaluation strategies make use of a combination of quantitative and qualitative data sources
(Appendix A). TEG will partner with the INSPIRE Lead Team to customize data collection instruments
and schedules.
1 Data collection timelines represent recurring monthly or annual events. Within each subsidiary question,
superscripts (i.e., a-e) denote relations between the evaluation/data collection strategy and the data collection
timeline.
Major Evaluation Question 1: How do INSPIRE school teams develop rigorous PBL curriculum units that impact STEM education, student engagement, and student achievement in Cabarrus County Schools?

What strategies emerge as best practices for designing PBL lessons aimed at connecting STEM course content across subjects and levels?
1.1. What characteristics or elements define INSPIRE PBL? (SC, PD, EV; June)
1.2. [...] (SC, EV; monthly, Sept-June)
1.3. [...] (PD, SC, EV; March)
1.4. What barriers or challenges do grade-level and school-level teams experience when working towards horizontal and vertical alignment of the curriculum? (PD, SC, EV; March)
1.5. Is the PBL development process implemented with fidelity (PM 1.3)? (PD, EV; January a, July b)

How do students' STEM knowledge, attitudes, beliefs, and behaviors change as they persist through the INSPIRE pipeline? Do trends differ by subgroup?
1.6. How does students' affinity for PBL change as they persist through the pipeline, and what aspects of the PBL process influence students' reported beliefs? (PD, SC, EV; Survey: All students; FG: TBD; October a, June b)
1.7. How, if at all, does PBL contribute to the retention of students in the pipeline? (PD, SC, EV; Survey: All students; FG: TBD; June)
Major Evaluation Question 2: How do INSPIRE educators combine PBL, digital course content, and tech-enabled personalized learning strategies to impact STEM education, student engagement, and student achievement in Cabarrus County Schools?

Within PBL curriculum units, how do educators use digital devices and adaptive digital course content to personalize learning?
2.1. How do teachers and students report using digital devices and adaptive digital course content (to support learning inside and outside of the classroom)? (PM 2.1 and 2.2) (PD, SC, EV; INSPIRE Year-End Teacher and Student Surveys: All Teachers and Students Gr 4-12; June)
2.2. How, if at all, do digital devices and adaptive digital course content impact the rigor and engagement of PBL lessons as perceived by teachers and students? (PD, SC, EV; INSPIRE Year-End Teacher and Student Surveys: All Teachers and Students Gr 4-12; June)
2.3. How, if at all, does the adaptive digital course content used by educators align with common core standards and the school curriculum? (PD, SC, EV; INSPIRE Year-End Teacher Surveys: All Teachers; June)
2.4. What successes and challenges do educators face when using digital devices and adaptive digital course content in PBL curriculum units to personalize learning for their students? (PD, SC, EV; INSPIRE Year-End Teacher Survey: All Teachers; June)

How do digital devices and adaptive digital course content impact students' STEM knowledge, attitudes, beliefs, and behaviors?
Major Evaluation Question 3: How does the INSPIRE approach to teacher development and support impact STEM education in Cabarrus County Schools?

To what degree do teachers utilize and find value in the resources offered as part of the INSPIRE approach to teacher development and support?
3.1. [...] (SC, EV; monthly a, June b)
3.2. How often do teachers meet with their PLC (PM 3.4)? (SC, EV; weekly)
3.3. [...] (PM 3.3)? (PD; September)
3.4. How many teachers receive NASA summer camp [...]? (PD; September)
[...] (PD; ongoing)
[...] (PD, SC, EV; June)
[...] (SC; Sept baseline, Nov mid-year, Apr year-end)
[...] (SC; Sept baseline, Nov mid-year, Apr year-end)
Major Evaluation Question 4: How are real-world tethers being integrated into PBL instructional practices to impact STEM education, student engagement, and student achievement in Cabarrus County Schools?

[...] (SC, EV; ongoing)
[...] (PD, EV; Survey: All Teachers; ongoing a, March b, June c)
[...] (SC, PD, EV; June)
[...] (PD, SC, EV; Survey: All Students and Teachers; FG: TBD; June)
Major Evaluation Question 5: How does INSPIRE impact underrepresented students' STEM engagement and achievement?

[...] (PD, SC, EV)
[...] (PD, DA, EV; Oct-Jan)
[...] (PD, DA, EV; Oct-Jan)
[...] (PD, SC, EV; Mar)
[...] (Survey: Students Gr 4-12; FG: TBD; June)
SUMMATIVE EVALUATION
The summative evaluation, to be concluded in Fall 2017, is designed to answer the confirmatory
question: What is the impact of the INSPIRE program on student achievement?
Two impact studies will be conducted, addressing two primary questions that examine the effect of
INSPIRE on: 1) 2nd grade math achievement for INSPIRE-zoned elementary school students after three
program years, and 2) math achievement for secondary school magnet students after three program
years. Accordingly, we propose two impact studies in which selection bias is controlled by design, either
through matching (Study 1) or through randomization of interested magnet students to treatment and
control groups (Study 2). The studies use recognized assessments with proven reliability and validity,
including NC End-of-Grade (EOG) and End-of-Course (EOC) tests, Discovery Education's (DE) Common
Core Math Assessment, and the ACT college readiness assessment. Appendix C details the standardized
assessments to be used by grade level.
Impact Study 1 uses a three-year longitudinal single-cohort quasi-experimental design (QED) to assess
the impact of INSPIRE on math achievement at the end of 2nd grade. The study sample consists of a
single cohort of all Kindergarten students (N = 228) entering two STEM elementary schools, Coltrane-Webb
Elementary and Patriots Elementary. The two target schools hold Title I status and serve the
district's most ethnically diverse students (Coltrane-Webb: 53% FRPL, 55% minority; Patriots: 34% FRPL,
28% minority). Low-income minority students are zoned for each school, making their enrollment
compulsory and largely due to circumstance rather than choice. We will compare the outcomes of
INSPIRE-zoned students with similar students from four comparison elementary schools not offering a
STEM program (R.B. McAlister; Beverly Hills; Pitt School Road; Charles Boger). Propensity-score
adjustment (PSA) will be used to minimize selection bias and ensure INSPIRE-zoned students and control
students are equated on key background and demographic variables, including a baseline pretest
measure of math ability in Kindergarten (DE Common Core Math Assessment), age, ethnicity, gender,
and free/reduced lunch status. PSA is a regression-based adjustment that estimates the effect of a
treatment while accounting for observed confounding variables.
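The two-step adjustment can be sketched in code. The simulation below is illustrative only: the pretest, effect size, and sample are invented (no CCS data), and the hand-rolled logistic fit stands in for the established statistics package a production analysis would use.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 2000

# Simulated confounder: a standardized Kindergarten pretest score.
pretest = rng.normal(0.0, 1.0, n)

# Enrollment ("treatment") depends on the pretest, so raw groups differ.
p_enroll = 1.0 / (1.0 + np.exp(-0.8 * pretest))
enrolled = rng.binomial(1, p_enroll)

# Outcome: a later math score driven by the pretest plus a true 5-point effect.
score = 50.0 + 4.0 * pretest + 5.0 * enrolled + rng.normal(0.0, 2.0, n)

# Step 1: estimate propensity scores with a logistic regression
# fit by gradient ascent on the log-likelihood.
X = np.column_stack([np.ones(n), pretest])
beta = np.zeros(2)
for _ in range(2000):
    p_hat = 1.0 / (1.0 + np.exp(-X @ beta))
    beta += 0.5 * X.T @ (enrolled - p_hat) / n
propensity = 1.0 / (1.0 + np.exp(-X @ beta))

# Step 2: adjusted regression of the outcome on treatment status,
# controlling for the estimated propensity score.
Z = np.column_stack([np.ones(n), enrolled, propensity])
coef, *_ = np.linalg.lstsq(Z, score, rcond=None)
effect = coef[1]  # estimate of the treatment effect (true value: 5)
```

In this simulation the unadjusted difference in group means overstates the effect, because higher-pretest students are more likely to enroll; adjusting for the propensity score removes most of that bias.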
Impact Study 2 uses a two-cohort individual-level longitudinal randomized control trial (RCT) to assess
the effects of INSPIRE on secondary students' math and science achievement after two years of
treatment. Use of an RCT design effectively minimizes selection bias and ensures that the program and
control groups are equivalent at baseline in terms of background, demographic, and pre-program factors
such as motivation. In school years 2014/15 and 2015/16, a carefully orchestrated lottery process will
be used to randomly assign eligible 6th grade and 9th grade applicants to open INSPIRE slots at J.N. Fries
Middle School and Central Cabarrus High School (CCHS), respectively, or to a non-INSPIRE control group
in the middle school or high school for which the student is zoned (annual estimates: INSPIRE = 66,
control = 99, total N = 165). The target schools serve a significant number of minority and low-income
students (Fries: 25% FRPL, 35% minority; CCHS: 50% FRPL, 42% minority). Between-group comparisons
will be made on state standardized test scores in math and science. The study will use
Hierarchical Linear Modeling (HLM) and include individual and contextual covariates.
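The lottery's random-assignment step can be sketched as follows. This is a minimal illustration under simplifying assumptions: the applicant list is hypothetical, the counts reuse the plan's annual estimates (66 slots, 165 eligible applicants), and the actual CCS lottery involves eligibility screening not shown here.

```python
import random

def run_lottery(applicants, open_slots, seed=None):
    """Randomly assign eligible applicants to open INSPIRE slots.

    Applicants not drawn for a slot form the control group and
    enroll in the (non-INSPIRE) school for which they are zoned.
    """
    pool = list(applicants)
    rng = random.Random(seed)
    rng.shuffle(pool)                 # every applicant gets an equal chance
    treatment = pool[:open_slots]     # drawn for an INSPIRE slot
    control = pool[open_slots:]       # remain in the zoned school
    return treatment, control

# Annual estimates from the plan: 66 INSPIRE slots among 165 applicants.
applicants = [f"student_{i:03d}" for i in range(165)]
inspire_group, control_group = run_lottery(applicants, open_slots=66, seed=2014)
# 66 treatment students, 99 control students
```

Because assignment depends only on the shuffle, every applicant has the same probability of treatment, which is what licenses the baseline-equivalence claim above.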
Year 2 Data Collection Schedule
Dates: September 2015; October; November/December; January 2016; March; April; June; ongoing.
Participant groups: Faculty, Students, Coaches.
Deliverable Name | Scheduled Delivery
1. INSPIRE Year 2 Evaluation Plan | September 2015
2. INSPIRE STEM Events Snapshot | September 2015
3.-13. [...] | November 2015; March 2016; June 2016; October 2015; October 2015; October 2015; October 2015 - June 2016; November 2015; November 2015; June 2016; March 2016; July 2016; July 2016; August 2016
APPENDIX A
INSPIRE Data Collection Tools & Systems: Towards Timely and Accurate Reporting (Fall 2015)

TOOL | MAPS TO PM | LOCATION/ACCESS POINT
1. PBL Framework & Assessment of Quality Rubric | 1.1, 1.2 | Electronic MS Word file
2. Meeting Attendance | 3.1, 3.2 | [...]
3.-4. [...] (a. Coaching Log; b. PBL Lesson Implementation; c. PBL Lesson Assessment; d. Teacher PBL Instruction Ratings; e. Coaching Detail Summary; f. PBL Lesson & Teaching Summary) | 3.4 | [...]
5. INSPIRE PD Tracking Tool & Real-time Online Summary Report | 3.3 | [...]
6.-9. INSPIRE Teacher Surveys; Student Online Surveys (Gr 4-12); Student Surveys (Gr K-3); CCS Formative Data | 4.1; 4.3, 4.4 | [...]
INSPIRE STEM resource page: http://www.cabarrus.k12.nc.us/Page/33255. Ongoing technical support is provided by TEG.
APPENDIX B
TEG Third-Party Data Collection and Security Memorandum
The purpose of this memorandum is to provide you with information about the process by which we
protect data that is collected for the purpose of program evaluation conducted by The Evaluation
Group. We recognize that we often request sensitive data as a result of our services and have protocols
in place to protect data that has been collected. We take multiple steps to ensure that confidentiality is
maintained throughout the evaluation and request data in the least intrusive manner allowed by the
evaluation design.
General Use
1. Student and teacher-level data will only be used for purposes outlined in the evaluation
plan and for purposes of completing the Annual Performance Report (APR) or other reports
required by the funder;
2. On reports, student-level data will only be reported in aggregate to protect students. If the
n is less than 10, we will not report for that group;
3. Data will be stored in Qualtrics or Dropbox for up to one year and then archived for three
years before being deleted from storage.
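The n < 10 suppression rule in item 2 can be expressed as a small check. The function below is an illustrative sketch of the rule, not TEG's actual reporting code.

```python
def report_aggregate(scores, min_n=10):
    """Return the group mean only when the group is large enough
    (n >= min_n) to protect individual students; otherwise suppress."""
    if len(scores) < min_n:
        return None  # suppressed: reporting could identify students
    return sum(scores) / len(scores)

# A group of 12 scores is reported; a group of 6 is suppressed.
reported = report_aggregate([78, 82, 90, 71, 88, 95, 80, 77, 85, 91, 69, 84])
suppressed = report_aggregate([88, 92, 75, 81, 79, 90])
```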
Data Access and Security
1. Data stored in Qualtrics is password protected and encrypted. For more information on
Qualtrics security, see here.
2. Downloaded copies of student data are stored on a password-protected Dropbox, which
provides 256-bit AES encryption and uses SSL/TLS secure tunnel for file transfers.
3. Only members of The Evaluation Group/Research Associates will have access to the data.
We take data security very seriously and ensure that the data we collect is protected to the best of our
ability. As required by the Family Educational Rights and Privacy Act (FERPA), we utilize reasonable
methods to ensure that data is secure and the confidentiality of respondents is protected. In the case
of a data breach, we have a dedicated IT Department and Data Architect who have the capability of
addressing these challenges immediately so that additional issues do not occur. We are happy to discuss
our data security protocol further, at any point throughout the evaluation. Please let us know if you
have any questions.
APPENDIX C
NC Assessments used for the INSPIRE Impact Study by Level and School Year (SY)

Elementary School (Coltrane-Webb & Patriots)
Grade K (SY 2014/15): Math [...]; Science: N/A
Other elementary grades (SY 2015/16, SY 2016/17): N/A
Grade 5 (annual): [...]
Grade 8 (SY 2016/17): [...]
Grade 9 (SY 2014/15 & 2015/16): Math: NC EOC Math I
Grade 10 (SY 2015/16 & 2016/17): Math: NC EOC Math II; Science: NC EOC Biology
Grade 11 (SY 2016/17): Math: ACT Math; Science: ACT Science
NOTES:
1. TEG proposes an annual October 1 to January 31 data collection schedule.
2. For all assessments, TEG requests raw scaled scores.
3. For all cases, the CCS Data Analyst (Brian Dos) will provide: stable student identifiers (tracking INSPIRE
student movement longitudinally across schools); demographic variables (age, ethnicity, gender, and
FRPL); school affiliation; program flag (INSPIRE, INSPIRE-zoned, Other); and program entry grade.