TABLE OF CONTENTS
Introduction
Percentage in Bands
Trend Data
School Growth
Student Growth
Item Analysis
Relative Achievement
Template
INTRODUCTION
Analysing NAPLAN results using SMART provides schools with a resource to help them better understand their performance on literacy and numeracy measures. It allows schools to critically evaluate their performance and contribute diagnostic information to the school's evidence base.
The diagnostic information will assist teachers to identify student learning needs. It is essential that data from NAPLAN be analysed within the school's broader context. Schools in the NSW DEC serve a great variety of communities and therefore operate under a variety of conditions that affect student achievement and school performance.
The focus of school improvement is to increase learning outcomes for all students, and all schools have the potential to do so. An effective approach to analysing student achievement and school performance is to identify strengths and areas for improvement. This approach allows for the fact that a school's starting point can be very different from that of other schools, including schools located nearby.
School performance in comparison to the state mean is not always a sufficient measure on its own. It is often helpful to analyse school performance in relation to comparisons that take account of socio-economic and socio-educational factors. Against Index of Community Socio-Educational Advantage (ICSEA) values, a school might be performing much better than predicted, or not as well as predicted. Therefore, consider using the statistically similar school group (SSG) comparisons in data analysis.
Schools may be performing below the state mean but may have high growth between testing points
in time. Growth is a valuable measure of school improvement that is specific to the students at the
school and demonstrates what impact the school has had on student performance.
Once the Agree button is clicked, your School Summary screen will appear.
This screen displays a snapshot of data for your school. There are various ways to access data about your school. First, we will access the downloadable school reports: click on the Reports tab.
The Reports tab contains the following (each with a screenshot and a description of the report):
Collection of all static table reports
School Summary
Means and Standard Deviations
Percentage in Bands
Analysis by Question Options
Student Response Analysis
Student Growth
Student Scores and Bands
School vs State Item Summary
Non-Participation
ANALYSIS TOOLS
Percentage in Bands
This analysis tool provides detailed information on student groups and their performance, relative to the region, the state and other groups, in specific performance bands. Three years' worth of data are displayed at once, which allows the school to see whether there have been changes in the number of students achieving in the various performance bands.
This graph can be important for schools that are performing below the state mean. The graphs may show value added through the movement of students from lower bands into higher bands, seen as increased percentages in the higher bands.
Care must be taken when using this information for schools with fewer than ten students assessed in the group. In schools with small numbers, each individual student is worth a large percentage, and this may make using percentages in a band misleading.
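The band-percentage arithmetic, and the small-cohort caveat, can be sketched as follows (the band labels and student counts below are invented for illustration, not taken from any real school):

```python
from collections import Counter

def band_percentages(bands, min_cohort=10):
    """Return each band's share of the cohort, plus a small-group flag.

    bands: list of band labels, one per assessed student.
    """
    n = len(bands)
    counts = Counter(bands)
    pct = {band: 100 * c / n for band, c in sorted(counts.items())}
    # With fewer than ten students, each child is worth a large slice,
    # so band percentages can swing wildly from year to year.
    caution = n < min_cohort
    return pct, caution

# Illustrative bands for a small cohort of eight students
pct, caution = band_percentages([1, 2, 2, 3, 4, 4, 4, 5])
print(pct)      # one student is worth 12.5 percentage points here
print(caution)  # True: cohort smaller than 10
```

A single student moving up one band in this cohort shifts two band percentages by 12.5 points each, which is why the caveat above matters.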
Screenshot: Percentage in Bands
Trend Data
The Trend Data screen displays the trends in test means (averages) over time for your school, your region and the state. You can analyse the data for each of the various test aspects.
Up to five years' worth of data are displayed, which allows your school to see whether there have been changes in trends over time.
These graphs, along with the others found in SMART, are designed to help your school consider the factors that have contributed to its performance.
Comparing the patterns of performance in the Trend Data graphs for the various test aspects will assist your school in identifying areas of strength and areas for further investigation.
Consider your school's context and the factors influencing your school's results when interpreting Trend Data.
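One way to quantify the direction a trend line is moving is a least-squares slope over the yearly means. This is only an illustrative sketch; the yearly means below are invented, and SMART draws the trend for you:

```python
def trend_slope(years, means):
    """Ordinary least-squares slope of test means against year."""
    n = len(years)
    my, mm = sum(years) / n, sum(means) / n
    num = sum((y - my) * (m - mm) for y, m in zip(years, means))
    den = sum((y - my) ** 2 for y in years)
    return num / den  # scaled-score points per year

# Hypothetical five years of school reading means
years = [2008, 2009, 2010, 2011, 2012]
means = [402.0, 405.5, 411.0, 409.0, 416.5]
slope = trend_slope(years, means)
print(f"{slope:+.2f} points/year")  # positive slope = upward trend
```

A positive slope corresponds to an upward trend line; comparing slopes across test aspects mirrors the comparison the screen invites you to make by eye.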
Screenshot: Trend Data
School Growth
The School Growth screen compares your school's average growth in test scores for the selected test aspect with that of the State, Region or School Education Group (SEG), or a selected Comparative School Group (created with the Manage Groups facility).
The data in this screen set should be considered together with the school-level data provided in the Student Growth screens, i.e. the Expected Growth and Percentile Range data.
Examine the patterns of student performance relative to the comparison groups to assist you in identifying areas of strength and areas for further investigation.
Consider your school's context and the factors influencing your school's results when interpreting data from School Growth.
Screenshot: School Growth
Student Growth
Data from this section is particularly important to all schools. Schools that are not achieving the state mean may still be providing the environment for students to achieve the expected growth in literacy and numeracy. Likewise, a school at or above the state mean may not be achieving appropriate growth for its students.
Growth on the NAPLAN scales varies depending on prior scores, and because of this, a measure based on the percentage of students achieving expected growth is more useful for diagnostic and school planning purposes than a measure based on average growth alone. In general, there is a tendency for expected growth to be higher when starting from a low prior score than from a higher one.
In using the growth data in SMART, it is important to recognise that:
- most students have a growth figure in a relatively small range around the state average
- the expected growth value thresholds should be considered interim until sufficient data (at least 3 years) are available to provide confidence in the measure.
The Student Growth screen allows you to identify average scaled score growth, the percentages of
students in growth percentile ranges and the percentage of students achieving expected growth for
the selected test aspect. You can compare this information to a standard school group, state, region
or custom school group.
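The quantities this screen reports can be sketched from matched score pairs. The scores below are invented, and a single expected-growth threshold is an oversimplification (in real NAPLAN data the threshold varies with the prior score, as noted above):

```python
def growth_summary(prior, current, expected_growth):
    """Summarise scaled-score growth for matched students.

    prior/current: scaled scores for the same students at two test points.
    expected_growth: growth threshold; a single placeholder number here,
    though in practice it depends on the student's prior score.
    """
    growth = [c - p for p, c in zip(prior, current)]
    avg = sum(growth) / len(growth)
    met = sum(g >= expected_growth for g in growth)
    pct_expected = 100 * met / len(growth)
    return avg, pct_expected

# Hypothetical Year 3 to Year 5 reading scores for six students
prior   = [350, 380, 410, 420, 455, 470]
current = [430, 455, 480, 500, 520, 545]
avg, pct = growth_summary(prior, current, expected_growth=78)
print(avg, pct)
```

This is why average growth and percentage achieving expected growth can tell different stories: a cohort can have healthy average growth while most individual students fall just short of their expected-growth thresholds.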
The Student Growth graph and table, along with the others found in SMART, are designed to help you consider the factors that have led to student achievement being above or below the state and/or region performance.
Examine the patterns of student performance relative to the comparison groups to assist you in identifying areas of strength and areas for further investigation. Consider your school's context and the factors influencing your school's results when interpreting Student Growth. After this, areas that you wish to explore further may be identified.
The areas which you have undertaken for further analysis may highlight issues that have implications for classroom teaching, school programming and the school plan.
Screenshot: Student Growth
Arrows that finish above the upper reference line indicate that the student's progress was amongst the top 25% of the State for students achieving the same result in the previous assessment year.
Note: Caution should be exercised in drawing conclusions about growth for students who have very high prior and present achievements (i.e. have achieved in the top band). Due to the characteristics of the tests, a small change in the number of correct responses for these students is likely to have a disproportionate impact on their growth, in comparison to students elsewhere on the scale.
The mean (average) is used as a measure because in living systems data often follow a bell curve (normal distribution), which is symmetrical in shape. The bell curve shows the spread of results from NAPLAN: the majority of students score around the middle (the mean).
The mean is useful when the data fit this bell-curve shape, as it indicates where most students are scoring; it gives the middle value.
The standard deviation, also found on the table, measures the spread of the data. Where there is a high standard deviation, the data are spread out and the curve flattens, meaning there is more variety of scores. For a low standard deviation, the data are more tightly bunched and the curve becomes taller, meaning there is less variety of scores.
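The two statistics described above can be computed directly. The score sets below are invented purely to contrast a tight and a spread-out distribution:

```python
import math

def mean_and_sd(scores):
    """Population mean and standard deviation of a set of scaled scores."""
    n = len(scores)
    mean = sum(scores) / n
    # Standard deviation: typical distance of scores from the mean.
    sd = math.sqrt(sum((s - mean) ** 2 for s in scores) / n)
    return mean, sd

tight  = [498, 500, 500, 502]   # low SD: tall, narrow bell curve
spread = [440, 480, 520, 560]   # high SD: flatter, wider curve
print(mean_and_sd(tight))
print(mean_and_sd(spread))
```

Both cohorts have the same mean of 500, but very different standard deviations, which is exactly the distinction the table is designed to surface.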
Range of standard deviations from state    School response
Below -0.5                                 Below state
-0.5 to +0.5                               Within state
Above +0.5                                 Above state
As NAPLAN uses a scale from 0 to 1000, schools may find it difficult to interpret their results. The data from the Means and Standard Deviations table can be entered into the calculation spreadsheet, which will indicate whether your school's results are of concern or within state parameters. A school mean below the state mean does not, by itself, indicate a problem.
To access the NAPLAN analysis spreadsheet, go to eLearning at the top of the Means and SD page of SMART and click on Key Messages. Then click on next slide to access the link for Mean Calculations, and scroll down to Means and Standard Deviations. This spreadsheet allows you to calculate the effect size of school means relative to the State and Region means, giving an indication of whether the school is well above or well below.
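A minimal sketch of the calculation the spreadsheet performs, assuming it uses a simple effect size of the form (school mean minus state mean) divided by state standard deviation. The thresholds follow the "Below -0.5" row of the table above; the symmetric upper bound and the figures used are assumptions of this sketch:

```python
def effect_size(school_mean, state_mean, state_sd):
    """School mean expressed in units of state standard deviations."""
    return (school_mean - state_mean) / state_sd

def classify(es, threshold=0.5):
    # Thresholds mirror the 'Below -0.5' row of the table above;
    # the symmetric +0.5 upper bound is an assumption of this sketch.
    if es < -threshold:
        return "Below state"
    if es > threshold:
        return "Above state"
    return "Within state"

# Hypothetical Year 5 reading figures
es = effect_size(school_mean=455.0, state_mean=494.0, state_sd=70.0)
print(round(es, 2), classify(es))
```

Dividing by the state standard deviation is what makes the raw 0-to-1000 scale interpretable: a gap of 39 scaled-score points means little on its own, but about half a standard deviation is a concrete benchmark.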
Using the calculation spreadsheet
Enter the data from the Means and Standard Deviations table into the spreadsheet. If you wish to do this for individual groups, you can copy the spreadsheet and enter, for example, the information for boys, girls, ATSI students, etc.
Once you have entered the values from the SMART table of results, the spreadsheet will indicate the performance of the school relative to the state mean for that year group.
The result for the data above indicates that the school has a problem in the reading aspect, based on the school mean relative to the state.
Note: Remember the size of the group when interpreting results, as mentioned in the section above.
Item Analysis
The Item Analysis screen displays the details of an assessment item, including the skills
assessed, syllabus references and ESL scales. Each question from the test booklet can be
viewed.
When you click on a question (or on Expand) you can view which students answered the question correctly or incorrectly, and access the Distractor Analysis (for Numeracy only) and the teaching strategy for each test item. You can also analyse, sort and filter graphs of school performance in each test aspect.
Examine the patterns of responses by groups of students to assist you in identifying areas of
strength and areas for further investigation.
Consider your school context and the factors influencing your school results when
interpreting data from Item Analysis. From this, identify areas that you wish to explore
further.
The areas which you have undertaken for further analysis may highlight issues that have
implications for classroom teaching, school programming and the school plan.
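Comparing school and state percent-correct per question can be sketched as follows. The question numbers, percentages and the 10-point flagging threshold are all made up for illustration:

```python
def flag_items(school_pct, state_pct, gap=10):
    """Flag questions where the school's percent-correct trails the state.

    school_pct/state_pct: {question number: % answering correctly}.
    gap: percentage-point shortfall treated as worth investigating
    (an illustrative threshold, not an official SMART value).
    """
    flags = {}
    for q in sorted(school_pct):
        diff = school_pct[q] - state_pct[q]
        if diff <= -gap:
            flags[q] = diff
    return flags

# Hypothetical percent-correct figures for five questions
school = {10: 55, 11: 80, 12: 61, 13: 90, 14: 72}
state  = {10: 70, 11: 78, 12: 75, 13: 88, 14: 74}
print(flag_items(school, state))  # questions at least 10 points below state
```

Flagged questions are candidates for the follow-up described above: check the skills assessed, the syllabus references and the associated teaching strategy for each.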
Band Predictor
The Band Predictor screen shows how the percentage of students in the bands would change if the results of every student who sat the test changed by the selected number of raw marks (using the + or - buttons). The graph displays this prediction in the form of an additional column, shaded in red.
Warning: The Band Predictor graph should be interpreted with caution. The graphs demonstrate the effect that a set change in correct responses (raw marks) for every student would make to the band distribution, based on the results of the current year only.
Note: the number of additional questions a student needs to answer correctly in order to progress to a higher band varies depending on the student's current level of achievement and location within the band.
Band Predictor is best used as a discussion starter. For example, where students have inadvertently missed some questions in the assessment, you may like to use the analysis provided for reflection.
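The "what if every student scored a few more raw marks" idea behind the Band Predictor can be sketched with hypothetical band cut-offs. Real NAPLAN cut-offs differ by year and test, so the cut-offs and raw marks below are placeholders only:

```python
from collections import Counter

def band_of(raw, cutoffs=(0, 10, 16, 22, 28, 33)):
    """Map a raw mark to a band (band 3 = lowest).

    cutoffs: hypothetical raw-mark thresholds for bands 3 through 8.
    """
    band = 3
    for i, c in enumerate(cutoffs):
        if raw >= c:
            band = 3 + i
    return band

def predict_bands(raw_marks, shift):
    """Band distribution if every student's raw mark changed by `shift`."""
    return Counter(band_of(r + shift) for r in raw_marks)

marks = [7, 9, 12, 15, 15, 21, 24, 30, 34]   # invented raw marks
print(predict_bands(marks, 0))   # present distribution
print(predict_bands(marks, 1))   # +1 raw mark for every student
```

Note how a one-mark shift moves only the students sitting just below a cut-off, which is the caution stated above: the marks needed to progress depend on each student's position within the band.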
Relative Achievement
The Relative Achievement screen displays assessment results plotted for two selected test aspects.
The graph shows how each student in the selected group has performed in each of the two test
aspects.
The Relative Achievement graphs can be used to provide a broad indication of differences in performance for the selected tests in NAPLAN. It is recommended that the graphs be used as a general indicator when considering a school's or group's performance in comparison to state performance patterns.
Examine the patterns of student performance relative to the comparison group to assist you in identifying areas of strength and areas for further investigation.
Consider your school's context and the factors influencing your school's results when interpreting data from Relative Achievement. From this, identify areas that you wish to explore further.
The areas which you have undertaken for further analysis may highlight issues that have implications for classroom teaching, school programming and the school plan.
Screenshot: Relative Achievement tool
The graph plots the achievements of individual students (each represented by a circle) on the basis of their achievement relative to two test aspects. The graphs also display bands for the two selected tests.
The students' scores are plotted in relation to the State Reference Line, which provides a measure of relative achievement for students across the state.
If there is no State Reference Line, this indicates a weak correlation between the chosen test aspects. You should re-select the axes to be of a similar strand, e.g. Reading and Writing.
Note: The test scales that comprise NAPLAN, namely Reading, Writing, Grammar and Punctuation, Spelling and Numeracy, are developed on different scales and assess different skills. Because of this, it is not appropriate to directly compare scaled scores across these tests. The Relative Achievement screen in SMART compares student achievement on one test scale with the average achievement of all other students in the state across the range of scores, as represented by the State Reference Line.
The Relative Achievement graphs can therefore be used to provide a broad indicator of differences in performance for the selected tests in NAPLAN. It is recommended that the graphs be used as a general indicator when considering a school's or group's performance in comparison to state performance patterns.
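The "weak correlation, so no reference line" behaviour can be illustrated with a Pearson correlation between two aspects. The matched scores are invented, and the 0.3 cut-off is purely an assumption of this sketch (SMART's actual criterion is not documented here):

```python
import math

def pearson(xs, ys):
    """Pearson correlation between two sets of matched student scores."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical matched Reading and Numeracy scaled scores
reading  = [420, 450, 480, 510, 540]
numeracy = [430, 445, 470, 520, 535]
r = pearson(reading, numeracy)
# The 0.3 threshold below is an assumption for this sketch only.
print(round(r, 2), "strong enough" if r > 0.3 else "too weak")
```

When the correlation is weak, a single reference line cannot summarise the relationship between the two aspects, which is why the screen withholds it and suggests choosing aspects of a similar strand.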
Number of students: 63 Year 5 students
Aspect: Reading
Cropped screen shots have been taken from the online SMART Reports tab and inserted in the relevant sections below.
REPORTS TAB
Focus Questions
School vs State Item Performance Summary
Results: incorrect response percentage 10 or more above the state population percentage.
Question 12: Links information / Recognises main idea.
Student Results
Students in highest band (Band 8 Reading): B, T, S, C, K, J
Students in lowest band (Band 3 Reading): I, Z, B, C, J, J, T
Percentages in Bands
Focus questions: Movement of students in bands across years? Different groups performing better relative to the state?
Analysis by Question Options
Incorrect answers (State): the percentage of students achieving the correct response was below 75% for 22 of the 35 questions.
Questions requiring attention: 2, 10, 11, 12, 13, 15, 16, 17, 18, 19, 20, 21, 22, 23, 24, 25, 26, 27, 28, 29, 30, 31, 32, 33, 34, 35.
Correct answers: questions 1, 3, 4, 5, 6, 7, 8, 9, 14.
Alternative answers: Question 23 (state 72%, school 79%); Question 28 (state 83%, school 79%).
Other questions where the school percentage choosing the correct response was 9 percentage points below the State population percentage include: 24, 26, 27, 30, 32.
ANALYSIS TOOLS
Focus Questions
Percentage in Bands: Movement of students in bands across years?
Area of Focus: reduce the number of students performing in the bottom bands.
Trend Data: In what direction are the school trend lines moving?
Result: the school trend is in an upward direction (Girls, Boys).
School Growth: the school shows growth above the state, the region and the schools in its SEG.
Student Growth: What is the average scaled score growth of the school compared to the state and region? What is the proportion of students in each percentile band? Are students showing expected growth?
Item Analysis: For which questions, in relation to their difficulty, are students performing equal to or better than the state?
Band Predictors
Legend: blue = State; black = DET state; green = School; red = possible school improvement. Graphs are shown for the present results, +1 and +2 raw marks.
One question improvement (+1): the bottom two bands, while showing some improvement, would still be above state percentages. Band 8 would see an improvement, but the top two bands would still be below state percentages.
Two question improvement (+2): a reduced percentage of students in the bottom band, lower than the state percentage. Improved performance in Bands 6, 7 and 8, above the state percentage.
Three question improvement (+3): a significant reduction in Bands 3, 4 and 5. Improved performance in Bands 6, 7 and 8, resulting in performance well above state percentages in Bands 6 and 8.
Relative Achievement: What aspects are performing better in the school compared to others?
FOLLOW UP TO NAPLAN
Data: Teaching Strategies. What teaching strategies from the NAPLAN support materials are already being implemented in the school?
Results: implementation of literacy programs, including Literacy on Track, Best Start, Accelerated Literacy, Language, Learning and Literacy (L3), Reading Recovery and SLST programs.
Area of Strength: teaching strategies can be used to help teachers explicitly teach skills.
Area of Focus: Question 11 (Applied Comprehension); Question 34 (Inferring meaning); Questions 12, 16 and 19 (Connecting Ideas).
Click on the stimulus to access a copy of the NAPLAN resource.
SCHOOL REPORTS
Data:
School vs State Item Performance Summary
Means and Standard Deviations
Student Results (students in highest band; students in lowest band)
Percentages in Bands
Analysis by Question Options
Student Response Analysis
Student Growth (Year 5 onward)
Results:
Area of Strength:
Area of Focus:
ANALYSIS TOOLS
Data:
Percentage in Bands: Movement of students in bands across years?
Trend Data: In what direction are the school trend lines moving? What directions are the school and region trend lines moving in relation to the school trend lines? Are specific aspects moving in a particular direction?
School Growth
Student Growth
Means and Standard Deviations: cohort (above / below); ATSI (above / below); at or above National Minimum Standard; below National Minimum Standard; size of group less than 10.
Item Analysis: For which questions, in relation to their difficulty, are students performing equal to or better than the state? On which questions are students performing significantly below the state?
Band Predictors
Relative Achievement: What aspects are performing better in the school compared to others?
Results:
Area of Strength:
Area of Focus:
Analysing NAPLAN results using SMART (Version: 14 June 2012)
FOLLOW UP TO NAPLAN
Data: Teaching Strategies. What teaching strategies from the NAPLAN support materials are already being implemented in the school?
Results:
Area of Strength:
Area of Focus: