
Journal of Science and Mathematics Education in Southeast Asia
2010, Vol. 33 No. 2, 129 - 148

Numeracy, Literacy and Newman’s Error Analysis

Allan Leslie White


University of Western Sydney

Newman (1977, 1983) defined five specific literacy and numeracy
skills as crucial to performance on mathematical word problems:
reading, comprehension, transformation, process skills, and encoding.
Newman’s Error Analysis (NEA) provided a framework for
considering the reasons that underlay the difficulties students
experienced with mathematical word problems and a process that
assisted teachers to determine where misunderstandings occurred.
NEA also provided directions for where teachers could target effective
teaching strategies to overcome them. NEA experienced a reawakening
in Australia and has been included in a number of programs such as
the Counting On program in the Australian state of New South
Wales. This paper presents findings of a pre-post test given to 1213
students participating in the 2008 Counting On program and
examines NEA as a diagnostic tool linking numeracy and literacy
and discusses how teachers have also used NEA as a remediation
and general classroom pedagogical strategy in primary and secondary
schools.

Background
While most mathematical questions involve the use of words, not all are
classed as word problems. A primary condition of word problems is the
inclusion of a word description of a context within which the problem resides
such as the word problem shown in Figure 1.

Paul went on a bike hike. He rode 402 km on his bicycle over 6 days. He rode
the same distance each day. How far did Paul ride each day?

Figure 1. A typical word problem.

Key words: Newman’s Error Analysis; Problems; Errors; Numeracy; Literacy


Mathematical word problems and their place and importance in the
school curriculum have attracted diverse opinions. “Teachers seem not to
like word problems. Many have asked me why these are used to ‘trick’
children in assessments” (Askew, 2003, p. 78). It is well recognised that
students appear to struggle with both the literacy and mathematical demands
of typical mathematical word problems.
... at the upper primary level most errors on mathematics tests and
examinations are caused by Reading, Comprehension or Transformation
errors, or by Carelessness. Often, pupils are able to carry out one or more of
the four operations (+, −, ×, ÷) needed to answer a question, but they do not
know which operations to use (Clements, 2004, p. ii).
The importance of word problems, it is argued, lies in the centrality of
language in the teaching and learning of mathematics (Clements & Ellerton,
1993). Others would also argue that a deeper level of mathematics is needed
beyond procedural proficiency, and that a conceptual knowledge of
mathematics is the goal (Carpenter & Lehrer, 1999). Some would maintain
that language provides a vehicle for rich classroom discussions and assists
teachers and students to appreciate the power of mathematics in making
sense of their world.
... if the essence of mathematics is the setting up of and working with
mathematical models, and if we treat word problems in such a way, then
they might have a role to play in helping children better understand the
process of mathematizing. And with the increasing mathematizing of the
world (from national test scores to pension prospects), informed and critical
citizens need to be aware that mathematizing is not something that arises
from the world, but something that is done to the world. In a small way,
working on word problems might help begin to develop this awareness (Askew,
2003, p. 85).
So in summary, while the language demands of the mathematics
curriculum are important and need to be developed, they also contribute to
the difficulties experienced by students struggling with mathematics. Thus
mathematics teachers must be aware of the literacy and numeracy issues
involving word problems. But surely this cannot be of great importance
because students who reach upper primary school and early secondary
school can read, calculate and write. This may not be the case for all students.


In Australia there has been an increasing concern with students in the
middle years who are struggling with mathematics. Gervasoni, Hadden and
Turkenburg (2007) conducted a large study of number learning in 2006 of
over 7000 children in Ballarat in the Australian state of Victoria for the
purpose of identifying issues that could inform the development of a
professional learning plan. A notable number of students (31%) beginning
Grade 6 were found not yet able to read, write, order, and interpret four-
digit numbers nor use reasoning-based strategies for calculations in addition,
subtraction, multiplication and division.
In New South Wales the Department of Education and Training
(NSWDET) in responding to similar findings implemented the Counting
On Program to address the needs of students who were excluded from
effective mathematics study in the middle years and beyond because of a
lack of understanding of and proficiency with the early school mathematical
knowledge. The Counting On program was designed with a twin focus
upon student and teacher learning and has continued to expand and
evolve. The initial program was designed for first year secondary school
students (Year 7) who had not achieved specific New South Wales Stage 3
(primary school) mathematics outcomes by the time they commenced
secondary school. It was later extended to include the primary schools and
now targets the middle years (9-14 year olds).
The Counting On program has a strong research base starting with the
Counting On Numeracy Framework (Thomas, 1999) which was an extension
of work by Cobb and Wheatley (1988), Beishuizen (1993), Jones, Thornton,
Putt, Hill, Mogill, Rich and van Zoest (1996) and relates to the Count Me In
Too Learning Framework in Number (LFIN; Wright, 1998; Wright, Martland,
& Stafford, 2000). This theoretical base was supported by an increasing
number of Counting On evaluation studies (Mulligan, 1999; Perry & Howard,
2000, 2002, 2003; White, 2008, 2009).
In 2007 the program underwent a major revision and was implemented
in 122 schools across NSW which were grouped into 30 clusters with each
cluster supported by a mathematics consultant. It was based on the earlier
Counting On models but included changes designed to simplify and
encourage further and ongoing involvement of schools. One of the features
of the revised model was the inclusion of Newman’s Error Analysis (NEA).


Communicating is one of the five processes contributing to the Working
Mathematically strand in the NSW mathematics school curriculum. Students
are expected to learn to use appropriate language and representations to
formulate and express mathematical ideas in written, oral and diagrammatic
form. Thus the inclusion of NEA aimed to address the difficulties students
were experiencing with mathematical word problems and the problems
teachers were experiencing with their students’ difficulties. The success of
the inclusion of NEA is the focus of this paper. The next section will present
a brief history of NEA, followed by the results of using NEA in a remedial
numeracy program for struggling middle years students.

Newman’s Error Analysis


In the 1980s and 1990s NEA was mainly promoted in Australia by Clements
(1980, 1982, 1984) and in collaboration with Ellerton (e.g., Clements &
Ellerton, 1992, 1993, 1995; Ellerton & Clements, 1991, 1996, 1997) although
there were others (e.g., Casey, 1978; Clarkson, 1980; Watson, 1980; Tuck,
1983; Faulkner, 1992). NEA also spread widely throughout the Asia-Pacific
region such as in Brunei (Mohidin, 1991); in India (Kaushil, Sajjin Singh &
Clements, 1985); in Malaysia (Marinas & Clements, 1990; Clements &
Ellerton, 1992; Sulaiman & Remorin, 1993); in Papua New Guinea (Clements,
1982; Clarkson, 1983, 1991); in Singapore (Kaur, 1995); in the Philippines
(Jiminez, 1992); and in Thailand (Singhatat, 1991; Thongtawat, 1992).
This initial momentum declined in New South Wales, where NEA had
almost disappeared; its inclusion in the Counting On program in 2007
was serendipitous. Clements had moved to the University of Brunei
Darussalam and designed and implemented a national professional learning
program for primary teachers titled, Active Mathematics In Classrooms
(AMIC; White & Clements, 2005). The program targeted numeracy and had
nine specific aspects, of which NEA was one. Six of the aspects of the AMIC
program were reported in the NSW primary school journal, Square One, for
the Mathematics Association of New South Wales. An article on the use of
NEA was selected and added to the teacher reader section of the NSWDET’s
website in 2006 (White, 2005). This article renewed teachers' interest in
NEA, and it was subsequently added to the Counting On program
in 2007.


The reasons for the inclusion of NEA (Newman, 1977, 1983) in the 2007
and 2008 programs were to assist teachers confronted with students
who experienced difficulties with mathematical word problems. It
challenged the prevailing practice of giving students ‘more of the same’
involving drill and practice in the hope that the students would rectify their
difficulties. NEA provided a framework for considering the reasons that
underlay the difficulties and a process for assisting teachers to determine
where misunderstandings occurred and where to target effective teaching
strategies to overcome them. Moreover, it provided an excellent professional
learning program for teachers and made a clear link between literacy and
numeracy.
NEA was designed as a simple diagnostic procedure. Newman (1977,
1983) maintained that when a person attempted to answer a standard,
written, mathematics word problem then that person had to be able to pass
over a number of successive hurdles: Level 1 Reading (or Decoding), 2
Comprehension, 3 Transformation, 4 Process Skills, and 5 Encoding (see
Table 1 for the interview prompts). Along the way, it was always possible
to make a careless error and there were some who gave incorrect answers
because they were not motivated to answer to their level of ability. Newman’s
research generated a large amount of evidence highlighting that far more
children experienced difficulty with the semantic structures, the vocabulary,
and the symbolism of mathematics than with the standard algorithms. In
many Newman studies carried out in schools the proportion of errors first
occurring at the Comprehension and Transformation stages has been large
(Marinas & Clements, 1990; Ellerton & Clements, 1996; Singhatat, 1991). Thus,
studies regularly reported that approximately 70 per cent of errors made by
Year 7 students on typical mathematics questions were at the Comprehension
or Transformation levels. These researchers also found that Reading
(Decoding) errors accounted for less than 5 per cent of initial errors and the
same was true for Process Skills errors, mostly associated with standard
numerical operations (Ellerton & Clarkson, 1996). Also, Newman’s research
consistently pointed to the inappropriateness of many remedial mathematics
programs in schools in which the revision of standard algorithms was
overemphasised, while hardly any attention was given to difficulties
associated with Comprehension and Transformation (Ellerton & Clarkson,
1996). There have been adaptations and two procedures that modified the
interview procedures used by Newman (1977) will now be briefly described.


Table 1
The Newman’s Error Analysis Interview Prompts
1. Please read the question to me. If you don’t know a word, leave it out.
2. Tell me what the question is asking you to do.
3. Tell me how you are going to find the answer.
4. Show me what to do to get the answer. “Talk aloud” as you do it, so that I can
understand how you are thinking.
5. Now, write down your answer to the question.
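The five prompts above can be read as successive hurdles: the interviewer records the first level at which the student fails. As a minimal sketch (not a tool from the paper), the prompts and the diagnostic decision can be written in a few lines of Python; the `classify_error` helper and the `responses` dictionary are illustrative names, and level 6 is the extra category used later in the Counting On data for a fully correct solution.

```python
# Illustrative sketch of the NEA interview as successive hurdles.
# The data structure and helper are hypothetical, not an official NEA tool.

NEA_LEVELS = {
    1: ("Reading",        "Please read the question to me."),
    2: ("Comprehension",  "Tell me what the question is asking you to do."),
    3: ("Transformation", "Tell me how you are going to find the answer."),
    4: ("Process Skills", "Show me what to do to get the answer."),
    5: ("Encoding",       "Now, write down your answer to the question."),
}

def classify_error(responses):
    """Return the first NEA level at which the student fails.

    `responses` maps level number -> True (passed) / False (failed).
    A student who clears all five hurdles is recorded as level 6,
    i.e. a fully correct solution.
    """
    for level in sorted(NEA_LEVELS):
        if not responses.get(level, False):
            return level
    return 6

# Example: a student who reads and comprehends the problem but cannot
# choose the right operation fails first at Transformation (level 3).
print(classify_error({1: True, 2: True, 3: False}))  # -> 3
```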

Analysis of All Levels


In the first adaptation, Casey (1978), in a study of the errors made by 120
Grade 7 students in a single high school, instructed the interviewers to help
students over errors. If a pupil made a Comprehension error, the interviewer
would note this and explain the meaning of the question to the pupil, and
this process would continue until the student had answered the question.
Thus, in Casey’s study, a pupil could make a number of errors on the one
question and thus it is difficult to compare Casey’s interpretations with
Newman’s. However, Casey’s method was attractive to teachers who were
more interested in how the students performed at the Process level.

Analysis of All Answers


The second adaptation was proposed by Ellerton and Clements (1997) who
used a modified form of the Newman interview method to analyse the
responses by students in Grades 5 through 8 to a set of 46 questions. All
responses, both correct and incorrect, were analysed. A correct answer which,
after analysis, was not deemed to be associated with an adequate
understanding of the main concepts, and/or skills and/or relationships
tested by a question, would be associated with a Newman error category,
even though the answer was correct. Ellerton and Clements’ modification
led to the adoption of a slightly different definition of “Careless” error from
that previously given by Clements (1982).
This concern with equating correct answers with understanding
has been researched in other contexts. For example, Ellerton and Olson (2005)
conducted a study of 83 Grades 7 and 8 American students completing a
test comprising items from Illinois Standards Achievement Tests. Their
findings reinforced the fact that students’ scores on tests do not necessarily
reflect their level of understanding of mathematical concepts and


relationships. The results indicated a 35% mismatch: some students gave
correct answers with little or no understanding, while others gave incorrect
answers but possessed some understanding. The authors cast doubt on the
use of large scale testing programs as a means of making comparisons or
being used as basis for the allocation of resources.
While there are other theoretical approaches available to teachers, NEA
offers one of the easiest to use and adapt, and it has proven popular among
NSW teachers for the ease of its diagnostic features. What is also surprising
is how NEA has been used by teachers as a problem solving strategy for
students and as a classroom pedagogical strategy. In the next section, data
from the 2008 evaluation report (White, 2009) will examine the student
learning outcomes and the teacher use of NEA.

Student Learning
The 2008 Counting On program was implemented in 99 schools across the
state. An assessment instrument based on the learning framework was
administered by the class teacher as a whole class schedule covering place
value, addition, subtraction, multiplication, division, and word problem
tasks. The assessment schedule results were used by the teacher to identify
the student target group. The target group completed the program and were
tested at the start and finish of the program. The teachers were asked to
record the results of the target group assessment process involving a
minimum of 5 students per class on an Excel spreadsheet supplied to them.
The spreadsheet recorded the initial level on the learning framework and
NEA for the students before the program was implemented and again
following 10 weeks of targeted activities.
The Counting On program is funded under an Australian federal
government program and there is a mandatory evaluation process that
includes instruments and reporting requirements. The Counting On program
also has to report to other NSW state bodies and other data is collected for
these purposes. The author of this paper was given all the data collected
from all the instruments used and asked to analyse and construct an
evaluation report. He had input into neither the design of these instruments
nor the collection of data, although he was able to collect further data. Thus
there are methodological issues that arise such as a concern with the initial
and final student level diagnosis by teachers. While the facilitators are trained
in the use of NEA, there are concerns with the process involving the other


teachers, and this will be discussed later in this paper. However, as a result
of these concerns with the integrity of some of the data, it was decided to
use only simple statistical tools in the analysis.
In 2008 data were collected from 74 schools: 55 primary schools, 16
secondary schools and three central schools. There were 1213 students:
954 primary students (78.6%) and 259 secondary students (21.4%). Only one
of the two questions involving Newman’s Error Analysis in the assessment
instrument was recorded for each student. The NEA scale from 1 to 5 was
used, and a category 6 was added to represent those who could complete
the word problem successfully. Table 2 displays the initial and final NEA
levels for the 2008 cohort and indicates an improvement in the overall levels
from the initial to the final student assessments.
Table 2
The Initial and Final Newman’s Error Analysis Levels
NEA Level   Initial Frequency   Percentage   Final Frequency   Percentage
1           196                 16.2%        51                4.2%
2           452                 37.3%        234               19.3%
3           399                 32.9%        477               39.3%
4           101                 8.3%         220               18.1%
5           37                  3.1%         134               11.0%
6           28                  2.3%         97                8.0%
Total       1213                100.0%       1213              100.0%
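As a quick consistency check, the Table 2 percentages can be recomputed directly from the published frequencies for the 2008 cohort (n = 1213); the dictionary names below are just illustrative.

```python
# Recompute the Table 2 percentages from the reported frequencies.
initial = {1: 196, 2: 452, 3: 399, 4: 101, 5: 37, 6: 28}
final   = {1: 51,  2: 234, 3: 477, 4: 220, 5: 134, 6: 97}

n = sum(initial.values())
assert n == sum(final.values()) == 1213  # both columns cover the full cohort

for level in sorted(initial):
    print(f"Level {level}: {initial[level]/n:5.1%} -> {final[level]/n:5.1%}")
# Level 1 falls from 16.2% to 4.2%, while level 6 (a fully correct
# solution) rises from 2.3% to 8.0%, matching the published table.
```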

Table 3 shows that the majority of students improved by 1 or more
levels (56.6%), with a sizeable group improving two levels (15.6%). A
small group of students improved by 3 or 4 levels, just as some
declined by 1, 2 or more levels.
The descriptive statistics record an increase in the mean from 2.52 for
the initial level (SD = 1.096) to 3.37 for the final level (SD = 1.254). Using a
paired sample T-Test, the results indicate that the change in the student
outcomes for mathematical word problem levels at the start and finish of
the 10 week Counting On 2008 program was statistically significant.


Table 3
The Difference in Newman’s Error Analysis Levels
Difference   Frequency   Percentage
-4 3 0.2%
-3 6 0.5%
-2 14 1.2%
-1 52 4.3%
0 452 37.3%
1 385 31.7%
2 189 15.6%
3 79 6.5%
4 27 2.2%
5 6 0.5%
Total 1213 100.0%
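Because a paired-sample t-test on pre/post levels is equivalent to a one-sample t-test on the level differences, the Table 3 frequencies are enough to reconstruct the analysis. The sketch below (the paper reports only the means, SDs and significance, not the t statistic itself) recovers the mean gain of about 0.85 levels, consistent with the reported rise from 2.52 to 3.37, and the 56.6% share of students improving by one or more levels.

```python
# Reconstruct the paired analysis from the Table 3 difference frequencies.
from math import sqrt

diffs = {-4: 3, -3: 6, -2: 14, -1: 52, 0: 452,
          1: 385, 2: 189, 3: 79, 4: 27, 5: 6}

n = sum(diffs.values())                                     # 1213 students
mean = sum(d * f for d, f in diffs.items()) / n             # mean level gain
var = sum(f * (d - mean) ** 2 for d, f in diffs.items()) / (n - 1)
t = mean / sqrt(var / n)                                    # one-sample t on the differences

improved = sum(f for d, f in diffs.items() if d >= 1) / n   # share gaining >= 1 level
print(f"mean gain = {mean:.2f}, t = {t:.1f}, improved >= 1 level: {improved:.1%}")
```

The resulting t value is very large, in line with the paper's report that the pre-post change was statistically significant.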

While the 2008 data collected for the pre- and post-program student
learning outcomes indicated a statistically significant increase in the
NEA levels, of concern was the group who did not show an increase. In
a program as short as this it is unrealistic to expect that all students will make
great leaps on the NEA levels. These targeted students have been struggling
for some time with their mathematical and literacy difficulties and have
developed judgements of their own ability. To improve one level, especially
on the NEA scale, where a level may involve improved reading or
comprehension, is quite remarkable in such a short time frame.
However, there may be other possible explanations for the lack of
improvement in a small group of students or the apparent decline in others.
Vaiyatvutjamai and Clements (2004) analysed the errors made by 231 Form
3 (Year 9) Thai students in two Chiang Mai government secondary schools.
Students completed tasks before and immediately after a series of 13 lessons.
A number of misconceptions were revealed and although some were clarified
as a result of the lessons, there were others that remained and seemed to be
‘fossilised’. A ‘fossilised misconception’ was used to denote the situation
where a student maintains a faulty conception despite having been
specifically taught the ‘official’ defining characteristics of the relevant
concept. Associated with this then is the absence of cognitive change over
time or even resistance to change over time, so that cognitive inertia persists
despite the individual having been taught the ‘proper’ view of the concept.


The implications for this current study are that the strategies and procedures
of the intervention program of Counting On should become integrated into
the everyday classroom and continue after the program has finished. These
‘fossilised misconceptions’ may require a greater time period for them to be
changed.
Also of interest is that while the study by Vaiyatvutjamai and Clements
(2004) involved students across the range of abilities, the results for low
performing students challenged the use of the term misconception for many
of the student errors. “A misconception can be regarded as a fairly stable,
but inappropriate, way of thinking ...analysing the errors made by low
performers in this study, was that the word ‘stable’ was not one that could
sensibly be used” (p. 181). Students with ‘unstable’ conceptions will give
different answers at different times and hence it is possible that their test
scores will decline. Students who have not developed confidence in their
ability to answer a question may revert to guessing. This may well explain
the 35% mismatch with students who gave correct answers with little or no
understanding and others who gave incorrect answers but possessed some
understanding reported in Ellerton and Olson’s (2005) study. While students
using a guessing strategy may cause some instability in the results for the
process level in NEA if not identified by the teacher, it would be easily
revealed by the second adaptation of NEA (Ellerton & Clements, 1997) where
correct answers are interrogated.
This completes the section on NEA and student learning and the next
section will discuss the second learning focus which is upon the teacher.
Teachers were shown how to use the NEA prompts (see Table 1) to diagnose
the difficulties that their students were having with mathematical word
problems. The following section of this paper will explore how teachers
responded to NEA as an aspect of the Counting On program.

Teacher Classroom Use


The professional learning of teachers within the Counting On program had
evolved by 2007 into a two-day conference attended by one or two volunteer
teachers from each school who would act as facilitators. The facilitators
would then return to their school, form and train a team, and then implement
the program. While initially a ‘train the trainer’ model may have been a
loosely accurate description of this process, it evolved, and it is now
better described as a ‘facilitated model’ of teacher professional learning.


Whereas in a ‘train the trainer’ process there would be an expectation of
consistency in delivery, with everyone trained and then delivering the
training in the same way, a facilitated model allows for more variability
with regards to how the program is implemented because the expectation is
that the facilitator will shape the program to meet local needs. This change
meant that whereas cascade models of train the trainer suffered from
‘dilution’ as the process moved from level to level, a facilitated
model had the potential to be either better or worse than the original
facilitator training. The evaluation reports highlighted that the success of
the program depended to a great extent upon the school facilitator. Thus
the implementation of NEA depended upon the school facilitator.
The 2007 and 2008 evaluation reports (White 2008, 2009) revealed that
the majority of teachers were strongly positive about the inclusion of NEA
into the program. Teachers reported that it was an understandable,
easy-to-use framework and process for uniting numeracy and literacy, and there
were requests for further opportunities for teacher professional learning
involving NEA.
I probably found this to be where I gained the most knowledge. The children
having the most difficulty with maths problems were usually the ones with
poor literacy skills and using the analysis pinpointed exactly where the
problem was. I have used the prompts regularly as a teaching strategy and
introduced it to other staff members who are also incorporating it in their
classrooms (White, 2009, p. 51).
In the 2007 evaluation document it was reported that there was a divide
between the responses involving primary and secondary teachers. NEA
appeared to resonate more easily with primary teachers and with the issues
of ‘numeracy across the curriculum’ and ‘every teacher being a teacher of
literacy’ that are promoted by the NSWDET. Primary teachers were able to
use it to analyse their Basic Skills Test errors (schools receive a report on
their students who sat for the NSW state wide primary school testing
program) and develop strategies to address their students’ literacy needs.
In the secondary school, the resonance was not as high, resulting in some
secondary teachers regarding NEA as an issue that was not their concern.


The 2007 report stated:


A typical comment extract was ‘The inclusion of NEA has been extremely
beneficial in providing teachers with new insights into where and why the
students break down in solving word number problems. The workshops we
have provided have indicated that a number of secondary mathematics teachers
find it difficult to embrace this process’. However it should be emphasised
that this does not represent all secondary teachers, as is evidenced by the
following comment ‘One head teacher has adopted/adapted it to assist senior
students in Stage 6 mathematics’ (White, 2008, p. 12).
However there were comments in the 2008 report involving secondary
teachers that pointed to a deeper entrenched attitude that school facilitators
had to address when implementing the NEA procedure:
Teachers found this interesting and even though they were confronted with
evidence of what they already claim i.e. it is not the maths that is causing the
child difficulty in the question but the literacy, many teachers are not prepared
to tackle teaching reading and understanding the question, or finding
pertinent information within the question (White, 2008, p.39).
The majority accepted the challenge of incorporating literacy issues
within a mathematics lesson and the 2008 evaluation report described how
teachers had extended the use of NEA beyond a diagnostic tool to a
pedagogical and remedial tool.
The five Newman’s prompts were displayed on a poster in the classroom
(see Figure 2) and were referred to in whole class, small group and individual
student interactions. All students were expected to work through the NEA
levels for all mathematical questions. In a whole class setting, sometimes
the teacher selected students who worked aloud in order to scaffold the
learning of those struggling with one of the levels. Sometimes the whole
class would work aloud together through the prompts for a question, with
the teacher interrupting to probe each response in order to assist students
to construct a deeper understanding of the question, and the process for
finding a correct answer. The students would then construct an answer to
the question in their work books.


Figure 2. Classroom poster (NSWDET).

The manner in which teachers used the NEA prompts as a generic
problem solving approach is reflected in the following teacher comments:
The Newman’s error analysis and follow-up strategies have helped students
with their problem-solving skills, and teachers have developed a much more
consistent approach to the teaching of problem-solving. Not only has it raised
awareness of the language demands of problem solving, but through this
systematic approach, teachers can focus on teaching for deeper understanding
(White, 2009, p. 37).
Many primary teachers told of how they had adapted NEA across their
different subjects and different student year stages.
Groups are differentiated to cater for learning abilities. My Y5/6 children all
participate regularly in ability based maths groups within my room. They
analyse their own learning often through learning logs. Children practice
NEA with whole group problem solving at beginning of lessons (not always,
but regularly). Children are doing more maths, but maintaining engagement
for entire hour and 25 mins. Maths lessons are much more dynamic! (White,
2009, p. 47).


To conclude this section, the evaluation reports indicated that the
inclusion of NEA in 2007 was welcomed by teachers and this positive reaction
was also reported by the 2008 teachers involved with the Counting On
program, as the following teacher comment indicates:
This is the best aspect of the programme. I now use the steps as a teaching
strategy for those with difficulties in my classes. Going through the questions
each time helps the students with difficulties at different levels. I have the
questions on a poster in my class. I have also started talking to the English
department about getting some help with certain students (White, 2009, p.
50).

Conclusion
The Counting On program succeeded in improving both teacher and
student learning outcomes through the inclusion of NEA. The data revealed
a statistically and educationally significant improvement in student
learning outcomes on mathematical word problems between the start and
the completion of the Counting On program.
As well, NEA was used by teachers as a remedial classroom strategy
and as a wider classroom pedagogical strategy. Thus this article concludes
that the inclusion of NEA was a powerful classroom diagnostic assessment
and teaching tool for assessing, analysing and catering for students
experiencing difficulties with mathematical word problems.

Acknowledgement
The author wishes to acknowledge the support of the New South Wales
Department of Education and Training, particularly Peter Gould, Chris
Francis, Ray MacArthur and Bernard Tola of the Curriculum Support
Directorate. The opinions expressed in this paper are those of the author
and do not necessarily reflect those of the New South Wales Department of
Education and Training or the above members of the Curriculum Support
Directorate.


References
Askew, M. (2003). Word problems: Cinderellas or wicked witches? In I.
Thompson (Ed.), Enhancing primary mathematics teaching (pp. 78-85).
Berkshire, England: Open University Press.
Beishuizen, M. (1993). Mental strategies and materials or models for addition
and subtraction up to 100 in Dutch second grades. Journal for Research
in Mathematics Education, 24(4), 294-323.
Carpenter, T. P., & Lehrer, R. (1999). Teaching and Learning Mathematics
with Understanding. In E. Fennema & T. A. Romberg (Eds). Mathematics
Classrooms That Promote Understanding (pp. 19-32). Mahwah, N.J.:
Lawrence Erlbaum Associates.
Casey, D. P. (1978). Failing students: a strategy of error analysis. In P. Costello
(ed.). Aspects of Motivation (pp. 295-306). Melbourne: Mathematical
Association of Victoria.
Clarkson, P. C. (1980). The Newman Error Analysis: Some extensions. In
B. A. Foster (Ed.), Research in Mathematics Education in Australia 1980
(Vol. 1, pp. 11-22). Hobart: Mathematics Education Research Group of
Australia.
Clarkson, P. C. (1983). Types of errors made by Papua New Guinean students,
Report No. 26. Lae: Papua New Guinea University of Technology
Mathematics Education Centre.
Clarkson, P. C. (1991). Language comprehension errors: A further
investigation. Mathematics Education Research Journal 3(2), 24-33.
Clements, M. A. (1980). Analysing children’s errors on written mathematical
tasks. Educational Studies in Mathematics, 11 (1), 1-21.
Clements, M. A. (1982). Careless errors made by sixth-grade children on
written mathematical tasks. Journal for Research in Mathematics
Education, 13(2), 136-144.
Clements, M. A. (1984). Language factors in school mathematics. In P.
Costello, S. Ferguson, K. Slinn, M. Stephens, D. Trembath, & D. Williams
(Eds.), Facets of Australian Mathematics Education (pp. 137-148).
Adelaide, South Australia: Australian Association of Mathematics
Teachers.

Clements, M. A. (2004). Analysing errors made by pupils on written
mathematics tasks. Sultan Hassanal Bolkiah Institute of Education,
Universiti Brunei Darussalam.
Clements, M. A., & Ellerton, N. F. (1992). Overemphasising process skills in
school mathematics: Newman analysis data from five countries. In W.
Geeslin & K. Graham (Eds.), Proceedings of the Sixteenth International
Conference on the Psychology of Mathematics Education (Vol. 1, pp.
145-152). Durham, New Hampshire: International Group for the
Psychology of Mathematics Education.
Clements, M. A., & Ellerton, N. F. (1993). The centrality of language factors
in mathematics teaching and learning. Paper presented at the
International Seminar on the Mathematical Sciences, MARA Institute,
Kuching, Sarawak.
Clements, M. A., & Ellerton, N. F. (1995). Short-answer and multiple-choice
pencil-and-paper tests: Still useful for school mathematics in the 21st
century? Journal of Science and Mathematics Education in Southeast Asia,
18(2), 10-23.
Cobb, P. & Wheatley, G. (1988). Children’s initial understandings of ten.
Focus on Learning Problems in Mathematics, 10(3), 1-28.
Doig, B., Groves, S., Tytler, R., & Gough, A. (2005). Primary and secondary
mathematics practice: How different is it? In P. Clarkson, A. Downton,
D. Gronn, M. Horne, A. McDonough, R. Pierce, & A. Roche (Eds.),
Building connections: Research, theory and practice (Proceedings of the
28th Annual Conference of the Mathematics Education Research Group
of Australasia, RMIT, Melbourne, pp. 305-312). Sydney: MERGA Inc.
Ellerton, N. F., & Clarkson, P.C. (1996). Language factors in mathematics
teaching and learning. In A. J. Bishop, M. A. Clements, C. Keitel, J.
Kilpatrick, & C. Laborde (Eds.), International handbook of mathematics
education (Part 2, pp. 987-1033). Dordrecht, The Netherlands: Kluwer
Academic Publishers.
Ellerton, N. F., & Clements, M. A. (1991). Mathematics in language: A review
of language factors in mathematics learning. Australia: Deakin University
Press.

Ellerton, N. F., & Clements, M. A. (1996). Newman error analysis: A
comparative study involving Year 7 students in Malaysia and Australia.
In P. C. Clarkson (Ed.), Technology in mathematics education (pp. 186-
193). Melbourne: Mathematics Education Research Group of Australasia.
Ellerton, N. F., & Clements, M. A. (1997). Pencil and paper tests under the
microscope. In F. Biddulph & K. Carr (Eds.), People in mathematics
education (pp. 155-162). Waikato, NZ: Mathematics Education Research
Group of Australasia.
Ellerton, N. F. & Olson, J. (2005). The assessment dilemma: Correct answers
with no understanding and incorrect answers with some understanding.
In H. S. Dhindsa, I. J. Kyeleve, O. Chukwu, & J.S.H.Q. Perera (Eds.), Future
directions in science, mathematics and technical education, (Proceedings
of the Tenth International Conference, pp. 226-235). Brunei: Universiti
Brunei Darussalam.
Faulkner, R. (1992). Research on the number and type of calculation errors
made by registered nurses in a major Melbourne teaching hospital.
Unpublished M.Ed. research paper, Deakin University, Victoria,
Australia.
Gervasoni, A., Hadden, T., & Turkenburg, K. (2007). Exploring the number
knowledge of children to inform the development of a professional
learning plan for teachers in the Ballarat diocese as a means of building
community capacity. In J. Watson & K. Beswick (Eds.), Mathematics:
Essential research, essential practice, (Proceedings of the 30th annual
conference of the Mathematics Education Research Group of Australasia,
Vol. 1, pp. 305-314). Adelaide: MERGA Inc.
Jiminez, E. C. (1992). A cross-lingual study of Grade 3 and Grade 5 Filipino
children’s processing of mathematical word problems. Penang: SEAMEO-
RECSAM.
Jones, G. A., Thornton, C. A., Putt, I. J., Hill, K. M., Mogill, T. A., Rich, B. S.,
& van Zoest, L. R. (1996). Multidigit number sense: A framework for
instruction and assessment. Journal for Research in Mathematics
Education, 27(3), 310-336.
Kaur, B. (1995). A window to the problem solvers’ difficulties. In A. Richards
(Ed.), Forging Links and Integrating Resources (pp. 228-234). Darwin:
Australian Association of Mathematics Teachers.

Kaushil, L. D., Sajjin Singh, & Clements, M. A. (1985). Language factors
influencing the learning of mathematics in an English-medium school
in Delhi. Delhi: State Institute of Education (Roop Nagar).
Marinas, B., & Clements, M. A. (1990). Understanding the problem: A
prerequisite to problem solving in mathematics. Journal of Science and
Mathematics Education in South East Asia, 13(1), 14-20.
Mohidin, R. (1991). An investigation into the difficulties faced by the
students of Form 4 SMJA Secondary School in transforming short
mathematics problems into algebraic form. Penang: SEAMEO-
RECSAM.
Mulligan, J. (1999). Evaluation of the pilot Counting On Year 7 numeracy
project. Sydney: NSW Department of Education and Training.
Newman, M. A. (1977). An analysis of sixth-grade pupils’ errors on written
mathematical tasks. Victorian Institute for Educational Research Bulletin,
39, 31-43.
Newman, M. A. (1983). Strategies for diagnosis and remediation. Sydney:
Harcourt, Brace Jovanovich.
Perry, B., & Howard, P. (2000). Evaluation of the impact of the Counting On
program: Final report. Sydney: NSW Department of Education and
Training.
Perry, B., & Howard, P. (2002). Evaluation of the impact of the Counting On
program during 2001: Final report. Sydney: NSW Department of
Education and Training.
Perry, B., & Howard, P. (2003). Evaluation of the impact of the Counting On
program 2002: Final report. Sydney: NSW Department of Education and
Training.
Singhatat, N. (1991). Analysis of Mathematics Errors of Lower Secondary
Pupils in Solving Word Problems. Penang: SEAMEO-RECSAM.
Sulaiman, S., & Remorin, P. R. (Eds.). (1993). Science- and mathematics-
related project work abstracts of SEAMEO-RECSAM participants.
Penang: SEAMEO-RECSAM.
Thomas, N. (1999). Levels of conceptual development in place value. The
pilot Counting On numeracy project. Sydney: NSW Department of
Education and Training.

Thongtawat, N. (1992). Comparing the effectiveness of multiple-choice and
short-answer paper-and-pencil tests. Penang: SEAMEO-RECSAM.
Tuck, S. (1983). An investigation of the effectiveness of the Newman
Language Mathematics Kit. Unpublished M.Ed. thesis, Monash University.
Vaiyatvutjamai, P., & Clements, M. A. (2004). Analysing errors made by
middle-school students on six linear inequations tasks. In I. P. A. Cheong,
H. S. Dhindsa, I. J. Kyeleve, & O. Chukwu (Eds.), Globalisation trends in
science, mathematics and technical education 2004 (Proceedings of the
Ninth International Conference of the Department of Science and
Mathematics Education, Universiti Brunei Darussalam, pp. 173-182).
Brunei: Universiti Brunei Darussalam.
Watson, I. (1980). Investigating errors of beginning mathematicians.
Educational Studies in Mathematics, 11(3), 319-329.
White, A. L. (2005). Active Mathematics In Classrooms: Finding Out Why
Children Make Mistakes - And Then Doing Something To Help Them.
Square One, 15(4), 15-19.
White, A. L. (2008). Counting On: Evaluation of the impact of Counting On
2007 program. Sydney: Curriculum K-12 Directorate, Department of
Education and Training.
White, A. L. (2009). Counting On 2008: Final report. Sydney: Curriculum
K-12 Directorate, Department of Education and Training.
White, A. L., & Clements, M. A. (2005). Energising upper-primary
mathematics classrooms in Brunei Darussalam: The Active Mathematics
In Classrooms (AMIC) Project. In H. S. Dhindsa, I. J. Kyeleve, O. Chukwu,
& J. S. H. Q. Perera (Eds.), Future directions in science, mathematics and
technical education (Proceedings of the Tenth International Conference,
pp. 151-160). Brunei: Universiti Brunei Darussalam.
White, A. L., & Lim, C. S. (2008). Lesson study in Asia Pacific classrooms:
Local responses to a global movement. ZDM: The International Journal
on Mathematics Education, 40(6), 915-925.
White, A. L., & Southwell, B. (2003). Lesson Study: A model of professional
development for teachers of mathematics in years 7 to 12. In L. Bragg, C.
Campbell, G. Herbert, & J. Mousley (Eds.). Mathematics education
research: Innovation, networking, opportunity: (Proceedings of the 26th
Annual Conference of the Mathematics Education Research Group of
Australasia, Vol. 2, pp. 744-751). Melbourne: Deakin University.

Wright, R. J. (1998). An overview of a research-based framework for assessing
and teaching early number. In C. Kanes, M. Goos, & E. Warren (Eds.),
Teaching mathematics in new times (pp. 701-708). Brisbane: Mathematics
Education Research Group of Australasia.
Wright, R. J., Martland, J. R., & Stafford, A. (2000). Early numeracy:
Assessment for teaching and intervention. London: Sage/Paul Chapman
Publications.

Author:
Allan Leslie White; University of Western Sydney;
e-mail: al.white@uws.edu.au
