
1.0 Analysis
The topic of the lesson we chose was 'I See Numbers'. The class taught was a Year 3 class of mixed-ability pupils. During the lesson, we taught the pupils about confusing pairs such as thirteen – thirty and fifteen – fifty. During the set induction, we played a song about numbers for the pupils. Then, during the presentation stage, we introduced the confusing pairs and taught both the pronunciation and the spelling of the numbers so that the pupils could differentiate them. During the practice stage, we played a game with the pupils. The class was divided into groups, and sticky notes with numbers written on them were pasted on the whiteboard. A representative was then called from each group. When the teacher said a number, the representatives ran to the front and took the sticky note showing that number. The group that took the correct sticky note earned marks, while the others earned none. Later, during the production stage, the assessment was carried out. Since the class consisted of pupils of mixed abilities, the test results ranged from low to high.

[Bar chart: Test scores of Year 3 pupils — number of pupils in each mark range (0–5, 6–10, 11–15, 16–20 and 21–25 marks)]

There were a total of 13 pupils in the class. About 7 pupils received marks in the range of 21 – 25 marks, 3 obtained 0 – 5 marks, another 3 obtained 6 – 10 marks, and 1 pupil got 14 marks. No pupils obtained marks in the 16 – 20 range.

[Pie chart: Percentage of pupils obtaining marks within each range — 0–5 marks: 21%, 6–10 marks: 21%, 11–15 marks: 7%, 16–20 marks: 0%, 21–25 marks: 51%]
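
To show how the percentages in the chart relate to the pupil counts reported above, here is a minimal sketch in Python. The counts per mark range (3, 3, 1, 0 and 7) are taken from the analysis above; rounding each share to the nearest whole percent is an assumption made for illustration, so the figures may differ slightly from those in the chart.

# Minimal sketch: converting pupil counts per mark range into percentages.
# Assumption: counts are those reported in the analysis above.
counts = {
    "0-5 marks": 3,
    "6-10 marks": 3,
    "11-15 marks": 1,
    "16-20 marks": 0,
    "21-25 marks": 7,
}

total = sum(counts.values())

# Each band's share of the class, rounded to the nearest whole percent.
percentages = {band: round(100 * n / total) for band, n in counts.items()}

print(percentages)
# e.g. {'0-5 marks': 21, '6-10 marks': 21, '11-15 marks': 7, '16-20 marks': 0, '21-25 marks': 50}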

2.0 Strengths of the test


After conducting the test, I found that the test established one of the principles of language assessment, which is validity. According to Brown (2003), the content validity of a classroom test is demonstrated when the assessment requires students to perform tasks that were included in the previous classroom lessons. The test conducted assessed what the pupils had learnt in the preceding presentation and practice stages, which was confusing pairs of numbers. Therefore, the test had content validity. Besides that, the test also achieved face validity, as the pupils encountered tasks constructed in a familiar structure (Brown, 2003). Pupils were familiar with task formats such as spelling out words and reading comprehension, as those formats were frequently used in their activity books.

Besides being relevant to what was taught, the content of the test was also authentic. According to Brown (2003), authenticity is present when the topics are meaningful to the learner. The test was related to the pupils' daily lives since the topic chosen was 'I See Numbers'; therefore, it was meaningful to the pupils. On the other hand, the test also achieved the aspect of reliability that depends on the physical context, as supported by Brown (2003). He stated that test reliability can be achieved by making sure that all students receive the same quality of input. During the lesson, every pupil was given the same opportunity to work on a cleanly photocopied worksheet in an optimal classroom environment. Therefore, the test was reliable.

In terms of practicality, the test was considered practical as the pupils could complete it within the set time of 30 minutes. As Brown (2003) noted, a practical test stays within appropriate time constraints. Furthermore, Brown (2003) also stated that a test that is not excessively expensive is practical. Since the test was carried out entirely with paper and pencil, the cost of the assessment stayed within the budgeted limit. Therefore, the test was practical.

Apart from that, the test assessed the pupils' targeted skills and achieved the objectives set. For example, in the listening section, the pupils' listening skills were assessed: they had to write down the numbers said by the teacher. In the writing section, the pupils had to write down the correct spelling of the given numbers. In the reading section, a short passage was given and the pupils had to answer the questions by skimming and scanning the passage for the answers.

3.0 Weaknesses of the test


Besides recognising the strengths of the test, I also found that reliability could be an issue in some aspects of it. The scoring criteria might contribute to the unreliability of the test, and I found some errors in the questions set in the third section. Two questions did not state clearly what they were asking the pupils for, so the pupils became confused and asked me what the questions meant. For example, one of the questions asked for the price of the 'fruits' when it was actually asking for the price of the 'durians'. The other question did not make clear whether the price of each vegetable or the price of all the vegetables should be written. That question was supposed to have five expected answers, but the pupils gave only three to four. The assessment could remain reliable if the teacher marked the question out of three answers rather than five, meaning full marks would be given to pupils who answered three correctly.

4.0 Conclusion
In sum, the test that was set largely achieved the four principles of language assessment. The assessment carried out successfully helped to determine the pupils' progress in learning as well as the effectiveness of the instruction during teaching and learning.