
Mathematical Studies

2016 Chief Assessor's Report



Overview
Chief Assessors' reports give an overview of how students performed in their school
and external assessments in relation to the learning requirements, assessment
design criteria, and performance standards set out in the relevant subject outline.
They provide information and advice regarding the assessment types, the application
of the performance standards in school and external assessments, the quality of
student performance, and any relevant statistical information.

2016 was the last year of teaching Mathematical Studies at Stage 2. From 2017 it is
replaced by Mathematical Methods.

School Assessment

Assessment Type 1: Skills and Applications Tasks


The more successful responses
- Were produced when students were able to access complex questions involving
  conjecture and proof, mathematical modelling, and interpretation.
- Were obtained when tasks contained a balance of routine and complex
  questions, without being excessively long or encompassing too great a breadth
  of content.
- Demonstrated skills in conjecture and proof.
- Provided evidence of student learning against the performance standards using
  only content prescribed in the current subject outline.

The less successful responses


- Resulted when tasks had only complex questions and hence some students
  were unable to get started.
- Resulted when students were assessed on content that deviated from the
  requirements of the subject outline.

General information
Teachers continue to use marks as a means of assessing student work. While this is
reasonable at a school level, it is important that the final grade submitted to the
SACE Board reflects how the student has achieved against the performance
standards. A number of schools this year failed to supply marks or performance
standards in support of their decisions. Moderators look to confirm teachers'
decisions, and it is far more helpful when annotations are given to indicate whether
work is correct or incorrect.

Mathematical Studies 2016 Chief Assessors Report Page 2 of 9


Assessment Type 2: Folio
The more successful responses
- Were elicited when at least one of the folio tasks was open-ended, allowing a
  variety of responses; teachers should aim to use tasks that allow students to
  apply and extend their new knowledge.
- Used the report format specified in the subject outline; while many students still
  submit work that is handwritten, this does not preclude them from using the
  report format.

The less successful responses


- Were produced when both folio tasks were heavily directed, restricting students
  from demonstrating evidence of their learning at the A level due to the routine
  mathematical skills involved.
- Resulted when schools chose to present three or four folio tasks; while doing this
  conforms to the subject outline and was probably done to break the tasks into
  more manageable pieces, the tasks were reduced in complexity to make up for
  the increase in number and hence did not show high-level evidence.
- Were the result of directed tasks that had not been developed by the teacher
  but had been obtained from purchased resources; while these resources provide
  a good starting point, teachers should always look to adapt them so that they
  become open and individualised.
- Relied on presenting the work well, but with insufficient content; work that is
  typed and presented well does not automatically lead to a result in the upper
  grade bands, and it is important that effort alone is not rewarded.

External Assessment

Assessment Type 3: Examination


Question 1

This question provided an opportunity for students to demonstrate their routine
calculus skills, with 58% of them earning full marks.

The more successful responses


Applied the chain rule consistently.

The less successful responses


Did not recognise the need to use the product rule.

Question 2

This question included routine matrix calculations, helping 90% of students earn 3 or
more marks, along with a conjecture and a proof that were generally handled well,
with over 60% of students earning 6 or more marks out of 9 for the entire question.

The more successful responses


- Used technology to evaluate the determinants.
- Dealt with the conditions for the existence of an inverse with confidence.



The less successful responses
- Chose to calculate the three 3 × 3 determinants by hand.
- Attempted to reverse-engineer their conjecture based on the proof.
- Did not simplify their determinant expression to show the perfect square
  structure.

Question 3

The range of statistics skills assessed proved challenging for some, with a third of
students earning 3 or fewer marks out of 8, but was handled well by many, with a third
of students earning 7 or 8 marks.

The more successful responses


- Used the z-score formula with confidence.
- Showed a good understanding of the relationship between standard deviation
  and the proportion given.

The less successful responses


- Did not label their bell curve with the information about the proportion above
  11.5 grams.
- Were unable to find the required z-score, and used the probability of 0.11 in its
  stead.
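For reference, the z-score calculation at the centre of this question follows the
standard form (generic symbols; the actual values from the paper are not
reproduced here):

```latex
z = \frac{x - \mu}{\sigma}
```

Finding an unknown standard deviation from a given proportion requires reading the
z-value corresponding to that proportion from tables or technology and rearranging
to \(\sigma = (x - \mu)/z\).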

Question 4

The relative familiarity of the calculation of a derivative value from first principles
helped two-thirds of students gain half marks or better and over a quarter achieve 9
or 10 out of 10. Issues around notation and a lack of knowledge about average rate
of change led to many of the lower levels of achievement.

The more successful responses


- Observed that the function began at x = 0 and handled the asymptote at x = 4
  correctly.
- Were confident with the first principles structure, including its notation.

The less successful responses


- Seemed unfamiliar with the average rate of change as an important
  mathematical concept in a calculus course. Some confused it with the process of
  finding the average of function or derivative values.
- Struggled to use correct notation to describe intervals.
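The two ideas this question contrasted can be stated generically (illustrative
notation, not the actual function from the paper): the derivative at a point from
first principles, and the average rate of change over an interval, which is a single
quotient rather than an average of function or derivative values.

```latex
f'(a) = \lim_{h \to 0} \frac{f(a+h) - f(a)}{h}
\qquad\text{and}\qquad
\frac{f(b) - f(a)}{b - a}
\ \text{(average rate of change on } [a, b]\text{)}.
```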

Question 5

This question presented graphical information that had to be carefully read for
meaning. This proved to be a barrier for many students, who made little attempt at the
rest of the question. Over a third of the students earned the mark for part (a)(i) only.
For those who persisted with their solution, this question proved to be an excellent
opportunity to differentiate themselves from other students, with 18% earning full
marks.

The more successful responses


- Were able to connect the graphical information provided with the algebraic
  representations of the function and its derivative.
- Provided working that was sufficient to earn part marks when errors were made.



The less successful responses
- Struggled to handle information provided in graphical form.
- Lacked resilience when their solution did not unfold at the first attempt.

Question 6

Student responses confirmed that the procedure being examined was familiar to the
vast majority, with errors, when they occurred, being in the execution, in particular,
the handling of exact values. Nearly 90% of students earned 2 or more marks out of
6 and over a third earned full marks.

The more successful responses


Worked with exact values successfully.

The less successful responses


Used decimal approximations in their determination of the equation of a tangent.

Question 7

Most students were familiar, and to some degree comfortable, with what was being
asked in this question, with over 70% of students earning better than half marks. Half
of these students earned 4 out of 5, where the mark lost was for the consistent error
of equating the definite integral to the area, despite its location with respect to the
x-axis.

The more successful responses


- Recognised that the definite integral under consideration was equal to the
  negative of the area provided.
- Included the constant of integration in part (a).

The less successful responses


Showed the common misunderstanding that definite integrals are areas.
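The distinction at issue can be stated generically: when a curve lies below the
x-axis on an interval, the definite integral is the negative of the enclosed area,
so that

```latex
\int_a^b f(x)\,\mathrm{d}x = -A,
\quad\text{where } f(x) \le 0 \text{ on } [a, b]
\text{ and } A \text{ is the area between the curve and the } x\text{-axis}.
```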

Question 8

This question showed that students were more confident with this style of curve-
sketching question, in comparison with previous years. Nearly 80% of students
earned 2 or more marks, and 44% of students earned 5 or 6 out of 6.

The more successful responses


- Used more vertical space when sketching their answer to part (c), which allowed
  them to include the significant features more easily.
- Took care in locating the non-stationary inflection point.

The less successful responses


Did not look carefully enough at the sign diagrams provided and assumed that
the first sign diagram was of the derivative.

Question 9

Students coped well with this familiar question structure, with nearly two-thirds
earning better than half marks. The differentiation between the marks of students
occurred in part (d) when a u-substitution integration involving exact values was
called for, providing increased challenge, with less than 15% of students earning full
marks.



The more successful responses
- Showed a familiarity with the u-substitution method of integration.
- Recognised the need for a method capable of producing an exact answer, as
  was called for in part (d).

The less successful responses


- Failed to give the answer in part (a)(ii) to 3 decimal places.
- Evaluated the definite integral in part (d) using technology, generating a decimal
  approximation rather than the exact answer that was called for.

Question 10

The routine elements of this question meant that 85% of students earned 5 or more
marks out of 13. The more challenging elements of the question, particularly
part (c)(iv), meant that less than 30% earned 9 or more out of 13, and only 3%
earned full marks.

The more successful responses


- Handled the ambiguity of "might be fair or might be weighted" when answering
  part (c)(iii).
- Tackled part (c)(iv) successfully, often using the formula for the width of a
  confidence interval with skill and precision.

The less successful responses


- In part (a)(iii), made errors with the intervals involved, leading to the calculation
  of probabilities other than the one that was required.
- In part (c)(ii), looked for 0.0172 (the answer from part (a)(i)) instead of 0.167
  when interpreting the significance of the confidence interval.
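Assuming the standard large-sample confidence interval for a proportion (the exact
form used in the paper is not reproduced here), the width referred to is

```latex
w = 2\,z^{*}\sqrt{\frac{\hat{p}(1-\hat{p})}{n}},
```

so that a question about the sample size needed for a given width is answered by
rearranging for \(n\).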

Question 11

The context in this question seemed to be a barrier for many students, with a quarter
making no meaningful attempt to answer. Working with an underspecified system
also proved to be challenging for many, with over 60% of students earning less than
half marks. Students who coped well with these challenges were able to differentiate
themselves from others, with over 15% earning 9 or 10 out of 10.

The more successful responses


Engaged well with the context presented, ensuring that their answers made
sense in the context of the question.

The less successful responses


- Did not define their variables as representing quantities, writing things like
  "x = wood".
- Were uncomfortable with an underspecified system of equations and tried to
  generate a third equation.

Question 12

Student response to this question was strong, reflecting a familiarity with the ideas of
conjecture and proof sufficient to cope with the more challenging function structure.
Nearly three-quarters of the students earned 4 or more marks out of 10. The
algebraic dexterity required to complete the proof provided a point of differentiation,
with 14% of students earning full marks.



The more successful responses
Handled the proof in terms of n successfully.

The less successful responses


- Did not draw the vertical asymptote, as called for in part (a)(i).
- Used algebraic techniques to find the stationary point in part (a)(ii).
- Provided more cases when a proof was called for.

Question 13

Students accessed this question successfully, with 70% earning more than half
marks. The technical requirements of a Z-test, along with the associated
interpretation, meant that, while a third of all students earned 9 or more marks out of
11, only 4% earned full marks.

The more successful responses


- Were familiar with the structure and notation expected when performing a Z-test.
- Were comfortable with the correct interpretation of a "do not reject" outcome of a
  hypothesis test.

The less successful responses


- Did not define the null and alternative hypotheses in part (e)(i).
- Made surprising errors in parts (a) and (b).
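The expected structure and notation of a Z-test for a mean can be sketched
generically (symbols illustrative, not the actual values from the paper):

```latex
H_0: \mu = \mu_0, \qquad H_1: \mu \neq \mu_0, \qquad
z = \frac{\bar{x} - \mu_0}{\sigma/\sqrt{n}}.
```

A "do not reject \(H_0\)" outcome means there is insufficient evidence against
\(H_0\) at the chosen level, not that \(H_0\) has been shown to be true.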

Question 14

Some students were put off by the more complex structure of this relation, with a
quarter of them earning 0 or 1 mark. However, those who persevered were able to
access many of the marks, with more than 50% of students earning half marks or
better.

The more successful responses


- Showed their algebraic skills in part (b), including choosing not to expand
  brackets upon differentiating, making rearrangement easier.
- Took care to include bounds when writing down an integral expression for area.

The less successful responses


- Lacked care, particularly in the use of brackets, when substituting 1 into the
  relation.
- Incorrectly interpreted 1 divided by 0 as being equal to zero.

Question 15

As has been the case in recent times, student engagement in more extended
application questions was greatest when involving matrices. Only 6% earned 0 or
1 mark and 80% earned 5 or more marks out of 13. Nonetheless, the need to
interpret mathematical results using precise and detailed language provided the
necessary element of challenge. Part (d) also challenged many students, with many
making no attempt or providing limited evidence to support their answer.

The more successful responses


- In part (b)(i), provided the necessary detail about the origin of probabilities in the
  context of the question, rather than just stating that 0.5 × 0.5 = 0.25.
- In part (c)(ii), considered the entire M^30 matrix and concluded that the probability
  of being on web page D was independent of starting position.

The less successful responses


- Lacked detail when asked to interpret their results.
- Did not attempt part (d) or did not provide sufficient evidence to support their
  answer.

Question 16

This question proved challenging on a conceptual level, working with the idea of rate
rather than quantity, and also on a technical level, with the algebraic skills required in
parts (a) and (e) proving too much for many. A third of students earned 0 or 1 mark
and less than 20% of students were able to earn more than half marks. In spite of
this increased level of challenge, some students were able to distinguish themselves
most meritoriously, with 4% of students earning 15 or 16 out of 16.

The more successful responses


Did not expand the function or its derivative in part (a), making part (e)(i) easier.

The less successful responses


Confused the ideas of rate and quantity, making it hard to access the question.

General information for the examination

Overall, students were successful in accessing what was, on balance, a more
demanding Mathematical Studies examination in comparison with 2015 and 2014.
There was a range of more and less accessible marks throughout the paper from all
mathematical topics. Less successful students showed a lack of resilience when
tackling the less familiar questions with which they were presented. Carelessness
and excessive brevity were also evident in terms of interpretations, rounding, and
meeting stated question requirements. More successful students were able to read
for meaning in contextual questions, persist with questions that they initially found
challenging, and take care to provide detailed evidence to support their answers.
Areas of the curriculum that appeared to be weaknesses for a large proportion of the
students included the concept of an average rate of change and working with linear
equations, particularly working with under-specified systems of equations and
forming equations from written information. Areas of strength included algebraic
differentiation and integration, the calculation of probabilities and confidence
intervals, and working with matrices in context.

Operational Advice
School assessment tasks are set and marked by teachers. Teachers' assessment
decisions are reviewed by moderators. Teacher grades/marks should be evident on
all student school assessment work.

As has been the situation in previous years, a number of schools joined together to
create assessment groups. This is beneficial when there are a small number of
students at a school. Creation of groups is best done at the start of the year so that
arrangements can be put in place to do common assessment tasks, which makes the
moderation process easier and makes the process of looking for evidence against
the performance standards more consistent. It is recommended that the schools in
assessment groups also conduct internal moderation processes to ensure that
marking and assessment against the performance standards is consistent and that
the rank order submitted to the SACE Board is correct. The rank order within an
assessment group is maintained at moderation and, therefore, the process of
conducting an internal moderation is a critical one.

The packaging of student materials is important in assisting the moderators to
confirm results. Clearly labelled materials should be presented in the clear bag;
there is no need to also use a folder. Some teachers present a summary of results at
the front of each student package, which is helpful. Student packages should not
contain assessment plans, solutions, or notes regarding missing work, etc. These
items should be in a teacher package and any student work that deviates from the
assessment plan should be included on the Variations Moderation Materials form.
This form is used to record breaches of rules, missing tasks (marked but lost), tasks
not completed, or special provisions.

There were a number of instances where the summary of results in individual student
packages did not match the result that was entered into Schools Online. The
evidence that moderators will look for will be matched against the results entered
online and therefore it is critical that teachers are careful when submitting their
grades. Additionally, there were a few cases where teachers had allocated a grade of
E where no work was submitted by a student. In the situation where a student
submits no work for an assessment type, the grade that should be allocated is an I
and the Variations Moderation Materials form should reflect this.

Mathematical Studies
Chief Assessor

