Computing
COMP4 The Computing Practical Project
Report on the Examination
2510
Summer 2015
Version: 1.0
Introduction
Schools and colleges that entered students in 2015 should read this report in conjunction with the
specific feedback sent to them on the publication of results. The comments below highlight the
observations of Senior Moderators during this year's exam and should be used with the subject
specification, assessment criteria and the COMP4 Advice and Information Booklet for Teachers'
Standardising to ensure that school/college assessment is accurate. For the 2015 exam, the latter
can be found in the Secure Key Materials section of e-AQA together with exemplar projects and
commentaries. For the 2016 exam, similar information will appear on the Teacher Online
Standardising (T-OLS) website which will be available via e-AQA. It is essential that new schools
and those who have had mark adjustments complete the T-OLS exercise.
Please also use the coursework adviser service, in particular for advice on the potential
complexity of projects. Your adviser can be contacted by emailing aqaca1@yahoo.co.uk.
There were no changes to the total mark or assessment criteria compared to last year. In most
cases, centres that had marks adjusted had either incorrectly assessed the level of complexity or
failed to adjust the marking band used when a potentially complex problem had a solution that was
not complex. The procedure to follow in these cases is described in section 6, Reassessing
Complexity, of the document referenced below. This was particularly the case when candidate
record forms had the concluding comments part of section B blank or incomplete. The form
specifically states that the level of complexity needs to be stated here and justified with reference
to the document COMP4 Definition of problem types for projects. Comments relating to student
ability and/or effort, whilst helpful to students, are irrelevant to moderators. Moderation is only
concerned with the accuracy of assessment of student achievement.
Administration
Many centres complete all the administration requirements fully and submit excellent assessment
comments with the projects, which is very helpful to the moderation process. There is a high
correlation between centres completing all the administration correctly and having their centre
marks accepted.
A significant number of centres submitted work after the 15 May deadline. However, as there were
several late changes to the moderation team, it could be the case that projects were sent to the
moderator first allocated to the centre and there was then a delay in forwarding them to the
adjusted moderator. If this is the case, then AQA sincerely apologises if your feedback form has a
negative comment regarding late arrival. All moderators were asked to check carefully to avoid this
type of comment. A reminder that 15 May is the date always used, and it is the date by which the
moderator should receive the projects and the fully completed, carefully checked paperwork.
It is also noted that too many centres persist in sending projects other than by first-class post,
contrary to the instructions to examinations officers from AQA. Moderators are not at home all day
and cannot be expected to go to a distribution centre to collect projects. This risks the projects
being returned to centres, with a possible delay in the publication of results.
Some centres still send work in bulky ring binders, on other than A4 paper or as loose sheets in
wallet files. Please assist moderation by sending projects printed only on A4 paper and bound by
treasury tags that enable the projects to be read as a book.
All contact between AQA and centres is through the centre's examinations officer, who should
check that the paperwork sent with the projects is complete. This is particularly important if the
person responsible for assessment is performing this task for the first time. A significant number of
exams officers failed to ensure that the instructions given in the document Instructions for
submitting coursework/controlled assessment marks and samples (CAW/INST) were followed
exactly. This document is sent to centres and is available on the AQA website. Failure to follow
these instructions causes problems with moderation, and centres that do not follow them risk
having their results delayed. The most frequent errors were:
1.
2.
3.
4.
5.
Technical Solution
Some high-scoring candidates demonstrated exceptionally high levels of coding skill.
The assessment criteria describe how to award the marks based on complexity. However, some
centres frequently gave a mark based on successfully achieving all or most objectives, even when
the candidate's own appraisal indicated that few of the complex objectives aimed for were actually
achieved. Centres must also be careful not to give credit for imported modules of code when
using an application builder or the Raspberry Pi.
Testing
Although this is not linked to complexity, it is not possible to score top marks without boundary
testing. If boundary testing is not applicable, the student needs to state this clearly and justify why.
Too many students give details of multiple logon and navigation tests only. Candidates need to
concentrate on testing the code they have written to process/transform data. Most test evidence
was cross-referenced to a test plan, but a significant weakness was the poor annotation of test
results, as was placing multiple screenshots on the same page, making it very difficult to see the
results of tests. The advice is no more than two screenshots per A4 page.
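The boundary testing described above can be illustrated with a minimal sketch in Python, using a hypothetical validation routine (the function name and the 0-100 mark range are illustrative assumptions, not taken from any candidate's project): the point is to test values at, just below and just above each limit, rather than only "typical" mid-range values.

```python
def mark_is_valid(mark):
    """Return True if the mark lies in the accepted range 0-100.

    Hypothetical validation routine used only to illustrate
    boundary testing; not from any real project.
    """
    return 0 <= mark <= 100

# Boundary test cases: each limit is probed at the boundary itself
# and one step either side of it.
boundary_cases = {
    -1: False,   # just below the lower boundary
    0: True,     # the lower boundary itself
    1: True,     # just above the lower boundary
    99: True,    # just below the upper boundary
    100: True,   # the upper boundary itself
    101: False,  # just above the upper boundary
}

for value, expected in boundary_cases.items():
    assert mark_is_valid(value) == expected, f"failed at {value}"
```

A test table of this shape, cross-referenced to the test plan and annotated with the actual outcomes, is the kind of evidence the criteria reward.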
System Maintenance (SM)
Only a very few high scoring candidates fully addressed all the assessment criteria for this section.
Most included the code and some good candidates indicated the more complex parts they had
written. This is very good practice. Algorithms were a problem here as well as in design. If the
algorithms in design indicate all the processing and are fully implemented by the code it is
permissible to reference the design algorithms in this section. However, credit cannot be given for
referencing SM algorithms in design. Design algorithms indicate the processing intended, whereas
SM algorithms state the processing actually done by the finished program. If they are the same,
then simply copy them so that they appear in both design and SM.
User Manual (UM) including QWC
Although it is perfectly acceptable to include this within a single project report document, it should
be possible to print it as a stand-alone document, hence the need for page numbers. Error handling
was often poor, which restricts the mark that can be awarded. There is still evidence that a few
centres are using an old subject specification in which QWC was given a separate mark. It is
essential that the current documents are used, as the mark range available differs depending on
the complexity of the solution achieved. The assessment criteria clearly describe how to mark
the UM and what to do if QWC does not meet the criteria listed.
Appraisal
The assessment criteria make it quite clear that the mark for evaluating how well the objectives set
out in the analysis were met is related to complexity. This means 3 marks are only applicable to
complex projects with a fully implemented complex solution. A yes/no or tick-box approach to the
achievement of objectives is not valid. User feedback must be authenticated by the assessor to
earn a mark. Only if this user feedback is fully analysed by the student and then used as the basis
for improvements can additional mark(s) be awarded. If the student lists improvements without first
analysing user feedback, then this cannot be awarded a mark. Obviously, without user feedback
there cannot be marks awarded for improvements. This is explained clearly in the assessment
criteria.