
GCE

Computing
COMP4 The Computing Practical Project
Report on the Examination
2510
Summer 2015
Version: 1.0

Further copies of this Report are available from aqa.org.uk


Copyright 2015 AQA and its licensors. All rights reserved.
AQA retains the copyright on all its publications. However, registered schools/colleges for AQA are permitted to copy material from this
booklet for their own internal use, with the following important exception: AQA cannot give permission to schools/colleges to photocopy any
material that is acknowledged to a third party even for internal use within the centre.

REPORT ON THE EXAMINATION GCE COMPUTING COMP4 Summer 2015

Introduction
Schools and colleges that entered students in 2015 should read this report in conjunction with the
specific feedback sent to them on the publication of results. The comments below highlight the
observations of Senior Moderators during this year's exam and should be used with the subject
specification, assessment criteria and the COMP4 Advice and Information Booklet for Teachers'
Standardising to ensure that school/college assessment is accurate. For the 2015 exam, the latter
can be found in the Secure Key Materials section of e-AQA together with exemplar projects and
commentaries. For the 2016 exam, similar information will appear on the Teacher Online
Standardising (T-OLS) website which will be available via e-AQA. It is essential that new schools
and those who have had mark adjustments complete the T-OLS exercise.
Please also use the coursework adviser service, in particular for advice on the potential
complexity of projects. Your adviser can be contacted by emailing aqaca1@yahoo.co.uk.
There were no changes to the total mark or assessment criteria compared to last year. In most
cases, centres that had marks adjusted had incorrectly assessed the level of complexity, or had
failed to adjust the marking band used when a potentially complex problem had a solution that was
not complex. The procedure to follow in these cases is described in section 6, 'Reassessing
Complexity', of the document referenced below. This was particularly the case when candidate
record forms had the Concluding Comments part of Section B left blank or incomplete. The form
specifically states that the level of complexity needs to be stated here and justified with reference
to the document COMP4 Definition of problem types for projects. Comments relating to student
ability and/or effort, whilst helpful to students, are irrelevant to moderators. Moderation is
concerned only with the accuracy of assessment of student achievement.
Administration
Many centres complete all the administration requirements fully and submit excellent assessment
comments with the projects, which is very helpful to the moderation process. There is a high
correlation between centres that complete all the administration correctly and centres that have
their marks accepted.
A significant number of centres submitted work after the 15 May deadline. However, as there were
several late changes to the moderation team, it may be that projects were sent to the moderator
first allocated to the centre and there was a delay in forwarding them to the replacement
moderator. If this is the case, AQA sincerely apologises if your feedback form carries a negative
comment regarding late arrival; all moderators were asked to check carefully to avoid this type of
comment. A reminder that 15 May is the date always used, and it is the date by which the
moderator should receive the projects and the fully completed, carefully checked paperwork.
It is also noted that too many centres persist in sending projects other than by first-class post,
contrary to the instructions to examinations officers from AQA. Moderators are not at home all day
and cannot be expected to go to a distribution centre to collect projects. This risks the projects
being returned to centres, with a possible delay in the publication of results.
Some centres still send work in bulky ring binders, on paper other than A4, or as loose sheets in
wallet files. Please assist moderation by sending projects printed only on A4 paper and bound with
treasury tags so that each project can be read as a book.


All contact between AQA and centres is through the centre's examinations officer, who should
check that the paperwork sent with the projects is complete. This is particularly important if the
person responsible for assessment is performing this task for the first time. A significant number of
exams officers failed to ensure that the instructions given in the document Instructions for
submitting coursework/controlled assessment marks and samples (CAW/INST) were followed
exactly. This document is sent to centres and is available on the AQA website. Failure to follow it
causes problems with moderation, and centres that do not follow these instructions risk having
their results delayed. The most frequent errors were:
1. Failure to submit a CDS with the coursework sent to the moderator.
2. Failure to add section marks correctly on CRFs.
3. Failure to check that the total mark on the CRF is correctly copied to the CMF or EDI entry.
4. Failure to ensure candidate authentication signatures are present on the CRFs.
5. Failure to complete fully the CRF section entitled Concluding Comments with the level of
   complexity used when assessing the projects, and failure to justify this with reference to the
   document COMP4 Definition of problem types for projects. This is available at
   http://filestore.aqa.org.uk/subjects/AQA-2510-W-TRB-COMP4DPTP.PDF

The complete subject specification is at
http://filestore.aqa.org.uk/subjects/specifications/alevel/AQA-2510-W-SP-14.PDF, with the detailed
assessment criteria for COMP4 on pages 18 to 33.
Specific guidance with regard to coursework is at
http://www.aqa.org.uk/subjects/ict-and-computer-science/a-level/computing-2510/coursework
With regard to the content of the actual projects, some centres labelled a project as of adequate
complexity but then gave a mark from the top mark band. The assessment criteria give detailed
comments as to what is required in each mark band for each level of complexity. Some of the
more frequently occurring problems are reported section by section in the following paragraphs.
Analysis
It would appear some students are writing a program and then finding a user with a problem that
their program can solve. It is very unlikely that the assessment criteria can be addressed unless
the student first establishes a real user need, so that they are aware of the problem that needs to
be solved. Particular weaknesses are a lack of formal methods, a lack of analysis data
dictionaries, and data flow diagrams that do not conform to the convention: they are supposed to
show what happens to data as it passes through the system from source to destination. An
analysis that does not fully address all the relevant assessment criteria cannot be rewarded with a
high mark from the mark band appropriate to the complexity of the problem being solved.
Frequently, objectives are either not clear and/or not SMART.
Design
In almost every case, students who were able to show the processing they intended to code with
algorithms/pseudocode were those that gained a high mark overall. It remains disappointing that
the programming skills required for COMP1 are often not built upon for COMP4. Most students
gave clear details of data requirements, and some normalised data sets accurately if their project
was data handling. HCI designs are frequently produced post-implementation, with no evidence
that they are prototypes. If such screens have evidence of dialogue with the end user then this is
good practice.
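By way of a purely hypothetical illustration (no particular candidate's project or language is implied): a student planning a data-handling project might set out the intended processing as pseudocode at the design stage and then implement it so that the finished code can be traced back to the design.

```python
# Design-stage pseudocode (intended processing), recorded before coding:
#   FOR each record in the input data
#     IF the record passes validation THEN include its value
#   average <- sum of included values / count of included values
#
# Implementation that follows the design directly.
def average_of_valid(records, is_valid):
    """Average the values of records that pass the supplied validation check."""
    valid = [r for r in records if is_valid(r)]
    if not valid:
        raise ValueError("no valid records to average")
    return sum(valid) / len(valid)
```

Because the implementation mirrors the pseudocode step for step, the same algorithm can serve in both the Design and System Maintenance sections.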


Technical Solution
There were some exceptionally high levels of coding skill shown by some high-scoring candidates.
The assessment criteria describe how to award the marks based on complexity. However, some
centres frequently gave a mark based on successfully achieving all or most objectives when the
candidate's own appraisal indicated that few of the complex objectives aimed for were actually
achieved. Centres must also be careful not to give credit for imported modules of code when
using an application builder or the Raspberry Pi.
Testing
Although this section is not linked to complexity, it is not possible to score top marks without
boundary testing. If boundary testing is not applicable, the student needs to state this clearly and
justify why. Too many students give details of multiple logon and navigation tests only. Candidates
need to concentrate on testing the code they have written to process/transform data. Most test
evidence was cross-referenced to a test plan, but a significant weakness was poor annotation of
test results, as was the cramming of multiple screenshots onto the same page, making it very
difficult to see the results of tests. The advice is no more than two screenshots per A4 page.
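As a hypothetical sketch of what boundary testing means in practice (the function and range here are invented for illustration): for a routine that accepts only whole-number marks from 0 to 100, candidates should test values at, just inside and just outside each limit, not merely typical mid-range values.

```python
def mark_is_valid(mark):
    """Accept only whole-number marks in the inclusive range 0 to 100."""
    return isinstance(mark, int) and 0 <= mark <= 100

# Boundary tests: each limit, plus the values immediately either side of it.
boundary_cases = {
    -1: False,   # just below the lower boundary
    0: True,     # lower boundary itself
    1: True,     # just above the lower boundary
    99: True,    # just below the upper boundary
    100: True,   # upper boundary itself
    101: False,  # just above the upper boundary
}

for value, expected in boundary_cases.items():
    assert mark_is_valid(value) == expected, value
```

Tests of this kind, cross-referenced to the test plan and annotated, exercise the candidate's own processing code rather than navigation or logon screens.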
System Maintenance (SM)
Only a very few high-scoring candidates fully addressed all the assessment criteria for this section.
Most included the code, and some good candidates indicated the more complex parts they had
written. This is very good practice. Algorithms were a problem here, as in design. If the
algorithms in design indicate all the processing and are fully implemented by the code, it is
permissible to reference the design algorithms in this section. However, credit cannot be given for
referencing SM algorithms in design. Design algorithms indicate the processing intended, whereas
SM algorithms state the processing actually done by the finished program. If they are the same,
then simply copy them so that they appear in both design and SM.
User Manual (UM) including QWC
Although it is perfectly acceptable to include this within a single project report document, it should
be possible to print it as a stand-alone document, hence the need for page numbers. Error handling
was often poor, which restricts the mark that can be awarded. There is still evidence that a few
centres are using an old subject specification, in that QWC was given a separate mark. It is
essential that the current documents are used, as the mark range available differs depending on
the complexity of the solution achieved. The assessment criteria clearly describe how to mark
the UM and what to do if QWC does not meet the criteria listed.
Appraisal
The assessment criteria make it quite clear that the mark for evaluating how well the objectives set
out in the analysis were met is related to complexity. This means 3 marks are only applicable to
complex projects with a fully implemented complex solution. A yes/no or tick-box approach to the
achievement of objectives is not valid. User feedback must be authenticated by the assessor to
earn a mark. Only if this user feedback is fully analysed by the student, and then used as the basis
for improvements, can additional mark(s) be awarded. If the student lists improvements without
first analysing user feedback, then this cannot be awarded a mark. Obviously, without user
feedback there cannot be marks awarded for improvements. This is explained clearly in the
assessment criteria.


Mark Ranges and Award of Grades


Grade boundaries and cumulative percentage grades are available on the Results Statistics
page of the AQA website.

Converting Marks into UMS marks
Convert raw marks into Uniform Mark Scale (UMS) marks by using the link below.
UMS conversion calculator: www.aqa.org.uk/umsconversion
