Version: v0.2
Date: 30th October 2009
Table of Contents
1. BACKGROUND INFORMATION
2. PROBLEM DEFINITION
   2.1 Objectives
   2.2 Exclusions
3. CURRENT PROCESSES
   3.1 Collection of entry information
   3.2 Mark sheets distributed to markers
   3.3 Marking performed by markers
   3.4 Collecting marks from markers
   3.5 Inputting marks from markers
   3.6 Identifying re-reads and collecting re-read marks
   3.7 Calculating overall performance
   3.8 Reporting for the Examination Board (pre-board and final board)
   3.9 Creating statistical reports
   3.10 Reporting results (marks) to the Academic Records Office
   3.11 Uploading into OSS
   3.12 Publishing
   3.13 Creating College reports
4. CONSTRAINTS
5. ASSUMPTIONS
6. SUCCESS CRITERIA
7. PROPOSAL
   7.1 Description
   7.2 Benefits
   7.3 Costs
8. ESTIMATED PROJECT COSTS AND TIMELINE
   8.1 Project Costs
   8.2 Timeline
Proposal for an Assessments Management System
1. BACKGROUND INFORMATION
This project comprises two major phases: a requirements-gathering and feasibility phase, followed
by a development and implementation phase. This proposal covers the first phase of work, which
will inform the direction and format of the second phase. A subsequent proposal will be submitted
with greater detail about the development and implementation phase.
Within the University, it is estimated that there are hundreds of different ‘assessment management
systems’. Each Department/Faculty has created a system specific to its requirements, with
responsibility for maintaining the system residing with the Department/Faculty. The term ‘assessment
management system’ refers to an electronic system that stores details of assessments and their
corresponding results. Assessment management systems range from sophisticated database
programs to simple Excel workbooks. A Department/Faculty will use an assessment management
system to manage marks for coursework, examinations (to question-level detail), and formal
submissions such as a dissertation. The system usually calculates an overall result, which for
completing students is a classification. An investigation in 2007 confirmed that 80% of assessment
management systems are spreadsheet-based; the remaining 20% are databases.
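As a purely illustrative sketch of the kind of calculation these systems perform, the fragment below maps a set of paper marks to a classification. The averaging rule and the class boundaries (70/60/50/40) are conventional UK values assumed for the example only; they are not the University's actual examination conventions, which vary by programme.

```python
# Illustrative sketch only: the flat average and the 70/60/50/40 boundaries
# are assumptions for this example, not the University's actual rules.
def classify(paper_marks):
    """Map a list of paper marks (0-100) to a degree classification."""
    avg = sum(paper_marks) / len(paper_marks)
    if avg >= 70:
        return "First"
    if avg >= 60:
        return "Upper Second"
    if avg >= 50:
        return "Lower Second"
    if avg >= 40:
        return "Third"
    return "Fail"
```

In practice each Department/Faculty layers additional rules on top of any such core calculation (weighting, rounding conventions, borderline rules), which is precisely the diversity this proposal seeks to accommodate.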
The only centrally provided system is ‘Mark-it’. This Access database was developed in 2001 in
response to the Humanities Division’s strategic decision to standardise examination regulations, i.e.
the way in which awards are calculated across programmes offered by the division. A single system,
with common rules (at that particular point in time), was implemented in the following
Departments/Faculties: History; English; Economics and Management (Final Honour School only);
Politics, Philosophy and Economics (Final Honour School only); Theology; Classics; Modern
Languages (Final Honour School only). Mark-it was developed and is supported by the Applications
Support team (Business Services and Projects) at a cost of approximately £60k per annum.
Although the core programming of Mark-it is the same for each Department/Faculty, the user
interface and specific business rules differ in each instance (there are currently 21 instances). This
diversity of requirements is complicated and resource-intensive to maintain.
At Oxford, Colleges play a key role, not only in student welfare and support, but also in teaching
and assessment. College Collections are formal assessments held in College, usually at the start of
each term, for all undergraduate students. The marks for College Collections are not collected
centrally but are circulated within the College. Colleges have created and maintain their own
student records systems, most of which store assessment-related information. Colleges report
performance statistics, write letters of reference, and (until 2011) produce academic transcripts for
their students.
Each Department/Faculty and College faces the challenge of supporting and maintaining its own
systems. Knowledge of such a bespoke system is often retained by a single point of contact, most
commonly an IT support staff member or, in some cases, an academic. Administrators often lack
the access needed to carry out basic updates to the system, or such updates require specialist
skills. These factors, and others not detailed in this paper, present a tangible risk of disruption to a
critical process.
The central student records system (OSS) is the source of results information for students,
administrative staff, and the academic community. Because departmental systems are not
integrated with OSS, results data must largely be extracted manually from departmental systems
and then uploaded into OSS. This duplicates effort, requires another stage of rigorous checking,
and strains resources at a time of year when they are already under severe pressure.
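The rigorous checking stage mentioned above can be sketched as a simple pre-upload validation pass. The fragment below is illustrative only: the column names (candidate, paper, mark) and the 0–100 mark range are assumptions for the example, not the actual OSS import format, which is not described in this document.

```python
# Illustrative only: validates a hypothetical marks CSV before a manual upload.
# The column names and mark range are assumptions, not the real OSS format.
import csv
import io


def validate_marks(csv_text):
    """Return (valid_rows, errors) for a file with columns candidate,paper,mark."""
    rows, errors = [], []
    reader = csv.DictReader(io.StringIO(csv_text))
    for lineno, row in enumerate(reader, start=2):  # line 1 is the header
        try:
            mark = int(row["mark"])
        except (KeyError, TypeError, ValueError):
            errors.append(f"line {lineno}: mark missing or not a number")
            continue
        if not 0 <= mark <= 100:
            errors.append(f"line {lineno}: mark {mark} outside 0-100")
            continue
        rows.append((row["candidate"], row["paper"], mark))
    return rows, errors
```

Every Department/Faculty currently performs some equivalent of this check by hand; a centrally provided system with an automated feed into OSS would remove the duplication.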
2. PROBLEM DEFINITION
There is a clear need to address the following issues:
Whilst addressing those key issues, the following questions must also be answered:
2.1 Objectives
The objectives fall into the two phases of this project. For the first phase, which is the main focus of
this proposal, the objectives are as follows:
At this stage, a single objective can be defined for the second phase of this project:
Develop and implement an assessments management system within an FSD environment
2.2 Exclusions
- Only phase one is covered by the proposed work; any development work will be part of phase
  two and will be detailed in a separate proposal
- Colleges will be consulted to ascertain their requirements; however, assessments such as
  Collections and admissions assessments (for example the History Aptitude Test (HAT)) may
  be deemed out of scope after the investigative stages
- The system will manage the process once papers are sat and ready for marking; it will not
  manage papers, timetabling, or any other elements of the process before marking
- Online examination entry should be investigated for feasibility as part of the assessments
  management system, but could be excluded if deemed out of scope
3. CURRENT PROCESSES
The management of assessments is a major process, involving academic and administrative
staff in Departments/Faculties interacting with administrative staff in central administration units,
predominantly the Examination Schools and the Proctors’ Office. The process can be summarised
as a series of distinct events:
3.8 Reporting for the Examination Board (pre-board and final board)
Reports for the Examination Board can be extensive; each Chair (and, to a certain extent, the
external examiners) has specific reporting requirements. Reporting formats differ across the
University, although particular similarities can be identified. The meeting before the final board
meeting discusses any cases where moderation is necessary, and cases of mitigating
circumstances. This meeting decides outcomes that are then ratified by the final board meeting. The
pre-board meeting will comprise the Chair, the administrator, and perhaps one or two other senior
academics. The final board meeting agrees the classification/progression outcome for each student.
The meeting comprises a quorum of examiners, including the Chair, external examiners, the
administrator, and perhaps a representative from central administration. Members of this meeting
sign hard-copy ‘class/pass’ lists for submission to the Academic Records Office. These, along with
an electronic file containing marks for papers/submissions, are part of the process described in 3.10.
3.12 Publishing
When results are uploaded into OSS, they are placed ‘on hold’ for 48 hours so that the
Department/Faculty has an opportunity to review them before they are released to students and
staff. This review process exists purely because there was initially very little confidence in OSS and
in the process of reporting marks to a central system. In reality, the upload into OSS is completely
accurate; however, there have been a few instances of errors in the actual marks submitted to the
Academic Records Office. After 48 hours, results are released to students through Student Self
Service, and to staff through OSS Data Views. There is no automatic notification to students and
staff that particular results have been published; they must check repeatedly, as publication usually
occurs a certain period after the board has sat.
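The 48-hour hold described above amounts to a simple embargo rule. The sketch below is illustrative only: the function and variable names are hypothetical, and the actual OSS release mechanism is not specified in this document.

```python
# Illustrative sketch of the 48-hour hold described above; the real OSS
# release mechanism is not described in this document.
from datetime import datetime, timedelta

HOLD_PERIOD = timedelta(hours=48)


def release_time(uploaded_at):
    """Earliest time at which uploaded results become visible to students."""
    return uploaded_at + HOLD_PERIOD


def is_published(uploaded_at, now):
    """True once the hold period has elapsed and results are visible."""
    return now >= release_time(uploaded_at)
```

A system that computed this release time could, in principle, also drive the automatic notification to students and staff that the current process lacks.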
4. CONSTRAINTS
- The current IT infrastructure of the University
- A culture of control over Department/Faculty processes; not everyone will want to use a
  centrally provided system
- The non-modular structure of the University will influence requirements, but may not become
  a constraint
5. ASSUMPTIONS
- A pilot Department/Faculty can be identified to adopt the assessments management system
- Data flow between the assessments management system and OSS is possible
- A centrally provided assessments management system can work alongside current systems
  and the current OSS results process
- Support for a centrally provided assessments management system can be established as part
  of the services offered by BSP/OUCS
- Training for the centrally provided assessments management system can be offered
6. SUCCESS CRITERIA
- Processes, procedures and requirements are understood and documented, informing the
  development phase for a centrally provided assessments management system
- A flexible solution is identified which fulfils the variety of requirements across the University
- Templates for standardised reporting are defined, with scope for user-controlled reports also
  to be available from the system
- Departments/Faculties that are considering their own new system become candidates to pilot
  the new centrally provided system
- A concise set of requirements is documented and submitted as part of a second phase of
  work: the development and implementation of the system
- Positive attitudes towards a centrally provided system are created as part of extensive
  consultation with Departments/Faculties and Colleges
- Information regarding the selection and implementation of assessments management systems
  in other HEIs is gathered and documented
- How the selection process for an assessments management system is affected by the
  requirement for implementation within an FSD environment is documented and made publicly
  available
- Documentation and guidelines are produced to inform other HEIs of the decision-making
  process and the selection criteria defined when choosing an assessments management
  system to be implemented within an FSD environment
- The cost to implement is estimated at significantly (at least 33%) less than for an ‘ordinary’
  implementation of an assessments management component as part of a student records
  system
- Links are established and valuable contributions made to the OUCS TCO project where
  appropriate
7. PROPOSAL
7.1 Description
In order to meet the objectives of this work, the following major areas of work are required:
7.2 Benefits
7.3 Costs
- The Process and Change Management team will carry out the process review and business
  analysis work
- A two-month process review and documentation, plus costs for meetings (travel, expenses,
  etc.) for liaison with other HEIs and suppliers
8.2 Timeline
Phase one (requirements gathering and feasibility) could be completed by March 2010. Phase two
(development and implementation) could start immediately after the first phase is completed. The
pilot project could start in the Academic Year 2010/11, with further rollout in the subsequent
academic years. The JISC FSD project runs until March 2011.