
Student Systems Developments 2009/10: Appendix 9

JISC FLEXIBLE SERVICES DELIVERY PROGRAMME

PROPOSAL FOR AN ASSESSMENTS MANAGEMENT SYSTEM

Version: v 0.2
Date: 30th October 2009
Author: Victoria Clark
Owner: Student Administration, University of Oxford

Table of Contents
1. BACKGROUND INFORMATION
2. PROBLEM DEFINITION
   2.1 Objectives
   2.2 Exclusions
3. CURRENT PROCESSES
   3.1 Collection of entry information
   3.2 Mark sheets distributed to markers
   3.3 Marking performed by markers
   3.4 Collecting marks from markers
   3.5 Inputting marks from markers
   3.6 Identifying re-reads and collecting re-read marks
   3.7 Calculating overall performance
   3.8 Reporting for the Examination Board (pre-board and final board)
   3.9 Creating statistical reports
   3.10 Reporting results (marks) to the Academic Records Office
   3.11 Uploading into OSS
   3.12 Publishing
   3.13 Creating College reports
   3.14 Creating other reports
4. CONSTRAINTS
5. ASSUMPTIONS
6. SUCCESS CRITERIA
7. PROPOSAL
   7.1 Description
   7.2 Benefits
   7.3 Costs
8. ESTIMATED PROJECT COSTS AND TIMELINE
   8.1 Project Costs
   8.2 Timeline

1. BACKGROUND INFORMATION
This project comprises two major phases: a requirements gathering and feasibility phase, followed by a development and implementation phase. This proposal covers the first phase of work, which will inform the direction and format of the second phase. A subsequent proposal will be submitted with greater detail about the development and implementation phase.

Within the University, it is estimated that there are hundreds of different ‘assessment management systems’. Each Department/Faculty has created a system specific to its requirements, with responsibility for maintaining the system residing with that Department/Faculty. The term ‘assessment management system’ refers to an electronic system that stores details of assessments and their corresponding results. Assessment management systems range from sophisticated database programs to simple Excel workbooks. A Department/Faculty will use an assessment management system to manage marks for coursework, examinations (to question-level detail), and formal submissions such as a dissertation. The system usually calculates an overall result, which for completing students will be a classification. An investigation in 2007 confirmed that 80% of assessment management systems are spreadsheet-based; the remaining 20% are databases.

The only centrally provided system is ‘Mark-it’. This Access database was developed in 2001 in response to the Humanities Division’s strategic decision to standardise examination regulations, that is, the way in which awards are calculated across programmes offered by the division. A single system, with common rules (at that particular point in time), was implemented in the following Departments/Faculties: History; English; Economics and Management (Final Honour School only); Politics, Philosophy and Economics (Final Honour School only); Theology; Classics; Modern Languages (Final Honour School only). Mark-it was developed, and is supported, by the Applications Support team (Business Services and Projects) at a cost of approximately £60k per annum. Although the core programming of Mark-it is the same for each Department/Faculty, the user interface and specific business rules differ in each instance (there are currently 21 instances). This diversity of requirements is complicated and resource-intensive to maintain.

At Oxford, Colleges play a key role, not only in student welfare and support, but also in teaching and assessment. College collections are formal assessments held in College, usually at the start of each term, for all undergraduate students. The marks for College collections are not collected centrally; however, they are circulated within the College. Colleges have created and maintain their own student records systems, most of which store assessment-related information. Colleges report performance statistics, write letters of reference, and (until 2011) produce academic transcripts for their students.

Each Department/Faculty and College faces the challenge of supporting and maintaining its own systems. Knowledge of such a bespoke system is often retained by a single point of contact, most commonly an IT support staff member or, in some cases, an academic. Administrators often lack the access needed to carry out basic updates to the system, or such updates require specialist skills. These factors, and others not detailed in this paper, present a tangible risk of disruption to a critical process.

The central student records system (OSS) is the source of results information for students, administrative staff, and the academic community. Because departmental systems are not integrated with OSS, this data must largely be extracted manually from departmental systems and then uploaded into OSS. This duplicates effort, requires another stage of rigorous checking, and strains resources at a time of year when they are already under severe pressure.

2. PROBLEM DEFINITION
There is a clear need to address the following issues:

- Fragmented assessment management systems across the University
- Gaps in functionality lead to manual manipulation of data and reports
- Varied standards of reporting for examination board purposes
- High level of risk associated with the support and maintenance of systems
- Departments/Faculties are investigating their own new systems
- Data flow between systems and OSS is not efficient or streamlined
- High costs for minimal University-wide benefit, in the case of Mark-it
- Programme (examination) related data is not shared across systems
- OSS out-of-the-box assessments management functionality is not suitable for use

Whilst addressing those key issues, the following questions must also be answered:

- Fundamentally, what should an assessments management system do?
- Can a single system provide a solution to a myriad of business requirements, some of which may be conflicting?
- Can the business requirements of an assessments management system be collated and documented?
- Can an assessments management system enable business processes rather than drive them?
- Can an assessments management system offer a flexible, user-controlled architecture, and be scalable?
- Can assessments for a modular programme and a non-modular programme (the majority of Oxford programmes) be managed by the same system?
- Can stakeholders be identified to champion a centrally provided assessments management system?
- Can the system be future-proofed so that its design develops in parallel with the infrastructure developments of the University (an SOA approach)? Would this system qualify as a ‘Proof of Concept’ for SOA?
- Can this approach reduce costs at both the development stage and the maintenance and support stage? Can Total Cost of Ownership be calculated accurately?
- How have other HEIs approached the management of assessment data? What can we learn in partnership with other HEIs?

2.1 Objectives

Objectives fall into the two phases of this project. For the first phase, the main focus of this proposal, the objectives are as follows:

- Gather, understand and document requirements from all stakeholders
- Understand and document how the selection of an assessments management system is affected by the requirement for implementation within a flexible service delivery (FSD) environment
- Develop and maintain a set of guidelines which inform other HEIs of the decision-making process and the selection criteria defined when choosing an assessments management system to be implemented within an FSD environment
- Assess the costs and benefits of implementing an assessments management system within an FSD environment
- Link this project work with the OUCS Total Cost of Ownership demonstration project, under the FSD Programme

At this stage, a single objective can be defined for the second phase of this project:

- Develop and implement an assessments management system within an FSD environment

2.2 Exclusions

- Only phase one is covered by the proposed work; any development work will be part of phase two and will be detailed in a separate proposal
- Colleges will be consulted to ascertain their requirements; however, assessments such as collections and admissions assessments (for example the History Aptitude Test (HAT)) may be deemed out of scope after the investigative stages
- The system will manage the process once papers are sat and ready for marking; it will not manage papers, timetabling, or any other elements of the process before marking
- Online examination entry should be investigated for feasibility as part of the assessments management system, but could be excluded if deemed out of scope

3. CURRENT PROCESSES
The management of assessments is a major process, involving academic and administrative staff in Departments/Faculties interacting with administrative staff in central administration units, predominantly the Examination Schools and the Proctors’ Office. The process can be summarised as a series of distinct events:

3.1 Collection of entry information

At Oxford, students enrol upon examinations/submissions rather than modules (with the exception of some programmes offered by Continuing Education). This process is currently a blend of electronic and paper processes. Students are automatically enrolled upon core examinations but must choose optional examinations through a paper process (forms created by the Academic Records Office). After choosing their optional entries, with the help of examination regulations, handbooks and subject tutor advice, the student submits the form to their College Tutorial Office. The Office then approves the form and returns it to the Academic Records Office (ARO) in Student Administration. The ARO manually inputs entries using an OSS function. The entry data is published to students through Student Self Service, and to staff through OSS Data View Reports. Statistical management reports are produced using Discoverer and distributed to Chairs of Examiners. Entry data is sent via electronic file (Excel or CSV) from the Academic Records Office to the Department/Faculty for use in their assessments systems.

3.2 Mark sheets distributed to markers

The Department/Faculty will produce templates ready for markers to enter marks for papers/submissions. Usually there are two markers for each paper/submission, although there could be three or even four. These templates may be produced manually, by spreadsheet template, or by Access reporting. Some Departments/Faculties have developed an online form allowing marks to be entered by markers directly into the database via a web user interface.

3.3 Marking performed by markers

The Department/Faculty distributes exam scripts and/or submissions to markers. This process may occur throughout the year, perhaps more commonly with submissions and postgraduate taught programmes. The main examination period for undergraduates and taught postgraduates is May to June (Trinity Term). Papers and/or submissions are marked in the usual way.

3.4 Collecting marks from markers

As described in 3.2, markers enter their marks into a template (usually called a marker sheet) or perhaps through an online system. Where the marks they award differ, which is usually the case, the markers must decide upon an agreed mark. The individual marks and the agreed mark are all recorded.
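
Because the individual marks and the agreed mark are all retained, a record in such a system needs separate fields for each. The following minimal sketch shows one possible record structure; the field names are illustrative assumptions rather than a description of any existing departmental system.

```python
# A minimal sketch of a marker-sheet record holding both individual
# marks and the agreed mark. All field names are hypothetical.
from dataclasses import dataclass
from typing import Optional

@dataclass
class MarkerSheetEntry:
    candidate: str
    paper: str
    marker1_mark: int
    marker2_mark: int
    agreed_mark: Optional[int] = None  # set once the markers have agreed

entry = MarkerSheetEntry(candidate="1001", paper="P1",
                         marker1_mark=64, marker2_mark=70)
entry.agreed_mark = 67  # recorded alongside, not replacing, the individual marks
```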

3.5 Inputting marks from markers

Depending upon how the markers completed the marker sheets, this stage usually entails another manual entry of the marks into the assessments management system. For some, the marks are entered directly into the system by the markers themselves. For the majority, though, an administrator will input marks from the marker sheet into the system.

3.6 Identifying re-reads and collecting re-read marks

Some Departments/Faculties will highlight cases where the marks awarded by the markers differ significantly. A significant difference is often deemed to be greater than ten points, which usually means there is a classification band difference between the marks. A re-read is recommended (and in some cases required by policy) for these cases. Re-read marks are submitted and override any previous marks, but do not overwrite them.
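
The ten-point rule described above is a simple comparison that a system could apply automatically. Below is a minimal sketch of such a check, assuming hypothetical field names and a configurable threshold; actual thresholds and policies vary by Department/Faculty.

```python
# A sketch of automatic re-read identification: flag any paper where
# the two markers' marks differ by more than a threshold (often ten
# points, roughly one classification band). All names are hypothetical.

def needs_reread(mark_1: int, mark_2: int, threshold: int = 10) -> bool:
    """True if the two marks differ by more than the threshold."""
    return abs(mark_1 - mark_2) > threshold

def flag_rereads(marksheet: list[dict], threshold: int = 10) -> list[dict]:
    """Return the rows recommended for a re-read."""
    return [row for row in marksheet
            if needs_reread(row["marker1_mark"], row["marker2_mark"], threshold)]

rows = [
    {"candidate": "1001", "paper": "P1", "marker1_mark": 68, "marker2_mark": 65},
    {"candidate": "1002", "paper": "P1", "marker1_mark": 72, "marker2_mark": 58},
]
print(flag_rereads(rows))  # flags candidate 1002 (a 14-point difference)
```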

3.7 Calculating overall performance

Each programme has specific examination conventions, which define how an award is calculated. These award calculations, or a progression calculation if the student is continuing, are usually built into the spreadsheet or database as a series of formulae or macros. Some Departments/Faculties perform scaling, moderation, and grade translation at this stage. The University policy of reporting marks on the 100-point scale applies to the majority of Departments/Faculties; only a few have been granted an exception from this policy.
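
Because each programme's conventions differ, a central system would need the calculation rules to be configurable rather than hard-coded. The sketch below illustrates one hypothetical convention only: a weighted average of agreed marks mapped to classification bands on the 100-point scale. The weights and band boundaries are invented for illustration and do not represent any programme's actual conventions.

```python
# A sketch of an award calculation under one hypothetical convention:
# a weighted average of agreed paper marks, mapped to classification
# bands. Weights and boundaries are illustrative only.

def weighted_average(marks: dict[str, float], weights: dict[str, float]) -> float:
    total_weight = sum(weights[paper] for paper in marks)
    return sum(marks[paper] * weights[paper] for paper in marks) / total_weight

def classify(average: float) -> str:
    bands = [(70, "First"), (60, "Upper Second"), (50, "Lower Second"), (40, "Third")]
    for boundary, label in bands:
        if average >= boundary:
            return label
    return "Fail"

marks = {"P1": 68, "P2": 72, "P3": 64}
weights = {"P1": 1.0, "P2": 1.0, "P3": 2.0}  # e.g. a double-weighted dissertation
average = weighted_average(marks, weights)
print(round(average, 1), classify(average))  # 67.0 Upper Second
```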

3.8 Reporting for the Examination Board (pre-board and final board)

Reports for the Examination Board can be extensive; each Chair (and, to a certain extent, the external examiner) has specific reporting requirements. Reporting formats differ across the University; however, particular similarities can be identified. The pre-board meeting, held before the final board meeting, discusses any cases where moderation is necessary and any cases of mitigating circumstances. This meeting decides outcomes that are then ratified by the final board meeting. The pre-board meeting will comprise the Chair, the administrator, and perhaps one or two other senior academics. The final board meeting agrees the classification/progression outcome for each student. The meeting comprises a quorum of examiners, including the Chair, external examiners and the administrator, and perhaps a representative from central administration. Members of this meeting sign hard copy ‘class/pass’ lists for submission to the Academic Records Office. These, along with an electronic file containing marks for papers/submissions, form part of the process described in 3.10.

3.9 Creating statistical reports

Statistical reports are created for the examination board to consider. Usually these comprise reports detailing the spread of marks on each paper, or the performance of each student across the set of their examinations. The board looks in particular at details such as the average mark, median, mode, and standard deviation for each paper. If scaling is to be applied, statistical reports can facilitate this, and what-if modelling can take place within the meeting itself.
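
These summary figures are standard descriptive statistics, and a central system could generate them directly from the agreed marks. A minimal sketch using only the Python standard library is shown below; the input data is invented.

```python
# A sketch of the per-paper figures an examination board reviews:
# mean, median, mode and standard deviation of the agreed marks.
import statistics

def paper_summary(marks: list[int]) -> dict[str, float]:
    return {
        "mean": statistics.mean(marks),
        "median": statistics.median(marks),
        "mode": statistics.mode(marks),
        "stdev": statistics.stdev(marks),  # sample standard deviation
    }

agreed_marks = [55, 58, 62, 62, 65, 68, 71, 74]  # invented marks for one paper
print(paper_summary(agreed_marks))
```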

3.10 Reporting results (marks) to the Academic Records Office

After the final examination board meeting, once all moderation has been agreed and the final marks are ‘signed off’, the Chair will inform the administrator of the final results. The administrator will then ensure that the signed hard copy list matches the electronic records. Spreadsheets or databases are updated accordingly. The Department/Faculty must then submit the marks (all of the results data) to the Academic Records Office so that they can be uploaded into OSS for dissemination to students and appropriate staff. In order to upload marks into OSS, the Department/Faculty must submit them in a predefined comma-separated values (CSV, Excel) file. This can be in one of two formats: one designed for spreadsheets and the other designed for database export. The format of the CSV file is strictly defined and cannot be altered. Departments/Faculties upload the electronic file containing results to a secure site within WebLearn (the University’s VLE). They deliver the hard copy (signed) list to the Academic Records Office.
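
Generating the upload file programmatically, rather than assembling it by hand, would remove one source of error. The sketch below shows the general shape of such an export; the column names are placeholders, since the real OSS file format is strictly defined and is not reproduced here.

```python
# A sketch of exporting results to a fixed-format CSV file. The columns
# are placeholders; a real export would have to match the strictly
# defined OSS format exactly.
import csv

PLACEHOLDER_COLUMNS = ["student_id", "assessment_code", "mark", "grade"]

def write_results(path: str, results: list[dict]) -> None:
    with open(path, "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=PLACEHOLDER_COLUMNS)
        writer.writeheader()
        writer.writerows(results)

write_results("results.csv", [
    {"student_id": "1001", "assessment_code": "HIST-P1", "mark": 67, "grade": "2.1"},
    {"student_id": "1002", "assessment_code": "HIST-P1", "mark": 58, "grade": "2.2"},
])
```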

3.11 Uploading into OSS

Once the hard copy list has been checked for validity (by the quorum group of examiners), the electronic file is collected from WebLearn and uploaded into OSS through a specifically developed function. OSS performs verification checks upon the data, such as checking that a mark has not been given for an examination that the student was not entered for. Usually there are some verification issues, but these are most commonly mistakes such as using invalid grading schema values or entering a mark in a grade column (and vice versa). The upload process, when there are no issues, is efficient and easy to use. Any upload issues are usually resolved by direct liaison between the Academic Records Office and the Department/Faculty.
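
The checks OSS performs amount to cross-referencing the submitted marks against the recorded entries. A simplified sketch of that kind of validation is shown below; the data structures and messages are assumptions for illustration, not the actual OSS implementation.

```python
# A sketch of entry/mark cross-referencing of the kind OSS performs:
# every submitted mark must correspond to a recorded examination entry.
# Data structures are hypothetical.

def verify_marks(entries: set[tuple[str, str]],
                 submitted: list[tuple[str, str, int]]) -> list[str]:
    """Return human-readable issues; an empty list means the file is clean."""
    issues = []
    for student_id, paper, mark in submitted:
        if (student_id, paper) not in entries:
            issues.append(f"{student_id}: mark {mark} for {paper}, but no entry exists")
    return issues

entries = {("1001", "P1"), ("1001", "P2")}
submitted = [("1001", "P1", 68), ("1001", "P3", 70)]
print(verify_marks(entries, submitted))  # flags the mark for P3
```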

3.12 Publishing

When results are uploaded into OSS, they are placed ‘on hold’ for 48 hours so that the Department/Faculty has an opportunity to review them before they are released to students and staff. This review process is in place purely because there was initially very little confidence in OSS and the process of reporting marks to a central system. In reality, the upload into OSS is completely accurate; however, there have been a few instances of errors in the actual marks submitted to the Academic Records Office. After 48 hours, results are released/published to students through Student Self Service, and to staff through OSS Data Views. There is no automatic notification to students and staff that particular results have been published; they must check repeatedly to see whether their results are available, which usually happens a certain period of time after the board has sat.
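
The hold itself is a simple embargo rule. A sketch of how release eligibility could be computed is shown below; the timestamps are invented and the function is an illustration, not a description of how OSS implements the hold.

```python
# A sketch of the 48-hour hold: results become releasable only once
# 48 hours have elapsed since upload. Timestamps are invented.
from datetime import datetime, timedelta

HOLD_PERIOD = timedelta(hours=48)

def releasable(uploaded_at: datetime, now: datetime) -> bool:
    return now - uploaded_at >= HOLD_PERIOD

uploaded = datetime(2009, 7, 1, 9, 0)
print(releasable(uploaded, datetime(2009, 7, 2, 9, 0)))   # False: 24 hours elapsed
print(releasable(uploaded, datetime(2009, 7, 3, 10, 0)))  # True: 49 hours elapsed
```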

3.13 Creating College reports

As well as reporting to the Academic Records Office, Departments/Faculties must also produce hard copy reports for academic tutors and college tutorial offices. These reports are called ‘College Reports’. They contain details not collected into OSS, such as rank, prizes, and average mark for the degree. This information is required by Colleges in order to perform specific processes, such as writing letters of reference. The College must match up information delivered through OSS Data Views with information supplied by Departments/Faculties. Colleges often then enter results information into their own systems, and/or print information to store in hard copy files.

3.14 Creating other reports

Departments/Faculties create reports for other processes. These include information for External Examiner reports, gender statistics, and internal reports for examiners. The extent of this reporting is not fully understood or documented. It is unknown whether Departments/Faculties or Colleges are able to compare admissions data (A level results) with final results data, or whether they create any management information on this basis.

4. CONSTRAINTS

- Current IT infrastructure of the University
- Culture of control over Department/Faculty processes; not everyone will want to use a centrally provided system
- The non-modular structure of the University will influence requirements, but may not become a constraint

5. ASSUMPTIONS

- A pilot Department/Faculty can be identified to adopt the assessments management system
- Data flow between the assessments management system and OSS is possible
- A centrally provided assessments management system can work alongside current systems and the current OSS results process
- Support for a centrally provided assessments management system can be established as part of the services offered by BSP/OUCS
- Training for the centrally provided assessments management system can be offered

6. SUCCESS CRITERIA

- Processes, procedures and requirements are understood and documented, which then informs the development phase for a centrally provided assessments management system
- A flexible solution is identified which fulfils the variety of requirements across the University
- Templates for standardised reporting are defined, with scope for user-controlled reports to also be available from the system
- Departments/Faculties who are considering their own new systems become candidates to pilot the new centrally provided system
- A concise set of requirements is documented and submitted as part of a second phase of work, the development and implementation of the system
- Positive attitudes towards a centrally provided system are created as part of extensive consultation with Departments/Faculties and Colleges
- Information regarding the selection and implementation of assessments management systems in other HEIs is gathered and documented
- How the selection process for an assessments management system is affected by the requirement for implementation within an FSD environment is documented and made publicly available
- Documentation and guidelines are produced to inform other HEIs of the decision-making process and the selection criteria defined when choosing an assessments management system to be implemented within an FSD environment
- The cost to implement is estimated at significantly (at least 33%) less than for an ‘ordinary’ implementation of an assessments management component as part of a student records system
- Links are made, and a valuable contribution is made, to the OUCS TCO project where appropriate

7. PROPOSAL

7.1 Description

In order to meet the objectives of this work, the following major areas of work are required:

- Perform a comprehensive process analysis and requirements gathering exercise with Departments/Faculties and Colleges
- Consult with IT infrastructure and OUCS to establish how an FSD environment influences the technical selection, and the proposed development and implementation, of an assessments management system
- Consult with third party software suppliers regarding the selection and design of an assessments management system
- Consult with other HEIs regarding the selection and implementation of assessments management components of a student records system, or of distinct systems if appropriate
- Build a series of documents during the project to outline the selection process, requirements analysis, influencing factors, constraints, and identified benefits
- Publish updates to a project ‘blog’ site so that other HEIs can review progress and decision making as part of the project work
- Produce a final report detailing guidelines which inform other HEIs of the findings of the project (criteria for selection and the decision-making process when choosing a system to be implemented within an FSD environment)

7.2 Benefits

- Plans are documented to consolidate fragmented assessment management systems across the University into a single, centrally provided system
- Improvements to student services with regard to assessments, such as notification of results
- A greater understanding of the process to manage assessments information helps to inform future decisions about the direction of systems and policy
- Requirements are documented and incorporated into future system development, ensuring that manual manipulation of data and reports will no longer be required
- A high level of consultation ensures that Departments/Faculties are in support of a centrally provided system; the impact on the University is minimised by this approach
- The level of spreadsheet use is reduced, and therefore the risk of error in complicated calculations is reduced or eliminated completely
- The system can be designed to provide a set of standardised reports (defined by a consortium of Departments/Faculties) and a facility for user-controlled reporting. This would reduce the effort required of Departments/Faculties and Colleges to produce reports, and the accuracy of predefined reports is assured
- Central support of systems reduces or eliminates the risk of unsupported systems; there will no longer be a single point of contact
- The analysis for a system will be performed centrally, so Departments/Faculties are not required to assign resources to investigate a new system
- Data flow between systems and OSS will be designed to be streamlined and efficient, reducing the effort currently required to move data between systems
- Economies of scale are introduced: a single system (Mark-it) serving a limited set of users is planned to be replaced by a more cost-effective solution which benefits a greater number of Departments/Faculties
- The management of programme (examination) related data will be investigated to ascertain whether OSS is the most effective system in which to store this information. Benefits of managing this information outside of OSS (but sharing it with OSS) may be realised through a more flexible, user-controlled system
- Focusing on a new system whilst limiting development to OSS will ensure that effort is not expended on a short-term basis and is an investment towards a long-term strategy
- A centrally provided assessments management system is a step towards an SOA landscape, which would inform future developments in this area
- The University of Oxford provides other HEIs with expert guidance on implementing a system within an FSD environment, and likewise can use advice from other HEIs to steer developments within Oxford

7.3 Costs

- The Process and Change Management team will carry out the process review and business analysis work

8. ESTIMATED PROJECT COSTS AND TIMELINE

8.1 Project Costs

Two-month process review and documentation, plus costs for meetings (travel/expenses etc.) for liaison with other HEIs and suppliers.

Resource                   | % FTE | Cost | Number of Days | Total Cost
Process and Change Manager | 1.0   |      | 60             |

8.2 Timeline

Phase one (requirements gathering and feasibility) could be completed by March 2010. Phase two (development and implementation) could start immediately after the first phase is completed. The pilot project could start in the academic year 2010/11, with further rollout in subsequent academic years. The JISC FSD project runs until March 2011.
