
Vol 2 Issue 1: Assessment

Winter 2003, ISSN 1477-1241

Philip Denton, School of Pharmacy and Chemistry, Liverpool John Moores University

Returning Feedback to Students via Email Using Electronic Feedback 9


The software discussed in this article is available for free download at http://cwis.livjm.ac.uk/cis/download/xlfeedback/welcome.htm
Alternatively, please email the author to receive the program as an attachment.


Abstract

Electronic Feedback is an MS Office marking assistant that has been developed at Liverpool John Moores University over the last three years. The software enables assessors to readily generate and return feedback to students in the form of MS Word processed reports and email messages. Research indicates that the previous version of the software was used by 22 UK academics across a range of disciplines. They agreed that the program enabled them to return more feedback, of higher quality, in a shorter space of time. The latest version, Electronic Feedback 9, is considered to be more user-friendly and incorporates a novel collusion detection facility. The operation of this program is described in this article.

Introduction

As the Computer Assisted Assessment (CAA) unit at Loughborough University has noted, computerised testing can ease difficulties associated with large class sizes and diverse student capabilities (1). They comment that CAA is essentially automated assessment through objective testing but that, although the terms CAA and objective testing are often used interchangeably, electronically aided assessment has a much broader scope. A more comprehensive description of CAA would include:
- Objective Testing: This could be multiple choice or text match type questions, delivered via the Web.
- Electronic Submission: At a simple level, this could be students presenting work to their tutor via email. Furthermore, some Virtual Learning Environments (VLEs) can support threaded discussions that enable students to contribute to an on-line debate.
- Free Text Analysis: There is a range of plagiarism detection tools that can be used to check for similarities between electronic text files. Some programs use search engine technology to look for matching text on the Web (2). Software capable of automatically grading free text is available (3), but is still in development and is not suited to high-stakes assessment at present.
- Marking Assistants: Packages that can aid in the computation of student marks are used regularly by academic staff, e.g. MS Excel. Advanced marking assistants can generate written reports (4).


As Brown (5) and Gibbs (6) have discussed, returning feedback to students on their work plays an important role in the learning process. Accordingly, providing undergraduates with comprehensive and timely information on their progress is considered to be one of the fundamental responsibilities of an HE tutor (7). Although electronic marking assistants have the potential to improve the return of feedback to students, they are not used routinely by academic staff. This is perhaps surprising, given that a recent summary of HE inspection reports concluded that UK universities do not supply students with sufficient information on their academic development (8).

Electronic Feedback (9,10) is an MS Office marking assistant, developed at Liverpool JMU over the last three years. The program enables tutors to readily generate MS Word processed feedback reports that may be automatically emailed to individual students as messages or attachments. Previous research indicates that this method is viewed positively by students (9). Version 8 of the program was used by 22 lecturers across a range of academic disciplines during the academic year 2001/2002. Results of a recent emailed questionnaire (average Likert scale responses, 1 = strongly agree, 5 = strongly disagree) indicated that tutors agreed that, by using the software, they were able to return more feedback (2.1), of higher quality (1.6), in a shorter space of time (2.1). Tutors were neutral, however, as to whether their colleagues could use the program without formal training (2.9). To remedy this, the program has been rewritten from scratch, with new features included to ensure a more user-friendly interface. Version 9 of the program was released in July 2002 and its operation will now be considered.

Method

Electronic Feedback 9 consists of three files:
- Feedback9.xls, an Excel file into which data relating to a particular assessment is inputted.
- Guide9.xls, an interactive guide to the software. Users can navigate through this workbook using a pair of drop-down picking lists that show the contents of the guide.
- Fb9.doc, a Word document that automatically formats and emails feedback reports.

When ready to use the software, the tutor opens Feedback9.xls. This file is composed of a number of sheets, accessible from a main menu (Figure 1). After configuring the software so that it is appropriate for their personal and institutional use, tutors input data relating to a particular assessment event into these sheets, following the route indicated by the arrows on the Main Menu sheet. When this procedure is complete, tutors can use the software to create, email and print feedback reports.

Figure 1: Main menu of Electronic Feedback 9

The system allows the following raw data to be inputted:


- Student Details: This is a list of student names and email addresses. Registration numbers and salutations may also be entered, the default salutation being the student's first name. During configuration of the software, it is possible to organise this sheet so that it matches the format preferred by the user's institution. In this way, existing electronic data can be copied and pasted directly into the program.
- Grade Comments: These are feedback statements activated by the student's overall % mark, e.g. "First class work" for all students attaining 70% or above.
- General Comments: These are included on all feedback reports generated by the software. Three separate general comments can be entered (top, middle and bottom), these designations indicating their position on the report.
- Personal Comments: This is a feedback statement for an individual student.
- Standard Comments: These are those remarks that are anticipated to be required repeatedly during marking. They can be written in advance, or gradually composed as marking proceeds. Standard comments can be entered as a list (normal mode) or organised under discrete headings (criterion mode). This latter arrangement is of particular use when marking essays. In either mode, each standard comment is assigned a unique reference number. Some users may prefer to annotate the students' work with these numbers, so that students can see exactly where the tutor would want a particular comment to appear. (One possible representation of these data is sketched below.)
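To make the relationships between these inputs concrete, the following is a minimal sketch of how the raw data might be represented. It is illustrative only: Electronic Feedback itself stores this information in Excel worksheets, and all of the names, types and example values below are assumptions rather than part of the program.

```python
# Illustrative model of the raw data Electronic Feedback collects; the real
# program keeps these in Excel sheets, so every name here is hypothetical.
from dataclasses import dataclass, field
from typing import List, Optional


@dataclass
class Student:
    name: str
    email: str
    salutation: str                          # defaults to the student's first name
    reg_number: str = ""
    mark: Optional[float] = None             # overall % mark, if entered manually
    standard_refs: List[int] = field(default_factory=list)  # allocated comment numbers
    personal_comment: str = ""


# Standard comments carry a unique reference number; in criterion mode each
# one also sits under a criterion heading.
standard_comments = {
    1: ("Introduction", "This defined the subject of your essay quite clearly."),
    2: ("References", "Please use endnotes throughout your text."),
}

# Grade comments are triggered by the overall % mark; general comments appear
# on every report at the top, middle or bottom of the page.
grade_comments = [(70, "First class work."), (60, "Upper second class work.")]
general_comments = {"top": "", "middle": "", "bottom": ""}
```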

Comments may include the special character ~, which is automatically replaced with the student's salutation when feedback is created. In this way, it is possible to personalise even standard and general comments (illustrated in the sketch below). Other special characters, including ^, { and }, allow tab spaces, line breaks, superscripts and subscripts to be incorporated into feedback remarks.

It is recognised that the inputting of raw data is the most time-consuming aspect of using the software, although Version 9 has a new facility that can make this process less onerous. It is possible to save standard, grade and general comments, and student details, in separate files using the export procedure. The data in these files may then be readily retrieved using the import facility. This is a useful feature when the same assignment, e.g. a laboratory exercise, is set year after year. In such cases, last year's standard comments can be imported and used again. Work is currently underway to create a dedicated website where users of the software would be able to upload exported files of standard comments, for download and import by other users.
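The personalisation step described above amounts to a simple placeholder substitution. The sketch below models it in a few lines; the real program performs the replacement inside MS Word, so this stand-alone function is an illustration rather than the actual implementation.

```python
# Model of the "~" personalisation described above; purely illustrative, since
# Electronic Feedback performs this substitution when it builds the Word report.
def personalise(comment: str, salutation: str) -> str:
    """Replace the ~ placeholder with the student's salutation."""
    return comment.replace("~", salutation)


print(personalise("Remember, ~, it is important to add your own interpretation.", "Ian"))
# -> Remember, Ian, it is important to add your own interpretation.
```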
The software has a number of additional features that give the Electronic Feedback method the edge over traditional red-pen marking:

- Allocate Comments: This is an interface that allows tutors to rapidly assign the standard comments to individual students. After selecting the student to be considered, tutors input the reference numbers of the standard comments they wish to award to that class member. Alternatively, the user can select the statement itself from a drop-down picking list. This latter approach has the advantage that the tutor is not required to memorise the reference numbers.
- Automark: Instead of entering student scores manually, % marks can be calculated automatically by activating the Automark feature. In this way, a student's final mark is determined by the feedback statements allocated to them. In criterion mode, for example, it is possible to assign percentage weightings to individual criteria, e.g. introduction, and use these to determine the final mark (a sketch of this calculation follows the list).
- Allocation Statistics: This sheet gives the tutor an idea of how the standard comments have been allocated across the group. It may reveal, for example, that a particular feedback statement was allocated to a large proportion of the class. This information is of clear value to the assessor. It can, for example, be used to inform future teaching strategies by directing the tutor's attention to those aspects of an assignment that caused the greatest difficulty.
- Allocation Check: This feature allows tutors to check for pairs of students allocated similar standard comments. Once two students who match the search criteria are found, their original scripts can be inspected to confirm whether any similarities are coincidental or suspicious. Initial work has indicated that this approach can be used to detect plagiarism (11), even in handwritten scripts marked a few days apart (a sketch of the allocation statistics and this pairwise check also follows the list).
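As a concrete illustration of the Automark idea in criterion mode, the sketch below computes an overall % mark as a weighted combination of per-criterion scores. The function name, data layout and the assumption that every weighting applies to a score out of 100 are illustrative; the article does not specify how the program performs this calculation internally.

```python
# Sketch of a weighted-criteria mark calculation in the spirit of Automark.
# The scores and 20% weightings below mirror the specimen report in Figure 2,
# but the implementation itself is an assumption, not the program's own code.
def automark(criterion_scores: dict, weightings: dict) -> float:
    """Return an overall % mark from per-criterion scores (each out of 100)."""
    total_weight = sum(weightings.values())
    weighted = sum(criterion_scores[c] * weightings[c] for c in criterion_scores)
    return round(weighted / total_weight, 1)


scores = {"Introduction": 50, "Understanding": 40, "Originality": 50,
          "Conclusion": 40, "References": 50}
weights = {criterion: 20 for criterion in scores}   # five criteria at 20% each
print(automark(scores, weights))                    # -> 46.0, as in Figure 2
```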
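The allocation statistics and the allocation check can be modelled in the same spirit. The overlap measure (a Jaccard index over the sets of allocated reference numbers) and the threshold used below are assumptions chosen for illustration; the article does not state how the program decides that two allocations are similar.

```python
# Sketch of the Allocation Statistics and Allocation Check ideas, using
# invented allocations. The similarity measure and threshold are assumptions.
from collections import Counter
from itertools import combinations

# Standard-comment reference numbers allocated to each student.
allocations = {
    "BENSON": {3, 7, 12, 15, 21},
    "CARTER": {3, 7, 12, 15, 22},
    "DAVIES": {1, 5, 9},
}

# Allocation statistics: how often each standard comment was used in the class.
frequency = Counter(ref for refs in allocations.values() for ref in refs)
print(frequency.most_common(3))

# Allocation check: flag pairs of students whose allocations overlap heavily,
# so the tutor can inspect the original scripts for signs of collusion.
THRESHOLD = 0.6
for (a, refs_a), (b, refs_b) in combinations(allocations.items(), 2):
    overlap = len(refs_a & refs_b) / len(refs_a | refs_b)   # Jaccard index
    if overlap >= THRESHOLD:
        print(f"Inspect scripts for {a} and {b}: comment overlap {overlap:.0%}")
```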

When all standard comments have been allocated, and personal comments have been inputted, the user may generate feedback reports. The Options sheet allows the user to control their appearance: the tutor can choose to hide or show the standard comment reference numbers, for example. Users may also decide to incorporate the allocation statistics on the feedback reports.

The Create Feedback button on the main menu activates a dialogue box that gives the tutor the option to generate, print and email the feedback reports for all students awarded a mark, allocated a standard comment, or assigned a personal comment. If they prefer, tutors can first generate the reports and view them within MS Word to confirm they are acceptable before emailing. Emails can be sent either as an attachment or as a message, and users can input their preferred subject line. If emails are sent as a message, any URLs contained within the feedback will be clickable. A generic sketch of this emailing step is given below.

In principle, it is possible to configure virtual learning environments, such as WebCT, to automatically extract information from files created using Electronic Feedback. In this way, student names and marks can be rapidly uploaded and added to institutional records.
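Electronic Feedback drives MS Word and the user's email client from within MS Office; the stand-alone sketch below instead uses Python's standard library to show the shape of the emailing step. The mail server address is a placeholder, and sending the report as a message body versus an attachment mirrors the two options described above.

```python
# Generic sketch of emailing a feedback report as a message or an attachment.
# The SMTP server below is a placeholder; the real program works through the
# tutor's own MS Office mail client rather than this route.
import smtplib
from email.message import EmailMessage
from pathlib import Path
from typing import Optional


def email_report(to_addr: str, subject: str, body: str,
                 attachment: Optional[Path] = None) -> None:
    msg = EmailMessage()
    msg["From"] = "p.denton@livjm.ac.uk"
    msg["To"] = to_addr
    msg["Subject"] = subject                 # the tutor's preferred subject line
    msg.set_content(body)                    # feedback sent in the message body
    if attachment is not None:               # or attach the Word report instead
        msg.add_attachment(attachment.read_bytes(),
                           maintype="application", subtype="msword",
                           filename=attachment.name)
    with smtplib.SMTP("smtp.example.ac.uk") as server:   # placeholder server
        server.send_message(msg)
```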

Results

A specimen feedback sheet produced in criterion mode is shown in Figure 2. The report also includes the allocation statistics.

Dr. Phil Denton
PACCH1010 Alchemy Essay
BENSON, IAN PATRICK <56346474>
46% (Average: 57%) Third Class.

Ian, this was below your usual high standard. Did you have sufficient time to complete this work, in light of recent events? I have marked your work against the following criteria, and the weightings of each criterion are shown. After each comment, Ian, you will find your performance in that criterion (/100), along with the highest (H), average (A) and lowest (L) student performances in that criterion.

Introduction (Weighting 20%): I felt that this defined the subject of your essay quite clearly and went some way to establishing the scope of your study. 50/100 (H: 50, A: 28, L: 4).

Demonstrates Understanding (Weighting 20%): Your work demonstrates a satisfactory level of understanding, although I would have preferred your work to have relied a little less heavily on your information sources. Remember, Ian, it is important that you add your own interpretation to the material that you present. 40/100 (H: 82, A: 53, L: 36).

Shows Originality (Weighting 20%): This essay displayed a good level of originality. 50/100 (H: 50, A: 46, L: 44).

Conclusion (Weighting 20%): Your conclusion satisfactorily summarised your work, but did little to present your point of view. 40/100 (H: 76, A: 56, L: 40).

References (Weighting 20%): Your list of references is good, Ian, although I would have preferred you to have used endnotes throughout your text so that I can clearly see where you got your information from. 50/100 (H: 60, A: 56, L: 50).

Overall, I was impressed by the standard of work submitted by the group. Many of you are still failing to present references appropriately, however. Ian, I would recommend that you, and the entire class, reread the section on essay writing in the module handbook.

Figure 2: Example Feedback Report Produced in Criterion Mode.
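The H, A and L values quoted against each criterion in Figure 2 are simple class-wide statistics. The short sketch below shows one way to compute them from the per-criterion scores of the whole group; the score lists are invented, and only the calculation itself is being illustrated.

```python
# Sketch of the per-criterion class statistics quoted on the report in Figure 2:
# the highest (H), average (A) and lowest (L) score in the group. Invented data.
class_scores = {
    "Introduction": [50, 28, 4, 30],
    "Conclusion":   [76, 56, 40, 52],
}

for criterion, scores in class_scores.items():
    h, a, l = max(scores), sum(scores) / len(scores), min(scores)
    print(f"{criterion}: H: {h}, A: {a:.0f}, L: {l}")
```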

Conclusions

The software has been found to run smoothly and without major difficulty on Office 97 and 2000, and recent refinements (Version 9.3) mean that the software also works on Office XP. The automatic activation of MS Word from MS Excel runs more smoothly in Version 9 than in previous versions. Moreover, Feedback 9 files take up significantly less file space.

It is considered that Version 9 is far more user-friendly than Version 8. The previous software evolved over two years, with additional features being bolted on to the original template. This developmental process did not lend itself to user-friendly design! Version 9 makes extensive use of drop-down picking lists. These clearly indicate the restrictions on data entry, where these apply, and facilitate rapid data entry.

The principle of checking feedback for similarities can be used as a means to identify those students who have colluded on their assignments, although further research is required to establish the scope of this approach. Indeed, it is the intention of the author to survey existing users of the software in May/June 2003 to ascertain overall impressions of this new program. Informal feedback gathered during a series of recent presentations to UK university staff, however, indicates that staff are enthusiastic about Electronic Feedback 9.

Further Information

The software is available for free download at http://cwis.livjm.ac.uk/cis/download/xlfeedback/welcome.htm
Alternatively, please email the author to receive the program as an attachment.

Acknowledgements

This work has been supported by a Liverpool JMU Teaching-Related Research Award.

References

(1) http://www.lboro.ac.uk/service/ltd/flicaa/why_caa.html
(2) Bull J, Coughlin E, Collins C and Sharp D, Technical Review of Plagiarism Detection Software Report [online], Joint Information Systems Committee (2001). Available from http://www.jisc.ac.uk/pub01/luton.pdf
(3) Mason O and Grove-Stephenson I, Automated Free Text Marking with Paperless School, Proceedings of the 6th International Computer Aided Assessment Conference (2002), pp 213-219. Available online via http://www.caaconference.com/
(4) For a summary of available packages, see http://www-staff.lboro.ac.uk/%7Elsdps/index.html
(5) Brown G, Bull J and Pendlebury M, Assessing Student Learning in Higher Education (1997), p 7, Routledge, London.
(6) Gibbs G, Improving the Quality of Student Learning (1992), p 17, Technical and Educational Services, Bristol.
(7) For example, see the Competency Framework for Effective Teaching devised by Murdoch University, Perth, Australia, at http://wwwadmin.murdoch.edu.au/planning/docs/acfet/index.html#et4
(8) http://www.guardian.co.uk/Archive/Article/0,4273,4150416,00.html
(9) Denton P, Generating Coursework Feedback for Large Groups of Students Using MS Excel and MS Word, U. Chem. Ed. 5 (2001), pp 1-8. Available online at http://www.rsc.org/pdf/uchemed/papers/2001/p1_denton.pdf
(10) Denton P, Generating and e-Mailing Feedback to Students Using MS Office, Proceedings of the 5th International Computer Aided Assessment Conference (2001), pp 157-173. Available online via http://www.caaconference.com/
(11) Denton P, Detection of Plagiarism in Students' Written Work Using a Novel Electronic Marking Assistant, Proceedings of the International Conference on ICTs in Education (2002), pp 445-449.

Philip Denton
School of Pharmacy and Chemistry
Liverpool John Moores University
Liverpool L3 3AF
p.denton@livjm.ac.uk

February 2003 ISSN 1477-1241

