IEEE TRANSACTIONS ON EDUCATION, VOL. 51, NO. 2, MAY 2008

A Technological Tool to Detect Plagiarized Projects in Microsoft Access

James A. McCart and Jay Jarman

Abstract—Over one in ten students surveyed have admitted to copying programs in courses with computer assignments. The ease with which digital coursework can be copied and the impracticality of manually checking for plagiarized projects in large courses have only compounded the problem. As current research has focused predominantly on detecting plagiarism in textual artifacts such as source code and documents, there exists a gap in detecting plagiarism in graphically driven applications. This paper focuses on the effectiveness of a technological tool in detecting plagiarized projects in a course using Microsoft Access. Seven semesters of data were collected from a large technology-oriented course in which the tool had been in use. Comparing semesters before and after the technological tool was introduced demonstrates a significant decrease in the number of projects being duplicated. The results indicate that combining technology and policy can be effective in curtailing blatant plagiarism within large technology courses.

Index Terms—Education, educational technology, mass course techniques, plagiarism detection.

I. INTRODUCTION

Over one in ten students surveyed have admitted to copying programs in courses with computer assignments [1]. In those technology-oriented courses, where much of the work is in digital format, copying coursework has become virtually effortless. It is not surprising, then, that at some universities disproportionately large shares of academic integrity violations have been in technology-oriented disciplines [2]. The issue is further complicated in large classrooms, where the increased number of students makes detecting plagiarized work even more difficult due to time constraints [3]. So how can cheating be effectively reduced without either radically changing course designs or requiring draconian measures which interfere with the learning process?
This study examined a large technology-oriented course, which used a technological tool to detect plagiarism in direct response to blatant plagiarism occurring in course projects using Microsoft Access. The change in students' behavior, in terms of the number of students detected who plagiarized projects, was analyzed at points before and after the implementation of the tool. This paper illustrates how technology can help alleviate some difficulties in discovering plagiarism and enforcing policies against plagiarism.
This paper describes the course, the problem, the tool, and the process for using the tool. An analysis of seven semesters' worth of data follows, with a discussion of the limitations and conclusions of the research.

Manuscript received December 26, 2006; revised June 14, 2007.
The authors are with Information Systems and Decision Sciences, College of Business Administration, University of South Florida, Tampa, FL 33620-7800 USA (e-mail: jmccart@coba.usf.edu; rjarman@coba.usf.edu).
Color versions of one or more of the figures in this paper are available online at http://ieeexplore.ieee.org.
Digital Object Identifier 10.1109/TE.2007.906312
II. RELATED WORK

Much of the previous research in technological tools regarding plagiarism has focused on algorithms to compare similarity in text (e.g., [4]–[6]). Algorithmic comparisons have been used in programming classes [7] and in general written assignments [8]. Applications which are developed using graphical interfaces, such as Microsoft Access databases, are not textual in nature and, thus, cannot be compared via these methods. For instance, Hill created an automated grader for Microsoft Excel projects, which flagged potentially plagiarized projects as those having a high percentage of matching errors [9]. Later, Hill also created an automatic grader for Microsoft Access projects [10]. In these instances, an automatic grader's requirement that there be one manner of arriving at a correct answer limits the types of projects which can be assigned and the acceptable solutions. Therefore, a more general mechanism is needed to detect plagiarism within Microsoft Access projects, one which does not place major restrictions on the projects being assigned.
III. COURSE DESCRIPTION

Information Systems (IS) in organizations was a required upper-level course for business students at a large public university. The course covered a wide range of topics in IS and served as an initial foray into how and why IS can be leveraged within the organization to help facilitate and create competitive advantages. Since the course served the entire College of Business, enrollment in the course was high, with a maximum capacity of 700 students each fall and spring semester and 350 students in the summer semester.

The course was divided into three modules, with each module containing corresponding projects and exams. Four to six graduate teaching assistants (TAs) were assigned during the fall and spring semesters, and two TAs during the summer semester. Besides providing lab and office hours to help students with projects, the TAs contributed 40 hours a week toward grading projects. Depending on the number of hours allocated to a TA, each graded between 100 and 250 projects per module.
A. Projects

Although different technologies were used to complete course projects, this study focused on projects completed using Microsoft Access. Microsoft Access was used to teach fundamental database skills such as creating tables and input forms, specifying relationships, querying the database, and generating reports. Over 20 online videos were available to

the students for guidance on different aspects of Microsoft Access.

0018-9359/$25.00 © 2008 IEEE

While the projects challenged the same skill set from
semester to semester, the context of the assignments changed each semester. Therefore, having access to previous semesters' projects would in all likelihood not help students complete the current semester's projects.
Many aspects of the projects also had more than one correct way to complete them. Students could therefore explore different methods of solving a problem, but this flexibility also implied a high degree of variability between student projects. Therefore, automatic grading could not be used. To facilitate grading, students were required to follow a naming convention for their project (i.e., its filename) of the form: last name, first initial, and middle initial. For example, a student named Judy Ashley Doe would name her project "DoeJA".
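The naming convention can be expressed as a one-line rule. The sketch below is an illustrative Python helper (the function name and signature are assumptions; the course specified only the convention itself):

```python
def project_filename(first: str, middle: str, last: str) -> str:
    """Build the required project name: last name, then first and middle initials."""
    return f"{last}{first[0]}{middle[0]}"

# The example from the text: Judy Ashley Doe submits "DoeJA".
print(project_filename("Judy", "Ashley", "Doe"))
```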
IV. ACADEMIC INTEGRITY POLICY
Policies regarding cheating were fully explained to the students during the first day of class. The policies were also on the
syllabus, which was handed out during the first class meeting
and was also available online. Additionally, the first assignment
of the semester required students to answer questions regarding
the course policy on cheating.
The course academic integrity policy defined cheating as plagiarism, copying, or any other form of misrepresentation of one's work. The policy also prohibited collaborating with others; all of the work was to be done individually. Some of the ways in which a student could violate the academic integrity policy for these projects would be as follows:
- to copy a project file with the permission of another student;
- to copy a project file that was inadvertently left on a public-use computer;
- to copy an incomplete project file and complete the project themselves;
- to import another project into their own project;
- to manually recreate a project from another project;
- to collaborate with other students to complete a project.
The instructor had several options for dealing with students caught cheating. First, the instructor could choose not to award credit for the project. Second, the instructor could fail the student from the course. Lastly, the instructor could fail the student with a notation on the transcript that the failure was due to cheating.

The last two options were problematic due to the level of proof required to substantiate an accusation of cheating. Therefore, there was little risk to students in breaking the course academic integrity policy if the instructor simply gave zero points for the project. If a student failed to turn in a project, or turned in a duplicate project which was detected, they would not receive any credit. However, if a student turned in a duplicate project which was not detected, they would receive full credit. To remedy this situation, the instructor decided not only to remove all points for a project but also to deduct an additional 50 points from any student caught cheating on projects. The additional 50-point deduction reduced the overall course grade by an entire letter grade. This new grading policy increased the penalty for detected plagiarism to a higher cost than that of simply failing to submit an assignment.

Fig. 1. CCPE screenshot.
V. THE PROBLEM

While grading the final projects in the fall of 2004, TAs began noticing that a number of projects looked quite similar, if not identical, to one another. This observation was particularly significant given that each TA graded only a portion of all submitted projects and would be unlikely to detect a duplicate graded by another TA. Additionally, detection was even more difficult if a student copied another student's work and then superficially changed fonts, colors, layouts, etc. Therefore, the likelihood was high that many other duplicate projects were not being detected.

Due to this detection problem, the instructor went through each of the 627 final projects to determine the number of duplicates. Each project was individually opened in Microsoft Access, and the creation date was recorded from the database properties. Projects with identical creation dates were then examined more closely to determine if they were indeed duplicates of one another. This entire process took over four hours during the final weeks of the semester and resulted in the discovery and punishment of over 29 students for plagiarism.

While this manual method did result in the detection of duplicate projects, it was time consuming, and it was difficult to detect all duplicates. Additionally, the instructor would rather deter plagiarism than punish it. Therefore, the instructor's motivation was to deter students from plagiarizing in the first place, while still catching those who chose to break the rules.
VI. THE TOOL

To automate the previously manual process and to detect the first three methods of cheating listed in Section IV, a tool affectionately named "Cheater Cheater Pumpkin Eater" (CCPE) was created. CCPE was written using Visual Basic for Applications within a Microsoft Access database (see Fig. 1).
To determine if Microsoft Access projects were duplicates,
properties such as the read-only creation date of the database
and its objects (i.e., tables, queries, forms, reports, etc.) were

TABLE I
PROPERTIES

compared. When a database or an object within a database is


created, a document object (DO) is created which stores properties of the newly created database or object [11]. Each DO
contains standard properties such as creation date, last updated
date, and name [11]. In addition, built-in summary properties
of the database, such as the database title [12], are stored in a
separate DO. The last updated properties date is associated with
changes to the built-in summary properties.
When a database is directly copied, all objects of the database, including the DOs, are also copied. In doing so, the properties of the original database are retained in the new database.
Therefore, copied databases have the same creation date, last
updated properties date, and title as the original. In other words,
the copied database is an exact duplicate of the original and all
of the creation date, last updated date, and name properties for
all of the objects within the copied database are the same as in
the original.
Table I presents the definitions for the general properties, the database properties, and the object properties. The filename is the name of the database file, as specified in the operating system. The database creation date is a date and time stamp which records when the database was first created [11]. The database title is originally set based on the supplied filename when a new database is created. The database title does not change when the filename is changed, but it can be changed via the database's properties [12]. When properties such as the title are changed, the database's last updated properties date is updated. It is important to note that when an object is added or an existing object is changed, the database's last updated properties date is not changed. This date is only changed when one of the database's properties is changed. For example, if a student adds a query and makes changes to a report, the last updated properties date is not changed. On the other hand, if a student accepted the default title of "db1" when the database was created and later changes that title to "SmithJA", the database's last updated properties date is changed.
Each object has a creation date, name, last updated date, and type property. Just as with the database, the creation date is a date and time stamp which records the time of creation [11]. The name property is originally set when the object is created and is changed whenever the object is renamed. When any aspect of an object is changed, the date and time at which the change occurred is reflected in the last updated date [11]. The type property indicates what the object is classified as (i.e., table, query, form, report, etc.).
Each project's pertinent properties were accessed and extracted within CCPE using a combination of Data Access Objects and Microsoft Access database objects [11]. The database's title, creation date, and last updated properties date were extracted at the database level, along with the filename of the database. Additionally, each object's properties (type, name, creation date, and last updated date) were extracted for a more detailed level of comparison. All of the extracted values were stored internally in the CCPE database.
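The properties in Table I amount to one record per database and one per object. The sketch below captures that schema with Python dataclasses; the field names are illustrative assumptions (CCPE itself was written in VBA and stored the extracted values in Access tables):

```python
from dataclasses import dataclass
from datetime import datetime

@dataclass(frozen=True)
class DatabaseRecord:
    """Database-level properties extracted by CCPE (names are illustrative)."""
    filename: str            # name of the file in the operating system
    title: str               # built-in summary "title" property
    created: datetime        # read-only database creation date
    props_updated: datetime  # "last updated properties" date

@dataclass(frozen=True)
class ObjectRecord:
    """Per-object properties: type, name, and the two date stamps."""
    obj_type: str            # table, query, form, report, ...
    name: str
    created: datetime
    updated: datetime        # changes whenever the object is modified or renamed
```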
VII. THE PROCESS

The first step in using CCPE was to specify the directory containing all of the projects to be processed. Each project was automatically opened, and the properties of the database and all its objects were loaded into CCPE (Fig. 2, step 1). Next, CCPE examined the projects for potential duplicates. Fig. 2 outlines the conceptual process used to determine if a project was identical to another. CCPE used database-level properties to determine if a student's project was a copy of another and generated a report listing these projects. A percentage of likeness was then calculated for each project in this report using the objects of each project. Fig. 3 shows an example of this report from CCPE.
Projects were first compared via the database creation date (Fig. 2, step 2). Since it was highly unlikely that multiple projects would have identical creation dates, projects that did were considered potential duplicates and were compared at the next step. The next property compared was the database title (Fig. 2, step 3a). This property is initially set to the database's filename. If no filename is specified, Microsoft Access populates this property by default with the letters "db" followed by a number. For example, "db1" would be the first database created. Even though the course required following a naming convention (the student's last name and initials), some students may have instead used the default database title, "db1". If a student copied a project, the database title would be the same as that of the project from which it was copied. However, since not all students followed directions and some instead accepted the default database title, this alone was not enough to determine if projects were duplicates. More details had to be examined (Fig. 2, step 3b).

Fig. 2. Process at database level.

Fig. 3. Example report.

TABLE II
EXAMPLE 1

TABLE III
EXAMPLE 2

A. Example 1

If the database titles were identical and were not the default title ("db1"), those projects were marked as duplicates. Table II shows an example of three projects meeting these criteria. Three students each submitted individual projects, and each project has a unique filename. However, the projects are duplicates because the databases' creation dates and titles are also identical and the default title of "db1" was not used. Because student Doe followed directions and correctly named her database "DoeJA", it can be determined that students Smith and Zulu copied. This fact could be useful when confronting the students.

TABLE IV
EXAMPLE 3

TABLE V
EXAMPLE 4

B. Example 2

If students submitted individual projects with identical database creation dates and titles, and the titles were the default "db1", those projects would be marked as potential duplicates (Fig. 2, step 3b). Table III shows this example. Student Foo and student Bar potentially copied projects because the database creation dates and titles are identical for both projects. However, since the titles are the default "db1", it cannot be determined with certainty that the two projects are duplicates. The likelihood that two projects would have identical creation dates and that neither student changed the database title from the default is remote, but it is possible. Therefore, the percentage of likeness and other details need to be examined to determine if projects meeting these criteria were copied. These details are discussed later in the paper.
C. Example 3

Suppose two projects were submitted that had identical database creation dates but different database titles. The possibility still exists that these two projects were copies, so further details had to be examined to determine if this was the case (Fig. 2, step 4). The database title can be changed; however, the need to do so is not intuitively obvious. Most students incorrectly assumed that changing the actual filename of the database was enough to change the title. On rare occasions, students did change the title after copying the database from another student. Table IV shows an example of this.

If the database creation date and title were the only properties examined, one would assume that these students just happened to create their projects at exactly the same time. This situation is unlikely, but possible. CCPE examined one other property when such coincidences were discovered to determine whether projects were duplicates. As discussed previously, the database has a last updated properties date that identifies when the properties of the database, rather than the actual data, have been updated.
In Table IV, the database creation date is identical for the two projects, yet the titles are different. CCPE then looks at the last updated properties date. Student Brown's last updated properties date has been changed from its initial value, which would have been the same as the database creation date. This difference in dates shows that the title was probably changed and that these two projects are potentially copies of each other. As in Example 2, the percentage of likeness and other details need to be examined to determine if projects meeting these criteria were indeed duplicates.
D. Example 4

The likelihood that projects would have identical database creation dates is very small, but it does exist. How would CCPE discern between copies and legitimate projects in this case? Table V shows an instance where two projects have identical database creation dates. The titles are different, but the previous example showed that this difference is not enough to distinguish the two projects as unique. The last updated properties date must also be examined. In this case, the last updated properties dates are identical to each other and, more importantly, identical to the database creation date. This comparison shows that the students have not changed the title or any other database property since the database was created. Therefore, these projects are not duplicates.
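The database-level decision process of Fig. 2, covering the four examples above, can be sketched as a single function. This is an illustrative Python rendering under assumed names (the actual tool was VBA; each database is represented here as a dict of its creation date, title, and last updated properties date):

```python
def classify(db_a: dict, db_b: dict) -> str:
    """Compare two projects at the database level (Fig. 2).

    Returns 'duplicate', 'potential' (object-level comparison needed),
    or 'distinct'. Keys 'created', 'title', 'props_updated' are assumed.
    """
    # Step 2: different creation dates -> not copies of each other.
    if db_a["created"] != db_b["created"]:
        return "distinct"
    # Step 3a: identical non-default titles -> marked as duplicates (Example 1).
    if db_a["title"] == db_b["title"]:
        if db_a["title"].lower() != "db1":
            return "duplicate"
        # Step 3b: both left the default title -> potential duplicates (Example 2).
        return "potential"
    # Step 4: titles differ. If either database's properties were updated after
    # creation, a title may have been changed post-copy (Example 3).
    if (db_a["props_updated"] != db_a["created"]
            or db_b["props_updated"] != db_b["created"]):
        return "potential"
    # Neither student ever changed a database property (Example 4).
    return "distinct"
```

The four return paths correspond one-to-one to Examples 1 through 4 in the text.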
While examination at the database level provided evidence of plagiarism, for those projects that CCPE flagged as potential duplicates it was necessary to drill down into their individual objects to provide further evidence of plagiarism, and of its extent. As seen in Examples 2 and 3 above, CCPE was only able to determine that those projects were potential duplicates. In these cases, CCPE also had to examine objects within the databases to determine whether the databases were duplicates. Even when CCPE determined that two projects were copies of each other, it was possible that one student had copied another student's project before the assignment was completed and then did the rest of the work alone. In this case, the student still plagiarized, but the extent could be determined by CCPE, which would help the instructor in determining appropriate consequences for the students.
E. Percentage of Likeness

Ranging from 0% to 100%, the percentage of likeness measured the amount of similarity between a project (the target project) and any projects CCPE considered a duplicate or potential duplicate (see Fig. 3 for an example). A score of 0% indicated that no objects within the target project were similar to any other project's and, thus, the project would not be considered a copy. A score of 100%, on the other hand, indicated all objects within
the target project are the same, in terms of date properties, as those of another project and, thus, the project would be considered a copy. Scores between 0% and 100% signify that some form of duplication exists, with some differences between the target project and other projects. The differences may be due to new objects being created or existing objects being changed after the target project was copied. Regardless of the score, detected projects were inspected by the instructor personally before any accusations of cheating were made.

To calculate the percentage of likeness, points were added for each of the target project's suspected objects. The score was then normalized by dividing by the total number of objects in the target project. Fig. 4 outlines the conceptual process used to calculate the percentage of likeness.

Fig. 4. Process at object level.

First, objects of the target project were compared against another project's objects (Fig. 4, step 1). Next, objects of the same type (i.e., query, form, etc.) having the same creation date were flagged as potential duplicates (Fig. 4, step 2). Comparisons based on the last updated date were then performed on these flagged objects (Fig. 4, step 3). Unlike the database last updated properties date, which is only updated when properties of the database are updated, the last updated date at the object level is updated whenever a change to the object is made, including when it is renamed. However, even if the last updated dates were the same, the possibility still existed that the objects were not duplicates of one another. For instance, two objects could have been created in two separate projects at the exact same time with different names and never changed after being created. Therefore, the object names were compared (Fig. 4, step 4). If the names were the same, then one point was added and the object was considered a copy. Otherwise, zero points were added, since the object was not a duplicate. In the case that the last updated dates were not the same (Fig. 4, step 3), the objects were considered potential duplicates and were assigned half a point, because the object may have been copied and then some aspect of it changed, which caused the last updated date to change.

F. Changes to Instructor Procedure

To make students aware of plagiarism, they were required to read and comment on a case study that illustrated examples of students who had been caught violating the academic integrity policy and the consequences they faced. With the introduction of CCPE, the instructor informed students, both in the syllabus and throughout the semester, that a technological tool was used to flag plagiarized projects. However, the mechanisms with which the tool detected plagiarism (i.e., creation dates) were never revealed to students, even when they were accused of plagiarism. Instead, the instructor manually reviewed projects flagged as duplicates of one another and used the similarities found in the projects as evidence to present to the accused students. In cases where one student's name was in the title of another student's project, this information was brought to the students' attention. In other cases, similarities such as identical appearance, the same errors, etc., were shown to the students. When presented with this information, usually at least one student would confess.
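The object-level scoring rules (Fig. 4) can be sketched in Python as follows. This is an illustrative rendering, not the authors' VBA code; each object is assumed to be a dict of its type, name, creation date, and last updated date, and the rule that each target object scores against at most one matching object is an assumption:

```python
def likeness(target_objs: list, other_objs: list) -> float:
    """Percentage of likeness of a target project against another project.

    Per Fig. 4: objects of the same type and creation date are flagged;
    a flagged pair with matching last updated dates and matching names
    scores 1 point (a copy); matching dates with different names scores 0;
    differing last updated dates score 0.5 (copied, then modified).
    The total is normalized by the number of objects in the target project.
    """
    if not target_objs:
        return 0.0
    points = 0.0
    for t in target_objs:
        for o in other_objs:
            # Steps 1-2: same type and same creation date -> potential duplicate.
            if t["type"] == o["type"] and t["created"] == o["created"]:
                if t["updated"] == o["updated"]:
                    # Steps 3-4: same dates; only a matching name confirms a copy.
                    if t["name"] == o["name"]:
                        points += 1.0
                else:
                    # Copied and then changed: half a point.
                    points += 0.5
                break  # score each target object at most once (assumption)
    return 100.0 * points / len(target_objs)
```

For example, a target project with one exactly matching object and one object that matches on creation date only would score (1 + 0.5) / 2 = 75%.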

VIII. ANALYSIS

Projects from 19 assignments over seven semesters were collected and processed with the CCPE tool. Averaged over 10 runs, processing an assignment consisting of 627 projects took less than 19 seconds, a substantial decrease from the time required by the manual method.

Projects were categorized based on when the student took the course (i.e., fall and spring semesters versus summer semesters). Since students from other universities often take the course during the summer, the categories were created for more consistent comparisons between semesters. Students who attended this university during the regular school year had ties to each other, which provided them more opportunities to cheat than students from other universities had.
Table VI and Fig. 5 present the results of the fall and spring
semesters, while Table VII and Fig. 6 present the results of the
summer semesters. Each table lists the number of projects submitted and how many were detected as duplicates using CCPE,
while the figures show the change in percentage between
semesters. The group size indicates the number of projects
that were copied from one project, including the original. For
instance, a group size of five means there was one original
project and four other students copied that original project.
In fall 2004, 6.7% of all projects turned in were direct duplicates of another project. Fall 2004 was also the only semester
with groups of four and five students copying the exact same
project. As CCPE was not yet being used, fall 2004 is considered the baseline for fall and spring semesters without any technological mechanisms to detect plagiarized projects.
CCPE was first introduced and used in spring of 2005. Each regular school year semester was compared to the baseline semester using a two-proportion z-test to determine if there was a significant difference in the proportion of students plagiarizing work between semesters [13]. Additionally, the average percentage of duplicate projects was compared to the baseline semester. Semesters using CCPE had a significantly lower percentage of copied projects compared to the baseline semester (see Table VIII).
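The two-proportion z-test used for these comparisons is standard; a minimal Python sketch follows. The counts in the usage note are illustrative assumptions, not the paper's reported data:

```python
from math import sqrt, erf

def two_proportion_z(x1: int, n1: int, x2: int, n2: int):
    """Two-proportion z-test: are duplicate rates x1/n1 and x2/n2 different?

    Returns (z, two_sided_p) using the pooled standard error and the
    standard normal distribution.
    """
    p1, p2 = x1 / n1, x2 / n2
    pooled = (x1 + x2) / (n1 + n2)                 # pooled proportion
    se = sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    z = (p1 - p2) / se
    # Two-sided p-value from the standard normal CDF, Phi(t) = (1 + erf(t/sqrt(2)))/2.
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Hypothetical comparison: 42 duplicates out of 627 projects in a baseline
# semester versus 10 out of 600 in a later semester.
z, p = two_proportion_z(42, 627, 10, 600)
```

With counts that far apart the test rejects the null hypothesis of equal proportions at conventional significance levels.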


TABLE VI
FALL AND SPRING SEMESTERS

TABLE VII
SUMMER SEMESTERS

TABLE VIII
FALL AND SPRING SEMESTER COMPARISONS

TABLE IX
SUMMER SEMESTER COMPARISONS

Fig. 5. Fall and spring semesters.

Fig. 6. Summer semesters.

In summer 2004, CCPE was also not yet being used. Therefore, summer 2004 is considered the baseline for summer semesters without any mechanisms to detect duplicate projects. Like the regular school year, summer semesters using CCPE had a significantly lower percentage of copied projects compared to the baseline summer semester (see Table IX).

IX. LIMITATIONS

One acknowledged limitation is the use of a single semester of data as a baseline. Whether the baseline semesters are truly representative of prior semesters is unknown, since previous semesters' projects were not available. However, as semesters measured against both baselines are significantly different, there is some evidence that a change occurred after the introduction of CCPE.
Another limitation exists in CCPE itself. The power of CCPE is in its simplicity; students, however, can find numerous ways to cheat which circumvent the detection capabilities of CCPE. For instance, CCPE is unable to detect collaboration between students, manual recreation of projects, or a student importing data from another project. However, CCPE's focus was on detecting blatant plagiarism, which clearly goes against the academic integrity policy of the course and the university.

Lastly, the fall semester of 2004 was the first semester in which the instructor implemented the minus-50-point policy for those caught violating the course academic integrity policy.


Because of this new penalty, we cannot be sure the drop in plagiarized projects in the summer semesters was completely due to the use of CCPE. Instead, the combination of a harsher penalty and a mechanism with which to enforce the academic integrity policy's zero tolerance of plagiarism is the probable reason for the drop in plagiarized projects. Nevertheless, the harsher penalty was in effect during fall 2004, when the largest percentage of duplicate projects was detected; a drop in plagiarism percentages was seen only after the introduction of CCPE. Unfortunately, the drop may be a consequence of students adopting more sophisticated ways to cheat rather than an actual drop in plagiarism. However, since the mechanisms of CCPE's operation were never revealed, it is unlikely students would have the understanding and, thus, the ability to beat the system. Therefore, the risk would be high (an entire letter grade) for experimenting with different ways to cheat against a system they were unsure how to beat.
X. CONCLUSION
This study examined the effectiveness of a technological tool
in detecting plagiarized Microsoft Access projects in a large
technology-oriented course. A significant decrease in the percentage of plagiarized projects was found when CCPE was used.
The use of a technological tool, such as CCPE, provides educators with a relatively simple method to help reduce blatant
plagiarism in large classes which utilize Microsoft Access in
course projects. The combination of the detection capabilities
of CCPE along with an adequate penalty for violating an academic integrity policy may help deter students from considering
plagiarism in the first place.
REFERENCES

[1] D. L. McCabe, "Cheating among college and university students: A North American perspective," Int. J. Educ. Integ., vol. 1, no. 1, pp. 1–11, Dec. 2005.
[2] E. Roberts, "Strategies for promoting academic integrity in CS courses," in Proc. 32nd Annu. Frontiers in Education Conf., Boston, MA, Nov. 2002, pp. F3G-14–F3G-19.
[3] G. R. S. Weir, M. A. Gordon, and G. MacGregor, "Work in progress—Technology in plagiarism detection and management," in Proc. 34th Annu. Frontiers in Education Conf., Savannah, GA, Oct. 2004, pp. SE3-18–SE3-19.
[4] X. Chen, B. Francia, M. Li, B. McKinnon, and A. Seker, "Shared information and program plagiarism detection," IEEE Trans. Inf. Theory, vol. 50, no. 7, pp. 1545–1551, Jul. 2004.
[5] A. Parker and J. O. Hamblen, "Computer algorithm for plagiarism detection," IEEE Trans. Educ., vol. 32, no. 2, pp. 94–99, May 1989.
[6] S. Schleimer, D. Wilkerson, and A. Aiken, "Winnowing: Local algorithms for document fingerprinting," in Proc. ACM SIGMOD Int. Conf. Management of Data, San Diego, CA, Jun. 2003, pp. 76–85.
[7] M. Joy and M. Luck, "Plagiarism in programming assignments," IEEE Trans. Educ., vol. 42, no. 2, pp. 129–133, May 1999.
[8] C. J. Neill and G. Shanmuganthan, "A Web-enabled plagiarism detection tool," IT Pro, vol. 6, no. 5, pp. 19–23, Sep./Oct. 2004.
[9] T. G. Hill, "MEAGER: Microsoft Excel automated grader," J. Comput. Small Colleges, vol. 18, no. 6, pp. 151–164, Jun. 2003.
[10] T. G. Hill, "Excel grader and Access grader," ACM SIGCSE Bull., vol. 36, no. 2, pp. 101–105, Jun. 2004.
[11] S. Roman, Access Database Design and Programming, 2nd ed. Sebastopol, CA: O'Reilly, 1999, pp. 242–349.
[12] V. Anderson, How to Do Everything With Access 2002. New York: McGraw-Hill, 2001, p. 405.
[13] W. Mendenhall and T. Sincich, A Second Course in Statistics: Regression Analysis, 6th ed. Upper Saddle River, NJ: Pearson, 2003, pp. 51–61.

James A. McCart received the B.S. degree in information systems from Purdue
University, West Lafayette, IN, and the M.S. degree in management information
systems from the University of South Florida (USF), Tampa, in 2002 and 2006,
respectively.
He is working towards the Ph.D. degree in management information systems at USF. His research interests include software testing, security, and
development.

Jay Jarman received the B.S. degree in computer science from East Tennessee
State University, Johnson City, and the M.S. degree in management information
systems from the University of South Florida (USF), Tampa, in 1996 and 2006,
respectively.
He is working towards the Ph.D. degree in management information systems at USF. He served eight years in the U.S. Air Force in the information management field. He has 20 years' experience in developing and managing information systems, as well as teaching and developing information technology courses for software companies.
