
P1: GCO

Journal of Science Education and Technology pp382-jost-367269 February 26, 2002 21:19 Style file version Oct. 23, 2000

Journal of Science Education and Technology, Vol. 11, No. 2, June 2002 (© 2002)

Online Data Collection

Neal W. Topp1 and Bob Pawloski1,2

Online data collection is becoming an essential and efficient tool for evaluators, researchers,
and other educators. This paper touches on elements of the rather short, but eventful history
of online data collection. A brief review of the current literature is presented, followed by a list
of pros and cons to be considered when stepping into online surveying. Finally, there is a brief
look at what makes the online survey run—a database connected to the Internet. For those
who would prefer, alternatives to creating or hosting your own online surveys are offered. The
paper closes with contemplation toward the future of online data gathering.
KEY WORDS: online survey; Web survey; online data collection; technology-enhanced evaluation.

FROM PAPER TO E-MAIL TO WEB-BASED FORMS

In years past, data collection methods for researchers have included oral interviews, hard copy surveys, and electronically scanned bubble sheets, to name a few. More recently, it would be difficult for anyone to have been a cyber-citizen on the World Wide Web3 very long and not have stumbled onto, or maybe even replied to, at least one online survey or poll. With the advent of new technological tools, collecting data online provides several new opportunities, as well as a few new challenges, for the researcher.

The first surveys over the Internet were predominantly e-mail forms. Limited to text-only, as e-mail was in the early days, a respondent would usually type an "x" in between two brackets (i.e. [x]) in the reply, replacing the standard "bubble" on a hard copy survey. Additionally, the response to open-ended questions included some convention for distinguishing the response from the question. The type of convention was up to the survey author and was usually dictated by the particular e-mail client. The response may or may not have remained distinguishable after being passed over the Internet from one e-mail client to another. The data would then be electronically pasted into a database or spreadsheet for analysis.

Eventually, third party software products like Lasso, Tango, and even FrontPage made it possible for moderately technically literate researchers to automatically transfer data from Web forms directly into a database. Gradually, as this type of user-friendly software became easier and more available, it eliminated the need to employ a "CGI4-scripting techie" in order to Web-enable surveys for educational research. Online data collection via the Web has now become a fairly common tool for professional evaluators, as evidenced by the January 2001 thread "Re: Web-based Survey Question" on EvalTalk, the listserv5 of the American Evaluation Association (http://www.eval.org, accessed March 2001).

Despite the increasing popularity, ease of use, lower costs, and thus the democratization of online surveying, some advantages of the e-mail approach over Web-enabled surveys remain, and are enumerated in Christine Smith's article, "Casting the

1 University of Nebraska at Omaha, KH 110, 6001 Dodge St., Omaha, Nebraska 68182.
2 To whom correspondence should be addressed; e-mail: bobpawloski@mail.unomaha.edu
3 World Wide Web (WWW) or Web is a browsing system that provides access to Internet resources based on hypertext graphics-capable documents. A cybercitizen of the World Wide Web is a frequent user of the World Wide Web.
4 CGI (Common Gateway Interface) is a system that allows server software to call on external programs, such as a database application. CGI is usually programmed in a language such as PERL or C++.
5 A listserv is an electronic mailing list whereby messages can be shared by a number of subscribers.
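The bracketed-"x" convention described above left the evaluator to recover structured answers from free text. A minimal sketch of such a parser follows; the question labels, choice text, and e-mail quoting style are invented for illustration and are not taken from this article.

```python
import re

def parse_email_reply(reply: str) -> dict:
    """Extract checked options of the hypothetical form 'Q1. [x] Some choice'."""
    answers = {}
    # Match lines like "Q2. [x] Daily" while ignoring unchecked "[ ]" items.
    pattern = re.compile(r"^(Q\d+)\.\s*\[\s*[xX]\s*\]\s*(.+)$")
    for line in reply.splitlines():
        line = line.strip().lstrip("> ").strip()  # drop e-mail quote marks
        m = pattern.match(line)
        if m:
            answers[m.group(1)] = m.group(2).strip()
    return answers

reply = """
> Q1. [ ] Never
> Q1. [x] Weekly
> Q2. [X] Yes
"""
print(parse_email_reply(reply))  # {'Q1': 'Weekly', 'Q2': 'Yes'}
```

As the article notes, the weakness of this approach is that each e-mail client could mangle the markers in transit, so any real parser had to tolerate the quoting conventions of whatever client the respondent used.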

173
1059-0145/02/0600-0173/0 © 2002 Plenum Publishing Corporation

Net" (1997). These advantages include "facilitative interaction between survey authors and respondents, collapsed geographic boundaries, user convenience, and arguably more candid and extensive response quality" (Smith, 1997, p. 1). E-mail was the preferred method over the Web in a study by Sheehan and Hoy (1999). They utilized Four11.Com (which has since been engulfed by Yahoo.Com) to generate a nationally representative sample of Internet users. The authors acknowledged the flexibility and user friendliness of Web-based forms, but preferred to avoid the broad-based response that occurs when users "self-select" themselves into a survey sample, making it difficult to track response rates.

Other research indicates that elapsed time from setup to delivery is shorter with e-mail surveys (Bowers, 1999), and that response rates for e-mail and Web-based surveys are becoming close to equal (Smith, 1997; Stanton, 1998). A remarkably comprehensive compilation of research on online survey methods can be found at the site of the Research on Internet in Slovenia (RIS) Project (WebSM, 2001). Among the RIS findings in the literature and from RIS's own studies, it has been found that Web surveys that invite participation by e-mail have a lower rate of return than surveys that invite participation by ground mail. It is suggested that respondents tend to regard e-mail invitations for survey response as "SPAM" or "junk e-mail."

The Office of Internet Studies (OIS), located in the Teacher Education Department at the University of Nebraska at Omaha, began doing online data collection in 1994. OIS's history of conducting online research parallels the evolution of online surveying. OIS's first surveys involved research on Nebraska K-12 teachers' use of the computer and the Internet. During the 1994-95 academic year, surveys were e-mailed to a specific population of Nebraska K-12 teachers who had completed training on how to use the Internet. This training was the prerequisite to these teachers each receiving an Internet e-mail account. The 1995-96 survey was delivered in a similar manner with a cumulative population. Several factors affected the rate of return in these first 2 years. Most observed difficulties involved people not using their e-mail, still not knowing how to use e-mail, not having a computer, not being connected to the Internet, or having their e-mail address changed for one reason or another.

By the end of 1996, OIS had the technology in place to deliver the next 2 years of teacher surveys via a Web page. Responses were then submitted automatically to a FileMaker Pro database residing on the OIS Web Server. Tango by EveryWare was the out-of-the-box CGI software used to create the survey form and link it online. Responses to the survey in this third year (1996-97) more than doubled. The following year (1997-98) the migration was made to Version 4 of FileMaker Pro with its new Web Companion feature. Moreover, this time the Web surveys were prefaced by a request to participate sent to each of the 19 Educational Service Unit (ESU) technology specialists, who in turn sent the e-mail announcement to every teacher who had an e-mail account within their jurisdiction. This announcement also informed respondents of the survey's purpose and URL (address on the Web). This sample included essentially all of the teachers in the state of Nebraska who had e-mail accounts at the time (a majority). Unfortunately, whether everyone used their e-mail, or whether addresses had changed as they often do (GVU8, 1997), was impossible to determine within the scope of this study.

In 2000 and 2001 Dr. Neal Topp teamed with Dr. Elliott Soloway of the University of Michigan to conduct the "Snapshot Survey of Nebraska," which surveyed educational technology practice in Nebraska. It is worth noting that this service is available at no cost to schools or school districts (http://www.snapshotsurvey.org; accessed August 2001). The invitations to participate in this survey were again sent through the ESU technology specialists. The responses dropped from 3100 in the first year to 1500 in the second year of the Snapshot Survey of Nebraska. It can be speculated that teachers are starting to regard such e-mail as SPAM, concurring with the findings of RIS. However, other factors could include the changing face of e-mail authority in and between the ESUs and school districts. The GVU8 WWW Survey (GVU8, 1997) notes increasing numbers of Internet users owning more than one e-mail address.

Determining an accurate response rate, along with encouraging potential respondents to complete an online survey (Gaddis, 1998; Meehan and Burns, 1997), are important considerations for online data collection. Meehan and Burns (1997) noted that the inability to determine the number of people on a listserv hinders calculation of accurate response rates. E-mail solicitations have the same type of problem. This finding is in line with the OIS experiences in surveying Nebraska educators. Gaddis (1998) suggests that it is useful to precede any new survey with an introduction and purpose to elicit participation and cooperation, even when there was prior knowledge of

the survey. Using this type of mixed media approach, the online data collector may find a higher response rate. The solicitation to participate might be through television, radio, and newspaper advertisements, or possibly a U.S.-mailed scenic postcard or brochure that teachers may be less likely to throw away. A well-thought-out campaign could possibly bring about a sense of owing something back to their profession in the minds of the respondents. Jeffrey Pfeffer suggests, in his book Managing with Power (1992, p. 106), that building a sense of "reciprocity" is one of the most successful techniques for gaining compliance, and it has often been used in traditional surveys (i.e. a dollar included with the invitation to respond).

Overall, in the evaluation of two United States Department of Education Challenge Grants6 and two PT3 Grants,7 the Office of Internet Studies gained a wealth of experience collecting data online. Another online data collection technique successfully employed by OIS was to monitor and categorize each project's listservs. These e-mail discussion groups were established by the Challenge Grants to give peer and expert support, which could extend beyond face-to-face staff development experiences. Growth was shown in the number of questions posted, as well as in the levels of technical sophistication of the questions, increasing each year over the life of the grant (Abdouch et al., 1998).

WEB SURVEY ADVANTAGES

Accuracy, Ease, and Speed of Data Entry

Online collection of data has several advantages in regard to the actual data collecting process. In essence, the respondents enter the data directly into the database. The data can then be collected and entered into a statistical package within minutes. Using relatively flexible Web page layouts, researchers can more easily ensure accurate data entry, eliminating error caused by entry of the respondent, by office staff reading from handwritten forms, or by smudges causing scanning errors (Smith, 1997). Stanton (1998) suggests that the use of pop-ups, radio buttons, and check boxes enables better control of responses. Tips to enhance accurate data entry by respondents and encourage their participation include: (a) keeping the form short—no more than 4-8 pages (Gaddis, 1998) or 20 min (Bowers, 1999), and (b) keeping the layout easy for the eye to follow (Read, 1991).

Mixing of Types of Media

Web-accessed surveys can incorporate sound, video, and multimedia application products as part of the items. Researchers can then assess participants' interpretations of graphics, art, or music. This may become important in student assessment, as it is based more and more on products rather than test scores (Garrison and Fenton, 1999). The researcher can provide specific and interesting multimedia portfolios full of digital artifacts that help the stakeholders understand the evaluation in a clearer, more meaningful way.

Immediate Results Feedback

During the OIS experience of providing formative evaluation for the U.S. Department of Education Technology Innovation Challenge Grants, participants of multiday teacher workshops were asked to complete online evaluation forms at the end of each day. The workshop planning team was then able to review the evaluations in the evening and make changes in the next day's workshop based on these electronically searchable data. This type of immediate turnaround of data is frequently cited as an advantage (Hays, 1998; Read, 1991; Smith, 1997; Stanton, 1998).

Ease of Access of Data and Results

Online data collection can make annual or summative reports easily accessible to a broader audience, in formats that are easy to access and more meaningful to the user. Web-based reports are much easier to complete when the data is in a database, or all of the data is in digital format, such as audio and video files.

BARRIERS, CHALLENGES, AND DISADVANTAGES

As stated above, collecting data online has several advantages, but of course there are some barriers, challenges, and disadvantages as well.

6 Challenge Grant (The Technology Innovation Challenge Grant Program) is a U.S. Department of Education program that helps schools meet the educational needs of their students through the development of new applications and creative ways to use technology for learning.
7 PT3 (Preparing Tomorrow's Teachers to Use Technology) is a U.S. Department of Education grant program that is designed to support teacher preparation institutions that implement innovative program improvements to prepare technology-proficient educators for 21st century schools.
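The overnight turnaround described under Immediate Results Feedback amounts to a filter-and-summarize pass over records that respondents have already entered for you. A minimal sketch follows; the field names and ratings are invented for illustration, since the article does not specify a schema.

```python
from statistics import mean

# Hypothetical evaluation records, as they might sit in the survey database
# after a day's workshop; "pace_rating" stands in for any Likert-style item.
records = [
    {"date": "2001-06-12", "pace_rating": 4, "comment": "Good pace"},
    {"date": "2001-06-12", "pace_rating": 2, "comment": "Too fast"},
    {"date": "2001-06-11", "pace_rating": 5, "comment": ""},
]

def daily_summary(records, day):
    """Tabulate one day's evaluations for the planning team's evening review."""
    todays = [r for r in records if r["date"] == day]
    return {
        "n": len(todays),
        "mean_pace": mean(r["pace_rating"] for r in todays),
        "comments": [r["comment"] for r in todays if r["comment"]],
    }

print(daily_summary(records, "2001-06-12"))
```

The point is not the arithmetic but the timing: because respondents enter the data directly, a summary like this is available the same evening rather than after weeks of transcription.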

Equipment and Know-How

In order to collect data online, the evaluator must have access to the equipment and the technical know-how to accomplish the task. Web servers, database programs, and technical expertise are all needed, in addition to the traditional knowledge of an evaluation office. Passwords and other control techniques, which are used to ensure the integrity of respondents, require time, expertise, and expense (Stanton, 1998). Furthermore, the size, speed, and reliability of servers, the speed of connection, and the number of simultaneous responses must be considered.

In years past, the major barrier to online data collection has been the unavailability of Web-connected computers to respondents. Several times traditional hard copy surveys have had to be used because the intended respondents lacked computer access. This is changing, but the availability of computer and Web access must still be addressed when considering online data collection.

Situations and Expertise of Respondents

Research on survey taking has shown a demographic skew, as greater levels of education can reduce computer anxiety (Stanton, 1998). Often, a respondent may not have enough computer savvy to complete the collection instrument, whether it be a simple form-type survey, uploading student project examples, or completing an online unit plan with all the needed resources included. Sometimes novice users are unsure that their responses were accepted, and they accidentally create multiple or partial responses (Hays, 1998). Terminal and server lockups and submission errors have occurred when there are numerous simultaneous submissions (Smith, 1997). Accidental or duplicate submissions have occurred often enough to make "computer anxiety" a real factor in any researcher's analysis process.

Reliability of Online Data Collection

Another challenge to online data collection is ownership of the entered data. With paper/pencil collection, the evaluator is usually assured that the person completing the survey is the "real" owner of the responses. With online collection, it is possible that an imposter is completing the form. This is especially a concern when the respondents complete the form on their own, without supervision (Smith, 1997). It is also impossible to tell the focus of attention or the disposition of the respondent (Stanton, 1998).

The question is often posed as to whether the data are the same when respondents complete surveys online compared to paper/pencil collection. Research indicates that the medium causes a significant difference only on timed cognitive tests (Mead and Drasgow, 1993, as cited in Stanton, 1998). A generalization can be made that as respondents become more and more comfortable with technology and the World Wide Web, online-gathered data will increasingly mirror data from traditional methods of collection. It is worth remembering that many of us are immigrants to this technology, while our children are native to it.

In Search of Validity

Validity should always be a concern when using survey instruments—online or not. Web-based surveys do face a threat in the area of predictive validity in that the populations are usually biased toward those who have access to the Internet and/or the inclination to respond to online surveys.

One way to address content validity could be to use an online process. Such a process was used in the OIS development of the Technology Ability Perception Self-Report Instrument (TAPSI), an online instrument. TAPSI served as one facet of the evaluation of the UNOmaha College of Education's PT3 Implementation Grant. In order to go beyond mere "face validity" (Trochim, 1999), the TAPSI developers utilized an innovative technique to determine whether the measure is a good reflection of the construct for which it was designed. During the instrument development phase, content experts from across the state were solicited by e-mail and given the opportunity to rate the appropriateness of individual scale items via a Web-based form. The participation of many of these experts would have been impossible if it had not been facilitated by Web-based forms.

A PEEK UNDER THE HOOD

After considering the above, if one decides that a Web-enabled database survey is the solution, how can it be created without much technological trauma? The mechanics of creating an online survey can be as simple as using "save as" in many applications, such as Microsoft Word. Researchers are likely already familiar with creating survey documents in a word processing application. It is thus a logical and straightforward

progression to create a Hypertext Markup Language (HTML) version of the same, since one of the "save as" options is usually HTML. They can further tweak an online form with such popular programs as HomePage, FrontPage, or DreamWeaver, to name only a few. Many educators and researchers have now been at least introduced to the design of Web pages, using graphical WYSIWYG (What You See Is What You Get) editors.

It is easy to select from among "objects" such as: (a) check boxes, for multiple responses to a single question; (b) radio buttons, limited to one response per question; or (c) a text area, for an open-ended response. These can be dragged to the desired position immediately following the text of the question, formatted appropriately, and validated to be easily read on whatever browser a respondent may use (although the exact rendering largely depends on the browser the end user employs). Each of the above-mentioned "objects" in the form must be tied to the appropriate question in a related database category with the same fieldname (tip: use lower case, no spaces, and 8 or less characters).

Creating the new database with fields (categories) to match the questions is also a relatively simple technical task, with the possible pitfall of forgetting to choose whether each field should be designated for numeric or text input. The concept of response coding will come into play, as it is most desirable to have a numerical value to represent various responses to questions for data analysis. A "yes" response might be coded as a "1" and a "no" as a "0." One can imagine how a Likert scale may also be coded—each response for a single question retaining the same "fieldname," yet a different value for each different response. Care should also be taken to set "default responses" so that nonanswers can be coded properly (Gaddis, 1998).

Possession of a server, or a relatively new desktop computer with a good connection to the Internet 24 h a day, allows the use of FileMaker Pro (on either a Mac or PC system) to serve the survey to the World Wide Web. In the past this required a CGI programmer or possibly one of the third party solutions referenced earlier. Some Web editors now work with FileMaker, through tutorials or "wizards," to correctly link the form to the proper database and response pages. This can be very helpful when it comes to the final important task, which is to test the survey to make sure submission of a form successfully creates a "new record" in the database on the server.

STEP BY STEP INSTRUCTIONS TO WEB-ENABLE A FILEMAKER PRO DATABASE

1. Create the submission form or convert an existing document.
2. Print out the new/converted submission form.
3. Determine database field names and values for each entry on the form and write them on the printout.
4. Create the database.
5. Enter the field names.
6. Modify the submission form to include the field names EXACTLY as they exist in the database.
7. Add the required FileMaker Form tags.
8. Create two additional HTML pages - a "Successful Submission" page and an Error page.
9. Test them!

ALTERNATIVES AND THE FUTURE

This paper is merely a snapshot of the current state of this technology. Already there are several available services that guide the developer through the entire process of creating, posting, hosting,

and compiling results. One such service is SurveyBuilder.com, which can manage the entire process over the Web, charging fees from $5 to $15 per person surveyed. SurveySolutions for the Web software by Perseusdevelopment.com is available for around $179 off the shelf. Either of these solutions can provide an alternative to having a local server (Lake, 1998).

Downloadable survey applications, which incorporate the questionnaire into an executable file, can provide an even more interactive environment. These can include complex skip patterns and take advantage of Windows-based controls, but they are expensive and take time to distribute. Furthermore, the sophistication of users and the size of their connection to the Internet can be serious drawbacks (Bowers, 1999).

Whether the choice is to outsource or to use a local server with the latest software, the near future will undoubtedly make the online process simpler, with more features, better security, and higher levels of data integrity. Analysis programs such as SPSS will likely be fewer and fewer steps away from the online participants. We are already starting to see researchers doing field observations with personal data assistants (PDAs), like the Palm Pilot or Visor. The database fields may be created on the desktop and loaded into these battery-powered hand-held instruments, which can immediately beam coded observations, via a wireless Internet module in the PDA, to a main database on a computer virtually anywhere on the World Wide Web. Imagine that.

REFERENCES

Abdouch, R., Grandgenett, N., Topp, N., Ostler, E., Pawloski, B., Timms, M., and Peterson, J. (1998). The Community Discovered: The Search for Meaning Through the Integration of Art and Technology in K-12 Education. Evaluation Progress Report No. 4. (ED436470)
Bowers, D. K. (1999, Winter 98/Spring 99). FAQ on online research. Marketing Research 10: 45-49.
Gaddis, S. E. (1998, June). How to design online surveys. Training and Development 52: 67-72.
Garrison, S., and Fenton, R. (1999, July-August). Database driven web systems for education. Educational Technology 39: 31-37.
GVU8. (1997). GVU's Eighth WWW User Survey. Georgia Institute of Technology's Graphics, Visualization and Usability Center. Retrieved August 30, 2001, from http://www.gvu.gatech.edu/user_surveys/survey-1997-10/
Hays, R. (1998, April 13). Internet-based surveys provide fast results. Marketing News 32: 13.
Lake, M. (1998). Surveying the scene: Take the sting out of information gathering. Computer Currents. Retrieved August 9, 1999, from http://www.currents.net/magazine/national/1609/cprc1609.html
Meehan, M. L., and Burns, R. C. (1997, March 24-28). E-mail survey of a listserv discussion group: Lessons learned from surveying an electronic network of learners. Paper presented at the Annual Meeting of the American Educational Research Association, Chicago, IL. (ERIC Document Reproduction Service No. ED 411 292)
Read, W. H. (1991, January). Gathering opinion online. HRMagazine 36: 51.
Sheehan, K. B., and Hoy, M. G. (1999, March). Using e-mail to survey Internet users in the United States: Methodology and assessment. Journal of Computer-Mediated Communication 4. Retrieved August 29, 2001, from http://www.ascusc.org/jcmc/vol4/issue3/sheehan.html
Smith, C. B. (1997, June). Casting the Net: Surveying an Internet population. Journal of Computer-Mediated Communication 3. Retrieved August 29, 2001, from http://jcmc.huji.ac.il/vol3/issue1/smith.html
Stanton, J. M. (1998, Fall). An empirical assessment of data collection using the Internet. Personnel Psychology 51: 709-726.
Trochim, W. (1999). The Research Methods Knowledge Base, 1st ed., Atomic Dog Publishing, Cincinnati, OH. Retrieved August 29, 2001, from http://trochim.human.cornell.edu/kb/
WebSM. (2001). WebSM: Web Survey Methodology Study. Research on Internet in Slovenia Project website. Retrieved August 30, 2001, from http://wsm.org
