
COMPUTER-ASSISTED TEXT-ANALYSIS FOR ESL STUDENTS

By: Joy Reid, Peggy Lindstrom, Maggie McCaffrey, and Doug Larson

During the past three years, thousands of composition students at Colorado State University have used word processors and a text-editing software system developed by Bell Laboratories, the UNIX* Writer's Workbench (*UNIX is a trademark of Bell Laboratories), as a means to improve their writing. Research by the directors of this computer-assisted editing project, Drs. Kate Kiefer and Charles Smith, indicates that textual analysis with computers intrigues college writers and speeds the learning of editing skills. In addition, students learn to use the word processors quickly, enjoy the experience of using text editors, and think that using the text-analysis programs does improve their writing.
The success of this freshman composition project led us to investigate the possibilities of using word processors and the text-analysis software with English as a Second Language (ESL) students. Our very limited objectives for this project were to determine
(a) whether or not international students learning English could learn to use computer equipment and (b) whether or not those students
felt that the time invested was worthwhile, both in terms of their writing skills and in terms of the cultural experience. Additionally,
we hoped to see whether working with the computer system would mitigate three significant problems ESL writers have in writing
American academic prose:
1. their lack of preparation in basic English writing skills. Because the TOEFL examination (required by most colleges/universities as a standard for ESL student admission) does not test or evaluate productive writing skills, many ESL students are not prepared for writing in academic curricula. Most ESL students are unable to produce successful writing assignments (e.g., essay examinations, technical reports, analyses, theses) that fulfill the expectations of the academic audience.
2. their difficulty in adapting to the acceptable rhetorical strategies and structures used in American academic prose.
Differing cultural strategies (the elaborateness of the Spanish writer, the circuitousness of the Japanese writer, the inductive presentation of the Thai writer, and the reliance on generalization of the Arabic writer) often interfere with clear communication in written English.
3. their inability to edit and revise written material. Most ESL students have rarely, if ever, written rough drafts; the
revision process is simply not a part of the composing process to which they have been exposed. Students turn in a
single draft with a Leave-it-to-Allah attitude: they have done their initial best; the teacher must do the rest.

Because of perceived ESL student needs, we planned a pilot project for spring semester, 1983. In fall, 1982, we prepared for the
project with a literature search and an error analysis of several hundred ESL student placement examinations. We determined which grammatical errors ESL students most frequently make, rank-ordered those errors in terms of gravity (interference with communication), and decided on a list of ESL errors that could be added to the existing text-analysis program. In addition, we typed ESL student papers into the Writer's Workbench (WWB) system in order to determine which text-editing programs would most benefit our ESL students, and we developed a series of testing instruments, including an attitudinal pre- and post-questionnaire and pre- and post-editing exercises.
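The rank-ordering step described above can be sketched as a simple sort, using modern Python purely for illustration: error types are ordered first by gravity (interference with communication), then by frequency. The error labels, counts, and gravity scores below are invented for the example, not taken from our analysis.

```python
# Illustrative sketch of rank-ordering errors by gravity, then frequency.
# All labels and numbers here are invented examples.

errors = [
    {"type": "article use",  "frequency": 340, "gravity": 1},
    {"type": "verb tense",   "frequency": 212, "gravity": 3},
    {"type": "comma splice", "frequency": 150, "gravity": 2},
    {"type": "word order",   "frequency": 95,  "gravity": 3},
]

# Negate the keys so that higher gravity and higher frequency sort first.
ranked = sorted(errors, key=lambda e: (-e["gravity"], -e["frequency"]))
for e in ranked:
    print(e["type"], "gravity:", e["gravity"], "frequency:", e["frequency"])
```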
The more we learned about word processors and text analysis systems, the more limited our editing objectives were forced to
become. For example, our error-analysis indicated that the most frequent and serious

CALICO Journal, Volume 1 Number 3 40


errors made by ESL writers included verb-tense and verb-agreement errors, comma splices, and vocabulary and word-order problems. The WWB system, however, because of the limitations of its parser (the complicated program that identifies parts of speech), can identify none of those errors because it does not work within a context. We had also hoped to add many two-word verbs (turn off, put on, involved in) that our students misuse, but because so many of those two-word verbs are separable (turn it off, put the clothes on), and because the verbs are often used without the preposition (turn the knob, put it away), the WWB software could not be reliably adapted to identify and correct the entire class of errors.
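The separability problem can be seen in a small sketch (modern Python, purely for illustration, not WWB code): a matcher that looks only for the contiguous form of a two-word verb misses both the separated uses and the particle-less uses described above.

```python
# Sketch of why fixed-string matching cannot reliably flag two-word
# verbs: only the contiguous form is caught.

TWO_WORD_VERBS = ["turn off", "put on"]

def naive_hits(sentence):
    """Return every listed two-word verb that appears verbatim."""
    lower = sentence.lower()
    return [v for v in TWO_WORD_VERBS if v in lower]

print(naive_hits("Please turn off the light."))  # ['turn off']  caught
print(naive_hits("Please turn the light off."))  # []  separated form missed
print(naive_hits("Turn the knob."))              # []  verb used without its particle
```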
As a result of these limitations, we decided to add a list of very specific errors frequently made by ESL writers of several language backgrounds to the WWB DICTION/SUGGEST program. The existing DICTION program consists of a dictionary of more than 500 commonly misused words and phrases; when DICTION flags one of them in an essay, the companion SUGGEST program offers alternatives. Thus, these two programs would reinforce concepts and formats we were teaching in the ESL writing classes.
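The kind of lookup DICTION/SUGGEST performs can be sketched roughly as follows, in modern Python; the phrase list and suggestions here are invented examples for illustration, not entries from the actual WWB dictionary.

```python
# Illustrative sketch of a DICTION/SUGGEST-style pass. The phrases
# and suggestions below are invented, NOT the WWB dictionary.

DICTION = {
    "there is": "consider naming the subject directly",
    "in order to": "to",
    "due to the fact that": "because",
}

def diction_report(text):
    """Return (phrase, hit count, suggestion) for each flagged phrase."""
    lower = text.lower()
    return [
        (phrase, lower.count(phrase), tip)
        for phrase, tip in DICTION.items()
        if phrase in lower
    ]

essay = "There is a problem. In order to fix it, there is a plan."
for phrase, count, tip in diction_report(essay):
    print(f"{phrase!r} x{count}: {tip}")
```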
Finally, we used the PROSE and STYLE programs, which provide a stylistic analysis of the student essay. PROSE compares
the essay to a large number of Freshman Placement examination essays judged to be well-written. We hoped that an objective
analysis of student writing (e.g., readability, sentence structure, and sentence variability) according to standards established for
successful academic writing would encourage our students to work voluntarily on revising their essays. STYLE then analyzes some
of the output from PROSE, statistically analyzing such information as variety in sentence and word lengths, sentence types, and
sentence openings.
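The surface statistics STYLE reports can be approximated in a few lines of modern Python; this is a crude sketch, since the real program works from a full parse and the sentence splitter here is deliberately naive.

```python
import re
import statistics

# Crude sketch of STYLE-like surface statistics: sentence count,
# mean sentence length in words, and length variability.

def sentence_stats(text):
    sentences = [s for s in re.split(r"[.!?]+\s*", text) if s]
    lengths = [len(s.split()) for s in sentences]
    return {
        "sentences": len(sentences),
        "mean_words": statistics.mean(lengths),
        "stdev_words": statistics.pstdev(lengths),
    }

sample = "Short one. This sentence is somewhat longer than the first. The end."
print(sentence_stats(sample))
```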
Following our semester of preparation, in January, 1983, nine ESL writing students in the Colorado State University Intensive English Program volunteered to participate in The Computer Project for a single seven-week term; during the following term (March-May, 1983), nine students also participated. Students who wished to participate but who did not know how to type enrolled in a special ten-week ESL typing class prior to their admission to the project. Two of the students participated both terms; seven of the entire group were women and eight were undergraduates. Five languages (Spanish, Arabic, Thai, Chinese, and Indonesian) and nine countries were represented, and nearly as many major fields as participants.
Each student spent a minimum of one scheduled hour each week in the computer lab learning to use the word processors,
entering their writing assignments on the computers, and running the series of WWB programs on their essays. Initially, the students
used an online LEARN program that introduced them, step-by-step, to the computer and the word processing keys. The program,
which took about an hour for students to complete, was developed by Charles Smith and written specifically for freshmen new to
computer technology; as such, the program anticipates potential problems: "Don't try to memorize any rules; just do what each screen tells you to do"; "Don't panic! If _________ occurs, you probably hit the ________ key by accident." Moreover, Computer Lab monitors (trained CSU undergraduates) were available to help any student whose hand was raised. The international students had no serious problems, and certainly no problems surfaced that the Lab monitors had not previously encountered with freshmen.
Some interesting experiential information accrued during the semester. For example, when our ESL students discovered that
they were getting only a limited number of the WWB programs, many asked for the fuller set that the freshman composition students
received. Contrary to our expectations, the students were not intimidated by the additional text-editing programs; rather, several
indicated that they had learned about real university writing from the extended print-outs. Another surprise, for both the students and
the teachers in the project, was the extent of the positive cultural experience. As the students became familiar with the computer lab,
they were able to assist the less experienced freshmen sitting at the terminals next to them, to converse with the monitors about
problems they had, to bring their friends to the lab and show them the word processors, and to discuss the project with their non-
participant ESL classmates.

Conclusions:
Based on student and teacher experience during the semester of the pilot project, our limited objectives, determining the capability of the ESL students to use the computer system and to benefit affectively, were met. The students learned to use the word processors without major difficulties, although the beginning typists from non-Roman alphabet languages progressed somewhat more slowly, especially at first. Many of the students volunteered perceptive insights about American academic prose that were directly related to their work with the WWB programs: "My level was grade 3 reading: terrible! I didn't know that 'there is' was not good. I used this word 13 times in one paragraph, too much, umm?" Some went beyond our expectations ("I want to use different verbs, not always 'is' and 'are'"); others barely managed to get their assignments typed each week and did little revision. But all the students enjoyed the cultural experience, nearly all thought that access to the word processors had improved their understanding of writing and revising academic prose, and all were proud of the expertise they had achieved. The post-project attitudinal surveys indicated trends similar to those seen in surveys completed by university freshmen: 1) the word processors were not difficult to learn to use; 2) one hour per week was not enough time; and 3) the computer programs helped students learn about their writing.
Did working with the word processors and the text-analysis programs improve the writing of the ESL student participants?
That is, would the basic writing skills of these students be better developed; would their awareness of rhetorical strategies and formats



result in more appropriate writing; and would their monitoring skills be more sophisticated than those of the students who did not participate in the project? To discover the answer, we holistically scored the 200 pre- and post-writing samples from the two terms (and the three classes) of ESL writing students involved in the project (as both participants and controls). The holistic evaluative criteria included overall and inner-paragraph organization, quality and quantity of specific detail, and number and gravity of grammar and sentence-structure errors. The results showed that the students in all the classes improved their writing skills between each pre- and post-test. However, the data do not show a significant difference in the gain in overall writing skills for the students who worked with the word processors and WWB programs.
The lack of a significant gain may be due to several factors: the student participants were drawn from three levels of English language proficiency, and their developmental maturity in assimilating writing skills may have differed considerably; they represented only a small minority of each class, and therefore no work with the computer print-outs or the WWB system was done in any of their regular classes; the limited time on the computers (seven weeks, a minimum of seven hours) may not have given the students enough time to make significant gains; and the choice to holistically evaluate the pre- and post-writing samples may have resulted in too general a set of results.
Despite the mixed results of the holistic analysis, our future plans include the use of word processors and text-analysis
software for our international students. We intend to require one advanced ESL writing class to work with the computers
(participation from other classes will continue to be voluntary, but the volunteers will not be included in our research data, except in
possible longitudinal studies). In this way, we will have the opportunity to focus the attention of one class on the editing processes
prompted by the WWB system and to encourage the students to spend more time revising their work. As we continue, we will collect
and analyze data to assess the effect of computers and software for textual analysis on ESL students.

Term 1                 Control Group    CAI Participants
  Pre-test    N            30                9
              Mean          5.583            5.667
              SD            1.515            0.3953
  Post-test   N            30                8
              Mean          6.918            7.000
              SD            1.110            0.756
  Gain                      1.400            1.333

Term 2                 Control Group    CAI Participants
  Pre-test    N            40                8
              Mean          5.16             4.250
              SD            2.042            0.756
  Post-test   N            40                8
              Mean          6.11             6.125
              SD            1.704            0.856
  Gain                       .95             1.875

Figure 1. Evaluation Results for WWB Participants

NOTES

1. For a more complete description of Kiefer's and Smith's project and research, see their articles, "Writer's Workbench: Computers and Writing Instruction," Proceedings of the Future of Literacy Conference, Center for the Study of Adult Learning, University of Maryland, Baltimore County, Maryland (forthcoming), and "Textual Analysis with Computer Software," Research in the Teaching of English (forthcoming).

2. For a technical description of the software that comprises the WWB system, see Lorinda L. Cherry, Mary L. Fox, Lawrence T. Frase, Patricia S. Gingrich, Stacy A. Keenan, and Nina H. MacDonald, "Computer Aids for Text Analysis," Bell Laboratories Record, May/June (1983), 10-16, and Nina H. MacDonald, Lawrence T. Frase, Patricia S. Gingrich, and Stacy A. Keenan, "The Writer's Workbench: Computer Aids for Text Analysis," Educational Psychologist, 17:3 (1982), 172-179.

3. For a brief description of contrastive rhetoric, see Joy Reid, "The Linear Product of American Thought," College Composition and Communication (forthcoming); for a more complete discussion, see the entire third volume of the Annual Review of Applied Linguistics (Newbury House Publications, 1983).

4. Present ESL computer software is limited to drill and practice. The WWB text-analysis software is the first of its kind, and the Kiefer-Smith project the first with the WWB programs in the nation.



5. For more information concerning syntactic errors in high-intermediate and advanced ESL student papers, we used Charles Seibel's "Error Marking System for Writing Lab Study" (unpublished research, University of Kansas Applied English Center) and Barbara Kroll's unpublished dissertation, "Levels of Error in ESL Composition" (USC, 1982).

6. All the teachers involved in this project thank Kate Kiefer and Charles Smith for their assistance and support. They allowed us to adapt their attitudinal questionnaire and advised us on the preparation of all the other evaluative instruments. Additionally, they patiently instructed us in the use of the computers, the limitations of the text-editing and text-analysis software, and the overall logistics of developing and implementing the project.

7. IBM is completing work on a parser that can, with sufficient time, identify such problems. See G. E. Heidorn, K. Jensen, L. A. Miller, and R. J. Byrd, "The EPISTLE Text-Critiquing System," IBM Systems Journal, 21:3 (1982), 305-326.

8. The SUGGEST program refers students to the Glossary when the information about the DICTION hit is too extensive to be included in a single line.

9. The WWB system allows different standards to be set according to the needs of the program involved. Kiefer and Smith established the standards for their project, and we used those standards.

10. Sign-up sheets were available for additional computer time; most of the ESL students did sign up for extra time, especially at the beginning of each term.

11. The students did no composing at the computer; time, space, and program constraints required that they always have a text ready to put on the computers when they arrived at the Lab, and we arranged our classroom assignments so that this was possible.

12. We used a 9-point scale for holistic analysis of the student writing samples; the evaluation guide is one developed for use with placement examinations within the Intensive English Program. For more information on the process, see Joy M. Reid and Maryann O'Brian, "The Applications of Holistic Grading in an ESL Program," The American Language Journal, volume 2, September, 1983 (forthcoming).

