Available online at www.sciencedirect.com


Computers and Composition 24 (2007) 443–461

ESL students’ experiences of online peer feedback

Martin Guardado, Ling Shi
The University of British Columbia, 2125 Main Mall, Vancouver, BC V6T 1Z4, Canada

With the popularity of computer technology, online peer feedback has become common in university
writing classes. This paper reports an exploratory study of 22 English as a Second Language (ESL) stu-
dents’ experiences of online peer feedback in a sheltered credit course at a western-Canadian university.
Based on analyses of the electronic feedback (e-feedback) participants received, comparisons of their
initial and revised drafts, and follow-up interviews, the study shows that e-feedback, while eliminating
the logistical problems of carrying papers around, retains some of the best features of traditional written
feedback, including a text-only environment that pushes students to write balanced comments with an
awareness of the audience’s needs, and an anonymity that allows peers to make critical comments
on each other’s writing. However, the participating ESL students expressed little confidence in peer
commenting in general. Some shied away from the demand to express and clarify meaning, which
turned online peer feedback into a one-way communication process and left a high percentage of peer
comments unaddressed. An intervention of face-to-face class discussion, with the teacher’s guidance to
clarify comments in question, is suggested to maximize the effect of online peer feedback.
© 2007 Elsevier Inc. All rights reserved.

Keywords: e-Feedback; Peer feedback; Revision; Second-language writing; English as a second language; Uni-
versity writing

1. Introduction

Having students comment on each other’s writing has long been an important and useful instructional
practice in writing classes. Summarizing the advantages of peer feedback in second language
(L2) classrooms, Jun Liu and Jette G. Hansen (2002) stated that peer feedback not only
increases an awareness of audience needs by creating a collaborative drafting process but also
provides opportunities for ESL students to practice English in a meaningful context. These
advantages, however, as Liu and Hansen pointed out, are constrained by (a) students’ cultural
backgrounds, which influence their classroom behaviors and the amount of participation in peer
discussions, (b) students’ level of English proficiency, which affects their ability to provide and
comprehend peer feedback, and (c) the mode of peer feedback. Compared with face-to-face
peer feedback, written peer feedback using a checklist or comment form offers the advantages
of anonymity and a text-only environment. With the development of information technology,
the traditional written feedback has taken on a new dimension. As a digital written form,
student commentary can be transmitted electronically without the logistical complications of
copying and distributing papers (Tannacito & Tuzi, 2002). Such feedback can be in the form
of synchronous chat system interactions, asynchronous email, and bulletin-board postings.
Being relatively simple to use, these technologies are becoming popular in university writing
classes. Of central concern is how such e-feedback differs from traditional feedback in its effects
on students’ commenting behaviors and on the quality of the revisions it generates.
To explore the effect of e-feedback, a growing body of research has compared traditional
face-to-face peer response groups with computer-mediated peer conferences in the context
of university or pre-college writing classes. A number of such studies have focused on L2 or
ESL/EFL (English as a foreign language) students (Braine, 2001; DiGiovanni & Nagaswami,
2001; Jones, Garralda, Li, & Lock, 2006; Liu & Sadler, 2003; Matsumura & Hann, 2004;
Sullivan & Pratt, 1996; Tuzi, 2004). Researchers have examined the peer feedback interactions
(DiGiovanni & Nagaswami, 2001; Honeycutt, 2001; Jones et al., 2006; Mabrito, 1991; Sirc
& Reynolds, 1990; Strenski, Feagin, & Singer, 2005) and/or the effect of peer comments
on revision and quality of the final paper (Braine, 2001; Hewett, 2000; Liu & Sadler, 2003;
Matsumura & Hann, 2004; Sullivan & Pratt, 1996; Tuzi, 2004). Their findings have suggested
that e-feedback has advantages in terms of interactive textual exchange and greater student
participation, although its impact on revision seems to vary across individual studies.

1.1. Interactive textual exchange in online peer feedback

Researchers have identified how peer feedback in cyberspace retains the advantage of tra-
ditional written feedback as students put words together to write about writing. As Mark
Mabrito (1991) put it, “the situation demands not only writing but also the skillful verbaliza-
tion of one’s thoughts and ideas about writing and a peer’s text” (p. 510). When focusing on
writing in cyberspace, students, in Jennifer Jordan-Henley and Barry M. Maid’s (1995) words,
“are released from much of the responsibility that a face-to-face encounter sometimes forces
on them. They are not affected, for instance, by students with bad breath, or by students who
make them uncomfortable in some vague way, or by students who are angry with a teacher”
(p. 212). While released from these responsibilities in the traditional mode, students take on
new responsibilities in online feedback. For example, in peer e-feedback activities, students
must still be sensitive to the audience’s needs and follow a clear, concise, and informative
style without having the benefit of facial cues or body language that face-to-face interactions
offer (Breuch & Racine, 2000). These constraints pose more challenges to students engaged
in peer e-feedback but perhaps also in a way persuade them to be better writers. Accord-
ing to Lee Honeycutt’s (2001) analyses of students’ conferencing transcripts, students in an
online environment are only linguistically co-present, so they must make explicit references
by using indexical devices, such as page numbers, quotations, and paraphrases to maintain
common document focus and make coherent evaluative comments. One should note that Hon-
eycutt’s observations were made in online conferencing, and other methods such as exchanging
Microsoft Word documents would have offered possibilities of using highlighting or inserting
comments to make explicit references. The demand for greater document-related referencing
in providing written feedback is conceptualized by Herbert H. Clark and Susan E. Brennan
(1991) as the cost of communicative grounding in online peer feedback. By obliging students to
focus on making coherent comments in a text-only environment, online peer feedback retains
the advantages of traditional written feedback to foster the development of metalanguage and
awareness about written communication.
The textual nature of online peer feedback underlies the differences between asynchronous
and synchronous conferencing. Both are electronic communications, but the former
is closer to textuality whereas the latter is closer to orality. Compared to asynchronous con-
ferencing, which has been observed to have reduced interactivity due to the lack of nonverbal
cues and the delay of interaction (Braine, 2001; Liu & Sadler, 2003; Tuzi, 2004), synchronous
or real-time peer commenting invites quick exchanges and personal involvement (Honeycutt,
2001). It is, therefore, effective in small group peer reviews because it increases student par-
ticipation (Braine, 2004) and encourages students to request specific suggestions for revisions
(Tannacito & Tuzi, 2002). However, synchronous networking can be unnatural when it requires
“a roomful of people to type to each other rather than hold a discussion” (Susser, 1993, p.
71). Indeed, researchers have found synchronous chats sometimes disjointed, scattered, con-
fusing, and disruptive (Braine, 2001; Honeycutt, 2001; Liu & Sadler, 2003). In comparison,
asynchronous email feedback has been found to have less time pressure (Tannacito & Tuzi,
2002), and to be more serious (Honeycutt, 2001) and more effective as students learn to adapt
their verbal behaviors over time to reach similar interpersonal levels observed in synchronous
chats (Walther, 1996). The differences between synchronous and asynchronous conferencing
not only suggest various interactive textual exchanges that e-feedback can generate but also
explain some of the inconsistent findings about the advantages and disadvantages of peer
e-feedback. If the shift from asynchronous to synchronous conferencing represents a change in
technology, one needs to be aware of how similar technological changes might have influenced,
and will continue to influence, the findings of research studies.

1.2. Student participation in online peer feedback

Another advantage of online peer feedback is the possibility of a less threatening envi-
ronment that encourages greater and more equal member participation than face-to-face
conferencing. ESL students, in particular, seem to benefit from such an environment. For
example, Elaine DiGiovanni and Girija Nagaswami (2001) observed that students in pre-
college ESL writing classes participated in online peer feedback comfortably and remained
on task. Similarly, Nancy Sullivan and Ellen Pratt (1996) found that computer-assisted ESL
peer discussion had 100% participation compared to only 50% participation in the face-to-face
class. Explaining how online interactions encouraged participation, Rodney H. Jones, Angel
Garralda, David C.S. Li, and Graham Lock (2006) suggested the electronic environment freed
ESL students from the embarrassment they experienced in face-to-face encounters when speaking
English with peers who shared their first language. Jun Liu and Randall W.
Sadler (2003) also noted that the online environment facilitated participation of ESL stu-
dents whose cultures placed a strong value on listening and silence in traditional classrooms.
Japanese students, for instance, are socialized in an educational system where student-initiated
classroom interaction is discouraged. The non-threatening environment of this type of peer
feedback seems to be related to the anonymity that cyberspace offers. Without worrying about
how handwriting in the traditional paper-based mode might reveal the reviewer’s identity,
some students said that using pseudonyms in cyberspace allowed them to make honest com-
ments and try out different roles or develop a “writerly persona” (Strenski et al., 2005, p.
195). High-apprehensive student writers in Mabrito’s study also experienced more freedom to
participate anonymously in email peer feedback and “to engage in a collaborative venture that
they might otherwise have avoided in a face-to-face setting” (1991, p. 529). Although there is
some concern about anonymity that might discourage a sense of community (Bump, 1990),
the advantages of having greater student participation seem to outweigh such disadvantages.

1.3. Impact on revision

The differences between traditional and electronic environments are reflected in the different
types of peer comments they generate. For instance, Jones et al. (2006) compared peer inter-
actions online with those in the traditional face-to-face mode and found the former generated
more feedback on global concerns of content and the writing process whereas the latter focused
more on local textual issues of grammar, style, and word choice. However, other researchers
found that peer e-feedback using Microsoft Word or other special programs designed for
responding to writing actually generated more concrete and revision-oriented comments than
traditional oral or paper-based feedback (Hewett, 2000; Liu & Sadler, 2003). Freed from the
face-to-face encounters of the traditional classroom, some online group discussions developed
into critical and effective negotiations (DiGiovanni & Nagaswami, 2001; Sullivan & Pratt,
1996). Along with the findings about various types of e-feedback were observations of the dif-
ferent experiences of student reviewers. Depending on their experiences of reading and writing
on computers, some students found reading long texts on a screen difficult (Van der Geest &
Remmers, 1994) whereas others enjoyed writing elaborate comments without worrying about
space (Liu & Sadler, 2003).
Peer comments generated via various media shape students’ revisions. Relevant research
has mostly been conducted in L2 contexts and seems inconclusive about the quality of revisions
following electronic peer feedback. For example, L2 students using synchronous peer confer-
encing were found to produce writing rated either lower (Braine, 2001) or higher (Sullivan &
Pratt, 1996) than those revised after traditional peer feedback. By comparing students’ initial
and revised drafts after e-feedback and oral feedback, Frank Tuzi (2004) found L2 students
made more macro-level revisions following e-feedback, adding new information and revising
structures at clause, sentence, and paragraph levels. In addition, Shoichi Matsumura and George
Hann (2004) reported that ESL students who did not post their own drafts online because of
high computer anxiety also benefited from looking at other classmates’ drafts and related
comments. As e-feedback may affect individual students differently, researchers have suggested
that training peer reviewers (Breuch & Racine, 2000; Jordan-Henley & Maid, 1995; Rilling, 2005)
and combining online and face-to-face feedback should result in the greatest degree of
improvement in essay writing (e.g., Liu & Sadler, 2003; Matsumura & Hann, 2004; Tuzi, 2004).

The above review suggests the need for further research on the effect of online peer feedback in L2
contexts. On the one hand, L2 students were observed to participate more in non-threatening
online environments than in traditional settings (e.g., Braine, 2004; Jones et al., 2006; Liu
& Sadler, 2003; Sullivan & Pratt, 1996); the quality of their revisions or final papers, on
the other hand, suggests differing impacts of e-feedback. As a result, online feedback in L2
contexts was described as either an obstacle (Braine, 2001), a help (DiGiovanni & Nagaswami,
2001; Tuzi, 2004) or a mixture of limitation and liberation that should, therefore, be combined
with traditional face-to-face sessions (Liu & Sadler, 2003; Matsumura & Hann, 2004; Tuzi,
2004). These conflicting findings call for further explorations of online ESL peer feedback.
In response to this need, the present study was conducted in the natural setting of a sheltered
course for Japanese exchange students at a western-Canadian university. Based on the analyses
of transcripts of students’ online conferencing, comparisons of students’ initial and revised
drafts, and follow-up interviews, this study aims to answer the following research questions:

1. What types of online peer feedback did student authors receive?
2. Did student authors follow peer comments in their revisions? If so, how did they perceive such experiences?
3. Did student authors ignore peer comments in their revisions? If so, how did they perceive such experiences?

2. Method

2.1. Research setting and participants

The study was conducted in a year-abroad academic exchange program that brings 100
Japanese university students to a Canadian university every year. Typically, the participants
in the program are second- and third-year students from various faculties of a Japanese
university, with TOEFL scores averaging around 500. In Term One, all students take
specially designed sheltered credit courses in intercultural communication, social science
research, media, and academic writing. Based on a combination of TOEFL scores and Term
One marks, some students are eligible to take elective, non-sheltered credit courses in various
faculties in Term Two.
In the year (2004–2005) when the present study was conducted, many students (see Table 1)
were majoring in international relations, business administration, social science, and letters.
Most of them (n = 94) were second-year students. The female/male distribution was about
66/33. Sixty out of the 100 students took a three-credit course in intercultural communication,
during which the present data were collected. The course, taught by the first author, was divided
into three sections that met for 3 hours once a week. Additionally, a teaching assistant provided
language and content support for 90 minutes every week. The textbook used was Introduction to
Intercultural Communication by Fred E. Jandt (2004). In the year of the study, the course evalu-
ations included three 500-word essays, eight reflection journals, oral presentations, exams, and
other activities. The feedback activities were an integral part of the course design. The first two
essays used face-to-face peer feedback supervised by the teaching assistant, and the third used

Table 1
Students’ profile
ID Age Gender Program year Faculty/major Incoming TOEFL score
S1 20 F 2 International relations 487
S2 20 F 2 Social science 487
S3 20 F 2 Business administration 480
S4 23 F 3 Letters 487
S5 21 M 2 International relations 523
S6 20 F 2 Social science 513
S7 20 F 2 Letters 507
S8 20 M 2 Law 550
S9 21 M 3 Economics 513
S10 22 M 3 Applied science 493
S11 19 M 2 International relations 497
S12 20 F 2 International relations 520
S13 20 F 2 International relations 497
S14 20 F 2 Letters 527
S15 20 F 2 International relations 523
S16 20 M 2 Social science 473
S17 20 F 2 Letters 507
S18 20 F 2 Policy science 497
S19 21 M 2 Social science 463
S20 21 M 2 Law 493
S21 25 F 3 Letters 533
S22 20 F 2 Business administration 500

online peer feedback supervised by the instructor/co-researcher. All students had the freedom
to select their essay topics within the subject matter of the course. Some of the paper titles were
“Individualism and Collectivism in Japan,” “The Socialization of Asians in Canada,” “Nonver-
bal Communication in Theatre,” “The True Multicultural Nation,” “The Status of Aboriginal
People,” and “Difference of Kinesics between Japan and Western Countries.”

2.2. Process of online peer feedback

The program coordinator helped us invite students to participate in the study and distributed
the consent forms, explaining that the instructor would not have access to the names of par-
ticipating and non-participating students until final course marks had been submitted to the
university. Prior to conducting the online peer feedback activity, students participated in two
face-to-face peer feedback exercises using a checklist (see Appendix A) and working in groups
of three supervised by the course teaching assistant. The online peer feedback discussions were
then carried out using Blackboard, a course-management application that is available on an
annual registration basis. Blackboard offers a plethora of features including synchronous chat,
asynchronous electronic discussion boards, assignment drop boxes, class/individual email
tools, assessment tools, resource centers, and more. For this particular project, the electronic
discussion board was utilized. All students were assigned individual usernames and passwords
for access. The essays were posted with the authors’ names, but the feedback was anonymous.
The online peer feedback activity was conducted for the final 500-word essay assignment.
On the day of the feedback activity, students were taken to the computer lab for the second half
of the class. They logged on to the course’s Blackboard web site and entered the discussion
board where the essay drafts had been previously posted. The instructor/co-researcher then
went around the room with a wireless Apple iBook computer and randomly assigned essays
to students. He also gave detailed oral instructions and provided copies of checklists to assist
in the evaluation. Students had the option of reading the essays online or downloading them
and printing them out. They were also allowed to read everyone’s essays and feedback.
Additionally, student authors had the option of “talking back” to the peer reviewers through
the same bulletin board and asking for clarification or explanation of their comments. On
average, students spent 60–75 minutes in class doing the feedback activity. They then had
two weeks to continue the feedback activity and to revise and submit their final papers to a
separate forum on the discussion board.

2.3. Data collection and analyses

Of the 60 students who participated in the online peer feedback, 22 volunteered to participate
in the follow-up interviews to comment on their experiences of online peer feedback (see
Appendix B for the interview guide). The interviews lasted an average of 30 minutes each
and were conducted in English by a research assistant. All interviews were digitally recorded
and transcribed by a second research assistant. The present data analyses focused on the 22
students’ interview comments, their peer feedback, and their initial and revised essays.
We first printed out the review comments the 22 participating students received and coded
them according to the checklist the reviewers had been advised to follow. Because the students
followed the checklist in writing their feedback, it was relatively easy for us to code their
comments. The majority of the comments were coded as one of the eight items on the
checklist (introduction, thesis statement, support, topic sentences, unity, coherence, content,
and conclusion). We only needed to add three more items to cover comments on grammar,
reviewers’ personal reactions (e.g., “In my personal opinion, your experience is interesting”),
and general comments on the whole paper (e.g., “The paper is well organized and interesting”).
We then traced the revisions made by the 22 student authors by comparing their initial and
revised drafts using Microsoft Word’s Compare and Merge Documents tool, which highlights
additions and deletions. We tallied these highlighted words (excluding format changes)
and then matched them to the negative peer comments to identify revisions based on peer
comments and those that were self-generated. Finally, students’ comments from interviews
were used to interpret students’ perceived experiences of how they followed or ignored online
peer feedback in their revisions.
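The word-level tally of additions and deletions described above can be illustrated programmatically. The following is a minimal sketch using Python’s difflib; note that it is only an approximation of the procedure (the study itself used Microsoft Word’s Compare and Merge Documents tool, not a script), and the two-draft example is invented, not taken from the study’s data:

```python
import difflib

def tally_revisions(original: str, revised: str) -> dict:
    """Count words deleted from and added to a draft,
    mirroring the additions/deletions tally in Section 2.3."""
    orig_words = original.split()
    rev_words = revised.split()
    matcher = difflib.SequenceMatcher(a=orig_words, b=rev_words)
    deletions = additions = 0
    for op, i1, i2, j1, j2 in matcher.get_opcodes():
        if op in ("delete", "replace"):
            deletions += i2 - i1  # words removed from the original
        if op in ("insert", "replace"):
            additions += j2 - j1  # words new in the revision
    return {"deletions": deletions, "additions": additions,
            "total": deletions + additions}

# Invented two-draft example
first_draft = "However I think people cannot help having stereotype"
final_draft = "However, I think stereotype is not always negative"
print(tally_revisions(first_draft, final_draft))
```

A tally like this could then be matched by hand against the negative peer comments, as the authors did, to separate peer-prompted from self-generated revisions.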

3. Findings and discussion

3.1. A balance of positive and negative comments

The 22 participating student authors each received 2–4 feedback postings for a total of 53
(Table 2). Based on the coding, the 53 responses generated a total of 128 positive comments
and 93 negative comments. All the peer feedback included both positive and negative com-
Table 2
Coding of online peer feedback
Student No. of Positive or Introduction Thesis Support Topic Unity Coherence Content Conclusion Grammar/ Personal General Total no. of
authors reviews negative statement sentence editing reactions comments

P 1 1 1 1 4
S1 2
N 1 1 1 1 1* 4/1*

P 1 2 2 5
S2 2
N 1 1* 1/1*
P 1 1 1 1 4
S3 2
N 1 1* 1/1*

P 1 1 2 4
S4 2
N 1* 1* 1 1 2/2*

P 1 1 2 4
S5 2
N 1/1* 1 1* 1* 2/3*

P 1 1 1 1 1 5
S6 2
N 1 1* 1 2/1*
P 1 1 1 2 2 1 8
S7 2
N 1* 3* 4*

P 1 1 1 2 5
S8 3
N 2* 1 1/2*

P 1 1 3 3 8
S9 4
N 1 1* 1* 1* 1/3*
P 1 1 1 3
S10 2
N 1 1 1* 1/1* 1* 3/3*
P 1 1 1 1 1 5
S11 2
N 1* 1*
P 1 1 2 1 1 1 1 2 10
S12 3
N 1* 1 2* 1/2* 2/5*

P 2 2 1 5
S13 3
N 2/1* 1 1* 1* 3/3*
P 1 1 1 1 1 1 1 2 9
S14 3
N 1/1* 1 1 1 1 5/1*

P 1 3 3 2 9
S15 4
N 1 1* 1* 1 2/2*

P 1 1 2 4
S16 2
N 1 1 1/1* 3/1*

P 1 1 1 1 2 6
S17 2
N 1* 1* 2* 4*

P 1 2 1 1 5
S18 2
N 5* 5*

P 1 1 2
S19 2
N 1 1* 1 2/1*

P 2 1 2 1 6
S20 2
N 1* 1*

P 1 1 2 1 2 1 1 1 1 1 11
S21 3
N 1* 1* 1* 1* 2* 1 1/6*

P 1 2 1 1 5
S22 2
N 1/1* 1* 1* 3* 1/6*

P 10 5 15 3 5 9 25 8 16 32 128
Total 53
N 3/3* 5/6* 5/6* 4/1* 9* 5/1* 1/8* 5/6* 5/15* 2* 3 36/57*

Negative comments are marked with “*” when containing specific suggestions for revision.

ments, and many of the negative comments (57 out of 93) were revision-oriented with specific
suggestions for revision.
An examination of the peer feedback suggests that responses typically start with a general positive
comment, followed by specific positive comments, and then move on to negative comments.
The following example illustrates such a balanced response. (Students’ comments and writing
are cited verbatim in this paper.)
Peer comments for S4: I think your essay is well organized. And there are many examples that
help reader to understand. Especially 3rd paragraph’s example is good. It is very interesting and
useful. You also used helpful coherences so I can read easily. I think it is very important thing
to write papers. But I found some points that you can improve. At first, your thesis sentence
is a little bit common opinion so if you consider about it . . . it will be better. And I hope my
opinion help you to write papers next time. Thank you.
The above quotation sets off the review with a positive tone: “I think your essay is well
organized.” Such general positive comments are found in 32 reviews. To maintain the positive
tone, some reviewers even preface their negative comments with a positive phrase, as the following
excerpt illustrates:
Peer comments for S1: Your essay was easy to understand but there was no clear thesis sentence
so I could not understand what you want to say in the conclusion. Your explanation about the
Kalasha was clear and interesting but I can’t see what do you think about them.
In the above example, the conjunction, “but,” signals the turn from positive to negative
comments. The intention of the reviewer to sugarcoat the criticism is obvious because the
positive comment “your essay was easy to understand” contradicts the negative comment
“there was no clear thesis sentence so I could not understand what you want to say in the
conclusion.” The second negative comment likewise softens the criticism that the essay lacks a
personal voice with a compliment about the clear description of the Kalasha, an ethnic group in Pakistan.
The advice for the author to voice her own opinion is a revision-oriented comment. Indeed, as
Table 2 shows, 57 of the 93 negative comments in the present data are revision-oriented. The
following example shows how S8 is advised to revise the thesis statement in concrete terms.
The frequent use of “I” may signal a sincere and personal involvement of the peer reviewer:
Peer comment for S8: Your introduction is interesting as it says unexpected thing. But the
evidence of your [thesis] statement is too vague, I felt. What is “globalization”? I think
globalization has a lot of meanings, so I want you to explain more detail.
The balanced responses with positive and negative comments and specific suggestions for
revision, as the present data show, suggest how online peer feedback can itself be a good
writing practice (e.g., Jordan-Henley & Maid, 1995; Mabrito, 1991; Strenski et al., 2005).
Instead of following the checklist to write in point form, most of the present reviewers opted to
offer a narrative account of their personal views on the paper. Writing and organizing a coherent and
balanced response is certainly a meaningful communicative writing practice. The narrative tone
and frequent use of personal pronouns found in the present feedback seem to support Joseph
B. Walther’s (1996) observation of a potential personal level that e-feedback can achieve. The
engagement of the peer reviewers is also reflected in their attention to the audience’s needs
by offering specific suggestions on how to revise the essay. Following previous researchers
(e.g., Breuch & Racine, 2000; Honeycutt, 2001; Strenski et al., 2005; Tuzi, 2004), we suggest
that the atmosphere created by the online peer response activity retained the advantage of
traditional written feedback by contributing to the sensitivity of the student writers to the
needs of the audience. However, since we did not record students’ experiences during the two
face-to-face peer feedback exercises conducted before the online version, we were not sure
whether it was the feedback task itself that had encouraged learners to be sensitive to audience
needs, regardless of how the feedback was delivered.

3.2. Follow-up revisions and appreciation of peer feedback

Comparisons of the initial and revised drafts show that 13 out of 22 student authors revised
their papers (Table 3). Of the 13 students who revised their papers, 4 (S8, S12, S13, and S17)
made major revisions (over 500 words of deletions and additions) whereas 3 (S14, S20, and
S21) made minor revisions (under 24 words of deletions and additions). Except for 3 students
(S3, S15, and S21) who made only self-generated revisions, the other 10 all made revisions
based on peer comments. As a group, the 13 students who revised their papers received a total
of 60 negative peer comments of which 27 were addressed.
Of the 27 comments that student authors addressed, 22 were revision-oriented or with spe-
cific suggestions for revision (Table 3). Some of these comments required only local additions
or deletions and were easy to act upon. For example, S22 followed the peer advice and added
the word “always” (underlined in the revised text) to soften the tone of her statement:
Peer comments for S22: [Y]our opinions are too strong. EG) 1st paragraph last sentence
“. . .actually stereotype is not negative words.” I think it should be “stereotype is not always. . .”
Original text of S22: However I think people cannot help having stereotype and actually
stereotype is not negative words.
Revised text of S22: However, I think stereotype is not always negative word.
The revised text certainly makes the content more accurate and the opinion easier for readers
to accept, though an article is still missing before “negative word.” By contrast, other
comments required structural or major revisions at the paragraph level. S12, for example,
rewrote the concluding paragraph by adding new information (relevant text underlined) and
rewriting sentences in response to advice from two peers to voice and stress her own opinion:
Peer comment 1 for S12: I think you should add your opinion more in conclusion.
Peer comment 2 for S12: It’s better you tell more your opinion about nonverbal communication
problem in conclusion.
Original text of S12: In conclusion, nonverbal expressions have relation to history and value,
and sometimes may be barrier of intercultural communication. Therefore, when we come in
contact with different culture, we have to keep in mind that different country has different
nonverbal expressions.
Revised text of S12: In conclusion, there are a lot of different nonverbal expressions, because
each country made their own expressions according their own historical need and cultural value.
As I mentioned, bow is connected with Japanese history and cultural value, and handshake is
Table 3
Student authors’ revisions based on peer feedback
Student Total Revisions based on negative comments Self-generated revisions Total words of
authors negative Comments not Comments Words of Words of Words of Words of deletions and
comments addressed addressed deletions additions deletions additions additions
S3 2 1 coherence
0 0 102 110 212
1 content*

S15 4 1 support
1 unity*
0 0 26 81 107
1 content*
1 grammar
S21 7 1 thesis statement*
1 introduction*
1 conclusion*
0 0 7 17 24
2 grammar*
1 content*
1 general
S4 4 1 thesis statement* 1 content*
21 54 53 104 232
1 grammar 1 conclusion

S7 4 1 introduction*
232 408 34 55 729
3 unity*
S8 3 1 support* 1 support*
148 82 116 164 510
1 content

S12 7 1 unity* 2 conclusion*

1 coherence 1 editing* 1 116 140 362 619
1 grammar* 1 grammar
S13 6 1 support 1 support*
1 support* 1 topic sentence
69 225 135 72 501
1 content*
1 grammar*
S14 6 1 introduction 1 introduction*
1 thesis statement
11 0 0 0 11
1 coherence
1 conclusion
1 grammar

M. Guardado, L. Shi / Computers and Composition 24 (2007) 443–461

S17 4 1 thesis statement* 1 topic sentence*
115 120 51 61 347
1 unity* 1 unity*
S18 5 2 grammar* 3 grammar* 7 5 11 91 114
S20 1 1 thesis statement* 1 9 0 11 21

S22 7 1 grammar* 1 thesis statement*

1 thesis statement
52 209 1 3 265
2 grammar*
1 content*
1 unity*

Total 60 12/21* 5/22* 657 1228 676 1131 3692

Comments with specific suggestions for revision are marked with “*”. The numbers before the types of comments indicate frequencies.

456 M. Guardado, L. Shi / Computers and Composition 24 (2007) 443–461

connected with American history and cultural value. The meanings of nonverbal expressions
are not always common in the world, so nonverbal expressions may be a barrier in cross-
cultural communication. I can say so from episode of bow. Therefore, when we come in con-
tact with different countries that have different history and value, we have to think that they
may have different meanings of nonverbal expressions.

In adding her own opinion, S12 uses the personal pronoun, “I,” twice. The added text not
only summarizes the main argument but also highlights the thesis, which states that different
nonverbal expressions are related to the history and cultural values of the people who use
those nonverbal expressions. Furthermore, the addition also leads to rewriting all the other
sentences in the original text. Indeed, students’ self-generated revisions of certain texts can
be indirectly related to their responses to peer comments on other parts of the paper under
review. For example, although S13 did not receive any negative comments or suggestions to
revise the introduction of her essay, she wrote a new introductory paragraph as a result of her
revisions of a topic sentence and supporting examples based on peer feedback. Like the L2
students in Tuzi's (2004) study, who made macro-level revisions following e-feedback, the
present ESL students added and revised large blocks of text as a result of online peer feedback.

The positive experience of the present students with peer feedback was supported by their
interview comments. Many (17 out of 22) stated that it was a useful activity. Commenting on
the advantages of peer feedback in general rather than those specifically related to e-feedback,
some students said it was helpful to see different reader reactions to their own writing (S1,
S2, S13, and S17) and that providing written feedback for others made them more conscious
about writing (S2, S11, S12, and S21). Fifteen students also said they enjoyed the anonymity
of providing written feedback. As anonymous reviewers, participants felt that they were not
“biased” (S8), “could be honest” with their comments (S10 and S13), and did not have to
“feel bad to criticize” their friends (S21). As student authors, participants said they did not feel
“uncomfortable” (S8 and S15), “discouraged” (S21) or “upset” (S17) when receiving critical
comments since they did not know who sent them. Most students seemed to like the direct
and honest comments from anonymous reviewers. As S8 put it, “my reviewers were a kind of
persons who say directly so it’s good for me.”

3.3. Choice to ignore peer comments and negative feelings about online feedback

Comparisons of the initial and revised drafts also show that 9 of the 22 students ignored peer
comments and made no revisions; those who did revise their papers chose to follow only
some of the peer comments that called for revision (27 out of 60; see Table 3). Our analysis of
the interview data reveals how students' language and cultural backgrounds contributed to these
ESL participants' choice to ignore peer comments or to their uncertainty about peer commenting.
S11, for example, said he did not give much negative or critical feedback because “it [was]
not good to say your opinion [directly] as a Japanese.” If S11 lacked confidence in evaluat-
ing his peers’ writing critically because of his Japanese cultural background, others (e.g., S3,
S5, S10, S11, and S21) hesitated because they were concerned about their limited language
proficiency as non-native English speakers. For example, S21 said she found reading and
commenting online difficult and, therefore, was not comfortable critiquing others' writing.
In her words:
S21: [I gave] just positive comment and my feeling. I don’t think it’s good way to grade for
the writer, especially when I read online. I can’t understand well. I feel my feedback and
understanding decrease. I can’t feedback well to them so just give them positive comments.

The uncertainty expressed by S21 about her English ability mirrors the uncertainty of those
who received such comments. In fact, of the 11 students who received negative comments on
grammar, only 3 made relevant corrections (see Table 3). The effect of decisions like S21's,
to give positive rather than critical feedback, surfaced in complaints such as the following:
S22: I think [some students] didn’t read [my] paper so much [carefully]. . . . their opinions are
just “good” but they did not give me specific reasons. If I can get the reason more deeply, I can
understand and I can make it better. Just [saying] “good” is not important.

While complaining about the quality of online feedback, eight students said they missed the
immediate interaction in face-to-face modes and wished they could interact online with the
reviewers to clarify some of the comments. For example, S3 said that online peer feedback had
“no communication [but] just participation” compared with face-to-face peer response groups
where students would ask for clarification when in doubt. Although the bulletin board allowed
students to interact, none of the participants took the opportunity. This lack of negotiated
interaction between authors and reviewers further explains how a one-way communication
process may have created misunderstandings and left relevant comments unaddressed.
Some students explained that they did not interact online because typing on a computer took
more time and energy than face-to-face interaction did. S6 spoke for many when she said,
“When I check my paper online, I didn’t know how to type [revise] and puzzled at those
comments.” Several other students (S5, S14, S16, and S17), in contrast to those who enjoyed
anonymity, found it uncomfortable to interact with an anonymous peer. As S17 explained, “If
I receive some feedback from online without names, I won’t ask anything exactly.” Similarly,
S16 said, “We don’t know the names so we can’t ask their opinions.” Without interactions,
some students found online feedback confusing and wondered if they had misunderstood the
comments (S6 and S13).

4. Conclusion

The present study explores 22 ESL students’ experiences of online peer feedback based on
analyses of the e-feedback they received, revisions they made as a result, and experiences they
perceived. The present findings suggest that e-feedback, like traditional written feedback, offers
a text-only environment that pushes students to write balanced comments with an awareness of
the audience's needs. Our analyses also uncover students' mixed feelings, with some appreciating
and others disliking the online experience. Many of the present students, like the L2 students in
previous research (e.g., DiGiovanni & Nagaswami, 2001; Sullivan & Pratt, 1996), embraced
anonymity as a chance to review their peers' writing critically, which they might not do in a
face-to-face context, though a few wished they knew who had reviewed their papers.

As non-native English speakers with limited English proficiency and a Japanese cultural
background, many participants expressed little confidence in peer commenting. Some chose to
provide positive feedback or sugarcoat their negative feedback as reviewers or found peer feed-
back useless and, therefore, did not follow it in their own revisions as authors. Many also found
textual exchanges more challenging than face-to-face interaction and shied away from writing
back to reviewers to clarify and negotiate meaning. The lack of interaction turned the online
peer feedback in the present study into a one-way communication process, leaving a good
portion of peer comments unaddressed and, thus, opportunities missed. Whereas previous research
suggests that providing e-feedback pushes students to become better writers (e.g., Jordan-Henley &
Maid, 1995; Mabrito, 1991; Strenski et al., 2005), the present findings illustrate how the demand
to participate in online textual exchanges could scare some ESL students away from interaction
and negotiation. The study highlights the special needs of ESL writers. Compared with their
first-language counterparts, ESL students are still in the process of acquiring the syntactic and
lexical competence of English writing. At the same time, they are learning the values associated
with the nature and functions of English writing that might differ from what they learned in their
first-language education. As the present data illustrate, ESL students are challenged not only
linguistically but also by the demand to review each other's writing critically, a writing
function or process that might intimidate students from a culture that discourages critiquing
one's peers.

Our study suggests that online peer feedback is not a simple alternative to face-to-
face feedback and needs to be organized carefully to maximize its positive effect. First
of all, training students how to provide feedback is important. Although the present study
did not focus on the training effect, the participants did have two sessions of practice
using a checklist and face-to-face peer feedback before they tried it online. Judging from
the students' experiences, this training was far from sufficient. Students should be trained
and encouraged to interact with anonymous reviewers just as professional writers do. The
present study implies a need for care in how the peer review activity is presented to ESL
students, addressing the level of their English proficiency and bearing in mind the particular
challenges their cultural values might pose during the feedback process.

Second, although anonymity helped ESL students voice their opinions, it could also discourage
interaction. Follow-up class discussions after students have received e-feedback could be
organized to have students discuss and clarify problematic comments together. By
keeping the reviewers anonymous and, at the same time, providing a chance for face-to-face
interaction among students, the discussion could help authors clarify the comments and review-
ers understand the confusion. Following previous researchers (DiGiovanni & Nagaswami,
2001; Tannacito & Tuzi, 2002), we believe that printouts of some of the reviews could help
focus the discussions.
Finally, we suggest that the instructor should join such follow-up discussions. With the
guidance of the teacher, whom many ESL students respect as an authority on good writing
and as the person who gives final marks to the papers under review, the process would ease
ESL students' concerns about the quality of peer feedback and build their confidence in
following the feedback effectively in their revisions. With the intervention of the
teacher’s guidance and the negotiated interactions between authors and anonymous reviewers,
a combination of online and face-to-face peer feedback can contribute to improving students’
competence as reviewers and writers.

Appendix A. Essay checklist

1. Introduction: Does it grab the reader’s attention? Does it set the tone of the essay?
2. Thesis statement: Does the thesis statement name the topic, show the writer’s position or
feelings on the subject, and set out the main points of the essay? In naming these points,
has the writer been careful to maintain parallelism?
3. Support: Has the writer supported all generalizations with concrete details and examples?
4. Topic sentences: Is each topic sentence followed by a series of other sentences that develop
the main point through a combination of examples, description, details, facts, or anecdotes
that directly relate to the topic sentence? Has the writer carefully examined each paragraph
to be sure that no sentences are included which do not support the topic sentence of the
paragraph?
5. Unity/paragraph development: Does each body paragraph have a topic sentence that cor-
responds to one of the points in the thesis statement?
6. Coherence: Has the writer used transition words and phrases to facilitate a smooth and
logical progression from one sentence or paragraph to the next?
7. Content: Is the essay significant and meaningful—a thoughtful, interesting, and informative
presentation of relevant facts, opinions, or ideas?
8. Conclusion: Does the conclusion summarize and reaffirm the thesis? Does it leave the
reader with a distinct sense of closure?

Appendix B. Interview guide

1. Did you feel this was a useful activity?

2. Do you prefer to give and receive peer feedback anonymously? Why or why not?
3. How did you give feedback online?
4. How did online feedback help you with revision?
5. What type of peer feedback activity is better, face-to-face or online?
6. What challenges did you face in online peer review? Please compare your experiences of
online with face-to-face peer feedback.

Martin Guardado is a Ph.D. candidate in the Department of Language and Literacy Educa-
tion at the University of British Columbia. His research interests include second-language
socialization in home, school, and community settings and second-language writing. He
has published in The Canadian Modern Language Review, TESOL Quarterly, Canadian
Journal of Applied Linguistics, International Journal of Learning, and Canadian Ethnic Studies.

Ling Shi is an associate professor in the Department of Language and Literacy Education
at the University of British Columbia. Her research has been published in journals such
as TESOL Quarterly, Written Communication, Journal of Second Language Writing,
Language Testing, English for Specific Purposes, and Journal of English for Academic Purposes.


References

Braine, George. (2001). A study of English as a foreign language (EFL) writers on a local-area network (LAN)
and in traditional classes. Computers and Composition, 18, 275–292.
Braine, George. (2004). Teaching second and foreign language writing on local area networks (LANs). In Sandra
Fotos & Charles M. Browne (Eds.), New perspectives on CALL for second language classrooms (pp. 93–108).
NJ: Lawrence Erlbaum.
Breuch, Lee-Ann M. Kastman, & Racine, Sam J. (2000). Developing sound tutor training for online writing centers:
Creating productive peer reviewers. Computers and Composition, 17, 245–263.
Bump, Jerome. (1990). Radical changes in class discussion using networking computers. Computers and the
Humanities, 24, 49–65.
Clark, Herbert H., & Brennan, Susan E. (1991). Grounding in communication. In Lauren B. Resnick, John M.
Levine, & Stephanie D. Teasley (Eds.), Perspectives on socially shared cognition (pp. 127–149). Washington,
DC: American Psychological Association.
DiGiovanni, Elaine, & Nagaswami, Girija. (2001). Online peer review: An alternative to face-to-face? ELT Journal,
53, 263–272.
Hewett, Beth L. (2000). Characteristics of interactive oral and computer-mediated peer group talk and its influence
on revision. Computers and Composition, 17, 265–288.
Honeycutt, Lee. (2001). Comparing e-mail and synchronous conferencing in online peer response. Written Com-
munication, 18, 26–60.
Jones, Rodney H., Garralda, Angel, Li, David C. S., & Lock, Graham. (2006). Interactional dynamics in online
and face-to-face peer-tutoring sessions for second language writers. Journal of Second Language Writing, 15,
Jordan-Henley, Jennifer, & Maid, Barry M. (1995). Tutoring in cyberspace: Student impact and college/university
collaboration. Computers and Composition, 12, 211–218.
Kemp, Fred. (2003). The importance of peer interactivity in writing instruction. Retrieved July 16, 2006, from
Liu, Jun, & Hansen, Jette G. (2002). Peer response in second language writing classrooms. Ann Arbor: The
University of Michigan Press.
Liu, Jun, & Sadler, Randall W. (2003). The effect and affect of peer review in electronic versus traditional modes
on L2 writing. Journal of English for Academic Purposes, 2, 193–227.
Mabrito, Mark. (1991). Electronic mail as a vehicle for peer response: Conversations of high- and low-apprehensive
writers. Written Communication, 8, 509–532.
Matsumura, Soichi, & Hann, George. (2004). Computer anxiety and students’ preferred feedback methods in EFL
writing. Modern Language Journal, 88, 403–415.
Rilling, Sarah. (2005). The development of an ESL OWL, or learning how to tutor writing online. Computers and
Composition, 22, 357–374.
Sirc, Geoffrey, & Reynolds, Thomas. (1990). The face of collaboration in the networked writing classroom.
Computers and Composition, 7, 53–70.
Strenski, Ellen, Feagin, Caley O’Dwyer, & Singer, Jonathan A. (2005). Email small group peer review revisited.
Computers and Composition, 22, 191–208.
M. Guardado, L. Shi / Computers and Composition 24 (2007) 443–461 461

Sullivan, Nancy, & Pratt, Ellen. (1996). A comparative study of two ESL writing environments: A computer-assisted
classroom and a traditional oral classroom. System, 24, 491–501.
Susser, Bernard. (1993). Networks and project work: Alternative pedagogies for writing with computers. Computers
and Composition, 10, 63–89.
Tannacito, Terry, & Tuzi, Frank. (2002). A comparison of e-response: Two experiences, one conclusion. Kairos, 7(3).
Retrieved July 25, 2006, from
Tuzi, Frank. (2004). The impact of e-feedback on the revisions of L2 writers in an academic writing course.
Computers and Composition, 21, 217–235.
Van der Geest, Thea, & Remmers, Tim. (1994). The computer as means of communication for peer-review groups.
Computers and Composition, 11, 237–250.
Walther, Joseph B. (1996). Computer-mediated communication: Impersonal, interpersonal, and hyperpersonal
interaction. Communication Research, 23, 3–43.