Sie sind auf Seite 1von 10

A CONTENT ANALYSIS OF CRITICAL

THINKING SKILLS AS AN INDICATOR OF


QUALITY OF ONLINE DISCUSSION IN
VIRTUAL LEARNING COMMUNITIES
Leah E. Wickersham
Texas A&M UniversityCommerce

Kim E. Dooley
Texas A&M University

Online discussion is a common tool to create learner-learner interaction. Whole class discussions can result in
potentially hundreds of postings with students spending more time creating the illusion of participation as
opposed to critical reflection and deeper learning. The purpose of this study was to determine quality of online
discussion based on critical thinking constructs when learners were placed in smaller learning communities
and not exposed to whole class discussion. The researchers sought to determine if discrepancies among the
groups would exist and if students would receive the full benefit of learner-learner interaction by placing them
in smaller groups.

INTRODUCTION
Online courses are not so dissimilar from
their face-to-face counterparts, with interactive discussion playing an integral role in the
teaching and learning process. Several advantages to the asynchronous approach not
present in the synchronous environment
involve providing all students the ability to
interact and participate in the discussion, to

learn at their own pace, and to have more


time to reflect and respond within the
expanded timeframe.
New challenges, however, have emerged
as a result of moving the discussion from the
traditional to the virtual environment. More
responsibility is placed on the student to selfengage in this process and the instructor is
faced with the task of analyzing quality of
participation and measuring student learning.

Leah E. Wickersham, Texas A&M UniversityCommerce, PO Box 3011, Department of Secondary and Higher Education, Commerce, TX 75429. Telephone: (903) 468-3248. E-mail: leah_wickersham@tamu-commerce.edu
The Quarterly Review of Distance Education, Volume 7(2), 2006, pp. 185193
ISSN 1528-3518
Copyright 2006 Information Age Publishing, Inc.
All rights of reproduction in any form reserved.

186

The Quarterly Review of Distance Education

Moores editorial in the American Journal


of Distance Education (1989) identified three
types of interaction in distance education:
learner-content,
learner-instructor,
and
learner-learner. Interaction between learner
and content implies that construction of
knowledge occurs when the learner interacts
with the course content and changes in ones
understanding occurs when the new knowledge is combined with preexisting knowledge.
Interaction between the learner and instructor
reinforces the learner-content interaction using
engagement and dialog exchange to promote
the learning process with explanation, discussion, examples, and/or application activities.
Interaction between learner and learner is
essential in distance education if participation
in class discussions is to take place. This interaction can happen one-on-one or within a
group setting, depending on the design of the
course.
Northrup (2002) studied online learners
preferences for interaction or engagement in
learning. She found four variables that serve an
indicators of interaction: (1) content interaction including the structure, pacing, and use of
various delivery technologies and interactive
strategies; (2) conversation and collaboration
through peer interaction and participation in a
learning community; (3) intrapersonal/metacognitive interaction through self-monitoring
and providing cognitive strategies like notetaking guides and advanced organizers; and (4)
support through mentoring, tutorials, and
timely correspondence with the instructor.
One way to promote interaction and collaboration is through online discussions. However, determining the quality of discussion and
amount of participation of students in a course
can be cumbersome to measure. For example,
a course with an enrollment of 30 students and
a requirement to read and respond to all postings can be viewed as busy work with very
little meaningful discussion and learning taking place. It is often confusing and time consuming to sift through what could result in
hundreds of postings in an online discussion.
More time and effort is spent on creating an

Vol. 7, No. 2, 2006

illusion of participation on the part of the student by the number of one to two sentence
postings to many discussion threads rather
than an in-depth, meaningful discussion
among a few, resulting in a failure to achieve
what the instructor had intendedthoughtful
reflection and meaningful discussions.
If a students grades are partially determined by their participation in online discussions, instructors face another challenge in
determining how to assess the quality of discussion and if in fact learning is taking place.
But what tools are available to determine quality in online discussions and whether higherorder thinking skills are being developed?

THEORETICAL FRAMEWORK
Virtual learning communities promote learning when instruction, social interaction, and
technology activities are present (Tu & Corry,
2002). Research has shown that online discussion helps students understand course objectives, provides real world applications, and
promotes interaction (Edelstein & Edwards,
2003; Palloff & Pratt, 1999; Simonson, Smaldino, Albright, & Zvacek, 2003). It is assumed
that the facilitator or course instructor will
consider how much time (or number of postings) a student needs to participate, but determining how the discussions contribute to the
achievement of the course objectives, how
well the student is performing in the course,
and how much learning is happening as a result
of the discussion is another story. Edelstein
and Edwards (2003) created an assessment
rubric for student participation in threaded discussions based on five constructs: (1) promptness and initiative, demonstrating selfmotivation and consistent engagement in the
course content; (2) attention to detail in the
delivery of the post (such as grammar and
spelling); (3) relevance of the post in relation
to the course topic and objectives; (4) how well
opinions and ideas are expressed within the
post; and (5) contribution to the learning community. Although this rubric is useful in deter-

A Content Analysis of Critical Thinking Skills

mining participation beyond just counting


postings, it does not provide an indication of
the depth of understanding of the content in
terms of metacognition, problem solving, and
critical thinking.
Several researchers have used various models to measure intellectual development and
critical thinking within online discussions
(Marra, 2002; Marra, Moore, & Klimczak,
2004; Newman, Webb, & Cochrane, 1996;
Visser, Visser, & Schlosser, 2003). Marra
(2002) suggests that as complex understandings develop, learners are able to see knowledge as being defined and shaped by the
context in which it must be applied (p. 16).
Marra provides a rubric for determining if concepts discussed using online conversation tools
are descriptive of the content domain, if they
are embedded and interconnected, and if links
are descriptive and efficient (2002). She posits
that effective online learning environment
should scaffold and support complex intellectual development.
The model used by Newman et al. was
based on Garrisons five stages of critical
thinking (1992) and Henris cognitive skills
needed in computer mediated communication
(1992). Newman et al. (1996) cite Mason
(1992) regarding how instructors rely on
counting messages and logons to determine
participation in threaded discussions with little
thought of what constitutes good work or the
quality of student learning. These authors support the use of collaborative learning, critical
thinking, and deep understanding of course
material based on content analysis of the written narrative of the online discussion as
another assessment tool.
If we believe that deep learning is promoted
by active engagement and that cognitive skills
are developed in a social context (Lipman,
1991; Resnick, Levine, & Teasley, 1991; Tu &
Corry, 2002), then threaded discussions within
a virtual learning community could promote
deep learning. Right? But, how do we measure
deep learning or critical thinking within virtual
communities? Henri (1992) suggests five
dimensions: (1) participative; (2) social;

187

(3) interactive; (4) cognitive; and (5) metacognitive. For the purposes of this study, the
researchers chose to focus on the cognitive
dimensions only because they were interested
in measuring learning, not only participation
(self-direction) and social functions within a
virtual learning community.
Garrison (1992) provided a five-stage model
to measure critical thinking skills: (1) problem
identification; (2) problem definition; (3) problem exploration; (4) problem evaluation/applicability; and (5) problem integration. Newman,
Webb, and Cochrane (1996) eloquently combined these two models with Masons (1992)
suggestions based on the educational value
exhibited within online discussion: Do the
learners build on previous messages? Do they
draw on their own experience? Do they refer to
course materials? Do they refer to relevant
material outside the course? Do they initiate
new ideas? The resulting model codes provide
indicators of critical and uncritical thinking in
10 areas: (1) relevance; (2) importance; (3) novelty; (4) outside knowledge or experience being
brought to bear on the problem; (5) ambiguities
clarified or confused; (6) linking ideas; (7) justification; (8) critical assessment; (9) practical
utility (grounding); and (10) width of understanding. This model served as the theoretical
framework for this study. Previous studies have
used the content analysis method to measure
critical thinking in face-to-face and computersupported group learning for whole class
instruction, but they have not examined the
impact of smaller learning communities within
a course. Would there be differences in deep
learning or critical thinking if students are only
exposed to their own learning team rather than
the breadth of discussion and perspectives from
the entire class?

PURPOSE AND RESEARCH


OBJECTIVES
The purpose of this study was to determine if
the critical thinking skills model posed by
Newman et al. (1996) could be used to indicate

188

The Quarterly Review of Distance Education

quality of online discussion when the learners


are placed in smaller learning communities
and not exposed to whole-class discussion.
Would there be discrepancies among the
groups? Would one team have a great amount
of critical thinking within their discussion
while one team only scratched the surface?
Would students not receive the full benefit of
learner-to-learner interaction if placed in a
smaller group, or would they have greater intimacy and deeper conversations as a result of
being in a smaller group?

METHODS
For this study, the primary source of data was
narrative. Therefore, acceptable qualitative
research standards drove the methods (Lincoln
& Guba, 1985). There were 30 respondents
within 6 virtual learning communities (5 learners in each community). Each learner was
given a number based on the order they
responded, 1-5A for virtual team A; 1-5B for
virtual team B, and so forth. Research procedures included a review of all 11 discussion
forums for the semester, which resulted in a
substantial amount of data. It was determined
that all of the discussion topics had consistency in length and quality of postings; therefore, one discussion forum was selected for
content analysis. The topic of this discussion
was the strengths and challenges of learnercentered instruction. The graduate students
enrolled in this course read a chapter from their
text and additional research articles on learnercentered instruction prior to engaging in the
discussion. The instructions were that each
learner was required to submit an original
posting and reply to at least one other virtual
team member. The instructor made it clear that
the basis for grading was quality, not quantity
of posting, but the learners were not given the
framework for analysis.
The critical thinking skills model (Newman
et al., 1996) was used as the basis for analysis.
The researchers created a color-coding system
with highlighters to identify the 10 major cate-

Vol. 7, No. 2, 2006

gories within the model. The researchers used


a read aloud protocol with consensus-building measures. An audit trail was kept to verify
the data sources for each of the critical thinking categories by color and number for each
respondent within a virtual team. In order for
transferability to occur, the researchers chose
exemplary discussions for each of the ten indicators to provide thick description (Geertz,
1973).

FINDINGS
The findings are presented with exemplary discussion examples for each critical-thinking
category across all the virtual learning communities. An audit trail with respondent codes
serves as a trustworthiness measure of the
presence of critical thinking across the 10 categories within the theoretical model.
For relevance, the researchers were looking
for relevant or irrelevant statements and/or
diversions from the topic. An example of a relevant statement found in virtual community A
is:
Learner centered instruction refers to
actively involving students in the planning,
implementation, and self-evaluation process of their education. Students who are
involved in their own learning relate better
to the material and process information
according to their own learning type. An
increase in desire to learn and problem
solving skills are the product. (4A)

The second model code is importance. The


researchers were looking for important point/
issues or whether the learners used unimportant, trivial points/issues for this category. One
respondent noted:
Instruction will require teachers to prepare
in advance, acquire the needed resources,
and effectively monitor and provide feedback to the students. Also, the classroom
can become noisy and perhaps somewhat
disorderly during certain times of the day
since many different activities could be
occurring at the same time. (5B)

A Content Analysis of Critical Thinking Skills

189

The third category is novelty or new information, ideas, or solutions. This category
includes putting forth within the discussion
new problem-related information, new ideas
for discussion, new solutions to problems, welcoming new ideas, and bringing new items in
the discussion. On the contrary, if a respondent
repeats what has been said, provides false or
trivial leads, accepts the first solution offered,
or has to be dragged into the discussion by the
instructor, this would be a negative indicator of
novelty. A good example is 1F, who commented, There are teachers and professors
who cling to teacher-oriented approaches
(such as lecture) because of fear. The fear is
also generated by pressure, failure, and laziness.
The next category is bringing outside
knowledge or experience to bear on the problem. A positive indicator would be drawing on
personal experience, referring to course materials, using relevant outside material or previous knowledge, and incorporating courserelated problems brought in from lectures,
texts, and other materials. If learners rely only
on their preconceived notions and assumptions, then this is a negative indicator. To illustrate, 3B discussed his experiences abroad:
I have spent time in several foreign countries and I have lived abroad for two years.
In most places that I have traveled I made a
conscious effort to learn the language of the
people and culture whom I am visiting. I do
this for two reasons. First, I LOVE to learn.
Secondly, the native people of the country
that I am visiting always seem to appreciate
my effort and this results in a very friendly
rapport for my time in the country. But
from this I learned that discovery learning,
or learning in a personal context has been
more effective to me picking up the language as opposed to reading a book to
studying vocabulary flash cards.

Another
insights:

respondent

provided

Im faced with students that without constant direction they would be totally lost
and find it hard to keep up. I try to build the

self-confidence and esteem of my students


at an early age and sometimes when faced
with students that dominate the lessons
their egos could be affected. I feel that the
best of both worlds, traditional and a
learner-centered environment is the best
way to go. (5A)

The fifth category is ambiguitieswhether


the respondent had clear and unambiguous
statements or confused statements. Only four
responses (2A, 3D, 3F, 5F), were ambiguous
within this topic. A negative indicator of ambiguity is:
I think that the learner needs to define what
causes learning to happen. When they
understand what learning is being sought
then they will find all the different methods
of learning. Therefore, this is why we need
to help students recognize these abilities.
(2A)

The next model code is linking ideas, implying that the learner links facts, ideas and
notions and generates new data from information collected. If the learner repeats information without making inferences or interpretation or states that he or she shares the idea or
opinion without taking these further, it is a negative indicator. After a statement made by 1D
in an area of importance, the respondent linked
by adding:
There is so much more information readily
available to students through the Internet.
The teacher is no longer the source of
knowledge. Students can acquire information and act on that information themselves. This acquisition of information is a
strength of leaner-centered instruction. By
using the Internet, or other technologies,
students learn to think critically about what
information isis it opinion, fact, etc.

these

The next category is justification. A respondent who uses critical thinking within a post
should provide proof or examples to justify his
or her solutions or judgments. This includes
discussing advantages and disadvantages of
solutions posed. If the learner uses irrelevant
or obscure questions or examples or offers

190

The Quarterly Review of Distance Education

judgments without explanations, this is a negative indicator of justification. In discussing


strengths about learner-centered instruction,
2C justified by stating:
In a learner centered classroom the teacher
is working along side the students in seeking information that the student wants and
needs to learn. This promotes a positive
student teacher relationship where the student can feel unthreatened and able to
approach the teacher for information. This
aspect of the learner centered environment
can put a lot of strain on a teacher because
there may be several students and one
teacher. The teacher is left trying to meet
the demands of many students, and the students may feel that their needs are not
being met.

The eighth category to determine quality of


critical thinking is critical assessment. When a
learner poses an idea, do they critically assess
or evaluate their idea or the ideas of others or
do they simply accept what is being said or
unreasonably reject without additional input?
To illustrate this point, one respondent in virtual team F disagreed with another team members post.

Vol. 7, No. 2, 2006

This other edge of the sword though is their


current ill-preparedness to adapt to the
learner-centered approach. This past spring
semester I moved from a pure lecture
approach to a lecture with PP[T] slides (and
handouts) and study guides to work. My
goal was to engage my students more in
their own learning. I placed a grade component (10% of the final grade) on attendance
and study guide completion. I did in fact
see the grades improve over those of past
semesters. However, after the second test,
when asked by what turned out to be a
mediocre student why the test was so hard,
I pointed out that several in other classes
had received great grades. She responded
Oh I know one of them but she studies all
the time. That statement alone points out,
at least at the higher education level, one of
the greatest challengesthat of providing
new tools to the student who is ill prepared
or undisciplined to use the tools. (4D)

The last model code is width of understanding. Learners who widen the discussion within
a broader perspective or provides intervention
strategies within a wider framework get the big
picture. If they narrow the discussion or
address only fragments of a situation, they are
not contributing width of understanding.
Respondent 2E, in discussing variables to consider in learner-centered instruction, added:

I disagree with the statement, giving up


control of the classroom, as some opposition may think. With society and technology on the rise, this allows students more
control and independency of their education. Teachers are implementers and can
enhance and even add to previous knowledge. I use this statement for older students
that have already learned the basics. (3F)

Since it hasnt ever worked the way we


thought it would, perhaps we need a different approach. Starting at the top level, the
teacher, and go down from there. Im wondering if the student is much of a variable at
all here. I guess thats why the learner is in
the center of the learner-centered environment. Hmm.

The next category is practical utility


(grounding). If the respondent relates possible
solutions to familiar situations and discusses
practical utility of new ideas, then it is a positive indicator. If he or she discusses ideas in a
vacuum or suggests impractical solutions, then
it is negative. Because most respondents were
teachers and educators, many included practical examples, as indicated in Table 1; however, the following is an example of practical
utility:

For the 10 categories within the critical


thinking skills indicators audit trail codes were
kept for each team (A-F) with five respondents
in each team. Table 1 provides a visual snapshot of the presence of the critical thinking categories within each group.
The researchers noticed some postings fully
integrating most or all of the components of
the critical thinking model. This integration
cannot be expected in every post, but does
highlight the cognitive complexity of individu-

A Content Analysis of Critical Thinking Skills

191

TABLE 1
Audit Trail Codes of Critical Thinking Skill Indicators by Respondent Within Virtual Learning Communities
Respondents by Team
Categories

Relevance

1, 2, 3, 4, 5

1, 2, 3, 4, 5

1, 2, 3, 4, 5

1, 2, 3, 4, 5

1, 2, 3, 4, 5

1, 2, 3, 4, 5

Importance
Novelty

1, 3, 4, 5
1, 2, 3, 4, 5

2, 3, 5
1, 2, 3, 4, 5

1, 2, 4
1, 2, 3, 4, 5

2, 4
1, 2, 4, 3, 5

2, 5
1, 3, 4, 5, 2

1, 3
1, 3, 2, 4, 5

Outside knowledge
Ambiguities

3, 4, 1, 5
2

1, 3, 4, 5

2, 3, 4

1, 2, 3, 4, 5
3

2, 4, 1, 5

1, 3
3, 5

Linking
Justification

1, 3, 2, 5
3, 4, 2, 5

3, 1, 5, 4
1, 2, 3, 5

1, 3, 4, 2, 5
1, 3, 2, 4, 5

1, 2, 4, 3, 5
1, 3, 4, 2, 5

1, 2, 4, 5, 3
1, 2, 3, 4, 5

1, 2, 3, 4
1, 2, 3, 4, 5

1, 4, 5
5

2, 3, 1, 5
3, 1, 5, 4, 2

1, 2, 4
1, 3

2, 4, 1
4, 2

3, 2, 4, 5
1, 2, 5

1, 3, 4
1, 3, 4

1, 4, 2, 5

3, 5, 1, 2

2, 4

1, 2, 3, 4

2, 1

1, 4, 3

Critical assessment
Practical utility
Width of
understanding

als within each team. The following is an


example of integration of all components
within the critical thinking model:
Learner-centered instruction places the
focus or control of learning on individual
students. The teacher's role changes to
facilitator and often are considered a colearner. The student becomes an active participant in exploratory learning.
There are many strengths of learner-centered instruction. Students are active participants and take more responsibility in their
learning as they explore and discover different topics. Learner-centered instruction
allows for opportunities for student interaction and collaboration. This interaction can
occur with the teacher, other students, and
even other people not enrolled in the class.
Motivation can also increase as the student
is allowed selection of topics and strategies. However, the strongest advantage of
learner-centered instruction I see is the
development of lifelong learners. Students
are not only gaining subject matter knowledge, they also gain problem-solving skills
and collaboration skills.
There are also a few challenges of
learner-centered instruction. As stated in
the text, a single approach to all instruction
will not work. The first challenge involves
the need to change the philosophy or attitude of experienced teachers, administrators and some of the parents. Approaches
used in previous years need to be modified
to incorporate new technologies and meth-

ods of learning. Instruction will require


teachers to prepare in advance, acquire the
needed resources, and effectively monitor
and provide feedback to the students. Also,
the classroom can become noisy and perhaps somewhat disorderly during certain
times of the day since many different activities could be occurring at the same time.
As you can see there are many things to
consider when evaluating the use learnercentered instruction in the schools. I feel
that if this method promotes lifelong learners then it is definitely worth pursuing.
(5B)

CONCLUSIONS AND IMPLICATIONS


Based on the exemplary discussion examples
provided for each indicator in the critical
thinking skills model, a summary of findings
along with implications will be provided in
relation to the purpose and research questions
posed in this study. The researchers sought to
determine if the critical thinking skills model
(Newmann et al., 1996) could be used to indicate quality of discussion in an online forum
when learners were placed in smaller learning
communities as opposed to exposure to whole
class discussions.
Analysis of the discussions determined that
all individuals within each learning community did engage in relevant discussion and

192

The Quarterly Review of Distance Education

brought in new/novel ideas to their community. The majority of the respondents by team
did manage to incorporate several of the categories, but not all learners fully integrated the
10 components of the critical thinking skills
model within their discussion. However, each
virtual learning community had at least one
person whose discussions did fully integrate 910 of the categories. Critical assessment, providing practical utility, important statements,
and demonstrating width of understanding
were the four components within the model
where a few communities had only two-three
respondents that integrated these categories.
Adult learners bring in their prior experiences
and knowledge into the classroom and engage
in a discussion regarding issues relevant to
their situation. The implication exists that
learners with more experience impact the
learner-learner interaction within that virtual
learning community.
The researchers found that several new/
novel ideas within each group were convergent; however, divergent novel ideas were also
discovered among communities that drove the
conversations. Even with the introduction of
divergent ideas, the discussions remained relevant to the topic. Further research is needed to
determine convergent and divergent themes
that emerge within the virtual learning communities via a cross case analysis. Would there
be specific patterns of reflection and metacognition unique to individuals in the groups that
could shape the quality of the discussions?
Findings indicate that all groups engaged in
critical thinking within their virtual learning
communities and a high amount of interaction
occurred within each community; however,
further research is needed to determine if the
same level of critical thinking and interaction
would occur if students were exposed to and
expected to interact in a whole class discussion. Would students seek a few individuals
within the class and engage in a meaningful
discussion? Would they self select a smaller
community and continue throughout the
semester in the discussion forums with the

Vol. 7, No. 2, 2006

same individuals similar to the virtual learning


communities?
The critical thinking skills model provided
an excellent framework for content analysis of
discussion threads within the virtual learning
communities. As mentioned previously further
research is needed to assess quality of online
discussions. Although descriptive, these models are time consuming and instructors and
researchers in the field of educational technology need to continue to explore the utilization
of rubrics for assessment of critical thinking in
online discussions. The use of smaller virtual
learning communities for this study has been
shown to have equivalent critical thinking
capacity across teams. This will enable the
instructor to design online learning with the
assurance that deeper learning and critical
thinking will occur without students having to
read and reply to whole class discussions.

REFERENCES
Edelstein, S., & Edwards, J. (2003). If you build it,
they will come: Building learning communities
through threaded discussions. Online Journal of
Distance
Learning
Administration,
51.
Retrieved November 9, 2004, from http://
www.westga.edu?~distance/ojdla/spring51/
edelstein51.html
Garrison, D. R. (1992). Critical thinking and selfdirected learning in adult education: An analysis
of responsibility and control issues. Adult Education Quarterly, 42(3), 136-148.
Geertz, C. (Ed.). (1973). Thick description: Toward
an interpretive theory of culture. In The interpretation of cultures (pp. 5-30). New York: Basic
Books
Henri, F. (1992). Computer conferencing and content analysis. In A. R. Kaye (Ed.), Collaborative
learning through computer conferencing (pp.
117-136). Berlin, Germany: Springer-Verlag.
Lincoln, Y. S., & Guba, E. G. (1985). Naturalistic
inquiry. Newbury Park, CA: Sage.
Lipman, M. (1991). Thinking in education. Cambridge, MA: Cambridge University Press.
Marra, R. M. (2002). The ideal online learning environment for supporting epistemic development:
Putting the puzzle together. Quarterly Review of
Distance Education, 3(1), 15-31.

A Content Analysis of Critical Thinking Skills

Marra, R. M., Moore, J. L., & Klimczak, A. K.


(2004). Content analysis of online discussion
forums: A comparative analysis of protocols.
ETR&D, 52(2), 23-40.
Mason, R. (1992). Evaluation methodologies for
computer conferencing applications. In A. R.
Kay (Ed.), Collaborative learning through computer conferencing (pp. 105-116). Berlin, Germany: Springer-Verlag.
Moore, M. G. (1989). Editorial: Three types of
interaction. The American Journal of Distance
Education, 3(2), 1-6.
Newman, D. R., Webb, B., & Cochrane, C. (1996).
A content analysis method to measure critical
thinking in face-to-face and computer supported
group learning. Retrieved April 25, 2005, from
http://www.qub.ac.uk/agt/papers/methods/
contpap.html
Northrup, P. T. (2002). Online learners preferences
for interaction. Quarterly Review of Distance
Education, 3(2), 219-226.

193

Palloff, R. M., & Pratt, K. (1999). Building learning


communities in cyberspace: Effective strategies
for the online classroom. San Francisco: JosseyBass.
Resnick, L., Levine, J., & Teasley, S. (1991). Perspectives on socially shared cognition. Washington, DC: American Psychological Association.
Simonson, M., Smaldino, S., Albright, M., &
Zvacek, S. (2003). Teaching and learning at a
distance: Foundations of distance education
(2nd ed.). Upper Saddle River, NJ: Merrill Prentice Hall.
Tu, C., & Corry, M. (2002). eLearning communities. Quarterly Review of Distance Education,
3(2), 207-218.
Visser, L., Visser, Y. L., & Schlosser, C. (2003).
Critical thinking in distance education and traditional education. Quarterly Review of Distance
Education, 4(4), 401-407.

Das könnte Ihnen auch gefallen