Volume 4 Number 2
July 2006
Printed in the U.S.A.
Nicholas Ashill
School of Marketing and International Business, P.O. Box 600, Wellington, New Zealand,
e-mail: Nicholas.Ashill@vuw.ac.nz
ABSTRACT
In this study, structural equation modeling is applied to examine the determinants of
students' satisfaction and their perceived learning outcomes in the context of university
online courses. Independent variables included in the study are course structure, instructor feedback, self-motivation, learning style, interaction, and instructor facilitation as
potential determinants of online learning. A total of 397 valid unduplicated responses
from students who have completed at least one online course at a university in the
Midwest were used to examine the structural model. The results indicated that all of
the antecedent variables significantly affect students' satisfaction. Of the six antecedent
variables hypothesized to affect the perceived learning outcomes, only instructor feedback and learning style are significant. The structural model results also reveal that user
satisfaction is a significant predictor of learning outcomes. The findings suggest online
education can be a superior mode of instruction if it is targeted to learners with specific learning styles (visual and read/write learning styles) and with timely, meaningful
instructor feedback of various types.
online degree programs at colleges and universities worldwide. The distance learning system can be viewed as having several human/nonhuman entities interacting
together via computer-based instructional systems to achieve the goals of education, including perceived learning outcomes and student satisfaction. These two
outcomes are widely cited as measures of the effectiveness of online education
systems (e.g., Alavi, Wheeler, & Valacich, 1995; Graham & Scarborough, 2001).
The primary objective of this study is to investigate the determinants of students' perceived learning outcomes and satisfaction in university online education
using e-learning systems. Using the extant literature, we begin by introducing and
discussing a research model illustrating factors affecting e-learning systems' outcomes. We follow this with a description of the cross-sectional survey that was
used to collect data and the results from a Partial Least Squares (PLS) analysis of
the research model. In the final section, we outline the implications of the results
for higher educational institutions and acknowledge the limitations of the study
and a future research agenda.
Student Self-Motivation
Students are the primary participants of e-learning systems. Web-based e-learning
systems place more responsibility on learners than traditional face-to-face learning systems. A different learning strategy, self-regulated learning, is necessary for
e-learning systems to be effective. Self-regulated learning requires changing roles
of students from passive learners to active learners. Learners must self-manage
the learning process. The core of self-regulated learning is self-motivation (Smith,
2001). Self-motivation is defined as the self-generated energy that gives behavior
direction toward a particular goal (Zimmerman, 1985, 1994).
The strength of the learner's self-motivation is influenced by self-regulatory
attributes and self-regulatory processes. The self-regulatory attributes are the
learner's personal learning characteristics, including self-efficacy, which is
situation-specific self-confidence in one's abilities (Bandura, 1977). Because self-efficacy influences choice, efforts, and volition (Schunk, 1991), a survey question
(Moti1 in Appendix A) representing self-efficacy is used to measure the strength
of self-motivation. The self-regulatory processes refer to the learner's personal
Figure 1: Research model. Six antecedent constructs (H1: student self-motivation; H2: student learning style; H3: instructor knowledge and facilitation; H4: instructor feedback; H5: interaction; H6: course structure) each have a hypothesized path (a) to perceived student satisfaction and (b) to perceived learning outcomes.
of motivation is also linked to high dropout rates (Frankola, 2001; Galusha, 1997).
Thus, we hypothesized:
H1a: Students with a higher level of motivation will experience a higher level
of user satisfaction.
H1b: Students with a higher level of motivation in online courses will report
higher levels of agreement that the learning outcomes are equal to or better
than in face-to-face courses.
include fewer sound or oral components than traditional face-to-face course delivery
systems and that online learning systems have a higher proportion of read/write assignment components, students with visual learning styles and read/write learning
styles may do better in online courses than their counterparts in face-to-face courses.
Hence, we hypothesized:
H2a: Students with visual and read/write learning styles will experience a
higher level of user satisfaction.
H2b: Students with visual and read/write learning styles will report higher
levels of agreement that the learning outcomes of online courses are
equal to or better than in face-to-face courses.
Instructor Feedback
Instructor feedback to the learner is defined as information a learner receives about
his/her learning process and achievement outcomes (Butler & Winne, 1995), and it
is one of the most powerful components in the learning process (Dick & Carey,
1990, p. 165). Instructor feedback aims to improve student performance by
informing students how well they are doing and by directing students' learning
efforts. Instructor feedback in the Web-based system includes the simplest cognitive feedback (e.g., an examination/assignment with answers marked wrong),
diagnostic feedback (e.g., an examination/assignment with instructor comments about
why the answers are correct or incorrect), and prescriptive feedback (instructor feedback suggesting how the correct responses can be constructed), delivered via replies to student
e-mails, graded work with comments, online grade books, and synchronous and
asynchronous commentary.
Instructor feedback to students can improve learner affective responses, increase cognitive skills and knowledge, and activate metacognition. Metacognition
refers to the awareness and control of cognition through planning, monitoring,
and regulating cognitive activities (Pintrich, Smith, Garcia, & McKeachie, 1991).
Metacognitive feedback concerning learner progress directs the learner's attention to learning outcomes (Ley, 1999). When metacognition is activated, students
may become self-regulated learners. They can set specific learning outcomes and
monitor the effectiveness of their learning methods or strategies (Chen, 2002;
Zimmerman, 1989). Therefore, we hypothesized:
H4a: A high level of instructor feedback will lead to a high level of user
satisfaction.
H4b: A higher level of instructor feedback will lead to higher levels of student
agreement that the learning outcomes of online courses are equal to or
better than in face-to-face courses.
Interaction
The design dimension includes a wide range of constructs that affect the effectiveness
of e-learning systems, such as technology, learner control, learning model, course
contents and structure, and interaction. Of these, the research model focuses on
only interaction and course structure.
Among the many frameworks/taxonomies of interaction (Northrup, 2002),
this research adopts Moore's (1989) communication framework, which classifies
engagement in learning as (a) interaction between participants and learning
materials, (b) interaction between participants and tutors/experts, and (c) interaction among participants. These three forms of interaction in online courses are
Course Structure
Course structure is seen as a crucial variable that, along with interaction, affects the success of distance
education. According to Moore (1991, p. 3), the course structure expresses the rigidity or flexibility of the program's educational objectives,
teaching strategies, and evaluation methods, and it describes
the extent to which an education program can accommodate or be responsive to
each learner's individual needs.
Course structure has two elements: course objectives/expectations and course
infrastructure. Course objectives/expectations are to be specified in the course syllabus, including what topical areas are to be learned, the required workload in completing
assignments, expected class participation in the form of online conferencing systems, group project assignments, and so on. Course infrastructure is concerned with
the overall usability of the course Web site and organization of the course material
into logical and understandable components. These structural elements, needless
to say, affect the satisfaction level and learning outcomes of distance learners.
We theorize that course structure will be strongly correlated with user satisfaction and perceived learning outcomes, especially when the course material is organized into logical and understandable components, and that the clear communication
of course objectives and procedures will lead to high levels of student satisfaction and perceived learning outcomes. Thus, we hypothesized:
H6a: A good course structure will lead to a high level of user satisfaction.
H6b: A good course structure will lead to higher levels of student agreement
that the learning outcomes of online courses are equal to or better than
in face-to-face courses.
METHODOLOGY
The six sets of hypotheses were tested using a quantitative survey of satisfaction
and learning outcome perceptions of students who have taken at least one online
course at a large Midwestern university in the United States. Structural equation
modeling is employed to examine the determinants of these outcomes and student
satisfaction. The challenge in deciding on the design of the study was to reduce the
complexity of the research object without resorting to either an unjustifiable
simplification or an unmanageable research project. Details regarding the design
of this research are provided in the following sections. First, the development
of the survey instrument is described and a discussion of the sample subjects is
provided. Next, specific measures used to assess the variables are identified and
scale reliability and validity data are reported. This is followed by the presentation
of the structural model results associated with the survey.
Survey Instrument
After conducting an extensive literature review, we designed a list of questions
that we believed were logically associated with the factors in our model (see Appendix A). The survey questionnaire is in part adapted or selected from the commonly administered IDEA (Individual Development & Educational Assessment)
student rating systems developed by Kansas State University.
In an effort to survey students using technology-enhanced e-learning systems, we focused on students enrolled in Web-based courses with no on-campus
meetings. We collected the e-mail addresses from the student data files archived
with every online course delivered through the online program of a university in
the midwestern United States. From these addresses, we generated 1,854 valid e-mail addresses. The 42 survey questions were generated by FrontPage 2000. The
survey URL and instructions were sent to all valid e-mail addresses. We collected
397 valid unduplicated responses from the survey. Appendix B summarizes the
characteristics of the student sample.
Research Method
The research model (Figure 1) was tested using the structural equation model-based
PLS methodology for two reasons. First, PLS is well suited to the early stages of
theory building and testing (Chin, 1998). It is particularly applicable in research
areas where theory is not as well developed as that demanded by linear structural
relations (LISREL) modeling (Fornell & Bookstein, 1982), as is the case with this research
study. Second, PLS is most appropriately used when the researcher is primarily
concerned with prediction of the dependent variable (Fornell & Bookstein, 1982;
Garthwaite, 1994).
[Table 1: Factor loadings of the measurement items; loadings range from .5249 to .9783.]

Table 2: Correlation among construct scores (square root of AVE in the diagonal).
[Constructs: course structure, instructor feedback, self-motivation, learning style, interaction, instructor knowledge and facilitation, user satisfaction, and learning outcomes.]
Note: The diagonal figures are the square roots of the AVE figures. They should be higher than the correlation figures.
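Table 2's discriminant-validity rule (square root of the AVE exceeding the inter-construct correlations; Fornell & Larcker, 1981) can be checked mechanically. The sketch below is a minimal plain-Python illustration using hypothetical loadings and a hypothetical correlation, not the study's actual figures:

```python
import math

# Hypothetical illustrative data (not the study's actual figures):
# standardized item loadings per construct, and inter-construct correlations.
loadings = {
    "satisfaction": [0.87, 0.85, 0.83],
    "outcomes": [0.88, 0.86, 0.91],
}
correlation = {("satisfaction", "outcomes"): 0.72}

def ave(item_loadings):
    """Average variance extracted: mean of the squared standardized loadings."""
    return sum(l * l for l in item_loadings) / len(item_loadings)

def fornell_larcker_ok(loadings, correlation):
    """Each construct's sqrt(AVE) must exceed its correlations with all other constructs."""
    root_ave = {c: math.sqrt(ave(ls)) for c, ls in loadings.items()}
    for (a, b), r in correlation.items():
        if root_ave[a] <= abs(r) or root_ave[b] <= abs(r):
            return False
    return True

print(fornell_larcker_ok(loadings, correlation))  # prints True for these values
```

For these illustrative numbers, sqrt(AVE) is about .85 and .88, both above the .72 correlation, so discriminant validity would be supported.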
[Table 3: Structural model results. For each hypothesized path, the table reports the path coefficient, the observed t value from the bootstrap procedure, and the significance level; nonsignificant paths are marked ns.]
matrix. In contrast, PLS has as its primary objective the minimization of error (or,
equivalently, the maximization of variance explained) in all endogenous constructs.
One consequence of this difference in objectives is that no proper overall goodness-of-fit measures exist for PLS.
Consistent with the distribution-free, predictive approach of PLS (Wold,
1985), the structural model was evaluated using the R-squared for the dependent
constructs, the Stone-Geisser Q2 test (Geisser, 1975; Stone, 1974) for predictive
relevance, and the size, t statistics, and significance level of the structural path
coefficients. The t statistics were estimated using the bootstrap resampling procedure (100 resamples). The results of the structural model are summarized in
Table 3.
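The bootstrap t statistics can be illustrated with a small sketch: resample the cases with replacement, re-estimate the coefficient on each resample, and divide the original estimate by the standard deviation of the resampled estimates. The plain-Python example below is hypothetical; it uses an ordinary least-squares slope as a stand-in for a PLS path coefficient, with made-up data:

```python
import random
import statistics

def slope(xs, ys):
    """OLS slope of ys on xs, standing in for a structural path coefficient."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    return sxy / sxx

def bootstrap_t(xs, ys, resamples=100, seed=7):
    """Bootstrap t statistic: original estimate / std. dev. of resampled estimates."""
    rng = random.Random(seed)
    n = len(xs)
    estimates = []
    for _ in range(resamples):
        idx = [rng.randrange(n) for _ in range(n)]  # resample cases with replacement
        bx = [xs[i] for i in idx]
        if len(set(bx)) > 1:                        # skip degenerate resamples
            estimates.append(slope(bx, [ys[i] for i in idx]))
    return slope(xs, ys) / statistics.stdev(estimates)

# Hypothetical data: satisfaction scores loosely increasing with a feedback score.
feedback = [1, 2, 2, 3, 3, 4, 4, 5, 5, 5]
satisfaction = [2, 2, 3, 3, 4, 4, 5, 4, 5, 5]
print(bootstrap_t(feedback, satisfaction))
```

As in Table 3, a path whose |t| exceeds roughly 1.96 would be judged significant at the .05 level.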
[Table: R-squared and Stone-Geisser Q2 statistics for each construct (omission distance = 25). R-squared is not applicable for the six exogenous constructs; R-squared = .692 for user satisfaction and .628 for learning outcomes, and the Q2 statistics are positive, indicating predictive relevance. NA = not applicable.]
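The Stone-Geisser Q2 statistic reported above is obtained by blindfolding: every d-th data point (d = the omission distance, 25 here) is omitted and predicted from the remaining data, and Q2 = 1 − SSE/SSO compares the squared prediction errors (SSE) to the squared deviations from the mean (SSO); Q2 > 0 indicates predictive relevance. A minimal plain-Python sketch of the statistic itself, using hypothetical observed and predicted scores:

```python
def q_squared(observed, predicted):
    """Stone-Geisser Q2 = 1 - SSE/SSO: positive values mean the model predicts
    the omitted data better than the naive mean of the observed values."""
    mean = sum(observed) / len(observed)
    sse = sum((o - p) ** 2 for o, p in zip(observed, predicted))  # prediction error
    sso = sum((o - mean) ** 2 for o in observed)                  # error of the mean
    return 1 - sse / sso

# Hypothetical example: blindfolded predictions close to the observed scores.
observed = [3.0, 4.0, 2.0, 5.0, 4.0, 3.0]
predicted = [3.2, 3.8, 2.4, 4.6, 4.1, 3.1]
print(round(q_squared(observed, predicted), 3))  # prints 0.924
```

Predicting every omitted point by the sample mean would give Q2 = 0, which is why positive values are read as evidence of predictive relevance.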
DISCUSSION
This study examined the factors that affect the perceived learning outcomes and
student satisfaction in asynchronous online learning courses. The research model
was tested by using a PLS analysis on the survey data. The hypotheses in this
study received partial support. We found that all six factors (course structure,
self-motivation, learning styles, instructor knowledge and facilitation, interaction,
and instructor feedback) significantly influenced students' satisfaction. This is in
accordance with the findings and conclusions discussed in the literature on student
satisfaction.
Of the six factors hypothesized to affect perceived learning outcomes, only
two (learning styles and instructor feedback) were supported. Contrary to previous
research (LaPointe & Gunawardena, 2004), we found no support for a positive
relationship between interaction and perceived learning outcomes. One possible
explanation for this finding is that the study did not account for the quality or
purpose of the interactions. Although a student's perception of interaction with
instructors and other students is important to his/her level of satisfaction with the
overall online learning experience, when the purpose of online interaction is to
create a sense of personalization and customization of learning and help students
overcome feelings of remoteness, it may have little effect on perceived learning
outcomes. Furthermore, a well-designed online course delivery system is likely to
reduce the need for interaction between instructors and students. The university
under study has a very user-friendly online e-learning system and a strong technical
support system. Every class Web site follows a similar design structure, which
reduces the learning curve.
Another disputable point is the statistically insignificant relationship between
online course structure and perceived learning outcomes. One possible explanation
for this is that, for the students who visited the class Web site on a regular basis, what
matters to their learning is not so much the usability of the course site as the quality
of their engagement in other learning activities. For instance, meaningful
feedback that occurs among students or from a teacher may have a greater impact
on perceived learning outcomes. As long as students received meaningful feedback
about the course contents, an inadequate Web content design becomes less critical.
Contrary to other research findings, no significant relationships were found
between students' self-motivation and perceived learning outcomes. We are unable
to explain this aberration. Theoretically, self-motivation can lead students to go
beyond the scope and requirements of an educational course because they are
seeking to learn about the subject, not just fulfill a limited set of requirements.
Self-motivation should also encourage learning even when there is little or no
external reinforcement to learn and even in the face of obstacles and setbacks to
learning. Additional work is needed to better specify the conditions under which
self-motivation is likely to have a positive, negative, or neutral effect on perceived
learning outcomes.
technology warrant investigation. Second, future research should seek to further investigate the nonsignificant relationships between the remaining constructs (course
structure, self-motivation, and interactions) and perceived learning outcomes. To
clarify the dispute over the issue, future studies should use more sophisticated measures of course structure, self-motivation, and interactions and their engagement in
learning activities, either quantitatively or qualitatively. In our study, the learning
outcome variables asked students whether they perceive the quality of online
learning to be better than that of face-to-face courses and whether they learned
more in one than the other. Although students are in general satisfied with online
courses, they do not believe that they learned more in online courses or that the
quality of online courses was better than that of face-to-face classes. In future
research, it would be interesting to know the critical success factors for improving
the quality of online learning using multilevel hierarchical modeling.
PRACTICAL IMPLICATIONS
Higher educational institutions have invested heavily to constantly update their
online instructional resources, computer labs, and library holdings. Unfortunately,
most institutions have not studied the factors that influence online student satisfaction or learning outcomes. This study is one of the first to apply structural equation modeling to student satisfaction and perceived learning outcomes in
asynchronous online education courses. The findings from the current study have
significant implications for distance educators, students, and administrators. In
this study, we asked whether all six factors would lead to higher
levels of student agreement that the learning outcomes of online courses are equal
to or better than in face-to-face courses. The results indicated that online education
is not a universal innovation applicable to all types of instructional situations. Our
findings suggest online education can be a superior mode of instruction if it is targeted to learners with specific learning styles (visual and read/write learning styles)
and with timely, helpful instructor feedback of various types. Although cognitive
and diagnostic feedback are important factors that improve perceived learning
outcomes, metacognitive feedback can induce students to become self-regulated
learners.
More specifically, there is a clear relationship between instructor feedback
and student satisfaction and perceived outcomes. Feedback is a motivator to many
students and should be incorporated into the design and teaching of online courses.
Although students prefer feedback from the instructor, peer feedback can also be a
valuable instructional tool. Because this high level of interaction is time consuming, faculty may want to consider efficient teaching and time management strategies. Online quizzes can provide preprogrammed feedback to learners. In addition,
instructors may want to develop databases of feedback comments and frequently
asked questions that can be used repeatedly. This study may be useful as a pedagogical
tool for instructors planning learning ventures or to justify technological expenditures at the administrative level. It is conceivable that, through this type of research,
online learning will be enhanced when there is a better understanding of critical
online learning factors.
REFERENCES
Alavi, M., Wheeler, B. C., & Valacich, J. S. (1995). Using IT to reengineer business education: An exploratory investigation of collaborative telelearning. MIS Quarterly, 19(3), 293–312.
Arbaugh, J. B. (2000). How classroom environment and student engagement affect learning in internet-based MBA courses. Business Communication Quarterly, 63(4), 9–18.
Bandura, A. (1977). Self-efficacy: Toward a unifying theory of behavioral change. Psychological Review, 84(2), 191–215.
Butler, D. L., & Winne, P. H. (1995). Feedback and self-regulated learning: A theoretical synthesis. Review of Educational Research, 65(3), 245–281.
Chen, C. S. (2002). Self-regulated learning strategies and achievement in an introduction to information systems course. Information Technology, Learning, and Performance Journal, 20(1), 11–25.
Chi, M. T., & VanLehn, K. A. (1991). The content of physics self-explanations. The Journal of the Learning Sciences, 1(1), 69–105.
Chin, W. W. (1998). The partial least squares approach to structural equation modeling. In G. A. Marcoulides (Ed.), Modern methods for business research (pp. 295–336). Mahwah, NJ: Erlbaum.
Curry, L. (1983). A critique of the research on learning styles. Educational Leadership, 48(2), 50–56.
Dembo, M., & Eaton, M. (2000). Self-regulation of academic learning in middle-level schools. The Elementary School Journal, 100(5), 473–490.
Dick, W., & Carey, L. (1990). The systematic design of instruction (3rd ed.). New York: Harper Collins.
Drago, W. A., & Wagner, R. J. (2004). Vark preferred learning styles and online education. Management Research News, 27(7), 1–13.
Dunn, R., Beaudry, J., & Klavas, A. (1989). Survey research on learning styles. Educational Leadership, 46, 50–58.
Falk, R. F., & Miller, N. B. (1992). A primer for soft modeling. Akron, OH: The University of Akron Press.
Fornell, C., & Bookstein, F. (1982). A comparative analysis of two structural equation models: LISREL and PLS applied to market data. In C. Fornell (Ed.), A second generation of multivariate analysis (pp. 289–394). New York: Praeger.
Fornell, C., & Larcker, D. (1981). Evaluating structural equation models with unobservable variables and measurement error. Journal of Marketing Research, 18(1), 39–50.
Frankola, K. (2001). Why online learners drop out. Workforce, 80(1), 53–60.
Galusha, J. M. (1997). Barriers to learning in distance education. Interpersonal Computing and Technology: An Electronic Journal for the 21st Century, 5(3/4), 6–14.
Course Structure
Struc1 = The overall usability of the course Web site was good.
Struc2 = The course objectives and procedures were clearly communicated.
Struc3 = The course material was organized into logical and understandable
components.
Feedback
Feed1 = The instructor was responsive to student concerns.
Feed2 = The instructor provided timely feedback on assignments, exams, or
projects.
Feed3 = The instructor provided helpful timely feedback on assignments, exams,
or projects.
Feed4 = I felt as if the instructor cared about my individual learning in this course.
Self-Motivation
Moti1 = I am goal directed; if I set my sights on a result, I usually can achieve it.
Moti2 = I put forth the same effort in online courses as I would in a face-to-face
course.
Learning Style
Styl1 = I prefer to express my ideas and thoughts in writing, as opposed to oral
expression.
Styl2 = I understand directions better when I see a map than when I receive oral
directions.
Interaction
Intr1 = I frequently interacted with the instructor in this online course.
Intr2 = I frequently interacted with other students in this online course.
OUTPUTS
User Satisfaction
Sati1 = The academic quality was on par with face-to-face courses I've taken.
Sati2 = I would recommend this course to other students.
Sati3 = I would take an online course at Southeast again in the future.
Learning Outcomes
Outc1 = I feel that I learned as much from this course as I might have from a
face-to-face version of the course.
Outc2 = I feel that I learn more in online courses than in face-to-face courses.
Outc3 = The quality of the learning experience in online courses is better than in
face-to-face courses.
                              Number    Proportion (%)
Age
<20                               12         3.02
20–24                            145        36.52
25–34                            116        29.22
35–44                             69        17.38
45–54                             51        12.85
>54                                4         1.01
Total                            397       100.00

Gender
Male                             111        27.96
Female                           282        71.03
Not answered                       4         1.01
Total                            397       100.00

Year in school
Freshman                          14         3.53
Sophomore                         39         9.82
Junior                            60        15.11
Senior                           115        28.97
Graduate                         167        42.07
Not answered                       2         0.50
Total                            397       100.00

Area of study by colleges
Education                        139        35.01
Business                          93        23.43
Health and human services         62        15.62
Science and math                  18         4.53
Speech communications             39         9.82
University studies                13         3.27
Polytechnic studies               11         2.77
Others                            22         5.54
Total                            397       100.00
Sean B. Eom is Professor of Management Information Systems (MIS) at Southeast Missouri State University. He received a PhD in Management Science with
supporting fields in MIS and Computer Science from the University of Nebraska-Lincoln in 1985. In recognition of his continuing research contributions, he was
appointed a Copper Dome Faculty Fellow in Research at the Harrison College of Business of Southeast Missouri State University during the academic
years 1994–1996 and 1998–2000. He is the author/editor of five books, including Decision Support Systems Research (1970–1999): A Cumulative Tradition and
Reference Disciplines, Author Cocitation Analysis Using Custom Bibliographical
Databases: An Introduction to the SAS Systems, Inter-Organizational Information
Systems in the Internet Age, and Encyclopedia of Information Systems. He has
published more than 50 refereed journal articles and 60 articles in encyclopedias,
book chapters, and conference proceedings.
H. Joseph Wen is Chairperson and Associate Professor of Management Information Systems, Department of Accounting and Management Information Systems,
Harrison College of Business, Southeast Missouri State University. He holds a PhD
from Virginia Commonwealth University. He has published over 100 articles in
academic refereed journals, book chapters, encyclopedias, and national conference
proceedings. He has received over six million dollars in research grants from various state and federal funding sources. His areas of expertise are Internet research,
electronic commerce, transportation information systems, and software development. He has also worked as a senior developer and project manager for various
software development contracts since 1988.
Nicholas J. Ashill is Associate Professor in Marketing at Victoria University
of Wellington. He has contributed to such journals as the European Journal of Marketing, Journal of Strategic Marketing, Journal of Services Marketing, Journal of
Marketing Management, Qualitative Market Research: An International Journal,
Journal of Asia-Pacific Business, and the International Journal of Bank Marketing.