
Running head: Paper 1

Critical Analysis of a Research Article: A Comparison of Students' Approaches to
Inquiry, Conceptual Learning, and Attitudes in Simulation-based and
Microcomputer-based Laboratories
Cynthia Sargent
California State University, Monterey Bay

IST520 Learning Theories


Dr. Nancy Lockwood
March 6, 2015

CRITICAL ANALYSIS OF RESEARCH

TABLE OF CONTENTS

Introduction
Research Procedures (Methods)
Research Results
Discussion
References

Introduction

This critical analysis of a research article evaluates the validity of a quasi-experimental
study; that is, a study in which the researchers were able to control many conditions but did not
have full control over all factors. For the study under analysis, the experimental variable was
laboratory type: microcomputer-based laboratory (MBL) or simulation-based laboratory (SBL).
The researchers sought to design a study to test whether the physical manipulation of concrete
materials and lab apparatus affects learning outcomes, specifically high school physics students'
understanding of Boyle's Law. In the article's introduction, the authors identify and
reference research performed by others to compare traditional hands-on laboratories (physical
laboratories) to simulation-based laboratories. However, the traditional laboratories in these other
studies did not typically involve the use of technology, such as microcomputers and sensors. The
authors therefore chose to compare MBL to SBL because this comparison controlled for the types of
representations the learners interact with (say, a graph of pressure versus volume generated in
real time on a computer screen), allowed the same laboratory manual to be followed by both
groups, and allowed the lab to be completed in the same time period in both groups. The SBL
modeled the same equipment, actions, and data as the MBL. There was only one major
difference: virtual manipulation versus actual physical manipulation. The authors explain, "our
control group used an MBL rather than a traditional laboratory activity, so that the tactile effect
could be evaluated with the more extraneous variables controlled" (p. 912).
The article identifies general goals of the research as well as specific research questions. One
goal of the research was to provide evidence-based information to guide teachers' decision
making with regard to the inclusion of MBLs, SBLs, or a combination of the two in their
curriculum. The authors state, "it is important to clarify what is gained and what is lost in these two
environments" (p. 906). A second goal was to add to the existing knowledge of the differences
between physical and virtual laboratories with relation to 1) a concept for which students have no
prior tactile experience, 2) an activity type requiring students to make decisions, improve
the experiment, interpret data representations, and design extension experiments, and 3)
affective learning objectives in addition to conceptual and cognitive learning
objectives. The three research questions listed in the paper were:
1. How effective is the SBL compared to the MBL in terms of learning physics concepts [Boyle's Law]?
2. Do students perform similarly or differently in inquiry tasks in the SBL and the MBL?
3. How do students enjoy and are involved in the SBL and the MBL? (p. 914)
The authors did not present a hypothesis related to these research questions.
While they discussed many advantages of both MBLs and SBLs, the authors did
not express bias toward one type of laboratory or the other. They made a convincing case (that is,
one supported by research evidence) that MBLs are an improvement over traditional laboratories in
many situations, for example by removing tedious tasks such as graphing by hand and replacing
them with automatically generated, real-time graph displays, or by reducing the time needed for
investigations and thereby providing time for additional trials. They also made a
convincing case that SBLs have several affordances as well: notably, they remove the need
for expensive equipment and are therefore cost effective, they allow for representations of
abstract phenomena (such as diffusion of molecules), and they reduce cognitive load by reducing
distractions and constraining the learning environment.


Included in the article is a comprehensive review of literature related to learning theories,
MBLs, and SBLs, and the authors made compelling points that research is still needed to
evaluate when to use one type of laboratory or the other, or when to use both. They explicitly
identify some weaknesses or confounding factors present in prior research studies, which leave
the door open for further research. They identify studies in which the amount of information
provided to the control and experimental groups was not equal, or studies that may have favored
the experimental group because posttest questions used representations that the experimental
group learners had seen during the SBL but the control group had not seen when completing the
traditional laboratory. The authors also emphasized in their introduction that learning goals
related to laboratories should include more than just conceptual learning. Therefore, it is
important for research to focus not just on measuring effects on conceptual learning, but to also
include measures to determine the relationship, if any, between physical manipulation or virtual
manipulation and students' attitudes toward science and inquiry skills. In short, the authors made
a convincing case that the three research questions they pursued in the study met a need and
addressed a gap in current research.


Research Procedures (Methods)


Important considerations and features of the participants, research methods and
procedures used in this study are summarized below:

The laboratory topic of Boyle's law was 1) one of three labs in the Ministry of
Education's curriculum guidelines, and 2) a topic for which students had no prior tactile
experience.

The participants were 11th grade physics students in an urban high school in Taiwan
(Taipei), split into two groups by the researchers. One group performed an SBL and the
other performed an MBL. The school was ranked 14 of 28 high schools in Taipei and the
students were determined to have average academic achievement. The same teacher
taught both groups and facilitated both types of laboratories. The two classes were
randomly assigned to the laboratory type. While there were differences in the number of
male and female students in the classes, the researchers used a chi-square test to
determine if the difference was significant, and the results showed no
statistically significant difference in gender distribution. Students in neither group were
accustomed to performing laboratories of any kind in the normal course of
instruction.
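The gender check described above can be illustrated with a minimal Python sketch of a 2x2 chi-square test of independence. The counts below are hypothetical; the article reports only the group sizes (n=32 and n=36) and the test's outcome.

```python
# Illustrative sketch, not the researchers' code. The gender counts are
# invented for demonstration; only the class sizes come from the article.

def chi_square_2x2(table):
    """Pearson chi-square statistic for a 2x2 contingency table.

    table = [[a, b], [c, d]] with rows = groups, columns = genders.
    """
    row_totals = [sum(row) for row in table]
    col_totals = [sum(col) for col in zip(*table)]
    grand = sum(row_totals)
    stat = 0.0
    for i in range(2):
        for j in range(2):
            expected = row_totals[i] * col_totals[j] / grand
            stat += (table[i][j] - expected) ** 2 / expected
    return stat

# Hypothetical [male, female] counts per class:
sbl = [18, 14]   # n = 32, as reported for the SBL group
mbl = [17, 19]   # n = 36, as reported for the MBL group
stat = chi_square_2x2([sbl, mbl])
# With 1 degree of freedom, a statistic below 3.84 means p > 0.05:
# no statistically significant difference in gender distribution.
print(round(stat, 3), stat < 3.84)
```

With these invented counts the statistic stays well below the 0.05 critical value, mirroring the null result the researchers report.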

The SBL and MBL groups were both given 1 hour to complete the laboratory, and the
SBL and MBL both afforded quick data collection. The SBL simulated reality in that
"forces between gas molecules, systematic errors, and random errors were involved" (p.
914). This provided both groups the opportunity to troubleshoot and suggest improvements
to the experiment design and placed similar cognitive load on both groups of learners.
The students worked in groups of four, and followed directions from and answered
questions in the same printed laboratory manual. Students in both groups were allowed to
discuss the laboratory and experiment but were required to complete the lab manual
handout independently. The lab manual had been tested with other high school teachers
and revised before this research study.

No formal lecture was given to either group, but both groups performed the laboratory at
the end of the unit and had some content knowledge related to Boyle's Law. A conceptual
test was given both before and after the laboratory to measure gains in learning. Prior to
use in the study, the conceptual test questions were reviewed and approved by subject-matter
experts and piloted with other 11th graders from the participants' school. The
article provides two example questions from the conceptual test and identifies Kendall's
coefficient of concordance as the method of evaluating interrater reliability. The
coefficient of concordance was 0.87, which indicates good reliability.

Student responses in the lab manual were collected, read, and coded to measure scientific
inquiry performance. The lab manual for the activity was carefully designed by the
researchers to include features associated with inquiry, including: generating testable
questions, selecting and controlling variables, planning and conducting experiments,
interpreting and transforming data, identifying experimental flaws, and using indirect
reasoning. The lab manual incorporated scaffolding to progress students from structured
to open inquiry. Ratings were given to student responses based on criteria in a rubric and
the reliability of ratings for different raters was measured as 0.95 by Spearman's
correlation, again indicating strong reliability. The article provides a table showing the
coding rubric and examples.
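The rank-correlation reliability measure mentioned above can be sketched in a few lines of Python. The rubric scores below are invented for demonstration; only the 0.95 figure comes from the article.

```python
# Illustrative sketch of Spearman's rank correlation between two raters'
# rubric scores. The score lists are hypothetical, not the study's data.

def ranks(values):
    """Assign 1-based ranks, averaging ties (fractional ranking)."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    result = [0.0] * len(values)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and values[order[j + 1]] == values[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1
        for k in range(i, j + 1):
            result[order[k]] = avg
        i = j + 1
    return result

def spearman(x, y):
    """Spearman's rho = Pearson correlation of the ranks."""
    rx, ry = ranks(x), ranks(y)
    n = len(x)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    sx = sum((a - mx) ** 2 for a in rx) ** 0.5
    sy = sum((b - my) ** 2 for b in ry) ** 0.5
    return cov / (sx * sy)

rater_a = [3, 2, 4, 1, 3, 4, 2, 1]   # hypothetical rubric scores
rater_b = [3, 2, 4, 1, 2, 4, 2, 1]
print(round(spearman(rater_a, rater_b), 2))
```

A rho near 1, as in the study's reported 0.95, indicates that the two raters ordered the responses almost identically.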


Eight participants from each group were randomly selected for follow-up structured
interviews to assess their attitudes toward science and inquiry. The authors of the article
provide the interview protocol in an appendix. Questions were categorized as relating to
1) enjoyment of the inquiry, 2) perception of involvement in the laboratory activity, and
3) perceptions of learning. The authors applied member checking as a means of
establishing the validity of the interview results. This involved the researchers restating
and summarizing information for the participants to check whether the recorded
responses accurately depicted the participants' positions. Responses were coded as
positive, neutral, or negative and the rating was checked with participants. Interviews
were audio-recorded and transcribed; another researcher inspected the transcripts to
verify coding validity.

The authors of this study provide clear evidence in their article that they planned and carried
out a valid, quasi-experimental study. Many variables were controlled for, such as duration of the
learning activity, allowance of social interaction during the activity, the types of representations
of data generated by the computer during the experiment, and the structure of the assignment (lab
manual). The sample sizes were relatively small (n=32 for SBL and n=36 for MBL), but had the
important feature of providing for comparison of students who had received the same physics
instruction from the same teacher prior to the SBL and MBL laboratories. The sample of students
interviewed was also small for each group (n=8), but the authors explain that they wanted
interviews to occur very soon after the activity, and that time and student availability dictated
conducting interviews with only a small subset of participants from each group.


The researchers' lab manual and conceptual test questions were vetted prior to the study and
the researchers applied methods of evaluating interrater reliability to ensure validity of the
rubrics used to code responses from the conceptual test and lab manual responses. Kendall's
coefficient of concordance of 0.87 and Spearman's correlation of 0.95 provide confidence that
the reported codes for the responses are valid. The use of member checking was another sign
that these researchers placed great importance on using methods that were scientifically sound.
The structure of the activities (SBL and MBL) and the methods of data collection (pre- and
posttest conceptual questions, post-activity interviews, and evaluation of student responses in the
lab manual) were closely aligned with the purpose and research questions of the study. Other
researchers interested in replicating this study have sufficient information to do so: the methods
are described in detail in the article, the rubric for coding responses is provided, and statistical
tests are described. Further research would be helpful to determine if these results from
Taiwanese high school students are generalizable to other populations of learners, such as high
school students in the United States or other countries.


Research Results
Table 1 provides a summary of the statistics the researchers used to address their first
research question regarding the effectiveness of each laboratory type in helping students learn a
physics concept. To compare the initial knowledge of the SBL and MBL groups, the researchers
applied an independent t-test. To evaluate each group's progress in conceptual understanding, as
measured by the conceptual test, the researchers applied a paired t-test. Lastly, an
analysis of covariance (ANCOVA) was used to compare the two groups to one another. Effect
sizes (d) and 95% confidence intervals (CI.95) were analyzed as well. The statistical tests they
applied were appropriate, and the authors made valid conclusions based on the probability (p)
values obtained from these tests. Clark and Mayer (2011, p. 60) summarize the research
community's accepted guidelines: a probability value less than 0.05 indicates differences
between groups are statistically significant; when the effect size is greater than 0.5, the effect
of the treatment is considered significant; and an effect size greater than 1 is regarded as a
strong effect.
So, for example, the authors concluded that the SBL and MBL students had the same
prior knowledge before the laboratory. This conclusion is supported by evidence in the article
because the authors provide the results of the independent t-test used to compare pretest scores. The
t-test gave a p value of 0.31, which is greater than 0.05. Therefore, any difference in the mean
pretest scores of the SBL and MBL groups is not significant, and the interpretation of the t-test
result is consistent with the authors' conclusion that prior knowledge did not differ between
the groups.
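The computation behind figures like p = 0.31 can be sketched as follows. This is a minimal illustration, not the study's analysis: the pretest scores are hypothetical, and the pooled two-sample t statistic and Cohen's d are computed by hand.

```python
# Illustrative sketch with invented data: the pooled two-sample t statistic
# and Cohen's d, the quantities behind an independent t-test comparison
# and a reported effect size.
from math import sqrt

def pooled_t_and_d(a, b):
    """Return (t, d) for two independent samples, using a pooled SD."""
    na, nb = len(a), len(b)
    ma, mb = sum(a) / na, sum(b) / nb
    va = sum((x - ma) ** 2 for x in a) / (na - 1)   # sample variances
    vb = sum((x - mb) ** 2 for x in b) / (nb - 1)
    sp = sqrt(((na - 1) * va + (nb - 1) * vb) / (na + nb - 2))  # pooled SD
    t = (ma - mb) / (sp * sqrt(1 / na + 1 / nb))
    d = (ma - mb) / sp          # Cohen's d: mean difference in SD units
    return t, d

# Hypothetical pretest scores for two small classes:
sbl_pre = [4, 5, 6, 5, 4, 6, 5, 5]
mbl_pre = [5, 6, 5, 6, 5, 4, 6, 5]
t, d = pooled_t_and_d(sbl_pre, mbl_pre)
# |t| below ~2.14 (the 0.05 critical value for 14 df) means the group
# means are not significantly different; |d| < 0.5 means a small effect.
print(round(t, 3), round(d, 3))
```

A small |t| here plays the same role as the study's p = 0.31: it licenses the conclusion that the groups started with equivalent knowledge.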



Table 1: Statistical tests applied to answer research question 1

Independent t-test (1)
  General definition or use: Tests a hypothesis related to the probability that two groups (samples) are the same with respect to a variable.
  Used in this study to: Determine if the SBL and MBL groups were not significantly different in their level of conceptual knowledge prior to the laboratory.
  Authors' conclusion: The two groups had the same conceptual knowledge before completing the laboratory.
  Evidence provided for conclusion: p = 0.31

Paired t-test (1)
  General definition or use: Used when measurements are taken from the same subjects in before-and-after situations.
  Used in this study to: Determine if there were gains in conceptual understanding following completion of the Boyle's Law laboratory (i.e., compare pretest and posttest scores).
  Authors' conclusion: Both groups (SBL and MBL) increased their conceptual understanding.
  Evidence provided for conclusion: p < 0.001 in both groups; d = 0.87 and 1.43

ANCOVA (2)
  General definition or use: Determines if there are significant differences between the means of independent groups (samples); allows one variable, the covariate, to be statistically controlled for.
  Used in this study to: Determine if the changes in conceptual understanding were significantly different in the SBL group compared to the MBL group. The pretest scores were the covariate.
  Authors' conclusion: Both groups improved their conceptual understanding to the same degree by completing the laboratory; that is, the difference in laboratory type did not influence conceptual learning.
  Evidence provided for conclusion: p = 0.27

(1) For more information, see http://www.ruf.rice.edu/~bioslabs/tools/stats/ttest.html
(2) For more information, see https://statistics.laerd.com/spss-tutorials/ancova-using-spss-statistics.php
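The paired t-test used for the pre/post comparison can be sketched in the same spirit. The scores below are invented for demonstration; only the reported p < 0.001 outcome comes from the article.

```python
# Illustrative sketch with invented scores: a paired t-test comparing each
# student's pretest and posttest, computed by hand.
from math import sqrt

def paired_t(pre, post):
    """t statistic for paired samples: mean difference over its standard error."""
    diffs = [b - a for a, b in zip(pre, post)]
    n = len(diffs)
    mean = sum(diffs) / n
    var = sum((x - mean) ** 2 for x in diffs) / (n - 1)   # variance of diffs
    return mean / sqrt(var / n)

pre  = [4, 5, 6, 5, 4, 6, 5, 5]   # hypothetical conceptual-test scores
post = [7, 8, 8, 7, 6, 9, 7, 8]
t = paired_t(pre, post)
# A large positive t (here well beyond the 0.05 critical value of ~2.36
# for 7 df) mirrors the article's finding of significant learning gains.
print(round(t, 3))
```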


Related to the proficiency of performance on inquiry tasks, the researchers used t-tests or
Pearson's chi-square tests depending on the task. If the task involved a single question, chi-square
tests were used; otherwise, t-tests were applied. Student interview responses were also evaluated
as part of the analysis of differences between the SBL and MBL groups. To compare attitudes
toward science and inquiry, postlaboratory interviews were compared in terms of frequency of
positive, neutral, or negative coded responses, as well as the percentage of such responses in
each group. These statistical analyses and data compilation methods were appropriately applied.
The authors addressed each research question one at a time in their discussion and analysis
and each variable of the study emerged in a meaningful way from the data. The conceptual
understanding test results showed that physical manipulation was not required for students to
learn a science concept, but that laboratory activities of either type (SBL or MBL) positively
affected conceptual learning. So the data from the study clearly answered research question 1.
And, while conceptual learning gains were similar in both groups, features of the MBL did
cause some significant differences in inquiry performance in the groups. The questions included
in the laboratory manual were tied to inquiry practices such as planning and conducting an
experiment and evaluating results and improving the experiment. The authors provide a table in
the article that identifies the goal or purpose of each question, the wording of the task, the
question type, and the statistical test result. From the data in this table, it was apparent that the
SBL group fell short of the MBL group in some meaningful ways. The MBL students made
better decisions about the amount of data needed in an experiment, they performed better at
carrying out the experiment, made deeper connections between the data and possible
experimental flaws, and suggested more ways to improve the experiment. However, there were
no significant differences between the groups for some inquiry activities, such as reasoning about
the materials to use in an experiment, interpreting graphs, or identifying controlled and
experimental variables and designing new experiments. Clear conclusions emerged from the
laboratory manual data and the statistical tests that were applied to the data, and important trends
related to research question 2 became apparent with regard to the effect of laboratory type on
student performance in inquiry tasks.
Conclusions related to student attitudes toward science and inquiry (research question 3) also
emerged from the analysis of data collected during the study. Whereas 50% of interviewees in
the SBL group reported that they prefer the SBL to a traditional (hands-on, or physical)
laboratory, and 50% preferred the lab to lecture, 100% of interviewees in the MBL group
reported a preference for the MBL compared to a traditional laboratory and 75% preferred the
lab to lecture (25% had no preference). Additionally, some participants in the SBL group
expressed disappointment that the environment was too constrained and the tasks were too difficult,
whereas the MBL group expressed no concerns about the difficulty of the task and
appreciated the opportunity to feel air pressure.

Discussion

The results and analyses support the researchers' conclusion that while tactile experience or
stimuli are not necessary for learning physics concepts, the tactile experience does appear to play
a role in inquiry performance, inspiring more ideas and more practical experimental designs than
an SBL environment. The researchers' finding that the MBL group had better attitudes toward
science is also supported by the data and analysis. By providing tables and clear explanations of
their tests and analyses, the authors of the article make clear the connections between the
evidence and their findings. Each finding can be tied back to a probability value greater or less
than 0.05, which the researchers correctly interpret as no significant difference
between the groups when p > 0.05 and a significant difference when p < 0.05.
The authors provide a reasonable explanation for the tactile effect. They propose that the
tactile effect relates to the concept of embodied cognition: that learning with and without motor
experiences prompts different learning processes. A computer-based learning environment seems
to evoke a mindset in students that all variables are perfectly controlled for, that the data from a
virtual experiment are ideal and without error, and that there should be no manipulation
constraints on their experiments. The authors suggest that students' expectations in the
computer-based environment (SBL) might have led to trial-and-error strategies and less careful
attention paid to planning and conducting experiments. Related to the differences in attitudes
expressed in the two groups, the authors provide a reasonable explanation that the students in the
study had previously learned science from lectures and paper-and-pencil drills, and the MBL
activity made a lasting impression on them, facilitating a positive attitude toward laboratories.
The article provides a number of reasonable implications for teachers and science education
in general. The authors state, "the SBL used in the present study demonstrated that SBLs can
involve experimental flaws without jeopardizing the learning of the science concepts." The
implication is that instructional designers of SBLs should consider including experimental
flaws so that the SBL models the real world more closely instead of an idealized world, in
turn encouraging deeper thinking in students when the SBL involves inquiry tasks. The authors
suggest that development of a large number of these types of SBLs should be promoted. They
caution that a balance needs to be found between the benefit of reducing cognitive load by
constraining the learning environment and the benefit of cultivating inquiry by providing
non-idealized data in the SBL.
For teachers, the main implication for practice is to carefully consider the learning objectives
when deciding whether to have students perform virtual labs or physical labs. If the objective is
only to improve conceptual understanding, then SBLs are likely to be just as effective as physical
laboratories. However, if other objectives are included, such as inquiry performance or
influencing attitudes about science, then physical laboratories are likely the better choice. The
article suggests a combination of SBLs and MBLs be used to provide good science instruction
within the constraints of time and resources.
Science teachers and curriculum programs should continue to include physical laboratories as
an integral component of the learning experience, as there were significant differences found in
this study between an SBL and a physical laboratory, notably the positive attitudes students had
related to the opportunity to experience air pressure in a tactile way. Additionally, this study
shows that an activity written in a way to provide scaffolding from structured to open inquiry is
effective in encouraging students to think scientifically even when the students lack prior
experience with inquiry.

References

Chen, S., Chang, W.-H., Lai, C.-H., & Tsai, C.-Y. (2014). A comparison of students' approaches
to inquiry, conceptual learning, and attitudes in simulation-based and microcomputer-based
laboratories. Science Education, 98(5), 905-935. doi:10.1002/sce.21126
Clark, R. C., & Mayer, R. E. (2011). e-Learning and the science of instruction: Proven
guidelines for consumers and designers of multimedia learning (3rd ed.). San Francisco,
CA: John Wiley & Sons.
