
Computers & Education 141 (2019) 103615


Learning chemistry nomenclature: Comparing the use of an electronic game versus a study guide approach

Joshua Wood, Dermot Francis Donnelly-Hermosillo∗

Department of Chemistry, California State University Fresno, USA

ARTICLE INFO

Keywords: Nomenclature; Game; Study guide; Mixed methods; Undergraduate

ABSTRACT

Learning how to name chemical compounds is a critical feature of chemistry that many students often find challenging. Naming compounds requires an understanding of both the conventions and language of chemistry. Common strategies used to improve student understanding of chemical nomenclature include study guides and games. However, little is known about how these strategies impact student learning of chemical nomenclature. This mixed-method study compares the effect of a new electronic chemistry game, Topinomica, versus an existing study guide on the learning of nomenclature in an introductory undergraduate chemistry course for a diverse student population (n = 470). Research methods include pre/post-tests, short student surveys and instructor questionnaires, and classroom observations. Quantitative findings indicate significant pre/post gains for both conditions, but no significant difference between the game (n = 255) and the study guide (n = 215). Prior knowledge analysis shows a significant difference between conditions for high prior knowledge students, with the game treatment performing better. Qualitative findings demonstrate that instructors endorse and are adopting the game and that students prefer a game to a study guide. We discuss implications of this research for future science education studies related to study guides and educational games.

1. Introduction

Chemical nomenclature is a critical and challenging topic for students to learn that underpins their entire Chemistry experience
(Brecher, 1999). Students must learn to name elements and compounds based on representations of atoms in molecules (molecular
formula, e.g., C6H12O6 - glucose), the proportion of atoms in molecules (empirical formula, e.g., CH2O - glucose), the arrangement of
atoms in molecules (structural formula, e.g., O=C=O - carbon dioxide), and how atoms in different molecules interact (chemical
reactions, e.g., 2H2 + O2 → 2H2O - hydrogen and oxygen gas react to produce water; Taskin & Bernholt, 2014).
In general, the nomenclature difficulties students encounter center on problems with language and conceptual difficulties (Taskin
& Bernholt, 2014). There is evidence to suggest that language can be positively affected through use of electronic games (Tsai & Tsai,
2018). Language involves the meaning, function, and syntax of formulae while conceptual problems involve misunderstanding the
link between models, symbols, and the macroscopic level (Taskin & Bernholt, 2014). A further difficulty, interpretation, involves
assigning static meanings to formulae, which can lead students to assume that chemical reactions involve only one molecule per
formula (Taskin & Bernholt, 2014).
Problems with language arise because many students hold common preconceptions about certain language; in particular, they have
difficulty detaching themselves from the introductory aspects of nomenclature related to elements (Al-Kunifed, 1993; Barke, Hazari,


∗ Corresponding author. 2555 E. San Ramon Ave., MS SB70, Fresno, CA 93740, USA.
E-mail address: ddonnelly@csufresno.edu (D.F. Donnelly-Hermosillo).

https://doi.org/10.1016/j.compedu.2019.103615
Received 1 December 2018; Received in revised form 27 June 2019; Accepted 28 June 2019
Available online 29 June 2019
0360-1315/ © 2019 Elsevier Ltd. All rights reserved.

& Yitbarek, 2009). For example, students have different ideas of what a chemical symbol stands for, with some mistaking symbols for
abbreviated notation and others confusing them with chemical formulae. Students may misinterpret Cl2O (dichlorine monoxide) as a
composition of O (oxygen) bonded entirely to Cl2 (chlorine gas) rather than two Cl (chlorine) atoms bonded individually to O
(Bernholt et al., 2012).
Conceptual understanding of symbols underpins the ability to apply symbols to compounds (Taskin & Bernholt, 2014). Students often use
the wrong symbol for elements, such as writing the symbol P for Potassium instead of K, and often misunderstand the use of prefixes
like di- and tri-, as in carbon dioxide (two oxygens) or carbon trioxide (three oxygens; Glaźar & Devetak, 2002). Beyond difficulties
with using symbols, students can attempt to apply erroneous meanings to the subscripts of formulae and the order in which formulae
are arranged (Kieg & Rubba, 2006). For example, in the formula for water (H2O), the subscript “2” indicates the presence of two
hydrogen atoms to one oxygen atom, but some students believe the “2” indicates the presence of a double bond. Further, students
occasionally have problems in understanding what exactly the charge of an ion means (Glaźar & Devetak, 2002).
Several strategies are recommended for supporting students in learning chemical nomenclature and overcoming their language
and conceptual difficulties such as worksheets, study guides, interactive lectures, and games (Marais & Combrick, 2009; Taskin &
Bernholt, 2014; White, 1984; Wood & Breyfogle, 2006), either through physical or computer-based formats. These strategies are
predominantly drill-based in nature, requiring repeated practice at naming several similar but distinct elements and compounds. For
the purpose of this study, we focus on two of the most common strategies in undergraduate classes: study guides and games.

1.1. Review of study guides

Across various disciplines, students tend to perceive study guides, defined as physical or electronic aids which indicate what a
student should study (Laidlaw & Harden, 1990), in a positive light as they help students succeed on exams or prepare them for the
material in class (Dickson, Miller, & Devoley, 2005; Mires & Howie, 1998; Ravichandran, Abirami, Dilara, & Vijayaraghavan, 2014;
Shemberger & Wright, 2014; Sturges, Maurer, & Kosturik, 2017; Vandsburger & Duncan-Daston, 2011). However, researchers generally
agree that evidence for the effectiveness of study guides is anecdotal at best, and that at worst study guides can actively hinder
student learning, hence the need for more empirical research (Dickson et al., 2005; Gurung, 2003; Hackathorn, Joyce, & Bordieri,
2018; Sturges et al., 2017).
Holsgrove, Lanphear, and Ledingham (1998) demonstrate that too much information in a study guide for a medical course can deter
students from using the guide entirely. Dickson et al. (2005) show that mandatory study guides improve student performance on
exams compared to a treatment where the study guide was not required, but they question whether a two percent difference between
the two treatments, though of statistical significance, is of practical significance. Further, students who complete a majority of the
study guide achieve outcomes similar to those of students who make only a cursory effort (Dickson et al., 2005).
Several researchers suggest that study guides are most effective when students create them themselves, or assist in creating them,
thereby promoting self-learning (Hackathorn et al., 2018; Hein, 1978; Langlois, 2002; Mires & Howie, 1998). Study guides also promote
self-learning when they take a temporal approach to the course - that is, when the content of the study guide matches the content of
the lectures and activities in sequence (Khogali, Laidlaw, & Harden, 2006). The authors caution, however, that
the findings are the result of a relatively isolated sample (n = 150) of second year medical students on a small subset of the overall
course. In addition, when students receive a completed study guide versus a study guide outline, students generally view the
completed study guide as more helpful, though there is no significant difference between the group with a completed guide and the
group with a study guide outline (Sturges et al., 2017).
In sum, further empirical research is needed on the impact of study guides, particularly research tied to student learning outcomes
in chemical nomenclature. To the best of the authors' knowledge, no literature links chemical nomenclature to study guides, and thus
a more general approach to study guides was taken for this study.

1.2. Games and chemical nomenclature

Multiple games have been developed to teach nomenclature including Compoundica (Bayir, 2014), Nomenclature BINGO (Crute,
2000), GoChemistry (Morris, 2011), Chemantics (Sawyer, 1976), ChemOKey (Kavak, 2012), the Rainbow Matrix (Wulfsberg, Sanger,
Melton, & Chimeno, 2006), and Chemical Alias (Kurushkin & Mikhaylenko, 2015). Compoundica, Chemantics, and ChemOKey are
card games dealing exclusively with ionic compounds (negative and positive ions combined to form a neutral compound, e.g., Na+
and Cl− bond to form NaCl (Sodium chloride, Salt)) with the first game played on a game board (Bayir, 2014) and the latter two with
a series of cards (Kavak, 2012; Sawyer, 1976). Chem-Deck is played as a poker-style game, but also deals exclusively with ionic
nomenclature (Sherman & Sherman, 1980). Though it deals exclusively with ionic nomenclature like the others, Rainbow Matrix is a
digitized game whereby players match like charges from an anion group and cation group to make a neutral compound (Wulfsberg
et al., 2006). Sedney (1988) reports using Trivial Pursuit® as the basis for a chemistry game but provides no further detail. Further
evidence for using Trivial Pursuit® as the framework for an educational game comes from the work of Adair and McAfee (2018) who
made a generalized chemistry game, though not one specifically targeting nomenclature. Hung, Yang, Hwang, Chu, and Wang (2018)
report that board and electronic games are promising tools for increasing student learning but that further research into the topic is
needed. The meta-analysis by Clark, Tanner-Smith, and Killingsworth (2016) concluded that electronic games are generally more
effective than non-game instructional methods.
Nomenclature games need not be only for ionic nomenclature, as nomenclature bingo is meant to help students with the naming

Table 1
Study design of nomenclature games.

| Authors | Topic Covered (Ionic, Multiple, Other) | Study Type (Comparison, Survey, Case) | Assessment (Conceptual, Self-Report, Observation) | Conceptual Assessment Items Included | Student Demographics Reported (Yes/No) |
|---|---|---|---|---|---|
| 1. Bayir (2014) | Monoatomic Ions | Case Study | Self-Report | No | No |
| 2. Crute (2000) | Other | Case Study | Self-Report | No | No |
| 3. Kavak (2012) | Ionic Nomenclature | Comparison | Conceptual | No | No |
| 4. Kurushkin and Mikhaylenko (2015) | Multiple | Case Study | Self-Report | No | No |
| 5. Morris (2011) | Ionic/Covalent Nomenclature | Case Study | Self-Report | No | No |
| 6. Sawyer (1976) | Ionic Nomenclature | Not Described | Not Described | No | No |
| 7. Sedney (1988) | Not Described | Not Described | Not Described | No | No |
| 8. Sherman and Sherman (1980) | Ionic Nomenclature | Comparison | Conceptual | No | No |
| 9. Wulfsberg et al. (2006) | Multiple | Comparison | Conceptual | No | No |
| Totals | 1 Monoatomic Ions; 4 Ionic Nomenclature; 2 Multiple; 2 Other | 4 Case Study; 3 Comparison; 2 N/A | 4 Self-Report; 3 Conceptual; 2 N/A | 9 No | 9 No |

conventions for alkanes (Crute, 2000), and thus it could be adapted for other uses as well. GoChemistry is modeled after the game Go
Fish® and teaches both ionic and covalent nomenclature (Morris, 2011). Finally, Chemical Alias is a card game teaching oxides, acids
and bases, and salts (Kurushkin & Mikhaylenko, 2015).
The common theme across these games is a card-based format teaching a limited scope of chemical nomenclature (Table 1). There
is, however, to the best of the authors' knowledge, no single nomenclature game designed to give a broad overview of introductory
nomenclature. Each of the games addresses a specific area such as ionic nomenclature or covalent nomenclature. The game created
for the purpose of this study, which the authors title “Topinomica,” addresses the nomenclature of elements, hydrates, diatomic
molecules, and further breaks down ionic nomenclature into polyatomic ions, mono-atomic ions, acids/bases, and ionic compounds.
Please refer below to the game design section for a general overview of the Topinomica game, and the appendix for a full list of rules.
Studies on games for chemical nomenclature tend to be purely qualitative in nature, with the exception of Kavak (2012), Sherman
and Sherman (1980), and Wulfsberg et al. (2006; Table 1). Further, across the majority of chemical nomenclature and game studies,
details on student demographics and the nature of assessments are absent (Table 1).

1.3. Integrating games within instruction

Games have a long history of use in educational settings, sparking much discussion on everything from their appropriateness in the
classroom to, in recent years, their design. With improvements in technology, games have expanded in scope and are
more varied than ever before. Video games, simulations, card games, text games, and board games, can all fall under the umbrella of
a game (Breuer & Bente, 2010).
There is extensive evidence in the literature suggesting that games do indeed improve student learning across a wide variety of
subjects including math, science, and literature (Connolly, Boyle, MacArthur, Hainey, & Boyle, 2012; Gee, 2003; Kordaki & Gousiou,
2017; Randel & Morris, 1992). Games can also help students feel more motivated to explore otherwise difficult subjects (Srisawasdi &
Panjaburee, 2018). For example, students playing an interactive computer game, focused on navigating a space ship across a finish
line or around a corner, perform significantly better than students who did not play the game when confronted with Newtonian
physics problems related to force and motion (White, 1984). Further, a literature review indicates that students prefer games and
simulations to traditional classroom instruction and that a majority of games improve content retention (Randel & Morris, 1992;
Vogel et al., 2006).
The common design feature of educational games is an emphasis on knowledge acquisition - particularly “serious games” which is
a term coined by Clark C. Abt in 1975 to describe games designed for a primarily educational purpose rather than entertainment
(Connolly et al., 2012). However, when designing games, it is important to consider the difference between a game which only
illustrates a concept and a game which teaches students to apply that same concept to new and evolving situations (Clark et al., 2011;
Clark, Sengupta, Brady, Martinez-Garza, & Killingsworth, 2015a; Martin, Silander, & Rutter, 2018). Game design is often not taken
into consideration when discussing games vs traditional methods of instruction (Clark et al., 2016). Towards that end, Gauthier and
Jenkinson (2018) posit that serious games should be designed to promote “productive negativity”, that is, an environment in which
students learn from their failures. Games also help students who have lower prior knowledge of the subject at hand (Gros, 2007;
Virvou, Katsionis, & Manos, 2005). However, understanding the relationship between games and prior knowledge requires further
research (Gros, 2007).
The type of person playing is just as important as the design of the game itself (Gaydos, 2015; Gros, 2007). Despite positive
opinions, instructors rarely integrate games into the classroom, largely due to time requirements and a misunderstanding of
the role of a game in the curriculum or the efficacy of the content (Gros, 2007). Research indicates that other barriers to adoption
exist, including not only the time required to learn and play the game but the time in between play sessions as well as more standard
barriers like class size, availability of equipment, and the instructor's own knowledge about computer systems and the game itself
(Egenfeldt-Nielsen, 2004; Mansureh, 2010). These are examples of first-order barriers as described by Ertmer (1999), that is, barriers
to integration which are extrinsic to the instructor. Second-order barriers are less well-defined but more challenging; they
include personal beliefs and possible radical shifts in teaching style (Ertmer, 1999). The problem of integration is still prevalent in the
educational field as instructors fail to integrate games into their daily use despite evidence that students were more engaged and
eager to learn (Huizenga, ten Dam, Voogt, & Admiraal, 2017). Few teachers make use of professional development tools to integrate
such games though they report that they are interested in doing so (Callaghan, Long, Van Es, Reich, & Rutherford, 2018).
As mentioned previously, games are particularly useful in science-related fields (Randel & Morris, 1992). While games are used to
teach everything from VSEPR theory to nuclear synthesis, there is very little quantitative evidence supporting the use of games to
teach chemical nomenclature with recent designs making use of cards, computers, Jeopardy, or established board games (Adair &
McAfee, 2018; Capps, 2008; Deavor, 1996; Kurushkin & Mikhaylenko, 2015; Morris, 2011; Rastegarpour & Marashi, 2012; Russell,
1999; Sawyer, 1976).
Based on our review of the literature and the focus of the game and the existing study guide, two research questions guide this
study:

1. Compared to the existing study guide approach to Introductory Chemistry nomenclature, does a computer-based chemical no-
menclature game (Topinomica) improve student learning?
2. What are instructors' and students' perspectives of the computer-based game and the barriers to integration of the game?
a. Do students prefer a study guide or a game-based approach to learning nomenclature?
b. Do instructors plan to integrate the game into their future instruction and why?


Fig. 1. Screenshot of the electronic game board layout.

For the first research question, we are interested in seeing the pre/post effects of the study guide and Topinomica on student
learning of chemical nomenclature. Within this research question, we are also interested to see if the effects on learning are consistent
across different students, particularly for low prior knowledge students and high prior knowledge students. For the second research
question, we are particularly interested in learning if the Topinomica game has ‘buy-in’, that is, if students prefer using it to a study
guide and if it has the potential to be sustained in future semesters by instructors. We aim to consider what may be the contributing
factors to such instructor technology integration by analyzing instructor responses through Ertmer's (1999) first and second order
barriers.

2. Method

2.1. Game design and study guide design

Many of the chemistry games discussed in the introduction involve learning one aspect of nomenclature, most often ionic no-
menclature (Bayir, 2014; Kavak, 2012; Sawyer, 1976; Sherman & Sherman, 1980; Wulfsberg et al., 2006). Some games discuss the
nomenclature of elements or of oxo-salts (Kurushkin & Mikhaylenko, 2015). No single game could be found which encompasses all
these topics which is a gap that Topinomica is designed to fill. There is a precedent for modifying existing games for educational
purposes, most notably Trivial Pursuit® (Adair & McAfee, 2018; Ratcliffe, 1986; Sedney, 1988). It was due to the ability to have
multiple categories of questions that Trivial Pursuit® was chosen as the model for this study.
The game board and categories were modified and digitized to fit onto a computer screen (see Fig. 1; Use the following link to
download the game [Windows only]: Game Link). Some categories of questions, such as hydrates, were included because they are only a
short extension of the knowledge needed for ionic compounds. The design of Topinomica follows established theory for effective
game design such as Gredler's five criteria for essential design. These five criteria are as follows: (1) winning should not be random
but indicative of skill or knowledge; (2) the game should address important content; (3) the game should be designed to match the
level of the end user (e.g., elementary versus college); (4) there should be no penalty for wrong answers; and (5) games should not be
zero-sum (Gredler, 2004). As to the first criterion, students win by answering correctly, which demonstrates knowledge. The game instructs
students in the nomenclature of common compounds they will encounter through their chemistry careers thus meeting the second
criterion. The questions asked were largely based on a college-level textbook but can be modified to suit any level of learner.
Topinomica is intended to be played at any introductory college course where the nomenclature categories may be introduced in a
different order throughout the semester. Finally, there are no penalties for incorrect answers, and students are actively learning
whether they win or lose, meeting the fourth and fifth criteria (see Figs. 2–4).
The rules for Topinomica follow the same rules of Trivial Pursuit® but can be modified to fit individual curricula. The board was
modified from the original Trivial Pursuit® board to streamline use in the classroom by removing the cross arms in the center of the
board and the roll-again spaces. Students were given one minute to answer the question of the category they landed on, but the timer's
use was optional. If a student landed on a scroll and answered the corresponding question correctly, then they would be able to fill in
their color wheel for that section. Please see Appendix A for additional information on how Trivial Pursuit® is played.
The study guide involved 80 questions focused on nomenclature related to ionic compounds and acids and bases (Authors, 2014).


Fig. 2. Digital question cards on the left side of the electronic board with detail of components.

Based on the work of Sturges et al. (2017) and Hackathorn et al. (2018), study guides can be classified either as a worksheet, wherein
students fill out the information, or as notes, wherein the instructor provides the details of what is needed for exams. The study guide
used in this paper falls into the former category. In the first and second sections of the study
guide, students were tasked with filling in a table with the probable ionic formula given the names of two elements or two polyatomic
ions. An example of this is writing KI (potassium iodide) when presented with potassium and iodine. Parts three and four of the study
guide dealt exclusively with writing the name given the formula and vice versa. Please see Appendix B for a copy of the study
guide.
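The charge-balancing step underlying such study-guide items (e.g., producing KI from potassium and iodine) is mechanical enough to sketch in code. The following is a minimal illustration under our own assumptions; the function name and ion charges are ours, not part of the study materials, and polyatomic ions needing parentheses are not handled:

```python
from math import gcd

def ionic_formula(cation: str, cation_charge: int, anion: str, anion_charge: int) -> str:
    """Combine a cation and an anion into a neutral formula by balancing charges."""
    # Smallest subscripts that cancel the charges, reduced by their GCD.
    g = gcd(cation_charge, abs(anion_charge))
    n_cation = abs(anion_charge) // g
    n_anion = cation_charge // g
    fmt = lambda symbol, n: symbol if n == 1 else f"{symbol}{n}"
    return fmt(cation, n_cation) + fmt(anion, n_anion)

print(ionic_formula("K", 1, "I", -1))    # KI (potassium iodide)
print(ionic_formula("Ca", 2, "Cl", -1))  # CaCl2
print(ionic_formula("Al", 3, "O", -2))   # Al2O3
```

Reducing by the greatest common divisor ensures an empirical formula such as MgO rather than Mg2O2.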

2.2. Participants

This study involved 470 students taking a General Chemistry course in a western United States comprehensive university that is a
Hispanic-Serving Institution. Of the 323 students that selected a response to the gender question, 178 selected ‘Female’ (55%), 136
selected ‘Male’ (42%), one selected ‘Non-binary/3rd Gender’ (1%), and eight selected ‘Decline to State’ (2%). Of the 339 students that
selected a response to the demographic question, 153 selected ‘Multiple’ (45%), 56 selected ‘White’ (17%), 55 selected ‘Hispanic/
Latino/Latina’ (16%), 24 selected ‘Asian’ (7%), 15 selected ‘Other’ (4%), seven selected ‘Black or African American’ (2%), and 29
selected ‘Decline to State’ (9%).


Fig. 3. Game board with explanation of piece movements and spaces.

Fig. 4. Game pieces, color wedge, timer, and electronic die. (For interpretation of the references to color in this figure legend, the reader is referred
to the Web version of this article.)


2.3. Implementation

As students self-selected into classrooms, random distribution of students across treatments was not possible. Twenty-four classes
across 15 instructors were available for students to select from and these classes were split across two rooms. As such, two classes
were offered at each time slot. Four of the instructors were female and 11 were male. Approximately one week prior to beginning the
study, the instructors were given a demonstration of the game in an hour-long meeting so that they could familiarize themselves with
its layout and with how the game was loaded and played.
The treatments were randomly assigned and split by class for each time slot. Hence, for each time slot, there was a study guide
treatment class and a game treatment class. It was expected that such an implementation would help alleviate differences based on
self-selection into classes. Further, pre-test scores were compared to check for equivalency of the two groups and are reported in the
findings. For both treatments, complete data was obtained for 215 students in the study guide and 255 students in the game
treatment.
The authors observed all classes from both the study guide and game groups to determine the fidelity of implementation. One
class of approximately 26 students in a study guide treatment did not follow the instructions of the study and so their results were
excluded from the analysis. Instructions for both treatments were relatively straightforward and thus instructors did not differ in their
implementation of the two treatments. Students in the study guide group actively discussed the material together, used the
nomenclature sheet, appeared engaged, and asked multiple questions. Students in the game condition also engaged in discussion
and asked the instructor questions if their answer differed from the correct answer shown in the game. Students in the study guide
condition also asked questions of the instructor but seemed less likely to do so without the immediate feedback of a correct answer to
prompt them. Students in both treatments worked in pairs.
University ethical approval was received before the research was conducted and all instructors and students signed consent forms
before participating in the research.

2.4. Research design

There is some debate as to whether a quantitative or qualitative methodology is more appropriate, with proponents of both; however,
there is evidence that a mixed-method study can help explain the gaps left by one methodology or the other (Cook & Reichardt, 1979,
pp. 7, 68–85). Some authors are proponents of contingency theory, which advocates using the method, or mixture of methods, most
applicable to the study (Johnson & Onwuegbuzie, 2004). The pragmatist approach advocates that the research questions should drive
the method, which in the case of this study was primarily quantitative with supporting qualitative evidence (Onwuegbuzie & Leech,
2005). We use a quantitative approach to address the first research question through
pre/post-tests - whether the game improved scores over the study guide. We use a qualitative approach to address the second research
question through a brief instructor questionnaire, a brief student survey, and classroom observations.
Creswell (2009, p. 205) points out some challenges of the mixed method study, namely that there is a time intensive component as
vast amounts of data are analyzed as well as the requirement that the researchers be familiar with both quantitative and qualitative
approaches. Furthermore, researchers must take into consideration the timing of the study and whether quantitative and qualitative
data must be collected at the same time or whether there is a gap (Creswell, 2009, p. 206).
In the case of this study, both quantitative and qualitative data were collected simultaneously, giving weight to the use of a mixed
method approach. It was not possible to collect the quantitative and qualitative data sequentially because students discuss
nomenclature for only a week (a few class periods) before moving on to other topics.

2.5. Data collection and analysis

Prior to either treatment, students took a pre-test with 18 multiple-choice nomenclature items to assess prior knowledge
(Appendix C). The 18 items included six categories (three questions per category). The six categories are 1. Monoatomic ions, 2.
Polyatomic ions, 3. Ionic compounds, 4. Covalent compounds, 5. Acids and bases, and 6. A mixture of elements, hydrates, and
diatomic molecules. Aside from the elements, the list of questions for each category was developed by looking through the lists given
in the introductory chemistry textbook used at the university at the time (Tro, 2011). Further questions were generated from the
laboratory manual used by students. We obtained a Cronbach's alpha of 0.837 for the pre-test and 0.729 for the post-test. A pilot
study was performed (n = 372) the semester prior to the main study to review the assessment items of both the pre- and post-tests,
thereby ensuring consistency across both conditions. For the main study, an additional question was added to each category, changing
the total number of questions from twelve to eighteen and allowing further insight into each category. In addition, an “I don't
know” option was added to help mitigate the possibility of students guessing, with assurance that choosing it would not affect their grade.
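Cronbach's alpha, reported above for the pre- and post-tests, is computed from the per-item variances and the variance of total scores. A minimal sketch of that computation follows; the function name and the toy item-response matrix are our own illustration, not the study's data:

```python
import numpy as np

def cronbach_alpha(item_scores) -> float:
    """Cronbach's alpha for an (n_students, n_items) matrix of item scores."""
    item_scores = np.asarray(item_scores, dtype=float)
    k = item_scores.shape[1]                          # number of items
    item_vars = item_scores.var(axis=0, ddof=1)       # per-item variances
    total_var = item_scores.sum(axis=1).var(ddof=1)   # variance of each student's total
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Illustrative data: 4 students x 3 items (1 = correct, 0 = incorrect)
scores = [[1, 1, 1],
          [1, 1, 0],
          [0, 1, 0],
          [0, 0, 0]]
alpha = cronbach_alpha(scores)
```

Values near 0.8, such as the 0.837 reported for the pre-test, are conventionally taken to indicate good internal consistency.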
Both the study guide and game treatments received a periodic table and a nomenclature guide with the various rules listed by Tro
(2011). Students did not use any other resources to help them answer questions. As an example of one of the rules, when naming
binary acids such as hydrochloric acid (HCl), Tro (2011) suggests the naming schema of [Hydro]+[base name of the nonmetal]+[ic]
+[Acid]. In this case, “chlor” is the base name of the nonmetal. Students were then given a nomenclature lecture period which went
over the common nomenclature rules, though they still had the nomenclature card to reference. There is evidence to suggest that
formal instruction better prepares students for game play (Lundy, 1991; Sakkal & Martin, 2019). Nomenclature lectures lasted between 20
and 30 min for all the classes. After the lecture, students worked on their assigned activity (study guide or game), after which the
nomenclature cards were collected and the post-test handed out. The questions on the pre-test and post-test were developed by


choosing three questions from each of the six nomenclature categories. The post-test had comparable though different questions for
each category, presented in a different order (Appendix D).
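Tro's (2011) binary-acid schema quoted above is mechanical enough to express directly. The sketch below is purely illustrative; the function name and the base-name strings are our own assumptions, not material from the study:

```python
def binary_acid_name(nonmetal_base: str) -> str:
    """Apply the binary-acid schema: [Hydro] + [base name of the nonmetal] + [ic] + [Acid]."""
    return f"hydro{nonmetal_base}ic acid"

print(binary_acid_name("chlor"))  # hydrochloric acid (HCl)
print(binary_acid_name("brom"))   # hydrobromic acid (HBr)
```

The same template-filling logic underlies several of the other naming rules students practiced, such as prefixing covalent compounds with di- or tri-.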
The post-test included a survey on students' thoughts about the study guide/game as well as demographic information. A gatekeeping
question was included to help filter out students who were not reading the questions carefully; thus survey responses have a smaller
sample size. There were three criteria for ruling student responses out. The first criterion was changing answers on the gatekeeping
question, which reversed what the question was asking students. The second criterion ruled out anyone who went from a neutral to a
strong opinion on the gatekeeping question, as this again implies that the question was not read thoroughly. The final criterion was
based on the first question in the series, which asked students to choose which activity they did. As activities were assigned by class,
no student should have reported a different activity from their fellow classmates. These reductions led to a sample size for the
survey responses of 181 students for the game group and 142 students for the study guide comparison.
All quantitative data were analyzed within SPSS. Depending on the analysis, different tests were used; for example, a t-test was used for comparison between pre- and post-scores, while a repeated measures analysis was used to compare the differences between conditions. Specific tests are listed in the results section when used. In order to move beyond dichotomous scoring (right or wrong) and show nuance in student understanding, test scores were converted from raw right or wrong answers into a 4-point scale. This method of scoring aligns with constructivist views of learning that provide greater insight for instructors on student understanding (Liu, Lee, Hofstetter, & Linn, 2008). A correct answer resulted in 3 points for that question, while an incorrect answer resulted in 2 points. The "I don't know" option counted as 1 point, and leaving the question blank counted for 0 points. A two-point scale only allows for analysis of right or wrong answers without providing insight into whether students were guessing at the correct answer; the four-point scale made the multiple choice test more sensitive to student understanding.
Qualitative data was obtained from instructors through a survey with free response questions. Instructors were first asked open-
ended questions such as how long they spent on their respective activities and how they felt about the activity they did. They were
then asked more pointed questions specifically relating to the barriers of technology integration first defined by Ertmer (1999). First
order barrier categories were Equipment, Training, and Support. Ertmer (1999) categorizes second-order barriers under Assessment,
Student-Teacher Roles, Teaching Method, and Management and Organizational Styles (Table 2). These barriers provide the
groundwork for many fields of educational research and the adoption of technologies in classrooms (An & Reigeluth, 2011; Chen,
Tan, & Lim, 2012, pp. 191–196; Heath, 2017; Laine, Sedano, Joy, & Sutinen, 2010; Minshew & Anderson, 2015). For example,
Kopcha (2012) and Clark, Zhang, and Strudler (2015b) highlight the value of this framework in categorizing qualitative data for
technology integration. Comments were coded independently by the authors using a binary system in Microsoft Excel®: a 1 indicated that a comment was indicative of a particular barrier, while a 0 indicated that it was not. Some comments were coded as belonging to more than one barrier. Analysis of the qualitative data took an inductive rather than deductive approach, with Ertmer's barriers as the starting theory. The initial coding by both authors had an interrater agreement of 79%. Responses were re-examined through iterative discussion between the first and second authors until agreement reached 100%.
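Percent agreement of the kind reported above is simply the proportion of comments on which the two coders' binary codes match; for illustration (the codes below are synthetic, not the study's):

```python
# Synthetic binary codes (1 = comment indicates the barrier, 0 = it does not).
coder_a = [1, 0, 1, 1, 0, 0, 1, 0, 1, 1]
coder_b = [1, 0, 0, 1, 0, 1, 1, 0, 1, 1]

# Proportion of items on which the two coders agree.
agreement = sum(a == b for a, b in zip(coder_a, coder_b)) / len(coder_a)
print(f"Interrater agreement: {agreement:.0%}")  # Interrater agreement: 80%
```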

3. Results

3.1. Overall performance (RQ1)

Students in both treatments improve overall. A t-test analysis between the pre- and post-tests for all students and by treatment
show significant improvement (p = 0.001; Table 3). Repeated measures analysis between the study guide (n = 215) and the game (n = 255)
treatments shows no significant difference (F(2, 468) = 1.818, p = 0.178). Analysis of pre-test scores across conditions via one-way
ANOVA does not show a significant difference (F(2, 468) = 0.001, p = 0.99).
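For readers replicating these analyses outside SPSS, the paired t statistic and Cohen's d reported in Table 3 can be computed directly from gain scores. A minimal sketch with synthetic data drawn to mimic the study's pre-test mean and gain distribution (not the actual scores):

```python
import math
import random
import statistics

random.seed(0)
# Synthetic scores mimicking Table 3 (pre-test mean ~43.75, gain mean ~2.88).
pre = [random.gauss(43.75, 5.77) for _ in range(470)]
post = [p + random.gauss(2.88, 4.94) for p in pre]

gains = [b - a for a, b in zip(pre, post)]
mean_gain = statistics.fmean(gains)
sd_gain = statistics.stdev(gains)

t = mean_gain / (sd_gain / math.sqrt(len(gains)))  # paired-samples t statistic
d = mean_gain / sd_gain                            # Cohen's d for paired data
print(f"t = {t:.1f}, d = {d:.2f}")
```

With these population parameters, d lands near the 0.58 reported for the combined sample.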
Gains by instructor were analyzed across treatments by repeated measures analysis. Results indicate non-significant pre/post
differences by instructor when looking at instructors for both conditions (n = 22, F = 1.494, p = 0.074) or for instructors in the study
guide (n = 10, F = 1.778, p = 0.074) or game (n = 12, F = 1.175, p = 0.305) treatments.

Table 2
Ertmer's first and second order barriers.

First Order Barriers
Availability of Equipment: Equipment is either unavailable or does not work as intended.
Training on the Technology: An instructor does not feel adequately trained to use the technology before using it.
Support: Lack of access to help with the technology while using it.

Second Order Barriers
Assessment: Technology integration is influenced by instructors' views of assessment and how well the technology aligns with existing assessments.
Student-Teacher Roles: The role an instructor has, or feels they should have, when interacting with students.
Teaching Method: A pedagogical belief in a certain method of teaching that prevents the adoption of new methods.
Management/Organizational Styles: A factor sometimes outside of instructor control; a belief in how a class should be structured.


Table 3
Combined t-test results for all students across both conditions and individually by condition.
Condition n Gain Mean (SD) Pre-test Mean (SD) Post-test Mean (SD) t p d

Combined Students 470 2.88 (4.94) 43.75 (5.77) 46.63 (3.97) 12.627 0.001 0.58
Study Guide 215 2.54 (5.25) 43.76 (5.43) 46.30 (4.19) 7.106 0.001 0.53
Game 255 3.16 (4.66) 43.75 (6.05) 46.91 (3.76) 10.831 0.001 0.63

3.1.1. Category outcomes


After we determined the overall performance, we conducted an in-depth item analysis for the six categories of questions (Table 4).
Each category was represented by three questions on the pre- and post-tests. The sample consisted of all combined students with three
questions per category, giving a sample size of 1410. Repeated measures analysis shows a significant difference between the categories (F(6, 1404) = 8.833, p = 0.001), with larger effect sizes for Category 2 (Ionic Compounds) and Category 3 (Polyatomic Ions; Table 4).

3.1.2. Prior knowledge outcomes


Given similar pre-test scores for both treatments, students from both conditions were split into a low prior knowledge (LPK) group
and a high prior knowledge (HPK) group based on a review of the average scores on the pre-test. Students scoring 43 or lower were
included in the LPK group, and students scoring 44 or higher were included in the HPK group. Students in both the LPK and HPK
groups benefited from the two treatments (Table 5). Both the study guide and the game helped LPK students (SG: n = 93, t = 10.46, p = 0.001, d = 1.46; Game: n = 109, t = 10.61, p = 0.001, d = 1.25) more than they helped HPK students (SG: n = 122, t = 0.49, p = 0.542, d = 0.06; Game: n = 146, t = 6.034, p = 0.001, d = 0.44). For the HPK group, the game significantly improved scores
compared to the study guide. Repeated measures analysis between the game and study guide for LPK students was not significant
(n = 202, F = 0.008, p = 0.928). Repeated measures analysis of HPK students showed significant differences between the study
guide and game conditions (n = 268, F = 7.009, p = 0.01) in favor of the game treatment.
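The prior knowledge split described above is a simple threshold on the pre-test score; a minimal sketch (threshold values taken from the text, the example scores invented):

```python
def prior_knowledge_group(pre_score: float) -> str:
    """Pre-test scores of 43 or lower -> LPK; 44 or higher -> HPK."""
    return "LPK" if pre_score <= 43 else "HPK"

# Hypothetical pre-test scores spanning the cut point.
pre_scores = [38.5, 43, 44, 47.4]
print([prior_knowledge_group(s) for s in pre_scores])  # ['LPK', 'LPK', 'HPK', 'HPK']
```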

3.2. Student perspective outcomes (RQ2a)

A total of 323 students shared their perspectives (See Appendix D for the list of survey questions asked). Fifty-five percent
(n = 178) of students across conditions strongly agreed or agreed that they would have rather played a game than complete a study
guide to learn nomenclature (Table 6). Twenty-seven percent (n = 88) were neutral in their opinion, while only 17% (n = 57) were
against playing a game. The number of students who responded that they would rather play the game than do a study guide was
higher among students in the game condition at 62% (n = 113) than the study guide group where only 45% (n = 65) responded in
favor of the game treatment. Seventy-one percent of students in the game condition reported that their activity was enjoyable
compared to only 26% of students in the study guide condition.
A majority of students from both groups report that the activity they did helped them feel more confident writing formulas and
names of compounds. The study guide condition had a higher percentage of students agreeing to this question at 71% (n = 103)
compared to the game condition at 63% (n = 115).

3.3. Instructor perspective outcomes (RQ2b)

Instructors were asked for their feedback through follow-up questions in the form of a survey. These responses were then coded and grouped according to the first- and second-order barriers provided by Ertmer (1999).

3.3.1. First order barriers


First-order barriers are those conditions extrinsic to an instructor that prevent the adoption of new technology, whether the barrier is monetary, resource-related, or time-related (Ertmer, 1999; Petrov, 2014; Schoepp, 2005). The barriers we coded for are the
availability of equipment, training on the technology, and support. There were comparatively few first order barriers as opposed to
second order barriers, and of the first-order barrier comments, the majority were related to the equipment barrier. Below are some of
the representative examples:

Table 4
Item analysis by category.
Category (paired t-test) n Gain Mean (SD) Pre-test Mean (SD) Post-test Mean (SD) t p d

Category 1: Elements and Hydrates 1410 0.09 (0.61) 2.60 (0.58) 2.69 (0.49) 5.834 0.001 0.17
Category 2: Ionic Compounds 1410 0.20 (7.09) 2.40 (0.65) 2.60 (0.53) 10.664 0.001 0.34
Category 3: Polyatomic Ions 1410 0.24 (0.79) 2.27 (0.69) 2.52 (0.57) 11.470 0.001 0.40
Category 4: Acids and Bases 1410 0.10 (0.78) 2.37 (0.70) 2.47 (0.57) 4.879 0.001 0.16
Category 5: Covalent Compounds 1410 0.14 (0.65) 2.62 (0.60) 2.77 (0.45) 8.129 0.001 0.28
Category 6: Monoatomic Ions 1410 0.18 (0.81) 2.32 (0.75) 2.50 (0.64) 8.286 0.001 0.26


Table 5
Pre-post analysis based on prior knowledge outcomes.
n Gain Score Pre-Test Mean (SD) Post-Test Mean (SD) t p d

LPK (Total) 202 5.62 38.85 (5.01) 44.47 (3.35) 14.90 0.001 1.32
LPK (Study Guide) 93 5.59 38.90 (4.11) 44.49 (3.57) 10.46 0.001 1.46
LPK (Game) 109 5.66 38.80 (5.69) 44.56 (3.17) 10.61 0.001 1.25
HPK (Total) 268 0.81 47.45 (2.76) 48.26 (3.62) 3.95 0.001 0.25
HPK (Study Guide) 122 0.22 47.46 (2.72) 47.68 (4.13) 0.49 0.542 0.06
*HPK (Game) 146 1.29 47.45 (2.81) 48.74 (3.06) 6.034 0.001 0.44

1. Equipment
“I know that the USB drive with the game was available to some of us, but some of the errors in the game has not been corrected.”
Instructor A
“Like Trivial Pursuit®, the game can go on for a very long time if the players can't land on the corners. Perhaps a different model
game structure could be used.”
Instructor B
These comments represent first order barriers due to the indication that something was wrong with the physical equipment that
was used - either a problem with the USB Drives or a problem with the format of the game itself.

2. Training and Support


“More specific instructions would be nice, although the students did pick up on the games fairly quickly.”
Instructor E
This comment illustrates a need for additional instructions on how to play the game, which could represent either a barrier in training to use the computer program or a barrier in support when using the program in the classroom.

3.3.2. Second order barriers


When first order barriers are removed, meaning that there is adequate access and support of technology, the integration of
technology is still not guaranteed (Cuban, Kirkpatrick, & Peck, 2001). This indicates the existence of a second order barrier that is less tangible and more colloquially defined as "beliefs" (Ertmer, 1999; Schoepp, 2005). Adoption of technology depends on instructors' beliefs, specifically their belief in the importance of technology (Schibeci, MacCallum, Cumming-Potvin, Durrant, Kissane,
& Miller, 2008). The authors coded for four categories of second order barrier - 1. Assessment, 2. Student Teacher Roles, 3. Teaching
Method, and 4. Management/Organizational Styles. These barriers were confirmed to exist again in 2002 after a repeat study ex-
amined whether advancements in technology had made these barriers obsolete (Pugh, Sheldon, Byers, & Zhao, 2002).

1. Assessment

An assessment barrier is the belief that students will have similar outcomes regardless of the tool used and that there exists some
other factor preventing students from learning (Pugh et al., 2002). The assessment barrier does not necessarily indicate a dislike for
one method over another, but rather feeling that a combination of methods “should” be used to test student learning and that the lack
of one or another method may be a detriment to students. For example, Instructor F reports that students still need to study and
practice, which is not a comment on the efficacy of the game or study guide but on how students “should” learn.
“[The] Game should be included as it’s more fun, interesting and giving the same results as study guide”
Instructor D
“The game seems to be more fun for most of the students. They still need to study and practice nomenclature to do well [in
exams].”
Instructor F
“I personally prefer the study guide, but the game is another good method of learning. If both give similar results, then I believe
that is best for us to allow the use of both teaching method. Students don't all learn the same and may be more inclined to learn on
a different platform or in a different way. The benefit to the study guide, in my opinion, is the amount of tricks we as an instructor
can help give them to memorizing the nomenclature. Giving them patterns or additional ideas to help them solidify the memory
process is helpful in my opinion. All-in-all, it would be nice to integrate both of the method into teaching."
Instructor B
Table 6
Survey data on perspective outcomes (% (n)). Questions 1 and 3 served as gatekeeper questions.
Question 1: "I would rather play a game based on nomenclature rather than complete a worksheet on nomenclature"
Question 2: "The nomenclature activity that I did was enjoyable"
Question 3: "I would rather complete a worksheet on nomenclature rather than play a board game based on learning nomenclature"
Question 4: "After today's nomenclature activity, I feel more confident writing formulas and names of compounds."

                                 Q1        Q2        Q3        Q4
Combined
Strongly Agree/Agree             55 (178)  51 (166)  24 (79)   67 (218)
Neutral                          27 (88)   30 (97)   26 (84)   21 (69)
Strongly Disagree/Disagree       18 (57)   19 (60)   50 (160)  11 (34)
Study Guide
Strongly Agree/Agree             46 (65)   26 (37)   30 (42)   72 (103)
Neutral                          33 (48)   46 (66)   27 (39)   18 (25)
Strongly Disagree/Disagree       20 (29)   27 (39)   43 (61)   10 (14)
Game
Strongly Agree/Agree             62 (113)  71 (129)  20 (37)   63 (115)
Neutral                          22 (40)   17 (31)   24 (45)   24 (44)
Strongly Disagree/Disagree       15 (28)   11 (21)   55 (99)   11 (20)

2. Student-Teacher Roles

Instructors' perspectives on their role in the study guide or game can indicate a belief set that is a detriment to the integration of technology. Without active teacher participation in the activity, learning will be diminished and integration would
ultimately fail (Egenfeldt-Nielsen, 2004). For example, Instructor A's and H's comments indicate a belief that learning should be “fun”
as something that can be enjoyed while not necessarily representing a barrier.
“I will still implement the game as the students enjoy that more than just doing a worksheet.”
Instructor A
“I'd prefer the game. It's more interactive and it beats having the students work on the study guide during lab.”
Instructor H
Instructor E offers an interesting example of a potential Student-Teacher role barrier. When asked the question, “What do you see
as the instructor's role when students are playing the Topinomica chemical nomenclature game?” Instructor E responded:
“To ensure that the students understand how to play the game."
Instructor E
However, when asked the same question but in regards to the study guide, Instructor E responded:
“To ensure that all questions the students may have are answered."
Instructor E
The differences in these two responses indicate a very different perspective on the instructor's role when students play the game. With the study guide, it is Instructor E's perspective to answer all of the students' questions, but that perspective changes with the game to a more passive teaching style, whereby the instructor's role is to help students use the game.

3. Teaching Method

Teaching method indicates a belief in how a student should be taught, and how instructors should use supplemental material. It
refers to pedagogy. Comments in this category generally fell into a pattern of ‘This is how technology should be used’.
“It could be used as a supplemental material. I think most of the students enjoyed it.”
Instructor C
“I think that it is a good tool to use for those whom do not want to sit down and study traditionally.”
Instructor E
“The game was well thought-out but I think it would be better if used after memorizing nomenclature rules, i.e., used as a ‘self-
quiz'."
Instructor I
4. Management/Organizational Styles

Management and Organizational styles refers to how a classroom is organized and should be managed. It can represent a barrier
when the way the class is structured does not fit with the instructor's beliefs on how it should be organized.
[Question: Would you play the game again?] Answer: “Only if it were able to be tailored to the content of my course.”
Instructor B
[Question: Would you prefer a game or a study guide as an assessment of student learning? Why?] Answer: “With the allotted
given last semester, the study guide because it took a while for the students to understand how to play the game. For a study guide,
you can get right into it"
Instructor G
Instructors B and G both indicate barriers to adoption but for different reasons. On the one hand, Instructor B would use the game
but only if it can be tailored to how instructor B wants the material to be taught. While this perception can certainly be considered a
barrier to the instructor's teaching method, the instructor specifically mentions the course as a whole rather than an activity in
isolation, and thus is indicative of a potential organizational barrier. Instructor G's barrier is more illustrative of management problems, namely that they would choose the study guide over the game due to time constraints largely outside of their control.
“It's a good game. I enjoy the board version of the game more than the digital version, it's easier to keep tabs on the class. But the
board version has problems with the pieces getting mixed up and things going missing.”
Instructor F
Instructor F mentioned that it was "easier to keep tabs on the class," and when asked whether they would prefer a game or study guide as an assessment of student learning, Instructor F's comments were indicative of a barrier not with the game or the study guide itself, but another management barrier:
“Typically in the class they ignore the topic until the last minute and do not get enough study time. When they complete the study
guide it is typically alone, quietly, with no help/feedback from their peers. If they have not studied the material before, there is not much they can pick up on. Typically, I can support the students who are having difficulty as I move around the room while
they work on the guide. But my attention is divided 24 ways. Students who know the material finish and want to leave."
Instructor F
As is indicated by the response, Instructor F is not particularly concerned whether the study guide or the game serves as a better
assessment, but rather being able to keep an eye on and help 24 students in learning the material.

4. Discussion

Overall, students in the study guide and game treatments improve their scores on the pre/post-tests (Table 3). Differences by
treatment are not statistically significant. This finding aligns with Kavak (2012) and Sherman and Sherman (1980), who found that
their students did significantly better on the post-test after their respective games compared to their normal curriculum treatment.
Despite no significant differences between treatments overall, we found a significant effect by treatment based on students' prior
knowledge (Table 5). Students, whether low prior knowledge (LPK) or high prior knowledge (HPK), all significantly improve from
pre-to post-test with larger gain scores for LPK than HPK. Such findings align with previous research (Gros, 2007; Virvou et al., 2005)
that report that games help lower prior knowledge students more than their higher prior knowledge counterparts. However, in our
study, we found that HPK students in the game treatment significantly outperform HPK students in the study guide treatment. LPK
students whether in the study guide or the game treatment have similar learning outcomes. The differences between the game and
study guide groups are thought to be due to a lack of scaffolding provided in the game. There is evidence to suggest that, had scaffolding been provided, the game group would likely have outperformed the study guide condition by a wider margin (Kao, Chiang, & Sun, 2017; Masek, Boston, Lam, & Corcoran, 2017). As it is, HPK students in the game condition still achieved higher gains than their study guide counterparts even without scaffolding. These differences will be investigated in future studies.
An explanation for such findings, based on our observations and students' and instructors' preference for the game, is that the
study guide follows a somewhat rigid, linear, and one-size-fits-all approach where all students, no matter their prior knowledge, must
start and finish at the same place. However, the game offers a more flexible, non-linear, multiple pathway approach to learning that
supports students of all prior knowledge types to target categories they are least familiar with, rather than repeating categories they
already understand. Further, we argue that the game presents information in a less intimidating manner where students can work
through examples one at a time rather than being presented with a multiple-page study guide. These points are also supported by
previous research (Holsgrove et al., 1998) that found students were against using a study guide because it contained too much
information. Overall, there is a lack of research on study guides in general, and particularly in science education, and there is a critical need to determine under which conditions, and for which demographics of students, study guides are most effective. In particular, our study indicates that future research should consider alternative supports for high prior knowledge students, who are less likely to find value in a rigid study guide when they are already aware of much of the information presented.
Given that many studies focus on the use of games to improve understanding of ionic compounds, we analyzed our findings by
category to understand the potential influence of particular categories on reported learning outcomes (Table 4). Repeated measures
analysis shows that there is a significant difference between learning outcomes based on categories, with the categories of ionic
compounds and polyatomic ions resulting in greater learning gains than categories such as Elements and Hydrates, and Acids and
Bases. These findings support those of Kavak (2012), Sherman and Sherman (1980), and Wulfsburg et al. (2006), who found that ionic nomenclature games improve student learning. There is, however, very little literature regarding the other categories of nomenclature, indicating possible areas for future research. The seemingly higher effect sizes for ionic nomenclature may be due to
ionic nomenclature being a more introductory concept often taught towards the beginning of one's chemistry career than a topic such
as Acids and Bases. The lower effect size for Elements and Hydrates can be explained by a higher pre-test score for this category,
indicating students enter such courses with a better understanding for this category, as would be expected.
Gains by instructor were not significant across overall scores or by treatment, indicating that students performed similarly no matter which instructor they had. As students were randomized at the class level for this study and instructors received limited professional development before implementing the game, such a finding indicates the ease of using the game. Further, instructors ranged from new teaching assistants (mostly graduate students) to part-time instructors (some with over 20 years of experience), again highlighting the ease of using the game across instructor types. Future research could investigate how instructors might more strongly influence student learning for study guides and games by providing students with particular questioning and/or prompts while they complete such activities.
After iterative agreement between the authors for the survey items, only a few comments were identified as indicating a first-
order barrier as described by Ertmer (1999). Most of the comments indicated a second order barrier and were evenly distributed
between the four categories. The most prevalent barriers were teacher-student roles and teaching methods. Instructor E was a prime
example illustrating the possible barriers that can arise as the result of teacher-student roles. When doing the study guide, Instructor
E's perspective was one of a conventional instructional role – to provide students with knowledge and ensure that their questions are
answered. There is a distinctive shift when the instructor is asked about the game, because the instructor felt that their role was now to help students play the game. The teacher-student role would be the same if there were no barrier, but this is not the case. It should be
noted however, that while technology integration barriers may exist, as indicated by instructor comments, it does not necessarily
mean that said barrier was not overcome. This overcoming of barriers is illustrated by the teaching method barrier responses. The
instructors had similar responses (the game was good; it was well thought out, etc.). Even though a barrier exists and instructors' comments clearly indicate how they feel technology should be used, they are at the same time overcoming the barrier and thinking of
ways the technology can be integrated. Indeed, most of the comments were overwhelmingly positive in support of the game despite
indicating possible barriers. Field observation indicates wide acceptance by both instructors and students with multiple instructors
and students asking the authors for access to the game online, and instructors asking to use it for future semesters.
The survey of students in the study guide condition revealed a slightly higher percentage of students who felt that the activity
prepared them more for nomenclature than did students in the game condition. This finding illustrates a methodological pitfall
associated with study guides, and supports previous literature suggesting that students perceive a study guide in a positive light for
exams and as a good resource to prepare for class material despite there having been little empirical research done to support this
idea (Dickson et al., 2005; Mires & Howie, 1998; Ravichandran, Abirami, Dilara, & Vijayaraghavan, 2014; Shemberger & Wright,
2014; Sturges et al., 2017; Vandsburger & Duncan-Daston, 2011). For this study, students in the study guide condition improved their
scores on the pre-post-test thereby lending support to the commonly held belief that guides are an effective learning tool. However,
further evidence is needed to determine under what conditions, and what types of study guides are most effective for students,
particularly based on prior knowledge.
The importance of this study is to serve as an example on which future studies can build in areas such as study guides and game design for science education, in particular chemical nomenclature. Of the studies presented in the literature review, there are some
important methodological considerations to address. Most of the studies fall under the domain of a case study with limited examples
of empirical or mixed-method studies. Only one of the studies identified reports its survey items, and unfortunately none of the quantitative studies can be replicated due to the lack of included assessment items. This study provides some empirical evidence that
students do better on pre-/post-tests when given a study guide or a game and that such performance is influenced by condition for
particular students (HPK) and by assessment category type.

5. Conclusion

Topinomica is a new chemistry game for use in the higher education classroom which fills the role of a general-purpose tool.
Other games focus on limited aspects of nomenclature such as ionic or covalent nomenclature, whereas Topinomica is meant to be all-
inclusive. Students and instructors alike appreciate the game as indicated by their survey results and interview responses respectively.
Despite the possible existence of second order barriers among instructors, it is an exciting result that several have readily adopted the
Topinomica game into their curriculum. The program's ease of use and lack of setup/cleanup make it easier to use than physical board and card games. This, combined with the finding that students who played the game did just as well as those in the
study guide treatment, indicates an appealing alternative for those students who have difficulties in completing assigned coursework
through study guides. At the very least, Topinomica can serve as a good supplement to traditional instruction methods. An unusual result, and a strong topic for further research, is that high prior knowledge students benefit more from the game than high prior knowledge students in the study guide treatment. Future studies should investigate in more detail the design features of the game/study guide that contribute to such differences for HPK students.

5.1. Limitations of the study

It can be argued that free-response questions offer a more comprehensive view of a student's knowledge gains, leading to different results (Berg & Boote, 2015). This possibility was mitigated by assuring students that there would be no penalty for scoring poorly and by providing an option on each question that they could pick if they did not know the answer. This also helped reduce the possibility of guessing the correct answer.
A further limitation that would be addressed in future iterations of this study is the time given to instructors to complete the study. Instructors spent anywhere from ninety minutes to two hours on their respective activity. We gave the instructors, who all
consented to have their class participate in the study, the freedom to devote as much time as they wanted on either activity. This
limitation was addressed by asking instructors how long they spent on each activity on average and by examining scores broken down
by instructor.
Scaffolding was not included in the game which represents an opportunity to improve the scores in future iterations. Scaffolding
can be designed to improve learning and can be added through future development of a pre-existing game (Kao et al., 2017; Sun,
Chen, & Chu, 2018).
One final area of further research would be effects on learning had students been given a delayed post-test. Such a test was not
done in this study due to the impact a delayed test would have on the students' and instructor's class time when nomenclature is no
longer the subject under study. Clark et al. (2016) report that students who have more than one play session (a feature impossible to provide here due to the same time restriction as a delayed post-test) show significantly better learning outcomes than the control. However, given the encouraging results of this study, instructors would support a future study that addresses such issues and includes a delayed post-test and multiple play sessions.

Declarations of interest

None.

Acknowledgement

We would like to acknowledge the instructors and students who participated in this study. Without their involvement, this study
would not have been possible. We would also like to acknowledge faculty and students within the Andrews STEM Education Center
who provided feedback on earlier drafts of this manuscript.

Appendix A. Supplementary data

Supplementary data to this article can be found online at https://doi.org/10.1016/j.compedu.2019.103615.

References

Adair, B. M., & McAfee, L. V. (2018). Chemical pursuit: A modified trivia board game. Journal of Chemical Education, 95(3), 416–418. http://doi.org/10.1021/acs.
jchemed.6b00946.
Al-Kunifed, A. (1993). Investigation of high school chemistry students' concepts of chemical symbol, formula, and equation: Students' prescientific conceptions (Doctoral dissertation). Louisiana State University and Agricultural & Mechanical College, Ann Arbor. Retrieved from LSU Digital Commons.
An, Y. J., & Reigeluth, C. (2011). Creating technology-enhanced, learner-centered classrooms. Journal of Digital Learning in Teacher Education, 28(2), 54–62. https://
doi.org/10.1080/21532974.2011.10784681.
Authors (2014). Study guide B.
Barke, H.-D., Hazari, A., & Yitbarek, S. (2009). Students' misconceptions and how to overcome them - misconceptions in chemistry: Addressing perceptions in chemical
education. Berlin, Heidelberg: Springer Berlin Heidelberg.
Bayir, E. (2014). Developing and playing chemistry games to learn about elements, compounds, and the periodic table: Elemental periodica, compoundica, and
groupica. Journal of Chemical Education, 91(4), 531. https://doi.org/10.1021/ed4002249.
Berg, C., & Boote, S. (2015). Format effects of empirically derived multiple-choice versus free-response instruments when assessing graphing abilities. International
Journal of Science and Mathematics Education, 15(1), 19–38. https://doi.org/10.1007/s10763-015-9678-6.
Bernholt, S., Fischer, I., Heuer, S., Taskin, V., Martens, J., & Parchmann, I. (2012). Die chemische Formelsprache – (un-)vermeidbare Hürden auf dem Weg zu einer Verständnisentwicklung? [The language of chemical formulae – (un)avoidable hurdles on the way to developing understanding?]. ChemKon, 4(19), 141–178. https://doi.org/10.1002/ckon.201210183.
Brecher, J. (1999). Name=struct: A practical approach to the sorry state of real-life chemical nomenclature. Journal of Chemical Information and Computer Sciences,
39(6), 943–950. https://doi.org/10.1021/ci990062c.
Breuer, J., & Bente, S. G. (2010). Why so serious? On the relation of serious games and learning. Journal for Computer Game Culture, 4(1), 7–24. https://hal.archives-
ouvertes.fr/hal-00692052/.
Callaghan, M. N., Long, J. J., van Es, E. A., Reich, S. M., & Rutherford, T. (2018). How teachers integrate a math computer game: Professional development use,
teaching practices, and student achievement. Journal of Computer Assisted Learning, 34(1), 10–19. https://doi.org/10.1111/jcal.12209.
Capps, K. (2008). Chemistry taboo: An active learning game for the general chemistry classroom. Journal of Chemical Education, 85(4), 518. https://doi.org/10.1021/
ed085p518.
Chen, W., Tan, A., & Lim, C. (2012). Extrinsic and intrinsic barriers in the use of ICT in teaching: A comparative case study in Singapore. Ascilite 2012: Future challenges,
sustainable futures.
Clark, D. B., Nelson, B. C., Chang, H.-Y., Martinez-Garza, M., Slack, K., & D'Angelo, C. M. (2011). Exploring Newtonian mechanics in a conceptually-integrated digital
game: Comparison of learning and affective outcomes for students in Taiwan and the United States. Computers & Education, 57(3), 2178–2195. https://doi.org/10.
1016/j.compedu.2011.05.007.
Clark, D., Sengupta, P., Brady, C., Martinez-Garza, M., & Killingsworth, S. (2015a). Disciplinary integration of digital games for science learning. International Journal of STEM Education, 2(2). https://doi.org/10.1186/s40594-014-0014-4.
Clark, D. B., Tanner-Smith, E. E., & Killingsworth, S. S. (2016). Digital games, design, and learning: A systematic review and meta-analysis. Review of Educational
Research, 86(1), 79–122. https://doi.org/10.3102/0034654315582065.
Clark, C., Zhang, S., & Strudler, N. (2015b). Teacher candidate technology integration: For student learning or instruction? Journal of Digital Learning in Teacher
Education, 31(3), 93–106. https://doi.org/10.1080/21532974.2014.967421.
Connolly, T., Boyle, E., MacArthur, E., Hainey, T., & Boyle, J. (2012). A systematic literature review of empirical evidence on computer games and serious games.
Computers & Education, 59(2), 661–686. https://doi.org/10.1016/j.compedu.2012.03.004.
Cook, T., & Reichardt, C. (1979). Qualitative and quantitative methods in evaluation. Beverly Hills, CA: Sage Publications.
Creswell, J. (2009). Mixed method procedures. Research design: Qualitative, quantitative, and mixed method approaches (pp. 203–226). Thousand Oaks, Ca: Sage
Publications.
Crute, T. (2000). Classroom nomenclature games—BINGO. Journal of Chemical Education, 77(4), 481. https://doi.org/10.1021/ed077p481.
Cuban, L., Kirkpatrick, H., & Peck, C. (2001). High access and low use of technologies in high school classrooms: Explaining an apparent paradox. American Educational
Research Journal, 38(4), 813–834. https://doi.org/10.3102/00028312038004813.
Deavor, J. (1996). Chemical jeopardy. Journal of Chemical Education, 73(5), 430. https://doi.org/10.1021/ed073p430.
Dickson, K., Miller, M., & Devoley, M. (2005). Effect of textbook study guides on student performance in introductory psychology. Teaching of Psychology, 32(1), 34–39.
https://doi.org/10.1207/s15328023top3201_8.
Egenfeldt‐Nielsen, S. (2004). Practical barriers in using educational computer games. On the Horizon, 12(1), 18–21. https://doi.org/10.1108/10748120410540454.
Ertmer, P. (1999). Addressing first- and second-order barriers to change: Strategies for technology integration. Educational Technology Research & Development, 47(4),
47–61. https://doi.org/10.1007/BF02299597.
Gauthier, A., & Jenkinson, J. (2018). Designing productively negative experiences with serious game mechanics: Qualitative analysis of game-play and game design in
a randomized trial. Computers and Education, 127, 66–89. https://doi.org/10.1016/j.compedu.2018.08.017.
Gaydos, M. (2015). Seriously considering design in educational games. Educational Researcher, 44(9), 478–483. https://doi.org/10.3102/0013189X15621307.
Gee, J. (2003). What video games have to teach us about learning and literacy. Computers in Entertainment, 1(1), 20. https://doi.org/10.1145/950566.950595.
Glažar, S., & Devetak, I. (2002). Secondary school students' knowledge of stoichiometry. Acta Chimica Slovenica, 49, 43–53. https://www.researchgate.net/
publication/290170776_Secondary_school_students'_knowledge_of_stoichiometry.
Gredler, M. (2004). Games and simulations and their relationships to learning. In D. Jonassen (Ed.), Handbook of research on educational communications and technology (2nd ed., pp. 571–582). Mahwah, NJ.
Gros, B. (2007). Digital games in education. Journal of Research on Technology in Education, 40(1), 23–38. https://doi.org/10.1080/15391523.2007.10782494.
Gurung, R. (2003). Pedagogical aids and student performance. Teaching of Psychology, 30(2), 92–95. https://doi.org/10.1207/S15328023TOP3002_01.
Hackathorn, J., Joyce, A. W., & Bordieri, M. (2017). Do these things even work? A call for research on study guides. Essays from E-xcellence in Teaching, Vol. XVII. http://teachpsych.org/E-xcellence-in-Teaching-Blog/5506547.
Hackathorn, J., Joyce, A., & Bordieri, M. (2018). Do these things even work? A call for research on study guides. Essays from E-xcellence in Teaching, 17, 48–51. http://
teachpsych.org/resources/Documents/ebooks/eit2017.pdf#page=53.
Heath, M. K. (2017). Teacher-initiated one-to-one technology initiatives: How teacher self-efficacy and beliefs help overcome barrier thresholds to implementation.
Computers in the Schools, 34(1–2), 88–106. https://doi.org/10.1080/07380569.2017.1305879.
Hein, C. (1978). Learning to learn with independent study guides. Peabody Journal of Education, 55(3), 193–197. https://doi.org/10.1080/01619567809538186.
Holsgrove, G., Lanphear, J., & Ledingham, I. (1998). Study guides: An essential student learning tool in an integrated curriculum. Medical Teacher, 20(2), 99–103.
https://doi.org/10.1080/01421599881174.
Huizenga, J. C., ten Dam, G. T. M., Voogt, J. M., & Admiraal, W. F. (2017). Teacher perceptions of the value of game-based learning in secondary education. Computers
& Education, 110, 105–115. https://doi.org/10.1016/j.compedu.2017.03.008.
Hung, H.-T., Yang, J. C., Hwang, G.-J., Chu, H.-C., & Wang, C.-C. (2018). A scoping review of research on digital game-based language learning. Computers & Education,
126, 89–104. https://doi.org/10.1016/j.compedu.2018.07.001.
Johnson, R., & Onwuegbuzie, A. (2004). Mixed methods research: A research paradigm whose time has come. Educational Researcher, 33(7), 14–26. https://doi.org/10.
3102/0013189X033007014.
Kao, G. Y.-M., Chiang, C.-H., & Sun, C.-T. (2017). Customizing scaffolds for game-based learning in physics: Impacts on knowledge acquisition and game design
creativity. Computers & Education, 113, 294–312. https://doi.org/10.1016/j.compedu.2017.05.022.
Kavak, N. (2012). ChemOkey: A game to reinforce nomenclature. Journal of Chemical Education, 89(8), 1047–1049. https://doi.org/10.1021/ed3000556.
Khogali, S., Laidlaw, J., & Harden, R. (2006). Study guides: A study of different formats. Medical Teacher, 28(4), 375–377. https://doi.org/10.1080/
01421590600799059.
Kieg, P. F., & Rubba, P. A. (2006). Translation of representations of the structure of matter and its relationship to reasoning, gender, spatial reasoning, and specific
prior knowledge. Journal of Research in Science Teaching, 30(8), 883–903. https://doi.org/10.1002/tea.3660300807.
Kopcha, T. J. (2012). Teachers' perceptions of the barriers to technology integration and practices with technology under situated professional development. Computers
& Education, 59(4), 1109–1121. https://doi.org/10.1016/j.compedu.2012.05.014.
Kordaki, M., & Gousiou, A. (2017). Digital card games in education: A ten year systematic review. Computers & Education, 109, 122–161. https://doi.org/10.1016/j.
compedu.2017.02.011.
Kurushkin, M., & Mikhaylenko, M. (2015). Chemical alias: An engaging way to examine nomenclature. Journal of Chemical Education, 92(10), 1678–1680. https://doi.
org/10.1021/acs.jchemed.5b00244.
Laidlaw, J. M., & Harden, R. M. (1990). What is… a study guide? Medical Teacher, 12(1), 7–12. https://doi.org/10.3109/01421599009010556.
Laine, T. H., Sedano, C. A., Joy, M., & Sutinen, E. (2010). Critical factors for technology integration in game-based pervasive learning spaces. IEEE Transactions on
Learning Technologies, 3(4), 294–306. https://doi.org/10.1109/TLT.2010.16.
Langlois, S. (2002). The power of posing questions: Engaging students in study guide construction. Measurement in Physical Education and Exercise Science, 6(2),
143–146. https://doi.org/10.1207/S15327841MPEE0602_4.
Liu, O. L., Lee, H. S., Hofstetter, C., & Linn, M. C. (2008). Assessing knowledge integration in science: Construct, measures, and evidence. Educational Assessment, 13(1),
33–55. https://doi.org/10.1080/10627190801968224.
Lundy, J. (1991). Cognitive learning from games: Student approaches to business games. Studies in Higher Education, 16(2), 179–188. https://doi.org/10.1080/
03075079112331382964.
Mansureh, K. (2010). Factors affecting teachers' adoption of educational computer games: A case study. British Journal of Educational Technology, 41(2), 256–270.
https://doi.org/10.1111/j.1467-8535.2008.00921.x.
Marais, F., & Combrinck, S. (2009). An approach to dealing with the difficulties undergraduate chemistry students experience with stoichiometry. South African Journal
of Chemistry, 62, 88–96. https://www.ajol.info/index.php/sajc/article/view/123215.
Martin, W., Silander, M., & Rutter, S. (2018). Digital games as sources for science analogies: Learning about energy through play. Computers & Education. https://doi.
org/10.1016/j.compedu.2018.11.002.
Masek, M., Boston, J., Lam, C. P., & Corcoran, S. (2017). Improving mastery of fractions by blending video games into the Math classroom. Journal of Computer Assisted
Learning, 33(5), 486–499. https://doi.org/10.1111/jcal.12194.
Minshew, L., & Anderson, J. (2015). Teacher self-efficacy in 1:1 iPad integration in middle school science and math classrooms. Contemporary Issues in Technology and
Teacher Education, 15(3). Retrieved from https://www.citejournal.org/volume-15/issue-3-15/science/teacher-self-efficacy-in-11-ipad-integration-in-middle-
school-science-and-math-classrooms.
Mires, G., & Howie, P. (1998). A ‘topical’ approach to planned teaching and learning using a topic-based study guide. Medical Teacher, 20(5), 438. https://doi.org/10.
1080/01421599880535.
Morris, T. (2011). Go chemistry: A card game to help students learn chemical formulas. Journal of Chemical Education, 88(10), 1397–1399. https://doi.org/10.1021/
ed100661c.
Onwuegbuzie, A., & Leech, N. (2005). On becoming a pragmatic researcher: The importance of combining quantitative and qualitative research methodologies.
International Journal of Social Research Methodology, 8(5), 375–387. https://doi.org/10.1080/13645570500402447.
Petrov, A. (2014). Using minecraft in education: A qualitative study on benefits and challenges of game-based education (Unpublished master's thesis). University of Toronto, Ontario, Canada.
Pugh, K., Sheldon, S., Byers, J., & Zhao, Y. (2002). Conditions for classroom technology innovations. Teachers College Record, 104(3), 482–515. https://doi.org/10.
1111/1467-9620.00170.
Randel, J., & Morris, B. (1992). The effectiveness of games for educational purposes: A review of recent research. Simulation & Gaming, 23(3), 261. https://doi.org/10.
1177/1046878192233001.
Rastegarpour, H., & Marashi, P. (2012). The effect of card games and computer games on learning of chemistry concepts. Procedia - Social and Behavioral Sciences, 31,
597–601. https://doi.org/10.1016/j.sbspro.2011.12.111.
Ratcliffe, B. (1986). Chemical pursuit. The Science Teacher, 53(2), 57–58. https://www.jstor.org/stable/24140123.
Ravichandran, L., Abirami, V., Dilara, K., & Vijayaraghavan, P. V. (2014). Student perception on study guides in an integrated preclinical curriculum. Sri Ramachandra
Journal of Medicine, 7(2), 9–12. http://search.ebscohost.com/login.aspx?direct=true&db=asx&AN=113728443&site=eds-live.
Russell, J. (1999). Using games to teach chemistry: An annotated bibliography. Journal of Chemical Education, 76(4), 481. https://doi.org/10.1021/ed076p481.
Sakkal, A., & Martin, L. (2019). Learning to rock: The role of prior experience and explicit instruction on learning and transfer in a music videogame. Computers &
Education, 128, 389–397. https://doi.org/10.1016/j.compedu.2018.10.007.
Sawyer, A. (1976). “Chemantics” - a new chemical education card game. Journal of Chemical Education, 53(12), 780. https://doi.org/10.1021/ed053p780.1.
Schibeci, R., MacCallum, J., Cumming‐Potvin, W., Durrant, C., Kissane, B., & Miller, E. J. (2008). Teachers' journeys towards critical use of ICT. Learning, Media and
Technology, 33(4), 313–327. https://doi.org/10.1080/17439880802497065.
Schoepp, K. (2005). Barriers to technology integration in a technology-rich environment. Learning and Teaching in Higher Education: Gulf Perspectives, 2(1), 1–24. Retrieved from http://www.zu.ac.ae/lthe/vol2no1/lthe02_05.pdf.
Sedney, D. L. (1988). “Trivial pursuit” for chemists. Journal of Chemical Education, 65(5), 383. https://doi.org/10.1021/ed065p383.
Shemberger, M., & Wright, L. (2014). Exploring the use of social media as a digital study guide. Journal of Interdisciplinary Studies in Education, 3(1), 60–75. http://
www.isejournal.org/index.php/jise/article/view/97.
Sherman, A., & Sherman, S. (1980). Chem-deck: How to learn to write the formulas of chemical compounds (or lose your shirt). Journal of Chemical Education, 57(7),
503. https://doi.org/10.1021/ed057p503.
Srisawasdi, N., & Panjaburee, P. (2018). Implementation of game-transformed inquiry-based learning to promote the understanding of and motivation to learn
chemistry. Journal of Science Education and Technology, 1–13. https://doi.org/10.1007/s1095.
Sturges, D., Maurer, T., & Kosturik, A. (2017). Using study guides in undergraduate human anatomy and physiology classes: Student perceptions and academic
performance. International Journal of Kinesiology in Higher Education, 1(1), 18–27. https://doi.org/10.1080/24711616.2016.1277672.
Sun, C.-T., Chen, L.-X., & Chu, H.-M. (2018). Associations among scaffold presentation, reward mechanisms and problem-solving behaviors in game play. Computers &
Education, 119, 95–111. https://doi.org/10.1016/j.compedu.2018.01.001.
Taskin, V., & Bernholt, S. (2014). Students' understanding of chemical formulae: A review of empirical research. International Journal of Science Education, 36(1), 157–185. https://doi.org/10.1080/09500693.2012.744492.
Tro, N. (2011). Chemistry: A molecular approach (2nd ed.). Pearson.
Tsai, Y.-L., & Tsai, C.-C. (2018). Digital game-based second-language vocabulary learning and conditions of research designs: A meta-analysis study. Computers &
Education, 125, 345–357. https://doi.org/10.1016/j.compedu.2018.06.020.
Vandsburger, E., & Duncan-Daston, R. (2011). Evaluating the study guide as a tool for increasing students' accountability for reading the textbook. Journal of College
Reading and Learning, 42(1), 6–23. http://doi.org/10.1080/10790195.2011.10850345.
Virvou, M., Katsionis, G., & Manos, K. (2005). Combining software games with education: Evaluation of its educational effectiveness. Journal of Educational Technology
& Society, 8(2), 54–65. http://www.jstor.org/stable/jeductechsoci.8.2.54.
Vogel, J. J., Vogel, D. S., Cannon-Bowers, J. A. N., Bowers, C. A., Muse, K., & Wright, M. (2006). Computer gaming and interactive simulations for learning: A meta-
analysis. Journal of Educational Computing Research, 34(3), 229–243. http://search.ebscohost.com/login.aspx?direct=true&db=ehh&AN=22933097&site=
ehost-live.
White, B. Y. (1984). Designing computer games to help physics students understand Newton's laws of motion. Cognition and Instruction, 1(1), 69–108. http://www.
jstor.org/stable/3233521.
Wood, C., & Breyfogle, B. (2006). Interactive demonstrations for mole ratios and limiting reagents. Journal of Chemical Education, 83(5), 741. http://doi.org/10.1021/
ed083p741.
Wulfsberg, G. P., Sanger, M. J., Melton, T. J., & Chimeno, J. S. (2006). The rainbow wheel and rainbow matrix: Two effective tools for learning ionic nomenclature.
Journal of Chemical Education, 83(4), 651. http://doi.org/10.1021/ed083p651.
