Information and Learning Science, https://doi.org/10.1108/ILS-06-2018-0045
Learning analytics: where information science and the learning sciences meet
Stephanie Danell Teasley
School of Information, University of Michigan, Ann Arbor, Michigan, USA
Received 5 June 2018
Revised 13 September 2018
Accepted 17 September 2018
Abstract
Purpose – The explosive growth in the number of digital tools utilized in everyday learning activities
generates data at an unprecedented scale, providing exciting challenges that cross scholarly communities.
This paper aims to provide an overview of learning analytics (LA) with the aim of helping members of the
information and learning sciences communities understand how educational Big Data is relevant to their
research agendas and how they can contribute to this growing new field.
Design/methodology/approach – Highlighting shared values and issues illustrates why LA is the
perfect meeting ground for information and the learning sciences, and suggests how by working together
effective LA tools can be designed to innovate education.
Findings – Analytics-driven performance dashboards are offered as a specific example of one research area
where information and learning scientists can make a significant contribution to LA research. Recent reviews
of existing dashboard studies point to a dearth of evaluation with regard to either theory or outcomes. Here,
the relevant expertise from researchers in both the learning sciences and information science is offered as an
important opportunity to improve the design and evaluation of student-facing dashboards.
Originality/value – This paper outlines important ties between three scholarly communities to illustrate
how their combined research expertise is crucial to advancing how we understand learning and for
developing LA-based interventions that meet the values that we all share.
Keywords Education, Learning, Dashboards, Educational technology, Learning analytics,
Student data privacy
Paper type Viewpoint

The most dramatic factor shaping the future of higher education is something that we can’t
actually touch or see: big data and analytics. (Siemens and Long, 2011, p. 31).

1. Introduction
The word “analytics” is very much in the news as there is a growing awareness of the extent
to which the technologies we use every day collect data about our online activities. As a
broad concept, analytics refers to data-gathering and analysis methods for discovery,
interpretation, and communication of meaningful patterns in data. In the age of Big Data,
the use of analytics provides a way to process very large, unstructured data sets to discern
patterns more quickly and accurately than a human ever could. It is unquestionable and
potentially unavoidable that technology providers collect and keep a tremendous amount of
information about our behavior. Analytics is the process that transforms these rich data
repositories into consumable information by comparing activity and performance indicators
within and across datasets.
In the private sector, analytics are used to measure performance at a number of different levels (corporate, business unit and personnel) and to measure, monitor and manage business processes to achieve strategic objectives (Eckerson, 2011) by helping users to
identify trends, patterns and anomalies and to support reasoning and guide effective
decision-making. Although educational institutions have been using “academic analytics”
(Campbell et al., 2007) for specific business purposes (e.g. enrollment management, fund-
raising), the use of analytics has moved into the pedagogical practice of education by
utilizing new data about teaching and learning available through the wide-scale adoption of
educational technology, such as learning management systems (LMS), digital textbooks,
lecture capture systems and podcasts.
Developments in artificial intelligence and machine learning create the conditions for
applying new methodologies to the ever-increasing amounts of data that learners generate
through their interactions with online tools. Similar to the shift to evidence-based medicine
in health care, research in education has begun to recognize the opportunities that arise from
analyzing learner-produced data trails. For example, just as modeling techniques can be
used to predict who is most likely to get the flu (see the CDC website, https://predict.
phiresearchlab.org/), we can use the same kind of analytic methods to predict who is most
likely to fail a course. As in health care, we may be able to intervene before the prediction
becomes reality.
In this paper, I provide an overview of a new scholarly community called learning
analytics (LA), discuss why LA is the perfect meeting ground for information and the
learning sciences, and suggest how by working together we can design effective LA tools
that may in fact innovate education. This article aims to help members of the information
and learning sciences communities who are interested in LA understand how educational
Big Data is relevant to the research agenda(s) undertaken by both information and learning
scientists. The explosive growth in the number of digital tools utilized in everyday learning
activities generates data at an unprecedented scale, providing exciting challenges for both
scholarly communities. My hope is that this paper can build on the existing synergies
between information and learning sciences (Ahn and Erickson, 2016) to stimulate
discussions, energize collaborations, and open up new pathways between LA and these two
more established scholarly communities.

2. What is learning analytics?


In 2011, the first Conference on Learning Analytics and Knowledge (“LAK”) was held in
Banff, Canada. The conference was attended by about 100 academics from various countries
who represented a wide variety of disciplinary training. The introduction to the conference
proceedings states:
The idea for establishing a special dedicated forum to researching learning analytics was
motivated by several important indicators:

(1) The growth of data surpasses the ability of organizations to make sense of it. This concern is
particularly pronounced in relation to knowledge, teaching, and learning.

(2) Learning institutions and corporations make little use of the data learners “throw off” in the
process of accessing learning materials, interacting with educators and peers, and creating new
content.

(3) In an age where educational institutions are under growing pressure to reduce costs and
increase efficiency, analytics promises to be an important lens through which to view and plan for
change at course and institution levels (Long and Siemens, 2011, p. 3).
In the years since, the growth of LA is evidenced by an annual conference that grows in size and impact, the incorporation of a scholarly society (the Society for Learning Analytics Research, “SoLAR”), and a new journal, the Journal of Learning Analytics. Researchers
involved in LA come from various fields including education, psychology, philosophy,
sociology, linguistics, learning sciences, statistics, machine learning/artificial intelligence,
computer science and information science. There was – and still is – some overlap between
related societies and conferences with longer histories, specifically Artificial Intelligence in
Education, the European Conference on Technology Enhanced Learning, and Educational Data
Mining[1].
The definition of LA adopted by the Society for Learning Analytics Research states that:
Learning analytics is the measurement, collection, analysis and reporting of data about learners
and their contexts, for purposes of understanding and optimizing learning and the environments
in which it occurs (see www.SoLAResearch.org).
A similar definition was proposed by EDUCAUSE, a nonprofit association whose mission is to advance higher education through the use of information technology, for their Next Generation Learning Initiative (www.nextgenlearning.org), “the use of data and models to predict student progress and performance, and the ability to act on that information.” Both
definitions propose an emphasis on actionable data, creating two synergistic goals for LA: to
understand how students learn and to create interventions aimed at teaching and learning
practices in educational contexts (Wise, 2014).

3. What does it mean to “do” learning analytics?


LA starts with data, and educational institutions have a lot of data about their students. For
example, the student registration or application process generates very detailed demographic information about each student, including the student’s educational history as well as other information, such as the family’s income and the parents’ level of educational attainment. Once a student is enrolled, the registrar’s office keeps a
detailed record of the students’ academic progress including specifics such as grade level
completed (in primary and secondary education) and courses enrolled, dropped or completed
(in post-secondary and graduate education) and when these were taken, as well as the
student’s final grades. On university campuses, the housing office may collect students’
building usage through the use of swipe card entry systems, the library tracks resource use,
and the computing service unit logs students’ location when they use the campus internet.
However, regardless of level (primary, secondary and post-secondary), most education
institutions make very little use of student data to improve teaching or learning. By joining
data from various student information systems with behavioral data from LMS and other
online educational systems (e.g. interactive learning environments, intelligent tutoring
systems, e-portfolio systems, and personal learning environments), these rich data
repositories can be mined to identify trends, patterns and anomalies. The data can represent
both the process of learning – records of student activity, such as clickstream data from
online tools, library use, resources accessed, etc., and the products of learning – evidence of
learning found in discussion posts, blogs, tweets, hashtag use, etc., providing a much richer
picture of learner behavior than previously possible. The availability of this data has given
rise to fears of the corporatization of our educational systems and calls for agreement about
the ethics of using such data (see below).
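To make the idea of joining institutional records with behavioral traces concrete, the following minimal sketch (in Python with pandas) merges a hypothetical student information system extract with aggregated LMS login counts; every table, column and value here is invented for illustration rather than drawn from any particular campus system:

import pandas as pd

# Hypothetical student information system (SIS) extract: one row per student.
sis = pd.DataFrame({
    "student_id": [101, 102, 103],
    "hs_gpa": [3.9, 3.2, 3.6],               # high-school GPA from the application
    "first_generation": [True, False, True],
})

# Hypothetical LMS clickstream aggregate: weekly login counts per student.
clicks = pd.DataFrame({
    "student_id": [101, 101, 102, 103, 103, 103],
    "week":       [1, 2, 1, 1, 2, 3],
    "logins":     [5, 3, 1, 7, 6, 4],
})

# Collapse the behavioral data to one row per student, then join on the shared
# identifier so activity can be examined against background variables.
activity = clicks.groupby("student_id")["logins"].sum().rename("total_logins")
joined = sis.merge(activity, on="student_id", how="left")
print(joined)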
Researchers in LA acknowledge that data do not exist in a vacuum. Greller and
Drachsler (2012) proposed an LA design framework which provides detail on six critical
dimensions, including data and technologies as discussed above, but also stakeholders,
competencies, constraints and objectives (Figure 1). This framework highlights that work with the data and technologies involved in LA is conducted for someone (stakeholders), by people with the requisite skills to interpret the output (competencies), to support actionable outcomes (objectives), and within limits on how LA can be done (constraints). Individual LA projects have different emphases on each of these dimensions, and the sub-topics in the model have been further developed since 2012.

Figure 1. Framework for LA
Early scholarly work in the field of LA focused on identifying key variables for
predicting student outcomes, such as retention and time to degree. For example, utilizing
prediction models to analyze these variables can identify which students are “at risk.” Many
of the papers in the early LAK conferences relied primarily on data-mining and statistical
techniques for analyzing student data, typically from LMS event logs (Khalil and Ebner,
2016). In addition to predicting performance outcomes, LA researchers also began building
systems to present performance metrics visually to monitor student progress and to identify
when students are in need of academic intervention.
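As a rough illustration of the kind of prediction model described above, the sketch below fits a logistic regression that flags students at risk of academic difficulty from a few simple features; the feature set, the synthetic data and the 0.5 risk threshold are all assumptions made for the example, not a description of any deployed system:

import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 500

# Illustrative features: LMS logins to date, high-school GPA, assignments submitted.
X = np.column_stack([
    rng.poisson(20, n),
    rng.normal(3.3, 0.4, n),
    rng.poisson(4, n),
])

# Synthetic outcome: lower activity and preparation raise the chance of difficulty.
risk = 1.0 / (1.0 + np.exp(0.15 * X[:, 0] + 2.0 * (X[:, 1] - 3.3) + 0.5 * X[:, 2] - 4.0))
y = (rng.random(n) < risk).astype(int)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = LogisticRegression(max_iter=1000).fit(X_train, y_train)

# Students whose predicted probability of difficulty exceeds a threshold could be
# surfaced to an advisor-facing early warning dashboard.
flagged = model.predict_proba(X_test)[:, 1] > 0.5
print(f"{flagged.sum()} of {len(flagged)} held-out students flagged as at risk")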


Purdue University was one of the first institutions to scale up an LA-based system for
campus-wide use, called Course Signals (Arnold, 2010; Arnold and Pistilli, 2012). Fritz (2011)
conducted some of the earliest research on the wide-scale deployment of a tool, called Check
My Activity, for displaying performance information directly to university students. Both
are early warning systems, utilizing student data including pre-college preparation, course
performance and LMS activity to categorize students’ risk for academic difficulty and
allowing students to compare their LMS activity and grades against an anonymous
summary of their course peers. More recently, Georgia State University has been recognized for its use of predictive analytics to help reduce the achievement gap for first-generation, Pell-eligible (a subsidy provided by the US Federal Government to low-income students), black and Latino students. They review over 700 risk factors for each of the 24,000 undergraduate students on a continuous basis and have a large staff of academic advisors who reach out to the underperforming students identified by this system (Dennis, 2017). There are a growing
number of universities throughout the world that are institutionalizing the use of prediction
models with the aim of reducing drop-out rates, especially at institutions that serve diverse
student populations. In both online and residential settings, there are an increasing number
of dashboard systems developed to provide “actionable intelligence” to instructors (e.g.
SNAPP, see Dawson et al., 2010), academic advisors (e.g. Student Explorer, see Krumm et al.,
2014) and, more recently, directly to students (Yoo et al., 2015).
What these LA-based interventions, and others like them, have in common is the delivery
of feedback based on student models that blend demographic, institutional and LMS data in
the form of simple counts of login activity and course grades gathered throughout an
academic term. Using prediction models allows investigation of research questions about
performance outcomes. For example, among freshmen entering college with a strong high
school record (GPA > 3.8), who will do poorly (GPA < 2.0) at the end of their first year?
Using individual course grades and overall GPA to create three performance categories – underachieving, overachieving, as expected – what factors characterize the students who are underachieving? The answers to these questions have implications for designing
personalized interventions. Through sequence mining, research can address questions
about student performance over time, such as, how do different pathways through a
curriculum influence GPA at graduation? These types of analyses allow inferences about students’ trajectories within and across courses to unpack the factors related to
undergraduate completion. With statistical methods such as hazard modeling, researchers
can investigate questions about changes in student performance throughout a term rather
than just relying on their final course grade. For example, what is the likelihood of an
individual student’s recovery from academic difficulty in one or more courses during
different points in the semester? Is there a point of no return in which students who are
performing poorly are not likely to pass the course or complete the MOOC? Determining
these critical periods during the academic year can lead to effective interventions before
students experience failure.
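One way to make the hazard-modeling idea concrete is a Cox proportional hazards model of time to recovery from academic difficulty; the sketch below uses the open-source lifelines library and a tiny invented dataset, both of which are my own assumptions rather than methods prescribed in the work cited above:

import pandas as pd
from lifelines import CoxPHFitter

# Each row is one student observed during a term: how many weeks they remained in
# academic difficulty, whether recovery was observed before term end (0 = censored),
# and two illustrative covariates.
students = pd.DataFrame({
    "weeks_at_risk":  [3, 8, 12, 5, 14, 7, 10, 6],
    "recovered":      [1, 1, 0, 1, 0, 1, 1, 0],
    "weekly_logins":  [14, 9, 3, 11, 2, 4, 6, 10],
    "hs_gpa":         [3.8, 3.4, 3.1, 3.9, 3.0, 3.2, 3.5, 3.6],
})

# The Cox model estimates how the covariates shift the hazard of recovery over the
# semester, rather than relying only on the final course grade.
cph = CoxPHFitter()
cph.fit(students, duration_col="weeks_at_risk", event_col="recovered")
cph.print_summary()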
As LA as a field of study has continued to develop, so has the call for utilizing this
research to make a greater impact on our understanding of learning and effective
educational practice. As early as 2012, only one year after the first LAK conference,
Ferguson challenged the community to incorporate more data that is intrinsically social into
LA research, such as social network analysis and discourse analysis, to “move away from
data driven investigation towards research more strongly grounded in the learning sciences
and, increasingly, dealing with the complexities of lifelong learning that takes place in a
variety of contexts” (p. 10). In 2014, several of the leaders in establishing the field observed,
“Learning analytics to date has served to identify a condition, but has not advanced to deal
with the learning challenges in a more nuanced and integrated manner” (Dawson et al., 2014,
p. 232).
Today, LA research has continued to expand its focus, incorporating more learning
theory and additional tools for investigating impact, such as text mining, semantics and
linguistic analyses, social network analysis and qualitative analysis. In fact, the field now
includes sub-specialties, such as “Social Learning Analytics” (Shum and Ferguson, 2012),
“Multi-Modal Learning Analytics” (Blikstein and Worsley, 2016) and Temporal Analytics
(Chen et al., 2018), reflecting the ever-increasing kinds of data available, as well as a
movement toward a greater emphasis on explanatory models rather than prediction
accuracy. In the first Handbook of Learning Analytics (Lang et al., 2017), 9 of the 30 chapters
address techniques and methodological approaches, and the following 10 chapters detail the
various ways these methods can be applied. The emphasis is shifting from meta-analysis
and group-level assessments using broad demographic factors to an approach that focuses
on fine-grained information about each individual student, collected over time. Winne (2017)
reminds us that “[. . .] determining that a learner’s age, sex, or lab group predicts outcomes
offers weak grounds for intervening without other data” (p. 241).
Like the shift from evidence-based medicine to precision medicine (Beckmann and Lew,
2016), LA research is increasingly focusing on personalizing student learning rather than
generating one-size-fits-all [group] approaches. The impact of LA research comes from the
realized value of applying analytics techniques to multiple types of student data, in addition
to the data captured by the digital learning environment. With student-level identified data,
personalization is triggered by the learner’s actions and personal information, rather than
solely by a prediction model that has been designed based on general patterns of prior
behavior. Much of the needed learner-specific data are already being collected by schools
and universities in their administrative data systems, and interest is growing in capturing
other data with sensor technology (e.g. eye tracking, gesture recognition systems, electro-
dermal activation and other bio-physiological data) that allow for the collection of rich, high
frequency data about how learners are physically and visually engaging with a given task.
Research arising from a deeper and more fine-grained view of the learner experience will
allow for the design of personalized learning trajectories that support the diversity of
students and their preparation for learning in specific types of educational contexts.
However, to achieve this goal, we need sound theoretical models for understanding what
the data are telling us. There is a need, in both LA and other data analytics, to come to a new
understanding of how data and theory are related. Rogers et al. (2016) argue for the need for
theory-driven research in LA to “close the loop” between research and practice. Going into
the data without a theory to be tested provides correlational results but does not necessarily support a causal understanding of learning outcomes. Although Wise and Schwarz (2017) caution that suppositions about theoretical constructs can limit the gaze of
researchers, without explaining why a particular practice or system works – and for whom –
the research is unlikely to have a wide impact on educational practice.

4. What information and learning sciences bring to learning analytics


This new field of LA has grown quickly as a result of a perfect storm of conditions: we know
more about the basic processes involved in learning, educational technology generates new
forms of data about learning and the rapid growth of online technology platforms and
services provides new opportunities to engage in formal and informal learning that is not
bound to physical spaces like schools or museums. When you couple this with increasing
societal pressures to make learning accessible to all and relevant to life in the information
age, we need scholarly research to ensure that advances in learning technologies and
educational practices benefit society as a whole. However, the commercialization of these
systems – and the data they generate – has outpaced the science about how they can and
should affect our lives. We need an active partnership between research communities to
ensure that the promise provided by technology to innovate teaching and learning can be
realized.
Like the early days of both the learning sciences and information science, LA is a new
field of study without a singular disciplinary home. Scholarship in all three fields has been
built on a solid foundation provided by other academic disciplines, typically by researchers
who were driven by the challenges presented by the transition to the digital age to conduct
their work across traditional disciplinary boundaries. While recognizing that there is as much variation within these three communities as across them, technology is a central focus for them all. Hoadley (2004) observed, “Technology was seen as a component of the learning environment from the beginning of the learning sciences field” (p. 9). For a full history of the
learning sciences, see Hoadley and Van Haneghan, 2011. Similarly, in a history of
information schools, Larsen (2010) proposed that “emergence and evolution of iSchools was
triggered by the explosive growth in digital information.” The foundation for LA is also
technology. However, this is not the only meeting ground for these three fields.

4.1 Shared goal and values


One of the most important commonalities between the learning sciences, information science
and LA is a deeply held view that information is inherently a public good. Achieving an
educated citizenry is a goal that is shared by all three research communities, and an
understanding of how educational technology can best be used to support learning is crucial
to achieving that goal. However, with the increasing drive for “data-driven decision-making”
and the potential risks for misuse and/or misinterpretation of data generated by information
technology, there is a need for explicit values to guide research practices beyond those that
are in the purview of institutional review boards. A recent paper by Penuel et al. (2017)
proposed a set of shared values for learning sciences research that I believe are also
applicable to research in LA and information science:
 The problem should be important to a broad range of stakeholders.
 The role and contributions of partners should be clearly described, particularly their
expertise and how it was integrated into the research.
 The research should support the agency of participants.
 The research should attend to context.
 The research should provide something of practical value to participants.
 The research plan should include specific, logical, and coherent plans for studying
and following problems; for designing, testing, and iterating upon solutions; and for
constructing and using practical knowledge.
 The research should account for the gap between what was intended and what was
accomplished.
 The research should contribute to organizational or community culture and practice.
 The research should be of value to others outside the partnership (http://learndbir.
org/resources/Penuel_et_al_2017_11_20-TO-SHARE.pdf).

How these values come into play in the three research communities will undoubtedly vary due to differences in focus, data sources, and stakeholders. Experience coming from the learning sciences community in research-practice partnerships (Coburn and Penuel, 2016) and approaches such as collaborative data inquiry (Krumm et al., 2018) can guide future research to help ensure that LA-based work is indeed “actionable.”

4.2 Learner-centered approach


Closely related to the values discussed above, a “human-centered” approach to
technology is a signature of research on designing information systems. For example, years
of research on technology adoption (Staub, 2017) have demonstrated that the assumption “if
we build it, they will come” is not always correct. When designing information technology,
we think first about the intended users and promote using an iterative design cycle that uses
feedback from early versions to revise and refine the system development and
implementation. In the learning sciences literature, a similar process is a key feature of the
various research practices, such as design-based implementation research (DBIR), community-based design research and other
forms of research-practice partnerships. All are an approach to relating research and
practice that is collaborative, iterative, and grounded in systematic inquiry. Driven by the
same explosive growth in health-care data as we are experiencing with educational data,
Friedman et al. (2017) have proposed a “learning cycle” to portray how an iterative cycle of
research can lead to improvements in health systems. An adapted graphic of their model is
shown in Figure 2. Applied to LA research, this represents the process of gathering data
from the various sources described above, conducting analyses on that data to capture new
knowledge about learning processes (D2K), translating those insights to changes in
educational practice (K2P) and generating new data from the results of the intervention to
begin the research cycle again (P2D).
Figure 2. Iterative cycle of data (D) to knowledge (K) to practice (P)
In the data to knowledge part of the cycle, LA will benefit from the methodological
expertise often located in the field of information science for the management and curation
of large-scale data sets produced by information technologies. For moving knowledge into
practice, these strengths need to be in partnership with learning theory to design, build and
implement effective LA interventions. For example, the rise of social LA comes from
constructivist theory about learning as an inherently social activity, especially in our
increasingly online and connected world (Shum and Ferguson, 2012). Despite the wealth of
educational data increasingly available, Kennedy et al. (2017) offer this caution, “We must
also continually remind ourselves that many of the learning interactions between staff and
students and much of our students’ learning experiences exist outside the administrative
and learning systems of universities” (p. 72).
In the learning sciences, Ito’s conception of “Connected Learning” (Ito et al., 2013) is one
example of work demonstrating how students use technology to pursue an interest or passion and link this learning and interest to academic achievement, career success and civic engagement. This line of research and others in the informal space could lead to
insights about how to collect data that can enrich our understanding of learning as a life-
wide activity (Banks, et al., 2006). Research in Information Science on large data sets drawn
from social media use (e.g. Facebook, Twitter, Wikipedia) could be linked to other student
data allowing a more comprehensive view of how learners achieve this connection.
Similarly, many of the new types of data being utilized for Multimodal LA are based on
psychological and psychophysiological theory in research areas such as embodied
cognition, affective learning, and attention (Blikstein and Worsley, 2016). With the
continued growth of online educational systems that reach thousands of users, such as
MOOCs and commercialized cognitive tutors, as well as the ways in which information
technology is blurring the line between formal and informal learning, sharing methods and
applying theory across these three scholarly communities is necessary now more than ever.
4.3 Data privacy, security and ethics


The recent outcry about the use of Facebook data by Cambridge Analytica (www.
nytimes.com/2018/04/04/us/politics/cambridge-analytica-scandal-fallout.html) is one
particularly salient demonstration of the many ways data can be harvested and
utilized, often without direct knowledge by the system users. Privacy issues complicate
access to the kind of data needed to conduct LA research. In the K-12 space, the non-
profit project called inBloom was initiated in 2011 to collect student information for
states and districts and make the data available to third parties to develop tools and
dashboards for classroom educators. Although opinions differ on the controversy that
led to the shut-down of this project, it is clear that what could potentially be gained
from federating student data was not as salient to parents as the possible misuse of the
data (Herold, 2014). Unfortunately, in the wake of the very public nature of its failure in
2014, “the trend in data-driven educational technologies has since been toward
piecemeal adoption of closed, proprietary systems instead of a multi-state, open source
platform” (Data and Society Blog, 2017).
One of the earliest and most cited LA papers on ethics (Slade and Prinsloo, 2013)
categorized three main issues:
(1) the location and interpretation of data;
(2) informed consent, privacy and the de-identification of data; and
(3) the management, classification and storage of data.

In 2014, a group of educators, scientists and legal/ethical scholars developed a framework to inform decisions about appropriate use of data and technology in learning research for
higher education. The six principles proposed (see http://asilomar-highered.info/) were
intended to inform the collection, storage, distribution and analysis of data derived from
human engagement with learning resources. Since that time, there have been a growing
number of papers adding new principles and policy recommendations (Gaševic et al., 2016).
Despite these proposals, there are few regulatory frameworks for student data in the
USA, in either higher education or K-12, apart from the Children’s Online Privacy Protection Act and the Family Educational Rights and Privacy Act, which were drafted to
provide some limits to sharing this data. However, as noted by Kurzweil and Stevens (2018),
“[in the US] the domains of edtech and LA are without commonly shared routines for
adjudicating conflicts of interest in data use for academic, commercial, and scientific
purposes” (p. 22). The European Union has adopted a new law called the General Data
Protection Regulation (GDPR), which is intended to strengthen individual privacy rights and
impose significant fines for companies that violate these rights (https://gdpr-info.eu/).
Because compliance with the GDPR requires pseudonymization and anonymization of
personal data, the effect on LA research conducted inside of the EU is yet to be determined.
Consideration of the ethics of using educational data and the differences in local
governing policies suggest some questions that would benefit from input from researchers
in the three communities addressed in this paper:
 What kinds of data do we think are important to be collected about learners?
 What kind of data are unnecessary?
 What kinds of control do we want to have over the learners’ data?
 What kinds of control do we want learners to have over their own data?
 What kinds of risks are we concerned about?
 Who is responsible for mediating these risks?

Tackling these issues of privacy, information sharing and data stewardship is crucial to
the future of LA. O’Neil’s (2016) book, Weapons of Math Destruction, is one example of the
narrative about how “Big Data” can be used to support and deepen disparities and reinforce
discriminatory practices (see also Rummel et al., 2016). While the sheer volume of digital
data gathered from whole populations (e.g. every student in a class, at a school, in a district)
provides an appearance of validity and generalizability – no small n’s or selective sampling,
no response bias – we must be mindful that what gets collected, who has access, how the
data are analyzed and interpreted, and the decisions resulting from these analyses are all
subject to human judgment. As Reich (2015) observed, “Researchers have come to believe
that nudges and pings can have enormous power over individual student choices, and
educators, parents and the public-at-large need to discuss how that power should best be
used.”

5. Designing effective learning analytics together


As a concrete example of one area where information and learning scientists can make a
significant contribution to LA research, consider a tool that is just beginning to be used in
educational technologies: performance dashboards for students.
Analytic tools relieve us of the burden of too much data by delivering analytics that
allow us to make sense of our own data. Whether it is using Google Analytics to monitor
website traffic, a click button to see who has viewed our LinkedIn page, or the dashboard on
our Fitbit home page, these tools measure, analyze and build reports from logged data for
purposes of understanding, monitoring, and perhaps changing behavior. When the data are
about students, collected to act in the students’ best interests but often captured without
their own or their parents’ consent, we will need the best theory and practices to come from
all scholarly societies whose work touches on the role that technology plays in modern life.
Student-facing performance dashboards are being integrated into existing educational
technologies, such as LMS, as well as in new applications for personalizing learning such as
gameful approaches to pedagogy (e.g. Gradecraft, see Aguilar et al., 2015) and tailored
messaging systems (e.g. eCoach, see Huberth et al., 2015). These dashboards typically
include comparative feedback about activity and performance, using the class average as a
benchmark. Viewed as meta-cognitive tools (Durall and Gros, 2014), student use is expected
to support awareness, self-reflection and sense-making (Govaerts et al., 2010) and lead to
improved performance. However, there is a large body of literature on feedback showing
that the effects on student learning are highly variable (Hattie and Timperley, 2007), and the use of comparative feedback may invoke specific motivational issues (Aguilar, 2017;
Teasley, 2017). Winstone et al. (2017) assert “Inevitably, the benefits of receiving feedback
are not uniform across all circumstances, so it is imperative to understand how these gains
can be maximized” (p. 17). Despite the fact that a growing number of commercial vendors
and homegrown university systems are poised to implement student-facing dashboards into
broad scale use, little research has addressed the impact of such systems and caution may be
needed when deploying one-size-fits-all dashboards directly to students (Teasley, 2017).
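As a small sketch of the comparative feedback these dashboards typically surface, the function below computes one student’s activity and score alongside the class average and a percentile rank; the data, field names and choice of metrics are hypothetical, and, as noted above, whether such comparisons help or harm motivation is an open question:

import pandas as pd

# Invented course records: one row per student.
grades = pd.DataFrame({
    "student_id": [101, 102, 103, 104],
    "logins":     [42, 18, 55, 30],
    "score":      [88.0, 71.5, 93.0, 79.0],
})

def dashboard_view(df: pd.DataFrame, student_id: int) -> dict:
    """Return the comparative metrics one student would see on their dashboard."""
    me = df.loc[df["student_id"] == student_id].iloc[0]
    return {
        "your_score": float(me["score"]),
        "class_average_score": round(float(df["score"].mean()), 1),
        "your_logins": int(me["logins"]),
        "class_average_logins": round(float(df["logins"].mean()), 1),
        "percentile_rank": round(float((df["score"] < me["score"]).mean()) * 100),
    }

print(dashboard_view(grades, 102))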
Two recent reviews of the state-of-the-art in dashboard research demonstrate the lack of
evaluation with regard to either theory or outcomes. In the first review, Schwendimann et al.
(2017) examined 55 dashboard papers published between 2010 and 2015, in 28 of which the dashboards were provided directly to students. Of all the papers reviewed, fewer than half included a specific reference to pedagogy, and only three papers contributed a theoretical proposal or framework. Further, only 15 papers described evaluations of dashboard use, leading the authors to conclude, “. . .perhaps the most striking finding of this review is the
fact that very few studies look directly into student learning gains or learning-related
constructs” (p. 37).
In the second review, Bodily and Verbert (2017) examined 93 papers where students were
the intended users of the system. Of these studies, only six included a description of a needs
analysis and only ten provided a report on usability testing. With respect to measured
effects of dashboard use, only 15 articles examined the impact of the system on student
behavior – 14 on student achievement and 2 on student skills such as reflection or
awareness. These authors concluded simply, “Student use of reporting systems is not well
studied or understood” (p. 417).
Clearly, there is a lot of room here for more researchers from the learning sciences to
bring their expertise in self-regulation and motivational theory into the design and
evaluation of student-facing dashboards. An excellent example of this approach can be
found in a paper by Sedrakyan et al. (2018), who propose a conceptual model for visualizing
the relationships between dashboard design and the learning sciences to provide cognitive
and behavioral process-oriented feedback to learners. By focusing on the links between what
is known about effective feedback and the regulatory mechanisms that underlie learning
processes, these authors outline how dashboard design could positively influence student
learning. From an information sciences perspective, Vieira et al. (2018) note the lack of
intersection between visual analytics and LA. They reviewed 52 papers about dashboards to
evaluate the sophistication of the displays and their connection to educational theory, and
they found an inverse relationship. Specifically, studies that provided a critical analysis of
visualization literature did not critically engage with educational theories, and vice versa.
The authors concluded that “Interdisciplinary work between information visualization
experts and educational researchers seems to be missing in the literature” (p. 131), and they
provide direction for a research agenda that would connect these fields to create a new sub-
field, visual LA, bringing insights from both communities in closer contact with LA
research.
Despite the limitations of existing dashboard research outlined in the papers
described here, there is fine work being done by my LA colleagues. Certainly, there are
excellent papers listed in these reviews that have provided thoughtful analyses and
solid direction for future research. However, for LA research to proceed with
successfully designing tools that truly innovate teaching and learning, we need to
continue to cross disciplinary boundaries.
6. Conclusions
New fields of study bubble up in the academy, often through interdisciplinary efforts that
bring people together around the promise of how theory, methodology and technological
innovation can be rearranged and reassembled to lead to new insights about our world. Over
the course of my academic career, I have been privileged to be part of the emergence of both
the Learning Sciences and iSchools. I credit this background, combined with the forward
thinking about the value of student data at my institution, as the reason my own research in
LA predates the first LAK conference. I found that my focus on learning was opened up in
new ways by the availability and quantity of educational data combined with the methods
for handling very large datasets demonstrated by my information science colleagues.
Further, I saw the work in LA as a way forward for advancing innovative educational practice in the digital age, and doing so at scale in ways that help to ensure that all learners can be successful.
This article is intended to help the learning sciences and information science
communities understand why LA is important and how their expertise is crucial to
advancing how we understand learning, and for developing LA-based interventions that
meet the values that we share. Educational technologies are not all easy to implement and
educational practices are notoriously difficult to change, especially at scale. The push for LA
across all educational sectors calls for scholarly research that clearly demonstrates – to
parents, students, teachers and administrators – the value proposition implicit in this field.
The use of LA in education requires that we think carefully about what we need to know
and when data is most likely to tell us what we need to know. I invite colleagues in the
learning sciences and information sciences to join me in this endeavor.

Note
1. See Siemens and Baker (2012) for a direct comparison of LA and EDM research.

References
Aguilar, S., Holman, C. and Fishman, B. (2015), “Game-inspired design: empirical evidence in support of gameful learning environments”, Games and Culture, Vol. 13 No. 1, pp. 44-70.
Ahn, J. and Erickson, I. (2016), “Revealing mutually constitutive ties between the information and
learning sciences”, The Information Society, Vol. 32 No. 2, pp. 81-84.
Arnold, K.E. (2010), “Signals: Applying academic analytics”, EDUCAUSE Quarterly, Vol. 33 No. 1, p. 1.
Arnold, K.E. and Pistilli, M.D. (2012), “Course signals at purdue: Using learning analytics to increase
student success”, Proceedings of the International Conference on Learning Analytics and
Knowledge, ACM, New York, NY, pp. 267-270.
Banks, J., Au, K., Ball, A., Bell, P., Gordon, E., Gutierrez, K. and Zhou, M. (2006), Learning in and out of
School in Diverse Environments: Life-Long, Life-Wide, Life-Deep, NSF LIFE Center and
University of Washington Center for Multicultural Education, Seattle, WA.
Beckmann, J.S. and Lew, D. (2016), “Reconciling evidence-based medicine and precision medicine in the
era of big data: challenges and opportunities”, Genome Medicine, Vol. 8 No. 1, p. 134.
Blikstein, P. and Worsley, M. (2016), “Multimodal learning analytics: a methodological framework for
research in constructivist learning”, Journal of Learning Analytics, Vol. 3 No. 2, pp. 220-238.
Bodily, R. and Verbert, K. (2017), “Review of research on student-facing learning analytics dashboards
and educational recommender systems”, IEEE Transactions on Learning Technologies, Vol. 10
No. 4, pp. 405-418.
Campbell, J.P., DeBlois, P.B. and Oblinger, D.G. (2007), “Academic analytics: a new tool for a new era”, EDUCAUSE Review, Vol. 42 No. 4, p. 40.
Chen, B., Knight, S. and Wise, A.F. (2018), “Critical issues in designing and implementing temporal
analytics”, Journal of Learning Analytics, Vol. 5 No. 1, pp. 1-9.
Coburn, C.E. and Penuel, W.R. (2016), “Research–practice partnerships in education: outcomes,
dynamics, and open questions”, Educational Researcher, Vol. 45 No. 1, pp. 48-54.
Data and Society Blog (2017), Assessing the Legacy of InBloom, available at: https://datasociety.net/
blog/2017/02/02/assessing-legacy-inbloom/ (accessed 23 May 2018).
Dawson, S., Bakharia, A. and Heathcote, E. (2010), “SNAPP: Realising the affordances of real-time SNA
within networked learning environments”, Proceedings of the 7th International Conference on
Networked Learning, Lancaster University, Lancaster, pp. 125-133.
Dawson, S., Gaševic, D., Siemens, G. and Joksimovic, S. (2014), “Current state and future trends: a
citation network analysis of the learning analytics field”, Proceedings of the Fourth International
Conference on Learning Analytics and Knowledge, ACM, New York, NY, USA, pp. 231-240.
Dennis, M.J. (2017), “How big data can help recruitment and retention”, Enrollment Management
Report, Vol. 21 No. 5, p. 3.
Durall, E. and Gros, B. (2014), “Learning analytics as a metacognitive tool”, Proceedings of the 6th
International Conference on Computer Supported Education (CSEDU), pp. 380-384.
Eckerson, W.W. (2011), Performance Dashboards. Measuring, Monitoring, and Managing Your
Business, 2nd ed., Wiley, Hoboken.
Ferguson, R. (2012), “Learning analytics: drivers, developments and challenges”, International Journal
of Technology Enhanced Learning, Vol. 4 Nos 5/6, pp. 304-317.
Friedman, C.P., Rubin, J.C. and Sullivan, K.J. (2017), “Toward an information infrastructure for global
health improvement”, Yearbook of Medical Informatics, Vol. 26 No. 01, pp. 16-23.
Fritz, J.L. (2011), “Classroom walls that talk: using online course activity data of successful students to
raise self-awareness of underperforming peers”, Internet and Higher Education, Vol. 14 No. 2,
pp. 89-97.
Gaševic, D., Dawson, S. and Jovanovic, J. (2016), “Ethics and privacy as enablers of learning analytics”,
Journal of Learning Analytics, Vol. 3 No. 1, pp. 1-4.
Govaerts, S., Verbert, K., Klerkx, J. and Duval, E. (2010), “Visualizing activities for self-reflection and
awareness”, in Advances in Web-Based Learning–ICWL, Springer, Berlin, pp. 91-100.
Greller, W. and Drachsler, H. (2012), “Turning learning into numbers. Toward a generic framework for
learning analytics”, Journal of Educational Technology and Society, Vol. 15 No. 3, pp. 42-57.
Hattie, J. and Timperley, H. (2007), “The power of feedback”, Review of Educational Research, Vol. 77
No. 1, pp. 81-112.
Herold, B. (2014), “inBloom to shut down amid growing data-privacy concerns”, Education Week,
Vol. 21.
Hoadley, C. (2004), “Learning and design: Why the learning sciences and instructional systems need
each other”, Educational Technology, Vol. 44 No. 3, pp. 6-1.
Hoadley, C. and Van Haneghan, J. (2011), “The learning sciences: Where they came from and what it
means for instructional designers”, in Reiser, R.A. and Dempsey, J.V. (Eds), Trends and Issues in
Instructional Design and Technology, 3rd ed., Pearson, New York, NY, pp. 53-63.
Huberth, M., Chen, P., Tritz, J. and McKay, T.A. (2015), “Computer-Tailored student support in
introductory physics”, PLoS One, Vol. 10 No. 9, p. e0137001.
Ito, M., Gutiérrez, K., Livingstone, S., Penuel, B., Rhodes, J., Salen, K., Schor, J., Sefton-Green, J. and
Watkins, S.C. (2013), Connected Learning: An Agenda for Research and Design, BookBaby.
Kennedy, G., Corrin, L. and De Barba, P. (2017), “Analytics of what? negotiating the seduction of big
data and learning analytics”, in James, R., French, S. and Kelly, P. (Eds), Visions for Australian
Tertiary Education, Melbourne Centre for the Study of Higher Education, The University of
Melbourne, Melbourne, pp. 67-76.
Khalil, M. and Ebner, M. (2016), “What is learning analytics about? A survey of different methods used
in 2013-2015”, Proceedings of the Smart Learning Conference, arXiv preprint arXiv:1606.02878,
pp. 294-304.
Krumm, A.E., Waddington, R.J., Teasley, S.D. and Lonn, S. (2014), “Using data from a learning
management system to support academic advising in undergraduate engineering education”, in
Larusson, J.A. and White, B. (Eds), Learning Analytics from Research to Practice: Methods,
Tools, and Approaches, Springer-Verlag, Berlin, pp. 103-119.
Krumm, A., Means, B. and Bienkowski, M. (2018), Learning Analytics Goes to School: A Collaborative
Approach to Improving Education, Routledge, New York, NY.
Kurzweil, M. and Stevens, M. (2018), “Setting the table: responsible use of student data in higher
education”, EDUCAUSE Review, May/June, pp. 17-24.
Lang, C., Siemens, G., Wise, A. and Gaševic, D. (Eds) (2017), Handbook of Learning Analytics, SoLAR, Society for Learning Analytics Research.
Larsen, R.L. (2010), “iSchools”, in Bates, M.J. and Maack, M.N (Eds), Encyclopedia of Library and
Information Sciences, 3rd ed., CRC Press, Boca Raton, pp. 3018-3023.
Long, P. and Siemens, G. (Eds) (2011), Proceedings of the 1st International Conference on Learning
Analytics and Knowledge, ACM, New York, NY, USA.
O’Neil, C. (2016), Weapons of Math Destruction: How Big Data Increases Inequality and Threatens Democracy, Broadway Books.
Penuel, W.R., Peurach, D.J., LeBoeuf, W.A., Riedy, R., Barber, M., Clark, T.R. and Gabriele, K. (2017),
Defining Collaborative Problem Solving Research: Common Values and Distinctive Approaches,
University of Colorado, Boulder, CO.
Reich, J. (3 June, 2015), available at: www.kqed.org/mindshift/40719
Rogers, T., Dawson, S. and Gaševic, D. (2016), “Learning analytics and the imperative for theory driven
research”, in Haythornthwaite, C., Andrews, R., Fransman, J. and Meyers, E. (Eds), Handbook of
E-Learning Research, SAGE, London, pp. 232-250.
Rummel, N., Walker, E. and Aleven, V. (2016), “Different futures of adaptive collaborative learning
support”, International Journal of Artificial Intelligence in Education, Vol. 26 No. 2, pp. 784-795.
Schwendimann, B.A., Rodriguez-Triana, M.J., Vozniuk, A., Prieto, L.P., Boroujeni, M.S., Holzer, A.
and Dillenbourg, P. (2017), “Perceiving learning at a glance: a systematic literature review
of learning dashboard research”, IEEE Transactions on Learning Technologies, Vol. 10
No. 1, pp. 30-41.
Sedrakyan, G., Malmberg, J., Verbert, K., Järvelä, S. and Kirschner, P.A. (2018), “Linking learning
behavior analytics and learning science concepts: designing a learning analytics dashboard for
feedback to support learning regulation”, Computers in Human Behavior.
Shum, S.B. and Ferguson, R. (2012), “Social learning analytics”, Journal of Educational Technology and
Society, Vol. 15 No. 3, pp. 3-26.
Siemens, G. and Baker, R. (2012), “Learning analytics and educational data mining: towards
communication and collaboration”, Proceedings of the 2nd International Conference on Learning
Analytics and Knowledge, ACM, Vancouver, Canada, pp. 252-254.
Siemens, G. and Long, P. (2011), “Penetrating the fog: analytics in learning and education”,
EDUCAUSE Review, Vol. 46 No. 5, pp. 30-32.
Slade, S. and Prinsloo, P. (2013), “Learning analytics: ethical issues and dilemmas”, American
Behavioral Scientist, Vol. 57 No. 10, pp. 1509-1528.
Staub, E.T. (2017), “Understanding technology adoption: Theory and future directions for informal
learning”, Review of Educational Research, Vol. 79 No. 2, pp. 625-649.
Teasley, S.D. (2017), “Student facing dashboards: One size fits all?”, Technology, Knowledge and Learning, Vol. 22 No. 3, pp. 377-384.
Vieira, C., Parsons, P. and Byrd, V. (2018), “Visual learning analytics of educational data: a systematic
literature review and research agenda”, Computers and Education, Vol. 122, pp. 119-135.
Winne, P.H. (2017), “Learning analytics for self-regulated learning”, in Lang, C., Siemens, G., Wise, A.
and Gaševic, D. (Eds), Handbook of Learning Analytics, 1st ed., The Society for Learning
Analytics Research, Beaumont, AB, pp. 241-249.
Winstone, N.E., Nash, R.A., Parker, M. and Rowntree, J. (2017), “Supporting learners’ agentic
engagement with feedback: a systematic review and a taxonomy of recipience processes”,
Educational Psychologist, Vol. 52 No. 1, pp. 17-37.
Wise, A. (2014), “Designing pedagogical interventions to support student use of learning analytics”,
Proceedings of the International Conference on Learning Analytics and Knowledge, ACM,
Indianapolis, IN, pp. 203-211.
Wise, A.F. and Schwarz, B.B. (2017), “Visions of CSCL: eight provocations for the future of the field”,
International Journal of Computer-Supported Collaborative Learning, Vol. 12 No. 4, pp. 423-467.
Yoo, Y., Lee, H., Jo, I.H. and Park, Y. (2015), “Educational dashboards for smart learning: review of case
studies”, In Chen, G., Kumar, V., Kinshuk, Huang, R. and Kong, S.C. (Eds), Emerging Issues in
Smart Learning, Springer, Berlin Heidelberg, pp. 145-155.

Further reading
Aguilar, S. (2016), “Perceived motivational affordances: Capturing and measuring students’ sense-
making around visualizations of their academic achievement information”, (Doctoral
Dissertation), University of Michigan, Ann Arbor, MI.
Prinsloo, P. and Slade, S. (2017), “Ethics and learning analytics: Charting the (un)charted”, in Lang, C., Siemens, G., Wise, A. and Gaševic, D. (Eds), Handbook of Learning Analytics, 1st ed., The
Society for Learning Analytics Research, pp. 49-57.

Corresponding author
Stephanie Danell Teasley can be contacted at: steasley@umich.edu
