Learning analytics: where information science and the learning sciences meet
Stephanie Danell Teasley,
Article information:
To cite this document:
Stephanie Danell Teasley, (2018) "Learning analytics: where information science and the learning
sciences meet", Information and Learning Science, https://doi.org/10.1108/ILS-06-2018-0045
Permanent link to this document:
https://doi.org/10.1108/ILS-06-2018-0045
Downloaded by Eastern Kentucky University At 05:56 11 January 2019 (PT)
Learning analytics: where information science and the learning sciences meet
Stephanie Danell Teasley
School of Information, University of Michigan, Ann Arbor, Michigan, USA
Received 5 June 2018
Revised 13 September 2018
Accepted 17 September 2018
Abstract
Purpose – The explosive growth in the number of digital tools utilized in everyday learning activities
generates data at an unprecedented scale, providing exciting challenges that cross scholarly communities.
This paper aims to provide an overview of learning analytics (LA) with the aim of helping members of the
information and learning sciences communities understand how educational Big Data is relevant to their
research agendas and how they can contribute to this growing new field.
Design/methodology/approach – Highlighting shared values and issues illustrates why LA is the
perfect meeting ground for information and the learning sciences, and suggests how, by working together, effective LA tools can be designed to innovate education.
Findings – Analytics-driven performance dashboards are offered as a specific example of one research area
where information and learning scientists can make a significant contribution to LA research. Recent reviews
of existing dashboard studies point to a dearth of evaluation with regard to either theory or outcomes. Here,
the relevant expertise from researchers in both the learning sciences and information science is offered as an
important opportunity to improve the design and evaluation of student-facing dashboards.
Originality/value – This paper outlines important ties between three scholarly communities to illustrate
how their combined research expertise is crucial to advancing how we understand learning and for
developing LA-based interventions that meet the values that we all share.
Keywords Education, Learning, Dashboards, Educational technology, Learning analytics,
Student data privacy
Paper type Viewpoint
The most dramatic factor shaping the future of higher education is something that we can’t
actually touch or see: big data and analytics. (Siemens and Long, 2011, p. 31).
1. Introduction
The word “analytics” is very much in the news as there is a growing awareness of the extent
to which the technologies we use every day collect data about our online activities. As a
broad concept, analytics refers to data-gathering and analysis methods for discovery,
interpretation, and communication of meaningful patterns in data. In the age of Big Data,
the use of analytics provides a way to process very large, unstructured data sets to discern
patterns more quickly and accurately than a human ever could. It is unquestionable and
potentially unavoidable that technology providers collect and keep a tremendous amount of
information about our behavior. Analytics is the process that transforms these rich data
repositories into consumable information by comparing activity and performance indicators
within and across datasets.

Information and Learning Science
© Emerald Publishing Limited
2398-5348
DOI 10.1108/ILS-06-2018-0045

In the private sector, analytics are used to measure performance at a number of different levels: corporate, business unit and personnel, and are used to measure, monitor and manage business processes to achieve strategic objectives (Eckerson, 2011) by helping users to
identify trends, patterns and anomalies and to support reasoning and guide effective
decision-making. Although educational institutions have been using “academic analytics”
(Campbell et al., 2007) for specific business purposes (e.g. enrollment management, fund-
raising), the use of analytics has moved into the pedagogical practice of education by
utilizing new data about teaching and learning available through the wide-scale adoption of
educational technology, such as learning management systems (LMS), digital textbooks,
lecture capture systems and podcasts.
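As a concrete illustration of what "comparing activity and performance indicators within a dataset" can mean in practice, the following minimal sketch computes cohort-relative login activity. The student IDs and login counts are invented for illustration; a real system would draw them from an LMS.

```python
from statistics import mean, pstdev

# Hypothetical LMS event counts: logins per student over one term.
# All IDs and numbers are invented for illustration.
logins = {"s01": 42, "s02": 7, "s03": 55, "s04": 12, "s05": 38}

def z_scores(counts):
    """Compare each student's activity to the cohort (a within-dataset indicator)."""
    mu, sigma = mean(counts.values()), pstdev(counts.values())
    return {s: (n - mu) / sigma for s, n in counts.items()}

scores = z_scores(logins)
# Students well below the cohort mean surface as candidates for outreach.
low_activity = sorted(s for s, z in scores.items() if z < -1.0)
```

The same comparison "across datasets" would simply standardize against indicators pooled from other courses or terms rather than a single class.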
Developments in artificial intelligence and machine learning create the conditions for
applying new methodologies to the ever-increasing amounts of data that learners generate
through their interactions with online tools. Similar to the shift to evidence-based medicine
in health care, research in education has begun to recognize the opportunities that arise from
analyzing learner-produced data trails. For example, just as modeling techniques can be
used to predict who is most likely to get the flu (see the CDC website, https://predict.
phiresearchlab.org/), we can use the same kind of analytic methods to predict who is most
likely to fail a course. As in health care, we may be able to intervene before the prediction
becomes reality.
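Many modeling choices could realize the kind of failure prediction described above; one minimal sketch is a two-feature logistic model trained by gradient descent. The features (weekly logins, prior GPA) and all training records below are invented for illustration, not drawn from any cited study.

```python
import math

# Synthetic training records: ((weekly LMS logins, prior GPA), failed course?)
# All data are invented; a real model would use institutional records.
data = [((2, 2.1), 1), ((1, 2.5), 1), ((3, 2.0), 1), ((9, 3.6), 0),
        ((8, 3.2), 0), ((12, 3.9), 0), ((4, 2.8), 1), ((10, 3.4), 0)]

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def train(records, lr=0.1, epochs=2000):
    """Fit a two-feature logistic model with plain stochastic gradient descent."""
    w, b = [0.0, 0.0], 0.0
    for _ in range(epochs):
        for (x1, x2), y in records:
            p = sigmoid(w[0] * x1 + w[1] * x2 + b)
            err = p - y                      # gradient of the log-loss
            w[0] -= lr * err * x1
            w[1] -= lr * err * x2
            b -= lr * err
    return w, b

w, b = train(data)

def risk(logins, gpa):
    """Predicted probability of failing the course."""
    return sigmoid(w[0] * logins + w[1] * gpa + b)
```

As in the health-care analogy, the point is not the model itself but that a risk estimate is available early enough to intervene before the prediction becomes reality.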
In this paper, I provide an overview of a new scholarly community called learning
analytics (LA), discuss why LA is the perfect meeting ground for information and the
learning sciences, and suggest how, by working together, we can design effective LA tools
that may in fact innovate education. This article aims to help members of the information
and learning sciences communities who are interested in LA understand how educational
Big Data is relevant to the research agenda(s) undertaken by both information and learning
scientists. The explosive growth in the number of digital tools utilized in everyday learning
activities generates data at an unprecedented scale, providing exciting challenges for both
scholarly communities. My hope is that this paper can build on the existing synergies
between information and learning sciences (Ahn and Erickson, 2016) to stimulate
discussions, energize collaborations, and open up new pathways between LA and these two
more established scholarly communities.
(1) The growth of data surpasses the ability of organizations to make sense of it. This concern is
particularly pronounced in relation to knowledge, teaching, and learning.
(2) Learning institutions and corporations make little use of the data learners “throw off” in the
process of accessing learning materials, interacting with educators and peers, and creating new
content.
(3) In an age where educational institutions are under growing pressure to reduce costs and
increase efficiency, analytics promises to be an important lens through which to view and plan for
change at course and institution levels (Long and Siemens, 2011, p. 3).
In the years since, the growth of LA is evidenced by an annual conference that grows in size and impact, the incorporation of a scholarly society (the Society for Learning Analytics Research, “SoLAR”), and a new journal, the Journal of Learning Analytics. Researchers
involved in LA come from various fields including education, psychology, philosophy,
sociology, linguistics, learning sciences, statistics, machine learning/artificial intelligence,
computer science and information science. There was – and still is – some overlap between
related societies and conferences with longer histories, specifically Artificial Intelligence in Education, the European Conference on Technology Enhanced Learning, and Educational Data Mining[1].
The definition of LA adopted by the Society for Learning Analytics Research states that:
Learning analytics is the measurement, collection, analysis and reporting of data about learners
and their contexts, for purposes of understanding and optimizing learning and the environments
in which it occurs (see www.SoLAResearch.org).
Figure 1. Framework for LA
and Latino students. They review over 700 risk factors for each of the 24,000 undergraduate students on a continuous basis and have a large staff of academic advisors who reach out to the underperforming students identified by this system (Dennis, 2017). A growing number of universities throughout the world are institutionalizing the use of prediction models with the aim of reducing drop-out rates, especially at institutions that serve diverse student populations. In both online and residential settings, an increasing number of dashboard systems have been developed to provide “actionable intelligence” to instructors (e.g. SNAPP, see Dawson et al., 2010), academic advisors (e.g. Student Explorer, see Krumm et al., 2014) and, more recently, directly to students (Yoo et al., 2015).
What these LA-based interventions, and others like them, have in common is the delivery
of feedback based on student models that blend demographic, institutional and LMS data in
the form of simple counts of login activity and course grades gathered throughout an
academic term. Using prediction models allows investigation of research questions about
performance outcomes. For example, among freshmen entering college with strong high school records (GPA > 3.8), who will do poorly (GPA < 2.0) at the end of their first year?
Using individual course grades and overall GPA to create three performance categories –
under achieving, over achieving, as expected – what factors categorize the students who are
under achieving? The answers to these questions have implications for designing
personalized interventions. Through sequence mining, research can address questions
about student performance over time, such as, how do different pathways through a
curriculum influence GPA at graduation? These types of analyses allow inferences about students’ trajectories within and across courses to unpack the factors related to
undergraduate completion. With statistical methods such as hazard modeling, researchers
can investigate questions about changes in student performance throughout a term rather
than just relying on their final course grade. For example, what is the likelihood of an
individual student’s recovery from academic difficulty in one or more courses during
different points in the semester? Is there a point of no return in which students who are
performing poorly are not likely to pass the course or complete the MOOC? Determining
these critical periods during the academic year can lead to effective interventions before
students experience failure.
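A minimal sketch of the discrete-time hazard idea described above might look like the following; the per-student recovery weeks are invented, and a real analysis would also handle censoring and covariates more carefully.

```python
# Hypothetical per-student outcomes: the week a struggling student recovered
# (returned to passing work), or None if they never did within the term.
# All values are invented for illustration.
recovery_week = [2, 3, 3, 5, None, 6, None, 2, 8, None]

def discrete_hazard(events, horizon=10):
    """h(t) = recoveries at week t / students still at risk entering week t."""
    hazard = {}
    at_risk = len(events)
    for t in range(1, horizon + 1):
        d = sum(1 for w in events if w == t)
        hazard[t] = d / at_risk if at_risk else 0.0
        at_risk -= d          # recovered students leave the risk set
    return hazard

h = discrete_hazard(recovery_week)
# Weeks where the hazard drops toward zero late in the term are candidates
# for the "point of no return" discussed above.
```

Plotting h(t) across the term shows when recovery becomes unlikely, which is exactly the information needed to time an intervention.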
As LA as a field of study has continued to develop, so has the call for utilizing this
research to make a greater impact on our understanding of learning and effective
educational practice. As early as 2012, only one year after the first LAK conference,
Ferguson challenged the community to incorporate more data that is intrinsically social into
LA research, such as social network analysis and discourse analysis, to “move away from
data driven investigation towards research more strongly grounded in the learning sciences
and, increasingly, dealing with the complexities of lifelong learning that takes place in a
variety of contexts” (p. 10). In 2014, several of the leaders in establishing the field observed,
“Learning analytics to date has served to identify a condition, but has not advanced to deal
with the learning challenges in a more nuanced and integrated manner” (Dawson et al., 2014,
p. 232).
Today, LA research has continued to expand its focus, incorporating more learning
theory and additional tools for investigating impact, such as text mining, semantics and
linguistic analyses, social network analysis and qualitative analysis. In fact, the field now
includes sub-specialties, such as “Social Learning Analytics” (Shum and Ferguson, 2012),
“Multi-Modal Learning Analytics” (Blikstein and Worsley, 2016) and “Temporal Analytics”
(Chen et al., 2018), reflecting the ever-increasing kinds of data available, as well as a
movement toward a greater emphasis on explanatory models rather than prediction
accuracy. In the first Handbook of Learning Analytics (Lang et al., 2017), 9 of the 30 chapters
address techniques and methodological approaches, and the following 10 chapters detail the
various ways these methods can be applied. The emphasis is shifting from meta-analysis
and group-level assessments using broad demographic factors to an approach that focuses
on fine-grained information about each individual student, collected over time. Winne (2017)
reminds us that “[. . .] determining that a learner’s age, sex, or lab group predicts outcomes
offers weak grounds for intervening without other data” (p. 241).
Like the shift from evidence-based medicine to precision medicine (Beckmann and Lew,
2016), LA research is increasingly focusing on personalizing student learning rather than
generating one-size-fits-all [group] approaches. The impact of LA research comes from the
realized value of applying analytics techniques to multiple types of student data, in addition
to the data captured by the digital learning environment. With student-level identified data,
personalization is triggered by the learner’s actions and personal information, rather than
solely by a prediction model that has been designed based on general patterns of prior
behavior. Much of the needed learner-specific data are already being collected by schools
and universities in their administrative data systems, and interest is growing in capturing
other data with sensor technology (e.g. eye tracking, gesture recognition systems, electro-
dermal activation and other bio-physiological data) that allow for the collection of rich, high
frequency data about how learners are physically and visually engaging with a given task.
Research arising from a deeper and more fine-grained view of the learner experience will
allow for the design of personalized learning trajectories that support the diversity of
students and their preparation for learning in specific types of educational contexts.
However, to achieve this goal, we need sound theoretical models for understanding what
the data are telling us. There is a need, in both LA and other data analytics, to come to a new
understanding of how data and theory are related. Rogers et al. (2016) argue for the need for
theory-driven research in LA to “close the loop” between research and practice. Going into
the data with a without a theory to be tested provides correlational results but does not
necessarily support a causal understanding of learning outcomes. Although Wise and
Schwartz (2017) caution that suppositions about theoretical constructs can limit the gaze of
researchers, without explaining why a particular practice or system works – and for whom –
the research is unlikely to have a wide impact on educational practice.
to achieving that goal. However, with the increasing drive for “data-driven decision-making”
and the potential risks for misuse and/or misinterpretation of data generated by information
technology, there is a need for explicit values to guide research practices beyond those that
are in the purview of institutional review boards. A recent paper by Penuel et al. (2017)
proposed a set of shared values for learning sciences research that I believe are also
applicable to research in LA and information science:
The problem should be important to a broad range of stakeholders.
The role and contributions of partners should be clearly described, particularly their
expertise and how it was integrated into the research.
The research should support the agency of participants.
The research should attend to context.
The research should provide something of practical value to participants.
The research plan should include specific, logical, and coherent plans for studying
and following problems; for designing, testing, and iterating upon solutions; and for
constructing and using practical knowledge.
The research should account for the gap between what was intended and what was
accomplished.
The research should contribute to organizational or community culture and practice.
The research should be of value to others outside the partnership (http://learndbir.
org/resources/Penuel_et_al_2017_11_20-TO-SHARE.pdf).
How these values come to play in the three research communities will undoubtedly vary due
to differences in focus, data sources, and stakeholders. Experience coming from the learning
sciences community in research practice partnerships (Coburn and Penuel, 2016) and
approaches such as collaborative data inquiry (Krumm et al., 2018) can guide future research
to help ensure that LA-based work is indeed “actionable.”
In the data to knowledge part of the cycle, LA will benefit from the methodological
expertise often located in the field of information science for the management and curation
of large-scale data sets produced by information technologies. For moving knowledge into
practice, these strengths need to be in partnership with learning theory to design, build and
implement effective LA interventions. For example, the rise of social LA comes from
constructivist theory about learning as an inherently social activity, especially in our
increasingly online and connected world (Shum and Ferguson, 2012). Despite the wealth of
educational data increasingly available, Kennedy et al. (2017) offer this caution, “We must
also continually remind ourselves that many of the learning interactions between staff and
students and much of our students’ learning experiences exist outside the administrative
and learning systems of universities” (p. 72).
In the learning sciences, Ito’s conception of “Connected Learning” (Ito et al., 2013) is one
example of work demonstrating how students use technology to pursue an interest or passion and link this learning and interest to academic achievement, career success and civic engagement. This line of research and others in the informal space could lead to insights about how to collect data that can enrich our understanding of learning as a life-wide activity (Banks et al., 2006).

Figure 2. Iterative cycle of data (D) to knowledge (K) to practice (P)

Research in Information Science on large data sets drawn
from social media use (e.g. Facebook, Twitter, Wikipedia) could be linked to other student
data allowing a more comprehensive view of how learners achieve this connection.
Similarly, many of the new types of data being utilized for Multimodal LA are based on
psychological and psychophysiological theory in research areas such as embodied
cognition, affective learning, and attention (Blikstein and Worsley, 2016). With the
continued growth of online educational systems that reach thousands of users, such as
MOOCs and commercialized cognitive tutors, as well as the ways in which information
technology is blurring the line between formal and informal learning, sharing methods and
applying theory across these three scholarly communities is necessary now more than ever.
Tackling these issues of privacy, information sharing and data stewardship is crucial to
the future of LA. O’Neil’s (2016) book, Weapons of Math Destruction, is one example of the
narrative about how “Big Data” can be used to support and deepen disparities and reinforce
discriminatory practices (see also Rummel et al., 2016). While the sheer volume of digital
data gathered from whole populations (e.g. every student in a class, at a school, in a district)
provides an appearance of validity and generalizability – no small n’s or selective sampling,
no response bias – we must be mindful that what gets collected, who has access, how the
data are analyzed and interpreted, and the decisions resulting from these analyses are all
subject to human judgment. As Reich (2015) observed, “Researchers have come to believe
that nudges and pings can have enormous power over individual student choices, and
educators, parents and the public-at-large need to discuss how that power should best be
used.”
This article is intended to help the learning sciences and information science
communities understand why LA is important and how their expertise is crucial to
advancing how we understand learning, and for developing LA-based interventions that
meet the values that we share. Educational technologies are not all easy to implement and
educational practices are notoriously difficult to change, especially at scale. The push for LA
across all educational sectors calls for scholarly research that clearly demonstrates – to
parents, students, teachers and administrators – the value proposition implicit in this field.
The use of LA in education requires that we think carefully about what we need to know
and when data is most likely to tell us what we need to know. I invite colleagues in the
learning sciences and information sciences to join me in this endeavor.
Note
1. See Siemens and Baker (2012) for a direct comparison of LA and EDM research.
References
Aguilar, S., Holman, C. and Fishman, B. (2015), “Game-inspired design: empirical evidence in support of gameful learning environments”, Games and Culture, Vol. 13 No. 1, pp. 44-70.
Ahn, J. and Erickson, I. (2016), “Revealing mutually constitutive ties between the information and
learning sciences”, The Information Society, Vol. 32 No. 2, pp. 81-84.
Arnold, K.E. (2010), “Signals: applying academic analytics”, EDUCAUSE Quarterly, Vol. 33 No. 1, p. 1.
Arnold, K.E. and Pistilli, M.D. (2012), “Course signals at Purdue: using learning analytics to increase student success”, Proceedings of the International Conference on Learning Analytics and Knowledge, ACM, New York, NY, pp. 267-270.
Banks, J., Au, K., Ball, A., Bell, P., Gordon, E., Gutierrez, K. and Zhou, M. (2006), Learning in and out of
School in Diverse Environments: Life-Long, Life-Wide, Life-Deep, NSF LIFE Center and
University of Washington, DC Center for Multicultural Education, Seattle, WA.
Beckmann, J.S. and Lew, D. (2016), “Reconciling evidence-based medicine and precision medicine in the era of big data: challenges and opportunities”, Genome Medicine, Vol. 8 No. 1, p. 134.
Blikstein, P. and Worsley, M. (2016), “Multimodal learning analytics: a methodological framework for
research in constructivist learning”, Journal of Learning Analytics, Vol. 3 No. 2, pp. 220-238.
Bodily, R. and Verbert, K. (2017), “Review of research on student-facing learning analytics dashboards
and educational recommender systems”, IEEE Transactions on Learning Technologies, Vol. 10
No. 4, pp. 405-418.
Campbell, J.P., DeBlois, P.B. and Oblinger, D.G. (2007), “Academic analytics: a new tool for a new era”, EDUCAUSE Review, Vol. 42 No. 4, p. 40.
Chen, B., Knight, S. and Wise, A.F. (2018), “Critical issues in designing and implementing temporal
analytics”, Journal of Learning Analytics, Vol. 5 No. 1, pp. 1-9.
Coburn, C.E. and Penuel, W.R. (2016), “Research–practice partnerships in education: outcomes,
dynamics, and open questions”, Educational Researcher, Vol. 45 No. 1, pp. 48-54.
Data and Society Blog (2017), Assessing the Legacy of InBloom, available at: https://datasociety.net/
blog/2017/02/02/assessing-legacy-inbloom/ (accessed 23 May 2018).
Dawson, S., Bakharia, A. and Heathcote, E. (2010), “SNAPP: Realising the affordances of real-time SNA
within networked learning environments”, Proceedings of the 7th International Conference on
Networked Learning, Lancaster University, Lancaster, pp. 125-133.
Dawson, S., Gaševic, D., Siemens, G. and Joksimovic, S. (2014), “Current state and future trends: a
citation network analysis of the learning analytics field”, Proceedings of the Fourth International
Conference on Learning Analytics and Knowledge, ACM, New York, NY, USA, pp. 231-240.
Dennis, M.J. (2017), “How big data can help recruitment and retention”, Enrollment Management Report, Vol. 21 No. 5, p. 3.
Durall, E. and Gros, B. (2014), “Learning analytics and a metacognitive tool”, Proceedings of the 6th
International Conference on Computer Supported Education (CSEDU), pp. 380-384.
Eckerson, W.W. (2011), Performance Dashboards. Measuring, Monitoring, and Managing Your
Business, 2nd ed., Wiley, Hoboken.
Ferguson, R. (2012), “Learning analytics: drivers, developments and challenges”, International Journal
of Technology Enhanced Learning, Vol. 4 Nos 5/6, pp. 304-317.
Friedman, C.P., Rubin, J.C. and Sullivan, K.J. (2017), “Toward an information infrastructure for global
health improvement”, Yearbook of Medical Informatics, Vol. 26 No. 01, pp. 16-23.
Fritz, J.L. (2011), “Classroom walls that talk: using online course activity data of successful students to
raise self-awareness of underperforming peers”, Internet and Higher Education, Vol. 14 No. 2,
pp. 89-97.
Gaševic, D., Dawson, S. and Jovanovic, J. (2016), “Ethics and privacy as enablers of learning analytics”,
Journal of Learning Analytics, Vol. 3 No. 1, pp. 1-4.
Govaerts, S., Verbert, K., Klerkx, J. and Duval, E. (2010), “Visualizing activities for self-reflection and
awareness”, in Advances in Web-Based Learning–ICWL, Springer, Berlin, pp. 91-100.
Greller, W. and Drachsler, H. (2012), “Turning learning into numbers. Toward a generic framework for
learning analytics”, Journal of Educational Technology and Society, Vol. 15 No. 3, pp. 42-57.
Hattie, J. and Timperley, H. (2007), “The power of feedback”, Review of Educational Research, Vol. 77
No. 1, pp. 81-112.
Herold, B. (2014), “inBloom to shut down amid growing data-privacy concerns”, Education Week,
Vol. 21.
Hoadley, C. (2004), “Learning and design: why the learning sciences and instructional systems need each other”, Educational Technology, Vol. 44 No. 3, pp. 6-12.
Hoadley, C. and Van Haneghan, J. (2011), “The learning sciences: Where they came from and what it
means for instructional designers”, in Reiser, R.A. and Dempsey, J.V. (Eds), Trends and Issues in
Instructional Design and Technology, 3rd ed., Pearson, New York, NY, pp. 53-63.
Huberth, M., Chen, P., Tritz, J. and McKay, T.A. (2015), “Computer-Tailored student support in
introductory physics”, PLoS One, Vol. 10 No. 9, p. e0137001.
Ito, M., Gutiérrez, K., Livingstone, S., Penuel, B., Rhodes, J., Salen, K., Schor, J., Sefton-Green, J. and
Watkins, S.C. (2013), Connected Learning: An Agenda for Research and Design, BookBaby.
Kennedy, G., Corrin, L. and De Barba, P. (2017), “Analytics of what? negotiating the seduction of big
data and learning analytics”, in James, R., French, S. and Kelly, P. (Eds), Visions for Australian
Tertiary Education, Melbourne Centre for the Study of Higher Education, The University of
Melbourne, Melbourne, pp. 67-76.
Khalil, M. and Ebner, M. (2016), “What is learning analytics about? A survey of different methods used
in 2013-2015”, Proceedings of the Smart Learning Conference, arXiv preprint arXiv:1606.02878,
pp. 294-304.
Krumm, A.E., Waddington, R.J., Teasley, S.D. and Lonn, S. (2014), “Using data from a learning
management system to support academic advising in undergraduate engineering education”, in
Larusson, J.A. and White, B. (Eds), Learning Analytics from Research to Practice: Methods,
Tools, and Approaches, Springer-Verlag, Berlin, pp. 103-119.
Krumm, A., Means, B. and Bienkowski, M. (2018), Learning Analytics Goes to School: A Collaborative
Approach to Improving Education, Routledge, New York, NY.
Kurzweil, M. and Stevens, M. (2018), “Setting the table: responsible use of student data in higher
education”, EDUCAUSE Review, May/June, pp. 17-24.
Lang, C., Siemens, G., Wise, A. and Gaševic, D. (Eds) (2017), Handbook of Learning Analytics, 1st ed., Society for Learning Analytics Research (SoLAR).
Larsen, R.L. (2010), “iSchools”, in Bates, M.J. and Maack, M.N (Eds), Encyclopedia of Library and
Information Sciences, 3rd ed., CRC Press, Boca Raton, pp. 3018-3023.
Long, P. and Siemens, G. (Eds) (2011), Proceedings of the 1st International Conference on Learning
Analytics and Knowledge, ACM, New York, NY, USA.
O’Neil, C. (2016), Weapons of Math Destruction: How Big Data Increases Inequality and Threatens Democracy, Broadway Books, New York, NY.
Penuel, W.R., Peurach, D.J., LeBoeuf, W.A., Riedy, R., Barber, M., Clark, T.R. and Gabriele, K. (2017),
Defining Collaborative Problem Solving Research: Common Values and Distinctive Approaches,
University of Colorado Boulder, Boulder, CO.
Reich, J. (3 June, 2015), available at: www.kqed.org/mindshift/40719
Rogers, T., Dawson, S. and Gaševic, D. (2016), “Learning analytics and the imperative for theory driven
research”, in Haythornthwaite, C., Andrews, R., Fransman, J. and Meyers, E. (Eds), Handbook of
E-Learning Research, SAGE, London, pp. 232-250.
Rummel, N., Walker, E. and Aleven, V. (2016), “Different futures of adaptive collaborative learning
support”, International Journal of Artificial Intelligence in Education, Vol. 26 No. 2, pp. 784-795.
Schwendimann, B.A., Rodriguez-Triana, M.J., Vozniuk, A., Prieto, L.P., Boroujeni, M.S., Holzer, A.
and Dillenbourg, P. (2017), “Perceiving learning at a glance: a systematic literature review
of learning dashboard research”, IEEE Transactions on Learning Technologies, Vol. 10
No. 1, pp. 30-41.
Sedrakyan, G., Malmberg, J., Verbert, K., Järvelä, S. and Kirschner, P.A. (2018), “Linking learning
behavior analytics and learning science concepts: designing a learning analytics dashboard for
feedback to support learning regulation”, Computers in Human Behavior.
Shum, S.B. and Ferguson, R. (2012), “Social learning analytics”, Journal of Educational Technology and
Society, Vol. 15 No. 3, pp. 3-26.
Siemens, G. and Baker, R. (2012), “Learning analytics and educational data mining: towards
communication and collaboration”, Proceedings of the 2nd International Conference on Learning
Analytics and Knowledge, ACM, Vancouver, Canada, pp. 252-254.
Siemens, G. and Long, P. (2011), “Penetrating the fog: analytics in learning and education”,
EDUCAUSE Review, Vol. 46 No. 5, pp. 30-32.
Slade, S. and Prinsloo, P. (2013), “Learning analytics: ethical issues and dilemmas”, American
Behavioral Scientist, Vol. 57 No. 10, pp. 1509-1528.
Straub, E.T. (2009), “Understanding technology adoption: theory and future directions for informal learning”, Review of Educational Research, Vol. 79 No. 2, pp. 625-649.
Teasley, S.D. (2017), “Student facing dashboards: one size fits all?”, Technology, Knowledge and Learning, Vol. 22 No. 3, pp. 377-384.
Vieira, C., Parsons, P. and Byrd, V. (2018), “Visual learning analytics of educational data: a systematic
literature review and research agenda”, Computers and Education, Vol. 122, pp. 119-135.
Winne, P.H. (2017), “Learning analytics for self-regulated learning”, in Lang, C., Siemens, G., Wise, A. and Gaševic, D. (Eds), Handbook of Learning Analytics, 1st ed., The Society for Learning Analytics Research, Beaumont, AB, pp. 241-249.
Winstone, N.E., Nash, R.A., Parker, M. and Rowntree, J. (2017), “Supporting learners’ agentic
engagement with feedback: a systematic review and a taxonomy of recipience processes”,
Educational Psychologist, Vol. 52 No. 1, pp. 17-37.
Wise, A. (2014), “Designing pedagogical interventions to support student use of learning analytics”,
Proceedings of the International Conference on Learning Analytics and Knowledge, ACM,
Indianapolis, IN, pp. 203-211.
Wise, A.F. and Schwarz, B.B. (2017), “Visions of CSCL: eight provocations for the future of the field”,
International Journal of Computer-Supported Collaborative Learning, Vol. 12 No. 4, pp. 423-467.
Yoo, Y., Lee, H., Jo, I.H. and Park, Y. (2015), “Educational dashboards for smart learning: review of case
studies”, In Chen, G., Kumar, V., Kinshuk, Huang, R. and Kong, S.C. (Eds), Emerging Issues in
Smart Learning, Springer, Berlin Heidelberg, pp. 145-155.
Further reading
Aguilar, S. (2016), “Perceived motivational affordances: Capturing and measuring students’ sense-
making around visualizations of their academic achievement information”, (Doctoral
Dissertation), University of Michigan, Ann Arbor, MI.
Prinsloo, P. and Slade, S. (2017), “Ethics and learning analytics: charting the (un)charted”, in Lang, C., Siemens, G., Wise, A. and Gaševic, D. (Eds), Handbook of Learning Analytics, 1st ed., The Society for Learning Analytics Research, pp. 49-57.
Corresponding author
Stephanie Danell Teasley can be contacted at: steasley@umich.edu