Editorial: Learning design, teacher inquiry into student learning
and learning analytics: A call for action
Yishay Mor, Rebecca Ferguson and Barbara Wasson
Yishay Mor is a consultant in educational technology and design research, at http://www.yishaymor.org/. Rebecca
Ferguson is a lecturer in the Institute of Educational Technology at The Open University, Milton Keynes, MK7 6AA,
UK. Barbara Wasson is Professor in the Department of Information Science and Media Studies, University of
Bergen, PostBox 7800, 5020, Bergen, Norway. Address for correspondence: Dr. Yishay Mor, 36 Penderyn way, N7
0EW, London, UK. Email: yishaym@gmail.com
Abstract
This special issue deals with three areas. Learning design is the practice of devising
effective learning experiences aimed at achieving defined educational objectives in a
given context. Teacher inquiry is an approach to professional development and capacity
building in education in which teachers study their own and their peers’ practice. Learning
analytics use data about learners and their contexts to understand and optimise
learning and the environments in which it takes place. Typically, these three—design,
inquiry and analytics—are seen as separate areas of practice and research. In this issue,
we show that the three can work together to form a virtuous circle. Within this circle,
learning analytics offers a powerful set of tools for teacher inquiry, feeding back into
improved learning design. Learning design provides a semantic structure for analytics,
whereas teacher inquiry defines meaningful questions to analyse.
Introduction
Learning design or, as some prefer, design for learning (Beetham & Sharpe, 2013; Laurillard,
2013), is an emerging field of educational research and practice. Its practitioners are interested
in understanding how the intuitive processes undertaken by teachers and trainers can be made
visible, shared, exposed to scrutiny and consequently made more effective and efficient. Craft and
Mor (2012) define learning design as “the creative and deliberate act of devising new practices,
plans of activity, resources and tools aimed at achieving particular educational aims in a given
context.” The emphasis on this activity as both “creative and deliberate” highlights the dual
nature of design, and in particular learning design, as both a creative practice and a rigorous
inquiry. This tension is discussed in theoretical studies of the nature and value of design (Cross,
2001; Latour, 2008; Schön, 1992a).
The learning design approach advocates a shift from a focus on content to a focus on the learning
experience, with an aim of guiding learners as they make a transition from an initial state of mind
to a desired one. In most design disciplines there is usually a clear division of labour between
designers, developers and evaluators. An architect designs a building, which a contractor builds.
An inspector verifies that it meets all the required standards. Although a similar dynamic is
possible, and often present, in learning design, there is a growing interest in alternative models in
which teachers are empowered as designers and researchers of learning (Bannan-Ritland, 2008;
Kali, McKenney & Sagy, 2012).
The virtues of such an approach are multiple. In terms of the specific designed learning
experience, teachers have the advantage of an intimate knowledge of the context of learning and the
British Journal of Educational Technology Vol 46 No 2 2015 221–229
doi:10.1111/bjet.12273
© 2015 British Educational Research Association
characteristics of the learners. This means they have the capacity to produce a design that is well
fit for purpose. In terms of the teacher engaged in design, research has shown the advantage of
engaging in design for developing their pragmatic as well as their theoretical skills (Mor &
Mogilevsky, 2013; Voogt et al, 2011).
Shifting the locus of design work to teachers, as Holmberg (2014) argues, is in line with Schön’s
model of design as reflective practice, a “conversation with materials,” during which practitioners
attentively introduce innovations into their environment, observe their effects and adjust them
until they achieve the desired effect. Schön argued for the potential of reconceptualising
educational practice as design: “from the perspective of designing as learning and learning as
designing, the teaching/learning process could be seen, at its best, as a collaborative, communicative
process of design and discovery” (Schön, 1992b, p. 133).
In Schön’s view, this is the antidote for what he calls the dilemma of abandonment:
practitioners may feel, in relation to the academy, a sense of having been seduced and abandoned. [. . .] an
erosion of practitioners’ faith in the ability of academic research to deliver knowledge usable for solving
social problems. [. . .] On the other hand, when practitioners accept and try to use the academy’s esoteric
knowledge they are apt to discover that its appropriation alienates them from their own understandings,
engendering a loss of their sense of competence and control.
He concludes that “nowhere are these dilemmas more apparent than in the field of education”
(Schön, 1992b, p. 120).
By contrast, adopting a design mindset, “when we attend to what we know already, appreciating
the artistry and wisdom implicit in competent practice, believing that by reflection on that
practice we can make some of our tacit knowledge explicit, we take on a ‘reflective turn’, that
leads us to see students and teachers (at their best) as participants in a kind of reflective practice,
a communicative and self-reflective practice of reciprocal inquiry” (Schön, 1992b, p. 123). This
is perhaps the greatest advocacy of learning design ever written, and it was written almost a
decade before the term was coined (Dalziel et al, 2013).
Schön defined his concepts of design with reference to Dewey’s legacy of inquiry (Schön, 1992b).
Teacher inquiry adopts Dewey’s inquiry-based learning as a framework for teachers’ professional
development. Using this approach, teachers develop conjectures about students’ learning, and
then use available data or conduct classroom experiments in order to test these conjectures. In so
doing, they enhance both their theoretical knowledge and their practical skills.
Teacher inquiry, which emerged in the late 1980s (Cochran-Smith & Lytle, 1999), can be seen
both as a way to improve day-to-day teaching in the classroom and as professional development
for teachers. Clarke and Erickson (2003) define teacher inquiry as a set of research practices by
which teachers examine their practice and its effects on students’ learning, in order to enhance
their professional knowledge and improve their practice. Acknowledgement of inquiry as an
inherent component of teaching practice, argue Clarke and Erickson, is essential to maintaining
the professional standards of teaching and blocking the risk of disempowerment of educators and
de-professionalisation of education by over-regulation and standardisation of curricula and
testing regimes.
In their extensive survey of the literature, Clark et al identified that:
key characteristics that may contribute to a broadly conceptualised definition of teacher inquiry include the
notion that it is: systematic, intentional, contextual, self-critical, practical, action oriented, planned,
evidence-based, evaluative, and shared, and the main challenge lies in transforming teachers’ personal
skills, knowledge, and expertise into professional skills, knowledge and expertise. (Clark, Luckin & Jewitt,
2011, p. 8)
Traditionally, teacher inquiry has relied predominantly on qualitative methods, including teacher
journals, oral examinations, analysis of interviews, texts, student productions and observations,
peer-observation, and conceptual desk research of classroom issues (Lytle & Cochran-Smith,
1990). More recently, there has been a focus on teacher inquiry and on the role of student data
as evidence. For example, Timperley, Wilson, Barrar and Fung (2007) noted that in all of the 97
studies of teacher inquiry they reviewed, student data in the form of assessments were used to
provide an analysis of the teaching–learning relationship for the purpose of improving teaching.
Taking this a step further, teacher inquiry into student learning (TISL) focuses on the evidence
base of student learning data that is available in technology-rich environments and its role in
improving practice.
TISL is a systematic, intentional, design-oriented approach to teachers’ technology-supported inquiry into
students’ learning. It focuses on the development and use of formative e-assessment methods using
advanced learning technologies (digital tools) to capture, analyse, interpret, share and evaluate student
data. In so doing, it aims to contribute to the development of teacher professionalism and school
improvement through a focus on teacher-centred, practice-based, evidence-oriented research activity. (Clark et al,
2011, p. 13)
Today’s classrooms are technology-rich environments, and the key contribution of the TISL
approach is in acknowledging and harnessing the opportunities created by technology to support
a more systematic, data-informed mode of practitioner research.
A second limitation of traditional teacher inquiry approaches has been the fragmentation of
inquiry efforts. Teachers have typically conducted their investigations either individually, or in
small local groups. We need a teacher inquiry method that is initiated by a top-down change but
encompasses the practice of many teachers. This will help to shift inquiry away from a focus on
individual practice and towards a more collaborative approach, aggregating the findings of mul-
tiple inquiries into a robust body of knowledge. Again, technology has the potential to enable this
change—by supporting large communities of inquiry.
Garrison, Anderson and Archer (1999, 2010) have identified three dimensions that determine
the effectiveness of an online community of inquiry: cognitive presence, social presence and
teaching presence. Online environments for TISL have the potential to raise the quality of teacher
inquiry, both in terms of the scientific quality, and in terms of the value for professional
development, by offering scaffolds for positive cognitive and social presence, and opportunities for teacher
trainer (or researcher) presence.
The data-informed approach to teacher inquiry, a key quality of TISL, is part of a wider trend in
education, towards the use of data and analytics to inform educational practice. Learning
analytics are defined as “the measurement, collection, analysis and reporting of data about
learners and their contexts, for purposes of understanding and optimising learning and the
environments in which it occurs” (Ferguson, 2012). Learning analytics typically employ large
datasets to provide real-time or retrospective insights about the effect and effectiveness of various
elements and features of learning environments. Learners and educators can use these analytics
to review work that has been done in the past, to support current study and to access recommen-
dations for future activities. This is a very young field—yet some impressive results have already
been reported (Arnold & Pistilli, 2012).
Learning analytics are rooted in data science, artificial intelligence, and practices of
recommender systems, online marketing and business intelligence. The tools and techniques developed
in these domains make it possible to identify trends and patterns, and then benchmark individuals
or groups against these trends. For example, analysing the activity logs of students watching
videos in online courses allows researchers to identify parameters such as video length that
influence engagement (Guo, Kim & Rubin, 2014; Kim et al, 2014). Discourse and content
analysis can be used to help identify effective and ineffective discussion and argumentation in
collaborative online learning environments (Buckingham Shum & Ferguson, 2012). Analysis of
online discussions shows that the frequency of activity—the number of chat messages a student
sends—is not necessarily correlated with success, but the type of activity (eg, seeking technical
advice vs. questions regarding the course materials) is a predictor of final grades (Abdous, He &
Yen, 2012). Using such insights, educators can identify at-risk students based on content analysis
of their online activity.
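The kind of content-based signal described above can be sketched in a few lines. This is an illustrative toy, not the method of Abdous, He and Yen (2012): the keyword lists and the `needs_attention` rule are invented for the example, and a real system would use a trained classifier rather than keyword matching.

```python
# Illustrative sketch: tagging forum messages by type and using type counts,
# rather than raw message frequency, as a signal for students who may need
# attention. Keyword lists and the decision rule are invented assumptions.
from collections import Counter

TECHNICAL_KEYWORDS = {"login", "password", "error", "browser", "upload"}
CONTENT_KEYWORDS = {"theorem", "chapter", "definition", "assignment", "why"}

def tag_message(text: str) -> str:
    """Crudely classify a message as technical, content-related or other."""
    words = set(text.lower().split())
    if words & TECHNICAL_KEYWORDS:
        return "technical"
    if words & CONTENT_KEYWORDS:
        return "content"
    return "other"

def activity_profile(messages: list[str]) -> Counter:
    """Count messages of each type for one student."""
    return Counter(tag_message(m) for m in messages)

def needs_attention(messages: list[str]) -> bool:
    """Flag students whose activity is dominated by technical trouble,
    or who do not engage at all."""
    profile = activity_profile(messages)
    if sum(profile.values()) == 0:
        return True  # no engagement at all
    return profile["technical"] > profile["content"]

student_a = ["Why does the theorem in chapter 2 hold?", "Assignment question"]
student_b = ["I get an error on login", "The upload fails in my browser"]
print(needs_attention(student_a))  # False: mostly content questions
print(needs_attention(student_b))  # True: only technical trouble
```

The point of the sketch is the shift it embodies: the feature that carries predictive weight is what the student talks about, not how often she posts.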
Dietz-Uhler and Hurn (2013) review eight institutions that have incorporated learning analytics
into their teaching and learning systems and list a variety of uses they observed. Learning
analytics can provide near-real-time insights into learner activity and experience, enabling
faculty to make suggestions to students that will help them succeed. By monitoring student
activity and performance throughout a course, learning analytics can help faculty identify and
redress weaknesses in course design and implementation.
Dietz-Uhler and Hurn list the potential benefits of learning analytics, according to Long and
Siemens (2011). From an institutional perspective, learning analytics can improve decision
making and resource allocation, highlight an institution’s successes and challenges and increase
organisational productivity. From the perspective of a faculty member, learning analytics can
help to identify at-risk learners and provide interventions, transform pedagogical approaches and
help students gain insight into their own learning. Using learning analytics, staff can identify
productive and counter-productive patterns of learner behaviour. The productive models can be
suggested to learners as guidance, the counterproductive can be used to identify at-risk learners.
Learning analytics can identify sections of a course that most learners skim or skip without any
negative effect, suggesting that these elements are redundant, and the sections most students
struggle with, which may need to be reworked. Some institutions offer students a learning
analytics dashboard as part of their learning environment, which can help them to monitor and
manage their own learning processes.
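A minimal sketch of the skim-or-struggle analysis described above, with invented event data and thresholds; a real dashboard would draw on platform logs rather than a hard-coded list, and the thresholds would be tuned per course.

```python
# Illustrative sketch: per-section skim and struggle rates from dwell-time
# logs, surfacing sections most learners skip (candidates for removal) and
# sections where most learners linger (candidates for reworking).
from collections import defaultdict

# Hypothetical log entries: (student, section, seconds spent).
events = [
    ("s1", "intro", 40), ("s1", "appendix", 2), ("s1", "proofs", 300),
    ("s2", "intro", 55), ("s2", "appendix", 3), ("s2", "proofs", 260),
    ("s3", "intro", 35), ("s3", "appendix", 1), ("s3", "proofs", 290),
]

SKIM_THRESHOLD = 10       # seconds below which a visit counts as a skim
STRUGGLE_THRESHOLD = 200  # seconds above which a visit suggests difficulty

def section_report(events):
    times = defaultdict(list)
    for _, section, seconds in events:
        times[section].append(seconds)
    report = {}
    for section, ts in times.items():
        report[section] = {
            "skim_rate": sum(t < SKIM_THRESHOLD for t in ts) / len(ts),
            "struggle_rate": sum(t > STRUGGLE_THRESHOLD for t in ts) / len(ts),
        }
    return report

report = section_report(events)
redundant = [s for s, r in report.items() if r["skim_rate"] > 0.8]
rework = [s for s, r in report.items() if r["struggle_rate"] > 0.8]
print(redundant)  # ['appendix']: almost everyone skims it
print(rework)     # ['proofs']: almost everyone lingers here
```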
A current problem is that the information provided by learning analytics tools is not generally
aligned with teachers’ needs for the management of learning activities. We need to be aware that
the pedagogical decisions embedded in learning designs affect both the learning analytics process
and its outcomes. We also need to take into account that mixed approaches will be necessary to
represent and understand the complex dynamics of groups of students in online learning environments.
Challenges and synergies
Learning design, TISL and learning analytics all offer significant value for educational practice,
research and development. Yet all three have their challenges. Together, these approaches
complement each other, in that the advantages of one provide a remedy for the shortcomings of the others.
In the opening paragraphs of this paper, we adopted a view of design as simultaneously a creative
practice and a rigorous inquiry. Arguably, most of the work in the field of learning design has
focused on the creative processes, on practices, tools and representations to support it and on
mechanisms for sharing its outputs between practitioners. Very little has been done in terms of
the practices, tools and representations used for evaluating the effects of the designs.
Several approaches emphasise top-down quality enhancement, helping designers to base
their work on sound pedagogical principles. What is missing is the trajectory that would complete
the feedback loop: the built-in evaluation of designs to see whether they achieved the expected
outcomes.
Schön’s “reflective conversation with the situation,” whereby practitioners design an innovation,
introduce it, observe its effects and refine it, may occur spontaneously and intuitively—but it
remains tacit. There are no structured mechanisms to support it, and no standards for sharing the
findings from such an inquiry. The TISL framework offers exactly such mechanisms. Embedding
design work in a broader cycle of structured inquiry offers multiple benefits from a learning
design perspective. First and foremost, it would enhance the quality of the designed artefacts, by
exposing them to scrutiny, validating the areas in which they achieve their aims and highlighting
the areas that require refinement. Consequently, it would increase the credibility of these designs
and of the practices of learning design in general.
TISL also stands to benefit from a dialogue with learning design. Inquiry, as proposed by Dewey,
is a learning process modelled on the practices of scientific work. Teachers frame conjectures,
identify the suitable methods for testing these conjectures, conduct empirical work to implement
these tests, and share and reflect on the outcomes. Indeed, teacher inquiry has also been referred
to as teacher research or classroom action research. Yet, when we say “scientific work,” what do
we mean? The dominant interpretation in this context would be based on the paradigms of social
sciences. By contrast, Laurillard (2013) and Mor and Winters (2007) propose Simon’s notion of
design science as a valuable alternative (Simon, 1996). Indeed, this approach is the basis for
much of the work in educational design, even if explicit reference to Simon is rare. Adopting
design science as the frame of reference would enable TISL to focus explicitly on value-driven
positive change and articulate the means of promoting such change. Learning design can offer an
arsenal of tools and representations appropriate for such a perspective.
Combining learning design and TISL can thus give birth to a “citizen design science of learning.”
What this combination still lacks is its scientific instruments: how can we actually measure the
effects of a design? Learning analytics offer an answer to this question. Consider the following
scenario: a teacher identifies a challenge in teaching a mathematical concept. Based on a review
of relevant literature and examples of past innovations in the field, she devises a lesson plan that
combines an off-the-shelf game with an online discussion. She then defines metrics for the
discussion, which will serve as indicators for learning. Now she uses the game’s learning analytics
dashboard and the online discussion system’s content analysis module to evaluate and refine her
lesson plan. Once she is satisfied with her results, she shares them with peers, who replicate her
experiment and calibrate her findings. Clearly, this is not a randomised controlled trial. However,
what it loses in scientific validity, it gains in practical relevance. In many ways, this is perhaps as
close as we may wish to get to research-based educational practice.
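The metrics the teacher defines in this scenario could be as simple as the following sketch; the post format and the choice of indicators are assumptions made for illustration, not a prescription.

```python
# Illustrative sketch of a discussion metric a teacher might define as an
# indicator for learning: participation breadth and the share of posts that
# build on a peer's post. Both indicators are hypothetical examples.
def discussion_metrics(posts):
    """posts: list of (author, replies_to) tuples, where replies_to is None
    for a thread-starting post."""
    authors = {author for author, _ in posts}
    replies = [p for p in posts if p[1] is not None]
    return {
        "participants": len(authors),
        "reply_ratio": len(replies) / len(posts) if posts else 0.0,
    }

# Before refining the lesson plan: mostly isolated statements.
before = [("ana", None), ("ben", None), ("caz", None), ("ana", None)]
# After: students engage with each other's ideas.
after = [("ana", None), ("ben", "ana"), ("caz", "ben"), ("ana", "caz")]

print(discussion_metrics(before)["reply_ratio"])  # 0.0
print(discussion_metrics(after)["reply_ratio"])   # 0.75
```

Comparing the indicator across iterations of the lesson plan is what turns the dashboard reading into an inquiry cycle: the metric is only meaningful because the teacher defined it against her design intent.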
The field of learning analytics faces considerable challenges, some of which could be addressed by
a closer dialogue with learning design and TISL. Some of these are technical, some political,
cultural or administrative, some ethical (Dietz-Uhler & Hurn, 2013; Ferguson, 2012). These need
to be resolved before the value of learning analytics for learning design and TISL can be fully
realised. Yet there are some challenges that can only be addressed by introducing the vocabulary
of learning design into the repertoire of learning analytics. Learning analytics can identify
patterns of learner activity, interaction and conversation. They can identify generically beneficial
patterns and generically obstructive ones. Clearly, if a learner does not engage with the learning
material at all, or if her conversations in the course forum are limited to her struggles with
technical issues, she needs attention. The more complex patterns can only be interpreted in
relation to the design of the learning activity: if we set learners on a path of collaborative inquiry,
we expect them first to negotiate a research question, then curate research methods, brainstorm
an experimental design, divide the work between them, coordinate their individual efforts and
finally converge to discuss their findings. In each phase of this cycle, we would expect to find
different patterns of action, interaction and conversation.
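This dependence of interpretation on design can be made concrete with a small sketch; the phase names and expected activity types below are invented for illustration, not drawn from any of the papers in this issue.

```python
# Illustrative sketch: the same observed activity is interpreted differently
# depending on the phase of the designed inquiry cycle. The learning design
# supplies the expectations; the analytics supply the observations.
EXPECTED = {
    "negotiate_question": {"discussion"},
    "plan_experiment": {"discussion", "editing"},
    "divide_work": {"coordination"},
    "individual_work": {"editing"},
    "converge": {"discussion"},
}

def flag_mismatch(phase: str, observed: set[str]) -> bool:
    """True when none of the observed activity types match what the
    learning design leads us to expect in this phase."""
    return not (observed & EXPECTED[phase])

# Heavy editing during the negotiation phase is flagged for attention,
# but the same behaviour during individual work is not.
print(flag_mismatch("negotiate_question", {"editing"}))  # True
print(flag_mismatch("individual_work", {"editing"}))     # False
```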
As mentioned above, learning analytics holds clear potential benefits for educators and students.
These benefits will remain theoretical unless the potential beneficiaries engage in an active
process of inquiry into learning. Without such a process, there is a risk that the signals provided
by learning analytics systems will be ignored, or worse—misinterpreted.
The linking of learning design and teacher inquiry is aligned both with Schön’s conception of
design as inquiry and with Simon’s model of design science. Learning analytics provide the
instrument for making this design inquiry of learning truly powerful.
Emin-Martínez et al (2014) take a first step towards this integration and propose a teacher-led
design inquiry of learning as a new model of educational practice and professional development.
The model is designed to capture the essence of the synergy of learning design, TISL and learning
analytics.
The papers in this special issue
Persico and Pozzi (2014) provide a systematic review of the field of learning design and map the
potential contributions of learning analytics along three dimensions: representations, methods
and tools. The authors identify the problems implicit in using a multitude of different represen-
tations, approaches and tools that are not interoperable. Such a review is useful in framing the
discussion in the other papers and any future discussion in this area.
Avramides, Hunter, Oliver and Luckin (2014) examine a case study in which an educational
innovation is introduced by school leadership, through a process of collaborative teacher inquiry.
They highlight the importance of context and demonstrate how the combination of learning
design, TISL and learning analytics could provide an effective model of institutional change.
Apart from the advantages discussed above, this model dissolves resistance to change by
empowering teachers as partners in the process while reinforcing the leadership of the head teacher.
McKenney and Mor (2015) discuss the potential and challenges of a prospective system
combining learning design, TISL and learning analytics through a retrospective analysis of
CASCADE-SEA, a learning design system that has been used successfully and evaluated systematically in the
past. From this analysis, they derive design principles for the development of future systems.
Ghislandi and Raffaghelli (2015) demonstrate the potential of combining learning design, TISL
and learning analytics to raise educational quality. They describe a teacher inquiry that is
modelled as a design experiment and driven by qualitative and quantitative data on student activity.
Their study illustrates how a process of teacher inquiry can draw on both the practices and
representations of educational design research and the tools and methods of learning design,
making use of the data afforded by educational technologies.
Haya, Daems, Malzahn, Castellanos and Hoppe (2015) consider the complex pedagogical
framework of the JuxtaLearn project, which involves student production, team work and critical
discussions between teams. Such situations, while very rich from a learning science perspective, are
often hard to manage from a pragmatic point of view. Teachers find it challenging to monitor
students’ behaviour and consequently to evaluate their learning designs. The authors introduce
social learning analytics and content analysis tools, tailored to the dynamics dictated by the
design, and present the outputs of these tools in a manner that is accessible to teachers. Thus, the
development of design-specific analytics tools that are embedded in a TISL process enables
teachers and students to benefit from a richer pedagogy.
Location-based games are another example of a pedagogical scenario that is hard to monitor.
Melero, Hernández-Leo, Sun, Santos and Blat (2015) present a learning design dashboard
developed to support teachers in such a setting and illustrate how it transformed teachers’ inquiry
strategies, enabling them to make evidence-based learning design decisions. As with the paper by
Haya et al, this study shows how the combination of learning design, TISL and learning analytics
can open the door to effective new pedagogies.
Both the JuxtaLearn scenario and the location-based game scenario are instances of blended
computer-supported collaborative learning design. Rodríguez-Triana, Martínez-Monés,
Asensio-Pérez and Dimitriadis (2014) propose a monitoring-aware pattern-based design process
that not only allows educators to benefit from the outputs of learning analytics to improve their
design, but also encourages them to design their activities in a manner that would maximise the
effectiveness of learning analytics. Their empirical findings confirm the value of an approach that
combines learning design, TISL and learning analytics and suggest that indeed, in this case, the
sum is greater than the parts; monitoring-aware design improves both design and monitoring
and expands the pedagogical repertoire of teachers.
Together, these papers have implications for the practice of educational establishments, teachers
and those involved in the development and deployment of learning analytics.
Schools, colleges and universities should be aware of the possibilities that are opened up by
bringing together design, inquiry and analytics, and they need to consider carefully how these
combined activities can be encouraged and supported. Although teachers engage naturally in
learning design, top-down pressures can have a heavy impact on the designs that they produce.
For learning activities to be aligned with the requirements of particular educational situations,
teachers need to be actively involved in their design and supported to do this.
Learning analytics and the associated visualisations and dashboards can be used to improve
learning design and to support inquiry into students’ learning activities in the classroom. In the
wider context, engaging the whole educational community extends the benefits of design,
inquiry and analytics. Sharing analytics with students enhances their opportunities for
self-assessment, whereas opening up design for learning provides a way to share the quality principles
that govern education. This can have a positive impact on understanding, negotiating and
innovating for educational quality.
Within a single classroom, a teacher inquiry may have limited impact. However, an inquiry that
is coordinated by a lead teacher or by management can encourage individuals or groups of
teachers to carry out subsequent inquiries with implications for the whole department or school.
Considering the interplay between design, inquiry and analytics across the institution will help to
avoid duplication of effort and help to develop communities of practice. In order to do this
effectively and to improve the quality of teaching and learning, training is needed in these areas,
and this requires time and resources to be set aside for staff development.
Conclusion
This special issue seeks to explore the synergies between learning design, TISL and learning
analytics. Research has generally taken learning design, TISL and learning analytics to be
separate activities, and epistemic research in the domains has seen relatively little convergence.
However, as this special issue editorial argues, they can be seen as complementary endeavours,
each informing and improving the others. The seven papers in the issue provide a rich picture of
current efforts to take advantage of these synergies.
References
Abdous, M., He, W. & Yen, C.-J. (2012). Using data mining for predicting relationships between online
question theme and final grade. Educational Technology & Society,15, 3, 77–88.
Arnold, K. E. & Pistilli, M. (2012). Course signals at Purdue: using learning analytics to increase student success.
Paper presented at the LAK12: 2nd International Conference on Learning Analytics and Knowledge (30
April–2 May), Vancouver, Canada.
Avramides, K., Hunter, J., Oliver, M. & Luckin, R. (2014). A method for teacher inquiry in cross-curricular
projects: lessons from a case study. British Journal of Educational Technology,46, 2, 249–264.
Bannan-Ritland, B. (2008). Teacher design research: an emerging paradigm for teachers’ professional
development. In R. Lesh, A. Kelly & J. Baek (Eds), Handbook of design research methods in education (pp.
246–262). New York: Routledge.
Beetham, H. & Sharpe, R. (2013). Rethinking pedagogy for a digital age: designing for 21st-century learning.
London: Routledge.
Learning design, TISL and learning analytics 227
© 2015 British Educational Research Association
Buckingham Shum, S. & Ferguson, R. (2012). Social learning analytics. Educational Technology & Society,
15, 3, 3–26.
Clark, W., Luckin, R. & Jewitt, C. (2011). NEXT-TELL research report D5.1: methods and specifications for TISL
components V1. NEXT-TELL Consortium, European Commission IST-285114.
Clarke, A. & Erickson, G. L. (2003). Teacher inquiry: a defining feature of professional practice. In A. Clarke
& G. Erickson (Eds), Teacher inquiry: living the research in everyday practice (pp. 1–6). London: Psychology
Press.
Cochran-Smith, M. & Lytle, S. L. (1999). The teacher research movement: a decade later. Educational
Researcher,28, 15–25.
Craft, B. & Mor, Y. (2012). Learning Design: mapping the landscape. Research in Learning Technology,20,
85–94.
Cross, N. (2001). Designerly ways of knowing: design discipline versus design science. Design Issues,17,3,
49–55.
Dalziel, J., Conole, G., Wills, S., Walker, S., Bennett, S., Dobozy, E. et al. (2013). The larnaca declaration on
learning design—2013. Retrieved January 1, 2015, from http://www.larnacadeclaration.org.
Dietz-Uhler, B. & Hurn, J. E. (2013). Using learning analytics to predict (and improve) student success: a
faculty perspective. Journal of Interactive Online Learning,12, 1, 17–26.
Emin-Martínez, V., Hansen, C., Rodríguez-Triana, M. J., Wasson, B., Mor, Y., Dascalu, M., Ferguson, R.
& Pernin, J.-P. (2014). Towards teacher-led design inquiry of learning. eLearning Papers. Retrieved
March 6, 2015, from http://openeducationeuropa.eu/en/article/Towards-Teacher-led-Design-Inquiry-of-Learning?paper=134810.
Ferguson, R. (2012). Learning analytics: drivers, developments and challenges. International Journal of
Technology Enhanced Learning (IJTEL), 4, 5/6, 304–317.
Garrison, D. R., Anderson, T. & Archer, W. (1999). Critical inquiry in a text-based environment: computer
conferencing in higher education. The Internet and Higher Education, 2, 2, 87–105.
Garrison, D. R., Anderson, T. & Archer, W. (2010). The first decade of the community of inquiry framework:
a retrospective. The Internet and Higher Education, 13, 1, 5–9.
Ghislandi, P. & Raffaghelli, J. (2015). Forward-oriented designing for learning as a means to achieve
educational quality. British Journal of Educational Technology, 46, 2, 280–299.
Guo, P. J., Kim, J. & Rubin, R. (2014). How video production affects student engagement: an empirical study of
MOOC videos. Paper presented at the Learning@Scale Conference.
Haya, P., Daems, O., Malzahn, N., Castellanos, J. & Hoppe, H. (2015). Analysing content and patterns of
interaction for improving the learning design of networked learning environments. British Journal of
Educational Technology, 46, 2, 300–316.
Holmberg, J. (2014). Studying the process of educational design—revisiting Schön and making a case for
reflective design-based research on teachers’ ‘conversations with situations’. Technology, Pedagogy and
Education, 23, 3, 293–310.
Kali, Y., McKenney, S. & Sagy, O. (2012). Teachers as designers of technology enhanced learning.
Instructional Science, 1–7.
Kim, J., Guo, P. J., Seaton, D. T., Mitros, P., Gajos, K. Z. & Miller, R. C. (2014). Understanding in-video
dropouts and interaction peaks in online lecture videos. Paper presented at the Learning@Scale
conference.
Latour, B. (2008). A cautious Prometheus? A few steps toward a philosophy of design (with special attention to
Peter Sloterdijk). Paper presented at the Proceedings of the 2008 Annual International Conference of the
Design History Society.
Laurillard, D. (2013). Teaching as a design science: building pedagogical patterns for learning and technology. New
York / Oxon: Routledge.
Long, P. & Siemens, G. (2011). Penetrating the fog: analytics in learning and education. EDUCAUSE Review,
46, 5, 31–40.
Lytle, S. & Cochran-Smith, M. (1990). Learning from teacher research: a working typology. The Teachers
College Record, 92, 1, 83–103.
McKenney, S. & Mor, Y. (2015). Supporting teachers in data-informed educational design. British Journal of
Educational Technology, 46, 2, 265–279.
Melero, J., Hernández-Leo, D., Sun, J., Santos, P. & Blat, J. (2015). How was the activity? A visualization
support for a case of location-based learning design. British Journal of Educational Technology,
46, 2, 317–329.
Mor, Y. & Mogilevsky, O. (2013). The learning design studio: collaborative design inquiry as teachers’
professional development. Research in Learning Technology, 21. doi: http://dx.doi.org/10.3402/rlt.v21i0.22054
Mor, Y. & Winters, N. (2007). Design approaches in technology-enhanced learning. Interactive Learning
Environments, 15, 1, 61–75.
Persico, D. & Pozzi, F. (2014). Informing learning design with learning analytics to improve teacher inquiry.
British Journal of Educational Technology, 46, 2, 230–248.
Rodríguez-Triana, M. J., Martínez-Monés, A., Asensio-Pérez, J. I. & Dimitriadis, Y. (2014). Scripting and
monitoring meet each other: aligning learning analytics and learning design to support teachers in
orchestrating CSCL situations. British Journal of Educational Technology, 46, 2, 330–343.
Schön, D. A. (1992a). Designing as reflective conversation with the materials of a design situation. Research
in Engineering Design, 3, 3, 131–147.
Schön, D. A. (1992b). The theory of inquiry: Dewey’s legacy to education. Curriculum Inquiry, 22, 2,
119–139.
Simon, H. A. (1996). The sciences of the artificial (Vol. 136). Cambridge, MA: MIT Press.
Timperley, H., Wilson, A., Barrar, H. & Fung, I. (2007). Teacher professional learning and development: best
evidence synthesis iteration. Wellington, New Zealand: Ministry of Education.
Voogt, J., Westbroek, H., Handelzalts, A., Walraven, A., McKenney, S., Pieters, J. et al. (2011). Teacher
learning in collaborative curriculum design. Teaching and Teacher Education, 27, 8, 1235–1244.