Editorial: Learning design, teacher inquiry into student learning
and learning analytics: A call for action
Yishay Mor, Rebecca Ferguson and Barbara Wasson
Yishay Mor is a consultant in educational technology and design research, at http://www.yishaymor.org/. Rebecca
Ferguson is a lecturer in the Institute of Educational Technology at The Open University, Milton Keynes, MK7 6AA,
UK. Barbara Wasson is Professor in the Department of Information Science and Media Studies, University of
Bergen, PostBox 7800, 5020, Bergen, Norway. Address for correspondence: Dr. Yishay Mor, 36 Penderyn way, N7
0EW, London, UK. Email: yishaym@gmail.com
Abstract
This special issue deals with three areas. Learning design is the practice of devising
effective learning experiences aimed at achieving defined educational objectives in a
given context. Teacher inquiry is an approach to professional development and capacity
building in education in which teachers study their own and their peers’ practice. Learn-
ing analytics use data about learners and their contexts to understand and optimise
learning and the environments in which it takes place. Typically, these three—design,
inquiry and analytics—are seen as separate areas of practice and research. In this issue,
we show that the three can work together to form a virtuous circle. Within this circle,
learning analytics offers a powerful set of tools for teacher inquiry, feeding back into
improved learning design. Learning design provides a semantic structure for analytics,
whereas teacher inquiry defines meaningful questions to analyse.
Introduction
Learning design or, as some prefer, design for learning (Beetham & Sharpe, 2013; Laurillard,
2013), is an emerging field of educational research and practice. Its practitioners are interested
in understanding how the intuitive processes undertaken by teachers and trainers can be made
visible, shared, exposed to scrutiny and consequently made more effective and efficient. Craft and
Mor (2012) define learning design as “the creative and deliberate act of devising new practices,
plans of activity, resources and tools aimed at achieving particular educational aims in a given
context.” The emphasis on this activity as both “creative and deliberate” highlights the dual
nature of design, and in particular learning design, as both a creative practice and a rigorous
inquiry. This tension is discussed in theoretical studies of the nature and value of design (Cross,
2001; Latour, 2008; Schön, 1992a).
The learning design approach advocates a shift from a focus on content to a focus on the learning
experience, with an aim of guiding learners as they make a transition from an initial state of mind
to a desired one. In most design disciplines there is usually a clear division of labour between
designers, developers and evaluators. An architect designs a building, which a contractor builds.
An inspector verifies that it meets all the required standards. Although a similar dynamic is
possible, and often present, in learning design, there is a growing interest in alternative models in
which teachers are empowered as designers and researchers of learning (Bannan-Ritland, 2008;
Kali, McKenney & Sagy, 2012).
The virtues of such an approach are multiple. In terms of the specific designed learning experi-
ence, teachers have the advantage of an intimate knowledge of the context of learning and the
British Journal of Educational Technology Vol 46 No 2 2015 221–229
doi:10.1111/bjet.12273
© 2015 British Educational Research Association
characteristics of the learners. This means they have the capacity to produce a design that is well
fit for purpose. As for the teachers engaged in design, research has shown that engaging in design develops their pragmatic as well as their theoretical skills (Mor & Mogilevsky, 2013; Voogt et al., 2011).
Shifting the locus of design work to teachers, as Holmberg (2014) argues, is in line with Schön’s
model of design as reflective practice, a “conversation with materials,” during which practitioners
attentively introduce innovations into their environment, observe their effects and adjust them
until they achieve the desired effect. Schön argued for the potential of reconceptualising educa-
tional practice as design: “from the perspective of designing as learning and learning as design-
ing, the teaching/learning process could be seen, at its best, as a collaborative, communicative
process of design and discovery” (Schön, 1992b, p. 133).
In Schön’s view, this is the antidote for what he calls the dilemma of abandonment:
practitioners may feel, in relation to the academy, a sense of having been seduced and abandoned. [. . .] an
erosion of practitioners’ faith in the ability of academic research to deliver knowledge usable for solving
social problems. [. . .] On the other hand, when practitioners accept and try to use the academy’s esoteric
knowledge they are apt to discover that its appropriation alienates them from their own understandings,
engendering a loss of their sense of competence and control.
He concludes that “nowhere are these dilemmas more apparent than in the field of education”
(Schön, 1992b, p. 120).
By contrast, adopting a design mindset, “when we attend to what we know already, appreciating
the artistry and wisdom implicit in competent practice, believing that by reflection on that
practice we can make some of our tacit knowledge explicit, we take on a ‘reflective turn’, that
leads us to see students and teachers (at their best) as participants in a kind of reflective practice,
a communicative and self-reflective practice of reciprocal inquiry” (Schön, 1992b, p. 123). This
is perhaps the greatest advocacy of learning design ever written, and it was written almost a
decade before the term was coined (Dalziel et al, 2013).
Schön defined his concepts of design with reference to Dewey’s legacy of inquiry (Schön, 1992b).
Teacher inquiry adopts Dewey’s inquiry-based learning as a framework for teachers’ professional
development. Using this approach, teachers develop conjectures about students’ learning, and
then use available data or conduct classroom experiments in order to test these conjectures. In so
doing, they enhance both their theoretical knowledge and their practical skills.
Teacher inquiry, which emerged in the late 1980s (Cochran-Smith & Lytle, 1999), can be seen
both as a way to improve day-to-day teaching in the classroom and as professional development
for teachers. Clarke and Erickson (2003) define teacher inquiry as a set of research practices by
which teachers examine their practice and its effects on students’ learning, in order to enhance
their professional knowledge and improve their practice. Acknowledgement of inquiry as an
inherent component of teaching practice, argue Clarke and Erickson, is essential to maintaining
the professional standards of teaching and blocking the risk of disempowerment of educators and
de-professionalisation of education by over-regulation and standardisation of curricula and
testing regimes.
In their extensive survey of the literature, Clark et al identified that:
key characteristics that may contribute to a broadly conceptualised definition of teacher inquiry include the
notion that it is: systematic, intentional, contextual, self-critical, practical, action oriented, planned,
evidence-based, evaluative, and shared, and the main challenge lies in transforming teachers’ personal
skills, knowledge, and expertise into professional skills, knowledge and expertise. (Clark, Luckin & Jewitt,
2011, p. 8)
Traditionally, teacher inquiry has relied predominantly on qualitative methods, including teacher
journals, oral examinations, analysis of interviews, texts, student productions and observations,
peer-observation, and conceptual desk research of classroom issues (Lytle & Cochran-Smith,
1990). More recently, there has been a focus on teacher inquiry and on the role of student data
as evidence. For example, Timperley, Wilson, Barrar and Fung (2007) noted that in all of the 97
studies of teacher inquiry they reviewed, student data in the form of assessment were used to
provide an analysis of the teaching–learning relationship for the purpose of improving teaching.
Taking this a step further, teacher inquiry into student learning (TISL) focuses on the evidence
base of student learning data that is available in technology-rich environments and its role in
improving practice.
TISL is a systematic, intentional, design-oriented approach to teachers’ technology-supported inquiry into
students’ learning. It focuses on the development and use of formative e-assessment methods using
advanced learning technologies (digital tools) to capture, analyse, interpret, share and evaluate student
data. In so doing, it aims to contribute to the development of teacher professionalism and school improve-
ment through a focus on teacher-centred, practice-based, evidence-oriented research activity. (Clark et al,
2011, p. 13)
Today’s classrooms are technology-rich environments, and the key contribution of the TISL
approach is in acknowledging and harnessing the opportunities created by technology to support
a more systematic, data-informed mode of practitioner research.
A second limitation of traditional teacher inquiry approaches has been the fragmentation of
inquiry efforts. Teachers have typically conducted their investigations either individually, or in
small local groups. We need a teacher inquiry method that is initiated by a top-down change but
encompasses the practice of many teachers. This will help to shift inquiry away from a focus on
individual practice and towards a more collaborative approach, aggregating the findings of mul-
tiple inquiries into a robust body of knowledge. Again, technology has the potential to enable this
change—by supporting large communities of inquiry.
Garrison, Anderson and Archer (1999, 2010) have identified three dimensions that determine
the effectiveness of an online community of inquiry: cognitive presence, social presence and
teaching presence. Online environments forTISL have the potential to raise the quality of teacher
inquiry, both in terms of the scientific quality, and in terms of the value for professional develop-
ment, by offering scaffolds for positive cognitive and social presence, and opportunities for teacher
trainer (or researcher) presence.
The data-informed approach to teacher inquiry, a key quality of TISL, is part of a wider trend in
education, towards the use of data and analytics to inform educational practice. Learning
analytics are defined as “the measurement, collection, analysis and reporting of data about
learners and their contexts, for purposes of understanding and optimising learning and the
environments in which it occurs” (Ferguson, 2012). Learning analytics typically employ large
datasets to provide real-time or retrospective insights about the effect and effectiveness of various
elements and features of learning environments. Learners and educators can use these analytics
to review work that has been done in the past, to support current study and to access recommen-
dations for future activities. This is a very young field—yet some impressive results have already
been reported (Arnold & Pistilli, 2012).
Learning analytics are rooted in data science, artificial intelligence, and practices of recommender systems, online marketing and business intelligence. The tools and techniques developed
in these domains make it possible to identify trends and patterns, and then benchmark individuals
or groups against these trends. For example, analysing the activity logs of students watching
videos in online courses allows researchers to identify parameters such as video length that
influence engagement (Guo, Kim & Rubin, 2014; Kim et al, 2014). Discourse and content analy-
sis can be used to help identify effective and ineffective discussions and argumentations in col-
laborative online learning environments (Buckingham Shum & Ferguson, 2012). Analysis of
online discussions shows that the frequency of activity—the number of chat messages a student
sends—is not necessarily correlated with success, but the type of activity (eg, seeking technical
advice vs. questions regarding the course materials) is a predictor of final grades (Abdous, He &
Yen, 2012). Using such insights, educators can identify at-risk students based on content analysis
of their online activity.
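The type of analysis described above, in which the kind of forum activity rather than its volume predicts outcomes, can be sketched in a few lines. This is an illustrative computation only; the log format, category labels and threshold are all hypothetical:

```python
from collections import Counter

# Hypothetical forum log: (student_id, message_category) pairs.
# The categories mirror the distinction drawn above: technical
# help-seeking vs. questions about the course materials.
messages = [
    ("s1", "technical"), ("s1", "technical"), ("s1", "technical"),
    ("s2", "course_material"), ("s2", "course_material"),
    ("s3", "technical"), ("s3", "course_material"),
]

def activity_profile(log):
    """Count each student's messages, broken down by category."""
    profile = {}
    for student, category in log:
        profile.setdefault(student, Counter())[category] += 1
    return profile

def flag_at_risk(profile, threshold=0.75):
    """Flag students whose forum activity is dominated by technical
    help-seeking rather than engagement with course content."""
    flagged = []
    for student, counts in profile.items():
        if counts["technical"] / sum(counts.values()) >= threshold:
            flagged.append(student)
    return sorted(flagged)

at_risk = flag_at_risk(activity_profile(messages))
# Only s1, whose messages are all technical, is flagged,
# even though s1 sends the most messages overall.
```

Note that s1 is the most active student in this toy dataset, yet the only one flagged: the signal lies in the type of activity, not its frequency, which is the point the studies cited above make.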
Dietz-Uhler and Hurn (2013) review eight institutions that have incorporated learning analytics
into their teaching and learning systems and list a variety of uses they observed. Learning
analytics can provide near-real-time insights into the learner activity and experience, enabling
faculty to make suggestions to students that will help them succeed. By monitoring student
activity and performance throughout a course, learning analytics can help faculty identify and
redress weaknesses in course design and implementation.
Dietz-Uhler and Hurn list the potential benefits of learning analytics, according to Long and
Siemens (2011). From an institutional perspective, learning analytics can improve decision
making and resource allocation, highlight an institution’s successes and challenges and increase
organisational productivity. From the perspective of a faculty member, learning analytics can
help to identify at-risk learners and provide interventions, transform pedagogical approaches and
help students gain insight into their own learning. Using learning analytics, staff can identify
productive and counter-productive patterns of learner behaviour. The productive patterns can be suggested to learners as guidance, while the counterproductive ones can be used to identify at-risk learners.
Learning analytics can identify sections of a course that most learners skim or skip without any
negative effect, suggesting that these elements are redundant, and the sections most students
struggle with, which may need to be reworked. Some institutions offer students a learning
analytics dashboard as part of their learning environment, which can help them to monitor and
manage their own learning processes.
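The course-level diagnostics described in this paragraph (sections skipped without harm versus sections where most students struggle) can be approximated from simple per-section logs. A minimal sketch, with entirely hypothetical data:

```python
# Hypothetical per-section records: one (visited, passed_assessment)
# pair per student, for a course with three sections.
section_log = {
    "intro":    [(True, True), (False, True), (False, True), (True, True)],
    "core":     [(True, True), (True, False), (True, False), (True, True)],
    "appendix": [(False, True), (False, True), (True, True), (False, True)],
}

def section_report(log):
    """Report each section's skip rate and pass rate, two signals a
    learning analytics dashboard might surface to a course designer."""
    report = {}
    for section, records in log.items():
        n = len(records)
        report[section] = {
            "skip_rate": sum(not visited for visited, _ in records) / n,
            "pass_rate": sum(ok for _, ok in records) / n,
        }
    return report

report = section_report(section_log)
# "appendix": high skip rate, high pass rate -> possibly redundant.
# "core": no skipping, low pass rate -> a candidate for rework.
```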
A current problem is that the information provided by learning analytics tools is not generally
aligned with teachers’ needs for the management of learning activities. We need to be aware that
the pedagogical decisions embedded in learning designs affect both the learning analytics process
and its outcomes. We also need to take into account that mixed approaches will be necessary to
represent and understand the complex dynamics of groups of students in online learning envi-
ronments.
Challenges and synergies
Learning design, TISL and learning analytics all offer significant value for educational practice,
research and development. Yet all three have their challenges. Together, these approaches complement each other, in that the advantages of one provide a remedy for the limitations of the others.
In the opening paragraphs of this paper, we adopted a view of design as simultaneously a creative
practice and a rigorous inquiry. Arguably, most of the work in the field of learning design has
focused on the creative processes, on practices, tools and representations to support it and on
mechanisms for sharing its outputs between practitioners. Very little has been done in terms of
the practices, tools and representations used for evaluating the effects of the designs.
Several approaches emphasise top-down quality enhancement, which help designers to base
their work on sound pedagogical principles. What is missing is the trajectory that would complete
the feedback loop: the built-in evaluation of designs to see whether they achieved the expected
outcomes.
Schön’s “reflective conversation with the situation,” whereby practitioners design an innovation,
introduce it, observe its effects and refine it, may occur spontaneously and intuitively—but it
remains tacit. There are no structured mechanisms to support it, and no standards for sharing the
findings from such an inquiry. The TISL framework offers exactly such mechanisms. Embedding
design work in a broader cycle of structured inquiry offers multiple benefits from a learning
design perspective. First and foremost, it would enhance the quality of the designed artefacts, by
exposing them to scrutiny, validating the areas in which they achieve their aims and highlighting
the areas that require refinement. Consequently, it would increase the credibility of these designs
and of the practices of learning design in general.
TISL also stands to benefit from a dialogue with learning design. Inquiry, as proposed by Dewey,
is a learning process modelled on the practices of scientific work. Teachers frame conjectures,
identify suitable methods for testing these conjectures, conduct empirical work to implement these tests, and share and reflect on the outcomes. Indeed, teacher inquiry has also been referred
to as teacher research or classroom action research. Yet, when we say “scientific work,” what do
we mean? The dominant interpretation in this context would be based on the paradigms of social
sciences. By contrast, Laurillard (2013) and Mor and Winters (2007) propose Simon’s notion of
design science as a valuable alternative (Simon, 1996). Indeed, this approach is the basis for
much of the work in educational design, even if explicit reference to Simon is rare. Adopting
design science as the frame of reference would enable TISL to focus explicitly on value-driven
positive change and articulate the means of promoting such change. Learning design can offer an
arsenal of tools and representations appropriate for such a perspective.
Combining learning design and TISL can thus give birth to a “citizen design science of learning.”
What this combination still lacks is its scientific instruments: how can we actually measure the
effects of a design? Learning analytics offer an answer to this question. Consider the following
scenario: a teacher identifies a challenge in teaching a mathematical concept. Based on a review
of relevant literature and examples of past innovations in the field, she devises a lesson plan that
combines an off-the-shelf game with an online discussion. She then defines metrics for the
discussion, which will serve as indicators for learning. Now she uses the game’s learning analytics
dashboard and the online discussion system’s content analysis module to evaluate and refine her
lesson plan. Once she is satisfied with her results, she shares them with peers, who replicate her
experiment and calibrate her findings. Clearly, this is not a randomised control trial. However,
what it loses in scientific validity, it gains in practical relevance. In many ways, this is perhaps as
close as we may wish to get to research-based educational practice.
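The inquiry loop in this scenario (define a metric, run the design, evaluate, refine) can be sketched as follows; the metric, the lesson-plan versions and the observed data are all hypothetical:

```python
def discussion_metric(replies_per_student):
    """Mean on-topic replies per student: the scenario's chosen
    indicator of learning in the online discussion."""
    return sum(replies_per_student) / len(replies_per_student)

def inquiry_cycle(designs, run_design, target=3.0):
    """Evaluate successive refinements of a lesson plan against a
    pre-defined metric, keeping the first version that meets the target."""
    for design in designs:
        score = discussion_metric(run_design(design))
        if score >= target:
            return design, score
    return None, None

# Hypothetical observations from two iterations of the lesson plan.
observations = {
    "v1: game only": [1, 2, 0, 1],
    "v2: game + structured prompts": [4, 3, 2, 5],
}
best, score = inquiry_cycle(observations, lambda d: observations[d])
# v1 scores 1.0 and is refined; v2 scores 3.5 and is retained.
```

The crucial step, as in the scenario, is that the metric is fixed before the lesson runs; the analytics dashboard merely supplies the observations fed into it.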
The field of learning analytics faces considerable challenges, some of which could be addressed by
a closer dialogue with learning design and TISL. Some of these are technical, some political,
cultural or administrative, some ethical (Dietz-Uhler & Hurn, 2013; Ferguson, 2012). These need
to be resolved before the value of learning analytics for learning design and TISL can be fully
realised. Yet there are some challenges that can only be addressed by introducing the vocabulary
of learning design into the repertoire of learning analytics. Learning analytics can identify
patterns of learner activity, interaction and conversation. They can identify generically beneficial
patterns and generically obstructive ones. Clearly, if a learner does not engage with the learning
material at all, or if her conversations in the course forum are limited to her struggles with
technical issues, she needs attention. The more complex patterns can only be interpreted in
relation to the design of the learning activity: if we set learners on a path of collaborative inquiry,
we expect them first to negotiate a research question, then curate research methods, brainstorm
an experimental design, divide the work between them, coordinate their individual efforts and
finally converge to discuss their findings. In each phase of this cycle, we would expect to find
different patterns of action, interaction and conversation.
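The design-dependent interpretation argued for here can be made concrete: the same raw event is on-track in one phase of a collaborative inquiry cycle and off-track in another. A minimal sketch, with hypothetical phase names and event types:

```python
# Expected event types per phase of a collaborative inquiry design.
expected = {
    "negotiate_question": {"discussion_post"},
    "divide_work": {"task_assignment"},
    "converge": {"discussion_post", "document_edit"},
}

def interpret(phase, observed_events):
    """Read raw activity in the light of the learning design: flag
    events that the current phase does not call for."""
    return [e for e in observed_events if e not in expected[phase]]

# A discussion post is expected when converging, but is flagged here,
# during the work-division phase.
off_track = interpret("divide_work", ["discussion_post", "task_assignment"])
```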
As mentioned above, learning analytics holds clear potential benefits for educators and students.
These benefits will remain theoretical unless the potential beneficiaries engage in an active
process of inquiry into learning. Without such a process, there is a risk that the signals provided
by learning analytics systems will be ignored, or worse—misinterpreted.
The linking of learning design and teacher inquiry is aligned both with Schön’s conception of
design as inquiry and with Simon’s model of design science. Learning analytics provide the
instrument for making this design inquiry of learning truly powerful.
Emin-Martínez et al (2014) take a first step towards this integration and propose a teacher-led
design inquiry of learning as a new model of educational practice and professional development.
The model is designed to capture the essence of the synergy of learning design, TISL and learning
analytics.
The papers in this special issue
Persico and Pozzi (2014) provide a systematic review of the field of learning design and map the
potential contributions of learning analytics along three dimensions: representations, methods
and tools. The authors identify the problems implicit in using a multitude of different represen-
tations, approaches and tools that are not interoperable. Such a review is useful in framing the
discussion in the other papers and any future discussion in this area.
Avramides, Hunter, Oliver and Luckin (2014) examine a case study in which an educational
innovation is introduced by school leadership, through a process of collaborative teacher inquiry.
They highlight the importance of context and demonstrate how the combination of learning
design, TISL and learning analytics could provide an effective model of institutional change.
Apart from the advantages discussed above, this model dissolves resistance to change by empow-
ering teachers as partners in the process while reinforcing the leadership of the head teacher.
McKenney and Mor (2015) discuss the potential and challenges of a prospective system combin-
ing learning design, TISL and learning analytics through a retrospective analysis of CASCADE-
SEA, a learning design system that has been used successfully and evaluated systematically in the
past. From this analysis, they derive design principles for the development of future systems.
Ghislandi and Raffaghelli (2015) demonstrate the potential of combining learning design, TISL
and learning analytics to raise educational quality. They describe a teacher inquiry that is mod-
elled as a design experiment and driven by qualitative and quantitative data of student activity.
Their study illustrates how a process of teacher inquiry can draw on both the practices and
representations of educational design research and the tools and methods of learning design,
making use of the data afforded by educational technologies.
Haya, Daems, Malzahn, Castellanos and Hoppe (2015) consider the complex pedagogical frame-
work of the JuxtaLearn project, which involves student production, team work and critical dis-
cussions between teams. Such situations, while very rich from a learning science perspective, are
often hard to manage from a pragmatic point of view. Teachers find it challenging to monitor
students’ behaviour and consequently to evaluate their learning designs. The authors introduce
social learning analytics and content analysis tools, tailored to the dynamics dictated by the
design, and present the outputs of these tools in a manner that is accessible to teachers. Thus, the
development of design-specific analytics tools that are embedded in a TISL process enables teach-
ers and students to benefit from a richer pedagogy.
Location-based games are another example of a pedagogical scenario that is hard to monitor.
Melero, Hernández-Leo, Sun, Santos and Blat (2015) present a learning design dashboard devel-
oped to support teachers in such a setting and illustrate how it transformed teachers’ inquiry
strategies, enabling them to make evidence-based learning design decisions. As with the paper by
Haya et al, this study shows how the combination of learning design, TISL and learning analytics
can open the door to effective new pedagogies.
Both the JuxtaLearn scenario and the location-based game scenario are instances of blended
computer-supported collaborative learning design. Rodríguez-Triana, Martínez-Monés,
Asensio-Pérez and Dimitriadis (2014) propose a monitoring-aware pattern-based design process
that not only allows educators to benefit from the outputs of learning analytics to improve their
design, but also encourages them to design their activities in a manner that would maximise the
effectiveness of learning analytics. Their empirical findings confirm the value of an approach that
combines learning design, TISL and learning analytics and suggest that indeed, in this case, the whole is greater than the sum of its parts; monitoring-aware design improves both design and monitoring
and expands the pedagogical repertoire of teachers.
Together, these papers have implications for the practice of educational establishments, teachers
and those involved in the development and deployment of learning analytics.
Schools, colleges and universities should be aware of the possibilities that are opened up by
bringing together design, inquiry and analytics, and they need to consider carefully how these
combined activities can be encouraged and supported. Although teachers engage naturally in
learning design, top-down pressures can have a heavy impact on the designs that they produce.
For learning activities to be aligned with the requirements of particular educational situations,
teachers need to be actively involved in their design and supported to do this.
Learning analytics and the associated visualisations and dashboards can be used to improve
learning design and to support inquiry into students’ learning activities in the classroom. In the
wider context, engaging the whole educational community extends the benefits of design,
inquiry and analytics. Sharing analytics with students enhances their opportunities for self-
assessment, whereas opening up design for learning provides a way to share the quality principles
that govern education. This can have positive impact on understanding, negotiating and inno-
vating for educational quality.
Within a single classroom, a teacher inquiry may have limited impact. However, an inquiry that
is coordinated by a lead teacher or by management can encourage individuals or groups of
teachers to carry out subsequent inquiries with implications for the whole department or school.
Considering the interplay between design, inquiry and analytics across the institution will help to
avoid duplication of effort and help to develop communities of practice. In order to do this
effectively and to improve the quality of teaching and learning, training is needed in these areas,
and this requires time and resources to be set aside for staff development.
Conclusion
This special issue seeks to explore the synergies between learning design, TISL and learning
analytics. Research has generally taken learning design, TISL and learning analytics to be separate activities, and research in these domains has seen relatively little convergence.
However, as this special issue editorial argues, they can be seen as complementary endeavours,
each informing and improving the others. The seven papers in the issue provide a rich picture of
current efforts to take advantage of these synergies.
References
Abdous, M., He, W. & Yen, C.-J. (2012). Using data mining for predicting relationships between online question theme and final grade. Educational Technology & Society, 15, 3, 77–88.
Arnold, K. E. & Pistilli, M. (2012). Course signals at Purdue: using learning analytics to increase student success. Paper presented at the LAK12: 2nd International Conference on Learning Analytics and Knowledge (30 April–2 May), Vancouver, Canada.
Avramides, K., Hunter, J., Oliver, M. & Luckin, R. (2014). A method for teacher inquiry in cross-curricular projects: lessons from a case study. British Journal of Educational Technology, 46, 2, 249–264.
Bannan-Ritland, B. (2008). Teacher design research: an emerging paradigm for teachers’ professional development. In R. Lesh, A. Kelly & J. Baek (Eds), Handbook of design research methods in education (pp. 246–262). New York: Routledge.
Beetham, H. & Sharpe, R. (2013). Rethinking pedagogy for a digital age: designing for 21st-century learning. London: Routledge.
Buckingham Shum, S. & Ferguson, R. (2012). Social learning analytics. Educational Technology & Society, 15, 3, 3–26.
Clark, W., Luckin, R. & Jewitt, C. (2011). NEXT-TELL research report D5.1: methods and specifications for TISL components V1. NEXT-TELL Consortium, European Commission IST-285114.
Clarke, A. & Erickson, G. L. (2003). Teacher inquiry: a defining feature of professional practice. In A. Clarke & G. Erickson (Eds), Teacher inquiry: living the research in everyday practice (pp. 1–6). London: Psychology Press.
Cochran-Smith, M. & Lytle, S. L. (1999). The teacher research movement: a decade later. Educational Researcher, 28, 15–25.
Craft, B. & Mor, Y. (2012). Learning Design: mapping the landscape. Research in Learning Technology, 20, 85–94.
Cross, N. (2001). Designerly ways of knowing: design discipline versus design science. Design Issues, 17, 3, 49–55.
Dalziel, J., Conole, G., Wills, S., Walker, S., Bennett, S., Dobozy, E. et al. (2013). The Larnaca Declaration on Learning Design—2013. Retrieved January 1, 2015, from http://www.larnacadeclaration.org.
Dietz-Uhler, B. & Hurn, J. E. (2013). Using learning analytics to predict (and improve) student success: a faculty perspective. Journal of Interactive Online Learning, 12, 1, 17–26.
Emin-Martínez, V., Hansen, C., Rodríguez-Triana, M. J., Wasson, B., Mor, Y., Dascalu, M., Ferguson, R. & Pernin, J.-P. (2014). Towards teacher-led design inquiry of learning. eLearning Papers. Accessed March 6, 2015, from http://openeducationeuropa.eu/en/article/Towards-Teacher-led-Design-Inquiry-of-Learning?paper=134810.
Ferguson, R. (2012). Learning analytics: drivers, developments and challenges. International Journal of Technology Enhanced Learning (IJTEL), 4, 5/6, 304–317.
Garrison, D. R., Anderson, T. & Archer, W. (1999). Critical inquiry in a text-based environment: computer conferencing in higher education. The Internet and Higher Education, 2, 2, 87–105.
Garrison, D. R., Anderson, T. & Archer, W. (2010). The first decade of the community of inquiry framework: a retrospective. The Internet and Higher Education, 13, 1, 5–9.
Ghislandi, P. & Raffaghelli, J. (2015). Forward-oriented designing for learning as a means to achieve educational quality. British Journal of Educational Technology, 46, 2, 280–299.
Guo, P. J., Kim, J. & Rubin, R. (2014). How video production affects student engagement: an empirical study of MOOC videos. Paper presented at the Learning@Scale conference.
Haya, P., Daems, O., Malzahn, N., Castellanos, J. & Hoppe, H. (2015). Analysing content and patterns of interaction for improving the learning design of networked learning environments. British Journal of Educational Technology, 46, 2, 300–316.
Holmberg, J. (2014). Studying the process of educational design—revisiting Schön and making a case for reflective design-based research on teachers’ ‘conversations with situations’. Technology, Pedagogy and Education, 23, 3, 293–310.
Kali, Y., McKenney, S. & Sagy, O. (2012). Teachers as designers of technology enhanced learning. Instructional Science, 1–7.
Kim, J., Guo, P. J., Seaton, D. T., Mitros, P., Gajos, K. Z. & Miller, R. C. (2014). Understanding in-video dropouts and interaction peaks in online lecture videos. Paper presented at the Learning@Scale conference.
Latour, B. (2008). A cautious Prometheus? A few steps toward a philosophy of design (with special attention to Peter Sloterdijk). Paper presented at the Proceedings of the 2008 Annual International Conference of the Design History Society.
Laurillard, D. (2013). Teaching as a design science: building pedagogical patterns for learning and technology. New York / Oxon: Routledge.
Long, P. & Siemens, G. (2011). Penetrating the fog: analytics in learning and education. EDUCAUSE Review, 46, 5, 31–40.
Lytle, S. & Cochran-Smith, M. (1990). Learning from teacher research: a working typology. The Teachers College Record, 92, 1, 83–103.
McKenney, S. & Mor, Y. (2015). Supporting teachers in data-informed educational design. British Journal of Educational Technology, 46, 2, 265–279.
Melero, J., Hernández-Leo, D., Sun, J., Santos, P. & Blat, J. (2015). How was the activity? A visualization support for a case of location-based learning design. British Journal of Educational Technology, 46, 2, 317–329.
Mor, Y. & Mogilevsky, O. (2013). The learning design studio: collaborative design inquiry as teachers’ professional development. Research in Learning Technology, 21. doi: http://dx.doi.org/10.3402/rlt.v21i0.22054
Mor, Y. & Winters, N. (2007). Design approaches in technology-enhanced learning. Interactive Learning Environments, 15, 1, 61–75.
Persico, D. & Pozzi, F. (2014). Informing learning design with learning analytics to improve teacher inquiry. British Journal of Educational Technology, 46, 2, 230–248.
Rodríguez-Triana, M. J., Martínez-Monés, A., Asensio-Pérez, J. I. & Dimitriadis, Y. (2014). Scripting and monitoring meet each other: aligning learning analytics and learning design to support teachers in orchestrating CSCL situations. British Journal of Educational Technology, 46, 2, 330–343.
Schön, D. A. (1992a). Designing as reflective conversation with the materials of a design situation. Research in Engineering Design, 3, 3, 131–147.
Schön, D. A. (1992b). The theory of inquiry: Dewey’s legacy to education. Curriculum Inquiry, 22, 2, 119–139.
Simon, H. A. (1996). The sciences of the artificial. Cambridge, MA: MIT Press.
Timperley, H., Wilson, A., Barrar, H. & Fung, I. (2007). Teacher professional learning and development: best evidence synthesis iteration. Wellington, New Zealand: Ministry of Education.
Voogt, J., Westbroek, H., Handelzalts, A., Walraven, A., McKenney, S., Pieters, J. et al. (2011). Teacher learning in collaborative curriculum design. Teaching and Teacher Education, 27, 8, 1235–1244.