Figure 1. Courses included in the core curriculum of the ESPOL Computer Science program

Source publication
Conference Paper
Full-text available
One of the key promises of Learning Analytics research is to create tools that could help educational institutions gain better insight into the inner workings of their programs, in order to tune or correct them. This work presents a set of simple techniques that, applied to readily available historical academic data, could provide such insights. T...

Contexts in source publication

Context 1
... this step, we selected a common core of 27 classes for which grading information was readily available. The set of courses included in the last version of the dataset is illustrated in Figure 1. This set is composed of 8 basic sciences courses, 16 professional instruction courses, and 3 humanities courses. ...
Context 2
... this part, we based our analysis on the academic performance of the last six generations of CS undergraduate students who successfully completed their degree at ESPOL between 2000 and 2012. A maximum-likelihood factor analysis was performed on the correlation matrix of the dataset composed of the performance achieved by 333 students in each of the 27 courses shown in Figure 1. The analysis was carried out using the psych statistical package available for the R statistical software [16]. ...
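As a rough illustration of the step described above (not the authors' original script), the following minimal R sketch runs a maximum-likelihood factor analysis on a course-grade correlation matrix with the psych package; the data frame name (grades) and the choice of three factors are assumptions for illustration only.

```r
# Minimal sketch, assuming `grades` is a 333 x 27 data frame with one column
# per course and one row per graduate.
library(psych)

R_mat <- cor(grades, use = "pairwise.complete.obs")  # correlation matrix of the 27 courses

# The number of factors (3) is illustrative; in practice a scree plot or
# parallel analysis, e.g. fa.parallel(R_mat, n.obs = 333), would guide it.
fa_ml <- fa(r = R_mat, nfactors = 3, n.obs = 333, fm = "ml", rotate = "varimax")

print(fa_ml$loadings, cutoff = 0.3)  # courses grouped by the factor they load on
```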

Similar publications

Article
Full-text available
Learning analytics has been used as a tool to improve the learning process, mainly at the micro-level (courses and activities). However, another key promise of Learning Analytics research is to create tools that could help educational institutions at the meso- and macro-levels gain better insight into the inner workings of their programs,...

Citations

... The curriculum design process is systematic, iterative (Méndez, Ochoa, and Chiluiza 2014; Pereira et al. 2020), and shaped by conceptions of learning held by the designers. ...
Article
Full-text available
Student support, which is an integral part of a learning programme, is most effective when it is integrated into the design of the curriculum, rather than when it forms stand-alone interventions. Identifying the areas that require attention from a student support perspective is often based on the perspectives of the institution and teaching staff involved, rather than on how the students concerned interact with the programme. In this article, we draw on the research field of curriculum analytics to identify areas of curriculum improvement for an ODL programme using student data. The results of the study indicate the important role played by curriculum analytics in designing student support interventions and in restructuring elements of the curriculum to support student success. This is done by ascertaining what constitutes the learned curriculum versus the planned curriculum, the temporal distance between courses, and any bottlenecks within the programme that might hamper progression. The results further underscore the need for an effective execution strategy that is aligned with the principles that guided the development of the curriculum concerned.
... In these studies, sequence frequency metrics such as support and confidence and interestingness metrics were typically obtained to identify prevalent and interesting sequential patterns (Agrawal & Srikant, 1995; Bazaldua, Baker, & San Pedro, 2014; Kinnebrew, Loretz, & Biswas, 2013). These patterns provide unique insights into self-regulatory behaviors and strategies (Kinnebrew et al., 2013; Sabourin, Mott, & Lester, 2013), the inquiry or reasoning processes that distinguish high-performing from low-performing students (Jiang & Cayton-Hodges, 2023; Jiang, Clarke-Midura, Baker, Paquette, & Keller, 2018a; Jiang, Paquette, Baker, & Clarke-Midura, 2015; Perez et al., 2017; Taub, Azevedo, Bradbury, Millar, & Lester, 2018), dropout behaviors in online courses (Deeva, Smedt, Koninck, & Weerdt, 2017; Méndez, Ochoa, & Chiluiza, 2014), processes involved in collaborative problem solving (Martinez-Maldonado, Dimitriadis, Martinez-Monés, Kay, & Yacef, 2013), and transitions between affective states during learning (Andres et al., 2019). ...
Article
Using appropriate tools strategically to aid in problem solving is a crucial skill identified in K-12 mathematics curriculum standards. As more assessments transition from paper-and-pencil to digital formats, a variety of interactive tools have been made available to test takers in digital testing platforms. Using onscreen calculators as an example, this study illustrates how process data obtained from student interactions with a digitally-based large-scale assessment can be leveraged to explore how and how well test takers use interactive tools and unveil their mathematical problem-solving processes and strategies. Specifically, sequence mining techniques using the longest common subsequence were applied on process data collected from a nationally representative sample who took the National Assessment of Educational Progress (NAEP) mathematics assessment to examine patterns of eighth-grade students’ calculator-use behaviors and the content of calculator input across a series of items. Sequences of keystrokes executed on the onscreen calculator by test takers were compared to reference sequences identified by content experts as proficient and efficient use to infer how well and how consistently the calculator was used. Results indicated that calculator-use behaviors and content differed by item characteristics. Students were more likely to use calculators on calculation-demanding items that involve intensive and complex computations than on items that involve simple or no computation. Using the calculator on more calculation-demanding items and using it in a manner that is more efficient and more similar to reference sequences on these items were related to higher mathematical proficiency. Findings have implications for assessment design and can be used in educational practices to provide educators with actionable process-related information on tool use and problem solving.
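As a rough sketch of the sequence comparison described in this abstract (the function, vectors, and values below are hypothetical, not taken from the study), the longest common subsequence between an observed keystroke sequence and an expert reference sequence can be computed with a standard dynamic program and turned into a simple similarity ratio:

```r
# Hypothetical sketch: LCS length between two keystroke sequences,
# reported as a fraction of the reference sequence's length.
lcs_length <- function(a, b) {
  m <- length(a); n <- length(b)
  dp <- matrix(0L, nrow = m + 1, ncol = n + 1)  # dp[i+1, j+1] = LCS of a[1..i], b[1..j]
  for (i in seq_len(m)) {
    for (j in seq_len(n)) {
      dp[i + 1, j + 1] <- if (a[i] == b[j]) dp[i, j] + 1L else max(dp[i, j + 1], dp[i + 1, j])
    }
  }
  dp[m + 1, n + 1]
}

student_keys   <- c("2", "4", "*", "3", "=")         # observed calculator keystrokes (made up)
reference_keys <- c("2", "4", "*", "3", "=", "M+")   # expert reference sequence (made up)
similarity <- lcs_length(student_keys, reference_keys) / length(reference_keys)
```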
... Despite this varied and expanding work, the use of LA to contribute to programme review strategies that might reveal factors that impact the learning experience and learning outcomes for current students (that is, students currently enrolled in the programme under review) is underexplored (Komenda et al., 2015; Méndez et al., 2014). It was our interest in the potential application of LA to this latter issue, that is, assisting students with their 'learning-in-action' so to speak, that led to the project design. ...
Article
Full-text available
The application of learning analytics (LA) to research and practice in higher education is expanding. Researchers and practitioners are using LA to provide an evidentiary basis across higher education to investigate student learning, to drive institutional quality improvement strategies, to determine at-risk behaviours and develop intervention strategies, to measure attrition more effectively and to improve curriculum design and evaluation in both on-campus and e-learning settings. This paper is a case study report of the novel application of LA to programme curriculum review from a major cross-institutional project in Hong Kong. The paper describes the rationale for the project, the conceptual model that led the approach and the development of a software tool that allowed the automation of statistical analyses specifically relevant to programme review. In addition, the paper addresses a major challenge that the project faced in relation to data governance. The paper concludes by proposing the potential benefits of LA for programme curriculum review.
... Higher education institutions are progressively investing in technology and in data collected from various digital learning platforms (such as learning management systems or MOOC-based platforms) to design the effective sequencing of courses in a study program. The use of data-driven approaches and analytics to support curriculum decision-making has been growing continuously (Dawson and Hubball 2014; Méndez, Ochoa, and Chiluiza 2014b; Komenda et al. 2015). For instance, Méndez and colleagues (2014b) demonstrated the use of simple analytical techniques to estimate real course difficulty and its impact on overall student achievement, using data from a computer science program. ...
Article
Full-text available
The success and satisfaction of students with online courses are significantly impacted by the sequencing of learning objectives and activities. Equally critical is designing online degree programs and structuring multiple courses to reduce learners' cognitive load and attain maximum learning success. In its current form, the evaluation of program design is complex and done primarily using stakeholders' perceptions and qualitative methods, which are subjective. Emerging technologies produce enormous amounts of educational data that have the potential to lead to evidence-based practice to reform the design of study programs. Although a few studies adopt such data-driven approaches to evaluate and enhance study programs, remarkably little is known about the sequencing of learning objectives and the influence of their mastery in one course on performance across subsequent courses. This study aims to assess the interdependency of learning objectives from multiple courses in a MOOC-based program by drawing on an interdisciplinary approach intersecting analytic techniques and measurement models. Situated in the context of professional leadership skill development, we ground our findings in leadership theory and how learners acquire skills over time and across varying contexts.
... One of the main objectives of LA research is to create tools that help educational institutions gain better insight into the inner workings of their programs in order to tune or correct them. Several studies on curriculum visualisation and analysis have been conducted using a variety of methods [45,46,47,48,49,50]. These studies often emphasise understanding curricular structure in order to modify and improve it. ...
... Several studies have discussed methodologies and tools for analysing CS course curricula with the intention of providing insights to educational providers [47,49,50,51,52,53]. ...
... Méndez et al. [47] presented a set of techniques that relies on historical academic data for various applications like estimating course difficulty and identifying dropout paths. The results of their analysis are then used to gather recommendations for curriculum design. ...
Thesis
Full-text available
Motivated by the increasing influence of data analytics in the higher education sector, this thesis focuses on enhancing the effectiveness and quality of an undergraduate student's journey. An undergraduate student's journey begins when they enrol at a university and ends once they are employed in the graduate labour market. The findings of this research benefit stakeholders in education, such as educational policymakers, education providers, and current and prospective students, in solving a variety of problems including student drop-out, low course satisfaction, and undesirable graduate employment outcomes.
... Data-driven discovery refers to hypothesis generation from data (Romero et al., 2008). Applications of data-driven discovery include tasks like uncovering the structure of curricular topics (Méndez et al., 2014) or enumerating and measuring the cognitive states of students in computerized learning environments (Koedinger et al., 2013). In contrast, theory-driven analyses focus on relationships that have been hypothesized based on previous research. ...
Article
Growth mindset interventions foster students’ beliefs that their abilities can grow through effort and appropriate strategies. However, not every student benefits from such interventions – yet research identifying which student factors support growth mindset interventions is sparse. In this study, we utilized machine learning methods to predict growth mindset effectiveness in a nationwide experiment in the U.S. with over 10,000 students. These methods enable analysis of arbitrarily-complex interactions between combinations of student-level predictor variables and intervention outcome, defined as the improvement in grade point average (GPA) during the transition from high school. We utilized two separate machine learning models: one to control for complex relationships between 51 student-level predictors and GPA, and one to predict the change in GPA due to the intervention. We analyzed the trained models to discover which features influenced model predictions most, finding that prior academic achievement, blocked navigations (attempting to navigate through the intervention software too quickly), self-reported reasons for learning, and race/ethnicity were the most important predictors in the model for predicting intervention effectiveness. As in previous research, we found that the intervention was most effective for students with prior low academic achievement. Unique to this study, we found that blocked navigations predicted an intervention effect as low as 0.185 GPA points (on a 0–4 scale) less than the mean. This was a notable negative prediction given that the mean intervention effect in our sample was just 0.026 GPA points, though few students (4.4%) experienced a substantial number of blocked navigation events. We also found that some minoritized students were predicted to benefit less (or even not at all) from the intervention. Our findings have implications for the design of computer-administered growth mindset interventions, especially in relation to students who experience procedural difficulties completing the intervention.
... Most concerning is that learning analytics are increasingly being implemented in different educational settings, often without the guidance of a research base [20]. There is also a scarcity of research on how to analyze the learning process at the program level in order to guide the design or redesign of a program [25]. ...
Conference Paper
Full-text available
Understanding students' sentiment is valuable for understanding the changes that could or should be made in curriculum design at third level. Learning analytics has shown potential for improving student learning experiences and supporting teacher inquiry. Yet, there is limited research that reports on the adoption and actual use of learning analytics to support teacher inquiry. This study captures the sentiment of postgraduate students by integrating learning analytics with the steps of teacher inquiry. This study makes two important contributions to the teaching and learning literature. First, it reports on the use of learning analytics to support teacher inquiry over three iterations of a business analytics programme between 2016 and 2019. Second, evidence-based recommendations on how to optimise learning analytics to support teacher inquiry are provided.
... Many studies have discussed methodologies for analysing CS course curricula to provide insights to educational providers (Méndez, Ochoa, & Chiluiza, 2014; Oliver, Dobele, Greber, & Roberts, 2004; Pedroni, Oriol, & Meyer, 2007; Sekiya, Matsuda, & Yamaguchi, 2015). Méndez et al. (2014) presented a set of techniques that relies on historical academic data for various applications like estimating course difficulty and identifying dropout paths. The results were then used to gather recommendations for curriculum design. ...
Conference Paper
Full-text available
A large variety of Computer Science (CS) and Information and Communications Technology (ICT) programs are offered in different institutions across Australia. Even the same institution has several CS program majors due to the field's recent popularity among students and demand in the job market. Current practices in CS education lack a unified approach to analyse and compare these programs, and consequently cannot assess the students in different programs based on a common platform. In this paper, we propose a unified and systematic approach to analyse CS units and courses. Our approach is based on CS curriculum guidelines, according to which each subject (or unit) of a program (or course) can be mapped to knowledge areas. The insights gained through the proposed unified approach could help education providers, students, and governing bodies to understand the fundamentals of CS and ICT programs. A case study involving data from CS programs offered by two Australian higher education institutions shows the efficacy of our proposed approach.
... Several studies have discussed methodologies and tools for analysing CS course curricula with the intention of providing insights to educational providers [9]-[12]. The study in [9] presents a set of techniques that relies on historical academic data to estimate course difficulty, dependence, dropout paths, etc. The results of their analysis are then used to gather recommendations for curriculum design. ...
Conference Paper
Computer Science (CS) education is increasingly becoming popular across the globe. With the increased popularity of CS and Information and Communications Technology (ICT) courses, the ability to measure the similarity/distance between units can aid the decision-making process of education providers during learning path recommendations, evaluating unit exemptions/advanced standings, etc. The ability to quantitatively measure the similarity of two units will assist decision-making processes which are otherwise likely to be tedious and time-consuming. Therefore, in this work, we explore how to quantitatively measure the similarity/distance between two CS/ICT units in an Australian setting. In this study, we utilize data from multiple CS/ICT courses offered by two Australian education providers. The data is generated as a result of the Australian Computer Society's (ACS) course accreditation process, during which each unit of a course is mapped into multiple knowledge areas. We measured the similarity between units in terms of their coverage of the knowledge areas using multiple distance measures. Our work is novel in examining a quantitative measure of unit similarity, by introducing a unified approach across education providers that follow the ACS course accreditation process.
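As a toy illustration of comparing units by their knowledge-area coverage (the vectors, area names, and weights below are invented, not taken from the accreditation data), two common distance/similarity measures can be computed directly in R:

```r
# Hypothetical knowledge-area coverage vectors for two units (weights sum to 1).
unit_a <- c(programming = 0.4, algorithms = 0.3, databases = 0.2, networks = 0.1)
unit_b <- c(programming = 0.5, algorithms = 0.1, databases = 0.3, networks = 0.1)

euclidean_dist <- sqrt(sum((unit_a - unit_b)^2))               # smaller = more similar
cosine_sim     <- sum(unit_a * unit_b) /
                  (sqrt(sum(unit_a^2)) * sqrt(sum(unit_b^2)))  # closer to 1 = more similar
```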
... The curriculum guides teaching and learning. It not only "establishes the content for teaching" [15], but also identifies the "design specifications and constraints (student outcomes, competencies, learning goals)" [16]. Curriculum development focuses on the creation of these design specifications and constraints. ...
... The curriculum guides the desired learning objectives; however, it does not necessarily represent the actual applied curriculum [20] or the way that students experience it. Some suggest that curriculum design decisions are often based on "opinions, intuitions, and personal preferences" [16,21]. This highlights the importance of "a fairly accurate picture of the real curriculum" [20], constructed through evidence-based practices [21]. ...
Conference Paper
Full-text available
The correct sequence of courses in a curriculum can ensure that students develop their knowledge and skills holistically. The challenge level can also be more evenly distributed. Creating these sequences is difficult because curriculum designers must consider multiple, potentially conflicting criteria simultaneously. There is currently a dearth of tools for analyzing the curriculum that incorporate course dependencies as defined by curriculum designers while also considering students' pathways through the curriculum. In this paper, we present Curri, a data-driven curriculum visualization system that scrapes dependencies from our university's published curriculum and leverages student academic data to determine when, on average, students take each course. We evaluate our approach with a case study and two focus groups. This work provides initial evidence that considering both dependencies and students' temporal performance leads to new analyses and insights.