From the article: "The educational domain is currently witnessing the emergence of learning analytics – a form of data analytics within educational institutions. Implementing learning analytics tools, however, is not a trivial process. This research-in-progress focuses on the experimental implementation of a learning analytics tool in the virtual learning environment and educational processes of a case organization – a major Dutch university of applied sciences. The experiment is performed in two phases: the first phase yielded insights into the dynamics associated with implementing such a tool in a practical setting. The second – yet to be conducted – phase will provide insights into the use of pedagogical interventions based on learning analytics. In the first phase, several technical issues emerged, as well as the need to include more data (sources) in order to obtain a more complete picture of actual learning behavior. Moreover, self-selection bias is identified as a potential threat to future learning analytics endeavors when data collection and analysis require learners to opt in."
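The self-selection concern can be illustrated with a minimal, hypothetical Python sketch (not taken from the article, synthetic numbers only): when only learners who opt in are tracked, statistics computed on that subset can diverge from the full cohort.

```python
# Illustrative only: synthetic numbers, not data from the study.
cohort = [
    {"student": "A", "opted_in": True,  "logins_per_week": 9},
    {"student": "B", "opted_in": True,  "logins_per_week": 7},
    {"student": "C", "opted_in": False, "logins_per_week": 2},
    {"student": "D", "opted_in": False, "logins_per_week": 3},
]

def mean_logins(students):
    return sum(s["logins_per_week"] for s in students) / len(students)

observed = [s for s in cohort if s["opted_in"]]   # what the analytics tool sees
print(mean_logins(observed))  # 8.0  -> overestimates engagement
print(mean_logins(cohort))    # 5.25 -> the full-cohort picture
```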
Despite the promises of learning analytics and the existence of several learning analytics implementation frameworks, large-scale adoption of learning analytics within higher educational institutions remains low. Extant frameworks either focus on a specific element of learning analytics implementation, for example policy or privacy, or lack operationalization of the organizational capabilities necessary for successful deployment. Therefore, this literature review addresses the research question “What capabilities for the successful adoption of learning analytics can be identified in existing literature on big data analytics, business analytics, and learning analytics?” Our research is grounded in resource-based view theory, and we extend the scope beyond the field of learning analytics to include capability frameworks from the more mature research fields of big data analytics and business analytics. This paper’s contribution is twofold: 1) it provides a literature review of known capabilities for big data analytics, business analytics, and learning analytics, and 2) it introduces a capability model to support the implementation and uptake of learning analytics. During our study, we identified and analyzed 15 key studies. By synthesizing the results, we found 34 organizational capabilities important to the adoption of analytical activities within an institution, and we provide 461 ways to operationalize these capabilities. Five categories of capabilities can be distinguished – Data, Management, People, Technology, and Privacy & Ethics. Capabilities presently absent from existing learning analytics frameworks concern sourcing and integration, market, knowledge, training, automation, and connectivity. Based on the results of the review, we present the Learning Analytics Capability Model: a model that provides senior management and policymakers with concrete operationalizations to build the capabilities necessary for successful learning analytics adoption.
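As a hedged illustration of how such a capability model could be represented in practice, the Python sketch below encodes a few capabilities under the five categories named in the abstract (Data, Management, People, Technology, Privacy & Ethics). The example operationalizations are placeholders, not the model's actual 461 operationalizations.

```python
# Hypothetical, simplified representation of a capability model:
# category -> capability -> example operationalizations (placeholders).
capability_model = {
    "Data": {
        "Sourcing and integration": ["combine LMS and SIS data", "document data lineage"],
    },
    "Management": {
        "Strategy": ["define institution-wide learning analytics goals"],
    },
    "People": {
        "Training": ["offer data-literacy workshops for teachers"],
    },
    "Technology": {
        "Connectivity": ["expose student data via documented APIs"],
    },
    "Privacy & Ethics": {
        "Informed consent": ["let students opt in or out of data collection"],
    },
}

def capabilities_in(category: str) -> list[str]:
    """List the capabilities recorded under one category."""
    return sorted(capability_model.get(category, {}))

print(capabilities_in("Privacy & Ethics"))  # ['Informed consent']
```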
This lecture addresses four topics: 1. What is Learning Analytics (LA)? 2. How do you get started with LA? 3. What is the difference between adoption and implementation of LA? 4. Why are many LA projects unsuccessful and difficult to scale up, and what are the preconditions for success?
Although learning analytics benefits learning, its uptake by higher educational institutions remains low. Adopting learning analytics is a complex undertaking, and higher educational institutions lack insight into how to build organizational capabilities to successfully adopt learning analytics at scale. This paper describes the ex-post evaluation of a capability model for learning analytics via a mixed-method approach. The model intends to help practitioners such as program managers, policymakers, and senior management by providing them with a comprehensive overview of necessary capabilities and their operationalization. Qualitative data were collected during pluralistic walk-throughs with 26 participants at five educational institutions and a group discussion with seven learning analytics experts. Quantitative data about the model’s perceived usefulness and ease of use were collected via a survey (n = 23). The study’s outcomes show that the model helps practitioners to plan learning analytics adoption at their higher educational institutions. The study also shows the applicability of pluralistic walk-throughs as a method for the ex-post evaluation of Design Science Research artefacts.
This publication is a first exploration of the possibilities for using learning analytics in open and online education and of the Grand Challenges involved. To that end, five experts from the special interest groups Open Education and Learning Analytics identified the challenges in one of the two fields. For each challenge, a literature study was conducted to determine which concrete questions exist, which national and international examples are available, and which points merit further research.
Higher educational institutions encounter various barriers when scaling up to institution-wide learning analytics. This doctoral research focuses on designing a model of the capabilities that institutions need to develop in order to remove these barriers and thus maximise the benefits of learning analytics.
Conference paper. From the article: Learning analytics is the analysis and visualization of student data with the purpose of improving education. Literature reporting measures of the effects of data-driven pedagogical interventions on learning, and on the environment in which learning takes place, allows us to assess in what way learning analytics actually improves learning. We conducted a systematic literature review aimed at identifying such measures of data-driven improvement. A review of 1034 papers yielded 38 key studies, which were thoroughly analyzed on aspects such as objective, the aspect of learning affected, and its operationalization (measures). Based on prevalent learning theories, we synthesized a classification scheme comprising four categories: learning process, student performance, learning environment, and departmental performance. Most of the analyzed studies relate to either student performance or the learning process. Based on the results, we recommend making deliberate decisions about the (multiple) aspects of learning one tries to improve through the application of learning analytics. Our classification scheme with examples of measures may help both academics and practitioners do so, as it allows for structured positioning of learning analytics benefits.
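The Python sketch below shows one way the four-category scheme could be used to tag measures found in the literature; the example measures are illustrative placeholders, not the measures reported in the 38 key studies.

```python
# Categories from the classification scheme; example measures are hypothetical.
CATEGORIES = ("learning process", "student performance",
              "learning environment", "departmental performance")

example_measures = {
    "time spent on formative quizzes": "learning process",
    "final course grade":              "student performance",
    "forum response time of tutors":   "learning environment",
    "first-year retention rate":       "departmental performance",
}

def measures_for(category: str) -> list[str]:
    """Return the example measures tagged with one category."""
    assert category in CATEGORIES, f"unknown category: {category}"
    return [m for m, c in example_measures.items() if c == category]

print(measures_for("student performance"))  # ['final course grade']
```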
Learning analytics can help higher educational institutions improve learning. Its adoption, however, is a complex undertaking. The Learning Analytics Capability Model describes the 34 organizational capabilities that must be developed to support the successful adoption of learning analytics. This paper describes the first iteration of evaluating and refining the current, theoretical model. During a case study, we conducted four semi-structured interviews and collected (internal) documentation at a Dutch university that is mature in its use of student data to improve learning. Based on the empirical data, we merged seven capabilities, renamed three, and improved the definitions of all others. Six capabilities absent from extant learning analytics models are present at the case organization, implying that they are important to learning analytics adoption. As a result, the new, refined Learning Analytics Capability Model comprises 31 capabilities. Finally, some challenges were identified, showing that even mature organizations still have issues to overcome.
A promising contribution of Learning Analytics is the presentation of a learner's own learning behaviour and achievements via dashboards, often in comparison to peers, with the goal of improving self-regulated learning. However, there is a lack of empirical evidence on the impact of these dashboards, and few designs are informed by theory. Many dashboard designs struggle to translate awareness of learning processes into actual self-regulated learning. In this study we investigate a Learning Analytics dashboard grounded in existing evidence on social comparison and designed to support motivation, metacognition and academic achievement. Motivation plays a key role in whether learners will engage in self-regulated learning in the first place, and social comparison can be a significant driver in increasing motivation. We performed two randomised controlled interventions in different higher-education courses, one of which took place online due to the COVID-19 pandemic. Students were shown their current and predicted performance in a course alongside that of peers with similar goal grades. The sample of peers was selected so as to elicit slight upward comparison. We found that the dashboard successfully promotes extrinsic motivation and leads to higher academic achievement, indicating an effect of dashboard exposure on learning behaviour, despite an absence of effects on metacognition. These results provide evidence that carefully designed social comparison, rooted in theory and empirical evidence, can be used to boost motivation and performance. Our dashboard is a successful example of how social comparison can be implemented in Learning Analytics Dashboards.
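A minimal Python sketch, not taken from the study, of how a peer sample for "slight upward comparison" might be drawn: peers share a similar goal grade and perform somewhat, but not dramatically, above the student. The grade scale, thresholds, and field names are assumptions for illustration.

```python
# Hypothetical peer-selection logic for a social-comparison dashboard.
# Grades on a 1-10 scale; tolerance and gap thresholds are assumptions.

def select_comparison_peers(student, peers, goal_tolerance=0.5,
                            min_gap=0.1, max_gap=1.5, sample_size=5):
    """Return peers with a similar goal grade whose current performance
    is slightly above the student's, to elicit mild upward comparison."""
    candidates = [
        p for p in peers
        if abs(p["goal_grade"] - student["goal_grade"]) <= goal_tolerance
        and min_gap <= p["current_grade"] - student["current_grade"] <= max_gap
    ]
    # Prefer the closest performers so the comparison stays attainable.
    candidates.sort(key=lambda p: p["current_grade"] - student["current_grade"])
    return candidates[:sample_size]

student = {"goal_grade": 7.5, "current_grade": 6.4}
peers = [
    {"id": 1, "goal_grade": 7.5, "current_grade": 6.9},
    {"id": 2, "goal_grade": 7.0, "current_grade": 8.6},  # performance gap too large
    {"id": 3, "goal_grade": 9.0, "current_grade": 7.0},  # goal grade too different
]
print(select_comparison_peers(student, peers))  # only peer 1 qualifies
```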
Learning in the workplace is crucial in higher engineering education, since it allows students to transfer knowledge and skills from university to professional engineering practice. Learning analytics endeavors in higher education have primarily focused on classroom-based learning. Recently, workplace learning analytics has become an emergent research area, with target users being workers, students and trainers. We propose technology for workplace learning analytics that allows program managers of higher engineering education programs to get insight into the workplace learning of their students, while ensuring privacy of students' personal data by design. Using a design-based agile methodology, we designed and developed a customizable workplace learning dashboard. From the evaluation with program managers in the computing domain, we can conclude that such technology is feasible and promising. The proposed technology was designed to be generalizable to other (engineering) domains. A next logical step would be to evaluate and improve the proposed technology within other engineering domains.
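As one hedged illustration of "privacy by design" in such a dashboard (an assumption about a possible approach, not the authors' implementation), the Python sketch below pseudonymizes student identifiers and only exposes aggregates for groups above a minimum size.

```python
# Hypothetical privacy-by-design step for a workplace learning dashboard:
# pseudonymize identifiers and suppress small groups before aggregation.
import hashlib

MIN_GROUP_SIZE = 5  # assumption: smaller groups are suppressed to limit re-identification

def pseudonymize(student_id: str, salt: str) -> str:
    """Replace a student identifier with a salted hash."""
    return hashlib.sha256((salt + student_id).encode()).hexdigest()[:12]

def aggregate_hours(records, salt="example-salt"):
    """Aggregate logged workplace-learning hours per company,
    only reporting groups large enough to protect individual students."""
    per_company = {}
    for r in records:
        entry = per_company.setdefault(r["company"], {"students": set(), "hours": 0.0})
        entry["students"].add(pseudonymize(r["student_id"], salt))
        entry["hours"] += r["hours"]
    return {
        company: {"students": len(v["students"]), "total_hours": v["hours"]}
        for company, v in per_company.items()
        if len(v["students"]) >= MIN_GROUP_SIZE
    }
```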