Despite the promise of learning analytics and the existence of several learning analytics implementation frameworks, large-scale adoption of learning analytics within higher educational institutions remains low. Extant frameworks either focus on a specific element of learning analytics implementation, such as policy or privacy, or lack operationalization of the organizational capabilities necessary for successful deployment. This literature review therefore addresses the research question “What capabilities for the successful adoption of learning analytics can be identified in existing literature on big data analytics, business analytics, and learning analytics?” Our research is grounded in resource-based view theory, and we extend the scope beyond the field of learning analytics to include capability frameworks from the more mature research fields of big data analytics and business analytics. This paper’s contribution is twofold: 1) it provides a literature review of known capabilities for big data analytics, business analytics, and learning analytics, and 2) it introduces a capability model to support the implementation and uptake of learning analytics. During our study, we identified and analyzed 15 key studies. By synthesizing the results, we found 34 organizational capabilities that are important to the adoption of analytical activities within an institution and identified 461 ways to operationalize these capabilities. Five categories of capabilities can be distinguished: Data, Management, People, Technology, and Privacy & Ethics. Capabilities presently absent from existing learning analytics frameworks concern sourcing and integration, market, knowledge, training, automation, and connectivity. Based on the results of the review, we present the Learning Analytics Capability Model: a model that provides senior management and policymakers with concrete operationalizations to build the necessary capabilities for successful learning analytics adoption.
From the article: "The educational domain is currently witnessing the emergence of learning analytics – a form of data analytics within educational institutions. Implementation of learning analytics tools, however, is not a trivial process. This research-in-progress focuses on the experimental implementation of a learning analytics tool in the virtual learning environment and educational processes of a case organization, a major Dutch university of applied sciences. The experiment is performed in two phases: the first phase led to insights into the dynamics associated with implementing such a tool in a practical setting. The second phase, yet to be conducted, will provide insights into the use of pedagogical interventions based on learning analytics. In the first phase, several technical issues emerged, as well as the need to include more data (sources) in order to obtain a more complete picture of actual learning behavior. Moreover, self-selection bias is identified as a potential threat to future learning analytics endeavors when data collection and analysis require learners to opt in."
Although learning analytics benefit learning, their uptake by higher educational institutions remains low. Adopting learning analytics is a complex undertaking, and higher educational institutions lack insight into how to build the organizational capabilities needed to successfully adopt learning analytics at scale. This paper describes the ex-post evaluation of a capability model for learning analytics via a mixed-method approach. The model intends to help practitioners such as program managers, policymakers, and senior management by providing them with a comprehensive overview of the necessary capabilities and their operationalization. Qualitative data were collected during pluralistic walk-throughs with 26 participants at five educational institutions and a group discussion with seven learning analytics experts. Quantitative data about the model’s perceived usefulness and ease of use were collected via a survey (n = 23). The study’s outcomes show that the model helps practitioners plan learning analytics adoption at their higher educational institutions. The study also demonstrates the applicability of pluralistic walk-throughs as a method for the ex-post evaluation of Design Science Research artefacts.