From the article: "The educational domain is currently witnessing the emergence of learning analytics – a form of data analytics within educational institutions. Implementing learning analytics tools, however, is not a trivial process. This research-in-progress focuses on the experimental implementation of a learning analytics tool in the virtual learning environment and educational processes of a case organization – a major Dutch university of applied sciences. The experiment is performed in two phases: the first phase yielded insights into the dynamics of implementing such a tool in a practical setting. The second – yet to be conducted – phase will provide insights into the use of pedagogical interventions based on learning analytics. In the first phase, several technical issues emerged, as well as the need to include more data (sources) in order to get a more complete picture of actual learning behavior. Moreover, self-selection bias is identified as a potential threat to future learning analytics endeavors when data collection and analysis require learners to opt in."
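For illustration, a minimal sketch of how such self-selection bias could be screened for, assuming a hypothetical student table with `opted_in` and `prior_gpa` columns (the file name and column names are invented, not from the study):

```python
import pandas as pd
from scipy import stats

students = pd.read_csv("students.csv")  # hypothetical export: one row per learner
opted_in = students.loc[students["opted_in"], "prior_gpa"]
opted_out = students.loc[~students["opted_in"], "prior_gpa"]

# Welch's t-test: does prior achievement differ between opt-in and opt-out groups?
t, p = stats.ttest_ind(opted_in, opted_out, equal_var=False)
print(f"opt-in mean {opted_in.mean():.2f} vs opt-out mean {opted_out.mean():.2f} (p = {p:.3f})")
```

A significant baseline difference would suggest that analyses restricted to opted-in learners do not generalise to the full cohort.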
DOCUMENT
A promising contribution of Learning Analytics is the presentation of a learner's own learning behaviour and achievements via dashboards, often in comparison to peers, with the goal of improving self-regulated learning. However, there is a lack of empirical evidence on the impact of these dashboards, and few designs are informed by theory. Many dashboard designs struggle to translate awareness of learning processes into actual self-regulated learning. In this study we investigate a Learning Analytics dashboard that builds on existing evidence on social comparison to support motivation, metacognition and academic achievement. Motivation plays a key role in whether learners will engage in self-regulated learning in the first place, and social comparison can be a significant driver in increasing motivation. We performed two randomised controlled interventions in different higher-education courses, one of which took place online due to the COVID-19 pandemic. Students were shown their current and predicted performance in a course alongside that of peers with similar goal grades. The sample of peers was selected to elicit slight upward comparison. We found that the dashboard successfully promotes extrinsic motivation and leads to higher academic achievement, indicating an effect of dashboard exposure on learning behaviour, despite an absence of effects on metacognition. These results provide evidence that carefully designed social comparison, rooted in theory and empirical evidence, can be used to boost motivation and performance. Our dashboard is a successful example of how social comparison can be implemented in Learning Analytics Dashboards.
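As an illustration of the peer-selection idea, a minimal sketch that picks classmates with the same goal grade who currently perform slightly better; the column names, thresholds and data layout are assumptions, not the authors' implementation:

```python
import pandas as pd

def select_comparison_peers(df: pd.DataFrame, student_id, n_peers: int = 5,
                            min_gap: float = 0.0, max_gap: float = 1.0) -> pd.DataFrame:
    """Pick peers with the same goal grade whose predicted grade lies
    slightly (but not vastly) above the student's own."""
    me = df.loc[student_id]
    candidates = df[(df["goal_grade"] == me["goal_grade"]) & (df.index != student_id)]
    gap = candidates["predicted_grade"] - me["predicted_grade"]
    upward = candidates[(gap > min_gap) & (gap <= max_gap)]
    # The closest upward peers: better than the student, but within reach.
    return upward.nsmallest(n_peers, "predicted_grade")
```

Capping the gap is what keeps the comparison "slightly upward": peers who are far ahead tend to discourage rather than motivate.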
MULTIFILE
Deploying Learning Analytics that significantly improve learning outcomes remains a challenge. Motivation has been found to be related to academic achievement and is argued to play an essential role in efficient learning. We developed a Learning Analytics dashboard and designed an intervention that relies on goal orientation and social comparison. Subjects can see a prediction of their final grade in a course as well as how they perform in comparison to classmates with similar goal grades. Those with access to the dashboard ended up more motivated than those without access, outperformed their peers as the course progressed and achieved higher final grades. Our results indicate that learner-oriented dashboards are technically feasible and may have tangible benefits for learners.
DOCUMENT
This publication is a first exploration of the possibilities for deploying learning analytics in open and online education, and of the Grand Challenges that come with it. To that end, five experts from the special interest groups Open Education and Learning Analytics identified the challenges in one of the two fields. For each challenge, a literature review was carried out to establish which concrete questions exist, which national and international examples are available, and which points merit further research.
DOCUMENT
Educational institutions in higher education encounter various barriers when scaling up to institution-wide learning analytics. This doctoral research focuses on designing a model of the capabilities that institutions need to develop in order to remove these barriers and thus maximise the benefits of learning analytics.
DOCUMENT
Learning in the workplace is crucial in higher engineering education, since it allows students to transfer knowledge and skills from university to professional engineering practice. Learning analytics endeavors in higher education have primarily focused on classroom-based learning. Recently, workplace learning analytics has become an emergent research area, with target users being workers, students and trainers. We propose technology for workplace learning analytics that allows program managers of higher engineering education programs to get insight into the workplace learning of their students, while ensuring privacy of students' personal data by design. Using a design-based agile methodology, we designed and developed a customizable workplace learning dashboard. From the evaluation with program managers in the computing domain, we can conclude that such technology is feasible and promising. The proposed technology was designed to be generalizable to other (engineering) domains. A next logical step would be to evaluate and improve the proposed technology within other engineering domains.
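The abstract does not detail the privacy-by-design mechanics. One common tactic, sketched below under assumed names and thresholds, is to suppress any workplace-learning metric for groups smaller than a minimum size:

```python
from statistics import mean

K_MIN_GROUP = 5  # assumed minimum group size before anything is shown

def aggregate_metric(values, k=K_MIN_GROUP):
    """Return the group mean, or None when the group is too small to display."""
    if len(values) < k:
        return None  # suppress rather than risk re-identifying a student
    return mean(values)

# e.g. hours of workplace reflection logged per student in one placement
print(aggregate_metric([3.5, 4.0, 2.5, 5.0, 4.5]))  # 3.9
print(aggregate_metric([3.5, 4.0]))                  # None (group suppressed)
```

A program manager then sees trends per placement or cohort without ever seeing an individual student's record.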
DOCUMENT
We use a randomised experiment to study the effect of offering half of 556 freshman students a learning analytics dashboard and a weekly email with a link to their dashboard, on student behaviour in the online environment and final exam performance. The dashboard shows their online progress in the learning management systems, their predicted chance of passing, their predicted grade and their online intermediate performance compared with the total cohort. The email with dashboard access, as well as dashboard use, has positive effects on student behaviour in the online environment, but no effects are found on student performance in the final exam of the programming course. However, we do find differential effects by specialisation and student characteristics.
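The abstract does not specify the prediction model behind the "predicted chance of passing". A minimal sketch of how such a probability can be derived from learning-management-system activity, with invented features and toy data:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Invented features per student: LMS logins, exercises completed, mean quiz score.
X_train = np.array([[40, 25, 7.1], [12, 5, 4.2], [55, 30, 8.0], [20, 10, 5.5]])
y_train = np.array([1, 0, 1, 0])  # 1 = passed the final exam

model = LogisticRegression().fit(X_train, y_train)
p_pass = model.predict_proba(np.array([[30, 18, 6.4]]))[0, 1]
print(f"predicted chance of passing: {p_pass:.0%}")
```

In practice such a model would be trained on previous cohorts and refreshed weekly, in step with the emailed dashboard link.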
MULTIFILE
Chapter 10 in HRM Heden en Morgen. This chapter is written from the conviction that a shared language and understanding of people analytics, together with some of the basic scientific principles on which it rests, can lift this young field to a higher level in practice, and can thereby enable (future) HRM professionals working on and around this challenging theme to make (even more) impact in their organisations. The primary goal of this chapter is to get the (future) professional who reads it thinking. That may mean inspiring, confusing, or clarifying, but also prompting them to get to work with people analytics in their own organisation, on the boundary between science and practice, because that's where the magic happens.
DOCUMENT
ABSTRACT Purpose: This short paper describes the dashboard design process for online hate speech monitoring across multiple languages and platforms. Methodology/approach: A case study approach was adopted in which the authors followed a research & development project for a multilingual, multiplatform online dashboard for monitoring online hate speech. The case under study is the project for the European Observatory of Online Hate (EOOH). Results: We outline the design and prototype development process, which followed a design thinking approach and involved multiple potential user groups of the dashboard. The paper presents the outcome of this process and the initial use of the dashboard. Identified issues, such as the obfuscation of the context or identity of social media accounts (which limits the dashboard's usability but strengthens privacy protection), may contribute to the discourse on privacy and data protection in (big data) social media analysis for practitioners. Research limitations/implications: The results are from a single case study, but they may be relevant for other online hate speech detection and monitoring projects involving big data analysis and human annotation. Practical implications: The study emphasises the need to involve diverse user groups and a multidisciplinary team in developing a dashboard for online hate speech. The context in which potential online hate is disseminated, and the network of accounts distributing or interacting with that hate speech, appear relevant for analysis by some of the dashboard's user groups.
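As an illustration of the identity-obfuscation trade-off mentioned above (not the EOOH implementation), a minimal sketch that replaces account handles with stable pseudonyms, so analysts can still follow one account across posts without seeing who it is:

```python
import hashlib

def pseudonymize(handle: str, salt: str = "per-deployment-secret") -> str:
    """Map an account handle to a stable pseudonym for display."""
    digest = hashlib.sha256((salt + handle).encode()).hexdigest()
    return f"user_{digest[:8]}"

print(pseudonymize("@example_account"))  # same handle always yields the same pseudonym
```

The stability of the pseudonym preserves network analysis across posts, while the lost context is exactly the usability cost the paper identifies.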
LINK
Learning is all about feedback. Runners, for example, use apps like RunKeeper. Research shows that such apps enhance engagement and results, and people think they are fun. The essence is that the runner's behaviour is tracked and communicated back to the runner in a dashboard. We wondered if you can achieve the same positive effect with a dashboard for study behaviour, for students. And what should you measure, track and communicate? We wondered if we could translate the Quantified Self movement into a Quantified Student. So, together with students, professors and companies, we started designing and building Quantified Student apps: apps that measure all kinds of study-behaviour-related data, such as time on campus, time online, sleep, exercise, galvanic skin response and study results. We developed tools to create study information and prototyped the apps with groups of students. At the same time we created a big data lake and did a lot of privacy research. The big difference between the Quantified Student program and learning analytics is that we only present the data to the student. It is his/her data! It is his/her decision to act on it or not. The Quantified Student apps are designed as a Big Mother, never a Big Brother. The project has just started, but we have already designed, created and learned a lot.
1. We designed and built four groups of prototype study-behaviour apps:
a. apps that measure sleep and exercise and compare them to study results, like MyRhytm;
b. apps that measure study hours and compare them to study results, like Nomi;
c. apps that measure group behaviour and signal problems, like Groupmotion;
d. apps that measure on-campus time and compare it with peers, like Workhorse.
2. We researched student physiology to see if we could find a student's personal Cup-A-Soup moment (meaning: can we tell from his/her biometrics when concentration levels dip?).
3. We created a big data lake with student data and open data and are looking for correlation and causality there. We have already found some interesting patterns.
In doing so we learned a lot. We learned that it is often hard to acquire the right data, and that it is hard to create an app or solution that presents the data in the right way, as actionable information. We learned that health trackers are still very imprecise. We learned about (and solved some of) the challenges surrounding privacy. Next year (2017) we will scale the most promising prototype, measure the effects, start a new research project and continue working on our data lake. Things will be interesting, and we will blog about it on www.quantifiedstudent.nl.
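To give an idea of the kind of analysis we run on the data lake, a minimal sketch (the file and column names are invented for illustration):

```python
import pandas as pd

df = pd.read_csv("quantified_student.csv")  # hypothetical data-lake export
r = df["sleep_hours"].corr(df["quiz_score"])  # Pearson correlation
print(f"sleep vs. quiz score: r = {r:.2f}")
# Correlation is not causality: a strong r here is a lead, not a proof.
```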
LINK