The technical manual for the digital evaluation tool QualiTePE supports users in creating, conducting and analysing evaluations that assess the quality of teaching in physical education. The accompanying information on the General Data Protection Regulation (GDPR) instructs users on how to anonymise the data collected in evaluations and explains which legal bases apply to the collection of personal data. The technical manual and the GDPR information are available in English, German, French, Italian, Spanish, Dutch, Swedish, Slovenian, Czech and Greek.
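The anonymisation the manual describes can be illustrated with a minimal sketch. All field names below (`student_name`, `email`, `class_id`, `rating`) are hypothetical, not taken from the QualiTePE tool itself: direct identifiers are dropped, and the class identifier is replaced by a salted hash so results can still be grouped per class without revealing who took part.

```python
import hashlib

# Hypothetical identifying fields to strip from each evaluation record.
DIRECT_IDENTIFIERS = {"student_name", "email"}

def anonymise(record: dict, salt: str) -> dict:
    """Return a copy of the record with direct identifiers removed
    and the class identifier pseudonymised via a salted hash."""
    out = {k: v for k, v in record.items() if k not in DIRECT_IDENTIFIERS}
    if "class_id" in out:
        out["class_id"] = hashlib.sha256(
            (salt + str(out["class_id"])).encode()
        ).hexdigest()[:12]
    return out

record = {"student_name": "A. Janssen", "email": "a@example.org",
          "class_id": "7B", "rating": 4}
print(anonymise(record, salt="per-survey-secret"))
```

Note that a salted hash is pseudonymisation rather than full anonymisation under the GDPR: whoever holds the salt can re-link the data, so the salt must be stored separately or discarded once grouping is done.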
DOCUMENT
Following the rationale of the current EU legal framework protecting personal data, children are entitled to the same privacy and data protection rights as adults. However, children, because of their physical and mental immaturity, need special safeguards and care, including appropriate legal protection. In the online environment, children are less likely to make any checks or judgements before entering personal information. This paper therefore presents an analysis of the extent to which EU regulation can ensure children’s online privacy and data protection.
DOCUMENT
The American company Amazon has made headlines several times for monitoring its workers in warehouses across Europe and beyond.1 What is new is that a national data protection authority has recently issued a substantial fine of €32 million to the e-commerce giant for breaching several provisions of the General Data Protection Regulation (GDPR) with its surveillance practices. On 27 December 2023, the Commission nationale de l’informatique et des libertés (CNIL)—the French Data Protection Authority—determined that Amazon France Logistique infringed, among others, Articles 6(1)(f) (principle of lawfulness) and 5(1)(c) (data minimization) GDPR by processing some of the workers’ data collected by handheld scanners in the distribution centers of Lauwin-Planque and Montélimar.2 Scanners enable employees to perform direct tasks such as picking and scanning items while continuously collecting data on quality of work, productivity, and periods of inactivity.3 According to the company, this data processing is necessary for various purposes, including quality and safety in warehouse management, employee coaching and performance evaluation, and work planning.4 The CNIL’s decision centers on data protection law, but its implications reach far beyond, into workers’ fundamental right to health and safety at work. As noted in legal literature and policy documents, digital surveillance practices can have a significant impact on workers’ mental health and overall well-being.5 This commentary examines the CNIL’s decision through the lens of European occupational health and safety (EU OHS). Its scope is limited to how the French authority has interpreted the data protection principle of lawfulness, taking into account the impact of some of Amazon’s monitoring practices on workers’ fundamental right to health and safety.
MULTIFILE
In this project we examine the laws and regulations surrounding data collection using sensors in assistive technology, and the literature on people’s concerns about this technology. We also look into the Smart Teddy device and how it operates. An analysis required by the General Data Protection Regulation (GDPR) [5] reveals the privacy and security risks in this project and how to mitigate them.
MULTIFILE
Design and development practitioners such as those in game development often have difficulty comprehending and adhering to the European General Data Protection Regulation (GDPR), especially when designing in a privacy-sensitive way. Inadequate understanding of how to apply the GDPR in the game development process can lead to one of two consequences: 1. inadvertently violating the GDPR, with sizeable fines as potential penalties; or 2. avoiding the use of user data entirely. In this paper, we present our work on designing and evaluating the “GDPR Pitstop tool”, a gamified questionnaire developed to empower game developers and designers by raising legal awareness of the GDPR in a relatable and accessible manner. The GDPR Pitstop tool was developed with a user-centered approach and in close contact with stakeholders, including practitioners from game development, legal experts, and communication and design experts. Three design choices worked for this target group: 1. careful crafting of the language of the questions; 2. a flexible structure; and 3. a playful design. By combining these three elements in the GDPR Pitstop tool, GDPR awareness within the gaming industry can be improved and game developers and designers can be empowered to use user data in a GDPR-compliant manner. Additionally, this approach can be scaled to other tricky issues faced by design professionals, such as privacy by design.
LINK
A great deal of research is conducted at Fontys Hogescholen, in particular by researchers of its various research groups. Naturally, a lot of data is collected and processed within this research. Fontys endorses the importance of handling research data with care and therefore asks researchers to have their Research Data Management (RDM) in order. This includes secure storage and sustainable accessibility of data, but (open access) publishing and archiving of research data are also part of RDM. Working out how to put this into practice can be quite a search for researchers, partly because not everyone is equally familiar with the topic of RDM. With this book we hope to offer researchers within Fontys the most important information they need to give proper shape to Research Data Management, and to point them to the support that is available in this area.
DOCUMENT
Recent years have seen a massive growth in ethical and legal frameworks to govern data science practices. Yet one of the core questions associated with such frameworks is the extent to which they are implemented in practice. A particularly interesting case in this context concerns public officials, for whom higher standards typically exist. We therefore seek to understand how ethical and legal frameworks influence the everyday practices on data and algorithms of public sector data professionals. The following paper looks at two cases: public sector data professionals (1) at municipalities in the Netherlands and (2) at the Netherlands Police. We compare these two cases based on an analytical research framework we develop in this article to help understand everyday professional practices. We conclude that there is a wide gap between legal and ethical governance rules and everyday practice.
MULTIFILE
What you don’t know can’t hurt you: this seems to be the current approach for responding to disinformation by public regulators across the world. Nobody is able to say with any degree of certainty what is actually going on. This is in no small part because, at present, public regulators don’t have the slightest idea how disinformation actually works in practice. We believe that there are very good reasons for the current state of affairs, which stem from a lack of verifiable data available to public institutions. If an election board or a media regulator wants to know what types of digital content are being shared in their jurisdiction, they have no effective mechanisms for finding this data or ensuring its veracity. While there are many other reasons why governments would want access to this kind of data, the phenomenon of disinformation provides a particularly salient example of the consequences of a lack of access to this data for ensuring free and fair elections and informed democratic participation. This chapter will provide an overview of the main aspects of the problems associated with basing public regulatory decisions on unverified data, before sketching out some ideas of what a solution might look like. In order to do this, the chapter develops the concept of auditing intermediaries. After discussing which problems the concept of auditing intermediaries is designed to solve, it then discusses some of the main challenges associated with access to data, potential misuse of intermediaries, and the general lack of standards for the provision of data by large online platforms. In conclusion, the chapter suggests that there is an urgent need for an auditing mechanism to ensure the accuracy of transparency data provided by large online platform providers about the content on their services. Transparency data that have been audited would be considered verified data in this context. 
Without such a transparency verification mechanism, existing public debate is based merely on a whim, and digital dominance is likely to only become more pronounced.
MULTIFILE
Data collected from fitness trackers worn by employees could be very useful for businesses. The sharing of this data with employers is already a well-established practice in the United States, and companies in Europe are showing an interest in introducing such devices among their workforces. Our argument is that employers’ processing of their employees’ fitness tracker data is unlikely to be lawful under the General Data Protection Regulation (GDPR). Wearable fitness trackers, such as Fitbit and Apple Watch devices, collate intimate data about the wearer’s location, sleep and heart rate. As a result, we consider that they not only represent a novel threat to the privacy and autonomy of the wearer, but that the data gathered constitute ‘health data’ regulated by Article 9. Processing health data, including, in our view, fitness tracking data, is prohibited unless one of the specified conditions in the GDPR applies. After examining a number of legitimate bases on which employers can rely, we conclude that the data processing practices considered do not comply with the principle of lawfulness that is central to the GDPR regime. We suggest alternative schemes by which wearable fitness trackers could be integrated into an organization to support healthy habits amongst employees, but in a manner that respects the data privacy of the individual wearer.
MULTIFILE
The use of in-body wearable devices is increasing in the healthcare sector, given their capacity to diagnose diseases and monitor health conditions. At the same time, some of these devices have entered the market and are being researched for use in workplace settings to enhance workers’ health and safety. However, neither specific EU legislation nor national law currently regulates the use of in-body wearables in employment, raising questions about the safeguarding of workers’ fundamental rights to privacy and data protection. Addressing the challenges posed by this regulatory gap, this article explores whether the European legislative framework employed in the healthcare sector for medical devices could be applied to the use of in-body wearables in employment settings. It also discusses the application of a key principle of the General Data Protection Regulation when in-body wearables are used in the workplace: lawfulness.
MULTIFILE