Design and development practitioners, such as those in game development, often have difficulty comprehending and adhering to the European General Data Protection Regulation (GDPR), especially when designing in a privacy-sensitive way. An inadequate understanding of how to apply the GDPR in the game development process can lead to one of two consequences: 1. inadvertently violating the GDPR, with sizeable fines as potential penalties; or 2. avoiding the use of user data entirely. In this paper, we present our work on designing and evaluating the “GDPR Pitstop tool”, a gamified questionnaire developed to empower game developers and designers by increasing their awareness of the GDPR in a relatable and accessible manner. The GDPR Pitstop tool was developed with a user-centered approach and in close contact with stakeholders, including game development practitioners, legal experts, and communication and design experts. Three design choices worked for this target group: 1. careful crafting of the language of the questions; 2. a flexible structure; and 3. a playful design. By combining these three elements in the GDPR Pitstop tool, GDPR awareness within the gaming industry can be improved and game developers and designers can be empowered to use user data in a GDPR-compliant manner. Additionally, this approach can be scaled to other tricky issues faced by design professionals, such as privacy by design.
Data collected from fitness trackers worn by employees could be very useful for businesses. The sharing of this data with employers is already a well-established practice in the United States, and companies in Europe are showing an interest in introducing such devices among their workforces. Our argument is that employers processing their employees’ fitness tracker data is unlikely to be lawful under the General Data Protection Regulation (GDPR). Wearable fitness trackers, such as Fitbit and Apple Watch devices, collate intimate data about the wearer’s location, sleep and heart rate. As a result, we consider that they not only represent a novel threat to the privacy and autonomy of the wearer, but that the data gathered constitutes ‘health data’ regulated by Article 9. Processing health data, including, in our view, fitness tracking data, is prohibited unless one of the specified conditions in the GDPR applies. After examining a number of legitimate bases on which employers can rely, we conclude that the data processing practices considered do not comply with the principle of lawfulness that is central to the GDPR regime. We suggest alternative schemes by which wearable fitness trackers could be integrated into an organization to support healthy habits amongst employees, but in a manner that respects the data privacy of the individual wearer.
According to Johnson & Grandison (2007), failure to safeguard the privacy of users of services provided by private and governmental organisations leaves individuals at risk of exposure to a number of undesirable effects of information processing. Loss of control over information about a person may lead to fraud, identity theft and reputation damage, and may cause psychosocial consequences ranging from mild irritation, unease and social exclusion to physical harm or even, in extreme cases, death. Although pooh-poohed by some opinion leaders from the search engine and ICT industries for over a decade (Sprenger, 1999; Esguerra, 2009), the debate in the wake of events like the tragic case of Amanda Todd could be interpreted as supporting a case for proper attention to citizens’ privacy. Truth be told, for a balanced discussion on privacy in the age of Facebook one should not turn towards the social media environment, which seems to hail any new development in big data analysis and profiling-based marketing as a breathtaking innovation. If the myopic view of technology pundits is put aside, a remarkably lively debate on privacy and related issues may be discerned in the media and scientific communities alike. A quick keyword search on ‘privacy’, limited to the years 2000-2015, yields huge numbers of publications: WorldCat lists 19,240; ScienceDirect 52,566; IEEE Xplore 71,684; and Google Scholar a staggering 1,880,000. This makes clear that privacy is still a concept considered relevant by the general public as well as academic and professional audiences. Quite impressive for a subject area that has been declared ‘dead’.
A huge amount of data is being generated, collected, analysed and distributed at a fast pace in our daily life. This data growth requires efficient techniques for analysing and processing high volumes of data, for which preserving privacy effectively is a crucial challenge and even a key necessity, considering privacy laws that have recently come into effect (e.g., the EU General Data Protection Regulation, GDPR). In their real-world applications, companies and organisations need scalable and usable privacy-preserving techniques to support them in protecting personal data. This research focuses on efficient and usable privacy-preserving techniques in data processing. The research will be conducted in different directions:
- Exploring state-of-the-art techniques.
- Designing and applying experiments on existing tool-sets.
- Evaluating the results of the experiments based on real-life case studies.
- Improving the techniques and/or the tools to meet the requirements of the companies.
The proposal will provide results for:
- Education: offering courses, lectures, student projects and solutions for privacy preservation challenges within the educational institutes.
- Companies: providing tool evaluation insights based on case studies and giving proposals for addressing current challenges.
- The research centre (i.e., Creating 010): expanding its expertise on privacy protection technologies and publishing technical reports and papers.
This research will be sustained by actively pursuing follow-up projects.
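To make the notion of privacy-preserving data processing concrete, the sketch below illustrates two elementary techniques that such tool-sets commonly build on: pseudonymisation (replacing a direct identifier with a salted hash, cf. GDPR Article 4(5)) and generalisation of a quasi-identifier (coarsening an exact age into a range). This is a minimal illustrative example using only the Python standard library; the field names and bucket size are hypothetical and not taken from the research described above.

```python
import hashlib
import secrets

# Per-dataset secret salt; in practice this would be stored and managed
# separately from the pseudonymised data.
SALT = secrets.token_hex(16)

def pseudonymise(identifier: str, salt: str = SALT) -> str:
    """Replace a direct identifier with a salted SHA-256 hash."""
    return hashlib.sha256((salt + identifier).encode()).hexdigest()

def generalise_age(age: int, bucket: int = 10) -> str:
    """Generalise an exact age into a coarse range, e.g. 34 -> '30-39'."""
    low = (age // bucket) * bucket
    return f"{low}-{low + bucket - 1}"

# Hypothetical record: the email is a direct identifier, the age a
# quasi-identifier that could aid re-identification.
record = {"email": "alice@example.com", "age": 34}
safe = {
    "id": pseudonymise(record["email"]),
    "age_range": generalise_age(record["age"]),
}
```

Note that pseudonymised data remains personal data under the GDPR as long as the salt (or another re-identification key) exists; stronger guarantees require techniques such as k-anonymity checks or differential privacy, which are among the kinds of methods the tool evaluations above would examine.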