This article examines to what extent, and how, cannabis users in different countries, with different cannabis legislation and policies, practice normalization and self-regulation of cannabis use in everyday life. Data were collected in a survey among a convenience sample of 1,225 last-year cannabis users aged 18–40 from seven European countries, with cannabis policies ranging from relatively liberal to more punitive. Participants were recruited in or in the vicinity of Dutch coffeeshops. We assessed whether cannabis users experience and interpret formal control and informal social norms differently across countries with different cannabis policies. The findings suggest that many cannabis users set boundaries to control their use. Irrespective of national cannabis policy, using cannabis in private settings and setting risk-avoidance rules were equally predominant in all countries. This illustrates that many cannabis users are concerned with responsible use, demonstrating the importance that they attach to discretion. Overall, self-regulation was highest in the most liberal country (the Netherlands). This indicates that liberalization does not automatically lead to chaotic or otherwise problematic use, as critics of the policy have predicted: the diminished role of formal control (law enforcement) is accompanied by a greater importance of informal norms and stronger self-regulation. In understanding risk management, societal tolerance of cannabis use seems more important than cross-national differences in cannabis policy. The setting of cannabis use and self-regulation rules were strongly associated with frequency of use. Daily users were less selective in choosing settings of use and less strict in applying self-regulation rules. Further differences in age, gender, and household status underline the relevance of a differentiated, more nuanced understanding of cannabis normalization.
Data collected from fitness trackers worn by employees could be very useful for businesses. The sharing of such data with employers is already a well-established practice in the United States, and companies in Europe are showing an interest in introducing these devices among their workforces. Our argument is that employers processing their employees’ fitness tracker data is unlikely to be lawful under the General Data Protection Regulation (GDPR). Wearable fitness trackers, such as Fitbit and Apple Watch devices, collate intimate data about the wearer’s location, sleep and heart rate. As a result, we consider not only that they represent a novel threat to the privacy and autonomy of the wearer, but also that the data gathered constitute ‘health data’ regulated by Article 9. Processing health data, including, in our view, fitness tracking data, is prohibited unless one of the conditions specified in the GDPR applies. After examining a number of legitimate bases on which employers can rely, we conclude that the data processing practices considered do not comply with the principle of lawfulness that is central to the GDPR regime. We suggest an alternative schema by which wearable fitness trackers could be integrated into an organization to support healthy habits amongst employees, but in a manner that respects the data privacy of the individual wearer.
A great deal of research is conducted at Fontys Hogescholen, in particular by researchers in the various research groups (lectoraten). Naturally, a lot of data is collected and processed within this research. Fontys endorses the importance of handling research data with care and therefore asks researchers to have their Research Data Management (RDM) in order. This includes secure storage and sustainable accessibility of data, but (open access) publishing and archiving of research data are also part of RDM. Working out how to put this into practice as a researcher can sometimes be quite a quest, partly because not everyone is equally familiar with the topic of RDM. With this book we hope to offer researchers within Fontys the key information they need to give proper shape to Research Data Management, and to point out the support that is available in this area.
During the coronavirus pandemic, the use of eHealth tools was increasingly demanded by patients and encouraged by the Dutch government. Yet HBO health professionals demand clarity on what they can do, must do, and cannot do with patients’ data when providing digital healthcare and support. They often perceive the EU GDPR and its national application as obstacles to the use of eHealth because of the strict health data processing requirements, and they highlight the difficulty of keeping up with the changing rules and understanding how to apply them. Dutch initiatives to clarify the eHealth rules include the 2021 proposal of the wet Elektronische Gegevensuitwisseling in de Zorg and the establishment of eHealth information and communication platforms for healthcare practitioners. This research explores whether these initiatives serve the needs of HBO health professionals. The following questions will be explored:
- Do the currently applicable rules and the proposed wet Elektronische Gegevensuitwisseling in de Zorg clarify what HBO health practitioners can do, must do, and cannot do with patients’ data?
- Does the proposed wet Elektronische Gegevensuitwisseling in de Zorg provide better clarity on the stakeholders who may access patients’ data? Does it ensure appropriate safeguards against the unauthorized use of such data?
- Does the proposed wet Elektronische Gegevensuitwisseling in de Zorg clarify the EU GDPR requirements for HBO health professionals?
- Do the eHealth information and communication platforms set up for healthcare professionals provide the information that HBO professionals need on data protection and privacy requirements stemming from the EU GDPR and from national law? How could such platforms be better adjusted to the HBO professionals’ information and communication needs?
Methodology: practice-oriented legal research, semi-structured interviews and focus group discussions will be conducted. Results will be translated into solutions for HBO health professionals.
This project addresses the fundamental societal problem that encryption as a technique has been available for decades, but has never been widely adopted, mostly because it is too difficult or cumbersome for the public at large to use. PGP illustrates this point well: it is difficult to set up and use, mainly because of challenges in cryptographic key management. At the same time, the need for encryption has only grown over the years, and it has become an urgent problem with stringent requirements (for instance for electronic communication between doctors and patients) in the General Data Protection Regulation (GDPR), and with the systematic mass surveillance activities of internationally operating intelligence agencies.

The interdisciplinary project "Encryption for all" addresses this fundamental problem through a combination of cryptographic design and user experience design. On the cryptographic side, it develops identity-based and attribute-based encryption on top of the attribute-based infrastructure provided by the existing IRMA identity platform. Identity-based encryption (IBE) is a scientifically well-established technique that addresses the key management problem in an elegant manner, but it has found limited application so far. In this project it will be developed to a practically usable level, exploiting the existing IRMA platform for identification and retrieval of private keys. Attribute-based encryption (ABE) has not yet reached the same level of maturity as IBE and will be a topic of further research in this project, since it opens up attractive new applications: for example, a teacher encrypting for her students only, or a company encrypting for all employees with a certain role in the company.

On the user experience design side, efforts will focus on making these encryption techniques genuinely usable (i.e., easy to use, effective, efficient, error resistant) for everyone (including, for example, people with disabilities or limited digital skills). To do so, an iterative, human-centred and inclusive design approach will be adopted. On a fundamental level, scientific questions will be addressed, such as how to promote the use of security- and privacy-enhancing technologies through design, and whether and how usability and accessibility affect the acceptance and use of encryption tools. Here, theories of nudging and boosting and the unified theory of acceptance and use of technology (UTAUT) will serve as a theoretical basis. On a more applied level, standards such as ISO 9241-11 on usability and ISO 9241-220 on the human-centred design process will serve as guidelines. Among other activities, interface designs will be developed, and focus groups, participatory design sessions, expert reviews and usability evaluations with potential users of various ages and backgrounds will be conducted in a user experience and observation laboratory available at HAN University of Applied Sciences. In addition to meeting usability goals, ensuring that the developed encryption techniques also meet national and international accessibility standards will be a particular point of focus. With respect to usability and accessibility, the project will build on the (limited) usability design experience gained with the mobile IRMA application.
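As a purely conceptual illustration of the attribute-gated access model described in this abstract (e.g., a teacher encrypting for her students only), the minimal Python sketch below uses a single trusted authority that derives one symmetric key per attribute. This is emphatically not the project's IBE/ABE and does not use the IRMA platform; the function names and the choice of the `cryptography` library's Fernet are illustrative assumptions.

```python
# Toy sketch of attribute-gated encryption. NOT real attribute-based
# encryption: a single trusted authority derives a symmetric key per
# attribute, whereas pairing-based ABE needs no such online authority.
import base64
import hashlib
import hmac

from cryptography.fernet import Fernet, InvalidToken

MASTER_SECRET = b"authority-master-secret"  # held only by the key authority


def attribute_key(attribute: str) -> bytes:
    """Derive a per-attribute symmetric key (HMAC-SHA256 of the attribute)."""
    digest = hmac.new(MASTER_SECRET, attribute.encode(), hashlib.sha256).digest()
    return base64.urlsafe_b64encode(digest)  # Fernet expects a base64 key


def encrypt_for_attribute(attribute: str, plaintext: bytes) -> bytes:
    """Encrypt so that only holders of the attribute's key can read the message."""
    return Fernet(attribute_key(attribute)).encrypt(plaintext)


def decrypt_with_attributes(held_keys: dict[str, bytes], ciphertext: bytes) -> bytes:
    """Try each attribute key the user was issued; succeed if one matches."""
    for key in held_keys.values():
        try:
            return Fernet(key).decrypt(ciphertext)
        except InvalidToken:
            continue
    raise PermissionError("none of the held attributes satisfies the ciphertext")


# A teacher encrypts for her students only.
token = encrypt_for_attribute("student:course-101", b"exam feedback")

# A student who was issued the matching attribute key can decrypt
# (for brevity the issuance step is collapsed into a direct derivation here).
student_keys = {"student:course-101": attribute_key("student:course-101")}
print(decrypt_with_attributes(student_keys, token))  # b'exam feedback'

# An employee holding a different attribute cannot.
employee_keys = {"role:hr": attribute_key("role:hr")}
# decrypt_with_attributes(employee_keys, token)  # raises PermissionError
```

Real IBE/ABE removes the need for a per-ciphertext trusted lookup and supports richer policies over multiple attributes; the toy only conveys which party can read what.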
A huge amount of data is being generated, collected, analysed and distributed at a fast pace in our daily lives. This data growth requires efficient techniques for analysing and processing high volumes of data, for which preserving privacy effectively is a crucial challenge and even a key necessity, considering the privacy laws that have recently come into effect (e.g., the EU General Data Protection Regulation, GDPR). In their real-world applications, companies and organisations need scalable and usable privacy-preserving techniques to support them in protecting personal data. This research focuses on efficient and usable privacy-preserving techniques in data processing. The research will be conducted in different directions:
- exploring state-of-the-art techniques;
- designing and applying experiments on existing tool-sets;
- evaluating the results of the experiments on the basis of real-life case studies;
- improving the techniques and/or the tools to meet the requirements of the companies.
The proposal will provide results for:
- education: offering courses, lectures, student projects and solutions for privacy preservation challenges within the educational institutes;
- companies: providing tool evaluation insights based on case studies and proposals for addressing current challenges;
- the research centre (i.e., Creating 010): expanding its expertise on privacy protection technologies and publishing technical reports and papers.
This research will be sustained by actively pursuing follow-up projects.
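As a hedged illustration of the kind of technique such tool-sets commonly offer (pseudonymisation is an assumption here, not a technique named in the proposal), the sketch below replaces a direct identifier with a keyed hash so that records of the same person stay linkable for analysis without exposing the identifier itself. All names and data are hypothetical.

```python
# Illustrative example of one common privacy-preserving step:
# pseudonymisation of a direct identifier via keyed hashing (HMAC).
import hashlib
import hmac
import secrets

# In practice the pseudonymisation secret must be kept separately from the
# data (cf. GDPR Art. 4(5)); here it is generated ad hoc for the demonstration.
PSEUDONYM_SECRET = secrets.token_bytes(32)


def pseudonymise(identifier: str) -> str:
    """Replace a direct identifier with a consistent, non-reversible token."""
    return hmac.new(PSEUDONYM_SECRET, identifier.encode(), hashlib.sha256).hexdigest()


records = [
    {"email": "alice@example.com", "visits": 12},
    {"email": "bob@example.com", "visits": 3},
]

# The analysis dataset keeps rows of the same person linkable (same token)
# without exposing the e-mail address itself.
pseudonymised = [
    {"person": pseudonymise(r["email"]), "visits": r["visits"]} for r in records
]
print(pseudonymised)
```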