This paper argues that online privacy controls are based on a transactional model of privacy, which sustains a collective myth of consensual data practices. As an alternative, it proposes the notion of privacy coordination, and frames realizing this vision as a grand challenge in Ethical UX.
DOCUMENT
For CX professionals, it is one of the most important questions in the rapidly digitalizing economy. And the question is only growing: how should we handle customer privacy while that same consumer is asking for more personalized services?
LINK
Smart speakers are heralded as making everyday life more convenient in households around the world. These voice-activated devices have become part of intimate domestic contexts in which users interact with platforms. This chapter presents a dual study investigating the privacy perceptions of smart speaker users and non-users. Data collected through in-depth interviews and focus groups with Dutch users and non-users show that they make sense of privacy risks through imagined sociotechnical affordances. Imagined affordances emerge from the interplay between user expectations, technologies, and designer intentions. Affordances such as controllability, assistance, conversation, linkability, recordability, and locatability are associated with privacy considerations. Viewing this observation in the light of privacy calculus theory, we provide insights into how users' positive experiences of the control and assistance that smart speakers offer in the home outweigh their privacy concerns. Non-users, in contrast, reject the devices for fear that recordability and locatability would breach the privacy of their homes by tapping data to platform companies. Our findings emphasize the dynamic nature of privacy calculus considerations and how these interact with imagined affordances, establishing a contrast between rational and emotional responses relating to smart speaker use. Emotions play a pivotal role in adoption considerations, whereby respondents balance fears of unknown malicious actors against trust in platform companies. This study paves the way for further research into how smart technologies are increasingly normalizing surveillance in the home.
DOCUMENT
For marketers and CX professionals, it is one of the most important questions in the rapidly digitalizing economy. And the question is only growing: how should we handle consumer privacy while that same consumer is asking for more personalized services? What can marketers and CX professionals do to strike a balance between the opportunities and the risks in the use of personal data?
LINK
Following the rationale of the current EU legal framework protecting personal data, children are entitled to the same privacy and data protection rights as adults. However, the child, by reason of his or her physical and mental immaturity, needs special safeguards and care, including appropriate legal protection. In the online environment, children are less likely to make any checks or judgments before entering personal information. This paper therefore presents an analysis of the extent to which EU regulation can ensure children's online privacy and data protection.
DOCUMENT
In this paper we explore the extent to which privacy-enhancing technologies (PETs) can be effective in providing privacy to citizens. The rapid development of ubiquitous computing and the 'internet of things' is leading to Big Data and the application of predictive analytics, effectively merging the real world with cyberspace. The power of information technology is increasingly used to provide personalised services to citizens, resulting in the availability of huge amounts of sensitive data about individuals, with potential and actual privacy-eroding effects. To protect the private sphere, deemed essential in a state governed by the rule of law, information and communication technology (ICT) systems should meet the requirements laid down in numerous privacy regulations. Sensitive personal information may be captured by organizations only if the person providing the information consents to its being gathered, and it may be used only for the express purpose for which it was gathered; any other use of information about persons without their consent is prohibited by law, notwithstanding legal exceptions. If regulations are properly translated into written code, they become part of the outcomes of an ICT system, and that system will therefore be privacy compliant. We conclude that privacy compliance in this 'technological' sense cannot fully meet citizens' concerns, and should therefore be augmented by a conceptual model that makes privacy impact assessments at the level of citizens' lives possible.
DOCUMENT
Data collected from fitness trackers worn by employees could be very useful to businesses. Sharing this data with employers is already a well-established practice in the United States, and companies in Europe are showing an interest in introducing such devices among their workforces. Our argument is that employers' processing of their employees' fitness tracker data is unlikely to be lawful under the General Data Protection Regulation (GDPR). Wearable fitness trackers, such as Fitbit and Apple Watch devices, collate intimate data about the wearer's location, sleep, and heart rate. As a result, we consider that they not only represent a novel threat to the privacy and autonomy of the wearer, but that the data gathered constitute 'health data' regulated by Article 9. Processing health data, including, in our view, fitness tracking data, is prohibited unless one of the conditions specified in the GDPR applies. After examining a number of legitimate bases on which employers could rely, we conclude that the data processing practices considered do not comply with the principle of lawfulness that is central to the GDPR regime. We suggest an alternative scheme by which wearable fitness trackers could be integrated into an organization to support healthy habits among employees, but in a manner that respects the data privacy of the individual wearer.
MULTIFILE
In this project we examine the laws and regulations surrounding data collection with sensors in assistive technology, and review the literature on people's concerns about this technology. We also look into the Smart Teddy device and how it operates. An analysis required by the General Data Protection Regulation (GDPR) [5] will reveal the risks to privacy and security in this project and how to mitigate them.
MULTIFILE
With population aging and the expected shortage of formal and informal caregivers, emerging technologies for assistive living are on the rise. Focusing on the perspective of the prospective users of these technologies, this study investigates the perceived drivers of and barriers to AAL adoption. An online survey among 1296 Dutch older adults was conducted. Although loss of privacy was identified as a major barrier to AAL adoption in previous research, the current study provides statistical evidence that these concerns are secondary to the expected benefits of safe and independent living. These findings suggest that older adults consider aging safely in their trusted home environment a valid trade-off for some loss of privacy. Despite these results, we urge developers to be mindful of privacy aspects when developing AAL applications, as privacy concerns still had a significant negative influence on attitudes towards using AAL.
DOCUMENT