This paper argues that online privacy controls are based on a transactional model of privacy, which sustains a collective myth of consensual data practices. As an alternative, it proposes the notion of privacy coordination, and frames the realization of this vision as a grand challenge in Ethical UX.
According to Johnson & Grandison (2007), failure to safeguard the privacy of users of services provided by private and governmental organisations exposes individuals to a number of undesirable effects of information processing. Loss of control over information about a person may lead to fraud, identity theft, and reputation damage, and may cause psychosocial consequences ranging from mild irritation and unease to social exclusion, physical harm or even, in extreme cases, death. Although pooh-poohed by some opinion leaders from the search engine and ICT industries for over a decade (Sprenger, 1999; Esguerra, 2009), the debate in the wake of events like the tragic case of Amanda Todd could be interpreted as supporting the case for proper attention to citizens’ privacy. Truth be told, for a balanced discussion of privacy in the age of Facebook one should not turn to the social media environment, which seems to hail every new development in big data analysis and profiling-based marketing as a breathtaking innovation. If the myopic view of technology pundits is put aside, a remarkably lively debate on privacy and related issues can be discerned in the media and scientific communities alike. A quick keyword search on ‘privacy’, limited to the years 2000-2015, yields huge numbers of publications: WorldCat lists 19,240; ScienceDirect 52,566; IEEE Xplore 71,684; and Google Scholar a staggering 1,880,000. This makes clear that privacy is still considered a relevant concept by the general public as well as academic and professional audiences. Quite impressive for a subject area that has been declared ‘dead’.
Human rights groups are increasingly calling for the protection of their right to privacy in relation to the bulk surveillance and interception of their personal communications. Some are pursuing this through strategic litigation, an advocacy tool often chosen when political or public support for an issue is weak. Nonetheless, it remains an open question whether a lawsuit is in fact strategic in the context of establishing accountability for indiscriminate bulk data interception. The chapter concludes that, from a legal perspective, the decision to litigate on the basis of the claim that a collective right to group privacy was violated has not (yet) resulted in significant change. Yet the case study, the British case of human rights groups versus the intelligence agencies, does suggest that the groups have been able to create more public awareness about mass surveillance and interception programs and their side-effects.
Smart speakers are heralded as making everyday life more convenient in households around the world. These voice-activated devices have become part of intimate domestic contexts in which users interact with platforms. This chapter presents a dual study investigating the privacy perceptions of smart speaker users and non-users. Data collected through in-depth interviews and focus groups with Dutch users and non-users show that they make sense of privacy risks through imagined sociotechnical affordances. Imagined affordances emerge from the interplay between user expectations, technologies, and designer intentions. Affordances such as controllability, assistance, conversation, linkability, recordability, and locatability are associated with privacy considerations. Viewing this observation in the light of privacy calculus theory, we provide insights into how users’ positive experiences of the control and assistance that smart speakers offer in the home outweigh their privacy concerns. Non-users, by contrast, reject the devices for fear that recordability and locatability would breach the privacy of their homes by feeding data to platform companies. Our findings emphasize the dynamic nature of privacy calculus considerations and how these interact with imagined affordances, establishing a contrast between rational and emotional responses relating to smart speaker use. Emotions play a pivotal role in adoption considerations, whereby respondents balance fears of unknown malicious actors against trust in platform companies. This study paves the way for further research examining how surveillance in the home is becoming increasingly normalized by smart technologies.
Following the rationale of the current EU legal framework protecting personal data, children are entitled to the same privacy and data protection rights as adults. However, children, because of their physical and mental immaturity, need special safeguards and care, including appropriate legal protection. In the online environment, children are less likely to make any checks or judgments before entering personal information. Therefore, this paper presents an analysis of the extent to which EU regulation can ensure children’s online privacy and data protection.
Data collected from fitness trackers worn by employees could be very useful for businesses. The sharing of such data with employers is already a well-established practice in the United States, and companies in Europe are showing an interest in introducing such devices among their workforces. Our argument is that employers processing their employees’ fitness tracker data is unlikely to be lawful under the General Data Protection Regulation (GDPR). Wearable fitness trackers, such as Fitbit and Apple Watch devices, collate intimate data about the wearer’s location, sleep and heart rate. As a result, we consider that they not only represent a novel threat to the privacy and autonomy of the wearer, but that the data gathered constitute ‘health data’ regulated by Article 9. Processing health data, including, in our view, fitness tracking data, is prohibited unless one of the specified conditions in the GDPR applies. After examining a number of legitimate bases on which employers could rely, we conclude that the data processing practices considered do not comply with the principle of lawfulness that is central to the GDPR regime. We suggest alternative schemes by which wearable fitness trackers could be integrated into an organization to support healthy habits amongst employees, but in a manner that respects the data privacy of the individual wearer.
In this project we examine the laws and regulations surrounding data collection using sensors in assistive technology, as well as the literature on people’s concerns about this technology. We also look into the Smart Teddy device and how it operates. An analysis required by the General Data Protection Regulation (GDPR) [5] will reveal the privacy and security risks in this project and how to mitigate them.
The aim of this cross-sectional study was to develop a Frailty at Risk Scale (FARS) incorporating ten well-known determinants of frailty: age, sex, marital status, ethnicity, education, income, lifestyle, multimorbidity, life events, and home living environment. A second aim was to develop an online calculator that can easily support healthcare professionals in determining the risk of frailty among community-dwelling older people. The FARS was developed using data from 373 people aged ≥ 75 years. The Tilburg Frailty Indicator (TFI) was used to assess frailty. Multivariate logistic regression analysis showed that the determinants multimorbidity, unhealthy lifestyle, and ethnicity (ethnic minority) were the most important predictors. The area under the curve (AUC) of the model was 0.811 (optimism 0.019, 95% bootstrap CI = −0.029; 0.064). The FARS is offered on a website so that it can be easily used by healthcare professionals, enabling timely interventions to promote quality of life among community-dwelling older people.
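To make the modelling step concrete, the following is a minimal sketch, in Python with scikit-learn, of the kind of pipeline the abstract describes: fitting a multivariate logistic regression for frailty risk and correcting the apparent AUC for optimism via bootstrapping. The column names, the simulated data, and the assumed TFI cut-off are hypothetical illustrations; the published FARS coefficients are not reproduced here.

    # Hedged sketch of a FARS-style analysis: logistic regression for frailty
    # risk with an optimism-corrected AUC. Data and coefficients are simulated.
    import numpy as np
    import pandas as pd
    from sklearn.linear_model import LogisticRegression
    from sklearn.metrics import roc_auc_score

    rng = np.random.default_rng(0)
    n = 373  # sample size reported in the study

    # Hypothetical determinants: three binary predictors stand in for the
    # most important ones named in the abstract.
    df = pd.DataFrame({
        "multimorbidity": rng.integers(0, 2, n),
        "unhealthy_lifestyle": rng.integers(0, 2, n),
        "ethnic_minority": rng.integers(0, 2, n),
    })
    # Simulate frailty status (in the study: TFI score above a cut-off).
    lin = (1.2 * df["multimorbidity"] + 0.9 * df["unhealthy_lifestyle"]
           + 0.7 * df["ethnic_minority"] - 1.2)
    df["frail"] = (rng.random(n) < 1 / (1 + np.exp(-lin))).astype(int)

    X = df[["multimorbidity", "unhealthy_lifestyle", "ethnic_minority"]].to_numpy()
    y = df["frail"].to_numpy()

    model = LogisticRegression().fit(X, y)
    apparent_auc = roc_auc_score(y, model.predict_proba(X)[:, 1])

    # Bootstrap optimism correction (Harrell-style): refit on resamples and
    # compare each refit's AUC on its resample with its AUC on the original data.
    optimism = []
    for _ in range(200):
        idx = rng.integers(0, n, n)
        m = LogisticRegression().fit(X[idx], y[idx])
        optimism.append(
            roc_auc_score(y[idx], m.predict_proba(X[idx])[:, 1])
            - roc_auc_score(y, m.predict_proba(X)[:, 1])
        )

    print(f"apparent AUC = {apparent_auc:.3f}")
    print(f"optimism-corrected AUC = {apparent_auc - np.mean(optimism):.3f}")

A fitted model of this form is also what an online calculator such as the one described would evaluate: entering a person's determinant values yields a predicted frailty probability from the logistic equation.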
This matrix is generic. It is a tool for data stewards and other research supporters to assist researchers in taking appropriate measures for the safe use and protection of data about people in scientific research. It is a template that you can adjust to the context of your own institution, faculty, and/or department by taking into account your setting’s own policies, guidelines, infrastructure, and technical solutions. In this way you can more effectively determine the appropriate technical and organizational measures to protect the data, based on the context of the research and the risks associated with the data.
Editorial on the Research Topic "Leveraging artificial intelligence and open science for toxicological risk assessment"