Abstract: Background: The population of senior citizens is growing rapidly in many countries, and the resulting shortage of caregivers is becoming a pressing concern. Robots are being deployed in an attempt to fill this gap and reduce the workload of caregivers. This study explores how healthcare robots are perceived by trainee care professionals. Methods: A total of 2365 students at different vocational levels completed a questionnaire, rating ethical statements regarding beneficence, maleficence, justice, autonomy, utility, and use intentions with regard to three types of robots (assistive, monitoring, and companion), along with six control variables: gender, age, school year, technical skills, interest in technology, and enjoyment of working with computers.
DOCUMENT
From the publisher's website: This paper aims to chart the (moral) values of the robotics industry regarding the introduction of robots in education. To our knowledge, no studies thus far have addressed this perspective when considering the moral values within this robotic domain. However, the industry's values could conflict with those upheld by other relevant stakeholders, such as teachers, parents, or children. Hence, it is crucial to take the moral values of the various relevant stakeholders into account. For this study, multiple focus group sessions (n=3) were conducted in The Netherlands with representatives (n=13) of robotics companies on their views of robots in primary education. Their perceptions, in terms of opportunities and concerns, were then linked to business values reported in the extant literature. Results show that out of 26 business values, six appeared most relevant for robot tutors: 1) profitability, 2) productivity, 3 & 4) innovation and creativity, 5) competitiveness, and 6) risk orientation of the organization. https://doi.org/10.1109/DEVLRN.2019.8850726
DOCUMENT
Through a qualitative examination, this study investigates the moral evaluations of Dutch care professionals regarding healthcare robots for eldercare, in terms of biomedical ethical principles and non-utility. Results showed that care professionals primarily focused on maleficence (potential harm done by the robot), deriving from diminished human contact. Worries about potential maleficence were more pronounced among intermediate-level than among higher-educated professionals. However, both groups deemed companion robots more beneficial than devices that monitor and assist, which were considered potentially harmful both physically and psychologically. Perceived utility was not related to the professionals' moral stances, countering prevailing views. Increasing patients' autonomy by applying robot care was not part of the discussion, and justice as a moral evaluation was rarely mentioned. Awareness of the care professionals' point of view is important for policymakers, educational institutes, and developers of healthcare robots, to tailor designs to the wants of older adults along with the needs of the much-undervalued eldercare professionals.
DOCUMENT