Let's Get Personal - Social Personalization for Sustainable Long-Term Educational Robots
DOCUMENT
From the publisher's website: This paper aims to chart the (moral) values of the robotics industry regarding the introduction of robots in education. To our knowledge, no studies thus far have addressed this perspective when considering the moral values within this robotic domain. However, the industry's values could conflict with the values upheld by other relevant stakeholders, such as teachers, parents or children. Hence, it is crucial to take the moral values of the various relevant stakeholders into account. For this study, multiple focus group sessions (n=3) were conducted in The Netherlands with representatives (n=13) of robotics companies on their views of robots in primary education. Their perceptions, in terms of opportunities and concerns, were then linked to business values reported in the extant literature. Results show that, out of 26 business values, six appeared especially relevant for robot tutors: 1) profitability, 2) productivity, 3 & 4) innovation and creativity, 5) competitiveness, and 6) organizational risk orientation. https://doi.org/10.1109/DEVLRN.2019.8850726
DOCUMENT
Abstract: Background: There has been a rapid increase in the population of senior citizens in many countries. The shortage of caregivers is becoming a pressing concern. Robots are being deployed in an attempt to fill this gap and reduce the workload of caregivers. This study explores how healthcare robots are perceived by trainee care professionals. Methods: A total of 2365 students at different vocational levels completed a questionnaire, rating ethical statements regarding beneficence, maleficence, justice, autonomy, utility, and use intentions with regard to three different types of robots (assistive, monitoring, and companion) along with six control variables: gender, age, school year, technical skills, interest in technology, and enjoying working with computers.
DOCUMENT
Through a qualitative examination, this study investigates the moral evaluations of Dutch care professionals regarding healthcare robots for eldercare in terms of biomedical ethical principles and non-utility. Results showed that care professionals primarily focused on maleficence (potential harm done by the robot), deriving from diminished human contact. Worries about potential maleficence were more pronounced among intermediate-educated than among higher-educated professionals. However, both groups deemed companion robots more beneficial than devices that monitor and assist, which were deemed potentially harmful both physically and psychologically. Perceived utility was not related to the professionals' moral stances, countering prevailing views. Increasing patients' autonomy by applying robot care was not part of the discussion, and justice as a moral evaluation was rarely mentioned. Awareness of the care professionals' point of view is important for policymakers, educational institutes, and developers of healthcare robots, so that designs can be tailored to the wants of older adults along with the needs of the much-undervalued eldercare professionals.
DOCUMENT
Social robots have been introduced in different fields such as retail, health care and education. Primary education in the Netherlands (and elsewhere) has recently faced new challenges because of the COVID-19 pandemic, lockdowns and quarantines, including students falling behind and teachers burdened with high workloads. Together with two Dutch municipalities and nine primary schools, we are exploring the long-term use of social robots to study how they might support teachers in primary education, with a focus on mathematics education. This paper presents an explorative study to define requirements for a social robot math tutor. Multiple focus groups were held with the two main stakeholders, namely teachers and students. The aims of the focus groups were 1) to understand the current situation of mathematics education at the upper primary school level, 2) to identify the problems that teachers and students encounter in mathematics education, and 3) to identify opportunities for deploying a social robot math tutor in primary education from the perspective of both teachers and students. The results inform the development of social robots and point to opportunities for the pedagogical methods used in math teaching, child-robot interaction and potential support for teachers in the classroom.
DOCUMENT
While social robots bring new opportunities for education, they also come with moral challenges. Therefore, there is a need for moral guidelines for the responsible implementation of these robots. When developing such guidelines, it is important to include different stakeholder perspectives. Existing (qualitative) studies regarding these perspectives, however, mainly focus on single stakeholders. In this exploratory study, we examine and compare the attitudes of multiple stakeholders on the use of social robots in primary education, using a novel questionnaire that covers various aspects of the moral issues mentioned in earlier studies. Furthermore, we group the stakeholders based on similarities in attitudes and examine which socio-demographic characteristics influence these attitude types. Based on the results, we identify five distinct attitude profiles and show that the probability of belonging to a specific profile is affected by characteristics such as stakeholder type, age, education and income. Our results also indicate that social robots have the potential to be implemented in education in a morally responsible way that takes into account the attitudes of various stakeholders, although there are multiple moral issues that need to be addressed first. Finally, we present seven (practical) implications for a responsible application of social robots in education that follow from our results. These implications provide valuable insights into how social robots should be implemented.
MULTIFILE
Background: Older adults are a rapidly growing group worldwide, requiring an increasing amount of healthcare. Technological innovations such as care robots may help meet the growing demand for care. However, hardly any studies address those who will most closely collaborate with care robots: the (trainee) healthcare professional. Methods: This study examined the moral considerations, perceptions of utility, and acceptance among trainee healthcare professionals toward different types of care robots in an experimental questionnaire design (N = 357). We also examined possible differences between participants at intermediate and higher educational levels. Results: The results show that the potential maleficence of care robots dominated the discussion in both educational groups. Assisting robots were seen as potentially the most maleficent. Both groups deemed companion robots least maleficent and most acceptable, while monitoring robots were perceived as least useful. Results further show that the acceptance of robots in care was more strongly associated with the participants' moral considerations than with utility. Conclusions: Professional care education should include the moral considerations and utility of robotics as an emerging care technology. The healthcare and nursing students of today will collaborate with the robotic colleagues of tomorrow.
LINK
The challenges facing primary education are significant: a growing teacher shortage, relatively high administrative burdens that contribute to work-related stress, and an increasing diversity of children in the classroom. A promising new technology that can help teachers and children meet these challenges is the social robot. These physical robots often use artificial intelligence and can communicate with children by taking on social roles, such as that of a fellow classmate or teaching assistant. Previous research shows that the use of social robots can lead to better results in several ways than when traditional educational technologies are applied. However, social robots not only bring opportunities but also raise new ethical questions. In my PhD research, I investigated the moral considerations of different stakeholders, such as parents and teachers, to create the first guideline for the responsible design and use of social robots in primary education.

Various research methods were used for this study. First, a large international literature study was carried out on the advantages and disadvantages of social robots, in which 256 studies were ultimately analysed. Focus group sessions were then held with stakeholders: a total of 118 parents of primary school children, representatives of the robotics industry, educational policymakers, government education advisors, teachers and primary school children contributed. Based on the insights from the literature review and the focus group sessions, a questionnaire was drawn up and distributed to all stakeholders. Based on 515 responses, we then classified the stakeholders' moral considerations. In the last study, based on in-depth interviews with teachers who used robots in their daily teaching and who supervised the child-robot interaction of >2500 unique children, we studied the influence of social robots on children's social-emotional development.

Our research shows that social robots can have advantages and disadvantages for primary education. The diversity of disadvantages makes the responsible implementation of robots complex. However, overall, despite their concerns, all stakeholder groups viewed social robots as a potentially valuable tool. Many stakeholders are concerned about a possible negative effect of robots on children's social-emotional development. Our research shows that social robots currently do not seem to harm children's social-emotional development when used responsibly, although some children seem to be more sensitive to excessive attachment to robots. Our research also shows that how people think about robots is influenced by several factors. For example, low-income stakeholders have a more sceptical attitude towards social robots in education. Other factors, such as age and level of education, were also strong predictors of the moral considerations of stakeholders.

This research has resulted in a guideline for the responsible use of social robots as teaching assistants, which can be used by primary schools and robot builders. The guideline provides schools with tools, such as involving parents in advance and using robots to encourage human contact. School administrators are also given insight into possible reactions from parents and other parties involved. The guideline also offers recommendations for safeguarding privacy, such as data minimization and improving the technical infrastructure of schools and robots, which still often leaves much to be desired.
In short, the findings from this thesis provide a solid stepping stone for schools, robot designers, programmers and engineers to develop and use social robots in education in a morally responsible manner. This research has thus paved the way for more research into robots as assistive technology in primary education.
LINK
This study aimed (1) to examine the contribution of robot ZORA to achieving therapeutic and educational goals in rehabilitation and special education for children with severe physical disabilities, and (2) to discover the roles professionals attribute to robot ZORA when it is used in robot-based play interventions in rehabilitation and special education for children with severe physical disabilities. A multi-centre mixed methods study was conducted among children with severe physical disabilities in two centres for rehabilitation and one school for special education. The participating children played with robot ZORA six times over a period of 6 weeks, in individual or group sessions. Quantitative data were gathered about the contribution of ZORA to reaching individual goals for all of the participating children, using the Individually Prioritized Problem Assessment (IPPA). Playfulness was measured with a visual analogue scale (0–10), and children could indicate whether they liked the sessions using a scale consisting of smileys. Video-stimulated recall interviews were used to collect qualitative data about the different roles of ZORA. In total, 33 children and 12 professionals participated in the study. The results of the IPPA showed a significant contribution of ZORA to the achievement of (children’s) individual goals. The data gathered using the IPPA during the ZORA-based interventions showed that the largest contributions of robot ZORA lie in the domains of movement skills and communication skills. The playfulness of the sessions was 7.5 on average, and 93% of the sessions were evaluated as ‘enjoyable’ by the children. Overall, ZORA could positively contribute to the achievement of individual goals for children with severe physical disabilities. According to the participating professionals, the most promising roles in which robot ZORA can be used are motivator, rewarder or instructor.
DOCUMENT
Robot tutors provide new opportunities for education. However, they also introduce moral challenges. This study reports a systematic literature review (N = 256) aimed at identifying the moral considerations related to robots in education. While our findings suggest that robot tutors hold great potential for improving education, multiple values of both (special needs) children and teachers are impacted, positively and negatively, by their introduction. Positive values related to robot tutors are psychological welfare and happiness, efficiency, freedom from bias and usability. However, there are also concerns that robot tutors may negatively impact these same values. Other concerns relate to the values of friendship and attachment, human contact, deception and trust, privacy, security, safety and accountability. All these values relate to children and teachers. The moral values of other stakeholder groups, such as parents, are overlooked in the existing literature. The results suggest that, while there is potential for applying robot tutors in a morally justified way, there are important stakeholder groups that need to be consulted so that their moral values are also taken into consideration when implementing tutor robots in an educational setting. (from Narcis.nl)
MULTIFILE