Visually impaired people (VIP) can experience difficulties navigating urban environments. They mostly depend on the environment's infrastructure or on technical solutions such as smartphone apps for navigation. However, apps typically rely on visual and audio feedback, which can be ineffective, distracting and even dangerous. Haptic feedback in the form of vibrations can complement visual and audio feedback where they fall short, reducing cognitive load.

Existing research into wayfinding with haptic feedback to better support navigation for the visually impaired often relies on custom tactile actuators and multiple vibration motors. Although these solutions can be effective, they are often impractical in everyday life or stigmatizing due to their unusual appearance.

To address this issue, we propose a more modular system that can be easily integrated into commercially available smartwatches. Building on existing research, we present a tactile communication method that uses the vibrotactile actuator of a smartwatch to provide VIP with wayfinding information that complements visual and audio feedback. Current smartwatches contain only a single tactile actuator, but this can still be used effectively by focusing on navigation patterns. These patterns are based on research in personal orientation and mobility training with VIP. For example, a vibration pattern represents a concept such as 'attention', 'left' or 'stairs', directing the navigator's attention towards audio or visual information or towards the environment.

In the next phase of this research we will conduct several focus groups and co-creation sessions with VIP and orientation and mobility experts to further specify the requirements and to test the proposed tactile method. In the future, this method could be integrated into existing navigation apps on commercially available devices, complementing visual and audio information and providing VIP with additional wayfinding information via haptic feedback.
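As a minimal sketch of the idea described above, a single vibrotactile actuator can still distinguish concepts through pulse patterns. The concept names come from the abstract; the specific pulse durations, the pause length, and the function names below are illustrative assumptions, not the authors' actual design:

```python
# Hypothetical encoding of wayfinding concepts as vibration patterns for a
# smartwatch with a single vibrotactile actuator. All timing values are
# illustrative assumptions, not the design proposed in the paper.

# Each pattern is a list of pulse durations in milliseconds; pulses are
# separated by a fixed pause. A real app would hand the resulting on/off
# timing list to the platform's vibration API.
PATTERNS = {
    "attention": [100, 100, 100],   # three short pulses
    "left":      [400],             # one long pulse
    "right":     [400, 400],        # two long pulses
    "stairs":    [100, 400, 100],   # short-long-short
}

def waveform(concept, pause_ms=150):
    """Return an alternating on/off timing list (ms) for a vibration API."""
    pulses = PATTERNS[concept]
    timings = []
    for i, on_ms in enumerate(pulses):
        timings.append(on_ms)           # vibrate for this long
        if i < len(pulses) - 1:
            timings.append(pause_ms)    # then pause before the next pulse
    return timings

print(waveform("stairs"))  # [100, 150, 400, 150, 100]
```

Keeping the patterns short and structurally distinct (pulse count and pulse length) is what lets a single actuator carry several concepts without requiring the user's visual or auditory attention.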
The pervasiveness of wearable technology has opened the market for products that analyse running biomechanics and provide feedback to the user. To improve running technique, feedback should target specific biomechanical key points of running and promote an external focus. The aim of this study was to define and empirically test tailored feedback requirements for optimal motor learning in four consumer-available running wearables. First, a screening protocol was developed based on desk research and observations of coaches. Second, the four wearables were tested according to the protocol. Third, the results were reviewed, and four experts identified future requirements. Testing and reviewing the selected wearables with the protocol revealed that only two, less relevant, running biomechanical key points were measured. The feedback provided does promote an external focus of the user. Tailoring was absent in all wearables. These findings indicate that consumer-available running wearables have potential for optimal motor learning but also need improvement.
Recent textile innovations have significantly transformed both the material structures of fibers and fabrics and their sphere of use and applications. At the same time, new recycling concepts and methods to re-use textile waste are rapidly being developed, and many new ways to make use of recycled and reclaimed fibers have already been found. In this paper, we describe how the development of a new textile made from recycled fibers sparked the development of Textile Reflexes, a robotic textile that can change shape. The paper elaborates on the development of the new textile material, the multidisciplinary approach we take to advance it towards a robotic textile, and our first endeavours to implement it in a health & wellbeing context. Textile Reflexes was applied in a vest that supports posture correction and training, which was evaluated in a user study. In this way, the paper demonstrates a material and product design study that bridges disciplines and links to both environmental and social change.

doi: 10.21606/dma.2017.610

This work is licensed under a Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License. https://creativecommons.org/licenses/by-nc-sa/4.0/