Urban environments are full of noise and obstacles, and therefore potentially dangerous and difficult to navigate for the visually impaired. Using Bluetooth beacons and a smartphone app, we guide them through these environments by providing the information needed at each specific location. We present preliminary results concerning the usability of our approach.
Introduction: Visually impaired people experience difficulty with navigation and orientation due to their reduced ability to rely on eyesight to monitor the environment [1][2]. Smartphones such as the iPhone are already popular navigation aids among the visually impaired [3]. We explored whether an iPhone application that responds to Bluetooth beacons to inform users about their environment could aid the visually impaired in navigating an urban environment.

Method: We tested the implementation with visually impaired people on the route from the Amsterdam Bijlmer train station to the Royal Dutch Visio office. Bluetooth beacons were attached at a height of two meters to lampposts and traffic signs along the route to give the user instructions via a custom-made iPhone app. Three obstacle types were identified and implemented in the app: a crossing with traffic signs, a car park entrance, and objects blocking the pathway, such as stairs. Based on the work of Atkin et al. [5] and Havik et al. [6], at each obstacle the beacon triggers the app to present important information about the surroundings: potential hazards nearby, how to navigate around or through obstacles, and information about the next obstacle. The information is presented using pictures of the environment and instructions in text and voice, based on Giudice et al. [4]. The application uses Apple's accessibility features to communicate the instructions through the VoiceOver screen reader. The app allows the user to preview the route, to prepare for upcoming obstacles and landmarks. Finally, users can customize the app by specifying the amount of detail in the images and information it presents.

To determine whether the app is more useful to the participants than their current navigational method, participants walked the route both with and without the application. When walking with the app, participants were guided by the app; when walking without it, they used their own navigational method. During both walks a supervisor ensured the safety of the participant. During both walks, after each obstacle, participants were asked how safe they felt, on a five-point Likert scale where one stood for "feeling very safe" and five for "feeling very unsafe". Qualitative feedback on the usability of the app was collected using the think-aloud method during walking and through an interview after walking.

Results: Five visually impaired people participated, one female and four males, aged 30 to 78 and with varying levels of visual limitation. Three participants were familiar with the route and two walked it for the first time. After each obstacle participants rated how safe they felt on the five-point Likert scale. We normalized the results by subtracting the scores of the walk without the app from the scores of the walk with the app. The average over all participants is shown in figure 2. When passing the traffic light halfway along the route, participants felt safer with the app than without it.

Summarizing the qualitative feedback, all participants indicated feeling supported by the app. They found the type of instructions well suited to walking and learning new routes. Of the five participants, three found the length of the instructions appropriate and two found them too long; the latter would like the detailed instructions split into a short instruction with an option for more detail. They felt that a detailed instruction gave too much information in a hazardous environment such as a crossing. Two participants found the information focused on orientation unnecessary, while three participants liked knowing their surroundings.

Conclusion and discussion: Regarding the safety questions, we see that participants felt safer with the app, especially when crossing the road at traffic lights. We believe this large difference compared with the other obstacles is due to the crossing being considered more dangerous; this is reflected in their feedback requesting less detailed information at these locations. All participants indicated feeling supported and at ease with our application, stating they would use it when walking new routes. Because of the small sample size, we consider our results an indication that the app can be of help, and a good starting point for further research on guiding people through an urban environment using beacons.
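For illustration, below is a minimal sketch of how such beacon-triggered instructions could be implemented with Apple's Core Location framework and the VoiceOver announcement API. The beacon UUID, the mapping from beacon major values to obstacles, and the instruction texts are hypothetical placeholders, not details from the study's app:

```swift
import CoreLocation
import UIKit

// Sketch: range iBeacons along the route and let VoiceOver announce
// a location-specific instruction. UUID, major values and texts are
// illustrative assumptions.
final class BeaconGuide: NSObject, CLLocationManagerDelegate {
    private let manager = CLLocationManager()
    // Hypothetical: all route beacons share one UUID; the major value
    // identifies the obstacle a beacon is attached to.
    private let routeUUID = UUID(uuidString: "E2C56DB5-DFFB-48D2-B060-D0F5A71096E0")!
    private let instructions: [CLBeaconMajorValue: String] = [
        1: "Crossing with traffic lights ahead. The push button is on your right.",
        2: "Car park entrance ahead. Cars may cross the sidewalk here.",
        3: "Stairs ahead: five steps down, handrail on the left."
    ]

    func start() {
        manager.delegate = self
        manager.requestWhenInUseAuthorization()
        manager.startRangingBeacons(satisfying: CLBeaconIdentityConstraint(uuid: routeUUID))
    }

    func locationManager(_ manager: CLLocationManager,
                         didRange beacons: [CLBeacon],
                         satisfying constraint: CLBeaconIdentityConstraint) {
        // React to a beacon only once it is near enough to be relevant.
        guard let nearest = beacons.first(where: { $0.proximity == .immediate || $0.proximity == .near }),
              let text = instructions[CLBeaconMajorValue(truncating: nearest.major)] else { return }
        // Have the VoiceOver screen reader speak the instruction.
        UIAccessibility.post(notification: .announcement, argument: text)
    }
}
```

In practice such an app would also need to debounce announcements, so an instruction is not repeated on every ranging callback while the user stands near a beacon.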
People with a visual impairment (PVI) often experience difficulties with wayfinding. Current navigation applications have limited communication channels and do not provide sufficiently detailed information to support PVI. By transmitting wayfinding information via multimodal channels and combining these with wearables, we can provide tailored information for wayfinding and reduce cognitive load. This study presents a framework for multimodal wayfinding communication via smartwatch. The framework consists of four modalities: audio, voice, tactile and visual. Audio and voice messages are transmitted through a bone conduction headphone, keeping the ears free to focus on the environment. With a smartwatch, vibrations are directed to a sensitive part of the body (i.e., the wrist), making them easier to sense. Icons and short textual feedback are shown on the display of the watch, allowing for hands-free navigation.
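As an illustration of how such a framework might coordinate its channels on a watch, below is a minimal Swift sketch for watchOS. The cue set, the haptic choices and the spoken texts are illustrative assumptions, not the framework's specified design:

```swift
import WatchKit
import AVFoundation

// Sketch: dispatch one wayfinding cue over the tactile, voice and
// visual channels of a smartwatch. Cue names and texts are assumptions.
enum WayfindingCue: String {
    case attention = "Attention"
    case turnLeft  = "Turn left"
    case turnRight = "Turn right"
}

final class CueDispatcher {
    private let speech = AVSpeechSynthesizer()

    func present(_ cue: WayfindingCue) {
        // Tactile: a wrist vibration signals that a cue is coming.
        WKInterfaceDevice.current().play(haptic(for: cue))
        // Voice: spoken detail, routed to e.g. a bone conduction headphone.
        speech.speak(AVSpeechUtterance(string: cue.rawValue))
        // Visual: the hosting interface would additionally show a matching
        // icon and the short text `cue.rawValue` on the watch display.
    }

    private func haptic(for cue: WayfindingCue) -> WKHapticType {
        switch cue {
        case .attention: return .notification
        case .turnLeft:  return .directionDown
        case .turnRight: return .directionUp
        }
    }
}
```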
Visually impaired people (VIP) can experience difficulties in navigating urban environments. They mostly depend on the environment's infrastructure or on technical solutions such as smartphone apps for navigation. However, apps typically use visual and audio feedback, which can be ineffective, distracting and dangerous. Haptic feedback in the form of vibrations can complement visual and audio feedback where they fall short, reducing cognitive load.

Existing research into wayfinding with haptic feedback for the visually impaired often relies on custom tactile actuators and multiple vibration motors. Although these solutions can be effective, they are often impractical in everyday life or stigmatizing due to their unusual appearance.

To address this issue, we propose a more modular system that can be easily integrated into commercially available smartwatches. Based on existing research, we present a tactile communication method that utilizes the vibrotactile actuator of a smartwatch to provide VIP with wayfinding information complementing visual and audio feedback. Current smartwatches contain a single tactile actuator, but this can still be used for navigation by relying on distinguishable vibration patterns. These patterns are based on research in personal orientation and mobility training with VIP. For example, a vibration pattern represents a concept like 'attention', 'left' or 'stairs', directing the navigator's attention towards audio or visual information or to the environment.

In the next phase of this research we will conduct several focus groups and co-creation sessions with VIP and orientation and mobility experts to further specify the requirements and test the proposed tactile method. In the future, this method could be integrated into existing navigation apps on commercially available devices to complement visual and audio information and provide VIP with additional wayfinding information via haptic feedback.
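A minimal sketch of what such a pattern vocabulary could look like on a commercial smartwatch with a single actuator is shown below; the concept-to-pattern mapping and the pulse timings are illustrative assumptions, not the patterns under evaluation:

```swift
import Foundation
import WatchKit

// Sketch: encode wayfinding concepts as pulse sequences on the single
// vibrotactile actuator of a smartwatch. Timings are assumptions.
struct TactileVocabulary {
    // Each concept maps to the onsets (in seconds) of its pulses:
    // 'attention' is one pulse, 'left' two quick pulses, 'right' three,
    // and 'stairs' three slow pulses.
    private let patterns: [String: [TimeInterval]] = [
        "attention": [0],
        "left":      [0, 0.3],
        "right":     [0, 0.3, 0.6],
        "stairs":    [0, 0.8, 1.6]
    ]

    func play(_ concept: String) {
        guard let onsets = patterns[concept] else { return }
        for onset in onsets {
            DispatchQueue.main.asyncAfter(deadline: .now() + onset) {
                WKInterfaceDevice.current().play(.click)
            }
        }
    }
}
```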
Background: Construction work, squares, busy sidewalks, road crossings and hasty cyclists - these are just a few of the many challenges visually impaired people (VI) encounter when navigating urban environments. Unknown routes in particular require so much concentration and energy that VI often choose to stay at home or to travel with assistance. The EyeBeacons project investigates how a wayfinding smartphone app for VI can support independent travel.

Methods: Two clickable prototypes of a wayfinding app for VI were created based on the user requirements from a pilot study. The first app contained many personalization options for route planning, while the second app used pre-defined user profiles (a wizard) to create personal routes. Both apps were evaluated in a co-creation workshop with 16 participants: VI (6), VI care professionals (5) and UX/ICT experts (5). During the workshop, several UX design tools (e.g., customer journey maps) were used to evaluate the apps.

Findings: Our preliminary results show that both apps were considered to add value to VI's current route planning and wayfinding practices. Surprisingly, the first app, which offered many personalization options but consequently involved more interaction steps, was preferred over the optimized wizard design of the second app. The main reason given for this choice was the limited insight into the reasoning behind route selection. Key features that participants missed in both prototype apps included, for example, a function to repeat navigation cues on request.

Discussion: This study provides valuable new insights for the design of wayfinding apps that allow VI to navigate safely and independently through challenging urban environments. Furthermore, we found that co-creation with the target group works well using common UX design methods, as long as some extra facilitation is provided to the VI. By using clickable prototypes, both VI and professionals were able to experience and evaluate the design prototypes.
HB2006: Proceedings of the 8th International Conference Healthy Buildings. Oliveira Fernandez, E. de; Gameiro da Silva, M.; Rosada Pinto, J. (eds.). ISBN 989-95067-1-0. 4-8 June 2006, Lisbon, Portugal, volume III, p. 279-282.
Background: Impaired upper extremity function due to muscle paresis or paralysis has a major impact on independent living and quality of life (QoL). Assistive technology (AT) for upper extremity function (i.e. dynamic arm supports and robotic arms) can increase a client's independence. Previous studies revealed that clients often do not use AT to its full potential, due to suboptimal provision of these devices in usual care.

Objective: To optimize the process of providing AT for impaired upper extremity function and to evaluate its (cost-)effectiveness compared with care as usual.

Methods: Development of a protocol to guide the AT provision process in an optimized way according to generic Dutch guidelines; a quasi-experimental study with non-randomized, consecutive inclusion of a control group (n = 48) receiving care as usual and an intervention group (n = 48) receiving the optimized provision process; and a cost-effectiveness and cost-utility analysis from a societal perspective will be performed. The primary outcome is clients' satisfaction with the AT and related services, measured with the Quebec User Evaluation of Satisfaction with AT (Dutch version; D-QUEST). Secondary outcomes comprise complaints of the upper extremity, restrictions in activities, QoL, medical consumption and societal cost. Measurements are taken at baseline and at 3, 6 and 9 months follow-up.
In the Netherlands, over 40% of nursing home residents are estimated to have visual impairments, resulting in the loss of basic visual abilities. The degree to which the nursing home environment supports residents' activities and social participation is referred to as environmental fit. To raise professional awareness of environmental fit, an Environmental Observation tool for the Visually Impaired was developed. This tool targets aspects of the nursing home environment such as 'light', the use of 'colours and contrasts', and 'furnishing and obstacles'. The objective of this study was to validate the content of the observation tool so that it can be applied in practice. Following the content validity approach, we invited eight experts, six eye care professionals and two building engineering researchers, to judge the relevance of the items. The Item Content Validity approach was applied to determine which items to retain and which to reject, reducing the number of items from 63 to 52. The definitive tool of 52 items contains 21 items for Corridors, 17 for the Common Room, and 14 for the Bathroom. All items of the definitive tool received an Item-Content Validity Index of 0.875, and the tool received a Scale-Content Validity Index of 0.71. The resulting tool can be applied in nursing homes and might serve as a starting point for discussion among professional caregivers on environmental interventions for visually impaired older adults in nursing homes.
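For readers unfamiliar with these indices: the Item-Content Validity Index (I-CVI) is the proportion of experts rating an item as relevant, so with eight experts an I-CVI of 0.875 corresponds to agreement by seven of the eight; the scale-level index (averaging method) is the mean of the item indices. Below is a minimal sketch of the computation, assuming the commonly used convention that a rating of 3 or 4 on a 4-point relevance scale counts as "relevant" (an assumption, not a detail reported here):

```swift
// Sketch of the content validity computation. Ratings are illustrative.
// I-CVI: share of experts rating the item 3 or 4 on a 4-point scale.
func itemCVI(ratings: [Int]) -> Double {
    let relevant = ratings.filter { $0 >= 3 }.count
    return Double(relevant) / Double(ratings.count)
}

// S-CVI (averaging method): mean of the item-level indices.
func scaleCVI(ratingsPerItem: [[Int]]) -> Double {
    let indices = ratingsPerItem.map(itemCVI)
    return indices.reduce(0, +) / Double(indices.count)
}

// Example: 7 of 8 experts rate an item as relevant.
let iCVI = itemCVI(ratings: [4, 4, 3, 4, 3, 3, 4, 2])  // 0.875
```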
More and more people suffer from age-related eye conditions, such as macular degeneration. One of the problems these people experience is navigation. A strategy shown by many juvenile visually impaired persons (VIPs) is using auditory information for navigation. It is therefore important to train VIPs with age-related impairments to use auditory information for navigation. To this end, the serious game HearHere was developed to train the focused auditory attention of age-related VIPs, enhancing the use of auditory information for navigation; it is available as a tablet application. Players of the game are instructed to navigate virtually, as quickly as possible, to a specific sound, which requires focused auditory attention. In an experimental study, the effectiveness of the game in improving focused auditory attention was examined. Forty participants were included, all students of the University of Groningen with normal or corrected-to-normal vision. By including sighted participants, we could investigate whether someone used to relying on vision could improve their focused auditory attention after playing HearHere. As a control, participants played a digital version of Sudoku; the order of playing the games was counterbalanced. Participants performed a dichotic listening task before playing any game, after playing the first game, and after playing the second game. Participants improved significantly more on the dichotic listening task after playing HearHere (p<.001) than after playing Sudoku (p=.040). This means the game indeed improves focused auditory attention, a skill necessary for navigating by sound. In conclusion, we recommend making the game part of the orientation and mobility program, offering age-related VIPs the opportunity to practice using auditory information for navigation. We are currently working on a version suitable for actual use.
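As an illustration of the game's core mechanic, the sketch below places a target sound at a 3D position around the listener with AVFoundation's environment node, so that the player has to localize it by ear. The file name, coordinates and rendering settings are assumptions about how such a game could be built, not HearHere's actual implementation:

```swift
import AVFoundation

// Sketch: render a mono target sound at a 3D position around the
// listener. The caller must keep the returned engine alive while
// the sound plays.
func playTargetSound() throws -> AVAudioEngine {
    let engine = AVAudioEngine()
    let environment = AVAudioEnvironmentNode()
    let player = AVAudioPlayerNode()
    engine.attach(environment)
    engine.attach(player)

    // Hypothetical sound file; spatialization needs a mono source.
    let file = try AVAudioFile(forReading: URL(fileURLWithPath: "target.caf"))
    let mono = AVAudioFormat(standardFormatWithSampleRate: file.processingFormat.sampleRate,
                             channels: 1)
    engine.connect(player, to: environment, format: mono)
    engine.connect(environment, to: engine.mainMixerNode, format: nil)

    // Head-related transfer functions make direction audible on headphones.
    player.renderingAlgorithm = .HRTFHQ
    // Place the target two meters ahead and one meter to the right.
    player.position = AVAudio3DPoint(x: 1, y: 0, z: -2)
    environment.listenerPosition = AVAudio3DPoint(x: 0, y: 0, z: 0)

    try engine.start()
    player.scheduleFile(file, at: nil, completionHandler: nil)
    player.play()
    return engine
}
```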