People counting is a challenging task with many applications. We propose a method for a fixed stereo camera, based on projecting a template onto the depth image. The method runs in real time and achieved good results on a challenging outdoor dataset.
In this paper we propose a head detection method using range data from a stereo camera. The method is based on a technique originally introduced for voxel data. For application to stereo cameras, the technique is extended (1) to be applicable to stereo data and (2) to be robust to noise and variation in environmental settings. The method consists of foreground selection, head detection, and blob separation, and, to improve results in the case of misdetections, it incorporates people tracking. It is tested in experiments on actual stereo data gathered from three distinct real-life scenarios. Experimental results show that the proposed method performs well in terms of both precision and recall, and that it also performs well in highly crowded situations. From our results, we may conclude that the proposed method provides a strong basis for head detection in applications that utilise stereo cameras.
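As a purely illustrative sketch of the core idea described in the two abstracts above (in Python with OpenCV, not the authors' implementation): select foreground pixels from the depth image, correlate a disc-shaped head template over the foreground, and take well-separated score maxima as head candidates. The fixed head radius, depth threshold, and score threshold are assumptions for illustration only.

import numpy as np
import cv2

def head_candidates(depth, head_radius_px=20, fg_max_depth_m=4.0, score_thr=0.7):
    """Return (x, y) head candidates from a depth image in metres (0 = no measurement)."""
    # Foreground selection: keep valid pixels closer than fg_max_depth_m.
    fg = ((depth > 0) & (depth < fg_max_depth_m)).astype(np.float32)

    # Disc-shaped head template; using one fixed pixel radius is a simplification
    # (a real system would scale the template with the measured depth).
    size = 2 * head_radius_px + 1
    template = np.zeros((size, size), np.float32)
    cv2.circle(template, (head_radius_px, head_radius_px), head_radius_px, 1.0, -1)
    template /= template.sum()

    # "Project" the template onto the depth-derived foreground:
    # score = fraction of the disc covered by foreground pixels.
    score = cv2.filter2D(fg, -1, template, borderType=cv2.BORDER_CONSTANT)

    # Blob separation: repeatedly take the best remaining maximum and suppress
    # its neighbourhood so nearby people yield separate detections.
    candidates = []
    while True:
        _, max_val, _, max_loc = cv2.minMaxLoc(score)
        if max_val < score_thr:
            break
        candidates.append(max_loc)
        cv2.circle(score, max_loc, 2 * head_radius_px, 0.0, -1)
    return candidates

Tracking across frames, as mentioned in the abstract, would then associate these candidates over time to smooth out occasional misdetections.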
The purpose of this study was to determine the efficacy of an online self-tracking program on physical activity, glycated hemoglobin, and other health measures in patients with type 2 diabetes. Seventy-two patients with type 2 diabetes were randomly assigned to an intervention or control group. All participants received usual care. The intervention group received an activity tracker (Fitbit Zip) connected to an online lifestyle program. Physical activity was analyzed as average steps per day from week 0 to week 12. Health outcomes were measured in both groups at baseline and after 13 weeks. Results indicated that the intervention group significantly increased physical activity, by 1.5 ± 3 days per week of engaging in 30 minutes of moderate-to-vigorous physical activity, versus no increase in the control group (P = .047). Intervention participants increased their activity by 1255 ± 1500 steps per day compared to baseline (P < .010). No significant difference was found in glycated hemoglobin (HbA1c), with a change of -0.28% ± 1.03% in the intervention group versus -0.0% ± 0.69% in the control group (P = .206). Responders (56%, defined as an increase of at least 1000 steps/d) showed a significantly greater decrease in glycated hemoglobin than nonresponders (-0.69% ± 1.18% vs 0.22% ± 0.47%, respectively; P = .007). To improve the effectiveness of eHealth programs, additional strategies are needed.
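As an illustration of the responder analysis reported above (not the study's actual analysis code), the sketch below labels participants as responders when their average daily step count increases by at least 1000 steps and compares the HbA1c change of responders and non-responders. The DataFrame column names and the use of Welch's t-test are assumptions.

import pandas as pd
from scipy import stats

def responder_analysis(df: pd.DataFrame) -> dict:
    """df: one row per participant with hypothetical columns
    steps_week0, steps_week12, hba1c_week0, hba1c_week13."""
    df = df.copy()
    df["step_change"] = df["steps_week12"] - df["steps_week0"]
    df["hba1c_change"] = df["hba1c_week13"] - df["hba1c_week0"]
    df["responder"] = df["step_change"] >= 1000  # threshold taken from the abstract

    resp = df.loc[df["responder"], "hba1c_change"]
    nonresp = df.loc[~df["responder"], "hba1c_change"]
    res = stats.ttest_ind(resp, nonresp, equal_var=False)  # Welch's t-test (assumed)
    return {
        "responder_share": df["responder"].mean(),
        "hba1c_change_responders": resp.mean(),
        "hba1c_change_nonresponders": nonresp.mean(),
        "p_value": res.pvalue,
    }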
Everyone has the right to participate in society to the best of their ability. This right also applies to people with a visual impairment in combination with a severe or profound intellectual and possibly motor disability (VISPIMD). However, due to their limitations, these people are often highly dependent for their participation on those around them, such as family members and healthcare professionals, who determine how and to what extent people with VISPIMD participate. To optimize this support, they must have a good understanding of what people with these disabilities can still do with their remaining vision.

It is currently difficult to gain insight into the visual abilities of people with disabilities, especially those with VISPIMD. As a professional put it, “Everything we can think of or develop to assess the functional vision of this vulnerable group will help improve our understanding and thus our ability to support them. Now, we are more or less guessing about what they can see. Moreover, what little we know about their vision is hard to communicate to other professionals”. Therefore, there is a need for methods that can provide insight into the functional vision of people with VISPIMD, in order to predict their options in daily-life situations. This knowledge is crucial to ensure that these people can participate in society to the fullest extent.

What makes it so difficult to get this insight at the moment? Visual impairments can be caused by a range of eye or brain disorders and can manifest in various ways. While we understand fairly well how low vision affects a person's abilities on relatively simple visual tasks, it is much more difficult to predict this in more complex, dynamic everyday situations, such as finding one's way or moving around during daily activities. This is because, among other things, conventional ophthalmic tests provide little information about what people can do with their remaining vision in everyday life (i.e., their functional vision).

An additional problem in assessing vision in people with intellectual disabilities is that many conventional tests are difficult to perform or are too fatiguing, resulting in either no information or the wrong information. On top of the visual impairment there is a very serious intellectual disability (possibly combined with a motor impairment), which makes it even more complex to assess functional vision. Due to the interplay between their visual, intellectual, and motor disabilities, it is almost impossible to determine whether persons are unable to perform an activity because they do not see it, do not notice it, do not understand it, cannot communicate about it, or cannot move their head towards the stimulus due to motor disabilities.

Although an expert professional can make a reasonable estimate of the functional possibilities through long-term and careful observation, the time and the right measurement data to obtain the required information are usually lacking. So far, it is insufficiently clear what draws the visual attention of people with VISPIMD and what exactly they see.

Our goal with this project is to improve the understanding of the visual capabilities of people with VISPIMD, which in turn makes it possible to improve support for the participation of the target group. We want to achieve this goal by developing and, in pilot form, testing a new combination of measurement and analysis methods, primarily based on eye-movement registration, to determine the functional vision of people with VISPIMD.
Our goal is to systematically determine what someone is responding to (“what”), where it may be (“where”), and how much time that response takes (“when”). When developing methods, we take the possibilities and preferences of the person in question as the starting point in relation to the technological possibilities. Because existing technological methods were originally developed for a different purpose, this partly requires adapting them to the possibilities of the target group.

The concrete end product of our pilot will be a manual with an overview of available technological methods (as well as the methods themselves) for assessing functional vision, linked to the specific characteristics of the target group in the cognitive and motor domains: “Given that a client has this (estimated) combination of limitations (cognitive, motor, and attention, i.e. the time in which someone can concentrate), the order of assessments is as follows,” followed by a description of the methods. We will also report our findings in a workshop for professionals, a Dutch-language article, and at least two scientific articles.

This project is carried out within the line “I am seen; with all my strengths and limitations”. During the project, we closely collaborate with relevant stakeholders, i.e. professionals with specific expertise in working with the target group, family members of persons with VISPIMD, and persons experiencing a visual impairment (‘experience experts’).
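As a purely illustrative sketch of the kind of eye-movement analysis this project envisages (not the project's own tooling), the Python code below detects fixations with a simple dispersion-based method and reports when the gaze first lands inside a stimulus region, i.e. the “what/where/when” questions above. The dispersion and duration thresholds and the region format are hypothetical.

import numpy as np

def detect_fixations(t, x, y, max_dispersion=30.0, min_duration=0.1):
    """Return (t_start, t_end, cx, cy) fixations from gaze samples (pixels, seconds)."""
    t, x, y = map(np.asarray, (t, x, y))
    fixations, start, n = [], 0, len(t)
    while start < n:
        end = start
        # Grow the window while its spatial dispersion stays below the threshold.
        while end + 1 < n and (np.ptp(x[start:end + 2]) + np.ptp(y[start:end + 2])) <= max_dispersion:
            end += 1
        if t[end] - t[start] >= min_duration:
            fixations.append((t[start], t[end],
                              float(np.mean(x[start:end + 1])),
                              float(np.mean(y[start:end + 1]))))
            start = end + 1
        else:
            start += 1
    return fixations

def first_look_at(fixations, stimulus_onset, region):
    """Latency of the first fixation inside region = (x0, y0, x1, y1) after stimulus onset."""
    x0, y0, x1, y1 = region
    for t_start, _, cx, cy in fixations:
        if t_start >= stimulus_onset and x0 <= cx <= x1 and y0 <= cy <= y1:
            return t_start - stimulus_onset   # "when": response latency in seconds
    return None                               # stimulus apparently not looked at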
The Netherlands has approximately 220,000 occupational accidents per year (including 60 fatalities). Every employer is therefore obliged to organize company emergency response (BHV), including emergency response training. Despite this, only one third of all companies map their occupational risks via a Risk Inventory & Evaluation (RI&E), and the share of employees involved in an occupational accident remains high. There is therefore continuous innovation to optimize emergency response training, for example by means of Virtual Reality (VR). VR is not new, but it has matured and become more affordable. VR makes it possible to develop safe, realistic emergency simulations in which trainees feel as if they are really there. Despite the increase in VR-based BHV training, little research has been done on the effect of VR in emergency response training, and results are contradictory. In addition, new technological developments make it possible to measure viewing behavior in VR using Eye-Tracking. During an emergency response training, Eye-Tracking can be used to measure how an instruction is followed and whether trainees are distracted or notice important elements (dangers and solutions) during the simulation. However, no emergency response training yet combines VR with Eye-Tracking (interactions). In this project, a prototype is developed in which Eye-Tracking is incorporated into a VR-BHV training developed in 2021, in which emergency situations such as an office fire are simulated (the BHVR application). The prototype will be tested in an experiment in order to partly answer the question of to what extent and in what way Eye-Tracking in VR adds value to (RI&E) emergency response training. The project thereby aligns with the mission-driven innovation policy 'De Veiligheidsprofessional' (The Safety Professional) and helps SMEs, which often lack the resources and knowledge for research into the effectiveness of innovative technologies in education and training. The project will deliver, among other things, a prototype, a production report, and a research article, and is open to new participants for writing a larger grant application on the use and effect of VR and Eye-Tracking in emergency response training.
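To illustrate the kind of gaze measure described above (this is not the BHVR prototype itself), the sketch below accumulates, per hypothetical area of interest such as the fire or the extinguisher, how long a trainee's gaze ray points at it. The AOI positions, sphere radii, sample format, and 90 Hz sample rate are assumptions.

import numpy as np

AOIS = {  # hypothetical areas of interest: centre (x, y, z) and radius, in metres
    "fire": (np.array([2.0, 1.0, 5.0]), 0.5),
    "extinguisher": (np.array([-1.5, 1.2, 3.0]), 0.3),
    "exit_sign": (np.array([0.0, 2.4, 8.0]), 0.4),
}

def gaze_hits_sphere(origin, direction, centre, radius):
    """True if the gaze ray (origin + t*direction, t >= 0) passes through the sphere."""
    d = direction / np.linalg.norm(direction)
    t = np.dot(centre - origin, d)        # distance of closest approach along the ray
    if t < 0:
        return False                      # AOI is behind the viewer
    closest = origin + t * d
    return np.linalg.norm(centre - closest) <= radius

def dwell_times(samples, dt=1.0 / 90.0):
    """samples: iterable of (gaze_origin, gaze_direction); dt: assumed headset sample period (s)."""
    totals = {name: 0.0 for name in AOIS}
    for origin, direction in samples:
        origin = np.asarray(origin, float)
        direction = np.asarray(direction, float)
        for name, (centre, radius) in AOIS.items():
            if gaze_hits_sphere(origin, direction, centre, radius):
                totals[name] += dt
    return totals

Such dwell times (and the time until an AOI is first looked at) are examples of the measures the experiment could compare between trainees.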