This paper presents a methodology we developed for analysing a multi-modal transport network that is yet to be implemented. The methodology is based on a simulation approach whose characteristics make it challenging to verify and validate. To overcome this limitation, we propose a novel methodology that combines interaction with subject-matter experts, revision of current data, collection and assessment of expected future performance, and educated assumptions. With this methodology we were able to construct the complete door-to-door passenger trajectory in Europe. The results indicate that the approach supports infrastructure analysis at an early stage, providing an initial estimate of the upper boundary of the performance indicators. To exemplify this, we present the results of a case study in Europe.
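To illustrate the kind of early-stage, upper-boundary estimate described above, the following is a minimal Monte Carlo sketch of a door-to-door trip assembled from multimodal legs. The leg names and triangular duration parameters are purely illustrative assumptions, not values from the study; the point is only how an upper-percentile KPI could be derived from assumed leg distributions.

```python
import random
import statistics

# Illustrative door-to-door legs of a multimodal trip. The (min, mode, max)
# durations in minutes are placeholder assumptions, not data from the paper.
LEGS = {
    "access_to_hub":  (15, 25, 60),
    "hub_processing": (20, 35, 90),
    "main_leg":       (60, 90, 150),
    "egress":         (10, 20, 45),
}

def sample_trip(legs=LEGS):
    """Draw one door-to-door travel time (minutes) from the leg distributions."""
    return sum(random.triangular(lo, hi, mode) for lo, mode, hi in legs.values())

def upper_bound_kpi(n_runs=10_000, percentile=0.95):
    """Monte Carlo estimate of an upper-boundary travel-time KPI."""
    samples = sorted(sample_trip() for _ in range(n_runs))
    return {
        "mean_min": statistics.mean(samples),
        "p95_min": samples[int(percentile * n_runs) - 1],
    }

if __name__ == "__main__":
    print(upper_bound_kpi())
```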
The real-time simulation of human crowds has many applications. In a typical crowd simulation, each person ('agent') in the crowd moves towards a goal while adhering to local constraints. Many algorithms exist for specific local 'steering' tasks such as collision avoidance or group behavior. However, these do not easily extend to completely new types of behavior, such as circling around another agent or hiding behind an obstacle. They also tend to focus purely on an agent's velocity without explicitly controlling its orientation. This paper presents a novel sketch-based method for modelling and simulating many steering behaviors for agents in a crowd. Central to this is the concept of an interaction field (IF): a vector field that describes the velocities or orientations that agents should use around a given 'source' agent or obstacle. An IF can also change dynamically according to parameters, such as the walking speed of the source agent. IFs can be easily combined with other aspects of crowd simulation, such as collision avoidance. Using an implementation of IFs in a real-time crowd simulation framework, we demonstrate the capabilities of IFs in various scenarios. This includes game-like scenarios where the crowd responds to a user-controlled avatar. We also present an interactive tool that computes an IF based on input sketches. This IF editor lets users intuitively and quickly design new types of behavior, without the need for programming extra behavioral rules. We thoroughly evaluate the efficacy of the IF editor through a user study, which demonstrates that our method enables non-expert users to easily enrich any agent-based crowd simulation with new agent interactions.
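As a rough illustration of the interaction-field idea, here is a minimal sketch of a hand-written tangential ("circling") field around a source agent, blended with a separately computed collision-avoidance velocity. The field shape, falloff radius and blending weight are assumptions for demonstration only; they do not reproduce the paper's actual IF representation or sketch-based editor.

```python
import numpy as np

def interaction_field_velocity(agent_pos, source_pos, speed=1.0):
    """
    Toy interaction field: a tangential (circling) field around a source agent.
    Returns the preferred velocity the field prescribes at agent_pos.
    Illustrative stand-in for a designed IF, not the paper's representation.
    """
    offset = np.asarray(agent_pos, dtype=float) - np.asarray(source_pos, dtype=float)
    dist = np.linalg.norm(offset)
    if dist < 1e-6:
        return np.zeros(2)
    # Rotate the radial direction by 90 degrees -> circling behavior.
    tangent = np.array([-offset[1], offset[0]]) / dist
    # Let the field fade out with distance from the source (10 m assumed radius).
    falloff = max(0.0, 1.0 - dist / 10.0)
    return speed * falloff * tangent

def combined_velocity(if_velocity, avoidance_velocity, w_if=0.7):
    """Blend the IF's preferred velocity with a collision-avoidance velocity."""
    return w_if * np.asarray(if_velocity) + (1.0 - w_if) * np.asarray(avoidance_velocity)

# Example query: an agent 3 m to the right of the source, avoidance pushing it forward.
print(combined_velocity(interaction_field_velocity([3.0, 0.0], [0.0, 0.0]), [0.0, 1.0]))
```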
The Interoceanic Corridor of Mexico is a pivotal infrastructure project poised to significantly enhance Mexico's national and regional economy. Anticipated to start operations in 2025 under the auspices of the national government, the corridor represents a strategic counterpart to the Panama Canal, which faces capacity constraints due to climate change and environmental impacts. Positioned as a promising alternative for transporting goods from Asia to North America, the corridor will offer a new transport route, yet its real operational capacity and spatial impacts remain uncertain. In this paper, the authors undertake a preliminary, informed analysis that leverages publicly available data and specific information about infrastructure capacities and the economic environment to forecast the potential throughput of the corridor upon full operationalization and beyond. Applying simulation techniques, the authors simulate the future operations of the corridor under different scenarios to offer insights into its potential capacity and impacts. Furthermore, the paper examines the opportunities and challenges inherent in the project and gives a comprehensive analysis of its potential impact and implications.
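The following is a minimal sketch of the kind of scenario-based throughput simulation described above: daily demand is sampled and capped by the tightest infrastructure link, then accumulated over a year. Scenario names, capacity figures (TEU/day) and demand parameters are invented placeholders, not data or results from the paper.

```python
import random

# Illustrative scenario parameters (TEU per day); all values are assumptions
# for demonstration, not figures from the study.
SCENARIOS = {
    "conservative": {"port_handling": 3000, "rail_capacity": 2000, "demand_mean": 2500},
    "expansion":    {"port_handling": 6000, "rail_capacity": 5000, "demand_mean": 5500},
}

def simulate_throughput(scenario, days=365, demand_cv=0.2, seed=42):
    """Annual corridor throughput: daily demand capped by the tightest link."""
    rng = random.Random(seed)
    p = SCENARIOS[scenario]
    bottleneck = min(p["port_handling"], p["rail_capacity"])
    total = 0.0
    for _ in range(days):
        demand = max(0.0, rng.gauss(p["demand_mean"], demand_cv * p["demand_mean"]))
        total += min(demand, bottleneck)
    return total  # TEU per year

for name in SCENARIOS:
    print(name, f"{simulate_throughput(name):,.0f} TEU/year")
```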
The IMPULS-2020 project DIGIREAL (BUas, 2021) aims to significantly strengthen BUas' Research and Development (R&D) on Digital Realities for the benefit of innovation in our sectoral industries. The project will furthermore help BUas position itself in the emerging innovation ecosystems on Human Interaction, AI and Interactive Technologies. The pandemic has had a tremendous negative impact on the industrial sectors in which BUas conducts research: Tourism, Leisure and Events, Hospitality and Facility, Built Environment and Logistics. Our partner industries are in great need of innovative responses to the crisis. Data and AI, combined with Interactive and Immersive Technologies (Games, VR/AR), can provide a partial solution, in line with the key enabling technologies of the Smart Industry agenda. DIGIREAL builds upon our well-established expertise and capacity in entertainment and serious games and digital media (VR/AR). It furthermore strengthens our initial plans to venture into Data and Applied AI. Digital Realities offer great opportunities for sectoral industry research and innovation, such as experience measurement in Leisure and Hospitality, data-driven decision-making for (sustainable) tourism, geo-data simulations for Logistics, and Digital Twins for Spatial Planning. Although BUas already has successful R&D projects in these areas, the synergy can and should be significantly improved. We propose a coherent one-year IMPULS-funded package to develop (in 2021): 1. A multi-year R&D program on Digital Realities, which leads to 2. Strategic R&D proposals, in particular a SPRONG/sleuteltechnologie proposal; 3. Partnerships in the regional and national innovation ecosystem, in particular Mind Labs and Data Development Lab (DDL); 4. A shared Digital Realities Lab infrastructure, in particular hardware/software/peopleware for Augmented and Mixed Reality; and 5. Leadership, support and operational capacity to achieve and support the above. The proposal presents a work program and management structure, with external partners in an advisory role.
In this project, the AGM R&D team developed and refined the use of a facial scanning rig. The rig is a physical device comprising multiple cameras and lights mounted on scaffolding around a 'scanning volume': an area in which objects are placed before being photographed from multiple angles. The object is typically a person's head, but it can be anything of approximately this size. Software compares the photographs to create a digital 3D reconstruction; this process is called photogrammetry. The 3D model is then processed by further pieces of software and eventually becomes a face that can be animated inside Unreal Engine, a popular piece of game development software made by the company Epic. The project was funded by Epic's 'Megagrant' system, and the focus of the work was on streamlining and automating the processing pipeline and on improving the quality of the resulting output. Additional work has been done on skin shaders (simulating the appearance of real skin in digital form) and on the use of AI to re/create lifelike hair styles. The R&D work has significantly reduced processing time and improved the quality of facial scans, has produced a system that has benefitted the educational offering of BUas, and has attracted collaborators from the commercial entertainment and simulation industries. This work complements and extends previous work done on the VIBE project, where the focus was on creating lifelike human avatars for the medical industry.
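To show what "streamlining and automating the processing pipeline" could look like in its simplest form, here is a hedged sketch of a stage runner that chains the reconstruction, cleanup and export steps. The tool names (photogrammetry_tool, mesh_cleanup_tool, unreal_export_tool) and their flags are hypothetical placeholders; the project's actual software and parameters are not specified in the text.

```python
from pathlib import Path
import subprocess

# Hypothetical stage commands standing in for whatever photogrammetry,
# cleanup and export tools the rig's real pipeline uses.
STAGES = [
    ("reconstruct", ["photogrammetry_tool", "--images", "{shoot}", "--out", "{work}/mesh.obj"]),
    ("cleanup",     ["mesh_cleanup_tool", "{work}/mesh.obj", "--out", "{work}/head_clean.obj"]),
    ("export",      ["unreal_export_tool", "{work}/head_clean.obj", "--out", "{work}/head.uasset"]),
]

def run_pipeline(shoot_dir: str, work_dir: str) -> None:
    """Run each stage in order, stopping on the first failure."""
    Path(work_dir).mkdir(parents=True, exist_ok=True)
    for name, cmd in STAGES:
        resolved = [part.format(shoot=shoot_dir, work=work_dir) for part in cmd]
        print(f"[{name}] {' '.join(resolved)}")
        subprocess.run(resolved, check=True)

# Example call (commented out because the placeholder tools do not exist):
# run_pipeline("scans/session_001", "build/session_001")
```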
Automated driving has nowadays become a reality with the help of in-vehicle ADAS (advanced driver assistance) systems. More and more such systems are being developed by OEMs and service providers. These (partly) automated systems are intended to enhance road and traffic safety (among other benefits) by addressing human limitations such as fatigue, low vigilance/distraction, reaction time, low behavioral adaptation, etc. In other words, (partly) automated driving should relieve the driver of one or more primary driving tasks, making the ride enjoyable, safer and more relaxing. The present in-vehicle systems, on the contrary, require continuous vigilance/alertness and behavioral adaptation from human drivers, and may also subject them to frequent in-and-out-of-the-loop situations and warnings. The tip of the iceberg is the robotic behavior of these in-vehicle systems, which, unlike human driving behavior, does not adapt to road, traffic, other users, laws, weather, etc. Furthermore, no two human drivers are the same and thus do not share the same driving styles and preferences. So how can a single design of an in-vehicle system's robotic behavior suit all human drivers? To emphasize the need for HUBRIS, this project proposes quantifying the behavioral difference between human drivers and two in-vehicle systems through naturalistic driving in highway conditions, and subsequently formulating preliminary design guidelines using the quantified behavioral-difference matrix. Partners are V-tron, a service provider and potential developer of in-vehicle systems; Smits Opleidingen, a driving school keen on providing state-of-the-art education and training; Dutch Autonomous Mobility (DAM) B.V., a company active in operations, testing and assessment of self-driving vehicles in the Groningen province; Goudappel Coffeng, consultants in mobility and experts in traffic psychology; and Siemens Industry Software and Services B.V. (Siemens), developers of traffic simulation environments for testing in-vehicle systems.
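As a minimal sketch of what a "behavioral-difference matrix" could look like, the code below compares simple distribution statistics of driving indicators (time headway, lateral acceleration, speed) between human-driven and system-driven samples. The indicator names, the chosen statistics and the synthetic example data are assumptions for illustration; they are not the project's actual metrics or naturalistic data.

```python
import numpy as np

INDICATORS = ["time_headway_s", "lat_accel_ms2", "speed_kmh"]

def behavioral_difference(human: dict, system: dict, indicators=INDICATORS):
    """
    Build a simple behavioral-difference matrix: for each indicator, the
    difference in mean (normalised by the human standard deviation) and the
    ratio of standard deviations between human and in-vehicle-system samples.
    """
    rows = {}
    for key in indicators:
        h, s = np.asarray(human[key]), np.asarray(system[key])
        scale = np.std(h) if np.std(h) > 0 else 1.0
        rows[key] = {
            "mean_diff": (np.mean(s) - np.mean(h)) / scale,
            "std_ratio": np.std(s) / scale,
        }
    return rows

# Synthetic samples standing in for naturalistic highway data (illustrative only).
rng = np.random.default_rng(0)
human = {k: rng.normal(loc, 0.5, 500) for k, loc in zip(INDICATORS, (1.8, 0.6, 110))}
adas  = {k: rng.normal(loc, 0.2, 500) for k, loc in zip(INDICATORS, (2.2, 0.3, 105))}
print(behavioral_difference(human, adas))
```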