"We must know whether we want to change the world to experience it with the same sensorial system like the one we already possess, or whether we’d rather modify our body, the somatic filter through which it passes."
The decomposition of a body is influenced by burial conditions, so understanding the impact of different conditions is crucial for accurate grave detection. Drone-mounted geophysical techniques have gained popularity in locating clandestine graves, offering non-invasive methods for detecting surface and subsurface irregularities. Ground-penetrating radar (GPR) is an effective technology for identifying potential grave locations without disturbing the site. This research aimed to prototype a drone system integrating GPR to assist in grave localization and to develop software for data management. Initial experiments compared GPR with other technologies and demonstrated its applicability: it is suitable for various decomposition stages and soil types, although certain soil compositions limit its performance. The research used the DJI M600 Pro drone and a drone-based GPR system enhanced by a real-time kinematic (RTK) global positioning system (GPS) for precision and autonomy. Tests with simulated graves and cadavers validated the system’s performance, evaluating optimal altitude, speed, and obstacle avoidance techniques. Furthermore, global and local planning algorithms ensured efficient and obstacle-free flight paths. The results highlighted the potential of the drone-based GPR system for locating clandestine graves while minimizing disturbance, contributing to the development of effective tools for forensic investigations and crime scene analysis.
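The abstract mentions global planning algorithms that generate efficient, full-coverage flight paths for the GPR survey. A common baseline for this kind of global planning is a boustrophedon ("lawnmower") sweep over a rectangular search area; the sketch below is an illustrative example of that pattern, not the project's actual planner, and all parameter names are assumptions.

```python
def lawnmower_path(width_m, height_m, line_spacing_m):
    """Generate (x, y) waypoints sweeping a width x height area
    in parallel passes separated by line_spacing_m.

    Each consecutive pair of waypoints forms one GPR survey pass;
    the GPR line spacing is typically chosen from the expected
    target size and the radar footprint at the flight altitude.
    """
    waypoints = []
    x = 0.0
    going_up = True
    while x <= width_m:
        if going_up:
            waypoints.append((x, 0.0))
            waypoints.append((x, height_m))
        else:
            waypoints.append((x, height_m))
            waypoints.append((x, 0.0))
        going_up = not going_up
        x += line_spacing_m
    return waypoints

# A 20 m x 10 m search area surveyed with 5 m line spacing:
path = lawnmower_path(20.0, 10.0, 5.0)
```

In a real system these waypoints would be expressed in RTK-GPS coordinates and passed to the flight controller, with a local planner adjusting the path around obstacles detected in flight.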
In this project, the AGM R&D team developed and refined the use of a facial scanning rig. The rig is a physical device comprising multiple cameras and lights mounted on scaffolding around a 'scanning volume': an area in which objects are placed before being photographed from multiple angles. The object is typically a person's head, but it can be anything of this approximate size. Software compares the photographs to create a digital 3D recreation, a process called photogrammetry. The 3D model is then processed by further pieces of software and eventually becomes a face that can be animated inside Unreal Engine, a popular piece of game development software made by the company Epic. This project was funded by Epic's 'Megagrant' system, and the focus of the work is on streamlining and automating the processing pipeline and on improving the quality of the resulting output. Additional work has been done on skin shaders (simulating the quality of real skin in a digital form) and on the use of AI to re/create lifelike hair styles. The R&D work has produced significant savings in processing time, improved the quality of facial scans, produced a system that has benefitted the educational offering of BUas, and attracted collaborators from the commercial entertainment and simulation industries. This work complements and extends previous work done on the VIBE project, where the focus was on creating lifelike human avatars for the medical industry.
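The summary describes streamlining and automating a processing pipeline that chains several pieces of software (photogrammetry, model processing, import into Unreal Engine). A minimal sketch of such orchestration is shown below; the stage names and commands are hypothetical placeholders, not the team's actual tools, and the structure simply illustrates running external tools in sequence and stopping on the first failure.

```python
import subprocess

# Hypothetical pipeline stages; each command here is a placeholder
# standing in for a real external tool's command line.
PIPELINE = [
    ("photogrammetry", ["echo", "reconstructing 3D mesh from photos"]),
    ("processing",     ["echo", "cleaning and texturing the mesh"]),
    ("engine_export",  ["echo", "packaging the model for Unreal Engine"]),
]

def run_pipeline(stages):
    """Run each stage in order, stopping at the first failure.

    Returns the list of stage names that completed successfully,
    so a partially failed run can be resumed or inspected.
    """
    completed = []
    for name, cmd in stages:
        result = subprocess.run(cmd, capture_output=True, text=True)
        if result.returncode != 0:
            break
        completed.append(name)
    return completed

done = run_pipeline(PIPELINE)
```

Automating the hand-offs between tools in this way is one plausible route to the processing-time savings the summary reports, since each scan no longer needs manual shepherding between applications.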