Most violence risk assessment tools have been validated predominantly in males. In this multicenter study, the Historical, Clinical, Risk Management–20 (HCR-20), Historical, Clinical, Risk Management–20 Version 3 (HCR-20V3), Female Additional Manual (FAM), Short-Term Assessment of Risk and Treatability (START), Structured Assessment of Protective Factors for violence risk (SAPROF), and Psychopathy Checklist–Revised (PCL-R) were coded from the file information of 78 female forensic psychiatric patients discharged between 1993 and 2012 from one of four Dutch forensic psychiatric hospitals, with a mean follow-up period of 11.8 years. Notably, the rates of mortality (17.9%) and of readmission to psychiatric settings (11.5%) after discharge were high. Official reconviction data could be retrieved from the Ministry of Justice and Security for 71 women. Twenty-four women (33.8%) were reconvicted after discharge, including 13 (18.3%) for violent offenses. Overall, predictive validity was moderate for all types of recidivism, but low for violence. The START Vulnerability scores, the HCR-20V3, and the FAM showed the highest predictive accuracy for all recidivism. With respect to violent recidivism, only the START Vulnerability scores and the Clinical scale of the HCR-20V3 demonstrated significant predictive accuracy.
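The abstract does not state the accuracy metric, but predictive validity in risk assessment research is conventionally reported as the area under the ROC curve (AUC). The sketch below is a generic illustration of that metric with hypothetical scores and outcomes, not data from this study:

```python
import numpy as np

def roc_auc(scores, recidivated):
    """AUC via the Mann-Whitney U statistic: the probability that a
    randomly drawn recidivist scores higher on the instrument than a
    randomly drawn non-recidivist (ties count half)."""
    scores = np.asarray(scores, dtype=float)
    recid = np.asarray(recidivated, dtype=bool)
    pos, neg = scores[recid], scores[~recid]
    wins = (pos[:, None] > neg[None, :]).sum()
    ties = (pos[:, None] == neg[None, :]).sum()
    return (wins + 0.5 * ties) / (pos.size * neg.size)

# Hypothetical instrument total scores and reconviction outcomes:
scores = [28, 25, 31, 14, 19, 22, 12, 17]
recid  = [ 1,  1,  0,  0,  1,  0,  0,  0]
print(round(roc_auc(scores, recid), 2))  # 0.73
```

By the usual rules of thumb, AUC values around 0.70 are "moderate" and values near chance level (0.50) are "low", which is the sense in which the abstract grades the instruments.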
Understanding the factors that may impact the transfer, persistence, prevalence and recovery of DNA (DNA-TPPR), and the availability of data to assign probabilities to DNA quantities and profile types being obtained given particular scenarios and circumstances, is paramount when performing, and giving guidance on, evaluations of DNA findings given activity level propositions (activity level evaluations). In late 2018 and early 2019, three major reviews were published on aspects of DNA-TPPR, each advocating further research and other actions to support the conduct of DNA-related activity level evaluations. Here, we look at how these challenges are being met, primarily by providing a synopsis of DNA-TPPR-related articles published since those reviews and by briefly exploring some of the actions taken by industry stakeholders to address the identified gaps. Much has been done in recent years, and efforts continue, to meet these challenges and to improve the capacity of forensic experts to provide the guidance sought by the judiciary with respect to the transfer of DNA.
Lateral flow assays (LFAs) have been 'common use' devices in daily life for several decades. LFAs have also been developed for forensic use, such as for the analysis of illicit drugs and DNA, and for the detection of explosives and body fluid identification. Despite their advantages, including ease of use, LFAs are not yet frequently applied at crime scenes. This review describes (academic) developments of LFAs for forensic applications, focusing on biological and chemical applications, and summarizes the main advantages and disadvantages of LFAs for the different forensic applications. Additionally, a critical discussion is provided of why LFAs are not frequently applied within the forensic field, highlighting the steps needed to bring LFAs to the forensic market.
Every year the police are confronted with an ever-increasing number of complex cases involving missing persons. About 100 people are reported missing every year in the Netherlands, of whom an unknown number become victims of crime and are presumed buried in clandestine graves. Similarly, according to the NVWA, dead animals are often buried illegally in clandestine graves on farmland, which may spread diseases with significant consequences for other animals and for humans in general. Forensic investigators from both the national police (NP) and the NVWA are often confronted with a dilemma: speed versus carefulness and precision. However, the current forensic investigation process for identifying and localizing clandestine graves is labor-intensive and time-consuming, and employs classical techniques, such as walking sticks and dogs (police), which are not effective. There is therefore an urgent request from forensic investigators for a new method to detect and localize clandestine graves quickly, efficiently and effectively. In this project, together with practitioners, knowledge institutes, SMEs and field labs, practical research will be carried out to devise a new forensic investigation process for identifying clandestine graves using an autonomous Crime Scene Investigative (CSI) drone. The new work process will exploit the newly adopted EU-wide drone regulation, which relaxes a number of previously imposed flight restrictions. Moreover, it will optimize the available drone and perception technologies to achieve the desired functionality, performance and operational safety in detecting and localizing clandestine graves autonomously. The proposed method will be demonstrated and validated in practical operational environments. This project will also make a demonstrable contribution to the renewal of higher professional education.
The police and the NVWA will be equipped with the operating procedures, legislative knowledge, skills and technological expertise needed to effectively and efficiently perform their forensic investigations.
The project aim is to improve the collusion resistance of real-world content delivery systems. The research will address the following topics:
• Dynamic tracing. Improve the Laarhoven et al. dynamic tracing constructions [1,2] [A11,A19]. Modify the tally-based decoder [A1,A3] to make use of dynamic side information.
• Defense against multi-channel attacks. Colluders can easily spread the usage of their content access keys over multiple channels, making tracing more difficult. These attack scenarios have hardly been studied. Our aim is to reach the same level of understanding as in the single-channel case, i.e. to know the location of the saddlepoint and to derive good accusation scores. Preferably we want to tackle multi-channel dynamic tracing.
• Watermarking layer. The watermarking layer (how to embed secret information into content) and the coding layer (which symbols to embed) are mostly treated independently. By using soft-decoding techniques and exploiting the "nuts and bolts" of the embedding technique as an extra engineering degree of freedom, it should be possible to improve collusion resistance.
• Machine learning. Finding a score function that works against unknown attacks is difficult. For non-binary decisions no optimal procedure analogous to Neyman-Pearson scoring exists. We want to investigate whether machine learning can yield a reliable way to classify users as attackers or innocent.
• Attacker cost/benefit analysis. For the various use cases (static versus dynamic, single-channel versus multi-channel) we will devise economic models and use them to determine the range of operational parameters in which the attackers have a financial benefit.
For the first three topics we have a fairly accurate idea of how they can be achieved, based on work done in the CREST project, which was headed by the main applicant. Neural networks (NNs) have enjoyed great success in recognizing patterns, particularly convolutional NNs in image recognition. Recurrent NNs ("LSTM networks") are successfully applied in translation tasks. We plan to combine these two approaches, inspired by traditional score functions, to study whether they can lead to improved tracing. An often-overlooked reality is that large-scale piracy runs as a for-profit business. Countermeasures therefore need not be perfect; it suffices to raise the attack cost enough to make piracy unattractive. In the field of collusion resistance this cost analysis has never been performed; even a simple model will be valuable for understanding which countermeasures are effective.
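The accusation scores referred to above can be illustrated with the classic bias-based (Tardos-type) fingerprinting construction using the symmetric score function of Škorić et al. The sketch below is a generic illustration under a majority-vote attack, not the project's tally-based decoder, and its parameter choices (code length, coalition size, bias cutoff) are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(42)
n_users, m, c = 20, 2000, 3  # users, code length, coalition size

# Bias-based fingerprinting: each code position i gets a bias p_i
# (arcsine-distributed, clipped away from 0 and 1), and user j's
# symbol in position i is 1 with probability p_i.
p = np.clip(rng.beta(0.5, 0.5, size=m), 0.01, 0.99)
X = (rng.random((n_users, m)) < p).astype(int)

# A coalition of the first c users mounts a majority-vote attack
# on the symbols they jointly observe.
colluders = np.arange(c)
y = (X[colluders].sum(axis=0) * 2 > c).astype(int)

# Symmetric score function: positions where a user's symbol agrees
# with the pirate copy raise that user's score, weighted so that
# innocent users' scores have zero mean and unit variance per position.
g1 = np.sqrt((1 - p) / p)   # weight where the pirate symbol is 1
g0 = np.sqrt(p / (1 - p))   # weight where the pirate symbol is 0
agree = X == y
reward = np.where(y == 1, g1, g0)
penalty = np.where(y == 1, g0, g1)
scores = np.where(agree, reward, -penalty).sum(axis=1)

# Colluders accumulate systematically higher scores than innocents,
# so thresholding the score yields an accusation rule.
print(scores[colluders].mean() > scores[c:].max())  # True under this seed
```

An innocent user's score is a zero-mean sum of m independent unit-variance terms (standard deviation about 45 here), while each colluder's expected score grows linearly in m, which is what makes a score threshold a usable accusation rule.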