The HCR-20V3 is a violence risk assessment tool that is widely used in forensic clinical practice for risk management planning. The predictive value of the tool when used in court for legal decision-making has not yet been studied intensively, and questions about legal admissibility may arise. This article aims to provide legal and mental health practitioners with an overview of the strengths and weaknesses of the HCR-20V3 when applied in legal settings. The HCR-20V3 is described and discussed with respect to its psychometric properties across different groups and settings. Issues involving legal admissibility and potential biases when conducting violence risk assessments with the HCR-20V3 are outlined. To explore legal admissibility challenges with respect to the HCR-20V3, we searched case law databases since 2013 from Australia, Canada, Ireland, the Netherlands, New Zealand, the UK, and the USA. In total, we found 546 cases referring to the HCR-20/HCR-20V3. In these cases, the tool was rarely challenged (4.03%), and when it was challenged, the challenge never resulted in a court decision that the risk assessment was inadmissible. Finally, we provide recommendations for legal practitioners on the cross-examination of risk assessments, as well as recommendations for mental health professionals who conduct risk assessments and report to the court. We conclude with suggestions for future research with the HCR-20V3 to strengthen the evidence base for use of the instrument in legal contexts.
Accurate and reliable decision-making in the criminal justice system depends on accurate expert reporting and on the correct interpretation of evidence by judges, prosecutors, and defense lawyers. The present study aims to gain insight into the judiciary's capability to assess the accuracy and reliability of forensic expert reports, first by examining the extent to which criminal justice professionals are able to differentiate between an accurate (sound) expert report and an inaccurate (unsound) expert report. In an online questionnaire, 133 participants assessed both a sound and an unsound expert report. The findings show that, on average, participants did not significantly differentiate between the sound and the unsound forensic expert report. Second, the study explored the influence of institutional authority on the evaluation of forensic expert reports. Reports that were not recognized as flawed, particularly those originating from well-known and reputable institutions, were subjected to less critical examination, increasing the risk of evaluation errors. These results suggest that perceived institutional authority influences the assessment of forensic evidence. The study highlights the need for tools to support criminal justice professionals in evaluating forensic evidence, particularly when experts are unregistered. Recommendations include adhering to established quality standards, consulting counter-expert evaluations, improving courtroom communication, and enhancing forensic knowledge through training. Overall, the findings underscore the importance of critical evidence evaluation to reduce the risk of misinterpretation and wrongful convictions in the judicial process.