The Netherlands’ probation service has been using deeply flawed algorithms for years to assess the risk of re-offending, leading to potentially inaccurate recommendations to judges regarding sentencing and release. A scathing report from the Justice and Security Inspectorate revealed widespread errors in the systems, prompting immediate action from the service and raising serious questions about the reliance on algorithmic decision-making within the criminal justice system.

Widespread Errors and Outdated Data

The inspection uncovered that approximately one in five risk assessments generated by these algorithms contained inaccuracies. The algorithms, including one called Oxrec, were found to be significantly below acceptable standards for government use. Critical formulas within Oxrec were improperly configured, neglecting crucial factors like drug dependency and severe mental health conditions.

Perhaps most concerning, the data used to train the Oxrec algorithm originated from Swedish detainees. Experts question the applicability of data from one population to another, suggesting the models were fundamentally unsuited for predicting recidivism rates within the Dutch context. According to a report by the AlgorithmWatch organization in November 2025, such data mismatches are a common source of bias in algorithmic risk assessments globally.

Accountability and Response

Jessica Westerik, Director of the Dutch Probation Service, acknowledged the severity of the findings, describing them as “very confrontational.” She stated her acceptance of responsibility and affirmed the organization is actively addressing the shortcomings. “We are very concerned about this and I want to take responsibility for this,” Westerik said. The probation service has temporarily suspended the use of the problematic models while it conducts a thorough review of its risk assessment practices.

A Summary of Key Findings

Algorithm accuracy: approximately 20% of assessments contained errors.
Data source: relied on outdated data from Swedish detainees.
Critical omissions: failed to adequately account for drug dependency and mental health.
Algorithmic standards: the Oxrec algorithm did not meet governmental standards.

Concerns Over Discrimination

The issues with the Oxrec algorithm are not new. Warnings about potential discriminatory elements were raised as early as 2020, with concerns that incorporating factors like zip code and income could lead to ethnic profiling. Despite these warnings, the algorithm continued to be used without sufficient scrutiny. The Netherlands Institute for Human Rights similarly cautioned against using these characteristics without robust justification in 2021.

Sven Stevenson of the Dutch Data Protection Authority described the report as “one of the most painful” his organization has seen, emphasizing the need for public trust in the fairness and impartiality of these systems. The potential for biased outcomes raises fundamental questions about equity and justice within the legal framework.

The Broader Implications of Algorithmic Risk Assessment

This case highlights the growing challenges associated with the use of artificial intelligence and algorithms in high-stakes decision-making. Experts like Cynthia Liem, associate professor at TU Delft, caution against an overreliance on computer models. “AI and algorithms make it very attractive for us to think less, question less and simply accept what is offered to us,” she explained. It’s a trend observed across various sectors, where the perceived objectivity of algorithms can overshadow critical human oversight.

The Dutch probation service’s experience serves as a cautionary tale for other jurisdictions considering or already implementing similar systems. It underscores the importance of rigorous testing, ongoing monitoring, and a commitment to transparency and accountability in the age of algorithmic governance.

What measures should be taken to ensure algorithmic fairness in criminal justice?

How can we balance the efficiency of AI-driven tools with the need for human oversight and ethical considerations?

This is a developing story. Check back for updates.