Curated by THEOUTPOST
On Wed, 9 Apr, 4:03 PM UTC
5 Sources
[1]
UK's MoJ testing algorithms to uncover future killers
Department spent two years on a study predicting which criminals will become killers

The UK's justice department has confirmed it is working on developing algorithms to predict which criminals will later become murderers. Internally referred to as the Homicide Prediction Project, it was first uncovered via Freedom of Information (FOI) requests filed by civil liberties group Statewatch. More recently, the study was dubbed "the murder prediction program" by The Guardian, which first reported the story.

Sources told The Register the project is an expansion of existing risk-prediction tools that are already used, for example, to predict the likelihood of criminals reoffending as they approach prison release. They said this is a long-established practice, with the Offender Assessment System (OASys) one example. Introduced in 2001, it uses data to predict reoffending rates, and OASys outcomes inform criminal sentencing and prison block categorizations, with recent research [PDF] stating OASys-linked predictors can be reliable.

Theory or practice?

Critics, however, are concerned that the information used to develop the new "murder prediction" tool includes data on up to half a million people, some of whom are innocent of any crime, and are worried about its potential for bias. And while officials insist this remains a research project only at present, and claim commentary around the discovery is over-sensationalized, Statewatch says its documents refer to the "future operationalization" of the system.

Statewatch went on to say the documents it obtained - an MoJ Data Protection Impact Assessment, an Internal Risk Assessment, and a Data Sharing Agreement with Greater Manchester Police - referred to the MoJ's data science team developing models to understand "the powerful predictors in the data for homicide risk." The Register has asked the MoJ for comment.

Project involves two largest police forces in the country

The project involves bringing new partners and data streams into the fold to fortify the data used by these models. Data from the MoJ, the Home Office, Greater Manchester Police, and London's Metropolitan Police is all allegedly informing new predictors, while West Midlands Police was also approached. According to Statewatch, the types of data the homicide prediction project looks at include those relating to suspects, victims, witnesses, missing people, people for whom there are safeguarding concerns, and other vulnerable individuals.

The MoJ documents stated that health marker data was expected to give "significant predictive power" to the models, with factors like mental health, addiction, self-harm, suicide, vulnerability, and disability all informing homicide predictions. Previous data informing homicide predictions, published in 2023 [PDF], looked at data such as ethnicity to predict offending rates. Data from 2020 [PDF] noted that the majority of UK homicide victims and suspects were white. People of all ethnicities from more deprived areas, as well as Black people specifically, are "significantly over-represented" in the data the Ministry of Justice holds and used for this analysis of homicide, says Statewatch, which adds that data-driven "predictive" models discriminate against racialized communities, "reinforcing the structural discrimination of the criminal justice system."
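To make the actuarial approach behind tools like OASys more concrete: such systems typically combine weighted offender characteristics into a single reoffending-risk probability. The sketch below is a minimal, purely hypothetical illustration of that idea; the feature names, weights, and intercept are invented for demonstration and are not drawn from OASys or any MoJ model.

```python
import math

# Purely illustrative: a toy actuarial risk score in the spirit of
# reoffending predictors like OASys. The predictors and weights below
# are invented and bear no relation to any real model.

WEIGHTS = {
    "prior_convictions": 0.35,        # hypothetical log-odds weights
    "age_at_first_offence": -0.04,
    "years_since_last_offence": -0.20,
}
INTERCEPT = -2.0

def risk_probability(features: dict[str, float]) -> float:
    """Logistic-regression-style score: sigmoid of a weighted sum."""
    z = INTERCEPT + sum(WEIGHTS[name] * value for name, value in features.items())
    return 1.0 / (1.0 + math.exp(-z))

if __name__ == "__main__":
    example = {
        "prior_convictions": 4,
        "age_at_first_offence": 17,
        "years_since_last_offence": 1,
    }
    print(f"Estimated risk: {risk_probability(example):.1%}")  # ~18.5%
```

The point the toy model makes is that the output is only as good as the historical data behind the weights, which is precisely where critics locate the bias risk.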
Greater Manchester Police's internal data protection impact assessment (DPIA) of the project, obtained by Statewatch, shows the algorithms will incorporate police data such as names, dates of birth, genders, ethnicities, and police national computer (PNC) numbers to produce probabilistic matches. The assessment expected between 100,001 and 500,000 records to be processed as part of the project; the data was encrypted at rest and in transit, and multi-factor authentication (MFA) was required to access the relevant systems.

The project was commissioned by Rishi Sunak's government in January 2023, and internal timelines showed the research was due to end in December 2024, at which point the research data would be deleted and the findings presented to stakeholders.

Statewatch researcher Sofia Lyall said the discovery of the Homicide Prediction Project is "chilling and dystopian."

"Time and again, research shows that algorithmic systems for 'predicting' crime are inherently flawed," she said. "Yet the government is pushing ahead with AI systems that will profile people as criminals before they've done anything.

"This latest model, which uses data from our institutionally racist police and Home Office, will reinforce and magnify the structural discrimination underpinning the criminal legal system. Like other systems of its kind, it will code in bias towards racialized and low-income communities. Building automated tools to profile people as violent criminals is deeply wrong, and using such sensitive data on mental health, addiction, and disability is highly intrusive and alarming.

"The Ministry of Justice must immediately halt further development of this murder prediction tool. Instead of throwing money towards developing dodgy and racist AI and algorithms, the government must invest in genuinely supportive welfare services. Making welfare cuts while investing in techno-solutionist 'quick fixes' will only further undermine people's safety and wellbeing."
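For readers unfamiliar with the "probabilistic matches" the DPIA describes, record linkage of this kind typically scores agreement between identifier fields across two databases, weighting distinctive fields (like a PNC number) far more heavily than common ones (like gender). The following is a minimal hypothetical sketch of that technique; the field weights and threshold are invented for illustration, not taken from the GMP or MoJ systems.

```python
# Illustrative probabilistic record matching: agreement on each field
# adds a weight toward an overall match score. Weights and threshold
# are invented for demonstration only.

FIELD_WEIGHTS = {
    "pnc_number": 10.0,     # highly distinctive identifier
    "date_of_birth": 4.0,
    "name": 3.0,
    "gender": 0.5,          # common value, weak evidence
}
MATCH_THRESHOLD = 7.0

def match_score(record_a: dict, record_b: dict) -> float:
    """Sum agreement weights over fields present in both records."""
    score = 0.0
    for field, weight in FIELD_WEIGHTS.items():
        a, b = record_a.get(field), record_b.get(field)
        if a is not None and b is not None and a == b:
            score += weight
    return score

def is_probable_match(record_a: dict, record_b: dict) -> bool:
    return match_score(record_a, record_b) >= MATCH_THRESHOLD

if __name__ == "__main__":
    a = {"name": "J Smith", "date_of_birth": "1990-01-01", "gender": "M"}
    b = {"name": "J Smith", "date_of_birth": "1990-01-01", "gender": "M"}
    print(match_score(a, b), is_probable_match(a, b))  # 7.5 True
```

Production systems refine this with fuzzy name comparison and probabilistically derived weights, but the score-and-threshold structure is the essence of the approach.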
[2]
UK government developing homicide prediction algorithm to identify potential violent offenders
WTF?! There have been several stories over the years about different governments creating crime-predicting algorithms, prompting comparisons to the 2002 movie Minority Report - even though that film involved clairvoyant humans rather than software. The UK government is the latest to come under the spotlight for working on this technology, but officials insist it is only a research project - at least for now.

The UK government's program, originally called the "homicide prediction project," uses algorithms to analyze the information of hundreds of thousands of people, including victims of crime, in the hope of identifying those most likely to commit serious violent offences, writes The Guardian. Civil liberties group Statewatch uncovered the project through the Freedom of Information Act. It claimed that the tool was developed using data from between 100,000 and 500,000 people. Statewatch says the group includes not only those with criminal convictions but also victims of crime, though officials deny this, claiming the project only uses existing data on convicted offenders.

The data includes names, dates of birth, gender, ethnicity, and a number that identifies people on the police national computer. It also covers sensitive information such as mental health, addiction, suicide and vulnerability, self-harm, and disabilities.

"The Ministry of Justice's attempt to build this murder prediction system is the latest chilling and dystopian example of the government's intent to develop so-called crime 'prediction' systems," said Sofia Lyall, a researcher for Statewatch. "Time and again, research shows that algorithmic systems for 'predicting' crime are inherently flawed."

"This latest model, which uses data from our institutionally racist police and Home Office, will reinforce and magnify the structural discrimination underpinning the criminal legal system."

Officials say the program is an extension of existing risk-prediction tools, which are often used to predict the likelihood of a prisoner reoffending as they approach their release date. They added that the project is designed to see whether adding new data sources from police and custody records would improve risk assessment. A Ministry of Justice spokesperson said the project is being conducted for research purposes only.

There is a long history of crime-predicting algorithms that get compared to Minority Report, including South Korea's "Dejaview" - an AI system that analyzes CCTV footage to detect and potentially prevent criminal activity by identifying patterns and signs of impending crimes. In 2022, university researchers said they had developed an algorithm that could predict future crime one week in advance with an accuracy of 90%. Also in 2022, it was reported that China was looking at ways to build profiles of its citizens, from which an automated system could predict potential dissidents or criminals before they had a chance to act.
[3]
AI murder predictor could catch killers before they strike
Artificial intelligence (AI) could be used to predict whether criminals will go on to become murderers. Researchers are using algorithms to analyse data on thousands of criminals in the UK to try to identify those who pose the greatest risk of committing serious violent offences.

The Ministry of Justice (MoJ) believes the work will help boost public safety, but civil liberty campaigners have called it "chilling and dystopian".

The existence of the MoJ research project was discovered by Statewatch, the pressure group, with some of its workings uncovered through documents obtained by freedom of information requests. The scheme was originally called the "homicide prediction project", but its name has since been changed to "sharing data to improve risk assessment".

MoJ officials said only data about people with at least one criminal conviction had been used, despite claims by Statewatch that the personal information of people not convicted of any crime was being used, including records about self-harm and details relating to domestic abuse.

The MoJ said the scheme would "review offender characteristics that increase the risk of committing homicide" and "explore alternative and innovative data science techniques to risk assessment of homicide". The project would "provide evidence towards improving risk assessment of serious crime, and ultimately contribute to protecting the public via better analysis", a spokesman added.

Commissioned by the prime minister's office when Rishi Sunak was in power, the scheme is analysing data about crime committed before 2015, drawn from official sources including the Probation Service and Greater Manchester Police. The types of information processed include names, dates of birth, gender and ethnicity, and the numbers that identify people on the police national computer.

Statewatch's claim that data from innocent people and those who have gone to the police for help will be used is based on a part of the data-sharing agreement between the MoJ and GMP. A section on the "type of personal data to be shared" by police with the Government includes various types of criminal convictions, but also lists the age a person first appeared as a victim, including of domestic violence, and the age a person was when they first had contact with police. Also to be shared - and listed under "special categories of personal data" - are "health markers which are expected to have significant predictive power", such as data relating to mental health, addiction, suicide and vulnerability, and self-harm, as well as disability.

'Chilling and dystopian'

Sofia Lyall, a researcher for Statewatch, said: "The MoJ's attempt to build this murder prediction system is the latest chilling and dystopian example of the government's intent to develop so-called crime 'prediction' systems.

"Time and again, research shows that algorithmic systems for 'predicting' crime are inherently flawed.

"This latest model, which uses data from our institutionally racist police and Home Office, will reinforce and magnify the structural discrimination underpinning the criminal legal system.

"Like other systems of its kind, it will code in bias towards racialised and low-income communities. Building an automated tool to profile people as violent criminals is deeply wrong, and using such sensitive data on mental health, addiction and disability is highly intrusive and alarming."

An MoJ spokesman said: "This project is being conducted for research purposes only. It has been designed using existing data held by HM Prison and Probation Service and police forces on convicted offenders to help us better understand the risk of people on probation going on to commit serious violence. A report will be published in due course."

Officials said the prison and probation services already used risk assessment tools, and this project would determine whether adding new data sources, from police and custody records, would improve risk assessment.
[4]
UK developing algorithmic tool to predict potential killers
In echoes of Minority Report, the British government is working on a "murder prediction" tool aimed at identifying the individuals most likely to become killers, the Guardian reported this week. The project -- originally called the "homicide prediction project" but since renamed "sharing data to improve risk assessment" -- is being run by the U.K.'s Ministry of Justice and uses algorithms and personal data, including from the Probation Service, to make its calculations.

The government said that the project is currently for research purposes only, and will "help us better understand the risk of people on probation going on to commit serious violence." The work was launched under the previous Conservative administration and is continuing under the Labour government, which took office last year.

Civil liberty campaign group Statewatch discovered the project's existence through a Freedom of Information request. Sofia Lyall, a researcher for Statewatch, said: "The Ministry of Justice's attempt to build this murder prediction system is the latest chilling and dystopian example of the government's intent to develop so-called crime 'prediction' systems." She said that the tool will "reinforce and magnify the structural discrimination underpinning the criminal legal system," adding: "Time and again, research shows that algorithmic systems for 'predicting' crime are inherently flawed. Yet the government is pushing ahead with AI systems that will profile people as criminals before they've done anything." Lyall called on the government to "immediately halt further development of this murder prediction tool."

The concept of predicting potential killers is prominently featured in Philip K. Dick's 1956 novella The Minority Report, later adapted into the hit 2002 movie starring Tom Cruise. In that fictional universe, so-called "PreCrime" officers use psychic mutants ("precogs") to arrest individuals before they commit murders - an early exploration of predictive policing, though one that relies on precognition rather than algorithms.

Back in the real world, predictive policing is known to be used by a number of police departments in the U.S., though its adoption faces growing scrutiny and regulatory challenges.
[5]
UK Government Is Secretly Building 'Murder Prediction' AI System - Decrypt
The UK Ministry of Justice has been quietly developing an AI system that feels ripped straight from the sci-fi thriller "Minority Report" -- a program designed to predict who might commit murder before they've done anything wrong.

According to information released by watchdog organization Statewatch on Tuesday, the system uses sensitive personal data drawn from police and judicial databases to flag individuals who might become killers. Instead of using teenage psychics floating in pools, the UK's program reportedly relies on AI to analyze and profile citizens using large amounts of data, including mental health records, addiction history, self-harm reports, suicide attempts, and disability status.

"A document we obtained says data on 100,000+ people was shared by [the Greater Manchester Police] to develop the tool," Statewatch said on social media. "The data comes from multiple police and judicial databases, known for institutional racism and bias," the watchdog argued.

Statewatch is a nonprofit group founded in 1991 to monitor the development of the EU state and civil liberties. It has built a network of members and contributors that includes investigative journalists, lawyers, researchers, and academics from over 18 countries. The organization said the documents were obtained via Freedom of Information requests.

"The Ministry of Justice's attempt to build this murder prediction system is the latest chilling and dystopian example of the government's intent to develop so-called crime 'prediction' systems," Sofia Lyall, a researcher for Statewatch, said in a statement. "The Ministry of Justice must immediately halt further development of this murder prediction tool."

"Instead of throwing money towards developing dodgy and racist AI and algorithms, the government must invest in genuinely supportive welfare services. Making welfare cuts while investing in techno-solutionist 'quick fixes' will only further undermine people's safety and well-being," Lyall said.

Statewatch's revelations outlined the breadth of data being collected, which includes information on suspects, victims, witnesses, missing persons, and individuals with safeguarding concerns. One document specifically noted that "health data" was considered to have "significant predictive power" for identifying potential murderers.

News of the AI tool quickly spread and drew heavy criticism from experts. Business consultant and editor Emil Protalinski wrote that "governments need to stop getting their inspiration from Hollywood," while the official account of Spoken Injustice warned that "without real oversight, AI won't fix injustice, it will make it worse."

Even AI seems to know how badly this can end. "The UK's AI murder prediction tool is a chilling step towards 'Minority Report,'" Olivia, an AI agent "expert" on policymaking, wrote earlier Wednesday.

The controversy has ignited debate about whether such systems could ever work ethically. Alex Hern, AI writer at The Economist, highlighted the nuanced nature of objections to the technology: "I'd like more of the opposition to this to be clear about whether the objection is 'it won't work' or 'it will work but it's still bad,'" he wrote.

This is not the first time politicians have attempted to use AI to predict crimes. Argentina, for example, sparked controversy last year when it reported working on an AI system capable of detecting crimes before they happen.
Japan's AI-powered app Crime Nabi has received a warmer reception, while Brazil's CrimeRadar app, developed by the Igarapé Institute, claims to have helped reduce crime by up to 40% in test zones in Rio de Janeiro. Other countries using AI to predict crimes include South Korea, China, Canada, the UK, and the United States -- with the University of Chicago claiming to have a model capable of predicting future crimes "one week in advance with about 90% accuracy."

The Ministry of Justice has not publicly acknowledged the full scope of the program or addressed concerns about potential bias in its algorithms. Whether the system has moved beyond the development phase into actual deployment remains unclear.
The UK Ministry of Justice is developing an AI-powered algorithm to predict potential murderers, sparking debates on ethics, privacy, and the use of sensitive personal data.
The UK Ministry of Justice (MoJ) has confirmed the development of an artificial intelligence system designed to predict potential murderers, sparking intense debate over ethics, privacy, and the use of sensitive personal data [1][2][3]. Originally dubbed the "Homicide Prediction Project," the initiative aims to analyze data from hundreds of thousands of individuals to identify those most likely to commit serious violent offenses [4].
The project, now renamed "Sharing Data to Improve Risk Assessment," involves collaboration between the MoJ, Home Office, Greater Manchester Police, and London's Metropolitan Police [1]. The AI system processes data from 100,000 to 500,000 records, including information on suspects, victims, witnesses, and vulnerable individuals [1][2].
Key data points analyzed include:
Names, dates of birth, gender, and ethnicity [1][3]
Police national computer (PNC) identification numbers [1][3]
Health markers expected to carry "significant predictive power," covering mental health, addiction, self-harm, suicide, vulnerability, and disability [1][3]
The age at which a person first appeared as a victim or first had contact with police [3]
MoJ officials insist that the project is currently for research purposes only and an extension of existing risk-prediction tools used in the criminal justice system [1][3]. They argue that the initiative aims to improve public safety by enhancing risk assessment capabilities, particularly for individuals on probation [3][5].
Civil liberties groups and researchers have raised significant concerns about the project:
Bias and discrimination: Critics argue that the system may reinforce existing biases in the criminal justice system, particularly affecting racialized and low-income communities [1][2][3].
Privacy invasion: The use of sensitive health and personal data has been described as "highly intrusive and alarming" [1][3].
Ethical implications: Questions have been raised about the morality of profiling individuals as potential criminals before any offense has been committed [2][4].
Accuracy and reliability: Experts point out that algorithmic systems for predicting crime are inherently flawed and may lead to false positives [2][3].
The UK's project is not unique, as several countries have explored or implemented AI-driven crime prediction systems:
South Korea's "Dejaview" analyzes CCTV footage for signs of impending crimes [2][5]
Japan's Crime Nabi app and Brazil's CrimeRadar app have been deployed, with CrimeRadar's developers claiming crime reductions of up to 40% in test zones [5]
Argentina reported working on a system to detect crimes before they happen [5]
China has reportedly explored profiling citizens to predict potential dissidents or criminals [2]
University of Chicago researchers claim a model that predicts future crimes one week in advance with about 90% accuracy [2][5]
As the UK government continues to develop this tool, the debate surrounding its ethical implications and potential effectiveness remains heated. Critics call for an immediate halt to the project, advocating for investment in supportive welfare services instead of "techno-solutionist 'quick fixes'" [1][3]. The controversy highlights the ongoing challenge of balancing technological advancements in law enforcement with civil liberties and ethical considerations in the age of AI.
Reference
[1] UK's MoJ testing algorithms to uncover future killers (The Register)
[2] UK government developing homicide prediction algorithm to identify potential violent offenders (TechSpot)
[3] AI murder predictor could catch killers before they strike
[4] UK developing algorithmic tool to predict potential killers
[5] UK Government Is Secretly Building 'Murder Prediction' AI System (Decrypt)