Police Will Use AI to Predict Violent Crimes, Minority-Report Style
No precogs required.
The world’s first system to predict violent crime, identifying perpetrators and victims before a killing is committed, may come online in March 2019.
It sounds a lot like Minority Report, the futuristic movie in which a D.C.-based police force uses beings with psychic powers to predict murders. Except this is freaking real.
Instead of using precogs — as Philip K. Dick called these special humans in the story that inspired the film — the police in the United Kingdom will use artificial intelligence and big data gathered from multiple police databases at the local and national level. Their system has already amassed more than one terabyte of data, covering more than five million potential suspects and victims.
A report by New Scientist says that this system — called the National Data Analytics Solution, or NDAS — uses AI and statistics to assess the risk of someone “committing or becoming a victim of gun or knife crime, as well as the likelihood of someone falling victim to modern slavery.”
The AI analyzes 1,400 indicators, including 30 factors — like previous crime records or association with criminals — that are especially useful for predicting who will commit the next crime and when.
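The actual NDAS model isn’t public, but risk assessments of this kind typically boil down to a weighted combination of indicators squeezed into a probability. Here is a minimal illustrative sketch in Python — every factor name, weight, and threshold below is hypothetical, not taken from the real system:

```python
import math

# Hypothetical indicator weights -- the real NDAS factors and model are not public.
WEIGHTS = {
    "prior_violent_offenses": 0.8,
    "known_criminal_associates": 0.5,
    "prior_weapon_incidents": 0.9,
}

def risk_score(indicators):
    """Combine count-style indicators into a 0-1 risk estimate (logistic-model sketch)."""
    z = sum(WEIGHTS[name] * value for name, value in indicators.items())
    # 2.0 is a hypothetical intercept shifting the baseline risk downward.
    return 1 / (1 + math.exp(-(z - 2.0)))

person = {"prior_violent_offenses": 2, "known_criminal_associates": 1, "prior_weapon_incidents": 0}
print(round(risk_score(person), 3))  # → 0.525
```

The ethical worry critics raise maps directly onto a sketch like this: if the weights are learned from historically biased police records, the scores inherit that bias.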
A total of nine British police forces — including London’s Metropolitan Police and Greater Manchester Police — are participating in the development of this predictive technology, whose first prototype is set to go live as soon as March 2019 in the West Midlands.
According to NDAS project manager Iain Donnelly, the objective of the system — which its promoters claim is the first of its kind — is to prevent crime rather than spend money solving crimes and chasing criminals after the fact. The idea is that, if the system flags someone as a potential criminal or victim, the police will be able to head off the crime with counseling and social-assistance programs. Or just by scaring the bejeezus out of bad guys.
Critical backlash
While Britain’s NDAS seems to be pioneering in the way it combines multiple databases and AI to pinpoint future criminals at the individual level, police forces in the United States have already used AI systems to identify criminal hotspots. And while those systems don’t identify specific future killers, just potential areas of conflict, their use has drawn criticism from the American Civil Liberties Union, the Brennan Center for Justice, and other civil rights organizations that believe they could be misused and, worse, biased against certain populations.
Among the critics, the Alan Turing Institute has released a report raising serious ethical concerns over this crime-prediction effort.
According to West Midlands Police’s Deputy Chief Constable Louisa Rolfe, however, the NDAS team “sought this independent review at a very early stage as we think an ethical approach should guide the development of this work.” According to the Data Ethics Group at The Alan Turing Institute, the “West Midlands Police have begun work on the National Analytics Solution project and are actively drawing on advice offered by the Turing and IDEPP to help develop their approach to the ethical governance of the project.”
In other words: It seems that the project will go forward with some modifications, and Britain — a country famously festooned with CCTV cameras, with 420,000 in London alone, second only to Beijing’s 470,000 — will have a new Big Brother as soon as next year.
Jesus Diaz founded the new Sploid for Gawker Media after seven years working at Gizmodo, where he helmed the lost-in-a-bar iPhone 4 story and wrote old angry man rants, among other things. He's a creative director, screenwriter, and producer at The Magic Sauce, and currently writes for Fast Company and Tom's Guide.