'Minority Report?' New AI Program Aims to Predict Violent Crime Before It Happens

The UK is testing a new system that sounds suspiciously like Minority Report, sans the watery tank full of bald women hooked up to electrodes. Instead, the program, called the National Data Analytics Solution (or NDAS), uses AI to analyze police records and assign individuals a risk factor for committing future crimes, including violent ones involving guns or knives.

One key difference from Minority Report is that individuals flagged as high-risk are not arrested—instead, they'll be approached by a social worker or another state representative and offered counseling. The approach is expected to be most useful for offenders with mental illness, as well as those with a history of violent behavior. According to New Scientist, NDAS has already identified 1,400 indicators that might be used to predict an individual's future crimes, including "the number of crimes an individual had committed with the help of others and the number of crimes committed by people in that individual's social group."

NDAS has already collected more than a terabyte of information and holds records on roughly 5 million individuals in its combined database. The key impetus for the program was apparently a series of budget cuts to the UK's police forces, which have pushed them to do more with limited resources. NDAS, if it works as planned, might help prevent crimes before they even happen.

On the other hand, the Alan Turing Institute has already published a report identifying key ethical and practical issues with the system. One of the biggest is that NDAS could reinforce existing police biases by repeatedly focusing attention on the same individuals. Another is the ethical concern at the heart of Philip K. Dick's classic sci-fi story: is it right to intervene in someone's life before they've committed a crime?
