How AI is already being used to predict crime before it happens

Tom Cruise is a 'pre-crime' police officer on the run in the 2002 film Minority Report. (Twentieth Century Fox)

In the film Minority Report, Tom Cruise leads a police department that predicts crime before it happens – but is artificial intelligence now making this a reality?

Such technology has already been used successfully, and studies have shown that it can predict crime with up to 90% accuracy. But the technology is controversial, because it risks amplifying existing biases in how police forces deal with certain areas or communities.

If racist police officers target certain areas, for example, and their arrest data is used to train an AI system, the system could direct even more officers to those areas. In the European Union's recent AI Act, politicians banned predictive policing systems, alongside other technologies such as emotion recognition software.
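To make that feedback loop concrete, here is a minimal, entirely hypothetical simulation (the numbers and the allocation rule are invented for illustration, and are not drawn from any real police system): two areas have identical true crime rates, but one starts with more recorded arrests, and officers are allocated in proportion to past arrests.

```python
import random

# Entirely hypothetical illustration of a predictive-policing feedback
# loop. Both areas have the SAME true crime rate, but area 0 starts
# with more recorded arrests because it was historically over-policed.
TRUE_CRIME_RATE = 0.1
arrests = [50, 10]        # historical arrest records per area
TOTAL_OFFICERS = 20

for year in range(10):
    # The "model": allocate officers in proportion to past arrest data.
    total = sum(arrests)
    allocation = [round(TOTAL_OFFICERS * a / total) for a in arrests]

    # More officers in an area means more crime is *observed* there,
    # even though the underlying rate is identical everywhere.
    for area, officers in enumerate(allocation):
        encounters = officers * 100   # arbitrary patrol activity per officer
        observed = sum(random.random() < TRUE_CRIME_RATE
                       for _ in range(encounters))
        arrests[area] += observed

print("Recorded arrests per area:", arrests)
# Area 0 keeps receiving most of the officers, so its arrest count grows
# fastest: the historical skew is reinforced, never corrected.
```

Because extra patrols generate extra recorded incidents, the skewed starting data becomes self-confirming: the model's predictions look accurate even though the underlying crime rates never differed.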

Police normally predict crime in a very basic way, by overlaying reports of crime on maps to reveal hotspots where crime regularly occurs. But AI can analyse datasets to find patterns and predict where crime is likely to occur in the near future.
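At its simplest, hotspot mapping is a counting exercise: snap each incident onto a grid cell and rank the busiest cells. A minimal sketch in Python (with made-up coordinates and a made-up grid size, not any force's actual system) might look like this:

```python
import math
from collections import Counter

# Hypothetical incident reports: (latitude, longitude) of each crime.
incidents = [
    (51.5074, -0.1278), (51.5080, -0.1270), (51.5075, -0.1281),
    (51.5500, -0.2000), (51.5072, -0.1275),
]

CELL_SIZE = 0.01  # degrees; each map cell is roughly 1 km across

def to_cell(lat: float, lon: float) -> tuple[int, int]:
    """Snap a coordinate onto a coarse map grid."""
    return (math.floor(lat / CELL_SIZE), math.floor(lon / CELL_SIZE))

# Count incidents per grid cell and rank the busiest cells as hotspots.
hotspots = Counter(to_cell(lat, lon) for lat, lon in incidents)

for cell, count in hotspots.most_common(3):
    print(f"cell {cell}: {count} incidents")
```

AI-based systems go further, feeding features such as time and past incident patterns into a statistical model, but the training input is still historical police records, which is exactly where the bias concerns below arise.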

Research by the University of Chicago led to an algorithm that can predict future crimes a week in advance with 90% accuracy. But the study also highlighted problems in policing, finding that crime in wealthier areas led to more arrests than crime in poorer areas.

Has AI been used to predict crime?

A police department in Richmond, Virginia, used historical data to predict where guns might be fired randomly on New Year's Eve – and monitored the areas where it was likely to happen. Random gunfire decreased by 47%, the police department said.

Police forces in the UK have used software from companies including IBM, Microsoft and Palantir both to predict where crimes might occur and to assess whether individuals might commit crimes.

Why is it controversial?

AI systems have in the past been 'fed' with data from highly problematic police forces with a history of racial bias.

In those cases, the AI systems can reinforce historic biases – and even serve as a form of 'tech-washing', in which the technology is used to legitimise the targeting of certain minorities, experts have warned.

Kate Crawford, co-founder and co-director of AI Now, said in 2019: "Your system is only as good as the data that you use to train it on.

"If the data itself is incorrect, it will cause more police resources to be focused on the same over-surveilled and often racially-targeted communities."