Police in Plainfield, New Jersey, acquired crime prediction software in 2018. A nonprofit organization analyzed the reports and data generated by the software and reached a clear conclusion: it does not work.
The California-based company Geolitica sells a crime prediction service meant to help law enforcement carry out its public safety mission in a more effective, informed, and accountable way. In practice, the Plainfield authorities turned to the service to “be more effective in the fight against crime. We thought that being able to predict where we should patrol would help us achieve that. I don’t know if that was the case,” said Plainfield Police Capt. Guarino.
The experiment was a resounding failure, Mr. Guarino adds: “We haven’t used it very often, if ever. That’s why we ended up getting rid of it.” And indeed, of the 23,631 predictions generated by Geolitica over 10 months for the Plainfield police, “less than 100 of the predictions corresponded to a crime in the predicted category.” With a success rate of roughly 0.5% on average, we are far from the success rate of the investigators in “Minority Report”.
Crime prediction software is a failure and reinforces prejudices
That said, this resounding failure is not attributable to Geolitica alone. The captain admits that neither he nor his colleagues ever made a real effort to understand how the tool worked. The $20,500 paid to Geolitica would arguably have been better spent elsewhere. But this waste of public money is not the only problem raised by this type of software.
An analysis of the data generated by Geolitica for the police departments of 38 American cities shows that the company’s algorithms mainly target the poorest populations, those living in African-American or Latino neighborhoods. In other words, these artificial intelligence and prediction technologies are biased and reinforce prejudices against certain groups, which can lead to miscarriages of justice or even serious abuses.
Source: The Markup