
Federal Appeals Court Sounds Alarm Over Predictive Policing

Judges on a federal appeals court took aim yesterday at predictive policing, the practice of using algorithmic analysis to predict crime and direct law enforcement resources. The Fourth Circuit ruled that Richmond police violated the Fourth Amendment when they stopped and searched the defendant, Billy Curry, simply because he was walking near the scene of a shooting. In dissent, Judge J. Harvie Wilkinson called the court’s decision a “gut-punch to predictive policing.” Other judges responded by highlighting the dangers and failings of the practice. Chief Judge Roger Gregory questioned whether predictive policing is “a high-tech version of racial profiling.” Judge James A. Wynn highlighted the “devastating effects of over-policing on minority communities” and explained that predictive policing “results in the citizens of those communities being accorded fewer constitutional protections than citizens of other communities.” Judge Stephanie D. Thacker warned that “any computer program or algorithm is only as good as the data that goes into it” and that predictive policing “has been shown to be, at best, of questionable effectiveness, and at worst, deeply flawed and infused with racial bias.” EPIC has long highlighted the risks of algorithms in the criminal justice system and recently obtained a 2014 Justice Department report detailing the dangers of predictive policing.


