Geolitica

PredPol, Inc., now known as Geolitica,[2] is a predictive policing company that attempts to predict property crimes using predictive analytics. PredPol is also the name of the software the company produces. PredPol began as a project of the Los Angeles Police Department (LAPD) and Jeff Brantingham, a professor at the University of California, Los Angeles. PredPol has produced a patented algorithm, which is based on a model used to predict earthquake aftershocks.

PredPol
Type: Private
Founded: 1 January 2012
Founders:
  • Jeff Brantingham
  • George Mohler[1]
Headquarters: Santa Cruz, California, U.S.
Key people: Brian MacDonald (CEO)[2]
Products: Predictive analytics
Website: www.predpol.com

As of 2020, PredPol's algorithm is the most commonly used predictive policing algorithm in the U.S.[3][4] Police departments that use PredPol are given printouts of jurisdiction maps that denote areas where crime has been predicted to occur throughout the day.[5] The Los Angeles Times reported that officers are expected to patrol these areas during their shifts, and that the system tracks their movements via the GPS in their patrol cars.[6] Scholar Ruha Benjamin called PredPol a "crime production algorithm": because officers patrol the predicted zones more heavily and expect to see crime there, the predictions become a self-fulfilling prophecy.[3]
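
The research publicly associated with PredPol's founders describes a self-exciting point process of the same family as the epidemic-type aftershock sequence (ETAS) models used in seismology: each map grid cell has a background crime rate, and each recorded crime temporarily raises the expected rate of "near-repeat" crimes in that cell. The sketch below illustrates only that general idea; the parameter values, cell names, and functions are assumptions made for illustration, not Geolitica's proprietary implementation.

# A minimal, illustrative sketch of a self-exciting ("aftershock"-style) point
# process applied to gridded crime data. Parameter values, cell names, and the
# grid layout are assumptions; this is not Geolitica's actual code.
import math

MU = 0.1      # assumed background rate: events per cell per day
THETA = 0.5   # assumed expected number of "near-repeat" events triggered per event
OMEGA = 0.2   # assumed decay rate of the triggering effect (per day)

def cell_intensity(past_event_times, now):
    """Conditional intensity for one cell: background rate plus an
    exponentially decaying boost from each past event in that cell."""
    triggered = sum(
        THETA * OMEGA * math.exp(-OMEGA * (now - t))
        for t in past_event_times
        if t < now
    )
    return MU + triggered

def rank_cells(events_by_cell, now, top_k=3):
    """Rank grid cells by predicted intensity; the top-ranked cells would be
    the boxes flagged for patrol on a day's map."""
    scores = {cell: cell_intensity(times, now) for cell, times in events_by_cell.items()}
    return sorted(scores, key=scores.get, reverse=True)[:top_k]

# Hypothetical burglary report times (in days) for a few map cells.
events = {
    "cell_A": [1.0, 2.5, 2.8],  # recent cluster -> elevated predicted intensity
    "cell_B": [0.2],
    "cell_C": [],
}
print(rank_cells(events, now=3.0))

In a model of this kind, a recent cluster of incidents keeps a cell's predicted intensity elevated for days afterward, so the same blocks can be flagged repeatedly; this is the dynamic critics point to when describing the feedback loop above.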

In an August 2023 earnings call, the CEO of SoundThinking announced that the company had begun the process of absorbing parts of Geolitica, including its engineering team, patents, and customers. According to SoundThinking, Geolitica would cease operations at the end of 2023.[7]

Controversies

PredPol was created in 2010 and was a leading vendor of predictive policing technology by 2012.[8] Smithsonian magazine remarked in 2018 that no independent published research had ever confirmed PredPol's claims of its software's accuracy.[9] In March 2019, the LAPD's internal audit concluded that there were insufficient data to determine if PredPol software helped reduce crime.[6]

In October 2018, Cory Doctorow described the secrecy surrounding which police departments use PredPol; the company does not disclose this information, and it is not otherwise accessible to the public.[10] In February 2019, Vice reported that many police departments had used PredPol without disclosing it publicly.[11] According to PredPol in 2019, 60 police departments in the U.S. used the software, most of them mid-size agencies of 100 to 200 officers. In 2019, several cities reported cancelling PredPol contracts due to cost: Mountain View, California spent more than $60,000 on the program between 2013 and 2018, and Hagerstown, Maryland spent $15,000 a year on the service until 2018.[6]

In 2016, Mic reported that PredPol inappropriately directed police to minority neighborhoods.[12]

In 2017, Santa Cruz placed a moratorium on the use of predictive policing technology.[13] In 2020, the Santa Cruz City Council banned the use of predictive policing outright, a move supported by a coalition of civil liberties and racial justice groups.[14]

Institutions such as the Brennan Center have urged transparency from police departments that use the technology: for policymakers and auditors to evaluate these algorithms, logs of who creates and accesses the predictions must be kept and disclosed.[15]

In April 2020, the Los Angeles Police Department, one of PredPol's earliest customers, ended its program,[16] having been unable to measure the software's effectiveness in reducing crime.[17]

In December 2021, Gizmodo and The Markup published a report indicating that PredPol perpetuated racial biases by targeting Latino and Black neighborhoods, while crime predictions for white middle- to upper-class areas were largely absent.[2][18]

In October 2023, an investigation by The Markup revealed that the crime predictions generated by PredPol's algorithm for the Plainfield Police Department had an accuracy rate of less than half of one percent.[19]

References

  1. Gilbertson, Annie (August 20, 2020). "Data-informed predictive policing was heralded as less biased. Is it?". Mic.
  2. Sankin, Aaron; et al. (December 1, 2021). "Crime Prediction Software Promised to Be Free of Biases. New Data Shows It Perpetuates Them". Gizmodo.
  3. Benjamin, Ruha (2019). Race After Technology: Abolitionist Tools for the New Jim Code. Medford, MA: Polity. p. 83.
  4. Heaven, Will Douglas (July 17, 2020). "Predictive policing algorithms are racist. They need to be dismantled". MIT Technology Review.
  5. Wang, Jackie (2018). Carceral Capitalism. South Pasadena, CA: Semiotext(e). p. 241.
  6. Puente, Mark (July 3, 2019). "LAPD pioneered predicting crime with data. Many police don't think it works". Los Angeles Times. Retrieved November 22, 2020.
  7. Mehrotra, Dhruv (September 27, 2023). "The Maker of ShotSpotter Is Buying the World's Most Infamous Predictive Policing Tech". Wired. ISSN 1059-1028. Retrieved September 27, 2023.
  8. Winston, Ali (April 26, 2018). "A pioneer in predictive policing is starting a troubling new project". The Verge.
  9. Rieland, Randy (March 5, 2018). "Artificial Intelligence Is Now Used to Predict Crime. But Is It Biased?". Smithsonian.
  10. Doctorow, Cory (October 30, 2018). "Is this the full list of US cities that have bought or considered Predpol's predictive policing services?". Boing Boing.
  11. Koebler, Jason; Haskins, Caroline (February 6, 2019). "Dozens of Cities Have Secretly Experimented With Predictive Policing Software". Vice.
  12. Smith IV, Jack (October 6, 2016). "(Exclusive) Crime-prediction tool PredPol amplifies racially biased policing, study shows". Mic.
  13. Miller, Susan (July 1, 2020). "Santa Cruz bans predictive policing". GCN. Retrieved November 23, 2020.
  14. "Santa Cruz becomes first U.S. city to approve ban on predictive policing". Santa Cruz Sentinel. June 24, 2020. Retrieved November 23, 2020.
  15. "Predictive Policing Explained | Brennan Center for Justice". www.brennancenter.org. April 1, 2020. Retrieved November 23, 2020.
  16. "LAPD will end controversial program that aimed to predict where crimes would occur". Los Angeles Times. April 21, 2020.
  17. Leila Miller (April 21, 2020). "LAPD data programs need better oversight to protect public, inspector general concludes". Los Angeles Times.
  18. "Crime Prediction Software Promised to be Free of Biases. New Data Shows It Perpetuates Them – the Markup". December 2, 2021.
  19. Sankin, Aaron (October 2, 2023). "Predictive Policing Software Terrible at Predicting Crimes". The Markup. Retrieved October 3, 2023.