
Palantir Technologies: Palantir predictive policing software for LAPD disproportionately targeted Black and Latino residents

Palantir provided predictive policing software to the Los Angeles Police Department that designated 'chronic offenders' and generated bulletins for targeted enforcement. Analysis showed the system disproportionately targeted minority neighborhoods, with those flagged being 53% Latino and 31% Black. Criminologists found the system amplified existing racial biases in policing data, essentially automating historical injustices rather than providing neutral analysis.

Scoring Impact

Topic                      Direction   Relevance   Contribution
Algorithmic Fairness       -against    primary     -1.00
Criminal Justice Reform    -against    secondary   -0.50
Racial Justice             -against    secondary   -0.50
Surveillance Technology    +toward     primary     -1.00

Overall incident score = -0.725

Score = avg(topic contributions) × significance (high ×1.5) × confidence (0.64)
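The scoring formula above can be sketched as follows. This is a minimal reconstruction, assuming the contributions are simply averaged and then scaled by the significance and confidence multipliers; the function name and defaults are illustrative, not the site's actual implementation.

```python
def incident_score(contributions, significance=1.5, confidence=0.64):
    """Average the topic contributions, then apply the significance
    and confidence multipliers (assumed semantics of the formula)."""
    avg = sum(contributions) / len(contributions)
    return avg * significance * confidence

# Contributions from the table: two primary (-1.00) and two secondary (-0.50).
score = incident_score([-1.00, -0.50, -0.50, -1.00])
# avg = -0.75, so score = -0.75 * 1.5 * 0.64 = -0.72
```

Note that this yields -0.72, while the page reports -0.725, which suggests the displayed confidence of 0.64 is rounded from a slightly higher stored value (about 0.644).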

Evidence (2 signals)

Confirms Criticism · Oct 1, 2024 · documented

Academic research paper documented algorithmic bias in Palantir's predictive policing technology

A peer-reviewed research paper analyzed Palantir's predictive policing technology and documented systematic algorithmic bias and a lack of transparency. Criminologist Maria Velez of the University of Maryland found that the LAPD's use of Palantir disproportionately targeted Black and Latino communities.

Confirms Criticism · Sep 29, 2020 · documented

BuzzFeed News obtained LAPD training documents revealing scope of Palantir surveillance

BuzzFeed News obtained training documents showing the LAPD's Palantir system tracked scars, tattoos, license plates, and other personal details. Analysis showed chronic offender bulletins disproportionately targeted Black and Latino residents, with flagged individuals being 53% Latino and 31% Black.
