Policy rating: Support = Good

Criminal Justice Reform

Supporting means...

Supports criminal justice reform; fair chance hiring; opposes prison labor exploitation; funds reform organizations; bail reform

Opposing means...

Exploits prison labor; builds tools for mass incarceration; opposes reform; discriminates against formerly incarcerated

Recent Incidents

Negligent

New York Attorney General Letitia James announced a settlement with DoorDash for routinely rejecting delivery worker applicants with criminal histories without fair assessment, in violation of state human rights and corrections laws and the NYC Fair Chance Act. In a one-year period, DoorDash rejected 2,898 New York applicants based on their criminal history without considering the applicant's age when the offense was committed, rehabilitation efforts, or time elapsed since the offense. Of those rejected, 57 submitted appeals, but DoorDash reversed none of the rejections.

$14.0M

In 2020, Jack Dorsey's Start Small fund donated $3 million to Colin Kaepernick's Know Your Rights Camp for criminal justice reform, $10 million to Boston University's Center for Antiracist Research, $1 million to the NAACP for policing reform and voting rights, $1.5 million to Black Visions Collective, and $750,000 to ArchCity Defenders to combat the criminalization of poverty. He has also given over $53 million to Rihanna's Clara Lionel Foundation since 2020.

Negligent

Clearview AI's facial recognition technology was linked to wrongful arrests in which police relied on erroneous facial matches with minimal additional investigation. Robert Williams of Detroit was arrested for larceny in January 2020 despite never having stolen anything, identified solely by the facial recognition system. NYT reporter Kashmir Hill documented multiple cases in which flawed matches led law enforcement agencies to make false, privacy-eroding arrests. The cases disproportionately affected people of color due to documented racial bias in facial recognition technology.

Amazon's Ring subsidiary partnered with over 2,600 police departments, giving law enforcement the ability to request doorbell camera footage from users without warrants. Ring admitted to providing footage to police without owner consent at least 11 times in early 2022 during 'emergencies.' Sen. Markey's investigation found that Ring had egregiously lax privacy and civil rights protections, with employees in Ukraine having unfettered access to live camera feeds. Over 30 civil rights organizations demanded that the partnerships end, citing risks of racial profiling and overpolicing. Ring discontinued its police Request for Assistance tool in January 2024.

In 2018, Slack partnered with The Last Mile, FREEAMERICA, and the W.K. Kellogg Foundation to create Next Chapter, an engineering apprenticeship program helping formerly incarcerated individuals find skilled employment in tech. The program has since expanded to 14 hiring partner companies. Slack was one of the first major tech companies to create a dedicated reentry hiring pathway.

In May 2018, Reid Hoffman and his wife Michelle Yee joined The Giving Pledge, committing to donate the majority of their wealth to philanthropic causes. Through his $1 billion Aphorism Foundation, Hoffman funds five areas: science, economic opportunity, democracy, AI, and human rights. He launched the $10 million Trust in American Institutions Challenge through Lever for Change to scale solutions that restore public trust in schools, government bodies, media, and medical systems. His foundation also supports the Equal Justice Initiative, AllRaise for VC equity, and Opportunity@Work for workforce training.

Between 2012 and 2018, the New Orleans Police Department covertly used Palantir's predictive policing software, which analyzed criminal records, social media activity, and gang affiliations to identify potential crime risks. The program operated under the guise of a philanthropic partnership through the NOLA For Life initiative, circumventing normal procurement procedures and public oversight. The secret nature of the deployment prevented any democratic accountability or public debate about the use of surveillance technology.

Palantir provided predictive policing software to the Los Angeles Police Department that designated 'chronic offenders' and generated bulletins for targeted enforcement. Analysis showed the system disproportionately targeted minority neighborhoods: those flagged were 53% Latino and 31% Black. Criminologists found that the system amplified existing racial biases in policing data, essentially automating historical injustices rather than providing neutral analysis.