Meta Platforms

Parent company of Facebook, Instagram, WhatsApp, and Threads. Major player in social media, virtual reality (Meta Quest), and AI research.

Team & Alumni

CEO
Board Member · Apr 15, 2025 – Present
Yann LeCun (Current) · Executive · Jan 1, 2013 – Present
Board Member · Jun 1, 2008 – Present
Employee · Jun 1, 2019 – May 1, 2021
Executive · Jul 21, 2014 – Mar 1, 2017
Board Member · Jun 1, 2011 – May 1, 2019
Board Member · Jan 1, 2010 – Jan 1, 2019
COO · Mar 1, 2008 – Sep 30, 2022
Executive · Jun 1, 2007 – Jun 1, 2011
Board Member · Jun 1, 2004 – Feb 7, 2022

Related Entities

Acquired Oculus VR (Jul 21, 2014)
Acquired Instagram (Sep 6, 2012)
Parent of Facebook (since Feb 4, 2004)
Acquired WhatsApp (Oct 6, 2014)

Track Record

negligent

42 State Attorneys General issued a letter to Meta (along with other large technology companies) about the rise in sycophantic and delusional outputs from generative AI software. The letter highlighted that generative AI software has been involved in at least six deaths in the United States, and other incidents of domestic violence, poisoning, and hospitalizations for psychosis.

negligent

In December 2025, families of Levi Maciejewski (13, Pennsylvania, died 2024) and Murray Downey (16, Scotland, died 2023) sued Meta alleging Instagram's design enabled sextortion schemes targeting teens. The lawsuit cited an internal 2022 audit that allegedly found Instagram's 'Accounts You May Follow' feature recommended 1.4 million potentially inappropriate adults to teenage users in a single day. Instagram's default public privacy settings for teens were not changed to private until 2024, despite Meta claiming the change was made in 2021.

negligent

In November 2025, Meta's board of directors settled a shareholder derivative lawsuit for $190 million. Shareholders alleged that board members failed to properly oversee compliance with a 2012 FTC consent decree on user privacy, and that they improperly agreed to the $5 billion 2019 FTC settlement specifically to shield Mark Zuckerberg from personal liability. The suit highlighted undisclosed conflicts of interest among board members, including allegations that Marc Andreessen provided Zuckerberg strategic advice during board negotiations over a stock restructuring.

reactive

In November 2025, Chief Judge James Boasberg ruled after a six-week bench trial that the FTC failed to prove Meta unlawfully monopolized 'personal social networking.' The court found TikTok and YouTube are legitimate competitors, noting Americans spend only 17% of their time on Facebook viewing friends' content. The ruling was the most decisive government loss in any major Big Tech antitrust case. The FTC appealed in January 2026.

negligent

In November 2025, Reuters obtained internal Meta documents showing the company displayed approximately 15 billion 'higher risk' scam advertisements per day, generating an estimated $16 billion annually (about 10% of revenue). The documents revealed Meta set 'revenue guardrails' capping the revenue that fraud enforcement could cost at 0.15% of the total (~$135 million), and that executives proposed focusing fraud control only on countries facing imminent regulatory action. Internal estimates indicated Meta platforms were involved in roughly one in three successful U.S. frauds. Meta also developed a 'playbook' to manage regulatory perception of scam ads.

negligent

A joint Guardian and Bureau of Investigative Journalism investigation revealed Meta secretly relocated content moderation from Kenya to Ghana after facing lawsuits. Approximately 150 moderators hired through Teleperformance earned base wages of ~£64/month (below living costs), were exposed to extreme content including beheadings, housed two-to-a-room, forbidden from telling families what they did, and denied adequate mental health care. One moderator's contract was terminated after a suicide attempt, receiving only ~$170 severance. Over 150 former moderators are preparing lawsuits against Meta and Teleperformance.

$227.0M

In April 2025, the European Commission issued its first-ever Digital Markets Act fine, finding Meta's 'consent or pay' model violated DMA obligations to offer consumers a choice of an equivalent service that uses less of their personal data. Meta offered EU users of Facebook and Instagram only a binary choice between consenting to full data combination for personalized ads or paying a subscription. Internal documents revealed the model 'was never intended to comply' with the DMA, with Meta's own estimates predicting below 1% subscription uptake. The violation period ran from March to November 2024.

reactive

Meta announced a major overhaul of its DEI initiatives in January 2025. The company eliminated its dedicated DEI team, ended equity and inclusion programs, stopped representation goals, ceased diverse supplier sourcing requirements, and ended the 'diverse slate approach' in hiring. VP of HR Janelle Gale cited the changing legal and policy landscape around DEI in the United States.

On January 7, 2025, Meta announced it would end its third-party fact-checking program on Facebook and Instagram, replacing it with a community notes system similar to X (formerly Twitter). CEO Mark Zuckerberg stated fact-checkers had been 'too politically biased' and called for reducing 'censorship'. The change was announced two weeks before Trump's second inauguration.

reactive

On January 7, 2025, as part of broader content moderation changes, Meta updated its Community Standards to expressly permit users to describe LGBTQ+ people as mentally ill or abnormal and to call for their exclusion from professions, public spaces, and society based on sexual orientation and gender identity.

Meta spent a record $24.4 million on lobbying in 2024, a 27% increase from 2023 and the most the company has spent since it began federal lobbying in 2009. The effort was powered by 65 lobbyists — one for every eight members of Congress. Combined, Amazon, Apple, Google, Meta, and Microsoft spent nearly $69 million lobbying the federal government in 2022 alone.

Meta faces a lawsuit from Ferras Hamad, a Palestinian-American engineer who claims he was fired for attempting to fix bugs that suppressed Palestinian posts on Instagram. Hamad found content by Palestinian photojournalist Motaz Azaiza was misclassified as pornographic. The lawsuit alleges Meta deleted internal employee communications mentioning relatives killed in Gaza and investigated employees for using the Palestinian flag emoji.

negligent

Meta has faced multiple lawsuits from content moderators suffering severe psychological trauma. In 2020, Facebook paid $52 million to settle a U.S. class action (Scola v. Facebook) brought by moderators employed through Accenture and other contractors who developed PTSD. In September 2024, a Kenyan court ruled that Meta could be sued in local courts, with 144 former moderators (81% diagnosed with severe PTSD) seeking $1.6 billion in compensation. Additional lawsuits from Ghana-based moderators allege depression, anxiety, insomnia, and substance abuse from reviewing extreme content. Accenture employed more than a third of Meta's ~15,000 content moderators.

negligent

On October 24, 2023, forty-one states and D.C. sued Meta Platforms alleging the company knowingly designed and deployed harmful features on Instagram and Facebook that purposefully addict children and teens. The lawsuit alleged Meta violated COPPA by collecting personal data of users under 13 without parental consent, and that the company marketed its platforms to children despite knowing the harm. The suit cited internal research showing Meta was aware of the negative mental health effects on young users.

Court filings revealed Meta engineers torrented 81.7 terabytes of copyrighted books from Library Genesis, Z-Library, and Anna's Archive to train Llama models. Internal emails showed Meta director Sony Theakanath confirmed 'GenAI has been approved to use LibGen for Llama 3' after escalation to Mark Zuckerberg, with explicit instruction to never publicly disclose the use. Engineers wrote scripts to strip copyright notices from ebooks. A June 2025 ruling found this piracy was not protected by fair use.

Analysis by Issue One and Public Citizen found that 85% of Meta's registered federal lobbyists were former government employees as the company faced FTC antitrust litigation. Meta's D.C. lobbying operation expanded significantly during 2023-2024, hiring former officials from DOJ, FTC, and congressional staff. This pattern of revolving-door hiring was part of a broader tech industry trend where 75% of FTC officials had corporate conflicts of interest.

negligent

On September 30, 2022, North London coroner Andrew Walker concluded that Molly Russell, who died in November 2017, 'died from an act of self-harm while suffering from depression and the negative effects of online content.' This was the first ruling to formally attribute a child's death to social media content. The inquest found that of the 16,300 posts Molly saved, shared or liked on Instagram in the six months before her death, 2,100 related to depression, self-harm or suicide. The coroner found the platforms were 'not safe' and issued a prevention of future deaths report to Meta and Pinterest.

Support & Opposition

Actions from other entities targeting Meta Platforms

Patrick Collison · Apr 15, 2025

Joined Meta's board of directors amid the company's outreach to the Trump administration

Patrick Collison was appointed to Meta's board effective April 15, 2025. The appointment came as Meta rolled back its fact-checking and DEI programs, donated $1 million to Trump's inauguration, and settled a lawsuit with Trump for $25 million. Collison joined alongside Dina Powell McCormick, a former Trump advisor.

2 sources · 2 confirming
Frances Haugen · Oct 3, 2021

Leaked tens of thousands of internal Facebook documents exposing platform harms

In 2021, Frances Haugen copied tens of thousands of internal Facebook documents and provided them to the SEC, Congress, and journalists. Known as the 'Facebook Papers,' these documents revealed that Facebook knew Instagram was harmful to teen mental health (13.5% of teen girls said it worsened suicidal thoughts), that the platform's algorithms amplified divisive content, and that safety measures were weakened after the 2020 election. She appeared on 60 Minutes on October 3, 2021, and testified before Congress on October 5, 2021. In the days following the revelations, Facebook's share price fell sharply, with Mark Zuckerberg's personal stake losing roughly $6 billion in value.

2 sources · 2 confirming
Frances Haugen · Oct 1, 2021

Filed eight SEC complaints alleging Facebook misled investors about platform safety

Frances Haugen filed at least eight complaints with the Securities and Exchange Commission alleging that Facebook's public statements about platform safety contradicted its internal research. The complaints focused on Facebook's knowledge of harms caused by its platform, including teen mental health impacts and algorithmic amplification of harmful content, which it allegedly concealed from investors.

1 source · 1 confirming