
Mark Zuckerberg

Facebook's algorithms amplified anti-Rohingya hate speech that UN investigators said played a 'determining role' in the Myanmar genocide; over 10,000 killed, 740,000+ displaced

Between 2012 and 2018, Facebook's recommendation algorithms systematically amplified hate speech and disinformation targeting Myanmar's Rohingya Muslim minority. UN investigators concluded that this played a 'determining role' in inciting genocide: the Myanmar military used Facebook as a tool for ethnic cleansing propaganda over multiple years. In 2017, over 10,000 Rohingya were killed and more than 740,000 were forced to flee. Internal studies dating back to 2012 showed Facebook knew its algorithms amplified harmful content, yet the company failed to invest adequately in content moderation. Zuckerberg was presented with options to remove algorithmic amplification in April 2020 but chose not to adopt them. Facebook apologized in April 2018, an apology civil rights groups dismissed as 'grossly insufficient.' Rohingya refugees filed a $150 billion lawsuit in 2021, and a survivor filed an SEC whistleblower complaint in January 2025.

Scoring Impact

Topic                                 Direction  Relevance  Contribution
Algorithmic Fairness                  against    primary    -1.00
Content Moderation                    against    primary    -1.00
Digital Safety for Vulnerable Users   against    primary    -1.00

Overall incident score = -0.752

Score = avg(topic contributions) × significance (critical, ×2) × confidence (0.75) × agency (negligent, ×0.5)
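A minimal sketch of how these multipliers compose, assuming the values shown in the formula and the table above (this is an illustration, not the site's actual scoring code; the published -0.752 presumably reflects unrounded inputs):

```python
# Hypothetical reconstruction of the incident-score formula above.
contributions = [-1.00, -1.00, -1.00]          # per-topic contributions from the table

avg = sum(contributions) / len(contributions)  # -1.00
significance = 2.0                             # "critical" multiplier
confidence = 0.75                              # stated confidence factor
agency = 0.5                                   # "negligent" multiplier

score = avg * significance * confidence * agency
print(round(score, 3))                         # -0.75, vs. the published -0.752
```

With all three contributions at exactly -1.00 the product is -0.75, so the small gap to the published -0.752 likely comes from rounding of the displayed inputs.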

Evidence (3 signals)

Confirms Statement · Sep 29, 2022 · documented

Amnesty International report found Facebook's algorithms promoted violence against Rohingya for profit

In September 2022, Amnesty International released a comprehensive report finding that Meta's dangerous algorithms and reckless pursuit of profit substantially contributed to atrocities against the Rohingya in 2017. The report found Meta knew or should have known that Facebook's algorithmic systems were supercharging the spread of harmful anti-Rohingya content. Internal studies dating to 2012 indicated Meta knew its algorithms could result in serious real-world harms.

Confirms Statement · Nov 6, 2018 · verified

Facebook admitted it failed to prevent platform being used to fuel violence in Myanmar

In November 2018, Facebook officially admitted it failed to do enough to prevent its platform being used to fuel political division and bloodshed in Myanmar. A company representative stated 'We know we need to do more to ensure we are a force for good in Myanmar.' This admission came after years of warnings from NGOs and civil society organizations.

Confirms Statement · Mar 12, 2018 · verified

UN investigators concluded Facebook played 'determining role' in inciting violence against Rohingya

In March 2018, United Nations investigators concluded that disinformation campaigns facilitated by Facebook played a 'determining role' in inciting violence against Myanmar's Rohingya Muslim minority. The Myanmar military systematically used Facebook to disseminate hate propaganda, false news, and inflammatory posts over multiple years, culminating in genocide.
