YouTube

Video sharing platform acquired by Google in 2006. World's largest video platform with significant content moderation and creator economy implications.

Team & Alumni

Neal Mohan · Current
CEO

Susan Wojcicki
CEO
Feb 1, 2014 – Feb 16, 2023

Related Entities

Acquired by Google · Oct 9, 2006

Track Record

reactive · Trump Administration (2025-) · $24.5M

YouTube/Alphabet agreed to pay $24.5M to settle Trump's lawsuit over his 2021 account suspension following Jan 6. Of this, $22M goes to the Trust for the National Mall to fund a new White House State Ballroom. Settlement negotiations included mediation at Mar-a-Lago with Sundar Pichai and Sergey Brin. Legal experts called the settlement 'straight influence-peddling' with no legal merit.

In late 2024, YouTube rewrote its moderation policy to allow videos in which up to 50% of the content violates its rules to remain online (up from 25%), prioritizing 'freedom of expression' over enforcement. Moderators were instructed to leave up videos on elections, race, gender, and abortion even if half the content violates rules against hate speech or misinformation. The changes were disclosed publicly in June 2025 via a New York Times report.

YouTube introduced a revenue-sharing model for Shorts creators, giving them 45% of revenue from the Creator Pool while YouTube retains 55% (largely to cover music licensing costs). Revenue from ads shown between Shorts in the feed is pooled monthly, then distributed based on each creator's share of total eligible views. While payout rates are modest ($0.03-$0.10 per 1,000 views), the model provides an actual revenue share rather than a fixed fund, aligning platform and creator incentives. Additional monetization options include Super Thanks tipping and affiliate product tagging.
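The pool-then-split mechanics reduce to simple proportional arithmetic. The sketch below only illustrates the math as described above; the pool size, creator names, and view counts are hypothetical, and the sole details taken from the description are the 45%/55% split and the views-based allocation.

```python
# Illustrative sketch of the Shorts Creator Pool math described above.
# All figures are hypothetical; only the 45% creator / 55% platform split
# and the views-proportional allocation come from the description.

CREATOR_SHARE = 0.45  # creators keep 45% of their allocation; YouTube retains 55%

def shorts_payouts(monthly_pool_usd, eligible_views):
    """Split the monthly Creator Pool by each creator's share of eligible views."""
    total_views = sum(eligible_views.values())
    payouts = {}
    for creator, views in eligible_views.items():
        allocation = monthly_pool_usd * views / total_views      # creator's slice of the pool
        payouts[creator] = round(allocation * CREATOR_SHARE, 2)  # 45% is paid out
    return payouts

# Hypothetical month: a $50,000 pool split across 500M eligible Shorts views.
views = {"alice": 300_000_000, "bob": 150_000_000, "carol": 50_000_000}
print(shorts_payouts(50_000, views))
# {'alice': 13500.0, 'bob': 6750.0, 'carol': 2250.0}
# alice's effective rate: 13500 / 300000 = ~$0.045 per 1,000 views,
# within the $0.03-$0.10 range mentioned above.
```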

In 2023, YouTube significantly expanded creator monetization opportunities. In February, YouTube launched Shorts revenue sharing giving creators 45% of allocated ad revenue. In June 2023, YouTube lowered Partner Program eligibility thresholds from 1,000 to 500 subscribers and from 4,000 to 3,000 watch hours, enabling more emerging creators to earn money. The lower tier initially provided access to fan funding features (Super Chat, Super Thanks, channel memberships), with ad revenue sharing unlocking at the existing thresholds.
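Read literally, the 2023 thresholds define two tiers. The snippet below is a hypothetical illustration of that tiering; the function and argument names are invented, and only the threshold values and feature lists are taken from the paragraph above.

```python
# Hypothetical illustration of the two-tier eligibility described above.
# Threshold values come from the paragraph; names and structure are invented.

def partner_program_tier(subscribers, annual_watch_hours):
    """Classify a channel under the 2023 YouTube Partner Program thresholds."""
    if subscribers >= 1_000 and annual_watch_hours >= 4_000:
        return "full tier: ad revenue sharing + fan funding"
    if subscribers >= 500 and annual_watch_hours >= 3_000:
        return "lower tier: fan funding only (Super Chat, Super Thanks, memberships)"
    return "not yet eligible"

print(partner_program_tier(650, 3_200))    # lower tier
print(partner_program_tier(1_200, 4_500))  # full tier
```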

In June 2023, YouTube reversed its policy of removing content making false claims that the 2020 US presidential election was stolen. The platform had previously removed 'tens of thousands' of such videos since December 2020. YouTube stated that while 'removing this content does curb some misinformation,' it 'could also have the unintended effect of curtailing political speech.' Critics argued this enabled continued spread of election denialism.

In May 2020, YouTube published its COVID-19 Medical Misinformation Policy banning content contradicting WHO or local health authorities. In 2021, YouTube expanded the policy to cover all vaccines and removed accounts of prominent anti-vaccination activists including Joseph Mercola and Robert Kennedy Jr. Studies showed the policy significantly reduced the rate of misinformation videos on the platform compared to the pre-policy period.

negligent

Multiple academic studies found YouTube's recommendation algorithm directed users toward increasingly extreme content. A systematic review found 14 of 23 studies implicated YouTube's recommender system in facilitating problematic content pathways. Research from UC Davis and studies published in PNAS showed the algorithm was more likely to recommend extremist and conspiracy content to right-leaning users. Over 70% of content watched on YouTube is recommended by its proprietary, opaque algorithm. While some studies produced contradictory findings, the lack of algorithmic transparency prevented definitive conclusions.

negligent $170.0M

The FTC and New York Attorney General fined Google/YouTube $170 million ($136M to FTC, $34M to NY) for violating COPPA by collecting personal information from children under 13, including viewing history, without parental consent. YouTube had marketed its popularity with children to advertisers like Mattel and Hasbro while refusing to acknowledge portions of its platform were directed at kids. This was the largest COPPA penalty in history at the time.

negligent

In 2017, YouTube faced a major scandal known as 'Elsagate' in which disturbing content disguised as children's videos—featuring violent, sexual, and abusive themes with popular children's characters—accumulated tens of millions of views. YouTube's content moderation systems failed to detect these videos, which were tagged to circumvent safety algorithms. YouTube Kids was also affected, and the platform later acknowledged that its automated moderation had failed to catch such videos. YouTube eventually removed over 150,000 videos, terminated 50+ channels, and disabled comments on 625,000+ videos.

negligent

In March 2017, LGBTQ+ creators discovered YouTube's Restricted Mode was systematically hiding their content—including wedding videos, coming-out stories, and queer-themed pop culture commentary—while allowing Mortal Kombat fatality compilations and marijuana growing tutorials. YouTube acknowledged the system 'sometimes make[s] mistakes' and claimed to fix it in April 2017 by unfiltering 12 million videos, but creators reported the problems persisted. By 2019, LGBTQ+ creators filed a class-action lawsuit alleging discriminatory censorship, with plaintiffs reporting 75% revenue drops.

negligent

In March 2017, the Times of London revealed that ads from major brands and the UK government were running alongside extremist and terrorist content on YouTube. Over 250 brands including AT&T, Walmart, PepsiCo, and Starbucks pulled their ads. YouTube's response—implementing broad demonetization categories and raising monetization thresholds—disproportionately harmed legitimate creators covering sensitive topics including news, women's issues, and LGBTQ+ content, while the underlying brand safety problem persisted. Creators reported 30-85% revenue drops.

On February 24, 2015, YouTube launched the YouTube Kids app for Android and iOS, designed specifically for children ages 4-12. The app featured algorithmic and human-curated content filtering for child-friendliness, parental controls including screen time limits, search restriction, and channel blocking, plus age-based content categories. While initially criticized for some advertising concerns and content filtering gaps, it represented a proactive investment in child-safe platform design and COPPA compliance.

In November 2009, YouTube introduced automatic captions using speech recognition technology, initially for English-language content. The feature was led by Ken Harrenstien, a deaf Google engineer. While auto-captions had significant accuracy limitations (60-70% accuracy initially, improving over time), the feature represented a major step toward making the platform's massive video library accessible to deaf and hard-of-hearing users. YouTube subsequently expanded auto-captions to dozens of languages.

Support & Opposition

Actions from other entities targeting YouTube

Susan Wojcicki · Jan 12, 2022

Criticized by 80+ fact-checking organizations calling YouTube 'major conduit of disinformation'

In January 2022, over 80 fact-checking organizations sent an open letter to Wojcicki calling YouTube 'one of the major conduits of online disinformation and misinformation worldwide' and stating that the platform's measures to combat misinformation were 'proving insufficient.'
