YouTube announced a feature allowing terminated creators to apply for new channels, reversing previous lifetime bans. Beneficiaries include former Trump adviser Steve Bannon, RFK Jr (now HHS head), and Dan Bongino (now FBI deputy director). Policy shift aligns with broader industry trend of rolling back content moderation.
YouTube
Video sharing platform acquired by Google in 2006. World's largest video platform with significant content moderation and creator economy implications.
Track Record
Settled Trump lawsuit for $24.5M including $22M to Trump's White House ballroom project
Sep 30, 2025: YouTube/Alphabet agreed to pay $24.5M to settle Trump's lawsuit over his 2021 account suspension following Jan 6. Of this, $22M goes to the Trust for the National Mall, earmarked for construction of Trump's new White House State Ballroom. Settlement negotiations included mediation at Mar-a-Lago with Sundar Pichai and Sergey Brin. Legal experts called the suit legally meritless and described the settlement as 'straight influence-peddling'.
YouTube loosened moderation guidance, allowing videos with up to 50% violating content to remain online
In late 2024, YouTube rewrote its moderation policy to allow videos with up to 50% violating content to remain online (up from 25%), prioritizing 'freedom of expression' over enforcement. Moderators were instructed to leave up videos on elections, race, gender, and abortion even if half the content violates rules against hate speech or misinformation. The changes were disclosed publicly in June 2025 via a NYT report.
YouTube implemented revenue sharing model giving Shorts creators 45% of ad revenue pool
Jan 1, 2024: YouTube introduced a revenue sharing model for Shorts creators, giving them 45% of revenue from the Creator Pool while YouTube retains 55% (largely to cover music licensing costs). Ad revenue from between Shorts in the feed is pooled monthly, then distributed based on each creator's share of total eligible views. While payout rates are modest ($0.03-$0.10 per 1,000 views), the model provides an actual revenue share rather than a fixed fund, aligning platform and creator incentives. Additional monetization options include Super Thanks tipping and affiliate product tagging.
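The pooled model above reduces to simple arithmetic. The sketch below is illustrative only: the creator names, view counts, and pool size are hypothetical, and just the 45%/55% split and the views-proportional allocation come from the description (the real system also applies eligibility rules and music-licensing deductions not modeled here).

```python
# Sketch of a pooled, views-proportional revenue share with a 45% creator cut.
# All inputs are hypothetical; only the allocation logic mirrors the description.

def shorts_payouts(creator_views, pool_revenue, creator_share=0.45):
    """Split a monthly Creator Pool by each creator's share of eligible views,
    then apply the creator's 45% cut to their slice."""
    total_views = sum(creator_views.values())
    payouts = {}
    for name, views in creator_views.items():
        allocation = pool_revenue * views / total_views  # views-proportional slice
        payouts[name] = allocation * creator_share       # creator keeps 45% of it
    return payouts

# Hypothetical month: a $100,000 pool across three creators.
views = {"creator_a": 6_000_000, "creator_b": 3_000_000, "creator_c": 1_000_000}
payouts = shorts_payouts(views, 100_000)
# creator_a holds 60% of views, so receives ~ $27,000 (60% of pool x 45%).
```

Note that a creator's payout depends on everyone else's views that month: the same view count earns less in a month when total eligible views across the platform are higher.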
YouTube expanded Partner Program with lower monetization thresholds and Shorts revenue sharing for creators
Jun 13, 2023: In 2023, YouTube significantly expanded creator monetization opportunities. In February 2023, YouTube launched Shorts revenue sharing, giving creators 45% of allocated ad revenue. In June 2023, YouTube lowered Partner Program eligibility thresholds from 1,000 to 500 subscribers and from 4,000 to 3,000 watch hours, enabling more emerging creators to earn money. The lower tier initially provided access to fan funding features (Super Chat, Super Thanks, channel memberships), with ad revenue sharing unlocking at the existing thresholds.
YouTube reversed its election misinformation policy, allowing 2020 election denial content back on the platform
Jun 2, 2023: In June 2023, YouTube reversed its policy of removing content making false claims that the 2020 US presidential election was stolen. The platform had previously removed 'tens of thousands' of such videos since December 2020. YouTube stated the reversal was because while 'removing this content does curb some misinformation,' it 'could also have the unintended effect of curtailing political speech.' Critics argued this enabled continued spread of election denialism.
YouTube implemented COVID-19 and vaccine misinformation removal policies, removing accounts of prominent anti-vaxxers
May 20, 2020: In May 2020, YouTube published its COVID-19 Medical Misinformation Policy banning content contradicting WHO or local health authorities. In 2021, YouTube expanded the policy to cover all vaccines and removed accounts of prominent anti-vaccination activists including Joseph Mercola and Robert F. Kennedy Jr. Studies showed the policy significantly reduced the rate of misinformation videos on the platform compared to the pre-policy period.
YouTube's recommendation algorithm found to promote ideologically extreme content, particularly for right-leaning users
Jan 29, 2020: Multiple academic studies found YouTube's recommendation algorithm directed users toward increasingly extreme content. A systematic review found 14 of 23 studies implicated YouTube's recommender system in facilitating problematic content pathways. Research from UC Davis and PNAS showed the algorithm was more likely to recommend extremist and conspiracy content to right-leaning users. Over 70% of content watched on YouTube is recommended by its proprietary, opaque algorithm. While some studies produced contradictory findings, the lack of algorithmic transparency prevented definitive conclusions.
YouTube paid record $170M FTC/NY settlement for illegally collecting children's data without parental consent
Sep 4, 2019: The FTC and New York Attorney General fined Google/YouTube $170 million ($136M to the FTC, $34M to NY) for violating COPPA by collecting personal information from children under 13, including viewing history, without parental consent. YouTube had marketed its popularity with children to advertisers like Mattel and Hasbro while refusing to acknowledge portions of its platform were directed at kids. This was the largest COPPA penalty in history at the time.
YouTube failed to prevent 'Elsagate' child exploitation content reaching millions of children
Nov 11, 2017: In 2017, YouTube faced a major scandal known as 'Elsagate,' in which disturbing content disguised as children's videos—featuring violent, sexual, and abusive themes with popular children's characters—accumulated tens of millions of views. YouTube's content moderation systems failed to detect these videos, which were tagged to circumvent safety algorithms. YouTube Kids was also affected, with the platform later admitting its electronic moderation system was defunct. YouTube eventually removed over 150,000 videos, terminated 50+ channels, and disabled 625,000+ comment sections.
YouTube's Restricted Mode systematically filtered LGBTQ+ content while allowing violent and drug-related videos
Mar 17, 2017: In March 2017, LGBTQ+ creators discovered YouTube's Restricted Mode was systematically hiding their content—including wedding videos, coming-out stories, and queer-themed pop culture commentary—while allowing Mortal Kombat fatality compilations and marijuana growing tutorials. YouTube acknowledged the system 'sometimes make[s] mistakes' and claimed to fix it in April 2017 by unfiltering 12 million videos, but creators reported the problems persisted. By 2019, LGBTQ+ creators filed a class-action lawsuit alleging discriminatory censorship, with plaintiffs reporting 75% revenue drops.
YouTube's failure to prevent ads on extremist content triggered advertiser boycott and indiscriminate creator demonetization
Mar 17, 2017: In March 2017, the Times of London revealed that ads from major brands and the UK government were running alongside extremist and terrorist content on YouTube. Over 250 brands including AT&T, Walmart, PepsiCo, and Starbucks pulled their ads. YouTube's response—implementing broad demonetization categories and raising monetization thresholds—disproportionately harmed legitimate creators covering sensitive topics including news, women's issues, and LGBTQ+ content, while the underlying brand safety problem persisted. Creators reported 30-85% revenue drops.
YouTube launched YouTube Kids app with dedicated child safety features and parental controls
Feb 24, 2015: On February 24, 2015, YouTube launched the YouTube Kids app for Android and iOS, designed specifically for children ages 4-12. The app featured algorithmic and human-curated content filtering for child-friendliness; parental controls including screen time limits, search restriction, and channel blocking; plus age-based content categories. While initially criticized for some advertising concerns and content filtering gaps, it represented a proactive investment in child-safe platform design and COPPA compliance.
YouTube launched automatic captions using speech recognition, expanding video accessibility for deaf and hard-of-hearing users
Nov 19, 2009: In November 2009, YouTube introduced automatic captions using speech recognition technology, initially for English-language content. The feature was led by Ken Harrenstien, a deaf Google engineer. While auto-captions had significant accuracy limitations (60-70% accuracy initially, improving over time), the feature represented a major step toward making the platform's massive video library accessible to deaf and hard-of-hearing users. YouTube subsequently expanded auto-captions to dozens of languages.
Support & Opposition
Actions from other entities targeting YouTube
Criticized by 80+ fact-checking organizations calling YouTube 'major conduit of disinformation'
In January 2022, over 80 fact-checking organizations sent an open letter to then-CEO Susan Wojcicki calling YouTube 'one of the major conduits of online disinformation and misinformation worldwide' and stating that YouTube's measures to combat misinformation were 'proving insufficient.'