In July 2025, TikTok expanded its Family Pairing feature, adding new parental controls including alerts when teens upload content visible to others, expanded dashboard visibility into teen activity, and enhanced screen time management tools. The company also updated its Community Guidelines in August 2025 with clearer language around safety, new policies addressing misinformation, and enhanced protections for younger users. These updates came alongside the company's broader election integrity efforts: the number of fact-checked videos more than doubled to 13,000 in the first half of 2025.
TikTok significantly increased lobbying spending in 2025 as it faced a potential US ban. The company spent $6.65M in H1 2025, engaging multiple lobbying firms to influence Congress and the administration on legislation requiring ByteDance divestiture.
negligent $601.0M
Ireland's Data Protection Commission fined TikTok €530 million (€485M for data transfer violations, €45M for transparency failures) after finding TikTok transferred EEA user data to China without adequate safeguards. TikTok also admitted it had provided inaccurate information to the inquiry, revealing that EU data had been stored on Chinese servers, contradicting the evidence it had previously given the regulator. It was the third-largest GDPR fine on record and the first EU data transfer fine involving China.
compelled
In January 2025, the US Supreme Court unanimously upheld the Protecting Americans from Foreign Adversary Controlled Applications Act, which required ByteDance to divest TikTok by January 19, 2025, or face a ban. The court found the law sufficiently tailored to address national security concerns over data collection practices by a foreign adversary affecting 170 million US users. TikTok briefly went dark for US users on January 18-19 before Trump issued executive orders delaying enforcement. A consortium including Oracle, Silver Lake, and MGX eventually acquired 80% of TikTok's US operations in a deal that closed in January 2026.
$2.0B
In 2024, TikTok spent over $2 billion on trust and safety operations, removing more than 500 million videos for policy violations. Over 85% of violating content was identified and removed by automated systems, with 99% removed before any user reported it and over 90% removed before gaining any views. The company committed to investing another $2+ billion in trust and safety for the following year. TikTok also became the first platform to implement C2PA Content Credentials for identifying AI-generated content.
negligent
In October 2024, New York AG Letitia James and California AG Rob Bonta co-led a coalition of 14 attorneys general in filing lawsuits against TikTok for misleading the public about the platform's safety for young users. Internal documents revealed that TikTok's own 60-minute time limit tool reduced usage by only 1.5 minutes per day (from 108.5 to 107 minutes) and that the company measured the tool's success by media coverage rather than actual harm reduction. The lawsuits alleged TikTok violated state consumer protection laws and that dangerous 'challenges' on the platform led to injuries, hospitalizations, and deaths.
Research by Rutgers University's Network Contagion Research Institute (2023-2024) found that TikTok's algorithm systematically suppresses content critical of China's human rights record. Searching for 'Uyghur' on TikTok returned only 2.5% anti-CCP content, compared with 50% on Instagram and 54% on YouTube. For Tibet searches, 61-93% of results were pro-China or irrelevant. Leaked internal moderation guidelines (2019-2020) had explicitly directed moderators to censor content about Tiananmen Square, Tibetan independence, and Xinjiang. CEO Shou Zi Chew denied censorship during 2023 Congressional testimony, contradicting a UK parliamentary admission by TikTok executive Elizabeth Kanter that such policies had existed.
TikTok's original Creator Fund, launched in 2020 with $200M and projected to grow to $1B over three years, was widely criticized for extremely low payouts of $0.02-$0.04 per 1,000 views ($20-$40 per million views). Creator Hank Green reported earning 2.5 cents per 1,000 views, and creator SuperSaf earned roughly $137 over 10 months for 25 million views. The fund was shut down on December 16, 2023, and replaced by the Creator Rewards Program, which reportedly pays higher rates of $0.40-$1.00 per 1,000 views, though creators have since reported sharp drops in income under the new program.
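To make the payout comparison concrete, here is a minimal sketch in Python, purely illustrative and based only on the figures cited above (it is not TikTok's payout formula), converting reported total earnings and total views into an effective rate per 1,000 views:

    # Illustrative calculation using the publicly reported figures above;
    # not TikTok's actual payout logic.
    def rate_per_thousand_views(total_earnings_usd: float, total_views: int) -> float:
        """Effective payout per 1,000 views."""
        return total_earnings_usd / (total_views / 1_000)

    supersaf_rate = rate_per_thousand_views(137, 25_000_000)  # ~$0.0055 per 1,000 views
    hank_green_rate = 0.025                                   # reported directly
    nominal_low, nominal_high = 0.02, 0.04                    # fund's advertised range

    print(f"SuperSaf effective rate: ${supersaf_rate:.4f} per 1,000 views")
    print(f"Nominal fund range: ${nominal_low:.2f}-${nominal_high:.2f} per 1,000 views")

On those reported figures, SuperSaf's effective rate works out to roughly $5.50 per million views, well below even the low end of the fund's nominal $20-$40 per million range.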
negligent
Between 2021 and 2022, multiple children died after attempting the 'blackout challenge,' which circulated on TikTok and involved choking oneself until passing out. In one prominent case, a 10-year-old girl died after TikTok's algorithm recommended the challenge video to her. In August 2024, the US Third Circuit Court of Appeals ruled that TikTok's algorithmic recommendation of the blackout challenge was not protected by Section 230, holding that the platform's recommendation algorithm constitutes its own expressive activity, not merely the hosting of user content.
In 2019, The Guardian reported that TikTok's moderation practices resulted in the removal of content positive toward LGBTQ+ people in countries including Turkey, such as videos of same-sex couples holding hands. In December 2019, TikTok admitted it had deliberately reduced the viral potential of videos made by LGBTQ+ users, claiming the goal was to 'reduce bullying' in comments. The Australian Strategic Policy Institute also found that content from LGBTQ+ creators was systematically suppressed. While TikTok later updated its policies, the practice demonstrated algorithmic discrimination against marginalized communities under the guise of user protection.