Summary

  • YouTube recently doubled its threshold for removing misinformation from 25% to 50% of a video, for content deemed popular enough.
  • Despite the new guidelines, 22% more content was pulled in Q1 2025 compared to a year earlier.
  • YouTube appears to be struggling to keep pace with moderation as upload volume grows, leading to more removals.

Content moderation isn’t easy. Countless potential variables exist between the “F” and the “K”, forcing some companies to outsource video screening to moderation farms that wreak havoc on underpaid employees’ mental health. Add to that the ever-increasing volume of uploads to massive platforms like YouTube, and you’ve got a clear uphill battle against offensive, violent, and misleading content.

That could be why, in December, YouTube issued new training materials, which The New York Times just got wind of. The new guidelines doubled the threshold of misinformation required for removal, from 25% of a video to 50%. The internal documents explicitly listed examples of content, such as anti-vaccine misinformation, that was previously disallowed but now gets a pass. Despite the increased tolerance, the platform still had to pull 22% more content in the first quarter of 2025 than it did a year earlier.

An onslaught of new content in every direction

A flood that’s impossible to keep up with

YouTube is far from the only platform to relax moderation policies in recent months. Meta axed its fact-checking program at the beginning of the year, moving exclusively to crowdsourced feedback similar to what Twitter/X has relied on for years. But while those social media outfits openly announced their changes, YouTube’s reduced restrictions flew under the radar.

According to the training materials, YouTube expanded an exemption that allows misinformation to remain on the platform, based on what it calls “public interest”. Rather than “the welfare or well-being of the general public,” as some might interpret the phrase, YouTube simply means “videos the public is interested in.” In other words, a public interest video is anything YouTube decides gets enough clicks.

Previously, a public interest video made up of 25% misinformation or less was allowed to stay up; anything higher was flagged for potential removal. The updated guidance doubled that threshold to 50%. One example in the materials cites a video titled “RFK Jr. Delivers SLEDGEHAMMER Blows to Gene-Altering JABS,” which promoted the objectively false claim that Covid vaccines alter human DNA. Under the former guidelines, the video would have been headed for the bin, but under the updated exemption it gets a pass because it’s popular enough and not quite saturated with lies (although, for unknown reasons, the referenced video is no longer available on YouTube).
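To make the rule concrete, here is a minimal sketch of the exemption as described in the training materials. The function and variable names are hypothetical illustrations, not YouTube’s actual tooling, and the 45% figure is an invented example value.

```python
# Hypothetical sketch of the "public interest" exemption as described in the
# leaked training materials; names and example numbers are illustrative only.

OLD_THRESHOLD = 0.25  # previous guidance: flag above 25% misinformation
NEW_THRESHOLD = 0.50  # updated guidance: threshold doubled to 50%

def flag_public_interest_video(misinfo_fraction: float, threshold: float) -> bool:
    """Flag a popular ("public interest") video for potential removal when its
    estimated share of misinformation exceeds the allowed threshold."""
    return misinfo_fraction > threshold

# Invented example: a video estimated to be 45% misinformation.
print(flag_public_interest_video(0.45, OLD_THRESHOLD))  # True  -> flagged under the old rule
print(flag_public_interest_video(0.45, NEW_THRESHOLD))  # False -> stays up under the new rule
```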

More misinformation, but also more removals

With such a drastic change in moderation policy, one might expect a sharp downturn in removals. As YouTube spokesperson Nicole Bell explained, though, the platform’s more than 192,000 removals from January to March 2025 indicate that’s not the case. Given the sheer volume of uploads to the absurdly popular site, even doubling the allowable share of misinformation couldn’t fend off a 22% increase in removals compared to early 2024.

These changes didn’t exactly arrive out of the blue. YouTube constantly updates moderation guidelines to account for changing sensibilities and allow for self-expression, while striving to keep the service relatively safe for advertisers. After all, many companies don’t want to pay a platform to advertise their product next to hate speech or dangerous lies. YouTube appears to espouse the idea that if something’s popular enough, advertisers won’t mind it being associated with their brand — even if that popularity comes via fearmongering and conspiracy theories.

The leaked exemption expansion, followed nonetheless by an increase in misinformation takedowns, highlights the difficulty of moderating a global multimedia platform. It also underscores the incredible number of creators worldwide vying for attention and clout, and the lengths they’re willing to go to for them.