Summary
- YouTube must improve ad moderation due to increasing user discomfort with explicit and fake ads.
- A child was exposed to a sexually explicit ad on YouTube, causing an uncomfortable situation for everyone involved.
- Google’s AI moderation of ads is failing to keep users safe, leading to a rise in adblocker use and user dissatisfaction.
It’s pretty safe to say that few users are happy with YouTube’s ads; change is needed. From fake ads for mobile games that don’t offer the gameplay shown to the many sexually explicit ads that slip through, the experience is uncomfortable for everyone. Google even claims to prohibit most sexual content in the ads it serves, and it explicitly notes that such content can’t target minors. And yet, a recent post on Reddit (reported by Android Authority) showcases precisely why Google’s insistence on moderating its ads with AI often fails to keep children safe from sexual content, something we see time and time again.
According to the Reddit post, a 7-year-old child was streaming content on YouTube when they were served a sexually explicit ad starring a porn star who happens to resemble Loserfruit, the YouTube streamer the child was watching. As you can imagine, this made for an uncomfortable situation for the relative supervising the child.
Who is to blame when smut shows up in ad form?
Of course, it’s easy to point at Google and wonder what went wrong, but keep in mind that the child was watching YouTube on their mother’s phone (as revealed in a Reddit comment from the OP); in other words, an adult’s phone the child doesn’t regularly use. There’s also no mention of whether the child was signed into their own YouTube account, though YouTube requires users to be thirteen or older to make one; not that showing a thirteen-year-old sexually explicit content is any better than showing it to a seven-year-old. Still, if you want a child-safe experience on YouTube, the YouTube Kids app exists, and it has seen some improvements in the last few months to bring it more in line with the regular YouTube app.
Ultimately, Google has an issue with showing inappropriate ads, from explicit sexual content to fake ads for video games that exist in name only. As YouTube’s bad ads have grown worse, they have also grown more frequent, fueling an unsurprising meteoric rise in adblocker use across the userbase. Google has responded by repeatedly going out of its way to thwart adblockers and third-party apps, using underhanded tricks like server-side ad injection to force ads on us all, when it’s those very ads, filled with lies, sex, and sometimes even malware, that push users toward free blocking solutions in the first place.
Clearly, Google could do better, but better moderation costs more money, which is why Google uses AI to moderate in the first place: it’s cheaper than humans. It’s also proving to be worse than humans, at least if the goal is keeping users safe rather than protecting the bottom line.
Google needs to do better before everyone learns to despise the company
Once you understand that Google ties unique advertising IDs to our phones, and why it may be a bad idea to hand a child an adult’s phone for YouTube, it’s easy to see how the Reddit story is somewhat blown out of proportion; the child wasn’t old enough to be using YouTube proper in the first place. But at the end of the day, just about everyone dislikes YouTube’s ads for a reason, and those reasons keep growing as kids and adults alike accidentally see sexual content thanks to Google’s sloppy AI moderation of its ad network. The company should address this sooner rather than later, before stories like this multiply and more users pin their rage on YouTube and its bad ads, no matter who is technically at fault.