YouTube Is Taking Measures to Protect Kids From Viewing Disturbing Children’s Content

Whether or not those measures actually do anything to protect kids — or get to the heart of the problem — is a question worth exploring.

Kids Fun TV/YouTube

YouTube Kids is in hot water over the wide array of disturbing videos masquerading as children’s content on the site. Recently, the company announced it has been working on a new content policy that allows users to flag videos that use popular keywords to slip through the site’s filters. While this would make it harder for inappropriate content to get onto YouTube Kids, the real issue is the algorithm.

The new policy comes in light of several reports, from sources such as The New York Times and Medium, that have exposed the troubling and all-too-common trend of disturbing videos circulating on YouTube Kids. These videos feature beloved children’s characters and seem like clickable, kid-friendly content, but they are in fact wildly inappropriate and dangerous. They show things like Peppa Pig drinking bleach or The Joker burying Elsa from Frozen alive. According to The New York Times, a mother in Indiana noticed her 3-year-old son watching a video featuring characters from PAW Patrol in which several of the characters die; one walks off a roof.

So, who is creating these videos, and why? It’s a hard question. In his thorough Medium post, “Something Is Wrong On The Internet,” writer James Bridle dove into the disturbing trend and uncovered millions and millions of videos marketed toward children and generated for profit. This isn’t a shock: The more views a channel gets, the more money the people behind the channel receive. It’s simple math. But while other reports, such as the one published by The New York Times, treat violent content as a moderation problem, Bridle sees it as a fundamental flaw in the way the Internet and its algorithms operate.

To Bridle, the way the system is structured is the problem, and the way the system is structured is why these deranged videos, which generate enormous profits, find an audience. Parents (or any YouTube user) shouldn’t be responsible for flagging the content themselves. They’re not the problem. The algorithm is.