News

YouTube Bans Kids From Live-Streaming Unless There’s an Adult Present

The move is long overdue for a platform that has been plagued with child-safety concerns for years.


YouTube announced on Monday that it would no longer allow “younger minors” to live-stream on the platform without adult supervision. The move came in the wake of a New York Times report, published earlier that day, which showed that YouTube’s algorithm recommended videos of children in suggestive clothing to users who had been watching sexual content.

The policy follows a February decision by YouTube to disable comments on videos featuring children, made after a report revealed that pedophiles were leaving timestamped comments marking moments when children appeared in suggestive positions or with exposed skin. Other comments linked to unsearchable YouTube videos containing pedophilic content or to WhatsApp groups. Many of the original videos, though innocent when posted, were exploited by predators. Because such videos were often monetized, advertisers pulled their spending from the platform, and YouTube tightened its monetization standards, requiring creators to reach a certain number of subscribers before earning money from their content.

YouTube has had problems with child safety for years. In 2017, the “Elsagate” scandal erupted when creators flooded the platform with videos featuring characters from family-friendly entertainment (Elsa, Peppa Pig, Caillou) engaged in disturbing, violent, and sexual behavior. The videos remained on YouTube and YouTube Kids for quite some time before the problem came to the foreground, and many of them were monetized.

A damning report published in April revealed that YouTube executives prioritized reaching a billion hours of daily watch time in order to keep people on the platform longer and make YouTube more money. That meant inflammatory or even unsafe content was allowed to remain in the name of increasing engagement, and therefore advertising dollars.

In the announcement, YouTube noted that it has removed nearly 1 million videos for violating its child-safety policies, most of them deleted before they reached more than 10 views. The new restriction on YouTube Live will reportedly be enforced with expanded artificial-intelligence detection.
