
YouTube Execs Knew About Content Problems Long Before Acknowledging Them

The streaming platform allegedly focused on profit over safety.

According to a report published by Bloomberg on Tuesday, YouTube executives had a full understanding of the toxic, inaccurate, and abusive content on the platform, including toxic content aimed at children, but took no action to address it. Speaking anonymously, former YouTube employees told Bloomberg that internal discussions at the company were a far cry from the message YouTube projected to the public.

Though YouTube brands itself as a sort of “library” (its CEO, Susan Wojcicki, described it that way at South by Southwest in Austin), the report describes ongoing internal struggles over how to monitor content, monetize it, and tune the recommendation algorithm. The report claims that when one employee suggested flagging videos that were troubling, though not illegal, the idea was met with inaction by YouTube executives, who apparently preferred to focus on engagement and profit rather than on the troubling nature of some of the content.

One former employee, Micah Schaffer, left the company before 2010. Around the time he left, YouTube had noticed a community of pro-anorexia videos, and staff responded by deleting the videos from the platform, pulling them from recommendations, and putting age restrictions on them. That seems a far cry from the response to such issues today. In 2012, a new message from Google, YouTube’s owner, emerged: more people needed to be on the site for longer so YouTube could run more ads and make more money. The company set a goal of a billion hours of viewing a day.

After the 2016 election, instead of focusing on the fake news that had spread across the platform, YouTube execs considered a new payment model: paying creators based on how many people watched their videos and for how long, even if no ads were running on the videos. That would have meant that channels pushing fringe conspiracy theories or racist ideology, which brands like AT&T won’t advertise against, could still make money, with no incentive to be accurate. The plan was ultimately rejected.

In 2017, the Elsagate scandal emerged. Even on the kid-friendly version of YouTube, content creators had uploaded unofficial cartoon videos showing disturbing imagery of, say, a pregnant Elsa or Peppa Pig being beheaded. Around the same time, “family” YouTube channels like Toy Freaks were taken down over allegations of child abuse and because the content itself was disturbing. Ads had run against some of these videos.

In a related scandal, details emerged of a pedophilic commenting ring, in which pedophiles commented on videos uploaded by and for kids, marking time stamps of moments when the children were in physically compromising positions. Those videos had high engagement, and the recommendation algorithm, built around what other, similar users were watching and commenting on, led otherwise ordinary YouTube viewers down a nefarious path toward videos of children sucking lollipops, showing off their nighttime routines, or modeling underwear. YouTube promised to fix the issue, even though many of those videos of children had been monetized because of their high engagement.

The problems persisted: videos calling the Parkland survivors fake, paid actors emerged and went viral. Internally, sources alleged, YouTube execs reacted to that scandal by rejecting a proposal to limit recommended news to trusted sources. Employees were also told not to go searching for viral lies on the platform, because YouTube could then be liable for knowing about the content and not doing enough to handle the problem.

One anonymous employee told Bloomberg that Wojcicki would “never put her fingers on the scale. Her view was, ‘My job is to run the company, not deal with this,’” suggesting that making a profit was more important than making sure that users of the platform were safe and accurately informed.

We’ve known for a long time that YouTube, like most social media websites, values time spent and engagement on the site over actually creating a meaningful or, well, truthful community. In 2019, the pedophile problem arose again, despite the company hiring 10,000 human content moderators in 2018. But while YouTube made every outward show of being proactive about de-platforming fake news and fixing its algorithm to keep kids safe, it dragged its feet internally, valuing profit over its community.