Roblox Will Roll Out New Parental Controls After Concern About Sexual Content
After parents sounded the alarm on sexual content proliferating on the platform, Roblox has announced plans to make the platform safer for kids.
A recent Wall Street Journal article about the gaming platform Roblox details a scourge of offensive content on the platform, as well as the steps the company plans to take to protect its largely young user base. More robust parental controls and content ratings are among the measures the company says it will roll out to improve safety.
Roblox’s particular brand of world- and character-building has always been popular, but the platform was uniquely set to thrive in the pandemic, as it allows kids to play together while staying apart. And thrive it did, as the platform’s sales grew by 82 percent over 2020. It now boasts more than 30 million users, half of whom are under the age of 13. The company went public a few months ago and was valued at $30 billion in January.
However, as the Internet is the Internet, an increase in users often brings an increase in inappropriate content. The Wall Street Journal article shows that Roblox has been struggling to get a handle on sexual content on the platform. That may give some parents pause about how safe the platform, which has always described itself as “zero tolerance” toward profanity and offensive content, really is.
Here’s what to know about Roblox, its sexual content problem, and the aggressive new steps the company is taking to make the platform safer.
What Is Roblox?
Roblox is an open-world platform designed to appeal to kids. It is not a game so much as a place where users can build games with simple programming tools or play games made by other users. Much of the enjoyment comes from users creating their own worlds, and users who are 13 or older can make money off the games they make themselves.
Millions of games exist on the platform, from simulations to first-person shooters, making it a creative, open-world outlet that also makes money. Robux is the platform’s currency; it can be earned in-game or bought with real dollars, much to the annoyance of parents wary of microtransactions.
The game is massively popular, as any parent who has been saddled with an unexpected Robux bill knows. About 75 percent of kids between the ages of nine and 12 in the United States have used the platform.
How Does Roblox Currently Moderate Its Content?
Roblox has a “zero tolerance” policy for profanity and offensive content. The company points to a stringent safety system: text-filtering technology, third-party machine-learning tools, filters that are updated daily, and 2,300 full-time employees working to keep the platform clean. That said, 32.6 million users log on every day, making successful and meaningful content moderation no small task, per The Wall Street Journal.
Since the object of the platform is open-world creation and exploration, it’s not surprising that offensive content may slip through the cracks. No one can guarantee a 100% safe platform for children on the internet, not even the most dedicated team.
In 2019, NBC found that neo-Nazis had infiltrated the platform, and Fast Company found a game full of naked Roblox avatars simulating sex acts. This is exactly the kind of content Roblox’s new parental control system will try to keep kids from accessing.
What’s Happening on the Platform That’s Unsafe for Kids?
The new Wall Street Journal article notes that parents have been sounding the alarm on parental controls they see as anything but foolproof. Age-inappropriate content, including simulated sex, racist imagery, and actual photos of naked people, slips through the filters.
Parents have also voiced frustration that the parental controls are hard to find, and that even when they find and use the controls to limit what their child sees, only 1,000 of the millions of games on the platform appear on the approved list.
One dad quoted in the WSJ article said that he signed up for the gaming platform after reading the company’s guide for parents and restricted the games his 8-year-old son could access. Still, he says he found that his son had accidentally entered a Roblox game where he saw a photo of a woman in a thong, the word “sex” written in the sky three times, and a depiction of sodomy. He said racist music was also playing in the background.
What Changes Is Roblox Making to its Content Moderation?
Roblox recently announced that it will roll out a new rating system to give parents more control over what their children see on the platform. But details are, so far, thin.
The rating system will rank content by age-appropriateness, so players will know whom a game is suitable for before joining. It may also block kids who fall under a game’s age rating from joining at all.
A redesign will also make parental controls easier to use and access. But without a timeline, it’s hard to know just how long parents will have to worry.
In the meantime, parents can set up a child’s account so that inappropriate content, games, and language are filtered out. Via Roblox’s account settings, parents can also restrict the account to games that Roblox deems appropriate for players under 13, and can set up a password that only they know in case their kid tries to change the settings.
And, as always, it’s important to remember that no online platform is 100 percent safe for a kid to play on unsupervised. The best parental control is you playing by their side.