The alleged strategy was prompted by declining interest in the site among young users, which was pushing the average age of its user base higher and higher.
“Aging up is a real issue,” one Facebook researcher apparently wrote in an internal memo, and the statistics bear this out. Teen usage of the site has fallen by 13 percent in two years, and it’s projected to drop another 45 percent (!) in the next two years, according to Ars Technica.
Facebook reportedly needed a robust cohort of young people to buy into its platform for at least two reasons. One, young users are more valuable in the long term because they have more potential years of using Facebook (and being served ads on Facebook) ahead of them. Two, they are the primary source of new users, as older people are more likely to have already made a decision about whether to be on the platform.
So, the company apparently decided to launch a new initiative focused on recruiting kids to its platform. An internal post seeking applicants reveals some of its plans for that team.
“Our company is making a major investment in youth and has spun up a cross-company virtual team to make safer, more private, experiences for youth that improve their and their household’s well-being,” the April 9 post reads. “For many of our products, we historically haven’t designed for under 13 (with the exception of Messenger Kids) and the experiences built for those over 13 didn’t recognize distinctive maturity levels across the age spectrum.”
It’s not mentioned in the post, but a big reason Facebook historically shied away from kids under 13 is the Children’s Online Privacy Protection Act (COPPA).
COPPA sets strict rules for companies that operate websites (or apps or connected devices) directed to or known to be used by children under the age of 13. Among other provisions, it requires such operators to obtain verifiable parental consent before collecting personal information from children online, prohibits them from sharing that information with third parties in most cases, and requires them to keep that information secure.
Since companies like Facebook depend on data about their users to make money, these limitations make children less profitable customers in the short term. And COPPA violations can be expensive—just ask YouTube and TikTok.
It appears that the “aging up” problem prompted a shift in focus to younger users despite a previous safety issue with Messenger Kids.
That product came under scrutiny two years ago when Facebook patched a bug that allowed children to join group chats with users their parents hadn’t approved. The bug was active for a year before it was discovered, and it took Facebook another month to inform parents, according to reporting in The Verge and confirmed by the company in a letter to Sen. Ed Markey.
Markey and Sen. Richard Blumenthal condemned Facebook’s response, particularly the lack of a comprehensive review of Messenger Kids after such a serious incident. The company has maintained that its products comply with COPPA.
“Messenger Kids takes children’s privacy and security seriously, and we are committed to ensuring any technical errors are investigated and addressed quickly,” Facebook public policy VP Kevin Martin wrote in the same letter.
Given this serious breakdown, it’s concerning that the internal post clearly shows that Facebook is interested in moving from what it calls a “simple” COPPA line—depicted in the post as a stop sign blocking kids under 13 from using non-Messenger Kids products—to a more gradual approach in which “[F]eatures, defaults, settings, and education [are] tailored to age and stage.”