Regulators in Ireland just handed Meta a jaw-dropping fine over Instagram’s mishandling of teenage users’ data and privacy rights. The fine of 405 million Euros (roughly equivalent to $402 million USD) levied by Ireland’s Data Protection Commission comes after two years of deliberation on the privacy of teen users who were allowed to open business accounts on the popular photo-sharing platform.
This is the third fine Ireland’s DPC has levied against Meta in four years, and certainly not the first time the social media giant has been at the center of a safety and privacy firestorm.
Business accounts are popular with some Instagram users because they provide more in-depth traffic analytics, but they also make the user’s contact information public. Allegedly, many teen users also had profiles that were set to public by default upon creation.
A Meta representative said in a statement to Politico that Instagram’s privacy settings were updated a year ago and now default anyone under 18 to a private profile.
“This inquiry focused on old settings that we updated over a year ago, and we’ve since released many new features to help keep teens safe and their information private,” the statement said. “Anyone under 18 automatically has their account set to private when they join Instagram, so only people they know can see what they post, and adults can’t message teens who don’t follow them. We engaged fully with the DPC throughout their inquiry, and we’re carefully reviewing their final decision.”
In October of last year, Meta found itself at the center of an international controversy after Frances Haugen, a whistleblower and former employee, claimed in a 60 Minutes interview that Facebook capitalizes on the divisive nature of its content and is dangerous for marginalized and vulnerable communities. Haugen then went on to testify before Congress. Her damning testimony called the company a “profit optimizing machine” and contained assertions that “Facebook’s products harm children, stoke division, weaken our democracy and much more.” She added that the company’s products “[generate] self-harm and self-hate — especially for vulnerable groups, like teenage girls.”
Earlier that year, Meta shelved plans for an Instagram Kids platform that would cater to the under-13 set after an outcry from parents and privacy advocates, as well as a strongly worded letter from the attorneys general of dozens of states. Instagram also rolled out parental controls earlier this year.
Parents are right to be concerned about how their children's privacy is protected — or not — by social media companies. Checking and maintaining privacy settings can seem daunting, especially when there are so many platforms that children may use. But there are simple steps parents can take to help protect their kids online. Here are just a few.
- Have candid, open, and frequent discussions about the dangers of tech with your kids.
- Set clear rules about what’s expected of kids online — and the consequences of breaking those rules.
- Establish boundaries for what children are allowed — and specifically not allowed — to share online. Contact information, physical location, and even identifying characteristics should be protected.
- Set parental controls on their devices — and explore the options. Does their device come with built-in parental controls like Google Family Link or Apple’s Screen Time? Many of these let parents monitor usage and block unsafe content or websites.
- Be involved in your children’s online experience and let them show you what they’re into online. Ask questions and let them share. Discuss any inappropriate content that may pop up and explain your position.
The internet can be a scary place, but Fatherly’s Guide to Internet Safety can help answer all of your questions. It’s important to remember that parental controls are not a substitute for parenting — and the best way to keep your kid safe online is to go online with them.