Facebook Will Finally Ban Anti-Vaxx Ads, Posts

But is their ban a case of too little, too late?


Facebook’s ban on anti-vaxx misinformation ads just grew in scope: it now targets not only ads but also individual posts and content in groups and on pages, and it covers all vaccine misinformation, not just misinformation about COVID-19.

On Monday, February 8th, Facebook announced they would remove all posts, pages, and group content that spread anti-vaxx misinformation of any sort. That goes further than the company’s previous policy of simply “downranking” misinformation to make it harder to find, an approach that also made misinformation groups more insular. It’s the latest in a series of moves the company has made to limit misinformation on the platform.

After all, in early December, Facebook expanded the scope of its misinformation ban around the COVID-19 vaccine ahead of the rollout of the lifesaving vaccination. The company now monitors for that misinformation on both Facebook and Instagram, which it owns.

And in October, Facebook announced that they would finally begin banning advertisements that discourage people from getting vaccines. That announcement was part of a series of moves from the company, including a ban on content that denies the Holocaust happened, a ban on QAnon pages and groups that led to a massive purge, and a ban on political ads after the November 3rd presidential election.

The platform will outright ban any posts that make false claims about the safety, effectiveness, ingredients, or side effects of the COVID-19 vaccines, and will remove content claiming that people will be “microchipped” if they get the vaccine. The move goes after individual posts, not just ads like the “stop mandatory vaccinations” ads that populated the site in September, making it much broader in scope.

In September, Facebook also stopped recommending groups that give other users health advice, as these groups can become dangerous vectors for disinformation and can lead people to believe things that aren’t true about medicine or the medical community.

While this is a welcome step in the right direction, it’s worth asking whether this ban, like the company’s other recent steps against political ads, QAnon, Holocaust denial, and other major and harmful misinformation campaigns, is a case of too little, too late. After all, in February of 2020, one Facebook group, “Stop Mandatory Vaccination,” may have been responsible for the death of a 4-year-old Colorado boy who died from the flu; his mother, a member of the group, had not vaccinated him or any of his siblings against the potentially deadly virus.

That group had 139,000 members and was one of the largest misinformation groups on the platform. Under the new rules, groups like it would be prohibited and removed outright. It’s clear not only that the misinformation campaign against vaccines has been prevalent for a very long time, but also that the damage to hundreds of thousands of people may already be done. And that’s just vaccines: misinformation about the Rohingya people allegedly proliferated on Facebook during the genocide against them in Myanmar as recently as 2018. Still, it’s good that the company is taking action, even if irreparable harm has already been done.