Facebook’s ban on anti-vaxx misinformation ads just expanded in scope, targeting not only ads but individual posts and content.
On Tuesday, October 14, Facebook announced that it would finally begin banning advertisements that encourage people to stop getting vaccines. That announcement was one of a series from the company, including a recent ban on content that denies the Holocaust happened, a ban on QAnon pages and groups that led to a massive purge last week, and a ban on political ads after November 3rd, when the presidential election takes place.
Now, as of early December, Facebook has decided to expand the scope of its misinformation ban to cover the COVID-19 vaccine ahead of the lifesaving vaccination’s rollout. The company will now monitor for misinformation on both Facebook and Instagram, which it owns.
The platform will outright ban any posts that make false claims about the safety, effectiveness, ingredients, or side effects of the COVID-19 vaccines, and will remove content claiming that people will be “microchipped” if they get the vaccine. It appears that the move will go after individual posts, not just ads like the ones about “stopping mandatory vaccinations” that populated the site in September, making it much broader in scope.
In September, Facebook also moved against groups that give other Facebook users health advice, as these groups can become dangerous vectors for disinformation and can lead people to believe things that aren’t true about the medical community or medicine in general.
While this is a welcome step in the right direction, it’s worth asking whether it, like the company’s other recent moves against political ads, QAnon, Holocaust denial, and other major and harmful misinformation campaigns, is a case of too little, too late. After all, as recently as February of this year, one Facebook group, “Stop Mandatory Vaccination,” may have been responsible for the death of a 4-year-old boy from Colorado who died from the flu; his mother had not vaccinated him or any of his siblings against the potentially deadly virus.
That group had 139,000 members and was one of the largest misinformation groups on the platform. It’s clear not only that the misinformation campaign against vaccines has been widespread for a very long time, but that the damage to hundreds of thousands of people may already have been done. And that’s just vaccines: as recently as 2018, misinformation about the Rohingya people allegedly proliferated on Facebook during the genocide against them in Myanmar. Still, it’s good that the company is taking action, even if irreparable harm has already been done.