As the fight against anti-vaxxers rages on in the U.S., even social media platforms are getting involved. YouTube announced Friday afternoon that it is removing ads from all videos containing anti-vaccine content.
“We have strict policies that govern what videos we allow ads to appear on, and videos that promote anti-vaccination content are a violation of those policies,” a YouTube spokesperson told BuzzFeed News. “We enforce these policies vigorously, and if we find a video that violates them, we immediately take action and remove ads.”
The decision, which will effectively demonetize the associated accounts (including popular anti-vax channels like VAXXED TV and Larry Cook333), came after several advertisers discovered that their ads were running on videos containing what YouTube considers “dangerous and harmful” content.
One of those advertisers, Grammarly, told BuzzFeed News, “Upon learning of this, we immediately contacted YouTube to pull our ads from appearing not only on this channel but also to ensure related content that promulgates conspiracy theories is completely excluded.”
YouTube also stated that it will be making changes to its algorithm to prevent anti-vaccination videos from appearing in its “Up Next” feature, which recommends related videos to viewers.
The platform’s announcement is the latest in a push to prevent the spread of misinformation surrounding vaccination. It comes just days after Pinterest revealed that it now blocks all anti-vaccine search results and bans pins from related websites.
And medical experts and political leaders alike are urging other social networks to follow suit. Representative Adam Schiff recently sent a letter to Facebook and Google demanding that they crack down on anti-vaccine content.
Schiff expressed concern that the misinformation being shared about vaccinations would “discourage parents from vaccinating their children, a direct threat to public health, and reversing progress made in tackling vaccine-preventable diseases.”
While Facebook has yet to announce any official changes, USA Today reported that the platform said that it is taking “steps to reduce the distribution of health-related misinformation on Facebook, but we know that we have more to do.”