
Facebook’s Leaked Child Abuse Guidelines Anger Parents and Advocates

Back to the drawing board.


Facebook is here to stay and, unfortunately, so are the challenges that come with moderating the horrific images of abuse that occasionally surface online. Over the weekend, the Guardian published a selection of Facebook’s internal moderation manuals, which shed light on how the company moderates images that depict non-sexual child abuse.

“We allow ‘evidence’ of child abuse to be shared on the site to allow for the child to be identified and rescued,” one of the slides explains. “But we add protections to shield the audience.”

The manuals explain that Facebook moderators remove images only after they are reported, and intentionally leave some graphic content live so that community members can mine images for crucial clues that might help rescue an abused child. It’s a noble thought, but advocates say it would be better for the company to simply remove these images and pass them along to the professionals. “In most cases the reality of sharing vile and violent images of violence and child abuse simply perpetuates the humiliation and abuse of a child,” Yvette Cooper, British politician and former chair of the home affairs select committee, told the Guardian. “Images should be given to the police and removed instead.”

Slide from Facebook’s moderation manual (via the Guardian)

Facebook defines child abuse as any physical action with intent to harm, or any action defined as abuse by the authorities. Per the manual, moderators need only remove images of non-sexual child abuse when they are shared with sadism and celebration. Otherwise, such images are simply labeled “disturbing”. Even live streams depicting self-harm need not be deleted or flagged, according to the manual, because that would constitute censorship.

Claire Lilley, the head of child safety online at the National Society for the Prevention of Cruelty to Children (NSPCC), suggests that Facebook rewrite its guidelines from scratch, and fast. The NSPCC has criticized the social media giant in the past for allowing a video of a baby being dunked in water to remain online. “We want to see them taking non-sexual child abuse imagery just as seriously as they take sexual abuse imagery,” Lilley says.

“I’d like to see them take a step back and look at their guidelines that they hand to their moderators and look at the contradictions that are inherent in them. They need to throw them away and start again with a blank sheet of paper.”