On Bing, child pornography is not only easily accessible but is also being recommended to users. A report released Thursday by TechCrunch revealed that Microsoft’s search engine is surfacing illegal child exploitation imagery in its search results.
With the help of online safety firm AntiToxin, TechCrunch investigated Bing’s potential child pornography problem from December 30, 2018, to January 7, 2019, after receiving an anonymous tip.
Researchers discovered that searching common terms related to child pornography, like “porn kids,” “porn CP” (CP is an abbreviation for “child pornography”) and “nude family kids” resulted in illegal images on Bing. The report describes the pictures as “images which any professional or novice can determine to be of underage boys and girls posing in partial/full nudity, as well as partaking in various sexual acts.”
And even looking up seemingly innocent words and phrases can lead to offensive photos. For example, when researchers looked up popular teen video chat app “Omegle Kids,” Bing auto-suggested “Omegle Kids Girls 13,” which contained child pornography in the results, along with explicit content in the Similar Images box.
All of the searches were conducted on a desktop computer with the “Safe Search” feature turned off. The report notes that similar queries on Google did not yield nearly as much exploitative imagery, nor content as graphic.
AntiToxin CEO Zohar Levkovitz says this kind of abuse on a search engine as widely used as Bing is unacceptable. “Speaking as a parent, we should expect responsible technology companies to double, and even triple-down to ensure they are not adding toxicity to an already perilous online environment for children.”
So how does something like this happen? “We index everything, as does Google, and we do the best job we can of screening it,” explained a Microsoft spokesperson when questioned by TechCrunch. “We use a combination of PhotoDNA and human moderation but that doesn’t get us to perfect every time. We’re committed to getting better all the time.”
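The spokesperson’s description — automated hash matching via PhotoDNA backed by human moderation — follows a common content-screening pattern. PhotoDNA itself is a proprietary perceptual-hashing service that tolerates resizing and recompression, so the sketch below is only an illustration of the general pipeline, with an exact-match SHA-256 digest standing in for the real hash and a placeholder hash list standing in for the databases maintained by organizations such as NCMEC:

```python
import hashlib

# Placeholder list of known-image hashes (hypothetical; real systems use
# curated databases of perceptual hashes, not exact digests).
# This value is the SHA-256 of empty bytes, used purely for illustration.
KNOWN_BAD_HASHES = {
    "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855",
}

def image_hash(data: bytes) -> str:
    """Exact-match digest of the raw image bytes. PhotoDNA instead computes
    a robust perceptual hash that survives resizing and recompression; that
    algorithm is proprietary, so SHA-256 stands in here."""
    return hashlib.sha256(data).hexdigest()

def screen_image(data: bytes) -> str:
    """Return a moderation decision for an image before it is indexed."""
    if image_hash(data) in KNOWN_BAD_HASHES:
        # Matched a known illegal image: never index, report it.
        return "block-and-report"
    # Unknown image: fall through to human review or other heuristics --
    # the step the spokesperson concedes "doesn't get us to perfect."
    return "queue-for-moderation"
```

The two-stage design reflects the trade-off Microsoft describes: hash matching is fast and scalable but only catches previously identified images, so novel content depends on slower human moderation.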
After being made aware of the issue, Microsoft said the problem had been corrected and that users will now have more options for flagging illegal images, including a “child sexual abuse” category.
Corporate Vice President Jordi Ribas stated, “We acted immediately to remove [the images], but we also want to prevent any other similar violations in the future. We’re focused on learning from this so we can make any other improvements needed.”
The full report from TechCrunch and AntiToxin is available on TechCrunch’s website.