
Apple Still Plans to Scan Phones for Child Pornography. Experts Are Sounding the Alarm.

The issue is what this technology could mean for global privacy standards.

Over the summer, Apple released plans to automatically scan iPhone photos for child sexual abuse material (CSAM), a decision met with immediate backlash from some privacy experts.

Now, a group of leading computer science and security experts has published a paper documenting their concerns over the proposed software, according to ComputerWeekly.com. The paper, which has not been peer-reviewed, was published via Columbia University and arXiv, an open-access research database.

Plenty of internet companies scan for CSAM, including social media giants like Facebook, reports Vox. But while these scans take place on external servers, Apple’s plan would scan photos destined for iCloud accounts on personal devices like iPhones. In addition, Apple is planning to allow parents to opt into scanning photos sent via Messages by users younger than 12 years old, Vox reports.

Scanning photos on phones themselves, as opposed to on centralized servers, is called “client-side scanning,” or CSS — and that’s what concerns the authors of this new paper. They say the issue isn’t with scanning for CSAM itself, but rather with the potential future uses of CSS technology.
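As a rough illustration of the idea, a client-side scanner checks each photo against a list of known fingerprints before it ever leaves the device. (This is a simplified sketch, not Apple's actual system, which uses a perceptual hash called NeuralHash rather than an exact cryptographic hash.)

```python
import hashlib

# Toy sketch of client-side scanning (CSS): the device holds a list of
# fingerprints of known flagged images and checks every outgoing photo
# against it. Real systems use perceptual hashes that tolerate resizing
# and re-compression; an exact SHA-256 match is used here for simplicity.

KNOWN_HASHES = {
    hashlib.sha256(b"known-flagged-image-bytes").hexdigest(),
}

def scan_before_upload(photo_bytes: bytes) -> bool:
    """Return True if the photo matches a known fingerprint and gets flagged."""
    return hashlib.sha256(photo_bytes).hexdigest() in KNOWN_HASHES
```

The privacy concern the paper raises is visible even in this sketch: whoever controls the `KNOWN_HASHES` list controls what the device searches for, and nothing in the mechanism itself limits that list to CSAM.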

Client-Side Scanning (CSS) Could Be Hugely Problematic for Personal Privacy — and Safety

“Even if deployed initially to scan for child sex-abuse material, content that is clearly illegal, there would be enormous pressure to expand its scope,” the authors write. “We would then be hard-pressed to find any way to resist its expansion or to control abuse of the system.”

The authors note concerns about the potential to target everything from political activism to organized crime and terrorism to LGBTQ+ people, depending on who any given government is looking for. If this technology creates a door into people’s phones, it’s easy to imagine authoritarian governments pressuring Apple to open that door and scan for images associated with unpopular political opinions or anti-authoritarian messages.

The initial backlash to the plan prompted the company to rework, but maintain, the proposal for iPhone CSS, reports Ars Technica. In addition, Apple published a series of responses to questions on the subject. Asked whether governments could make Apple scan for things other than CSAM, the company said it would “refuse such demands” and that “this technology is limited to detecting CSAM stored in iCloud and we will not accede to any government’s request to expand it.”

Apple has held firm on privacy issues before, such as with the FBI’s request to unlock the San Bernardino shooter’s iPhone in 2016. Though the FBI eventually got into the phone, Apple maintained its refusal to help, according to The Verge.

But the concerns don’t just lie with state actors. The paper cites concerns about people like corrupt police officers, or even family members. “Consider, for example, a woman planning to escape a violent or controlling partner who abuses both her and their children,” the authors write. “The abuser will often install stalkerware on family phones, and he may use ‘smart home’ infrastructure such as video doorbells as a way to control them.”

Apple has responded to questions about whether the feature will “prevent children in abusive homes from seeking help” by stating on its website that “The communication safety feature applies only to sexually explicit photos shared or received in Messages. Other communications that victims can use to seek help, including text in Messages, are unaffected.” The company also notes that it is adding support in Siri and Search to give victims of abuse more resources.

The response to this proposal hasn’t been uniformly negative — The Verge notes that some experts are pleased with the potential to intercept CSAM. And for kids growing up in an internet-centric society, it is incredibly important to keep them safe online.

Keep Your Kids Offline As Much As Possible, and Educated When They’re Online

One of the most important things you can do, besides keeping your kids off the internet and smartphones for as long as possible, is to talk with your kids about internet safety and data privacy, letting them know the reality of surveillance, data-gathering, and potential bad actors in today’s world.

As the world moves increasingly online, it will be a challenge to find workable solutions that keep people, and children in particular, safe while also keeping our data out of the hands of everyone from criminals to authoritarian governments to private companies.

The authors of this new paper note some of the larger potential implications of mass surveillance. “The introduction of scanning on our personal devices—devices that keep information from to-do notes to texts and photos from loved ones—tears at the heart of privacy of individual citizens,” they write. “Such bulk surveillance can result in a significant chilling effect on freedom of speech and, indeed, on democracy itself.”