A coming iOS upgrade will scan users’ iPhones for known images of child sexual abuse, but one wonders how many livelihoods will be ruined when the bots make a mistake.
Global smartphone kingpin Apple has long marketed itself as the great vanguard of privacy, and acquired significant credibility in that department when they went to war with Facebook over data sharing. That credibility has taken a haircut in the last few days, after the company’s Friday announcement that they would be scanning your iPhone for sexual images of children, a move that makes iPhone owners worldwide wonder, “Exactly how accurate is this technology?”
It's hard to see how this doesn't result in thousands of people—or more—having their lives turned upside down after being incorrectly flagged and reported to the authorities.
— Dan Savage (@fakedansavage) August 6, 2021
The Associated Press has a nice explainer of how this would work, noting that the scans would only flag images that have already been identified as kiddie porn in a National Center for Missing and Exploited Children database. “The detection system will only flag images that are already in the center’s database of known child pornography,” the news service says. “Parents snapping innocent photos of a child in the bath presumably need not worry. But researchers say the matching tool — which doesn’t ‘see’ such images, just mathematical ‘fingerprints’ that represent them — could be put to more nefarious purposes.”
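For the technically inclined, here’s a rough sketch of that flag-on-match idea in Swift. It’s an illustration, not Apple’s actual code: the real system reportedly uses a perceptual “NeuralHash” plus cryptographic blinding, whereas this toy version stands in a plain SHA-256 and a made-up, in-memory fingerprint list.

```swift
import CryptoKit
import Foundation

// Illustration only: Apple's pipeline reportedly uses a perceptual
// "NeuralHash" and cryptographic blinding, not a plain SHA-256, and the
// real fingerprint list from NCMEC is never shipped in readable form.
// The placeholder below is just the SHA-256 of empty data, for demo purposes.
let knownFingerprints: Set<String> = [
    "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855"
]

/// Computes a hex "fingerprint" for an image's raw bytes.
func fingerprint(of imageData: Data) -> String {
    SHA256.hash(data: imageData)
        .map { String(format: "%02x", $0) }
        .joined()
}

/// The scanner never "sees" the photo itself; it only checks whether the
/// photo's fingerprint already appears in the known-material database.
func shouldFlag(_ imageData: Data) -> Bool {
    knownFingerprints.contains(fingerprint(of: imageData))
}

print(shouldFlag(Data()))        // true: empty data matches the placeholder hash
print(shouldFlag(Data([0x01])))  // false: anything else does not
```

A perceptual hash, unlike the SHA-256 above, is designed to keep matching after resizing or re-compression, which is exactly the kind of fuzziness that has researchers worried about false or manufactured matches.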
i have a hard time believing that CSAM is such an existential threat to apple's platform that this effort has merit beyond brokering corporate goodwill with law enforcement agencies and further perpetuating the puritanical moral code they have fostered.
— デミ (@queersorceress) August 7, 2021
The tech industry’s term for child pornography is Child Sexual Abuse Material (CSAM), and tech companies have been busting people for it for years, mostly because they are legally required to. But scanning your phone is a new Rubicon for them to cross. Per Apple’s explanation, the system will only scan images stored in your iCloud Photos account. But honestly, what percentage of iPhone users know the degree to which their settings are automatically uploading every snapshot and texted photo to iCloud?
No matter how well-intentioned, @Apple is rolling out mass surveillance to the entire world with this. Make no mistake: if they can scan for kiddie porn today, they can scan for anything tomorrow.
— Edward Snowden (@Snowden) August 6, 2021
They turned a trillion dollars of devices into iNarcs—*without asking.* https://t.co/wIMWijIjJk
Could, say, an Android user “SWAT” me by sending me a sexually explicit image of a minor from their burner phone, and could I then be flagged because of that? Do we have any assurance that these vaunted machine-learning bots won’t produce mismatches? The Center for Democracy and Technology has studied these tools and concludes they are “notoriously error-prone.”
I think these two paragraphs get to the heart of what is so disturbing about Apple’s photo scanning initiative. https://t.co/bCe9Sg8Qmn pic.twitter.com/9jFqhyc3U1
— Matthew Green (@matthew_d_green) August 9, 2021
Apple has been playing defense since Friday’s announcement of the program, putting out a set of FAQs and promising that it would not allow hostile governments to use the system to surveil anything other than child sexual abuse material. But they’ve always promised “end-to-end encryption,” and now they’ve created a “backdoor” into the contents of your phone. And once Apple has created that backdoor, they can’t actually guarantee that someone else won’t figure out how to use it.
Related: Apple Clashes With Justice Department Yet Again Over iPhone Backdoor [SFist]
Image: Lastly via Unsplash