A coming iOS upgrade will scan users’ iPhones for images of child sexual abuse, but one wonders how many livelihoods will be ruined when the bots make a mistake.

Global smartphone kingpin Apple has long marketed itself as the great vanguard of privacy, and acquired significant credibility in that department when it went to war with Facebook over data sharing. That credibility has taken a haircut in the last few days, after the company’s Friday announcement that it would be scanning your iPhone for sexual images of children, a move that makes iPhone owners worldwide wonder, “Exactly how accurate is this technology?”

The Associated Press has a nice explainer of how this would work, noting that the scans would only flag images that have already been identified as kiddie porn in a National Center for Missing and Exploited Children database. “The detection system will only flag images that are already in the center’s database of known child pornography,” the news service says. “Parents snapping innocent photos of a child in the bath presumably need not worry. But researchers say the matching tool — which doesn’t ‘see’ such images, just mathematical ‘fingerprints’ that represent them — could be put to more nefarious purposes.”
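To make the “fingerprints, not photos” idea concrete, here is a minimal sketch of hash-based matching. Note the assumptions: Apple’s actual system uses a perceptual “NeuralHash” plus cryptographic blinding on-device, none of which is reproduced here; this example swaps in an exact SHA-256 digest as a stand-in, and the `knownFingerprints` set is a hypothetical placeholder for the NCMEC database.

```swift
import Foundation
import CryptoKit

// Hypothetical stand-in for the database of fingerprints of known images.
// In Apple's real system these would be blinded NeuralHash values, not
// plain SHA-256 digests.
let knownFingerprints: Set<String> = [
    // hex digests of already-identified images would be loaded here
]

/// Compute a fingerprint for an image file and check it against the database.
/// The code never inspects what the image depicts -- it only compares opaque
/// digests, which is the point the AP explainer is making.
func isFlagged(imageURL: URL) throws -> Bool {
    let data = try Data(contentsOf: imageURL)
    let digest = SHA256.hash(data: data)
    let fingerprint = digest.map { String(format: "%02x", $0) }.joined()
    return knownFingerprints.contains(fingerprint)
}
```

An exact hash like this would miss a copy that had been resized or re-compressed, which is why Apple uses a fuzzier perceptual hash instead, and that fuzziness is exactly where the accuracy questions raised below come in.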

The industry’s technical term for child pornography is Child Sexual Abuse Material (CSAM), and tech companies have been busting people for years over it, mostly because they are legally required to. But scanning your phone is a new Rubicon for them to cross. Per Apple’s explanation, it will only scan those images stored on your iCloud Photos account. But honestly, what percentage of iPhone users know the degree to which their settings are automatically uploading every snapshot and texted photo to iCloud?

Could, say, an Android user “SWAT” me by sending me a sexually explicit image of a minor from a burner phone, and could I then be flagged because of it? Do we have any assurance that these vaunted machine-learning bots won’t produce false matches? The Center for Democracy and Technology has studied these tools and concludes they are “notoriously error-prone.”

Apple has been playing defense since Friday’s announcement of the program, putting out a set of FAQs and promising that it would not allow hostile governments to use the system to surveil anything other than child sexual abuse material. But the company has always promised “end-to-end encryption,” and now it has created a “backdoor” into the contents of your phone. And once Apple has created the backdoor, it can’t actually guarantee that someone else won’t figure out how to use it.

Related: Apple Clashes With Justice Department Yet Again Over iPhone Backdoor [SFist]

Image: Lastly via Unsplash