Got the Latest Spy Phone?
Apple’s Privacy Problem
By Helen Webley-Brown
Artwork by Leslie Liu, Design Lead

A lot has changed in the two years since Apple proudly advertised its dedication to privacy on the side of a Las Vegas hotel. What happens on your iPhone no longer stays on your iPhone. With technology that can scan photo libraries for child sexual abuse material (CSAM), Apple is tackling an atrocious crime. But the problem isn’t what Apple’s iPhone scanning system can do today; it’s what it could do tomorrow. Apple must take its own advice, and “step back and consider the implications” of building a Trojan horse for mass surveillance. Privacy and democracy are on the line.


In 2016, Apple made waves in the Crypto Wars when it went toe-to-toe with the FBI over a suspected terrorist’s iPhone. Doubling down on its commitment to privacy rights, Apple refused to create a ‘backdoor’ into the iPhone’s encryption. Citing security and privacy concerns, Apple’s CEO Tim Cook warned that “Once created, the technique [to circumvent encryption] could be used over and over again, on any number of devices.” Despite heavy government pressure, Apple decided not to play with fire: it determined that the immediate benefits and good intentions in this one case did not outweigh the long-term privacy implications for every other iPhone user. Oh, how times have changed.


On August 5th, the tech giant unveiled two major updates to “expand protections for children.” The lesser of the two evils, an iMessage feature that scans for sexually explicit content, uses machine learning to vet images sent to or from children under 13. If the algorithm believes an image is explicit, it will blur it and, if the child views it anyway, notify their parents. This iMessage feature joins a long list of spyware apps that already enable parents to surveil their children’s digital activities.


Still, built-in Apple-endorsed spying on kids certainly adds fuel to the fire.


A key question that Apple’s iMessage filter poses is whether some children are being harmed rather than protected. Critics have pointed out that the notification system could out LGBTQ+ children to their parents. From Twitter to YouTube, algorithms intended to detect sexually explicit content have been found to incorrectly flag LGBTQ+ content. Technology could push helicopter parenting too far. If parents warm to this control feature, how long until they demand that other content be flagged and blurred? The strong parallels to Black Mirror’s Arkangel, in which parents could blur their children’s actual vision, are terrifying. Despite being advertised as a safety tool, tech-enabled parental surveillance could put children at risk.


Apple’s NeuralHash CSAM detection tool has far graver implications for privacy than blurred iMessages. Hash technology enables Apple to scan a user’s photos for CSAM without actually “seeing” them. An algorithm converts each image file into a short string of characters, a hash value, that serves as the image’s fingerprint. If a user’s photo is the same as a known CSAM image, the two will have matching hash values. Julian Sanchez calls this method a “Pinpoint Search.” While physical searches reveal all available information, pinpoint searches can reveal the presence or absence of illegal material without revealing other private information.
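
To make the mechanics concrete, here is a deliberately simplified sketch in Swift of how hash matching works in general. It uses an ordinary cryptographic hash (SHA-256 from Apple’s CryptoKit) purely as a stand-in; Apple’s NeuralHash is a perceptual, neural-network-based hash designed to match visually similar images, and the list of known hashes below is a hypothetical placeholder.

```swift
import Foundation
import CryptoKit

// Compute a "fingerprint" for an image file.
// SHA-256 is an illustrative stand-in: NeuralHash is a perceptual hash,
// not a cryptographic one, but the matching idea is the same.
func fingerprint(of fileURL: URL) throws -> String {
    let data = try Data(contentsOf: fileURL)
    let digest = SHA256.hash(data: data)
    return digest.map { String(format: "%02x", $0) }.joined()
}

// Hypothetical set of fingerprints of known illegal images.
let knownBadHashes: Set<String> = [
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08"
]

// A "pinpoint search": the check reveals only whether a photo matches a
// known fingerprint, nothing about what non-matching photos depict.
func matchesKnownMaterial(_ fileURL: URL) -> Bool {
    guard let hash = try? fingerprint(of: fileURL) else { return false }
    return knownBadHashes.contains(hash)
}
```

In this model, a match answers exactly one question, whether a photo’s fingerprint appears in the database, which is what Sanchez means by a pinpoint search.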


Apple’s NeuralHash feature uses a “two party” design to scan iPhones. The hash of every photo on a user’s device will be checked against a database of hashes of known CSAM images provided by the National Center for Missing and Exploited Children (NCMEC). Apple will be alerted only if multiple photos in the user’s library match the CSAM database. At that point, Apple will manually review the flagged images to screen out false positives before reporting the account to NCMEC and involving law enforcement agencies.
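
A minimal sketch of that threshold step, building on the hypothetical matchesKnownMaterial check above; Apple has described using a match threshold, but the number here is purely illustrative.

```swift
import Foundation

// Illustrative threshold: no alert is generated until this many photos
// match the database (the real value is chosen by Apple; 30 is just a
// placeholder for this sketch).
let matchThreshold = 30

// Scan a photo library; return the matching photos only if the threshold
// is crossed, otherwise report nothing at all.
// `matchesKnownMaterial` is the hypothetical hash check sketched above.
func photosToReview(in photoURLs: [URL]) -> [URL]? {
    let matches = photoURLs.filter { matchesKnownMaterial($0) }
    guard matches.count >= matchThreshold else { return nil }
    return matches  // surfaced for human review before any report is filed
}
```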


Scanning images for inappropriate material is nothing new. Many of Apple’s peers already use content reporting capabilities and hash-matching to flag CSAM uploaded to their servers. However, staying true to form, Apple has decided to “think different” about privacy and security. While other providers scan images uploaded to their clouds, Apple has opted for on-device scanning. This seemingly innocuous difference is a step too far down a slippery slope. Over 113 million iPhone users have lost control over who has access to some of their most precious, personal moments.


Apple is turning cell phones into spy phones.


Apple’s well-intentioned child protection measures could be a Trojan horse for mass surveillance. In a series of tweets, Will Cathcart, the head of Facebook-owned WhatsApp, criticized these new features. Cathcart noted that they “could very easily be used to scan private content for anything they or a government decides it wants to control.” Although the NeuralHash feature is currently only used in the US, it is unclear how the feature could be used in other contexts and countries. What is considered a legal right in one place could be illegal in another. CSAM is an abhorrent problem that needs to be tackled, but how long until Apple’s technology is used to track LGBTQ+ people or to persecute religious minorities? The road to hell is paved with good intentions.


Artificial Intelligence (AI) tops most major emerging tech lists, and for good reason. According to The Brookings Institution, AI is transforming every walk of life, from school admissions to loan decisions, while raising important policy, regulatory, and ethical issues. Apple’s use of machine learning, a subset of AI, to scan iPhone images is a case in point. On-device scanning sets a disturbing precedent for other cell phone manufacturers and governments to reach into our phones, and into our lives. We must ask ourselves what rights we are willing to sacrifice to tackle tech-enabled crime. Is spyware pre-installed on our phones really such a small price to pay?


In a world where many people will happily sacrifice their privacy for convenience alone, Apple has gambled that the laudable goal of protecting children will make its spyware more palatable. But Apple is playing with fire. With a built-in surveillance system, iPhones could be catching child abusers one day and monitoring protesters the next.


Today’s promise of protection could be tomorrow’s silver bullet against privacy.
