Apple recently announced that they would be cracking down on illegal pornographic material housed on their cloud storage. Apple plans to scan iPhones using NeuralHash to identify images of child pornography and then redirect the case to the proper authorities. While the aim of the effort is noble and admirable, the move raises significant privacy concerns for users. The actions of governments at home and abroad show that Americans should demand a hands-off approach when dealing with matters of digital privacy.
Americans typically have a strong sense of privacy. Nobody wants someone just walking into their house, even if they have good intentions. Americans have similar sensibilities when it comes to their digital data, too. The vast majority of Americans want laws protecting digital privacy; in fact, 83 percent want such legislation to be passed this year.
The government, however, is doing the opposite. Numerous outlets have reported that Apple and other tech companies have wilted under pressure from the government to hand over user data. The Electronic Frontier Foundation published a scathing attack against Apple’s decision in which they explained the changes that are being put into place:
There are two main features that the company is planning to install in every Apple device. One is a scanning feature that will scan all photos as they get uploaded into iCloud Photos to see if they match a photo in the database of known child sexual abuse material (CSAM) maintained by the National Center for Missing & Exploited Children (NCMEC). The other feature scans all iMessage images sent or received by child accounts — that is, accounts designated as owned by a minor — for sexually explicit material, and if the child is young enough, notifies the parent when these images are sent or received. This feature can be turned on or off by parents.
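The matching scheme described above can be sketched in miniature. Apple's actual NeuralHash is a proprietary neural perceptual hash, so the following is only an illustrative stand-in: it uses a simple "difference hash" (dHash) over tiny grayscale grids to show the general idea of flagging images whose hashes land close to entries in a database of known hashes. The function names and the distance threshold are assumptions made for this sketch, not Apple's implementation.

```python
# Illustrative sketch only: NOT Apple's NeuralHash. A difference hash
# (dHash) over small grayscale grids, matched by Hamming distance
# against a set of known hashes. Names/threshold are hypothetical.

def dhash(pixels):
    """One bit per horizontal pixel pair: set when the left pixel
    is brighter than its right neighbor."""
    bits = 0
    for row in pixels:
        for left, right in zip(row, row[1:]):
            bits = (bits << 1) | (1 if left > right else 0)
    return bits

def hamming(a, b):
    """Count the bits where two hashes differ."""
    return bin(a ^ b).count("1")

def matches_known(image_hash, known_hashes, threshold=2):
    """Flag an image whose hash is within `threshold` bits of any
    entry in the known-material hash database."""
    return any(hamming(image_hash, h) <= threshold for h in known_hashes)

# Tiny 4x4 "images": one known, one near-duplicate, one unrelated.
known = [[10, 20, 30, 40], [40, 30, 20, 10], [10, 20, 30, 40], [40, 30, 20, 10]]
near  = [[11, 21, 29, 41], [41, 29, 21, 11], [11, 21, 29, 41], [41, 29, 21, 11]]
other = [[90, 10, 90, 10], [10, 90, 10, 90], [90, 10, 90, 10], [10, 90, 10, 90]]

database = {dhash(known)}
print(matches_known(dhash(near), database))   # near-duplicate is flagged
print(matches_known(dhash(other), database))  # unrelated image is not
```

The key property, and the source of the privacy debate, is that matching happens against an opaque database: the device owner cannot inspect what the known-hash set contains, only whether their photos trip the comparison.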
Apple, for their part, has told users that they will not hand over their data. In response to a question about privacy, Apple said:
Apple will refuse any such demands [to scan for non-pornographic image data]. Apple’s CSAM detection capability is built solely to detect known CSAM images stored in iCloud Photos that have been identified by experts at NCMEC and other child safety groups. We have faced demands to build and deploy government-mandated changes that degrade the privacy of users before, and have steadfastly refused those demands. We will continue to refuse them in the future.
It’s good that Apple says they intend to refuse future government demands. However, Apple was pressured into this decision, and it doesn’t have an unblemished record of keeping users’ information secure. Tech companies have suffered numerous breaches that have resulted in billions of accounts being compromised. It’s concerning that Apple even needs to stand firm against the government intruding on citizens’ personal privacy. Citizens want more privacy and more Internet security, not less.
Beyond the American government, there are incentives for foreign politicians to pressure Big Tech companies into selling or monetizing consumer data. China, in particular, has used metadata to spy on its own citizens. American citizens aren’t going to get a free pass from foreign snooping.
It would be nice if American companies had a sterling record of standing up to foreign pressure, especially when that pressure comes from authoritarian regimes. Unfortunately, Xi Jinping has enormous economic influence, and tech companies have been all too willing to change their guidelines for Chinese cash.
China censored Google Search, which led Google to pull its search engine from the country, yet Google has still played nice with China. TikTok is owned by the Chinese company ByteDance and has been criticized for harvesting users’ biometric data. Apple, for their part, manufactures much of its product line in China, and the Chinese Communist Party is willing to pressure the company into doing its bidding. All of this should worry users who care about digital privacy.
American law enforcement wants to track down those who sexually exploit children. They should absolutely do so. However, using “every means necessary” can result in tracking mechanisms that have dire political consequences. Apple’s decision to allow automated scanning of cloud-image data could spark a backlash in favor of greater digital privacy. Let’s hope it does.