Apple giving in to Russia twice this week on key civil liberties issues shows that the company’s assurances about misuse of its CSAM-scanning system cannot be trusted, argues a high-profile security expert.
Apple today pulled an opposition tactical voting app from the App Store after the Russian government threatened specific local company employees with “punishment” if the company refused. Apple also turned off its Private Relay service in Russia just yesterday, likely in response to government pressure as well.
When Apple announced its plans to scan iPhones for Child Sexual Abuse Materials (CSAM), many pointed out that exactly the same technology could be used to scan phones for things like political content by opponents of repressive governments.
Apple responded by saying that it would never allow this. It would, it said, only search for image hashes in at least two different child safety organization databases.
Security experts, civil rights groups, democratic governments, and even Apple’s own employees called on the company to abandon its plans for this reason.
Addressing the issue of a repressive government forcing it to search for particular materials, Apple said it would “refuse such demands.” But the company also says it obeys the laws of each country in which it operates, and commentators noted that pressure could be applied to it even in the absence of such laws.
Apple giving in to Russia proves the risks are real
As much as Apple claims it would never give in to government pressure to misuse its CSAM scanning feature, cryptography academic Matthew Green argues that the company just proved these assurances are worthless. Apple and Google shut down a voting app meant to help opposition parties organize against the Kremlin in a parliamentary election in Russia that’s taking place over the weekend. The companies removed the app from their app stores on Friday after the Russian government accused them of interfering in the country’s internal affairs, a clear attempt by President Vladimir Putin to obstruct free elections and stay in power.
The Smart Voting app was designed to identify candidates most likely to beat members of the government-backed party, United Russia, as part of a broader strategy organized by supporters of the imprisoned Russian activist Alexei Navalny to bring together voters who oppose Putin. In a bid to clamp down on the opposition effort, the Russian government told Google and Apple that the app was illegal, and reportedly threatened to arrest employees of both companies in the country. The move also comes amid a broader crackdown on Big Tech in Russia. Earlier this week, a Russian court fined Facebook and Twitter for not removing “illegal” content, and the country is reportedly blocking people’s access to Google Docs, which Navalny supporters had been using to share lists of preferred candidates.
Critics say the episode serves as an example of why Apple, specifically, can’t be trusted to protect people’s civil liberties and resist government pressure. The company strictly controls the software allowed onto millions of devices and has recently faced allegations of monopolistic behavior over how it manages its App Store, which is the only way people can install apps on iPhones and iPads. While Google is also accused of caving to censorship demands, Android users can still access the Russian voting app without relying on the Google Play store, though it’s more difficult. “Android users in Russia can find other ways to install this app, whereas Apple is actively helping the Russian government make it impossible for iOS users to do so,” Evan Greer, the director of the digital rights group Fight for the Future, told Recode. “Apple’s top-down monopolistic approach is at the root of their harm.”
Apple insisted just last month that it did, in fact, have the ability to defy this type of government influence. The company said so when it announced a new photo-scanning iPhone feature meant to identify images containing child sexual abuse material (CSAM). The tool, Apple explained, would involve downloading a National Center for Missing and Exploited Children (NCMEC) photo database, in the form of numerical codes, onto every iPhone. The update would then run those codes against photos stored in users’ iCloud accounts, looking for matches that would be reported to human reviewers, and then to the NCMEC.

Apple’s ambiguous commitment to protecting its users’ civil liberties is especially concerning because the company still insists that it should control large swaths of the software available on the iPhone. While developers like Epic Games have been pushing back against this “walled garden” approach, Apple still maintains wide-ranging discretion over what programs and apps run on its devices. But as recent events in Russia make clear, Apple’s tight control over its App Store can be abused by authoritarian governments.
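To make the mechanism concrete, here is a minimal, purely illustrative sketch of the matching logic described above: a photo is flagged only if its hash appears in the databases of at least two independent child safety organizations, the safeguard Apple cited against a single entity inserting non-CSAM content. All names and data below are invented, and a cryptographic hash stands in for Apple’s actual perceptual “NeuralHash,” so this is a sketch of the idea, not Apple’s implementation.

```python
import hashlib

def photo_hash(photo_bytes: bytes) -> str:
    # Stand-in for a perceptual hash; SHA-256 is used here only to keep
    # the sketch self-contained. A real system hashes image features so
    # that visually similar photos produce the same code.
    return hashlib.sha256(photo_bytes).hexdigest()

def is_flagged(photo_bytes: bytes, databases: list[set[str]]) -> bool:
    # A match counts only if the hash appears in two or more independent
    # databases, so no single organization (or government leaning on one)
    # can unilaterally add content to the match list.
    h = photo_hash(photo_bytes)
    matches = sum(1 for db in databases if h in db)
    return matches >= 2

# Hypothetical databases containing hashes of known images.
db_a = {photo_hash(b"known-image")}
db_b = {photo_hash(b"known-image"), photo_hash(b"only-in-one-db")}

print(is_flagged(b"known-image", [db_a, db_b]))     # True: in both databases
print(is_flagged(b"only-in-one-db", [db_a, db_b]))  # False: only in one
print(is_flagged(b"benign-photo", [db_a, db_b]))    # False: in neither
```

The two-database intersection is exactly the kind of policy safeguard critics find unconvincing: it lives in code and process that Apple controls and could change under the same legal pressure described in this article.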
“Apple was trying to bake censorship into the operating system, adding technology that could search our own phones for banned files,” warned Albert Fox Cahn, the director of STOP, the Surveillance Technology Oversight Project. “But if one government can search for CSAM, another can search for religious texts and political discourse.”