The big thing
In the past month, Apple did something it has generally done an exceptional job of avoiding: the company made what seemed to be an entirely unforced error.
In early August, seemingly out of nowhere**, the company announced that by the end of the year it would roll out a technology called NeuralHash that would actively scan the libraries of all iCloud Photos users, seeking out image hashes that matched known images of child sexual abuse material (CSAM). For obvious reasons, the on-device scanning could not be opted out of.
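To make the mechanism concrete, here is a minimal sketch of hash-based blocklist matching. It is purely illustrative and not Apple's implementation: NeuralHash derives a perceptual hash from a neural network so that visually similar images map to the same value, whereas the cryptographic hash and placeholder blocklist below are assumptions chosen only to show the shape of the lookup.

```python
import hashlib

# Hypothetical blocklist of known-bad image hashes (placeholder entries).
KNOWN_CSAM_HASHES = {
    "hash-of-known-image-1",
    "hash-of-known-image-2",
}

def image_hash(image_bytes: bytes) -> str:
    # Stand-in for a perceptual hash; SHA-256 is used here only for shape.
    # A real perceptual hash maps near-duplicate images to the same value.
    return hashlib.sha256(image_bytes).hexdigest()

def flag_for_review(image_bytes: bytes) -> bool:
    """Return True if an image's hash appears on the known-bad list."""
    return image_hash(image_bytes) in KNOWN_CSAM_HASHES
```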
The announcement was not coordinated with other major consumer tech giants; Apple pushed forward alone.
Researchers and advocacy groups had almost universally negative feedback for the effort, raising concerns that it could create new channels of abuse for actors like governments seeking to detect on-device information that they regarded as objectionable. As my colleague Zach noted in a recent story, “The Electronic Frontier Foundation said this week it had amassed more than 25,000 signatures from consumers. On top of that, close to 100 policy and rights groups, including the American Civil Liberties Union, also called on Apple to abandon plans to roll out the technology.”
(The announcement also reportedly generated some controversy inside of Apple.)
The issue, of course, wasn’t that Apple was looking for ways to prevent the proliferation of CSAM while making as few device security concessions as possible. The issue was that Apple was unilaterally making a massive choice that would affect billions of customers (while likely pushing competitors toward similar solutions), and was doing so without external public input on possible ramifications or necessary safeguards.
Long story short: over the past month, researchers discovered that Apple’s NeuralHash wasn’t as airtight as hoped, and the company announced Friday that it was delaying the rollout “to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features.”
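The core weakness researchers demonstrated was hash collisions: visibly different images could be crafted to produce the same NeuralHash value. The toy below is not NeuralHash, just a drastically simplified perceptual hash invented for illustration, but it shows why such collisions are structurally easy to find.

```python
def toy_perceptual_hash(pixels: list) -> tuple:
    """Toy perceptual hash: one bit per pixel, thresholded at the mean.

    A drastically simplified, hypothetical stand-in for NeuralHash.
    """
    mean = sum(pixels) / len(pixels)
    return tuple(p > mean for p in pixels)

# Two visibly different 4-pixel "images" that nonetheless hash identically,
# illustrating how perceptual hashes admit collisions by construction.
img_a = [10, 200, 10, 200]   # high-contrast stripes
img_b = [90, 110, 90, 110]   # low-contrast stripes
assert toy_perceptual_hash(img_a) == toy_perceptual_hash(img_b)
```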
Having spent several years in the tech media, I will say that the only reason to release news on a Friday morning ahead of a long weekend is to ensure that the announcement is seen by as few people as possible, and it’s clear why Apple would want that. This is a major embarrassment for the company, and as with any delayed rollout like this, it’s a sign that its internal teams weren’t adequately prepared and lacked the ideological diversity to gauge the scope of the issue they were tackling. That isn’t really a dig at the team building the feature so much as a dig at Apple for trying to solve a problem like this inside the Apple Park vacuum while adhering to its annual iOS release schedule.
Apple is increasingly looking to make privacy a key selling point for the iOS ecosystem, and as a result of this productization, it has pushed development of privacy-centric features toward the same secrecy its surface-level design changes command. In June, Apple announced iCloud+ and raised some eyebrows when it shared that certain new privacy-centric features would be available only to iPhone users who paid for additional subscription services.
You obviously can’t tap public opinion for every product update, but wide-ranging, trail-blazing security and privacy features should perhaps be treated a bit differently than the average update. Apple’s lack of engagement with research and advocacy groups on NeuralHash was pretty egregious, and it certainly raises questions about whether the company fully respects how the choices it makes for iOS affect the broader internet.
Delaying the feature’s rollout is a good thing; let’s all hope Apple takes that time to reflect more broadly as well.
** Though the announcement was a surprise to many, Apple’s development of this feature didn’t come completely out of nowhere. Those at the top of Apple likely felt that the winds of global tech regulation were shifting toward outright bans of some methods of encryption in some of the company’s biggest markets.
Back in October 2020, then-United States Attorney General Bill Barr joined representatives from the UK, New Zealand, Australia, Canada, India and Japan in signing a letter raising major concerns about how implementations of encryption tech posed “significant challenges to public safety, including to highly vulnerable members of our societies like sexually exploited children.” The letter effectively called on tech companies to get creative in how they tackled the problem.