With Apple’s newest software updates on the horizon, the company has revealed more about what your devices will be able to do after updating. Part of these improvements include new child protection features, which will affect users on devices supporting iOS 15, iPadOS 15, and macOS Monterey.
However, many critics worldwide have lashed out at the Silicon Valley giant for these new measures. But what are they saying and why? This article will answer both of these questions; let’s hop in and get started.
What Did Apple Announce?
In August 2021, Apple announced that it would scan images to combat the distribution of Child Sexual Abuse Material (CSAM). The company will do this as users upload photos to their iCloud accounts, flagging any matches against a database of known CSAM. If this happens, Apple will also notify the National Center for Missing & Exploited Children (NCMEC).
Alongside its CSAM image scanning, Apple will also add several safety features to users’ devices. For example, if a device signed in with a child’s Apple ID account receives sensitive content via the Messages app, it will display a warning before showing that content.
Apple will also update Siri’s responses to help parents and children who do not feel safe, and to warn potential offenders about the illegality of distributing CSAM.
To begin with, Apple’s image scanning tools will roll out on devices throughout the US. If you want a full breakdown of each of these features, check out our complete guide to Apple’s incoming child protection tools.
What Have Others Said?
The primary cause for controversy in Apple’s announcement was its image scanning intentions more than anything else. One high-profile critic was a member of the German parliament, the Bundestag, who penned a letter to Apple.
As reported by German outlet iFun, Manuel Höferlin of the Free Democratic Party addressed a letter to Apple CEO Tim Cook. In it, he praised the company for looking to combat abuse against children. But he also said:
The approach chosen by Apple however—namely CSAM scanning of end devices—is a dangerous one. Regardless of how noble your motives may be, you are embarking on a path that is very risky—not only for your own company. On the contrary, you would also be damaging one of the most important principles of the modern information society—secure and confidential communication. The price for this will most likely be paid not only by Apple, but by all of us.
A group of journalists from Switzerland, Austria, and Germany have also expressed their concerns. A press release by the Deutscher Journalisten-Verband (DJV) said:
In fact, this is also a tool with which a company wants to access other user data on their own devices, such as contacts and confidential documents. This is a danger to journalism and a clear violation of the European General Data Protection Regulation (GDPR), the ePrivacy Directive, and fundamental rights. Frank Überall, federal chairman of the German Association of Journalists (DJV), considers Apple’s plans to be only the first step.
Meanwhile, Will Cathcart, CEO of WhatsApp, took to Twitter to express his concerns. He published a lengthy thread in which he agreed that CSAM demands a tough response, but said that he doesn’t feel this is the correct approach.
How Did Apple React to This Outrage?
Apple’s choice of words probably wasn’t the best, and the company itself has acknowledged this. Not long after the announcement, the Wall Street Journal held an exclusive interview with Craig Federighi, Apple’s software chief.
In this interview, Federighi said that introducing the image scanning and protection in messages simultaneously was “a recipe for confusion”.
Federighi also said:
It’s pretty clear that a lot of messages got jumbled pretty badly…I do believe that the message that got out pretty early was, oh my god—Apple is scanning my phone for images! This is not what is happening.
The Apple software chief also clarified that the company is only looking for known illegal images uploaded to iCloud. He pointed out that other companies already analyze every photo stored in their clouds, and that Apple’s approach was designed to spot inappropriate content without going through every picture.
Why Were People Initially Angry About Apple's Announcement?
Apple managed to clarify its intentions pretty well. But in the high-speed world we live in, outrage spreads fast—and many corners of the internet were already up in arms before the WSJ interview’s release.
So, what explains the anger that so many people around the world directed at Apple? Below are three potential reasons.
1. Perceived Privacy Infringements
As Federighi said, many iPhone, iPad, and Mac users saw these scanning tools as a direct infringement on their privacy.
The images that users store on their iPhones and iPads in particular are often personal to them; they might contain fond memories or screenshots of inside jokes with friends. Naturally, the idea of a company going through these made a lot of customers uneasy.
In some instances, anger might also have stemmed from a betrayal of trust. Apple is often lauded for its emphasis on security and privacy, meaning that an announcement of this kind—regardless of its intention—could have come as a shock.
2. The Perception of Association
It goes without saying that combating CSAM is critical and that abuse against children is heinous in all circumstances. And while most people wouldn’t dream of committing such a crime, many Apple users might have felt that a rollout across all supported devices associated them with those who would.
If Apple had announced that it would implement these measures without scanning all users’ images, some who opposed the image scanning might not have been so critical.
3. Concerns About Corporate and Government Surveillance
Big Tech has come under increasing scrutiny for how companies use people’s data. Facebook is one prime example, with the 2018 Cambridge Analytica scandal thrusting this issue into the limelight. Google and Amazon have also found themselves in hot water, each receiving fines worth millions of dollars for GDPR non-compliance in France.
Statistics published by Pew Research Center in 2019 revealed that 79% of Americans were concerned about how companies use their data, while 64% felt the same about governments. Apple’s intention to scan images will have left users on edge as to how far big companies and governments can push privacy boundaries, as was visible in some of the critics’ comments earlier in this article.
Did Apple Take Things Too Far?
Apple’s plans to scan images and identify sensitive content were controversial, but the company clearly deems them necessary to combat the distribution of CSAM across its devices.
Scanning images isn’t a new concept. However, Apple’s announcement poses valid questions about privacy—and how far is too far.
Tackling the distribution of CSAM is crucial. Apple is unlikely to back down from scanning images, and most people have nothing to worry about in this respect. However, the company has also learned a valuable lesson about how it frames its messaging in the future.