Apple's New CSAM-Detection Features Protect Kids but Reduce Our Privacy
These features identify sensitive images, but they do so by breaking end-to-end encryption, creating an opening for bad actors to gain access to your content and letting Apple moderators view flagged photos in your iCloud library.
Aug 16, 2021
Apple's CSAM-detection features protect children while maintaining some privacy standards: they are designed so that Apple never gains direct access to your images or messages.
Apple's SVP of Software Engineering, Craig Federighi, talks about how the CSAM-detection features work
In the video above, Apple's SVP of Software Engineering, Craig Federighi, describes how the new CSAM-detection features work.
First, Apple does not analyze content that stays only on your iPhone, whether messages or photos. A two-part analysis occurs only if you have chosen to back up photos to iCloud: the first half happens on the device as a photo is uploaded, and the second half happens once the photo is on iCloud.
Second, the analysis only compares an encrypted signature of each uploaded photo against the encrypted signatures of known images in the Child Sexual Abuse Material (CSAM) database.
Third, an alert is triggered only if more than 30 of these encrypted signatures of known CSAM images are found in your iCloud library.
Thus, Apple never knows what photos or messages are on your iPhone, it never sees the content of unmatched photos or messages, and the system does not give hackers or government agencies a backdoor into your iPhone.
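The threshold logic described above can be sketched in simplified form. This is an illustrative sketch only: the function names and the exact-match SHA-256 hash are assumptions for clarity. Apple's actual system uses a perceptual hash (NeuralHash), which matches visually similar images rather than identical bytes, combined with cryptographic techniques (private set intersection and threshold secret sharing) so that Apple learns nothing about an account below the match threshold.

```python
import hashlib

MATCH_THRESHOLD = 30  # Apple's stated review threshold


def signature(data: bytes) -> str:
    # Stand-in for Apple's perceptual NeuralHash; SHA-256 is used here
    # only for illustration and matches identical bytes, not similar images.
    return hashlib.sha256(data).hexdigest()


def flag_for_review(photo_signatures, known_csam_signatures) -> bool:
    # An account is flagged only when the number of matching signatures
    # exceeds the threshold, never on a single match.
    matches = sum(1 for s in photo_signatures if s in known_csam_signatures)
    return matches > MATCH_THRESHOLD
```

The key design point the article describes is that a handful of matches, including false positives from the fuzzy matching, can never trigger human review on their own; only crossing the 30-match threshold does.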
However, a fundamental element of privacy is presenting people with clear information, free of bias or coercion. Many tech companies, for example, coerce users into adopting new features in order to gain access to their private information.
Steve Jobs himself said that privacy means people know what they're signing up for. By this simple definition, Apple broke its own privacy principles.
Steve Jobs interviewed by Walt Mossberg at the D8 conference.
At the 2:20 mark in this video, Walt Mossberg asked Steve Jobs if Apple would be moving into cloud-based technologies that could have privacy implications. Steve Jobs replied simply, "Privacy means people know what they're signing up for. In plain English."
How so? Apple moderators view the flagged images. Apple did not clearly disclose that its employees review images stored in iCloud once a fuzzy matching algorithm flags them, which means many people will have private photos viewed when they do not expect it.
Thus, Apple broke one key element of iPhone privacy: end-to-end encryption. By scanning photos uploaded to iCloud and matching them against a database, it has broken that encryption, which theoretically creates an entry point for hackers, law enforcement, or government agencies to inspect people's private photos.
While Apple's new features do add protections for families with young children, the company has crossed a privacy line by at least one definition.