Apple's New Child Pornography Features Protect Kids But Reduce Our Privacy

These features identify sensitive images, but do so by disabling end-to-end encryption, thereby creating an opening for bad actors to gain access to your content and letting Apple moderators view your photos in iCloud.
Last updated on August 16, 2021
Apple's child pornography features protect our children while maintaining some privacy standards. The features are designed so Apple does not get access to images or messages.
Apple's SVP Software Engineering, Craig Federighi, talks about how the child pornography features work
In the video above, Apple's SVP of Software Engineering, Craig Federighi, describes how the new child pornography features work. First, Apple does not analyze anything on your iPhone, including messages or photos. A two-part analysis occurs only if you have chosen to back up photos to iCloud: the first half happens as a photo is being uploaded to iCloud, and the second half happens once the photo is on iCloud. Second, the analysis only compares an encrypted signature of a photo on iCloud against the encrypted signatures of known child pornographic photos in the Child Sexual Abuse Material (CSAM) database. Third, an alert is triggered only if more than 30 of these encrypted signatures of known child pornographic images are found in your iCloud. Thus, Apple never knows what photos or messages are on your iPhone, never sees the content of your photos or messages, and the system does not give hackers or government agencies a backdoor into your iPhone.
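The signature-comparison-plus-threshold logic described above can be sketched in a few lines. This is a deliberately simplified illustration, not Apple's actual protocol: the real system uses a perceptual hash (NeuralHash) and private set intersection with threshold secret sharing, whereas this sketch stands in a cryptographic hash and a plain counter. All function names here are invented for illustration.

```python
import hashlib

# Threshold described in the article: an alert requires MORE than
# 30 matches against the known-CSAM signature database.
MATCH_THRESHOLD = 30

def signature(image_bytes: bytes) -> str:
    # Stand-in for a perceptual hash; a real system uses a fuzzy
    # hash so near-identical images produce matching signatures.
    return hashlib.sha256(image_bytes).hexdigest()

def count_matches(uploaded_images, known_signatures) -> int:
    # Count how many uploaded photos match a known signature.
    return sum(1 for img in uploaded_images
               if signature(img) in known_signatures)

def should_alert(uploaded_images, known_signatures) -> bool:
    # Only the aggregate count past the threshold triggers review;
    # individual matches alone reveal nothing to Apple.
    return count_matches(uploaded_images, known_signatures) > MATCH_THRESHOLD
```

The design point the threshold illustrates: no single match is ever visible to a human reviewer; only an account that crosses the aggregate threshold is flagged at all.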
However, one fundamental element of privacy is informed consent: presenting people with information without bias or coercion. Many tech companies, for example, coerce users into adopting new features in order to gain access to their private information.
Steve Jobs put it even more simply: privacy means people know what they're signing up for. By this definition, Apple broke its own privacy principles.
Steve Jobs interviewed by Walt Mossberg at the D8 conference.
At the 2:20 mark in this video, Walt Mossberg asked Steve Jobs if Apple would be moving into cloud-based technologies that could have privacy implications. Steve Jobs replied simply, "Privacy means people know what they're signing up for. In plain English."
This is because Apple moderators view the flagged images. Apple did not clearly disclose that its employees review images stored in iCloud once those images are flagged by a fuzzy matching algorithm, which means many people will have their private photos viewed when they do not expect it.
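The "fuzzy" in fuzzy matching is worth pausing on, because it is why human review exists at all. A common way to build such a matcher (assumed here for illustration; Apple's NeuralHash differs in its details) is a perceptual hash, where visually similar images map to bit strings that differ in only a few bits, and a "match" means the bit difference falls under some cutoff. The names and the 5-bit cutoff below are invented for this sketch.

```python
def hamming_distance(a: int, b: int) -> int:
    # Number of bit positions where two hashes differ.
    return bin(a ^ b).count("1")

def fuzzy_match(hash_a: int, hash_b: int, max_bits: int = 5) -> bool:
    # Unlike an exact cryptographic hash comparison, a perceptual
    # match tolerates small differences, so a resized or slightly
    # edited copy of a known image still matches; the same tolerance
    # is what makes false positives, and thus human review, possible.
    return hamming_distance(hash_a, hash_b) <= max_bits
```

The tolerance that lets the system catch a re-compressed copy of a known image is the same tolerance that can occasionally flag an innocent photo, which is why flagged accounts are routed to human moderators rather than reported automatically.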
Thus, Apple broke one key element of iPhone privacy: end-to-end encryption. By scanning photos uploaded to iCloud and matching them against a database, they have broken this encryption, which theoretically creates an entry point for hackers, law enforcement, or government agencies to inspect people's private photos.
While Apple's new features do provide added protections for families with young children, they have crossed a privacy line by at least one definition.