Apple says its iCloud scanning will rely on multiple child safety groups to address privacy fears

Apple has filled in more details around its upcoming plans to scan iCloud Photos for child sexual abuse material (CSAM) via users’ iPhones and iPads. The company released a new paper delving into the safeguards it hopes will increase user trust in the initiative. That includes a rule to only flag images found in multiple child safety databases with different government affiliations — theoretically stopping one country from adding non-CSAM content to the system.
Apple’s upcoming iOS and iPadOS releases will automatically match US-based iCloud Photos accounts against known CSAM from a list of image hashes compiled by child safety groups. While many companies scan cloud storage services remotely, Apple’s device-based strategy has drawn sharp criticism from some cryptography and privacy experts.
The paper, called “Security Threat Model Review of Apple’s Child Safety Features,” hopes to allay privacy and security concerns around that rollout. It builds on a Wall Street Journal interview with Apple executive Craig Federighi, who outlined some of the information this morning.
In the document, Apple says it won’t rely on a single government-affiliated database — like that of the US-based National Center for Missing and Exploited Children, or NCMEC — to identify CSAM. Instead, it will only match pictures that appear in the databases of at least two groups with different national affiliations. The goal is that no single government has the power to secretly insert unrelated content for censorship purposes, since that content wouldn’t appear in any other group’s database and so would never match.

Apple has referenced the potential use of multiple child safety databases, but until today, it hadn’t explained the overlap system. In a call with reporters, Apple said it’s only naming NCMEC because it hasn’t yet finalized agreements with other groups.
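Apple hasn’t published code for this check, but the overlap requirement amounts to an intersection across the groups’ hash lists. The sketch below is only an illustration of that idea; the database names, hash values, and the two-source minimum are assumptions, not Apple’s implementation.

```python
# Rough sketch of the overlap rule: only hashes supplied by at least two
# groups with different national affiliations become matchable on devices.
# Database names and hash values here are invented for illustration.
from typing import Dict, Set

def build_matchable_hashes(databases: Dict[str, Set[str]]) -> Set[str]:
    """Keep only hashes that appear in two or more independent databases,
    so no single group can unilaterally add an entry to the on-device list."""
    matchable = set()
    for h in set().union(*databases.values()):
        sources = sum(1 for hashes in databases.values() if h in hashes)
        if sources >= 2:
            matchable.add(h)
    return matchable

# Hypothetical example: only the hash both groups supply survives.
databases = {
    "ncmec_us": {"a1b2c3", "d4e5f6"},
    "second_group_elsewhere": {"d4e5f6", "778899"},
}
print(build_matchable_hashes(databases))  # {'d4e5f6'}
```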
The paper confirms a detail Federighi mentioned: initially, Apple will only flag an iCloud account if it identifies 30 images as CSAM. This threshold was picked to provide a “drastic safety margin” to avoid false positives, the paper says — and as it evaluates the system’s performance in the real world, “we may change the threshold.”
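To get a feel for why a 30-image threshold leaves such a large margin, here is a back-of-the-envelope calculation. The per-image false match rate and library size are hypothetical placeholders, not figures from Apple’s paper.

```python
# Hypothetical numbers only: assume a library of 100,000 photos and a
# one-in-a-million chance that any single photo falsely matches a hash.
from math import comb

def prob_at_least_k_false_matches(n_photos: int, per_image_rate: float, k: int) -> float:
    """Chance that k or more photos falsely match, treating each photo as an
    independent trial (a simplifying assumption)."""
    prob_fewer = sum(
        comb(n_photos, i) * per_image_rate**i * (1 - per_image_rate)**(n_photos - i)
        for i in range(k)
    )
    return 1.0 - prob_fewer

print(prob_at_least_k_false_matches(100_000, 1e-6, 1))   # ~0.095: one false match is plausible
print(prob_at_least_k_false_matches(100_000, 1e-6, 30))  # prints 0.0: below double precision
```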
It also provides more information on an auditing system that Federighi mentioned. Apple’s list of known CSAM hashes will be baked into iOS and iPadOS worldwide, although the scanning system will only run in the US for now. Apple will provide a full list of hashes that auditors can check against child safety databases, another way to verify that it isn’t secretly matching additional images. It also says it will “refuse all requests” for moderators to report “anything other than CSAM materials” for accounts that get flagged, a nod to concerns that the system could be repurposed for other kinds of surveillance.
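That kind of audit depends on everyone being able to confirm they are looking at the same hash list. One simple way to do that, sketched below with an invented function name and data shape, is to publish a single digest of the shipped list; the paper doesn’t spell out Apple’s exact mechanism beyond the list being auditable.

```python
# Hedged sketch: a single digest ("fingerprint") of the on-device hash list
# lets an auditor and an end user confirm they are inspecting the same list.
# The function name and data shape are invented for illustration.
import hashlib
from typing import Iterable

def hash_list_fingerprint(on_device_hashes: Iterable[str]) -> str:
    """Digest the sorted list so the same contents always produce the same
    fingerprint, regardless of the order the hashes were shipped in."""
    digest = hashlib.sha256()
    for h in sorted(on_device_hashes):
        digest.update(h.encode("utf-8"))
    return digest.hexdigest()

# An auditor who has verified the list against the source databases can
# publish this fingerprint; anyone can recompute it from their own device.
print(hash_list_fingerprint(["d4e5f6", "a1b2c3"]))
```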
Federighi acknowledged that Apple had introduced “confusion” with its announcement last week. But Apple has stood by the update itself: it told reporters that although it’s still finalizing and iterating on the details, it hasn’t changed its launch plans in response to the past week’s criticism.