Apple has been scanning iCloud Mail for CSAM since 2019


As Apple prepares to begin scanning iPhones and iPads for child sexual abuse material (CSAM) with the release of iOS 15, new details have emerged revealing that the company already scans iCloud Mail for CSAM.
According to a new report from 9to5Mac, the iPhone maker has been scanning iCloud Mail for CSAM since 2019, though it has not yet begun scanning iCloud Photos or iCloud backups for such material.
The news outlet decided to investigate the matter further after The Verge spotted an iMessage thread, surfaced by Epic’s lawsuit against Apple, in which the company’s anti-fraud chief Eric Friedman said that Apple is “the greatest platform for distributing child porn”.
While Friedman’s statement certainly wasn’t meant to be seen publicly, it raises the question of how Apple could know this without scanning iCloud Photos.
Scanning iCloud Mail
9to5Mac’s Ben Lovejoy reached out to Apple for more details, and the company confirmed that it has been scanning both outgoing and incoming iCloud Mail attachments for CSAM since 2019. Because email sent via iCloud Mail isn’t encrypted, scanning attachments as messages pass through Apple’s servers is straightforward.
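Apple hasn’t disclosed exactly how its scanning works, but server-side attachment checks of this kind are typically a hash-matching exercise: each attachment is fingerprinted and compared against a database of fingerprints of known CSAM. The sketch below is a minimal illustration of that idea, not Apple’s implementation; it uses a plain SHA-256 hash for simplicity, whereas production systems use perceptual hashes (such as Microsoft’s PhotoDNA) so that resized or re-encoded copies of an image still match. The `KNOWN_HASHES` set and function names here are hypothetical.

```python
import hashlib

# Hypothetical fingerprint database. Real systems are seeded with perceptual
# hashes of known CSAM supplied by child-safety organizations; the value
# here is a placeholder, not a real fingerprint.
KNOWN_HASHES: set[str] = {
    "0" * 64,  # placeholder entry
}

def scan_attachment(data: bytes) -> bool:
    """Fingerprint one attachment and check it against the known-hash list."""
    return hashlib.sha256(data).hexdigest() in KNOWN_HASHES

def scan_message(attachments: list[bytes]) -> bool:
    """Scan every attachment as the message transits the mail server."""
    return any(scan_attachment(a) for a in attachments)

# Example: an unencrypted message with two attachments passing through a server.
if scan_message([b"photo bytes...", b"pdf bytes..."]):
    print("Flag account for human review")
```

The key design point is that this only works because the mail is readable on the server; end-to-end encrypted content can’t be fingerprinted in transit, which is why Apple’s iOS 15 proposal moved matching onto the device itself.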
In addition to scanning attachments, Apple also told 9to5Mac that it does some limited scanning of other data, though the company did not specify what that data is. It did, however, say that this “other data” does not include iCloud backups.
Back in January 2020, Apple’s chief privacy officer Jane Horvath said at a tech conference that the company uses screening technology to look for illegal images and that it disables accounts if evidence of CSAM is found.
While Friedman’s statement initially sounded as though it was based on hard data, it likely wasn’t. Instead, he appears to have inferred that, because the company did not scan iCloud Photos or iCloud backups, more CSAM was likely to exist on Apple’s platform than on other cloud services that actively scan photos for it.
We’ll likely learn more about how Apple plans to combat CSAM on its platform once the company rolls out its Child Safety photo scanning with the release of iOS 15 this fall.
Via 9to5Mac