Tuesday, August 10, 2021

Apple’s New CSAM Detection Policy Analysis





Several times a year, a current event or topic strikes a chord both inside and beyond the digital forensic community.  We’ve discussed such topics in previous articles with regard to the Carpenter v. US decision and Apple’s previous spat with the FBI in the wake of the San Bernardino, CA terrorist attack.


Apple is no stranger to these current-event discussions (i.e., controversy) when it comes to matters of privacy and cooperation with law enforcement.  Last week, the company dropped another “bombshell”: in a forthcoming US-based update, users’ on-device photos will be subjected to hash analysis in an attempt to detect known child sex abuse material (CSAM) being uploaded to iCloud, and that information will be forwarded for follow-up to the National Center for Missing & Exploited Children (NCMEC) or another law enforcement investigative entity.  Here, we’ll discuss how this works and explore both sides of the issue.







How CSAM Detection Works


It is common practice for internet (or electronic) service providers (ISPs/ESPs) to work in conjunction with law enforcement to detect known CSAM images.  The first question, naturally, is: what is a known CSAM image?  Simply put, an image becomes known CSAM when a victim in the image has been positively identified.  This routinely comes through the investigation of new/unknown images and positive identification of the children depicted in them.  Oftentimes, the images belong to a series depicting child sex abuse of one or more victims in the same setting, with the same abuser, and around the same time frame.  Entities such as NCMEC, DHS and the FBI maintain databases of the file hashes of these images for use by law enforcement in investigations.  These file hash values are used to track down purveyors of child pornography across peer-to-peer networks, as well as those who upload and share CSAM images to cooperating ESPs such as Dropbox, Gmail, Yahoo, Kik, etc.
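To make the mechanics concrete, below is a minimal sketch of hash-set matching.  It assumes a hypothetical set of known hash values and a local folder of files; the MD5 algorithm, the placeholder hash strings, and the ./uploads path are illustrative only, and real ESP pipelines and hash databases (which also rely on perceptual hashing such as PhotoDNA) operate at far larger scale.

```python
import hashlib
from pathlib import Path

# Illustrative placeholders only; real known-CSAM hash databases are
# far larger and tightly access-controlled.
KNOWN_HASHES = {
    "5d41402abc4b2a76b9719d911017c592",
    "7d793037a0760186574b0282f2f435e7",
}

def md5_of_file(path: Path) -> str:
    """Compute the MD5 hash of a file, reading it in 1 MB chunks."""
    h = hashlib.md5()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(1024 * 1024), b""):
            h.update(chunk)
    return h.hexdigest()

def find_matches(directory: str) -> list[Path]:
    """Return files whose hash appears in the known-hash set."""
    return [
        p for p in Path(directory).rglob("*")
        if p.is_file() and md5_of_file(p) in KNOWN_HASHES
    ]

if __name__ == "__main__":
    for match in find_matches("./uploads"):
        print(f"Known-hash match: {match}")
```

In this simplified model, a match is purely a lookup of an exact file hash against the known set; it says nothing about new or previously unseen images.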


Once a known CSAM image has been identified by the ESP via hash value, the offending party’s account information, along with the specific date/time and manner of upload, is provided to NCMEC as an investigative lead.  NCMEC then performs a variety of open-source intelligence gathering on the offending party and provides the information to law enforcement in what is known as a Cybertip.  An affiliated or cooperating law enforcement agency receives the Cybertip for investigative follow-up, which can include a knock-and-talk, a search warrant, additional investigation, or a combination of these (or other) investigative methods.  Some Cybertips go nowhere.  Others, like one I worked while a member of law enforcement, do not support a search warrant but end in a knock-and-talk, a consent search, and a confession from the offending party to not only possessing CSAM, but also molesting his 10-year-old stepdaughter.  The value of Cybertips in the hands of properly trained investigators cannot be overstated.


Whether or not you realize it, you explicitly agree to this process in the End User License Agreement (EULA) when you use any of these services.



One Side of the Argument: Privacy


Privacy is understandably an important issue for users of technology across the spectrum.  Apple has traditionally been very privacy-centric in its practices, including its refusal to help the FBI unlock the iPhone belonging to a terrorist couple who killed several people.  The argument on the privacy side is that, by Apple’s own statement, the company is installing an agent on iDevices to subject photos to a hashing algorithm and then alert law enforcement if a “threshold” of CSAM content is discovered.  This goes a half-step further than what Google, Yahoo or Kik do to detect CSAM content: those providers scan content after it has been uploaded to their servers and report suspected CSAM to NCMEC.  Apple is detecting content on individual devices and reporting to NCMEC suspected CSAM bound for iCloud once it meets the “threshold.”
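For illustration, here is a toy sketch of the threshold idea only, assuming a hypothetical limit of 30 matches; Apple’s published design actually relies on perceptual hashing (NeuralHash) and cryptographic “safety vouchers,” so matching is not a simple plaintext lookup, and none of that machinery is modeled here.

```python
from dataclasses import dataclass, field

# Hypothetical value for illustration only; Apple had not published its
# actual threshold at the time of writing.
MATCH_THRESHOLD = 30

@dataclass
class AccountMatchState:
    """Tracks known-hash matches for a single account's iCloud uploads."""
    account_id: str
    matched_hashes: set[str] = field(default_factory=set)

    def record_upload(self, image_hash: str, known_hashes: set[str]) -> bool:
        """Record one upload; return True only once the threshold is crossed.

        Below the threshold nothing is reported; that is the entire point
        of requiring multiple independent matches before any review.
        """
        if image_hash in known_hashes:
            self.matched_hashes.add(image_hash)
        return len(self.matched_hashes) >= MATCH_THRESHOLD
```

In this simplified model, only when record_upload returns True would any human review or report to NCMEC be triggered.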




In some spirited discussion about this new policy on LinkedIn recently, it was pointed out that this may violate a user’s 4th Amendment rights against unreasonable search & seizure, because Apple is acting as a would-be agent of law enforcement and “searching” people’s photos on their devices without a warrant.  We’ll discuss this a bit further in the next section, but it is a valid argument on its face.  As with most things in life, though, the devil is in the details.


Another point brought up in discussion was that Apple is demonstrating it has the technology to scan files on your device generally, which could become more of a concern in the future.  This is very powerful technology, capable of reaching into people’s devices without their express knowledge and potentially providing information to a third party, possibly for criminal investigation.  Furthermore, it opens the door for a would-be hacking entity to exploit this new capability on a much wider scale.  The security and data privacy implications may only be in their early stages.


The Other Side of the Argument: Child Safety


As a former law enforcement investigator on the Internet Crimes Against Children Task Force, I can assure you that the proliferation of CSAM images across the internet is a real problem.  While doing this work in the private sector as well, I’m sometimes asked how law enforcement knows the images are of children and not simply people in their upper teens who could easily be mistaken for someone over 18.  They know.  The most egregious examples of CSAM involve infants and toddlers in sexually exploitative scenarios that no investigator can ever un-see.  These images are, by and large, not questionable in age or physical development.  They are small children, sometimes even babies.  Are there exceptions to this?  Yes.  Teens who possess smart devices also possess the ability to make their own images and share them with whomever they wish, particularly if their parent/guardian is not tech-savvy or they have not been taught that the impact of a decision to share explicit photos can be long-lasting when it comes to the internet.  These images sometimes become part of criminal investigations as well, and they can be added to the CSAM database if the correct criteria are met and procedures are followed.


So why is Apple’s decision generally a good one for law enforcement?  The approach investigators take with CSAM images is that a child is victimized every time these images are viewed or shared.  Law enforcement investigation and intervention are therefore needed to help stem the flow of these images across the internet and decrease child sexual exploitation.  Apple’s cooperation in this mission opens a door that previously required some other third party to alert NCMEC to images potentially stored on a device.  Additionally, with the proliferation of human trafficking, particularly of missing children, Apple’s new policy gives law enforcement another tool in the proverbial toolbox to help track down and locate missing children.




The argument that this practice goes against 4th Amendment protections against unreasonable search & seizure is a compelling one.  But there are some arguments to the contrary.  First, no one has a Constitutional right to own an iPhone.  If you don’t like Apple’s new policy, switch to Android.  Second, whenever we get a new iDevice (iPhone, iPad, iPod, etc.), we are prompted to “Agree” to Apple’s terms of service in the EULA, and that agreement holds for as long as we use their software and hardware.  It is within Apple’s discretion to change or update the terms of this policy, as it has done recently with the implementation of this new practice.  In short, we gave them permission to do this.  Finally, an argument can be made that while Apple’s practice for detecting CSAM images goes a half-step beyond other ESPs, the protection of children from sexual exploitation is worth whatever freedom we give up.  There is a tried & true adage that liberty and security rarely go hand-in-hand.  We frequently give up liberty for security.  Been to the airport lately?


Finally, the argument was also raised that this new process could misidentify someone as a trader of CSAM images.  If we take Apple’s statement at face value, there is a one in “one trillion chance per year” of this happening to a given account, meaning the risk is vanishingly small.  I’m also quite certain the army of legal counsel at Apple has thoroughly reviewed and signed off on this practice.
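To see why a match threshold makes that kind of figure plausible, here is a minimal back-of-the-envelope sketch.  The per-image error rate, photo count, and threshold below are all hypothetical assumptions (Apple has published none of them), so the point is the shape of the math, not the specific numbers.

```python
import math

# All figures here are assumptions for illustration only; Apple has not
# published its per-image error rate or, at the time of writing, its threshold.
p = 1e-6        # hypothetical per-image false-match probability
n = 10_000      # hypothetical number of photos an account uploads in a year
threshold = 30  # hypothetical number of matches required before any report

def log_binom_pmf(k: int, n: int, p: float) -> float:
    """Natural log of the Binomial(n, p) probability mass at k."""
    return (math.lgamma(n + 1) - math.lgamma(k + 1) - math.lgamma(n - k + 1)
            + k * math.log(p) + (n - k) * math.log1p(-p))

# P(at least `threshold` false matches); the tail decays so quickly that
# summing a few hundred terms past the threshold is more than enough.
prob_false_flag = sum(
    math.exp(log_binom_pmf(k, n, p))
    for k in range(threshold, min(n, threshold + 500) + 1)
)

print(f"Chance of falsely crossing the threshold: {prob_false_flag:.3e}")
```

With these made-up inputs, the chance of an innocent account accumulating enough false matches to cross the threshold lands many orders of magnitude below one in a trillion, which is the general effect a threshold is designed to produce.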




Wrapping It Up


This is a hot topic that won’t soon go away.  Apple has caught heat many times for many different approaches over the years, and this is just the latest measure to garner such attention.  It is ultimately up to us as consumers of Apple products to decide… Do we give up a tiny bit of privacy (for now) in furtherance of the mission to stem the flow of child sex abuse images, or do we value the privacy of our devices’ content more?  It’s a personal choice and, fortunately, we still have the freedom to choose in the United States!


NOTE:  Article with additional information published on 8/9/2021:  https://www-bbc-com.cdn.ampproject.org/c/s/www.bbc.com/news/technology-58145943.amp 


Author: 

Patrick J. Siewert

Founder & Principal Consultant

Professional Digital Forensic Consulting, LLC 

Virginia DCJS #11-14869

Based in Richmond, Virginia

Available Wherever You Need Us!



We Find the Truth for a Living!


Computer Forensics -- Mobile Forensics -- Specialized Investigation

About the Author:

Patrick Siewert is the Founder & Principal Consultant of Pro Digital Forensic Consulting, based in Richmond, Virginia (USA).  In 15 years of law enforcement, he investigated hundreds of high-tech crimes to precedent-setting results and continues to support litigation cases and corporations in his digital forensic practice.  Patrick is a graduate of SCERS & BCERT, holds several vendor-neutral and vendor-specific certifications in the field of digital forensics and high-tech investigation, and is a court-certified expert witness.  He continues to hone his digital forensic expertise in the private sector while growing his consulting & investigation business serving litigators and their clients, professional investigators, and corporations, and keeping in touch with the public safety community as a Law Enforcement Instructor.

Email:  Inquiries@ProDigital4n6.com

Web: https://ProDigital4n6.com

Pro Digital Forensic Consulting on LinkedIn: https://www.linkedin.com/company/professional-digital-forensic-consulting-llc

Patrick Siewert on LinkedIn:  https://www.linkedin.com/in/patrick-siewert-92513445/