Wednesday, October 30, 2019

Three Reasons Why Call Detail Records Analysis Is Not “Junk Science”




Since introducing our private sector clients to the impact that cellular call detail records (CDR) analysis & mapping can have on their cases, we’ve had a lot of robust discussions with litigators and clients about the veracity and value of this evidence.  CDR analysis has been used for decades in law enforcement to help prove or disprove the approximate location of criminal defendants in major crimes.  Only in the past several years have civil litigators and insurance companies also been introduced to the value this evidence can have in their cases and claims investigations.  In the time we’ve been conducting CDR analysis, we’ve worked on a variety of cases, from criminal prosecutions for smaller prosecutors’ offices, to domestic litigation to help prove or disprove cohabitation, to high-dollar insurance claims to help determine whether the claim and associated statements made under oath are verifiable with regard to location.  This specialty offshoot of digital forensics requires constantly updated knowledge of carrier practices, as well as specialized training and tools, to perform these analyses effectively.


However, mainly among the criminal defense bar, the notion has been put forth that CDR analysis may be “junk science” and therefore potentially unreliable as evidence in legal proceedings.  One high-profile case in which CDR analysis was used to obtain a conviction was State v. Adnan Syed, chronicled in the Serial podcast.  However, as more recent developments in that case have unfolded, the “junk science” claim doesn’t necessarily lie with the practice, but rather with the practitioner.  Indeed, even in computer forensics, certain vendors of forensic tools like to claim their tool has been “validated in court”, when in reality it is the examiner and their competence that needs to be validated in court.  The tool (or in this case, the cellular records) is just a dataset that needs to be analyzed competently to be introduced as evidence in a legal proceeding.

Toward the goal of establishing that CDR analysis is not “junk science”, here are three salient points that help debunk the myth that these records and their associated analysis are unworthy of evidentiary status.

Reason # 1:  Cellular Records Are “Pure” Evidence

What do we mean by “pure” evidence?  Consider for a moment other types of digital evidence that are analyzed for use in court, such as the cell phone itself or a computer system.  These items are generally affected by the user to a great degree and therefore can be open to some scrutiny about the weight and value they hold.  Cellular records are only available via court order or search warrant served on the cellular provider.  A Verizon Wireless customer cannot call customer support and ask for their cellular call detail records with historical cell site data.  The provider will not release this data absent legal process.  This means the user has very limited (if any) ability to manipulate the data, which makes the evidence about as pure as it gets.

Furthermore, the record-keeper has no vested interest in altering the evidence.  In fact, they have every reason to maintain better, more accurate records!  It is a fact within the cellular industry that CDRs were never meant to be used as evidence in legal proceedings.  CDRs are kept by cellular providers so they can log and analyze their own networks for efficiency and to increase overall customer experience on the network.  Simply put, the records are kept for customer service purposes and cellular companies don’t make money by having poor customer service.  It is a fortunate byproduct that these records may be obtained via legal process and analyzed for potential use in legal proceedings.  This is why cellular providers don’t maintain these records indefinitely, as detailed in our 2017 article Cellular Provider Record Retention Periods.

Name another type of digital evidence that the user never touches and to which they generally don’t have access!

Reason # 2:  Automated Tools Have Greatly Decreased The Human Error Factor 

Back in 1999, when the incident detailed in season 1 of the Serial podcast occurred, there were few, if any, automated tools with which to conduct CDR analysis.  In modern casework, we have many options for automated analysis, including CellHawk, ZetX, CASTviz, Map Link, Pen Link and others.  Use of automated tools can save time and greatly reduce error, but they come with a few warnings:

·      Not all tools are created equal.  If you’re using a tool that is free [to law enforcement], you’re generally getting what you pay for. 
·      Don’t rely on the tool to do all of the work.  Automated tools are great, but they cannot tell you if someone likely shut their phone off or sent a call to voicemail or left their phone in one location while committing an offense somewhere else.  Only manual analysis of the data and the behavior of the user can help verify these conclusions.

·      VALIDATE!  If an automated tool is telling you something, always refer back to the original record for validation.  If an automated tool is citing a GPS coordinate for a location, validate that there is actually a cell site at that location (see the sketch after this list).
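As an illustration of that validation step, the minimal sketch below (Python) cross-checks the tower coordinates reported by a mapping tool against the carrier’s own cell-site list.  The file names and column headers are hypothetical stand-ins; real tool exports and carrier returns vary by vendor and provider.

import csv

def load_carrier_sites(path):
    """Build a lookup of (lat, lon) pairs the carrier actually reports."""
    sites = set()
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            # Round to 5 decimal places (~1 meter) to absorb formatting noise
            sites.add((round(float(row["latitude"]), 5),
                       round(float(row["longitude"]), 5)))
    return sites

def flag_unverified(tool_path, sites):
    """Return tool rows whose coordinates match no known carrier cell site."""
    unverified = []
    with open(tool_path, newline="") as f:
        for row in csv.DictReader(f):
            key = (round(float(row["tower_lat"]), 5),
                   round(float(row["tower_lon"]), 5))
            if key not in sites:
                unverified.append(row)
    return unverified

if __name__ == "__main__":
    # Hypothetical exports: the carrier's cell-site list and the tool's output
    sites = load_carrier_sites("carrier_sites.csv")
    for row in flag_unverified("tool_output.csv", sites):
        print("VERIFY MANUALLY:", row)

Any row the script flags gets checked by hand against the original carrier record before it goes anywhere near a map exhibit.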


Reason # 3:  Trained, Experienced Analysts Don’t Deal in “Junk Science”

One of the traps digital forensic examiners of all ilks are susceptible to falling into is drawing conclusions not based on fact.  While it’s true that a trained, experienced professional may reach conclusions based upon device activity, those same conclusions have to be rooted in facts at some point.  The trap that sometimes rears its ugly head is reaching conclusions that are either outside of our expertise or not supported by the data. 

There are several traps documented in litigation over the life of CDR analysis in legal proceedings that have led to the claim of “junk science”.  Probably the biggest of these (and the one cited in the article linked above and again here) is drawing conclusions about cell site range.  As analysts, we are not cellular engineers, and we cannot engage in speculation or discussion about the “range” of a particular cell site.  This is why in most cases we approximate the location of the target device in the investigation and DO NOT get entwined in discussions about cell site range.  Even if we were fortunate enough to have propagation maps from the cellular provider detailing the effective/optimal range of a cell site, we still wouldn’t draw conclusions about range.  It is not within the expertise of most analysts to discuss range.  That is for a cellular engineer to conclude, not an analyst of cellular records.



There are behaviors and activity that the records can tell us about, however.  A trained analyst can usually tell if the phone was off, if a call was sent straight to voicemail or if the phone was left in one location for a prolonged period.  At the heart of the records is usage behavior.  Is there a pattern of behavior that is not adhered to during the time of the alleged incident?  Is there link analysis that can be done to confirm likely associates and/or accomplices?  If there are alleged accomplices, does normal text and/or call activity with that person cease during the time frame of the incident?  All of these items and more can help a trained, experienced analyst come to conclusions within a reasonable degree of certainty, but for most of these items, we require a larger dataset to compare the behavior at the time of the incident with the behavior at other times.  An analyst cannot identify these behaviors with 24 or 48 hours’ worth of records.  That is not enough data from which to draw conclusions about behavior.  This is also why we highly advise obtaining at least 30 days of records on either end of the incident, preferably more.  More data is better when it comes to CDR analysis.
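To make the baseline point concrete, here is a minimal sketch (Python) of the kind of comparison described above: it tallies daily event counts from a hypothetical CDR export (“cdr.csv” with an ISO-formatted “start_time” column -- real carrier layouts differ) and refuses to characterize the incident day against fewer than 30 baseline days.

import csv
from collections import Counter
from datetime import date, datetime
from statistics import mean, stdev

def daily_counts(path):
    """Tally calls/texts per day from a hypothetical CDR export."""
    counts = Counter()
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            counts[datetime.fromisoformat(row["start_time"]).date()] += 1
    return counts

def compare_to_baseline(counts, incident_day):
    """Compare the incident day's activity against all other days."""
    baseline = [n for d, n in counts.items() if d != incident_day]
    if len(baseline) < 30:
        raise ValueError("Fewer than 30 baseline days -- not enough data "
                         "to characterize normal usage behavior.")
    mu, sigma = mean(baseline), stdev(baseline)
    observed = counts.get(incident_day, 0)
    z = (observed - mu) / sigma if sigma else float("nan")
    return observed, mu, z

if __name__ == "__main__":
    counts = daily_counts("cdr.csv")
    observed, mu, z = compare_to_baseline(counts, date(2019, 10, 15))
    print(f"Incident day: {observed} events; baseline mean {mu:.1f}; z = {z:.2f}")

A raw count like this is only a starting point; the analyst still has to read the underlying records to explain any deviation it surfaces.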



The ultimate test of whether the conclusions based upon trained, experienced analysis of the records are “junk science” lies with the competence of the analyst.  Drawing conclusions not based in fact is what leads an otherwise valid form of data analysis to be dubbed “junk science”.

Wrapping It Up

In any forensic discipline, there is a possibility of human error or oversight.  We’re not infallible, after all, and we can’t be expected to be perfect all the time.  But CDR analysis is the one discipline in which the term “junk science” has been bandied about quite a bit.  Deeper inspection of the issues involved in each case where this claim has been made offers lessons for current and future analysts to read and heed.  It’s when our conclusions span beyond the breadth of our expertise and what the data tells us that we get into trouble.  Ultimately, everyone wants to see justice done.  If we can use CDR analysis successfully in litigation without reaching past our expertise into conclusions that are open to extreme scrutiny, justice will be served. 

Author:
Patrick J. Siewert
Principal Consultant
Professional Digital Forensic Consulting, LLC
Virginia DCJS #11-14869
Based in Richmond, Virginia
Available Wherever You Need Us!


We Find the Truth for a Living!

Computer Forensics -- Mobile Forensics -- Specialized Investigation
About the Author:
Patrick Siewert is the Principal Consultant of Pro Digital Forensic Consulting, based in Richmond, Virginia.  In 15 years of law enforcement, he investigated hundreds of high-tech crimes, incorporating digital forensics into the investigations, and was responsible for investigating some of the highest jury and plea bargain child exploitation investigations in Virginia court history.  Patrick is a graduate of SCERS, BCERT, the Reid School of Interview & Interrogation and multiple online investigation schools (among others).  He is a Cellebrite Certified Operator and Physical Analyst as well as certified in cellular call detail analysis and mapping.  He continues to hone his digital forensic expertise in the private sector while growing his consulting & investigation business marketed toward litigators, professional investigators and corporations, while keeping in touch with the public safety community as a Law Enforcement Instructor.
Email:  Inquiries@ProDigital4n6.com
Twitter: @ProDigital4n6

Saturday, June 1, 2019

Four Tips for Effective Forensic Report Writing




Digital forensics is a complicated field.  As mentioned in previous articles, much of what we do as forensic practitioners is break down very complicated & technical matters into basic concepts that stakeholders in our cases can easily understand.  In fact, if you ever take any of the Mac forensics courses taught by Sumuri, instructor Steve Whalen starts out by asking, “What is digital forensics?”  You’d be astonished how many digital forensic practitioners in the room cannot answer the question.  Is this because they never (or rarely) have to present their findings in court?  Perhaps.  But even before the case gets to court, there has to be effective documentation of the steps undertaken to reach findings and conclusions.  Without this documentation, it is very hard to justify or affirm the conclusions.

Recently, we worked a criminal defense case where the law enforcement digital forensic examiner’s report was frankly abysmal.  This is not good for law enforcement, public safety or the digital forensic community overall.  We will not call out the examiner or his agency; that would be unprofessional.  But in this article, we’ll relay some steps that can help make your forensic reports much more effective.  Whether the case is a criminal defense matter or a civil litigation domestic dispute, the report is your voice as an examiner and analyst, and it’s extremely difficult, if not impossible, to do a “take-back”.  After all, when people’s lives and/or livelihoods are on the line, don’t we all owe it to everyone involved to be thorough and accurate?


Tip # 1:  Know the Different Types of Reports

This seems basic, but it can often be confused by examiners, Counsel, judges and juries.  When explaining the different types of reports, we generally break it down like this:  there is the examiner’s narrative of the steps taken and a summary of the evidence and any conclusions.  This is the “Summary Report” (or narrative report).  The summary report refers to the forensic reports, which are generated by whichever forensic tools you’ve used in the case.  As most anyone who has been doing digital forensics for a while will attest, some forensic reports can run hundreds or thousands of pages, depending on the type of case, the number of items analyzed, the amount of data and other factors.

Furthermore, it’s important that the distinction between the two reports is clear.  When we receive a narrative with no heading, no dates, no details about basic case items and no real format to it, it is automatically confusing.  Even more so when this type of “report” is not accompanied by any forensic report generated by a forensic tool.  We have to be clear and concise.  Confusion is the enemy in digital forensics.  While this may be a tactic used by some to overload or misdirect the opposing party, that too is unprofessional.  If your methods and findings are solid, why should there be a need to purposely confuse, confound or misdirect the other side?

Tip # 2:  Be Accurate

In the case mentioned above, we received a narrative that didn’t detail basic items about the system in question or the tool(s) used.  These include:

·      Pictures of the examined item
·      Verification of system time
·      Operating system in use on the item
·      Version of forensic tool used to conduct the analysis
·      Detailed methods used for creating the forensic image

The last point proved to be rather important.  The forensic image of the Mac system was created in the .E01 format.  Normally, .E01 images are segmented into parts during the imaging process.  This one was not.  It was one large 265 GB .E01 file.  This was odd, but in and of itself not a big deal.  However, upon hashing the .E01 image that was provided, the hashes did not match the hash values in the log generated during the imaging process.  We still have no explanation for this, but there was missing data -- very important missing data.  One of the most frustrating things as an examiner is to have questions like this and no answers.  They can be huge or they can be inconsequential.  The problem is, we just don’t know because there is no accurate documentation.
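For reference, independent hash verification of an image against the acquisition log can be as simple as the sketch below (Python).  One caveat: for an .E01 container, the acquisition hash covers the embedded data stream rather than the container file itself, so E01 verification should be done with a tool such as ewfverify from libewf; this sketch applies to raw/dd images or to comparing any file against a logged value, and the file name and logged hash are placeholders.

import hashlib

def hash_image(path, chunk_size=1024 * 1024):
    """Compute MD5 and SHA-1 of a file in 1 MB chunks."""
    md5, sha1 = hashlib.md5(), hashlib.sha1()
    with open(path, "rb") as f:
        while chunk := f.read(chunk_size):
            md5.update(chunk)
            sha1.update(chunk)
    return md5.hexdigest(), sha1.hexdigest()

if __name__ == "__main__":
    logged_md5 = "<value from the imaging log>"  # placeholder
    md5, sha1 = hash_image("evidence.dd")        # hypothetical raw image
    print("MD5 :", md5, "(MATCH)" if md5 == logged_md5 else "(MISMATCH)")
    print("SHA1:", sha1)

A mismatch here is exactly the kind of finding that belongs in the summary/narrative report, along with the attempt to explain it.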

In the narrative/summary report, it was stated that no activity was present on the system for the date in question (paraphrased); therefore, it must have been wiped by CCleaner (more on that below).  However, a timeline analysis of the system indicated there was a great deal of activity on the system on the date in question.  At trial, the law enforcement examiner’s testimony was updated to say that “no files were created” on the system on that date, not that there was no activity.  There’s a big difference.  Accuracy is important!



Tip # 3:  Be Thorough

In the cited case, allegedly illicit images had been downloaded from the defendant by the police, but no images were found anywhere on the computer system.  That in itself is intriguing from a forensic perspective, and we were eager to see what the evidence showed.  One of our steps was to conduct a keyword search for unique items in the file name of the main image charged (there were only two downloads, neither of which was on the system).  We found several hits in a database for “PTHC”, which is a frequent term in file names of illicit images and was in the file name of the main charged image.  We documented this for Counsel and were prepared to testify about it (despite the fact it did not help the defendant).

The law enforcement examiner also conducted a keyword search for the same term, and the forensic report (which we obtained two days before trial -- also unprofessional) simply stated 5 hits in 5 files.  No additional detail about what the hits were or where they were found was contained in either the summary/narrative report or the forensic report.  Did they find the same things we found?  Did they even look beyond the hits?  Did they index the system prior to conducting the keyword search?  To be clear, we received dozens of hits for that string of letters after indexing and conducting our search, but as often happens, the false positives needed to be checked and weeded out and the relevant ones documented. 

The point of all this is that we had no idea what we were dealing with in regard to the “5 hits in 5 files”.  Further, the issue of validation of these hits was outstanding.  We found 5 keyword hits, but they were in 2 files, not 5.  The work was half done… or at least half documented. 
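The distinction between hits and files containing hits is easy to document.  The sketch below (Python) illustrates it over a hypothetical directory of exported text artifacts -- real examinations search indexed forensic images, but the reporting principle is the same: state both the total hit count and the distinct files involved.

import os

def keyword_hits(root, keyword):
    """Count total hits and the distinct files containing them."""
    hits, files_with_hits = 0, set()
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            try:
                with open(path, "r", errors="ignore") as f:
                    count = f.read().upper().count(keyword.upper())
            except OSError:
                continue
            if count:
                hits += count
                files_with_hits.add(path)
    return hits, sorted(files_with_hits)

if __name__ == "__main__":
    hits, files = keyword_hits("exported_artifacts", "PTHC")
    print(f"{hits} hits in {len(files)} files")  # e.g., "5 hits in 2 files"
    for path in files:
        print("   ", path)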

Tip # 4:  Your Conclusions Have to Make Sense

At issue in the cited case was the fact that no images were found on the suspect/defendant drive and CCleaner was also present on the system.  The law enforcement examiner’s narrative stated, “No images were found and CCleaner was installed on the system; therefore, the image(s) must have been wiped by the user using CCleaner” (paraphrased).  This conclusion was not supported by any evidence other than 1) no images found and 2) CCleaner found.  That’s it. 

This conclusion is at least potentially erroneous and is not backed by any other facts, analysis, evidence or documentation.  It is a digital forensic leap to say that just because a disk cleaning utility exists on a system and little or no evidence relevant to the charges was found, the disk cleaning utility must have been used to wipe the data.  Such an important hypothesis should be documented with logs, metadata, etc. in an attempt to prove or disprove it.  Absent that, it is conjecture and quite possibly coincidence.
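As a hedged illustration of the kind of metadata documentation that should back such a hypothesis, the sketch below (Python) walks a read-only mounted copy of the image, collects paths that look CCleaner-related and records their timestamps.  The mount point is hypothetical, and the presence of artifacts alone proves nothing; execution logs and timestamps are what the narrative should cite.

import os
from datetime import datetime, timezone

def ccleaner_artifacts(mount_root):
    """Collect paths and timestamps of files whose names reference CCleaner."""
    artifacts = []
    for dirpath, _dirnames, filenames in os.walk(mount_root):
        for name in filenames:
            if "ccleaner" in name.lower():
                path = os.path.join(dirpath, name)
                st = os.stat(path)
                artifacts.append((
                    datetime.fromtimestamp(st.st_mtime, timezone.utc),
                    st.st_size,
                    path,
                ))
    return sorted(artifacts)

if __name__ == "__main__":
    # Hypothetical read-only mount of the forensic image copy
    for modified, size, path in ccleaner_artifacts("/mnt/evidence"):
        print(modified.isoformat(), f"{size:>10}", path)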

Wrapping It Up

Our goal in this article is not to bash any one examiner or set of examiners.  We all make mistakes, and while the examples cited here have been seen sporadically through the years, they are fortunately not the norm in digital forensics.  The goal is to help avoid complacency, inaccuracy and sloppy report writing in the future.  We’re all in this to find the truth, wherever that may lead and to whomever’s benefit or detriment.  There’s an old saying that is drilled into police recruits’ heads in law enforcement basic training -- IF YOU DIDN’T WRITE IT DOWN, IT DIDN’T HAPPEN!  This is a great rule to live by when it comes to everything from note-taking to writing your final summary/narrative reports. 

What we do is important.  Many times, the methods we undertake and the conclusions at which we arrive can mean a long prison sentence for some or a loss of a great deal of money or custody of their children for others.  We owe it to the stake-holders in the case and to the digital forensic community to adhere to a high standard when issuing our findings.  It’s the best way to ensure that justice is served, no matter the case.

Author:
Patrick J. Siewert
Principal Consultant
Professional Digital Forensic Consulting, LLC
Virginia DCJS #11-14869
Based in Richmond, Virginia
Available Wherever You Need Us!


We Find the Truth for a Living!

Computer Forensics -- Mobile Forensics -- Specialized Investigation
About the Author:
Patrick Siewert is the Principal Consultant of Pro Digital Forensic Consulting, based in Richmond, Virginia.  In 15 years of law enforcement, he investigated hundreds of high-tech crimes, incorporating digital forensics into the investigations, and was responsible for investigating some of the highest jury and plea bargain child exploitation investigations in Virginia court history.  Patrick is a graduate of SCERS, BCERT, the Reid School of Interview & Interrogation and multiple online investigation schools (among others).  He is a Cellebrite Certified Operator and Physical Analyst as well as certified in cellular call detail analysis and mapping.  He continues to hone his digital forensic expertise in the private sector while growing his consulting & investigation business marketed toward litigators, professional investigators and corporations, while keeping in touch with the public safety community as a Law Enforcement Instructor.
Email:  Inquiries@ProDigital4n6.com
Twitter: @ProDigital4n6