Working with Email in Discovery: Processing Options and Review Workflows

Featured

Introduction

Technologies that allow for easier review of eDiscovery in native format have become more affordable and accessible. Working with files in native format has several advantages, including avoiding the loss of potentially relevant information, access to metadata, and better searchability. Email is one of the most common native formats produced in discovery. This article explores some approaches for processing email and identifies a number of low-cost tools that can assist. (This article deals with the processing, but not the substantive review, of emails for case analysis – for that you should consider other tools such as CaseMap or, for larger collections of emails, review platforms such as Casepoint or IPRO.)

The tools and approaches you select will depend on a combination of three factors: (1) volume, (2) format(s), and (3) the defense team's goals. While a single tool might accomplish a discrete goal, more involved goals may require different approaches with a combination of tools. These scenarios can be ends in themselves or phases in an overall workflow. This article does not try to anticipate every possible situation that might arise but will explore a few common scenarios.

Many electronic file formats produced in the course of discovery, like Acrobat, Excel, and Word files, are generally accessible via standard software available on most computers. However, email file formats like MSG, EML, PST, and MBOX files present more of a challenge, as the recipient often may not know how to access them.

Below is a quick overview of some of the most common email file formats encountered in eDiscovery that will be discussed in this article:

  • MSG: A Microsoft format for single emails. Often associated with the Microsoft Outlook email client.
  • PST: A Microsoft format for a collection of emails (as well as other potential items including: Calendars, Contacts, Notes and Tasks). Often associated with the Microsoft Outlook email client.
  • EML: Email format for single emails used by many email clients including Novell GroupWise, Lotus Notes, Windows Mail, Mozilla Thunderbird, and Postbox.
  • MBOX: Email format for a collection of emails (as well as other potential items including: Calendars, Contacts, Notes and Tasks) used by many email clients including Novell GroupWise, Lotus Notes, Windows Mail, Mozilla Thunderbird, and Postbox.

All four formats are commonly received in discovery and subpoena returns. Google Takeout, a service offered by Google that allows you to download your email, produces emails in the MBOX format.

Working with these email formats consists of understanding which tool is compatible with which file format, and which tool or set of tools will most effectively allow you to achieve your goals. Below is a table that maps out some of the various tools available in terms of which file formats they are able to process, their functionality, and their cost. Before using any of these tools, make sure to work with a copy of the data as opposed to the original.

Software | Compatible Formats | Cost | Functionality
Mozilla Thunderbird with the Import Export Tools add-on | EML, MBOX | Free | View emails; convert to EML, HTML, MBOX and PDF (without attachments)
Mbox Viewer | EML, MBOX | Free | View emails; convert to HTML or PDF (without attachments)
PSTViewer Pro | MSG, EML, PST, MBOX | $129 | View emails; convert to multiple formats including EML, HTML, MBOX and PDF (includes advanced PDF attachment image options)
MS Outlook | MSG, EML, PST | $159 or $69.99 per year | View emails; export to MSG, PST and PDF (requires Acrobat integration)
Aid4Mail | MSG, EML, PST, MBOX | $299 per year | Convert email to multiple formats including MSG, HTML, EML, PST, MBOX and PDF
dtSearch | MSG, EML, PST, MBOX | $199 or free* | Search and view results in email viewer panel (no conversion or export options)

*For information about a free license of dtSearch available to CJA Panel Attorneys see: nlsblog.org/2014/03/25/dtsearch-desktop

This article will demonstrate how to work with emails in terms of a series of discrete tasks, including:

  1. Generating a list of emails to review.
  2. Viewing emails.
  3. Searching, tagging, and converting emails.
  4. Working with email attachments.

1. Generating a list of emails for review.
An initial task at the outset of a case might be to generate an index to facilitate early case assessment. Some programs, like PSTViewer Pro, will work with many formats, while other programs, like Mbox Viewer, work with a more limited number of formats.

  • Example 1 – Generating a list using Mbox Viewer:
    Mbox Viewer is a free tool that allows you to preview emails and generate a list of emails by selecting messages in the viewer, right-clicking and selecting the print-to-CSV option, then choosing which fields you would like to include in the spreadsheet (Figure 1-1).
Figure 1-1
  • The resulting CSV file contains a table that can be opened in Excel or imported into other programs (Figure 1-2). (A scripted alternative for MBOX collections is sketched below.)
Figure 1-2
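For readers comfortable with a little scripting, a similar index can be generated outside any viewer. The following is a minimal sketch using Python's built-in mailbox and csv modules to write a simple spreadsheet of header fields from an MBOX file; the file paths are hypothetical placeholders, and encoded headers will appear in their raw form unless further decoded.

```python
import csv
import mailbox

# Hypothetical paths -- always point the script at a working copy, not the original data.
MBOX_PATH = "evidence_copy/takeout.mbox"
CSV_PATH = "email_index.csv"

with open(CSV_PATH, "w", newline="", encoding="utf-8") as out:
    writer = csv.writer(out)
    writer.writerow(["Date", "From", "To", "Subject"])
    for message in mailbox.mbox(MBOX_PATH):
        # Missing headers fall back to empty strings; encoded headers appear in raw form.
        writer.writerow([
            message.get("Date", ""),
            message.get("From", ""),
            message.get("To", ""),
            message.get("Subject", ""),
        ])
```

The resulting CSV can be opened in Excel and sorted or filtered the same way as the export produced by Mbox Viewer.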

2. Viewing emails.
While a list will provide you with a high-level overview of the emails you have in terms of subject matter, players involved, and so forth, a closer review will require a different approach. MS Outlook, Mbox Viewer, and Mozilla Thunderbird can all be used for this purpose.

  • Example 2.1 – Viewing emails received in PST format using MS Outlook:
    Within Outlook, open the ‘File’ menu, select the ‘Open & Export’ button, then ‘Open Outlook Data File’. Navigate to the folder containing the PST file (Figure 2-1) and select the file to import. Outlook will create a folder within ‘Personal Folders’ from which you can conduct a review of the files.
Figure 2-1
  • Example 2.2 – Viewing emails received in MBOX format using Mozilla Thunderbird with the Import Export Tools add-on:
    The free ‘Import Export Tools’ add-on available for Mozilla Thunderbird allows for the import and viewing of MBOX files. After the add-on has been installed, right-click on ‘Local Folders’, then choose ‘Import mbox file’ from the ‘ImportExportTools NG’ menu and navigate to the folder containing the MBOX file (Figure 2-2). This will copy the MBOX file into Thunderbird’s ‘Local Folders’ where, similar to Outlook, you can conduct a review of the emails within.
Figure 2-2

3. Search, tag, and convert emails
The approaches discussed in the two previous sections can be useful when you simply want to gain a high-level view of the emails, or take a closer look at particular emails in a smaller collection. However, when you are working with large volumes of emails, manual review becomes impractical and inefficient, and taking advantage of the search and tag functionality of the available tools is a better approach.

  • Example 3 – Searching, tagging and exporting within MS Outlook:
    Outlook can be used to conduct keyword searches, and relevant files can be tagged and exported as either MSG or PDF files (using the Acrobat integration that is included with licensed copies of Acrobat Standard and Pro). To tag an email, right-click and select ‘Categories’, then select a color-coded tag (Figure 3-1). You can also customize the tags using the ‘New Category’ option within the ‘Category’ dialog box (Figure 3-2).
Figure 3-1
Figure 3-2
  • You can then filter and tag a selection of emails (Figure 3-3) and save them to a folder as either individual MSG files or a new PST file. If you have a licensed version of Adobe Acrobat, the Acrobat integration menu within Outlook can be used to convert messages into individual PDFs or a combined ‘PDF Portfolio’ (Figure 3-4).
Figure 3-3
Figure 3-4
  • When choosing an export format, be aware of the limitations of the different conversion formats. The HTML and PDF export formats typically will not include the complete email metadata, and email header information that may include important details like the IP addresses used may be lost during conversion. Export formats such as MSG, EML, MBOX, and PST retain much more of the original email metadata. For MBOX collections, simple keyword searches can also be run with a short script outside any of these tools, as sketched below.
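As a rough, scripted alternative to the viewer-based searching described above, the following Python sketch runs a simple keyword search across an MBOX file using only the standard library. The path and keywords are hypothetical, and the search is far less sophisticated than dtSearch or Outlook's indexing; it is only meant to illustrate the concept.

```python
import mailbox

MBOX_PATH = "evidence_copy/custodian.mbox"   # hypothetical path to a working copy
KEYWORDS = ["wire transfer", "invoice"]      # hypothetical search terms

mbox = mailbox.mbox(MBOX_PATH)
for key, message in mbox.iteritems():
    # Build a simple searchable string from the subject plus any plain-text body parts.
    text = str(message.get("Subject", ""))
    for part in message.walk():
        if part.get_content_type() == "text/plain":
            payload = part.get_payload(decode=True) or b""
            text += "\n" + payload.decode("utf-8", errors="replace")
    if any(k.lower() in text.lower() for k in KEYWORDS):
        print(f"Hit in message {key}: {message.get('Date', '')} | {message.get('Subject', '')}")
```

Messages flagged this way can then be located in a viewer for closer review or conversion.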

4. Working with email attachments.
Emails frequently have attachments, which, in addition to the body of the email, can contain substantive, relevant information. The programs discussed in this post vary greatly in how they handle attachments during format conversion. While PDFs are generally easier to Bates stamp or turn into exhibits, be aware that some of the programs are not able to include the attachments when exporting to PDF.

  • Example 4.1 – Exporting email with attachments using Mozilla Thunderbird with the Import Export Tools add-on:
    Thunderbird offers several export options, including the ability to batch export relevant emails when using the Import Export Tools add-on. It does not have the ability to embed or append attachments when exporting messages to PDF; however, it does allow for emails to be exported to the EML format (with attachments embedded) as well as to an HTML format, which will include links to exported copies of the attachments (Figure 4-1).
Figure 4-1
  • Example 4.2 – Exporting email with attachments using PSTViewer Pro:
    PSTViewer Pro is yet another option for format conversion, and it is a great tool to use in conjunction with tools like Thunderbird or Outlook. It can convert to many formats and includes some advanced PDF conversion options. When converting to PDF, attachments can either be embedded or “imaged” (Figure 4-2). The “imaged” option will convert supported attachments into PDF pages and append them to the PDF version of the email (Figure 4-3). For individual EML files, attachments can also be extracted with a short script, as sketched after these examples.
Figure 4-2
Figure 4-3
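Where individual EML files have been produced, attachments can also be pulled out with a short script rather than a conversion tool. The following Python sketch uses the standard-library email parser to save each attachment from a single message; the paths are hypothetical, attachment filenames are written as-is (so a real workflow should add collision handling), and the output should always be checked against the original.

```python
from email import policy
from email.parser import BytesParser
from pathlib import Path

EML_PATH = Path("evidence_copy/message_001.eml")   # hypothetical path to a working copy
OUT_DIR = Path("extracted_attachments")
OUT_DIR.mkdir(exist_ok=True)

with EML_PATH.open("rb") as fh:
    msg = BytesParser(policy=policy.default).parse(fh)

# iter_attachments() yields each MIME part the message marks as an attachment.
for part in msg.iter_attachments():
    name = part.get_filename() or "unnamed_attachment"
    data = part.get_payload(decode=True) or b""
    (OUT_DIR / name).write_bytes(data)
    print(f"Saved {name} ({len(data)} bytes)")
```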

Conclusion

As shown in this article, there are many tools available for working with emails, but they are not universally compatible with all email formats and do not offer the same functionality. This requires careful thought about how to leverage and integrate the tools. The best path forward through this thicket is to know what your goals are before you select your tool. Defining your goal early will help you select which tool or combination of tools you should use to develop an effective workflow that matches both the set of data you are working with and the needs of your case.

Google Data and Geofence Warrant Process

Featured

[Editor’s Note: John C. Ellis, Jr. is a National Coordinating Discovery Attorney for the Administrative Office of the U.S. Courts, Defender Services Office. In this capacity, he provides litigation support and e-discovery assistance on complex criminal cases to defense teams around the country. Before entering private practice, Mr. Ellis spent 13 years as a trial attorney and supervisory attorney with Federal Defenders of San Diego, Inc. He also serves as a digital forensic consultant and expert.]

Introduction

This is an updated version of a post originally published in December 2020, which provides a primer on how Google collects location data, the three-step warrant process used by law enforcement to obtain these records, and an example of how the data is collected and used by the prosecution. The updated version includes references to United States v. Chatrie, a recently decided district court opinion regarding the constitutionality of geofence warrants.[i] From the opinion and the pleadings in Chatrie, we have a better understanding of the Google collection and geolocation search warrant process.

What Can Google Do?

Google began collecting location data in order to provide location-based advertisements to its users. Google tracks location data from the users of its products, including consumers who use Android telephones and those who use Google’s vast array of available apps on other devices such as Apple iPhones. For Android devices, Google is constantly tracking the device whenever the permission settings on the device are set to allow for the use of Google Location Accuracy. For iOS users, location information is collected only when a user is using a Google product, such as Google Maps.[ii] Google stores this information in a repository called “Sensorvault,” which “assigns each device a unique device ID…and receives and stores all location history data in the Sensorvault to be used in ads marketing.” 3:19-cr-00130-MHL at 7. The use of Sensorvault has been very profitable for Google. Since Google started collecting data and using Sensorvault in 2009, Google’s advertising revenue has increased almost tenfold.

See https://www.statista.com/statistics/266249/advertising-revenue-of-google.

Google is able to determine the approximate location of a mobile device based on GPS chips in the device, as well as the device’s proximity to Wi-Fi hotspots, Bluetooth beacons, and cell sites.[iii] For purposes of Wi-Fi, Google uses the characteristics of wireless access points within range of the device (including received signal strength) to determine the device’s proximity to the access point, and thus its approximate location. How Google tracks this data depends on the type of device (Android v. Apple) and an individual user’s privacy settings.[iv] Google cannot determine the exact location of a device, and as such, location records contain an “uncertainty value,” which is expressed in meters.

Maps Display Radius:

Because Google does not know a device’s precise location, it represents the possible location in a sphere, or what Google refers to as the Maps Display Radius.

In this picture, Google’s “goal is that there will be an estimated 68% chance that the user is actually within” the spherical representation.[v]

To see how Google determines the approximate location of a mobile device, viewing the Location History of a Google account is instructive. In the following example, according to Google, the blue line indicates the path of travel, the orange dots represent wireless access points, and the grey sphere next to the blue arrow is the estimated range of the location source.

Generally, the location information source has the largest impact on the Maps Display Radius. Most often, GPS provides the smallest sphere, whereas cell sites generally provide the largest. By way of example, the Maps Display Radius for GPS is often a few meters, whereas Wi-Fi is routinely over 1,000 meters.

Use of Google’s Tools by Law Enforcement – Three-Step Warrant Process

Although the original intent of Google’s Sensorvault technology was to sell advertising more effectively, over the past few years this data has been sought by law enforcement to determine who was present in a specific geographical area at a particular time, for example, when a crime was committed. These warrants are often called “Geofence warrants” because officers seek information about devices contained within a geographic area. In 2021, Google released information about the number of geofence warrants sought by law enforcement. According to the data, “Google received 982 geofence warrants in 2018, 8,396 in 2019 and 11,554 in 2020.”[vi]

In current practice, Google requires law enforcement to obtain a single search warrant that authorizes a three-stage process. That process is based on an agreement between Google and the Department of Justice’s Computer Crime and Intellectual Property Section (CCIPS). Once Google receives a geofence warrant, it takes on the extrajudicial role of determining whether law enforcement officers have made a sufficient showing such that additional information will be provided.

Stage One:

In response to the warrant, “Google must ‘search … all [Location History] data to identify users’ whose devices were present within the geofence during the defined timeframe” and to provide a de-identified list of such users. Chatrie at 19. The list includes: (1) anonymized user identifiers; (2) date and time the device was in the geofence; (3) approximate latitude and longitude of the device; (4) the maps display radius; and (5) the source of the location data.[vii]

Stage Two:

After reviewing the initial list, law enforcement can return to Google and request additional information about any device that was within the first geofence. This includes “compel[ling] Google to provide additional…location coordinates beyond the time and geographic scope of the original request.” Chatrie at 21.[viii] Troublingly, Google imposes “no geographical limits” for Stage Two review. Id.

Stage Three:

The third step involves compelling Google “to provide account-identifying information for the device numbers in the production that the government determines are relevant to the investigation. In response, Google provides account subscriber information such as the email address associated with the account and the name entered by the user on the account.”[ix]

It is important to note that in practice it appears that law enforcement routinely skips Stage Two and moves directly from Stage One to Stage Three analysis.

Past Examples

The shape of Google geofence warrants has changed over time. For instance, in In the Matter of the Search of information that is stored at premises controlled by Google, 1600 Amphitheatre Parkway, Mountain View, California 94043, law enforcement officers investigating a bank robbery sought information about “all Google accounts” located within a 30-meter radius around 43.110877, -88.337330 on October 13, 2018, from 8:50 a.m. to 9:20 a.m. CST.

Compare that to In the Matter of the Search of Information Regarding Accounts Associated with Certain Location and Date Information, Maintained on Computer Servers Controlled by Google, Inc. In that instance, law enforcement officers investigating a series of bombings sought location information for “all Google accounts” for a 12-hour period between March 1 and 2, 2018 in a “[g]eographical box” around 1112 Haverford Drive, Austin, Texas, 78753 containing the following coordinates: (1) 30.405511, -97.650988; (2) 30.407107, -97.649445; (3) 30.405590, -97.646322; and (4) 30.404329, -97.647983.

More recently, Google has requested that law enforcement submit Geofence warrants that are convex polygons in shape.

Starting from the Beginning – How the Process Works

To put this into perspective, the following example is illustrative. For these purposes, assume a crime occurred in the parking lot of a strip mall.

Because the crime occurred in the middle of a parking lot, we will create a geofence that includes storefronts, because doing so increases the chances that the suspect’s mobile device will be within range of a Wi-Fi hotspot or Bluetooth beacon. At the same time, the geofence will capture the mobile devices of numerous people who are not connected to the offense.

The above geofence appears to impact only people who are present in the parking lot or the surrounding businesses. However, the geofence would likely capture many more people, including people living in or visiting the nearby apartments and anyone who was driving on the surrounding streets during the time in question.

Stage One—The following is an example of a Stage One warrant return:

Device ID | Date | Time | Latitude | Longitude | Source | Maps Display Radius (m)
123456789 | 12/20/20 | 15:08:45 (-8:00) | 32.752667 | -117.2168 | GPS | 5
987654321 | 12/20/20 | 15:08:55 (-8:00) | 32.751569 | -117.216647 | Wi-Fi | 25
147852369 | 12/20/20 | 15:08:58 (-8:00) | 32.752022 | -117.216369 | Cell | 1000
123456789 | 12/20/20 | 15:09:47 (-8:00) | 32.752025 | -117.216369 | Cell | 800
987654321 | 12/20/20 | 15:09:55 (-8:00) | 32.752023 | -117.216379 | Wi-Fi | 15
123456789 | 12/20/20 | 15:10:03 (-8:00) | 32.752067 | -117.216368 | Wi-Fi | 25
987654321 | 12/20/20 | 15:10:45 (-8:00) | 32.752020 | -117.216359 | Cell | 450
987654321 | 12/20/20 | 15:10:55 (-8:00) | 32.752032 | -117.216349 | Wi-Fi | 40
123456789 | 12/20/20 | 15:10:58 (-8:00) | 32.752012 | -117.216379 | Cell | 300

Here, Device ID 123456789 is Suspect One, Device ID 987654321 is Suspect Two, and Device ID 147852369 is Suspect Three.  For this example, only one location for each device is shown.

At first blush, it would appear as if the geofence has located three possible suspects. But this image does not tell the full story. The blue bubbles for Suspect One and Suspect Two show a Maps Display Radius of 5 and 25 meters respectively.

Suspect Three’s location was derived from a Cell Site, with a Maps Display Radius of 1000 meters.

Thus, although Google believes that Suspect Three’s device was near the scene of the crime, it is possible it was located anywhere within the larger sphere, and it is possible that the device was not located within either sphere.
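To make the significance of the Maps Display Radius concrete, the following Python sketch computes the great-circle distance between a hypothetical crime-scene coordinate and a reported device location, then compares it to the reported radius. The coordinates and radius are illustrative values loosely based on the example table above, not data from any actual return.

```python
from math import asin, cos, radians, sin, sqrt

def haversine_m(lat1, lon1, lat2, lon2):
    """Approximate great-circle distance between two coordinates, in meters."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 6371000 * asin(sqrt(a))

# Illustrative values only: a hypothetical crime-scene coordinate and one reported
# device location with its Maps Display Radius, loosely based on the table above.
scene_lat, scene_lon = 32.752667, -117.216800
device_lat, device_lon = 32.752022, -117.216369   # Suspect Three's reported point
radius_m = 1000                                   # reported Maps Display Radius

distance = haversine_m(scene_lat, scene_lon, device_lat, device_lon)
print(f"Reported point is {distance:.0f} m from the scene, "
      f"but the device could be anywhere within roughly {radius_m} m of that point.")
```

When the reported radius dwarfs the distance to the scene, the return tells you little about whether the device was actually present there.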

Stage Two—For this stage, we can expand our original results, so long as we include only one of the accounts returned in Stage One. Here, we will expand our results to determine whether Suspect One’s device was also present in the area northeast of the original search location.

Stage Three—In this stage, law enforcement obtains subscriber information for the accounts Google deems responsive. That is, law enforcement asks Google to provide the account number and information for the Device IDs provided in either Stage One or Stage Two. The following is an example of such a return:

Conclusion

As technology and the privacy concerns of consumers continue to change, so will the ability of law enforcement to obtain location data about users. The use of Google geofence warrants implicates a number of Fourth Amendment issues; future posts will explore the legal implications surrounding the overbreadth of these warrants.[x] But beyond the legal challenges, those encountering Google location warrants should remain mindful of the limitations of the data as well as the absence of concrete answers from Google regarding its methodology for determining location data.


[i] See United States v. Chatrie, 3:19-cr-00130-MHL, Docket Entry 220.

[ii] The exception is for a user who has turned location services to always on, has a Google product open on a device, and has allowed for background app refresh. That means it is likely that Google knows far more about the location history of Android users than of iPhone users. That is important because approximately 52 percent of devices on mobile networks are iOS devices. https://www.statista.com/statistics/266572/market-share-held-by-smartphone-platforms-in-the-united-states/.

[iii] https://policies.google.com/technologies/location-data (“On most Android devices, Google, as the network location provider, provides a location service called Google Location Services (GLS), known in Android 9 and above as Google Location Accuracy. This service aims to provide a more accurate device location and generally improve location accuracy. Most mobile phones are equipped with GPS, which uses signals from satellites to determine a device’s location – however, with Google Location Services, additional information from nearby Wi-Fi, mobile networks, and device sensors can be collected to determine your device’s location. It does this by periodically collecting location data from your device and using it in an anonymous way to improve location accuracy.”)

[iv] https://support.google.com/nexus/answer/3467281?hl=en

[v] See United States v. Chatrie, 19cr00130-MHL (EDVA 2020), ECF 1009 [Declaration of Marlo McGriff] (“A value of 100 meters, for example, reflects Google’s estimation that the user is likely located within a 100-meter radius of the saved coordinates based on a goal to generate a location radius that accurately captures roughly 68% of users. In other words, if a user opens Google Maps and looks at the blue dot indicating Google’s estimate of his or her location, Google’s goal is that there will be an estimated 68% chance that the user is actually within the shaded circle surrounding that blue dot.”)

[vi] https://techcrunch.com/2021/08/19/google-geofence-warrants/

[vii] Id. at 4 (“After that search is completed, LIS assembles the stored LH records responsive to the request without any account-identifying information. This deidentified ‘production version’ of the data includes a device number, the latitude/longitude coordinates and timestamp of the stored LH information, the map’s display radius, and the source of the stored LH information (that is, whether the location was generated via Wi-Fi, GPS, or a cell tower)”).

[viii] Id. at 17

[ix] Id.

[x] In the Matter of the Search of: Information Stored at Premises Controlled by Google, 20mc00392-GAF (NDIL 2020) provides a great overview of the Fourth Amendment issues relating to Google Geofence warrants.  See also https://www.eff.org/deeplinks/2020/07/eff-files-amicus-brief-arguing-geofence-warrants-violate-fourth-amendment


U.S. v. Morgan, et al: Know What You Don’t Have

[Editor’s Note: Tom O’Connor is an attorney, educator, and well-respected e-discovery and legal technology thought leader. A frequent lecturer on the subject of legal technology, Tom has been on the faculty of numerous national CLE providers and has taught college-level courses on legal technology. He has also written three books on legal technology and worked as a consultant or expert on computer forensics and electronic discovery in some of the most challenging, front-page cases in the U.S. Tom is the Director of the Gulf Coast Legal Technology Center in New Orleans, LA.]

If you were practicing in federal court before email and ECF filing, back in the days when Joe Montana threw to Jerry Rice, then you probably remember that discovery productions were typically hardcopy documents you picked up at the US Attorney’s Office. The volume was so small it easily fit into your briefcase. Those were the days when everyone complained about not getting enough discovery. The challenge was moving to compel more discovery when you didn’t know what you didn’t have.

Joe Montana and Jerry Rice

Fast forward to the present. Tom Brady is throwing to Rob Gronkowski (again, but in a different city), and discovery is typically so voluminous it cannot be provided in hardcopy form. Productions can be hundreds of gigabytes and sometimes dozens of terabytes full of investigative reports, search warrant pleadings, surveillance audio and video, cell phone data, cell tower material, years of bank records, GPS data, social media materials, and forensic images of servers, desktop computers, and mobile devices. Duplicate folders of discovery produced “in the abundance of caution” to protect the Government against Brady violations are common. Despite the volume, the same issue exists: How do you know what you don’t have?

Tom Brady and Rob Gronkowski

US v Morgan (Western District of New York, 1:18-CR-00108 EAW, decided Oct 8, 2020) is an example of diligent defense counsel challenging the government on how it produced terabytes of data.

Defendants Robert Morgan, Frank Giacobbe, Todd Morgan, and Michael Tremiti were accused by way of a 114-count Superseding Indictment of running an illegal financial scheme spanning over a decade. The government alleged they defrauded financial institutions and government sponsored enterprises Freddie Mac and Fannie Mae in connection with the financing of multi-family residential apartment properties that they owned or managed. There were also allegations of related insurance fraud schemes against several of the defendants.

The government made several productions which the defense contended were deficient (including the lack of metadata on numerous documents) and, in several cases, omitted key pieces of evidence. The defense enlisted the help of e-Discovery experts, who stated that the government had failed to properly process and load evidence into its database for production to defense counsel.

The issues were brought before the court in defense motions to compel and to dismiss, first to the magistrate judge and then to the district court judge, resulting in a critical analysis of the way the government handled the discovery.

CASE TIMELINE

The original status conference in the case was held on May 29, 2019. For the next year, a series of motions and hearings proceeded with regard to delays and failures on the part of the government to meet discovery deadlines imposed by the court.

An evidentiary hearing was finally held before district court Judge Elizabeth A. Wolford on July 14, 2020, continuing through the remainder of that week until July 17, 2020, and then resumed and concluded on July 22, 2020. There were multiple expert witnesses, and the review of that testimony spans more than seven pages of the Opinion.

On September 10, 2020, oral argument on the motions to compel and dismiss was heard before Judge Wolford. The Court entered its Decision and Order on October 8, 2020.

There was no dispute that the discovery in this matter was not handled properly. In the second paragraph of the above cited Decision and Order, Judge Elizabeth A. Wolford states,

“The Court recognizes at the outset that the government has mishandled discovery in this case—that fact is self-evident and cannot be reasonably disputed. It is not clear whether the government’s missteps are due to insufficient resources dedicated to the case, a lack of experience or expertise, an apathetic approach to the prosecution of this case, or perhaps a combination of all of the above.”

Specifically, the government somehow failed to process and/or produce ESI from several devices seized pursuant to a search warrant executed in May 2018, and in one case a cell phone seems to have actually been lost. The court ultimately dismissed the case without prejudice, which gave the parties time to resolve the discovery issues. On March 4, 2021, a grand jury returned a new 104-count indictment.

More important for our purposes are the discussions regarding the ESI and production issues. They are outlined below.

PROJECT MANAGEMENT

The Court wasted no time in saying “It is evident that the government has demonstrated a disturbing inability to manage the massive discovery in this case, and despite repeated admonitions from both this Court and the Magistrate Judge, the government’s lackadaisical approach has manifested itself in repeated missed deadlines.”

And later, “To be clear, the Court does not believe the record supports a finding that any party acted in bad faith. Rather, the discovery in this case was significant, and the government failed to effectively manage that discovery. In the end, because of its own negligence, the government did not meet the discovery deadline set by the Magistrate Judge.”

COMPLEXITY OF LARGE AMOUNTS OF ESI

Judge Wolford made several references to the “massive discovery.” In an attempt to manage that data, the Magistrate Judge had initially directed the parties to draw up a document entitled “Data Delivery Standards” (hereinafter referred to as “the DPP”), which would control how documents were exchanged. The DPP failed to accomplish this for several reasons.

First was the large amount of data. Testimony by a defense expert witness at the evidentiary hearing of July 14, 2020, stated that “… the government’s Initial Production consisted of 1,450,837 documents, reflecting 882,841 emails and 567,996 other documents. Of those documents, 860,522 were missing DATE metadata, with over 430,000 documents reflecting no change in the DATE metadata field formatting after the DPP was agreed-upon. Once overlays were provided by the government, the DATE metadata field was corrected for almost one-third of the documents (primarily emails), but 590,448 documents still were missing DATE metadata, including 294,818 emails. Of those 294,818 emails, 169,287 had a misformatted DATE value and 125,531 had no DATE value. The Initial Production also contained missing values for the metadata fields of FILE EXTENSION, MD5 HASH, PATH, CUSTODIAN, MIME TYPE, and FILE SIZE— and the government overlays did not change the status of the information in any of those fields.”

Additionally, the USAO-WDNY’s processing tool was Nuix, while another entity, the Litigation Technology Support Center in Columbia, South Carolina, processed some of the hard drives using a different processing tool called Venio. The Federal Housing Finance Agency (“FHFA”) also processed the Laptop Production using a “much more robust” version of Nuix than the system possessed by the USAO-WDNY.

These differing versions led to different productions which had different values for the metadata fields. Standardization on one tool could have prevented much of this. But the Court also noted that “… the quality review conducted by the government was insufficient to catch these errors.”

Inconsistent directions were an ongoing issue. For example, the Court found that “… the government prosecutors expressly instructed Mr. Bowman not to produce CUSTODIAN information for the Laptop Production, even though the government had provided similar information previously.”

Other government errors included:

  1. It applied different processing software inconsistently to the PST or OST files, thereby missing some metadata and producing varying results.
  2. It misformatted the DATE metadata and failed to catch the errors while conducting quality review.
  3. It failed to produce native files in “the format in which they are ordinarily used and maintained during the normal course of business[.]” It produced near native or derivative native files from the OST or PST files without corresponding metadata.
  4. In many instances, load files necessary to load the document productions into the defense review software platform were missing.
  5. There were ongoing errors with respect to CUSTODIAN metadata, which were the result of human error on the part of the government.

WHAT DOES THIS MEAN TO YOU?

With regards to what specific steps can be used to take control of cases with large amounts of ESI, the Court mentioned several.

  1. Use an exchange protocol. In civil cases, this document would arise from FRCP Rule 26(f), which mandates a “Meet & Confer” conference of the parties so that they might plan for discovery through the presentation of a specific plan to the Court. 

    In Morgan, this was the document entitled the DPP. In criminal cases going forward, the new Federal Rule of Criminal Procedure 16.1 will address some of these concerns. Drawn up specifically to address the manner and timing of the production of voluminous Electronically Stored Information (ESI) in complex cases, Subsection (a) requires the prosecution and defense counsel to confer “[n]o later than 14 days after the arraignment…to try to agree on a timetable and procedures for pretrial disclosure under Rule 16.1.” Subsection (b) authorizes the parties, separately or together, to “ask the court to determine or modify the time, place, manner or other aspects of disclosure to facilitate preparation for trial.”

  2. Standardize the use of technology. As Judge Wolford said, “In sum, the Court believes that it would have been much more prudent if the government, after reaching agreement with the defense about the DPP, had utilized a competent vendor to process the ESI (and all the previously produced ESI) in the same manner with the same settings and utilizing the same tools.”

  3. Get a data manager. A common saying in IT circles is that “someone needs to own the data.” In this case, where the Government used multiple parties who employed different tools to work with the data, nobody owned the data. This lack of a central manager “… led to electronic productions being produced in an inconsistent manner and, in some instances, in violation of the DPP.”

  4. Get an expert. After hearing multiple experts testify for several days on what had transpired with the ESI, the Court noted, “… electronic discovery is a complicated and very technical subject. As a result, facts can be easily spun in a light most favorable to one party’s position or the other. That occurred here on behalf of all parties.”

    Nonetheless, the experts were able to bring clarification to the issues of “missing” metadata and divergent processing results that had beleaguered the parties and the Court. This field, especially with large amounts of ESI, is a classic example of the old maxim, “do not try this at home.” Get an expert.

  5. Use a review tool. ESI in these amounts simply cannot be reviewed manually. Both parties here recognized that fact and, as the Court noted several times, most of the errors in the case were due not to the software but to what we nerds call the “loose nut on the keyboard” syndrome.

    Get review software. Get trained on it. Use it. One admonition I always make which could have avoided many delays in this matter is do not try to load everything at once into your review platform. Start with a small amount of sample data to be sure you are getting what you need. Which leads to our last takeaway.

  6. Talk with the government. Judge Wolford specifically noted that the “… the Court also concludes that Defendants and the government were not always communicating effectively regarding electronic discovery.” For example, none of the parties could recall “… any discussions during those negotiations about the processing tools that would be utilized or the type of native file that would be analyzed for purposes of creating a load file.”

CONCLUSION

The Morgan case illustrates that there are ways to learn about what you don’t have so you can bring it to the government’s attention and, if need be, to the Court. It is also an example of a Court being knowledgeable about ESI productions. The Court noted often and in different ways that “… electronic discovery is challenging even under the best of circumstances. In other words, the facts and circumstances cannot be appropriately evaluated without considering the volume of discovery and the enormous efforts needed to manage an electronic production of this nature.”

Find an expert who understands your needs and has the communication skills to convey complex technical issues to you, the government, and the Court. For many years, Magistrate Judge Andrew Peck (SDNY, Retired) advocated a “Bring-Your-Geek-To-Court Day,” in which parties bring an outside consultant or an in-house IT person to address disputes. If you were to remember only one thing from this case, it should be: Go get a geek.

Tom O’Connor
Director
Gulf Coast Legal Tech Center
toconnor@gulfltc.org
www.gulfltc.org 
Blog: https://technogumbo.wordpress.com/
Twitter: @gulfltc

Inside The Black Box: Excluding Evidence Generated by Algorithms

[Editor’s Note: John C. Ellis, Jr. is a National Coordinating Discovery Attorney for the Administrative Office of the U.S. Courts, Defender Services Office. In this capacity, he provides litigation support and e-discovery assistance on complex criminal cases to defense teams around the country. Before entering private practice, Mr. Ellis spent 13 years as a trial attorney and supervisory attorney with Federal Defenders of San Diego, Inc. He also serves as a digital forensic consultant and expert.]

Introduction:

For many years, law enforcement officers have used records generated by mobile carriers to place a mobile device in a general area. The records are called Call Detail Records (“CDRs”). CDRs are generated when a mobile device sends or receives calls and text messages. Mobile carriers likewise keep records of when data is used, such as browsing the internet. These records are called Usage Detail Records (“UDRs”). At times, the records generated by mobile carriers include the location of the cell site or cell sites and the direction of antenna that connected with the mobile device.

Cell Site Location Information (“CSLI”) analysis is the practice of creating maps showing the possible coverage area of a cell site at the time a device was being used. For these purposes, it is important to keep in mind that the records only show the location of the cell site and the direction the antenna is facing. Recent technological improvements have resulted in mobile carriers now generating Enhanced Location Records (“ELRs”), which purport to show more precise location data. In AT&T parlance, such records are based on the Network Event Location System (“NELOS”). This location data is derived from proprietary algorithms.

In a recent federal case, the government, through a member of the Federal Bureau of Investigation’s (“FBI”) Cellular Analysis Survey Team (“CAST”), sought to introduce NELOS records in a trial. However, after a Daubert hearing where the CAST agent testified, the district court excluded the records, in part, because of concerns over the reliability of the algorithms used to determine the location data.

This article provides an overview of CSLI and NELOS records, discusses the order excluding NELOS records from trial, and provides practical advice for practitioners.

Overview:

When CDRs include cell site location data, analysts and law enforcement officers use these records to show the location of the cell site and the orientation of the sector. In North America, many cell towers contain three sets of antennas, with each set covering a specific area.

Picture 1

To illustrate this point, Picture 1 is an overview picture of a multi-directional cell tower. Each blue arm is a sector. When a mobile device connects to a cell site, the mobile carrier often records the activity (i.e., a sent text message), the time of the activity, and the location of the cell site and sector that was used.

Using these three data points, analysts and law enforcement officers create maps showing the location of the cell site and the orientation of the sector. In Map 1, the arms are used to demonstrate the beamwidth of the sector, which in this case the records indicate is 120 degrees. The cone at the base of the triangle is only meant to show the orientation of the sector, not the coverage area. Moreover, analysts generally will not testify that the mobile device was within the triangle. The triangle is only meant to represent the location of the cell site and the orientation of the sector.

Map 1

With NELOS records, on the other hand, the ELRs purport to show the location of a device as opposed to the location of the cell site. In the following example, the red pin represents the location of the device. The blue circle represents what AT&T calls the “Location Accuracy.” This accuracy ranges from approximately several meters to 10,000 meters. And some records are marked by “location accuracy unknown.” As discussed below, the Location Accuracy is determined by proprietary algorithms used by AT&T.

Map 2

In Map 2, the ELR indicates that the “[l]ocation accuracy [is] likely better than 300 meters.” In other words, the phone was at the red pin or within the blue circle at a specific date and time. NELOS records, however, contain the following statement: “The results provided are AT&T’s best estimate of the location of the target phone. Please exercise caution in using these records for investigative purposes, as location data is sourced from various databases, which may cause the location results to be less than exact.” DE 156 at 23 (emphasis added).

To put the first two examples into perspective, Map 3 shows both traditional CSLI and the use of NELOS records.

Map 3

The NELOS demonstrative, even taking into account the “Location Accuracy,” still provides a much smaller, and thus more specific, area in which the phone activity took place.

United States v. Smith, et al. (4:19-CR-514-DPM) (EDAR):

Donald Smith and Samuel Sherman were charged in a five-count indictment with various crimes relating to a murder. See Docket Entry (“DE”) 1. The government sought to introduce the testimony of CAST Agent Mark Sedwick “that provider-based location data typically is collected by obtaining historical call detail records for a particular cellular telephone from the service provider, along with a listing of the cell tower locations for that service provider.” DE 102 at 1. According to the government, “[t]his data is then analyzed for the purpose of generally placing a cellular telephone at or near an approximate location or locations on a map at points in time.” Id.

The government sought to have Agent Sedwick testify “regarding the activity and approximate locations of the cellular telephones believed to have been utilized by Donald Bill Smith, Samuel Sherman, Racheal Cooper and Susan Cooper on the approximate dates and times relevant to the charges in the Indictment.” Id. at 1-2. Attached to the government’s motion is the report created by Agent Sedwick. Maps 4 and 5 are examples from Agent Sedwick’s report. Map 4 shows how Agent Sedwick mapped traditional CSLI, and Map 5 shows how he mapped the same time period using NELOS records:

 

Map 4
Map 5

Map 4 shows traditional CSLI mapping with the location of the cell site and the orientation of the sector. With Map 5, each circle represents the area in which the device was used. Here, there are four such events. For comparison, in Map 4, Agent Sedwick’s opinion is limited to testifying about the location of the cell site and the orientation of the sector, whereas with Map 5, the testimony is that the mobile device was within the circle.

Prior to trial, defense counsel challenged Agent Sedwick’s potential testimony and the district court conducted a hearing to determine the admissibility of the records pursuant to Daubert v. Merrell Dow Pharmaceuticals, Inc., 509 US 579 (1993). During the hearing, Agent Sedwick explained the reason AT&T created NELOS was to “test the health of the 3G network for planning and troubleshooting. It is a passive system where, while the phone is on the control channel communicating with the network across the control channel, it would passively pull whatever location data it could pull or data to compute location from that device.” DE 156 at 8.

Agent Sedwick further explained: “NELOS also became the generic term for any kind of location data. So depending, there might be other databases that were also pulled into the NELOS report that we receive from AT&T. Just from that report there’s no way to determine what other databases that was pulled from.” DE 156 at 9.

Agent Sedwick also provided information about known issues with NELOS data, specifically based on Temporary Mobile Subscriber Identity (“TMSI”). By way of background, mobile devices are assigned an International Mobile Subscriber Identity (“IMSI”), a unique number used by mobile carriers, which establishes that the mobile device can operate on a specific network. This is the number used by mobile carriers when creating CDRs. At times, however, in order to mask a device’s actual IMSI, networks assign the device a TMSI.[1] This is problematic for NELOS records because as Agent Sedwick explained, “[t]hat TMSI sometimes can get reallocated and then allocated back to a device, so you can have sometimes where the NELOS data will pull from a different device and get put into the records for the device that you’re requesting.” DE 156 at 10.

During cross-examination, Agent Sedwick was questioned about the portion of NELOS records that “caution in using these records for investigative purposes.” Agent Sedwick responded: “I wouldn’t rely on it if all I had was a NELOS point putting someone at a scene and that’s all I had, no, I would not use it. I’m using it—there is a caution with it, but I’m using it in the context of I have call and text to support it, I have other data to support, I have very good precise NELOS data. I feel very, very confident that this is accurate.” DE 156 at 24.

Agent Sedwick’s confidence in the accuracy of NELOS records was based on the proprietary algorithms created by the phone company. See DE 156 at 12 (“Question: Okay. So the device is sending various different events, they’re plugged into that algorithm, and essentially the algorithm will spit out what it computes as accuracy; is that correct? Answer: Yes, ma’am”). But Agent Sedwick acknowledged that he was not privy to the algorithm, nor to whether NELOS was tested by AT&T for reliability. Instead, Agent Sedwick testified he believed the algorithms are reliable “[b]ecause AT&T relies on that to make multi-million-dollar decisions on how they’re going to design their network.” DE 156 at 32.

In granting the defense’s motion to exclude NELOS data, the district court found:

What particularly concerns me, though, is this mystery algorithm that our—and the proprietary software. We don’t know, I don’t know exactly what is in the algorithm, and the agent gave some testimony at a general level about the kind of information that goes in, but it seems to me that I’m missing a—an important foundational stone there of something with more specificity as to the kinds of things that the algorithm uses and how the algorithm does its work.

We know that there are disturbances from time to time, or anomalies as was called, with the TMSI number. I also—I acknowledge some uncertainty about TMSI numbers and how many devices that might be connected with and how it is that the algorithm might deal with that. So there’s that. Then there is, in my view, almost a—so we’ve got our black box there, which is concerning, and I would say at this point there’s a peer review problem, as well, because I don’t have any scholarly literature or evaluation of the black boxes or the kind of things that could go into this black box and how it would work.

I understand about the corroboration, but I still find myself at sea of understanding how it is the—how things happen in the black box and whether—whether what comes out of the black box is sufficiently reliable that the jury can rely on it.

DE 156 at 85-87 (emphasis added).

Based on this, the district court entered the following order: “Agent Sedwick may testify about call detail records and historical cell-site analysis; but he may not testify about NELOS data and analysis.” DE 154.

Further Consideration:

The district court’s exclusion of NELOS records was based, in part, on the use of data generated by untested algorithms. Other mobile carriers also use ELRs, which generate purported location data that are also based on proprietary algorithms similar to NELOS. In seeking to exclude ELRs, as well as other forms of computer-generated data, counsel should encourage courts to question the reliability of evidence created by algorithms that lack independent validation and verification.

Glossary:

Acronym | Full Title
CAST | Cellular Analysis Survey Team
CDR | Call Detail Records
CSLI | Cell Site Location Information
ELR | Enhanced Location Records
IMSI | International Mobile Subscriber Identity
NELOS | Network Event Location System
TMSI | Temporary Mobile Subscriber Identity
UDR | Usage Detail Records

[1] As explained by EFF, “upon first connecting to a network, the network will ask for your IMSI to identify you, and then will assign you a TMSI … to use while on their network. The purpose of the pseudonymous TMSI is to try and make it difficult for anyone eavesdropping on the network to associate data sent over the network with your phone.” See https://www.eff.org/wp/gotta-catch-em-all-understanding-how-imsi-catchers-exploit-cell-networks.

Paralegals – The Linchpin to the Defense Team’s Discovery Review Process

Whether a federal criminal defense attorney is a sole practitioner, part of a firm or in a Federal Public or Community Defender Office, they are often assigned to a case on their own. In many situations, that is manageable because there is not a lot of information to organize, the client can help to review the discovery produced by the government or the strategy involves a plea. However, as cases continue to grow in size and complexity, it’s helpful to have paralegal assistance. A paralegal can support attorneys in many ways in a case, ranging from assisting with client contact to aiding attorneys at hearings and trial, but it is with discovery management that paralegals are increasingly important in today’s legal world. They can help the defense team get the work done faster and make the overall process more cost effective. A paralegal can contribute when an attorney is trying to understand the scope of the discovery and design a strategy to access and review the files more efficiently, organize everything, and ultimately search and review the discovery and case materials in a meaningful way.

Unique challenges that federal criminal defense practitioners face include increasing numbers of proprietary formats that standard software cannot open, large volumes of information that need to be sifted through and the potential lack of technology resources. All of these challenges make having human resources available even more important.

Fortunately, even sole practitioners need not fly solo. They can have paralegals as permanent members of their team or hire them specifically for a case.

While some paralegals have experience working on particular types of cases and are proficient in using certain software tools, some are new to the field and eager to learn. The type of paralegal that is the best fit for a criminal defense practice or for a case depends on the attorney’s working style, the type and complexity of the discovery involved, the timeline of the case, and the long-term goals to be met by adding a member to the team. Below are some questions an attorney should consider when thinking about hiring a paralegal.

  1. Do I need them to understand how to manage a case as soon as they walk in the door?
  2. What litigation support software am I using that I want them to be familiar with and have experience using?
  3. Do I want someone who knows about programs that can help me better manage discovery (and perhaps knows more than I do on the topic)?
  4. Do I need someone experienced with using online document review databases?
  5. Do I need someone who understands how to search large sets of discovery using metadata filters (e.g. date ranges, file types, authors, and recipients, etc.) combined with keywords to help me identify the most relevant documents in the discovery?
  6. Do I need someone experienced in creating complex Boolean searches for culling large data sets into more manageable sets of discovery to review?
  7. Have they previously worked in my district on federal cases?
    1. If not, are they willing to learn about the types of cases and the types of discovery generated here and become familiar with the unique nature of my district?
  8. Have they worked on the types of cases to which I am typically appointed?

If you are considering hiring a paralegal for a particular case, it is crucial that they have a suitable skillset for it. They should know how to leverage outside resources to make the overall discovery review process more efficient. For example, they may recommend using a third-party vendor to process and host e-Discovery. This is reasonable and oftentimes preferred, but they should not be billing time to have others do the work you expect from them.

The National Litigation Support Team (NLST) is a resource not only for CJA panel attorneys, but also for the private paralegals who assist them. The NLST can answer questions and provide strategies about best practices when it comes to managing particular formats of discovery, demonstrate how a third-party vendor can assist in particular situations, and introduce your paralegal to software available to panel members and provide one-on-one training on those tools, so that they can provide you with the best possible support.

Adding a paralegal to your practice, or to your team for a single case, can be the difference between discovery being left unreviewed due to a shortage of time or lack of technology and being able to focus on telling your client’s story. Paralegals should be vetted, and you should have a clear understanding of their familiarity and experience with technology, the types of cases they have worked on, and their willingness to learn new platforms and new ways of searching, reviewing, and managing information. When you find the paralegal that is a good fit for your practice, they will truly become the linchpin of your team’s discovery review process.

TrialDirector 360 Discount for CJA Panel Attorneys Licenses

The National Litigation Support Team (NLST) is pleased to announce that IPRO has agreed to provide a discounted rate for CJA panel attorneys to purchase a subscription license of TrialDirector 360.

TrialDirector 360 is a courtroom presentation tool that allows users to present documents, pictures, and videos in hearings and trials. Users can prepare exhibits in advance or instantly display exhibits to jurors and judges. Additionally, attorneys can direct jurors’ attention to the most important parts of exhibits by doing call-outs, zoom-ins, mark-ups, highlights, and side-by-side comparisons of documents. During the examination of a witness, it is easy to do a screen capture of information that has been displayed to the jury for later use in the trial, and the software works well when used along with PowerPoint. TrialDirector has been successfully used for many years by FDOs and CJA panel attorneys representing clients and has been a staple of the Law and Technology workshop training series for close to 20 years.

CJA panel attorneys can purchase TrialDirector 360 at a discounted price of $556.50 per year (approximately 40% off the retail price). This price is for a subscription, so you must pay this amount each year to continue using the software.

CJA panel attorneys interested in purchasing TrialDirector 360 should contact Kelly Scribner. If you have any questions regarding the use of TrialDirector 360 for your office, please contact the National Litigation Support Team (NLST): Kelly Scribner or Alex Roberts.

The NLST will provide remote one-on-one training on how to use TrialDirector 360 for any interested user. Please have the user contact Kelly Scribner to schedule training.

Additional TrialDirector program information and resources are available on the IPRO TrialDirector 360 help center.

Microsoft Excel Tips & Tricks for CJA Cases: Filename Lists

By Alex Roberts

This post is part of an ongoing series of videos on how Microsoft Excel can help CJA practitioners (including attorneys, paralegals, investigators, and mitigation specialists) in their CJA cases.

Today’s Post: Filename Lists

When working with discovery, investigative documents, or other case-related materials, it is often helpful to have a list of filenames in an Excel table.

There are times when the government produces digital files to defense counsel where the file name indicates something about the file’s content, so a user does not have to open each file individually.

For example, the government may produce investigative reports in PDF format whose file names include the date of the report, the type of report (e.g., FBI 302), and the author. In those instances, it can be beneficial to create a spreadsheet of the filenames and information about the files for later review and organization. Even in instances where the filename is only the Bates number of the file, it can be useful to have a spreadsheet of those numbers.

Microsoft Excel is a useful tool for generating such a filename list. When properly set up, Excel allows users to sort, filter, and search for specific files based on different criteria. Fields such as comments, document type, review status, dates, and related issues can be created and associated with each file. Additionally, hyperlinks to a specific file or folder can be created for quick and easy access to an item. We will examine these functions in greater detail in future videos.

This video demonstrates how filename lists can be quickly created and offers recommendations to follow when setting up a file list. The video looks at three methods for creating filename lists (a simple scripted alternative is sketched after the list):

  • Method 1: Creating a query table by running the “Get Folder Data” process that is currently available in the newer “Office 365” version of Excel.
  • Method 2: Using the “Copy Path” process available in Windows File Explorer.
  • Method 3: Using a “File List Program” specifically designed for creating a list of files in Excel format (ex: Directory List and Print).
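If you are comfortable running a short script, a filename list can also be generated outside of Excel and then opened in Excel as a table. The sketch below is only an illustration (it is not one of the methods covered in the video) and assumes Python 3 is installed; the discovery folder path is a hypothetical placeholder. As always, run it against a working copy of the data rather than the original production.

```python
# filename_list.py -- minimal sketch of building a filename list that Excel can open.
# The folder path below is a hypothetical placeholder; point it at a working copy of the data.
import csv
import os
from datetime import datetime

DISCOVERY_FOLDER = r"C:\Cases\US_v_Example\Discovery"   # hypothetical path to your copy of the discovery
OUTPUT_CSV = "filename_list.csv"

with open(OUTPUT_CSV, "w", newline="", encoding="utf-8-sig") as out:
    writer = csv.writer(out)
    writer.writerow(["Filename", "Folder", "Extension", "Size (KB)", "Modified Date", "Hyperlink"])
    # Walk every subfolder and record one row per file.
    for root, _dirs, files in os.walk(DISCOVERY_FOLDER):
        for name in files:
            full_path = os.path.join(root, name)
            stats = os.stat(full_path)
            writer.writerow([
                name,
                root,
                os.path.splitext(name)[1].lower(),
                round(stats.st_size / 1024, 1),
                datetime.fromtimestamp(stats.st_mtime).strftime("%Y-%m-%d"),
                f'=HYPERLINK("{full_path}", "Open")',   # Excel turns this into a clickable link
            ])

print(f"Wrote {OUTPUT_CSV}; open it in Excel to sort, filter, and add review fields.")
```

Once the CSV is open in Excel, formatting it as a table (Ctrl+T) enables the sorting and filtering described above, and the hyperlink column opens each file directly.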

Discovery Coordination in Federal Criminal Cases

By Sean Broderick and Kelly Scribner

Introduction

We recently spoke to a well-respected CJA panel attorney, and he mentioned he had a discovery coordinator on a multidefendant case. He did not understand how discovery coordinators were either assigned or appointed in federal CJA cases, or what his expectations should be for what the discovery coordinator could do to assist him or his fellow CJA panel counsel. After talking with him, we thought it would help to have a blog post on the current state of discovery coordination in federal criminal cases.

Hundreds of multidefendant criminal prosecutions are occurring in federal courts throughout the United States. As federal criminal defense lawyers know well, these cases frequently involve complex forms and large amounts of e-discovery. Complicating matters for many individual clients in multidefendant cases is that much of the discovery produced is not relevant to them. Even so, the defense team still needs to organize and manage the discovery. It can be laborious, overwhelming, and time-consuming for individual defense teams to organize the discovery on their own.

To help address this issue, the Administrative Office of U.S. Courts Committee on Defender Services approved the use of national Coordinating Discovery Attorneys (CDAs) to assist with discovery coordination between the government and the defense team, and to manage the discovery for all court-appointed defense attorneys in multidefendant cases. Having a CDA serve as a single point of contact for distributing discovery, managing the discovery, and coordinating the vendor relationships necessary in complex cases can be an advantage to all involved. The courts, which in part oversee CJA expenditures in a case, are understandably interested in ways to lower costs by avoiding having each defense team duplicate the basic organization and management of discovery[1]. For defense counsel, who are concentrating on the needs and interests of their particular client and focusing on case strategy, a CDA can assist with uploading, centralizing, and overseeing the organization of voluminous discovery. For prosecutors, having a single point of distribution of discovery for all clients makes production of discovery more efficient: they can discuss the form of production with one or two knowledgeable counsel as opposed to dozens of attorneys who may have varying experience and knowledge with technology and e-discovery.

However, not all discovery coordination is the same. Districts have implemented discovery coordination in a number of ways. Historically, there have been four principal types of discovery coordinators in federal criminal cases: National Coordinating Discovery Attorneys; Local Coordinating Discovery Attorneys; Joint Paralegals or Investigators; and Litigation Support Vendors.

This blog post describes four types of discovery coordination and explains the strengths and limitations of each one from the perspective of CJA panel counsel.

National Coordinating Discovery Attorneys

National Coordinating Discovery Attorneys (CDAs) are federal criminal defense attorneys who have experience working on CJA cases. The national CDAs have been appointed by federal district judges in numerous multidefendant cases in some of the most complex litigation in the United States. Since they are appointed by district courts, they have standing to communicate directly with the government. They are experienced in participating in Rule 16.1 “meet and confers” with the prosecution. These meetings can result in CJA panel attorneys obtaining discovery from the government in more useful formats, setting deadlines for rolling productions, getting volume estimates for planning purposes, and assisting the defense teams in setting dates for events that rely upon productions (e.g. pretrial motions, motions in limine, preliminary lists of exhibits and witnesses, and trial dates). CDAs provide reports to the court regarding the status of discovery productions, which can assist defense counsel with preparing their case, as the court will have a third-party source to notify it of problems and challenges with discovery production (which can result in more time or more resources to assist defense counsel in the case).

The national CDAs are managed by the National Litigation Support Team (NLST), which provides a support network for guidance on pressing technology challenges. This national support network assists the CDAs in developing innovative and practical solutions focused on the needs of CJA cases. Accordingly, CDAs are knowledgeable about the types of software programs available to assist in the management of discovery and know how to effectively use technology and litigation support vendors to assist with the organization, search, review and analysis of large volumes of electronically stored information (ESI).

CDAs have project management and technical support staff proficient in industry-standard technology used to organize and review discovery. Additionally, CDAs’ staff provide training and technical support to all legal teams and can assist in executing the strategies that the CDA recommends in categorizing and searching the data received. CDAs monitor the marketplace and are experienced in vetting litigation support and e-discovery vendors to make certain that vendors provide quality services at the best possible rates. They are experienced in preparing funds requests to the court for third-party assistance, which they can do on behalf of CJA panel attorneys. Finally, CDAs are contracted with the Administrative Office of U.S. Courts, Defender Services Office, so panel counsel need not prepare funds requests to the courts for their assistance.

Though CDAs have been appointed in cases in half of the federal districts in the country, they may not have experience in your jurisdiction. Due to their workload, they are only assigned to a limited number of cases. Also, they cannot do subjective analysis of the discovery for your particular client. For example, they will not tell defense counsel “here are all the files that relate to your client.” You will still need to develop a theory of defense, and use the tools provided to search, review and prepare the defense case (but that is what you are trained to do).

Currently, CDAs provide experience, technical proficiency, dedicated staff and accountability with experience in more than 45 federal district courts.

Local Coordinating Discovery Attorneys

In several jurisdictions, districts have appointed coordinating discovery attorneys on cases. Typically, they are attorneys that the court, or those who manage the CJA panel in that jurisdiction, have identified as having e-discovery experience in criminal cases. Since they are working in their own jurisdictions, these local CDAs know the practices and the types of government discovery productions, which can help them when working with CJA panel attorneys. However, they have limited experience performing discovery coordination. Local CDAs do not have experienced staff, such as project managers or technical support personnel, who are knowledgeable and skilled with litigation support technology. Due to their limited assignments to complex cases (only a handful of local CDAs have been appointed to multidefendant cases), they have limited experience vetting litigation support and e-discovery vendors, and they do not have the breadth of experience training on various technology solutions that the national CDAs do.

Joint Paralegals or Investigators

Joint paralegals have provided discovery coordination in a number of cases. Typically, this assistance has been done informally, where the paralegal has been officially appointed to assist one attorney representing a single defendant, but with the understanding that they may assist all of the defense teams with basic organization of discovery. The advantages of joint paralegals are that they often have significant experience working with and managing discovery in their own cases, and they are familiar with litigation support technology.

Though having a joint paralegal (or investigator) provide basic organization for multiple defense teams can work, there are issues to consider upfront to improve success for everyone. One question to clarify at the inception is how they are appointed to work in the case. Typically, even if there is an understanding that a joint paralegal’s work may be used to assist multiple defense teams, practically they will be appointed to assist a single client. Defense teams need to address what work is to be done, define what specific output they expect the paralegal to provide (e.g., the level of detail in joint indices, what objective information is to be coded, whether they will be producing spreadsheets or Word indices, etc.), and agree on the prioritization of the work. Both the attorney who is responsible for the joint paralegal (under Rule of Professional Conduct 5.3) and the joint paralegal need to be clear about their roles between themselves and the rest of the defense teams and be aware of potential ethical considerations that may arise. As one example, an attorney for a different client may ask the joint paralegal to do subjective analysis specific to their client, but this request could reveal case strategy that the attorney may not want to share with defense counsel representing other clients.

Joint paralegals will be limited in communicating with the government. They are not in a position to receive the discovery directly from the government, nor are they in a position to lead a Rule 16.1 conference (though they certainly can assist counsel during that meeting or process). Joint paralegals rarely have staff to assist them. They will have limited experience compared to CDAs regarding various technology challenges that may be present in a case, limited exposure to litigation support technology outside of what they have been able to use with the defense counsel they have worked with, and likely limited experience in vetting litigation support and e-discovery vendors. Finally, they will have limited or no experience in preparing funds requests to the court for third-party assistance.

Third Party Vendors

There are several vendors who have worked on CJA cases and who have played an important role in discovery coordination in multidefendant cases. Among other things, they have served as a clearinghouse for discovery productions, pushed out discovery to the various defense teams, and provided discovery tools such as spreadsheets or online databases for use in cases. In the right situation, a good litigation support or e-discovery vendor can bring industry-standard technology, security, and experience along with their services. They frequently have staff who can assist on the project, so they can scale up or down depending on the size of the case.

However, most vendors do not have significant experience working on CJA cases. Most litigation support and e-discovery vendors are focused on civil litigation (especially since it is challenging to be a viable business subsisting only on CJA cases).

Similar to joint paralegals, vendors will not be appointed to the case, but rather appointed to assist one of the defense teams, even if their work is done on behalf of the other defense teams as well. They, and the attorney who filed the funds request for their assistance from the court, must be mindful of the ethical issues that can arise, as the appointment will likely be specific to one attorney and client.

Vendors cannot communicate with the government about the format of production or issues with the data provided. They will not have experience participating in Rule 16.1 “meet and confers” with the government. Though they are experienced with litigation support technology, they may default to their own solutions, even if those are a poor fit for the needs of the case. They will not be experienced in vetting other litigation support and e-discovery vendors to make certain that vendors provide quality services at the best possible rates, nor will they be able to prepare funds requests for third-party assistance.

Conclusion

Whatever your situation, be it a single or multidefendant case, the NLST is available to consult with appointed counsel when considering how best to manage and organize discovery in your case.


[1] See generally Case-Budgeting Techniques and Other Cost-Containment Policies, https://www.fd.org/sites/default/files/cja_resources/case-budgeting-techniques-and-other-cost-containment-strategies.pdf

E-Discovery: Mobile Forensic Reports

By Sean Broderick and John C. Ellis, Jr.

[Editor’s Note: Sean Broderick is the National Litigation Support Administrator.  He provides guidance and recommendations to federal courts, federal defender organization staff, and court appointed attorneys on electronic discovery and complex cases, particularly in the areas of evidence organization, document management and trial presentation. Sean is also the co-chair of the Joint Working Group on Electronic Technology in the Criminal Justice System (JETWG), a joint Department of Justice and Administrative Office of the U.S. Courts national working group which examines the use of electronic technology in the federal criminal justice system and suggested practices for the efficient and cost-effective management of post-indictment electronic discovery. 

John C. Ellis, Jr. is a National Coordinating Discovery Attorney for the Administrative Office of the U.S. Courts, Defender Services Office. In this capacity, he provides litigation support and e-discovery assistance on complex criminal cases to defense teams around the country. Before entering private practice, Mr. Ellis spent 13 years as a trial attorney and supervisory attorney with Federal Defenders of San Diego, Inc. He also serves as a digital forensic consultant and expert.]

Most federal criminal cases involve discovery that originally came from a cell phone. CJA panel attorneys and Federal Defenders have now become accustomed to receiving “reports” generated from Cellebrite.[1] In this blog post, we will talk about the valuable information that may be contained in those Cellebrite generated reports and what form of production you can get the reports in. Spoiler alert: we suggest you request that you receive those reports in Cellebrite Reader format and not just default to the PDF format that you know and love.

We are going to cover:

  1. the basic concepts behind the forensic process that law enforcement uses when using Cellebrite UFED to extract information from a phone,
  2. what is a Cellebrite generated mobile forensic report (which Cellebrite calls extraction reports), and
  3. the pros and cons for the potential formats you can receive Cellebrite generated reports in.

Though there are a number of forensic tools that law enforcement may use to extract data from a phone, the most common is Cellebrite. We are going to discuss Cellebrite, but know there are others (e.g. Oxygen, Paraben, etc.). Many of the processes and principles that apply to Cellebrite will apply to other tools.

Basic concepts behind the forensic process

How does a digital forensic examiner get the data from the mobile phone? Extracting data from mobile devices (a.k.a. acquisition) is complex and requires a great amount of skill when done correctly. For purposes of this blog post, we are only going to focus on one concept, which is the type of extraction that was performed. In order to retrieve data from a mobile phone, an examiner attaches the mobile phone to a computer which has the Cellebrite UFED software, follows a series of protocols, and saves a portion of the data on an external storage device. In most cases, examiners will not retrieve all data that was on the mobile phone at the time of the extraction—this is based in part on the phone’s memory architecture. Moreover, the type of extraction that is performed on the device can limit the amount of data that is retrieved.

The following are the most common types of extractions for Android devices: (1) Logical (or Advanced Logical); (2) File System; and (3) Physical. For Apple devices, the most common types are Logical (Partial) and Advanced Logical. Generally, physical extractions retrieve the most data. Physical extractions of iPhones, however, are currently not available with Cellebrite for any device newer than the iPhone 4.

After a digital forensic examiner does an extraction of a phone (for this example, we will assume that the extraction was done through Cellebrite UFED4PC), the software generates extraction files/folders, along with a .UFD (text) file that tells Cellebrite Physical Analyzer basic information about the extraction (such as which UFED was used, start and finish times, and hash information). The extraction files can be produced in a number of formats (.zip and .bin are common examples) depending on the type of extraction done. The takeaway here is that the type of extraction impacts the type and volume of data that is retrieved during the extraction process.
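Because the .UFD file documents hash information about the extraction, one practical check when the defense receives an extraction is to recompute the hash of the extraction file and compare it to the documented value. The sketch below is a generic illustration, not part of Cellebrite’s tooling; the file name and expected hash are hypothetical placeholders, and SHA-256 is shown only as an example (use whichever algorithm the examiner documented). As always, work from a copy of the data, not the original.

```python
# verify_extraction_hash.py -- minimal sketch: recompute the hash of a received
# extraction file and compare it to the value documented by the examiner
# (for example, in the accompanying .UFD text file). Paths and values are placeholders.
import hashlib

EXTRACTION_FILE = r"EvidenceCopy\phone_extraction.zip"   # hypothetical path to your working copy
EXPECTED_HASH = "paste-the-documented-hash-here"         # hypothetical placeholder

hasher = hashlib.sha256()                                # swap in hashlib.md5() if MD5 was documented
with open(EXTRACTION_FILE, "rb") as f:
    for chunk in iter(lambda: f.read(1024 * 1024), b""):  # read in 1 MB chunks to handle large files
        hasher.update(chunk)

computed = hasher.hexdigest()
print("Computed hash:", computed)
if computed.lower() == EXPECTED_HASH.lower():
    print("Hashes match: the copy appears intact.")
else:
    print("Hashes do NOT match: confirm you have a complete, unaltered copy before review.")
```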

What is a Cellebrite generated report?

After extracting the data, the examiner uses Cellebrite Physical Analyzer to review the data retrieved from the mobile phone. The examiner also has the option of generating a report, which allows users without specialized forensic software to view the data retrieved from the mobile phone. As discussed below, the “extraction report” may be produced in multiple formats. Of note, the examiner can apply filters to decide what data types to export (e.g., emails, images, instant messages, searched items, etc.), and can further filter the data by date range. These reports are limited to the data extracted from the original device and to the parameters the forensic examiner sets in the forensic program. The takeaway here is that a report does not necessarily include all data that was retrieved during the extraction.

Options for the Cellebrite generated report (extraction report)

Cellebrite generated reports, like the extractions described above, contain information from the mobile phone. This may include text messages, emails, call logs, web browsing history, location data, etc. They can be produced in a number of formats, though the most common are .PDF, .HTML, and .UFDR. There are pros and cons for each format of report.

PDF

[Screenshot: report in PDF format]

There are several pros to receiving a Cellebrite generated report in PDF. CJA panel attorneys and Federal Defender defense teams are used to working with PDFs. It is easy to add Bates stamps to them. They work on Macs. And they can be annotated and highlighted.

But there are also several important cons that make PDF a less desirable file type for Cellebrite generated reports. For instance, because phones have the capacity to contain large volumes of data, the reports generated from extractions can be quite large. A Cellebrite generated PDF report can easily reach 10,000 pages, which can cause a computer to slow down or even crash. Moreover, users cannot sort or filter data, hide data fields, or search within search results. In short, although PDF is a convenient file type, it is not the most useful or efficient format for reviewing these types of reports.

HTML

[Screenshot: report in HTML format]

There are several pros to receiving a Cellebrite generated report in the HTML format. The files load fast and can be viewed in any browser (such as Chrome, Firefox or Safari). In this format, each data type, such as SMS messages, is hyperlinked and opens in a new browser window or tab. (Please note that the hyperlinks only work if the accompanying data files are provided along with the HTML file, which can easily be overlooked when people move data.) Moreover, it is easy to search within HTML files, and they operate on Macs.

But like PDFs, HTML files have several notable cons. First, you cannot sort or filter the data. Nor can you hide data fields. And you cannot easily generate reports for other subsets of information. Although HTML files are easy to use, they have significant limitations when it comes to reviewing reports.
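Because an HTML report depends on the data files delivered with it, it is worth confirming that the files it references actually arrived before you begin review. The sketch below is a generic check written for illustration (it is not part of Cellebrite’s software); the report path is a hypothetical placeholder. It scans the HTML for local file references and lists any that are missing from the production.

```python
# check_html_report_links.py -- minimal sketch: scan an HTML extraction report for
# local file references and flag any that were not delivered with it.
import os
import re

REPORT = r"ExtractionReport\Report.html"   # hypothetical path to the HTML report
report_dir = os.path.dirname(os.path.abspath(REPORT))

with open(REPORT, "r", encoding="utf-8", errors="ignore") as f:
    html = f.read()

# Collect href/src targets; ignore web links, mail links, and in-page anchors.
targets = re.findall(r'(?:href|src)="([^"]+)"', html)
missing = []
for target in targets:
    target = target.split("#")[0]
    if not target or target.startswith(("http://", "https://", "mailto:", "javascript:")):
        continue
    local_path = os.path.normpath(os.path.join(report_dir, target.replace("%20", " ")))
    if not os.path.exists(local_path):
        missing.append(target)

print(f"Checked {len(targets)} references; {len(missing)} appear to be missing.")
for target in missing[:25]:                 # show the first 25 missing items
    print("  missing:", target)
```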

UFDR

[Screenshot: report in UFDR format]

The best format for receiving Cellebrite generated reports is the Cellebrite Reader (UFDR) format. The Cellebrite Reader format allows a user to create reports containing all data, or a portion thereof, in multiple formats including PDF, HTML and UFDR. So, if you receive a report in UFDR format, you can easily convert it to PDF or HTML later on (which is not possible if you receive it in HTML or PDF). Additionally, in this file format, users can sort and filter data, search within results, move or reorder data within columns, and create tags, which is a convenient way to organize large volumes of discovery. And a user can open multiple UFDR files at the same time and search across them. This allows a user to, amongst other things, search for keywords across multiple devices simultaneously.

The one downside to UFDR files is that they will not work on a Mac. You also need the free Cellebrite Reader program to open and use a UFDR file. Overall, this is the format you should request when speaking to the government about the form in which you would like Cellebrite generated reports produced.

Final note about formats: When deciding about your preferred format to review a Cellebrite generated report, remember that it is easy for an examiner to select all three formats at the same time. Often, an examiner will provide all three to make it easier for people to review the data in the way they want.

Conclusion

Mobile forensic reports are a ubiquitous part of discovery. When reviewing them, it is important to remember that the information in a report is constrained by the inherent limitations of retrieving data from mobile devices, the type of extraction performed on the device, and the data the examiner decided to include in the report. The form of production of the report can also affect how you review the data. Attorneys should consider contacting an expert or consultant if they have questions about the contents of a report.

Of note, Troy Schnack, Computer System Administrator for Federal Public Defender Office in Kansas City, Missouri, will be doing a webinar on mobile devices and will go into detail regarding Cellebrite Reader on Tuesday, September 22, 2020. Please register for the program on fd.org – we highly recommend it.


[1] Cellebrite UFED is a mobile forensic software program that allows trained users to extract and analyze phone call history, contact information, audio, photos, videos, and texts from mobile phones or from forensic images of mobile devices produced as part of discovery. It has wide coverage for accessing digital devices from Android to Apple, with more than 31,000 device profiles of the most common phones. Cellebrite UFED can come as software only or can include a physical unit with accessories such as a tip and cable set to connect to various mobile devices.

 

Ephemeral Messaging Apps

[Editor’s Note: John C. Ellis, Jr. is a National Coordinating Discovery Attorney for the Administrative Office of the U.S. Courts, Defender Services Office. In this capacity, he provides litigation support and e-discovery assistance on complex criminal cases to defense teams around the country. Before entering private practice, Mr. Ellis spent 13 years as a trial attorney and supervisory attorney with Federal Defenders of San Diego, Inc. He also serves as a digital forensic consultant and expert.]

Ephemeral Messaging Apps are a popular form of communication. With privacy a concern for everyone, using a self-destructing message that works like disappearing ink for text and photos has a certain allure. All messages are purposely short-lived, with the message deleting on the receiver’s device, the sender’s device, and on the system’s servers seconds or minutes after the message is read. Although these apps were initially only used by teenagers, they are now a ubiquitous part of corporate culture.

In the 6th Annual Federal Judges Survey, put together by Exterro, Georgetown Law CLE, and EDRM, 20 federal judges were asked “[w]hat new data type should legal teams be most worried about in the next 5 years?”[1] The overwhelming response was “Ephemeral Apps (Snapchat, Instagram, etc.).” Id. In fact, 68% of those surveyed believed ephemeral messaging apps were the most worrisome new data type, whereas only 16% responded that biometric data (including facial recognition and fingerprinting) was the greatest risk. Only 5% were concerned with Text Messages and Mobile, and 0% were concerned with traditional social media such as Facebook and Twitter. Id.

Even now, courts are attempting to sort out the evidentiary issues caused by ephemeral messaging apps. See, e.g., Waymo LLC v. Uber Technologies, Inc., No. 17cv0939-WHA (N.D. Cal.). This article describes popular ephemeral messaging apps and discusses guidelines for addressing potential evidentiary issues.

Short technical background:

There are several background definitions relevant to this discussion:

  1. Text Messages – otherwise known as SMS (“Short Message Service”) messages, text messages allow mobile device users to send and receive messages of up to 160 characters. These messages are sent using the mobile phone carriers’ network. Twenty-three billion text messages are sent worldwide each day.  Generally, mobile carriers do not retain the contents of SMS messages, so the records will only show the phone number that sent or received the messages and the time it was sent or received.
  2. Messaging Apps – allow users to send messages not tethered to a mobile device (i.e., a phone number). With some apps, a user may send messages from multiple devices. These apps include iMessage, WhatsApp, and Facebook Messenger. Messaging Apps are generally free. Unlike text messages, these apps rarely have monthly billing records or records showing when messages were sent or received.
  3. Ephemeral Messaging Apps – are a subset of Messaging Apps that allow users to cause messages (words or media) to disappear on the recipient’s device after a short duration. The duration of the message’s existence is set by the sender. Messages can last for seconds or days, unless the receiver of the message takes a “screenshot” of the message before its disappearance.
  4. End-to-End Encryption – also known as E2EE, this is a type of encryption where only the communicating parties can decipher the messages, which prevents eavesdroppers from reading them in transit.

Common Disappearing Messaging Apps:

Messaging apps, like all apps, are changing.  The following is a list and description of several popular ephemeral messaging apps.


Snapchat – both a messaging platform and a social network. The app allows users to send messages and media (including words and emojis appearing on the media) that disappear after a set period of time. Photos and videos created on Snapchat are called “snaps.” Approximately 1 million snaps are sent per day.

Signal – an encrypted communications app that uses the Internet to send one-to-one and group messages which can include files, voice notes, images and videos, which can be set to disappear after a set period of time. According to Wired, Signal is the one messaging app everyone should be using.

Wickr Me – a messaging app that allows users to exchange end-to-end encrypted and content-expiring messages, including photos, videos, and file attachments.

Telegram – cloud-based instant messaging app with end-to-end encryption that allows users to send messages, photos, videos, audio messages and files. It has a feature where messages and attachments can disappear after a set period of time.

CoverMe – a private messaging app that allows users to exchange messages, files, photographs, and phone calls from a fake (or “burner”) phone number. It also allows for private internet browsing and allows users to hide messages and files.

Confide – a messaging app that allows users to send end-to-end encrypted messages. The user can also send self-destructing messages that are purportedly screenshot-proof.

Evidentiary Issues:

Messaging app data, like other forms of evidence, must, amongst other criteria, be relevant (Fed.R.Evid. 401), be authenticated (Fed.R.Evid. 901 et seq.), and comply with the best evidence rule (Fed.R.Evid. 1001 et seq.).

As for the Best Evidence Rule, based on the nature of disappearing messaging apps, the original writing of the message is not preserved for litigation. See Fed.R.Evid. 1004(a) (providing that the original is not required if “all the originals are lost or destroyed, and not by the proponent acting in bad faith”). Sometimes, the contents of the message may be established by the testimony of a witness. In other cases, the contents of the message may be based on a screenshot of the message.

Authenticating messages from apps, regardless of their ephemeral nature, is often difficult; text messages can be easily faked. When it comes to ephemeral messages, we often must rely upon a screenshot or testimony regarding the alleged contents of the message. In such circumstances, the following factors, repurposed from Best Practices for Authenticating Digital Evidence, are useful[2]:

  • testimony from a witness who identifies the account as that of the alleged author, on the basis that the witness on other occasions communicated with the account holder;
  • testimony from a participant in the conversation based on firsthand knowledge that the screen shot fairly and accurately captures the conversation;
  • evidence that the purported author used the same messaging app and associated screen name on other occasions;
  • evidence that the purported author acted in accordance with the message (e.g., when a meeting with that person was arranged in a message, he or she attended);
  • evidence that the purported author identified himself or herself as the individual sending the message;
  • use in the conversation of the customary nickname, avatar, or emoticon associated with the purported author;
  • disclosure in the message of particularized information either unique to the purported author or known only to a small group of individuals including the purported author;
  • evidence that the purported author had in his or her possession information given to the person using the messaging app;
  • evidence that the messaging app was downloaded on the purported author’s digital device; and
  • evidence that the purported author elsewhere discussed the same subject.

Conclusion:

Ephemeral messaging app data will continue to impact investigators, attorneys, and the Court. Defense teams should be prepared for the challenges ephemeral messages cause from investigations to evidentiary issues.


[1] Available at https://www.exterro.com/2020-judges-survey-ediscovery.

[2] Hon. Grimm, Capra, and Joseph, Best Practices for Authenticating Digital Evidence (West Academic Publishing 2016), pp. 11-12.