III. Background

A. What is Face Recognition Technology?

Face recognition is the automated process of comparing two images of faces to determine whether they represent the same individual.

Before face recognition can identify someone, an algorithm must first find that person’s face within the photo. This is called face detection. Once detected, a face is “normalized”—scaled, rotated, and aligned so that every face that the algorithm processes is in the same position. This makes it easier to compare the faces. Next, the algorithm extracts features from the face—characteristics that can be numerically quantified, like eye position or skin texture. Finally, the algorithm examines pairs of faces and issues a numerical score reflecting the similarity of their features.
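
To make these stages concrete, the sketch below strings them together in Python. It is a minimal illustration under simplifying assumptions, not any vendor’s actual algorithm: the detect_face, normalize, and extract_features helpers are trivial placeholders for the stages described above, and cosine similarity stands in for whatever proprietary scoring function a real system uses.

```python
import numpy as np

# Hypothetical stand-ins for the pipeline stages described in the text.
# A real system would use trained detection and feature-extraction models.
def detect_face(photo: np.ndarray) -> np.ndarray:
    return photo  # placeholder: assume the photo is already cropped to a face

def normalize(face: np.ndarray) -> np.ndarray:
    return face / 255.0  # placeholder for scaling, rotating, and aligning the face

def extract_features(face: np.ndarray) -> np.ndarray:
    return face.flatten().astype(float)  # placeholder for a learned feature vector

def similarity(photo_a: np.ndarray, photo_b: np.ndarray) -> float:
    """Compare two photos and return a single similarity score."""
    feat_a = extract_features(normalize(detect_face(photo_a)))
    feat_b = extract_features(normalize(detect_face(photo_b)))
    # Cosine similarity: 1.0 means the feature vectors are identical.
    return float(np.dot(feat_a, feat_b) /
                 (np.linalg.norm(feat_a) * np.linalg.norm(feat_b)))
```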

Face recognition is inherently probabilistic: It does not produce binary “yes” or “no” answers, but rather identifies more likely or less likely matches.8 Most police face recognition systems will output either the top few most similar photos or all photos above a certain similarity threshold. Law enforcement agencies call these photos “candidates” for further investigation.
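
That candidate-list behavior can be sketched as follows. The function name, parameters, and the feature-vector gallery are illustrative assumptions rather than any product’s real interface: depending on how a system is configured, it returns either every photo scoring at or above a similarity threshold or the top few most similar photos.

```python
import numpy as np

def rank_candidates(probe, gallery, threshold=None, top_k=5):
    """Return ranked (photo_id, similarity) candidates for a probe face.

    probe   -- feature vector for the unknown face
    gallery -- dict mapping photo IDs to enrolled feature vectors
    """
    def cosine(a, b):
        return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

    scores = sorted(((pid, cosine(probe, feats)) for pid, feats in gallery.items()),
                    key=lambda pair: pair[1], reverse=True)
    if threshold is not None:
        # All photos scoring at or above the threshold become candidates.
        return [pair for pair in scores if pair[1] >= threshold]
    # Otherwise, return only the top few most similar photos.
    return scores[:top_k]
```

Either way, the output is a ranked list of candidates for human investigation, not a definitive identification.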

Some facial features may be better indicators of similarity than others. Many face recognition algorithms figure out which features matter most through training. During training, an algorithm is given pairs of face images of the same person. Over time, the algorithm learns to pay more attention to the features that were the most reliable signals that the two images contained the same person.

  • 8. See generally Joseph N. Pato and Lynette I. Millett, eds., Biometric Recognition: Challenges and Opportunities, 36–45 (National Academies Press 2010) (hereinafter “Pato Report”).

The make-up of a training set can influence the kinds of photos that an algorithm is most adept at examining. For example, if a training set is skewed towards a certain race, the algorithm may be better at identifying members of that group as compared to individuals of other races. In general, this phenomenon is known as overfitting to the training data.9

The mathematical machinery behind a face recognition algorithm can include millions of variables that are optimized in the process of training. This intricacy is what gives an algorithm the capacity to learn, but it also makes it very difficult for a human to examine an algorithm or generalize about its behavior.
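
As a rough illustration of training, the toy loop below learns one weight per facial feature using logistic regression over labeled pairs. It deliberately simplifies the description above: it assumes both matching and non-matching pairs are available, and it optimizes a handful of weights where a modern face recognition model optimizes millions of parameters.

```python
import numpy as np

def train_feature_weights(pair_diffs, labels, epochs=200, lr=0.1):
    """Toy training loop: learn how much weight each facial feature deserves.

    pair_diffs -- array of shape (n, d); each row holds the absolute
                  differences between two faces' d feature values
    labels     -- array of shape (n,); 1 if the pair shows the same person,
                  0 otherwise
    """
    n, d = pair_diffs.shape
    weights = np.zeros(d)
    bias = 0.0
    for _ in range(epochs):
        # Smaller feature differences should mean "more likely the same person."
        logits = bias - pair_diffs @ weights
        preds = 1.0 / (1.0 + np.exp(-logits))   # predicted probability of a match
        error = preds - labels
        # Gradient step: features whose differences reliably separate matching
        # from non-matching pairs end up with larger weights.
        weights += lr * (pair_diffs.T @ error) / n
        bias -= lr * error.mean()
    return weights, bias
```

If the labeled pairs come mostly from one demographic group, the learned weights will reflect whichever features happened to work best for that group, which is the skew described above.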

  • 9. See below Findings: Racial Bias.

B. Unique Risks of Face Recognition

Most law enforcement technology tracks your technology—your car, your phone, or your computer. Biometric technology tracks your body.10 The difference is significant.

Americans change smartphones roughly every two and a half years and replace cars every five to six and a half years.11 Fingerprints have been shown to remain stable for more than a decade and are believed to be stable for life.12 Separately, many states’ driver’s license renewal requirements ensure that state governments consistently have an up-to-date image of a driver’s face.13

  • 10. Technically, biometric technology also analyzes human behavior, such as gait or keystroke patterns. This report will not focus on this aspect of biometrics. See Pato Report at 18 (defining biometrics as “the automated recognition of individuals based on their behavioral and biological characteristics.”).
  • 11. See Thomas Gryta, Americans Keep Their Cellphones Longer, Wall Street Journal (Apr. 18, 2016), http://www.wsj.com/articles/americans-keep-their-cellphones-longer-1461007321 (average phone replaced every 28 months in Q4 2015, according to Citigroup); Average Age of Light Vehicles in the U.S. Rises Slightly in 2015 to 11.5 years, IHS Reports, IHS Markit (July 29, 2015), http://press.ihs.com/press-release/automotive/average-age-light-vehicles-us-rises-slightly-2015-115-years-ihs-reports (average length of new vehicle ownership is 77.8 months, used vehicle ownership 63 months).
  • 12. See Soweon Yoon and Anil K. Jain, Longitudinal study of fingerprint recognition, 112 Proc. of the Nat’l Acad. of Sci. 8556 (July 14, 2015) (establishing the stability of high quality fingerprints for at least 12 years; citing anecdotal belief in stability of fingerprints over a lifetime).
  • 13. See generally Driver’s License Renewal, American Automobile Association, http://drivinglaws.aaa.com/tag/drivers-license-renewal/ (last visited Sept. 23, 2016) (showing that most states require driver’s licenses to be renewed every four to eight years).

Here, we can begin to see how face recognition creates opportunities for tracking—and risks—that other biometrics, like fingerprints, do not. Along with names, faces are the most prominent identifiers in human society—online and offline. Our faces—not fingerprints—are on our driver’s licenses, passports, social media pages, and online dating profiles. Except for extreme weather, holidays, and religious restrictions, it is generally not considered socially acceptable to cover one’s face; often, it’s illegal.14 You only leave your fingerprints on the things you touch. When you walk outside, your face is captured by every smartphone and security camera pointed your way, whether or not you can see them.

Face recognition isn’t just a different biometric; its differences allow for a different kind of tracking, one that can occur from far away, in secret, and on large numbers of people.

Professor Laura Donohue explains that, until the 21st century, governments used biometric identification in a discrete, one-time manner to identify specific individuals. That identification usually required the person’s proximity or cooperation, making the process transparent to that person, and it typically occurred in the course of a detention or in a secure government facility. Donohue refers to this form of biometric identification as Immediate Biometric Identification, or IBI. A prime example of IBI is the practice of fingerprinting someone during booking for an arrest.

In its most advanced forms, face recognition allows for a different kind of tracking. Donohue calls it Remote Biometric Identification, or RBI. In RBI, the government uses biometric technology to identify multiple people in a continuous, ongoing manner. It can identify them from afar, in public spaces. Because of this, the government does not need to notify those people or get their consent. Identification can be done in secret.

This is not business as usual: This is a capability that is “significantly different from that which the government has held at any point in U.S. history.”15

  • 14. See, e.g., Ga. Code Ann., § 16-11-38 (2010); La. Stat. Ann. § 14:313 (2011); N.Y. Penal Law § 240.35 (McKinney 2010); Va. Code Ann. § 18.2-422 (2006).
  • 15. See Laura K. Donohue, Technological Leap, Statutory Gap, and Constitutional Abyss: Remote Biometric Identification Comes of Age, 97 Minn. L. Rev. 407, 415 (2012). Professor Donohue notes that face recognition is just the first of a new generation of biometrics—including iris identification and gait analysis—that allow for RBI. Id. Of all of these technologies, however, face recognition is by far the most widely deployed.

C. How Does Law Enforcement Use Face Recognition?

The first successful fully automated face recognition algorithm was developed in the early 1990s.16 Today, law enforcement agencies mainly use face recognition for two purposes. Face verification seeks to confirm someone’s claimed identity. Face identification seeks to identify an unknown face. This report focuses on face identification by state and local police and the FBI.
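
In standard biometrics terms, verification is a one-to-one comparison against a single claimed identity, while identification is a one-to-many search of an entire database. The fragment below, a sketch under the same simplifying assumptions as earlier (hypothetical feature vectors and cosine similarity), shows the difference in shape between the two tasks.

```python
import numpy as np

def cosine(a, b):
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def verify(probe, claimed, threshold=0.8):
    """Face verification (one-to-one): does the probe match one claimed identity?"""
    return cosine(probe, claimed) >= threshold

def identify(probe, database, top_k=5):
    """Face identification (one-to-many): search every enrolled photo and
    return ranked candidates rather than a single yes-or-no answer."""
    scores = sorted(((pid, cosine(probe, feats)) for pid, feats in database.items()),
                    key=lambda pair: pair[1], reverse=True)
    return scores[:top_k]
```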

  • 16. Matthew Turk & Alex Pentland, Eigenfaces for Recognition, 3 J. Cognitive Neurosci. 71, 72 (1991); see also A. Jay Goldstein, Leon D. Harmon, & Ann B. Lesk, Identification of Human Faces, 59 Proc. of the IEEE 748, 748 (1971). Attempts to automate face recognition go back decades earlier, but those techniques were only partially automated. Goldstein et al. asked human jurors to examine images and manually identify and classify facial features (such as “eyebrow weight” and “chin profile”) on scales from 1 to 5. A computer was then given a description of a target in the form of a list of features and asked to use the jurors’ coding to find a match. In contrast, Turk and Pentland created a computer program that automatically compares images of faces.
Figure 3: A California policeman displays a mobile face recognition app. (Photo: Sandy Huffaker/The New York Times/Redux)

Law enforcement performs face identification for a variety of tasks. Here are four of the most common:

  • Stop and Identify. On patrol, a police officer encounters someone who either refuses or is unable to identify herself. The officer takes her photo with a smartphone or a tablet, processes that photo through software installed on that device or on a squad car computer, and receives a near-instantaneous response from a face recognition system. That system may compare that “probe” photo to a database of mug shots, driver’s license photos, or face images from unsolved crimes, also known as an “unsolved photo file.” (As part of this process, the probe photo may also be enrolled in a database.) This process is known as field identification.
  • Arrest and Identify. A person is arrested, fingerprinted and photographed for a mug shot. Police enroll that mug shot in their own face recognition database. Upon enrollment, the mug shot may be searched against the existing entries, which may include mug shots, license photos, and an unsolved photo file. Police may also submit the arrest record, including mug shot and fingerprints, to the FBI for inclusion in its face recognition database, where a similar search is run upon enrollment.
  • Investigate and Identify. While investigating a crime, the police obtain a photo or video still of a suspect from a security camera, smartphone, or social media post—or they surreptitiously photograph the suspect. They use face recognition to search that image against a database of mug shots, driver’s licenses, or an unsolved photo file and obtain a list of candidates for further investigation, or, in the case of the unsolved photo file, learn if the individual is wanted for another crime. Alternately, when police believe that a suspect is using a pseudonym, they search a mug shot of that suspect against these same databases.
  • Real-time Video Surveillance. The police are looking for an individual or a small number of individuals. They upload images of those individuals to a “hot list.” A face recognition program extracts faces from live video feeds of one or more security cameras and continuously compares them, in real time, to the faces of the people on the hot list. Every person who walks by those security cameras is subjected to this process. When it finds a match, the system may send an alert to a nearby police officer. Today, real-time face recognition is computationally expensive and is not instantaneous.17 Searches can also be run on archival video. (A simplified sketch of the hot-list comparison appears after this list.)
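
Below is a schematic sketch of the hot-list comparison loop described in the last item. It assumes faces have already been detected in each frame and reduced to feature vectors upstream; the stream format, the hot_list structure, and the alert callback are illustrative assumptions rather than any deployed system’s design.

```python
import numpy as np

def cosine(a, b):
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def watch_feeds(face_stream, hot_list, threshold=0.9, alert=print):
    """Schematic real-time hot-list matching.

    face_stream -- yields (camera_id, feature_vector) for every face found
                   in the live video feeds; every passerby is processed
    hot_list    -- dict mapping a sought person's name to an enrolled
                   feature vector
    """
    for camera_id, probe in face_stream:
        for name, enrolled in hot_list.items():
            score = cosine(probe, enrolled)
            if score >= threshold:
                # A deployed system might page a nearby officer here.
                alert(f"Possible match for {name} at camera {camera_id} (score {score:.2f})")
```

In practice this inner comparison must run for every face in every frame from every camera, which is where the computational cost described in note 17 arises.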

Face recognition is also used for driver’s license de-duplication. In this process, a department of motor vehicles compares the face of every new applicant for a license or other identification document to the existing faces in its database, flagging individuals who may be using a pseudonym to obtain fraudulent identification. Suspects are referred to law enforcement. However, because de-duplication is typically conducted by DMVs, not law enforcement, this use of the technology will not be a focus of this report.

  • 17. The task of sifting through dozens of high-resolution video frames each second and checking the faces that are found against databases of hundreds (let alone millions) of photos demands an enormous amount of expensive computing power. In the absence of this computing infrastructure, the video footage might need to be stored and processed minutes, hours, or even weeks later. Even when this infrastructure is available, the results will never be provided to an officer instantaneously—each step of the process, from recording the image and transmitting it to a computer to running the face recognition itself, takes a small but appreciable amount of time.

Sidebar 1: Face Recognition at the FBI

The FBI has used face recognition to support FBI and state and local police investigations since at least 2011.18 The FBI hosts one of the largest face recognition databases in the country, the Next Generation Identification Interstate Photo System (NGI-IPS). It is also home to a unit, Facial Analysis, Comparison, and Evaluation (FACE) Services, that supports other FBI agents by running or requesting face recognition searches of the FBI face recognition database, other federal databases, and state driver’s license photo and mug shot databases. (This report will refer to NGI-IPS as “the FBI face recognition database (NGI-IPS),” and will refer to FBI FACE Services as “the FBI face recognition unit (FACE Services).” The network of databases that the unit searches will be called “the FBI FACE Services network.”)

The FBI face recognition database (NGI-IPS) is mostly made up of the mug shots accompanying criminal fingerprints submitted to the FBI by state, local, and federal law enforcement agencies. It contains nearly 25 million state and federal criminal photos.19 Police in seven states can run face recognition searches against the FBI face recognition database, as can the FBI face recognition unit.20

The FBI face recognition unit (FACE Services) runs face recognition searches against a network of databases that includes 411.9 million photos. Over 185 million of these photos are drawn from 12 states that let the FBI search their driver’s license and other ID photos; another 50 million are from four additional states that let the FBI search both driver’s license photos and mug shots.21 While we do not know the total number of individuals those photos implicate, there are close to 64 million licensed drivers in those 16 states.22

The FBI is expanding the reach of the FACE Services network, but the details are murky. In October 2015, the FBI began a pilot program to search photos against the State Department’s passport database, but it is unclear whether the FBI is searching the photos of all 125 million U.S. passport holders or only a subset of that database.23

In a May 2016 report, the Government Accountability Office reported that the FBI was negotiating with 18 additional states and the District of Columbia to access their driver’s license photos. In August, the GAO re-released the report, deleting all references to the 18 states and stating that there were “no negotiations underway.”24 The FBI now suggests that its agents had only conducted outreach to those states to explore the possibility of their joining the FACE Services network.25

The GAO report found that the FBI had failed to issue mandatory privacy notices required by federal law, failed to conduct adequate accuracy testing of the FBI face recognition database (NGI-IPS) and the state databases that the FBI face recognition unit accessed, and failed to audit the state searches of the FBI face recognition database or any of the face recognition unit’s searches.26

Despite these findings, the FBI is proposing to exempt the FBI face recognition database from key Privacy Act provisions that guarantee Americans the right to review and correct non-investigatory information held by law enforcement—and the right to sue if their privacy rights are violated.27

  • 18. See U.S. Gov’t Accountability Office, GAO-16-267, Face Recognition Technology: FBI Should Better Ensure Privacy and Accuracy 7, 15 (May 2016) (the Facial Analysis, Comparison, and Evaluation (FACE) Services unit began supporting investigations in August 2011).
  • 19. See U.S. Gov’t Accountability Office, GAO-16-267, Face Recognition Technology: FBI Should Better Ensure Privacy and Accuracy 46 (May 2016). It also contains almost 5 million “civil photos,” including photos submitted to the FBI for employment or immigration background checks, although these photos are not searched unless they are matched to people already enrolled in the criminal file. See generally Ernest J. Babcock, Privacy Impact Assessment for the Next Generation Identification (NGI) Interstate Photo System, Federal Bureau of Investigation, Department of Justice (Sept. 2015), https://www.fbi.gov/services/records-management/foipa/privacy-impact-assessments/interstate-photo-system.
  • 20. U.S. Gov’t Accountability Office, GAO-16-267, Face Recognition Technology: FBI Should Better Ensure Privacy and Accuracy 12–13  (May 2016).
  • 21. U.S. Gov’t Accountability Office, GAO-16-267, Face Recognition Technology: FBI Should Better Ensure Privacy and Accuracy 47–48 (May 2016).
  • 22. See Federal Highway Administration, U.S. Department of Transportation, Highway Statistics (Sept. 2015), http://www.fhwa.dot.gov/policyinformation/statistics/2014/pdf/dl22.pdf.
  • 23. See U.S. Gov’t Accountability Office, GAO-16-267, Face Recognition Technology: FBI Should Better Ensure Privacy and Accuracy 48 n. c (May 2016); U.S. Department of State, Bureau of Consular Affairs, U.S. Passports & International Travel: Passport Statistics, https://travel.state.gov/content/passports/en/passports/statistics.html (last visited Sept. 21, 2016) (showing that as of 2015, there are 125,907,176 valid U.S. passports in circulation).
  • 24. Compare U.S. Gov’t Accountability Office, GAO-16-267, Face Recognition Technology: FBI Should Better Ensure Privacy and Accuracy 51 (May 2016) (uncorrected copy, on file with authors) with U.S. Gov’t Accountability Office, GAO-16-267, Face Recognition Technology: FBI Should Better Ensure Privacy and Accuracy 51 (May 2016) (corrected copy) http://www.gao.gov/assets/680/677098.pdf.
  • 25. See Privacy SOS, In bizarre reversal, FBI suddenly claims it is not negotiating with states over face recognition access, ACLU of Massachusetts (Aug. 10, 2016), https://privacysos.org/blog/fbi-changes-tune-about-face-recognition-and-state-rmvs/.
  • 26. See U.S. Gov’t Accountability Office, GAO-16-267, Face Recognition Technology: FBI Should Better Ensure Privacy and Accuracy 18–32  (May 2016).
  • 27. See Privacy Act of 1974; Implementation, 81 Fed. Reg. 27288, 27289 (proposed May 5, 2016) (to be codified at 28 C.F.R. pt. 16); Center on Privacy & Technology et al., Comment on Proposed Rule to Exempt Next Generation Identification System from Provisions of the Privacy Act and the Modified System of Records Notice for that System (July 6, 2016), https://www.regulations.gov/document?D=DOJ-OPCL-2016-0008-0114 (explaining the impact of the proposed exemptions).

D. Our Research

Thanks to the May 2016 Government Accountability Office report, the public now has access to basic information about the FBI’s face recognition programs and their privacy and accuracy issues. (Sidebar 1.)

By comparison, the public knows very little about the use of face recognition by state and local police, even though many of their systems are older, are used more aggressively, and are likely to have a greater impact on the average citizen. No one has combined what we know about FBI systems with information about state and local face recognition to paint a comprehensive, national picture of how face recognition is changing policing in America, and the impact of these changes on our rights and freedoms.

This report closes these gaps. It begins with a threshold question: Which uses of face recognition present greater or lesser risks to privacy, civil liberties, and civil rights? After proposing a Risk Framework for law enforcement face recognition, the report explores the following questions, each of which is answered in our Findings:

  • Deployment. Who is using face recognition, how often are they using it, and where do those deployments fall on the Risk Framework?
  • Fourth Amendment. How do agencies using face recognition protect our right to be free from unreasonable searches and seizures?
  • Free Speech. How do they ensure that face recognition does not chill our right to free speech, assembly, and association?
  • Accuracy. How do they ensure that their systems are accurate?
  • Racial Bias. How does law enforcement face recognition impact racial and ethnic minorities?
  • Transparency and Accountability. Are agencies using face recognition in a way that is transparent, accountable to the public, and subject to internal oversight?

To answer all of these questions, we submitted detailed public records requests to over 100 law enforcement agencies across the country. In total, our requests yielded more than 15,000 pages of responsive documents. Ninety agencies provided responsive documents—or substantive responses—of some kind. These responses suggested that at least 52 of the state and local law enforcement agencies we surveyed are now using, or have previously used or obtained, face recognition technology. (We will refer to these agencies as the “52 agencies.”) Of the 52 agencies, eight formerly used or acquired face recognition but have since discontinued those programs. Conversely, several other responsive agencies have opened their systems to hundreds of additional agencies.28

To support our public records research, we conducted dozens of interviews with law enforcement agencies, face recognition companies, and face recognition researchers; performed a fifty-state legal survey of biometric and related surveillance laws; and completed an in-depth review of the technical literature on face recognition. We confirmed our findings through two site visits to law enforcement agencies with advanced face recognition systems. (Our full research methodology, including a breakdown of our records requests and a template for those requests, can be found in the Methodology section.)

After assessing these risks, this report proposes concrete recommendations for Congress, state legislatures, federal, state and local law enforcement agencies, the National Institute of Standards and Technology, face recognition companies, and community leaders.

  • 28. See below Methodology for a list of all agencies we surveyed, grouped by type of response received.