Figure 1: Law Enforcement Face Recognition Use By Jurisdiction

I. Executive Summary

There is a knock on your door. It’s the police. There was a robbery in your neighborhood. They have a suspect in custody and an eyewitness. But they need your help: Will you come down to the station to stand in the line-up?

Most people would probably answer “no.” This summer, the Government Accountability Office revealed that close to 64 million Americans do not have a say in the matter: 16 states let the FBI use face recognition technology to compare the faces of suspected criminals to their driver’s license and ID photos, creating a virtual line-up of their state residents. In this line-up, it’s not a human that points to the suspect—it’s an algorithm.

But the FBI is only part of the story. Across the country, state and local police departments are building their own face recognition systems, many of them more advanced than the FBI’s. We know very little about these systems. We don’t know how they impact privacy and civil liberties. We don’t know how they address accuracy problems. And we don’t know how any of these systems—local, state, or federal—affect racial and ethnic minorities.

One in two American adults is in a law enforcement face recognition network.

This report closes these gaps. The result of a year-long investigation and over 100 records requests to police departments around the country, it is the most comprehensive survey to date of law enforcement face recognition and the risks that it poses to privacy, civil liberties, and civil rights. Combining FBI data with new information we obtained about state and local systems, we find that law enforcement face recognition affects over 117 million American adults. It is also unregulated. A few agencies have instituted meaningful protections to prevent the misuse of the technology. In many more cases, it is out of control.

The benefits of face recognition are real. It has been used to catch violent criminals and fugitives. The law enforcement officers who use the technology are men and women of good faith. They do not want to invade our privacy or create a police state. They are simply using every tool available to protect the people that they are sworn to serve. Police use of face recognition is inevitable. This report does not aim to stop it.

Rather, this report offers a framework to reason through the very real risks that face recognition creates. It urges Congress and state legislatures to address these risks through commonsense regulation comparable to the Wiretap Act. These reforms must be accompanied by key actions by law enforcement, the National Institute of Standards and Technology (NIST), face recognition companies, and community leaders. 

A. Key Findings

Our general findings are set forth below. Specific findings for 25 local and state law enforcement agencies can be found in our Face Recognition Scorecard, which evaluates these agencies’ impact on privacy, civil liberties, civil rights, transparency, and accountability. The records underlying all of our conclusions are available online.

Face recognition is neither new nor rare. FBI face recognition searches are more common than federal court-ordered wiretaps. At least one out of four state or local police departments has the option to run face recognition searches through its own or another agency’s system. At least 26 states (and potentially as many as 30) allow law enforcement to run or request searches against their databases of driver’s license and ID photos. Roughly one in two American adults has their photo searched this way.

A face recognition search conducted in the field to verify the identity of someone who has been legally stopped or arrested is different, in principle and effect, from an investigatory search of an ATM photo against a driver’s license database, or from continuous, real-time scans of people walking by a surveillance camera. The former is targeted and public. The latter are generalized and invisible. While some agencies, like the San Diego Association of Governments, limit themselves to more targeted use of the technology, others are embracing high- and very high-risk deployments.

Historically, FBI fingerprint and DNA databases have been primarily or exclusively made up of information from criminal arrests or investigations. By running face recognition searches against 16 states’ driver’s license photo databases, the FBI has built a biometric network that primarily includes law-abiding Americans. This is unprecedented and highly problematic.

Major police departments are exploring real-time face recognition on live surveillance camera video. Real-time face recognition lets police continuously scan the faces of pedestrians walking by a street surveillance camera. It may seem like science fiction. It is real. Contract documents and agency statements show that at least five major police departments—including agencies in Chicago, Dallas, and Los Angeles—either claimed to run real-time face recognition off of street cameras, bought technology that can do so, or expressed a written interest in buying it. Nearly all major face recognition companies offer real-time software.
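
Concretely, here is a minimal sketch of the generic pipeline such products implement: detect every face in each video frame, convert each face into a numeric template, and compare those templates against a watchlist. It assumes the open-source face_recognition and OpenCV libraries; the watchlist file, camera source, and 0.6 distance threshold are illustrative placeholders, not details of any agency’s or vendor’s actual system.

```python
# A minimal sketch of a generic real-time face recognition loop, using the
# open-source face_recognition and OpenCV libraries. The watchlist photo,
# camera source, and 0.6 distance threshold are illustrative assumptions;
# this does not depict any agency's or vendor's actual system.
import cv2
import face_recognition

# Build the "watchlist": face templates computed from enrolled photos.
names = []
known_encodings = []
for name, path in [("subject_1", "enrolled_photo_1.jpg")]:  # hypothetical file
    image = face_recognition.load_image_file(path)
    encodings = face_recognition.face_encodings(image)
    if encodings:
        names.append(name)
        known_encodings.append(encodings[0])

video = cv2.VideoCapture(0)  # a live camera feed
while True:
    ok, frame = video.read()
    if not ok:
        break
    rgb = cv2.cvtColor(frame, cv2.COLOR_BGR2RGB)  # the library expects RGB
    # Detect every face in the frame and compute a template for each.
    locations = face_recognition.face_locations(rgb)
    for encoding in face_recognition.face_encodings(rgb, locations):
        # Compare against the watchlist; smaller distance = more similar.
        distances = face_recognition.face_distance(known_encodings, encoding)
        for name, distance in zip(names, distances):
            if distance < 0.6:  # illustrative match threshold
                print(f"Possible match: {name} (distance {distance:.2f})")
video.release()
```

Note what the loop never does: it never asks whether a pedestrian is suspected of anything. Every face that passes the camera is detected, templated, and scored, which is what distinguishes this mode of use from a targeted, one-time search.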

No state has passed a law comprehensively regulating police face recognition. We are not aware of any agency that requires warrants for searches or limits them to serious crimes. This has consequences. The Maricopa County Sheriff’s Office enrolled all of Honduras’ driver’s licenses and mug shots into its database. The Pinellas County Sheriff’s Office system runs 8,000 monthly searches on the faces of seven million Florida drivers—without requiring that officers have even a reasonable suspicion before running a search. The county public defender reports that the Sheriff’s Office has never disclosed the use of face recognition in Brady evidence.

There is a real risk that police face recognition will be used to stifle free speech. There is also a history of FBI and police surveillance of civil rights protests. Of the 52 agencies that we found to use (or have used) face recognition, we found only one, the Ohio Bureau of Criminal Investigation, whose face recognition use policy expressly prohibits its officers from using face recognition to track individuals engaging in political, religious, or other protected free speech.

Face recognition is less accurate than fingerprinting, particularly when used in real-time or on large databases. Yet we found only two agencies, the San Francisco Police Department and the Seattle region’s South Sound 911, that conditioned purchase of the technology on accuracy tests or thresholds. There is a need for testing. One major face recognition company, FaceFirst, publicly advertises a 95% accuracy rate but disclaims liability for failing to meet that threshold in contracts with the San Diego Association of Governments. Unfortunately, independent accuracy tests are voluntary and infrequent.
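
The scale problem is easy to see with back-of-the-envelope arithmetic. The numbers below are our own toy assumptions, not vendor data: even a per-comparison false-match rate that sounds negligible produces a steady stream of innocent “candidates” once the gallery reaches the size of a state driver’s license database.

```python
# Toy illustration (hypothetical numbers, not vendor data) of why accuracy
# degrades on large databases: a search compares the probe photo against
# every enrolled photo, so even a tiny per-comparison false-match rate
# yields many innocent "candidates" when the gallery is large.
false_match_rate = 0.0001  # assume 1 in 10,000 non-matching photos is flagged

for gallery_size in (1_000, 100_000, 7_000_000):
    expected_false_candidates = false_match_rate * gallery_size
    print(f"gallery of {gallery_size:>9,} photos: "
          f"~{expected_false_candidates:,.0f} false candidates per search")
```

At seven million photos, the size of the Florida driver gallery noted above, this hypothetical rate would surface roughly 700 innocent lookalikes in every search. That is why independent tests under operational conditions matter more than a vendor’s headline figure.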

Companies and police departments largely rely on police officers to decide whether a candidate photo is in fact a match. Yet a recent study showed that, without specialized training, human users make the wrong decision about a match half the time. We found only eight face recognition systems where specialized personnel reviewed and narrowed down potential matches. The training regime for examiners remains a work in progress.
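
What these systems actually return is a ranked candidate list, not a yes-or-no answer. The hypothetical helper below (the function name and data layout are ours, reusing the face_recognition library from the earlier sketch) makes concrete the step that is handed to a human reviewer:

```python
# Hypothetical helper illustrating the candidate-list step described above:
# a search returns the k most similar gallery faces, each with a distance
# score, and a human must decide whether any of them is the same person.
import face_recognition

def top_candidates(probe_encoding, gallery, k=5):
    """Return the k closest gallery entries as (identity, distance) pairs.

    gallery is a list of (identity, face_encoding) tuples, with encodings
    computed by face_recognition.face_encodings().
    """
    names = [name for name, _ in gallery]
    encodings = [encoding for _, encoding in gallery]
    distances = face_recognition.face_distance(encodings, probe_encoding)
    ranked = sorted(zip(names, distances), key=lambda pair: pair[1])
    return ranked[:k]
```

Note that the true match may be absent entirely: if the probe subject is not in the gallery, the “best” candidates are all innocent lookalikes, and an untrained reviewer who simply picks the top entry will be wrong every time.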

Police face recognition will disproportionately affect African Americans. Many police departments do not realize that. In a Frequently Asked Questions document, the Seattle Police Department says that its face recognition system “does not see race.” Yet an FBI co-authored study suggests that face recognition may be less accurate on black people. Also, due to disproportionately high arrest rates, systems that rely on mug shot databases likely include a disproportionate number of African Americans. Despite these findings, there is no independent testing regime for racially biased error rates. In interviews, two major face recognition companies admitted that they did not run these tests internally, either.

Face recognition may be least accurate for those it is most likely to affect: African Americans.

Agencies are keeping critical information from the public. Ohio’s face recognition system remained almost entirely unknown to the public for five years. The New York Police Department acknowledges using face recognition; press reports suggest it has an advanced system. Yet NYPD denied our records request entirely. The Los Angeles Police Department has repeatedly announced new face recognition initiatives—including a “smart car” equipped with face recognition and real-time face recognition cameras—yet the agency claimed to have “no records responsive” to our document request. Of 52 agencies, only four (less than 10%) have a publicly available use policy. And only one agency, the San Diego Association of Governments, received legislative approval for its policy.

Major face recognition systems are not audited for misuse. Maryland’s system, which includes the license photos of over two million residents, was launched in 2011. It has never been audited. The Pinellas County Sheriff’s Office system is almost 15 years old and may be the most frequently used system in the country. When asked if his office audits searches for misuse, Sheriff Bob Gualtieri replied, “No, not really.” Despite assurances to Congress, the FBI has not audited use of its face recognition system, either. Only nine of 52 agencies (17%) indicated that they log and audit their officers’ face recognition searches for improper use. Of those, only one agency, the Michigan State Police, provided documentation showing that its audit regime was actually functional.

B. Recommendations

Congress and state legislatures should pass commonsense legislation, comparable to the Wiretap Act, to regulate law enforcement face recognition. Such laws should require the FBI or the police to have a reasonable suspicion of criminal conduct prior to a face recognition search. After-the-fact investigative searches—which are invisible to the public—should be limited to felonies. 

Mug shots, not driver’s license and ID photos, should be the default photo databases for face recognition, and they should be periodically scrubbed to eliminate the innocent. Except for identity theft and fraud cases, searches of license and ID photos should require a court order issued upon a showing of probable cause, and should be restricted to serious crimes. If these searches are allowed, the public should be notified at their department of motor vehicles.

If deployed pervasively on surveillance video or police-worn body cameras, real-time face recognition will redefine the nature of public spaces. At the moment, it is also inaccurate. Communities should carefully weigh whether to allow real-time face recognition. If they do, it should be used as a last resort to intervene only in life-threatening emergencies. Orders allowing it should require probable cause, specify where continuous scanning will occur, and cap the length of time it may be used.

Real-time face recognition will redefine the nature of public spaces. It should be strictly limited.

Use of face recognition to track people on the basis of their political or religious beliefs or their race or ethnicity should be banned. All face recognition use should be subject to public reporting and internal audits.

To lay the groundwork for future improvements in face recognition, Congress should provide funding to NIST to increase the frequency of accuracy tests, create standardized, independent testing for racially biased error rates, and create photo databases that facilitate such tests.

State and federal financial assistance for police face recognition systems should be contingent on public reporting, accuracy and bias tests, legislative approval—and public posting—of a face recognition use policy, and other standards in line with these recommendations.

A Model Face Recognition Act, for Congress or a state legislature, is included in the report.

The FBI should refrain from searching driver’s license and ID photos in the absence of express approval for those searches from a state legislature. If it proceeds with those searches, the FBI should restrict them to investigations of serious crimes where FBI officials have probable cause to implicate the search subject. The FBI should periodically scrub its mug shot database to eliminate the innocent, require reasonable suspicion for state searches of that database, and restrict those searches to investigations of felonies. Overall access to the database should be contingent on legislative approval of an agency’s use policy. The FBI should audit all searches for misuse, and it should test both its own face recognition system and the state systems it accesses for accuracy and racially biased error rates.

The Department of Justice (DOJ) Civil Rights Division should evaluate the disparate impact of police face recognition, first in jurisdictions where it has open investigations and then in state and local law enforcement more broadly. DOJ should also develop procurement guidance for state and local agencies purchasing face recognition programs with federal funding.

The FBI should be transparent about its use of face recognition. It should reverse its current proposal to exempt its face recognition system from key Privacy Act requirements. It should also publicly and annually identify the photo databases it searches and release statistics on the number and nature of searches, the arrests and convictions stemming from those searches, and the crimes that those searches were used to investigate.

Many police departments have run searches of driver’s license and ID photos without express legislative approval. Police should observe a moratorium on those searches until legislatures vote on whether or not to allow them.

Police should develop use policies for face recognition, publicly post those policies, and seek approval for them from city councils or other local legislative bodies. City councils should involve their communities in deliberations regarding support for this technology, and consult with privacy and civil liberties organizations in reviewing proposed use policies.

When buying software and hardware, police departments should condition purchase on initial accuracy and bias tests, and on periodic tests of the systems under operational conditions over the contract period. They should avoid sole-source contracts and contracts that disclaim vendor responsibility for accuracy.

All agencies should implement audits to prevent and identify misuse and a system of trained face examiners to maximize accuracy. Regardless of their approach to contracting, all agencies should regularly test their systems for accuracy and bias.

A Model Police Face Recognition Use Policy is included in the report.

NIST should create regular tests for algorithmic bias on the basis of race, gender, and age, increase the frequency of existing accuracy tests, develop tests that mirror law enforcement workflows, and deepen its focus on tests for real-time face recognition. To help empower others to conduct testing, NIST should develop a set of best practices for accuracy tests and develop and distribute new photo datasets to train and evaluate algorithms. To help efforts to diminish racially biased error rates, NIST should ensure that these datasets reflect the diversity of the American population.

Face recognition companies should test their systems for algorithmic bias on the basis of race, gender, and age. They should also voluntarily publish performance results for modern, publicly available benchmarks—giving police departments and city councils more bases upon which to draw comparisons.

Citizens are paying for police and FBI face recognition systems. They have a right to know how those systems are being used. If agencies refuse to disclose that information, advocates should take them to court. Citizens should also press legislators and law enforcement agencies for laws and use policies that protect privacy, civil liberties, and civil rights, and that prevent misuse and abuse. Law enforcement and legislatures will not act without concerted community action.

This report provides the resources that citizens will need to effect this change. In addition to the Face Recognition Scorecard, a list of questions that citizens can ask their elected representative or law enforcement agency is in the Recommendations.

II. Introduction

Figure 2: Chris Wilson at the University of South Florida campus. (Photo: Center on Privacy & Technology)

Chris Wilson is a soft-spoken Classics major working towards her second Bachelor’s degree at the University of South Florida. She enjoys learning Latin and studying ancient Greece and Rome. “I’m a history nut,” she says.

But Chris is not just a scholar—she is also a civil rights leader. For her, social justice is at the core of education: “A lot of students believe that we have to put up with the way things are—and that’s not right.” Chris sees it as her responsibility to “pop the bubble.”

Earlier this year, Chris helped organize a protest against the treatment of black students at the Florida State Fair. In 2014, a 14-year-old honors student, Andrew Joseph III, had been killed by a passing car after being ejected by police from the Florida State Fair along with dozens of other students, most of them African American.1

On February 7, 2016—the second anniversary of Andrew’s death—Chris and three others locked themselves together just inside the fairground gates and called for a boycott. Police ordered them to leave. Chris and her friends stayed where they were.2

  • 1. See Sara DiNatale, Two years after Andrew Joseph III, 14, died outside the Florida State Fair, his parents' fight is just starting, Tampa Bay Times (Feb. 4, 2016), http://www.tampabay.com/news/publicsafety/two-years-after-teenager-andrew-joseph-iii-died-outside-the-florida-state/2264205.
  • 2. Interview with Chris Wilson (July 29, 2016) (notes on file with authors); see also WFTS Webteam, Black Lives Matter protesters arrested at Florida State Fair, WFTS Tampa Bay (Feb. 8, 2016), http://www.abcactionnews.com/news/local-news/black-lives-matter-protesters-arrested-at-state-fair.

Chris was arrested for trespass, a misdemeanor. The Hillsborough County Sheriff’s Office took her to a local station, fingerprinted her, took her mug shot, and released her that evening. She had never been arrested before, and so she was informed that she was eligible for a special diversion program. She paid a fine, did community service, and the charges against her were dropped.

Chris was not told that as a result of her arrest, her mug shot has likely been added to not one, but two separate face recognition databases run by the FBI and the Pinellas County Sheriff’s Office.3 These two databases alone are searched thousands of times a year by over 200 state, local, and federal law enforcement agencies.4

The next time Chris participates in a protest, the police won’t need to ask her for her name in order to identify her. They won’t need to talk to her at all. They only need to take her photo. FBI co-authored research suggests that these systems may be least accurate for African Americans, women, and young people aged 18 to 30.5 Chris is 26. She is black. Unless she initiates a special court proceeding to expunge her record, she will be enrolled in these databases for the rest of her life.6

What happened to Chris doesn’t affect only activists: Increasingly, law enforcement face recognition systems also search state driver’s license and ID photo databases. In this way, roughly one out of every two American adults (48%) has had their photo enrolled in a criminal face recognition network.7

They may not know it, but Chris Wilson and over 117 million American adults are now part of a virtual, perpetual line-up. What does this mean for them? What does this mean for our society? Can police use face recognition to identify only suspected criminals—or can they use it to identify anyone they want? Can police use it to identify people participating in protests? How accurate is this technology, and does accuracy vary on the basis of race, gender, or age? Can communities debate and vote on the use of this technology? Or is it being rolled out in secret?

FBI and police face recognition systems have been used to catch violent criminals and fugitives. Their value to public safety is real and compelling. But should these systems be used to track Chris Wilson? Should they be used to track you?

 
  • 3. In September 2014, the Hillsborough County Sheriff’s Office (HCSO) finalized a Memorandum of Understanding with the Pinellas County Sheriff’s Office (PCSO) to enroll all existing and future mug shot photos in the PCSO face recognition database. See Hillsborough County Sheriff’s Office, Memorandum of Understanding, Document p. 014030–014034 at 014031. PCSO staff confirmed that all new mug shots taken by the HCSO are enrolled in the PCSO database. See Correspondence between Jake Ruberto, Technical Support Specialist, Pinellas County Sheriff’s Office, and Clare Garvie (July 28, 2016), Document p. 016831. The FBI has confirmed that Chris Wilson’s arrest record is in the Next Generation Identification database, but has refused to indicate whether her mug shot was enrolled in the FBI’s face recognition database, the Interstate Photo System. Florida is one of seven states that have the ability to search the Interstate Photo System. See U.S. Gov’t Accountability Office, GAO-16-267, Face Recognition Technology: FBI Should Better Ensure Privacy and Accuracy 13 (May 2016).
  • 4. The GAO found that the FBI alone conducted 118,490 face recognition searches of its face recognition database, the Next Generation Identification Interstate Photo System (NGI-IPS), from December 2011 to December 2015, and that states authorized to access the system conducted 20,000 searches in the four years ending in December 2015. See U.S. Gov’t Accountability Office, GAO-16-267, Face Recognition Technology: FBI Should Better Ensure Privacy and Accuracy 49, 12 (May 2016). The Pinellas County Sheriff’s Office (PCSO) system is searched by 242 local, state, and federal agencies around 8,000 times a month. See Pinellas County Sheriff’s Office, Face Analysis Comparison & Evaluation System: FACES Training 2015, Document p. 014383–014417 at 014396.
  • 5. See Brendan Klare et al., Face Recognition Performance: Role of Demographic Information, 7 IEEE Transactions on Info. Forensics and Sec. 1789, 1789 (Dec. 2012), https://assets.documentcloud.org/documents/2850196/Face-Recognition-Performance-Role-of-Demographic.pdf. In the report, co-author Richard W. Vorder Bruegge is identified as the FBI’s subject matter expert for face recognition. Id. at 1801.
  • 6. See Ernest J. Babcock, Privacy Impact Assessment for the Next Generation Identification (NGI) Interstate Photo System, Federal Bureau of Investigation, Department of Justice (Sept. 2015), https://www.fbi.gov/services/records-management/foipa/privacy-impact-assessments/interstate-photo-system (explaining that absent the request of a submitting agency or a court order, photos will be retained until the subject is 110 years of age, “or seven years after notification of death with biometric confirmation”); Pinellas County Sheriff's Office, Interview with PCSO Sheriff Bob Gualtieri and Technical Support Specialist Jake Ruberto (July 26, 2016) (notes on file with authors) (explaining that the PCSO system retains photos indefinitely in the absence of a court order).
  • 7. See Figure 6 below and accompanying text.