VI. Recommendations

A. LEGISLATURES

A core recommendation of this report is that Congress and state legislatures pass commonsense legislation, comparable to the Wiretap Act and its state analogs, to comprehensively regulate law enforcement face recognition. This legislation should implement the following recommendations, each of which is featured in the Model Face Recognition Act.

  • Recommendation 1. Law enforcement face recognition searches should be conditioned on an individualized suspicion of criminal conduct.

For over two centuries, American law enforcement has been constrained by a basic standard: The police cannot search anyone they please. Rather, before law enforcement officials infringe on an individual’s liberty, they generally must have an individualized suspicion that the individual is engaged in criminal conduct.

At a minimum, legislatures should require that face recognition searches be conditioned on an officer’s reasonable suspicion that an individual is engaged in criminal conduct. This standard currently applies to police investigatory stops.266 While some states require that people identify themselves at the request of police, the Supreme Court has ensured that those laws require a predicate of reasonable suspicion.267 Face recognition allows law enforcement to identify someone without stopping or even talking to her. Our proposal ensures that the old standard survives new technology.

A reasonable suspicion standard should apply to all Stop and Identify, Arrest and Identify, and Investigate and Identify searches that run on mug shot databases. Higher standards should apply to riskier deployments, such as systems that rely on driver’s license databases or real-time, continuous video surveillance.

  • Recommendation 2. Mug shot databases used for face recognition should exclude people who were found innocent or who had charges against them dropped or dismissed.

Mug shot databases used for face recognition include countless individuals who have interacted with law enforcement but who have never been convicted of a crime.268 This is particularly problematic in cases like Chris Wilson’s: A single act of peaceful civil disobedience should not result in a lifetime in a criminal face recognition database.

  • 266. See Terry v. Ohio, 392 U.S. 1 (1968) (requiring police officers to have a reasonable suspicion that a person is involved in criminal activity prior to an investigatory stop).
  • 267. See Brown v. Texas, 443 U.S. 47, 53 (1979) (finding that a Texas statute’s application was unconstitutional because it required individuals to identify themselves to police even if the police officer lacked reasonable suspicion); Hiibel v. Sixth Judicial Dist. Court of Nev., Humboldt Cty., 542 U.S. 177 (2004) (recognizing the “constitutional limitation” established in Brown, but upholding a Nevada Stop and Identify statute partly because the statute required that a field stop “be justified at its inception”).
  • 268. See Ellen Nakashima, FBI wants to exempt its huge fingerprint and photo database from privacy protections, Washington Post (June 1, 2016) (51 percent of all arrests in the FBI’s face and fingerprint database “lack final dispositions, such as whether a person has been convicted or even charged”).

Congress and states that rely on mug shot databases for face recognition should follow the lead of Michigan, which requires the destruction of biometric data from people who are arrested but have been found innocent, or who have had the charges against them dropped or dismissed.269 The FBI should do this voluntarily, whether or not Congress commands it to do so.

  • Recommendation 3. Searches of license photos should only occur if state legislatures vote to allow them. States that allow access should notify the public.

Unless and until state legislatures openly debate this access and affirmatively vote to grant it, law enforcement face recognition systems should constrain themselves to mug shot databases. Even if state legislatures do approve searches of license and ID photos, many citizens may remain unaware of this practice. Therefore, these states should implement a notice requirement similar to that of Washington State, which requires special notices to driver’s license applicants, online postings, and “notices in conspicuous locations” in the Department of Licensing’s physical offices.270

  • Recommendation 4. Searches of driver’s license and ID photos should occur only under a court order issued upon a showing of probable cause.

States that allow searches of driver’s license and ID photos should require a higher level of individualized suspicion, preferably probable cause, for those searches. If a state scans all of its drivers’ faces as part of criminal investigations, it should, at a minimum, ensure that those searches are based on reasonably trustworthy information.271

The determination of probable cause should not be in the hands of the police or the FBI. It should be in the hands of a state judge or a federal judge with jurisdiction over the state. As with the Wiretap Act, this judicial oversight requirement should not be total. In true emergencies, searches should initially proceed without judicial approval (but require a follow-up application). Other scenarios should not require judicial approval at all. These include searches to identify missing children, deceased victims, and lawfully arrested people during the booking process.

We also believe that judicial approval should not be required for police searches that are narrowly designed to detect identity theft and fraud. These searches parallel departments of motor vehicles’ longstanding practice of “de-duping” ID photos to detect fraud.

  • Recommendation 5. Limit searches of license photos—and after-the-fact investigative searches—to investigations of serious offenses.

There is a tradition in American law enforcement of limiting the most controversial investigative techniques to the most serious crimes.272 That principle should apply to face recognition: If a state decides to allow law enforcement to conduct face recognition searches of its driver’s license and other ID photos, it should limit those searches to serious offenses, preferably those identified in the oral and wire provisions of the Wiretap Act, and identity theft and fraud.273

This principle should also apply to Moderate Risk deployments, which involve face recognition against a mug shot database. In a Stop and Identify deployment, where a police officer encounters someone in person, takes her photo, and uses that photo to run a face recognition search, the use of the technology is at least somewhat transparent to the search subject. In a lawfully initiated officer encounter, that officer’s safety is also in play; he has a need to know whether he is interacting with a law-abiding citizen or a wanted felon.274

In an Arrest and Identify search, where someone is arrested and her mug shot is simultaneously enrolled and run against a mug shot database, use of face recognition may or may not be transparent—but the Supreme Court has recognized a strong state interest in the reliable identification of suspects in government custody.275

In an Investigate and Identify search, where a suspect is identified after the commission of the offense from a video still or surreptitious photograph, none of these interests are at play. The search is entirely invisible to the subject and the public at large. Outside of the public eye, there is a risk that some officials may use a minor offense, like jaywalking, as a pretext to justify a search to identify a peaceful protester—or an attractive member of the opposite sex.276 For this reason, we believe that Investigate and Identify searches—even those limited to mug shot databases—should be limited to investigations of felonies.

  • Recommendation 6. Real-time video surveillance should only occur in life-threatening public emergencies under a court order backed by probable cause.

When operating through a large network of street surveillance footage—or, potentially, police-worn body cameras—real-time, continuous face recognition would allow law enforcement to secretly locate people and track their movements. Real-time video surveillance offers police the same abilities as do real-time GPS tracking or access to cell-site location information, techniques that require court-issued warrants in a growing number of jurisdictions.277

A simple warrant is not enough, however. If deployed pervasively, real-time video surveillance threatens to create a world where, once you set foot outside, the government can track your every move.

Some communities may conclude that real-time video surveillance is too inaccurate, or too threatening to civil liberties, to permit at all. Communities that decide to allow real-time video surveillance under a probable cause-backed court order should issue those orders only:

  • in life-threatening public emergencies;
  • in specific locations for a limited period of time; and
  • upon a showing that law enforcement has exhausted other means to investigate the crime.

Most of these restrictions have direct analogs in the Wiretap Act.278 Also like the Wiretap Act, if law enforcement is forced to use real-time video surveillance without a court order, it should file a prompt follow-up application to a court.279

  • Recommendation 7. Use of face recognition to track people on the basis of their race, ethnicity, or religious or political views should be prohibited.

A statute regulating law enforcement face recognition should prohibit the use of the technology to track individuals solely on the basis of their political or religious beliefs, or any other conduct protected by the First Amendment, and prohibit tracking of individuals solely on the basis of their race, ethnicity, or other protected status. Without these prohibitions, there is a real danger that face recognition could chill free speech or endanger access to education or public health.

  • Recommendation 8. All law enforcement use of face recognition should be subject to public reporting requirements and internal audits.

Face recognition is too powerful to be secret. Any law enforcement agency using face recognition should be required to annually and publicly disclose information directly comparable to that required by the Wiretap Act.280 This would include:

  1. the number of face recognition searches run;
  2. the nature of those searches (i.e. Stop and Identify, Arrest and Identify, Investigate and Identify);
  3. the crimes that those searches were used to investigate;
  4. the arrests and convictions that resulted from those searches;
  5. the databases that those searches accessed;
  6. for real-time video surveillance, the duration and approximate location of those searches; and
  7. any other information that the jurisdiction deems appropriate.

These transparency measures should be coupled with the logging of all searches and rigorous audits to prevent and identify misuse.
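The reporting fields above lend themselves to a simple logging schema. The sketch below is a minimal Python illustration, assuming a hypothetical per-search log record (the field and function names are ours, not drawn from any statute or agency system); the annual public report would then be a straightforward aggregation of the logged searches.

```python
from collections import Counter
from dataclasses import dataclass

# Hypothetical log record: one entry per face recognition search.
# Field names are illustrative, not an established agency schema.
@dataclass
class SearchLogEntry:
    search_type: str           # "Stop and Identify", "Arrest and Identify", ...
    crime: str                 # offense under investigation
    database: str              # e.g. "state mug shot database"
    resulted_in_arrest: bool
    resulted_in_conviction: bool

def annual_report(entries):
    """Aggregate logged searches into the publicly reportable totals
    enumerated above (counts by type, by crime, outcomes, databases)."""
    return {
        "total_searches": len(entries),
        "by_type": dict(Counter(e.search_type for e in entries)),
        "by_crime": dict(Counter(e.crime for e in entries)),
        "arrests": sum(e.resulted_in_arrest for e in entries),
        "convictions": sum(e.resulted_in_conviction for e in entries),
        "databases": sorted({e.database for e in entries}),
    }
```

Under this sketch, the per-search log doubles as the audit trail the paragraph above calls for, and the public report is simply its yearly aggregate.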

  • 269. See Mich. Comp. Laws § 28.243(7)-(8); see also above note 76 and accompanying text.
  • 270. Wash. Rev. Code § 46.20.037(3).
  • 271. See Brinegar v. United States, 338 U.S. 160, 175-176 (1949) (“[P]robable cause exists where ‘the facts and circumstances within their [the officers’] knowledge and of which they had reasonably trustworthy information [are] sufficient in themselves to warrant a man of reasonable caution in the belief that’ an offense has been or is being committed.”) (citations omitted).
  • 272. See above Findings: Fourth Amendment.
  • 273. See 18 U.S.C. § 2516 (1)-(2) (limiting federal interception of wire and oral communications to investigations of certain federal offenses and state wire and oral interceptions to a narrow category of felonies).
  • 274. See Terry v. Ohio, 392 U.S. 1, 30 (1968) (recognizing a state interest in officer safety during police investigative stops).
  • 275. See Maryland v. King, 133 S. Ct. 1958, 1980 (2013) (identifying “significant state interests” in identifying suspect in custody “so that the criminal justice system can make informed decisions concerning pretrial custody.”).
  • 276. See Police across US misuse databases to look up celebrities, romantic partners and others, Associated Press (Sept. 28, 2016), http://www.nola.com/crime/index.ssf/2016/09/police_across_us_misuse_databa.html.
  • 277. See, e.g., Cal. Penal Code § 1546 et seq.; 725 Ill. Comp. Stat. act 168/1 et seq.; Md. Code Ann. Crim. Proc. § 1-203.1; see above notes 115, 116 and accompanying text.
  • 278. See 18 U.S.C. § 2516(1)-(2) (limiting federal interception of wire and oral communications to investigations of certain federal offenses); 18 U.S.C. § 2518(1)(c) (requiring a statement that other investigative techniques have been attempted, are unlikely to succeed, or are too dangerous); 18 U.S.C. § 2518(5) (capping wiretap authorizations to an extendable thirty-day period).
  • 279. See 18 U.S.C. § 2518(7) (allowing warrantless wiretaps in certain emergency situations, if the procedure is followed by an application within 48 hours after the wiretap has begun).
  • 280. See 18 U.S.C. § 2519.
  • Recommendation 9. Congress should provide funding to increase the frequency and scope of accuracy tests and create more diverse photo datasets for training.

The National Institute of Standards and Technology (NIST) will need new funding to expand the frequency and scope of its accuracy tests—particularly to create new testing programs to prevent algorithmic bias and deepen testing of real-time face recognition systems. NIST can also play a role in helping face recognition companies prevent bias—not just test for it. With increased funding, NIST may be able to develop more diverse photo datasets that companies can use to improve the accuracy of their algorithms across racial, ethnic, and age groups.

  • Recommendation 10. State and federal financial assistance for face recognition should be conditioned on transparency, oversight, and accountability.

Many state and local face recognition systems receive federal financial assistance.281 Congress should use its power of the purse to promote transparency, public accountability, internal audits, and accuracy. Federal financial assistance should be restricted to federal, state, and local agencies that:

  • publicly report use statistics;
  • publicly post use manuals and obtain approval for those manuals from elected officials;
  • certify that internal audits are in place; and
  • participate in NIST accuracy tests, and, when they are available, tests for racially biased error rates.

State legislatures can institute similar measures.

  • 281. See, e.g., Pinellas County Sheriff's Office, Request for Proposal: On-line User Training Program (May 27, 2011), Document p. 014451 ("In 2001, PCSO initiated a law enforcement facial recognition program from grants awarded by the U.S. Department of Justice, Office of Community Oriented Policing Services (COPS)."); Seattle Police Department, Mugshot Booking Photo Comparison DRAFT - Project Documents, Document p. 012489 ("The Mugshot Booking Photo [face recognition] comparison project . . . is a result of funding from a Department of Homeland Security (DHS) grant under the Urban Area Security Initiative (UASI) . . ."); SANDAG, Proposed FY 2015 Program Budget Amendment: Urban Area Security Initiative Tactical Identification System Project (Feb.  13, 2015), Document p. 005716 (describing an award of $99,000 from the DHS Urban Area Security Initiative to continue maintaining the Tactical Identification System, which includes face recognition); Chicago Police Department, CTA's Regional Transit Terrorism Prevention and Response System (T-CLEAR) (Sept. 12, 2012), Document p. 008725–008729 (a grant proposal to the DHS FY09 Transit Security Grant Program outlining a video security system that includes face recognition).

B. LAW ENFORCEMENT

Regardless of when legislatures act, there are a number of steps that federal and state law enforcement can take to address the problems presented in this report.

1. FEDERAL BUREAU OF INVESTIGATION & DEPARTMENT OF JUSTICE

  • Recommendation 11: Require probable cause for and restrict searches of driver’s license and ID photo databases to serious, enumerated offenses.

The FBI should voluntarily refrain from searching driver’s license and ID photos in states that have not passed legislation expressly authorizing criminal face recognition searches of those databases. If the FBI decides to proceed with these searches, it should voluntarily restrict them to investigations of serious offenses enumerated in the oral and wire provisions of the Wiretap Act.282

In addition, these searches should be limited to instances in which the FBI has probable cause to believe that the subject of the search committed the offense in question. While a probable cause standard is not common for systems that rely on driver’s license databases, some major jurisdictions have adopted it without apparent impediments to their law enforcement mission.283

  • Recommendation 12: Leverage police access to the FBI face recognition database (NGI-IPS) to promote best practices for face recognition.

At the moment, seven states have the ability to search the FBI face recognition database, which is populated by 24.9 million mug shots. Over time, that number will grow, giving the FBI an even greater opportunity to promote best practices for state and local police. In line with the recommendations set out for legislatures, officers should be allowed to search the database only after certifying that they have a reasonable suspicion that the suspect in question committed a felony offense. The FBI should itself adopt this policy for all mug shot searches.

More broadly, access to the FBI face recognition database should be conditioned on an agency’s adoption of a face recognition use policy, public posting of that policy, and its approval by a city council or local legislature. Access should also be contingent on audits.

  • Recommendation 13: Audit police and FBI searches of the FBI face recognition database and FBI searches of state driver’s license and ID photo databases.

In a 2012 Senate hearing on FBI use of face recognition, the then-Deputy Assistant Director of the FBI’s Criminal Justice Information Services (CJIS) Division, Jerome Pender, assured the public that state agencies’ use of the FBI’s face recognition database would be audited—both by the FBI and by the state agencies themselves, in accordance with agreements signed with those agencies.284 The Privacy Impact Assessment for the FBI’s face recognition database also says that “robust audit processes [are] already present,” and that agencies requesting searches “will be subject to training and audit requirements by the applicable CJIS Systems Agency (CSA) and periodic FBI audits.”285 The 2016 GAO report revealed, however, that the FBI had never audited state agency searches of the FBI’s face recognition database—nor had the FBI audited its own use of state databases.286 Going forward, these audits should be conducted.

Despite the FBI’s assurances in the 2012 hearing, the signed agreements that have been made public do not require that states annually audit their own use of the FBI face recognition database.287 If it is not now in place, this requirement should be added to all MOUs between state agencies and the FBI.

  • Recommendation 14: The FBI should test its face recognition system for accuracy and racially biased error rates, and make the results public.

The last public accuracy statistics for the FBI face recognition database suggest that it successfully includes the correct candidate in a list of 50 potential matches only 86% of the time.288 In other words, in this test, one out of every seven searches returned a list of 50 “innocent” candidates. This test was run on a database 25 times smaller than the current FBI face recognition database; generally, errors increase with database size. 

This is not acceptable. The FBI should regularly test its system for accuracy and make those results public. It should do the same for racially biased error rates.
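To make the arithmetic behind the 86% figure concrete: a rank-50 hit rate of 86% means a miss rate of 1 − 0.86 = 0.14, or roughly one search in seven. The sketch below shows how such a rate is computed from labeled test searches; the function and data layout are illustrative, not the FBI’s or GAO’s actual test methodology.

```python
def rank_k_hit_rate(results, k=50):
    """Fraction of test searches whose ranked candidate list contains the
    true identity within the top k entries (the metric behind the 86%
    figure cited above). `results` is a list of (true_id, ranked_candidates)
    pairs; this layout is an assumption for illustration."""
    hits = sum(1 for true_id, candidates in results if true_id in candidates[:k])
    return hits / len(results)
```

With seven test searches of which one misses, the function returns 6/7 ≈ 0.857, matching the intuition that a ~14% miss rate means one failed search in every seven.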

  • Recommendation 15: Investigate state and local agencies’ use of face recognition for potential disparate impact.

The Department of Justice Civil Rights Division regularly investigates state and local police practices. It should extend those investigations to face recognition, as the systems used by a number of agencies under recent investigation may produce a disparate impact on minority communities.

In Maricopa County, Arizona, for example—the subject of a recently settled DOJ civil rights lawsuit289—the Maricopa County Sheriff’s Office (MCSO) enrolled all of Honduras’ driver’s license photos into its face recognition system as part of an effort to combat Mara Salvatrucha, a Central and North American gang.290 African Americans, meanwhile, are likely overenrolled in the system, which also searches mug shots: in Arizona, African Americans are arrested at a rate close to three times their share of the population.291

  • Recommendation 16: Develop procurement guidance for state and local agencies purchasing face recognition programs with DOJ funding.

DOJ can support state and local accountability efforts by providing procurement guidance for agencies receiving DOJ funding. This guidance should discourage the use of sole source contracting for initial purchases or heighten sole source justification requirements. It could also encourage: (1) including specific target accuracy levels in agency Requests for Proposals (RFPs); (2) requiring proof of participation in NIST accuracy tests; (3) accuracy verification testing during the system acceptance process; and (4) regular independent accuracy tests during the contract period, including internal tests and submission to all applicable NIST tests during that period.

  • Recommendation 17: Reverse the current proposal to exempt the FBI’s face recognition system from key Privacy Act requirements.

If promulgated, this rule would eliminate key mechanisms for public transparency and accountability over a database already operating largely in the dark. This report strongly suggests that we need more transparency over face recognition, not less. The DOJ and FBI should reverse this proposal.292   

  • Recommendation 18: Voluntarily release detailed public reports on the FBI’s face recognition programs.

These reports would detail the databases that the FBI searches, the number and nature of face recognition searches conducted by the FBI and states accessing the FBI system, arrests and convictions stemming from those searches, and the types of crimes investigated. The annual release of this information would add a layer of public transparency and accountability to complement internal audits. These reports could be modeled after the annual reports required under the Wiretap Act.293

  • 282. See 18 U.S.C. § 2516 (1)–(2).
  • 283. See above Figure 9: Legal Standards for Face Recognition Search, by Jurisdiction.
  • 284. What Facial Recognition Technology Means for Privacy and Civil Liberties: Hearing before the Subcomm. on Privacy, Technology & the Law of the S. Comm. on the Judiciary, 112th Cong., 10–11 (2012) (“One of the things that the MOUs that we sign with the agencies that are going to access the system require is an audit process, so the local agencies are required to audit the use of the system on an annual basis to detect any type of misuse. And then, in addition to that, within our FBI CJIS Division we have an audit unit that goes out and does triennial audits of the same agencies . . . a double-check on the audits, as well as to be sure that the audit processes are in place and being done effectively.”).
  • 285. Federal Bureau of Investigation, U.S. Department of Justice, Privacy Impact Assessment for the Next Generation Identification (NGI) Interstate Photo System, (Sept. 2015), https://www.fbi.gov/services/records-management/foipa/privacy-impact-assessments/interstate-photo-system.
  • 286. See U.S. Gov’t Accountability Office, GAO-16-267, Face Recognition Technology: FBI Should Better Ensure Privacy and Accuracy 25–32 (May 2016) (audits “have not yet assessed the use of face recognition searches of NGI-IPS”).
  • 287. See, e.g. Michigan State Police, MOU between the FBI and The Michigan State Police for the Interstate Photo System Facial Recognition Pilot (IPSFRP) (Apr. 8, 2011), Document pp. 011304–011309; Nebraska State Patrol, MOU between the FBI and Nebraska State Patrol for the IPSFRP (Oct. 2012), Document pp. 009183–009189.
  • 288. See U.S. Gov’t Accountability Office, GAO-16-267, Face Recognition Technology: FBI Should Better Ensure Privacy and Accuracy 49, 26 (May 2016) (describing test results on a dataset of 926,000 photos).
  • 289. U.S. Department of Justice, Justice Department Reaches Settlement in Civil Rights Lawsuit Against Maricopa County, Arizona, and Maricopa County Sheriff, https://www.justice.gov/opa/pr/justice-department-reaches-settlement-civil-rights-lawsuit-against-maricopa-county-arizona (last visited Oct. 2, 2016).
  • 290. See Maricopa County Sheriff’s Office, Purchase of Equipment to Enhance the MCSO Facial Recognition Unit at the ACTIC (Aug. 20, 2007), Document p. 015058.
  • 291. See above note 236.
  • 292. For more information, see the Center’s filing on the proposed exemptions. Center on Privacy & Technology et al., Comment on Proposed Rule to Exempt Next Generation Identification System from Provisions of the Privacy Act and the Modified System of Records Notice for that System (July 6, 2016), https://www.regulations.gov/document?D=DOJ-OPCL-2016-0008-0114 (explaining the impact of the proposed exemptions).
  • 293. See 18 U.S.C. § 2519.

2. STATE & LOCAL LAW ENFORCEMENT

  • Recommendation 19: Impose a moratorium on face recognition searches of state driver’s license and ID photos until state legislatures regulate that access.

Many states have driver’s privacy laws that allow law enforcement to access driver’s license and ID photos; we identified only two, however, that expressly allow law enforcement to run face recognition searches of those photos.294 State and local law enforcement should impose a moratorium on these searches until state legislatures have the opportunity to debate and regulate them through legislation.

  • Recommendation 20: Adopt public face recognition use policies that have received legislative review and approval.

All agencies, including those that access another agency’s system, should adopt a face recognition use policy, preferably in line with the recommendations set out above. Policies should be developed at the same time as—if not before—the implementation of a face recognition system or the grant of access to another agency’s system. Most importantly, they should be made public and submitted for approval by city councils or other local legislative bodies.

A Model Police Face Recognition Use Policy, based on best practices in existing policies around the country, is included in our report.

  • Recommendation 21: Use contracts and the contracting process to maximize accuracy.

Agencies should avoid sole source contracting for initial purchases. In a competitive RFP, agencies should require vendor companies to demonstrate target accuracy levels and prove an algorithm’s submission to NIST accuracy tests. The system acceptance process should include accuracy verification testing on searches that mimic the agency’s actual use of face recognition—such as on probe images that are of lower quality or feature a partially obscured face. Final contracts should require continued internal accuracy testing in operational settings and submission to all applicable NIST tests. Finally, agencies should avoid contracts where the vendor has disclaimed responsibility for the accuracy of the algorithm, even when the vendor uses a third-party algorithm.

  • Recommendation 22:  Implement internal audits, tests for accuracy and racial bias, and the use of trained face examiners.

Law enforcement agencies should audit their officers’ use of face recognition, regardless of whether the agency runs its own system or accesses another’s. They should regularly test their systems for accuracy and, when the tests become available, racial bias. Each search should be conducted or reviewed by trained facial examiners to minimize algorithm error and possible bias in the search results.

  • 294. See Mich. Comp. Laws Ann. § 28.248 ("Biometric data obtained under a law or rule for noncriminal identification purposes may be used for criminal identification purposes unless prohibited by law or rule."); Tex. Transp. Code § 521.059 ("The [Department of Motor Vehicles] shall use the image verification system established under this section ... to aid other law enforcement agencies").

C. NATIONAL INSTITUTE OF STANDARDS & TECHNOLOGY

  • Recommendation 23: Regularly include tests for algorithmic bias along the lines of race, gender, and age in face recognition competitions.

NIST’s assessments are widely considered to be the gold standard in face recognition testing, yet NIST has checked for racial bias only once despite holding Face Recognition Vendor Test (FRVT) competitions for more than a decade. This information would be of immense value to law enforcement agencies that look to acquire face recognition technology.

  • Recommendation 24: Increase the frequency of face recognition competitions, ideally testing on an annual or biennial basis.

New innovations, such as deep learning or drastic reductions in template sizes, can reshape the technological landscape seemingly overnight. NIST’s current testing regime, which offers a competition every three to four years, is too infrequent to keep up with the pace of innovation. Acknowledging resource limitations, NIST should ideally hold its competition every one to two years.

  • Recommendation 25: Continue to update tests to reflect state-of-the-art advances in face recognition and mobile biometrics.

Face recognition technology continues to advance. To NIST’s credit, it has kept pace with these developments, offering a Face in Video Evaluation (FIVE) competition that is currently in progress. NIST should continue to monitor developments in face recognition and test accordingly. For live video streams, we recommend that, in addition to testing for accuracy, NIST should emphasize measuring computational resource consumption—the constraint that researchers have noted to be the technology’s primary limiting factor.

  • Recommendation 26: Develop tests that closely mirror law enforcement workflows, and issue best practices for accuracy testing.

Law enforcement agencies ask face recognition algorithms to perform a wide variety of tasks; an algorithm that excels at one task may struggle at another. NIST should strive to ensure that FRVT competitions explicitly test algorithms in ways that mimic each of these real-world law enforcement workflows, ensuring that agencies can make informed procurement decisions tailored to their intended use-cases. Since NIST tests are considered the gold standard among both researchers and companies, we recommend that NIST establish a standards or best practices document to assist other organizations or law enforcement agencies that wish to test face recognition algorithms.

  • Recommendation 27: Develop and distribute diverse datasets of photos.

Researchers and engineers universally complain about the lack of large, high-quality, diverse datasets of faces. NIST, along with other government efforts (such as the IARPA Janus project), is well placed to take the lead in developing and distributing such data, which would both aid algorithm design and provide a continual source of independent benchmarks.

D. FACE RECOGNITION COMPANIES

  • Recommendation 28: Internally check for algorithmic bias along the lines of race, gender, and age.

Neither of the major face recognition companies that we interviewed in the spring of 2016 could point to an internal test that specifically checked for racial bias. Companies should develop tests to measure this bias. Furthermore, companies should work to find the sources of this bias, mitigate it where possible, and inform law enforcement agencies when it cannot be eliminated completely.
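A bias test of the kind recommended here can be as simple as computing error rates separately for each demographic group and comparing them. The sketch below is a minimal illustration, not any vendor's actual test; the threshold, scores, and group labels are invented for the example.

```python
# Illustrative sketch (hypothetical data): checking whether a face
# recognition algorithm's error rates differ across demographic groups.
# Each record is one photo comparison: a similarity score, whether the
# pair is truly the same person, and the subject's demographic group.
from collections import defaultdict

THRESHOLD = 0.80  # hypothetical match threshold

comparisons = [
    # (score, same_person, group)
    (0.91, True,  "group_a"), (0.62, True,  "group_a"),
    (0.85, False, "group_a"), (0.40, False, "group_a"),
    (0.88, True,  "group_b"), (0.84, True,  "group_b"),
    (0.55, False, "group_b"), (0.30, False, "group_b"),
]

def error_rates_by_group(pairs, threshold):
    """Return {group: (false_match_rate, false_non_match_rate)}."""
    tallies = defaultdict(lambda: {"fm": 0, "imp": 0, "fnm": 0, "gen": 0})
    for score, same, group in pairs:
        t = tallies[group]
        if same:
            t["gen"] += 1
            if score < threshold:
                t["fnm"] += 1   # genuine pair wrongly rejected
        else:
            t["imp"] += 1
            if score >= threshold:
                t["fm"] += 1    # impostor pair wrongly accepted
    return {g: (t["fm"] / t["imp"], t["fnm"] / t["gen"])
            for g, t in tallies.items()}

for group, (fmr, fnmr) in sorted(error_rates_by_group(comparisons, THRESHOLD).items()):
    print(f"{group}: false match rate={fmr:.2f}, false non-match rate={fnmr:.2f}")
```

In this toy data, group_a suffers both false matches and false non-matches while group_b suffers neither; a disparity like that, found at scale on real data, is exactly what companies should surface to the agencies they serve.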

  • Recommendation 29: Submit to public, independent accuracy competitions and publish performance results using modern, publicly available datasets.

Public, independent accuracy tests by NIST and the University of Washington offer the only basis for comparing the performance of face recognition algorithms. Companies should continue to submit their algorithms for these tests; they should also voluntarily publish performance results for modern, publicly available benchmarks that can serve as an additional basis for comparison. Some companies, such as Cognitec,295 have done so in part, but only for older datasets such as the NIST Color FERET Database,296 which was created in 1996.

  • 295. See Cognitec, FaceVACS Technology: A16 Algorithm Performance, http://www.cognitec.com/files/layout/downloads/FaceVACS-algorithm-performance-A16.pdf (last visited Sept. 22, 2016).
  • 296. See National Institute of Standards and Technology, U.S. Department of Commerce, Color FERET Database, http://www.nist.gov/itl/iad/ig/colorferet.cfm (last visited Sept. 22, 2016).

E. COMMUNITY LEADERS

  • Recommendation 30: Press local and state police departments and the FBI to be transparent and adopt policies to protect privacy, civil liberties, and civil rights.

Face recognition systems cost money, and taxpayers are paying the bill. They have a right to know how those systems are being used, and to demand that those systems respect their privacy, civil liberties, and civil rights.

Community leaders should press state and local agencies, and the FBI, to be fully transparent about how they use face recognition; if those agencies refuse, advocates should use state and federal Freedom of Information laws to take them to court. Advocates should also press city councils, state legislatures, and law enforcement for laws and use manuals that protect individual liberties and civil rights.

The Face Recognition Scorecard summarizes face recognition systems in 25 different jurisdictions and links to the original documents from those agencies. Whether or not their jurisdiction appears in the Scorecard, citizens should ask their elected officials or local law enforcement agency the following questions:

  1. Who is enrolled in the police face recognition database? Is it built from mug shots, driver’s license and ID photos, or other sources? If mug shots are used, do police eliminate photos from cases involving no-charge arrests or not guilty verdicts? If they use driver’s license and ID photos, are people notified of this at the DMV?
  2. Who can search the face recognition database? Can other local, state, or federal law enforcement agencies (like the FBI) search or request searches of the system?
  3. What kinds of face recognition searches are run? Do they use it to identify people they arrest or stop on patrol? Do they use it to identify criminal suspects from surveillance video footage? Do they have plans to use face recognition to identify people in real-time from live surveillance video?
  4. Does the agency have a face recognition use policy? If not, why not?
  5. What legal requirements must be met before officers run a face recognition search? Does an officer at least need a reasonable suspicion that someone is involved in a crime before he can run a search to identify that person? Or can officers run a search on anyone, so long as it is for a law enforcement purpose? Do searches of license and ID photos require a higher standard, like probable cause? Will the agency require warrants for real-time searches on live surveillance video?
  6. Is the agency’s face recognition use policy available to the public? Was it approved by a city council or other elected officials? Did privacy and civil liberties groups review it?
  7. How does the agency ensure that its face recognition system is accurate? Has the company submitted its algorithm to accuracy tests conducted by the National Institute of Standards and Technology? Does the purchase contract require certain accuracy thresholds and require ongoing accuracy tests in operational conditions? Are all candidate matches screened by specially trained human examiners whose decisions are peer reviewed?
  8. How does the agency ensure that its face recognition system is not biased on the basis of race, gender, or age? Has the agency tested the system to make sure it is not biased against certain demographic groups? Has the agency asked its face recognition vendor about this possibility, and if so, what steps has the vendor taken to address this problem?
  9. How does the agency’s face recognition use policy protect free speech? Does the policy expressly prohibit using the technology to identify individuals based solely on their political or religious beliefs or their membership in a racial or ethnic minority, or is this in a separate, general document? Does the policy allow face recognition to be used near schools and hospitals?
  10. How does the agency prevent and detect misuse and abuse? Does it log all searches and audit them? If not, why not?

For law enforcement face recognition to be brought under a reasonable system of regulation, communities need to ask questions—and take action.