
Facial recognition is solving crimes — here’s how to use it the right way

Seven steps agencies can take to ensure facial recognition supports justice, accuracy and public confidence

Photo: Facial recognition technology can help improve efficiency in several ways, both by helping to uncover new leads and by offering additional information on already identified persons of interest. (Getty Images)

By Don Erickson and Dan Merkle

The benefits of facial recognition are proven and growing across a wide range of consumer, commercial and government applications. In law enforcement, facial recognition software is used to assist in identifying and capturing criminals and bringing justice and closure for victims. At the same time, there is some confusion and misunderstanding around its role in criminal investigations. Put simply, it helps generate identification leads by comparing facial images for similarity.

How facial recognition assists criminal investigations

Across the vast range of applications, facial recognition technology (FRT) has three primary functions, with different outputs:

  • Verification: Verifying that a person matches an enrolled image associated with their identity. (Output: an automated yes/no decision to authenticate.)
  • Identification: Determining that an image matches one or more images enrolled in a database. (Output: either an automated yes/no decision or a likely match flagged for further review.)
  • Investigation: Helping determine whether a matching image is in a database for an investigative purpose. (Output: a series of database images with the highest similarity scores, provided for review; a simple illustrative sketch follows this list.)
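
To make the distinction concrete, the following minimal sketch, written in Python purely for illustration, contrasts a 1:1 verification check with a 1:N investigative search. The similarity function, threshold and gallery are hypothetical placeholders rather than any vendor’s actual interface; the point is that the investigative mode returns only a ranked candidate list for trained human review, never a confirmed identity.

    # Illustrative sketch only: the similarity function, threshold and gallery
    # are placeholders; real systems use vendor-specific algorithms and tuned thresholds.
    from typing import Callable, List, Tuple

    def verify(probe, enrolled, similarity: Callable, threshold: float = 0.90) -> bool:
        """1:1 verification: an automated yes/no decision against one enrolled image."""
        return similarity(probe, enrolled) >= threshold

    def investigate(probe, gallery: List[Tuple[str, object]], similarity: Callable,
                    top_k: int = 10) -> List[Tuple[str, float]]:
        """1:N investigative search: the most similar gallery candidates, ranked by
        score, are returned for trained human review, never as a confirmed identity."""
        scored = [(image_id, similarity(probe, image)) for image_id, image in gallery]
        scored.sort(key=lambda pair: pair[1], reverse=True)
        return scored[:top_k]  # investigative leads only, not evidence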

Federal, state and local law enforcement agencies have used the technology for the latter function for well over a decade, comparing facial images in thousands of investigations. Many public safety officials feel that facial recognition has become a game-changer for keeping communities safe, pointing to instances where crimes would have never been solved or prevented without it. Use under appropriate policies and procedures has been endorsed by the nation’s leading law enforcement professional associations.

Why transparency and policy matter

Documenting law enforcement use of facial recognition is important, as many agencies seek to provide greater transparency around use of the technology and increasingly emphasize in policies and procedures that software results are never to be considered the sole basis for probable cause. A dozen states currently have laws that establish specific conditions for law enforcement use. These can be tied to adoption of uniform statewide policies, such as in Maryland, Virginia, Alabama and Kentucky. The most recent state law, adopted in Maryland in 2024, requires disclosure to defendants in cases where use of the software led to further investigative action (i.e., generated a lead that was acted upon).

Some headlines have painted a misleading picture. For example, an October 2024 Washington Post article expressed concern about law enforcement disclosure practices when using the technology, but that concern was based on comparing the overall number of times queries were performed with the much smaller number of instances in which use was reportedly disclosed. The disparity alone is not a useful metric for transparency.

Don’t confuse leads with evidence

Not all uses of this software result in an investigative lead. Though some agencies may perform a high number of database queries, most of these queries do not generate leads and are thus irrelevant to specific cases. Additionally, some coverage has suggested that the technology performed ineffectively, citing, for example, cases in which the technology flagged a photo as too poor for accurate comparison, did not return any leads, or returned leads that “turned out to be wrong” after further investigation. However, these cases should not be perceived as misidentifications, false positives or failures of technology or policy; verifying a lead is an essential part of the investigative work that must follow any lead in a case.

Unlike DNA or fingerprints, facial recognition does not confirm a match in investigations; instead, it generates multiple images from a database that are the most similar. If a possible match is found, independent investigator-gathered evidence is needed to positively identify a person and establish probable cause, just as when investigators follow up on anonymous tips and other types of leads.

What about misidentifications?

It’s important to address the primary public concern expressed about law enforcement use of the technology: that inaccurate technology could cause misidentification of suspects or even wrongful arrest. Misidentifications do occur in law enforcement activities, sometimes leading to inappropriate temporary detentions. Software-enabled facial recognition accuracy far exceeds that of human-only comparison, and software paired with human verification exceeds the accuracy of either mode alone. Across just seven reported cases in the last six years where use of facial recognition software is alleged to have led to such arrests, it seems clear in each that a breakdown occurred in the human-conducted process of establishing probable cause, the core issue in almost any wrongful arrest. Under proper procedures, the technology itself should simply never be a contributing factor.

These instances also need to be considered within the broader context of the known effectiveness of this technology when used responsibly. The Security Industry Association (SIA) has documented dozens of successes around the country in solving cold cases and violent crimes and fighting human trafficking and child sex crimes. These publicly reported successes are just the tip of the iceberg; there are many more.

And despite persistent but outdated media claims that the technology isn’t accurate enough, high-performing facial recognition technologies today are over 99% accurate, with consistent performance across demographics, according to National Institute of Standards and Technology (NIST) evaluation data. This U.S. government data, the most reliable information available, shows that a large number of leading technologies used in commercial and government applications are well over 99% accurate overall and more than 97.5% accurate across more than 70 different demographic variables.

Additionally, with regard to matching photos from large databases similar to law enforcement applications, NIST’s Investigative Performance evaluation shows that the top 30 technologies can accurately match a photo in a database of up to 12 million images 98% to 99.4% of the time overall. According to a different evaluation that uses race-labeled images, the top 100 technologies are over 99.5% accurate in matching images across Black male, white male, Black female and white female demographics. For the top 60 of these, accuracy for the highest-performing demographic versus the lowest varies only between 99.7% and 99.85%. And perhaps unexpectedly, white male is the lowest-performing of the four demographic groups among these top technologies.

There is growing consensus among law enforcement professionals regarding the technology’s necessity, as well as the appropriate processes and rules surrounding its use. That is why it is critical to take steps that build more public trust that such tools are being used in effective, lawful and nondiscriminatory ways.

Key procedural practices for responsible use

Consistent with SIA’s Principles for the Responsible and Effective Use of Facial Recognition Technology, we encourage any law enforcement agency and/or professional to consider key procedural practices when deploying the technology or crafting policy around its use in investigations. These include:

  1. Policy development and disclosure: Each agency using FRT should publicly post a use policy detailing the applicable procedural requirements.
  2. Legitimate law enforcement purposes: Specific authorized uses should be defined in policy. An active, ongoing law enforcement investigation should be a prerequisite for investigative use, to aid the identification of witnesses, victims or suspects. Additional appropriate public safety and welfare uses of FRT should be authorized (e.g., to help identify a person who is missing, incapacitated or otherwise unable or unwilling to identify themselves).
  3. Program oversight: Each agency should designate a program manager responsible for overseeing and administering use of the technology to ensure policy compliance.
  4. Operator/investigator training: Use of the technology should be limited to specifically authorized personnel who have received appropriate training. Agencies should also ensure investigators receiving software results are trained regarding the nature of an “investigative lead” and the proper steps necessary to follow up on such a lead.
  5. Secondary review: Any results determined to include a potential match candidate should be confirmed as a lead by a secondary trained reviewer unconnected to the investigation.
  6. Discoverability: In cases where a lead was developed using the technology and further investigative action was taken, this should be documented in case files and made discoverable under applicable rules of evidence (a simple record-keeping sketch follows this list).
  7. Procurement standards: Agencies should procure facial recognition technology from algorithm providers whose products are independently tested and evaluated, such as under the NIST Face Recognition Technology Evaluation.
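
As a purely hypothetical illustration of how practices 2 through 6 might translate into day-to-day record keeping, the sketch below defines a simple query record in Python. The field names and structure are assumptions for illustration only and are not drawn from SIA’s principles or any agency’s actual case management system.

    # Hypothetical record-keeping sketch; field names and logic are illustrative
    # assumptions, not any agency's actual case management schema.
    from dataclasses import dataclass, field
    from typing import List, Optional

    @dataclass
    class FRTQueryRecord:
        case_number: str                              # tied to an active, ongoing investigation
        authorized_purpose: str                       # e.g., identifying a witness, victim or suspect
        operator_id: str                              # specifically authorized, trained operator
        candidate_ids: List[str] = field(default_factory=list)   # ranked candidates returned, if any
        secondary_reviewer_id: Optional[str] = None   # trained reviewer unconnected to the investigation
        lead_confirmed_by_review: bool = False        # potential match candidate affirmed as a lead
        further_action_taken: bool = False            # investigators acted on the lead

        def must_be_documented_and_disclosed(self) -> bool:
            """A lead that was acted upon should appear in case files and be discoverable."""
            return self.lead_confirmed_by_review and self.further_action_taken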

We believe facial recognition must be used only for beneficial purposes that are lawful, ethical and nondiscriminatory. View the full policy principles for more information and for guidance on law enforcement use of this powerful tool to help keep our communities safer.

About the authors

Dan Merkle is the founding chair and CEO of Lexipol, a compliance and governance provider and partner to public safety agencies, and serves on the board of directors for the National Policing Institute, an independent, nonpartisan research organization pursuing excellence through science and innovation. He is the chair of the Security Industry Association’s Artificial Intelligence Advisory Board. Merkle is an advisor at FaceFirst, a facial recognition provider to public and private entities, and the former chair and CEO of the company.

Don Erickson was appointed CEO of the Security Industry Association on Nov. 1, 2011. He previously served as SIA director of government relations from 2006 to 2011.

As CEO, Don leads implementation of SIA’s Board Strategic Framework and oversees SIA’s collaboration with industry and vertical market associations and organizations. He is responsible for management of SIA’s operations and programs including marketing, membership, government relations, education and standards initiatives and serves as the organization’s primary liaison to ISC Events.

Prior to joining SIA, Don served as manager of legislative affairs for Alcatel and legislative director for a trade association representing rural telecommunications providers. He spent six years on the senior legislative staff of U.S. Sen. Rod Grams, where he facilitated the enactment of legislation pertaining to criminal justice, telecommunications and technology policy.

Don has served as a member of several boards, including The Catholic University of America Alumni Association, Mission 500, the National Capital Region Security Forum and the IAHSS Foundation. He was selected as a Top CEO by CEO Update magazine in 2013, named by Security magazine as “One of the Most Influential People in Security” in 2018 and inducted into the Security Sales & Integration Security Hall of Fame in 2023. Don currently serves on the U.S. Chamber of Commerce Association Committee of 100 (C100).


Police1 Special Contributors represent a diverse group of law enforcement professionals, trainers, and industry thought leaders who share their expertise on critical issues affecting public safety. These guest authors provide fresh perspectives, actionable advice, and firsthand experiences to inspire and educate officers at every stage of their careers. Learn from the best in the field with insights from Police1 Special Contributors.

(Note: The contents of personal or first person essays reflect the views of the author and do not necessarily reflect the opinions of Police1 or its staff.)
