Most tools used in law enforcement are well-known and straightforward in their function – service weapons, handcuffs, patrol cars and even body-worn cameras all serve clear, specific purposes. There’s little mystery surrounding how or why they’re used.
However, some tools are more complex, especially those that officers don’t rely on every day. Facial recognition technology (FRT) falls into this category – a tool that’s gaining popularity yet remains something of a mystery to those who haven’t used it firsthand.
Several myths surround FRT, including the claim that the technology is inherently racially biased. Officers using it may encounter other common misconceptions as well – here’s the truth.
Myth No. 1 – FRT is used without any training, policy or oversight
Many critics of facial recognition technology claim it is arbitrarily handed to officers, who can then run searches on images of anyone, anywhere and for any reason.
“Every tool used by law enforcement should be guided by a clear policy. Since the early days of FRT implementation, we have encouraged agencies to adopt policies that include comprehensive training and administrative oversight. Over time, officer training has significantly advanced, with more robust education and support resources now available,” explained Josh Findley, retired Homeland Security Investigations special agent and senior partner manager at Clearview AI.
“Every tool an officer has, whether it be handcuffs, a gun, a car, access to law enforcement databases – must be used responsibly,” he continued. “Our tool is no different. Our focus is on educating law enforcement on its ethical and proper use and providing the resources to help ensure it’s used responsibly.”
When an agency partners with Clearview AI, it is required to adhere to the platform’s terms of service, which include designating an individual at the agency to audit officers’ use. Upper management within the agency must sign off on the use before deployment.
“It really boils down to the fact that our customers are vetted, and we apply a strong security layer for who can access the platform, how they’re going to access the platform and then, ultimately, it’s up to the end user to ensure it’s being used appropriately,” said Amos Kyler, Clearview AI’s chief technology officer.
Prior to use, Clearview AI provides an agency with training on how to properly use the Clearview platform. Any changes or upgrades to the platform are accompanied by recurring training and regular webinar events, and Clearview AI’s customer success team is available to provide additional support as needed.
Findley says the company does not have access to customer search results – a safeguard that protects ongoing criminal investigations and agency privacy – so it’s critical to train auditors within the agency on how to use the platform’s auditing tools.
Myth No. 2 – FRT replaces the need for a full investigation
There’s no question facial recognition technology has the potential to save investigators countless hours by providing a lead, particularly when there is little other information to help determine which steps to take next. However, an image search result on Clearview AI is designed to be viewed as a lead – one step in the investigative process, not a conclusion.
“It’s ingrained into our training and in every step of our tool,” said Findley, “so whether you’re a new user and you’re signing an agreement about being able to use the platform or every time you sign into our platform you have to review, understand and accept our policy beforehand – that our platform is not a verification of identity, it is only a lead.”
He likens the use of FRT to a scenario where an agency runs a picture of a perpetrator on the nightly news and receives dozens of hotline calls identifying that person. “You don’t just go out and arrest that person,” he said.
Depending on the type of case an investigator is working, their next steps after obtaining a lead through Clearview AI might vary. If officers are able to find a link to a social media account, they might look at other pictures to see if the information there helps prove or disprove a person’s involvement. Or, if a mug shot was identified using FRT, investigators might take steps to find out the person’s address, when the mug shot was taken and for what type of crime they were arrested.
While the use of FRT might be straightforward in some types of investigations, other crimes may pose a greater challenge. For example, a scam involving catfishing, where the picture a perpetrator uses isn’t even of themselves, might require more investigation beyond simply obtaining a lead via facial recognition.
“It gives you an idea of where to look, but it can’t give you a definitive answer,” said Kyler. “It can give you somewhere to look and then you can run through the investigative process to know if that was actually the right place to look or not.”
Myth No. 3 – FRT can be used successfully with images of any quality
Television shows often mislead viewers by portraying FRT as a tool that can effortlessly transform a blurry image into a clear, identifiable face. In reality, as many investigators know, the technology doesn’t work that way. The effectiveness of FRT depends heavily on the quality and characteristics of the images being analyzed, and results can vary significantly.
“There’s definitely a correlation between image quality and the quality of the results,” said Kyler. “Images that are more blurry have less information to perform an accurate match. We tell users if the image is low quality, that it may impact their results and to proceed with caution.”
However, Findley notes that even with low-quality images, some agencies can get surprisingly accurate results. Clearview AI includes several image tools to help improve search results, including cropping and rotating an image along with other photo editing options.
While FRT tools have a baseline level of performance – meaning they will function similarly regardless of whether a blurry image has been clarified or not – it’s best to try to obtain high-quality images from the start. “If you have CCTV footage, for example, it’s best to get the source video and upload the video into Clearview AI and choose a still image from it,” said Kyler.
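To illustrate the point about working from source video rather than a degraded screenshot, here is a minimal, hypothetical Python sketch – using the open-source OpenCV library, not any Clearview AI tooling – that scans a clip and pulls out the sharpest frame, with Laplacian variance as a rough focus measure. The file names are placeholders for illustration only.

```python
import cv2

def sharpest_frame(video_path: str, step: int = 5):
    """Scan a video and return the frame with the highest Laplacian
    variance, a common rough proxy for sharpness/focus."""
    cap = cv2.VideoCapture(video_path)
    best_frame, best_score = None, -1.0
    index = 0
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        if index % step == 0:  # sample every Nth frame to save time
            gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
            score = cv2.Laplacian(gray, cv2.CV_64F).var()
            if score > best_score:
                best_frame, best_score = frame, score
        index += 1
    cap.release()
    return best_frame, best_score

# Hypothetical file names, for illustration only.
frame, score = sharpest_frame("cctv_clip.mp4")
if frame is not None:
    cv2.imwrite("best_still.png", frame)
```

In practice, uploading the source video to the platform and selecting a still there, as Kyler suggests, accomplishes the same goal without any extra tooling.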
Yet even with less-than-ideal images, some agencies have found success using FRT. Findley recalls a case in Colorado where people broke into a gun store wearing masks and drove away in a stolen vehicle. While this scenario doesn’t seem to offer much in the way of leads, law enforcement found footage from a taco stand drive-through in the area and was able to clean up the image well enough to obtain a lead using Clearview AI.
“FRT is a huge easy button for law enforcement,” said Findley. “It’s definitely not the only tool to use, but it saves a ton of time, and there are so many cases that I’ve personally been involved with that they wouldn’t have been solved without that jump start.”
Visit Clearview AI for more information.