How AI software could monitor real-time camera feeds to detect criminal behavior
Using AI and facial recognition software for real-time crime reporting is the next logical progression in how police are already using existing technology
Few forces are impacting law enforcement like video. Policing in the Video Age, P1’s yearlong special editorial focus on video in law enforcement, aims to address all facets of the topic with expanded analysis and reporting.
In the final installment of this four-part signature coverage effort, we take a look at the future of video in policing.
Also be sure to check out our latest PoliceOne Digital Edition – 2018 Police Video Guide: The emerging tech, training and tactics shaping law enforcement – in which we explore how departments can best utilize emerging video technologies to enhance police officer safety and improve operational efficiencies.
Companies like Facebook and Google have been working on integrating artificial intelligence (AI) with facial recognition technology for years, and the capability is now rapidly migrating from social media to private security and video surveillance. Surely, it will soon also be employed by law enforcement.
How AI could assist police officers
AI-driven facial recognition software will allow police departments to gather video feeds from municipal cameras, privately owned cameras and even bystanders’ mobile phones.
In fact, this technological capability is not just coming – it’s here. Although not yet widespread in the West, China has a vast network of video surveillance cameras – reportedly more than 20 million cameras – “in what is believed to be the world’s most advanced surveillance system,” according to the Daily Mail. That system scans for aberrant or unusual behavior such as public intoxication or attempted theft.
While not as large in scope or scale as the Chinese deployment, the Detroit News reported recently that the Detroit Police Department “will soon integrate facial recognition software into the department’s Project Green Light video crime monitoring program.” The system will monitor video feeds from participating gas stations, convenience stores and other businesses.
Further, an automated system called AIsight was installed in Boston after the 2013 marathon bombing. The software monitors camera feeds in real time and alerts police if it detects criminal behavior.
The vendors behind these systems stop short of talking about facial recognition – insisting that their software scans for things, not people. In the search for a missing child, for example, the software would look for the child’s clothing, not the child’s face.
But it does not require a tremendous leap of imagination to envision facial recognition software – running on AI – eventually looking for individuals based on previous police booking photos, or even a driver’s license picture.
Using AI for crime reporting is a logical progression
Using AI and facial recognition software for real-time crime reporting is simply the next logical progression in how police are already using existing technology.
For years, investigators have used machine learning to examine archived surveillance video for clues (and, ultimately, evidence for trial) in criminal investigations. They input the face of the person of interest, and the software combs the archived footage for a match.
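Under the hood, that matching step typically works as a nearest-neighbor search: faces are converted to numeric embedding vectors by a model, and the query face is compared against the embeddings extracted from archived frames. The sketch below illustrates the comparison step only; the toy vectors, function names and similarity threshold are all invented for illustration and stand in for real model output.

```python
import numpy as np

def cosine_similarity(a, b):
    # Cosine similarity between two embedding vectors (1.0 = identical direction).
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def find_matches(query, archive, threshold=0.9):
    """Return (frame_index, score) pairs whose stored face embedding
    is similar enough to the query embedding to count as a match."""
    results = []
    for i, emb in enumerate(archive):
        score = cosine_similarity(query, emb)
        if score >= threshold:
            results.append((i, score))
    return results

# Toy data: 3-dimensional "embeddings" standing in for real model output.
query = np.array([1.0, 0.0, 0.0])
archive = [
    np.array([0.99, 0.05, 0.0]),   # near-duplicate of the query face
    np.array([0.0, 1.0, 0.0]),     # a different face
    np.array([0.95, 0.1, 0.05]),   # another close match
]
matches = find_matches(query, archive)
```

In a real deployment the archive would hold millions of embeddings, so production systems use approximate nearest-neighbor indexes rather than a linear scan; the comparison logic, however, is essentially the same.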
“The need has become overwhelming,” said Bud Levin, an expert in AI at the FBI’s Training Division. “Everybody and every place seem to have one or more image-recording devices, from modern cameras and phones to relatively ancient CCTV devices. BWC images alone can swamp police departments and the offices of prosecutors if viewing must be done manually.”
Richard Myers, executive director of the Major Cities Chiefs Association, believes that the role of AI with any public safety camera systems – whether body worn or other – will surely evolve as the algorithms get more robust and facial recognition becomes more reliable.
“Technology already exists in the private sector security where AI helps stores identify behavior patterns that predict an imminent theft from a store shelf via CCTV,” Myers said. “Logically, some of that behavior pattern analysis AI will eventually be integrated into public safety video systems. It is probably more likely that this will first appear in ‘fixed’ CCTV systems, as the dynamic stream of video from body-worn systems will require even more AI rapid analysis.”
Myers added that just as automated license plate reader (ALPR) technology provides immediate feedback when a wanted vehicle passes an ALPR-equipped camera, facial recognition software integrated with body-worn cameras may provide police officers with immediate information about who they’re dealing with.
“There will no doubt be significant legal debate about privacy rights and the like,” Myers said. “It will be important for public safety advocates to assert that there is no expectation of privacy in a public place, nor when engaged in a direct interaction with a uniformed police officer. This is a highly nuanced issue, however, as the facial recognition could work inside a private space where an officer isn’t yet identified as such.”
Could AI software provide real-time crime alerts?
Will AI and facial-recognition software eventually be used to alert police about crimes in progress or the sighting of a wanted fugitive in real time – essentially bypassing the 911 call-in system of crime reporting?
Levin says it will, but that it will be a while before we get there.
“One problem will be false positives and another will be writing AI so that it understands the complexities and absurdities of the Constitution, our laws and policies. Just as we will be reducing the need for patrol officers as autonomous vehicles gradually but inevitably supplant driver-operated vehicles, AI will be used to reduce demand for investigative and patrol manpower.”
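Levin’s false-positive concern is easy to quantify with back-of-the-envelope arithmetic: even a highly accurate matcher, applied to millions of faces a day, produces a flood of false alerts when genuine targets are rare. All of the figures below are illustrative assumptions, not measurements of any real system.

```python
# Illustrative base-rate arithmetic (every figure here is an assumption).
faces_scanned_per_day = 1_000_000   # faces passing the cameras each day
wanted_in_population = 10           # actual watchlist subjects among them
false_positive_rate = 0.001         # 0.1% of innocent faces wrongly flagged
true_positive_rate = 0.99           # 99% of real targets correctly flagged

false_alerts = (faces_scanned_per_day - wanted_in_population) * false_positive_rate
true_alerts = wanted_in_population * true_positive_rate

# Of all alerts raised, the fraction that point at a genuine target:
precision = true_alerts / (true_alerts + false_alerts)
```

Under these assumed numbers the system raises roughly a thousand false alerts for every ten true ones, meaning fewer than 1 percent of alerts would identify an actual target – which is why analysts, not algorithms, would still have to vet each hit.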
There’s a potentially slippery slope here – the term “mission creep” comes to mind.
At the outset, the intention of such initiatives may be to catch the next active shooter or radicalized Islamist terrorist before they can unleash an attack. And most Americans would probably be okay with that – fear dramatically reduces concern about civil liberties, after all.
But as is the case in China, the software will do what you tell it to do, and the Chinese have told it to look for much more benign offenses than mass killing – public intoxication, while a nuisance, is not nearly as serious as international terrorism.
However, with the advances of AI and machine learning, the storylines of films like Enemy of the State and Minority Report begin to look a lot less like fiction than they did when those movies premiered in theaters.
Addressing privacy concerns as artificial intelligence expands in scale
Privacy rights proponents say that implementing AI in video surveillance on such a broad scale essentially assumes that everyone is a potential criminal. And privacy concerns are not limited to Hollywood producers and privacy rights groups.
In 2014, the U.S. Department of Justice warned that “the use of facial recognition and BWCs may pose serious risks to public privacy. Agencies that explore this integration and other new technologies should proceed very cautiously and should consult with legal counsel and other relevant stakeholders.”
Furthermore – and perhaps most important – facial recognition software is simply not yet accurate enough to reliably search for fugitives in real time.
While facial recognition is much more successful on a small scale right now, researchers are working on developing massive-scale AI solutions.
For instance, the MegaFace Challenge – maintained by the University of Washington – “is the world’s first competition aimed at evaluating and improving the performance of face recognition algorithms at the million person scale,” according to UW.
Ira Kemelmacher-Shlizerman – the lead investigator on MegaFace – told the Daily Mail, “We need to test facial recognition on a planetary scale to enable practical applications. Testing on a larger scale lets you discover the flaws and successes of recognition algorithms.”
How will AI evolve?
As to the question of when this all may come to fruition, Myers points out that some futurists believe AI’s evolution has progressed more slowly than originally forecast, but that, like all technologies, its evolution is tied to a number of factors – not least economic support and marketplace demand.
“With major police vendors now implementing AI into their systems, it may accelerate the pace of its evolution,” Myers said. “I would think AI and behavior pattern analysis could be integrated into public safety CCTV systems within five years or less, and facial recognition with BWCs could also occur within that time frame. As with most technology advances, however, the limiting factors won’t be the technology, but the human interface, ease of use and user awareness, and the legal debate and limitations.”
“Dedicated video is only part of the AI story,” said Levin. “Consider the IoT. Consider clouds full of data – warehouses ripe for the tapping and AI extracting. Consider also that – well into the digital information age – we cling desperately to industrial-age concepts such as privacy, confidentiality, anonymity and secrecy. ‘Them cows done left the barn.’ Sooner or later, we’ll realize that for most of what we want to hide the cost of keeping them hidden is way beyond the value.”
The Singularity – the point at which computers and machines become more intelligent than the combined intelligence of all living humans – hasn’t happened yet, but with the rapid acceleration of AI and machine learning, it’s apparent that the prediction proffered by Professor Vernor Steffen Vinge in the early 1990s is very likely to come true.
And it will all be recorded on video.