
The beat partner who’s never walked a beat

How artificially intelligent virtual assistants will supplement future police staffing and experience


Editor’s note: This article is based on research conducted as a part of the CA POST Command College. It is a futures study of a particular emerging issue of relevance to law enforcement. Its purpose is not to predict the future; rather, to project a variety of possible scenarios useful for planning and action in anticipation of the emerging landscape facing policing organizations.

The article was created using the futures forecasting process of Command College and its outcomes. Managing the future means influencing it — creating, constraining and adapting to emerging trends and events in a way that optimizes the opportunities and minimizes the threats of relevance to the profession.

Article highlights

  • The article discusses the use of a virtual emergency response assistant (VERA), an AI system, in law enforcement.
  • The U.S. Bureau of Labor Statistics reported that in 2021, over 47 million Americans voluntarily quit their jobs, including many from law enforcement. This has led to an understaffed and more junior policing profession. The solution to this problem could be AI technology like VERA.
  • To be effective in law enforcement, an AI platform must be adaptable to societal changes and role-based to help various levels of professionals. AI and robotics are a priority for many industry leaders, as they expect these intelligent tools to bring substantial cost savings and efficiencies to their industries.
  • While there is evidence that AI systems like VERA could deliver efficiencies 24/7, there are significant challenges to integrating AI into modern patrol vehicles. These include ensuring that the AI software is not encoded with biases, and ensuring the highest levels of privacy for the data it collects and stores.

By Lieutenant Wesley Herman
It was a blistering 94 degrees by 9:30 a.m. on a sunny Saturday morning in Southern California. Officer Flores, a newer police officer, received an urgent radio call about a lost child at a busy neighborhood park.

With adrenaline coursing through her veins, Officer Flores started receiving information from VERA (virtual emergency response assistant), an artificially intelligent virtual partner. Building on information seen and heard through Flores’s body-worn camera, police radio earpiece and the department’s record management system, VERA expeditiously scanned thousands of prior reports of missing children, surveillance footage from the vicinity and even weather patterns. It quickly provided Flores with the top three probable locations the child may have wandered off to within the park, significantly narrowing down the search area and time needed to explore it.
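At its core, a step like this is a data-fusion and ranking problem. The following Python sketch is purely illustrative, with hypothetical location attributes and hand-picked weights rather than anything drawn from an actual VERA system, but it shows the shape of how candidate search areas might be scored and the top three surfaced to an officer:

```python
from dataclasses import dataclass

@dataclass
class Candidate:
    name: str
    prior_incidents: int  # times lost children were located here in past reports
    distance_m: float     # distance from the child's last known position
    hazard_score: float   # 0..1, e.g., proximity to water or roadways

def rank_locations(candidates, top_n=3):
    """Score each candidate area and return the top_n most probable ones."""
    def score(c):
        # Weights are illustrative only; a real system would learn them from data.
        return 2.0 * c.prior_incidents - 0.01 * c.distance_m + 1.5 * c.hazard_score
    return sorted(candidates, key=score, reverse=True)[:top_n]

park_areas = [
    Candidate("playground", prior_incidents=12, distance_m=80, hazard_score=0.2),
    Candidate("duck pond", prior_incidents=7, distance_m=150, hazard_score=0.9),
    Candidate("parking lot", prior_incidents=3, distance_m=220, hazard_score=0.7),
    Candidate("picnic grove", prior_incidents=5, distance_m=60, hazard_score=0.1),
]

for area in rank_locations(park_areas):
    print(area.name)
```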

As Flores arrived at the park, VERA continued to assist her. It scanned the crowds using real-time facial-recognition technology, comparing faces to known images of the missing child. Suddenly a match appeared on Flores’s heads-up display, highlighting a couple walking hand in hand with the lost child. With VERA’s guidance, Flores approached the couple and calmly explained the situation. The couple, who had found the child wandering alone, had been uncertain about what to do. Thankful for VERA’s swift assistance, Flores ensured the child was reunited with her parents and moved on to assist the next community member in need.

As the months passed, VERA became an indispensable member of the police force. It assisted officers in locating missing persons, provided accurate legal and policy advice, synthesized and summarized reports that saved its patrol partners hours at the end of each shift, and even helped bridge the gap between non-English-speaking community members and law enforcement by providing real-time translation services.

The community grew accustomed to engaging with VERA and utilized its services to report crimes, obtain information and receive timely updates. Flores’s police chief also became quite comfortable with VERA’s seamless integration into the department. Like agencies throughout the country, Flores’s department faced significant staffing shortages and had no clear path forward through new recruitment or retention strategies. VERA represented an impressive and innovative partnership with AI, one that now offset those shortages with a credible, capable and compassionate assistant that simplified the workload for every patrol officer in Flores’s organization.

So, with declines in staffing and experience on the street, where will law enforcement agencies source the next generation of professional officers? And could virtual partners driven by artificial intelligence (AI) help offset the staffing and experience shortfalls left by a younger, less-experienced workforce?

Yes, they could. Here’s how.

The “great resignation” comes to law enforcement


In 2021, more than 47 million Americans voluntarily quit their jobs, according to the U.S. Bureau of Labor Statistics. [1] American law enforcement wasn’t immune – the profession saw a record number of midcareer resignations (up 18% in 2021 from the prior year) and end-of-career retirements (up 45% during the same period) according to a 2021 study conducted by the Police Executive Research Forum. [2]

These rates of resignation are not unique to policing. Human resources expert Ian Cook analyzed more than nine million employee records from over 4,000 global companies and confirmed that “resignation rates were higher among employees who worked in fields that had experienced extreme increases in demand due to the pandemic, likely leading to increases in workloads and burnout.” [3] Combined with a diminished sense of purpose amid decriminalization in the legal system and staff fatigue from working extra shifts, these pressures have only amplified the massive staffing shortfall.

The result is an understaffed and more junior policing profession, compounded by the increased societal pressure to avoid making errors while operating on camera and trying to solve some of society’s most complex social challenges and criminal behaviors. Consequently, American law enforcement is facing a reduced value proposition for those wanting to serve. It’s no surprise the recruitment and retention crises have only grown larger. How can policing continue to provide service in the face of diminishing numbers?

What’s required of an AI platform for police?


With businesses worldwide projected to spend $110 billion annually on artificial intelligence (AI) by 2024, the technology is expected to be the “disrupting influence changing entire industries over the next decade.” [4] Forecast AI enhancements include better-informed decision-making, efficiencies from automating time-consuming repetitive tasks, streamlined basic service requests from the community, automated computer-aided dispatch (CAD) responses, real-time language translation, automated alerts and continual improvements to human-computer interaction. Calls for police transparency and increased community oversight will also likely continue. By automating many of the mundane tasks of law enforcement, AI would free the human officer to focus on emergencies and high-priority incidents with greater concentration and clarity.

For an AI platform to be successful in policing, though, it must be adaptable to changes in society. Additionally, given the hierarchical structure of the police profession, the AI partner must be role-based to best provide advice to the various levels of professionals it’s assisting, such as an officer, dispatcher or supervisor.
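As a rough illustration of what “role-based” could mean in practice, the short Python sketch below, with hypothetical roles and guidance strings rather than an actual VERA design, shows how the same query might be framed differently depending on who is asking:

```python
# Hypothetical role-based configuration: the same query is handled differently
# depending on whether an officer, dispatcher or supervisor is asking.
ROLE_GUIDANCE = {
    "officer": "Summarize relevant law, policy and officer-safety considerations.",
    "dispatcher": "Summarize caller history, unit availability and response priority.",
    "supervisor": "Summarize staffing impacts, risk exposure and required notifications.",
}

def build_prompt(role: str, query: str) -> str:
    """Frame a request to the AI assistant according to the requester's role."""
    guidance = ROLE_GUIDANCE.get(role, ROLE_GUIDANCE["officer"])
    return f"Role: {role}\nInstruction: {guidance}\nQuery: {query}"

print(build_prompt("dispatcher", "Burglary alarm at 4th and Main"))
```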

Virtual assistants themselves are not new. Society has embraced them in the entertainment and home automation fields through voice-activated assistants like Apple’s Siri, Amazon’s Alexa and Microsoft’s Cortana. These systems are already adept at both prospective memory (“remind me to…”) and retrospective memory (“remind me of…”) tasks. [5] Integrating them into professional fields like policing, where effective multifaceted conversations and informed decision-making are a necessity, is a likely next step in their evolution. AI and robotics are a priority for many industry leaders, who expect these intelligent tools to bring substantial cost savings and efficiencies to their industries. [6] With widespread adoption occurring in the automotive, manufacturing, health care, hospitality and telecommunications industries, adoption by law enforcement seems inevitable.

In November 2022, OpenAI’s ChatGPT was introduced as an advanced natural language processing (NLP) chatbot and reached more than 100 million monthly active users within two months of launch, making it the fastest-growing consumer application in history. [7] By comparison, it took TikTok nine months and Instagram 30 months to reach that level of use, confirming that the demand for AI is here now.

When seconds count, a properly designed, professional-grade AI virtual partner can thoroughly and efficiently collect and sort millions of data points in a hands-free platform, delivering timely intelligence and resources through NLP voice control. Listening for a “wake” word, VERA can stand by, ready to assist at a moment’s notice, continually collecting data and improving its usefulness to its human partner. Officers partnered with VERA will have the reassurance of knowing it is an expert in law, policy and emotional intelligence, all of which makes them better at their jobs. These virtual partners will also provide a far more efficient way to share intelligence across platforms and improve the quality of life for both officers and the communities they serve.
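A minimal sketch of that hands-free, wake-word pattern might look like the Python below. The transcript feed, wake word and command handling are all stand-ins; an actual deployment would rely on on-device speech recognition and secure integrations with records and CAD systems.

```python
WAKE_WORD = "vera"  # illustrative wake word

def transcript_stream():
    """Stand-in for a live speech-to-text feed from the officer's microphone."""
    yield "unit 12, copy the detail at the park"
    yield "vera, run a records check on plate 7ABC123"

def handle(command: str) -> str:
    # Placeholder for routing the request to records, CAD, translation, etc.
    return f"Acknowledged: {command}"

for utterance in transcript_stream():
    text = utterance.lower().strip()
    if text.startswith(WAKE_WORD):
        command = text[len(WAKE_WORD):].lstrip(" ,")
        print(handle(command))  # the assistant responds only after its wake word
```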

Yet even if VERA could deliver these efficiencies 24/7 at a moment’s notice, considerable issues and challenges remain before AI can be successfully integrated into the modern patrol vehicle.

What are we afraid of?


With little U.S. government oversight, private companies are using AI software to make decisions about health and medicine, employment, creditworthiness and even criminal justice, with no clear answer on how to ensure the technology is not encoded, consciously or unconsciously, with structural biases absorbed during its machine learning training. [4] Even if VERA is never programmed with purposeful bias, the potential for unintended biases to emerge must be carefully monitored so that public perception and trust remain at the highest levels while officers work alongside VERA-like systems.
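Monitoring for unintended bias can start with something as simple as routinely auditing the system’s outputs across demographic groups. The Python sketch below, using made-up audit data, computes per-group selection rates and a disparate-impact ratio; a ratio well below 1.0 (a common rule of thumb is 0.8) would flag the results for human review.

```python
from collections import defaultdict

def selection_rates(decisions):
    """decisions: list of (group_label, was_flagged) pairs from an AI system's output."""
    totals, flagged = defaultdict(int), defaultdict(int)
    for group, was_flagged in decisions:
        totals[group] += 1
        flagged[group] += int(was_flagged)
    return {group: flagged[group] / totals[group] for group in totals}

def disparate_impact_ratio(rates):
    """Ratio of the lowest to the highest group selection rate."""
    return min(rates.values()) / max(rates.values())

audit_sample = [("group_a", True), ("group_a", False), ("group_a", False),
                ("group_b", True), ("group_b", True), ("group_b", False)]

rates = selection_rates(audit_sample)
print(rates, disparate_impact_ratio(rates))
```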

Ensuring VERA can adequately secure its data to the highest levels of privacy will be another crucial requirement for implementation. Law enforcement routinely has access to the most personal information about the people and communities it serves. Safeguarding the collection, storage and dissemination of that information, and guaranteeing it is shared only with authorized law enforcement partners, will require significant protection and security.
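At a technical level, that means encrypting records at rest and restricting who can read them back. The sketch below, written in Python with the third-party cryptography package and an entirely hypothetical role allow-list, is one simple way such safeguards could look; real deployments would add managed key storage, audit logging and far finer-grained access control.

```python
# Requires the third-party "cryptography" package (pip install cryptography).
from cryptography.fernet import Fernet

AUTHORIZED_ROLES = {"officer", "dispatcher", "supervisor"}  # illustrative allow-list

key = Fernet.generate_key()  # in practice, keys would live in a managed key store
vault = Fernet(key)

def store(record: str) -> bytes:
    """Encrypt a record before it is written to the assistant's data store."""
    return vault.encrypt(record.encode())

def retrieve(token: bytes, requester_role: str) -> str:
    """Decrypt only for authorized law enforcement roles; refuse everyone else."""
    if requester_role not in AUTHORIZED_ROLES:
        raise PermissionError(f"role '{requester_role}' may not access this record")
    return vault.decrypt(token).decode()

token = store("RP: Jane Doe, DOB 01/01/1990, last seen near the duck pond")
print(retrieve(token, "officer"))
```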

In Stanley Kubrick and Arthur C. Clarke’s classic 1968 film “2001: A Space Odyssey,” HAL, the machine-learning brain of the spaceship Discovery, decided to eliminate the entire crew after trying to resolve a programming contradiction created by two mutually exclusive directives: the politicians in charge of the mission ordered HAL to lie to the crew about the actual mission goals, while the engineers and mission planners ordered HAL to be completely truthful. HAL concluded that the humans involved posed a danger to it and autonomously determined they had to be eliminated. Although HAL is a fictional AI machine, fears about AI operating outside the parameters set by humans remain an issue today.

Not surprisingly, modern AI ethics boards, like the one created by Axon Enterprise, Inc., have noted that the agreed-upon responsible way to utilize AI technology is in collaboration with human partners, not instead of them: accelerating tedious workflows while leaving decision-making to officers and their agencies. [8] AI virtual partners like VERA may be able to predict crowd behavior, detect patterns and irregular behaviors, protect critical infrastructure or perhaps uncover complex criminal networks. [9] More generally, AI has the potential to build on human performance by increasing a professional’s knowledge, skills and abilities, essentially supercharging their experience. [10] This would allow law enforcement agencies to recruit from a wider range of fields and professional backgrounds, knowing AI would enhance a mid-career professional’s knowledge and experience coming into policing.
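In code, that human-in-the-loop principle reduces to a simple rule: the AI proposes, a person decides, and the decision is logged. The Python sketch below is a hypothetical illustration of that workflow, not a description of any vendor’s product.

```python
from dataclasses import dataclass

@dataclass
class Recommendation:
    summary: str
    confidence: float  # 0..1, as reported by the model

def officer_review(rec: Recommendation, approve: bool) -> str:
    """The AI only recommends; a human officer makes the final call, and that
    decision is recorded for accountability."""
    decision = "APPROVED" if approve else "REJECTED"
    return f"{decision} by officer | model confidence {rec.confidence:.2f} | {rec.summary}"

rec = Recommendation("Vehicle matching BOLO seen on 5th St camera at 09:42", 0.81)
print(officer_review(rec, approve=True))
```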

Others have expressed concern that AI would replace human workers outright, compounding the disruption rather than easing it. If there were a wholesale transition from human workers to machines, the government would need to consider the significant impact on housing, education systems and food suppliers given the potential for massive unemployment. It would then need to identify new ways to generate revenue without payroll taxes or other funds generated by a human workforce. In the near term, though, AI appears to be a collaborative tool for humans, rather than a replacement for them.

The notion that AI is a tool, not a replacement for human decision-making, was recently examined by social science Ph.D. candidate Hakan Aksoy. He concluded that AI falls within the parameters of property and therefore cannot be held liable for crimes arising from its use or directives, since those must ultimately be carried out by a human. [11] If AI were ever to acquire “human” status, making fully autonomous and conscious decisions, radical changes would be needed in our legal system. For now, though, collaborative AI would simply inform, not independently determine, human-directed outcomes.

Solutions


With the saturation of wearable devices and technology and the increase in automation among the Internet of Things (IoT), smart cities and forward-thinking police agencies are moving toward technology as a force multiplier to supplement staffing and maximize the performance of professionals they currently employ. Law enforcement has willingly adopted technologies like unmanned aerial systems, body-worn cameras, automated license plate reader technology, crowdsourced crime-prevention data from individual cell phone users, and gunshot-detection audio devices into an existing suite of technology-driven enhancements to public safety. It is not a stretch, then, to imagine agencies adopting a virtual partner when one becomes available.

There are numerous instances where the addition of more intelligence and increased situational awareness has led to enhanced decision-making that resulted in safer, and often more peaceful, outcomes for officers and their communities. Any AI that leads to fewer deaths or injuries while enhancing human interactions with the community should be welcomed.

Imagine an interconnected network of VERA assistants working together during the next active shooter incident. The reduction in the time it would take VERA to scrape the massive amounts of data linking suspects and victims to the incident would be extraordinary. Seamlessly reviewing the details of all active officer body-worn cameras, regional automated license plate readers, authorized closed circuit television (CCTV) video cameras from nearby businesses, historical police CAD data and facial-recognition software systems would be a start. Evidence that would take multiple dispatchers, officers and investigators hours or days to collect, analyze and use for decisions could be obtained within seconds. Patterns of behavior could be detected, and even possible future suspect actions might be accurately predicted from the analyzed data.
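The core of that capability is merging time-stamped events from many sources into one picture. The Python sketch below fuses a few fabricated body-worn camera, license plate reader and CAD events into a single chronological timeline; the feeds, timestamps and field values are all hypothetical.

```python
from datetime import datetime

# Hypothetical event feeds; a real deployment would pull these from body-worn
# cameras, ALPR hits, CCTV analytics and CAD records in real time.
bwc_events  = [("2030-06-01 10:02", "bwc",  "shots heard near north entrance")]
alpr_events = [("2030-06-01 09:58", "alpr", "plate 7ABC123 entering lot B")]
cad_events  = [("2030-06-01 10:01", "cad",  "911 call: active shooter, building 4")]

def fused_timeline(*feeds):
    """Merge events from every source into one chronological timeline."""
    merged = [event for feed in feeds for event in feed]
    return sorted(merged, key=lambda e: datetime.strptime(e[0], "%Y-%m-%d %H:%M"))

for timestamp, source, description in fused_timeline(bwc_events, alpr_events, cad_events):
    print(timestamp, source.upper(), description)
```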

Would this save lives? Could the increase in situational awareness and intelligence prevent additional victims or officer deaths? How about easing officer stress, fatigue and burnout? It’s not only possible but very likely – and when technologies emerge that can act like VERA, astute leaders will work to adopt them.

Conclusion


The acceptance of artificially intelligent virtual partners within policing seems inevitable. AI has already been implemented within dozens of industries worldwide, and it has proven to be a more efficient way to accomplish mundane tasks, provide additional insight and forecast future outcomes, increasing decision-making capabilities. Law enforcement is at the crossroads of information sharing and staffing constraints. With increased amounts of information being collected, analyzed and disseminated from already-reduced staffing levels, the door is wide open for AI technology to supplement law enforcement.

The enhancements AI would bring are difficult to deny. And while considerable additional research and analysis remain to ensure a neutral, objective and nonbiased implementation of the technology, the overwhelming benefits possible to an already-overstressed policing system are indisputable.

Transparency and trust will be at the forefront of the discussion if adoption of the technology and its capabilities is desired. Enhancing less experienced officers’ decision-making, supercharging their lived professional experience, and reducing human errors and overall organizational liability and officer stress are just a few of the many reasons why AI technology is here to stay.

Topics for discussion


1. Given the potential for AI to augment staffing shortages, how might we prepare our officers and community for the integration of artificial intelligence like VERA into everyday policing? What training or educational programs should be implemented?

2. While AI such as VERA could provide significant efficiencies, how should we address concerns about potential structural bias in AI programming and the security of personal data collected by these AI systems?

3. Considering the ongoing “Great Resignation” within law enforcement and the potential for AI to fill certain roles, how can we redefine the role of human officers to maximize the benefits of technology while maintaining essential human elements of law enforcement?

References


1. Fuller J, Kerr W. The great resignation didn’t start with the pandemic. Harvard Business Review. March 2022.

2. Police Executive Research Forum. PERF Special Report: Survey on police workforce trends. June 2021.

3. Cook I. Who is driving the great resignation? Harvard Business Review. September 2021.

4. Pazzanese C. Great promise but potential for peril. Harvard Gazette. October 2020.

5. Brewer RN, Morris MR, Lindley SE. How to remember what to remember: Exploring possibilities for digital reminder systems. Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies. September 2017.

6. Brown S. What ‘work of the future’ means to 5 business leaders. MIT Sloan School of Management. May 2022.

7. Hu K. ChatGPT sets record for fastest-growing user base – analyst note. Reuters. February 2023.

8. Smith R. Axon’s AI work: What’s ahead. Axon. May 2017.

9. Rigano C. Using artificial intelligence to address criminal justice needs. Confederation of European Probation. November 2020.

10. Stackpole B. 5 steps to ‘people-centered’ artificial intelligence. MIT Sloan School of Management. January 2020.

11. Aksoy H. Artificial intelligence assets and criminal law.

About the author

Lieutenant Wesley Herman has worked for the Citrus Heights Police Department (CHPD) in Northern California for over 15 years. During that time, he oversaw the startup of two new innovative investigation units and their teams. He’s led and worked in multiple positions in patrol operations, special operations and investigations focused on strategic public-private partnerships and programs that utilize technology to maximize efficiencies for law enforcement leaders. Additional responsibilities included the management of the firearms, K-9, honor guard and mobile crisis support units.

Herman is a recent graduate of the prestigious California POST Command College program, Class 70. He earned a Master of Science degree in law enforcement and public safety leadership from the University of San Diego in 2021, and a Bachelor of Science degree in economics and business administration from Saint Mary’s College of California. He is focused on innovative police futures and is passionate about envisioning tomorrow’s law enforcement profession.

