4G, or Long Term Evolution (LTE), wireless broadband holds the promise of bringing far more than just fast internet access and fewer dropped phone calls. As the computing power of handheld devices increases and those processors gain fast access to the volumes of data they can digest, the law enforcement officer of the near future will command resources most of us haven’t even dreamed of.
It’s important to consider the processing power of modern smartphones along with the big data pipes that 4G brings. The latest Android smartphone, the Droid Bionic, contains a dual-core processor running at 1 GHz and a gigabyte of random access memory. That is roughly the computing power of the fastest desktop machine you could buy four or five years ago, and close to that of a refrigerator-size Cray supercomputer from fifteen years ago, yet it fits in your shirt pocket. The smartphones we have now are more accurately described as computers that happen to make phone calls and use wireless broadband data.
Another aspect of smartphones that adds tremendously to their functionality is that they are location-aware. Most of the time, they use GPS signals to determine where they are. If the GPS signal is unavailable (as will sometimes be the case inside a building), they derive a more general location by polling the cell towers and Wi-Fi networks in range of their antennas and measuring the time the signals require to make the trip to and from the device. Continuous polling tells the computer whether it’s moving, and if so, in which direction and at what velocity. This information is always available to the processor to supplement and refine the information presented to the user.
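The tower-polling fallback described above is, in essence, trilateration: each round-trip time yields a range to a tower of known position, and the ranges pin down the handset’s location. Here is a minimal sketch of the idea in Python, with made-up coordinates and no allowance for the noise, multipath, and clock error a real handset must contend with:

```python
SPEED_OF_LIGHT = 299_792_458.0  # meters per second

def trilaterate(towers, round_trip_times):
    """Estimate a 2D position from round-trip signal times to three towers.

    towers: three (x, y) tower coordinates in meters
    round_trip_times: three round-trip times in seconds
    Returns the estimated (x, y) position in meters.
    """
    # Convert each round-trip time to a one-way range.
    d = [SPEED_OF_LIGHT * t / 2.0 for t in round_trip_times]
    (x0, y0), (x1, y1), (x2, y2) = towers

    # Subtracting the first range circle's equation from the other two
    # cancels the squared unknowns, leaving two linear equations.
    a11, a12 = 2 * (x1 - x0), 2 * (y1 - y0)
    a21, a22 = 2 * (x2 - x0), 2 * (y2 - y0)
    b1 = x1**2 - x0**2 + y1**2 - y0**2 - d[1]**2 + d[0]**2
    b2 = x2**2 - x0**2 + y2**2 - y0**2 - d[2]**2 + d[0]**2

    det = a11 * a22 - a12 * a21  # zero if the three towers are collinear
    return ((b1 * a22 - b2 * a12) / det, (a11 * b2 - a21 * b1) / det)
```

Real devices fuse many more measurements (signal strength, Wi-Fi fingerprints) and filter them over time, which is also how continuous polling yields heading and velocity rather than a single fix.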
A further refinement, not yet in widespread use, will be head-mounted displays that superimpose information over the user’s normal view. The equipment required for this is presently bulky, requiring the user to wear a helmet or other headgear to support a tiny display that sits just outboard of one or both eyes. These will get small enough to be embedded in the frames of wraparound eyeglasses, which will also carry cameras and microphones. The attached computers will know which way the user is looking, and what he is seeing.
Consider a Scenario
Officer Mary Future is on patrol when she receives a call to respond to a burglar alarm at 100 Main St. The call comes over the traditional voice radio, but at the same time is sent digitally to the communications computer she wears on her equipment harness. She hears the call through her earphone, and at the same time sees the address displayed at the top of her peripheral vision, overlaid on the view outside her patrol car windshield.
She knows where the address is, but the system guides her to it, anyway. Aided by data from traffic cameras, weather reports, and feedback from other beat officers equipped with the same gear, an augmented reality overlay suggests the most efficient route by superimposing arrows on her view of the street and intersections in front of her. As she approaches a crossroad of two major streets, the side street is overlaid by flashing red to alert her that another officer is approaching from that direction at high speed. That officer sees a yellow alert in Mary’s direction, telling him he should proceed through the intersection first and let Mary fall in behind.
Cameras mounted on the light bar of Mary’s patrol car scan and record the license plate numbers of every car Mary passes, comparing them against a hot list written to Mary’s computer memory before she left the station. That information is ignored for now, so Mary can focus on the incident at hand.
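At its core, the hot-list check described above is a set lookup against data copied to the in-car computer before the shift. A minimal sketch, assuming plate reads are normalized to tolerate common camera misreads (the specific character folds and plate numbers here are illustrative, not taken from any real system):

```python
# Characters that plate-reading cameras commonly confuse; folding each
# pair to one form makes the lookup tolerant of those misreads.
OCR_FOLD = str.maketrans({"O": "0", "I": "1", "Q": "0"})

def normalize(plate: str) -> str:
    """Canonical form: uppercase, no spaces or dashes, look-alikes folded."""
    return plate.upper().replace(" ", "").replace("-", "").translate(OCR_FOLD)

def build_hot_list(plates):
    """Normalize the hot list once at download time so lookups are O(1)."""
    return {normalize(p) for p in plates}

def check_plate(read: str, hot_list: set) -> bool:
    """True if a camera read matches any plate on the hot list."""
    return normalize(read) in hot_list
```

Because the hot list is small relative to the stream of reads, pre-normalizing it once and testing each read against the set is what lets the system scan every passing car without slowing the officer down.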
As the two officers approach the address, the building is highlighted for them with the entry doors indicated, even though only one is in their line of sight. Mary takes the front door; her cover officer goes to the back. The heads-up display of their glasses tells them which door generated the alarm, and the name of the business owner who is supposed to be on the premises.
Mary gets out of her car and sees a man exit the front door, smile, and walk towards her. As he gets close enough to recognize, the facial recognition software incorporated into the tactical patrol package compares the face to the driver’s license photo on file for the business owner. The man tells her he is the owner and that he tripped the alarm in error while opening the store. The display overlay says otherwise: this face doesn’t match the one in the database. Mary draws her sidearm and tells the man to stop where he is and show his hands. While he protests indignantly, the facial recognition program tells her there is a 98 percent match of this face to that of Bobby Burglar, who has four active arrest warrants on file.
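A common way a "98 percent match" like this is produced is by comparing numeric face embeddings: the camera image and each enrolled photo are reduced to vectors, and similar faces yield similar vectors. The sketch below is purely illustrative; the names, three-element vectors, and threshold are hypothetical, and real systems use high-dimensional embeddings from a trained model:

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two face-embedding vectors (1.0 = identical direction)."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def best_match(probe, gallery, threshold=0.9):
    """Compare a probe face against every enrolled face.

    gallery maps names to embeddings. Returns (name, score) for the
    closest match, or None if nothing clears the threshold.
    """
    scored = [(name, cosine_similarity(probe, emb)) for name, emb in gallery.items()]
    name, score = max(scored, key=lambda pair: pair[1])
    return (name, score) if score >= threshold else None
```

The threshold is the operational knob: set it too low and the system flags innocent look-alikes; too high and a disguised Bobby Burglar walks past.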
Bobby’s mug shot floats alongside her view of his face, and she decides to believe the computer. With a voice command, she clears the overlay display, prones out the burglar, and tells her partner she has one at gunpoint. That last advisory was redundant. As soon as she took the gun out of her holster, her partner and the communications center were alerted and a streaming video image became available to officers needing to see it. During the entire evolution, everything Mary saw, real-world and augmented, was saved to the memory card in the computer she wore.
The Future is Now
Everything in this futuristic scenario is possible with technology already available. What makes it possible today is the ability to store huge volumes of data in “the cloud,” available for access via wireless broadband. It’s not feasible to make and keep current copies of this volume of data on individual computers in the field, but it’s a comparatively simple task to maintain a single database for each function, and allow client computers in the field to access it as needed.
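The single-database, many-clients pattern described above can be sketched in a few lines. `RecordsService` is a stand-in stub, not a real agency API, and the warrant data is invented; the point is that the field device holds no copy of the database and simply queries the authoritative store over the broadband link:

```python
class RecordsService:
    """Stands in for the single, centrally maintained database per function."""

    def __init__(self):
        # Hypothetical data: name -> count of active arrest warrants.
        self._warrants = {"bobby_burglar": 4}

    def lookup_warrants(self, name):
        return self._warrants.get(name, 0)

class FieldClient:
    """In-car or wearable computer: queries on demand rather than syncing,
    so the central copy is always the current one."""

    def __init__(self, service):
        self._service = service

    def warrants_for(self, name):
        # In the field this call would go over the 4G link.
        return self._service.lookup_warrants(name)
```

Because every client reads the same central store, an update made at headquarters is visible to the next query from any patrol car, with no distribution step.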
In fact, the only limiting factor in making this scenario possible is that no carrier has yet fully deployed its 4G broadband network. By the end of 2015, most cities in the United States will have the data networks in place to make this futuristic scenario possible.