Watch & learn

By Frank D. Roylance, Baltimore Sun

By analyzing a person’s body language, gait and other movements, behavior recognition software is helping catch criminals and may prove useful in the war on terror, as well as in medicine

It’s 11:30 at night on Lovegrove Street, an alley near the Homewood campus of the Johns Hopkins University.

A lone man is looking up and down the street, apparently waiting for someone. A pickup truck drives up. The man says something to the driver, gets in and they drive off.

Minutes later, a block away, a woman is robbed at gunpoint by two men who speed off in a pickup. No one at the scene can describe the truck to campus security officers or to Baltimore police.

This case last June might have gone cold. But it did not. The Lovegrove caper was solved by technology that plays the role of an old-fashioned tipster.

Behavior recognition software enables computers to alert people when something, or someone, appears suspicious. The Hopkins system employs one application of the technology to watch for aberrant movements captured by dozens of cameras - far more than any person could track.

Computer analysis of the way people and objects appear and move on video is also being developed at the University of Maryland’s A. James Clark School of Engineering and elsewhere for use in surveillance, security, anti-terrorism and even medical applications.

Hopkins’ security system caught the robbery suspects on a video camera on Lovegrove Street. The software registered the man’s behavior and the late hour, and alerted the security officer on duty in the command center.

The view down Lovegrove was singled out by the computer amid the incoming imagery from 89 campus security cameras. It popped automatically onto the officer’s screen, with the man’s image highlighted in a yellow box. She quickly zoomed in and recorded images of the suspect, the truck and its license plate.

After the victim reported the robbery, the tag number led police within hours to a borrowed truck and the suspect, who had a police record. The victim picked him out of a photo lineup. He was arrested a few days later, linked to a second crime and charged with both.

“If we didn’t have this video system, or she didn’t focus on him, he would have gotten away,” said Edmund Skrodzki, executive director for security for Hopkins’ Homewood campus. “One person can’t monitor 89 screens. You need help with it, and behavioral recognition provides that assistance.”

The technology has attracted interest and research dollars from the Pentagon and the Department of Homeland Security. Some of that money has gone to the University of Maryland, College Park, where Rama Chellappa is a professor of electrical and computer engineering and director of UM’s Center for Automation Research. He is exploring more sophisticated ways to flag people and suspicious activity.

One way is by their gait. The way people move while walking can reveal a lot, Chellappa said.

Is this person male or female? Are his arms both swinging, or is he carrying something? How tall is he? How heavy? How heavy does his burden appear to be? Is it in view or hidden? Has he put it down and walked away?

The computer may even calculate whether this is a gait “signature” it has seen before. Do we know this person?

Some Americans might also want to ask whether this sort of public surveillance and scrutiny is a good thing, or a threat to privacy. At Hopkins, Skrodzki said he’s had no complaints. “If anything, we’ve actually gotten a lot of feedback from faculty, students and staff saying we’ve increased their comfort level,” he said.

Hopkins’ 89-camera system watches the 140-acre campus and nearby areas - up from 32 cameras when Skrodzki was hired in mid-2005. It displays 19 scenes at any one time on a large screen in the command center.

In addition to the Lovegrove loiterer, the system has alerted security to a juvenile as he attempted to steal a motorbike, leading to an arrest. Another youth was spotted, tracked by cameras and arrested after spray-painting graffiti on campus buildings.

A nighttime bicycle thief was confronted after another alert brought nearby security officers to the scene. The crook dropped his bolt cutters and ran off, but the bike was recovered, Skrodzki said.

Other potential criminals were warned off after automated alerts caught them casing a sorority house or trying doorknobs and car handles near campus.

“It’s been fantastic for us,” Skrodzki said. “We are responding more to alerts, and that’s a good thing because our whole thing is prevention.”

Campus bike thefts dropped from 25 during the 2005 fall semester to three last fall. Overall crime was down 20 percent in 2006. And Skrodzki credits the behavioral recognition software for providing a critical assist.

The Pentagon and Homeland Security have bigger fish to fry, of course.

At College Park, Chellappa said that the Pentagon’s interest in the remote identification of security threats began in 1996 after terrorists in Saudi Arabia drove a fuel truck into the Khobar Towers housing complex and blew it up, killing 19 American servicemen and one Saudi. More than 300 were injured.

Clearly, it’s too dangerous to let potential terrorists get close enough to be identified by fingerprints and iris scans. So the Defense Advanced Research Projects Agency (DARPA) began funding research to develop new ways to identify suspicious behavior at a safe distance.

In 2000, Chellappa began working on gait recognition with Mark Nixon of the University of Southampton in England. They developed computer programs, using algorithms that convert digital video data to mathematical patterns. The software analyzes whether patterns generated by an individual’s movements match those designated as suspicious.

For example, the computer can note when the subject is running, loitering or moving oddly; whether he is carrying something in one arm, both arms or concealed on his body, or whether someone’s gait changes in a way that suggests he has left a heavy object behind.
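The software described here is proprietary, but the basic idea - reduce raw movement measurements to a small pattern, then match that pattern against rules flagged as suspicious - can be sketched in a few lines. The following is a toy illustration, not the researchers' actual algorithm; every feature and threshold in it is invented for the example:

```python
# Toy sketch of behavior-pattern matching: per-frame measurements are
# reduced to a feature vector, then compared against hand-set rules.
# Feature names and thresholds are invented for illustration.

def gait_features(arm_swing_left, arm_swing_right, speeds):
    """Reduce raw per-frame measurements to a small feature vector."""
    avg_speed = sum(speeds) / len(speeds)
    # A large left/right swing difference can suggest a one-sided carry.
    swing_asymmetry = abs(sum(arm_swing_left) - sum(arm_swing_right)) / len(speeds)
    return {"avg_speed": avg_speed, "swing_asymmetry": swing_asymmetry}

def classify(features, running_speed=2.5, loiter_speed=0.2, asymmetry=0.5):
    """Match the feature vector against 'suspicious' pattern rules."""
    alerts = []
    if features["avg_speed"] > running_speed:
        alerts.append("running")
    if features["avg_speed"] < loiter_speed:
        alerts.append("loitering")
    if features["swing_asymmetry"] > asymmetry:
        alerts.append("one-sided carry")
    return alerts

# A slow walker swinging only one arm trips two of the toy rules:
features = gait_features([0.9] * 10, [0.1] * 10, [0.1] * 10)
print(classify(features))  # ['loitering', 'one-sided carry']
```

Real systems work from noisy video rather than clean numbers, which is why false alarms - discussed below by Chellappa - are the central engineering problem.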

Chellappa has provided algorithms for gait recognition and estimating a person’s height to Honeywell International, which is using them under a Homeland Security contract to devise video technology for automatically tracking suspicious people through airports while alerting security personnel. Honeywell’s pilot system is running at Minneapolis-St. Paul International Airport.

“If you see someone you think may be a bad guy, and you want to follow that person, it’s pretty straightforward to track him in one vision zone,” said Dan Sheflin, chief technical officer for Honeywell’s Automation and Control Solutions division.

“But picking that person back up and being able to track them [on non-overlapping cameras] is a very important problem ... one nobody’s figured out yet,” he said. “I think we’re only a year or two away from having it figured out.”

A person’s weight, and the weight of any objects he’s carrying, could also be estimated from his height and movement patterns, Chellappa said.

“If I know how tall you are, and have a rough idea of how much you weigh, I can build a model and put a backpack on your back and increase the weight 10, 20, 30 pounds,” he said. He can then generate a “library” of subtle gait changes that computers could reference to provide weight estimates for suspect bundles.
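The "library" idea - simulate gait signatures under different loads, then estimate an unknown load by finding the closest stored signature - amounts to a nearest-neighbor lookup. Here is a minimal sketch under invented assumptions; the signature model (stride shortens and sway grows with load) is a stand-in, not Chellappa's actual model:

```python
# Toy nearest-neighbor sketch of a gait-signature "library" for weight
# estimation. The signature model below is invented for illustration.

def simulated_signature(added_weight_lb):
    """Invented model: heavier loads shorten stride and increase body sway."""
    stride = 0.75 - 0.003 * added_weight_lb   # stride length, meters
    sway = 0.02 + 0.001 * added_weight_lb     # lateral sway, meters
    return (stride, sway)

# Build the reference library at 0, 10, 20 and 30 pounds of added weight.
LIBRARY = {w: simulated_signature(w) for w in (0, 10, 20, 30)}

def estimate_weight(observed):
    """Return the library weight whose signature best matches the observation."""
    def dist(weight):
        sig = LIBRARY[weight]
        return sum((a - b) ** 2 for a, b in zip(sig, observed))
    return min(LIBRARY, key=dist)

# A gait produced by a 19-lb load matches the 20-lb library entry:
print(estimate_weight(simulated_signature(19)))  # 20
```

In practice the library would hold many signatures per subject and weight, and the matching would run on features extracted from video rather than on clean simulated numbers.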

Gait recognition technology may also prove useful in healing. Chellappa is working with Thomas P. Andriacchi of Stanford University and John Jeka in UM’s Department of Kinesiology to apply gait recognition technology to motion analysis in patients recovering from surgery, or coping with disabilities such as Parkinson’s disease.

That could be used to develop treatment regimens, adjust a patient’s prosthesis or spot developing problems.

But the technology’s primary applications, for now, seem to be security-related. With Virginia-based SET Corp., DARPA, Homeland Security and the Air Force, Chellappa is working on linking video tracking and gait analysis technology to a low-power radar system that - at a safe distance - could “frisk” a person singled out by his movements as a potential suicide bomber.

There are difficulties, of course. People walking in a crowd are difficult to isolate, and when they’re not walking crosswise to the camera there is less gait information to process.

“This [technology] will work when you have a portal kind of a situation and a person walks by one by one, for example when we come out of the immigration and customs area at an airport,” he said.

Even so, the variety of “normal” behaviors is so wide that it’s difficult to draw the line between the normal and the abnormal, and innocent people may be stopped for questioning. “If you create too many false alarms, nobody’s going to buy [the technology],” he said.

In the end, Chellappa said, even after an automated alert, “humans have to make the judgments.”