Big Brother Really Is Watching

Homeland Security is bankrolling futuristic profiling technology to nab terrorists before they strike.

January 14, 2008 12:00 PM ET

Computerworld - The year is 2012.

As soon as you walk into the airport, the machines are watching. Are you a tourist -- or a terrorist posing as one?

As you answer a few questions at the security checkpoint, the systems begin sizing you up. An array of sensors -- video, audio, laser, infrared -- feeds a stream of real-time data about you to a computer that uses specially developed algorithms to spot suspicious people.

The system interprets your gestures and facial expressions, analyzes your voice and virtually probes your body to determine your temperature, heart rate, respiration rate and other physiological characteristics -- all in an effort to determine whether you are trying to deceive.

Fail the test, and you'll be pulled aside for a more aggressive interrogation and searches.

That scenario may sound like science fiction, but the U.S. Department of Homeland Security (DHS) is deadly serious about making it a reality.

Interest in the use of what some researchers call behavioral profiling (the DHS prefers the term "assessing culturally neutral behaviors") for deception detection intensified last July, when the department's human factors division asked researchers to develop technologies to support Project Hostile Intent, an initiative to build systems that automatically identify and analyze behavioral and physiological cues associated with deception.

That project is part of a broader initiative called the Future Attribute Screening Technologies Mobile Module, which seeks to create self-contained, automated screening systems that are portable and relatively easy to implement.

The DHS has aggressive plans for the technology. The schedule calls for an initial demonstration for the Transportation Security Administration (TSA) early this year, followed by test deployments in 2010. By 2012, if all goes well, the agency hopes to begin deploying automated test systems at airports, border checkpoints and other points of entry.

If successful, the technology could also be used in private-sector areas such as building-access control and job-candidate screening. Critics, however, say that the system will take much longer to develop than the department is predicting -- and that it might never work at all.

In the Details

"It's a good idea fraught with difficulties," says Bruce Schneier, chief technology officer at security consultancy BT Counterpane in Santa Clara, Calif.

Schneier says that focusing on suspicious people is a better idea than trying to detect suspicious objects. The metal-detecting magnetometers that airport screeners have relied on for more than 30 years are easily defeated, he says. But he thinks the technology needed for Project Hostile Intent to succeed is still at least 15 years out. "We can't even do facial recognition," he says. "Don't hold your breath."

But Sharla Rausch, director of the DHS's human factors division, says the agency is already seeing positive results. In a controlled lab setting, she says, accuracy rates are in the range of 78 to 81%. The tests are still producing too many false positives, however. "In an operational setting, we need to be at a higher level than that," Rausch says, and she's confident that results will improve. At this point, though, it's still unclear how well the systems will work in real-world settings.

Measuring Hostile Intent

Current research focuses on three key areas. The first is recognition of gestures and so-called "microfacial expressions" -- a poker player might call them "tells" -- that flash across a person's face in about one-third of a second. Some researchers say microexpressions can betray a person when he is trying to deceive.

The second area is analysis of variations in speech, such as pitch and loudness, for indicators of untruthfulness.

The third is measurement of physiological characteristics such as blood pressure, pulse, skin moisture and respiration that have been associated with polygraphs, or lie detectors.
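In concept, the three channels described above would each produce some score that the system combines into an overall assessment. The following Python sketch is purely illustrative and is not based on any published DHS design; the function names, weights and threshold are all invented here, and real systems would be far more complex:

```python
# Purely illustrative sketch -- not the DHS system.
# Assumes each analyzer emits a normalized cue score in [0, 1]
# (0 = no deception indicators, 1 = strong indicators).

def deception_score(face_cues, voice_cues, physio_cues,
                    weights=(0.4, 0.3, 0.3)):
    """Fuse the three hypothetical channel scores into one
    weighted score in [0, 1]."""
    channels = (face_cues, voice_cues, physio_cues)
    return sum(w * c for w, c in zip(weights, channels))

def flag_for_screening(score, threshold=0.7):
    """A traveler scoring above the (invented) threshold would be
    referred for follow-up questioning and searches."""
    return score >= threshold

# Example: elevated facial and physiological cues, calm voice.
score = deception_score(0.6, 0.2, 0.8)
print(round(score, 2))            # 0.54
print(flag_for_screening(score))  # False
```

Even this toy version shows why Rausch's false-positive concern matters: wherever the threshold is set, some truthful travelers with elevated readings will be flagged, and some deceptive ones will slip under it.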
