Faception can allegedly tell if you're a terrorist just by analyzing your face

An unnamed homeland security agency has inked a contract with a pre-crime-style tech startup that claims it can spot whether you are a terrorist, pedophile, genius or another personality type, judging a book by its cover with a claimed 80% accuracy, just by analyzing your face.

An unnamed homeland security agency has signed a contract with a company that claims it can “reveal” your personality “with a high level of accuracy” just by analyzing your face, whether that facial image is captured via photo, live-streamed video, or stored in a database. The technology then sorts people into categories, some of them labeled as potentially dangerous, such as terrorist or pedophile. That is disturbing, because some experts say the science behind it is antiquated, was discredited long ago, and produces inaccurate results.

Israeli start-up Faception, a facial personality profiling company, told The Washington Post that “a homeland security agency” has signed a contract to use Faception to help spot terrorists. The “computer vision and machine learning technology” can even be integrated into other facial recognition tech “to provide a full spectrum solution that covers known and anonymous individuals.”


Faception CEO Shai Gilboa added, “Our personality is determined by our DNA and reflected in our face. It’s a kind of signal.” On the company’s site, the “science” behind the technology, which supposedly can predict a person’s behavior and personality, is described as:

  • According to Social and Life Science research personalities are affected by genes.
  • Our face is a reflection of our DNA.

People may judge other people by their faces, but the “science” of judging a book by its cover via face reading, or physiognomy, was basically “discredited and rejected” by the late 19th century. It’s one thing for a person to make a snap judgment based on appearance and another thing entirely to use Faception to “enrich your profile database with a variety of personality scores” and “turn unknown individuals into known ones.”

As Pedro Domingos, a professor of computer science at the University of Washington, pointed out to the Post, “Can I predict that you’re an ax murderer by looking at your face and therefore should I arrest you? You can see how this would be controversial.”

Princeton psychology professor Alexander Todorov told the Post, “The evidence that there is accuracy in these judgments is extremely weak. Just when we thought that physiognomy ended 100 years ago.”

“Faception has built 15 different classifiers,” the Post reported, and the company claims it can evaluate certain traits with “80% accuracy.” Put another way, as many as one in five people could be incorrectly classified as a terrorist or pedophile. Gilboa said he “will never make his classifiers that predict negative traits available to the general public.”
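Faception has not published what its “80% accuracy” means, but the base-rate problem with screening for rare traits can be sketched with assumed numbers. If a trait like “terrorist” is genuinely rare, even a classifier that is right 80% of the time in both directions will flag vastly more innocent people than real ones (all figures below are illustrative, not Faception’s):

```python
# Illustrative base-rate arithmetic (assumed numbers, not Faception's):
# a classifier that is "80% accurate" in both directions still mislabels
# enormous numbers of people when the trait it screens for is rare.

def screen(population, prevalence, sensitivity, specificity):
    """Return (true_positives, false_positives) for a screening test."""
    actual = population * prevalence          # people who really have the trait
    innocent = population - actual            # everyone else
    true_pos = actual * sensitivity           # correctly flagged
    false_pos = innocent * (1 - specificity)  # wrongly flagged
    return true_pos, false_pos

# Suppose 1 in 100,000 people in a crowd of 1,000,000 is a genuine threat.
tp, fp = screen(1_000_000, 1 / 100_000, 0.80, 0.80)
print(f"true positives:  {tp:.0f}")    # 8
print(f"false positives: {fp:.0f}")    # 199998
```

Under those assumptions, roughly 200,000 innocent people are flagged for every eight real threats, which is why accuracy figures quoted without a base rate say very little about a screening tool’s usefulness.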

Eight classifiers are listed on the Faception site: High IQ, academic researcher, professional poker player, bingo player, brand promoter, white-collar offender, terrorist and pedophile. “The classifiers represent a certain persona, with a unique personality type, a collection of personality traits or behaviors.” Algorithms are used to sort people according to how they fit into those classifiers.

For example, the company classifies a “bingo player” as being “endowed with a high mental ceiling, high concentration, adventurousness, and strong analytical abilities. Tends to be creative, with a high originality and imagination, high conservation and sharp senses.”

“Thrill seeking” is mentioned in the “terrorist” classifier. Thrills come in all shapes and sizes, right? Pity the adrenaline-junkie soul incorrectly identified as a terrorist.

The company claims “success” stories such as correctly identifying four poker players out of 50 competing in a tournament. In the end, two of the predicted four players were finalists. Faception claims that its technology classified nine of 11 Paris terrorists “with no prior knowledge,” and only three of those terrorists had a previous record. That is allegedly why it is “working with the leading Homeland Security Agency,” according to its marketing video.


While Faception is not quite the same, it reminded me of Homeland Security’s pre-crime screening program dubbed FAST for Future Attribute Screening Technology (pdf); FAST has been likened to Minority Report as it was designed “to ‘sense’ and spot people who intend to commit a terrorist act.”

Like FAST, Faception believes it is “possible to know whether an individual is a potential terrorist, an aggressive person, or a criminal.”

Unlike Faception, FAST analyzes much more than the face. It reportedly analyzes facial expressions and uses trackers to measure pupils, position and gaze of eyes, but it also measures heart and respiration rates, analyzes body movement, body heat changes and pitch changes in voices.

EPIC (Electronic Privacy Information Center) has been trying to get more information from DHS about FAST since 2011. That same year at DefCon, researchers suggested FAST smelled like security snake oil and explained why it wouldn’t work (pdf). Let’s hope the unnamed homeland security agency which inked a $750,000 contract with Faception was not DHS.

China too has tinkered with “pre-crime” to identify terrorists; China being China, one has to wonder if dissident is synonymous with terrorist. Its “Citizen Score” is already an Orwellian nightmare.

Faception may not be meant for the general public, but “analyzing anonymous individuals who may impose a threat to public safety” could be folded into law enforcement and homeland security work, AI and personal robots, as well as “public safety” at buildings, shopping malls, stadiums and corporations, and used in retail, insurance, recruiting, finance, and even matchmaking.

Faception lists Sears and Manpower as a few of its clients, claiming it can also “predict online behavior” to find the “best paying users.”

Copyright © 2016 IDG Communications, Inc.
