
Elgan: When the iPhone feels your pain

Smartphones are smart, but tomorrow's gadgets will have emotional intelligence

January 17, 2011 07:45 AM ET

Computerworld - We love our gadgets. But they treat us with an indifference that sometimes feels like contempt. They're like cats.

But soon, they'll act more like dogs -- perceptive of how we feel, and reacting to our moods by joining in on our elation or treading lightly when we're angry.

Such capabilities are all but inevitable, sooner or later, because the trajectory of interface design always moves toward making machines more "human-compatible" -- that is, toward having them interact with us the way another human being would. And that requires some level of empathy.

In a CES 2011 keynote address last week, Samsung President B.K. Yoon said, "Digital humanism will characterize the new decade that has begun." He said that "adding emotional value to digital technology" is central to Samsung's mission. And they're not the only ones.

The MIT Media Lab has a research group, Affective Computing, that "aims to bridge the gap between computational systems and human emotions."

One of the group's more interesting projects resulted in a gadget called the Emotional Social Intelligence Prosthetic, or ESP -- there's no "I" in social intelligence, apparently. The purpose of the ESP is to inform the user about the emotional state of the person he or she is talking to.

The device is a tiny, handheld computer with special sensors and a camera. It looks for signs of boredom and other emotions and alerts the user, who can then change the subject before getting on the listener's nerves.

Of course, the ESP project will never result in a product. But it's an example of research devoted to the engineering problem of detecting human emotions with a handheld device.

Cambridge University researchers have developed a technology they call EmotionSense that uses both speech-recognition software and special sensors in the phone to figure out how the user is feeling.

Their goal is to develop a nonintrusive way to accurately gauge the emotional state of a person holding a smartphone.

Their initial aim is to use the technology for social science research. The idea is to find correlations between how a user feels and various locations, people or other factors that are linked to a particular state of mind. Not surprisingly, people tended to be "happy" at home and "sad" at work.

In an initial trial, the researchers found that the system's readings matched the emotional states test subjects reported in a follow-up survey roughly 70% of the time.

Although the initial aims are scientific, it's clear that the first step in the development of an emotionally perceptive smartphone is accurate, nonintrusive sensors, which the Cambridge researchers are demonstrating.
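
To make the idea concrete, here is a minimal sketch, in Python, of how software might turn a few measurements from speech into a guess about mood. It is not EmotionSense's actual method; the feature names, reference profiles and numbers are all invented for illustration. The gist is simply to compare what the microphone hears against rough per-emotion profiles and report the closest match.

```python
# Hypothetical sketch of phone-based emotion sensing: compare simple audio
# features against per-emotion "profiles" and pick the closest match.
# The feature names and profile values below are invented for illustration;
# real systems use statistical models trained on labeled speech.

from math import sqrt

# Invented reference profiles: (avg pitch in Hz, speech energy 0-1, words per second)
EMOTION_PROFILES = {
    "happy":   (220.0, 0.70, 3.5),
    "sad":     (160.0, 0.30, 1.8),
    "angry":   (240.0, 0.90, 4.0),
    "neutral": (190.0, 0.50, 2.5),
}

def classify_emotion(pitch_hz, energy, words_per_sec):
    """Return the emotion whose profile is nearest to the observed features."""
    def distance(profile):
        p, e, r = profile
        # Scale each feature so no single measurement dominates the distance.
        return sqrt(((pitch_hz - p) / 100) ** 2
                    + (energy - e) ** 2
                    + ((words_per_sec - r) / 2) ** 2)
    return min(EMOTION_PROFILES, key=lambda label: distance(EMOTION_PROFILES[label]))

if __name__ == "__main__":
    # Features that might be extracted from a short clip of phone-call audio.
    print(classify_emotion(pitch_hz=235.0, energy=0.85, words_per_sec=3.8))  # -> "angry"
```

A production system would lean on models trained on labeled speech, plus signals like motion and location, but the sketch captures the general shape of the problem: measure, compare, label.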

Another, related project at Cambridge is looking at building emotion detection into GPS car navigation devices. The vision is a dashboard GPS unit that uses special sensors to monitor facial expressions, voice intonation and hand movements, gauging the driver's emotional state. If the driver were stressed out, for example, the device could hold incoming calls, delay giving additional directions or turn off the sound system. (Personally, I think a dashboard that did all this would actually make me angry.)
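
As a thought experiment, here is what that back-off behavior might look like as a simple policy, again in Python. The DashboardState class, its fields and the 0.7 stress threshold are all hypothetical; the point is only that the device's reaction is a straightforward set of rules once the hard part -- sensing the emotion -- is done.

```python
# Hypothetical sketch of a navigation device reacting to a stressed driver,
# as described above. DashboardState and its fields are invented; a real
# device would hook into its telephony, routing and audio subsystems.

from dataclasses import dataclass, field
from typing import List

@dataclass
class DashboardState:
    hold_incoming_calls: bool = False
    defer_turn_instructions: bool = False
    sound_system_on: bool = True
    log: List[str] = field(default_factory=list)

def react_to_driver_emotion(emotion: str, stress_level: float, dash: DashboardState) -> None:
    """Back off when the driver seems stressed; otherwise operate normally."""
    if emotion == "stressed" or stress_level > 0.7:  # 0.7 is an arbitrary threshold
        dash.hold_incoming_calls = True
        dash.defer_turn_instructions = True
        dash.sound_system_on = False
        dash.log.append("Driver appears stressed: holding calls, deferring directions, muting audio.")
    else:
        dash.hold_incoming_calls = False
        dash.defer_turn_instructions = False
        dash.sound_system_on = True
        dash.log.append("Driver appears calm: normal operation.")

if __name__ == "__main__":
    dash = DashboardState()
    react_to_driver_emotion("stressed", stress_level=0.9, dash=dash)
    print(dash.log[-1])
```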


