How do you feel? Your phone may soon tell you

Emotion-detection capability is coming soon to a wide range of mobile apps, thanks to several systems now available.

A variety of projects unveiled in the past year aim to give mobile apps the ability to instantly detect a person's emotional state.

A startup called Affectiva, which emerged out of MIT's Media Lab, last month launched a software developer kit (SDK) for its emotion-tracking technology. The company claims that it's possible to assess the effect that advertising and branding have on a person if you analyze that person's facial expressions through the camera of a mobile device.

Affectiva's Affdex SDK could enable emotion-tracking to be built into mobile apps. For now, the SDK supports app development on Apple's iOS.

With Affdex, processing takes place on the device, not on a remote server, as is the case with some comparable technologies, according to Affectiva. That opens the door to systems that sense emotions in real time and feed the results into another app, changing that app's behavior on the fly. For example, constant emotional feedback could change the trajectory of a game or an interactive story depending on how the user feels about various scenes.
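To make that idea concrete, here is a minimal TypeScript sketch of emotion-driven branching. The EmotionFrame type and the onEmotionFrame hook are hypothetical stand-ins for whatever an on-device emotion SDK actually delivers -- this is not the Affdex API -- and the thresholds are purely illustrative:

```typescript
// Hypothetical shape of one frame of emotion scores from an on-device SDK.
// These field names are NOT the Affdex API; they are assumptions for illustration.
interface EmotionFrame {
  joy: number;      // 0..1 confidence
  surprise: number; // 0..1 confidence
  fear: number;     // 0..1 confidence
}

type SceneId = "calm_path" | "tense_path" | "current_path";

// Pick the next scene of an interactive story from how the viewer just reacted.
function chooseNextScene(frame: EmotionFrame): SceneId {
  if (frame.fear > 0.6) {
    return "calm_path";                       // viewer seems anxious: ease off
  }
  if (frame.joy < 0.2 && frame.surprise < 0.2) {
    return "tense_path";                      // viewer seems bored: raise the stakes
  }
  return "current_path";                      // otherwise, stay the course
}

// A stand-in for whatever callback a real SDK fires as each camera frame is scored.
declare function onEmotionFrame(handler: (frame: EmotionFrame) => void): void;

onEmotionFrame((frame) => {
  console.log(`Steering story toward: ${chooseNextScene(frame)}`);
});
```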

Emotient co-founders Javier Movellan and Marian Bartlett created a Google Glass app that can read emotions and let the wearer know what others are feeling. (Photo: PRNewsFoto/Emotient)

Another startup, Emotient (pronounced like emotion with a T sound at the end), just received an additional $6 million in funding to further develop its facial-expression-based emotion-sensing technology, especially an API for third-party software development. The company was founded by six people with Ph.D.s from the University of California at San Diego who are experts in machine learning, computer vision, cognitive science and facial behavioral analysis.

The Emotient system watches and analyzes facial expressions to detect seven emotions (joy, surprise, sadness, anger, fear, disgust and contempt) as well as a person's general mood: whether they're broadly happy, unhappy or somewhere in between.
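As an illustration of what that kind of output might look like to a developer, here's a toy TypeScript sketch that rolls seven per-emotion scores up into an overall mood. This is not Emotient's actual method; the score ranges, field names and thresholds are assumptions made up for the example:

```typescript
// Illustrative only: one simple way to roll per-emotion scores (assumed 0..1)
// up into an overall mood. This is NOT Emotient's actual method.
interface EmotionScores {
  joy: number;
  surprise: number;
  sadness: number;
  anger: number;
  fear: number;
  disgust: number;
  contempt: number;
}

function overallMood(s: EmotionScores): "broadly happy" | "broadly unhappy" | "in between" {
  const positive = s.joy;
  const negative = s.sadness + s.anger + s.fear + s.disgust + s.contempt;
  if (positive - negative > 0.2) return "broadly happy";
  if (negative - positive > 0.2) return "broadly unhappy";
  return "in between";
}

// Example: a face scored high on joy, low on everything else.
console.log(overallMood({ joy: 0.8, surprise: 0.3, sadness: 0.05, anger: 0.02,
                          fear: 0.01, disgust: 0.02, contempt: 0.03 })); // "broadly happy"
```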

Like several other companies in this market niche, Emotient wants to help retailers understand how customers feel about products while shopping.

The company this week announced the development of a Google Glass "glassware" app that's designed to perform on-the-spot "sentiment analysis." If you're wearing a Google Glass device, it will interpret the emotions of the people you're looking at and then tell you generally how the others are feeling. The first target customers of the app are restaurant workers, salespeople and others in retail who want to know how happy customers are about products or services.

Intel is an investor in Emotient, and reportedly plans to bring Emotient's libraries into the next version of its RealSense SDK. RealSense is similar to Microsoft's Kinect: it's a depth-sensing camera system designed for PCs and laptops that enables real-time 3D scanning of the user and the surrounding environment for gaming and content creation.

A Norwegian computer scientist named Audun Øygard created a face-reading tool called CLMtrackr. Applications created to demonstrate CLMtrackr's technology are available for free online, including a demo that tracks your emotions in real time through your webcam.

CLMtrackr's approach is to fit a model of the face to what the camera sees and interpret the expression against many previously analyzed examples. Essentially, the technology locates about 70 specific points on the face, connects them with the green lines seen in its demos, and compares the relative positions and orientations of those lines to past examples.

The main point of CLMtrackr is to identify points on a face in real time with high precision. The technology has been released as a JavaScript library for fitting a facial model to faces in images or video. The application of emotion analysis is really just an example of what can be done when you can precisely track points on a face.
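For a rough idea of how a developer might use it, here is a short TypeScript sketch based on the published clmtrackr examples. Exact function names, the face-model file and the element IDs are assumptions that may vary by version:

```typescript
// clmtrackr is loaded as a browser global (clmtrackr.js plus a trained face
// model file). These declarations stand in for those script includes.
declare const clm: { tracker: new () => any };
declare const pModel: any; // the face model shipped with the library's examples

const video = document.getElementById("webcam") as HTMLVideoElement;
const overlay = document.getElementById("overlay") as HTMLCanvasElement;

const ctracker = new clm.tracker();
ctracker.init(pModel);  // load the trained face model
ctracker.start(video);  // start fitting the model to the video feed

function drawLoop(): void {
  // getCurrentPosition() returns roughly 70 [x, y] facial landmarks once a
  // face is being tracked, or false if no face has been found yet.
  const positions = ctracker.getCurrentPosition();
  const ctx = overlay.getContext("2d");
  if (ctx) {
    ctx.clearRect(0, 0, overlay.width, overlay.height);
    if (positions) {
      ctracker.draw(overlay); // the green mesh seen in the demos
    }
  }
  requestAnimationFrame(drawLoop);
}
requestAnimationFrame(drawLoop);
```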

Øygard expects the technology to be useful in retail and sales -- to, for example, help analyze the effectiveness of TV commercials.

By the way, Øygard also created a program that superimposes the facial features of a famous celebrity on top of yours in real time. You can try it here.

Meanwhile, researchers at the University of Genoa in Italy have created a system that uses Microsoft Kinect cameras to figure out how you feel.

The system does this by reading and interpreting body language. It builds a stick-figure model of the person in software, then analyzes how the segments move and how quickly or slowly they move. The software looks for the same cues people do when reading body language: a lowered head and drooping shoulders, for example, may signal sadness.
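To show roughly what such a heuristic could look like in code, here is a toy TypeScript sketch. It is not the Genoa researchers' algorithm; the joint names, the source of the skeleton data (a Kinect-style tracker) and the thresholds are all invented for illustration:

```typescript
// Illustrative only: a toy heuristic in the spirit of the Genoa system, not the
// researchers' actual algorithm. Joint data would come from a skeleton-tracking
// camera such as Kinect; names and thresholds are invented.
interface Joint { x: number; y: number; z: number; } // meters; y points up

interface Skeleton {
  head: Joint;
  leftShoulder: Joint;
  rightShoulder: Joint;
  spine: Joint;
}

// Average movement speed of the tracked joints between two frames (meters/second).
function movementSpeed(prev: Skeleton, curr: Skeleton, dtSeconds: number): number {
  const joints: (keyof Skeleton)[] = ["head", "leftShoulder", "rightShoulder", "spine"];
  let total = 0;
  for (const j of joints) {
    const a = prev[j];
    const b = curr[j];
    total += Math.hypot(b.x - a.x, b.y - a.y, b.z - a.z);
  }
  return total / joints.length / dtSeconds;
}

// Toy rule: a head hanging near or below shoulder level plus slow movement
// reads as a "sad" posture; fast, upright movement reads as energetic.
function readPosture(prev: Skeleton, curr: Skeleton, dtSeconds: number): string {
  const shoulderY = (curr.leftShoulder.y + curr.rightShoulder.y) / 2;
  const headDropped = curr.head.y < shoulderY + 0.10;
  const slow = movementSpeed(prev, curr, dtSeconds) < 0.05;
  if (headDropped && slow) return "possibly sad or withdrawn";
  if (!headDropped && !slow) return "energetic";
  return "neutral / unclear";
}
```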

The researchers are already applying their technology to build games that teach autistic children how to read body language, and how to use body language to express emotions.

What most of these projects have in common, beyond the ability to read emotions and put that emotion data to a particular use, is that they're designed to be extended and built into other software, usually mobile apps and mobile devices. None of these major projects is holding its technology back as proprietary; instead, each is making its tools available as open systems for other companies to use.

That's really what makes all these various approaches to emotion detection so exciting: The systems can be integrated into a wide variety of mobile apps and devices and -- I don't see why not -- combined to enhance accuracy or flexibility.

The first targets appear to be in retail sales -- to figure out how customers feel. But with other app developers applying their creativity, we could see emotion sensing built into user interfaces to, say, make apps more friendly or more "tactful" in how they interact with users.

Who knows where emotion detection will show up next?

This article, "How Do You Feel? Your Phone May Soon Tell You," was originally published on Computerworld.com.

Mike Elgan writes about technology and tech culture. You can contact Mike and learn more about him at http://Google.me/+MikeElgan. You can also see more articles by Mike Elgan on Computerworld.com.
