
How do you feel? Your phone may soon tell you

Emotion-detection capability is coming soon to a wide range of mobile apps, thanks to several systems now available.

March 8, 2014 07:00 AM ET

Computerworld - A variety of projects unveiled in the past year aim to give mobile apps the ability to instantly detect a person's emotional state.

A startup called Affectiva, which emerged out of MIT's Media Lab, last month launched a software development kit (SDK) for its emotion-tracking technology. The company claims it's possible to assess the effect that advertising and branding have on a person by analyzing that person's facial expressions through a mobile device's camera.

Affectiva's Affdex SDK could enable emotion-tracking to be built into mobile apps. For now, the SDK supports app development on Apple's iOS.

With Affdex, processing takes place on the device rather than on a remote server, as is the case with some comparable technologies, according to Affectiva. That raises the possibility of systems that sense emotions in real time and feed the results into another app, changing that app's behavior on the fly. For example, continuous emotional feedback could alter the trajectory of a game or an interactive story depending on how the user feels about various scenes.
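To make that idea concrete, here is a minimal sketch of how per-frame emotion scores from an on-device classifier could steer an interactive story. The EmotionScores type, the Scene cases and the thresholds are illustrative assumptions for this example, not the Affdex SDK's actual API; Swift is used since the SDK targets iOS.

```swift
import Foundation

// Hypothetical per-frame output from an on-device emotion classifier;
// the real Affdex SDK's types and metric names may differ.
struct EmotionScores {
    let joy: Double       // each score runs from 0.0 (absent) to 1.0 (strong)
    let fear: Double
    let surprise: Double
}

// Illustrative branching logic: pick the next scene of an interactive
// story based on how the viewer reacted to the current one.
enum Scene {
    case calmInterlude, jumpScare, plotTwist
}

func nextScene(after reaction: EmotionScores) -> Scene {
    if reaction.fear > 0.7 {
        // Viewer is already tense; ease off before the next scare.
        return .calmInterlude
    } else if reaction.surprise < 0.2 && reaction.joy < 0.2 {
        // Viewer looks disengaged; shake things up.
        return .plotTwist
    } else {
        return .jumpScare
    }
}

// Example frame: mild fear, some surprise, little joy.
let frame = EmotionScores(joy: 0.1, fear: 0.4, surprise: 0.3)
print(nextScene(after: frame))  // prints "jumpScare"
```

In practice, scores would arrive many times per second from the camera pipeline, so an app would likely smooth them over a window of frames before committing to a branch.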

Emotient co-founders Javier Movellan and Marian Bartlett created a Google Glass app that can read emotions and let the wearer know what others are feeling. (Photo: PRNewsFoto/Emotient)

Another startup, Emotient (pronounced like "emotion" with a "t" sound at the end), just received an additional $6 million in funding to further develop its facial-expression-based emotion-sensing technology, especially an API for third-party software development. The company was founded by six Ph.D.s from the University of California, San Diego who are experts in machine learning, computer vision, cognitive science and facial behavior analysis.

The Emotient system watches and analyzes facial expressions to detect seven emotions -- joy, surprise, sadness, anger, fear, disgust and contempt -- as well as a person's general mood: whether they're broadly happy, unhappy or somewhere in between.
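As a rough illustration of how seven per-emotion scores might be collapsed into that overall mood, here is a short Swift sketch. The score values, the joy-versus-negatives weighting and the thresholds are invented for the example; this is one plausible aggregation, not Emotient's actual method.

```swift
import Foundation

// Hypothetical scores for the seven emotions Emotient reports, each in
// 0.0...1.0. The dictionary layout and values are invented for this example.
let scores: [String: Double] = [
    "joy": 0.6, "surprise": 0.2, "sadness": 0.1,
    "anger": 0.05, "fear": 0.05, "disgust": 0.0, "contempt": 0.1,
]

// One plausible way to collapse seven scores into overall mood:
// weigh the positive signal (joy) against the sum of negative signals.
let negatives = ["sadness", "anger", "fear", "disgust", "contempt"]
    .compactMap { scores[$0] }
    .reduce(0, +)
let valence = (scores["joy"] ?? 0) - negatives

let mood: String
if valence > 0.2 {
    mood = "broadly happy"
} else if valence < -0.2 {
    mood = "unhappy"
} else {
    mood = "somewhere in between"
}
print(mood)  // prints "broadly happy" for the sample scores above
```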

Like several other companies in this market niche, Emotient wants to help retailers understand how customers feel about products while shopping.

The company this week announced a Google Glass "glassware" app designed to perform on-the-spot "sentiment analysis." If you're wearing a Google Glass device, the app interprets the emotions of the people you're looking at and tells you, in general terms, how they're feeling. The app's first target customers are restaurant workers, salespeople and others in retail who want to know how happy customers are with products or services.

Intel is an investor in Emotient and reportedly plans to bring Emotient's libraries into the next version of its RealSense SDK. RealSense is similar to Microsoft's Kinect: it's designed to be used with PCs and built into laptops to enable real-time 3D scans of the surrounding environment and the user, for gaming and content creation.


