With iPhone X, Apple just took a step to emotion sensing

The new Animoji feature inside the iPhone X may seem trivial, but it could be the first step in developing personalized machine intelligence.


I think Apple may go in a very interesting direction with the new Animoji feature inside the iPhone X.

Feel the force

Apple CEO Tim Cook already told us the new smartphone "sets the tone for the next decade in new technology."

We know that if we want to get an iPhone X, we're going to have to get smart about the pre-ordering process, as there won’t be enough to go round. We even know the iPhone 8 will still make us happy if we can’t get an iPhone X this year.

We also know Apple will continue to evolve the technologies it is introducing with iPhone X, and we can surmise that the company probably already has a roadmap for this future evolution.

That means Face ID, the UI, processors, and even the technology-packed "notch" will see improvement. Why would Apple neglect to have some ideas for future use of Animoji?

Toward an Apple emotion sensor

Animoji may seem trivial, but I think that’s an illusion. We know the feature works by tracking more than 50 different muscle movements in your face.

We also know the feature uses the same front-facing camera technologies as Face ID. Those technologies can already recognize your face. How long until they become capable of sensing your emotions?

Ground-breaking MIT research is already being "productized" by a company called Affectiva, which is developing technology to identify your emotions as you watch content on a device. SoftBank's Pepper robot is another example of a machine that can identify some of your emotions.

Apple has an interest here. It acquired Emotient in 2016. That company was developing software to analyze your facial expressions in order to detect your emotions.

Such efforts, in a field called “affective computing,” aim to create computers that can identify how we feel and then respond appropriately.

This kind of personalized machine intelligence could work with Siri to help your voice assistant provide much more useful suggestions to help you with your life.
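As a purely illustrative sketch of that sense-then-respond loop (the coefficient names, thresholds, and suggestions below are my own inventions, not Apple's API), the affective-computing idea amounts to mapping facial-muscle activation levels to a coarse emotional state, then picking an appropriate response:

```python
# Hypothetical sketch of an affective-computing loop: infer a coarse
# emotion from facial-muscle coefficients of the kind Animoji already
# tracks (normalized 0.0-1.0), then respond appropriately.
# All names and thresholds here are illustrative assumptions.

def infer_emotion(coefficients: dict) -> str:
    """Map facial-muscle activation levels to a coarse emotion label."""
    smile = coefficients.get("mouth_smile", 0.0)
    frown = coefficients.get("mouth_frown", 0.0)
    brow_down = coefficients.get("brow_down", 0.0)
    if smile > 0.6 and brow_down < 0.3:
        return "happy"
    if frown > 0.5 or (brow_down > 0.6 and smile < 0.2):
        return "sad"
    return "neutral"

def suggest(emotion: str) -> str:
    """The 'respond appropriately' half: a Siri-style suggestion."""
    suggestions = {
        "sad": "Call a friend, visit somewhere new, or play some music?",
        "happy": "Good moment to tackle that task you postponed?",
        "neutral": "No suggestion needed.",
    }
    return suggestions[emotion]

print(suggest(infer_emotion({"mouth_frown": 0.7})))
```

A real system would of course use a trained model rather than hand-set thresholds, but the shape of the pipeline, sense, classify, respond, is the same.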

In conjunction with all the other biometric information Apple’s platforms have become capable of gathering about users, such as location, heart rate or activity levels, the addition of emotional sensing may unlock a profound boost to mental and physical healthcare.

Idealistically, I see a scenario in which your iPhone will be able to figure out when you are sad, make a reasonable guess as to when that feeling began, and take appropriate action to help you — it might suggest a friend to call, place to visit, or music to listen to.

Realistically, I can imagine such technology being used to monitor your reaction to music or advertising, as that’s part of what Emotient was working on.

Avatars anywhere

ARKit has sparked a new developer gold rush.

We’re seeing lots of interest and excitement in what these tools can do.

I’ve already seen ARKit used in classic platform games and two-player games, to add augmented intelligence to daily life for product sales, and even in immersive role-playing experiences.

One thing that’s lacking — at least in the current implementation of ARKit — is the capacity to see yourself and other people within virtual worlds. Now, we know an iPhone isn’t going to be the best vehicle for this; virtual reality headsets deliver a more intense first-person experience.

However, the capacity to recognize facial movement in conjunction with the physical movement expressed within the ARKit UI has to be considered. How long until your Animoji becomes your avatar? This could enable all kinds of shared AR experiences, from Dungeons & Dragons-style dungeon-crawling adventures to maintenance and emergency crews familiarizing themselves with situation maps in AR.
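To make the avatar idea concrete, here is a hypothetical sketch of what sharing an Animoji-style avatar across an AR session might involve: each device periodically serializes its user's facial-muscle coefficients and position so peers can render the avatar locally. The message format, field names, and helper functions are my own assumptions, not anything ARKit actually provides.

```python
import json

# Hypothetical sketch: packaging one frame of avatar state for peers in
# a shared AR session. Field names are illustrative; real face-tracking
# data would be richer and transmitted far more efficiently.

def encode_avatar_frame(user_id: str, position: tuple, muscles: dict) -> bytes:
    """Serialize facial-muscle coefficients and world position so
    peers can render this user's Animoji-style avatar."""
    frame = {
        "user": user_id,
        "position": list(position),  # x, y, z in the shared AR space
        "muscles": muscles,          # normalized 0.0-1.0 activations
    }
    return json.dumps(frame).encode("utf-8")

def decode_avatar_frame(payload: bytes) -> dict:
    """Reconstruct a frame received from a peer."""
    return json.loads(payload.decode("utf-8"))

# A peer receiving this frame would drive its local copy of the
# sender's avatar with the decoded muscle activations.
payload = encode_avatar_frame("alice", (0.0, 1.6, -2.0), {"mouth_smile": 0.8})
frame = decode_avatar_frame(payload)
print(frame["user"], frame["muscles"]["mouth_smile"])
```

The interesting design question is bandwidth: a few dozen floating-point coefficients per frame is tiny compared with streaming video, which is part of why animated avatars are attractive for shared experiences.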

Your face counts

What I’m saying is that we should take Animoji a little more seriously. While I expect it to take time, I think the technology will turn out to be the key that unlocks deep, gesture-based, emotionally responsive user experiences across both mobile devices and Macs. Assuming, of course, that such a user experience makes enough real-world sense. How would you use such technology?

Google+? If you use social media and happen to be a Google+ user, why not join AppleHolic's Kool Aid Corner community and get involved with the conversation as we pursue the spirit of the New Model Apple?

Got a story? Please drop me a line via Twitter and let me know. I'd like it if you chose to follow me there so I can let you know about new articles I publish and reports I find.
