Three big trends converge at CES

Lifelogging and lifestreaming got a boost from wearable computing and the quantified self. Welcome to the future.


Wearing cameras

Rochester Optical Manufacturing was the first company to publicly offer prescription glasses designed to work with Google Glass.

More than a dozen competitors to Google Glass are expected to ship in 2014, and many of these have cameras.

CES featured companies displaying ski goggles, scuba masks and helmets outfitted with cameras, some of them able to stream live to a website over the data connection of a smartphone.

A clear theme emerged at CES: very good cameras are becoming available for just about any object you might put on your face or head.

Other lifelogging and lifestreaming cameras I've told you about in this space before, such as the Autographer and Narrative Clip (formerly called the Memoto), are now shipping.

Over the next couple of years, it will become socially acceptable to wear such cameras and film random experiences in your life. Uploaded to social media or to a private server, those photos and videos will create a very detailed photographic memory of your life, organized chronologically.

I've noticed that with Google Glass, it's hard not to lifelog or lifestream. It's super easy to take photos and video, and when you upload them, they appear on your Google+ stream (either publicly or privately). By simply searching for your name and the auto-generated hashtag #throughglass that accompanies all Google Glass images, you see your experiences in reverse-chronological order.
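That free reverse-chronological organization is easy to reproduce for any lifelog archive. Here's a minimal Python sketch (the capture entries are invented for illustration; this is not a real Glass or Google+ API) that sorts timestamped captures newest-first, the way a hashtag-filtered stream would:

```python
from datetime import datetime

# Hypothetical lifelog entries: (timestamp, caption) pairs.
captures = [
    ("2014-01-07 09:15", "CES keynote #throughglass"),
    ("2014-01-08 14:02", "Ski-goggle camera demo #throughglass"),
    ("2014-01-06 18:30", "Arriving in Las Vegas #throughglass"),
]

# Newest first, like a Google+ stream filtered on a hashtag.
stream = sorted(
    captures,
    key=lambda c: datetime.strptime(c[0], "%Y-%m-%d %H:%M"),
    reverse=True,
)

for when, caption in stream:
    print(when, "-", caption)
```

The only real work is parsing the timestamp into something sortable; everything else is the same chronological bookkeeping Glass does automatically.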

And then there's Wolfram Alpha

Scientist and entrepreneur Stephen Wolfram, of Wolfram|Alpha fame, this week launched an incredibly ambitious project to connect every Internet-connected device -- from smart toothbrushes to smart watches to smart refrigerators -- extract their data, and apply algorithms to make sense of that data.

Wolfram's Connected Devices Project sounds impossible, but so do other projects Wolfram has taken on and succeeded with.

The bottom line is that if every dumb device in our possession turns smart (becomes connected to the Internet), Wolfram's project and other future initiatives and projects should enable us to harvest all kinds of data about our lives, our health and our experiences.
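As a toy illustration of that kind of harvesting (the device names and readings below are invented, and this is not the Wolfram Connected Devices Project's actual interface), a few lines of Python can pool readings from several "smart" devices and apply a simple algorithm -- here, just per-device averages:

```python
# Invented readings from hypothetical connected devices.
readings = {
    "toothbrush": [112, 98, 120],      # seconds brushed per session
    "watch": [7421, 10250, 8433],      # steps per day
    "refrigerator": [3.1, 3.4, 2.9],   # internal temperature, Celsius
}

# One simple "algorithm": average each device's readings.
summary = {
    device: round(sum(values) / len(values), 2)
    for device, values in readings.items()
}

print(summary)
```

The hard part of the real project isn't the arithmetic, of course -- it's getting thousands of incompatible devices to report their data in a form that can be pooled like this at all.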

Now, if you're asking yourself why anyone would want this, the answer is: Because we're human.

Why did ancient people make cave paintings and stone statues? Why did we embrace paintings? Why did we embrace photos? Why did we embrace video?

Human beings like to make things that let them remember and share experiences. Throughout history, we've always embraced better, higher-resolution, more detailed ways to do this.

Lifelogging and lifestreaming are just super high-tech cave paintings. Combined with wearable computing and the quantified self, lifelogging and lifestreaming are going to be super easy to do, and the results will be even more compelling and addictive than selfies, Snapchat and social networking.

So remember International CES 2014. Because next year, you won't have to.

This article, "Three Big Trends Converge at CES," was originally published on

Mike Elgan writes about technology and tech culture. You can contact Mike and learn more about him on Google+. You can also see more articles by Mike Elgan on

Copyright © 2014 IDG Communications, Inc.
