It wasn’t what I expected.
That’s what I walked away thinking after I tried Glass, Google’s new computerized eyeglasses, during last week’s Google I/O developer conference in San Francisco.
To be honest, I’d expected to feel like a total idiot when I put Glass on. Total dorkdom. I figured I’d slip them on and suddenly I might as well be wearing a pocket-protector and hiding from the cool kids. And for God’s sake, I wanted to make sure no one took my picture so friends in the newsroom couldn’t mock me with it later.
Well, I have been mocked… a bit… but I have to say that using Glass was really interesting and more fun than I had expected. I know. I know. I had the same skeptical thoughts that a lot of my friends and colleagues have had, but Glass surprised me.
I even wanted my picture taken with them.
While I was out at Google I/O, I met someone who said I could try Glass and see it coming right out of the box. You can’t pass that up.
Glass actually is pretty light, considering that it has a battery, a processor and the translucent screen that sits just above your right eye. My colleague JR Raphael, who tried out Glass at the same time, said the right-side bow, which is the business side of these eyeglasses, felt a bit heavy or strange on his ear, but I didn't have any problem with it.
There's an on/off button on the right bow, but Glass can also be set up to turn on automatically when the user puts it on. This feature, called On-Head Detection, is a nice touch.
To get Glass going, you can activate them by tapping the side of the bow or you can simply nod upward. (I have to say that seeing so many early adopters walking around the conference center making exaggerated nodding gestures was pretty amusing. It was like the Google I/O tic.)
To scroll through the menu of options, which you can see on the screen, swipe your finger either up and down or across the bow, depending on what you want to do.
You can also control Glass by voice, but you have to speak smoothly for it to work. Pause to collect your thoughts or to remember the order of the commands, and it won't register.
Sounds pretty simple, but it’s not.
OK, it's obviously not rocket science, but manipulating Glass without accidentally ending up in a different app or having the device turn itself off again takes a bit of practice.
I was mainly sitting down while I took Glass for a test run. If I’d been walking around the conference center or the streets of San Francisco, I’m sure I would have ended up tripping over a curb or walking into a pole while I was nodding and swiping and trying to remember which command came first.
Learning to use Glass isn’t an insurmountable hurdle, but if you’re considering Glass for yourself you should figure on some getting-to-know-you time before heading out mountain biking or trying to tape your daughter’s birthday.
Once I tried Glass, I could certainly see why it would be helpful in some situations.
Need something translated into a different language? Glass won't just show you the words; a voice will speak the translation into your ear.
Need directions? Glass will show you a map and give you directions. And if you turn your head, your perspective on the map changes. Believe me, a friend and I could have used them one night last week when a 10-minute walk to find a local pub became an hour-long adventure.
Yes, I could get translations and directions on my iPhone, but the hands-free aspect of Glass makes it handy when you're on the move. And the idea of being able to take pictures or shoot short video hands-free is also intriguing. I could see using that while biking or playing with my dog on the beach.
One issue at this point in its development, though, is battery life.
One Glass user told me his battery lasted from around 8 a.m. to 5 p.m. I think that might be a stretch unless he didn’t use Glass much at all during the day.
Several other users said their battery life was more like several hours, and one man who shot video biking across the Golden Gate Bridge said it was more like half an hour for heavy video use.
Glass is available only as a prototype right now, so I'll be interested to see what changes are in store when it finally arrives later this year or early in 2014.