Meet the new Google, the one powered by an Assistant that knows where you live and work, how you normally like to watch videos, and which sports teams you follow.
Google announced that the Assistant will run on the Pixel smartphone (out later this month), activated by long-pressing the home button. It will also run on the Google Home speaker (available next month for $129), which competes directly with Amazon Echo and the Alexa assistant.
“We imagine where the Assistant will be able to help from any device and in any context,” said Scott Huffman, VP of engineering for Google Assistant, at a Google event held today in San Francisco. Huffman said an SDK for the Assistant, due next year, will help with, in his words, every imaginable task.
That last phrase is an important one. Google’s mission has always been to make information more accessible. There has been a dramatic shift in machine learning and A.I. lately: we increasingly access that information by text and voice, not always through apps or from a laptop. We don’t type a note about our day into Evernote or the Notes app on an iPhone; we talk to a chatbot. We don’t get directions by swiping in Google Maps; we ask the Assistant how far it is to the donut shop across town.
I haven’t tested this yet, but Google claims the Assistant will understand context in a new way. Before, context meant knowing you had asked about President Obama. If you asked a follow-up in Google Now (which doesn’t really exist as a service anymore) about “his” age, the app knew you meant Obama. Now, Google showed how the Assistant can “read” your conversations. If you mention a place to eat or a meeting, the Assistant can determine that you are talking about Chipotle and the downtown office. In one demo, a video for the band The Lumineers played on YouTube instead of in the Google Music app because the Assistant understood that YouTube is the app the user prefers and uses most often. To me, that’s true contextualization.
Context is knowing more about you. As a lifelong Minnesota Vikings fan, it’s cool to imagine how this might work. I could ask the Home speaker to record the game on DISH. The Assistant would know which “game” I mean. If I ask to order a new jersey for my team, it would know I like to shop on Amazon, it would know my size, and it would know I want Sam Bradford’s jersey. The product would arrive at my front door in two days thanks to Amazon Prime.
What about work? In the future, I can picture the Assistant tracking my meetings, flights, hotel stays, and even my expenses. If I meet an editor for lunch in Las Vegas and pay with Google Wallet, it would know why I’m there: that I’m never in Las Vegas for anything but CES or some random tech conference, and never for a personal trip (not going to happen, ever).
The Assistant parses our day. It will know why we are at a meeting at 2 p.m. on a Friday. It will look at traffic conditions for us and know when we normally like to leave the office. I’m guessing it will type up documents we dictate and send emails on our behalf, similar to how the Smart Reply feature works in Google Inbox today. It’s going to be a powerful aid.
The question is when that will happen. For now, the Assistant seems like a port of the Google Now code. It helps with flights and traffic, and you can set reminders, but it has a long way to go. It will be interesting to see whether it can take on some of Alexa’s skills, match how Siri now works on the Mac and with third-party apps, and compete with the boatload of personal assistant chatbots like Mezo and Ozlo in the near future.
This article is published as part of the IDG Contributor Network.