'Siri, I have some suggestions for you'

Apple's voice-controlled personal assistant could be better; here's how.


When it comes to new technology, users are often quick to judge and slow to change their minds. Once a technology gets a bad rap, whether deserved or not, it's in trouble.

Local functions on the phone?

The solution is easy to pinpoint, but apparently harder to implement: Siri commands for local phone functions should be processed on the phone itself. As it stands, all of the work done by Siri happens on the back end, on Apple's servers. The good news: Any upgrade to Siri's servers benefits everyone at the same time. The bad news: You still have to be able to connect to those servers. Removing the need for an active network connection would make Siri more reliable for local functions.

Beyond reliability, a few no-brainer changes are needed. Siri should be able to manipulate software settings, such as toggling Bluetooth or Wi-Fi on and off and changing the screen brightness, and it should be able to launch apps and integrate with third-party ones. But I'm more interested in expanding Siri's conversational range so that anything that can be done via touch can be done via voice.

I'd like to see Apple expand Siri's Hands-Free operation, perhaps by making it more chatty. Currently, when Siri is asked a question, it either displays the answer as a graphic or, sometimes, replies out loud. When Hands-Free mode is detected, Siri gives more information verbally, such as reading back a dictated text message. The way I figure it, if I'm asking Siri out loud instead of manually manipulating the screen to get the answer, I'm probably in a situation where I can't easily look at the screen. Give me the answer out loud, not as a display on the screen.

A good example is checking traffic in Maps. Downtown Orlando isn't always the easiest stretch to drive through during peak hours, and there are days when I want to know what the local traffic is like. I'd feel better about asking Siri for the local traffic if I didn't have to divert my eyes from the road to see the answer. In fact, tying Siri's voice into the Maps app so that, when asked, Siri literally guides you to your destination would be a logical step. With the current system, where Siri displays a map graphic and a few routes, the process seems incomplete.

Summarize, summarize!

Another place where more interactivity would help is search results. Mac OS X has a built-in Summary service that can summarize paragraphs effectively enough. Why not bring that technology to Siri?

Then the conversation, such as it is, would go like this:

Me: "Siri, who is Ben Franklin?"

Siri: "Ben Franklin was one of the founding fathers of the United States. Do you want to read more about him?"

Me: "No, just give me highlights."

Since I spend a lot of time in my car, I'd like access to reading material stored in Reading List or Instapaper. Currently, you can activate text-to-speech under Settings > General > Accessibility > Speak Selection, tap Select All on an article's text, and then tap the Speak pop-up option to have the computer voice read the article. But integration with my text reader would be really welcome.

Get proactive

Beyond a more interactive Siri, I'd love to see a more proactive Siri. When I receive alerts while Siri is in Hands-Free mode, it would be nice if Siri actually volunteered more information. A proactive Siri could tell me who an email is from, read the subject line aloud and perhaps ask whether I'd like the rest of the message read out loud as well. This proactive approach could be applied to text messages, calendar invites and really any kind of notification.
