AI the Apple way
From Reuters: “Apple has ramped up its hiring of artificial intelligence experts, recruiting from PhD programs, posting dozens of job listings and greatly increasing the size of its AI staff, a review of hiring sites suggests and numerous sources confirm.”
Later in the report, Apple is described as seeking “at least” 86 more AI experts to help it develop machine learning solutions, with a particular focus on candidates holding relevant PhDs.
Apple’s software engineering chief, Craig Federighi, says Apple is aiming at “adding intelligence throughout the user experience in a way that enhances how you use your device but without compromising your privacy, things like improving the apps that you use most.”
Recruitment is complicated by Apple’s insistence that customer data be anonymized. “They want to make a phone that responds to you very quickly without knowledge of the rest of the world,” said Joseph Gonzalez, co-founder of a machine learning startup. “It’s harder to do that.”
Harder, perhaps, but realistically it’s just a technology hurdle. Apple’s ambition is to create solutions that deliver convenience without any sacrifice of privacy.
Due soon, iOS 9 includes a new feature called Proactive, which works to anticipate what you need: creating calendar events, suggesting routes or people to call, and offering other handy augmentations. Proactive doesn’t require that your personal details be shared with Apple; instead, it demands intelligence on the device.
I believe Apple is looking to create machine learning/neural systems that combine information from numerous sources. These would include intelligent anonymized feedback from node devices (as in Proactive) and insights drawn from disparate data stacks at the core.
This means Apple does not need to know who you are, what you need, or where you go; it simply requires generalized information from which to draw insights and recommendations. Customers will use tools on their devices (such as Proactive) to filter intelligence from the AI at the center through the device’s own private, personal knowledge of you. Your details are never shared.
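The split described above can be sketched in a few lines of code. This is purely illustrative (the function names and data are invented for the example, not Apple’s actual APIs): a central service that only ever sees coarse, anonymized input returns generic suggestions, and the device re-ranks them against private history that never leaves it.

```python
def central_suggestions(city: str) -> list[str]:
    # Stands in for the anonymized intelligence "at the core": it only
    # receives a coarse, non-identifying signal (a city name) and returns
    # the same generic list for every user in that city.
    generic = {
        "london": ["coffee shop", "museum", "gym", "bookstore"],
    }
    return generic.get(city.lower(), [])

def rerank_on_device(suggestions: list[str], local_history: list[str]) -> list[str]:
    # The private data (this user's visit history) stays on the device;
    # only the generic suggestion list ever crossed the network boundary.
    counts = {place: local_history.count(place) for place in suggestions}
    return sorted(suggestions, key=lambda place: -counts[place])

# Usage: the central service never learns this user's history.
private_history = ["gym", "gym", "coffee shop"]
ranked = rerank_on_device(central_suggestions("London"), private_history)
# ranked → ["gym", "coffee shop", "museum", "bookstore"]
```

The design choice is the point: personalization happens entirely in `rerank_on_device`, so the quality of the recommendation improves with your data without that data being uploaded.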
This model means future Apple machine learning solutions should be able to infer useful and actionable insights even without 100 percent data access. Your convenience does not cost you your privacy, no matter what competitors want you to believe.
Apple’s biggest challenge in developing private artificial intelligence is (ironically) the intelligence services. These bodies are demanding Apple ban encryption (key to its privacy model) while attempting to erode all notions of privacy “because #terrorism” (despite critics who say enabling such privacy erosion means the terrorists win).
Apple is at the leading edge of arguments against such privacy erosion but faces enormous pressure given its generally isolated position. Microsoft is also fighting this fight, presumably because it understands enterprise clients do not want their business data to become an open book for government spooks across the planet.
“There’s another attack on our civil liberties that we see heating up every day - it’s the battle over encryption,” Apple CEO, Tim Cook, told an electronic privacy conference earlier this year. “We think this is incredibly dangerous.”
The connection between civil liberties and individual privacy in the looming future of machine and artificial intelligence is crystal clear. Apple aims to deliver the convenience of augmented intelligence while preserving civil liberties. This attempt should be respected, rather than reviled.
Google+? If you use social media and happen to be a Google+ user, why not join AppleHolic's Kool Aid Corner community and join the conversation as we pursue the spirit of the New Model Apple?
Got a story? Drop me a line via Twitter or in comments below and let me know. I'd like it if you chose to follow me on Twitter so I can let you know when fresh items are published here first on Computerworld.