Now every Swift developer can build AI apps, thanks to Apple and IBM

IBM and Apple announce new machine intelligence solutions that let developers deploy Watson and Core ML to create mobile, enterprise-grade AI apps.

Magdalena Petrova

Apple and IBM: Learning on the job

Apple and IBM have been working together for years, so this news extends an existing relationship, one that has put Apple deep inside the new enterprise IT.

Announced at the IBM Think 2018 conference, the partnership combines the two companies' machine learning/artificial intelligence (AI) technologies into something they are calling IBM Watson Services for Core ML.

The Coca-Cola Company is already developing prototypes that use these services to create intelligent mobile solutions for field workers, including visual recognition for problem identification, cognitive diagnosis, and augmented repair.

What does this mean?

It means enterprises can create bespoke solutions for specific tasks, adding self-learning and unique AI augmentation technologies that enable those solutions to become smarter as they are used.

“Apps built with IBM Watson Services for Core ML learn from user activity, getting smarter with each interaction,” IBM explained in a press release.

How does this work?

Enterprise developers can get hold of IBM Watson in the form of a set of cloud-based services they can use when they build AI applications. Core ML delivers advanced machine learning to apps on Apple devices running iOS 11 or later.

Application developers can use pre-trained models or create custom models in IBM Watson. These models can then be exported to Apple’s Core ML format for use in iPhone apps.

A visual recognition model (Watson Visual Recognition Service) built on thousands of images is the first available model in this solution and can now be exported to Core ML and run on Apple devices. IBM also lets developers add current and forecasted weather to apps based on a user's location.
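To give a sense of what running such an exported model looks like on-device, here is a minimal Swift sketch using Apple's Vision framework to classify an image with a Core ML model. The model class name `WatsonClassifier` is hypothetical; Xcode generates a Swift class like it when you add the exported .mlmodel file to a project.

```swift
import CoreML
import Vision
import UIKit

// Classify a UIImage with a Core ML model exported from Watson.
// "WatsonClassifier" is a placeholder for the Xcode-generated model class.
func classify(image: UIImage) {
    guard let cgImage = image.cgImage,
          let model = try? VNCoreMLModel(for: WatsonClassifier().model) else {
        return
    }

    // Vision wraps the Core ML model and handles image scaling and cropping.
    let request = VNCoreMLRequest(model: model) { request, _ in
        guard let results = request.results as? [VNClassificationObservation],
              let best = results.first else { return }
        print("Label: \(best.identifier), confidence: \(best.confidence)")
    }

    let handler = VNImageRequestHandler(cgImage: cgImage)
    try? handler.perform([request])
}
```

Because classification happens entirely on the device, apps built this way can identify objects, say, a piece of field equipment, without a network round trip, with the Watson cloud used to retrain and redistribute improved models.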

Swift development

One really significant move is that Apple and IBM have also introduced a tool that lets Swift developers weave AI into their apps.

This is in the form of a new developer console that “millions of Swift developers” can use to link to the IBM Cloud to build apps.

“The IBM Cloud Developer Console for Apple comes with enhanced step-by-step guidance for developers of all experience levels, along with integration with AI, data, and mobile services optimized for Swift,” the partners claim.

That means developers of all experience levels can now make use of IBM Watson Services for Core ML to quickly build machine learning apps and connect them to IBM Cloud.

The solution also includes tools to protect credentials, services, and data using the IBM Cloud Hyper Protect services. This makes for more secure enterprise apps and complements the high-grade security already built into Apple’s platforms.

What can you build?

The result is that a developer using Swift can now build an application that integrates machine intelligence in minutes, the companies claim.

However, in terms of off-the-shelf products, what those apps can do will initially be limited to visual recognition — though that’s not such a bad deal for developers seeking innovative ways to put image recognition inside their ARKit apps.

This makes it possible to build, for example, AR-based field service, support manuals, or visual troubleshooting guides.

The combined technologies go beyond those apps, of course. Because the built-in AI learns as the apps are used, you should eventually be able to point your device at something and have the app figure out what repair it most likely needs. There will also be implications for health, education, research, and city management technologies.

“We are taking this to the next level through machine learning. We are very much on that path and bringing improved accelerated capabilities and providing better insight,” said Mahmoud Naghshineh, general manager for IBM Partnerships and Alliances.

(Naghshineh, some may recall, described Apple technologies as being “pervasive in the enterprise” last year.)

The age of intelligent machines

This is perfectly in line with both companies' vision to put digital transformation at the center of enterprise IT.

The move to make it possible to use Swift as a development tool with which to build and deploy new breeds of intelligent apps will (I think) facilitate fast, agile application development paths, making it possible for business users to quickly trial new solutions and explore new business opportunities.

That’s particularly important, as doing so may level the playing field, perhaps enabling smaller enterprises to deploy machine intelligence quickly and affordably — on an iPad, Apple Watch, or iPhone.

Google+? If you use social media and happen to be a Google+ user, why not join AppleHolic's Kool Aid Corner community and get involved with the conversation as we pursue the spirit of the New Model Apple?

Got a story? Please drop me a line via Twitter and let me know. I'd like it if you chose to follow me there so I can let you know about new articles I publish and reports I find.
