There's a data expert making a name for itself in the corporate world today. It's a lightning-fast learner, speaks eight languages and is considered an expert in multiple fields. It has an exemplary work ethic, is a speed reader and finds insights no one else can. On a personal note, it's a mean chef and even offers good dating advice.
The name of this paragon? Watson. IBM Watson.
Named after IBM's first CEO, Watson was created back in 2007 as part of an effort by IBM Research to develop a question-answering system that could compete on the American quiz show "Jeopardy!" Since trouncing its human opponents on the show in 2011, it has expanded considerably. What started as a system focused on a single core capability -- answering questions posed by humans in natural language -- now includes dozens of services spanning language, speech, vision and data analysis.
Watson uses some 50 technologies today, tapping artificial-intelligence techniques such as machine learning, deep learning, neural networks, natural language processing, computer vision, speech recognition and sentiment analysis. But IBM considers Watson more than just AI, preferring the term "cognitive" instead. Whereas existing computers must be programmed, Watson understands the world in the way that humans do: through senses, learning and experience, IBM says.
"When we say 'cognitive,' we mean that it can learn, understand, reason and interact," said Steve Abrams, director of the IBM Watson Platform. "Watson can do each of those things with people, data or other systems."
With the ability to read more than 800 million pages per second, it can analyze vast volumes of data -- including the unstructured kind -- processing it by understanding natural language, generating hypotheses based on evidence, and learning as it goes.
It's tempting to imagine Watson as some giant "brain" churning away behind a curtain in the core of IBM's research facilities, but the reality is very different.
"It's an oversimplification to call Watson a cognitive computer," said Roger Kay, principal analyst at Endpoint Technologies. "What it does is marshal domain-specific resources and make that information available to humans through a natural-language interface."
The "cognitive" part is that Watson can "flash through its knowledge base for potential answers to users' questions by employing AI and machine-learning algorithms," Kay added. "What IBM has done is create a huge engine for this sort of analysis and put together a fairly simple means to program it as well as a straightforward human interface for end users."
Today, Watson can be viewed as a cloud utility, he said: "a powerful capability run by IBM that can be accessed via the web."
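In practice, a "capability accessed via the web" means issuing authenticated HTTPS requests to hosted endpoints and getting structured answers back. The sketch below shows what such a call can look like; the endpoint URL, API key and request fields are placeholders invented for illustration, not IBM's actual API or auth scheme.

```python
import json
import urllib.request

# Placeholder endpoint and key -- illustrative only, not IBM's real URL or credentials.
API_URL = "https://api.example.com/question-answering/v1/ask"
API_KEY = "YOUR_API_KEY"

def build_request(question):
    """Package a natural-language question as an authenticated HTTPS POST."""
    body = json.dumps({"question": question}).encode("utf-8")
    return urllib.request.Request(
        API_URL,
        data=body,
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {API_KEY}",
        },
        method="POST",
    )

req = build_request("Which parcels in this district are likely to have pools?")
print(req.get_method(), req.full_url)
```

Sending the request with `urllib.request.urlopen(req)` would return the service's JSON response; the point here is only the shape of the interaction -- a thin HTTP client in front of a heavyweight cloud service.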
In 2014 IBM established a dedicated Watson Group with a global headquarters in New York City to propel and commercialize the technology. A Boston-based health unit and an IoT headquarters in Germany followed the next year. Today, Watson is available to partners and developers via the cloud and some 30 application programming interfaces (APIs). Hundreds of IBM clients and partners across 36 countries and more than 29 industries now have active projects underway with Watson, IBM says.
The Watson ecosystem includes more than 550 partners across 17 industries and disciplines, and more than 100 of them have already introduced commercial "cognitive" apps, products and services as a result. More than a million developers globally are using the Watson Developer Cloud on IBM's Bluemix platform, meanwhile, to pilot, test and deploy new business ideas. IBM has allocated $100 million for venture investments to support this community.
One of these business users is OmniEarth, an environmental analytics company that recently partnered with IBM. The idea is to leverage Watson's visual-recognition services to decipher and classify physical features in aerial and satellite images, and OmniEarth is using those analyses to help tackle California's ongoing drought.
"Fundamentally, we're looking for what we can learn about outdoor water use to anticipate how much water a particular parcel of land might need," said OmniEarth lead data scientist Shay Strong.
It can take inordinate amounts of time and expertise to manually examine aerial photographs and satellite images to identify swimming pools and other pertinent landscape features on a particular lot, Strong said.
Now, OmniEarth uses a variety of machine-learning algorithms to do it -- some home-grown, and some that are part of Watson. Vast amounts of data are involved -- close to a terabyte for Los Angeles alone, Strong said -- but machine learning speeds up the process enormously. OmniEarth can now process aerial images 40 times faster than it could before, for example, tackling 150,000 images in just 12 minutes rather than several hours.
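To make the task concrete, here is a deliberately toy version of the kind of feature classification involved: scoring an aerial tile by the fraction of "pool-blue" pixels it contains. The color thresholds and the rule itself are invented for illustration; OmniEarth's real pipeline relies on trained machine-learning models, including Watson's visual-recognition service, not a hand-tuned color filter.

```python
def pool_score(pixels):
    """pixels: list of (r, g, b) tuples for one aerial tile.
    Returns the fraction of pixels that look pool-blue (0.0 to 1.0)."""
    if not pixels:
        return 0.0
    # Invented heuristic: strongly blue, noticeably bluer than red, not too dark.
    bluish = sum(1 for r, g, b in pixels if b > 150 and b > r + 40 and g > 100)
    return bluish / len(pixels)

def has_pool(pixels, threshold=0.05):
    """Flag a tile as likely containing a pool if enough pixels are pool-blue."""
    return pool_score(pixels) >= threshold

# A 100-pixel tile: 10% pool-blue water, 90% rooftop gray.
tile = [(60, 170, 210)] * 10 + [(120, 120, 120)] * 90
print(has_pool(tile))  # True
```

A trained model replaces the hand-written rule with features learned from labeled imagery, which is what lets the same approach generalize from pools to turf, tree cover and other landscape features across millions of tiles.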
"It buys us incredible efficiency," Strong said.
It also enables better planning and budgeting. Whereas previously water districts like the City of Folsom and the East Bay Municipal Water District often used statewide averages to gauge their upcoming needs, OmniEarth's AI-based analyses allow them to create much more accurate forecasts. Watson is also helping regional utilities and conservation groups such as the Inland Empire Utility Agency and the Santa Ana Watershed Project Authority fine-tune their outreach programs to better educate families about modifying their water usage.