When it comes to big data, many enterprises are getting slammed with big problems. Google plans to focus on helping those companies over the next year.
Greg DeMichillie, director of product management for Google's cloud team, said the company's announcement of its new Cloud Dataflow analytics offering during the Google I/O keynote on Wednesday is just the beginning of a big data push at Google.
"Dataflow is the first of what we're doing to help people make sense of the data that they have," DeMichillie told Computerworld. "They need to focus on their business problems, not managing the infrastructure... Dataflow begins that and we'll continue that."
The company describes Google Cloud Dataflow as a managed data processing service designed to create data pipelines that "ingest, transform and analyze data in both batch and streaming modes."
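Conceptually, such a pipeline chains an ingest step, one or more transforms, and an analysis step. The sketch below is plain Python with hypothetical function and field names, not the actual Dataflow SDK; it just illustrates the ingest-transform-analyze pattern over a small batch of records:

```python
# Illustrative only: a batch-style ingest -> transform -> analyze pipeline.
# Function and field names are hypothetical, not the Dataflow SDK.

def ingest(records):
    """Yield raw records from some source (here, an in-memory list)."""
    for record in records:
        yield record

def transform(records):
    """Normalize each record: lowercase the event name, keep the value."""
    for record in records:
        yield (record["event"].lower(), record["value"])

def analyze(pairs):
    """Aggregate values per event name."""
    totals = {}
    for event, value in pairs:
        totals[event] = totals.get(event, 0) + value
    return totals

raw = [
    {"event": "Click", "value": 1},
    {"event": "click", "value": 1},
    {"event": "Purchase", "value": 30},
]
print(analyze(transform(ingest(raw))))
# {'click': 2, 'purchase': 30}
```

In a streaming mode, the same transforms would run continuously over records as they arrive rather than over a finished batch.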
The service, according to DeMichillie, is the first step in helping companies handle massive amounts of data.
Enterprises have been managing huge amounts of data for years. Now they also have to factor in data streaming in from a growing number of devices, which calls for complex, near-real-time analytics.
Over the next year, Google will be focused on releasing cloud tools and services that will ease development tasks, while also helping companies monitor their operations, DeMichillie said.
"How do you stay on top of what's happening?" he asked. "The answer is monitoring and alerting. You get automatic intelligent defaults, dashboards and alerts. We're just at the beginning of what can be done there."
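At its simplest, alerting of the kind DeMichillie describes means comparing monitored metrics against default thresholds. The toy sketch below uses hypothetical metric names and threshold values to show the idea:

```python
# Toy alerting check: compare metrics to default thresholds.
# Metric names and threshold values here are hypothetical.

DEFAULT_THRESHOLDS = {"error_rate": 0.05, "latency_ms": 500}

def check_alerts(metrics, thresholds=DEFAULT_THRESHOLDS):
    """Return the names of metrics that exceed their threshold."""
    return [name for name, value in metrics.items()
            if name in thresholds and value > thresholds[name]]

print(check_alerts({"error_rate": 0.12, "latency_ms": 310}))
# ['error_rate']
```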
While DeMichillie wouldn't discuss any specific projects in the works, he said Google will release more services and tools specific to big data and analytics.
"I think it's huge," DeMichillie said. "Enterprises have reams of data, whether it's business data about transactions or Web log data or data about how customers are engaging through mobile apps. What they struggle with is, How do I make business value out of that?"
When companies take on big data projects, the IT department is often left to struggle with cleaning and filtering the data. It takes an immense amount of work to get to the point where the business value is found in the data.
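Much of that cleanup work is mundane filtering and normalization. A minimal sketch in plain Python (the field names are hypothetical examples):

```python
# Minimal sketch of data cleaning: drop malformed rows, normalize the rest.
# Field names ("user", "amount") are hypothetical examples.

def clean(rows):
    for row in rows:
        user = (row.get("user") or "").strip()
        amount = row.get("amount")
        if not user or not isinstance(amount, (int, float)):
            continue  # discard rows missing a user or a numeric amount
        yield {"user": user.lower(), "amount": float(amount)}

raw_rows = [
    {"user": "  Alice ", "amount": 12},
    {"user": "", "amount": 5},          # no user: dropped
    {"user": "Bob", "amount": "oops"},  # non-numeric amount: dropped
]
print(list(clean(raw_rows)))
# [{'user': 'alice', 'amount': 12.0}]
```

Multiplied across millions of rows and dozens of sources, this is the unglamorous work that consumes IT teams before any analysis begins.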
"At Google, we went through that 15 years ago, and we developed a lot of technologies to help us make sense of that," DeMichillie said. "We have only started to tap the services we can expose to customers that we built here at Google."
With big data analysis, timing is everything. DeMichillie noted that many companies get weekly or monthly data analysis reports on topics like sales numbers.
What they really need is that analysis in real time, or at least near real time, he said.
"Knowing there was a trend isn't helpful if you find out a week later," he added. "You have to know about it when it's happening so you can act on it. What if there's Twitter activity that correlates to your business right now? That requires near-real-time understanding, and that's just not happening right now."
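The difference between a weekly report and catching a trend as it happens comes down to aggregating over short, moving windows instead of one large batch. A toy sketch of a sliding count over recent events (the timestamps are hypothetical; a real system would use stream timestamps):

```python
from collections import deque

# Toy near-real-time metric: count events seen in the last `window` seconds.
# Timestamps here are hypothetical illustrative values.

class SlidingCounter:
    def __init__(self, window):
        self.window = window
        self.events = deque()

    def add(self, timestamp):
        self.events.append(timestamp)
        # Evict anything older than the window.
        while self.events and self.events[0] <= timestamp - self.window:
            self.events.popleft()

    def count(self):
        return len(self.events)

counter = SlidingCounter(window=60)
for t in [0, 10, 30, 65, 70]:
    counter.add(t)
print(counter.count())
# 3  (events at 30, 65 and 70 fall inside the 60-second window)
```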
Patrick Moorhead, an analyst at Moor Insights & Strategy, said Google is on the right track with its focus on big data.
"Real-time analysis of big data can cost enterprises tens of millions of dollars in hardware and software," he added. "For enterprises that don't have the skills to set up and operate their own big data and real-time analytics operation, this will save a lot of time, money and headaches."
DeMichillie also noted that Google will continue to work to push prices down in the cloud, continuing what some have called a price war with Amazon Web Services.
"When we announced our price reductions, it was the beginning of a process," he said. "We have said what we did in March is not a one-time event.... Of course it's a competitive market and, of course, there are big players and you have to be cost-competitive."
Sharon Gaudin covers the Internet and Web 2.0, emerging technologies, and desktop and laptop chips for Computerworld.