IDG News Service - IBM researchers have developed a new algorithm that can analyze terabytes of raw data in minutes, promising faster predictions of weather and electricity usage, the company said today.
The mathematical algorithm, developed by IBM's laboratories in Zurich, can sort, correlate and analyze millions of random data sets, a task that could otherwise take days for supercomputers to process, said Costas Bekas, an IBM researcher.
The algorithm is just under a thousand lines of code and will be instrumental in establishing usage patterns or trends from data gathered by sources such as sensors or smart meters, he said. It could be used to analyze the growing mass of data measuring electricity usage as well as air and water pollution levels, and it could also break down data from global financial markets to assess individual and collective exposure to risk, Bekas said.
"We are interested in measuring the quality of data," Bekas said. Efficient analysis of large data sets requires new mathematical techniques that reduce computational complexity, Bekas said.
The algorithm combines models of data calibration and statistical analysis that can assess measurement models and hidden relationships between data sets. IBM has been working on the research for two years, Bekas said.
The algorithm can also reduce companies' costs by analyzing data in a more energy-efficient way, Bekas said. The lab used a Blue Gene/P Solution system at the Forschungszentrum Jülich research center in Germany to validate 9TB of data in less than 20 minutes. Analyzing the same amount of data without the algorithm would have taken a day with the supercomputer running at peak speed, which would have added up to higher electricity bills, Bekas said.
According to Top500.org, the Blue Gene/P is the fourth-fastest supercomputer in the world as of last November, with 294,912 IBM Power processing cores that can provide peak performance of up to 1 petaflop.
The traditional approach to data analysis is to take multiple data sets and look at them individually, said Eleni Pratsini, manager of mathematical and computational sciences at the IBM research labs. However, the algorithm compares data sets against each other, which could help enterprises point toward larger trends in particular areas, such as risk reduction in financial portfolios.
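IBM has not published the algorithm itself, but the approach Pratsini describes, comparing data sets against one another rather than examining each in isolation, can be illustrated with a standard correlation matrix. The sketch below is purely illustrative (the data sets, sizes, and function name are hypothetical, not IBM's code): each entry of the matrix measures how strongly two data sets move together, which is the kind of hidden relationship the article says the algorithm surfaces.

```python
import numpy as np

def correlation_matrix(datasets):
    """Pairwise Pearson correlations between equally sized data sets.

    datasets: sequence of 1-D arrays of equal length, one per data set.
    Returns an n x n symmetric matrix; entry (i, j) near +1 or -1
    means data sets i and j are strongly related, near 0 means no
    linear relationship.
    """
    return np.corrcoef(np.asarray(datasets))

# Three toy "data sets": two closely related series and one
# independent one, standing in for e.g. smart-meter readings.
rng = np.random.default_rng(0)
base = rng.normal(size=1000)
sets = [base,                                   # data set 0
        base + 0.1 * rng.normal(size=1000),     # 1: noisy copy of 0
        rng.normal(size=1000)]                  # 2: unrelated
corr = correlation_matrix(sets)
# corr[0, 1] is close to 1 (hidden relationship found);
# corr[0, 2] is close to 0 (no relationship).
```

On a toy scale this is trivial; the engineering challenge the article describes is doing such cross-comparisons over millions of data sets, where the naive pairwise cost becomes the bottleneck that new mathematical techniques must reduce.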
Enterprises will want faster ways of generating business intelligence as masses of data flood servers with the expansion of computing to new devices, Pratsini said.
Now that the algorithm has been validated scientifically, the research lab is collaborating with IBM's Global Services unit to apply it to specific services, Pratsini said. Ultimately, the algorithm could make its way into IBM applications such as the SPSS statistical analysis software, but the company didn't provide a specific time frame for that.