IBM to develop telescope data analysis system
IBM will develop advanced technologies for what will be the world's largest radio telescope, operational in 2024
IDG News Service - IBM is developing new data management and analysis technologies for what will be the world's largest radio telescope. The Square Kilometre Array (SKA), due to become operational in 2024, will produce so much data that even tomorrow's off-the-shelf computers will have difficulty processing all of it, the company predicted.
"This is a research project to find out how to build a computer system" capable of handling exabytes' worth of data each day, said Ton Engbersen, an IBM researcher on the project.
The Netherlands has awarded IBM and the Netherlands Institute for Radio Astronomy (ASTRON) a five-year, €32.9 million (US$43.6 million) grant to design a system, built on novel technologies, that can ingest the massive amounts of data SKA will produce.
Funded by a consortium of 20 government agencies, SKA will be the world's largest and most sensitive radio telescope, able to give scientists a better idea of how the Big Bang unfolded 13 billion years ago. SKA will comprise 3,000 small antennas, each providing a continual stream of data.
Once operational, the telescope will produce more than an exabyte of data a day (an exabyte is 1 billion gigabytes). By way of comparison, an exabyte is twice the entire daily traffic on the World Wide Web, IBM estimated. The data will have to be downloaded from the telescope, which will be sited in either Australia or South Africa, then summarized and shipped to researchers worldwide. Data processing will consist of assembling the individual streams from each antenna into a larger picture of how the universe first came about.
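A back-of-envelope calculation shows why those figures strain even future hardware. This sketch uses only the numbers quoted above (3,000 antennas, roughly one exabyte per day); the per-antenna split is a simplifying assumption for illustration, not an SKA specification:

```python
# Rough SKA ingest rates from the article's figures.
EXABYTE = 10**18          # bytes (1 exabyte = 1 billion gigabytes)
ANTENNAS = 3_000          # small antennas, each streaming continuously

daily_volume = 1 * EXABYTE            # "more than an exabyte of data a day"
per_second = daily_volume / 86_400    # aggregate ingest rate, bytes per second
per_antenna = per_second / ANTENNAS   # naive even split across antennas

print(f"aggregate ingest: {per_second / 10**12:.1f} TB/s")   # 11.6 TB/s
print(f"per antenna:      {per_antenna / 10**9:.2f} GB/s")   # 3.86 GB/s
```

Even at a sustained 11.6 terabytes per second, the raw stream cannot simply be stored; it has to be reduced and summarized as it arrives.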
Even factoring in how much faster computers will be in 2024, IBM still will need advanced technologies to process all that data, Engbersen said. Such a computer might use stacked chips for high volumes of processing, photonic interconnects for speedy connections with the chips, advanced tape systems for data storage, and phase-change memory technologies for holding data to be processed.
"We have to push the envelope on system design," Engbersen said. The researchers have made no decisions yet about whether it should be in one data center or spread out across multiple locations.
Because the system will be so large, the researchers must figure out how to make maximum use of all the hardware components to use as little energy as possible. They also must customize the data-processing algorithms to work with this specific hardware configuration.
After processing, the system is expected to yield between 300 and 1,500 petabytes of data each year. That volume will dwarf the output of what is now by far the largest generator of scientific data, CERN's Large Hadron Collider, which churns out about 15 petabytes of data each year.
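The scale of that gap is easy to quantify from the article's own numbers alone:

```python
# Comparing the SKA's projected annual post-processing output to the LHC's.
PETABYTE = 10**15

ska_low, ska_high = 300 * PETABYTE, 1_500 * PETABYTE  # SKA, per year
lhc = 15 * PETABYTE                                   # LHC, per year

print(f"SKA output is {ska_low // lhc}x to {ska_high // lhc}x the LHC's")
# SKA output is 20x to 100x the LHC's
```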