
Intel to customize chips for big data apps

Intel is looking to close software gaps with on-chip accelerators and cores

By Agam Shah
August 15, 2013 04:44 PM ET

IDG News Service - Intel, which has increasingly been customizing server chips for individual customers, is now tuning its chips for big data workloads.

Software is becoming an important building block in chip design, and customization will help applications gather, manage and analyze data much more quickly, said Ron Kasabian, general manager of big data solutions at Intel.

Through hardware and software improvements, the company is trying to figure out how its chips can perform better in areas like predictive analytics, cloud data collection and specific task processing. The company has already released its own distribution of Hadoop, a scalable computing environment that deals with large data sets, and now chip improvements are on tap.
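For readers unfamiliar with the kind of workload Intel's Hadoop distribution targets, the canonical illustration is a MapReduce word-count job, in which mappers emit per-word counts and reducers sum them across a cluster. The following is a minimal sketch assuming the stock Apache Hadoop MapReduce Java API; the class and argument names (WordCount, TokenizerMapper, IntSumReducer, input and output paths) are illustrative and are not specific to Intel's distribution.

    import java.io.IOException;
    import java.util.StringTokenizer;

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.Path;
    import org.apache.hadoop.io.IntWritable;
    import org.apache.hadoop.io.Text;
    import org.apache.hadoop.mapreduce.Job;
    import org.apache.hadoop.mapreduce.Mapper;
    import org.apache.hadoop.mapreduce.Reducer;
    import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
    import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

    // Illustrative word-count job: the map phase emits (word, 1) pairs,
    // and the reduce phase sums the counts for each word across the cluster.
    public class WordCount {

        public static class TokenizerMapper
                extends Mapper<Object, Text, Text, IntWritable> {
            private static final IntWritable ONE = new IntWritable(1);
            private final Text word = new Text();

            @Override
            public void map(Object key, Text value, Context context)
                    throws IOException, InterruptedException {
                // Split each input line into tokens and emit a count of 1 per token.
                StringTokenizer itr = new StringTokenizer(value.toString());
                while (itr.hasMoreTokens()) {
                    word.set(itr.nextToken());
                    context.write(word, ONE);
                }
            }
        }

        public static class IntSumReducer
                extends Reducer<Text, IntWritable, Text, IntWritable> {
            private final IntWritable result = new IntWritable();

            @Override
            public void reduce(Text key, Iterable<IntWritable> values, Context context)
                    throws IOException, InterruptedException {
                // Sum all partial counts for this word and emit the total.
                int sum = 0;
                for (IntWritable val : values) {
                    sum += val.get();
                }
                result.set(sum);
                context.write(key, result);
            }
        }

        public static void main(String[] args) throws Exception {
            // args[0] is the HDFS input directory, args[1] the output directory.
            Configuration conf = new Configuration();
            Job job = Job.getInstance(conf, "word count");
            job.setJarByClass(WordCount.class);
            job.setMapperClass(TokenizerMapper.class);
            job.setCombinerClass(IntSumReducer.class);
            job.setReducerClass(IntSumReducer.class);
            job.setOutputKeyClass(Text.class);
            job.setOutputValueClass(IntWritable.class);
            FileInputFormat.addInputPath(job, new Path(args[0]));
            FileOutputFormat.setOutputPath(job, new Path(args[1]));
            System.exit(job.waitForCompletion(true) ? 0 : 1);
        }
    }

Jobs of this shape spend much of their time moving, decompressing and hashing data, which is the sort of work that on-chip accelerators and instruction-set additions could speed up.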

Kasabian said Intel is starting with the software. "It takes a while to get silicon to market," he said. "We understand where we can optimize for silicon, and there are certain things to [improve] for performance and optimization."

The company is taking lessons from software implementations and then looking to enhance the silicon to fill any software gap, Kasabian said, adding that the chip-design process takes about two years.

Server makers have been customizing servers specifically to carry out big data workloads, and improvements at the chip and instruction-set level could speed up task execution.

The plan includes developing accelerators or dedicated cores for big data workloads. For example, Intel is working with the Chinese company Bocom on the Smart City project, which aims to combat counterfeit license plates in China by recognizing plates as well as car makes and models. The project involves sending images through server gateways, and Intel is looking to fill software gaps by enhancing the silicon; one improvement could be adding accelerators to decode video, Kasabian said.

Intel has a big software organization, and the appointment earlier this year of Renee James -- formerly head of the software unit -- as the company's president was a sign of the chip maker's intent to dig deeper into software. The company does not want to become a packaged software distributor, but it does want software to work better on Intel architecture hardware. Intel has long backed open-source software and has hundreds of coders contributing to the development of Linux.

Different industries have different implementations of big data, Kasabian said. For example, a big data problem in genomics could differ from one in telecommunications.

Intel is also entering the Internet of things space, an emerging field in which networked devices with embedded processors and sensors are used as data-gathering instruments. The company has assets such as McAfee's software and hardware platform and Wind River's real-time operating system for its embedded chips, which can help collect data securely and process it quickly.
