Intel's MIC processor finds a big customer in Texas
Supercomputer may reach 15 petaflops
Computerworld - Intel's forthcoming MIC processor will be used by the Texas Advanced Computing Center to build a supercomputer with a peak performance of 10 petaflops that will eventually be upgraded to "at least" 15 petaflops.
The system will include a combination of eight-core Intel Xeon chips, which will supply two petaflops of compute capacity, and chips based on the MIC (Many Integrated Core) architecture. The highly parallel MIC processors will provide an additional eight petaflops of performance to the Texas system, code-named "Stampede."
"This is definitely the first serious appearance of MIC in the marketplace," said Steve Conway, a high-performance computing analyst at research firm IDC.
The Texas Advanced Computing Center, located at the University of Texas at Austin, is "immediately" getting $27.5 million from the National Science Foundation (NSF) to build the system, which is expected to be running by January 2013.
The estimated federal investment over a four-year period will be $50 million. That includes plans to add future generations of MIC chips, bringing the compute capacity to 15 petaflops, or 15,000 trillion calculations per second. NSF-funded computers are available to scientists to do a wide range of research in areas such as climate, energy, processor improvements and even the spread of diseases.
The supercomputer will also comprise several thousand Dell "Zeus" servers.
The MIC chip that the Texas Advanced Computing Center will be using is code-named "Knights Corner," a co-processor designed for highly parallel workloads. It may have more than 50 cores. Knights Corner competes with Nvidia GPUs -- both can be used as accelerators to speed up highly parallel computations.
Nathan Brookwood, an analyst at Insight 64, said Nvidia is in the catbird seat for people looking for massively parallel types of systems, but the Intel chip has been designed to be amenable to x86 programming environments.
"It's easier to move code over because you do have the x86 compatibility and standard Intel compilers," Brookwood said. But on the other hand, a lot of code that runs in supercomputing environments is adapted to OpenCL, which Intel will support as well, he said.
Patrick Thibodeau covers SaaS and enterprise applications, outsourcing, government IT policies, data centers and IT workforce issues for Computerworld. Follow Patrick on Twitter at @DCgov or subscribe to Patrick's RSS feed . His e-mail address is email@example.com.