Heading into next week’s Supercomputing 2015 conference in Austin, the topic on the tip of every attendee’s tongue is simply, "supercomputing." Researchers, industry leaders and computational users are gathering by the thousands to explore the many ways we can harness the power of the microchip in a supercomputing environment.
And it couldn’t happen at a better time. What was once an American stranglehold -- the illustrious title of the country owning the most powerful supercomputer -- has slowly slipped from our grasp since the end of the Cold War.
Maybe, as Americans, we devalued the supercomputer’s place in the race to capitalize on the PC, and subsequently lost our focus. Consider that the most famous supercomputers in the American experience -- WOPR and Watson -- have no commercial, government, military or research value. Watson is merely an advertisement for IBM, best known for its Jeopardy appearances, while WOPR was a character in the popular 1983 Cold War flick, WarGames.
Could that trend change this week at Supercomputing? Coming on the heels of President Obama’s landmark decision on supercomputing last summer, Alan Alda -- actor, science advocate and the lead character on the famous television show M*A*S*H -- is planning to deliver a keynote speech to the Supercomputing crowd about the power of scientific computing. For a geek and fan like me, this is a can’t-miss event!
Once considered a top priority, supercomputers are often overlooked today, as they simply aren’t considered as “cool” as they once were. The IT industry, and the media that covers it, is now enamored with hyperscale, big data and the commoditization of hardware; as a result, applications that process big data and cloud-hosted web frameworks garner the majority of media headlines. That is, until President Obama recently delivered what one journalist labeled an “HPC moon shot.”
Alison Diana, managing editor of Enterprise Tech, recently provided some interesting analysis of the President’s latest executive order, which on July 29, 2015 established the National Strategic Computing Initiative (NSCI), an initiative designed to position the federal government to sharpen, develop and streamline a wide range of new technology applications, investigate solutions to difficult IT problems, and foster increased use of these innovations in the public and private sectors.
In light of the President’s action, Diana’s essential question was thus: “Could President Barack Obama’s executive order establishing the National Strategic Computing Initiative (NSCI) do for high-performance computing what President John F. Kennedy’s proclamation about landing on the moon did for the space industry?”
Keep in mind: Kennedy’s proclamation started a space race between the United States and the Soviet Union. And while China currently owns the world’s top supercomputer -- Tianhe-2, developed by China’s National University of Defense Technology -- and the U.S. owns the second-ranked Titan at the Department of Energy’s (DOE) Oak Ridge National Laboratory, I’d argue that Diana’s comparison is a bit lofty in terms of drawing parallels to the race for the moon.
What Obama’s executive order really did was make HPC and supercomputing “cool” again. High-performance computing, while always vastly important in fields such as life sciences research and oil and gas exploration, now has validation at the highest level. And it couldn’t have come at a better time.
Did consumerization of IT stunt the evolution of the supercomputer?
In my opinion, the "consumerization" of IT (i.e., how enterprises are affected by, and can take advantage of, new technologies and models that originate in the consumer space rather than in the enterprise IT sector), alongside a generally increased desire to move away from supercomputing, has killed some "soulful" HPC-related innovation.
The consumerization of IT was a shift toward a more distributed approach and, in some scenarios, it stifled innovation in building fast and resilient supercomputers -- mostly because they were less efficient in terms of datacenter space and power consumption, but also because they were more expensive. The ROI to a business or customer was lower, which shifted demand toward open-source x86 platforms. The decrease in market demand caused the OEMs to pivot and shift their strategies and investments. It isn’t always the technology itself that evolves; external market forces can, and often do, dictate creativity and innovation.
Not everyone sees Obama’s HPC executive order in the same shining light as I do, however -- especially those operating in commercial enterprises, where executives still question how best to leverage the HPC technology advancements historically seen in national laboratories and research environments. Diana quoted Mike Torto, CEO of Embotics, as stating that, “while it is laudable that this president recognizes technology is imperative to compete globally, it is laughable to believe it can be regulated with the motto ‘to out compute is to out compete.’ Businesses innovate best when left to compete via market forces with no political interference or shackles to any one interest.”
From a developer’s standpoint, once you understand the basic concepts of the cloud, it can actually be quite boring. That’s not to say it’s completely uninteresting, but what “techie” really wants to hear about delivering the lowest CPU lease cycle? The only really interesting things about the cloud are the capacity planning issues surrounding multi-tenancy. And when it comes to Hadoop and other unstructured database platforms, the most interesting aspect of the job is typically calculating the cost per gigabyte of storage, or the cost per gigabyte of network throughput. These management activities are anything but cool, or innovative.
Building a supercomputer, however -- now that is cool. And as a result, I can’t wait to hear what Alda has to say about supercomputing on Monday.
This article is published as part of the IDG Contributor Network.