Supercomputers help New Orleans prepare for Hurricane Isaac
Computing advances since Katrina have helped the city plan better for the storm surge, for one
Computerworld - In the seven years since Katrina struck New Orleans, advances in computing power and storm surge modeling are giving the city detailed data about Hurricane Isaac's impact.
Computer models have already mapped, and are continually updating as weather data changes, how the storm surge will push into coastal regions and neighborhoods.
The researchers, at university supercomputing centers in Texas and Louisiana, are working to inform emergency planners about what will happen once the hurricane sends water into canals, levees and neighborhoods.
These models are being proven out right now as the storm hits.
In 2005, when Katrina made landfall, the capability to model storm surge, while good, was rudimentary compared with what is available today. Back then, Louisiana used computer models with as many as 300,000 "nodes," and it took six hours to run a simulation.
Each node represents a different location on a map where algorithms run the physics computations that determine what will happen during a hurricane. The number of nodes is roughly analogous to the dots per inch in a photograph: the more dots, the more detail available.
Today, says Robert Twilley, an oceanographer and executive director of the Louisiana Sea Grant Program, simulations with some 1.5 million nodes can be completed in 1.5 hours.
"It's incredible -- this is just since Katrina," said Twilley.
The computer models, which are being run at Louisiana State University's Center for Computation and Technology, help inform emergency planners which roads will flood and which neighborhoods will be cut off.
They are being used to help determine the best staging areas for positioning people and supplies needed for the recovery, said Twilley.
Louisiana is using an unstructured grid that allows it to concentrate nodes in the areas where the analysis is needed most, near inland waterways and flood-prone areas. This lets modelers tune the detail to where it matters, making the resolution as fine as 10 meters in certain inland areas and as coarse as five kilometers in the open ocean. Some of these maps can be seen on the program's website.
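To make that tradeoff concrete, here is a minimal sketch, not the actual LSU model, of how the spacing between nodes might taper from shore to open ocean. The 10-meter and five-kilometer endpoints come from the figures above; the linear ramp over a 100-kilometer transition is purely an assumption for illustration:

    # Illustrative only: endpoint spacings from the article, ramp assumed.
    def target_spacing_m(dist_from_shore_km):
        near, far = 10.0, 5000.0   # meters: fine inland, coarse offshore
        ramp_km = 100.0            # assumed transition distance
        frac = min(max(dist_from_shore_km / ramp_km, 0.0), 1.0)
        return near + frac * (far - near)

    for d in (0, 1, 10, 50, 100, 500):
        print(f"{d:>4} km offshore -> ~{target_spacing_m(d):,.0f} m between nodes")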
About the time of Katrina, the computer models "were much coarser and had minimum resolutions of only 100-200 meters," said Casey Dietrich, a post-doctoral researcher at the Institute for Computational Engineering and Sciences at the University of Texas at Austin.
Dietrich has been running computer models at the Texas Advanced Computing Center at the University of Texas to assess the impact of the storm surge on Texas.
Emergency planners in both states take the data generated by the university researchers and incorporate it into geographic information systems.
"They can look down at neighborhood scale and say 'on this street along the levy we're going to have water this high,' and plan accordingly," Dietrich said.
Comparing the capability today with that at the time of Katrina, Dietrich said: "I think we have a very strong understanding of how hurricane waves and storm surge develop and how they can threaten a coastal environment."
The models are now being tested by actual events.
The winds were picking up early last night in New Orleans, said Bret Jacobs, CIO at Loyola University in the city. Jacobs was there for Katrina.
The university suspended operations on Tuesday, and students are sheltering in place with ample food and supplies, Jacobs said by email.
"We worked to harden out IT facilities quite a bit since Katrina and are hoping to stay operational in our main data center," Jacobs said. "Tapes we shipped off site and we placed our hot-site on alert as a precaution."
Patrick Thibodeau covers cloud computing and enterprise applications, outsourcing, government IT policies, data centers and IT workforce issues for Computerworld. Follow Patrick on Twitter at @DCgov or subscribe to Patrick's RSS feed. His e-mail address is email@example.com.