Supercomputers help New Orleans prepare for Hurricane Isaac
Computing advances since Katrina have helped the city plan better for the storm surge, for one thing
Computerworld - In the seven years since Katrina struck New Orleans, advances in computer power and storm surge modeling are giving the city detailed data about Hurricane Isaac's impact.
Computer models have already mapped, updating continually as weather data changes, how the storm surge will push into coastal regions and neighborhoods.
The researchers, at university supercomputing centers in Texas and Louisiana, are working to inform emergency planners about what will happen once the hurricane sends water into canals, levees and neighborhoods.
These models are being proven out right now as the storm hits.
In 2005, when Katrina made landfall, the capability to model storm surge, while good, was rudimentary compared with what is available today. Back then, Louisiana used computer models with as many as 300,000 "nodes," and it took six hours to run a simulation.
Each node represents a different location on a map where algorithms run physics computations that determine what will happen during a hurricane. The number of nodes is somewhat analogous to the number of dots per inch in a photograph: the more dots, the more detail that's available.
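To make the node idea concrete, here is a minimal, purely illustrative sketch in Python. It is a toy relaxation over a one-dimensional chain of nodes, not the production storm surge code the researchers run; every function name and number in it is an assumption chosen for readability.

```python
# Toy illustration only: real storm-surge models solve the full
# shallow-water equations on large coastal meshes. This sketch just
# shows the pattern that each "node" holds local state and a physics
# update runs at every node on every time step.

import math

def run_toy_surge(num_nodes, steps=100, length_km=100.0):
    """Relax a surge profile toward flat water on a 1-D chain of nodes."""
    dx = length_km / (num_nodes - 1)   # spacing between nodes (km)
    # Initial condition: a surge bulge centered 30 km along the domain.
    eta = [2.0 * math.exp(-((i * dx - 30.0) / 10.0) ** 2)
           for i in range(num_nodes)]
    k = 0.2                            # toy smoothing coefficient
    for _ in range(steps):
        new = eta[:]
        for i in range(1, num_nodes - 1):
            # Each interior node updates from its neighbors, the same
            # pattern the 300,000- and 1.5-million-node meshes in the
            # article follow at vastly larger scale.
            new[i] = eta[i] + k * (eta[i - 1] - 2 * eta[i] + eta[i + 1])
        eta = new
    return eta

coarse = run_toy_surge(101)    # ~1 km node spacing
fine = run_toy_surge(1001)     # ~0.1 km spacing: more nodes, more detail
```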
Today, says Robert Twilley, an oceanographer and executive director of the Louisiana Sea Grant Program, simulations with some 1.5 million nodes can be completed in 1.5 hours. That is five times the nodes in a quarter of the time, roughly a 20-fold increase in simulation throughput over the Katrina-era runs.
"It's incredible -- this is just since Katrina," said Twilley.
The computer models, which are being run at Louisiana State University's Center for Computation and Technology, help inform emergency planners which roads will flood and which neighborhoods will be cut off.
They are being used to help determine the best staging areas for positioning people and supplies needed for the recovery, said Twilley.
Louisiana is using an unstructured grid, which allows it to concentrate nodes where the analysis is needed most: near inland waterways and in flood-prone areas. This lets the modelers vary the resolution, making it as fine as 10 meters in certain inland areas and as coarse as five kilometers in the open ocean. Some of this output can be seen on the program's website.
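A hedged sketch of that variable-resolution idea follows. Real unstructured meshes are built with dedicated mesh-generation tools; only the 10-meter and 5-kilometer figures come from the article, and the function name and grading rule below are assumptions.

```python
# Hypothetical sketch: choose a target mesh spacing from distance to
# the coastline, grading from fine inland detail to a coarse open ocean.

def target_spacing_m(dist_from_shore_km):
    """Pick a mesh spacing based on distance from the coastline.

    Negative distances are inland. Spacing grades from ~10 m in
    flood-prone inland areas up to ~5,000 m in the open ocean,
    matching the range quoted in the article.
    """
    if dist_from_shore_km <= 0.0:      # inland waterways: finest detail
        return 10.0
    # Grade spacing geometrically offshore, capped at 5 km.
    spacing = 10.0 * (1.0 + dist_from_shore_km) ** 2
    return min(spacing, 5000.0)

for d in (-5.0, 0.0, 1.0, 10.0, 50.0):
    print(f"{d:6.1f} km from shore -> {target_spacing_m(d):7.1f} m spacing")
```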
Around the time of Katrina, the computer models "were much coarser and had minimum resolutions of only 100-200 meters," said Casey Dietrich, a postdoctoral researcher at the Institute for Computational Engineering and Sciences at the University of Texas at Austin.
Dietrich has been running computer models at the Texas Advanced Computing Center at the University of Texas to assess the impact of the storm surge on Texas.
Emergency planners in both states take the data generated by the university researchers and incorporate it into geographic information systems.
"They can look down at neighborhood scale and say 'on this street along the levy we're going to have water this high,' and plan accordingly," Dietrich said.
Comparing today's capability with that at the time of Katrina, Dietrich said: "I think we have a very strong understanding of how hurricane waves and storm surge develop and how they can threaten a coastal environment."
The models are now being tested by actual events.
The winds were picking up early last night in New Orleans, said Bret Jacobs, CIO at Loyola University in the city. Jacobs was there for Katrina.
The university suspended operations on Tuesday, and students are sheltering in place. There are ample food and supplies, Jacobs said by email.
"We worked to harden out IT facilities quite a bit since Katrina and are hoping to stay operational in our main data center," Jacobs said. "Tapes we shipped off site and we placed our hot-site on alert as a precaution."
Patrick Thibodeau covers cloud computing and enterprise applications, outsourcing, government IT policies, data centers and IT workforce issues for Computerworld. Follow Patrick on Twitter at @DCgov or subscribe to Patrick's RSS feed.