Supercomputers help New Orleans prepare for Hurricane Isaac
Computing advances since Katrina have helped the city plan better for the storm surge, for one
Computerworld - In the seven years since Katrina struck New Orleans, advances in computer power and storm surge modeling are giving the city detailed data about Hurricane Isaac's impact.
Computer models have already mapped, and continue to update as weather data changes, how the storm surge will move into coastal regions and neighborhoods.
The researchers, at university supercomputing centers in Texas and Louisiana, are working to inform emergency planners about what will happen once the hurricane sends water into canals, levees and neighborhoods.
These models are being proven out right now as the storm hits.
In 2005, when Katrina made landfall, the capability to model storm surge, while good, was rudimentary compared with what is available today. Back then, Louisiana used computer models with as many as 300,000 "nodes," and a simulation took six hours to run.
Each node represents a different location on a map where algorithms run physics computations to determine what will happen during a hurricane. The number of nodes is roughly analogous to the number of dots per inch in a photograph: the more dots, the more detail is available.
Today, says Robert Twilley, an oceanographer and executive director of the Louisiana Sea Grant Program, simulations with some 1.5 million nodes can be completed in 1.5 hours.
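The figures Twilley cites imply a large jump in effective throughput. A back-of-envelope calculation (illustrative arithmetic only, using the node counts and run times quoted above) makes the improvement concrete:

```python
# Back-of-envelope comparison of simulation throughput, using the
# figures quoted in the article (illustrative arithmetic only).
katrina_nodes, katrina_hours = 300_000, 6.0    # 2005-era run
isaac_nodes, isaac_hours = 1_500_000, 1.5      # 2012-era run

katrina_rate = katrina_nodes / katrina_hours   # nodes simulated per hour
isaac_rate = isaac_nodes / isaac_hours

speedup = isaac_rate / katrina_rate
print(f"2005: {katrina_rate:,.0f} nodes/hour")
print(f"2012: {isaac_rate:,.0f} nodes/hour")
print(f"Effective throughput improvement: {speedup:.0f}x")
```

By this rough measure, the 2012 runs work through about 20 times as many nodes per hour as the 2005 runs, on top of the finer detail each node provides.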
"It's incredible -- this is just since Katrina," said Twilley.
The computer models, which are being run at Louisiana State University's Center for Computation and Technology, help inform emergency planners which roads will flood and which neighborhoods will be cut off.
They are being used to help determine the best staging areas for positioning people and supplies needed for the recovery, said Twilley.
Louisiana is using an unstructured grid that allows it to concentrate nodes where analysis is most needed, near inland waterways and flood-prone areas. This lets researchers adjust the level of detail, making the grid as fine as 10 meters in certain inland areas and as coarse as five kilometers in the open ocean.
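The idea behind the unstructured grid can be sketched as a spacing function that tightens near the coast and relaxes offshore. The spacing function and the 100 km transition distance below are invented for illustration; only the 10 m and 5 km endpoints come from the article, and real surge models build unstructured triangular meshes rather than a simple distance rule:

```python
# Illustrative sketch of variable-resolution gridding: node spacing
# shrinks near the coast (fine detail) and grows offshore (coarse).
# The linear ramp and 100 km transition are assumptions for illustration.
def node_spacing_m(distance_from_coast_km: float) -> float:
    """Target node spacing in meters for a given distance offshore."""
    fine, coarse = 10.0, 5_000.0   # 10 m inland, 5 km in open ocean
    # Relax linearly from fine to coarse over the first 100 km offshore.
    frac = min(max(distance_from_coast_km / 100.0, 0.0), 1.0)
    return fine + frac * (coarse - fine)

for d in (0, 1, 10, 50, 100, 500):
    print(f"{d:4d} km offshore -> spacing ~{node_spacing_m(d):,.0f} m")
```

Concentrating nodes this way spends compute where the physics matters most (canals, levees, low-lying streets) instead of spreading it uniformly across open water.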
About the time of Katrina, the computer models "were much coarser and had minimum resolutions of only 100-200 meters," said Casey Dietrich, a post-doctoral researcher at the Institute for Computational Engineering and Sciences at the University of Texas at Austin.
Dietrich has been running computer models at the Texas Advanced Computing Center at the University of Texas to assess the impact of the storm surge on Texas.
Emergency planners in both states take the data generated by the university researchers and incorporate it into geographic information systems.
"They can look down at neighborhood scale and say 'on this street along the levy we're going to have water this high,' and plan accordingly," Dietrich said.
Comparing the capability today with that at the time of Katrina, Dietrich said: "I think we have a very strong understanding of how hurricane waves and storm surge develop and how they can threaten a coastal environment."
The models are now being tested by actual events.
The winds were picking up early last night in New Orleans, said Bred Jacobs, CIO at Loyola University in the city. Jacobs was there for Katrina.
The university suspended operations on Tuesday, and students are sheltering in place. There is ample food and supplies, Jacobs said by email.
"We worked to harden out IT facilities quite a bit since Katrina and are hoping to stay operational in our main data center," Jacobs said. "Tapes we shipped off site and we placed our hot-site on alert as a precaution."
Patrick Thibodeau covers cloud computing and enterprise applications, outsourcing, government IT policies, data centers and IT workforce issues for Computerworld. Follow Patrick on Twitter at @DCgov or subscribe to Patrick's RSS feed . His e-mail address is email@example.com.