University of Texas at Austin

News

Hurricane Simulations in High Gear

By Jorge Salazar, Joanne Foote

Published Feb. 12, 2025

Hurricane storm surge flood waters can take a heavy toll on society. TACC supercomputer simulations are an important tool for state and federal agencies in protecting life and property. Credit: iStock (Hurricane Harvey in Kemah, TX, near Houston).

The 2024 Atlantic hurricane season left a trail of destruction in its wake, causing hundreds of fatalities and more than $200 billion in damage. Amid the heavy toll, supercomputer simulations were an important tool for U.S. state and federal agencies in protecting life and property.

At the Texas Advanced Computing Center (TACC), the Vista, Frontera, Stampede3, and Lonestar6 supercomputers are used for urgent computing, providing emergency responders with rapid and frequently updated simulations of storm surge, the often deadly rise in sea level and coastal flooding caused by major storms and hurricanes.

“Here at TACC, we've continued many collaborations with state agencies in Texas, Louisiana, North Carolina, and elsewhere, and with research institutions, to provide resources for running storm surge forecasting models,” said Carlos Del-Castillo-Negrete, a research associate in TACC's High Performance Computing group.

Prior to joining TACC in 2023, Negrete was a researcher in the Computational Hydraulics Group at the Oden Institute for Computational Engineering & Sciences at The University of Texas at Austin, under the leadership of Clint Dawson, chair of the Department of Aerospace Engineering & Engineering Mechanics at UT.


Clint Dawson, UT Austin, with the Frontera supercomputer. Credit: TACC

For decades, Dawson has been a key developer of storm surge models tuned for running on supercomputers, mainly the Advanced Circulation (ADCIRC) model, originally developed by Rick Luettich of the University of North Carolina at Chapel Hill (UNC-Chapel Hill) and Joannes Westerink of the University of Notre Dame.

The ADCIRC code is approved by the Federal Emergency Management Agency (FEMA) and is officially used by the U.S. federal government in flood insurance studies and flood risk computations. “Our role at TACC is to keep developing better and faster systems for them to run these models,” Negrete said.

“To model events such as the 15-foot storm surge we saw with Hurricane Ian in 2022 at Fort Myers Beach in Florida, we have to gather and ingest a lot of data,” Dawson said. The major data sources a storm surge model needs include bathymetry and topography describing the seafloor and land surface; land cover and land use, which capture the roughness of the terrain and its resistance to water flow; and precipitation, wind velocity, and pressure measurements collected via satellite and aircraft dropsondes from the National Hurricane Center.
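For readers who want a concrete picture of those inputs, the short Python sketch below bundles the data categories Dawson lists into a single structure. The class, field names, and toy values are hypothetical and for illustration only; they do not reflect ADCIRC's actual input files or formats.

```python
# Hypothetical container for the input categories a storm surge model ingests.
# Field names, units, and values are illustrative only, not ADCIRC's formats.
from dataclasses import dataclass

@dataclass
class SurgeModelInputs:
    bathymetry_m: dict       # seafloor depth at mesh nodes, meters
    topography_m: dict       # land elevation at mesh nodes, meters
    land_cover: dict         # land-use class per node, sets roughness/friction
    wind_velocity_ms: dict   # forecast wind speed fields, m/s
    pressure_hpa: dict       # atmospheric pressure fields, hPa
    precipitation_mm: dict   # rainfall accumulation, mm

# A toy two-node "mesh" just to show how the pieces fit together.
inputs = SurgeModelInputs(
    bathymetry_m={"node_1": 12.4, "node_2": 3.1},
    topography_m={"node_1": 0.0, "node_2": 1.8},
    land_cover={"node_1": "open water", "node_2": "marsh"},
    wind_velocity_ms={"node_1": 42.0, "node_2": 38.5},
    pressure_hpa={"node_1": 958.0, "node_2": 961.0},
    precipitation_mm={"node_1": 110.0, "node_2": 95.0},
)
print(inputs.land_cover["node_2"])
```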


Hindcast storm surge models are used to understand what happened in past storms; the results help improve models for present and future forecasting. CERA Group hindcast model run for Hurricane Helene using the NHC's best track, Oct. 3, 2024. Credit: CERA

“Over the years, parallelizing the ADCIRC code has allowed us to not only use it to model and understand what happened after a hurricane hits, but with bigger and better supercomputing resources, we now use ADCIRC operationally for forecasting,” Dawson added. “High performance computing (HPC) revolutionized what we were able to do. We can resolve much higher detail in the models and include features such as floodplains that we couldn’t two decades ago, expand them to global coverage, and incorporate better physics for more accurate predictions.”

Hurricane Ike simulations done in 2008 by Dawson and many collaborators on TACC’s Ranger supercomputer marked a turning point for storm surge computer modeling, in that they predicted results that closely matched actual high water mark measurements. “That was a necessary step to build trust in the model,” Dawson said. “Now that people can trust our model, we can use it for forecasting.”


Carlos Del-Castillo-Negrete, TACC. Credit: TACC

Storm Surge Forecast Origin

A surge forecast begins with the first signs of a low-pressure area in the tropics with the potential to become a storm. These disturbances are tracked by the National Hurricane Center (NHC), which begins issuing guidance updated every six hours. The updates provide predictions about the track and intensity of the storm using the output of nine large-scale meteorological models, such as the Global Forecast System from the National Oceanic and Atmospheric Administration (NOAA), products from the UK Met Office, models from the European Centre for Medium-Range Weather Forecasts, and others.

Dawson and collaborators initially ran storm surge forecast scenarios manually, often staying up late and sleeping only between computer runs. In the 2010s, this process was automated in a project led by Dr. Jason Fleming, originally at UNC-Chapel Hill and now head of Seahorse Coastal Consulting. Fleming still leads a group that uses the fully automated ADCIRC Prediction System to provide forecasts during hurricane season. Seahorse uses allocations awarded on TACC's Frontera, Lonestar6, and Stampede3 supercomputers, and also runs storm surge models at Louisiana State University (LSU).

“This system uses the ADCIRC+SWAN models to produce model guidance for hurricane storm surge using official advisories from the National Hurricane Center,” Fleming said. “The data is then transmitted to the Coastal Emergency Risk Assessment (CERA) platform at LSU, mapped, matched to real-time water level measurements, and disseminated to decision makers like FEMA, the U.S. Coast Guard, emergency managers, and many others for use in their operations.”
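The automated workflow Fleming describes can be pictured, very roughly, as a loop that watches for new advisories, runs the surge model, and publishes the results. The Python sketch below illustrates only that control flow; every function in it is a hypothetical stub, not code from the ADCIRC Prediction System or CERA.

```python
# Hypothetical sketch of an advisory-driven forecast loop. Each helper is a
# stub standing in for a real component (advisory retrieval, HPC job
# submission, result publication); none of this reflects the actual
# ADCIRC Prediction System implementation.
import time

ADVISORY_INTERVAL_SECONDS = 6 * 60 * 60  # NHC guidance is updated every six hours

def fetch_latest_advisory():
    """Stand-in for downloading the newest NHC advisory (track and intensity)."""
    return {"advisory_id": 14, "storm": "EXAMPLE", "intensity_kt": 105}

def run_surge_model(advisory):
    """Stand-in for submitting an ADCIRC+SWAN run on an HPC system."""
    return {"advisory_id": advisory["advisory_id"], "max_surge_m": 4.2}

def publish_results(results):
    """Stand-in for pushing mapped guidance to a visualization platform."""
    print(f"Published guidance for advisory {results['advisory_id']}: "
          f"peak surge {results['max_surge_m']} m")

def forecast_cycle(iterations=1, wait_seconds=0):
    """Act only when a new advisory appears, then wait for the next cycle."""
    last_seen = None
    for _ in range(iterations):
        advisory = fetch_latest_advisory()
        if advisory["advisory_id"] != last_seen:
            publish_results(run_surge_model(advisory))
            last_seen = advisory["advisory_id"]
        time.sleep(wait_seconds)  # in operation this would wait for the next six-hour cycle

if __name__ == "__main__":
    forecast_cycle()
```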

The moment of truth for all of the data and modeling is the CERA website and its associated real-time visualization engine, which delivers hundreds of maps during an active storm event to emergency management personnel, flood managers, and the public at large.

CERA is the go-to site for thousands of users when a hurricane approaches U.S. shores. It provides detailed information about hurricane-affected areas, impact times and locations, wind hazards, county and neighborhood flood risks, building and infrastructure safety, and recommended actions for emergency managers and first responders. 


Jason Fleming, Seahorse Coastal Consulting LLC (L); Carola Kaiser, LSU (R).

“Hurricane events do not allow for delays or failures,” said CERA team leader Carola Kaiser of LSU’s Center for Computation and Technology. “By providing hundreds of simulations on supercomputers, making our workflow as robust and redundant as we can, and feeding the results into the CERA system in a completely automated way, we are empowering our users to make informed decisions and act. Our team heavily utilizes supercomputing resources from TACC, which, along with the HPC resources at LSU, are the bedrock of our mission.”

Kaiser said that CERA relies on TACC's systems around the clock during hurricane season, when major storms can require priority access and the allocation of additional compute nodes. "We work with the TACC team to have the necessary software installed and the latest information on hand about compilers, network specifics, storage space access, and new hardware developments.”


Snapshot from a visualization on the CERA website showing maximum water height of hurricane storm surge as Hurricane Beryl made landfall on July 8, 2024. Credit: CERA

Hurricane Beryl made history on July 1, 2024, as the earliest Category 5 Atlantic hurricane ever recorded in a season; when it made landfall on the Texas coast a week later, it knocked out power for nearly three million Houstonians. “TACC’s HPC resources play a major role in keeping the focus, especially when storms like Beryl occur over a longer period of time and have swings in storm strength and path direction,” Kaiser said. Model runs with a variety of mesh resolutions, targeted at the current position of the hurricane, proved essential for her team in validating the storm impacts.

TACC’s Stampede3 kicked into high gear early for hurricane storm surge simulations of Beryl, Helene, and more. “Stampede3 has become an extremely effective machine to run weather models, especially with its higher memory bandwidth, which tends to be what limits many of these storm surge models,” Negrete said. “We get better and more detailed forecasts faster, which can mean saving lives in hazardous situations.”

Negrete has lately been working within TACC to support the National Science Foundation (NSF)-funded Natural Hazards Engineering Research Infrastructure DesignSafe project, which provides cloud-based tools to manage, analyze, understand, and publish critical data for research on the impacts of natural hazards. Through a web portal, DesignSafe gives researchers software tools to run ADCIRC simulations on Stampede3, along with data storage and sharing.

“DesignSafe is a great community for sharing and collaborating on hurricane hazards research across interdisciplinary bounds, where our forecasting and research work with these storm surge models can be shared with other researchers who then use this information and data in new and impactful ways. DesignSafe is an excellent example of how the cyberinfrastructure that TACC builds helps foster a research community that wouldn't be possible otherwise,” said Negrete.

AI and Storm Surge Models

The explosive rise of AI provides promising opportunities for storm surge researchers to make faster and more accurate forecasts. “The Computational Hydraulics Group is heavily involved in using machine learning (ML) to see what we can get out of ML algorithms and possibly save our more costly HPC computations for when they're needed the most,” said Dawson.

Research published by Dawson’s group in the journal Coastal Engineering in October 2023 used neural network-based regression models, run on the NVIDIA A100 GPU nodes of TACC’s Lonestar6 supercomputer, to predict inundation levels at wet points as part of a larger storm surge model.
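The general idea, training a neural network to regress water levels at mesh points from storm descriptors, can be sketched in a few lines of Python. The features, architecture, and synthetic data below are invented for illustration and are not the model published by Dawson's group.

```python
# Minimal sketch of neural-network regression for water levels at mesh points.
# Features, network size, and synthetic data are purely illustrative; they do
# not reproduce the model described in the Coastal Engineering paper.
import torch
from torch import nn

device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

# Pretend inputs: per-sample storm descriptors (e.g., central pressure, forward
# speed, landfall offset, max winds); target: water level at one mesh point (m).
X = torch.rand(1024, 4, device=device)
y = (2.0 * X[:, 0] + X[:, 1] * X[:, 3]).unsqueeze(1) \
    + 0.05 * torch.randn(1024, 1, device=device)

model = nn.Sequential(
    nn.Linear(4, 64), nn.ReLU(),
    nn.Linear(64, 64), nn.ReLU(),
    nn.Linear(64, 1),
).to(device)

optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

for epoch in range(200):
    optimizer.zero_grad()
    loss = loss_fn(model(X), y)
    loss.backward()
    optimizer.step()

print(f"final training MSE: {loss.item():.4f}")
```

The appeal of such a surrogate, as Dawson's quote suggests, is that once trained on archived simulations it can return an estimate almost instantly, reserving full HPC runs for the cases that need them most.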

Dawson and his team plan to continue exploring the benefits of AI as a component in storm surge models on Vista, TACC’s first AI-centric supercomputer, and later the Horizon supercomputer of the U.S. NSF Leadership Class Computing Facility (LCCF) that is planned for TACC in 2026.

“Vista and especially Horizon are going to be incredible systems to use our breadth and wealth of computational simulation data to train ML models that can then be used in a more operational way,” Negrete said.

"Behind the storm surge forecasts, there’s a massive team of researchers, investigators, engineers, and software developers that have worked across a variety of institutions,” Negrete concluded. “TACC has been an integral part of that team for a long time. It's a set of remarkable people who care about making an impact in the world.”

Funding for the development and operation of Vista, Frontera, and Stampede3 comes from the National Science Foundation (NSF). Lonestar6 is supported by the University of Texas Research Cyberinfrastructure.

Story adapted from TACC