IBM and University of Texas at Austin Develop a Flood Prediction System

In the aftermath of Hurricane Irene, which caused flooding on the East Coast, it is good to know that IBM and researchers at the University of Texas at Austin are developing better flood prediction technology. The technology could predict flooding several days in advance, which would allow more time to evacuate and prepare. Since floods are a common natural disaster in the U.S., that is indeed good news.

Researchers are testing the system on the entire 230 miles of the Guadalupe River in Texas, along with over 9,000 miles of its tributaries. Flood prediction methods typically focus on the main stems of the largest rivers and overlook the tributary networks where floods start. The new system can simulate the river’s behavior at more than 100 times real-time speed: running on IBM’s POWER7 platform, it can generate up to 100 hours of river behavior in one hour.

“Effective flood preparedness can be looked at as a large scale computing problem, with a huge number of relevant data and interdependencies,” said Frank Liu, Research Staff Member at IBM Research – Austin. “Using advanced models to simulate the scores of tributaries of large rivers along with other relevant real-time information such as weather, we are better able to give people valuable advance notice of a flood.”

The system can simulate tens of thousands of tributaries at a time, and could eventually be used to predict the behavior of millions of river branches simultaneously. It combines analytic software with advanced weather simulation, including IBM’s Deep Thunder. In addition, the research team is linking the technology to NEXRAD radar precipitation data, which allows them to better predict the risk of flooding on a creek-by-creek basis.
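The creek-by-creek idea above can be illustrated with a toy calculation. This is not IBM's model; it is a minimal sketch, assuming hypothetical per-creek rainfall estimates (of the kind a radar precipitation feed might supply) and a made-up "absorption capacity" for each catchment.

```python
# Hypothetical sketch: score flood risk creek by creek by comparing an
# estimated rainfall total for each creek's catchment against how much
# water that catchment can absorb. All names and numbers are illustrative.

def creek_risk(rainfall_mm: float, capacity_mm: float) -> float:
    """Crude risk ratio: rainfall relative to the catchment's capacity."""
    return rainfall_mm / capacity_mm

# Invented example data: (estimated rainfall in mm, absorption capacity in mm)
creeks = {"Creek A": (120.0, 100.0), "Creek B": (40.0, 100.0)}

for name, (rain, cap) in creeks.items():
    status = "flood risk" if creek_risk(rain, cap) > 1.0 else "ok"
    print(f"{name}: {status}")  # Creek A: flood risk / Creek B: ok
```

A real system would replace the single capacity number with hydrological simulation of each tributary, which is where the scale of the computing problem comes from.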

The system could eventually be used for more than flood prediction; it could also support applications such as irrigation management.

“Combining IBM’s complex system modeling with our research into river physics, we’ve developed new ways to look at an old problem,” said Ben Hodges, Associate Professor at the UT Austin Center for Research in Water Resources. “Unlike previous methods, the IBM approach scales up for massive networks and has the potential to simulate millions of river miles at once. With the use of river sensors integrated into web-based information systems, we can take this model even further.”

“We’re taking in a series of data, running it through a simulation and analytics engine, and getting an output of the flow and depth of the river. If you compare the flows and depths to the topography of the [surrounding] land, you can make a prediction of where flooding will occur,” said IBM researcher Fadi Gebara.
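The final step Gebara describes, comparing simulated flows and depths to the surrounding topography, can be sketched in a few lines. This is a hypothetical illustration, not the actual engine: the function name, the sample depths, and the bank elevations are all invented for the example.

```python
# Hypothetical sketch of the comparison step: given simulated water depths
# at points along a river and the bank elevation (from topography) at each
# point, flag the points where water would spill over the banks.

def flag_flood_points(depths: list, bank_heights: list) -> list:
    """Return the indices where simulated water depth exceeds bank height."""
    return [i for i, (d, b) in enumerate(zip(depths, bank_heights)) if d > b]

# Invented example: depths (m) and bank heights (m) at five river points
depths = [2.1, 3.4, 1.8, 4.0, 2.5]
banks = [3.0, 3.0, 2.5, 3.5, 3.0]
print(flag_flood_points(depths, banks))  # -> [1, 3]
```

In practice the comparison would run over a full digital elevation model rather than a handful of points, but the principle is the same: flooding is predicted wherever simulated water level exceeds the land around it.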

Photo: Flickr user marnanel

Gina-Marie Cheeseman

Gina-Marie is a freelance writer and journalist armed with a degree in journalism and a passion for social justice, including the environment and sustainability. She writes for various websites and has made the 75+ Environmentalists to Follow list.
