Flood Modeling


Another way GIS matters is by helping
people prepare for, and respond to, emergencies and disasters. In the video
prelude to this case study, you heard Professor David Maidment say that floods
claim more lives and cause more damage than any other kind of natural
disaster. Here we’ll consider flood prediction as an example of modeling
with Geographic Information Systems. In the late 1960s, US state and federal
government agencies began building Geographic Information Systems for flood
risk estimation and response. Lisa Warnecke, a longtime observer of GIS applications in state government, tells us that the Texas Natural Resource Information
System, one of the earliest state GISs in the country, was established in 1968
in response to flooding. This image shows flooding on the Texas
coast in the wake of Hurricane Beulah, which took 688 lives and caused a
billion dollars in damage in 1967. At about the same time, the National Flood
Insurance Act of 1968 created federal subsidies to help private property
owners pay for flood insurance. This map shows the distribution of subsidized
insurance policies by state in 2013. Demand for insurance among property
owners in flood-prone areas hastened the production of elevation data needed to
map floodways. This is how terrain contours used to be compiled – by skilled
operators who traced lines of constant elevation from ghostly stereoscopic
images of the terrain, projected from overlapping aerial photo
pairs. Established in 1979, the US Federal Emergency Management Agency took
responsibility for administering the National Flood Insurance Program. FEMA’s
responsibility included overseeing production of Flood Insurance Rate Maps,
or FIRMs. Eventually FIRMs became DFIRMs, as the maps went digital. FEMA has been active ever since in flood
risk mapping and public outreach. Hazus-MH is FEMA’s suite of software models that quantify human, property, financial, and social impacts of natural hazards, such as earthquakes, hurricanes, riverine and coastal floods, and tsunamis. Numerous flood prediction models are in use by state and local government agencies. And many private contractors provide spatial modeling and GIS services
related to flooding. To predict the impacts of a natural hazard, it’s
necessary to understand how the various processes that contribute to the hazard
actually work. Models are computational expressions of that understanding.
First and foremost, models designed to predict flooding from rivers and streams
need to account for the landscape characteristics of an area of interest,
including watershed size, terrain, soils, stream channel geometry, and others. Then the model needs to factor in forcing variables like rainfall
intensity and duration. GIS is useful in integrating the various data inputs that
flood models require. The Hazus-MH flood model, for example, is available as an
extension to ArcGIS. According to Kevin Mickey, lead instructor for FEMA’s
Emergency Management Institute, Hazus-MH offers a variety of options for defining flood hazards and modeling flood impacts. Have you worked with this or other GIS-coupled flood models? If so, please add a comment to this VoiceThread. To make sure we all know what flood modeling with GIS entails, let’s walk
through a workflow recommended by the United Nations Platform for Space-based
Information for Disaster Management and Emergency Response. Although there are
many different approaches to flood modeling and prediction, I chose this one to discuss with you because it is fairly well documented and it’s in the public domain. Like other models, this one involves multiple stages, each of which produces its own intermediate outputs. In fact, although
the diagram indicates that the output from workflow A becomes an input for
workflow B, that’s not exactly the case in the step-by-step demonstration of the
model that’s included in the recommended practice. A, B, and C are actually three
distinct workflows in this example. Workflow A calculates potential runoff
from a given landscape. Workflow B delineates a drainage network and
calculates stream flow direction and rates at various points along the
network. And workflow C produces an inundation polygon that represents the
area and depth of flooding for a given geography. Let’s look at these three
stages, one by one. The output of stage A is labeled “Multidata CN Grid.” CN stands
for Curve Number, the traditional name for an estimate of potential
precipitation runoff. To create the Curve Number Grid, the recommended practice
uses ArcGIS desktop software with the Spatial Analyst extension, and an
additional toolkit called HEC-GeoHMS. HEC-GeoHMS is the
Geospatial Hydrologic Modeling System developed by the US Army Corps of
Engineers’ Hydrologic Engineering Center. The integration of HEC-GeoHMS with
ArcGIS is an example of what’s called a “loosely coupled” model. Data requirements for stage A include USGS land cover data, SSURGO soils data, digital
elevation data, and a look-up table of curve numbers associated with land cover
categories. How familiar are you with the characteristics of these data sets?
You’ll have a chance to investigate them later. Notice that the elevation data input is labeled “Hydro DEM.” That’s a digital
elevation model that’s been processed to remove errors, like artificial sinks and
peaks, that can distort the flow surface. More about that in a minute. The CN grid
produced at this stage quantifies the amount of potential runoff at each grid
cell as a function of land cover, soils, and terrain.
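
If it helps to make the curve number idea concrete, here is a minimal Python sketch, separate from the recommended practice, that assigns curve numbers to a few hypothetical land cover and soil group combinations and applies the standard SCS runoff equation to each cell. The lookup values and the three-inch storm are invented purely for illustration; in the actual workflow, HEC-GeoHMS builds the CN grid by overlaying the land cover, soils, and lookup-table inputs.

    import numpy as np

    # Illustrative curve numbers by (land cover class, hydrologic soil group).
    # Real values come from published tables such as TR-55; these are made up.
    CN_LOOKUP = {
        ("forest", "B"): 60,
        ("pasture", "C"): 79,
        ("urban", "D"): 92,
    }

    def scs_runoff(p_inches, cn):
        """Potential runoff depth (inches) from the SCS curve number equation."""
        s = 1000.0 / cn - 10.0     # potential maximum retention after runoff begins
        ia = 0.2 * s               # initial abstraction (interception, infiltration)
        if p_inches <= ia:
            return 0.0
        return (p_inches - ia) ** 2 / (p_inches - ia + s)

    # A toy 2 x 3 "CN grid": each cell gets a curve number from the lookup table.
    cells = [("forest", "B"), ("pasture", "C"), ("urban", "D"),
             ("forest", "B"), ("urban", "D"), ("pasture", "C")]
    cn_grid = np.array([CN_LOOKUP[c] for c in cells]).reshape(2, 3)

    # Potential runoff depth per cell for a 3-inch storm: higher CN, more runoff.
    runoff = np.vectorize(scs_runoff)(3.0, cn_grid)
    print(cn_grid)
    print(runoff.round(2))
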
The CN grid is one of the inputs required for stage B. Workflow B of the recommended practice involves
ArcMap, Spatial Analyst, the Corps of Engineers’ Geospatial Hydrologic Modeling
System, and an Esri toolkit called Arc Hydro. Arc Hydro provides tools needed to
correct those anomalous sinks and peaks in DEMs.
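
To give a feel for what removing sinks means, here is a toy Python sketch that raises any single-cell pit up to the level of its lowest neighbor. This is not Arc Hydro’s algorithm, which also handles multi-cell depressions and flat areas; it only illustrates the idea on a tiny invented grid.

    import numpy as np

    # Toy DEM with an artificial one-cell sink at the center: 2.0 is lower than
    # every neighbor, the kind of artifact that distorts modeled flow paths.
    dem = np.array([[9.0, 8.0, 7.0],
                    [8.0, 2.0, 5.0],
                    [7.0, 5.0, 3.0]])

    def fill_single_cell_sinks(dem):
        """Raise each interior cell that is lower than all eight neighbors
        up to the level of its lowest neighbor (simplest possible sink fill)."""
        filled = dem.copy()
        rows, cols = filled.shape
        for r in range(1, rows - 1):
            for c in range(1, cols - 1):
                neighbors = filled[r - 1:r + 2, c - 1:c + 2].copy()
                neighbors[1, 1] = np.inf           # ignore the center cell itself
                lowest = neighbors.min()
                if filled[r, c] < lowest:          # the cell is a pit
                    filled[r, c] = lowest
        return filled

    print(fill_single_cell_sinks(dem))
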
It also enables you to calculate a Flow Direction Grid from the hydro DEM. The Flow Direction Grid
quantifies the direction of downstream flow from each raster cell to other cells in the grid. From the Flow Direction Grid, Arc Hydro generates a flow accumulation grid by calculating the number of upstream cells that flow into each cell. The larger the number of accumulated upstream cells in a given
cell, the more runoff will occur at that location. Are you following this OK?
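
If you would like to see those two grids in miniature, here is a rough Python sketch of the D8 technique, in which each cell drains to its steepest downslope neighbor and accumulation counts the cells upstream of each cell. It is not Arc Hydro’s implementation, the toy elevations and direction codes are shown only for illustration, and it assumes the DEM has already had its sinks filled.

    import numpy as np

    # Toy, sink-free DEM; a real Hydro DEM is a large raster.
    dem = np.array([[9.0, 8.0, 7.0],
                    [8.0, 6.0, 5.0],
                    [7.0, 5.0, 3.0]])

    # The eight D8 neighbor offsets with the direction codes many GIS packages use
    # (E=1, SE=2, S=4, SW=8, W=16, NW=32, N=64, NE=128).
    D8 = [(0, 1, 1), (1, 1, 2), (1, 0, 4), (1, -1, 8),
          (0, -1, 16), (-1, -1, 32), (-1, 0, 64), (-1, 1, 128)]

    def d8_flow_direction(dem):
        """Flow direction grid: each cell points at its steepest downslope neighbor."""
        rows, cols = dem.shape
        fdir = np.zeros(dem.shape, dtype=int)      # 0 = no downslope neighbor (outlet)
        for r in range(rows):
            for c in range(cols):
                best_drop, best_code = 0.0, 0
                for dr, dc, code in D8:
                    nr, nc = r + dr, c + dc
                    if 0 <= nr < rows and 0 <= nc < cols:
                        drop = (dem[r, c] - dem[nr, nc]) / np.hypot(dr, dc)
                        if drop > best_drop:
                            best_drop, best_code = drop, code
                fdir[r, c] = best_code
        return fdir

    def flow_accumulation(fdir):
        """Flow accumulation grid: the number of upstream cells draining into each cell."""
        rows, cols = fdir.shape
        offsets = {code: (dr, dc) for dr, dc, code in D8}
        acc = np.full(fdir.shape, -1, dtype=int)   # -1 marks "not yet computed"

        def count_upstream(r, c):
            if acc[r, c] >= 0:
                return acc[r, c]
            total = 0
            for dr, dc, _ in D8:
                nr, nc = r + dr, c + dc            # candidate upstream neighbor
                if 0 <= nr < rows and 0 <= nc < cols and fdir[nr, nc] in offsets:
                    fr, fc = offsets[fdir[nr, nc]]
                    if (nr + fr, nc + fc) == (r, c):   # the neighbor drains into (r, c)
                        total += 1 + count_upstream(nr, nc)
            acc[r, c] = total
            return total

        for r in range(rows):
            for c in range(cols):
                count_upstream(r, c)
        return acc

    fdir = d8_flow_direction(dem)
    print(fdir)
    print(flow_accumulation(fdir))
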
By setting a threshold value for flow accumulation, you can use Arc Hydro to
define the cells that correspond with stream channels. From the Stream Grid, Arc Hydro can segment the channel cells to represent individual stream reaches that
have unique IDs. Arc Hydro can then delineate drainage areas, or catchments,
for each segment by backtracking the flow direction surface from channels to
ridges, thus defining the area that drains to a specific location.
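
Continuing in the same toy style, here is a short sketch, again not Arc Hydro’s implementation, that defines stream cells by thresholding a small, hand-built flow accumulation grid and then delineates the catchment of a chosen outlet by tracing each cell’s flow path downstream to see whether it passes through that outlet. All of the grid values are invented for illustration.

    import numpy as np

    # D8 codes and the offsets they point to (same convention as the sketch above).
    OFFSETS = {1: (0, 1), 2: (1, 1), 4: (1, 0), 8: (1, -1),
               16: (0, -1), 32: (-1, -1), 64: (-1, 0), 128: (-1, 1)}

    # Toy flow direction and flow accumulation grids; code 0 marks the outlet cell.
    fdir = np.array([[2, 4, 8],
                     [1, 2, 4],
                     [1, 1, 0]])
    facc = np.array([[0, 0, 0],
                     [0, 4, 0],
                     [0, 1, 8]])

    # Stream definition: cells whose accumulation meets a user-chosen threshold.
    THRESHOLD = 3
    stream = facc >= THRESHOLD

    def catchment(fdir, outlet):
        """Cells whose D8 flow path eventually passes through `outlet` (row, col)."""
        rows, cols = fdir.shape
        mask = np.zeros(fdir.shape, dtype=bool)
        for r in range(rows):
            for c in range(cols):
                cr, cc = r, c
                for _ in range(rows * cols):          # follow the path downstream
                    if (cr, cc) == outlet:
                        mask[r, c] = True
                        break
                    if fdir[cr, cc] not in OFFSETS:   # reached a sink or edge outlet
                        break
                    dr, dc = OFFSETS[fdir[cr, cc]]
                    cr, cc = cr + dr, cc + dc
        return mask

    print(stream)                          # which cells count as stream channel cells
    print(catchment(fdir, outlet=(2, 2)))  # every cell here drains to the outlet
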
With all those grids as inputs, and with an additional grid that quantifies precipitation for a given period, the Corps of Engineers extension can generate a computational model that calculates stream flows for drainage
points in a stream network. In the process diagram shown here, “Outflow
Hydrograph” refers to the amount of discharge at a particular location over a given period of time.
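
An outflow hydrograph is just a time series of discharge, so a minimal example is easy to show. The hourly values below are invented for illustration; the point is that quantities like peak flow, time to peak, and total volume come straight from that series.

    import numpy as np

    # A toy outflow hydrograph: simulated discharge (cubic meters per second) at one
    # drainage point, recorded at hourly time steps.
    hours = np.arange(12)
    discharge = np.array([5, 8, 20, 55, 90, 70, 45, 30, 20, 12, 8, 6], dtype=float)

    peak_flow = discharge.max()                 # peak discharge, m^3/s
    time_to_peak = hours[discharge.argmax()]    # hours after the start of the record

    # Total volume passing the point: integrate discharge over time
    # (trapezoidal rule, 3600 seconds per hourly step).
    volume = np.sum(0.5 * (discharge[:-1] + discharge[1:]) * 3600.0)

    print(f"peak flow {peak_flow:.0f} m^3/s at hour {time_to_peak}")
    print(f"total volume {volume:.0f} m^3")
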
Keep in mind that the National Water Model we’ll discuss a
little later calculates 2.7 million hydrographs every hour. Workflow C
demonstrates how a flood “Inundation Polygon” can be calculated from digital
elevation data, digital representations of stream channel geometry, and
characteristics of the floodway – such as obstacles – that affect flow. This stage
uses another toolkit produced by the Corps of Engineers, called HEC-GeoRAS – the Geospatial River Analysis System. Required inputs to HEC-GeoRAS
include a high-resolution digital representation of terrain called a
Triangulated Irregular Network. Are you familiar with TINs? If not, don’t worry, you’ll have a chance to investigate them later in the lesson. In addition to
a TIN, the Geospatial River Analysis System requires a number of digitized
vector feature classes including stream center lines, river banks, stream
cross-sections, and structures, such as bridges and culverts. The connectivity of
stream segments needs to be specified explicitly. In other words, the model
needs to know the topology of the drainage network. The model also needs to
know the estimated upstream flows in the drainage network, such as the hydrographs
produced in stage B. To predict the flood Inundation Polygon, the model compares a
calculated water surface grid with a digital terrain grid derived from the
TIN. After the terrain grid is subtracted from the water surface grid, any cells left with positive values represent flooded areas.
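
The grid arithmetic at the end of workflow C is simple enough to show directly. Here is a toy Python sketch with made-up elevations that subtracts a terrain grid from a water surface grid and keeps the positive differences as flood depths; in the real workflow, the flooded cells would then be converted to the inundation polygon.

    import numpy as np

    # Toy terrain and water surface elevations (meters). In the recommended practice
    # these come from the TIN-derived terrain and the hydraulic model's water surface.
    terrain = np.array([[102.0, 101.0, 100.5],
                        [101.5, 100.0,  99.5],
                        [101.0,  99.0,  98.5]])
    water_surface = np.full(terrain.shape, 100.0)

    # Depth grid: water surface minus terrain; positive values are flooded cells.
    depth = water_surface - terrain
    flooded = depth > 0

    print(np.where(flooded, depth, 0.0))   # inundation depth, zero where dry
    print(flooded)                         # mask that would become the inundation polygon
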
Here we see the area and roads inundated by the Onion Creek flood, which
was discussed in the video we watched before the case study. Now, my description
of the three interrelated flood models was grossly simplified, of course. Even so,
the complexity of such models should be obvious. It follows that flood models are
computationally intensive. The 81-page step-by-step demonstration of the UN’s
recommended practice includes several warnings about time-consuming operations in ArcMap. Until recently, flood modeling and hazard assessment have been limited to basin or sub-basin scales due to the practical processing
limitations of desktop software and computers. That’s where the National
Flood Interoperability Experiment, or NFIE, described in the video, comes in. The
video describes a proof-of-concept for a national- scale streamflow model that’s
engineered to take advantage of distributed cloud computing capabilities.
A key element of NFIE was a simulation model called RAPID, the Routing Application for Parallel Computation of Discharge. Hmm,
what a name! RAPID is a river routing model comparable to those in the Corps
of Engineers’ Hydrologic Modeling System that was used in workflow B earlier in
this presentation. In addition to being engineered for high-performance
computing, the RAPID model uses the National Hydrography Dataset to
delineate the national stream network and its connectivity. RAPID relies on
GIS to prepare the stream network data for input, and to display maps and
hydrographs of its flow rate computations. Not long after the successful NFIE experiment, NOAA, the National Oceanic and
Atmospheric Administration, implemented a National Water Model that uses
data from more than 8,000 USGS streamflow gauges to produce hourly
simulations for 2.7 million stream reaches in the continental United States.
Previously, the press release reports, NOAA was only able to forecast
streamflow for four thousand locations every few hours. The National Water Model
is based on a framework called WRF-Hydro, which enables various component models
like RAPID to be coupled with other simulation models and observations that
together can simulate the complex physical processes responsible for
streamflow and floods. You can see a near real-time display of National Water
Model simulations in ArcGIS Online. This view combines fifteen 1-hour forecast
intervals, visualized by flow rate and by anomaly compared to normal monthly flow
values. The visualization is rendered from 40 million data rows that are fed
by NOAA data services. The WRF-Hydro framework is built to operate on the
high-performance computing network that’s being developed with support from
the US National Science Foundation. NSF envisions a distributed computing
environment it calls cyberinfrastructure that supports data-intensive research at government agencies and academic institutions around the country. Some envision a CyberGIS that brings geoprocessing and spatial statistics tools to high-performance computing environments for collaborative use by researchers across the network. Meanwhile, mainstream GIS architectures are maturing too, and the performance advantages of parallel
computing are becoming available in desktop software, like ArcGIS Pro. Of
course, flood prediction isn’t the only use of GIS that’s scaling up to
supercomputers in the cloud. Vast volumes of financial transaction data, social media interactions, and observations from ever-expanding
sensor networks create opportunities for insights in commercial, national defense,
infrastructure, and natural resources domains that only big data analytics can
realize. Data science is the emerging field that specializes in data-intensive
discovery. Yet, through all this, space and place still matter, and so does GIS,
whatever you may call it.
