Simplified Real-Time Hydromodels Using Spreadsheets

By Bruce Rindahl, Leonard Rice Engineers

As the hydrologic and hydraulic simulation programs available to engineers become more sophisticated and faster on today’s computers, the models themselves are becoming more detailed and complex. The basic hydrologic and hydraulic equations, however, have remained relatively unchanged (e.g., Horton’s equation, Manning’s equation). In 2006, the Urban Drainage and Flood Control District (District) began investigating the use of simplified hydrologic and hydraulic models for real-time flood prediction. Spreadsheets and internet technology provided an automated method of running the models using the District’s ALERT rainfall data as real-time input and permitted posting the results to a web interface.

A spreadsheet template was created in Excel to compute basin runoff based on the current design standards of the District’s Drainage Criteria Manual. Real-time rainfall in the template is computed from weighted ALERT rain gages instead of a design storm. ALERT rain gage data are obtained using Excel’s built-in Web Query capability. The Web Query interface also allows the user to specify an interval at which the data are automatically refreshed, ensuring the model reflects current rainfall estimates. One Excel worksheet is assigned to each basin in the model.
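As a rough illustration of the gage-weighting step, the basin-average rainfall for one time step is a weighted sum of the gage readings. The sketch below assumes hypothetical gage names and Thiessen-style weights, not the District’s actual values:

```python
# Minimal sketch: basin-average rainfall from weighted ALERT gages.
# Gage IDs and weights are illustrative only.

def basin_rainfall(gage_depths, weights):
    """Weighted average rainfall depth (inches) for one basin and time step.

    gage_depths: dict mapping gage id -> rainfall depth for the time step
    weights:     dict mapping gage id -> basin weight (weights sum to 1.0)
    """
    return sum(weights[g] * gage_depths[g] for g in weights)

depths  = {"gage_a": 0.40, "gage_b": 0.25, "gage_c": 0.10}
weights = {"gage_a": 0.5,  "gage_b": 0.3,  "gage_c": 0.2}
print(basin_rainfall(depths, weights))  # → 0.295
```

In the spreadsheet this is simply a SUMPRODUCT of the gage columns against a fixed weight row, recomputed each time the Web Query refreshes.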

Another template was created in the Excel spreadsheet to simplify channel routing by approximating more detailed routing techniques. Values from the basin runoff worksheets are linked to the routing worksheet, and a fully networked model is created. One Excel worksheet is assigned to this routing analysis.

The first model tested as a proof of concept was a HEC-1 model of Boulder Creek. This model used the SCS Curve Number methodology for runoff and the Muskingum method for routing. Since these methods could be accurately reproduced in the spreadsheet version, the two models gave identical results for the same design storm. The spreadsheet model has been run in real time on flood-potential days during the 2006 and 2007 flood seasons.
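The SCS Curve Number relation is simple enough to reproduce exactly in a worksheet cell, which is why the two models agree. As a minimal sketch (the rainfall depth and curve number below are illustrative):

```python
def scs_runoff(p_in, cn):
    """Direct runoff depth (inches) from rainfall depth via the SCS Curve Number method."""
    s = 1000.0 / cn - 10.0           # potential maximum retention (inches)
    ia = 0.2 * s                     # initial abstraction (standard assumption)
    if p_in <= ia:
        return 0.0                   # all rainfall absorbed before runoff begins
    return (p_in - ia) ** 2 / (p_in + 0.8 * s)

# Example: 3.0 inches of rain on a basin with CN = 80
print(scs_runoff(3.0, 80))  # → 1.25
```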

Based on the success of the methods developed for Boulder Creek, the next step was to use existing detailed studies to develop simplified models for real-time analysis. The question we were seeking to answer was, “How detailed a model do you need for real-time flood warning?” The test case chosen to answer this question was Harvard Gulch, located in Denver.

Ben Urbonas of the District had previously developed a SWMM model for the Harvard Gulch basin using 59 sub-basins and approximately 26 junctions and routing structures. To simplify the model, three design points were selected at critical locations, and the tributary sub-basins upstream of each were assigned a single rain gage for analysis. This resulted in a simplified model with three ‘aggregated’ basins.

The unit hydrographs for each basin were developed by adjusting the raingage data in the first time step to produce exactly one inch of runoff volume from the entire aggregated basin. The model computed runoff from this slug of rainfall for each individual sub-basin and routed it through the model to the selected design points. Since the definition of a unit hydrograph is the hydrograph produced by one inch of runoff, the model results at the various design points became the unit hydrograph for that basin.
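The one-inch calibration described above can be sketched numerically. The sketch below substitutes the SCS Curve Number relation for the SWMM infiltration routines actually used, and uses bisection to find the first-time-step rainfall depth that produces exactly one inch of runoff (the curve number is illustrative):

```python
# Hypothetical sketch of the unit-hydrograph calibration step:
# find the single-pulse rainfall depth yielding exactly 1 inch of runoff.

def scs_runoff(p_in, cn):
    """SCS Curve Number runoff (inches); stands in for the SWMM runoff routines."""
    s = 1000.0 / cn - 10.0
    ia = 0.2 * s
    return 0.0 if p_in <= ia else (p_in - ia) ** 2 / (p_in + 0.8 * s)

def rainfall_for_unit_runoff(cn, target=1.0, lo=0.0, hi=20.0, tol=1e-6):
    """Bisection: rainfall depth (inches) that produces `target` inches of runoff."""
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if scs_runoff(mid, cn) < target:
            lo = mid                 # not enough runoff yet; raise the rainfall
        else:
            hi = mid                 # too much runoff; lower the rainfall
    return 0.5 * (lo + hi)

p = rainfall_for_unit_runoff(cn=80)
print(round(p, 3), round(scs_runoff(p, 80), 3))  # → 2.658 1.0
```

Routing that one-inch slug through the detailed model then yields the unit hydrograph directly, as the article describes.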

The routing worksheet used Muskingum routing techniques to route the runoff hydrographs downstream. Calibration was performed by comparing results against the detailed SWMM model at downstream design points, and the Muskingum parameters were optimized in the Excel worksheet to match the SWMM output. At this point the simplified Excel model mirrored the results of the detailed SWMM model. To test these results, a standard 100-year design storm was input to both models and the results compared. Figure 1 shows the results of this analysis.
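A Muskingum routing step like the one in the routing worksheet can be sketched as follows. The K, X, and Δt values and the inflow hydrograph are illustrative, not the calibrated Harvard Gulch parameters:

```python
def muskingum_route(inflow, k, x, dt):
    """Route an inflow hydrograph with the Muskingum method.

    inflow: inflow ordinates (cfs) at interval dt
    k:      storage time constant (hours)
    x:      weighting factor (0 to 0.5)
    dt:     time step (hours)
    """
    denom = 2.0 * k * (1.0 - x) + dt
    c0 = (dt - 2.0 * k * x) / denom
    c1 = (dt + 2.0 * k * x) / denom
    c2 = (2.0 * k * (1.0 - x) - dt) / denom   # c0 + c1 + c2 = 1
    out = [inflow[0]]                          # assume initial outflow = initial inflow
    for i in range(1, len(inflow)):
        out.append(c0 * inflow[i] + c1 * inflow[i - 1] + c2 * out[-1])
    return out

hydrograph = [0, 100, 400, 300, 150, 50, 0]    # illustrative inflow (cfs)
routed = muskingum_route(hydrograph, k=0.5, x=0.2, dt=0.25)
print(round(max(routed)))  # → 251 (peak attenuated from 400 cfs)
```

Each outflow ordinate depends only on the two most recent inflows and the previous outflow, which is why the method maps naturally onto one spreadsheet row per time step.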

Figure 1: Harvard Gulch 100-year design storm

As shown in Figure 1, the peak flows from the two models agree within 10% and the times to peak are identical. Based on the excellent results of this calibration, a further analysis was performed to compare the two models using actual storm data. The storm selected occurred on July 8, 2001, when up to 3.55 inches of rain fell in a 90-minute period. Five USGS rain gages were available during this storm for analysis. Identical rainfall patterns were input to the two models and the results compared. Figure 2 shows the results of this analysis.

The July 8, 2001 storm results showed better correlation between the two models than the 100-year calibration run. The SWMM model had been calibrated to this particular storm, which had an estimated peak flow of 2,080 cfs (at the downstream end of Harvard Gulch Park) as determined by direct measurement of high-water levels (Source: Flood Hazard News, Vol. 13, No. 1, December 2001). This peak flow compares favorably with the model results, considering the attenuation and flow-splitting characteristics of the park.

Figure 2: Comparison of the models using the 7/8/2001 USGS rainfall data
Figure 3: Comparison of USGS rainfall data with ALERT rainfall data

Several District rain gages also recorded rainfall during this storm and a comparison run was made using the ALERT data versus the USGS rainfall data. This allows a glimpse into what the model would have shown in real-time operation. The result of this analysis is shown in Figure 3.

The comparison of the USGS rainfall data with the ALERT data also shows good agreement between the two model runs. The peak flow occurred 15 minutes sooner with the ALERT data than with the USGS data. This can be explained by the locations of the rain gages: the ALERT gages used in this analysis are generally located west of the USGS gages, and the storm of July 8, 2001 moved from west to east. Thus the ALERT gages measured the rainfall before the USGS gages as the storm center moved through the area.

Radar estimates of the rainfall from this storm were also developed by Vieux and Associates for each aggregated basin in the model. Meteorological radar analysis of a storm cell provides spatially distributed rainfall estimates that can be averaged over a specified basin. The benefit of radar estimates is the ability to estimate rainfall in basin locations where rain gages do not exist. Comparing radar-estimated rainfall to rain gage data is useful in determining how closely radar estimates match real-time gage data.

The radar estimates were compared to raingage data to show the difference between the two rainfall estimates during the July 8, 2001 Harvard Gulch storm. The result of this comparison is shown in Figure 4.

Figure 4: Comparison of Radar rainfall estimates with raingage estimates
Figure 5: Comparison of Radar rainfall estimates with rain gage estimates for the June 3, 2005 Storm

Interestingly, the radar estimates reflected a lower peak and volume for the July 8, 2001 storm than the ALERT rain gage data. Two explanations are possible. First, the most intense part of the storm appears to have passed directly over some of the ALERT and USGS rain gages (for some reason, rainfall events normally seem to deliberately avoid existing gage locations), so the gages measured the most intense rainfall amounts. In addition, some radar data were lost during part of the storm and were filled in using a mosaic of adjacent radar data streams. Other historic storms analyzed using these methods showed better correlation between the radar estimates and rain gage data.

An additional comparison of radar estimates and ALERT rain gage data was performed for the June 3, 2005 storm on Harvard Gulch at Logan Street to better determine how closely the two rainfall estimates match. This event was not as large as the July 8, 2001 storm, so the resulting flow in Harvard Gulch was smaller. The results of the comparison are shown in Figure 5 and indicate that the two rainfall estimates produce similar flows.

In conclusion, the simplified spreadsheet approach shows promise as a tool for accurately predicting potential flooding in real time. Several models were run during the 2007 flood season, and additional models are currently under development. New features are planned to better display real-time flooding estimates via the web interface.