Flood Evaluation, Hazard Determination and Risk Management (2020)
An example of the application of this approach is shown in Figure 2.12. The 1976
flood of the Mistassini River has been simulated by four successive floods, all
with an identical recession pattern. For each flood, the river flow and the flood
volume have been adjusted to fit the observations. The blue area represents the
simulated volume of the flood series. The various peaks can be reproduced
exactly, in magnitude and timing; the close fit of the recession curves confirms
that they belong to the same family. The periods of rising river flow show some
small discrepancies because short rainfall events were not considered; in
addition, a final small rainfall event in early July was not taken into account.
To within half a percent, the integration of the blue area indicates a volume of
4,000 × 10^6 m3, which corresponds to the official estimates.
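The integration of the blue area is numerically straightforward. The sketch below applies the trapezoidal rule to a daily discharge series; the hydrograph values are illustrative stand-ins, not the actual Mistassini 1976 record.

```python
# Sketch: integrating a daily discharge series (m3/s) to obtain the flood
# volume, as done for the blue area in Figure 2.12. Values are illustrative.

def flood_volume_m3(discharge_m3s, dt_s=86400.0):
    """Trapezoidal integration of a discharge series sampled every dt_s seconds."""
    return sum(0.5 * (a + b) * dt_s for a, b in zip(discharge_m3s, discharge_m3s[1:]))

# Illustrative hydrograph: 10-day rise to 3,000 m3/s, then a 20-day recession
rise = [500.0 + 250.0 * i for i in range(11)]            # 500 -> 3,000 m3/s
recession = [3000.0 * 0.9 ** i for i in range(1, 21)]    # exponential recession
volume = flood_volume_m3(rise + recession)
print(f"flood volume ~ {volume / 1e6:.0f} x 10^6 m3")
```

With real daily data, the same one-line integration reproduces the flood volume to within the accuracy of the discharge record.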
Figure 2.12 Observed and FIM-simulated Mistassini 1976 flood
As shown previously in this chapter, the flood recession follows a specific pattern
which is independent of the flood duration. Under this assumption, a large flood
should take more time to return to the normal conditions than a smaller flood
event; this will have an impact on the reconstitution of the flood hydrograph.
Figure 2.14 illustrates the reconstitution of the same flood hydrographs, but this
time considering the recession pattern of the drainage area. The flood peak and
volume are respected and the results appear more realistic.
For drainage areas where major floods are caused by a single event, the flood
peak discharge and the flood volume could have similar periods of recurrence.
However, for larger drainage areas with more complex flood conditions, the
situation is different: the longer the flood duration, the lower the correlation
between the peak discharge and the volume.
The flood evaluation for different return periods can be performed using a
deterministic approach, particularly for locations where observed discharge data
are unavailable or cover only a short period. The statistical and deterministic
analyses of the flood volume for rainfall events should lead to results in a
similar range; this is particularly the case for small drainage areas, for which a
single rainfall event is usually considered.
A deterministic approach is also mainly used to evaluate the Probable Maximum
Flood (PMF), considering various possible scenarios that maximize the
consequences on the system².
The following elements must be considered for applying similar approaches:
a. Main rainfall event
The main rainfall event causing the flood normally corresponds to the expected
flood return period; the higher the probability of such an event, the higher the
number of combinations that can generate a similar flood discharge or volume.
b. Antecedent events
Antecedent events are particularly important to establish the conditions prevailing
before the occurrence of the main rainfall event. This will have an impact on the
river discharge before the event and, even more important, on the soil moisture.
The more saturated the soil, the faster the response time of the system
(increasing the peak, but also the volume, for a specific duration). A similar
situation can be observed if a major rainfall event occurs when the soil is frozen.
Similarly, the base flow does not simply represent a reference river discharge to
which the peak flow is added, but is an integral part of the build-up of the flood
structure. The net peak (additional discharge over the base flow) generated by
an incoming flood volume is not a constant independent of the inflow pattern.
To evaluate the PMF, a large rainfall should be considered shortly before the
PMP to saturate the soil and ensure a maximum runoff.
c. Snow
The snow cover and the snowmelt period will have a direct impact on the flood
volume and the peak discharge. Both factors are important, since a rapid
snowmelt of a large snow cover will most likely generate large floods. When the
snow cover is an important part of the flood volume, the spring flood is very often
the largest one of the year, triggered by the combination of snowmelt and rainfall.
Deep snow covers increase the likelihood of large floods.
Before melting, the snow cover must be primed by warm temperatures bringing
it close to the melting point. A realistic temperature sequence must be developed
based on the observed conditions in the drainage area.
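A temperature-index (degree-day) formulation is one common way to sketch how such a temperature sequence translates into melt; the melt factor and temperatures below are illustrative assumptions, not values from this chapter.

```python
# Degree-day snowmelt sketch: no melt is released until temperatures rise above
# the base temperature, and total melt cannot exceed the snow water equivalent.

def degree_day_melt(temps_c, swe_mm, melt_factor=4.0, base_c=0.0):
    """Return the daily melt series (mm) for a pack of swe_mm water equivalent."""
    melt_series = []
    for t in temps_c:
        melt = max(0.0, melt_factor * (t - base_c))  # mm/day above freezing
        melt = min(melt, swe_mm)                     # cannot melt more than remains
        swe_mm -= melt
        melt_series.append(melt)
    return melt_series

# A warming sequence priming the pack, followed by sustained melt
print(degree_day_melt([-5, -2, 0, 2, 5, 8, 10], swe_mm=60.0))
```

The total of the returned series equals the initial snow water equivalent once the pack is exhausted, which is the volume-controlling property of the snow cover discussed above.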
d. Subsequent events
Events following the main event can have a significant impact on the flood volume
and its duration. The impact of the subsequent events is particularly significant
until the reservoir returns to its maximum operation level (MOL). The longer the
duration to return to the MOL, the more vulnerable the system is in case of a
new large flood.

² The PMF scenario with the maximum peak discharge is not automatically the
worst scenario (maximum water level) for a dam with regulation capability. A
spring PMF (with a lower peak discharge but a larger volume) can reach a higher
water level in a reservoir than a summer PMF (rainfall event).
The subsequent events often occur in period(s) when large discharges have been
observed. This is particularly true for the evaluation of the PMF. For floods with
lower return periods, however, it is not obvious how to determine a subsequent
sequence of events, since there is in general no direct relation between the main
rainfall event and the subsequent events.
e. Initial conditions
Since the objective of such studies is often to determine the maximum water
level in a reservoir corresponding to a specific period of recurrence, the initial
conditions of the system are an important factor. The expected storage volume
available before the flood will depend on the period of the year. Normally, the
MOL is considered for a rainfall flood event.
However, for a spring flood (with a large percentage of the flood volume
generated by the snow cover), the reservoir level and the mode of operation
during the first part of the flood (until it reaches the MOL) will depend on the
expected conditions at this time of the year. The volume available for flood routing
will be larger; it is therefore unlikely that the spillway will be operated at full
capacity at the beginning of the flood. It may even never reach this discharge at
all, because of the uncertainty related to the final flood volume.
f. Comments
The evaluation of the flood volume for rainfall events on large drainage areas or
for spring floods is complex, since it involves different events or conditions as
discussed above. While it is possible to identify the most likely scenario(s) to
generate a PMF, the number of scenarios to determine the 1:100, 1:1,000 or
1:10,000 year flood is almost infinite, since the combination of events leading to
lower return periods depends on too many combinations of parameters.
Comparison of the results from flood statistical analyses of spring flood volume
and deterministic analyses of the PMF volume can lead to inconsistencies. For
example, the extrapolation of the flood volume for a 1:10,000 year return period
can be higher than the volume of the PMF. Several explanations can be proposed:
- The statistical analyses overestimate the flood volume. The number of
recorded floods (usually a few tens) considered in a flood extrapolation
to the range of 1:1,000 years or more does not always guarantee a high
quality estimation, since the trends are not always well defined;
- Some of the parameters used in the deterministic analyses were
underestimated (for instance subsequent events). Since a period of
analysis of several weeks can follow the PMP, it is difficult to consider
realistic rainfall events during this period;
- Most probably a combination of both factors.
2.11. STOCHASTIC MODELING
The situation becomes even more complex for large drainage areas, since spatial
correlations and sometimes orographic effects at different locations must also be
considered. At the same time, the probability of deficiencies in the system and
“human” actions can play a significant role in the spatial and temporal evolution
of the flood.
The accuracy of the physical model(s) to represent large flood events must also
be considered, because of the limitations on the information available to calibrate
the model for such large floods. Usually a large number of assumptions (explicit
or implicit) are made at the basis of such a model; this of course generates larger
uncertainties.
Data on maximum floods observed in several countries around the world date
back to 1984, when the International Association of Hydrological Sciences (IAHS)
published the “World Catalogue of Maximum Observed Floods”. In addition, the
ICOLD Committee on Dams and Floods published, in 2003, Bulletin 125 on Dams
and Floods, which contributed further significant data on maximum floods, mainly
for dams and reservoirs. More recently, in 2014, a new and more extensive
review was carried out on the data of peak flows and volumes of maximum floods.
For the analysis of the peak discharge, the envelope curves method with the
Francou-Rodier (F-R) equation can be used. The F-R equation is the relationship
between the peak flow and the catchment area:
Q / Q0 = (A / A0)^(1 − K/10)

where:
- Q = peak flow (m3/s)
- A = catchment area (km2)
- Q0 = 10^6 m3/s
- A0 = 10^8 km2
- K = Francou-Rodier coefficient
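Evaluating the envelope for a given catchment is then a one-line computation. The sketch below assumes the reconstructed form Q/Q0 = (A/A0)^(1 − K/10) with the reference values listed above; the worked example is illustrative.

```python
# Francou-Rodier envelope relation: Q/Q0 = (A/A0)^(1 - K/10)
Q0_M3S = 1e6    # reference peak flow Q0, m3/s
A0_KM2 = 1e8    # reference catchment area A0, km2

def francou_rodier_peak(area_km2, k):
    """Envelope peak discharge (m3/s) for a catchment area and F-R coefficient k."""
    return Q0_M3S * (area_km2 / A0_KM2) ** (1.0 - k / 10.0)

# Illustrative use: a K = 6 envelope for a 10,000 km2 catchment
print(f"{francou_rodier_peak(1e4, 6.0):.0f} m3/s")
```

By construction, a catchment of area A0 returns exactly Q0 regardless of K.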
The database on flood volumes comes from the surveys carried out by ICOLD; it
consists of 187 records on volume of maximum floods in dams and reservoirs
from the 15 most significant countries in this field.
The methodology used is similar to that used for the analysis of the peak flows,
assessing the relationship between the flood volumes and the catchment area,
through the equation:
V / V0 = (A / A0)^(2 − Kv/10)

where:
- V = flood volume (hm3)
- V0 = 50 × 10^6 hm3
- A0 = 10^8 km2
- Kv = coefficient of flood volume
Figure 2.16 shows the relationship between the flood volume and the catchment
area for the floods analysed with the available data. It defines an envelope curve
of the extreme flood volumes with a value of Kv = 10.5. The highest value was
recorded in Brazil (Tocantins reservoir), with Kv = 10.5, during a 1980 flood.
Figure 2.17 shows the relationship between specific volume and the catchment
area. The specific volume, a measure of the volume generated per unit of the
catchment area, is expressed by:

Vs = V / A

where:
- Vs = specific volume (mm)
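The unit conversion implicit in quoting Vs in millimetres is worth making explicit: with V in hm3 (10^6 m3) and A in km2 (10^6 m2), V/A comes out in metres, so a factor of 1,000 gives mm. The sketch below also includes the volume envelope from the reconstructed relation; both functions and the worked numbers are illustrative.

```python
# Specific volume Vs = V/A with explicit units:
# V [hm3] / A [km2] = 1e6 m3 / 1e6 m2 = 1 m, hence the factor 1000 for mm.
def specific_volume_mm(v_hm3, area_km2):
    return 1000.0 * v_hm3 / area_km2

# Envelope flood volume from V/V0 = (A/A0)^(2 - Kv/10)
V0_HM3 = 50e6   # reference volume V0, hm3
A0_KM2 = 1e8    # reference catchment area A0, km2

def envelope_volume_hm3(area_km2, kv):
    return V0_HM3 * (area_km2 / A0_KM2) ** (2.0 - kv / 10.0)

v = envelope_volume_hm3(1e5, 10.5)   # Kv = 10.5 envelope for a 100,000 km2 area
print(f"V ~ {v:.3e} hm3, Vs ~ {specific_volume_mm(v, 1e5):.0f} mm")
```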
It is widely recognized that climate change will increase the variability of extreme
events. The increase in air temperature will have an impact on the maximum
rainfall that can be observed in several regions of the world; this will in turn have
a direct impact on the floods peak discharge and the flood volume.
In northern areas and for spring floods, the impact on the volume of major floods
will generally be less important than the impact on the peak flows, since
projected reductions in snowpack volume may partially offset the expected
increases in rainfall (Ouranos, 2015). In this case, the flood volume could be
similar but may occur over a shorter period, since the snowmelt season will
possibly be shorter (which could lead to higher peak discharges). However, this
conclusion cannot be generalized, because regional conditions vary significantly
across the world. Some recent meteorological events will probably have some
impact on our understanding of their characteristics and consequences⁴.
⁴ For example, Hurricane Harvey released about 1,300 mm of rain in the Houston
area (USA) in 2017. The hurricane remained stationary for a few days, moved
away from the area and came back a few days later.

2.14. RECOMMENDATIONS
2.15. REFERENCES
Bacchi, B., Brath, A., Kottegoda, N.T., “Analysis of the Relationships Between
Flood Peaks and Flood Volumes Based on Crossing Properties of River Flow
Processes”, Water Resources Research, Vol. 28, No. 10, pp 2773-2782, October
1992
Carter, R.W., Godfrey. R.G., “Storage and Flood Routing”, Manual of Hydrology:
Part 3. Flood-Flow Techniques, GEOLOGICAL SURVEY WATER-SUPPLY
PAPER 1543-B, Methods and practices of the Geological Survey, 1960.
Gaál, L., Szolgay, J., Kohnová, S., Hlavčová, K., Parajka, J., Viglione, A., Merz
R., and Blöschl, G., “Dependence Between Flood Peaks and Volumes: A Case
Study on Climate and Hydrological Controls”, Hydrological Sciences Journal, 60
(6) 2015.
Joos B., “Flood Integration Method (FIM)”. ICOLD proceedings, Stavanger, 2015
Louie, P.Y.T. and Hogg, W.D. “Extreme Value Estimates of Snowmelt”, Canadian
Hydrology Symposium, pp 64-76, 1980
Molini, A., Katul, G.G., and Porporato, A., “Maximum Discharge from Snowmelt
in a Changing Climate“, Geophysical Research Letters, VOL. 38, L05402, 2011
Pramanik, N., Panda, R.K., Sen, D., “Development of Design Flood Hydrographs
Using Probability Density Functions”, Hydrological Processes, Hydrol. Process.
24. 415-428 (2010).
SNC-Lavalin Inc., “Gestion du réservoir Gouin – Étude complémentaire”, Juin
2001
Wang, C., "A Joint Probability Approach for the Confluence Flood Frequency
Analysis", Retrospective Theses and Dissertations, Paper 14865, Iowa State
University, 2007
Yue, S., Ouarda, T.B.M.J., Bobée, B., Legendre, P., Bruneau, P., "The Gumbel
Mixed Model for Flood Frequency Analysis", Journal of Hydrology, Vol. 226,
Issues 1–2, pp 88–100, December 1999
Yue, S., Ouarda, T.B.M.J., Bobée, B., Legendre, P., Bruneau, P., "Approach for
Describing Statistical Properties of Flood Hydrograph", Journal of Hydrologic
Engineering, Vol. 7, No. 2, p 147, 2002,
http://ascelibrary.org/doi/abs/10.1061/(ASCE)1084-0699(2002)7:2(147)
3. STOCHASTIC APPROACH TO FLOOD HAZARD
DETERMINATION
3.1. INTRODUCTION
The traditional concept of the Inflow Design Flood (IDF) has been and is still being
used to size the dam and its designated flood discharge facilities (i.e. spillway,
low level outlets) so that the dam could safely pass either a flood of pre‐
determined probability of exceedance or the Probable Maximum Flood (PMF).
The IDF standard is directly linked to the dam hazard classification so that low
hazard dams are designed using smaller IDF than high hazard dams. For high
(or extreme) hazard dams, two general world trends have developed (ICOLD,
2003):
1. USA, UK, Canada, Australia and countries under their economic and
technological influence use the PMF methodology. The PMF is defined as the
most severe “reasonably possible” combination of rainfall, snow accumulation,
air temperatures, and initial watershed conditions. The PMF is a deterministic
concept and its probability of occurrence cannot be determined. Theoretically, it
represents the upper physical flood limit for a given watershed at a given season.
In reality, PMF estimates are typically lower than the theoretical upper limit by
some variable amount that depends on the available data, the chosen
methodology and the analyst’s approach to deriving the estimate (Micovic et al.,
2015).
2. Most European countries use probabilistic methods to derive an inflow
flood characteristic (typically peak flow of certain duration) with return periods
ranging from 1,000 to 10,000 years.
For lower‐hazard dams, the IDF selection criteria vary but typically include either
a percentage of the PMF or return periods shorter than 1,000 years (ICOLD,
2003).
For such dams and dam systems, the IDF, by characterizing only the inflow to
the reservoir, does not provide the necessary information (i.e. magnitude and
probability) on the flood hazard in terms of the hydraulic forces acting on the dam
itself (peak reservoir level). The commonly used solution to this problem is to
route the IDF through the reservoir and determine the resulting peak reservoir
level, and thereby obtain at least some information (magnitude but not probability)
on the flood hazard acting on the dam. In addition, the IDF concept typically
assumes that, during an extreme flood, everything operates according to plan,
i.e. accurate reservoir level measurements, spillway gates opening as required,
necessary personnel available on site, and communication lines fully functioning.
In other words, the IDF concept does not address the possibility of an
“operational flood”, in which a dam could fail due to the combination of a flood
much smaller than the IDF and one or more operational faults.
Clearly, risk informed decision making for dam safety requires more than the IDF
concept. In order to have any scientifically‐based idea of the probability of dam
overtopping due to floods, it is necessary to focus on estimating probabilities of
peak reservoir level. The process can be described as follows:
- At the end of the process, the reservoir outflow and associated peak
reservoir level have different exceedance probability than the reservoir inflow that
started the process.
Note that the peak reservoir level, unlike the reservoir inflow, is not a natural and
random phenomenon and its probability distribution cannot be computed
analytically (e.g. by using statistical frequency analysis methods). The probability
of the peak reservoir level is the combination of probabilities of all factors that
influence it, including reservoir inflows, initial reservoir level, reservoir operating
rules, system components failure, human error, measurement error, as well as
unforeseen circumstances. Thus, the approach to estimating the full probability
distribution of the peak reservoir level consists of some kind of stochastic
simulation that includes as many of these factors and scenarios as possible. It is
a complex multi‐disciplinary analysis which is currently beyond technical
capabilities of some dam owners. However, without it, the proper risk‐informed
dam safety management is not possible. The main goal of stochastic simulation
approach to flood hazard is to carry out probabilistic analysis of various flood
characteristics (inflow, outflow, peak reservoir level) resulting from floods on a
dam system and derive the continuous probability distributions which could then
be used to evaluate exceedance probabilities of various reservoir levels including
the level corresponding to the dam crest (dam overtopping level) as well as the
level resulting from the PMF. That way, different design criteria could be
considered and evaluated at various flood frequency levels, thereby departing
from widely used strict “pass/fail” deterministic design criteria.
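The final evaluation step can be sketched with a simple empirical count, assuming a long series of simulated annual maximum reservoir levels is already available; the synthetic levels and the crest elevation below stand in for real simulation output.

```python
# Sketch: empirical exceedance probabilities from simulated annual maximum
# reservoir levels. The levels are synthetic illustrations, not real results.
import random

random.seed(1)
sim_levels = [102.0 + random.gauss(0.0, 0.8) for _ in range(50000)]  # m a.s.l.

def exceedance_probability(levels, threshold):
    """Fraction of simulated years whose peak level exceeds the threshold."""
    return sum(1 for x in levels if x > threshold) / len(levels)

crest = 104.5   # hypothetical dam crest level
print(f"P(peak level > {crest} m) ~ {exceedance_probability(sim_levels, crest):.1e}")
```

Evaluating the same function at the PMF-derived level, or at any candidate design level, yields the corresponding annual exceedance probability directly.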
- Rainfall magnitude and its spatial and temporal distributions over the
watershed (typically provided in form of Probable Maximum Precipitation)
Another thing that all stochastic approaches to flood hazard for dam safety have
in common is the use of a deterministic watershed model (i.e. rainfall‐runoff
model) to convert rainfall and snow/glacier melt into watershed runoff, which
ultimately becomes the reservoir inflow. In terms of watershed model simulation,
the stochastic flood hazard methods could employ:
Note that this approach does not generate daily rainfall amounts greater than
those observed in the historical record; however, the technique of resampling with
replacement creates different temporal patterns resulting in multi‐day rainfall
amounts higher than those observed in the historical record. A watershed model
is used to calculate runoff from this synthetic rainfall and temperature series, and
the runoff is then routed using a hydrodynamic model to account for complexities
associated with retention and flooding along particular river stretches. This
procedure yields the continuous record of 50,000 years of daily discharges at or
near the points where rivers Rhine and Meuse enter the Netherlands. Finally, the
flood values for the various return periods are obtained by ranking the annual
maximum discharges in the generated 50,000‐year sequence in the ascending
order, where the rank in this ordered set determines the return period. The main
source of uncertainty in the GRADE method is the relatively short length of the
historical precipitation and temperature series used in the stochastic weather
generator. Using less than 100 years of historical data to generate 50,000 years
of synthetic data affects the ability to accurately capture year‐to‐year variability
over the long periods of time. For instance, resampling from a relatively wet
baseline series will result in relatively wet long‐duration synthetic series, which in
turn will increase uncertainty associated with derived flood discharge values,
especially for higher return periods. This large uncertainty is reflected in the
GRADE flood frequency results for the Meuse River, where the best estimate of
10,000‐yr flood of 4,400 m3/s is given with the 95% uncertainty range of 3,250 to
5,550 m3/s.
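The ranking step itself can be sketched as follows. The Weibull plotting position T = (N + 1)/rank, with rank 1 assigned to the largest value, is one common convention; it is equivalent to the ascending-order description above up to the ordering direction. The discharge values are illustrative.

```python
# Sketch: empirical return periods from ranked annual maxima (Weibull plotting
# position T = (N + 1) / rank, rank 1 = largest value).

def empirical_return_periods(annual_maxima):
    n = len(annual_maxima)
    ranked = sorted(annual_maxima, reverse=True)   # rank 1 = largest
    return [(q, (n + 1) / rank) for rank, q in enumerate(ranked, start=1)]

for q, t in empirical_return_periods([1200.0, 950.0, 1800.0, 1100.0]):
    print(f"{q:7.1f} m3/s  ->  T = {t:.2f} yr")
```

In a 50,000-year synthetic series the largest value is thus assigned a return period of about 50,000 years, which is why such long simulations allow the 10,000-year flood to be read off without extrapolation.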
Finally, there are stochastic approaches that fit somewhere in between event‐
based simulation and continuous simulation – they could be called semi‐
continuous or hybrid approaches. They utilize a watershed model that has been
calibrated to satisfactorily represent hydrological behaviour of the watershed over
a long continuous period for which historical record of climate input data is
available. This creates a continuous database of watershed initial conditions
which can be stochastically sampled at any time/season of the year and
combined with a rainfall event of a certain duration and probability, sampled from
rainfall magnitude‐frequency curve. The end result is thousands of flood
hydrographs ranging in magnitudes from common to extreme. The advantage
over event‐based simulation approach is that statistical distributions of initial (pre‐
storm) watershed conditions are likely more realistic since they do not have to be
arbitrarily assumed. The advantage over continuous simulation approach is that
there is no need to carry out the difficult task of generating thousands of years of
continuous synthetic rainfall and temperature sequences of questionable
accuracy. Examples of semi‐continuous or hybrid stochastic flood models are
SEFM (Schaefer and Barker, 2002) and SCHADEX (Paquet et al., 2013).
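The core sampling idea of the hybrid approach can be sketched as pairing a randomly drawn historical initial condition with a rainfall depth drawn from a frequency curve. All names, distributions and parameters below are illustrative assumptions, not SEFM or SCHADEX internals.

```python
# Hybrid-approach sketch: sample (initial watershed state, storm depth) pairs.
import math
import random

random.seed(42)

# Stand-in for the continuous-simulation library of dated initial conditions
initial_states = [{"soil_moisture": random.uniform(0.2, 0.9),
                   "baseflow_m3s": random.uniform(20.0, 120.0)}
                  for _ in range(1000)]

def sample_rainfall_mm(u=None):
    """Inverse-transform sample from an assumed Gumbel rainfall-frequency curve."""
    if u is None:
        u = random.random()
    mu, beta = 80.0, 25.0   # assumed Gumbel location/scale, mm per 72 h
    return mu - beta * math.log(-math.log(u))

for state, rain in [(random.choice(initial_states), sample_rainfall_mm())
                    for _ in range(5)]:
    print(f"rain = {rain:6.1f} mm, soil moisture = {state['soil_moisture']:.2f}")
```

Each sampled pair is then run through the calibrated watershed model, producing one member of the ensemble of flood hydrographs described above.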
There are three distinct aspects of stochastic flood simulation for a hydroelectric
system consisting of a single or multiple dams and reservoirs.
1. Simulation of natural runoff from the local watershed and inflow into the
reservoir.
2. Simulation of reservoir operating rules (if any), i.e. flood routing for a
single reservoir or a system of multiple dams and reservoirs.
3. Simulation of on‐demand availability of various system components such
as failure of different discharge facilities, telemetry errors, human operator errors,
or some combination of those.
Ideally, all three aspects are combined within the stochastic simulation
framework, and multi‐ thousand years of extreme storm and flood annual maxima
are generated by computer simulation. The simulation for each year contains a
set of climatic and storm parameters that are sampled through Monte Carlo
procedures based on the historical record and collectively preserved
dependencies among different hydrometeorological inputs. Execution of a
rainfall‐ runoff model combined with reservoir routing of the inflow floods through
the system and stochastically modelled failure/availability of various system
components provides the computation of a corresponding multi‐thousand year
series of annual flood maxima. Simulated flood characteristics such as peak
inflow, maximum reservoir release, inflow volume, and maximum reservoir level
are the parameters of interest.
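One simulated year of such a framework can be sketched end to end: sample a storm and a watershed state, form an inflow hydrograph with a toy runoff model, and route it through a single linear reservoir. Every number below is an illustrative assumption, not a calibrated model of any real system.

```python
# Sketch: one stochastic "year" producing annual maxima of inflow and outflow.
import math
import random

random.seed(7)

def simulate_year(area_km2=1000.0):
    rain_mm = 50.0 + random.expovariate(1.0 / 40.0)      # sampled storm depth
    runoff_coeff = random.uniform(0.3, 0.8)              # sampled watershed state
    # Crude peak inflow: runoff depth over the area released in about a day (m3/s)
    peak_in = runoff_coeff * rain_mm * area_km2 / 86.4
    inflow = [peak_in * math.exp(-abs(t - 5) / 2.0) for t in range(20)]  # daily

    # Level-pool routing through a linear reservoir: outflow = storage / k
    k_s, dt_s = 3.0 * 86400.0, 86400.0
    storage, out_peak = 0.0, 0.0
    for q_in in inflow:
        storage += (q_in - storage / k_s) * dt_s
        out_peak = max(out_peak, storage / k_s)
    return max(inflow), out_peak

annual = [simulate_year() for _ in range(1000)]
print(max(q for q, _ in annual), max(o for _, o in annual))
```

Repeating the loop for tens of thousands of years, and replacing the toy components with a real watershed model, reservoir operating rules and component-availability sampling, yields the multi-thousand-year series of annual flood maxima described above.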
However, it is extremely difficult if not impossible to accurately cover all three
aspects of stochastic flood simulation due to the enormous complexity of a
dam/reservoir system and all possible interactions among its components. That
is why in practical applications not all flood producing factors are modelled to the
same extent ‐ some are treated as stochastic variables, and some are fixed or
not modelled at all. For instance, the third aspect of flood hazard simulation
mentioned above (stochastic simulation of on‐demand availability of various
system components) is rarely carried out due to complexities and difficulties in
describing probability distributions of variables such as spillway gate failure or
human error. A recent study by Micovic et al. (2016) attempted to cover all three
aspects of stochastic flood hazard simulation on a system of three dams and
reservoirs with seasonally fluctuating reservoir levels and active discharge control
systems. The aim of the study was to examine how the inclusion of spillway gate
failures likelihood functions in the stochastic flood modelling framework affects
the probability of dam overtopping. The results indicated that dams are much
more likely to be overtopped due to an unusual combination of relatively common
individual events than due to a single extreme flood event. All three aspects of
the stochastic simulation framework are discussed in the following sections.
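The rarely modelled third aspect can nevertheless be sketched very simply for the gate-availability case; the per-gate failure probability below is an illustrative assumption, and studies such as Micovic et al. (2016) derive such likelihood functions from component reliability data.

```python
# Sketch: sampling on-demand availability of spillway gates per simulated flood.
import random

random.seed(3)

def available_discharge_capacity(n_gates=4, per_gate_m3s=500.0, p_fail=0.05):
    """Each gate independently fails to open on demand with probability p_fail."""
    working = sum(1 for _ in range(n_gates) if random.random() >= p_fail)
    return working * per_gate_m3s

caps = [available_discharge_capacity() for _ in range(10000)]
print(f"mean capacity ~ {sum(caps) / len(caps):.0f} m3/s, worst = {min(caps):.0f} m3/s")
```

Feeding the sampled capacity into the flood-routing step is what allows a moderate flood combined with one or two failed gates to overtop a dam in the simulation, reproducing the "operational flood" mechanism discussed earlier.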
Figure 3.1 shows an example of this procedure applied to the 1,463 km2 Campbell
River watershed on Vancouver Island, BC, Canada and using the 72‐hr storm
duration. The procedure resulted in identification of 69 significant storms within
the 1896‐2009 period. A probability‐ plot was developed using numeric storm
dates (9.0 is September 1st, 9.5 is September 15th, 10.0 is October 1st, etc.) and
it was determined that the seasonality data could be well described by a Normal
distribution. A frequency histogram was then constructed based on the fitted
Normal distribution to depict the twice‐monthly distribution of the dates of
significant storms for input into a stochastic simulation framework.
Figure 3.1 Probability plot and frequency histogram of storm seasonality for the
Campbell R. watershed in Canada
Figure 3.1 shows that significant historical storms have occurred in the period
from early October through about mid‐March with a mean date of December 21st.
The probability of occurrence of a storm for any given mid‐month or end‐of‐month
can be determined from the incremental bi‐ monthly frequencies depicted in the
Figure 3.1 histogram (e.g. zero probability for September mid‐month, and
probability of 0.0228 for September end‐month).
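Reading probabilities off the fitted seasonality distribution can be sketched with the numeric-date convention above (9.0 = September 1st, 10.0 = October 1st, and so on). The mean of December 21st corresponds to about 12.67; the standard deviation below is an assumed value chosen only so that the result matches the quoted end-of-September probability.

```python
# Sketch: storm-date probability from the fitted Normal seasonality distribution.
import math

def normal_cdf(x, mu, sigma):
    return 0.5 * (1.0 + math.erf((x - mu) / (sigma * math.sqrt(2.0))))

mu, sigma = 12.67, 1.335          # mean numeric storm date; assumed spread
p_end_sep = normal_cdf(10.0, mu, sigma)   # storm occurring by end of September
print(f"P(date <= Oct 1) ~ {p_end_sep:.4f}")
```

With these assumed values the computed probability is about 0.0228, consistent with the histogram; the incremental bi-monthly probabilities follow as differences of the CDF at successive mid-month and end-of-month dates.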
Generally speaking, floods could result from storms of various durations. This is
especially true for very large watersheds (e.g. > 10,000 km2), where significant
floods could originate from intense short‐duration storms covering only a part of
the watershed, or from a wide‐spread general synoptic storm of longer duration
that covers the entire watershed area.
The direction of the storm and its speed of movement over the watershed is also
an important factor. Consequently, a proper stochastic flood modelling process
should sample rainfall storms of different durations from their respective
frequency distributions, and weight the modelled floods according to the
observed frequency of the different storm durations used to produce them.
Applying this approach to the Campbell River watershed above Strathcona Dam
on Vancouver Island, BC, Canada included assembling storm data from all
locations that were climatologically similar to the Campbell River region.
Precipitation annual maxima series data were assembled for the critical duration
(72‐hour in this case) from all stations on Vancouver Island and stations between
latitude 47° and 52° N from the Pacific Coast eastward to the crest of the Coastal
Mountains (Canada) and Cascade Mountains (USA). This totaled 143 stations
and 6,609 station‐years of record for stations with 25‐years or more of record.
The precipitation‐frequency relationship (Figure 3.2) was developed through
regional L‐ moment analyses of point precipitation and spatial analyses of
historical storms to develop point‐to‐area relationships and determine basin‐
average precipitation for the watershed using the 4‐parameter Kappa distribution.
The uncertainty bounds were developed through Latin‐hypercube sampling
method (McKay et al., 1979; Wyss and Jorgenson, 1998) where regional L‐
moment ratios and Kappa distribution parameters were varied to assemble 150
parameter sets and perform Monte Carlo simulation using different probability
distributions for individual parameters.
Figure 3.2 Computed 72‐hour precipitation‐frequency curve and 90%
uncertainty bounds for the 1193 km2 Strathcona Dam watershed
The process of stochastic storm generation requires both spatial and temporal
storm templates that are scalable. The spatial and temporal storm templates are
linearly scaled by the ratio of the desired basin‐average precipitation of certain
duration to the basin‐average precipitation of the same duration observed in a
selected storm template (i.e. prototype storm). These storm templates should be
prepared from as many historically observed storms as possible in order to
capture diversity among storms in terms of spatial and temporal distribution of
precipitation. Typically, 10 to 20 storm templates should be enough to capture
storm diversity over a given watershed, for watershed sizes up to about
5,000 km2. Larger watersheds should be divided into zones of a size suitable for
describing the spatial and temporal variability of storm types that may affect the
watershed on a given day, with separate storm analyses carried out for each zone
within a watershed.
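The linear scaling of a storm template mentioned above amounts to multiplying an observed prototype hyetograph by a single ratio, preserving its temporal pattern. The increments below are illustrative, not an actual prototype storm.

```python
# Sketch: linear scaling of a prototype storm to a sampled basin-average depth.

def scale_storm(template_mm, target_depth_mm):
    """Scale hyetograph increments so their total equals target_depth_mm."""
    factor = target_depth_mm / sum(template_mm)
    return [x * factor for x in template_mm]

prototype = [2.0, 8.0, 15.0, 10.0, 5.0]   # illustrative 72-h increments, mm
scaled = scale_storm(prototype, 200.0)    # depth sampled from the frequency curve
print([round(x, 1) for x in scaled], "total =", round(sum(scaled), 1), "mm")
```

The same ratio is applied to the spatial template, so the prototype's spatial pattern is preserved while the basin-average depth matches the sampled value.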
This procedure leads to the identification of the time span during which there was
a continuous influx of atmospheric moisture from the same air mass where
precipitation was produced under similar synoptic conditions. The identified time
span provides the starting and ending times for the precipitation segment that is
independent of surrounding precipitation and scalable for stochastic storm
generation. An example of this type of temporal storm template is shown in
Figure 3.5 which depicts the observed 10‐day period of basin‐average
precipitation for the storm of October 14‐23, 2003 for the Strathcona Dam basin,
with the portion of the hyetograph (in blue) that was identified as the independent
scalable segment of the storm and therefore adopted for use as a prototype storm
for stochastic storm generation.
Figure 3.4 Spatial storm template (October 1984 storm)