Contamination of Water Distribution Systems

Walter M. Grayman, PhD, PE
Abstract
It is widely accepted that our water supply systems are susceptible to contamination –
intentional or accidental. When a contaminant enters the water distribution system, depending
upon the point of entry, the type and quantity of contaminant, the duration of entry, and the
operation and design of the water system, it may travel short distances or may reach large
portions of the distribution system. This paper discusses the following issues related to
contamination of distribution systems:

- the movement and transformation of contaminants in the distribution system
- methods (models) for predicting the movement of contaminants and how these models can be used to mitigate the consequences
- the use of contamination warning systems as a mechanism for responding to an event and reducing the impacts of the contamination event
- siting of monitors in the distribution system to detect the contamination
- design of distribution systems to reduce impacts if a contaminant enters the system
- decontamination of a contaminated distribution system
Introduction
Water distribution systems are inherently susceptible to both intentional and accidental
contamination.
As shown in Figure 1, there are numerous points at which a
contaminant may enter the distribution system. These range from major components
such as treatment plants, pumps and tanks, to individual hydrants and customer
connections. Since it is virtually impossible to fully protect all
potential points of entry, a major emphasis, in addition to efforts to lower the likelihood
of contamination events, has been on managing the distribution system during a
contamination event and minimizing the impacts of such events.
[Figure 1 image: labeled entry points include sources (WTP, wells), pump stations, tanks and reservoirs, hydrants, and buildings]
Figure 1: Example potential entry points for contaminants into a distribution system
Fate and Transport of Contaminants in a Water Distribution System
When a contaminant enters a distribution system it will mix with the water flowing in the
distribution system and be transported along with the flowing water. Depending upon
the particular contaminant, it may also experience chemical, physical or biological
reactions and may interact with other constituents in the water and with the pipe wall.
The degree to which the contaminant spreads through the distribution system and
potentially impacts customers depends upon the point of entry, the type and quantity of
contaminant, the duration of entry, and the operation and design of the water system.
Consequently, it may travel short distances or may reach large portions of the
distribution system.
Mathematical hydraulic and water quality models of distribution systems may be used
to calculate (estimate) the movement of a contaminant in a distribution system
(Grayman 2006). Both public domain software (e.g., EPANET [Rossman 2000]) and
commercial packages are available. The models first calculate the flows and pressures
throughout the system over the course of a day or longer periods and then calculate
the movement of the contaminant and resulting concentration within the flow assuming
either conservative or reactive behavior. As with all models, the results are only
approximations, with the accuracy dependent upon factors such as the level of detail of
the model and how well it represents the actual system. These models have been used
for a variety of tasks related to the contamination of distribution systems, including
estimating the vulnerability of a system to contamination, designing monitoring
systems, and serving as part of a contamination warning system for mitigating and
reducing the impacts of a contamination event.
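The plug-flow transport that such models compute along each pipe can be illustrated with a minimal sketch. The function name and pipe values below are illustrative, not from any cited model: velocity follows from flow rate and pipe cross-section, and the residence time of a contaminant parcel in a pipe is its length divided by that velocity.

```python
from math import pi

def travel_time_hours(pipes, flow_m3s):
    """Sum plug-flow travel times along a series of pipes.

    pipes: list of (length_m, diameter_m) tuples; flow_m3s: flow in m^3/s.
    Assumes steady flow and ideal plug flow (no dispersion), the usual
    simplification in pipe-network water quality models.
    """
    total_s = 0.0
    for length_m, diameter_m in pipes:
        area = pi * diameter_m ** 2 / 4.0   # cross-sectional area, m^2
        velocity = flow_m3s / area          # mean velocity, m/s
        total_s += length_m / velocity      # residence time in this pipe, s
    return total_s / 3600.0

# A 150 mm line carrying 10 L/s flows at about 0.57 m/s, so a contaminant
# parcel needs roughly an hour to traverse 2 km of pipe.
t = travel_time_hours([(1000.0, 0.150), (1000.0, 0.150)], flow_m3s=0.010)
```

This also shows why demand matters: halving the flow halves the velocity and doubles the travel time, consistent with the slower nighttime movement described below.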
In order to demonstrate the movement of a contaminant in a distribution system, a
model has been applied to an example system that serves approximately 25,000
people under two separate contamination scenarios. In the first scenario, shown in
Figure 2, the clearwell at the water treatment plant is contaminated over a period of
several days. As illustrated, the contaminant (highlighted in bold in the map) moves
through the water distribution system relatively rapidly and after 24 hours a large
majority of the system is contaminated.
Eventually the entire system would be
affected. In the second scenario (Figure 3), a contaminant is injected into the system
along a small diameter transmission line (150 mm) serving a neighborhood in the
northwest corner of the system. As illustrated, this results in the entire neighborhood
being contaminated with the remainder of the system unaffected.
[Figure 2 panels show the contaminated extent at 2, 6, 12 and 24 hours after injection at the treatment plant, with the injection site marked]
Figure 2. Spread of contaminant in a distribution system – Scenario 1
Figure 3. Spread of contaminant in a distribution system – Scenario 2
The speed at which a contaminant moves through the system is largely controlled by
the water demand (water consumption by customers), resulting in more rapid
movement during higher demand daylight hours than at night. The concentration of the
contaminant at any point within the distribution system is controlled by a set of
interacting factors including the volume of contaminant that enters the system, the
duration of the injection and the amount of uncontaminated water that dilutes the
contaminant. Other factors that affect the movement of the contaminant and its
concentration include the operation of the water system and the presence of storage
tanks. The model effectively integrates all of these factors in calculating the resulting
concentration.
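Two of these interacting factors, dilution by uncontaminated water and in-network reaction, can be sketched with the flow-weighted complete-mixing rule commonly assumed at network junctions and a first-order decay law. This is a simplified illustration; the function names and values are hypothetical.

```python
from math import exp

def mixed_concentration(inflows):
    """Flow-weighted ("complete mixing") concentration at a junction.

    inflows: list of (flow, concentration) pairs in consistent units.
    """
    total_flow = sum(q for q, _ in inflows)
    return sum(q * c for q, c in inflows) / total_flow

def decayed(c0, k_per_hr, hours):
    """First-order (reactive) decay; k_per_hr = 0 gives conservative behavior."""
    return c0 * exp(-k_per_hr * hours)

# 5 L/s of contaminated water at 100 mg/L diluted by 45 L/s of clean water:
c = mixed_concentration([(5.0, 100.0), (45.0, 0.0)])  # flow-weighted average
```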
Contamination Warning System
The primary mechanism that is currently available to mitigate or reduce the impacts of
a contamination event in the distribution system is a contamination warning system
(CWS). A contamination warning system is a combination of monitors, institutional
arrangements, analysis tools, emergency protocols, and response mechanisms
designed to provide early warning of contaminants in order to minimize customer
exposure.
USEPA (2007) describes a conceptual model for contamination warning system
operation as follows.

- Monitoring and surveillance. The basic components of online water quality monitoring, sampling and analysis, enhanced security monitoring, consumer complaint surveillance, and public health surveillance operate on a routine basis, in near-real time, until an anomaly or deviation from the baseline or base state is detected.
- Event detection and possible determination. Event detection is the process or mechanism by which an anomaly or deviation from the baseline or base state is detected.
- Credible determination. Credibility determination procedures are performed using information from all contamination warning system components as well as external resources when available and relevant. If contamination is determined to be credible, additional confirmatory and response actions are initiated.
- Confirmed determination. In this stage of consequence management, additional information is gathered and assessed to confirm drinking water contamination. Response actions initiated during credible determination are expanded and additional response activities may be implemented.
- Remediation and recovery. Once contamination has been confirmed, and the immediate crisis has been addressed through response (e.g., flushing, emergency warnings, etc.), remediation and recovery actions defined in the consequence management plan are performed to restore the system to normal operations.
On-line monitors or sensors are widely considered as the primary means of detecting a
potential contamination event in a distribution system. The role of monitors in a
contamination warning system and how they may be applied during a potential
contamination event to detect, characterize and confirm a contaminant event is shown
schematically in Figure 4 (Grayman 2010). In order to reliably and efficiently detect
potential contaminants, the monitors must be sensitive to the presence of a wide range
of agents and there must be a sufficient number of appropriate monitors so that
detection occurs in a timely manner.
Historically, monitoring sites were selected
primarily on the basis of informal selection criteria that reflected the representativeness
and accessibility of the sites and the specific purpose(s) of the monitoring system.
[Figure 4 flowchart: on-line monitoring leads to possible detection, initial evaluation, and the start of a response (initial detection and initial actions, with immediate contacts); a rapid field assessment then determines whether the threat is credible. If not credible, no action is needed. If credible, characterization (modeling, laboratory confirmation) leads to a mitigative response (action plan, isolation, treatment, recovery, notifications), followed by decontamination, notifications and restart to restore normal service, and a post-incident report]
Figure 4. Flowchart of a CWS as part of a response to a potential contamination event
The success of a contamination warning system in reducing impacts of a contamination
event is highly dependent upon the length of time of the various steps shown in Figure
4. In a study in Ann Arbor, Michigan, Skadsen et al. (2008) found that a total delay time
greater than about 8 hours between the entry of the contaminant into the distribution
system and the cessation of water use by customers (associated with the initial
detection, initial actions, characterization and mitigative response steps in Figure 4)
would significantly reduce the effectiveness of the CWS.
This threshold may vary
considerably in other systems. Discussions with water utility personnel suggest that
meeting a delay-time goal of less than eight hours would require a very robust on-line
monitoring network and a very efficient emergency response plan.
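The delay-budget idea can be sketched as a simple check of the stage delays against the roughly eight-hour threshold reported by Skadsen et al. (2008); the individual stage delays below are hypothetical values for a single event, not data from that study.

```python
def total_delay_hours(stage_delays):
    """Sum the delays of the CWS stages shown in Figure 4 (initial detection,
    initial actions, characterization, mitigative response)."""
    return sum(stage_delays.values())

# Illustrative (hypothetical) stage delays, in hours, for one event:
delays = {
    "initial detection": 2.0,
    "initial actions": 1.5,
    "characterization": 3.0,
    "mitigative response": 1.0,
}
THRESHOLD_HOURS = 8.0  # approximate effectiveness threshold from Skadsen et al. (2008)
within_goal = total_delay_hours(delays) <= THRESHOLD_HOURS
```

A budget like this makes explicit which stage dominates the total delay and therefore where investment (more monitors, faster protocols) buys the most protection.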
Over the past decade, there has been extensive research on the optimal placement of
a set of sensors in a distribution system (Murray and Hart 2010; Grayman 2010). Berry
et al. (2005) provide a succinct description of the sensor placement problem for CWS
design.
“The goal of a sensor placement optimization formulation is simple: to place a
limited number of sensors in a water distribution network such that the impact of
an accidental or intentional injection of contaminant to public health is minimized.
However, no specific, concrete formulation for sensor placement has emerged
that is widely accepted by the water community. There are a wide range of
alternative objectives that are also important when considering sensor
placements, such as minimizing the cost of installing and maintaining the sensors,
minimizing the response time to a contamination event, and minimizing the extent
of contamination (which impacts the recovery costs). Additionally, it is difficult to
quantify the health impact of a contamination event because human water usage
is often poorly characterized, both in terms of water consumption patterns as well
as how the water consumption impacts health effects. Consequently, surrogate
measures like the total volume of water consumed at all sites have been used to
model health impacts; this measure assumes that human water consumption is
proportional to water consumption at all junctions within the network.”
Many different algorithms have been proposed and tested for designing optimal sensor
networks (Ostfeld 2008). Most of these methods utilize integrated search techniques in
conjunction with distribution system hydraulic/water quality models that assess the
benefits associated with alternative sensor network designs. The most widely used
software in this category is the Threat Ensemble Vulnerability Assessment – Sensor
Placement Optimization Toolbox (TEVA-SPOT) distributed by the U.S. Environmental
Protection Agency (Berry et al. 2010).
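A common heuristic for this class of problem can be sketched as follows. This is a generic greedy approach, not the TEVA-SPOT algorithm itself, and the data structure and names are illustrative: each simulated contamination scenario records the impact incurred if it is first detected at a given node, and the algorithm repeatedly adds the sensor location that most reduces the mean impact across scenarios.

```python
def greedy_sensor_placement(impact, candidates, n_sensors):
    """Greedy heuristic for the sensor placement problem.

    impact: list of dicts, one per contamination scenario; impact[i][node]
    is the impact (e.g., volume of contaminated water consumed) if the
    scenario is first detected at `node`, and impact[i][None] is the
    impact when no sensor detects it.
    Returns the chosen locations and the resulting mean impact.
    """
    placed = []
    best_score = float("inf")
    for _ in range(n_sensors):
        best_node, best_score = None, float("inf")
        for node in candidates:
            if node in placed:
                continue
            trial = placed + [node]
            # each scenario is charged the smallest impact among its
            # detecting sensors, or the undetected impact if none apply
            score = sum(
                min([imp[None]] + [imp[s] for s in trial if s in imp])
                for imp in impact
            ) / len(impact)
            if score < best_score:
                best_node, best_score = node, score
        placed.append(best_node)
    return placed, best_score

# Two hypothetical scenarios over three candidate locations:
impact = [
    {None: 100.0, "A": 10.0, "B": 50.0},  # scenario 1: seen early at A, late at B
    {None: 100.0, "B": 5.0, "C": 40.0},   # scenario 2: seen early at B, late at C
]
placed, score = greedy_sensor_placement(impact, ["A", "B", "C"], n_sensors=2)
```

B is chosen first because it covers both scenarios; A is added second because it cuts scenario 1's impact further. Greedy selection is popular here because the coverage objective is submodular, which gives the heuristic a known approximation guarantee.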
Design of Water Systems to Reduce Vulnerability
Contamination warning systems and other methods for detecting and mitigating
contamination events require active intervention in order to minimize impacts. An
alternative or supplemental “passive” mechanism for reducing impacts involves the redesign of distribution systems aimed specifically at improving security (Baker 2008). In
the United States and many other countries, distribution systems are designed as
looped systems composed of transmission lines and interconnected local delivery pipes
used to deliver water to customers. Looped systems generally result in multiple paths
that water can follow from the treatment plant to customers and provide redundancy in
case of outages. However, the loops also result in multiple pathways for contaminants
and greater difficulties in isolating contaminants.
Baker (2008) suggested the greater use of “zoned” distribution systems based in part
on the District Meter Area (DMA) concept that is widely used in the United Kingdom
and other areas. A DMA consists of an isolated water distribution system zone for
which there can be both control and measurement of flow. The Baker approach,
shown schematically in Figure 5, extends this concept further by minimizing interchange
between zones and by "hardening" and reducing access to the transmission system to
discourage entry of contaminants into the transmission backbone.
[Figure 5 schematic: a transmission line from the WTP, controlled by check valves (CV), pressure reducing valves (PRV) and pumps, feeds separate distribution blocks; there are no connections between distribution blocks, no service or hydrant connections to the transmission main, and no uncontrolled connections between the transmission main and the distribution blocks]
Figure 5. Schematic representation of a zoned distribution system
Murray et al (2010) tested this zoned approach using mathematical modeling and
simulation of several distribution systems and found that such an approach may
significantly reduce the spread of contaminants with little negative impact on water age
(a surrogate for water quality) and reliability. Converting a system from a looped to a
zoned system would require the addition of control valves, check valves and some
piping. The level of effort and costs associated with such a conversion is likely to vary
considerably depending on the existing distribution system configuration.
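The core zoning rule, no direct pipe between two distribution blocks, can be checked mechanically against a network's pipe list. The sketch below uses hypothetical node and block names; real network data would come from a hydraulic model or GIS.

```python
def zoning_violations(edges, block_of):
    """Find pipes that directly connect two different distribution blocks.

    edges: list of (node_a, node_b) pipes; block_of maps each node to its
    block name, with transmission-main nodes mapped to "transmission".
    In a zoned (DMA-style) layout, blocks may connect to the transmission
    main through controlled points but never directly to each other.
    """
    bad = []
    for a, b in edges:
        ba, bb = block_of[a], block_of[b]
        if ba != bb and "transmission" not in (ba, bb):
            bad.append((a, b))
    return bad

# Hypothetical example: two transmission nodes feed a north and a south block,
# but one legacy pipe still ties the blocks together.
edges = [("T1", "N1"), ("T2", "S1"), ("N2", "S1")]
block_of = {"T1": "transmission", "T2": "transmission",
            "N1": "north", "N2": "north", "S1": "south"}
bad = zoning_violations(edges, block_of)
```

Each pipe flagged this way is a candidate for removal or for the addition of a check valve or control valve during conversion to a zoned layout.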
Remediation, Decontamination and Recovery
When a contamination event occurs, the immediate concern is minimizing the impacts
of the event on the customers of the water system. Once the immediate concern is
taken care of, attention turns to the goal of remediating and decontaminating the
system and then returning it to normal operation (USEPA 2008).
There are many
challenges during this period when the system is being returned to normal operation
including, for example:

- Providing interim water supply while the water system is not available.
- Treatment of water contaminated through accidental or deliberate actions. This water needs to be treated before being released into treatment systems or the environment.
- Decontaminating pipes. Many contaminants that have been identified as likely threats to drinking water and distribution facilities can adhere to, or become embedded in, rusty or corroded pipes or biologically active layers (biofilm).
- Mitigating the "fear factor" associated with the contamination event and regaining the confidence of customers that their water supply is safe.
Conclusions
In the United States, following the events of September 11, 2001, the concerns over
possible terrorist acts affected almost all aspects of life including water supply systems.
Previous concerns over accidental contamination of water sources or intrusion into
distribution systems took a backseat to the much greater concerns over intentional
contamination of the water supply. This fostered a significant upturn in research and
regulations affecting water supplies. The goal was certainly a water system that was
much less vulnerable to intentional and accidental events, a robust and ubiquitous
monitoring system that would immediately detect a wide range of contaminants at any
point from source to tap, and a response plan that would protect customers from
exposure to unwanted agents. In the years since 2001, some progress has been made
in reaching for the goals of a safer water system. However, since there have been no
major intentional contamination events in water systems in the U.S., one can postulate
that: (1) it is difficult to fully assess the degree of protection that has resulted; and (2)
the rate of research and development (and progress towards the lofty goals of 2001)
has slowed down.
There are still many issues that need to be confronted in order to ensure a safe water
supply.
Some issues and questions that are specifically related to potential
contamination of water distribution systems are outlined below:

- Given the current state of technology, can a contamination warning system be effective in protecting customers from contamination events?
- Can we afford the relatively high capital and operational costs associated with a contamination warning system to defend against what is assumed to be very rare events?
- Can we broaden the usage of the monitoring equipment associated with contamination warning systems in order to address more routine water quality issues and thus broaden the cost basis?
- What are the prospects for future breakthroughs in monitoring capabilities that will lower costs and improve capabilities?
- Should we consider more radical systems such as point-of-use treatment or dual water delivery systems as a mechanism for improving the security of the water delivery process?
References
Baker Corporation. (2008), Municipal Water Distribution System Security Study:
Recommendations for Science and Technology Investments, Final Report to the
U.S. Department of Homeland Security, Washington, DC.
Berry, J. W., Hart, W. E., Phillips, C. A., Uber, J. G., and Watson, J.-P. (2005).
Validation and assessment of integer programming sensor placement models. In
Proceedings of the World Water & Environmental Resources Congress, EWRI,
ASCE, Reston, VA.
Berry, J. et al. (2010). User’s Manual TEVA-SPOT Toolkit. Version 2.4. U.S.
Environmental Protection Agency. EPA 600/R-08/041B.
Grayman, W.M. (2010). Designing an Optimum Water Monitoring System. Wiley
Handbook of Science and Technology for Homeland Security, Ed., J.G. Voeller.
John Wiley & Sons, Inc.
Grayman, W.M. (2006). Use of Distribution System Water Quality Models In Support
Of Water Security. In Security of Water Supply Systems: From Source to Tap,
Edited by J. Pollert and B. Dedus. Springer Press, Netherlands.
Murray, R.E., Grayman, W.M., Savic, D.A., and Farmani, R. (2010). Effects of DMA
redesign on water distribution system performance. In Integrating Water Systems –
Edited by Boxall & Maksimovi´c. Taylor & Francis Group, London.
Murray, R. and Hart, W.E. (2010). Review of Sensor Placement Strategies for
Contamination Warning Systems. Journal of Water Resources Planning and
Management. American Society of Civil Engineers (ASCE), Reston, VA,
136(6):611-619.
Ostfeld, A., et al. (2008). The battle of the water sensor networks (BWSN): A design
challenge for engineers and algorithms. J. Water Resources Planning and
Management, American Society of Civil Engineers (ASCE), Reston, VA. 134(6):
556–568.
Rossman, L. A. (2000). EPANET 2 Users Manual. U.S. Environmental Protection
Agency.
Skadsen, J., Janke, R., Grayman, W., Samuels, W., TenBroek, M., Steglitz, B., and
Bahl, S. (2008). Distribution System Monitoring For Detecting Contamination
Events And Water Quality, J.AWWA. 100(7): 81-94.
U.S. EPA. (2007). “Water security initiative: Interim guidance on planning for
contamination warning system deployment.” EPA-817-R-07-002, Office of Water,
Office of Ground Water and Drinking Water, U.S. EPA, Washington, D.C.
U.S. EPA. (2008). "Decontamination and recovery planning: Water and wastewater
utility case study." EPA-817-F-08-004, Office of Water, U.S. EPA, Washington,
D.C.