Potable Water Disinfection: Challenge for the 21st Century

July 1, 2002
Water utilities have come a long way since the early 1900's when they first began to dose drinking water with chlorine to abate cholera and typhoid epidemics. Water supply, treatment, and distribution practices have improved dramatically since then, but they still have a long way to go as utilities endeavor to keep ahead of the seemingly endless new challenges.

By Bruce W. Long, Robert A. Hulsey and Jeff J. Neemann


Water utilities are seeing an increasing percentage of at-risk individuals in their service population. At the same time, diminishing supplies of higher quality raw water are forcing utilities to tap into new, but diminished-quality, water supplies. Treatment of these supplies calls for more sophisticated process technologies, which include more effective disinfection. An additional consideration is that consumers are better educated and consequently more aware of water quality issues, in particular the quality of the water delivered to their taps.

These challenges, combined with an increasing suite of regulations addressing the need for a higher degree of disinfection (acute concerns) and the attendant reduction of disinfection byproducts (chronic ingestion concerns), require water utilities to carefully consider the expanding array of demonstrated and developing best available technologies and water system operation and management practices.

While chlorination has saved many lives, concerns over chlorinated disinfection byproducts have led to an increased interest in alternative disinfectants such as ozone.

Optimal solutions involve the application of multiple-barrier protection through the integration of source water protection, treatment, and maintenance of an effective disinfectant residual throughout the distribution system.

The US EPA's potable water disinfection regulations delineate log reduction requirements for enteric viruses and Giardia lamblia. It is anticipated that future regulations will include Cryptosporidium. Log reduction (1 log = 90%, 2 log = 99%, etc.) includes both physical removal (source water control, riverbank filtration, sedimentation, flotation, and granular media and membrane filtration) and inactivation (chlorine, chloramines, chlorine dioxide, ozone, ultraviolet irradiation) of the pathogens.
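As a rough illustration (not part of the regulations themselves), the arithmetic behind log reduction credits can be sketched as follows; the function names and example counts are ours, chosen only to show the conversion between log credits and percent reduction.

```python
# Illustrative sketch: converting between log reduction credits and the
# percent removal/inactivation of pathogens (1 log = 90%, 2 log = 99%, ...).
import math

def log_reduction(count_in: float, count_out: float) -> float:
    """Log reduction achieved, given influent and effluent pathogen counts."""
    return math.log10(count_in / count_out)

def percent_reduction(log_credit: float) -> float:
    """Percent removal/inactivation corresponding to a log reduction credit."""
    return (1.0 - 10.0 ** (-log_credit)) * 100.0

print(percent_reduction(3))            # 99.9 (a 3-log reduction)
print(log_reduction(100_000, 100))     # 3.0 (100,000 organisms reduced to 100)
```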

The value and effectiveness of source water protection programs as an important element in providing safe water is receiving increased recognition. The US EPA's pending Long Term 2 Enhanced Surface Water Treatment Rule offers utilities that implement source water control log credits for Cryptosporidium parvum reduction. Reducing the concentrations of microbial contaminants in a water treatment plant's raw water supply provides a real, measurable benefit and reduces dependence on subsequent particle removal and disinfection processes.

Enhanced removal of particulates is the utilities' principal goal as they look to increase the degree of disinfection they provide. Particulate removal offers the clear benefit of removing suspended particles, which include opportunistic pathogenic organisms. It also reduces the degree of inactivation required, which can reduce the amount of disinfectant used. The addition of an adsorbent, such as powdered activated carbon (PAC), or of coagulants also can lower the concentration of disinfection byproduct precursors. The combination of reduced disinfection byproduct precursor concentrations and smaller disinfectant doses lowers the formation of disinfection byproducts. Filtered water turbidities less than 0.1 NTU are recommended as a measure of optimal removal of particulates.

The ability of filtration membranes, microfilters (MF) and ultrafilters (UF), to consistently produce filtered water turbidities less than 0.1 NTU independently of raw water quality variations is one of the reasons these technologies are gaining rapid acceptance by both water utilities and regulatory agencies. Many states are crediting these technologies for multiple log removals of Giardia, Cryptosporidium, and enteric viruses.

Recent testing has shown that UV light can easily inactivate Cryptosporidium and Giardia, along with viruses and other pathogens.

Integrity testing procedures to detect the rupture of membrane fibers have greatly improved, further enabling regulators to approve these technologies. Although MF and UF remove only particulate material, the addition of PAC or coagulants ahead of the membrane units enables them to effectively remove disinfection byproduct precursors as well as taste and odor causing compounds.

The final and ultimate barrier of protection is inactivation by disinfectant chemicals or, more recently in potable water disinfection, ultraviolet disinfection. Primary disinfectants are added in the treatment process, and the product of the disinfectant residual (C), measured at the end of the contact time, and the contact time (T), known as CT, contributes to the total disinfection credits achieved in the treatment process. A secondary disinfectant is added to produce and maintain a disinfectant residual throughout the distribution system.
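As a minimal sketch, and assuming the simple case in which inactivation credit scales linearly with CT, the calculation looks like the following; the required-CT figure is a placeholder, since actual requirements depend on the disinfectant, pathogen, pH, and temperature and are taken from EPA guidance tables.

```python
# Minimal sketch of a CT calculation. The required CT per log is a
# placeholder value; real values come from EPA guidance tables for the
# specific disinfectant, pathogen, pH, and temperature.

def ct_achieved(residual_mg_per_l: float, contact_time_min: float) -> float:
    """CT = disinfectant residual C (mg/L) x contact time T (min)."""
    return residual_mg_per_l * contact_time_min

def log_credit(ct: float, ct_required_per_log: float) -> float:
    """Inactivation credit, assuming credit scales linearly with CT."""
    return ct / ct_required_per_log

ct = ct_achieved(residual_mg_per_l=1.0, contact_time_min=60.0)  # 60 mg-min/L
print(log_credit(ct, ct_required_per_log=30.0))                 # 2.0 logs (hypothetical)
```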

The effectiveness of the primary disinfection step is an area that receives considerable attention. Application of the primary disinfectant at a point in the treatment train where higher water quality (reduced turbidity and organic carbon) has been attained increases the inactivation effectiveness and reduces the levels of disinfection byproducts formed. Historically, North America has relied on chlorination and chloramination for disinfection of its water supplies. While chlorination has saved many lives, concerns over chlorinated disinfection byproducts have led to an increased interest in alternative disinfectants such as chlorine dioxide, ozone, and ultraviolet light. While the historic role of these alternative technologies in potable water production can be traced back to the 1930's or earlier, they have only recently become popular as a means of controlling chlorinated byproducts while achieving varying degrees of pathogen inactivation.

Regarding the "bug du jour," ultraviolet light offers a distinct advantage over the other two disinfectants in terms of Cryptosporidium inactivation. While UV light has long been thought to be an effective means of inactivating viruses, recent testing has shown that it can easily inactivate Cryptosporidium and Giardia. While other commonly used disinfectants can provide protection against Giardia, chlorine is ineffective for controlling Cryptosporidium, and the effectiveness of chlorine dioxide and ozone is diminished at low water temperatures. The increase in the cost of ozone and chlorine dioxide at the higher dosages needed for inactivation of Cryptosporidium, the larger basins needed to provide longer reaction time, and the concerns over bromate and chlorite formation (which recently was regulated as ozone and chlorine dioxide disinfection byproducts) have sparked considerable interest in the use of UV light as a disinfectant.

Still, UV is no panacea. Factors such as low transmissivity of water requiring higher intensity UV exposure, monitoring and reporting of the inactivation provided in a UV reactor, and the sensitivity of the equipment to "blips" in the power supply must all be taken into account when considering the use of a UV system.
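To illustrate why low transmissivity matters, the following sketch uses a simplified Beer-Lambert average-intensity calculation (a collimated-beam style approximation, not a validated reactor model); the lamp intensity, water depth, exposure time, and UV transmittance values are hypothetical.

```python
# Simplified sketch of how UV transmittance (UVT) affects delivered UV dose.
# Uses a Beer-Lambert average intensity over the water depth; real reactor
# sizing and validation are far more involved, so treat this as illustrative.
import math

def avg_intensity(surface_mw_cm2: float, uvt_percent: float, depth_cm: float) -> float:
    """Average UV intensity over the water depth from Beer-Lambert decay."""
    a = -math.log10(uvt_percent / 100.0)      # absorbance per cm
    ad = a * depth_cm
    return surface_mw_cm2 * (1.0 - 10.0 ** (-ad)) / (ad * math.log(10))

def uv_dose_mj_cm2(surface_mw_cm2: float, uvt_percent: float,
                   depth_cm: float, exposure_s: float) -> float:
    """Delivered dose (mJ/cm2) = average intensity (mW/cm2) x exposure time (s)."""
    return avg_intensity(surface_mw_cm2, uvt_percent, depth_cm) * exposure_s

# Same lamp and exposure time, two different waters (hypothetical numbers):
print(uv_dose_mj_cm2(10.0, uvt_percent=95.0, depth_cm=5.0, exposure_s=5.0))  # ~44 mJ/cm2
print(uv_dose_mj_cm2(10.0, uvt_percent=70.0, depth_cm=5.0, exposure_s=5.0))  # ~23 mJ/cm2
```

With the second set of numbers, the same equipment delivers roughly half the dose, which is why low-transmissivity waters demand higher-intensity UV exposure.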

The answer to issues with individual disinfectants may lie in integrating them into a system that uses multiple disinfectants to take advantage of synergies between them. For example, the improvement in transmissivity gained by using an oxidant preceding UV disinfection can substantially reduce the size of the UV system.

In order to meet the challenges of emerging pathogens and their differing sensitivities to individual disinfectants, integrating multiple disinfectants into one treatment scheme (such as ozone/UV/chlorine) is expected to increase in the future.

Since the beginning of the 20th century, raising the barrier between the raw water supply and the consumer through disinfection and pathogen removal has had a highly beneficial effect on public health. For example, the rate of deaths in the United States attributed to typhoid fever has declined from 36 per 100,000 population in 1900 to 4 deaths for the entire US population in the 1989 to 1998 decade. Potable water filtration and disinfection, a multiple-barrier approach, has been an important factor in the improvement in disease control.

To further reduce the risk of waterborne disease outbreaks, more attention needs to be placed on the distribution system. In a recent article, Gunther Craun points out that 18 percent of the waterborne disease outbreaks that occurred between 1971 and 1998 involved contamination of the distribution system. Craun recommends that to "reduce the potential for distribution system contamination, water utilities must maintain adequate water pressures throughout the system, identify and replace or repair leaking water mains, maintain a chlorine residual and frequently monitor that residual in the distribution system."

In addition to the typical monitoring conducted in distribution systems (chlorine, pH), novel systems can be put into place to detect contamination by changes in other water quality variables, providing the utility with more information for use in making treatment decisions and controlling water distribution.
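As one hypothetical illustration of such a system, a utility might apply a simple rule-based check to residual readings from each monitoring station; the thresholds below are placeholders and would in practice be set from the utility's own residual targets and operating history.

```python
# Hypothetical sketch of a distribution-system monitoring check: flag a
# sampling point when its chlorine residual falls below a floor or drops
# sharply relative to its recent average. Thresholds are placeholders.
from statistics import mean

def flag_residual(readings_mg_per_l: list[float],
                  floor_mg_per_l: float = 0.2,
                  max_drop_fraction: float = 0.5) -> bool:
    """Return True if the latest reading warrants investigation."""
    latest = readings_mg_per_l[-1]
    if latest < floor_mg_per_l:
        return True                          # residual below the minimum floor
    if len(readings_mg_per_l) < 2:
        return False
    baseline = mean(readings_mg_per_l[:-1])  # recent average at this station
    return latest < baseline * (1.0 - max_drop_fraction)

print(flag_residual([1.0, 1.1, 0.9, 1.0, 0.3]))  # True: abrupt drop
print(flag_residual([1.0, 1.1, 0.9, 1.0, 0.9]))  # False: within normal range
```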

Conclusion

Providing a safe, aesthetically pleasing potable water is the goal of every professional in the water treatment industry. The challenges associated with achieving this goal have changed and will continue to change into the 21st century.

The water treatment professional now has more ways to meet these challenges, such as multiple barriers with improved particle removal and integrated disinfection systems, more information from the distribution system, and higher levels of security to protect the water along the way from its source to the consumer's tap.

The past century saw the health benefits of disinfection. As this century progresses, disinfection in all its forms will continue to play an important role in protecting all of us from disease.

About the Authors:

Bruce Long has a BS in Chemical Engineering from Lehigh University and a MS from Rutgers. He is a Vice President and Director of Water Treatment Technology for Black & Veatch Corp. Bob Hulsey and Jeff Neemann both have MS degrees in Environmental Engineering and are employed at Black & Veatch, specializing in disinfection technologies.
