Reducing Lead Levels in Drinking Water
Last Modified: Nov 01, 2017
- Becki Rosenfeldt, Erik Rosenfeldt - Hazen and Sawyer
Recent events centered on lead in drinking water have eroded public trust in drinking water safety. In response, the United States Environmental Protection Agency (USEPA) is working on upcoming Lead and Copper Rule Long-Term Revisions (LCR LTR). The proposed revisions could significantly affect community water systems (CWSs) throughout the United States, requiring additional actions related to optimal corrosion control treatment, lead service line replacement, public education, and localized household-level responses (USEPA and NDWAC, 2016).
Action Levels and Sampling Requirements
The current Lead and Copper Rule lead action level (AL) of 15 μg/L, which serves as a benchmark for effective corrosion control treatment, is not a health-based standard. Research is currently underway to develop a health-based benchmark tied to blood lead levels in infants and children. New household health-based action levels proposed for the upcoming LCR LTR will likely be lower than the current threshold, possibly forcing CWSs to adopt more aggressive corrosion control treatment options and increasing the cost of compliance. While the exact Federal action level requirements of the LCR LTR have not been established, some State and local policymakers are already setting standards more stringent than EPA’s current AL. For example, the City of Buffalo, NY has recently lowered its AL to match the FDA’s limit for lead in bottled water (5 μg/L).
Recent research has shown that the current standard of first-draw sampling only captures stagnant water in building fixtures and associated piping, while lead levels are often significantly higher in samples derived directly from lead service lines. Revised sampling protocols proposed for the LCR LTR require CWSs to obtain compliance samples from lead service lines, which will likely increase lead levels in compliance samples, triggering more systems to optimize their corrosion control. Lead tap sampling campaigns and optimization studies will need to be conducted and experts will need to help connect corrosion theory with the practical complexities of tap sampling.
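Under the current rule, a system's compliance is judged by comparing the 90th-percentile lead result from a tap sampling round against the action level. A minimal sketch of that comparison, with invented sample values and a simplified percentile method, might look like:

```python
# Illustrative LCR-style compliance check: compare the 90th-percentile
# lead result from a sampling round against an action level (AL).
# Sample values below are hypothetical; the percentile method is simplified.

def ninetieth_percentile(results_ug_per_l):
    """Simplified 90th percentile: sort results and take the value at
    the rank closest to 90% of the sample count."""
    ordered = sorted(results_ug_per_l)
    idx = int(0.9 * len(ordered)) - 1  # 0-based index of the 90th-percentile rank
    return ordered[max(idx, 0)]

def exceeds_action_level(results_ug_per_l, action_level=15.0):
    """True if the 90th-percentile result exceeds the action level (ug/L)."""
    return ninetieth_percentile(results_ug_per_l) > action_level

# Hypothetical lead results from one monitoring round, in ug/L:
samples = [2.0, 3.5, 1.0, 12.0, 4.2, 7.9, 2.2, 18.0, 3.0, 5.1]
print(exceeds_action_level(samples))        # Federal AL of 15 ug/L -> False
print(exceeds_action_level(samples, 5.0))   # Buffalo's stricter 5 ug/L AL -> True
```

As the example shows, the same sampling round can pass the Federal AL while exceeding a stricter local standard such as Buffalo's.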
Optimizing Corrosion Control
Optimized corrosion control is notoriously complicated and utility-specific, depending on factors such as source water quality, treatment, and interactions between finished water and pipe materials throughout the distribution system. To protect customers from exposure to lead and copper, utilities must have a complete understanding of where sources of lead may exist in a system, mechanisms by which lead may be leaching into drinking water, and possible treatment and operational changes to sustain water quality throughout the system. In short, a holistic “source to tap” approach must be taken when selecting a corrosion control treatment strategy.
There are currently only two available USEPA-approved corrosion control methods (USEPA OCCT Guidance Manual, 2016):
1. pH/Alkalinity/DIC Adjustment
2. Corrosion Inhibitors (Phosphate or Silicate based)
When choosing optimal corrosion control treatment, a number of variables within these recommended control strategies remain open questions. For instance, the optimal pH and DIC levels and the appropriate chemicals for making these water quality adjustments must be determined for each individual utility. There is also an array of phosphate- and silicate-based inhibitor blends available, each performing differently under various water quality conditions. Simultaneous compliance is also a challenge in choosing optimal corrosion control, as treatment and operational adjustments made to optimize corrosion control can also impact other requirements of the National Primary Drinking Water Regulations.
Desktop Optimization Studies
Desktop optimization studies are a simple and effective way to gain a full understanding of what sources of lead exist in a system, what types of corrosion may be occurring, how water quality and corrosion reactions interact, and what changes would better optimize corrosion control.
The most important, and oftentimes overlooked, step in conducting a lead sampling or corrosion control optimization study is a desktop evaluation to identify possible sources of lead in a system while also developing a holistic understanding of the system.
Historical lead and copper compliance results should be evaluated against historical water quality to identify any trends relating lead/copper levels to changes in water quality. Key water quality parameters to evaluate include:
- pH
- Alkalinity
- Dissolved inorganic carbon (DIC)
- Inhibitor residuals
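One simple way to screen for the trends described above is to correlate historical lead results against each water quality parameter. The sketch below uses a basic Pearson correlation with invented pH and lead data purely for illustration:

```python
# Hypothetical trend screen: correlate historical lead results against a
# water quality parameter (here, pH). All data values are invented.
from statistics import mean

def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length series."""
    mx, my = mean(xs), mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# Monitoring-period pH alongside lead 90th percentiles (ug/L), invented:
ph = [7.2, 7.4, 7.1, 7.6, 7.8, 7.3, 7.9, 8.0]
lead = [14.0, 11.5, 15.2, 9.8, 7.1, 12.4, 6.5, 5.9]

r = pearson(ph, lead)
print(f"pH vs lead correlation: r = {r:.2f}")
# A strongly negative r flags that lower pH coincides with higher lead,
# pointing to pH control as one lever worth investigating.
```

Correlation alone cannot establish causation, but it helps prioritize which parameters merit closer mechanistic evaluation in the desktop study.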
Current and Future Lead Service Line Replacement Requirements
Lead service lines (LSLs) can be a significant source of lead in drinking water. The current LCR requires LSL replacement only after a lead action level exceedance and allows for partial replacement of only the utility-owned portion of the LSL. Partial lead service line replacement has not been shown to reliably reduce lead levels in drinking water systems and has actually been associated with both temporary and long-term elevated lead levels. Under the proposed LCR LTR revisions, all drinking water systems would be required to establish a full lead service line replacement (LSLR) program and perform targeted outreach to consumers with LSLs. In order to protect consumer health, new regulations may also require the installation of point-of-use filters at lead service line replacement sites, as well as sites with elevated lead levels.
After investigating the sources of lead in a system, historical lead and copper levels, and historical water quality conditions, possible corrosion processes and mechanisms that may be contributing to lead and copper leaching in the system can be identified. Theoretical solubility curves may also be used to predict how water quality changes may impact lead leaching in the system. This knowledge allows utilities to fine-tune their existing treatment for optimized corrosion control and to understand how changing their control strategy would impact lead leaching in their system.
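The solubility-curve idea can be illustrated with a deliberately simplified equilibrium calculation. The sketch below computes free Pb²⁺ in contact with lead carbonate (cerussite) as a function of pH at fixed DIC, using textbook equilibrium constants; real systems also involve hydroxo and carbonato lead complexes, other solids such as hydrocerussite, and inhibitor scales, so this is illustrative only:

```python
# Highly simplified solubility sketch: free Pb2+ at equilibrium with
# cerussite (PbCO3) versus pH at fixed DIC. Textbook constants; lead
# complexes, other solids, and inhibitor scales are ignored.
K1 = 10 ** -6.35    # H2CO3* <-> H+ + HCO3-   (first carbonic acid dissociation)
K2 = 10 ** -10.33   # HCO3-  <-> H+ + CO3^2-  (second dissociation)
KSP = 10 ** -13.1   # PbCO3(s) <-> Pb2+ + CO3^2- (cerussite, approximate)

def carbonate_fraction(ph):
    """Fraction of DIC present as CO3^2- at a given pH."""
    h = 10 ** -ph
    return 1.0 / (1.0 + h / K2 + h * h / (K1 * K2))

def free_lead_molar(ph, dic_molar):
    """Free Pb2+ (mol/L) at cerussite equilibrium (complexes ignored)."""
    co3 = carbonate_fraction(ph) * dic_molar
    return KSP / co3

# With DIC fixed at ~10 mg C/L (8.3e-4 mol/L), raising pH increases
# CO3^2- and, in this simplified model, lowers free Pb2+:
dic = 8.3e-4
for ph in (7.0, 7.5, 8.0, 8.5, 9.0):
    pb_ug_l = free_lead_molar(ph, dic) * 207.2 * 1e6  # mol/L -> ug/L Pb
    print(f"pH {ph}: ~{pb_ug_l:.1f} ug/L free Pb2+")
```

Even this crude model reproduces the qualitative behavior that motivates pH/DIC adjustment as a corrosion control strategy; utility-specific curves require full speciation modeling against the actual finished water chemistry.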
When to Act
Both recent and historic utility experiences have shown the severity of impacts that can occur when systems fail to recognize the need to re-evaluate corrosion control treatment. Utilities should consider re-evaluating corrosion control following a change in source water or treatment process, an action level exceedance, or an increase in lead/copper levels.
Extensive water quality monitoring should be performed both before and after a lead service line replacement or a change in treatment. Monitoring before helps establish baseline water quality conditions; monitoring after supports early identification of any unintended consequences and ensures that optimal corrosion control and stable water quality conditions are maintained.
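The before/after comparison described above can be sketched as a simple screen: collect baseline results, then flag any sustained post-change increase relative to baseline variability. All values below are invented for illustration:

```python
# Sketch of a before/after monitoring screen: hypothetical baseline lead
# results collected before a treatment change are compared against
# post-change results, flagging any sustained increase for follow-up.
from statistics import mean, stdev

def flag_increase(baseline, post_change, sigma=2.0):
    """Flag if the post-change mean exceeds baseline mean + sigma * stdev."""
    threshold = mean(baseline) + sigma * stdev(baseline)
    return mean(post_change) > threshold

baseline = [3.1, 2.8, 3.4, 2.9, 3.3, 3.0]  # ug/L, invented baseline round
post_ok = [3.2, 3.0, 3.5, 2.7]             # comparable to baseline
post_bad = [6.8, 7.4, 6.1, 8.0]            # sustained jump after the change

print(flag_increase(baseline, post_ok))    # -> False (no flag)
print(flag_increase(baseline, post_bad))   # -> True (investigate)
```

A flagged result would prompt investigation of the treatment change rather than waiting for the next compliance sampling round to reveal the problem.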
For more information, please contact the authors.
Horizons showcases significant water, wastewater, reuse, and stormwater projects and innovations that help our clients to achieve their goals, and can help you achieve yours. Articles are written by top engineers and process group leaders, demonstrating and explaining the beneficial application of a variety of technologies and tools.