Proposed Rule

National Primary Drinking Water Regulations: Long Term 2 Enhanced Surface Water Treatment Rule

Action

Proposed Rule.

Summary

In this document, the Environmental Protection Agency (EPA) is proposing National Primary Drinking Water Regulations that require the use of treatment techniques, along with monitoring, reporting, and public notification requirements, for all public water systems (PWSs) that use surface water sources. The purposes of the Long Term 2 Enhanced Surface Water Treatment Rule (LT2ESWTR) are to improve control of microbial pathogens, including specifically the protozoan Cryptosporidium, in drinking water and to address risk-risk trade-offs with the control of disinfection byproducts. Key provisions in today's proposed LT2ESWTR include the following: source water monitoring for Cryptosporidium, with reduced monitoring requirements for small systems; additional Cryptosporidium treatment for filtered systems based on source water Cryptosporidium concentrations; inactivation of Cryptosporidium by all unfiltered systems; disinfection profiling and benchmarking to ensure continued levels of microbial protection while PWSs take the necessary steps to comply with new disinfection byproduct standards; covering, treating, or implementing a risk management plan for uncovered finished water storage facilities; and criteria for a number of treatment and management options (i.e., the microbial toolbox) that PWSs may implement to meet additional Cryptosporidium treatment requirements. The LT2ESWTR will build upon the treatment technique requirements of the Interim Enhanced Surface Water Treatment Rule and the Long Term 1 Enhanced Surface Water Treatment Rule.

EPA believes that implementation of the LT2ESWTR will significantly reduce levels of Cryptosporidium in finished drinking water. This will substantially lower rates of endemic cryptosporidiosis, the illness caused by Cryptosporidium, which can be severe and sometimes fatal in sensitive subpopulations (e.g., AIDS patients, the elderly). In addition, the treatment technique requirements of this proposal are expected to increase the level of protection from exposure to other microbial pathogens (e.g., Giardia lamblia).

Unified Agenda

Two actions, from June 2003 to July 2004:

  • June 2003: NPRM
  • July 2004: Final Action


DATES:

EPA must receive public comment on the proposal by November 10, 2003.

ADDRESSES:

Comments may be submitted by mail to: Water Docket, Environmental Protection Agency, Mail Code 4101T, 1200 Pennsylvania Ave., NW., Washington, DC 20460, Attention Docket ID No. OW-2002-0039. Comments may also be submitted electronically or through hand delivery/courier by following the detailed instructions as provided in section I.C. of the SUPPLEMENTARY INFORMATION section.

FOR FURTHER INFORMATION CONTACT:

For technical inquiries, contact Daniel Schmelling, Office of Ground Water and Drinking Water (MC 4607M), U.S. Environmental Protection Agency, 1200 Pennsylvania Ave., NW., Washington, DC 20460; telephone (202) 564-5281. For regulatory inquiries, contact Jennifer McLain at the same address; telephone (202) 564-5248. For general information contact the Safe Drinking Water Hotline, Telephone (800) 426-4791. The Safe Drinking Water Hotline is open Monday through Friday, excluding legal holidays, from 9 a.m. to 5:30 p.m. Eastern Time.

SUPPLEMENTARY INFORMATION:

I. General Information

A. Who Is Regulated by This Action?

Entities potentially regulated by the LT2ESWTR are public water systems (PWSs) that use surface water or ground water under the direct influence of surface water (GWUDI). Regulated categories and entities are identified in the following chart.

Category: Examples of regulated entities

Industry: Public Water Systems that use surface water or ground water under the direct influence of surface water.

State, Local, Tribal, or Federal Governments: Public Water Systems that use surface water or ground water under the direct influence of surface water.

This table is not intended to be exhaustive, but rather provides a guide for readers regarding entities likely to be regulated by this action. This table lists the types of entities that EPA is now aware could potentially be regulated by this action. Other types of entities not listed in this table could also be regulated. To determine whether your facility is regulated by this action, you should carefully examine the definition of public water system in § 141.3 of Title 40 of the Code of Federal Regulations and the applicability criteria in §§ 141.76 and 141.501 of today's proposal. If you have questions regarding the applicability of the LT2ESWTR to a particular entity, consult one of the persons listed in the preceding section entitled FOR FURTHER INFORMATION CONTACT.

B. How Can I Get Copies of This Document and Other Related Information?

1. Docket. EPA has established an official public docket for this action under Docket ID No. OW-2002-0039. The official public docket consists of the documents specifically referenced in this action, any public comments received, and other information related to this action. Although a part of the official docket, the public docket does not include Confidential Business Information (CBI) or other information whose disclosure is restricted by statute. The official public docket is the collection of materials that is available for public viewing at the Water Docket in the EPA Docket Center, (EPA/DC) EPA West, Room B102, 1301 Constitution Ave., NW., Washington, DC. The EPA Docket Center Public Reading Room is open from 8:30 a.m. to 4:30 p.m., Monday through Friday, excluding legal holidays. The telephone number for the Public Reading Room is (202) 566-1744, and the telephone number for the Water Docket is (202) 566-2426. For access to docket material, please call (202) 566-2426 to schedule an appointment.

2. Electronic Access. You may access this Federal Register document electronically through the EPA Internet under the “Federal Register” listings at http://www.epa.gov/fedrgstr/.

An electronic version of the public docket is available through EPA's electronic public docket and comment system, EPA Dockets. You may use EPA Dockets at http://www.epa.gov/edocket/ to submit or view public comments, access the index listing of the contents of the official public docket, and to access those documents in the public docket that are available electronically. Once in the system, select “search,” then key in the appropriate docket identification number.

Certain types of information will not be placed in the EPA Dockets. Information claimed as CBI and other information whose disclosure is restricted by statute, which is not included in the official public docket, will not be available for public viewing in EPA's electronic public docket. EPA's policy is that copyrighted material will not be placed in EPA's electronic public docket but will be available only in printed, paper form in the official public docket. Although not all docket materials may be available electronically, you may still access any of the publicly available docket materials through the docket facility identified in section I.B.1.

For public commenters, it is important to note that EPA's policy is that public comments, whether submitted electronically or in paper, will be made available for public viewing in EPA's electronic public docket as EPA receives them and without change, unless the comment contains copyrighted material, CBI, or other information whose disclosure is restricted by statute. When EPA identifies a comment containing copyrighted material, EPA will provide a reference to that material in the version of the comment that is placed in EPA's electronic public docket. The entire printed comment, including the copyrighted material, will be available in the public docket.

Public comments submitted on computer disks that are mailed or delivered to the docket will be transferred to EPA's electronic public docket. Public comments that are mailed or delivered to the Docket will be scanned and placed in EPA's electronic public docket. Where practical, physical objects will be photographed, and the photograph will be placed in EPA's electronic public docket along with a brief description written by the docket staff.

C. How and to Whom Do I Submit Comments?

You may submit comments electronically, by mail, or through hand delivery/courier. To ensure proper receipt by EPA, identify the appropriate docket identification number in the subject line on the first page of your comment. Please ensure that your comments are submitted within the specified comment period. Comments received after the close of the comment period will be marked “late.” EPA is not required to consider these late comments.

1. Electronically. If you submit an electronic comment as prescribed below, EPA recommends that you include your name, mailing address, and an e-mail address or other contact information in the body of your comment. Also include this contact information on the outside of any disk or CD ROM you submit, and in any cover letter accompanying the disk or CD ROM. This ensures that you can be identified as the submitter of the comment and allows EPA to contact you in case EPA cannot read your comment due to technical difficulties or needs further information on the substance of your comment. EPA's policy is that EPA will not edit your comment, and any identifying or contact information provided in the body of a comment will be included as part of the comment that is placed in the official public docket, and made available in EPA's electronic public docket. If EPA cannot read your comment due to technical difficulties and cannot contact you for clarification, EPA may not be able to consider your comment.

a. EPA Dockets. Your use of EPA's electronic public docket to submit comments to EPA electronically is EPA's preferred method for receiving comments. Go directly to EPA Dockets at http://www.epa.gov/edocket, and follow the online instructions for submitting comments. Once in the system, select “search,” and then key in Docket ID No. OW-2002-0039. The system is an “anonymous access” system, which means EPA will not know your identity, e-mail address, or other contact information unless you provide it in the body of your comment.

b. E-mail. Comments may be sent by electronic mail (e-mail) to OW-Docket@epa.gov, Attention Docket ID No. OW-2002-0039. In contrast to EPA's electronic public docket, EPA's e-mail system is not an “anonymous access” system. If you send an e-mail comment directly to the Docket without going through EPA's electronic public docket, EPA's e-mail system automatically captures your e-mail address. E-mail addresses that are automatically captured by EPA's e-mail system are included as part of the comment that is placed in the official public docket, and made available in EPA's electronic public docket.

c. Disk or CD ROM. You may submit comments on a disk or CD ROM that you mail to the mailing address identified in section I.C.2. These electronic submissions will be accepted in WordPerfect or ASCII file format. Avoid the use of special characters and any form of encryption.

2. By Mail. Send three copies of your comments and any enclosures to: Water Docket, Environmental Protection Agency, Mail Code 4101T, 1200 Pennsylvania Ave., NW., Washington, DC, 20460, Attention Docket ID No. OW-2002-0039.

3. By Hand Delivery or Courier. Deliver your comments to: Water Docket, EPA Docket Center, Environmental Protection Agency, Room B102, 1301 Constitution Ave., NW, Washington, DC, Attention Docket ID No. OW-2002-0039. Such deliveries are only accepted during the Docket's normal hours of operation as identified in section I.B.1.

D. What Should I Consider as I Prepare My Comments for EPA?

You may find the following suggestions helpful for preparing your comments:

1. Explain your views as clearly as possible.

2. Describe any assumptions that you used.

3. Provide any technical information and/or data you used that support your views.

4. If you estimate potential burden or costs, explain how you arrived at your estimate.

5. Provide specific examples to illustrate your concerns.

6. Offer alternatives.

7. Make sure to submit your comments by the comment period deadline identified.

8. To ensure proper receipt by EPA, identify the appropriate docket identification number in the subject line on the first page of your response. It would also be helpful if you provided the name, date, and Federal Register citation related to your comments.

Abbreviations Used in This Document

AIPC: All Indian Pueblo Council

ASDWA: Association of State Drinking Water Administrators

ASTM: American Society for Testing and Materials

AWWA: American Water Works Association

AWWARF: American Water Works Association Research Foundation

°C: Degrees Centigrade

CCP: Composite Correction Program

CDC: Centers for Disease Control and Prevention

CFE: Combined Filter Effluent

CFR: Code of Federal Regulations

COI: Cost-of-Illness

CT: The Residual Concentration of Disinfectant (mg/L) Multiplied by the Contact Time (in minutes)

CWS: Community Water Systems

DAPI: 4′,6-Diamidino-2-phenylindole

DBPs: Disinfection Byproducts

DBPR: Disinfectants/Disinfection Byproducts Rule

DE: Diatomaceous Earth

DIC: Differential Interference Contrast (microscopy)

EA: Economic Analysis

EPA: United States Environmental Protection Agency

GAC: Granular Activated Carbon

GWUDI: Ground Water Under the Direct Influence of Surface Water

HAA5: Haloacetic Acids (Monochloroacetic, Dichloroacetic, Trichloroacetic, Monobromoacetic, and Dibromoacetic Acids)

HPC: Heterotrophic Plate Count

ICR: Information Collection Request

ICRSS: Information Collection Rule Supplemental Surveys

ICRSSM: Information Collection Rule Supplemental Survey of Medium Systems

ICRSSL: Information Collection Rule Supplemental Survey of Large Systems

IESWTR: Interim Enhanced Surface Water Treatment Rule

IFA: Immunofluorescence Assay

Log: Logarithm (common, base 10)

LRAA: Locational Running Annual Average

LRV: Log Removal Value

LT1ESWTR: Long Term 1 Enhanced Surface Water Treatment Rule

LT2ESWTR: Long Term 2 Enhanced Surface Water Treatment Rule

MCL: Maximum Contaminant Level

MCLG: Maximum Contaminant Level Goal

MGD: Million Gallons per Day

M-DBP: Microbial and Disinfectants/Disinfection Byproducts

MF: Microfiltration

NCWS: Non-Community Water Systems

NF: Nanofiltration

NODA: Notice of Data Availability

NPDWR: National Primary Drinking Water Regulation

NTNCWS: Non-Transient Non-Community Water System

NTTAA: National Technology Transfer and Advancement Act

NTU: Nephelometric Turbidity Unit

OMB: Office of Management and Budget

PE: Performance Evaluation

PWS: Public Water System

QC: Quality Control

QCRV: Quality Control Release Value

RAA: Running Annual Average

RFA: Regulatory Flexibility Act

RO: Reverse Osmosis

RSD: Relative Standard Deviation

SAB: Science Advisory Board

SBAR: Small Business Advocacy Review

SERs: Small Entity Representatives

SDWA: Safe Drinking Water Act

SWTR: Surface Water Treatment Rule

TCR: Total Coliform Rule

TTHM: Total Trihalomethanes

TNCWS: Transient Non-Community Water Systems

UF: Ultrafiltration

UMRA: Unfunded Mandates Reform Act

Table of Contents

I. Summary

A. Why Is EPA Proposing the LT2ESWTR?

B. What Does the LT2ESWTR Proposal Require?

1. Treatment Requirements for Cryptosporidium

2. Disinfection Profiling and Benchmarking

3. Uncovered Finished Water Storage Facilities

C. Will This Proposed Regulation Apply to My Water System?

II. Background

A. What Is the Statutory Authority for the LT2ESWTR?

B. What Current Regulations Address Microbial Pathogens in Drinking Water?

1. Surface Water Treatment Rule

2. Total Coliform Rule

3. Interim Enhanced Surface Water Treatment Rule

4. Long Term 1 Enhanced Surface Water Treatment Rule

5. Filter Backwash Recycle Rule

C. What Public Health Concerns Does This Proposal Address?

1. Introduction

2. Cryptosporidium Health Effects and Outbreaks

a. Health Effects

b. Waterborne Cryptosporidiosis Outbreaks

3. Remaining Public Health Concerns Following the IESWTR and LT1ESWTR

a. Adequacy of Physical Removal To Control Cryptosporidium and the Need for Risk-Based Treatment Requirements

b. Control of Cryptosporidium in Unfiltered Systems

c. Uncovered Finished Water Storage Facilities

D. Federal Advisory Committee Process

III. New Information on Cryptosporidium Health Risks and Treatment

A. Overview of Critical Factors for Evaluating Regulation of Microbial Pathogens

B. Cryptosporidium Infectivity

1. Cryptosporidium Infectivity Data Evaluated for IESWTR

2. New Data on Cryptosporidium Infectivity

3. Significance of New Infectivity Data

C. Cryptosporidium Occurrence

1. Occurrence Data Evaluated for IESWTR

a. Filtered Systems

b. Unfiltered Systems

2. Overview of the Information Collection Rule and Information Collection Rule Supplemental Surveys (ICRSS)

a. Scope of the Information Collection Rule

b. Scope of the ICRSS

3. Analytical Methods for Protozoa in the Information Collection Rule and ICRSS

a. Information Collection Rule Protozoan Method

b. Method 1622 and Method 1623

4. Cryptosporidium Occurrence Results from the Information Collection Rule and ICRSS

a. Information Collection Rule Results

b. ICRSS Results

5. Significance of New Cryptosporidium Occurrence Data

6. Request for Comment on Information Collection Rule and ICRSS Data Sets

D. Treatment

1. Overview

2. Treatment Information Considered for the IESWTR and LT1ESWTR

a. Physical Removal

b. Inactivation

3. New Information on Treatment for Control of Cryptosporidium

a. Conventional Filtration Treatment and Direct Filtration

i. Dissolved Air Flotation

b. Slow Sand Filtration

c. Diatomaceous Earth Filtration

d. Other Filtration Technologies

e. Inactivation

i. Ozone and Chlorine Dioxide

ii. Ultraviolet Light

iii. Significance of New Information on Inactivation

IV. Discussion of Proposed LT2ESWTR Requirements

A. Additional Cryptosporidium Treatment Technique Requirements for Filtered Systems

1. What Is EPA Proposing Today?

a. Overview of Framework Approach

b. Monitoring Requirements

c. Treatment Requirements

i. Bin Classification

ii. Credit for Treatment in Place

iii. Treatment Requirements Associated With LT2ESWTR Bins

d. Use of Previously Collected Data

2. How Was This Proposal Developed?

a. Basis for Targeted Treatment Requirements

b. Basis for Bin Concentration Ranges and Treatment Requirements

i. What Is the Risk Associated With a Given Level of Cryptosporidium in a Drinking Water Source?

ii. What Degree of Additional Treatment Should Be Required for a Given Source Water Cryptosporidium Level?

c. Basis for Source Water Monitoring Requirements

i. Systems Serving at Least 10,000 People

ii. Systems Serving Fewer Than 10,000 People

iii. Future Monitoring and Reassessment

d. Basis for Accepting Previously Collected Data

3. Request for Comment

B. Unfiltered System Treatment Technique Requirements for Cryptosporidium

1. What Is EPA Proposing Today?

a. Overview

b. Monitoring Requirements

c. Treatment Requirements

2. How Was This Proposal Developed?

a. Basis for Cryptosporidium Treatment Requirements

b. Basis for Requiring the Use of Two Disinfectants

c. Basis for Source Water Monitoring Requirements

3. Request for Comment

C. Options for Systems to Meet Cryptosporidium Treatment Requirements

1. Microbial Toolbox Overview

2. Watershed Control Program

a. What Is EPA Proposing Today?

b. How Was This Proposal Developed?

c. Request for Comment

3. Alternative Source

a. What Is EPA Proposing Today?

b. How Was This Proposal Developed?

c. Request for Comment

4. Off-stream Raw Water Storage

a. What Is EPA Proposing Today?

b. How Was This Proposal Developed?

c. Request for Comment

5. Pre-sedimentation With Coagulant

a. What Is EPA Proposing Today?

b. How Was This Proposal Developed?

i. Published Studies of Cryptosporidium Removal by Conventional Sedimentation Basins

ii. Data Supplied by Utilities on the Removal of Spores by Presedimentation

c. Request for Comment

6. Bank Filtration

a. What Is EPA Proposing Today?

b. How Was This Proposal Developed?

c. Request for Comment

7. Lime Softening

a. What Is EPA Proposing Today?

b. How Was This Proposal Developed?

c. Request for Comment

8. Combined Filter Performance

a. What Is EPA Proposing Today?

b. How Was This Proposal Developed?

c. Request for Comment

9. Roughing Filter

a. What Is EPA Proposing Today?

b. How Was This Proposal Developed?

c. Request for Comment

10. Slow Sand Filtration

a. What Is EPA Proposing Today?

b. How Was This Proposal Developed?

c. Request for Comment

11. Membrane Filtration

a. What Is EPA Proposing Today?

b. How Was This Proposal Developed?

c. Request for Comment

12. Bag and Cartridge Filtration

a. What Is EPA Proposing Today?

b. How Was This Proposal Developed?

c. Request for Comment

13. Secondary Filtration

a. What Is EPA Proposing Today?

b. How Was This Proposal Developed?

c. Request for Comment

14. Ozone and Chlorine Dioxide

a. What Is EPA Proposing Today?

b. How Was This Proposal Developed?

c. Request for Comments

15. Ultraviolet Light

a. What Is EPA Proposing Today?

b. How Was This Proposal Developed?

c. Request for Comment

16. Individual Filter Performance

a. What Is EPA Proposing Today?

b. How Was This Proposal Developed?

c. Request for Comment

17. Other Demonstration of Performance

a. What Is EPA Proposing Today?

b. How Was This Proposal Developed?

c. Request for Comment

D. Disinfection Benchmarks for Giardia lamblia and Viruses

1. What Is EPA Proposing Today?

a. Applicability and Schedule

b. Developing the Disinfection Profile and Benchmark

c. State Review

2. How Was This Proposal Developed?

3. Request for Comments

E. Additional Treatment Technique Requirements for Systems with Uncovered Finished Water Storage Facilities

1. What Is EPA Proposing Today?

2. How Was This Proposal Developed?

3. Request for Comments

F. Compliance Schedules

1. What Is EPA Proposing Today?

a. Source Water Monitoring

i. Filtered Systems

ii. Unfiltered Systems

b. Treatment Requirements

c. Disinfection Benchmarks for Giardia lamblia and Viruses

2. How Was This Proposal Developed?

3. Request for Comments

G. Public Notice Requirements

1. What Is EPA Proposing Today?

2. How Was This Proposal Developed?

3. Request for Comment

H. Variances and Exemptions

1. Variances

2. Exemptions

3. Request for Comment

a. Variances

b. Exemptions

I. Requirements for Systems To Use Qualified Operators

J. System Reporting and Recordkeeping Requirements

1. Overview

2. Reporting Requirements for Source Water Monitoring

a. Data Elements To Be Reported

b. Data System

c. Previously Collected Monitoring Data

3. Compliance With Additional Treatment Requirements

4. Request for Comment

K. Analytical Methods

1. Cryptosporidium

a. What Is EPA Proposing Today?

b. How Was This Proposal Developed?

c. Request for Comment

2. E. coli

a. What Is EPA Proposing Today?

b. How Was This Proposal Developed?

c. Request for Comment

3. Turbidity

a. What Is EPA Proposing Today?

b. How Was This Proposal Developed?

c. Request for Comment

L. Laboratory Approval

1. Cryptosporidium Laboratory Approval

2. E. coli Laboratory Approval

3. Turbidity Analyst Approval

4. Request for Comment

M. Requirements for Sanitary Surveys Conducted by EPA

1. Overview

2. Background

3. Request for Comment

V. State Implementation

A. Special State Primacy Requirements

B. State Recordkeeping Requirements

C. State Reporting Requirements

D. Interim Primacy

VI. Economic Analysis

A. What Regulatory Alternatives Did the Agency Consider?

B. What Analyses Support Selecting the Proposed Rule Option?

C. What Are the Benefits of the Proposed LT2ESWTR?

1. Non-quantifiable Health and Non-health Related Benefits

2. Quantifiable Health Benefits

a. Filtered Systems

b. Unfiltered Systems

3. Timing of Benefits Accrual (latency)

D. What Are the Costs of the Proposed LT2ESWTR?

1. Total Annualized Present Value Costs

2. Water System Costs

a. Source Water Monitoring Costs

b. Filtered Systems Treatment Costs

c. Unfiltered Systems Treatment Costs

d. Uncovered Finished Water Storage Facilities

e. Future Monitoring Costs

f. Sensitivity Analysis: Influent Bromide Levels on Technology Selection for Filtered Plants

3. State/Primacy Agency Costs

4. Non-quantified Costs

E. What Are the Household Costs of the Proposed Rule?

F. What Are the Incremental Costs and Benefits of the Proposed LT2ESWTR?

G. Are There Benefits From the Reduction of Co-occurring Contaminants?

H. Are There Increased Risks From Other Contaminants?

I. What Are the Effects of the Contaminant on the General Population and Groups Within the General Population That Are Identified as Likely to Be at Greater Risk of Adverse Health Effects?

J. What Are the Uncertainties in the Baseline, Risk, Benefit, and Cost Estimates for the Proposed LT2ESWTR as well as the Quality and Extent of the Information?

K. What Is the Benefit/Cost Determination for the Proposed LT2ESWTR?

L. Request for Comment

VII. Statutory and Executive Order Reviews

A. Executive Order 12866: Regulatory Planning and Review

B. Paperwork Reduction Act

C. Regulatory Flexibility Act

D. Unfunded Mandates Reform Act

1. Summary of UMRA Requirements

2. Written Statement for Rules With Federal Mandates of $100 Million or More

a. Authorizing Legislation

b. Cost-benefit Analysis

c. Estimates of Future Compliance Costs and Disproportionate Budgetary Effects

d. Macro-economic Effects

e. Summary of EPA Consultation With State, Local, and Tribal Governments and Their Concerns

f. Regulatory Alternatives Considered

g. Selection of the Least Costly, Most Cost-effective, or Least Burdensome Alternative That Achieves the Objectives of the Rule

3. Impacts on Small Governments

E. Executive Order 13132: Federalism

F. Executive Order 13175: Consultation and Coordination With Indian Tribal Governments

G. Executive Order 13045: Protection of Children from Environmental Health and Safety Risks

H. Executive Order 13211: Actions that Significantly Affect Energy Supply, Distribution, or Use

I. National Technology Transfer and Advancement Act

J. Executive Order 12898: Federal Actions to Address Environmental Justice in Minority Populations or Low-Income Populations

K. Consultations With the Science Advisory Board, National Drinking Water Advisory Council, and the Secretary of Health and Human Services

L. Plain Language

VIII. References

I. Summary

A. Why Is EPA Proposing the LT2ESWTR?

EPA is proposing the Long Term 2 Enhanced Surface Water Treatment Rule (LT2ESWTR) to provide for increased protection against microbial pathogens in public water systems that use surface water sources. The proposed LT2ESWTR focuses on Cryptosporidium, which is a protozoan pathogen that is widespread in surface water. EPA is particularly concerned about Cryptosporidium because it is highly resistant to inactivation by standard disinfection practices like chlorination. Ingestion of Cryptosporidium oocysts can cause acute gastrointestinal illness, and health effects in sensitive subpopulations may be severe, including risk of mortality. Cryptosporidium has been identified as the pathogenic agent in a number of waterborne disease outbreaks across the U.S. and in Canada (details in section II).

The intent of the LT2ESWTR is to supplement existing microbial treatment requirements for systems where additional public health protection is needed. Currently, the Interim Enhanced Surface Water Treatment Rule (IESWTR) requires large systems that filter to remove at least 99% (2 log) of Cryptosporidium (63 FR 69478, December 16, 1998) (USEPA 1998a). The Long Term 1 Enhanced Surface Water Treatment Rule (LT1ESWTR) extends this requirement to small systems (67 FR 1812, January 14, 2002) (USEPA 2002a). Subsequent to promulgating these regulations, EPA has evaluated significant new data on Cryptosporidium infectivity, occurrence, and treatment (details in section III). These data indicate that current treatment requirements achieve adequate protection for the majority of systems, but there is a subset of systems with higher vulnerability to Cryptosporidium where additional treatment is necessary.

Specifically, national survey data show that average Cryptosporidium occurrence in filtered systems is lower than previously estimated. However, these data also demonstrate that Cryptosporidium concentrations vary widely among systems, and that a fraction of filtered systems have relatively high levels of source water Cryptosporidium contamination. Based on this finding, along with new data suggesting that the infectivity (i.e., virulence) of Cryptosporidium may be substantially higher than previously understood, EPA has concluded that the current 2 log removal requirement does not provide an adequate degree of treatment in filtered systems with the highest source water Cryptosporidium levels. Consequently, EPA is proposing targeted additional treatment requirements under the LT2ESWTR for filtered systems with the highest Cryptosporidium risk.

Under current regulations, unfiltered systems are not required to provide any treatment for Cryptosporidium. New occurrence data suggest that typical Cryptosporidium levels in the treated water of unfiltered systems are substantially higher than in the treated water of filtered systems. Hence, Cryptosporidium treatment by unfiltered systems is needed to achieve equivalent public health protection. Recent treatment studies have allowed EPA to develop criteria for systems to inactivate Cryptosporidium with ozone, ultraviolet (UV) light, and chlorine dioxide. As a result, EPA has concluded that it is feasible and appropriate to propose under the LT2ESWTR that all unfiltered systems treat for Cryptosporidium.

In addition to concern with Cryptosporidium, the LT2ESWTR proposal is intended to ensure that systems maintain adequate protection against microbial pathogens as they take steps to reduce formation of disinfection byproducts (DBPs). Along with the LT2ESWTR, EPA is also developing a Stage 2 Disinfection Byproducts Rule (DBPR), which will further limit allowable levels of trihalomethanes and haloacetic acids. The proposed LT2ESWTR contains disinfection profiling and benchmarking requirements to ensure that microbial protection is maintained as systems comply with the Stage 2 DBPR. Also in the proposed LT2ESWTR are requirements to limit risk associated with existing uncovered finished water storage facilities. Uncovered storage facilities are subject to contamination if not properly managed or treated.

Today's proposed LT2ESWTR reflects consensus recommendations from the Stage 2 Microbial and Disinfection Byproducts (M-DBP) Federal Advisory Committee. These recommendations are set forth in the Stage 2 M-DBP Agreement in Principle (65 FR 83015, December 29, 2000) (USEPA 2000a).

B. What Does the LT2ESWTR Proposal Require?

1. Treatment Requirements for Cryptosporidium

EPA is proposing risk-targeted treatment technique requirements for Cryptosporidium control in filtered systems that are based on a microbial framework approach. Under this approach, systems that use a surface water or ground water under the direct influence of surface water (referred to collectively as surface water systems) will conduct source water monitoring to determine an average Cryptosporidium concentration. Based on monitoring results, filtered systems will be classified in one of four possible risk categories (bins). A filtered system's bin classification determines the extent of any additional Cryptosporidium treatment requirements beyond the requirements of current regulations.

EPA expects that the majority of filtered systems will be classified in Bin 1, which carries no additional treatment requirements. Systems classified in Bins 2-4 will be required to provide from 1.0 to 2.5 log of treatment (i.e., 90 to 99.7 percent reduction) for Cryptosporidium in addition to conventional treatment that complies with the IESWTR or LT1ESWTR (details in section IV.A). Filtered systems will meet additional Cryptosporidium treatment requirements by using one or more treatment or control steps from a “microbial toolbox” of options (details in section IV.C). In lieu of monitoring, filtered systems may elect to comply directly with the treatment requirements of Bin 4.
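The correspondence between log treatment credit and percent reduction cited above (1.0 log is a 90 percent reduction; 2.5 log is approximately a 99.7 percent reduction) follows from simple base-10 arithmetic. The sketch below is illustrative only; the function name is ours, and the per-bin treatment levels themselves are specified in section IV.A of the proposal.

```python
def log_to_percent_reduction(log_credit):
    """Convert a log10 treatment credit to percent reduction.

    A 1.0 log treatment removes or inactivates 90 percent of
    organisms, 2.0 log removes 99 percent, and so on:
    percent = 100 * (1 - 10 ** (-log_credit)).
    """
    return 100.0 * (1.0 - 10.0 ** (-log_credit))

# The range cited for Bins 2-4 in the proposal:
print(log_to_percent_reduction(1.0))            # 90.0
print(round(log_to_percent_reduction(2.5), 1))  # 99.7
```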

Under the proposed LT2ESWTR, all surface water systems that are not required to filter (i.e., unfiltered systems) must provide at least 2 log (i.e., 99 percent) inactivation of Cryptosporidium. In addition, unfiltered systems will monitor for Cryptosporidium in their source water and must achieve at least 3 log (i.e., 99.9 percent) inactivation of Cryptosporidium if the mean level exceeds 0.01 oocysts/L. Alternatively, unfiltered systems may elect to provide 3 log Cryptosporidium inactivation directly, instead of monitoring. All requirements established under the Surface Water Treatment Rule (SWTR) (54 FR 27486, June 29, 1989) (USEPA 1989a) for unfiltered systems will remain in effect, including 3 log inactivation of Giardia lamblia and 4 log inactivation of viruses. However, the LT2ESWTR proposal requires that unfiltered systems achieve their overall inactivation requirements using a minimum of two disinfectants (details in section IV.B).
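The monitoring-based inactivation requirement for unfiltered systems described above reduces to a simple decision rule. The sketch below restates it in code; the 0.01 oocysts/L trigger and the 2 log and 3 log levels are taken directly from the proposal text, while the function itself is only an illustration.

```python
def required_crypto_inactivation_log(mean_oocysts_per_liter=None):
    """Cryptosporidium inactivation (log10) required of an unfiltered system.

    A system that elects not to monitor must comply with the
    higher requirement directly (pass None for the mean).
    """
    if mean_oocysts_per_liter is None:
        return 3.0  # no monitoring: provide 3 log inactivation directly
    if mean_oocysts_per_liter > 0.01:
        return 3.0  # mean exceeds 0.01 oocysts/L: 99.9 percent inactivation
    return 2.0      # baseline requirement: 99 percent inactivation

print(required_crypto_inactivation_log(0.005))  # 2.0
print(required_crypto_inactivation_log(0.02))   # 3.0
```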

2. Disinfection Profiling and Benchmarking

The purpose of disinfection profiling and benchmarking is to ensure that when a system makes a significant change to its disinfection practice, it does not compromise the adequacy of existing microbial protection. EPA established the disinfection benchmark under the IESWTR and LT1ESWTR for the Stage 1 M-DBP rules, and the LT2ESWTR proposal extends disinfection benchmark requirements to apply to the Stage 2 M-DBP rules.

The proposed profiling and benchmarking requirements are similar to those promulgated under IESWTR and LT1ESWTR. Systems that meet specified criteria must prepare disinfection profiles that characterize current levels of virus and Giardia lamblia inactivation over the course of one year. Systems with valid operational data from profiling conducted under the IESWTR or LT1ESWTR are not required to collect additional data. If a system that is required to prepare a profile proposes to make a significant change to its disinfection practice, the system must calculate a disinfection benchmark and must consult with the State regarding how the proposed change will affect the current benchmark (details in section IV.D).

3. Uncovered Finished Water Storage Facilities

The proposed LT2ESWTR also includes requirements for systems with uncovered finished water storage facilities. The IESWTR and LT1ESWTR require systems to cover all new storage facilities for finished water, but these rules do not address existing uncovered finished water storage facilities. Under the LT2ESWTR proposal, systems with uncovered finished water storage facilities must cover the storage facility or treat the storage facility discharge to achieve 4 log virus inactivation unless the State determines that existing risk mitigation is adequate. Where the State makes such a determination, systems must develop and implement a risk mitigation plan that addresses physical access, surface water run-off, animal and bird wastes, and on-going water quality assessment (details in section IV.E).

C. Will This Proposed Regulation Apply to My Water System?

All community and non-community water systems that use surface water or ground water under the direct influence of surface water are affected by the proposed LT2ESWTR.

II. Background

A. What Is the Statutory Authority for the LT2ESWTR?

This section discusses the Safe Drinking Water Act (SDWA or the Act) sections that direct the development of the LT2ESWTR.

The Act, as amended in 1996, requires EPA to publish a maximum contaminant level goal (MCLG) and promulgate a national primary drinking water regulation (NPDWR) with enforceable requirements for any contaminant that the Administrator determines may have an adverse effect on the health of persons, is known to occur or there is a substantial likelihood that the contaminant will occur in public water systems (PWSs) with a frequency and at levels of public health concern, and for which, in the sole judgment of the Administrator, regulation of such contaminant presents a meaningful opportunity for health risk reduction for persons served by PWSs (section 1412 (b)(1)(A)).

MCLGs are non-enforceable health goals, and are to be set at a level at which no known or anticipated adverse effect on the health of persons occurs and which allows an adequate margin of safety (sections 1412(b)(4) and 1412(a)(3)). EPA established an MCLG of zero for Cryptosporidium under the IESWTR (63 FR 69478, December 16, 1998) (USEPA 1998a). The Agency is not proposing any changes to the current MCLG for Cryptosporidium.

The Act also requires that at the same time EPA publishes an NPDWR and MCLG, it must specify in the NPDWR a maximum contaminant level (MCL) which is as close to the MCLG as is feasible (sections 1412(b)(4) and 1401(1)(C)). The Agency is authorized to promulgate an NPDWR that requires the use of a treatment technique in lieu of establishing an MCL if the Agency finds that it is not economically or technologically feasible to ascertain the level of the contaminant (sections 1412(b)(7)(A) and 1401(1)(C)). The Act specifies that in such cases, the Agency shall identify those treatment techniques that would prevent known or anticipated adverse effects on the health of persons to the extent feasible (section 1412(b)(7)(A)).

The Agency has concluded that it is not currently economically or technologically feasible for PWSs to determine the level of Cryptosporidium in finished drinking water for the purpose of compliance with a finished water standard (the performance of available analytical methods for Cryptosporidium is described in section III.C; the treated water Cryptosporidium levels that the LT2ESWTR will achieve are described in section IV.A). Consequently, today's proposal for the LT2ESWTR relies on treatment technique requirements to reduce health risks from Cryptosporidium in PWSs.

When proposing an NPDWR that includes an MCL or treatment technique, the Act requires EPA to publish and seek public comment on an analysis of health risk reduction and cost impacts. This includes an analysis of quantifiable and nonquantifiable costs and health risk reduction benefits, incremental costs and benefits of each alternative considered, the effects of the contaminant upon sensitive subpopulations (e.g., infants, children, pregnant women, the elderly, and individuals with a history of serious illness), any increased risk that may occur as the result of compliance, and other relevant factors (section 1412 (b)(3)(C)). EPA's analysis of health benefits and costs associated with the proposed LT2ESWTR is presented in “Economic Analysis of the LT2ESWTR” (USEPA 2003a) and is summarized in section VI of this preamble. However, the Act does not authorize the Administrator to use additional health risk reduction and cost considerations to establish MCL or treatment technique requirements for the control of Cryptosporidium (section 1412 (b)(6)(C)).

Finally, section 1412 (b)(2)(C) of SDWA requires EPA to promulgate a Stage 2 Disinfectants and Disinfection Byproducts Rule within 18 months after promulgation of the LT1ESWTR, which occurred on January 14, 2002. Consistent with statutory requirements for risk balancing (section 1412(b)(5)(B)), EPA will finalize the LT2ESWTR with the Stage 2 DBPR to ensure parallel protection from microbial and DBP risks.

B. What Current Regulations Address Microbial Pathogens in Drinking Water?

This section summarizes the existing regulations that apply to control of pathogenic microorganisms in surface water systems. These rules form the baseline of regulatory protection that will be supplemented by the LT2ESWTR.

1. Surface Water Treatment Rule

The SWTR (54 FR 27486, June 29, 1989) (USEPA 1989a) applies to all PWSs using surface water or ground water under the direct influence (GWUDI) of surface water as sources (Subpart H systems). It established MCLGs of zero for Giardia lamblia, viruses, and Legionella, and includes treatment technique requirements to reduce exposure to pathogenic microorganisms, including: (1) Filtration, unless specified avoidance criteria are met; (2) maintenance of a disinfectant residual in the distribution system; (3) removal and/or inactivation of 3 log (99.9%) of Giardia lamblia and 4 log (99.99%) of viruses; (4) combined filter effluent turbidity of 5 nephelometric turbidity units (NTU) as a maximum and 0.5 NTU at the 95th percentile monthly for treatment plants using conventional treatment or direct filtration (with separate standards for other filtration technologies); and (5) watershed protection and source water quality requirements for unfiltered systems.

2. Total Coliform Rule

The Total Coliform Rule (TCR) (54 FR 27544, June 29, 1989) (USEPA 1989b) applies to all PWSs. It established an MCLG of zero for total and fecal coliform bacteria, and an MCL based on the percentage of positive samples collected during a compliance period. Coliforms are used as a screen for fecal contamination and to determine the integrity of the water treatment process and distribution system. Under the TCR, no more than 5 percent of distribution system samples collected in any month may contain coliform bacteria (no more than 1 sample per month may be coliform positive in those systems that collect fewer than 40 samples per month). The number of samples to be collected in a month is based on the number of people served by the system.
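The monthly MCL logic summarized above can be restated as a short compliance check. This sketch is illustrative only; the 40-sample cutoff, the 5 percent limit, and the 1-positive-sample limit come from the TCR provisions described in this paragraph, while the function name is ours.

```python
def tcr_month_in_compliance(total_samples, positive_samples):
    """Check one month of distribution system samples against the TCR MCL.

    Systems collecting 40 or more samples in a month: no more than
    5 percent may be total coliform positive. Systems collecting
    fewer than 40 samples: no more than 1 positive sample.
    """
    if total_samples >= 40:
        return positive_samples <= 0.05 * total_samples
    return positive_samples <= 1

print(tcr_month_in_compliance(100, 5))  # True
print(tcr_month_in_compliance(20, 2))   # False
```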

3. Interim Enhanced Surface Water Treatment Rule

The IESWTR (63 FR 69477, December 16, 1998) (USEPA 1998a) applies to PWSs serving at least 10,000 people and using surface water or GWUDI sources. Key provisions established by the IESWTR include the following: (1) An MCLG of zero for Cryptosporidium; (2) Cryptosporidium removal requirements of 2 log (99 percent) for systems that filter; (3) strengthened combined filter effluent turbidity performance standards of 1.0 NTU as a maximum and 0.3 NTU at the 95th percentile monthly for treatment plants using conventional treatment or direct filtration; (4) requirements for individual filter turbidity monitoring; (5) disinfection benchmark provisions to assess the level of microbial protection provided as facilities take steps to comply with new DBP standards; (6) inclusion of Cryptosporidium in the definition of GWUDI and in the watershed control requirements for unfiltered public water systems; (7) requirements for covers on new finished water storage facilities; and (8) sanitary surveys for all surface water systems regardless of size.

The IESWTR was developed in conjunction with the Stage 1 Disinfectants and Disinfection Byproducts Rule (Stage 1 DBPR) (63 FR 69389; December 16, 1998) (USEPA 1998b), which reduced allowable levels of certain DBPs, including trihalomethanes, haloacetic acids, chlorite, and bromate.

4. Long Term 1 Enhanced Surface Water Treatment Rule

The LT1ESWTR (67 FR 1812, January 14, 2002) (USEPA 2002a) builds upon the microbial control provisions established by the IESWTR for large systems by extending similar requirements to small systems. The LT1ESWTR applies to PWSs using surface water or GWUDI as sources that serve fewer than 10,000 people. Like the IESWTR, the LT1ESWTR established the following: 2 log (99 percent) Cryptosporidium removal requirements for systems that filter; individual filter turbidity monitoring and more stringent combined filter effluent turbidity standards for conventional and direct filtration plants; disinfection profiling and benchmarking; inclusion of Cryptosporidium in the definition of GWUDI and in the watershed control requirements for unfiltered systems; and the requirement that new finished water storage facilities be covered.

5. Filter Backwash Recycle Rule

EPA promulgated the Filter Backwash Recycling Rule (FBRR) (66 FR 31085, June 8, 2001) (USEPA 2001a) to increase protection of finished drinking water supplies from contamination by Cryptosporidium and other microbial pathogens. The FBRR requirements will reduce the potential risks associated with recycling contaminants removed during the filtration process. The FBRR provisions apply to all systems that recycle, regardless of population served. In general, the provisions include the following: (1) Recycling systems must return certain recycle streams prior to the point of primary coagulant addition unless the State specifies an alternative location; (2) direct filtration systems recycling to the treatment process must provide detailed recycle treatment information to the State; and (3) certain conventional systems that practice direct recycling must perform a one-month, one-time recycling self-assessment.

C. What Public Health Concerns Does This Proposal Address?

This section presents the basis for the public health concern associated with Cryptosporidium in drinking water by summarizing information on Cryptosporidium health effects and outbreaks. This is followed by a description of the specific areas of public health concern that remain after implementation of the IESWTR and LT1ESWTR and that are addressed in the LT2ESWTR proposal. More detailed information about Cryptosporidium health effects may be found in the following criteria documents: Cryptosporidium: Human Health Criteria Document (USEPA 2001b), Cryptosporidium: Drinking Water Advisory (USEPA 2001c), and Cryptosporidium: Risks for Infants and Children (USEPA 2001d).

1. Introduction

While modern water treatment systems have substantially reduced waterborne disease incidence, drinking water contamination remains a significant health risk management challenge. EPA's Science Advisory Board in 1990 cited drinking water contamination, particularly contamination by pathogenic microorganisms, as one of the most important environmental risks (USEPA 1990). This risk is underscored by information from the Centers for Disease Control and Prevention (CDC), which indicates that between 1980 and 1998 a total of 419 outbreaks associated with drinking water were reported, with more than 511,000 estimated cases of disease. A number of agents were implicated in these outbreaks, including viruses, bacteria, and protozoa, as well as several chemicals (Craun and Calderon 1996, Levy et al. 1998, Barwick et al. 2000). The majority of cases were associated with surface water, and specifically with the 1993 Cryptosporidium outbreak in Milwaukee, WI, with an estimated 403,000 cases (Mac Kenzie et al. 1994). A recent study by McDonald et al. (2001), which used blood samples from Milwaukee children collected during and after the 1993 outbreak, suggests that Cryptosporidium infection, including asymptomatic infection, was more widespread than might be inferred from the illness estimates by Mac Kenzie et al. (1994).

It is important to note that the number of identified and reported outbreaks in the CDC database is believed to substantially understate the actual incidence of waterborne disease outbreaks and cases (Craun and Calderon 1996, National Research Council 1997). This underreporting is due to a number of factors. Many people experiencing gastrointestinal illness do not seek medical attention. Where medical attention is provided, the pathogenic agent may not be identified through routine testing. Physicians often lack sufficient information to attribute gastrointestinal illness to any specific origin, such as drinking water, and few States have an active outbreak surveillance program. Consequently, outbreaks are often not recognized in a community or, if recognized, are not traced to a drinking water source.

In addition, an unknown but probably significant portion of waterborne disease is endemic (i.e., isolated cases not associated with an outbreak) and, thus, is even more difficult to recognize. The Economic Analysis for the proposed LT2ESWTR (USEPA 2003a) uses data on Cryptosporidium occurrence, infectivity, and treatment to estimate the baseline endemic incidence of cryptosporidiosis attributable to drinking water, as well as the reductions projected as a result of this rule.

Most waterborne pathogens cause gastrointestinal illness with diarrhea, abdominal discomfort, nausea, vomiting, and other symptoms. The effects of waterborne disease are usually acute, resulting from a single or small number of exposures. Such illnesses are generally of short duration in healthy people. However, some pathogens, including Giardia lamblia and Cryptosporidium, may cause disease lasting weeks or longer in otherwise healthy individuals. Waterborne pathogens also cause more serious disorders such as hepatitis, peptic ulcers, myocarditis, paralysis, conjunctivitis, swollen lymph glands, meningitis, and reactive arthritis, and have been associated with diabetes, encephalitis, and other diseases (Lederberg 1992).

There are populations that are at greater risk from waterborne disease. These sensitive subpopulations include children (especially infants), the elderly, the malnourished, pregnant women, the disease impaired (e.g., diabetes, cystic fibrosis), and a broad category of those with compromised immune systems, such as AIDS patients, those with autoimmune disorders (e.g., rheumatoid arthritis, lupus erythematosus, multiple sclerosis), transplant recipients, and those on chemotherapy (Rose 1997). This sensitive segment represents almost 20% of the population in the United States (Gerba et al. 1996). The severity and duration of illness is often greater in sensitive subpopulations than in healthy individuals, and in a small percentage of such cases, death may result.

2. Cryptosporidium Health Effects and Outbreaks

Cryptosporidium is a protozoan parasite that exists in warm-blooded hosts and, upon excretion, may survive for months in the environment (Kato et al., 2001). Ingestion of Cryptosporidium can lead to cryptosporidiosis, a gastrointestinal illness. Transmission of cryptosporidiosis often occurs through consumption of feces-contaminated food or water, but may also result from direct or indirect contact with infected persons or animals (Casemore 1990). Surveys (described in Section III) indicate that Cryptosporidium is common in surface waters used as drinking water supplies. Sources of Cryptosporidium contamination include animal agriculture, wastewater treatment plant discharges, slaughterhouses, birds, wild animals, and other sources of fecal matter.

EPA is particularly concerned about Cryptosporidium because, unlike pathogens such as bacteria and most viruses, Cryptosporidium oocysts are highly resistant to standard disinfectants like chlorine and chloramines. Consequently, control of Cryptosporidium in most treatment plants is dependent on physical removal processes. Finished water monitoring data indicate that Cryptosporidium is sometimes present in filtered, treated drinking water (LeChevallier et al. 1991; Aboytes et al. 2002). Moreover, as noted later, many of the individuals sickened by waterborne outbreaks of cryptosporidiosis were served by filtered surface water supplies (Solo-Gabriele and Neumeister, 1996). In some cases, these outbreaks were attributed to treatment deficiencies, while in other cases the cause was unidentified (see Table II-1).

These data suggest that surface water systems that filter and disinfect may still be vulnerable to Cryptosporidium, depending on the source water quality and treatment effectiveness. Today's proposed rule addresses concern with passage of Cryptosporidium through physical removal processes during water treatment, as well as in systems lacking filtration.

a. Health effects. Cryptosporidium infection is characterized by mild to severe diarrhea, dehydration, stomach cramps, and/or a slight fever. Symptoms typically last from several days to two weeks, though in a small percentage of cases, the symptoms may persist for months or longer in otherwise healthy individuals. Human feeding studies have demonstrated that a low dose of Cryptosporidium parvum (C. parvum) is sufficient to cause infection in healthy adults (DuPont et al. 1995, Chappell et al. 1999, Messner et al. 2001). Studies of immunosuppressed adult mice have demonstrated that a single viable oocyst can induce patent C. parvum infections (Yang et al. 2000).

There is evidence that an immune response to Cryptosporidium exists, but the degree and duration of this immunity is not well characterized. In a study by Chappell et al. (1999), individuals with a blood serum antibody (IgG), which can develop from exposure to C. parvum, demonstrated immunity to low doses of oocysts. The investigators found the ID50 dose (i.e., dose that infects 50% of the challenged population) of one C. parvum isolate for adult volunteers who had pre-existing serum IgG to be 1,880 oocysts in comparison to 132 oocysts for individuals reported as serologically negative. However, the implications of these data for studies of Cryptosporidium infectivity are unclear. Earlier work did not observe a correlation between the development of antibodies after Cryptosporidium exposure and subsequent protection from illness (Okhuysen et al. 1998). A subsequent investigation by Muller et al. (2001) observed serological responses to Cryptosporidium antigens in samples from individuals reported by Chappell et al. as serologically negative.
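For illustration only: under a single-hit exponential dose-response model (a form commonly used in microbial risk assessment, though not necessarily the analysis applied in the studies cited above), a reported ID50 fixes the per-oocyst infectivity parameter, and the two ID50 values reported by Chappell et al. (1999) imply markedly different infection probabilities at low doses. The function below is a hypothetical sketch under that model assumption.

```python
import math

def infection_probability(dose, id50):
    """Single-hit exponential dose-response model (illustrative).

    P(infection) = 1 - exp(-r * dose), with r chosen so that
    P(id50) = 0.5, i.e. r = ln(2) / id50.
    """
    r = math.log(2.0) / id50
    return 1.0 - math.exp(-r * dose)

# ID50 values reported by Chappell et al. (1999), at a hypothetical 30-oocyst dose:
print(infection_probability(30, 132))   # serologically negative volunteers
print(infection_probability(30, 1880))  # volunteers with pre-existing serum IgG
```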

Cryptosporidium parvum was first recognized as a human pathogen in 1976 (Juranek 1995). Cases of illness from Cryptosporidium were rarely reported until 1982 when documented disease incidence increased due to the AIDS epidemic (Current 1983). As laboratory diagnostic techniques improved during subsequent years, outbreaks among immunocompetent persons were recognized as well. Human, cattle, dog and deer types of C. parvum have been found in healthy individuals (Ong et al. 2002, Morgan-Ryan et al. 2002). Other Cryptosporidium species (C. felis, C. meleagridis, and possibly C. muris) have infected healthy individuals, primarily children (Xiao et al. 2001, Chalmers et al. 2002, Katsumata et al. 2000). Cross-species infection occurs. The human type of C. parvum (now named C. hominis (Morgan-Ryan et al. 2002)) has infected a dugong and monkeys (Spano et al. 1998). The cattle type of C. parvum infects humans, wild animals, and other livestock, such as sheep, goats and deer (Ong et al. 2002).

As noted earlier, there are sensitive populations that are at greater risk from pathogenic microorganisms. Cryptosporidiosis symptoms in immunocompromised subpopulations are much more severe, including debilitating voluminous diarrhea that may be accompanied by severe abdominal cramps, weight loss, and low grade fever (Juranek 1995). Mortality is a significant threat to the immunocompromised infected with Cryptosporidium:

the duration and severity of the disease are significant: whereas 1 percent of the immunocompetent population may be hospitalized with very little risk of mortality, Cryptosporidium infections are associated with a high rate of mortality in the immunocompromised (Rose 1997)

A follow-up study of the 1993 Milwaukee, WI outbreak reported that at least 50 Cryptosporidium-associated deaths occurred among the severely immunocompromised (Hoxie et al. 1997).

b. Waterborne cryptosporidiosis outbreaks. Cryptosporidium has caused a number of waterborne disease outbreaks since 1984, when the first one was reported in the U.S. Table II-1 lists reported outbreaks in community water systems (CWS) and non-community water systems (NCWS). Between 1984 and 1998, nine outbreaks caused by Cryptosporidium were reported in the U.S., with approximately 421,000 associated cases of illness (CDC 1993, 1996, 1998, 2000, and 2001). Solo-Gabriele and Neumeister (1996) characterized water supplies associated with U.S. outbreaks of cryptosporidiosis. They determined that almost half of the outbreaks were associated with ground water (untreated or chlorinated springs and wells), but that the majority of affected individuals were served by filtered surface water supplies (rivers and lakes). They found that during outbreaks involving treated spring or well water, the chlorination systems were apparently operating satisfactorily, with a measurable chlorine residual.

Although the occurrence of Cryptosporidium in U.S. drinking water supplies has been substantiated by data collected during outbreak investigations, the source and density of oocysts associated with the outbreak have not always been detected or reported. Furthermore, because of limitations and uncertainties of the immunofluorescence assay (IFA) method used in earlier studies, negative results in source or finished water during these outbreaks do not necessarily mean that there were no oocysts in the water at the time of sampling.

Table II-1.—Outbreaks Caused by Cryptosporidium in Public Water Systems: 1984-1998

Year  State  Cases    System  Deficiency  Source
1984  TX     117      CWS     3           Well.
1987  GA     13,000   CWS     3           River.
1991  PA     551      NCWS    3           Well.
1992  OR     ††       CWS     3           Spring.
1992  OR     ††       CWS     3           River.
1993  NV     103      CWS     5           Lake.
1993  WI     403,000  CWS     3           Lake.
1994  WA     134      CWS     2           Well.
1998  TX     1,400    CWS     3           Well.

†† = Total estimated cases were 3,000; the locations were nearby and the cases overlapped in time.

Definitions of deficiencies: (1) untreated surface water; (2) untreated ground water; (3) treatment deficiency (e.g., temporary interruption of disinfection, chronically inadequate disinfection, and inadequate or no filtration); (4) distribution system deficiency (e.g., cross connection, contamination of water mains during construction or repair, and contamination of a storage facility); and (5) unknown or miscellaneous deficiency.

3. Remaining Public Health Concerns Following the IESWTR and LT1ESWTR

This section presents the areas of remaining public health concern following implementation of the IESWTR and LT1ESWTR that EPA proposes to address in the LT2ESWTR. These are as follows: (a) Adequacy of physical removal to control Cryptosporidium and the need for risk based treatment requirements; (b) control of Cryptosporidium in unfiltered systems; and (c) uncovered finished water storage facilities.

EPA recognized each of these issues as a potential public health concern during development of the IESWTR, but could not address them at that time due to the absence of key data. Accordingly, this section begins with a description of how EPA considered these issues during development of the IESWTR, including the data gaps that were identified at that time. This is followed by a statement of the extent to which new information has filled these data gaps, thereby allowing EPA to address these public health concerns in the LT2ESWTR proposal.

a. Adequacy of physical removal to control Cryptosporidium and the need for risk based treatment requirements. A question that received significant consideration during development of the IESWTR is whether physical removal by filtration plants provides adequate protection against Cryptosporidium in drinking water, or whether certain systems should be required to provide inactivation of Cryptosporidium based on source water pathogen levels. As discussed in the proposal, notice of data availability (NODA), and final IESWTR, EPA and stakeholders concluded that data available during IESWTR development were not adequate to support risk based inactivation requirements for Cryptosporidium. However, the Agency maintained that a risk based approach to Cryptosporidium control would be considered for the LT2ESWTR when data collected under the Information Collection Rule were available and other critical information needs had been addressed.

The IESWTR proposal (59 FR 38832, July 29, 1994) (USEPA 1994) included two treatment alternatives, labeled B and C, that specifically addressed Cryptosporidium. Under Alternative B, the level of required treatment would be based on the density of Cryptosporidium in the source water. The proposal noted concerns with this approach, though, due to uncertainty in the risk associated with Cryptosporidium and the feasibility of achieving higher treatment levels through disinfection. Consequently, EPA also proposed Alternative C, which would require 2 log (99%) removal of Cryptosporidium by filtration. This was based on the determination that 2 log Cryptosporidium removal is feasible using conventional treatment.

In the 1996 Information Collection Rule (61 FR 24354, May 14, 1996) (USEPA 1996a), EPA concluded that the analytical method prescribed for measuring Cryptosporidium was adequate for making national occurrence estimates, but would not suffice for making site-specific source water density estimates. This finding further contributed to the rationale supporting Alternative C under the proposed IESWTR.

The NODA for the IESWTR (62 FR 59498, Nov. 3, 1997) (USEPA 1997a) presented the recommendations of the Stage 1 M-DBP Federal Advisory Committee for the IESWTR. As stated in the NODA, the Committee engaged in extensive discussions regarding the adequacy of relying solely on physical removal to control Cryptosporidium and the need for inactivation. There was an absence of consensus on whether it was possible at that time to adequately measure Cryptosporidium inactivation efficiencies for various disinfection technologies. This was a significant impediment to addressing inactivation in the IESWTR. However, the Committee recognized that inactivation requirements may be necessary under future regulatory scenarios, as shown by the following consensus recommendation from the Stage 1 M-DBP Agreement in Principle:

EPA should issue a risk based proposal of the Final Enhanced Surface Water Treatment Rule for Cryptosporidium embodying the multiple barrier approach (e.g., source water protection, physical removal, inactivation, etc.), including, where risks suggest appropriate, inactivation requirements (62 FR 59557, Nov. 3, 1997) (USEPA 1997a).

The preamble to the final IESWTR (63 FR 69478, Dec. 16, 1998) (USEPA 1998a) states that EPA was unable to consider the proposed Alternative B (treatment requirements for Cryptosporidium based on source water occurrence levels) for the IESWTR because occurrence data from the Information Collection Rule survey and related analysis were not available in time to meet the statutory promulgation deadline. The Agency affirmed, though, that further control of Cryptosporidium would be addressed in the LT2ESWTR.

In today's notice, EPA is proposing a risk based approach for control of Cryptosporidium in drinking water. Under this approach, the required level of additional Cryptosporidium treatment relates to the source water pathogen density. EPA believes many of the data gaps that prevented the adoption of this approach under the IESWTR have been addressed. As described in Section III of this preamble, information on Cryptosporidium occurrence from the Information Collection Rule and Information Collection Rule Supplemental Surveys, along with new data on Cryptosporidium infectivity, have provided EPA with a better understanding of the magnitude and distribution of risk for this pathogen. Improved analytical methods allow for a more accurate assessment of source water Cryptosporidium levels, and recent disinfection studies with UV, ozone, and chlorine dioxide provide the technical basis to support Cryptosporidium inactivation requirements.

b. Control of Cryptosporidium in unfiltered systems. There is particular concern about Cryptosporidium in the source waters of unfiltered systems because this pathogen has been shown to be resistant to conventional disinfection practices. In the IESWTR, EPA extended watershed control requirements for unfiltered systems to include the control of Cryptosporidium. EPA did not establish Cryptosporidium treatment requirements for unfiltered systems because available data suggested an equivalency of risk in filtered and unfiltered systems. This is described in the final IESWTR as follows:

it appears that unfiltered water systems that comply with the source water requirements of the SWTR have a risk of cryptosporidiosis equivalent to that of a water system with a well operated filter plant using a water source of average quality (63 FR 69492, Dec. 16, 1998) (USEPA 1998a)

The Agency noted that data from the Information Collection Rule would provide more information on Cryptosporidium levels in filtered and unfiltered systems, and that Cryptosporidium treatment requirements would be re-evaluated when these data became available.

In today's notice, EPA is proposing Cryptosporidium inactivation requirements for unfiltered systems. These proposed requirements stem from an assessment of Cryptosporidium source water occurrence in both filtered and unfiltered systems using data from the Information Collection Rule and other surveys, as described in Section III of this preamble. These new data do not support the finding described in the IESWTR of equivalent risk in filtered and unfiltered systems. Rather, Cryptosporidium treatment by unfiltered systems is necessary to achieve a finished water risk level equivalent to that of filtered systems. In addition, the development of Cryptosporidium inactivation criteria for UV, ozone, and chlorine dioxide in the LT2ESWTR has made it feasible for unfiltered systems to provide Cryptosporidium treatment.

c. Uncovered finished water storage facilities. In the IESWTR proposal, EPA solicited comment on a requirement that systems cover finished water storage facilities to reduce the potential for contamination by pathogens and hazardous chemicals. Potential sources of contamination to uncovered storage facilities include airborne chemicals, runoff, animal carcasses, animal or bird droppings, and growth of algae and other aquatic organisms (59 FR 38832, July 29, 1994) (USEPA 1994).

The final IESWTR established a requirement to cover all new storage facilities for finished water for which construction began after February 16, 1999 (63 FR 69493, Dec. 16, 1998) (USEPA 1998a). In the preamble to the final IESWTR, EPA described future regulation of existing uncovered finished water storage facilities as follows:

EPA needs more time to collect and analyze additional information to evaluate regulatory impacts on systems with existing uncovered reservoirs on a national basis . . . EPA will further consider whether to require the covering of existing reservoirs during the development of subsequent microbial regulations when additional data and analysis to develop the national costs of coverage are available.

EPA continues to be concerned about contamination resulting from uncovered finished water storage facilities, particularly the potential for virus contamination via bird droppings, and now has sufficient data to estimate national cost implications for various regulatory control strategies. Therefore, EPA is proposing control measures for all systems with uncovered finished water storage facilities in the LT2ESWTR. New data and proposed requirements are described in section IV.E of this preamble.

D. Federal Advisory Committee Process

In March 1999, EPA reconvened the M-DBP Federal Advisory Committee to develop recommendations for the Stage 2 DBPR and LT2ESWTR. The Committee consisted of organizational members representing EPA, State and local public health and regulatory agencies, local elected officials, Indian Tribes, drinking water suppliers, chemical and equipment manufacturers, and public interest groups. Technical support for the Committee's discussions was provided by a technical workgroup established by the Committee at its first meeting. The Committee's activities resulted in the collection and evaluation of substantial new information related to key elements for both rules. This included new data on pathogenicity, occurrence, and treatment of microbial contaminants, specifically including Cryptosporidium, as well as new data on DBP health risks, exposure, and control. New information relevant to the LT2ESWTR is summarized in Section III of this proposal.

In September 2000, the Committee signed an Agreement in Principle reflecting the consensus recommendations of the group. The Agreement was published in a December 29, 2000 Federal Register notice (65 FR 83015, December 29, 2000) (USEPA 2000a). The Agreement is divided into Parts A and B. The entire Committee reached consensus on Part A, which contains provisions that directly apply to the Stage 2 DBPR and LT2ESWTR. The full Committee, with the exception of one member, agreed to Part B, which has recommendations for future activities by EPA in the areas of distribution systems and microbial water quality criteria.

The Committee reached agreement on the following major issues discussed in this notice and the proposed Stage 2 DBPR:

LT2ESWTR: (1) Additional Cryptosporidium treatment based on source water monitoring results; (2) Filtered systems that must comply with additional Cryptosporidium treatment requirements may choose from a “toolbox” of treatment and control options; (3) Reduced monitoring burden for small systems; (4) Future monitoring to confirm source water quality assessments; (5) Cryptosporidium inactivation by all unfiltered systems; (6) Unfiltered systems meet overall inactivation requirements using a minimum of 2 disinfectants; (7) Development of criteria and guidance for UV disinfection and other toolbox options; (8) Cover or treat existing uncovered finished water reservoirs (i.e., storage facilities) or implement risk mitigation plans.

Stage 2 DBPR: (1) Compliance calculation for total trihalomethanes (TTHM) and five haloacetic acids (HAA5) revised from a running annual average (RAA) to a locational running annual average (LRAA); (2) Compliance carried out in two phases of the rule; (3) Performance of an Initial Distribution System Evaluation; (4) Continued importance of simultaneous compliance with DBP and microbial regulations; (5) Unchanged MCL for bromate.

III. New Information on Cryptosporidium Health Risks and Treatment

The purpose of this section is to describe information related to health risks and treatment of Cryptosporidium in drinking water that has become available since EPA developed the IESWTR. Much of this information was evaluated by the Stage 2 M-DBP Federal Advisory Committee when considering whether and to what degree existing microbial standards should be revised to protect public health. It serves as a basis for the recommendations made by the Advisory Committee and for provisions in today's proposed rule. This section begins with an overview of critical factors that EPA considers when evaluating regulation of microbial pathogens. New information is then presented on three key topics: Cryptosporidium infectivity, occurrence, and treatment.

A. Overview of Critical Factors for Evaluating Regulation of Microbial Pathogens

When proposing a national primary drinking water regulation that includes a maximum contaminant level or treatment technique, SDWA requires EPA to analyze the health risk reduction benefits and costs likely to result from alternative regulatory levels that are being considered. For assessing risk, EPA follows the paradigm described by the National Academy of Sciences (NRC, 1983), which involves four steps: (1) Hazard identification, (2) dose-response assessment, (3) exposure assessment, and (4) risk characterization. The application of these steps to microbial pathogens is briefly described in this section, followed by a summary of how EPA estimates the health benefits and costs of regulatory alternatives.

Hazard identification for microbial pathogens is a description of the nature, severity, and duration of the health effects stemming from infection. Under SDWA, EPA must consider health effects on the general population and on subpopulations that are at greater risk of adverse health effects. See section II.C.2 of this preamble for health effects associated with Cryptosporidium.

Dose-response assessment with microorganisms is commonly termed infectivity and is a description of the relationship between the number of pathogens ingested and the probability of infection. Information on Cryptosporidium infectivity is presented in section III.B of this preamble.

Exposure to microbial pathogens in drinking water is generally a function of the concentration of the pathogen in finished water and the volume of water ingested (exposure also occurs through secondary routes involving infected individuals). Because it is difficult to directly measure pathogens at the low levels typically present in finished water, EPA's information on pathogen exposure is primarily derived from surveys of source water occurrence. EPA estimates the concentration of pathogens in treated water by combining source water pathogen occurrence data with information on the performance of treatment plants in reducing pathogen levels. Data on the occurrence of Cryptosporidium are described in section III.C of this preamble and in Occurrence and Exposure Assessment for the LT2ESWTR (USEPA 2003b). Cryptosporidium treatment studies are described in section III.D of this preamble.

Risk characterization is the culminating step of the risk assessment process. It is a description of the nature and magnitude of risk, and characterizes strengths, weaknesses, and attendant uncertainties of the assessment. EPA's risk characterization for Cryptosporidium is described in Economic Analysis for the LT2ESWTR (USEPA 2003a).

Estimating the health benefits and costs that would result from a new regulatory requirement involves a number of steps, including evaluating the efficacy and cost of treatment strategies to reduce exposure to the contaminant, forecasting the number of systems that would implement different treatment strategies to comply with the regulatory standard, and projecting the reduction in exposure to the contaminant and consequent health risk reduction benefits stemming from regulatory compliance. EPA's estimates of health benefits and costs associated with the proposed LT2ESWTR are presented in Economic Analysis for the LT2ESWTR (USEPA 2003a) and are summarized in section VI of this preamble.

B. Cryptosporidium Infectivity

This section presents information on the infectivity of Cryptosporidium oocysts. Infectivity relates the probability of infection by Cryptosporidium with the number of oocysts that a person ingests, and it is used to predict the disease burden associated with different Cryptosporidium levels in drinking water. Information on Cryptosporidium infectivity comes from dose-response studies where healthy human subjects ingest different numbers of oocysts and are subsequently evaluated for signs of infection and illness.

Data from a human dose-response study of one Cryptosporidium isolate (the IOWA study, conducted at the University of Texas-Houston Health Science Center) had been published prior to the IESWTR (DuPont et al. 1995). Following IESWTR promulgation, a study of two additional isolates (TAMU and UCP) was completed and published (Okhuysen et al. 1999). This study also presented a reanalysis of the IOWA study results. As described in more detail later in this section, this new study indicates that the infectivity of Cryptosporidium oocysts varies over a wide range. The UCP oocysts appeared less infective than those of the IOWA study while the TAMU oocysts were much more infective. Although the occurrence of these isolates among environmental oocysts is unknown, a meta-analysis of these data conducted by EPA suggests the overall infectivity of Cryptosporidium may be significantly greater than was estimated for the IESWTR (USEPA 2003a).

This section begins with a description of the infectivity data considered for the IESWTR. This is followed by a presentation of additional data that have been evaluated for the proposed LT2ESWTR and a characterization of the significance of these new data.

1. Cryptosporidium Infectivity Data Evaluated for IESWTR

Data from the IOWA study (DuPont et al. 1995) were evaluated for the IESWTR. In that study, 29 individuals were given single doses ranging from 30 oocysts to 1 million oocysts. This oocyst isolate was originally obtained from a naturally infected calf. Seven persons received doses above 500, and all were infected. Eleven of the twenty-two individuals receiving doses of 500 or fewer were classified as infected based on oocysts detected in stool samples.

The IOWA study data were analyzed using an exponential dose-response model established by Haas et al. (1996) for Cryptosporidium:

Probability of infection, given dose = 1 − e^(−Dose/k)

Based on the maximum likelihood estimate of k (238), the probability of infection from ingesting a single oocyst (1/k) is approximately 0.4% (4 persons infected for every 1,000 who each ingest one oocyst). Based on the same estimate, the dose at which 50% of persons become infected (known as the median infectious dose or ID50) is 165.
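The arithmetic above can be reproduced with a short sketch. The value k = 238 is the maximum likelihood estimate reported in the text; the script itself is illustrative only:

```python
import math

def p_infection(dose, k=238.0):
    """Exponential dose-response model: P(infection | dose) = 1 - exp(-dose/k)."""
    return 1.0 - math.exp(-dose / k)

# Probability of infection from ingesting a single oocyst (approximately 1/k)
p_one = p_infection(1)        # about 0.004, i.e., roughly 0.4%

# Median infectious dose (ID50): solve 1 - exp(-D/k) = 0.5, giving D = k * ln(2)
id50 = 238.0 * math.log(2)    # about 165 oocysts
```

Note that for small doses the exponential model is nearly linear, so the single-oocyst risk is essentially 1/k.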

2. New Data on Cryptosporidium Infectivity

A study of two additional Cryptosporidium isolates was conducted at the University of Texas-Houston Health Science Center (Okhuysen et al. 1999). One of the isolates (UCP) was originally collected from naturally infected calves. The other isolate (TAMU) was originally collected from a veterinary student who became infected during necropsy on an infected foal.

The TAMU and UCP studies were conducted with 14 and 17 subjects, respectively. Because thousands of oocysts per gram of stool can go undetected, researchers elected to use both stool test results and symptoms as markers of infection (only stool test results had been used for the IOWA study). Under this definition, two additional IOWA subjects were regarded as having been infected. As shown in Table III-1, all but two of the TAMU subjects were presumed infected and all but six of the UCP subjects were presumed infected following ingestion of the indicated oocyst doses.

Table III-1.—Cryptosporidium Parvum Infectivity in Healthy Adult Volunteers

Isolate and dose        Number of      Number
(# of oocysts)          subjects(1)    infected(1)
--------------------------------------------------
IOWA:
  30                    5              2
  100                   8              4
  300                   3              2
  500                   6              5
  1,000                 2              2
  10,000                3              3
  100,000               1              1
  1,000,000             1              1
TAMU:
  10                    3              2
  30                    3              2
  100                   3              3
  500                   5              5
UCP:
  500                   5              3
  1,000                 3              2
  5,000                 5              2
  10,000                4              4

(1) The two right columns list the number of subjects belonging to each category.

EPA conducted a meta-analysis of these results in which the three isolates were considered as a random sample (of size three) from a larger population of environmental oocysts (Messner et al. 2001). This meta-analysis was reviewed by the Science Advisory Board (SAB). In written comments from a December 2001 meeting of the Drinking Water Committee, SAB members recommended the following: (1) two assumed infectivity distributions (of parameter r = 1/k as logit-normal and logit-t) should be used in order to characterize uncertainty and (2) EPA should consider excluding the UCP data set because it seems to be an outlier (see Section VII.K). In response, EPA has used the two recommended distributions for infectivity and has conducted the meta-analysis both with and without the UCP data due to uncertainty about whether it is appropriate to exclude these data.

Table III-2 presents meta-analysis estimates of the probability of infection given one oocyst ingested. Results are shown for the four different analysis conditions (logit-normal and logit-t distributions; with and without UCP data) as well as a combined result derived by sampling equally from each distribution. A more complete description of the infectivity analysis is provided in Economic Analysis for the LT2ESWTR (USEPA 2003a).

Table III-2.—Risk of Infection, Given One Oocyst Ingested

Studies used                 Distributional model     Mean    80% credible interval
-----------------------------------------------------------------------------------
IOWA, TAMU, and UCP          Normal                   0.07    0.007-0.19
IOWA, TAMU, and UCP          Student's t (3df)(1)     0.09    0.015-0.20
IOWA and TAMU                Normal                   0.09    0.011-0.23
IOWA and TAMU                Student's t (3df)(1)     0.10    0.014-0.25
Equal mix of the four above                           0.09    0.011-0.22

(1) Student's t distribution with 3 degrees of freedom (3df).
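The logit-scale distributions recommended by the SAB can be sketched as follows. The location and scale parameters below are arbitrary placeholders for illustration, not EPA's fitted values, so the resulting mean is illustrative only:

```python
import math
import random

def expit(x):
    """Inverse logit: maps a real-valued parameter back to a probability r = 1/k."""
    return 1.0 / (1.0 + math.exp(-x))

def mean_infectivity(mu, sigma, n=100_000, seed=1):
    """Monte Carlo mean of r under a logit-normal distribution:
    logit(r) ~ Normal(mu, sigma). The mu and sigma values used here
    are hypothetical, not the values fitted in the EPA meta-analysis."""
    rng = random.Random(seed)
    return sum(expit(rng.gauss(mu, sigma)) for _ in range(n)) / n

m = mean_infectivity(mu=-2.5, sigma=1.0)
```

The logit transform keeps the infectivity parameter r within (0, 1) while letting uncertainty be modeled on an unbounded scale; the logit-t variant simply draws from a heavier-tailed Student's t in place of the normal.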

The results in Table III-2 show that the mean probability of infection from ingesting a single infectious oocyst ranges from 7% to 10% depending on the assumptions used. In comparison, the best estimate in the IESWTR of this probability was 0.4%, based on the IOWA isolate alone, and using the earlier definition of infection. Thus, these data suggest that both the range and magnitude of Cryptosporidium infectivity are higher than was estimated in the final IESWTR.

It should be noted that although significantly more data on Cryptosporidium infectivity are available now than when EPA established the IESWTR, there remains uncertainty about this parameter in several areas. It is unknown how well the oocysts used in the feeding studies represent Cryptosporidium naturally occurring in the environment, and the analyses do not fully account for variability in host susceptibility and the effect of previous infections. Furthermore, the sample sizes are relatively small, and the confidence bands on the estimates span more than an order of magnitude. Another limitation is that none of the studies included doses below 10 oocysts, whereas exposure through drinking water typically involves ingestion of only a single oocyst at a time.

3. Significance of New Infectivity Data

The new infectivity data reveal that oocysts vary greatly in their ability to infect human hosts. Moreover, due to this variability and the finding of a highly infectious isolate, TAMU, the overall population of oocysts appears to be more infective than assumed for the IESWTR. The meta-analysis described earlier indicates the probability of infection at low Cryptosporidium concentrations may be about 20 times as great as previously estimated (which was based on the IOWA isolate alone and using the earlier definition of infection (stool-confirmed infections)).

C. Cryptosporidium Occurrence

This section presents information on the occurrence of Cryptosporidium oocysts in drinking water sources. Occurrence information is important because it is used in assessing the risk associated with Cryptosporidium in both filtered and unfiltered systems, as well as in estimating the costs and benefits of the proposed LT2ESWTR.

For the IESWTR, EPA had no national survey data and relied instead on several studies that were local or regional. Those data suggested that a typical (median) filtered surface water source had approximately 2 Cryptosporidium oocysts per liter, while a typical unfiltered surface water source had about 0.01 oocysts per liter, a difference of roughly two orders of magnitude.

Subsequent to promulgating the IESWTR, EPA obtained data from two national surveys: the Information Collection Rule and the Information Collection Rule Supplemental Surveys (ICRSS). These surveys were designed to provide improved estimates of occurrence on a national basis. As described in more detail later in this section, the Information Collection Rule and ICRSS results show three main differences in comparison to Cryptosporidium occurrence data used for the IESWTR:

(1) Average Cryptosporidium occurrence is lower. Median oocyst levels for the Information Collection Rule and ICRSS data are approximately 0.05/L, which is more than an order of magnitude lower than IESWTR estimates.

(2) Cryptosporidium occurrence is more variable from location to location than was shown by the data considered for the IESWTR. This indicates that although median occurrence levels are below those assumed for the IESWTR, there is a subset of systems whose levels are considerably greater than the median.

(3) There is a smaller difference in Cryptosporidium levels between typical filtered and unfiltered system water sources. The Information Collection Rule data do not support the IESWTR finding that unfiltered water systems have a risk of cryptosporidiosis equivalent to that of a filter plant with average quality source water.

This section begins with a summary of occurrence data that were used to assess risk under the IESWTR (these data were also used in the main risk assessment for the LT1ESWTR). This is followed by a discussion of the Information Collection Rule and ICRSS that covers the scope of the surveys, analytical methods, results, and a characterization of how these new data impact current understanding of Cryptosporidium exposure. A more detailed description of occurrence data is available in Occurrence and Exposure Assessment for the Long Term 2 Enhanced Surface Water Treatment Rule (USEPA 2003b).

1. Occurrence Data Evaluated for IESWTR

Occurrence information evaluated for the IESWTR is detailed in Occurrence and Exposure Assessment for The Interim Enhanced Surface Water Treatment Rule (USEPA 1998c). This information is summarized in the next two paragraphs.

a. Filtered systems. In developing the IESWTR, EPA evaluated Cryptosporidium occurrence data from a number of studies. Among these studies, LeChevallier and Norton (1995) produced the largest data set and data from this study were used for the IESWTR risk assessment. This study provided estimates of mean occurrence at 69 locations from the eastern and central U.S. Although limited by the small number of samples per site (one to sixteen samples; most sites were sampled five times), variation within and between sites appeared to be lognormal. The study's median measured source water concentration was 2.31 oocysts/L and the interquartile range (i.e., 25th and 75th percentile) was 1.03 to 5.15 oocysts/L.

b. Unfiltered systems. To assess Cryptosporidium occurrence in unfiltered systems under the IESWTR, EPA evaluated Cryptosporidium monitoring results from several unfiltered water systems that had been summarized by the Seattle Water Department (Montgomery Watson, 1995). The median (central tendency) of these data was approximately 0.01 oocysts/L. Thus, the median concentration in this data set was about 2 orders of magnitude less than the median concentration in the data set used for filtered systems. These data, coupled with the assumption that filtered systems will remove at least 2 log of Cryptosporidium as required by the IESWTR, suggested that unfiltered systems that comply with the source water requirements of the SWTR may have a risk of cryptosporidiosis equivalent to that of a filter plant using a water source of average quality (62 FR 59507, November 3, 1997) (USEPA 1997a).
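The equivalence reasoning above amounts to simple arithmetic, sketched below using the median values quoted in this section (2 oocysts/L for filtered-system sources, 0.01 oocysts/L for unfiltered-system sources, and the 2-log removal credited to filtration):

```python
# Median source-water concentrations from the IESWTR-era data (oocysts/L)
filtered_source = 2.0      # typical filtered-system source
unfiltered_source = 0.01   # typical unfiltered-system source

# Minimum Cryptosporidium removal credited to filtration under the IESWTR
log_removal = 2.0

# Estimated finished-water level for the filtered system after 2-log removal
filtered_finished = filtered_source * 10 ** (-log_removal)  # 0.02 oocysts/L
```

Since 0.02 oocysts/L (filtered, after treatment) and 0.01 oocysts/L (unfiltered, untreated) are of the same order of magnitude, the IESWTR treated the two cases as posing roughly equivalent risk, a finding the later Information Collection Rule data did not support.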

2. Overview of the Information Collection Rule and Information Collection Rule Supplemental Surveys (ICRSS)

The Information Collection Rule and the Information Collection Rule Supplemental Surveys (ICRSS) were national monitoring studies. They were designed to provide EPA with a more comprehensive understanding of the occurrence of microbial pathogens in drinking water sources in order to support regulatory decision making. The surveys attempted to control protozoa measurement error by requiring that (1) laboratories meet certain qualification criteria, (2) standardized methods be used to collect data, and (3) laboratories analyze performance evaluation samples throughout the duration of the study to ensure adequate analytical performance. Information Collection Rule monitoring took place from July 1997 to December 1998; ICRSS Cryptosporidium monitoring began in March 1999 and ended in February 2000.

a. Scope of the Information Collection Rule. The Information Collection Rule (61 FR 24354, May 14, 1996) (USEPA 1996a) required large PWSs to collect water quality and treatment data related to DBPs and microbial pathogens over an 18-month period. PWSs using surface water or ground water under the direct influence of surface water as sources and serving at least 100,000 people were required to monitor their raw water monthly for Cryptosporidium, Giardia, viruses, total coliforms, and E. coli. Approximately 350 plants monitored for microbial parameters.

b. Scope of the ICRSS. The ICRSS were designed to complement the Information Collection Rule data set with data from systems serving fewer than 100,000 people and by employing an improved analytical method for protozoa (described later). The ICRSS included 47 large systems (serving greater than 100,000 people), 40 medium systems (serving 10,000 to 100,000 people) and 39 small systems (serving fewer than 10,000 people). Medium and large systems conducted 1 year of twice-per-month sampling for Cryptosporidium, Giardia, temperature, pH, turbidity, and coliforms. Other water quality measurements were taken once a month. Small systems did not test for protozoa but tested for all other water quality parameters.

3. Analytical Methods for Protozoa in the Information Collection Rule and ICRSS

This subsection describes analytical methods for Cryptosporidium that were used in the Information Collection Rule and ICRSS. Information on Cryptosporidium analytical methods is important for the LT2ESWTR for several reasons: (1) It is relevant to the quality of Cryptosporidium occurrence data used to assess risk and economic impact of the LT2ESWTR proposal, (2) it provides a basis for the statistical procedures employed to analyze the occurrence data, and (3) it is used to assess the adequacy of Cryptosporidium methods to support source-specific decisions under the LT2ESWTR.

The Information Collection Rule and ICRSS data sets were generated using different analytical methods. The Information Collection Rule Protozoan Method (ICR Method) was used to analyze water samples for Cryptosporidium during the Information Collection Rule. For the ICRSS, a similar but improved method, EPA Method 1622 (later 1623), was used for protozoa analyses (samples were analyzed for Cryptosporidium using Method 1622 for the first 4 months; then Method 1623 was implemented so that Giardia concentrations could also be measured).

a. Information Collection Rule Protozoan Method. With the Information Collection Rule Method (USEPA 1996b), samples were collected by passing water through a filter, which was then delivered to an EPA-approved Information Collection Rule laboratory for analysis. The laboratory eluted the filter, centrifuged the eluate, and separated Cryptosporidium oocysts and Giardia cysts from other debris by density-gradient centrifugation. The oocysts and cysts were then stained and counted. Differential interference contrast (DIC) microscopy was used to examine internal structures.

The Information Collection Rule Method provided a quantitative measurement of Cryptosporidium oocysts and Giardia cysts, but it is believed to have generally undercounted the actual occurrence (modeling, described later, adjusted for undercounting). This undercounting was due to low volumes analyzed and low method recovery. The volume analyzed directly influences the sensitivity of the analytical method and the Information Collection Rule Method did not require a specific volume analyzed. As a result, sample volumes analyzed during the Information Collection Rule varied widely, depending on the water matrix and analyst discretion, with a median volume analyzed of only 3 L.

Method recovery characterizes the likelihood that an oocyst present in the original sample will be counted. Loss of organisms may occur at any step of the analytical process, including filtration, elution, concentration of the eluate, and purification of the concentrate. To assess the performance of the Information Collection Rule Method, EPA implemented the Information Collection Rule Laboratory Spiking Program. This program involved collection of duplicate samples on two dates from 70 plants. On each occasion, one of the duplicate samples was spiked with a known quantity of Giardia cysts and Cryptosporidium oocysts (the quantity was unknown to the laboratory performing the analysis), and both samples were processed according to the method. Recovery of spiked Cryptosporidium oocysts ranged from 0% to 65% with a mean of 12% and a standard deviation nearly equal to the mean (relative standard deviation (RSD) approximately 100%) (Scheller et al. 2002).

b. Method 1622 and Method 1623. EPA developed Method 1622 (detects Cryptosporidium) and 1623 (detects Cryptosporidium and Giardia) to achieve higher recovery rates and lower inter- and intra-laboratory variability than previous methods. These methods incorporate improvements in the concentration, separation, staining, and microscope examination procedures. Specific improvements include the use of more effective filters, immunomagnetic separation (IMS) to separate the oocysts and cysts from extraneous materials present in the water sample, and the addition of 4′,6-diamidino-2-phenylindole (DAPI) stain for microscopic analysis. The performance of these methods was tested through single-laboratory studies and validated through multiple-laboratory validation (round robin) studies.

The per-sample volume analyzed for Cryptosporidium during the ICRSS was larger than in the Information Collection Rule, due to a requirement that laboratories analyze a minimum of 10 L or 2 mL of packed pellet with Methods 1622/23 (details in section IV.K). To assess method recovery, matrix spike samples were analyzed on five sampling events for each plant. The protozoa laboratory spiked the additional sample with a known quantity of Cryptosporidium oocysts and Giardia cysts (the quantity was unknown to the laboratory performing the analysis) and filtered and analyzed both samples using Methods 1622/23. Recovery in the ICRSS matrix spike study averaged 43% for Cryptosporidium with an RSD of 47% (Connell et al. 2000). Thus, mean Cryptosporidium recovery with Methods 1622/23 under the ICRSS was more than 3.5 times higher than mean recovery in the Information Collection Rule lab spiking program and relative standard deviation was reduced by more than half.

Although Methods 1622 and 1623 have several advantages over the Information Collection Rule method, they also have some of the same limitations. These methods do not determine whether a cyst or oocyst is viable and infectious, and both methods require a skilled microscopist and several hours of sample preparation and analyses.

4. Cryptosporidium Occurrence Results from the Information Collection Rule and ICRSS

This section describes Cryptosporidium monitoring results from the Information Collection Rule and ICRSS. The focus of this discussion is the national distribution of mean Cryptosporidium occurrence levels in the sources of filtered and unfiltered plants.

The observed (raw, unadjusted) Cryptosporidium data from the Information Collection Rule and ICRSS do not accurately characterize true concentrations because of (a) the low and variable recovery of the analytical method, (b) the small volumes analyzed, and (c) the relatively small number of sample events. EPA employed a statistical treatment to estimate the true underlying occurrence that led to the data observed in the surveys and to place uncertainty bounds about that estimation.

A hierarchical model with Bayesian parameter estimation techniques was used to separately analyze filtered and unfiltered system data from the Information Collection Rule and the large and medium system data from the ICRSS. The model included parameters for location, month, source water type, and turbidity. Markov Chain Monte Carlo methods were used to estimate these parameters, producing a large number of estimate sets that represent uncertainty. This analysis is described more completely in Occurrence and Exposure Assessment for the Long Term 2 Enhanced Surface Water Treatment Rule (USEPA 2003b).
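A drastically simplified, single-site version of the Bayesian estimation idea can be sketched as follows. The actual EPA analysis is a hierarchical model over many plants with covariates for month, source type, and turbidity, fit by Markov Chain Monte Carlo; here a per-sample oocyst count is simply modeled as Poisson(concentration × volume × recovery), with the posterior computed on a grid under a uniform prior. The sample counts and grid bounds below are hypothetical:

```python
import math

def poisson_pmf(k, lam):
    """Poisson probability mass function."""
    return math.exp(-lam) * lam ** k / math.factorial(k)

def posterior_mean_conc(counts, volume_L=3.0, recovery=0.12,
                        grid_max=5.0, grid_points=2000):
    """Posterior mean source-water concentration (oocysts/L) given observed
    per-sample counts, a uniform prior on [0, grid_max], and the assumption
    that counts ~ Poisson(conc * volume * recovery). The 3 L volume and 12%
    recovery mirror the ICR medians quoted in the text."""
    step = grid_max / grid_points
    numerator = denominator = 0.0
    for i in range(1, grid_points + 1):
        conc = i * step
        likelihood = 1.0
        for k in counts:
            likelihood *= poisson_pmf(k, conc * volume_L * recovery)
        numerator += conc * likelihood
        denominator += likelihood
    return numerator / denominator

# Twelve monthly samples, mostly non-detects (hypothetical data):
est = posterior_mean_conc([0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 1, 0])
```

Even with ten of twelve samples showing no oocysts, the posterior mean is well above zero, illustrating why non-detects under low recovery and small volumes do not imply a clean source.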

a. Information Collection Rule results. Figure III-1 presents plant-mean Cryptosporidium levels for Information Collection Rule plants as a cumulative distribution. Included in Figure III-1 are distributions of both the observed raw data adjusted for mean analytical method recovery of 12% and the modeled estimate of the underlying distribution, along with 90% confidence bounds. The two distributions (observed and modeled) are similar for plants where Cryptosporidium was detected (196 of 350 Information Collection Rule plants did not detect Cryptosporidium in any source water samples). The modeled distribution allows for estimation of Cryptosporidium concentrations in sources where oocysts may have been present but were not detected due to low sample volume and poor method recovery (this concept is explained further later in this section).

[Figure III-1: Graphic not available; view image of printed page]

The results shown in Figure III-1 indicate that mean Cryptosporidium levels among Information Collection Rule plants vary widely, with many plants having relatively little contamination and a fraction of plants with elevated source water pathogen levels. The median and 90th percentile estimates of Information Collection Rule plant-mean Cryptosporidium levels are 0.048 and 1.3 oocysts/L, respectively. These levels are lower than Cryptosporidium occurrence estimates used in the IESWTR (USEPA 1998c), and the distribution of Information Collection Rule data is broader (i.e., more source-to-source variability). Also, the occurrence of Cryptosporidium in flowing stream sources was greater and more variable than in reservoir/lake sources (shown in USEPA 2003b).

The fact that only 44% of Information Collection Rule plants had one or more samples positive for Cryptosporidium and that only 7% of all Information Collection Rule samples were positive for Cryptosporidium suggests that oocyst levels were relatively low in many source waters. However, as noted earlier, it is expected that Cryptosporidium oocysts were present in many more source waters at the time of sampling and were not detected due to poor analytical method recovery and low sample volumes.

This concept is illustrated by Figure III-2, which shows the likelihood of no oocysts being detected by the Information Collection Rule method as a function of source water concentration (assumes median Information Collection Rule sample volume of 3 L). As can be seen in Figure III-2, when the source water concentration is 1 oocyst/L, which is a relatively high level, the probability of no oocysts being detected in a 3 L sample is 73%; for a source water with 0.1 oocyst/L, which is close to the median occurrence level, the probability of a non-detect is 97%. Consequently, EPA has concluded that it is appropriate and necessary to use a statistical model to estimate the underlying distribution.
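A back-of-the-envelope version of this calculation can be made with a simple Poisson model: if oocysts are randomly dispersed, the chance that a sample of volume V with mean method recovery R detects nothing from a source at concentration c is exp(-c × V × R). This simplification omits the recovery variability in EPA's analysis, so it gives values close to, though not identical to, the figures cited above.

```python
import math

# Probability of a non-detect under a simple Poisson model (a simplification;
# EPA's calculation also accounts for variability in method recovery).
def p_nondetect(c, volume_l=3.0, recovery=0.12):
    return math.exp(-c * volume_l * recovery)

for c in (1.0, 0.1):
    print(f"{c} oocyst/L -> P(non-detect) = {p_nondetect(c):.0%}")
```

Even at the relatively high concentration of 1 oocyst/L, a 3 L sample with 12% recovery is expected to capture only about 0.36 oocysts, which is why non-detects dominate the data.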

EPA modeled Cryptosporidium occurrence separately for filtered and unfiltered plants that participated in the Information Collection Rule because unfiltered plants comply with different regulatory requirements than filtered plants. As shown in Table III-3, the occurrence of Cryptosporidium was lower for unfiltered sources.

[Figure III-2: Graphic not available; view image of printed page]

Table III-3.—Summary of Information Collection Rule Cryptosporidium Modeled Source Water Data for Unfiltered and Filtered Plants

Source       Information Collection Rule modeled plant-mean (oocysts/L)
             Mean       Median     90th percentile
Unfiltered   0.014      0.0079     0.033
Filtered     0.59       0.052      1.4

The median Cryptosporidium occurrence level for unfiltered systems in the Information Collection Rule was 0.0079 oocysts/L, which is close to the median level of 0.01 oocysts/L reported for unfiltered systems in the IESWTR (Montgomery Watson, 1995). However, the Information Collection Rule data do not show the 2 log difference in median Cryptosporidium levels between filtered and unfiltered systems that was observed for the data used in the IESWTR. The ratio of median plant-mean occurrence in unfiltered plants to filtered plants is about 1:7 (see Table III-3). Thus, based on an assumption of a minimum 2 log removal of Cryptosporidium by filtration plants (as required by the IESWTR and LT1ESWTR), these data indicate that, on average, finished water oocyst levels are higher in unfiltered systems than in filtered systems.
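The arithmetic behind this comparison can be worked through directly, using the median plant-mean values from Table III-3 and the minimum 2 log filtration credit (and assuming, consistent with the discussion of chlorine elsewhere in this preamble, no Cryptosporidium inactivation credit for unfiltered systems):

```python
# Medians from Table III-3 (oocysts/L, modeled plant-mean)
filtered_source = 0.052
unfiltered_source = 0.0079

ratio = filtered_source / unfiltered_source
print(f"unfiltered:filtered source ratio ~ 1:{ratio:.1f}")   # roughly 1:7

# Apply the minimum 2 log (99%) removal required of filtration plants;
# assume no Cryptosporidium inactivation in unfiltered systems.
filtered_finished = filtered_source / 10 ** 2
print(f"filtered finished water   ~ {filtered_finished:.5f} oocysts/L")
print(f"unfiltered finished water ~ {unfiltered_source:.4f} oocysts/L")
```

Under these assumptions, the median unfiltered finished water concentration is roughly an order of magnitude higher than the median filtered finished water concentration.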

b. ICRSS results. Figures III-3 and III-4 present plant-mean Cryptosporidium levels for ICRSS medium and large systems, respectively, as cumulative distributions. Medium and large system data were analyzed separately to identify differences between the two data sets. Similar to the Information Collection Rule data plot, Figures III-3 and III-4 include distributions for both the observed raw data adjusted for mean analytical method recovery of 43% and the modeled estimate of the underlying distribution, along with 90% confidence bounds. The observed and modeled distributions are similar for the 85% of ICRSS plants that detected Cryptosporidium, and the modeled distribution allows for estimation of Cryptosporidium concentrations for source waters where oocysts may have been present but were not detected.

Plant-mean Cryptosporidium concentrations for large and medium systems in the ICRSS are similar at the mid and lower range of the distribution and differ at the upper end. ICRSS medium and large systems both had median plant-mean Cryptosporidium levels of approximately 0.05 oocysts/L, which is close to the median oocyst level in the Information Collection Rule data set as well. However, the 90th percentile plant-mean was 0.33 oocysts/L for ICRSS medium systems and 0.24 oocysts/L for ICRSS large systems. Note that in the Information Collection Rule distribution, the 90th percentile Cryptosporidium concentration is 1.3 oocysts/L, which is significantly higher than the corresponding value for either the ICRSS medium or large system distribution.

The reasons for different results between the surveys are not well understood, but may stem from year-to-year variation in occurrence, systematic differences in the sampling or measurement methods employed, and differences in the populations sampled. This topic is discussed further at the end of this section.

[Figure III-3: Graphic not available; view image of printed page]

[Figure III-4: Graphic not available; view image of printed page]

5. Significance of new Cryptosporidium occurrence data.

The Information Collection Rule and ICRSS data substantially improve overall knowledge of the occurrence distribution of Cryptosporidium in drinking water sources. They provide data on many more water sources than were available when the IESWTR was developed and the data are of more uniform quality. In regard to filtered systems, these new data demonstrate two points:

(1) The occurrence of Cryptosporidium in many drinking water sources is lower than was indicated by the data used in IESWTR. Median plant-mean levels for the Information Collection Rule and ICRSS data sets are approximately 0.05 oocysts/L, whereas the median oocyst concentration in the LeChevallier and Norton (1995) data used in the IESWTR risk assessment was 2.3 oocysts/L.

(2) Cryptosporidium occurrence is more variable from plant to plant than was indicated by the data considered for the IESWTR (i.e., occurrence distribution is broader). This is illustrated by considering the ratio of the 90th percentile to the median plant-mean concentration. In the LeChevallier and Norton (1995) data used for the IESWTR, this ratio was 4.6, whereas in the Information Collection Rule data, this ratio is 27.

These data, therefore, support the finding that Cryptosporidium levels are relatively low in most water sources, but there is a subset of sources with relatively higher concentrations where additional treatment may be appropriate.

In regard to unfiltered plants, the Information Collection Rule data are consistent with the Cryptosporidium occurrence estimates for unfiltered systems in the IESWTR. However, due to the lower occurrence estimates for filtered systems noted previously, the Information Collection Rule data do not support the IESWTR finding that unfiltered water systems in compliance with the source water requirements of the SWTR have a risk of cryptosporidiosis equivalent to that of a well-operated filter plant using a water source of average quality (63 FR 69492, December 16, 1998) (USEPA 1998a). Rather, these data indicate that Agency conclusions regarding the risk comparison between unfiltered and filtered drinking waters must be revised. For protection equivalent to that provided by filtered systems, unfiltered systems must take additional steps to strengthen their microbial barriers.

6. Request for Comment on Information Collection Rule and ICRSS Data Sets

EPA notes that there are significant differences in the Information Collection Rule and ICRSS medium and large system data sets. The median values for these data sets are 0.048, 0.050, and 0.045 oocysts/L, respectively, while the 90th percentile values are 1.3, 0.33, and 0.24 oocysts/L. The reasons for these differences are not readily apparent. The ICRSS used a newer method with better quality control that yields significantly higher recovery, and this suggests that these data are more reliable for estimating concentrations at individual plants. However, the Information Collection Rule included a much larger number of plants (350 v. 40 each for the ICRSS medium and large system surveys) and, consequently, may be more reliable for estimating occurrence nationally. The surveys included a similar number of samples per plant (18 v. 24 in the ICRSS). The two surveys cover different time periods (7/97-12/98 for the Information Collection Rule and 3/99-2/00 for the ICRSS).

In order to better understand the factors that may account for the differences in the three data sets, EPA conducted several additional analyses. First, EPA compared results for the subset of 40 plants that were in both the Information Collection Rule and ICRSS large system surveys. The medians for the two data sets were 0.13 and 0.045 oocysts/L, respectively, while the 90th percentiles were 1.5 and 0.24 oocysts/L. Clearly, the discrepancy between the two surveys persists for the subsample of data from plants that participated in both surveys. This suggests that the different sample groups in the full data sets are not the primary factor that accounts for the different results.

Next, EPA looked at the six month period (July through December) that was sampled in two consecutive years (1997 and 1998) during the Information Collection Rule survey to investigate year-to-year variations at the same plants. Estimated medians for 1997 and 1998 were 0.062 and 0.040 oocysts/L, respectively, while the 90th percentiles were 1.1 and 1.3 oocysts/L. While these comparisons show some interyear variability, it is less than the variability observed between the Information Collection Rule and ICRSS data sets. EPA has no data comparing the same plants using the same methods across the two survey periods (1997-98 and 1999-2000), so it is not known whether the variation between those periods was larger than the apparent variation between 1997 and 1998 in the Information Collection Rule data set.

The choice of data set has a significant effect on exposure, cost, and benefit estimates for the LT2ESWTR. Due to the lack of any clear criterion for favoring one data set over the other, EPA has conducted the analyses for this proposed rule separately for each, and presents a range of estimates based on the three data sets. EPA requests comment on this approach. EPA will continue to evaluate the relative strengths and limitations of the three data sets, as well as any new data that may become available for the final rule.

D. Treatment

1. Overview

This section presents information on treatment processes for reducing the risk from Cryptosporidium in drinking water. Treatment information is critical to two aspects of the LT2ESWTR: (1) estimates of the efficiency of water filtration plants in removing Cryptosporidium are used in assessing risk in treated drinking water and (2) the performance and availability of treatment technologies like ozone, UV light, and membranes that effectively inactivate or remove Cryptosporidium impact the feasibility of requiring additional treatment for this pathogen.

The majority of plants treating surface water use conventional filtration treatment, which is defined in 40 CFR 141.2 as a series of processes including coagulation, flocculation, sedimentation, and filtration. Direct filtration, which is typically used on sources with low particulate levels, includes coagulation and filtration but not sedimentation. Other common filtration processes are slow sand, diatomaceous earth (DE), membranes, and bag and cartridge filters.

For the IESWTR (and later the LT1ESWTR), EPA evaluated results from pilot and full scale studies of Cryptosporidium removal by various types of filtration plants. Based on these studies, EPA concluded that conventional and direct filtration plants meeting IESWTR filter effluent turbidity standards will achieve a minimum 2 log (99%) removal of Cryptosporidium. The Agency reached the same conclusion for slow sand and DE filtration plants meeting SWTR turbidity standards. Treatment credit for technologies like membranes and bag and cartridge filters was to be awarded on a product-specific basis.
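The log removal shorthand used throughout this section converts to percent reduction as follows: "n log" removal means the concentration is reduced by a factor of 10^n, so the surviving fraction is 10^-n.

```python
# Convert "n log" removal to percent reduction: n log removal leaves a
# fraction 10**(-n) of the original concentration.
def log_removal_to_percent(n):
    return (1.0 - 10.0 ** (-n)) * 100.0

for n in (2, 2.5, 3):
    print(f"{n} log removal = {log_removal_to_percent(n):.2f}% reduction")
```

This is why 2 log corresponds to 99% and 3 log to 99.9%; the 2.5 log estimate used later for direct filtration falls between the two at about 99.68%.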

Subsequent to promulgating the IESWTR and LT1ESWTR, EPA has reviewed additional studies of the performance of treatment plants in removing Cryptosporidium, as well as other micron-sized particles (e.g., aerobic spores) that may serve as indicators of Cryptosporidium removal. As discussed later in this section, the Agency has concluded that these studies support an estimate of 3 log (99.9%) for the average Cryptosporidium removal efficiency of conventional treatment plants in compliance with the IESWTR or LT1ESWTR. Section IV.A describes how this estimate of average removal efficiency is used in determining the need for additional Cryptosporidium treatment under the LT2ESWTR. Further, this estimate is consistent with the Stage 2 M-DBP Agreement in Principle, which states as follows:

The additional treatment requirements in the (LT2ESWTR) bin requirement table are based, in part, on the assumption that conventional treatment plants in compliance with the IESWTR achieve an average of 3 logs removal of Cryptosporidium.

In addition, the Agency finds that available data support an estimate of 3 log average Cryptosporidium removal for well operated slow sand and DE plants. Direct filtration plants are estimated to achieve a 2.5 log average Cryptosporidium reduction, in consideration of the absence of a sedimentation process in these plants.

The most significant developments in the treatment of Cryptosporidium since IESWTR promulgation are in the area of inactivation. During IESWTR development, EPA determined that available data were not sufficient to identify criteria for awarding Cryptosporidium treatment credit for any disinfectant. As presented in section IV.C.14, EPA has now acquired the necessary data to specify the disinfectant concentrations and contact times necessary to achieve different levels of Cryptosporidium inactivation with chlorine dioxide and ozone. Additionally, recent studies have demonstrated that UV light will produce high levels of Cryptosporidium and Giardia lamblia inactivation at low doses. Section IV.C.15 provides criteria for systems to achieve credit for disinfection of Cryptosporidium, Giardia lamblia, and viruses by UV.

This section begins with a summary of treatment information considered for the IESWTR and LT1ESWTR, followed by a discussion of additional data that EPA has evaluated since promulgating those regulations. Further information on treatment of Cryptosporidium is available in Technologies and Costs for Control of Microbial Contaminants and Disinfection Byproducts (USEPA 2003c), Occurrence and Exposure Assessment for the Long Term 2 Enhanced Surface Water Treatment Rule (USEPA 2003b) and section IV.C of this preamble.

2. Treatment information considered for the IESWTR and LT1ESWTR

Treatment studies that were evaluated during development of the IESWTR are described in the IESWTR NODA (62 FR 59486, November 3, 1997) (USEPA 1997a), the Regulatory Impact Analysis for the IESWTR (USEPA 1998d), and Technologies and Costs for the Microbial Recommendations of the M/DBP Advisory Committee (USEPA 1997b). Treatment information considered in development of the LT1ESWTR is described in the proposed rule (65 FR 19046, April 10, 2000) (USEPA 2000b). Pertinent information is summarized in the following paragraphs.

a. Physical removal. EPA evaluated eight studies on removal of Cryptosporidium by rapid granular filtration for the IESWTR. These were Patania et al. (1995), Nieminski and Ongerth (1995), Ongerth and Pecoraro (1995), LeChevallier and Norton (1992), LeChevallier et al. (1991), Foundation for Water Research (1994), Kelley et al. (1995), and West et al. (1994). These studies included both pilot and full scale plants.

Full scale plants in these studies typically demonstrated 2-3 log removal of Cryptosporidium, and pilot plants achieved up to almost 6 log removal under optimized conditions. In general, the degree of removal that can be quantified in full scale plants is limited because Cryptosporidium levels following filtration are often below the detection limit of the analytical method. Pilot scale studies overcome this limitation by seeding high concentrations of oocysts to the plant influent, but extrapolation of the performance of a pilot plant to the routine performance of full scale plants is uncertain.

Cryptosporidium removal efficiency in these studies was observed to depend on a number of factors, including water matrix, coagulant application, treatment optimization, filtered water turbidity, and the filtration cycle. The highest removal rates were observed in plants that achieved very low effluent turbidities.

EPA also evaluated studies of Cryptosporidium removal by slow sand (Schuler and Ghosh 1991, Timms et al. 1995) and DE filtration (Schuler and Ghosh 1990) for the IESWTR. These studies indicated that a well designed and operated plant using these processes could achieve 3 log or greater removal of Cryptosporidium.

After considering these studies, EPA concluded that conventional and direct filtration plants in compliance with the effluent turbidity criteria of the IESWTR, and slow sand and DE plants in compliance with the effluent turbidity criteria established for these processes by the SWTR, would achieve at least 2 log removal of Cryptosporidium. Recognizing that many plants will achieve more than the minimum 2 log reduction, EPA estimated median Cryptosporidium removal among filtration plants as near 3 log (99.9%) for the purpose of assessing risk.

The LT1ESWTR proposal included summaries of additional studies of Cryptosporidium removal by conventional treatment (Dugan et al. 1999), direct filtration (Swertfeger et al. 1998), and DE filtration (Ongerth and Hutton 1997). These studies supported IESWTR conclusions stated previously regarding the performance of these processes. The LT1ESWTR proposal also summarized studies of membranes, bag filters, and cartridge filters (Jacangelo et al. 1995, Drozd and Schwartzbrod 1997, Hirata and Hashimoto 1998, Goodrich et al. 1995, Collins et al. 1996, Lykins et al. 1994, Adham et al. 1998). This research demonstrated that these technologies may be capable of achieving 2 log or greater removal of Cryptosporidium. However, EPA concluded that variation in performance among different manufacturers and models necessitates that determinations of treatment credit be made on a technology-specific basis (65 FR 19065, April 10, 2000) (USEPA 2000b).

b. Inactivation. In the IESWTR NODA (62 FR 59486) (USEPA 1997a), EPA cited studies that demonstrated that chlorine is ineffective for inactivation of Cryptosporidium at doses practical for treatment plants (Korich et al. 1990, Ransome et al. 1993, Finch et al. 1997). The Agency also summarized studies of Cryptosporidium inactivation by UV, ozone, and chlorine dioxide. EPA evaluated these disinfectants to determine if sufficient data were available to develop prescriptive disinfection criteria for Cryptosporidium.

The studies of UV disinfection of Cryptosporidium that were available during IESWTR development were inconclusive due to methodological factors. These studies included: Lorenzo-Lorenzo et al. (1993), Ransome et al. (1993), Campbell et al. (1995), Finch et al. (1997), and Clancy et al. (1997). A common limitation among these studies was the use of in vitro assays, such as excystation and vital dye staining, to measure loss of infectivity. These assays subsequently were shown to overestimate the UV dose needed to inactivate protozoa (Clancy et al. 1998, Craik et al. 2000). In another case, a reactor vessel that blocked germicidal light was used (Finch et al. 1997).

EPA evaluated the following studies of ozone inactivation of Cryptosporidium for the IESWTR: Peeters et al. (1989), Korich et al. (1990), Parker et al. (1993), Ransome et al. (1993), Finch et al. (1997), Daniel et al. (1993), and Miltner et al. (1997). These studies demonstrated that ozone could achieve high levels of Cryptosporidium inactivation, albeit at doses much higher than those required to inactivate Giardia. Results of these studies also exhibited significant variability due to factors like different infectivity assays and methods of dose calculation.

The status of chlorine dioxide inactivation of Cryptosporidium during IESWTR development was similar to that of ozone. EPA evaluated a number of studies that indicated that relatively high doses of chlorine dioxide could achieve significant inactivation of Cryptosporidium (Peeters et al. 1989, Korich et al. 1990, Ransome et al. 1993, Finch et al. 1995 and 1997, and LeChevallier et al. 1997). Data from these studies showed a high level of variability due to methodological differences, and the feasibility of high chlorine dioxide doses was uncertain due to the MCL for chlorite that was established by the Stage 1 DBPR.

After reviewing these studies, EPA and the Stage 1 Federal Advisory Committee concluded that available data were not adequate to award Cryptosporidium inactivation credit for UV, ozone, or chlorine dioxide.

3. New Information on Treatment for Control of Cryptosporidium

a. Conventional filtration treatment and direct filtration. This section provides brief descriptions of seven recent studies of Cryptosporidium removal by conventional treatment and direct filtration, followed by a summary of key points.

Dugan et al. (2001) evaluated the ability of conventional treatment to control Cryptosporidium under varying water quality and treatment conditions, and assessed turbidity, total particle counts (TPC), and aerobic endospores as indicators of Cryptosporidium removal. Fourteen runs were conducted on a small pilot scale plant that had been determined to provide equivalent performance to a larger plant. Under optimal coagulation conditions, oocyst removal across the sedimentation basin ranged from 0.6 to 1.8 log, averaging 1.3 log, and removal across the filters ranged from 2.9 to greater than 4.4 log, averaging greater than 3.7 log. Removal of aerobic spores, TPC, and turbidity all correlated with removal of Cryptosporidium by sedimentation, and these parameters were conservative indicators of Cryptosporidium removal across filtration. Sedimentation removal under optimal conditions was related to raw water quality, with the lowest Cryptosporidium removals observed when raw water turbidity was low.
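Because each process in a treatment train multiplies the remaining concentration by 10^-n, log removal credits for processes in series add. Applying that to the Dugan et al. (2001) optimal-coagulation averages quoted above (with the filtration figure reported as "greater than" 3.7 log) gives a rough overall figure:

```python
# Log removals across sequential processes add (reduction factors multiply).
# Values are the Dugan et al. (2001) optimal-condition averages; filtration
# was reported as greater than 3.7 log, so the total is a lower-bound estimate.
sedimentation_log = 1.3
filtration_log = 3.7
overall_log = sedimentation_log + filtration_log
overall_factor = 10 ** overall_log
print(f"overall ~ {overall_log:.1f} log, a reduction factor of about {overall_factor:,.0f}")
```

This additivity is the same convention used later in the preamble when combining removal and inactivation credits toward LT2ESWTR treatment requirements.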

Suboptimal coagulation conditions (underdosed relative to jar test predictions) significantly reduced plant performance. Oocyst removal in the sedimentation basin averaged 0.2 log, and removal by filtration averaged 1.5 log. Under suboptimal coagulation conditions, low sedimentation removals of Cryptosporidium were observed regardless of raw water turbidity.

Nieminski and Bellamy (2000) investigated surrogates as indicators of Giardia and Cryptosporidium in source water and as measures of treatment plant effectiveness. The study involved sampling for microbial pathogens (Giardia, Cryptosporidium, and enteric viruses), potential surrogates (bacteria, bacterial spores, bacterial phages, turbidity, particles), and other water quality parameters in the source and finished waters of 23 surface water filtration facilities and one unfiltered system.

While Giardia and Cryptosporidium were found in the majority of source water samples, the investigators could not establish a correlation between either occurrence or removal of these protozoa and any of the surrogates tested. This was attributed, in part, to low concentrations of Giardia and Cryptosporidium in raw water and high analytical method detection limits. Removal of Cryptosporidium and Giardia averaged 2.2 and 2.6 log, respectively, when conservatively estimated using detection limits in filtered water. Aerobic spores were found in 85% of filtered water samples and were considered a measure of general treatment effectiveness. Average reduction of aerobic spores was 2.84 log. Direct filtration plants removed fewer aerobic spores than conventional or softening plants.

McTigue et al. (1998) conducted an on-site survey of 100 treatment plants for particle counts, pathogens (Cryptosporidium and Giardia), and operational information. The authors also performed pilot scale spiking studies. Median removal of particles greater than 2 μm was 2.8 log, with values ranging from 0.04 to 5.5 log. Removal generally increased with increasing raw water particle concentration. Results were consistent with previously collected data. Cryptosporidium and Giardia were found in the majority of raw water sources, but calculation of their log removal was limited by the concentration present. River sources had a higher incidence of pathogen occurrence. Direct filtration plants had higher levels of pathogens in the filtered water than others in the survey.

Nearly all of the filter runs evaluated in the survey exhibited spikes where filtered water particle counts increased, and pilot work showed that pathogens are more likely to be released during these spike events. Cryptosporidium removal in the pilot scale spiking study averaged nearly 4 log, regardless of the influent oocyst concentration. Pilot study results indicated a strong relationship between removal of Cryptosporidium and removal of particles (≥3 μm) during runs using optimal coagulation and similar temperatures.

Patania et al. (1999) evaluated removal of Cryptosporidium at varied raw water and filter effluent turbidity levels using direct filtration. Runs were conducted with both low (2 NTU) and high (10 NTU) raw water turbidity. Targeted filtered water turbidity was either 0.02 or 0.05 NTU. At equivalent filtered water turbidity, Cryptosporidium removal was slightly higher when the raw water turbidity was higher. Also, Cryptosporidium removal was enhanced by an average of 1.5 log when steady-state filtered water turbidity was 0.02 NTU compared to 0.05 NTU.

Huck et al. (2000) evaluated filtration efficiency during optimal and suboptimal coagulation conditions with two pilot scale filtration plants. One plant employed a high coagulation dose for both total organic carbon (TOC) and particle removal, and the second plant used a low dose intended for particle removal only. Under optimal operating conditions, which were selected to achieve filtered water turbidity below 0.1 NTU, median Cryptosporidium removal was 5.6 log at the high coagulant dose plant and 3 log at the low dose plant. Under suboptimal coagulation conditions, where the coagulant dose was reduced to achieve filtered water turbidity of 0.2 to 0.3 NTU, median Cryptosporidium removals dropped to 3.2 log and 1 log at the high dose and low dose plants, respectively. Oocyst removal also decreased substantially at the end of the filter cycle, although this was not always indicated by an increase in turbidity. Runs conducted with no coagulant resulted in very little Cryptosporidium removal.

Emelko et al. (2000) investigated Cryptosporidium removal during vulnerable filtration periods using a pilot scale direct filtration system. The authors evaluated four different operational conditions: stable, early breakthrough, late breakthrough, and end of run. During stable operation, effluent turbidity was approximately 0.04 NTU and Cryptosporidium removal ranged from 4.7 to 5.8 log. In the early breakthrough period, effluent turbidity increased from approximately 0.04 to 0.2 NTU, and Cryptosporidium removal decreased significantly, averaging 2.1 log. For the late breakthrough period, where effluent turbidity began at approximately 0.25 NTU and ended at 0.35 NTU, Cryptosporidium removal dropped to an average of 1.4 log. Two experiments tested Cryptosporidium removal during the end-of-run operation, when effluent turbidities generally start increasing. Turbidity started at about 0.04 NTU for both experiments and ended at 0.06 NTU for the first experiment and 0.13 NTU for the second. Reported Cryptosporidium removal ranged from 1.8 to 3.3 log, with an average of 2.5 log for both experiments.

Harrington et al. (2001) studied the removal of Cryptosporidium and emerging pathogens by filtration, sedimentation, and dissolved air flotation (DAF) using bench scale jar tests and pilot scale conventional treatment trains. In the bench scale experiments, all run at optimized coagulant doses, mean log removal of Cryptosporidium was 1.2 by sedimentation and 1.7 by DAF. Cryptosporidium removal was similar in all four water sources that were evaluated and was not significantly affected by lower pH or coagulant aid addition. However, removal of Cryptosporidium was greater at 22°C than at 5°C, and was observed to be higher with alum coagulant than with either polyaluminum hydroxychlorosulfate or ferric chloride.

In the pilot scale experiments, mean log removal of Cryptosporidium was 1.9 in filtered water with turbidity of 0.2 NTU or less. Removal increased as filtered water turbidity dropped below 0.3 NTU. There was no apparent effect of filtration rate on removal efficiency. In comparing Cryptosporidium removal by sand, dual media (anthracite/sand), and trimedia (anthracite/sand/garnet) filters, no difference was observed near neutral pH. However, at pH 5.7, removal increased significantly in the sand filter and it outperformed the other filter media configurations. The authors found no apparent explanation for this behavior. There was no observable effect of a turbidity spike on Cryptosporidium removal.

Significance of Conventional and Direct Filtration Studies

The performance of treatment plants under current regulations is a significant factor in determining the need for additional treatment. As described in section IV.A, the proposed Cryptosporidium treatment requirements associated with LT2ESWTR risk bins for filtered systems are based, in part, on an estimate that conventional plants in compliance with the IESWTR achieve an average of 3 log Cryptosporidium removal. The following discussion illustrates why EPA believes that available data support this estimate.

While Cryptosporidium removal at full scale plants is difficult to quantify due to limitations with analytical methods, pilot scale studies show that reductions in aerobic spores and total particle counts are often conservative indicators of filtration plant removal efficiency for Cryptosporidium (Dugan et al. 2001, McTigue et al. 1998, Yates et al. 1998, Emelko et al. 1999 and 2000). Surveys of full scale plants have reported average reductions near 3 log for both aerobic spores (Nieminski and Bellamy, 2000) and total particle counts (McTigue et al. 1998). Consequently, these findings are consistent with an estimate that average removal of Cryptosporidium by filtration plants is approximately 3 log.

Pilot scale Cryptosporidium spiking studies (Dugan et al. 2001, Huck et al. 2000, Emelko et al. 2000, McTigue et al. 1998, Patania et al. 1995) suggest that a conventional treatment plant has the potential to achieve greater than 5 log removal of Cryptosporidium under optimal conditions. However, these high removals are typically observed at very low filter effluent turbidity values, and the data show that removal efficiency can decrease substantially over the course of a filtration cycle or if coagulation is not optimized (Dugan et al. 2001, Huck et al. 2000, Emelko et al. 2000, Harrington et al. 2001). Removal efficiency also appears to be impacted by source water quality (Dugan et al. 2001, McTigue et al. 1998). Given these considerations, EPA believes that 3 log is a reasonable estimate of average Cryptosporidium removal efficiency for conventional treatment plants in compliance with the IESWTR or LT1ESWTR.

The Stage 2 M-DBP Advisory Committee did not address direct filtration plants, which lack the sedimentation basin of a conventional treatment train, but recommended that EPA address these plants in the LT2ESWTR proposal (65 FR 83015, December 29, 2000) (USEPA 2000a). While some studies have observed similar levels of Cryptosporidium removal in direct and conventional filtration plants (Nieminski and Ongerth, 1995, Ongerth and Pecoraro 1995), EPA has concluded that the majority of available data support a lower estimate of Cryptosporidium removal efficiency for direct filtration plants.

As described in section IV.C.5, pilot and full scale studies demonstrate that sedimentation basins, which are absent in direct filtration, can achieve 0.5 log or greater Cryptosporidium reduction (Dugan et al. 2001, Patania et al. 1995, Edzwald and Kelley 1998, Payment and Franco 1993, Kelley et al. 1995). In addition, Patania et al. (1995) observed direct filtration to achieve less Cryptosporidium removal than conventional treatment, and McTigue et al. (1998) found a higher incidence of Cryptosporidium in the treated water of direct filtration plants. Given these findings, EPA has estimated that direct filtration plants achieve an average of 2.5 log Cryptosporidium reduction (i.e., 0.5 log less than conventional treatment).

i. Dissolved air flotation. Dissolved air flotation (DAF) is a solid-liquid separation process that can be used in conventional treatment trains in place of gravity sedimentation. DAF takes advantage of the buoyancy of oocysts by floating oocyst/particle complexes to the surface for removal. In DAF, air is dissolved in pressurized water, which is then released into a flotation tank containing flocculated particles. As the water enters the tank, the dissolved air forms small bubbles that collide with and attach to floc particles and float to the surface (Gregory and Zabel, 1990).

In comparing DAF with gravity sedimentation, Plummer et al. (1995) observed up to 0.81 log removal of oocysts in the gravity sedimentation process, while DAF achieved 0.38 to 3.7 log removal, depending on coagulant dose. Edzwald and Kelley (1998) demonstrated a 3 log removal of oocysts using DAF, compared with a 1 log removal using gravity sedimentation in the clarification process before filtration. In bench scale testing by Harrington et al. (2001), DAF averaged 0.5 log higher removal of Cryptosporidium than gravity sedimentation. Based on these results, EPA has concluded that a treatment plant using DAF plus filtration can achieve levels of Cryptosporidium removal equivalent to or greater than a conventional treatment plant with gravity sedimentation.

b. Slow sand filtration. Slow sand filtration is a process involving passage of raw water through a bed of sand at low velocity (generally less than 0.4 m/h) resulting in substantial particulate removal by physical and biological mechanisms. For the LT2ESWTR proposal, EPA has reviewed two additional studies of slow sand filtration.

Fogel et al. (1993) evaluated removal efficiencies for Cryptosporidium and Giardia at a full scale slow sand filtration plant. Removals ranged from 0.1-0.5 log for Cryptosporidium and 0.9-1.4 log for Giardia. Raw water turbidity ranged from 1.3 to 1.6 NTU and decreased to 0.31-0.35 NTU after filtration. The authors attributed the low Cryptosporidium and Giardia removals to the relatively poor grade of filter media and low water temperature. The sand had a higher uniformity coefficient than recommended by design standards, which creates larger pore spaces within the filter bed and reduces its biological removal capacity. Low water temperatures (1 °C) also decreased biological activity in the filter media.

Hall et al. (1994) examined the removal of Cryptosporidium with a pilot scale slow sand filtration plant. Cryptosporidium removals ranged from 2.8 to 4.3 log after filter maturation, with an average of 3.8 log (at least one week after filter scraping). Raw water turbidity ranged from 3.0 NTU to 7.5 NTU for three of four runs and 15.0 NTU for a fourth run. Filtered water turbidity was 0.2 to 0.4 NTU, except for the fourth run which had 2.5 NTU filtered water turbidity. This study also included an investigation of Cryptosporidium removal during filter start-up where the filtration rate was slowly increased over a 4 day period. Results indicate that filter ripening did not appear to affect Cryptosporidium removal.

The study by Fogel et al. (1993) is significant because it indicates that a slow sand filtration plant may achieve less than 2 log removal of Cryptosporidium while complying with the effluent turbidity requirements of the IESWTR and LT1ESWTR. The authors attributed this poor performance to improper filter design, which, if correct, illustrates the importance of proper design for removal efficiency in slow sand filters. In contrast, the study by Hall et al. (1994) supports other work (Schuler and Ghosh 1991, Timms et al. 1995) in finding that slow sand filtration can achieve Cryptosporidium removal greater than 3 log. Overall, this body of work indicates that slow sand filtration has the potential to achieve Cryptosporidium removal efficiencies similar to those of a conventional plant, but proper design and operation are critical to realizing treatment goals.

c. Diatomaceous earth filtration. Diatomaceous earth (DE) filtration is a process in which a precoat cake of filter media is deposited on a support membrane and additional filter media is continuously added to the feed water to maintain the permeability of the filter cake. Since the IESWTR and LT1ESWTR, EPA has reviewed one new study of DE filtration (Ongerth and Hutton 2001). It supports the findings of earlier studies (Schuler and Ghosh 1990, Ongerth and Hutton 1997) in showing that a well designed and operated DE plant can achieve Cryptosporidium removal equivalent to a conventional treatment plant (i.e., average of 3 log).

d. Other filtration technologies. In today's proposal, information about bag filters, cartridge filters, and membranes, including criteria for awarding Cryptosporidium treatment credit, is presented in section IV.C as part of the microbial toolbox. Section IV.C also addresses credit for pretreatment options like presedimentation basins and bank filtration.

e. Inactivation. Substantial advances in understanding of Cryptosporidium inactivation by ozone, chlorine dioxide, and UV have been made following the IESWTR and LT1ESWTR. These advances have allowed EPA to develop criteria to award Cryptosporidium treatment credit for these disinfectants. Relevant information is summarized next, with additional information sources noted.

i. Ozone and chlorine dioxide. With the completion of several major studies, EPA has acquired sufficient information to develop standards for the inactivation of Cryptosporidium by ozone and chlorine dioxide. For both of these disinfectants, today's proposal includes CT tables that specify a level of Cryptosporidium treatment credit based on the product of disinfectant concentration and contact time.

For ozone, the CT tables in today's proposal were developed by considering four sets of experimental data: Li et al. (2001), Owens et al. (2000), Oppenheimer et al. (2000), and Rennecker et al. (1999). Chlorine dioxide CT tables are based on three experimental data sets: Li et al. (2001), Owens et al. (1999), and Ruffell et al. (2000). Together these studies provide a large body of data covering a range of water matrices, both laboratory and natural. While the data exhibit variability, EPA believes that collectively they are sufficient to determine appropriate levels of treatment credit as a function of disinfection conditions. CT tables for ozone and chlorine dioxide inactivation of Cryptosporidium are presented in section IV.C.14 of this preamble.
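The CT concept described above can be illustrated with a short sketch that computes a CT value and looks up a log-inactivation credit. The table values below are hypothetical placeholders for illustration only; they are not the proposed regulatory CT tables, which appear in section IV.C.14.

```python
# Illustrative sketch of the CT concept: treatment credit is awarded
# based on the product of disinfectant concentration and contact time.
# NOTE: the table values below are hypothetical placeholders, NOT the
# proposed regulatory CT tables (those appear in section IV.C.14).

def ct_value(concentration_mg_per_l, contact_time_min):
    """CT = residual disinfectant concentration (mg/L) x contact time (min)."""
    return concentration_mg_per_l * contact_time_min

def log_credit(ct, table):
    """Return the highest log-inactivation credit whose required CT is met.

    `table` is a list of (required_ct, log_credit) pairs.
    """
    credit = 0.0
    for required_ct, credit_value in sorted(table):
        if ct >= required_ct:
            credit = credit_value
    return credit

# Hypothetical example table: (required CT in mg-min/L, log credit)
EXAMPLE_TABLE = [(4.0, 0.5), (8.0, 1.0), (16.0, 2.0)]

ct = ct_value(0.8, 12.0)  # roughly 9.6 mg-min/L
credit = log_credit(ct, EXAMPLE_TABLE)  # 1.0 log under the example table
```

The stepwise lookup mirrors how a CT table is read: the credit awarded is the highest row whose required CT the plant's disinfection conditions satisfy.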

ii. Ultraviolet light. A major recent development is the finding that UV light is highly effective for inactivating Cryptosporidium and Giardia at low doses. Research prior to 1998 had indicated that very high doses of UV light were required to achieve substantial disinfection of protozoa. However, as noted previously, these results were largely based on the use of in vitro assays, which were later shown to substantially overestimate the UV doses required to prevent infection (Clancy et al. 1998, Bukhari et al. 1999, Craik et al. 2000). Recent research using in vivo assays (e.g., neonatal mouse infectivity) and cell culture techniques to measure infectivity has provided strong evidence that both Cryptosporidium and Giardia are highly sensitive to low doses of UV.

[Figure III-5 is not reproduced in this text version; view the image of the printed page.]

Figure III-5 presents data from selected studies of UV inactivation of Cryptosporidium. While the data in Figure III-5 show substantial scatter, they are consistent in demonstrating a high level of inactivation at relatively low UV doses. These studies generally demonstrated at least 3 log Cryptosporidium inactivation at UV doses of 10 mJ/cm² and higher. In comparison, typical UV doses for drinking water disinfection are 30 to 40 mJ/cm². A recent investigation by Clancy et al. (2002) showed that UV light at 10 mJ/cm² provided at least 4 log inactivation of five strains of Cryptosporidium that are infectious to humans. Studies of UV inactivation of Giardia have reported similar results (Craik et al. 2000, Mofidi et al. 2002, Linden et al. 2002, Campbell and Wallis 2002, Hayes et al. 2003).

In addition to efficacy for protozoa inactivation, data indicate that UV disinfection does not promote the formation of DBPs (Malley et al. 1995, Zheng et al. 1999). Malley et al. (1995) evaluated DBP formation in a number of surface and ground waters with UV doses between 60 and 200 mJ/cm². UV light did not directly form DBPs, such as trihalomethanes (THM) and haloacetic acids (HAA), and did not alter the concentration or species of DBPs formed by post-disinfection with chlorine or chloramines. A study by Zheng et al. (1999) reported that applying UV light following chlorine disinfection had little impact on THM and HAA formation. In addition, data suggest that photolysis of nitrate to nitrite, a potential concern with certain types of UV lamps, will not result in nitrite levels near the MCL under typical drinking water conditions (Peldszus et al. 2000, Sharpless and Linden 2001).

These studies demonstrate that UV light is an effective technology for inactivating Giardia and Cryptosporidium, and that it does not form DBPs at levels of concern in drinking water. Section IV.C.15 describes proposed criteria for awarding treatment credit for UV inactivation of Cryptosporidium, Giardia lamblia, and viruses. These criteria include UV dose tables, validation testing, and monitoring standards. In addition, EPA is preparing a UV Disinfection Guidance Manual with information on design, testing, and operation of UV systems. A draft of this guidance is available in the docket for today's proposal (http://www.epa.gov/edocket/).

iii. Significance of new information on inactivation. The research on ozone, chlorine dioxide, and UV light described in this proposal has made these disinfectants available for systems to use in meeting additional Cryptosporidium treatment requirements under LT2ESWTR. This overcomes a significant limitation to establishing inactivation requirements for Cryptosporidium that existed when the IESWTR was developed. The Stage 1 Advisory Committee recognized the need for inactivation criteria if EPA were to consider a risk based proposal for Cryptosporidium in future rulemaking (62 FR 59498, November 3, 1997) (USEPA 2000b). The CT tables for ozone and chlorine dioxide provide such criteria. In addition, the availability of UV furnishes another relatively low cost tool to achieve Cryptosporidium inactivation and DBP control.

While no single treatment technology is appropriate for all systems, EPA believes that these disinfectants, along with the other management and treatment options in the microbial toolbox presented in section IV.C, make it feasible for systems to meet the additional Cryptosporidium treatment requirements in today's proposal.

IV. Discussion of Proposed LT2ESWTR Requirements

A. Additional Cryptosporidium Treatment Technique Requirements for Filtered Systems

1. What Is EPA Proposing Today?

a. Overview of framework approach. EPA is proposing treatment technique requirements to supplement the existing requirements of the SWTR, IESWTR, and LT1ESWTR (see section II.B). The proposed requirements will achieve increased protection against Cryptosporidium in public water systems that use surface water or ground water under the direct influence of surface water as sources. Under this proposal, filtered systems will be assigned to one of four risk categories (or “bins”), based on the results of source water Cryptosporidium monitoring. Systems assigned to the lowest risk bin incur no additional treatment requirements, while systems assigned to higher risk bins must reduce Cryptosporidium levels beyond IESWTR and LT1ESWTR requirements. Systems will comply with additional Cryptosporidium treatment requirements by selecting treatment and management strategies from a “microbial toolbox” of control options.

Today's proposal reflects recommendations from the Stage 2 M-DBP Federal Advisory Committee (65 FR 83015, December 29, 2000) (USEPA 2000a), which described this approach as a “microbial framework”. This approach targets additional treatment requirements to those systems with the highest source water Cryptosporidium levels and, consequently, the highest vulnerability to this pathogen. In so doing, today's proposal builds upon the current treatment technique requirement for Cryptosporidium under which all filtered systems must achieve at least a 2 log reduction, regardless of source water quality. The intent of this proposal is to assure that public water systems with the higher risk source water achieve a level of public health protection commensurate with systems with less contaminated source water.

b. Monitoring requirements. Today's proposal requires systems to monitor their source water (influent water prior to the treatment plant) for Cryptosporidium, E. coli, and turbidity. The purpose of the monitoring is to assess source water Cryptosporidium levels and, thereby, classify systems in different risk bins. Proposed monitoring requirements for large and small systems are summarized in Table IV-1 and are characterized in the following discussion.

Large Systems

Large systems (serving at least 10,000 people) must sample their source water at least monthly for Cryptosporidium, E. coli, and turbidity for a period of 2 years, beginning no later than 6 months after LT2ESWTR promulgation. Systems may sample more frequently (e.g., twice-per-month, once-per-week), provided the same sampling frequency is used throughout the 2-year monitoring period. As described in section IV.A.1.c, systems that sample more frequently (at least twice-per-month) use a different calculation that is potentially less conservative to determine their bin classification.

The purpose of requiring large systems to collect E. coli and turbidity data is to further evaluate these parameters as indicators to identify drinking water sources that are susceptible to high concentrations of Cryptosporidium. As described next, these data will be applied to small system LT2ESWTR monitoring.

Small Systems

EPA is proposing a 2-phase monitoring strategy for small systems (serving fewer than 10,000 people) to reduce their monitoring burden. This approach is based on Information Collection Rule and ICRSS data indicating that systems with low source water E. coli levels are likely to have low Cryptosporidium levels, such that additional treatment would not be required under the LT2ESWTR. Under this approach, small systems must initially conduct one year of bi-weekly sampling (one sample every two weeks) for E. coli, beginning 2.5 years after LT2ESWTR promulgation. Small systems are triggered into Cryptosporidium monitoring only if the initial E. coli monitoring indicates a mean concentration greater than 10 E. coli/100 mL for systems using a reservoir or lake as their primary source or greater than 50 E. coli/100 mL for systems using a flowing stream as their primary source. Small systems that exceed these E. coli trigger values must conduct one year of twice-per-month Cryptosporidium sampling, beginning 4 years after LT2ESWTR promulgation.

The analysis supporting the proposed E. coli values that trigger Cryptosporidium monitoring by small systems is presented in Section IV.A.2. However, as recommended by the Stage 2 M-DBP Advisory Committee, EPA will evaluate Cryptosporidium indicator relationships in the LT2ESWTR monitoring data collected by large systems. If these data support the use of different indicator levels to trigger small system Cryptosporidium monitoring, EPA will issue guidance with recommendations. The proposed LT2ESWTR allows States to specify alternative indicator values for small systems, based on EPA guidance.
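The small-system trigger logic described above can be sketched as follows. The trigger values are those in today's proposal; the function and dictionary names are illustrative only, and States may specify alternative indicator values based on EPA guidance.

```python
# Sketch of the small-system E. coli trigger described above. The
# trigger values are those proposed; names here are illustrative, and
# States may specify alternative indicator values based on EPA guidance.

E_COLI_TRIGGERS_PER_100ML = {
    "lake_or_reservoir": 10.0,  # systems using a reservoir or lake as primary source
    "flowing_stream": 50.0,     # systems using a flowing stream as primary source
}

def must_monitor_cryptosporidium(annual_mean_e_coli_per_100ml, source_type):
    """True if the annual mean E. coli concentration exceeds the trigger,
    requiring one year of twice-per-month Cryptosporidium sampling."""
    return annual_mean_e_coli_per_100ml > E_COLI_TRIGGERS_PER_100ML[source_type]
```

Note that the trigger applies only when the mean strictly exceeds the value; a lake-source system with a mean of exactly 10 E. coli/100 mL is not triggered.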

Table IV-1.—LT2ESWTR Monitoring Requirements

Large systems (serving 10,000 or more people):
Monitoring begins: 6 months after promulgation of LT2ESWTR (a). Monitoring duration: 2 years. Cryptosporidium: minimum 1 sample/month (b). E. coli: minimum 1 sample/month (b). Turbidity: minimum 1 measurement/month (b).

Small systems (serving fewer than 10,000 people):
Monitoring begins: 30 months (2.5 years) after promulgation of LT2ESWTR. Monitoring duration: 1 year. Cryptosporidium: see following entry. E. coli: 1 sample every two weeks. Turbidity: N/A.

Small systems that exceed E. coli trigger levels (c):
Monitoring begins: 48 months (4 years) after promulgation of LT2ESWTR. Monitoring duration: 1 year. Cryptosporidium: 2 samples/month. E. coli: N/A. Turbidity: N/A.

(a) Public water systems may use equivalent previously collected (grandfathered) data to meet LT2ESWTR requirements. See section IV.A.1.d for details.
(b) Public water systems may sample more frequently (e.g., twice-per-month, once-per-week).
(c) Small systems must monitor for Cryptosporidium for one year, beginning 6 months after completion of E. coli monitoring, if the E. coli annual mean concentration exceeds 10/100 mL for systems using lake/reservoir sources or 50/100 mL for systems using flowing stream sources.
N/A = Not applicable. No monitoring required.

Sampling Location

Source water samples must be representative of the intake to the filtration plant. Generally, sampling must be performed individually for each plant that treats a surface water source. However, where multiple plants receive all of their water from the same influent (e.g., multiple plants draw water from the same pipe), the same set of monitoring results may be applicable to each plant. Typically, samples must be collected prior to any treatment, with exceptions for certain pretreatment processes. Directions on sampling location for plants using off-stream storage, presedimentation, and bank filtration are provided in section IV.C.

Systems with plants that use multiple water sources at the same time must collect samples from a tap where the sources are combined prior to treatment, if such a tap is available. If a blended source tap is not available, systems must collect samples from each source and either analyze a weighted composite (blended) sample or analyze samples from each source separately and determine a weighted average of the results.

Sampling Schedule

Large systems must submit a sampling schedule to EPA within 3 months after promulgation of the LT2ESWTR. Small systems must submit a sampling schedule for E. coli monitoring to their primacy agency within 27 months after rule promulgation; small systems required to monitor for Cryptosporidium must submit a Cryptosporidium sampling schedule within 45 months after promulgation. The sampling schedules must specify the calendar date on which the system will collect each sample required under the LT2ESWTR. Scheduled sampling dates should be evenly distributed throughout the monitoring period, but may be arranged to accommodate holidays, weekends, and other events when collecting or analyzing a sample would be problematic.

Systems must collect samples within 2 days before or 2 days after a scheduled sampling date. If a system does not sample within this 5-day window, the system will incur a monitoring violation unless either of the following two conditions applies:

(1) If extreme conditions or situations exist that may pose danger to the sample collector, or which are unforeseen or cannot be avoided and which cause the system to be unable to sample in the required time frame, the system must sample as close to the required date as feasible and submit an explanation for the alternative sampling date with the analytical results.

(2) Systems that are unable to report a valid Cryptosporidium analytical result for a scheduled sampling date due to failure to comply with analytical method quality control requirements (described in section IV.K) must collect a replacement sample within 14 days of being notified by the laboratory or the State that a result cannot be reported for that date. Systems must submit an explanation for the replacement sample with the analytical results. Where possible, the replacement sample collection date should not coincide with any other scheduled LT2ESWTR sampling dates.
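The 5-day window described above can be expressed as a simple date check. This is an illustrative sketch only; the exceptions in conditions (1) and (2) are not modeled.

```python
# Sketch of the 5-day sampling window described above: a sample must be
# collected within 2 days before or 2 days after its scheduled date.
# Illustrative only; the exceptions in conditions (1) and (2) are not modeled.

from datetime import date

def within_sampling_window(scheduled: date, collected: date) -> bool:
    """True if the collection date falls within the 5-day window
    centered on the scheduled sampling date."""
    return abs((collected - scheduled).days) <= 2
```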

Approved Analytical Methods and Laboratories

To ensure the quality of LT2ESWTR monitoring data, today's proposal requires systems to use approved methods for Cryptosporidium, E. coli, and turbidity analyses (see section IV.K for sample analysis requirements), and to have these analyses performed by approved laboratories (described in section IV.L).

Reporting

Because source water monitoring by large systems will begin 6 months after promulgation of the LT2ESWTR, EPA is proposing that monitoring results for large systems be reported directly to the Agency through an electronic data system (described in section IV.J), similar to the approach currently used under the Unregulated Contaminants Monitoring Rule (64 FR 50555, September 17, 1999) (USEPA 1999c). Small systems will report data to EPA or States, depending on whether States have assumed primacy for the LT2ESWTR.

Previously Collected Monitoring Results

EPA is proposing to allow systems to use previously collected (i.e., grandfathered) Cryptosporidium monitoring data to meet LT2ESWTR monitoring requirements if the data are equivalent to data that will be collected under the rule (e.g., sample volume, sampling frequency, analytical method quality control). Criteria for acceptance of previously collected data are specified in section IV.A.1.d.

Providing Additional Treatment Instead of Monitoring

Filtered systems are not required to conduct source water monitoring under the LT2ESWTR if the system currently provides or will provide a total of at least 5.5 log of treatment for Cryptosporidium, equivalent to meeting the treatment requirements of Bin 4 as shown in Table IV-4 (i.e., the maximum required in today's proposal). Systems must notify EPA or the State not later than the date the system is otherwise required to submit a sampling schedule for monitoring and must install and operate technologies to provide a total of at least 5.5 log of treatment for Cryptosporidium by the applicable date in Table IV-23. Any filtered system that fails to complete LT2ESWTR monitoring requirements must meet the treatment requirements for Bin 4.

Ongoing Source Assessment and Second Round of Monitoring

Because LT2ESWTR treatment requirements are related to the degree of source water contamination, today's proposal contains provisions to assess changes in a system's source water quality following initial risk bin classification. These provisions include source water assessment during sanitary surveys and a second round of monitoring.

Under 40 CFR 142.16(b)(3)(i), source water is one of the components that States must address during the sanitary surveys that are required for surface water systems. These sanitary surveys must be conducted every 3 years for community systems and every 5 years for non-community systems. EPA is proposing that if the State determines during the sanitary survey that significant changes have occurred in the watershed that could lead to increased contamination of the source water, the State may require systems to implement specific actions to address the contamination. These actions include implementing options from the microbial toolbox discussed in section IV.C.

EPA is proposing that systems conduct a second round of source water monitoring, beginning six years after systems are initially classified in LT2ESWTR risk bins. To prepare for this second round of monitoring, the Advisory Committee recommended that EPA initiate a stakeholder process four years after large systems complete initial bin classification. The purpose of the stakeholder process would be to review risk information, and to determine the appropriate analytical method, monitoring frequency, monitoring location, and other criteria for the second round of monitoring.

If EPA does not modify LT2ESWTR requirements through issuing a new regulation prior to the second round of monitoring, systems must carry out this monitoring according to the requirements that apply to the initial round of source water monitoring. Moreover, systems will be reclassified in LT2ESWTR risk bins based on the second round monitoring results and using the criteria specified in this section for initial bin classification. However, if EPA changes the LT2ESWTR risk bin structure to reflect a new analytical method or new risk information, systems will undergo a site specific risk characterization in accordance with the revised rule.

c. Treatment Requirements

i. Bin classification. Under the proposed LT2ESWTR, surface water systems that use filtration will be classified in one of four Cryptosporidium concentration categories (bins) based on the results of source water monitoring. As shown in Table IV-2, bin classification is determined by averaging the Cryptosporidium concentrations measured for individual samples.

Table IV-2.—Bin Classification Table for Filtered Systems

If your average Cryptosporidium concentration (1) is . . . Then your bin classification is . . .
Cryptosporidium < 0.075/L: Bin 1.
0.075/L ≤ Cryptosporidium < 1.0/L: Bin 2.
1.0/L ≤ Cryptosporidium < 3.0/L: Bin 3.
Cryptosporidium ≥ 3.0/L: Bin 4.

(1) All concentrations shown in units of oocysts/L.

The approach that systems will use to average individual sample concentrations to determine their bin classification depends on the number of samples collected and the length of the monitoring period. Systems serving at least 10,000 people are required to monitor for 24 months, and their bin classification must be based on the following:

(1) The highest twelve-month running annual average, for systems conducting monthly sampling; or

(2) The two-year mean, for systems conducting twice-per-month or more frequent sampling for 24 months (i.e., at least 48 samples).

Systems serving fewer than 10,000 people are required to collect 24 Cryptosporidium samples over 12 months if they exceed the E. coli trigger level, and their bin classification must be based on the mean of the 24 samples. As noted earlier, systems that fail to complete the required Cryptosporidium monitoring will be classified in Bin 4.
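The two averaging approaches described above can be sketched as follows (function names are illustrative; concentrations are in oocysts/L).

```python
# Sketch of the two averaging approaches described above (names are
# illustrative). Concentrations are in oocysts/L.

def highest_running_annual_average(monthly_values):
    """Highest 12-month running annual average, used by large systems
    that sample monthly for 24 months."""
    windows = [monthly_values[i:i + 12] for i in range(len(monthly_values) - 11)]
    return max(sum(w) / 12.0 for w in windows)

def overall_mean(values):
    """Simple mean, used for the two-year mean (large systems sampling at
    least twice-per-month) and the 24-sample mean (small systems)."""
    return sum(values) / float(len(values))
```

The running-annual-average approach is potentially more conservative than the mean, since a single contaminated 12-month window determines the result even if the other year is clean.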

When determining LT2ESWTR bin classification, systems must calculate individual sample concentrations using the total number of oocysts counted, unadjusted for method recovery, divided by the volume assayed (see section IV.K for details). As described in Section IV.A.2, the ranges of Cryptosporidium concentrations that define LT2ESWTR bins reflect consideration of analytical method recovery and the percent of Cryptosporidium oocysts that are infectious. Consequently, sample analysis results will not be adjusted for these factors.
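The concentration calculation and the Table IV-2 thresholds can be combined in a short sketch (illustrative only; function names are not part of the rule).

```python
# Sketch of the Table IV-2 bin thresholds and the unadjusted sample
# concentration calculation described above. Illustrative only.

def sample_concentration(oocysts_counted, volume_assayed_l):
    """Oocysts/L: total oocysts counted divided by volume assayed,
    unadjusted for method recovery."""
    return oocysts_counted / float(volume_assayed_l)

def bin_classification(avg_concentration_per_l):
    """Map an average Cryptosporidium concentration (oocysts/L) to an
    LT2ESWTR risk bin per Table IV-2."""
    if avg_concentration_per_l < 0.075:
        return 1
    elif avg_concentration_per_l < 1.0:
        return 2
    elif avg_concentration_per_l < 3.0:
        return 3
    else:
        return 4
```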

ii. Credit for treatment in place. A key parameter in determining additional Cryptosporidium treatment requirements is the credit that plants receive for treatment currently provided (i.e., treatment in place). For baseline treatment requirements established by the SWTR, IESWTR, and LT1ESWTR that apply uniformly to filtered systems, the Agency has awarded credit based on the minimum removal that plants will achieve. Specifically, in the IESWTR and LT1ESWTR, EPA determined that filtration plants, including conventional, direct, slow sand, and DE, meeting the required filter effluent turbidity criteria will achieve at least 2 log removal of Cryptosporidium. Consequently, these plants were awarded a 2 log Cryptosporidium removal credit, which equals the maximum treatment required under these regulations.

The LT2ESWTR will supplement existing regulations by mandating additional treatment at certain plants based on site specific conditions (i.e., source water Cryptosporidium level). When assessing the need for additional treatment beyond baseline requirements for higher risk systems, the Agency has determined that it is appropriate to consider the average removal efficiency achieved by treatment plants. As described in section III.D, EPA has concluded that conventional, slow sand, and DE plants in compliance with the SWTR, IESWTR, and LT1ESWTR achieve an average Cryptosporidium reduction of 3 log. Consequently, EPA is proposing to award these plants a 3 log credit towards Cryptosporidium treatment requirements under the LT2ESWTR. As noted previously, this approach is consistent with the Stage 2 M-DBP Agreement in Principle.

For other types of filtration plants, treatment credit under the LT2ESWTR differs. Conventional treatment is defined in 40 CFR 141.2 as a series of processes including coagulation, flocculation, sedimentation, and filtration, with sedimentation defined as a process for removal of solids before filtration by gravity or separation. Thus, plants with separation (i.e., clarification) processes other than gravity sedimentation between flocculation and filtration, such as DAF, may be regarded as conventional treatment for purposes of awarding treatment credit under the LT2ESWTR. However, for direct filtration plants, which lack a sedimentation process, EPA is proposing a 2.5 log Cryptosporidium removal credit. Studies that support awarding direct filtration plants less treatment credit than conventional plants are summarized in section III.D.

EPA is unable to estimate an average log removal for other filtration technologies like membranes, bag filters, and cartridge filters, due to variability among products. As a result, credit for these devices must be determined by the State, based on product specific testing described in section IV.C or other criteria approved by the State.

Table IV-3 presents the credit proposed for different types of plants towards LT2ESWTR Cryptosporidium treatment requirements. As described in section IV.C.18, a State may award greater credit to a system that demonstrates through a State-approved protocol that it reliably achieves a higher level of Cryptosporidium removal. Conversely, a State may award less credit to a system where the State determines, based on site specific information, that the system is not achieving the degree of Cryptosporidium removal indicated in Table IV-3.

Table IV-3.—Cryptosporidium Treatment Credit Towards LT2ESWTR Requirements (1)

Conventional treatment (includes softening): 3.0 log.
Direct filtration: 2.5 log.
Slow sand or diatomaceous earth filtration: 3.0 log.
Alternative filtration technologies: Determined by State (2).

(1) Applies to plants in full compliance with the SWTR, IESWTR, and LT1ESWTR as applicable.
(2) Credit must be determined through product or site specific assessment.

iii. Treatment requirements associated with LT2ESWTR bins

The treatment requirements associated with LT2ESWTR risk bins are shown in Table IV-4. The total Cryptosporidium treatment required for Bins 2, 3, and 4 is 4.0 log, 5.0 log, and 5.5 log, respectively. For conventional (including softening), slow sand, and DE plants that receive 3.0 log credit for compliance with current regulations, additional Cryptosporidium treatment of 1.0 to 2.5 log is required when classified in Bins 2-4. Direct filtration plants that receive 2.5 log credit for compliance with current regulations must achieve 1.5 to 3.0 log of additional Cryptosporidium treatment in Bins 2-4.

For systems using alternative filtration technologies, such as membranes or bag/cartridge filters, and classified in Bins 2-4, the State must determine additional treatment requirements based on the credit awarded to a particular technology. The additional treatment must be such that plants classified in Bins 2, 3, and 4 achieve the total required Cryptosporidium reductions of 4.0, 5.0, and 5.5 log, respectively.
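The bin arithmetic described above (each bin's total requirement minus the plant's baseline credit) can be illustrated with a short sketch; the helper function and data structures are ours, not part of the rule, and the values mirror Tables IV-3 and IV-4:

```python
# Additional Cryptosporidium treatment (in log units) required under the
# LT2ESWTR: each bin's total requirement minus the plant's baseline credit.
TOTAL_LOG_REQUIRED = {1: None, 2: 4.0, 3: 5.0, 4: 5.5}  # Bin 1: no addition
BASELINE_CREDIT = {
    "conventional": 3.0,          # includes softening
    "direct filtration": 2.5,
    "slow sand": 3.0,
    "diatomaceous earth": 3.0,
}

def additional_treatment(bin_number: int, plant_type: str) -> float:
    """Log of additional Cryptosporidium treatment required."""
    total = TOTAL_LOG_REQUIRED[bin_number]
    if total is None:
        return 0.0
    return round(total - BASELINE_CREDIT[plant_type], 1)
```

For example, `additional_treatment(2, "conventional")` returns 1.0 log and `additional_treatment(4, "direct filtration")` returns 3.0 log, matching Table IV-4.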

Table IV-4.—Treatment Requirements Per LT2ESWTR Bin Classification

If your bin classification is listed in the first column, and you use the following filtration treatment in full compliance with the SWTR, IESWTR, and LT1ESWTR (as applicable), then your additional treatment requirements are:

Bin classification   Conventional filtration treatment (includes softening)   Direct filtration         Slow sand or diatomaceous earth filtration   Alternative filtration technologies
Bin 1                No additional treatment                                  No additional treatment   No additional treatment                      No additional treatment
Bin 2                1 log treatment 1                                        1.5 log treatment 1       1 log treatment 1                            As determined by the State 1, 3
Bin 3                2 log treatment 2                                        2.5 log treatment 2       2 log treatment 2                            As determined by the State 2, 4
Bin 4                2.5 log treatment 2                                      3 log treatment 2         2.5 log treatment 2                          As determined by the State 2, 5

1 Systems may use any technology or combination of technologies from the microbial toolbox.
2 Systems must achieve at least 1 log of the required treatment using ozone, chlorine dioxide, UV, membranes, bag/cartridge filters, or bank filtration.
3 Total Cryptosporidium removal and inactivation must be at least 4.0 log.
4 Total Cryptosporidium removal and inactivation must be at least 5.0 log.
5 Total Cryptosporidium removal and inactivation must be at least 5.5 log.

Plants can achieve additional Cryptosporidium treatment credit through implementing pretreatment processes like presedimentation or bank filtration, by developing a watershed control program, and by applying additional treatment steps like UV, ozone, chlorine dioxide, and membranes. In addition, plants can receive additional credit for existing treatment through achieving very low filter effluent turbidity or through a demonstration of performance. Section IV.C presents criteria for awarding Cryptosporidium treatment credit to a host of treatment and control options, including those listed here and others, which are collectively termed the “microbial toolbox”.

Systems in Bin 2 can meet additional Cryptosporidium treatment requirements through using any option or combination of options from the microbial toolbox. In Bins 3 and 4, systems must achieve at least 1 log of the additional treatment requirement through using ozone, chlorine dioxide, UV, membranes, bag filtration, cartridge filtration, or bank filtration.

d. Use of previously collected data. Today's proposal allows systems with previously collected Cryptosporidium data (i.e., data collected prior to the required start of monitoring under the LT2ESWTR) that are equivalent in sample number, frequency, and data quality to data that will be collected under the LT2ESWTR to use those data in lieu of conducting new monitoring. Specifically, EPA is proposing that Cryptosporidium sample analysis results collected prior to promulgation of the LT2ESWTR must meet the following criteria to be used for bin classification:

  • Samples were analyzed by laboratories using validated versions of EPA Methods 1622 or 1623 and meeting the quality control criteria specified in these methods (USEPA 1999a, USEPA 1999b, USEPA 2001e, USEPA 2001f).
  • Samples were collected no less frequently than each calendar month on a regular schedule, beginning no earlier than January 1999 (when EPA Method 1622 was first released as an interlaboratory-validated method).
  • Samples were collected in equal intervals of time over the entire collection period (e.g., weekly, monthly). The allowances for deviations from a sampling schedule specified under IV.A.1.b for LT2ESWTR monitoring apply to grandfathered data.
  • Samples were collected at the correct location as specified for LT2ESWTR monitoring. Systems must report the use of bank filtration, presedimentation, and raw water off-stream storage during sampling.
  • For each sample, the laboratory analyzed at least 10 L of sample or at least 2 mL of packed pellet volume or as much volume as two filters could accommodate before clogging (applies only to filters that have been approved by EPA for use with Methods 1622 and 1623).
  • The system must certify that it is reporting all Cryptosporidium monitoring results generated by the system during the time period covered by the previously collected data. This applies to samples that were (a) collected from the sampling location used for LT2ESWTR monitoring, (b) not spiked, and (c) analyzed using the laboratory's routine process for Method 1622 or 1623 analyses.
  • The system must also certify that the samples were representative of a plant's source water(s) and the source water(s) have not changed.

If a system has at least two years of Cryptosporidium data collected before promulgation of the LT2ESWTR and the system does not intend to conduct new monitoring under the rule, the system must submit the data and the required supporting documentation to EPA no later than two months following promulgation of the rule. EPA will notify the system within four months following LT2ESWTR promulgation as to whether the data are sufficient for bin determination. Unless EPA notifies the system in writing that the previously collected data are sufficient for bin determination, the system must conduct source water Cryptosporidium monitoring as described in section IV.A.1.b of this preamble.

If a system intends to grandfather fewer than two years of Cryptosporidium data, or if a system intends to grandfather 2 or more years of previously collected data and also to conduct new monitoring under the rule, the system must submit the data and the required supporting documentation to EPA no later than eight months following promulgation of the rule. Systems must conduct monitoring as described in section IV.A.1.b until EPA notifies the system in writing that it has at least 2 years of acceptable data. See section IV.J for additional information on reporting requirements associated with previously collected data.

2. How Was This Proposal Developed?

The monitoring and treatment requirements for filtered systems proposed under the LT2ESWTR stem from the data and analyses described in this section and reflect recommendations made by the Stage 2 M-DBP Federal Advisory Committee (65 FR 83015) (USEPA 2000a).

a. Basis for targeted treatment requirements. Under the IESWTR, EPA established an MCLG of zero for Cryptosporidium at the genus level based on the public health risk associated with this pathogen. The IESWTR included a 2 log treatment technique requirement for medium and large filtered systems that controlled for Cryptosporidium as close to the MCLG as was then deemed technologically feasible, taking costs into consideration. The LT1ESWTR extended this requirement to small systems. Given the advances that have occurred subsequent to the IESWTR in available technology to measure and treat for Cryptosporidium, a key question for the LT2ESWTR was the extent to which Cryptosporidium should be further controlled to approach the MCLG of zero, considering technical feasibility, costs, and potential risks from DBPs.

The data and analysis presented in Section III of this preamble suggest wide variability in possible risk from Cryptosporidium among public water systems. This variability is largely due to three factors: (1) The broad distribution of Cryptosporidium occurrence levels among source waters, (2) disparities in the efficacy of treatment provided by plants, and (3) differences in the infectivity among Cryptosporidium isolates. EPA and the Advisory Committee considered this wide range of possible risks and the desire to address systems where the 2 log removal requirement established by the IESWTR and LT1ESWTR may not provide adequate public health protection.

A number of approaches were evaluated for furthering control of Cryptosporidium. One approach was to require all systems to provide the same degree of additional treatment for Cryptosporidium (i.e., beyond that required by the IESWTR and LT1ESWTR). This approach could ensure that most systems, including those with poor quality source water, would be adequately protective. The uniformity of this approach has the advantage of minimizing transactional costs for determining what must be done by a particular system to comply. However, a significant downside is that it may require more treatment, with consequent costs, than is needed by many systems with low source water Cryptosporidium levels. In addition, there were concerns with the feasibility of requiring almost all surface water treatment plants to install additional treatment processes for Cryptosporidium.

A second approach was to base additional treatment requirements on a plant's source water Cryptosporidium level. Under this approach, systems monitor their source water for Cryptosporidium, and additional treatment is required only from those systems that exceed specified oocyst concentrations. This has the advantage of targeting additional public health protection to those systems with higher vulnerability to Cryptosporidium, while avoiding the imposition of higher treatment costs on systems with the least contaminated source water. In consideration of these advantages, the Advisory Committee recommended and EPA is proposing this second approach for filtered systems under the LT2ESWTR.

b. Basis for bin concentration ranges and treatment requirements. The proposed LT2ESWTR will classify plants into different risk bins based on the source water Cryptosporidium level, and the bin classification will determine the extent to which additional treatment beyond IESWTR and LT1ESWTR is required. Two questions were central in developing the proposed bin concentration ranges and additional treatment requirements:

  • What is the risk associated with a given level of Cryptosporidium in a drinking water source?
  • What degree of additional treatment should be required for a given source water Cryptosporidium level?

This section addresses these two questions by first summarizing how EPA assessed the risk associated with Cryptosporidium in drinking water, followed by a description of how EPA and the Advisory Committee used this type of information in identifying LT2ESWTR bin concentration ranges and treatment requirements. For additional information on these topics, see Economic Analysis for the LT2ESWTR (USEPA 2003a).

i. What is the risk associated with a given level of Cryptosporidium in a drinking water source? The risk of infection from Cryptosporidium in drinking water is a function of infectivity (i.e., dose-response associated with ingestion) and exposure. Section III.B summarizes available data on Cryptosporidium infectivity. EPA conducted a meta-analysis of reported infection rates from human feeding studies with 3 Cryptosporidium isolates. This analysis produced an estimate for the mean probability of infection given a dose of one oocyst near 0.09 (9%), with 10th and 90th percentile confidence values of 0.011 and 0.22, respectively.

Exposure to Cryptosporidium depends on the concentration of oocysts in the source water, the efficiency of treatment plants in removing oocysts, and the volume of water ingested (exposure can also occur through interactions with infected individuals). Based on data presented in section III.D, EPA has estimated that filtration plants in compliance with the IESWTR or LT1ESWTR reduce source water Cryptosporidium levels by 2 to 5 log (99% to 99.999%), with an average reduction near 3 log. For drinking water consumption, EPA uses a distribution, derived from the United States Department of Agriculture's (USDA) 1994-96 Continuing Survey of Food Intakes by Individuals, with a mean value of 1.2 L/day. Average annual days of exposure to drinking water in CWS, non-transient non-community water systems (NTNCWS), and transient non-community water systems (TNCWS) are estimated at 350 days, 250 days, and 10 days, respectively. (The Economic Analysis for the LT2ESWTR (USEPA 2003a) provides details on all parameters listed here, as well as morbidity, mortality, and other risk factors.)

Using an estimate of 1.2 L/day consumption and a mean probability of infection of 0.09 for one oocyst ingested, the daily risk of infection (DR) is as follows:

DR = (oocysts/L in source water) × (fraction remaining after treatment) × (1.2 L/day) × (0.09).

The annual risk (AR) of infection for a CWS is

AR = 1 − (1 − DR)^350

where 350 represents days of exposure in a CWS.
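These formulas can be checked numerically. The sketch below (function names are ours) reproduces the Table IV-5 values to within rounding:

```python
def daily_risk(oocysts_per_L: float, log_removal: float,
               consumption_L: float = 1.2, p_infect: float = 0.09) -> float:
    """DR: daily probability of infection for one consumer."""
    fraction_remaining = 10.0 ** (-log_removal)
    return oocysts_per_L * fraction_remaining * consumption_L * p_infect

def annual_risk(dr: float, exposure_days: int = 350) -> float:
    """AR = 1 - (1 - DR)^350 for a community water system."""
    return 1.0 - (1.0 - dr) ** exposure_days

# 0.01 oocysts/L with 3 log removal -> roughly 3.7E-04, as in Table IV-5
ar = annual_risk(daily_risk(0.01, 3.0))
```

Similarly, 1 oocyst/L with only 2 log removal yields an annual infection risk near 0.31, the highest-risk combinations shown in Table IV-5.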

Table IV-5 presents estimates of the mean annual risk of infection by Cryptosporidium in CWSs for selected source water infectious oocyst concentrations and filtration plant removal efficiencies.

Table IV-5.—Annual Risk of Cryptosporidium Infection in CWSs That Filter, as a Function of Source Water Infectious Oocyst Concentration and Treatment Efficiency

Source water concentration       Mean annual risk of infection for different levels of treatment efficiency (log removal) 1
(infectious oocysts per liter)   2 log      3 log      4 log      5 log
0.0001                           3.8E-05    3.8E-06    3.8E-07    3.8E-08
0.001                            3.7E-04    3.8E-05    3.8E-06    3.8E-07
0.01                             3.7E-03    3.7E-04    3.8E-05    3.8E-06
0.1                              3.7E-02    3.7E-03    3.7E-04    3.8E-05
1                                0.31       3.7E-02    3.7E-03    3.7E-04
10                               0.89       0.31       3.7E-02    3.7E-03

1 Scientific notation (E−x) designates 10^−x.

For example, Table IV-5 shows that if a filtration plant had a mean concentration of infectious Cryptosporidium in the source water of 0.01 oocysts/L, and the filtration plant averaged 3 log removal, the mean annual risk of infection by Cryptosporidium is estimated as 3.7E-04 (3.7 infections per 10,000 consumers).

ii. What degree of additional treatment should be required for a given source water Cryptosporidium level? In order to develop targeted treatment requirements for the LT2ESWTR, it was necessary to identify a source water Cryptosporidium level above which additional treatment by filtered systems would be required. Based on the type of risk information shown in Table IV-5, EPA and Advisory Committee deliberations focused on mean source water Cryptosporidium concentrations in the range of 0.01 to 0.1 oocysts/L as appropriate threshold values for prescribing additional treatment.

Analytical method and sampling constraints were a significant factor in setting the specific Cryptosporidium level that triggers additional treatment by filtered systems. The number of samples that systems can be required to analyze for Cryptosporidium is limited. Consequently, if the bin threshold concentration for additional treatment was set near 0.01 oocysts/L, systems could exceed this level due to a very low number of oocysts being detected. For example, if systems took monthly 10 L samples and bin classification was based on a maximum running annual average, then a system would exceed a mean concentration of 0.01 oocysts/L by counting only 2 oocysts in 12 samples. Given the variability associated with Cryptosporidium analytical methods, the Advisory Committee did not support requiring additional treatment for filtered systems based on so few counts.
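The arithmetic behind this example is straightforward to verify:

```python
# Two oocysts counted across twelve monthly 10 L samples already exceed a
# 0.01 oocysts/L threshold (counts are not adjusted for method recovery).
samples = 12
volume_per_sample_L = 10.0
observed_mean = 2 / (samples * volume_per_sample_L)  # ~0.0167 oocysts/L
exceeds = observed_mean > 0.01                       # True
```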

Another concern related to analytical method limitations was systems being misclassified in a lower bin. For example, if a system had a true mean concentration at or just above 0.1 oocysts/L, the mean that the system would determine through monitoring might be less than 0.1 oocyst/L. Thus, if the bin threshold for additional treatment was set at 0.1 oocysts/L, a number of systems with true mean concentrations above this level would be misclassified in the lower bin with no additional treatment required. This type of error, described in more detail in the next section, is a function of the number of samples collected and variability in method performance.

In consideration of the available information on Cryptosporidium risk, as well as the performance and feasibility of analytical methods, EPA is proposing that the source water threshold concentration for requiring additional Cryptosporidium treatment by filtered systems be established at a mean level of 0.075 oocysts/L. This is the level recommended by the Advisory Committee, and it affords a high likelihood that systems with true mean Cryptosporidium concentrations of 0.1 oocysts/L or higher will provide additional treatment under the rule.

Beyond identifying this first threshold, it was also necessary to determine Cryptosporidium concentrations that would demarcate higher risk bins. With respect to the concentration range that each bin should span, EPA and the Advisory Committee dealt with two opposing factors: bin misclassification and equitable risk reduction.

As described in the next section, a monthly monitoring program involving EPA Methods 1622 or 1623 can characterize a system's mean Cryptosporidium concentration within a 0.5 log (factor of 3.2) margin with a high degree of accuracy. However, the closer a system's true mean concentration is to a bin boundary, the greater the likelihood that the system will be misclassified into the wrong bin due to limitations in sampling and analysis. Accordingly, by establishing bins that cover a wide concentration range, the likelihood of system misclassification is reduced.

However, a converse factor relates to equitable protection from risk. Because identical treatment requirements will apply to all systems in the same bin, systems at the higher concentration end of a bin will achieve less risk reduction relative to their source water pathogen levels than systems at the lower concentration end of a bin. Thus, bins with a narrow concentration range provide a more uniform level of public health protection.

In balancing these factors and to account for the wide range of possible source water concentrations among different systems as indicated by Information Collection Rule and ICRSS data, the Advisory Committee recommended and EPA is proposing a second bin threshold at a mean level of 1.0 oocysts/L and a third bin threshold at a mean level of 3.0 oocysts/L. Information Collection Rule and ICRSS data indicate that few, if any, systems would measure mean Cryptosporidium concentrations greater than 3.0 oocysts/L, so there was not a need to establish a bin threshold above this value. Thus, the LT2ESWTR proposal includes the following four bins for classifying filtered systems: Bin 1: <0.075 oocysts/L; Bin 2: ≥0.075 to <1.0 oocysts/L; Bin 3: ≥1.0 to <3.0 oocysts/L; and Bin 4: ≥3.0 oocysts/L.

With respect to additional Cryptosporidium treatment for systems in Bins 2-4, values were considered ranging from 0.5 to 2.5 log and greater. As recommended by the Advisory Committee, EPA is proposing 1.0 log additional treatment for conventional plants in Bin 2. This level of treatment will ensure that systems classified in Bin 2 will achieve treated water Cryptosporidium levels comparable to systems in Bin 1, the lowest risk bin. In contrast, if systems in Bin 2 provided only 0.5 log additional treatment then those systems with mean source water concentrations in the upper part of Bin 2 would have higher levels of Cryptosporidium in their finished water than systems in Bin 1.

In consideration of the much greater potential vulnerability of systems in the highest risk bins, the Advisory Committee recommended additional treatment requirements of 2.0 log and 2.5 log for conventional plants in Bins 3 and 4, respectively. The Agency concurs with these recommendations and has incorporated them in today's proposal.

An important aspect of the proposed additional treatment requirements is that they are based, in part, on the current level of treatment provided by filtration plants. As noted earlier, the Advisory Committee assumed when developing its recommendations that conventional treatment plants in compliance with the IESWTR achieve an average of 3 log removal of Cryptosporidium. EPA has determined that available data, discussed in section III.D, support this assumption and has proposed a 3 log Cryptosporidium treatment credit for conventional plants under the LT2ESWTR. Thus, the additional treatment requirements for conventional plants in Bins 2, 3, and 4 translate to total requirements of 4.0, 5.0, and 5.5 log, respectively.

The Advisory Committee did not address additional treatment requirements for plants with treatment trains other than conventional, but recommended that EPA address such plants in the proposed LT2ESWTR and take comment. Based on treatment studies summarized in section III.D, EPA has concluded that plants with slow sand or DE filtration are able to achieve 3 log or greater removal of Cryptosporidium when in compliance with the IESWTR or LT1ESWTR. Because these plants can achieve comparable levels of performance to conventional treatment plants, EPA is proposing that slow sand and DE filtration plants also apply 1 to 2.5 log of additional treatment when classified in Bins 2-4.

Direct filtration differs from conventional treatment in that it does not include sedimentation or an equivalent clarification process prior to filtration. As described in section III.D, EPA has concluded that a sedimentation process can consistently achieve 0.5 log or greater removal of Cryptosporidium. The Agency is proposing that direct filtration plants in compliance with the IESWTR or LT1ESWTR receive a 2.5 log Cryptosporidium removal credit towards LT2ESWTR requirements. Accordingly, proposed additional treatment requirements for direct filtration plants in bins 2, 3, and 4 are 1.5 log, 2.5 log, and 3 log, respectively.

Section IV.C of this notice describes proposed criteria for determining Cryptosporidium treatment credits for other filtration technologies like membranes, bag filters, and cartridge filters. Due to the proprietary and product specific nature of these filtration devices, EPA is not able to propose a generally applicable credit for them. Rather, the criteria in section IV.C focus on challenge testing to establish treatment credit. Systems using these technologies that are classified in Bins 2-4 must work with their States to assess appropriate credit for their existing treatment trains. This will determine the level of additional treatment necessary to achieve the total treatment requirements for their assigned bins. EPA has developed guidance on challenge testing of bag and cartridge filters and membranes, which is available in draft form in the docket (http://www.epa.gov/edocket/).

In order to give systems flexibility in choosing strategies to meet additional Cryptosporidium treatment requirements, the Advisory Committee identified a number of management and treatment options, collectively called the microbial toolbox. The toolbox, which is described in section IV.C, contains components relating to watershed control, intake management, pretreatment, additional filtration processes, inactivation, and demonstrations of enhanced performance.

As recommended by the Advisory Committee, EPA is proposing that systems in Bin 2 can meet additional Cryptosporidium treatment requirements under the LT2ESWTR using any component or combination of components from the microbial toolbox. However, systems in Bins 3 and 4 must achieve at least 1 log of the additional treatment requirement using inactivation (UV, ozone, chlorine dioxide), membranes, bag filters, cartridge filters, or bank filtration. These specific control measures are proposed due to their ability to serve as significant additional treatment barriers for systems with high levels of pathogens.

c. Basis for source water monitoring requirements. The goal of monitoring under the LT2ESWTR is to correctly classify filtration plants into the four LT2ESWTR risk bins. The proposed sampling frequency, time frame, and averaging procedure for bin classification are intended to ensure that systems are accurately assigned to appropriate risk bins while limiting the burden of monitoring costs. The basis for the proposed monitoring requirements for large and small systems is presented in the following discussion.

i. Systems serving at least 10,000 people.

Sample Number and Frequency

Systems serving at least 10,000 people have two options for sampling under the LT2ESWTR: (1) They can collect 24 monthly samples over a 2 year period and calculate their bin classification using the highest 12 month running annual average, or (2) They can collect 2 or more samples per month over the 2 year period and use the mean of all samples for bin classification.
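Option (1) can be sketched as a small classification helper. The bin boundaries below follow the proposal's thresholds; the function itself is illustrative, not regulatory text:

```python
def classify_bin(monthly_results):
    """Assign an LT2ESWTR bin from monthly Cryptosporidium results
    (oocysts/L) using the highest 12-month running annual average.
    Boundaries: <0.075, >=0.075 to <1.0, >=1.0 to <3.0, >=3.0 oocysts/L."""
    if len(monthly_results) < 12:
        raise ValueError("need at least 12 monthly samples")
    max_raa = max(sum(monthly_results[i:i + 12]) / 12.0
                  for i in range(len(monthly_results) - 11))
    if max_raa < 0.075:
        return 1
    if max_raa < 1.0:
        return 2
    if max_raa < 3.0:
        return 3
    return 4
```

For instance, 24 results whose worst 12-month window averages 0.05 oocysts/L yield Bin 1, while a worst window averaging 0.5 oocysts/L yields Bin 2.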

These proposed requirements reflect recommendations by the Advisory Committee and are based on analyses of misclassification rates associated with different monitoring programs that were considered. EPA is concerned about systems with high concentrations of Cryptosporidium being misclassified in lower bins as well as systems with low concentrations being misclassified in higher bins. The first type of error could lead to systems not providing an adequate level of treatment while the second type of error could lead to systems incurring additional costs for unnecessary treatment.

A primary way that EPA analyzed misclassification rates was by considering the likelihood that a system with a true mean Cryptosporidium concentration that is a factor of 3.2 (0.5 log) above or below a bin boundary would be assigned to the wrong bin.

Probabilities were assessed for two cases:

  • False negative: a system with a mean concentration of 0.24 oocysts/L (i.e., factor of 3.2 above the Bin 1 boundary of 0.075 oocysts/L) is misclassified low in Bin 1.
  • False positive: a system with a mean concentration of 0.024 oocysts/L (i.e., factor of 3.2 below the Bin 1 boundary of 0.075 oocysts/L) is misclassified high in Bin 2.

Table IV-6 provides false negative and false positive rates as defined previously for different approaches to monitoring and bin classification that were evaluated. Results are shown for the following approaches:

  • 48 samples with bin assignment based on arithmetic mean (i.e., average of all samples).
  • 24 samples with bin assignment based on highest 12 sample average, equivalent to the maximum running annual average (Max-RAA).
  • 24 samples with bin assignment based on arithmetic mean.
  • 12 samples with bin assignment based on the second highest sample result.
  • 8 samples with bin assignment based on the maximum sample result.

These estimated misclassification rates were generated with a Monte Carlo analysis that accounted for the volume assayed, variation in source water Cryptosporidium occurrence, and variable method recovery. See Economic Analysis for the LT2ESWTR (USEPA 2003a) for details.
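A heavily simplified version of such an analysis can be sketched as follows. The assumptions here are ours (Poisson counting statistics, beta-distributed recovery averaging 40%, and a constant true source concentration), so the rates produced will not match Table IV-6, which also modeled occurrence variability; the sketch only illustrates the mechanics:

```python
import math
import random

def _poisson(lam: float) -> int:
    # Knuth's method; adequate for the small means involved here
    threshold = math.exp(-lam)
    k, p = 0, 1.0
    while True:
        p *= random.random()
        if p <= threshold:
            return k
        k += 1

def misclassification_rate(true_conc: float, boundary: float = 0.075,
                           n_trials: int = 5000, n_samples: int = 24,
                           volume_L: float = 10.0) -> float:
    """Fraction of simulated plants assigned to the wrong side of the
    Bin 1 boundary under the 24-sample Max-RAA strategy."""
    wrong = 0
    truly_above = true_conc >= boundary
    for _ in range(n_trials):
        conc = []
        for _ in range(n_samples):
            recovery = random.betavariate(2, 3)  # mean 0.4
            count = _poisson(true_conc * volume_L * recovery)
            conc.append(count / volume_L)
        max_raa = max(sum(conc[i:i + 12]) / 12.0
                      for i in range(n_samples - 11))
        if (max_raa >= boundary) != truly_above:
            wrong += 1
    return wrong / n_trials

false_negative = misclassification_rate(0.24)    # 0.5 log above boundary
false_positive = misclassification_rate(0.024)   # 0.5 log below boundary
```

Even in this simplified form, the simulation shows the asymmetry the Advisory Committee weighed: with so few oocysts expected per 10 L sample, sampling error, not treatment performance, drives which side of a bin boundary a plant lands on.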

Table IV-6.—False Positive and False Negative Rates for Monitoring and Binning Strategies Considered for the LT2ESWTR
[In percentages]

Strategy                     False positive 1   False negative 2
48 sample arithmetic mean    1.7                1.4
24 sample Max-RAA            5.3                1.7
24 sample arithmetic mean    2.8                6.2
12 sample second highest     47                 1.1
8 sample maximum             66                 1.0

1 False positive rates calculated for systems with Cryptosporidium concentrations 0.5 log below the Bin 1 boundary of 0.075 oocysts/L.
2 False negative rates calculated for systems with Cryptosporidium concentrations 0.5 log above the Bin 1 boundary of 0.075 oocysts/L.

The first two of these approaches, the 48 sample arithmetic mean and 24 sample Max-RAA, were recommended by the Advisory Committee and are proposed for bin classification under the LT2ESWTR because they have low false positive and false negative rates. As shown in Table IV-6, these strategies have false negative rates of 1 to 2%, meaning there is a 98 to 99% likelihood that a plant with an oocyst concentration 0.5 log above the Bin 1 boundary would be correctly assigned to Bin 2. The false positive rate is near 2% for the 48 sample arithmetic mean and 5% for the 24 sample Max-RAA. These rates indicate that a plant with an oocyst concentration 0.5 log below the Bin 1 boundary would have a 95 to 98% probability of being correctly assigned to Bin 1. Bin misclassification rates across a wide range of concentrations are shown in Economic Analysis for the LT2ESWTR (USEPA 2003a).

The 24 sample arithmetic mean had a slightly lower false positive rate than the 24 sample Max-RAA (2.8% vs. 5.3%) but the false negative rate of the arithmetic mean was almost 4 times higher. Consequently, a plant with a mean Cryptosporidium level above the Bin 1 boundary would be much more likely to be misclassified in Bin 1 using a 24 sample arithmetic mean than with a 24 sample Max-RAA. In order to increase the probability that systems with mean Cryptosporidium concentrations above 0.075 oocysts/L will provide additional treatment, EPA is proposing that if only 24 samples are taken, the maximum 12 month running annual average must be used to determine bin assignment.

Monitoring strategies involving only 12 and 8 samples were evaluated to determine if lower frequency monitoring could provide satisfactory bin classification. The results of this analysis indicate that these lower sample numbers are not adequate and could unfairly impose excessive treatment requirements. For example, results in Table IV-6 show that if plants were classified in bins based on the second highest of 12 samples or the highest of 8 samples then low false negative rates could be achieved. A system with a mean Cryptosporidium level 0.5 log above the Bin 1 boundary would have a 99% chance of being appropriately classified in a bin requiring additional treatment under either strategy. However, the false positive rates associated with these low sample numbers are very high. A system with a mean oocyst concentration 0.5 log below the Bin 1 boundary would have a 47% probability of being incorrectly classified in Bin 2 using the second highest result among 12 samples, or a 66% likelihood of being misclassified in Bin 2 using the maximum result among 8 samples. Due to these high false positive rates, these strategies are not proposed.

EPA also evaluated lower frequency monitoring strategies that had lower false positive rates, such as bin classification based on the mean of 12 samples, the third highest result of 12 samples, and the second highest of 8 samples. Each of these strategies, though, had an unacceptably high false negative rate, meaning that many systems with mean oocyst concentrations greater than the Bin 1 boundary would be misclassified low in Bin 1. Consequently, these strategies are inconsistent with the public health goal of the LT2ESWTR for systems with mean levels above 0.075 oocysts/L to provide additional treatment.

Increasing the number of samples used to compute the maximum running annual average above 24 also increased the number of annual averages computed, so it did not reduce the likelihood of false positives. Raising the number of samples used to compute an arithmetic mean above 48 did reduce bin misclassification rates, but the rates were already very small (1 to 2% for plants with levels 0.5 log above or below bin boundaries). For sources with Cryptosporidium concentrations very near or at bin boundaries, increasing the number of samples did not markedly improve the error rates, which remained near 50% at the bin boundaries.

In summary, EPA believes that the proposed sampling designs perform well for the purpose of classifying plants in LT2ESWTR risk bins and, thereby, achieving the public health protection intended for the rule. More costly designs, involving more frequent sampling and analysis, provide only marginally improved performance. Less frequent sampling, though lower in cost, creates unacceptably high misclassification rates and would not provide for the targeted risk reduction goals of the rule.

No Adjustments for Method Recovery or Percent of Oocysts That Are Infectious

Two considerations in using Cryptosporidium monitoring data to project risk are (1) Fewer than 100% of oocysts in a sample are recovered and counted by the analyst and (2) not all the oocysts measured with Methods 1622/23 are viable and capable of causing infection. These two factors are offsetting in sign, in that oocyst counts not adjusted for recovery tend to underestimate the true concentration, while the total oocyst count may overestimate the infectious concentration that presents a health risk. Based on information described in this section, EPA is proposing that Cryptosporidium monitoring results be used directly to assign systems to LT2ESWTR risk bins and not be adjusted for either factor.

As described in section III.C, ICRSS matrix spike data indicate that average recovery of Cryptosporidium oocysts with Methods 1622/23 in a national monitoring program will be about 40%. There is no similar direct measure of the fraction of environmental oocysts that are infectious, but information related to this value can be derived from two sources: (1) A study where samples were analyzed with both Method 1623 and a cell culture-polymerase chain reaction (CC-PCR) test for oocyst infectivity, and (2) the structure of oocysts counted with Methods 1622 and 1623.

LeChevallier et al. (2003) conducted a study in which six natural waters were frequently tested for Cryptosporidium using both Method 1623 and a CC-PCR method to test for infectivity. Cryptosporidium oocysts were detected in 60 of 593 samples (10.1%) by Method 1623 and infectious oocysts were detected in 22 of 560 samples (3.9%) by the CC-PCR procedure. Recovery efficiencies for the two methods were similar. According to the authors, these results suggest that approximately 37% (22/60) of the Cryptosporidium oocysts detected by Method 1623 were viable and infectious.

In regard to oocyst structure, Cryptosporidium oocysts counted with Methods 1622/23 are characterized in one of three ways: (1) Internal structures, (2) amorphous structures, or (3) empty. Oocysts with internal structures are considered to have the highest likelihood of being infectious, while empty oocysts are believed to be non-viable (LeChevallier et al. 1997). During the ICRSS, 37% of the oocysts counted were characterized as having internal structures, 47% had amorphous structures, and 16% were empty. If empty oocysts are assumed to be non-infectious, then anywhere from 0 to 84% of the counted oocysts (those with internal or amorphous structures) could have been infectious; the mid-point of this range is 42%.

After considering this type of information, the Advisory Committee recommended that monitoring results not be adjusted upward for percent recovery, nor adjusted downward to account for the fraction of oocysts that are not infectious. While it is not possible to establish a precise value for either factor in individual samples, the data suggest that they may be of similar magnitude. EPA concurs with this recommendation and is proposing that systems be classified in bins under the LT2ESWTR using the total Cryptosporidium oocyst count, uncorrected for recovery, as measured using EPA Method 1622/23. The proposed LT2ESWTR risk bins are constructed to reflect this approach.

Data Collection To Support Use of a Microbial Indicator by Small Systems

As described in the next section, small systems will monitor for an indicator, currently proposed to be E. coli, to determine if they are required to sample for Cryptosporidium. The proposed E. coli levels that will trigger Cryptosporidium monitoring are based on Information Collection Rule and ICRSS data. However, to provide for a more extensive evaluation of Cryptosporidium indicator criteria, EPA is proposing that large systems measure E. coli and turbidity in their source water when they sample for Cryptosporidium. This was recommended by the Advisory Committee and will allow for possible development of alternative indicator levels or parameters (e.g., turbidity in combination with E. coli) to serve as triggers for small system Cryptosporidium monitoring.

Time Frame for Monitoring

In recommending a time frame for LT2ESWTR monitoring, the Agency considered the trade-off between monitoring over a long period to better capture year-to-year fluctuations, and the desire to prescribe additional treatment quickly to systems identified as having high source water pathogen levels. Reflecting Advisory Committee recommendations, EPA is proposing that large systems evaluate their source water Cryptosporidium levels using 2 years of monitoring. This will account for some degree of yearly variability, without significantly delaying additional public health protection where needed.

ii. Systems serving fewer than 10,000 people.

Indicator Monitoring

In recognition of the relatively high cost of analyzing samples for Cryptosporidium, EPA and the Advisory Committee explored the use of indicator criteria to identify drinking water sources that may have high levels of Cryptosporidium occurrence. The goal was to find one or more parameters that could be analyzed at low cost and identify those systems likely to exceed the Bin 1 boundary of 0.075 oocysts/L. Data from the Information Collection Rule and ICRSS were evaluated for possible indicator parameters, including fecal coliforms, total coliforms, E. coli, viruses (Information Collection Rule only), and turbidity. Based on available data, E. coli was found to provide the best performance as a Cryptosporidium indicator, and the inclusion of other parameters like turbidity was not found to improve accuracy.

The next part of this section presents data that support E. coli mean concentrations of 10/100 mL and 50/100 mL as proposed screening levels that will trigger Cryptosporidium monitoring in reservoir/lake and flowing stream systems, respectively. It describes how E. coli and Cryptosporidium data from the Information Collection Rule and ICRSS were analyzed and shows the performance of different concentrations of E. coli as an indicator for systems that will exceed the Bin 1 boundary of 0.075 oocysts/L.

Information Collection Rule data were evaluated as maximum running annual averages (Information Collection Rule samples were collected once per month for 18 months) while ICRSS data were evaluated using an annual mean (ICRSS samples were collected twice per month for 12 months). In addition, as indicators were being evaluated it became apparent that it was necessary to analyze plants separately based on source water type, due to a significantly different relationship between E. coli and Cryptosporidium in reservoir/lake systems compared to flowing stream systems.

Analyzing the performance of an E. coli level as a screen to trigger Cryptosporidium monitoring under the proposed LT2ESWTR involved evaluating each water treatment plant in the data set relative to two factors: (1) Did the plant E. coli level exceed the trigger value being assessed? and (2) Did the plant mean Cryptosporidium concentration exceed 0.075 oocysts/L? Accordingly, plants were sorted into four categories, based on Cryptosporidium and E. coli concentrations:

  • Plants with Cryptosporidium < 0.075 oocysts/L that did not exceed the E. coli trigger level (Figure IV-1, box A)
  • Plants with Cryptosporidium < 0.075 oocysts/L that exceeded the E. coli trigger level (Figure IV-1, box B)
  • Plants with Cryptosporidium ≥ 0.075 oocysts/L that did not exceed the E. coli trigger level (Figure IV-1, box C)
  • Plants with Cryptosporidium ≥ 0.075 oocysts/L that exceeded the E. coli trigger level (Figure IV-1, box D)

Summary data with E. coli trigger concentrations ranging from 5 to 100 per 100 mL are presented for Information Collection Rule and ICRSS data in Figures IV-2 and IV-3.

The performance of each E. coli level as a trigger for Cryptosporidium monitoring was evaluated based on false negative and false positive rates. False negatives occur when plants do not exceed the E. coli trigger value, but exceed a Cryptosporidium level of 0.075 oocysts/L. False positives occur when plants exceed the E. coli trigger value but do not exceed a Cryptosporidium level of 0.075 oocysts/L. The false negative rate is critical because it characterizes the ability of the indicator to identify those plants with high Cryptosporidium levels. In general, low false negative rates can be achieved by lowering the E. coli trigger concentration. However, when the E. coli trigger concentration is decreased, more plants with low Cryptosporidium levels in their source water exceed it. As a result, more plants incur false positives. Consequently, identifying an appropriate E. coli concentration to trigger Cryptosporidium monitoring involves balancing false negatives and false positives to minimize both.
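The four-box classification and the resulting error rates can be sketched as follows. The choice of denominators here is an assumption consistent with how the rates are reported for Figures IV-2 and IV-3: false negatives are computed among plants at or above the Cryptosporidium boundary, false positives among plants below it.

```python
def trigger_performance(plants, trigger, crypto_boundary=0.075):
    """Sort plants into the four boxes of Figure IV-1 and compute false
    negative and false positive rates for an E. coli trigger level.

    plants: list of (mean_ecoli_per_100mL, mean_crypto_oocysts_per_L) pairs.
    """
    boxes = {"A": 0, "B": 0, "C": 0, "D": 0}
    for ecoli, crypto in plants:
        high_crypto = crypto >= crypto_boundary
        over_trigger = ecoli > trigger
        key = {(False, False): "A", (False, True): "B",
               (True, False): "C", (True, True): "D"}[(high_crypto, over_trigger)]
        boxes[key] += 1
    n_high = boxes["C"] + boxes["D"]   # plants at/over the Crypto boundary
    n_low = boxes["A"] + boxes["B"]    # plants under the Crypto boundary
    fn_rate = boxes["C"] / n_high if n_high else 0.0  # high plants missed
    fp_rate = boxes["B"] / n_low if n_low else 0.0    # low plants triggered
    return boxes, fn_rate, fp_rate
```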

Results of the indicator analysis for plants with flowing stream sources are shown in Figure IV-2. An E. coli trigger concentration of 50/100 mL produced zero false negatives for both data sets. This means that in these data sets, all plants that exceeded mean Cryptosporidium concentrations of 0.075 oocysts/L also exceeded the E. coli trigger concentration and would, therefore, be required to monitor. However, this trigger concentration had a significant false positive rate (i.e., it was not highly specific in targeting only those plants with high Cryptosporidium levels). False positive rates were 57% (24/42) and 53% (9/17) with Information Collection Rule and ICRSS data, respectively. At a higher E. coli trigger concentration, such as 100/100 mL, the false negative rate increased to 12.5% (3/24) with Information Collection Rule data and 50% (2/4) with ICRSS data, while the false positive rate decreased to 43% (18/42) and 35% (6/17), respectively. Consequently, EPA is proposing a mean E. coli concentration of 50/100 mL as a trigger for Cryptosporidium monitoring by small systems with flowing stream sources.

Results of the indicator analysis for plants with reservoir/lake sources are shown in Figure IV-3. An E. coli trigger of 10/100 mL resulted in a false negative rate of 20% (2/10) with Information Collection Rule data and 67% (2/3) with ICRSS data (i.e., 2 of the 3 ICRSS plants exceeding 0.075 oocysts/L were missed). Lowering the E. coli trigger to 5/100 mL decreased the false negative rate in both the Information Collection Rule and ICRSS data sets by one plant, but increased the false positive rate from 20% to 43% (13/30) in the ICRSS data and from 24% to 39% (44/114) in the Information Collection Rule data. Based on these results, EPA is proposing that a mean E. coli concentration of 10/100 mL trigger small systems using lake/reservoir sources into monitoring for Cryptosporidium. While the false negative rate associated with this trigger value in the ICRSS data set is high, that data set contains only 3 reservoir/lake plants that exceeded a Cryptosporidium level of 0.075 oocysts/L.

Due to limitations in the available data, and because Cryptosporidium monitoring is less of an economic burden for large systems, the Advisory Committee did not recommend that large systems use the E. coli indicator screen. Rather, the Advisory Committee recommended that large systems sample for E. coli and turbidity when they monitor for Cryptosporidium under the LT2ESWTR. These data will then be used to verify or, if necessary, further refine the proposed indicator trigger values for small systems. EPA concurs with these recommendations and they are reflected in today's proposal.

The proposed monitoring schedule under the LT2ESWTR is set up to allow EPA and stakeholders to evaluate large system monitoring data for indicator relationships prior to the start of small system E. coli monitoring. After one year of large system monitoring is completed, EPA will begin analyzing monitoring data to assess whether alternative indicator strategies would be appropriate. Depending on the findings of this analysis, EPA may issue guidance to States on approving alternative indicator trigger strategies for small systems. Therefore, the proposed rule is written with the allowance for States to approve alternative indicator strategies.

[Figures IV-2 and IV-3, presenting the indicator analysis results for flowing stream and reservoir/lake plants, are not available in this format; view images of the printed pages.]

Cryptosporidium Monitoring

Small systems that exceed the E. coli trigger must conduct Cryptosporidium monitoring, beginning 6 months after completion of E. coli monitoring. As recommended by the Advisory Committee, EPA is proposing that small systems collect 24 Cryptosporidium samples over a period of one year. This number of samples is the same as required for large systems, but the monitoring burden is targeted only at those plants for which E. coli monitoring indicates elevated levels of fecal matter in the source water. By completing Cryptosporidium monitoring in one year, small systems will conduct a total of 2 years of monitoring to determine LT2ESWTR bin classification (including the one year of E. coli monitoring). This time frame is equivalent to the requirement for large systems, which monitor for Cryptosporidium, E. coli, and turbidity for 2 years.

The Stage 2 M-DBP Agreement in Principle recommended that EPA explore the feasibility of alternative, lower frequency, Cryptosporidium monitoring criteria for providing a conservative mean estimate in small systems. As described earlier, EPA has evaluated smaller sample sizes, such as systems taking 12 or 8 samples instead of 24 (see Table IV-6). However, EPA has concluded that these smaller sample sizes result in unacceptably high misclassification rates. For example, bin classification based on the second highest of 12 samples produces an estimated false positive rate of 47% for systems with a mean Cryptosporidium concentration 0.5 log below the Bin 1 boundary of 0.075 oocysts/L. In comparison, bin classification based on the mean of 24 samples achieves a false positive rate of 2.8% for systems at this Cryptosporidium concentration. Consequently, EPA is proposing no alternatives to the requirement that small systems take at least 24 samples.

Small system bin classification will be determined by the arithmetic mean of the 24 samples collected over one year. Because the bin structure in the LT2ESWTR is based on annual mean Cryptosporidium levels, it is necessary that bin classification involve averaging samples over at least one year. Consequently, small systems will determine their bin classification by averaging results from all Cryptosporidium samples collected during their one year of monitoring.
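The small system calculation above can be sketched minimally. This assumes, for illustration, that only the Bin 1 boundary of 0.075 oocysts/L is of interest; the full bin structure assigns higher bins at higher mean concentrations:

```python
def small_system_bin_screen(sample_results, bin1_boundary=0.075):
    """Arithmetic mean of one year of Cryptosporidium results (at least 24
    samples) and whether it exceeds the Bin 1 boundary, which would place
    the system in a bin requiring additional treatment."""
    if len(sample_results) < 24:
        raise ValueError("small systems must collect at least 24 samples")
    mean = sum(sample_results) / len(sample_results)
    return mean, mean > bin1_boundary
```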

iii. Future monitoring and reassessment. EPA is proposing that beginning 6 years after the initial bin classification, large and small systems conduct another round of monitoring to determine if source water conditions have changed to a degree that may warrant a revised bin classification. The Advisory Committee recommended that EPA convene a stakeholder process within 4 years after the initial bin classification to develop recommendations on how best to proceed with implementing this second round of monitoring. Unless EPA modifies the LT2ESWTR to allow for an improved analytical method or a revised bin structure based on new risk information, the second round of monitoring will be conducted under the same requirements that apply to the initial round of monitoring.

In addition, EPA is proposing to use the required assessment of the water source during sanitary surveys as an ongoing measure of whether significant changes in watersheds have occurred that may lead to increased contamination. Where the potential for increased contamination is identified, States must determine what follow-up actions by the system are necessary, including the possibility of the system providing additional treatment from the microbial toolbox.

d. Basis for accepting previously collected data. Members of the Advisory Committee had multiple objectives in recommending that EPA allow the use of previously collected (grandfathered) Cryptosporidium data. These include (1) giving credit for data collected by proactive utilities, (2) facilitating early determination of LT2ESWTR compliance needs and, thereby, allowing for early planning of appropriate treatment selection, (3) increasing laboratory capacity to meet demand for Cryptosporidium analysis under the LT2ESWTR, and (4) allowing utilities to improve their data set for bin determination by considering more than 2 years of data (i.e., include data collected prior to effective date of LT2ESWTR). The latter objective incorporates the assumption that occurrence can vary from year to year, so that if more years of data are used in the bin determination, the source water concentration estimate will be a more accurate representation of the overall mean.

A significant issue with accepting previously collected data for making bin determinations is ensuring that the data are of equivalent quality to data that will be collected following LT2ESWTR promulgation. As noted previously, EPA is establishing requirements so that data collected under the LT2ESWTR will be similar in quality to data that were generated under the ICRSS. These requirements include the use of approved analytical methods and compliance with method quality control (QC) criteria, use of approved laboratories, minimum sample volume, and a sampling schedule with minimum frequency. For example, under the ICRSS, laboratories analyzed 10 L samples and (considered collectively) achieved a mean Cryptosporidium recovery of approximately 43% in spiked source water with a relative standard deviation (RSD) of 50%. EPA anticipates that laboratories conducting Cryptosporidium analysis for the LT2ESWTR will collectively achieve similar analytical method performance. Consequently, EPA expects previously collected data sets used under the LT2ESWTR to meet these standards and has established criteria for accepting previously collected data accordingly (see section IV.A.1.d).

Systems are requested, but not required, to notify EPA prior to promulgation of the LT2ESWTR of their intent to submit previously collected data. This will help EPA allocate the resources that will be needed to evaluate these data in order to make a decision on adequacy for bin determination. Systems that have at least 2 years of previously collected data to grandfather when the LT2ESWTR is promulgated and do not intend to conduct new monitoring under the rule are required to submit the previously collected data to EPA within 2 months following promulgation. This will enable EPA to evaluate the data and report back to the utility in sufficient time to allow, if needed, the utility to contract with a laboratory to conduct monitoring under the LT2ESWTR.

Systems that have fewer than 2 years of previously collected data to grandfather when the LT2ESWTR is promulgated, or that intend to grandfather 2 or more years of previously collected data and also conduct new monitoring under the rule, are required to submit the previously collected data to EPA within 8 months following promulgation. This will allow these utilities to continue to collect previously collected data in the 6 month period between promulgation and the date when monitoring under the LT2ESWTR must begin, plus a 2 month period for systems to compile the data and supporting documentation. Utilities may submit the data earlier than 8 months after promulgation if they acquire 2 years of previously collected data before this date.

Submitted grandfathered data sets must include all routine source water monitoring results for samples collected during the time period covered by the grandfathered data set (i.e., the time period between collection of the first and last samples in the data set). However, systems are not required under the LT2ESWTR to submit previously collected data for samples outside of this time period.

3. Request for Comment

EPA requests comments on all aspects of the monitoring and treatment requirements proposed in this section. In addition, EPA requests comment on the following issues:

Requirements for Systems That Use Surface Water for Only Part of the Year

Bin classification for the LT2ESWTR is based on the mean annual source water Cryptosporidium level. Consequently, today's proposal requires E. coli and Cryptosporidium monitoring to be conducted over the full year. However, EPA recognizes that some systems use surface water for only part of the year, either to supplement ground water sources during certain seasons (e.g., the summer) or because, like campgrounds, they operate for only part of the year. Year round monitoring may present both logistical and economic difficulties for these systems. EPA is requesting comment on how to apply LT2ESWTR monitoring requirements to surface water systems that operate or use surface water for only part of the year. Possible approaches that may be considered for comment include the following:

Small public water systems that operate or use surface water for only part of the year could be required to collect E. coli samples at least bi-weekly during the period when they use surface water. If the mean E. coli concentration did not exceed the trigger level (e.g., 10/100 mL for reservoirs/lakes or 50/100 mL for flowing streams), systems could apply to the State to waive any additional E. coli monitoring. The State could grant the waiver, require additional E. coli monitoring, or require monitoring of an alternate indicator. If the mean E. coli concentration exceeded the trigger level, the State could require the system to provide additional treatment for Cryptosporidium consistent with Bin 4 requirements, or require monitoring of Cryptosporidium or an indicator, with the results potentially leading to additional Cryptosporidium treatment requirements.

Large public water systems that operate or use surface water for only part of the year could be required to collect Cryptosporidium samples (along with E. coli and turbidity) either twice-per-month during the period when they use surface water or 12 samples per year, whichever results in fewer samples. Samples would be collected during the two years of the required monitoring period, and bin classification would be based on the higher of the two annual averages.

EPA requests comment on these and other approaches for both small and large systems.

Previously Collected Monitoring Data That Do Not Meet QC Requirements

EPA is proposing requirements for acceptance of previously collected monitoring data that are equivalent to requirements for data generated under the LT2ESWTR. The Agency is aware that systems will have previously collected Cryptosporidium data that do not meet all sampling and analysis requirements (e.g., quality control, sample frequency, sample volume) proposed for data collected under the LT2ESWTR. However, the Agency has been unable to develop an approach for allowing systems to use such data for LT2ESWTR bin classification. This is due to uncertainty regarding the impact of deviations from proposed sampling and analysis requirements on data quality and reliability. For example, Methods 1622 and 1623 have been validated within the limits of the QC criteria specified in these methods. While very minor deviations from required QA/QC criteria may have only a minor impact on data quality, the Agency has not identified a basis for establishing alternative standards for data acceptability.

EPA requests comment on whether or under what conditions previously collected data that do not meet the proposed criteria for LT2ESWTR monitoring data should be accepted for use in bin determination. Specifically, EPA requests comment on the sampling frequency requirement for previously collected data, and whether EPA should allow samples collected at lower or varying frequencies to be used as long as the data are representative of seasonal variation and include the required number of samples. If so, how should EPA determine whether such a data set is unbiased and representative of seasonal variation? How should data collected at varying frequency be averaged?

Monitoring for Systems That Recycle Filter Backwash

Plants that recycle filter backwash water may, in effect, increase the concentration of Cryptosporidium in the water that enters the filtration treatment train. Under the LT2ESWTR proposal, microbial sampling may be conducted on source water prior to the addition of filter backwash water. EPA requests comment on how the effect of recycling filter backwash should be considered in LT2ESWTR monitoring.

Bin Assignment for Systems That Fail To Complete Required Monitoring

Today's proposal classifies systems that fail to complete required monitoring in Bin 4, the highest treatment bin. EPA requests comment on alternative approaches for systems that fail to complete required monitoring, such as classifying the system in a bin based on data the system has collected, or classifying the system in a bin one level higher than the bin indicated by the data the system has collected. The shortcoming of these alternative approaches is that bin classification becomes more uncertain, and the likelihood of bin misclassification increases, as systems collect fewer than the required 24 Cryptosporidium samples. Consequently, the proposed approach is for systems to collect all required samples.

Note that under today's proposal, systems may provide 5.5 log of treatment for Cryptosporidium (i.e., comply with Bin 4 requirements) as an alternative to monitoring. Where systems notify the State that they will provide treatment instead of monitoring, they will not incur monitoring violations.

Monitoring Requirements for New Plants and Sources

The proposed LT2ESWTR would establish calendar dates when the initial and second round of source water monitoring must be conducted to determine bin classification. EPA recognizes that new plants will begin operation, and that existing plants will access new sources, after these dates. EPA believes that new plants and plants switching sources should conduct monitoring equivalent to that required of existing plants to determine the required level of Cryptosporidium treatment. The monitoring could be conducted before a new plant or source is brought on-line, or initiated within some time period afterward. EPA requests comment on monitoring and treatment requirements for new plants and sources.

Determination of LT2ESWTR Bin Classification

In today's proposal, EPA expects that systems will be assigned to LT2ESWTR risk bins based on their reported Cryptosporidium monitoring results and the calculations proposed for bin assignment described in this section. EPA requests comment on whether bin classifications should formally be made or reviewed by States.

Source Water Type Classification for Systems That Use Multiple Sources

In today's proposal, the E. coli concentrations that trigger small system Cryptosporidium monitoring are different for systems using lake/reservoir and flowing stream sources. However, EPA recognizes that some systems use multiple sources, potentially including both lake/reservoir and flowing stream sources, and that the use of different sources may vary during the year. Further, some systems use sources that are ground water under the direct influence (GWUDI) of surface water. EPA requests comment on how to apply the E. coli criteria for triggering Cryptosporidium monitoring to systems using multiple sources and GWUDI sources.

B. Unfiltered System Treatment Technique Requirements for Cryptosporidium

1. What Is EPA Proposing Today?

a. Overview. EPA is proposing treatment technique requirements for Cryptosporidium in unfiltered systems. Today's proposal requires all unfiltered systems using surface water or ground water under the direct influence of surface water to achieve at least 2 log (99%) inactivation of Cryptosporidium prior to the distribution of finished water. Further, unfiltered systems must monitor for Cryptosporidium in their source water, and where monitoring demonstrates a mean level above 0.01 oocysts/L, systems must provide at least 3 log Cryptosporidium inactivation. Disinfectants that can be used to meet this treatment requirement include ozone, ultraviolet (UV) light, and chlorine dioxide.

All current requirements for unfiltered systems under 40 CFR 141.71 and 141.72(a) remain in effect, including requirements to inactivate at least 3 log of Giardia lamblia and 4 log of viruses. In addition, unfiltered systems must meet their overall disinfection requirements using a minimum of two disinfectants. These proposed requirements reflect recommendations of the Stage 2 M-DBP Federal Advisory Committee. Details of the proposed requirements are described in the following sections.

b. Monitoring requirements. Requirements for Cryptosporidium monitoring by unfiltered systems are similar to requirements for filtered systems of the same size, as given in section IV.A.1. Unfiltered systems serving at least 10,000 people must sample their source water for Cryptosporidium at least monthly for two years, beginning no later than 6 months after promulgation of this rule. Samples may be collected more frequently (e.g., semi-monthly, weekly) as long as a consistent frequency is maintained throughout the monitoring period.

Unfiltered systems serving fewer than 10,000 people must conduct source water sampling for Cryptosporidium at least twice-per-month for one year, beginning no later than 4 years following promulgation of this rule (i.e., on the same schedule as small filtered systems). However, unlike small filtered systems, small unfiltered systems cannot monitor for an indicator (e.g., E. coli) to determine if they are required to monitor for Cryptosporidium. EPA has not identified indicator criteria that can effectively screen for plants with Cryptosporidium concentrations below 0.01 oocysts/L. Consequently, all small unfiltered systems must conduct Cryptosporidium monitoring.

As described in section IV.K and IV.L, Cryptosporidium analyses must be performed on at least 10 L per sample with EPA Methods 1622 or 1623, and must be conducted by laboratories approved for these methods by EPA. Analysis of larger sample volumes is allowed, provided the laboratory has demonstrated comparable method performance to that achieved on a 10 L sample. Section IV.J describes requirements for reporting sample analysis results. All Cryptosporidium samples must be collected in accordance with a schedule that is developed by the system and submitted to EPA or the State at least 3 months prior to initiation of sampling. Refer to section IV.A.1 for requirements pertaining to any failure to report a valid sample analysis result for a scheduled sampling date and procedures for collecting a replacement sample.

Unfiltered systems are required to participate in future Cryptosporidium monitoring on the same schedule as filtered systems of the same size. Future monitoring requirements for filtered systems are described in section IV.A.1.

Unfiltered systems are not required to conduct source water Cryptosporidium monitoring under the LT2ESWTR if the system currently provides or will provide a total of at least 3 log Cryptosporidium inactivation, equivalent to meeting the treatment requirements for unfiltered systems with a mean Cryptosporidium concentration of greater than 0.01 oocysts/L. Systems must notify the State not later than the date the system is otherwise required to submit a sampling schedule for monitoring. Systems must install and operate technologies to provide a total of at least 3 log Cryptosporidium inactivation by the applicable date in Table IV-24.

c. Treatment requirements. All unfiltered systems must provide treatment for Cryptosporidium, and the degree of required treatment depends on the level of Cryptosporidium in the source water as determined through monitoring. Unfiltered systems must calculate their average source water Cryptosporidium concentration using the arithmetic mean of all samples collected during the required two-year monitoring period (or one-year monitoring period for small systems). For unfiltered systems with mean source water Cryptosporidium levels of less than or equal to 0.01 oocysts/L, 2 log Cryptosporidium inactivation is required. Where the mean source water level is greater than 0.01 oocysts/L, 3 log inactivation is required.
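The determination described above can be sketched as follows. This is an illustrative example only, not part of the proposed regulatory text; the function name and sample values are hypothetical.

```python
def required_log_inactivation(sample_concentrations_per_L):
    """Return (required Cryptosporidium log inactivation, mean concentration)
    for an unfiltered system, given individual sample results in oocysts/L."""
    mean = sum(sample_concentrations_per_L) / len(sample_concentrations_per_L)
    # Mean <= 0.01 oocysts/L -> 2 log required; mean > 0.01 oocysts/L -> 3 log
    required = 2.0 if mean <= 0.01 else 3.0
    return required, mean

# Example: 24 monthly samples, mostly non-detects plus two positive results
samples = [0.0] * 22 + [0.1, 0.2]   # oocysts/L
required, mean = required_log_inactivation(samples)
print(required, mean)               # mean of 0.0125 oocysts/L -> 3 log
```

Note that, consistent with the rule text, the arithmetic mean of all samples (including non-detects, counted as zero oocysts) determines the requirement.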

In addition, unfiltered systems are required to use at least two different disinfectants to meet their overall inactivation requirements for viruses (4 log), Giardia lamblia (3 log), and Cryptosporidium (2 or 3 log). Further, each of the two disinfectants must achieve by itself the total inactivation required for one of these three pathogen types. For example, a system could use UV light to achieve 2 log inactivation of Cryptosporidium and Giardia lamblia, and use chlorine to inactivate 1 log Giardia lamblia and 4 log viruses. In this case, chlorine would achieve the total inactivation required for viruses while UV light would achieve the total inactivation required for Cryptosporidium, and the two disinfectants together would meet the overall treatment requirements for viruses, Giardia lamblia, and Cryptosporidium. In all cases unfiltered systems must continue to meet disinfectant residual requirements for the distribution system.
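The two-disinfectant condition above can be expressed as a simple check: at least two disinfectants must be used, the combined credits must meet every pathogen's requirement, and each disinfectant alone must fully cover at least one pathogen. This sketch and its data layout are hypothetical, assuming the 2 log Cryptosporidium case.

```python
# Required log inactivation per pathogen (3.0 for "crypto" where the mean
# source water level exceeds 0.01 oocysts/L; 2.0 assumed here)
REQUIRED = {"virus": 4.0, "giardia": 3.0, "crypto": 2.0}

def meets_two_disinfectant_rule(credits_by_disinfectant):
    """credits_by_disinfectant maps a disinfectant name to its
    log-inactivation credits per pathogen, e.g. {"uv": {"crypto": 2.0}}."""
    if len(credits_by_disinfectant) < 2:
        return False
    # The disinfectants together must meet every inactivation requirement
    combined = {p: sum(d.get(p, 0.0) for d in credits_by_disinfectant.values())
                for p in REQUIRED}
    if any(combined[p] < REQUIRED[p] for p in REQUIRED):
        return False
    # Each disinfectant must, by itself, fully cover at least one pathogen
    return all(any(d.get(p, 0.0) >= REQUIRED[p] for p in REQUIRED)
               for d in credits_by_disinfectant.values())

# The example from the text: UV covers Cryptosporidium, chlorine covers viruses
example = {"uv": {"crypto": 2.0, "giardia": 2.0},
           "chlorine": {"giardia": 1.0, "virus": 4.0}}
```

In this example, UV alone meets the Cryptosporidium requirement and chlorine alone meets the virus requirement, while their combined Giardia lamblia credits (2.0 + 1.0) meet the 3 log requirement.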

EPA has developed criteria, described in sections IV.C.14-15, for systems to determine Cryptosporidium inactivation credits for chlorine dioxide, ozone, and UV light. Unfiltered systems are allowed to use any of these disinfectants to meet the 2 (or 3) log Cryptosporidium inactivation requirement. The following paragraphs describe standards for demonstrating compliance with the proposed Cryptosporidium treatment technique requirement. For systems using ozone and chlorine dioxide, these standards are similar to current standards for compliance with Giardia lamblia and virus treatment requirements, as established by the SWTR in 40 CFR 141.72 and 141.74. However, for systems using UV light, modified compliance standards are proposed, due to the different way in which UV disinfection systems will be monitored.

Each day a system using ozone or chlorine dioxide serves water to the public, the system must calculate the CT value(s) from the system's treatment parameters, using the procedures specified in 40 CFR 141.74(b)(3). The system must determine whether this value(s) is sufficient to achieve the required inactivation of Cryptosporidium based on the CT criteria specified in section IV.C.14. The disinfection treatment must ensure at least 99 percent (or 99.9 percent if required) inactivation of Cryptosporidium every day the system serves water to the public, except any one day each month. Systems are required to report daily CT values on a monthly basis, as described in section IV.J.
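The daily CT calculation and the "every day except any one day each month" standard can be sketched as follows. CT (in mg-min/L) is the disinfectant residual concentration multiplied by the contact time; the required CT value used below is a placeholder, since the actual values come from the temperature-dependent CT tables referenced in section IV.C.14.

```python
def ct_value(residual_mg_per_L, contact_time_min):
    """CT (mg-min/L) = disinfectant residual C times contact time T."""
    return residual_mg_per_L * contact_time_min

def month_in_compliance(daily_ct_values, required_ct):
    """The required CT may be missed on at most one day per month."""
    misses = sum(1 for ct in daily_ct_values if ct < required_ct)
    return misses <= 1

# Example: 0.8 mg/L ozone residual with 12 minutes of contact time
daily_ct = ct_value(0.8, 12.0)    # 9.6 mg-min/L
# A 30-day month with a single sub-required-CT day remains in compliance
ok = month_in_compliance([9.6] * 29 + [8.0], required_ct=9.0)
```

A second missed day in the same month would put the system out of compliance under this standard.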

Each day a system using UV light serves water to the public, the system must monitor for the parameters, including flow rate and UV intensity, that demonstrate whether the system's UV reactors are operating within the range of conditions that have been validated to achieve the required UV dose, as specified in section IV.C.15. Systems must monitor each UV reactor while in use and must record periods when any reactor operates outside of validated conditions. The disinfection treatment must ensure at least 99 percent (or 99.9 percent if required) inactivation of Cryptosporidium in at least 95 percent of the water delivered to the public every month. Systems are required to report periods when UV reactors operate outside of validated conditions on a monthly basis, as described in section IV.J.
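The monthly UV determination above amounts to a volume-weighted fraction: at least 95 percent of the water delivered each month must be treated while reactors operate within validated conditions. The record layout in this sketch is an assumption for illustration.

```python
def uv_month_in_compliance(delivery_records, min_fraction=0.95):
    """delivery_records: list of (volume_delivered, within_validated_conditions)
    tuples covering the month; returns True if the validated fraction of
    delivered water meets the 95 percent standard."""
    total = sum(volume for volume, _ in delivery_records)
    validated = sum(volume for volume, ok in delivery_records if ok)
    return validated / total >= min_fraction

# Example: 20 equal-volume delivery periods, one outside validated conditions
records = [(1000.0, True)] * 19 + [(1000.0, False)]   # exactly 95%
```

Because compliance is based on the percentage of water disinfected rather than a single daily measurement, brief off-specification periods in one of several parallel reactors need not cause a violation, provided the monthly fraction stays at or above 95 percent.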

Unfiltered systems currently must comply with requirements for DBPs as a condition of avoiding filtration under 40 CFR 141.71(b)(6). As described earlier, EPA is developing a Stage 2 DBPR, which will further limit allowable levels of certain DBPs, specifically trihalomethanes and haloacetic acids. EPA intends to incorporate new standards for DBPs established under the Stage 2 DBPR into the criteria for filtration avoidance.

2. How Was This Proposal Developed?

a. Basis for Cryptosporidium treatment requirements. The intent of the proposed treatment requirements for unfiltered systems is to achieve public health protection against Cryptosporidium equivalent to filtration systems. As described in section III.C, an assessment of survey data indicates that under current treatment requirements, finished water Cryptosporidium levels are higher in unfiltered systems than in filtered systems.

Information Collection Rule data show an average plant-mean Cryptosporidium level of 0.59 oocysts/L in the source water of filtered plants and 0.014 oocysts/L in unfiltered systems. Median plant-mean concentrations were 0.052 and 0.0079 oocysts/L in filtered and unfiltered system sources, respectively. Thus, these results suggest that typical Cryptosporidium occurrence in filtered system sources is approximately 10 times higher than in unfiltered system sources.

In translating these data to assess finished water risk, EPA and the Advisory Committee estimated that conventional plants in compliance with the IESWTR achieve an average Cryptosporidium removal of 3 log (see discussion in section III.D). Hence, if the median source water Cryptosporidium level at conventional plants is approximately 10 times higher than at unfiltered systems, and conventional plants achieve an average reduction of 3 log (99.9%), then the median finished water Cryptosporidium concentration at conventional plants is a factor of 100 lower than the median source water concentration at unfiltered systems. Therefore, to ensure equivalent public health protection, unfiltered systems should reduce Cryptosporidium levels by 2 log (i.e., a factor of 100).
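The equivalence argument above can be checked numerically, using the median unfiltered source water level reported earlier (the values are illustrative):

```python
unfiltered_source = 0.0079                    # oocysts/L (median, unfiltered sources)
filtered_source = 10 * unfiltered_source      # ~10 times higher at filtered plants
filtered_finished = filtered_source / 10**3   # after 3 log (99.9%) removal
# Filtered finished water is ~100 times lower than the unfiltered source water:
ratio = unfiltered_source / filtered_finished
# A 2 log (100-fold) reduction by unfiltered systems matches that level:
unfiltered_finished = unfiltered_source / 10**2
```

Here `ratio` works out to 100, and `unfiltered_finished` equals `filtered_finished`, which is the basis for the 2 log inactivation requirement.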

Due to the development of criteria for Cryptosporidium inactivation with ozone, chlorine dioxide, and UV light, EPA has determined that it is feasible for unfiltered systems to comply with a Cryptosporidium treatment technique requirement. Consequently, EPA is proposing that all unfiltered systems provide at least 2 log inactivation of Cryptosporidium.

The proposed treatment requirements for unfiltered systems with higher source water Cryptosporidium levels are consistent with proposed treatment requirements for filtered systems. As discussed previously, EPA is proposing that filtered plants with mean source water Cryptosporidium levels between 0.075 and 1.0 oocysts/L, as measured by Methods 1622 and 1623, provide at least a 4 log reduction (with greater treatment required for higher source water pathogen levels). These requirements will achieve average treated water Cryptosporidium concentrations below 1 oocyst/10,000 L in filtered systems. An unfiltered system with a mean source water Cryptosporidium concentration above 0.01 oocyst/L would need to provide more than 2 log inactivation in order to achieve an equivalent finished water oocyst level. Therefore, EPA is proposing that unfiltered systems provide at least 3 log inactivation where mean concentrations exceed 0.01 oocysts/L.

For unfiltered systems using UV disinfection to meet these proposed Cryptosporidium treatment requirements, EPA is proposing that compliance be based on a 95th percentile standard (i.e., at least 95 percent of the water must be treated to the required UV dose). This standard is intended to be comparable with the “every day except any one day per month” compliance standard established by the SWTR for chemical disinfection (see 40 CFR 141.72(a)(1)). Because UV disinfection systems will typically consist of multiple parallel reactors that will be monitored continuously, the Agency has determined that it is more appropriate to base a compliance determination on the percentage of water disinfected to the required level, rather than a single daily measurement. The UV Disinfection Guidance Manual (USEPA 2003d) will provide advice on meeting this proposed standard. A draft of this guidance is available in the docket for today's proposal (http://www.epa.gov/edocket/).

b. Basis for requiring the use of two disinfectants. EPA is proposing that unfiltered systems use at least two different disinfectants to meet the 2 (or 3), 3, and 4 log inactivation requirements for Cryptosporidium, Giardia lamblia, and viruses, respectively. The purpose of this requirement is to provide for multiple barriers of protection against pathogens. One benefit of this approach is that if one barrier were to fail then there would still be one remaining barrier to provide protection against some of the pathogens that might be present. For example, if a plant used UV to inactivate Cryptosporidium and Giardia lamblia, along with chlorine to inactivate viruses, and the UV system were to malfunction, the chlorine would still meet the treatment requirement for viruses and would provide some degree of protection against Giardia lamblia.

Another benefit of multiple barriers is that they will typically provide more effective protection against a broad spectrum of pathogens than a single disinfectant. Because the efficacy of disinfectants against different pathogens varies widely, using multiple disinfectants will generally provide more efficient inactivation of a wide range of pathogens than a single disinfectant.

EPA is aware, though, that this requirement would not result in a redundant barrier for each type of pathogen. In the example of a plant using chlorine and UV, the chlorine would provide essentially no protection against Cryptosporidium and might achieve only a small amount of Giardia lamblia inactivation if it were designed primarily to inactivate viruses. However, since the watersheds of unfiltered systems are required to be protected (40 CFR 141.71), the probability is low that high levels of Cryptosporidium or Giardia lamblia would occur during the time frame necessary to address a short period of treatment failure.

Note the request for comment on this topic at the end of this section.

c. Basis for source water monitoring requirements. Monitoring by unfiltered systems is necessary to identify those with mean source water Cryptosporidium levels above 0.01 oocysts/L. In order to allow for simultaneous compliance with other microbial and disinfection byproduct regulatory requirements, EPA is proposing that unfiltered systems monitor for Cryptosporidium on the same schedule as filtered systems of the same size. Because EPA was not able to identify indicator criteria, such as E. coli, that can discriminate among systems above and below a mean Cryptosporidium concentration of 0.01 oocysts/L, EPA is proposing that all unfiltered systems monitor for Cryptosporidium.

Consistent with requirements for filtered systems, unfiltered systems are required to analyze at least 24 samples of at least 10 L over the two-year monitoring period (one year for small systems). However, if an unfiltered system collected and analyzed only 24 samples of 10 L, then a total count of 3 oocysts among all samples would result in a source water concentration exceeding 0.01 oocysts/L. To reduce the likelihood that a small number of oocyst counts triggers additional treatment requirements, unfiltered systems may consider conducting more frequent sampling or analyzing larger sample volumes (e.g., 50 L). Because the water sources of unfiltered systems tend to have very low turbidity (compared to average sources in filtered systems), it is typically more feasible to analyze larger sample volumes in unfiltered systems. Filters have been approved for Cryptosporidium analysis of 50 L samples. Note that analysis of larger sample volumes would not reduce the required sampling frequency.
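The arithmetic behind the paragraph above is simply the total oocyst count divided by the total volume analyzed:

```python
def mean_concentration(total_oocysts, n_samples, volume_L_per_sample):
    """Mean source water concentration in oocysts/L over all samples."""
    return total_oocysts / (n_samples * volume_L_per_sample)

c_10L = mean_concentration(3, 24, 10)   # 0.0125 oocysts/L, above the 0.01 threshold
c_50L = mean_concentration(3, 24, 50)   # 0.0025 oocysts/L, below the threshold
```

With 10 L samples, 3 oocysts across 240 L exceeds 0.01 oocysts/L, whereas the same count across 50 L samples (1,200 L total) does not, which is why larger sample volumes reduce the influence of a small number of counts.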

3. Request for Comment

EPA solicits comment on the proposed monitoring and treatment technique requirements for unfiltered systems. Specifically, the Agency seeks comment on the following issues:

Use of Two Disinfectants

EPA requests comment on the proposed requirement for unfiltered systems to use two disinfectants and for each disinfectant to meet by itself the inactivation requirement for at least one regulated pathogen. The requirement for unfiltered systems to use two disinfectants was recommended by the Advisory Committee because (1) disinfectants vary in their efficacy against different pathogens, so that the use of multiple disinfectants can provide more effective protection against a broad spectrum of pathogens, and (2) multiple disinfectants provide multiple barriers of protection, which can be more reliable than a single disinfectant.

An alternate approach would be to allow systems to meet the inactivation requirements using any combination of one or more disinfectants that achieved the required inactivation level for all pathogens. This would give systems greater flexibility and could spur the development of new disinfection techniques that would be applicable to a wide range of pathogens. However, this approach might be less protective against unregulated pathogens. A related question is whether the proposed requirements for use of two disinfectants establish an adequate level of multiple barriers in the treatment provided by unfiltered systems.

Treatment Requirements for Unfiltered Systems With Higher Cryptosporidium Levels

Under the proposed LT2ESWTR, a filtered system that measures a mean source water Cryptosporidium level of 0.075 oocysts/L or higher is required to provide a total of 4 log or more reduction of Cryptosporidium. However, if an unfiltered system meeting the criteria for avoiding filtration were to measure Cryptosporidium at this level, it would be required to provide only 3 log treatment. Available occurrence data indicate that very few, if any, unfiltered systems will measure mean source water Cryptosporidium concentrations above 0.075 oocysts/L. However, EPA requests comment on whether or how this possibility should be addressed.

C. Options for Systems To Meet Cryptosporidium Treatment Requirements

1. Microbial Toolbox Overview

The LT2ESWTR proposal contains a list of treatment processes and management practices for water systems to use in meeting additional Cryptosporidium treatment requirements under the LT2ESWTR. This list, termed the microbial toolbox, was recommended by the Stage 2 M-DBP Advisory Committee in the Agreement in Principle. Components of the microbial toolbox include watershed control programs, alternative sources, pretreatment processes, additional filtration barriers, inactivation technologies, and enhanced plant performance. The intent of the microbial toolbox is to provide water systems with broad flexibility in selecting cost-effective LT2ESWTR compliance strategies. Moreover, the toolbox allows systems that currently provide additional pathogen barriers or that can demonstrate enhanced performance to receive additional Cryptosporidium treatment credit.

A key feature of the microbial toolbox is that many of the components carry presumptive credits towards Cryptosporidium treatment requirements. Plants will receive these credits for toolbox components by demonstrating compliance with required design and implementation criteria, as described in the sections that follow. Treatment credit greater than the presumptive credit may be awarded for a toolbox component based on a site-specific or technology-specific demonstration of performance, as described in section IV.C.17.

While the Advisory Committee made recommendations for the degree of presumptive treatment credit to be granted to different toolbox components, the Committee did not specify the design and implementation conditions under which the credit should be awarded. EPA has identified and is proposing such conditions in today's notice, based on an assessment of available data. For certain toolbox components, such as raw water storage and roughing filters, the Agency concluded that available data do not support the credit recommended by the Advisory Committee. Consequently, EPA is not proposing a presumptive credit for these options.

For each microbial toolbox component, EPA is requesting comment on: (1) Whether available data support the proposed presumptive credits, including the design and implementation conditions under which the credit would be awarded, (2) whether available data are consistent with the decision not to award presumptive credit for roughing filters and raw water off-stream storage, and (3) whether additional data are available on treatment effectiveness of toolbox components for reducing Cryptosporidium levels. EPA will consider modifying today's proposal for microbial toolbox components based on new information that may be provided.

EPA particularly solicits comment on the Cryptosporidium removal performance of alternative filtration technologies that are currently in use or that systems are considering for future use, specifically including bag filters, cartridge filters, and bank filtration. The Agency requests both laboratory and field data that will support a determination of the appropriate level of Cryptosporidium removal credit to award to these technologies. In addition, the Agency requests information on the applicability of these technologies to different source water types and treatment scenarios. Data submitted in response to this request for comment should include, where available, associated quality assurance and cost information. This preamble discusses bank filtration in section IV.C.6 and bag and cartridge filters in section IV.C.12.

Table IV-7 summarizes presumptive credits and associated design and implementation criteria for microbial toolbox components. Each component is then described in more detail in the sections that follow. EPA is also developing guidance to assist systems with implementing toolbox components. Pertinent guidance documents include: UV Disinfection Guidance Manual (USEPA 2003d), Membrane Filtration Guidance Manual (USEPA 2003e), and Toolbox Guidance Manual (USEPA 2003f). Each is available in draft form in the docket for today's proposal (http://www.epa.gov/edocket/).

Table IV-7.—Microbial Toolbox: Proposed Options, Log Credits, and Design/Implementation Criteria 1

1 Table provides summary information only; refer to following preamble and regulatory language for detailed requirements.

Watershed control program: 0.5 log credit for a State-approved program comprising EPA-specified elements. Does not apply to unfiltered systems.

Alternative source/intake management: No presumptive credit. Systems may conduct simultaneous monitoring for LT2ESWTR bin classification at alternative intake locations or under alternative intake management strategies.

Off-stream raw water storage: No presumptive credit. Systems using off-stream storage must conduct LT2ESWTR sampling after the raw water reservoir to determine bin classification.

Pre-sedimentation basin with coagulation: 0.5 log credit with continuous operation and coagulant addition; basins must achieve 0.5 log turbidity reduction based on the monthly mean of daily measurements in 11 of the 12 previous months; all flow must pass through basins. Systems using existing pre-sedimentation basins must sample after the basins to determine bin classification and are not eligible for presumptive credit.

Lime softening: 0.5 log additional credit for two-stage softening (single-stage softening is credited as equivalent to conventional treatment). Coagulant must be present in both stages, including metal salts, polymers, lime, or magnesium precipitation. Both stages must treat 100% of flow.

Bank filtration (as pretreatment): 0.5 log credit for 25 ft setback; 1.0 log credit for 50 ft setback; aquifer must be unconsolidated sand containing at least 10% fines; average turbidity in wells must be less than 1 NTU. Systems using existing wells followed by filtration must monitor well effluent to determine bin classification and are not eligible for presumptive credit.

Combined filter performance: 0.5 log credit for combined filter effluent turbidity ≤ 0.15 NTU in 95% of samples each month.

Roughing filters: No presumptive credit proposed.

Slow sand filters: 2.5 log credit as a secondary filtration step; 3.0 log credit as a primary filtration process. No prior chlorination.

Second stage filtration: 0.5 log credit for a second, separate filtration stage; treatment train must include coagulation prior to the first filter. No presumptive credit for roughing filters.

Membranes: Log credit equivalent to the removal efficiency demonstrated in a challenge test for the device, if supported by direct integrity testing.

Bag filters: 1 log credit with demonstration of at least 2 log removal efficiency in a challenge test.

Cartridge filters: 2 log credit with demonstration of at least 3 log removal efficiency in a challenge test.

Chlorine dioxide: Log credit based on demonstration of log inactivation using the CT table.

Ozone: Log credit based on demonstration of log inactivation using the CT table.

UV: Log credit based on demonstration of inactivation using the UV dose table; reactor testing required to establish validated operating conditions.

Individual filter performance: 1.0 log credit for demonstration of filtered water turbidity ≤ 0.1 NTU in 95 percent of daily maximum values from individual filters (excluding the 15-minute period following backwashes) and no individual filter measurement > 0.3 NTU in two consecutive measurements taken 15 minutes apart.

Demonstration of performance: Credit awarded to a unit process or treatment train based on a demonstration to the State, through use of a State-approved protocol.

2. Watershed Control Program

a. What is EPA proposing today? EPA is proposing a 0.5 log credit towards Cryptosporidium treatment requirements under the LT2ESWTR for filtered systems that develop a State-approved watershed control program designed to reduce the level of Cryptosporidium. The watershed control program credit can be added to the credit awarded for any other toolbox component. However, this credit is not available to unfiltered systems, as they are currently required under 40 CFR 141.171 to maintain a watershed control program that minimizes the potential for contamination by Cryptosporidium as a criterion for avoiding filtration.

There are many potential sources of Cryptosporidium in watersheds, including sewage discharges and non-point sources associated with animal feces. The feasibility, effectiveness, and sustainability of control measures to reduce Cryptosporidium contamination of water sources will be site-specific. Consequently, the proposed watershed control program credit centers on systems working with stakeholders in the watershed to develop a site-specific program, and State review and approval are required. In the Toolbox Guidance Manual (USEPA 2003f), available in draft in the docket for today's proposal, EPA provides information on management practices that systems may consider in developing their watershed control programs.

Initial State approval of a system's watershed control program will be based on State review of the system's proposed watershed control plan and supporting documentation. The initial approval can be valid until the system completes the second round of Cryptosporidium monitoring described in section IV.A (systems begin a second round of monitoring six years after the initial bin assignment). During this period, the system is responsible for implementing the approved plan and complying with other general requirements, such as an annual watershed survey and program status report. These requirements are further described later in this section.

The period during which State approval of a watershed control program is in effect is referred to as the approval period. Systems that want to continue their eligibility to receive the 0.5 log Cryptosporidium treatment credit must reapply for State approval of the program for each subsequent approval period. In general, the re-approval will be based on the State's review of the system's reapplication package, as well as the annual status reports and watershed surveys. Subsequent approval(s) by the State of the watershed control program typically will be for a time equivalent to the first approval period, but States have the discretion to renew approval for a longer or shorter time period.

Requirements for Initial State Approval of Watershed Control Programs

Systems that intend to pursue a 0.5 log Cryptosporidium treatment credit for a watershed control program are required to notify the State within one year following initial bin assignment that the system proposes to develop a watershed control plan and submit it for State approval.

The application to the State for initial program approval must include the following minimum elements:

  • An analysis of the vulnerability of each source to Cryptosporidium. The vulnerability analysis must address the watershed upstream of the drinking water intake, including: A characterization of the watershed hydrology, identification of an “area of influence” (the area to be considered in future watershed surveys) outside of which there is no significant probability of Cryptosporidium or fecal contamination affecting the drinking water intake, identification of both potential and actual sources of Cryptosporidium contamination, the relative impact of the sources of Cryptosporidium contamination on the system's source water quality, and an estimate of the seasonal variability of such contamination.
  • An analysis of control measures that could address the sources of Cryptosporidium contamination identified during the vulnerability analysis. The analysis of control measures must address their relative effectiveness in reducing Cryptosporidium loading to the source water and their sustainability.
  • A plan that specifies goals and defines and prioritizes specific actions to reduce source water Cryptosporidium levels. The plan must explain how actions are expected to contribute to specified goals, identify partners and their role(s), present resource requirements and commitments including personnel, and include a schedule for plan implementation.

The proposed watershed control plan and a request for program approval and 0.5 log Cryptosporidium treatment credit must be submitted by the system to the State no later than 24 months following initial bin assignment.

The State will review the system's initial proposed watershed control plan and either approve, reject, or “conditionally approve” the plan. If the plan is approved, or if the system agrees to implement the State's conditions for approval, the system will be awarded 0.5 log credit towards LT2ESWTR Cryptosporidium treatment requirements. A final decision on approval must be made no later than three years following the system's initial bin assignment.

The initial State approval of the system's watershed control program can be valid until the system completes the required second round of Cryptosporidium monitoring. The system is responsible for taking the required steps, described as follows, to maintain State program approval and the 0.5 log credit during the approval period.

Requirements for Maintaining State Approval of Watershed Control Programs

Systems that have obtained State approval of their watershed control program are required to meet the following ongoing requirements within each approval period to continue their eligibility for the 0.5 log Cryptosporidium treatment credit:

  • Submit an annual watershed control program status report to the State during each year of the approval period.
  • Conduct an annual State-approved watershed survey and submit the survey report to the State.
  • Submit to the State an application for review and re-approval of the watershed control program and for a continuation of the 0.5 log treatment credit for a subsequent approval period.

The annual watershed control program status report must describe the system's implementation of the approved plan and assess the adequacy of the plan to meet its goals. It must explain how the system is addressing any shortcomings in plan implementation, including those previously identified by the State or as a result of the watershed survey. If it becomes necessary during implementation to make substantial changes to its approved watershed control program, the system must notify the State and provide a rationale prior to making any such changes. If any change is likely to reduce the level of source water protection, the system must also describe in its notification the actions it will take to mitigate the effects.

The watershed survey must be conducted according to State guidelines and by persons approved by the State to conduct watershed surveys. The survey must encompass the area of the watershed that was identified in the State-approved watershed control plan as the area of influence and, as a minimum, assess the priority activities identified in the plan and identify any significant new sources of Cryptosporidium.

The application to the State for review and re-approval of the system's watershed control program must be provided to the State at least six months before the current approval period expires or by a date previously determined by the State. The request must include a summary of activities and issues identified during the previous approval period and a revised plan that addresses activities for the next approval period, including any new actual or potential sources of Cryptosporidium contamination and details of any proposed or expected changes from the existing State-approved program. The plan must address goals, prioritize specific actions to reduce source water Cryptosporidium, explain how actions are expected to contribute to achieving goals, identify partners and their role(s), resource requirements and commitments, and the schedule for plan implementation.

The annual program status reports, watershed control plan and annual watershed sanitary surveys must be made available to the public upon request. These documents must be in a plain language format and include criteria by which to evaluate the success of the program in achieving plan goals. If approved by the State, the system may withhold portions of the annual status report, watershed control plan, and watershed sanitary survey based on security considerations.

b. How was this proposal developed? The M-DBP Advisory Committee recommended that systems be awarded 0.5 log Cryptosporidium treatment credit for implementing a watershed control program. This recommendation was based on the Committee's recognition that some systems will be able to reduce the level of Cryptosporidium in their source water by implementing a well-designed and focused watershed control program. Moreover, the control measures used in the watershed to reduce levels of Cryptosporidium are likely to reduce concentrations of other pathogens as well.

EPA concurs that well designed watershed control programs that focus on reducing levels of Cryptosporidium contamination of water sources should be encouraged, and that implementation of such programs will likely reduce overall microbial risk. A broad reduction in microbial risk will occur through the application of control measures and best management practices that are effective in reducing fecal contamination in the watershed. In addition, plant management practices may be enhanced by the knowledge systems acquire regarding the watershed and factors that affect microbial risk, such as sources, fate, and transport of pathogens.

Given the highly site-specific nature of a watershed control program, including the feasibility and effectiveness of different control measures, EPA believes that systems should demonstrate their eligibility for 0.5 log Cryptosporidium treatment credit by developing targeted programs that account for site-specific factors. As part of developing a watershed control program, systems will be required to assess a number of these factors, including watershed hydrology, sources of Cryptosporidium in the watershed, human impacts, and fate and transport of Cryptosporidium. Furthermore, EPA believes that the State is well positioned to judge whether a system's watershed control program is likely to achieve a substantial reduction of Cryptosporidium in source water. Consequently, EPA is proposing that approval of watershed control programs and allowance for an associated 0.5 log treatment credit be made by the State on a system specific basis.

A watershed control program could include measures such as (1) the elimination, reduction, or treatment of wastewater or storm water discharges, (2) treatment of Cryptosporidium contamination at the sites of waste generation or storage, (3) prevention of Cryptosporidium migration from sources, or (4) any other measures that are effective, sustainable, and likely to reduce Cryptosporidium contamination of source water. EPA recognizes that many public water systems do not directly control the watersheds of their sources of supply. EPA expects that systems will need to develop and maintain partnerships with landowners within watersheds, as well as with State governments and regional agencies that have authority over activities in the watershed that may contribute Cryptosporidium to the water supply. Stakeholders that have some level of control over activities that could contribute to Cryptosporidium contamination include municipal government and private operators of wastewater treatment plants, livestock farmers and persons who spread manure, individuals with failing septic systems, logging operations, and other government and commercial organizations.

EPA has initiated a number of programs that address watershed management and source water protection. In 2002, EPA launched the Watershed Initiative (67 FR 36172, May 23, 2002) (USEPA 2002b), which will provide grants to support innovative watershed-based approaches to preventing, reducing, and eliminating water pollution. In addition, EPA has recently promulgated new regulations for Concentrated Animal Feeding Operations (CAFOs), which through the NPDES permit process will limit discharges that contribute microbial pathogens to watersheds.

SDWA section 1453 requires States to carry out a source water quality assessment program for the protection and benefit of public water systems. EPA issued program guidance in August of 1997, and expects that most States will complete their source water assessments of surface water systems by the end of 2003. These assessments will establish a foundation for watershed vulnerability analyses by providing the preliminary analyses of watershed hydrology, a starting point for defining the area of influence, and an inventory and hierarchy of actual and potential contamination sources. In some cases, these portions of the source water assessment may fully satisfy those analytical needs.

As noted earlier, EPA has published and is continuing to develop guidance material that addresses contamination by Cryptosporidium and other pathogens from both non-point sources (e.g., agricultural and urban runoff, septic tanks) and point sources (e.g., sewer overflows, POTWs, CAFOs). The Toolbox Guidance Manual, available in draft with today's proposal, includes a list of programmatic resources and guidance available to assist systems in building partnerships and implementing watershed protection activities. In addition, this guidance manual incorporates available information on the effectiveness of different control measures to reduce Cryptosporidium levels and provides case studies of watershed control programs. This guidance is intended to assist water systems in developing their watershed control programs and States in their assessment and approval of these programs.

In addition to guidance documents, demonstration projects, and technical resources, EPA provides funding for watershed and source water protection through the Drinking Water State Revolving Fund (DWSRF) and Clean Water State Revolving Fund (CWSRF). Under the DWSRF program, States may provide funding directly to public water systems for source water protection, including watershed management and pathogen source reduction plans. CWSRF funds have been used to develop and implement agricultural best management practices for reducing pathogen loading to receiving waters and to fund directly, or provide incentives for, the replacement of failing septic systems. EPA encourages the use of CWSRF for source protection and has developed guidelines for the award of funds to address non-point sources of pollution (CWA section 319 Non Point Source Pollution Program). Further, the Agency is promoting the broader use of SRF funds to implement measures to prevent and control non-point source pollution. Detailed sanitary surveys, with a specific analysis of sources of Cryptosporidium in the watershed, will facilitate the process of targeting funding available under SRF programs to eliminate or mitigate these sources.

c. Request for comment. EPA requests comment on the proposed watershed control program credit and associated program components.

  • Should the State be allowed to reduce the frequency of the annual watershed survey requirement for certain systems if systems engage in alternative activities like public outreach?
  • The effectiveness of a watershed control program may be difficult to assess because of uncertainty in the efficacy of control measures under site-specific conditions. In order to provide constructive guidance, EPA welcomes reports on scientific case studies and research that evaluated methods for reducing Cryptosporidium contamination of source waters.
  • Are there confidential business information (CBI) concerns associated with making information on the watershed control program available to the public? If so, what are these concerns and how should they be addressed?
  • How should the “area of influence” (the area to be considered in future watershed surveys) be delineated, considering the persistence of Cryptosporidium?

3. Alternative Source

a. What is EPA proposing today? Plant intake refers to the works or structures at the head of a conduit through which water is diverted from a source (e.g., river or lake) into the treatment plant. Plants may be able to reduce influent Cryptosporidium levels by changing the intake placement (either within the same source or to an alternate source) or managing the timing or level of withdrawal.

Because the effect of changing the location or operation of a plant intake on influent Cryptosporidium levels will be site specific, EPA is not proposing any presumptive credit for this option. Rather, if a system is concerned that Cryptosporidium levels associated with the current plant intake location and/or operation will result in a bin assignment requiring additional treatment under the LT2ESWTR, the system may conduct concurrent Cryptosporidium monitoring reflecting a different intake location or different intake management strategy. The State will then make a determination as to whether the plant may be classified in an LT2ESWTR bin using the alternative intake location or management monitoring results.

Thus, systems that intend to be classified in an LT2ESWTR bin based on a different intake location or management strategy must conduct concurrent Cryptosporidium monitoring. The system is still required to monitor its current plant intake in addition to any alternative intake location/management monitoring, and must submit the results of all monitoring to the State. In addition, the system must provide the State with supporting information documenting the conditions under which the alternative intake location/management samples were collected. The concurrent monitoring must conform to the sample frequency, sample volume, analytical method, and other requirements that apply to the system for Cryptosporidium monitoring as stated in Section IV.A.1.

If a plant's LT2ESWTR bin classification is based on monitoring results reflecting a different intake location or management strategy, the system must relocate the intake or implement the intake management strategy within the compliance time frame for the LT2ESWTR, as specified in section IV.F.

b. How was this proposal developed? In the Stage 2 M-DBP Agreement in Principle, the Advisory Committee identified several actions related to the intake which potentially could reduce the concentration of Cryptosporidium entering a treatment plant. These actions were included in the microbial toolbox under the heading Alternative Source, and include: (1) Intake relocation, (2) change to alternative source of supply, (3) management of intake to reduce capture of oocysts in source water, (4) managing timing of withdrawal, and (5) managing level of withdrawal in water column.

It is difficult to predict in advance the efficacy of any of these activities in reducing levels of Cryptosporidium entering the treatment plant. However, if a system relocates the plant intake or implements a different intake management strategy, it is appropriate for the plant to be assigned to an LT2ESWTR bin using monitoring results reflecting the new intake strategy.

EPA believes that the requirements specified for monitoring to determine bin placement are necessary to characterize a plant's mean source water Cryptosporidium level. Consequently, any concurrent monitoring carried out to characterize a different intake location or management strategy should be equivalent. For this reason, the sampling and analysis requirements which apply to the current intake monitoring also apply to any concurrent monitoring used to characterize a new intake location or management strategy.

EPA also recognizes that if a plant's bin assignment is based on a new intake operation strategy, then it is important for the plant to continue to use this new strategy in routine operation. Therefore, EPA is proposing that the system document the new intake operation strategy when submitting additional monitoring results to the State and that the State approve that new strategy.

c. Request for comment. EPA requests comment on the following issues:

  • What are intake management strategies by which systems could reduce levels of Cryptosporidium in the plant influent?
  • Can representative Cryptosporidium monitoring to demonstrate a reduction in oocyst levels be accomplished prior to implementation of a new intake strategy (e.g., monitoring a new source prior to constructing a new intake structure)?
  • How should this option be applied to plants that use multiple sources which enter a plant through a common conduit, or which use separate sources which enter the plant at different points?

4. Off-Stream Raw Water Storage

a. What is EPA proposing today? Off-stream raw water storage reservoirs are basins located between a water source (typically a river) and the coagulation and filtration processes in a treatment plant. EPA is not proposing presumptive treatment credit for Cryptosporidium removal through off-stream raw water storage. Systems using off-stream raw water storage must conduct Cryptosporidium monitoring after the reservoir for the purpose of determining LT2ESWTR bin placement. This will allow reductions in Cryptosporidium levels that occur through settling during off-stream storage to be reflected in the monitoring results and consequent LT2ESWTR bin assignment.

The use of off-stream raw water storage reservoirs during LT2ESWTR monitoring must be consistent with routine plant operation and must be recorded by the system. Guidance on monitoring locations is provided in the Public Water System Guidance Manual for Source Water Monitoring under the LT2ESWTR (USEPA 2003g), which is available in draft in the docket for today's proposal.

b. How was this proposal developed? The Stage 2 M-DBP Agreement in Principle recommends a 0.5 log credit for off-stream raw water storage reservoirs with detention times on the order of days and 1.0 log credit for reservoirs with detention times on the order of weeks. After a review of the available literature, EPA is unable to determine criteria that provide reasonable assurance of achieving a 0.5 or 1 log removal of oocysts. Consequently, EPA is not proposing a presumptive treatment credit for this process.

This proposal for off-stream raw water storage represents a change from the November 2001 pre-proposal draft of the LT2ESWTR (USEPA 2001g), which described 0.5 log and 1 log presumptive credits for reservoirs with hydraulic detention times of 21 and 60 days, respectively. These criteria were based on a preliminary assessment of reported studies, described later in this section, that evaluated Cryptosporidium and Giardia removal in raw water storage reservoirs.

Subsequent to the November 2001 pre-proposal draft, the Science Advisory Board (SAB) reviewed the data that EPA had acquired to support Cryptosporidium treatment credits for off-stream raw water storage (see section VII.K). In written comments from a December 2001 meeting of the SAB Drinking Water Committee, the panel concluded that the available data were not adequate to demonstrate the treatment credits for off-stream raw water storage described in the pre-proposal draft, and recommended that no presumptive credits be given for this toolbox option. The panel did agree, though, that a utility should be able to take advantage of off-stream raw water storage by sampling after the reservoir for appropriate bin placement. EPA concurs with this finding by the SAB and today's proposal is consistent with their recommendation.

Off-stream raw water storage can improve the microbial quality of water in a number of ways. These include (1) reduced microbial and particulate loading to the plant due to settling in the reservoir, (2) reduced viability of pathogens due to die-off, and (3) dampening of water quality and hydraulic spikes. EPA has evaluated a number of studies that investigated the removal of Cryptosporidium and other microorganisms and particles in raw water storage basins. These studies are summarized in the following paragraphs, and selected results are presented in Table IV-8.

Table IV-8.—Studies of Cryptosporidium and Giardia Removal From Off-Stream Raw Water Storage

Researcher | Reservoir | Residence time | Log reductions
Ketelaars et al. 1995 | Biesbosch reservoir system: man-made pumped storage (Netherlands) | 24 weeks (average) | Cryptosporidium: 1.4; Giardia: 2.3
Van Breemen et al. 1998 | Biesbosch reservoir system: man-made pumped storage (Netherlands) | 24 weeks (average) | Cryptosporidium: 2.0; Giardia: 2.6
Van Breemen et al. 1998 | PWN (Netherlands) | 10 weeks (average) | Cryptosporidium: 1.3; Giardia: 0.8
Bertolucci et al. 1998 | Abandoned gravel quarry used for storage (Italy) | 18 days (theoretical) | Cryptosporidium: 1.0; Giardia: 0.5
Ongerth 1989 | Three impoundments on rivers with limited public access (Seattle, WA) | 40, 100, and 200 days (respectively) | No Giardia removal observed

Ketelaars et al. (1995) evaluated Cryptosporidium and Giardia removal across a series of three man-made pumped reservoirs, named the Biesbosch reservoirs, with reported hydraulic retention times of 11, 9, and 4 weeks (combined retention time of 24 weeks). To prevent algal growth and hypolimnetic deoxygenation, the reservoirs were destratified by air-injection. Based on weekly sampling over one year, mean influent and effluent concentrations of Cryptosporidium were 0.10 and 0.004 oocysts/100 L, respectively, indicating an average removal across the three reservoirs of 1.4 log. Mean removal of Giardia was 2.3 log.
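The removal figures reported in these studies follow from the standard log-reduction relation, log10(influent concentration / effluent concentration). As an illustrative sketch only (the function name is ours, not part of the rule), the Ketelaars et al. means reproduce the reported 1.4 log removal:

```python
import math

def log_removal(influent: float, effluent: float) -> float:
    """Log10 reduction between influent and effluent concentrations.

    Units must match on both sides (e.g., oocysts/100 L).
    """
    return math.log10(influent / effluent)

# Ketelaars et al. (1995): mean Cryptosporidium fell from
# 0.10 to 0.004 oocysts/100 L across the three Biesbosch reservoirs.
print(round(log_removal(0.10, 0.004), 1))  # 1.4
```

The same relation underlies the other removal values quoted in this section (e.g., 6.3 to 0.064 oocysts/100 L gives approximately 2.0 log).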

Van Breemen et al. (1998) continued the efforts of Ketelaars et al. (1995) in evaluating pathogen removal across the Biesbosch reservoir system. Using a more sensitive analytical method, Van Breemen et al. measured mean Cryptosporidium levels of 6.3 and 0.064 oocysts/100 L at the inlet and outlet, respectively, indicating an average removal of 2.0 log. For Giardia, the average reduction was 2.6 log. In addition, Van Breemen et al. (1998) evaluated removal of Cryptosporidium, Giardia, and other microorganisms in a reservoir designated PWN, which had a hydraulic retention time of 10 weeks. Passage through this storage reservoir was reported to reduce the mean concentration of Cryptosporidium by 1.3 log and of Giardia by 0.8 log.

Bertolucci et al. (1998) investigated removal of Cryptosporidium, Giardia, and nematodes in a reservoir derived from an abandoned gravel quarry with a detention time reported as around 18 days. Over a 2-year period, average influent and effluent concentrations of Cryptosporidium were 70 and 7 oocysts/100 L, respectively, demonstrating a mean reduction of 1 log. Average Giardia levels decreased from 137 cysts/100 L at the inlet to 46 cysts/100 L at the outlet, resulting in a mean 0.5 log removal.

Ongerth (1989) studied concentrations of Giardia cysts in the Tolt, Cedar, and Green rivers, which drain the western slope of the Cascade Mountains in Washington. The watersheds of each river are controlled by municipal water departments for public water supply, and public access is limited. The Cedar, Green, and Tolt rivers each have impoundments with reported residence times of 100, 30-50, and 200 days, respectively, in the reach studied. Ongerth found no statistically significant difference in cyst concentrations above and below any of the reservoirs. Median cyst concentrations above and below the Cedar, Green, and Tolt reservoirs were reported as 0.12 and 0.22, 0.27 and 0.32, and 0.16 and 0.21 cysts/L, respectively. It is unclear why no decrease in cyst levels was observed. It is possible that contamination of the water in the impoundments by Giardia from animal sources, either directly or through run-off, may have occurred.

EPA has also considered results from studies which evaluated the rate at which Cryptosporidium oocysts lose viability and infectivity over time. Two studies are summarized next, with selected results presented in Table IV-9.

Table IV-9.—Studies of Cryptosporidium Die-Off During Raw Water Storage

Researcher | Type of experiment | Log reduction
Medema et al. 1997 | River water inoculated with Cryptosporidium and bacteria and incubated | 0.5 log reduction over 50 days at 5 °C; 0.5 log reduction over 20-80 days at 15 °C
Sattar et al. 1999 | Synthetic hard water and natural water from several rivers inoculated with Giardia and Cryptosporidium | In vitro conditions: 0.7 to 2.0 log reduction over 30 days at 20 °C; little reduction at 4 °C. In situ conditions: 0.4 to 1.5 log reduction at 21 days

Medema et al. (1997) conducted bench-scale studies of the influence of temperature and the presence of biological activity on the die-off rate of Cryptosporidium oocysts. Die-off rates were determined at 5°C and 15°C, and in both natural and sterilized (autoclaved) river water. Both excystation and vital dye staining were used to determine oocyst viability. At 5°C, the die-off rate under all conditions was 0.010 log10/day, assuming first-order kinetics. This translates to 0.5 log reduction at 50 days. At 15°C, the die-off rate in natural river water approximately doubled to 0.024 log10/day (excystation) and 0.018 log10/day (dye staining). However, in autoclaved water at 15°C, the die-off rate was only 0.006 log10/day (excystation) and 0.011 log10/day (dye staining). These results suggest that oocyst die-off is more rapid at higher temperatures in natural water, and this behavior may be caused by increased biological or biochemical activity.
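Under the first-order assumption used in these studies, the cumulative log reduction is simply the die-off rate multiplied by the elapsed days. A minimal sketch of that arithmetic (function names are ours; rate values taken from the Medema et al. results above):

```python
def log_reduction_from_dieoff(rate_log10_per_day: float, days: float) -> float:
    """Cumulative log10 reduction assuming first-order die-off kinetics."""
    return rate_log10_per_day * days

def days_to_reach(rate_log10_per_day: float, target_log: float) -> float:
    """Days needed to achieve a target log10 reduction at a constant rate."""
    return target_log / rate_log10_per_day

# Medema et al. (1997): 0.010 log10/day at 5 degrees C
print(round(log_reduction_from_dieoff(0.010, 50), 3))  # 0.5
print(round(days_to_reach(0.010, 0.5), 3))             # 50.0
```

At the 15°C natural-water rate of 0.024 log10/day, the same relation gives a 0.5 log reduction in roughly 21 days, consistent with the range shown in Table IV-9.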

Sattar et al. (1999) evaluated factors impacting Cryptosporidium and Giardia survival. Microtubes containing untreated water from the Grand and St. Lawrence rivers (Ontario) were inoculated with purified oocysts and cysts. Samples were incubated at temperatures ranging from 4°C to 30°C, and viability of oocysts and cysts was measured by excystation. At 20°C and 30°C, reductions in viable Cryptosporidium oocysts ranged from approximately 0.6 to 2.0 log after 30 days. However, relatively little inactivation took place when oocysts were incubated at 4°C (as low as 0.2 log at 100 days).

To evaluate oocyst survival under dynamic environmental conditions, Sattar et al. seeded dialysis cassettes with Cryptosporidium oocysts and placed them in overflow tanks receiving water from different rivers in Canada and the United States. Reductions in the concentration of viable oocysts ranged from approximately 0.4 to 1.5 log after 21 days. Survival of oocysts was enhanced by pre-filtering the water, suggesting that microbial antagonism was involved in the natural inactivation of the parasites.

Overall, these studies indicate that off-stream storage of raw water has the potential to effect significant reductions in the concentration of viable Cryptosporidium oocysts, both through sedimentation and degradation of oocysts (i.e., die-off). However, these data also illustrate the challenge in reliably estimating the amount of removal that will occur in any particular storage reservoir. Removal and die-off rates reported in these studies varied widely, and were observed to be influenced by factors like temperature, contamination, hydraulic short circuiting, and biological activity (Van Breemen et al. 1998, Medema et al. 1997, Sattar et al. 1999). Because of this variability and the relatively small amount of available data, it is difficult to extrapolate from these studies to develop nationally applicable criteria for awarding removal credits to raw water storage.

c. Request for comment. EPA requests comment on the finding that the available data are not adequate to support a presumptive Cryptosporidium treatment credit for off-stream raw water storage, and that systems using off-stream storage should conduct LT2ESWTR monitoring at the reservoir outlet. This monitoring approach would account for reductions in oocyst concentrations due to settling, but would not provide credit for die-off, since non-viable oocysts could still be counted during monitoring. In addition, EPA would also appreciate comment on the following specific issues:

  • Is additional information available that either supports or suggests modifications to this proposal concerning off-stream raw water storage?
  • How should a system address the concern that water in off-stream raw water storage reservoirs may become contaminated through processes like algal growth, run-off, roosting birds, and activities on the watershed?

5. Pre-Sedimentation With Coagulant

a. What is EPA proposing today? Presedimentation is a preliminary treatment process used to remove particulate material from the source water before the water enters primary sedimentation and filtration processes in a treatment plant. EPA is proposing to award a presumptive 0.5 log Cryptosporidium treatment credit for presedimentation that is installed after LT2ESWTR monitoring and meets the following three criteria:

(1) The presedimentation basin must be in continuous operation and must treat all of the flow reaching the treatment plant.

(2) The system must continuously add a coagulant to the presedimentation basin.

(3) The system must demonstrate on a monthly basis at least 0.5 log reduction of influent turbidity through the presedimentation process in at least 11 of the 12 previous consecutive months. This monthly demonstration of turbidity reduction must be based on the arithmetic mean of at least daily turbidity measurements in the presedimentation basin influent and effluent, and must be calculated as follows:

Monthly mean turbidity log reduction = log10(monthly mean of daily influent turbidity) − log10(monthly mean of daily effluent turbidity).

If the presedimentation process has not been in operation for 12 months, the system must verify on a monthly basis at least 0.5 log reduction of influent turbidity through the presedimentation process, calculated as specified in this paragraph, for at least all but any one of the months of operation.
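For illustration only (the function names and sample turbidity values below are ours, not part of the rule), the calculation in criterion (3) and the 11-of-12-months check can be sketched as:

```python
import math

def monthly_turbidity_log_reduction(daily_influent_ntu, daily_effluent_ntu):
    """Monthly mean turbidity log reduction per criterion (3):
    log10(monthly mean of daily influent turbidity)
    minus log10(monthly mean of daily effluent turbidity)."""
    mean_in = sum(daily_influent_ntu) / len(daily_influent_ntu)
    mean_out = sum(daily_effluent_ntu) / len(daily_effluent_ntu)
    return math.log10(mean_in) - math.log10(mean_out)

def meets_presedimentation_criterion(monthly_log_reductions):
    """True if at least 0.5 log reduction was demonstrated in all but at
    most one of the months supplied (11 of the previous 12 consecutive
    months, or all but one month if fewer than 12 are available)."""
    months = list(monthly_log_reductions)
    failures = sum(1 for r in months if r < 0.5)
    return failures <= 1

# Hypothetical month: influent averages 15 NTU, effluent 4.5 NTU
print(round(monthly_turbidity_log_reduction([10.0, 20.0], [4.0, 5.0]), 2))  # 0.52
print(meets_presedimentation_criterion([0.6] * 11 + [0.3]))       # True
print(meets_presedimentation_criterion([0.6] * 10 + [0.3] * 2))   # False
```

Note that the rule specifies the log of the monthly mean, not the mean of daily log reductions; the two are not interchangeable when turbidity varies day to day.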

Systems with presedimentation in place at the time they begin LT2ESWTR Cryptosporidium monitoring are not eligible for the 0.5 log presumptive credit and must sample after the basin when in use for the purpose of determining their bin assignment. The use of presedimentation during LT2ESWTR monitoring must be consistent with routine plant operation and must be recorded by the system. Guidance on monitoring is provided in the Public Water System Guidance Manual for Source Water Monitoring under the LT2ESWTR (USEPA 2003g), which is available in draft in the docket for today's proposal.

b. How was this proposal developed? Presedimentation is used to remove gravel, sand, and other gritty material from the raw water and dampen particle loading to the rest of the treatment plant. Presedimentation is similar to conventional sedimentation, except that presedimentation may be operated at higher loading rates and may not involve use of chemical coagulants. Also, some systems operate the presedimentation process periodically and only in response to periods of high particle loading.

Because presedimentation reduces particle concentrations, it is expected to reduce Cryptosporidium levels. In addition, by dampening variability in source water quality, presedimentation may improve the performance of subsequent treatment processes. In general, the efficacy of presedimentation in lowering particle levels is influenced by a number of water quality and treatment parameters including surface loading rate, temperature, particle concentration, coagulation, and characteristics of the sedimentation basin.

The Stage 2 M-DBP Agreement in Principle recommends 0.5 log presumptive Cryptosporidium treatment credit for presedimentation with the use of coagulant. Today's proposal is consistent with this recommendation. However, the proposed requirement for demonstrated turbidity reduction as a condition for presedimentation credit represents a change from the November 2001 pre-proposal draft of the LT2ESWTR (USEPA 2001g). Rather than a requirement for turbidity removal, the 2001 pre-proposal draft included criteria for maximum overflow rate and minimum influent turbidity as conditions for the 0.5 log presedimentation credit.

The Science Advisory Board (SAB) reviewed the criteria and supporting information for presedimentation credit in the November 2001 pre-proposal draft (see section VII.K). In written comments from a December 2001 meeting of the SAB Drinking Water Committee, the panel concluded that available data were minimal to support a 0.5 log presumptive credit and recommended that no credit be given for presedimentation. Additionally, the panel stated that performance criteria other than overflow rate need to be included if credit is to be given for presedimentation.

Due to this finding by the SAB, EPA further reviewed data on removal of aerobic spores (as an indicator of Cryptosporidium removal) and turbidity in full-scale presedimentation basins. As shown later in this section, these data indicate that presedimentation basins achieving a monthly mean reduction in turbidity of at least 0.5 log have a high likelihood of reducing mean Cryptosporidium levels by 0.5 log or more. Consequently, EPA has determined that it is appropriate to use turbidity reduction as a performance criterion for awarding Cryptosporidium treatment credit to presedimentation basins. The Agency believes this performance criterion addresses the concerns raised by the SAB.

The Agency has concluded that it is appropriate to limit eligibility for the 0.5 log presumptive Cryptosporidium treatment credit to systems that install presedimentation after LT2ESWTR monitoring. Systems with presedimentation in place prior to initiation of LT2ESWTR Cryptosporidium monitoring may sample after the presedimentation basin to determine their bin assignment. In this case, the effect of presedimentation in reducing Cryptosporidium levels will be reflected in the monitoring results and bin assignment. Systems that monitor after presedimentation are not subject to the operational and performance requirements associated with the 0.5 log credit. The SAB agreed that a system should be able to sample after the presedimentation treatment process for appropriate bin placement.

In considering criteria for awarding Cryptosporidium removal credit to presedimentation, EPA has evaluated both published studies and data submitted by water systems using presedimentation. There is relatively little published data on the removal of Cryptosporidium by presedimentation. Consequently, EPA has reviewed studies that investigated Cryptosporidium removal by conventional sedimentation basins. These studies are informative regarding potential levels of performance, the influence of water quality parameters, and correlation of Cryptosporidium removal with removal of potential surrogates. However, removal efficiency in conventional sedimentation basins may be greater than in presedimentation due to lower surface loading rates, higher coagulant doses, and other factors. To supplement these studies, EPA has evaluated data provided by utilities on removal of other types of particles, primarily aerobic spores, in the presedimentation processes of full scale plants. Data indicate that aerobic spores may serve as a surrogate for Cryptosporidium removal by sedimentation (Dugan et al. 2001).

i. Published studies of Cryptosporidium removal by conventional sedimentation basins. Table IV-10 summarizes results from published studies of Cryptosporidium removal by conventional sedimentation basins.

Table IV-10.—Summary of Published Studies of Cryptosporidium Removal by Conventional Sedimentation Basins

Author(s) | Plant/process type | Cryptosporidium removal by sedimentation
Dugan et al. (2001) | Pilot-scale conventional | 0.6 to 1.6 log (average 1.3 log)
States et al. (1997) | Full-scale conventional with primary and secondary sedimentation | 0.41 log
Edzwald and Kelly (1998) | Bench-scale sedimentation | 0.8 to 1.2 log
Payment and Franco (1993) | Full-scale conventional (2 plants) | 3.8 log and 0.7 log
Kelley et al. (1995) | Full-scale conventional (two-stage lime softening) | 0.8 log
Kelley et al. (1995) | Full-scale conventional (two-stage sedimentation) | 0.5 log
Patania et al. (1995) | Pilot-scale conventional (3 plants) | 2.0 log (median)

Dugan et al. (2001) evaluated the ability of conventional treatment to control Cryptosporidium under different water quality and treatment conditions on a small pilot scale plant that had been demonstrated to provide equivalent performance to a larger plant. Under optimal coagulation conditions, oocyst removal across the sedimentation basin ranged from 0.6 to 1.6 log, averaging 1.3 log. Suboptimal coagulation conditions (underdosed relative to jar test predictions) significantly reduced plant performance with oocyst removal in the sedimentation basin averaging 0.20 log. Removal of aerobic spores, total particle counts, and turbidity all correlated well with removal of Cryptosporidium by sedimentation.

States et al. (1997) monitored Cryptosporidium removal at the Pittsburgh Drinking Water Treatment Plant (65-70 million gallons per day (MGD)). The clarification process included ferric chloride coagulation, flocculation, and settling in both a small primary basin and a 120 MG secondary sedimentation basin. Geometric mean Cryptosporidium levels in the raw and settled water were 31 and 12 oocysts/100 L, respectively, indicating a mean reduction of 0.41 log.

Edzwald and Kelly (1998) conducted a bench-scale study to determine the optimal coagulation conditions with different coagulants for removing Cryptosporidium oocysts from spiked raw waters. Under optimal coagulation conditions, the authors observed oocysts reductions through sedimentation ranging from 0.8 to 1.2 log.

Payment and Franco (1993) measured Cryptosporidium and other microorganisms in raw, settled, and filtered water samples from drinking water treatment plants in the Montreal area. The geometric mean of raw and settled water Cryptosporidium levels in one plant were 742 and 0.12 oocysts/100 L, respectively, suggesting a mean removal of 3.8 log. In a second plant, mean removal by sedimentation was reported as 0.7 log, with raw and settled water Cryptosporidium levels reported as 2 and 0.2 oocysts/L, respectively.

Kelley et al. (1995) monitored Cryptosporidium levels in the raw, settled, and filtered water of two water treatment plants (designated site A and B). Both plants included two-stage sedimentation. At site A, mean raw and settled water Cryptosporidium levels were 60 and 9.5 oocysts/100 L, respectively, suggesting a mean removal of 0.8 log by sedimentation. At site B, mean raw and settled water Cryptosporidium levels were 53 and 16 oocysts/100 L, respectively, for an average removal by sedimentation of 0.5 log. Well water was intermittently blended in the second stage of sedimentation at site B, which may have reduced settled and filtered water pathogen levels.

Patania et al. (1995) evaluated removal of Cryptosporidium in four pilot scale plants. Three of these were conventional and one used in-line filtration (rapid mix followed by filtration). Cryptosporidium removal was generally 1.4 to 1.8 log higher in the process trains with sedimentation compared to in-line filtration. While the effectiveness of sedimentation for organism removal varied widely under the conditions tested, the median removal of Cryptosporidium by sedimentation was approximately 2.0 log.

ii. Data supplied by utilities on the removal of spores by presedimentation. Data on the removal of Cryptosporidium and spores (Bacillus subtilis and total aerobic spores) during operation of full-scale presedimentation basins were collected independently and reported by three utilities: St. Louis, MO; Kansas City, MO; and Cincinnati, OH. Cryptosporidium oocysts were not detected in raw water at these locations at levels sufficient to calculate log removals of oocysts directly. However, aerobic spores were present in the raw water of these utilities at high enough concentrations to measure log removals through presedimentation as a surrogate for Cryptosporidium removal. As noted earlier, data from Dugan et al. (2001) demonstrate a correlation between removal of aerobic spores and Cryptosporidium through sedimentation under optimal coagulation conditions. A summary of the spore removal data supplied by these utilities is shown in Table IV-11.

Table IV-11.—Mean Spore Removal for Full-Scale Presedimentation Basins Reported by Three Utilities

Reporting utility                     | Mean spore removal
St. Louis Water Division              | 1.1 log (B. subtilis)
Kansas City Water Services Department | 0.9 log (B. subtilis) (with coagulant)
                                      | 0.46 log (B. subtilis) (without coagulant)
Cincinnati Water Works                | 0.6 log (total aerobic spores)

The St. Louis Water Division operates four presedimentation basins at one facility. Coagulant addition prior to presedimentation includes polymer and occasional dosages of ferric sulfate. Bacillus subtilis spore samples were collected from June 1998 to September 2000. Reported mean spore concentrations in the raw water and following presedimentation were 108,326 and 8,132 cfu/100 mL, respectively, showing an average removal of 1.1 log by presedimentation.

The Kansas City Water Services Department collected Bacillus subtilis spore samples from January to November 2000 from locations before and after one of the facility's six presedimentation basins. Sludge generated by the primary clarifier of a softening process was recycled to the head of the presedimentation basins during the entire study period. In addition, coagulant (polymer and/or ferric sulfate) was added prior to presedimentation when raw water turbidity was higher. During periods when coagulant was added, mean spore levels before and after presedimentation were 102,292 and 13,154 cfu/100 mL, respectively, demonstrating a mean removal of 0.9 log. When no ferric sulfate or polymer was used, mean presedimentation influent and effluent spore levels were 13,296 and 4,609 cfu/100 mL, respectively, for an average reduction of 0.46 log.

The Cincinnati Water Works operates a treatment plant using lamella plate settlers for presedimentation. Lamella plate settlers are inclined plates added to a sedimentation basin to significantly increase the surface area available for particle settling. Coagulant (alum and polymer) is added to the raw water prior to presedimentation. Total aerobic spore samples were collected from January 1998 through December 2000. The mean concentration of spores decreased from 20,494 cfu/100 mL in the raw water to 4,693 cfu/100 mL in the presedimentation effluent, indicating a mean spore removal of 0.64 log.

In conclusion, literature studies clearly establish that sedimentation basins are capable of achieving greater than 0.5 log reduction in Cryptosporidium levels. Further, the data supplied by utilities on reduction in aerobic spore counts across full scale presedimentation basins demonstrate that presedimentation can achieve mean reductions of greater than 0.5 log under routine operating conditions and over an extended time period. Thus, these data suggest that a 0.5 log presumptive credit for Cryptosporidium removal by presedimentation is appropriate under certain conditions.

With respect to the conditions under which the 0.5 log presumptive credit for presedimentation is appropriate, the data do not demonstrate that this level of removal can be achieved consistently without a coagulant. In addition, available data do not establish aerobic spores as an effective indicator of Cryptosporidium removal in the absence of a coagulant. Thus, supporting data are consistent with a requirement that systems apply a coagulant to be eligible for the presumptive 0.5 log presedimentation credit. Moreover, such a requirement is consistent with the Agreement in Principle, which recommends 0.5 log credit for presedimentation basins with a coagulant.

EPA also has concluded that presedimentation basins need to be operated continuously and treat 100% of the plant flow in order to reasonably ensure that the process will reduce influent Cryptosporidium levels by at least 0.5 log over the course of a full year. The Agency recognizes that, depending on influent water quality, some systems may determine it is more prudent to operate presedimentation basins intermittently in response to fluctuating turbidity levels. By proposing these conditions for the presumptive presedimentation credit, EPA is not recommending against intermittent operation of presedimentation basins. Rather, EPA is attempting to identify the conditions under which a 0.5 log presumptive credit for presedimentation is warranted.

In response to the SAB panel recommendation that performance criteria other than overflow rate be included if credit is to be given for presedimentation, EPA analyzed the relationship between removal of spores and reduction in turbidity through presedimentation for the three utilities that supplied these data. Results of this analysis are summarized in Table IV-12, which shows the relationship between monthly mean turbidity reduction and the percent of months when mean spore removal was at least 0.5 log.

[Table IV-12: graphic not available; view image of printed page]

Within the available data set, achieving a mean turbidity reduction of at least 0.5 log appears to provide approximately a 90% assurance that average spore removal will be 0.5 log or greater. The underlying data are shown graphically in Figure IV-4. Based on this information, EPA has concluded that it is appropriate to require 0.5 log turbidity reduction, determined as a monthly mean of daily turbidity readings, as an operating condition for the 0.5 log presumptive Cryptosporidium treatment credit for presedimentation. Further, EPA is proposing that systems must meet the 0.5 log turbidity reduction requirement in at least 11 of the 12 previous months on an ongoing basis to remain eligible for the presedimentation credit.
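The ongoing eligibility test described above can be sketched as follows. Computing each monthly mean as the average of daily log10 turbidity reductions is one plausible reading of the rule text, not a method the proposal prescribes in detail; systems should confirm the calculation with their State.

```python
import math

def monthly_log_reduction(daily_influent_ntu, daily_effluent_ntu):
    """Monthly mean of daily log10 turbidity reductions across
    presedimentation (assumed interpretation of the rule text)."""
    daily = [math.log10(i / e) for i, e in zip(daily_influent_ntu, daily_effluent_ntu)]
    return sum(daily) / len(daily)

def eligible_for_credit(monthly_means, threshold=0.5, required=11):
    """True if at least `required` of the previous 12 monthly mean
    reductions meet the 0.5 log threshold."""
    last12 = monthly_means[-12:]
    return sum(m >= threshold for m in last12) >= required

# 11 of the last 12 months at 0.6 log, one month at 0.3 log
print(eligible_for_credit([0.6] * 11 + [0.3]))  # → True
```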

[Figure IV-4: graphic not available; view image of printed page]

c. Request for comment. EPA requests comment on the proposed criteria for awarding credit to presedimentation. EPA would particularly appreciate comment on the following issues:

  • Whether the information cited in this proposal supports the proposed credit for presedimentation and the operating conditions under which the credit will be awarded;
  • Additional information that either supports or suggests modifications to the proposed performance criteria and presumptive credit;
  • Today's proposal requires systems using presedimentation to sample after the presedimentation basin, and these systems are not eligible to receive additional presumptive Cryptosporidium removal credit for presedimentation. However, systems are also required to collect samples prior to chemical treatment, and EPA recognizes that some plants provide chemical treatment to water prior to, or during, presedimentation. EPA requests comment on how this situation should be handled under the LT2ESWTR.
  • Whether and under what conditions factors like low turbidity raw water, infrequent sludge removal, and wind would make compliance with the 0.5 log turbidity removal requirement infeasible.

6. Bank Filtration

a. What is EPA proposing today? EPA is proposing to award additional Cryptosporidium treatment credit (0.5 or 1.0 log) to systems that implement bank filtration as a pre-treatment technique meeting the design criteria specified in this section. To be eligible for credit as a pre-treatment technique, bank filtration collection devices must meet the following criteria:

  • Wells are drilled in an unconsolidated, predominantly sandy aquifer, as determined by grain-size analysis of recovered core material—the recovered core must contain greater than 10% fine-grained material (grains less than 1.0 mm diameter) in at least 90% of its length;
  • Wells are located at least 25 feet (in any direction) from the surface water source to be eligible for 0.5 log credit; wells located at least 50 feet from the surface water source are eligible for 1.0 log credit;
  • The wellhead must be continuously monitored for turbidity to ensure that no system failure is occurring. If the monthly average of daily maximum turbidity values exceeds 1 NTU then the system must report this finding to the State. The system must also conduct an assessment to determine the cause of the high turbidity levels in the well and consult with the State regarding whether previously allowed credit is still appropriate.

Systems using existing bank filtration as pretreatment to a filtration plant at the time the systems are required to conduct Cryptosporidium monitoring, as described in section IV.A, must sample the well effluent for the purpose of determining bin classification. Where bin classification is based on monitoring the well effluent, systems are not eligible to receive additional credit for bank filtration. In these cases, the performance of the bank filtration process in reducing Cryptosporidium levels will be reflected in the monitoring results and bin classification.

Systems using bank filtered water without additional filtration typically must collect source water samples in the surface water (i.e., prior to bank filtration) to determine bin classification. This applies to systems using bank filtration to meet the Cryptosporidium removal requirements of the IESWTR or LT1ESWTR under the provisions for alternative filtration demonstration in 40 CFR 141.173(b) or 141.552(a). Note that the proposed bank filtration criteria for Cryptosporidium removal credit under the LT2ESWTR do not apply to existing State actions to provide alternative filtration Cryptosporidium removal credit for IESWTR or LT1ESWTR compliance.

In the case of systems that use GWUDI sources without additional filtration and that meet all the criteria for avoiding filtration in 40 CFR 141.71, samples must be collected from the ground water (e.g., the well). Further, such systems must comply with the requirements of the LT2ESWTR that apply to unfiltered systems, as described in section IV.B.

b. How was this proposal developed? This section describes the bank filtration treatment process, provides more detail on the aquifer types and ground water collection devices that are eligible for bank filtration credit, and describes the data supporting the proposed requirements.

Bank filtration is a water treatment process that makes use of surface water that has naturally infiltrated into ground water via the river bed or bank(s) and is recovered via a pumping well. Stream-bed infiltration is typically enhanced by the pumping action of near-stream wells (e.g., water supply, irrigation). Bank filtrate is water from a nearby surface water source that has traveled through the subsurface to a pumping well, either vertically, horizontally, or both, mixing to some degree with other ground water along the way. Through bank filtration, microorganisms and other particles are removed by contact with the aquifer materials.

The bank filtration removal process performs most efficiently when the aquifer is comprised of granular materials with open pore space for water flow around the grains. In these granular porous aquifers, the flow path is meandering, thereby providing ample opportunity for organisms to come into contact with and attach to a grain surface. Although detachment can occur, it typically occurs at a very slow rate, so that organisms remain attached to a grain for long periods. When ground water travel times from source water to well are long, or when little or no detachment occurs, most organisms will become inactivated before they can enter a well. Thus, bank filtration relies primarily on removal, but also, in some cases, on inactivation to protect wells from pathogen contamination.

Only Wells Located in Unconsolidated, Predominantly Sandy Aquifers Are Eligible

Only granular aquifers are eligible for bank filtration credit. Granular aquifers are those comprised of sand, clay, silt, rock fragments, pebbles or larger particles and minor cement. The aquifer material is required to be unconsolidated, with subsurface samples friable upon touch. Uncemented granular aquifers are typically formed by alluvial or glacial processes. Such aquifers are usually identified on a detailed geologic map (e.g., labeled as Quaternary alluvium).

Under today's proposal, a system seeking Cryptosporidium removal credit must characterize the aquifer at the well site to determine aquifer properties. At a minimum, the aquifer characterization must include the collection of relatively undisturbed, continuous, core samples from the surface to a depth equal to the bottom of the well screen. The proposed site must have substantial core recovery during drilling operations; specifically, the recovered core length must be at least 90% of the total projected depth to the well screen.

Samples of the recovered core must be submitted to a laboratory for sieve analysis to determine grain size distribution over the entire recovered core length. Each sieve sample must be acquired at regular intervals over the length of the recovered core, with one sample representing a composite of each two feet of recovered core. A two-foot sampling interval reflects the necessity to sample the core frequently without imposing an undue burden. Because wells are anticipated to range from 50 to 100 feet in depth, a two-foot sampling interval will result in about 25 to 50 samples for analysis. Each sampled interval must be examined to determine if more than ten percent of the grains in that interval are less than 1.0 mm in diameter (#18 sieve size). In the U.S. Department of Agriculture soil classification system, the #18 sieve separates very coarse sands from coarse sands. The lengths of core (based on the samples from two-foot intervals) with more than ten percent of the grains less than 1.0 mm in diameter must be summed to determine the overall core length with sufficient fine-grained material to provide adequate removal. An aquifer is eligible for removal credit if at least 90% of the sampled core length contains sufficient fine-grained material as defined in this section.
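Because every composite sample represents the same two feet of core, the length-weighted eligibility screen described above reduces to a simple count of qualifying intervals. A hypothetical sketch (the 10% fines and 90% coverage thresholds are from the proposal; the example fractions are invented):

```python
def aquifer_eligible(fines_fraction_by_interval, min_fines=0.10, min_coverage=0.90):
    """Each entry is the fraction of grains < 1.0 mm (#18 sieve) in one
    two-foot composite interval of recovered core. The site qualifies when
    at least 90% of the sampled core length has more than 10% fines; with
    equal-length intervals this is just a count."""
    qualifying = sum(f > min_fines for f in fines_fraction_by_interval)
    return qualifying / len(fines_fraction_by_interval) >= min_coverage

# 25 intervals (about 50 ft of core), 24 with >10% fines: 96% coverage
print(aquifer_eligible([0.25] * 24 + [0.05]))  # → True
```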

Cryptosporidium oocysts have a natural affinity for attaching to fine-grained material. A study of oocyst removal in sand columns shows greater oocyst removal in finer-grained sands than in coarser-grained sands (Harter et al. 2000). The core sampling procedure described in this section is designed to measure the proportion of fine-grained sands (grains less than 1.0 mm in diameter) so as to ensure that a potential bank filtration site is capable of retarding transport (or removing) oocysts during ground water flow from the source surface water to the water supply well. The value of 1.0 mm for the bounding size of the sand grains was determined based on calculations performed by Harter using data from Harter et al. (2000). Harter showed that, for ground water velocities typical of a bank filtration site (1.5 to 15 m/day), a typical bank filtration site composed of grains with a diameter of 1.0 mm would achieve at least 1.0 log removal over a 50 foot transport distance. Larger-sized grains would achieve less removal, all other factors being equal.

Alluvial and glacial aquifers are complex mixtures of sand, gravel and other sized particles. Particles of similar size are often grouped together in the subsurface, due to sorting by flowing water that carries and then deposits the particles. Where there exists significant thickness of coarse-grained particles, such as gravels, with few finer materials, there is limited opportunity for oocyst removal. When the total gravel thickness, as measured in a core, exceeds 10%, it is more likely (based on analysis of ground water flow within mixtures containing differing-sized grains) that the gravel-rich intervals are interconnected. Interconnected gravel can form a continuous, preferential flow path from the source surface water to the water supply well. Where such preferential flow paths exist, a preponderance of the total ground water flow occurs within the preferential flow path, ground water velocity is higher, and natural filtration is minimal. A proposed bank filtration site is acceptable if at least 90% of the core length contains grains with sufficient fine-grained material (diameter less than 1.0 mm); that is, it is acceptable if the core contains less than 10% gravel-rich intervals.

Aquifer materials with significant fracturing are capable of transmitting ground water at high velocity in a direct flow path with little time or opportunity for die-off or removal of microbial pathogens. Consolidated aquifers, fractured bedrock, and karst limestone are aquifers in which surface water may enter into a pumping well by flow along a fracture, a solution-enhanced fracture conduit, or other preferential pathway. Microbial pathogens found in surface water are more likely to be transported to a well via these direct or preferential pathways. Cryptosporidium outbreaks have been associated with consolidated aquifers, such as a fractured chalk aquifer (Willocks et al. 1998) or a karst limestone (solution-enhanced fractured) aquifer (Bergmire-Sweat et al. 1999). These outbreaks show that the oocyst removal performance of consolidated aquifers is undermined by preferential water flow and oocyst transport through rock fractures or through rock dissolution zones. Wells located in these aquifers are not eligible for bank filtration credit because the flow paths are direct and the average ground water velocity is high, so that little inactivation or removal would be expected. Therefore, only unconsolidated aquifers are eligible for bank filtration oocyst removal credit.

A number of devices are used for the collection of ground water including horizontal and vertical wells, spring boxes, and infiltration galleries. Among these, only horizontal and vertical wells are eligible for log removal credit. The following discussion presents characteristics of ground water collection devices and the basis for this proposed requirement.

Horizontal wells are designed to capture large volumes of surface water recharge. They typically are constructed by the excavation of a central vertical caisson with laterals that extend horizontally from the caisson bottom in all directions or only under the riverbed. Horizontal wells are usually shallower than vertical wells because of the construction expense. Ground water flow to a horizontal well that extends under surface water is predominantly downward. In contrast, ground water flow to a vertical well adjacent to surface water may be predominantly in the horizontal direction. Surface water may have a short ground water flow path to a horizontal well if the well extends out beyond the bank.

Hancock et al. (1998) analyzed samples from eleven horizontal wells and found Cryptosporidium, Giardia or both in samples from five of those wells. These data suggest that some horizontal wells may not be capable of achieving effective Cryptosporidium removal by bank filtration. Insufficient data are currently available to suggest that horizontal well distances from surface water should be greater than distances established for vertical wells. Two ongoing studies in Wyoming (Clancy Environmental Consultants 2002) and Nebraska (Rice 2002) are collecting data at horizontal well sites.

A spring box is located at the ground surface and is designed to contain spring outflow and protect it from surface contamination until the water is utilized. Spring boxes are typically located where natural processes have enhanced and focused ground water discharge into a smaller area and at a faster volumetric flow rate than elsewhere (i.e., a spring). Often, localized fracturing or solution enhanced channels are the cause of the focused discharge to the spring orifice. Fractures and solution channels have significant potential to transport microbial contaminants so that natural filtration may be poor. Thus, spring boxes are not proposed to be eligible for bank filtration credit.

Cryptosporidium monitoring results (Hancock et al. 1998) and outbreaks are used to evaluate ground water collection devices. Hancock et al. sampled thirty-five springs for Cryptosporidium oocysts and Giardia cysts. Most springs were used as drinking water sources, and sampling was conducted to determine if the spring should be considered a GWUDI source. Cryptosporidium oocysts were found in seven springs; Giardia cysts were found in five springs; and either oocysts or cysts were found in nine springs (26%). A waterborne cryptosporidiosis outbreak in Medford, Oregon (Craun et al. 1998) was associated with a spring water supply collection device. Also, a more recent, smaller outbreak of giardiasis in an Oregon campground was associated with a PWS using a spring. The high percentage of springs contaminated with pathogenic protozoa, the association with recent outbreaks, and an apparent lack of bank filtration capability indicate that spring boxes must not be eligible for bank filtration credit.

An infiltration gallery (or filter crib) is typically a slotted pipe installed horizontally into a trench and backfilled with granular material. The gallery is designed to collect water infiltrating from the surface or to intercept ground water flowing naturally toward the surface water (Symons et al. 2000). In some treatment plants, surface water is transported to a point above an infiltration gallery and then allowed to infiltrate. The infiltration rate may be manipulated by varying the properties of the backfill or the nature of the soil-water interface. Because the filtration properties of the material overlying an infiltration gallery may be designed or purposefully altered to optimize oocyst removal or for other reasons, this engineered system is not bank filtration, which relies solely on the natural properties of the system.

A 1992 cryptosporidiosis outbreak in Talent, Oregon was associated with poor performance of an infiltration gallery underneath Bear Creek (Leland et al. 1993). In this case, the ground water-surface water interface and the engineered materials beneath it did not sufficiently reduce the high oocyst concentration present in the source water. The association of an infiltration gallery with an outbreak, a design that relies on engineered materials rather than the filtration properties of natural filtration media, and the shallow depth of constructed infiltration galleries (typically not located more than 25 feet from the surface and from surface water recharge) all indicate that infiltration galleries must not be eligible for bank filtration credit.

EPA notes that under the demonstration of performance credit described in section IV.C.17, States may consider awarding Cryptosporidium removal credit to infiltration galleries where the State determines, based on site-specific testing with a State-approved protocol, that such credit is appropriate (i.e., that the process reliably achieves a specified level of Cryptosporidium removal on a continuing basis).

Wells Located 25 Feet From the Surface Water Source Are Eligible for 0.5 Log Credit; Wells Located 50 Feet From the Surface Water Source Are Eligible for 1.0 Log Credit

A vertical or horizontal well located adjacent to a surface water body is eligible for bank filtration credit if there is sufficient ground water flow path length to effectively remove oocysts. For vertical wells, the wellhead must be located at least 25 horizontal feet from the surface water body for 0.5 log Cryptosporidium removal credit and at least 50 horizontal feet from the surface water body for 1.0 log Cryptosporidium removal credit. For horizontal wells, the laterals must be located at least 25 feet distant from the normal-flow surface water riverbed for 0.5 log Cryptosporidium removal credit and at least 50 feet distant from the normal-flow surface water riverbed for 1.0 log Cryptosporidium removal credit.

The ground water flow path to a vertical well is the measured distance from the edge of the surface water body, under high flow conditions (determined by the mapped extent of the 100 year floodplain elevation boundary or floodway, as defined in Federal Emergency Management Agency (FEMA) flood hazard maps), to the wellhead. The ground water flow path to a horizontal well is the measured distance from the bed of the river under normal flow conditions to the closest horizontal well lateral.

The floodway is defined by FEMA as the area of the flood plain where the water is likely to be deepest and fastest. The floodway is shown on FEMA digital maps (known as Q3 flood data maps), which are available for 11,990 communities representing 1,293 counties in the United States. Systems may identify the distance to surface water using either the 100 year return period flood elevation boundary or by determining the floodway boundary using methods similar to those used in preparing FEMA flood hazard maps. The 100 year return period flood elevation boundary is expected to be wider than the floodway but that difference may vary depending on local conditions. Approximately 19,200 communities in the United States have flood hazard maps that show the 100 year return period flood elevation boundary. If local FEMA floodway hazard maps are unavailable or do not show the 100 year flood elevation boundary, then the utility must determine either the floodway or 100 year flood elevation boundary.

The separation distance proposed for Cryptosporidium removal credit is based, in part, on measured data for the removal of oocyst surrogate biota in full-scale field studies. A variety of surrogate and indicator organisms were analyzed in each study evaluated for today's proposal. However, only two non-pathogenic organisms, anaerobic clostridia spores and aerobic endospores, are resistant to inactivation in the subsurface, approximately similar in size and shape to oocysts, and sufficiently ubiquitous in both surface water and ground water so that log removal can be calculated during passage across the surface water—ground water interface and during transport within the aquifer.

Anaerobic spores are typically estimated at about 0.3-0.4 μm in diameter as compared with 4-6 μm for oocysts. Aerobic spores, such as endospores of the bacterium Bacillus subtilis, are slightly larger than anaerobic spores, typically 0.5 × 1.0 × 2.0 μm in size (Rice et al. 1996). Experiments conducted by injecting Bacillus subtilis spores into a gravel aquifer show that they can be very mobile in the subsurface environment (Pang et al. 1998). As presented in the following discussion, available data indicate similar removal of both aerobic and anaerobic spores, either during passage across the surface water—ground water interface or during ground water flow. These data suggest that anaerobic spores, like aerobic spores, may be suitable surrogate measures of Cryptosporidium removal by bank filtration.

Available data establish that during bank filtration, significant removal of anaerobic and aerobic spores can occur during passage across the surface water-ground water interface, with lesser removal occurring during ground water transport within the aquifer away from that interface. The ground water-surface water interface is typically comprised of finer grained material that lines the bottom of the riverbed, and the thickness of this interface is small, generally a few inches to a foot. The proposed design criteria of 25 and 50 feet for 0.5 and 1.0 log Cryptosporidium removal credit, respectively, are based on EPA's analysis of pathogen and surrogate monitoring data from bank filtration sites. Most of these data are from studies of aquifers developed in Dutch North Sea margin sand dune fields and, therefore, represent optimal removal conditions consistent with a homogeneous, well sorted (by wind), uniform sand filter.

Medema et al. (2000) measured 3.3 log removal of anaerobic spores during transport over a 13 m distance from the Meuse River into adjacent ground water. Arora et al. (2000) measured greater than 2.0 log removal of anaerobic spores during transport from the Wabash River to a horizontal collector well. Havelaar et al. (1995) measured 3.1 log removal of anaerobic spores during transport over a 30 m distance from the Rhine River to a well and 3.6 log removal over a 25 m distance from the Meuse River to a well. Schijven et al. (1998) measured 1.9 log removal of anaerobic spores over a 2 m distance from a canal to a monitoring well. Using aerobic spores, Wang et al. (2001) measured 1.8 log removal over a 2 foot distance from the Ohio river to a monitoring well beneath the river.

During transport solely within shallow ground water (i.e., not including removal across the surface water-ground water interface), Medema et al. (2000) measured approximately 0.6 log removal of anaerobic spores over a distance of 39 feet. Using aerobic spores, Wang et al. (2001) measured 1.0 log removal of aerobic spores over a 48 foot distance from a monitoring well beneath a river to a horizontal well lateral.

At distances relatively far from an injection well in a deep, anaerobic aquifer, thereby minimizing the effects of injection, Schijven et al. measured negligible removal of anaerobic spores over a 30 m distance. However, few bank filtration systems occur in deeper, anaerobic ground water so these data may not apply to a typical bank filtration system in the United States.

These data demonstrate that during normal and low surface water elevations, the surface water-ground water interface performs effectively to remove microbial contamination. However, there will typically be high water elevation periods during the year, especially on uncontrolled rivers, that alter the nature and performance of the interface due to flood scour, typically for short periods. During these periods, lower removals would be expected to occur.

Averaging Cryptosporidium oocyst removal over the period of a year requires consideration of both high and low removal periods. During most of the year, high log removal rates would be expected to predominate (e.g., 3.3 log removal over 42 feet) due to the removal achieved during passage across the surface water-ground water interface. During short periods of flooding, substantially lower removal rates may occur (e.g., 0.5 log removal over 39 feet) due to scouring of the riverbed and removal of the protective, fine-grained material. By considering all time intervals with differing removal rates over the period of a year, EPA is proposing that 0.5 log removal over 25 feet (8 m) and 1.0 log removal over 50 feet (16 m) are reasonable estimates of the average performance of a bank filtration system over a year. This proposal is generally supported by colloidal filtration theory modeling results using data characteristic of the aquifers in Louisville and Cincinnati and column studies of oocyst transport in sand (Harter et al. 2000).
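One way to see why short flood periods dominate the annual average is to average the fraction of oocysts passed, rather than the log removals themselves. The sketch below assumes a constant influent concentration and uses illustrative period durations; it is not a calculation prescribed by the rule.

```python
import math

def annual_effective_log_removal(periods):
    """periods: list of (days, log_removal) tuples covering the year.
    Averages the fraction passed (10**-LR) over time, so brief
    low-removal flood periods dominate the result. Illustrative only;
    assumes constant influent concentration."""
    total_days = sum(d for d, _ in periods)
    passed = sum(d * 10 ** (-lr) for d, lr in periods) / total_days
    return -math.log10(passed)

# e.g., 355 days at 3.3 log removal, 10 days of flood scour at 0.5 log
print(round(annual_effective_log_removal([(355, 3.3), (10, 0.5)]), 2))  # → 2.04
```

Even though high removal prevails for 97% of the year in this invented example, the annual effective removal is closer to the flood-period value, consistent with the conservative 0.5 and 1.0 log credits proposed above.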

Wells Must Be Continuously Monitored for Turbidity

Under the Surface Water Treatment Rule (40 CFR 141.73(b)(1)) the turbidity level of slow sand filtered water must be 1 NTU or less in 95% of the measurements taken each month. Turbidity sampling is required once every four hours, but may be reduced to once per day under certain conditions. Although slow sand filtration is not bank filtration, similar pathogen removal mechanisms are expected to occur in both processes. Just as turbidity monitoring is used to provide assurance that the removal credit assigned to a slow sand filter is being realized, EPA is proposing continuous turbidity monitoring for all bank filtration wells that receive credit.

If monthly average turbidity levels (based on daily maximum values in the well) exceed 1 NTU, the system is required to report to the State and present an assessment of whether microbial removal has been compromised. If the State determines that microbial removal has been compromised, the system must not receive credit for bank filtration until the problem has been remediated. The turbidity performance requirement for bank filtration is less strict than that for slow sand filtration because, unlike slow sand filtration, bank filtration is a pre-treatment technique followed by conventional or direct filtration.
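The reporting trigger described above (monthly average of the daily maximum turbidity values in the well exceeding 1 NTU) can be sketched as a simple check. This is an illustrative sketch only; the function name and the sample readings are hypothetical.

```python
def bank_filtration_report_required(daily_max_ntu):
    """Return True if the monthly average of daily maximum turbidity
    readings exceeds 1 NTU, triggering a report to the State."""
    monthly_avg = sum(daily_max_ntu) / len(daily_max_ntu)
    return monthly_avg > 1.0

# Hypothetical month of daily maxima: mostly low, one flood spike.
readings = [0.4] * 29 + [19.0]
flag = bank_filtration_report_required(readings)  # mean ≈ 1.02 NTU → True
```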

[Graphic not available; view image of printed page]

In summary, EPA believes that the measured full-scale field data from operating bank filtration systems, the turbidity monitoring provision, and the design criteria for aquifer material, collection device type, and setback distance, together provide assurance that the presumptive log removal credit will be achieved by bank filtration systems that conform to the requirements in today's proposal.

c. Request for comment. The Agency requests comment on the following issues concerning bank filtration:

  • The performance of bank filtration in removing Cryptosporidium or surrogates to date at sites currently using this technology (e.g., sites with horizontal wells).
  • The use of other methods (e.g., geophysical methods such as ground penetrating radar) to complement or supplant core drilling to determine site suitability for bank filtration credit.
  • The number of GWUDI systems in each State (i.e., the number of systems having at least one GWUDI source) where bank filtration has been utilized as the primary filtration barrier (e.g., no other physical removal technologies follow); also, the method that was used by the State to determine that each system was achieving 2 log removal of Cryptosporidium.
  • For GWUDI systems where natural or alternative filtration (e.g., bank filtration or artificial recharge) is used in combination with a subsequent filtration barrier (e.g., bag or cartridge filters) to meet the 2 log Cryptosporidium removal requirement of the IESWTR or LT1ESWTR, how much Cryptosporidium removal credit has the State awarded (or would the State be willing to grant if the bags/cartridges were found to be achieving 2.0 log) for the natural or alternative filtration process, and how did the State determine this value?
  • The proposed Cryptosporidium removal credit and associated design criteria, including any additional information related to this topic.
  • Suitable separation distance(s) to be required between vertical or horizontal wells and adjacent surface water.
  • Testing protocols and procedures for making site specific determinations of the appropriate level of Cryptosporidium removal credit to award to bank filtration processes.
  • Information on the data and methods suitable for predicting Cryptosporidium removal based on the available data from surrogate and indicator measurements in water collection devices.
  • The applicability of turbidity monitoring or other process monitoring procedures to indicate the ongoing performance of bank filtration processes.

7. Lime Softening

a. What is EPA proposing today? Lime softening is a drinking water treatment process that uses precipitation with lime and other chemicals to reduce hardness and enhance clarification prior to filtration. Lime softening can be categorized into two general types: (1) Single-stage softening, which is used to remove calcium hardness and (2) two-stage softening, which is used to remove magnesium hardness and greater levels of calcium hardness. A single-stage softening plant includes a primary clarifier and filtration components. A two-stage softening plant also includes a secondary clarifier located between the primary clarifier and filter. In some two-stage softening plants, a portion of the flow bypasses the first clarifier.

EPA has determined that lime softening plants in compliance with IESWTR or LT1ESWTR achieve a level of Cryptosporidium removal equivalent to conventional treatment plants (i.e., average of 3 log). Consequently, lime softening plants that are placed in Bins 2-4 as a result of Cryptosporidium monitoring incur the same additional treatment requirements as conventional plants. However, EPA is proposing that two-stage softening plants be eligible for an additional 0.5 log Cryptosporidium treatment credit. To receive the 0.5 log credit, the plant must have a second clarification stage between the primary clarifier and filter that is operated continuously, and both clarification stages must treat 100% of the plant flow. In addition, a coagulant must be present in both clarifiers (may include metal salts, polymers, lime, or magnesium precipitation).

b. How was this proposal developed? The lime softening process is used to remove hardness, primarily calcium and magnesium, through chemical precipitation followed by sedimentation and filtration. The addition of lime increases pH, causing the metal ions to precipitate. Other contaminants can coalesce with the precipitates and be removed in the subsequent settling and filtration processes. While elevated pH has been shown to inactivate some microorganisms like viruses (Battigelli and Sobsey 1993; Logsdon et al. 1994), current research indicates that Cryptosporidium and Giardia are not inactivated by high pH (Logsdon et al. 1994; Li et al. 2001). A two-stage lime softening plant has the potential for additional Cryptosporidium removal because of the additional sedimentation process.

Limited data are available on the removal of Cryptosporidium by the lime softening treatment process. EPA has evaluated data from a study by Logsdon et al. (1994), which investigated removal of Giardia and Cryptosporidium in full scale lime softening plants. In addition, the Agency has considered data provided by utilities on the removal of aerobic spores in softening plants. These data are summarized in the following paragraphs.

Logsdon et al. (1994) measured levels of Cryptosporidium and Giardia in the raw, settled, and filtered water of 13 surface water plants using lime softening. Cryptosporidium was detected in the raw water at 5 utilities: one single-stage plant and four two-stage plants. Using measured oocyst levels, Cryptosporidium removal by sedimentation was 1.0 log in the single-stage plant and 1.1 to 2.3 log in the two-stage plants. Cryptosporidium was found in two filtered water samples of the single stage plant, leading to calculated removals from raw to filtered water of 0.6 and 2.2 log. None of the two-stage plants had Cryptosporidium detected in the filtered water. Based on detection limits, calculated Cryptosporidium removals from raw to filtered water in the two-stage plants ranged from 2.67 to 3.85 log.
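The removals above are computed as log10 of the ratio of raw to filtered concentration, with the method detection limit substituted for non-detect filtered samples to give a lower-bound estimate. A minimal sketch, using hypothetical concentrations rather than values from the study (the function name is also invented for illustration):

```python
import math

def log_removal(c_raw, c_filtered=None, detection_limit=None):
    """Log removal = log10(raw / filtered).

    When the filtered sample is a non-detect (c_filtered is None),
    the method detection limit is used, giving a lower-bound estimate.
    """
    c_out = c_filtered if c_filtered is not None else detection_limit
    return math.log10(c_raw / c_out)

# Hypothetical concentrations (oocysts/L), for illustration only:
measured = log_removal(100.0, 10.0)                        # 1.0 log
lower_bound = log_removal(100.0, detection_limit=0.1)      # ≥ ~3.0 log
```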

Giardia removal across sedimentation was 0.9 log for a single-stage plant and ranged from 0.8 to 3.2 log for two-stage plants, based on measured cyst levels. Removal of Giardia from raw water through filtration was calculated using detection limits as 1.5 log in a single-stage plant and ranged from 0.9 to 3.3 log in two-stage plants.

While results from the Logsdon et al. study are constrained by sample number and method detection limits, they suggest that two-stage softening plants may achieve greater removal of Cryptosporidium than single-stage plants. The authors concluded that two stages of sedimentation, each preceded by effective flocculation of particulate matter, may increase removal of protozoa. Additionally, the authors stated that consistent achievement of flocculation that results in effective settling in each sedimentation basin is the key factor in this treatment process.

Removal of Aerobic Spores by Softening Plants

Additional information on the microbial removal efficiency of the lime softening process comes from data provided by softening plants on removal of aerobic spores. While few treatment plants have sufficient concentrations of oocysts to directly calculate a Cryptosporidium removal efficiency, some plants have high concentrations of aerobic spores in the raw water. Spores may serve as an indicator of Cryptosporidium removal by sedimentation and filtration (Dugan et al. 2001).

The following two-stage softening plants provided data on removal of aerobic spores: St. Louis, MO; Kansas City, MO; and Columbus, OH (two plants). Cryptosporidium data were also collected at these utilities, but it was not possible to calculate oocyst removal due to low raw water detection rates. Data on removal of aerobic spores by these softening plants are summarized in Table IV-14.

Table IV-14.—Summary of Aerobic Spore Removal Data From Softening Plants

                        Mean log removal of aerobic spores
Plant               Primary clarifier   Secondary clarifier   Across plant *
St. Louis                  1.7                 1.1                 3.8
Kansas City                2.4                 0                   3.4
Columbus Plant 1           1.2                 1.6                 3.1
Columbus Plant 2           1.3                 2.4                 4.2

* Excludes removal in pre-sedimentation basins; calculated spore removal may underestimate actual removal due to filter effluent levels below quantitation limits.

The City of St. Louis Water Division operates a two-stage lime softening process preceded by presedimentation. Ferric sulfate and polymer coagulants are added at various points in the process. St. Louis collected Bacillus subtilis spore samples between June 1998 and September 2000. During this time period, the mean spore concentration entering the softening process (i.e., after presedimentation) was 8,132 cfu/100 mL. The log removal values shown in Table IV-14 are based on average spore concentrations following primary clarification, secondary clarification, and filtration. However, spore levels in some filtered water samples were below the method detection limit, so that the true mean spore removal across the plant may have been higher than indicated by the calculated value.

The Kansas City Water Services Department plant includes two-stage lime softening with pre-sedimentation and sludge recycle. Bacillus subtilis spore data were collected from this plant during January through November 2000. The mean spore concentration entering the lime softening process (after presedimentation) was 5,965 cfu/100 mL. Mean spore levels following primary clarification, secondary clarification, and filtration were 21.1, 25.7, and 2.6 cfu/100 mL, respectively. Corresponding log removal values are shown in Table IV-14. Note that the average spore concentration in the effluent of the secondary clarifier was essentially equivalent to that in the effluent of the primary clarifier, indicating that little removal occurred in the secondary clarifier. This result may have been due to the high removal achieved in the primary clarifier and, consequently, the relatively low concentration of spores entering the second clarifier. As with the St. Louis plant, many of the filtered water observations were below method detection limits, so actual log removal across the plant may have been higher than the calculated value.
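To illustrate how the stage-by-stage values in Table IV-14 are derived, the Kansas City log removals can be approximately reproduced from the mean spore concentrations reported above. This is a minimal sketch (the helper function name is invented); apparent negative stage removals are reported as zero, consistent with the table, and small differences from the published figures may reflect rounding in the underlying means.

```python
import math

def stage_log_removal(c_in, c_out):
    """Log removal across a stage; a negative result (apparent
    concentration increase) is reported as 0."""
    return max(0.0, math.log10(c_in / c_out))

# Reported mean spore concentrations (cfu/100 mL) for Kansas City:
influent, primary, secondary, filtered = 5965.0, 21.1, 25.7, 2.6

primary_removal = stage_log_removal(influent, primary)     # ~2.45 log
secondary_removal = stage_log_removal(primary, secondary)  # 0 (slight increase)
across_plant = stage_log_removal(influent, filtered)       # ~3.36 log
```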

The City of Columbus operates two lime softening plants, each of which has two clarification stages. Coagulant is added prior to the first clarification stage but lime is not added until the second clarifier (i.e., first clarifier is not a softening stage). Between 1997 and 2000, samples for total aerobic spores were collected approximately monthly at each plant from raw water, following each clarification basin, and after filtration. Mean spore concentrations in the raw water sources for the two plants were 10,619 cfu/100 mL (Plant 1) and 22,595 cfu/100 mL (Plant 2). Mean log removals occurring in the two clarification stages and across the plant are shown for each plant in Table IV-14.

These data indicate that two-stage softening plants can remove high levels of Cryptosporidium, and, in particular, that a second clarification stage can achieve 0.5 log or greater removal. Three of the four plants that provided data on removal of aerobic spores achieved greater than 1 log reduction in the second clarifier. Kansas City, the one plant which achieved little removal in the second clarifier, achieved a mean 2.4 log removal in the primary clarifier. This was approximately 1 log more reduction than achieved in the primary clarifiers of the other three plants, so that the spore concentration entering the second clarifier in Kansas City may have been too low to serve as an indicator of removal efficiency. Consequently, EPA has concluded that these data support an additional Cryptosporidium treatment credit of 0.5 log for a two-stage softening plant.

EPA is proposing as a condition of the 0.5 log additional credit that a coagulant, which could include excess lime and soda ash or precipitation of magnesium hydroxide, be present in both clarifiers. This requirement is necessary to ensure that significant particulate removal occurs in both clarification stages. Logsdon et al. (1994) identified effective flocculation as being a key factor for removal of protozoa in softening plants. Among the softening plants that provided data on aerobic spore removal, St. Louis added ferric and polymer coagulants at different points in the process, and the two Columbus plants added lime to the second clarifier. Consequently, a requirement that plants add a coagulant, which may be lime, in the secondary clarifier is consistent with the data used to support the 0.5 log additional credit.

The Science Advisory Board (SAB) reviewed the proposed Cryptosporidium treatment credit for lime softening and supporting information, as presented in the November 2001 pre-proposal draft of the LT2ESWTR (USEPA 2001g). In written comments from a December 2001 meeting of the Drinking Water Committee, the SAB panel concluded that both single- and two-stage softening generally outperform conventional treatment due to the heavy precipitation that occurs. Further, the panel found that 0.5 log of additional Cryptosporidium removal is an average value for a two-stage lime softening plant. However, the SAB stated that the additional credit for two-stage softening should be given only if all the water passes through both stages. Today's proposal is consistent with these recommendations by the SAB.

EPA notes that by including a presumptive credit for softening plants, today's proposal differs from the Stage 2 M-DBP Agreement in Principle, which recommends up to 1 log additional Cryptosporidium treatment credit for softening plants based on demonstration of performance, but no additional presumptive credit.

c. Request for comment. EPA requests comment on the proposed criteria for awarding credit to lime softening plants. EPA would particularly appreciate comment on the following issues:

  • Whether the information and analyses presented in this proposal support an additional 0.5 log credit for two-stage softening, and the associated criteria necessary for credit.
  • Additional information that either supports or suggests modifications to the proposed criteria and credit.

8. Combined Filter Performance

a. What is EPA proposing today? This toolbox component will grant additional credit towards Cryptosporidium treatment requirements to certain plants that maintain finished water turbidity at levels significantly lower than currently required. EPA is proposing to award an additional 0.5 log Cryptosporidium treatment credit to conventional and direct filtration plants that demonstrate a turbidity level in the combined filter effluent (CFE) less than or equal to 0.15 NTU in at least 95 percent of the measurements taken each month. Compliance with this criterion must be based on measurements of the CFE every four hours (or more frequently) that the system serves water to the public. This credit is not available to membrane, bag/cartridge, slow sand, or DE plants, due to the lack of documented correlation between effluent turbidity and Cryptosporidium removal in these processes.
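The compliance criterion above (CFE turbidity less than or equal to 0.15 NTU in at least 95 percent of the 4-hour measurements taken each month) can be sketched as a simple check. This is an illustrative sketch only; the function name and sample data are hypothetical.

```python
def cfe_credit_criterion_met(readings_ntu, limit=0.15, required_fraction=0.95):
    """Check whether at least 95% of a month's 4-hour CFE turbidity
    readings are at or below 0.15 NTU."""
    within_limit = sum(1 for r in readings_ntu if r <= limit)
    return within_limit / len(readings_ntu) >= required_fraction

# Hypothetical month: 180 four-hour readings (30 days x 6 per day),
# 175 at 0.08 NTU and 5 spikes at 0.25 NTU.
month = [0.08] * 175 + [0.25] * 5
eligible = cfe_credit_criterion_met(month)  # 175/180 ≈ 97% → True
```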

b. How was this proposal developed? Turbidity is an optical property measured from the amount of light scattered by suspended particles in a solution. It is a method-defined parameter that can detect the presence of a wide variety of particles in water (e.g., clay, silt, mineral particles, organic and inorganic matter, and microorganisms), but it cannot provide specific information on particle type, number, or size. Turbidity is used as an indicator of raw and finished water quality and treatment performance. Turbidity spikes in filtered water indicate a potential for breakthrough of pathogens.

Under the IESWTR and LT1ESWTR, combined filter effluent turbidity in conventional and direct filtration plants must be less than or equal to 0.3 NTU in 95% of samples taken each month and must never exceed 1 NTU. These plants are also required to conduct continuous monitoring of turbidity for each individual filter, and provide an exceptions report to the State when certain criteria for individual filter effluent turbidity are exceeded (described in 63 FR 69487, December 16, 1998) (USEPA 1998a).

The Stage 2 M-DBP Advisory Committee recommended that systems receive an additional 0.5 log Cryptosporidium removal credit for maintaining 95th percentile combined filter effluent turbidity below 0.15 NTU, which is one half of the current required level of 0.3 NTU. In considering the technical basis to support this recommendation, EPA has reviewed studies that evaluated the efficiency of granular media filtration in removing Cryptosporidium when operating at different effluent turbidity levels.

For the IESWTR, EPA estimated that plants would target filter effluent turbidity in the range of 0.2 NTU in order to ensure compliance with a turbidity standard of 0.3 NTU. Similarly, EPA has estimated that plants relying on meeting a turbidity standard of 0.15 NTU in 95% of samples will consistently operate below 0.1 NTU in order to ensure compliance. Consequently, to assess the impact of compliance with the lower finished water turbidity standard, EPA compared Cryptosporidium removal efficiency when effluent turbidity is below 0.1 NTU with removal efficiency when effluent turbidity is in the range of 0.1 to 0.2 NTU. Results from applicable studies are summarized in Table IV-15 and are discussed in the following paragraphs.

Table IV-15.—Studies of Cryptosporidium Removal at Different Effluent Turbidity Levels

Microorganism     Average of log removals   Filtered effluent turbidity   Experiment design   Researcher
Cryptosporidium           4.39              ≤0.1 NTU                      Pilot-scale         Patania et al. (1995)
Cryptosporidium           3.55              >0.1 and ≤0.2 NTU             Pilot-scale         Patania et al. (1995)
Giardia                   4.23              ≤0.1 NTU                      Pilot-scale         Patania et al. (1995)
Giardia                   3.22              >0.1 and ≤0.2 NTU             Pilot-scale         Patania et al. (1995)
Cryptosporidium           4.09              ≤0.1 NTU                      Bench-scale         Emelko et al. (1999)
Cryptosporidium           3.58              >0.1 and ≤0.2 NTU             Bench-scale         Emelko et al. (1999)
Cryptosporidium           3.76              ≤0.1 NTU                      Pilot-scale         Dugan et al. (2001)
Cryptosporidium           2.56              >0.1 and ≤0.2 NTU             Pilot-scale         Dugan et al. (2001)

Patania et al. (1995) conducted pilot-scale studies at four locations to evaluate the removal of seeded Cryptosporidium and Giardia, turbidity, and particles. Treatment processes, coagulants, and coagulant doses differed among the four locations. Samples of filter effluent were taken at times of stable operation and filter maturation. Analysis of summary data from the seeded runs at all locations shows that average Cryptosporidium removal was greater by more than 0.5 log when effluent turbidity was less than 0.1 NTU, in comparison to removal with effluent turbidity in the range 0.1 to 0.2 NTU (see Table IV-15).

Emelko et al. (1999) used a bench-scale dual media filter to study Cryptosporidium removal during both optimal and challenged operating conditions. Water containing a suspension of kaolinite (clay) was spiked with oocysts, coagulated in-line with alum, and filtered. Oocyst removal was evaluated during stable operation when effluent turbidity was below 0.1 NTU. Removal was also measured after a hydraulic surge that caused process upset, and with coagulant addition terminated. These latter two conditions resulted in effluent turbidities greater than 0.1 NTU and decreased removal of Cryptosporidium. As shown in Table IV-15, average removal of Cryptosporidium during periods with effluent turbidity below 0.1 NTU was approximately 0.5 log greater than when effluent turbidity was between 0.1 and 0.2 NTU.

Dugan et al. (2001) evaluated Cryptosporidium removal in a pilot-scale conventional treatment plant. Sixteen filtration runs seeded with Cryptosporidium were conducted at different raw water turbidities and coagulation conditions. Eleven of the runs had an effluent turbidity below 0.1 NTU, and five runs had effluent turbidity between 0.1 and 0.2 NTU. For runs where the calculated Cryptosporidium removal was concentration limited (i.e., effluent values were non-detect), the method detection limit was used to calculate the values shown in Table IV-15. Using this conservative estimate, average Cryptosporidium removal with effluent turbidity below 0.1 NTU exceeded by more than 1 log the average removal observed with effluent turbidity between 0.1 and 0.2 NTU.

In summary, these three studies all support today's proposal in showing that plants consistently operating below 0.1 NTU can achieve 0.5 log or greater additional removal of Cryptosporidium relative to operation between 0.1 and 0.2 NTU. Because EPA expects plants relying on compliance with a 0.15 NTU standard will consistently operate below 0.1 NTU, the Agency has determined it is appropriate to propose an additional 0.5 log treatment credit for plants meeting this standard.

The SAB reviewed the proposed additional 0.5 log Cryptosporidium removal credit for systems maintaining very low CFE turbidity, as presented in the November 2001 pre-proposal draft of the LT2ESWTR (USEPA 2001g). The SAB also reviewed a potential additional 1.0 log Cryptosporidium removal credit for systems achieving very low individual filter effluent (IFE) turbidity, which is addressed in section IV.C.16 of today's proposal.

In written comments from a December 2001 meeting of the Drinking Water Committee, the SAB panel stated that additional credit for lower finished water turbidity is consistent with what is known in both pilot and full-scale operational experiences for Cryptosporidium removal. Recognizing that IESWTR requirements for lowering turbidity in the treated water will result in lower concentrations of Cryptosporidium, the panel affirmed that even further lowering of turbidity will result in further reductions in Cryptosporidium in the filter effluent. However, the SAB concluded that limited data were presented to show the exact removal that can be achieved, and recommended that no additional credit be given to plants that demonstrate CFE turbidity of 0.15 NTU or less. The SAB recommended that 0.5 log credit be given to plants achieving IFE turbidity in each filter less than 0.15 NTU in 95% of samples each month.

In responding to this recommendation from the SAB, EPA acknowledges the difficulty in precisely quantifying Cryptosporidium removal through filtration based on effluent turbidity levels. Nevertheless, EPA finds that available data consistently show that removal of Cryptosporidium is increased by 0.5 log or greater when filter effluent turbidity is reduced to levels reflecting compliance with a 0.15 NTU standard, in comparison to compliance with a 0.3 NTU standard. Consequently, EPA has concluded that it is appropriate to propose this 0.5 log presumptive treatment credit for systems achieving very low CFE turbidity.

Measurement of Low Level Turbidity

Another important aspect of proposing to award additional removal credit for lower finished water turbidity is the performance of turbidimeters in measuring turbidity below 0.3 NTU. The following paragraphs summarize results from several studies that evaluated low level measurement of turbidity by different on-line and bench top instruments. Note that because compliance with the CFE turbidity limit is based on 4-hour readings, either on-line or bench top turbidimeters may be used. EPA believes that results from these studies indicate that currently available turbidity monitoring equipment is capable of reliably assessing turbidity at levels below 0.1 NTU, provided instruments are well calibrated and maintained.

The 1997 NODA for the IESWTR (62 FR 59502, Nov. 3, 1997) (USEPA 1997a) discusses issues relating to the accuracy and precision of low level turbidity measurements. This document cites studies (Hart et al. 1992; Sethi et al. 1997) suggesting that large tolerances in instrument design criteria have led to turbidimeters that provide different turbidity readings for a given suspension.

At the time of the IESWTR NODA, EPA had conducted performance evaluation (PE) studies of turbidity samples above 0.3 NTU. A subsequent PE study (USEPA 1998e), labeled WS041, was carried out to address concern among the Stage 1 M-DBP Federal Advisory Committee regarding the ability to reliably measure lower turbidity levels. The study involved distribution of different types of laboratory prepared standard solutions with reported turbidity values of 0.150 NTU or 0.160 NTU. The results of this study are summarized in Table IV-16.

[Table IV-16 not available; view image of printed page]

The data summarized in Table IV-16 indicate a positive bias for all instruments when compared against a reported “true value.” On-line instruments in this study had a larger positive bias and higher standard deviation (RSD approximately 50 percent). The positive bias is consistent with previous PE studies (USEPA 1998e) and suggests that error in turbidimeter readings may be generally conservative (i.e., systems will operate at lower than required effluent turbidity levels).

Letterman et al. (2001) evaluated the effect of turbidimeter design and calibration methods on inter-instrument performance, comparing bench top to on-line instruments and instruments within each of those categories from different manufacturers. The study used treated water collected from the filter effluent of water treatment plants. Reported sample turbidity values ranged from 0.05 to 1 NTU. Samples were analyzed in a laboratory environment. The results are consistent with those of the WS041 study, specifically the positive bias of on-line instruments. However, Letterman et al. found generally poor agreement among different on-line instruments and between bench-top and on-line instruments. The authors also observed that results were independent of the calibration method, though certain experiments suggested that analyst experience may have some effect on turbidity readings from bench-top instruments.

Sadar (1999) conducted an intra-instrument study of low level turbidity measurements among instruments from the same manufacturer. This study was performed under well-controlled laboratory conditions. Intra-instrument variation among different models and between bench top and on-line instruments occurred but at significantly lower levels than the Letterman et al. inter-instrument study. Newer instruments also tended to read lower than older instruments, which the author attributed to a reduction in stray light and lower sensitivities in the newer instruments. Sadar also found a generally positive bias when comparing on-line to bench-top and when comparing all instruments to a prepared standard.

The American Society for Testing and Materials (ASTM) has issued standard test methods for measurement of turbidity below 5 NTU by on-line (ASTM 2001) and static (ASTM 2003) instrument modes. The methods specify that the instrument should permit detection of turbidity differences of 0.01 NTU or less in waters having turbidities of less than 1.00 NTU (ASTM 2001) and 5.0 NTU (ASTM 2003), respectively. Inter-laboratory study data included with the method for a known turbidity standard of 0.122 NTU show an analyst relative deviation of 7.5% and a laboratory relative deviation of 16% (ASTM 2003).

In summary, the data collected in these studies of turbidity measurement indicate that currently available monitoring equipment can reliably measure turbidity at levels of 0.1 NTU and lower. However, this requires rigorous calibration and verification procedures, as well as diligent maintenance of turbidity monitoring equipment (Burlingame 1998, Sadar 1999). Systems that pursue additional treatment credit for lower finished water turbidity must develop the procedures necessary to ensure accurate and reliable measurement of turbidity at levels of 0.1 NTU and less. EPA guidance for the microbial toolbox will provide direction to water systems on developing these procedures.

c. Request for comment. EPA invites comment on the following issues regarding the proposed Cryptosporidium treatment credit for combined filter performance:

  • Do the studies cited here support awarding 0.5 log credit for CFE ≤ 0.15 NTU 95% of the time?
  • Does currently available turbidity monitoring technology accurately distinguish differences between values measured near 0.15 NTU?

9. Roughing Filter

a. What is EPA proposing today? The Stage 2 M-DBP Agreement in Principle recommends a 0.5 log presumptive credit towards additional Cryptosporidium treatment requirements for roughing filters. However, the Agreement further specifies that EPA is to determine the design and implementation criteria under which the credit would be awarded. Upon subsequent review of available literature, EPA is unable to identify design and implementation conditions for roughing filters that would provide reasonable assurance of achieving a 0.5 log removal of oocysts. Consequently, EPA is not proposing presumptive credit for Cryptosporidium removal by roughing filters. Today's proposal does, however, include a 0.5 log credit for a second granular media filter following coagulation and primary filtration (see section IV.C.13).

b. How was this proposal developed? Roughing filtration is a technique used primarily in developing countries to remove solids from high turbidity source waters prior to treatment with slow sand filters. Typically, roughing filters consist of a series of sedimentation tanks filled with progressively smaller diameter media in the direction of flow. The media can be gravel, plastic, crushed coconut, rice husks, or a similar locally available material. The flow direction in roughing filters can be either horizontal or vertical, and vertical roughing filters can be either upflow or downflow. The media in the tanks effectively reduce the vertical settling distance of particles to a distance of a few millimeters. As sediment builds on the media, it eventually sloughs off and begins to accumulate in the lower section of the filter, while simultaneously regenerating the upper portions of the filter. The filters require periodic cleaning to remove the collected silt.

Review of the scientific and technical literature pertaining to roughing filters has identified no information on removal of Cryptosporidium. Information is available on removal of suspended solids, turbidity, particles, fecal coliforms and some algae, but none of these has been demonstrated to be an indicator of Cryptosporidium removal by roughing filters. Moreover, roughing filters are not preceded by a coagulation step, and studies have found that some potential surrogates, such as aerobic spores, are not conservative indicators of Cryptosporidium removal by filtration when a coagulant is not present (Yates et al. 1998, Dugan et al. 2001). Thus, it is unclear how to relate results from studies of the removal of other particles by roughing filters to potential removal of Cryptosporidium.

In addition, some studies have observed very poor removal of Cryptosporidium by rapid sand filters when a coagulant is not used (Patania et al. 1995, Huck et al. 2000). Based on these findings, it is expected that there would be situations where a roughing filter would not achieve 0.5 log Cryptosporidium removal. Because available data are insufficient to determine the conditions that would be necessary for a roughing filter to achieve 0.5 log Cryptosporidium removal, EPA is unable to propose this credit. The following discussion describes four studies that analyzed the effectiveness of roughing filters for removing solids, turbidity, particles, fecal coliforms, and algae.

Wegelin et al. (1987) conducted pilot-scale studies on the use of horizontal roughing filters to reduce solids, turbidity, and particles. Testing was performed to determine the influence of different design parameters on filter performance. Data from the parameter testing were used to establish an empirical model to simulate filtrate quality as a function of filter length and time for a given filter configuration. Using the mathematical model, the researchers found that long filters (10 m) at low filtration rates (0.5 m/h) were capable of reducing high suspended solids concentrations (1000 mg/L TSS) down to less than 3 mg/L.

Further work by Wegelin (1988) evaluated roughing filters as pretreatment for slow sand filters for waters with variable and seasonally high suspended solids concentrations. This study collected data on roughing filters in Peru, Colombia, Sudan, and Ghana. Table IV-17 summarizes data for three of the roughing filters. These filters were capable of reducing peak turbidities by 80 to 90 percent. Further, the Peruvian and Colombian filters reduced fecal coliforms by 77 and 89 percent, respectively. The Sudanese filter may have removed around 90 percent of the fecal coliforms, but specific values were not given. Data collected from roughing filters in Ghana on algae removal indicate that the Merismopedia (0.5 μm) and Chlorophyta (2-10 μm), which are comparable in size to Cryptosporidium oocysts, were completely removed from the water in mature filters, and that some removal of Chlorophyta, but not Merismopedia, occurred in filters after three days of operation. However, the removal of these organisms has not been correlated with Cryptosporidium oocyst removal.

Table IV-17.—Roughing Filter Data From Wegelin, 1988

Location                     Azpita, Peru           El Retiro, Colombia          Blue Nile Health Project, Sudan
Roughing Filter Type         Downflow               Upflow (multi-layer filter)  Horizontal-flow
Filtration Rate              0.30 m/h (0.98 ft/hr)  0.74 m/h (2.43 ft/hr)        0.3 m/h (0.98 ft/hr)
Design Capacity              35 m³/d                790 m³/d                     5 m³/d
Turbidity (NTU):
  Raw Water                  50-200                 10-150                       40-500
  Roughing Filter Effluent   15-40                  5-15                         5-50
Fecal Coliforms (/100 mL):
  Raw Water                  700                    16,000                       300
  Roughing Filter Effluent   160                    1,680                        25

Boller (1993) details the mechanisms of particle removal that occur in roughing filters. The conclusions are similar to those drawn by Wegelin et al. (1987). Particle analysis reviewed by Boller indicates that after seven days of operation, the four-stage pilot filter utilized by Wegelin et al. (1987) removed more than 98 percent of particles sized 1.1 μm, and greater than 99 percent of particles sized 3.6 μm. After 62 days, only 80 percent of particles sized 1.1 μm were removed, while 90 percent of particles sized 3.6 μm were removed. Boller did not give the solids loading on the tested filter, and particle removal was not correlated to Cryptosporidium oocyst removal.

Collins et al. (1994) investigated solids and algae removal with pilot scale vertical downflow roughing filters. Gravel media size, filter depth, and flow rate were varied to determine which design variables had the greatest effect on filter performance. Results indicated that the most influential design parameters for removing solids from water, in order of importance, were filter length, gravel size, and hydraulic flow rate. For algae removal, the most influential design parameters were hydraulic flow rate, filter length, and gravel size. Solids removal was better in filters that had been ripened with algae for 5-7 days. However, extrapolation of these results to Cryptosporidium removal could not be made.

c. Request for comment. The Agency requests comment on the information that has been presented about roughing filters, and specifically the question of whether and under what conditions roughing filters should be awarded a 0.5 log credit for removal of Cryptosporidium. EPA also requests information on specific studies of Cryptosporidium oocyst removal by roughing filters, or from studies of the removal of surrogate parameters that have been shown to correlate with oocyst removal in roughing filters.

10. Slow Sand Filtration

a. What is EPA proposing today? Slow sand filtration is defined in 40 CFR 141.2 as a process involving passage of raw water through a bed of sand at low velocity (generally less than 0.4 m/h) resulting in substantial particulate removal by physical and biological mechanisms. Today's proposal allows systems using slow sand filtration as a secondary filtration step following a primary filtration process (e.g., conventional treatment) to receive an additional 2.5 log Cryptosporidium treatment credit. There must be no disinfectant residual in the influent water to the slow sand filtration process to be eligible for credit.

Note that this proposed credit differs from the credit proposed for slow sand filtration as a primary filtration process. EPA has concluded, based on treatment studies described in section III.D, that plants using well designed and well operated slow sand filtration as a primary filtration process can achieve an average Cryptosporidium removal of 3 log (Schuler and Ghosh, 1991, Timms et al. 1995, Hall et al. 1994). Consequently, as described in section IV.A, EPA is proposing that plants using slow sand filtration as a primary filtration process receive a 3 log credit towards Cryptosporidium treatment requirements associated with Bins 2-4 under the LT2ESWTR (i.e., credit equivalent to a conventional treatment plant).

The proposed 2.5 log credit for slow sand filtration as part of the microbial toolbox applies only when it is used as a secondary filtration step, following a primary filtration process like conventional treatment. While the removal mechanisms that make slow sand filtration effective as a primary filtration process would also be operative when used as a secondary filtration step, EPA has little data on this specific application. The Agency is proposing 2.5 log credit for slow sand filtration as a secondary filtration step, in comparison to 3 log credit as a primary filtration process, as a conservative measure reflecting greater uncertainty. In addition, the proposed 2.5 log credit for slow sand filtration as part of the microbial toolbox is consistent with the recommendation in the Stage 2 M-DBP Agreement in Principle.

b. How was this proposal developed? The Stage 2 M-DBP Agreement in Principle recommends that slow sand filtration receive 2.5 log or greater Cryptosporidium treatment credit when used in addition to existing treatment that achieves compliance with the IESWTR or LT1ESWTR. Slow sand filtration is not typically used as a secondary filtration step following conventional treatment or other primary filtration processes of similar efficacy. However, EPA expects that slow sand filtration would achieve significant removal of Cryptosporidium in such a treatment train.

While there is a significant body of data demonstrating the effectiveness of slow sand filtration for Cryptosporidium removal as a primary filtration process, as described in section III.D, EPA has limited data on the effectiveness of slow sand filtration when used as a secondary filtration step. Hall et al. (1994) evaluated oocyst removal for a pilot scale slow sand filter following a primary filtration process identified as a rapid gravity filter. The combined treatment train of a primary filtration process followed by slow sand filtration achieved greater than 3 log Cryptosporidium removal in three of five experimental runs, while approximately 2.5 log reduction was observed in the other two runs. In comparison, Hall et al. (1994) reported slow sand filtration alone to achieve at least a 3 log removal of oocysts in each of four experimental runs when not preceded by a primary filtration process. The authors offered no explanation for these results, but measured oocyst removals may have been impacted by limitations with the analytical method.

Removal of microbial pathogens in slow sand filters is complex and is believed to occur through a combination of physical, chemical, and biological mechanisms, both on the surface (schmutzdecke) and in the interior of the filter bed. It is unknown if the higher quality of the water that would be influent to a slow sand filter when used as a secondary filtration step would impact the efficiency of the filter in removing Cryptosporidium. Based on the limited data on the performance of slow sand filtration as a secondary filtration step, and in consideration of the recommendation of the Advisory Committee, EPA is proposing only a 2.5 log additional Cryptosporidium treatment credit for this application.

c. Request for comment. The Agency requests comment on whether the available data are adequate to support awarding a 2.5 log Cryptosporidium removal credit for slow sand filtration applied as a secondary filtration step, along with any additional information related to this application.

11. Membrane Filtration

a. What is EPA proposing today? EPA is proposing criteria for awarding credit to membrane filtration processes for removal of Cryptosporidium. To receive removal credit, the membrane filtration process must: (1) Meet the basic definition of a membrane filtration process, (2) have removal efficiency established through challenge testing and verified by direct integrity testing, and (3) undergo periodic direct integrity testing and continuous indirect integrity monitoring during use. The maximum removal credit that a membrane filtration process is eligible to receive is equal to the lower of the following two values:

—The removal efficiency demonstrated during challenge testing OR

—The maximum log removal value that can be verified through the direct integrity test (i.e., integrity test sensitivity) used to monitor the membrane filtration process.

Under the criteria in today's proposal, a membrane filtration process could potentially meet the Bin 4 Cryptosporidium treatment requirements of this proposal. These criteria are described in more detail below. EPA is developing a Membrane Filtration Guidance Manual that provides additional information and procedures for meeting these criteria (USEPA 2003e). A draft of this guidance is available in the docket for today's proposal (http://www.epa.gov/edocket/).

Definition of a Membrane Filtration Process

For the purpose of this proposed rule, membrane filtration is defined as a pressure or vacuum driven separation process in which particulate matter larger than 1 μm is rejected by a nonfibrous, engineered barrier, primarily through a size exclusion mechanism, and which has a measurable removal efficiency of a target organism that can be verified through the application of a direct integrity test. This definition is intended to include the common membrane technology classifications: microfiltration (MF), ultrafiltration (UF), nanofiltration (NF), and reverse osmosis (RO). MF and UF are low-pressure membrane filtration processes that are primarily used to remove particulate matter and microbial contaminants. NF and RO are membrane separation processes that are primarily used to remove dissolved contaminants through a variety of mechanisms, but which also remove particulate matter via a size exclusion mechanism.

In today's proposal, the critical distinction between membrane filtration processes and bag and cartridge filters, described in section IV.C.12, is that the integrity of membrane filtration processes can be directly tested. Based on this distinction, EPA is proposing that membrane material configured into a cartridge filtration device that meets the definition of membrane filtration and that can be direct integrity tested according to the criteria specified in this section is eligible for the same removal credit as a membrane filtration process.

Membrane devices can be designed in a variety of configurations including hollow-fiber modules, hollow-fiber cassettes, spiral-wound elements, cartridge filter elements, plate and frame modules, and tubular modules among others. In today's proposal, the generic term module is used to refer to all of these various configurations and is defined as the smallest component of a membrane unit in which a specific membrane surface area is housed in a device with a filtrate outlet structure. A membrane unit is defined as a group of membrane modules that share common valving that allows the unit to be isolated from the rest of the system for the purpose of integrity testing or other maintenance.

Challenge Testing

A challenge test is defined as a study conducted to determine the removal efficiency (i.e., log removal value) of the membrane filtration media. The removal efficiency demonstrated during challenge testing establishes the maximum removal credit that a membrane filtration process is eligible to receive, provided this value is less than or equal to the maximum log removal value that can be verified by the direct integrity test (as described in the following subsection). Challenge testing is a product-specific rather than a site-specific requirement. At the discretion of the State, data from challenge studies conducted prior to promulgation of this regulation may be considered in lieu of additional testing. However, the prior testing must have been conducted in a manner that demonstrates a removal efficiency for Cryptosporidium commensurate with the treatment credit awarded to the process. Guidance for conducting challenge testing to meet the requirements of the rule is provided in the Membrane Filtration Guidance Manual (USEPA 2003e). Challenge testing must be conducted according to the following criteria:

  • Challenge testing must be conducted on a full-scale membrane module identical in material and construction to the membrane modules proposed for use in full-scale treatment facilities. Alternatively, challenge testing may be conducted on a smaller membrane module, identical in material and similar in construction to the full-scale module, if testing meets the other requirements listed in this section.
  • Challenge testing must be conducted using Cryptosporidium oocysts or a surrogate that has been determined to be removed no more efficiently than Cryptosporidium oocysts. The organism or surrogate used during challenge testing is referred to as the challenge particulate. The concentration of the challenge particulate must be determined using a method capable of discretely quantifying the specific challenge particulate used in the test. Thus, gross water quality measurements such as turbidity or conductivity cannot be used.
  • The maximum allowable feed water concentration used during a challenge test is based on the detection limit of the challenge particulate in the filtrate, and is determined according to the following equation:

Maximum Feed Concentration = 3.16 × 10⁶ × (Filtrate Detection Limit)

This will allow the demonstration of up to 6.5 log removal during challenge testing if the challenge particulate is removed to the detection limit.

  • Challenge testing must be conducted under representative hydraulic conditions at the maximum design flux and maximum design system recovery as specified by the manufacturer. Flux is defined as the flow per unit of membrane area. Recovery is defined as the ratio of filtrate volume produced by a membrane to feed water volume applied to a membrane over the course of an uninterrupted operating cycle. An operating cycle is bounded by two consecutive backwash or cleaning events. In the context of this rule, recovery does not consider losses that occur due to the use of filtrate in backwashing or cleaning operations.
  • Removal efficiency of a membrane filtration process is determined from the results of the challenge test, and expressed in terms of log removal values as defined by the following equation:

LRV = log10(Cf) − log10(Cp)

where LRV = log removal value demonstrated during challenge testing; Cf = the feed concentration used during the challenge test; and Cp = the filtrate concentration observed during the challenge test. For this equation to be valid, equivalent units must be used for the feed and filtrate concentrations. If the challenge particulate is not detected in the filtrate, then the term Cp is set equal to the detection limit. A single LRV is calculated for each membrane module evaluated during the test.

  • The removal efficiency of a membrane filtration process demonstrated during challenge testing is expressed as a log removal value (LRV C-Test). If fewer than twenty modules are tested, then LRV C-Test is assigned a value equal to the lowest of the representative LRVs among the various modules tested. If twenty or more modules are tested, then LRV C-Test is assigned a value equal to the 10th percentile of the representative LRVs among the various modules tested. The percentile is defined by [i/(n+1)] where i is the rank of n individual data points ordered lowest to highest. It may be necessary to calculate the 10th percentile using linear interpolation.
  • A quality control release value (QCRV) must be established for a non-destructive performance test (e.g., bubble point test, diffusive airflow test, pressure/vacuum decay test) that demonstrates the Cryptosporidium removal capability of the membrane module. The performance test must be applied to each production membrane module that did not undergo a challenge test in order to verify Cryptosporidium removal capability. Production membrane modules that do not meet the established QCRV are not eligible for the removal credit demonstrated during challenge testing.
  • Any significant modification to the membrane filtration device (e.g., change in the polymer chemistry of the membrane) requires additional challenge testing to demonstrate removal efficiency of the modified module and to define a new QCRV for the nondestructive performance test.
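The challenge-testing arithmetic in the criteria above (the cap on the feed concentration, the per-module LRV, and the LRV C-Test statistic with its i/(n+1) percentile rank) can be sketched in Python. This is only an illustrative reading of the proposed criteria, not part of the rule; all function and variable names are hypothetical.

```python
import math

# Illustrative sketch of the proposed challenge-testing calculations;
# names are hypothetical and not drawn from the rule text.

def max_feed_concentration(filtrate_detection_limit):
    """Cap on the challenge feed concentration: 3.16e6 times the filtrate
    detection limit, permitting demonstration of up to ~6.5 log removal."""
    return 3.16e6 * filtrate_detection_limit

def module_lrv(feed_conc, filtrate_conc, detection_limit):
    """Per-module log removal value, LRV = log10(Cf) - log10(Cp).
    If the challenge particulate is not detected in the filtrate,
    Cp is set equal to the detection limit."""
    cp = detection_limit if filtrate_conc is None else filtrate_conc
    return math.log10(feed_conc) - math.log10(cp)

def lrv_c_test(module_lrvs):
    """LRV C-Test: the lowest module LRV when fewer than 20 modules are
    tested; otherwise the 10th percentile, using the rank definition
    i/(n+1) with linear interpolation between ordered values."""
    vals = sorted(module_lrvs)
    n = len(vals)
    if n < 20:
        return vals[0]
    rank = 0.10 * (n + 1)          # i such that i/(n+1) = 0.10
    lo = int(math.floor(rank))
    frac = rank - lo
    if lo < 1:
        return vals[0]
    if lo >= n:
        return vals[-1]
    return vals[lo - 1] + frac * (vals[lo] - vals[lo - 1])
```

With 20 modules, for example, the rank is 0.10 × 21 = 2.1, so LRV C-Test falls one tenth of the way between the second- and third-lowest module LRVs.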

Direct Integrity Testing

In order to receive removal credit for Cryptosporidium, the removal efficiency of a membrane filtration process must be routinely verified through direct integrity testing. A direct integrity test is defined as a physical test applied to a membrane unit in order to identify and isolate integrity breaches. An integrity breach is defined as one or more leaks that could result in contamination of the filtrate. The direct integrity test method must be applied to the physical elements of the entire membrane unit including membranes, seals, potting material, associated valving and piping, and all other components which under compromised conditions could result in contamination of the filtrate.

The direct integrity tests commonly used at the time of this proposal include those that use an applied pressure or vacuum (such as the pressure decay test and diffusive airflow test), and those that measure the rejection of a particulate or molecular marker (such as spiked particle monitoring). Today's proposal does not stipulate the use of a particular direct integrity test. Instead, the direct integrity test must meet performance criteria for resolution, sensitivity, and frequency.

Resolution is defined as the smallest leak that contributes to the response from a direct integrity test. Any direct integrity test applied to meet the requirements of this proposed rule must have a resolution of 3 μm or less. The manner in which the resolution criterion is met will depend on the type of direct integrity test used. For example, a pressure decay test can meet the resolution criterion by applying a net test pressure great enough to overcome the bubble point of a 3 μm hole. A direct integrity test that uses a particulate or molecular marker can meet the resolution criterion by applying a marker of 3 μm or smaller.

Sensitivity is defined as the maximum log removal value that can be reliably verified by the direct integrity test (LRV DIT). The sensitivity of the direct integrity test applied to meet the requirements of this proposed rule must be equal to or greater than the removal credit awarded to the membrane filtration process. The manner in which LRV DIT is determined will depend on the type of direct integrity test used. Direct integrity tests that use an applied pressure or vacuum typically measure the rate of pressure/vacuum decay or the flow of air through an integrity breach. The response from this type of integrity test can be related to the flow of water through an integrity breach (Qbreach) during normal operation, using procedures such as those described in the Membrane Filtration Guidance Manual (USEPA 2003e). Once Qbreach has been determined, a simple dilution model is used to calculate LRV DIT for the specific integrity test application, as shown by the following equation:

LRV DIT = log10(Qp/(VCF × Qbreach))

where LRV DIT = maximum log removal value that can be verified by a direct integrity test; Qp = total design filtrate flow from the membrane unit; Qbreach = flow of water from an integrity breach associated with the smallest integrity test response that can be reliably measured; and VCF = volumetric concentration factor.

The volumetric concentration factor is the ratio of the suspended solids concentration on the high pressure side of the membrane relative to the feed water, and is defined by the following equation:

VCF = Cm/Cf

where Cm is the concentration of particulate matter on the high pressure side of the membrane that remains in suspension; and Cf is the concentration of suspended particulate matter in the feed water. The magnitude of the concentration factor depends on the mode of system operation and typically ranges from 1 to 20. The Membrane Filtration Guidance Manual presents approaches for determining the volumetric concentration factor for different operating modes (USEPA 2003e).
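The dilution-model sensitivity calculation for a pressure- or vacuum-based direct integrity test can be sketched as follows. This is an illustrative sketch, not part of the rule; the function names are hypothetical, and Qp and Qbreach simply need to be expressed in consistent flow units.

```python
import math

# Illustrative sketch of the dilution model used to compute the sensitivity
# (LRV DIT) of a pressure/vacuum-based direct integrity test.

def volumetric_concentration_factor(c_m, c_f):
    """VCF = Cm/Cf: suspended-solids concentration on the high pressure
    side of the membrane relative to the feed water (typically 1 to 20)."""
    return c_m / c_f

def lrv_dit_pressure(q_p, q_breach, vcf):
    """LRV DIT = log10(Qp/(VCF x Qbreach)), where Qbreach is the breach
    flow associated with the smallest reliably measurable test response."""
    return math.log10(q_p / (vcf * q_breach))
```

For example, a unit with a design filtrate flow of 1,000 flow units, a smallest measurable breach flow of 0.01 flow units, and a VCF of 10 could verify up to a 4.0 log removal.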

Sensitivity of direct integrity tests that use a particulate or molecular marker is determined from the feed and filtrate concentrations of the marker. The LRV DIT for this type of direct integrity test is calculated according to the following equation:

LRV DIT = log10(Cf) − log10(Cp)

where LRV DIT = maximum log removal value that can be verified by a direct integrity test; Cf = the typical feed concentration of the marker used in the test; and Cp = the filtrate concentration of the marker from an integral membrane unit. For this equation to be valid, equivalent units must be used for the feed and filtrate concentrations. An ideal particulate or molecular marker would be completely removed by an integral membrane unit.

If the sensitivity of the direct integrity test is such that LRV DIT is less than LRV C-Test, LRV DIT establishes the maximum removal credit that a membrane filtration process is eligible to receive. Conversely, if LRV DIT for a direct integrity test is greater than LRV C-Test, LRV C-Test establishes the maximum removal credit.
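The marker-based sensitivity calculation and the rule that the eligible credit is the lower of the challenge-test and integrity-test values can be sketched as follows; this is an illustrative reading of the proposal, and the names are hypothetical.

```python
import math

# Illustrative sketch: marker-based direct integrity test sensitivity and
# the lower-of-two-values rule for the eligible removal credit.

def lrv_dit_marker(feed_conc, filtrate_conc):
    """LRV DIT = log10(Cf) - log10(Cp) for a particulate or molecular
    marker, with feed and filtrate concentrations in equivalent units."""
    return math.log10(feed_conc) - math.log10(filtrate_conc)

def eligible_removal_credit(lrv_c_test, lrv_dit):
    """The maximum removal credit a membrane filtration process is
    eligible to receive is the lower of the challenge-test LRV and the
    direct integrity test sensitivity."""
    return min(lrv_c_test, lrv_dit)
```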

A control limit is defined as an integrity test response which, if exceeded, indicates a potential problem with the system and triggers a response. Under this proposal, a control limit for a direct integrity test must be established that is indicative of an integral membrane unit capable of meeting the Cryptosporidium removal credit awarded by the State. If the control limit for the direct integrity test is exceeded, the membrane unit must be taken off-line for diagnostic testing and repair. The membrane unit could only be returned to service after the repair has been completed and confirmed through the application of a direct integrity test.

The frequency of direct integrity testing specifies how often the test is performed over an established time interval. Most direct integrity tests available at the time of this proposal are applied periodically and must be conducted on each membrane unit at a frequency of not less than once every 24 hours while the unit is in operation. If continuous direct integrity test methods become available that also meet the sensitivity and resolution criteria described earlier, they may be used in lieu of periodic testing.

EPA is proposing that at a minimum, a monthly report must be submitted to the State summarizing all direct integrity test results above the control limit associated with the Cryptosporidium removal credit awarded to the process and the corrective action that was taken in each case.

Continuous Indirect Integrity Monitoring

The majority of currently available direct integrity test methods are applied periodically since the membrane unit must be taken out of service to conduct the test. In order to provide some measure of process performance between direct integrity testing events, continuous indirect integrity monitoring is required. Indirect integrity monitoring is defined as monitoring some aspect of filtrate water quality that is indicative of the removal of particulate matter. If a continuous direct integrity test is implemented that meets the resolution and sensitivity criteria described previously, continuous indirect integrity monitoring is not required. Continuous indirect integrity monitoring must be conducted according to the following criteria:

  • Unless the State approves an alternative parameter, continuous indirect integrity monitoring must include continuous filtrate turbidity monitoring.
  • Continuous monitoring is defined as monitoring conducted at a frequency of no less than once every 15 minutes.
  • Continuous monitoring must be separately conducted on each membrane unit.
  • If indirect integrity monitoring includes turbidity and if the filtrate turbidity readings are above 0.15 NTU for a period greater than 15 minutes (i.e., two consecutive 15-minute readings above 0.15 NTU), direct integrity testing must be performed on the associated membrane units.
  • If indirect integrity monitoring includes a State-approved alternative parameter and if the alternative parameter exceeds a State-approved control limit for a period greater than 15 minutes, direct integrity testing must be performed on the associated membrane units.
  • EPA is proposing that at a minimum, a monthly report must be submitted to the primacy agency summarizing all indirect integrity monitoring results triggering direct integrity testing and the corrective action that was taken in each case.
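The turbidity-trigger criterion in the list above can be sketched as a small check over the 15-minute readings for one membrane unit. This is a hypothetical illustration of the proposed criterion; the function name is invented.

```python
# Hypothetical sketch of the indirect-integrity-monitoring trigger: filtrate
# turbidity is read on each membrane unit at least every 15 minutes, and two
# consecutive readings above 0.15 NTU trigger direct integrity testing.

CONTROL_LIMIT_NTU = 0.15

def triggers_direct_test(readings_ntu):
    """True if any two consecutive 15-minute filtrate turbidity readings
    exceed 0.15 NTU (i.e., above the limit for more than 15 minutes)."""
    return any(a > CONTROL_LIMIT_NTU and b > CONTROL_LIMIT_NTU
               for a, b in zip(readings_ntu, readings_ntu[1:]))
```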

b. How was this proposal developed? The Stage 2 M-DBP Agreement in Principle recommends that EPA develop criteria to award Cryptosporidium removal credit to membrane filtration processes. Today's proposal and the supporting guidance are consistent with the Agreement.

A number of studies have demonstrated the ability of membrane filtration processes to remove pathogens, including Cryptosporidium, to below detection levels. A literature review summarizing the results of several comprehensive studies was conducted by EPA and is presented in Low-Pressure Membrane Filtration for Pathogen Removal: Application, Implementation, and Regulatory Issues (USEPA 2001h). Many of these studies used Cryptosporidium seeding to demonstrate removal efficiencies as high as 7 log. The collective results from these studies demonstrate that an integral membrane module, i.e., a membrane module without any leaks or defects, with an exclusion characteristic smaller than Cryptosporidium, is capable of removing this pathogen to below detection in the filtrate, independent of the feed concentration.

Some filtration devices have used membrane media in a cartridge filter configuration; however, few data are available documenting their ability to meet the requirements for membrane filtration described in section IV.C.11.a of this preamble. Nevertheless, in one study reported by Dwyer et al. (2001), a membrane cartridge filter demonstrated Cryptosporidium removal efficiencies in excess of 6 log. This study illustrates the potentially high removal capabilities of membrane filtration media configured into a cartridge filtration device, thus providing a basis for awarding removal credits to these devices under the membrane filtration provision of the rule, assuming that the device meets the definition of a membrane filtration process as well as the direct integrity test requirements.

Today's proposal requires challenge testing of membrane filtration processes used to remove Cryptosporidium. As noted in section III.D, EPA believes this is necessary due to the proprietary nature of these systems and the lack of any uniform criteria for establishing the exclusion characteristic of a membrane. Challenge testing addresses the lack of a standard approach for characterizing membranes by requiring direct verification of removal efficiency. The proposed challenge testing is product-specific and not site-specific since the intent of this testing is to demonstrate the removal capabilities of the membrane product rather than evaluate the feasibility of implementing membrane treatment at a specific plant.

Testing can be conducted using a full-scale module or a smaller module if the results from the small-scale module test can be related to full-scale module performance. Most challenge studies presented in the literature have used full-scale modules, which provide results that can be directly related to full-scale performance. However, use of smaller modules is considered feasible in the evaluation of removal efficiency, and a protocol for challenge testing using small-scale modules has been proposed (NSF, 2002a). Since the removal efficiency of an integral membrane is a direct function of the membrane material, it may be possible to use a small-scale module containing the same membrane fibers or sheets used in full-scale modules for this evaluation. However, it will be necessary to relate the results of the small-scale module test to the nondestructive performance test quality control release value that will be used to validate full-scale production modules.

Challenge testing with either Cryptosporidium oocysts or a surrogate is permitted. Challenge testing with Cryptosporidium clearly provides direct verification of removal efficiency for this pathogen; however, several studies have demonstrated that surrogates can provide an accurate or conservative measure of Cryptosporidium removal efficiency. Since removal of particulate matter larger than 1 μm by a membrane filtration process occurs primarily via a size exclusion mechanism, the shape and size distribution of the surrogate must be selected such that the surrogate is not removed to a greater extent than the target organism. Surrogates that have been successfully used in challenge studies include polystyrene microspheres and bacterial endospores. The bacterial endospore, Bacillus subtilis, has been used as a surrogate for Cryptosporidium oocysts during challenge studies evaluating pathogen removal by physical treatment processes, including membrane filtration (Rice et al. 1996, Fox et al. 1998, Trimboli et al. 1999, Owen et al. 1999). Studies evaluating cartridge filters have demonstrated that polystyrene microspheres can provide an accurate or conservative measure of removal efficiency (Long 1983, Li et al. 1997). Furthermore, the National Sanitation Foundation (NSF) Environmental Technology Verification (ETV) protocol for verification testing for physical removal of microbiological and particulate contaminants specifies the use of polymeric microspheres of a known size distribution (NSF 2002b). Guidance on selection of an appropriate surrogate for establishing a removal efficiency for Cryptosporidium during challenge testing is presented in the Membrane Filtration Guidance Manual (USEPA 2003e).

The design of the proposed challenge studies is similar to the design of the seeding studies described in the literature cited earlier. Seeding studies are used to challenge the membrane module with pathogen levels orders of magnitude higher than those encountered in natural waters. However, elevated feed concentrations can lead to artificially high estimates of removal efficiency. To address this issue, the feed concentration applied to the membrane during challenge studies is capped at a level that will allow the demonstration of up to 6.5 log removal efficiency if the challenge particulate is removed to the detection level.

Because challenge testing with Cryptosporidium or a surrogate is not conducted on every membrane module, it is necessary to establish criteria for a non-destructive performance test that can be applied to all production membrane modules. Results from a non-destructive test, such as a bubble point test, that are correlated with the results of challenge testing can be used to establish a quality control release value (QCRV) that is indicative of the ability of a membrane filtration process to remove Cryptosporidium. The non-destructive test and QCRV can be used to verify the Cryptosporidium removal capability of modules that are not challenge tested. Most membrane manufacturers have already adopted some form of non-destructive testing for product quality control purposes and have established a quality control release value that is indicative of an acceptable product. It may be possible to apply these existing practices for the purpose of verifying the capability of a membrane filtration process to remove Cryptosporidium.

Challenge testing provides a means of demonstrating the removal efficiency of an integral membrane module; however, defects or leaks in the membrane or other system components can result in contamination of the filtrate unless they are identified, isolated, and repaired. In order to verify continued performance of a membrane system, today's proposal requires direct integrity testing of membrane filtration processes used to meet Cryptosporidium treatment requirements. Direct integrity testing is required because it is a test applied to the physical membrane module and, thus, a direct evaluation of integrity. Furthermore, direct integrity methods are the most sensitive integrity monitoring methods commonly used at the time of this proposal (Adham et al. 1995).

The most common direct integrity tests apply a pressure or a vacuum to one side of a fully wetted membrane and monitor either the pressure decay or the volume of displaced fluid over time. However, the proprietary nature of these systems makes it impractical to define a single direct integrity test methodology that is applicable to all existing and future membrane products. Therefore, performance criteria have been established for any direct integrity test methodology used to verify the removal efficiency of a membrane system. These performance criteria are resolution, sensitivity, and frequency.

As stated previously, the resolution of an integrity test refers to the smallest leak that contributes to the response from an integrity test. For example, in a pressure decay integrity test, resolution is the smallest leak that contributes to pressure loss during the test. Today's proposal specifies a resolution of 3 μm or less, which is based on the size of Cryptosporidium oocysts. This requirement ensures that a leak that could pass a Cryptosporidium oocyst would contribute to the response from an integrity test.

The sensitivity of an integrity test refers to the maximum log removal that can be reliably verified by the test. Again using the pressure decay integrity test as an example, the method sensitivity is a function of the smallest pressure loss that can be detected over a membrane unit. Today's proposal limits the log removal credit that a membrane filtration process is eligible to receive to the maximum log removal value that can be verified by a direct integrity test.

In order to serve as a useful process monitoring tool for assuring system integrity, it is necessary to establish a site-specific control limit for the integrity test that corresponds to the log removal awarded to the process. A general approach for establishing this control limit for some integrity test methods is presented in guidance; however, the utility will need to work with the membrane manufacturer and State to establish a site-specific control limit appropriate for the integrity test used and level of credit awarded. Excursions above this limit indicate a potential integrity breach and would trigger removal of the suspect unit from service followed by diagnostic testing and subsequent repair, as necessary.

Most direct integrity tests available at the time of this proposal must be applied periodically, since it is necessary to take the membrane unit out of service to conduct the test. Today's proposal establishes the minimum frequency for performing a direct integrity test at once per 24 hours. Currently, there is no standard frequency for direct integrity testing that has been adopted by all States and membrane treatment facilities. In a recent survey, the required frequency of integrity testing was found to vary from once every four hours to once per week; however, the most common frequency for conducting a direct integrity test was once every 24 hours (USEPA 2001h). Specifically, 10 out of 14 States that require periodic direct integrity testing specify a frequency of once every 24 hours. Furthermore, many manufacturers of membrane systems with automated integrity testing configure the membrane units to perform a direct integrity test automatically once per 24 hours. EPA has concluded that the 24-hour direct integrity test frequency ensures that removal efficiency is verified on a routine basis without resulting in excessive system downtime.

Since most direct integrity tests are applied periodically, it is necessary to implement some level of continuous monitoring to assess process performance between direct integrity test events. In the absence of a continuous direct integrity test, continuous indirect integrity monitoring is required. Although it has been shown that commonly used indirect integrity monitoring methods lack the sensitivity to detect small integrity breaches that are of concern (Adham et al. 1995), they can detect large breaches and provide some assurance that a major failure has not occurred between direct integrity test events. Turbidity monitoring is proposed as the method of indirect integrity monitoring unless the State approves an alternate approach. Available data indicate that an integral membrane filtration process can consistently produce water with a turbidity less than 0.10 NTU, regardless of the feedwater quality. Consequently, EPA is proposing that exceedance of a filtrate turbidity value of 0.15 NTU triggers direct integrity testing to verify and isolate the integrity breach.
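The proposed indirect-monitoring requirement amounts to a simple threshold check on continuous filtrate turbidity readings. A minimal sketch follows; the function name and return convention are illustrative and not part of the proposed rule.

```python
def needs_direct_integrity_test(filtrate_turbidity_ntu, trigger_ntu=0.15):
    """An integral membrane process is expected to produce filtrate below
    0.10 NTU regardless of feedwater quality, so a reading above the
    proposed 0.15 NTU trigger indicates a potential integrity breach and
    prompts direct integrity testing to verify and isolate it."""
    return filtrate_turbidity_ntu > trigger_ntu

# e.g., a 0.08 NTU reading is within normal operation, while a 0.22 NTU
# reading would trigger direct integrity testing of the suspect unit
```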

c. Request for comment. EPA requests comment on the following issues:

  • EPA is proposing to include membrane cartridge filters that can be direct integrity tested under the definition of a membrane filtration process since one of the key differences between membrane filtration processes and bag and cartridge filters, within the context of this regulation, is the applicability of direct integrity test methods to the filtration process. EPA requests comment on the inclusion of membrane cartridge filters that can be direct integrity tested under the definition of a membrane filtration process in this rule.
  • The applicability of the proposed Cryptosporidium removal credits and performance criteria to Giardia lamblia.
  • Appropriate surrogates, or the characteristics of appropriate surrogates, for use in challenge testing. EPA requests data or information demonstrating the correlation between removal of a proposed surrogate and removal of Cryptosporidium oocysts.
  • The use of a non-destructive performance test and associated quality control release values for demonstrating the Cryptosporidium removal capability of membrane modules that are not directly challenge tested.
  • The appropriateness of the minimum direct integrity test frequency of once per 24 hours.
  • The proposed minimum reporting frequency for direct integrity testing results above the control limit and indirect integrity monitoring results that trigger direct integrity monitoring.

12. Bag and Cartridge Filtration

a. What is EPA proposing today? EPA is proposing criteria for awarding Cryptosporidium removal credit of 1 log for bag filtration processes and 2 log for cartridge filtration processes. To receive removal credit the process must: (1) Meet the basic definition of a bag or cartridge filter and (2) have removal efficiency established through challenge testing.

Definition of a Bag or Cartridge Filter

For the purpose of this rule, bag and cartridge filters are defined as pressure driven separation processes that remove particulate matter larger than 1 μm using an engineered porous filtration media through either surface or depth filtration.

The distinction between bag filters and cartridge filters is based on the type of filtration media used and the manner in which the devices are constructed. Bag filters are typically constructed of a non-rigid, fabric filtration media housed in a pressure vessel in which the direction of flow is from the inside of the bag to outside. Cartridge filters are typically constructed as rigid or semi-rigid, self-supporting filter elements housed in pressure vessels in which flow is from the outside of the cartridge to the inside.

Although all filters classified as cartridge filters share similarities with respect to their construction, there are significant differences among the various commercial cartridge filtration devices. From a public health perspective, an important distinction among these filters is the ability to directly test the integrity of the filtration system in order to verify that there are no leaks that could result in contamination of the filtrate. Any membrane cartridge filtration device that can be direct integrity tested according to the criteria specified in section IV.C.11.a is eligible for removal credit as a membrane, subject to the criteria specified in that section. Section IV.C.12 applies to all bag filters, as well as to cartridge filters which cannot be direct integrity tested.

Challenge Testing

In order to receive 1 log removal credit, a bag filter must have a demonstrated removal efficiency of 2 log or greater for Cryptosporidium. Similarly, to receive 2 log removal credit, a cartridge filter must have a demonstrated removal efficiency of 3 log or greater for Cryptosporidium. The 1 log factor of safety is applied to the removal credit awarded to these filtration devices based on two primary considerations. First, the removal efficiency of some bag and cartridge filters has been observed to vary by more than 1 log over the course of operation (Li et al. 1997, NSF 2001a, NSF 2001b). Second, bag and cartridge filters are not routinely direct integrity tested during operation in the field; hence, there is no means of verifying the removal efficiency of filtration units during routine use. Based on these considerations, a conservative approach to awarding removal credit based on challenge test results is warranted.

Removal efficiency must be demonstrated through a challenge test conducted on the bag or cartridge filter proposed for use in full-scale drinking water treatment facilities for removal of Cryptosporidium. Challenge testing is required for specific products and is not intended to be site specific. At the discretion of the State, data from challenge studies conducted prior to promulgation of this regulation may be considered in lieu of additional testing. However, the prior testing must have been conducted in a manner that demonstrates a removal efficiency for Cryptosporidium commensurate with the treatment credit awarded to the process. Guidance on conducting challenge studies to demonstrate the Cryptosporidium removal efficiency of filtration units is presented in the Membrane Filtration Guidance Manual (USEPA 2003e). Challenge testing must be conducted according to the following criteria:

  • Challenge testing must be conducted on a full-scale filter element identical in material and construction to the filter elements proposed for use in full-scale treatment facilities.
  • Challenge testing must be conducted using Cryptosporidium oocysts or a surrogate which is removed no more efficiently than Cryptosporidium oocysts. The organism or surrogate used during challenge testing is referred to as the challenge particulate. The concentration of the challenge particulate must be determined using a method capable of discretely quantifying the specific organism or surrogate used in the test, i.e., gross water quality measurements such as turbidity cannot be used.
  • The maximum allowable feed water concentration used during a challenge test is based on the detection limit of the challenge particulate in the filtrate and calculated using one of the following equations.

For bag filters:

Maximum Feed Concentration = 3.16 × 10³ × (Filtrate Detection Limit)

For cartridge filters:

Maximum Feed Concentration = 3.16 × 10⁴ × (Filtrate Detection Limit)

This will allow the demonstration of up to 3.5 log removal for bag filters and 4.5 log removal for cartridge filters during challenge testing if the challenge particulate is removed to the detection limit.

  • Challenge testing must be conducted at the maximum design flow rate specified by the manufacturer.
  • Each filter must be tested for a duration sufficient to reach 100% of the terminal pressure drop, a parameter specified by the manufacturer which establishes the end of the useful life of the filter. In order to achieve terminal pressure drop during the test, it will be necessary to add particulate matter to the test solution, such as fine carbon test dust or bentonite clay particles.
  • Each filter must be challenged with the challenge particulate during three periods over the filtration cycle: within 2 hours of start-up after a new bag or cartridge filter has been installed, when the pressure drop is between 45 and 55% of the terminal pressure drop, and at the end of the run after the pressure drop has reached 100% of the terminal pressure drop.
  • Removal efficiency of a bag or cartridge filtration process is determined from the results of the challenge test, and expressed in terms of log removal values as defined by the following equation:

LRV = log₁₀(Cf) − log₁₀(Cp)

where LRV = log removal value demonstrated during challenge testing; Cf = the feed concentration used during the challenge test; and Cp = the filtrate concentration observed during the challenge test. For this equation to be valid, equivalent units must be used for the feed and filtrate concentrations. If the challenge particulate is not detected in the filtrate, then the term Cp is set equal to the detection limit. An LRV is calculated for each filter evaluated during the test.

  • In order to receive treatment credit for Cryptosporidium under this proposed rule, challenge testing must demonstrate a removal efficiency of 2 log or greater for bag filtration and 3 log or greater for cartridge filtration. If fewer than twenty filters are tested, then removal efficiency of the process is set equal to the lowest of the representative LRVs among the various filters tested. If twenty or more filters are tested, then removal efficiency of the process is set equal to the 10th percentile of the representative LRVs among the various filters tested. The percentile is defined by [i/(n+1)] where i is the rank of n individual data points ordered lowest to highest. It may be necessary to calculate the 10th percentile using linear interpolation.
  • Any significant modification to the filtration unit (e.g., changes to the filtration media, changes to the configuration of the filtration media, significant modifications to the sealing system) would require additional challenge testing to demonstrate removal efficiency of the modified unit.
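The LRV equation and the percentile rule above can be sketched in Python. This is a minimal illustration of the proposed calculation; the function names are hypothetical and not part of the rule text.

```python
import math

def filter_lrv(feed_conc, filtrate_conc, detection_limit):
    """LRV = log10(Cf) - log10(Cp), with a non-detect in the filtrate
    (represented here as None) assigned the detection limit."""
    cp = detection_limit if filtrate_conc is None else filtrate_conc
    return math.log10(feed_conc) - math.log10(cp)

def process_removal_efficiency(lrvs):
    """Overall removal efficiency of the process: the lowest LRV when
    fewer than twenty filters are tested, otherwise the 10th percentile
    of the ordered LRVs, with the percentile rank of the i-th ordered
    value defined by i/(n+1) and linear interpolation between ranks."""
    vals = sorted(lrvs)
    n = len(vals)
    if n < 20:
        return vals[0]
    rank = 0.10 * (n + 1)        # fractional rank of the 10th percentile
    i = int(math.floor(rank))    # rank of the lower bracketing value
    frac = rank - i
    return vals[i - 1] + frac * (vals[i] - vals[i - 1])
```

For example, with n = 20 filters the 10th-percentile rank is 0.10 × 21 = 2.1, so the reported removal efficiency lies one tenth of the way between the second- and third-lowest LRVs.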

b. How was this proposal developed? The Stage 2 M-DBP Agreement in Principle recommended that EPA develop criteria for awarding Cryptosporidium removal credits of 1 log for bag filters and 2 log for cartridge filters. Today's proposal is consistent with the Agreement.

A limited amount of published data are available regarding the removal efficiency of bag and cartridge filters with respect to Cryptosporidium oocysts or suitable surrogates. The relevant studies identified in the literature are summarized in Table IV-18.

Table IV-18.—Results From Studies of Cryptosporidium or Surrogate Removal by Bag and Cartridge Filters

Process | Log removal | Organism/surrogate | Reference
Bag and cartridge filtration in series | 1.1 to 2.1 | 3 to 6 μm spheres | NSF 2001a
Cartridge filtration | 3.5 (average) | Cryptosporidium | Enriquez et al. 1999
Cartridge filtration | 3.3 (average) | Cryptosporidium | Roessler 1998
Cartridge filtration | 1.1 to 3.3 | Cryptosporidium | Schaub et al. 1993
Cartridge filtration | 0.5 to 3.6 | 5.7 μm spheres | Long 1983
Cartridge filtration | 2.3 to 2.8 | Cryptosporidium | Ciardelli 1996a
Cartridge filtration | 2.7 to 3.7 | Cryptosporidium | Ciardelli 1996b
Prefilter and bag filter in series | 1.9 to 3.2 | 3.7 μm spheres | NSF 2001b
Bag filtration | ∼3.0 | Cryptosporidium | Cornwell and LeChevallier 2002
Bag filtration | 0.5 to 3.6 | Cryptosporidium | Li et al. 1997
Bag filtration | 0.5 to 2.0 | 4.5 μm spheres | Goodrich et al. 1995

These data demonstrate highly variable removal performance for these processes, ranging from 0.5 log to 3.6 log for both bag and cartridge filtration. Results of these studies also show no correlation between the pore size rating established by the manufacturer and the removal efficiency of a filtration device. In a study evaluating two cartridge filters, both with a pore size rating of 3 μm, a 2 log difference in Cryptosporidium oocyst removal was observed between the two filters (Schaub et al. 1993). Another study evaluated seventeen cartridge filters with a range of pore size ratings from 1 μm to 10 μm and found no correlation with removal efficiency (Long, 1983). Li et al. (1997) evaluated three bag filters with similar pore size ratings and observed a 3 log difference in Cryptosporidium oocyst removal among them. These results indicate that bag and cartridge filters may be capable of achieving removal of oocysts in excess of 3 log; however, performance can vary significantly among products and there appears to be no correlation between pore size rating and removal efficiency.

Based on available data, specific design criteria that correlate to removal efficiency cannot be derived for bag and cartridge filters. Furthermore, the removal efficiency of these proprietary devices can be impacted by product variability, increasing pressure drop over the filtration cycle, flow rate, and other operating conditions. The data in Table IV-18 were generated from studies performed under a variety of operating conditions, many of which could not be considered conservative (or worst-case) operation. These considerations lead to the proposed challenge testing requirements which are intended to establish a product-specific removal efficiency.

The proposed challenge testing is product-specific and not site-specific since the intent of this testing is to demonstrate the removal capabilities of the filtration device rather than evaluate the feasibility of implementing the technology at a specific plant. Challenge testing must be conducted using full-scale filter elements in order to evaluate the performance of the entire unit, including the filtration media, seals, filter housing and other components integral to the filtration system. This will improve the applicability of challenge test results to full-scale performance. Multiple filters of the same type can be tested to provide a better statistical basis for estimating removal efficiency.

Either Cryptosporidium oocysts or a suitable surrogate could be used as the challenge particulate during the test. Challenge testing with Cryptosporidium provides direct verification of removal efficiency; however, some studies have demonstrated that surrogates, such as polystyrene microspheres, can provide an accurate or conservative measure of removal efficiency (Long 1983, Li et al. 1997). Furthermore, the National Sanitation Foundation (NSF) Environmental Technology Verification (ETV) protocol for verification testing for physical removal of microbiological and particulate contaminants specifies the use of polymeric microspheres of a known size distribution (NSF 2002b). Guidance on selection of an appropriate surrogate for establishing a removal efficiency for Cryptosporidium during challenge testing is presented in the Membrane Filtration Guidance Manual (USEPA 2003e).

In order to demonstrate a removal efficiency of at least 2 or 3 log for bag or cartridge filters, respectively, it will likely be necessary to seed the challenge particulate into the test solution. A criticism of published studies that use this approach is that the seeded levels are orders of magnitude higher than those encountered in natural waters and this could potentially lead to artificially high estimates of removal efficiency. To address this issue, the feed concentration applied to the filter during challenge studies is capped at a level that will allow the demonstration of a removal efficiency up to 4.5 log for cartridge filters and 3.5 log for bag filters if the challenge particulate is removed to the detection level.
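The caps described above correspond to factors of 10^3.5 and 10^4.5 over the filtrate detection limit. A quick arithmetic check follows; the detection limit and function name are assumed for illustration only.

```python
import math

# Proposed feed-concentration caps: 3.16e3 (≈ 10^3.5) times the filtrate
# detection limit for bag filters, and 3.16e4 (≈ 10^4.5) for cartridge filters.
CAP_MULTIPLIER = {"bag": 3.16e3, "cartridge": 3.16e4}

def max_feed_concentration(detection_limit, filter_type):
    return CAP_MULTIPLIER[filter_type] * detection_limit

detection_limit = 1.0  # particles per litre, assumed for this example
for ftype in ("bag", "cartridge"):
    feed = max_feed_concentration(detection_limit, ftype)
    # demonstrable LRV when the challenge particulate is removed
    # all the way down to the detection limit
    print(ftype, round(math.log10(feed / detection_limit), 1))
# prints: bag 3.5, then cartridge 4.5
```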

The removal efficiency of some bag and cartridge filtration devices has been shown to decrease over the course of a filtration cycle due to the accumulation of solids and resulting increase in pressure drop. As an example, Li et al. (1997) observed that the removal of 4.5 μm microspheres by a bag filter decreased from 3.4 log to 1.3 log over the course of a filtration cycle. Studies evaluating bag and cartridge filtration under the NSF ETV program have also shown a degradation in removal efficiency over the course of the filtration cycle (NSF 2001a and 2001b). In order to evaluate this potential variability, the challenge studies are designed to assess removal efficiency during three periods of a filtration cycle: within two hours of startup following installation of a new filter, between 45% and 55% of terminal pressure drop, and at the end of the run after 100% of terminal pressure drop is realized.

Although challenge testing can provide an estimate of removal efficiency for a bag or cartridge filtration process, it is not feasible to conduct a challenge test on every production filter. This, coupled with variability within a product line, could result in some production filters that do not meet the removal efficiency demonstrated during challenge testing. For membrane filtration processes, this problem is addressed through the use of a quality control release value established for a non-destructive test, such as a bubble point test or pressure hold test, that is correlated to removal efficiency. Since the non-destructive test can be applied to all production membrane modules, this provides a feasible means of verifying the performance of every membrane module used by a PWS. However, the non-destructive tests applied to membrane filtration processes cannot be applied to most bag and cartridge filtration devices, and EPA is not aware of an alternative non-destructive test that can be used with these devices.

Typical process monitoring for bag and cartridge filtration systems includes turbidity and pressure drop to determine when filters must be replaced. However, the applicability of either of these process monitoring parameters as a tool for verifying removal of Cryptosporidium has not been demonstrated. Only a few bag or cartridge filtration studies have attempted to correlate turbidity removal with removal of Cryptosporidium oocysts or surrogates. Li et al. (1997) found that the removal efficiency for turbidity was consistently lower than the removal efficiency for oocysts or microspheres for the three bag filters evaluated. Furthermore, none of the filters was capable of consistently producing a filtered water turbidity below 0.3 NTU for the waters evaluated. The contribution to turbidity from particles much smaller than Cryptosporidium oocysts, and much smaller than the mesh size of the filter, makes it difficult to correlate removal of turbidity with removal of Cryptosporidium. Consequently, EPA is proposing a 1 log factor of safety to be applied to challenge test results in awarding treatment credit to bag and cartridge filters, and is not proposing integrity monitoring requirements for these devices.

c. Request for comment. EPA requests comment on the following issues concerning bag and cartridge filters:

  • The performance of bag and cartridge filters in removing Cryptosporidium through all differential pressure ranges in a filter run—EPA requests laboratory and field data, along with associated quality assurance and quality control information, that will support a determination of the appropriate level of Cryptosporidium removal credit to award to these technologies.
  • The performance of bag and cartridge filters in removing Cryptosporidium when used in series with other bag or cartridge filters—EPA requests laboratory and field data, along with associated quality assurance and quality control information, that will support a determination of the appropriate level of Cryptosporidium removal credit to award to these technologies when used in series.
  • Appropriate surrogates, or the characteristics of appropriate surrogates, for use in challenge testing bag and cartridge filters—EPA requests data or information demonstrating the correlation between removal of a proposed surrogate and removal of Cryptosporidium oocysts.
  • The availability of non-destructive tests that can be applied to bag and cartridge filters to verify the removal efficiency of production filters that are not directly challenge tested—EPA requests data or information demonstrating the correlation between a proposed non-destructive test and the removal of Cryptosporidium oocysts.
  • The applicability of pressure drop monitoring, filtrate turbidity monitoring, or other process monitoring and process control procedures to verify the integrity of bag and cartridge filters—EPA requests data or information demonstrating the correlation between a proposed process monitoring tool and the removal of Cryptosporidium oocysts.
  • The applicability of bag and cartridge filters to different source water types and treatment scenarios.
  • The applicability of the proposed Cryptosporidium removal credits and testing criteria to Giardia lamblia.
  • The use of a 1 log factor of safety for awarding credit to bag and cartridge filters—EPA requests comment on whether this is an appropriate factor of safety to account for the inability to conduct integrity monitoring of these devices, as well as the variability in removal efficiency observed over the course of a filtration cycle for some filtration devices. This inability creates uncertainty regarding both changes in the performance of a given filter during use and variability in performance among filters in a given product line. If the 1 log factor of safety is higher than necessary to account for these factors, should the Agency establish a lower value, such as a 0.5 log factor of safety?

13. Secondary Filtration

a. What is EPA proposing today? Today's proposal allows systems using a second filtration stage to receive an additional 0.5 log Cryptosporidium removal credit. To be eligible for this credit, the secondary filtration must consist of rapid sand, dual media, granular activated carbon (GAC), or other fine grain media in a separate stage following rapid sand or dual media filtration. A cap, such as GAC, on a single stage of filtration will not qualify for this credit. In addition, the first stage of filtration must be preceded by a coagulation step, and both stages must treat 100% of the flow.

b. How was this proposal developed? Although not addressed in the Agreement in Principle, EPA has determined that secondary filtration meeting the criteria described in this section will achieve additional removal of Cryptosporidium oocysts. Consequently, additional removal credit may be appropriate. As reported in section III.D, many studies have shown that rapid sand filtration preceded by coagulation can achieve significant removal of Cryptosporidium (Patania et al. 1995, Nieminski and Ongerth 1995, Ongerth and Pecoraro 1995, LeChevallier and Norton 1992, LeChevallier et al. 1991, Dugan et al. 2001, Nieminski and Bellamy 2000, McTigue et al. 1998, Patania et al. 1999, Huck et al. 2000, Emelko et al. 2000). While these studies evaluated only a single stage of filtration, the same mechanisms of removal are expected to occur in a second stage of granular media filtration.

EPA received data from the City of Cincinnati, OH, on the removal of aerobic spores through a conventional treatment facility that employs GAC contactors for DBP, taste, and odor control after rapid sand filtration. As described previously, a number of studies (Dugan et al. 2001, Emelko et al. 1999 and 2000, Yates et al. 1998, Mazounie et al. 2000) have demonstrated that aerobic spores are a conservative indicator of Cryptosporidium removal by granular media filtration when preceded by coagulation.

During 1999 and 2000, the mean reported spore concentrations in the influent and effluent of the Cincinnati GAC contactors were 35.7 and 6.4 cfu/100 mL, respectively, indicating an average removal of 0.75 log across the contactors. Approximately 16% of the GAC-filtered water results were below the detection limit (1 cfu/100 mL), so the actual log spore removal may have been greater than these results indicate.
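The 0.75 log figure follows directly from the reported mean concentrations:

```python
import math

influent = 35.7   # mean spore concentration, cfu/100 mL
effluent = 6.4    # mean spore concentration, cfu/100 mL
log_removal = math.log10(influent / effluent)
print(round(log_removal, 2))  # prints 0.75
```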

In summary, studies in the cited literature demonstrate that a fine granular media filter preceded by coagulation can achieve high levels of Cryptosporidium removal. Data on increased removal resulting from a second stage of filtration are limited, and there is uncertainty regarding how effective a second stage of filtration will be in reducing levels of microbial pathogens that are not removed by the first stage of filtration. However, EPA has concluded that a secondary filtration process can achieve 0.5 log or greater removal of Cryptosporidium based on (1) the theoretical consideration that the same mechanisms of pathogen removal will be operative in both a primary and secondary filtration stage, and (2) data from the City of Cincinnati showing aerobic spore removal in GAC contactors following rapid sand filtration. Therefore, EPA believes it is appropriate to propose 0.5 log additional Cryptosporidium treatment credit for systems using secondary filtration which meets the criteria of this section.

c. Request for comment. The Agency requests comment on awarding a 0.5 log Cryptosporidium removal credit for systems using secondary filtration, including the design and operational criteria required to receive the log removal credit. EPA specifically requests comment on the following issues:

  • Should there be a minimum required depth for the secondary filter (e.g., 24 inches) in order for the system to receive credit?
  • Should systems be eligible to receive additional Cryptosporidium treatment credit within the microbial toolbox for both a second clarification stage (e.g., secondary filtration, second stage sedimentation) and lower finished water turbidity, given that additional particle removal achieved by the second clarification stage will reduce finished water turbidity?

14. Ozone and Chlorine Dioxide

a. What is EPA proposing today? Similar to the methodology used for estimating log inactivation of Giardia lamblia by various chemical disinfectants in 40 CFR 141.74, EPA is proposing the CT concept for estimating log inactivation of Cryptosporidium by chlorine dioxide or ozone. Under today's proposal, systems must determine the total inactivation of Cryptosporidium each day the system is in operation, based on the CT values in Table IV-19 for ozone and Table IV-20 for chlorine dioxide. The parameters necessary to determine the total inactivation of Cryptosporidium must be monitored as stated in 40 CFR 141.74(b)(3)(i), (iii), and (iv), as follows:

  • The temperature of the disinfected water must be measured at least once per day at each residual disinfectant concentration sampling point.
  • The disinfectant contact time(s) (“T”) must be determined for each day during peak hourly flow.
  • The residual disinfectant concentration(s) (“C”) of the water before or at the first customer must be measured each day during peak hourly flow.

Systems may have several disinfection segments (the segment is defined as a treatment unit process with a measurable disinfectant residual level and a liquid volume) in sequence along the treatment train. In determining the total log inactivation, the system may calculate the log inactivation for each disinfection segment and use the sum of the log inactivation estimates of Cryptosporidium achieved through the plant. The Toolbox Guidance Manual, available in draft with today's proposal, provides guidance on methodologies for determining CT values and estimating log inactivation for different disinfection reactor designs and operations.
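As a minimal sketch of the additive calculation described above, the following computes per-segment log inactivation and sums it across the treatment train. The residual concentrations, contact times, and the simplified per-segment method shown here (CT achieved divided by the CT required per log of credit, using the Table IV-19 value for ozone at 10 °C) are illustrative assumptions only; actual determinations must follow the CT tables and the Toolbox Guidance Manual.

```python
# Sketch: summing Cryptosporidium log inactivation across disinfection
# segments. CT_PER_LOG is the Table IV-19 ozone value at 10 °C for 1.0 log
# credit; segment measurements are hypothetical illustration values.

CT_PER_LOG_OZONE_10C = 9.9  # mg-min/L per 1.0 log credit (Table IV-19)

def segment_log_inactivation(c_mg_per_l: float, t_min: float,
                             ct_per_log: float) -> float:
    """Log inactivation for one segment: CT achieved / CT required per log."""
    return (c_mg_per_l * t_min) / ct_per_log

# Two ozone contact segments, C and T measured at peak hourly flow:
segments = [(0.8, 6.0), (0.5, 5.0)]  # (residual C in mg/L, contact time T in min)
total = sum(segment_log_inactivation(c, t, CT_PER_LOG_OZONE_10C)
            for c, t in segments)
print(round(total, 2))
```

The sum of the per-segment estimates is the total log inactivation credited through the plant.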

Table IV-19.—CT Values for Cryptosporidium Inactivation by Ozone

Log credit                Water temperature, °C ¹
            ≤0.5     1     2     3     5     7    10    15    20    25
0.5           12    12    10   9.5   7.9   6.5   4.9   3.1   2.0   1.2
1.0           24    23    21    19    16    13   9.9   6.2   3.9   2.5
1.5           36    35    31    29    24    20    15   9.3   5.9   3.7
2.0           48    46    42    38    32    26    20    12   7.8   4.9
2.5           60    58    52    48    40    33    25    16   9.8   6.2
3.0           72    69    63    57    47    39    30    19    12   7.4

¹ CT values between the indicated temperatures may be determined by interpolation.
Table IV-20.—CT Values for Cryptosporidium Inactivation by Chlorine Dioxide

Log credit                Water temperature, °C ¹
            ≤0.5     1     2     3     5     7    10    15    20    25
0.5          319   305   279   256   214   180   138    89    58    38
1.0          637   610   558   511   429   360   277   179   116    75
1.5          956   915   838   767   643   539   415   268   174   113
2.0         1275  1220  1117  1023   858   719   553   357   232   150
2.5         1594  1525  1396  1278  1072   899   691   447   289   188
3.0         1912  1830  1675  1534  1286  1079   830   536   347   226

¹ CT values between the indicated temperatures may be determined by interpolation.
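The footnote to Tables IV-19 and IV-20 permits linear interpolation between the tabulated temperatures, and the proposal directs that temperatures above 25 °C be set to 25 °C. A minimal sketch of that lookup, using the 1.0 log ozone row of Table IV-19, is shown below; the function name and the choice of linear interpolation between adjacent table entries are illustrative assumptions.

```python
# Sketch of the footnote's interpolation between tabulated temperatures
# (Table IV-19, 1.0 log ozone row). Temperatures above 25 °C are capped
# at 25 °C, as stated in this section.

TEMPS = [0.5, 1, 2, 3, 5, 7, 10, 15, 20, 25]
CT_1LOG_OZONE = [24, 23, 21, 19, 16, 13, 9.9, 6.2, 3.9, 2.5]

def required_ct(temp_c: float) -> float:
    """Required CT (mg-min/L) for 1.0 log ozone credit at temp_c."""
    t = min(max(temp_c, TEMPS[0]), 25.0)  # cap at the table's temperature range
    for i in range(len(TEMPS) - 1):
        lo, hi = TEMPS[i], TEMPS[i + 1]
        if lo <= t <= hi:
            frac = (t - lo) / (hi - lo)
            return CT_1LOG_OZONE[i] + frac * (CT_1LOG_OZONE[i + 1] - CT_1LOG_OZONE[i])
    return CT_1LOG_OZONE[-1]

print(round(required_ct(12.5), 2))  # midway between the 10 and 15 °C entries
```
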

The system may demonstrate to the State, through the use of a State-approved protocol for on-site disinfection challenge studies or other information satisfactory to the State, that CT values other than those specified in Tables IV-19 or IV-20 are adequate to demonstrate that the system is achieving the required log inactivation of Cryptosporidium. Protocols for making such demonstrations are available in the Toolbox Guidance Manual.

b. How was this proposal developed? EPA relied in part on analyses by Clark et al. (2002a and 2002b) to develop the CT values for ozone and chlorine dioxide inactivation of Cryptosporidium in today's proposal. Clark et al. (2002a) used data from studies of ozone inactivation of Cryptosporidium in laboratory water to develop predictive equations for estimating inactivation (Rennecker et al. 1999, Li et al. 2001) and data from studies in natural water to validate the equations (Owens et al. 2000, Oppenheimer et al. 2000). For chlorine dioxide, Clark et al. (2002b) employed data from Li et al. (2001) to develop equations for predicting inactivation, and used data from Owens et al. (1999) and Ruffell et al. (2000) to validate the equations.

Another step in developing the CT values for Cryptosporidium inactivation in today's proposal involved consideration of the appropriate confidence bound to apply when analyzing the inactivation data. A confidence bound represents a safety margin that accounts for variability and uncertainty in the data that underlie the analysis. Confidence bounds are intended to provide a high likelihood that systems operating at a given CT value will achieve at least the corresponding log inactivation level in the CT table.

Two types of confidence bounds that are used when assessing relationships between variables, such as disinfectant dose (CT) and log inactivation, are confidence in the regression and confidence in the prediction. Confidence in the regression accounts for uncertainty in the regression line (e.g., a linear relationship between temperature and the log of the ratio of CT to log inactivation). Confidence in the prediction accounts for both uncertainty in the regression line and variability in experimental observations—it describes the likelihood of a single future data point falling within a range. Bounds for confidence in prediction are wider (i.e., more conservative) than those for confidence in the regression. Depending on the degree of confidence applied, most points in a data set typically will fall within the bounds for confidence in the prediction, while a significant fraction will fall outside the bounds for confidence in the regression.
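The distinction above can be illustrated numerically: for a simple linear regression, the half-width of the interval for confidence in the prediction always exceeds that for confidence in the regression, because it carries an extra variance term for a single future observation. The data set and the fixed t multiplier in this sketch are illustrative only, not drawn from the inactivation studies.

```python
import math
from statistics import mean

# Toy contrast of the two interval types for a linear fit y = a + b*x.
# Data and the t multiplier (fixed for brevity) are illustrative only.

xs = [1.0, 2.0, 3.0, 4.0, 5.0, 6.0]
ys = [2.1, 3.9, 6.2, 7.8, 10.1, 12.2]

n = len(xs)
xbar, ybar = mean(xs), mean(ys)
sxx = sum((x - xbar) ** 2 for x in xs)
b = sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys)) / sxx
a = ybar - b * xbar
# residual standard error with n - 2 degrees of freedom
se = math.sqrt(sum((y - (a + b * x)) ** 2 for x, y in zip(xs, ys)) / (n - 2))

t = 2.776  # two-sided 95% t multiplier for 4 degrees of freedom

def half_widths(x0: float):
    """Return (regression-mean CI, single-prediction PI) half-widths at x0."""
    core = 1.0 / n + (x0 - xbar) ** 2 / sxx
    ci = t * se * math.sqrt(core)        # confidence in the regression
    pi = t * se * math.sqrt(1.0 + core)  # confidence in the prediction
    return ci, pi

ci, pi = half_widths(3.5)
print(pi > ci)  # prediction bounds are always wider
```
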

In developing earlier CT tables, EPA has used bounds for confidence in the prediction. This was a conservative approach that was taken with consideration of the limited inactivation data that were available and that reasonably ensured systems would achieve the required inactivation level. The November 2001 draft of the LT2ESWTR included CT tables for Cryptosporidium inactivation by ozone and chlorine dioxide that were derived using confidence in prediction (USEPA 2001g). However, based on comments received on those draft tables, along with further analyses described next, EPA has revised this approach in today's proposal.

The underlying Cryptosporidium inactivation data used to develop the CT tables exhibit significant variability. This variability is due to both experimental error and potential true variability in the inactivation rate. Experimental error is associated with the assays used to measure loss of infectivity, measurement of the disinfectant concentration, differences in technique among researchers, and other factors. True variability in the inactivation rate would be associated with variability in resistance to the disinfectant between different populations of oocysts and variability in the effect of water matrix on the inactivation process.

In considering the appropriate confidence bounds to use for developing the CT tables in today's proposal, EPA was primarily concerned with accounting for uncertainty in the regression and for true variability in the inactivation rate. Variability associated with experimental error was a lesser concern, as the purpose of the CT tables is to ensure a given level of inactivation, not to predict the measured result of an individual experiment.

Because confidence in the prediction accounts for all variability in the data sets (both true variability and experimental error), it may provide a higher margin of safety than is necessary. Nevertheless, in other disinfection applications, the use of confidence in the prediction may be appropriate, given limited data sets and uncertainty in the source of the variability. However, the high doses of ozone and chlorine dioxide that are needed to inactivate Cryptosporidium create an offsetting concern with the formation of DBPs (e.g., bromate and chlorite). In consideration of these factors and the statutory provision for balancing risks among contaminants, EPA attempted to exclude experimental error from the confidence bound when developing the CT tables in today's proposal (i.e., used a less conservative approach than confidence in the prediction).

In order to select confidence bounds reflecting potential true variability between different oocyst populations (lots) but not variability due to measurement and experimental imprecision, it was necessary to estimate the relative contributions of these variance components. This was done by first separating inactivation data points into groups having the same Cryptosporidium oocyst lot and experimental conditions (e.g., water matrix, pH, temperature). Next, the variance within each group was determined. It was assumed that this within-group variance could be attributed entirely to experimental error, as neither of the factors expected to account for true variability in the inactivation rate (i.e., oocyst lot or water matrix) changed within a group. Finally, comparing the average within-group variance to the total variance in a data set provided an indication of the fraction of total variance that was due to experimental error (see Sivaganesan 2003 and Messner 2003 for details).
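The variance-partitioning steps above can be sketched as follows: group observations by oocyst lot and experimental condition, treat the pooled within-group variance as experimental error, and report its share of the total variance. The group labels and data values are hypothetical; the actual analyses are in Sivaganesan (2003) and Messner (2003).

```python
from statistics import variance, mean

# Sketch of the variance partitioning described above. Within-group
# variance (same oocyst lot and conditions) is attributed to experimental
# error; its share of total variance is reported. Numbers are illustrative.

groups = {  # hypothetical (oocyst lot / condition) -> observed log inactivations
    "lot_A": [1.9, 2.1, 2.0, 2.2],
    "lot_B": [2.6, 2.4, 2.5],
    "lot_C": [1.4, 1.6, 1.5, 1.7],
}

within = mean(variance(v) for v in groups.values())  # experimental error
all_points = [y for v in groups.values() for y in v]
total = variance(all_points)

fraction_error = within / total
print(0.0 < fraction_error < 1.0)
```

The remaining fraction (1 − fraction_error) is the share attributed to true variability, which EPA carried into the modified confidence-bound calculation.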

In carrying out this analysis on the Li et al. (2001) and Rennecker et al. (1999) data sets for ozone inactivation of Cryptosporidium, EPA estimated that 87.5% of the total variance could be attributed to experimental error (Sivaganesan 2003). A similar analysis done by Najm et al. (2002) on the Oppenheimer et al. (2000) data set for ozone produced an estimate of 89% of the total variance due to experimental error. For chlorine dioxide inactivation of Cryptosporidium, EPA estimated that 62% of the total variance in the Li et al. (2001) and Ruffell et al. (2000) data sets could be attributed to experimental error (Messner 2003). The different fractions attributed to experimental error between the chlorine dioxide and ozone data sets presumably reflect the use of different experimental techniques (e.g., infectivity assays).

EPA employed estimates of the fraction of variance not attributable to experimental error (12.5% for ozone and 38% for chlorine dioxide) in a modified form of the equation used to calculate a bound for confidence in prediction (Messner 2003). These were applied to the regression equations developed by Clark et al. (2002a and 2002b) in order to estimate CT values for an upper 90% confidence bound (Sivaganesan 2003, Messner 2003). These are the CT values shown in Tables IV-19 and IV-20 for ozone and chlorine dioxide, respectively.

Since the available data are not sufficient to support the CT calculation for an inactivation level greater than 3 log, the use of Tables IV-19 and IV-20 is limited to inactivation less than or equal to 3 log. In addition, the temperature limitation for these tables is 1 to 25 °C. If the water temperature is higher than 25 °C, temperature should be set to 25 °C for the log inactivation calculation.

EPA recognizes that inactivation rates may be sensitive to water quality and operational conditions in the plant. To reflect this potential, systems are given the option to perform a site-specific inactivation study to determine CT requirements. The State must approve the protocols or other information used to derive alternative CT values. EPA has provided guidance for systems making such demonstrations in the Toolbox Guidance Manual.

During meetings of the Stage 2 M-DBP Advisory Committee, CT values were used in the model for impact analysis of different regulatory options (the model Surface Water Analytical Tool (SWAT), as described in Economic Analysis for the LT2ESWTR, USEPA 2003a). Those preliminary CT values were based on a subset of the data from the Li et al. (2001) study with laboratory waters and were adjusted with a factor to match the mean CT values derived from the Oppenheimer et al. (2000) study with natural waters. In comparison, the CT values in today's proposal are higher. However, the current CT values are based on larger data sets and more comprehensive analyses. Consequently, they provide more confidence in estimates of Cryptosporidium log inactivation than the preliminary estimates used in earlier SWAT modeling. EPA has subsequently re-run analyses for LT2ESWTR impact assessments with the updated CT values (USEPA 2003a).

c. Request for comments. EPA requests comment on the proposed approach to awarding credit for inactivation of Cryptosporidium by chlorine dioxide and ozone, including the following specific issues:

  • Determination of CT and the confidence bounds used for estimating log inactivation of Cryptosporidium;
  • The ability of systems to apply these CT tables in consideration of the MCLs for bromate and chlorite; and
  • Any additional data that may be used to confirm or refine the proposed CT tables.

15. Ultraviolet Light

a. What is EPA proposing today? EPA is proposing criteria for awarding credit to ultraviolet (UV) disinfection processes for inactivation of Cryptosporidium, Giardia lamblia, and viruses. The inactivation credit a system can receive for each target pathogen is based on the UV dose applied by the system in relation to the UV dose requirements in this section (see Table IV-21).

To receive UV disinfection credit, a system must demonstrate a UV dose using the results of a UV reactor validation test and ongoing monitoring. The reactor validation test establishes the operating conditions under which a reactor can deliver a required UV dose. Monitoring is used to demonstrate that the system maintains these validated operating conditions during routine use.

UV dose (fluence) is defined as the product of the UV intensity over a surface area (fluence rate) and the exposure time. In practice, UV reactors deliver a distribution of doses due to variation in light intensity and flow path as particles pass through the reactor. However, for the purpose of determining compliance with the dose requirements in Table IV-21, UV dose must be assigned to a reactor based on the degree of inactivation of a microorganism achieved during a reactor validation test. This assigned UV dose is determined through comparing the reactor validation test results with a known dose-response relationship for the test microorganism. The State may designate an alternative basis for awarding UV disinfection credit.
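The dose-assignment step above can be sketched by inverting a known dose-response relationship for the test microorganism. The assumption of a linear (first-order) dose-response and the rate constant used here are hypothetical illustration values; the actual dose-response curve must be established for the specific test microorganism, as described in the draft UV Disinfection Guidance Manual.

```python
# Sketch of assigning a UV dose from validation test results. Assumes the
# test microorganism follows a known linear dose-response:
#   log_inactivation = k * dose.
# Both k and the measured inactivation below are hypothetical.

K_TEST_ORGANISM = 0.05  # log inactivation per mJ/cm^2 (hypothetical)

def assigned_uv_dose(measured_log_inactivation: float,
                     k: float = K_TEST_ORGANISM) -> float:
    """Invert the dose-response curve to get the delivered dose (mJ/cm^2)."""
    return measured_log_inactivation / k

# Validation test measured 2.0 log inactivation of the test organism:
print(assigned_uv_dose(2.0))
```

The resulting assigned dose is then compared against the dose requirements in Table IV-21 to determine the inactivation credit.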

EPA is developing the UV Disinfection Guidance Manual (USEPA 2003d) to assist systems and States with implementing UV disinfection, including validation testing of UV reactors. This guidance is available in draft in the docket for today's proposal (http://www.epa.gov/edocket/).

UV Dose Tables

Table IV-21 shows the UV doses that systems must apply to receive credit for up to 3 log inactivation of Cryptosporidium and Giardia lamblia and up to 4 log inactivation of viruses. These dose values are for UV light at a wavelength of 254 nm as delivered by a low pressure mercury vapor lamp. However, the dose values can be applied to other UV lamp types (e.g., medium pressure mercury vapor lamps) through reactor validation testing, such as is described in the draft UV Disinfection Guidance Manual (USEPA 2003d). In addition, the dose values in Table IV-21 are intended for post-filter application of UV in filtration plants and for systems that meet the filtration avoidance criteria in 40 CFR 141.71.

[Table IV-21, which presents the required UV doses, appears as a graphic in the printed page and is not reproduced here.]

Reactor Validation Testing

For a system to receive UV disinfection credit, the UV reactor type used by the system must undergo validation testing to demonstrate the operating conditions under which the reactor can deliver the required UV dose. Unless the State approves an alternative approach, this testing must involve the following: (1) Full scale testing of a reactor that conforms uniformly to the UV reactors used by the system and (2) inactivation of a test microorganism whose dose response characteristics have been quantified with a low pressure mercury vapor lamp.

Validation testing must determine a set of operating conditions that can be monitored by the system to ensure that the required UV dose is delivered under the range of operating conditions applicable to the system. At a minimum, these operating conditions must include flow rate, UV intensity as measured by a UV sensor, and UV lamp status. The validated operating conditions determined by testing must account for the following factors: (1) UV absorbance of the water, (2) lamp fouling and aging, (3) measurement uncertainty of on-line sensors, (4) dose distributions arising from the velocity profiles through the reactor, (5) failure of UV lamps or other critical system components, and (6) inlet and outlet piping or channel configurations of the UV reactor. In the draft UV Disinfection Guidance Manual (USEPA 2003d), EPA describes testing protocols for reactor validation that are intended to meet these criteria.

Reactor Monitoring

Systems must monitor the parameters necessary to demonstrate compliance with the operating conditions that were validated for the required UV dose. At a minimum, systems must monitor UV intensity as measured by a UV sensor, flow rate, and lamp outage. As part of this, systems must check the calibration of UV sensors and recalibrate in accordance with a protocol approved by the State.

b. How was this proposal developed? UV disinfection is a physical process relying on the transference of electromagnetic energy from a source (lamp) to an organism's cellular material (USEPA 1986). In the Stage 2 M-DBP Agreement in Principle, the Advisory Committee recommended that EPA determine the UV doses needed to achieve up to 3 log inactivation of Giardia lamblia and Cryptosporidium and up to 4 log inactivation of viruses.

The Agreement further recommends that EPA develop standards to determine if UV systems are acceptable for compliance with drinking water disinfection requirements, including (1) a validation protocol for drinking water applications of UV technology and (2) on-site monitoring requirements to ensure ongoing compliance with UV dose tables. EPA also agreed to develop a UV guidance manual to facilitate design and operation of UV installations. Today's proposal and accompanying guidance for UV are consistent with the Agreement.

UV Dose Tables

The UV dose values in Table IV-21 are based on meta-analyses of UV inactivation studies with Cryptosporidium parvum, Giardia lamblia, Giardia muris, and adenovirus (Qian et al. 2003, USEPA 2003d). Proposed UV doses for inactivation of viruses are based on the dose-response of adenovirus because, among viruses that have been studied, it appears to be the most UV resistant and is a widespread waterborne pathogen (health effects of adenovirus are described in Embrey 1999).

The data supporting the dose values in Table IV-21 are from bench-scale studies using low pressure mercury vapor lamps. These data were chosen because the experimental conditions allow UV dose to be accurately quantified. Low pressure lamps emit light primarily at a single wavelength (254 nm) within the germicidal range of 200-300 nm. However, as noted earlier, these dose tables can be applied to reactors with other lamp types through reactor challenge testing, as described in the draft guidance manual. Bench scale studies are preferable for determining pathogen dose-response characteristics, due to the uniform dose distribution.

The data sets and statistical evaluation that were used to develop the UV dose table for Cryptosporidium, Giardia lamblia, and viruses are described in the draft UV Disinfection Guidance Manual (USEPA 2003d) and Qian et al. 2003.

Reactor Validation Testing

Today's proposal requires testing of full-scale UV reactors because of the difficulty in predicting reactor disinfection performance based on modeled results or on the results of testing at a reduced scale. All flow-through UV reactors deliver a distribution of doses due to variation in light intensity within the reactor and the different flow paths of particles passing through the reactor. Moreover, the reactor dose distribution varies temporally due to processes like lamp aging and fouling, changes in UV absorbance of the water, and fluctuations in flow rate. Consequently, it is more reliable to evaluate reactor performance through a full scale test under conditions that can be characterized as “worst case” for a given application. Such conditions include maximum and minimum flow rate and reduced light intensity within the reactor that accounts for lamp aging, fouling, and UV absorbance of the water. Protocols for reactor validation testing are presented in the draft UV guidance manual.

c. Request for comment. The Agency requests comment on whether the criteria described in this section for awarding treatment credit for UV disinfection are appropriate, and whether additional criteria, or more specific criteria, should be included.

16. Individual Filter Performance

a. What is EPA proposing today? EPA is proposing an additional 1.0 log Cryptosporidium treatment credit for systems that achieve individual filter performance consistent with the goals established for the Partnership for Safe Water Phase IV in August 2001 (AWWA et al. 2001). Specifically, systems must demonstrate ongoing compliance with the following turbidity criteria, based on continuous monitoring of turbidity for each individual filter as required under 40 CFR 141.174 or 141.560, as applicable:

(1) Filtered water turbidity less than 0.1 NTU in at least 95% of the maximum daily values recorded at each filter in each month, excluding the 15 minute period following backwashes, and

(2) No individual filter with a measured turbidity level of greater than 0.3 NTU in two consecutive measurements taken 15 minutes apart.
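The two criteria above amount to simple checks on each filter's turbidity record. A minimal sketch, applied to hypothetical data for one filter over a month, follows; the function names are illustrative, and for simplicity the 15-minute post-backwash exclusion is assumed to be applied before the data reach these checks.

```python
# Sketch of the two individual-filter performance criteria. Data are
# hypothetical; post-backwash readings are assumed already excluded.

def meets_95th_criterion(daily_max_ntu: list[float]) -> bool:
    """(1) At least 95% of maximum daily turbidity values below 0.1 NTU."""
    below = sum(1 for v in daily_max_ntu if v < 0.1)
    return below / len(daily_max_ntu) >= 0.95

def no_consecutive_exceedance(readings_ntu: list[float]) -> bool:
    """(2) No two consecutive 15-minute readings above 0.3 NTU."""
    return not any(a > 0.3 and b > 0.3
                   for a, b in zip(readings_ntu, readings_ntu[1:]))

daily_max = [0.05] * 29 + [0.12]     # one exceedance in a 30-day month
readings = [0.05, 0.31, 0.08, 0.06]  # single spike, not consecutive
print(meets_95th_criterion(daily_max) and no_consecutive_exceedance(readings))
```
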

Note that today's proposal does not include a required peer review step as a condition for receiving additional credit. Rather, EPA is proposing to award additional credit to systems that meet the performance goals of a peer review program (Phase IV). Systems that receive the 1.0 log additional treatment credit for individual filter performance, as described in this section, cannot also receive the additional 0.5 log credit for lower finished water turbidity described in section IV.C.8.

b. How was this proposal developed? In the Stage 2 M-DBP Agreement in Principle, the Advisory Committee recommended a peer review program as a microbial toolbox component that should receive a 1.0 log Cryptosporidium treatment credit. The Committee specified Phase IV of the Partnership for Safe Water (Partnership) as an example of the type of peer review program where a 1.0 log credit would be appropriate.

The Partnership is a voluntary cooperative program involving EPA, the Association of Metropolitan Water Agencies (AMWA), the American Water Works Association (AWWA), the National Association of Water Companies (NAWC), the Association of State Drinking Water Administrators (ASDWA), the American Water Works Association Research Foundation (AWWARF), and surface water utilities throughout the United States. The intent of the Partnership is to increase protection against microbial contaminants by optimizing treatment plant performance.

At the time of the Advisory Committee recommendation, Phase IV was under development by the Partnership. It was to be based on Composite Correction Program (CCP) (USEPA 1991) procedures and performance goals, and was to be awarded based on an on-site evaluation by a third-party team. The performance goals for Phase IV were such that, over a year, each sedimentation basin and each filter would need to produce specified turbidity levels based on the maximum of all the values recorded during the day. Sedimentation performance goals were set at 2.0 NTU if the raw water was greater than 10 NTU on an annual basis and 1.0 NTU if the raw water was less than 10 NTU. Each filter was to meet 0.1 NTU 95% of the time except for the 15 minute period following placing the filter in operation. In addition, filters were expected to have maximum turbidity of 0.3 NTU and return to less than 0.1 NTU within 15 minutes of the filter being placed in service.

The primary purpose of the on-site evaluation was to confirm that the performance of the plant was consistent with Phase IV performance goals and that the system had the administrative support and operational capabilities to sustain the performance long-term. The on-site evaluation in Phase IV also allowed utilities that could not meet the desired performance goals to demonstrate to the third-party that they had achieved the highest level of performance given their unique raw water quality.

After the signing of the Stage 2 M-DBP Agreement in Principle in September 2000, the Partnership decided to eliminate the on-site third-party evaluation as a component of Phase IV. Instead, the requirement for Phase IV is for the water system to complete an application package that will be reviewed by trained utility volunteers. Included in the application package is an Optimization Assessment Spreadsheet in which the system enters water quality and treatment data to demonstrate that Phase IV performance levels have been achieved. The application also requires narratives related to administrative support and operational capabilities to sustain performance long-term.

Today's proposal is consistent with the performance goals of Phase IV. Rather than require systems to complete an application package with historical data and narratives, the LT2ESWTR requires systems to demonstrate to the State that they meet the individual filter performance goals of Phase IV on an ongoing basis to receive the 1.0 log additional Cryptosporidium treatment credit. EPA is not requiring systems to demonstrate that they meet sedimentation performance goals of Phase IV. While EPA recognizes that settled water turbidity is an important operational performance measure for a plant, the Agency does not have data directly relating it to finished water quality and pathogen risk.

The November 2001 pre-proposal draft of the LT2ESWTR described a potential 1.0 log credit for systems that achieved individual filter effluent (IFE) turbidity below 0.15 NTU in 95 percent of samples (USEPA 2001g). The Science Advisory Board (SAB) subsequently reviewed this credit and supporting data on the relationship between filter effluent turbidity and Cryptosporidium removal efficiency (described in section IV.C.8). In written comments from a December 2001 meeting of the Drinking Water Committee, an SAB panel recommended only a 0.5 log credit for 95th percentile IFE turbidity below 0.15 NTU.

To address this recommendation from the SAB, EPA is proposing that systems meet the individual filter performance criteria of Phase IV of the Partnership in order to be eligible for a 1.0 log additional Cryptosporidium treatment credit. This proposed approach responds to the concerns raised by the SAB because the Phase IV criteria are more stringent than those in the 2001 pre-proposal draft of the LT2ESWTR. For example, today's proposal sets a maximum limit on individual filter effluent turbidity of 0.3 NTU, whereas no such upper limit was described in the 2001 pre-proposal draft.

In summary, EPA has concluded that it is appropriate to award additional Cryptosporidium treatment credit for systems meeting stringent individual filter performance standards. Modestly elevated turbidity from a single filter may not significantly impact combined filter effluent turbidity levels, which are regulated under IESWTR and LT1ESWTR, but may indicate a substantial reduction in the overall pathogen removal efficiency of the filtration process. Consequently, systems that continually achieve very low turbidity in each individual filter are likely to provide a significantly more effective microbial barrier. EPA expects that systems that select this toolbox option will have achieved a high level of treatment process optimization and process control, and will have both a history of consistent performance over a range of raw water quality conditions and the capability and resources to maintain this performance long-term.

c. Request for comment. The Agency invites comment on the following issues related to the proposed credit for individual filter performance.

  • Are there different or additional performance measures that a utility should be required to meet for the 1 log additional credit?
  • Are there existing peer review programs for which treatment credit should be awarded under the LT2ESWTR? If so, what role should primacy agencies play in establishing and managing any such peer review program?
  • The individual filter effluent turbidity criterion of 0.1 NTU is proposed because it is consistent with Phase IV Partnership standards, as based on CCP goals. However, with allowable rounding, turbidity levels less than 0.15 NTU are in compliance with a standard of 0.1 NTU. Consequently, EPA requests comment on whether 0.15 NTU should be the standard for individual filter performance credit, as this would be consistent with the 0.15 NTU standard proposed for combined filter performance credit in section IV.C.8.

17. Other Demonstration of Performance

a. What is EPA proposing today? The purpose of the “demonstration of performance” toolbox component is to allow a system to demonstrate that a plant, or a unit process within a plant, should receive a higher Cryptosporidium treatment credit than is presumptively awarded under the LT2ESWTR. For example, as described in section IV.A, plants using conventional treatment receive a presumptive 3 log credit towards the Cryptosporidium treatment requirements in Bins 2-4 of the LT2ESWTR. This credit is based on a determination by EPA that conventional treatment plants achieve an average Cryptosporidium removal of 3 log when in compliance with the IESWTR or LT1ESWTR. However, EPA recognizes that some conventional treatment plants may achieve average Cryptosporidium removal efficiencies greater than 3 log. Similarly, some systems may achieve Cryptosporidium reductions with certain toolbox components that are greater than the presumptive credits awarded under the LT2ESWTR, as described in this section (IV.C).

Where a system can demonstrate that a plant, or a unit process within a plant, achieves a Cryptosporidium reduction efficiency greater than the presumptive credit specified in the LT2ESWTR, it may be appropriate for the system to receive a higher Cryptosporidium treatment credit. Today's proposal does not include specific protocols for systems to make such a demonstration, due to the potentially complex and site specific nature of the testing that would be required. Rather, today's proposal allows a State to award a higher level of Cryptosporidium treatment credit to a system where the State determines, based on site-specific testing with a State-approved protocol, that a treatment plant or a unit process within a plant reliably achieves a higher level of Cryptosporidium removal on a continuing basis. Also, States may award a lower level of Cryptosporidium treatment credit to a system where a State determines, based on site specific information, that a plant or a unit process within a plant achieves a Cryptosporidium removal efficiency less than a presumptive credit specified in the LT2ESWTR.

Systems receiving additional Cryptosporidium treatment credit through a demonstration of performance may be required by the State to report operational data on a monthly basis to establish that conditions under which demonstration of performance credit was awarded are maintained during routine operation. The Toolbox Guidance Manual (USEPA 2003f) will describe potential approaches to demonstration of performance testing. This guidance is available in draft in the docket for today's proposal (http://www.epa.gov/edocket/).

Note that as described in section IV.C, today's proposal allows treatment plants to achieve additional Cryptosporidium treatment credit through meeting the design and/or operational criteria of microbial toolbox components, such as combined and individual filter performance, presedimentation, bank filtration, two-stage softening, secondary filtration, etc. Plants that receive additional Cryptosporidium treatment credit through a demonstration of performance are not also eligible for the presumptive credit associated with microbial toolbox components if the additional removal due to the toolbox component is captured in the demonstration of performance credit. For example, if a plant receives a demonstration of performance credit based on removal of Cryptosporidium or an indicator while operating under conditions of lower finished water turbidity, the plant may not also receive additional presumptive credit for lower finished water turbidity toolbox components.

This demonstration of performance credit does not apply to the use of chlorine dioxide, ozone, or UV light, because today's proposal includes specific provisions allowing the State to modify the standards for awarding disinfection credit to these technologies. As described in section IV.C.14, States can approve site-specific CT values for inactivation of Cryptosporidium by chlorine dioxide and ozone; as described in section IV.C.15, States can approve an alternative approach for validating the performance of UV reactors.

b. How was this proposal developed? The Stage 2 M-DBP Agreement in Principle recommends demonstration of performance as a process for systems to receive Cryptosporidium treatment credit higher than the presumptive credit for many microbial toolbox components, as well as credit for technologies not listed in the toolbox. EPA is aware that there may be plants where particular unit processes, or combinations of unit processes, achieve greater Cryptosporidium removal than the presumptive credit awarded under the LT2ESWTR. In addition, the Agency would like to allow for the use of Cryptosporidium treatment processes not addressed in the LT2ESWTR, where such processes can demonstrate a reliable specific log removal. Due to these factors, EPA is proposing a demonstration of performance component in the microbial toolbox, consistent with the Advisory Committee recommendation.

The Agreement in Principle makes no recommendations for how a demonstration of performance should be conducted. It is generally not practical for systems to directly quantify high log removal of Cryptosporidium in treatment plants because of the relatively low occurrence of Cryptosporidium in many raw water sources and limitations with analytical methods. Consequently, if systems are to demonstrate the performance of full scale plants in removing Cryptosporidium, this typically will require the use of indicators, where the removal of the indicator has been correlated with the removal of Cryptosporidium. As described previously, a number of studies have shown that aerobic spores are an indicator of Cryptosporidium removal by sedimentation and filtration (Dugan et al. 2001, Emelko et al. 1999 and 2000, Yates et al. 1998, Mazounie et al. 2000).

The nature of demonstration of performance testing that will be appropriate at a given facility will depend on site-specific factors, such as water quality, the particular process(es) being evaluated, resources and infrastructure, and the discretion of the State. Consequently, EPA is not proposing specific criteria for demonstration of performance testing. Instead, systems must develop a testing protocol that is approved by the State, including any requirements for ongoing reporting if demonstration of performance credit is approved. EPA has developed a draft document, Toolbox Guidance Manual (USEPA 2003f), that is available with today's proposal and provides guidance on demonstration of performance testing.

c. Request for comment. The Agency requests comment on today's proposal for systems to demonstrate higher Cryptosporidium removal levels. EPA specifically requests comment on the following issues:

  • Approaches that should be considered or excluded for demonstration of performance testing;
  • Whether EPA should propose minimum elements that demonstration of performance testing must include;
  • Whether a factor of safety should be applied to the results of demonstration of performance testing to account for potential differences in removal of an indicator and removal of Cryptosporidium, or uncertainty in the application of pilot-scale results to full-scale plants;
  • Whether or under what conditions a demonstration of performance credit should be allowed for a unit process within a plant—a potential concern is that certain unit processes, such as a sedimentation basin, can be operated in a manner that will increase removal in the unit process but decrease removal in subsequent treatment processes and, therefore, lead to no overall increase in removal through the plant. An approach to address this concern is to limit demonstration of performance credit to removal demonstrated across the entire treatment plant.

D. Disinfection Benchmarks for Giardia lamblia and Viruses

1. What Is EPA Proposing Today?

EPA proposes to establish the disinfection benchmark under the LT2ESWTR as a procedure to ensure that systems maintain protection against microbial pathogens as they implement the Stage 2 M-DBP rules (i.e., Stage 2 DBPR and LT2ESWTR). The disinfection benchmark serves as a tool for systems and States to evaluate the impact on microbial risk of proposed changes in disinfection practice. EPA established the disinfection benchmark under the IESWTR and LT1ESWTR for the Stage 1 M-DBP rules, as recommended by the Stage 1 M-DBP Advisory Committee. Today's proposal extends disinfection benchmark requirements to apply to the Stage 2 M-DBP rules.

Under the proposed LT2ESWTR, the disinfection benchmark procedure involves a system charting levels of Giardia lamblia and virus inactivation at least once per week over a period of at least one year. This creates a profile of inactivation performance that the system must use to determine a baseline or benchmark of inactivation against which proposed changes in disinfection practice can be measured. Only certain systems are required to develop profiles and keep them on file for State review during sanitary surveys. When those systems that are required to develop a profile plan a significant change in disinfection practice (defined later in this section), they must submit the profile and an analysis of how the proposed change will affect the current disinfection benchmark to the State for review.

Systems that developed disinfection profiles under the IESWTR or LT1ESWTR and have not made significant changes in their disinfection practice or changed sources are not required to collect additional operational data to create disinfection profiles under the LT2ESWTR. Systems that produced a disinfection profile for Giardia lamblia but not viruses under the IESWTR or LT1ESWTR may be required to develop a profile for viruses under the LT2ESWTR. Where a previously developed Giardia lamblia profile is acceptable, systems may develop a virus profile using the same operational data (i.e., CT values) on which the Giardia lamblia profile is based. Spreadsheets developed by EPA and States automatically calculate Giardia lamblia and virus profiles using the same operational data. EPA believes that virus profiling is necessary because many of the disinfection processes that systems will select to comply with the Stage 2 DBPR and LT2ESWTR (e.g., chloramines, UV, MF/UF) are less effective against viruses, relative to their effectiveness against Giardia lamblia, than free chlorine is.

The disinfection benchmark provisions contain three major components: (a) Applicability requirements and schedule, (b) characterization of disinfection practice, and (c) State review of proposed changes in disinfection practice. Each of these components is discussed in the following paragraphs.

a. Applicability and schedule. Proposed disinfection profiling and benchmarking requirements apply to surface water systems only. Systems serving only ground water are not subject to the requirements of the LT2ESWTR. The determination of whether a surface water system is required to develop a disinfection profile is based on whether DBP levels (TTHM or HAA5) exceed specified values, described later in this section, and whether a system is required to monitor for Cryptosporidium. These criteria trigger profiling because they identify systems that may be required to make treatment changes under the Stage 2 DBPR or LT2ESWTR. Note that it is not practical to wait until a system has completed Cryptosporidium monitoring to identify which systems should prepare a disinfection profile. A completed disinfection profile should be available at the point when a system is classified in a treatment bin and must begin developing plans to comply with any additional treatment requirements.

Unless the system developed a disinfection profile under the IESWTR or LT1ESWTR, all systems required to monitor for Cryptosporidium must develop Giardia lamblia and virus disinfection profiles under the LT2ESWTR. This includes all surface water systems except (1) systems that provide 5.5 log total treatment for Cryptosporidium, equivalent to meeting the treatment requirements of Bin 4 and (2) small systems (serving fewer than 10,000 people) that do not exceed the E. coli trigger (see section IV.A for details). Systems not required to monitor for Cryptosporidium as a result of providing 5.5 log of treatment are not required to prepare disinfection profiles. However, small systems that do not exceed the E. coli trigger are required to prepare Giardia lamblia and virus disinfection profiles if either of the following criteria applies, based on DBP levels in their distribution systems:

(1)* TTHM levels in the distribution system, based on samples collected for compliance with the Stage 1 DBPR, are at least 80% of the MCL (0.064 mg/L) at any Stage 1 DBPR sampling point based on a locational running annual average (LRAA).

(2)* HAA5 levels in the distribution system, based on the samples collected for compliance with the Stage 1 DBPR, are at least 80% of the MCL (0.048 mg/L) at any Stage 1 DBPR sampling point based on an LRAA.

*These criteria only apply to systems that are required to comply with the DBP rules, i.e., community and non-transient non-community systems.
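The 80-percent trigger arithmetic above (0.064 mg/L is 80% of the 0.080 mg/L TTHM MCL; 0.048 mg/L is 80% of the 0.060 mg/L HAA5 MCL) can be sketched as follows. The sampling point names, quarterly values, and function names are invented for illustration; actual compliance determinations follow the Stage 1 DBPR monitoring requirements rather than this simplified four-quarter average.

```python
# Illustrative LRAA trigger check; data and names are hypothetical.

TTHM_TRIGGER = 0.064  # mg/L, 80% of the 0.080 mg/L TTHM MCL
HAA5_TRIGGER = 0.048  # mg/L, 80% of the 0.060 mg/L HAA5 MCL

def lraa(quarterly_results):
    """Locational running annual average: mean of the last four quarterly samples."""
    last_four = quarterly_results[-4:]
    return sum(last_four) / len(last_four)

def profiling_triggered(tthm_by_location, haa5_by_location):
    """True if any sampling point's LRAA is at least 80% of the relevant MCL."""
    return any(lraa(q) >= TTHM_TRIGGER for q in tthm_by_location.values()) or \
           any(lraa(q) >= HAA5_TRIGGER for q in haa5_by_location.values())

# Hypothetical quarterly results (mg/L) at two distribution system points.
tthm = {"Point A": [0.055, 0.070, 0.068, 0.066], "Point B": [0.040, 0.045, 0.042, 0.041]}
haa5 = {"Point A": [0.030, 0.035, 0.033, 0.031], "Point B": [0.020, 0.022, 0.021, 0.019]}
print(profiling_triggered(tthm, haa5))  # True: Point A TTHM LRAA = 0.06475 >= 0.064
```

In this sketch the system would be required to profile, since one TTHM LRAA equals or exceeds the trigger even though no single sample exceeds the MCL.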

Table IV-22 presents a summary schedule of the required deadlines for disinfection profiling activities, categorized by system size and whether a small system is required to monitor for Cryptosporidium. The deadlines are based on the expectation that a system should have a disinfection profile at the time the system is classified in a Cryptosporidium treatment bin under LT2ESWTR and/or has determined the need to make treatment changes for the Stage 2 DBPR. Systems have three years from this date, with a possible two-year extension for capital improvements if granted by the State, within which to complete their evaluation, design, and implementation of treatment changes to meet the requirements of the LT2ESWTR and the Stage 2 DBPR.

Table IV-22.—Schedule of Implementation Deadlines Related to Disinfection Profiling 1

Activity | Systems serving ≥10,000 people 2 | Systems serving <10,000 people 2: required to monitor for Cryptosporidium 3 | Systems serving <10,000 people 2: not required to monitor for Cryptosporidium 6
Complete 1 year of E. coli monitoring | NA | 42 | 42
Determine whether required to profile based on DBP levels and notify State 6 | NA | NA | 42
Begin disinfection profiling 4 | 24 | 54 | 42
Complete Cryptosporidium monitoring | 30 | 60 | NA
Complete disinfection profiling based on at least one year's data 5 | 36 | 66 | 54

1 Numbers in table indicate months following promulgation of the LT2ESWTR.
2 Systems providing a total of 5.5 log Cryptosporidium treatment (equivalent to meeting Bin 4 treatment requirements) are not required to develop disinfection profiles.
3 Systems serving fewer than 10,000 people are not required to monitor for Cryptosporidium if mean E. coli levels are less than 10/100 mL for systems using lake/reservoir sources or less than 50/100 mL for systems using flowing stream sources.
4 Unless system has existing disinfection profiling data that are acceptable.
5 This deadline coincides with the start of the 3-year period at the end of which compliance with the LT2ESWTR and Stage 2 DBPR is required.
6 Not required to conduct profiling unless TTHM or HAA5 exceeds trigger values of 80% of MCL at any sampling point based on LRAA.

As described in the next section, systems can meet profiling requirements under the proposed LT2ESWTR using previously collected data (i.e., grandfathered data). Use of grandfathered data is allowed if the system has not made a significant change in disinfection practice or changed sources since the data were collected. This will permit most systems that prepared a disinfection profile under the IESWTR or the LT1ESWTR to avoid collecting any new operational data to develop profiles under the LT2ESWTR.

The locational running annual average (LRAA) of TTHM and HAA5 levels used by small systems that do not monitor for Cryptosporidium to determine whether profiling is required must be based on one year of DBP data collected during the period following promulgation of the LT2ESWTR, or as determined by the State. By the date indicated in Table IV-22, these systems must report to the State on their DBP LRAAs and whether the disinfection profiling requirements apply. If either DBP LRAA meets the criteria specified previously, the system must begin disinfection profiling by the date proposed in Table IV-22.

b. Developing the disinfection profile and benchmark. Under the LT2ESWTR, a disinfection profile consists of a compilation of Giardia lamblia and virus log inactivation levels computed at least weekly over a period of at least one year, as based on operational and water quality data (disinfectant residual concentration(s), contact time(s), temperature(s), and, where necessary, pH). The system may create the profile by conducting new weekly (or more frequent) monitoring and/or by using grandfathered data. A system that created a Giardia lamblia disinfection profile under the IESWTR or LT1ESWTR may use the operational data collected for the Giardia lamblia profile to create a virus disinfection profile.

Grandfathered data are those operational data that a system has previously collected at a treatment plant during the course of normal operation. Those systems that have all the necessary information to determine profiles using existing operational data collected prior to the date when the system is required to begin profiling may use these data in developing profiles. However, grandfathered data must be substantially equivalent to operational data that would be collected under this rule. These data must be representative of inactivation through the entire treatment plant and not just of certain treatment segments.

To develop disinfection profiles under this rule, systems are required to exercise one of the following three options:

Option 1—Systems conduct monitoring at least once per week following the process described later in this section.

Option 2—Systems that conduct monitoring under this rule, as described under Option 1, can also use one or two years of acceptable grandfathered data, in addition to one year of new operational data, in developing the disinfection profile.

Option 3—Systems that have at least one year of acceptable existing operational data are not required to conduct new monitoring to develop the disinfection profile under this rule. Instead, they can use a disinfection profile based on one to three years of grandfathered data.

Process to be followed by PWS for developing the disinfection profile:

—Measure disinfectant residual concentration (C, in mg/L) before or at the first customer and just prior to each additional point of disinfectant addition, whether with the same or a different disinfectant.

—Determine contact time (T, in minutes) for each residual disinfectant monitoring point during peak flow conditions. T could be based on either a tracer study or assumptions based on contactor basin geometry and baffling. However, systems must use the same method for both grandfathered data and new data.

—Measure water temperature (°C) (for disinfectants other than UV).

—Measure pH (for chlorine only).

To determine the weekly log inactivation, the system must convert operational data from one day each week to the corresponding log inactivation values for Giardia lamblia and viruses. The procedure for Giardia lamblia is as follows:

—Determine CTcalc (disinfectant residual concentration × contact time) for each disinfection segment.

—Determine CT99.9 (i.e., the CT value for 3 log inactivation) from tables in the SWTR (40 CFR 141.74) using temperature (and pH for chlorine) for each disinfection segment. States can allow an alternate calculation procedure (e.g., use of a spreadsheet).

—For each segment, log inactivation = (CTcalc/CT99.9) × 3.0.

—Sum the log inactivation values for each segment to get the log inactivation for the day (or week).
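The per-segment arithmetic above can be sketched as follows. The CT values are invented placeholders; in practice, CT99.9 must be read from the SWTR tables (40 CFR 141.74) for the measured temperature (and pH, for chlorine), and a segment's CTcalc is the measured residual C (mg/L) multiplied by the contact time T (minutes).

```python
# Minimal sketch of the Giardia lamblia log inactivation calculation.
# CT99.9 values below are placeholders, not the regulatory table values.

def segment_log_inactivation(ct_calc, ct_99_9):
    """Credit for one disinfection segment: (CTcalc / CT99.9) x 3.0."""
    return (ct_calc / ct_99_9) * 3.0

def plant_log_inactivation(segments):
    """Sum the segment credits to get the plant's log inactivation for the day."""
    return sum(segment_log_inactivation(ct_calc, ct_99_9)
               for ct_calc, ct_99_9 in segments)

# Two hypothetical segments; CTcalc = C (mg/L) x T (min) from plant measurements.
segments = [(1.2 * 30.0, 104.0),  # first disinfection segment
            (0.8 * 15.0, 104.0)]  # second disinfection segment
print(round(plant_log_inactivation(segments), 2))  # 1.38
```

Note that a segment achieving exactly CT99.9 earns 3.0 logs of credit, and credits are additive across segments, mirroring the summation step above.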

For calculating the virus log inactivation, systems should use the procedures approved by States under the IESWTR or LT1ESWTR. The log inactivation benchmark is calculated as follows:

—Determine the calendar month with the lowest log inactivation.

—The lowest month becomes the critical period for that year.

—If acceptable data from multiple years are available, the average of critical periods for each year becomes the benchmark.

—If only one year of data is available, the critical period for that year is the benchmark.
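A minimal sketch of this benchmark calculation, with invented weekly data: average the weekly log inactivation values within each calendar month, take the lowest monthly mean of each year as that year's critical period, then average the critical periods across years.

```python
# Sketch of the benchmark procedure; weekly data are hypothetical.
from collections import defaultdict

def disinfection_benchmark(weekly_logs):
    """weekly_logs: list of (year, month, log_inactivation) tuples."""
    monthly = defaultdict(list)
    for year, month, value in weekly_logs:
        monthly[(year, month)].append(value)
    by_year = defaultdict(list)
    for (year, month), values in monthly.items():
        by_year[year].append(sum(values) / len(values))   # monthly mean
    critical = [min(means) for means in by_year.values()]  # lowest month per year
    return sum(critical) / len(critical)                   # mean of critical periods

weekly = [(2003, 1, 3.0), (2003, 1, 3.2), (2003, 7, 2.0), (2003, 7, 2.2),
          (2004, 1, 4.0), (2004, 7, 3.0)]
print(round(disinfection_benchmark(weekly), 2))  # 2.55 = mean of 2.1 and 3.0
```

With a single year of data, the function simply returns that year's lowest monthly mean, matching the one-year case above.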

c. State review. If a system that is required to produce a disinfection profile proposes to make a significant change in disinfection practice, it must calculate Giardia lamblia and virus inactivation benchmarks and must notify the State before implementing such a change. Significant changes in disinfection practice are defined as (1) moving the point of disinfection (this is not intended to include routine seasonal changes already approved by the State), (2) changing the type of disinfectant, (3) changing the disinfection process, or (4) making other modifications designated as significant by the State. When notifying the State, the system must provide a description of the proposed change, the disinfection profiles and inactivation benchmarks for Giardia lamblia and viruses, and an analysis of how the proposed change will affect the current inactivation benchmarks. In addition, the system should have disinfection profiles and, if applicable, inactivation benchmarking documentation, available for the State to review as part of its periodic sanitary survey.

EPA developed for the IESWTR, with stakeholder input, the Disinfection Profiling and Benchmarking Guidance Manual (USEPA 1999d). This manual provides guidance to systems and States on the development of disinfection profiles, identification and evaluation of significant changes in disinfection practices, and considerations for setting an alternative benchmark. If necessary, EPA will produce an addendum to reflect changes in the profiling and benchmarking requirements necessary to comply with LT2ESWTR.

2. How Was This Proposal Developed?

A fundamental premise in the development of the M-DBP rules is the concept of balancing risks between DBPs and microbial pathogens. Disinfection profiling and benchmarking were established under the IESWTR and LT1ESWTR, based on a recommendation by the Stage 1 M-DBP Federal Advisory Committee, to ensure that systems maintained adequate control of pathogen risk as they reduced risk from DBPs. Today's proposal would extend disinfection benchmarking requirements to the LT2ESWTR.

EPA believes this extension is necessary because some systems will make significant changes in their current disinfection practice to meet more stringent limits on TTHM and HAA5 levels under the Stage 2 DBPR and additional Cryptosporidium treatment requirements under the LT2ESWTR. In order to ensure that these systems continue to provide adequate protection against the full spectrum of microbial pathogens, it is appropriate for systems and States to evaluate the effects of such treatment changes on microbial drinking water quality. The disinfection benchmark serves as a tool for making such evaluations.

EPA projects that to comply with the Stage 2 DBPR, systems will make changes to their disinfection practice, including switching from free chlorine to chloramines and, to a lesser extent, installing technologies like ozone, membranes, and UV. Similarly, to provide additional treatment for Cryptosporidium, some systems will install technologies like UV, ozone, and microfiltration. While these processes are all effective disinfectants, chloramines are a weaker disinfectant than free chlorine for Giardia lamblia. Ozone, UV, and membranes can provide highly effective treatment for Giardia lamblia, but they, as well as chloramines, are less efficient for treating viruses than free chlorine, relative to their efficacy for Giardia lamblia. Because of this, a system switching from free chlorine to one of these alternative disinfection technologies could experience a reduction in the level of virus and/or Giardia lamblia (for chloramines) treatment it is achieving. Consequently, EPA believes that systems making significant changes in their disinfection practice under the Stage 2 M-DBP rules should assess the impact of these changes with disinfection benchmarks for Giardia lamblia and viruses.

Changes in the proposed benchmarking requirements under the LT2ESWTR in comparison to IESWTR requirements include decreasing the frequency of calculating CT values for the disinfection profile from daily to weekly and requiring all systems to prepare a profile for viruses as well as Giardia lamblia. The proposal of a weekly frequency for CT calculations was made to accommodate existing profiles from small systems, which are required to make weekly CT calculations for profiling under the LT1ESWTR. As described earlier, EPA would like for systems that have prepared a disinfection profile under the IESWTR or LT1ESWTR and have not subsequently made significant changes in disinfection practice to be able to grandfather this profile for the LT2ESWTR. Allowing weekly calculation of CT values under the LT2ESWTR will make this possible.

The IESWTR and LT1ESWTR required virus inactivation profiling only for systems using ozone or chloramine as their primary disinfectant. However, as noted earlier, EPA has projected that under the Stage 2 DBPR and LT2ESWTR, systems will switch from free chlorine to disinfection processes like chloramines, UV, ozone, and microfiltration. The efficiency of these processes for virus treatment relative to protozoa treatment is lower in comparison to free chlorine. As a result, a disinfection benchmark for Giardia lamblia would not necessarily provide an indication of the level or adequacy of treatment for viruses. Consequently, EPA believes it is appropriate for systems to develop profiles for both Giardia lamblia and viruses. Moreover, developing a profile for viruses involves a minimal increase in effort and no additional data collection for those systems that have disinfection profiles for Giardia lamblia. Systems will use the same calculated CT values for viruses as would be used for the Giardia lamblia profile.

The strategy of disinfection profiling and benchmarking stemmed from data provided to the Stage 1 M-DBP Advisory Committee, in which the baseline of microbial inactivation (expressed as logs of Giardia lamblia inactivation) demonstrated high variability. Inactivation varied by several logs (i.e., orders of magnitude) on a day-to-day basis at particular treatment plants and by as much as tens of logs over a year due to seasonal changes in water temperature, flow rate, pH, and disinfectant demand. There were also differences between years at individual plants. To address these variations, M-DBP stakeholders developed the procedure of profiling a plant's inactivation levels over a period of at least one year, and then establishing a benchmark of minimum inactivation as a way to characterize disinfection practice.

Benchmarking of inactivation levels, an assessment of the impact of proposed changes on the level of microbial inactivation of Giardia lamblia and viruses, and State review prior to approval of substantial changes in treatment are important steps in avoiding conditions that present an increase in microbial risk. In its assessment of the microbial risk associated with the proposed changes, States could consider site-specific knowledge of the watershed and hydrologic factors as well as variability, flexibility and reliability of treatment to ensure that treatment for both protozoan and viral pathogens is appropriate.

EPA emphasizes that benchmarking is not intended to function as a regulatory standard. Rather, the objective of the disinfection benchmark is to facilitate interactions between the States and systems for the purpose of assessing the impact on microbial risk of proposed significant changes to current disinfection practices. Final decisions regarding levels of disinfection for Giardia lamblia and viruses beyond those required by the SWTR that are necessary to protect public health will continue to be left to the States. For this reason EPA has not mandated specific evaluation protocols or decision matrices for analyzing changes in disinfection practice. EPA, however, will provide support to the States in making these analyses through the issuance of guidance.

3. Request for Comments

EPA requests comment on the proposed provisions of the inactivation profiling and benchmarking requirement.

E. Additional Treatment Technique Requirements for Systems With Uncovered Finished Water Storage Facilities

1. What Is EPA Proposing Today?

EPA is proposing requirements for systems with uncovered finished water storage facilities. The proposed rule requires that systems with uncovered finished water storage facilities must (1) cover the uncovered finished water storage facility, or (2) treat storage facility discharge to the distribution system to achieve a 4 log virus inactivation, unless (3) the system implements a State-approved risk mitigation plan that addresses physical access and site security, surface water runoff, animal and bird waste, and ongoing water quality assessment, and includes a schedule for plan implementation. Where applicable, the plans should account for cultural uses by Indian Tribes.

Systems must notify the State if they use uncovered finished water storage facilities no later than 2 years following LT2ESWTR promulgation. Systems must cover or treat uncovered finished water storage facilities, or have a State-approved risk mitigation plan, within 3 years following LT2ESWTR promulgation, with the possibility of a two-year extension granted by States for systems making capital improvements. Systems seeking approval for a risk mitigation plan must submit the plan to the State within 2 years following LT2ESWTR promulgation.

These provisions apply to uncovered tanks, reservoirs, or other facilities where water is stored after it has undergone treatment to satisfy microbial treatment technique requirements for Giardia lamblia, Cryptosporidium, and viruses. In most cases, this refers to storage of water following all filtration steps, where required, and primary disinfection.

2. How Was This Proposal Developed?

Today's proposal is intended to mitigate the water quality degradation and increased health risks that can result from uncovered finished water storage facilities. In addition, these proposed requirements for uncovered finished water storage facilities are consistent with recommendations of the Stage 2 M-DBP Advisory Committee in the Agreement in Principle (USEPA 2000a).

The use of uncovered finished water storage facilities has been questioned since 1930 due to their susceptibility to contamination and subsequent threats to public health (LeChevallier et al. 1997). Many potential sources of contamination can lead to the degradation of water quality in uncovered finished water storage facilities. These include surface water runoff, algal growth, insects and fish, bird and animal waste, airborne deposition, and human activity.

Algal blooms are the most common problem in open reservoirs and can become a public health risk, as they increase the presence of bacteria in the water. Algae growth also leads to the formation of disinfection byproducts and causes taste and odor problems. Some algae produce toxins that can induce headache, fever, diarrhea, abdominal pain, nausea, and vomiting. Bird and animal wastes are also common and significant sources of contamination. These wastes may carry microbial contaminants such as coliform bacteria, viruses, and human pathogens, including Vibrio cholerae, Salmonella, Mycobacteria, Typhoid, Giardia lamblia, and Cryptosporidium (USEPA 1999e). Microbial pathogens are found in surface water runoff, along with agricultural chemicals, automotive wastes, turbidity, metals, and organic matter (USEPA 1999e, LeChevallier et al. 1997).

In an effort to minimize contamination, systems have implemented various controls such as reservoir covers and liners, regular draining and washing, security and monitoring, bird and insect control programs, and drainage design to prevent surface runoff from entering the facility (USEPA 1999e).

A number of studies have evaluated the degradation of water quality in uncovered finished water storage facilities. LeChevallier et al. (1997) compared influent and effluent samples from six uncovered finished water storage reservoirs in New Jersey for a one year period. There were significant increases in the turbidity, particle count, total coliform, fecal coliform, and heterotrophic plate count bacteria in the effluent relative to the influent. Of particular concern were fecal coliforms, which were detected in 18 percent of effluent samples (no influent samples were positive for coliforms). Fecal coliforms are used as an indicator of the potential for contamination by pathogens. Giardia and/or Cryptosporidium were detected in 15% of inlet samples and 25% of effluent samples, demonstrating a significant increase in the effluent. There was a significant decrease in the chlorine residual concentration in some effluent samples.

Increases in algal cells, heterotrophic plate count (HPC) bacteria, turbidity, color, particle counts, and biomass, and decreases in residual chlorine levels, have been reported in other studies of uncovered finished water reservoirs as well (Pluntze 1974, AWWA Committee 1983, Silverman et al. 1983). Researchers have shown that small mammals, birds, fish, and algal growth contribute to the microbial degradation of an open finished water reservoir (Graczyk et al. 1996, Geldreich 1990, Fayer and Ungar 1986, Current 1986).

As described in section II, the IESWTR and LT1ESWTR require water systems to cover all new reservoirs, holding tanks, or other storage facilities for finished water. However, these rules do not require systems to cover existing finished water storage facilities. EPA stated in the preamble to the final IESWTR (63 FR 69494, December 16, 1998) (USEPA 1998a) that with respect to requirements for existing uncovered finished water storage facilities, the Agency needed more time to collect and analyze additional information to evaluate regulatory impact. The IESWTR preamble affirmed that EPA would consider whether to require the covering of existing storage facilities during the development of subsequent microbial regulations when additional data to estimate national costs were available.

Since promulgation of the IESWTR, EPA has collected sufficient data to estimate national cost implications of regulatory control strategies for uncovered finished water storage facilities. Based on information provided by States, EPA estimates that there are approximately 138 uncovered finished water storage facilities in the United States and territories, not including reservoirs that systems currently plan to cover or take off-line. Costs for covering these storage facilities or treating the effluent, consistent with today's proposed requirements, are presented in section VI of this preamble and in the Economic Analysis for the LT2ESWTR (USEPA 2003a). Briefly, total capital costs were estimated as $64.4 million, resulting in annualized present value costs of $5.4 million at a three percent discount rate and $6.4 million at a seven percent discount rate.
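For readers unfamiliar with annualization, the standard capital recovery formula is sketched below. The 20-year period is an illustrative assumption, and this sketch does not reproduce EPA's published figures, which reflect the Agency's own analysis period and cost components (see the Economic Analysis, USEPA 2003a, for the actual methodology).

```python
# Generic capital recovery arithmetic; the analysis period is an assumption
# for illustration only and is not taken from the Economic Analysis.

def annualized_capital_cost(capital, rate, years):
    """Capital recovery: A = P * r / (1 - (1 + r) ** -n)."""
    return capital * rate / (1 - (1 + rate) ** -years)

# $64.4 million in capital, annualized over an assumed 20-year period
# at the two discount rates used in the preamble:
for rate in (0.03, 0.07):
    print(rate, round(annualized_capital_cost(64.4, rate, 20), 2))
```

The formula spreads a one-time capital outlay into a constant annual payment whose present value, at the chosen discount rate, equals the original cost.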

Based on the findings of studies cited in this section, EPA continues to be concerned about contamination occurring in uncovered finished water storage facilities. Therefore, as recommended by the Advisory Committee, EPA is proposing control measures for all systems with uncovered finished water storage facilities. This proposal is intended to represent a balanced approach, recognizing both the potentially significant but uncertain risks associated with uncovered finished water storage facilities and the substantial costs of either covering them or building alternative storage. Today's proposal allows systems to treat the storage facility effluent instead of providing a cover. Alternatively, States may determine that existing risk mitigation is adequate, provided a system implements a risk mitigation plan as described in this section.

3. Request for Comments

EPA requests comment on the proposed requirements pertaining to uncovered finished water storage facilities. Specifically, the Agency would like comment on the following issues, and requests that comments include available supporting data or other technical information:

  • Is it appropriate to allow systems with uncovered finished water storage facilities to implement a risk management plan or treat the effluent to inactivate viruses instead of covering the facility?
  • If systems treat the effluent of an uncovered finished water storage facility instead of covering it, should systems be required to inactivate Cryptosporidium and Giardia lamblia, since these protozoa have been found to increase in uncovered storage facilities?
  • Additional information on contamination or health risks that may be associated with uncovered finished water storage facilities.
  • Additional data on how climatological conditions affect water quality, including daily fluctuations in the stability of the water related to corrosion control.
  • The definition of an uncovered finished water storage facility in 40 CFR 141.2 is a tank, reservoir, or other facility used to store water that will undergo no further treatment except residual disinfection and is open to the atmosphere. There is a concern that this definition may not include certain systems using what would generally be considered an uncovered finished water storage facility. An example is a system that applies a corrosion inhibitor compound to the effluent of an uncovered storage facility where water is stored after filtration and primary disinfection. In this case, the system may claim that the corrosion inhibitor constitutes additional treatment and, consequently, the reservoir does not meet EPA's definition of an uncovered finished water storage facility. EPA requests comment on whether the definition of an uncovered finished water storage facility should be revised to specifically include systems that apply a treatment such as corrosion control to water stored in an uncovered reservoir after the water has undergone filtration, where required, and primary disinfection.

F. Compliance Schedules

Today's proposal includes deadlines for public water systems to comply with the proposed monitoring, reporting, and treatment requirements. These deadlines stem from the microbial framework approach of the proposed LT2ESWTR, which involves a system-specific risk characterization through monitoring to determine the need for additional treatment.

1. What Is EPA Proposing Today?

a. Source water monitoring.

i. Filtered systems. Under today's proposal, filtered systems conduct source water Cryptosporidium monitoring for the purpose of being classified in one of four risk bins that determine the extent of any additional treatment requirements. Small filtered systems first monitor for E. coli as a screening analysis and are only required to monitor for Cryptosporidium if the mean E. coli level exceeds specified trigger values. Note that systems that currently provide or will provide a total of at least 5.5 log of treatment for Cryptosporidium are exempt from monitoring requirements.

Large surface water systems (serving at least 10,000 people) that filter must sample at least monthly for Cryptosporidium, E. coli, and turbidity in their source water for 24 months, beginning 6 months after promulgation of the LT2ESWTR. Large systems must submit a sampling schedule to their primacy agency (in this case, EPA) no later than 3 months after promulgation of the LT2ESWTR.

Small surface water systems (serving fewer than 10,000 people) that filter must conduct biweekly E. coli sampling in their source water for 1 year, beginning 30 months after LT2ESWTR promulgation. States may designate an alternate indicator monitoring strategy based on EPA guidance, but compliance schedules will not change. Small systems that exceed the indicator trigger value (i.e., a mean E. coli level exceeding 10/100 mL for lake/reservoir sources or 50/100 mL for flowing stream sources) must conduct source water Cryptosporidium sampling twice per month for 1 year, beginning 48 months after LT2ESWTR promulgation (i.e., beginning 6 months following the completion of E. coli sampling). Small systems must submit an E. coli sampling schedule to their primacy agency no later than 27 months after LT2ESWTR promulgation. If Cryptosporidium monitoring is required, small systems must submit a Cryptosporidium sampling schedule no later than 45 months after LT2ESWTR promulgation.
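The indicator trigger described above reduces to a simple comparison; a minimal sketch follows (the function name and source-type labels are illustrative, not rule text):

```python
def crypto_monitoring_triggered(mean_ecoli_per_100ml: float,
                                source_type: str) -> bool:
    """Return True if a small filtered system's mean E. coli level exceeds
    the proposed trigger: 10/100 mL for lake/reservoir sources,
    50/100 mL for flowing stream sources."""
    triggers = {"lake/reservoir": 10.0, "flowing stream": 50.0}
    return mean_ecoli_per_100ml > triggers[source_type]

# A mean of 12/100 mL from a reservoir source triggers Cryptosporidium
# monitoring; the same mean from a flowing stream source does not.
reservoir_triggered = crypto_monitoring_triggered(12.0, "lake/reservoir")
stream_triggered = crypto_monitoring_triggered(12.0, "flowing stream")
```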

Large systems must carry out a second round of source water monitoring beginning 108 months after LT2ESWTR promulgation, which is 6 years after initial bin classification. Similarly, small systems must conduct a second round of indicator monitoring (E. coli or other as designated by the State) beginning 138 months after LT2ESWTR promulgation, which is 6 years after their initial bin classification. Small systems that exceed the indicator trigger value in the second round of indicator monitoring must conduct a second round of Cryptosporidium monitoring, beginning 156 months after LT2ESWTR promulgation.

Compliance dates for filtered systems are summarized in Table IV-23.

Table IV-23.—Summary of Compliance Dates for Filtered Systems Back to Top
System type Requirement Compliance date
1Systems may be eligible to use previously collected (grandfathered) data to meet LT2ESWTR requirements if specified quality control criteria are met (described in section IV.A.1.d).
2Systems are not required to monitor if they will provide at least 5.5 log Cryptosporidium treatment and notify EPA or the State.
3States may grant up to an additional two years for systems making capital improvements.
4If the E. coli annual mean concentration exceeds 10/100 mL for systems using lakes/reservoir sources or exceeds 50/100 mL for systems using flowing stream sources, Cryptosporidium monitoring is required.
5Systems that do not exceed the E. coli trigger level are classified in Bin 1 and are not required to provide Cryptosporidium treatment beyond LT1ESWTR levels.
Large Systems (serve ≥10,000 people) Submit sampling schedule 1,2 No later than 3 months after promulgation.
Source water Cryptosporidium, E. coli and turbidity monitoring Begin monthly monitoring 6 months after promulgation for 24 months.
Comply with additional Cryptosporidium treatment requirements No later than 72 months after promulgation. 3
Second round of source water Cryptosporidium, E. coli, and turbidity monitoring 2 Begin monthly monitoring 108 months after promulgation for 24 months.
Small Systems (serve <10,000 people) Submit E. coli sampling schedule 2 No later than 27 months after promulgation.
Source water E. coli monitoring Begin biweekly monitoring 30 months after promulgation for 1 year.
Second round of source water E. coli monitoring 2 Begin biweekly monitoring 138 months after promulgation for 1 year.
Additional requirements if indicator (e.g., E. coli) trigger level is exceeded 4
Submit Cryptosporidium sampling schedule 1,2 No later than 45 months after promulgation.
Source water Cryptosporidium monitoring Begin twice-per-month monitoring no later than 48 months after promulgation for 1 year.
Comply with additional Cryptosporidium treatment requirements No later than 102 months after promulgation. 3, 5
Second round of source water Cryptosporidium monitoring Begin twice-per-month monitoring no later than 156 months after promulgation for 1 year.

ii. Unfiltered systems. Surface water systems that do not filter and meet the criteria for avoidance of filtration (40 CFR 141.71) (i.e., unfiltered systems) are required to conduct source water Cryptosporidium monitoring to determine if their mean source water Cryptosporidium level exceeds 0.01 oocysts/L. There is no E. coli screening analysis available to small unfiltered systems. However, both large and small unfiltered systems conduct Cryptosporidium monitoring on the same schedule as filtered systems of the same size. Note that unfiltered systems that currently provide or will provide a total of at least 3 log Cryptosporidium inactivation are exempt from monitoring requirements.

Large unfiltered systems (serving at least 10,000 people) must conduct at least monthly Cryptosporidium sampling for 24 months, beginning 6 months after LT2ESWTR promulgation. Small unfiltered systems (serving fewer than 10,000 people) must conduct at least twice-per-month Cryptosporidium sampling for 12 months, beginning 48 months after LT2ESWTR promulgation. Large systems must submit a Cryptosporidium sampling schedule to EPA no later than 3 months after LT2ESWTR promulgation, and small systems must submit a sampling schedule to their State no later than 45 months after LT2ESWTR promulgation.

Unfiltered systems are required to conduct a second round of Cryptosporidium monitoring on the same schedule as filtered systems of the same size. Large systems must carry out a second round of Cryptosporidium monitoring, beginning 108 months after LT2ESWTR promulgation. Small systems must perform a second round of Cryptosporidium monitoring, beginning 156 months after LT2ESWTR promulgation.
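The inactivation requirement for unfiltered systems described above turns on a single threshold. A minimal sketch, with hypothetical monitoring results for illustration:

```python
def required_crypto_inactivation_log(mean_oocysts_per_liter: float) -> float:
    """All unfiltered systems provide a 2 log Cryptosporidium inactivation
    baseline; 3 log is required if the mean source water level exceeds
    0.01 oocysts/L (per today's proposal)."""
    return 3.0 if mean_oocysts_per_liter > 0.01 else 2.0

# Hypothetical sample results (oocysts/L), for illustration only.
samples = [0.00, 0.02, 0.01, 0.03]
mean_level = sum(samples) / len(samples)                 # 0.015 oocysts/L
required = required_crypto_inactivation_log(mean_level)  # 3.0 log
```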

Compliance dates for unfiltered systems are summarized in Table IV-24.

Table IV-24.—Summary of Compliance Dates for Unfiltered Systems Back to Top
System type Requirement Compliance date
1Systems may be eligible to use previously collected (grandfathered) data to meet LT2ESWTR requirements if specified quality control criteria are met (described in section IV.A.1.d).
2States may grant up to an additional two years for systems making capital improvements.
Large Systems (serve ≥10,000 people) Submit sampling schedule1 No later than 3 months after promulgation.
Source water Cryptosporidium monitoring Begin monthly monitoring 6 months after promulgation for 24 months.
Comply with Cryptosporidium inactivation requirements No later than 72 months after promulgation.2
Second round of source water Cryptosporidium monitoring Begin monthly monitoring 108 months after promulgation for 24 months.
Small Systems (serve <10,000 people) Submit sampling schedule1 No later than 45 months after promulgation.
Source water Cryptosporidium monitoring Begin twice-per-month monitoring no later than 48 months after promulgation for 1 year.
Comply with Cryptosporidium inactivation requirements No later than 102 months after promulgation.2
Second round of source water Cryptosporidium monitoring Begin twice-per-month monitoring no later than 156 months after promulgation for 1 year.

b. Treatment requirements. Filtered systems must determine their bin classification and unfiltered systems must determine their mean source water Cryptosporidium level within 6 months of the scheduled month for collection of their final Cryptosporidium sample in the first round of monitoring. This 6-month period provides time for systems to receive all sample analysis results from the laboratory, analyze the data, and work with their primacy agency.

Filtered systems have 3 years following initial bin classification to meet any additional Cryptosporidium treatment requirements. This equates to compliance dates of 72 months after LT2ESWTR promulgation for large systems and 102 months after LT2ESWTR promulgation for small systems (see Table IV-23). Unfiltered systems must comply with Cryptosporidium treatment requirements on the same schedule as filtered systems of the same size (see Table IV-24). The State may grant systems an additional two years to comply when capital investments are necessary, as specified in the Safe Drinking Water Act (section 1412(b)(10)).

Systems with uncovered finished water storage facilities are required to comply with the provisions described in section IV.E by 36 months following LT2ESWTR promulgation, with the possibility of a 2-year extension granted by the State for systems making capital improvements. Systems seeking approval for a risk mitigation plan must submit the plan to the State within 24 months following LT2ESWTR promulgation.

Systems must comply with additional Cryptosporidium treatment requirements by implementing one or more treatment processes or control strategies from the microbial toolbox. Most of the toolbox components require submission of documentation to the State demonstrating compliance with design and/or implementation criteria required to receive credit. Compliance dates for reporting requirements associated with microbial toolbox components are presented in detail in section IV.J, Reporting and Recordkeeping Requirements.

c. Disinfection benchmarks for Giardia lamblia and viruses. Today's proposed LT2ESWTR includes disinfection profiling and benchmarking requirements, which consist of three major components: applicability determination, characterization of disinfection practice, and State review of proposed changes in disinfection practice. Each of these components is discussed in detail in section IV.D. Compliance deadlines associated with each of these components, including associated reporting requirements, are stated in section IV.J, Reporting and Recordkeeping Requirements.

2. How Was This Proposal Developed?

The compliance dates in today's proposal reflect the risk-targeted approach of the proposed LT2ESWTR, wherein additional treatment requirements are based on a system-specific risk characterization as determined through source water monitoring. Additionally, these dates are designed to allow systems to comply simultaneously with the LT2ESWTR and Stage 2 DBPR in order to balance risks in the control of microbial pathogens and DBPs. They are consistent with recommendations from the Stage 2 M-DBP Federal Advisory Committee.

Under the LT2ESWTR, large systems will sample for Cryptosporidium for a period of two years in order to characterize source water pathogen levels and capture a degree of annual variability. To expedite the date by which systems will provide additional treatment where high risk source waters are identified, large system Cryptosporidium monitoring will begin six months after promulgation of the LT2ESWTR. Upon completion of Cryptosporidium monitoring, systems will have six months to work with their primacy agency to determine their bin classification. Beginning at this point, which is three years following LT2ESWTR promulgation, large systems will have three years to implement the treatment processes or control strategies necessary to comply with any additional treatment requirements stemming from bin classification.
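The schedule arithmetic in the preceding paragraph can be checked directly; the constant names below are illustrative bookkeeping, not rule terminology.

```python
# Months after LT2ESWTR promulgation for key large-system milestones.
MONITORING_START = 6      # monthly Cryptosporidium monitoring begins
MONITORING_DURATION = 24  # two years of source water monitoring
BIN_DETERMINATION = 6     # time to determine bin classification
TREATMENT_WINDOW = 36     # three years to meet any additional requirements

bin_classification_month = (MONITORING_START + MONITORING_DURATION
                            + BIN_DETERMINATION)
treatment_deadline_month = bin_classification_month + TREATMENT_WINDOW

assert bin_classification_month == 36  # three years after promulgation
assert treatment_deadline_month == 72  # matches Table IV-23
```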

Other large system compliance dates, such as those for approval of grandfathered monitoring data, disinfection profiling and benchmarking, and reporting deadlines associated with microbial toolbox components, all stem from the Cryptosporidium monitoring and treatment compliance schedule.

With respect to small systems under the LT2ESWTR, EPA is proposing that small systems first monitor for E. coli as a screening analysis in order to reduce the number of small systems that incur the cost of Cryptosporidium monitoring. However, due to limitations in available data, the Agency has determined that it is necessary to use data generated by large systems under the LT2ESWTR to confirm or refine the E. coli indicator criteria that will trigger small system Cryptosporidium monitoring. Consequently, small system indicator monitoring will begin at the conclusion of large system monitoring. This approach was recommended by the Advisory Committee.

Accordingly, small systems will monitor for E. coli for one year, beginning 30 months after LT2ESWTR promulgation. Following this, small systems will have six months to determine if they are required to monitor for Cryptosporidium and, if so, contract with an approved analytical laboratory. Cryptosporidium monitoring by small systems will be conducted for one year, which, when added to the one year of E. coli monitoring, equals two years of source water monitoring. This is equivalent to the time period large systems spend in source water monitoring.

The time periods associated with bin assignment and compliance with additional treatment requirements for small systems are the same as those proposed for large systems. Specifically, small systems will have six months to work with their States to determine their bin classification following the conclusion of Cryptosporidium sampling. From this point, which is 5.5 years after LT2ESWTR promulgation, small systems have three years to meet any additional treatment requirements resulting from bin classification. States can grant additional time to small systems for compliance with treatment technique requirements through granting exemptions (see SDWA section 1416).

3. Request for Comments

EPA requests comments on the treatment technique compliance schedules for large and small systems in today's proposal, including the following issues:

Time Window Between Large and Small System Monitoring

Under the current proposal, small filtered system E. coli monitoring begins in the month following the end of large system Cryptosporidium, E. coli, and turbidity monitoring. EPA plans to evaluate large system monitoring results on an ongoing basis as the data are reported to determine if any refinements to the E. coli levels that trigger small system Cryptosporidium monitoring are necessary. If such refinements were deemed appropriate, EPA would issue guidance to States, which can establish alternative trigger values for small system monitoring under the LT2ESWTR.

This implementation schedule does not leave any time between the end of large system monitoring and the initiation of small system monitoring. Consequently, if it is necessary to provide guidance on alternative trigger values prior to when small system monitoring begins, such guidance would be based on less than the full set of large system results (e.g., first 18 months of large system data). EPA requests comment on whether an additional time window between the end of large system monitoring and the beginning of small system monitoring is appropriate and, if so, how long such a window should be.

Implementation Schedule for Consecutive Systems

The Stage 2 M-DBP Agreement in Principle (65 FR 83015, December 29, 2000) (USEPA 2000a) continues the principle of simultaneous compliance to address microbial pathogens and disinfection byproducts. Systems are generally expected to address LT2ESWTR requirements concurrently with those of the Stage 2 DBPR (as noted earlier, the Stage 2 DBPR is scheduled to be proposed later this year and to be promulgated at the same time as the LT2ESWTR).

As with the LT2ESWTR, small water systems (<10,000 served) generally begin monitoring and must be in compliance with the Stage 2 DBPR at a date later than that for large systems. However, the Advisory Committee recommended that small systems that buy/receive from or sell/deliver finished water to a large system (that is, they are part of the same “combined distribution system”) comply with Stage 2 DBPR requirements on the same schedule as the largest system in the combined distribution system. This approach is intended to ensure that systems consider impacts throughout the combined distribution system when making compliance decisions (e.g., selecting new technologies or making operational modifications) and to facilitate all systems meeting the compliance deadlines for the rule.

The issue of combined distribution systems associated with systems buying and selling water is expected to be of less significance for the LT2ESWTR. The requirements of the LT2ESWTR apply to systems treating raw surface water and generally will not involve compliance steps when systems purchase treated water. Consequently, the compliance schedule for today's proposal does not address combined distribution systems. However, this proposed approach raises the possibility that a small system treating surface water and selling it to a large system could be required to take compliance steps at an earlier date under the Stage 2 DBPR than under the LT2ESWTR. While a small system in this situation could choose to comply with the LT2ESWTR on an earlier schedule, the two rules would not require simultaneous compliance. EPA requests comment on how this scenario should be addressed in the LT2ESWTR.

G. Public Notice Requirements

1. What Is EPA Proposing Today?

EPA is proposing that under the LT2ESWTR, a Tier 2 public notice will be required for violations of additional treatment requirements and a Tier 3 public notice will be required for violations of monitoring and testing requirements. Where systems violate LT2ESWTR treatment requirements, today's proposal requires the use of the existing health effects language for microbiological contaminant treatment technique violations, as stated in 40 CFR 141 Subpart Q, Appendix B.

2. How Was This Proposal Developed?

In 2000, EPA published the Public Notification Rule (65 FR 25982, May 4, 2000) (USEPA 2000d), which revised the general public notification regulations for public water systems in order to implement the public notification requirements of the 1996 SDWA amendments. This regulation established the requirements that public water systems must follow regarding the form, manner, frequency, and content of a public notice. Public notification of violations is an integral part of the public health protection and consumer right-to-know provisions of the 1996 SDWA Amendments.

Owners and operators of public water systems are required to notify persons served when they fail to comply with the requirements of a NPDWR, have a variance or exemption from the drinking water regulations, or are facing other situations posing a risk to public health. The public notification requirements divide violations into three categories (Tier 1, Tier 2 and Tier 3) based on the seriousness of the violations, with each tier having different public notification requirements.

EPA has limited its list of violations and situations routinely requiring a Tier 1 notice to those with a significant potential for serious adverse health effects from short term exposure. Tier 1 violations contain language specified by EPA that concisely and in non-technical terms conveys to the public the adverse health effects that may occur as a result of the violation. States and water utilities may add additional information to each notice, as deemed appropriate for specific situations. A State may elevate to Tier 1 other violations and situations with significant potential to have serious adverse health effects from short-term exposure, as determined by the State.

Tier 2 public notices address other violations with potential to have serious adverse health effects on human health. Tier 2 notices are required for the following situations:

  • All violations of the MCL, maximum residual disinfectant level (MRDL), and treatment technique requirements, except where a Tier 1 notice is required or where the State has elevated the notice to Tier 1; and
  • Failure to comply with the terms and conditions of any existing variance or exemption.

Tier 3 public notices include all other violations and situations requiring public notice, including the following situations:

  • A monitoring or testing procedure violation, except where a Tier 1 or 2 notice is already required or where the State has elevated the notice to Tier 1 or 2; and
  • Operation under a variance or exemption.

The State, at its discretion, may elevate the notice requirement for specific monitoring or testing procedures from a Tier 3 to a Tier 2 notice, taking into account the potential health impacts and persistence of the violation.

As part of the IESWTR, EPA established health effects language for violations of treatment technique requirements for microbiological contaminants. EPA believes this language, which was developed with consideration of Cryptosporidium health effects, is appropriate for violations of additional Cryptosporidium treatment requirements under the LT2ESWTR.

3. Request for Comment

EPA requests comment on whether the violations of additional treatment requirements for Cryptosporidium under the LT2ESWTR should require a Tier 2 public notice and whether the proposed health effects language is appropriate.

H. Variances and Exemptions

SDWA section 1415 allows States to grant variances from national primary drinking water regulations under certain conditions; section 1416 establishes the conditions under which States may grant exemptions to MCL or treatment technique requirements. For the reasons presented in the following discussion, EPA has determined that systems will not be eligible for variances or exemptions to the requirements of the LT2ESWTR.

1. Variances

Section 1415 specifies two provisions under which general variances to treatment technique requirements may be granted:

(1) A State that has primacy may grant a variance to a system from any requirement to use a specified treatment technique for a contaminant if the system demonstrates to the satisfaction of the State that the treatment technique is not necessary to protect public health because of the nature of the system's raw water source. EPA may prescribe monitoring and other requirements as conditions of the variance (section 1415(a)(1)(B)).

(2) EPA may grant a variance from any treatment technique requirement upon a showing by any person that an alternative treatment technique not included in such requirement is at least as efficient in lowering the level of the contaminant (section 1415(a)(3)).

EPA does not believe the first provision for granting a variance is applicable to the LT2ESWTR because Cryptosporidium treatment technique requirements under this rule account for the degree of source water contamination. Systems initially comply with the LT2ESWTR by conducting source water monitoring for Cryptosporidium. Filtered systems are required to provide additional treatment for Cryptosporidium only if the source water concentration exceeds a level where current treatment does not provide sufficient protection. All unfiltered systems are required to provide a baseline of 2 log inactivation of Cryptosporidium to achieve finished water risk levels comparable to filtered systems; however, unfiltered systems are required to achieve 3 log inactivation only if the source water level exceeds 0.01 oocysts/L.

The second provision for granting a variance is not applicable to the LT2ESWTR because the treatment technique requirements of this rule specify the degree to which systems must lower their source water Cryptosporidium level (e.g., 4, 5, and 5.5 log reduction in Bins 2, 3, and 4, respectively). The LT2ESWTR provides broad flexibility in how systems achieve the required level of Cryptosporidium reduction, as shown in the discussion of the microbial toolbox in section VI.C. Moreover, the microbial toolbox contains an option for Demonstration of Performance, under which States can award treatment credit based on the demonstrated efficiency of a treatment process in reducing Cryptosporidium levels. Thus, there is no need for this type of variance under the LT2ESWTR.
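The bin-specific log reductions cited above can be restated as removal fractions, which illustrates why the requirements already quantify the degree of contaminant reduction. The mapping below covers only what the preamble states (Bin 1 carries no requirement beyond existing rules and is omitted):

```python
import math

# Log reduction of source water Cryptosporidium specified for filtered
# systems in Bins 2-4, per the preamble discussion above.
REQUIRED_LOG_REDUCTION = {2: 4.0, 3: 5.0, 4: 5.5}

def fraction_removed(log_reduction: float) -> float:
    """A k-log reduction removes a fraction 1 - 10**-k of oocysts."""
    return 1 - 10 ** -log_reduction

# For example, the 5.5 log specified for Bin 4 corresponds to removing
# more than 99.999 percent of oocysts.
bin4_fraction = fraction_removed(REQUIRED_LOG_REDUCTION[4])
```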

SDWA section 1415(e) describes small system variances, but these cannot be granted for a treatment technique for a microbial contaminant. Hence, small system variances are not allowed for the LT2ESWTR.

2. Exemptions

Under SDWA section 1416(a), a State may exempt any public water system from a treatment technique requirement upon a finding that (1) due to compelling factors (which may include economic factors such as qualification of the system as serving a disadvantaged community), the system is unable to comply with the requirement or implement measures to develop an alternative source of water supply; (2) the system was in operation on the effective date of the treatment technique requirement, or for a system that was not in operation by that date, no reasonable alternative source of drinking water is available to the new system; (3) the exemption will not result in an unreasonable risk to health; and (4) management or restructuring changes (or both) cannot reasonably result in compliance with the Act or improve the quality of drinking water.

If EPA or the State grants an exemption to a public water system, it must at the same time prescribe a schedule for compliance (including increments of progress or measures to develop an alternative source of water supply) and implementation of appropriate control measures that the State requires the system to meet while the exemption is in effect. Under section 1416(b)(2)(A), the schedule shall require compliance as expeditiously as practicable (to be determined by the State), but no later than three years after the otherwise applicable compliance date for the regulations established pursuant to section 1412(b)(10). For public water systems serving no more than 3,300 people that need financial assistance for the necessary improvements, EPA or the State may renew an exemption for one or more additional two-year periods, not to exceed a total of six years.

A public water system shall not be granted an exemption unless it can establish that: (1) The system cannot meet the standard without capital improvements that cannot be completed prior to the date established pursuant to section 1412(b)(10); or (2) in the case of a system that needs financial assistance for the necessary implementation, the system has entered into an agreement to obtain financial assistance pursuant to section 1452 or any other Federal or state program; or (3) the system has entered into an enforceable agreement to become part of a regional public water system.

EPA believes that granting an exemption to the Cryptosporidium treatment requirements of the LT2ESWTR would result in an unreasonable risk to health. As described in section II.C, Cryptosporidium causes acute health effects, which may be severe in sensitive subpopulations and include risk of mortality. Moreover, the additional Cryptosporidium treatment requirements of the LT2ESWTR are targeted to systems with the highest degree of risk. Due to these factors, EPA is not proposing to allow exemptions under the LT2ESWTR.

3. Request for Comment

a. Variances. EPA requests comment on the determination that the provisions for granting variances are not applicable to the proposed LT2ESWTR, specifically including Cryptosporidium inactivation requirements for unfiltered systems.

In theory it would be possible for an unfiltered system to demonstrate raw water Cryptosporidium levels that were 3 log lower than the cutoff for bin 1 for filtered systems and, thus, that it may be providing comparable public health protection without additional inactivation. However, EPA has determined that in practice it is not currently economically or technologically feasible for systems to ascertain the level of Cryptosporidium at this concentration. This is due to the extremely large number and volume of samples that would be necessary to make this demonstration with sufficient confidence. Based on this determination and the Cryptosporidium occurrence data described in section III.C, EPA is not proposing to allow unfiltered systems to demonstrate raw water Cryptosporidium levels low enough to avoid inactivation requirements. EPA requests comment on this approach.

b. Exemptions. EPA requests comment on the determination that granting an exemption to the Cryptosporidium treatment requirements of the LT2ESWTR would result in an unreasonable risk to health.

I. Requirements for Systems To Use Qualified Operators

The SWTR established a requirement that each public water system using a surface water source or a ground water source under the direct influence of surface water must be operated by qualified personnel who meet the requirements specified by the State (40 CFR 141.70). The Stage 1 DBPR extended this requirement to include all systems affected by that rule, and required that States maintain a register of qualified operators (40 CFR 141.130(c)). While the proposed LT2ESWTR establishes no new requirements regarding the operation of systems by qualified personnel, the Agency would like to emphasize the important role that qualified operators play in delivering safe drinking water to the public. EPA encourages States that do not already have operator certification programs in effect to develop such programs. States should also review and modify, as required, their qualification standards to take into account new technologies (e.g., ultraviolet disinfection) and new compliance requirements.

J. System Reporting and Recordkeeping Requirements

1. Overview

Today's proposal includes reporting and recordkeeping requirements associated with proposed monitoring and treatment requirements. As described earlier, systems must conduct source water monitoring to determine a treatment bin classification for filtered systems or a mean Cryptosporidium level for unfiltered systems. Systems with previously collected monitoring data may be able to use (i.e., grandfather) those data in lieu of conducting new monitoring. Following source water monitoring, systems will be required to comply with any additional Cryptosporidium treatment requirements by implementing treatment and control strategies from a microbial toolbox of options. Systems must conduct a second round of source water monitoring six years after bin classification.

In addition, systems using uncovered finished water storage facilities must cover the facility or provide treatment unless the system implements a State-approved risk management strategy. Certain systems will be required to conduct disinfection profiling and benchmarking.

The proposed rule requires public water systems to submit schedules for Cryptosporidium, E. coli, and turbidity sampling at least 3 months before monitoring must begin. Source water sample analysis results must be reported no later than 10 days after the end of the first month following the month in which the sample is collected. As described later, large systems (serving at least 10,000 people) will report monitoring results from the initial round of monitoring directly to EPA through an electronic data system. Small systems will report monitoring results to the State. Both small and large systems will report monitoring results from the second round of monitoring to the State.

Systems must report a bin classification (filtered systems) or mean Cryptosporidium level (unfiltered systems) within six months following the month when the last sample in a particular round of monitoring is scheduled to be collected. If systems are required to provide additional treatment for Cryptosporidium, they must report regarding the use of microbial toolbox components. Systems must notify the State within 24 months following promulgation of the rule if they use uncovered finished water storage facilities. Systems must also make reports related to disinfection profiling and benchmarking. Reporting requirements associated with these activities are summarized in Tables IV-25 to IV-28.

Table IV-25.—Summary of Initial Large Filtered System Reporting Requirements
You must report the following items On the following schedule
1States may grant an additional two years for systems making capital improvements.
Sampling schedule for Cryptosporidium, E. coli, and turbidity monitoring No later than 3 months after promulgation.
Results of Cryptosporidium, E. coli, and turbidity analyses No later than 10 days after the end of the first month following the month in which the sample is collected.
Bin determination No later than 36 months after promulgation.
Demonstration of compliance with additional treatment requirements Beginning 72 months after promulgation 1 (see Table IV-34).
Disinfection profiling component reports See Table IV-35.
Table IV-26.—Summary of Initial Small Filtered System Reporting Requirements
You must report the following items On the following schedule
1If the E. coli annual mean concentration exceeds 10/100 mL for systems using lakes/reservoirs or exceeds 50/100 mL for systems using flowing streams, then systems must conduct Cryptosporidium monitoring. States may approve alternative indicator criteria to trigger Cryptosporidium monitoring.
2States may grant an additional two years for systems making capital improvements.
Sampling schedule for E. coli monitoring No later than 27 months after promulgation.
Results of E. coli analyses (unless State approves a different indicator) No later than 10 days after the end of the first month following the month in which the sample was collected.
Mean E. coli concentration (unless State approves a different indicator) No later than 45 months after promulgation.
Disinfection profiling component reports See Table IV-36.
Additional requirements if E. coli trigger level is exceeded 1  
Sampling schedule for Cryptosporidium monitoring No later than 45 months after promulgation.
Results of Cryptosporidium analyses No later than 10 days after the end of the first month following the month in which the sample is collected.
Bin determination No later than 66 months after promulgation.
Demonstration of compliance with additional treatment requirements Beginning 102 months after promulgation 2 (see Table IV-34).
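The E. coli trigger for small filtered systems (Table IV-26, footnote 1) can be expressed as a simple decision rule. The sketch below is an illustration only, not part of the proposed rule text; the function name and the source-type labels are mine.

```python
# Illustrative sketch of the small filtered system E. coli trigger
# (Table IV-26, footnote 1): Cryptosporidium monitoring is triggered
# when the annual mean E. coli concentration exceeds 10/100 mL for
# lake/reservoir sources or 50/100 mL for flowing stream sources.

def crypto_monitoring_triggered(mean_ecoli_per_100ml: float,
                                source_type: str) -> bool:
    """Return True if the annual mean E. coli concentration triggers
    Cryptosporidium monitoring for a small filtered system.

    source_type: "lake_reservoir" or "flowing_stream" (labels assumed).
    """
    thresholds = {"lake_reservoir": 10.0, "flowing_stream": 50.0}
    return mean_ecoli_per_100ml > thresholds[source_type]
```

Note that a mean exactly at the threshold does not exceed it and therefore does not trigger monitoring; States may also approve alternative indicator criteria.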
Table IV-27.—Summary of Initial Large Unfiltered System Reporting Requirements
You must report the following items On the following schedule
1States may grant an additional two years for systems making capital improvements.
Cryptosporidium sampling schedule No later than 3 months after promulgation.
Results of Cryptosporidium analyses No later than 10 days after the end of the first month following the month in which the sample was collected.
Determination of mean Cryptosporidium concentration No later than 36 months after promulgation.
Disinfection profiling component reports See Table IV-35.
Demonstration of compliance with Cryptosporidium inactivation requirements Beginning 72 months after promulgation 1 (see Table IV-34).
Table IV-28.—Summary of Initial Small Unfiltered System Reporting Requirements
You must report the following items On the following schedule
1States may grant an additional two years for systems making capital improvements.
Cryptosporidium sampling schedule No later than 45 months after promulgation.
Results of Cryptosporidium analyses No later than 10 days after the end of the first month following the month in which the sample was collected.
Determination of mean Cryptosporidium concentration No later than 66 months after promulgation.
Disinfection profiling component reports See Table IV-35.
Demonstration of compliance with Cryptosporidium inactivation requirements Beginning 102 months after promulgation 1 (see Table IV-34).

2. Reporting Requirements for Source Water Monitoring

a. Data elements to be reported. Proposed reporting requirements for LT2ESWTR monitoring stem from proposed analytical method requirements. As stated in sections IV.K and IV.L, systems must have Cryptosporidium analyses conducted by EPA-approved laboratories using Methods 1622 or 1623. E. coli analyses must be performed by State-approved laboratories using the E. coli methods proposed for approval in section IV.K. Systems are required to report the data elements specified in Table IV-29 for each Cryptosporidium analysis. To comply with LT2ESWTR requirements, only the sample volume filtered and the number of oocysts counted must be reported for samples in which at least 10 L is filtered and all of the sample volume is analyzed. Additional information is required for samples where the laboratory analyzes less than 10 L or less than the full sample volume collected. Table IV-30 presents the data elements that systems must report for E. coli analyses.

As described in the following section, EPA is developing a data system to manage and analyze the microbial monitoring data that will be reported by large systems under the LT2ESWTR. EPA is exploring approaches for application of this data system to support small system data reporting as well. Systems, or laboratories acting as the systems' agents, must keep Method 1622/1623 bench sheets and slide examination report forms until 36 months after an equivalent round of source water monitoring has been completed (e.g., second round of Cryptosporidium monitoring).

Table IV-29.—Proposed Cryptosporidium Data Elements to be Reported
Data element Reason for data element
1For matrix spike samples, sample volume spiked and estimated number of oocysts spiked must be reported. These data are not required for field samples.
2For samples in which less than 10 L is filtered or less than 100% of the sample volume is examined, the number of filters used and the packed pellet volume must also be reported to verify compliance with LT2ESWTR sample volume analysis requirements. These data are not required for most samples.
3For samples in which less than 100% of the sample is examined, the volume of resuspended concentrate and the volume of this resuspension processed through IMS must be reported to calculate the sample volume examined. These data will not be required for most samples.
Identifying information  
• PWSID Needed to associate plant with public water system.
• Facility ID Needed to associate sample result with facility.
• Sample collection point Needed to associate sample result with sampling point.
• Sample collection date Needed to determine that utilities are collecting samples at the frequency required.
• Sample type (field or matrix spike)1 Needed to distinguish field samples from matrix samples for recovery calculations.
Sample results  
• Sample volume filtered (L), to nearest 1/4 L 2 Needed to verify compliance with sample volume requirements.
• Was 100% of filtered volume examined?3 Needed to calculate the final concentration of oocysts/L and determine if volume analyzed requirements are met.
• Number of oocysts counted Needed to calculate the final concentration of oocysts/L.
Table IV-30.—Proposed E. coli Data Elements to be Reported
Data element Reason for collecting data element
Identifying Information  
PWS ID Needed to associate analytical result with public water system.
Facility ID Needed to associate plant with public water system.
Sample collection point Needed to associate sample result with sampling point.
Sample collection date Needed to determine that utilities are collecting samples at the frequency required.
Analytical method number Needed to associate analytical result with analytical method.
Method Type Needed to verify that an approved method was used and call up correct web entry form.
Source water type Needed to assess Cryptosporidium indicator relationships.
E. coli/100 mL Sample result (although not required, the laboratory also will have the option of entering primary measurements for a sample into the LT2ESWTR internet-based database to have the database automatically calculate the sample result).
Turbidity Information  
Turbidity result Needed to assess Cryptosporidium indicator relationships.

b. Data system. Because source water monitoring by large systems (serving at least 10,000 people) will begin 6 months following promulgation of the LT2ESWTR, EPA expects to act as the primacy agency with oversight responsibility for large system sampling, analysis, and data reporting. To facilitate collection and analysis of large system monitoring data, EPA is developing an Internet-based electronic data collection and management system. This approach is similar to that used under the Unregulated Contaminants Monitoring Rule (UCMR) (64 FR 50556, September 17, 1999) (USEPA 1999c).

Analytical results for Cryptosporidium, E. coli, and turbidity analyses will be reported directly to this database using web forms and software that can be downloaded free of charge. The data system will perform logic checks on data entered and calculate final results from primary data (where necessary). This is intended to reduce reporting errors and limit the time involved in investigating, checking, and correcting errors at all levels. EPA will make large system monitoring data available to States when States assume primacy for the LT2ESWTR or earlier under State agreements with EPA.

Large systems should instruct their laboratories to enter monitoring results into the EPA data system electronically, using web-based manual entry forms or by uploading XML files from laboratory information management systems (LIMS). After data are submitted by a laboratory, systems may review the results on-line. If a system believes that a result was entered into the data system erroneously, the system may ask the laboratory to correct the entry. In addition, if a system believes that a result is incorrect, the system may submit the result as a contested result and petition EPA or the State to invalidate the sample; in that case, the system must submit a rationale, including a supporting statement from the laboratory, to the primacy agency. Systems may arrange with laboratories to review their sample results before the results are entered into the EPA data system. Also, if a system determines that its laboratory does not have the capability to report data electronically, the system can submit a request to EPA to use an alternate reporting format.

Regardless of the reporting process used, systems are required to report an analytical monitoring result to the primacy agency no later than 10 days after the end of the first month following the month when the sample was collected. As described in section IV.A.1, if a system is unable to report a valid Cryptosporidium analytical result for a scheduled sampling date due to failure to comply with the analytical method requirements (e.g., violation of quality control requirements), the system must collect a replacement sample within 14 days of being notified by the laboratory or the State that a result cannot be reported for that date and must submit an explanation for the replacement sample with the analytical results. A system will not incur a monitoring violation if the State determines that the failure to report a valid analysis result was due to circumstances beyond the control of the system. However, in all cases the system must collect a replacement sample.
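The reporting deadline rule above ("no later than 10 days after the end of the first month following the month when the sample was collected") is easy to misread, so a worked sketch may help: a sample collected any day in March must be reported by May 10. The following illustration is mine, not rule text.

```python
import calendar
from datetime import date, timedelta

def reporting_deadline(sample_date: date) -> date:
    """Deadline: 10 days after the end of the first month following
    the month in which the sample was collected."""
    # Move to the month after the sample month ...
    year, month = sample_date.year, sample_date.month + 1
    if month > 12:
        year, month = year + 1, 1
    # ... find the last day of that following month ...
    end_of_following_month = date(year, month,
                                  calendar.monthrange(year, month)[1])
    # ... and add 10 days.
    return end_of_following_month + timedelta(days=10)
```

For example, a sample collected March 15 has a following month of April, so the deadline is April 30 plus 10 days, i.e., May 10.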

The data elements to be collected by the electronic data system will enhance the reliability of the microbial data generated under the LT2ESWTR, while reducing the burden on the analytical laboratories and public water systems. Tables IV-31 and IV-32 summarize the system's data analysis functions for Cryptosporidium measurements.

Table IV-31.—LT2ESWTR Data System Functions for Cryptosporidium Data
Value calculated Formula Applicability to sample types
Field Matrix spike
Calculation of sample volume analyzed (Volume filtered) * (resuspended concentrate volume transferred to IMS / resuspended concentrate volume) Yes Yes.
Pellet volume analyzed (Pellet volume) * (resuspended concentrate volume transferred to IMS / resuspended concentrate volume) Yes Yes.
Calculation of oocysts/L (Number of oocysts counted) / (sample volume analyzed) Yes Yes.
Calculation of estimated number of oocysts spiked/L (Number of oocysts spiked) / (sample volume spiked) No Yes.
Calculation of percent recoveries for MS samples ((Calculated # of oocysts/L for the MS sample) - (Calculated # of oocysts/L in the associated field sample)) / (Estimated number of oocysts spiked/L) * 100% No Yes.
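The Table IV-31 formulas can be sketched directly in code. This is a minimal illustration of the arithmetic only (function and parameter names are mine, not part of the rule or the data system).

```python
def sample_volume_analyzed(volume_filtered_l: float,
                           resuspended_conc_vol_ml: float,
                           vol_transferred_to_ims_ml: float) -> float:
    """Sample volume analyzed (L): (volume filtered) * (resuspended
    concentrate volume transferred to IMS / resuspended concentrate volume)."""
    return volume_filtered_l * (vol_transferred_to_ims_ml /
                                resuspended_conc_vol_ml)

def oocysts_per_liter(oocysts_counted: int,
                      volume_analyzed_l: float) -> float:
    """Final concentration: (number of oocysts counted) / (sample volume analyzed)."""
    return oocysts_counted / volume_analyzed_l

def ms_percent_recovery(ms_oocysts_per_l: float,
                        field_oocysts_per_l: float,
                        spiked_oocysts_per_l: float) -> float:
    """Matrix spike recovery: ((MS result - field result) / spike level) * 100%."""
    return (ms_oocysts_per_l - field_oocysts_per_l) / spiked_oocysts_per_l * 100.0
```

For example, if half of a 10-L sample's resuspended concentrate is processed through IMS, the volume analyzed is 5 L, and 4 oocysts counted in 10 L analyzed yields 0.4 oocysts/L.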
Table IV-32.—LT2ESWTR Data System Functions for Cryptosporidium Compliance Checks
LT2 requirements Description
Sample volume analysis Specifies that the LT2 requirements for sample volume analyzed were met when:
• volume analyzed is ≥ 10 L.
• volume analyzed is < 10 L and pellet volume analyzed is at least 2 mL.
• volume analyzed is < 10 L and pellet volume analyzed is < 2 mL and 100% of filtered volume examined = Y and two filters were used.
Specifies that the LT2 requirements for sample volume analyzed were not met when:
• volume analyzed is < 10 L and pellet volume analyzed is < 2 mL and 100% of filtered volume examined = N.
• volume analyzed is < 10 L and pellet volume analyzed is < 2 mL and only 1 filter was used.
Schedule met Specifies that the predetermined sampling schedule is met when the sample collection date is within ± 2 days of the scheduled date.
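The sample-volume compliance check can be sketched as a single predicate. This is an illustration under my reading of the requirements described in this section and section IV.K (at least 10 L analyzed; otherwise at least 2 mL of packed pellet; otherwise 100% of the filtered volume examined using two filters), not the data system's actual implementation.

```python
def volume_requirements_met(volume_analyzed_l: float,
                            pellet_volume_ml: float,
                            full_filtered_volume_examined: bool,
                            filters_used: int) -> bool:
    """Sketch of the LT2 sample-volume compliance check.

    Met when at least 10 L is analyzed; for smaller analyzed volumes,
    met when at least 2 mL of packed pellet is analyzed, or when 100%
    of the filtered volume was examined using two filters.
    """
    if volume_analyzed_l >= 10.0:
        return True
    if pellet_volume_ml >= 2.0:
        return True
    return full_filtered_volume_examined and filters_used == 2
```

A sample with only 6 L analyzed but a 2.5-mL packed pellet therefore passes, while a 6-L sample with a small pellet, partial examination, or a single clogged filter does not.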

c. Previously collected monitoring data. Table IV-33 provides a summary of the items that systems must report to EPA for consideration of previously collected (grandfathered) monitoring data under the LT2ESWTR. For each field and matrix spike (MS) sample, systems must report the data elements specified in Table IV-29. In addition, the laboratory that analyzed the samples must submit a letter certifying that all Method 1622 and 1623 quality control requirements (including ongoing precision and recovery (OPR) and method blank (MB) results, holding times, and positive and negative staining controls) were performed at the required frequency and were acceptable. Alternatively, the laboratory may provide for each field, MS, OPR, and MB sample a bench sheet and sample examination report form (Method 1622 and 1623 bench sheets are shown in USEPA 2003h).

Systems must report all routine source water Cryptosporidium monitoring results collected during the period covered by the previously collected data that have been submitted. This applies to all samples that were collected from the sampling location used for monitoring, not spiked, and analyzed using the laboratory's routine process for Method 1622 or 1623 analyses, including analytical technique and QA/QC. Other requirements associated with use of previously collected data are specified in section IV.A.1.d. Where applicable, systems must provide documentation addressing the dates and reason(s) for re-sampling, as well as the use of presedimentation, off-stream storage, or bank filtration during monitoring. Review of the submitted information, along with the results of the quality assurance audits of the laboratory that produced the data, will be used to determine whether the data meet the requirements for grandfathering.

Table IV-33.—Items That Must Be Reported for Consideration of Grandfathered Monitoring Data
The following items must be reported1 On the following schedule1
1See section IV.A.1. for details.
Data elements listed in Table IV-29 for each field and MS sample No later than 2 months after promulgation if the system does not intend to conduct new monitoring under the LT2ESWTR.
Letter from laboratory certifying that method-specified QC was performed at required frequency and was acceptable  
OR OR
Method 1622/1623 bench sheet and sample examination report form for each field, MS, OPR, and method blank sample No later than 8 months after promulgation if the system intends to conduct new monitoring under the LT2ESWTR.
Letter from system certifying (1) that all source water data collected during the time period covered by the previously collected data have been submitted and (2) that the data represent the plant's current source water
Where applicable, documentation addressing the dates and reason(s) for re-sampling, as well as the use of presedimentation, off-stream storage, or bank filtration during monitoring

3. Compliance With Additional Treatment Requirements

Under the proposed LT2ESWTR, systems may choose from a “toolbox” of management and treatment options to meet their additional Cryptosporidium treatment requirements. In order to receive credit for toolbox components, systems must initially demonstrate that they comply with any required design and implementation criteria, including performance validation testing. Additionally, systems must provide monthly verification of compliance with any required operational criteria, as shown through ongoing monitoring. Required design, implementation, operational, and monitoring criteria for toolbox components are described in section IV.C. Proposed reporting requirements associated with these criteria are shown in Table IV-34 for both large and small systems.

Table IV-34.—Toolbox Reporting Requirements
Toolbox option (potential Cryptosporidium reduction log credit) You must submit the following items On the following schedule 1 (systems serving ≥10,000 people) On the following schedule 1 (systems serving <10,000 people)
1States may allow an additional two years for systems making capital improvements.
Watershed Control Program (WCP) (0.5 log) Notify State of intention to develop WCP No later than 48 months after promulgation No later than 78 months after promulgation.
Submit initial WCP plan to State No later than 60 months after promulgation No later than 90 months after promulgation.
Annual program status report and State-approved watershed survey report By a date determined by the State, every 12 months, beginning 84 months after promulgation By a date determined by the State, every 12 months, beginning 114 months after promulgation.
Request for re-approval and report on the previous approval period No later than 6 months prior to the end of the current approval period or by a date previously determined by the State No later than 6 months prior to the end of the current approval period or by a date previously determined by the State.
Pre-sedimentation (0.5 log) (new basins) Monthly verification of: Continuous basin operation Treatment of 100% of the flow Continuous addition of a coagulant At least 0.5 log removal of influent turbidity based on the monthly mean of daily turbidity readings for 11 of the 12 previous months Monthly reporting within 10 days following the month in which the monitoring was conducted, beginning 72 months after promulgation Monthly reporting within 10 days following the month in which the monitoring was conducted, beginning 102 months after promulgation.
Two-Stage Lime Softening (0.5 log) Monthly verification of: Continuous operation of a second clarification step between the primary clarifier and filter Presence of coagulant (may be lime) in first and second stage clarifiers Both clarifiers treat 100% of the plant flow No later than 72 months after promulgation No later than 102 months after promulgation.
Bank filtration (0.5 or 1.0 log) (new) Initial demonstration of: Unconsolidated, predominantly sandy aquifer Setback distance of at least 25 ft. (0.5 log) or 50 ft. (1.0 log) Initial demonstration no later than 72 months after promulgation Initial demonstration no later than 102 months after promulgation.
If monthly average of daily max turbidity is greater than 1 NTU then system must report result and submit an assessment of the cause Report within 30 days following the month in which the monitoring was conducted, beginning 72 months after promulgation Report within 30 days following the month in which the monitoring was conducted, beginning 102 months after promulgation.
Combined filter performance (0.5 log) Monthly verification of: Combined filter effluent (CFE) turbidity levels less than or equal to 0.15 NTU in at least 95 percent of the 4-hour CFE measurements taken each month Monthly reporting within 10 days following the month in which the monitoring was conducted, beginning 72 months after promulgation Monthly reporting within 10 days following the month in which the monitoring was conducted, beginning 102 months after promulgation.
Membranes (MF, UF, NF, RO) (2.5 log or greater based on verification/integrity testing) Initial demonstration of: Removal efficiency through challenge studies Methods of challenge studies meet rule criteria Integrity test results and baseline No later than 72 months after promulgation No later than 102 months after promulgation.
Monthly report summarizing: All direct integrity test results above the control limit and the corrective action that was taken All indirect integrity monitoring results triggering direct integrity testing and the corrective action that was taken Within 10 days following the month in which monitoring was conducted, beginning 72 months after promulgation Within 10 days following the month in which monitoring was conducted, beginning 102 months after promulgation.
Bag filters (1.0 log) and Cartridge filters (2.0 log) Initial demonstration that the following criteria are met: Process meets the basic definition of bag or cartridge filtration; Removal efficiency established through challenge testing that meets rule criteria Challenge test shows at least 2 and 3 log removal for bag and cartridge filters, respectively No later than 72 months after promulgation No later than 102 months after promulgation.
Chlorine dioxide (log credit based on CT) Summary of CT values for each day and log inactivation based on tables in section IV.C.14 Within 10 days following the month in which monitoring was conducted, beginning 72 months after promulgation Within 10 days following the month in which monitoring was conducted, beginning 102 months after promulgation.
Ozone (log credit based on CT) Summary of CT values for each day and log inactivation based on tables in section IV.C.14 Within 10 days following the month in which monitoring was conducted, beginning 72 months after promulgation Within 10 days following the month in which monitoring was conducted, beginning 102 months after promulgation.
UV (log credit based on UV dose and operating within validated conditions) Results from reactor validation testing demonstrating operating conditions that achieve the required UV dose No later than 72 months after promulgation No later than 102 months after promulgation.
Monthly report summarizing the percentage of water entering the distribution system that was not treated by UV reactors operating within validated conditions for the required UV dose in section IV.C.15 Within 10 days following the month in which monitoring was conducted, beginning 72 months after promulgation Within 10 days following the month in which monitoring was conducted, beginning 102 months after promulgation.
Individual filter performance (1.0 log) Monthly verification of the following, based on continuous monitoring of turbidity for each individual filter: Filtered water turbidity less than 0.1 NTU in at least 95 percent of the daily maximum values from individual filters (excluding the 15-minute period following start-up after backwashes) No individual filter with a measured turbidity greater than 0.3 NTU in two consecutive measurements taken 15 minutes apart Monthly reporting within 10 days following the month in which the monitoring was conducted, beginning 72 months after promulgation Monthly reporting within 10 days following the month in which the monitoring was conducted, beginning 102 months after promulgation.
Demonstration of Performance Results from testing following State approved protocol No later than 72 months after promulgation No later than 102 months after promulgation.
Monthly verification of operation within State-approved conditions for demonstration of performance credit Within 10 days following the month in which monitoring was conducted, beginning 72 months after promulgation Within 10 days following the month in which monitoring was conducted, beginning 102 months after promulgation.
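Several of the monthly verifications in Table IV-34 reduce to simple percentage checks. As one example, the combined filter performance criterion (CFE turbidity ≤ 0.15 NTU in at least 95 percent of the 4-hour measurements each month) can be sketched as follows; this is an illustration of the criterion only, not rule text, and the function name is mine.

```python
def cfe_credit_verified(cfe_ntu_readings: list[float]) -> bool:
    """Combined filter performance check (Table IV-34): CFE turbidity
    must be <= 0.15 NTU in at least 95 percent of the 4-hour combined
    filter effluent measurements taken during the month."""
    if not cfe_ntu_readings:
        return False  # no measurements, nothing to verify
    passing = sum(1 for ntu in cfe_ntu_readings if ntu <= 0.15)
    return passing / len(cfe_ntu_readings) >= 0.95
```

With six measurements per day over a 30-day month (180 readings), up to 9 readings could exceed 0.15 NTU and the month would still verify.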

Reporting requirements associated with disinfection profiling and benchmarking are summarized in Table IV-35 for large systems and in Table IV-36 for small systems.

Table IV-35.—Disinfection Benchmarking Reporting Requirements for Large Systems
System type Benchmark component Submit the following items On the following schedule
1Systems that provide at least 5.5 log of Cryptosporidium treatment consistent with a Bin 4 treatment implication are not required to conduct Cryptosporidium monitoring.
Systems required to conduct Cryptosporidium monitoring Characterization of Disinfection Practices Giardia lamblia and virus inactivation profiles must be on file for State review during sanitary survey No later than 36 months after promulgation.
State Review of Proposed Changes to Disinfection Practices Inactivation profiles and benchmark determinations Prior to significant modification of disinfection practice.
Systems not required to conduct Cryptosporidium monitoring1 Applicability None None.
Characterization of Disinfection Practices None None.
State Review of Proposed Changes to Disinfection Practices None None.
Table IV-36.—Disinfection Benchmarking Reporting Requirements for Small Systems
System type Benchmark component Submit the following items On the following schedule
1Systems that provide at least 5.5 log of Cryptosporidium treatment consistent with a Bin 4 treatment implication are not required to conduct Cryptosporidium monitoring.
2If the E. coli annual mean concentration is ≤ 10/100 mL for systems using lake/reservoir sources or ≤ 50/100 mL for systems using flowing stream sources, the system is not required to conduct Cryptosporidium monitoring and will only be required to characterize disinfection practices if DBP triggers are exceeded.
3If the system is a CWS or NTNCWS and TTHM or HAA5 levels in the distribution system are at least 0.064 mg/L or 0.048 mg/L, respectively, calculated as an LRAA at any Stage 1 DBPR sampling site, then the system is triggered into disinfection profiling.
Systems required to conduct Cryptosporidium monitoring Characterization of Disinfection Practices Giardia lamblia and virus inactivation profiles must be on file for State review during sanitary survey No later than 66 months after promulgation.
State Review of Proposed Changes to Disinfection Practices Inactivation profiles and benchmark determinations Prior to significant modification of disinfection practice.
Systems not required to conduct Cryptosporidium monitoring and that exceed DBP triggers1,2,3 Applicability Period Notify State that profiling is required based on DBP levels No later than 42 months after promulgation.
Characterization of Disinfection Practices Giardia lamblia and virus inactivation profiles must be on file for State review during sanitary survey No later than 54 months after promulgation.
State Review of Proposed Changes to Disinfection Practices Inactivation profiles and benchmark determinations Prior to significant modification of disinfection practice.
Systems not required to conduct Cryptosporidium monitoring and that do not exceed DBP triggers2,3 Applicability Period Notify State that profiling is not required based on DBP levels No later than 42 months after promulgation.
Characterization of Disinfection Practices None None.
State Review of Proposed Changes to Disinfection Practices None None.

4. Request for Comment

EPA requests comment on the reporting and recordkeeping requirements proposed for the LT2ESWTR.

Specifically, the Agency requests comment on the proposed requirement that systems report monthly on the use of microbial toolbox components to demonstrate compliance with their Cryptosporidium treatment requirements. An alternative may be for systems to keep records on site for State review instead of reporting the data.

K. Analytical Methods

EPA is proposing to require public water systems to conduct LT2ESWTR monitoring using approved methods for Cryptosporidium, E. coli, and turbidity analyses. This includes meeting quality control criteria stipulated by the approved methods and additional method-specific requirements, as stated later in this section. Related requirements on the use of approved laboratories are discussed in section IV.L, and proposed requirements for reporting of data were stated previously in section IV.J. EPA has developed draft guidance for sampling and analyses under the LT2ESWTR (see USEPA 2003g and 2003h). This guidance is available in draft form in the docket for today's proposal (http://www.epa.gov/edocket/).

1. Cryptosporidium

a. What is EPA proposing today? Method 1622: “Cryptosporidium in Water by Filtration/IMS/FA” (EPA-821-R-01-026, April 2001) (USEPA 2001e) and Method 1623: “Cryptosporidium and Giardia in Water by Filtration/IMS/FA” (EPA 821-R-01-025, April 2001) (USEPA 2001f) are proposed for Cryptosporidium analysis under this rule. Methods 1622 and 1623 require filtration, immunomagnetic separation (IMS) of the oocysts from the captured material, and examination based on IFA, DAPI staining results, and differential interference contrast (DIC) microscopy for determination of oocyst concentrations.

Method Requirements

For each Cryptosporidium sample under this proposal, all systems must analyze at least a 10-L sample volume. Systems may collect and analyze greater than a 10-L sample volume. If a sample is very turbid, it may generate a large packed pellet volume upon centrifugation (a packed pellet refers to the concentrated sample after centrifugation has been performed in EPA Methods 1622 and 1623). Based on IMS purification limitations, samples resulting in large packed pellets will require that the sample concentrate be aliquoted into multiple “subsamples” for independent processing through IMS, staining, and examination. Because of the expense of the IMS reagents and analyst time to examine multiple slides per sample, systems are not required to analyze more than 2 mL of packed pellet volume per sample.

In cases where it is not feasible for a system to process a 10-L sample for Cryptosporidium analysis (e.g., the filter clogs prior to filtration of 10 L), the system must analyze as much sample volume as can be filtered by two filters, up to a packed pellet volume of 2 mL. This condition applies only to filters that have been approved by EPA for nationwide use with Methods 1622 and 1623—the Pall Gelman Envirochek TM and Envirochek TM HV filters, the IDEXX Filta-Max TM foam filter, and the Whatman CrypTest TM cartridge filter.
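For illustration only, the minimum-volume condition described above can be expressed as a short sketch. The function name and inputs are hypothetical and are not part of the proposed rule text; the sketch assumes the two-filter exception applies only when clogging limits the filtered volume.

```python
def sample_volume_acceptable(volume_filtered_L: float,
                             filters_used: int,
                             filter_clogged: bool,
                             packed_pellet_mL: float) -> bool:
    """Illustrative check of the proposed minimum-volume requirement.

    A system must analyze at least 10 L; if filter clogging prevents this,
    it must analyze as much volume as two approved filters can pass, up to
    a packed pellet volume of 2 mL.
    """
    if volume_filtered_L >= 10.0:
        return True
    if packed_pellet_mL >= 2.0:
        # The 2-mL packed pellet cap has been reached; no further
        # analysis is required.
        return True
    # A volume below 10 L is otherwise acceptable only when two filters
    # clogged before 10 L could be filtered.
    return filter_clogged and filters_used >= 2

print(sample_volume_acceptable(10.0, 1, False, 0.5))  # True
print(sample_volume_acceptable(6.0, 2, True, 0.8))    # True (two filters clogged)
print(sample_volume_acceptable(6.0, 1, True, 0.8))    # False
```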

Methods 1622 and 1623 include fluorescein isothiocyanate (FITC) as the primary antibody stain for Cryptosporidium detection, DAPI staining to detect nuclei, and DIC to detect internal structures. For purposes of the LT2ESWTR, systems must report total Cryptosporidium oocysts as detected by FITC as determined by the color (apple green or alternative stain color approved for the laboratory under the Lab QA Program described in section IV.L), size (4-6 μm) and shape (round to oval). This total includes all of the oocysts identified as described here, less atypical organisms identified by FITC, DIC, or DAPI (e.g., possessing spikes, stalks, appendages, pores, one or two large nuclei filling the cell, red fluorescing chloroplasts, crystals, spores, etc.).

Matrix Spike Samples

As required by Methods 1622 and 1623, systems must have 1 matrix spike (MS) sample analyzed for each 20 source water samples. The volume of the MS sample must be within ten percent of the volume of the unspiked sample that is collected at the same time, and the samples must be collected by splitting the sample stream or collecting the samples sequentially. The MS sample and the associated unspiked sample must be analyzed by the same procedure. MS samples must be spiked and filtered in the laboratory. However, if the volume of the MS sample is greater than 10 L, the system is permitted to filter all but 10 L of the MS sample in the field, and ship the filtered sample and the remaining 10 L of source water to the laboratory. In this case, the laboratory must spike the remaining 10 L of water and filter it through the filter used to collect the balance of the sample in the field.
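The two quantitative MS conditions above (one MS analysis per 20 source water samples, and MS volume within ten percent of the paired unspiked sample) can be sketched as follows; the function names are illustrative, not part of the methods.

```python
def ms_volume_ok(ms_volume_L: float, unspiked_volume_L: float) -> bool:
    """MS sample volume must be within 10% of the paired unspiked sample."""
    return abs(ms_volume_L - unspiked_volume_L) <= 0.10 * unspiked_volume_L

def ms_samples_required(n_source_samples: int) -> int:
    """One MS analysis is required for each 20 source water samples."""
    return -(-n_source_samples // 20)  # ceiling division

print(ms_volume_ok(10.8, 10.0))   # True (8% difference)
print(ms_volume_ok(12.0, 10.0))   # False (20% difference)
print(ms_samples_required(24))    # 2
```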

EPA is proposing to require the use of flow cytometer-counted spiking suspensions for spiked QC samples during the LT2ESWTR. This provision is based on the improved precision expected for spiking suspensions counted with a flow cytometer, as compared to those counted using well slides or hemacytometers. During the Information Collection Rule Supplemental Surveys, the mean relative standard deviation (RSD) across 25 batches of flow cytometer-sorted Cryptosporidium spiking suspensions was 1.8%, with a median of 1.7% (Connell et al. 2000). In EPA Performance Evaluation (PE) studies, the mean RSD for flow cytometer sorted Cryptosporidium spiking suspensions was 3.4%. In comparison, the mean RSD for Cryptosporidium spiking suspensions enumerated manually by 20 laboratories using well slides or hemacytometers was 17% across 108 rounds of 10-replicate counts.
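The relative standard deviation cited above is the sample standard deviation expressed as a percentage of the mean. A minimal sketch of the computation follows; the replicate counts are illustrative values, not data from the cited studies.

```python
import statistics

def relative_std_dev(counts):
    """Relative standard deviation (percent) of replicate oocyst counts."""
    return 100.0 * statistics.stdev(counts) / statistics.mean(counts)

# Flow cytometer-sorted suspensions cluster tightly around the target
# count; manually enumerated suspensions scatter more widely
# (replicate values below are illustrative only).
flow_sorted = [100, 100, 99, 101, 100]
manual      = [100, 85, 117, 92, 108]

print(round(relative_std_dev(flow_sorted), 1))  # 0.7
print(round(relative_std_dev(manual), 1))       # 12.6
```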

QC requirements in Methods 1622 and 1623 must be met by laboratories analyzing Cryptosporidium samples under the LT2ESWTR. The QC acceptance criteria are the same as stipulated in the methods. For the initial precision and recovery (IPR) test, the mean Cryptosporidium recovery must be 24% to 100%, with a maximum relative standard deviation (i.e., precision) of 55%. For each ongoing precision and recovery (OPR) sample, recovery must be in the range of 11% to 100%. For each method blank, no oocysts may be detected.
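The three numeric acceptance criteria above can be restated as a sketch (illustrative only; the actual methods specify additional QC elements beyond these thresholds):

```python
def ipr_acceptable(mean_recovery_pct: float, rsd_pct: float) -> bool:
    """Initial precision and recovery: mean recovery 24-100%, RSD <= 55%."""
    return 24.0 <= mean_recovery_pct <= 100.0 and rsd_pct <= 55.0

def opr_acceptable(recovery_pct: float) -> bool:
    """Ongoing precision and recovery: recovery must fall in 11-100%."""
    return 11.0 <= recovery_pct <= 100.0

def blank_acceptable(oocysts_detected: int) -> bool:
    """Method blank: no oocysts may be detected."""
    return oocysts_detected == 0

print(ipr_acceptable(43.0, 47.0))  # True
print(opr_acceptable(8.0))         # False
print(blank_acceptable(0))         # True
```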

Methods 1622 and 1623 are performance-based methods and, therefore, allow multiple options to perform the sample processing steps in the methods if a laboratory can meet applicable QC criteria and uses the same determinative technique. If a laboratory uses the same procedures for all samples, then all field samples and QC samples must be analyzed in that same manner. However, if a laboratory uses more than one set of procedures for Cryptosporidium analyses under LT2ESWTR then the laboratory must analyze separate QC samples for each option to verify compliance with the QC criteria. For example, if the laboratory analyzes samples using both the Envirochek TM and Filta-Max TM filters, a separate set of IPR, OPR, method blank, and MS samples must be analyzed for each filtration option.

b. How was this proposal developed? EPA is proposing EPA Methods 1622 and 1623 for Cryptosporidium analyses under the LT2ESWTR because these are the best available methods that have undergone full validation testing. In addition, these methods have been used successfully in a national source water monitoring program as part of the Information Collection Rule Supplemental Surveys (ICRSS). The minimum sample volume and other quality control requirements are intended to ensure that data are of sufficient quality to assign systems to LT2ESWTR risk bins. Further, the proposed method requirements for analysis of Cryptosporidium are consistent with recommendations by the Stage 2 M-DBP Advisory Committee. In the Agreement in Principle, the Committee recommended that source water Cryptosporidium monitoring under the LT2ESWTR be conducted using EPA Methods 1622 and 1623 with no less than 10 L samples. EPA also has proposed these methods for approval for ambient water monitoring under Guidelines Establishing Test Procedures for the Analysis of Pollutants; Analytical Methods for Biological Pollutants in Ambient Water (66 FR 45811, August 30, 2001) (USEPA 2001i).

When considering the method performance that could be achieved for analysis of Cryptosporidium under the LT2ESWTR, EPA and the Advisory Committee evaluated the Cryptosporidium recoveries reported for Methods 1622 and 1623 in the ICRSS. As described in section III.C, the ICRSS was a national monitoring program that involved 87 utilities sampling twice per month over 1 year for Cryptosporidium and other microorganisms and water quality parameters. During the ICRSS, the mean recovery and relative standard deviation associated with enumeration of MS samples for total oocysts by Methods 1622 and 1623 were 43% and 47%, respectively (Connell et al. 2000).

EPA believes that with provisions like the Laboratory QA Program for Cryptosporidium laboratories (see section IV.L), comparable performance to that observed in the ICRSS can be achieved in LT2ESWTR monitoring with the use of Methods 1622 and 1623, and that this level of performance will be sufficient to realize the public health goals intended by EPA and the Advisory Committee for the LT2ESWTR. Other methods would need to achieve comparable performance to be considered for use under the LT2ESWTR. For example, EPA does not expect the Information Collection Rule Method, which resulted in 12% mean recovery for MS samples during the Information Collection Rule Laboratory Spiking Program (Scheller, 2002), to meet LT2ESWTR data quality objectives.

For systems collecting samples larger than 10 L, EPA is proposing the approach of allowing systems to filter all but 10 L of the corresponding MS sample in the field, and ship the filtered sample and the remaining 10 L of source water to the laboratory for spiking and analysis. The Agency has determined that the added costs associated with shipping entire high-volume (e.g. 50-L) samples to a laboratory for spiking and analysis are not merited by improved data quality relative to the use of Cryptosporidium MS data under the LT2ESWTR. EPA estimates that the average cost for shipping a 50-L bulk water sample is $350 more than the cost of shipping a 10-L sample and a filter. A study comparing these two approaches (i.e., spiking and filtering 50 L vs. field filtering 40 L and spiking 10 L) indicated that spiking the 10-L sample produced somewhat higher recoveries (USEPA 2003i). However, the differences were not significant enough to offset the greatly increased shipping costs, given the limited use of MS data in LT2ESWTR monitoring.

c. Request for comment. EPA requests comment on the proposed method requirements for Cryptosporidium analysis, including the following specific issues:

Minimum Sample Volume

It is the intent of EPA that LT2ESWTR sampling provide representative annual mean source water concentrations. If systems were unable to analyze an entire sample volume during certain periods of the year due to elevated turbidity or other water quality factors, this could result in systems analyzing different volumes in different samples. Today's proposal requires systems to analyze at least 10 L of sample or the maximum amount of sample that can be filtered through two filters, up to a packed pellet volume of 2 mL. EPA requests comment on whether these requirements are appropriate for systems with source waters that are difficult to filter or that generate a large packed pellet volume. Alternatively, systems could be required to filter and analyze at least 10 L of sample with no exceptions.

Approval of Updated Versions of EPA Methods 1622 and 1623

EPA has developed draft revised versions of EPA Methods 1622 and 1623 in order to consolidate several method-related changes EPA believes may be necessary to address LT2ESWTR monitoring requirements (see USEPA 2003j and USEPA 2003k). EPA is requesting comment on whether these revised versions should be approved for monitoring under the LT2ESWTR, rather than the April 2001 versions proposed in today's rule. If the revised versions were approved, previously collected data generated using the earlier versions of the methods would still be acceptable for grandfathering, provided the other criteria described in section IV.A.1.d were met. Drafts of the updated methods are provided in the docket for today's rule, and differences between these versions and the April 2001 versions of the methods are clearly indicated for evaluation and comment. Changes to the methods include the following:

(1) Increased flexibility in matrix spike (MS) and initial precision and recovery (IPR) requirements—the requirement that the laboratory must analyze an MS sample on the first sampling event for a new PWS would be changed to a recommendation; the revised method would allow the IPR test to be performed across four different days, rather than restrict analyses to 1 day;

(2) Clarification of some method procedures, including the spiking suspension vortexing procedure and the buffer volumes used during immunomagnetic separation (IMS); requiring (rather than recommending) that laboratories purchase HCl and NaOH standards at the normality specified in the method; and clarification that the use of methanol during slide staining in section 14.2 of the method is as per manufacturer's instructions;

(3) Additional recommendations for minimizing carry-over of debris onto microscope slides after IMS and information on microscope cleaning;

(4) Clarification in the method of the actions to take in the event of QC failures, such as that any positive sample in a batch associated with an unacceptable method blank is unacceptable and that any sample in a batch associated with an unacceptable ongoing precision and recovery (OPR) sample is unacceptable;

(5) Changes to the sample storage and shipping temperature to “less than 10°C and not frozen”, and additional guidance on sample storage and shipping procedures that addresses time of collection, and includes suggestions for monitoring sample temperature during shipment and upon receipt at the laboratory.

(6) Additional analyst verification procedures—adding examination using differential interference contrast (DIC) microscopy to the analyst verification requirements.

(7) Addition of an approved method modification using the Pall Gelman Envirochek HV filter. This approval was based on an interlaboratory validation study demonstrating that three laboratories, each analyzing reagent water and a different source water, met all method acceptance criteria for Cryptosporidium. EPA issued a letter (dated March 21, 2002) under the Alternative Test Procedures program approving the procedure as an acceptable version of Method 1623 for Cryptosporidium (but not for Giardia). EPA also noted in the letter that the procedure was considered to be an acceptable modification of EPA Method 1622.

(8) Incorporation of detailed procedures for concentrating samples using an IDEXX Filta-Max TM foam filter. A method modification using this filter already is approved by EPA in the April 2001 versions of the methods.

(9) Addition of BTF EasySeed TM irradiated oocysts and cysts as acceptable materials for spiking routine QC samples. EPA approved the use of EasySeed TM based on side-by-side comparison tests of method recoveries using EasySeed TM and live, untreated organisms. EPA issued a letter (dated August 1, 2002) approving EasySeed TM for use in routine QC samples for EPA Methods 1622 and 1623 and for demonstrating comparability of method modifications in a single laboratory.

(10) Removal of the Whatman Nuclepore CrypTest TM cartridge filter. Although a method modification using this filter was approved by EPA in the April 2001 versions of the methods, the filter is no longer available from the manufacturer, and so is no longer an option for sample filtration.

The changes in the June 2003 draft revisions of EPA Methods 1622 and 1623 reflect method-related clarifications, modifications, and additions that EPA believes should be addressed for LT2ESWTR Cryptosporidium monitoring. Alternatively, these issues could be addressed through regulatory requirements in the final LT2ESWTR (for required changes and additions) and through guidance (for recommended changes and clarifications). However, EPA believes that addressing these issues through a single source in updated versions of EPA Methods 1622 and 1623 (which could be approved in the final LT2ESWTR) may be more straightforward and easier for systems and laboratories to follow than addressing them in multiple sources (i.e., existing methods, the final rule, and laboratory guidance).

2. E. coli

a. What is EPA proposing today? For enumerating source water E. coli density under the LT2ESWTR, EPA is proposing to approve the same methods that were proposed by EPA under Guidelines Establishing Test Procedures for the Analysis of Pollutants; Analytical Methods for Biological Pollutants in Ambient Water (66 FR 45811, August 30, 2001) (USEPA 2001i). These methods are summarized in Table IV-37. Methods are listed within the general categories of most probable number tests and membrane filtration tests. Method identification numbers are provided for applicable standards published by EPA and voluntary consensus standards bodies (VCSB), including Standard Methods, the American Society for Testing and Materials (ASTM), and AOAC International (AOAC).

Table IV-37.—Proposed Methods for E. coli Enumeration 1

Technique                          EPA      Standard        ASTM 3     AOAC 4   Commercial example
                                            Methods 2
--------------------------------------------------------------------------------------------------
Most Probable Number (MPN)
  LTB, EC-MUG                               9221B.1/9221F
  ONPG-MUG                                  9223B                      991.15   Colilert® 5
  ONPG-MUG                                  9223B                               Colilert-18® 5,7
Membrane Filter (MF)
  mFC ➝ NA-MUG                              9222D/9222G
  mENDO or LES-ENDO ➝ NA-MUG                9222B/9222G
  mTEC agar                        1103.1   9213D           D5392-93
  Modified mTEC agar               1603
  MI medium                        1604
  m-ColiBlue24 broth                                                            m-ColiBlue24® 6

1 Tests must be conducted in a format that provides organism enumeration.
2 Standard Methods for the Examination of Water and Wastewater, 20th, 19th, and 18th Editions. American Public Health Association, Washington, DC.
3 Annual Book of ASTM Standards—Water and Environmental Technology, Section 11.02. ASTM, 100 Barr Harbor Drive, West Conshohocken, PA 19428.
4 Official Methods of Analysis of AOAC International, 16th Edition, Volume I, Chapter 17. AOAC International, 481 North Frederick Avenue, Suite 500, Gaithersburg, MD 20877-2417.
5 Manufactured by IDEXX Laboratories, Inc., One IDEXX Drive, Westbrook, ME 04092.
6 Manufactured by Hach Company, 100 Dayton Ave., Ames, IA 50010.
7 Acceptable version of the method approved as a drinking water alternative test procedure.

EPA is proposing to allow a holding time of 24 hours for E. coli samples. The holding time refers to the time between sample collection and initiation of analysis. Currently, 40 CFR 141.74(a) limits the holding time for source water coliform samples to 8 hours and requires that samples be kept below 10°C during transit. EPA believes that new studies, described later in this section, demonstrate that E. coli analysis results for samples held for 24 hours will be comparable to those for samples held for 8 hours, provided the samples are held below 10°C and are not allowed to freeze. This proposed increase in holding time is significant for the LT2ESWTR because it typically is not feasible for systems to meet an 8-hour holding time when samples cannot be analyzed on-site. Many small systems that will conduct E. coli monitoring under the LT2ESWTR lack a certified on-site laboratory for E. coli analyses and will be required to ship samples to a certified laboratory. EPA believes that it is feasible for these systems to comply with a 24-hour holding time for E. coli samples by using overnight delivery services.
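The proposed acceptance conditions can be summarized in a short sketch (illustrative only; the function name and inputs are hypothetical, and the rule text, not this sketch, governs compliance):

```python
from datetime import datetime, timedelta

def ecoli_sample_acceptable(collected: datetime,
                            analysis_started: datetime,
                            transit_temp_C: float,
                            frozen: bool) -> bool:
    """Proposed condition: analysis begun within 24 hours of collection,
    sample kept below 10 deg C in transit and not allowed to freeze."""
    held = analysis_started - collected
    return (held <= timedelta(hours=24)
            and transit_temp_C < 10.0
            and not frozen)

collected = datetime(2003, 6, 2, 9, 0)
print(ecoli_sample_acceptable(collected, collected + timedelta(hours=22), 6.0, False))  # True
print(ecoli_sample_acceptable(collected, collected + timedelta(hours=30), 6.0, False))  # False
```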

b. How was this proposal developed? As noted, EPA recently proposed methods for ambient water E. coli analysis under Guidelines Establishing Test Procedures for the Analysis of Pollutants; Analytical Methods for Biological Pollutants in Ambient Water (66 FR 45811, August 30, 2001) (USEPA 2001i). These proposed methods were selected based on data generated by EPA laboratories, submissions to the alternate test procedures (ATP) program and voluntary consensus standards bodies, published peer reviewed journal articles, and publicly available study reports.

The source water analysis for E. coli that will be conducted under the LT2ESWTR is similar to the type of ambient water analyses for which these methods were previously proposed (66 FR 45811, August 30, 2001) (USEPA 2001i). EPA continues to support the findings of this earlier proposal and believes that these methods have the necessary sensitivity and specificity to meet the data quality objectives of the LT2ESWTR.

New Information on E. coli Sample Holding Time

It is generally not feasible for systems that must ship E. coli samples to an off-site laboratory to comply with an 8-hour holding time requirement. During the ICRSS, 100% of the systems that shipped samples off-site for E. coli analysis exceeded the 8-hour holding time; 12% of these samples had holding times in excess of 30 hours. Most large systems that will be required to monitor for E. coli under the LT2ESWTR could conduct these analyses on-site, but many small systems will need to ship samples off-site to a certified contract laboratory.

EPA participated in three phases of studies to assess the effect of increased sample holding time on E. coli analysis results. These are summarized as follows, and are described in detail in Pope et al. (2003).

  • Phase 1: EPA, the Wisconsin State Laboratory of Hygiene (WSLH), and DynCorp conducted a study to evaluate E. coli sample concentrations from four sites at 8, 24, 30, and 48 hours after sample collection for samples stored at 4°C, 10°C, 20°C, and 35°C. Temperature was varied to assess the effect of different shipping conditions. Samples were analyzed in triplicate by membrane filtration (mFC followed by transfer to NA-MUG) and Colilert (Quanti-Tray 2000) (Pope et al. 2003).
  • Phase 2: EPA conducted a study to evaluate E. coli sample concentrations from seven sites at 8, 24, 30, and 48 hours after sample collection for samples stored in coolers containing wet ice or Utek ice packs (to assess real-world storage conditions). Samples were analyzed in triplicate by membrane filtration (mFC followed by transfer to NA-MUG) and Colilert (Quanti-Tray 2000) (Pope et al. 2003).
  • Phase 3: EPA, through cooperation with AWWA, obtained E. coli holding time data from ten drinking water utilities that evaluated samples from 12 source waters. Each utility used an E. coli method of its choice (Colilert, mTEC, mEndo to NA-MUG, or mFC to NA-MUG). Samples were stored in coolers with wet ice, Utek ice packs, or Blue ice (Pope et al. 2003).

Phase 1 results indicated that E. coli concentrations were not significantly different after 24 hours at most sites when samples were stored at lower temperatures. Results from Phase 2, which evaluated actual sample storage practices, verified the Phase 1 observations at most sites. Similar results were observed during Phase 3, which evaluated a wider variety of surface waters from different regions throughout the U.S. During Phase 3, E. coli concentrations were not significantly different after 24 hours at most sites when samples were maintained below 10°C and did not freeze during storage. At longer holding times (e.g., 48 hours), larger differences were observed.

Based on these studies, EPA has concluded that E. coli samples can be held for up to 24 hours prior to analysis without compromising the data quality objectives of LT2ESWTR E. coli monitoring. Further, EPA believes that it is feasible for systems that must ship E. coli samples to an off-site laboratory for analysis to meet a 24 hour holding time. EPA is developing guidance for systems on packing and shipping E. coli samples so that samples are maintained below 10°C and not allowed to freeze (USEPA 2003g). This guidance is available in draft in the docket for today's proposal (http://www.epa.gov/edocket/).

c. Request for comment. EPA requests comment on whether the E. coli methods proposed for approval under the LT2ESWTR are appropriate, and whether there are additional methods not proposed that should be considered. Comments concerning method approval should be accompanied by supporting data where possible.

EPA also requests comment on the proposal to extend the holding time for E. coli source water sample analyses to 24 hours, including any data or other information that would support, modify, or repudiate such an extension. Should EPA limit the extended holding time to only those E. coli analytical methods that were evaluated in the holding time studies noted in this section? The results in Pope et al. (2003) indicate that most E. coli samples analyzed using ONPG-MUG (see methods in Table IV-37) incurred no significant degradation after a 30 to 48 hour holding time. As a result, should EPA increase the source water E. coli holding time to 30 or 48 hours for samples evaluated by ONPG-MUG, and retain a 24-hour holding time for samples analyzed by other methods? EPA also requests comment on the cost and availability of overnight delivery services for E. coli samples, especially in rural areas.

3. Turbidity

a. What is EPA proposing today? For turbidity analyses that will be conducted under the LT2ESWTR, EPA is proposing to require systems to use the analytical methods that have been previously approved by EPA for analysis of turbidity in drinking water, as listed in 40 CFR 141.74. These are Standard Method 2130B, as published in Standard Methods for the Examination of Water and Wastewater (APHA 1992); EPA Method 180.1 (USEPA 1993); Great Lakes Instruments Method 2 (Great Lakes Instruments 1992); and Hach FilterTrak Method 10133.

EPA Method 180.1 and Standard Method 2130B are both nephelometric methods and are based upon a comparison of the intensity of light scattered by the sample under defined conditions with the intensity of light scattered by a standard reference suspension. Great Lakes Instruments Method 2 is a modulated four-beam infrared method that uses a ratiometric algorithm to calculate the turbidity value from the four readings produced. Hach FilterTrak Method 10133 is a laser-based nephelometric method used to determine the turbidity of finished drinking waters.

Turbidimeters

Systems are required to use turbidimeters described in EPA-approved methods for measuring turbidity. For regulatory reporting purposes, either an on-line or a bench top turbidimeter can be used. If a system chooses to use on-line units for monitoring, the system must validate the continuous measurements for accuracy on a regular basis using a protocol approved by the State.

b. How was this proposal developed? EPA believes the currently approved methods for analysis of turbidity in drinking water are appropriate for turbidity analyses that will be conducted under the LT2ESWTR.

c. Request for comment. EPA requests comment on whether the turbidity methods proposed today for the LT2ESWTR should be approved, and whether there are additional methods not proposed that should be approved.

L. Laboratory Approval

Given the potentially significant implications in terms of both cost and public health protection of microbial monitoring under the LT2ESWTR, laboratory analyses for Cryptosporidium, E. coli, and turbidity must be accurate and reliable within the limits of approved methods. Therefore, EPA proposes to require public water systems to use laboratories that have been approved to conduct analyses for these parameters by EPA or the State. The following criteria are proposed for laboratory approval under the LT2ESWTR:

  • For Cryptosporidium analyses under the LT2ESWTR, EPA proposes to approve laboratories that have passed a quality assurance evaluation under EPA's Laboratory Quality Assurance Evaluation Program (Lab QA Program) for Analysis of Cryptosporidium in Water (described in 67 FR 9731, March 4, 2002) (USEPA 2002c). If States adopt an equivalent approval process under State laboratory certification programs, then systems can use laboratories approved by the State.
  • For E. coli analyses, EPA proposes to approve laboratories that have been certified by EPA, the National Environmental Laboratory Accreditation Conference, or the State for total coliform or fecal coliform analysis in source water under 40 CFR 141.74. The laboratory must use the same analytical technique for E. coli that the laboratory uses for total coliform or fecal coliform analysis under 40 CFR 141.74.
  • Turbidity analyses must be conducted by a person approved by the State for analysis of turbidity in drinking water under 40 CFR 141.74.

These criteria are further described in the following paragraphs.

1. Cryptosporidium Laboratory Approval

Because States do not currently approve laboratories for Cryptosporidium analyses and LT2ESWTR monitoring will begin 6 months after rule promulgation, EPA will initially assume responsibility for Cryptosporidium laboratory approval. EPA expects, however, that States will include Cryptosporidium analysis in their State laboratory certification programs in the future. EPA has established the Lab QA Program for Cryptosporidium analysis to identify laboratories that can meet LT2ESWTR data quality objectives. This is a voluntary program open to laboratories involved in analyzing Cryptosporidium in water. Under this program, EPA assesses the ability of laboratories to reliably measure Cryptosporidium occurrence with EPA Methods 1622 and 1623, using both performance testing samples and an on-site evaluation.

EPA initiated the Lab QA Program for Cryptosporidium analysis prior to promulgation of the LT2ESWTR to ensure that adequate sample analysis capacity will be available at qualified laboratories to support the required monitoring. The Agency is monitoring sample analysis capacity at approved laboratories through the Lab QA Program, and does not plan to implement LT2ESWTR monitoring until the Agency determines that there is adequate laboratory capacity. In addition, utilities that choose to conduct Cryptosporidium monitoring prior to LT2ESWTR promulgation with the intent of grandfathering the data may elect to use laboratories that have passed the EPA quality assurance evaluation.

Laboratories seeking to participate in the EPA Lab QA Program for Cryptosporidium analysis must submit an interest application to EPA, successfully analyze a set of initial performance testing samples, and undergo an on-site evaluation. The on-site evaluation includes two separate but concurrent assessments: (1) Assessment of the laboratory's sample processing and analysis procedures, including microscopic examination, and (2) evaluation of the laboratory's personnel qualifications, quality assurance/quality control program, equipment, and recordkeeping procedures.

Laboratories that pass the quality assurance evaluation will be eligible for approval for Cryptosporidium analysis under the LT2ESWTR. The Lab QA Program is described in detail in a Federal Register Notice (67 FR 9731, March 4, 2002) (USEPA 2002c) and additional information can be found online at: www.epa.gov/safewater/lt2/cla_int.html.

Laboratories in the Lab QA Program will receive a set of three ongoing proficiency testing (OPT) samples approximately every four months. EPA will evaluate the precision and recovery data for OPT samples to determine if the laboratory continues to meet the performance criteria of the Laboratory QA Program.

2. E. coli Laboratory Approval

Public water systems are required to have samples analyzed for E. coli by laboratories certified under the State drinking water certification program to perform total coliform and fecal coliform analyses under 40 CFR 141.74. EPA is proposing that the general analytical techniques the laboratory is certified to use under the drinking water certification program (e.g., membrane filtration, multiple-well, multiple-tube) will be the methods the laboratory can use to conduct E. coli source water analyses under the LT2ESWTR.

3. Turbidity Analyst Approval

Measurements of turbidity must be conducted by a party approved by the State. This is consistent with current requirements for turbidity measurements in drinking water (40 CFR 141.74).

4. Request for Comment

EPA requests comment on the laboratory approval requirements proposed today, including the following specific issues:

Analyst Experience Criteria

The Lab QA Program, which EPA will use to approve laboratories for Cryptosporidium analyses under the LT2ESWTR, includes criteria for analyst experience. Principal analyst/supervisors (minimum of one per laboratory) should have a minimum of one year of continuous bench experience with Cryptosporidium and immunofluorescent assay (IFA) microscopy, a minimum of six months experience using EPA Method 1622 and/or 1623, and a minimum of 100 samples analyzed using EPA Method 1622 and/or 1623 (minimum 50 samples if the person was an analyst approved to conduct analysis for the Information Collection Rule Protozoan Method) for the specific analytical procedure they will be using.

Under the Lab QA Program, other analysts (no minimum number of analysts per laboratory) should have a minimum of six months of continuous bench experience with Cryptosporidium and IFA microscopy, a minimum of three months experience using EPA Method 1622 and/or 1623, and a minimum of 50 samples analyzed using EPA Method 1622 and/or 1623 (minimum 25 samples if the person was an analyst approved to conduct analysis for the Information Collection Rule Protozoan Method) for the specific analytical procedures they will be using.

The Lab QA Program criteria for principal analyst/supervisor experience are more rigorous than those in Methods 1622 and 1623, which are as follows: the analyst must have at least 2 years of college lecture and laboratory course work in microbiology or a closely related field. The analyst also must have at least 6 months of continuous bench experience with environmental protozoa detection techniques and IFA microscopy, and must have successfully analyzed at least 50 water and/or wastewater samples for Cryptosporidium. Six months of additional experience in the above areas may be substituted for two years of college.
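The experience thresholds in the two preceding paragraphs can be summarized as a sketch (illustrative only; the Lab QA Program also evaluates proficiency testing samples and on-site performance, and this function is not part of the program):

```python
def meets_experience(role: str,
                     months_ifa: int,
                     months_1622_1623: int,
                     samples_1622_1623: int,
                     icr_approved: bool = False) -> bool:
    """Check Lab QA Program analyst experience thresholds.

    Principal analyst/supervisors: 12 months IFA bench experience,
    6 months with Methods 1622/1623, and 100 samples (50 if the analyst
    was approved for the Information Collection Rule Protozoan Method).
    Other analysts: 6 months, 3 months, and 50 samples (25 if ICR-approved).
    """
    if role == "principal":
        needed = (12, 6, 50 if icr_approved else 100)
    else:
        needed = (6, 3, 25 if icr_approved else 50)
    return (months_ifa >= needed[0]
            and months_1622_1623 >= needed[1]
            and samples_1622_1623 >= needed[2])

print(meets_experience("principal", 18, 8, 120))       # True
print(meets_experience("principal", 18, 8, 60, True))  # True (ICR credit)
print(meets_experience("analyst", 6, 3, 40))           # False
```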

In seeking approval for an Information Collection Request, EPA requested comment on the Lab QA Program (67 FR 9731, March 4, 2002) (USEPA 2002c). A number of commenters stated that the analyst qualification criteria are restrictive and could make it difficult for laboratories to maintain adequate analyst staffing (and, hence, sample analysis capacity) in the event of staff turnover or competing priorities. Some commenters suggested that laboratories and analysts should be evaluated based on proficiency testing, and that analyst experience standards should be reduced or eliminated. (Comments are available in the Office of Water docket, number W-01-17.)

A further implication of the analyst experience criteria is that systems may generate Cryptosporidium data for grandfathering under the LT2ESWTR using laboratories that meet the analyst experience requirements of Methods 1622 or 1623 but not the more rigorous principal analyst/supervisor experience requirements of the Lab QA Program.

EPA requests comment on whether the criteria for analyst experience in the Lab QA Program are necessary, whether systems are experiencing difficulty in finding laboratories that have passed the Lab QA Program to conduct Cryptosporidium analysis, and whether any of the Lab QA Program criteria should be revised to improve the LT2ESWTR lab approval process.

State Programs To Approve Laboratories for Cryptosporidium Analysis

Under today's proposal, systems must have Cryptosporidium samples analyzed by a laboratory approved under EPA's Lab QA Program, or an equivalent State laboratory approval program. Because States do not currently approve laboratories for Cryptosporidium analyses, EPA will initially assume responsibility for Cryptosporidium laboratory approval. EPA expects, however, that States will adopt equivalent approval programs for Cryptosporidium analysis under State laboratory certification programs. EPA requests comment on how to establish that a State approval program for Cryptosporidium analysis is equivalent to the Lab QA Program.

Specifically, should EPA evaluate State approval programs to determine if they are equivalent to the Lab QA Program? EPA also requests comment on the elements that would constitute an equivalent State approval program for Cryptosporidium analyses, including the following: (1) Successful analysis of initial and ongoing blind proficiency testing samples prepared using flow cytometry, including a matrix and meeting EPA's pass/fail criteria (described in USEPA 2002c); (2) an on-site evaluation of the laboratory's sample processing and analysis procedures, including microscopic examination skills, by auditors who meet the qualifications of a principal analyst as set forth in the Lab QA Program (described in USEPA 2002c); (3) an on-site evaluation of the laboratory's personnel qualifications, quality assurance/quality control program, equipment, and recordkeeping procedures; (4) a data audit of the laboratory's QC data and monitoring data; and (5) use of the audit checklist used in the Lab QA Program or an equivalent.

M. Requirements for Sanitary Surveys Conducted by EPA

1. Overview

In today's proposal, EPA is requesting comment on establishing requirements for public water systems with significant deficiencies as identified in a sanitary survey conducted by EPA under SDWA section 1445. These requirements would apply to surface water systems for which EPA is responsible for directly implementing national primary drinking water regulations (i.e., systems not regulated by States with primacy). As described in this section, these requirements would ensure that systems in non-primacy States, currently Wyoming, and systems not regulated by States, such as Tribal systems, are subject to standards for sanitary surveys similar to those that apply to systems regulated by States with primacy.

2. Background

As established by the IESWTR in 40 CFR 142.16(b)(3), primacy States must conduct sanitary surveys for all surface water systems no less frequently than every three years for community water systems and no less frequently than every five years for noncommunity water systems. The sanitary survey is an onsite review and must address the following eight components: (1) Source, (2) treatment, (3) distribution system, (4) finished water storage, (5) pumps, pump facilities, and controls, (6) monitoring, reporting, and data verification, (7) system management and operation, and (8) operator compliance with State requirements.

Under the IESWTR, primacy States are required to have the appropriate rules or other authority to assure that systems respond in writing to significant deficiencies outlined in sanitary survey reports no later than 45 days after receipt of the report, indicating how and on what schedule the system will address significant deficiencies noted in the survey (40 CFR 142.16(b)(1)(ii)). Further, primacy States must have the authority to assure that systems take necessary steps to address significant deficiencies identified in sanitary survey reports if such deficiencies are within the control of the system and its governing body (40 CFR 142.16(b)(1)(iii)). The IESWTR did not define a significant deficiency, but required that primacy States describe in their primacy applications how they will decide whether a deficiency identified during a sanitary survey is significant for the purposes of the requirements stated in this paragraph (40 CFR 142.16(b)(3)(v)).

EPA conducts sanitary surveys under SDWA section 1445 for public water systems not regulated by primacy States (e.g., Tribal systems, Wyoming). However, EPA does not have the authority required of primacy States under 40 CFR 142 to ensure that systems address significant deficiencies identified during sanitary surveys. Consequently, the sanitary survey requirements established by the IESWTR create an unequal standard. Systems regulated by primacy States are subject to the States' authority to require correction of significant deficiencies noted in sanitary survey reports, while systems for which EPA has direct implementation authority do not have to meet an equivalent requirement.

3. Request for Comment

In order to ensure that systems for which EPA has direct implementation authority address significant deficiencies identified during sanitary surveys, EPA requests comment on establishing either or both of the following requirements under 40 CFR 141 as part of the NPDWR established in the final LT2ESWTR:

(1) For sanitary surveys conducted by EPA under SDWA section 1445, systems would be required to respond in writing to significant deficiencies outlined in sanitary survey reports no later than 45 days after receipt of the report, indicating how and on what schedule the system will address significant deficiencies noted in the survey.

(2) Systems would be required to correct significant deficiencies identified in sanitary survey reports if such deficiencies are within the control of the system and its governing body.

For the purposes of these requirements, a sanitary survey, as conducted by EPA, is an onsite review of the water source (identifying sources of contamination by using results of source water assessments where available), facilities, equipment, operation, maintenance, and monitoring compliance of a public water system to evaluate the adequacy of the system, its sources and operations, and the distribution of safe drinking water. A significant deficiency includes a defect in design, operation, or maintenance, or a failure or malfunction of the sources, treatment, storage, or distribution system that EPA determines to be causing, or has the potential for causing the introduction of contamination into the water delivered to consumers.

V. State Implementation

This section describes the regulations and other procedures and policies States will be required to adopt to implement the LT2ESWTR, if finalized as proposed today. States must continue to meet all other conditions of primacy in 40 CFR Part 142.

The Safe Drinking Water Act (Act) establishes requirements that a State or eligible Indian tribe must meet to assume and maintain primary enforcement responsibility (primacy) for its public water systems. These requirements include: (1) Adopting drinking water regulations that are no less stringent than Federal drinking water regulations, (2) adopting and implementing adequate procedures for enforcement, (3) keeping records and making reports available on activities that EPA requires by regulation, (4) issuing variances and exemptions (if allowed by the State), under conditions no less stringent than allowed under the Act, and (5) adopting and being capable of implementing an adequate plan for the provisions of safe drinking water under emergency situations.

40 CFR part 142 sets out the specific program implementation requirements for States to obtain primacy for the public water supply supervision program as authorized under section 1413 of the Act. In addition to adopting basic primacy requirements specified in 40 CFR Part 142, States may be required to adopt special primacy provisions pertaining to specific regulations where implementation of the rule involves activities beyond general primacy provisions. States must include these regulation-specific provisions in an application for approval of their program revision. Primacy requirements for today's proposal are discussed below.

To implement the proposed LT2ESWTR, States will be required to adopt revisions to:

§ 141.2—Definitions

§ 141.71—Criteria for avoiding filtration

§ 141.153—Content of the reports

§ 141.170—Enhanced filtration and disinfection

Subpart Q—Public Notification

New Subpart W—Additional treatment technique requirements for Cryptosporidium

§ 142.14—Records kept by States

§ 142.15—Reports by States

§ 142.16—Special primacy requirements

A. Special State Primacy Requirements

To ensure that a State program includes all the elements necessary for an effective and enforceable program under today's rule, a State primacy application must include a description of how the State will perform the following:

(1) Approve watershed control programs for the 0.5 log watershed control program credit in the microbial toolbox (see section IV.C.2);

(2) Assess significant changes in the watershed and source water as part of the sanitary survey process and determine appropriate follow-up action (see section IV.A);

(3) Determine that a system with an uncovered finished water storage facility has a risk mitigation plan that is adequate for purposes of waiving the requirement to cover the storage facility or treat the effluent (see section IV.E);

(4) Approve protocols for removal credits under the Demonstration of Performance toolbox option (see section IV.C.17) and for site specific chlorine dioxide and ozone CT tables (see section IV.C.14); and

(5) Approve laboratories to analyze for Cryptosporidium.

Note that a State program can be more, but not less, stringent than Federal regulations. As such, some of the elements listed here may not be applicable to a specific State program. For example, if a State chooses to require all finished water storage facilities to be covered or provide treatment and not to allow a risk mitigation plan to substitute for this requirement, then the description for item (3) would be inapplicable.

B. State Recordkeeping Requirements

The current regulations in § 142.14 require States with primacy to keep various records, including the following: Analytical results to determine compliance with MCLs, MRDLs, and treatment technique requirements; system inventories; State approvals; enforcement actions; and the issuance of variances and exemptions. The proposed LT2ESWTR will require States to keep additional records of the following, including all supporting information and an explanation of the technical basis for each decision:

  • Results of source water E. coli and Cryptosporidium monitoring;
  • Cryptosporidium bin classification for each filtered system, including any changes to initial bin classification based on review of the watershed during sanitary surveys or the second round of monitoring;
  • Determination of whether each unfiltered system has a mean source water Cryptosporidium level above 0.01 oocysts/L;
  • The treatment processes or control measures that each system employs to meet Cryptosporidium treatment requirements under the LT2ESWTR; this includes documentation to demonstrate compliance with required design and implementation criteria for receiving credit for microbial toolbox options, as specified in section IV.C;
  • A list of systems required to cover or treat the effluent of uncovered finished water storage facilities; and
  • A list of systems for which the State has waived the requirement to cover or treat the effluent of an uncovered finished water storage facility, along with supporting documentation of the risk mitigation plan.

C. State Reporting Requirements

EPA currently requires in § 142.15 that States report to EPA information such as violations, variance and exemption status, and enforcement actions. The LT2ESWTR, as proposed, will add reporting requirements in the following areas:

  • The Cryptosporidium bin classification for each filtered system, including any changes to initial bin classification based on review of the watershed during sanitary surveys or the second round of monitoring;
  • The determination of whether each unfiltered system has a mean source water Cryptosporidium level above 0.01 oocysts/L, including any changes to this determination based on the second round of monitoring.

D. Interim Primacy

On April 28, 1998, EPA amended its State primacy regulations at 40 CFR 142.12 to incorporate the new process identified in the 1996 SDWA Amendments for granting primary enforcement authority to States while their applications to modify their primacy programs are under review (63 FR 23362, April 28, 1998) (USEPA 1998f). The new process grants interim primary enforcement authority for a new or revised regulation during the period in which EPA is making a determination with regard to primacy for that new or revised regulation. This interim enforcement authority begins on the date of the primacy application submission or the effective date of the new or revised State regulation, whichever is later, and ends when EPA makes a final determination. However, this interim primacy authority is only available to a State that has primacy (including interim primacy) for every existing NPDWR in effect when the new regulation is promulgated.

As a result, States that have primacy for every existing NPDWR already in effect may obtain interim primacy for this rule, beginning on the date that the State submits the application for this rule to USEPA, or the effective date of its revised regulations, whichever is later. In addition, a State that wishes to obtain interim primacy for future NPDWRs must obtain primacy for this rule. As described in Section IV.A, EPA expects to oversee the initial source water monitoring that will be conducted under the LT2ESWTR by systems serving at least 10,000 people, beginning 6 months following rule promulgation.

VI. Economic Analysis

This section summarizes the economic analysis (EA) for the LT2ESWTR proposal. The EA is an assessment of the benefits, both health and non-health related, and costs to the regulated community of the proposed regulation, along with those of regulatory alternatives that the Agency considered. EPA developed this EA to meet the requirement of SDWA section 1412(b)(3)(C) for a Health Risk Reduction and Cost Analysis (HRRCA), as well as the requirements of Executive Order 12866, Regulatory Planning and Review, under which EPA must estimate the costs and benefits of the LT2ESWTR. The full EA is presented in Economic Analysis for the Long Term 2 Enhanced Surface Water Treatment Rule (USEPA 2003a), which is available in the docket for today's proposal (www.epa.gov/edocket/).

Today's proposed LT2ESWTR is the second in a staged set of rules that address public health risks from microbial contamination of surface and GWUDI drinking water supplies and, more specifically, prevent Cryptosporidium from reaching consumers. As described in section I, the Agency promulgated the IESWTR and LT1ESWTR to provide a baseline of protection against Cryptosporidium in large and small drinking water systems, respectively. Today's proposed rule would achieve further reductions in Cryptosporidium exposure for systems with the highest vulnerability. This economic analysis considers only the incremental reduction in exposure from the two previously promulgated rules (IESWTR and LT1ESWTR) to the alternatives evaluated for the LT2ESWTR.

Both benefits and costs are determined as annualized present values. The process allows comparison of cost and benefit streams that are variable over a given time period. The time frame used for both benefit and cost comparisons is 25 years; approximately five years account for rule implementation and 20 years for the average useful life of the equipment used to comply with treatment technique requirements. The Agency uses social discount rates of both three percent and seven percent to calculate present values from the stream of benefits and costs and also to annualize the present value estimates (see EPA's Guidelines for Preparing Economic Analyses (USEPA 2000c) for a discussion of social discount rates). The LT2ESWTR EA (USEPA 2003a) also shows the undiscounted stream of both benefits and costs over the 25 year time frame.
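
For illustration only (this sketch is not part of the proposed rule, and the flat $10M/yr stream is a placeholder rather than an EA estimate), the present-value and annualization arithmetic described above can be expressed as:

```python
def present_value(stream, rate):
    """Discount a stream of annual values (years 1..n) to a present value."""
    return sum(v / (1 + rate) ** t for t, v in enumerate(stream, start=1))

def annualize(pv, rate, years):
    """Convert a present value into an equivalent constant annual value."""
    annuity_factor = (1 - (1 + rate) ** -years) / rate
    return pv / annuity_factor

# Placeholder stream: a flat $10M/yr of benefits over the 25 year time frame
stream = [10.0] * 25
for rate in (0.03, 0.07):
    pv = present_value(stream, rate)
    print(rate, round(pv, 2), round(annualize(pv, rate, 25), 2))
```

As a sanity check, annualizing the present value of a flat stream returns the flat annual value, and the seven percent discount rate yields a smaller present value than the three percent rate.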

A. What Regulatory Alternatives Did the Agency Consider?

Regulatory alternatives considered by the Agency for the LT2ESWTR were developed through the deliberations of the Stage 2 M-DBP Federal Advisory Committee (described in section II). The Committee considered several general approaches for reducing the risk from Cryptosporidium in drinking water. As discussed in section IV.A.2, these approaches included both additional treatment requirements for all systems and risk-targeted treatment requirements for systems with the highest vulnerability to Cryptosporidium following implementation of the IESWTR and LT1ESWTR. In addition, the Committee considered related factors such as surrogates for Cryptosporidium monitoring and alternative monitoring strategies to minimize costs to small drinking water systems.

After considering these general approaches, the Committee focused on four specific regulatory alternatives for filtered systems (see Table VI-1). With the exception of Alternative 1, which requires all systems to achieve an additional 2 log (99%) reduction in Cryptosporidium levels, these alternatives incorporate a microbial framework approach. In this approach, systems are classified in different risk bins based on the results of source water monitoring. Additional treatment requirements are directly linked to the risk bin classification. Accordingly, these rule alternatives are differentiated by two criteria: (1) The Cryptosporidium concentrations that define the bin boundaries and (2) the degree of treatment required for each bin.

In assessing regulatory alternatives, EPA and the Advisory Committee were concerned with the following questions: (1) Do the treatment requirements adequately control Cryptosporidium concentrations in finished water? (2) How many systems will be required to add treatment? (3) What is the likelihood that systems with high source water Cryptosporidium concentrations will not be required to provide additional treatment (i.e., be misclassified in a low risk bin)? and (4) What is the likelihood that systems with low source water Cryptosporidium concentrations will be required to provide unnecessary additional treatment (i.e., misclassified in a high risk bin)?

The Committee reached consensus regarding additional treatment requirements for unfiltered systems and uncovered finished water storage facilities without formally identifying regulatory alternatives. Table VI-1 summarizes the four alternatives that were considered for filtered systems.

Table VI-1.—Summary of Regulatory Alternatives for Filtered Systems

Average source water Cryptosporidium monitoring result (oocysts/L) | Additional treatment requirements (1)

(1) Note: “Additional treatment requirements” are in addition to levels already required under existing rules (e.g., the IESWTR and LT1ESWTR).

Alternative A1
All systems | 2.0 log inactivation required for all systems.

Alternative A2
< 0.03 | No action.
≥ 0.03 and < 0.1 | 0.5 log.
≥ 0.1 and < 1.0 | 1.5 log.
≥ 1.0 | 2.5 log.

Alternative A3—Preferred Alternative
< 0.075 | No action.
≥ 0.075 and < 1.0 | 1 log.
≥ 1.0 and < 3.0 | 2 log.
≥ 3.0 | 2.5 log.

Alternative A4
< 0.1 | No action.
≥ 0.1 and < 1.0 | 0.5 log.
≥ 1.0 | 1.0 log.
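
Read together, the preferred alternative (A3) in Table VI-1 amounts to a threshold lookup on the mean source water Cryptosporidium concentration. A minimal sketch (the function name is chosen here for illustration and appears nowhere in the rule):

```python
def a3_additional_treatment(mean_oocysts_per_liter):
    """Additional Cryptosporidium treatment (log units) under Alternative A3,
    per the bin boundaries in Table VI-1 (oocysts/L)."""
    c = mean_oocysts_per_liter
    if c < 0.075:
        return 0.0   # No action
    elif c < 1.0:
        return 1.0
    elif c < 3.0:
        return 2.0
    else:
        return 2.5

print(a3_additional_treatment(0.05))  # 0.0 (below the lowest bin boundary)
```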

B. What Analyses Support Selecting the Proposed Rule Option?

EPA has quantified benefits and costs of each of the regulatory alternatives in Table VI-1, as well as for the proposed requirements for unfiltered systems. Quantified benefits stem from estimated reductions in the incidence of cryptosporidiosis resulting from the regulation. To make these estimates, the Agency developed a two-dimensional Monte Carlo model that accounts for uncertainty and variability in key parameters like Cryptosporidium occurrence, infectivity, and treatment efficiency. Analyses involved estimating the baseline (pre-LT2ESWTR) risk from Cryptosporidium in drinking water, and then projecting the reductions in exposure and risk resulting from the additional treatment requirements of the LT2ESWTR. Costs result largely from the installation of additional treatment, with lesser costs due to monitoring and other implementation activities. Results of these analyses are summarized in the following subsections, and details are shown in the LT2ESWTR EA (USEPA 2003a).

Cryptosporidium occurrence significantly influences the estimated benefits and costs of regulatory alternatives. As discussed in section III.C, EPA analyzed data collected under the Information Collection Rule, the Information Collection Rule Supplemental Surveys of medium systems (ICRSSM), and the Information Collection Rule Supplemental Surveys of large systems (ICRSSL) to estimate the national occurrence distribution of Cryptosporidium in surface water. EPA evaluated these distributions independently when assessing benefits and costs for different regulatory alternatives. In most cases, results from the ICRSSM data set are within the range of results of the Information Collection Rule and ICRSSL data sets.

EPA selected a Preferred Regulatory Alternative for the LT2ESWTR, consistent with the recommendations of the Advisory Committee. As described next, this selection was based on the estimated impacts and feasibility of the alternatives shown in Table VI-1.

Alternative A1 (across-the-board 2-log inactivation) was not selected because it was the highest cost option and imposed costs but provided few benefits to systems with high quality source water (i.e., relatively low Cryptosporidium risk). In addition, there were concerns about the feasibility of requiring almost every surface water treatment plant to install additional treatment processes (e.g., UV or ozone) for Cryptosporidium.

Alternatives A2-A4 were evaluated based on several factors, including predictions of costs and benefits, performance of analytical methods for classifying systems in the risk bins, and other specific impacts (e.g., impacts on small systems or sensitive subpopulations). Alternative A3 was recommended by the Advisory Committee because it provides significant health benefits in terms of avoided illnesses and deaths for an acceptable cost. In addition, the Agency believes this alternative is feasible with available analytical methods and treatment technologies.

Incremental costs and benefits of regulatory alternatives for the LT2ESWTR are shown in section VI.F, and the LT2ESWTR EA contains more detailed information about the benefits and costs of each regulatory option (USEPA 2003a).

C. What Are the Benefits of the Proposed LT2ESWTR?

As discussed previously, the LT2ESWTR is expected to substantially reduce drinking water related exposure to Cryptosporidium, thereby reducing both illness and death associated with cryptosporidiosis. As described in section II, cryptosporidiosis is an infection caused by Cryptosporidium and is an acute, typically self-limiting, illness with symptoms that include diarrhea, abdominal cramping, nausea, vomiting, and fever (Juranek, 1995). Cryptosporidiosis patients in sensitive subpopulations, such as infants, the elderly, and AIDS patients, are at risk for severe illness, including risk of death. While EPA has quantified and monetized the health benefits for reductions in endemic cryptosporidiosis that would result from the LT2ESWTR, the Agency was unable to quantify or monetize other health and non-health related benefits associated with this rule. These unquantified benefits are characterized next, followed by a summary of the quantified benefits.

1. Non-Quantifiable Health and Non-health Related Benefits

Although there are substantial monetized benefits that result from this rule due to reduced rates of endemic cryptosporidiosis, other potentially significant benefits of this rule remain unquantified and non-monetized. The unquantified benefits that result from this rule are summarized in Table VI-2 and are described in greater detail in the LT2ESWTR EA (USEPA 2003a).

Table VI-2.—Summary of Nonquantified Benefits
Benefit type | Potential effect on benefits | Comments
Source: Chapter 5 of the LT2ESWTR Economic Analysis (USEPA 2003a).
Reducing outbreak risks and response costs | Increase | Some outbreaks are caused by human or equipment failures that may occur even with the proposed new requirements; however, by adding barriers of protection for some systems, the rule will reduce the possibility of such failures leading to outbreaks.
Reducing averting behavior (e.g., boiling tap water or purchasing bottled water) | Increase/No change | Averting behavior is associated with both out-of-pocket costs (e.g., purchase of bottled water) and opportunity costs (e.g., time required to boil water) to the consumer. Reductions in averting behavior are expected to have a positive impact on benefits from the rule.
Improving aesthetic water quality | Increase | Some technologies installed for this rule (e.g., ozone) are likely to reduce taste and odor problems.
Reducing risk from co-occurring and emerging pathogens | Increase | Although the rule focuses on removal of Cryptosporidium from drinking water, systems that change treatment processes will also increase removal of pathogens that the rule does not specifically regulate, producing additional benefits.
Increased source water monitoring | Increase | The greater understanding of source water quality that results from monitoring may enhance the ability of plants to optimize treatment operations in ways other than those addressed in this rule.
Reduced contamination due to covering or treating finished water storage facilities | Increase | Although insufficient data were available to quantify benefits, the reduction of contaminants introduced through uncovered finished water storage facilities would produce positive public health benefits.

2. Quantifiable Health Benefits

EPA quantified benefits for the LT2ESWTR based on reductions in the risk of endemic cryptosporidiosis. Several categories of monetized benefits were considered in this analysis.

First, EPA estimated the number of cases expected to result in premature mortality (primarily for members of sensitive subpopulations such as AIDS patients). In order to estimate the benefits from deaths avoided as a result of the rule, EPA multiplied the estimates for number of illnesses avoided by a projected mortality rate. This mortality rate was developed using mortality data from the Milwaukee cryptosporidiosis outbreak of 1993 (described in section II), with adjustments to account for the subsequent decrease in the mortality rate among people with AIDS and for the difference between the 1993 Milwaukee AIDS rate and the current national rate. EPA estimated a mortality rate of 16.6 deaths per 100,000 illnesses for those served by unfiltered systems and a mortality rate of 10.6 deaths per 100,000 illnesses for those served by filtered systems. These different rates are associated with the incidence of AIDS in populations served by unfiltered and filtered systems. A complete discussion on how EPA derived these rates can be found in subchapter 5.2 of the LT2ESWTR EA (USEPA 2003a).
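
The mortality-benefit step described above is a direct multiplication of avoided illnesses by the applicable rate. A minimal sketch using the rates quoted in the text (the illness count below is a placeholder, not an EA estimate):

```python
# Projected mortality rates from the text, in deaths per 100,000 illnesses
MORTALITY_PER_100K = {"unfiltered": 16.6, "filtered": 10.6}

def deaths_avoided(illnesses_avoided, system_type):
    """Avoided deaths = avoided illnesses x mortality rate for the system type."""
    return illnesses_avoided * MORTALITY_PER_100K[system_type] / 100_000

# Placeholder: 200,000 avoided illnesses among people served by filtered systems
print(round(deaths_avoided(200_000, "filtered"), 1))  # 21.2
```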

Reductions in mortalities were monetized using EPA's standard methodology for monetizing mortality risk reduction. This methodology is based on a distribution of value of statistical life (VSL) estimates from 26 labor market and stated preference studies, with a mean VSL of $6.3M in 2000 and a 5th to 95th percentile range of $1.0M to $14.5M. A more detailed discussion of these studies and the VSL estimate can be found in EPA's Guidelines for Preparing Economic Analyses (USEPA 2000c). A real income growth factor of approximately 2.3% per year was applied to these estimates for the 20 year time span following implementation. Income elasticity for VSL was estimated as a triangular distribution ranging from 0.08 to 1.00, with a mode of 0.40. VSL values for the 20 year span are shown in Exhibit C.13 of the LT2 EA (USEPA 2003a).
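
A common way to combine a real income growth factor with an income elasticity is to scale the base VSL by cumulative income growth raised to the elasticity. The EA's exact formula may differ, so the following is only an illustrative sketch using the mean and modal values quoted above:

```python
def grown_vsl(vsl_base, annual_growth, elasticity, year):
    """One common adjustment: VSL_t = VSL_0 * (1 + g)^(t * elasticity).
    This functional form is an assumption, not taken from the EA."""
    return vsl_base * (1 + annual_growth) ** (year * elasticity)

# Mean VSL $6.3M, 2.3%/yr real income growth, modal elasticity 0.40, year 20
print(round(grown_vsl(6.3, 0.023, 0.40, 20), 2))  # about 7.56 ($M)
```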

The substantial majority of cases are not expected to be fatal, and the Agency separately estimated the value of the non-fatal illnesses that the LT2ESWTR would avoid. For these, EPA first divided projected cases into three severity categories (mild, moderate, and severe) and then calculated a monetized value per case avoided for each severity level. These were then combined into a weighted average value per case based on the relative frequency of each severity level. According to a study conducted by Corso et al. (2003), the majority of illness falls into the mild category (88 percent). Approximately 11 percent of illness falls into the moderate category, defined as those who seek medical treatment but are not hospitalized. The remaining one percent have severe symptoms that result in hospitalization. EPA estimated different medical expenses and time losses for each category.
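
The weighted-average step can be sketched as follows. The severity shares are those reported from Corso et al. (2003); the per-case dollar values in the example are placeholders, not figures from the EA:

```python
# Relative frequency of each severity level (Corso et al. 2003)
SEVERITY_SHARES = {"mild": 0.88, "moderate": 0.11, "severe": 0.01}

def weighted_case_value(value_per_case):
    """Weighted-average monetized value per avoided case across severity levels."""
    return sum(value_per_case[k] * SEVERITY_SHARES[k] for k in SEVERITY_SHARES)

# Placeholder per-case values in dollars (NOT the EA's estimates)
print(round(weighted_case_value({"mild": 100.0, "moderate": 500.0, "severe": 5000.0}), 2))  # 193.0
```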

Benefits for non-fatal cases were calculated using a cost-of-illness (COI) approach. Traditional COI valuations focus on medical costs and lost work time and omit significant categories of benefits, notably the reduced utility of being sick (i.e., lost personal or non-work time, including activities such as child care, homemaking, community service, time spent with family, and recreation), although some COI studies also include an estimate for unpaid labor (household production) valued at a wage rate designed to reflect the market value of such labor (e.g., the median wage for paid household domestic labor). This reduced utility is variously referred to as lost leisure or as a component of pain and suffering. Ideally, a comprehensive willingness-to-pay (WTP) estimate would be used that includes all categories of loss in a single number; however, a review of the literature indicated that the available WTP studies were not suitable for valuing cryptosporidiosis. Instead, EPA presents two COI estimates: a traditional approach that includes only medical costs and lost work time (including some portion of unpaid household production), and an enhanced approach that also factors in lost unpaid work time for employed people, the reduced utility (or sense of well-being) associated with decreased enjoyment of time spent in non-work activities, and lost productivity on days when workers are ill but go to work anyway.

Table VI-3 shows the various categories of loss and how they were valued for each estimate for a “typical” case (a weighted average across severity levels; see Chapter 5 of the LT2ESWTR EA (USEPA 2003a) for more details).

Table VI-3.—Traditional and Enhanced COI for Cryptosporidiosis

Loss category | Traditional COI | Enhanced COI
Direct Medical Costs | $93.82 | $93.82
Lost Paid Work Days | 109.88 | 109.88
Lost Unpaid Work Days (1) | 20.22 | 40.44
Lost Caregiver Days (2) | 20.70 | 54.31
Lost Leisure Time (3) | (5) | 333.96
Lost Productivity at Work | (5) | 112.49
Total (4) | 244.62 | 744.89

(1) Assigned to the 38.2% of the population not engaged in market work; assumes a 40-hour unpaid work week, valued at $5.46/hr in the traditional COI and $10.92/hr in the enhanced COI. Does not include lost unpaid work for employed people and may not include all unpaid work for people outside the paid labor force.
(2) Values lost work or leisure time for people caring for the ill. The traditional approach does not include lost leisure time.
(3) Includes child care and homemaking (to the extent not covered in lost unpaid work days above), time with family, and recreation for people within and outside the paid labor force.
(4) Detail may not sum to totals due to independent rounding. Source: Appendix L in LT2ESWTR EA (USEPA 2003a).
(5) Not included.

The various loss categories were calculated as follows. Medical costs are a weighted average across the three illness severity levels of actual costs for doctor and emergency room visits, medication, and hospital stays. Lost paid work represents missed work time of paid employees, valued at the median pre-tax wage plus benefits of $18.47 per hour. The average number of lost work hours per case is 5.95 (this assumes that 62 percent of the population is in the paid labor force and that the loss is averaged over seven days). Medical costs and lost work days reflect market transactions; medical costs are always included in COI estimates, and lost work days usually are.
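The lost-paid-work entry in Table VI-3 can be checked against the figures stated above; a minimal sketch (the wage and per-case hours are the values quoted in the text, and the small difference from the table is rounding):

```python
# Sketch: reproduce the lost-paid-work entry in Table VI-3 from the
# figures stated in the text (an illustrative check, not EPA's model).
WAGE_PLUS_BENEFITS = 18.47   # median pre-tax wage plus benefits, $/hour
LOST_HOURS_PER_CASE = 5.95   # average lost paid work hours per case

lost_paid_work = WAGE_PLUS_BENEFITS * LOST_HOURS_PER_CASE
print(f"Lost paid work per case: ${lost_paid_work:.2f}")  # ~$109.90, vs. $109.88 in Table VI-3
```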

In the traditional COI estimate, an equivalent amount of lost unpaid work time was assigned to the 38.2% of the population that is not in the paid labor force. This group includes homemakers, students, children, retirees, and unemployed persons. EPA did not attempt to calculate what percent of cases falls into each of these five groups, or how many hours per week each group works, but instead assumed an across-the-board 40-hour unpaid work week. This time is valued at $5.46 per hour, one half the median post-tax wage (since work performed by these groups is not taxed), which is approximately the median wage for paid household domestic labor.

In the enhanced COI estimate, all time other than paid work and sleep (8 hours per day) is valued at the median after-tax wage of $10.92 per hour. This includes lost unpaid work (e.g., household production) and leisure time for people within and outside the paid labor force. Implicit in this approach is the assumption that people would pay the same amount not to be sick during their leisure time as they would require to give up that leisure time and work instead (i.e., the after-tax wage). In reality, people might be willing to pay more than this amount not to be sick (if they were very sick and suffering a great deal) or less (if they were not very sick and still got some enjoyment out of activities such as resting, reading, and watching TV). Multiplying 16 hours by $10.92 gives a value of about $175 for a day of lost unpaid work and leisure (i.e., the lost utility of being sick).
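The per-day figure in the enhanced estimate follows directly from the two inputs above; a sketch:

```python
# Sketch: per-day value of lost unpaid work and leisure in the enhanced
# COI estimate (figures from the text; illustration only).
AFTER_TAX_WAGE = 10.92     # median after-tax wage, $/hour (2x the $5.46 of the traditional COI)
WAKING_NONWORK_HOURS = 16  # 24 hours minus 8 hours of sleep

daily_lost_utility = AFTER_TAX_WAGE * WAKING_NONWORK_HOURS
print(f"${daily_lost_utility:.2f}")  # $174.72, i.e., about $175 per day
```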

An estimate of lost unpaid work days for the enhanced approach was made by assigning the value of $10.92 per hour to the same number of unpaid work hours valued in the traditional COI approach (i.e., 40 unpaid work hours per week for people outside the paid labor force). Lost unpaid work for employed people, and any unpaid labor beyond 40 hours per week for those not in the labor market, is shown as lost leisure time in Table VI-3 for the enhanced approach and is not included in the traditional approach. In addition, for days when an individual is well enough to work but still experiencing symptoms such as diarrhea, the enhanced estimate includes a 30% loss of work and leisure productivity, based on a study of giardiasis (Harrington et al. 1985), an illness similar to cryptosporidiosis. Appendix P in the EA describes similar productivity losses for other illnesses such as influenza (35%-73% productivity losses). In the traditional COI analysis, productivity losses are not included for either work or non-work time.

The Agency believes that losses in productivity and lost leisure time are unquestionably present and that these categories have positive value; consequently, the traditional COI estimate understates the true value of these loss categories. EPA notes that these estimates should not be regarded as upper and lower bounds. In particular, the enhanced COI estimate may not fully incorporate the value of pain and suffering, as people may be willing to pay more than $201 to avoid a day of illness. Conversely, the traditional COI estimate includes a valuation for a lost 40-hour work week for all persons not in the labor force, including children and retirees; this may overstate lost productivity for these groups, depending on the extent to which illness affects activities such as school work or volunteering.

As with the avoided mortality valuation, the real wages used in the COI estimates were increased by a real income growth factor that varies by year but is equivalent to about 2.3% annually over the 20-year period. This adjustment for real income growth was recommended by the SAB (USEPA 2000e) because the median real wage is expected to grow each year: it is projected to be $38,902 in 2008 and $59,749 in 2027. Correspondingly, the COI estimates grow by the equivalent of about 2.3% per year (except for medical costs, which are not directly tied to wages). This approach gives a total COI valuation in 2008 of $268.92 for the traditional estimate and $931.06 for the enhanced estimate; the corresponding 2027 valuations are $362.75 and $1,429.99. The methodology for calculating the COI does not change over this 20-year implementation period; the change in valuation is due solely to the underlying change in projected real wages.
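The roughly 2.3 percent annual rate can be recovered from the two projected median real wages quoted above; a sketch:

```python
# Sketch: back out the implied annual real income growth rate from the
# projected median real wages for 2008 and 2027 quoted in the text.
wage_2008, wage_2027 = 38_902.0, 59_749.0
years = 2027 - 2008  # 19 years of growth between the two projections

annual_growth = (wage_2027 / wage_2008) ** (1 / years) - 1
print(f"Implied annual real income growth: {annual_growth:.2%}")  # ~2.28%, i.e., about 2.3%
```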

Table VI-4 summarizes the annual cases of cryptosporidiosis and associated deaths avoided due to the proposed LT2ESWTR. On average, the proposed rule is expected to avoid 256,000 to 1,019,000 illnesses and 37 to 141 deaths annually after full implementation (the range reflects the ICRSSL, ICRSSM, and Information Collection Rule data sets).

Table VI-4.—Summary of Annual Avoided Illness and Deaths
Data set Annual illnesses avoided Annual deaths avoided
Mean 90 percent confidence bound Mean 90 percent confidence bound
Lower (5th %ile) Upper (95th %ile) Lower (5th %ile) Upper (95th %ile)
Source: The LT2ESWTR Economic Analysis (USEPA 2003a).
Annual Total After Full Implementation            
ICR 1,018,915 169,358 2,331,467 141 25 308
ICRSSL 256,173 45,292 560,648 37 7 78
ICRSSM 498,363 84,724 1,177,415 70 13 157
Annual Average Over 25 years            
ICR 720,668 119,694 1,647,796 100 18 218
ICRSSL 181,387 32,179 396,845 26 5 55
ICRSSM 352,611 59,942 833,290 50 9 111

Tables VI-5a and VI-5b show the monetized present value of the benefit for reductions in endemic cryptosporidiosis estimated to result from the LT2ESWTR for the enhanced and traditional COI values, respectively. Estimates are given for the Information Collection Rule, ICRSSL, and ICRSSM occurrence data sets.

With the enhanced COI and a three percent discount rate, the annual present value of the mean benefit estimate ranges from $374 million to $1.4 billion; the 90 percent confidence bound runs from $52 million to $198 million at the lower 5th percentile and from $959 million to $3.7 billion at the upper 95th percentile. At a seven percent discount rate, the mean estimate ranges from $318 million to $1.2 billion, with a 90 percent confidence bound of $44 million to $168 million at the lower 5th percentile and $816 million to $3.1 billion at the upper 95th percentile. With the traditional COI at a three percent discount rate, the mean estimate ranges from $253 million to $967 million, with a confidence bound of $27 million to $105 million at the lower 5th percentile and $713 million to $2.7 billion at the upper 95th percentile; at seven percent, the mean estimate ranges from $216 million to $826 million, with a confidence bound of $23 million to $89 million and $610 million to $2.3 billion, respectively. None of these values includes the unquantified and non-monetized benefits discussed previously.

Table VI-5A.—Summary of Quantified Benefits—Enhanced COI
Data set Value of benefits—Enhanced COI1
Mean 90 percent confidence bound
Lower (5th %ile) Upper (95th %ile)
[$millions, 2000$]
1The traditional COI only includes valuation for medical costs and lost work time (including some portion of unpaid household production). The enhanced COI also factors in valuations for lost personal (non-work) time, such as child care and homemaking (to the extent not covered by the traditional COI), time with family, and recreation, as well as lost productivity at work on days when workers are ill but go to work anyway. Source: The LT2ESWTR Economic Analysis (USEPA 2003a).
Annualized Value (at 3%, 25 Years)      
ICR $1,445 $198 $3,666
ICRSSL 374 52 959
ICRSSM 715 96 1,849
Annualized Value (at 7%, 25 Years)      
ICR 1,230 168 3,120
ICRSSL 318 44 816
ICRSSM 609 81 1,577
Table VI-5b.—Summary of Quantified Benefits—Traditional COI
Data Set Value of Benefits—Traditional COI1
Mean 90 percent confidence bound
Lower (5th %ile) Upper (95th %ile)
[$millions, 2000$]
1The traditional COI only includes valuation for medical costs and lost work time (including some portion of unpaid household production). The enhanced COI also factors in valuations for lost personal time (non-worktime) such as child care and homemaking (to the extent not covered by the traditional COI), time with family, and recreation, and lost productivity at work on days when workers are ill but go to work anyway. Source: The LT2ESWTR Economic Analysis (USEPA 2003a).
Annualized Value (at 3%, 25 Years)      
ICR $967 $105 $2,713
ICRSSL 253 27 713
ICRSSM 481 50 1,372
Annualized Value (at 7%, 25 Years)      
ICR 826 89 2,315
ICRSSL 216 23 610
ICRSSM 411 43 1,172

a. Filtered systems. For the approximately 161 million people served by filtered surface water and GWUDI systems, the rule is expected to avoid 88,000 to 472,000 mean annual cases of endemic illness, based on the ICRSSL, ICRSSM, and ICR data sets. In addition, premature mortality is expected to be reduced by an average of 9 to 50 deaths annually.

b. Unfiltered systems. The 12 million people served by unfiltered surface water or GWUDI systems will see a significant reduction in cryptosporidiosis as a result of the LT2ESWTR. In this population, the rule is expected to avoid approximately 168,000 to 547,000 cases of illness and 28 to 91 premature deaths annually.

For unfiltered systems, only the Information Collection Rule data set is used to directly calculate illness reduction, because it is the only data set with sufficient information on unfiltered systems. For the ICRSSL and ICRSSM data sets, illness reduction in unfiltered systems was estimated by multiplying the Information Collection Rule unfiltered-system result by the ratio of the corresponding filtered-system result from the supplemental survey data set (ICRSSL or ICRSSM) to the filtered-system result from the Information Collection Rule.
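The scaling step described above can be written compactly; a sketch with illustrative placeholder numbers, not EPA's actual results:

```python
# Sketch of the ratio-scaling used to estimate unfiltered-system illness
# reduction under the supplemental survey data sets. All numbers below
# are illustrative placeholders, not EPA results.
def scale_unfiltered(icr_unfiltered: float,
                     ss_filtered: float,
                     icr_filtered: float) -> float:
    """Scale the ICR unfiltered-system estimate by the ratio of
    supplemental-survey to ICR filtered-system results."""
    return icr_unfiltered * (ss_filtered / icr_filtered)

# Illustrative: if the supplemental survey's filtered-system estimate is
# half the ICR's, the unfiltered-system estimate is scaled down by half.
estimate = scale_unfiltered(icr_unfiltered=400_000,
                            ss_filtered=200_000,
                            icr_filtered=400_000)
print(estimate)  # 200000.0
```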

3. Timing of Benefits Accrual (Latency)

In previous rulemakings, some commenters have argued that the Agency should consider an assumed time lag or latency period in its benefits calculations. The Agency has not conducted a latency analysis for this rule because cryptosporidiosis is an acute illness; therefore, very little time elapses between exposure, illness, and mortality. However, EPA does account for benefits and costs that occur in future years by converting these to present value estimates.

D. What Are the Costs of the Proposed LT2ESWTR?

In order to estimate the costs of today's proposed rule, the Agency considered impacts on public water systems and on States (including territories and EPA implementation in non-primacy States). EPA assumed that systems would be in compliance with the IESWTR, which has a compliance date of January 2002 for large systems, and with the LT1ESWTR, which has a compliance date of January 2005 for small systems. Therefore, this cost estimate considers only the additional requirements that are a direct result of the LT2ESWTR. More detailed information on cost estimates is provided next, and a complete discussion can be found in Chapter 6 of the LT2ESWTR EA (USEPA 2003a). A detailed discussion of the proposed rule provisions is located in section IV of this preamble.

1. Total Annualized Present Value Costs

Tables VI-6a and VI-6b summarize the annualized present value cost estimates for the proposed LT2ESWTR at three percent and seven percent discount rates, respectively. The mean annualized present value costs of the proposed LT2ESWTR are estimated to range from approximately $73 to $111 million using a three percent discount rate and $81 to $121 million using a seven percent discount rate. This range in mean cost estimates is associated with the ICRSSL and Information Collection Rule Cryptosporidium occurrence data sets; using different occurrence data sets results in different bin classifications and thus affects the cost of the rule. Results for the ICRSSM fall within the range of results for the Information Collection Rule and ICRSSL. In addition to mean cost estimates, the Agency calculated 90 percent confidence bounds by considering the uncertainty in Cryptosporidium occurrence estimates and in the mean unit technology costs (USEPA 2003a).

Public water systems will incur approximately 99 percent of the rule's total annualized present value costs. States incur the remaining rule costs. Table VI-7 shows the undiscounted initial capital and one-time costs broken out by rule component. A comparison of annualized present value costs among the rule alternatives considered by the Agency is located in subsection VI.F. and in the LT2ESWTR EA (USEPA 2003a). Using a present value allows costs and benefits that occur during different time periods to be compared. For any future cost, the higher the discount rate, the lower the present value. Specifically, a future cost evaluated at a seven percent discount rate will always result in a lower total present value cost than the same future cost evaluated at a three percent discount rate.
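The effect of the discount rate described above can be illustrated directly; a minimal sketch with an illustrative cost:

```python
# Sketch: present value of a single future cost at two discount rates.
def present_value(cost: float, rate: float, years: int) -> float:
    """Discount a cost incurred `years` from now back to the present."""
    return cost / (1 + rate) ** years

cost = 1_000_000.0  # illustrative future cost in 10 years, $
pv_3 = present_value(cost, 0.03, 10)
pv_7 = present_value(cost, 0.07, 10)

# The same future cost has a lower present value at the higher rate.
assert pv_7 < pv_3
print(f"PV at 3%: ${pv_3:,.0f}; PV at 7%: ${pv_7:,.0f}")
```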

BILLING CODE 6560-50-P

[Graphics not available; view images of printed pages]

BILLING CODE 6560-50-C

2. Water System Costs

The proposed LT2ESWTR applies to all community, non-transient non-community, and transient non-community water systems that use surface water or GWUDI as a source (including both filtered and unfiltered systems). EPA has estimated the cost impacts for these three types of public drinking water systems. As shown in Tables VI-6a and VI-6b, the mean annualized present value costs for all drinking water systems range from approximately $73 to $111 million using a three percent discount rate ($81 to $121 million using a seven percent discount rate).

The majority of the rule's costs result from treatment changes at filtered and unfiltered systems. Table VI-8 shows the number of filtered and unfiltered systems that will incur costs by rule provision. Subsection VI.D.2.b discusses treatment costs for filtered systems, and subsection VI.D.2.c discusses treatment options for unfiltered systems. All non-purchased surface water and GWUDI systems subject to the LT2ESWTR (including filtered and unfiltered systems) will incur one-time costs that include time for staff training on rule requirements. Systems will incur monitoring costs to assess source water Cryptosporidium levels, though monitoring requirements vary by system size (large vs. small) and system type (filtered vs. unfiltered). A discussion of the future monitoring that will occur six years after initial bin assignments can be found in subsection VI.D.2.e.

BILLING CODE 6560-50-P

[Graphic not available; view image of printed page]

BILLING CODE 6560-50-C

a. Source water monitoring costs. Source water monitoring costs are structured on a per-plant basis. As with implementation activities, purchased plants are assumed not to treat source water and will not incur any monitoring costs. There are three types of monitoring that plants may be required to conduct: turbidity, E. coli, and Cryptosporidium. Source water turbidity is a common water quality parameter used for plant operational control, and to meet SWTR, LT1ESWTR, and IESWTR requirements, most water systems already have turbidity analytical equipment in-house and operators experienced with turbidity measurement. Thus, EPA assumes that the incremental turbidity monitoring burden associated with the LT2ESWTR is negligible.

Filtered plants in small systems will initially be required to conduct one year of biweekly E. coli source water monitoring. These plants will then be required to monitor for Cryptosporidium for initial bin classification if E. coli levels exceed the following concentrations: (1) an annual mean of 10 E. coli/100 mL for lake and reservoir sources, or (2) an annual mean of 50 E. coli/100 mL for flowing stream sources. EPA estimated the percent of small plants that would be triggered into Cryptosporidium monitoring as equal to the percent of large plants that would fall into any bin requiring additional treatment.

Estimates of laboratory fees, shipping costs, labor hours for sample collection, and hours for reporting results were used to predict system costs for initial source water monitoring under the LT2ESWTR. Table VI-9 summarizes the present value of monitoring costs for initial bin classification. Total present value monitoring costs for initial bin classification range from $46 million to $60 million depending on the occurrence data set and discount rate. Appendix D of the LT2ESWTR EA provides a full explanation of how these costs were developed (USEPA 2003a).

Table VI-9.—Summary of Present Value Monitoring Costs for Initial Bin Classification
System Size ICR (3%) A ICR (7%) B ICRSSL (3%) C ICRSSL (7%) D ICRSSM (3%) E ICRSSM (7%) F
($millions, 2000$)
Source: Chapter 6 of the LT2ESWTR Economic Analysis (USEPA 2003a).
≤10K $34.6 $29.7 $25.7 $22.2 $29.2 $25.1
>10K 25.7 24.3 25.7 24.3 25.7 24.3
Total 60.3 54.0 51.4 46.5 54.9 49.4

b. Filtered systems treatment costs. The Agency calculated treatment costs by estimating the number of plants that will be adding treatment technologies and coupling these estimates with unit costs ($/plant) of the selected technologies. Table VI-10 shows the number of plants estimated to select different treatment technologies; Table VI-11 summarizes the present value treatment costs and annualized present value costs for both filtered and unfiltered systems.

To estimate the number of filtered plants that would select a particular treatment technology, the Agency followed a two step process. First, the number of plants that must make treatment changes to meet the proposed LT2ESWTR requirement was determined by the binning process. Second, EPA predicted the treatment technologies that plants would choose to meet the proposed requirements. The Agency used a “least-cost decision tree” as the basic framework for determining the treatment technology selection. In other words, EPA assumed that drinking water plants would select the least expensive technology or combination of technologies to meet the log removal requirements of a given action bin. However, these technology selections were constrained by maximum use percentages, which recognize that some plants will not be able to implement certain technologies because of site-specific conditions. In addition, certain potentially lower cost components of the microbial toolbox, such as changes to the plant intake, were not included because the Agency lacked data to estimate the number of plants that could select it. These limitations on technology use may result in an overestimate of costs. An in-depth discussion of the technology selection methodology and unit cost estimates can be found in appendices E and F of the proposed LT2ESWTR EA (USEPA 2003a).
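The "least-cost decision tree" with maximum-use constraints described above can be sketched as a greedy allocation. Technology names, unit costs, and maximum-use percentages below are hypothetical illustrations, not EPA's actual inputs:

```python
# Sketch of a least-cost technology selection with maximum-use caps,
# in the spirit of the framework described in the text. All names,
# costs, and percentages are hypothetical, not EPA's inputs.
def assign_technologies(n_plants, technologies):
    """Assign plants to the cheapest technology until its maximum-use
    cap is reached, then spill over to the next cheapest option."""
    selections = {}
    remaining = n_plants
    for name, unit_cost, max_share in sorted(technologies, key=lambda t: t[1]):
        cap = int(n_plants * max_share)  # cap reflecting site-specific feasibility
        take = min(remaining, cap)
        if take:
            selections[name] = take
            remaining -= take
    return selections

# Hypothetical toolbox: (name, unit cost $M/plant, max usable share of plants)
toolbox = [("UV", 0.5, 0.60), ("Bag filter", 0.3, 0.40), ("Ozone", 1.2, 1.00)]
print(assign_technologies(100, toolbox))
# The cheapest option fills first up to its cap (40 plants for the bag
# filter here), then the next cheapest (UV) absorbs the rest.
```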

Table VI-10.—Technology Selection Forecasts for Filtered Plants
Data set
ICR ICRSSL ICRSSM
1Some plants are projected to select more than one technology to meet LT2ESWTR bin requirements; consequently, the value for total plants does not equal the sum of all technologies selected. Source: Chapter 6 of the LT2ESWTR Economic Analysis (USEPA 2003a).
Technology Selections      
Bag Filter 1.0 Log 1,545 1,236 1,441
Cartridge Filter 2.0 Log 190 17 52
ClO2 0.5 Log 77 60 70
Combined Filter Performance 0.5 Log 16 12 14
In-bank Filtration 1.0 Log 5 3 4
MF/UF 2.5 Log 10 3 5
Technology Selections1      
O3 0.5 Log 26 17 21
O3 1.0 Log 24 18 21
O3 2.0 Log 9 1 2
Secondary Filter 1.0 Log 0 0 0
UV 2.5 Log 998 490 632
WS Control 0.5 Log 0 0 0
Total Plants Selecting Technologies 2,893 1,852 2,255

c. Unfiltered systems treatment costs. The proposed LT2ESWTR requires all unfiltered plants to achieve 2 logs of inactivation if their mean source water Cryptosporidium concentration is less than or equal to 0.01 oocysts/L and 3 logs of inactivation if it is greater than 0.01 oocysts/L. For most systems, UV appears to be the least expensive technology that can achieve the required log inactivation of Cryptosporidium, and it is expected to be widely used by unfiltered systems to meet the rule requirement. However, as with filtered systems, EPA estimated that a small percentage of plants would elect to install a technology more expensive than UV due to the configuration of existing equipment or other factors. Ozone is the next least expensive technology that will meet the inactivation requirements for some systems, and is estimated to be used by plants that do not use UV.

All unfiltered plants must meet requirements of the LT2ESWTR; therefore, the percent of plants adding technology is 100 percent. This also assumes that no unfiltered systems currently use these additional treatment technologies. For this cost analysis, the Agency assumed 100 percent of very small unfiltered systems will use UV; for all other unfiltered system sizes, the Agency estimated that 90 percent would install UV and 10 percent would add ozone. This analysis is discussed in more detail in the LT2ESWTR EA (USEPA 2003a). Treatment costs for unfiltered systems are included in Table VI-11.

Table VI-11.—Total Present Value and Annualized Present Value Treatment Costs for Filtered and Unfiltered Plants
Data Set System Size (population served) Present Value Capital Costs at 3% A Present Value Capital Costs at 7% B Annualized OM Costs at 3% C Annualized OM Costs at 7% D Total Annualized Costs at 3% E Total Annualized Costs at 7% F
Source: Chapter 6 of the LT2ESWTR Economic Analysis (USEPA 2003a)
ICR ≤10,000 $76.1 $56.0 $5.2 $4.3 $9.6 $9.1
>10,000 1,092.4 868.0 26.1 22.7 88.8 97.1
TOTAL 1,168.5 924.0 31.3 26.9 98.4 106.2
ICRSSL ≤10,000 42.8 31.5 2.9 2.4 5.3 5.1
>10,000 707.1 561.8 16.2 14.0 56.8 62.3
TOTAL 749.8 593.3 19.0 16.4 62.1 67.3
ICRSSM ≤10,000 52.6 38.7 3.5 2.9 6.6 6.2
>10,000 842.4 669.3 19.4 16.9 67.8 74.3
TOTAL 894.9 708.0 23.0 19.8 74.4 80.6

d. Uncovered finished water storage facilities. As part of the LT2ESWTR, systems with uncovered finished water storage facilities have the option to cover the storage facility or provide disinfection after the storage facility, unless the State has determined that existing risk mitigation is adequate. Disinfection alternatives must achieve at least four logs of virus inactivation. To develop national cost estimates for systems to comply with this provision of the LT2ESWTR, unit costs for each treatment alternative and the percentage of systems selecting each alternative were estimated for the inventory of systems with uncovered finished water storage facilities. A full description of the unit costs and other assumptions used in this analysis is presented in Chapter 6 and Appendix I of the LT2ESWTR EA (USEPA 2003a).

The Agency assumed that all systems with uncovered finished water storage facilities will have to either install a cover or treat their discharge. This overestimates the cost of this provision because States can determine that systems with uncovered finished storage facilities do not need to take these additional measures. The technology selection for the uncovered finished water storage facilities was developed through a least-cost approach.

For systems with uncovered storage facility capacities of five million gallons (MG) or less, covering the storage facility is the least expensive alternative. Although chlorination is the least expensive alternative for the remaining systems, a system's ability to use booster chlorination depends on its current residual disinfectant type. Less than half of all surface water systems are predicted to use chloramination following implementation of the Stage 2 DBPR, and adding chlorine to water that has been treated with chloramines is not feasible; therefore, the fraction of systems projected to add booster chlorination to the storage facility effluent was estimated at 50 percent, with the remaining 50 percent estimated to add covers. The technology selection for uncovered finished water storage facilities is presented in Table VI-12.

Table VI-12.—Estimated Technology Selection for Uncovered Storage Facilities
Size category (MG) Number of uncovered storage facilities Floating cover (%) Booster chlorination (%)
Source: Appendix I of the LT2ESWTR Economic Analysis (USEPA 2003a)
0-0.1 25 100
0.1-1 7 100
1-5 44 100
5-10 12 100
10-20 10 100
20-40 9 50 50
40-60 4 50 50
60-80 4 50 50
80-100 6 50 50
100-150 6 50 50
150-200 2 50 50
200-250 4 50 50
250-1,000 4 50 50
>1,000 1 50 50

Table VI-13 summarizes total annualized present value costs for the uncovered storage facility provision using both three and seven percent discount rates. The Agency estimates the total annualized present value cost for covering or treating uncovered finished water storage facilities to be approximately $5.4 million at a three percent discount rate and $6.4 million at a seven percent discount rate.

Table VI-13.—Estimated Annualized Present Value Cost for Uncovered Finished Water Storage Facility Provision (2000$)
System size (population served) Annualized cost at 3% Annualized cost at 7%
Capital OM Total Capital OM Total
Source: Appendix I of the LT2ESWTR Economic Analysis (USEPA 2003a)
≤10,000 $3,520 $1,649 $5,169 $4,713 $1,552 $6,264
>10,000 3,349,320 2,046,425 5,395,745 4,483,927 1,925,203 6,409,129
Total 3,352,840 2,048,074 5,400,915 4,488,639 1,926,754 6,415,393

e. Future monitoring costs. Six years after initial bin classification, filtered and unfiltered plants will be required to conduct a second round of monitoring to assess whether source water Cryptosporidium levels have changed significantly. EPA will evaluate new analytical methods and surrogate indicators of microbial water quality in the interim. While the costs of monitoring are likely to change in the six years following rule promulgation, it is difficult to predict how they will change. In the absence of any other information, it was assumed that the laboratory costs would be the same as for the initial monitoring.

All plants that conducted initial monitoring were assumed to conduct the second round of monitoring as well, except for systems that, as a result of the rule, installed treatment achieving at least 2.5 log Cryptosporidium reduction; these systems are exempt from further monitoring under the LT2ESWTR. Table VI-8 shows the number of systems estimated to conduct the second round of monitoring (listed as “future” monitoring in the table). EPA estimates the cost of re-binning will range from $23 million to $38 million, depending on the occurrence data set and discount rate used (see Table VI-14). Costs differ among Cryptosporidium occurrence data sets due to differences in estimates of the number of plants that will add technologies to achieve at least 2.5 log Cryptosporidium reduction and the number of small plants that will be triggered into Cryptosporidium monitoring. Appendix D of the EA provides further details (USEPA 2003a).

Table VI-14.—Present Value of Monitoring Costs of Future Re-binning
System size ICR (3%) A ICR (7%) B ICRSSL (3%) C ICRSSL (7%) D ICRSSM (3%) E ICRSSM (7%) F
[$millions, 2000$]
Source: Chapter 6 of the LT2ESWTR Economic Analysis (USEPA 2003a)
≤10K $23.5 $14.3 $18.4 $11.3 $20.7 $12.6
>10K 14.4 9.8 16.4 11.2 15.6 10.7
Total 37.8 24.1 34.8 22.5 36.3 23.3

f. Sensitivity analysis—influent bromide levels and technology selection for filtered plants. One concern about the ICR data set is that it may not reflect influent bromide levels at some plants during droughts. High influent bromide levels (bromide is the precursor for bromate formation) limit ozone use because a plant might not be able to meet the MCL for bromate. The Agency conducted a sensitivity analysis to estimate the impact that higher influent bromide levels would have on technology decisions, assuming influent bromide concentrations 50 parts per billion (ppb) above the ICR concentrations. Overall, these assumptions have a minimal impact on costs. A complete discussion of this sensitivity analysis is located in the LT2ESWTR EA (USEPA 2003a).

3. State/Primacy Agency Costs

The Agency estimates that States and primacy agencies will incur an annualized present value cost of $0.9 to $1.0 million using a three percent discount rate and $1.2 million at seven percent. State implementation activities include regulation adoption and program implementation, training State staff, training PWS staff, providing technical assistance to PWSs, and updating the management system. To estimate implementation costs to States/Primacy Agencies, the number of full-time employees (FTEs) per activity is multiplied by the number of labor hours per FTE, the cost per labor hour, and the number of States and Territories.
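The State implementation cost formula described above is a straightforward product; a sketch with illustrative inputs (the FTE count, hours, rate, and number of States/Territories below are hypothetical, not EPA's values):

```python
# Sketch of the State implementation cost formula described in the text:
# FTEs per activity x labor hours per FTE x cost per labor hour x number
# of States and Territories. All numbers are illustrative, not EPA's.
def state_activity_cost(ftes: float, hours_per_fte: float,
                        cost_per_hour: float, n_states: int) -> float:
    return ftes * hours_per_fte * cost_per_hour * n_states

# Illustrative: 0.1 FTE per State for rule adoption, 2,080 hr/FTE,
# $30/hr, 56 States and Territories.
cost = state_activity_cost(0.1, 2080, 30.0, 56)
print(f"${cost:,.0f}")  # $349,440
```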

In addition to implementation costs, States and primacy agencies will incur costs associated with monitoring data management. Because EPA will directly manage the first round of monitoring by large systems (serving at least 10,000 people), States are not expected to incur costs for these activities. States will, however, incur costs associated with small system monitoring: because small system monitoring starts later, some States will have assumed primacy for it by that time. States will also review the second round of monitoring results, review technology compliance data, and consult on benchmarking with systems that change their disinfection procedures to comply with the rule. Appendix D of the LT2ESWTR EA provides more information about the State and primacy agency cost analysis (USEPA 2003a).

4. Non-Quantified Costs

EPA has quantified all the major costs for this rule and has provided uncertainty analyses to bound potential over- or underestimates. There are some costs, however, that EPA has not quantified because of a lack of data. For example, some systems may merge with neighboring systems to comply with this rule; such changes have both costs (legal fees and connecting infrastructure) and benefits (economies of scale). Likewise, some systems may incur costs to procure a new source of water, which could result in lower overall treatment costs.

In addition, the Agency was unable to predict the usage or estimate the costs of several toolbox options. These options include intake management and demonstrations of performance. They have not been included in the quantified analysis because data are not available to estimate the number of systems that may use these toolbox options to comply with the LT2ESWTR. Not including these generally low-cost options may result in overestimation of costs.

E. What Are the Household Costs of the Proposed Rule?

Another way to assess a rule's impact is to consider how it might affect residential water bills. This analysis considers the potential increase in a household's water bill if a CWS passed the entire cost increase resulting from this rule on to its customers. It is a tool for gauging potential impacts and should not be construed as a precise estimate of changes to individual water bills.

Included in this analysis are all CWS costs, including rule implementation, initial and future monitoring for bin classification, additional Cryptosporidium treatment, and treating or covering uncovered finished water storage facilities. Costs for small system Cryptosporidium monitoring, additional Cryptosporidium treatment, and uncovered finished water storage facilities are assigned only to the subset of systems expected to incur them. Although implementation and monitoring represent relatively small, one-time costs, they have been included in the analysis to provide a complete distribution of potential household costs. A detailed description of the derivation of household costs is in section 6.10 and Appendix J of the LT2ESWTR EA (USEPA 2003a).
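The basic arithmetic of the household analysis, annualizing a system's present-value compliance cost with a capital recovery factor and spreading it over the households served, can be sketched as follows. The $500,000 cost, 1,200-household size, and 25-year period are hypothetical examples, not EA inputs; the actual derivation is in section 6.10 and Appendix J of the EA.

```python
# Minimal sketch of the household cost screen described above: annualize a
# system's present-value compliance cost, then divide it across the
# households served, assuming full pass-through to customers. All inputs
# below are hypothetical.

def annualization_factor(rate, years):
    """Standard capital recovery factor for a discount rate and period."""
    return rate / (1.0 - (1.0 + rate) ** -years)

def annual_household_cost(present_value_cost, rate, years, households):
    """Annualized cost per household if the full cost is passed through."""
    return present_value_cost * annualization_factor(rate, years) / households

# Hypothetical small CWS: $500,000 present-value compliance cost,
# 3 percent discount rate, 25-year period, 1,200 households served
cost = annual_household_cost(500_000, 0.03, 25, 1_200)
print(f"Illustrative annual household cost: ${cost:.2f}")
```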

For purchased systems that are linked to larger nonpurchased systems, household costs are calculated based on the unit costs of the larger system but included in the distribution for the size category of the purchased system. Household costs for these purchased systems are based on the household usage rates appropriate for the retail system, not the system selling the water. This approach reflects the fact that although purchased systems will not face increased costs from adding their own treatment, whatever costs the wholesale utility incurs would likely be passed on as higher water costs.

Table VI-15 shows the results of the household cost analysis. In addition to mean and median estimates, the Agency calculated the 90th and 95th percentiles. EPA estimates that all households served by surface water and GWUDI sources will face some increase in household costs due to implementation of the LT2ESWTR (except for those few served by systems that have already installed 5.5 logs of treatment for Cryptosporidium). Of all the households subject to the rule, from 24 to 35 percent are projected to incur costs for adding treatment, depending on the Cryptosporidium occurrence data set used.

Approximately 95 percent of the households potentially subject to the rule are served by systems serving at least 10,000 people; these systems experience the lowest cost increases due to significant economies of scale. Over 90 percent of all households will face an annual cost increase of less than $5. Households served by small systems that install advanced technologies will face the greatest increases in annual costs. EPA expects that the model's projections for these systems are, in some cases, overstated. Some systems are likely to find alternative treatment techniques (such as other toolbox options not included in this analysis) or alternative sources of water (ground water, purchased water, or consolidation with another system) that would be less costly than installing more expensive treatment.

Table VI-15.—Potential Annual Household Cost Impacts for the Preferred Regulatory Option (2000$)

System type/size | Households | Mean | Median | 90th percentile | 95th percentile | Percent of systems with household cost increase less than $12 | Percent of systems with household cost increase less than $120
Source: Chapter 6 of the LT2ESWTR Economic Analysis (USEPA 2003a).
All Systems—ICR
All CWS | 65,816,979 | $1.68 | $0.13 | $4.06 | $7.57 | 98.37 | 99.99
CWS ≤ 10,000 | 3,318,012 | 4.61 | 1.34 | 13.04 | 14.92 | 87.88 | 99.88
All Systems—ICRSSL
All CWS | 65,816,979 | $1.07 | $0.03 | $3.24 | $5.43 | 98.31 | 100.00
CWS ≤ 10,000 | 3,318,012 | 2.68 | 0.80 | 6.10 | 9.39 | 95.71 | 99.95
All Systems—ICRSSM
All CWS | 65,816,979 | $1.28 | $0.03 | $3.48 | $6.47 | 99.07 | 100.00
CWS ≤ 10,000 | 3,318,012 | 3.27 | 0.80 | 6.62 | 13.04 | 93.90 | 99.93

F. What Are the Incremental Costs and Benefits of the Proposed LT2ESWTR?

Incremental costs and benefits are those that are incurred or realized in reducing Cryptosporidium exposures from one alternative to the next. Estimates of incremental costs and benefits are useful in considering the economic efficiency of different regulatory options considered by the Agency. Generally, the goal of an incremental analysis is to identify the regulatory option where incremental benefits most closely equal incremental costs. However, the usefulness of this analysis is limited because many benefits from this rule are unquantified and not monetized. Incremental analyses should consider both quantified and non-quantified (where possible) benefits and costs.

Usually an incremental analysis implies increasing levels of stringency along a single parameter, with each alternative providing all the protection of the previous alternative plus additional protection. However, the regulatory alternatives in this rule vary by multiple parameters (e.g., risk bin boundaries, treatment requirements). The comparison between any two alternatives is, therefore, between two separate sets of benefits, in the sense that the benefits may be distributed to somewhat different population groups.

The regulatory alternatives, however, do achieve increasing levels of benefits at increasing levels of costs. As a result, it is possible to display incremental net benefits from the baseline and from alternative to alternative. Tables VI-16a and VI-16b show incremental costs, benefits, and net benefits for the four regulatory alternatives shown in Table VI-1, using the enhanced and traditional COI, respectively. All values are annualized present values expressed in year 2000 dollars. The displayed values are the mean estimates for the different occurrence distributions.

With the enhanced COI, incremental costs are generally closest to incremental benefits for A2, a more stringent alternative than the Preferred Alternative, A3. For the traditional COI, incremental costs most closely equal incremental benefits for A3, the Preferred Alternative, under the majority of conditions evaluated.


[Graphic not available; view image of printed page]

[Graphic not available; view image of printed page]


G. Are There Benefits From the Reduction of Co-Occurring Contaminants?

This section presents information on the unquantified benefits that will accrue from removal of other contaminants, primarily pathogens, due to improved control of Cryptosporidium. While the benefits analysis for the LT2ESWTR only includes reductions in illness and mortality attributable to Cryptosporidium, the LT2ESWTR is expected to reduce exposure to other parasitic protozoans that EPA regulates or is considering for future regulation. For example, it is expected that the LT2ESWTR will improve control of Giardia lamblia, Cyclospora sp., and members of the class Microsporidia, seven genera (10 species) of which have been recovered from humans (Mota et al., 2000). In addition, greater Cryptosporidium control may improve control of pathogenic bacteria and viruses. Chemical contaminants such as arsenic, DBPs, and atrazine may also be controlled, in part, by control of Cryptosporidium, depending on the technologies selected.

Giardia lamblia and Cyclospora sp. are larger than Cryptosporidium, while Microsporidia, bacteria, and viruses are smaller than Cryptosporidium. The expected removal of co-occurring microorganisms can often be predicted for those treatment unit processes whose removal efficiency depends in part, or entirely, on the size of the organism. For example, a study by Goodrich and Lykins (1995) evaluating bag filters showed that any microbe or object greater than 4.5 microns in size (the average size of Cryptosporidium) would be subject to removal ranging from 0.5 to 2.0 logs.

Although not directly dependent on organism size, other treatment technologies identified in the LT2ESWTR should also provide additional control of co-occurring microbial pathogens. Membrane processes that remove Cryptosporidium have been shown to achieve equivalent log removal of Giardia under worst-case and normal operating conditions (USEPA 2003c). Reduction in individual filter turbidities will reduce concentrations of other pathogens as well as Cryptosporidium. For example, in Dutch surface water, Giardia and Cryptosporidium occurrence appeared to correlate well with each other and, for the Rhine River, with turbidity (Medema et al. 2001). Thus, improved control of Cryptosporidium should also result in improved control of Giardia lamblia.

Some membrane technologies that might be installed to comply with the LT2ESWTR can also reduce or eliminate chemical contaminants including arsenic, DBPs and atrazine. EPA has recently finalized a rule to further control arsenic levels in drinking water and is concurrently proposing the Stage 2 DBPR to address DBP control.

The extent to which the LT2ESWTR can reduce the overall risk from other contaminants has not been quantitatively evaluated because the Agency lacks data on co-occurrence between Cryptosporidium and other microbial pathogens and contaminants. Because of the difficulty of establishing which systems would have multiple problems (such as microbial contamination, arsenic, and DBPs, or any combination of the three), no estimate was made of the potential cost savings from addressing more than one contaminant simultaneously.

H. Are There Increased Risks From Other Contaminants?

It is unlikely that the LT2ESWTR will result in a significant increase in risk from other contaminants. Many of the options that systems will select to comply with the LT2ESWTR, such as UV, improved filtration performance, and watershed control, do not form DBPs. Other technologies that are effective against Cryptosporidium, such as ozone and chlorine dioxide, do form DBPs. However, these DBPs are currently regulated under the Stage 1 DBPR, and systems will have to comply with these regulations when implementing technologies to meet the LT2ESWTR.

I. What Are the Effects of the Contaminant on the General Population and on Groups Within the General Population That Are Identified as Likely To Be at Greater Risk of Adverse Health Effects?

Section II of this preamble discusses the health effects associated with Cryptosporidium on the general population as well as the effects on other sensitive sub-populations. In addition, health effects associated with children and pregnant women are discussed in greater detail in section VII.G of this preamble.

J. What Are the Uncertainties in the Baseline, Risk, Benefit, and Cost Estimates for the Proposed LT2ESWTR as Well as the Quality and Extent of the Information?

Today's proposal models the current baseline risk from Cryptosporidium exposure, as well as the reduction in risk and the cost for various rule options. There is uncertainty in the risk calculation, the benefit estimates, the cost estimates, and the interaction of other upcoming rules. Section IV of the proposed rule considers the uncertainty in the risk estimates; a brief summary of the major risk uncertainties as they relate to benefit estimation is provided next. In addition, the LT2ESWTR EA has a more extensive discussion of all of the uncertainties (USEPA 2003a).

In addition, the Agency conducted sensitivity analyses to address uncertainty. The sensitivity analyses focus on various occurrence, benefit and cost factors that may have a significant effect on the estimated impacts of the rule. All of these sensitivity analyses are explained in more detail in the EA for the LT2ESWTR (USEPA 2003a).

One area of uncertainty is associated with the estimate of Cryptosporidium occurrence on a national basis. The Information Collection Rule plant-mean data were higher than the ICRSS medium or large system plant-mean data at the 90th percentile. The reasons for these differing results are not well understood but may stem from differences in the populations sampled, year-to-year variation in occurrence, and systematic differences in the sampling and measurement methods employed. These data suggest that Cryptosporidium levels are relatively low in most water sources, but there is a subset of sources with significantly higher concentrations. Additional uncertainty is associated with estimating finished water occurrence because the analysis is based on assumptions about treatment plant performance. To account for these uncertainties, the Agency used Monte Carlo simulation models that allow substantial variation in each estimate and computed finished water occurrence values based on statistical sampling of the variable estimates.
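The Monte Carlo approach described above can be illustrated schematically: draw a source water concentration and a treatment performance from assumed distributions, combine them, and repeat. Both distributions below (a lognormal source concentration and a normally distributed log removal) are hypothetical stand-ins, not the distributions actually fitted in the EA.

```python
import random

# Schematic Monte Carlo sketch of the finished-water occurrence estimate
# described above: sample a source-water Cryptosporidium concentration and
# a treatment-plant log removal, then combine them. Both distributions are
# hypothetical placeholders.

random.seed(1)

def simulate_finished_water(n_iterations=10_000):
    results = []
    for _ in range(n_iterations):
        # Hypothetical lognormal source-water concentration (oocysts/L)
        source = random.lognormvariate(-2.0, 1.5)
        # Hypothetical treatment performance: ~3 logs, varying plant to plant
        log_removal = random.gauss(3.0, 0.5)
        results.append(source * 10.0 ** -log_removal)
    return results

finished = simulate_finished_water()
mean = sum(finished) / len(finished)
print(f"Illustrative mean finished-water concentration: {mean:.2e} oocysts/L")
```

Repeating the draw many times yields a distribution of finished water concentrations rather than a single point estimate, which is what lets the analysis carry the occurrence and treatment uncertainties through to the risk and benefit estimates.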

The risk associated with finished water occurrence is less uncertain than is typical for many contaminants because the health effects are measured in Cryptosporidium challenge studies with human volunteers. Nevertheless, there is significant uncertainty about the dose-response relationship for Cryptosporidium because there are considerable differences in infectivity among the various tested Cryptosporidium parvum isolates. As described in section III.B, the Agency accounted for these differences using Monte Carlo simulations that randomly sampled from infectivity distributions for the three tested isolates. The different simulations were designed to account for the limited number of challenge studies and the variability in the infectivity of the isolates themselves. In addition, because the Cryptosporidium dosing levels in the human feeding studies were above typical drinking water exposure levels (e.g., one oocyst), there remains significant uncertainty that could not be quantified in the analysis.
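The isolate-to-isolate sampling described above can be sketched using the exponential dose-response form P(infection) = 1 - exp(-r x dose) that is commonly applied to Cryptosporidium. The per-isolate infectivity parameters below are hypothetical placeholders, not the fitted values used in the EA, and the isolate names are invented for illustration.

```python
import math
import random

# Sketch of sampling across isolate infectivity distributions, as described
# above, using an exponential dose-response model. The r values for the
# three isolates are hypothetical placeholders.

random.seed(2)

HYPOTHETICAL_ISOLATE_R = {"isolate A": 0.06, "isolate B": 0.004, "isolate C": 0.02}

def infection_probability(r, dose):
    """Exponential dose-response: P = 1 - exp(-r * dose)."""
    return 1.0 - math.exp(-r * dose)

def mean_risk(dose, n_draws=10_000):
    """Average single-exposure infection risk over random isolate draws."""
    total = 0.0
    for _ in range(n_draws):
        r = random.choice(list(HYPOTHETICAL_ISOLATE_R.values()))
        total += infection_probability(r, dose)
    return total / n_draws

# Low-dose exposure (e.g., one oocyst), far below the feeding-study doses
print(f"Illustrative mean risk at a one-oocyst dose: {mean_risk(1.0):.4f}")
```

Note that evaluating the model at a one-oocyst dose is exactly the low-dose extrapolation flagged in the text: the feeding studies were conducted at much higher doses, so the model's behavior at this dose carries unquantified uncertainty.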

While all of the significant costs of today's proposed rule have been identified by EPA, there are uncertainties about some of the estimates. However, the Agency explored the uncertainties likely to matter most by conducting sensitivity analyses and using Monte Carlo techniques. For example, section VI.D.2.f of today's rule explores the impact of influent bromide levels on technology selection. As shown in the EA for this rule, higher influent bromide levels will not have a significant impact on the rule's costs. In addition, subsection 6.12 of the EA summarizes other cost uncertainties, including the Agency's inability to include some lower cost toolbox options in the cost analysis (USEPA 2003a).

Lastly, EPA has recently finalized new regulations for arsenic, radon, Cryptosporidium in small surface water systems, and filter backwash in all system sizes (LT1ESWTR and Filter Backwash Rule); proposed a rule for microbials in ground water systems (Ground Water Rule); and is concurrently proposing additional control of disinfection byproducts (Stage 2 Disinfection Byproducts Rule). These rules may have overlapping impacts on some drinking water systems, but the extent of the overlap cannot be estimated because of a lack of information on co-occurrence. However, it is possible for a system to choose treatment technologies that would address multiple contaminants. Therefore, while the total cost impact of these drinking water rules is uncertain, it is most likely less than the estimated total cost of all individual rules combined.

K. What is the Benefit/Cost Determination for the Proposed LT2ESWTR?

The Agency has determined that the benefits of the proposed LT2ESWTR justify the costs. As discussed in section VI.C, the proposed rule provides a large reduction in endemic cryptosporidiosis illness and mortalities. More stringent alternatives provide greater reductions but at higher costs. Alternative A1 provides the greatest overall reduction in illnesses and mortalities but the incremental benefits between this option and the preferred option are relatively small while the incremental costs are significant. In addition, the preferred regulatory option, unlike option A1, specifically targets those systems whose source water requires higher levels of treatment.

Tables VI-17a and VI-17b present net benefits for the four regulatory alternatives that were evaluated. Generally, analysis of net benefits is used to identify alternatives where benefits exceed costs, as well as the alternative that maximizes net benefits. However, as with the analysis of incremental net benefits discussed previously, the usefulness of this analysis in evaluating regulatory alternatives for the LT2ESWTR is limited because many benefits from this rule are unquantified and not monetized. Analyses of net benefits should consider both quantified and non-quantified (where possible) benefits and costs.

Also, as noted earlier, the regulatory alternatives considered for the LT2ESWTR vary both in the population that experiences benefits and costs (i.e., risk bin boundaries) and the magnitude of the benefits and costs (i.e., treatment requirements). Consequently, the more stringent regulatory alternatives provide benefits to population groups that do not experience any benefit under less stringent alternatives.

As shown by Tables VI-17a and VI-17b, net benefits are positive for all four regulatory alternatives evaluated. With the enhanced COI (Table VI-17a), net benefits are highest for the Preferred Alternative, A3, under the majority of occurrence distributions and discount rates evaluated. When the traditional COI (Table VI-17b) is used, the Preferred Alternative has the highest net benefits at a three percent discount rate for two of the occurrence distributions, the Information Collection Rule and ICRSSM, while the least stringent alternative, A4, is highest for the ICRSSL. At a seven percent discount rate, A4 maximizes net benefits under all occurrence distributions.

Table VI-17a.—Mean Net Benefits by Rule Option—Enhanced COI ($millions, 2000$)

[Graphic not available; view image of printed page]


In addition to the net benefits of the proposed LT2ESWTR, the Agency used several other techniques to compare costs and benefits. For example, EPA calculated the cost of the rule per case avoided. Table VI-18 shows both the cost of the rule per illness avoided and the cost of the rule per death avoided. This cost-effectiveness measure is another way of examining the benefits and costs of the rule but should not be used to compare alternatives, because an alternative with the lowest cost per illness or death avoided may not result in the highest net benefits. With the exception of alternative A1, the rule options look favorable from a cost-effectiveness standpoint when compared to both the average cost of a cryptosporidiosis illness ($745 and $245 for the two COI approaches) and the mean value of a death avoided (approximately $7 million). Additional information about this analysis and other methods of comparing benefits and costs can be found in chapter 8 of the LT2ESWTR EA (USEPA 2003a).

Table VI-18.—Cost Per Illness or Death Avoided

Data set | Rule alternative | Cost per illness avoided ($), 3% | Cost per illness avoided ($), 7% | Cost per death avoided ($ millions, 2000$), 3% | Cost per death avoided ($ millions, 2000$), 7%
Source: Chapter 8 of the LT2ESWTR Economic Analysis (USEPA 2003a).
ICR | A1 | 339 | 244 | 2.5 | 1.8
ICR | A2 | 128 | 93 | 0.9 | 0.7
ICR | A3—Preferred | 107 | 78 | 0.8 | 0.6
ICR | A4 | 62 | 45 | 0.4 | 0.3
ICRSSL | A1 | 1,098 | 789 | 8.0 | 5.7
ICRSSL | A2 | 356 | 259 | 2.5 | 1.8
ICRSSL | A3—Preferred | 282 | 208 | 1.9 | 1.4
ICRSSL | A4 | 165 | 122 | 1.1 | 0.8
ICRSSM | A1 | 631 | 453 | 4.6 | 3.3
ICRSSM | A2 | 213 | 155 | 1.6 | 1.1
ICRSSM | A3—Preferred | 170 | 125 | 1.2 | 0.9
ICRSSM | A4 | 99 | 73 | 0.7 | 0.5
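The ratios in Table VI-18 are simple quotients: the rule's annualized cost divided by the annual number of illnesses (or deaths) avoided. The sketch below uses hypothetical round numbers for cost and cases avoided, not the EA's estimates for any alternative, to show the shape of the calculation.

```python
# Sketch of the cost-effectiveness calculation behind Table VI-18:
# annualized rule cost divided by annual illnesses or deaths avoided.
# The inputs below are hypothetical round numbers.

def cost_per_case_avoided(annualized_cost, cases_avoided):
    """Cost-effectiveness ratio in dollars per case avoided."""
    return annualized_cost / cases_avoided

# Hypothetical alternative: $100M/year cost, 900,000 illnesses and
# 120 deaths avoided per year
illness_ratio = cost_per_case_avoided(100e6, 900_000)
death_ratio = cost_per_case_avoided(100e6, 120)
print(f"Cost per illness avoided: ${illness_ratio:,.0f}")
print(f"Cost per death avoided:   ${death_ratio / 1e6:.1f} million")
```

As the text cautions, a low ratio by itself does not identify the best alternative; a less stringent option can have a lower cost per case avoided yet forgo net benefits that a more stringent option would deliver.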

L. Request for Comment

The Agency requests comment on all aspects of the proposed rule's economic impact analysis. Specifically, EPA seeks input on the following issues:

  • Both methodologies for valuing non-fatal cryptosporidiosis, and the use of a real income growth factor to adjust these estimates for the years 2008 through 2027;
  • How the Agency can fully incorporate all toolbox options into the economic analysis; and
  • How the Agency can estimate the potential benefits from reduced epidemic outbreaks of cryptosporidiosis.

VII. Statutory and Executive Order Reviews

A. Executive Order 12866: Regulatory Planning and Review

Under Executive Order 12866 (58 FR 51735, October 4, 1993), the Agency must determine whether the regulatory action is “significant” and therefore subject to OMB review and the requirements of the Executive Order. The Order defines “significant regulatory action” as one that is likely to result in a rule that may:

(1) Have an annual effect on the economy of $100 million or more or adversely affect in a material way the economy, a sector of the economy, productivity, competition, jobs, the environment, public health or safety, or State, local, or Tribal governments or communities;

(2) Create a serious inconsistency or otherwise interfere with an action taken or planned by another agency;

(3) Materially alter the budgetary impact of entitlements, grants, user fees, or loan programs or the rights and obligations of recipients thereof; or

(4) Raise novel legal or policy issues arising out of legal mandates, the President's priorities, or the principles set forth in the Executive Order.

Pursuant to the terms of Executive Order 12866, it has been determined that this rule is a “significant regulatory action.” As such, this action was submitted to OMB for review. Changes made in response to OMB suggestions or recommendations will be documented in the public record.

B. Paperwork Reduction Act

The information collection requirements in this proposed rule have been submitted for approval to the Office of Management and Budget (OMB) under the Paperwork Reduction Act, 44 U.S.C. 3501 et seq. The Information Collection Request (ICR) document prepared by EPA has been assigned EPA ICR number 2097.01.

The information collected as a result of this rule will allow the States and EPA to determine appropriate requirements for specific systems, and to evaluate compliance with the rule. For the first 3 years after LT2ESWTR promulgation, the major information requirements concern monitoring activities and compliance tracking. The information collection requirements are mandatory (part 141), and the information collected is not confidential.

The estimate of annual average burden hours for the LT2ESWTR during the first three years following promulgation is 145,854 hours. The annual average cost estimate is $3.9 million for labor and $9.8 million per year for operation and maintenance, including lab costs (which are a purchase of service). The burden per response is 1.47 hours and the cost per response is $138.12. The frequency of response (average responses per respondent) is 39 annually. The estimated number of likely respondents is 2,560 (the product of burden hours per response, frequency, and respondents does not total the annual average burden hours due to rounding). Note that the burden hour estimates for the first 3-year cycle include large system but not small system monitoring. Conversely, the burden estimate for the second 3-year cycle will include small system monitoring but not large system monitoring, which will have been completed by then.
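The relationship among the burden figures quoted above can be checked directly: hours per response times responses per respondent times number of respondents approximates, but (as the text notes) does not exactly equal, the 145,854-hour annual average, because each quoted factor is rounded.

```python
# Check of the burden figures quoted above. The product of the rounded
# factors approximates the stated annual average burden of 145,854 hours;
# the gap is attributable to rounding of the individual factors.

hours_per_response = 1.47
responses_per_respondent = 39
respondents = 2_560

product = hours_per_response * responses_per_respondent * respondents
print(f"Rounded-factor product: {product:,.0f} hours "
      f"(stated annual average: 145,854 hours)")
```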

Burden means the total time, effort, or financial resources expended by persons to generate, maintain, retain, or disclose or provide information to or for a Federal agency. This includes the time needed to review instructions; develop, acquire, install, and utilize technology and systems for the purposes of collecting, validating, and verifying information, processing and maintaining information, and disclosing and providing information; adjust the existing ways to comply with any previously applicable instructions and requirements; train personnel to be able to respond to a collection of information; search data sources; complete and review the collection of information; and transmit or otherwise disclose the information.

An agency may not conduct or sponsor, and a person is not required to respond to a collection of information unless it displays a currently valid OMB control number. The OMB control numbers for EPA's regulations in 40 CFR are listed in 40 CFR part 9.

To comment on the Agency's need for this information, the accuracy of the provided burden estimates, and any suggested methods for minimizing respondent burden, including the use of automated collection techniques, EPA has established a public docket for this rule, which includes this ICR, under Docket ID No. OW-2002-0039. Submit any comments related to the ICR for this proposed rule to EPA and OMB. See the Addresses section at the beginning of this notice for where to submit comments to EPA. Send comments to OMB at the Office of Information and Regulatory Affairs, Office of Management and Budget, 725 17th Street, NW., Washington, DC 20503, Attention: Desk Officer for EPA. Since OMB is required to make a decision concerning the ICR between 30 and 60 days after August 11, 2003, a comment to OMB is best assured of having its full effect if OMB receives it by September 10, 2003. The final rule will respond to any OMB or public comments on the information collection requirements contained in this proposal.

C. Regulatory Flexibility Act

The Regulatory Flexibility Act (RFA) generally requires an agency to prepare a regulatory flexibility analysis for any rule subject to notice and comment rulemaking requirements under the Administrative Procedure Act or other statute unless the agency certifies that the rule will not have a significant economic impact on a substantial number of small entities. Small entities include small businesses, small organizations, and small governmental jurisdictions.

The RFA provides default definitions for each type of small entity. It also authorizes an agency to use alternative definitions for each category of small entity, “which are appropriate to the activities of the agency” after proposing the alternative definition(s) in the Federal Register and taking comment. 5 U.S.C. secs. 601(3)-(5). In addition to the above, to establish an alternative small business definition, agencies must consult with SBA's Chief Counsel for Advocacy.

For purposes of assessing the impacts of today's proposed rule on small entities, EPA considered small entities to be public water systems serving 10,000 or fewer persons. This is the cut-off level specified by Congress in the 1996 Amendments to the Safe Drinking Water Act for small system flexibility provisions. In accordance with the RFA requirements, EPA proposed using this alternative definition in the Federal Register, (63 FR 7620, February 13, 1998), requested public comment, consulted with the Small Business Administration (SBA), and expressed its intention to use the alternative definition for all future drinking water regulations in the Consumer Confidence Reports regulation (63 FR 44511, August 19, 1998). As stated in that final rule, the alternative definition is applied to this proposed regulation.

After considering the economic impacts of today's proposed rule on small entities, I certify that this action will not have a significant economic impact on a substantial number of small entities. We have determined that 274 small systems, which are 2.32% of the 11,820 small systems regulated by the LT2ESWTR, will experience an impact of one percent or greater of average annual revenues; further, 31 systems, which are 0.26% of the systems regulated by this rule, will experience an impact of three percent or greater of average annual revenues (see Table VII-1).

Table VII-1.—Annualized Compliance Cost as a Percentage of Revenues for Small Entities ($2000)

Entity | Number of small systems (A) | Percent of small systems (B) | Average annual estimated revenues per system, $ | Systems with costs ≥1% of revenues: percent (E) | Systems with costs ≥1% of revenues: number (F=A×E) | Systems with costs ≥3% of revenues: percent (G) | Systems with costs ≥3% of revenues: number (H=A×G)
Note: Detail may not add due to independent rounding. Data are based on the means of the highest modeled distributions using the Information Collection Rule occurrence data set. Costs are discounted at 3 percent, summed to present value, and annualized over 25 years.
Source: Chapter 7 of the LT2ESWTR EA (USEPA 2003a).
Small Governments | 5,910 | 50 | 2,434,200 | 2.4 | 140 | 0.3 | 15
Small Businesses | 4,846 | 41 | 2,391,978 | 2.4 | 115 | 0.3 | 13
Small Organizations | 1,064 | 9 | 4,446,165 | 1.2 | 13 | 0.1 | 1
All Small Entities | 11,820 | 100 | 2,597,966 | 2.3 | 274 | 0.3 | 31

The LT2ESWTR contains provisions that will affect systems serving fewer than 10,000 people that use surface water or GWUDI as a source. In order to meet the LT2ESWTR requirements, approximately 1,382 to 2,127 small systems would need to make capital improvements. Impacts on small entities are described in more detail in Chapters 6 and 7 of the Economic Analysis for the LT2ESWTR (USEPA 2003a). Table VII-2 shows the annual compliance costs of the LT2ESWTR for small entities by system size and type based on a three percent discount rate (other estimates based on different data sets and discount rates produce lower costs). EPA has determined that in each size category, fewer than 20% of systems and fewer than 1,000 systems will experience an impact of one percent or greater of average annual revenues (USEPA 2003a).

Table VII-2.—Annual Compliance Costs for the Proposed LT2ESWTR by System Size and Type

[$Millions, 2000$]
System type | System size (population served): ≤100 | 101-500 | 501-1,000 | 1,001-3,300 | 3,301-10,000 | Total
Note: Results are based on the mean of the Information Collection Rule Cryptosporidium occurrence distribution. Costs are annualized at a three percent discount rate.
Source: Appendix D and Q of the LT2ESWTR EA (USEPA 2003a).
Publicly owned | $0.46 | $0.88 | $0.94 | $2.62 | $5.57 | $10.37
Privately owned | 1.00 | 0.71 | 0.22 | 0.31 | 0.36 | 2.60
All systems | 1.45 | 1.59 | 1.07 | 2.92 | 5.93 | 12.97

Although this proposed rule will not have a significant economic impact on a substantial number of small entities, EPA nonetheless has tried to reduce the impact of this rule on small entities. The LT2ESWTR contains a number of provisions to minimize the impact of the rule on systems generally, and on small systems in particular. The risk-targeted approach of the LT2ESWTR will impose additional treatment requirements only on the subset of systems with the highest vulnerability to Cryptosporidium, as indicated by source water pathogen levels. This approach will spare the majority of systems from the cost of installing additional treatment. Also, development of the microbial toolbox under the LT2ESWTR will provide both large and small systems with broad flexibility in selecting cost-effective compliance options to meet additional treatment requirements.

Small systems will monitor for E. coli as a screening analysis for source waters with low levels of fecal contamination. Cryptosporidium monitoring will only be required of small systems if they exceed the E. coli trigger value. Because E. coli analysis is much cheaper than Cryptosporidium analysis, the use of E. coli as a screen will significantly reduce monitoring costs for the majority of small systems. In order to allow EPA to review Cryptosporidium indicator relationships in large system monitoring data, small systems will not be required to initiate their monitoring until large system monitoring has been completed. This will provide small systems with additional time to become familiar with the rule and to prepare for monitoring and other compliance activities.

Funding would be available from programs administered by EPA and other Federal agencies to assist small public water systems (PWSs) in complying with the LT2ESWTR. The Drinking Water State Revolving Fund (DWSRF) assists PWSs with financing the costs of infrastructure needed to achieve or maintain compliance with SDWA requirements. Through the DWSRF, EPA awards capitalization grants to States, which in turn can provide low-cost loans and other types of assistance to eligible PWSs. Loans made under the program can have interest rates between 0 percent and market rate and repayment terms of up to 20 years. States prioritize funding based on projects that address the most serious risks to human health and assist systems most in need. Congress provided $1.275 billion for the DWSRF program in fiscal year 1997, and has provided an additional $3.145 billion for the DWSRF program for fiscal years 1998 through 2001.

The DWSRF places an emphasis on small and disadvantaged communities. States must provide a minimum of 15% of available funds for loans to small communities. A State has the option of using up to 30% of its grant to furnish additional assistance to State-defined disadvantaged communities. This assistance can take the form of lower interest rates, principal forgiveness, or negative interest rate loans. A State may also extend the repayment terms of loans to disadvantaged communities to up to 30 years. A State can set aside up to 2% of its grant to provide technical assistance to systems serving communities with populations of fewer than 10,000.

In addition to the DWSRF, money is available from the Department of Agriculture's Rural Utilities Service (RUS) and the Department of Housing and Urban Development's Community Development Block Grant (CDBG) program. RUS provides loans, guaranteed loans, and grants to improve, repair, or construct water supply and distribution systems in rural areas and towns of up to 10,000 people. In fiscal year 2002, RUS had over $1.5 billion of available funds for water and environmental programs. The CDBG program includes direct grants to States, which in turn are awarded to smaller communities, rural areas, and colonias in Arizona, California, New Mexico, and Texas, as well as direct grants to U.S. territories and trusts. The CDBG budget for fiscal year 2002 totaled over $4.3 billion.

Although EPA determined that this proposal would not have a significant economic impact on a substantial number of small entities, and was therefore not required by the RFA to convene a Small Business Advocacy Review (SBAR) Panel, EPA nonetheless convened a panel to obtain advice and recommendations from representatives of the small entities potentially subject to this rule's requirements.

Before convening the SBAR Panel, EPA consulted with a group of 24 small entity stakeholders likely to be impacted by the LT2ESWTR and who were asked to serve as Small Entity Representatives (SERs) after the Panel was convened. The small entity stakeholders included small system operators, local government representatives, and representatives of small nonprofit organizations. The small entity stakeholders were provided with background information on SDWA and potential alternatives for the LT2ESWTR in preparation for teleconferences on January 28, 2000, February 25, 2000, and April 7, 2000. This information package included data on preliminary unit costs for treatment enhancements under consideration.

During these three conference calls, the information that had been provided to the small entity stakeholders was discussed and EPA responded to questions and recorded initial comments. Following the three calls, the small entity stakeholders were asked to provide input on the potential impacts of the rule from their perspective. Seven small entity stakeholders provided written comments on these materials.

The SBAR Panel convened on April 25, 2000. The small entity stakeholders' comments were provided to the SBAR Panel when it convened. After a teleconference between the SERs and the SBAR Panel on May 25, 2000, the SERs were invited to provide additional comments on the information provided. Seven SERs provided additional comments on the rule components.

The SBAR Panel's report, Final Report of the Small Business Advocacy Review Panel on Stage 2 Disinfectants and Disinfection Byproducts Rule (Stage 2 DBPR) and Long Term 2 Enhanced Surface Water Treatment Rule (LT2ESWTR) (USEPA 2000f), the SERs' comments on the LT2ESWTR, and the background information provided to the SBAR Panel and the SERs are available for review in the docket for today's proposal (http://www.epa.gov/edocket/).

In general, the SERs consulted on the LT2ESWTR were concerned about the impact of the proposed rules on small water systems; the ability of small systems to acquire the technical and financial capacity to implement the requirements while retaining the flexibility to tailor those requirements to their needs; and the broader limitations of small systems. The SBAR Panel evaluated information and small-entity comments on issues related to the impact of the LT2ESWTR.

The LT2ESWTR takes into consideration the recordkeeping and reporting concerns identified by the SBAR Panel and the SERs. The SBAR Panel recommended that EPA evaluate ways to minimize the recordkeeping and reporting burdens under the rule by ensuring that the States have appropriate capacity for rule implementation, and that EPA provide as much monitoring flexibility as possible to small systems. EPA believes that continuity with the IESWTR and LT1ESWTR has been maintained to the extent possible to ease the transition to the LT2ESWTR, especially for small systems. The LT2ESWTR builds on the protection afforded under the IESWTR and LT1ESWTR, while minimizing the impact on small systems by using a risk-targeted approach (i.e., source water monitoring) to identify systems that are still at risk from Cryptosporidium exposure.

The SBAR Panel noted the concern of several SERs that flexibility be provided in the compliance schedule of the rule. SERs commented on the technical and financial limitations of some small systems, the significant learning curve for operators with limited experience, and the need to continue providing uninterrupted service as reasons why additional compliance time may be needed for small systems. The SBAR Panel encouraged EPA to keep these limitations in mind in developing the proposed rule and provide as much compliance flexibility to small systems as is allowable under SDWA.

EPA has concluded that the proposed schedule for the LT2ESWTR provides sufficient time for small systems to achieve compliance. The schedule for small system monitoring and compliance with additional treatment requirements lags behind the schedule for large systems. The schedule for small systems lags primarily to allow EPA to confirm or refine the E. coli screening criteria that small systems will use to reduce monitoring costs. The lagging schedule also gives small systems more time to become knowledgeable about the LT2ESWTR, including the new monitoring requirements, and to become familiar with innovative technologies, like UV, that some small systems may use to meet additional treatment requirements.

Some SERs emphasized that EPA needs to maintain an appropriate balance between control of known microbial risks through adequate disinfection and control of the more uncertain risks that may be associated with DBPs. The SBAR Panel did not foresee any potential conflict between rules regulating control of microbial contaminants and those regulating DBPs. EPA also believes that today's proposal and the accompanying proposed Stage 2 DBPR achieve an appropriate balance between microbial and DBP risks. The profiling and benchmarking requirements described in section IV.D of this preamble will ensure that systems maintain protection against pathogens as they make treatment changes to control the formation of DBPs.

The SBAR Panel considered a wide range of options and regulatory alternatives for providing small businesses with flexibility in complying with the LT2ESWTR. The SBAR Panel was concerned with the option of an across-the-board additional Cryptosporidium inactivation requirement because of the potential high cost to small systems and the uncertainty regarding the extent to which implementation of the LT1ESWTR will adequately address Cryptosporidium contamination at small systems. The SBAR Panel noted that, at the time, the Stage 2 M-DBP Federal Advisory Committee was exploring a targeted approach to Cryptosporidium control based on limited monitoring and system assessment, which would identify a subset of vulnerable systems to provide additional treatment in the range of 0.5- to 2.5-log reduction. Further, this approach would allow E. coli monitoring in lieu of Cryptosporidium monitoring as a screening device for small systems. The SBAR Panel was also encouraged by recent developments suggesting that UV is a viable, cost-effective means of fulfilling any additional inactivation requirements.

The SBAR Panel recommended that, in developing any additional inactivation requirements based on a targeted approach, EPA carefully consider the potential impa