Environmental Protection: More Consistency Needed Among EPA Regions in
Approach to Enforcement (Chapter Report, 06/01/2000, GAO/RCED-00-108).

Pursuant to a congressional request, GAO provided information on the
consistency of the Environmental Protection Agency's (EPA) regional
offices' enforcement of environmental requirements, focusing on: (1) the
extent to which variations exist among EPA's regional offices in the
actions they take to enforce environmental requirements; (2) what
factors contribute to any variations; and (3) what EPA is doing to
achieve consistency in regional enforcement activities.

GAO noted that: (1) variations exist among EPA's regional offices in the
actions they take to enforce environmental requirements, as illustrated
by a number of key indicators that EPA headquarters enforcement
officials have used to monitor regional performance; (2) GAO also found
variations in regions' overall strategies in overseeing the states
within their jurisdiction, which may result in more in-depth reviews in
some regional programs than in others; (3) EPA headquarters enforcement
officials emphasize that the data, by themselves, do not offer the
appropriate context to help determine the extent to which the variations
pose problems; (4) the officials note, however, that the data are useful
for identifying general trends and possible strengths and weaknesses in
regional and state programs, along with potential issues to investigate
at greater length; (5) also corroborating the variation it identified
among regional enforcement activities, GAO found broad agreement in its
interviews with EPA and state enforcement officials on key factors that
contribute to such variations; (6) among the factors most commonly cited
by these officials are: (a) differences in the philosophical approaches
among enforcement staff about how to best achieve compliance with
environmental requirements; (b) differences in state laws and
enforcement authorities, and in the manner in which regions respond to
these differences; (c) variations in resources available to both state
and regional enforcement offices; (d) the flexibility afforded by EPA
policies and guidance that allow states a degree of latitude in their
enforcement programs; and (e) incomplete and inadequate enforcement data,
which hamper EPA's ability to accurately characterize the extent of
variations; (7) EPA headquarters and regional enforcement officials have
a number of efforts underway to help achieve greater consistency in
regional enforcement activities; (8) at the headquarters level,
enforcement officials are developing performance information that will
allow for comparisons among both regions and states in their conduct of
key enforcement responsibilities; (9) a number of EPA regional offices
have also sought to ensure more consistency in their state oversight by
developing and applying new audit protocols in their state reviews and
by encouraging more effective communication between and among regional
and state enforcement staff; and (10) however, a number of factors will
continue to challenge EPA's ability to ensure consistent regional
enforcement, including the absence of reliable data on how both states
and regions are performing their enforcement responsibilities.

--------------------------- Indexing Terms -----------------------------

 REPORTNUM:  RCED-00-108
     TITLE:  Environmental Protection: More Consistency Needed Among
	     EPA Regions in Approach to Enforcement
      DATE:  06/01/2000
   SUBJECT:  Environmental law
	     Performance measures
	     Pollution control
	     Law enforcement
	     State-administered programs
	     Environmental policies
	     Noncompliance
	     Federal/state relations
	     Industrial pollution
	     Program evaluation
IDENTIFIER:  EPA Denver Regional Oversight System
	     EPA National Pollutant Discharge Elimination System

******************************************************************
** This file contains an ASCII representation of the text of a  **
** GAO chapter report.                                          **
**                                                              **
** No attempt has been made to display graphic images, although **
** figure captions are reproduced.  Tables are included, but    **
** may not resemble those in the printed version.               **
**                                                              **
** Please see the PDF (Portable Document Format) file, when     **
** available, for a complete electronic file of the printed     **
** document's contents.                                         **
**                                                              **
******************************************************************

GAO/RCED-00-108

Contents

Executive Summary

Chapter 1: Introduction
  EPA and State Roles in Enforcing Environmental Programs
  EPA Tries to Achieve Consistency in Its Enforcement Programs Nationwide
  Objectives, Scope, and Methodology

Chapter 2: Variations in Regional Enforcement Activities
  Inspection Coverage
  Number and Type of Enforcement Actions
  Referrals of Violations to the Department of Justice
  Regions' Approaches to Enforcement Oversight
  State and EPA Perspectives on the Extent of Regional Variation
  Agency Comments

Chapter 3: Factors Contributing to Variations in Regional Enforcement
Activities
  Differences in Regions' and States' Enforcement Approaches
  Variations in State Laws and Enforcement Authorities
  Variations in Resource Commitments to Enforcement
  Flexibility in EPA Policies and Guidance
  Incomplete and Inaccurate Enforcement Data

Chapter 4: EPA's Efforts to Achieve Greater Consistency in Regional
Enforcement Activities
  Developing Comparative Information on Regional Enforcement
  Recognizing the Seriousness of the Data Quality Problem
  Improving Regional and State Communications
  Developing Regional Audit Protocols to Improve Oversight
  Conclusions
  Recommendations
  Agency Comments

Appendix I: Comments From the Environmental Protection Agency

Appendix II: GAO Contacts and Staff Acknowledgments

Figure 1: EPA's 10 Regions and Regional Office Locations

Abbreviations

EPA Environmental Protection Agency

GAO General Accounting Office

OECA Office of Enforcement and Compliance Assurance

Resources, Community, and
Economic Development Division

B-284777

June 2, 2000

The Honorable Christopher S. Bond
Chairman, Committee on Small Business
United States Senate

Dear Mr. Chairman:

In response to your request, we are reporting on the extent to which
variations exist among the Environmental Protection Agency's regions in
enforcing environmental requirements, the factors that contribute to such
variations, and the agency's efforts to achieve greater consistency in its
nationwide enforcement program.

This report will not be distributed until 30 days after its issuance date
unless you publicly announce its contents earlier. At that time, we will
send copies to the appropriate congressional committees; the Honorable Carol
Browner, the Administrator, Environmental Protection Agency; and the
Honorable Jacob Lew, Director, Office of Management and Budget. We will also
make copies available to others upon request.

Please call me at (202) 512-6111 if you or your staff have any questions.
Major contributors to this report are listed in appendix II.

Sincerely yours,
Peter F. Guerrero
Director, Environmental Protection
Issues

Executive Summary

The Environmental Protection Agency (EPA) administers its environmental
enforcement responsibilities through its headquarters Office of Enforcement
and Compliance Assurance (OECA). While OECA provides overall direction on
enforcement policies, and occasionally takes direct enforcement action, much
of the agency's enforcement responsibility is carried out by its 10 regional
offices. These offices are responsible for taking direct enforcement action
and for overseeing the enforcement programs of state agencies that have been
delegated enforcement authority.

Although EPA acknowledges that some variation in environmental enforcement
is necessary to take into account local conditions and local concerns, the
agency maintains that core enforcement requirements must nonetheless be
consistently implemented. EPA
also maintains that to ensure fairness and equitable treatment, similar
violations should be met with similar enforcement responses, regardless of
geographic location. Concerned that environmental requirements are not being
enforced with sufficient consistency among EPA's regional offices, the
Chairman, Senate Committee on Small Business, asked us to provide
information on (1) the extent to which variations exist among EPA's regional
offices in the actions they take to enforce environmental requirements, (2)
what factors contribute to any variations, and (3) what the agency is doing
to achieve consistency in regional enforcement activities.

Since its creation in 1970, EPA has been responsible for enforcing the
nation's environmental laws. This responsibility has traditionally involved
monitoring compliance by those in the regulated community (such as factories
or small businesses that release pollutants into the environment or use
hazardous chemicals), ensuring that violations are properly identified and
reported, and ensuring that timely and appropriate enforcement actions are
taken against violators when necessary.

Most major environmental statutes allow EPA to authorize qualified states to
implement key programs and to enforce their requirements. EPA establishes,
by regulation, the requirements for state enforcement authority, such as the
authority to seek civil and criminal penalties. EPA also outlines, by policy
and guidance, its views as to the elements of an acceptable state
enforcement program, such as the type and timing of the action that should
be taken for various violations, and tracks how well states comply.
Environmental legislation generally provides authority for EPA to take
appropriate enforcement action against violators in states that have been
delegated authority for these programs when states fail to initiate an
enforcement action. The statutes also provide that EPA may withdraw approval
of a program if the state is not adequately administering or enforcing it.

EPA issues regulations, policies, and guidance to help ensure a consistent
approach nationwide in the implementation of environmental requirements.
OECA expects the regions to take a systematic approach in administering and
overseeing both delegated and nondelegated enforcement programs and, in
doing so, to follow EPA's policies and guidance. Of
particular note, agency officials maintain that enforcement responses
selected should be directly related to the severity of the violation and
that like violations should generally be met with comparable penalties.
While federal and state enforcement officials agree that basic program
elements should be largely consistent, some variation is to be
expected--and, in some cases, encouraged. According to EPA, for example,
some variation is to be expected in how regions target resources to the most
significant compliance issues in different regions and states and in the
level of regional oversight of state enforcement programs (with the greater
oversight provided for weaker programs).

Variations exist among EPA's regional offices in the actions they take to
enforce environmental requirements, as illustrated by a number of key
indicators that EPA headquarters enforcement officials have used to monitor
regional performance. These indicators include, for example, (1) inspection
coverage by EPA and state enforcement staff of facilities discharging
pollutants within each region, (2) the number and type of enforcement
actions taken, and (3) the size of the penalties assessed and the criteria
used in determining penalties assessed. GAO also found variations in
regions' overall strategies in overseeing the states within their
jurisdiction, which may result in more in-depth reviews in some regional
programs than in others. The types of variations shown in these data
corroborate earlier findings detailed in a series of reports by EPA's Office
of Inspector General and by headquarters' own internal evaluations. EPA
headquarters enforcement officials emphasize that the data, by themselves,
do not offer the appropriate context to help determine the extent to which
the variations pose problems. The officials note, however, that the data are
useful for identifying general trends and possible strengths and weaknesses
in regional and state programs, along with potential issues to investigate
at greater length.

Also corroborating the variation it identified among regional enforcement
activities, GAO found broad agreement in its interviews with EPA and state
enforcement officials on key factors that contribute to such variations.
Among the factors most commonly cited by these officials are (1) differences
in the philosophical approaches among enforcement staff about how to best
achieve compliance with environmental requirements; (2) differences in state
laws and enforcement authorities, and in the manner in which regions respond
to these differences; (3) variations in resources available to both state
and regional enforcement offices; (4) the flexibility afforded by EPA
policies and guidance that allow states a degree of latitude in their
enforcement programs; and (5) incomplete and inadequate enforcement data,
which, among other things, hamper EPA's ability to accurately characterize
the extent of variations.

EPA headquarters and regional enforcement officials have a number of efforts
underway to help achieve greater consistency in regional enforcement
activities. At the headquarters level, for example, enforcement officials
are developing performance information that will allow for comparisons among
both regions and states in their conduct of key enforcement
responsibilities. Such assessments are expected to highlight any major
variations and will be communicated through the issuance of periodic
"Program Status Reports." A number of EPA regional offices have also sought
to ensure more consistency in their state oversight by developing and
applying new audit protocols in their state reviews and by encouraging more
effective communication between and among regional and state enforcement
staff. Notwithstanding these efforts, however, a number of factors will
continue to challenge EPA's ability to ensure consistent enforcement across
its regions. Among the most important of these factors is the absence of
reliable data on how both states and regions are performing their
enforcement responsibilities. Without such data, EPA is hampered in its
ability to ascertain the extent to which inconsistencies do in fact exist,
the impact they may have on human health and the environment, and the manner
in which they should be addressed. This report makes a number of
recommendations to further EPA's efforts to promote greater consistency in
how EPA's regions approach the agency's nationwide enforcement program.

Variations exist among EPA's regional offices in the actions they take to
enforce environmental requirements, as illustrated by a number of key
indicators that EPA headquarters enforcement officials have used to monitor
regional performance. EPA's enforcement program, for example, depends
heavily upon inspections by regional and/or state enforcement staff as the
primary means of detecting violations and evaluating overall facility
compliance. Thus, EPA maintains that the quality and the content of the
agency's and states' inspections, and the number of inspections undertaken
to ensure adequate coverage, are important indicators of an enforcement
program's effectiveness. Fiscal year 1998 EPA data show that regional and
state inspection coverage for Clean Air Act-related programs ranged from a
low of 27 percent of facilities inspected in the Chicago region to a high of
74 percent for facilities in the Philadelphia region. For major dischargers
under the Clean Water Act, inspection coverage also varied from a low of 57
percent of facilities in the Denver region to a high of 92 percent in the
Atlanta region. The examples, however, also illustrate the importance of
getting behind the data to understand the causes of apparently wide
disparities and whether they reflect a problem. OECA's Deputy
Assistant Administrator noted, for instance, that the Chicago office's
relatively low 27-percent inspection figure could be explained by that
office's recent emphasis on conducting detailed and resource-intensive
investigations of the region's numerous electric power plants,
investigations that draw on resources from that office's inspection budget.

EPA Inspector General and OECA reports also found that regions vary in the
way they oversee state-delegated programs. Among their findings were that,
contrary to EPA policy, some regions did not (1) conduct an adequate number
of oversight inspections of state programs; (2) sufficiently encourage
states to consider economic benefit in calculating penalties; (3) take more
direct federal actions where states were slow to act; and (4) require states
to report all significant violators. A number of regions have recently begun
to develop and implement state audit protocols in response to these
findings, believing that having such a protocol could help them to review
the state programs within their jurisdiction with greater consistency. Here
too, regions' approaches differ. The Boston region has adopted a
comprehensive "multimedia" approach in which it simultaneously examines all
of a state's delegated environmental programs. The Philadelphia region,
however, favors a more targeted approach where air, water, and waste
programs are audited individually. On the other hand, the Chicago office's
air enforcement branch chief said that he did not view an audit protocol as
particularly useful, noting instead that he prefers regional staff to engage
in joint inspections with states to assess their performance in the field
and to take direct federal action where a state action is inadequate.

Regional and state officials GAO interviewed generally indicated that it was
difficult for them to ascertain the extent of variation in enforcement
activities among regions, given their focus on activities within their own
geographic environment. However, EPA headquarters officials responsible for
the air and water programs noted that such variation is fairly commonplace
and does pose problems. The director of OECA's water enforcement division,
for example, said that in reacting to similar violations, enforcement
responses in certain regions are weaker than they are in others. He also
said that such inconsistency has increased in recent years. The director of
OECA's air enforcement division said that given the considerable autonomy of
the regional offices, it is not surprising that variations exist in how they
approach enforcement and state oversight. According to the director, for the
air inspection program, disparities exist among regions in the number and
quality of inspections conducted and in the number of permits written in
relation to the number of sources requiring permits.

EPA data and recent studies document variation in key measures associated
with the agency's enforcement program, but do little to explain the causes
of the variation. Without such information, it is difficult to determine the
extent to which variation represents a problem, whether it is preventable,
or the extent to which it represents the appropriate exercise of flexibility
by regions and states to apply national program goals to their unique
circumstances. Accordingly, in visits to regional offices and states and in
discussions with headquarters officials, GAO sought to identify the factors
that may be contributing to variation. Among the factors most commonly cited
were the following:

Differences in philosophical approach to enforcement. While OECA has issued
policies, memorandums, and other documents to guide regions in their
approach to enforcement, the considerable autonomy built into EPA's
decentralized, multilevel structure allows regional offices considerable
latitude in adapting headquarters direction in a way they believe best suits
their jurisdiction. Such differences often reflect alternative enforcement
approaches such as whether the region should (1) rely predominantly on fines
and other traditional enforcement methods to deter noncompliance and to
bring violators into compliance or (2) place greater reliance on alternative
strategies such as compliance assistance (e.g., workshops and site visits to
identify potential compliance problems). Regions have also differed as to
whether deterrence could be achieved best through (1) a small number of
high-profile, resource-intensive cases or (2) a larger number of smaller
cases that establish a more widespread, albeit lower-profile, enforcement
presence. Further complicating matters are the similarly wide differences
among states in their enforcement approaches and the various ways in which
regions respond to these differences. Some regions step more readily into
cases where they consider a state's action to be inadequate, while other
regions are more concerned about infringing on states' discretion if they
have been delegated enforcement responsibilities.

Differences in state laws and enforcement authorities. According to nearly
all regional and state enforcement officials GAO interviewed, differences in
state laws and enforcement authorities also contribute significantly to
variations in enforcement programs. The enforcement director in EPA's
Philadelphia office, for example, noted that Maryland, among other states,
does not specifically provide that when calculating penalties, the penalties
should be large enough to offset the economic benefits achieved by
noncompliance (as provided for by EPA policy). States also vary widely as to
whether they can pursue enforcement actions administratively or must instead
use the more time-consuming and resource-intensive approach of referring a
case to the state's Office of the Attorney General for judicial action.

Incomplete and inaccurate national enforcement data. OECA needs accurate and
complete enforcement data to help it determine whether core program
requirements are being consistently implemented by regions and states and
whether there are significant variations from these requirements that should
be corrected. Responsibility for inputting data to EPA's national databases
resides with the region or with the state responsible for carrying out the
enforcement program. Both the quality of and quality controls over these
data were criticized by state and regional staff GAO interviewed. Recent
internal OECA studies have also acknowledged the seriousness of the problem.
For example, an internal OECA work group, the "Targeting Program Review
Team," stated in its November 1999 report that key functions related to data
quality, such as the consistent entry of information by regions and states,
were not working properly and that there were important information gaps in
its enforcement-related databases. Another OECA work group concluded that
". . . OECA managers do not have available to them timely, complete, and
detailed analyses of regional or national performance." A third OECA work
group asserted that the situation has deteriorated from past years, noting
that ". . . managers in the regions and in OECA headquarters have become
increasingly frustrated that they are not receiving from [the Office of
Compliance] the reports and data analyses they need to manage their
programs," and that there "…has been less attention to the data in
the national systems, a commensurate decline in data quality, and
insufficient use of data by enforcement/compliance managers . . .." OECA has
recognized the seriousness of its data problem. Noting that the resources
devoted to data quality may have been insufficient in recent years, the
Acting Director of the Office of Compliance indicated headquarters' intention
to shift some resources internally to help alleviate the problem. GAO
concluded, however, that in light of the scope and seriousness of the
problem, EPA still needs a strategy that will bring to bear sufficient
priority and resources so that the problem can be adequately addressed.

Headquarters and regional enforcement officials identified a number of
activities they believe will further help to achieve greater consistency in
how regional offices take direct enforcement action, and in how they oversee
state enforcement programs within their jurisdiction. GAO acknowledges the
merit of many of these activities but believes that additional action, and
in some cases changes in the agency's approach, would further EPA's effort
to achieve
an appropriate level of consistency in regional enforcement:

Providing comparative data on regional performance. OECA is developing a
system in which periodic "Program Status Reports" would provide comparative
information on each region's performance and the performance of the states
within each region. According to senior OECA officials, the reports would
allow for comparisons on a broader array of information that focuses
increasingly on the results the enforcement program is trying to achieve.
Additionally, OECA is developing a system of "Program Element Reviews,"
which would be in-depth reviews targeting the regions' implementation of a
particular program element. Both reviews have the potential to convey useful
information to both EPA managers and to the public on the extent to which
the enforcement program is being implemented consistently and fairly
nationwide. However, the example cited previously concerning the Chicago
office's relatively low 27-percent air inspection rate illustrates how a
data point, unaccompanied by an explanation of the circumstances behind the
data, can lead to incorrect conclusions. OECA officials have also
acknowledged that such data can be easily misinterpreted without the
contextual information needed to clarify whether variation in a given
instance is inappropriate, or whether it reflects the appropriate exercise
of flexibility by regions and states to tailor their programs to their
individual needs, priorities, and circumstances. GAO, therefore, concluded
that the Program
Status Reports can better serve their intended purpose for EPA management
and the public if they (1) clarify what aspects of EPA's enforcement program
the agency expects to see implemented consistently from region to region and
where it believes greater variation is appropriate and (2) supplement their
region-by-region data with contextual information that helps to explain why
variations occur and thereby clarifies the extent to which variations are
problematic.

Improving regional-state communications. Regional officials cited improved
communication as a key component in their efforts to initiate new processes
and effect change among their staff and among their states. Senior officials
in the Seattle region, for example, instituted a Regional Enforcement Forum
that brought together all regional program directors and top managers to
share information and to ensure they are aware of how enforcement is
approached elsewhere in the region. Other regional officials conveyed
similar experiences, noting that better communication among federal and
state enforcement officials within a region helps to identify approaches or
performance levels that deviate significantly from the norm, thereby
promoting a more consistent approach.

Regional development of audit protocols. A number of regional offices have
developed protocols that they hope will achieve more effective and more
consistent oversight of the states within their jurisdiction. One of the
more comprehensive of these new protocols is the Denver office's "Unified
Oversight System." Under the system, regional staff evaluate all state
environmental programs using certain performance criteria such as data
entry, timeliness of actions, penalties recovered, and the effectiveness of
inspections. Each state is graded on each category and then given an overall
rating. The system is built, in part, on the concept of a comparative review
system to pinpoint the weakest states and programs needing the most
oversight attention. The belief is that by developing and disseminating
comparative data among the region's states, the states with the lowest
rating will, over time, be assisted and encouraged to rise to the level of
their peers. GAO acknowledges the potential of these protocols to help a
region achieve greater consistency in its oversight of its states and agrees
that such protocols should be tailored to meet the needs of each region.
However,
GAO concluded that headquarters guidance on key elements that should be
common to all protocols would help to engender a higher level of consistency
among all 10 regional offices in the way oversight of states is conducted
nationwide.
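
To illustrate the comparative logic of such a rating system, the following
sketch grades hypothetical states in the performance categories named above
and ranks them weakest first. It is a minimal illustration, not EPA's actual
system: the 1-to-5 grading scale, the unweighted average, and the state
grades are assumptions.

    # Hypothetical sketch of a comparative state-rating scheme in the
    # spirit of the Denver office's Unified Oversight System. The
    # category names come from the report; the 1-5 scale and the
    # unweighted average are assumptions made for illustration.

    grades = {
        "State A": {"data entry": 4, "timeliness of actions": 3,
                    "penalties recovered": 5,
                    "effectiveness of inspections": 4},
        "State B": {"data entry": 2, "timeliness of actions": 2,
                    "penalties recovered": 3,
                    "effectiveness of inspections": 2},
    }

    def overall_rating(category_grades):
        # Overall rating is the mean of the category grades.
        return sum(category_grades.values()) / len(category_grades)

    # Rank states weakest first, so the programs needing the most
    # oversight attention surface at the top of the list.
    for state in sorted(grades, key=lambda s: overall_rating(grades[s])):
        print(f"{state}: overall rating {overall_rating(grades[state]):.1f}")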

GAO recommends that the Administrator of EPA

• provide, as part of the agency's efforts to develop Program Status Reports
containing comparative data on regional and state enforcement performance,
the contextual information needed to help EPA management and the public
properly understand them;

• develop a comprehensive strategy that will bring to bear sufficient
priority and resources so that the problems affecting the quality of the
agency's enforcement data can be adequately addressed; and

• issue guidance to EPA regions describing the required elements of audit
protocols to be used in overseeing state enforcement programs.

GAO provided a draft of this report to EPA for its review and comment. EPA
said that it shared GAO's view on the importance of consistency of regional
enforcement but raised a number of issues concerning the draft report's
discussion of that issue. Among them, EPA noted that the draft report was
not clear as to whether GAO was evaluating consistency among EPA regions or
between EPA regions and states. In GAO's view, the draft report was clear on
this point. The report stated that GAO's evaluation focused on "the extent
to which there are variations among EPA's regional offices in enforcing
environmental requirements." The report's "Objectives, Scope, and
Methodology" section further clarified that while variation among states'
enforcement programs has also been the subject of study by various
organizations, "[GAO] examined such variations only to the extent that they
provide insights into the actions of, and variations among, EPA's regional
programs."

EPA also said that the draft report did not define consistency or provide
parameters for defining consistency in a way that would be instructive for
EPA. At the outset of its review, GAO worked with EPA headquarters
enforcement staff to identify the criteria or areas where EPA would expect
to see consistency among regions in conducting enforcement programs and
overseeing state delegated programs. These staff identified several elements
that should be "consistent or largely consistent." These elements included
such issues as whether inspections consistently detect noncompliance; the
selection of enforcement response; the manner in which enforcement data are
entered into databases and used for performance measurement; and whether
comparable penalties are imposed for like offenses. During its field work,
GAO discussed these elements with EPA regional and state officials, who
generally concurred that these elements should be largely consistent from
region to region.

EPA noted that GAO's draft report did not identify any inconsistent
enforcement results or present evidence that unequal treatment of similarly
situated violators is occurring. GAO met with EPA officials on several
occasions to explore the possibilities of identifying similar violations in
different regions to allow for such cross-regional comparisons. EPA staff
pointed out that such an approach would require detailed follow-up work for
each violation to determine the specific circumstances in each case. The
staff also acknowledged that regardless of the follow-up work conducted,
questions could still be raised as to whether the selected violations were
truly comparable. Consequently, GAO focused its review on the elements of
EPA's enforcement program that are most likely to determine whether
consistent treatment of violators is likely to occur.

EPA said that GAO's draft report incorrectly implied that in a number of
areas of program management, variation is inappropriate and that it is a
widespread problem. GAO believes the report neither stated nor implied that
variation was widespread or always inappropriate. GAO
believes it took a cautious approach in characterizing both the extent and
appropriateness of variation. The draft report states, for example, that
variation in some cases may represent ". . . the appropriate exercise of
flexibility by regions and states to apply national program goals to their
unique circumstances."

EPA emphasized that it has principles and management mechanisms that ensure
national consistency among its regional enforcement programs. The draft
report did include a description of many of these principles and mechanisms
but was revised to provide a fuller description of these items in response
to EPA's comment. Importantly, however, GAO's findings suggest that the
effectiveness of principles and management systems in "ensuring" consistency
depends heavily on their implementation by the regions.

EPA disagreed with GAO's recommendation that the agency's Program Status
Reports include the contextual information needed to help EPA management and
the public better understand raw data characterizing regional performance.
EPA noted that the reports are not intended for public distribution and,
consequently, do not "need contextual information … since they are
designed to be used by Agency program managers who understand how to use
them." GAO disagrees with this statement. First, past experience indicates
that whether intended or not as public documents, the Program Status Reports
will likely be made public and will be used by interested parties.
Consequently, the contextual information explaining the variations is
essential if the reports are to clarify, rather than confuse, the public's
interpretation of the data. Second, while EPA notes that the reports are
designed for agency managers "who know how to use them," GAO's experience
during this review indicates that without better contextual information,
even agency managers have had difficulty interpreting the raw data to
determine the extent to which variations are problematic, whether they are
preventable, or whether they represented the appropriate exercise of
flexibility.

EPA did not respond directly to GAO's recommendation that the agency issue
guidance identifying elements that should be common to all regions' state
oversight audit protocols. However, the agency expressed concern about the
comprehensiveness of some of the protocols, noting that they "do not all
review State performance against all national policies, including the 1986
State Guidance, other national policies, and the [Memorandum of Agreement]
process." GAO acknowledges EPA's concern about the comprehensiveness of the
various protocols being tested in different regions and continues to believe
that the recommended guidance would help to address the problem identified
by EPA while still allowing each region to tailor its protocol to meet its
unique circumstances.

EPA's comments and GAO's responses are discussed in detail at the end of
chapters 2 and 4. The full text of EPA's comments and GAO's responses are
included in appendix I.

Introduction

Since the Environmental Protection Agency's (EPA) creation in 1970, the
agency has been responsible for enforcing the nation's environmental laws.
This responsibility has traditionally involved monitoring compliance by
those in the regulated community (such as factories or small businesses that
release pollutants into the environment or use hazardous chemicals),
ensuring that violations are properly identified and reported, and ensuring
that timely and appropriate enforcement actions are taken against violators
when necessary.

Most major federal environmental statutes permit EPA to grant states meeting
specified requirements the authority to implement key programs and to
enforce their requirements. EPA establishes by regulation the requirements
for state enforcement authority, such as the authority to seek injunctive
relief1 and civil and criminal penalties. EPA also outlines by policy and
guidance its views as to the elements of an acceptable state enforcement
program, such as necessary legislative authorities and the type and timing
of the action for various violations, and tracks how well states comply.
Environmental statutes generally provide authority for EPA to take
appropriate enforcement action against violators in states that have been
delegated authority for these programs when states fail to initiate
enforcement action. The statutes also provide that EPA may withdraw approval
of a state's program if the program is not administered or enforced
adequately.

EPA administers its environmental enforcement responsibilities through its
headquarters Office of Enforcement and Compliance Assurance (OECA). While
OECA provides overall direction on enforcement policies, and sometimes takes
direct enforcement action, it carries out much of its enforcement
responsibility through its 10 regional offices. (See fig. 1.) These
offices are responsible for taking direct enforcement action and for
overseeing the enforcement programs of state agencies in those instances in
which the state has been delegated such enforcement authority.

Figure 1: EPA's 10 Regions and Regional Office Locations

Although EPA acknowledges that some variation in environmental enforcement
is necessary to take into account local conditions and local concerns, the
agency maintains that core enforcement requirements must nonetheless be
consistently implemented. EPA
also maintains that to ensure fairness and equitable treatment, like
violations in different regions of the country should be met with comparable
enforcement responses.

EPA and State Roles in Enforcing Environmental Programs

Most major federal environmental statutes allow EPA to delegate
responsibility to states to administer environmental programs. One of the
key conditions for delegating this responsibility to a state is that the
state acquire and maintain adequate authority to enforce the federal law.
For example, to obtain EPA approval to administer the Clean Air Act's title
V permitting program for major air pollution sources,2 states must have,
among other things, adequate authority to ensure compliance with title V
permitting requirements and to enforce permits, including authority to
recover civil penalties and provide appropriate criminal penalties.3
Similarly, the Clean Water Act allows EPA to approve state water pollution
programs under the National Pollutant Discharge Elimination System if the
state programs contain, among other things, adequate authority to issue
permits that ensure compliance with applicable requirements of the act, and
to abate violations, using civil and criminal penalties and other ways and
means of enforcement.4 For permitting programs, such as those authorized by
the Clean Air Act and Clean Water Act, facilities either report periodically
to the cognizant state or federal regulatory authority on whether they are
in compliance with their permit, or are subject to periodic inspections that
check for compliance.

EPA develops enforcement policies for these programs. The enforcement
policies outline EPA's traditional regulatory approach to enforcement,
including what constitutes a violation--especially the significant
violations that are likely to require an enforcement action. When a
violation is discovered, the policies generally require an escalating series
of enforcement actions, depending on the seriousness of the violation and
the facility's level of cooperation in correcting it. Actions might start
with a verbal warning, or a warning letter, and escalate to administrative
orders to change the facility's practices. These enforcement policies also
define timely and appropriate enforcement actions for various types of
violations. In the most serious cases, EPA or the states can assess
penalties or refer the case to the U.S. Department of Justice or a state's
Office of the Attorney General for prosecution. The monetary penalties EPA
assesses include two amounts: one amount based on the seriousness of the
violation and the other amount designed to remove any financial advantage
the violator obtained over its competitors through noncompliance. EPA may
also pursue criminal enforcement action if the situation warrants.
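
The two-part structure of these monetary penalties amounts to simple
addition, as the following sketch illustrates. The figures and function are
hypothetical; actual amounts are set by program-specific penalty policies
and the circumstances of each case.

    def assessed_penalty(gravity_amount, economic_benefit):
        # Illustrative two-part penalty: an amount reflecting the
        # seriousness of the violation plus an amount removing the
        # financial advantage gained through noncompliance.
        return gravity_amount + economic_benefit

    # Hypothetical case: $40,000 based on the violation's seriousness
    # plus $25,000 in compliance costs the violator avoided.
    print(assessed_penalty(40_000, 25_000))  # 65000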

Whether EPA or state personnel take the lead in taking enforcement actions
depends on whether the state has been delegated the authority to administer
the program. If EPA retains the program, the cognizant EPA regional office
generally takes the lead in taking enforcement actions, often with support
and/or guidance from EPA headquarters program offices, OECA, and the Office
of General Counsel.

In situations in which the state has been delegated authority to administer
the program, EPA's enforcement policies provide guidance to the states.
Moreover, EPA's regions and the states work together each year to establish
enforcement expectations and lay out their respective roles. EPA also
provides grant funds to states to assist in the implementation of the
federal programs and can, under certain circumstances, condition receipt of
grant funds on compliance with EPA guidance.

EPA oversees the states' enforcement in a variety of ways, including
reviewing inspection reports and enforcement actions, and accompanying state
inspectors. EPA also requires states to report information on various
aspects of their enforcement efforts, such as the number and type of
inspections the state has conducted, the results of those inspections, and
any
enforcement actions resulting from discovered violations. EPA's enforcement
policy under the Clean Air Act and Clean Water Act is concentrated primarily
on large facilities and large sources of pollution. States have more
autonomy to determine how they will enforce the law at smaller sources and
smaller facilities.

EPA Tries to Achieve Consistency in Its Enforcement Programs Nationwide

EPA has established consistent principles to define a quality enforcement
and compliance program. State guidance, providing the framework for
state/EPA enforcement agreements, has been in place since 1986. According to
EPA, this state guidance, together with statute-specific guidance, is the
blueprint for both EPA and state enforcement and compliance programs and
serves as the basis for both authorizing and reviewing state programs.
Additionally, EPA has established (1) enforcement response policies that
classify types of violations, appropriate responses, and the timeline in
which those violations must be addressed; (2) penalty policies that specify
the dollar amount assigned to classes of violations; and (3) a national
model that can be used for the recovery of economic benefit so that
violators do not gain an economic advantage over law-abiding competitors.
EPA has also established a management system that includes mechanisms to (1)
agree on enforcement priorities at the national, regional, and state levels;
(2) allow EPA to determine whether states are adhering to national policies
and principles; and (3) provide oversight of both regional and state
enforcement of environmental laws throughout the nation. The management
system also includes strategies and procedures for compliance monitoring.
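
The following sketch illustrates, in schematic form, how an enforcement
response policy of the kind described above pairs each class of violation
with an expected response and a timeline. The classes, responses, and day
counts shown are invented for illustration; actual response policies are
statute- and program-specific.

    # Hypothetical enforcement response policy table: violation class
    # mapped to (expected response, days allowed to address it).
    # Classes, responses, and timelines are illustrative assumptions.
    RESPONSE_POLICY = {
        "minor": ("warning letter", 90),
        "significant": ("administrative compliance order", 60),
        "egregious": ("judicial referral", 30),
    }

    def required_response(violation_class):
        response, deadline_days = RESPONSE_POLICY[violation_class]
        return f"{response} within {deadline_days} days"

    print(required_response("significant"))
    # administrative compliance order within 60 days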

OECA expects the regions to take a systematic approach to administering and
overseeing both delegated and nondelegated enforcement programs and, in
doing so, to follow the policies and guidance issued for
this purpose. While federal and state enforcement officials agree that core
enforcement requirements should be generally implemented consistently, some
variation is to be expected--and, in some cases, encouraged. According to
EPA, for example, some variation is to be expected in how regions target
resources to the most significant compliance issues in different regions and
states, the level of enforcement activity (which should vary with the
severity of the problem), and the level of regional oversight of state
enforcement programs (with the greater oversight provided for weaker
programs).

EPA officials use a number of methods to oversee regional and state
enforcement programs. An important first step undertaken every 2 years
between EPA headquarters and the regions is the Memorandum of Agreement,
which contains the core program requirements and national priorities that
both headquarters and the regions agree must be addressed. In addition to
the national priorities, the agreements with each individual region contain
region-specific priorities that are reviewed and approved by OECA. The
regions share this agreement with their states so that all key parties
understand the regions' goals and commitments with headquarters. Senior OECA
managers visit the regions during the year to review regional progress in
meeting the agreed-upon enforcement goals and commitments in the memorandum
and to make mid-year corrections. OECA also sponsors national meetings and
routinely scheduled conference calls between headquarters and regional media
program staff and conducts periodic evaluations of regional enforcement
programs. EPA regional enforcement program staff frequently communicate with
state enforcement staff through routinely scheduled telephone conferences.
In addition, a number of regions have implemented protocols for overseeing
state performance.

Objectives, Scope, and Methodology

This report examines (1) the extent to which there are variations among
EPA's regional offices in the actions they take to enforce environmental
requirements, (2) what factors contribute to the variations in regional
enforcement activities, and (3) what the agency is doing to achieve greater
consistency in regional enforcement activities.

Our review examined the extent of variation among EPA regional enforcement
programs, focusing in particular on air and water pollution programs. While
variation among states' enforcement programs has been the subject of study
by a number of organizations, we examined such variations only to the extent
that they provide insights into the actions of, and variations among, EPA's
regional programs.

To respond to the first objective, we examined past studies by GAO, OECA,
EPA's Office of Inspector General, and other organizations to ascertain the
elements of program implementation where issues of inconsistent regional
performance may exist (e.g., the extent to which variations exist in the
penalties assessed in different regions for comparable violations and
variations in the number and type of enforcement actions taken). We also
examined the most current available EPA data on a variety of these elements,
much of which is in the agency's Reporting for Enforcement and Compliance
Assurance Priorities system. This system compiles information from other
databases containing compliance and enforcement information, such as the
Permit Compliance System for water programs, the Aerometric Information
Retrieval System Facility Subsystem (AFS) for air programs, and the EPA
civil docket. These data are summarized in EPA's April 1999 Measures of
Success Management Report, which was used for some of the information
presented in chapter 2 of this report. In situations where we needed more
detailed information, we obtained OECA's assistance in extracting and
interpreting the additional information from its databases.

We did not perform an independent test of the data's accuracy and
completeness. However, we did seek information concerning their accuracy and
completeness in our interviews with headquarters, regional, and state
officials and by examining past studies and EPA documentation.

To supplement our interpretation of these data and to address the second and
third objectives, we interviewed officials responsible for enforcement
issues, and more specifically for the enforcement of air and water programs
in three EPA regions--Chicago, Philadelphia, and Seattle. We selected these
regions based largely on the history of enforcement performance (as
indicated by EPA's national databases and prior work by EPA's Inspector
General and us) and on a desire to assess regions in different geographical
settings and with different environmental and regulatory issues. (For
example, the Chicago region is more industrialized than most other regions,
and the Seattle region includes both states that have been delegated air or
water programs and states that have not.) To obtain diverse state
perspectives within each region on the key research questions for this
review, we selected two states within each region for detailed study. These
states included Maryland and West Virginia in EPA's Philadelphia region;
Indiana and Ohio in the Chicago region; and Idaho and Washington in the
Seattle region. We also obtained pertinent data and other documentation from
these officials. In some cases, we contacted officials in other regions and
in other states to obtain additional perspectives and to substantiate
findings from the states and regions we selected for detailed study. We
verified statements attributed to these state officials and other
information provided by them in our draft report.

We also contacted senior enforcement officials at EPA headquarters to ensure
that we had current information on agency regulations, policies, and
guidance and to obtain their perspectives and other information on our
issues of inquiry. In addition, we contacted other groups to obtain a
national perspective on these issues, including the Association of State and
Interstate Water Pollution Control Administrators, the Environmental Council
of the States, the Environmental Law Institute, the Environmental Working
Group, the National Petrochemical and Refiners' Association, and the State
and Territorial Air Pollution Program Administrators.

We conducted our work from July 1999 through March 2000 in accordance with
generally accepted government auditing standards.

Variations in Regional Enforcement Activities

Variations exist among EPA's regional offices in the actions they take to
enforce environmental requirements, as illustrated by a number of key
indicators that EPA headquarters enforcement officials have used to monitor
regional performance. These indicators include (1) the percentage of
regulated facilities that are inspected by EPA and/or state enforcement
staff for compliance (and the comprehensiveness of those inspections), (2)
the number and type of enforcement actions taken, (3) the size of the
penalties assessed and the criteria used in determining the penalties, and
(4) the extent to which violations are referred to the Department of
Justice. The types of variations conveyed by these data corroborate earlier
findings detailed in a series of reports by EPA's Office of Inspector
General and by OECA's own internal evaluations. We also found variations in
regions' overall strategies for auditing state enforcement programs to
determine whether program requirements are being met.

OECA officials emphasize that enforcement data, by themselves, do not offer
the appropriate context to help determine the extent to which the variations
pose problems. Rather, the officials maintain that the data are a useful
starting point for identifying general trends and possible strengths and
weaknesses in regional and state programs, along with potential issues to
investigate at greater length.

EPA's enforcement program depends heavily upon inspections by EPA regional
and/or state enforcement staff as the primary means of detecting violations
and evaluating overall facility compliance. Thus, EPA maintains that the
quality and the content of the inspections, and the number of inspections
conducted to ensure adequate coverage, are important indicators of an
enforcement program's success. Where programs are delegated to the states,
regional offices retain oversight responsibility for ensuring that state
inspection programs meet EPA's criteria in terms of how inspections are
conducted and how the results are reported.

Data in OECA's most recent Measures of Success Management Report do, in
fact, show that there is wide variation in inspection rates nationwide. The
April 1999 report noted, for example, that for fiscal year 1998, regional
and state inspection coverage for Clean Air Act-related programs ranged from
a low of 27 percent of facilities inspected in the Chicago region to a high
of 74 percent for facilities in the Philadelphia region. For major
dischargers under the Clean Water Act, inspection coverage also varied from
a low of 57 percent of facilities in the Denver region to a high of 92
percent in the Atlanta region. The example, however, also illustrates the
importance of getting behind the data to understand the causes of apparently
wide disparities and whether they reflect a problem. OECA's Deputy
Assistant Administrator said that the Chicago office's 27-percent inspection
figure could be appropriately explained by that office's recent emphasis on
conducting detailed investigations of the region's numerous electric power
plants. According to this official, such investigations can be extremely
resource-intensive, and the resources to conduct them would most likely
come from the region's budget for inspections.

In addition to the variation in the percentage of facilities inspected,
EPA's Inspector General reports in recent years have documented variations
in the comprehensiveness of inspections. In its 1998 consolidated report on
air enforcement audits completed in EPA's Boston, Philadelphia, Dallas, and
Seattle offices, the Inspector General points out that to adequately
evaluate a facility's compliance with the Clean Air Act, it is necessary to
perform at least a level 2 inspection--one designed in sufficient detail to
detect violations--at major stationary sources.
The report found that in four of the six states included in the audit,
inspectors did not always complete the tests required for such an inspection
and noted that the regions did not ensure that state inspectors completed
the tests and evaluations required.

In response to the Inspector General's report, OECA contracted to undertake
a review of its 1991 compliance monitoring strategy for the air program to
determine the extent to which it is used and whether it needs to be updated.
The
compliance monitoring strategy contains guidance for regions and states for,
among other things, determining sources to target for inspections and the
appropriate level of inspection that should be conducted. The contractor's
July 1999 review documented considerable variations among the 10 regions in
their approach toward the compliance monitoring strategy. Specifically, the
review noted that only five regions implement major components of the
strategy. The other five regions reported that they do not implement the
strategy and engage in only minimal inspection planning and oversight with
their states. According to the study, almost all regions agreed that
compliance monitoring requires some guiding method or strategy, but many
were resistant to any guidance that was highly prescriptive and imposed new
requirements on the states.

While EPA recognizes that differences may exist in the choices that regions
and states make in selecting enforcement actions, agency officials maintain
that enforcement responses selected should be directly related to the
severity of the violation and that like violations should generally be met
with comparable penalties. As discussed later, EPA data show that the number
and type of formal enforcement actions have varied considerably from one
regional office to another.

Where a facility fails to achieve compliance within a specified period of
time and/or fails to respond to an informal action such as a notice of
violation, EPA and approved states may proceed with a formal enforcement
action. A formal enforcement action requires compliance, lays out a specific
timetable for completing certain items, contains consequences for not doing
so, and subjects a person or facility to adverse legal consequences for
noncompliance. Such actions may be imposed either administratively by the
region or state enforcing agency, or judicially by the courts.

Administrative actions can generally be processed more quickly and easily
than can civil judicial actions. In choosing an administrative action, the
region or state issues an administrative compliance order requiring the
facility to return to compliance within a certain time frame and/or an
administrative penalty order, which assesses a certain dollar amount.

Civil judicial actions tend to be more complex and resource-intensive than
administrative actions. At the federal level, such actions are generally
initiated by regional offices and then referred to the Justice Department.
Justice, in turn, generally files a formal lawsuit in U.S. federal court.
This course of action is generally used for cases that may set a precedent,
involve serious environmental harm, or involve a violator deemed likely to
be uncooperative. For state-delegated agencies, such judicial remedies are
sought through the state's Office of the Attorney General. Judicial actions
generally result in penalties and court orders requiring correction of the
violation, along with specific actions to prevent future violations, and
tend to be taken more seriously by the regulated community.

Simply presenting the absolute numbers of enforcement actions across
regional offices does not take into account important factors, such as the
varying size of different regions and the enforcement resources available to
them. Accordingly, the agency has accounted for these factors by comparing
the number of actions to the size of the enforcement staff in each region.
Thus, for example, OECA's Measures of Success Management Report observes
that the Chicago office made considerably more civil referrals to Justice
for judicial action--as a function of its available enforcement
resources--than did other regions. Similarly, the report's data show
that the Philadelphia office obtained the greatest percentage of civil
judicial penalties in comparison to its allocation of resources. On the
other hand, citing corresponding data from the previous year, OECA's
evaluation of the Seattle office's performance observed that the region was
less "productive" when comparing its enforcement activity to its allocation
of resources. The study noted specifically that while the region was
allocated 6.5 percent of the national enforcement resources, it produced
only 4.5 percent of the nation's civil judicial referrals and only 4.3
percent of its administrative penalty orders.
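
The arithmetic behind such comparisons is simple and can be expressed
compactly. The following sketch, in Python, uses the Seattle figures cited
above; the function and variable names are our own illustration, not part
of any EPA system.

    # Illustrative sketch of OECA's resource-normalized comparison, using
    # the Seattle figures cited above. Names are hypothetical, not EPA's.

    def productivity_ratio(output_share: float, resource_share: float) -> float:
        """Ratio of a region's share of national output to its share of
        national enforcement resources; 1.0 means output is exactly
        proportional to resources."""
        return output_share / resource_share

    seattle_resource_share = 0.065  # 6.5 percent of national enforcement resources
    referral_share = 0.045          # 4.5 percent of civil judicial referrals
    penalty_order_share = 0.043     # 4.3 percent of administrative penalty orders

    print(round(productivity_ratio(referral_share, seattle_resource_share), 2))       # 0.69
    print(round(productivity_ratio(penalty_order_share, seattle_resource_share), 2))  # 0.66

A ratio below 1.0 flags a region whose output share lags its resource
share, which is the sense in which OECA described Seattle as less
"productive."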

OECA notes that the complexity of a region's enforcement actions may affect
its relative productivity as compared with other regions. In its 1998
evaluation of the Chicago office, OECA reported that while the air, water,
and waste programs historically produced lower outputs of administrative
penalty orders relative to their enforcement resources, these numbers needed
to be considered in light of the office's higher outputs of more
resource-intensive judicial referrals by each of these programs. For
example, in fiscal year 1997 the Chicago air program, with 24 percent of all
regions' air resources, produced about 30 percent of all regions' judicial
referrals. The Chicago water and waste programs in fiscal year 1997 also
outpaced their enforcement resources in producing judicial referrals.

The point that reliance on more complex actions can affect a region's
productivity was reinforced by OECA water enforcement officials, who
provided comparative information among the regions concerning their
enforcement of the National Pollutant Discharge Elimination System under
the Clean Water Act.5 The
officials noted that the Chicago office emphasizes civil judicial actions
and currently has a large docket of open judicial cases. They noted that the
region's choice of these more complex, resource-intensive actions could help
to explain the lower numbers in other aspects of its enforcement program
(e.g., the lower inspection coverage previously discussed). The officials
also cited other regions in which high numbers of administrative orders may
reflect an inventory of more straightforward cases, or in which the
region chooses lower-level actions and does not escalate them when
compliance is not achieved.

The EPA Inspector General's 1997 consolidated audit report for Chicago,
Dallas, and San Francisco regional air enforcement programs also considered
the productivity of the three regions in its review of regional actions and
similarly found variation. The report found that with approximately 50
percent of the resources among the three regions studied, the Chicago office
completed approximately 50 percent of the actions. However, with 24 percent
of the regions' resources, the Dallas office was responsible for 3 percent
of the actions completed, and with 22 percent of the regions' resources, the
San Francisco office completed 42 percent of the actions. The report noted
that the Chicago office had more staff and thus could reasonably be
expected to complete more actions. However, the Dallas and San Francisco offices
had similar resource levels but widely different numbers of enforcement
actions completed. The report suggested that the disparity between the
Dallas and San Francisco offices may be due to factors other than resources
(such as a region's attitude toward enforcement and the type of industry in
each region), but it did not evaluate the impact of these factors.

Penalties play a key role in environmental enforcement by deterring
potential violators and by ensuring that members of the regulated community
cannot gain a competitive advantage by violating environmental regulations.
EPA's penalty policy provides that all penalties should include two
components. The first is an "economic benefit" component that reflects the
benefit achieved by avoiding compliance. The economic benefit component is
considered important to "level the playing field" among companies within an
industry and to eliminate any economic advantage violators gain through
delayed or avoided compliance costs. The second component is called a
"gravity-based component," which reflects the seriousness of the violation,
the actual or possible harm it causes, and the size of the violator.

EPA's penalty policies provide that, at a minimum, a penalty should remove
any significant economic benefit resulting from noncompliance but allow
negotiators more flexibility in assessing the gravity component. For
example, EPA's small business and self-policing policies allow mitigation or
the elimination of the gravity portion of a civil penalty for qualifying
companies. Recognizing that it may not always be feasible to recoup the full
amount of the economic benefit of noncompliance plus some amount based on
the gravity of the violation, the agency's policy allows for mitigating or
adjusting the penalty to a lesser amount. Such a decision may be reached in
cases where the violator demonstrates an inability to pay the full penalty
or where the risks and costs to litigate a case justify a smaller amount.
The policy requires documentation of the amount of the economic benefit and
any decision to lessen a penalty.
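
A minimal sketch of this two-component structure, assuming hypothetical
dollar amounts and treating mitigation as a simple fraction applied to the
gravity component only, might look like the following; it illustrates the
policy's logic, not EPA's actual calculation model.

    # Simplified illustration of the two-component penalty structure
    # described above: the economic benefit is recovered in full, while
    # the gravity-based component may be mitigated (e.g., for qualifying
    # small businesses or documented inability to pay). All values and
    # names are hypothetical.

    def assess_penalty(economic_benefit: float, gravity: float,
                       gravity_mitigation: float = 0.0) -> float:
        """Return a penalty that recovers the full economic benefit of
        noncompliance and applies any mitigation (a fraction between 0
        and 1) to the gravity component only."""
        assert 0.0 <= gravity_mitigation <= 1.0
        return economic_benefit + gravity * (1.0 - gravity_mitigation)

    # A violator that avoided $50,000 in compliance costs, with a $30,000
    # gravity component mitigated by half:
    print(assess_penalty(50_000, 30_000, gravity_mitigation=0.5))  # 65000.0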

Fiscal year 1998 data published in OECA's Measures of Success Management
Report show that the number and size of administrative penalties assessed by
regional offices varied. The data show, for example, that the Philadelphia
office assessed $422,000 in Clean Air Act administrative penalties for 22
cases during fiscal year 1998, and the Chicago office assessed over $1
million for 27 cases. On the other hand, the Seattle office assessed $10,000
for one case. The disparity was less pronounced for the regions' Clean Water
Act programs: the Philadelphia office assessed $523,000 in administrative
penalties for 23 cases during fiscal year 1998, the Chicago office assessed
over $1 million for 39 cases, and the Seattle office assessed $441,000 for
25 cases.

The Inspector General's 1997 consolidated report for the air enforcement
program in Chicago, Dallas, and San Francisco offices similarly found that
over an 18-month period, the regions varied in both the number of
administrative actions completed and penalties assessed. Specifically, the
Chicago office completed 33 actions and assessed more than $6 million in
penalties; the San Francisco office completed 25 actions and assessed about
$3.5 million in penalties; and the Dallas office completed 2 actions and
assessed penalties of just over $100,000.

The Inspector General also found that penalties assessed varied
significantly among the states in the three regions and that the regions
could do more to address the variations. The primary factor appeared to be
that the state penalty assessments did not always consider or assess the
economic benefit of noncompliance. The 1997 report recommended, among other
things, that the regions (1) hold discussions about recovering economic
benefit with the states when negotiating various EPA-state agreements and
(2) assist states in calculating economic benefit and in securing state
legal authority where necessary. At an EPA/State Enforcement Forum in
February 2000, EPA discussed the importance of recovering economic benefit
and followed up in March 2000 with a letter to state commissioners that
included a fact sheet and sources for training and help in calculating and
recovering economic benefit.

On average, civil judicial actions result in significantly higher penalties
than do administrative actions. For example, according to fiscal year 1998
data in OECA's Measures of Success Management Report, the average civil
judicial penalty for Clean Air Act programs was $603,453, compared with an
average administrative penalty of $17,656. Regions varied, however, in the
percentage of their overall penalties obtained through civil judicial
rather than administrative action. The Dallas office, for example, obtained
62 percent of its penalties ($1.46 million) through civil judicial action,
while the Chicago office obtained over 87 percent of its penalties ($7.22
million) that way.

Regions refer larger, more complex cases involving violators of
environmental laws, as well as cases they believe may set a precedent, to
the Department of Justice. The Department then brings these cases to U.S.
federal court. Settlements and agreements reached between the agency and
responsible parties are formalized in court-approved consent decrees.
Consent decrees for civil judicial cases generally contain provisions for
penalties, requirements for correction of the violations, and specific
actions to prevent future violations.

OECA also maintains that it is important to track the status of all active
consent decrees to ensure that such agreements are carried out. Accordingly,
each region is required to maintain a database of consent decree milestones
and to track defendants' current compliance with the decrees.
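
As a rough sketch of the kind of record such a database would hold, the
structure below tracks decree milestones and derives a compliance status
from them; the field names are our own illustration, not OECA's actual
schema.

    # Hypothetical sketch of a consent decree tracking record of the kind
    # each region is required to maintain. Field names are illustrative,
    # not OECA's actual database schema.
    from dataclasses import dataclass, field
    from datetime import date

    @dataclass
    class Milestone:
        description: str
        due: date
        completed: bool = False

    @dataclass
    class ConsentDecree:
        case_id: str
        defendant: str
        milestones: list = field(default_factory=list)

        def in_compliance(self, as_of: date) -> bool:
            """A defendant is out of compliance if any milestone past its
            due date remains incomplete."""
            return all(m.completed for m in self.milestones if m.due <= as_of)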

OECA data show variation both (1) in the extent to which regions refer
violations of consent decrees to Justice and (2) in the extent to which the
regions track the status of compliance with the consent decrees.
Regarding the extent of referrals, OECA's Measures of Success Management
Report concluded that, "There continues to be a wide disparity among the
regions in terms of how frequently they refer violations of consent decrees
to the Department of Justice." The report shows that from fiscal years 1990
through 1998, the New York and Chicago offices referred about 60 and 40
consent decree violations, respectively, to Justice; the Boston and
Philadelphia offices each referred between 10 and 20 consent decree
violations; and the Atlanta, Kansas City, San Francisco, and Seattle offices
each referred fewer than 10 consent decree violations. The data indicate that
the Dallas and Denver offices did not refer any violations of consent
decrees to Justice over the same 9-year period.

Tracking and reporting on the status of consent decree implementation also
varied widely, based on data submitted by the regions to EPA headquarters by
the end of fiscal year 1998. The Dallas, Kansas City, San Francisco, and
Seattle offices did not report or did not know the status of any of their
active cases. On the other hand, the Atlanta, Chicago, and Philadelphia
offices reported that they had tracked 97 percent of their cases.

EPA Inspector General audit reports and OECA regional evaluations issued
since the mid-1990s found that regional oversight of state-delegated
programs had been reduced or varied considerably from program to program.
The Inspector General and OECA reports criticized the regions for inadequate
oversight of state programs. Among other things, the reports cited the
regions for not conducting an adequate number of oversight inspections; not
sufficiently encouraging states to consider economic benefit in calculating
penalties; not taking more direct federal action where states were slow to
act; and not requiring states to report all significant violations. Regional
officials acknowledged that at least to some extent, the criticisms were
valid. Seattle officials, for example, explained that there had been a
downturn in oversight activities in their region for some time prior to the
Inspector General reports. They noted that decisions were made to conduct
less oversight because (1) their own resources were inadequate to continue
all their oversight activities and (2) their agreements with states to build
more cooperative partnerships warranted less intense and detailed review.

In recent years, a number of regions have begun to develop and implement
state audit protocols in response to these criticisms. The regions also
noted that having such a protocol--which lays out the type of oversight
inspections that will be conducted and the specific program elements that
will be examined--could help them to review the state programs within their
jurisdiction with greater consistency. In most cases, the protocols were
developed with the support and participation of the states within each
region and exhibit a number of key differences from one region to another.

The Boston region's protocol is unique in that the region simultaneously
examines a state's entire array of delegated air, water, and waste
programs. Given the
resource-intensive nature of the audit, the region is only able to examine
one state every year and a half. Regional enforcement officials commented
that this approach provides both the region and the states with a holistic
view of their enforcement efforts. The officials also pointed to one
particular case in which they were able to make the head of the state's
environmental agency aware that the waste management division took a
considerably more stringent approach to enforcement than the water division.
The officials said the state commissioner took steps to resolve the
differences so that the enforcement programs were more consistent. Regional
officials said that absent the multimedia audit, such a finding would not
have been so readily apparent to them.

Philadelphia regional enforcement officials, however, told us that the
Boston office's multimedia audit would not likely be acceptable to states in
their region. They explained that in contrast to the Boston office, which
has a centralized organizational structure with a multimedia focus, the
Philadelphia office focuses its efforts on individual environmental
programs. Accordingly, officials in Philadelphia's regional air protection
division developed an audit protocol, in concert with states in the region,
to aid the region in its oversight and evaluation of the effectiveness of
state-delegated air enforcement programs. The audits are to be conducted by
Philadelphia air enforcement staff along with state enforcement staff. The
Deputy Director of West Virginia's Division of Environmental Protection said
that the agency not only participated in the development of the Philadelphia
air enforcement protocol, but also volunteered its air enforcement program
to undergo one of the first audits, believing that the audit could help
identify strengths and weaknesses in the program. West Virginia officials are
currently awaiting the results.

The Seattle office and its states jointly developed a set of Compliance
Assurance Program Evaluation Principles that both parties agree define the
elements of a successful compliance assurance program and constitute a
broad framework for evaluating programs delegated to the states. For example, the
principles identify what environmental results are to be achieved and how
they will be measured; what kinds of facilities will be targeted for
inspection; and what process is to be used by the region and its states to
resolve disputes. In addition, regional enforcement officials negotiate
separate Compliance Assurance Agreements with their states for each
delegated program, outlining in more detail the responsibilities for
conducting certain functions, including regional oversight. Based on the
Compliance Assurance Program Evaluation Principles, the region has
undertaken evaluations of various media programs in two of its delegated
states. The first reviews, conducted in Oregon, were focused reviews of
compliance assurance programs for separate environmental media.
Subsequently, a contextual review was conducted of the Washington State air
program. This review was broader in scope and addressed the full range of
the state air program activities, including the compliance assurance
component.

OECA's 1998 evaluation of the Seattle office's performance, however, raised
concerns that the principles may conflict with agency policy and limit EPA's
oversight authority to certain areas. Also, OECA noted that it is not clear
whether certain provisions in the principles document limit EPA's ability to
(1) take issue with a state's policy-level decision not to seek economic
benefit or (2) review a state's decision concerning allocations of resources
that affects the implementation of the state enforcement program. Therefore,
OECA recommended that the Seattle office ensure that the states understand
that EPA retains the latitude to take the enforcement lead in instances
described in its guidance and to take the lead in any case in which the
agency determines that there are issues of national significance or
precedent that require federal action.

Officials from EPA's Chicago office had mixed opinions as to whether an
established audit protocol would improve their oversight of their states.
Chicago's water enforcement branch chief and the region's enforcement
coordinator said that they would consider developing a protocol for their
respective programs, although they do not have one underway at this time.
The region's air enforcement and compliance assurance branch chief, however,
said that he did not believe an audit protocol would be particularly useful,
noting instead that he prefers regional staff to engage in joint inspections
with states to assess their performance and to take direct federal action
where a state action is inadequate.

To supplement the information from EPA's databases and recent studies, we
interviewed officials from the states and EPA regional offices and
headquarters on their perceptions as to whether variation exists in key
components of the enforcement program and whether such variations are
problematic. Regional officials generally indicated that it was difficult
for them to assess whether variations are problematic, given their limited
vantage point. Some noted in particular that while they were aware of
various cases and reports of alleged inconsistency among regions through
exchanges at national meetings and through publications, they do not have
sufficient information for an informed opinion. They added that without more
detailed information and a better grasp of the national data, they are
hesitant to suggest that variation or differences among regions are either
extreme or harmful. The state officials we interviewed essentially agreed,
frequently noting that they were mainly concerned that their EPA regional
office apply consistent treatment to all states within the region.

EPA headquarters officials expressed substantially more definitive views on
these issues, perhaps because their vantage point provides them with a
broader overview of the national program. The EPA officials we interviewed
noted that variation is fairly commonplace and that it does in fact pose a
problem. They expressed the concern that in reacting to similar violations,
enforcement responses in certain regions are weaker than they are in other
regions. The director of OECA's water enforcement division noted in
particular that the lack of consistency in regional enforcement of water
programs has worsened over the years.

The director of OECA's air enforcement division said that given the
considerable autonomy regional offices possess, it is not surprising that
there is substantial variability in how they approach enforcement and state
oversight. He illustrated the point with the inspection program.
Specifically, he said that because the air program does not have continuous
monitoring, facilities found in compliance some years ago may fall into
noncompliance without being detected unless they are periodically retested.
Therefore, he said a good indicator of variation is the number and quality
of inspections that a region or state conducts because this tells him
whether a region knows what its states are doing and whether it "has the
will to press the issue." He said that based on the number of inspections,
as well as the number of permits written in relation to the number of
sources, there is clear disparity among both regions and states.

In commenting on the draft report, EPA said it shared our view on the
importance of consistency of regional enforcement, but raised a number of
issues concerning our discussion of it in this chapter. First, the agency
said that the draft report was not clear about the scope of the consistency
being evaluated, particularly whether we were evaluating consistency in
enforcement activity between federal and state programs or among the EPA
regions themselves. The draft report had stated that, as requested, our
evaluation focused on "the extent to which there are variations among EPA's
regional offices in enforcing environmental requirements." Our "Objectives,
Scope, and Methodology" section further clarified that while variation among
states' enforcement programs has also been the subject of study by various
organizations, "we examined such variations only to the extent that they
provide insights into the actions of, and variations among, EPA's regional
programs." This approach is consistent with the approach OECA used in past
evaluations of regional office enforcement programs. For example, its 1998
evaluation of the Seattle office's enforcement program noted that its review
"… includes data and a review of state performance only to the extent
necessary to evaluate the region's performance in overseeing state
compliance and enforcement activity."

Second, EPA said that the draft report did not provide parameters defining
what was meant by "consistency," or discuss where variations might be
acceptable. At the outset of our review, we worked with EPA headquarters
enforcement staff to identify the criteria or areas where EPA would expect
to see consistency among regions in conducting enforcement programs and
overseeing state-delegated programs. The former Director of OECA's Office of
Planning and Policy Analysis identified several elements that should be
"consistent or largely consistent." These elements included such issues as
whether inspections consistently detect noncompliance; the selection of
enforcement response; the manner in which enforcement data are entered into
databases and used for performance measurement; and whether comparable
penalties are imposed for like offenses. During our fieldwork, we discussed
these elements with EPA regional and state officials, who generally
concurred that these elements should be largely consistent from region to
region.

EPA's comment that the draft report did not discuss where variations might
be acceptable is incorrect. For example, the draft report noted EPA's
position that some variation was to be expected in how regions target
resources to the most significant compliance issues, and in the level of
regional oversight of state enforcement programs (with the greater oversight
provided for weaker programs). The draft report also acknowledged that there
were circumstances under which variation may represent a problem, but that
there are also circumstances in which variation "represents the appropriate
exercise of flexibility by regions and states to apply national program
goals to their unique circumstances."

Third, EPA noted that our draft report did not identify any inconsistent
enforcement results or present evidence that unequal treatment of similarly
situated violators is occurring. We met with EPA officials on several
occasions to explore the possibility of identifying similar violations in
different regions to allow for such cross-regional comparisons. EPA staff
pointed out that such an approach would require detailed follow-up work for
each violation to determine the specific circumstances in each case. They
also acknowledged that regardless of the follow-up work conducted, questions
could still be raised as to whether the selected violations were truly
comparable. Consequently, we focused our review on the elements of EPA's
enforcement program that are most likely to determine whether consistent
treatment of violators is likely to occur.

Fourth, EPA said that the draft report implied that, in a number of areas of
program management, variation was inappropriate and was a widespread
problem. We disagree. We believe we took a cautious approach in
characterizing both the extent and appropriateness of variation. For
example, the draft report acknowledged EPA's position that "some variation
is to be expected--and, in some cases, encouraged." It also stated that
variation in some cases may represent "…the appropriate exercise of
flexibility by regions and states to apply national program goals to their
unique circumstances." Indeed, the only instance in which the report
documented a view of a serious and persistent problem was in noting the
observation expressed by some senior OECA managers that variation in
regional approaches to enforcement was fairly commonplace and that it did
pose problems.

Fifth, EPA emphasized that it has principles and management mechanisms that
ensure national consistency among its regional enforcement programs. The
draft report had discussed these principles and mechanisms but was revised
to include a fuller description of them in response to EPA's comment.
Importantly, however, consistent principles and management systems, by
themselves, cannot "ensure" consistency. As documented in this report, and
by past reports of both OECA and EPA's Inspector General, the key is
implementation: the mere existence of enforcement principles and management
systems does not ensure they will be followed.

Last, EPA said the draft report did not sufficiently acknowledge that
specific data on such measures as "penalty amounts" must be analyzed in
conjunction with other facts and circumstances. We agree that specific data
on such measures are not useful indicators of variation unless analyzed in
conjunction with other facts and circumstances, and had made that point in
the draft report. For example, the draft report's executive summary cited
the concerns of OECA officials that "the data, by themselves, do not offer
the appropriate context to help determine the extent to which the variations
pose problems." The draft report also highlighted "the importance of getting
behind the data to understand the cause of apparently wide disparities to
understand whether they reflect a problem." Indeed, as discussed in chapter
4, our concern with EPA's current approach regarding its planned use of
Program Status Reports, which will present region-by-region information on
enforcement practices, is that it makes no provision for the "facts and
circumstances" and other contextual information needed to interpret regional
variation.

Key Factors Contributing to Variations in Regional Enforcement Activities

EPA's data and recent analyses by OECA and EPA's Inspector General show
variation in the quantity and quality of inspections, the number and type of
enforcement actions, and other key elements of the agency's enforcement
program. However, the data themselves do little to explain the causes of the
variation. Without such causal information, it is difficult to determine the
extent to which variation represents a problem, whether it is preventable,
or the extent to which it represents the appropriate exercise of flexibility
by regions and states to apply national program goals to their unique
circumstances. Accordingly, in our visits to regional offices and states and
in our discussions with headquarters officials, we sought to identify the
factors that may be contributing to the variations.

Overall, we found broad agreement among EPA and state enforcement officials
on the factors that contribute to variations in regional enforcement
activities. Among those factors commonly cited were (1) differences in the
approaches among enforcement staff about the best way to achieve compliance
with environmental regulations, (2) differences in state laws and
enforcement authorities and the manner in which regions respond to these
differences, (3) variations in resources available to both state and
regional enforcement offices, (4) the flexibility afforded by EPA policies
and guidance that allow states a degree of latitude in their enforcement
programs, and (5) incomplete and inadequate enforcement data that hamper
OECA's ability to detect variation.

While OECA has issued policies, memorandums, and other documents to guide
regions in their approach to enforcement, the autonomy built into EPA's
decentralized, multilevel structure allows regional offices considerable
latitude in adapting headquarters direction in the manner they believe best
suits their jurisdictions. The majority of regional and
headquarters officials cited differences in approaches to enforcement among
regional staff as accounting for a major share of the variation that exists
among regions' enforcement programs.

Such differences also exist among state enforcement authorities--perhaps
even more so, given their dual accountability to their governors' offices as
well as to EPA. While our review focused on variations in enforcement
practices among regions, the wide variation in approaches among states poses
additional complications for the regions that oversee them. How the regions
respond to widely differing state enforcement approaches offers yet
additional ways in which regions may exhibit variation in the exercise of
their enforcement responsibilities.

Differences at the regional level often reflect alternative views on the
extent to which traditional enforcement measures should be relied upon to
deter noncompliance and to bring violators into compliance, with some
regional staff preferring alternative strategies, such as compliance
assistance (e.g., workshops and site visits to identify potential
compliance problems), over traditional enforcement action. EPA's Chicago
office, for example, has long held a reputation for
having an aggressive enforcement program in which the region would act
quickly and forcefully if it determined that the state was not performing
its responsibilities adequately. A Chicago office official told us that the
region believes that it is important to maintain an "enforcement presence"
in states as a deterrent to the regulated community, in contrast to other
regions that believe having to take an enforcement action is a sign of
failure.

Other variations reflect differences over whether deterrence is best
achieved through a small number of high-profile, resource-intensive cases
or a larger number of smaller cases that establish a more widespread,
albeit lower-profile, enforcement presence. According to a number
of EPA officials we interviewed, these alternative approaches help to
explain some of the discrepancies between regions in the numbers of
enforcement actions they take. For example, a Boston office official told us
that the region tends to focus on a small number of large, high-profile
penalty cases while other regions tend to focus on a large number of small
penalty cases. Similarly, EPA's enforcement data for fiscal years 1996
through 1998 indicate that the Chicago office led regions in pursuing
judicial actions while the Dallas office led in taking administrative
penalty actions.

State enforcement authorities also exhibit differences in their approaches
to enforcement. Variations in states' enforcement approaches can lead not
only to differential treatment of violators from one state to another but
can also create a complicated landscape for regional overseers. In cases
where states do not take sufficiently strong action, each region decides
how far to let a state go before it intervenes.

An enforcement official in the Chicago office told us that each of the
states in the region has a different enforcement philosophy in dealing with
violators. He said states range from those that strive to identify violators
and take strong deterrent actions to those that view themselves as partners
with industry and, therefore, adopt a more cooperative approach. The
official said that such philosophical differences among their states play a
significant role in determining the level of oversight the region exerts
over each state's enforcement program.

In contrast, the EPA Inspector General's 1998 analysis of Idaho's Air
Enforcement Program showed a different regional approach to identified
weaknesses in a state's environmental program. Specifically, the report
found that the state often did not take appropriate enforcement actions, did
not assess sufficient penalties, and did not inspect facilities in
accordance with EPA guidance. The report concluded that the main reason for
these deficiencies was the state's policy of focusing on compliance
assistance rather than enforcement to bring sources back into compliance.
When compliance assistance efforts did not achieve their intended results,
the state failed to pursue enforcement actions against violators. The
Inspector General faulted the Seattle office for entering into an agreement
with the state that did not require it to follow EPA's enforcement
guidance.

OECA acknowledged the particular challenge posed by varying state approaches
in its formal response to the EPA Inspector General's 1997 Consolidated
Review of the Air Enforcement and Compliance Assistance Programs. OECA noted that EPA's
enforcement partnership with the states is complicated by the fact that some
states "...do not place enough emphasis on deterrence of noncompliance
through strong enforcement programs..." and that this "...reflects
differences in philosophy that cannot be addressed solely through more
effective oversight or better technical assistance to states."

Differences in state laws, and in the enforcement authority granted to
environmental agencies by state legislatures, can significantly affect the
operation of state environmental programs. Nearly all regional and
state enforcement officials interviewed agreed that differences in state
laws and enforcement authorities contribute to variations in enforcement
programs. The most commonly cited variations were in states' authority to
(1) resolve compliance problems through administrative action rather than
relying solely on civil judicial action and (2) recover the economic benefit
a violator may have gained through noncompliance.

Enforcement officials in EPA's Chicago office noted that whether a state has
administrative authority to resolve compliance problems can be a significant
factor contributing to variations. If delegated agencies can pursue
violations administratively, they can avoid lengthy delays associated with
going through judicial channels to assess penalties. As noted in the EPA
Inspector General's 1997 Consolidated Review of Air Enforcement and
Compliance Assistance Programs, having such administrative authority is particularly
important to delegated agencies that do not have strong legal support from
their state's Attorney General's office. The Inspector General reported that
limited legal support from Attorney General's offices in California,
Illinois, Indiana, and Wisconsin, combined in some cases with a lack of
administrative order authority, contributed to lengthy delays in resolving
enforcement cases, and in some cases, a reluctance to refer cases to legal
authorities because of the delays.

In those states without administrative order authority to assess penalties,
the region's role becomes more important. West Virginia officials, for
example, noted that while they have the administrative authority to order
violators to correct violations, they do not have the administrative
authority to assess penalties. Nevertheless, West Virginia officials said
they have been successful in negotiating "consent settlements" with
violators in lieu of penalties in large part because the Philadelphia office
is viewed by the regulated community as a credible enforcement threat if the
state's negotiations fail.

Another key difference among state enforcement authorities is the extent to
which they provide for the recovery of the economic benefit of violations.
EPA policies for determining appropriate penalties provide that
consideration be given to a number of factors, one of which is the recovery
of any economic benefit gained by the violator as a result of not complying
with environmental requirements.6 While some states' penalty policies
provide for recovery of economic benefits in accordance with EPA guidelines,
other states' policies do not. Among the six states included in our review,
four either have, or are in the process of developing, written penalty
policies that include economic benefit recovery provisions. Enforcement
officials in West Virginia, a state that does not have a written penalty
policy, said that they consider economic benefit recovery in their penalty
calculations, but do not follow the specific EPA calculation procedures
because they believe those procedures result in excessively high penalties.
Maryland officials said that their state statutes neither require nor list
the use of economic benefit as a determining factor in calculating or
recovering penalties; thus, Maryland's penalty policy does not mention
economic benefit. Officials stated, however, that in assessing penalties
they consider the broad concept of economic benefit but do not believe they
are compelled to follow EPA's guidance in their penalty calculations.

Senior OECA officials acknowledge that variations in states' enforcement
authorities have contributed to wide variation in states' enforcement
capabilities and have noted that their past efforts to address the disparity
have been difficult. They noted, for example, that several years ago they
proposed making economic benefit recovery a requirement for the delegation
of a program under the Resource Conservation and Recovery Act, but the
proposal was dropped in the midst of considerable state resistance. They
further noted that if economic benefit recovery were required and a state
did not adopt the provision, EPA would have to decide whether to take back
delegation from the state altogether. Similarly, if a state did not recover
economic benefits in all cases, EPA would have to decide whether to
"overfile" the state's action with its own enforcement action. Either
alternative could be extremely controversial.

Senior OECA officials point out that resource shortages at both the federal
and state levels have been amplified in recent years by the expansion of
the universe of inspected facilities that could be subject to enforcement
action. As examples, they cited increased emphasis on new
requirements that municipalities address problems associated with combined
sewer overflows7 and new discharge requirements facing animal feeding
operations. In this resource-constrained environment, a majority of the
enforcement officials we interviewed in EPA regions and states agreed that
differences in resource allocations can contribute significantly to
variations in regional enforcement activities, and that regions and states
must make choices about where to focus their attention. Where state
agencies are particularly understaffed, EPA regional staff sometimes help
the states carry out their enforcement responsibilities.

Enforcement officials in EPA's Seattle office told us that civil judicial
cases have been particularly taxing on resources of some of their states. As
a result, some states have been very selective about which cases they decide
to pursue through legal action. According to the EPA officials, these
states sometimes settle for a smaller penalty than they believe is
warranted in order to avoid litigation that would otherwise consume
considerable resources. As long as environmental compliance is achieved,
some states in the region view compliance assistance and working with
violators as a cheaper way to achieve environmental results. To ease the
burden on states with
particular resource limitations, Seattle regional staff are engaging in
"work sharing" to take some of the load off the states. Work sharing is also
practiced among some states' agencies as a means of reducing the impact of
resource limitations. The Idaho Department of Agriculture, for example,
incorporates environmental inspection objectives into its annual
inspections of dairy farms subject to Clean Water Act requirements,
relieving the state environmental agency of this inspection responsibility.

Among the regions we visited, EPA's Philadelphia office was the most direct
in pointing to resource constraints as affecting its basic enforcement
responsibilities. Regional enforcement officials noted, for example, that a
lack of travel funds in fiscal year 1999 hampered both training and
inspections. Additionally, the director of the water protection division in
this region said that he has about 10 vacancies on his enforcement staff
that he is unable to fill.

Senior OECA officials acknowledged that while they try to allocate resources
fairly and efficiently among the 10 regions, regional management sometimes
exercises discretion in assigning enforcement staff to what it views as
higher-priority responsibilities. The officials note that in some cases, such
decisions reflect an appropriate exercise of management discretion, but that
in some instances, they have found it necessary to ask the region to alter
its decision to ensure that minimum program requirements are met.

Inspector General reports during the past decade have documented confusion
among both EPA regional and state enforcement officials as to the extent to
which regions and states are permitted to vary from EPA policies and guidance.
Most notably, in a series of air program audits during this period, EPA's
Inspector General found a widespread failure among states to report all
significant violators to EPA. This situation was attributed, in part, to
regional and state confusion over the air program's significant violator
guidance and the extensive flexibility it contained. To remedy this
situation, EPA and states jointly developed a new "High Priority Violations"
policy to replace the old guidance.

Even where EPA's enforcement policies and guidance are clear, they often
provide latitude that is wide enough for state and regional enforcement
actions to differ substantially and yet still abide by the policy or
guidance. EPA's penalty response policies, for example, provide for a range
of appropriate responses for given scenarios. Depending on the situation,
appropriate responses could include warning letters for minor violations, or
civil or criminal remedies and sanctions for more serious violations. Civil
remedies and sanctions may be imposed either administratively by the
enforcing agency or by the courts. The Director of the Office of
Enforcement, Compliance and Environmental Justice in the Philadelphia office noted that
the latitude provided by EPA's penalty calculation policies helps to explain
why different states can calculate different penalties for essentially the
same or very similar violations. She said that while penalty calculations
are subjective and can never be done identically in different states, it is
nonetheless important that penalties are calculated in a comparable manner,
and that extremes are avoided.

EPA's policies for assessing penalties take into consideration such factors
as the severity of the violation, litigation considerations, and the
economic benefit of noncompliance. These factors are largely subjective, and the
values assigned to each factor can vary widely. Consequently, it is unlikely
that any two enforcement professionals could look at the same violation,
consider the same calculation factors, and come up with precisely the same
penalty amount. EPA officials noted, however, that one could reasonably
expect that the calculated penalties would be in the same broad range.

OECA needs accurate and complete data as a key tool to assess whether
minimum program requirements are being met by regions and states, and
whether there are significant variations from these requirements that should
be corrected. Responsibility for entering data into EPA's national databases
resides with the region or state responsible for the enforcement program.
Both the quality of these data and the quality controls over them have been
widely criticized by the regional and state officials we interviewed, and
recent internal OECA studies have acknowledged the seriousness of the problem.

Many state enforcement programs we reviewed maintain their own databases to
manage their programs and do not use EPA's national databases. Consequently,
keeping the information in the EPA databases current is a low priority for
the states in an environment of limited resources--which has only further
aggravated the problem for OECA. For example, an internal OECA enforcement
work group called the Targeting Program Review Team commented in its
November 1999 report that while EPA's air and water databases indicated very
different enforcement programs across states, it could not determine whether
the variation was a function of real differences or the fact that data are
not getting into the databases for some states. This work group reported
that functions related to data quality, such as the consistent entry of
information by regions and states, are not working properly. The work group
also cited important information gaps in the databases. The group noted, for
example, that EPA has little compliance information about minor Clean Water
Act dischargers, even though two-thirds of all Clean Water Act enforcement
actions are now taken at these facilities.

Other work groups also underscored the seriousness of the data problem. The
report by OECA's National Planning work group concluded that "...one of the
difficulties in current planning and evaluation is that OECA managers do not
have available to them timely, complete, and detailed analyses of regional
or national performance." The Data Quality work group reported that "Data is
viewed by most state, regional, and Headquarters programs as a reporting
exercise for `bean-counting,' rather than as a day-to-day management tool to
identify problems and determine progress against commitments and goals."
Additionally, the Data Quality work group asserted that the situation has
deteriorated from past years, noting that "Over the past several years,
managers in the regions and in OECA have become increasingly frustrated that
they are not receiving from the Office of Compliance the reports and data
analyses they need to manage their programs." The group further noted that
there "...has been less attention to the data in the national systems, a
commensurate decline in data quality, and insufficient use of data by
enforcement/compliance managers in managing their programs." The various
internal work groups made a number of recommendations to OECA to ease the
problems they had discovered. At the time of our review, OECA was
considering what action to take on these recommendations.

Recent EPA Efforts to Achieve Greater Consistency in Regional Enforcement
Activities

EPA has an array of principles, policies, guidance documents, and other
tools that are intended to ensure that minimum program requirements are met
and to help ensure reasonable consistency in the way regional offices across
the country take direct enforcement action and in the way they oversee state
enforcement programs. However, as discussed earlier in this report, these
traditional tools have not ensured consistency because their implementation
has often varied across EPA's 10 regions.

EPA headquarters and regional enforcement officials identified a number of
planned and ongoing activities that could help to achieve greater
consistency in how regional offices take direct enforcement action and in
how they oversee state enforcement programs within their jurisdiction. This
chapter describes these activities and, in some cases, suggests how they may
be modified to more effectively foster greater consistency in EPA's
nationwide enforcement program.

During fiscal years 1997 and 1998, OECA engaged in broad-based,
region-by-region performance reviews on a rotating 2-year schedule, with
half of the regions being reviewed each year. In addition to examining
enforcement practices associated with regions' air, water, and other media
programs, these detailed reviews included extensive examinations of
cross-cutting program activities, such as setting enforcement priorities,
addressing multimedia problems, conducting regional oversight of states'
activities, and managing data systems.

According to senior OECA officials, these intensive reviews were
discontinued because they were viewed as both burdensome and costly for the
regional offices and for headquarters officials. They also expressed doubts
as to whether the reviews produced sufficiently useful information for
improving regional performance because they took too long to complete and
did not always identify the most critical problems. An OECA official
explained, for example, that the reviews typically took a year to complete
and were based on the prior year's data. He noted that by the time a review
was completed and published, any problems identified would be 2 years old
and possibly not representative of current conditions.

As an alternative, OECA is presently developing a system in which Program
Status Reports would be issued about twice a year and would provide a
variety of information, by region, that would gauge both regional office
performance and performance by states within each region. According to the
Director of the Enforcement Planning, Targeting, and Data Division, the
Program Status Reports will draw information from existing data systems and
other sources. Information to be included will extend well beyond the
historic focus on "output measures" such as the number of inspections
conducted and enforcement actions taken. Instead, the reports will provide
comparative information on a broader array of measures that focus
increasingly on the results the enforcement program is trying to achieve.
Thus, for example, in addition to providing regional and state trend data on
the number of inspections conducted and the number of enforcement actions
taken, the reports will attempt to convey, by region, how long significant
violators remained in noncompliance; the extent to which past violators
that returned to compliance have remained in compliance; and the
qualitative impact of enforcement actions taken (e.g., the extent to which
pollutants are reduced).
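
As a rough illustration of how such result-oriented measures could be
computed, the sketch below derives two of them from hypothetical violation
records; the record layout is our assumption, not the actual Program Status
Report design.

    # Hypothetical sketch of two result-oriented measures of the kind the
    # Program Status Reports are intended to convey. The record layout is
    # an assumption for illustration, not EPA's actual report design.
    from datetime import date

    violations = [
        # (violation found, returned to compliance, relapsed afterward?)
        (date(1997, 3, 1), date(1997, 9, 1), False),
        (date(1997, 6, 15), date(1998, 6, 15), True),
    ]

    days_in_noncompliance = [(fixed - found).days
                             for found, fixed, _ in violations]
    stayed_in_compliance = sum(1 for *_, relapsed in violations if not relapsed)

    print(sum(days_in_noncompliance) / len(violations))  # average days: 274.5
    print(stayed_in_compliance / len(violations))        # share: 0.5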

According to OECA, the Program Status Reports will be used, in combination
with other information sources, to identify specific program elements where
more detailed examination of regional and state performance is warranted.
These program elements will be prioritized, and then Program Element Reviews
will be used to examine specific aspects of selected programs. During these
reviews, a team of experts will examine the implementation of the selected
elements by EPA headquarters, by most or all regions, and by one or two
states in each region. According to a letter dated June 1999 from the
former Director of the Office of Compliance to a member of the
Environmental Council of the States' executive committee, the reviews
". . . will enable [OECA] to describe how effectively the program elements
are being implemented by both EPA and the states." OECA maintains that in
contrast to the discontinued
regional reviews, the Program Element Reviews will be more narrowly focused,
cover headquarters and most regions and several states in each region, and
provide more timely information. The agency currently envisions completing
two Program Element Reviews each year. According to the Director of the
Enforcement Planning, Targeting, and Data Division, the Program Element
Reviews are expected to provide a logical vehicle for assessing consistency
and identifying areas for improvement.

Program Status Reports and Program Element Reviews have the potential to
convey useful comparative information to both EPA managers and the public
on the extent to which the enforcement program is being implemented
consistently and fairly nationwide. However, as OECA officials acknowledge,
raw enforcement data, such as the number of inspections conducted or the
number of enforcement actions taken, can be easily misinterpreted.
Consequently, for the Program Status Reports to provide useful, comparative
information on regions' enforcement programs to both agency officials and
the public, it is essential that the data be accompanied by the contextual
information needed to clarify whether variation in a given instance is
inappropriate or whether it reflects the appropriate exercise of
flexibility by regions and states in tailoring their programs to their
individual needs, priorities, and circumstances.

The lack of comprehensive and reliable enforcement data available to OECA
program managers, particularly at the state level, significantly impedes
OECA's ability to diagnose and address unwarranted variation among regional
enforcement practices. While this issue will take both time and funds to
address, OECA has at least acknowledged the seriousness of the problem and
is exploring alternatives to deal with it. OECA's Office of Compliance has
organized several work groups that, in a series of 1999 reports, identified
problems and made recommendations to address them. Of particular note, the
Data Quality work group found that
data quality needs to be fully integrated into strategic planning, staffing
levels, agreements with the regions, regional reviews, and reporting in
order to convey a clear message of the importance of data in all aspects of
the enforcement program. In addition, the Targeting work group acknowledged
problems with the quality and uses of enforcement data and made a series of
recommendations designed to "...make OECA a more information driven, and
thus strategic organization." The Targeting work group recommended that "The
use of targeting methods that incorporate risk, environmental quality, and
pollutant data...be promoted to augment or replace existing methodologies
that are driven almost exclusively by policy mandates that tend to be
inflexible."

The Acting Director of the Office of Compliance noted that the resources
devoted to data quality may have been insufficient in recent years and
indicated headquarters' intention to shift some resources internally to
alleviate the problem. He also indicated that his office is studying the
work groups' recommendations to decide which should be adopted and how they
should be implemented. We believe such a study should be an important part
of a comprehensive strategy that identifies the key actions needed to
address OECA's data quality problems and then brings sufficient priority
and resources to bear on them.

Regional officials cited improved communication as a key component in their
efforts to initiate new processes and effect change among their staff and
among their states. Senior officials in the Seattle region, for example,
instituted a Regional Enforcement Forum attended by all regional program
directors and top managers to share information and to ensure that they are
aware of what is going on in other programs in the region. The Seattle
Regional Administrator told us that the region looks at data, such as
penalties, and puts all of the states' data in a matrix for comparison and
analysis. Seattle officials noted that program reviews have highlighted
diametrically opposed philosophies between the states of Oregon and
Washington in targeting their enforcement efforts. One state believes in
using its limited resources to target a few large violators that have a
large impact on the environment while the other state believes in targeting
many small violators that, cumulatively, also have large impacts on the
environment. The Regional Administrator also said that when top-level state
directors see something amiss in their own data, arrayed in a matrix
alongside neighboring states' data, they ask their staff to explain the
differences. He said that this process has created peer pressure and has
helped to bring about a greater level of consistency among the states in the
region.
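To give a concrete sense of the kind of comparison the Seattle region
described, the following Python sketch arrays several enforcement measures
for a handful of states in a single matrix. It is purely illustrative: the
state names, measures, and figures are hypothetical placeholders rather than
actual regional data, and the region's own analysis is not based on this or
any particular script.

# Hypothetical sketch of a state-by-state comparison matrix, similar in
# spirit to the Seattle region's practice of arraying states' enforcement
# data side by side. All names and figures are illustrative placeholders,
# not actual EPA or state data.

measures = ["Inspections", "Enforcement actions", "Penalties assessed ($)"]
states = ["State A", "State B", "State C"]
data = {
    "State A": [120, 15, 250_000],
    "State B": [95, 22, 40_000],
    "State C": [130, 8, 310_000],
}

# Print a simple matrix so a reviewer can scan across each row and ask why
# one state's figure departs sharply from its neighbors'.
header = f"{'Measure':<26}" + "".join(f"{s:>12}" for s in states)
print(header)
for i, measure in enumerate(measures):
    row = f"{measure:<26}" + "".join(f"{data[s][i]:>12,}" for s in states)
    print(row)

Arranged this way, an outlying figure stands out immediately, which is the
peer-comparison effect the Regional Administrator described.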

Other regional officials conveyed similar experiences. The Director of the
Office of Enforcement, Compliance and Environmental Justice in EPA's
Philadelphia office said that the regional office's decentralized
enforcement structure (enforcement is organized as a separate component
within each media program), together with the need to work closely with six
states in the region whose political leadership is evolving, makes frequent
and effective communication critically important. Therefore, the office holds
regularly scheduled conference calls with regional and state staff to
discuss issues that may arise, and she also holds annual meetings to improve
communication between the region and its states and among the states.

A number of regions have developed and implemented audit protocols to
improve the consistency and effectiveness of their oversight of the states
within their jurisdiction. Among the most recent and comprehensive
experiments with such a protocol is the Unified Oversight System recently
developed by the Denver Regional Office. Under this
new system, the regional office will review all state environmental programs
using a broad range of performance criteria such as data entry, timeliness
of actions, penalties recovered, and effectiveness of inspections. Each
state will be graded on each category and then given an overall rating. The
system is built, in part, on the concept of a comparative review system to
pinpoint the weakest state programs needing the most oversight attention by
the regional office. The protocol is also premised on the belief that, over
time, states with the lowest ratings will rise to the level of their peers.
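As a rough illustration of how category grades might be combined into an
overall rating under a scheme like the Unified Oversight System, the Python
sketch below scores each state in the performance categories named above and
averages the scores. The 1-to-5 scale, the equal weighting, and the scores
themselves are assumptions made for illustration only; they do not represent
the Denver region's actual methodology or data.

# Hypothetical sketch of a category-by-category grading scheme in the spirit
# of the Unified Oversight System. The categories come from the report's
# description; the 1-5 scale, equal weighting, and scores are assumptions.

categories = [
    "data entry",
    "timeliness of actions",
    "penalties recovered",
    "effectiveness of inspections",
]

# Assumed scale: 1 (weakest) to 5 (strongest) in each category.
state_scores = {
    "State A": {"data entry": 4, "timeliness of actions": 3,
                "penalties recovered": 5, "effectiveness of inspections": 4},
    "State B": {"data entry": 2, "timeliness of actions": 2,
                "penalties recovered": 3, "effectiveness of inspections": 3},
}

def overall_rating(scores: dict[str, int]) -> float:
    """Average the category grades into a single overall rating."""
    return sum(scores[c] for c in categories) / len(categories)

# Rank states from lowest to highest overall rating so the weakest programs
# surface first for oversight attention.
for state in sorted(state_scores, key=lambda s: overall_rating(state_scores[s])):
    print(f"{state}: overall rating {overall_rating(state_scores[state]):.2f}")

Ranking states from the lowest rating upward mirrors the system's stated
purpose of pinpointing the weakest programs needing the most oversight
attention.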

Senior OECA officials acknowledged that regional protocols may help a region
oversee its state programs, although they cautioned that the protocols
should not be viewed as a regional "report card." Our interviews with
regional officials also suggest that oversight protocols, negotiated between
regional and state officials, offer a promising mechanism to help ensure
that each region's oversight is fairer and more consistent among all the
states within its jurisdiction.
Moreover, we also acknowledge the view among many regional staff that in
light of the different organizational structures among regional offices,
working relationships between different regions and their states, and other
factors, these protocols should be tailored to meet the needs of each
region. Nonetheless, EPA guidance on elements that should be common to all
protocols could help to engender a minimum level of consistency in the way
the 10 regional offices oversee their states.

While noting that some variation among regions' environmental enforcement
activities may be appropriate, OECA has underscored the importance of an
appropriate level of consistency to ensure fairness and equitable treatment,
and to help ensure that minimum requirements will be met. OECA has relied on
a number of traditional tools to ensure reasonable consistency, such as the
use of formal memorandums of agreement outlining regions' and states'
enforcement responsibilities, and periodic visits by senior enforcement
managers to each regional office. Yet, maintaining a consistent approach
among 10 regional offices and 50 states has proven to be a difficult
challenge. EPA has also experienced problems in identifying and
communicating the extent to which variation (1) represents a problem, (2) is
preventable, or (3) represents the appropriate exercise of flexibility by
regions and states to apply national program goals to their unique
circumstances.

Headquarters and regional enforcement officials have identified a number of
planned and ongoing activities that could further help to improve
consistency in how regional offices take direct enforcement action and in
how they oversee state enforcement programs within their jurisdiction. We
acknowledge the merit of many of these activities but believe that
additional action, and in some cases changes in approach, will further
the agency's effort to achieve an appropriate level of consistency in
regional enforcement.

First, OECA is planning to use Program Status Reports to provide comparative
data on regional enforcement practices and on the practices of states within
each region. Program Status Reports have already been the subject of much
discussion in the environmental media and among EPA and state enforcement
officials. The reports have the potential to convey useful information to
both EPA managers and the public on the extent to which the enforcement
program is being implemented consistently and fairly nationwide. However, as
OECA officials acknowledge, the data can be easily misinterpreted without
the contextual information needed to clarify whether variation in a given
instance is inappropriate, or whether it reflects the appropriate exercise
of flexibility by regions and states to tailor their needs and priorities to
their individual circumstances. We, therefore, believe that EPA's Program
Status Reports can better serve agency management and the public if EPA (1)
clarifies what aspects of its enforcement program it expects to see
implemented consistently from region to region and where it believes greater
variation is appropriate and (2) supplements its region-by-region data with
contextual information that helps to explain the causes of variation and
thereby clarifies the extent to which such variation is problematic.

Second, the effort to develop such comparative information will only succeed
if it draws from data systems that are complete and reliable. However, the
reliability of the agency's enforcement data has been widely challenged from
both outside and inside the agency. Senior OECA officials have acknowledged
that it will require additional staffing and resources to deal with the
issue and have indicated that they are considering the reallocation of some
resources from other functions to augment their data quality efforts.
Nonetheless, EPA still needs to articulate a comprehensive strategy that
will build on internal analyses and recommendations concerning the agency's
enforcement databases, and will bring sufficient resources to bear to
address this critical and complex problem.

Third, a number of regional offices have worked with their states to develop
audit protocols that are designed, in part, to achieve more effective and
more consistent oversight of the states within their jurisdiction. We
acknowledge the potential of these protocols and believe there are good
reasons that such protocols should be tailored to meet the needs of each
region. However, we also believe that headquarters guidance on elements that
should be common to all protocols would help to engender an improved level
of consistency in the way the 10 regional offices oversee their states.

We recommend that the Administrator of EPA

-- provide, as part of the agency's efforts to develop Program Status Reports
containing comparative data on regional and state enforcement performance,
the contextual information needed to help EPA management and the public
properly understand them;

-- develop a comprehensive strategy that will bring to bear sufficient
priority and resources so that the problems affecting the quality of the
agency's enforcement data can be adequately addressed; and

-- issue guidance to EPA regions describing the required elements of audit
protocols to be used in overseeing state enforcement programs.

In its comments on our draft report, EPA disagreed with our recommendation
that the agency's Program Status Reports include the contextual information
needed to help EPA management and the public better understand raw data
characterizing regional performance. EPA noted that the reports are not
intended for public distribution and consequently do not "need contextual
information . . . since they are designed to be used by Agency program
managers who understand how to use them." We disagree with this statement.
First, past experience indicates that whether intended or not as public
documents, the Program Status Reports will likely be made public and will be
used by interested parties. This occurred in the case of OECA's regional
evaluations discussed in our report, which were released on the basis of a
Freedom of Information Act request and were reported widely in the trade
press. Recent press coverage anticipating EPA's Program Status Reports
provides a strong indication that they, too, will be used by members of the
public. Consequently, the contextual information explaining the variations
is essential if the reports are to clarify, rather than confuse, the
public's interpretation of the data. Second, while EPA notes that the
reports are designed for agency managers "who understand how to use them," our
experience during this review indicates that without better contextual
information, even agency managers have had difficulty interpreting the raw
data to determine the extent to which variations are problematic, whether
they are preventable, or whether they represent the appropriate exercise
of flexibility.

EPA did not respond directly to our recommendation that the agency develop a
comprehensive strategy to improve the quality of its enforcement data. The
agency noted that the "root causes of the data quality problems are many and
varied." EPA noted in particular the importance of the state role in both
the causes and solutions to data quality problems in EPA's national data
systems, since the states are the repository of most of the data and are
typically responsible for entering these data into the national data
systems. We agree
that the states must be part of the solution if EPA is to have a useful and
reliable national enforcement database. However, EPA itself suggests that
solving the problem will require its leadership. EPA notes, for example,
that its data systems have aged to the point where many states have built
their own parallel data systems that incorporate more modern, user-friendly
architectures. Data entry by states into the national databases has
therefore waned as they have placed greater emphasis on maintaining their
own systems. EPA also points to "system or definitional incompatibilities"
between EPA enforcement databases and those of a number of states, and the
added burden on both states and EPA regional offices of entering data into
national databases that do not help them manage their programs. This point
echoes the findings of the agency's Data Management and Quality Program
Review Team which, as discussed in chapter 3, observed that "Data is viewed
by most state, regional, and Headquarters programs as a reporting exercise
for `bean-counting,' rather than as a day-to-day management tool to identify
problems and determine progress against commitments and goals." The team's
November 1999 report recommended that data quality needs "be elevated to an
OECA priority . . ." and that such needs "be considered in strategic
planning, budget formulation, and reporting to ensure a clear articulation
of the importance of good data and how it fits into all aspects of the
enforcement program."

EPA did not respond directly to our recommendation that the agency issue
guidance identifying elements that should be common to all regions' state
oversight audit protocols. However, the agency cautioned that the protocols
are not a substitute for a comprehensive oversight program. EPA also
expressed concern about the comprehensiveness of some of the protocols,
noting that they "do not all review State performance against all national
policies, including the 1986 State Guidance, other national policies, and
the [Memorandum of Agreement] process." We acknowledge that the protocols
are not a substitute for a comprehensive regional oversight program. We also
acknowledge EPA's concern about the comprehensiveness of the various
protocols being tested in different regions and continue to believe that the
recommended guidance would help to address the problem identified by EPA
while still allowing each region to tailor its protocol to meet its unique
circumstances.

Comments From the Environmental Protection Agency

GAO Contacts and Staff Acknowledgments

Steven Elstein, (202) 512-6515

In addition to those named above, Maureen Driscoll, Barbara Johnson, Gerald
Laudermilk, and Lisa Pittelkau made key contributions to this report.

(160497)

Figure 1: EPA's 10 Regions and Regional Office Locations
  

1. The authority to order a party that is violating a provision of the law
to refrain from further violation is referred to as injunctive relief.

2. Title V requires large sources of air pollutants to obtain permits that
specify the maximum amounts of pollutants that may be released and that set
monitoring requirements.

3. Clean Air Act sect. 502(b)(5)(A),(E), 42 U.S.C. sect. 7661a(b)(5)(A),(E).

4. Clean Water Act sect. 402(b)(2)(A),(7), 33 U.S.C. sect. 1342(b)(2)(A),(7). The
National Pollutant Discharge Elimination System of the Clean Water Act
requires major sources of discharges to surface water to obtain permits that
control the amounts of pollutants that may be discharged and that set
monitoring requirements.

5. The National Pollutant Discharge Elimination System is the primary
regulatory program governing the discharge of pollutants by facilities into
U.S. waters.

6. Past reports by both GAO and EPA's Inspector General have concluded
that repeated violations have occurred in the absence of adequate penalties
that at least recover the economic benefits of noncompliance. See, for
example, Environmental Enforcement: Penalties May Not Recover Economic
Benefits Gained by Violators (GAO/RCED-91-166, June 17, 1991).

7. Combined sewer overflows affect municipalities whose storm water and
wastewater sewer systems are combined. During large rainstorms, the surge in
water volume can cause the system to overflow, resulting in untreated sewage
flowing directly into a body of water.