Results-Oriented Government: GPRA Has Established a Solid	 
Foundation for Achieving Greater Results (10-MAR-04, GAO-04-38). 
                                                                 
Now that the Government Performance and Results Act (GPRA) has	 
been in effect for 10 years, GAO was asked to address (1) the	 
effect of GPRA in creating a governmentwide focus on results and 
the government's ability to deliver results to the American	 
public, (2) the challenges agencies face in measuring performance
and using performance information in management decisions, and	 
(3) how the federal government can continue to shift toward a	 
more results-oriented focus.					 
-------------------------Indexing Terms-------------------------
 REPORTNUM:  GAO-04-38
     ACCNO:  A09453
     TITLE:  Results-Oriented Government: GPRA Has Established a Solid
             Foundation for Achieving Greater Results
      DATE:  03/10/2004
   SUBJECT:  Agency missions
             Federal agencies
             Performance measures
             Productivity in government
             Reporting requirements
             Strategic planning


United States General Accounting Office

GAO

                       Report to Congressional Requesters

March 2004

RESULTS-ORIENTED GOVERNMENT

     GPRA Has Established a Solid Foundation for Achieving Greater Results


GAO-04-38

Highlights of GAO-04-38, a report to congressional requesters

Now that the Government Performance and Results Act (GPRA) has been in
effect for 10 years, GAO was asked to address (1) the effect of GPRA in
creating a governmentwide focus on results and the government's ability to
deliver results to the American public, (2) the challenges agencies face
in measuring performance and using performance information in management
decisions, and (3) how the federal government can continue to shift toward
a more results-oriented focus.

GAO recommends that the Office of Management and Budget (OMB) improve its
guidance and oversight of GPRA implementation, as well as develop a
governmentwide performance plan. GAO also believes Congress should
consider amending GPRA to require that (1) agencies update their strategic
plans at least once every 4 years, consult with congressional
stakeholders at least once every new Congress, and make interim updates to
strategic and performance plans as appropriate; and (2) the President
develop a governmentwide strategic plan. OMB generally agreed with our
recommendations, but stated that the President's Budget can serve as both
a governmentwide strategic and annual plan. However, we believe the budget
provides neither a long-term nor an integrated perspective on the federal
government's performance.


GPRA's requirements have established a solid foundation of
results-oriented performance planning, measurement, and reporting in the
federal government. Federal managers surveyed by GAO reported having
significantly more of the types of performance measures called for by GPRA
(see figure below). GPRA has also begun to facilitate the linking of
resources to results, although much remains to be done in this area to
increase the use of performance information to make decisions about
resources. We also found agency strategic and annual performance plans and
reports we reviewed have improved over initial efforts.

Although a foundation has been established, numerous significant
challenges to GPRA implementation still exist. Inconsistent top leadership
commitment to achieving results within agencies and OMB can hinder the
development of results-oriented cultures in agencies. Furthermore, in
certain areas, federal managers continue to have difficulty setting
outcome-oriented goals, collecting useful data on results, and linking
institutional, program, unit, and individual performance measurement and
reward systems. Finally, there is an inadequate focus on addressing issues
that cut across federal agencies.

OMB, as the focal point for management in the federal government, is
responsible for overall leadership and direction in addressing these
challenges. OMB has clearly placed greater emphasis on management issues
during the past several years. However, it has showed less commitment to
GPRA implementation in its guidance to agencies and in using the
governmentwide performance plan requirement of GPRA to develop an
integrated approach to crosscutting issues. In our view, governmentwide
strategic planning could better facilitate the integration of federal
activities to achieve national goals.

[Figure: Percentage of Federal Managers Who Reported Having Specific Types
of Performance Measures Called for by GPRA]

www.gao.gov/cgi-bin/getrpt?GAO-04-38.

To view the full product, including the scope and methodology, click on
the link above. For more information, contact Patricia A. Dalton at (202)
512-6806 or [email protected].

Contents

Transmittal Letter

Executive Summary
    Purpose
    Background
    Results in Brief
    Principal Findings
    Recommendations for Executive Action
    Matters for Congressional Consideration
    Agency Comments

Chapter 1: Introduction
    Impact of Emerging Trends and Fiscal Challenges
    GPRA Background
    Scope and Methodology

Chapter 2: GPRA Established the Foundation for a More Results-Oriented
Federal Government
    GPRA Statutory Requirements Laid a Foundation for Agencywide
    Results-Oriented Management

Chapter 3: Agencies Have Addressed Many Critical Performance Planning and
Reporting Challenges, but Weaknesses Persist
    Quality of Selected Strategic Plans Reflects Improvements over Initial
    Drafts
    Fiscal Year 2004 Annual Performance Plans Addressed Some Weaknesses of
    Earlier Plans, but Still Have Room for Significant Improvement
    Strengths and Weaknesses of Selected Agencies' Fiscal Year 2002 Annual
    Performance Reports

Chapter 4: Challenges to GPRA Implementation Persist
    Top Leadership Does Not Consistently Show Commitment to Achieving
    Results
    Managers Report Mixed Results in Use of Performance Information
    Managers Continue to Confront a Range of Human Capital Management
    Challenges
    Persistent Challenges in Setting Outcome-Oriented Goals, Measuring
    Performance, and Collecting Useful Data
    Crosscutting Issues Hinder Successful GPRA Implementation
    Managers View Congress' Use of Performance Information as Limited

Chapter 5: Conclusions and Recommendations
    Agenda for Achieving a Sustainable, Governmentwide Focus on Results
    Recommendations for Executive Action
    Matters for Congressional Consideration
    Agency Comments

Appendixes
    Appendix I: Objectives, Scope, and Methodology
        Methodology for Governmentwide Survey
        Methodology for Focus Groups
        Methodology for Interviews with Political Appointees
        Methodology for Selecting Agencies to Review for Changes in the
        Quality of Their Strategic Plans, Annual Performance Plans, and
        Annual Performance Reports
    Appendix II: Focus Group Participants Agreed GPRA Provides a Framework
    for Federal Agencies to Become More Results Oriented
        GPRA Accomplishments
        Views on Delivering Results to the American Public Were Mixed
        Alternate Views on GPRA's Effect
        Challenges in Implementing and Overseeing GPRA Activities
        Suggestions to Address GPRA Challenges
    Appendix III: Observations on Agencies' Strategic Plans
        Required Elements of Agency Strategic Plans
        Observations on Changes in the Quality of Education's Strategic Plan
        Observations on Changes in the Quality of DOE's Strategic Plan
        Observations on Changes in the Quality of HUD's Strategic Plan
        Observations on Changes in the Quality of SBA's Strategic Plan
        Observations on Changes in the Quality of SSA's Strategic Plan
        Observations on Changes in the Quality of DOT's Strategic Plan
    Appendix IV: Observations on Agencies' Annual Performance Plans
        Key Elements of Information for Annual Performance Plans
        Observations on Changes in the Quality of Education's Annual
        Performance Plan
        Observations on Changes in the Quality of DOE's Annual Performance
        Plan
        Observations on Changes in the Quality of HUD's Annual Performance
        Plan
        Observations on Changes in the Quality of SBA's Annual Performance
        Plan
        Observations on Changes in the Quality of SSA's Annual Performance
        Plan
        Observations on Changes in the Quality of DOT's Annual Performance
        Plan
    Appendix V: Observations on Agencies' Annual Performance and
    Accountability Reports
        Observations on the Quality of Education's Fiscal Year 2002
        Performance and Accountability Report
        Observations on the Quality of DOE's Fiscal Year 2002 Annual
        Performance and Accountability Report
        Observations on the Quality of HUD's Fiscal Year 2002 Annual
        Performance and Accountability Report
        Observations on the Quality of SBA's Fiscal Year 2002 Annual
        Performance and Accountability Report
        Observations on the Quality of SSA's Fiscal Year 2002 Annual
        Performance and Accountability Report
        Observations on the Quality of DOT's Fiscal Year 2002 Annual
        Performance and Accountability Report
    Appendix VI: GAO Federal Managers' Survey Data
    Appendix VII: Agencies Subject to the Chief Financial Officers Act
    Appendix VIII: Comments from the Office of Management and Budget
    Appendix IX: Comments from the Department of Energy
        GAO Comments
    Appendix X: Comments from the Department of Housing and Urban
    Development
        GAO Comments
    Appendix XI: Comments from the Social Security Administration
        GAO Comments
    Appendix XII: GAO Contact and Staff Acknowledgments
        GAO Contact
        Acknowledgments

Related GAO Products
    GPRA/Managing for Results
    Strategic Human Capital Management
    Linking Resources to Results
    Measuring Performance
    Data Credibility
    Using Performance Information

Tables
    Table 1: Agencies' Progress in Addressing Required Elements of
    Strategic Planning under GPRA
    Table 2: Characterizations of Agencies' Fiscal Year 1999 and 2004
    Annual Performance Plans
    Table 3: Characterizations of Agencies' 2002 Annual Performance Reports
    Table 4: Summary of Characteristics of Agencies Selected for Review of
    Strategic Plans, Annual Performance Plans, and Annual Performance
    Reports
    Table 5: Agencies' Progress in Addressing Required Elements of
    Strategic Planning under GPRA
    Table 6: Education's Progress in Addressing Required Elements of
    Strategic Planning under GPRA
    Table 7: DOE's Progress in Addressing Required Elements of Strategic
    Planning under GPRA
    Table 8: HUD's Progress in Addressing Required Elements of Strategic
    Planning under GPRA
    Table 9: SBA's Progress in Addressing Required Elements of Strategic
    Planning under GPRA
    Table 10: SSA's Progress in Addressing Required Elements of Strategic
    Planning under GPRA
    Table 11: DOT's Progress in Addressing Required Elements of Strategic
    Planning under GPRA
    Table 12: Characterizations of Agencies' Annual Performance Plans
    Table 13: Characterizations of Agencies' Fiscal Year 2002 Annual
    Performance and Accountability Reports

Figures
    Figure 1: Composition of Spending as a Share of GDP Assuming
    Discretionary Spending Grows with GDP after 2003 and All Expiring Tax
    Provisions Are Extended
    Figure 2: Percentage of Federal Managers Who Reported That There Were
    Performance Measures for the Programs with Which They Were Involved
    Figure 3: Percentage of Federal Managers Who Reported Having Specific
    Types of Performance Measures to a Great or Very Great Extent
    Figure 4: Percentage of Federal Managers Who Reported Their Awareness
    of GPRA
    Figure 5: Percentage of Federal Managers Who Reported Hindrances to
    Measuring Performance or Using the Performance Information to a Great
    or Very Great Extent
    Figure 6: Percentage of Federal Managers and SES Managers Who Reported
    That OMB Paid Attention to Their Agency's Efforts under GPRA to a
    Great or Very Great Extent
    Figure 7: Percentage of Federal Managers Who Reported They Considered
    Strategic Goals to a Great or Very Great Extent When Allocating
    Resources
    Figure 8: Percentage of Federal Managers Who Reported They Considered
    Performance Information to a Great or Very Great Extent When
    Allocating Resources
    Figure 9: Percentage of Federal Managers Who Reported That Funding
    Decisions Were Based on Results or Outcome-Oriented Performance
    Information to a Great or Very Great Extent
    Figure 10: Percentage of Federal Managers Who Reported to a Great or
    Very Great Extent Their Top Leadership Has a Strong Commitment to
    Achieving Results
    Figure 11: Percentage of SES and Non-SES Managers Who Reported to a
    Great or Very Great Extent Their Agency Top Leadership Demonstrated
    Strong Commitment to Achieving Results
    Figure 12: Percentage of Federal Managers Who Reported Using
    Information Obtained from Performance Measurement to a Great or Very
    Great Extent for Various Management Activities
    Figure 13: Percentage of Federal Managers Responding "Yes" about Being
    Involved in the Following Activities
    Figure 14: Percentage of Federal Managers Reporting to a Great or Very
    Great Extent That Managers/Supervisors at Their Levels Had the
    Decision-Making Authority They Needed to Help the Agency Accomplish
    Its Strategic Goals
    Figure 15: Percentage of Federal Managers, SES, and Non-SES in 2003
    Reporting to a Great or Very Great Extent That They Were Held
    Accountable for the Accomplishment of Agency Strategic Goals
    Figure 16: Percentage of Federal Managers in Each Survey Year Who
    Reported That during the Past 3 Years Their Agencies Provided,
    Arranged, or Paid for Training That Would Help Them Accomplish
    Specific Tasks
    Figure 17: Percentage of Federal Managers Who Reported to a Great or
    Very Great Extent That Employees in Their Agencies Received Positive
    Recognition for Helping Their Agencies Accomplish Their Strategic
    Goals
    Figure 18: Percentage of Federal Managers Reporting to a Great or Very
    Great Extent That a Lack of Ongoing Congressional Commitment or
    Support for Using Performance Information in Making Program/Funding
    Decisions Is a Hindrance
    Figure 19: Summary of Education's Performance Indicators for Fiscal
    Year 2002
    Figure 20: Inputs: Allocating Funds for Education's Objective to
    Ensure That All Students Read on Grade Level by the Third Grade
    Figure 21: Summary of DOE's Performance Indicators for Fiscal Year 2002
    Figure 22: Summary of HUD's Performance Indicators for Fiscal Year 2002
    Figure 23: Summary of SBA's Performance Goals for Fiscal Year 2002
    Figure 24: Summary of SSA's Performance Goals for Fiscal Year 2002
    Figure 25: Summary of DOT's Performance Indicators for Fiscal Year 2002

Abbreviations

AP advanced placement
CDBG Community Development Block Grants
CFO Chief Financial Officer
CRS Congressional Research Service
DOE Department of Energy
DOT Department of Transportation
EPA Environmental Protection Agency
FAA Federal Aviation Administration
FSA Federal Student Assistance
FTE full-time equivalent
GM general management
GPRA Government Performance and Results Act of 1993
GS general schedule
HHS Department of Health and Human Services
HOME Home Investment Partnership Program
HUD Department of Housing and Urban Development
ICH Interagency Council on the Homeless
IG Inspector General
IT information technology
IRS Internal Revenue Service
JARC Job Access and Reverse Commute
NAEP National Assessment of Educational Progress
NASA National Aeronautics and Space Administration
OASI Old Age and Survivors Insurance
OMB Office of Management and Budget
OPM Office of Personnel Management
PART Program Assessment Rating Tool
PMA President's Management Agenda
SBA Small Business Administration
SBDC Small Business Development Centers
SES Senior Executive Service
SSA Social Security Administration
SSI Supplemental Security Income
SSN Social Security number
VA Department of Veterans Affairs

This is a work of the U.S. government and is not subject to copyright
protection in the United States. It may be reproduced and distributed in
its entirety without further permission from GAO. However, because this
work may contain copyrighted images or other material, permission from the
copyright holder may be necessary if you wish to reproduce this material
separately.

Comptroller General of the United States

United States General Accounting Office
Washington, D.C. 20548

March 10, 2004

Congressional Requesters

As you requested, we have assessed the effectiveness of the Government
Performance and Results Act (GPRA) in light of its 10th anniversary in
2003. Our review focused on GPRA's accomplishments, challenges to its
continued implementation, and an agenda for achieving a sustainable,
governmentwide focus on results.

Upon issuance, we will send copies to the Director of the Office of
Management and Budget and executive branch agencies (see appendix VII for
a list). We will also make copies available to others upon request. In
addition, this report will be available at no charge on the GAO web site
at http://www.gao.gov.

If you have any questions concerning this report, please contact Patricia
A. Dalton at (202) 512-6806 or [email protected]. The major contributors to
this report are listed in appendix XII.

David M. Walker Comptroller General of the United States

List of Requesters

The Honorable Susan M. Collins
Chairman
The Honorable Joe Lieberman
Ranking Minority Member
Committee on Governmental Affairs
United States Senate

The Honorable George V. Voinovich
Chairman
The Honorable Richard Durbin
Ranking Minority Member
Subcommittee on Oversight of Government Management, the Federal

Workforce, and the District of Columbia Committee on Governmental Affairs
United States Senate

The Honorable Peter G. Fitzgerald
Chairman
The Honorable Daniel K. Akaka
Ranking Minority Member
Subcommittee on Financial Management, the Budget, and International

Security Committee on Governmental Affairs United States Senate

The Honorable Tom Davis
Chairman
The Honorable Henry A. Waxman
Ranking Minority Member
Committee on Government Reform
House of Representatives

The Honorable Todd Russell Platts
Chairman
The Honorable Edolphus Towns
Ranking Minority Member
Subcommittee on Government Efficiency and Financial Management
Committee on Government Reform
House of Representatives

Executive Summary

Purpose

From defending the homeland against terrorists, to preventing the
spread of infectious diseases, to providing a reliable stream of social
security income to retirees and supporting the transition from welfare to
work, the federal government provides funding and services to the American
public that can affect their lives in critical ways every day. However,
the federal government is in a period of profound transition and faces an
array of challenges and opportunities to enhance performance, ensure
accountability, and position the nation for the future. A number of
overarching trends, such as diffuse security threats and homeland security
needs, increasing global interdependency, the shift to knowledge-based
economies, and the looming fiscal challenges facing our nation drive the
need to reconsider the proper role for the federal government in the 21st
century, how the government should do business (including how it should be
structured), and in some instances, who should do the government's
business.

Without effective short- and long-term planning, which takes into account
the changing environment and needs of the American public and the
challenges they face and establishes goals to be achieved, federal
agencies risk delivering programs and services that may or may not meet
society's most critical needs. At a cost to taxpayers of over $2 trillion
annually, the federal government should be able to demonstrate to the
American public that it can anticipate emerging issues, develop sound
strategies and plans to address them, and be accountable for the results
that have been achieved.

Concerned that the federal government was more focused on program
activities and processes than on the results to be achieved, Congress passed
the Government Performance and Results Act of 1993 (GPRA).1 The act
required federal agencies to develop strategic plans with long-term,
outcome-oriented goals and objectives, annual goals linked to achieving
the long-term goals, and annual reports on the results achieved. Now that
GPRA has been in effect for 10 years, you asked us to assess the
effectiveness of GPRA in creating a focus on results in the federal
government. Specifically, this report discusses (1) the effect of GPRA
over the last 10 years in creating a governmentwide focus on results and
the government's ability to deliver results to the American public,
including an assessment of the changes in the overall quality of agencies'
strategic plans,

1Pub. L. No. 103-62.

annual performance plans, and annual performance reports; (2) the
challenges agencies face in measuring performance and using performance
information in management decisions; and (3) how the federal government
can continue to shift toward a more results-oriented focus.

To meet our objectives, we reviewed our extensive prior work on GPRA best
practices and implementation and collected governmentwide data to assess
the government's overall focus on results. We conducted a random,
stratified, governmentwide survey of federal managers comparable to
surveys we conducted in 1997 and 2000. We also held eight in-depth focus
groups: seven composed of federal managers from 23 federal agencies and
one composed of GPRA experts. In addition, we interviewed top appointed
officials from the current and previous administrations. Finally, we judgmentally
selected a sample of six agencies to review for changes in the quality of
their strategic plans, performance plans, and performance reports since
their initial efforts. The agencies we selected included the Departments
of Education (Education), Energy (DOE), Housing and Urban Development
(HUD), and Transportation (DOT) and the Small Business (SBA) and Social
Security Administrations (SSA). In making this selection, we chose
agencies that collectively represented the full range of characteristics
in the following four areas: (1) agency size (small, medium, large); (2)
primary program type (direct service, research, regulatory, transfer
payments, and contracts or grants); (3) quality of fiscal year 2000
performance plans based on our previous review;2 and (4) type of agency
(cabinet department and independent agency). Appendix I contains a more
detailed discussion of our scope and methodology. We performed our work in
Washington, D.C., from January through November 2003 in accordance with
generally accepted government auditing standards.

Background

GPRA is the centerpiece of a statutory framework that Congress
put in place during the 1990s to help resolve the long-standing management
problems that have undermined the federal government's efficiency and
effectiveness and to provide greater accountability for results. GPRA was
intended to address several broad purposes, including strengthening the
confidence of the American people in their government; improving federal
program effectiveness, accountability, and service delivery; and enhancing

2U.S. General Accounting Office, Managing for Results: Opportunities for
Continued Improvements in Agencies' Performance Plans, GAO/GGD/AIMD-99-215
(Washington, D.C.: July 20, 1999).

congressional decision making by providing more objective information on
program performance.

GPRA requires executive agencies to complete strategic plans in which they
define their missions, establish results-oriented goals, and identify the
strategies that will be needed to achieve those goals. GPRA requires
agencies to consult with Congress and solicit the input of others as they
develop these plans. Through this strategic planning requirement, GPRA has
required federal agencies to reassess their missions and long-term goals
as well as the strategies and resources they will need to achieve their
goals. Agencies developed their first strategic plans in fiscal year 1997
and have been required to update them at least every 3 years since then.

GPRA also requires executive agencies to prepare annual performance plans
that articulate goals for the upcoming fiscal year that are aligned with
their long-term strategic goals. These performance plans are to include
results-oriented annual goals linked to the program activities displayed
in budget presentations as well as the indicators the agency will use to
measure performance against the results-oriented goals. Agencies developed
their first annual performance plans in fiscal year 1999 and are required
to issue plans annually thereafter to correspond with budget submissions
to Congress.

Finally, GPRA requires agencies to measure performance toward the
achievement of the goals in the annual performance plan and report
annually on their progress in program performance reports. If a goal was
not met, the report is to provide an explanation and present the actions
needed to meet any unmet goals in the future. These reports are intended
to provide important information to agency managers, policymakers, and the
public on what each agency accomplished with the resources it was given.
Agencies issued their first annual performance reports on their fiscal
year 1999 performance in fiscal year 2000 and are required to issue a
report on each subsequent performance plan.

The Office of Management and Budget (OMB) plays an important role in the
management of federal government performance and specifically GPRA
implementation. Part of OMB's overall mission is to ensure that agency
plans and reports are consistent with the President's budget and
administration policies. OMB is responsible for receiving and reviewing
agencies' strategic plans, annual performance plans, and annual
performance reports. To improve the quality and consistency of these
documents, OMB issues annual guidance to agencies for their preparation,
including guidelines on format, required elements, and submission
deadlines. GPRA requires OMB to prepare a governmentwide performance plan,
based on agencies' annual performance plan submissions. OMB also played an
important role in the pilot phase of GPRA implementation by designating
agencies for pilot projects in performance measurement, managerial
accountability and flexibility, and performance budgeting, and assessing
the results of the pilots. Finally, GPRA provides OMB with authority to
grant agencies waivers to certain administrative procedures and controls.

Recent OMB guidance (OMB Circular A-11, July 2003) requires agencies to
submit "performance budgets" in lieu of annual performance plans for their
fiscal year 2005 budget submission to OMB and Congress. According to OMB,
performance budgets should satisfy all the statutory requirements of GPRA
for annual performance plans. In addition, agencies are to include all
performance goals used in the assessment of program performance done under
OMB's Program Assessment Rating Tool (PART) process.3 Moreover, the
guidance states that, until all programs have been assessed under PART,
the performance budget will also include performance goals for agency
programs that have not yet been assessed. The expectation is that agencies
will substitute new or revised performance goals resulting from OMB's
review for goals OMB deemed unacceptable.

Results in Brief

Among the purposes of GPRA cited by Congress was to improve federal
program effectiveness and service delivery by promoting a new focus on
results, service quality, and customer satisfaction through setting
program goals, measuring performance against those goals, and reporting
publicly on progress. Furthermore, GPRA was to improve congressional
decision making by providing more objective information on achieving
statutory objectives and on the relative effectiveness and efficiency of
federal programs and spending. Ten years after enactment, GPRA's
requirements have laid a solid foundation of results-oriented agency
planning, measurement, and reporting that have

3PART is a diagnostic tool developed by OMB that it has been using to rate
the effectiveness of federal programs, with a particular focus on program
results. OMB's goal is to review all federal programs over a 5-year period
using PART. OMB used the tool to review approximately 400 programs between
the fiscal year 2004 and fiscal year 2005 budget cycles: 234 programs were
assessed last year and 173 were assessed this year. Some reassessed
programs were combined for review for the 2005 budget, which is why the
number of programs assessed over the 2 years does not add up to exactly
400.

begun to address these purposes. Focus group participants and high-level
political appointees, as well as OMB officials we interviewed, cited
positive effects of GPRA that they generally attributed to GPRA's
statutory requirements for planning and reporting. Performance planning
and measurement have slowly yet increasingly become a part of agencies'
cultures. The results of our stratified, random sample survey of federal
managers indicate that since GPRA went into effect governmentwide in 1997,
federal managers reported having significantly more of the types of
performance measures called for by GPRA, particularly outcome-oriented
performance measures. Survey data also suggested that more federal
managers, especially at the Senior Executive Service (SES) level, believed
that OMB was paying attention to their agencies' efforts under GPRA. GPRA
has also begun to facilitate the linking of resources to results, although
much remains to be done in this area.

Beginning with agencies' initial efforts to develop effective strategic
plans in 1997 and annual performance plans and reports for fiscal year
1999, Congress, GAO, and others have commented on the quality of those
efforts and provided constructive feedback on how agency plans and reports
could be improved. According to our current review of the strategic plans,
annual performance plans, and annual performance reports of six selected
agencies, these documents reflect much of the feedback that was provided.
For example, goals are more quantifiable and results oriented, and
agencies are providing more information about goals and strategies to
address performance and accountability challenges and the limitations to
their performance data. However, certain serious weaknesses persist, such
as a lack of detail on how annual performance goals relate to strategic
goals and how agencies are coordinating with other entities to address
common challenges and achieve common objectives.

While a great deal of progress has been made in making federal agencies
more results oriented, numerous challenges still exist. As we have noted
before, top leadership commitment and sustained attention to achieving
results, both within the agencies and at OMB, is essential to GPRA
implementation. While one might expect an increase in agency leadership
commitment since GPRA was implemented governmentwide beginning in fiscal
year 1997, federal managers reported that such commitment has not
significantly increased. Furthermore, although OMB has recently
demonstrated leadership in its review of performance information from a
budgetary perspective using the PART tool, it is unclear whether the
results of those reviews, such as changes in program performance measures,
will complement and be integrated with the long-term, strategic focus of
GPRA.

OMB provided significantly less guidance on GPRA implementation for the
fiscal year 2005 budget, compared to the very detailed guidance provided
in prior years. Without consistent guidance from OMB on meeting GPRA
requirements and following best practices, it may be difficult to maintain
the improvements in agency performance plans and reports or bring about
improvements in areas where weaknesses remain. The commitment of top
leadership within agencies, OMB, and Congress is critical to the success
of strategic planning efforts. However, GPRA specifies time frames for
updating strategic plans that do not correspond to presidential or
congressional terms. As a result, an agency may be required to update its
strategic plan a year before a presidential election and without input
from a new Congress. A strategic plan should reflect the policy priorities
of an organization's leaders and the input of key stakeholders if it is to
be an effective management tool.

Managers reported they had more performance measures, but indications that
managers are making greater use of this information to improve performance
are mixed. Additionally, managers reported several human capital-related
challenges that impede results-oriented management, including a lack of
authority and training to carry out GPRA requirements, as well as a lack
of recognition for completing these tasks. Unfortunately, most existing
federal performance appraisal systems are not designed to support a
meaningful performance-based pay system in that they fail to link
institutional, program, unit, and individual performance measurement and
reward systems. Fewer than half of federal managers reported receiving
relevant training in critical results-oriented management-related tasks.
Managers also reported significant challenges persist in setting
outcome-oriented goals, measuring performance, and collecting useful data.
In some agencies, particularly those that have a research and development
component, managers reported difficulties in establishing meaningful
outcome measures. Managers also identified difficulties in distinguishing
between the results produced by the federal program and results caused by
external factors or nonfederal actors, such as with grant programs. Timely
and useful performance information is not always available to federal
agencies, making it more difficult to assess and report on progress
achieved. Finally, agency officials believe that Congress could make
greater use of performance information to conduct oversight and to inform
appropriations decisions. GPRA provides a vehicle for Congress to
explicitly state its performance expectations in outcome-oriented terms
when establishing new programs or exercising oversight of existing
programs that are not achieving desired results.

Mission fragmentation and overlap contribute to difficulties in addressing
crosscutting issues, particularly when those issues require a national
focus, such as homeland security, drug control, and the environment. GPRA
requires a governmentwide performance plan, where these issues could be
addressed in a centralized fashion, but OMB has not issued a distinct plan
since 1999. Most recently, the President's fiscal year 2004 budget focused
on describing agencies' progress in addressing the President's Management
Agenda (PMA) and the results of PART reviews of agency programs. Such
information is important and useful, but it alone is not adequate to
provide a broader and more integrated perspective on planned performance toward
governmentwide outcomes. GAO has previously reported on a variety of
barriers to interagency cooperation, such as conflicting agency missions,
jurisdiction issues, and incompatible procedures, data, and processes. A
strategic plan for the federal government, supported by a set of key
national indicators to assess the government's performance, position, and
progress, could provide an additional tool for governmentwide
reexamination of existing programs, as well as proposals for new programs.
Such a plan could be of particular value in linking agencies' long-term
performance goals and objectives horizontally across the government. In
addition, it could provide a basis for integrating, rather than merely
coordinating, a wide array of federal activities.

To address these challenges, continued and sustained commitment and
leadership are needed. OMB, as the primary focal point for overall
management in the federal government, can provide this leadership and
direction, working with the various management councils and work groups of
the government. Also, governmentwide planning could better facilitate the
integration of federal activities to achieve national goals.

GAO recommends that the Director of OMB (1) fully implement GPRA's
requirement to develop a governmentwide performance plan; (2) articulate
and implement an integrated, complementary relationship between GPRA and
PART; (3) provide clearer and more consistent guidance to executive branch
agencies on how to implement GPRA; (4) continue to maintain a dialogue
with agencies about their performance measurement practices, with a
particular focus on grant-making, research and development, and regulatory
functions, to identify and replicate successful approaches agencies are
using to measure and report on their outcomes, including the use of
program evaluation tools; (5) work with executive branch agencies to
identify the barriers to obtaining timely data to show progress against
performance goals and the best ways to report information when there are
unavoidable lags in data availability; and (6) work with agencies to
ensure they are making adequate investments in training on performance
planning and measurement, with a particular emphasis on how to use
performance information to improve program performance.

We also suggest that Congress consider amending GPRA to require that
updates to agency strategic plans be submitted at least once every 4
years, 12-18 months after a new administration begins its term.
Additionally, consultations with congressional stakeholders on existing
strategic plans should be held at least once every new Congress and
revisions should be made as needed. Further, we suggest Congress use these
consultations and its oversight role to clarify its performance
expectations for agencies. Congress should also consider amending GPRA to
require the President to develop a governmentwide strategic plan.

In commenting on a draft of this report, OMB generally agreed with our
findings and conclusions. OMB agreed to implement most of our
recommendations, but stated that the President's Budget represents the
executive branch's governmentwide performance plan and could also serve as
a governmentwide strategic plan. However, because of the budget's focus on
agency-level expenditures for the upcoming fiscal year, we believe that
the President's Budget provides neither a long-term nor an integrated
perspective on the federal government's performance. OMB's comments appear
in appendix VIII. Our response appears in chapter 5. We also provided
relevant sections of the draft to the six agencies whose plans and reports
we reviewed. DOE, HUD, and SSA disagreed with some of our observations,
and we changed or clarified relevant sections of the report, as
appropriate. Written comments from DOE, HUD, and SSA are reprinted in
appendixes IX, X, and XI, respectively, along with our responses.

                               Principal Findings

GPRA Laid the Foundation for a More Results-Oriented Federal Government

Prior to enactment of GPRA, our 1992 review of the collection and use of
performance data by federal agencies revealed that, although many agencies
collected performance information at the program level, few agencies had
results-oriented performance information to manage or make strategic
policy decisions for the agency as a whole.4 GPRA addressed
agencies' shortcomings by creating a comprehensive and consistent
statutory foundation of required agencywide strategic plans, annual
performance plans, and annual performance reports. Participants in eight
focus groups comprised of experts on GPRA and federal managers from 23
agencies cited the creation of this statutory foundation as one of the key
accomplishments of GPRA. One of the premises of GPRA is that both
congressional and executive branch oversight of federal agency performance
were seriously hampered by a lack of adequate results-oriented goals and
performance information. As noted above, prior to the enactment of GPRA
few agencies reported their performance information externally. OMB
officials we interviewed as part of our current review suggested that OMB
has been a key consumer of the performance information produced under GPRA
and that it has provided a foundation for their efforts to oversee agency
performance.

Federal managers' views of GPRA's effect on the federal government's
ability to deliver results to the American public were mixed. When asked
about the direct effects of GPRA on the public, 23 percent of the federal
managers surveyed agreed to a moderate or greater extent that GPRA
improved their agency's ability to deliver results to the American public.
High-level political appointees we interviewed cited a number of examples
of how the structure of GPRA created a greater focus on results in their
agencies. Participants in our focus groups had mixed perceptions of GPRA's
effect on their agency's ability to deliver results to the American
public. Participants indicated GPRA has had a positive effect by shifting
the focus of federal management from program activities and processes to
achieving the intended results of those programs. Another major
accomplishment of GPRA cited by focus group participants was that GPRA
improved the transparency of government results to the American public.
Other focus group participants had difficulty attributing the results
their agencies achieved directly to GPRA's requirements.

4U.S. General Accounting Office, Program Performance Measures: Federal
Agency Collection and Use of Performance Data, GAO/GGD-92-65 (Washington,
D.C.: May 4, 1992).

Focus group and survey results suggest that performance planning and
measurement have slowly, but increasingly, become a part of agencies'
cultures. Compared to the results of our 1997 governmentwide survey of
federal managers, in our 2003 governmentwide survey more managers reported
having performance measures for their programs. When we asked managers who
said they had performance measures about the extent to which they had each
of five types of measures, they reported statistically significant
increases in all five types between 1997 and 2003.5

Similarly, focus group participants commented on cultural changes that had
taken place within their agencies since the passage of GPRA: the
"vocabulary" of performance planning and measurement (e.g., a greater
focus on performance measurement, an orientation toward outcomes over
inputs and outputs, and an increased focus on program evaluation) had
become more pervasive. This perception is partly borne out by our survey
results. Consistent with the reported increases in results-oriented
performance measures, we also observed a significant
decline in the percentage of federal managers who agreed that certain
factors hindered measuring performance or using the performance
information. Finally, our survey data suggested that more federal
managers, especially at the SES level, believed that OMB was paying
attention to their agencies' efforts under GPRA, but with no corresponding
increase in their concern that OMB would micromanage the programs in their
agencies.

5Types of measures were defined in the questionnaire as follows:
performance measures that tell us how many things we produce or services
we provide (output measures); performance measures that tell us if we are
operating efficiently (efficiency measures); performance measures that
tell us whether or not we are satisfying our customers (customer service
measures); performance measures that tell us about the quality of the
products or services we provide (quality measures); and performance
measures that would demonstrate to someone outside of our agency whether
or not we are achieving our intended results (outcome measures).

Agencies have begun to establish a link between results and resources. Our
1998 assessment of fiscal year 1999 performance plans found that agencies
generally covered the program activities in their budgets, but most plans
did not identify how the funding for those program activities would be
allocated to performance goals.6 However, our subsequent reviews of
performance plans indicate that agencies have made progress in
demonstrating how their performance goals and objectives relate to program
activities in the budget.

We reviewed a sample of six agencies' strategic plans (Education, DOE,
HUD, DOT, SBA, and SSA) and found the quality of the selected plans
reflected improvements over these agencies' initial strategic plans. Our
1997 review of agencies' draft strategic plans found that a significant
amount of work remained to be done by executive branch agencies if their
strategic plans were to fulfill the requirements of GPRA, serve as a basis
for guiding agencies, and help congressional and other policymakers make
decisions about agency activities and programs.7 The six strategic plans
we looked at for this 2003 review reflected many new and continuing
strengths as well as improvements over the 1997 initial draft plans, but
we continued to find certain persistent weaknesses. Of the six elements
required by GPRA, the plans generally discussed all but one-program
evaluation, an area in which we have found agencies often lack capacity.
Although the strategic plans listed the program evaluations agencies
intended to complete over the planning period, they generally did not
address how the agencies planned to use their evaluations to establish new
or revise existing strategic goals, as envisioned by GPRA. Finally,
although not required by GPRA, the strategic plans would have benefited
from more complete discussions of how agencies planned to coordinate and
collaborate with other entities to address common challenges and achieve
common or complementary goals and objectives.

6U.S. General Accounting Office, Managing for Results: An Agenda to
Improve the Usefulness of Agencies' Annual Performance Plans,
GAO/GGD/AIMD-98-228 (Washington, D.C.: Sept. 8, 1998).

7U.S. General Accounting Office, Managing for Results: Critical Issues for
Improving Agencies' Strategic Plans, GAO/GGD-97-180 (Washington, D.C.:
Sept. 16, 1997).

The six selected agencies' fiscal year 2004 annual performance plans
addressed some weaknesses of earlier plans, but there is still significant
room for improvement. During our review of agencies' first annual
performance plans, which presented agencies' annual performance goals for
fiscal year 1999,8 we found that substantial further development was
needed for these plans to be useful in a significant way to congressional
and other decision makers. Most of the 2004 plans that we reviewed showed
meaningful improvements over the fiscal year 1999 plans: they presented a
clearer picture of intended performance, related strategies and resources
more specifically to achieving agency goals, and provided a greater level
of confidence that performance data would be
credible. But these plans also contained a number of serious weaknesses,
such as inadequate discussion of coordination and collaboration and
inconsistent or limited discussions of procedures used to verify and
validate performance data, which limited their quality and undermined
their usefulness.

Our review of the six agencies' fiscal year 2002 performance reports
showed a number of strengths and improvements over their fiscal year 1999
performance reports, as well as areas that needed improvement. As we found
in our earlier reviews, the six agencies' fiscal year 2002 reports
generally allowed for an assessment of progress made in achieving agency
goals. In addition, the majority of agencies discussed the progress
achieved in addressing performance and accountability challenges
identified by agency inspectors general and GAO. However, as with the
fiscal year 1999 reports, many of the weaknesses we identified in the
agencies' fiscal year 2002 reports were related to the significant number
of performance goals not achieved or for which performance data were
unavailable. In addition, the majority of the reports we reviewed did not
include other information required by GPRA, such as a summary of the findings from
program evaluations. Finally, only one of the six agencies clearly linked
its costs to the achievement of performance goals or objectives.

8GAO/GGD/AIMD-98-228.

Challenges to GPRA Implementation Exist

While a great deal of progress has been made in making federal agencies
more results oriented, numerous challenges still exist to effective
implementation of GPRA. We observed in our 1997 report that we would
expect to see managers' positive perceptions on items, such as the extent
to which top leadership is committed to achieving results, become more
prevalent and the gap between SES and non-SES managers begin to narrow as
GPRA and related reforms are implemented. However, these changes do not
appear to be happening to the extent anticipated. The need for strong,
committed, and sustained leadership extends to OMB as well. OMB has shown
a commitment to improving the management of federal programs, both through
its leadership in reviewing agency program performance using PART and
through the PMA. As part of the President's budget preparation,
PART clearly must serve the President's interests. However, it is not well
suited to addressing crosscutting (or horizontal) issues or to looking at
broad program areas in which several programs address a common goal. GPRA
was designed to address the needs of many users of performance
information, including (1) Congress to provide oversight and inform
funding decisions, (2) agency managers to manage programs and make
internal resource decisions, and (3) the public to provide greater
accountability. It is not yet clear to what extent PART performance goals
and measures will compete with agencies' long-term, strategic GPRA
goals and objectives that were established in consultation with Congress
and other stakeholders.

We also found that, while the quality of agency plans and reports has
improved overall since their initial efforts, they continue to suffer from
certain persistent weaknesses as noted above. However, OMB's July 2003
guidance for preparation and submission of annual performance plans is
significantly shorter and less detailed than its 2002 guidance.
Consistent, more explicit OMB guidance on preparing GPRA documents can
help ensure that gains in the quality of GPRA documents are maintained and
provide a resource for agency managers to make further improvements in
those documents.

We also found that timing issues may affect the development of agency
strategic plans that are meaningful and useful to top leadership. The
commitment and sustained attention of top leadership within agencies, OMB,
and Congress is critical to the success of strategic planning efforts. A
strategic plan should reflect the policy priorities of an organization's
leaders and the input of key stakeholders if it is to be an effective
management tool. However, GPRA specifies time frames for updating
strategic plans that do not correspond to presidential or congressional
terms. As a result, an agency may be required to update its strategic plan
a year before a presidential election and without input from a new
Congress. If a new president is elected, the updated plan is essentially
moot and agencies must spend additional time and effort revising it to
reflect new priorities. Our focus group participants, including GPRA
experts, strongly
agreed that this timing issue should be addressed by adjusting time frames
to correspond better with presidential and congressional terms.

The benefit of collecting performance information is fully realized only
when this information is actually used by managers to bring about desired
results. However, federal managers reported mixed results in the use of
performance information. Focus group participants and survey respondents
noted that although many federal managers understand and use
results-oriented management concepts in their day-to-day activities, such
as strategic planning and performance measurement, they do not always
connect these concepts to the requirements of GPRA. According to our 2003
survey results, the reported use of performance information to a great or
very great extent for nine management activities, such as setting program
priorities or setting individual job expectations for staff, ranged from
41 to 66 percent and has not changed significantly since our first survey in
1997. One exception was the reported use to a great or very great extent
of performance information to adopt new program approaches or change work
processes, which was significantly lower than the 1997 results. GPRA's
usefulness to agency leaders and managers as a tool for management and
accountability was cited as a key accomplishment numerous times by focus
group participants. However, a number of alternative views indicated that
the usefulness of GPRA as a management tool has been limited. Our survey
data also indicate that managers perceive that their participation in
activities related to the development and use of performance information
has been limited.

Federal managers continue to confront a range of important human capital
management challenges. These managers report that they are held
accountable for program results, but may not have the decision-making
authority they need to accomplish agency goals. Moreover, fewer than half
of managers reported receiving relevant training. Managers also perceive a
lack of positive recognition for helping agencies achieve results.
Unfortunately, most existing federal performance appraisal systems are not
designed to support a meaningful performance-based pay system in that they
fail to link institutional, program, unit, and individual performance
measurement and reward systems. In our view, one key need is to modernize
performance management systems in executive agencies so that they link to
the agency's strategic plan, related goals, and desired outcomes and are
therefore capable of adequately supporting more performance-based pay and
other personnel decisions.


Managers reported persistent challenges in setting outcome-oriented goals,
measuring performance, and collecting useful data. Focus group
participants and survey respondents noted that outcome-oriented
performance measures were especially difficult to establish when the
program or line of effort was not easily quantifiable. In some agencies,
particularly those that have a research and development component,
managers reported difficulties in establishing meaningful outcome
measures. Managers also identified difficulties in distinguishing between
the results produced by the federal program and results caused by external
factors or nonfederal actors, such as with grant programs. Finally,
managers reported that timely and useful performance information is not
always available.

Crosscutting issues continue to be a challenge to GPRA implementation. Our
review of six agencies' strategic and annual performance plans showed some
improvement in addressing their crosscutting program efforts, but a great
deal of improvement is still necessary. We have previously reported and
testified that GPRA could provide OMB, agencies, and Congress with a
structured framework for addressing crosscutting policy initiatives and
program efforts. OMB could use the provision of GPRA that calls for OMB to
develop a governmentwide performance plan to integrate expected
agency-level performance. Such a plan could also be used to more clearly relate and
address the contributions of alternative federal strategies.
Unfortunately, this provision has not been fully implemented. Instead, OMB
has used the President's Budget to present high-level information about
agencies and certain program performance issues. The current
agency-by-agency focus of the budget does not provide the integrated
perspective of government performance envisioned by GPRA. For example, the
fiscal year 2004 budget identified budget requests and performance
objectives by agency, such as the U.S. Department of Defense, as opposed
to crosscutting governmentwide themes. From this presentation, one could
assume that the only activities the U.S. government planned to carry out
in support of national defense were those listed under the chapter
"Department of Defense." However, the chapter on the fiscal year 2004
budget discussing "the Department of State and International Assistance
Programs," contains a heading titled, "Countering the Threat from Weapons
of Mass Destruction." And while OMB may have a technical reason for not
classifying this task as being related to national defense or homeland
security, it is unclear that a lay reader could make that distinction. The
fiscal year 2005 budget also identified budget requests and performance
objectives by agency, not by crosscutting theme.


A strategic plan for the federal government could provide an additional
tool for governmentwide reexamination of existing programs, as well as
proposals for new programs. If fully developed, a governmentwide strategic
plan could potentially provide a cohesive perspective on the long-term
goals of the federal government and provide a much needed basis for fully
integrating, rather than merely coordinating, a wide array of federal
activities. Successful strategic planning requires the involvement of key
stakeholders. Thus, it could serve as a mechanism for building consensus.
Further, it could provide a vehicle for the President to articulate
long-term goals and a road map for achieving them. In addition, a
strategic plan could provide a more comprehensive framework for
considering organizational changes and making resource decisions. The
development of a set of key national indicators could be used as a basis
to inform the development of governmentwide strategic and annual
performance plans. The indicators could also link to and provide
information to support outcome-oriented goals and objectives in
agency-level strategic and annual performance plans.

Focus group members believed that one of the main challenges to GPRA
implementation was the reluctance of Congress to use performance
information when making decisions, especially appropriations decisions.
However, fewer than one-quarter of federal managers in the 2003 survey
shared that concern. Further, a recent Congressional Research Service
review suggests that Congress uses performance information to some extent,
as evidenced by citations in legislation and committee reports. While
there is concern regarding Congress' use of performance information, it is
important to make sure that this information is useful. In other words,
the substance of the information and its presentation must meet the needs
of the user. Regular consultation with Congress about both the content and format
of performance plans and reports is critical.

As a key user of performance information, Congress also needs to be
considered a partner in shaping agency goals at the outset. GPRA provides
a vehicle for Congress to explicitly state its performance expectations in
outcome-oriented terms when consulting with agencies on their strategic
plans or when establishing new programs or exercising oversight of
existing programs that are not achieving desired results. This would
provide important guidance to agencies that could then be incorporated in
agency strategic and annual performance plans.


Recommendations for Executive Action

GAO recommends that the Director of OMB take five actions to improve
OMB's guidance and oversight of GPRA implementation.

To provide a broader perspective and more cohesive picture of the federal
government's goals and strategies to address issues that cut across
executive branch agencies, we recommend that the Director of OMB fully
implement GPRA's requirement to develop a governmentwide performance plan.

To achieve the greatest benefit from both GPRA and PART, we recommend that
the Director of OMB articulate and implement an integrated and
complementary relationship between the two. GPRA is a broad legislative
framework that was designed to be consultative with Congress and other
stakeholders, and allows for varying uses of performance information. PART
looks through a particular lens for a particular use: the executive budget
formulation process.

To improve the quality of agencies' strategic plans, annual performance
plans, and performance reports and help agencies meet the requirements of
GPRA, we recommend that the Director of OMB provide clearer and more consistent
guidance to executive branch agencies on how to implement GPRA. Such
guidance should include standards for communicating key performance
information in concise as well as longer formats to better meet the needs
of external users who lack the time or expertise to analyze lengthy,
detailed documents.

To help address agencies' performance measurement challenges, we recommend
the Director of OMB engage in a continuing dialogue with agencies about
their performance measurement practices with a particular focus on
grant-making, research and development, and regulatory functions to
identify and replicate successful approaches agencies are using to measure
and report on their outcomes, including the use of program evaluation
tools. Additionally, we recommend that the Director of OMB work with
executive branch agencies to identify the barriers to obtaining timely
data to show progress against performance goals and the best ways to
report information where there are unavoidable lags in data availability.
Governmentwide councils, such as the President's Management Council and
the Chief Financial Officers Council, may be effective vehicles for
working on these issues.


To facilitate the transformation of agencies' management cultures to be
more results oriented, we recommend that the Director of OMB work with
agencies to ensure they are making adequate investments in training on
performance planning and measurement, with a particular emphasis on how to
use performance information to improve program performance.

Matters for Congressional Consideration

GAO also identified two matters for congressional consideration to improve
the governmentwide focus on results.

To ensure that agency strategic plans more closely align with changes in
the federal government leadership, Congress should consider amending GPRA
to require that updates to agency strategic plans be submitted at least
once every 4 years, 12-18 months after a new administration begins its
term. Additionally, consultations with congressional stakeholders should
be held at least once every new Congress and interim updates made to
strategic and performance plans as warranted. Congress should consider
using these consultations along with its traditional oversight role and
legislation as opportunities to clarify its performance expectations for
agencies. This process may provide an opportunity for Congress to develop
a more structured oversight agenda.

To provide a framework to identify long-term goals and strategies to
address issues that cut across federal agencies, Congress should consider
amending GPRA to require the President to develop a governmentwide
strategic plan.

Agency Comments	We provided a copy of the draft report to OMB for comment.
OMB's written comments are reprinted in appendix VIII. In general, OMB
agreed with our findings and conclusions. OMB agreed to implement most of
our recommendations, noting that these recommendations will enhance its
efforts to make the government more results oriented. OMB agreed to (1)
work with agencies to ensure they are provided adequate training in
performance management, (2) revise its guidance to clarify the integrated
and complementary relationship between GPRA and PART, and (3) continue to
use PART to improve agency performance measurement practices and share
those practices across government.

In response to our recommendation that OMB fully implement GPRA's
requirement to develop a governmentwide performance plan, OMB stated that
the President's Budget represents the executive branch's
governmentwide performance plan. However, according to GAO's review, the
agency-by-agency focus of the budget over the past few years does not
provide an integrated perspective of government performance, and thus does
not meet GPRA's requirement to provide a "single cohesive picture of the
annual performance goals for the fiscal year." To clarify this point, we
added an example that illustrates the lack of integration between
crosscutting issues in the budget.

In response to our matter for congressional consideration that Congress
should consider amending GPRA to require the President to develop a
governmentwide strategic plan, OMB noted that the budget serves as the
governmentwide strategic plan. However, the President's Budget focuses on
establishing agency budgets for the upcoming fiscal year. Unlike a
strategic plan, it provides neither a long-term nor an integrated
perspective on the federal government's activities. A governmentwide
strategic plan should provide a cohesive perspective on the long-term
goals of the federal government and provide a basis for fully integrating,
rather than primarily coordinating, a wide array of existing and
relatively short-term federal activities.

We provided relevant sections of the draft report to Education, DOE, HUD,
SBA, SSA, and DOT. Education and SBA did not provide any comments, while
DOT provided minor technical comments. DOE, HUD, and SSA disagreed with
some of our observations on their strategic plans, performance plans, and
performance reports; we changed or clarified relevant sections of the
report, as appropriate. Written comments from DOE, HUD, and SSA are
reprinted in appendixes IX, X, and XI, respectively, along with our
responses.

Chapter 1

Introduction

From defending the homeland against terrorists, to preventing the spread
of infectious diseases, to providing a reliable stream of social security
income to retirees and supporting the transition from welfare to work, the
federal government provides funding and services to the American public
that can affect their lives in critical ways every day. However, the
federal government is in a period of profound transition and faces an
array of challenges and opportunities to enhance performance, ensure
accountability, and position the nation for the future. A number of
overarching trends, such as diffuse security threats and homeland security
needs, increasing global interdependency, the shift to knowledge-based
economies, and the looming fiscal challenges facing our nation, drive the
need to reconsider the proper role for the federal government in the 21st
century, how the government should do business (including how it should be
structured), and in some instances, who should do the government's
business.

Without effective short- and long-term planning, which takes into account
the changing environment and needs of the American public and the
challenges they face and establishes goals to be achieved, federal
agencies risk delivering programs and services that may or may not meet
society's most critical needs. At a cost to taxpayers of over $2 trillion
annually, the federal government should be able to demonstrate to the
American public that it can anticipate emerging issues, develop sound
strategies and plans to address them, and be accountable for the results
that have been achieved.

Concerned that the federal government was more focused on program
activities and processes than the results to be achieved, Congress passed
the Government Performance and Results Act of 1993 (GPRA).1 The act
required federal agencies to develop strategic plans with long-term
strategic goals, annual goals linked to achieving the long-term goals, and
annual reports on the results achieved. Now that GPRA has been in effect
for 10 years, you asked us to assess the effectiveness of GPRA in creating
a focus on results in the federal government. Specifically, this report
discusses (1) the effect of GPRA over the last 10 years in creating a
governmentwide focus on results and the government's ability to deliver
results to the American public, including an assessment of the changes in
the overall quality of agencies' strategic plans, annual performance
plans, and annual performance reports; (2) the challenges agencies face in
measuring performance and using performance information in management
decisions; and (3) how the federal government can continue to shift toward
a more results-oriented focus.

1Pub. L. No. 103-62.

Impact of Emerging Trends and Fiscal Challenges

With the 21st century challenges we are facing, it is more vital than ever
to maximize the performance of federal agencies in achieving their
long-term goals. The federal government must address and adapt to major
trends in our country and around the world. At the same time, our nation
faces serious long-term fiscal challenges. Increased pressure also comes
from world events: both from the recognition that we cannot consider
ourselves "safe" between two oceans-which has increased demands for
spending on homeland security-and from the United States (U.S.) role in
combating terrorism in an increasingly interdependent world. To be able to
assess federal agency performance and hold agency managers accountable for
achieving their long-term goals, we need to know what the level of
performance is. GPRA planning and reporting requirements can provide this
essential information.

Our country's transition into the 21st century is characterized by a
number of key trends, including

o 	the national and global response to terrorism and other threats to our
personal and national security;

o 	the increasing interdependence of enterprises, economies, markets,
civil societies, and national governments, commonly referred to as
globalization;

o  the shift to market-oriented, knowledge-based economies;

o  an aging and more diverse U.S. population;

o 	rapid advances in science and technology and the opportunities and
challenges created by these changes;

o 	challenges and opportunities to maintain and improve the quality of
life for the nation, communities, families, and individuals; and

o 	the changing and increasingly diverse nature of governance structures
and tools.


As the nation and government policymakers grapple with the challenges
presented by these evolving trends, they do so in the context of rapidly
building fiscal pressures. GAO's long-range budget simulations show that
this nation faces a large and growing structural deficit due primarily to
known demographic trends and rising health care costs. The fiscal
pressures created by the retirement of the baby boom generation and rising
health costs threaten to overwhelm the nation's fiscal future. As figure 1
shows, by 2040, absent reform or other major tax or spending policy
changes, projected federal revenues will likely be insufficient to pay
more than interest on publicly held debt. Further, our recent shift from
surpluses to deficits means the nation is moving into the future in a more
constrained fiscal position.

Figure 1: Composition of Spending as a Share of GDP Assuming Discretionary
Spending Grows with GDP after 2003 and All Expiring Tax Provisions Are
Extended

[Figure: stacked chart of spending as a percentage of GDP (0 to 50
percent) for fiscal years 2003, 2015, 2030, and 2040, showing net
interest, Social Security, Medicare and Medicaid, and all other spending.]

Source: GAO's January 2004 analysis.

Notes: Although all expiring tax cuts are extended, revenue as a share of
gross domestic product (GDP) increases through 2013 due to (1) real
bracket creep, (2) more taxpayers becoming subject to the Alternative
Minimum Tax, and (3) increased revenue from tax-deferred retirement
accounts. After 2013, revenue as a share of GDP is held constant. This
simulation assumes that currently scheduled Social Security benefits are
paid in full throughout the simulation period.
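To make the mechanics behind such long-range simulations concrete, the
following sketch projects spending and debt as shares of GDP when part of
spending grows faster than the economy. Every parameter (revenue share,
spending shares, growth rates, interest rate, starting debt) is an
illustrative assumption chosen for this sketch; none is an input to GAO's
actual simulation model.

# Illustrative sketch of long-range budget simulation mechanics.
# All parameters are hypothetical; they only demonstrate how spending
# that grows faster than GDP compounds into a structural deficit.

def project(start=2003, end=2040, revenue=0.18, other=0.10,
            entitlements=0.08, excess_growth=0.015,
            debt=0.36, r=0.05, g=0.025):
    """Project spending and debt as shares of GDP, year by year."""
    for year in range(start, end + 1):
        interest = debt * r
        spending = entitlements + other + interest
        primary_deficit = entitlements + other - revenue
        # Debt/GDP compounds at the interest rate, diluted by GDP growth,
        # plus whatever the primary deficit adds each year.
        debt = debt * (1 + r) / (1 + g) + primary_deficit
        # Health and retirement spending outgrow GDP by a fixed margin.
        entitlements *= 1 + excess_growth
        if year % 10 == 0:
            print(f"{year}: spending {spending:.1%} of GDP, "
                  f"debt {debt:.1%} of GDP")

project()

Even under these mild assumed parameters, an interest rate above the
growth rate combined with entitlement spending that outgrows GDP
reproduces the qualitative pattern in figure 1: interest and entitlement
costs gradually crowd out everything else.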

The United States has had a long-range budget deficit problem for a number
of years, even during recent years in which we had significant annual
budget surpluses. Unfortunately, the days of surpluses are gone, and our
current and projected budget situation has worsened significantly. The
bottom line is that our projected budget deficits are not manageable
without significant changes in "status quo" programs, policies, processes,
and operations.

GPRA Background	GPRA is the centerpiece of a statutory framework that
Congress put in place during the 1990s to help resolve the long-standing
management problems that have undermined the federal government's
efficiency and effectiveness and to provide greater accountability for
results. In addition to GPRA, the framework comprises the Chief Financial
Officers Act of 1990, as amended by the Government Management Reform Act
of 1994, and information technology reform legislation, including the
Paperwork Reduction Act of 1995 and the Clinger-Cohen Act of 1996.
Together, these laws provide a powerful framework for developing and
integrating information about agencies' missions and strategic priorities,
the results-oriented performance goals that flow from those priorities,
performance data to show the level of achievement of those goals, and the
relationship of reliable and audited financial information and information
technology investments to the achievement of those goals.

GPRA was intended to address several broad purposes, including
strengthening the confidence of the American people in their government;
improving federal program effectiveness, accountability, and service
delivery; and enhancing congressional decision making by providing more
objective information on program performance.

The basic requirements of GPRA for the preparation of strategic plans,
annual performance plans, and annual program performance reports by
executive branch agencies are the following:

o 	strategic plans, covering a period of at least 5 years and updated at
least every 3 years, that include the agency's mission statement,
long-term goals and objectives, and strategies for achieving them;

o 	annual performance plans, beginning with fiscal year 1999, that
establish performance goals for the fiscal year and the measures that will
be used to gauge progress toward those goals; and

o 	annual program performance reports that compare actual performance
with the annual performance goals.

The Office of Management and Budget (OMB) plays an important role in the
management of the federal government's performance, and specifically GPRA
implementation. Part of OMB's overall mission is to ensure that agency
plans and reports are consistent with the President's Budget and
administration policies. OMB is responsible for receiving and reviewing
agencies' strategic plans, annual performance plans, and annual
performance reports. To improve the quality and consistency of these
documents, OMB issues annual guidance to agencies for their preparation,
including guidelines on format, required elements, and submission
deadlines.2 GPRA requires OMB to prepare the overall governmentwide
performance plan, based on agencies' annual performance plan submissions.
OMB also played an important role in the pilot phase of GPRA
implementation by designating agencies for pilot projects in performance
measurement, managerial accountability and flexibility, and performance
budgeting, and assessing the results of the pilots. Finally, GPRA provides
OMB with authority to grant agencies waivers to certain administrative
procedures and controls.

Recent OMB guidance3 requires agencies to submit "performance budgets" in
lieu of annual performance plans for their budget submission to OMB and
Congress. Performance budgets are to meet all the statutory requirements
of GPRA for annual performance plans. In addition, agencies are to include
all performance goals used in the assessment of program performance done
under OMB's Program Assessment Rating Tool (PART) process.4 Moreover, the
guidance states that until all programs have been assessed using PART, the
performance budget will also include performance goals for agency programs
that have not yet been assessed. The expectation is that agencies are to
substitute new or revised performance goals resulting from OMB's review
for goals OMB deemed unacceptable.

2The guidance on the preparation of strategic plans, annual performance
plans, and program performance reports is contained in OMB Circular A-11,
Part 6.

3OMB Circular A-11, July 2003.

4PART is a diagnostic tool developed by OMB that it has been using to rate
the effectiveness of federal programs with a particular focus on program
results. OMB's goal is to review all federal programs over a 5-year period
using the PART tool. OMB used the tool to review approximately 400
programs between the fiscal year 2004 budget cycle and the fiscal year
2005 budget cycle: 234 programs were assessed last year and 173 were
assessed this year. Some reassessed programs were combined for review for
the 2005 budget, which is why the number of programs assessed over the 2
years does not add up to exactly 400 programs.


In crafting GPRA, Congress recognized that managerial accountability for
results is linked to managers having sufficient flexibility, discretion,
and authority to accomplish desired results. GPRA authorizes agencies to
apply for managerial flexibility waivers in their annual performance plans
beginning with fiscal year 1999. The authority of agencies to request
waivers of administrative procedural requirements and controls is intended
to provide federal managers with more flexibility to structure agency
systems to better support program goals. The nonstatutory requirements
that OMB can waive under GPRA generally involve the allocation and use of
resources, such as restrictions on shifting funds among items within a
budget account. Agencies must report in their annual performance reports
on the use and effectiveness of any GPRA managerial flexibility waivers
that they receive.

OMB was to designate at least five agencies from the first set of pilot
projects to test managerial accountability and flexibility during fiscal
years 1995 and 1996. We previously reported on the results of the pilot
project to implement managerial flexibility waivers and found that the
pilot did not work as intended.5 OMB did not designate any of the seven
departments and one independent agency that submitted a total of 61 waiver
proposals as GPRA managerial accountability and flexibility pilots. For
about three-quarters of the waiver proposals, OMB or other central
management agencies determined that the waivers were not allowable for
statutory or other reasons or that the requirement for which the waivers
were proposed no longer existed. For the remaining proposals, OMB or other
central management agencies approved waivers or developed compromises by
using authorities that were already available independent of GPRA.

Under GPRA, another set of pilot projects, scheduled for fiscal years 1998
and 1999, was to test performance budgeting, i.e., the presentation of the
varying levels of performance that would result from different budget
levels. We previously reported that OMB initially deferred these pilots to
give federal agencies time to develop the capability of calculating the
effects of marginal changes in cost or funding on performance.6

5U.S. General Accounting Office, GPRA: Managerial Accountability and
Flexibility Pilot Did Not Work as Intended, GAO/GGD-97-36 (Washington,
D.C.: Apr. 10, 1997).

6U.S. General Accounting Office, Managing for Results: Agency Progress in
Linking Performance Plans With Budgets and Financial Statements,
GAO-02-236 (Washington, D.C.: Jan. 4, 2002).

When the pilots began in August 1999, OMB designed them as case studies prepared by
OMB staff to demonstrate how performance information could be used to
compare alternatives and to develop funding recommendations for
incorporation into the President's fiscal year 2001 budget submission.

On January 18, 2001, OMB reported the results of five performance
budgeting pilots that explored agencies' capabilities of more formally
assessing the effects of different funding levels on performance goals.
OMB selected the pilots to reflect a cross section of federal functions
and capabilities so that a representative range of measurement and
reporting issues could be explored. In its report, OMB concluded that
legislative changes were not needed. OMB reported that the pilots
demonstrated that assuring further performance measurement improvements
and steadily expanding the scope and quality of performance measures is
paramount, and that the existing statute provides sufficient latitude for
such improvement.

Overall, OMB concluded that the pilots raised several key challenges about
performance budgeting at the federal level, including, for example, the
following:

o 	In many instances, measuring the effects of marginal, annual budget
changes on performance is not precise or meaningful.

o 	While agencies continue to shift from an almost total reliance on
output measures to outcome measures, it will be much more difficult to
associate specific resource levels with those outcomes, particularly over
short periods of time.

o 	Establishing clear linkages between funding and outcomes will vary by
the nature of the program and the number of external factors.

o 	Delays in the availability of performance data, sometimes caused by
agencies' reliance on nonfederal program partners for data collection,
will continue to present synchronization problems during budget
formulation.


Scope and Methodology

To meet the three objectives stated earlier, we reviewed our extensive
prior work on GPRA best practices and implementation and collected
governmentwide data to assess the government's overall focus on results.
We conducted a random, stratified, governmentwide survey of federal
managers comparable to surveys we conducted in 1997 and 2000. We also held
eight in-depth focus groups: seven with federal managers from 23
federal agencies and one with GPRA experts. We also interviewed top
appointed officials from the current and previous administrations.
Finally, we judgmentally selected a sample of six agencies to review for
changes in the quality of their strategic plans, performance plans, and
performance reports since their initial efforts. The agencies we selected
were the Departments of Education (Education), Energy (DOE), Housing and
Urban Development (HUD), and Transportation (DOT) and the Small Business
(SBA) and Social Security Administrations (SSA). In making this selection,
we chose agencies that collectively represented the full range of
characteristics in the following four areas: (1) agency size (small,
medium, large); (2) primary program types (direct service, research,
regulatory, transfer payments, and contracts or grants); (3) quality of
fiscal year 2000 performance plan based on our previous review (low,
medium, high);7 and (4) type of agency (cabinet department and independent
agency). Appendix I contains a more detailed discussion of our scope and
methodology.
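As a rough illustration of the sampling approach, the sketch below draws a
proportionally allocated, stratified random sample from a frame of
managers. The strata, counts, and sample size are hypothetical; GAO's
actual frame, strata, and allocation rules were different and are
described in appendix I.

# Minimal sketch of a stratified random sample with proportional
# allocation. Strata and counts are hypothetical.
import random

frame = {
    "SES": 600,       # hypothetical stratum sizes
    "GS-15": 4000,
    "GS-14": 9000,
    "GS-13": 16000,
}
total = sum(frame.values())
sample_size = 1200

random.seed(1)  # reproducible draw for the example
sample = {}
for stratum, n in frame.items():
    k = round(sample_size * n / total)            # proportional allocation
    sample[stratum] = random.sample(range(n), k)  # draw without replacement

for stratum, drawn in sample.items():
    print(f"{stratum}: {len(drawn)} of {frame[stratum]} sampled")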

We performed our work in Washington, D.C., from January through November
2003 in accordance with generally accepted government auditing standards.
Major contributors to this report are listed in appendix XII.

7GAO/GGD/AIMD-99-215. Based on how we had rated agencies' annual
performance plans on their picture of performance, specificity of
strategies and resources, and the degree of confidence that performance
information will be credible, we assigned numeric values to each agency's
rating (e.g., clear=3, general=2, limited=1, unclear=0) and added them up
to determine overall quality of high, medium, or low. An agency's plan was
considered high quality if its score was between 7-9, a score of 5-6 was
considered medium quality, and a score of 3-4 was low. No agencies
received a score lower than 3.
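The scoring scheme this footnote describes reduces to a lookup, a sum, and
a bucketing step; the following sketch restates it in code, with the three
dimension names shortened for readability.

# Sketch of the plan-quality scoring described above: three rated
# dimensions mapped to numeric values, summed, then bucketed.

VALUES = {"clear": 3, "general": 2, "limited": 1, "unclear": 0}

def overall_quality(picture, specificity, confidence):
    score = VALUES[picture] + VALUES[specificity] + VALUES[confidence]
    if score >= 7:
        return "high"      # 7-9
    if score >= 5:
        return "medium"    # 5-6
    return "low"           # 3-4; no agency scored below 3

print(overall_quality("clear", "general", "limited"))  # 3+2+1 = 6 -> medium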

Chapter 2

GPRA Established the Foundation for a More Results-Oriented Federal
Government

Among the purposes of GPRA cited by Congress was to improve federal
program effectiveness and service delivery by promoting a new focus on
results, service quality, and customer satisfaction by setting program
goals, measuring performance against goals, and reporting publicly on
progress. Furthermore, GPRA was to improve congressional decision making
by providing better information on achieving objectives, and on the
relative effectiveness and efficiency of federal programs and spending.
Ten years after enactment, GPRA's requirements have laid a foundation of
results-oriented agency planning, measurement, and reporting that has
begun to address these purposes. Focus group participants, high-level
political appointees, and OMB officials we interviewed cited positive
effects of GPRA that they generally attributed to GPRA's statutory
requirements for planning and reporting. Our survey results indicate that
since GPRA went into effect governmentwide in 1997, federal managers
reported having significantly more of the types of performance measures
called for by GPRA, particularly outcome-oriented performance measures.
GPRA has also begun to facilitate the linking of resources to results,
although much remains to be done in this area.

GPRA Statutory Requirements Laid a Foundation for Agencywide
Results-Oriented Management

Prior to enactment of GPRA, our 1992 review of the collection and use of
performance data by federal agencies revealed that, although many agencies
collected performance information at the program level, few agencies had
results-oriented performance information to manage or make strategic
policy decisions for the agency as a whole.1 Federal agencies surveyed
indicated that many had a single, long-term plan that contained goals,
standards, or objectives for the entire agency or program. Many of these
agencies also reported they collected a wide variety of performance
measures. However, in validating the survey responses with a sample of
agencies, we found that measures were typically generated and used by
program-level units within an agency and focused on measuring work
activity levels and outputs or compliance with statutes. Little of this
performance information was transparent to Congress, OMB, or the public
and few of the agencies we visited used performance measures to manage
toward long-term objectives. Few of the agencies surveyed had in place the
infrastructure needed to tie plans and measures together, such as a
unified strategic plan with measurable goals, an office that collected
performance measures, and regular consolidated reports.

1GAO/GGD-92-65.


GPRA addressed these shortcomings by creating a comprehensive and
consistent statutory foundation of required agencywide strategic plans,
annual performance plans, and annual performance reports. In contrast to
prior federal government efforts to measure performance, GPRA explicitly
emphasized that, in addition to performance indicators that agencies may
need to manage programs on a day-to-day basis, such as quantity, quality,
timeliness, and cost, agencies also needed outcome-oriented goals and
measures that assess the actual results, effects, or impact of a program
or activity compared to its intended purpose.

Expert and agency focus group participants cited the creation of this
statutory foundation as one of the key accomplishments of GPRA.
Participants agreed that GPRA created a framework in statute for federal
agencies to plan their activities in order to become more results oriented
and provided a managerial tool for program accountability. Using this
framework, agencies could develop and focus on strategies to carry out the
programs they administer; set goals and identify performance indicators
that will inform them whether or not they achieved the performance they
expected; and determine what impact, if any, their programs have had on
the American public. According to the experts in one of our focus groups,
comparing federal agencies' current mission statements contained in their
strategic plans to what they were in the past demonstrates that the
agencies have done some "soul searching" to get a better sense of what
their role is (or should be) and how they can achieve it. Given that GPRA
is in statute, participants indicated that the use of this planning
framework is likely to be sustained within agencies.

One of the premises of GPRA is that both congressional and executive
branch oversight of federal agency performance had been seriously hampered
by a lack of adequate results-oriented goals and performance information. As
noted above, prior to the enactment of GPRA few agencies reported their
performance information externally. OMB officials we interviewed as part
of our current review suggested that OMB has been a key consumer of the
performance information produced under GPRA and that it has provided a
foundation for their efforts to oversee agency performance.


For example, during the development of the fiscal year 2004 budget, OMB
used PART to review and rate 234 federal programs. We recently reported
that one of PART's major impacts was its ability to highlight OMB's
recommended changes in program management and design.2 PART reviews look
at four elements (program purpose and design, strategic planning, program
management, and program results/accountability) and rate the program on how
well each of these elements is executed. However, without the foundation
of missions, goals, strategies, performance measures, and performance
information generated under GPRA, such oversight would be difficult to
carry out.
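To picture how four element-level ratings might roll up into a single
program score, consider the weighted sum sketched below. The 20/10/20/50
weights follow OMB's published PART defaults as we understand them, but
for this sketch they, and the simplified 0-100 scoring scale, should be
treated as assumptions.

# Illustrative roll-up of PART's four elements into one score.
# Weights and the 0-100 element scale are assumptions for this sketch.

WEIGHTS = {
    "purpose_design": 0.20,
    "strategic_planning": 0.10,
    "program_management": 0.20,
    "results_accountability": 0.50,  # results dominate the total
}

def overall_score(element_scores):
    """Weighted sum of element scores given on a 0-100 scale."""
    return sum(WEIGHTS[e] * s for e, s in element_scores.items())

print(overall_score({
    "purpose_design": 80,
    "strategic_planning": 70,
    "program_management": 90,
    "results_accountability": 40,
}))  # 16 + 7 + 18 + 20 = 61.0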

Participants in most of our focus groups also agreed that GPRA has been a
driving force behind many cultural changes that have occurred within
federal agencies. Highlighting the focus on results, participants stated
that GPRA had stimulated a problem-solving approach within federal
agencies and encouraged agency managers to think creatively when
developing performance indicators for their programs. GPRA has also
changed the dialogue within federal agencies; front-line managers and
staff at lower levels of the organization now discuss budget issues in
connection with performance. Similarly, experts noted that information
about performance management and resource investments is more frequently
communicated between agency officials and Congress than in the past.
Within agencies, GPRA documents can provide a context of missions, goals,
and strategies that political appointees can use to articulate agencies'
priorities.

Views on GPRA's Effect on the Federal Government's Ability to Deliver
Results to the American Public Were Mixed

A key purpose of GPRA was "to improve the confidence of the American
people in the capability of the Federal Government, by systematically
holding Federal agencies accountable for achieving program results." When
asked about the direct effects of GPRA on the public in our 2003 survey,
an estimated 23 percent of federal managers agreed to a moderate or
greater extent that GPRA improved their agency's ability to deliver
results to the American public; a larger percentage, 38 percent, chose a "no
basis to judge/not applicable" category.

When a similar question was posed in our focus groups with experts and
federal managers, participants' views were generally mixed. Some federal

2U.S. General Accounting Office, Performance Budgeting: Observations on
the Use of OMB's Program Assessment Rating Tool for the Fiscal Year 2004
Budget, GAO-04-174 (Washington, D.C.: Jan. 30, 2004).


managers in our focus groups agreed that GPRA has had a positive effect on
raising awareness of many performance issues, which in and of itself is
a way of delivering results. The information gathered and reported for
GPRA allows agencies to make better-informed decisions, which improves
their ability to achieve results. Other participants stated that while
certain aspects of GPRA-related work have been positive, agencies' ability
to deliver results and public awareness of their activities cannot always
be exclusively attributed to GPRA. For example, some participants stated
that many agencies rely on grant recipients to carry out their work, and
delivering results to the American public depends, to a large extent, on
the diligence of these organizations to implement their programs; such
results would not change dramatically if GPRA were no longer a
requirement.

A number of the political appointees we interviewed cited examples of
outcomes they believe would not have occurred without the structure of
GPRA. For example, a former deputy secretary of the Department of Veterans
Affairs (VA) stated that "the Results Act brought about a fundamental
rethinking of how we managed our programs and processes. . . . We
developed a strategic plan that was veteran-focused. . . . We made every
effort to define program successes from the veteran's perspective." A
former Chief Financial Officer (CFO) cited Customs Service goals to reduce
the quantity of illegal drugs flowing into the United States and the Food
and Drug Administration's focus on speeding up the approval of new drugs
as examples of outcomes that can make a big difference in people's lives.

Another major accomplishment of GPRA cited by our focus group participants
is that GPRA improved the transparency of government results to the
American public. As noted above, prior to GPRA, few agencies reported
performance results outside of their agencies. Focus group participants
indicated a key accomplishment of GPRA was its value as a communication
tool by increasing the transparency to the public of what their agencies
did in terms the public could understand. For example, information on
agencies' strategic plans, performance goals, measures, and results is
easily obtainable from agency Web sites. One focus group participant
commented that GPRA helps bureaucrats explain to nonbureaucrats what the
federal government does in terms they can better understand. Other
comments indicated that because of GPRA agencies could now tell Congress
and the American public what they are getting for their money.


More Managers Reported Having Performance Measures

A fundamental element in an organization's efforts to manage for results
is its ability to set meaningful goals for performance and to measure
performance against those goals. From our 2003 survey we estimate that 89
percent of federal managers overall said there were performance
measures for the programs they were involved with. This is a statistically
significantly higher percentage than the 76 percent of managers who
answered yes to this item on our 1997 survey. (See fig. 2.)

Figure 2: Percentage of Federal Managers Who Reported That There Were
Performance Measures for the Programs with Which They Were Involved

[Figure: bar chart of the percentage of federal managers, on a scale of 0
to 100 percent, for 1997, 2000, and 2003; the 2003 value was 89 percent.]

Source: GAO.

aThere was a statistically significant difference between 1997 and 2003
surveys.
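For readers curious how such year-to-year differences can be tested, the
sketch below applies a textbook two-proportion z-test to the 76 percent
(1997) and 89 percent (2003) figures. The per-survey sample sizes are
assumed for illustration only; a real analysis of a stratified survey
would use design-adjusted variances rather than this simple pooled
estimate.

# Hedged illustration: two-proportion z-test for 76% (1997) vs. 89% (2003).
# Sample sizes are hypothetical; the actual survey design was stratified.
from math import erf, sqrt

def two_prop_z(p1, n1, p2, n2):
    pooled = (p1 * n1 + p2 * n2) / (n1 + n2)
    se = sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    z = (p2 - p1) / se
    # Two-sided p-value from the normal CDF.
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

z, p = two_prop_z(0.76, 500, 0.89, 500)  # assumed n = 500 per survey
print(f"z = {z:.2f}, p = {p:.2g}")       # z about 5.4, far below p = 0.05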


Moreover, when we asked managers who said they had performance measures
which of five types of measures they had to a great or very great extent,
they reported increases in all five types of measures between 1997 and
2003,3 all of which were statistically significant. (See fig. 3.) Notably,
the percentage of managers reporting outcome measures, defined as
"performance measures that demonstrate to someone outside the organization
whether or not intended results are being achieved," grew from a low of 32
percent in 1997 to the current estimate of 55 percent, a level that is on
par with output measures for the first time since we began our survey.

Figure 3: Percentage of Federal Managers Who Reported Having Specific
Types of Performance Measures to a Great or Very Great Extent

[Figure: grouped bar chart, on a scale of 0 to 60 percent, of five types
of measures (output, efficiency, customer service, quality, and outcome)
for 1997, 2000, and 2003; in 2003, output measures stood at 54 percent and
outcome measures at 55 percent.]

Source: GAO.

aThere was a statistically significant difference between 1997 and 2003
surveys.

3Types of measures were defined in the questionnaire as follows:
performance measures that tell us how many things we produce or services
we provide (output measures); performance measures that tell us if we are
operating efficiently (efficiency measures); performance measures that
tell us whether or not we are satisfying our customers (customer service
measures); performance measures that tell us about the quality of the
products or services we provide (quality measures); and performance
measures that would demonstrate to someone outside of our agency whether
or not we are achieving our intended results (outcome measures).


Similarly, focus group participants commented on certain cultural changes
that had taken place within their agencies since the passage of GPRA, in
which the "vocabulary" of performance planning and measurement (e.g., a
greater focus on performance management, an orientation toward outcomes
over inputs and outputs, and an increased focus on program evaluation) had
become more pervasive. This perception is partly borne out by our survey
results. Since 1997, those reporting a moderate to extensive knowledge of
GPRA and its requirements shifted significantly from 26 percent to 41
percent in 2003, while those reporting no knowledge of GPRA declined
significantly from 27 percent to 20 percent. (See fig. 4.)

Figure 4: Percentage of Federal Managers Who Reported Their Awareness of
GPRA

[Figure: chart, on a scale of 0 to 100 percent, for 1997, 2000, and 2003,
showing managers reporting moderate to extensive knowledge of GPRA (41
percent in 2003) and managers reporting no knowledge of GPRA.]

Source: GAO.

aThere was a statistically significant difference between 1997 and 2003
surveys.

Consistent with our survey results indicating increases in
results-oriented performance measures and increasing GPRA knowledge, we
also observed a significant decline in the percentage of federal managers
who agreed that certain factors hindered measuring performance or using
the performance information. For example, as shown in figure 5, of those
who expressed an


opinion, the percentage of managers who noted that determining meaningful
measures was a hindrance to a great or very great extent was down
significantly from 47 percent in 1997 to 36 percent in 2003. Likewise, the
percentage that agreed to a great or very great extent that different
parties are using different definitions to measure performance was a
hindrance also declined significantly from 49 percent in 1997 to 36
percent in 2003.

Figure 5: Percentage of Federal Managers Who Reported Hindrances to
Measuring Performance or Using the Performance Information to a Great or
Very Great Extent

[Figure: grouped bar chart, on a scale of 0 to 60 percent, for 1997, 2000,
and 2003, of two hindrances: difficulty determining meaningful measures
and different parties using different definitions to measure performance;
the latter stood at 49 percent in 1997.]

Source: GAO.

Note: Percentages are based on those respondents answering on the extent
scale.

aThere was a statistically significant difference between 1997 and 2003.

Finally, our survey data suggested that more federal managers, especially
at the Senior Executive Service (SES) level, believed that OMB was paying
attention to their agencies' efforts under GPRA. Moreover, there was no
corresponding increase in their concern that OMB would micromanage the
programs in their agencies. In our survey, we asked respondents to assess


the extent to which OMB pays attention to their agencies' efforts under
GPRA. As seen in figure 6, in 2003, the percentage of respondents who
responded "Great" or "Very Great" to this question (31 percent) was
significantly higher than in 2000 (22 percent). Among SES respondents, the
increase was even more dramatic, from 33 to 51 percent. We also
asked respondents the extent to which their concern that OMB would
micromanage programs in their agencies was a hindrance to measuring
performance or using performance information. Among those expressing an
opinion, the percentage who considered it a hindrance to a great or very
great extent was low (around 24 percent in 2003), with no significant
difference between 2000 and 2003.

Figure 6: Percentage of Federal Managers and SES Managers Who Reported
That OMB Paid Attention to Their Agency's Efforts under GPRA to a Great or
Very Great Extent

[Figure: chart, on a scale of 0 to 100 percent, for 2000 and 2003, of
federal managers overall, SES managers, and non-SES managers; overall, 22
percent in 2000, and 51 percent of SES managers in 2003.]

Source: GAO.

aThere was a statistically significant difference between 2000 and 2003
surveys.


GPRA Has Begun to Establish a Link between Resources and Results

Among its major purposes, GPRA aims for a closer and clearer linkage
between requested resources and expected results. The general concept of
linking performance information with budget requests is commonly known as
performance budgeting. Budgeting is and will remain an exercise in
political choice, in which performance can be one, but not necessarily the
only, factor underlying decisions. However, efforts to infuse performance
information into resource allocation decisions can more explicitly inform
budget discussions and focus them, both in Congress and in agencies, on
expected results rather than on inputs.

GPRA established a basic foundation for performance budgeting by requiring
that an agency's annual performance plan cover each program activity in
the President's budget request for that agency. GPRA does not specify any
level of detail or required components needed to achieve this coverage.
Further, GPRA recognizes that agencies' program activity structures are
often inconsistent across budget accounts and thus gives agencies the
flexibility to consolidate, aggregate, or disaggregate program activities,
so long as no major function or operation of the agency is omitted or
minimized. In addition, OMB guidance has traditionally required agencies
to display, by budget program activity, the funding level being applied to
achieve performance goals. OMB's guidance on developing fiscal year 2005
performance budgets also encourages a greater link between performance and
funding levels; however, it places greater emphasis on linking agencies'
long-term and annual performance goals to individual programs. At a
minimum, agencies are to align resources at the program level, but they
are encouraged to align resources at the performance goal level. Resources
requested for each program are to be the amounts needed to achieve program
performance goal targets.
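As a purely illustrative sketch of what aligning resources at the
performance goal level can look like, the structure below maps a
hypothetical program's budget request down to individual performance
goals; the program name, goals, and dollar figures are invented for the
example.

# Hypothetical performance-budget structure: funding aligned first at
# the program level, then decomposed to individual performance goals.

performance_budget = {
    "Highway Safety Program": {       # invented program and figures
        "request_millions": 450,
        "performance_goals": [
            {"goal": "Reduce highway fatalities per 100 million VMT",
             "request_millions": 300},
            {"goal": "Increase safety-belt use rate",
             "request_millions": 150},
        ],
    },
}

for program, detail in performance_budget.items():
    aligned = sum(g["request_millions"] for g in detail["performance_goals"])
    # In a well-formed plan the goal-level amounts exhaust the request.
    print(f"{program}: ${detail['request_millions']}M requested, "
          f"${aligned}M aligned to performance goals")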

Our 1998 assessment of fiscal year 1999 performance plans found that
agencies generally covered the program activities in their budgets, but
most plans did not identify how the funding for those program activities
would be allocated to performance goals.4 However, our subsequent reviews
of performance plans indicate that agencies have made progress in
demonstrating how their performance goals and objectives relate to program
activities in the budget.

4GAO/GGD/AIMD-98-228.


Over the first 4 years of agency efforts to implement GPRA, we observed
that agencies continued to tighten the required link between their
performance plans and budget requests.5 Of the agencies we reviewed over
this period, all but three met the basic requirement of GPRA to define a
link between their performance plans and the program activities in their
budget requests, and most of the agencies in our review had moved beyond
this basic requirement to indicate some level of funding associated with
expected performance described in the plan. Most importantly, more of the
agencies we reviewed each year (almost 75 percent in fiscal year 2002,
compared to 40 percent in fiscal year 1999) were able to show a direct
link between expected performance and requested program activity funding
levels: the first step in defining the performance consequences of
budgetary decisions. However, we have also observed that the nature of
these linkages varied considerably. Most of the agencies in our review of
fiscal year 2002 performance plans associated funding requests with
higher, more general levels of expected performance, rather than the more
detailed "performance goals or sets of performance goals" suggested in OMB
guidance.

Although not cited by our group of experts, participants at six of our
seven focus groups with federal managers cited progress in this area as a
key accomplishment of GPRA. However, the participants also commented that
much remains to be done in this area. The comments ranged from the general
(GPRA provides a framework for planning and budgeting) to the more
specific (GPRA created a definition of programs, how they will help the
agency achieve its goals and objectives, and the amount of money required
to achieve those goals and objectives). One of the comments implied that
GPRA has helped to prioritize agency efforts by helping agencies align
their efforts with programs or activities that make a difference. A
political appointee we interviewed echoed this comment, stating that GPRA
was pushing the department to think about what it gets out of the budget,
not just what it puts into it; 12 to 15 years ago the "so what" was
missing from the budget process. Another political appointee we interviewed stated
that the department was in the process of tying its goals to its budget
formulation and execution processes and linking program costs to
departmental goals.

5GAO-02-236; U.S. General Accounting Office, Performance Budgeting:
Initial Experiences Under the Results Act in Linking Plans With Budgets,
GAO/AIMD/GGD-99-67 (Washington, D.C.: Apr. 12, 1999); and Performance
Budgeting: Fiscal Year 2000 Progress in Linking Plans With Budgets,
GAO/AIMD-99-239R (Washington, D.C.: July 30, 1999).

A former political appointee discussed how his department used program
performance information to
inform a major information systems investment decision.

Furthermore, GAO case studies on the integration of performance
information in budget decision making found that performance information
has been used to inform the allocation of resources and for other
management purposes at selected agencies. For example, the Veterans Health
Administration provides its health care networks with performance
information on patterns of patient care and patient health outcomes, which
can be used to analyze resource allocation and costs and reallocate
resources as appropriate.6 Officials at the Administration for Children
and Families said that training and technical assistance and salaries and
expense funds are often allocated based on program and performance needs.7
The Nuclear Regulatory Commission monitors performance against targets and
makes resource adjustments, if needed, to achieve those targets.8

Although there has been progress in formally establishing the linkages
between budgets and plans, our survey results are somewhat conflicting and
have not reflected any notable changes either in managers' perceptions
governmentwide as to their personal use of plans or performance
information when allocating resources, or in their perceptions about the
use of performance information when funding decisions are made about their
programs. Our 2003 survey data show that a large majority of federal
managers reported that they consider their agency's strategic goals when
they are allocating resources. As shown in figure 7, on our 2003 survey,
an estimated 70 percent of all federal managers agreed to a great or very
great extent that they considered their agency's strategic goals when
allocating resources. However, using our 1997 survey responses as a
baseline, this was not a statistically significant increase over the 64
percent of managers who responded comparably then.

6U.S. General Accounting Office, Managing for Results: Efforts to
Strengthen the Link Between Resources and Results at the Veterans Health
Administration, GAO-03-10 (Washington, D.C.: Dec. 10, 2002).

7U.S. General Accounting Office, Managing for Results: Efforts to
Strengthen the Link Between Resources and Results at the Administration
for Children and Families, GAO-03-09 (Washington, D.C.: Dec. 10, 2002).

8U.S. General Accounting Office, Managing for Results: Efforts to
Strengthen the Link Between Resources and Results at the Nuclear
Regulatory Commission, GAO-03-258 (Washington, D.C.: Dec. 10, 2002).

As shown in figure 8, a similar, but somewhat smaller, majority (60
percent) of managers who expressed an opinion on our
2003 survey agreed to a great or very great extent that they used
information from performance measurement when they were involved in
allocating resources. In 1997, the comparable response was about the same
at 62 percent. When we asked managers on another item, however, about the
extent to which they perceived funding decisions for their programs being
based on results or outcome-oriented performance information, only 25
percent of federal managers in 2003 endorsed this view to a great or very
great extent. In 1997, 20 percent of managers expressed a comparable view,
again not a significant increase. (See fig. 9.)

Figure 7: Percentage of Federal Managers Who Reported They Considered
Strategic Goals to a Great or Very Great Extent When Allocating Resources

[Figure: bar chart, on a scale of 0 to 100 percent, for 1997, 2000, and
2003; the 2003 value was 70 percent.]

Source: GAO.


Figure 8: Percentage of Federal Managers Who Reported They Considered
Performance Information to a Great or Very Great Extent When Allocating
Resources

[Figure: bar chart, on a scale of 0 to 100 percent, for 1997, 2000, and
2003; the values were 62 percent in 1997 and 60 percent in 2003.]

Source: GAO.

Note: Percentages are based on those respondents answering on the extent
scale.

Figure 9: Percentage of Federal Managers Who Reported That Funding
Decisions Were Based on Results or Outcome-Oriented Performance
Information to a Great or Very Great Extent

[Figure: bar chart of percentages (0-100) by year (1997, 2000, 2003);
the 2003 bar is labeled 25 percent. Source: GAO.]

Chapter 3

Agencies Have Addressed Many Critical Performance Planning and Reporting
Challenges, but Weaknesses Persist

Beginning with federal agencies' initial efforts to develop effective
strategic plans in 1997 and annual performance plans and reports for
fiscal year 1999, Congress, GAO, and others have commented on the quality
of those efforts and provided constructive feedback on how agency plans
and reports could be improved. On the basis of our current review of the
strategic plans, annual performance plans, and annual performance reports
of six selected agencies-Education, DOE, HUD, DOT, SBA, and SSA-we found
that these documents reflect much of the feedback that was provided. For
example, goals were more quantifiable and results oriented, and agencies
were providing more information about goals and strategies to address
performance and accountability challenges and the limitations to their
performance data. However, certain weaknesses, such as lack of detail on
how annual performance goals relate to strategic goals and how agencies
are coordinating with other entities to achieve common objectives,
persist. A detailed discussion of our scope and methodology and the
results of our reviews of the six agencies' most recent strategic plans,
annual performance plans, and annual performance reports compared to
initial efforts are contained in appendixes III, IV, and V, respectively.

Quality of Selected Strategic Plans Reflects Improvements over Initial
Drafts

Under GPRA, strategic plans are the starting point and basic underpinning
for results-oriented management. GPRA requires that an agency's strategic
plan contain six key elements: (1) a comprehensive agency mission
statement; (2) agencywide long-term goals and objectives for all major
functions and operations; (3) approaches (or strategies) and the various
resources needed to achieve the goals and objectives; (4) a description of
the relationship between the long-term goals and objectives and the annual
performance goals; (5) an identification of key factors, external to the
agency and beyond its control, that could significantly affect the
achievement of the strategic goals; and (6) a description of how program
evaluations were used to establish or revise strategic goals and a
schedule for future program evaluations.

Our 1997 review of agencies' draft strategic plans found that a
significant amount of work remained to be done by executive branch
agencies if their strategic plans were to fulfill the requirements of
GPRA, serve as a basis for guiding agencies, and help congressional and
other policymakers make decisions about activities and programs.1 Our
assessment of 27 agencies' initial draft strategic plans revealed several
critical strategic planning issues that needed to be addressed. These
planning issues were as follows:

o  Most of the draft plans did not adequately link required elements in
the plans, such as strategic goals to annual performance goals.

o  Long-term strategic goals often tended to have weaknesses.

o  Many agencies did not fully develop strategies explaining how their
long-term strategic goals would be achieved.

o  Most agencies did not reflect in their draft plans the identification
and planned coordination of activities and programs that cut across
multiple agencies.

o  The draft strategic plans did not adequately address program
evaluations.

We noted that Congress anticipated that it might take several planning
cycles to perfect the process and that strategic plans would be
continually refined with each cycle. We also recognized
that developing a strategic plan is a dynamic process and that agencies,
with input from OMB and Congress, were continuing to improve their plans.

Agencies have now had 6 years to refine their strategic planning
processes. Although the six strategic plans we looked at for this review
reflected many new and continuing strengths as well as improvements over
the 1997 initial drafts, we continued to find certain persistent
weaknesses. As depicted in table 1, of the six elements required by
GPRA, the plans generally discussed all but one-program evaluation, an
area in which we have found capacity is often lacking in federal
agencies.2

1GAO/GGD-97-180.

2U.S. General Accounting Office, Managing for Results: Challenges Agencies
Face in Producing Credible Performance Information, GAO/GGD-00-52
(Washington, D.C.: Feb. 4, 2000).

Although the strategic plans generally listed the program evaluations
agencies planned to complete over the planning period, they typically
did not address how the agencies planned to use those evaluations to
establish new or revise existing strategic goals, as envisioned by GPRA.
Finally, although not required by GPRA, the
strategic plans would have benefited from more complete discussions of how
agencies planned to coordinate with other entities to address common
challenges or achieve common or complementary goals. Appendix III provides
a more detailed discussion of (1) the required and other useful elements
we reviewed to assess strategic plan strengths and weaknesses and (2)
changes in the quality of the six agencies' strategic plans we reviewed.

Table 1: Agencies' Progress in Addressing Required Elements of Strategic
Planning under GPRA
(X = element included in agency strategic plan)

                                          Mission   Long-term             Relationship between  External
Agency                         Plan year  statement goals      Strategies long-term and         factors  Evaluations
                                                                          annual goals
Department of Education        1997       X         X          X          X                     X
                               2002       X         X          X          X                     X
Department of Energy           1997       X         X          X
                               2003a      X         X          X          X                     X
Department of Housing and      1997                 X
  Urban Development            2003       X         X          X          X                     X
Small Business Administration  1997       X         X          X                                X
                               2001b      X         X          X          X                     X
Social Security Administration 1997       X         X          X          X                     X        X
                               2003       X         X          X          X                     X        X
Department of Transportation   1997       X         X                                                    X
                               2003a      X         X          X          X                     X

Sources: GAO analysis of agencies' strategic plans in effect at the time
of our review. See also, U.S. General Accounting Office, The Results Act:
Observations on the Department of Education's June 1997 Draft Strategic
Plan, GAO/HEHS-97-176R (Washington, D.C.: July 18, 1997); Results Act:
Observations on DOE's Draft Strategic Plan, GAO/RCED-97-199R (Washington,
D.C.: July 11, 1997); The Results Act: Observations on the Department of
Transportation's Draft Strategic Plan, GAO/RCED-97-208R (Washington, D.C.:
July 30, 1997); The Results Act: Observations on the Social Security
Administration's June 1997 Draft Strategic Plan, GAO/HEHS-97-179R
(Washington, D.C.: July 22, 1997); The Results Act: Observations on the
Small Business Administration's Draft Strategic Plan, GAO/RCED-97-205R
(Washington, D.C.: July 11, 1997); and The Results Act: Observations on
the Department of Housing and Urban Development's Draft Strategic Plan,
GAO/RCED-97-224R (Washington, D.C.: Aug. 8, 1997).

aThe 2003 plans for DOE and DOT were in draft form during the time of our
review.

bAt the time of our review, the most recent SBA strategic plan was for
fiscal years 2001-2008. SBA released a new strategic plan for fiscal years
2003-2008 in October 2003.

Strategic Planning Strengths and Improvements from Initial Draft Plans

Consistent with our review of agencies' 1997 strategic plans, the recent
strategic plans we reviewed generally contained mission statements that
were results oriented, distinct from those of other agencies, and
covered the agencies' major activities. DOT's mission statement had
improved by reflecting additional language from its enabling legislation
that we recommended adding during our 1997 review. Still, improvement
could be made in this area, as DOE's mission statement shows: it was
results oriented but did not address the department's activities related
to energy supply and conservation.

Our review of the current strategic plans also revealed improvements in
the development of agencies' long-term, strategic goals-essential for
results-oriented management. Although GPRA does not require that all of an
agency's long-term, strategic goals be results oriented, the intent of
GPRA is to have agencies focus their strategic goals on results to the
extent feasible. In addition, as required by GPRA, the goals should be
expressed in a manner that could be used to gauge success in the future
and should cover an agency's major functions or activities. All of the
strategic plans we reviewed contained long-term, strategic goals that
demonstrated improvements in the quality of their 1997 goals. Agencies'
long-term strategic goals generally covered their missions, were results
oriented, and were expressed in a manner that could be used to gauge
future success. For example, SBA improved the quality of its long-term
goals by focusing more on key outcomes to be achieved and less on process
improvements, as was the case in its 1997 plan. In some cases, we observed
strategic goals that addressed the agency's organizational capacity to
achieve results, such as SSA's long-term goal to strategically manage and
align staff to support its mission.

We also found improvements in how agencies' current plans addressed
performance and accountability challenges we had identified, a key
weakness noted in our earlier review. Each of the agency plans we
reviewed discussed the long-term goals and strategies to address the
challenges that we had identified. For example, Education's strategic plan
contained a long-term strategic goal to modernize the Federal Student
Assistance programs and address identified problems in this area, which we
have designated as high risk since 1990.3 SSA noted that it considered
GAO-identified performance and accountability challenges when it
determined its strategic goals and objectives; however, not all of the
challenges are clearly addressed in the plan.

A third area of improvement we observed was in the description of the
strategies agencies planned to use to achieve their long-term strategic
goals. In our review of agencies' 1997 draft strategic plans, we found
that many agencies did not fully develop strategies explaining how their
long-term strategic goals would be achieved. In contrast, all six of the
current strategic plans we reviewed contained strategies that appeared
logically linked to achieving the agencies' long-term goals.

Other strengths and improvements we observed in meeting GPRA's basic
requirements involved the reporting of external factors that could affect
the achievement of the long-term goals and the identification of
crosscutting activities, although as indicated below these discussions
could be improved. The six agencies reviewed for this report each reported
on external factors in current strategic plans. For example, for each of
the strategic objectives in DOT's strategic plan, DOT lists factors
external to its control and how those factors could affect the achievement
of its objectives. Although not a requirement, some of the better plans we
reviewed discussed strategies to ameliorate the effect of external
factors. For example, for an external factor on teacher certification
under a goal on reading, Education's plan states that the agency "will
work with the states and national accreditation bodies to encourage the
incorporation of research-based reading instruction into teacher
certification requirements."

3Since 1990, GAO has periodically reported on government operations that
it identifies as "high risk" because of the greater vulnerabilities to
fraud, waste, abuse, and mismanagement. See U.S. General Accounting
Office, High-Risk Series: An Update, GAO-03-119 (Washington, D.C.: January
2003).

We have frequently reported that a focus on results, as envisioned by
GPRA, implies that federal programs contributing to the same or similar
results should be closely coordinated to ensure that goals are consistent
and, as appropriate, program efforts are mutually reinforcing. This means
that federal agencies are to look beyond their organizational boundaries
and coordinate with other agencies to ensure that their efforts are
aligned. During our 1997 review, we found that most agencies did not
reflect in their draft plans the identification and planned coordination
of activities and programs that cut across multiple agencies. In contrast,
each of the six current agency strategic plans that we reviewed identified
at least some activities and programs that cut across multiple agencies.
For example, SBA's 1997 plan contained no evidence of how the agency
coordinated with other agencies, but the current plan contained a separate
section describing crosscutting issues in the areas of innovation and
research assistance, international trade assistance, business development
assistance, veterans affairs, and disaster assistance.

Critical Strategic Planning Issues Needing Further Improvement

First, consistent with our 1997 review, the strategic plans we reviewed
did not adequately link required elements in the plans. Although all of
the agencies we reviewed provided some information on the relationship
between their long-term and annual goals, the extent of information
provided on how annual goals would be used to measure progress in
achieving the long-term goals varied greatly. In the case of DOE, the plan
provides a very brief description of the overall relationship between its
long-term and annual goals with examples, but does not demonstrate how it
will assess progress for each of its long-term goals and objectives.
Another plan, DOT's, refers the reader to the annual performance plan for
information about annual goals. We have reported that this linkage is
critical for determining whether an agency has a clear sense of how it
will assess progress toward achieving its intended results.

Second, although the agencies' descriptions of their strategies had
improved since our initial reviews, with few exceptions, their strategies
generally did not include information on how the agencies plan to align
their activities, core processes, human capital, and other resources to
support their mission-critical outcomes and whether they have the right
mix of activities, skills, and resources to achieve their goals. Such
information is critical to understanding the viability of the strategies.
Furthermore, none of the agencies discussed alternative strategies they
had considered in developing their plans.

Without such discussions, it is unclear whether agency planning
processes were truly strategic or simply a recasting of existing
activities, processes, etc.

HUD was the only agency that provided any details of how it intended to
coordinate with other agencies to achieve common or complementary goals
for its crosscutting programs or activities. For example, to support its
goal of "Equal Opportunity in Housing," HUD's plan states that HUD and the
Department of Justice continue to coordinate their fair housing
enforcement activities, especially with respect to responding quickly and
effectively to Fair Housing Act complaints that involve criminal activity
(e.g., hate crimes), a pattern and practice of housing discrimination, or
the legality of state and local zoning or other land use laws or
ordinances. We have reported that mission fragmentation and program
overlap are widespread throughout the federal government.4 As such,
interagency coordination is important for ensuring that crosscutting
programs are mutually reinforcing and efficiently implemented.

Finally, the strategic plans we reviewed did not adequately address
program evaluations. In combination with an agency's performance
measurement system, program evaluations can provide feedback to the
agency on how well its activities and programs contributed to achieving
strategic goals. Evaluations can also be a critical source of
information for Congress and others in assessing (1) the appropriateness
and reasonableness of goals; (2) the effectiveness of strategies, by
supplementing performance measurement data with impact evaluation
studies; and (3) the implementation of programs, such as identifying the
need for corrective action. Five of the six current plans that we
reviewed included a discussion of program evaluations; however, for most
of these plans the discussions lacked critical information required by
GPRA, such as a discussion of how evaluations were used to establish
strategic goals or a schedule of future evaluations. For example, DOE's
plan stated that internal, GAO, and Inspector General (IG) evaluations
were used as resources to develop its draft strategic plan, but specific
program evaluations were not identified.

4U.S. General Accounting Office, Managing for Results: Using the Results
Act to Address Mission Fragmentation and Program Overlap, GAO/AIMD-97-146
(Washington, D.C.: Aug. 29, 1997).

Fiscal Year 2004 Annual Performance Plans Addressed Some Weaknesses of
Earlier Plans, but Still Have Room for Significant Improvement

In our review of agencies' first annual performance plans, which
presented agencies' annual performance goals for fiscal year 1999,5 we
found that substantial further development was needed for these plans to
be useful in a significant way to congressional and other decision
makers. Most of the fiscal year 1999 plans that we reviewed contained
major weaknesses that undermined their usefulness: they (1) did not
consistently provide clear pictures of agencies' intended performance,
(2) generally did not relate strategies and resources to performance,
and (3) provided limited confidence that agencies' performance data
would be sufficiently credible. Although all of the fiscal year 1999
plans contained valuable information for decision makers, these
weaknesses caused their usefulness to vary considerably within and among
plans.

As shown in table 2, our current review of agencies' fiscal year 2004
performance plans found that five agencies-Education, HUD, SBA, SSA, and
DOT-improved their efforts to provide a clear picture of intended
performance, with SSA and DOT being the clearest. Furthermore, the same
five agencies improved the specificity of the strategies and resources
they intended to use to achieve their performance goals, with DOT being
the most specific. Finally, the same five agencies-Education, HUD, SBA,
SSA, and DOT-made improvements in the area of greatest
weakness-reporting on how they would ensure that performance data would
be credible. However, only
DOT's plan provided a full level of confidence that the performance data
the agency intended to collect would be credible. Appendix IV provides a
more detailed discussion of (1) the required and other useful elements we
reviewed to assess the clarity of the picture of intended performance, the
specificity of the strategies and resources, and the level of confidence
in the performance data and (2) changes in the quality of the six
agencies' annual performance plans we reviewed.

5GAO/GGD/AIMD-98-228.

Table 2: Characterizations of Agencies' Fiscal Year 1999 and 2004 Annual
Performance Plans

                                Picture of intended      Strategies and           Data credible
                                performance (unclear,    resources (no, limited,  (no, limited,
                                limited, general, clear) general, specific)       general, full)
Agency                          1999      2004           1999      2004           1999      2004
Department of Education         Limited   General        Limited   General        Limited   General
Department of Energy            Limited   Limited        General   General        Limited   Limited
Department of Housing and
  Urban Development             Limited   General        Limited   General        Limited   General
Small Business Administration   Limited   General        Limited   General        Limited   General
Social Security Administration  Limited   Clear          Limited   General        No        General
Department of Transportation    General   Clear          General   Specific       Limited   Full

Sources: GAO analysis of agencies' fiscal year 2004 annual performance
plans and U.S. General Accounting Office, Results Act: Observations on the
Department of Education's Fiscal Year 1999 Annual Performance Plan,
GAO/HEHS-98-172R (Washington, D.C.: June 8, 1998); Results Act:
Observations on DOE's Annual Performance Plan for Fiscal Year 1999,
GAO/RCED-98-194R (Washington, D.C.: May 28, 1998); Results Act:
Observations on the Department of Housing and Urban Development's Fiscal
Year 1999 Annual Performance Plan, GAO/RCED-98-159R (Washington, D.C.:
June 5, 1998); Results Act: Observations on the Small Business
Administration's Fiscal Year 1999 Annual Performance Plan,
GAO/RCED-98-200R (Washington, D.C.: May 28, 1998); The Results Act:
Observations on the Social Security Administration's Fiscal Year 1999
Annual Performance Plan, GAO/HEHS-98-178R (Washington, D.C.: June 9,
1998); and Results Act: Observations on the Department of Transportation's
Annual Performance Plan for Fiscal Year 1999, GAO/RCED-98-180R
(Washington, D.C.: May 12, 1998).

Plans Generally Provided a Clearer Picture of Intended Performance, Except
for Crosscutting Areas

At the most basic level, an annual performance plan is to provide a clear
picture of intended performance across the agency. Such information is
important to Congress, agency managers, and others for understanding what
the agency is trying to achieve, identifying subsequent opportunities for
improvement, and assigning accountability. Our current review of agencies'
fiscal year 2004 performance plans found that five of the six agencies
provided a clearer picture of intended performance than their fiscal year
1999 plans did, although only two of the 2004 plans-DOT's and
SSA's-received the highest rating possible. As shown in table 2, all of
the agencies we reviewed for this report except DOT initially provided a
limited picture of intended performance. Most of the fiscal year 1999
performance plans we previously reviewed had at least some objective,
quantifiable, and measurable goals, but few plans consistently included a
comprehensive set of goals that focused on the results that programs were
intended to achieve. Moreover, agencies did not consistently follow OMB's
guidance that goals for performance and accountability challenges be
included in the plans. Agencies' plans generally showed how their missions
and strategic goals were related to their annual performance goals and
covered all of the program activities in the agencies' budget requests.6
In addition, many agencies took the needed first step of identifying their
crosscutting efforts, with some including helpful lists of other agencies
with which they shared a responsibility for addressing similar national
issues. However, the plans generally did not go further to describe how
agencies expected to coordinate their efforts with other agencies.

The fiscal year 2004 plans improved the picture of performance by making
annual goals and performance measures more results oriented, objective,
and quantifiable. For example, Education's plan included a measure for the
number of states meeting their eighth grade mathematics achievement
targets under the long-term goal to improve mathematics and science
achievement for all students. We previously criticized Education's 1999
plan for lacking such outcome-oriented measures. Another overall
improvement we observed was that all of the plans described intended
efforts to address performance and accountability challenges we and others
had previously identified.

6Program activity refers to the list of projects and activities in the
appendix portion of the Budget of the United States Government. Program
activity structures are intended to provide a meaningful representation of
the operations financed by a specific budget account.

For instance, to address the governmentwide high-risk area of strategic
human capital management, HUD states that to develop its staff capacity,
it will complete a
comprehensive workforce analysis in 2004 to serve as the basis to fill
mission critical skill gaps through succession planning, hiring, and
training initiatives in a 5-year human capital management strategy. The
clarity of DOE's plan remained limited because its annual goals were not
clearly linked to its mission, the long-term goals in its strategic plan,
or the program activities in its budget request.

Although five of the six agencies improved the clarity of the picture of
intended performance, improvement is still needed in reporting on
crosscutting efforts. In both the 1999 and 2004 plans, many agencies
identified their crosscutting efforts, with some including helpful lists
of other agencies with which they shared a responsibility for addressing
similar national issues. Our review of fiscal year 2004 plans shows that
the six agencies we reviewed still did not discuss how they expected to
coordinate with other agencies to address common challenges or to achieve
common or complementary performance goals. As we have reported previously,
improved reporting on crosscutting efforts can help Congress use the
annual performance plan to evaluate whether the annual goals will put the
agency on a path toward achieving its mission and long-term strategic
goals. In addition, the plans can aid in determining efforts to reduce
significant program overlap and fragmentation that can waste scarce
resources, confuse and frustrate program customers, and limit overall
program effectiveness.

None of the six agencies' plans indicated an intention to request waivers
of specific administrative procedural requirements and controls that may
be impeding an agency's ability to achieve results. This provision of
GPRA allows agencies greater managerial flexibility in exchange for
accountability for results. We previously reported on the results of the
pilot project to implement this provision of GPRA and found that the pilot
did not work as intended.7 OMB did not designate any of the seven
departments and one independent agency that submitted a total of 61 waiver
proposals as GPRA managerial accountability and flexibility pilots. For
about three-quarters of the waiver proposals, OMB or other central
management agencies determined that the waivers were not allowable for
statutory or other reasons or that the requirement for which the waivers
were proposed no longer existed. For the remaining proposals, OMB or other
central management agencies approved waivers or developed compromises by
using authorities that were already available independent of GPRA.

7GAO/GGD-97-36.


Plans More Specifically Related Strategies and Resources to Performance
Goals

To judge the reasonableness of an agency's proposed strategies and
resources, congressional and other decision makers need complete
information on how the proposed strategies and resources will contribute
to the achievement of agency goals. Agencies generally improved their
plans by better relating strategies and resources to performance.
Education's, HUD's, SBA's, and SSA's 1999 plans had a limited discussion,
while DOE's and DOT's 1999 plans had a general discussion. In 2004, five
of the six plans-Education's, DOE's, HUD's, SBA's, and SSA's-provided
general discussions of how their strategies and resources would contribute
to achieving their performance goals. DOT's 2004 plan improved to include
a specific discussion.

Our review of the 1999 plans found that most agencies' performance plans
did not provide clear strategies that described how performance goals
would be achieved. In contrast, the 2004 performance plans we reviewed
generally provided lists of the agencies' current array of programs and
initiatives. Several plans provided a perspective on how these programs
and initiatives were necessary or helpful for achieving results. For
example, DOE and HUD included in their plans a "means and strategies"
section for each of their goals that described how the goal would be
achieved. One strategy DOE identified to meet its goal of contributing
unique, vital facilities to the biological and environmental sciences
was to conduct peer reviews of the facilities to assess each facility's
scientific output, user satisfaction, overall cost-effectiveness, and
ability to deliver the most advanced scientific capability.

In addition, each of the agencies' plans identified the external factors
that could influence the degree to which goals are achieved. Some of the
better plans, such as DOT's and SBA's, provided strategies to mitigate the
negative factors or take advantage of positive factors, as appropriate.
For example, for its transportation accessibility goals, DOT's plan states
that as the population ages, more people will require accessible public
transit, for which states and local agencies decide how best to allocate
federally provided resources. One of the strategies DOT intends to employ
to address this external factor is the "Special Needs of Elderly
Individuals and Individuals with Disabilities" grant program. The plan
states the grant program will help meet the transportation needs of the
elderly and persons with disabilities when regular transportation
services are unavailable, insufficient, or inappropriate to meet their
needs.


Agencies' 2004 plans did not consistently describe all the resources
needed and how they would be used to achieve agency goals. Our review of
agencies' fiscal year 1999 plans found that most did not adequately
describe-or reference other appropriate documents that describe-the
capital, human, information, and financial resources needed to achieve
their agencies' performance goals. The 2004 plans we reviewed generally
described the funding levels needed to achieve their performance goals
overall and in some cases broke out funding needs by high-level
performance goal. For example, SSA's plan provides a general perspective
on the financial resources needed to achieve its performance goals because
it provides budget information by account and program activity. However,
the plan is not structured by budget program activity or account, nor
does it provide a crosswalk between the strategic goals and budget
program accounts. In contrast, HUD's plan presented its requested
funding and
staffing levels at the strategic goal level, but did not present budget
information at the level of its annual goals. In addition, although the
plans make brief mention of nonfinancial resources, such as human capital,
information technology, or other capital investments, little information
is provided on how such resources would be used to achieve performance
goals.

Plans Continue to Provide Less Than Full Confidence That Performance Data
Will Be Credible

Credible performance information is essential for accurately assessing
agencies' progress toward the achievement of their goals and, in cases
where goals are not met, for identifying opportunities for improvement
or determining whether goals need to be adjusted. Under GPRA, agencies'
annual
performance plans are to describe the means that will be used to verify
and validate performance data. To help improve the quality of agencies'
performance data, Congress amended GPRA through the Reports Consolidation
Act of 2000 to require that agencies assess the completeness and
reliability of the performance data in their performance reports. Agencies
were also required to discuss in their report any material inadequacies in
the completeness and reliability of their performance data and discuss
actions to address these inadequacies. Meeting these new requirements
suggests the need for careful planning to ensure that agencies can comment
accurately on the quality of the performance data they report to the
public.

As shown in table 2, although five of the six agencies we reviewed
improved in reporting how they plan to ensure that performance data will
be credible, only one agency-DOT-improved enough over its 1999 plan to
provide a full level of confidence in the credibility of its performance
data. Four agencies-Education, HUD, SBA, and SSA-improved enough to
provide a general level of confidence. However, DOE provided the same
limited level of confidence in the credibility of the performance data as
in its 1999 plan. Across all 24 of the fiscal year 1999 performance
plans we reviewed, we found that most provided only superficial
descriptions of the procedures that agencies intended to use to verify
and validate
performance data. Moreover, in general, agencies' performance plans did
not include discussions of documented limitations in financial and other
information systems that may undermine efforts to produce high-quality
data. As we have previously noted, without such information, and
strategies to address those limitations, Congress and other decision
makers cannot assess the validity and reliability of performance
information.

We found that each of the 2004 plans we reviewed contained some discussion
of the procedures the agencies would use to verify and validate
performance information, although in some cases the discussion was
inconsistent or limited. For example, the discussions of SBA's
verification and validation processes for its indicators in the 2004 plan
were generally one- or two-sentence statements. SBA also noted that it
does not independently verify some of the external data it gathers or,
in some cases, does not have access to the data for this purpose. In
contrast, the DOT
plan referred to a separate compendium available on-line that provides
source and accuracy statements, which give more detail on the methods used
to collect performance data, sources of variation and bias in the data,
and methods used to verify and validate the data.

In addition, all of the agencies except DOE discussed known limitations to
performance data in their plans. These agencies' plans generally provided
information about the quality of each performance measure, including any
limitations. According to DOE officials, DOE's plan generally does not
discuss data limitations because the department selected goals for which
data are expected to be available and therefore did not anticipate finding
any limitations. However, in our 2003 Performance and Accountability
Series report on DOE, we identified several performance and accountability
challenges where data were a concern, such as the need for additional
information on the results of contractors' performance to keep projects on
schedule and within budget.8 DOE's contract management continues to be a
significant challenge for the department and remains at high risk.

Finally, the remaining five agencies also discussed plans to address
limitations to the performance data. For example, DOT's plan provided a
general discussion of the limitations to the internal and external sources
of data used to measure performance. Detailed discussions were contained
in an appendix to the plan and separate source and accuracy statements.
This information had been lacking in its 1999 plan. Education, HUD, SBA,
and SSA also provided information on limitations to their performance data
and plans for improvement.

Strengths and Weaknesses of Selected Agencies' Fiscal Year 2002 Annual
Performance Reports

Key to improving accountability for results as Congress intended under
GPRA, annual performance reports are to document the results agencies have
achieved compared to the goals they established. To be useful for
oversight and accountability purposes, the reports should clearly
communicate performance results, provide explanations for any unmet goals
as well as actions needed to address them, and discuss known data
limitations as well as how the limitations are to be addressed in the
future. Compared to our reviews of the six agencies' fiscal year 1999
performance reports, we identified a number of strengths and improvements
as well as areas that continued to need improvement. Because the scope of
our review of the fiscal year 2002 reports was broader than that for the
fiscal year 1999 reports we previously reviewed, we were unable to make
specific comparisons for the three characteristics we used to assess the
fiscal year 2002 reports.

8U.S. General Accounting Office, Major Management Challenges and Program
Risks: Department of Energy, GAO-03-100 (Washington, D.C.: January 2003).

However, we discuss comparative information on aspects of the reports
where available. Table 3 shows the results of our assessment
of the six agencies' annual performance reports for fiscal year 2002.
Appendix V provides a more detailed discussion of (1) the required and
other useful elements we reviewed to assess the clarity of the picture of
performance, the clarity of the linkage between costs and performance, and
the level of confidence in the performance data and (2) changes in the
quality of the six agencies' annual performance reports we reviewed.

Table 3: Characterizations of Agencies' 2002 Annual Performance Reports

                                Picture of             Resources linked     Data credible
                                performance (unclear,  to results (no,      (no, limited,
                                limited, general,      limited, general,    general, full)
Agency                          clear)                 clear)
Department of Education         Limited                Clear                General
Department of Energy            General                Limited              Limited
Department of Housing and
  Urban Development             General                No                   General
Small Business Administration   Limited                General              General
Social Security Administration  General                Limited              General
Department of Transportation    General                No                   Full

Sources: GAO analysis of agencies' fiscal year 2002 annual performance
reports and U.S. General Accounting Office, Observations on the Department
of Education's Fiscal Year 1999 Performance Report and Fiscal Year 2001
Performance Plan, GAO/HEHS-00-128R (Washington, D.C.: June 30, 2000);
Observations on the Department of Energy's Fiscal Year 1999 Accountability
Report and Fiscal Years 2000 and 2001 Performance Plans, GAO/RCED-00-209R
(Washington, D.C.: June 30, 2000); Observations on the Department of
Housing and Urban Development's Fiscal Year 1999 Performance Report and
Fiscal Year 2001 Performance Plan, GAO/RCED-00-211R (Washington, D.C.:
June 30, 2000); Observations on the Small Business Administration's Fiscal
Year 1999 Performance Report and Fiscal Year 2001 Performance Plan,
GAO/RCED-00-207R (Washington, D.C.: June 30, 2000); Observations on the
Social Security Administration's Fiscal Year 1999 Performance Report and
Fiscal Year 2001 Performance Plan, GAO/HEHS-00-126R (Washington, D.C.:
June 30, 2000); and Observations on the Department of Transportation's
Fiscal Year 1999 Performance Report and Fiscal Year 2001 Performance Plan,
GAO/RCED-00-201R (Washington, D.C.: June 30, 2000).

Progress in Providing a Clear Picture of Performance

The six agency reports that we reviewed contained a number of strengths,
some of which we can describe as improvements over the reports on fiscal
year 1999 performance. A key strength of four of the 2002 reports
(Education, HUD, DOT, SSA) was a discussion of the relationship between
the strategic plan, performance plan, and performance report. For example,
SSA's report identified relevant results that were linked to its strategic
objective to deliver "citizen-centered, world-class service," such as
maintaining the accuracy, timeliness, and efficiency of service to people
applying for its benefit programs.

The clarity of the DOE and SBA reports was limited by not clearly
relating agency performance results to strategic and annual performance
goals. For example, the structure of
SBA's report reflected the objectives in its draft 2003 to 2008 strategic
plan rather than those in its 2002 performance plan, making it difficult
to assess progress against the original 2002 objectives. Furthermore,
although there is no "right" number of performance measures to be used to
assess progress, a number of the reports allowed for an easier review of
results by limiting the overall number of measures presented or by
highlighting key performance measures of greatest significance to their
programs. For example, SBA discussed a total of 19 performance goals and
DOT discussed a total of 40. Although SSA discussed a total of 69
performance goals, the report highlighted its progress in achieving 14 key
goals. In contrast, Education, HUD, and DOE presented a total of 120, 184,
and 260 measures, respectively. Furthermore, while Education and SSA each
provided a table showing progress across all its measures, the other
agencies did not provide such summary information.

As we found in our earlier reviews, the six agencies' fiscal year 2002
reports generally allowed for an assessment of progress made in achieving
agency goals. Some of the reports made this assessment easier than others
by providing easy-to-read summary information. For example, SSA provided a
table at the beginning of the report that summarized the results for each
of its 69 indicators with the following dispositions: met, not met, almost
met, and data not yet available. Other reports, such as HUD's, required an
extensive review to make this assessment. In addition, to place current
performance in context, each of the agencies' reports contained trend
information, as required by GPRA, which allowed for comparisons between
current year and prior year performance.

In addition, the majority of agencies maintained, or demonstrated
improvements over, the quality of their 1999 reports in discussing the
progress achieved in addressing performance and accountability challenges
identified by agency IGs and GAO. For example, SBA's report contained two
broad overviews and an appendix describing the status of GAO audits and
recommendations, as well as a description of the most serious management
challenges SBA faces as identified by the agency's IG.

Unfortunately, many of the weaknesses we identified in the agencies'
fiscal year 2002 reports were similar to those we found in their fiscal
year 1999 reports. These weaknesses related to the significant number of
performance goals (1) that were not achieved and lacked explanations or
plans for achieving the goal in the future and (2) for which performance
data were unavailable.

Three of the six agencies we reviewed-HUD, SSA, and Transportation-did not
consistently report the reasons for not meeting their goals. For example,
Transportation provided explanations for only 5 of the 14 goals it did not
meet. In addition, similar to our 1999 report findings, three of the six
agencies we reviewed-HUD, SBA, and DOT-did not discuss their plans or
strategies to achieve unmet goals in the future. For example, HUD reported
"substantially meeting" only 47 percent of the performance targets in
fiscal year 2002. However, although HUD provides various reasons for not
meeting all its targets, it offers no information on plans or time frames
to achieve the goals in the future. Finally, we continued to observe a
significant number of goals for which performance data were unavailable.
For example, performance data for 10 of SBA's 19 performance goals were
unavailable.

In addition, the majority of the reports we reviewed did not include other
GPRA requirements. The reports generally did not evaluate the performance
plan for the current year relative to the performance achieved toward the
performance goals in the fiscal year covered by the report. The reports
also did not discuss the use or effectiveness of any waivers in achieving
performance goals. In addition, for two of the agencies-DOE and
SBA-program evaluation findings completed during the fiscal year were not
summarized. As we have previously noted, such evaluations could help
agencies understand the relationship between their activities and the
results they hope to achieve.

Progress in Linking Resources to Results

Although linking costs to performance goals is not a requirement of GPRA,
both GPRA and the CFO Act emphasized the importance of linking program
performance information with financial information as a key feature of
sound management and an important element in presenting to the public a
useful and informative perspective on federal spending. The committee
report for GPRA suggested that developing the capacity to relate the level
of program activity with program costs, such as cost per unit of result,
cost per unit of service, or cost per unit of output, should be a high
priority. In our survey of federal managers, we asked this year for the
first time about the extent to which federal managers had measures of
cost-effectiveness for the programs they were involved with. Only 31
percent of federal managers we surveyed reported having such measures to
a great or very great extent, at least 12 percentage points lower than
for any of the other types of measures associated with GPRA that we
asked about (see fig. 3 in ch. 2). Under the PMA, the current
administration has set an ambitious
agenda for performance budgeting, calling for agencies to better align
budgets with performance goals and focus on capturing full budgetary costs
and matching those costs with output and outcome goals. All this suggests
that agencies will need to develop integrated financial and performance
management systems that will enable the reporting of the actual costs
associated with performance goals and objectives along with presentations
designed to meet other budgetary or financial purposes, such as the
accounts and program activities found in the President's Budget and
responsibility segments found in financial statements.9
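
The cost-per-unit measures the committee report called for reduce to
simple arithmetic once budgetary costs can be matched to performance
goals. As a purely illustrative sketch (the account names, goal labels,
and figures below are hypothetical and not drawn from any agency's
actual systems), a crosswalk from budget accounts to goals might be used
like this:

    # Hypothetical crosswalk from budget accounts to performance goals;
    # all account names, goal labels, and dollar figures are illustrative.
    crosswalk = {
        "Account A": {"goal": "Goal 1: Timely claims processing",
                      "obligations": 40_000_000},
        "Account B": {"goal": "Goal 1: Timely claims processing",
                      "obligations": 10_000_000},
        "Account C": {"goal": "Goal 2: Accurate benefit payments",
                      "obligations": 25_000_000},
    }

    # Outputs achieved under each goal (also hypothetical).
    outputs = {
        "Goal 1: Timely claims processing": 2_000_000,   # claims processed
        "Goal 2: Accurate benefit payments": 500_000,    # payments reviewed
    }

    # Aggregate costs by goal, then compute cost per unit of output.
    costs_by_goal = {}
    for entry in crosswalk.values():
        costs_by_goal[entry["goal"]] = (
            costs_by_goal.get(entry["goal"], 0) + entry["obligations"]
        )

    for goal, cost in costs_by_goal.items():
        print(f"{goal}: ${cost / outputs[goal]:,.2f} per unit of output")

As the report notes, producing such figures in practice requires
integrated financial and performance management systems that capture
full budgetary costs, not just a static mapping.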

9According to OMB's Statement of Federal Financial Accounting Standards
No. 4, Managerial Cost Accounting Standards, July 31, 1995, a
responsibility segment is a component of a reporting entity that is
responsible for carrying out a mission, conducting a major line of
activity, or producing one or a group of related products or services. In
addition, responsibility segments usually possess the following
characteristics: (1) their managers report to the entity's top management
directly and (2) their resources and results of operations can be clearly
distinguished from those of other segments of the entity. Managerial cost
accounting should be performed to measure and report the costs of each
segment's outputs.

Of the six agencies we reviewed, only Education's report clearly linked
its budgetary information to the achievement of its performance goals or
objectives. Education's report laid out, using both graphics and text, the
estimated appropriations associated with achieving each of its 24
objectives. In addition the report provided the staffing in full-time
equivalent employment (FTEs) and an estimate of the funds from salaries
and expenses contributing to the support of each of these objectives.
SBA's report contained crosswalks that showed the relationship between
SBA's strategic goals, outcome goals, performance goals, and programs.
Because SBA shows the resources for each program, a reader can infer a
relationship between SBA's resources and performance goals. However, the
linkage between resources and results would be clearer if results and
resources were presented by performance goal as well. SSA provided a
limited view of the costs of achieving its performance goals by providing
the costs associated with four of its five strategic goals.10
However, as reported by the IG, SSA needs to further develop its cost
accounting system, which would help link costs to performance.11 DOE also
provided a limited view of the costs of achieving its performance goals by
organizing its performance information by budget program activity and
associated net costs. According to DOE officials, the department plans to
link its individual performance measures to the costs of program
activities in future reports. Neither HUD nor DOT provided information on
the cost of achieving individual performance goals or objectives.

Progress in Providing Confidence in the Credibility of Performance Data

To assess the degree to which an agency's report provided full confidence
that the agency's performance information was credible, we examined the
extent to which the reports discussed the quality of the data presented.
As shown in table 3, only DOT's report provided a full level of confidence
in the quality of the data. The other agencies provided general or limited
confidence in their data.

10SSA noted that its fifth strategic goal, "Valued Employees," supports
the accomplishment of all its basic functions, so its resources are
inherently included in the other four goals.

11According to the IG, SSA began to implement an improved cost accounting
system in fiscal year 2002, which was to be phased in over the next 3 to 4
years.

All six agencies in our current review complied with the Reports
Consolidation Act of 2000 by including assessments of the completeness
and reliability of their performance data in their transmittal letters.
For example, the Secretary of DOT stated in the transmittal letter that
the 2002 report "contains performance and financial data that are
substantially complete and reliable." In contrast, we found that only 5
of the 24 CFO Act agencies included this information in their fiscal
year 2000 performance reports, and of the six agencies in our current
review, only DOE provided this assessment in its fiscal year 2000
report.12 However,
only two of the six agencies also disclosed material inadequacies in the
completeness and reliability of their performance data and discussed
actions to address the inadequacies in their transmittal letters. For
example, SBA stated in its transmittal letter that it is "working to
improve the completeness and reliability of the performance data for the
advice provided to small business through SBA's resource partners." SBA
explained that data for this aspect of its performance are collected
through surveys, which are inconsistent and not comparable, and for which
client responses are difficult to obtain. SBA stated that it is working to
improve the survey instruments it uses to obtain performance data.

In addition to the requirements of the Reports Consolidation Act, we have
previously reported on other practices that enhance the credibility of
performance data that are not specifically required by GPRA. For instance,
discussions of standards and methods used by agencies to assess the
quality of their performance data in their performance reports provide
decision makers greater insight into the quality and value of the
performance data. None of the reports explicitly referred to a specific
standard they used, however, DOE described its method for assuring data
quality. The report states that the heads of DOE's organizational elements
certified the accuracy of their performance data. DOE subsequently
reviewed the data for quality and completeness.

Other useful practices that help foster transparency to the public and
assist decision makers in understanding the quality of an agency's data
include (1) discussion of data quality, including known data limitations
and actions to address the limitations, and (2) discussion of data
verification and validation procedures, including proposals to review
data collection and verification and validation procedures.

12U.S. General Accounting Office, Performance Reporting: Few Agencies
Reported on the Completeness and Reliability of Performance Data,
GAO-02-372 (Washington, D.C.: Apr. 26, 2002).

validation procedures, including proposals to review data collection and
verification and validation procedures.

All six agencies' reports described data limitations, although discussions
were mostly brief and very high level. One exception was DOT, which
directed readers to the DOT Web site to obtain an assessment of the
completeness and reliability of its performance data and detailed
information on the source, scope, and limitations of the performance data.
HUD and SBA also discussed plans for addressing the limitations. For
example, HUD stated that to address problems with its indicator on the
number of homeowners who have been assisted with the Home Investment
Partnership Program (HOME), HUD has established a team of managers,
technical staff, and contractors to make a series of improvements to the
Integrated Disbursement and Information System beginning in fiscal year
2003 that should reduce the need to clean up the data.

Each of the six agencies' reports also discussed the procedures used to
verify and validate performance data. However, these discussions ranged
from the very general description of the DOE method (noted previously) to
the very detailed discussions provided by DOT. DOT
provides an on-line compendium that discusses the source and accuracy of
its data. Furthermore, DOT's 2002 report also describes strategies being
undertaken to address the quality of its data. The report states that a
DOT intermodal working group addressed data quality issues by developing
departmental statistical standards and by updating source and accuracy
statements for all of DOT's data programs. The working group also worked
to improve quality assurance procedures, evaluate sampling and nonsampling
errors, and develop common definitions for data across modes.

Chapter 4

                   Challenges to GPRA Implementation Persist

While a great deal of progress has been made in making federal agencies
more results oriented, numerous challenges still exist to effective
implementation of GPRA. The success of GPRA depends on the commitment of
top leadership within agencies, OMB, and Congress. However, according to
federal managers surveyed, top leadership commitment to achieving results
has not grown significantly since our 1997 survey. Furthermore, although
OMB has recently shown an increased commitment to management issues, it
significantly reduced its guidance to agencies on GPRA implementation
compared to prior years, and it is not clear how the program goals
developed through its PART initiative will complement and integrate with
the long-term, strategic focus of GPRA. Obtaining leadership commitment to
implement a strategic plan depends in part on the usefulness and relevance
of agency goals and strategies to agency leaders, Congress, and OMB.
However, GPRA's requirement to update agency strategic plans every 3 years
is out of sync with presidential and congressional terms and can result in
updated plans that do not have the support of top administration
leadership and key congressional stakeholders.

As noted in chapter 2, more federal managers surveyed reported having
results-oriented performance measures for their programs, and we would
expect to have seen similar increases in the use of this information for
program management. However, we did not observe any growth in their
reported use of this information for key management activities, such as
adopting new program approaches or changing work processes. Additionally,
managers noted human capital-related challenges that impede
results-oriented management, including a lack of authority and training to
carry out GPRA requirements, as well as a lack of recognition for the
results achieved.

Consistent with our previous work, federal managers in our focus groups
reported that significant challenges persist in setting outcome-oriented
goals, measuring performance, and collecting useful data. However, our
survey data suggested that federal managers do not perceive issues, such
as "difficulty distinguishing between the results produced by the program
and results caused by other factors" and "difficulty obtaining data in
time to be useful," to be substantial hindrances to measuring performance
or using performance information.

Additionally, mission fragmentation and overlap contribute to difficulties
in addressing crosscutting issues, particularly when those issues require
a national focus, such as homeland security, drug control, and the
environment. GAO has previously reported on a variety of barriers to
interagency cooperation, such as conflicting agency missions, jurisdiction
issues, and incompatible procedures, data, and processes. We have also
reported that OMB could use the provision of GPRA that calls for OMB to
develop a governmentwide performance plan to integrate expected
agency-level performance. Unfortunately, this provision has not been fully
implemented and the federal government lacks a tool, such as a strategic
plan, that could provide a framework for a governmentwide reexamination of
existing programs, as well as proposals for new programs. Finally, federal
managers in our focus groups and political appointees we interviewed
believed that Congress does not use performance information to the fullest
extent to conduct oversight and to inform appropriations decisions. While
there is concern regarding Congress' use of performance information, it is
important to make sure that this information is useful to Congress in the
first place. As a key
user of performance information, Congress needs to be considered a partner
in shaping agency goals at the outset. GPRA provides Congress
opportunities to influence agency performance goals through the
consultation requirement for strategic plans and through Congress'
traditional oversight role.

Top Leadership Does Not Consistently Show Commitment to Achieving Results

We have previously testified that perhaps the single most important
element of successful management improvement initiatives is the
demonstrated commitment of top leaders to change.1 This commitment is most
prominently shown through the personal involvement of top leaders in
developing and directing reform efforts. Organizations that successfully
address their long-standing management weaknesses do not "staff out"
responsibility for leading change. Top leadership involvement and clear
lines of accountability for making management improvements are critical to
overcoming organizations' natural resistance to change, marshalling the
resources needed in many cases to improve management, and building and
maintaining the organizationwide commitment to new ways of doing business.

Results from our surveys show that while the majority of managers continue
to indicate top leadership demonstrates a strong commitment to achieving
results, we have not seen a noteworthy improvement in the percentage of
managers expressing this view. From our 1997 survey, we

1U.S. General Accounting Office, Management Reform: Elements of Successful
Improvement Initiatives, GAO/T-GGD-00-26 (Washington, D.C.: Oct. 15,
1999).

estimated about 57 percent of managers overall reported such commitment to
a great or very great extent. On our 2003 survey, 62 percent of managers
expressed a comparable view-an increase, but not a statistically
significant one. (See fig. 10.)
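As a hypothetical illustration of why a 5-percentage-point rise need not
be statistically significant, consider a standard two-proportion z-test.
The sample sizes below are placeholders; the survey's actual stratified
design and weighting are not reproduced in this report, so this is a
sketch of the reasoning, not GAO's calculation.

    \[
    z = \frac{\hat{p}_{2003} - \hat{p}_{1997}}
             {\sqrt{\bar{p}(1-\bar{p})\left(\frac{1}{n_{1997}} +
              \frac{1}{n_{2003}}\right)}}, \qquad
    \bar{p} = \frac{n_{1997}\hat{p}_{1997} + n_{2003}\hat{p}_{2003}}
                   {n_{1997} + n_{2003}}
    \]

With hypothetical samples of 500 managers in each year, \(\bar{p} =
0.595\) and

    \[
    z \approx \frac{0.62 - 0.57}{\sqrt{0.595 \times 0.405 \times (2/500)}}
      \approx \frac{0.05}{0.031} \approx 1.61,
    \]

which falls below the 1.96 threshold for significance at the 5 percent
level; larger samples would be needed for a change of this size to
register as significant.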

Figure 10: Percentage of Federal Managers Who Reported to a Great or Very
Great Extent Their Top Leadership Has a Strong Commitment to Achieving
Results

[Bar chart omitted: percent by survey year (1997, 2000, 2003); the legible
data label is 62 percent for 2003.]

Source: GAO.

As shown in figure 11, however, we continued to see a significant
difference between the perceptions of SES and non-SES managers on this
issue. That is, the percentage of SES managers reporting that top
leadership demonstrated strong commitment to a great or very great extent
in 2003 was 22 percentage points higher than for non-SES managers.

Figure 11: Percentage of SES and Non-SES Managers Who Reported to a Great
or Very Great Extent Their Agency Top Leadership Demonstrated Strong
Commitment to Achieving Results

[Bar chart omitted: percent of SES and non-SES managers by survey year
(1997a, 2000a, 2003a); the legible data labels are 82 percent (SES) and
60 percent (non-SES) for 2003.]

Source: GAO.

aThere was a statistically significant difference between SES and non-SES.

We observed in our 1997 and 2000 reports on governmentwide implementation
of GPRA that we would expect to see managers' positive perceptions on
items, such as the extent to which top leadership is committed to
achieving results, become more prevalent and the gap between SES and
non-SES managers begin to narrow as GPRA and related reforms are
implemented; however, these changes do not appear to be happening as
expected.2

Demonstrating the willingness and ability to make decisions and manage
programs based on results and the ability to inspire others to embrace
such

2U.S. General Accounting Office, The Government Performance and Results
Act: 1997 Governmentwide Implementation Will Be Uneven, GAO/GGD-97-109
(Washington, D.C.: June 2, 1997) and Managing for Results: Federal
Managers' Views Show Need for Ensuring Top Leadership Skills, GAO-01-127
(Washington, D.C.: Oct. 20, 2000).

a model are important indicators of leadership commitment to
results-oriented management. However, in both our 1997 and 2000 surveys,
only about 16 percent of managers reported that changes by management
above their levels to the programs for which they were responsible were
based on results or outcome-oriented performance information to a great or
very great extent. In our 2003 survey, this indicator increased to 23
percent, a statistically significant increase from prior surveys.
Twenty-eight percent of federal managers surveyed who expressed an opinion
reported that the lack of ongoing top executive commitment or support for
using performance information to make program/funding decisions hindered
measuring performance or using performance information to a great or very
great extent.

Our interviews with 10 top political appointees from the Clinton and
current Bush administrations indicated a high level of support and
enthusiasm for effectively implementing the principles embodied in GPRA.
For example, one appointee noted that GPRA focused senior management on a
set of goals and objectives to allow the organization to understand what
is important and how to deal with accomplishment at a macro-level, as well
as provided a structure for problem solving. Another political appointee
noted that GPRA has made it important to look at what you get out of the
budget, not just what you put into it, while another concluded that GPRA
brought about a fundamental rethinking of how they managed their programs
and processes. Such indications of support for GPRA are promising.
However, to support the transition to more results-oriented agency
cultures, top agency management will need to make a more concerted effort
to translate their enthusiasm for GPRA into actions that communicate to
employees that top management cares about performance results and uses the
information in its decision making.

The need for strong, committed leadership extends to OMB as well. OMB has
shown a commitment to improving the management of federal programs, both
through its leadership in reviewing agency program performance using the
PART tool as well as through the PMA, which calls for improved financial
performance, strategic human capital management, competitive sourcing,
expanded electronic government, and budget and performance integration.
Using
the foundation of information generated by agencies in their strategic
plans, annual performance plans, and program performance reports, OMB has
used the PART tool to exercise oversight of selected federal programs by
assessing program purpose and design, the quality of strategic planning,
the quality of program management, and the extent to which programs can
demonstrate results. PART provides OMB a
lens through which to view performance information for use in the budget
formulation process. PART, and OMB's use of performance data in the budget
formulation process, potentially can complement GPRA's focus on increasing
the supply of credible performance information by promoting the demand for
this information in the budget formulation process. As we reported in
chapter 2, more federal managers noted that OMB was paying attention to
their agencies' efforts under GPRA. (See fig. 6.) Additionally, OMB
convened a performance measurement workshop in April 2003 to identify
practical strategies for addressing common performance measurement
challenges. As a result of this workshop, it produced a paper in June 2003
that included basic performance measurement definitions and concepts and
common performance measurement problems that were discussed at the
workshop. This was part of OMB's continuing efforts to improve PART as an
evaluation tool.

However, there are areas where OMB could further enhance its leadership.
OMB has stated that the PART exercise presents an opportunity to inform
and improve on agency GPRA plans and reports and establish a meaningful,
systematic link between GPRA and the budget process. OMB has instructed
agencies that, in lieu of a performance plan, they are to submit a
performance budget that includes information from the PART assessments,
including all performance goals used in the assessment of program
performance done under the PART process. The result is that
program-specific performance measures developed through the PART review
are to substitute for other measures developed by the agency through its
strategic planning process. GPRA is a broad legislative framework that was
designed to be consultative with Congress and other stakeholders and
address the needs of many users of performance information-Congress to
provide oversight and inform funding decisions, agency managers to manage
programs and make internal resource decisions, and the public to provide
greater accountability. Changing agency plans and reports for use in the
budget formulation process may not satisfy the needs of these other users.
Users other than OMB are not likely to find the information useful unless
it is credible and valid for their purposes. PART's program-specific focus
may fit with OMB's agency-by-agency budget reviews, but it is not well
suited to achieving one of the key purposes of strategic plans-to convey
agencywide, long-term goals and objectives for all major functions and
operations. PART's focus on program-specific measures does not substitute
for the strategic, long-term focus of GPRA on thematic goals and
department- and governmentwide crosscutting comparisons.

To reach the full potential of performance management, agency planning and
reporting documents need to reflect the full array of uses of performance
information, which may extend beyond those needed for formulating the
President's Budget. However, it is not yet clear whether the results of
PART reviews, such as changes to agencies' program performance measures,
will complement and be integrated with the long-term, strategic goals and
objectives agencies have established in consultation with Congress and
other stakeholders under GPRA. OMB has not yet clearly articulated how
PART is to complement GPRA. Focus group participants suggested that the
administration and OMB needed to reinforce GPRA's usefulness as a
management tool for agencies. They also emphasized the need for OMB to
help agencies understand how to integrate GPRA with other management
initiatives, such as PART.

As we noted in chapter 3, agencies' plans and reports still suffer from
persistent weaknesses and could improve in a number of areas, such as
attention to issues that cut across agency lines, and better information
about the quality of the data that underlie agency performance goals.
However, OMB's July 2003 guidance for the preparation and submission of
strategic plans, annual performance plans, and annual performance reports
is significantly shorter and less detailed than its 2002 guidance. For
example, OMB no longer provides detailed guidance to agencies for the
development of performance plan components. OMB's 2002 guidance on the
preparation and submission of annual performance plans is approximately 39
pages long; in its 2003 guidance, that discussion spans only 2 pages. The
2003 guidance in this area does not include entire sections found in the
2002 guidance, such as principles for choosing performance goals and
indicators for inclusion in the annual plan, types of performance goals,
crosscutting programs, and requirements for verifying and validating data.

OMB needs to maintain and strengthen its leadership role in working with
agencies to help them produce the highest quality GPRA documents through
its formal guidance and reviews of strategic plan and report submissions.
Focus group participants discussed the need for consistent guidance on how
to implement GPRA. Furthermore, there is no evidence that agencies have
institutional knowledge of GPRA requirements that would obviate the need
for OMB's guidance. New managers will need a consistent resource that
provides practical guidance on what agencies need to include in their
planning and reporting documents to comply with GPRA and reflect best
practices. Consistent, explicit OMB guidance on preparing GPRA documents
can help ensure that gains in the quality of
GPRA documents are maintained and provide a resource for agencies to make
further improvements in those documents. For example, guidance on how to
discuss coordination of crosscutting programs or improvements to the
credibility of performance data in agency performance plans goes
hand-in-hand with OMB's enhanced oversight of agency performance through
the PART exercise.

The success of GPRA depends on the commitment of top leadership within
agencies, OMB, and Congress. Obtaining such leadership commitment depends
in part on the usefulness and relevance of agency goals and strategies to
these parties. GPRA requires an agency to develop a strategic plan at
least every 3 years to cover the following 5-year period. Thus, there have
been two required updates of strategic plans since the initial strategic
plans were submitted for fiscal year 1997: one for fiscal year 2000 and
one for fiscal year 2003. The fiscal year 2000 update occurred the year
before a new presidential term began. According to our focus group
participants-both
the experts and federal managers-it makes little sense to require an
update of a strategic plan shortly before a new administration is
scheduled to take office. For example, changes in political leadership
generally result in a new agenda with new objectives. Such changes force
agencies to revise their plans, management initiatives, and strategies,
which translates into additional GPRA-related work. A strategic plan that
does not reflect the participation and buy-in of top administration
leadership and key congressional stakeholders is unlikely to be
successfully implemented. Therefore, GPRA's requirement to update agency
strategic plans according to a schedule that is out of sync with
presidential and congressional terms means that effort may be wasted on
plans that lack the support of top leadership.

Managers Report Mixed Results in Use of Performance Information

GPRA's usefulness to agency leaders and managers as a tool for management
and accountability was cited as a key accomplishment numerous times by
focus group participants. However, a number of alternative views indicated
that use of performance information for key management decisions has been
mixed. For example, one participant said they did not believe GPRA has
been used as a tool yet, while another participant commented that only
better managers take advantage of GPRA as a management tool. According to
focus group participants, although many federal managers understand and
use results-oriented management concepts in their day-to-day activities,
such as strategic planning and performance measurement, they do not always
connect these concepts to the requirements of GPRA.

This view was strongly supported by our survey results. Prior to
mentioning GPRA in our survey, we asked federal managers the extent to
which they consider their agency's strategic goals when engaging in key
management tasks such as setting program activities, allocating resources,
or considering changes in their programs. A relatively high percentage of
managers-ranging from 66 to 79 percent-responded to a great or very great
extent. However, when we asked similar questions about the extent to which
they considered their agency's annual performance goals as set forth in
the agency's GPRA annual performance plan for the same activities, the
comparable responses were considerably lower, ranging from 22 to 27
percent.

Because the benefit of collecting performance information is only fully
realized when this information is actually used by managers, we asked them
about the extent to which they used the information obtained from
measuring performance for various program management activities. As shown
in figure 12, for seven of the nine activities we asked about, the
majority of managers who expressed an opinion reported using performance
information to a great or very great extent in 2003. Across all nine
activities, the percentage of managers saying they used performance
information to a great or very great extent ranged from 41 percent for
developing and managing contracts to 60 percent for allocating resources,
setting individual job expectations, and rewarding staff. While we had
observed a decline in the reported use of performance information to this
extent for many of these activities between 1997 and 2000, our 2003
results increased to levels not significantly different from 1997 for all
but one category-adopting new program approaches or changing work
processes. This category of use continued to be significantly lower at 56
percent in 2003 than it was in 1997 at 66 percent. Although another
category, coordinating program efforts with other internal or external
organizations, shows a similar pattern of limited recovery, the difference
between the 1997 and 2003 results is not statistically significant.

Figure 12: Percentage of Federal Managers Who Reported Using Information
Obtained from Performance Measurement to a Great or Very Great Extent for
Various Management Activities

[Bar chart omitted: percent by survey year (1997, 2000, 2003) for nine key
management activities: adopting new program approaches or changing work
processes; allocating resources; setting program priorities; setting new
or revising existing performance goals; setting individual job
expectations for the staff they manage or supervise; coordinating program
efforts with other internal or external organizations; refining program
performance measures; rewarding staff; and developing and managing
contracts.]

Source: GAO.
Note: Percentages are based on those respondents answering on the extent
scale.
aThere was a statistically significant difference between the 1997 and
2003 surveys.
bThis question was not asked in 1997.

We have reported that involving program managers in the development of
performance goals and measures is critical to increasing the relevance and
usefulness of this information to their day-to-day activities.3 Yet, our
survey data indicate that participation in activities related to the
development and use of performance information has also been mixed. In
2003, only 14 percent of managers reported that their agencies considered,
to a great or very great extent, their contributions to or comments on the
agency's GPRA plans or reports. However, significantly more SES managers
(43 percent) than non-SES managers (12 percent) expressed this view. Also,
when compared to our 2000 survey when we first asked this question, the
percentage of SES managers expressing this view in 2003 was significantly
higher than in 2000 (32 percent). The percentage of non-SES managers was
essentially unchanged from 2000 (10 percent).

Furthermore, as shown in figure 13, overall around half or fewer of
managers responded "yes" on our 2003 survey to questions about being
involved in developing ways to measure whether program performance goals
are being achieved (46 percent), gathering and analyzing data to measure
whether programs were meeting their specific performance goals (51
percent), or using measures for program performance goals to determine if
the agency's strategic goals were being achieved (43 percent). None of
these overall results were significantly different from our 1997 results.
We did find, however, that significantly more SES managers responded "yes"
on the 2003 survey (72 percent) than on the 1997 survey (55 percent) with
regard to being involved in using performance measurement information to
determine if the agency's strategic goals were being achieved.

3GAO/GGD-97-109 and GAO-01-127.

Figure 13: Percentage of Federal Managers Responding "Yes" about Being
Involved in the Following Activities

[Bar chart omitted: percent responding "yes" in 1997 and 2003 for each of
the three activities listed below.]

Developing ways to measure whether program performance goals are being
achieved

Gathering and analyzing data to measure whether programs are meeting their
specific performance goals

Using measurements for program performance goals to determine if the
agency's strategic goals are being achieved

Source: GAO.

Managers Continue to Confront a Range of Human Capital Management
Challenges

Managing people strategically and maintaining a highly skilled and
energized workforce that is empowered to focus on results are critically
important. Such human capital management practices are essential to the
success of the federal government in the 21st century and to maximizing
the value of its greatest asset-its people. Our survey results showed
continuing challenges related to the adequacy of managerial
decision-making authority, training, and incentives.

Federal Managers Report That They Are Held Accountable for Program Results
but Do Not Have the Decision-Making Authority They Need to Accomplish
Agency Goals

High-performing organizations seek to shift the focus of management and
accountability from activities and processes to contributions and
achieving results. In each of our three surveys, we asked managers about
the amount of decision-making authority they had and the degree to which
they were held accountable for results.

As shown in figure 14, for 2003, an estimated 40 percent of federal
managers overall reported that they had the decision-making authority they
needed to help the agency accomplish its strategic goals to a great or
very great extent. This was a statistically significant increase over our
1997 estimate of 31 percent. While there were more SES and non-SES
managers expressing this view on our 2003 survey than on the 1997 survey,
it was the non-SES managers who showed the significant increase. Despite
this promising trend, however, there continued to be substantial
differences in 2003, as well as on the two previous surveys, between the
responses of SES and lower-level managers on this question. Compared to
the 57 percent of SES managers who reported having such authority to a
great or very great extent in 2003, only 38 percent of non-SES managers
reported having such authority to a great or very great extent.

Figure 14: Percentage of Federal Managers Reporting to a Great or Very
Great Extent That Managers/Supervisors at Their Levels Had the
Decision-Making Authority They Needed to Help the Agency Accomplish Its
Strategic Goals

[Bar chart omitted: percent by survey year (1997, 2000, 2003) for each
group in the legend below.]

Federal managersa

SESb

Non-SESa,b

Source: GAO.

aThere was a statistically significant difference between the 1997 and
2003 surveys.
bThere was a statistically significant difference between SES and non-SES
for each survey.

However, when asked the extent to which managers or supervisors at their
levels were held accountable for the accomplishment of agency strategic
goals, 57 percent responded to a great or very great extent in 2003.
Unlike in other areas, where SES managers had significantly different
views from non-SES managers, there was little difference in the area of
accountability. (See fig. 15.)

Figure 15: Percentage of Federal Managers, SES, and Non-SES in 2003
Reporting to a Great or Very Great Extent That They Were Held Accountable
for the Accomplishment of Agency Strategic Goals

[Bar chart omitted: percent in 2003 for federal managers overall, non-SES,
and SES.]

Source: GAO.

This 57 percent is significantly higher than the 40 percent of managers
overall who indicated that they had comparable decision-making authority.
However, in contrast to the question on authority, as shown in figure 14,
where more SES managers than non-SES managers expressed the view that they
had the authority, there was little difference, as shown in figure 15,
between the two groups in their views about being held accountable for
achieving agency strategic goals to a great or very great extent. As
figures 14 and 15 further illustrate, roughly the same percentage of SES
managers perceived to a great or very great extent that managers at their
level had decision-making authority and accountability for achieving
agency strategic goals. This result suggests that their authority was
perceived to be on par with their accountability. In contrast, only 38
percent of non-SES managers perceived that managers at their levels had
the decision-making authority they needed to a great or very great extent,
while 57 percent perceived that they were held accountable to a comparable
extent.

Managers are hard-pressed to achieve results when they do not have
sufficient authority to act. In our report containing the results of our
1997 survey, we noted that agencies needed to concentrate their efforts on
areas
where managers were not perceiving or experiencing progress, such as
devolving decision-making authority to managers throughout their
organizations. While authority for achieving results appears to be on a
modest upward trend, the balance between authority and
accountability that fosters decision making to achieve results could be
further improved, particularly among non-SES managers.

Fewer Than Half of Managers Reported Training on Key Tasks

We previously reported on the need for agencies to expend resources on
effective training and professional development to equip federal employees
to work effectively.4 Among the resources focus group participants cited
as lacking were federal managers and staff with the competencies and
skills needed to plan strategically, develop robust measures of
performance, and analyze what the performance data mean. Our 2003 Guide
calls for training and development efforts to be strategically focused on
improving performance toward the agency's goals and put forward with the
agency's organizational culture firmly in mind.5 Throughout this process
it is important that top leaders in the agencies communicate that
investments in training and development are expected to produce clearly
identified results. By incorporating valid measures of effectiveness into
the training and development programs they offer, agencies can better
ensure that they will adequately address training objectives and thereby
increase the likelihood that desired changes will occur in the target
population's skills, knowledge, abilities, attitudes, or behaviors.
Furthermore, if managers understand and support the objectives of training
and development efforts, they can provide opportunities to successfully
use the new skills and competencies on the job and model the behavior they
expect to see in their employees.

In response to our 2003 survey, fewer than half of managers answered "yes"
when we asked them whether, during the past 3 years, their agencies had
provided, arranged, or paid for training that would help them accomplish
any of seven critical results-oriented management-related tasks. However,
our survey results indicate progress. As shown in figure 16, more
managers answered "yes" in 2003 on all seven training areas than in

4GAO-01-127.

5U.S. General Accounting Office, Human Capital: A Guide for Assessing
Strategic Training and Development Efforts in the Federal Government
(Exposure Draft), GAO-03-893G (Washington, D.C.: July 1, 2003).

previous surveys. These increases were statistically significant for five
of the tasks-assessing the quality of performance data, setting program
performance goals, using program performance information to make
management decisions, linking program performance to the achievement of
agency strategic goals, and implementing the requirements of GPRA.

Figure 16: Percentage of Federal Managers in Each Survey Year Who Reported
That during the Past 3 Years Their Agencies Provided, Arranged, or Paid
for Training That Would Help Them Accomplish Specific Tasks

[Bar chart omitted: percent by survey year (1997, 2000, 2003) for seven
tasks: assess the quality of performance datab; develop program
performance measures; conduct strategic planning; implement the
requirements of GPRAc; link the performance of
programs/operations/projects to the achievement of agency strategic
goalsc; use program performance information to make decisionsc; and set
program performance goalsc.]

Source: GAO.

aThis question was not asked in the 1997 survey.
bThere was a statistically significant difference between the 2000 and
2003 surveys.
cThere was a statistically significant difference between the 1997 and
2003 surveys.

As with our 2000 survey results, the 2003 survey results continued to
demonstrate that there is a positive relationship between agencies
providing training and development on setting program performance goals
and the use of performance information when setting or revising
performance goals. For those managers who responded "yes" to training on
setting performance goals, 60 percent also reported that they used
information obtained from performance measurement when setting new or
revising existing performance goals to a great or very great extent. In
contrast, for those managers who responded "no" to training on setting
performance goals, only 38 percent reported that they used information
obtained from performance measurement for setting new or revising existing
performance goals to a great or very great extent. The difference between
these percentages is statistically significant. Effective training and
development programs are an integral part of a learning environment that
can enhance the federal government's ability to attract and retain
employees with the skills and competencies needed to achieve results.
Training and developing new and current staff to fill new roles and work
in different ways will be a crucial part of the federal government's
endeavors to meet its transformation challenges. Ways that employees learn
and achieve results will also continue to transform how agencies do
business and engage employees in further innovation and improvements.
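A minimal sketch of how such a two-proportion comparison can be checked in
code; the sample sizes are hypothetical placeholders, since the survey's
stratified design and weights are not reproduced in this report:

    from math import sqrt

    def two_proportion_z(p1: float, n1: int, p2: float, n2: int) -> float:
        """Pooled two-proportion z statistic for H0: p1 equals p2."""
        pooled = (p1 * n1 + p2 * n2) / (n1 + n2)
        se = sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
        return (p2 - p1) / se

    # Hypothetical: 60 percent of 400 managers who reported training on
    # setting performance goals, versus 38 percent of 400 who did not,
    # used performance information when setting or revising goals.
    z = two_proportion_z(0.38, 400, 0.60, 400)
    print(f"z = {z:.2f}")  # about 6.2, well beyond 1.96, so significant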

Managers Perceive a Lack of Positive Recognition for Helping Agencies
Achieve Results

Another fundamental aspect of the human capital management challenge
agencies face is providing the incentives to their employees to encourage
results-oriented management. Monetary and nonmonetary incentives can be
used as a method for federal agencies to reward employees and to motivate
them to focus on results.

Overall, an increasing but still small percentage of managers reported in
1997, 2000, and 2003 that employees in their agencies received positive
recognition to a great or very great extent for helping agencies
accomplish their strategic goals. In 1997, 26 percent of federal managers
reported this level of positive recognition, compared with 37 percent in
2003, a statistically significant increase. Interestingly, this
improvement is seen in the responses of non-SES managers. As shown in
figure 17, the percentage of SES managers expressing this view stayed at
about the same level over the three surveys, while the percentage of
non-SES managers holding this view was significantly higher in 2003 than
in 1997. Even with this improvement on the part of the responses from
non-SES managers, significantly more SES managers (47 percent) than
non-SES managers (36 percent) expressed this perception to a comparable
extent in 2003.

Figure 17: Percentage of Federal Managers Who Reported to a Great or Very
Great Extent That Employees in Their Agencies Received Positive
Recognition for Helping Their Agencies Accomplish Their Strategic Goals

[Bar chart omitted: percent by survey year (1997, 2000, 2003) for each
group in the legend below.]

Federal managersa

SES

Non-SESa

Source: GAO.

aThere was a statistically significant difference between the 1997 and
2003 surveys.

Unfortunately, most existing federal performance appraisal systems are not
designed to support a meaningful performance-based pay system in that they
fail to link institutional, program, unit, and individual performance
measurement and reward systems. In our view, one key need is to modernize
performance management systems in executive agencies so that they link to
the agency's strategic plan, related goals, and desired outcomes and are
therefore capable of adequately supporting more performance-based pay and
other personnel decisions.

We have reported that federal agencies can develop effective performance
management systems by implementing a selected, generally consistent set of
key practices. These key practices helped public sector organizations both
in the United States and abroad create a clear linkage-"line of
sight"-between individual performance and organizational success and,

Chapter 4
Challenges to GPRA Implementation Persist

thus, transform their cultures to be more results oriented,
customer-focused, and collaborative in nature. Examples of such practices
include

o  aligning individual performance expectations with organizational goals,

o  connecting performance expectations to crosscutting goals,

o  linking pay to individual and organizational performance, and

o  making meaningful distinctions in performance.6

Beyond implementing these key practices, high-performing organizations
understand that their employees are assets whose value to the organization
must be recognized, understood, and enhanced. They view an effective
performance management system as an investment to maximize the
effectiveness of people by developing individual potential to contribute
to organizational goals. To maximize this investment, an organization's
performance management system is designed, implemented, and continuously
assessed by the standard of how well it helps the employees help the
organization achieve results and pursue its mission.

6For a complete list and discussion of the practices, see U.S. General
Accounting Office, Results-Oriented Cultures: Creating a Clear Linkage
between Individual Performance and Organizational Success, GAO-03-488
(Washington, D.C.: Mar. 14, 2003).

Persistent Challenges in Setting Outcome-Oriented Goals, Measuring
Performance, and Collecting Useful Data

In prior reports, we have described difficulties faced by federal managers
in developing useful, outcome-oriented measures of performance and
collecting data indicating progress achieved.7 One of the most persistent
challenges has been the development of outcome-oriented performance
measures. Additionally, it is difficult to distinguish the impact of a
particular federal program from the impact of other programs and factors,
thus making it difficult to attribute specific results to the program. The
lack of timely and useful performance information can also
hinder GPRA implementation.

Meaningful, Outcome-Oriented Performance Measures Are Sometimes Hard to
Develop

In the past, we have noted that federal managers found meaningful
performance measures difficult to develop. Focus group participants and
survey respondents noted that outcome-oriented performance measures were
especially difficult to establish when the program or line of effort was
not easily quantifiable. The challenge of the "complexity of establishing
outcome-oriented goals and measuring performance" was cited by six of the
eight focus groups as one of the key challenges that managers face in
implementing GPRA. Focus group participants agreed that they often felt as
if they were trying to measure the immeasurable, lacking a clear
understanding of which performance indicators could accurately tell the
agency how well it was carrying out a specific activity. Managers from
agencies
engaged in basic science research and development and grantmaking
functions noted that this effort was particularly difficult for them
because federal programs, especially those that are research-based, often
take years to achieve the full scope of their goals. On our most recent
survey, we estimated that 36 percent of federal managers who had an
opinion indicated that the determination of meaningful measures hindered
the use of performance information or performance measurement to a great
or very great extent. While this number was significantly lower than the
percentage of managers expressing the comparable view on the 1997 or 2000
survey and may reflect some lessening of this as a hindrance to some

7See for example, U.S. General Accounting Office, Managing for Results:
Analytic Challenges in Measuring Performance, GAO/HEHS/GGD-97-138
(Washington, D.C.: May 30, 1997); Program Evaluation: Agencies Challenged
by New Demand for Information on Program Results, GAO/GGD-98-53
(Washington, D.C.: Apr. 24, 1998); Managing for Results: Measuring Program
Results That Are Under Limited Federal Control, GAO/GGD-99-16 (Washington,
D.C.: Dec. 11, 1998); and Managing for Results: Challenges Agencies Face
in Producing Credible Performance Information, GAO/GGD-00-52 (Washington,
D.C.: Feb. 4, 2000).

managers, it nonetheless continues to be among those items having the
largest percentage of managers citing it as a substantial hindrance.

Impact of Federal Programs Difficult to Discern

In our June 1997 report on GPRA, we noted that "the often limited or
indirect influence that the federal government has in determining whether
a desired result is achieved complicates the effort to identify and
measure the discrete contribution of the federal initiative to a specific
program result."8 This occurs primarily because many federal programs'
objectives are the result of complex systems or phenomena outside the
program's control. In such cases, it is particularly challenging for
agencies to confidently attribute changes in outcomes to their program-the
central task of program impact evaluation. This is particularly
challenging for regulatory programs, scientific research programs, and
programs that deliver services to taxpayers through third parties, such as
state and local governments.

We have reported that determining the specific outcomes resulting from
federal research and development has been a challenge that will not be
easily resolved.9 Due to the difficulties in identifying outcomes,
research and development agencies typically have chosen to measure a
variety of proxies for outcomes, such as the number of patents resulting
from federally funded research, expert review and judgments of the quality
and importance of research findings, the number of project-related
publications or citations, and contributions to expanding the number of
research scientists.

8GAO/GGD-97-109, 6.

9U.S. General Accounting Office, Managing for Results: Key Steps and
Challenges in Implementing GPRA in Science Agencies, GAO/T-GGD/RCED-96-214
(Washington, D.C.: July 10, 1996).

We have also reported that implementing GPRA in a regulatory environment
is particularly challenging.10 Although federal agencies are generally
required to assess the potential benefits and costs of proposed major
regulatory actions, they generally do not monitor the benefits and costs
of how these and other federal programs have actually performed. For
example, in the case of the Environmental Protection Agency (EPA), to
determine if existing environmental regulations need to be retained or
improved, we previously recommended that EPA study the actual costs and
benefits of such regulations.11

In the past, regulatory agencies have cited numerous barriers to their
efforts to establish results-oriented goals and measures. These barriers
included problems in obtaining data to demonstrate results, accounting for
factors outside of the agency's control that affect results, and dealing
with the long time periods often needed to see results. Our prior work
discussed best practices for addressing challenges to measuring the
results of regulatory programs. In particular, to address the challenge of
discerning the impact of a federal program, when other factors also affect
results, we suggested agencies "establish a rationale of how the program
delivers results." Establishing such a rationale involves three related
practices: (1) taking a holistic or "systems" approach to the problem
being addressed, (2) building a program logic model that describes how
activities translate to outcomes, and (3) expanding program assessments
and evaluations to validate the model linkages and rationale.
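As a rough illustration of practice (2), a program logic model can be
represented as an ordered chain of stages, each carrying its own measures.
The stages and example measures below are hypothetical and are not drawn
from any agency plan discussed in this report:

    from dataclasses import dataclass, field

    @dataclass
    class Stage:
        """One link in a program logic model, with illustrative measures."""
        name: str
        measures: list = field(default_factory=list)

    def build_logic_model() -> list:
        """Chain hypothetical stages from resources to end outcomes."""
        return [
            Stage("inputs", ["appropriated funds", "staff hours"]),
            Stage("activities", ["inspections conducted"]),
            Stage("outputs", ["violations corrected"]),
            Stage("intermediate outcomes", ["compliance rate"]),
            Stage("end outcomes", ["reduced exposure to hazards"]),
        ]

    model = build_logic_model()
    for earlier, later in zip(model, model[1:]):
        # Each stage is assumed to translate into the next; assessments
        # and evaluations (practice 3) test these assumed linkages.
        print(f"{earlier.name} -> {later.name}")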

We have also reported on the difficulties encountered in meeting GPRA
reporting requirements for intergovernmental grant programs.12 Programs
that do not deliver a readily measurable product or service are likely to
have difficulty meeting GPRA performance measurement and reporting
requirements. Intergovernmental grant programs, particularly those with
the flexibility inherent in classic block grant design, may be more likely
to

10U.S. General Accounting Office, Managing for Results: Strengthening
Regulatory Agencies' Performance Management Practices, GAO/GGD-00-10
(Washington, D.C.: Oct. 28, 1999).

11U.S. General Accounting Office, Environmental Protection: Assessing the
Impacts of EPA's Regulations Through Retrospective Studies,
GAO/RCED-99-250 (Washington, D.C.: Sept. 14, 1999).

12U.S. General Accounting Office, Grant Programs: Design Features Shape
Flexibility, Accountability, and Performance Information, GAO/GGD-98-137
(Washington, D.C.: June 22, 1998).

have difficulty producing performance measures at the national level and
raise delicate issues of accountability. Although most flexible grant
programs we reviewed reported simple activity or client counts, relatively
few of them collected uniform data on the outcomes of state or local
service activities. Collecting such data requires conditions (such as
uniformity of activities, objectives, and measures) that do not exist
under many flexible program designs, and even where overall performance of
a state or local program can be measured, the amount attributable to
federal funding often cannot be separated out.

Focus group participants also suggested that they faced challenges in
obtaining timely performance data from relevant partner organizations and
in identifying what the federal government's contribution has been to a
specific outcome. Furthermore, survey respondents provided some
corroboration for these views. Across all three of our surveys, we
estimate that roughly a quarter of all federal managers reported this
difficulty- distinguishing between the results produced by the program
they were involved with and results caused by other factors-as a
substantial hindrance. In response to a survey question about what the
federal government could do to improve its overall focus on managing for
results, one respondent noted: "Defining meaningful measures for the work
we do is extremely difficult; and even if they could be defined,
performance and accomplishment is (sic) dependent on so many factors
outside our control that it is difficult, if not impossible, to make valid
conclusions."

Timely, Useful Performance Information Not Always Available

In February 2000, we reported that intergovernmental programs pose
potential difficulties in collecting timely and consistent national
data.13 We also noted that agencies had limited program evaluation
capabilities and that weaknesses in agencies' financial management
capabilities make it difficult for decision makers to effectively assess
and improve many agencies' financial performance. On the basis of our
current
findings, these issues still exist. Federal managers who participated in
our focus groups cited difficulties in gathering data from state or local
entities, as well as statutory limitations regarding the nature and
breadth of data that they were permitted to collect. However, in our 2003
survey, only 27 percent of federal managers indicated that obtaining data
in time to be useful was a

13GAO/GGD-00-52.

substantial hindrance; 31 percent expressed a comparable view with regard
to obtaining valid or reliable data.

Focus group participants also noted that OMB's accelerated time frames for
reporting performance information will contribute to the challenge of
producing complete, timely information in their agencies' performance
reports. Over the past 2 fiscal years, OMB has moved up the deadline for
submission of agencies' performance reports (now performance and
accountability reports) from the statutory requirement of March 31; for
fiscal year 2003 data, the deadline is January 30, 2004. In fiscal
year 2004, these reports will be due on November 15, 2004. According to
the managers, individual agencies may work on different time frames based
partially on the population they serve or the stakeholders they must work
with, such as state or local agencies. This "one size fits all" approach
does not take such differences into account.

Additionally, OMB requires agencies to report on their performance data
quarterly; managers noted that this was particularly difficult for
outcomes that may be achieved over extended periods of time, such as
outcomes associated with basic science. As we have previously reported,
measuring the performance of science-related projects can be difficult
because a wide range of factors determine if and how a particular research
and development project will result in a commercial application or have
other benefits. Efforts to cure diseases or pursue space exploration are
difficult to quantify and break down into meaningful quarterly performance
measures.

Crosscutting Issues Hinder Successful GPRA Implementation

Crosscutting issues continue to be a challenge to GPRA implementation.
Mission fragmentation and program overlap are widespread across the
federal government. Moreover, addressing this challenge is essential to
the success of national strategies in areas such as homeland security,
drug control, and the environment.

We have reported that agencies could use the annual performance planning
cycle and subsequent annual performance reports to highlight crosscutting
program efforts and to provide evidence of the coordination of those
efforts. Our review of six agencies' strategic and annual performance
plans showed some improvement in addressing their crosscutting program
efforts, but a great deal of improvement is still necessary. Few of the
plans we reviewed attempted the more challenging task of discussing
planned strategies for coordination and establishing complementary
performance
goals and complementary or common performance measures. For example, SSA's
2004 performance plan makes some mention of the agency's efforts to
coordinate with other agencies to preserve the integrity of the Social
Security number as a personal identifier, but there are very few details
about this important component of its mission.

Previous GAO reports and agency managers identified several barriers to
interagency coordination. First, missions may not be mutually reinforcing
or may even conflict, making reaching a consensus on strategies and
priorities difficult. In 1998 and 1999, we found that mission
fragmentation and program overlap existed in 12 federal mission areas,
ranging from agriculture to natural resources and the environment.
Implementation of federal crosscutting programs is often characterized by
numerous individual agency efforts that are implemented with little
apparent regard for the presence of related activities. Second, we
reported on agencies' interest in protecting jurisdiction over missions
and control over resources.14 Focus group participants echoed this
concern, noting that there can be "turf battles" between agencies, where
jurisdictional boundaries, as well as control over resources, are hotly
contested. Finally, incompatible procedures, processes, data, and computer
systems pose difficulties for agencies to work across agency boundaries.
For example, we reported how the lack of consistent data on federal
wetlands programs implemented by different agencies prevented the
government from measuring progress toward achieving the governmentwide
goal of no net loss of the nation's wetlands.15

14U.S. General Accounting Office, Managing for Results: Barriers to
Interagency Coordination, GAO/GGD-00-106 (Washington, D.C.: Mar. 29,
2000).

15U.S. General Accounting Office, Wetlands Overview: Problems With Acreage
Data Persist, GAO/RCED-98-150 (Washington, D.C.: July 1, 1998) and
Results-Oriented Management: Agency Crosscutting Actions and Plans in
Border Control, Flood Mitigation and Insurance, Wetlands, and Wildland
Fire Management, GAO-03-321 (Washington, D.C.: Dec. 20, 2002).

We have previously reported and testified that GPRA could provide OMB,
agencies, and Congress with a structured framework for addressing
crosscutting program efforts.16 OMB, for example, could use the provision
of GPRA that calls for OMB to develop a governmentwide performance plan to
integrate expected agency-level performance. Unfortunately, this provision
has not been fully implemented. OMB issued the first and only such plan in
February 1998 for fiscal year 1999. In our review of the plan,17 we found
that it included a broad range of governmentwide management objectives and
a mission-based presentation of key performance goals based on agency
performance plans, and that the plan's framework should ultimately allow
for a cohesive presentation of governmentwide performance. However, the
specific
contents of this initial plan did not always deliver an integrated,
consistent, and results-oriented picture of fiscal year 1999 federal
government performance goals.

OMB officials we interviewed at the time stressed that developing the
governmentwide plan was viewed as an essential and integral component of
the President's budget and planning process. From OMB's perspective, both
the plan and the budget submission were intended to serve as communication
tools for a range of possible users. In their opinion, the plan added
value by reflecting a governmentwide perspective on policy choices made
throughout the budget formulation process. OMB acknowledged that the plan
itself did not serve to change the process through which decisions on
government priorities were made, but enhanced it by placing a greater
emphasis on results. As one official described it, the governmentwide
performance plan was a derivative document, reflecting the budget and
management decisions made throughout the process of formulating the
President's budget submission. However, we found that focusing broadly on
governmentwide outcomes should be a central and distinguishing feature of
the federal government performance plan. To be most effective and
supportive of the purposes of GPRA, the governmentwide plan must be more
than a compilation of agency-level plans; integration, rather than
repetition, must be its guiding principle.

16GAO/GGD-00-106 and U.S. General Accounting Office, Results-Oriented
Government: Using GPRA to Address 21st Century Challenges, GAO-03-1166T
(Washington, D.C.: Sept. 18, 2003).

17U.S. General Accounting Office, The Results Act: Assessment of the
Governmentwide Performance Plan for Fiscal Year 1999, GAO/AIMD/GGD-98-159
(Washington, D.C.: Sept. 8, 1998).

OMB has not issued a distinct governmentwide performance plan since fiscal
year 1999. Most recently, the President's fiscal year 2004 budget focused
on describing agencies' progress in addressing the PMA and the results of
PART reviews of agency programs. Although such information is important
and useful, it does not provide a broader and more integrated perspective
of planned performance on governmentwide outcomes. Additionally, the
fiscal year 2004 budget identified budget requests and performance
objectives by agency, such as the U.S. Department of Defense, as opposed
to crosscutting governmentwide themes. From this presentation, one could
assume that the only activities the U.S. government planned to carry out
in support of national defense were those listed under the chapter
"Department of Defense." However, the chapter of the fiscal year 2004
budget discussing "the Department of State and International Assistance
Programs," contains a heading titled, "Countering the Threat from Weapons
of Mass Destruction." And while OMB may have a technical reason for not
classifying this task as being related to national defense or homeland
security, it is unclear that a lay reader could make that distinction. The
fiscal year 2005 budget also identified budget requests by agency, not by
crosscutting theme. Without such a governmentwide focus, OMB is missing an
opportunity to assess and communicate the relationship between individual
agency goals and outcomes that cut across federal agencies, and to relate
more clearly the contributions of alternative federal
strategies. The governmentwide performance plan also could help Congress
and the executive branch address critical federal performance and
management issues, including redundancy and other inefficiencies in how we
do business. It could also provide a framework for any restructuring
efforts.

A strategic plan for the federal government, supported by key national
indicators to assess the government's performance, position, and progress,
could provide an additional tool for governmentwide reexamination of
existing programs, as well as proposals for new programs. If fully
developed, a governmentwide strategic plan could potentially provide a
cohesive perspective on the long-term goals of the federal government and
provide a much needed basis for fully integrating, rather than merely
coordinating, a wide array of federal activities. Successful strategic
planning requires the involvement of key stakeholders. Thus, it could
serve as a mechanism for building consensus. Further, it could provide a
vehicle for the President to articulate long-term goals and a road map for
achieving them. In addition, a strategic plan could provide a more
comprehensive framework for considering organizational changes and making
resource decisions.

Developing a strategic plan for the federal government would be an
important first step in articulating the role, goals, and objectives of
the federal government. It could help provide critical horizontal and
vertical linkages. Horizontally, it could integrate and foster synergies
among components of the federal government as well as help to clarify the
role of the federal government vis-à-vis other sectors of our society.
Vertically, it could provide a framework of federal missions and goals
within which individual federal agencies could align their own missions
and goals that would cascade down to individual employees. The development
of a set of key national indicators could be used as a basis to inform the
development of the governmentwide strategic and annual performance plans.
The indicators could also link to and provide information to support
outcome-oriented goals and objectives in agency-level strategic and annual
performance plans.

Managers View Congress' Use of Performance Information as Limited

Focus group members believed that one of the main challenges to GPRA
implementation was the reluctance of Congress to use that information when
making decisions, especially appropriations decisions. This concern was
cited as a significant challenge in each of the focus groups, and was one
of the top three "challenges" in five of the eight focus groups. In some
cases, managers in our focus groups noted that this lack of usage was a
significant disincentive to doing a good job in preparing GPRA plans and
reports. Agency managers made the following criticisms regarding the
perceived lack of congressional use of performance information:

o  appropriators have not bought into GPRA, so there is no incentive to do
this well,

o  congressional leadership has failed to develop and use performance
measures,

o  appropriators do not use performance data or tools to make decisions,
and

o  GPRA does not drive public policy decisions.

Results from our survey provide some further information in support of
this view. On our 2003 survey, when we asked federal managers about the
extent to which they thought congressional committees paid attention to
agency efforts under GPRA, only 22 percent of federal managers responded
in the great to very great categories. This result was not significantly
different from the results we observed on our 2000 survey when we asked
this question about three specific types of congressional committees-
authorization, appropriation, and oversight. On the 2000 survey, only 18
percent of federal managers held a similar view concerning authorizing
committees, 19 percent for appropriations committees, and 20 percent for
oversight committees. As we noted earlier, when this item was asked in
relation to OMB, there was a significant increase in the percentage of
managers responding to a great or very great extent from 2000 to 2003. The
31 percent of managers who viewed OMB as paying attention to a great or
very great extent in 2003 was significantly higher than the 22 percent
holding a comparable view of congressional committees.

Although managers expressed these concerns about the use of this
information, a recent review by the CRS suggested that Congress uses
performance information to some extent, as evidenced by citations in
legislation and committee reports.18 For example, in the 106th Congress
(1999-2000), 42 public laws contained statutory language relating to GPRA
and performance measures, and 118 legislative reports19 contained
GPRA-associated passages. As shown in figure 18, across all three of our
surveys, only a minority of federal managers governmentwide viewed the
lack of ongoing congressional commitment for using performance information
as a hindrance to a great or very great extent.

18Congressional Research Service, Government Performance and Results Act:
Overview of Associated Provisions in the 106th Congress (Washington,
D.C.: 2002).

19This included reports that accompanied bills passed by both the House
and Senate that were either enacted into law or vetoed by the President.

Figure 18: Percentage of Federal Managers Reporting to a Great or Very
Great Extent That a Lack of Ongoing Congressional Commitment or Support
for Using Performance Information in Making Program/Funding Decisions Is a
Hindrance

[Bar chart: 33 percent in 1997, 34 percent in 2000, and 31 percent in
2003.]

Source: GAO.

Note: Percentages are based on those respondents answering on the extent
scale.

While there is concern regarding Congress' use of performance information,
it is important to make sure that this information is initially useful.
One of GPRA's purposes is to respond to a need for accurate, reliable
information for congressional decision making. In 2000, we reported that
congressional staffs stated that they were looking for recurring
information on spending priorities within programs; the quality, quantity,
and efficiency of program operations; the populations served or regulated;
as well as programs' progress in meeting their objectives.20 For example,
learning who benefits from a program can help in addressing questions
about how well services are targeted to those most in need. Some of these
recurring needs were met through formal agency documents, such as annual
performance plans. However, some information the agencies provided did not
fully meet the congressional staffs' needs because the presentation was
not clear, directly relevant, or sufficiently detailed. For example,
congressional staffs wanted to see more direct linkages among the
agencies' resources, strategies, and goals. In other cases, the
information was not readily available to the congressional staffs, either
because it had not been requested or reported, or because staff were not
informed that it was available.

20U.S. General Accounting Office, Managing for Results: Views on Ensuring
the Usefulness of Agency Performance Information to Congress,
GAO/GGD-00-35 (Washington, D.C.: Jan. 26, 2000).

As a key user of performance information, Congress also needs to be
considered a partner in shaping agency goals at the outset. For example,
through the strategic planning requirement, GPRA requires federal agencies
to consult with Congress and key stakeholders to reassess their missions
and long-term goals as well as the strategies and resources they will need
to achieve their goals. GPRA also provides a vehicle for Congress to
explicitly state its performance expectations in outcome-oriented terms
when establishing new programs or in exercising oversight of existing
programs that are not achieving desired results. Congress could use
authorizing and appropriations hearings to determine if agency programs
have clear performance goals, measures, and data with which to track
progress and whether the programs are achieving their goals. If goals and
objectives are unclear or not results oriented, Congress could use
legislation to articulate the program outcomes it expects agencies to
achieve. This would provide important guidance to agencies that could then
be incorporated in agency strategic and annual performance plans.

Chapter 5

                        Conclusions and Recommendations

Agenda for Achieving a Sustainable, Governmentwide Focus on Results

As we have shown in this report, in the 10 years since the enactment of
GPRA, significant progress has been made in instilling a focus on results
in the federal government. First, GPRA statutory requirements laid a
foundation for results-oriented management in federal agencies. Expert and
agency focus group participants cited the creation of this statutory
foundation as one of the key accomplishments of GPRA. Since GPRA began to
be implemented governmentwide in fiscal year 1997, we have observed
significant increases in the percentage of federal managers who reported
having results-oriented performance measures for their programs. Focus
group participants' views on whether GPRA has had a positive effect on the
federal government's ability to deliver results to the American public
were mixed. For example, some participants stated that the information
gathered and reported for GPRA allows agencies to make better-informed
decisions, which improves their ability to achieve results, and that GPRA
has made the results of
federal programs more transparent to the public. Other participants stated
that while certain aspects of GPRA-related work have been positive,
agencies' ability to deliver results and public awareness of their
activities cannot be exclusively attributed to GPRA.

Second, GPRA has increased the connection between resources and results by
creating more formal linkages between agency performance goals and
objectives and the program activities in the budget. Over the first 4
years of agency efforts to implement GPRA, we observed that agencies
continued to tighten the required linkage between their performance plans
and budget requests. However, much remains to be done in this area. For
example, we have not observed notable increases in federal managers'
perceptions about their personal use of plans or performance information
when allocating resources, or about the use of performance information
when funding decisions are made about their programs. Nevertheless, it should
be noted that we estimate a majority have positive perceptions about the
use of performance information to allocate resources.

Third, GPRA has provided a foundation for examining agency missions,
performance goals and objectives, and the results achieved. We have seen
improvements in the quality of agency strategic plans, annual performance
plans, and performance reports since initial efforts. However, few of the
six agencies we reviewed in this report produced GPRA planning and
reporting documents that met all of our criteria for the highest level of
quality. Most of these agencies continued to miss opportunities to present
clear pictures of their intended and actual performance results in their
GPRA plans and reports and to show how resources are aligned with actual
performance results. Furthermore, most of the agencies we reviewed did not
provide a full level of confidence in the credibility of their performance
data.

Performance-based management, as envisioned by GPRA, requires transforming
organizational cultures to improve decision making, maximize performance,
and assure accountability. This transformation is not an easy one and
requires investments of time and resources as well as sustained leadership
commitment and attention. Challenges to successful implementation of GPRA
include inconsistent top leadership commitment to creating a focus on
results; an approach to setting goals and developing strategies for
achieving critical outcomes that creates individual agency stovepipes
rather than an integrated, holistic governmentwide approach; getting
federal managers to make greater use of performance information to manage
their programs and providing them authority to act that is commensurate
with their accountability for results; difficulty in establishing
meaningful measures of outcomes and assessing results of federal programs
that are carried out by nonfederal entities; and untimely performance
data.

The challenges identified in this report are not new-most have not changed
significantly since we first reported on governmentwide implementation of
GPRA. However, we have frequently reported on approaches that agencies,
OMB, and Congress could use to address the challenges. These approaches
include strengthening the commitment of top leadership to creating and
sustaining a focus on results; taking a governmentwide approach to
achieving outcomes that are crosscutting in nature; improving the
usefulness of performance information to managers, Congress, and the
public; and improving the quality of performance measures and data.
Collectively, these approaches form the agenda that federal agencies, OMB,
and Congress will need to follow to bring about a more sustainable,
governmentwide focus on results.

Strengthening Top Leadership Commitment to Creating and Sustaining
Results-Oriented Cultures

Successfully addressing the challenges that federal agencies face requires
leaders who are committed to achieving results, who recognize the
importance of using results-oriented goals and quantifiable measures, and
who integrate performance-based management into the culture and day-to-day
activities of their organizations. Top leadership must play a critical
role in creating and sustaining high-performing organizations. Without the
clear and demonstrated commitment of agency top leadership-both political
and career-organizational cultures will not be transformed, and new
visions and ways of doing business will not take root.

To be positioned to address the array of challenges faced by our national
government, federal agencies will need to transform their organizational
cultures so that they are more results-oriented, customer-focused, and
collaborative. Leading public organizations here in the United States and
abroad have found that strategic human capital management must be the
centerpiece of any serious change management initiative and efforts to
transform the cultures of government agencies. Performance management
systems are integral to strategic human capital management. Such systems
can be key tools to maximizing performance by aligning institutional
performance measures with individual performance and creating a "line of
sight" between individual and organizational goals. Leading organizations
use their performance management systems as a key tool for aligning
institutional, unit, and employee performance; achieving results;
accelerating change; managing the organization day to day; and
facilitating communication throughout the year so that discussions about
individual and organizational performance are integrated and ongoing.1

Furthermore, achieving this cultural transformation requires people to
have the knowledge and skills to develop and use performance information
to improve program performance. Our survey data indicated a significant
relationship between those managers who reported they received training on
setting performance goals and those who used performance information when
setting or revising performance goals. However, federal agencies have not
consistently shown a commitment to investing in needed training and
development opportunities to help ensure that managers and employees have
the requisite skills and competencies to achieve agency goals.

The commitment to focusing on and using performance information needs to
extend to OMB and Congress as well. Through the administration's PMA and
PART initiatives, OMB has clearly placed greater emphasis on management
issues over the past several years. However, the focus of such oversight
needs to extend beyond the emphasis on formulating the President's Budget
to include an examination of the many challenges agencies face that may be
contributing to poor performance. In spite of the persistent weaknesses
we found in agencies' strategic plans and annual performance plans and
reports, OMB significantly reduced the scope of its guidance to agencies
on how to prepare these documents. By emphasizing a focus on resource
allocation through its PART exercise and providing less information on how
to comply with GPRA, OMB may be sending a message to agencies that
compliance with GPRA is not important. Without strong leadership from OMB,
the foundation of performance information that has been built could
deteriorate.

1U.S. General Accounting Office, Human Capital: Key Principles From Nine
Private Sector Organizations, GAO/GGD-00-28 (Washington, D.C.: Jan. 31,
2000).

OMB leadership is critical to addressing the continuing challenges
presented in GPRA implementation and the transformation of the federal
government to an increasingly results-oriented culture. OMB, as the
primary focal point for overall management in the federal government, can
provide the needed impetus by providing guidance, fostering communication
among agencies, and forming intragovernmental councils and work groups
tasked with identifying potential approaches and solutions to overcoming
the persistent challenges to results-oriented management.

Congress can also play a decisive role in fostering results-oriented
cultures in the federal government by using information on agency goals
and results at confirmation, oversight, authorization, and appropriation
hearings. Consistent congressional interest in the status of an agency's
GPRA efforts, performance measures, and uses of performance information to
make decisions will send an unmistakable message to agencies that
Congress expects GPRA to be thoroughly implemented.

We also found that timing issues may affect the development of agency
strategic plans that are meaningful and useful to top leadership. The
commitment of top leadership within agencies, OMB, and Congress is
critical to the success of strategic planning efforts. A strategic plan
should reflect the policy priorities of an organization's leaders and the
input of key stakeholders if it is to be an effective management tool.
However, GPRA specifies time frames for updating strategic plans that do
not correspond to presidential or congressional terms. As a result, an
agency may be required to update its strategic plan a year before a
presidential election and without input from a new Congress. If a new
president is elected, the updated plan is essentially moot and agencies
must spend additional time and effort revising it to reflect new
priorities. Our focus group participants, including GPRA experts, strongly
agreed that this timing issue should be addressed by adjusting time frames
to correspond better with presidential and congressional terms.

Addressing Governmentwide Needs

Mission fragmentation and program overlap are widespread throughout the
federal government.2 We have noted that interagency coordination is
important for ensuring that crosscutting program efforts are mutually
reinforcing and efficiently implemented. Our review of six agencies'
strategic and annual performance plans along with our previous work on
crosscutting issues has demonstrated that agencies still present their
goals and strategies in a mostly stovepiped manner. They have generally
not used their plans to communicate the nature of their coordination with
other agencies, in terms of the development of common or complementary
goals and objectives or strategies jointly undertaken to achieve those
goals.

We have also reported that GPRA could provide a tool to reexamine federal
government roles and structures governmentwide. GPRA requires the
President to include in his annual budget submission a federal government
performance plan. Congress intended that this plan provide a "single
cohesive picture of the annual performance goals for the fiscal year." The
governmentwide performance plan could help Congress and the executive
branch address critical federal performance and management issues,
including redundancy and other inefficiencies in how we do business. It
could also provide a framework for any restructuring efforts.
Unfortunately, this provision has not been fully implemented. Instead, OMB
has used the President's Budget to present high-level information about
agencies and certain program performance issues. The agency-by-agency focus
of the budget does not provide the integrated perspective of government
performance envisioned by GPRA.

If the governmentwide performance plan were fully implemented, it could
provide a framework for such congressional oversight. For example, in
recent years, OMB has begun to develop common measures for similar
programs, such as job training. By focusing on broad goals and objectives,
oversight could more effectively cut across organization, program, and
other traditional boundaries. Such oversight might also cut across
existing committee boundaries, which suggests that Congress may benefit
from using specialized mechanisms to perform oversight (i.e., joint
hearings and special committees).

A strategic plan for the federal government, supported by key national
indicators to assess the government's performance, position, and progress,
could provide an additional tool for governmentwide reexamination of
existing programs, as well as proposals for new programs. If fully
developed, a governmentwide strategic plan can potentially provide a
cohesive perspective on the long-term goals of the federal government and
provide a much needed basis for fully integrating, rather than merely
coordinating, a wide array of federal activities. Successful strategic
planning requires the involvement of key stakeholders. Thus, it could
serve as a mechanism for building consensus. Further, it could provide a
vehicle for the President to articulate long-term goals and a road map for
achieving them. In addition, a strategic plan can provide a more
comprehensive framework for considering organizational changes and making
resource decisions.

2GAO/AIMD-97-146.

Developing a strategic plan for the federal government would be an
important first step in articulating the role, goals, and objectives of
the federal government. It could help provide critical horizontal and
vertical linkages. Horizontally, it could integrate and foster synergies
among components of the federal government as well as help to clarify the
role of the federal government vis-à-vis other sectors of our society.
Vertically, it could provide a framework of federal missions and goals
within which individual federal agencies could align their own missions
and goals that would cascade down to individual employees. The development
of a set of key national indicators could be used as a basis to inform the
development of governmentwide strategic and annual performance plans. The
indicators could also link to and provide information to support
outcome-oriented goals and objectives in agency-level strategic and annual
performance plans.

Improving Usefulness of Performance Information

We have found that leading organizations that progressed the farthest to
results-oriented management did not stop after strategic planning and
performance measurement. They applied their acquired knowledge and data to
identify gaps in their performance, report on that performance, and
finally use that information to improve their performance to better
support their missions.

Performance data can have real value only if they are used to identify the
gap between an organization's actual performance level and the performance
level it has identified as its goal. Once the performance gaps are
identified for different program areas, managers can determine where to
target their resources to improve overall mission accomplishment. When
managers are forced to reduce their resources, the same analysis can help
them target reductions to minimize the impact on their organization's
overall mission.3
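
To make this gap analysis concrete, here is a minimal Python sketch; the
program areas and values are hypothetical, purely for illustration, and
not drawn from any agency's data:

    # Hypothetical program areas with goal and actual performance values.
    programs = {
        "claims processing timeliness": {"goal": 90, "actual": 78},
        "payment accuracy": {"goal": 99, "actual": 97},
        "customer satisfaction": {"goal": 85, "actual": 70},
    }

    # Performance gap: goal minus actual, by program area.
    gaps = {name: p["goal"] - p["actual"] for name, p in programs.items()}

    # Rank areas by gap size, largest first, to suggest where to target
    # resources for the greatest effect on overall mission accomplishment.
    for name, gap in sorted(gaps.items(), key=lambda kv: kv[1], reverse=True):
        print(f"{name}: {gap} points below goal")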

Under GPRA, agencies produce a single strategic plan, annual performance
plan, and annual performance report. However, there are many potential
consumers of agencies' performance information-Congress, the public, and
the agency itself. One size need not fit all. Clearly, an agency will need
more detailed information on its programs for operational purposes than
would be suitable for external audiences. Of the six agencies' performance
reports we reviewed, some provided useful summary tables or
information that provided overall snapshots of performance or highlighted
progress in achieving key goals. Other reports that lacked such a summary
made it difficult to assess the progress achieved.

To improve the prospect that agency performance information will be useful
to and used by these different users, agencies need to consider the
different information needs and how to best tailor their performance
information to meet those needs. For example, we have reported that,
although many information needs were met, congressional staff also
identified gaps in meeting their information needs.4 Key to addressing
these information gaps was improving communication between congressional
staff and agency officials to help ensure that congressional information
needs are understood, and that, where feasible, arrangements are made to
meet them. Improved two-way communication might also make clear what
information is and is not available, as well as what is needed and what is
not needed. This might entail the preparation of simplified and
streamlined plans and reports for Congress and other external users.

3U.S. General Accounting Office, Executive Guide: Effectively Implementing
the Government Performance and Results Act, GAO/GGD-96-118 (Washington,
D.C.: June 1, 1996).

4GAO/GGD-00-35.

Another challenge that limits the usefulness of agency performance reports
is the lack of timely data on performance. For the six performance reports
we reviewed, we continued to observe a significant number of goals for
which performance data were unavailable. Policy decisions made when
designing federal programs, particularly intergovernmental programs, may
make it difficult to collect timely and consistent national data. In
administering programs that are the joint responsibility of state and
local governments, Congress and the executive branch continually balance
the competing objectives of collecting uniform program information to
assess performance with giving states and localities the flexibility
needed to effectively implement intergovernmental programs.

Improving Performance Measures and Data Quality

Another key challenge to achieving a governmentwide focus on results is
that of developing meaningful, outcome-oriented performance goals and
collecting performance data that can be used to assess results.
Performance measurement under GPRA is the ongoing monitoring and reporting
of program accomplishments, particularly progress toward preestablished
goals. It tends to focus on regularly collected data on the level and type
of program activities, the direct products and services delivered by the
programs, and the results of those activities. For programs that have
readily observable results or outcomes, performance measurement may
provide sufficient information to demonstrate program results. In some
programs, however, outcomes are not quickly achieved or readily observed,
or their relationship to the program is uncertain. In such cases, more
in-depth program evaluations may be needed, in addition to performance
measurement, to examine the extent to which a program is achieving its
objectives.

Given the difficult measurement challenges we have identified, it is all
the more important that agency strategic planning efforts include the
identification of the most critical evaluations that need to take place to
address those challenges. However, our previous work has raised concerns
about the capacity of federal agencies to produce evaluations of program
effectiveness.5 Few of the agencies we reviewed deployed the rigorous
research methods required to attribute changes in underlying outcomes to
program activities. Yet we have also seen how some agencies have
profitably drawn on systematic program evaluations to improve their
measurement of program performance or understanding of performance and how
it might be improved.6 Our review of six agencies' strategic plans and
performance reports in this report revealed weaknesses in their
discussions of program evaluation. Most of the strategic plans lacked
critical information required by GPRA, such as a discussion of how
evaluations were used to establish strategic goals or a schedule of future
evaluations. Furthermore, two of the six performance reports did not
summarize the results of program evaluations completed that year, as
required.

Our work has also identified substantial, long-standing limitations in
agencies' abilities to produce credible data and identify performance
improvement opportunities that will not be quickly or easily resolved.7
According to our review, five of six agencies' annual performance plans
showed meaningful improvements in how they discussed the quality of
performance data. However, only DOT's performance plan and report
contained information that provided a full level of confidence in the
credibility of its performance data. In particular, the plans and reports
did not always provide detailed information on how the agencies verified
and validated their performance data.

5U.S. General Accounting Office, Program Evaluation: Agencies Challenged
by New Demand for Information on Program Results, GAO/GGD-98-53
(Washington, D.C.: Apr. 24, 1998).

6U.S. General Accounting Office, Program Evaluation: Studies Helped
Agencies Measure or Explain Program Performance, GAO/GGD-00-204
(Washington, D.C.: Sept. 28, 2000).

7GAO/GGD-00-52.

Recommendations for Executive Action

To provide a broader perspective and more cohesive picture of the federal
government's goals and strategies to address issues that cut across
executive branch agencies, we recommend that the Director of OMB fully
implement GPRA's requirement to develop a governmentwide performance plan.

To achieve the greatest benefit from both GPRA and PART, we recommend that
the Director of OMB articulate and implement an integrated and
complementary relationship between the two. GPRA is a broad legislative
framework that was designed to be consultative with Congress and other
stakeholders, and allows for varying uses of performance information. PART
looks through a particular lens for a particular use-the executive budget
formulation process.

To improve the quality of agencies' strategic plans, annual performance
plans, and performance reports and help agencies meet the requirements of
GPRA, we recommend that the Director of OMB provide clearer and more
consistent guidance to executive branch agencies on how to implement GPRA.
Such guidance should include standards for communicating key performance
information in concise as well as longer formats to better meet the needs
of external users who lack the time or expertise to analyze lengthy,
detailed documents.

To help address agencies' performance measurement challenges, we recommend
that the Director of OMB engage in a continuing dialogue with agencies
about their performance measurement practices with a particular focus on
grant-making, research and development, and regulatory functions to
identify and replicate successful approaches agencies are using to measure
and report on their outcomes, including the use of program evaluation
tools. Additionally, we recommend that the Director of OMB work with
executive branch agencies to identify the barriers to obtaining timely
data to show progress against performance goals and the best ways to
report information where there are unavoidable lags in data availability.
Interagency councils, such as the President's Management Council and the
Chief Financial Officers' Council, may be effective vehicles for working
on these issues.

To facilitate the transformation of agencies' management cultures to be
more results-oriented, we recommend that the Director of OMB work with
agencies to ensure they are making adequate investments in training on
performance planning and measurement, with a particular emphasis on how to
use performance information to improve program performance.

Matters for Congressional Consideration

To ensure that agency strategic plans more closely align with changes in
the federal government leadership, Congress should consider amending GPRA
to require that updates to agency strategic plans be submitted at least
once every 4 years, 12-18 months after a new administration begins its
term. Additionally, consultations with congressional stakeholders should
be held at least once every new Congress and interim updates made to
strategic and performance plans as warranted. Congress should consider
using these consultations along with its traditional oversight role and
legislation as opportunities to clarify its performance expectations for
agencies. This process may provide an opportunity for Congress to develop
a more structured oversight agenda.

To provide a framework to identify long-term goals and strategies to
address issues that cut across federal agencies, Congress also should
consider amending GPRA to require the President to develop a
governmentwide strategic plan.

Agency Comments

We provided a copy of the draft report to OMB for comment.
OMB's written comments are reprinted in appendix VIII. In general, OMB
agreed with our findings and conclusions. OMB agreed to implement most of
our recommendations, noting that these recommendations will enhance its
efforts to make the government more results oriented. OMB agreed to (1)
work with agencies to ensure they are provided adequate training in
performance management, (2) revise its guidance to clarify the integrated
and complementary relationship between GPRA and PART, and (3) continue to
use PART to improve agency performance measurement practices and share
those practices across government.

In response to our recommendation that OMB fully implement GPRA's
requirement to develop a governmentwide performance plan, OMB stated that
the President's Budget represents the executive branch's governmentwide
performance plan. However, the agency-by-agency focus of the budget over
the past few years does not provide an integrated perspective of
government performance, and thus does not meet GPRA's requirement to
provide a "single cohesive picture of the annual performance goals for the
fiscal year." In response to our matter for congressional consideration
that Congress should consider amending GPRA to
require the President to develop a governmentwide strategic plan, OMB
noted that the budget serves as the governmentwide strategic plan.
However, in our opinion, the President's Budget focuses on establishing
agency budgets for the upcoming fiscal year. Unlike a strategic plan, it
provides neither a long-term nor an integrated perspective on the federal
government's activities. A governmentwide strategic plan should provide a
cohesive perspective on the long-term goals of the federal government and
provide a basis for fully integrating, rather than primarily coordinating,
a wide array of federal activities.

We provided relevant sections of the draft report to Education, DOE, HUD,
SBA, SSA, and DOT. Education and SBA did not provide any comments, while
DOT provided minor technical comments. Written comments from DOE, HUD, and
SSA are reprinted in appendixes IX, X, and XI, respectively, along with
our responses.

DOE disagreed with portions of our analyses of its 2004 Annual Performance
Plan and its 2002 Performance and Accountability Report. Our analysis of
DOE's documents was based on specific criteria (see appendixes IV and V
for details) and was characterized in relation to our reviews of the other
five agencies' documents. We modified or clarified certain
characterizations in response to DOE comments, but for the most part found
that our characterizations were appropriate.

SSA generally agreed with our observations and agreed to incorporate them
in its future planning efforts. SSA made several points of clarification
and disagreed with our observation that its performance and accountability
report does not clearly state how program evaluations were used to answer
questions about program performance and results and how they can be
improved. SSA noted that its evaluations rely on surveys, and these
surveys form the basis for its efforts to deliver high-quality service.
SSA also noted that it lists other evaluations that are of great
importance to its ongoing operations. We do not discount the usefulness of
SSA's surveys in assessing its day-to-day management of programs. Rather,
we believe that it would be helpful for SSA to clearly identify the range
of evaluations conducted and how each of them contributed to improved
program performance.

HUD noted that all of the areas we suggested for further improvement are
under consideration. However, HUD disagreed with us on
two observations related to the strategic plan: (1) that the link between
its long-term and intermediate goals is difficult to discern and (2) that it
did not explain how it used the results of program evaluations to update
the current plan and did not include a schedule for future evaluations. On
the basis of OMB guidance for preparing strategic plans and the criteria
we used to evaluate all six agencies' strategic plans (see app. III for
more detail), we maintain that these two observations are valid and
require further attention. HUD also disagreed with how we presented the
performance information in its summary report cards (see fig. 22). HUD
noted that many of the results were explained in the individual indicator
write-ups that followed the summary information. Our analysis of HUD's
information included qualitative aspects of how the information was
presented, such as its usefulness to inform the average reader with little
or no exposure to the subject matter, and the extent to which HUD
presented a complete summary of performance information in a user-friendly
format.

Technical comments from DOE, HUD, and SSA were incorporated, as
appropriate.

Appendix I

                       Objectives, Scope, and Methodology

As agreed with your offices, our objectives for this report were to
determine (1) the effect of the Government Performance and Results Act
(GPRA) over the last 10 years in creating a governmentwide focus on
results and the government's ability to deliver results to the American
public, including an assessment of changes in the overall quality of
agencies' strategic plans, annual performance plans, and annual
performance reports; (2) the challenges that agencies face in measuring
performance and using performance information in management decisions; and
(3) how the federal government can continue to shift toward a more
results-oriented focus. To meet our objectives, we collected
governmentwide data to assess the government's overall focus on results.
We conducted a governmentwide survey of federal managers, focus groups
with federal managers and GPRA experts, and interviews with top appointed
officials. We identified and reviewed previously published reports on
GPRA. Finally, we selected a sample of agencies to review for changes in
the quality of their strategic plans, performance plans, and performance
reports since their initial efforts.

We conducted our work from January through November 2003 in Washington,
D.C., in accordance with generally accepted government auditing standards.
We provided drafts of the relevant sections of this report to officials
from each of the agencies whose GPRA reports we reviewed. We also provided
a draft of this report to OMB.

Methodology for Governmentwide Survey

A Web-based questionnaire on performance and management issues was
administered to a stratified random probability sample of 800 persons from
a population of approximately 98,000 mid-level and upper-level civilian
managers and supervisors working in the 24 executive branch agencies
covered by the Chief Financial Officers Act of 1990 (CFO). The sample was
drawn from the Office of Personnel Management's (OPM) Civilian Personnel
Data File as of December 31, 2002, using file designators indicating
performance of managerial and supervisory functions.

The questionnaire was designed to obtain the observations and perceptions
of respondents on various aspects of GPRA as well as such results-oriented
management topics as the presence, use, and usefulness of performance
measures, hindrances to measuring and using performance information, and
agency climate. Most of the items on the questionnaire were closed-ended,
meaning that depending on the particular item, respondents could choose
one or more response categories or rate the strength of their perception
on a 5-point extent scale. Almost all the items on this questionnaire were
asked in two earlier mail-out surveys. One survey was conducted between
November 1996 and January 1997 as part of the work we did in response to a
GPRA requirement that we report on implementation of the act. The other
survey was conducted between January and August 2000.1

This survey covered the same CFO Act agencies and was designed to update
the results from the two earlier surveys. Similar to the two earlier
surveys, the sample was stratified by whether the manager or supervisor
was Senior Executive Service (SES) or non-SES. The management levels
covered general schedule (GS), general management (GM), or equivalent
schedules at levels comparable to GS/GM-13 through career SES or
equivalent levels of executive service. The sample also included the same
or equivalent special pay plans that were covered in our 2000 survey,
e.g., Senior Foreign Service executives.

We sent an e-mail to members of the sample that notified them of the
survey's availability on the GAO Web site and included instructions on how
to access and complete the survey. Members of the sample who did not
respond to the initial notice were sent up to two subsequent reminders
asking them to participate in the survey. The survey was administered from
June through August 2003.

During the course of the survey, we deleted 26 persons from our sample who
had retired, separated, died, or otherwise left the agency, or were
otherwise excluded from the population of interest. We
received useable questionnaires from 503 sample respondents, about 65
percent of the eligible sample. The eligible sample includes 39 persons
whom we were unable to locate and therefore could not ask to participate
in the survey.

1For information on the design and administration of the two earlier
surveys, see GAO/GGD-97-109, GAO-01-127, and U.S. General Accounting
Office, Managing For Results: Federal Managers' Views on Key Management
Issues Vary Widely Across Agencies, GAO-01-592 (Washington, D.C.: May 25,
2001).

To assess whether the views of those individuals who chose not to
participate in our survey might differ from the views of those who did, we
attempted to administer a brief survey over the telephone to individuals
who still had not responded a month or more after the survey became
available to them, despite two reminders following the initial
notification e-mail. This telephone survey
consisted of four items from the full survey. There were 58 persons who
agreed to answer these questions over the telephone. This was 41 percent
of those individuals who had not responded at the time we attempted to
contact them for the purpose of asking these four questions.

We analyzed the responses of this group on the four selected items
compared to the responses received from all other respondents. Our
analyses of the items showed very few differences between nonresponders
and responders. There was no sufficient or consistent pattern of
responding that would warrant a conclusion that the views of nonresponders
were notably different than responders. The responses of each eligible
sample member who provided a useable questionnaire were subsequently
weighted in the analysis to account statistically for all the members of
the population.
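
As a hedged illustration of this weighting step, the sketch below assumes
a simple stratified design in which each respondent represents N/n members
of his or her stratum. Only the roughly 98,000-manager population and
503-respondent totals come from this appendix; the SES/non-SES split shown
is hypothetical:

    # Design weights for a stratified sample: each respondent represents
    # N_h / n_h members of stratum h. The stratum-level figures below are
    # hypothetical; only the totals (98,000 and 503) come from this report.
    strata = {
        "SES": {"N": 6_000, "n": 150},
        "non-SES": {"N": 92_000, "n": 353},
    }

    weights = {name: s["N"] / s["n"] for name, s in strata.items()}
    for name, w in weights.items():
        print(f"{name}: each respondent represents about {w:.0f} managers")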

The overall survey results are generalizable to the population of managers
as described above at the CFO Act agencies. All results are subject to
some uncertainty or sampling error as well as nonsampling error. As part
of our effort to reduce nonsampling sources of error in survey results, we
checked and edited (1) the survey data for responses that failed to follow
instructions and (2) the programs used in our analyses. In general,
percentage estimates in this report for the entire sample have confidence
intervals ranging from about +/-4 to +/-11 percentage points at the 95
percent confidence level. In other words, if all CFO Act agency managers
and supervisors in our population had been surveyed, the chances are 95
out of 100 that the result obtained would not differ from our sample
estimate in the more extreme cases by more than +/-11 percentage points.
Appendix VI shows the questions asked with the weighted percentage of
managers responding to each item.
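
As a rough sketch of where interval widths of this kind come from
(ignoring the stratification and weighting, which change the precision), a
95 percent confidence interval for an estimated proportion can be
approximated as follows:

    import math

    def approx_ci(p, n, z=1.96):
        """Approximate 95 percent confidence interval for a proportion p
        estimated from n responses, assuming simple random sampling."""
        half_width = z * math.sqrt(p * (1 - p) / n)
        return p - half_width, p + half_width

    # Example: an estimate of 31 percent from 503 respondents.
    low, high = approx_ci(0.31, 503)
    print(f"95% CI: {low:.3f} to {high:.3f}")  # about +/-4 percentage points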

Because a complex sample design was used in the current survey as well as
the two previous surveys and different types of statistical analyses are
being done, the magnitude of sampling error will vary across the
particular surveys, groups, or items being compared due to differences in the
underlying sample sizes and associated variances. The number of
participants in the current survey is only about one fifth of the number
in the 2000 survey (2,510) and slightly more than half of those in the
first survey (905). The 2000 survey was designed with a larger sample than
the other two surveys in order to provide estimates for each individual
agency as well as all the CFO Act agencies collectively. Consequently, in
some instances, a difference of a certain magnitude may be statistically
significant. In other instances, depending on the nature of the comparison
being made, a difference of equal or even greater magnitude may not
achieve statistical significance. For example, when comparing a result
from the current survey to the larger 2000 survey with its relatively
smaller confidence interval, a difference of a certain magnitude may be
significant. However, when comparing the current survey with the first
survey, that difference may not be significant given the greater
imprecision in the estimates due to both surveys' smaller sample sizes. We
note throughout the report when differences are significant at the .05
probability level.
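
To illustrate why the same difference can be significant against the
larger 2000 survey but not the smaller 1997 survey, consider a minimal
two-proportion z-test sketch; the sample sizes are those given above,
while the percentages are illustrative only:

    import math

    def two_prop_z(p1, n1, p2, n2):
        """z statistic for the difference between two independent
        proportions; |z| > 1.96 is significant at the .05 level."""
        pooled = (p1 * n1 + p2 * n2) / (n1 + n2)
        se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
        return (p1 - p2) / se

    # The same 5-point difference clears 1.96 against the 2000 survey
    # (n = 2,510) but falls just short against the 1997 survey (n = 905).
    print(two_prop_z(0.36, 503, 0.31, 2510))  # about 2.2 -> significant
    print(two_prop_z(0.36, 503, 0.31, 905))   # about 1.9 -> not significant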

Methodology for Focus Groups

We held a series of focus groups as one of our methods for obtaining
information about the accomplishments and challenges agencies face in
implementing and overseeing GPRA-related activities. Focus groups are a
form of qualitative research in which a specially trained leader, a
moderator, meets with a small group of people (usually 8 to 10) who are
knowledgeable about the topics to be discussed.

In all, we conducted eight focus groups-one with experts on GPRA and
performance management and seven with federal managers. For our focus
group with experts, we invited individuals from the private sector,
academia, the National Academy of Public Administration, and OMB. These
individuals were involved either in drafting GPRA, overseeing its
implementation, or studying and critiquing implementation, from outside
government. Out of the 14 experts we invited, a total of 11 attended the
focus group.

For our focus groups with agency managers, we obtained a list of potential
participants by contacting all 24 CFO Act agencies
and requesting that they submit a list of candidates and their profiles
based on the following criteria: federal managers (1) in the GS-13 pay
grade and above, including members of the SES; (2) having at least 3 years
of managerial experience; (3) currently located in the Washington, D.C.,
metropolitan area; (4) having hands-on experience with GPRA or performance
management;2 and (5) representing both departments and their component
bureaus. We received profiles of candidates from all agencies; however, no
managers from OPM chose to participate in the focus groups.

To select focus group participants, we reviewed the profiles submitted by
agencies and selected candidates with diverse experience who held a
variety of different positions within the agency in order to capture a
broad range of perspectives. For example, we invited a comptroller; a
deputy director of management, administration, and planning; budget
directors; budget officers; management analysts; and program managers;
among others. We contacted the candidates and provided them with the list
of questions to be discussed at the focus group in advance so they would
be aware of our interests and be better able to provide us, where
possible, with examples to illustrate their responses to our questions.
Out of 104 agency officials we invited, 70 participated in the focus
groups.3

2For example, candidates could be operations managers with hands-on
experience managing a federal program or agency officials directly
involved in carrying out the activities required under GPRA, such as
developing a strategic or annual performance plan or annual performance
report.

3Due to last-minute circumstances, a federal manager participated via
teleconference from an agency's field office and another was unable to
attend the focus group, but mailed his answers to questions we sent to all
participants in advance of the focus groups.

During each session, the moderator explained the scope of our work and
elaborated on how the focus groups were one of several methods we were
using to collect information relevant to our objectives. As part of the
focus group process, the moderator asked participants at each session to
identify the main accomplishments and challenges that, in their view,
could be attributed to GPRA and to mention possible solutions to these
challenges. During the sessions, we created lists of the accomplishments,
challenges, and solutions identified by group participants and posted
these lists around the meeting room. Participants were then asked to
review the lists and vote on the three most important accomplishments and
the top three challenges.4

To organize the information collected during the focus groups, we reviewed
the statements made by participants in response to our questions. We
identified related sets of statements and summarized them as a general
theme. We noted how often a theme was expressed both within and across
each focus group. We also examined the number of votes each posted
statement obtained. Our analysis focused on those themes that were
supported by statements that obtained a high number of votes and were
mentioned frequently within and across the majority of focus groups.
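
A minimal sketch of this tallying step follows; the statements, themes,
and vote counts are hypothetical stand-ins for the focus group data:

    from collections import Counter

    # Each tuple: (participant statement, coded theme, votes received).
    statements = [
        ("appropriators do not use performance data", "congressional use", 9),
        ("turf battles between agencies", "coordination", 7),
        ("GPRA does not drive policy decisions", "congressional use", 6),
    ]

    mentions = Counter(theme for _, theme, _ in statements)
    votes = Counter()
    for _, theme, n in statements:
        votes[theme] += n

    # Themes mentioned often and drawing many votes received the most weight.
    for theme, count in mentions.most_common():
        print(f"{theme}: {count} statements, {votes[theme]} votes")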

The focus group results discussed in this report are summary descriptions
reflecting the range of views and perceptions held by employees,
supervisors, or project managers who participated in the focus groups.
Although we cannot assume all federal managers share these views, the
extent to which certain opinions or perceptions were repeatedly expressed
or endorsed by many participants from multiple focus groups provides a
rough gauge of the significance of these views.

Methodology for Interviews with Political Appointees

To obtain an additional perspective from top political managers of federal
agencies on GPRA, we held telephone or in-person interviews with 10
high-level officials serving under political appointments with CFO Act
agencies. Five former officials from the Clinton administration and five
serving under the current Bush administration were interviewed. For
example, we interviewed deputy secretaries, chief financial officers, and
deputy assistant secretaries for management. We asked them to provide
their perspective on the main accomplishments or other effects of GPRA,
the key challenges to implementation, and possible improvements to GPRA.
We summarized the interviewees' answers and identified recurring themes or
observations for our analysis.

4We read the list of comments to the manager who participated via
teleconference and voted on his behalf based on his preferences.

Methodology for Selecting Agencies to Review for Changes in the Quality of
Their Strategic Plans, Annual Performance Plans, and Annual Performance
Reports

To address how the quality of agency strategic plans, performance plans,
and performance reports has changed since agencies' initial efforts, we
reviewed a sample of six agencies' current strategic plans, annual
performance plans, and annual performance reports and compared the results
to our findings from prior reviews of the agencies' initial efforts in
producing these documents. We did not independently verify or assess the
information we obtained from agency plans and reports. If an agency chose
not to discuss its efforts concerning elements in the plans and reports,
this does not necessarily mean that the agency is not implementing those
elements.

We selected the departments and agencies to review based on the extent to
which they collectively represented the full range of characteristics in
the following four areas: (1) agency size (small, medium, large); (2)
primary program types (direct service, research, regulatory, transfer
payments, and contracts or grants); (3) quality of the fiscal year 2000
performance plan based on our previous review (low, medium, high);5 and
(4) type of agency (cabinet department or independent agency).

Based on these characteristics, we selected the following departments and
agencies:

o  Department of Education (Education),

o  Department of Energy (DOE),

o  Department of Housing and Urban Development (HUD),

o  Small Business Administration (SBA),

o  Social Security Administration (SSA), and

o  Department of Transportation (DOT).

5GAO/GGD/AIMD-99-215. Based on how we had rated agencies' annual
performance plans on their picture of performance, specificity of
strategies and resources, and the degree of confidence that performance
information would be credible, we assigned numeric values to each agency's
rating (e.g., clear=3, general=2, limited=1, unclear=0) and summed the
values to determine overall quality. A plan was considered high quality if
its score was 7-9, medium quality if 5-6, and low quality if 3-4. No
agency received a score lower than 3.
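
The arithmetic in note 5 can be illustrated with a minimal sketch (in
Python; the example ratings are hypothetical and are not our actual
assessments of any agency):

    # Minimal sketch of the plan-quality scoring described in note 5.
    # The example ratings below are hypothetical illustrations only.
    RATING_VALUES = {"clear": 3, "general": 2, "limited": 1, "unclear": 0}

    def plan_quality(ratings):
        """Sum the three dimension ratings and map the total to a tier."""
        score = sum(RATING_VALUES[r] for r in ratings)
        if score >= 7:
            return score, "high"    # scores of 7-9
        if score >= 5:
            return score, "medium"  # scores of 5-6
        return score, "low"         # scores of 3-4; none scored below 3

    # A plan rated clear (3), general (2), and general (2) scores 7: high.
    print(plan_quality(["clear", "general", "general"]))  # (7, 'high')
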
Table 4 shows the characteristics represented by each of these agencies.

Table 4: Summary of Characteristics of Agencies Selected for Review of
Strategic Plans, Annual Performance Plans, and Annual Performance Reports

Agency          Sizea: full-time       Research  Direct   Regulatory  Transfer  Grants/    Quality of fiscal
                equivalent positions             service              payments  contracts  year 2000
                                                                                           performance plans

Departments
Education       4,756 (small)          X         X        X           X         X          Medium
DOE             16,067 (medium)        X                  X                     X          Low
HUD             10,752 (medium)                  X        X           X         X          Medium
DOT             135,022b (large)       X         X        X                     X          High

Agencies
SBA             4,005 (small)                    X        X                     X          Low
SSA             64,418 (large)                   X                    X                    High

Source: GAO.

aThe size of the agencies is based on data from December 2002.

bIn March 2003, the U.S. Coast Guard and the Transportation Security
Administration (TSA) were transferred from DOT to the Department of
Homeland Security. According to the fiscal year 2005 President's Budget,
in fiscal year 2003, the Coast Guard and TSA had 43,702 and 57,324
full-time equivalent positions, respectively.

A more detailed discussion of the criteria we used to assess the quality
of the agencies' planning and reporting documents and the results of our
review is contained in appendixes III, IV, and V.

Appendix II

Focus Group Participants Agreed GPRA Provides a Framework for Federal
Agencies to Become More Results Oriented

While GPRA's goal is to make the federal government more results oriented,
work carried out in support of this effort, such as planning activities,
implementing programs, reporting on outcomes, and evaluating performance,
generally lies in the hands of federal managers. To get a better
appreciation for the main accomplishments and challenges agencies face in
implementing and overseeing GPRA-related activities, we conducted seven
focus groups composed of federal managers from 23 CFO Act agencies and an
eighth focus group composed of 11 experts on GPRA and performance
management and budgeting.

For our focus groups, we asked participants to discuss (1) the key
accomplishments of GPRA to date, (2) whether GPRA has created a focus on
achieving results across the federal government, (3) the effect of GPRA on
the government's ability to deliver results to the American public, (4)
the persistent and prevalent challenges agencies face in implementing and
overseeing GPRA-related activities, and (5) suggestions to address these
challenges.

We recorded the views expressed by participants and categorized them into
themes that were most commonly expressed or endorsed both within and
across the groups. The focus group results discussed in this report are
organized according to the themes we identified and are summary
descriptions reflecting the range of views and perceptions expressed by
the experts, supervisors, and project managers. A more complete discussion
of our scope and methodology can be found in appendix I.

Focus group participants indicated that GPRA has helped to make the
federal government more results oriented. However, they noted that a
number of obstacles have made GPRA implementation challenging, such as
difficulty in establishing results-oriented goals and measuring
performance, addressing frequently changing priorities resulting from
changes in administration, and a lack of top leadership support for GPRA.
On balance, participants generally perceive the information contained in
GPRA reports to be important and useful; however, they do not believe that
lawmakers use this information to make resource decisions or conduct
oversight. To address these problems and concerns, focus group
participants stated that, among other things, Congress should provide
guidance to agencies on how to make GPRA reports more useful, OMB should
reinforce GPRA's value as a management tool, and agencies need to commit
the resources necessary to carry out GPRA-related activities.

GPRA Accomplishments

Overall, focus group participants stated that GPRA has had a positive
effect on federal agencies' efforts to become more results oriented. Based
in statute, GPRA has created a framework for agencies to focus on
achieving results by requiring them to establish program goals and
objectives, develop performance indicators, and measure the extent to
which they have made progress towards achieving program goals. As a
result, federal managers noted that they have been increasingly able to
view their programs in terms of outcomes, not outputs, and have been
generally learning how to use this framework as a management tool.
Participants also attributed a series of cultural changes within federal
agencies to GPRA, where problem solving, creative thinking, and agencywide
discussions on budget and performance have become more common. The
strategic plans, annual performance plans, and performance reports that
federal agencies are required to submit to OMB and Congress under GPRA
have increased the transparency of government activities. These documents have
also helped agencies justify their budget requests based on their
performance.

Creating a Framework in Statute and a Management Tool for Agency
Leadership

Participants agreed that GPRA created a framework in statute for federal
agencies to plan their activities in order to become more results oriented
and provided a managerial tool for program accountability. Using this
framework, agencies can develop and focus on strategies to carry out the
programs they administer; set goals and identify performance indicators
that will inform them whether or not they achieved the performance they
expected; and determine what impact, if any, their programs have had on
the American public. According to the experts in one of our focus groups,
comparing federal agencies' current mission statements contained in their
strategic plans to what they were in the past demonstrates that agencies
have done some "soul searching" to get a better sense of what their role
is (or should be) and how they can achieve it. Given that GPRA is in
statute, the use of this planning framework is likely to be sustained
within agencies.

Participants also mentioned that GPRA has encouraged federal managers to
view their programs in terms of results, not just inputs and outputs. Such
a change is important, as it has encouraged federal managers to reflect on
the statutory intent of their programs and use this framework as a
management tool for establishing accountability within their programs.

Cultural Change within Federal Agencies

Participants in the majority of focus groups agreed that GPRA has been a
driving force behind many cultural changes that have occurred within
federal agencies. Highlighting the focus on results, participants stated
that GPRA has stimulated a problem-solving approach within federal
agencies and encouraged them to think creatively when developing
performance indicators for their programs. GPRA has also changed the
dialogue within federal agencies; front-line managers and staff at lower
levels of the organization now discuss budget issues in connection with
performance. Similarly, experts noted that performance management and
resource investments are more frequently discussed between agency
officials and Congress than in the past. Within agencies, GPRA documents
can provide a context of missions, goals, and strategies that political
appointees can use to articulate agencies' priorities.

Increased Transparency of Government Results

Some participants agreed that GPRA has increased federal agencies' ability
to present their results to the American public, benefiting both
stakeholders and agency staff. GPRA reports enable
federal agencies to communicate the results of government programs and
activities to a broad public. For example, GPRA reports are available on
agencies' Web sites and provide information to OMB, Congress, and the
American public on what agencies plan to do, how they plan to do it, and,
as summarized in the performance and accountability reports, what agencies
have accomplished and how much money it cost them to do it.

Similarly, some participants agreed that GPRA allows federal employees to
see exactly how their work can produce a positive outcome, increasing
employee morale. Using information contained in GPRA reports, agency
employees can compare department goals to the results of their activities,
and see how their work contributes to these goals. For example, a focus
group participant from the Indian Health Service in California stated that
he was pleased to learn that health care indicators of some Native
American tribes had already exceeded levels originally projected by his
agency to be reached by the year 2010.

Link between Budget and Performance

Participants also agreed that the GPRA framework has had a positive effect
on agencies' ability to link their performance to their budget. By
focusing on their mission and outcomes, agencies are learning to
prioritize activities and align their resources to ensure that they will
be able to achieve results. For example, a participant stated that the
National Wild Horse and Burro
Program, managed by the Bureau of Land Management in the U.S. Department
of the Interior, recently developed a model for its GPRA-related work that
divided the program into discrete components, identifying the results it
could accomplish in the short and long term and specifying what additional
resources were needed. According to the participant, the program received
additional funding based on this needs assessment.

In addition, a few managers noted that the link between budget and
performance has given agencies an incentive to commit the resources
necessary to modernize information systems. Having the right information
on time enables agencies to integrate budget requests and performance
information in ways that are more meaningful. This information also
increases Congress's ability to make informed budget decisions based on
agency performance.

Views on Delivering Results to the American Public Were Mixed

Participants' views on whether GPRA has helped agencies deliver results to
the American public were generally mixed. Some federal managers in our
focus groups agreed that GPRA has had a positive effect on raising
awareness of many issues, and that in and of itself is a way of delivering
results. The information gathered and reported for GPRA allows agencies to
make better-informed decisions, which improves their ability to achieve
results. Of note, GPRA has allowed agencies to move towards
performance-based budgeting, which helps agencies identify resources
available to use in programs where outcomes can be achieved. For example,
programs and expenses that do not add value to the agency's mission could
be eliminated. Having performance data readily available is another key
area where GPRA has enabled agencies to deliver results to the American
public.

Other participants stated that while certain aspects of GPRA-related work
have been positive, agencies' ability to deliver results and public
awareness of their activities cannot always be exclusively attributed to
GPRA. For example, while measuring performance is a move in the right
direction, GPRA reports provide too many indicators and it is unclear how
this has led to better performance among federal agencies. A few
participants also stated that agencies deliver results in ways that the
American public does not fully recognize. For example, a participant
stated that agencies working in the area of international relations
generally lack recognition for the results of their activities, as the
American public is generally unfamiliar with the nuances of foreign policy
and all the programs the U.S. government finances overseas. And while GPRA
has helped these agencies
prioritize their work to achieve results, it is unclear that GPRA has
improved the visibility and understanding in the public eye of what these
agencies do because the American public does not see the results of their
work. A participant stated that research-based agencies have also used
GPRA to plan activities that benefit the American public in ways the
public is not fully aware of. For example, the National Aeronautics and
Space Administration's (NASA) space program includes, among other things,
predicting whether an asteroid will strike the earth, although on a daily
basis the American public is probably not worried about such a rarity.

For other participants, the link between GPRA and agencies' service
delivery was not clear. Participants characterized GPRA-related activities
as time-consuming and said there was no real evidence that this work has
improved their ability to deliver results to the American public. Other
participants stated that many agencies rely on grant recipients to carry
out their work, and delivering results to the American public depends, to
a large extent, on the diligence of these organizations to implement their
programs. Overall, they held the view that performance would not change
dramatically if GPRA were no longer a requirement for federal agencies.

Alternate Views on GPRA's Effect

Participants in one of our focus groups stated that GPRA, per se, had not
led federal agencies to achieve specific accomplishments. These
participants believed that managers' initiative, not the framework
established by GPRA, has been key to improving agencies' planning efforts
and focus on results. A few participants also mentioned that the results
framework established by GPRA is somewhat intangible unless managers can
use it effectively; without adequate infrastructure to implement GPRA,
an agency's compliance with the law is simply paperwork, as GPRA does not
allow for a systematic and thorough approach to performance management.
For example, while agencies can develop performance indicators to gauge
progress towards a program goal, they often encounter problems in
collecting relevant data and measuring the agencies' contribution to a
specific outcome. In addition, agencies are not able to make changes to
the programs they manage, limiting their ability to deliver results.

Challenges in Implementing and Overseeing GPRA Activities

Participants stated that agencies face significant challenges in complying
with GPRA-related activities. Focus group participants also agreed
Congress does not appear to use agencies' performance information when
making budget decisions. In carrying out GPRA-related activities, managers
find it difficult to identify performance indicators and establish program
goals that are results oriented, as required by GPRA. In addition, a few
participants stated that they lack the support from senior management to
work on GPRA activities. Changes in administration also tend to disrupt
progress made by agencies in support of GPRA.

Performance Information Not Used

Participants strongly believed that Congress does not take into account
agencies' performance information when overseeing agency activities and
making budget decisions. As a result, there is a negative effect on how
agencies view their efforts in producing GPRA plans and reports-many
believe that these efforts are less worthwhile than the effort and
resources invested. Moreover, participants perceive the budget process
strictly as a political exercise, and it is unclear how useful performance
information can be in this context.

Complexity of Establishing Results-Oriented Goals and Measuring
Performance

Participants stated that establishing results-oriented goals and
identifying performance indicators are generally complex undertakings.
Participants agreed that they often feel as if they were trying to measure
the immeasurable, lacking a clear understanding of which performance
indicators could accurately show how well the agency is carrying out a
specific activity. And while agencies generally try to improve their
indicators from one year to the next, in doing so they generally lose the
ability to develop trend information to track progress over time.

Participants also mentioned that there appears to be a disconnect between
some federal programs that generally produce results over a longer time
period and GPRA's requirement that agencies report annually on their
progress towards achieving their goals. Participants stated that federal
programs, especially those that are research-based, often take years to
achieve the full scope of their work. Consequently, and for reasons that
range from lack of performance data to an absence of short-term outcomes,
it could appear as though resources invested in carrying out their
activities led to no results in the short run.

Focus group participants generally agreed that in cases where third
parties, such as states or localities, implement federal programs, some
agencies face challenges in obtaining timely performance data from
relevant partner organizations. In some instances, federal managers have
trouble identifying what the government's contribution has been to a
specific outcome. While the experts generally attributed this to agencies
not doing a good job at setting appropriate goals and the corresponding
measures and objectives not being clear enough, managers stated that their
lack of control over grantees and the strict reporting deadlines imposed
by GPRA were factors that worked against their efforts to deliver results.

Managing Strategically with Frequently Changing Priorities

Some participants stated that it is difficult for them to manage
strategically, given the frequently changing priorities that come with
changes in administrations. While GPRA requires an agency to develop a
strategic plan at least every 3 years to cover the following 5-year
period, participants agreed that it makes little sense to update it
shortly before a new administration is scheduled to take office. In
addition, changes in political leadership generally result in a new agenda
with new objectives. These changes force agencies to revise their plans,
management initiatives, and strategies, which translates into additional
GPRA-related work, generally characterized by focus group participants as
a burden agency staff must add to their normal workload.

Lack of Top Leadership Support

Some participants stated that they often encounter resistance from agency
leadership to endorsing GPRA-related activities. In some instances,
senior-level support for GPRA falters or is nonexistent. Participants
attributed this to a reluctance within some federal agencies to think
about outcomes and performance. Some focus group participants stated that
in some instances high-level managers are somewhat averse to being held
accountable for the results of the programs they run.

Suggestions to Address GPRA Challenges

To address these challenges, focus group participants made the following
suggestions.

Congress Should Clarify How Performance Information Could Be More Useful

Congressional staff should provide guidance on agencies' GPRA
submissions-specifically, how information could be presented in the
reports to make it more useful in the decision-making process. They could
also communicate to agencies how the performance information is being
used, so that agencies do not perceive their data-gathering efforts as
inconsequential. Additionally, in using performance information to make
budget decisions, Congress should consider the results agencies achieved
as well as the results they did not achieve.

The Administration and OMB Need to Reinforce the Value of GPRA as a
Management Tool

Agencies should embrace GPRA as a management tool, not just an external
requirement that is separate from their day-to-day activities. To this
end, the administration and OMB need to make sure agency officials
understand how GPRA can be further integrated with other management
initiatives.

Agency Guidance on GPRA Should Recognize Diversity of Federal Agencies

OMB's guidance to agencies on how to implement GPRA should recognize that
one-size-fits-all approaches are unlikely to be useful. OMB should also
afford agencies some flexibility by simplifying the reporting process. For
example, some participants believed that not everything needed to be
measured. OMB should make exceptions for unique situations and programs,
e.g., science programs, and it could consider providing multiyear funding
for them.

OMB Should Publish a Governmentwide Performance Report

OMB should commit to regularly publishing a governmentwide performance
report that would articulate the government's accomplishments to the
public. It would also be useful if the report singled out higher-performing
programs so agencies could use them as models to guide their planning
efforts.

Agencies Need to Obtain and Commit Resources to Carry Out GPRA-Related
Activities

Agency leadership needs to ensure that staff members have the necessary
resources to work on GPRA-related activities. In addition, they need to
invest resources to train staff, including political appointees, on GPRA
and the benefits of linking budget to performance.

Timing of GPRA Reports Should Take into Account Changes of Administration

Under GPRA, agencies are to update their strategic plans every 3 years.
However, this effort can be wasted. Given that federal administrations
generally span at least 4 years, participants suggested that the required
update be changed to every 4 years to correspond with new presidential
terms.

Agencies Should Share Experiences on How to Address Common Problems

Agencies should collaborate more to develop strategies to address
difficult issues, such as how to identify performance indicators and
measure agency contributions to specific outcomes. It would be useful if
more agencies created structured forums for managers to share experiences,
talk about effective practices, and share solutions.

Appendix III

                   Observations on Agencies' Strategic Plans

Under GPRA, strategic plans are the starting point and basic underpinning
for results-oriented management. One of our objectives was to assess the
changes in the overall quality of agencies' goals, strategies, and data
articulated in their strategic plans. To meet this objective, we
judgmentally selected six agencies-Education, DOE, HUD, SBA, SSA, and
DOT-using criteria such as agency size, primary program types, and
previous GAO reviews. To assess the overall quality and improvements made
to the agencies' strategic plans, we relied on requirements contained in
GPRA and accompanying committee report language,1 guidance to agencies
from OMB for developing strategic plans,2 previous GAO reports and
evaluations,3 and our knowledge of agencies' operations and programs. In
conducting our reviews, we compared our assessments of agencies' current
strategic plans to our assessments of draft plans from fiscal year 1997.4
A more detailed discussion of our scope and methodology can be found in
appendix I.

Required Elements of Agency Strategic Plans

GPRA requires an agency's strategic plan to contain six key elements:

1.	A comprehensive agency mission statement. The agency mission statement
should concisely summarize what the agency does, as required by law,
presenting the main purposes for all its major functions and operations.
According to OMB guidance issued in 2002, a mission statement is brief,
defining the basic purpose of the agency, and corresponds directly with an
agency's core programs and activities. The program goals should flow from
the mission statement as well. The federal government's adaptive responses
over time to new needs and problems have contributed to fragmentation and
overlap in a host of program areas, such as food safety, employment
training, early childhood development, and rural development, which could
limit the overall effectiveness of the federal effort. The mission
statement helps to distinguish agencies' roles from one another, reduce
overlap, and identify areas needing coordination and collaboration.

1Government Performance and Results Act of 1993, Committee on Governmental
Affairs, United States Senate, S. Rpt. No. 58, 103d Cong. 1st Sess.
(1993).

2Office of Management and Budget, Circular No. A-11, Part 6, Preparation
and Submission of Strategic Plans, Annual Performance Plans, and Annual
Program Performance Reports (Washington, D.C.: June 2002).

3U.S. General Accounting Office, Agencies' Strategic Plans Under GPRA: Key
Questions to Facilitate Congressional Review, GAO/GGD-10.1.16 (Washington,
D.C.: May 1, 1997).

4GAO/HEHS-97-176R; GAO/RCED-97-199R; GAO/RCED-97-208R; GAO/HEHS-97-179R;
GAO/RCED-97-205R; and GAO/RCED-97-224R.

2.	Agencywide long-term goals and objectives. General goals and
objectives-or strategic goals-explain what results are expected from the
agency's major functions and when to expect those results. Thus, such
goals are an outgrowth of the mission and are very often results oriented.
OMB guidance states that the goals should be defined in a manner that
allows a future assessment to be made on whether the goal was or is being
achieved. General goals should predominantly be outcomes and are
long-term in nature.

3.	Approaches or strategies to achieve goals and objectives. Strategies
help in aligning an agency's activities, core processes, and resources to
support achievement of the agency's strategic goals and mission. Under
GPRA, strategies are to briefly describe the operational processes, staff
skills, and technologies, as well as the human, capital, information, and
other resources needed. According to OMB guidance, descriptions should be
brief, but more detailed data should be provided if a significant change
in a particular means or strategy would be essential for goal achievement.
In addition, the plan should summarize agencies' efforts to provide
high-quality and efficient training and skill improvement opportunities
for employees. As we have reported previously, agencies' planning
processes should support making intelligent resource allocation decisions
that minimize, to the extent possible, the effect of funding reductions on
mission accomplishment.

4.	A description of the relationship between long-term and annual goals.
Under GPRA, agencies' long-term strategic goals and objectives are to be
linked to their annual performance plans and the day-to-day activities of
their managers and staff. Without this linkage, Congress may not be able
to judge whether an agency is making progress toward achieving its
long-term goals. OMB guidance states that an updated and revised strategic
plan should briefly outline (1) the type, nature, and scope of the
performance goals being included in annual performance plans and (2) how
these annual performance goals relate to the long-term, general goals and
their use in helping determine the achievement of the general goals.

5. An identification of key external factors. Identification of key
factors, external to the agency and beyond its control, that could
significantly affect the achievement of the strategic goals is important
for Congress or the agencies to judge the likelihood of achieving the
strategic goals and the actions needed to better meet those goals. Such
external factors could include economic, demographic, social,
technological, or environmental factors. Information on these factors can
be useful for goal setting and also for explaining results in the agency's
annual performance reports, including, when applicable, the reasons annual
performance goals were not met. According to OMB guidance, if key factors
cannot be identified, a statement of explanation should be included in the
plan.

6.	A description of program evaluations. Finally, strategic plans should
include a description of completed program evaluations that were used in
developing the strategic plan, and a schedule for future program
evaluations. Program evaluations can be a potentially critical source of
information for Congress and others in ensuring the validity and
reasonableness of goals and strategies, as well as for identifying factors
likely to affect performance. Such information can also be useful in
explaining results in an agency's annual performance report, including,
when applicable, the reasons annual performance goals were not met, and
identifying appropriate strategies to meet unmet goals. Program
evaluations are defined in the act as objective and formal assessments of
the results, impact, or effects of a program or policy. The evaluations
include assessments of the implementation and results of programs,
operating policies, and practices.

In addition to the six key elements, OMB guidance also states that
agencies participating in crosscutting programs should describe in their
strategic plans how the programs are related and how coordination will
occur to support common efforts. Uncoordinated program efforts can waste
scarce funds, confuse and frustrate program customers, and limit the
overall effectiveness of the federal effort. OMB guidance also states that
the strategic plan should include a brief description of any steps being
taken to resolve mission-critical management problems. One purpose of GPRA
is to improve the management of federal agencies. Therefore, it is
particularly important that agencies develop strategies to address
management challenges that threaten their ability to meet long-term
strategic goals as well as this purpose of GPRA.

As shown in table 5, the majority of agencies have made progress in
addressing the required elements of strategic planning under GPRA.

Table 5: Agencies' Progress in Addressing Required Elements of Strategic
Planning under GPRA

                              Element included in agency strategic plan?

Agency strategic       Plan   Mission    Long-term              Relationship between  External
plans                  year   statement  goals      Strategies  long-term and annual  factors  Evaluations
                                                                goals

Department of          1997   X          X          X                                 X        X
Education              2002   X          X          X           X                     X

Department of Energy   1997   X          X          X
                       2003a  X          X          X           X                     X

Department of Housing  1997   X
and Urban Development  2003   X          X          X           X                     X

Small Business         1997   X          X          X                                 X
Administration         2001b  X          X          X           X                     X

Social Security        1997   X          X          X           X                     X        X
Administration         2003   X          X          X           X                     X        X

Department of          1997   X          X          X
Transportation         2003a  X          X          X           X                     X

Sources: GAO/GGD-10.1.16; GAO/HEHS-97-176R; GAO/RCED-97-199R;
GAO/RCED-97-208R; GAO/HEHS-97-179R; GAO/RCED-97-205R; GAO/RCED-97-224R;
and GAO analysis of U.S. Department of
Education, Office of the Deputy Secretary, Planning and Performance
Management Service, U.S. Department of Education Strategic Plan 2002-2007
(Washington, D.C.: 2002); Department of Energy, 2003
Strategic Plan (Draft) (Washington, D.C.: 2003); Department of
Transportation, U.S. Department of Transportation Draft Strategic Plan for
Fiscal Years 2003-2008 (Washington, D.C.: 2003); Social Security
Administration, Social Security Administration Strategic Plan 2003-2008
(Washington, D.C.: 2003); U.S. Department of Housing and Urban
Development, Strategic Plan FY 2003-FY 2008 (Washington, D.C.:
2003); and Small Business Administration, SBA Strategic Plan, FY 2001-FY
2006 (Washington, D.C.: 2000).

aThe 2003 plans for DOE and DOT were in draft form during the time of our
review.

bAt the time of our review, the most recent SBA strategic plan was for
fiscal years 2001-2006. SBA released a new strategic plan for fiscal years
2003-2008 in October 2003.

The remainder of this appendix discusses our observations on how the
quality of each of the agencies' strategic plans we reviewed has changed
since the agencies submitted their first draft strategic plans in 1997. We
did not independently verify or assess the information we obtained from
agency strategic plans. If an agency chose not to discuss its efforts
concerning elements in the strategic plan, this does not necessarily mean
that the agency is not implementing those elements.

Observations on Changes in the Quality of Education's Strategic Plan

In our review of Education's June 1997 draft strategic plan,5 we found
that the plan generally complied with GPRA and included all but one of the
six elements required by GPRA; it did not discuss how Education's
long-term goals and objectives would be related to its annual performance
goals. Also, we observed that the plan presented a logical, fairly
complete description of how Education intended to achieve its mission, but
a few areas could have been improved. In comparison, Education's 2002-2007
strategic plan has improved in several areas we identified in our 1997
review. However, we still found areas where Education could improve. Table
6 summarizes these findings.

5GAO/HEHS-97-176R.

Table 6: Education's Progress in Addressing Required Elements of Strategic
Planning under GPRA

Mission statement
  Initial draft strategic plan: Yes. The mission statement clearly and
  briefly explained why the agency exists, what the agency does, and how
  it performs its work.
  Current strategic plan: Yes. The mission statement is outcome oriented,
  comprehensive, covers all of the agency's functions and activities, and
  is the same as in the 1997 draft plan.

Long-term goals
  Initial draft strategic plan: Yes. These goals were related to the
  mission and were results oriented, but did not appear to reflect civil
  rights enforcement and monitoring responsibilities.
  Current strategic plan: Yes. The goals have changed, but are related to
  each other and the mission and are results oriented. They now
  additionally reflect civil rights enforcement and monitoring
  responsibilities.

Strategies
  Initial draft strategic plan: Yes. The plan outlined strategies to
  achieve goals overall and to hold managers accountable for achieving
  objectives, and generally described some of its resource requirements
  throughout the plan.
  Current strategic plan: Yes. The plan includes strategies that are
  linked to the goals. Several strategies relate to resource alignment to
  achieve outcomes, but the actual resources required are not always
  specified.

Relationship between long-term goals and annual goals
  Initial draft strategic plan: No. Education did not discuss the
  relationship between its strategic plan goals and those to be included
  in its annual plan, but indicated this would be done once its annual
  plan was prepared.
  Current strategic plan: Yes. The plan contains the annual performance
  measures and targets, which represent the annual performance goals, and
  aligns them with long-term goals, with which they have a logical
  relationship.

External factors
  Initial draft strategic plan: Yes. The plan generally described factors
  outside program scope and responsibilities that could negatively affect
  Education's ability to achieve goals, but factors were not directly
  linked to particular goals.
  Current strategic plan: Yes. The plan adequately describes external
  factors that could affect achieving its goals, and they are linked to
  particular goals. The plan also briefly discusses strategies to
  ameliorate the effects of a number of these factors.

Evaluations
  Initial draft strategic plan: Yes. Education said it would provide
  detailed descriptions of supporting evaluations once it consulted with
  Congress, completed the strategic plan, and agreed on performance
  indicators. The plan indicated a commitment to using evaluations,
  listing evaluations it intended to use to develop sound measures, but
  did not describe the evaluations.
  Current strategic plan: No. Because of the comprehensive revamping of
  Education's strategic plan in 2002, its program evaluation plan was
  completely restructured and was not released until it was included in
  the 2004 annual plan,a which contains information on new directions for
  program evaluation.

Sources: GAO/HEHS-97-176R; U.S. Department of Education, Office of the
Deputy Secretary, Planning and Performance Management Service, U.S.
Department of Education Strategic Plan 2002-2007 (Washington, D.C.: 2002).

aAccording to OMB Circular No. A-11, Part 6, Preparation and Submission of
Strategic Plans, Annual Performance Plans, and Annual Program Performance
Reports, June 2002, general goals, which are multiyear and long term, are
synonymous with general objectives, and either term can be used. The
objectives in Education's strategic plan are such multiyear, long-term
objectives, and are referred to in our report as the agency's "long-term
goals." OMB's Circular A-11 also indicates that some agencies include
strategic goals in their strategic plan, which represent overarching
statements of aim or purpose whose achievement cannot be determined and
which can be used to group outcome or output goals. Education's strategic
goals meet this description.

Current Strategic Plan Strengths and Improvements from Fiscal Year 1997
Draft Strategic Plan

Education's current mission, "to ensure equal access to education and to
promote educational excellence throughout the nation," is the same
comprehensive, outcome-oriented mission that was included in its 1997
draft plan. All of Education's functions and activities are covered by it.
The plan's long-term goals6 are expressed so as to allow Education and
Congress to assess whether they are being achieved. Moreover, in contrast
to findings in our review of Education's 1997 Draft Strategic Plan,
Education's civil rights responsibilities-including enforcing five civil
right statutes that ensure equal educational opportunity for all students,
regardless of race, color, national origin, sex, disability, or age-appear
to be reflected, at least in part, in the current plan's goals. For
example, one long-term goal is to reduce the gaps in college access and
completion among student populations differing by race/ethnicity,
socioeconomic status, and disability while increasing the educational
attainment of all. Another is to enhance the literacy and employment
skills of American adults. Under the latter, the plan includes a strategy
to work with state vocational rehabilitation agencies to ensure
implementation of standards that will assist individuals with disabilities
in obtaining high-quality employment outcomes. As in the past, some goals
are targeted at results for which Education has limited direct control and
are, instead, greatly influenced by third parties. However, Education
recognizes this situation and shows its intent to work closely with its
partners.

6U.S. Department of Education, Office of the Deputy Secretary, Strategic
Accountability Service, U.S. Department of Education FY 2004 Annual Plan
(Washington, D.C.: March 2003). This document represents Education's
performance plan for GPRA and will be referred to henceforth as the
"annual plan."

Education's current plan provides some information on linking results and
day-to-day activities within the department. For example, the plan says
employees will be held accountable for implementation and success from top
to bottom and senior officers will have performance contracts linked to
the plan and be recognized for achieving results. In addition, the
strategy to foster a customer service orientation by ensuring that states,
districts, and other partners receive timely responses to inquiries; to
assign senior officers to develop relationships with individual states;
and to create a customer support team to respond to issues, seems
logically linked to the day-to-day activities of managers and staff.
However, the link between results and day-to-day activities is not
apparent for most of the goals and their strategies.

The current plan includes several annual performance measures that are at
least related to how well information technology is supporting strategic
and program goals, as required by the Clinger-Cohen Act. For example, for
its long-term goal to modernize the Federal Student Assistance (FSA)
Programs and reduce their high-risk status, the plan contains a measure on
the integration of FSA processes and systems that work together to support
FSA program delivery functions. Commendably, Education includes some goals
related to reducing unintended negative effects of agency programs, such
as a goal and a related measure to reduce the data collection and
reporting burden.

Education's current plan shows great improvement on recognizing and
addressing external factors. Beyond adequately describing external factors
that could affect achieving its goals and directly linking these factors
to particular goals, the plan also briefly describes strategies to
ameliorate the effects of a number of factors. For example, for an
external factor on teacher certification under a goal on reading, the plan
says that Education "will work with the states and national accreditation
bodies to encourage the incorporation of research-based reading
instruction into teacher certification requirements." In addition, the
plan includes strategies indicating that Education monitors some internal
factors that could affect achievement of long-term goals. Moreover, the
plan recognizes the overarching critical external factor for
Education-that it depends greatly on third parties who often control the
results the department is trying to
achieve-and its strategies related to the goal-specific external factors
often reflect this.

In our 1997 assessment of Education's draft plan, we commented that
although the plan identified several management challenges the department
would face in the coming years, it provided little detail about them and
how they would be addressed. In our January 2001 performance and
accountability series, we identified four department-specific and two
governmentwide challenges that we said Education needed to meet.7
Education's current strategic plan includes some goals, measures, and
strategies that could be used to address these challenges. For example,
Education's goal to "develop and maintain financial integrity and
management and internal controls" and this goal's strategies and measures
are directly related to the management challenge we identified on
improving financial management. One of the measures under this goal is
"the achievement of an unqualified audit opinion," an important issue in
our identification of financial management weaknesses as a challenge in
2001, and Education received an unqualified audit opinion in early 2003.
Moreover, the current strategic plan includes goals and strategies meant
to improve Education's strategic human capital management and strengthen
information technology security, two important governmentwide high-risk
areas we identified in 2001.

Critical Strategic Planning Issues Needing Further Improvement

In our report on Education's 1997 draft strategic plan, we found that,
although the department had numerous crosscutting programs and activities,
the plan had identified key interagency activities for some programs but
not for others. For example, we said that the plan did not identify or
discuss activities for postsecondary programs for which the Higher
Education Act of 1965 required coordination. Education's current plan
includes an appendix, which highlights collaborative initiatives under
each of the department's strategic goals, with some activities related to
postsecondary education, including most of those mentioned in our 1997
report. However, as the plan states, the appendix presents only a brief
overview of the highlights of some of its collaborative initiatives with
other agencies.

7U.S. General Accounting Office, Performance and Accountability Series:
Major Management Challenges and Program Risks, Department of Education,
GAO-01-245 (Washington, D.C.: January 2001). The January 2001 assessment
was the last time we assessed the Department of Education under our
Performance and Accountability Series before the release of the agency's
2002-2007 Strategic Plan. We further reported in October 2002 on how
Education and other agencies reported responding to their management
challenges and program risks. See U.S. General Accounting Office,
Performance and Accountability: Reported Agency Actions and Plans to
Address 2001 Management Challenges and Program Risks, GAO-03-225
(Washington, D.C.: Oct. 31, 2002).

In our 1997 review, we stated that some resource requirements were
described throughout the plan. In the current plan, while a number of
strategies under the department's strategic goal to establish management
excellence are related to aligning resources to achieve outcomes, the
actual resources required-such as human, capital, and information-are not
always specifically indicated. The exception is for information resources,
for which a number of strategies discuss the information resources that
will be required to address the related goal. For example, under the goal
to develop and maintain financial integrity and management and internal
controls, the plan says that Education will "implement a new financial
system capable of producing timely and reliable financial data and
reconcile systems to the general ledger." Moreover, while the plan
stresses accountability throughout, it only refers to providing the
authority needed to achieve results once.8 In addition, consideration of
alternate strategies for achieving goals is not discussed.

In our 1997 review, we reported that Education said it would provide
detailed descriptions of supporting evaluations once it consulted with
Congress, completed the strategic plan, and agreed on performance
indicators. The draft plan indicated Education's commitment to using
evaluations and listed evaluations and strategies it intended to use to
develop sound measures, but did not describe the evaluations. The current
plan does not include descriptions of supporting evaluations or a future
program evaluation schedule. According to Education officials, this was
not done because, with the complete revamping of the strategic plan based
on passage of the No Child Left Behind Act of 2001,9 Education was set on
a course to totally restructure its evaluation program, but could not do
so in time to include it in the strategic plan. Consequently, Education
instead included information about its new directions for program
evaluation studies and a schedule of evaluations in its 2004 annual plan.
The schedule, however, lacked a timetable, except for stating whether the
evaluations were new or continuing. The current strategic plan indicates,
in some cases, the use or planned use of program evaluation findings to
develop or revise components of the plan. For example, for long-term goals
on various types of academic achievement, data from the National
Assessment of Educational Progress were identified as having been used to
set related targets.

8Within one of its strategies, the plan states that "managers will be
given the freedom to manage and will be held accountable for results."

9Pub. L. No. 107-110, January 8, 2002. The No Child Left Behind Act of
2001 is a reauthorization of the Elementary and Secondary Education Act,
one of the major pieces of authorizing legislation for the Department of
Education.

Observations on Changes in the Quality of DOE's Strategic Plan

Overall, DOE's draft 2003 strategic plan meets the required elements of
GPRA and has improved greatly over its 1997 draft strategic plan, as shown
in table 7. DOE made improvements to its plan by establishing
results-oriented and measurable objectives, and addressing elements that
were not included in the department's 1997 plan, such as reporting on
external factors. Although DOE has shown improvement, a few elements in
the plan could still be enhanced. For instance, further improvement could
be made to the mission statement so that it better addresses the
department's major activities, and additional information could be
included in DOE's strategies to achieve its goals, such as management
accountability.

Table 7: DOE's Progress in Addressing Required Elements of Strategic
Planning under GPRA

Mission statement
  Initial draft strategic plan: Yes. DOE's mission was results oriented,
  met a public need, and covered the agency's major activities.
  Current draft strategic plan: Yes. DOE's mission is results oriented and
  meets a public need, but it does not address the department's major
  activities related to energy supply and conservation.

Long-term goals
  Initial draft strategic plan: Yes. Long-term goals covered the mission
  and major functions of the agency. Goals and objectives were results
  oriented but not measurable.
  Current draft strategic plan: Yes. Long-term goals cover the mission and
  major functions of the agency. Objectives, referred to as intermediate
  goals, are results oriented and measurable.

Strategies
  Initial draft strategic plan: Yes. The plan included strategies and
  measures to evaluate the results of the strategies, but was missing
  information on linkages to day-to-day activities and the extent to which
  managers have the knowledge, skills, and abilities to implement the
  strategies.
  Current draft strategic plan: Yes. Strategies to achieve each goal are
  included in the plan, along with intermediate goals to measure success,
  but information on linkages to day-to-day activities, and the extent to
  which managers have the knowledge, skills, and abilities to implement
  the strategies, is not included.

Relationship between long-term goals and annual goals
  Initial draft strategic plan: No. The relationship between the long-term
  goals and annual performance goals was missing in the draft plan.
  Current draft strategic plan: Yes. The draft plan provides a brief
  description of the relationship between the long-term strategic goals
  and the annual performance goals.

External factors
  Initial draft strategic plan: No. Key external factors were not
  addressed in the draft plan.
  Current draft strategic plan: Yes. Key external factors and how they
  could affect the ability to achieve each goal are identified in the
  draft plan.

Evaluations
  Initial draft strategic plan: No. The impact of program evaluations on
  the development of strategic goals was not included in the draft plan.
  Current draft strategic plan: No. The impact of program evaluations on
  the development of strategic goals is not discussed thoroughly.

Sources: GAO/RCED-97-199R and Department of Energy, 2003 Strategic Plan
(Draft) (Washington, D.C.: 2003).

Strategic Plan Strengths and Improvements from Fiscal Year 1997 Plan

In its 2003 draft strategic plan, DOE has made significant improvements in
developing measurable objectives, referred to in its plan as intermediate
goals. GAO's review of DOE's 1997 draft strategic plan found that
objectives related to DOE's long-term goals were stated in ways that would
make it difficult to measure whether they were being achieved. In the 2003
draft plan, the objectives are stated in ways that will enable DOE to
measure its progress in achieving goals. For example, to meet the goal of
enhancing energy security through various means, one of DOE's associated objectives is to ensure that, throughout DOE's 25-year planning period, the Strategic Petroleum Reserve is ready to supply oil at a sustained rate of 4.3 million barrels per day for 90 days within 15 days' notice by the President.

In addition, DOE has improved its draft 2003 strategic plan by including
elements that were not included in its 1997 draft plan. These elements
consisted of (1) identifying external factors and (2) describing the
relationship between long-term and annual goals. For each of its long-term
goals, DOE identified external factors, such as reduced funding, lack of
scientific talent, and unpredictable technological developments, that
could affect its ability to achieve its goals. The strategic plan also
included a description of the relationship between the long-term strategic
goals and the annual performance goals. The plan included a diagram that
showed an example of a strategic goal, its associated objectives, and how
they are related to the annual performance goals and targets. However, the
plan could be improved if all annual performance goals were discussed in
the plan and linked to each strategic goal so that it would be clear how
annual performance goals would be used to gauge performance. DOE staff stated that they did not believe a description of all actual annual performance goals should be included in the strategic plan because the annual goals differ each year.

Finally, our past and current reviews of DOE's 1997 and 2003 draft
strategic plans found that DOE addressed performance and accountability challenges that we had previously identified. In January 2003, we
identified six areas where DOE's management attention was needed: (1)
addressing security concerns, (2) revitalizing infrastructure, (3)
improving contract management, (4) managing the nuclear weapons stockpile,
(5) cleaning up radioactive and hazardous wastes, and (6) enhancing
leadership in meeting energy needs. All areas were addressed in the 2003
draft strategic plan, with the exception of improving contract management.
For example, for the challenge of enhancing leadership in meeting energy
needs, one of the intermediate goals requires DOE to develop and
demonstrate technologies that can reduce emissions by more than 70 metric
tons of carbon and equivalent greenhouse gases by 2012.

Critical Strategic Planning Issues Needing Further Improvement

There are three elements in DOE's draft 2003 strategic plan that require further improvement. To begin with, although DOE's mission is results
oriented, it was revised from the 1997 draft strategic plan and does not
address the department's major activities related to energy supply and
conservation. These activities account for approximately 10 percent of
DOE's $23.4 billion fiscal year 2004 budget request. Our review of the
1997 draft strategic plan found that DOE's mission addressed all of its
major activities.

In addition, the impact of program evaluations on the development of
strategic goals could be discussed in greater detail. The strategic plan
stated that internal, GAO, and Inspector General (IG) evaluations were used as resources to develop the draft strategic plan, but specific
program evaluations were not identified. A schedule of future program
evaluations was also not discussed in the strategic plan. According to
DOE, there is no plan to include a table of reports and evaluations that
were used to develop the goals for the strategic plan because the
evaluations were from a prior administration, and when the administration
changes, it usually does not use evaluations from past administrations.

DOE could also enhance the strategies included in its plan by providing
information on how goals are linked to the department's day-to-day
activities, and the extent to which managers have the knowledge, skills,
and abilities to implement the strategies. None of this information was
included in the 1997 and 2003 draft strategic plans. According to DOE
officials, in drafting the plan, their goal was to keep the plan at a high
level and this additional information would require more detail than what
is needed for a strategic plan. For example, one official stated that a
description of linkages to day-to-day activities was not discussed in the
strategic plan because it would conflict with the performance plan and
budget justification, which is where the information can be found. As we
have stated previously, without this information, it is difficult to judge
DOE's likelihood of success in achieving the goals or the appropriateness
of the strategies.10

Finally, DOE's draft strategic plan could be improved by identifying
programs and activities that are crosscutting or similar to those of other
agencies. In the 2003 draft plan, crosscutting activities are only
identified for one goal related to science. According to one DOE official,
as the strategic goals were being developed, DOE staff took crosscutting
activities into consideration. As we stated in 1997, unless DOE addresses
crosscutting issues in its plan, Congress cannot be assured that federal
programs are working effectively.

Observations on Changes in the Quality of HUD's Strategic Plan

Overall, HUD has made progress in developing its strategic plan for fiscal
years 2003 through 2008 as required under GPRA. In 1997, we stated that
HUD's draft strategic plan did not cover the six components required under
GPRA.11 HUD's fiscal year 2003-2008 strategic plan addressed several
issues we had previously identified, such as making sure the mission
statement is linked to the department's major operations and functions;
including outcome-related goals and objectives for the department's
functions and operations; generally providing a description of how it will
achieve its goals and objectives; identifying key external factors
affecting the achievement of departmental goals; and explaining how HUD
will coordinate with other agencies to address crosscutting issues.
However, HUD could improve its strategic plan by explaining the
relationship

10GAO/RCED-97-199R.

11GAO/RCED-97-224R.

between the long-term and intermediate performance measures listed for
each strategic goal and discussing how it used program evaluations to
develop the plan. Table 8 summarizes these findings.

Table 8: HUD's Progress in Addressing Required Elements of Strategic
Planning under GPRA

Mission statement
  Included in initial draft strategic plan: No. HUD's mission statement did not cover the major program functions and operations of the agency.
  Included in current strategic plan: Yes. HUD's mission statement covers the agency's major program functions and operations and relevant statutes.

Long-term goals
  Included in initial draft strategic plan: Yes. The plan included eight strategic objectives, and they generally covered the department's major functions and operations.
  Included in current strategic plan: Yes. The plan includes long-term goals that cover the major programs and functions of the agency.

Strategies
  Included in initial draft strategic plan: No. The strategic plan lacked an adequate description of how its strategic objectives would be achieved.
  Included in current strategic plan: Yes. The strategic plan discusses the means and strategies to address each strategic goal, including plans to address human capital issues critical to carrying out its mission.

Relationship between long-term goals and annual goals
  Included in initial draft strategic plan: No. HUD's strategic plan provided limited examples of annual performance goals under each of its strategic objectives, but it did not describe the relationship between them.
  Included in current strategic plan: Yes. However, some long-term performance measures do not appear to have corresponding intermediate measures, and in other instances it is not clear how HUD will measure progress towards its goals.

External factors
  Included in initial draft strategic plan: No. HUD briefly discussed the external factors in its draft strategic plan without linking them to specific strategic objectives.
  Included in current strategic plan: Yes. HUD describes the external factors that could affect achieving its strategic objectives.

Evaluations
  Included in initial draft strategic plan: No. HUD's strategic plan did not include information on program evaluations.
  Included in current strategic plan: No. HUD does not describe how program evaluations were used to prepare its strategic plan. Although it mentions that program evaluations will be used to advance key policy objectives, it states that there are no fixed timetables for when these evaluations will take place.

Sources: GAO/RCED-97-224R and U.S. Department of Housing and Urban Development, Strategic Plan FY 2003-FY 2008 (Washington, D.C.: 2003).

Strategic Plan Strengths and Improvements from Fiscal Year 1997 Draft Plan

In its most recent strategic plan for fiscal years 2003-2008, HUD has made progress in crafting a mission statement that generally covers its major program functions, operations, and relevant statutes that authorize its programs. In addition, the current plan builds upon the strength of its first draft strategic plan by including strategic objectives that cover the department's major functions and operations.

In contrast to its draft strategic plan of 1997, HUD's most recent plan
provides a general description of how its strategic objectives will be
achieved. For example, HUD lists the means and strategies following each
strategic goal and supporting objectives to describe how it will carry out
its activities. In addition, the strategic plan identifies a few
legislative and regulatory changes HUD will pursue to meet its objectives,
such as a new tax credit for developers of affordable housing and expanded
eligibility for the Assisted Living Conversion Program. HUD also acknowledges in its plan that, to carry out its mission, it needs to recruit and retain a workforce with the proper skills and abilities.

HUD's most recent strategic plan also describes key factors external to
the department and beyond its control that could significantly affect the
achievement of its objectives. For example, for its goal of Promoting
Decent and Affordable Housing, HUD identifies the impact of broad economic
factors on opportunities for low-income workers as a factor that will
affect the department's ability to assist renters that depend on HUD's
programs to make progress towards self sufficiency. HUD's strategic plan
also provides a general description of current program evaluations and
mentions that the results of these evaluations will support key policy
objectives within the department.

Addressing a shortcoming of its draft strategic plan in 1997, HUD's
current plan generally explains how it will coordinate with other agencies
to address crosscutting problems. For example, the Interagency Council on
the Homeless (ICH), created by the Secretaries of HUD, the Department of
Health and Human Services (HHS), and Veterans Affairs (VA), will continue working to identify the obstacles homeless people face in enrolling in the main service programs and to recommend specific legislative, policy, and procedural changes that would make federal supportive service programs more accessible to them.

Lastly, HUD improved its discussion of how it plans to address performance
and accountability challenges we have raised. In January 2003, we reported
that programmatic and financial management information systems and human
capital issues were HUD's performance and accountability challenges.12
While in the past HUD acknowledged some of these problems and described
how they would be addressed, its plans for management reform were not
fully integrated into its draft strategic plan. In contrast, one of HUD's
strategic goals in its current strategic plan, "Embrace High Standards of
Ethics, Management, and Accountability," lists specific actions the
department will take to address these challenges.

Critical Strategic Planning Issues Needing Further Improvement

HUD has made progress in linking its strategic objectives to both
long-term goals and intermediate measures in its strategic plan; however,
long-term performance measures are not consistently linked to a
corresponding intermediate measure, making it difficult for the reader to
understand how the department will measure progress towards its strategic
goals. For example, HUD projects that the percentage of architects and
builders indicating awareness of the design and construction requirements
of the Fair Housing Act will increase through fiscal year 2008. However,
HUD does not mention in the plan how many architects or engineers it will
survey to establish a baseline of awareness or to gauge progress made
towards this goal in subsequent years. HUD officials explained that only
those long-term goals that are critical to achieve within the next 1 to 2 years have corresponding intermediate performance measures. Therefore,
it is somewhat unclear how HUD plans to assess progress made towards its
broader goal of "Ensure Equal Opportunity in Housing."

Similarly, the strategic plan does not always provide a clear picture of
how HUD will be able to measure progress towards its strategic goals. For
example, the plan states that HUD's Community Development Block Grants
(CDBG) Program will create or retain 400,000 jobs by fiscal year 2008;
HUD's intermediate measure for fiscal year 2004 is 84,000 jobs. These
estimates are based on the average jobs created per grant dollar reported
by grantees. However, HUD does not mention how it will be able to distinguish between jobs created by CDBG and those created by other means.
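To illustrate the kind of projection the plan describes, the following short Python sketch applies a grantee-reported average rate to a funding level. Every figure and name in the sketch is a hypothetical assumption for illustration only; none is drawn from HUD's plan, which reports only the resulting targets.

    # Illustrative sketch of a grantee-reported jobs projection.
    # All numbers here are hypothetical, not HUD data.

    def projected_jobs(grant_dollars: float, jobs_per_dollar: float) -> float:
        # Project jobs from grant funding at an average reported rate.
        return grant_dollars * jobs_per_dollar

    # Hypothetical average rate: one job per $35,000 of grant funds.
    rate = 1 / 35_000

    # Hypothetical one-year funding level.
    fy_funding = 3.0e9

    print(round(projected_jobs(fy_funding, rate)))  # roughly 85,700 jobs

As the sketch makes plain, such a projection simply multiplies funding by an average rate; nothing in the calculation separates jobs attributable to CDBG from jobs that would have been created anyway, which is the attribution gap noted above.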

12U.S. General Accounting Office, Major Management Challenges and Program
Risks: Department of Housing and Urban Development, GAO-03-103
(Washington, D.C.: January 2003).

Similar to our previous finding, HUD's current strategic plan does not
describe how program evaluations were used to develop its strategic goals
or other components of its strategic plan, and does not include a schedule
for future evaluations. While the plan mentions that program evaluations
will be used to advance key policy objectives, it states that there are no
fixed timetables for when these evaluations will take place.

Observations on Changes in the Quality of SBA's Strategic Plan

In our review of SBA's March 1997 draft strategic plan,13 we found that
the plan did not meet two of the six requirements set forth by GPRA: (1) a
discussion of the relationship between the long-term goals and objectives
and the annual performance goals and (2) a description of how program
evaluations were used to establish or revise strategic goals and a
schedule for future program evaluations. In addition, the four elements
the plan contained could have better conformed to GPRA's requirements and
OMB guidance. In comparison, SBA's 2001-2006 strategic plan has improved in several areas we identified in our 1997 review.14 However, we still found areas in which SBA could improve. Table 9 summarizes these findings.

Table 9: SBA's Progress in Addressing Required Elements of Strategic
Planning under GPRA

Mission statement
  Initial draft strategic plan: Yes. The mission statement was results oriented. However, it did not directly incorporate key aspects of SBA's legislative mandate, and it did not encompass one of SBA's significant activities.
  Current strategic plan: Yes. SBA's mission statement now specifically mentions its mandate and encompasses SBA's disaster loan program to families and businesses.

Long-term goals
  Initial draft strategic plan: Yes. Generally, the strategic goals covered the major functions and operations of SBA. Most of SBA's strategic goals were expressed as processes, not as outcomes.
  Current strategic plan: Yes. The strategic goals are outcome oriented and cover the major functions and operations of SBA.

Strategies
  Initial draft strategic plan: Yes. However, the strategies were too vague or general to assess whether or not they would help achieve SBA's goals. Also, the strategies and goals could have been linked more explicitly.
  Current strategic plan: Yes. The strategies will generally help SBA achieve its goals. The strategies are now listed by strategic goal and related objective. The plan discusses some of the resources SBA needs to achieve its goals.

13GAO/RCED-97-205R.

14At the time of our review, the most recent SBA strategic plan was for fiscal years 2001-2006. SBA released a new strategic plan for fiscal years 2003-2008 in October 2003.

Relationship between long-term goals and annual goals
  Initial draft strategic plan: No. The linkage between proposed performance measures, strategies, and objectives was unclear.
  Current strategic plan: Yes. The plan lists performance measures by objective for each strategic goal. Logic models show the relationship between measures and outcomes.

External factors
  Initial draft strategic plan: Yes. The plan listed a number of external factors. A discussion of how the external factors would be taken into account when assessing progress toward goals was not included in the plan.
  Current strategic plan: Yes. The plan lists several external factors. Generally, strategies to ameliorate the effects of these factors are included. However, SBA does not discuss how external factors will be taken into account when assessing progress toward goals.

Evaluations
  Initial draft strategic plan: No. The plan did not describe how program evaluations were used to establish or revise strategic goals or include a schedule for future program evaluations.
  Current strategic plan: No. The plan states that lessons learned from SBA's program evaluations have influenced its strategic direction. Examples of future program evaluations are given, but a specific schedule of when these evaluations are to occur is not included.

Sources: GAO/RCED-97-205R and Small Business Administration, SBA Strategic Plan,
                  FY 2001 - FY 2006 (Washington, D.C.: 2000).

Strategic Plan Strengths and Improvements from Fiscal Year 1997 Plan

SBA's current mission statement is an improvement from the one contained
in its 1997 draft strategic plan. In our July 1997 report on SBA's draft
strategic plan, we noted that the mission statement could be improved by
more directly incorporating key aspects of SBA's legislative mandate to
aid, counsel, assist, and protect the interests of small businesses. In
addition, the mission statement did not encompass one of SBA's significant
activities-making loans to individuals. The mission statement in the 2001-2006 strategic plan now includes both of these items.

The long-term, or strategic, goals of SBA's 2001-2006 strategic plan are
(1) helping small businesses succeed, (2) helping Americans recover from
disasters, and (3) modernizing SBA. These three strategic goals are
outcome oriented. We stated in our previous report on SBA's draft
strategic plan that only two of SBA's seven strategic goals described
outcomes. The rest of the goals were expressed as processes.

SBA's 2001-2006 strategic plan shows significant improvement in its
strategies. In 1997 we stated that the plan could be improved by making
the linkage between the strategies and goals/objectives more explicit.
Objectives were listed as a group, followed by the strategies, which were
also listed as a group. The 2001-2006 plan describes the strategies,
objective by objective, making the linkage clear. The strategies contained
in SBA's 1997 plan consisted entirely of one-line statements and most were
too vague or general to enable an assessment of whether or not they would help achieve the goals and objectives in the plan. While many of the
strategies listed in SBA's 2001-2006 plan are only one- or two-sentence
statements, all of the strategies seem to aid in achieving SBA's goals and
objectives. For example, one strategy for the objective of providing
entrepreneurial development assistance is to train SBA personnel in
outreach and business development approaches. This strategy would
presumably aid in making staff more competent at helping small businesses
develop. The current plan also includes some of the resources needed to
achieve SBA's goals. These resources include a table showing SBA's fiscal
year 2001 budget request and its plans to modernize its information
technology systems.

The 2001-2006 strategic plan also makes a clearer connection between
strategic goals and annual performance measures that will be used to gauge
progress in achieving strategic goals. SBA shows this relationship by
linking its performance output measures with the intended outcomes of
SBA's programs for two of the strategic goals. For the other strategic
goal, "Modernizing SBA," SBA lists the performance measures it will use by
objective. For example, the performance measure "personnel trained or
retrained" will gauge progress in achieving SBA's human capital
investments objective. This shows improvement over the draft strategic
plan, which listed the performance measures as a group without showing how
they were related to SBA's strategic goals.

Similar to its 1997 draft plan, SBA's 2001-2006 plan specifies external
factors that could affect the achievement of its strategic goals. The plan
lists eight external factors associated with two of its three strategic
goals; no external factors were associated with the third goal. The plan
includes strategies that seem to be intended to mitigate the effect of six
of these external factors, while two of the external factors,
congressional support and public support, do not seem to be addressed. An
example of an external factor that seems to be addressed by a strategy is
economic conditions, listed for the strategic goal "helping small
businesses succeed." One of the strategies for this strategic goal is to
determine economic trends and conditions. SBA states that it tracks
economic trends that affect small businesses and the contribution small
businesses make to the economy. SBA then brings these data to the
attention of policymakers.

We noted in our 1997 report that SBA did not explicitly address the
relationship of SBA's activities to similar activities in other agencies
and provided no evidence of coordination. In contrast, SBA's current
strategic plan includes a section on crosscutting issues. This section
contains discussions of innovation and research assistance, international trade
assistance, business development assistance, veterans affairs, and
disaster assistance. One example of coordination is the U.S. Export
Assistance Centers, which combine the trade-promotion and export-finance
resources of SBA, the Department of Commerce, the Export-Import Bank, and
in some locations, the Agency for International Development and the
Department of Agriculture.

Finally, SBA's 2001-2006 strategic plan generally addresses performance
and accountability challenges that we have previously identified. For
example, we noted in our January 2001 report on major management
challenges and program risks15 that SBA needed to continue to improve
oversight of its lending partners to correct oversight weaknesses. The
plan identifies lender oversight as a management challenge and states that
SBA has developed and implemented a safety and soundness oversight program
for Small Business Lending companies, institutionalizing the process
through the Office of Lender Oversight. This is an improvement over SBA's
draft strategic plan, which we reported did not clearly address previously
identified management problems.

Critical Strategic Planning Issues Needing Further Improvement

In 1997, we noted that SBA's draft strategic plan did not include a
discussion of how the external factors would be taken into account when
assessing progress toward goals. This observation holds true for the
current strategic plan. For the external factor mentioned above, economic
conditions, the plan states that if the economy remains strong, surety
bond guaranties will remain constant or decrease, but if the economy
deteriorates, demand will increase. However, the plan does not state how
SBA will assess success or failure in meeting its goals in relation to
this factor.

In its 1997 draft plan, SBA did not describe how program evaluations were
used to establish or revise strategic goals or include a schedule for
future program evaluations. In the 2001-2006 plan, SBA states, "We have
used lessons learned in our performance monitoring and program evaluations
to influence the strategic direction contained in this plan." The plan
includes findings from six completed program evaluations; however, no
further detail is given as to how these program evaluations were used to
establish

15U.S. General Accounting Office, Major Management Challenges and Program
Risks: Small Business Administration, GAO-01-260 (Washington, D.C.:
January 2001).

or revise the strategic goals. While the current plan gives examples of
future program evaluations, such as conducting a benchmark study on the
HUBZone program to assess the changes in employment and investment in
distressed urban and rural communities, it does not include a schedule of
future evaluations. SBA states that for the next several years, the agency plans to systematically review the programs that pose the most financial risk to the government, as well as those that can offer lessons on how to improve its efforts.

Observations on Changes in the Quality of SSA's Strategic Plan

SSA's strategic plan for 2003-2008 is well structured and contains all of
the required elements under GPRA. In 1997, we noted that SSA's draft
strategic plan contained all six required components, but suggested a
number of ways it could be strengthened. SSA has addressed some of the
issues we previously identified, such as associating specific programs
with goals and identifying external factors that may affect goal
achievement. SSA could further improve its strategic plan by (1) ensuring that its strategic objectives will assist SSA in achieving its
strategic goals, (2) explicitly describing the effect of external factors
on goal attainment, (3) providing timetables or schedules for achieving
results, (4) providing details on how each performance and accountability
challenge will be addressed, (5) clearly explaining how program
evaluations were used in formulating the strategic plan, and (6)
discussing the manner in which SSA has coordinated with other agencies,
especially those that serve the same beneficiaries. Table 10 summarizes
SSA's progress in addressing the required elements of GPRA.

Table 10: SSA's Progress in Addressing Required Elements of Strategic
Planning under GPRA

Mission statement
  Included in initial draft strategic plan: Yes. SSA's mission statement was appropriate and reflective of its new status as an independent agency.
  Included in current strategic plan: Yes. SSA's mission statement has not changed substantially.

Long-term goals
  Included in initial draft strategic plan: Yes. Long-term goals were established. However, the relationship between long-term goals and specific programs was unclear and did not identify the results to be achieved.
  Included in current strategic plan: Yes. The goals' relationship to specific programs is more defined, but SSA's goal for achieving solvency of the social security system is ambitious, given SSA's mission and responsibilities. Key outcomes are identified for each goal. SSA's major management challenges are not all clearly linked to individual goals or objectives.

Strategies
  Included in initial draft strategic plan: Yes. The strategic plan was generally complete with regard to processes and technology, but did not include timetables or schedules. Some of the success was predicated on changes in processes or technology improvements.
  Included in current strategic plan: Yes. SSA added some timetables, but the required resources are not specified. Some of the success is still predicated on technological improvements.

Relationship between long-term goals and annual goals
  Included in initial draft strategic plan: Yes. SSA provided numerous performance measures relating to strategic goals and objectives, and plans for developing new measures were discussed. It was sometimes difficult to link measures with their appropriate objectives. It was also hard to discern which objectives did not yet have performance goals. Some data were expressed by program, while other data were aggregated.
  Included in current strategic plan: Yes. SSA provided one or more key outcomes for each strategic objective. SSA notes that success in meeting the objectives will be measured in the annual performance plans by progress in achieving the key outcomes.

External factors
  Included in initial draft strategic plan: Yes. The report mentioned several key external factors, but did not explicitly link factors to general goals and state how they could have affected goal attainment. Also, the report did not discuss needed changes (by Congress) to ensure solvency.
  Included in current strategic plan: Yes. External (environmental) factors are listed in a separate section. However, the plan does not explicitly link factors to general goals and state how they could affect goal attainment. Specific effects are not discussed; most examples are vague.

Evaluations
  Included in initial draft strategic plan: Yes. The report contained a broad discussion of program evaluations, but the evaluations were not clearly described. Also, SSA did not describe how the evaluations were used to establish or revise specific goals/objectives. Finally, there was no schedule for completing future evaluations or methodologies.
  Included in current strategic plan: Yes. Future evaluations (with brief descriptions) are listed, but there is no discussion of how current evaluations are used to establish or revise specific goals/objectives. The plan states that SSA used internal and external (GAO, IG) evaluations to determine strategic plans and objectives.

Sources: GAO/HEHS-97-179R and Social Security Administration, Social Security Administration Strategic Plan 2003-2008 (Washington, D.C.: 2000).

Strategic Plan Strengths and Improvements from Fiscal Year 1997 Plan

SSA's mission statement changed very little between 1997 and 2003. OMB Circular A-11 notes that the mission statement should be brief and define the basic purpose of the agency, with particular focus on its core programs and activities. SSA's statement conforms to this guidance; it reads, "To advance the economic security of the nation's people through compassionate and vigilant leadership in shaping and managing America's social security programs."16

16In the 2003-2008 plan, the word "advance" replaced the word "promote."

In 1997, the relationship between SSA's long-term goals and specific
programs was unclear and did not identify the specific results to be
achieved. Since that time, SSA has improved this linkage and better
articulated intended results, including quantifiable goals. For example,
as part of its strategic goal to ensure superior stewardship of Social
Security programs and resources, SSA notes that one of its key outcomes is
to increase Supplemental Security Income (SSI) payment accuracy to 96
percent (free of preventable error) by 2008.

SSA improved upon its linkage between long-term goals and annual goals in
its fiscal year 2003-2008 strategic plan. Under each strategic goal in
this plan, SSA provided one or more key outcomes for each strategic
objective; in its 1997 draft strategic plan, we found it difficult to link
measures with the appropriate objectives and discern which objectives did
not yet have performance goals.

Critical Strategic Planning Issues Needing Further Improvement

Not all of SSA's strategic objectives and associated performance measures
will allow SSA to achieve its related strategic goals. Specifically, the
solvency goal in SSA's current strategic plan reads, "To achieve
sustainable solvency and ensure Social Security programs meet the needs of
current and future generations," but the sole associated objective (to support, through education and research efforts, reforms that ensure sustainable solvency and more responsive retirement and disability programs) will not allow SSA, on its own, to reach that goal. While SSA's mission is to
advance the economic security of the nation's people, it is not
unilaterally responsible for achieving solvency in social security
programs.

An agency's strategic plan is expected to contain strategies for achieving
the goals articulated. In 1997, SSA's strategic plan was generally
complete with regard to processes and technology, but did not include
timetables or schedules for results. While the current strategic plan
contains processes, anticipated progress in technology, and some
timetables, it does not contain timetables or schedules for all of the
results. For example, as part of its strategic objective to "efficiently
manage Agency finances and assets and effectively link resources to
performance outcomes," SSA's key outcomes include (1) competing or converting 50 percent of commercial positions and (2) "getting to green" on all five President's Management Agenda (PMA) items. SSA has neither identified the resources required to achieve these outcomes nor a time frame for achieving them.

In our review of SSA's 1997 draft strategic plan, we noted that SSA
described several key external factors that may affect its programs, but
did not explicitly link such factors to its general goals and state how
these factors could affect goal attainment. In the current strategic plan,
SSA identifies environmental (external) factors: demographics, health and
disability trends, technological advances, and workforce trends. However,
as we found in our earlier review, the effects of these factors on
specific performance goals are not specified, even though SSA notes that
they drive the development of such goals.

SSA noted that it considered major management challenges identified by GAO
when it determined its strategic goals and objectives, but not all of
these challenges are clearly addressed in the plan. While these challenges
are not clearly identified, SSA addresses them to some degree throughout
the plan. For example, SSA's strategic goal to "Strategically manage and
align staff to support SSA's mission" addresses the governmentwide
challenge of strategic human capital management.17

SSA includes a list of major strategic process and program evaluations scheduled for fiscal years 2003 through 2008, organized by
strategic goal. However, SSA does not list ongoing evaluations or mention
how the results of these evaluations were used to prepare the current
strategic plan. SSA notes that many of the hundreds of process and program
evaluations conducted annually were designed to evaluate and improve
internal processes falling below the strategic level. However, some of the
ongoing evaluations are associated with specific strategic goals; thus,
their outcomes could be discussed in the context of the strategic goals
with which they are affiliated.

SSA's strategic plan contains a very limited discussion of its
interactions with other agencies that have similar goals or serve the same
beneficiaries. In an interview, SSA officials noted that SSA has extensive
interactions with other agencies on such issues as earnings accuracy and
medical information, but the level of interaction varies by initiative.
SSA's strategic plan would benefit from a broader discussion of these
interactions, especially if they were broken down by initiative. For
example, as part of the objective to increase the accuracy of earnings
records, SSA notes that it will collaborate with the Internal Revenue
Service (IRS) to achieve more

17U.S. General Accounting Office, Major Management Challenges and Program
Risks: Social Security Administration, GAO-03-117 (Washington, D.C.:
January 2003).

accurate wage reporting as part of its means and strategies to reduce the
size of the suspense file.18 It would be helpful if SSA offered more
details as to the nature and extent of its collaboration with IRS.

Observations on Changes in the Quality of DOT's Strategic Plan

In our review of DOT's July 1997 draft strategic plan, we found that the
plan only met three of the six elements required by GPRA.19 The plan did
not meet GPRA's requirements to describe (1) strategies for achieving the
goals, (2) a linkage between DOT's long-term goals and annual performance
goals, and (3) the external factors that could significantly affect DOT's
ability to achieve its goals. Further, for the three elements that the
plan did meet, each had weaknesses. In comparison, DOT's 2003-2008 draft strategic plan has improved in several areas we identified in our 1997 review.20 However, we still found areas where DOT could improve. Table 11
summarizes these findings.

 Table 11: DOT's Progress in Addressing Required Elements of Strategic Planning
                                   under GPRA

Mission statement
  Included in initial draft strategic plan: Yes. DOT's mission statement was comprehensive and covered its major functions and operations.
  Included in current draft strategic plan: Yes. The mission statement continues to cover its major functions and operations and more explicitly states DOT's statutory authority.

Long-term goals
  Included in initial draft strategic plan: Yes. Five long-term goals encompassed DOT's major functions and operations. However, it was not clear how DOT would measure success for most of its goals.
  Included in current draft strategic plan: Yes. Five strategic objectives encompass DOT's major functions and operations. Outcome goals and candidate performance measures for each strategic goal help show how DOT will measure success.

Strategies
  Included in initial draft strategic plan: No. While the plan listed six corporate management strategies for achieving its long-term goals, it did not describe the operational processes, the skills, the technology, and the resources required to meet them.
  Included in current draft strategic plan: Yes. Strategies are listed by strategic objective and include discussions on leadership, building expertise, and technology.

18The suspense file contains information on earnings that cannot be
matched to an individual's record due to an invalid name/Social Security
number combination.

19GAO/RCED-97-208R.

20At the time of our review, the Department of Transportation was in the
process of revising its strategic plan. A draft copy of the updated
strategic plan, dated July 1, 2003, was used for this review.

Relationship between long-term goals and annual goals
  Included in initial draft strategic plan: No. The plan did not describe how performance goals would be related to the long-term goals.
  Included in current draft strategic plan: Yes. The plan includes performance measures and refers to the performance plan for further information on annual performance goals.

External factors
  Included in initial draft strategic plan: No. Four external factors were identified, but other key factors were not included in the plan. Only one external factor was discussed in terms of how it could have affected DOT's ability to accomplish its goals.
  Included in current draft strategic plan: Yes. Several external factors are listed for each strategic objective. Generally, the plan gives descriptions of how these factors could affect the achievement of goals.

Evaluations
  Included in initial draft strategic plan: Yes. Program evaluations used in establishing goals and a schedule of future evaluations were discussed. However, the plan did not provide enough information to determine the scope and methodology or the key issues to be addressed in future evaluations.
  Included in current draft strategic plan: No. An extensive table describes the scope and methodology and the completion date of program evaluations for fiscal years 2003-2008. However, the plan does not specifically mention which or how previous evaluations were used in developing the plan.

Sources: GAO/RCED-97-208R and U.S. Department of Transportation, U.S.
Department of Transportation Draft Strategic Plan for Fiscal Years
2003-2008 (Washington, D.C.: 2003).

Strategic Plan Strengths and Improvements from Fiscal Year 1997 Plan

The mission statement contained in DOT's 2003-2008 draft strategic plan is
an improvement over the one contained in the 1997 draft plan. DOT's
mission, as stated in the 2003-2008 draft strategic plan, is "to develop
and administer policies and programs that contribute to providing fast,
safe, efficient, and convenient transportation at the lowest cost
consistent with the national objectives of general welfare, economic
growth and stability, the security of the United States, and the efficient
use and conservation of the resources of the United States." The mission
statement covers the major functions and operations of the department. In
our July 1997 report on DOT's 1997 draft strategic plan, we noted that the
mission statement could be improved by including language from the
department's enabling legislation to focus the mission statement more
directly on DOT's core activities. We gave an example of adding the
department's purpose to develop transportation policies and programs that
"contribute to providing fast, safe, efficient, and convenient
transportation at the lowest cost" from DOT's enabling legislation. The
mission statement in the 2003-2008 plan includes such language.

As in its 1997 draft strategic plan, DOT's 2003-2008 draft strategic plan
meets the requirement of GPRA to include long-term goals and objectives
for the major functions and operations of the department. The 2003-2008
draft strategic plan contains five strategic objectives (long-term goals)
that cover the major functions and activities of the department and are
results

Appendix III Observations on Agencies' Strategic Plans

oriented. Besides the strategic objectives of "safety," "mobility,"
"global connectivity," "environmental stewardship" and "security," the
current draft also contains an "organizational excellence" objective to
"advance the department's ability to manage for results and achieve the
goals of the PMA."

Each strategic objective section in the 2003-2008 draft plan contains
strategies for attaining DOT's outcomes and objectives. The strategies for
each strategic objective are listed in the categories of "leadership,"
"building expertise," and "technology." For example, a "technology"
strategy for the "mobility" strategic objective is to "examine ways to
encourage cargo movements by water through the development of barge and
fast vessel technologies to bring new capacity to our intermodal
transportation system." The plan states that this strategy supports DOT's
outcomes of reduced congestion in all modes of transportation and
increased reliability throughout the system. The strategies for each
strategic objective generally describe the operational processes, the
skills, and the technology required to meet DOT's goals and objectives.
The current draft strategic plan also states that the resources and
programs listed in DOT's annual performance plans and budgets are
necessary to achieve DOT's outcomes and to execute the strategies. In
contrast, the 1997 draft strategic plan provided insufficient information
to describe the operational processes, the skills, the technology, and the
resources required to meet DOT's long-term goals, as required by GPRA.

Also, each strategic objective section in the current draft plan contains
crosswalks between outcomes in the strategic plan and performance measures
in the annual performance plans and reports. These crosswalks show the
measures that will be used to measure progress in achieving most of DOT's
outcomes and strategic objectives. For example, the performance measure
"number of passengers in international markets with open skies aviation
agreements" is related in a crosswalk to the outcome "reduced barriers to
trade in transportation goods and services." Together, the measure and
outcome will show progress toward DOT's global connectivity strategic
objective. This is an improvement over DOT's 1997 draft strategic plan; in reviewing that plan, we noted that although supporting documents showed that DOT had developed information on how to measure each outcome goal, this information was not included in the draft.

In contrast to DOT's 1997 draft strategic plan, the 2003-2008 draft plan
lists several external factors for each strategic objective and generally
discusses how these factors could affect the department's ability to
achieve its outcomes and objectives. For example, one of the external factors for
DOT's environmental stewardship strategic objective is that DOT faces a
significant challenge to control and minimize air, water, and noise
pollution. The plan states that if DOT cannot control and minimize this
pollution, the department may encounter a public backlash that may impede
system improvement. For the external factors relating to the safety and
mobility strategic objectives, the plan lists both positive and negative
consequences the factors could have on achieving goals. One example would
be the possible effects the expansion and integration of the
telecommunications and e-commerce industry sectors could have upon
transportation safety. The plan states that this could affect the
achievement of DOT's safety objective by leading to unsafe practices, such
as the use of cell phones and other personal devices while driving. On the
other hand, these technologies could also contribute to safety by alerting
responders to the location of crashes and vehicles in distress. The 1997
draft plan identified four external factors and only discussed how one of
those factors could affect DOT's ability to accomplish its goals.

The current draft strategic plan includes an extensive table listing
program evaluations to be completed during fiscal years 2003-2008. The
table includes the name of the program to be evaluated, which strategic
goal(s) the program supports, the scope and methodology of the evaluation,
and the fiscal year during which the evaluation will be completed. DOT's
1997 draft strategic plan only listed the titles for the evaluations
scheduled for 1997 and 1998, which was insufficient to determine the scope
and methodology.

DOT's current draft plan lists crosscutting programs by strategic
objective. The discussions of crosscutting programs include the goal of
each program, which of DOT's outcomes each supports, and the agencies
involved. For example, the goal of aviation security is to prevent
explosives, weapons, and other dangerous items from being placed aboard
aircraft. This program supports DOT's security outcome of rapid recovery
of transportation in all modes from intentional harm and natural
disasters. DOT, through the Federal Aviation Administration, leads this
program, which involves the Transportation Security Administration,
Federal Bureau of Investigation, U.S. Customs Service, and U.S. Postal
Service, among others. Previously, the 1997 draft did not provide evidence
that DOT coordinated with other agencies that had programs and activities
that were crosscutting or similar to DOT's.

DOT's major management challenges, which we identified, are generally
discussed in the 2003-2008 draft strategic plan, organized by related
strategic objective. For example, we noted in our January 2003 report that
one major management challenge that DOT faces is building human capital
strategies.21 The current draft strategic plan includes a discussion of
human capital in DOT's organizational excellence objective. A separate
section of this discussion addresses our concerns regarding human capital
strategies and includes several milestones to address these concerns.
These milestones include conducting workforce planning for mission-critical occupations in fiscal year 2003, implementing a departmentwide performance management system beginning in fiscal year 2003, and adopting a uniform branding and marketing approach to attract, acquire, and retain diverse, high-quality talent. DOT's 1997 draft strategic plan did not adequately
address the major management challenges we had previously identified.

Critical Strategic Planning Issues Needing Further Improvement

As stated above, the 2003-2008 plan provides a clear picture of how
success will be measured for most of DOT's strategic objectives and
outcomes. However, this clarity is not provided for a few strategic
objectives and outcomes. We noted the same issue in our review of DOT's
1997 draft strategic plan. For example, in the current plan three of the
outcome goals for the global connectivity strategic objective lack
corresponding performance measures. These outcomes are enhanced
international competitiveness of U.S. transport providers and
manufacturers, harmonized and standardized regulatory and facilitation
requirements, and the most competitive, cost-effective and efficient
environments for passenger travel. The plan states that these measures are to be determined. Until they are, it is unclear how progress will be gauged, because the outcomes themselves do not lend themselves to direct measurement.

While the current strategic plan shows improvement in the schedule for
future program evaluations, it does not sufficiently describe the
evaluations used in establishing or revising DOT's strategic objectives.
DOT states that detailed descriptions of completed program evaluations are
presented in its 2002 performance and accountability report. Further, the
plan states that DOT considered the results of completed program

21U.S. General Accounting Office, Major Management Challenges and Program
Risks: Department of Transportation, GAO-03-108 (Washington, D.C.: January
2003).

evaluations, as well as reports from DOT's Inspector General and GAO, in
writing the strategies to achieve its strategic objectives and outcomes.
However, the plan does not describe which program evaluations were used to write the strategies, or how they were used.

Appendix IV

Observations on Agencies' Annual Performance Plans

Under GPRA, agencies are to prepare annual performance plans after the
development of their strategic plans. These annual plans are to establish
the connections between the long-term strategic goals outlined in the
strategic plans and the day-to-day activities of managers and staff. One
of our objectives was to assess the changes in the overall quality of
agencies' goals, strategies, and data articulated in their annual
performance plans. To meet this objective, we judgmentally selected six
agencies-Education, DOE, HUD, SBA, SSA, and DOT-using criteria such as
agency size, primary program types, and previous GAO reviews. To assess
the overall quality and improvements made to the agencies' performance
plans, we relied on requirements contained in GPRA and accompanying
committee report language,1 guidance to agencies from the OMB for
developing performance plans,2 best practices identified in our published
work,3 previous GAO evaluations,4 interviews with agency officials, and
our knowledge of agencies' operations and programs.

Key Elements of Information for Annual Performance Plans

Although GPRA does not require a specific format for the performance plan,
it does require the plan to (1) identify annual performance goals and
measures for each of an agency's program activities, (2) discuss the
strategies and resources needed to achieve annual performance goals, and
(3) provide an explanation of the procedures the agency will use to verify
and validate its performance data. We categorized each agency's plan based
on the degree to which it collectively addressed these three
elements.

To assess the degree to which an agency's plan provides a clear picture of
intended performance across the agency, we examined whether it included
(1) sets of performance goals and measures that address program results,
(2) baseline and trend data for past performance, (3) performance goals or

1Government Performance and Results Act of 1993, Committee on Governmental
Affairs, United States Senate, S. Rpt. No. 58, 103d Cong. 1st Sess.
(1993).

2OMB Circular No. A-11: Part 6, Preparation and Submission of Strategic
Plans, Annual Performance Plans, and Annual Program Performance Reports
(Washington, D.C.: June 2002).

3GAO/GGD/AIMD-99-215 and The Results Act: An Evaluator's Guide to
Assessing Agency Annual Performance Plans, GAO/GGD-10.1.20 (Washington,
D.C.: April 1998).

4GAO/HEHS-98-172R, GAO/RCED-98-194R, GAO/RCED-98-159R, GAO/RCED-98-180R,
GAO/RCED-98-200R, and GAO/HEHS-98-178R.

strategies to resolve mission-critical management problems, and (4)
identification of crosscutting programs (i.e., those programs that
contribute to the same or similar results), common or complementary
performance goals and measures to show how differing program strategies
are mutually reinforcing, and planned coordination strategies.

To assess the degree to which an agency's plan provides a specific
discussion of strategies and resources the agency will use to achieve
performance goals, we examined whether it included (1) budgetary resources
related to the achievement of performance goals, (2) strategies and
programs linked to specific performance goals and descriptions of how the
strategies and programs will contribute to the achievement of those goals,
(3) a brief description or reference to a separate document of the human
capital, information, and other resources required to achieve results,5
and (4) strategies to leverage or mitigate the effects of external factors
on the accomplishment of performance goals.

Finally, to assess the degree to which an agency provides confidence that
its performance information will be credible, we examined how each plan
discussed the quality of the data presented. To help improve the quality
of agencies' performance data, Congress included a requirement in the
Reports Consolidation Act of 2000 that agencies assess the completeness
and reliability of their performance data. Under the Act, agencies were to
include this assessment in the transmittal letter with their fiscal year
2000 performance reports. Agencies were also required to discuss in their
report any material inadequacies in the completeness and reliability of
their performance data and discuss actions to address these inadequacies.

5The Homeland Security Act (Pub. L. No. 107-296) also requires that
agencies provide a description of how the performance goals and objectives
are to be achieved, including the operations, processes, training, skills
and technology, and the human capital, information, and other resources
and strategies required to meet those performance goals and objectives.

For each of these elements, we characterized each agency's fiscal year
1999 and fiscal year 2004 plan in one of four ways, based on the degree to
which the plan contained informative practices associated with that
element. Thus, to address the first element concerning the degree to which
the plan provided a clear picture of performance, we characterized each
plan in one of four ways: (1) clear, (2) general, (3) limited, or (4)
unclear. To address the second element, on the extent to which a plan
includes specific discussions of strategies and resources, we
characterized each plan as (1) containing specific discussions of
strategies and resources, (2) general discussions, (3) limited
discussions, or (4) no discussions. Finally, to address the third element
on the extent to which a plan provides confidence that performance
information will be credible, we characterized each plan as providing (1)
full confidence, (2) general confidence, (3) limited confidence, or (4) no
confidence. In conducting our reviews, we compared our assessments of
agencies' fiscal year 2004 plans to our assessments of plans from fiscal
year 1999 using similar criteria.6 A more detailed discussion of our scope
and methodology and the criteria we used can be found in appendix I.

Table 12 summarizes our characterizations of the six agencies' annual
performance plans based on our current review of fiscal year 2004 plans
and our previously published reviews of 1999 plans. Although the
characterization of agency performance plans did not change significantly
between the 1999 and the 2004 plans, the majority of agencies' plans
showed some improvement.

6GAO/HEHS-98-172R, GAO/RCED-98-194R, GAO/RCED-98-159R, GAO/RCED-98-180R,
GAO/RCED-98-200R, and GAO/HEHS-98-178R.

Table 12: Characterizations of Agencies' Annual Performance Plans

                                Picture of          Strategies and       Data credible
                                intended            resources (no        (no, limited,
                                performance         discussions,         general, full)
                                (unclear, limited,  limited, general,
                                general, clear)     specific)
Agency                          1999      2004      1999      2004       1999      2004
Department of Education         Limited   General   Limited   General    Limited   General
Department of Energy            Limited   Limited   General   General    Limited   Limited
Department of Housing and
  Urban Development             Limited   General   Limited   General    Limited   General
Small Business Administration   Limited   General   Limited   General    Limited   General
Social Security Administration  Limited   Clear     Limited   General    No        General
Department of Transportation    General   Clear     General   Specific   Limited   Full

Sources: GAO/HEHS-98-172R; GAO/RCED-98-194R; GAO/RCED-98-159R;
GAO/RCED-98-180R; GAO/RCED-98-200R; GAO/HEHS-98-178R; and Department of
Education, FY 2004 Annual Performance Plan (Washington, D.C.: 2003);
Department of Energy, Annual Performance Plan, Fiscal Year 2004
(Washington, D.C.: 2003); Housing and Urban Development, Annual
Performance Plan, Fiscal Year 2004 (Washington, D.C.: 2003); Small
Business Administration, Budget Request & Performance Plan: FY 2004
Congressional Submission (Washington, D.C.: 2003); Social Security
Administration, Annual Performance Plan, Fiscal Year 2004 (Washington,
D.C.: 2003); and Department of Transportation, Fiscal Year 2004
Performance Plan (Washington, D.C.: 2003).

The remainder of this appendix discusses our observations on how the
quality of each of the agencies' annual performance plans we reviewed has
changed since the agencies submitted their first performance plans in
1999. We did not independently verify or assess the information we
obtained from agency annual performance plans. If an agency chose not to
discuss its efforts concerning elements in the plan, it does not
necessarily mean that the agency is not implementing those elements.

Observations on Changes in the Quality of Education's Annual Performance
Plan

Education's fiscal year 2004 annual plan7 provides a general picture of
intended performance across the agency-an improvement over the 1999
plan-because the measures and indicators adequately indicate progress
toward meeting annual targets; the measures are objective, measurable, and
quantifiable; and baseline and trend data are included where available.
However, the relationship between the goals and measures in volume 2 and
the long-term goals in volume 1 is not clear, and the plan does not make
clear whether, and if so how, all of the program activities in the
department's budget are covered by the annual performance goals.8 In
another improvement over the 1999 plan, volume 1 of the 2004 plan provides
a general discussion of Education's strategies and resources to achieve
its goals by presenting strategies and resources and the projected
distribution of fiscal year 2004 funding and staffing for each long-term
goal. The plan also adequately recognizes and discusses external factors
that could affect the department's performance. However, the resources and
many of the strategies are not directly linked to the achievement of
individual annual performance goals and no strategies or resources are
designated for the goals and measures in the program performance plans in
volume 2. Lastly, the 2004 plan provides general confidence that agency
performance information will be credible. The 2004 plan contains
information on data sources for most of its measures, and for some,
identifies limitations. The plan also includes an appendix entitled
"Information Quality Guidelines" which recognizes data quality as a major
challenge for the agency and says that its improvement is a top priority.

7Education's plan states that its fiscal year 2004 annual plan includes
both department-level measures and program performance plans. These are
organized into two volumes: the Annual Plan Fiscal Year 2004 U.S.
Department of Education includes the department-level measures and the FY
2004 Program Performance Plan: U.S. Department of Education includes the
program performance plans with their individual program measures. These
volumes are presented in a slightly different electronic format for the
public and other parties in general, which is available at Education's Web
site. The two volumes will henceforth be referred to as Education's 2004
annual plan, or, where applicable, volume 1 and volume 2.

8Education's 2004 annual plan represents its annual performance goals as
"targets." According to GPRA, the definition for "performance goal" is "a
target level of performance."

Education's Fiscal Year 2004 Performance Plan Provides a General Picture
of Intended Performance

Education's 2004 annual plan generally defines expected performance.
Measures and indicators are formulated so as to adequately indicate
progress towards meeting annual targets, seem to sufficiently cover key
performance aspects, and adequately capture important program
distinctions. The plan contains about 360 measures between volumes 1 and
2, an improvement over the 860 contained in its 1999 plan, a number we
judged potentially excessive for an annual performance plan and a possible
hindrance to Education's ability to assess its performance. Unlike in
our review of the department's 1999 annual plan, the measures and
indicators in the 2004 plan are objective, measurable, and quantifiable.
For example, most measures and indicators are set up to measure
percentages, cost, counts, or other numerical values with measurable,
quantifiable 2004 targets. In most cases where a measurable target is not
given, the plan provides a reasonable explanation, such as a new program
or the measure being new, or a case where data are not collected or
available each year. In most cases, the plan provides trend data for
measures, which provides a helpful context for assessing the relevance of
the 2004 targets, or an explanation of why such data were not provided
(e.g., the baseline has not yet been established because the measure
and/or program are new).

In our review of Education's 1999 annual plan, we said that greater
outcome measure use would make future annual plans more useful. The 2004
plan frequently includes outcome goals and measures, such as a measure for
the number of states meeting their eighth-grade mathematics achievement
targets under the long-term goal to improve mathematics and science
achievement for all students.

Volume 1 of Education's 2004 annual plan directly aligns strategies,
action steps, measures, and targets with each of Education's long-term
goals and six strategic goals,9 containing the same strategic goals,
long-term goals, and mission as the 2002-2007 strategic plan. In our
review of the 1999 plan, we also found that the plan had performance goals
in volume 1 that were directly linked to its mission, strategic goals, and
objectives. However, the relationship between the goals and measures in
volume 2 and the long-term goals in volume 1 was not made clear in the
department's 2004 annual plan, which was similar to what we found in our
review of Education's 1999

9In this report, we refer to the multiyear, long-term objectives in
Education's annual plan as "long-term goals." The strategic goals included
in the plan represent overarching statements of aim or purpose that are
used to group Education's long-term goals.

plan-that the department could more directly link the goals and measures
in volume 2 with the strategic objectives (long-term goals). Education's
fiscal year 2002 Performance and Accountability Report includes a table
making it clear that these programs and their goals and measures are
aligned across Education's strategic and long-term goals. By including
such a table in its annual performance plan, Education could clearly show
the link between the goals and measures in volume 2 and the long-term
goals in volume 1.

Although volume 2 of the 2004 annual plan states that it contains
individual program performance plans for all major programs and many
smaller programs, the annual plan does not make clear whether, and if so
how, all of the program activities in the department's budget are covered
by performance goals. In contrast, we found that the 1999 plan provided
sufficient information to determine which performance goals and measures
in volume 2 covered which program activities in the budget and whether all
were covered. For example, the 1999 plan contained tables indicating the
funding levels for the program activities in the department's budget and
how those activities related to the programs in volume 2.

In our review of Education's 1999 annual plan, we gave the agency credit
for addressing the need to coordinate with other federal agencies having
related strategic goals or performance goals. However, we further noted
that Education could build on its foundation by identifying performance
goals that reflect activities being undertaken to support programs of a
crosscutting nature and specifying the activities each agency would
undertake and what it expects to achieve within the fiscal year. While
selected action steps in the 2004 plan refer to instances where the
department will coordinate or cooperate with other federal agencies, the
plan does not include steps or goals for most crosscutting issues
identified in Education's strategic plan. For example, for a crosscutting
issue identified in the strategic plan on safe and drug-free schools and
communities, Education said it partners with the Departments of Justice
(Justice) and HHS to promote drug and alcohol education programs and to
disseminate information to schools and private organizations. The
department also coordinates closely with the Office of National Drug
Control Policy, and works closely with the Office of Juvenile Justice and
Delinquency Prevention Programs to share innovative ideas and promote
prevention strategies and programs. The relevant annual plan sections in
both volumes 1 and 2 do not identify goals or action steps related to
these interactions. Additionally, according to our report on Education's June
1997 draft strategic plan,10 the department has numerous crosscutting
programs and activities, such as those related to early childhood and
employment training, and the 2004 annual plan does not address them all.

Education's 2004 annual plan discusses applicable goals, measures, and
strategies for two governmentwide major management challenges regarding
strategic human capital management and information security, as well as
three of the four major management challenges we identified for Education
in our January 2001 Performance and Accountability Series.11 For example,
for its student financial assistance programs, the department has
developed performance targets for fiscal year 2004 related to being
removed from our high-risk list, increasing the default recovery rate, and
decreasing grant overpayments to students. Also, a key strategy under its
goal to improve the strategic management of the department's human capital
is to develop a 5-year human capital plan, including developing a
recruitment plan and relevant training programs. For the fourth of
Education's major management challenges-promoting coordination with other
federal agencies and school districts to help build a solid foundation of
learning for all children-the 2004 plan did not include specific goals or
measures, but it did discuss some related strategies and steps, such as
using partnerships with other federal programs to promote development of
intervention strategies and methods to address the high incidence of
learning disabilities and illiteracy among adolescents attending high
schools.

Education's Fiscal Year 2004 Performance Plan Provides a General
Discussion of Strategies and Resources

In our review of Education's 1999 annual plan, we found that the plan had
a limited discussion of how the department's strategies and resources
would help achieve its annual performance goals. The 2004 plan includes
strategies and resources under each of its long-term goals to be used to
achieve its annual performance goals in volume 1, including the projected
distribution of fiscal year 2004 funding and staffing, in both dollars and
full-time employees (FTE), for each long-term goal, under which the annual
performance goals are organized. However, the resources and many of the

10GAO/HEHS-97-176R.

11U.S. General Accounting Office, Performance and Accountability
Series-Major Management Challenges and Program Risks: A Governmentwide
Perspective, GAO-01-241 (Washington, D.C.: January 2001) and GAO-01-245.

strategies are not directly linked to the achievement of individual annual
performance goals and no strategies or resources are designated for the
goals and measures in the program performance plans in volume 2. Overall,
the plan does not discuss how resources were allocated to each goal, a
rationale for how the resources will contribute to improving performance,
or the relationship of capital asset investments, including those for
information technology (IT), to the achievement of specific goals.
However, the plan does include a performance measure and goal for the cost
and schedule of IT investments and a strategy for completing the
department's enterprise architecture, which is to be used to guide IT
capital decisions. In addition, the department has a plan for human
capital management and a new performance appraisal system that is meant to
link employee performance standards to the department's strategic
priorities, but neither has been fully implemented.

In our review of the 1999 plan, we said that external factors that could
affect performance were not discussed and that such factors are important
for a department like Education because much of what it hopes to achieve
depends on others and external events. In its 2004 plan, Education clearly
acknowledges that improving support for its state, local, and
institutional partners, who have the direct ability to influence outcomes
the department seeks, is a major challenge. The plan contains numerous
activities to address this challenge, including providing support and
technical assistance, improving grant monitoring, and funding an annual
survey of states' efforts. Moreover, although not labeled as
addressing external factors, the plan has specifically related strategies
and/or action steps for most external factors identified in Education's
2002-2007 strategic plan.

Education's Fiscal Year 2004 Performance Plan Provides General Confidence
That Performance Data Will Be Credible

In our review of Education's 1999 annual plan, we found that it did not
provide sufficient confidence that its performance information would be
credible. For example, the 1999 plan did not sufficiently recognize
limitations in Education's data for its elementary and secondary education
programs. In comparison, Education's 2004 plan recognizes limitations in
Education's data for many of its elementary and secondary education
programs, as well as for other programs. In many of these cases, the plan
also discusses plans to address these limitations. Also, the plan includes
an appendix containing an abbreviated form of its "Information Quality
Guidelines" and a sample checklist for statistical data from its complete
guidelines. The appendix recognizes data quality as a major challenge to
the department's successful implementation of GPRA and says that the
improvement of data quality is a top priority. The checklist includes
several steps related to the verification and validation of data, such as
evaluating data quality, including known limitations; addressing the
reliability of data sources; and ensuring reproducibility of findings
using the same data and methods of analysis. In addition, the plan usually
identifies the sources of data and, although not in volume 1, includes a
column on sources and data quality for each measure in volume 2, usually
with an item entitled "Validated By" and, in some cases, "Limitations." In
the end, the lack of direct control over the implementation of its
programs, including the collection of data, is a significant data quality
challenge that Education must face.

The 2004 annual plan also contains several action steps on new or changing
information systems that relate to improving the collection of information
for measuring performance. For example, under its strategy to reduce
Education's partners' data reporting burden, the plan includes an action
step to develop and implement the Performance-Based Data Management
Initiative collection system. This step is directly related to the plan's
measure to reduce the burden hours of Education program data collections
per year.

Observations on Changes in the Quality of DOE's Annual Performance Plan

Compared to the fiscal year 1999 plan we reviewed, DOE's fiscal year 2004
performance plan continued to provide a limited picture of intended
performance. Although the plan included more results-oriented annual
performance measures, it still provided a limited linkage between its
reported annual goals and its mission, strategic plan goals, and program
activities within its budget request. Furthermore, the 2004 plan provided
a general discussion of strategies and resources, similar to our 1999
findings. Finally, the 2004 plan provided a limited level of confidence
that data will be credible by making little progress in reporting on the
procedures it uses to ensure data quality or identifying significant data
limitations, which is consistent with our 1999 findings.

DOE's Fiscal Year 2004 Performance Plan Provides a Limited Picture of
Intended Performance

While DOE has improved its development of annual performance
measures-referred to as targets-by making them more results oriented, the
overall picture of performance is limited by the lack of alignment between
its annual and strategic goals and minimal discussion of coordination with
other agencies. Our review of DOE's 1999 plan found that many measures
were unclear, appeared limited in scope, or were not very useful
indicators of performance. We found these problems in the performance
plans for subsequent years as well. For the 2004 plan,
however, the majority of the performance measures related to each goal
were results oriented and pertained specifically to the performance of
fiscal year 2004. One measure, for example, requires DOE to train 4,000
federal employees by the end of fiscal year 2004 in energy management best
practices that support National Energy Policy education goals.

DOE provided a limited link between its reported annual goals and its
mission, strategic plan goals, and program activities within its budget
request. While the 2004 annual performance plan goals address all of the
major program activities in DOE's budget, the goals and mission of the
2004 plan do not align with the mission and goals for the 2003 draft
strategic plan. This represents a setback because in our review of DOE's
1999 annual performance plan, we found that DOE clearly linked its annual
goals to the agency's mission, strategic plan goals, and its program
activities within its budget request. DOE officials told us the lack of
linkage between the performance plan and the strategic plan was a matter
of timing. According to these officials, the department originally updated
its strategic plan at the same time as the annual performance plan, which
was finalized in the early months of 2003, and the goals of each plan
coincided, but the draft strategic plan goals were revised in the latter
part of the year and no longer align with the 2004 performance plan.

DOE's ability to show coordination with other agencies is also limited. In
1999, we reported that DOE did not adequately show that it coordinated
with other agencies that have related strategic or performance goals.
DOE's 1999 plan contained very little evidence of specific goals and
measures that addressed crosscutting programs and only briefly described
coordination with other agencies. The 2004 plan does not specifically
describe how coordination is taking place among crosscutting programs, but
does identify groups that it is collaborating with on certain programs. In
response, DOE officials told us the plan does not discuss what specific
collaboration activities are taking place because doing so would require
reporting too much detail for a performance plan. DOE officials stated
that the department collaborates at the program level, rather than the
agency level, because the program plans pertain to an organizational layer
below the annual performance plan.

Finally, the plan briefly mentions that the department has been
identifying challenges and working on ways to address them. According to
DOE officials, when developing annual targets for the department,
management challenges are considered but not mentioned specifically in the
report. Our review of management challenges in 2002 found that DOE had
addressed all eight of its challenges in its 2003 annual performance
plan.12 In comparing these challenges to the 2004 plan, we found that DOE
continues to have goals that address the eight challenges we identified in
2002 and the additional challenges that we identified in our 2003
performance and accountability report.13

DOE's Fiscal Year 2004 Performance Plan Provides a General Discussion of
Strategies and Resources

DOE provided a general discussion of the strategies and resources that it
will use to achieve its annual performance goals. DOE's 1999 plan only
partially provided clear and reasonable strategies for achieving
performance goals, descriptions of how those strategies would contribute
to achieving the goals, and key external factors that might affect
performance.
For each of the 2004 annual performance goals, DOE included a "Means and
Strategies" section in the plan that described how each goal will be
achieved. For example, one strategy identified to meet its goal of
contributing unique, vital facilities to the biological and environmental
sciences is to conduct peer reviews of the facilities to assess the
scientific output, user satisfaction, and the overall cost-effectiveness
of each facility's operations, and their ability to deliver the most
advanced scientific capability. The 2004 plan includes a brief discussion
of the department's overall needs, particularly in the areas of human
capital, financial, and logistical resources. The plan also identified
budget amounts for each of its goals. DOE's 1999 plan partially identified
the resources needed to accomplish annual performance goals.

The plan also provided a general discussion of the external factors that
could affect achievement of the goals, but it did not specifically discuss
actions on how the external factors will be addressed. For example, the
plan states that external factors related to DOE's goal of achieving
reliable, affordable, and environmentally sound energy supplies, such as
renewable fuels, include program funding, the state of the economy, the
availability of

12GAO-03-225.

13GAO-03-100.

conventional supplies, the cost of competing technologies, and the
continuation of federal tax incentives and other national-level policies.

DOE's Fiscal Year 2004 Performance Plan Provides Limited Confidence That
Performance Data Will Be Credible

DOE has made limited progress on reporting the procedures it uses to
ensure data quality. Its 1999 plan described how DOE would ensure that its
performance information is sufficiently complete, accurate, and
consistent, but did not discuss in detail DOE's procedures for helping
ensure the quality of data or the process of collecting the data. The plan
also did not identify significant data limitations and how they may affect
DOE's ability to achieve performance goals. However, the 2004 plan showed
some improvement over the 1999 plan by describing credible procedures to
verify and validate performance information and by mentioning specific
program evaluations for each goal. The plan also notes that DOE
acquired new commercial software for performance tracking through remote
data entry, monitoring, and oversight by program offices and managers. The
2004 plan identifies data limitations and any new or modified systems only
briefly, and only for a few relevant goals. According to DOE officials, with a
few exceptions, its plans do not discuss data limitations because DOE
writes goals that are not affected by data limitations. The goals are
written to ensure that the data needed to assess performance against
targets will be available.
However, our 2003 performance and accountability series identified several
DOE management challenges where data quality was a concern, such as
further upgrades needed for cyber security to ensure adequate protection
of data and information systems and additional information on the results
of contractors' performance to keep projects on schedule and within
budget.

Observations on Changes in the Quality of HUD's Annual Performance Plan

HUD's annual performance plan for fiscal year 2004 improves upon areas
where we previously reported shortcomings and generally meets the criteria
set forth in GPRA. HUD's 2004 plan provides a general picture of intended
performance by covering all the programs contained in HUD's budget and
linking program activities to strategic goals and objectives. The plan
also improved by providing specific information on HUD's strategies and
activities along with performance measures it will use to assess progress
toward its goals and discussing relevant external factors that could
affect the attainment of certain program objectives. HUD also provides
greater confidence that performance data will be credible by thoroughly
discussing the data it will use for measuring progress toward its goals.
Nevertheless, the plan could be further enhanced if it included more
specific information on how funds will be allocated to achieve program
objectives, explained how HUD will contribute to crosscutting efforts with
other agencies, and described what steps it will take to mitigate the
impact of external factors on its programmatic objectives.

HUD's Fiscal Year 2004 Performance Plan Provides a General Picture of
Intended Performance

Since our review of HUD's annual performance plan for fiscal year 1999,14
HUD has made progress in developing an annual performance plan that
generally reflects the department's mission and provides a general picture
of intended performance. HUD's most recent performance plan covers the
program activities contained in its budget, and generally links program
activities to strategic goals and objectives, key items missing from its
plan for fiscal year 1999. HUD has also improved the quality of its
performance plan by including performance measures that generally indicate
how the department will gauge progress toward achieving its goals. For
example, the performance plan lists a series of performance measures for
each objective that can be used to indicate progress towards the
department's goals and expected performance. These measures are also
objective and a number of them have been quantified, another key area
where HUD has improved since its first performance plan. For example,
activities supporting HUD's long-term strategic objective to "Improve the
Physical Quality and Management Accountability of Public and Assisted
Housing" include, among other things, eliminating 100,000 units of the
worst public housing. According to the current plan, the department
intends to demolish 10,000 of these units in fiscal year 2004.

While HUD's most recent annual performance plan generally identifies other
agencies it will coordinate with to address crosscutting efforts, it does
not discuss how it plans to work with these agencies to address these
crosscutting activities. For example, the plan states that the Interagency
Working Group on Limited English Proficiency will ensure that persons with
limited English proficiency will have meaningful access to funded and
federally conducted programs and activities. However, the plan does not
discuss what HUD's contribution to this multiagency effort will be, what
strategies it will employ, or how it will measure progress toward
achieving the strategies of this multiagency effort.

14GAO/RCED-98-159R.

HUD's current performance plan is an improvement compared to its fiscal
year 1999 plan as it describes steps HUD will take to address major
management challenges. One of HUD's strategic goals, "Embrace High
Standards of Ethics, Management, and Accountability," identifies five
objectives that cover management challenges, some of which have been
raised by GAO and the HUD IG. These objectives discuss plans to rebuild
HUD's human capital and diversify its workforce; improve HUD's management
and internal controls and resolve audit issues; improve accountability,
service delivery, and customer service; ensure program compliance; and
improve internal communication and employee involvement.

HUD's Fiscal Year 2004 Performance Plan Provides a General Discussion of
Strategies and Resources

HUD's most recent performance plan also improves upon earlier plans we
reviewed in providing readers an idea of the strategies that HUD will
employ to carry out its goals. Each strategic goal in the annual
performance plan contains a section titled "Means and Strategies," which
describes activities HUD will pursue to support that goal. For example, to
support its goal of "Increasing Homeownership Opportunities," HUD will
fund low-income homeowner assistance programs to provide approximately
40,000 families with down payment and closing cost assistance on their homes,
473,199 families with home purchase and homeownership counseling, and
about 232,370 families with rental counseling.

HUD's performance plan also discusses the strategies it will employ to
address the department's human capital issues, such as the upcoming
potential wave of employees planning to retire and the need to equip staff
with the desired knowledge and skills. For example, HUD completed a staff
resource estimation and allocation system in 2002, and it will conduct a
comprehensive workforce analysis in 2004 to serve as the basis to fill
mission-critical skill gaps through succession planning, hiring, and
training initiatives in its Five-Year Human Capital Management Strategy.

Although HUD has made progress in linking its resources to strategies, it
could improve the discussion by linking funding allocations to specific
performance goals, thus making the plan more informative. The plan
discusses budget and staff allocations for programs supporting each
strategic goal. For instance, portions of HUD's Community Development
Block Grants Fund and Home Investment Partnership Program, with a combined
budget authority for fiscal year 2004 of more than $2.5 billion and staff
of 203, support the strategic goal of promoting "Decent Affordable
Housing." However, it is unclear what resources will be used to pursue

         Appendix IV Observations on Agencies' Annual Performance Plans

specific performance targets for each program. Additionally, HUD does not
mention in its plan how IT and capital resources will be used to support
its programs.

Anticipating that some aspects of the department's strategic goals are
intertwined with broader phenomena, the performance plan also discusses
several external factors relevant to each strategic goal that could affect
HUD's ability to meet its objectives. For example, for its strategic goal
"Promote Decent and Affordable Housing," HUD states that broad economic
factors can affect opportunities for low-income workers relying on the
department for rent assistance to make progress towards self-sufficiency.
However, it is unclear from the performance plan what actions, if any, HUD
has put in place to mitigate the effect of these external factors.

HUD's Fiscal Year 2004 Performance Plan Provides General Confidence That
Performance Data Will Be Credible

HUD has also made significant progress in providing assurance that the
department will be able to use credible data to gauge progress towards
achieving its goals. HUD identifies the steps it (or others) will take to
verify and validate the performance data to ensure that what is reported
on HUD's performance will be credible. For example, for its objective
"Increasing Minority Homeownership," HUD will rely on, among other
indicators, the rate of minority homeownership from the Current Population
Survey conducted monthly by the U.S. Census Bureau. HUD will not verify
the data because the Bureau performs that task. Additionally, HUD
includes in its most recent performance plan a discussion of the inherent
limitations of the data it will use and generally discusses steps it will
take to improve the measure, providing the reader with a clearer
expectation of what HUD will be able to report.

Observations on Changes in the Quality of SBA's Annual Performance Plan

SBA's 2004 performance plan shows progress made over the agency's 1999
performance plan. In contrast to our review of SBA's 1999 plan,15 the 2004
plan provides a general picture of intended performance by discussing
coordination between SBA and other federal agencies on crosscutting
activities. Resource analysis sections throughout the plan provide a
general discussion of how SBA has previously used its resources to achieve
its goals and how it intends to use future resources for the same
purposes. The 2004 plan also provides general confidence that SBA's
performance

15GAO/RCED-98-200R.

data will be credible by including more detail on how SBA verifies and
validates its data, as well as by identifying data limitations. However,
several areas of the plan could be improved, such as clearly linking SBA's
performance indicators, performance goals, and programs.

SBA's Fiscal Year 2004 Performance Plan Provides a General Picture of
Intended Performance

SBA's fiscal year 2004 performance plan provides a general picture of
intended performance. The performance goals and performance indicators in
the plan are generally objective, measurable, and quantified. Performance
indicators are listed throughout the plan by related programs and
strategic programmatic goals. Performance goals and outcome goals are
listed for each strategic programmatic goal. In our review of SBA's fiscal
year 1999 performance plan, we noted that SBA's performance goals were
objective and measurable, its performance measures were generally
objective and quantified, and that the performance goals in the plan were
clearly linked to SBA's strategic goals and objectives.

Like the performance measures contained in its 1999 plan, the 2004 plan's
performance indicators will be useful in assessing progress towards SBA's
performance goals. For example, the performance indicator "Regulatory Cost
Savings to Small Business" will adequately show progress for the
corresponding performance goal "Cost savings for small business due to the
efforts of the Office of Advocacy." In this example, the performance
indicator, which is listed under the Advocacy Program, can be linked to a
performance goal because of a crosswalk that relates outcome goals,
performance goals, and programs. However, there is not always such a clear
link between all of SBA's performance indicators and performance goals
because indicators are listed by program instead of by performance goal.
For example, the BusinessLaw.gov program is linked to three performance goals: "number
of users of BusinessLaw.gov," "reduced cost to businesses and regulatory
agencies," and "increased rate of compliance." While the first two
performance goals appear related to the first two indicators listed in the
BusinessLaw.gov program section, there is no clear relationship between
any of the other performance indicators for this program and the third
performance goal, "increased rate of compliance."

SBA's 2004 performance plan contains annual performance goals that
generally cover the agency's budget activities. The 2004 performance plan
contains a budget crosswalk that "shows how the goals relate to specific
and general program areas." This is an improvement over the 1999 plan,
which we noted contained a budget crosswalk, but the categories in it did
not match SBA's budget accounts or activities by name or account number.

However, the performance plan does not seem to cover all of SBA's
programs. Three "advocacy" programs listed in the crosswalk do not seem to
be contained in the plan: Business.gov, Disability Initiative, and
National Women's Business Council.

Each strategic programmatic goal section in the 2004 plan contains a
discussion of crosscutting issues. Several examples of coordination
efforts are given, such as SBA working with the Department of Defense to
integrate the PRO-Net system with the Central Contractor Registry and SBA
partnering with the Federal Acquisition Institute to develop on-line
training courses for small business programs. In contrast, SBA's 1999
performance plan provided little information on SBA's coordination efforts
with other entities whose programs and activities crosscut those of SBA.

SBA's 2004 performance plan generally addresses performance and
accountability challenges we have previously identified. For example, we
have previously stated that SBA needs to strengthen its performance in
human capital management. The 2004 plan includes outcome goals,
performance goals, and programs to address SBA's strategic management of
human capital in a section on the PMA.

SBA's Fiscal Year 2004 Performance Plan Provides a General Discussion of
Strategies and Resources

The 2004 performance plan provides a general discussion of the strategies
and resources SBA will use to achieve its goals. Each strategic
programmatic goal and each of the goals for the PMA contains a discussion
of the strategies for accomplishing the goals. These discussions provide a
broad overview of the strategies used at the strategic programmatic goal
level. For example, the plan includes a strategy for SBA's strategic
management of human capital. The strategy lays out SBA's Transformation
and Human Capital plans, which will be used to implement a new vision of
SBA. In its 1999 performance plan, SBA discussed strategies for most of
its performance goals, although the strategies were missing for some of
the goals.

Each strategic programmatic goal and several of the PMA goals contain
brief discussions of external factors that could affect the achievement of
SBA's goals. These discussions include actions to address external
factors, such as working with an Interagency Acquisition Working Group
under the Procurement Executives Council to develop supplemental
performance measures to better evaluate the success of its programs. In
1998, we noted that SBA's 1999 plan recognized certain external factors
and contained a discussion of actions SBA could take to mitigate the
effects of such factors for one of its strategic goals. We stated that it
would be useful for SBA
to include a similar discussion of external factors and mitigation
strategies for its other strategic goals.

Each program listed throughout the plan has a resource analysis section
that describes how resources were used in fiscal year 2002. Some of these
analyses also include planned resources for fiscal year 2004. For example,
the resource analysis section for the Small Business Development Centers
(SBDC) program states that for fiscal year 2004 SBA requested
approximately the same level of funding as in fiscal year 2002. In 2002,
85 percent of the funds were for grants, while the other 15 percent
covered field support, program management, and overhead costs such as
rent, legal services, human resources, and information technology support.
Some of the resource analyses also contained pie charts of the breakdown
of costs. This is an improvement over SBA's 1999 performance plan, which
we found did not specifically identify the human or technological
resources that SBA would need to achieve its performance goals.

SBA's Fiscal Year 2004 Performance Plan Provides General Confidence That
Performance Data Will Be Credible

SBA's 2004 performance plan provides general confidence that its
performance data will be credible. An appendix of the performance plan
contains verification and validation information, as well as data
limitations and remedies for these limitations for most of SBA's
performance indicators. However, the appendix does not include this
information for the performance indicators of the disaster loan program,
nor are any of the PMA performance indicators discussed in the appendix.

Generally, the discussions of SBA's verification and validation processes
for its indicators in the 2004 plan are one- or two-sentence statements.
For one of the indicators, "number of jobs created and retained by the
7(a) loan program," SBA states that it does not have access to the data
for verification purposes. SBA also notes that it does not independently
verify some of the external data it gathers, as is stated in the
verification discussion of the indicator, "504 loans to emerging market
firms." This is an improvement over SBA's 1999 performance plan, which
included brief descriptions, often only one or two words, on the means it
used to verify and validate its data. We noted in our report on the 1999
plan that these appeared to be sources of data for the measures rather
than means to verify and validate the data.

The data limitations contained in SBA's 2004 performance plan are
generally one-sentence statements, and the same limitations are used for
multiple indicators. For example, several limitations, such as "the
measure is based on the number of approved loans" or "information is
derived from loan approval data," are used for multiple indicators. The
appendix also lists remedies for the data limitations for each indicator.
For example, a limitation of SBA's indicator "small business appointments
conducted with procurement officials" is that the indicator may not
capture unscheduled appointments. The remedy for this limitation is to
keep track of both scheduled and unscheduled appointments. The discussion
of data limitations and their remedies in the 2004 plan shows progress
over SBA's 1999 plan, which did not contain a discussion of data
limitations.

Observations on Changes in the Quality of SSA's Annual Performance Plan

Compared to the fiscal year 1999 plan we reviewed, SSA's performance plan
provided a clear picture of intended performance by (1) defining expected
performance, (2) offering trend data, which helps track progress toward
performance goals, and (3) using objective, measurable, and quantifiable
performance measures. SSA also provided general information on its
strategies and resources, a modest improvement over what we found in 1999. Finally,
the plan provided a general level of confidence that data will be credible
by describing the Inspector General's (IG) involvement in data testing,
providing data sources and definitions, and identifying some data
weaknesses, an improvement over the 1999 plan. However, SSA's performance
plan still does not fully discuss the agency's coordination with other
agencies, identify performance goals that clearly cover all the program
activities, address how SSA plans to use the information from the
evaluations to improve program results, identify the resources needed to
address each performance goal, and discuss data verification and
validation procedures for its internal systems.

SSA's Fiscal Year 2004 Performance Plan Provides a Clear Picture of
Intended Performance

Overall, SSA's fiscal year 2004 performance plan has improved over its
1999 plan.16 The 1999 plan only provided a partial picture of SSA's
intended performance across the agency. In June 1998, we reported that
SSA's 1999 Annual Performance Plan contained performance goals, many of
which were measurable and linked to the agency's strategic goals17 and
objectives; some of the performance goals related to particular strategic
goals were objective, measurable, and quantifiable. However, other goals
were not measurable or quantifiable and did not define the level of
performance to be achieved, thus making it difficult to see how SSA would
assess success.

SSA's fiscal year 2004 plan provides a much clearer picture of intended
performance through (1) defining expected performance, (2) offering trend
data, which helps track progress toward performance goals, and (3) using
objective, measurable, and quantifiable performance measures. For
example, as part of the strategic objective to "Prevent fraudulent and
erroneous payments and improve debt management," SSA provided historical
data on the outcome measure "Percent of SSI payments free of preventable
error (overpayments and underpayments)" from fiscal years 1999-2001 and
projected goals for fiscal years 2002-2004.

While we found significant improvements in SSA's 2004 annual performance
plan over its 1999 plan, we also found some weaknesses. For example,
coordination efforts with other entities, such as federal agencies, state
and local entities, and others, are not well identified. According to SSA
officials, SSA coordinates with other federal agencies, such as the IRS
and the Immigration and Naturalization Service, as well as Veterans
Administration, on information-sharing initiatives. However, these types
of coordination efforts are mentioned only briefly, if at all, in the 2004
annual performance plan.

In its 2004 plan, SSA includes a list of major program evaluations it
plans to conduct during 2003-2004, with a brief description of the
evaluations, their associated strategic goals, and projected completion
dates. However, there is no indication of how SSA plans to use the
information from the evaluations

16Much of this improvement took place between the 1999 and 2000 plans. We
reported that SSA's fiscal year 2000 performance plan showed significant
improvement over its 1999 plan in U.S. General Accounting Office,
Observations on the Social Security Administration's Fiscal Year 2000
Performance Plan, GAO/HEHS-99-162R (Washington, D.C.: July 20, 1999).

17SSA reduced its strategic goals from five to four in its 2003-2008
strategic plan.

to improve program results. The plan could be enhanced if the descriptions
of these evaluations included the manner in which SSA planned to use the
information gathered to improve its programs.

SSA's Fiscal Year 2004 Performance Plan Provides a General Discussion of
Strategies and Resources

SSA's 1999 plan had little discussion of the relationship between SSA's
mission, goals, and budget activities. Throughout the document, the fiscal
year 2004 plan included clearer discussions of the linkage between SSA's
mission and goals. It also provided performance data dating back to fiscal
year 1999, essential to making comparisons between prior and proposed
levels of performance. The 2004 performance plan noted that the Limitation
on Administrative Expenses account, SSA's basic administrative account, is
an annual appropriation that covers everything from salaries and benefits
of SSA federal employees (excluding IG) to systems and telecommunications
activities. SSA provided information on the funding sources of this
account, including some of its budget accounts.

In its fiscal year 2004 plan, SSA provided information on the strategies
it plans to use in addressing its key strategic objectives. The plan
included a summary chart, showing the strategic objectives associated with
each strategic goal, as well as the performance measures under each
objective. In addition, the "means and strategies" section associated with
each strategic objective identified strategies that support items in the
PMA, GAO and IG major management challenges, and Social Security Advisory
Board recommendations. In our October 2002 report Performance and
Accountability: Reported Agency Actions and Plans to Address 2001
Management Challenges and Program Risks,18 we noted that SSA identified
directly related goals and measures for five of its six challenges, and
had strategies (without goals or measures) for the sixth challenge.

It is difficult to determine whether or not the annual performance plan
identifies annual performance goals that cover all of the program
activities in the agency's budget, as well as the financial, human
capital, and information technology resources needed to address each
individual goal. General human capital requirements and goals are
identified as part of SSA's strategic goal to strategically manage and
align staff to support SSA's mission. The plan is structured neither by
program activity nor by account. SSA noted that it aligned its strategic
goals, performance measures, and

18GAO-03-225.

budget with its major functional responsibilities rather than by program
accounts since direct service and support employees provide services
linked to these functional responsibilities, as opposed to a specific
program. However, SSA does not indicate what it means by "functional
responsibilities," nor does it show a clear link between its strategic
goals and such responsibilities.

As in the fiscal year 1999 plan, the fiscal year 2004 plan included a
discussion of external factors that could affect the achievement of its
goals.19 SSA identified strategies to alleviate some, but not all, of
these factors. For example, SSA plans to mitigate the loss of
institutional knowledge from its "retirement wave" through employee
development programs, redeployment of positions to direct service, hiring
of Presidential Management Interns, and increased use of hiring
flexibilities. However, the discussion of factors affecting SSA's solvency
strategic goal merely notes that Social Security programs must respond to
related developments.

SSA's Fiscal Year 2004 Performance Plan Provides General Confidence That
Performance Data Will Be Credible

SSA's 1999 plan stated that the Office of the Inspector General was
responsible for reviewing the data systems underlying its performance
measures, but did not provide further details that would assure the reader
that SSA was taking the steps necessary to ensure data integrity. In
contrast, SSA's fiscal year 2004 plan provided data sources and
definitions for each performance measure. SSA's fiscal year 2004 plan
identified data limitations related to performance measures, as well as
some efforts to correct or address data weaknesses. When performance
indicators and goals are not quantified, SSA describes its benchmarks for
goal achievement. For example, for the outcome measure "Developing new
performance management systems," SSA defines "Implementing the new SES
system" as its goal for 2003.

As in the fiscal year 1999 plan, SSA notes that the IG's office is
involved in the data system reliability process. In the fiscal year 2004
plan, SSA went further to explain the IG's four-point approach to
reviewing performance measures, including assessing whether the reported
performance measure data are valid. SSA also noted that performance data
for its quantifiable measures are generated by automated management
information and

19SSA refers to external factors as environmental factors.


workload measurement systems, as a by-product of routine operations.
However, there is no discussion of verification and validation procedures
for data generated by these systems.

Observations on Changes in the Quality of DOT's Annual Performance Plan

DOT's annual performance plan for fiscal year 2004 showed improvements in
areas where we identified shortcomings in our 1998 review of DOT's 1999
performance plan.20 The 2004 plan provides a
clear picture of intended performance with DOT's measures and performance
goals now being clearly linked to the department's strategic objectives. A
specific discussion of DOT's strategies and resources in the plan includes
numerous and detailed strategies for achieving DOT's performance goals,
and the resources needed for those strategies. Procedures to verify and
validate data, as well as known data limitations, are described for each
performance measure, providing full confidence in the credibility of DOT's
performance data. Still, the performance plan could be improved by
including a discussion of, and performance measures for, each of DOT's
program activities and by more consistently describing DOT's role in
crosscutting programs.

DOT's Fiscal Year 2004 Performance Plan Provides a Clear Picture of
Intended Performance

DOT's fiscal year 2004 performance plan shows evidence of many of the same
strengths as, and a few improvements over, its fiscal year 1999
performance plan and provides a clear picture of intended performance. The
2004 plan lists outcome goals, performance goals, and measures by
strategic objective, all of which are generally objective, quantifiable,
and can show progress toward DOT's strategic objectives. For example, the
measure "fatalities per 100 million vehicle-miles of travel" will gauge
progress toward the performance goal "reduce highway fatalities per 100
million vehicle-miles traveled to no more than 1.0 in 2008, from 1.7 in
1996." The data gathered by the measure will also show progress toward the
related outcome of "reduce the number of transportation-related deaths"
for DOT's "safety" strategic objective. This is an improvement over DOT's
1999 plan in which we found that DOT's performance goals typically covered
only a portion of the strategic goals and the link between annual
performance goals and strategic goals could be improved.
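To make the arithmetic behind this measure concrete, the following sketch
(in Python) shows how a rate per 100 million vehicle-miles traveled is
computed; the input figures are illustrative placeholders, not DOT data.

    # Fatalities per 100 million vehicle-miles traveled (VMT).
    # The inputs below are illustrative placeholders, not DOT data.
    def fatality_rate(fatalities: int, vehicle_miles: float) -> float:
        return fatalities / (vehicle_miles / 100_000_000)

    # Example: 42,000 fatalities over 2.75 trillion vehicle-miles
    print(round(fatality_rate(42_000, 2.75e12), 2))  # 1.53

Expressing fatalities as a rate per unit of travel, rather than as a raw
count, lets the measure show progress even as total travel grows.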

20GAO/RCED-98-180R.


DOT's plan also presents trend and baseline data for each performance
measure and goal. For the example given above, the plan indicates the
performance targets from 1999 to 2004 and also presents actual data for
these targets for 1999 to 2002. In its 1999 plan, DOT had provided
baseline data for most of its performance goals and measures as well. This
information, along with the clearly linked performance goals and strategic
objectives, helps to show DOT's progress in achieving its goals.

As in the 1999 plan, DOT's 2004 performance plan generally covers each
program activity in its budget request for fiscal year 2004. An appendix
to the performance plan lists DOT's program activities and indicates the
proposed funding level for each program by strategic objective. However,
as in its 1999 plan, a few programs do not seem to be linked to the
strategic objectives elsewhere in the plan. Capital grants to the National
Railroad Passenger Corporation (Amtrak) and the Bureau of Transportation
Statistics' Office of Airline Information are both linked to the "mobility
& economic growth" strategic objective in the budget crosswalk, but they
do not appear in the discussions contained within that strategic objective
section. When the 2004 plan was published in February 2003, DOT had not
yet released its new reform strategy for Amtrak, which was made public in
July 2003. Still, the inclusion of information on Amtrak, as well as a
discussion of the Bureau of Transportation Statistics' Office of Airline
Information, would provide a clearer picture of how DOT intends to achieve
its goals.

The discussions of each performance goal have sections entitled "Other
Federal Programs with Common Outcomes." In these sections, the plan
describes crosscutting programs and other agencies with which DOT works.
For example, the plan states that the Research and Special Programs
Administration of DOT continues to develop the National Pipeline Mapping
System with the Federal Energy Regulatory Commission, the National Oceanic
and Atmospheric Administration, the Department of Energy, the U.S.
Geological Survey and others, in order to help analyze risks to
environmentally sensitive and populated areas. This supports DOT's efforts
to reduce pipeline incidents. Yet for several goals, coordination efforts
are not described. One example of this is in the highway congestion
section where the plan states that the Federal Highway Administration
works closely with the Department of the Interior, Department of
Agriculture, and Department of Defense agencies to improve mobility on
federally owned lands. However, the plan does not describe the specific
actions that are being taken to improve mobility. Our 1998 report stated
that DOT's contribution or role was not described in many of the
crosscutting programs listed in DOT's 1999 performance plan.


The 2004 performance plan generally addresses performance and
accountability challenges we previously identified. The discussions of
these management challenges are included in the plan by the performance
goal and programs to which they are related. For example, in discussing
highway safety, DOT addresses our concerns on transportation safety,
specifically through the use of safety belts. The strategies include
continuing the National Highway Traffic Safety Administration's safety
belt outreach to high-risk populations and encouraging states to embrace
"Click It or Ticket" as the message or theme for their Buckle Up
Campaigns. A performance measure related to this management challenge
included in the plan is "percentage of front seat occupants using safety
belts." However, not all of the management challenges have related
measures and goals. For example, we have previously identified building
human capital strategies as a management challenge for DOT. A section
within the plan focuses on an "organizational excellence" objective to
implement the PMA. Strategic management of human capital strategies is
discussed in this section but no goals or measures are given to show DOT's
progress with these strategies. Still, this shows some improvement over
the 1999 plan, which generally covered management challenges, but did so
in a separate appendix without explaining how the challenges were related
to the rest of the plan. We noted that this area could be improved by
including goals and measures related to resolving these challenges.

DOT's Fiscal Year 2004 Performance Plan Provides a Specific Discussion of
Strategies and Resources

DOT's 2004 performance plan shows several improvements over its 1999
performance plan, providing a specific discussion of strategies and
resources. Discussions of each performance goal include a section titled
"Strategies and Initiatives to Achieve 2004 Target." These sections
include a variety of means by which DOT intends to accomplish its
performance goals. One example is DOT's performance goal to
"reduce pipeline hazmat (hazardous materials) spilled 30 percent by 2006,
from the last five years' average spill rate." The strategies for this
goal include enforcing operator qualification requirements, expanding
monitoring technology that can help prevent construction-related damage to
pipelines, and developing regulatory standards for leak detection
technology. This shows progress from when we reported that DOT's 1999
performance plan lacked sufficient information to clearly link the
strategies to performance goals in many cases.
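To illustrate how such a target can be derived, the sketch below computes
a 2006 target set 30 percent below a trailing five-year average; the
spill quantities are hypothetical placeholders, not DOT data.

    # Target set 30 percent below the last five years' average.
    spills = [11_500, 9_800, 13_200, 10_400, 12_100]  # hypothetical
    baseline = sum(spills) / len(spills)     # five-year average: 11400.0
    target_2006 = round(baseline * 0.70, 1)  # 30 percent cut: 7980.0
    print(baseline, target_2006)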

In contrast to its 1999 performance plan, DOT's 2004 performance plan
generally discusses the human, capital, information, and other resources
needed to meet its performance goals. Each performance goal section of


the performance plan includes a graph showing the enacted funding for
fiscal year 2002, and the proposed funding for fiscal years 2003 and 2004.
The organizational excellence objective for DOT describes DOT's human
capital and information technology resources and strategies. For example,
one of DOT's strategies for strategic management of human capital is to
"establish a corporate approach to target recruitment efforts, with
special emphasis on cross-modal, mission-critical occupations," which
includes a pilot program for centrally recruiting and training entry-level
employees for one or more mission-critical occupations.

The 2004 plan also discusses external factors that could hamper DOT's
ability to achieve its performance goals. In our review of DOT's 1999
performance plan, we noted that the plan could be improved by recognizing
more external factors and by discussing actions that DOT could take to
mitigate the effects of these factors. In contrast, external factors are
listed for most of the performance goals in the 2004 plan. For its
transportation accessibility goals, DOT states that as the population
ages, more people will require accessible public transit, for which states
and local agencies decide how best to allocate federally provided
resources. One of the strategies that addresses this external factor is
the Special Needs of Elderly Individuals and Individuals with Disabilities
grants, which DOT states will help meet the transportation needs of the
elderly and persons with disabilities when regular transportation services
are unavailable, insufficient, or inappropriate to meet their needs.

DOT's Fiscal Year 2004 Performance Plan Provides Full Confidence That
Performance Data Will Be Credible

The 2004 plan provides full confidence that DOT's performance data will be
credible. As in the 1999 performance plan, the 2004 performance plan
contains a section, entitled "Performance Data and Performance
Measurement," that discusses the means that DOT uses to verify and
validate its data. But unlike the 1999 plan in which this discussion was
broad and not linked to specific goals and measures, the 2004 plan also
contains an appendix that provides the following for each of DOT's
measures: the source of the data, limitations of the data, observations on
the quality of the data, work planned or ongoing to improve data quality,
and any known biases. Finally, DOT has compiled source and accuracy
statements,21 which provide more detail on the methods used to collect the

21Bureau of Transportation Statistics, Source & Accuracy Compendium,
http://www.bts.gov/statpol/SAcompendium.html (Washington, D.C.: Aug. 15,
2003).


data, sources of variation and bias in the data, and methods used to
verify and validate the data.

The presentation of data limitations in DOT's 2004 performance plan also
shows progress from its 1999 plan. The Performance Data and Performance
Measurement section includes a general discussion of DOT's data
limitations. This discussion includes limitations for the internal and
external data used by the department. Specific limitations for internal
data can be found in the aforementioned source and accuracy compendium,
while details on the limitations of external data are given in the
appendix on performance measures. In our report on DOT's 1999 performance
plan, we stated that information on data limitations was lacking for most
measures and that the plan could be improved by more consistently
addressing the data limitations throughout the plan.

Appendix V

Observations on Agencies' Annual Performance and Accountability Reports

To help Congress and the President determine agencies' actual performance
and progress in achieving strategic plan goals, GPRA requires each agency
to prepare a report on program performance for the previous fiscal year.1
One of our objectives was to assess the overall quality of agencies'
annual performance and accountability reports and the extent to which
selected elements of agency reporting have improved. To meet this
objective, we judgmentally selected six agencies (Education, DOE, HUD,
SBA, SSA, and DOT) using criteria such as agency size, primary program type,
and previous GAO reviews. To assess the overall quality and improvements
made to the agencies' performance and accountability reports, we relied on
requirements and guidance contained in GPRA and accompanying committee
report language,2 guidance to agencies from OMB for developing performance
reports,3 interviews with agency officials, the Chief Financial Officers
Act,4 our previous reports,5 and our knowledge of agencies' operations and
programs. To assess the quality of the six agencies' performance and
accountability reports, we categorized each report based on the degree to
which it addressed three characterizations: (1) picture of performance,
(2) link between resources and results, and (3) credibility of performance
information.

To assess the degree to which an agency's report provided a clear picture
of performance across the agency, we reviewed the extent to which the
report addressed elements required by GPRA. The annual performance report
should:

o 	describe the performance indicators established in the agency's annual
performance plan, along with the actual program performance achieved
compared with the performance goals expressed in the plan for that fiscal
year;

1Office of Management and Budget, Memorandum: Program Assessment Rating
Tool (PART) - Presentation in Congressional Justifications, M-03-06
(Washington, D.C.: 2003).

2Government Performance and Results Act of 1993, Committee on Governmental
Affairs, United States Senate, S. Rpt. No. 58, 103d Cong. 1st Sess.
(1993).

3OMB Circular No. A-11, Part 6, Preparation and Submission of Strategic
Plans, Annual Performance Plans, and Annual Program Performance Reports
(Washington, D.C.: June 2002).

4Chief Financial Officers Act of 1990 (Pub. L. No. 101-576).

5GAO-02-372 and Executive Guide: Creating Value Through World-class
Financial Management, GAO/AIMD-00-134 (Washington, D.C.: Apr. 1, 2000).


o  review the success of achieving the performance goals of the fiscal
year;

o  provide actual results for the 3 preceding fiscal years;

o 	evaluate the performance plan for the current fiscal year relative to
the performance achieved toward the performance goals in the fiscal year
covered by the report;

o 	where a performance goal has not been met (or a corresponding level of
achievement if an alternative form is used), explain and describe why the
goal was not met, the plans and schedules for achieving the established
performance goal, and, if the performance goal is impractical or
infeasible, why that is the case and what action is recommended;

o 	describe the use, and assess the effectiveness in achieving performance
goals, of any waivers; and

o 	include the summary findings for those program evaluations completed
during the fiscal year covered by the report.6

We also looked at the extent to which the reports clearly discussed
progress achieved in addressing the major management challenges previously
identified by us or others. For agencies that choose to issue a
performance and accountability report, the Reports Consolidation Act of
2000 requires that the report include a summary of the most serious
management and performance challenges facing the agency, as identified by
its IG, and a brief assessment of the agency's progress in addressing
those challenges.

In assessing the clarity of the performance information, we also looked at
selected qualitative characteristics used by the Association of Government
Accountants, in conjunction with the Chief Financial Officers Council, in
assessing performance and accountability reports for the Certificate of
Excellence in Accountability Reporting.7 These characteristics included
(1) whether there was a clear relationship between the performance

6The Homeland Security Act (Pub. L. No. 107-296) requires agencies to
include a review of the performance goals and evaluation of the
performance plan relative to the agency's strategic human capital
management.

7Association of Government Accountants, Certificate of Excellence in
Accountability Reporting: Reviewers Checklist, Fiscal Year 2001.
(Washington, D.C.).


information in the report and the goals and objectives contained in the
strategic and annual performance plans, (2) the extent to which the agency
limited the measures it discussed to those that were most significant for
its programs, and (3) the extent to which the report was user friendly by
being well-organized, concise, readable, and making effective use of
graphics to ease understanding of narrative information. We characterized
the clarity of each report in one of four ways: (1) clear, (2) general,
(3) limited, or (4) unclear, based on the extent to which the 2002 report
addressed the elements required by GPRA and the other informative
practices we described.

Both GPRA and the CFO Act emphasized the importance of linking program
performance information with financial information as a key feature of
sound management and an important element in presenting to the public a
useful and informative perspective on federal spending. Similarly, the
current administration's ambitious agenda for performance budgeting,
calling for agencies to better align budgets with performance goals and
focus on capturing full budgetary costs and matching these costs with
output and outcome goals, suggests that agencies need to develop
integrated financial and performance management systems that will enable
the reporting of the actual costs associated with performance results.
Although linking resources to performance goals is not a requirement of
GPRA, the committee report for GPRA suggested that developing the capacity
to relate the level of program activity with program costs, such as costs
per unit of result, costs per unit of service, or costs per unit of
output, should be a high priority. We have reported that world-class
financial management practices call for enterprisewide systems to
integrate financial and operating data to support both management decision
making and external reporting requirements. To assess the degree to which
an agency's report discussed the relationship between resources and
results, we characterized each report as having a (1) clear relationship,
(2) general relationship, (3) limited relationship, or (4) no
relationship.

Finally, to assess the degree to which an agency's report provided
confidence that the agency's performance information would be credible, we
examined how each report discussed the quality of the data presented. To
help improve the quality of agencies' performance data, Congress included
a requirement in the Reports Consolidation Act of 2000 that agencies
assess the completeness and reliability of their performance data. Under
the act, agencies were to begin including this assessment in the
transmittal letter with their fiscal year 2000 performance reports.
Agencies were also required to discuss in their report any material
inadequacies in the


completeness and reliability of their performance data and discuss actions
to address these inadequacies.

We have previously reported on other practices that enhance the
credibility of performance data that are not specifically required by
GPRA.8 For instance, discussions of standards and methods used by agencies
to assess the quality of their performance data in their performance
reports provide decision makers with greater insight into the quality and
value of the performance data. We also reported on additional practices,
in several agencies' performance reports, that would help foster
transparency to the public and assist decision makers in understanding the
quality of an agency's data. The additional practices we observed included
(1) discussions of data quality, including known data limitations and
actions to address the limitations and (2) discussions of data
verification and validation procedures. To address the extent to which a
report provided confidence that performance information was credible, we
characterized each report as providing (1) full confidence, (2) general
confidence, (3) limited confidence, or (4) no confidence.

In conducting our reviews, to the extent information was available in
prior assessments, we compared our findings of agencies' fiscal year 2002
reports to our assessments of reports for fiscal year 1999.9 A more
detailed discussion of our scope and methodology and the criteria we used
can be found in appendix I. Table 13 shows the results of our assessment
of the six agencies' reports.

8GAO-02-372.

9GAO/HEHS-00-128R, GAO/RCED-00-209R, GAO/RCED-00-211R, GAO/RCED-00-207R,
GAO/HEHS-00-126R, and GAO/RCED-00-201R.


Table 13: Characterizations of Agencies' Fiscal Year 2002 Annual
Performance and Accountability Reports

Department/agency                            Picture of          Resources linked     Data credible
                                             performance         to results           (no, limited,
                                             (unclear, limited,  (no, limited,        general, full)
                                             general, clear)     general, clear)

Department of Education                      Limited             Clear                General
Department of Energy                         General             Limited              Limited
Department of Housing and Urban Development  General             No                   General
Small Business Administration                Limited             General              General
Social Security Administration               General             Limited              General
Department of Transportation                 General             No                   Full

Sources: U.S. Department of Education, U.S. Department of Education FY
2002 Performance and Accountability Report (Washington, D.C.: 2003); U.S.
Department of Energy, Performance and Accountability Report, Fiscal Year
2002 (Washington, D.C.: 2003); U.S. Department of Housing and Urban
Development, Fiscal Year 2002 Performance and Accountability Report
(Washington, D.C.: 2003); Small Business Administration, Fiscal Year 2002
Performance and Accountability Report (Washington, D.C.: 2003); Social
Security Administration, Performance and Accountability Report, Fiscal
Year 2002 (Washington, D.C.: 2002); and U.S. Department of Transportation,
Fiscal Year 2002 Performance and Accountability Report (Washington, D.C.:
2003).
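For readers who want to work with these ratings, the sketch below (our own
illustration in Python, not a GAO tool) records table 13 as a small data
structure; because each rating comes from an ordinal scale, reports can be
filtered or ranked on any of the three characterizations.

    # Table 13 as a data structure: for each agency, the tuple holds
    # (picture of performance, resources linked to results, data credible).
    reports = {
        "Education": ("limited", "clear", "general"),
        "DOE": ("general", "limited", "limited"),
        "HUD": ("general", "no", "general"),
        "SBA": ("limited", "general", "general"),
        "SSA": ("general", "limited", "general"),
        "DOT": ("general", "no", "full"),
    }

    # Example: agencies whose data credibility was rated at least "general".
    scale = ["no", "limited", "general", "full"]
    print([agency for agency, (_, _, credible) in reports.items()
           if scale.index(credible) >= scale.index("general")])
    # ['Education', 'HUD', 'SBA', 'SSA', 'DOT']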

The remainder of this appendix discusses our observations on the quality
of the agencies' annual performance and accountability reports we reviewed
and, to the extent information was available from our prior reviews, how
the quality has changed since the agencies submitted their first reports
on fiscal year 1999 performance. We did not independently verify or assess
the information we obtained from agency annual performance reports. If an
agency chose not to discuss its efforts concerning elements in the report,
it does not necessarily mean that the agency is not implementing those
elements.

Observations on the Quality of Education's Fiscal Year 2002 Performance
and Accountability Report

Education's fiscal year 2002 Performance and Accountability Report
comprises two volumes: the main volume and a second volume including
performance reports for the agency's individual programs. In our
assessment, we did not review the second volume, which includes very
detailed, discrete, and disaggregated performance information with over
350 individual measures for the Office of Civil Rights, IG, and 117
Education programs in 60 clusters.

Although Education's report included many features designed to present its
performance information clearly, the overall clarity was limited by the


significant amount of performance information that was unavailable to show
Education's performance results. In contrast, Education's report very
clearly related its performance to its costs by using both graphics and
text to provide the agency's estimate of appropriations associated with
achieving each of its six strategic goals, 24 objectives (long-term
goals), and individual programs. Finally, Education provided a general
level of confidence in the quality of its data, primarily because of its
recognition of the challenges it faces on the timeliness, reliability, and
validity of its data. Education's recent data management initiative,
undertaken in partnership with state leaders to allow timely and ready
access to high-quality achievement and other performance data, also
contributed to confidence in the data; the IG said the initiative would
address many of the related concerns identified during IG audits.

Education's Fiscal Year 2002 Report Provided a Limited Picture of
Performance

Education's 2002 performance report is directly aligned with the goals and
measures in the agency's 2002-2007 strategic plan and its 2002-2003 annual
plan. Of the 210 measures included in the agency's strategic plan and
annual plan, 120 were identified for measurement in fiscal year 2002, and
all of these are addressed in the performance report. The report contains
sections on changes planned to enhance performance on the basis of
results. For each measure, the performance report includes trend data,
with a table showing actual data from fiscal years 1999 through 2002; in
some cases, the table also includes data from fiscal year 1998. When data
are not provided, the table indicates that they were not applicable or not
available. Overall, the report contains clear, succinct figures and a
table summarizing the status of all 120 measures, as summarized in figure
19.


 Figure 19: Summary of Education's Performance Indicators for Fiscal Year 2002

[Pie chart showing the share of measures by status: target met or
exceeded; target almost met (1.7 percent); target not met; set baseline
(4.2 percent); data not yet available; and data not expected.]

Source: GAO analysis of U.S. Department of Education's FY 2002 Performance
and Accountability Report.

However, while Education's 2002 report does review the levels of success
for its performance goals10 for fiscal year 2002, there is a critical
limitation. As we observed in our review of Education's 1999 report,11
data were not yet available for many measures in the 2002 report.
Specifically, 2002 data were available for only 41 of the 120 measures;
the rest were characterized as "Pending: Data Not Yet Available" (63) or
"Incomplete: Data Not Expected" (16). Despite the numerous strengths in
Education's 2002 performance report, the picture of performance for
Education presented in this report is limited mainly because of the lack
of data for so many of its 2002 targets. However, Education recognizes the
challenges created by its limited access to timely, reliable data:

We still face significant challenges to meeting our national education
goals. Primary among these challenges is access to timely, reliable data
on our performance in meeting our goals and implementing our programs. Our
efforts to identify effective and ineffective programs

10Education's annual performance goals are represented by its targets.
11GAO/HEHS-00-128R.


in the Department are severely limited by the frequent absence of useful
data about them. In FY 2002 we designed a performance-based data
management initiative which will provide much more robust information
about our programs and strategic objectives, as well as provide a strong
foundation for educational research and evaluation.

This data management initiative is being undertaken in partnership with
state leaders and the software industry and is expected to result in an
electronic data system that will allow ready access to high-quality
achievement and other performance data in a timely and seamless manner in
the future. Because the lack of data for so many of its targets blurs the
picture of performance, Education should take every possible step to
complete, as quickly as possible, its newly established performance-based
data management initiative.
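The scale of this gap can be quantified directly from the counts reported
above (41 of 120 measures with 2002 data, 63 pending, and 16 not
expected); a minimal sketch:

    # Tally the status of Education's 120 fiscal year 2002 measures,
    # using the counts given in the report.
    statuses = {
        "2002 data available": 41,
        "pending: data not yet available": 63,
        "incomplete: data not expected": 16,
    }
    total = sum(statuses.values())  # 120
    for status, count in statuses.items():
        print(f"{status}: {count} of {total} ({count / total:.1%})")
    # 2002 data were available for only about a third (34.2 percent)
    # of the measures.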

While lacking data for so many of its measures, Education's 2002 report
provides an explanation for measures with pending or incomplete data. For
pending data, the report states that comparisons to targets will be made
in the subsequent performance and accountability report, in addition to
citing the department's performance-based data management initiative. The
report further indicates that measures with incomplete data were so
characterized because methods to collect data were not ready in time to
measure fiscal year 2002 results, data collection did not occur, or data
collection was delayed. The report goes on to say that, for these
measures, Education will put methods in place to measure fiscal year 2003
results, develop other data sources, or revise its measures to measure
results differently. In addition, for each incomplete measure, the report
clearly describes why data are incomplete and what will be done to address
the situation. For example, for its measure on the percentage of states
with complete school accountability systems in place, as required by the
No Child Left Behind Act,12 the report explains that the requirements
under this act are more extensive than in the past, that states that had
met prior requirements may not yet meet new requirements, that the
department had decided regulation would be necessary, and that regulations
had not been finalized to define a complete school accountability system.

In addition, the report almost always included explanations of performance
and sometimes provided information on why targets were not met when that
was the case. However, such information was not always easy to find,

12Pub. L. No. 107-110, January 8, 2002. The No Child Left Behind Act of
2001 is a reauthorization of the Elementary and Secondary Education Act,
one of the major pieces of authorizing legislation for Education.


as the report did not always include it in the same area as the related
measure.

For most measures, including those that did not meet their targets, the
report provided information on the steps Education is taking or plans to
take to improve or enhance performance. For example, the 2002 target for a
measure on the number of children attending charter schools was 690,000
and the actual result was 575,000. To improve performance on this measure,
Education's report says that it is distributing guidance and information
to encourage parents to consider charter schools, using both publications
and its Web site to promote charter school enrollment, and sponsoring a
charter schools Web site with information on federal assistance for
charter schools. However, it was not always clear how the steps cited
would improve or enhance performance. For example, for four measures on
advanced placement (AP) achievement that were not met in 2002, the
strategy given for improving performance is to continue to support
increasing AP achievement through the Advanced Placement Incentives
program, but the report does not include an explanation of, or any
information on, this incentives program.
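A quick calculation, using the charter school figures above, shows how far
that measure fell short of its target; a minimal sketch:

    # Compare the actual result with the target for the charter school
    # enrollment measure (target 690,000; actual 575,000).
    target, actual = 690_000, 575_000
    print(f"{actual / target:.1%} of target; shortfall {target - actual:,}")
    # 83.3% of target; shortfall 115,000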

Education's report included an appendix entitled "Findings from FY 2002
Evaluations," containing summaries of the findings from nine GAO reports
and eight other studies completed in fiscal year 2002. For example, the
Education for Homeless Children and Youth Program: Learning to Succeed
report comprised two studies that found that homeless students are best
served when promising practices are implemented as part of a comprehensive
and coordinated education program for the homeless.

The 2002 performance and accountability report also contained an appendix
consisting of the IG's summary of serious management challenges, including
financial management, federal student aid programs, information
technology, program performance and accountability, and human capital. For
each of these challenges, the IG provided information on Education's
progress in addressing them. Under program performance and accountability,
for example, the IG pointed out that a significant amount of the data used
to measure education programs were provided by state and local education
entities and that it is imperative that these data are accurate, so as to
provide Congress, OMB, and the public with an objective measure of the
success of education programs. The IG said that Education has recognized
the importance of improving data quality and addressed this issue in its
performance plan.


The IG's summary of serious management challenges also included references
to GAO's high-risk list with respect to federal student aid programs and
human capital management, and Education's report included measures related
to both of these areas. Specifically, for its 2002 measure to have Federal
Student Aid (FSA) leave the GAO high-risk list by 2003 and not return, the
report states that the department "almost met" its 2002 target by
achieving 94 percent of its FSA High Risk Plan, and it described the
shortfall as not significant or material. In contrast, in our review of
Education's 1999 Performance Report,13 we noted that the department did
not have goals, objectives, or measures related to problems with its
student assistance programs. In addition, for the six measures in
Education's 2002 report under its strategic goal to improve the strategic
management of its human capital, the department reports that four targets
were pending, one was incomplete, and one had set the baseline. With
respect to the GPRA requirement14 that agencies' performance reports
include a review of the performance goals and evaluation of the
performance plan relative to the department's strategic human capital
management, Education's report discusses its human capital management
strategic goal and related performance goals in the context of its human
capital management plan, One-Ed.

Education's 2002 Report Showed a Clear Relationship between Resources and
Results

Education's 2002 report included information for each of its six strategic
goals and 24 objectives that clearly linked the department's resources
with its efforts to achieve specific results. While the department was not
able to break the costs down by each of its measures and targets, the
report used both graphics and text to provide the department's estimate of
appropriations associated with achieving each of its six strategic goals,
24 objectives (long-term goals), and individual programs. For example, for
each of its objectives, the report used a pie chart to show the percentage
of the department's appropriation that supports the objective, the rest of
the strategic goal the objective falls under, and the other five strategic
goals. An example is shown in figure 20 for the objective to ensure that
all students read on grade level by the third grade.

13GAO/HEHS-00-128R.
14As amended by the Homeland Security Act of 2002.


Figure 20: Inputs: Allocating Funds for Education's Objective to Ensure
That All Students Read on Grade Level by the Third Grade

[Pie chart showing the share of the department's fiscal year 2002
appropriation supporting objective 2.1, the rest of goal 2 (Other Goal 2),
and the other strategic goals (Other Goals).]

Source: U.S. Department of Education's FY 2002 Performance and
Accountability Report (Washington, D.C.: January 2003).

The text accompanying each chart listed the dollar amount supporting the
objective's activities, the percentage of the strategic goal's allocation
that amount represented, the individual programs that supported the
objective, and the dollar amount from salaries and expenses that was
included in the dollar amount for the objective. In addition, in the
report's appendixes, a table summarizing fiscal year 2002 appropriations
and staffing allocated by goal and objective also included the FTEs under
staffing for each strategic goal and objective. Another table provided a
percentage breakdown of each objective's appropriations by 146 agency
programs.

Education's Fiscal Year 2002 Report Provided General Confidence That
Performance Data Were Credible

Education's 2002 performance report provided general confidence that the
agency's data were credible because of its recognition of the challenges
it faces on the timeliness, reliability, and validity of its data; its
straightforward disclosure of these challenges; and its recent efforts to
address them. In Education's transmittal letter, the department secretary
said that the information contained in the report is "as complete and
reliable as we have available." However, Education's report recognized
that one of the agency's significant challenges to meeting its national
education goals is access to timely, reliable data on performance and that
the lack of useful data severely limits efforts to identify effective and
ineffective programs. The report further explained that 97 percent of the
department's funding is awarded to third parties, including, for example,
state and local agencies, that have an impact on the measurement of
results, especially the timing of data collection. Thus, Education
recognized in its report that it had limited control over the data it must
use to report results. Similarly, in the IG's summary of serious
management challenges, the report noted that Education needs to improve
its controls over the timeliness, reliability, and validity of data.

Moreover, Education's report included information on recent steps it has
taken to address its data challenges. In addition to a discussion of its
data management initiative to develop an electronic data system providing
access to timely, high-quality data, the report included an appendix with
the agency's Information Quality Guidelines and draft Data Quality
Standards presented in an abbreviated format. The discussion of data
quality standards recognizes the importance of data quality concepts to
the process of developing high-quality performance measures. The eight
standards provided are: validity, accurate definitions, accurate counts,
editing, calculation, timeliness, reporting, and burden reduction. To
facilitate the use of the standards, Education reported that it created a
data quality checklist and regularly held classes to teach staff how to
apply the standards. The IG's summary of serious management challenges
gave the department credit for this effort, pointing out that these
guidelines address many of the concerns identified during IG audits and
that the department plans to disseminate these guidelines to the chief
state school officers.

The report also provided explanations of data sources and data quality for
most measures. For example, for the measures on the percentages of 12th
grade students scoring at or above the basic and proficient levels on the
National Assessment for Educational Progress (NAEP) reading test, the
source is given as: "U.S. Department of Education; National Center for
Education Statistics, (NAEP); The Nation's Report Card, Reading." Under
data quality, the report states that NAEP data are validated using
rigorous National Center for Education Statistics statistical standards.
For most of the measures, the explanations of data quality contain similar
information on data validation. However, the data quality information only
sometimes identifies what limitations are relevant, if any. For example,
for a measure on the percentage of managers satisfied with services
received from Education's Office of Management when hiring staff, the
department relied on an internal survey of managers for its data. Although
the response rate for this survey was 22 percent, the report did not say
whether this was a limitation to the data collected. For a few of
Education's measures, the report stated that no data limitations had been
noted. It would be better if


the agency clearly stated whether the data for each measure had
limitations or not, and, if so, what they were.

Observations on the Quality of DOE's Fiscal Year 2002 Annual Performance
and Accountability Report

DOE's 2002 Annual Performance and Accountability Report provided a general
picture of performance by explaining in detail the progress made in
meeting performance measures and addressing management challenges. It
also provided a limited discussion of the costs incurred to achieve DOE's
performance goals by organizing its report by major program activities,
the costs of these activities, and their corresponding goals. Finally, the
report provided a limited level of confidence that its data were credible
because it did not include a discussion of data limitations.

DOE's Fiscal Year 2002 Report Provided a General Picture of Performance

DOE's 2002 Performance and Accountability Report provided a general
picture of performance in meeting its goals and measures. The report
contained a detailed explanation of progress for each of its performance
measures, which were referred to as "targets," by identifying whether each
measure was met, not met, or had mixed results, as shown in figure 21. In
addition, the report identified the operational processes, technology,
human capital, and other resources used to achieve each performance
measure. The results for the past 3 years of performance measures related
to each goal were also reported so that performance trends could be
identified. However, the report did not clearly explain how the results of
the performance measures reported contributed to achieving the performance
goals in DOE's annual performance plan and strategic plan.


Figure 21: Summary of DOE's Performance Indicators for Fiscal Year 2002

[Pie chart summarizing DOE's fiscal year 2002 performance measures by
status: met, mixed, and not met.]

Each performance measure that was not met or had mixed results contained a
plan of action to achieve the measure in the future. In addition, the
majority of fiscal year 2002 measures that were not met contained a clear
explanation as to why they were not met or had mixed results. For example,
a measure that required Southeastern Power Administration to meet its
planned repayment of principal of federal investment was not met due to
severe drought. DOE's report explained that to achieve the measure in the
future, Southeastern plans to change its rate design, propose rate
increases to obtain greater revenue, and increase cost recovery from fixed
charges.


DOE also discussed the progress it made in addressing performance and
accountability challenges. The report identifies significant issues for
fiscal year 2002 that, as stated in the report, merit a higher level of
attention and focus in the department. Each of the challenges was linked
to a goal and its related performance measure(s). In addition, actions
taken to address each challenge and the progress made on those actions
were identified. For example, one of the challenges identified was the
need for DOE to meet federal requirements for improved and more
cost-effective use of information technology. A related goal to deal with
this challenge was for DOE to promote the effective management of
information technology resources in the department. To address this
challenge, the report stated that DOE realigned its management structure
for information technology issues, established an enterprisewide license
for Microsoft software, and launched an e-government applications task
force to identify high-priority e-government investments, among other
actions. In our prior review of DOE's 2001 performance report, we also
found that DOE had made progress in addressing all eight of its major
management challenges.15

It is unclear, however, how program evaluations were used to assess
performance because DOE did not include a summary of program evaluation
findings in either the fiscal year 1999 or 2002 reports. According to DOE
officials, a section on program evaluations was not included in fiscal
year 2002 and one is not planned for fiscal year 2003 in order to limit
the amount of detail included in the report.

DOE's Fiscal Year 2002 Report Showed a Limited Relationship between
Resources and Results

DOE's 2002 performance report provided a limited discussion of how its
resources were related to its performance. As in our review of the 1999
report, DOE's 2002 report, which also includes DOE's financial
information, continued its practice of linking the department's performance
information to the costs of its program activities. For the majority of
its program activities, the 2002 report included the program activity's
net costs and related performance information for fiscal years 1999
through 2002. During our review of DOE's 1999 performance report, we were
supportive of DOE's efforts to link its performance goals and measures to
the program activities in the President's Budget. However, DOE has not
moved beyond presenting its performance and cost information by program
activity to also presenting it by strategic or annual performance goal or
objective. A report

15GAO-03-225.


that presented cost and performance information by performance goals in
addition to other presentations would more clearly identify the costs
associated with the achievement of each goal. According to DOE officials,
the department plans to link its individual performance measures to the
costs of program activities in future reports.

DOE's Fiscal Year 2002 Report Provided Limited Confidence That Performance
Data Were Credible

DOE's reporting on data credibility has improved but is still limited.
Compared with our review of DOE's 1999 performance report, a key
improvement in the 2002 performance report was the department's reporting
on its data validation and verification processes. DOE's 1999 report did
not discuss the implementation of DOE's verification and validation plan
or provide any evidence that the data quality was sufficient for assessing
the department's performance. The 2002 report met some requirements of the
Reports Consolidation Act by including a statement in the report's
transmittal letter assessing the completeness and reliability of the data.
The letter did not discuss any material inadequacies with DOE's
performance data. The report also included a high-level discussion on how
DOE will validate and verify its data and referred the reader to its 2003
annual performance plan for further details. For example, the report
stated that DOE's end-of-year reporting process includes certification by
heads of organizational elements on the accuracy of reported results,
followed by a review for quality and completeness by DOE's Office of
Program Analysis and Evaluation.

Although the department has improved on reporting its data verification
and validation processes, it has not improved on reporting any existing
data limitations. Neither the 1999 nor the 2002 report included an overall
discussion of the limitations to the data or steps DOE would take to
address those limitations, although the 2002 performance report did
identify minor data limitations for a few specific goals. DOE officials
stated that a discussion on data limitations was not included in the 2002
report because the department already reports on this information in the
annual performance plan and they thought it was redundant to put it in the
report.


Observations on the Quality of HUD's Fiscal Year 2002 Annual Performance
and Accountability Report

Compared to its fiscal year 1999 performance report, HUD's fiscal year
2002 Performance and Accountability Report provides a general picture of
what the department accomplished by including, among other things, a
report card listing its performance indicators with the corresponding
achievement, a list of program evaluations concluded during the fiscal
year, trend information for some of its performance indicators, a
discussion of the department's attempts to address its performance and
accountability challenges, and visual aids to illustrate information on
its performance. In a few instances, the report mentions whether certain
performance goals are impractical or infeasible. However, the report is
not as clear as it could be because it does not (1) explain how it plans
to address performance targets that were not met during the fiscal year or
(2) include an evaluation of the fiscal year 2003 performance plan
relative to the performance information presented in the performance
report for fiscal year 2002. The report does
not show the relationship between resources and results by linking
expended dollar amounts to specific program objectives. The report
provides general confidence to the reader that the data presented are
credible by providing background information on each performance indicator
and discussing the results and analysis of the most recent data.

HUD's 2002 Report Provided a General Picture of Performance

Overall, HUD's fiscal year 2002 Performance and Accountability Report
provides a general understanding of what the department's mission is and
what it accomplished during the previous fiscal year.16 Since we first
reviewed its report for fiscal year 1999, HUD has made progress in
developing its performance report to comply with GPRA.17 In reviewing
HUD's fiscal year 1999 performance report, we noted that it only contained
performance information for three of the department's four outcome
measures. HUD's report for fiscal year 2002 includes a report card for
each strategic goal listing performance targets that were met. The report
card also provides an explanation for many performance targets that were
not marked as being met during the fiscal year. Although the report
includes

16U.S. Department of Housing and Urban Development, Fiscal year 2002
Performance and Accountability Report (Washington, D.C.: 2003).

17GAO/RCED-00-211R.


visual aids to enhance readers' understanding of progress made toward
attaining performance targets, HUD could enhance the report by providing a
summary of performance targets for all strategic objectives met (or not)
during the fiscal year.

HUD's performance report does not meet GPRA's requirement of including an
evaluation of its fiscal year 2003 performance plan relative to the
performance attained by the department in fiscal year 2002. Including this
evaluation could provide readers some assurance that HUD takes into
account prior performance information, such as unmet goals, to manage its
performance in the fiscal year already under way.

The report suggests that some of the performance targets that were not met
during the fiscal year were impractical or infeasible. For example, for
its goal of "Increase the Rate of Homeownership," HUD mentions that the
indicator can be resistant to increases above an undetermined level
because homeownership is not practical or desirable for all households.
Broad economic conditions, including employment, incomes, and interest
rates, can affect homeownership rates. Likewise, HUD will no longer track a
performance indicator that measures the percentage of low-income housing
units containing threats to health and safety, such as exposed wiring,
unvented heaters, holes in the floor, and rodents. HUD mentions that this
indicator is not included in the fiscal year 2003 annual performance plan
because of the difficulty of attributing the results to its programs.

In several instances, HUD's annual performance report lacks a discussion
of how it plans to address unmet performance targets as required by GPRA.
For example, while HUD substantially met almost half of its performance
targets in fiscal year 2002, the report does not mention what steps the
department will take to address some of its unmet performance targets (see
fig. 22).


Figure 22: Summary of HUD's Performance Indicators for Fiscal Year 2002

[Pie chart summarizing the status of HUD's fiscal year 2002 performance
indicators, including slices for program not funded (2 percent), data not
reliable (2 percent), and no explanation.a]

Source: GAO analysis of HUD's Fiscal Year 2002 Performance and
Accountability Report.

aRather than stating if some performance targets were met or not, HUD
provided the following explanations: data not available; no performance
goal for this fiscal year; third quarter of calendar year (last quarter of
fiscal year, not entire fiscal year); calendar year ending in the current
fiscal year; calendar year ending the previous fiscal year; other
reporting period; results too complex to summarize; and baseline newly
established.

HUD continued to build upon the strengths of its earlier report by
including trend information for some performance indicators it used to
measure progress toward its targets during the past fiscal year. While not
presented consistently throughout the report, trend information provides a
context to understand HUD's performance and helps to show the extent to
which HUD exceeded or fell short of expectations set for its performance
targets. For instance, for HUD's performance indicator that tracks
increases in the share of welfare families residing in public housing that
move from welfare to work each year, the report mentions that in fiscal
year 2002 the rate was 13.1 percent compared to 19.9 percent in fiscal
year 2001. In preparing to implement the Temporary Assistance for Needy
Families program, HUD originally estimated this indicator to be around 6.5
percent in fiscal year 1997.

HUD's fiscal year 2002 report also mentions that it concluded several
program evaluations during the fiscal year, fulfilling a key requirement
that was absent from its performance report for fiscal year 1999. The
report also provides a
brief summary of the main findings of these program evaluations. In fiscal
year 2002, HUD concluded and published reports on 21 program evaluations
covering five of its strategic goals.

Similar to our findings on HUD's previous performance reports,18 HUD's
fiscal year 2002 report discusses the steps the department took to address
decade-long management challenges. For example, while HUD's report
mentions that deficiencies remain in its financial management systems, in
fiscal year 2002 the department initiated a project to design and
implement an integrated financial system. Similarly, to address staffing
imbalances and human capital challenges, HUD implemented the last phase of
its Resource Estimation and Allocation Process in January 2002 and started
to implement the Total Estimation and Allocation Mechanism, a tool that
collects actual workload accomplishments and staff usage within the
various operating components at HUD.

HUD's 2002 Report Showed No Relationship between Resources and Results

While HUD has made some improvements in how it presents cost information
in its report, the information is still not useful for linking program
objectives to specific dollar expenditures. HUD's report provides a
summary of the cost
of operations by each reporting segment, such as the total amount of money
spent by the Federal Housing Administration and the Public and Indian Housing
programs, and states that the total cost for fiscal year 2002 operations
was $33 billion. However, the report does not reconcile these costs to
specific program performance objectives, limiting the reader's ability to
understand how HUD used its resources to carry out its objectives during
the fiscal year.

18GAO-03-225.

HUD's 2002 Report Provided General Confidence That Performance Data Are
Credible

HUD's fiscal year 2002 performance report generally informs the reader on
critical issues about the reliability of its performance data, an issue
that was not discussed in detail in its earlier report. In its transmittal
letter, HUD briefly discusses that in some instances the data used in the
report were either incomplete and/or unreliable. The report includes
background information, results, analysis, and a discussion of the data
used for each performance indicator during the fiscal year. In discussing
the data, in
some instances HUD points out issues concerning data validity and accuracy
and mentions steps HUD will take to correct problems. For example, to
address problems with its indicator on the number of homeowners who have
been assisted with the HOME program, HUD has established a team of
managers, technical staff, and contractors to make a series of
improvements to the Integrated Disbursement and Information System
beginning in fiscal year 2003, which should reduce the need for data
cleanup.

Observations on the Quality of SBA's Fiscal Year 2002 Annual Performance
and Accountability Report

SBA's fiscal year 2002 annual performance report shows several
improvements over the agency's initial report. The report shows a general
relationship between resources and results by including an analysis of
resources used by each program in fiscal year 2002. A section on data
validation and verification, which includes data limitations and remedies
for those limitations, provides a general level of confidence in the
credibility of SBA's performance data. While the 2002 report includes a
scorecard to show the agency's overall performance, the report provides a
limited picture of performance because, among other reasons, it lacks
plans for meeting unmet goals in the future and data were unavailable to
show progress toward a large share of SBA's performance goals.

SBA's 2002 Report Provided a Limited Picture of Performance

SBA's 2002 performance report19 provides a limited picture of its
performance. SBA's 2002 report includes a scorecard that shows overall
agency performance for its 2002 goals, including trend data (when
available) from fiscal year 1999 to fiscal year 2002, the fiscal year 2002
goal, and a column showing the percentage of the fiscal year 2002 goal
achieved. However, based on the performance information provided in the
fiscal year 2002 performance report, it can be difficult to gauge SBA's
progress in achieving its goals. This is similar to our findings on SBA's
1999 report, which we noted was unclear as to how well SBA performed in
achieving several of its performance goals for two of the key outcomes
addressed in our 2000 report.20 Figure 23 summarizes SBA's progress on its
19 performance goals for fiscal year 2002.

19Small Business Administration, SBA's Performance & Accountability Report
for Fiscal Year 2002 (Washington, D.C.: 2003).

Figure 23: Summary of SBA's Performance Goals for Fiscal Year 2002

[Pie chart not reproduced; legend entries: "Goal achieved," "Goal not
achieved," and "Data not available" (n=19).]

Source: GAO analysis of SBA's FY 2002 Annual Performance and
Accountability Report.

Data were unavailable for 10 of SBA's 19 performance goals in 2002. For
the nine goals that had performance data available, SBA met seven. SBA's
2002 performance report included explanations for all of its goals that
were unmet, deemed infeasible, or for which data were not available. For
example, the report states that "homes restored to pre-disaster
conditions" and "businesses restored to pre-disaster conditions" are no
longer goals because SBA is reviewing its outcome measures for the
disaster loan program. Also, data were not available for the "customer
satisfaction" goal in the disaster assistance program because a Customer
Service Survey for disaster loan recipients was not issued during the
fiscal year because the survey had not received final clearance from OMB.
This contrasts with our findings on SBA's 1999 report, when the agency did
not provide explanations for not meeting several of its performance goals.
However, for the two goals that were unmet in 2002, "start-ups receiving
7(a) and 504 financing" and "jobs created and retained by SBIC clients,"
the report does not describe any plans for achieving the goals in the
future. The lack of information for over half of SBA's performance goals
and the absence of plans for achieving unmet goals limit our ability to
assess the overall progress the agency made in fiscal year 2002, as well
as the likelihood that it will improve its performance in the future.

20Our review of SBA's fiscal year 1999 report, GAO/RCED-00-207R, focused
on our observations on only three of SBA's key outcomes, as well as the
major management challenges addressed in the performance report. Since
our review of SBA's 2002 performance report used somewhat different
assessment criteria, we could not make valid comparisons on all aspects
of the reports.

Several other factors limited our ability to evaluate SBA's performance in
its fiscal year 2002 report. The report presents performance data in
accordance with the goal structure of SBA's 2003-2008 draft strategic
plan. The goal structure contained in the fiscal year 2002 performance
report does not directly correspond with the goal structure presented in
the 2002 performance plan. Only one of the report's strategic goals, "Help
Families and Businesses Recover from Disasters," directly corresponds to
the 2002 performance plan. Similarly, not all of the performance goals
listed in performance scorecards in both documents correspond. For
example, the performance goal "start-ups receiving 7(a) and 504 loans
viable 3 years after receiving loan," listed in the scorecard in the 2002
report, is not listed in the 2002 performance plan. The report states that
based on 2002 results SBA made "substantial modifications" to its fiscal
year 2003 goals, but the report does not specifically discuss how the
performance achieved in 2002 could affect the achievement of the 2003
goals. Finally, the report does not include the findings of program
evaluations completed in fiscal year 2002. SBA states that it was unable
to conduct program evaluations in 2002 due to a lack of funding and that
the agency has requested funding for program evaluations in fiscal years
2003 and 2004.

Similar to our findings on SBA's 1999 performance report, the agency
continues to provide information indicating its progress in addressing
previously identified management challenges. For
example, we have previously observed that SBA needs to improve the quality
of the performance measures that it uses for the disaster loan program.
SBA states in its 2002 report that the agency is in the process of
reevaluating its measures for the disaster loan program, and specifically
that the methodology for measuring the number or percentage of homes and
businesses restored through the program will be addressed by this review.

SBA's 2002 performance report also discusses the agency's strategic
management of human capital. One section of the report relating to the
PMA describes SBA's transformation plan, which is to realign the agency's
"organization, operation, and workforce to better serve its small business
customers." An appendix to the report identifies fully developing and
implementing the agency's human capital management strategy as one of
SBA's major challenges. This section lists the actions that SBA needs to
take to address this challenge as well as the progress the agency has made
in implementing these actions. However, the report does not include a
review of the performance goals and evaluation of the performance plan
relative to the agency's strategic human capital management, as required
by the Homeland Security Act of 2002.

The report also contains two broad overviews and an appendix of GAO audits
and recommendations, as well as a description of management challenges
identified by the agency's Inspector General. One chart, entitled "Status
of GAO Reviews Conducted at SBA in FY 2002," shows the review title,
status of the review (open or closed), and the number of recommendations
that came from these reviews. Another chart, entitled "Number of Open GAO
Recommendations at End of FY 2002," lists the GAO report number and title,
the year it was issued, and the number of recommendations remaining open.
Further detail is provided in an appendix to the performance report, which
lists GAO's outstanding recommendations, the status of the
recommendations, and the estimated date of completion. Another appendix
includes a report from SBA's Acting IG that describes the most serious
management challenges SBA faced in fiscal year 2002.

SBA's 2002 Report Showed a General Relationship between Resources and
Results

SBA's 2002 performance report contains analyses of resources and results
for SBA's programs that show a general relationship between resources and
results. In the description of each program's performance, the report
includes an analysis of the resources used by each program. For example,
the fiscal year 2002 cost of the Advocacy Program was estimated to be $8
million, with 50 percent of the funds going to support the Office of
Advocacy, 14 percent funding economic research, 11 percent for SBA's
executive direction support, 16 percent for fixed costs, and 9 percent
going to human resources, information technology, and procurement. The
report contains crosswalks that show the relationship between SBA's
strategic goals, outcome goals, performance goals, and programs. The
resources used by each program can then be linked through this crosswalk
to performance goals to generally show the resources needed for the
results achieved towards the goals. However, the connection of resources
to results could be more explicitly stated in the report if results and
resources were also presented by performance goal.

SBA's 2002 Report Provided General Confidence That Performance Data Are
Credible

SBA's 2002 performance report provides general confidence that the
agency's performance data are credible. In the letter transmitting its
2002 performance report, SBA states that the performance data for its
credit and procurement assistance programs are complete and reliable,
based on a systematic review of these data. SBA further states that it is
"working to improve the completeness and reliability of the performance
data for the advice provided to small business through SBA's resource
partners." Data for this aspect of SBA's performance are collected through
surveys, which the agency notes are neither consistent nor comparable, and
from which client responses are difficult to obtain. This could be seen as
a material inadequacy, which the Reports Consolidation Act requires
agencies to discuss. SBA discusses the actions it will take to address the
quality of the surveys by stating in the transmittal letter that it is
working to improve the survey instruments it uses to obtain performance
data.

SBA provides a detailed discussion of each performance indicator in a
section of the report on data validation and verification. For each
indicator, the report provides a definition, source, information on
validation, and means for data verification. The verification process for
several measures includes audits, independent reviews, and consistency
checks. However, this is the only discussion of verification procedures
for these measures, and no further details are provided. Also, for
several measures, such as "number of start-up firms financed by 7(a) &
504" and "regulatory cost savings to small businesses," SBA states that it
does not independently verify the data.

The report also addresses limitations to its data in the section on data
validation and verification. In this section SBA states that it faces many
challenges in acquiring high-quality data on both outputs and outcomes,
from both internal and external sources. This section also contains the
strategies that SBA will use to address shortcomings in its data quality:
ensuring the validity of performance measures and data, fostering
organizational commitment and capacity for data quality, assessing the
quality of existing data, responding to data limitations, and building
quality into the development of performance data.

The section on data validation and verification in the 2002 report
discusses how SBA plans to remedy the limitations for each indicator. For
example, for the "customer satisfaction rate" measure of the disaster loan
program,
the report states that the surveys used for this measure only determine
the satisfaction of those who received disaster loans and therefore do not
address the satisfaction of those who did not receive the loans. The
remedy listed for this limitation is to expand the survey to include all
applicants. This is an improvement from SBA's 1999 report, which we noted
did not discuss data limitations that could affect the quality of data
used by SBA to assess its performance.

Observations on the Quality of SSA's Fiscal Year 2002 Annual Performance
and Accountability Report

On the whole, SSA's 2002 performance report generally showed the agency's
progress towards its annual goals for fiscal year 2002.21 It showed
continued emphasis on outcome-oriented goals and identified relevant
results that were linked to individual strategic objectives. It also
provided trend information, typically back to fiscal year 1999, and
contained a brief discussion of the program evaluations completed during
fiscal year 2002. SSA's strategic goals were linked to financial resources
at a very high level, but none of the performance goals were associated
with costs; thus, the cost of achieving (or not achieving) a particular
goal was not clear. Additionally, the SSA Commissioner certified that
SSA's data presentation was credible, but missing data and a lack of
documentation of the methods and data used to measure its performance
reduced the overall quality of the document.

SSA's 2002 Report Provided a General Picture of Performance

SSA's 2002 performance report exhibited a general description of
performance, including the identification of 14 key performance
indicators out of a total of 69 indicators. Similar to our review of
SSA's fiscal year 1999 report,22 we found that many of SSA's fiscal year
2002 goals and indicators were outcome oriented. In the fiscal year 2002
report, SSA plainly summarized progress on its 69 performance indicators,
as shown in figure 24.

21Social Security Administration, Performance and Accountability Report,
Fiscal Year 2002 (Washington, D.C.: 2002).

22Our review of SSA's fiscal year 1999 performance report
(GAO/HEHS-00-126R) focused on our observations on five of the agency's key
outcomes, as well as the major management challenges addressed in the
performance report. Since our review of SSA's 2002 performance report used
different assessment criteria, we could not make valid comparisons on all
aspects of the reports.

Figure 24: Summary of SSA's Performance Goals for Fiscal Year 2002

[Pie chart not reproduced; legend entries: "Data not yet available,"
"Not met," "Almost met," and "Met."]

Source: SSA's FY 2002 Annual Performance and Accountability Report.

In SSA's 1999 annual performance report, performance measures focused on
activities rather than results, so it was difficult to determine the
agency's real progress in achieving results. For example, the measures
directly related to the outcome "long-term disability benefits are reduced
because people return to the workplace" did not sufficiently track
progress toward this key outcome. One of the measures was to "begin
implementation of the `Ticket to Independence' program, contingent upon
enactment of supporting legislation in FY 1998."23 This measure was
neither quantifiable nor measurable, and did not measure the number of
beneficiaries achieving this outcome.

In the 2002 report, SSA identified relevant results that are linked to
strategic objectives. For example, one of SSA's objectives related to the
strategic goal "deliver citizen-centered, world-class service" was to
"maintain the accuracy, timeliness, and efficiency of service to people
applying for Old Age and Survivors Insurance (OASI) and Supplemental
Security Income (SSI) Aged benefits." SSA reported on the timeliness of
OASI and SSI claims, as well as the implementation of software and
infrastructure for paperless processing of claims, as the relevant
results.

23This program was an administration proposal to test allowing disabled
beneficiaries to choose their own public or private vocational
rehabilitation provider.

In its 1999 performance report, SSA noted that a number of its goals were
not met, such as those relating to accurate and timely disability
determinations. Also, data on the accuracy of decisions at the initial
level were not available, and accuracy at the appellate level was not
measured. In its 2002 report, 10 percent of the goals were not met and 10
percent were almost met. SSA's report provided explanations for 17
performance goals SSA did not meet. However, not all of the explanations
actually identified the reasons for SSA's not meeting its goals. For
example, SSA did not meet its goals for the performance indicators
"percent of 800-number calls handled accurately-payment" and "percent of
800-number calls handled accurately-service." The explanation noted that
several quality initiatives were implemented, but SSA did not provide
explanations as to why the goals were not met. SSA also noted that some of
its performance indicators were being eliminated in favor of more focused
and outcome-based goals. In some cases, SSA identified future plans to
improve performance.

In SSA's 2002 performance and accountability report, trend information was
generally available for comparison of data back to fiscal year 1999. This
information was helpful in assessing whether SSA was making progress
toward its goals. SSA noted that its report addressed all of the major
management challenges on the IG's list, and that its annual performance
plan addressed the major management challenges we identified.
SSA also addresses the progress it made against certain challenges GAO and
the IG identified during fiscal year 2002 in its performance and
accountability report.24 For example, SSA highlights components of its SSI
Corrective Action Plan that are geared to improve the administration of
the SSI program and get it removed from our high-risk list. The IG's
report noted that SSA needs performance goals and measures that address
all of the major management challenges facing the agency, as not all are
currently addressed. For example, performance measures were not
established to
address problems with the Earnings Suspense File and the integrity of the
representative payee process.

24In our October 2002 report GAO-03-225, we noted that SSA reported
progress on all six of its major management challenges in its fiscal year
2001 annual performance report.

Finally, SSA's performance and accountability report contained a
discussion of the program evaluations conducted, organized by strategic
goal. However, the program evaluations SSA identified were typically
surveys of people who did business with SSA or assessments of internal
needs, such as a survey of training effectiveness and water/air quality
surveys. While this is a slight improvement over its 1999 report, where
there was only a brief summary of program evaluations, it would be helpful
for SSA to report on whether and how its evaluations have helped answer
questions about program performance and results. We have previously
reported that evaluations can help agencies improve their measurement of
program performance and/or understanding of performance and how it might
be improved.25

SSA's 2002 Report Showed a Limited Relationship between Resources and
Results

In the fiscal year 2002 performance and accountability report, SSA's
performance goals were not aligned by budget account; rather, they were
associated with strategic goals, which in turn cross budget accounts and
programs. Thus, the monetary, human capital, and technological resources
necessary to achieve many performance goals were not adequately described.
The financial statements show a schedule of financing and a schedule of
budgetary resources for each of SSA's major programs, and operating
expenses were associated with four out of the five strategic goals.26
However, these resources were not broken down by performance goal, and
were not linked to outcomes. Additionally, as reported by the IG, SSA
needs to further develop its cost accounting system, which it began to use
in fiscal year 2002; such a system would help to link costs with
performance.27

25We characterized program evaluations and their uses in GAO/GGD-00-204.

26SSA noted that its fifth strategic goal, "Valued Employees," supports
the accomplishment of all its basic functions, so its resources are
inherently included in the other four goals.

27SSA began to implement an improved cost accounting system in fiscal year
2002, which will be phased in over the next 3 to 4 years.

SSA's 2002 Report Provided General Confidence That Performance Data Are
Credible

While SSA provides data sources and definitions for the data supporting
its performance indicators, some data issues continue to detract from
SSA's performance report. SSA's transmittal letter noted that the
performance and financial data presented are fundamentally complete and
reliable, and that no material inadequacies were identified. Data sources
are identified
for many of the performance indicators, such as findings from evaluations
and quality assurance reports. In certain cases, data limitations are
identified; for example, SSA noted that the data supporting the "Percent
of SSNs issued accurately" goal do not include SSNs (Social Security
numbers) assigned via the Enumeration-at-Birth process, and that major
errors identified by the Office of Quality Assurance that result in SSN
cards being issued erroneously do not include these SSNs. Some data
verification
procedures are noted in the report, but verification procedures are not
consistently discussed, and data reliability and completeness are not
ensured. The IG noted that of the 21 measures it reviewed, 16 were
reliable; data or documentation of the methods used to measure SSA's
performance were not available for the other five measures. The IG went
further to say that even for the performance measures found to be
reliable, SSA lacks documentation of the methods and data used to measure
its performance. Finally, data were not available for 17 percent of the
performance goals, so it was difficult to assess whether or not progress
had been made in those areas.

Observations on the Quality of DOT's Fiscal Year 2002 Annual Performance
and Accountability Report

DOT's fiscal year 2002 performance report provided information that
generally showed the department's performance and progress toward its
goals. Summary tables within the report showed when DOT met or did not
meet its targets and the report supplied brief analyses as to whether or
not DOT would likely meet its targets for fiscal year 2003 based on actual
performance in 2002. A separate section of the report on performance data
completeness and reliability, along with an on-line compendium, provides a
full level of confidence in DOT's performance data. However, the report
does not clearly show the relationship between resources and results.

DOT's 2002 Report Provided a General Picture of Performance

DOT's fiscal year 2002 performance report28 provides a general picture of
the department's performance. The report includes performance summary
tables, which show the progress made toward each strategic objective.
These performance summaries include actual performance data from fiscal
years 1996 to 2002 when possible, as well as the performance targets for
fiscal year 2002 and whether or not the target was met. Similarly, we
noted in our 2000 report29 reviewing DOT's 1999 performance report that
performance information was clearly articulated, with summary tables
listing the fiscal year 1999 goals and trend data, and checkmarks to
indicate whether or not goals were met. Figure 25 summarizes DOT's overall
performance on its 40 performance targets, as reported in its 2002 report.

Figure 25: Summary of DOT's Performance Indicators for Fiscal Year 2002

[Pie chart not reproduced; legend entries: "Met," "Not met," and "Data
not available."]

According to the report, DOT met 24 (60 percent) of its performance
targets. Fourteen (35 percent) of DOT's performance targets were not met.

28U.S. Department of Transportation, Fiscal Year 2002 Performance and
Accountability Report (Washington, D.C.: 2003).

29Our review of DOT's fiscal year 1999 report, GAO/RCED-00-201R, focused
on our observations regarding only four of the department's key outcomes,
as well as the major management challenges addressed in the performance
report. Since our review of DOT's 2002 performance report used different
assessment criteria, we could not make valid comparisons on all aspects of
the reports.

Appendix V Observations on Agencies' Annual Performance and Accountability
Reports

The report provides explanations for why five of these targets were unmet.
For example, the target for the measure "number of passengers (in
millions) in international markets with open skies aviation agreements" of
the "mobility and economic growth" strategic objective was unmet. The
target was set at 59.7 million passengers, while DOT's preliminary
estimate for this measure indicated there were 57 million passengers. The
report states that this target was unmet because passenger travel
diminished in fiscal year 2002 due to the impact that the events of
September 11, 2001, had on air travel. However, the report did not provide
explanations describing why the nine other targets were not met. A DOT
official stated that explanations for these unmet targets were not
included in the 2002 report due, in part, to time constraints. In our 2000
review of DOT's 1999 performance report, we stated that, for the outcomes
we observed, the department provided explanations for all of its unmet
goals except transit fatalities.

We noted in our report on DOT's 1999 performance report that the
department supplied strategies to achieve its unmet goals in the future,
for the areas we reviewed. However, of the 14 unmet targets in the fiscal
year 2002 report, DOT provided future plans to achieve only two. For
example, the report provided a plan for future achievement of the unmet
target "percent of environmental justice cases unresolved after one year."
The report stated that DOT's External Complaint Tracking System was being
revised "to track complaints more closely, in a more timely way, and with
a higher level of data quality." DOT is also developing guidance requiring
more intensive legal staff involvement in external civil rights
complaints, especially environmental justice cases. A DOT official stated
that future plans were not included in the 2002 report for the other unmet
targets due, in part, to time constraints.

Data were unavailable for two of DOT's measures, "cumulative average
percent change in transit passenger-miles traveled per transit market" and
"employment sites (in thousands) made accessible by Job Access and Reverse
Commute (JARC) transportation services." The report explains the reasons
why data were unavailable for both of these measures and includes plans to
provide these data in the future. Data were unavailable for the JARC
program performance measure because DOT had not received data from JARC
grantees to verify that fiscal year 2002 program targets had been
achieved. DOT states that a new reporting system is being implemented,
which should improve data gathering performance.

Although the report does not identify any performance goals that were
impractical or infeasible, it states that the measure on transit
passenger-miles traveled had been changed in fiscal year 2002 because "it
placed excessive emphasis on increasing ridership in the Nation's very
largest urban areas." However, after using the measure for one year, DOT
concluded that the measure should once again be modified to account for
changes in the level of employment in each urban area. The report states
that a recent study found that changes in the level of employment are a
key economic factor related to changes in the level of transit ridership.

Another strength of DOT's fiscal year 2002 performance report is an
analysis of whether or not DOT will likely meet its planned performance
targets for fiscal year 2003. Each discussion of performance goals
contains an evaluation of the fiscal year 2003 performance plan target and
whether or not it will likely be met based on the fiscal year 2002 data.
For example, for its highway fatality rate targets, DOT says that the
National Highway Traffic Safety Administration and the Federal Motor
Carrier Safety Administration will be challenged to meet the established
fiscal year 2003 targets because targets had not been met for fiscal year
2002. In other instances where DOT is sure that it will meet its targets
for fiscal year 2003, it simply states so, as in the case of its aviation
safety targets.

DOT's 2002 performance report also includes information on completed
program evaluations. There is a separate section in the report that
discusses DOT's fiscal year 2002 program evaluations. Summaries of the
findings of these evaluations are discussed in this section. For example,
an evaluation of the Noise Set-Aside Portion of the FAA (Federal Aviation
Administration) Airport Improvement Program found that funding for the
program's noise compatibility projects was variable from year to year,
making it difficult to forecast annual population benefits.

As we found in the 1999 report, the major management challenges that DOT
faces are generally discussed in the 2002 report. These discussions were
contained within the program section to which they relate. The report also
showed the progress DOT has made in addressing its management challenges.
For example, we noted in our January 2003 report30 on DOT's major
management challenges that FAA's financial management systems remained at
high risk. DOT's 2002 report states that FAA created an Interim Fixed
Asset System to centrally control and account for its property and that in
fiscal year 2003, FAA will convert to use Delphi, DOT's financial
accounting system.

30GAO-03-108.

The 2002 performance report included a discussion on strategic human
capital management as part of DOT's "organizational excellence" strategic
objective. This discussion included a brief overview of DOT's human
capital plan as well as strategies for strategic human capital management.
For example, the report noted that FAA was redirecting 37,300 employees
into a results-oriented Air Traffic Organization, "freeing most of the FAA
to manage better and modernize more efficiently." However, the report did
not include a review of the performance goals and evaluation of the
performance plan relative to the agency's strategic human capital
management, as required by the Homeland Security Act of 2002.

DOT's 2002 Report Showed No Relationship between Resources and Results

DOT's 2002 performance report did not show the relationship between the
resources it used and the results it achieved in fiscal year 2002. The
financial portion of the report provided a statement of net cost for each
of DOT's programs in fiscal year 2002. The report could be improved by
providing net cost information for DOT's performance goals in the
performance section of the report, similar to the funding information
provided in the 2004 performance plan.

DOT's 2002 Report Provided Full Confidence That Performance Data Are
Credible

DOT's 2002 performance report provided a full level of confidence that the
department's performance data were credible. In his transmittal letter,
the Secretary of DOT stated that the 2002 report "contains performance and
financial data that are substantially complete and reliable." The letter
also stated that a section of the report assessed the inadequacies of
DOT's performance data and provided plans to remedy those inadequacies.

The "Performance Data Completeness and Reliability" section of the 2002
report generally discussed data completeness, reliability, and
limitations. This section discussed an overall limitation in DOT's
performance data. The report stated that much of DOT's performance data
came from external sources, and therefore, the department had no direct
control over the quality of these data. The report continues by stating
that DOT takes limitations to its external data into account when it uses
these data.

The 2002 report noted that DOT has compiled source and accuracy statements
that provide detail on the methods used to collect performance data,
sources of variation and bias in the data, methods used to verify and
validate the data, as well as data limitations.31 However, the online
Source and Accuracy Compendium does not include this information for the
goals and measures related to the department's organizational excellence
objective. The compendium states that a small number of source and
accuracy statements are not yet completed and that they will be added upon
completion.

Finally, the 2002 report also described strategies being undertaken to
address the quality of data used by DOT. The report stated that a DOT
intermodal working group addressed data quality issues by developing
departmental statistical standards and by updating source and accuracy
statements for all of DOT's data programs. The working group also worked
to improve quality assurance procedures, evaluate sampling and nonsampling
errors, and develop common definitions for data across modes.

31Bureau of Transportation Statistics, Source & Accuracy Compendium,
http://www.bts.gov/statpol/SAcompendium.html (Washington, D.C.: Aug. 15,
2003).

Appendix VI

                       GAO Federal Managers' Survey Data

                     Q1. What is your current grade level?

GS/GM-13 or equivalent (percent)   GS/GM-14 or equivalent (percent)
GS/GM-15 or equivalent (percent)   Senior Executive Service or
equivalent (percent)   Other - please specify (percent)   Number of
respondents

35.2 24.6 7.9 0.9 500

Q1a. If you answered "Other" in question 1 above, please enter your
response below.

Writing comment (percent)

Number of respondents

3

Q2. In total, for how many years have you been a supervisor and/or a
manager in the federal government?

Mean   Median   Minimum   Maximum   Number of respondents

13 1 43497

Q2no. Or, If you have never been a supervisor or a manager in the federal
                    government, please check the box below.

Percent

Number of respondents

3

Q3. In your current role, approximately how many government employees are
you responsible for? (Please answer for your permanent position. Please
specify the total number. If none, enter 0.) Enter numeric digits only.
Employees:

Mean   Median   Minimum   Maximum   Number of respondents

15 1 11,000 475

Q4. Please indicate where you currently work. (If you are currently on
temporary assignment or on detail, please answer for your permanent work
location.)

Headquarters of my department or agency (percent)   A field office of my
department or agency (percent)   Other - please specify (percent)
No answer (percent)   Number of respondents

28.9 60.7 9.5 1.0 503

  Q4a. If you answered "Other" in question 4 above, please enter your response
                                     below.

Writing comment (percent)

Number of respondents

38

Q5. For those program(s)/operation(s)/project(s) that you are involved
with, to what extent, if at all, do you consider your agency's strategic
goals when participating in the following activities? (Check one box in
each row.)

To a very great extent (percent)   To a great extent (percent)
To a moderate extent (percent)   To a small extent (percent)
To no extent (percent)   No basis to judge/Not applicable (percent)
No answer (percent)   Number of respondents

a. Setting program
priorities 33.9 45.3 11.4 4.7 1.8 2.2 0.7

b. Allocating resources 30.3 39.9 18.9 4.8 1.8 3.1 1.3

c. Adopting new program
approaches or changing
work processes 33.6 39.2 16.5 4.6 2.4 1.9 1.8

d. Developing or refining
program performance
measures 29.0 37.0 16.3 8.3 4.1 4.4 0.9

Q6. Are there performance measures for the program(s)/operation(s)/
project(s) that you are involved with?

Yes (percent)   No (percent)   Do not know (percent)   No answer
(percent)   Number of respondents

6.2 3.8 1.0 503

Q7. To what extent, if at all, do you agree with the following statements
as they relate to performance measures for the program(s)/operation(s)/
project(s) that you are involved with? (Check one box in each row.) We
have performance measures that...

To a very great extent (percent)   To a great extent (percent)
To a moderate extent (percent)   To a small extent (percent)
To no extent (percent)   No basis to judge/Not applicable (percent)
No answer (percent)   Number of respondents

a....tell us how many things
we produce or services we
provide. (Output
measures) 25.8 28.7 19.4 8.4 4.8 1.8 11.0 503

b....tell us if we are
operating efficiently.
(Efficiency measures) 15.2 27.5 25.8 13.4 5.5 1.5 11.0 503

c....tell us whether or not
we are satisfying our
customers. (Customer
service measures) 16.8 29.8 21.6 12.6 6.1 1.6 11.4 503

d....tell us about the quality
of the products or services
we provide. (Quality
measures) 16.0 30.3 23.5 12.6 5.6 0.7 11.3

e....demonstrate to
someone outside of our
agency whether or not we
are achieving our intended
results. (Outcome
measures) 19.1 35.9 19.0 10.5 2.6 1.5 11.3

f....link our product or
service costs with the
results we achieve. (Cost
benefit measures) 12.5 18.7 19.7 20.0 11.7 6.0 11.3

Q8. For those program(s)/operation(s)/project(s) that you are involved
with, to what extent, if at all, do you use the information obtained from
performance measurement when participating in the following activities?

To a very great extent (percent)   To a great extent (percent)
To a moderate extent (percent)   To a small extent (percent)
To no extent (percent)   No basis to judge/Not applicable (percent)
No answer (percent)   Number of respondents

a. Setting program
priorities 13.0 38.7 21.7 10.8 3.4 1.2 11.0

b. Allocating resources 14.9 36.6 20.3 10.8 3.7 1.8 12.0

c. Adopting new program
approaches or changing
work processes 13.9 34.8 24.2 10.2 4.1 0.9 11.9

d. Coordinating program efforts with other internal or external
organizations 10.1 32.2 29.4 10.9 3.6 2.7 11.1

e. Refining program
performance measures 14.4 28.8 24.7 13.2 3.4 4.1 11.3

f. Setting new or revising
existing performance goals 16.1 32.4 21.0 11.8 2.9 3.5 12.3

g. Setting individual job
expectations for the
government employees I
manage or supervise 18.4 33.6 22.0 9.3 2.8 2.4 11.4 503

h. Rewarding government
employees I manage or
supervise 17.7 33.3 21.8 8.2 4.0 3.0 11.9 503

i. Developing and
managing contracts 7.3 18.8 19.3 10.4 8.1 24.4 11.7 503

Q9. Based on your experience with the program(s)/operation(s)/project(s)
that you are involved with, to what extent, if at all, have the following
factors hindered measuring performance or using the performance
information?

To a very great extent (percent)   To a great extent (percent)
To a moderate extent (percent)   To a small extent (percent)
To no extent (percent)   No basis to judge/Not applicable (percent)
No answer (percent)   Number of respondents

a. Setting program
priorities 14.2 20.3 32.1 19.7 9.1 3.9 0.7

b. Different parties are
using different definitions
to measure performance 10.3 23.5 24.6 24.6 10.4 5.6 1.0

c. Difficulty obtaining
valid or reliable data 7.5 22.1 26.0 27.2 12.0 4.1 1.0

d. Difficulty obtaining
data in time to be useful 7.6 17.5 23.9 26.0 18.3 4.4 2.2

e. Lack of incentives
(e.g., rewards, positive
recognition) 12.9 15.8 25.0 24.1 16.6 4.8 0.8

f. Difficulty resolving
conflicting interests of
stakeholders, either
internal or external 9.5 19.3 25.5 22.0 14.5 8.1 1.0

g. Difficulty distinguishing
between the results
produced by the program
and results caused by
other factors 7.4 18.7 24.3 28.6 13.7 6.3 1.0

h. Existing information
technology and/or
systems not capable of
providing needed data 9.7 19.4 20.6 26.6 17.0 5.4 1.3

i. Lack of staff who are
knowledgeable about
gathering and/or
analyzing performance
information 10.1 17.5 23.4 26.2 16.5 5.0 1.3

j. Lack of ongoing top
executive commitment or
support for using
performance information
to make program/funding
decisions 9.7 16.0 18.4 24.0 23.1 7.9 1.0 503

k. Lack of ongoing
Congressional
commitment or support
for using performance
information to make
program/funding
decisions 7.1 16.6 18.1 16.5 17.1 23.8 0.7

l. Difficulty determining
how to use performance
information to improve
the program 5.3 12.8 30.1 26.6 18.8 5.6 0.7

m. Concern that OMB will
micromanage programs
in my agency 7.9 10.8 14.7 19.2 26.6 19.7 1.0

Q10. To what extent, if at all, do you agree with the following
statements? (Check one box in each row.)

To a very great extent (percent)   To a great extent (percent)
To a moderate extent (percent)   To a small extent (percent)
To no extent (percent)   No basis to judge/Not applicable (percent)
No answer (percent)   Number of respondents

a. Agency managers/
supervisors at my level
have the decision making
authority they need to
help the agency
accomplish its strategic
goals. 9.5 30.1 28.1 25.2 5.6 0.6 1.0

b. Agency managers/
supervisors at my level
are held accountable for
agency accomplishment
of its strategic goals. 14.6 42.9 24.1 12.7 3.3 1.5 1.0

c. Agency managers/
supervisors at my level
are held accountable for
the results of the
program(s)/
operation(s)/project(s)
they are responsible for. 24.2 46.5 17.2 7.2 3.0 0.9 1.0 503

d. Employees in my
agency receive positive
recognition for helping the
agency accomplish its
strategic goals. 9.7 27.1 31.2 22.0 7.2 1.5 1.3

e. My agency's top
leadership demonstrates
a strong commitment to
achieving results. 24.4 37.1 21.8 9.9 2.9 2.6 1.3

f. My agency is investing
the resources needed to
ensure that its
performance data is of
sufficient quality. 9.5 21.4 29.3 21.9 7.4 9.3 1.0

Q11. To what extent, if at all, do you agree with the following
statements? (Check one box in each row.) The following items focus on the
program(s)/operation(s)/project(s) that you are responsible for.

To a very great extent (percent)   To a great extent (percent)
To a moderate extent (percent)   To a small extent (percent)
To no extent (percent)   No basis to judge/Not applicable (percent)
No answer (percent)   Number of respondents

a. The individual I report to
periodically reviews with
me the results or outcomes
of the program(s)/
operation(s)/project(s) that
I am responsible for. 18.8 36.1 23.1 15.3 5.3 0.4 1.0

b. Funding decisions for
the program(s)/
operation(s)/project(s) I am
responsible for are based
on results or outcome
oriented performance
information. 4.5 20.9 23.1 26.4 16.5 7.5 1.0

c. Staffing and personnel
decisions for the
program(s)/
operation(s)/project(s) I am
responsible for are based
on results or outcome
oriented performance
information. 4.2 20.9 27.8 23.2 19.0 3.7 1.3 503

d. Changes by
management above my
level to the program(s)/
operation(s)/project(s) I am
responsible for are based
on results or outcome
oriented performance
information. 2.6 20.5 25.2 26.5 15.3 8.2 1.6

e. It is easy to motivate
employees to be more
results-oriented in the
program(s)/operation(s)/
project(s) I am responsible
for. 3.5 22.0 33.6 29.0 7.8 2.8 1.3

f. I have sufficient
information on the validity
of the performance data I
use to make decisions. 4.7 32.6 30.2 20.3 7.1 3.2 1.8

Q12. During the past 3 years, has your agency provided, arranged, or paid
for training that would help you to accomplish the following tasks? (Check
one box in each row.)

Yes (percent)   No (percent)   No answer (percent)   Number of respondents

a. Conduct strategic planning 46.7 52.3 1.0 503

b. Set program performance goals 48.8 50.2 1.0 503

c. Develop program performance
measures 42.9 55.5 1.6 503

d. Assess the quality of performance
data 35.3 63.5 1.3 503

e. Use program performance
information to make decisions 40.7 56.8 2.4 503

f. Link the performance of
program(s)/operation(s)/project(s) to
the achievement of agency strategic
goals 40.8 57.0 2.2 503

g. Implement the requirements of the
Government Performance and Results
Act (GPRA or the Results Act) 31.9 66.7 1.3 503

Q13. In your opinion, to what extent, if at all, do you believe you need
training (or additional training) in order to help you to accomplish the
following tasks? (Check one box in each row.)

To a very great extent (percent)   To a great extent (percent)
To a moderate extent (percent)   To a small extent (percent)
To no extent (percent)   No basis to judge/Not applicable (percent)
No answer (percent)   Number of respondents

a. Conduct strategic
planning 8.2 18.3 27.8 30.8 11.2 2.8 0.9

b. Set program
performance goals 8.6 19.7 27.6 27.2 12.7 3.0 1.2

c. Develop program
performance measures 9.6 21.1 29.2 23.7 11.2 3.6 1.6

d. Assess the quality of
performance data 8.9 22.1 25.7 25.7 13.3 2.5 1.9

e. Use program
performance information
to make decisions 9.5 22.1 23.7 26.9 13.8 2.2 1.9

f. Link the performance of
program(s)/operation(s)/
project(s) to the
achievement of agency
strategic goals 12.0 21.6 24.2 27.3 11.2 2.8 0.9

g. Implement the
requirements of the
Government Performance
and Results Act (GPRA or
the Results Act) 12.0 25.0 26.4 17.6 11.1 6.9 1.0

Q14. What, in your opinion, can the Federal government do to improve its
overall focus on managing for results?

Writing comment (percent)

Number of respondents

503

Q15. Prior to receiving this questionnaire, which of the following
statements best describes your awareness of GPRA?

I had never heard of GPRA (percent)   I had heard of GPRA but had no
knowledge of its requirements (percent)   I had heard of GPRA and had a
low level of knowledge of its requirements (percent)   I had heard of
GPRA and had moderate knowledge of its requirements (percent)   I had
heard of GPRA and had extensive knowledge of its requirements (percent)
No answer (percent)   Number of respondents

19.5 13.5 24.5 35.4 5.6 1.5 503

Q16. For those program(s)/operation(s)/project(s) that you are involved
with, to what extent, if at all, do you consider the annual performance
goals set forth in your agency's GPRA annual performance plan when
participating in the following activities?

To a very great extent (percent)   To a great extent (percent)
To a moderate extent (percent)   To a small extent (percent)
To no extent (percent)   No basis to judge/Not applicable (percent)
No answer (percent)   Number of respondents

a. Setting program
priorities 7.5 19.8 24.4 12.7 13.5 20.8 1.2

b. Allocating resources 5.2 16.8 23.9 15.7 14.7 22.5 1.2

c. Adopting new program approaches or changing work processes 5.9 21.3
23.8 13.5 13.3 21.1 1.2

d. Coordinating program efforts with other internal or external
organizations 5.4 16.2 26.4 15.7 12.8 22.3 1.2

e. Developing or refining
program performance
measures 5.7 18.1 21.5 15.9 14.7 22.9 1.2

f. Setting individual job
expectations for the
government employees I
manage or supervise 5.7 22.0 19.7 15.4 15.0 20.5 1.6

g. Rewarding
government employees I
manage or supervise 5.9 19.5 20.4 16.1 16.2 20.5 1.5

h. Developing and
managing contracts 3.3 12.3 14.9 13.3 16.4 37.1 2.7

Q17. During the past 3 years, have you been involved in these
GPRA-related activities? (Check one box in each row.)

Columns: Yes (percent); No (percent); No answer (percent); Number of
respondents.

a. Developing ways to measure
whether program performance goals
are being achieved. 45.8 52.7 1.5 503

b. Gathering and analyzing data to
measure whether programs are
meeting their specific performance
goals. 50.6 47.9 1.5 503

c. Using measures for program
performance goals to determine if the
agency's strategic goals are being
achieved. 42.6 55.9 1.5 503

d. Assessing the quality of data used
in measuring performance. 39.7 58.2 2.1 503

Q18. To what extent, if at all, do you agree with the following
statements? (Check one box in each row.)

Columns (percent): To a very great extent; To a great extent; To a
moderate extent; To a small extent; To no extent; No basis to
judge/Not applicable; No answer.

a. The objectives of my
program(s)/operation(s)/project(s)
are in alignment with my agency's
strategic plan under GPRA. 10.1 29.4 15.8 7.2 1.2 35.0 1.2

b. The costs associated
with implementing GPRA
have taken time or funds
away from other important
activities or projects. 4.8 7.7 13.1 15.1 10.6 46.1 2.5

c. The benefits to my
agency that are achieved
by implementing GPRA
are worth the costs
incurred in doing so (e.g.,
in time, money, and effort). 3.8 11.3 15.8 12.4 7.4 48.0 1.3

d. GPRA strategic and
annual performance plans
are mostly a repackaging
of goals, measures, and
objectives that were
already being used within
my agency. 5.7 22.7 19.1 7.7 2.8 40.1 1.9

e. Managerial
effectiveness is impeded
by the lack of integration
between GPRA and other
federal management
programs. 3.8 7.7 16.3 13.4 6.8 50.5 1.5

f. My agency considers
contributions to and
comments on GPRA plans
or reports from
managers/supervisors at
my level: 3.5 10.6 15.6 15.0 11.7 42.1 1.5 503

g. Agency managers/
supervisors at my level
use GPRA annual
performance plans to manage their
program(s)/operation(s)/project(s). 3.6 7.8 18.5 16.4 17.7 33.9 2.1

h. GPRA's planning and
reporting requirements
impose a significant
paperwork burden. 2.9 7.4 17.1 15.9 7.3 47.7 1.8

i. GPRA has caused
agency managers/
supervisors at my level to
place a greater emphasis
on getting input from
appropriate stakeholders
on their interests and
expectations. 2.5 7.0 20.8 13.1 12.2 42.2 2.1

Q19. To what extent, if at all, do you believe that GPRA has improved your
agency's ability to deliver results to the American public?

To a very great extent (Continue with question 19a.) (percent)      1.6
To a great extent (Continue with question 19a.) (percent)           6.9
To a moderate extent (Continue with question 19a.) (percent)       14.5
To a small extent (percent)                                        25.2
To no extent (percent)                                             11.6
No basis to judge/Not applicable (percent)                         38.0
No answer (percent)                                                 2.1
Number of respondents                                               503

Q19a. Please briefly describe how GPRA has improved your agency's ability
to deliver results to the American public.

Writing comment (percent)   Number of respondents
                                              139

Q20. To what extent, if at all, do you believe the following persons or
entities pay attention to your agency's efforts under GPRA? (Check one box
in each row.)

Columns (percent): To a very great extent; To a great extent; To a
moderate extent; To a small extent; To no extent; No basis to
judge/Not applicable; No answer.

a. Department Secretary
(if applicable) 10.3 18.3 10.6 6.2 3.6 47.1 3.9

b. Agency head other
than Department
Secretary (if applicable) 11.4 20.6 12.9 6.9 2.1 42.9 3.2

c. The individual I report
to 8.5 16.0 19.3 14.0 13.2 26.6 2.4

d. Managers and
supervisors at my level 5.1 13.2 19.7 19.5 16.0 24.1 2.4

e. Employees who report
to me 2.5 8.3 13.8 18.1 30.7 24.0 2.7

f. Office of Management
and Budget 14.1 16.7 11.4 4.0 1.8 49.0 2.9

g. Congressional
committees 7.8 14.5 13.0 8.2 3.5 50.3 2.6

h. The audit community
(e.g., GAO, Inspectors
General) 11.7 16.9 11.5 7.3 1.2 48.2 3.2

i. The general public 1.5 2.5 6.1 17.5 26.4 42.2 3.9

Q21. To what extent, if at all, do you believe that efforts to implement
GPRA to date have improved the program(s)/operation(s)/project(s) in which
you are involved?

I have not been sufficiently involved in GPRA to have an opinion
(percent)                                                          48.7
To a very great extent (percent)                                    2.9
To a great extent (percent)                                         5.2
To a moderate extent (percent)                                     14.7
To a small extent (percent)                                        16.2
To no extent (percent)                                             10.0
No answer (percent)                                                 2.4
Number of respondents                                               503

Q22. To what extent, if at all, do you believe that efforts to implement
GPRA to date have improved your agency's programs/operations/projects?

I have not been sufficiently involved in GPRA to have an opinion
(percent)                                                          47.9
To a very great extent (percent)                                    2.3
To a great extent (percent)                                         6.2
To a moderate extent (percent)                                     17.0
To a small extent (percent)                                        16.3
To no extent (percent)                                              8.7
No answer (percent)                                                 1.5
Number of respondents                                               503

Q23. To what extent, if at all, do you believe implementing GPRA can
improve your agency's programs/operations/projects in the future?

To a very great extent (percent)                                    3.4
To a great extent (percent)                                        11.9
To a moderate extent (percent)                                     23.7
To a small extent (percent)                                        16.7
To no extent (percent)                                              5.6
No basis to judge (percent)                                        35.7
No answer (percent)                                                 3.0
Number of respondents                                               503

Q24. If you have been involved to any extent in implementing GPRA for the
program(s)/operation(s)/project(s) you are involved with, what has been
your greatest difficulty, and in what ways, if any, do you think this
difficulty could be addressed?

Writing comment (percent)   Number of respondents
                     26.0                     503

Q25. If you have additional comments regarding any previous question or
any comments/suggestions concerning GPRA, please use the space provided
below.

Writing comment (percent)   Number of respondents
                     16.5                     503

Note: Percentages reported are weighted percentages based on the
population size. The unweighted number of respondents (N) is reported for
each item.
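
To illustrate the weighting, here is a minimal sketch in Python, assuming
hypothetical sampling weights (the report does not publish the survey's
actual weights or strata): each answer counts in proportion to the number
of managers in the population that the respondent represents.

    # Minimal sketch of a population-weighted percentage. The responses
    # and weights below are hypothetical; the survey's actual weights are
    # not given in this report.
    def weighted_percent(responses, weights, category):
        """Share of the weighted population choosing `category`, in percent."""
        total = sum(weights)
        chosen = sum(w for r, w in zip(responses, weights) if r == category)
        return 100.0 * chosen / total

    # Three managers drawn from strata of different sizes, so each answer
    # stands in for a different number of managers in the population.
    responses = ["great extent", "no extent", "great extent"]
    weights = [120.0, 300.0, 80.0]  # managers each respondent represents

    print(round(weighted_percent(responses, weights, "great extent"), 1))
    # Prints 40.0: unweighted, 2 of 3 respondents (66.7 percent) chose
    # "great extent"; weighting by population shifts the estimate to 40.0.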

Appendix VII

Agencies Subject to the Chief Financial Officers Act

The Chief Financial Officers Act of 1990 (the CFO Act) created the
position of Chief Financial Officer in each executive department and in
each major executive agency in the federal government. The agencies
covered by the CFO Act are:

1. Agency for International Development

2. Department of Agriculture

3. Department of Commerce

4. Department of Defense

5. Department of Education

6. Department of Energy

7. Department of Health and Human Services

8. Department of Housing and Urban Development

9. Department of the Interior

10. Department of Justice

11. Department of Labor

12. Department of State

13. Department of Transportation

14. Department of the Treasury

15. Department of Veterans Affairs

16. Environmental Protection Agency

17. Federal Emergency Management Agency1

1The Federal Emergency Management Agency became part of the Department of
Homeland Security in March 2003.

18. General Services Administration

19. National Aeronautics and Space Administration

20. National Science Foundation

21. Nuclear Regulatory Commission

22. Office of Personnel Management

23. Small Business Administration

24. Social Security Administration2

2Formerly part of the Department of Health and Human Services, the Social
Security Administration once again became an independent agency on March
31, 1995. Congress established the position of Chief Financial Officer
within SSA in the Social Security Independence and Program Improvements
Act of 1994.

Appendix VIII

Comments from the Office of Management and Budget

(OMB's letter is not reproduced in this text version.)

                                  Appendix IX

                     Comments from the Department of Energy

Note: GAO comments supplementing those in the report text appear at the
end of this appendix.

(DOE's letter is not reproduced in this text version. GAO's margin
annotations on the letter refer to comments 1 through 5 below and note
that cited material now appears on pp. 53-54 and p. 56.)

     The following are our comments on DOE's letter dated January 15, 2004.

GAO Comments	The Director of the Office of Program Assessment and
Evaluation forwarded written comments from DOE on a draft of this report.
DOE disagreed with several of our conclusions concerning its 2004 Annual
Performance Plan and 2002 Performance and Accountability Report. We
incorporated the additional information and perspectives of DOE into our
report as appropriate.

1. We stated that, when compared to the 1999 Annual Performance Plan, DOE's
2004 Annual Performance Plan continued to provide a limited picture of
intended performance. DOE disagreed, stating that the plan provided a
clear picture of intended performance because a description of each
program was contained in each of the general goals. We agree that a
description of programs is provided for each goal, and we also stated in
our draft report that improvement was made in developing results-oriented
performance measures that pertained specifically to fiscal year 2004.
However, in our view, describing each program does not sufficiently
explain DOE's expectations for intended performance. More specifically, we
found the overall picture of intended performance was limited because DOE
did not specifically describe how it coordinates with other agencies to
accomplish crosscutting programs and did not provide a clear link between
its annual goals and its mission and strategic goals. In our draft report
we acknowledged that a link did not exist between the performance plan and
the strategic plan because the strategic plan was revised after the
performance plan was finalized. Nevertheless, DOE did not revise the final
2004 performance plan to reflect its changes in strategic goals. The lack
of alignment between the performance plan and strategic plan goals limits
the usefulness of the performance plan to support managers and staff in
their day-to-day activities in achieving DOE's long-term strategic goals.

2. In response to our observation that DOE provided a general discussion of
the strategies and resources needed to achieve its performance goals, DOE
stated that its 2004 annual performance plan provided specific strategies
and resources that will be used to achieve performance goals. DOE also
noted that funding was included at the general goal level. We agree that
funding was included at the general strategic goal level. However, better
plans relate resources to the achievement of performance goals. DOE did
not provide resource information at the performance goal level.
Furthermore, while DOE
discussed external factors that could affect its ability to achieve its
performance goals at a high level, it did not discuss any specific
strategies to mitigate those factors.

3.	DOE disagreed with our characterization that its 2004 annual
performance plan provided limited confidence that performance data will be
credible. The department stated that the introduction section of its plan
contained specific sections on assessing results and validating and
verifying data, as well as discussed a software package used to document
and track performance. In our draft report, we stated that DOE's plan
showed some improvement over its 1999 plan by describing credible
procedures to verify and validate performance information and by
mentioning specific program evaluations for each goal. We also noted that
DOE acquired new commercial software for performance tracking through
remote data entry, monitoring, and oversight by program offices and
managers. However, we concluded that DOE's reporting of credible
performance data was limited because its plan does not specifically
identify data limitations overall or for each of its goals. As we stated
in our report, we found this to be of particular concern because, as we
mentioned in our 2003 performance and accountability series, DOE has
several management challenges where data quality is a concern.

4.	Concerning its 2002 Annual Performance and Accountability Report, DOE
stated that it provided a plan of action for addressing the causes of
targets that were not met. We agree, and in our draft report we stated that
all targets that were not met or had mixed results contained a plan of
action to achieve the target in the future. We also found that the
majority of DOE targets that were not met or had mixed results contained
clear explanations. We revised our text in the final version of this
report to make these findings more evident.

5.	Finally, DOE disagreed with our finding that it did not provide a
discussion of the relationship between the strategic plan, performance
plan, and the performance report in its 2002 Performance and
Accountability Report. DOE stated that the introductory section of the
report contains a paragraph that discusses the linkages between these
three reports. However, although the performance and accountability report
links these documents by organizing its results section according to
strategic goals and associated program performance goals and targets, it
did not succinctly demonstrate how the results relate to the annual and
long-term strategic goals. We modified the draft
accordingly to clarify this point.

Appendix X

Comments from the Department of Housing and Urban Development

Note: GAO comments supplementing those in the report text appear at the
end of this appendix.

(HUD's letter is not reproduced in this text version. GAO's margin
annotations on the letter refer to comments 1 through 6 below.)

The following are our comments on HUD's letter dated January 9, 2004.

GAO Comments	HUD provided written comments and disagreed with several of
our observations, which we address below. HUD also mentioned that all of
the areas we suggested for further improvement were already in process or
being considered. Where appropriate, we incorporated HUD's comments and
perspectives to clarify our report.

1. HUD did not agree with our observation that the link between long-term
and intermediate goals in its strategic plan is difficult to discern. The
department mentioned that the need for a direct link between long-term
intermediate goals is not apparent, as they are aligned with strategic
goals. GPRA requires that an agency's strategic plan contain, among other
things, a description of the relationship between the long-term goals and
objectives and the annual performance goals. In addition, OMB's June 2002
Circular A-11 states that the strategic plan should briefly outline how
annual performance goals relate to the long-term, general goals, and how
they help determine the achievement of the general goals. Federal agencies
can help readers understand how they move from general goals to specific,
measurable outcomes by discussing how they plan to measure progress in
achieving the long-term goals in their strategic plan. For example, for
its strategic goal of "Increase Homeownership Opportunities," HUD mentions
that one of its long-term performance measures is to combat predatory
lending. Readers can review the intermediate measures listed under that
goal to get a sense of how HUD plans to accomplish this objective. For
example, HUD mentions that beginning in the third quarter of fiscal year
2003, field offices will report all activities related to predatory
lending to headquarters each quarter. However, not all long-term measures
listed in the strategic plan have a corresponding intermediate performance
measure.

2.	HUD disagreed with our observation that it did not explain in its
strategic plan how it used the results of program evaluations to update
the current plan and did not include a schedule for future evaluations. As
we have previously reported, program evaluations are individual,
systematic studies that use objective measurement and analysis to answer
specific questions about how well a program is working and, thus, may take
many forms. Where a program aims to produce changes that result from
program activities, outcome or effectiveness evaluations assess the extent
to which those results were achieved. Where complex systems or events
outside a program's control also influence its outcomes, impact
evaluations use scientific research methods to establish the causal
connection between outcomes and program activities and isolate the
program's contribution to those changes. A program evaluation that also
systematically examines how a program was implemented can provide
important information about why a program did or did not succeed and
suggest ways to improve it.1 In its strategic plan, HUD provides a few
examples of how it modified performance measures as a result of program
evaluations. However, we found that 38 of the 41 performance measures
discussed in the strategic plan did not mention how, if at all, HUD
revised and/or updated them as the result of program evaluations.
Elsewhere in the plan, HUD discussed program evaluation activities carried
out by its Office of Policy Development and Research; however, a
significant number of those evaluations will take place in the future and
there is no fixed timetable for when HUD will issue reports on its
findings.
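
As a stylized illustration of how an impact evaluation can isolate a
program's contribution from outside influences, consider a minimal
difference-in-differences sketch; the design and every figure below are
hypothetical, not drawn from the report or from HUD's evaluations.

    # Outside factors move outcomes for everyone, so the program's
    # contribution is estimated by comparing the change for participants
    # against the change for a similar comparison group.
    program_before, program_after = 50.0, 62.0        # participant outcomes
    comparison_before, comparison_after = 48.0, 55.0  # comparison-group outcomes

    outside_trend = comparison_after - comparison_before  # 7.0 from outside factors
    raw_change = program_after - program_before           # 12.0 = trend + program
    contribution = raw_change - outside_trend             # 5.0 attributable to program

    print(contribution)  # 5.0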

3.	HUD questioned an example we used to show that its strategic plan did
not always provide a clear picture of how it will be able to measure
progress toward its strategic goals. We chose this example because HUD
used the number of jobs created or retained to measure its progress in
achieving the results of the Community Development Block Grant (CDBG)
program. As HUD discusses in its strategic plan, there are factors external
to the CDBG program, such as broad macroeconomic trends and HUD's limited
control over how grant recipients use the funding, which can significantly
affect job creation in a community. Therefore, it is difficult to
establish the contribution of the CDBG program, apart from the other
factors, to HUD's stated goal.

1GAO/GGD-00-204.

4. HUD also disagreed with our observation that in its annual performance
report it did not state the steps it would take to address unmet
performance goals. We recognize that in some instances HUD mentioned how
it would address unmet goals. GPRA requires that, where a performance
goal has not been met, agencies explain why the goal was not met, describe
schedules for achieving the established goal, and state whether the goal
is impractical or infeasible. However, our
review of HUD's performance report found that of the 93 unmet performance
targets for fiscal year 2002, 74 lacked an explanation of how HUD would
address them in fiscal year 2003.

5.	In commenting on our observation that HUD did not include an evaluation
of its fiscal year 2003 performance plan relative to the performance
attained by the department in fiscal year 2002, HUD mentioned that we
should consider the impact of the acceleration of reporting deadlines on
the department's ability to include an evaluation of the new fiscal year's
performance plan relative to the performance attained in the just
completed fiscal year and reasonable alternative actions to fulfill this
requirement. While we acknowledge that changes in the reporting deadlines
can create challenges for federal agencies, these deadlines are
governmentwide and not specific to HUD. In our review of agency plans we
found that some agencies, such as DOT, were able to collect performance
information for 95 percent of their performance indicators and were able
to predict future performance, despite not having complete performance
information and facing the same deadlines. DOT provided an evaluation of
whether or not fiscal year 2003 performance targets would be met for each
of its 40 performance goals based on fiscal year 2002 results. These
evaluations were included even for the two performance goals for which data
were unavailable. For example, for the measure "Number of employment sites
(in the thousands) that are made accessible by Job Access and Reverse
Commute (JARC) transportation services," DOT could not characterize
performance since data had not yet been received from JARC grantees. The
2002 performance report stated that a new, easier-to-use reporting system
is being implemented that should improve data gathering performance. The
report further stated that DOT would meet this target in fiscal year 2003.

6. HUD also disagreed with how we presented the performance information in
its summary report cards (see fig. 22). HUD noted that many of the results
were explained in the individual indicator write-ups that followed the
summary information. Our review of HUD's reports
included, among other things, qualitative aspects of how the information
was presented, such as its usefulness to inform the average reader with
little to no exposure to the subject matter, and the extent to which it
presented summarized performance information that was complete and
user-friendly. Our analysis of HUD's performance information was largely
based on a review of the information and terms used in the performance
report cards. We characterized some of HUD's performance indicators as
being "undetermined," given that HUD did not clearly indicate whether or
not a goal was achieved. Instead, HUD provided footnotes, such as "results
too complex to summarize." We also characterized some performance targets
as having "no explanation," given that information was missing from the
report card to determine whether HUD had reached its desired target. To
develop the graphic summarizing HUD's performance information, we compiled
the results of HUD's performance indicators across all five report cards
contained in the report.

Appendix XI

Comments from the Social Security Administration

Note: GAO comments supplementing those in the report text appear at the
end of this appendix.

(SSA's letter is not reproduced in this text version. GAO's margin
annotations on the letter refer to comments 1 through 5 below and note
that cited material now appears on pp. 65 and 215.)

The following are our comments on SSA's letter dated January 16, 2004.

GAO Comments	In general, SSA agreed with our conclusions. SSA also agreed
to incorporate the suggestions for improvement in its future planning
efforts. SSA made several points of clarification and disagreed with our
assessment in one area.

1. In our draft report, we noted that SSA did not explicitly link external
factors that may affect its programs to its general goals and state how
these factors could affect goal attainment. SSA attests that its four
categories of environmental factors are discussed under each of the
strategic goals, as appropriate, and the relationship between these
factors and SSA's strategic priorities is described. This general
discussion of the environmental factors is useful in understanding the
challenges SSA faces in working toward its broad strategic goals. However,
SSA provides little or no discussion of these challenges in its discussion
of the agency's performance goals. Thus, the range of challenges facing
the agency in meeting each of its performance goals is not fully
explained.

2. In our draft report, we noted that SSA does not provide timetables or
schedules for achieving all the results in its strategic plan. SSA noted
that it expects to achieve its long-term outcomes within the 5-year period
covered by the strategic plan; in selected instances, shorter time frames
are specified. SSA noted that more detailed plans and timetables are
featured in its annual performance plan. GPRA requires agencies to furnish
a schedule of significant actions in their strategic plans; however, SSA
does not clearly articulate its timetables and schedules for achieving
each of its long-term outcomes in its strategic plan.

3. We noted that SSA's strategic plan could be improved by providing details
on how each performance and accountability challenge will be addressed.
SSA asserted that the strategic plan addresses some of the challenges, but
because the challenges are updated every year, they are more appropriately
addressed in the annual performance plan and performance and
accountability report. As noted in our discussion of the criteria used to
analyze agencies' strategic plans, it is particularly important that
agencies develop strategies that address management challenges that
threaten their ability to meet long-term strategic goals, as one of the
purposes of GPRA is to improve the management of federal
agencies.

4.	In our draft report, we observed that SSA's discussion of its
interactions with other agencies, especially those that serve the same
beneficiaries, was limited. SSA noted that such a discussion would be
useful, but is not an OMB requirement. While we agree that this is not an
OMB requirement, we have reported that given scarce resources and
competing priorities, it would be useful for an agency to identify efforts
to maximize its effect through cooperation and coordination across the
federal government. Better strategic plans not only identify the need to
coordinate with other agencies, but also discuss how agencies intend to
coordinate common or complementary goals and strategies with other
agencies.

5.	With regard to SSA's performance and accountability report, we noted
that SSA did not clearly state how program evaluations were used to answer
questions about program performance and results and how those results can
be improved. SSA disagreed with our observation, stating that many of its
evaluations rely on surveys, and these surveys form the basis for its
efforts to deliver high-quality service. SSA also noted that it listed
other evaluations that are of great importance to its ongoing operations.
We do not discount the usefulness of SSA's surveys in assessing its
day-to-day management of programs. Rather, as we noted in the report, it
would be helpful for SSA to clearly identify the range of evaluations
conducted and how each of them contributed to improved program
performance. For example, we recently recommended that SSA evaluate a new
initiative to improve the integrity of Social Security number issuance to
noncitizens; describing such an evaluation would help SSA determine how it
can best position itself to ensure the integrity of
its enumeration process.1 Additionally, our September 2000 report on
program evaluation states that GPRA recognizes the complementary nature of
program evaluation and performance measurement. Strategic plans are to
describe the program evaluations that were used in establishing and
revising goals and to include a schedule for future program evaluations.
Agencies are to summarize the findings of program evaluations in their
annual performance reports.

1U.S. General Accounting Office, Social Security Administration: Actions
Taken to Strengthen Procedures for Issuing Social Security Numbers to
Noncitizens, but Some Weaknesses Remain, GAO-04-12 (Washington, D.C.: Oct.
15, 2003).

Additionally, SSA made technical comments that we incorporated into the
report, as appropriate.

Appendix XII

                     GAO Contact and Staff Acknowledgments

GAO Contact Patricia Dalton, (202) 512-6806

Acknowledgments	In addition to the persons mentioned above, Thomas Beall,
Daniel Bertoni, Kay Brown, Joyce Corry, Elizabeth Curda, David Dornisch,
William Fenzel, Kimberly Gianopoulos, Evan Gilman, Katie Harris, Susan
Higgins, Benjamin Licht, William McKelligott, James Noel, Carol Petersen,
Carolyn Samuels, Teresa Spisak, Daren Sweeney, Carolyn Taylor, Michael
Volpe, Lynn Wasielewski, and Steven Westley made key contributions to this
report.

Related GAO Products

GPRA/Managing for Results

Results-Oriented Government: Using GPRA to Address 21st Century
Challenges, GAO-03-1166T (Washington, D.C.: Sept. 18, 2003).

Results-Oriented Management: Agency Crosscutting Actions and Plans in
Border Control, Flood Mitigation and Insurance, Wetlands, and Wildland
Fire Management, GAO-03-321 (Washington, D.C.: Dec. 20, 2002).

Results-Oriented Management: Agency Crosscutting Actions and Plans in Drug
Control, Family Poverty, Financial Institution Regulation, and Public
Health Systems, GAO-03-320 (Washington, D.C.: Dec. 20, 2002).

Performance and Accountability: Reported Agency Actions and Plans to
Address 2001 Management Challenges and Program Risks, GAO-03-225
(Washington, D.C.: Oct. 31, 2002).

Managing for Results: Next Steps to Improve the Federal Government's
Management and Performance, GAO-02-439T (Washington, D.C.: Feb. 15, 2002).

Managing for Results: Federal Managers' Views on Key Management Issues
Vary Widely Across Agencies, GAO-01-592 (Washington, D.C.: May 25, 2001).

Managing for Results: Federal Managers' Views Show Need for Ensuring Top
Leadership Skills, GAO-01-127 (Washington, D.C.: Oct. 20, 2000).

Managing for Results: Barriers to Interagency Coordination, GAO/GGD-00-106
(Washington, D.C.: Mar. 29, 2000).

Management Reform: Elements of Successful Improvement Initiatives,
GAO/T-GGD-00-26 (Washington, D.C.: Oct. 15, 1999).

Management Reform: Using the Results Act and Quality Management to Improve
Federal Performance, GAO/T-GGD-99-151 (Washington, D.C.: July 29, 1999).

Managing for Results: Opportunities for Continued Improvements in
Agencies' Performance Plans, GAO/GGD/AIMD-99-215 (Washington, D.C.: July
20, 1999).

Agency Performance Plans: Examples of Practices That Can Improve
Usefulness to Decisionmakers, GAO/GGD/AIMD-99-69 (Washington, D.C.: Feb.
26, 1999).

The Results Act: Assessment of the Governmentwide Performance Plan for
Fiscal Year 1999, GAO/AIMD/GGD-98-159 (Washington, D.C.: Sept. 8, 1998).

Managing for Results: An Agenda to Improve the Usefulness of Agencies'
Annual Performance Plans, GAO/GGD/AIMD-98-228 (Washington, D.C.: Sept. 8,
1998).

The Results Act: An Evaluator's Guide to Assessing Agency Annual
Performance Plans, GAO/GGD-10.1.20 (Washington, D.C.: April 1998).

Agencies' Annual Performance Plans Under the Results Act: An Assessment
Guide to Facilitate Congressional Decisionmaking, GAO/GGD/AIMD-10.1.18
(Washington, D.C.: February 1998).

Managing for Results: Agencies' Annual Performance Plans Can Help Address
Strategic Planning Challenges, GAO/GGD-98-44 (Washington, D.C.: Jan. 30,
1998).

Managing for Results: Building on Agencies' Strategic Plans to Improve
Federal Management, GAO/T-GGD/AIMD-98-29 (Washington, D.C.: Oct. 30,
1997).

Managing for Results: Critical Issues for Improving Federal Agencies'
Strategic Plans, GAO/GGD-97-180 (Washington, D.C.: Sept. 16, 1997).

Managing for Results: Using the Results Act to Address Mission
Fragmentation and Program Overlap, GAO/AIMD-97-146 (Washington, D.C.: Aug.
29, 1997).

Managing for Results: The Statutory Framework for Improving Federal
Management and Effectiveness, GAO/T-GGD/AIMD-97-144 (Washington, D.C.:
June 24, 1997).

The Government Performance and Results Act: 1997 Governmentwide
Implementation Will Be Uneven, GAO/GGD-97-109 (Washington, D.C.: June 2,
1997).

Agencies' Strategic Plans Under GPRA: Key Questions to Facilitate
Congressional Review, GAO/GGD-10.1.16 (Washington, D.C.: May 1997).

GPRA: Managerial Accountability and Flexibility Pilot Did Not Work as
Intended, GAO/GGD-97-36 (Washington, D.C.: Apr. 10, 1997).

Managing for Results: Enhancing the Usefulness of GPRA Consultations
Between the Executive Branch and Congress, GAO/T-GGD-97-56 (Washington,
D.C.: Mar. 10, 1997).

Managing for Results: Key Steps and Challenges in Implementing GPRA in
Science Agencies, GAO/T-GGD/RCED-96-214 (Washington, D.C.: July 10, 1996).

Executive Guide: Effectively Implementing the Government Performance and
Results Act, GAO/GGD-96-118 (Washington, D.C.: June 1996).

Strategic Human Capital Management

Human Capital: A Guide for Assessing Strategic Training and Development
Efforts in the Federal Government (Exposure Draft), GAO-03-893G
(Washington, D.C.: July 2003).

Results-Oriented Cultures: Creating a Clear Linkage between Individual
Performance and Organizational Success, GAO-03-488 (Washington, D.C.: Mar.
14, 2003).

Results-Oriented Cultures: Insights for U.S. Agencies from Other
Countries' Performance Management Initiatives, GAO-02-862 (Washington,
D.C.: Aug. 2, 2002).

Human Capital: Key Principles From Nine Private Sector Organizations,
GAO/GGD-00-28 (Washington, D.C.: Jan. 31, 2000).

Linking Resources to Results

Performance Budgeting: Observations on the Use of OMB's Program Assessment
Rating Tool for the Fiscal Year 2004 Budget, GAO-04-174 (Washington, D.C.:
Jan. 30, 2004).

Managing for Results: Efforts to Strengthen the Link Between Resources and
Results at the Nuclear Regulatory Commission, GAO-03-258 (Washington,
D.C.: Dec. 10, 2002).

Managing for Results: Efforts to Strengthen the Link Between Resources and
Results at the Administration for Children and Families, GAO-03-9
(Washington, D.C.: Dec. 10, 2002).

Managing for Results: Efforts to Strengthen the Link Between Resources and
Results at the Veterans Health Administration, GAO-03-10 (Washington,
D.C.: Dec. 10, 2002).

Managing for Results: Agency Progress in Linking Performance Plans With
Budgets and Financial Statements, GAO-02-236 (Washington, D.C.: Jan. 4,
2002).

Performance Budgeting: Fiscal Year 2000 Progress in Linking Plans With
Budgets, GAO/AIMD-99-239R (Washington, D.C.: July 30, 1999).

Performance Budgeting: Initial Experiences Under the Results Act in
Linking Plans With Budgets, GAO/AIMD/GGD-99-67 (Washington, D.C.: Apr. 12,
1999).

Performance Budgeting: Past Initiatives Offer Insights for GPRA
Implementation, GAO/AIMD-97-46 (Washington, D.C.: Mar. 27, 1997).

Measuring Performance

Program Evaluation: An Evaluation Culture and Collaborative Partnerships
Help Build Agency Capacity, GAO-03-454 (Washington, D.C.: May 2, 2003).

Managing for Results: Next Steps to Improve the Federal Government's
Management and Performance, GAO-02-439T (Washington, D.C.: Feb. 15, 2002).

Program Evaluation: Studies Helped Agencies Measure or Explain Program
Performance, GAO/GGD-00-204 (Washington, D.C.: Sept. 29, 2000).

Managing for Results: Strengthening Regulatory Agencies' Performance
Management Practices, GAO/GGD-00-10 (Washington, D.C.: Oct. 28, 1999).

Managing for Results: Measuring Program Results That Are Under Limited
Federal Control, GAO/GGD-99-16 (Washington, D.C.: Dec. 11, 1998).

Grant Programs: Design Features Shape Flexibility, Accountability, and
Performance Information, GAO/GGD-98-137 (Washington, D.C.: June 22, 1998).

Program Evaluation: Agencies Challenged by New Demand for Information on
Program Results, GAO/GGD-98-53 (Washington, D.C.: Apr. 24, 1998).

Managing for Results: Regulatory Agencies Identified Significant Barriers
to Focusing on Results, GAO/GGD-97-83 (Washington, D.C.: June 24, 1997).

Managing for Results: Analytic Challenges in Measuring Performance,
GAO/HEHS/GGD-97-138 (Washington, D.C.: May 30, 1997).

Data Credibility

Performance Reporting: Few Agencies Reported on the Completeness and
Reliability of Performance Data, GAO-02-372 (Washington, D.C.: Apr. 26,
2002).

Managing for Results: Assessing the Quality of Program Performance Data,
GAO/GGD-00-140R (Washington, D.C.: May 25, 2000).

Managing for Results: Challenges Agencies Face in Producing Credible
Performance Information, GAO/GGD-00-52 (Washington, D.C.: Feb. 4, 2000).

Performance Plans: Selected Approaches for Verification and Validation of
Agency Performance Information, GAO/GGD-99-139 (Washington, D.C.: July 30,
1999).

Using Performance Information

Managing for Results: Views on Ensuring the Usefulness of Agency
Performance Information to Congress, GAO/GGD-00-35 (Washington, D.C.:
Jan. 26, 2000).

Managing for Results: Using GPRA to Assist Congressional and Executive
Branch Decisionmaking, GAO/T-GGD-97-43 (Washington, D.C.: Feb. 12, 1997).

Managing for Results: Achieving GPRA's Objectives Requires Strong
Congressional Role, GAO/T-GGD-96-79 (Washington, D.C.: Mar. 6, 1996).

GAO's Mission	The General Accounting Office, the audit, evaluation and
investigative arm of Congress, exists to support Congress in meeting its
constitutional responsibilities and to help improve the performance and
accountability of the federal government for the American people. GAO
examines the use of public funds; evaluates federal programs and policies;
and provides analyses, recommendations, and other assistance to help
Congress make informed oversight, policy, and funding decisions. GAO's
commitment to good government is reflected in its core values of
accountability, integrity, and reliability.

Obtaining Copies of GAO Reports and Testimony

The fastest and easiest way to obtain copies of GAO documents at no cost
is through the Internet. GAO's Web site (www.gao.gov) contains abstracts
and full-text files of current reports and testimony and an expanding
archive of older products. The Web site features a search engine to help
you locate documents using key words and phrases. You can print these
documents in their entirety, including charts and other graphics.

Each day, GAO issues a list of newly released reports, testimony, and
correspondence. GAO posts this list, known as "Today's Reports," on its
Web site daily. The list contains links to the full-text document files.
To have GAO e-mail this list to you every afternoon, go to www.gao.gov and
select "Subscribe to e-mail alerts" under the "Order GAO Products"
heading.

Order by Mail or Phone	The first copy of each printed report is free.
Additional copies are $2 each. A check or money order should be made out
to the Superintendent of Documents. GAO also accepts VISA and Mastercard.
Orders for 100 or more copies mailed to a single address are discounted 25
percent. Orders should be sent to:

U.S. General Accounting Office 441 G Street NW, Room LM Washington, D.C.
20548

To order by Phone: 	Voice: (202) 512-6000 TDD: (202) 512-2537 Fax: (202)
512-6061

To Report Fraud, Waste, and Abuse in Federal Programs

Contact:
Web site: www.gao.gov/fraudnet/fraudnet.htm
E-mail: [email protected]
Automated answering system: (800) 424-5454 or (202) 512-7470

Public Affairs	Jeff Nelligan, Managing Director, [email protected],
(202) 512-4800
U.S. General Accounting Office, 441 G Street NW, Room 7149
Washington, D.C. 20548

*** End of document. ***