Information Technology Investment: Agencies Can Improve Performance,
Reduce Costs, and Minimize Risks (Chapter Report, 09/30/96,
GAO/AIMD-96-64).
The management of information technology projects has long been a
significant problem for federal agencies. The government obligated more
than $23.5 billion toward information technology products and services
in fiscal year 1994--about 5 percent of the government's total
discretionary spending. Yet the impact of this spending on agency
operations and service delivery has been mixed at best. Federal computer
systems often cost millions more than expected, take longer to complete
than anticipated, and fail to significantly improve the speed and
quality of federal programs--or reduce their cost. Some private and
public sector organizations, however, have significantly improved
performance by managing their information technology resources within an
overall framework that aligns technology with business needs and
priorities. This report compares the information technology investment
management practices of leading organizations with those of five
agencies--NASA, the National Oceanic and Atmospheric Administration, the
Environmental Protection Agency, the Coast Guard, and the Internal
Revenue Service.
--------------------------- Indexing Terms -----------------------------
REPORTNUM: AIMD-96-64
TITLE: Information Technology Investment: Agencies Can Improve
Performance, Reduce Costs, and Minimize Risks
DATE: 09/30/96
SUBJECT: Information technology
Internal controls
Product performance evaluation
Strategic information systems planning
Cost control
Federal procurement
Agency missions
Investment planning
Information resources management
IDENTIFIER: NWS Modernization Program
IRS Corporate Files On-Line Project
IRS Service Center Recognition/Image Processing System
IRS Enforcement Revenue Information System
IRS Integrated Collection System
Cover
================================================================ COVER
Report to Congressional Requesters
September 1996
INFORMATION TECHNOLOGY INVESTMENT
- AGENCIES CAN IMPROVE
PERFORMANCE, REDUCE COSTS, AND
MINIMIZE RISKS
GAO/AIMD-96-64
IT Investment
(510981)
Abbreviations
=============================================================== ABBREV
ASOS - Automated Surface Observing System
CFOL - Corporate Files On-Line
CFO - Chief Financial Officer
CIO - Chief Information Officer
EPA - Environmental Protection Agency
ESC - Executive Steering Committee
GIS - geographic information system
GOES-Next - Next Generation Geostationary Operational Environmental
Satellite
GPRA - Government Performance and Results Act
GSA - General Services Administration
FTE - full-time equivalent
FTS - Federal Telecommunications System
FY - fiscal year
IG - Inspector General
IRM - information resources management
IRS - Internal Revenue Service
IT - information technology
ITMRA - Information Technology Management Reform Act
NASA - National Aeronautics and Space Administration
NEXRAD - Next Generation Weather Radar
NOAA - National Oceanic and Atmospheric Administration
NWS - National Weather Service
OMB - Office of Management and Budget
Op. - operational
PCM - program control meeting
PIR - postimplementation review
PMC - Program Management Council
PRA - Paperwork Reduction Act
R&D - research and development
ROI - return on investment
SCRIPS - Service Center Recognition/Image Processing System
TSM - Tax Systems Modernization
U.D. - under development
Letter
=============================================================== LETTER
B-271399
September 30, 1996
The Honorable William S. Cohen
Chairman, Subcommittee on Oversight of Government Management
and the District of Columbia
Committee on Governmental Affairs
United States Senate
The Honorable William F. Clinger, Jr.
Chairman
The Honorable Cardiss Collins
Ranking Minority Member
Committee on Government Reform and Oversight
House of Representatives
For some time now, you have expressed concern about the large amount
of money spent annually on information technology (IT) by federal
agencies and whether agencies have processes in place to ensure it is
being spent on the right projects and is producing meaningful
results. This report responds to your request that we assess the IT
investment practices of a small group of federal agencies and compare
them to those used by leading private and public sector
organizations. This report also highlights the implications of our
findings for the Office of Management and Budget (OMB) as it responds
to investment management requirements of the recent Information
Technology Management Reform Act.
We are sending copies of this report to the Secretaries of Commerce
and Transportation; the Administrators of the National Aeronautics
and Space Administration and Environmental Protection Agency; the
Commissioner of Internal Revenue; the Director of the Office of
Management and Budget; the Ranking Minority Member of the
Subcommittee on Oversight of Government Management and the District
of Columbia, Senate Committee on Governmental Affairs; the Chairmen
and Ranking Minority Members of the cognizant subcommittees of the
Senate and House Appropriations Committees; and other interested
congressional committees. We will also make copies available to
others upon request.
Please contact me at (202) 512-6406 if you have any questions about
this report.
Christopher W. Hoenig
Director, Information Resources Management/
Policies and Issues
EXECUTIVE SUMMARY
============================================================ Chapter 0
PURPOSE
---------------------------------------------------------- Chapter 0:1
In recent years, the Congress and the public have increased their
demand for a smaller government that provides improved services at a
lower cost. To keep pace with rising expectations, the federal
government must focus on dramatically improving operations. Such
improvement will require strengthened management of three fundamental
assets--personnel, knowledge and information, and capital
property/fixed assets. Investments in information technology (IT)
can have a dramatic impact on all three of these assets. However, an
IT project's impact comes from how the investment is selected,
designed, and implemented, not from the amount of money that is
spent. In this age of constrained resources, federal executives must
find ways to spend more wisely, not faster.
The management of IT projects, however, has long been a significant
problem for many federal agencies. The federal government obligated
more than $23.5 billion towards IT products and services in fiscal
year 1994--about 5 percent of the government's total discretionary
spending. Yet the impact of this spending on improving agency
operations and service delivery has been mixed at best. Federal
information systems often cost millions more than expected, take
longer to complete than anticipated, and fail to produce significant
improvements in the speed, quality, or cost of federal programs.
Some private and public sector organizations, on the other hand, have
achieved significant performance improvements by managing their IT
resources within an overall management framework that aligns
technology with business needs and priorities. In a 1994 report,\1
GAO identified 11 fundamental management practices found in leading
organizations that led to short- and long-term performance
improvements. One key practice identified by this research was the
management of IT projects as investments. By following this
practice, the organizations minimize risks and maximize returns on
those IT projects that have the best chance of significantly
improving organizational performance.
In order to better understand how federal managers can reduce risks,
control costs, and use technology to improve performance, you asked
GAO to compare and contrast the investment management practices and
decision processes used by leading private and public sector
organizations with a small group of federal agencies. Specifically,
this report compares the IT investment management practices of
leading organizations with IT management activities at five
agencies--the National Aeronautics and Space Administration (NASA),
National Oceanic and Atmospheric Administration (NOAA), Environmental
Protection Agency (EPA), Coast Guard, and Internal Revenue Service
(IRS).
The recent passage of the Information Technology Management Reform
Act (ITMRA), which became effective on August 8 of this year,
introduces new requirements for how IT-related projects will be
selected and managed. These requirements closely parallel the
investment practices followed by leading organizations. Though
agencies have the primary responsibility for leading the change
effort, OMB's specific responsibilities under the act, as well as its
central oversight role, make it a pivotal player at this early stage
of implementation. As a result, this report also examines the
challenges and opportunities presented to OMB as it supports and
oversees agencies' efforts to improve IT management and performance.
--------------------
\1 Executive Guide: Improving Mission Performance Through Strategic
Information Management and Technology (GAO/AIMD-94-115, May 1994).
BACKGROUND
---------------------------------------------------------- Chapter 0:2
In May 1994, GAO issued a report based on its work analyzing the
information management practices of several leading private and state
organizations. One of the key practices that was identified was that
leading organizations use disciplined processes to manage IT projects
as investments, rather than as one-time expenditures.
In general, the leading organizations GAO studied follow a
three-phased management approach for selecting, controlling, and
evaluating IT-related projects. They assess all IT
projects--proposed, under development, and operational--and then
prioritize and make funding decisions based on several factors,
including cost, risk, and return, as well as how well the project
meets mission needs. Once selected, executives monitor the projects
throughout their life cycle, taking quick actions to mitigate effects
of changes in risks and costs to ensure that the investments are
providing expected benefits. And after a project has been
implemented, the organizations evaluate actual versus expected
results for the project and revise their investment management
process based on the lessons learned.
Over the years, the Congress has passed legislation--the Chief
Financial Officers (CFO) Act, the Government Performance and Results
Act (GPRA), as well as revisions to the Paperwork Reduction Act
(PRA)--that enhances agencies' responsibility and accountability for
managing projects for results, recognizes the value of effectively
managing IT projects, and emphasizes maintaining reliable, accurate
financial cost data. In addition, ITMRA directs agencies to use a
comprehensive capital planning and investment approach for maximizing
the value and assessing and managing the risks of IT projects. By
eliminating the oversight role of the General Services Administration
(GSA), accountability for IT projects has been placed squarely with
the individual agencies.
In addition, OMB's role in overseeing federal agencies' selection and
management of IT has been significantly heightened with the passage
of ITMRA. The Director of OMB is now responsible for promoting and
directing that federal agencies establish capital planning processes
for IT investment decisions. The Director is also responsible for
evaluating the results of agency IT investments and enforcing
accountability through the budget process.
To its credit, OMB has taken a proactive role in drafting new
policies and procedures to assist agencies in establishing IT
investment decision-making approaches. For example, in November
1995, OMB published a guide designed to assist agency and OMB staff
in creating and evaluating a portfolio of IT investments.\2 In
addition, OMB is currently working on revisions to its key management
circular regarding strategic information resources management
planning, its budget submission guidance to federal agencies, and its
planning guidance for the acquisition of fixed capital assets.\3
--------------------
\2 Evaluating Information Technology Investments: A Practical Guide,
Executive Office of the President, Office of Management and Budget,
November 1995.
\3 Specifically, revisions are being drafted for Circular A-130,
Management of Federal Information Resources and Circular A-11,
Preparation and Submission of Budget Estimates. In addition,
Bulletin 95-03, Planning and Budgeting for the Acquisition of Fixed
Assets, has been replaced with Circular A-11, Part 3, Planning,
Budgeting, and Acquisition of Fixed Assets.
RESULTS IN BRIEF
---------------------------------------------------------- Chapter 0:3
In examining the IT decision processes at the five case study
agencies, GAO found elements of an investment approach embedded in
some of the agencies' existing decision-making policies and
procedures. Among the elements that GAO found were
-- project funding decision-making processes that used explicit
decision criteria to evaluate risks and returns,
-- processes to prioritize IT projects in alignment with key
strategic mission goals,
-- attempts to integrate IT funding decisions with overall
strategic business planning and direction, and
-- central management processes in place that included both line
managers and IT professionals.
However, GAO also found four cross-cutting weaknesses that prevented
the agencies from having a complete, institutionalized process that
would fulfill the intent of PRA and ITMRA. While not all four
weaknesses were present at every agency, in general GAO found that the
agencies:
-- lacked uniformity in their internal processes for selecting and
managing systems investments,
-- focused their selection processes on justifying new project
funding rather than managing all IT projects as a portfolio of
competing investments,
-- made funding decisions without giving adequate attention to
management control or evaluation processes, and
-- made funding decisions using undefined decision criteria, and
often without up-to-date or accurate cost, benefit, and risk
data to support their IT investment decisions.
With a complete investment process, agencies can gain better control
of their IT budgets, increase the odds of operational improvements,
and reduce risks. Conversely, without one, agencies increase the
chance of becoming entrapped in a host of difficult problems, such as
unmanaged development risks, higher failure rates, low-value or
redundant projects, and an overemphasis on maintaining old systems at
the expense of using technology to redesign outmoded work processes.
PRINCIPAL FINDINGS
---------------------------------------------------------- Chapter 0:4
AGENCIES NEED CONSISTENT
PROCESSES TO SELECT AND
MANAGE IT PROJECTS
-------------------------------------------------------- Chapter 0:4.1
Leading organizations use selection, control, and evaluation
processes uniformly at an enterprise level and within each business
unit of the organization. This enables an organization, even one
that is highly decentralized, to systematically identify
cross-functional system opportunities and to determine trade-offs
between projects, both within and across business units.
By contrast, there was little or no uniformity in how risks,
benefits, and costs of various IT projects were evaluated across
subunits within the case study agencies. Three of the
agencies--NASA, EPA, and NOAA--chose IT projects based on
inconsistent or nonexistent investment processes. Thus, making
cross-comparisons between systems of different size or organizational
impact was difficult at best. More important, management had no
assurance that the most important mission objectives of the agency
were being met by the suite of system investments that was selected.
AGENCIES NEED TO MANAGE
THEIR IT PROJECTS AS AN
INVESTMENT PORTFOLIO
-------------------------------------------------------- Chapter 0:4.2
In conducting their selection processes, leading organizations assess
and manage the different types of IT projects in order to create a
complete strategic investment portfolio. By analyzing the entire
portfolio, managers examine the costs of maintaining existing systems
versus investing in new ones, comparatively rank projects based on
expected net returns, and can reach decisions based on overall
contribution to the most pressing organizational needs. This
portfolio process is analogous to the capital planning and budgeting
process frequently used in both public and private sector
organizations.
At the federal agencies GAO studied, some prioritization of projects
was conducted, but none made managerial trade-offs across all types
of projects. NOAA and the Coast Guard, for instance, conducted
portfolio analyses, but these analyses focused primarily on new or
under-development projects and did not consider spending for
operations and maintenance, enhancements, or research and
development. IRS only included Tax Systems Modernization (TSM)
projects that were under development in its investment portfolio.
Of all the agencies that were reviewed, the Coast Guard had the most
comprehensive selection phase. However, this selection process was
still incomplete because it did not include all types of proposed IT
spending. Consequently, the Coast Guard could not make trade-offs
between all types of IT investments, creating a risk of implementing
new systems that duplicate existing systems or of uneconomically
maintaining old systems beyond their life cycle.
MANAGEMENT CONTROL AND
EVALUATION PROCESSES WERE
OFTEN ABSENT
-------------------------------------------------------- Chapter 0:4.3
Once selection has occurred, leading organizations continue to manage
their investments, maintaining a cycle of continual control and
evaluation. This enables senior executives to (1) identify and focus
on managing high-potential or high-risk projects, (2) reevaluate
investment decisions early in a project's life cycle if problems
arise, (3) be responsive to changing external and internal conditions
in mission priorities and budgets, and (4) learn from past successes
and failures in order to make better decisions in the future. This
focus on evaluating project performance in terms of actual results
and mission impact is also consistent with legislative provisions
contained in GPRA.
Control mechanisms in the five case study agencies were driven
primarily by cost and schedule concerns with little focus on
quantitative, outcome-based performance measures. Two of the
reviewed agencies--the Coast Guard and EPA--did not use management
control processes that focused on IT systems projects. The other
three agencies had management control processes that focused
primarily on schedule and cost concerns, but not interim evaluations
of performance and results. Rarely did GAO find examples in which
anticipated benefits were compared to results at critical project
milestones.
Postimplementation reviews (PIR) of actual versus projected returns
were rarely conducted. Four of the five federal agencies did not
systematically evaluate implemented IT projects to determine actual
costs, benefits, and risks, and information and lessons learned in
either the control or evaluation phases were not fed back into the
selection phase to improve the project selection process. IRS had
developed a PIR methodology that was used to conduct five systems
postimplementation reviews. However, the PIR methodology had not
been integrated into a cohesive investment process. As a result,
PIRs that were conducted did not meet one of their primary
objectives--to ensure continual improvement based on lessons
learned--and IRS runs the risk of repeating past mistakes.
AGENCY IT DECISIONS WERE NOT
BASED ON ADEQUATE DATA
-------------------------------------------------------- Chapter 0:4.4
To help make high-quality decisions on IT investments, leading
organizations require all projects to have accurate, complete, and
up-to-date project information. This information, which includes
cost and benefit data, risk assessments, implementation plans, and
performance measures, is used as the basis for decision-making on
project selections, ongoing monitoring activities, and evaluation of
completed projects. In the federal government, sound financial
management and cost data are a cornerstone requirement of the CFO Act
and are critical to making informed decisions under the performance
management approach required by GPRA.
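For illustration only, the following Python sketch shows one way such a
minimum data-quality check on project records could be expressed. The
field names and the 12-month currency threshold are hypothetical and
are not drawn from agency or OMB guidance.

    # Illustrative only: field names and the 12-month currency
    # threshold are hypothetical, not agency or OMB requirements.
    from datetime import date

    REQUIRED_FIELDS = ["estimated_cost", "expected_benefit",
                       "risk_assessment", "performance_measures"]

    def record_is_adequate(project, as_of, max_age_months=12):
        """Return True if a project record is complete and current."""
        missing = [f for f in REQUIRED_FIELDS if not project.get(f)]
        age = ((as_of.year - project["last_updated"].year) * 12
               + (as_of.month - project["last_updated"].month))
        return not missing and age <= max_age_months

    example = {"estimated_cost": 4.2, "expected_benefit": 6.0,
               "risk_assessment": "moderate",
               "performance_measures": ["cycle time"],
               "last_updated": date(1996, 3, 1)}
    print(record_is_adequate(example, as_of=date(1996, 9, 30)))  # True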
The agency IT investment decisions GAO examined were largely based on
undefined or implicit decision criteria. Of the five federal
agencies, only the Coast Guard had defined decision criteria for
cost, risk, and return. IRS had recently begun identifying and using
decision criteria, but these criteria were not yet complete and there
was no evidence that IRS decisions were based on acceptable data on
project costs, benefits, and risks. Generally, officials in the
other agencies stated that they determined which projects to fund
based on the judgment and expertise of the decisionmakers involved in the
process. Also, data on a project's cost, schedule, risks, and
returns were not documented, defined, or kept up-to-date, and, in
many cases, were not used to make investment decisions. While the
agencies conducted analyses to get projects approved, little effort
was made to ensure that the information was kept accurate and
up-to-date, and rarely were the data used to manage a project
throughout its life cycle.
RECOMMENDATIONS
---------------------------------------------------------- Chapter 0:5
Maximizing the returns and minimizing the risks on the billions of
dollars that are spent each year for IT will require continued
efforts on two fronts. First, federal agencies must develop and
implement a structured IT investment approach that encompasses all
aspects of the investment process. Second, oversight attention far
beyond current levels must be given to agencies' management processes
and rigorous analysis of actual results. Given the critical policy
development and oversight role prescribed to it by ITMRA, OMB has a
significant leadership responsibility in supporting agencies'
efforts. In meeting this responsibility, GAO is recommending that
OMB address four critical challenges.
OMB's first challenge is to guide and assist agencies as they
establish and improve their IT investment management processes. GAO
recommends that OMB develop guidance requiring agencies to (1)
implement IT investment decision-making processes, (2) periodically
analyze their entire portfolio of IT investments, (3) design control
and evaluation processes that include cost, schedule, and
quantitative performance measures, and (4) set minimum quality
standards for data used to assess cost, benefit, and risk decisions.
Second, OMB will need to use the results produced by the improved
investment processes to develop recommendations for the President's
budget that reflect an agency's actual track record in delivering
mission performance for IT funds expended.
Third, to ensure that the improved investment management processes
are effectively implemented, and to make the appropriate linkages
between agency track records and budget recommendations, OMB will
need to marshal the resources and skills necessary to be able to make
sound investment decisions on agency portfolios.
Finally, GAO recommends that as part of its internal implementation
strategy, the Director of OMB should consider developing an approach
to assess OMB's own performance in executing its oversight
responsibilities under ITMRA's capital planning and investment
provisions. Such a process could focus on whether and how OMB
reviews of agency processes and results have an impact on reducing
risk or increasing the returns on information technology
investments--both within and across federal agencies.
AGENCY COMMENTS
---------------------------------------------------------- Chapter 0:6
OMB's comments on a draft of this report are presented and evaluated
in chapter 3 and are reprinted in appendix I. OMB stated that the
analysis contained in the report made a positive contribution towards
understanding the critical elements that agencies will need to have
in place to implement the investment and capital planning provisions
of the Information Technology Management Reform Act of 1996. OMB
also generally supported the report's findings and four sets of
recommendations and said that it would be implementing many aspects
of the recommendations as part of the fiscal year 1998 budget review
process of fixed capital assets. However, OMB did state that rather
than evaluating each agency's IT processes and ensuring compliance
with OMB guidance, it prefers to focus on the results that are
occurring from IT investments. GAO agrees with this results-oriented
approach, provided that OMB maintains its key role in supporting and
selectively evaluating agency practices, as required by statute.
Improved IT results, which are the intent of ITMRA, will not occur
without agencies developing and implementing disciplined investment
processes and practices.
Comments from EPA, NASA, NOAA, IRS, and the Coast Guard have been
incorporated in the report where appropriate, and a summary of their
comments, as well as GAO's response, is included at the end of
chapter 2. All of the agencies generally agreed with the report's
findings, although in some cases they stated that the report did not
appropriately recognize recent progress they had made and did not
adequately address the inherent difficulties in implementing the
investment requirements of ITMRA. In updating the agency
information, GAO allowed the five agencies to provide additional
information reflecting changes and modifications they had made in
preparation for implementing provisions of ITMRA. This updated
information has been incorporated into the report where appropriate.
However, many of the process changes and modifications have occurred
very recently, and it is not yet practical to fully evaluate these
changes or determine their effects.
In addition, while all of the agencies agreed in principle with the
IT investment approach, several raised concerns regarding how the
investment process would work in different organizational
environments. GAO recognizes that IT investment decision-making must
be adaptable to different agency environments and believes that the
approach outlined in the report provides such flexibility. GAO also
agrees that implementing a mature investment decision-making approach
is a complex undertaking that will take several years to complete.
INTRODUCTION
============================================================ Chapter 1
There is an increasing demand, coming from the Congress and the
public, for a smaller government that works better and costs less.
Having valuable, accurate, and accessible financial and programmatic
information is a critical element for any improvement effort to
succeed. Furthermore, increasing the quality and speed of service
delivery while reducing costs will require the government to make
significant investments in three fundamental assets--personnel,
knowledge, and capital property/fixed assets.
Investments in information technology (IT) projects can dramatically
affect all three of these assets. Indeed, the government's ability
to improve performance and reduce costs in the information age will
depend, to a large degree, on how well it selects and uses
information systems investments to modernize its often outdated
operations. However, the impact of information technology is not
necessarily dependent on the amount of money spent, but rather on how
the investments are selected and managed. This, in essence, is the
challenge facing federal executives: Increasing the return on money
spent on IT projects by spending money more wisely, not faster.
IT projects, however, are often poorly managed. For example, one
market research group estimates that about a third of all U.S. IT
projects are canceled, at an estimated cost in 1995 of over $81
billion.\1 In the last 12 years, the federal government has obligated
at least $200 billion for information management with mixed results
at best. Yet despite this huge investment, government operations
continue to be hampered by inaccurate data and inadequate systems.
Too often, IT projects cost much more and produce much less than what
was originally envisioned. Even worse, often these systems do not
significantly improve mission performance or they provide only a
fraction of the expected benefits. Of 18 major federal agencies, 7
have an IT effort that has been identified as high risk by either the
Office of Management and Budget (OMB) or us.\2
Some private and public sector organizations, on the other hand, have
designed and managed IT to improve their organizational performance.
In a 1994 report, we analyzed the information management practices of
several leading private and state organizations.\3
These leading organizations were identified as such by their peers
and independent researchers because of their progress in managing
information to improve service quality, reduce costs, and increase
work force productivity and effectiveness. From this analysis, we
derived 11 fundamental IT management practices that, when taken
together, provide the basis for the successful outcomes that we found
in leading organizations. (See figure 1.1.)
Figure 1.1: Fundamental
Strategic Information
Management Practices
(See figure in printed
edition.)
One of the best practices exhibited by leading organizations was that
they manage information systems projects as investments.\4 This
particular practice offers organizations great potential for gaining
better control over their IT expenditures. In the short term (within
2 years), this practice serves as a powerful tool for carefully
managing and controlling IT expenditures and better understanding the
explicit costs and projected returns for each IT project. In the
long term (from 3 to 5 years), this practice serves as an effective
process for linking IT projects to organizational goals and
objectives. However, managing IT projects as investments works most
effectively when implemented as part of an integrated set of
management practices. For example, project management systems must
also be in place, reengineering improvements analyzed, and planning
processes linked to mission goals.
While the specific processes used to implement an investment approach
may vary depending upon the structure of the organization (e.g.,
centralized versus decentralized operations), we nonetheless found
that the leading organizations we studied shared several common
management practices related to the strategic use of information and
information technologies. Specifically, they maintained a
decision-making process consisting of three phases--selection,
control, and evaluation--designed to minimize risks and maximize
return on investment. (See figure 1.2.)
Figure 1.2: An IT Investment
Approach Used in Leading
Organizations
(See figure in printed
edition.)
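For illustration only, the sketch below (in Python) outlines how the
three phases might fit together over a portfolio's life. The phase
names come from the approach described above; the scoring rule and the
10-percent overrun threshold are assumptions, not practices documented
at the organizations we studied.

    # Illustrative sketch of the select/control/evaluate cycle; the
    # scoring rule and thresholds are assumptions.

    def select(candidates, budget):
        """Fund the highest return-to-risk candidates within budget."""
        ranked = sorted(candidates,
                        key=lambda p: p["expected_return"] / p["risk"],
                        reverse=True)
        funded, remaining = [], budget
        for project in ranked:
            if project["cost"] <= remaining:
                funded.append(project)
                remaining -= project["cost"]
        return funded

    def control(project):
        """At a milestone, flag projects whose actual costs have grown."""
        overrun = project["actual_cost"] / project["cost"] - 1.0
        return "reevaluate" if overrun > 0.10 else "continue"

    def evaluate(project):
        """After implementation, record actual versus expected return
        so the lesson feeds the next selection round."""
        return project["actual_return"] - project["expected_return"]

In practice, leading organizations applied far richer criteria than
this; the point of the sketch is only that each phase consumes the data
produced by the one before it.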
The Congress has passed several pieces of legislation that lay the
groundwork for agencies to establish an investment approach for
managing IT. For instance, revisions to the Paperwork Reduction Act
(PRA) (Public Law 104-13) have put more emphasis on evaluating the
operational merits of information technology projects. The Chief
Financial Officers (CFO) Act (Public Law 101-576) focuses on the need
to significantly improve financial management and reporting practices
of the federal government. Having accurate financial data is
critical to establishing performance measures and assessing the
returns on IT investments. Finally, the Government Performance and
Results Act (GPRA) (Public Law 103-62) requires agencies to set
results-oriented goals, measure performance, and report on their
accomplishments.
In addition, the recently passed Information Technology Management
Reform Act (ITMRA) (Division E of Public Law 104-106) requires
federal agencies to focus more on the results achieved through IT
investments while streamlining the federal IT procurement process.
Specifically this act, which became effective August 8 of this year,
introduces much more rigor and structure into how agencies approach
the selection and management of IT projects. Among other things, the
head of each agency is required to implement a process for maximizing
the value and assessing and managing the risks of the agency's IT
acquisitions. Appendix V summarizes the primary IT investment
provisions contained in ITMRA.
ITMRA also heightens the role of OMB in supporting and overseeing
agencies' IT management activities. The Director of OMB is now
responsible for promoting and directing that federal agencies
establish capital planning processes for IT investment decisions.
The Director is also responsible for evaluating the results of agency
IT investments and enforcing accountability. The results of these
decisions will be used to develop recommendations for the President's
budget.
OMB has begun to take action in these areas. In November 1995, OMB,
with substantial input from GAO, published a guide designed to help
federal agencies systematically manage and evaluate their IT-related
investments.\5 This guide was based on the investment processes found
at the leading organizations. Recent revisions to OMB Circular A-130
on federal information resources management have also placed greater
emphasis on managing information system projects as investments. And
the recently issued Part 3 of OMB Circular A-11, which replaced OMB
Bulletin 95-03, "Planning and Budgeting for the Acquisition of Fixed
Assets," provides additional guidance and information requirements
for major fixed asset acquisitions.
--------------------
\1 Charting the Seas of Information Technology: Chaos, The Standish
Group International Inc., 1994.
\2 See Information Technology Investment: A Governmentwide Overview
(GAO/AIMD-95-208, July 31, 1995).
\3 Executive Guide: Improving Mission Performance Through Strategic
Information Management and Technology (GAO/AIMD-94-115, May 1994).
\4 IT investment is defined as an expenditure of money and/or
resources for IT or IT-related products and services involving
managerial, technical, and organizational risk for which there are
expected benefits to the organization's performance. These benefits
are defined as improvements either in efficiency of operations or
effectiveness in services (such as reductions in process cycle time
or operational costs, increases in speed or quality of customer
service, or improvements in productivity).
\5 Evaluating Information Technology Investments: A Practical Guide,
Executive Office of the President, Office of Management and Budget,
November 1995.
OBJECTIVES, SCOPE, AND
METHODOLOGY
---------------------------------------------------------- Chapter 1:1
The Chairman, Senate Subcommittee on Oversight of Government
Management and the District of Columbia, Committee on Governmental
Affairs and the Chairman and Ranking Minority Member, House Committee
on Government Reform and Oversight, requested that we compare and
contrast the management practices and decision processes used by
leading organizations with a small sample of federal agencies. The
process used by leading organizations is embodied in OMB's Evaluating
Information Technology Investments: A Practical Guide and specific
provisions contained in the Information Technology Management Reform
Act of 1996. The agencies we examined are the National Aeronautics
and Space Administration (NASA) ($1.6 billion spent on IT in FY
1994), National Oceanic and Atmospheric Administration (NOAA) ($296
million spent on IT in FY 1994), Environmental Protection Agency
(EPA) ($302 million spent on IT in FY 1994), Coast Guard ($157
million spent on IT in FY 1994), and the Internal Revenue Service
(IRS) ($1.3 billion spent on IT in FY 1994).
We selected the federal agencies for our sample based on one or more
of the following characteristics: (1) large IT budgets, (2) expected
IT expenditure growth rates, and (3) programmatic risk as assessed by
GAO and OMB. In addition, the Coast Guard was selected because of
its progress in implementing an investment process. Collectively,
these agencies spent about $3.7 billion on IT in FY 1994--16 percent
of the federal government's total IT spending that year. Our review
focused exclusively on how well
these five agencies manage information technology as investments, one
of the 11 practices used by leading organizations to improve mission
performance, as described in our best practices report.\6 As such,
our evaluation only focused on policies and practices used at the
agencywide level; we did not evaluate the agencies' performance in
the 10 other practices. In addition, we did not systematically
examine the overall IT track records of each agency.
During our review of agency IT investment decision-making processes,
we did the following:
-- reviewed agencies' policies, practices, and procedures for
managing IT investments;
-- interviewed senior executives, program managers, and IRM
professionals; and
-- determined whether agencies followed practices similar to those
used by leading organizations to manage information systems
projects as investments.
We developed the attributes needed to manage information systems
projects as investments from the Paperwork Reduction Act, the Federal
Acquisition Streamlining Act, OMB Circular A-130, GAO's "best
practices" report on strategic information management, GAO's
strategic information management toolkit, and OMB's guide Evaluating
Information Technology Investments: A Practical Guide. Many of the
characteristics of this investment approach are contained in the
Information Technology Management Reform Act of 1996 (as summarized
in appendix V). However, this law was not in effect at the time of
our review.
To identify effects associated with the presence or absence of
investment controls, we reviewed agencies' reports and documents,
related GAO and Inspector General reports, and other external
reports. We also discussed the impact of the agencies' investment
controls with senior executives, program managers, and IRM
professionals to get an agencywide perspective on the controls used
to manage IT investments. Additionally, we reviewed agency
documentation dealing with IT selection, budgetary development, and
IT project reviews.
To determine how much each agency spent on information technology, we
asked each agency for information on spending, staffing, and their 10
largest IT systems and projects. The agencies used a variety of
sources for the same data elements, which may make comparisons among
agencies unreliable. While data submitted by the agencies were
validated by agency officials, we did not independently verify the
accuracy of the data.
Most of our work was conducted at agencies' headquarters in
Washington, D.C.  In addition, we visited NOAA offices in Rockville,
Maryland, and the National Weather Service in Silver Spring,
Maryland. We also visited NASA program, financial, and IRM officials
at Johnson Space Center in Houston, Texas, and Ames Research Center
at Moffett Field, California, to learn how they implement NASA policy
on IT management. We performed the majority of our work from April
1995 through September 1995, with selected updates through July 1996,
in accordance with generally accepted government auditing standards.
We updated our analyses of IRS and NASA in conjunction with other
related audit work.\7
In addition, several of the agencies provided us with updated
information as part of their comments on a draft version of the
report. Many of these changes have only recently occurred and we
have not fully evaluated them to determine their effect on the
agency's IT investment process.
We provided and discussed a draft of this report with officials from
OMB, EPA, NASA, NOAA, IRS, and the Coast Guard, and have incorporated
their comments where appropriate. OMB's written comments, as well as
our evaluation, are provided in appendix I.
Appendix II profiles each agency's IT spending, personnel, and major
projects. Appendix III provides a brief description of an IT
investment process approach based on work by GAO and OMB. Appendix
IV provides a brief overview of each agency's IT management
processes. Because of its relevance to this report, the investment
provisions of the Information Technology Management Reform Act of
1996 are summarized in appendix V. Major contributors to this report
are listed in appendix VI.
--------------------
\6 GAO/AIMD-94-115, May 1994.
\7 Tax Systems Modernization: Management and Technical Weaknesses
Must Be Overcome to Achieve Success (GAO/T-AIMD-96-75, March 26,
1996); Tax Systems Modernization: Actions Underway, but IRS Has Not
Yet Corrected Management and Technical Weaknesses (GAO/AIMD-96-106,
June 7, 1996); NASA Chief Information Officer: Opportunities to
Strengthen Information Resources Management (GAO/AIMD-96-78, August
15, 1996).
IT INVESTMENT APPROACHES IN FIVE
CASE STUDY AGENCIES: PROGRESS HAS
BEEN MADE, BUT CHALLENGES REMAIN
============================================================ Chapter 2
All of the agencies we studied--NASA, IRS, the Coast Guard, NOAA, and
EPA--had at least elements or portions of an IT investment process in
place. For instance,
-- the Coast Guard had a selection process with decision criteria
that included an analysis of cost, risk, and return data;
EPA had created an executive management group to address
cross-agency IT issues;
-- NASA and NOAA utilized program control meetings to ensure senior
management involvement in monitoring the progress of important
ongoing IT projects; and
-- IRS had developed a systems investment evaluation review
methodology and used it to conduct postimplementation reviews of
some Tax Systems Modernization projects.
However, none of these five agencies had implemented a complete,
institutionalized investment approach that would fulfill requirements
of PRA and ITMRA. Consequently, IT decision-making at these agencies
was often inconsistent or based on the priorities of individual units
rather than the organization as a whole. Additionally, cost-benefit
and risk analyses were rarely updated as projects proceeded and were
not used for managing project results. Also, the mission-related
benefits of implemented systems were often difficult to determine
since agencies rarely collected or compared data on anticipated
versus actual costs and benefits.
In general, we found that the IT investment control processes used at
the case study agencies at the time of our review contained four main
weaknesses.  While not all four weaknesses were present at every
agency, in comparison to leading organizations, the case study
agencies
-- lacked a consistent process (used at all levels of the agency)
for uniformly selecting and managing systems investments;
-- focused their selection processes on selected efforts, such as
justifying new project funding or focusing on projects already
under development, rather than managing all IT projects--new,
under development, and operational--as a portfolio of competing
investments;
-- made funding decisions without giving adequate attention to
management control or evaluation processes; and
-- made funding decisions based on negotiations or undefined
decision criteria and did not have the up-to-date, accurate data
needed to support IT investment decisions.
Appendix IV provides a brief overview of how each agency's current
processes for selecting, controlling, and evaluating IT projects
worked.
AGENCIES NEED CONSISTENT
PROCESSES TO SELECT AND MANAGE
IT INVESTMENTS
---------------------------------------------------------- Chapter 2:1
Leading organizations use the selection, control, and evaluation
decision-making processes in a consistent manner throughout different
units. This enables the organization, even one that is highly
decentralized, to make trade-offs between projects, both within and
across business units.
Figure 2.1 illustrates how this process can be applied to the federal
government where major cabinet departments may have several agencies
under their purview. IT portfolio investment processes can exist at
both the departmental and agency levels. As with leading
organizations, the key factor is being able to determine which IT
projects and resources are shared (and should be reviewed at the
departmental level) and which are unique to each agency. Three
common criteria used by leading organizations are applicable in the
federal setting. These threshold criteria include (1) high-dollar,
high-risk IT projects (risk and dollar amounts having been already
defined), (2) cross-functional projects (two or more organizational
units will benefit from the project), and (3) common infrastructure
support (hardware and telecommunications). Projects that meet these
particular threshold criteria are discussed, reviewed, and decided
upon at a departmentwide level. The key to making this work is
having clearly defined roles, responsibilities, and criteria for
determining the types of projects that will be reviewed at the
different organizational levels.
Figure 2.1: The IT Investment
Process Is Uniform Throughout
an Organization
(See figure in printed
edition.)
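As a sketch only, the routine below shows how the three threshold
criteria might be expressed as a simple routing rule. The $25 million
threshold and the field names are illustrative assumptions, not values
taken from any department's guidance.

    # Hypothetical routing rule for the three threshold criteria
    # above; the dollar threshold and field names are illustrative.

    DOLLAR_THRESHOLD_MILLIONS = 25.0

    def review_level(project):
        """Route a project to departmental or agency-level review."""
        high_dollar_high_risk = (
            project["cost_millions"] >= DOLLAR_THRESHOLD_MILLIONS
            and project["risk"] == "high")
        cross_functional = len(project["benefiting_units"]) >= 2
        common_infrastructure = project["category"] in (
            "hardware", "telecommunications")
        if (high_dollar_high_risk or cross_functional
                or common_infrastructure):
            return "departmental review"
        return "agency review"

    print(review_level({"cost_millions": 40.0, "risk": "high",
                        "benefiting_units": ["enforcement"],
                        "category": "application"}))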
As described in ITMRA, agency heads are to implement a process for
maximizing the value and assessing and managing the risks of IT
investments. Further, this process should be integrated with the
agency's budget, financial, and program management process(es).
Whether highly centralized or decentralized, matrixed or hierarchical,
agencies can most effectively reap the benefits of an investment
process by developing and maintaining consistent processes within and
across their organizations.
One of the agencies we reviewed--the Coast Guard--used common
investment criteria for making cross-agency IT decisions. IRS had
defined some criteria, but was not yet using these criteria to make
decisions. The three other agencies--NASA, EPA, and NOAA--chose IT
projects based on inconsistent or nonexistent investment processes.
There was little or no uniformity in how the risks, benefits, and
costs of IT projects were evaluated across offices and divisions
within these three agencies.  Thus, cross-comparisons between
systems of similar size, function, or organizational impact were
difficult at best. More important, management had no assurance that
the most important mission objectives of the agency were being met by
the suite of system investments that was selected.
NASA, for instance, allowed its centers and programs to make their
own IT funding decisions for mission-critical systems. These
decisions were made without an agencywide mechanism in place to
identify low-value IT projects or costs that could be avoided by
capitalizing on opportunities for data sharing and system
consolidation across NASA units. As a result, identifying
cross-functional system opportunities was problematic at best.
The scope of this problem became apparent as a result of a special
NASA IT review. In response to budget pressures, NASA conducted an
agencywide internal information systems review to identify cost
savings. The resulting March 1995 report described numerous
instances of duplicate IT resources, such as large-scale computing
and wide area network services, that were providing similar
functions.\1 A subsequent NASA Inspector General's (IG) report, also
issued in March 1995, substantiated this special review, finding that
at one center NASA managers had expended resources to purchase or
develop information systems that were already available elsewhere,
either within NASA or, in some cases, within that center itself.\2
While this special review prompted NASA to plan several consolidation
efforts, such as consolidating its separate wide area networks (for a
NASA projected savings of $236 million over 5 years), the risk of
purchasing duplicate IT resources remained because of weaknesses in
its current decentralized decision-making process. For example, NASA
created chief information officer (CIO) positions for NASA
headquarters and for each of its 23 centers. These CIOs have a key
role in improving agencywide IT cooperation and coordination.
However, the CIOs have limited formal authority and to date have only
exercised control over NASA's administrative systems--which account
for about 10 percent of NASA's total IT budget. With more defined
CIO roles, responsibility, and authority, it is likely that
additional opportunities for efficiencies will be identified.\3
NASA recently established a CIO council to set high-level
policies and standards, approve information resources management
plans, and address issues and initiatives. The council will also
serve as the IT capital investment advisory group to the proposed
NASA Capital Investment Council. NASA plans for this Capital
Investment Council to have responsibility for looking at all capital
investments across NASA, including those for IT. NASA's proposed
Capital Investment Council may fill this need for identifying
cross-functional opportunities; however, it is too early to evaluate
its impact.
By having consistent, quantitative, and analytical processes across
NASA that address both mission-critical and administrative systems,
NASA could more easily identify cross-functional opportunities. NASA
has already demonstrated that savings can be achieved by looking
within mission-critical systems for cross-functional opportunities.
For instance, NASA estimated that $74 million was saved by developing
a combined Space Station and Space Shuttle control center using
largely commercial off-the-shelf software and a modular development
approach, rather than the original plan of having two separate
control centers that used mainframe technology and custom software.
EPA, like NASA, followed a decentralized approach for making IT
investment decisions. Program offices have had control and
discretion over their specific IT budgets, regardless of project size
or possible cross-office impact. As we have previously reported,\4
this has led to stovepiped systems that do not have standard data
definitions or common interfaces, making it difficult to share
environmental data across the agency. This is important because
sharing environmental data across the agency is crucial to
implementing EPA's strategic goals. In 1994, EPA began to address
this problem by creating a senior management Executive Steering
Committee (ESC) charged with ensuring that investments in agencywide
information resources are managed efficiently and effectively. This
committee, composed of senior EPA executives, has the responsibility
to (1) recommend funding on major system development efforts and (2)
allocate the IT budget reserved for agencywide IRM initiatives, such
as geographical information systems (GIS) support and data standards.
At the time of our review, the ESC had not reviewed or made
recommendations on any major information system development efforts.
Instead, the ESC focused its activity on spending funds allocated to
it for agencywide IRM policy initiatives, such as intra-agency data
standards. The ESC met on June 26, 1996, to assess the impact of
ITMRA upon EPA's IT management process.
--------------------
\1 Information Systems Cross-Cutting Team Report, NASA, March 20,
1995.
\2 Audit Report: Survey of NASA Information Systems, NASA, Office of
Inspector General, March 29, 1995.
\3 NASA Chief Information Officer: Opportunities to Strengthen
Information Resources Management (GAO/AIMD-96-78, August 15, 1996).
\4 Environmental Protection: EPA's Plans to Improve Longstanding
Information Resources Management Problems (GAO/AIMD-93-8, September
1993) and Environmental Enforcement: EPA Needs a Better Strategy to
Manage Its Cross-Media Information (GAO/IMTEC-92-14, April 1992).
AGENCIES NEED TO MANAGE THEIR
IT PROJECTS AS A PORTFOLIO
---------------------------------------------------------- Chapter 2:2
In conducting their selection processes, leading organizations assess
and manage the different types of IT projects, such as
mission-critical or infrastructure, at all different phases of their
life cycle, in order to create a complete strategic investment
portfolio. (See figure 2.2.) By scrutinizing and analyzing their
entire IT portfolio, managers can examine the costs of maintaining
existing systems versus investing in new ones. By continually and
rigorously reevaluating the entire project portfolio based on mission
priorities, organizations can reach decisions on systems based on
overall contribution to organizational goals. Under ITMRA, agencies
will need to compare and prioritize projects using explicit
quantitative and qualitative decision criteria.
Figure 2.2: A Comprehensive
Approach Includes All Major IT
Projects
(See figure in printed
edition.)
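A minimal sketch of such a ranking follows; the weights and the
one-to-ten scores are invented for illustration and are not any
agency's actual criteria. The point is that proposed,
under-development, and operational projects are scored and compared in
a single list rather than in separate pools.

    # Minimal portfolio-ranking sketch; weights and scores are
    # invented for illustration only.
    portfolio = [
        {"name": "new case tracking system", "phase": "proposed",
         "ret": 8, "risk": 6, "mission_fit": 9},
        {"name": "legacy payroll maintenance", "phase": "operational",
         "ret": 3, "risk": 2, "mission_fit": 5},
        {"name": "imaging pilot", "phase": "under development",
         "ret": 7, "risk": 8, "mission_fit": 7},
    ]

    def score(p, w_ret=0.4, w_risk=0.3, w_fit=0.3):
        """Higher return and mission fit raise the score; risk lowers it."""
        return w_ret * p["ret"] - w_risk * p["risk"] + w_fit * p["mission_fit"]

    for p in sorted(portfolio, key=score, reverse=True):
        print(f'{p["name"]:30s} {p["phase"]:18s} score={score(p):.1f}')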
At the federal agencies we studied, some prioritization of projects
was conducted, but none made managerial trade-offs across all types
of projects. IRS, NOAA, and the Coast Guard each conducted some type
of portfolio analyses; EPA and NASA did not. Additionally, the
portfolio analyses that were performed generally covered projects
that were either high dollar, new, or under development. For
example, in 1995 we reported that IRS executives were consistently
maintaining that all 36 TSM projects, estimated to cost up to $10
billion through the year 2001, were equally important and must all be
completed for the modernization to succeed.\5 This approach, along
with the initial failure to rank the TSM projects by priority of need
and expected mission performance improvement, meant that IRS could not
be sure that the most important projects were being developed first.
Since our 1995 report, IRS has begun to rank and prioritize all of
the proposed TSM projects using cost, risk, and return decision
criteria. However, these decision criteria are largely qualitative,
the data used for decisions were not validated or reliable, and
analyses were not based on calculations of expected return on
investment.\6
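For illustration, a quantitative expected-return calculation of the
kind these criteria lacked might look like the following; all figures
are invented and do not describe any IRS or TSM project.

    # Illustrative expected return-on-investment calculation; the
    # figures are invented and describe no actual project.
    development_cost = 12.0      # millions of dollars
    annual_operating_cost = 1.5  # millions per year
    annual_benefit = 5.0         # millions per year in avoided costs
    years = 5

    total_cost = development_cost + annual_operating_cost * years
    total_benefit = annual_benefit * years
    expected_roi = (total_benefit - total_cost) / total_cost
    print(f"Expected ROI over {years} years: {expected_roi:.0%}")  # 28%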
In addition, according to IRS, its investment review board uses a
separate process with different criteria for analyzing operational
systems. IRS also said that the board does not review research and
development (R&D) systems or field office systems. Using separate
processes for some system types and not including all systems
prevents IRS from making comparisons and trade-offs as part of a
complete IT portfolio.
Of all the agencies we reviewed, the Coast Guard had the most
experience using a comprehensive selection phase. In 1991, the Coast
Guard started a strategic information resources management process
and shortly thereafter initiated an IT investment process. Under
this investment process, a Coast Guard working group from the IRM
office ranks and prioritizes new IT projects and those under
development based on explicit risk and return decision criteria. A
senior management board meets annually to rank the projects and
decide on priorities.
The Coast Guard has derived benefits from its project selection
process. During the implementation of its IT investment process, the
Coast Guard identified opportunities for systems consolidation. For
example, the Coast Guard reported that five separate personnel
systems are being incorporated into the Personnel Management
Information System/Joint Military Pay System II for a cost avoidance
of $10.2 million. The Coast Guard also identified other systems
consolidation opportunities that, if implemented, could result in a
total cost savings of $77.4 million.
However, at the time of our review, the Coast Guard's selection
process was still incomplete. For example, R&D projects and
operational systems were not included in the prioritization process.
As a result, the Coast Guard could not make trade-offs between all
types of proposed systems investments, creating a risk that new
systems would be implemented that duplicate existing systems.
Additionally, the Coast Guard was at risk of overemphasizing
investments in one area, such as maintenance and enhancements for
existing systems, at the expense of higher value investments in other
areas, such as software applications development supporting multiple
unit needs.
--------------------
\5 Tax Systems Modernization: Management and Technical Weaknesses
Must be Corrected if Modernization Is to Succeed, (GAO/AIMD-95-156,
July 26, 1995).
\6 Tax Systems Modernization: Actions Underway, but IRS Has Not Yet
Corrected Management and Technical Weaknesses (GAO/AIMD-96-106, June
7, 1996).
MANAGEMENT CONTROL AND
EVALUATION PROCESSES WERE OFTEN
ABSENT
---------------------------------------------------------- Chapter 2:3
Leading organizations continue to manage their investments once
selection has occurred, maintaining a cycle of continual control and
evaluation. Senior managers review the project at specific
milestones as the project moves through its life cycle and as the
dollar amounts spent on the project increase. (See figure 2.3.) At
these milestones, the executives compare the expected costs, risks,
and benefits of earlier phases with the actual costs incurred, risks
encountered, and benefits realized to date. This enables senior
executives to (1) identify and focus on managing high-potential or
high-risk projects, (2) reevaluate investment decisions early in a
project's life cycle if problems arise, (3) be responsive to changing
external and internal conditions in mission priorities and budgets,
and (4) learn from past success and mistakes in order to make better
decisions in the future. The level of management attention devoted
to each of the three investment phases varies in proportion to such
factors as the relative importance of each project in the portfolio,
the relative project risks, and the relative number of projects in
different phases of the system development process.
Figure 2.3: IT Investment Is a
Continuous and Dynamic Process
(See figure in printed
edition.)
The control phase focuses senior executive attention on ongoing
projects to regularly monitor their interim progress against
projected risks, cost, schedule, and performance. The control phase
requires projects to be modified, continued, accelerated, or
terminated based on the results of those assessments. In the
evaluation phase, the attention is focused on implemented systems to
give a final assessment of risks, costs, and returns. This
assessment is then used to improve the selection of future projects.
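To make the milestone comparison described above concrete, the
following sketch (written in Python purely for illustration) shows
one way projected figures could be compared with actuals to suggest a
continue, modify, or reevaluate decision. The figures, tolerance, and
decision rules are hypothetical and are not drawn from any
organization's actual process.

  # A minimal sketch of a milestone control review; the figures,
  # tolerance, and decision rules are hypothetical illustrations only.
  def milestone_review(planned_cost, actual_cost, planned_benefit,
                       benefit_realized, overrun_tolerance=0.10):
      """Compare projections with actuals and suggest an action."""
      cost_variance = (actual_cost - planned_cost) / planned_cost
      benefit_shortfall = (planned_benefit - benefit_realized) / planned_benefit
      if cost_variance > overrun_tolerance and benefit_shortfall > 0.5:
          return "reevaluate or terminate"
      if cost_variance > overrun_tolerance:
          return "modify (rebaseline cost and schedule)"
      return "continue"

  # Example: a project 20 percent over cost with most benefits unrealized.
  print(milestone_review(planned_cost=10.0, actual_cost=12.0,
                         planned_benefit=4.0, benefit_realized=1.5))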
Similarly in the federal government, GPRA forces a shift in the focus
of federal agencies--away from such traditional concerns as staffing
and activity levels and towards one overriding issue: results. GPRA
requires agencies to set goals, measure performance, and report on
their accomplishments. Just as in leading organizations, GPRA, in
concert with the CFO Act, is intended to bring a more disciplined,
businesslike approach to the management of federal programs.
The agencies we reviewed focused most of their resources and
attention on selecting projects and gave less attention to
controlling or evaluating those projects. While IRS, NASA, and NOAA
had implemented control mechanisms, and IRS had developed a
postimplementation review methodology, none of the agencies had
complete and comprehensive control and evaluation processes in place.
Specifically, in the five case study agencies we evaluated, we found
that
-- control mechanisms were driven primarily by cost and schedule
concerns without any focus on quantitative performance measures,
-- evaluations of actual versus projected returns were rarely
conducted, and
-- information and lessons learned in either the control or
evaluation phases were not systematically fed back to the
selection phase to improve the project selection process.
MANAGEMENT CONTROL PROCESSES
WERE FOCUSED PRIMARILY ON
COST AND SCHEDULE
-------------------------------------------------------- Chapter 2:3.1
Leading organizations maintain control of a project throughout its
life cycle by regularly measuring its progress against not only
projected cost and schedule estimates, but also quantitative
performance measures, such as benefits realized or demonstrated in
pilot projects to date. To do this, senior executives from the
program, IRM, and financial units continually monitor projects and
systems for progress and identify problems. When problems are
identified, they take immediate action to resolve them, minimize
their impact, or alter project expectations.
Legislation now requires federal executives to conduct this type of
rigorous project monitoring. With the passage of ITMRA, agencies are
required to demonstrate, through performance measures, how well IT
projects are improving agency operations and mission effectiveness.
Senior managers are also to receive independently verifiable
information on cost, technical and capability requirements,
timeliness, and mission benefit data at project milestones.
Furthermore, pursuant to the Federal Acquisition Streamlining Act of
1994 (Public Law 103-355), if a project deviates from cost, schedule,
and performance goals, the agency head is required to conduct a
timely review of the project and identify appropriate corrective
action--to include project termination.
Two of the agencies we reviewed--the Coast Guard and EPA--did not use
management control processes that focused on IT systems projects.
The other three agencies--IRS, NOAA, and NASA--had management control
processes that focused primarily on schedule and cost concerns, but
not interim evaluations of performance and results. Rarely did we
find examples in which anticipated benefits were compared to results
at critical project milestones. We also found few examples of
lessons that were learned during the control phase being cycled back
to improve the selection phase.
To illustrate, both IRS and NASA used program control meetings (PCMs)
to keep senior executives informed of the status of their major
systems by requiring reports, in the form of self-assessments, from
the project managers. However, these meetings did not focus on how
projects were achieving interim, measurable improvement targets for
quality, speed, and service that could form the basis for project
decisions about major modifications or termination. IRS, for
instance, used an implementation schedule to track different
components of each of its major IT projects under TSM. Based on our
discussions with IRS officials, the PCMs focused on factors bearing
on real or potential changes in project costs or schedule. Actual,
verified data on interim application or system testing
results--compared against projected improvements in operations and
mission performance--were not evaluated.
At NASA, senior program executives attended quarterly Program
Management Council (PMC) meetings to be kept informed of major
programs and projects and to take action when problems arose. While
not focused exclusively on IT issues, the PMC meetings were part of a review
process that looked at implementation issues of programs and projects
that (1) were critical to fulfilling NASA's mission, particularly
those that were assigned to two or more field installations, (2)
involved the allocation of significant resources, defined as projects
whose life-cycle costs were over $200 million, or (3) warranted
special management attention, including those that required external
agency reporting on a regular basis. During the PMC meetings, senior
executives reviewed self-assessments (grades of green, yellow, and
red), done by the responsible project manager, on the cost, schedule,
and technical progress of the project.
Using this color-coded grading scheme, NASA's control process focused
largely on cost, schedule, and technical concerns, but not on
assessing improvements to mission performance. Additionally, the
grading scheme was not based on quantitative criteria, but instead
was largely qualitative and subjective in nature. For instance,
projects were given a "green" rating if they were "in good shape and
on track consistent with the baseline." A "yellow" rating was defined
as a "concern that is expected to be resolved within the schedule and
budget margins," and a "red" rating was defined as "a serious problem
that is likely to require a change in the baseline determined at the
beginning of the project." However, the lack of quantitative
criteria, benefit analysis, and performance data invited widely
divergent interpretations and misunderstanding of the true value of
the projects under review.
As of 1995, three IT systems had met NASA's review criteria and had
been reviewed by the PMC. These three systems constituted about 7
percent of NASA's total fiscal year 1994 IT spending. No similar
centralized review process existed for lower dollar projects; as a
result, problem projects and systems that collectively added up to
significant costs could have been overlooked. For instance, in
1995 NASA terminated an automated accounting system project that had
been under development for about 6 years, had cost about $45 million
to date, and had an expected life-cycle cost of over $107 million.
In responding to a draft of this report, the NASA CIO said that the
current cost threshold of $200 million is being reduced to a lower
level to ensure that most, if not all, agency IT projects will be
subject to PMC reviews. In addition, the CIO noted that NASA's
internal policy directive on program/project management is being
revised to (1) include IT evaluation criteria that are aligned with
ITMRA and executive-branch guidance and (2) clearly establish the
scope and levels of review (agency, lead center, or center) for IT
investment decisions.
EVALUATIONS OF ACTUAL VERSUS
PROJECTED RETURNS WERE RARELY
CONDUCTED
-------------------------------------------------------- Chapter 2:3.2
Once projects have been implemented and become operational, leading
organizations evaluate them to determine whether they have achieved
the expected benefits, such as lowered cost, reduced cycle time,
increased quality, or faster service delivery. They
do this by conducting project postimplementation reviews (PIRs) to
compare actual to planned cost, returns, and risks. The PIR results
are used to calculate a final return on investment, determine whether
any unanticipated modifications may be necessary to the system, and
provide "lessons learned" input for changes to the organization's IT
investment processes and strategy. ITMRA now requires agencies to
report to OMB on the performance benefits achieved by their IT
investments and how those benefits support the accomplishment of
agency goals.
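As a simple illustration of the arithmetic behind such a review, the
sketch below (in Python, for illustration only) compares hypothetical
planned and actual cost and benefit figures and computes a realized
return on investment; the numbers are invented and do not represent
any agency's actual data.

  # Illustrative postimplementation review comparison; all figures
  # (in millions of dollars) are hypothetical.
  def roi(total_benefits, total_costs):
      """Return on investment: net benefits divided by costs."""
      return (total_benefits - total_costs) / total_costs

  planned = {"cost": 45.0, "benefits": 90.0}
  actual = {"cost": 60.0, "benefits": 55.0}

  print(f"Planned ROI: {roi(planned['benefits'], planned['cost']):.0%}")
  print(f"Actual ROI:  {roi(actual['benefits'], actual['cost']):.0%}")
  print(f"Cost growth: {(actual['cost'] - planned['cost']) / planned['cost']:.0%}")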
Only one of the five federal agencies we
reviewed--IRS--systematically evaluated implemented IT projects to
determine actual costs, benefits, and risks. Indeed, we found that
most of the agencies rarely evaluated implemented IT projects at all.
In general, the agency review programs were insufficiently staffed
and used poorly defined and inconsistent approaches. In addition, in
cases where evaluations were done, the findings were not used to
consider improvements or revisions in the IT investment
decision-making process.
NOAA, for instance, had no systematic process in place to ensure that
it was achieving the planned benefits from its annual $300 million IT
expenditure. For example, of the four major IT projects that
constitute the $4.5 billion National Weather Service (NWS)
modernization effort, only the benefits actually accruing from
one--the NEXRAD radars--had been analyzed.\7
While not the only review mechanism used by the agency, NOAA's
central review program was poorly staffed. NOAA headquarters, with
half a staff year devoted to this review program, generally conducted
reviews in collaboration with other organizational units and had
participated in only four IT reviews over the last 3 fiscal years.
Additionally, these reviews generally did not address the systems'
projected versus actual cost, performance, and benefits.
IRS had developed a PIR methodology that it used to conduct
postimplementation reviews of five systems. A standardized methodology is
important because it makes the reviews consistent and adds rigor to
the analytical steps used in the review process. The IRS used the
June 1994 PIR on the Corporate Files On-Line (CFOL) system as the
model for this standardized methodology. In December 1995, IRS used
the PIR methodology to complete a review of the Service Center
Recognition/Image Processing System (SCRIPS). Subsequently, three
more PIRs have been completed (TAXLINK, the Enforcement Revenue
Information System, and the Integrated Collection System) and five
more are scheduled. IRS estimated that the five systems reviewed
have an aggregate cost of about $845 million.
However, the PIR methodology was not integrated into a cohesive
investment process. Specifically, there were no mechanisms in place
to take the lessons learned from the PIRs and apply them to the
decision criteria and other tools and techniques used in its
investment process. As a result, the PIRs that were conducted did
not meet one of their primary objectives--to ensure continual
improvement based on lessons learned--and IRS ran the risk of
repeating past mistakes.
--------------------
\7 For more information on the status of the NWS modernization, see
Weather Service Modernization: Despite Progress, Significant
Problems and Risks Remain (T-AIMD-95-87, Feb. 21, 1995).
AGENCY IT DECISIONS ARE NOT
BASED ON ADEQUATE DATA
---------------------------------------------------------- Chapter 2:4
To help make continual decisions on IT investments, leading
organizations require all projects to have complete and up-to-date
project information. This information includes cost and benefit
data, risk assessments, implementation plans, and initial performance
measures. (See figure 2.4.) Maintaining this information allows
senior managers to rigorously evaluate the current status of
projects. In addition, it allows them to compare IT projects across
the organization; consider continuation, delay, or cancellation
trade-offs; and take action accordingly.
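One way to picture the kind of project record implied here is as a
simple structured data type. The sketch below (in Python) is purely
illustrative; the field names and values are assumptions, not any
agency's actual data elements.

  # Illustrative record of the project data a review board might keep
  # current; field names and values are hypothetical.
  from dataclasses import dataclass, field
  from typing import Dict, List

  @dataclass
  class ProjectRecord:
      name: str
      life_cycle_phase: str        # e.g., "under development", "operational"
      estimated_cost: float        # projected life-cycle cost, in millions
      actual_cost_to_date: float
      expected_benefits: float     # quantified benefits, in millions
      risk_factors: List[str] = field(default_factory=list)
      performance_measures: Dict[str, float] = field(default_factory=dict)

  record = ProjectRecord(
      name="Example System",
      life_cycle_phase="under development",
      estimated_cost=120.0,
      actual_cost_to_date=35.0,
      expected_benefits=200.0,
      risk_factors=["new technology", "dependency on other systems"],
      performance_measures={"cycle time reduction (percent)": 25.0},
  )
  print(record.name, record.life_cycle_phase)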
ITMRA requires agencies to use quantitative and qualitative criteria
to evaluate the risks and the returns of IT investments. As such,
agencies need to collect and maintain accurate and reliable cost,
benefit, risk, and performance data to support project selection and
control decisions. The requirement for accurate, reliable, and
up-to-date financial and programmatic information is also a primary
requirement of the CFO Act and is essential to fulfilling agency
requirements for evaluating program results and outcomes under GPRA.
At the five case study agencies we evaluated, we found that, in
general
-- agency IT investment decisions were based on undefined or
implicit decision criteria, and
-- data on the project's cost, schedule, risks, and returns were
not documented, defined, or kept up-to-date, and, in many cases,
were not used to make investment decisions.
Figure 2.4: Consistent,
Well-defined, and Up-to-date
Data Are Essential Throughout
the IT Investment Process
(See figure in printed
edition.)
EXPLICIT DECISION CRITERIA
WERE NOT DEFINED
-------------------------------------------------------- Chapter 2:4.1
To ensure that all projects and operational systems are treated
consistently, leading organizations define explicit risk and return
decision criteria. These criteria are then used to evaluate every IT
project or system. Risk criteria involve managerial, technical,
resource, skill, security, and organizational factors, such as the
size and scope of the project, the extent of use of new technology,
the potential effects on the user organization, the project's
technical complexity, and the project's level of dependency on other
systems or projects. Return criteria are measured in financial and
nonfinancial terms. Financial measurements can include return on
investment and internal rate of return analyses while nonfinancial
assessments can include improvements in operational efficiency,
reductions in cycle time, and progress in better meeting customer
needs.
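For readers unfamiliar with these financial measures, the following
sketch (in Python, for illustration only) shows how net present value
and internal rate of return might be computed for a candidate
project. The cash-flow profile and discount rate are hypothetical and
are not taken from any project discussed in this report.

  # Illustrative financial return measures; the cash flows and
  # discount rate are hypothetical.
  def npv(rate, cash_flows):
      """Net present value, where cash_flows[0] occurs in year 0."""
      return sum(cf / (1.0 + rate) ** t for t, cf in enumerate(cash_flows))

  def irr(cash_flows, low=-0.99, high=10.0, tol=1e-6):
      """Internal rate of return found by bisection on the NPV."""
      while high - low > tol:
          mid = (low + high) / 2.0
          if npv(mid, cash_flows) > 0:
              low = mid
          else:
              high = mid
      return (low + high) / 2.0

  # A $5 million investment followed by $1.5 million in annual benefits.
  flows = [-5.0, 1.5, 1.5, 1.5, 1.5, 1.5]
  print(f"NPV at 7 percent: {npv(0.07, flows):.2f} million")
  print(f"IRR: {irr(flows):.1%}")

Bisection works here only because the example cash flow has a single
sign change, so net present value declines steadily as the discount
rate rises.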
Of the five agencies in our sample, only the Coast Guard used a
complete set of decision criteria. These decision criteria included
(1) risk assessments of schedule, cost, and technical feasibility
dimensions, (2) cost-benefit impacts of the investment, (3) mission
effectiveness measures, (4) degree of alignment with strategic goals
and high-level interest (such as Congress or the President), and (5)
organizational impact in the areas of personnel training, quality of
work life, and increased scope of service. The Coast Guard used
these criteria to prioritize IT projects and justify final
selections. The decision criteria were weighted and scored, and
projects were evaluated to determine those with the greatest
potential to improve mission performance.
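A weighted-criteria ranking of this kind can be expressed very
simply. The sketch below (in Python) scores two hypothetical projects
against weighted criteria similar in spirit to those listed above;
the weights and scores are invented for illustration and do not
reflect the Coast Guard's actual values.

  # Hypothetical weighted scoring of candidate IT projects; weights
  # and scores are illustrative only.
  weights = {
      "risk": 0.25,          # schedule, cost, and technical feasibility
      "cost_benefit": 0.25,  # cost-benefit impact of the investment
      "mission": 0.20,       # mission effectiveness measures
      "alignment": 0.20,     # alignment with strategic goals
      "org_impact": 0.10,    # training, quality of work life, scope of service
  }

  projects = {
      "Project A": {"risk": 7, "cost_benefit": 9, "mission": 8,
                    "alignment": 6, "org_impact": 5},
      "Project B": {"risk": 5, "cost_benefit": 6, "mission": 9,
                    "alignment": 9, "org_impact": 7},
  }

  def weighted_score(scores):
      """Sum of each criterion score times its weight."""
      return sum(weights[c] * scores[c] for c in weights)

  # Rank projects from highest to lowest weighted score.
  for name in sorted(projects, key=lambda p: -weighted_score(projects[p])):
      print(f"{name}: {weighted_score(projects[name]):.2f}")

In this sketch, risk is scored so that a lower-risk project receives
a higher score, an assumption rather than a statement of any agency's
actual convention.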
Generally, officials in other agencies stated that they determine
which projects to fund based on the judgmental expertise of
decisionmakers involved in the process. NOAA, for instance, had a
board of senior executives that met annually to make budget
decisions across seven strategic goals. Working groups for each
strategic goal met and each created a prioritized funding list, which
was then submitted to the executive decision-making board. These
working groups did not have uniform criteria for selecting projects.
The executive board accepted the prioritized lists as submitted and
made funding threshold decisions based on these lists. As a result,
the executive board could not easily make consistent, accurate, and
repeatable trade-offs among the projects selected by the individual
working groups.
In addition, to maximize funding for a specific working group,
project rankings may not have been based on true risk or return.
According to a NOAA senior manager and the chair of one of the NOAA
working groups, one group ranked high-visibility projects near the
bottom of the list to encourage the senior decision-making board to
draw the budgetary cut-off line below these high visibility projects.
Few of these high-visibility projects were at the top of the list,
despite being crucial to NOAA and high on the list of the NOAA
Administrator's priorities. Explicit decision criteria would
eliminate this type of budgetary gamesmanship.
DATA WERE OFTEN NOT
CONSISTENT OR UP-TO-DATE
-------------------------------------------------------- Chapter 2:4.2
Leading organizations consider project data the foundation on which
they select, control, and evaluate their IT investments. Without
these data, participants in an investment process cannot determine
the value of any one project. Leading organizations use rigorous and
up-to-date cost-benefit analyses, risk assessments, sensitivity
analyses, and project-specific data--including current costs,
staffing, and performance--to make funding decisions and project
modifications based, whenever possible, on quantifiable data.
While the agencies in our sample developed documents in order to get
project approvals, little effort was made to ensure that the
information was kept accurate and up-to-date, and rarely were the
data used to manage the project throughout its life cycle. During
our review, we asked each agency to supply us with basic data on its
largest dollar IT projects. However, this information was not
readily available and gathering it required agency officials to rely
on a variety of sometimes incomparable sources for system cost,
life-cycle phase, and staffing levels.
In addition, some of the agencies could not comparatively analyze IT
projects because they did not keep a comprehensive accounting of data
on all of their IT systems. For example, EPA had to conduct a special
information collection to identify life-cycle cost estimates on its
major systems and projects for this report. While the individual
system managers at EPA did have system life-cycle cost estimates, the
fact that this information was maintained in a decentralized fashion made
cross-system comparisons unlikely. In a 1995 report, the NASA IG
found that neither NASA headquarters nor any of the NASA centers had
a complete inventory of all information systems for which they were
responsible.\8
All of the agencies we reviewed conducted cost-benefit analyses for
their major IT projects. However, these analyses were generally done
to support decisions for project approval and were seldom kept
current. In addition, the cost-benefit projections were rarely used
to evaluate actual project results.
The NWS modernization, for instance, has a cost-benefit analysis that
was done in 1992. This analysis covers the four major systems under
the modernization.\9 To be effective, an analysis should include the
costs and benefits of each project, alternatives to that project, and
finally, a combined cost-benefit analysis for the entire
modernization. However, the cost-benefit analysis that was conducted
only compares the aggregate costs and benefits of the NWS
modernization initiative against the current system. It does not
assess or analyze the costs and benefits of each system, nor does it
examine alternatives to those systems. As a result, NWS does not
know if each of the modernization projects is cost-beneficial, and
cannot make trade-offs among them. Using only this analysis,
decision-makers are forced to choose either the status quo or all of
the projects proposed under the modernization.
Without updated cost-benefit data, informed management decisions
become difficult. We reported in April 1995 that NWS was trying to
assess user concerns related to the Automated Surface Observing
System (ASOS), one of the NWS modernization projects, but that NWS
did not have a complete estimate of what it would cost to address
these concerns.\10 As we concluded in the report, without reliable
estimates of what an enhanced or supplemented ASOS would cost, it
would be difficult for NWS to know whether continued investment in
ASOS is cost-beneficial.
--------------------
\8 Audit Report: Survey of NASA Information Systems, NASA, Office of
Inspector General, March 29, 1995, Report No. JP-95-003.
\9 Chapman, Robert E. Benefit-cost Analysis for the Modernization
and Associated Restructuring of the National Weather Service, July
1992, National Institute of Standards and Technology, Department of
Commerce.
\10 Weather Forecasting: Unmet Needs and Unknown Costs Warrant
Reassessment of Observing System Plans (GAO/AIMD-95-81, April 21,
1995).
AGENCY COMMENTS AND OUR
EVALUATION
---------------------------------------------------------- Chapter 2:5
We provided and discussed a draft of this report with officials from
EPA, NASA, NOAA, IRS, and the Coast Guard, and have incorporated
their comments where appropriate. Several of the agencies noted
that, in response to the issuance of OMB's guidance on IT investment
decision-making\11 and the passage of ITMRA, they have made process
changes and organizational modifications affecting IT funding
decisions. We have incorporated this information into the report
where applicable. However, many of the process changes and
modifications have occurred very recently, and we have not fully
evaluated these changes or determined their effects.
Officials from NOAA and NASA also had reservations about the
applicability of the investment portfolio approach to their
organizations because their decentralized operating environments were
not conducive to a single agencywide portfolio model with a fixed set
of criteria. Because any organization, whether centralized or
decentralized, has to operate within the parameters of a finite
budget, priorities must still be set, across the organization, about
where limited IT dollars will be spent to achieve maximum mission
benefits. We agree that many IT spending decisions can be made at
the agency or program level. However, there are some
decisions--especially those involving projects that are (1)
high-risk and high-dollar, (2) cross-functional, or (3) providing a
common infrastructure (e.g., telecommunications)--that should be made
at a centralized, departmental level. A common, organizationwide
focus, combined with a flexible distribution of departmental and
agency/program/site decision-making, can be achieved by implementing
standard decision criteria. These
criteria help ensure that projects are assessed and evaluated
consistently at lower levels, while still maintaining an
enterprisewide portfolio of IT investments.
--------------------
\11 Evaluating Information Technology Investments: A Practical
Guide, Executive Office of the President, Office of Management and
Budget, November 1995.
CONCLUSIONS AND RECOMMENDATIONS
============================================================ Chapter 3
CONCLUSIONS
---------------------------------------------------------- Chapter 3:1
Buying information technology can be a high-risk, high-return
undertaking that requires strong management commitment and a
systematic process to ensure successful outcomes. By using an
investment-driven management approach, leading organizations have
significantly increased the realized return on information technology
investments, reduced the risk of cost overruns and schedule delays,
and made better decisions about how their limited IT dollars should be
spent.
Adopting such an investment-driven approach can provide federal
agencies with similar opportunities to achieve greater benefits from
their IT investments on a more consistent basis. However, the
federal case study agencies we examined used decision-making
processes that lacked many essential components associated with an
investment approach. Critical weaknesses included the absence of
reliable, quantitative cost figures, net return on investment
calculations, rigorous decision criteria, and postimplementation
project reviews. With sustained management attention and substantive
improvements to existing processes, these agencies should be able to
meet the investment-related provisions of ITMRA.
Implementing and refining an IT investment process, however, is not
an easy undertaking and cannot be accomplished overnight. Maximizing
the returns and minimizing the risks on the billions of dollars that
are spent each year for IT will require continued efforts on two
fronts. First, agencies must fundamentally change how they select
and manage their IT projects. They must develop and begin using a
structured IT investment approach that encompasses all aspects of the
investment process--selection, control, and evaluation.
Second, oversight attention far beyond current levels must be given
to agencies' management processes and to actual results that are
being produced. Such attention should include the development of
policies and guidance as well as selective evaluations of processes
and results. These evaluations should have a dual focus: They
should identify and address deficiencies that are occurring, but they
should also highlight positive results in order to share lessons
learned and speed success.
OMB's established leadership role, as well as the policy development
and oversight responsibilities that it was given under ITMRA, places
it in a key position to provide such oversight. OMB has already
initiated several changes to governmentwide guidance to encourage the
investment approach to IT decision-making, and has drawn upon the
assistance of several key interagency working groups comprised of
senior agency officials. Such efforts should be continued and
expanded, to ensure that the federal government gets the most return
for its information technology investments.
RECOMMENDATIONS
---------------------------------------------------------- Chapter 3:2
Given its significant leadership responsibility in supporting
agencies' improvement efforts and responding to requirements of
ITMRA, it is imperative that OMB continue to clearly define
expectations for agencies and for itself to successfully implement
investment decision-making approaches. As such, we are recommending
four specific actions for the Director of OMB to take.
OMB's first challenge is to help agencies improve their investment
management processes. With effective processes in place, agencies
should be in much stronger positions to make informed decisions about
the relative benefits and risks of proposed IT spending. Without
them, agencies will continue to be vulnerable to risks associated
with excessively costly projects that produce questionable
mission-related improvements. Under Sections 5112 and 5113 of the
Information Technology Management Reform Act, the Director of OMB has
responsibility for promoting and directing federal agencies to
establish capital planning processes for information technology
investment decisions. In designing governmentwide guidance for this
process, we recommend that the Director of the Office of Management
and Budget require agencies to:
-- Implement IT investment decision-making processes that use
explicitly defined, complete, and consistent criteria applied to
all projects, regardless of whether project decisions are made
at the departmental, bureau, or program level. With criteria
that reflect cost, benefit, and risk considerations, applied
consistently, agencies should be able to make more reasonable
and better informed trade-offs between competing projects in
order to achieve the maximum economic impact for their scarce
investment dollars.
-- Periodically analyze their entire portfolios of IT
investments--at a minimum, new projects, projects in development,
and operations and maintenance expenditures--to determine which
projects to approve, cancel, or delay. With
development and maintenance efforts competing directly with one
another for funding, agencies will be better able to gauge the
best proportion of investment in each category of spending to
move away from their legacy bases of systems with excessive
maintenance costs.
-- Design control and evaluation processes that include cost,
schedule, and quantitative performance assessments of projected
versus actual improvement in mission outcomes. As a result,
agencies should increase their capacity both to assess actual
project results and to learn from experience which
operational areas produce the highest returns and how well they
estimate projects and deliver final results.
-- Advise agencies in setting minimum quality standards for data
used to assess (qualitatively and quantitatively) the cost,
benefit, and risk of IT investments. Agencies should
demonstrate that all IT funding proposals include only data
meeting these quality requirements and that projected versus
actual results are assessed at critical project milestones. The
audited data required by the CFO Act should help produce this
accurate, reliable cost information. Higher quality information
should result in better and more consistent decisions on complex
information systems investments.
OMB's second challenge is to use the results produced by the improved
investment processes to develop recommendations for the President's
budget that reflect an agency's actual track record in delivering
mission performance for IT funds expended. Under Section 5113 of
ITMRA, the Director of OMB is charged with evaluating the results of
agency IT investments and enforcing accountability--including
increases or reductions in agency IT funding proposals--through the
annual budget process. In carrying out these responsibilities, we
recommend that the Director of the Office of Management and Budget:
-- Evaluate information system project cost, benefit, and risk data
when analyzing the results of agency IT investments. Such
analyses should produce agency track records that clearly and
definitively show what improvements in mission performance have
been achieved for the IT dollars expended.
-- Ensure that agency investment control processes are in
compliance with OMB's governmentwide guidance and, if they are not,
assess strengths and weaknesses and recommend actions and
timetables for improvements. When results are questionable or
difficult to determine, monitoring agency investment processes
will help OMB diagnose problem causes by determining the degree
of agency control and the quality of decisions being made.
-- Use OMB's evaluation of each agency's IT investment control
processes and IT performance results as a basis for recommended
budget decisions to the President. This direct linkage should
give agencies a strong, much needed incentive to maximize the
returns and minimize the risks of their scarce IT investments.
To effectively implement improved investment management processes and
make the appropriate linkages between agency track records and budget
recommendations, OMB also has a third challenge. It will need to
marshal the resources and skills to execute the new types of analysis
required to make sound investment decisions on agency portfolios.
Specifically, we recommend that the Director of the Office of
Management and Budget:
-- Organize an interagency group composed of budget, program,
financial, and IT professionals to develop, refine, and transfer
guidance and knowledge on best practices in IT investment
management. Such a core group can serve as an ongoing source of
practical knowledge and experience on the state of the practice
for the federal government.
-- Obtain expertise on an advisory basis to assist these
professionals in implementing complete and effective investment
management systems. Agency senior IRM management could benefit
greatly from a high quality, easily accessible means to solicit
advice from capital planning and investment experts outside the
federal government.
-- Identify the type and amount of skills required for OMB to
execute IT portfolio analyses, determine the degree to which
these needs are currently satisfied, specify the gap, and design
and implement a plan, with timeframes and goals, to close it.
Given existing workloads and the resilience of the OMB
culture, without a determined effort to build the necessary
skills, OMB will have little impact on the quality of IT
investment decision-making. If necessary to augment its own
staff resources, OMB should consider the option of obtaining
outside support to help perform such assessments.
Finally, as part of its internal implementation strategy, the
Director of the Office of Management and Budget should consider
developing an approach to assessing OMB's own performance in
executing oversight responsibilities under the ITMRA capital planning
and investment provisions. Such a process could focus on whether OMB
reviews of agency processes and results have an impact on reducing
risk or increasing the returns on information technology
investments--both within and across federal agencies.
AGENCY COMMENTS AND OUR
EVALUATION
---------------------------------------------------------- Chapter 3:3
In its written comments on a draft of our report, OMB generally
supported our recommendations and said that it is working towards
implementing many aspects of the recommendations as part of the
fiscal year 1998 budget review process for fixed capital assets. OMB
also provided observations or suggestions in two additional areas.
First, OMB stated that given ITMRA's emphasis on agencies being
responsible for IT investment results, it did not plan to validate or
verify that each agency's investment control process is in compliance
with OMB's guidance contained in its management circulars. As
discussed in our more detailed evaluation of OMB's comments in
appendix I, conducting selective evaluations is an important aspect
of an overall oversight and leadership role because it can help
identify management deficiencies that are contributing to poor IT
investment results.
Second, OMB noted that the relationship of IT investment processes
between a Cabinet department and bureaus or agencies within the
department was not fully evaluated and that additional attention
would be needed as more data on this issue become available. We
agree that our focus was on assessing agencywide processes and that
continued attention to the relationships between departments,
bureaus, and agencies will contribute to increased understanding
across the government and will ultimately improve ITMRA's chances of
success. This issue is discussed in more detail in our response to
comments provided by the five agencies we reviewed (summarized at the
end of chapter 2).
(See figure in printed edition.)
Appendix I
COMMENTS FROM THE OFFICE OF
MANAGEMENT AND BUDGET AND OUR
EVALUATION
============================================================ Chapter 3
(See figure in printed edition.)
The following are GAO's comments on the Office of Management and
Budget's letter dated July 26, 1996.
GAO COMMENTS
---------------------------------------------------------- Chapter 3:4
1. As stated in the scope and methodology section of the report, we
focused our analysis on agencywide processes. We agree that
continued attention to this issue will contribute to increased
understanding across the government and will ultimately improve
ITMRA's chances of success. As noted in our response to comments
received from the agencies we reviewed (provided at the end of
chapter 2), we believe that a flexible distribution of departmental
and agency/program/site IT decision-making is possible and can best
be achieved by implementing standard decision criteria for all
projects. In addition, we note that particular types of IT
decisions, such as those with unusually high-risk, cross-functional
impact or that provide common infrastructure needs, are more
appropriately decided at a centralized, departmental level.
Experience gained during implementation of the Chief Financial
Officers (CFO) Act showed that departmental-level CFOs needed time to
build effective working relationships with their agency- or
bureau-level counterparts. We believe the same will be true for
Chief Information Officers (CIOs) established by ITMRA and that
establishing and maintaining this bureau-level focus will be integral
for ensuring the act's success.
2. ITMRA does squarely place responsibility and accountability for
IT investment results with the head of each agency. Nevertheless,
ITMRA clearly requires that OMB provide a key policy leadership and
implementation oversight role. While we agree that it may not be
feasible to validate and verify every agency's investment processes,
it is still essential that selected evaluations be conducted on a
regular basis. These evaluations can effectively support OMB's
performance and results-based approach. They can help to identify
and understand problems that are contributing to poor investment
outcomes and also help perpetuate success by providing increased
learning and sharing about what is and is not working.
AGENCY INFORMATION TECHNOLOGY
PROFILES
========================================================== Appendix II
In order to develop a profile of each agency's IT environment, we
asked the agencies to provide us information on the following:
-- total IT expenditures for fiscal year 1990 through fiscal year
1994;
-- total number of staff devoted to IRM functions and activities
for fiscal year 1990 through fiscal year 1994; and
-- costs for the 10 largest IT projects for fiscal year 1994 (as
measured by total project life-cycle cost).
To gather this information, we developed a data collection instrument
and submitted it to responsible agency officials. Information
supplied by the agencies is summarized in the following tables. We
did not independently verify the accuracy of this information.
Moreover, comparison of figures across the agencies is difficult
because agency officials used different sources (such as budget data,
IRM strategic plans, etc.) for the same data elements.
Table II.1
Total Agency Budget and IT Spending for
Fiscal Year 1990 Through Fiscal Year
1994
(Dollars in millions)
FY FY FY FY FY
Agency 1990 1991 1992 1993 1994
------------------------------ ------ ------ ------ ------ ------
Coast Guard
Total $3,304 $3,427 $3,571 $3,649 $3,666
Amount spent on IT $45 $46 $121 $139 $157
Percent of total 1 1 3 4 4
EPA
Total $6,123 $6,584 $6,969 $6,970 $5,782
Amount spent on IT $262 $291 $281 $282 $302
Percent of total 4 4 4 4 5
IRS
Total $5,500 $6,113 $6,670 $7,100 $7,188
Amount spent on IT $789 $979 $1,294 $1,479 $1,293
Percent of Total 14 16 19 21 18
NASA
Total                          $13,981 $14,756 $15,181 $14,950 $14,670
Amount spent on IT $1,513 $1,589 $1,777 $2,002 $1,604
Percent of total 11 11 12 13 11
NOAA
Total $1,200 $1,400 $1,600 $1,700 $1,900
Amount spent on IT $199 $191 $261 $304 $296
Percent of total 17 14 16 18 16
----------------------------------------------------------------------
Sources: Coast Guard, EPA, IRS, NASA, and NOAA.
Table II.2
Agency Total and IT Staffing for Fiscal
Year 1990 Through Fiscal Year 1994
FY FY FY FY FY
Agency 1990 1991 1992 1993 1994
------------------------------ ------ ------ ------ ------ ------
Coast Guard
Total FTE's\a 43,102 43,645 45,581 45,692 44,546
IT FTE's 331 369 398 428 424
Percent of total 1 1 1 1 1
EPA
Total FTE's 15,272 16,415 17,010 18,351 17,721
IT FTE's 886 840 831 863 850
Percent of total 6 5 5 5 5
IRS
Total FTE's                    111,962 115,628 116,673 113,460 10,665
IT FTE's                       Not identified 9,001 9,881 9,529 9,030
Percent of total NA 8 8 8 8
NASA
Total FTE's 23,669 24,692 24,330 23,996 23,685
IT FTE's 1,666 1,659 1,700 1,752 1,476
Percent of total 7 7 7 7 6
NOAA
Total FTE's 12,892 13,410 13,829 14,309 13,292
IT FTE's 910 844 1,100 944 1,030
Percent of total 7 6 8 7 8
----------------------------------------------------------------------
Sources: Coast Guard, EPA, IRS, NASA, and NOAA.
\a One FTE is one full-time equivalent employee.
Table II.3
Life-Cycle Cost and Fiscal Year 1994
Expenditures for the 10 Largest IT
Projects at the Coast Guard
(Dollars in thousands)
Amount Life-
Stat spent in cycle
Project us FY 1994 cost
---------------------------------------------------- ---- ---------- --------
Coast Guard Standard Workstation III U.D. $690 $184,070
Provides an organizationwide microcomputer
infrastructure and is the primary source for
acquiring desktop, server and portable hardware;
operating system and office automation system
software; utilities and peripherals, training,
personnel support, and cabling.
Coast Guard Standard Workstation II Op. $21,600 $63,800
Provides continued support for the Coast Guard's
existing microcomputer infrastructure.
Finance Center Information Resources Management Op. $8,084 $60,079
System
Provides a consolidated accounting and pay system.
Vessel Traffic System Upgrade U.D. $12,385 $27,594
A configuration of sensors, communication links,
personnel, and decision support tools that will
modernize and expand the systems in three cities by
incorporating radar sensor information overlaid on
digital nautical charts as well as improved
decision support systems.
Communication System 2000 U.D. $60 $25,440
Provides an automated and consolidated
communication system.
Aviation Logistics Management Information System U.D. $760 $25,208
Merges two maintenance systems for tracking and
recording scheduled aviation maintenance actions.
Coast Guard Standard Workstation Application U.D. $2,000 $25,000
Conversion
Reprograms most of the existing Coast Guard
developed applications to comply with the National
Institute of Standards and Technology's Application
Portability Profile.
Marine Safety Information System Op. $4,114 $24,459
Provides safety performance histories of vessels
and involved parties and is used as a decision
support tool for the Commercial Vessel Safety
program.
Aviation Repair and Supply Center Systems Op. $4,032 $22,256
Provides aviation technical publications in
electronic format.
Coast Guard Local Information Network Cabling U.D. $3,125 $22,125
Upgrade Project
Consolidated into the Coast Guard Standard
Workstation III system.
================================================================================
Total $56,850 $480,031
--------------------------------------------------------------------------------
Source: Office of Command Control and Communications management
information system.
Note: Categories have been melded to reflect the way Coast Guard
tracks systems and projects; categories do not include IT facilities,
central "bill paying" accounts for IT, or "Umbrella"
Projects/Contracts.
Note: Op. is operational or being maintained; U.D. is under
development.
Table II.4
Life-Cycle Cost and Fiscal Year 1994
Expenditures for the 10 Largest IT
Projects at EPA
(Dollars in thousands)
Amount Life-
Stat spent in cycle
Project us FY 1994 cost
---------------------------------------------------- ---- ---------- --------
Integrated Financial Management System Op. $5,470 $202,730
Performs funds control from commitments through
payment; updates all ledgers and tables as
transactions are processed; provides a standard
means of data entry, edit, and inquiry; and
provides a single set of reference and control
files.
Toxic Release Inventory System Op. $10,149 $138,000
Contains data submitted to EPA under the Emergency
Planning and Community Right to Know Act for
chemicals and chemical categories listed by the
agency. Data include chemical identity, amount of
on-site users, release and off-site transfers, on-
site treatment, minimization/prevention actions.
Public access is provided by the National Library
of Medicine.
Contract Laboratory Program System Op. $8,470 $115,000
Supports management and administration of chemical
samples from Superfund sites that are analyzed
under agency contracts with chemical laboratories.
The system schedules and tracks samples from site
collection, through analysis, to delivery to the
agency.
Aerometric Information Retrieval System Op. $4,737 $75,000
Stores air quality, point source emissions, and
area/mobile source data required by federal
regulations from the 50 states.
Comprehensive Environmental Response, Compensation, Op. $2,390 $68,000
and Liability Information System
Superfund's official source of planning and
accomplishment data. Serves as the primary basis
for strategic decision-making and site-by-site
tracking of cleanup activities.
Certification Fuel Economy Information System Op. $3,750 $45,550
Contains a set of computer applications and a major
relational database which is used to support
regulation development, air quality analysis,
compliance audits, investigations, assembly line
testing, in-use compliance, legislation
development, and environmental initiatives.
Resource Conservation and Recovery Information Op. $3,457 $35,000
System
Maintains basic data identifying and describing
hazardous waste handlers; detailed information
about hazardous waste treatment storage and
disposal processes, environmental permitting,
information on inspections, violations, and
enforcement actions; and tracks specific corrective
action information needed to regulate facilities
with hazardous waste releases.
Permit Compliance System Op. $2,810 $28,000
Supports the National Pollutant Discharge
Elimination System, a Clean Water Act program that
issues permits and tracks facilities that discharge
pollutants into our navigable waters.
Comprehensive Environmental Response, Compensation, U.D. $1,400 $28,000
and Liability Information System, Version III
A replacement for the existing Comprehensive
Environmental Response, Compensation, and Liability
Information System described above.
WasteLAN Op. $2,230 $28,000
A PC LAN version of the Comprehensive Environmental
Response, Compensation, and Liability Information
System database used by EPA regional offices for
data input and local analysis needs.
================================================================================
Total $44,863 $763,280
--------------------------------------------------------------------------------
Source: System/Project manager or the Senior IRM Official for the
office.
Note: Op. is operational or being maintained; U.D. is under
development; Imp. is being implemented.
Table II.5
Life-Cycle Cost and Fiscal Year 1994
Expenditures for the 10 Largest IT
Projects at IRS
(Dollars in thousands)
Amount Life-
Stat spent in cycle
Project us FY 1994 cost
---------------------------------------------------- ---- ---------- --------
Service Center Support System ISD-08                 U.D.    $27,404 $2,847,338
Acquire and install Tax System Modernization host-
tier computers at three computing centers.
Integrated Case Processing System ISD-03             U.D.   $108,877 $1,870,980
Integrates five systems that control, assign,
prioritize, and track taxpayer inquiries; provides
office automation, case folder review and
inventories, and display and manipulation of case
inquiry folders; automates collection cases;
provide access to current tax return information;
automates case preparation and closure; and
provides standardized hardware and custom software
to the criminal investigation function on a
nationwide basis.
Integrated Input Processing System ISD-06            U.D.   $159,933 $1,661,329
Integrates six systems that will receive and
control information being transmitted to or from
IRS; automates remittance processing activities;
scans paper tax returns and correspondence for
processing in an automated database; provides
automated telephone assistance to customers;
permits individual and business tax returns to be
filed by utilizing a touch-tone phone; and provides
access to all electronically filed returns that
have been scored for potential fraud.
Integrated Collection System CO-05                   Op.     $44,469 $1,261,361
Provides case tracking, expanded legal research, a
document management system for briefs, an
integrated office system, time reporting, issue
tracking, litigation support, and a decision
support system.
Corporate Systems Design ISD-09 U.D. $38,357 $967,835
Integrates three systems that provide application
programs to query, search, update, analyze and
extract information from a database; aggregates tax
information into electronic case folders and
distributes them to field locations; and provides
the security infrastructure to support all
components of the Tax System Modernization.
Servicewide Technical Infrastructure ISD-15 U.D. \a $699,338
Provides a variety of workstation models, monitors,
printers, operating systems and related equipment;
provides for standardization of the small and
medium-scale computers used by front line programs
in the national and field offices and service
centers.
Tax Processing Mainframe Computer System ISD-13 Op. $20,890 $671,739
Provides funding for (1) the mainframe and
miscellaneous peripherals at each service center,
(2) magnetic media and ADP supplies for all service
centers, (3) lease and maintenance for support
equipment, and (4) on-line access to taxpayer
information and account status.
Corporate Systems Modernization Transition ISD-10 U.D. $68,056 $578,208
Provides an interim hardware platform at two
computing centers to support master file processing
and full implementation of the CFOL data retrieval/
delivery system.
Software Development Environment ISM-35 U.D. $25,316 $424,415
Provides upgradable software development
workstations and workbench tools, including
automated analysis and design tools; requirements
traceability tools; construction kits with smart
editors, compilers, animators, and debuggers; and
static analyzers.
Communications Modernization ISD-21 U.D. $47,258 $408,905
Integrates four systems that provide for ordering
and delivery of telecommunication systems and
services for Treasury bureaus; serves as a
Government Open Systems Interconnection Profile
prototype; provides centralized network and
operations management and will acquire about 14,000
workstations.
================================================================================
Total                                                       $540,560 $11,391,448
--------------------------------------------------------------------------------
Source: IRS' Information Systems Initiative Summaries.
Note: Op. is operational or being maintained; U.D. is under
development or being implemented.
\a Note: ISD-15 did not exist in FY 1994.
Table II.6
Life-Cycle Cost and Fiscal Year 1994
Expenditures for the 10 Largest IT
Projects at NASA
(Dollars in thousands)
Amount Life-
Stat spent in cycle
Project us FY 1994 cost
---------------------------------------------------- ---- ---------- --------
Earth Observing System Data and Information System   U.D.    $55,077 $3,394,872
Core System
Receives, processes, archives, and distributes
earth science research data from U.S., European,
and Japanese polar platforms, selected Earth
probes, the Synthetic Aperture Radar free flyer,
selected existing databases, and other sources of
related data.
Program Information Support Mission Services         Op.     $21,075 $1,680,000
Provides telecommunications and computation
services for Marshall Space Flight Center.
Information Systems Contract Op. $67,185 $490,000
Supports most data systems, networks, user
workstations and telecommunications systems and
provides maintenance, operations, software
development, engineering, and customer support
functions at Johnson Space Center.
Operations Automatic Data Processing Procurement Op. $4,503 $460,000
Provides a family of compatible computing systems
covering a broad performance range that will
provide ground-based mission operations systems
support.
Engineering Test & Analysis Op. $7,648 $430,000
Provides a contractor to supply, over the next 10
years, the necessary personnel, management,
equipment, and materials to support over 100
laboratories within the Engineering Directorate and
other closely related directorates and offices at
the Johnson Space Center.
Base Operations Contract Op. $17,636 $418,000
Provides continuity of base operations, including
federal information processing resources of
sustaining engineering, computer operations, and
communications services for Kennedy Space Center.
Scientific & Engineering Workstations Procurement Op. $56,906 $347,000
Acquisition of seven classes of scientific and
engineering workstations plus supporting equipment.
Central Computing Resources Project Op. $24,221 $332,000
Furnishes, installs, and tests the Advanced
Computer Generated Image System; provides direct
computational analysis and programming support to
specific research disciplines and flight projects;
provides for the analysis, programming,
engineering, and maintenance services for the
flight simulation facilities. Also provides support
for the Central Scientific and Computing Complex
operation and systems maintenance as well as
Complex-wide communications systems support and
system administration of distributed computing and
data reduction systems.
Small/Disadvantaged Business Resources Acquisition Op. $26,866 $286,000
Provides a wide array of supporting services,
including computational, professional, technical,
administrative, engineering, and operations at the
Lewis Research Center.
Advanced X-Ray Astrophysics Facility U.D. $4,600 $90,000
================================================================================
Total                                                       $285,718 $7,927,872
--------------------------------------------------------------------------------
Source: NASA IRM Division APR files.
Note: Op. is operational or being maintained; U.D. is under
development; Imp. is being implemented.
Table II.7
Life-Cycle Cost and Fiscal Year 1994
Expenditures for the 10 Largest IT
Projects at NOAA
(Dollars in thousands)
Amount Life-
Stat spent in cycle
Project us FY 1994 cost
---------------------------------------------------- ---- ---------- --------
Advanced Weather Interactive Processing System U.D. $42,954 $657,048
An information system including workstations,
associated data processing, and communications,
designed to integrate data from several National
Weather Service information systems, as well as
from field offices, regional and national centers,
and other sources.
NWS Supercomputer Replacement Project Op. $13,335 $181,143
An initiative to acquire supercomputers necessary
to run large complex numeric models as a key
component of the weather forecast system.
Central Environmental Satellite Computer System Op. $11,100 $128,696
A distributed-processing system architecture
designed to acquire, process, and distribute
satellite data and products.
Information Technology 1995 Op. $10,925 $120,300
An effort to replace a variety of obsolete
technology in the National Marine Fisheries Service
with a common computing infrastructure that
supports distributed processing in an open system
environment. The system stores, integrates,
analyzes, and disseminates large quantities of
living marine resource data.
Geophysical Fluid Dynamics Laboratory High- Op. $7,396 $97,628
Performance Computer System
Procurement of a high-performance computer system
to provide support services for climate and weather
research activities.
Geostationary Operational Environmental Satellite Op. $7,545 $65,616
(GOES I-M)
Ground system consisting of minicomputers with
associated peripherals and satellite-dependent
customized applications software to provide the
monitoring, supervision, and data acquisition and
processing functions for the GOES-Next satellites.
WSR-88D Operational Support Facility System Support Op. $6,501 $65,260
A system designed to support weather radars and
associated display systems.
NWS Gateway Upgrade Op. $4,145 $45,135
An effort to replace old mainframes as well as the
associated channel-connected architecture with an
open systems architecture.
Polar Orbiting Environmental Satellite Op. $600 $43,982
Ground system consisting of minicomputers with
associated peripherals and satellite-dependent
customized applications software intended to
provide the monitoring, supervision, and data
acquisition and processing functions for the polar
satellites.
Automated Surface Observing System Imp. $4,330 $35,950
A system of sensors, computers, display units, and
communications equipment to automatically collect
and process basic data on surface weather
conditions, including temperature, pressure, wind,
visibility, clouds, and precipitation.
================================================================================
Total $108,831 $1,440,758
--------------------------------------------------------------------------------
Sources: A-11 Reports for FY 90-93 and FY 90-94; FY 94 IT Operating
Plan Resource Summary for FY 94-00; FY 95 IT Operating Plan Resource
Summary for FY 95-97; Geophysical Fluid Dynamics Laboratory High
Performance Computer System Benefit/Cost Analysis submitted to OMB
October 1993; 12/90 National Weather Service Gateway System's Upgrade
Requirements Initiative; WSR-88D Operational Support Facility System
Support 11/93 Requirements Initiative; Advanced Weather Interactive
Processing System Acquisition Office.
Note: Op. is operational or being maintained; U.D. is under
development; Imp. is being implemented.
DESCRIPTION OF AN INFORMATION
TECHNOLOGY INVESTMENT PROCESS
APPROACH
========================================================= Appendix III
This appendix is a compilation of work done by OMB and us on how
federal agencies should manage information systems using an
investment process. It is based upon analysis of the IT management
best practices found in leading private and public sector
organizations and is explained in greater detail in OMB's Evaluating
Information Technology Investments: A Practical Guide.\1
--------------------
\1 Evaluating Information Technology Investments: A Practical Guide,
Office of Management and Budget, Executive Office of the President
(November 1995).
MANAGE INFORMATION TECHNOLOGY
WITH AN INVESTMENT PERSPECTIVE
------------------------------------------------------- Appendix III:1
Leading organizations manage IT projects and systems as investments.
This approach systematically reduces risks while maximizing benefits
because it forces the organization to assess the risks and return of
each system throughout its entire life cycle. While the specific
processes and practices used to implement this approach may vary
depending upon the structure of the organization (e.g., centralized
versus decentralized operations), leading organizations follow
several common management activities. Specifically, these
organizations maintain a similar decision-making process consisting
of three phases--selection, control, and evaluation. (See figure
III.1.)
Figure III.1: An IT Investment
Approach Used in Leading
Organizations
(See figure in printed
edition.)
SELECTION PHASE: CHOOSING THE
BEST IT INVESTMENTS
------------------------------------------------------- Appendix III:2
Key Question: How can you select the right mix of IT projects that
best meets mission needs and improvement priorities?
The goal of the selection phase is to assess and prioritize current
and proposed IT projects and then create a portfolio of IT projects.
In doing so, this phase helps ensure that the organization (1)
selects those IT projects that will best support mission needs and
(2) identifies and analyzes a project's risks and returns before
spending a significant amount of project funds. A critical element
of this phase is that a group of senior executives makes project
selection and prioritization decisions based on a consistent set of
decision criteria that compares costs, benefits, risks, and potential
returns of the various IT projects.
STEPS OF THE SELECTION PHASE
----------------------------------------------------- Appendix III:2.1
-- Initially filter and screen IT projects for explicit links to
mission needs and program performance improvement targets using
a standard set of decision criteria.
-- Analyze the most accurate and up-to-date cost, benefit, risk,
and return information in detail for each project.
-- Create a ranked list of prioritized projects.
-- Determine the most appropriate mix of IT projects (new versus
operational, strategic versus maintenance, etc.) to serve as the
portfolio of IT investments.
MANAGEMENT TOOLS AND
TECHNIQUES APPLICABLE TO
THIS PHASE
----------------------------------------------------- Appendix III:2.2
-- An executive management team that makes funding decisions based
on comparisons and trade-offs between competing project
proposals, especially for those projects expected to have
organizationwide impact.
-- A documented and defined set of decision criteria that examines
expected return on investment (ROI), technical risks,
improvement to program effectiveness, customer impact, and
project size and scope.
-- Predefined dollar thresholds and authority levels that recognize
the need to channel project evaluations and decisions to
appropriate management levels to accommodate unit-specific
versus agency-level needs.
-- Minimum acceptable ROI hurdle rates, applied to projects across
the organization, that must be met for a project to be considered
for funding. (An illustrative scoring sketch follows at the end of
this section.)
-- Risk assessments that expose potential technical and managerial
weaknesses.
(See figure in printed
edition.)
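The following is a minimal illustrative sketch, in Python, of how the
screening and ranking mechanics described above might work. The
project names, dollar figures, criterion weights, and the 15-percent
hurdle rate are assumptions invented for illustration; they are not
drawn from this report or from any agency's actual criteria.

# Hypothetical selection-phase sketch: screen proposals against an
# assumed ROI hurdle rate, then rank survivors with one possible
# weighted decision criterion.
from dataclasses import dataclass

@dataclass
class Proposal:
    name: str
    cost: float           # estimated life-cycle cost (thousands)
    benefit: float        # estimated life-cycle benefit (thousands)
    risk: float           # assessed risk, 0 (low) to 1 (high)
    mission_score: float  # contribution to mission needs, 0 to 10

HURDLE_RATE = 0.15        # assumed minimum acceptable ROI

def roi(p):
    # Simple return on investment: (benefit - cost) / cost.
    return (p.benefit - p.cost) / p.cost

def screen(proposals):
    # Initial filter: drop proposals that miss the ROI hurdle.
    return [p for p in proposals if roi(p) >= HURDLE_RATE]

def rank(proposals):
    # Weighted score of return, mission contribution, and inverted risk.
    def score(p):
        return 0.5 * roi(p) + 0.4 * (p.mission_score / 10) + 0.1 * (1 - p.risk)
    return sorted(proposals, key=score, reverse=True)

candidates = [
    Proposal("Project A", cost=4000, benefit=6500, risk=0.3, mission_score=8),
    Proposal("Project B", cost=9000, benefit=9800, risk=0.7, mission_score=6),
    Proposal("Project C", cost=2500, benefit=3200, risk=0.2, mission_score=5),
]
for p in rank(screen(candidates)):
    print(f"{p.name}: ROI {roi(p):.0%}")

In an actual agency process, the executive management team would set
the weights and hurdle rate and apply them consistently to every
proposal.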
CONTROL PHASE: MANAGE THE
INVESTMENTS BY MONITORING FOR
RESULTS
------------------------------------------------------- Appendix III:3
Key Question: What controls are you using to ensure that the
selected projects deliver the projected benefits at the right time
and the right price?
Once the IT projects have been selected, senior executives
periodically assess the progress of the projects against their
projected cost, schedule, milestones, and expected mission benefits.
The type and frequency of these monitoring reviews are usually based
on the analysis of risk, complexity, and cost that went into selecting
the project; the reviews are generally performed at critical project
milestones. If a project is late, over cost, or not
meeting performance expectations, senior executives decide whether it
should be continued, modified, or canceled.
STEPS OF THE CONTROL PHASE
----------------------------------------------------- Appendix III:3.1
-- Use a set of performance measures to monitor the developmental
progress for each IT project to identify problems.
-- Take action to correct discovered problems.
MANAGEMENT TOOLS AND
TECHNIQUES DURING THIS PHASE
----------------------------------------------------- Appendix III:3.2
-- Established processes that involve senior managers in ongoing
reviews and force decisive action steps to address problems
early in the process.
-- Explicit cost, schedule, and performance measures to monitor
expected versus actual project outcomes. (A brief variance sketch
follows at the end of this section.)
-- An information system to collect project cost, schedule, and
performance data, in order to create a record of progress for
each project.
-- Incentives for exposing and solving project problems.
(See figure in printed
edition.)
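As one way of picturing the monitoring described above, the brief
Python sketch below computes hypothetical cost and schedule variances
and flags a project for senior-management attention when either
exceeds an assumed 10-percent threshold. The project names, figures,
and threshold are invented for illustration.

# Hypothetical control-phase sketch: compare actual against planned
# cost and schedule, and escalate when either variance exceeds an
# assumed threshold.
def cost_variance(planned_cost, actual_cost):
    # Positive values mean the project is over its planned cost.
    return (actual_cost - planned_cost) / planned_cost

def schedule_variance(planned_months, elapsed_months):
    # Positive values mean the project is behind its planned schedule.
    return (elapsed_months - planned_months) / planned_months

def review(project, planned_cost, actual_cost, planned_months,
           elapsed_months, threshold=0.10):
    cv = cost_variance(planned_cost, actual_cost)
    sv = schedule_variance(planned_months, elapsed_months)
    status = "escalate to senior review" if max(cv, sv) > threshold else "on track"
    print(f"{project}: cost {cv:+.0%}, schedule {sv:+.0%} -> {status}")

review("Project X", planned_cost=65000, actual_cost=74000,
       planned_months=36, elapsed_months=42)   # over cost and late
review("Project Y", planned_cost=45000, actual_cost=44000,
       planned_months=24, elapsed_months=24)   # within thresholds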
EVALUATION PHASE: LEARN FROM
THE PROCESS
------------------------------------------------------- Appendix III:4
Key Question: Based on your evaluation, did the system deliver what
was expected?
The evaluation phase provides a mechanism for constantly improving
the organization's IT investment process. The goal of this phase is
to measure, analyze, and record results, based on the data collected
throughout each phase. Senior executives assess the degree to which
each project met its planned cost and schedule goals and fulfilled
its projected contribution to the organization's mission. The
primary tool in this phase is the postimplementation review (PIR),
which should be conducted once a project has been completed. PIRs
help senior managers assess whether a project's proposed benefits
were achieved and refine the IT selection criteria.
STEPS OF THE EVALUATION
PHASE
----------------------------------------------------- Appendix III:4.1
-- Compare actual project costs, benefits, risks, and return
information against earlier projections. Determine the causes
of any differences between planned and actual results.
-- For each system in operation, decide whether it should continue
operating without adjustment, be further modified to improve
performance, or be canceled.
-- Modify the organization's investment process based on lessons
learned.
MANAGEMENT TOOLS AND
TECHNIQUES DURING THIS PHASE
----------------------------------------------------- Appendix III:4.2
-- Postimplementation reviews to determine actual costs, benefits,
risks, and return. (A brief planned-versus-actual sketch follows at
the end of this section.)
-- Modification of decision criteria and investment management
processes, based on lessons learned, to improve the process.
-- Maintenance of accountability by measuring actual project
performance and creating incentives for even better project
management in the future.
(See figure in printed
edition.)
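To illustrate the kind of planned-versus-actual comparison a
postimplementation review records, the brief Python sketch below
tabulates the differences for a single hypothetical project; all
figures are invented, not taken from this report's tables.

# Hypothetical evaluation-phase sketch: record the planned-versus-
# actual differences a postimplementation review would examine.
def pir_differences(planned, actual):
    # Positive cost or schedule deltas indicate overruns; a negative
    # benefit delta indicates a shortfall against the projection.
    return {measure: actual[measure] - planned[measure] for measure in planned}

planned = {"cost": 36000, "benefit": 50000, "schedule_months": 48}
actual  = {"cost": 41200, "benefit": 44500, "schedule_months": 60}

for measure, delta in pir_differences(planned, actual).items():
    print(f"{measure}: {delta:+,} versus plan")

Lessons drawn from such comparisons would then feed back into the
selection criteria and the investment process, as described above.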
BRIEF DESCRIPTION OF AGENCY IT
MANAGEMENT PROCESSES
========================================================== Appendix IV
The following sections briefly describe the information technology
management processes at each of the five agencies we reviewed. These
descriptions are intended to characterize the general workings of the
agency processes at the time of our review. We used the
selection/control/evaluation model (as summarized in appendix III and
described in detail in OMB's Evaluating Information Technology
Investments: A Practical Guide) as a template for describing each
agency's IT management process.
AGENCY SELECTION PROCESSES
-------------------------------------------------------- Appendix IV:1
COAST GUARD
------------------------------------------------------ Appendix IV:1.1
The Coast Guard had an IT investment process used to select IT
projects for funding. IT project proposals were screened, evaluated,
and ranked by a group of senior IRM managers using explicit decision
criteria that took into account project costs, expected benefits, and
risk assessments. The ranked list with recommended levels of funding
for each project was submitted for review to a board of senior Coast
Guard officers and then forwarded to the Coast Guard Chief of Staff
for final approval.
ENVIRONMENTAL PROTECTION
AGENCY
------------------------------------------------------ Appendix IV:1.2
EPA used a decentralized IT project initiation, selection, and
funding process. Under this broad process, program offices
independently selected and funded IT projects on a case-by-case basis
as the need for the system was identified. EPA had IRM policy and
guidance for IT project data and analysis requirements--such as a
project-level risk assessment and a cost-benefit study--that the
program offices had to identify in order to proceed with system
development. EPA did not have a consistent set of decision criteria
for selecting IT projects.
INTERNAL REVENUE SERVICE
------------------------------------------------------ Appendix IV:1.3
IT selection and funding activities within IRS differed depending on
whether the project was part of the Tax System Modernization (TSM) or
an operational system. In 1995, IRS created a senior-level board for
selecting, controlling, and evaluating information technology
investments and began to rank all of the proposed TSM projects using
its cost, risk, and return decision criteria. However, these
criteria were largely qualitative, data used were not validated or
reliable, and the analyses were not based on calculations of expected
return on investment. According to IRS, its investment review board
used a separate process with different criteria for evaluating
operational systems. The board did not review research and
development systems or field office systems. IRS did not compare the
results of its different evaluation processes.
NATIONAL AERONAUTICS AND
SPACE ADMINISTRATION
------------------------------------------------------ Appendix IV:1.4
Within NASA, IT project selection and funding decisions were made by
domain-specific program managers. NASA had two general types of IT
funding--program expenditures and administrative spending. Most of
NASA's IT funding was embedded within program-specific budgets.
Managers of these programs had autonomy to make system-level and
system support IT selection decisions. Administrative IT systems
were generally managed by the cognizant NASA program office or
center.
NASA has recently established a CIO council to set high-level
policies and standards, approve information resources management
plans, and address issues and initiatives. The council will also
serve as the IT capital investment advisory group to the proposed
NASA Capital Investment Council. NASA plans for this Capital
Investment Council to have responsibility for looking at all capital
investments across NASA, including those for IT. While this Capital
Investment Council may fill the need for identifying cross-functional
opportunities, it is not yet operational.
NATIONAL OCEANIC AND
ATMOSPHERIC ADMINISTRATION
------------------------------------------------------ Appendix IV:1.5
IT project selection and funding decisions at NOAA were made as part
of its strategic management and budgeting process. NOAA had seven
work teams--each supporting a NOAA strategic goal--that prioritized
incoming funding requests. Managers on these work teams negotiated
to determine IT project funding priorities within the scope of their
respective strategic goals. These prioritized requests were then
submitted to NOAA's Executive Management Board, which had final
agency decision authority over all expenditures. A key decision
criterion used by the work teams was the project's contribution to
the agency's strategic goals; however, no standard set of decision
criteria was used in the prioritization decisions. Other data, such
as cost-benefit analyses, were also sometimes used to evaluate IT
project proposals, although use of these data sources was not
mandatory.
AGENCY CONTROL PROCESSES
-------------------------------------------------------- Appendix IV:2
COAST GUARD
------------------------------------------------------ Appendix IV:2.1
The Coast Guard conducted internal system reviews, but these reviews
were not used to monitor the progress of IT projects. The review
efforts were designed to address ways to improve efficiency, reduce
project cost, and reduce project risk. Cost, benefit, and schedule
data were also collected annually for some new IT projects, but the
Coast Guard did not measure mission benefits derived from each of its
projects.
ENVIRONMENTAL PROTECTION
AGENCY
------------------------------------------------------ Appendix IV:2.2
EPA had a decentralized managerial review process for monitoring IT
projects. EPA's IRM policy set requirements for the minimum level of
review activity that program offices had to conduct, but program
offices had primary responsibility for overseeing the progress of
their IT projects. In an effort to provide a forum for senior
managerial review of IT projects, EPA, in 1994, created the Executive
Steering Committee (ESC) for IRM to guide EPA's agencywide IRM
activities. The ESC was chartered to review IRM projects that are
large, important, or cross-organizational. The committee's first
major system review was scheduled for some time in 1996. EPA is
currently formulating the data submission requirements for the ESC
reviews.
INTERNAL REVENUE SERVICE
------------------------------------------------------ Appendix IV:2.3
IRS regularly conducted senior management program control meetings
(PCM) to review the cost and schedule activity of TSM projects. IRS
had two types of PCMs. The four TSM sites--Submission Processing,
Computing Center, Customer Service, and District Office--conducted
PCMs to monitor the TSM activity under their purview. Also, IRS
could hold "combined PCMs" to resolve issues that spanned across the
TSM sites. IRS did not conduct PCMs to monitor the performance of
operational systems. To date, (1) working procedures, (2) required
decision documents, (3) reliable cost, benefit, and return data, (4)
and explicit quantitative decision criteria needed for an effective
investment control process are not in place for the IRS Investment
Review Board.
NATIONAL AERONAUTICS AND
SPACE ADMINISTRATION
------------------------------------------------------ Appendix IV:2.4
NASA senior executives regularly reviewed the cost and schedule
performance of major programs and projects, but they reviewed only
the largest IT projects. No central IRM review has been conducted
since 1993. NASA put senior-level CIOs in place for each NASA
center, but these CIOs exercised limited control over mission-related
systems and had limited authority to enforce IT standards or
architecture policies. NASA's proposed Capital Investment Council,
which is intended to supplement the Program Management Council by
reviewing major capital investments, may address this concern once
the Investment Council is operational.
NATIONAL OCEANIC AND
ATMOSPHERIC ADMINISTRATION
------------------------------------------------------ Appendix IV:2.5
NOAA conducted quarterly senior-level program status meetings to
review the progress and performance of major systems and programs,
such as those in the NWS modernization. NOAA had defined performance
measures to gauge the progress toward its strategic goals, but did
not have specific performance measures for individual IT systems.
Also, while some offices had made limited comparisons of actual to
expected IT project benefits, NOAA did not require the collection or
assessment of mission benefit accrual information on IT projects.
AGENCY EVALUATION PROCESSES
-------------------------------------------------------- Appendix IV:3
COAST GUARD
------------------------------------------------------ Appendix IV:3.1
The Coast Guard did not conduct any postimplementation reviews of IT
projects. Instead the Coast Guard focused its review activity on
systems that were currently under development.
ENVIRONMENTAL PROTECTION
AGENCY
------------------------------------------------------ Appendix IV:3.2
EPA did not conduct any centralized postimplementation reviews. EPA
had conducted postimplementation reviews as part of the General
Services Administration's (GSA) triennial review requirement, but
curtailed this activity in 1992 when the GSA requirement was lifted.
INTERNAL REVENUE SERVICE
------------------------------------------------------ Appendix IV:3.3
IRS directives required that postimplementation reviews be conducted
6 months after an IT system is implemented. At the time of our
review, IRS had conducted five postimplementation reviews and had
developed a standard postimplementation review methodology. However,
no mechanisms were in place to ensure that the results of these IRS
investment evaluation reviews were used to modify the IRS selection
and control decision-making processes or alter funding decisions for
individual projects.
NATIONAL AERONAUTICS AND
SPACE ADMINISTRATION
------------------------------------------------------ Appendix IV:3.4
NASA did not conduct or require any centralized project
postimplementation reviews. NASA stopped conducting centralized IRM
reviews in 1993 and now instead urges programs to conduct IRM
self-assessments.
NATIONAL OCEANIC AND
ATMOSPHERIC ADMINISTRATION
------------------------------------------------------ Appendix IV:3.5
While the agency conducted other reviews, NOAA's IRM office has
participated in only four IRM reviews over the last 3 years. These
reviews tended to focus on specific IT problems, such as evaluating
the merits of electronic bulletin board systems or difficulties being
encountered digitizing nautical navigation maps. No
postimplementation reviews had been conducted over the past 3 years.
SUMMARY OF INVESTMENT-RELATED
PROVISIONS OF THE INFORMATION
TECHNOLOGY MANAGEMENT REFORM ACT
OF 1996
=========================================================== Appendix V
On February 10, 1996, the Information Technology Management Reform
Act of 1996 (Division E of Public Law 104-106) was signed into law.
This appendix is a summary of the information technology
investment-related provisions of this act; it is not the actual
language contained in the law.
Provision Summary of Provision Narrative
------------------------- -----------------------------------------------------
Sec. 5002(3) Information technology (IT) is defined as any
equipment, or interconnected system or subsystem of
equipment, that is used in the automatic acquisition,
storage, manipulation, management, movement, control,
display, switching, interchange, transmission, or
reception of data or information. It may include
equipment used by contractors.
Sec. 5112(b) The OMB Director is to promote and be responsible for
improving the acquisition, use, and disposal of IT by
federal agencies.
Sec. 5112(c) The OMB Director is to develop a process (as part of
the budget process) for analyzing, tracking, and
evaluating the risks and results of major capital
investments for information systems; the process
shall include explicit criteria for analyzing the
projected and actual costs, benefits, and risks
associated with the investments over the life of each
system.
Sec. 5112(c) The OMB Director is to report to the Congress (at the
same time the budget is submitted) on the net program
performance benefits achieved by major capital
investments in information systems and how the
benefits relate to the accomplishment of agency
goals.
Sec. 5112(e) The OMB Director shall designate (as appropriate)
agency heads as executive agents to acquire IT for
governmentwide use.
Sec. 5112(f) The OMB Director shall encourage agencies to develop
and use "best practices" in acquiring IT.
Sec. 5113(b)(2) The OMB Director shall direct that agency heads (1)
establish effective and efficient capital planning
processes for selecting, managing, and evaluating
information systems investments, (2) before investing
in new information systems, determine whether a
government function should be performed by the
private sector, the government, or government
contractor, and (3) analyze their agencys' missions
and revise the mission-related and administrative
processes (as appropriate) before making significant
investments in IT.
Sec. 5113(b)(4) Through the budget process, the OMB Director is to
review selected agency IRM activities to determine
the efficiency and effectiveness of IT investments in
improving agency performance.
Sec. 5122(a) Agency heads are to design and implement a process
for maximizing the value and assessing and managing
the risks of IT investments.
Sec. 5122(b) The agency process is to (1) provide for the
selection, management, and evaluation of IT
investments, (2) be integrated with the processes for
making budget, financial, and program management
decisions, (3) include minimum criteria for selecting
IT investments and specific quantitative and
qualitative criteria for comparing and prioritizing
projects, (4) provide for identifying potential IT
investments that would result in shared benefits with
other federal, state, or local governments, (5)
provide for identifying quantifiable measurements for
determining the net benefits and risks of IT
investments, and (6) provide the means for senior
agency managers to obtain timely development progress
information, including a system of milestones for
measuring progress, on an independently verifiable
basis, in terms of cost, capability of the system to
meet specified requirements, timeliness, and quality.
Sec. 5123(3) Agency heads are to ensure that performance
measurements are prescribed for IT and that the
performance measurements measure how well the IT
supports agency programs.
Sec. 5123(4) Where comparable processes and organizations exist in
either the public or private sectors, agency heads
are to quantitatively benchmark agency process
performance against such processes in terms of cost,
speed, productivity, and quality of outputs and
outcomes.
Sec. 5124(a)(1) Agency heads may acquire IT as authorized by law (the
Brooks Act--40 U.S.C. 759--is repealed by sec.
5101) except that the GSA Administrator will continue
to manage the FTS 2000 and follow-on to that program
(sec. 5124(b)).
Sec. 5125(a) Agency heads are to designate Chief Information
Officers (in lieu of designating IRM officials--as a
result of amending the Paperwork Reduction Act
appointment provision).
Sec. 5125(b) Agency Chief Information Officers (CIOs) are
responsible for (1) providing advice and assistance
to agency heads and senior management to ensure that
IT is acquired and information resources are managed
in a manner that implements the policies and
procedures of the Information Technology Management
Reform Act of 1996, is consistent with the Paperwork
Reduction Act, and is consistent with the priorities
established by the agency head, (2) developing,
maintaining, and facilitating the implementation of a
sound and integrated agency IT architecture, and (3)
promoting effective and efficient design and
operation of major IRM processes.
Sec. 5126 Agency heads (in consultation with the CIO and CFO)
are to establish policies and procedures that (1)
ensure accounting, financial, and asset management
systems and other information systems are designed,
developed, maintained, and used effectively to
provide financial or program performance data for
agency financial statements,
(2) ensure that financial and related program
performance data are provided to agency financial
management systems on a reliable, consistent, and
timely basis, and (3) ensure that financial
statements support the assessment and revision of
agency mission-related and administrative processes
and the measurement of performance of agency
investments in information systems.
Sec. 5127 Agency heads are to identify (in their IRM plans
required under the Paperwork Reduction Act) major IT
acquisition programs that have significantly deviated
from the cost, performance, or schedule goals
established for the program (the goals are to be
established under title V of the Federal Acquisition
Streamlining Act of 1994).
Sec. 5141 This section establishes which provisions of the
title apply to "national security systems."
Sec. 5142 "National security systems" are defined as any
telecommunications or information system operated by
the United States government that (1) involves
intelligence activities,
(2) involves cryptologic activities related to
national security, (3) involves command and control
of military forces, (4) involves equipment that is an
integral part of a weapon or weapon system, or (5) is
critical to the direct fulfillment of military or
intelligence missions.
Sec. 5401 This section requires the GSA Administrator to
provide (through the Federal Acquisition Computer
Network established under the Federal Acquisition
Streamlining Act of 1994 or another automated system)
not later than January 1, 1998, governmentwide on-
line computer access to information on products and
services available for ordering under the multiple
award schedules.
Sec. 5701 The Information Technology Management Reform Act
takes effect 180 days from the date of enactment
(February 10, 1996).
--------------------------------------------------------------------------------
MAJOR CONTRIBUTORS TO THIS REPORT
========================================================== Appendix VI
ACCOUNTING AND INFORMATION
MANAGEMENT DIVISION,
WASHINGTON, D.C.
-------------------------------------------------------- Appendix VI:1
David McClure, Assistant Director
Danny R. Latta, Adviser
Alicia Wright, Senior Business Process Analyst
Bill Dunahay, Senior Evaluator
John Rehberger, Information Systems Analyst
Shane Hartzler, Business Process Analyst
Eugene Kudla, Staff Evaluator
*** End of document. ***