DOD Business Systems Modernization: Navy ERP Adherence to Best	 
Business Practices Critical to Avoid Past Failures (29-SEP-05,	 
GAO-05-858).							 
                                                                 
The Department of Defense's (DOD) difficulty in implementing	 
business systems that are efficient and effective continues	 
despite the billions of dollars that it invests each year. For a 
decade now--since 1995--we have designated DOD's business systems
modernization as "high-risk." GAO was asked to (1) provide a	 
historical perspective on the planning and costs of the Navy's	 
four Enterprise Resource Planning (ERP) pilot projects, and the  
decision to merge them into one program; (2) determine if the	 
Navy has identified lessons from the pilots, how the lessons are 
being used, and challenges that remain; and (3) determine if	 
there are additional best business practices that could be used  
to improve management oversight of the Navy ERP.		 
-------------------------Indexing Terms------------------------- 
REPORTNUM:   GAO-05-858 					        
    ACCNO:   A38678						        
  TITLE:     DOD Business Systems Modernization: Navy ERP Adherence to
Best Business Practices Critical to Avoid Past Failures 	 
     DATE:   09/29/2005 
  SUBJECT:   Best practices					 
	     Best practices methodology 			 
	     Financial management systems			 
	     Internal controls					 
	     Lessons learned					 
	     Program evaluation 				 
	     Program management 				 
	     Software verification and validation		 
	     Systems conversions				 
	     Systems design					 
	     Systems evaluation 				 
	     Technology modernization programs			 
	     Business planning					 
	     Pilot programs					 
	     Navy Enterprise Resource Planning System		 

GAO-05-858

                 United States Government Accountability Office

                     GAO Report to Congressional Requesters

September 2005

DOD BUSINESS SYSTEMS MODERNIZATION

 Navy ERP Adherence to Best Business Practices Critical to Avoid Past Failures


GAO-05-858


September 2005

DOD BUSINESS SYSTEMS MODERNIZATION

Navy ERP Adherence to Best Business Practices Critical to Avoid Past Failures

  What GAO Found

The Navy invested approximately $1 billion in four ERP pilots without
marked improvement in its day-to-day operations. The planning for the
pilots started in 1998, with implementation beginning in fiscal year 2000.
The four pilots were limited in scope and were not intended to be
corporate solutions for any of the Navy's long-standing financial and
business management problems. Furthermore, because of the various
inconsistencies in the design and implementation of the pilots, they were
not interoperable, even though they performed many of the same business
functions. In short, the efforts were failures and $1 billion was largely
wasted.

Because the pilots would not meet its overall requirements, the Navy
decided to start over and develop a new ERP system, under the leadership
of a central program office. Using the lessons learned from the pilots,
the current Navy ERP program office has so far been committed to the
disciplined processes necessary to manage this effort. GAO found that,
unlike other systems projects it has reviewed at DOD and other agencies,
Navy ERP management is following an effective process for identifying and
documenting requirements. The strong emphasis on requirements management,
which was lacking in the previous efforts, is critical: requirements
represent the essential blueprint that system developers and program
managers use to design, develop, test, and implement a system, and
effective requirements management is a key factor in successful projects.

While the Navy ERP has the potential to address some of the Navy's
financial management weaknesses, as currently planned, it will not provide
an all-inclusive, end-to-end corporate solution for the Navy. For example,
the current scope of the ERP does not include the activities of the
aviation and shipyard depots. Further, significant challenges and risks
remain as the project moves forward, such as developing and implementing
44 system interfaces with other Navy and DOD systems and converting data
from legacy systems into the ERP system. The project is in its early
phases, with a current estimated completion date of 2011 at an estimated
cost of $800 million. These estimates are subject to change and very
likely will change. Broader challenges, such as alignment with DOD's
business enterprise architecture, which is not fully defined, also present
significant risk. Given DOD's past inability to implement business systems
that provide the promised capability, continued close management
oversight-by the Navy and DOD-will be critical. In this regard, the Navy
does not have in place the structure to capture quantitative data that can
be used to assess the effectiveness of the overall effort. Also, the Navy
has not established an independent verification and validation (IV&V)
function; rather, it is relying on in-house subject matter experts and
others within the project. Industry best practices indicate that the IV&V
activity should be independent of the project and report directly to
agency management in order to provide added assurance that reported
results on the project's status are unbiased.


Contents

     Letter
          Results in Brief
          Background
          Navy's Pilot Projects Lacked Coordinated Management Oversight
          Requirements Management Process Effective, but Implementation
            Challenges and Risks Remain
          Additional Actions Can Be Taken to Improve Management Oversight
            of the Navy ERP Effort
          Conclusions
          Recommendations for Executive Action
          Agency Comments and Our Evaluation

Appendixes
          Appendix I:   Scope and Methodology
          Appendix II:  Comments from the Department of Defense
          Appendix III: Identification of the Navy and Defense Systems
                        That Must Interface with the ERP
          Appendix IV:  GAO Contacts and Acknowledgments

Tables
          Table 1: Navy ERP Pilot Projects
          Table 2: Functions Performed by the Pilot Projects
          Table 3: Documentation for Detailed Requirements

Figures
          Figure 1: Percent of Effort Associated with Undisciplined Projects
          Figure 2: Relationship between Requirements Development and Testing
          Figure 3: Navy ERP Required System Interfaces

This is a work of the U.S. government and is not subject to copyright
protection in the United States. It may be reproduced and distributed in
its entirety without further permission from GAO. However, because this
work may contain copyrighted images or other material, permission from the
copyright holder may be necessary if you wish to reproduce this material
separately.


United States Government Accountability Office Washington, D.C. 20548

September 29, 2005

Congressional Requesters

The Department of Defense (DOD) has historically been unable to develop
and implement business systems on time, within budget, and with the
promised capability. As noted in our recent report,1 this difficulty
continues despite the billions of dollars that DOD spends each year to
operate, maintain, and modernize its currently reported 4,150 business
systems. For fiscal year 2005, the department requested $13 billion for
its existing business systems environment. For a decade now-since 1995-we
have designated DOD's financial management and business systems
modernization as "high-risk." In fact, of the 25 areas on GAO's
governmentwide "high-risk" list, 8 are DOD-specific program areas, and the
department shares responsibility for 6 other high-risk areas that are
governmentwide in scope.2

DOD has recognized the importance of transforming its business operations
and systems to make them more efficient and effective in support of the
department's defense mission. A critical aspect of the department's
transformation effort will be its ability to effectively implement
business systems on time, within budget, and with the promised capability.

This report responds to your request for information on DOD's management
of selected facets of its business modernization efforts that are intended
to enhance the department's reporting on its results of operations. As
agreed with your offices, we selected the Department of the

1GAO, DOD Business Systems Modernization: Billions Being Invested without
Adequate Oversight, GAO-05-381 (Washington, D.C.: Apr. 29, 2005).

2GAO, High-Risk Series: An Update, GAO-05-207 (Washington, D.C.: January
2005). The eight specific DOD high-risk areas are (1) approach to business
transformation, (2) business systems modernization, (3) contract
management, (4) financial management, (5) personnel security clearance
program, (6) supply chain management, (7) support infrastructure
management, and (8) weapon systems acquisition. The six governmentwide
high-risk areas that include DOD are: (1) disability programs, (2)
interagency contracting, (3) information systems and critical
infrastructure, (4) information sharing for homeland security, (5) human
capital, and (6) real property.

Navy's (Navy) Enterprise Resource Planning (ERP) program,3 initially
created as four pilot projects and now being pursued as a consolidated
effort, to review and determine if it will help to resolve some of the
Navy's long-standing financial and business management problems. The Navy
expects the ERP to be fully operational by fiscal year 2011. It is
currently estimated that for fiscal years 2004-2011, the program will cost
approximately $800 million. When fully operational, the Navy reports that
the ERP will manage an estimated 80 percent of its appropriated funds.4
Our objectives were to (1) provide a historical perspective on the
planning and costs of the Navy's four ERP pilot projects, and the decision
to merge them into one program; (2) determine if the Navy has identified
lessons from the pilots, how the lessons are being used, and the
challenges that remain; and (3) determine if there are additional best
business practices that could be used to improve management oversight of
the Navy ERP.

To obtain a historical perspective on the planning and costs of the Navy's
four ERP pilot projects, and the decision to merge them into one program,
we reviewed DOD's budget justification materials and other background
information on the four pilot projects and met with program management and
DOD Chief Information Officer (CIO) officials. Additionally, we reviewed
project documentation provided by DOD and held discussions with program
management officials related to two key processes-requirements management
and testing-to determine if these key aspects of program management were
being performed and if the system is being designed to help address some
of the Navy's long-standing management problems. Further, we reviewed
relevant industry standards and best practices, and interviewed and requested
documentation from the Navy ERP program office to determine whether there are additional
best business practices that could appropriately be used to improve
management oversight of the ERP. Given that this effort is still in the
early stages of development, we did not evaluate all best practices.
Rather, we concentrated on those that could have an immediate impact in
improving management's oversight.

3An ERP solution is an automated system using commercial off-the-shelf
(COTS) software consisting of multiple, integrated functional modules that
perform a variety of business-related tasks such as payroll, general ledger
accounting, and supply chain management.

4The 80 percent is calculated after excluding estimated appropriated
funding for the Marine Corps and military personnel and pay. Of the 80
percent, about 2 percent of the appropriated funds will be executed and
maintained in detail by the financial management systems at the aviation
and shipyard depots.

Our work was performed from August 2004 through June 2005 in accordance
with U.S. generally accepted government auditing standards. Details on our
scope and methodology are included in appendix I. We requested comments on
a draft of this report from the Secretary of Defense or his designee.
Written comments from the Deputy Under Secretary of Defense (Financial
Management) and the Deputy Under Secretary of Defense (Business
Transformation) are reprinted in appendix II.

Results in Brief	The Navy has invested approximately $1 billion in its
four pilot ERP efforts, without marked improvement in its day-to-day
operations. The four pilots were limited in scope and were not intended to
be a corporate solution for resolving any of the Navy's long-standing
financial and business management problems. Because of the various
inconsistencies in the design and implementation, the pilots were not
interoperable, even though they performed many of the same business
functions. The lack of a coordinated effort among the pilots led to a
duplication of efforts in implementing many business functions and
resulted in ERP solutions that carry out similar functions in different
ways from one another. In essence, the pilots resulted in four more DOD
stovepiped systems that did not enhance DOD's overall efficiency and
resulted in $1 billion being largely wasted.

Because the pilots were stovepiped, limited within the scope of their
respective commands, and not interoperable, they did not transform the
Navy's business operations. As a result, under the leadership of a central
program office-something that was lacking in the pilots-the Navy decided
to start over and undertake the development and implementation of a single
ERP system. Using the lessons learned from the pilots, the current Navy
ERP program office has so far demonstrated a commitment to the disciplined
processes necessary to manage this effort and reduce the risks associated
with system implementation. Specifically, we found that, in contrast to
other systems efforts we have reviewed at DOD and other agencies, the Navy
ERP program office is following an effective process for identifying and
documenting requirements.5 Requirements represent the essential
blueprint that system developers and program managers use to design,
develop, test, and implement a system.
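The requirements management discipline described here is often supported by simple traceability tooling. The sketch below is a minimal, hypothetical illustration (not the Navy ERP program's actual process or data); the record fields and names are assumptions chosen only to show how each documented requirement can be traced forward to design elements and test cases so that coverage gaps are visible before testing begins.

```python
# Minimal, hypothetical sketch of requirements traceability
# (illustration only; not the Navy ERP program's actual tooling or data).
from dataclasses import dataclass, field
from typing import List

@dataclass
class Requirement:
    req_id: str                                              # e.g., "FIN-001"
    text: str                                                # the documented requirement statement
    design_refs: List[str] = field(default_factory=list)     # design elements that satisfy it
    test_case_ids: List[str] = field(default_factory=list)   # test cases that verify it

def untraced(requirements: List[Requirement]) -> List[str]:
    """Return IDs of requirements lacking design coverage or a planned test."""
    return [r.req_id for r in requirements
            if not r.design_refs or not r.test_case_ids]

if __name__ == "__main__":
    reqs = [
        Requirement("FIN-001", "Post obligations to the general ledger daily",
                    design_refs=["GL-posting-module"], test_case_ids=["TC-101"]),
        Requirement("FIN-002", "Reconcile fund balance with Treasury monthly"),  # no coverage yet
    ]
    print("Requirements lacking design or test coverage:", untraced(reqs))
```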

While the Navy ERP has the potential to address some of the Navy's
financial management weaknesses, its current planned functionality will
not provide an all-inclusive end-to-end corporate solution for the Navy.
For example, the current scope of the ERP does not provide for real-time
asset visibility of shipboard inventory. Asset visibility has been and
continues to be a long-standing problem within the department. The scope
of the current effort is much larger than the pilots. The system is
intended to manage an estimated 80 percent of the Navy's appropriated
funds, according to the Navy ERP program office.

The project has a long way to go, with a current estimated completion date
of 2011, at an estimated cost of $800 million. These estimates are very
likely to change, given (1) the Navy ERP's relatively early phase of
development and (2) DOD's past history of not implementing systems on time
and within budget. The project faces numerous significant challenges and
risks that must be dealt with as the project moves forward. For example,
44 system interfaces6 with other Navy and DOD systems must be developed
and implemented. Long-standing problems regarding the lack of integrated
systems and use of nonstandard data within DOD pose significant challenges
and risks to a successful Navy ERP interface with these systems. Testing
the system interfaces in an end-to-end manner is necessary in order for
the Navy to have reasonable assurance that the ERP will be capable of
providing the intended functionality. Another challenge and risk is the
Navy's ability to convert the necessary data from its legacy systems into
the ERP system. Data conversion is one of the critical tasks necessary to
successfully implement a new financial system. If data conversion is done
right, the new system has a much greater chance of success. On the
other hand, converting data incorrectly or entering unreliable data from a
legacy system has lasting

5According to the Software Engineering Institute, requirements management
is a process that establishes a common understanding between the customer
and the software project manager regarding the customer's business needs
that will be addressed by a project. A critical part of this process is to
ensure that the requirements development portion of the effort documents,
at a sufficient level of detail, the problems that need to be solved and
the objectives that need to be achieved.

6An interface is a connection between two devices, applications, or
networks or a boundary across which two systems communicate.

repercussions. The adage "garbage in, garbage out" best describes the
adverse impact of inadequate data conversion efforts.
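The kind of pre-load validation that keeps "garbage" out of a new system can be illustrated with a minimal sketch. The example below is hypothetical: the field names, rules, and records are assumptions for illustration, not the Navy's actual conversion logic; it simply shows legacy records being checked and either queued for loading or rejected for cleanup.

```python
# Minimal, hypothetical sketch of legacy-data validation during conversion
# (illustrative only; field names, rules, and records are assumptions).
def validate_record(record: dict) -> list:
    """Return a list of data-quality problems found in one legacy record."""
    problems = []
    if not record.get("document_number"):
        problems.append("missing document number")
    if record.get("amount") is None or record["amount"] < 0:
        problems.append("missing or negative amount")
    if not record.get("appropriation_code"):
        problems.append("missing appropriation code")
    return problems

def convert(legacy_records: list) -> tuple:
    """Split legacy records into loadable records and rejects needing cleanup."""
    loadable, rejects = [], []
    for rec in legacy_records:
        issues = validate_record(rec)
        (rejects if issues else loadable).append((rec, issues))
    return loadable, rejects

legacy = [
    {"document_number": "DOC-0001", "amount": 12500.0, "appropriation_code": "1804"},
    {"document_number": "", "amount": -50.0, "appropriation_code": None},  # "garbage in"
]
good, bad = convert(legacy)
print(f"{len(good)} record(s) ready to load, {len(bad)} rejected for cleanup")
```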

A broader challenge and risk that is out of the Navy ERP project's
control, but could significantly affect it, is DOD's development of a
business enterprise architecture (BEA).7 An enterprise architecture
consists of snapshots of the enterprise's current environment and its
target environment, as well as a capital investment road map for
transitioning from the current to the target environment. The real value
of an enterprise architecture is that it provides the necessary content
for guiding and constraining system investments in a way that promotes
interoperability and minimizes overlap and duplication. As we have
recently reported,8 DOD's BEA still lacks many of the key elements of a
well-defined architecture, and no basis exists for evaluating whether the
Navy ERP will be aligned with the BEA and whether it would be a corporate
solution for DOD in its "To Be" or target environment. At this time, it is
unknown what the target environment will be. Therefore, it is unknown what
business processes, data standards, and technological standards the Navy
ERP must align to, as well as what legacy systems will be transitioned
into the target environment. Developing a large-scale systems
modernization program, such as the Navy ERP, without the context of an
enterprise architecture poses risks of rework to comply with the BEA once
it is fully defined.

While we are encouraged by the Navy's management of the requirements
process, other actions are needed to enhance Navy and DOD management
oversight of the ERP effort. The Navy does not have in place the
structure to capture quantitative data that can be used to assess the
effectiveness of the overall effort. This information is necessary to
understand the risk being assumed and whether the project will provide the
desired functionality. Additionally, the Navy has not established an
independent verification and validation (IV&V) function to provide an
assessment of the Navy ERP to DOD and Navy management. Rather, the Navy is
using in-house subject matter experts and others who report to the

7In July 2001, the Secretary of Defense established the Business
Management Modernization Program to improve the efficiency and
effectiveness of DOD business operations and to provide the department's
leaders with accurate and timely information through the development and
implementation of a business enterprise architecture.

8GAO, DOD Business Systems Modernization: Long-standing Weaknesses in
Enterprise Architecture Development Need to Be Addressed, GAO-05-702
(Washington, D.C.: July 22, 2005).

Quality Assurance team leader. Industry best practices indicate that IV&V
activities should be conducted by individuals independent of the project
and who report directly to agency management in order to provide added
assurance that reported results on the project's status are unbiased.
Since effective implementation of the disciplined processes has been shown
to be a key indicator of whether a project has reduced its risk to
acceptable levels, these management actions would provide increased
confidence that the Navy ERP project is "on track."
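The quantitative data referred to above typically takes the form of simple project metrics tracked over time. The sketch below is a hypothetical illustration: the metric names, thresholds, and figures are assumptions and do not come from the Navy ERP program; it only shows how such data could be captured each reporting period and flagged for management review.

```python
# Hypothetical sketch of capturing quantitative oversight metrics
# (metric names, thresholds, and values are illustrative assumptions).
from dataclasses import dataclass

@dataclass
class MetricSnapshot:
    period: str
    requirements_added_or_changed: int   # requirements volatility for the period
    requirements_baselined: int
    defects_found: int
    defects_closed: int

def review_flags(snap: MetricSnapshot, volatility_threshold: float = 0.05) -> list:
    """Return conditions that warrant management attention."""
    flags = []
    volatility = snap.requirements_added_or_changed / max(snap.requirements_baselined, 1)
    if volatility > volatility_threshold:
        flags.append(f"requirements volatility {volatility:.1%} exceeds {volatility_threshold:.0%}")
    if snap.defects_closed < snap.defects_found:
        flags.append("open defect backlog is growing")
    return flags

snapshot = MetricSnapshot("FY2005 Q3", requirements_added_or_changed=40,
                          requirements_baselined=500, defects_found=120, defects_closed=95)
print(review_flags(snapshot))
```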

Considering that the project is still in its early stages of development
and that there are significant challenges and high risks ahead, it is
essential that mechanisms be in place to critically review the project at
all stages, both to ensure it will result in improvements to the Navy's
operations and to alert the Navy and DOD if the project does not remain on
schedule and within budget. Given the department's past history of not
being able to successfully implement business systems on time and within
budget, accountability is imperative. Thus, we are making three
recommendations to the Secretary of Defense directed towards improving
oversight of the Navy ERP. In its written comments on a draft of this
report, DOD generally agreed with our recommendations. DOD agreed to
develop quantitative metrics that can be used to evaluate the program and
stated that the metrics would be developed by December 2005. While the
department agreed with the recommendation to establish an IV&V, it noted
that the IV&V would be within the project team and will report directly to
the Navy ERP program manager. Additionally, while the department agreed
with the intent of our recommendation that the Defense Business System
Management Committee review the status of the Navy ERP on a semiannual
basis, DOD noted that its tiered accountability structure, recently put in
place to improve the control and accountability of business system
investments, would provide the necessary oversight. In regard to the last
two points, we continue to believe that providing the IV&V reports to the
appropriate investment review board and instituting semiannual reviews by
the Defense Business System Management Committee are the appropriate
courses of action. Given (1) the Navy's initial estimate that the new ERP
will cost at least $800 million and (2) the department's past difficulties
in effectively developing and implementing business systems, substantive
reviews focused specifically on the Navy ERP by the highest levels of
management within the department are warranted so that management can act
sooner rather than later if problems emerge.

In its comments, the department took exception to our characterization of
the pilots as failures and identified what it believes were achievements
of

the pilot programs. However, as discussed in the report, the pilots were
limited in scope. Although they used the same software, inconsistencies in
the design and implementation resulted in them not being interoperable. In
essence, the department had four more stovepiped systems. The department's
comments did not disagree with any of the above points. We characterized
the pilots as failures because the department spent $1 billion on systems
that did not result in marked improvement in the Navy's day-to-day
operations. Furthermore, if there had been effective management oversight
of the pilot programs at the outset, there would not have been a need to,
in essence, start over with the Navy ERP. See the Agency Comments and Our
Evaluation section of this report for a more detailed discussion of the
agency comments. We have reprinted DOD's written comments in appendix II.

Background	The Navy, with reported assets totaling $321 billion in fiscal
year 2004,9 would be ranked among the largest corporations in the world if
it were a private sector entity. According to the Navy, based upon the
reported value of its assets, it would be ranked among the 15 largest
corporations on the Fortune 500 list. Additionally, in fiscal year 2004
the Navy reported that its inventory was valued at almost $73 billion and
that it held property, plant, and equipment with a reported value of
almost $156 billion. Furthermore, the Navy reported for fiscal year 2004
that its operations involved total liabilities of $38 billion, that its
operations had a net cost of $130 billion, and that it employed
approximately 870,000 military and civilian personnel-including reserve
components.10

The primary mission of the Navy is to control and maintain freedom of the
seas. In support of that mission, the Navy performs an assortment of
interrelated and interdependent business functions with service members and
civilian personnel in geographically dispersed locations throughout the
world. To support its military mission and perform its business functions,
the Navy requested for fiscal year 2005 almost $3.5 billion for the
operation, maintenance, and modernization of its business systems and
related infrastructure-the most of all the DOD components-or about 27
percent

9Department of the Navy, Fiscal Year 2004 Annual Financial Report
(November 2004). Numbers are combined from the fiscal year 2004 General
Fund and Working Capital Fund financial reports.

10This includes the Marine Corps.

of the total $13 billion DOD fiscal year 2005 business systems budget
request. Of the 4,150 reported DOD business systems, the Navy holds the
largest inventory of business systems-with 2,353 reported systems or 57
percent of DOD's reported inventory of business systems.

The Secretary of Defense recognized that the department's business
operations and systems have not effectively worked together to provide
reliable information to make the most effective business decisions. He
challenged each military service to transform its business operations to
support DOD's warfighting capabilities and initiated the Business
Management Modernization Program (BMMP) in July 2001. Further, the
Assistant Secretary of the Navy for Financial Management and Comptroller
(Navy Comptroller) testified that transforming the Navy's business
processes, while concurrently supporting the Global War on Terrorism, is a
formidable but essential task.11 He stated that the goal of the
transformation is to "establish a culture and sound business processes
that produce high-quality financial information for decision making." One
of the primary elements of the Navy's business transformation strategy is
the Navy ERP.

  Recently Reported Business and Financial Weaknesses at the Navy

The need for business processes and systems transformation to provide
management with timely information to make important business decisions is
clear. However, none of the military services, including the Navy, have
passed the scrutiny of an independent financial audit. Obtaining a clean
(unqualified) financial audit opinion is a basic prescription for any
well-managed organization, as recognized by the President's Management
Agenda. For fiscal year 2004, the DOD Inspector General issued a
disclaimer on the Navy's financial statements-Navy's General Fund and
Working Capital Fund-citing eight material weaknesses12 and six material

11Status of Financial Management Reform Within the Department of Defense
and the Individual Services: Hearing Before the Subcommittee on Readiness
and Management Support, Senate Armed Services Committee, 108th Cong. 2
(Nov. 18, 2004) (statement by the Assistant Secretary of the Navy
(Financial Management and Comptroller), Richard Greco, Jr.).

12The eight material weaknesses for the Navy General Fund are related to:
(1) accounting and financial management systems; (2) fund balance with
Treasury; (3) accounts receivable; (4) inventory and related property; (5)
general property, plant, and equipment; (6) accounts payable; (7)
environmental liabilities; and (8) problem disbursements.

weaknesses,13 respectively, in internal control and noncompliance with the
Federal Financial Management Improvement Act of 1996 (FFMIA).14 The
inability to obtain a clean financial audit opinion is the result of
weaknesses in the Navy's financial management and related business
processes and systems. Most importantly, the Navy's pervasive weaknesses
have (1) resulted in a lack of reliable information to make sound
decisions and report on the status of activities, including accountability
of assets, through financial and other reports to the Navy and DOD
management and the Congress; (2) hindered its operational efficiency; (3)
adversely affected mission performance; and (4) left the Navy and DOD
vulnerable to fraud, waste, and abuse, as the following examples
illustrate.

o 	The Navy's lack of detailed cost information hinders its ability to
monitor programs and analyze the cost of its activities. We reported15
that the Navy lacked the detailed cost and inventory data needed to assess
its needs, evaluate spending patterns, and leverage its telecommunications
buying power. As a result, at the sites we reviewed, the Navy paid for
telecommunications services it no longer required, paid too much for
services it used, and paid for potentially fraudulent or abusive
long-distance charges. In one instance, we found that DOD paid over $5,000
in charges for one card that was used to place 189 calls in one 24-hour
period from 12 different cities to 12 different countries.

13The six material weaknesses for the Navy Working Capital Fund are
related to: (1) accounting and financial management systems; (2) fund
balance with Treasury; (3) accounts receivable; (4) inventory and related
property; (5) general property, plant, and equipment; and (6) accounts
payable.

14FFMIA, Pub. L. No. 104-208, div. A, § 101(f), title VIII, 110 Stat.
3009, 3009-389 (Sept. 30, 1996), requires the 24 major departments and
agencies covered by the Chief Financial Officers Act of 1990, Pub. L. No.
101-576, 104 Stat. 2838 (Nov. 15, 1990) (31 U.S.C. § 901(b), as amended),
to implement and maintain financial management systems that comply
substantially with (1) federal financial management systems requirements,
(2) applicable federal accounting standards, and (3) the U.S. Standard
General Ledger at the transaction level.

15GAO, Vendor Payments: Inadequate Management Oversight Hampers the Navy's
Ability to Effectively Manage Its Telecommunication Program, GAO-04-671
(Washington, D.C.: June 14, 2004).

o 	Ineffective controls over Navy foreign military sales using blanket
purchase orders placed classified and controlled spare parts at risk of
being shipped to foreign countries that may not be eligible to receive
them. For example, we identified instances in which Navy country managers
(1) overrode the system to release classified parts under blanket purchase
orders without filing required documentation justifying the release; and
(2) substituted classified parts for parts ordered under blanket purchase
orders, bypassing the control-edit function of the system designed to
check a country's eligibility to receive the parts.16

o 	The Naval Inventory Control Point and its repair contractors have not
followed DOD and Navy procedures intended to provide the accountability
for and visibility of inventory shipped to Navy repair contractors.
Specifically, Navy repair contractors are not routinely acknowledging
receipt of government-furnished material received from the Navy. A DOD
procedure requires repair contractors to acknowledge receipt of
government-furnished material that has been shipped to them from the
Navy's supply system. However, Naval Inventory Control Point officials are
not requiring repair contractors to acknowledge receipt of these
materials. By not requiring repair contractors to acknowledge receipt of
government-furnished material, the Naval Inventory Control Point has also
departed from the procedure to follow up with the contractor within 45
days when the contractor fails to confirm receipt for an item. Without
material receipt notification, the Naval Inventory Control Point cannot be
assured that its repair contractors have received the shipped material.
This failure to acknowledge receipt of material shipped to repair
contractors can potentially impair the Navy's ability to account for
shipments, leading to possible fraud, waste, or abuse.17

16GAO, Foreign Military Sales: Improved Navy Controls Could Prevent
Unauthorized Shipment of Classified and Controlled Spare Parts to Foreign
Countries, GAO-04-507 (Washington, D.C.: June 25, 2004).

17GAO, Defense Inventory: Navy Needs to Improve the Management Over
Government-Furnished Material Shipped to Its Repair Contractors,
GAO-04-779 (Washington, D.C.: July 23, 2004).

o 	A limited Naval Audit Service audit revealed that 53 of 118 erroneous
payment transactions, valued at more than $990,000, occurred because Navy
certifying officials did not ensure accurate information was submitted to
the Defense Finance and Accounting Service (DFAS) prior to authorizing
payment. In addition, certifying officials submitted invoices to DFAS
authorizing payment more than once for the same transaction.18

Brief Overview of Navy ERP	To address the need for business operations
reform, in fiscal year 1998 the Navy established an executive committee
responsible for creating a "Revolution in Business Affairs" to begin
looking at transforming business affairs and identifying areas of
opportunity for change. This committee, in turn, set up a number of
working groups, including one called the Commercial Business Practices
(CBP) Working Group,19 which consisted of representatives from financial
management organizations across the Navy. This working group recommended
that the Navy should use ERP as a foundation for change and identified
various ERP initiatives that were already being developed or under
consideration within the Navy. Ultimately, the Navy approved the
continuation of four of these initiatives, using funds from existing
resources from each of the sponsors (i.e., commands) to test the
feasibility of ERP solutions within the Navy. From 1998 to 2003, four
different Navy commands began planning, developing, and implementing four
separate ERP pilot programs to address specific business areas. A CBP
Executive Steering Group was created in December 1998 to monitor the pilot
activities.

18Naval Audit Service, Erroneous Payments Made to Navy Vendors, N2005-0011
(Washington, D.C.: Dec. 2, 2004).

19Initially the focus of the committee was on financial practices, and it
was named the Commercial Financial Practices Working Group. The committee
changed its name to reflect a broader focus from financial practices to
business practices.

As the pilots progressed in their development and implementation, the Navy
identified issues that had to be addressed at a higher level than the
individual pilots, such as integration among the pilots and with other DOD
systems, and decided that a single program would provide a more comprehensive,
enterprisewide solution for the Navy. In August 2002, the Assistant
Secretary of the Navy for Research, Development, and Acquisition
established a Navy-wide ERP program to "converge" the four ongoing pilots
into a single program. This Navy-wide program is expected to replace all
four pilots by fiscal year 2008 and to be "fully operational" by fiscal
year 2011. The Navy estimates that the ERP will manage about 80 percent of
the Navy's estimated appropriated funds-after excluding appropriated funds
for the Marine Corps and military personnel and pay. Based on the Navy's
fiscal years 2006 to 2011 defense planning budget, the Navy ERP will
manage approximately $74 billion annually.20

According to a Navy ERP official, while the Navy ERP would account for the
total appropriated amount, once transactions occur at the depots, such as
when a work order is prepared for the repair of an airplane part, the
respective systems at the depots will execute and maintain the detailed
transactions. This accounts for about 2 percent, or approximately $1.6
billion, being executed and maintained in detail by the respective systems
at the aviation and shipyard depots-not by the Navy ERP. The remaining 20
percent that the ERP will not manage comprises funds for the Navy
Installations Command, field support activity, and others.

  Navy's Pilot Projects Lacked Coordinated Management Oversight

Each of the Navy's four ERP pilot projects was managed and funded by
different major commands within the Navy. The pilots, costing over $1
billion in total, were limited in scope and were not intended to provide
corporate solutions to any of the Navy's long-standing financial and
business management problems. The lack of centralized management oversight
and control over all four pilots allowed the pilots to be developed
independently. This resulted in four more DOD stovepiped systems that
could not operate with each other, even though each carried out many of
the same functions and was based on the same ERP commercial off-the-shelf
(COTS) software. Moreover, due to the lack of high-level departmentwide
oversight from the start, the pilots were not required to go

20The amount is based on DOD's estimated planning budget for fiscal years
2006 through 2011.

through the same review process as other acquisition projects of similar
magnitude.

  Pilots Developed Independently of Each Other

Four separate Navy organizations began their ERP pilot programs
independently of each other, at different times, and with separate
funding. All of the pilots implemented the same ERP COTS software, and
each pilot was small in scale-relative to the entire Navy. For example,
one of the pilots, SMART, was responsible for managing the inventory items
and repair work associated with one type of engine, although the
organization that implemented SMART-the Naval Supply Systems Command-
managed the inventory for several types of engines. As of September 2004,
the Navy estimated that the total investment in these four pilots was
approximately $1 billion. Table 1 summarizes each of the pilots, the
cognizant Navy organization, the business areas they address, and their
reported costs through September 2004.

                        Table 1: Navy ERP Pilot Projects

                              Dollars in millions

ERP pilot  Organization            Area of pilot's focus            Initial start  Costs through
                                                                                    FY 2004a
CABRILLO   Space and Naval         Financial Management              June 2000          $67.4
           Warfare Systems         o Navy Working Capital Fund
           Command

SMART      Naval Supply Systems    Supply Management                 August 2000        346.4
           Command                 o Intermediate-Level
                                     Maintenance Management
                                   o Interface to Aviation Depots

NEMAIS     Naval Sea Systems       Regional Maintenance              June 2000          414.6
           Command                 o Intermediate-Level
                                     Maintenance Management
                                   o Project Systems
                                   o Workforce Management

SIGMA      Naval Air Systems       Program Management with           May 2001           215.9
           Command                 linkage among:
                                   o Contract Management
                                   o Financial Management
                                   o Workforce Management

Total                                                                                $1,044.3

Source: GAO analysis of Navy data.

aThe costs reflect amounts disbursed through September 30, 2004, as
reported by the Navy ERP program.

Even after the pilots came under the purview of the CBP Executive Steering
Group in December 1998, they continued to be funded and controlled by
their respective organizations. We have previously reported21 that
allowing systems to be funded and controlled by component organizations
has led to the proliferation of DOD's business systems. These four pilots
are prime examples. While there was an attempt made to coordinate the
pilots, ultimately each organization designed its ERP pilot to accommodate
its specific business needs. The Navy recognized the need for a working
group that would focus on integration issues among the pilots, especially
because of the desire to eventually extend the pilot programs beyond the
pilot organizations to the entire Navy. In this regard, the Navy
established the Horizontal Integration Team in June 1999, consisting of
representatives from all of the pilots to address this matter. However,
one Navy official described this team as more of a "loose confederation"
that had limited authority. As a result, significant resources have been
invested that have not and will not result in corporate solutions to any
of the Navy's long-standing business and financial management problems.
This is evident in the DOD Inspector General's audit reports on the Navy's
financial statements discussed above.

In addition to the lack of centralized funding and control, each of the
pilots configured the software differently, which, according to Navy ERP
program officials, caused integration and interoperability problems. While
each pilot used the same COTS software package, the software offers a high
degree of flexibility in how similar business functions can be processed
by providing numerous configuration points.22 According to the Navy, over
2.4 million configuration points exist within the software. The pilots
configured the software differently from each other to accommodate
differences in the way they wanted to manage their functional area focus.
These differences were allowed even though the pilots performed many of the same
types of business functions, such as financial management. These
configuration differences include the levels of complexity in workflow
activities and the establishment of the organizational structure. For
example, the primary work order managed by the NEMAIS pilot is an
intricate ship repair job, with numerous tasks and workers at many levels.

21GAO, Department of Defense: Financial and Business Management
Transformation Hindered by Long-standing Problems, GAO-04-941T
(Washington, D.C.: July 8, 2004).

22A configuration point is a place at which the system developer must
define the business flows with inputs, conditions, and criteria that will
be used in the application.

Other pilots had much simpler work order definitions, such as preparing a
budget document or procuring a single part for an engine.
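A minimal sketch can illustrate why differently set configuration points defeat interoperability even when the underlying software is identical. The example below is hypothetical: the configuration keys and values are assumptions, not the pilots' actual settings; it shows two installations of the "same" package defining a work order differently, so records from one cannot be processed by the other without translation.

```python
# Hypothetical sketch: the same COTS package configured two different ways
# (keys and values are illustrative assumptions, not actual pilot settings).
NEMAIS_CONFIG = {
    "work_order_levels": ["job", "task", "subtask"],   # intricate ship-repair structure
    "org_hierarchy_depth": 6,
}
OTHER_PILOT_CONFIG = {
    "work_order_levels": ["document"],                 # simple single-level work order
    "org_hierarchy_depth": 2,
}

def can_exchange_work_orders(sender_cfg: dict, receiver_cfg: dict) -> bool:
    """Records exchange cleanly only if both sites configured the structure the same way."""
    return (sender_cfg["work_order_levels"] == receiver_cfg["work_order_levels"]
            and sender_cfg["org_hierarchy_depth"] == receiver_cfg["org_hierarchy_depth"])

print(can_exchange_work_orders(NEMAIS_CONFIG, OTHER_PILOT_CONFIG))  # False: not interoperable
```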

Because of the various inconsistencies in the design and implementation,
the pilots were stovepiped and could not operate with each other, even
though they performed many of the same business functions. Table 2
illustrates the similar business functions that are performed by more than
one pilot.

              Table 2: Functions Performed by the Pilot Projects

Type of functions to be performed           NEMAIS  Cabrillo  SIGMA  SMART

                              Materials management

o  Sales and distribution X X X X

o  Procurement X X X

                              Financial management

o  Financial accounting X X X X

o  Revenue and cost controlling X X X X

o  Asset accounting X X X X

o  Budgeting and funds management X X X

                               Program management

o  Project management X X X

o  Program planning, budgeting, control X X X

                              Workforce management

o  Time and attendance X X X

Source: GAO, based upon information provided by the Navy.

By definition, an ERP solution should integrate the financial and business
operations of an organization. However, the lack of a coordinated effort
among the pilots led to a duplication of efforts and problems in
implementing many business functions and resulted in ERP solutions that
carry out redundant functions in different ways from one another.

The end result of all of the differences was a "system" that could not
successfully process transactions associated with the normal Navy
practices of moving ships and aircraft between fleets. Another
configuration problem occurred because the pilots generally developed
custom roles23 for system users. Problems arose after the systems began
operating. Some roles did not have the correct transactions assigned to
enable users with that role to do their entire job correctly. Further,
some combinations of roles assigned to individual users were inappropriate
and violated the segregation-of-duties principle.
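The role problems described above lend themselves to a simple automated check. The sketch below is a hypothetical illustration of testing user role assignments against a segregation-of-duties rule set; the role names and conflicting pairs are assumptions, not the pilots' actual roles.

```python
# Hypothetical sketch of a segregation-of-duties check on user role assignments
# (role names and conflict pairs are illustrative assumptions).
CONFLICTING_ROLE_PAIRS = {
    frozenset({"create_purchase_order", "approve_purchase_order"}),
    frozenset({"enter_vendor_invoice", "certify_payment"}),
}

def sod_violations(user_roles: dict) -> dict:
    """Return, per user, any pairs of assigned roles that conflict."""
    violations = {}
    for user, roles in user_roles.items():
        hits = [pair for pair in CONFLICTING_ROLE_PAIRS if pair <= set(roles)]
        if hits:
            violations[user] = [tuple(sorted(pair)) for pair in hits]
    return violations

assignments = {
    "user_a": ["create_purchase_order", "approve_purchase_order"],  # conflict
    "user_b": ["enter_vendor_invoice"],
}
print(sod_violations(assignments))
```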

The pilots experienced other difficulties with respect to controlling the
scope and performance schedules due to the lack of disciplined
processes,24 such as requirements management. For example, the pilots did
not identify in a disciplined manner the amount of work necessary to
achieve the originally specified capabilities-even as the end of testing
approached. There were repeated contract cost-growth adjustments, delays
in delivery of many planned capabilities, and initial periods of system
instability after the systems began operating. These problems are typical
of the adverse effects associated with projects that have not effectively
implemented disciplined processes.

23Roles are the actions and activities assigned to or required or expected
of a person or group. A user's access to the transactions, reports, and
applications is controlled by the roles assigned to the user.

24Disciplined processes include a wide range of activities, including
project planning and oversight, requirements management, risk management,
and testing.

  Pilots Lacked Departmentwide Oversight

The Navy circumvented departmentwide policy by not designating the pilots
as major automated information systems acquisition programs. DOD policy in
effect at the time25 stipulated that a system acquisition should be
designated as a major program if the estimated cost of the system exceeds
$32 million in a single year, $126 million in total program costs, or $378
million in total life-cycle costs, or if deemed of special interest by the
DOD Chief Information Officer (CIO). According to the Naval Audit
Service,26 all four of the pilots should have been designated as major
programs based on their costs-which were estimated to be about $2.5
billion at the time-and their significance to Navy's operations. More
specifically, at the time of its review, SMART's total estimated costs for
development, implementation, and sustainment were over $1.3 billion-far
exceeding the $378 million life-cycle cost threshold. However, because Navy
management considered each of its ERP programs to be "pilots," it did not
designate the efforts as major automated information systems acquisitions,
thereby limiting departmental oversight.27

25Department of Defense Directive 5000.1, The Defense Acquisition System,
DOD Instruction 5000.2, Operation of the Defense Acquisition System (Apr.
5, 2002) and DOD Regulation 5000.2-R, Mandatory Procedures for Major
Defense Acquisition Programs and Major Automated Information System
Acquisition Programs (Apr. 5, 2002). The DOD policy also specifies that
the DOD CIO is the milestone decision authority, responsible for program
approval, for all major automated information systems.

26Naval Audit Service, Department of the Navy Implementation of Enterprise
Resource Planning Solutions, N2002-0024 (Washington, D.C.: Jan. 25, 2002).

27Subsequent to the Naval Audit Service's report, on July 21, 2003, NEMAIS
was designated a major automated information system. This memorandum also
allowed the fielding of NEMAIS to four Navy regions.

Consistent with the Clinger-Cohen Act of 1996, DOD acquisition guidance28
requires that certain documentation be prepared at each milestone within
the system life cycle. This documentation is intended to provide relevant
information for management oversight and in making decisions as to whether
the investment of resources is cost beneficial. The Naval Audit Service
reported 29 that a key missing document that should have been prepared for
each of the pilots was a mission needs statement.30 A mission needs
statement was required early on in the acquisition process to describe the
projected mission needs of the user in the context of the business need to
be met. The mission needs statement should also address interoperability
needs. As noted by the Naval Audit Service, the result of not designating
the four ERP pilots as major programs was that program managers did not
prepare and obtain approval of this required document before proceeding
into the next acquisition phase. In addition, the pilots did not undergo
mandatory integrated reviews that assess where to spend limited resources
departmentwide. The DOD CIO is responsible for overseeing major automated
information systems and a program executive office is required to be
dedicated to executive management and not have other command
responsibilities. However, because the pilots were not designated major
programs, the oversight was at the organizational level that funded the
pilots (i.e., command level). Navy ERP officials stated that at the
beginning of the pilots, investment authority was dispersed throughout the
Navy and there was no established overall requirement within the Navy to
address systems from a centralized Navy enterprise level. The Navy ERP is
now designated a major program under the oversight of the DOD CIO.

28DOD, Operation of the Defense Acquisition System, DOD Instruction
5000.2, (Apr. 5, 2002); and DOD Regulation 5000.2-R, Mandatory Procedures
for Major Defense Acquisition Programs and Major Automated Information
System Acquisition Programs (Apr. 5, 2002). The acquisition controls
described in this guidance apply to all DOD acquisition programs.

29Naval Audit Service N2002-0024.

30The mission needs statement has been replaced by the requirement for an
Initial Capabilities Document.

  Experience Has Shown the Effects of Not Effectively Implementing Disciplined
  Processes

The problems identified in the failed implementation of the four pilots
are indicative of a system program that did not adhere to the disciplined
processes. The successful development and implementation of systems is
dependent on an organization's ability to effectively implement best
practices, commonly referred to as disciplined processes, which are
essential to reduce the risks associated with these projects to acceptable
levels.31 However, the inability to effectively implement the disciplined
processes necessary to reduce risks to acceptable levels does not mean
that an entity cannot put in place a viable system that is capable of
meeting its needs. Nevertheless, history shows that the failure to
effectively implement disciplined processes and the necessary metrics to
understand the effectiveness of processes implemented increases the risk
that a given system will not meet its cost, schedule, and performance
objectives.

In past reports we have highlighted the impact of not effectively
implementing the disciplined processes. These results are consistent with
those experienced by the private sector. More specifically:

o 	In April 2003, we reported32 that NASA had not implemented an effective
requirements management process and that these requirements management
problems adversely affected its testing activities. We also noted that
because of the testing inadequacies, significant defects later surfaced in
the production system. In May 2004, we reported33 that NASA's new
financial management system, which was fully deployed in June 2003 as
called for in the project schedule, still did not address many of the
agency's most challenging external reporting issues, such as external
reporting problems related to property accounting and budgetary
accounting. The system continues to be unable to produce reliable
financial statements.

31Acceptable levels refer to the fact that any systems acquisition effort
will have risks and will suffer the adverse consequences associated with
defects in the processes. However, effective implementation of the
disciplined processes reduces the potential of risks actually occurring
and prevents significant defects from materially affecting the cost,
timeliness, and performance of the project.

32GAO, Business Modernization: Improvements Needed in Management of NASA's
Integrated Financial Management Program, GAO-03-507 (Washington, D.C.:
Apr. 30, 2003).

33GAO, National Aeronautics and Space Administration: Significant Actions
Needed to Address Long-standing Financial Management Problems, GAO-04-754T
(Washington, D.C.: May 19, 2004).

o 	In May 2004, we reported34 that the Army's initial deployments for its
Logistics Modernization Program (LMP) did not operate as intended and
experienced significant operational difficulties. In large part, these
operational problems were due to the Army not effectively implementing the
disciplined processes that are necessary to manage the development and
implementation of the systems in the areas of requirements management and
testing. The Army program officials have acknowledged that the problems
experienced in the initial deployment of LMP could be attributed to
requirements and testing. Subsequently, in June 2005,35 we reported that
the Army still had not put into place effective management control and
processes to help ensure that the problems that have been identified since
LMP became operational in July 2003 are resolved in an efficient and
effective manner. The Army's inability to effectively implement the
disciplined processes provides it with little assurance that (1) system
problems experienced during the initial deployment that caused the delay
of future deployments have been corrected and (2) LMP is capable of
providing the promised system functionality. The failure to resolve these
problems will continue to impede operations at Tobyhanna Army Depot, and
future deployment locations can expect to experience similar significant
disruptions in their operations, as well as a system that is unable to
produce reliable and accurate financial and logistics data.

o 	We reported in February 200536 that DOD had not effectively managed
important aspects of the requirements for the Defense Integrated Military
Human Resources System, which is to be an integrated personnel and pay
system standardized across all military components. For example, DOD had
not obtained user acceptance of the detailed requirements, nor had it
ensured that the detailed requirements were complete and understandable.
Based on our review of a random sample of the requirements documentation,
about 77 percent of the detailed requirements were difficult to
understand.

34GAO, DOD Business Systems Modernization: Billions Continue to Be
Invested with Inadequate Management Oversight and Accountability,
GAO-04-615 (Washington, D.C.: May 27, 2004).

35GAO, Army Depot Maintenance: Ineffective Oversight of Depot Maintenance
Operations and System Implementation Efforts, GAO-05-441 (Washington,
D.C.: June 30, 2005).

36GAO, DOD Systems Modernization: Management of Integrated Military Human
Capital Program Needs Additional Improvements, GAO-05-189 (Washington,
D.C.: Feb. 11, 2005).

The problems experienced by DOD and other agencies are illustrative of the
types of problems that can result when disciplined processes are not
properly implemented. The four Navy pilots provide yet another example. As
discussed previously, because the pilots were four stovepiped efforts,
lacking centralized management and oversight, the Navy had to start over
when it decided to proceed with the current ERP effort after investing
about $1 billion. Figure 1 shows how organizations that do not effectively
implement disciplined processes lose the productive benefits of their
efforts as a project continues through its development and implementation
cycle. Although undisciplined projects show a great deal of productive
work at the beginning of the project, the rework associated with defects
begins to consume more and more resources. In response, processes are
adopted in the hope of managing what turns out, in reality, to be
unproductive work. Generally, however, these processes are "too little,
too late," and rework continues to consume more and more resources because
the foundations for building the system were never adequately laid. In
essence, experience shows that projects that fail to implement disciplined
processes at the beginning are forced to implement them later, when doing
so takes more time and is less effective.

As can be seen in figure 1, a major consumer of project resources in
undisciplined efforts is rework (also known as thrashing).

       Figure 1: Percent of Effort Associated with Undisciplined Projects

[Figure not reproduced. The chart plots percent of effort (0 to 100) over
the life of a project for three categories: visible progress (coding);
thrashing (unplanned rework and wasted effort); and planning and process
management. It notes that lucky projects finish before thrashing
dominates, that unlucky projects get stuck, and that thrashing and
planning combine to limit the ability to make any visible progress.]

Source: Used by permission of Steve McConnell, Professional Software
Development (Boston: Addison-Wesley, 2004).

Rework occurs when the original work has defects or is no longer needed
because of changes in project direction. Disciplined organizations focus
their efforts on reducing the amount of rework because it is expensive.
Studies have shown that fixing a defect during testing is anywhere from 10
to 100 times more expensive than fixing it during the design or
requirements phase.37

37Steve McConnell, Rapid Development: Taming Wild Software Schedules
(Redmond, Wash.: Microsoft Press, 1996).

  Requirements Management Process Effective, but Implementation Challenges and
  Risks Remain

To date, Navy ERP management has followed a comprehensive and disciplined
requirements management process, as well as leveraged lessons learned from
the implementation of the four ERP pilot programs to avoid repeating past
mistakes. Assuming that the project continues to effectively implement the
processes it has adopted, the planned functionality of the Navy ERP has
the potential to address at least some of the weaknesses identified in the
Navy's financial improvement plan. However, the project faces numerous
challenges and risks. Since the program is still in a relatively early
phase-it will not be fully operational until fiscal year 2011, at a
currently estimated cost of $800 million-the project team must be
continually vigilant and held accountable for ensuring that the
disciplined processes are followed in all phases to help achieve overall
success. For example, the project management office will need to ensure
that it effectively oversees the challenges and risks associated with
developing interfaces with 44 Navy and DOD systems and data
conversion-areas that were troublesome in other DOD efforts we have
audited. Considering that the project is in a relatively early phase, and
given DOD's history of not implementing systems on time and within budget,
the projected schedule and cost estimates are subject to change and very
likely will change. Furthermore, a far broader challenge, which lies
outside the
immediate control of the Navy ERP program office, is that the ERP is
proceeding without DOD having clearly defined its BEA. As we have recently
reported,38 DOD's BEA still lacks many of the key elements of a
well-defined architecture. The real value of a BEA is that it provides the
necessary content for guiding and constraining system investments in a way
that promotes interoperability and minimizes overlap and duplication.
Without it, rework will likely be needed to achieve those outcomes.

             Navy ERP Built on Lessons Learned from Pilot Projects

Although the four pilot projects were under the control of different
entities and had different functional focuses, a pattern of issues emerged
that the Navy recognized as being critical for effective development of
future projects. The Navy determined that the pilots would not meet its
overall requirements and concluded that the best alternative was to
develop a new ERP system-under the leadership of a central program
office-and use efforts from the pilots as starting points by performing a
review of their functionality and lessons learned, eliminating
redundancies, and developing new functionalities that were not addressed
by the pilots. The

38GAO-05-702.

lessons learned from the pilots cover technical, organizational, and
managerial issues and reinforce the Navy's belief that it must effectively
implement the processes that are necessary to effectively oversee and
manage the ERP efforts. Navy ERP project management recognizes that the
failure to do so would, in all likelihood, result in this ERP effort
experiencing the same problems as those resulting in the failure of the
four earlier pilots.

One of the most important lessons the Navy ERP project management learned
from the earlier experiences is the need to follow disciplined processes
to identify and manage requirements. As discussed later in this report,
the ERP program is following best practices in managing the system's
requirements. A key part of requirements identification is to have system
users involved in the process to ensure that the system will meet their
needs. Additionally, including system users in the detailed requirements
development process creates a sense of ownership in the system and
prepares users for upcoming changes to the way they conduct their
business. Moreover, the experience from the pilots demonstrated that the
working-level reviews must be cross-functional. For example, the
end-to-end process walkthroughs, discussed later, reinforce the overall
business effect of a transaction throughout the enterprise and help to
avoid a stovepiped view of an entity's operations.

Another lesson learned is the need to adapt business processes to conform
with the business practices on which standard COTS packages are based,
along with the associated transaction formats.39 Just the opposite
approach was pursued for the pilots, during which the Navy customized many
portions of the COTS software to match the existing business process
environment. In contrast, the current Navy ERP management is restricting
customization of the core COTS software, allowing modifications only where
legal or regulatory demands require them. Minimizing the amount of
customization reduces the complexity and costs of development.
Perhaps more importantly, holding customization to a minimum helps an
entity take advantage of two valuable benefits of COTS software. First,
COTS software provides a mature, industry-proven "best practices" approach
to doing business. The core elements of work-flow management, logistics,
financial management, and

39A transaction format is a logical process, as defined by the ERP. From
the user's point of view, a transaction is a self-contained unit, such as
changing an address for a customer or executing a program.

other components have been optimized for efficiency and standardization in
private industry over many years. According to program officials, the Navy
ERP will adhere to the fundamental concepts of using a COTS package and
thus take advantage of this efficiency benefit by modifying their business
practices to match the COTS software rather than vice versa as was done in
the four pilots. Having the software dictate processes is a difficult
transition for users to accept, and Navy ERP officials recognize the
challenge in obtaining buy-in from system users. To meet this challenge,
they are getting users involved early in requirements definition, planning
for extensive training, and ensuring that senior level leadership
emphasize the importance of process change, so the entire chain of command
understands and accepts its role in the new environment. In effect, the
Navy is taking the adopted COTS process and then presenting it to the
users. As a result, the Navy is attempting to limit the amount of
customization of the software package. One important consideration in
doing this is that if the standard COTS components are adopted, the
maintenance burden of upgrades remains with the COTS vendor.

Finally, the Navy learned from the pilots that it needed to manage its
system integrators40 better. The ERP officials also found that they could
significantly reduce their risk by using the implementation methodology of
the COTS vendor rather than the specific approach of a system integrator.
Each of the pilots had separate system integrators with their own
particular methodology for implementing the COTS software. According to
Navy ERP officials, using the implementation methodology and tool set of
the COTS vendor maintains a closer link to the underlying software, and
provides more robust requirements management by easily linking
requirements from the highest level down to the COTS transaction level.
Navy ERP is focused on staying as close as possible to the delivered COTS
package, both in its avoidance of customization and its use of tools
provided by the COTS vendor. In contrast, with the pilots, the Navy
allowed the system integrators more latitude in the development process,
relying on their expertise and experience with other ERP efforts to guide
the projects. Navy ERP management realized they needed to maintain much
better control over the integrators' work. As a result, the Navy
established the Strategy, Architecture, and Standards Group to structure
and guide the effort across the Navy.

40A system integrator is a company that specializes in enabling an
organization to use off-the-shelf hardware and software packages to meet
the organization's computing needs.

Requirements Management Process Following Best Practices

Our review found that the ERP development team has so far followed an
effective process for managing its requirements development. Documentation
was readily available for us to trace selected requirements from the
highest level to the lowest, detailed transaction level. This traceability
allows the user to follow the life of the requirement both forward and
backward through the documentation, and from origin through
implementation. Traceability is also critical to understanding the
parentage, interconnections, and dependencies among the individual
requirements. This information in turn is critical to understanding the
impact when a requirement is changed or deleted.

Requirements represent the blueprint that system developers and program
managers use to design, develop, test, and implement a system. Improperly
defined or incomplete requirements have been commonly identified as a
cause of system failure and systems that do not meet their cost, schedule,
or performance goals. Without adequately defined requirements that have
been properly reviewed and tested, significant risk exists that the system
will need extensive and costly changes before it will achieve its intended
capability.

Because requirements provide the foundation for system testing,
specificity and traceability defects in system requirements preclude an
entity from implementing a disciplined testing process. That is,
requirements must be complete, clear, and well documented to design and
implement an effective testing program. Absent this, an organization is
taking a significant risk that its testing efforts will not detect
significant defects until after the system is placed into production.
Industry experience indicates that the sooner a defect is recognized and
corrected, the cheaper it is to fix. As shown in figure 2, there is a
direct relationship between requirements and testing.

Figure 2: Relationship between Requirements Development and Testing

                                  Source: GAO.

Although the actual testing activities occur late in the development
cycle, disciplined test planning performed early can help reduce
requirements-related defects. For example, developing conceptual test
cases based on the requirements derived from the concept of operations and
functional requirements stages can identify errors, omissions, and
ambiguities long before any code is written or a system is configured.
Disciplined organizations also recognize that planning testing activities
in coordination with the requirements development process has major
benefits. As we have previously reported,41 failure to effectively manage
requirements and testing activities has posed operational problems for
other system development efforts.

The Navy ERP requirements identification process began with formal
agreement among the major stakeholders on the scope of the system,
followed by the identification of detailed, working-level business needs
from user groups and legacy systems. The high-level business or functional
requirements
identified initially are documented in the Operational Requirements
Document (ORD). The ORD incorporates requirements from numerous major DOD
framework documents42 and defines the capabilities that the system must
support, including business operation needs such as acquisition, finance,
and logistics. In addition, the ORD identifies the numerous DOD
infrastructure systems, initiatives, and policy directives to which the
Navy ERP must conform. The ORD was distributed to over 150 Navy and DOD
reviewers. It went through seven major revisions to incorporate the
reviewers' comments and suggestions before being finalized in April 2004.
According to Navy ERP program officials, any requested function that was
not included in the ORD will not be supported. This is a critical decision
that reduces the project's risks, since "requirements creep" is another
common cause of projects failing to meet their cost, schedule, and
performance objectives.

41See, for example, GAO-04-615 and GAO, Financial Management Systems: Lack
of Disciplined Processes Puts Implementation of HHS' Financial System at
Risk, GAO-04-1008 (Washington, D.C.: Sept. 23, 2004).

42The documents include: DOD Financial Bluebook; Global Information Grid
Capstone Requirements Document (CRD) Crosswalk; Global Combat Support
System CRD Crosswalk; Joint Deployment Systems CRD Crosswalk; DoD Joint
Technical Architecture; DOD 8500.1, Information Assurance; DOD 8510.1-M,
Information Technology Security Certification and Accreditation Process;
and DOD 5000.1, The Defense Acquisition System.

We selected seven requirements43 from the ORD that related to specific
Navy problem areas, such as financial reporting and asset management, and
found that the requirements had the expected attributes, including the
level of detail one would normally expect for the requirement being
reviewed. For example, a requirement stated that the ERP will
provide reports of funds expended versus funds allocated. We found this
requirement was described in a low-level requirement document called a
Customer Input Template, which included a series of questions that must be
addressed. The documentation further detailed the standard reports that
were available based on the selection of configuration options. Further,
the documentation of the detailed requirements identified the specific
COTS screen number that would be used and described the screen settings
that would be used when a screen was "activated."

While the ORD specifies the overall capabilities of the system at a high
level, more specific, working-level requirements also had to be developed
to achieve a usable blueprint for configuration and testing of the system.
To develop these lower-level requirements, the Navy ERP project held
detailed working sessions where requirements and design specifications
were discussed, refined, formalized, and documented. Each high-level
requirement was broken down into its corresponding business processes,
which in turn drove the selection of transactions (COTS functions) to be
used for configuration of the software. For each selected transaction,
comprehensive documentation was created to capture the source information
that defines how and why a transaction must be configured. This
documentation is critical for ensuring accurate configuration of the
software, as well as for testing the functionality of the software after
configuration. Table 3 describes the kinds of documentation used to
maintain these lower-level detailed requirements.

43As discussed in appendix I, our approach to validating the effectiveness
of the requirements management process relied on a selection of seven
requirements from different functional areas. From the finance area, we
selected the requirement to provide reports of funds expended versus funds
allocated. From the intermediate-level maintenance management area, we
selected the requirement related to direct cost per job and forecasting
accuracy. From the procurement area, we selected the requirement to enable
monitoring and management of cost versus plan. In the plant supply
functions area, we reviewed the requirement related to total material
visibility and access of material held by the activity and the enterprise.
From the wholesale supply functions area, we selected the requirements of
in-transit losses/in-transit write-offs and total material visibility and
access of material held by the activity and the enterprise. Additionally,
we reviewed the requirement that the ERP be compliant with federal
mandates and requirements and the U.S. Standard General Ledger.

Table 3: Documentation for Detailed Requirements

Documentation name               Documentation description

Customer Input Templates         A series of questions completed for every
                                 requirement. It enforces a comprehensive
                                 review of the requirement and documents
                                 the reasoning behind the answers.

Functional Design Specification  A detailed description for each
                                 interface.

Technical Functional Script      A description of any customization that
                                 had to be made to a transaction.

Implementation Guide             Automatically documents the actual
                                 configuration choices made by the
                                 developer for each transaction.

Source: GAO, based upon information provided by the Navy.

Additionally, the Navy ERP program is using a requirements management tool
containing a database that links each requirement from the highest to the
lowest level and maintains the relationship between the requirements. This
tool automates the linkage between requirements and helps provide the
project staff reasonable assurance that its stated processes have been
effectively implemented. This linkage is critical to understanding the
scope of any potential change. For example, the users can utilize the tool
to (1) determine the number of transactions affected by a proposed change
and (2) identify the detailed documentation necessary for understanding
how this change will affect each business process. To further ensure that
the individual transactions ultimately support the adopted business
process, Navy ERP officials conducted master business scenarios44 or
end-to-end process walkthroughs. This end-to-end view of the business
process ensures that the business functionality works across the various
subsystems of the COTS package. For instance, the requirements for a
purchase order could be viewed simply from the vantage point of a
logistics person or the acquisition community. However, a purchase order
also has financial ramifications, and therefore must be posted to
financial records, such as the general ledger. The master business
scenarios provide a holistic review of the business process surrounding
each transaction.
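
The kind of traceability and impact analysis such a tool supports can be
illustrated with a simple data structure. The following Python sketch is
purely illustrative; it is not the Navy's requirements management tool,
and the requirement and transaction identifiers are hypothetical.

    # Illustrative sketch only: a minimal traceability structure, not the
    # Navy's actual requirements management tool. Requirement names and
    # transaction codes are hypothetical.
    requirements = {
        "ORD-FIN-01": {"parent": None, "transactions": []},
        "FIN-RPT-001": {"parent": "ORD-FIN-01",
                        "transactions": ["TX-1001", "TX-1002"]},
        "FIN-RPT-002": {"parent": "ORD-FIN-01",
                        "transactions": ["TX-1002", "TX-1003"]},
    }

    def trace_back(req_id):
        """Follow a requirement backward to its highest-level parent."""
        chain = [req_id]
        while requirements[req_id]["parent"] is not None:
            req_id = requirements[req_id]["parent"]
            chain.append(req_id)
        return chain

    def impact_of_change(req_id):
        """List the transactions affected by a change to a requirement."""
        affected = set(requirements[req_id]["transactions"])
        for other, info in requirements.items():
            if info["parent"] == req_id:
                affected.update(impact_of_change(other))
        return sorted(affected)

    print(trace_back("FIN-RPT-002"))       # ['FIN-RPT-002', 'ORD-FIN-01']
    print(impact_of_change("ORD-FIN-01"))  # ['TX-1001', 'TX-1002', 'TX-1003']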

44A business scenario is a description of the flow of business processes
within a particular area of a company's operations.

    ERP Has the Potential to Address Some of the Navy's Reported Financial
    Management Weaknesses

The Navy expects the new ERP project to address a number of the weaknesses
cited in the Department of the Navy Financial Improvement Plan-a course of
action directed towards achieving better financial management and an
unqualified audit opinion for the Department of the Navy annual financial
statements. According to ERP officials, the COTS software used for the ERP
program will improve the Navy's current financial controls in the areas of
asset visibility, financial reporting, and full cost accounting. However,
the currently planned ERP is not intended to provide an all-inclusive
end-to-end corporate solution for the Navy.

The COTS software offers the potential for real-time asset visibility for
the Navy, limited by two factors beyond the project's scope. First, items
in transit fall under the authority of the U.S. Transportation Command
(TRANSCOM). Once the Navy hands off an item to TRANSCOM, it does not
retain visibility of that asset until it arrives at another Navy location.
The second factor is the limited ability to communicate with ships at
sea. Once the currently planned ERP is fully implemented, it will cover
all inventories, including inventory on ships. However, the data for
shipboard inventory will be current only as of when the ship leaves port.
Those data will typically not be updated until the ship docks in another
port and can transmit updated information to the ERP system. This lag time
for some ships could be as much as 3 to 4 months. While the ERP has the
capability to maintain real-time shipboard inventory, the Navy has yet to
decide whether to expand the scope of the ERP and build an interface with
the ships, which could be extensive and costly, or install the ERP on the
ships. Both options present additional challenges that necessitate
thorough analysis of all alternatives before a decision is made. According
to the program office, a time frame for making this critical decision has
not been established.

The COTS software is also intended to provide standardized government and
proprietary financial reporting at any level within the defined
organization. According to Navy ERP officials, full cost accounting will
be facilitated by a software component integrated with the ERP. For
example, the Navy expects that this component will provide up-to-date cost
information-including labor, materials, and overhead-for its numerous, and
often complicated, maintenance jobs. Full cost information is necessary
for effective management of production, maintenance, and other activities.

According to Navy ERP program officials, when fully operational in fiscal
year 2011, the Navy ERP will be used by organizations that account for
approximately 80 percent of the Navy's estimated appropriated funds-after
excluding the Marine Corps and military pay and personnel.45 Based on the
fiscal year 2006 through 2011 defense planning budget, the Navy ERP will
manage approximately $74 billion annually. The organizations that will use
Navy ERP include the Naval Air Systems, the Naval Sea Systems, the Naval
Supply Systems, the Space and Naval Warfare Systems, and the Navy
Facilities Engineering Commands, as well as the Office of Naval Research,
the Atlantic and Pacific Fleets, and the Strategic Systems Programs.
However, the Navy ERP will not manage in detail all of the 80 percent.
About 2 percent, or approximately $1.6 billion, will be executed and
maintained in detail by respective financial management systems at the
aviation and shipyard depots. For example, when a work order for a repair
of an airplane part is prepared, the respective financial management
system at the depot will execute and maintain the detailed transactions.
The remaining 20 percent that the Navy ERP will not manage comprises the
Navy Installations Command, field support activities, and others. Navy ERP
officials have indicated that it is the Navy's intent to further expand
the system in the future to include the aviation and shipyard depots, but
definite plans have not yet been made. According to Navy ERP officials,
the software has the capability to be used at the aviation and shipyard
depots, but additional work would be necessary. For example, the desired
functionality and related requirements-which as discussed above, are
critical to the success of any project-would have to be defined for the
aviation and shipyard depots.

    System Interfaces and Data Conversion Will Be Challenges

While the Navy is following disciplined processes in managing the ERP's
requirements, and requirements management is one critical aspect of the
overall project development and implementation, by itself it is not
sufficient to provide reasonable assurance of the ERP's success. Going
forward, the Navy faces
very difficult challenges and risks in the areas of developing and
implementing 44 system interfaces with other Navy and DOD systems, and
accurately converting data from the existing legacy systems to the ERP. As
previously noted, financial management is a high-risk area in the
department and has been designated as such since 1995. One of the

45Military pay and personnel will not be included in the Navy ERP because
DOD plans to use a new system-the Defense Integrated Military Human
Resources System (DIMHRS)-to process these functions for all DOD
components.

contributing factors has been DOD's inability to develop integrated
systems. As a result, the Navy is dependent upon the numerous interfaces
to help improve the accuracy of its financial management data. Navy ERP
program managers have recognized the issues of system interfaces and data
conversion in their current list of key risks. They have identified some
actions that need to be taken to mitigate the risks; however, they have
not yet developed the memorandums of agreement with the owners of the
systems with which the Navy ERP will interface. According to the Navy ERP
program office, it plans to complete these memorandums of agreement by
October 2005.

Integrated Systems

One of the long-standing problems within DOD has been the lack of
integrated systems. This is evident in the many duplicative,
stovepiped business systems among the 4,150 that DOD reported as belonging
to its systems environment. Lacking integrated systems, DOD has a
difficult time obtaining accurate and reliable information on the results
of its business operations and continues to rely on either manual reentry
of data into multiple systems, convoluted system interfaces, or both.
These system interfaces provide data that are critical to day-to-day
operations, such as obligations, disbursements, purchase orders,
requisitions, and other procurement activities. Testing the system
interfaces in an end-to-end manner is necessary in order for the Navy to
have reasonable assurance that the ERP will be capable of providing the
intended functionality.

The testing process begins with the initial requirements development
process. Furthermore, early test planning can help reduce
requirements-related defects. For example, developing conceptual test
cases based on the requirements can identify errors, omissions, and
ambiguities long before any code is written or a system is configured. The
challenge now before Navy ERP is to be sure its testing scenarios
accurately reflect the activities of the "real users," and the
dependencies of external systems.
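
One way to make conceptual test cases concrete at this early stage is to
record them as structured data tied directly to a requirement. The sketch
below is only an illustration; the requirement identifier, wording, and
expected results are hypothetical and are not drawn from the Navy ERP test
plans.

    # Illustrative sketch: deriving a conceptual test case directly from a
    # requirement, long before any code is written or a system is
    # configured. All identifiers and expected values are hypothetical.
    conceptual_test_case = {
        "requirement_id": "FIN-RPT-001",
        "requirement_text": "Provide reports of funds expended versus "
                            "funds allocated",
        "preconditions": ["An appropriation with allocated funds exists",
                          "At least one obligation has been expended"],
        "steps": ["Run the funds status report for the appropriation"],
        "expected_result": "Report shows expended and allocated amounts "
                           "that reconcile to the general ledger",
    }

    def review_for_ambiguity(test_case):
        """Flag missing elements that often point to an ambiguous requirement."""
        problems = []
        for field in ("preconditions", "steps", "expected_result"):
            if not test_case.get(field):
                problems.append(f"Missing {field}: requirement may be untestable")
        return problems

    print(review_for_ambiguity(conceptual_test_case))  # [] means no obvious gaps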

We previously reported46 that Sears and Wal-Mart, recognized as
leading-edge inventory management companies, have automated systems that
electronically receive and exchange standard data throughout the entire
inventory management process, thereby reducing the need for manual data
entry. As a result, information moves through the data systems with

46GAO, DOD Management: Examples of Inefficient and Ineffective Business
Processes, GAO-02-873T (Washington, D.C.: June 25, 2002).

automated ordering of inventory from suppliers; receiving and shipping at
distribution centers; and receiving, selling, and reordering at retail
stores. Unlike DOD, which has a proliferation of nonintegrated systems
using nonstandard data, Sears and Wal-Mart require all components and
subsidiaries to operate within a standard systems framework that results
in an integrated system and does not allow individual systems development.

For the first deployment, the Navy has to develop interfaces that permit
the ERP to communicate with 44 systems-27 that are Navy specific and 17
systems belonging to other DOD entities. Figure 3 illustrates the numerous
required system interfaces.

Figure 3: Navy ERP Required System Interfaces

         Note: See app. III for the definitions of the system acronyms.

Long-standing problems regarding the lack of integrated systems and use of
nonstandard data within DOD pose significant risks for the Navy ERP to
successfully interface with these systems. Even if integration is
successful, if the information within the 44 systems is not accurate and
reliable, the overall information on Navy's operation provided by the ERP
to Navy management and the Congress will not be useful in the
decision-making process. While the Navy ERP project office is working to
develop agreements with system owners for the interfaces and has been
developing the functional specifications for each system, officials
acknowledged that, as of May 2005, they are behind schedule in completing
the interface agreements due to other tasks. The Navy ERP is dependent on
the system owners to achieve their time frames for implementation. For
example, the Defense Travel System (DTS) 47 is one of the DOD systems with
which the Navy ERP is to interface and exchange data. DTS is currently
being implemented, and any problems that result in a DTS schedule slippage
will, in turn, affect Navy ERP's interface testing.

We have previously reported that the lack of system interface testing has
seriously impaired the operation of other system implementation efforts.
For example, in May 2004, we reported48 that because the system interfaces
for the Defense Logistics Agency's Business Systems Modernization (BSM)
program and the Army's LMP were not properly tested prior to deployment,
severe operational problems were experienced. Such problems have led BSM,
LMP, and organizations with which they interface-such as DFAS-to perform
costly manual reentry of transactions, which can cause additional data
integrity problems. For example:

47According to DOD, DTS is expected to reengineer defense travel to a
seamless, paperless, automated system that meets the needs of individual
travelers, force commanders, and process owners (such as finance and
accounting services). DTS represents a whole new way of doing business for
government and it is expected to make the travel process faster, easier,
and better than ever before. During fiscal years 2004-2006, DTS is
expected to be fielded to more than 250 high-volume sites across the
country that serve over 80 percent of all DOD travelers.

48GAO-04-615.

o 	BSM's functional capabilities were adversely affected because a
significant number of interfaces were still in development or were being
executed manually once the system became operational. Since the design of
system interfaces had not been fully developed and tested, BSM experienced
problems with receipts being rejected, customer orders being canceled, and
vendors not being paid in a timely manner. At one point, DFAS suspended
all vendor payments for about 2 months, thereby increasing the risk of
late payments to contractors and violations of the Prompt Payment Act.49

o 	In January 2004, the Army reported that due to an interface failure,
LMP had been unable to communicate with the Work Ordering and Reporting
Communications System (WORCS) since September 2003. WORCS is the means by
which LMP communicates with customers on the status of items that have
been sent to the depot for repair and initiates procurement actions for
inventory items. The Army has acknowledged that the failure of WORCS has
resulted in duplicative shipments and billings and inventory items being
delivered to the wrong locations. Additionally, the LMP program office has
stated that it has not yet identified the specific cause of the interface
failure. The Army is currently entering the information manually, which,
as noted above, can cause additional data integrity errors.

Besides the challenge of developing the 44 interfaces, the Navy ERP must
also develop the means to be compliant with DOD's efforts to standardize
the way that various systems exchange data with each other. As discussed
in our July 2004 report,50 DOD is undertaking a huge and complex task
(commonly referred to as the Global Information Grid or GIG) that is
intended to integrate virtually all of DOD's information systems,
services, applications, and data into one seamless, reliable, and secure
network. The GIG initiative is focused on promoting interoperability
throughout DOD by building an Internet-like network for DOD-related
operations based on common standards and protocols rather than on trying
to establish interoperability after individual systems become operational.
DOD

49The Prompt Payment Act, codified at 31 U.S.C. S:S: 3901 - 3907, and as
implemented at 5 C.F.R. pt. 1315 (2005), provides for agencies, among
other things, to pay interest and penalties under various circumstances
for late payments, generally when payments are not made within 30 days of
the payment due date.

50GAO, Defense Acquisitions: The Global Information Grid and Challenges
Facing Its Implementation, GAO-04-858 (Washington, D.C.: July 28, 2004).

envisions that this type of network would help ensure systems can easily
and quickly exchange data and change how military operations are planned
and executed since much more information would be dynamically available to
users.

DOD's plans for realizing the GIG involve building a new core network and
information capability and successfully integrating the majority of its
weapon systems; command, control, and communications systems; and business
systems with the new network. The effort to build the GIG will require DOD
to make a substantial investment in a new set of core enterprise programs
and initiatives. To integrate systems such as the Navy ERP into the GIG,
DOD has developed (1) an initial blueprint or architecture for the GIG and
(2) new policies, guidance, and standards to guide implementation.
According to project officials, the Navy ERP system will be designed to
support the GIG. However, they face challenges that can result in
significant cost and schedule risks depending on the decisions reached.
One challenge is the extent to which other DOD applications with which the
Navy ERP must exchange data are compliant with the GIG. While traditional
interfaces with systems that are not GIG compliant can be developed, these
interfaces may suboptimize the benefits expected from the Navy ERP. The
following is one example of the difficulties faced by the Navy ERP
project.

As mentioned previously, one system that will need to exchange data with
the Navy ERP system is DTS. However, the DTS program office and the Navy
ERP project office hold different views of how data should be exchanged
between the two systems. The travel authorization process exemplifies
these differences. DTS requires that funding information and the
associated funds be provided to DTS in advance of a travel authorization
being processed. In effect, DTS requires that the financial management
systems set aside the funds necessary for DTS operations. Once a travel
authorization is approved, DTS notifies the appropriate financial
management system that an obligation has been incurred. The Navy ERP
system, on the other hand, only envisions providing basic funding
information to DTS in advance, and would delay providing the actual funds
to DTS until they are needed in order to (1) maintain adequate funds
control, (2) ensure that the funds under its control are not tied up by
other systems, and (3) ensure that the proper accounting data are provided
when an entry is made into its system.

According to the Software Engineering Institute (SEI), a widely recognized
model for evaluating the interoperability of a system of systems is the
Levels of Information System Interoperability. This model focuses on the
increasing
levels of sophistication of system interoperability. According to Navy ERP
officials, the GIG and the ERP effort are expected to accomplish the
highest level of this model-enterprise-based interoperability. In essence,
systems that achieve this level of interoperability can provide multiple
users access to complex data simultaneously, data and applications are
fully shared and distributed, and data have a common interpretation
regardless of format. This is in contrast to traditional interface
strategies, such as the one used by DTS. The traditional approach is more
aligned with the lowest level of the SEI model. Data exchanged at this
level rely on electronic links that result in a simple electronic exchange
of data.

Alignment with DOD's BEA Is a Significant Risk Factor

A broader challenge and risk that is out of the Navy ERP project's
control, but could significantly affect it, is DOD's development of a BEA.
As we recently reported,51 DOD's BEA still lacks
many of the key elements of a well-defined architecture and no basis
exists for evaluating whether the Navy ERP will be aligned with the BEA
and whether it would be a corporate solution for DOD in its "To Be" or
target environment.

An enterprise architecture consists of snapshots of the enterprise's
current environment and its target environment, as well as a capital
investment road map for transitioning from the current to the target
environment. The real value of an enterprise architecture is that it
provides the necessary content for guiding and constraining system
investments in a way that promotes interoperability and minimizes overlap
and duplication. At this time, it is unknown what the target environment
will be. Therefore, it is unknown what business processes, data standards,
and technological standards the Navy ERP must align to, as well as what
legacy systems will be transitioned into the target environment.

The Navy ERP project team is cognizant of the BEA development and has
attempted to align to prior versions of it. The project team analyzed the
BEA requirements and architectural elements to assess Navy ERP's
compliance. The project team mapped the BEA requirements to the Navy ERP
functional areas and the BEA operational activities to the Navy ERP's
business processes. The Navy ERP project team recognizes that
architectures evolve over time, and analysis and assessments will continue
as requirements are further developed and refined. The scope of the BEA
and the development approach are being revised. As a result of the new

51GAO-05-702.

focus, DOD is determining which products from prior releases of the BEA
could be salvaged and used.

Since the Navy ERP is being developed absent the benefit of an enterprise
architecture, there is limited, if any, assurance that the Navy ERP will
be compliant with the architecture once it becomes more robust in the
future. Given this scenario, it is conceivable that the Navy ERP will be
faced with rework in order to be compliant with the architecture, once it
is defined, and as noted earlier, rework is expensive. At the extreme, the
project could fail as the four pilots did. If rework is needed, the
overall cost of the Navy ERP could exceed the Navy's current estimate of
$800 million.

Accuracy of Data Conversion Is Critical

The ability of the Navy to effectively address its data conversion
challenges will also be critical to the ultimate success of the ERP
effort.
A Joint Financial Management Improvement Program (JFMIP)52 white paper on
financial system data conversion53 noted that data conversion (that is,
converting data in a legacy system to a new system) was one of the
critical tasks necessary to successfully implement a new financial system.
The paper further pointed out that data conversion is one of the most
frequently underestimated tasks.

If data conversion is done right, the new system has a much greater
opportunity for success. On the other hand, converting data incorrectly or
entering unreliable data from a legacy system can have lengthy and
long-term repercussions. The adage "garbage in, garbage out" best describes
the adverse impact. Accurately converting data, such as account balances,
from the pilots, as well as other systems that the Navy ERP is to replace,
will be critical to the success of the Navy ERP. While data conversion is
identified in the Navy ERP's list of key risks, it is too early in the ERP
system life cycle for the development of specific testing plans.

52JFMIP was a joint and cooperative undertaking of the Department of the
Treasury, GAO, the Office of Management and Budget (OMB), and the Office
of Personnel Management (OPM), working in cooperation with each other and
other federal agencies to improve financial management practices in the
federal government. Leadership and program guidance were provided by the
four Principals of JFMIP-the Secretary of the Treasury, the Comptroller
General of the United States, and the Directors of OMB and OPM. Although
JFMIP ceased to exist as a stand-alone organization as of December 1,
2004, the JFMIP Principals will continue to meet at their discretion.

53JFMIP, White Paper: Financial Systems Data Conversion-Considerations
(Washington, D.C.: Dec. 20, 2002).

However, our previous audits have shown that if data conversion is not
done properly, it can negatively impact system efficiency. For example,
the Army's LMP data conversion effort has proven to be troublesome and
continues to affect business operations. As noted in our recent report,54
when the Tobyhanna Army Depot converted ending balances from its legacy
finance and accounting system-the Standard Depot System (SDS)-to LMP in
July 2003, the June 30, 2003, ending account balances in SDS did not
reconcile to the beginning account balances in LMP. Accurate account
balances are important for producing reliable financial reports. Another
example is LMP's inability to transfer accurate unit-of-issue data-the
quantity in which an item is issued, such as each, dozen, or gallon-from
its legacy system to LMP. This resulted in excess material being ordered.
Similar problems could occur with the Navy ERP if data conversion issues
are not adequately addressed. The agreements between the Navy ERP and the
other system owners, discussed previously, will be critical to effectively
supporting the Navy ERP's data conversion efforts.
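
A simple automated reconciliation check of the kind that can catch such
conversion problems before deployment is sketched below. The account
numbers, balances, and units of issue are hypothetical, and the sketch is
not based on the Navy's or the Army's actual conversion procedures.

    # Illustrative sketch: reconciling legacy ending balances against the
    # new system's beginning balances after conversion. Account numbers,
    # balances, and units of issue are hypothetical.
    legacy_ending = {"1010": 250000.00, "2100": -48000.00, "6100": 12500.00}
    new_beginning = {"1010": 250000.00, "2100": -48500.00, "6100": 12500.00}

    legacy_unit_of_issue = {"NSN-0001": "EA", "NSN-0002": "DZ"}
    new_unit_of_issue = {"NSN-0001": "EA", "NSN-0002": "EA"}  # conversion error

    def reconcile(legacy, converted, tolerance=0.005):
        """Return accounts whose converted balance differs from the legacy balance."""
        exceptions = {}
        for account, balance in legacy.items():
            if abs(converted.get(account, 0.0) - balance) > tolerance:
                exceptions[account] = (balance, converted.get(account))
        return exceptions

    print(reconcile(legacy_ending, new_beginning))
    # {'2100': (-48000.0, -48500.0)} -> must be researched before go-live

    mismatched_units = {item: (legacy_unit_of_issue[item], new_unit_of_issue[item])
                        for item in legacy_unit_of_issue
                        if legacy_unit_of_issue[item] != new_unit_of_issue[item]}
    print(mismatched_units)  # {'NSN-0002': ('DZ', 'EA')}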

Additional Actions Can Be Taken to Improve Management Oversight of the
Navy ERP Effort

Navy officials could take additional actions to improve management
oversight of the Navy ERP effort. For example, we found that the Navy does
not have a mechanism in place to capture the data that can be used to
effectively assess the project management processes. Best business
practices indicate that a key facet of project management and oversight is
the ability to effectively monitor and evaluate a project's actual
performance, cost, and schedule against what was planned.55 Performing
this critical task requires the accumulation of quantitative data or
metrics that can be used to evaluate a project's performance. This
information is necessary to understand the risk being assumed and whether
the project will provide the desired functionality. Lacking such data, the
ERP program management team can only focus on the project schedule and
whether activities have occurred as planned, not whether the activities
achieved their objectives.

Additionally, although the Navy ERP program has a verification and
validation function, it relies on in-house subject matter experts and
others

54GAO-05-441.

55GAO, Information Technology: DOD's Acquisition Policies and Guidance
Need to Incorporate Additional Best Practices and Controls, GAO-04-722
(Washington, D.C.: July 30, 2004).

who are not independent to provide an assessment of the Navy ERP to DOD
and Navy management. The use of an IV&V function is recognized as a best
business practice and can help provide reasonable assurance that the
system satisfies its intended use and user needs. Further, an independent
assessment of the Navy ERP would provide information to DOD and Navy
management on the overall status of the project, including the
effectiveness of the management processes being utilized and
identification of any potential risks that could affect the project with
respect to cost, schedule, and performance. Given DOD's long-standing
inability to implement business systems that provide users with the
promised capabilities, an independent assessment of the ERP's performance
is warranted.

Quantitative Data Necessary for Assessing Whether the System Will Provide
the Needed Functionality

The Navy's ability to understand the impact of the weaknesses in its
processes will be limited because it has not determined the quantitative
data or metrics that can be used to assess the effectiveness of its
project management processes. This information is necessary to understand
the risk being assumed and whether the project will provide the desired
functionality. The Navy has yet to establish the metrics that would allow
it to fully understand (1) its capability to manage the entire ERP effort;
(2) how its process problems will affect the ERP cost, schedule, and
performance objectives; and (3) the corrective actions needed to reduce
the risks associated with the problems identified. Experience has shown
that such an approach leads to rework and thrashing instead of making real
progress on the project.

SEI has found that metrics identifying important events and trends are
invaluable in guiding software organizations to informed decisions. Key
SEI findings relating to metrics include the following.

o 	The success of any software organization depends on its ability to make
predictions and commitments relative to the products it produces.

o 	Effective measurement processes help software groups succeed by
enabling them to understand their capabilities so that they can develop
achievable plans for producing and delivering products and services.

o 	Measurements enable people to detect trends and anticipate problems,
thus providing better control of costs, reducing risks, improving quality,
and ensuring that business objectives are achieved.56

The lack of quantitative data to assess a project has been a key concern
in other projects we have reviewed. Without such a process, management can
only focus on the project schedule and whether activities have occurred as
planned, not whether the activities achieved their objectives. Further,
such quantitative data can be used to hold the project team accountable
for providing the promised capability.

Defect-tracking systems are one means of capturing quantitative data that
can be used to evaluate project efforts. Although HHS had a system that
captured the reported defects, we found that the system was not updated in
a timely manner with this critical information.57 More specifically, one
of the users identified a process weakness related to grant accounting as
a problem that will affect the deployment of HHS's system-commonly
referred to as a "showstopper." However, this weakness did not appear in
the defect-tracking system until about 1 month later. As a result, during
this interval the HHS defect-tracking system did not accurately reflect
the potential problems identified by users, and HHS management was unable
to determine (1) how well the system was working and (2) the amount of
work necessary to correct known defects. Such information is critical when
assessing a project's status.

We have also reported58 that while NASA had a system that captured the
defects identified during testing, it did not perform an analysis to
determine the root causes of reported defects. A critical
element in helping to ensure that a project meets its cost, schedule, and
performance goals is to ensure that defects are minimized and corrected as
early in the process as possible. Understanding the root cause of a defect
is critical to evaluating the effectiveness of a process. For example, if
a significant number of defects are caused by inadequate requirements
definition, then the organization knows that the requirements management

56William A. Florac, Robert E. Park, and Anita D. Carleton, Practical
Software Measurement: Measuring for Process Management and Improvement
(Pittsburgh, Pa.: Software Engineering Institute, Carnegie Mellon
University, 1997).

57GAO-04-1008.

58GAO-03-507.

process it has adopted is not effectively reducing risks to acceptable
levels. Analysis of the root causes of identified defects allows an
organization to determine whether the requirements management approach it
has adopted sufficiently reduces the risks of the system not meeting cost,
schedule, and functionality goals to acceptable levels. Root-cause
analysis would also help to quantify the risks inherent in the testing
process that has been selected.
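
The kind of root-cause analysis described above can be illustrated with a
short sketch. The defect records and cause categories below are
hypothetical; the point is only that grouping tracked defects by cause
shows which process is producing them.

    # Illustrative sketch: grouping tracked defects by root cause so
    # management can see which process (requirements, coding, data
    # conversion) is the dominant source. All records are hypothetical.
    from collections import Counter

    defects = [
        {"id": 1, "severity": "high", "root_cause": "requirements"},
        {"id": 2, "severity": "low", "root_cause": "coding"},
        {"id": 3, "severity": "high", "root_cause": "requirements"},
        {"id": 4, "severity": "medium", "root_cause": "data conversion"},
        {"id": 5, "severity": "high", "root_cause": "requirements"},
    ]

    by_cause = Counter(d["root_cause"] for d in defects)
    total = sum(by_cause.values())
    for cause, count in by_cause.most_common():
        print(f"{cause:16s} {count:3d} ({count / total:.0%})")
    # requirements       3 (60%) -> the requirements process needs attention
    # coding             1 (20%)
    # data conversion    1 (20%)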

Further, the Navy has not yet implemented an earned value management
system, which is another metric that can be employed to better manage and
oversee a system project. Both OMB59 and DOD60 require the use of an
earned value management system. The earned value management system
attempts to compare the value of work accomplished during a given period
with the work scheduled for that period. By using the value of completed
work as a basis for estimating the cost and time needed to complete the
program, management can be alerted to potential problems early in the
program. For example, if a task is expected to take 100 hours to complete
and it is 50 percent complete, the earned value management system would
compare the number of hours actually spent to complete the task to the
number of hours expected for the amount of work performed. In this
example, if the actual hours spent equaled 50 percent of the hours
expected, then the earned value would show that the project's resources
were consistent with the estimate. Without an effective earned value
management system, the Navy and DOD management have little assurance that
they know the status of the various project deliverables in the context of
progress and the cost incurred in completing each of the deliverables. In
other words, an effective earned value management system would be able to
provide quantitative data on the status of a given project deliverable,
such as a data conversion program. Based on this information, Navy
management would be able to determine whether the progress of the data
conversion effort was within the expected parameters for completion.
Management could then use this information to determine actions to take to
mitigate risk and manage cost and schedule performance. According to Navy
ERP officials, they intend to implement the earned value management system
as part of the contract for the next phase of the project.
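
The 100-hour task example above can be expressed as a small earned value
calculation. The figures below simply restate that example in code; they
are illustrative values, not program data.

    # Illustrative sketch of the earned value comparison described above,
    # using the report's 100-hour task example. All values are hypothetical.
    budget_at_completion = 100.0   # hours budgeted for the task
    percent_complete = 0.50        # portion of the work actually accomplished
    actual_hours_spent = 50.0      # hours charged to the task so far

    earned_value = budget_at_completion * percent_complete  # value of work done
    cost_performance_index = earned_value / actual_hours_spent
    estimate_at_completion = budget_at_completion / cost_performance_index

    print(f"Earned value: {earned_value:.0f} hours of planned work accomplished")
    print(f"CPI: {cost_performance_index:.2f} (1.0 means effort matches the plan)")
    print(f"Estimate at completion: {estimate_at_completion:.0f} hours")
    # With 50 hours spent for 50 hours of earned work, CPI = 1.0, so the
    # effort is consistent with the estimate; a CPI below 1.0 would warn
    # management early that the task is likely to overrun.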

59OMB Circular No. A-11, Part 7, Planning, Budgeting, Acquisition, and
Management of Capital Assets (June 21, 2005) and the supplement to Part 7,
the Capital Programming Guide (July 22, 1997).

60Under Secretary of Defense (Acquisition, Technology and Logistics),
Department of Defense, Revision to DOD Earned Value Management Policy,
March 7, 2005.

Independent Assessment of Navy ERP Could Enhance DOD and Navy Management
Oversight of the Project

The Navy has not established an IV&V function to provide an assessment of
the Navy ERP to DOD and Navy management. Best business practices indicate
that use of an IV&V function is a viable means to provide management
reasonable assurance that the planned system satisfies its intended use
and user needs. An effective IV&V review process would provide independent
information to DOD and Navy management on the overall status of the
project, including a discussion of any impacts or potential impacts to the
project with respect to cost, schedule, and performance. These assessments
involve reviewing project documentation, participating in meetings at all
levels within the project, and providing periodic reports and
recommendations, if deemed warranted, to senior management. The IV&V
function61 should report on every facet of a system project such as:

o 	Testing program adequacy. Testing activities would be evaluated to
ensure they are properly defined and developed in accordance with industry
standards and best practices.

o 	Critical-path analysis. A critical path defines the series of tasks
that must be finished in time for the entire project to finish on
schedule. Each task on the critical path is a critical task. A
critical-path analysis helps to identify the impact of various project
events, such as delays in project deliverables, and ensures that the
impact of such delays is clearly understood by all parties involved with
the project (a simple illustration of such an analysis follows this list).

o 	System strategy documents. Numerous system strategy documents that
provide the foundation for system development and operations are critical
aspects of an effective system project. These documents guide the
development of the plans and procedures used to implement a system.
Examples include the Life-cycle Test Strategy, Interface Strategy, and
Conversion Strategy.
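
The critical-path analysis mentioned above can be illustrated with a small
calculation over a set of hypothetical tasks. The tasks, dependencies, and
durations below are invented for illustration and do not represent the
Navy ERP schedule.

    # Illustrative sketch of a critical-path calculation over hypothetical
    # project tasks (durations in days). A slip in any task on the critical
    # path delays the entire project.
    from functools import lru_cache

    tasks = {
        "define requirements": {"duration": 30, "depends_on": []},
        "configure software": {"duration": 45,
                               "depends_on": ["define requirements"]},
        "build interfaces": {"duration": 60,
                             "depends_on": ["define requirements"]},
        "convert data": {"duration": 40, "depends_on": ["configure software"]},
        "integration test": {"duration": 25,
                             "depends_on": ["build interfaces", "convert data"]},
    }

    @lru_cache(maxsize=None)
    def earliest_finish(name):
        """Earliest finish time for a task, given all of its predecessors."""
        start = max((earliest_finish(d) for d in tasks[name]["depends_on"]),
                    default=0)
        return start + tasks[name]["duration"]

    def critical_path():
        """Walk back from the latest-finishing task along its longest predecessors."""
        end = max(tasks, key=earliest_finish)
        path, name = [end], end
        while tasks[name]["depends_on"]:
            name = max(tasks[name]["depends_on"], key=earliest_finish)
            path.append(name)
        return list(reversed(path)), earliest_finish(end)

    path, days = critical_path()
    print(path, days)
    # ['define requirements', 'configure software', 'convert data',
    #  'integration test'] 140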

61According to Institute of Electrical and Electronics Engineers (IEEE),
verification and validation processes for projects such as Navy ERP can be
used to determine whether (1) the products of a given activity conform to
the requirements of that activity and (2) the software satisfies its
intended use and user needs. This determination may include analyzing,
evaluating, reviewing, inspecting, assessing, and testing software
products and processes. The verification and validation processes should
assess the software in the context of the system, including the
operational environment, hardware, interfacing software, operators, and
users.

The IV&V reports should identify to senior management the project
management weaknesses that increase the risks associated with the project
so that they can be promptly addressed. The Navy ERP program's approach to
the verification and validation of its project management activities
relies on in-house subject matter experts and others who work for the
project team's Quality Assurance leader. The results of these efforts are
reported to the project manager. While various approaches can be used to
perform this function, such as using the Navy's approach or hiring a
contractor to perform these activities, independence is a key component to
successful verification and validation activities. The system developer
and project management office may have vested interests and may not be
objective in their self-assessments. Accordingly, performing verification
and validation activities independently of the development and management
functions helps to ensure that verification and validation activities are
unbiased and based on objective evidence. The Navy's adoption of
verification and validation processes is a key component of its efforts to
implement the disciplined processes necessary to manage this project.
However, Navy and DOD management cannot obtain reasonable assurance that
the processes have been effectively implemented since the present
verification and validation efforts are not conducted by an independent
party.

In response to the Ronald W. Reagan National Defense Authorization Act for
Fiscal Year 2005,62 DOD has established a hierarchy of investment review
boards from across the department to improve the control and
accountability over business system investments.63 The boards are
responsible for reviewing and approving investments to develop, operate,
maintain, and modernize business systems for their respective business

62Ronald W. Reagan National Defense Authorization Act for Fiscal Year
2005, Pub. L. No. 108-375, § 332, 118 Stat. 1811, 1851-56 (Oct. 28, 2004)
(codified, in part, at 10 U.S.C. §§ 186, 2222).

63The act requires the use of procedures for ensuring consistency with the
guidance issued by the Secretary of Defense and the Defense Business
Systems Management Committee and incorporation of common decision
criteria, including standards, requirements, and priorities that result in
the integration of defense business systems.

areas.64 The various boards are to report to the Defense Business Systems
Management Committee (DBSMC), which is ultimately responsible for the
review and approval of the department's investments in its business
systems. To help facilitate this oversight responsibility, the reports
prepared by the IV&V function should be provided to the appropriate
investment review board and the DBSMC to assist them in the decisionmaking
process regarding the continued investment in the Navy ERP. The
information in the reports should provide reasonable assurance that an
appropriate rate of return is received on the hundreds of millions of
dollars that will be invested over the next several years and that the
Navy ERP provides the promised capabilities.

To help ensure that the Navy ERP achieves its cost, schedule, and
performance goals, the investment review process should employ an early
warning system that enables decision makers to take corrective action at
the first sign of slippage. Effective project oversight requires regularly
reviewing the project's performance against stated expectations and
ensuring that corrective actions for each underperforming project are
documented, agreed to, implemented, and tracked until the desired outcome
is achieved.
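
A minimal sketch of such an early warning check appears below. The
milestone names, dates, and tolerance are hypothetical rather than taken
from the Navy ERP schedule; the sketch simply compares each forecast date
with its approved baseline and flags slippage beyond the tolerance so that
a corrective action can be opened, documented, and tracked to closure.

    from datetime import date

    # Hypothetical milestones: (name, baseline date, current forecast date).
    milestones = [
        ("Requirements baseline approved", date(2005, 12, 1), date(2005, 12, 1)),
        ("Interface designs complete",     date(2006, 6, 1),  date(2006, 7, 15)),
        ("Data conversion dry run",        date(2006, 10, 1), date(2006, 10, 10)),
    ]

    TOLERANCE_DAYS = 14  # slippage allowed before a corrective action is required

    def early_warning(milestones, tolerance=TOLERANCE_DAYS):
        """Return (milestone name, days of slippage) for any forecast beyond tolerance."""
        flagged = []
        for name, baseline, forecast in milestones:
            slip = (forecast - baseline).days
            if slip > tolerance:
                flagged.append((name, slip))
        return flagged

    for name, slip in early_warning(milestones):
        # In practice, each warning would open a corrective action that is
        # documented, agreed to, implemented, and tracked until resolved.
        print(f"WARNING: '{name}' is forecast to slip {slip} days past its baseline.")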

Conclusions	The lack of management control and oversight and a poorly
conceived concept resulted in the Navy largely wasting about $1 billion on
four ERP system projects that had only a limited positive impact on the
Navy's ability to produce reliable, useful, and timely information to aid
in its day-to-day operations. The Navy recognizes that it must have the
appropriate management controls and processes in place to have reasonable
assurance that the current effort will be successful. While the current
requirements management effort is adhering to the disciplined processes,
the overall effort is still in the early stages, and numerous challenges
and significant risks remain, such as validating data conversion efforts
and developing the many system interfaces. Given that the current effort
is not scheduled to be complete until 2011 and is currently estimated by
the Navy

64Approval authorities include (1) the Under Secretary of Defense for
Acquisition, Technology and Logistics; (2) the Under Secretary of Defense
(Comptroller); (3) the Under Secretary of Defense for Personnel and
Readiness; (4) the Assistant Secretary of Defense for Networks and
Information Integration/Chief Information Officer of the Department of
Defense; and (5) the Deputy Secretary of Defense or an Under Secretary of
Defense, as designated by the Secretary of Defense. These approval
authorities are responsible for the review, approval, and oversight of
business systems and must establish investment review processes for
systems under their cognizance.

to cost about $800 million, it is incumbent upon Navy and DOD management
to provide the vigilant oversight that was lacking in the four pilots.
Absent this oversight, the Navy and DOD run a higher risk than necessary
of finding, as has been the case with many other DOD business systems
efforts, that the system may cost more than anticipated, take longer to
develop and implement, and fail to provide the promised capabilities. In
addition, attempting large-scale systems modernization programs without a
well-defined architecture to guide and constrain business systems
investments, which is the current DOD state, presents the risk of costly
rework or even system failure once the enterprise architecture is fully
defined. Considering (1) the large investment of time and money
essentially wasted on the pilots and (2) the size, complexity, and
estimated costs of the current ERP effort, the Navy can ill afford another
business system failure.

Recommendations for Executive Action

To improve the Navy's and DOD's oversight of the Navy ERP effort, we
recommend that the Secretary of Defense direct the Secretary of the Navy
to require that the Navy ERP Program Management Office (1) develop and
implement the quantitative metrics needed to evaluate project performance
and risks and use the quantitative metrics to assess progress and
compliance with disciplined processes and (2) establish an IV&V function
and direct that all IV&V reports be provided to Navy management and to the
appropriate DOD investment review board, as well as to the project
management office.
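
As one illustration of the kind of quantitative metrics contemplated by
the first recommendation, the sketch below computes two common earned
value indices, the cost performance index (CPI) and the schedule
performance index (SPI). The dollar figures are invented for the example,
and the report does not prescribe these particular metrics; they simply
show how program status can be reduced to numbers that an oversight body
can track from period to period.

    # Hypothetical earned value figures for one reporting period, in millions of dollars.
    planned_value = 120.0   # budgeted cost of work scheduled to date
    earned_value  = 100.0   # budgeted cost of work actually performed
    actual_cost   = 130.0   # actual cost of the work performed

    cpi = earned_value / actual_cost     # below 1.0 indicates a cost overrun
    spi = earned_value / planned_value   # below 1.0 indicates schedule slippage

    print(f"CPI = {cpi:.2f}, SPI = {spi:.2f}")
    if cpi < 1.0 or spi < 1.0:
        print("Metric below 1.0: report to the oversight board and open a corrective action.")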

Furthermore, given the uncertainty of the DOD business enterprise
architecture, we recommend that the Secretary of Defense direct the DBSMC
to institute semiannual reviews of the Navy ERP to ensure that the project
continues to follow the disciplined processes and meets its cost,
schedule, and performance goals. Particular attention should be
directed towards system testing, data conversion, and development of the
numerous system interfaces with the other Navy and DOD systems.

Agency Comments and Our Evaluation

We received written comments on a draft of this report from the Deputy
Under Secretary of Defense (Financial Management) and the Deputy Under
Secretary of Defense (Business Transformation), which are reprinted in
appendix II. While DOD generally concurred with our recommendations, it
took exception to our characterization that the pilots were failures and a
waste of $1 billion.

Regarding the recommendations, DOD agreed that it should develop and
implement quantitative metrics that can be used to evaluate the Navy ERP
and noted that it intends to have such metrics developed by December 2005.
The department also agreed that the Navy ERP program management office
should establish an IV&V function and noted that the IV&V team will report
directly to the program manager. We reiterate the need for the IV&V
function to be completely independent of the project. As noted in the report,
performing IV&V activities independently of the development and management
functions helps to ensure that the results are unbiased and based on
objective evidence. Further, rather than having the IV&V reports provided
directly to the appropriate DOD investment review boards as we
recommended, DOD stated that the Navy management and/or the project
management office shall inform the Office of the Under Secretary of
Defense for Business Transformation of any significant IV&V results. We
reiterate our support for the recommendation that the IV&V reports be
provided to the appropriate investment review board so that it can
determine whether any of the IV&V results are significant. Again, by
providing the reports directly to the appropriate investment review board,
we believe there would be added assurances that the results were objective
and that the managers who will be responsible for authorizing future
investments in the Navy ERP will have the information needed to make the
most informed decision.

With regard to the reviews by the DBSMC, DOD partially agreed. Rather than
semiannual reviews by the DBSMC as we recommended, the department noted
that the components (e.g., the Navy) would provide briefings on their
overall efforts, initiatives, and systems during meetings with the DBSMC.
Given the significance of the Navy ERP, in terms of dollars and its
importance to the overall transformation of the department's business
operations, and the failure of the four ERP pilots, we continue to support
more proactive semiannual reviews by the DBSMC. As noted in the report,
the Navy's initial estimate is that the ERP will cost at least $800
million, and given the department's past difficulties in effectively
developing and implementing business systems, substantive reviews focused
solely on the Navy ERP, performed by individuals outside of the program
office and at the highest levels of management within the department, are
warranted. Further, we are concerned that the briefings contemplated for
the DBSMC may not necessarily address the Navy ERP or provide the detailed
discussion needed to offer the requisite level of confidence and assurance
that the project continues to follow disciplined processes, with
particular attention to numerous challenges such as system interfaces and
system testing.

In commenting on the report, the department depicted the pilots in a much
more positive light than we believe is merited. DOD pointed out that it
viewed the pilots as successful, exceeding initial expectations, and
forming the foundation upon which to build a Navy enterprise solution, and
took exception to our characterization that the pilots were failures and
largely a waste of $1 billion. As discussed in the report, the four pilots
were narrow in scope, and were never intended to be a corporate solution
for resolving any of the Navy's long-standing financial and business
management problems. We characterized the pilots as failures because the
department spent $1 billion on systems that did not result in marked
improvement in the Navy's day-to-day operations. While there may have been
marginal improvements, it is difficult to ascertain the sustained,
long-term benefits that American taxpayers will derive from the
$1 billion.

Additionally, the pilots present an excellent case study as to why the
centralization of the business systems funding would be an appropriate
course of action for the department, as we have previously recommended.65
Each Navy command was allowed to develop an independent solution that
focused on its own parochial interest. There was no consideration as to
how the separate efforts fit within an overall departmental framework, or,
for that matter, even a Navy framework. As noted in table 2, the pilots
performed many of the same functions and used the same software, yet were
not interoperable because of various inconsistencies in their design and
implementation.

Because the department followed the status quo, the pilots, at best,
provided the department with four more stovepiped systems that perform
duplicate functions. Such investments are one reason why the department
reported in February 200566 that it had 4,150 business systems. Further,
in its comments the department noted that one of the benefits of the
pilots was that they "proved that the Navy could exploit commercial ERP
tools without significant customization." Based upon our review and our
discussions with the program office, just the opposite occurred in the

65GAO-04-615.

66GAO-05-381.

pilots. Many portions of the pilots' COTS software were customized to
accommodate the existing business processes, which negated the advantages
of procuring a COTS package. Additionally, the department noted that one
of the pilots, SMART, on which, as noted in our report, the Navy spent
approximately $346 million through September 30, 2004, has already been
retired. We continue to question the overall benefit that the Navy and the
department derived from these four pilots and the $1 billion it spent.

As agreed with your offices, unless you announce the contents of this
report earlier, we will not distribute it until 30 days after its issuance
date. At that time, we will send copies to the Chairmen and Ranking
Minority Members, Senate Committee on Armed Services; Senate Committee on
Homeland Security and Governmental Affairs; Subcommittee on Defense,
Senate Committee on Appropriations; House Committee on Armed Services;
House Committee on Government Reform; and Subcommittee on Defense, House
Committee on Appropriations. We are also sending copies to the Under
Secretary of Defense (Comptroller); the Under Secretary of Defense
(Acquisition, Technology and Logistics); the Under Secretary of Defense
(Personnel and Readiness); the Assistant Secretary of Defense (Networks
and Information Integration); and the Director, Office of Management and
Budget. Copies of this report will be made available to others upon
request. In addition, the report will be available at no charge on the GAO
Web site at http://www.gao.gov. If you or your staff have any questions on
matters discussed in this report, please contact Gregory D. Kutz at (202)
512-9505 or [email protected] or Keith A. Rhodes at (202) 512-6412 or
[email protected]. Key contributors to this report are listed in
appendix IV. Contact points for the Offices of Congressional Relations and
Public Affairs are shown on the last page of the report.

Gregory D. Kutz
Managing Director
Forensic Audits and Special Investigations

Keith A. Rhodes
Chief Technologist
Applied Research and Methods
Center for Technology and Engineering

List of Requesters

The Honorable Tom Davis
Chairman
Committee on Government Reform
House of Representatives

The Honorable Christopher H. Shays
Chairman
Subcommittee on National Security, Emerging Threats,
  and International Relations
Committee on Government Reform
House of Representatives

The Honorable Todd R. Platts
Chairman
Subcommittee on Government Management, Finance
  and Accountability
Committee on Government Reform
House of Representatives

The Honorable Adam H. Putnam
House of Representatives

Appendix I

Scope and Methodology

To obtain a historical perspective on the planning and costs of the Navy's
four Enterprise Resource Planning (ERP) pilot projects, and the decision
to merge them into one program, we reviewed the Department of Defense's
(DOD) budget justification materials and other background information on
the four pilot projects. We also reviewed Naval Audit Service reports on
the pilots. In addition, we interviewed Navy ERP program management and
DOD Chief Information Officer (CIO) officials and obtained informational
briefings on the pilots.

To determine if the Navy has identified lessons learned from the pilots,
how they are being used, and the challenges that remain, we reviewed
program documentation and interviewed Navy ERP program officials. Program
documentation that we reviewed included concept of operations
documentation, requirements documents, the testing strategy, and the test
plan. In order to determine whether the stated requirements management
processes were effectively implemented, we performed an in-depth review
and analysis of seven requirements that relate to the Navy's problem
areas, such as financial reporting and asset management, and traced them
through the various requirements documents. The requirements were selected
in a manner that ensured each was included in the Navy's Financial
Improvement Plan.
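
The tracing described above can be viewed as a simple traceability check:
each selected requirement should appear, in consistent form, in every
layer of the program's requirements documentation. The sketch below is
illustrative only; the requirement identifiers and document names are
invented and do not reflect the Navy ERP's actual document structure.

    # Hypothetical traceability data: for each document layer, the requirement IDs it contains.
    documents = {
        "Concept of operations":    {"REQ-001", "REQ-002", "REQ-003"},
        "System requirements spec": {"REQ-001", "REQ-002", "REQ-003"},
        "Test plan":                {"REQ-001", "REQ-003"},
    }

    selected_requirements = ["REQ-001", "REQ-002", "REQ-003"]

    def trace(requirements, documents):
        """Report, for each requirement, the document layers in which it is missing."""
        gaps = {}
        for req in requirements:
            missing = [doc for doc, ids in documents.items() if req not in ids]
            if missing:
                gaps[req] = missing
        return gaps

    for req, missing in trace(selected_requirements, documents).items():
        print(f"{req} is not traced into: {', '.join(missing)}")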

Our approach to validating the effectiveness of the requirements
management process relied on a selection of seven requirements from
different functional areas. From the finance area, we selected the
requirement to provide reports of funds expended versus funds allocated.
From the intermediate-level maintenance management area, we selected the
requirement related to direct cost per job and forecasting accuracy. From
the procurement area, we selected the requirement to enable monitoring and
management of cost versus plan. In the plant supply functions area, we
reviewed the requirement related to total material visibility and access
of material held by the activity and the enterprise. From the wholesale
supply functions area, we selected the requirements of in-transit
losses/in-transit write-offs and total material visibility and access of
material held by the activity and the enterprise. Additionally, we
reviewed the requirement that the ERP be compliant with federal mandates
and requirements and the U.S. Standard General Ledger.

In order to provide reasonable assurance that our test results for the
selected requirements reflected the same processes used to document all
requirements, we did not notify the project office of the specific
requirements we had chosen until the tests were conducted. Accordingly,
the project office had to be able to respond to a large number of
potential requests rather than prepare for the selected requirements in
advance. Additionally, we obtained the list of systems the Navy ERP will
interface with and interviewed selected officials responsible for these
systems to determine what activities the Navy ERP program office is
working with them on and what challenges remain.

To determine if there are additional business practices that could be used
to improve management oversight of the Navy ERP, we reviewed industry
standards and best practices from the Institute of Electrical and
Electronics Engineers, the Software Engineering Institute, the Joint
Financial Management Improvement Program, GAO executive guides, and prior
GAO reports. Given that the Navy ERP effort is still in the early stages
of development, we did not evaluate all best practices. Rather, we
concentrated on those that could have an immediate impact in improving
management's oversight. We interviewed Navy ERP program officials and
requested program documentation to determine if the Navy ERP had addressed
or had plans for addressing these industry standards and best practices.

We did not verify the accuracy and completeness of the cost information
provided by DOD for the four pilots or the Navy ERP effort. We conducted
our work from August 2004 through June 2005 in accordance with U.S.
generally accepted government auditing standards.

We requested comments on a draft of this report from the Secretary of
Defense or his designee. We received written comments on a draft of the
report from the Deputy Under Secretary of Defense (Financial Management)
and the Deputy Under Secretary of Defense (Business Transformation), which
are reprinted in appendix II.

                                  Appendix II

                    Comments from the Department of Defense


Appendix III

Identification of the Navy and Defense Systems That Must Interface with
the ERP

                              Acronym System name

Navy systems

                      AIT Automated Information Technology

                   CAV II Commercial Asset Visibility System

CDF Consolidated Data File

CDMD-OA Configuration Data Manager's Database - Open Architecture

 CRCS-CADS Common Rates Computation System/Common Allowance Development System

DONIBIS Department of the Navy Industrial Budget Information System

G02APU G02 Annual Pricing Update

ITIMP Integrated Technical Item Management & Procurement

Manugistics Manugistics

MSWP Maintenance and Ship Work Planning

NALCOMIS OMA & OOMA Naval Aviation Logistics Command Management Information System (2 different versions)

NALDA Naval Aviation Logistics Data Analysis

Navy Data Mart Defense Civilian Personnel Data System

NDMS NAVAIR Depot Maintenance System

NES Navy Enlisted Personnel System

One Touch One Touch Supply System
OPINS Officer Personnel Information System
PBIS Program Budget Information System
RMAIS Regional Maintenance Automated Information System
SAS SAS Activity-Based Management
SDRS Supply Discrepancy Reporting System
SKED Preventative Maintenance Scheduling Program
SLDP Standard Logistics Data Procedures
TDSA Technical Directive Status Accounting System
TFMMS Total Force Manpower Management System
UADPS-SP/U2 Uniform Automated Data Processing System - Stock Points


                              Acronym System name

DOD systems

ADS DFAS Corporate Database

APVM Accounting Pre-Validation Module

CCR Central Contractor Registration System

DAAS Defense Automatic Addressing System

DCAS Defense Cash Accountability System

DCPS Defense Civilian Pay System

DDRS Defense Departmental Reporting System

DTS Defense Travel System

FLIS Federal Logistics Information System

FRS Financial Reporting System - Accounting

ICAPS Interactive Computer Aided Provisioning System

        MISIL Management Information System for International Logistics

PPVM Payment Pre-Validation Module

SPS Standard Procurement System

VPIS Vendor Pay Inquiry System

WAWF Wide Area Workflow
WINS Web Based Invoicing System

Source: Navy ERP program office.

Appendix IV

                        GAO Contacts and Acknowledgments

GAO Contacts	Gregory D. Kutz, (202) 512-9505 or [email protected]
Keith A. Rhodes, (202) 512-6412 or [email protected]

Acknowledgments	In addition to the contacts above, Darby Smith, Assistant
Director; J. Christopher Martin, Senior Level Technologist; Francine
DelVecchio; Kristi Karls; Jason Kelly; Mai Nguyen; and Philip Reiff made
key contributions to this report.

GAO's Mission	The Government Accountability Office, the audit, evaluation
and investigative arm of Congress, exists to support Congress in meeting
its constitutional responsibilities and to help improve the performance
and accountability of the federal government for the American people. GAO
examines the use of public funds; evaluates federal programs and policies;
and provides analyses, recommendations, and other assistance to help
Congress make informed oversight, policy, and funding decisions. GAO's
commitment to good government is reflected in its core values of
accountability, integrity, and reliability.

Obtaining Copies of GAO Reports and Testimony

The fastest and easiest way to obtain copies of GAO documents at no cost
is through GAO's Web site (www.gao.gov). Each weekday, GAO posts newly
released reports, testimony, and correspondence on its Web site. To have
GAO e-mail you a list of newly posted products every afternoon, go to
www.gao.gov and select "Subscribe to Updates."

Order by Mail or Phone	The first copy of each printed report is free.
Additional copies are $2 each. A check or money order should be made out
to the Superintendent of Documents. GAO also accepts VISA and Mastercard.
Orders for 100 or more copies mailed to a single address are discounted 25
percent. Orders should be sent to:

U.S. Government Accountability Office
441 G Street NW, Room LM
Washington, D.C. 20548

To order by Phone:	Voice: (202) 512-6000
			TDD: (202) 512-2537
			Fax: (202) 512-6061

To Report Fraud, Waste, and Abuse in Federal Programs

Contact:
Web site: www.gao.gov/fraudnet/fraudnet.htm
E-mail: [email protected]
Automated answering system: (800) 424-5454 or (202) 512-7470

Congressional Relations

Gloria Jarmon, Managing Director, [email protected], (202) 512-4400
U.S. Government Accountability Office, 441 G Street NW, Room 7125
Washington, D.C. 20548

Public Affairs

Paul Anderson, Managing Director, [email protected], (202) 512-4800
U.S. Government Accountability Office, 441 G Street NW, Room 7149
Washington, D.C. 20548
*** End of document. ***