DOD Systems Modernization: Planned Investment in the Naval	 
Tactical Command Support System Needs to be Reassessed		 
(05-DEC-05, GAO-06-215).					 
                                                                 
Because it is important that the Department of Defense (DOD)
adhere to disciplined information technology (IT) acquisition
processes to successfully modernize its business systems, GAO was
asked to determine whether the Naval Tactical Command Support	 
System (NTCSS) is being managed according to important aspects of
DOD's acquisition policies and guidance, as well as other	 
relevant acquisition management best practices. NTCSS was started
in 1995 to help Navy personnel effectively manage ship, 	 
submarine, and aircraft support activities. To date, about $1	 
billion has been spent to partially deploy NTCSS to about	 
one-half its intended ashore and afloat sites.			 
-------------------------Indexing Terms------------------------- 
REPORTNUM:   GAO-06-215 					        
    ACCNO:   A42697						        
  TITLE:     DOD Systems Modernization: Planned Investment in the     
Naval Tactical Command Support System Needs to be Reassessed	 
     DATE:   12/05/2005 
  SUBJECT:   Best practices					 
	     Best practices methodology 			 
	     Defense procurement				 
	     Economic analysis					 
	     Enterprise architecture				 
	     Evaluation criteria				 
	     Internal controls					 
	     Procurement planning				 
	     Procurement policy 				 
	     Program evaluation 				 
	     Naval Tactical Command Support System		 

GAO-06-215

     

     * Report to the Subcommittee on Readiness and Management Support,
       Committee on Armed Services, U.S. Senate
          * December 2005
     * DOD SYSTEMS MODERNIZATION
          * Planned Investment in the Naval Tactical Command Support System
            Needs to Be Reassessed
     * Contents
          * Results in Brief
          * Background
               * NTCSS Genesis and Status Overview
               * NTCSS Oversight and Management Roles and Responsibilities
               * NTCSS Participation in DOD's Rapid Improvement Team Pilot
               * Prior Review Identified Strengths and Weaknesses in DOD's
                 Acquisition Policies and Guidance
          * NTCSS Has Not Been Managed in Accordance with DOD and Other
            Relevant System Acquisition and Development Guidance
               * The Navy Has Not Economically Justified Investment in NTCSS
                 on the Basis of Costs and Benefits
                    * The Latest NTCSS Cost Estimate Was Not Derived Reliably
               * The Latest NTCSS Economic Analysis Did Not Meet Key Federal
                 Guidance
                    * The Latest NTCSS Economic Analysis Was Not
                      Independently Reviewed
               * The Navy Has Yet to Measure Whether Actual Benefits Have
                 Accrued from Deployed NTCSS Capabilities
               * The Navy Recently Decided to Prepare a Benefits Assessment
               * The Navy Has Not Defined and Developed NTCSS within the
                 Context of an Enterprise Architecture
               * Key Program Management and Oversight Activities Have Not
                 Been Effectively Performed
                    * The Navy is Not Adequately Measuring Progress Against
                      Planned Cost and Scheduled Work Commitments
                         * DOD Has Adopted Industry Standards for Earned
                           Value Management
                         * NTCSS Has Not Effectively Implemented EVM
                         * Two NTCSS Projects Illustrate How EVM Has Been
                           Poorly Implemented
                    * The Navy Has Not Adequately Reported NTCSS's Progress
                      and Problems
                         * Navy Reporting Requirements for NTCSS Have Changed
                           over the Last Several Years
                         * The Navy Has Not Satisfied All NTCSS Reporting
                           Requirements
                    * The Navy Has Not Properly Budgeted for NTCSS
                    * Navy Oversight of NTCSS Has Not Been Adequate
                         * The Milestone Decision Authority Has Not
                           Adequately Overseen the Program
                         * Other Navy Organizations Have Not Conducted
                           Program Oversight
               * NTCSS Requirements and Test Management Weaknesses Have
                 Contributed to Deployment Delays and System Quality Problems
                    * The Navy Has Not Adequately Managed Requirements for
                      the NTCSS Application Currently Under Development
                         * Requirements for OOMA Release 4.10 Were Not Traced
                         * Requirements for OOMA Release 4.10 Were Not
                           Prioritized
                    * The Navy's Developmental Testing for OOMA Has Not Been
                      Effective, but Improvements Planned
                         * Navy Operational Testing Organization Reported
                           That Developmental Testing Has Failed to Identify
                           Problems
                         * Developmental Test Documentation Has Not Been
                           Adequate, but Improvements Planned
                    * Central Design Agency Reports Management Improvements
                      are Under Way
          * Conclusions
          * Recommendations for Executive Action
          * Agency Comments and Our Evaluation
     * Objective, Scope, and Methodology
     * Trouble Reports and Change Proposals Assessment
          * Trouble Reports
          * Change Proposals
     * Earned Value Management Assessment
     * Comments from the Department of Defense
     * GAO Contact and Staff Acknowledgments

Report to the Subcommittee on Readiness and Management Support, Committee
on Armed Services, U.S. Senate

December 2005

DOD SYSTEMS MODERNIZATION

Planned Investment in the Naval Tactical Command Support System Needs to
Be Reassessed

December 5, 2005

The Honorable John Ensign
Chairman
The Honorable Daniel K. Akaka
Ranking Minority Member
Subcommittee on Readiness and Management Support
Committee on Armed Services
United States Senate

Because it is so important that the Department of Defense (DOD) adhere to
disciplined information technology (IT) acquisition processes in order to
successfully modernize its business systems, you requested that we
determine whether the department is following its own revised policies and
guidance for acquiring systems,1 which it issued in May 2003. As part of
our response to your request, we agreed to review the Naval Tactical
Command Support System (NTCSS) program. NTCSS was started in 1995 and is
intended to help Navy personnel effectively manage ships, submarines, and
aircraft support activities. The Navy expects to spend $348 million on
NTCSS between fiscal years 2006 and 2009, for a total of approximately
$1.45 billion since program inception.

As agreed, our objective was to determine whether NTCSS is being managed
according to important aspects of DOD's acquisition policies and guidance,
as well as other relevant acquisition management best practices. We
focused on the program's (1) economic justification; (2) architectural
alignment; (3) project management, including progress measurement,
progress reporting, funding disclosure, and oversight activities; and
(4) system development, including requirements management and testing. For
requirements management and testing, we focused on the NTCSS application
that is currently being developed, known as the Optimized Organizational
Maintenance Activity (OOMA).

We conducted our review from September 2004 through November 2005 in
accordance with generally accepted government auditing standards. For
details on our objective, scope, and methodology, see appendix I.

Results in Brief

The Navy has not managed its NTCSS program in accordance with key aspects
of the department's system acquisition policies and related guidance,
including federal and recognized best practice guidance. Collectively,
these policies and guidance are intended to reasonably ensure that
investment in a given IT system represents the right solution to fill a
mission need-and if it is, that acquisition and deployment of the system
are handled in a manner that maximizes the chances of delivering defined
system capabilities on time and within budget. In the case of NTCSS,
neither of these outcomes is being realized. As a result, the Navy does
not currently have a sufficient basis for determining whether NTCSS is the
right systems solution for its aircraft, ship, and submarine tactical
command support needs, and it has not pursued the proposed solution in a
way that increases the chances of delivering defined capabilities on time
and within budget. Key areas in which the
Navy did not follow relevant policies and guidance are described here.

o The Navy has not economically justified its ongoing and planned
investment in NTCSS on the basis of reliable estimates of future costs and
benefits. The most recent economic justification's cost estimates were not
reliably derived, and return on investment was not properly calculated. In
addition, independent reviews of the economic justification to determine
its reliability did not occur, and the Navy has not measured whether
already deployed and operating components of the system are producing
expected value.

o The Navy has not invested in NTCSS within the context of a well-defined
enterprise architecture, which is an institutional blueprint to guide and
constrain program investment decisions in a way that promotes
interoperability and reduces redundancy among related and dependent
systems. As we recently reported,2 DOD's business enterprise architecture
does not contain sufficient context (depth and scope of operational and
technical requirements) to effectively guide and constrain business
transformation and system modernization efforts. Further, the Navy does
not yet have a defined architecture, although it plans to develop one.
Investing in systems, in the absence of an enterprise architecture,
requires explicit recognition and deliberate consideration of the inherent
risks to ensure fully informed investment decision making.

o The Navy has not effectively performed key measurement, reporting, and
oversight activities. In particular, earned value management-a means for
determining and disclosing actual performance against budget and schedule
estimates and for revising estimates based on performance to date-has not
been implemented effectively (a simplified illustration of the basic earned
value calculations follows this list). Also, complete and current reporting
of NTCSS progress and problems in meeting cost, schedule, and performance
goals has not occurred, leaving oversight entities without the information
needed to mitigate risks, address problems, and take corrective action. In
addition, NTCSS budgets have not reflected the proper category of
appropriated funds associated with system development efforts. Further,
oversight entities' roles and responsibilities have not been fully
discharged.

o The Navy has not adequately conducted requirements management and
testing activities. For the NTCSS application that is currently under
development, the Navy has not adequately managed requirements, as
evidenced by the absence of requirements traceability to system design
specifications and testing documents, and the lack of prioritization of
the requirements. The lack of requirements traceability and other issues
have in turn contributed to problems with developmental testing, including
the failure of these tests to identify problems that subsequently
prevented the system from passing operational testing twice over the last
4 years. Based on the Navy's data, the recent trend in key indicators of
system maturity, such as the number and nature of reported system
problems and change proposals, shows that problems with NTCSS persist and
that addressing these problems could involve costly and time-consuming
rework.3
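
The following minimal sketch, in Python, illustrates the basic earned
value calculations referred to in the list above: cost and schedule
variances, performance indices, and an estimate at completion. The dollar
figures are illustrative assumptions only; they are not NTCSS program data.

    def evm_metrics(planned_value, earned_value, actual_cost, budget_at_completion):
        """Return basic earned value indicators (dollars in thousands)."""
        cost_variance = earned_value - actual_cost        # negative = over cost
        schedule_variance = earned_value - planned_value  # negative = behind schedule
        cpi = earned_value / actual_cost                  # cost performance index
        spi = earned_value / planned_value                # schedule performance index
        # A common estimate at completion: remaining work at the current cost efficiency.
        eac = actual_cost + (budget_at_completion - earned_value) / cpi
        return {
            "cost variance": cost_variance,
            "schedule variance": schedule_variance,
            "cost performance index": cpi,
            "schedule performance index": spi,
            "estimate at completion": eac,
        }

    # Hypothetical status: $12.0 million of work planned, $10.5 million performed,
    # $13.2 million spent, against a $40.0 million budget at completion.
    for name, value in evm_metrics(12_000, 10_500, 13_200, 40_000).items():
        print(f"{name}: {value:,.2f}")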

Reasons the Navy cited for not following policies and guidance included
questioning their applicability to the NTCSS program, having insufficient
time in which to apply them, and believing that plans to adopt them were
not meant to be applied retroactively. In some cases, the Navy did not
acknowledge that any deviations from policies and guidance had occurred,
but in these cases, it has yet to provide us with documentation
demonstrating that it did adhere to them. Collectively, this means that
after investing 10 years and about $1 billion in NTCSS, it is unclear
whether the Navy's planned future investment in the program is warranted.
Even if key uncertainties are addressed and it can be demonstrated that
NTCSS is the right solution, the manner in which NTCSS is being defined,
developed, tested, measured, and overseen is also of concern. Accordingly,
we are making recommendations to the Secretary of Defense aimed at
developing the basis needed to determine whether continued investment in
NTCSS is a prudent use of limited departmental resources. We are also
making recommendations to strengthen management of the program,
conditional upon a decision to proceed with further investment in the
NTCSS program.

The Office of the Assistant Secretary of Defense for Networks and
Information Integration provided written comments on a draft of the
report. In its comments, DOD concurred with two of the recommendations and
partially concurred with the remaining five. DOD also stated that while
some of our findings are valid, our overall findings significantly
understated and misrepresented the program's level of discipline and
conformance with applicable guidance and direction. We do not agree. Our
report cites numerous instances, supported by analyses, where the Navy did
not comply with either DOD acquisition policies and guidelines or industry
best practices. DOD's comments are reprinted in their entirety in appendix
IV of this report, along with our detailed responses to each.

Background

The Navy's primary mission is to organize, train, maintain, and equip
combat-ready naval forces capable of winning the global war on terror and
any other armed conflict, deterring aggression by would-be foes,
preserving freedom of the seas, and promoting peace and security. To
support this mission, the Navy performs a variety of interrelated and
interdependent business functions such as logistics and financial
management. The Navy requested, for fiscal year 2005, about $3.5 billion
to operate, maintain, and modernize its business systems and related IT
infrastructure that support these business functions. This request
represents about 27 percent of the $13 billion that DOD requested for all
of its business systems for fiscal year 2005. Of the 4,150 business
systems that DOD reports in its current inventory, the Navy accounts for
2,353, or about 57 percent, of the total.

In 1995, we designated DOD's business systems modernization efforts as
high risk, and we continue to designate them as such today4 for several
reasons, including the department's challenges in implementing effective
IT investment management structures and processes, developing and
implementing an enterprise architecture, and implementing effective IT
system acquisition and development processes.

NTCSS Genesis and Status Overview

In the early 1990s, the Navy employed a variety of IT systems to support
the management of information, personnel, materials, and funds required to
maintain and operate ships, submarines, and aircraft. Three core
systems-each managed by a separate program office and together consisting
of nine major applications-provided this support: (1) the Shipboard Non-Tactical
Automated Data Processing Program (SNAP), managed by the Space and Naval
Warfare Systems Command; (2) the Naval Aviation Logistics Command
Management Information System (NALCOMIS), managed by the Naval Air Systems
Command; and (3) the Maintenance Resource Management System (MRMS),
managed by the Naval Sea Systems Command. See table 1 for a description of
these three legacy systems and a list of their respective applications.

Table 1: Legacy Systems and Applications

Legacy system Description                          Application             
SNAP systems  Manages systems for maintenance,     SNAP I:                 
                 supply, and financial operations at                          
SNAP I        the organizational and intermediate  o Shipboard Uniform     
                 levels.a                             Automated Data          
SNAP II                                            Processing System       
                 Manages medical and dental services,                         
                 pay and personnel administration,    o Organizational        
                 food service, retail sales and       Maintenance Management  
                 service, training programs,          System                  
                 technical data storage and                                   
                 retrieval, support and test          o Administration Data   
                 equipment, and other mission         Management I            
                 support-related areas at the                                 
                 organizational level.                SNAP II:                
                                                                              
                 SNAP I was developed for the Navy's  o Supply and Financial  
                 larger ships, marine aviation        Management              
                 logistics squadrons,b training                               
                 sites, and selected activities       o Organizational        
                 ashore.                              Maintenance Management  
                                                      System II Maintenance   
                 SNAP II provides the same            Data System             
                 functionality as SNAP I, but it was                          
                 developed for use on smaller ships   o Administration Data   
                 and submarines. SNAP II was also     Management II           
                 modified to use microcomputers as    
                 the computing platforms when it is   
                 deployed on ships with constricted   
                 physical space; this version is      
                 known as MicroSNAP.                  
NALCOMIS      Supports day-to-day aircraft         o NALCOMIS              
                 maintenance and related material     Organizational          
                 maintenance functionality both at    Maintenance Activity    
                 sea and ashore.                                              
                                                      o NALCOMIS Intermediate 
                 Provides the initial maintenance     Maintenance Activity    
                 response when a problem is           
                 reported-including aircraft          
                 component troubleshooting,           
                 servicing, inspection, and removal   
                 and replacement at the               
                 organizational level.                
                                                      
                 Supports, at the intermediate        
                 maintenance level, the repair of     
                 components after defective parts     
                 have been removed from an aircraft   
                 and sent to a central location to be 
                 refurbished.                         
MRMS          Supports intermediate-level ship and o Maintenance Resource  
                 submarine maintenance at ashore      Management System       
                 facilities by providing management   
                 information such as planning,        
                 scheduling, workload forecasting,    
                 work progression, production         
                 control, productivity analysis, and  
                 resource management.                 

Source: Navy.

aThe "organizational" level is the first stage of aircraft maintenance
activity that is performed on individual planes and involves the upkeep
and servicing of the aircraft at the location where it is deployed, such
as a ship. Components or parts that cannot be repaired at the
organizational level are removed from the plane and sent to a central
location for repair. This second stage of maintenance is known as the
"intermediate" level, and it normally occurs on land. If the defective
part cannot be fixed at the intermediate level, it is then sent to a third
stage of maintenance, known as the "depot" level, which is not in the
scope of the NTCSS program.

bMarine aviation logistics squadrons are groups of planes that are
land-based but that can be deployed on an aircraft carrier for a specific
mission. When the mission is completed, these planes return to their land
base.

In 1992, we recommended that the Navy merge the management of all
shipboard nontactical programs under a single command that would have
authority and control over funding and development.5 In 1993, the Navy
developed a strategy to do so. In 1994, the Navy also identified a number
of problems with the three legacy systems. Specifically, the Navy
determined that (1) the individual systems did not consistently handle
increasing workloads and provide the flexibility to meet changing
operational demands; (2) the systems' software architectures were
ineffective and inefficient; (3) the hardware was outdated, slow,
expensive to maintain, and nonstandard; and (4) the systems could not
support modernized applications.

To address these concerns, the Navy initiated the NTCSS program in 1995 to
enhance the combat readiness of ships, submarines, and aircraft. To
accomplish this, NTCSS was to provide unit commanding officers and crews
with information about, for example, maintenance activities, parts
inventories, finances, technical manuals and drawings, and personnel.
According to the Navy, it spent approximately $1.1 billion for NTCSS from
its inception through fiscal year 2005 and expects to spend another $348
million between fiscal years 2006 and 2009, for a total of approximately
$1.45 billion.

The Navy defined a three-stage acquisition process for NTCSS.

Stage 1: Purpose was to replace hardware in order to establish a common
infrastructure across all force-level ships, unit-level ships, aviation
squadrons, Naval air stations, marine aviation logistics squadrons, and
other maintenance activities-both at sea and ashore.6 During this stage,
software and business processes were not to be changed. This phase was
begun in 1994 under the legacy SNAP and NALCOMIS programs and, according
to program officials, it is fundamentally complete-although technology
refresh or replacement activities are still occurring.

Stage 2: Purpose was to provide the functionality of the legacy systems
software with more efficient, more easily maintained software and to
eliminate functional overlap among the systems. This stage was to involve
software technology modernization but no changes in software functionality
or business processes. Existing legacy systems used flat files and
hierarchical databases, which were to be converted to relational
databases, and the existing application code was to be rewritten using
modern software languages. A common hardware and systems software
environment was also to be implemented, and functionality found in eight
of the nine legacy applications was to be consolidated and rewritten as
four new NTCSS applications. Development of these four applications began
in 1995 and was reportedly completed in 2000. This stage was known as
NTCSS Optimization. See table 2 for a description of the functionality of
these new applications.
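
As a simplified illustration of the data restructuring described for this
stage, and not the Navy's actual design, the following Python sketch shows
a flat-file style maintenance record split into normalized relational
tables. The table and field names are hypothetical.

    import sqlite3

    conn = sqlite3.connect(":memory:")
    conn.executescript("""
        CREATE TABLE unit (
            unit_id INTEGER PRIMARY KEY,
            name    TEXT NOT NULL            -- e.g., a ship, submarine, or squadron
        );
        CREATE TABLE maintenance_action (
            action_id   INTEGER PRIMARY KEY,
            unit_id     INTEGER NOT NULL REFERENCES unit(unit_id),
            equipment   TEXT NOT NULL,
            description TEXT NOT NULL,
            status      TEXT NOT NULL
        );
    """)
    # A single flat-file record such as "SHIP-01|pump|replace seal|open" becomes
    # related rows keyed by unit_id rather than repeated text fields.
    conn.execute("INSERT INTO unit VALUES (1, 'SHIP-01')")
    conn.execute(
        "INSERT INTO maintenance_action VALUES (1, 1, 'pump', 'replace seal', 'open')")
    for row in conn.execute(
            "SELECT u.name, m.equipment, m.status "
            "FROM maintenance_action m JOIN unit u ON u.unit_id = m.unit_id"):
        print(row)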

Stage 3: Purpose was to improve NTCSS's functionality by implementing
business process improvements. According to Navy officials, this stage is
known as NTCSS Modernization and, to date, includes two efforts:
(1) replacing the last legacy application and (2) creating a Web-enabled
version of the three unit-level Optimized NTCSS applications that were
developed under Stage 2. See table 3 for a description of the
functionality of these business process improvements.

Table 2: Optimized Applications Developed During Stage 2 of the NTCSS
Program

NTCSS Optimized     Description                       Status               
applications                                          
Relational Supply   Supports supply chain management, Operational, as of   
                       inventory management, and         September 1998, on   
                       financial management processes.   large force-level    
                                                         ships, smaller       
                       Provides Navy personnel with      unit-level ships,    
                       access to the supply support      and at air stations  
                       functions they perform most       and marine aviation  
                       often-ordering, receiving, and    logistics            
                       issuing necessary supplies and    squadrons.a          
                       materials; maintaining financial  
                       records; and reconciling supply,  
                       inventory, and financial records  
                       with the Navy's shore             
                       infrastructure.                   
Organizational      Assists shipboard personnel in    Operational, as of   
Maintenance         planning, scheduling, reporting,  September 1998,      
Management          and tracking maintenance and      primarily on large   
System-Next         related logistics support         force-level ships    
Generation          actions.                          and smaller          
                                                         unit-level ships.    
                       Maintains online lists of         
                       maintenance actions to be         
                       performed, parts required to      
                       maintain shipboard equipment, and 
                       parts carried onboard ship to     
                       support maintenance actions.      
                                                         
                       Interfaces with Relational Supply 
                       to requisition parts that are not 
                       onboard.                          
Relational          Automates the management of       Operational, as of   
Administration Data personnel awards and decorations, April 2000, on large 
Management          work assignments, and berthing    force-level ships,   
                       assignments.                      smaller unit-level   
                                                         ships, and at air    
                                                         stations and marine  
                                                         aviation logistics   
                                                         squadrons.           
Optimized           Provides online                   Operational, as of   
Intermediate        intermediate-level aviation       April 2000, at       
Maintenance         maintenance, configuration, and   force-level ships    
Activities          logistics management support.     and at air stations  
                                                         and marine aviation  
                       Interfaces with other major       logistics squadrons. 
                       integrated logistics support      
                       systems within the Naval aviation 
                       community.                        

Source: Navy.

aRelational Supply is also in use at additional sites that are not a part
of the NTCSS program.

Table 3: Modernized Applications Developed During Stage 3 of the NTCSS
Program

NTCSS modernized Description                        Status                 
applications                                        
Optimized        Is to support day-to-day           Initiated in 1999,     
Organizational   maintenance management tools for   withdrawn from         
Maintenance      aviation squadrons and other       operational testingb   
Activity (OOMA)  organizational-level maintenance   in April 2001 when it  
                    activities.                        became clear that it   
                                                       would fail. Failed     
                    Is to provide the foundation for   operational testing    
                    achieving a completely automated   again in May 2004.     
                    maintenance environment, such as a Scheduled for third    
                    single point of data entry,        operational test in    
                    automated and assisted pilot and   the third quarter of   
                    maintenance debrief, online        fiscal year 2006.      
                    diagnostics, structural life                              
                    prognostics,a interactive          Fielded at 77 sites as 
                    electronic technical manuals, and  of June 2005.          
                    forecasting and tracking of        
                    maintenance schedules.             
eNTCSS           Was to provide a Web-enabled       Initiated in 2001.     
                    version of NTCSS, and allow users  Cancelled in April     
                    to access the three unit-level     2004.                  
                    Optimized applications from any                           
                    workstation on a ship's local area Fielded on one         
                    network via a standard Web browser submarine and          
                    and to execute work activities in  scheduled to be        
                    a Web-server environment.          fielded on one more.   
                                                                              
                                                       Is to be replaced with 
                                                       the Optimized          
                                                       applications, but a    
                                                       date has yet to be     
                                                       determined.            

Source: Navy.

aAccording to the U.S. Marine Corps Logistics Directorate, structural life
prognostics is defined as the ability to reliably predict the remaining
useful life of mechanical or structural components, within an actionable
time period, and within acceptable confidence limits.

bAccording to the DOD Defense Acquisition Guidebook, the primary purpose
of operational test and evaluation is for representative users to evaluate
systems in a realistic environment in order to determine whether these
systems are operationally effective and suitable for their intended use
before production or deployment.

As of April 2005, legacy applications were still in use at 51 percent of
the Navy's 659 sites. Each of these 659 sites has legacy, Optimized, or
modernized applications. Table 4 shows the distribution of the legacy,
Optimized, and modernized applications.

Table 4: Applications in Operation as of April 2005

Applications                           Number of sites Percentage of total 
Legacy applications                                    
SNAP Ia, b                                          10 
SNAP IIa, b, c                                      68 
MicroSNAP                                           32 
NALCOMIS Organizational Maintenance                214 
Activityd                                              
NALCOMIS Intermediate Maintenance                   10 
Activityb                                              
Maintenance Resource Management                      2 
Systeme                                                
Subtotal                                           336                  51 
Optimized applicationsf                                
Relational Supplyc                                     
Organizational Maintenance Management                  
System - Next Generation                               
Relational Administration Data                         
Management                                             
Optimized Intermediate Maintenance                     
Activities                                             
Subtotal                                           229                  35 
Modernized applications                                
Optimized Organizational Maintenance                93 
Activity                                               
eNTCSS                                               1 
Subtotal                                            94                  14 
Total                                              659                 100 

Source: Navy.

aSNAP I and SNAP II are each composed of three different legacy
applications (see table 1).

bThe Navy plans to decommission some of the ships that use these
applications and upgrade the remaining ships to NTCSS Optimized
applications.

cThis application also is in use at additional sites that are not a part
of the NTCSS program.

dThe functionality included in this application is to be replaced in the
future by Optimized Organizational Maintenance Activity.

eThe Navy plans to incorporate this functionality into Organizational
Maintenance Management System-Next Generation at a future date.

fThese four applications are deployed as a single software package at all
229 sites.

According to Navy officials, about $1.1 billion was spent on NTCSS between
1995 and 2005. This includes about $1 billion on NTCSS Optimized
applications7 and $91 million on OOMA and eNTCSS. Table 5 shows NTCSS's
budget totals from the time the program began in fiscal year 1995 through
fiscal year 2005.

Table 5: NTCSS Budget from FY 1995 through FY 2005

Dollars in thousands
           FY 95  FY 96  FY 97   FY 98   FY 99   FY 00   FY 01   FY 02  FY 03   FY 04  FY 05     Total 
NTCSS     83,537 69,794 69,075 123,469 119,822  91,053  95,322  95,549 82,708 108,087 71,926 1,010,342 
Optimized                                                                                    
OOMA                920    700     983   4,724  16,527  20,854  14,920  3,981   2,871 13,291    79,771 
eNTCSS                                                   5,000   5,309    985                   11,294 
Total     83,537 70,714 69,775 124,452 124,546 107,580 121,176 115,778 87,674 110,958 85,217 1,101,407 

Source: Navy.

NTCSS Oversight and Management Roles and Responsibilities

A number of Navy and DOD organizations are involved in overseeing and
managing the NTCSS program. Table 6 lists the organizations involved in
NTCSS oversight and their respective roles and responsibilities.

Table 6: NTCSS Oversight Roles and Responsibilities

Oversight entity               Roles and responsibilities                  
Deputy Assistant Secretary of  Currently serves as the milestone decision  
the Navy for Command, Control, authority. Assigned overall responsibility  
Communication, Computers and   for the NTCSS program; approves the program 
Intelligence, and Space        to proceed through its acquisition cycle on 
                                  the basis of a review of key documents,     
                                  such as an acquisition plan, an             
                                  independently evaluated life cycle          
                                  cost-and-benefits estimate, Acquisition     
                                  Program Baseline documents, and Defense     
                                  Acquisition Executive Summary reports.      
Program Executive Office for   Serves as the program executive office.     
Command, Control,              Assigned overall responsibility for NTCSS   
Communication, Computers and   program oversight; reviews the component    
Intelligence, and Space; Space cost analysis, acquisition strategy, and    
and Naval Warfare Systems      Acquisition Program Baseline prior to       
Command                        approval by the milestone decision          
                                  authority.                                  
Department of Navy Chief       Reviews the acquisition program during the  
Information Officer            department's planning, programming,         
                                  budgeting, and execution processes to       
                                  ensure that the program's goals are         
                                  achievable and executable; ensures          
                                  conformance to appropriation law, financial 
                                  management regulations, and Navy, DOD, and  
                                  federal IT policies in several areas (e.g., 
                                  security, architecture, and investment      
                                  management); works closely with the program 
                                  office during milestone review assessments. 
Assistant Secretary of the     Ensures system compliance with              
Navy, Research Development and architectural standards and promotes        
Acquisition, Chief Engineer    interoperability of the Navy's systems.     
Office of the Secretary of     Verifies and validates the reliability of   
Defense, Office of the         cost and benefit estimates found in         
Director for Program Analysis  economic analyses and provides its results  
and Evaluation                 to the milestone decision authority.        
Naval Cost Analysis Division   Performs independent cost estimates,        
                                  maintains cost analysis tools, and focuses  
                                  on cost analysis policy and oversight.      
Executive Steering Committee   Establishes priorities for NTCSS            
                                  development and implementation and for      
Members are representatives    defining long-term architectural goals;     
from:                          meets after regularly scheduled NTCSS       
                                  meetings (e.g., Requirements Integrated     
Office of the Chief of Naval   Product Team meetings and Forum meetings).a 
Operations for Material        
Readiness and Logistics        
Operations (Chairman);         
                                  
Commander in Chief, U.S.       
Atlantic Fleet;                
                                  
Commander in Chief, U.S.       
Pacific Fleet;                 
                                  
Commandant of the Marine       
Corps; and                     
                                  
Program Executive Office for   
Command, Control,              
Communication, Computers and   
Intelligence, and Space.       

Source: Navy.

aThe Requirements Integrated Product Team is chartered to collect and
analyze users' requirements, input these requirements into the NTCSS
requirements management process, and provide recommendations to the
program office on these requirements. The Forum brings together
stakeholders and acquisition and development personnel to (1) discuss
issues and requirements related to current and future system readiness,
(2) develop specific action items and recommendations that will result in
improved program products and services to the Fleet, and (3) facilitate
key decisions by senior program leadership at Executive Steering Committee
meetings.

There have been three milestone decision authorities for NTCSS since the
program was begun. Initially, the milestone decision authority was in the
Office of the Assistant Secretary of Defense for Networks and Information
Integration/Chief Information Officer. In July 1999, this authority was
delegated to the Assistant Secretary of the Navy for Research,
Development, and Acquisition, who then delegated oversight authority to
Deputy Assistant Secretary of the Navy for Command, Control,
Communication, Computers and Intelligence, and Space in March 2000.

Table 7 lists the organizations involved in NTCSS management and their
respective roles and responsibilities.

Table 7: NTCSS Management and Stakeholder Roles and Responsibilities

Entity                     Roles and responsibilities                      
Program Manager, Warfare;  Serves as the program office. Assigned          
Space and Naval Warfare    responsibility for day-to-day program           
Systems Command            management of NTCSS and, as such, is the single 
                              point of accountability for managing the        
                              program's objectives through development,       
                              production, and sustainment. Manages cost,      
                              schedule, and performance reporting. Prepares   
                              and updates the acquisition strategy, component 
                              cost analysis, and acquisition program          
                              baselines. Coordinates all testing activities   
                              in coordination with requirements.              
Space and Naval Warfare    Serves as the central design agency. Assigned   
Systems Command, Systems   responsibility for software development,        
Center Norfolk             including application design, development, and  
                              testing activities. Responsible for managing    
                              trouble reports and change proposals.a Manages  
                              Space and Naval Warfare Systems Command,        
                              Systems Center Norfolk Detachment San Diego,    
                              which installs the initial NTCSS systems on     
                              ships, submarines, and at land sites and        
                              performs subsequent on-site software            
                              maintenance.                                    
Space and Naval Warfare    Serves as the in-service engineering activity.  
Systems Command, Systems   Provides engineering support and installs and   
Center Charleston          integrates hardware.                            
Office of the Chief of     Serves as the program and resource sponsor.     
Naval Operations for       Balances user requirements with available       
Material Readiness and     resources. Works with users to ensure that      
Logistics Operations       operational and functional requirements are     
                              prioritized correctly and are supported.        
                              Addresses various issues pertaining to Navy     
                              policy, requirements, resources, and schedules. 
Functional Managers        Represent the system users. Participate in the  
                              process of establishing functional requirements 
Includes representatives   for input into the change management and system 
from:                      design processes. Prepare test plans and test   
                              analysis reports to support functional          
Naval Sea Systems Command; certification of software.                      
                              
Naval Supply Systems       
Command;                   
                              
Naval Air Systems Command; 
and                        
                              
Commander in Chief, U.S.   
Atlantic Fleet.            

Source: Navy.

aNavy officials provided data regarding trouble reports and change
proposals for the Optimized and modernized NTCSS applications. For details
see appendix II.

NTCSS Participation in DOD's Rapid Improvement Team Pilot

In 2001, the DOD Chief Information Officer and the Undersecretary of
Defense for Acquisition, Technology, and Logistics chartered a pilot
project aimed at saving time by significantly reducing the reporting and
oversight requirements. The ultimate goal was to enable the acquisition
process to deliver mission-effective IT systems within 18 months. Known as
the Rapid Improvement Team (RIT) for IT Acquisition Management
Transformation, the pilot was to cover a 2-year period from January 1,
2002, through December 31, 2003. Nine programs from the military services
participated in the pilot. NTCSS was selected to participate in the pilot
by its milestone decision authority due to its longevity and because of
its perceived low risk, stability, and compliance with IT management best
practices. It was also believed that little system development remained to
be done. NTCSS began participating in the RIT pilot in October 2002.

The RIT pilot relieved the program office of the normal acquisition
process activities, such as preplanned, formal milestone decision reviews
or briefings, and it granted the program office the authority to pass key
milestones once it determined that established requirements had been met.
This streamlined approach was considered possible because all information
related to these requirements was to be continually updated and available
to oversight organizations and stakeholders via a RIT Web site. More
specifically, the program office was to update the Web site monthly via a
set of electronic forms with the kind of data that were traditionally
found in DOD oversight documents. The program office was also to use the
Web site to input key acquisition documents (e.g., acquisition plans,
economic analyses, requirements documents and test plans) in an electronic
library. In turn, the milestone decision authority and other oversight
organizations were to review these data on at least a monthly basis and
directly retrieve any acquisition documents to be reviewed from the
library. No response from the milestone decision authority would indicate
implicit approval of the program data. Although the formal RIT pilot ended
in December 2003, program officials told us that they continued to operate
using the RIT pilot's procedures and continued to update program
information on the Web site through December 2004.

According to a memorandum issued by the Office of the Assistant Secretary
of Defense for Networks and Information Integration/Chief Information
Officer and the Undersecretary of Defense for Acquisition, Technology, and
Logistics, the principal output of the pilot would be a blueprint for IT
acquisition that is transferable to other systems. A report summarizing
the results of the entire RIT pilot program was published in April 2005.8
This report concluded that (1) by instituting risk-based
governance, the milestone decision authority can be assigned to an
organization subordinate to the Office of the Secretary of Defense without
adding unacceptable risk to the investment process and (2) the success of
risk-based governance and cycle time reduction is predicated on the
adoption of net-centricity9 by the business community.

Prior Review Identified Strengths and Weaknesses in DOD's Acquisition
Policies and Guidance

In July 2004, we reported10 that DOD's revised systems acquisition
policies and guidance incorporated many best practices for acquiring
business systems, such as (1) justifying system investments economically,
on the basis of costs, benefits, and risks, and (2) continually measuring
an acquisition's performance, cost, and schedule against approved
baselines. However, the revised policies and guidance did not incorporate
a number of other best practices, particularly those associated with
acquiring commercial component-based business systems, and DOD did not
have documented plans for incorporating these additional best practices
into its policies. We also reported that the department's revised
acquisition policies did not include sufficient controls to ensure that
military services and defense agencies would appropriately follow these
practices. We concluded that, until these additional best practices were
incorporated into DOD's acquisition policies and guidance, there was
increased risk that system acquisitions would not deliver planned
capabilities and benefits on time and within budget and increased risk
that an organization will not adopt and use best practices that were
defined. Accordingly, we made 14 recommendations to the Secretary of
Defense that were aimed at strengthening DOD's acquisition policy and
guidance by including additional IT systems acquisition best practices and
controls for ensuring that these best practices were followed. DOD agreed
with most of our recommendations and has since issued additional system
acquisition guidance.11

NTCSS Has Not Been Managed in Accordance with DOD and Other Relevant
System Acquisition and Development Guidance

DOD system acquisition and development policies and guidance, along with
other federal and best practices guidance, provide an effective framework
within which to manage IT business system programs and investments, like
NTCSS. Proper implementation of this framework can minimize program risks
and better ensure that system investments deliver promised capabilities
and benefits on time and within budget. The Navy has not managed NTCSS in
accordance with many key aspects of these policies and guidance. For
example, the Navy has not economically justified its investment in NTCSS
on the basis of cost and benefits. It has not invested in NTCSS within the
context of a well-defined enterprise architecture. Further, the Navy has
not effectively performed key measurement, reporting, and oversight
activities, and has not adequately conducted requirements management and
testing activities. Reasons the Navy cited for not following policies and
guidance included questioning their applicability to the NTCSS program,
having insufficient time in which to apply them, and believing that plans
to adopt them were not meant to be applied retroactively. In some cases,
the Navy did not acknowledge that any deviations from policies and
guidance had occurred but, in these cases, it has yet to provide us with
documentation demonstrating that it did adhere to them. As a result, the
Navy does not currently have a sufficient basis for determining whether
NTCSS is the right system solution for its tactical command support needs,
and it has not pursued the proposed solution in a way that increases the
likelihood of delivering defined capabilities on time and within budget.

The Navy Has Not Economically Justified Investment in NTCSS on the Basis
of Costs and Benefits

The decision to invest in any system should be based on reliable analyses
of estimated system costs and expected benefits over the life of the
program. DOD policy requires such analyses, and other relevant acquisition
management practices provide guidance on how these analyses should be
prepared. However, the current economic analysis for the NTCSS program
does not meet this guidance. Additionally, the analysis was not
independently reviewed in accordance with DOD guidance. Finally, contrary
to DOD policy and relevant acquisition management practices, the Navy has
not demonstrated that NTCSS Optimized applications are producing expected
benefits. Without such reliable analyses, an organization cannot know
whether a given system investment is justified.
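
For illustration only, the following minimal Python sketch shows the kind
of quantitative comparison such an economic justification entails:
discounting estimated life cycle costs and benefits to a net present value
and a return on investment. The cash flows and discount rate are
hypothetical assumptions and are not drawn from any NTCSS analysis.

    def net_present_value(net_cash_flows, discount_rate):
        """Discount yearly net cash flows (benefits minus costs); year 0 is first."""
        return sum(flow / (1 + discount_rate) ** year
                   for year, flow in enumerate(net_cash_flows))

    # Hypothetical alternative: a $30 million investment in year 0, followed by
    # $6 million per year in net benefits over a 10-year life cycle, at 3 percent.
    flows = [-30.0] + [6.0] * 10
    npv = net_present_value(flows, 0.03)
    roi = npv / 30.0
    print(f"Net present value: ${npv:.1f} million; return on investment: {roi:.0%}")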

The Latest NTCSS Cost Estimate Was Not Derived Reliably

According to DOD guidance,12 the cost estimates used to economically
justify an investment should be reasonable, traceable, and based on
realistic assumptions. Our research shows that a reliable cost estimate
should meet nine specific criteria developed by Carnegie Mellon
University's Software Engineering Institute (SEI),13 such as appropriately
sizing the task being estimated and identifying and explaining estimate
assumptions.

In March 2004, the NTCSS program office prepared its fifth NTCSS economic
analysis. This analysis examined the costs associated with three
alternative NTCSS hardware, software, operating system, and database
management configurations, and it was to be used to inform decisions about
system development and implementation. The analysis did include estimated
costs for each alternative. However, it did not include measurable,
quantifiable benefits for each alternative. Rather, it included only
qualitative benefits. Further, the cost estimates used in this analysis
did not meet six of the nine criteria associated with reliable cost
estimates. For example, while the estimate's purpose was stated in
writing, the system life cycle used was 6 years rather than the 10 years
recommended. Also, documentation showing that the costs were based on data
from the program's demonstrated accomplishments has yet to be provided to
us, and the assumptions used to create the cost estimate were not
identified and explained. See table 8 for the results of our analyses
relative to each of the nine criteria.

Table 8: Navy Satisfaction of Cost Estimating Criteria

Criterion            Explanation             Criterion GAO analysis
                                             meta
The objectives of    The objectives of the   Yes       The objective of    
the program are      program should be                 the program was     
stated in writing.   clearly and concisely             clearly stated.     
                        stated for the cost               
                        estimator to use.                 
The life cycle to    The life cycle should   No        The life cycle was  
which the estimate   be clearly defined to             not clearly defined 
applies is clearly   ensure that the full              to ensure that the  
defined.             cost of the                       full cost of the    
                        program-that is, all              program is          
                        direct and indirect               included. The life  
                        costs for planning,               cycle defined was 6 
                        procurement, operations           years past full     
                        and maintenance, and              operational         
                        disposal-are captured.            capability, instead 
                        For investments such as           of the full 10      
                        NTCSS, the life cycle             years defined in    
                        should cover 10 years             the DOD guidance.   
                        past full operational             
                        capability of the                 
                        system.b                          
The task has been    An appropriate sizing   Yes       The method used in  
appropriately sized. metric should be used             the model lends     
                        in the development of             itself to being     
                        the estimate, such as             appropriately       
                        the amount of software            sized.              
                        to be developed and the           
                        amount of software to             
                        be revised.                       
The estimated cost   Estimates should be     No        No documentation    
and schedule are     validated by relating             was provided to     
consistent with      them back to                      show the use of     
demonstrated         demonstrated and                  historical data to  
accomplishments on   documented performance            produce the         
other projects.      on completed projects.            estimate.           
A written summary of If a parametric         No        The model used      
parameter values and equation was used to              undocumented values 
their rationales     generate the estimate,            as the source of    
accompany the        the parameters that               the estimate for    
estimate.            feed the equation                 multiple elements.  
                        should be provided,               
                        along with an                     
                        explanation of why they           
                        were chosen.                      
Assumptions have     Accurate assumptions    No        Any assumptions     
been identified and  regarding issues such             used in the model   
explained.           as schedule, quantity,            were not            
                        technology, development           identified.         
                        processes,                        
                        manufacturing                     
                        techniques, software              
                        language, etc., should            
                        be understood and                 
                        documented.                       
A structured         A work breakdown        Yes       A work breakdown    
process, such as a   structure or similar              structure was       
template or format,  structure that                    provided and        
has been used to     organizes, defines, and           included all the    
ensure that key      graphically displays              standard elements.  
factors have not     the individual work               
been overlooked.     units to be performed             
                        should be used. The               
                        structure should be               
                        revised over time as              
                        more information                  
                        becomes known about the           
                        work to be performed.             
Uncertainties in     For all major cost      No        No risk analysis    
parameter values     drivers, an uncertainty           was documented in   
have been identified analysis should be                the estimate.       
and quantified.      performed to recognize            
                        and reflect the risk              
                        associated with the               
                        cost estimate.                    
If more than one     The primary methodology No        No secondary model  
cost model or        or cost model results             was discussed in    
estimating approach  should be compared with           the estimate        
has been used, any   any secondary                     documentation.      
differences in the   methodology (for                  
results have been    example, cross-checks)            
analyzed and         to ensure consistency.            
explained.                                             

Sources: SEI criteria, DOD guidance, and GAO analysis of Navy data.

a"Yes" means that the program provided documentation demonstrating
satisfaction of the criterion. "Partially" means that the program provided
documentation demonstrating satisfaction of part of the criterion. "No"
means that the program has yet to provide documentation demonstrating
satisfaction of the criterion.

bDOD, DOD Automated Information System (AIS) Economic Analysis (EA) Guide,
May 1, 1995.

Program officials told us that they did not develop the 2004 cost estimate
in accordance with all of the SEI cost estimating criteria because they
had only a month to complete the economic analysis. By not following
practices associated with reliable estimates, the Navy has decided on a
course of action that is not based on one of the key ingredients of sound
and prudent decision making: a reliable estimate of system life cycle
costs. Among other things, this means that the investment decision made by
the Navy has not been adequately justified and that, to the extent that
program budgets were based on cost estimates, the likelihood of funding
shortfalls and inadequate funding reserves is increased.

The Latest NTCSS Economic Analysis Did Not Meet Key Federal Guidance

According to Office of Management and Budget (OMB) guidance,14 economic
analyses should meet certain criteria to be considered reasonable, such as
comparing alternatives on the basis of net present value and conducting an
uncertainty analysis of costs and benefits.
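
As a simple illustration of the net present value comparison that this
guidance calls for, the sketch below (in Python) discounts hypothetical
net cash flows for two alternatives. The alternatives, dollar amounts, and
7 percent discount rate are placeholders of our own; they are not drawn
from the NTCSS analysis, and actual rates are published in Appendix C of
OMB Circular No. A-94. An uncertainty analysis would, in addition, vary
these inputs to show how sensitive the ranking is to the underlying
estimates.

    # Illustrative only: a net present value comparison of the kind called
    # for by OMB guidance. All figures are hypothetical, not NTCSS data.

    def npv(net_cash_flows, rate):
        """Discount annual (benefits minus costs) amounts back to year 0."""
        return sum(cf / (1.0 + rate) ** year
                   for year, cf in enumerate(net_cash_flows))

    DISCOUNT_RATE = 0.07  # placeholder real discount rate

    alternatives = {
        # annual net cash flows (monetized benefits minus costs), in thousands
        "Alternative A": [-5000, 1200, 1500, 1800, 1800, 1800],
        "Alternative B": [-3500, 900, 1100, 1300, 1300, 1300],
    }

    for name, flows in alternatives.items():
        print(f"{name}: net present value = "
              f"{npv(flows, DISCOUNT_RATE):,.0f} thousand dollars")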

The latest NTCSS economic analysis, prepared in March 2004, identified
potential costs and benefits from three alternative NTCSS hardware,
software, operating system, and data base management configurations.
However, the analysis provided only monetized costs for each alternative.
It did not provide monetized benefits. Further, the analysis did not meet
five of eight OMB criteria. For example, the alternatives were not
compared on the basis of their net present values, an appropriate interest
rate was not used to discount the net present values, and the uncertainty
associated with the cost estimates was not disclosed and used in the
analysis. See table 9 for the results of our analyses relative to each of
the eight criteria.

Table 9: Navy Satisfaction of OMB Economic Analysis Criteria

                                        

       Criterion           Explanation       Criterion      GAO analysis      
                                               met(a)
The economic       The economic analysis  Yes       The economic analysis  
analysis clearly   should clearly explain           explained why the      
explained why the  the reason why the               status quo alternative 
investment was     investment is needed,            was not viable.        
needed.            i.e., why the status             
                      quo alternative is               
                      unacceptable.                    
At least two       At least two           Yes       Three alternatives to  
alternatives to    meaningful                       the status quo were    
the status quo     alternatives to the              considered.            
were considered.   status quo should be             
                      examined to help                 
                      ensure that the                  
                      alternative chosen was           
                      not preselected.                 
The general        The general rationale  Yes       The rationale for each 
rationale for the  for the inclusion of             alternative was        
inclusion of each  each alternative                 discussed.             
alternative was    should be discussed to           
discussed.         enable reviewers of              
                      the analysis to gain             
                      an understanding of              
                      the context for the              
                      selection of one                 
                      alternative over the             
                      others.                          
The quality of the The quality of the     No        The cost estimates     
cost estimate for  cost estimate of each            were not complete and  
each alternative   alternative should be            did not meet a         
was reasonable.    complete and                     majority of the SEI    
                      reasonable for a net             criteria.              
                      present value to be              
                      accurate. One measure            
                      of a cost estimate's             
                      reasonableness is its            
                      satisfaction of                  
                      earlier cited SEI                
                      criteria.                        
The quality of the The quality of the     No        Monetized estimates of 
benefits to be     benefit estimate of              benefits were not      
realized from each each alternative                 provided, and no       
alternative was    should be complete and           explanation was given  
reasonable.        reasonable for a net             as to why these        
                      present value to be              estimates were not     
                      calculable and                   provided.              
                      accurate.                        
Alternatives were  The net present value  No        The economic analysis  
compared on the    should be calculated             stated that all costs  
basis of net       because it                       and benefits were      
present value.     consistently results             expressed in           
                      in the selection of              undiscounted constant  
                      the alternative with             fiscal year 2004       
                      the greatest benefit             dollars; however,      
                      net of cost.                     monetized benefits     
                                                       were not reported in   
                                                       the economic analysis. 
                                                       As a result, the net   
                                                       present value was not  
                                                       calculated.            
The proper         OMB Circular A-94 is   No        Since all dollar       
discount rate used the general guidance             amounts are expressed  
for calculating    for conducting                   in undiscounted        
each alternative's cost-benefit analyses            constant fiscal year   
overall net        for federal government           2004 dollars, the      
present value      programs and provides            discount rate used in  
should be used.    specific guidance on             the economic analysis  
                      the discount rates to            is, by default, zero.  
                      be used in evaluating            The discount rates     
                      those programs whose             provided by OMB        
                      benefits and costs are           Circular No. A-94 are  
                      distributed over time.           all positive (i.e.,    
                                                       greater than zero).    
An uncertainty     Estimates of benefits  No        No uncertainty         
analysis of costs  and costs are                    analysis for the       
and benefits was   typically uncertain              overall reported costs 
included.          because of imprecision           was included.          
                      in both underlying               
                      data and modeling                
                      assumptions. Because             
                      such uncertainty is              
                      basic to virtually any           
                      cost-benefit analysis,           
                      its effects should be            
                      analyzed and reported.           

Sources: OMB guidance and GAO analysis of Navy data.

a"Yes" means that the program provided documentation demonstrating
satisfaction of the criterion. "Partially" means that the program provided
documentation demonstrating satisfaction of part of the criterion. "No"
means that the program has yet to provide documentation demonstrating
satisfaction of the criterion.

Program officials told us that they did not adhere to the OMB criteria
because they had only a month to complete the economic analysis and,
therefore, did not have the time necessary to comply with them. Because
the Navy did not follow established OMB guidance, the reliability of the
latest NTCSS economic analysis is questionable. This further increases the
risk that the Navy is following a course of action that will not produce
the expected return on investment.

The Latest NTCSS Economic Analysis Was Not Independently Reviewed

DOD guidance15 states that economic analyses and cost estimates should be
independently reviewed and assessed. In this regard, the Office of Program
Analysis and Evaluation, located in the Office of the Secretary of
Defense, is responsible for verifying and validating the reliability of
economic analyses and providing the results to the milestone decision
authority; the Naval Cost Analysis Division is responsible for preparing
independent cost estimates.

However, neither of these offices reviewed the most recent economic
analysis for NTCSS. An official from the Office of Program Analysis and
Evaluation told us that this office did not review the 2004 economic
analysis because, once NTCSS entered the RIT Pilot, the program office no
longer provided documentation needed to review the analysis. Officials
from the Naval Cost Analysis Division also stated that they did not review
the estimates in this economic analysis. According to officials from this
office, they are only required to review cost estimates that are prepared
for milestone reviews, and staffing limitations do not permit them to
review all cost estimates.

By not having the economic analysis reviewed by independent parties, the
Navy has no independent verification that the estimates of life cycle
costs and benefits are reasonable and traceable, that the cost estimates
are built on realistic program and schedule assumptions, or that the
return on investment calculation is valid. This casts further doubt on the
reliability of the economic analysis the Navy has used to justify its
ongoing investment in NTCSS.

The Navy Has Yet to Measure Whether Actual Benefits Have Accrued from
Deployed NTCSS Capabilities

The Clinger-Cohen Act of 1996 and OMB guidance16 emphasize the need to
develop information to ensure that IT projects are actually contributing
to tangible, observable improvements in mission performance. DOD
guidance17 also requires that analyses be conducted to validate estimated
benefits and measure the extent to which desired outcomes have been
achieved. To this end, agencies should define and collect metrics to
determine whether expected benefits are being achieved and modify
subsequent applications and investments to reflect the lessons learned.

However, the Navy has yet to measure whether NTCSS Optimized applications
are actually producing expected benefits commensurate with actual costs.
For example, in 1999 the Navy projected that deploying the NTCSS Optimized
applications would result in reduced costs associated with NTCSS
maintenance, training, and other support activities. However, the Navy
does not know the extent to which NTCSS Optimized applications are meeting
these expectations, even though these applications have been deployed to
229 user sites since 1998, because metrics to demonstrate that these
expectations have been met have not been defined and collected.

Program officials and officials representing the milestone decision
authority stated that the Navy is not required to measure actual accrual
of benefits because DOD guidance to do so was not yet in effect when the
NTCSS Optimized applications were deployed, and there was no explicit
requirement to apply this guidance retroactively. Program officials also
stated that it will not be possible to measure the actual return on
investment for the already deployed NTCSS Optimized applications until the
entire NTCSS system is deployed and operational. Similarly, an official
with the milestone decision authority stated that the actual NTCSS return
on investment has not yet been measured.

Because it is not measuring whether cost and benefit projections are being
met, the Navy lacks important information that it will need to inform
future economic analyses and investment decisions.

The Navy Recently Decided to Prepare a Benefits Assessment

In February 2005, officials from the Office of the Chief of Naval
Operations for Material Readiness and Logistics Operations18 and
representatives from key user organizations questioned whether NTCSS can
cost effectively meet users' future needs. Initially this office tasked
the program office to develop a new economic analysis to determine whether
to continue investing in NTCSS or in some other system solution, such as
the Navy enterprise resource planning (ERP) program.19 In November 2005,
officials from the Office of the Chief of Naval Operations for Material
Readiness and Logistics Operations stated that they no longer planned to
develop a new economic analysis but instead planned to conduct a benefits
assessment to evaluate changes to NTCSS that would enable the system to
perform future ashore activities. These officials acknowledged that this
assessment will be narrower in scope than the initially planned economic
analysis in that it will exclude any analysis of costs and alternative
solutions. However, they also acknowledged that DOD policy and guidance do
not address benefits assessments as a recognized acquisition program
document. They stated that this assessment will be prepared for inclusion
in the 2006 budget submission.

Without knowing the extent to which NTCSS Optimized applications are
meeting cost and benefit expectations, the Navy is not in a position to
make informed, and thus justified, decisions on whether and how to proceed
with the program. Such a situation introduces a serious risk that the Navy
will not be able to demonstrate whether NTCSS is cost-effective until it
has already spent hundreds of millions of dollars more on the NTCSS
Optimized applications and OOMA.

The Navy Has Not Defined and Developed NTCSS within the Context of an
Enterprise Architecture

DOD policy and guidance,20 as well as federal and best practice
guidance,21 recognize the importance of investing in IT business systems
within the context of an enterprise architecture. Our research and
experience in reviewing federal agencies show that not doing so often
results in systems that are duplicative, not well integrated, and
unnecessarily costly to interface and maintain, and that do not optimally
support mission outcomes.22 NTCSS has not been defined and developed in
the context of a DOD or Navy enterprise architecture because a
well-defined version of either has not existed to guide and constrain the
program, and meaningful analysis showing how NTCSS aligns to evolving DOD
and Navy architecture efforts was not produced. This means that the Navy
does not have a sufficient basis for knowing if NTCSS, as defined,
properly fits within the context of future DOD and Navy business
operational and technological environments.

More specifically, a well-defined enterprise architecture provides a clear
and comprehensive picture of an entity, whether it is an organization
(e.g., a federal department) or a functional or mission area that cuts
across more than one organization (e.g., personnel management). This
picture consists of snapshots of both the enterprise's current or "As Is"
environment and its target or "To Be" environment, as well as a capital
investment road map for transitioning from the current to the target
environment. These snapshots consist of integrated "views," which are one
or more architecture products that describe, for example, the enterprise's
business processes and rules; information needs and flows among functions;
supporting systems, services, and applications; and data and technical
standards and structures. GAO has promoted the use of architectures to
guide and constrain systems modernization, recognizing them as a crucial
means to a challenging goal: agency operational structures that are
optimally defined in both the business and technological environments.

DOD has long operated without a well-defined enterprise architecture for
its business environment. In 2001, we first reported that DOD did not have
such an architecture and recommended that it develop one to guide and
constrain IT business systems, like NTCSS.23 Over the next 4 years, we
reported that DOD's architecture development efforts were not resulting in
the kind of business enterprise architecture that could effectively guide
and constrain business system investments,24 largely because the
department did not have in place the architecture management structures
and processes described in federal guidance. In particular, we most
recently reported in July 200525 that despite spending about $318 million
to produce eight versions of its architecture, DOD's latest version still
lacked, for example, a clearly defined purpose that could be linked to the
department's goals and objectives, a description of the "As Is"
environment, and a transition plan. Further, we reported that the
description of the "To Be" environment was still missing important content
(depth and scope) relative to, for example, the actual systems to be
developed or acquired to support future business operations and the
physical infrastructure (e.g., hardware and software) that would be needed
to support the business systems. Over the last several years, we have also
reported that DOD's efforts for determining whether ongoing investments
were aligned to its evolving architecture were not documented and
independently verifiable.26 On September 28, 2005, DOD issued the next
version of its business enterprise architecture,27 which we are required
to review, along with other things such as the department's efforts to
review certain investments' alignment with the architecture, pursuant to
the Fiscal Year 2005 National Defense Authorization Act.28

The Navy has also not had an enterprise architecture to guide and
constrain its IT system investments. For example, in February 2002 and
November 2003, we reported that while the Navy was developing an
enterprise architecture, the architecture products were not complete and
they were not, for example, under configuration management.29 Since that
time, the Navy has yet to develop an enterprise architecture. In response
to our request for the latest version of its architecture, the Assistant
Secretary of the Navy for Research, Development and Acquisition, Chief
Engineer, provided us with documentation that describes high-level principles
or goals that the Navy wants to achieve, such as systems interoperability.
However, most of the critical products that an enterprise architecture
should include were not provided, such as (1) a data dictionary, which is
a repository of standard data definitions for applications; (2) a logical
database model that provides the data structures that support information
flows and that provides the basis for developing the schemas for
designing, building, and maintaining the existing physical databases; and
(3) an analysis of the gaps between the baseline and target architecture
for business processes, information/data, and services/application systems
to define missing and needed capabilities. According to the Deputy
Assistant Secretary of the Navy for Command, Control, Communication,
Computers and Intelligence, and Space, the Navy does not have an
enterprise architecture. This official stated, however, that the Navy
recognizes the importance of developing and using one and is taking steps
to do so, although no time frame had been established for when this would
be accomplished.

In addition, NTCSS program officials told us that the system has been
assessed against DOD's business enterprise architecture, and based on this
assessment, the system is aligned. However, our analysis of the alignment
documentation showed that while NTCSS could be mapped to several enterprise
architecture elements (e.g., strategic goals and organizational roles), it
was not mapped to other important elements (e.g., technical standards and
data model). Moreover, as previously discussed, the version of the
enterprise architecture used to assess alignment lacked utility and did
not provide a sufficient basis for making informed investment decisions.

These officials stated that they have not yet assessed the system against
the Navy's architecture because (1) the architecture has yet to be
sufficiently developed and (2) compliance with this architecture may not
be required.

Without having a well-defined architecture to set the institutional
context within which a given investment like NTCSS must fit and taking
proactive and verifiable steps to understand the extent to which the
system as it is defined fits within this context, misalignments can occur
that can introduce redundancies and incompatibilities and that can produce
inefficiencies and require costly and time consuming rework to fix. In the
case of NTCSS, this could be a problem because of the Navy's ongoing
investment in its ERP program.30 As we recently reported,31 this program
is intended to provide functionality in such areas as supply and
workforce management for ashore activities, which is functionality similar
to that of NTCSS for afloat activities. However, both programs have
proceeded without a common, institutional frame of reference (i.e.,
enterprise architecture) that can be used to effectively manage their
relationships and dependencies. Our research and experience in reviewing
federal agencies show that managing such relationships on a
program-to-program basis is untenable and has proven unsuccessful. This is
why the
inherent risks associated with investing in systems in the absence of a
well-defined architecture need to be explicitly disclosed and deliberately
evaluated in order to make a well-informed investment decision.

Key Program Management and Oversight Activities Have Not Been Effectively
Performed

Key aspects of effective program management include reliable progress
measurement and reporting, appropriate budgeting, and meaningful
oversight. DOD policy requires such activities, and DOD and other industry
best practices provide guidance on how these activities should be
conducted. However, these activities have not been effectively performed
on the NTCSS program. Specifically, the Navy has not adequately measured
progress against planned cost and scheduled work commitments, fulfilled
defined reporting requirements, properly budgeted for expenditures, and
conducted meaningful program oversight. As a result, opportunities for
proactive program intervention and actions to address risks and problems
were missed, allowing the program to proceed largely unchecked.

The Navy Is Not Adequately Measuring Progress Against Planned Cost and
Scheduled Work Commitments

Measuring and reporting progress against cost and schedule commitments is
a vital element of effective program management. DOD policy and guidance
recognize this by requiring the use of earned value management and
describing how it is to be performed. The NTCSS program has elected to use
earned value management; however, it is not doing so effectively. As a
result, the program office, as well as Navy and DOD oversight authorities,
has not had access to the kind of reliable and timely information needed
to make informed decisions.

DOD Has Adopted Industry Standards for Earned Value Management

According to DOD policy and guidance,32 program offices should obtain data
from contractors and central design agencies on work progress, and these
data should relate cost, schedule, and technical accomplishments.
Moreover, the guidance states that these data should be valid, timely, and
auditable. The tool that many DOD entities, including the NTCSS program
office and its central design agency, use to obtain and report these data
is known as earned value management (EVM). Through EVM, program offices
and others can determine a contractor's or central design agency's ability
to perform work within cost and schedule estimates. It does so by
examining variances between the actual cost and time to perform work tasks
and the budgeted/estimated cost and time to perform the tasks.
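
As a simple illustration of the variance calculations described above, the
sketch below (in Python) computes the standard EVM cost and schedule
variances and performance indexes. The planned value, earned value, and
actual cost figures are hypothetical examples of our own, not NTCSS data.

    # Hypothetical figures; illustrates the standard EVM variance
    # calculations, not actual NTCSS data.

    planned_value = 1_000_000  # budgeted cost of work scheduled to date (BCWS)
    earned_value = 800_000     # budgeted cost of work actually performed (BCWP)
    actual_cost = 1_100_000    # actual cost of work performed (ACWP)

    cost_variance = earned_value - actual_cost        # negative: over cost
    schedule_variance = earned_value - planned_value  # negative: behind schedule
    cpi = earned_value / actual_cost    # cost performance index; 1.0 is on plan
    spi = earned_value / planned_value  # schedule performance index; 1.0 is on plan

    print(f"CV = {cost_variance:,}  SV = {schedule_variance:,}")
    print(f"CPI = {cpi:.2f}  SPI = {spi:.2f}")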

In 1996, DOD adopted industry guidance33 that identifies 32 criteria that
a reliable EVM system should meet. The 32 criteria are organized into five
categories: organization, planning and budgeting, accounting, analysis and
management reports, and revisions and data maintenance (see app. III for
the 32 criteria). As we previously reported,34 EVM offers many benefits
when done properly. In particular, it is a means to measure performance
and serves as an early warning system for deviations from plans. It
therefore enables a program office to mitigate the risk of cost and
schedule overruns.

NTCSS Has Not Effectively Implemented EVM

The EVM system that NTCSS has implemented to measure program performance
does not provide the kind of reliable and timely data needed to
effectively identify and mitigate risks. According to the NTCSS central
design agency's self-assessment of its earned value management system, 17
of the 32 industry best practice criteria are not being satisfied by the
EVM system it has implemented. For example, the central design agency
reported that the system cannot (1) establish and maintain a budget
baseline against which program performance can be measured over time, (2)
identify management reserves in case of contingencies, (3) record all
indirect costs35 that will be allocated to the work, (4) summarize data
elements and associated variances through the work breakdown structure to
support management needs, and (5) develop revised estimates of cost at
completion based on performance to date.

Beyond this self-assessment, our review showed that 29 of the 32 criteria
were not satisfied. For example, the system does not (1) provide for the
integration of planning, scheduling, budgeting, work authorization, and
cost accumulation management processes; (2) identify physical products,
milestones, technical performance goals, or other indicators used to
measure progress; (3) reconcile current budgets to prior budgets in terms
of changes to the authorized work and internal replanning; and (4) control
retroactive changes to records. See appendix III for the Navy's complete
self-assessment and our full analysis of the extent to which the 32
criteria are satisfied.

Officials with the program office and the central design agency stated
that although they chose to use EVM, they are not required by DOD policy
to do so and, therefore, do not have to comply with the 32 criteria. These
officials stated that one reason they are not required to use it is that
the program office and the central design agency are part of the
same organization (the Space and Naval Warfare Systems Command) and thus a
formal contract or written agreement between them does not exist. They
also stated that although the program as a whole exceeds dollar thresholds
for which EVM is required,36 they have chosen to break the program into
smaller projects managed on a fiscal year basis, and none of these
projects individually exceeds either the new or old DOD policy thresholds
that would require the use of EVM.

We do not agree that the absence of a contractual relationship or the
decomposition of the program into small, fiscal year-based projects is a
valid reason for not effectively implementing EVM. DOD and OMB guidance
require that the Navy base programmatic decisions on reliable analyses of
estimated system costs and expected benefits over the life of the
program. The program office chose to use EVM as a means to satisfy these
requirements and to measure progress and identify potential problems
early, so that they could be effectively addressed. To accomplish this,
EVM must be performed correctly. By not implementing it correctly on
NTCSS, the Navy is losing an opportunity to gain the kind of visibility
into program progress needed to identify problems and risks early and
better ensure program success. Moreover, by tracking individual projects
on a yearly basis, the program office cannot adequately understand the
status of the NTCSS program as a whole, which hinders its ability to
accurately forecast program costs at completion and provide realistic
schedule projections. In short, without reliable, timely, and auditable
EVM data, the program office cannot adequately manage technical, cost, and
schedule risks and problems.

Two NTCSS Projects Illustrate How EVM Has Been Poorly Implemented

Two of the individual NTCSS projects for which EVM activities were
reportedly being performed are (1) 2004 OOMA software development and (2)
2004 NTCSS hardware installation and integration (for both OOMA and
Optimized NTCSS). For the OOMA software project, EVM was performed by the
central design agency; for the NTCSS hardware project, it was performed by
the Space and Naval Warfare Systems Command Systems Center, Charleston.
On both projects, we found several examples of ineffective EVM
implementation, including the following:

o An integrated baseline review was not conducted for either of the
projects. According to DOD guidance and best practices, an integrated
baseline review should be conducted as needed throughout the life of a
program to ensure that the baseline for tracking cost, technical, and
schedule status reflects (1) all tasks in the statement of work, (2)
adequate resources in terms of staff and materials to complete the tasks,
and (3) integration of the tasks into a well-defined schedule. Further,
program managers are to use cost performance reports that have been
validated by an integrated baseline review. Without verifying the
baseline, monthly cost performance reporting, which is to track against a
set budget and schedule, does not have sufficient meaning or validity.

o The estimate at completion for the 2004 OOMA software project, which is
a forecast value expressed in dollars representing the final projected
costs of the project when all work is completed, showed a negative cost
for a 6-month period (November 2003 to April 2004). When EVM is properly
implemented, this amount should include all work completed and always be
a positive number (see the illustrative calculation following this list).
The negative estimate at completion for this project
would mean that the central design agency had incurred a savings rather
than spending money, even though during that time more than $1.7 million
had been spent.

o The schedule performance index for the OOMA software project, which is
to reflect the critical relationship between the actual work performed
versus the costs expended to accomplish the work, showed program
performance during a time when the program office stated no work was being
performed. Specifically, the reports showed the schedule performance
index fluctuating between $0.21 and more than $3.75 worth of work
performed for every dollar spent during a time that the program office
claims all work was halted. Perfect performance would correspond to a
schedule performance index of 1.0 (i.e., for every dollar spent, 100
percent of the scheduled work achieved).

o The estimate at completion for the OOMA hardware installation project
showed that almost $1 million in installation costs had been removed from
the total sunk costs, but no reason for doing so was provided in the cost
performance report.

o The cost and schedule indices for the OOMA hardware installation project
showed improbably high program performance during a time when the
installation schedules and installation budget had been drastically cut
because OOMA software failed operational testing. Specifically, the
reports between March 2004 and July 2004 showed the cost performance
index fluctuating between $0.07 and $8.48 worth of work performed for
every dollar spent.
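
To show why a negative estimate at completion, such as the one reported
for the OOMA software project above, cannot arise from properly maintained
EVM data, the sketch below applies one common form of the standard
estimate-at-completion formula to hypothetical figures of our own (not
OOMA or NTCSS data). The estimate equals the actual cost already incurred
plus the estimated cost of the remaining work, so it can never be less
than the money already spent.

    # Hypothetical figures; illustrates a standard estimate-at-completion
    # formula, not actual NTCSS or OOMA data.

    budget_at_completion = 10_000_000  # total budgeted cost of all work (BAC)
    earned_value = 4_000_000           # budgeted value of work completed (BCWP)
    actual_cost = 5_000_000            # money spent so far (ACWP)

    cpi = earned_value / actual_cost   # 0.80: work is costing more than budgeted
    estimate_at_completion = (actual_cost
                              + (budget_at_completion - earned_value) / cpi)

    # The result is the actual cost to date plus the projected cost of the
    # remaining work, so it is always at least as large as the money already
    # spent; a negative value signals a broken baseline or bad data rather
    # than genuine savings.
    print(f"Estimate at completion = {estimate_at_completion:,.0f}")  # 12,500,000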

Navy officials cited several reasons for these shortcomings. For the
software project, program officials stated that prior to the operational
testing of OOMA in 2003, the central design agency's implementation of EVM
was primitive at best and that the resulting data were not usable. They
also stated that after the project failed operational testing, they did
not see the value in rebaselining the project and thus all EVM analysis
was halted. They did, however, continue to invest in OOMA. For the
hardware installation project, a Charleston Center official responsible
for developing the installation reports stated that there were problems
with collecting actual costs because the individuals responsible for doing
the work were covered by other contracts, and there was no way to ensure
that the costs were being reported consistently. Regarding the
approximately $1 million in installation costs that were removed from the
total sunk costs, this official stated that these costs were erroneously
charged to this project and were thus removed because they were not part
of the original plan.

Ineffective implementation of EVM, as occurred on these two projects,
precludes NTCSS program officials from having reliable and timely
information about actual program status and does not provide these
officials with a sound basis for making informed program decisions.

The Navy Has Not Adequately Reported NTCSS's Progress and Problems

One essential aspect of effective program management is complete and
current reporting by the program office to oversight organizations
responsible for making decisions regarding the program's future. DOD
policy recognizes this, stating that the program office is accountable for
providing credible schedule, performance, and cost reporting information
to the milestone decision authority. Officials from the NTCSS milestone
decision authority told us that they relied on the program office to fully
disclose progress against, and deviations from, program cost, schedule,
and performance goals. However, the program office has not reported
consistently or reliably on the program's progress and, as a result, has
not fully disclosed program status to Navy and DOD oversight authorities
who are responsible for making proper investment decisions.

Navy Reporting Requirements for NTCSS Have Changed over the Last Several
Years

Since program inception, NTCSS requirements for reporting cost, schedule,
and performance information have changed. Prior to October 2002, the
program office was required to comply with applicable DOD acquisition
policies and guidance.37 This guidance generally required the program
office to provide oversight organizations with the following three key
reports:

o The Acquisition Program Baseline, which describes the program's cost,
schedule, and performance goals. This baseline document is to be developed
when the program is initiated, and it is to be updated for each milestone
review. Within 90 days of a program breach,38 unless the program is back
within its baseline goals, a new Acquisition Program Baseline is to be
prepared by the program office and approved by the milestone decision
authority.

o The Program Deviation Report, which is to be prepared when the program
office identifies deviations from the approved Acquisition Program
Baseline goals. More specifically, when the program office has reason to
believe that a program breach will occur, it is to immediately notify the
milestone decision authority. Within 30 days, the program office is to
inform the milestone decision authority of the reason for the deviation
and the actions it considers necessary to bring the program back within
baseline goals.

o The Defense Acquisition Executive Summary, which is prepared to inform
the milestone decision authority on the program's progress against cost,
schedule, and performance goals reflected in the Acquisition Program
Baseline. Prepared quarterly, the summary is designed to provide an early
warning to the DOD Chief Information Officer (CIO) and the milestone
decision authority by identifying existing and potential program problems
and describing mitigating actions that have been taken.

Between October 2002 and December 2004, the reporting requirements for the
program changed.39 As previously discussed, NTCSS was selected by its
milestone decision authority to participate in the RIT pilot, which was
aimed at saving time in the acquisition management process by reducing
traditional DOD reporting and oversight requirements, while still adhering
to DOD acquisition guidance. Under the RIT pilot, the program office was
required to prepare the following two monthly electronic reports:

o The Monthly Acquisition Program Review, which was to assess the current
health of the program on a monthly basis in such areas as cost and
schedule performance, testing, funding, and contracting. This report was
broken into eight parts. According to the program office, the main part
for NTCSS was the Program Manager Assessment.

o The Smart Chart, which was to address risks for different projects
within the program, including a description of the risk, actions taken to
address the risk, and recommendations for further actions. The Smart Chart
was also to contain any updates to the Acquisition Program Baseline.

In short, the RIT reporting was to provide the same information reported
via the traditional acquisition baseline and the summary report, but it
was to be more frequent (monthly versus quarterly) and use a different
format (electronic versus paper). In addition, under the RIT pilot,
certain acquisition documents, such as acquisition plans, economic
analyses, requirements documents, and test plans, were to be posted to the
RIT Web site's electronic library rather than sent in hard copy to the
program's stakeholders.

In December 2004, the program office and the milestone decision authority
agreed to discontinue use of the RIT pilot procedures. In January 2005,
the reporting requirements reverted to the acquisition policies and
procedures as prescribed in the updated DOD 5000 series. Currently, the
program office is required to prepare the summary report quarterly and the
acquisition baseline as needed. Also, in January 2005, the Navy required
the program office to begin making entries into the Dashboard. The
Dashboard, like the summary report, is prepared by the program office on a
quarterly basis for the milestone decision authority and is to provide an
assessment of the program in such areas as cost, schedule, and performance
characteristics.

The Navy Has Not Satisfied All NTCSS Reporting Requirements

The program office did not comply with the reporting requirements that
were in effect during the 27 months of the RIT pilot. Specifically:

o The Smart Chart was not updated for 19 of the 27 months. Specifically,
the data were updated eight times between October 2002 and November 2003;
the data were not updated after November 2003.

o The Program Manager Assessment was not updated for 11 of the 27 months.
In addition, the updates were not always made in a timely manner. Of the
16 monthly updates that were made, 7 were entered after the month had
ended, and most of these were about a month late.

o Of the 15 essential acquisition documents that the program office
committed to entering in the RIT electronic library, 10 were not entered.
For example, the most recent economic analysis and the test and evaluation
master plan for OOMA were not entered.

o The Program Deviation Report and Acquisition Program Baseline were not
prepared in a timely manner. Specifically, in April 2004, the acquisition
of eNTCSS was cancelled and, in May 2004, OOMA did not pass operational
testing, two events that caused the related cost and schedule thresholds
in the Acquisition Program Baseline to be breached. While program officials
had notified the milestone decision authority of these events via (1)
e-mail, (2) entries into the Program Manager Assessment on the RIT Web
site, and (3) briefings, the program office did not prepare a Program
Deviation Report until about 15 months later. Moreover, this deviation
report addressed only the OOMA failure, not the cancellation of eNTCSS and
reprogramming of unexpended eNTCSS funding. In addition, program officials
have yet to provide us with a new Acquisition Program Baseline to reflect
the program breach or documentation showing that this revised baseline has
been approved by the milestone decision authority.

For the DOD and Navy reporting requirements in effect since January 2005,
the Navy has satisfied some, but not all, of the reporting requirements.
For example, the program office has prepared the Dashboard reports
quarterly as required. However, it has not prepared the Defense
Acquisition Executive Summary quarterly as required; the first report was
not prepared until June 2005, 6 months after the requirement resumed and
the report was due.

Program officials provided various reasons why the required program
reporting has not occurred. In the case of the Smart Charts and the
Program Manager Assessment reports, a contractor supporting the Assistant
Program Manager stated that the data may have been entered into the Web
site but not properly saved. Regarding the posting of documents into the
electronic library, an official from the milestone decision authority
stated that there was no documentation from the Office of the Assistant
Secretary of Defense for Networks and Information Integration/Chief
Information Officer that directed which, if any, acquisition documents
were to be entered into the RIT Web site. Similarly, a contractor
supporting the Assistant Program Manager stated that the folders in the
electronic library were established by the Army and thus the Navy was not
required to use them. However, our review of documentation provided by the
program office shows that it clearly states which specific documents
should be included in the electronic library. Regarding the delay in
preparation of the Program Deviation Report and subsequent Acquisition
Program Baseline revision, a contractor supporting the Assistant Program
Manager stated that a new baseline should have been prepared sooner, but
that this reporting was delayed due to the uncertainty of which reporting
methods to use after the end of the formal RIT pilot.

Officials representing the milestone decision authority stated that they
relied on program office reporting on program status and progress, and
that they expected the program office to inform them if the program
exceeded its cost, schedule, and performance thresholds. Without adequate
reporting, oversight officials were not positioned to effectively execute
their roles and responsibilities.

The Navy Has Not Properly Budgeted for NTCSS

In September 1999, the Navy Comptroller issued guidance directing program
offices to review their budgets and identify efforts that were being
improperly funded and to take the steps necessary to realign these funds
to "Research, Development, Test and Evaluation" as quickly as possible.
Further, DOD Financial Management Regulation40 requires that IT
development, test, and evaluation requirements generally be funded in the
"Research, Development, Test and Evaluation" appropriations. More
specifically it states that, "The Research, Development, Test and
Evaluation funds should be used to develop major upgrades increasing the
performance envelope of existing systems, purchase test articles, and
conduct developmental testing and/or initial operational test and
evaluation prior to system acceptance." Similarly, Navy financial
management policy41 states that, "All costs associated with software
development/modification efforts that provide a new capability or expand
the capability of the current software program (i.e., expand the
performance envelope) are funded in the Research, Development, Test and
Evaluation appropriation."42

However, this has not occurred. Since 1997, the program office has not
identified "Research, Development, Test and Evaluation" funds in five of
its seven Acquisition Program Baseline documents, three of which were
prepared after the guidance was issued by the Comptroller of the Navy.
Instead, the Navy funded these activities primarily out of the "Operations
and Maintenance," "Other Procurement," and "Ship Construction"
appropriations. (See table 10.)

Table 10: Threshold Amounts in NTCSS Acquisition Program Baselines

Dollars in thousands

Acquisition        Date            Operations and   Other         Ship          Research, development,
program baseline   prepared        maintenance      procurement   construction  test and evaluation
Revision 0         March 1997          182,986         199,636        11,683                        0
Revision 1         March 1998          257,542         303,565        23,836                    3,026
Revision 2         December 1998       223,370         285,550        18,220                        0
Revision 3         January 2001        276,100         382,000        27,300                        0
Revision 4         January 2003        276,100         382,000        27,300                        0
Revision 5         July 2003           276,100         382,000        27,300                        0
Revision 6         January 2004        376,400         346,600        25,700                   29,800

Source: Navy.

Program officials agreed that they have funded NTCSS development
activities, such as those associated with OOMA, out of the "Operation and
Maintenance" appropriation rather than the "Research, Development, Test
and Evaluation" appropriation. A contractor supporting the Assistant
Program Manager stated that, although they were aware of the Comptroller
of the Navy's budget guidance, the program office chose not to comply
because program officials believed in 1999 that the OOMA application,
which had been under development for 3 years, would pass developmental
testing and operational testing by 2001. As a result, program officials
determined that the effort required to reprogram funding from the
"Operation and Maintenance" appropriation into the "Research, Development,
Test and Evaluation" appropriation was not warranted. Further, the
official stated that although OOMA did not pass operational testing in
2001, the program office did not fund OOMA with "Research, Development,
Test and Evaluation" funds until 2004 because it continued to consider
OOMA as being close to becoming operational.

The lack of proper budgeting for "Research, Development, Test and
Evaluation" funding has given oversight authorities the misleading
impression that NTCSS development activities were completed and that the
system was fully operational. Specifically, officials from the Office of
the Assistant Secretary of Defense for Networks and Information
Integration/Chief Information Officer, which was the original NTCSS
milestone decision authority, stated that since most of the "Research,
Development, Test and Evaluation" funding appeared to have been spent,
they concluded that the development portion of NTCSS was essentially
complete. As a result, these officials stated that they had considered
taking NTCSS off of the list of programs subject to oversight reviews.
However, after 9 years and over $79 million in expenditures, the OOMA
application still has not passed operational testing and thus is still in
development.

Navy Oversight of NTCSS Has Not Been Adequate

DOD and Navy policies task a number of organizations with oversight of IT
system acquisition and development programs. For example, DOD policy
states that a milestone decision authority has overall program
responsibility. In addition, the Navy Chief Information Officer is
responsible for reviewing programs at certain points in the acquisition
cycle. Finally, the NTCSS Executive Steering Committee is responsible for
monitoring the near-term development and evolution of the NTCSS program.
However, effective oversight by these entities has not occurred. As a
result, opportunities to address long-standing program weaknesses have
been missed, and the program has been allowed to proceed virtually
unchecked.

The Milestone Decision Authority Has Not Adequately Overseen the Program

DOD acquisition policy43 states that a milestone decision authority is the
designated individual with overall responsibility for a program and is to
ensure accountability and maximize credibility in program cost, schedule,
and performance reporting. In this role, the milestone decision authority
is responsible for reviewing the program throughout its acquisition life
cycle, including: (1) whenever the program reaches a milestone decision
point; (2) whenever cost, schedule, or performance goals are baselined or
must be changed; and (3) periodically through review of management
information such as that found in the Defense Acquisition Executive
Summary reports.

However, the Navy milestone decision authority44 has not conducted such
reviews. Specifically:

o The NTCSS program has not reached a milestone decision point in over 5
years. The last such milestone was in April 2000 when the final two NTCSS
Optimized applications became operational. The next scheduled milestone
was to be in 2001, but because OOMA operational testing was stopped and
has yet to be successfully completed, a milestone decision point has yet
to occur. As a result, there has not been a triggering event that would
cause the milestone decision authority to formally review the program or
any of its projects. We discussed the state of NTCSS in March 2005 with
the milestone decision authority's representatives. In July 2005, the
authority was briefed by the program office. According to program
officials, this was the first formal program review to occur since
termination of the RIT pilot in December 2003. These officials also stated
that quarterly acquisition team meetings have since resumed (with the
first meeting having occurred in September 2005 and the next scheduled for
December 2005) to prepare for the next milestone review of OOMA.

o The program office notified the milestone decision authority in April
and June 2004, via e-mail, entries into the Program Manager Assessment on
the RIT Web site, and briefings, that OOMA failed operational testing and
that eNTCSS was cancelled. According to officials with the milestone
decision authority, they followed up with the program office and provided
guidance; however, these events did not trigger a formal program review.

o The milestone decision authority did not contact the program office to
inquire as to the reason why monthly reports were not being prepared as
agreed to after the formal RIT pilot had ended. For example, Smart Charts
were not prepared after November 2003. However, according to milestone
decision authority officials, they did not seek an explanation from the
program office as to why. Milestone decision authority officials told us
that they were relying on the Dashboard report in order to stay informed
on the program's progress. However, they did not require the program
office to begin preparing the Dashboard report until January 2005.

According to DOD and Navy officials, including officials from the Office
of the Assistant Secretary of Defense for Networks and Information
Integration/Chief Information Officer, the Navy milestone decision
authority, and the program office, NTCSS participation in the RIT pilot
resulted in disruption of normal oversight activities, which have yet to
be fully restored. They added that compounding this is the fact that the
Navy's milestone decision authority's staffing has been reduced in recent
years. According to these officials, approximately 2 years ago the number
of full time staff in the Office of the Deputy Assistant Secretary of the
Navy for Command, Control, Communication, Computers and Intelligence, and
Space was reduced from 16 to 6 people, and these 6 are responsible for
reviewing approximately 60 acquisition programs. The officials stated
that, given the large number of programs and limited staffing, they are
unable to fully perform oversight activities so they have increasingly
relied on the program executive office's assistance to perform detailed
oversight of this program. Without adequate oversight by the milestone
decision authority, the NTCSS program has been allowed to proceed despite
the program weaknesses discussed in this report.

Other Navy Organizations Have Not Conducted Program Oversight

While the milestone decision authority is the main program oversight
entity, two other Navy organizations have oversight responsibilities.
However, these entities also have not performed effective oversight of the
program. Specifically,

o The Department of the Navy CIO is responsible for reviewing programs at certain
points in the acquisition cycle to ensure, among other things, that
program goals are achievable and executable and that the program is
providing value (i.e., producing a positive return-on-investment). Navy
CIO officials stated that they have overseen NTCSS primarily by reviewing
the Capital Investment Reports45 prepared by the program office. They
stated that they have not performed any proactive activities to verify and
validate the program's status and progress. Instead, they rely on
information in the Capital Investment Reports, such as economic
justification; budget information by appropriation type; and cost,
schedule, progress, and status. However, as was discussed previously, the
program office does not have or has not reported reliable information on
these topics.

o The NTCSS Executive Steering Committee is responsible for establishing
priorities for NTCSS development and implementation and determining the
strategic direction of the program. Among other things, it is to meet
immediately following each major NTCSS program meeting. However, it has
not met since December 2002, even though the program office convened both
a Requirements Integrated Product Team meeting and a Forum meeting in
February 2005. Further, during this period, major setbacks occurred on the
program, including the failure of OOMA to pass operational testing and the
cancellation of eNTCSS, issues that affected the program's direction and
priorities and thus fell within the scope of the committee's charter.
Program officials agreed that the Executive Steering
Committee has not formally convened during this time frame. However,
program officials stated that members of the committee informally met to
discuss and provide advice regarding OOMA concerns, and Navy officials
higher than the Executive Steering Committee made the decision to cancel
eNTCSS. Therefore, these officials stated there was no need to formally
convene an Executive Steering Committee meeting. Program officials stated
that the Executive Steering Committee will be meeting in January 2006.

NTCSS Requirements and Test Management Weaknesses Have Contributed to
Deployment Delays and System Quality Problems

As we have previously reported,46 the effectiveness of the processes used
to develop a system is a reliable predictor of the quality of the system
products produced. Two key system development processes are requirements
development and management and test management. For the NTCSS application
currently under development, we found weaknesses with both of these
process areas. While improvements are planned, until they are implemented
effectively, the risk of continued NTCSS cost, schedule, and performance
shortfalls persists.

The Navy Has Not Adequately Managed Requirements for the NTCSS Application
Currently Under Development

Well-defined requirements can be viewed as a cornerstone of effective
system development and implementation. Accordingly, DOD guidance and
industry best practices recognize effective requirements development and
management as an essential system development and acquisition management
process. For the NTCSS application that is currently under
development-OOMA-the Navy has not adequately managed its 732 requirements,
as evidenced by a lack of requirements traceability and prioritization.
NTCSS program officials told us that NTCSS requirements development
practices have historically been poor, but that improvements are under
way. Without effective requirements management, the Navy's challenges to
date in developing NTCSS applications that meet user needs on schedule are
likely to continue.

Requirements for OOMA Release 4.10 Were Not Traced

DOD guidance and industry best practices also recognize the importance of
requirements traceability.47 The purpose of requirements traceability is
to ensure that the finished product complies with the requirements. This
requires that system documentation be consistent and complete so that each
requirement can be traced both backward to higher level source
documentation and forward to system design and test documentation.
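To illustrate the kind of check that traceability analysis involves, the
following is a minimal sketch, in Python, of a bidirectional traceability
check; the requirement identifiers, document names, and data structures
are hypothetical and are not drawn from NTCSS documentation.

    # Minimal sketch of a bidirectional requirements traceability check.
    # All identifiers below are hypothetical examples, not NTCSS data.

    requirements = ["REQ-001", "REQ-002", "REQ-003"]

    # Backward links: each requirement should trace to a validated user need
    # in higher level source documentation (e.g., an operational requirements
    # document).
    req_to_source = {
        "REQ-001": "ORD-need-7",
        "REQ-002": "ORD-need-9",
        # REQ-003 has no documented source and should be flagged below.
    }
    # Forward links: each requirement should trace to specification elements
    # and to the test cases that verify it.
    req_to_spec = {"REQ-001": ["SSS-3.2.1"], "REQ-002": ["SSS-3.2.2"],
                   "REQ-003": ["SSS-3.2.2"]}
    req_to_tests = {"REQ-001": ["TC-101"], "REQ-002": ["TC-102", "TC-103"]}

    no_source = [r for r in requirements if r not in req_to_source]
    no_spec = [r for r in requirements if not req_to_spec.get(r)]
    no_test = [r for r in requirements if not req_to_tests.get(r)]

    print("No backward trace to a source need:", no_source)   # ['REQ-003']
    print("No forward trace to a specification:", no_spec)    # []
    print("No forward trace to a test case:", no_test)        # ['REQ-003']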

OOMA release 4.10 requirements were not traced to an Operational
Requirements Document. According to DOD guidance,48 an Operational
Requirements Document translates nonsystem-specific statements of a needed
operational capability into a set of validated and prioritized user
requirements. However, the Navy did not develop an Operational
Requirements Document for NTCSS. As a result, the Navy did not take a
basic validation step to ensure that the requirements to which it designed
and built the application were complete and correct. In addition, release
4.10 requirements were not always traceable to associated system
specifications. Specifically, we were unable to trace 215 requirements
found in the system segment specification to the requirements listed in
the requirements checklist. Requirements should also be traced to test
cases, but the program office has yet to provide us with the developmental
test cases used to test the OOMA release 4.10 so that we could verify this
traceability.

Program officials acknowledged that release 4.10 requirements were not
traceable but said that improvements are planned for the next OOMA release. We
found that 97 percent of the OOMA release 5.0 requirements found in the
system segment specification were traceable to the requirements listed in
the requirements checklist. However, these documents have yet to be
approved. Requirements should also be traced to test cases, but the
program office has yet to provide us with the developmental test cases
used to test the OOMA release 5.0 so that we could verify this
traceability. Without this traceability, the Navy has not had a sufficient
basis for knowing that the scope of its development efforts, including
testing, provides adequate assurance that applications will perform as
intended.

Requirements for OOMA Release 4.10 Were Not Prioritized

According to published best practices guidance,49 any project with
resource limitations should establish the relative priorities of the
requested features or requirements. Prioritization helps the project
office resolve conflicts and make trade-off decisions among competing
requirements, and it helps ensure that the delivered system will be
operationally suitable.

However, OOMA's approximately 732 requirements have never been
prioritized, and a program official told us that they are all considered
to be equally important. This means, for example, that a requirement that
dictates basic application functionality (e.g., if text can be entered on
a particular screen) is as important as a requirement addressing safety
issues that, if not met, could result in the loss of an aircraft or even a
life.

This lack of requirements prioritization contributed to release 4.10
passing developmental testing but failing operational testing. (See later
section of this report for a detailed discussion of OOMA testing.) A
developmental testing threshold that the Navy set for release 4.10 was
that each requirement was to be tested, and 95 percent of the requirements
had to pass in order for the application to proceed to operational
testing. For developmental testing of OOMA release 4.10, 97 percent of the
requirements passed, but some of the 3 percent that failed involved
deficiencies that seriously impacted squadron-level operations. Further,
for operational testing of OOMA release 4.10, 96
percent of the requirements passed. However, the remaining 4 percent
contained significant defects. Specifically, the release provided
inconsistent and inaccurate flight and usage hours, as well as incorrect
aircraft usage records. According to the Navy's independent operational
test organization, these deficiencies impacted aircraft and component
time-based inspection cycles and thus were the basis for the system
failing operational testing. The Navy has yet to provide evidence that the
requirements have been prioritized for the OOMA release 5.0.
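The risk of relying on an overall pass rate when requirements are not
prioritized can be illustrated with a simple, hypothetical calculation;
except for the total of 732 requirements cited above, the figures and
priority labels in the sketch below are invented and do not represent
actual NTCSS test data.

    # Hypothetical illustration: a flat pass-rate threshold can be met even
    # when the few failures are safety critical, if requirements are not
    # prioritized.
    TOTAL_REQUIREMENTS = 732   # total cited in the report; the rest is invented
    PASS_THRESHOLD = 0.95      # developmental testing exit criterion

    failed = [                 # hypothetical failed requirements
        {"id": "REQ-0412", "priority": "safety-critical"},
        {"id": "REQ-0511", "priority": "safety-critical"},
    ]
    pass_rate = (TOTAL_REQUIREMENTS - len(failed)) / TOTAL_REQUIREMENTS

    print(f"Pass rate: {pass_rate:.1%}")                         # 99.7 percent
    print("Meets 95 percent threshold:", pass_rate >= PASS_THRESHOLD)  # True
    critical = [r for r in failed if r["priority"] == "safety-critical"]
    print("Safety-critical failures:", len(critical))            # 2
    # A priority-aware exit criterion would also require zero safety-critical
    # failures before certifying readiness for operational testing.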

The Navy's Developmental Testing for OOMA Has Not Been Effective, but
Improvements Planned

Both DOD policy and relevant guidance recognize that effective testing is
an essential component of system development or acquisition programs.
Generally, testing can be viewed as consisting of two major phases-a
developmental phase in which tests are performed to ensure that defined
system requirements and specifications are met and an operational phase
that includes tests to determine if the system meets user needs and is
suitable in an operational environment. The OOMA application has failed
operational testing twice over the last 4 years reportedly because of
deficiencies in developmental testing. Program officials attributed
developmental testing deficiencies to poor software development practices,
such as the earlier discussed requirements development problems. These
testing deficiencies can also be attributed to incomplete testing
documentation. Without effective developmental testing, there is an
increased risk that application problems will be detected later in the
system life cycle when they are more expensive and difficult to fix.

Navy Operational Testing Organization Reported That Developmental Testing
Has Failed to Identify Problems

According to DOD guidance and recognized best practices,50 the purpose of
developmental testing is to provide objective evidence that the product
(e.g., software module, application, system) satisfies defined
requirements and performs as intended. Successful completion of
developmental testing provides the basis for proceeding into operational
testing to determine whether the integrated product (e.g., application,
system, system of systems) performs as intended in an operational or
real-world setting.

OOMA operational testing results over the last 4 years show that the
program office's developmental testing efforts have not been effective in
identifying critical product problems. In particular, the application has
failed operational testing twice during this time frame and, according to
an official in the office of the Director of Navy Test and Evaluation and
Technology Requirements, the failures occurred in operational testing
because the underlying problems were not identified during developmental
testing. More
specifically,

o In March 2001, the program office certified that OOMA release 3.25 had
passed developmental testing and was ready for operational testing.
However, 1 month into a scheduled 3-month operational test, the decision
was made to cease further testing because of significant problems with
system reliability, data transfer between the application and the
database, and user training on the application. As a result, the program
office decertified this release, and the Navy's independent test
organization recommended discontinuing OOMA deployment.

o Using results from the failed operational test, the central design
agency developed release 4.0. In February and March 2002, developmental
testing of this release was conducted. Test results showed that the
application was not ready for operational testing because it did not
satisfy key functional requirements. Subsequently, the central design
agency incorporated software fixes in release 4.10. In August and
September 2002, developmental testing was conducted on this release and,
while a number of deficiencies were verified as fixed, additional
corrections were needed. From January to June 2003, developmental testing
was again conducted on OOMA release 4.10.

o From August 2002 to April 2003, the Naval Audit Service51 reviewed OOMA
and reported several problems that would affect the application's
readiness for operational testing. For example, it reported that controls
to prevent unauthorized access were not in place, Privacy Act information
was not adequately protected, and backup and recovery procedures were not
in place. It also reported that the program had not adopted and
implemented a risk-based system life cycle management approach. According
to the report, these weaknesses could compromise safety, affect planning,
and distort readiness reporting if OOMA was implemented throughout the
Navy.

o In June 2003, the program office certified OOMA release 4.10 as having
passed developmental testing and being ready for operational testing. The
Navy's independent operational test organization subsequently conducted
testing from August to December 2003 and, in May 2004,52 this organization
concluded that OOMA was not operationally effective or suitable and thus
it again failed operational testing. In particular, the operational
testing results showed that the application was incorrectly calculating
flight and component usage hours-defects that, according to an official
in the office of the Director of Navy Test and Evaluation and Technology
Requirements, could have resulted in the loss of aircraft or life. The
Assistant Program Manager also told us that release 4.10 did not address
all of the deficiencies reported by the Naval Audit Service.

For about a year, the central design agency has been developing and
testing OOMA release 5.0 to fix the problems found in the prior version.
The program office expects that this release will be certified as ready
for operational testing sometime between April and June 2006. In
preparation for operational testing, the Navy's independent operational
test organization has been observing OOMA 5.0 developmental testing. A
memo from this organization states that this release is an improvement
over the previous releases.

According to Navy officials, including the NTCSS Assistant Program Manager
and the official responsible for OOMA developmental testing, previous
application development practices were poor, which led to testing
problems. Specifically, they cited poor requirements definitions, poor
documentation, and concurrent development of application releases as
examples. Further, Navy officials stated that the central design agency
has not had a developmental testing lab to facilitate effective testing of
application components and their integration. To address the poor
development practices, program officials told us that they are in the
process of implementing a new system life cycle management process that
they said incorporates industry best practices, including those related to
testing. However, the program office has yet to provide us with
information defining how the practices in this plan will be implemented.
To address the need for a developmental testing lab, the Naval Air Systems
Command organization representing NTCSS users recently created a lab to
strengthen the program's developmental testing capability. According to
officials associated with the lab, they are finding defects that the
central design agency should have found.

It is important that the NTCSS program improve its developmental testing.
Without effective developmental testing, there is an increased risk that
system application problems will be detected late in the system life
cycle, such as during operational testing. Generally, problems discovered
late in the cycle are more expensive and difficult to fix than those
discovered early.

Developmental Test Documentation Has Not Been Adequate, but Improvements
Planned

To be effective, testing should be approached in a rigorous and
disciplined fashion. One aspect of such testing is developing and using
various testing documentation. DOD policy, guidance, and related best
practices53 state
that such documentation includes a test and evaluation master plan for the
program, as well as documentation that is system product (e.g., module,
application, system) and test type (e.g., integration, stress, regression,
developmental) specific. This documentation includes approved test plans,
test procedures and cases, and test results. According to DOD and other
guidance, test plans should include, among other things, objectives,
responsibilities, resources (tools, people, and facilities), schedules,
and performance and exit criteria; test procedures should include detailed
test scenarios, test events, steps, inputs, and expected outputs that are
traced back to requirements. Test results include the test scenarios that
passed and failed, assessments of deviations from test plans, and the
extent to which requirements have been met.
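As a simple illustration of how the completeness of this documentation can
be tracked, the sketch below checks each developmental test phase against
a minimal artifact set; the phase names and artifact statuses shown are
hypothetical, not NTCSS records.

    # Minimal sketch of a test documentation completeness check. The phase
    # names and artifact statuses below are hypothetical, not NTCSS records.
    REQUIRED_ARTIFACTS = ("test plan", "test procedures", "test results report")

    phases = {
        "integration test": {"test plan", "test results report"},
        "regression test": {"test results report"},
        "system test": {"test plan", "test procedures", "test results report"},
    }

    for phase, produced in phases.items():
        missing = [a for a in REQUIRED_ARTIFACTS if a not in produced]
        status = "complete" if not missing else "missing: " + ", ".join(missing)
        print(f"{phase}: {status}")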

The NTCSS test and evaluation master plan identified, among other things,
three phases of developmental testing for OOMA release 4.10. However, key
test documentation for each of these phases was not produced.
Specifically,

o For the first phase, a test report was produced that contained detailed
information on test results, but the program office has yet to provide us
with a test plan or test procedures.

o For the second phase, a test report was produced but it only contained
the number of defects found (organized by severity) and did not include
any other information on test results. Moreover, the program office has
yet to provide us with a test plan or test procedures.

o For the third phase, both a test plan and test report were produced, and
the plan included the test purpose and objectives, schedule,
responsibilities, and people resources, while the test report described
test issues and contained detailed test results. However, the program
office has yet to provide us with test procedures.

According to Navy officials, including the Assistant Program Manager and
officials responsible for developmental testing, the previously mentioned
poor application development practices contributed to the absence of
testing documentation. To address these poor practices, the program office
has developed a system life cycle plan that officials said incorporates
industry best practices, including those associated with testing
documentation. However, the program office has yet to provide us with
plans defining how these
practices will be implemented. Moreover, while the plan contains a
recommended list of testing documents (e.g., test plan, test procedures,
and test results report), our review of OOMA release 5.0 developmental
testing documentation shows that not all the documentation is being
prepared. Specifically, available documentation included an updated test
and evaluation master plan and two test reports. Documentation not yet
provided to us included test procedures, which would include documentation
tracing test cases to requirements.

The lack of a full set of developmental test documentation is a problem.
Without such documentation, the adequacy and reliability of developmental
testing cannot be substantiated, and thus the quality of the associated
system products is in doubt.

Central Design Agency Reports Management Improvements Are Under Way

In an effort to improve its performance on NTCSS and other programs,
central design agency officials told us that they chose to undergo an SEI
Capability Maturity Model Software Capability Appraisal in July and August
2005. Carnegie Mellon University's SEI, recognized for its expertise in
software and system processes, has developed the Capability Maturity
Model(TM) for Software (SW-CMM)54 to provide organizations with guidance
on how to gain control of their processes for developing and maintaining
software and how to evolve toward a culture of software engineering and
management excellence.

In brief, SW-CMM calls for assessing different process areas-clusters of
related activities such as project planning, requirements management, and
quality assurance-by determining whether key practices are implemented and
whether overarching goals are satisfied. Successful implementation of
these practices and satisfaction of these goals result in the achievement
of successive maturity levels. SW-CMM maturity levels range from 1 to 5:
level 1 means that the process is ad hoc and occasionally even chaotic,
with few processes defined and success depending on individual effort;
level 2 means that the process is repeatable; level 3 means that the
process is defined; level 4 means that the process is managed; and level 5
means that the process is optimized.
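A greatly simplified sketch of the rating logic follows; it is not the SEI
appraisal method itself, and the process areas and findings shown are
hypothetical. The idea is that an organization is rated at the highest
level for which all process areas at that level and below satisfy their
goals.

    # Greatly simplified sketch of SW-CMM rating logic (not the SEI appraisal
    # method). Process areas and findings below are hypothetical examples.
    LEVEL_NAMES = {1: "ad hoc", 2: "repeatable", 3: "defined",
                   4: "managed", 5: "optimized"}

    # Each process area belongs to a maturity level; "satisfied" records
    # whether the appraisal found its key practices and goals to be satisfied.
    process_areas = [
        {"name": "requirements management",   "level": 2, "satisfied": True},
        {"name": "software project planning", "level": 2, "satisfied": True},
        {"name": "peer reviews",              "level": 3, "satisfied": True},
        {"name": "quantitative management",   "level": 4, "satisfied": False},
    ]

    rating = 1
    for level in (2, 3, 4, 5):
        areas = [a for a in process_areas if a["level"] == level]
        if areas and all(a["satisfied"] for a in areas):
            rating = level
        else:
            break

    print(f"Maturity level {rating} ({LEVEL_NAMES[rating]})")  # level 3 (defined)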

According to the central design agency, it achieved a maturity rating of
level 3 against the SW-CMM based on 13 process areas: requirements
management, software project planning, software project
tracking and oversight, subcontract management, software quality
assurance, software configuration management, organizational process
focus, organizational process definition, training program, integrated
software management, software product engineering, intergroup
coordination, and peer reviews. Further, we were told that NTCSS was one
of the programs included in the review. However, we have yet to receive
the appraisal report to determine the extent to which the appraisal
addressed the weaknesses discussed in this report. Nevertheless, our
research has shown that properly performing such appraisals can be a
useful starting point for making software and system related development
improvements.

Conclusions

It is unclear whether the Navy's planned investment in NTCSS is warranted.
Of particular concern is the absence of reliable analysis showing that
further investment will produce future mission benefits commensurate with
estimated costs, as well as the void in information concerning whether the
deployed and operational components of NTCSS are actually producing
expected value. Compounding this uncertainty is the inherent risk of
defining and developing NTCSS outside the context of either a well-defined
DOD or Navy enterprise architecture. Without this information, the Navy
cannot determine whether NTCSS as defined, and as being developed, is the
right solution to meet its strategic business and technological needs.

Even if these uncertainties were to be addressed, and the Navy had the
data needed to demonstrate that NTCSS plans are the right course of
action, then the manner in which NTCSS is being defined, developed,
tested, measured, and overseen would still be of concern. While any one of
the concerns that we found is troubling, their combination subjects the
program to an unacceptably high risk of failure. These risks are already
being realized on NTCSS, as evidenced by the cancellation of one system
component and the repeated failure of another key component to pass
testing.

It is extremely important that Navy and DOD authorities responsible and
accountable for ensuring prudent use of limited resources reassess whether
allowing NTCSS to continue as planned is warranted. It is also important
that the decision on how to proceed be based on reliable data about
program cost, benefits, risk, and status.

Recommendations for Executive Action

We recommend that the Secretary of Defense direct the Secretary of the
Navy to determine if continued investment in NTCSS, as planned, represents
a prudent use of the department's limited resources. To accomplish this,
the Secretary of the Navy should direct the program office to take the
following three actions:

o collaborate with the Office of the Assistant Secretary of Defense for
Networks and Information Integration/Chief Information Officer, the Office
of Program Analysis and Evaluation, and the Naval Cost Analysis Division
to prepare a reliable economic analysis that encompasses all viable
alternatives, including the Navy's recent enterprise resource planning
program;

o ensure that development of this economic analysis (1) complies with cost
estimating best practices, including recognition of costs to resolve open
trouble reports and change proposals, and relevant OMB cost benefit
guidance and (2) incorporates available data on whether deployed NTCSS
capabilities are actually producing benefits; and

o collaborate with the Under Secretary of Defense for Acquisition,
Technology, and Logistics and the Under Secretary of Defense (Comptroller)
to ensure that NTCSS is adequately aligned with evolving DOD and Navy
enterprise architectures.

In addition, we recommend that the Secretary of Defense direct the
Secretary of the Navy to present the results of these analyses to the
Deputy Secretary of Defense, or his designee, and seek a departmental
decision on how best to proceed with the program. Until this is done, we
recommend that the Secretary of Defense direct the Secretary of the Navy
to halt further deployment of NTCSS and to limit future investment in
already deployed applications to essential operation and maintenance
activities and only developmental activities deemed essential to national
security needs.

If-based on reliable data-a decision is made to continue the NTCSS
program, we recommend that the Secretary of Defense direct the Secretary
of the Navy to ensure that the following two actions are taken:

o the NTCSS program implements effective program management activities,
including earned value management, requirements development and
management, and test management; and

o key stakeholders, such as the central design agency and the
developmental testing organization, have the people, processes, and tools
to effectively execute their respective roles and responsibilities.

Finally, we recommend that the Secretary of Defense reestablish the Office of
the Assistant Secretary of Defense for Networks and Information
Integration/Chief Information Officer as the milestone decision authority
and direct the Secretary of the Navy to take steps to ensure that Navy
oversight entities fulfill their roles and responsibilities on NTCSS,
including ensuring that reliable program reporting occurs and is acted
upon.

Agency Comments and Our Evaluation

In its written comments on our draft report, signed by the Deputy to the
Assistant Secretary of Defense for Networks and Information Integration
(Command, Control, Communications, Intelligence, Surveillance, and
Reconnaissance and Information Technology Acquisition) and reprinted in
appendix IV along with our detailed responses, DOD stated that some of our
findings are valid. For example, it acknowledged that NTCSS was defined
and implemented without a complete and formal enterprise architecture.
However, it also commented that our overall findings significantly
understated and misrepresented the program's level of discipline and
conformance with applicable guidance and direction. The department added
that NTCSS "has proven to be the right solution to meet the Navy's
strategic business and technological needs," and that sound program
management practices are in place and improving.

Neither DOD's comment about our overall findings nor its claims about
NTCSS being the right solution and being effectively managed are
adequately supported, as evidenced by the numerous factual instances that
we cite in the report where the Navy did not comply with either DOD
acquisition policies and guidelines or industry best practices.
Specifically, the report shows that the program's latest economic analysis
did not provide the Navy a reliable basis upon which to make investment
decisions. For example, the analysis did not include measurable,
quantifiable benefits for each alternative, and the cost estimates did not
meet six of the nine criteria associated with reliable cost estimates. The
analysis also was not independently reviewed in accordance with DOD
guidance, and the Navy had yet to demonstrate that already deployed NTCSS
Optimized applications are producing expected benefits. We appropriately
concluded that the Navy does not know whether the program as defined is
the right solution to meet the Navy's strategic business and technological
needs.

With respect to our recommendations, DOD fully concurred with two of the
recommendations and partially concurred with the remaining five
recommendations. The five areas of disagreement, DOD's basis for its
disagreement, and our response to DOD's position follow.

First, DOD stated that it does not see merit in conducting a formal
economic analysis for the NTCSS program that would address all viable
alternatives because, at this late stage, NTCSS is a "very mature
program," and the final application (OOMA) is about to be fielded.
Further, DOD said it saw no merit in seeking Office of Program Analysis
and Evaluation (PA&E) review of the economic analysis. Instead, it said
that it will "coordinate" with PA&E in analyzing the relationship of NTCSS
with other programs that may provide similar functionality and "brief the
results" to selected stakeholders.

We do not agree that NTCSS is a "very mature program." In particular, the
Navy still plans to spend an additional $348 million in fiscal years 2006
through 2009, which is approximately one-third of what has been spent on
the program to date. Further, there is no evidence to support the claim
that the OOMA application is about to be fielded. OOMA has failed
operational testing twice and is not yet fully developed or tested despite
the Navy's initial plan to field it in 2001. In addition, the Navy's
stated intention to develop an economic analysis for OOMA only and then,
separately, prepare an "analysis to determine the relationship" of NTCSS
and other alternative programs is not consistent with guidance and best
practice, which advocate basing such analyses on the full scope of the
planned investment. In addition, the proposal to limit key stakeholders'
involvement in developing the economic justification to "coordinating" and
"briefing" would be inappropriate. These stakeholders have specific
expertise and roles relative to economically justifying system investments
that should be exploited. Until it conducts a complete and disciplined
analysis of the entire NTCSS program (reviewed and approved by PA&E and
the Naval Cost Analysis Division) and provides this analysis to all key
stakeholders, the Navy's investment decisions will continue to be made
without complete and reliable data.

Second, the department stated that further deployment of NTCSS should not
be limited at this time. Nevertheless, it stated that it will use the
results of the analysis referred to above that depicts NTCSS's
relationship with other programs to provide appropriate direction to the
program. We do not agree that deployment should not be limited and would
note that the department's own comment acknowledges the need to decide on
an appropriate direction for the program. In our view, prudent use of
taxpayer resources warrants both a reliable economic analysis that can be
used to inform any decision on this direction and restraint in further
investment until an informed decision can be made.

Third, DOD said that the Navy does not need to be directed to ensure that
effective program management activities are implemented because it is
continuously improving program management activities. Further, DOD stated
that, although it is not required to implement an earned value management
system because the individual projects do not meet the dollar threshold
and there are no formal contract deliverables, it is nevertheless adhering
to the 32 earned value management criteria set forth in applicable
standards. The department added that it intends to have the Navy Inspector
General conduct a separate study to further ensure that the program is
using the best program management activities.

We do not agree with these comments. In particular, neither during our
review nor in its comments did the Navy provide evidence that it has
implemented effective program management activities or has improvements
under way. As we state in our report, neither the decomposition of the
program into small, fiscal year-based projects nor the absence of a
contractual relationship is a valid reason for not effectively
implementing earned value management. Further, the Navy's earned value
management self-assessment showed that it had not adhered to 17 of the 32
earned value management standards. Without reliable, timely, and auditable
earned value management data, the program office cannot adequately manage
technical, cost, and schedule risks and problems.

Fourth, the department stated that key stakeholders of the NTCSS program
have the necessary people, processes, and tools to effectively execute
their respective roles and responsibilities, noting in particular that the
central design agency has demonstrated its competency and capability and
was certified as SW-CMM maturity level 3. Nevertheless, the department
agreed to address this recommendation.

We support the Navy's stated commitment to address this recommendation. In
addition, we would note that DOD's comment that stakeholders have the
resources they need is not consistent with statements from stakeholders
during our review who said that there were manpower and resource
shortfalls that affected the oversight and execution of program
activities. Further, despite the Navy's statement that the central design
agency achieved SW-CMM maturity level 3, no documentation supporting this
statement, such as appraisal reports, was provided. Furthermore, Navy
officials told us that the central design agency did not have a
development testing lab and was therefore unable to effectively execute
testing activities.

Fifth, DOD stated that it is "premature" to reestablish the DOD Chief
Information Officer as the milestone decision authority as NTCSS
development is over 95 percent complete. Instead, it stated that existing
oversight entities would ensure that effective program management and
reporting was occurring.

We do not agree that elevating the milestone decision authority at this
time is premature based on the statement that the program is 95 percent
complete. For programs that have not been developed using industry best
practices and technical and management discipline, which is the case for
NTCSS, such claims of being essentially complete have historically proven
inaccurate because they are not grounded in reliable performance data.
Moreover, the Navy still plans to spend $348 million on NTCSS over the
next three fiscal years. Finally, as stated in our report, the current
milestone decision authority has allowed the program to operate unchecked
although a major application has repeatedly failed operational testing,
and another application was cancelled.

We are sending copies of this report to interested congressional
committees; the Director, Office of Management and Budget; the Secretary
of Defense; the Deputy Secretary of Defense; the Under Secretary of
Defense for Acquisition, Technology and Logistics; the Under Secretary of
Defense (Comptroller); the Assistant Secretary of Defense (Networks and
Information Integration)/Chief Information Officer; the Deputy Assistant
Secretary of the Navy for Command, Control, Communication, Computers and
Intelligence, and Space; the Program Executive Office for Command,
Control, Communication, Computers and Intelligence, and Space within the
Space and Naval Warfare Systems Command; the Department of the Navy Chief
Information Officer; and the Office of the Chief of Naval Operations for
Material Readiness and Logistics Operations. This report will also be
available at no charge on our Web site at http://www.gao.gov.

If you or your staff have any questions on matters discussed in this
report, please contact me at (202) 512-3439 or [email protected] . Contact
points for our Offices of Congressional Relations and Public Affairs may
be found on the last page of this report. GAO staff who made major
contributions to this report are listed in appendix V.

Randolph C. Hite
Director, Information Technology Architecture and Systems Issues

Appendix I: Objective, Scope, and Methodology

Our objective was to determine whether the Naval Tactical Command Support
System (NTCSS) is being managed according to important aspects of the
Department of Defense's (DOD) acquisition policies and guidance, as well
as other relevant acquisition management best practices. To accomplish our
objective, we focused on the program's (1) economic justification; (2)
architectural alignment; (3) program management, namely progress
measurement and reporting, funding disclosure, and oversight; and (4) key
system development activities, namely requirements development and
management, test management, and system maturity indicators. For
requirements and test management, we focused on the one NTCSS application
that is currently being acquired, known as the Optimized Organizational
Maintenance Activity (OOMA).

To determine whether the Navy has economically justified its investment in
NTCSS, we reviewed the latest economic analysis to determine the basis for
the cost and benefit estimates and net present value calculations. This
included evaluating the analysis against DOD and Office of Management and
Budget (OMB) guidance, as well as relevant best practices.1 It also
included interviewing program officials, including the Assistant Program
Manager, as well as officials from the office of the Deputy Assistant
Secretary of the Navy for Command, Control, Communication, Computers and
Intelligence, and Space; the Office of Program Analysis and Evaluation;
and the Naval Cost Analysis Division as to their respective roles,
responsibilities, and actual
efforts in developing and/or reviewing the economic analysis. In addition,
we interviewed the Assistant Program Manager and the office of the
Deputy Assistant Secretary of the Navy for Command, Control,
Communication, Computers and Intelligence, and Space about the purpose and
use of the analysis for managing the Navy's investment in the NTCSS
program, including the extent to which measures and metrics showed that
projected benefits in the economic analysis were actually being realized.
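For context, a net present value calculation of the kind evaluated here
discounts each year's benefits and costs to present value, consistent with
OMB guidance on benefit-cost analysis. The sketch below shows that
arithmetic using entirely hypothetical cash flows and discount rate.

    # Minimal sketch of a net present value (NPV) calculation: discount each
    # year's (benefits - costs) back to the present. The cash flows and the
    # 3 percent discount rate are hypothetical examples, not NTCSS figures.
    discount_rate = 0.03
    # (benefits, costs) by year, in millions of dollars; year 0 is the base year.
    cash_flows = [(0.0, 40.0), (15.0, 25.0), (30.0, 10.0), (45.0, 5.0)]

    npv = sum((benefits - costs) / (1 + discount_rate) ** year
              for year, (benefits, costs) in enumerate(cash_flows))
    print(f"Net present value: ${npv:.1f} million")   # about $5.7 million here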

To determine whether the Navy has aligned NTCSS to either the DOD business
enterprise architecture2 or a Navy architecture, we relied on our prior
reports addressing DOD and Navy architecture development and
implementation efforts, a memo and analysis results on NTCSS's compliance
with the business enterprise architecture, and documents on the Navy's
architecture efforts. We also interviewed Navy officials from the program
office; the office of the Deputy Assistant Secretary of the Navy for
Command, Control, Communication, Computers and Intelligence, and Space;
the office of the Navy Research, Development, and Acquisition Chief
Engineer; and the Office of the Assistant Secretary of Defense for
Networks and Information Integration/Chief Information Officer about DOD
and Navy architecture efforts and NTCSS's alignment to them.

To determine whether the Navy was effectively measuring, reporting, and
overseeing the program, we did the following:

o We first asked the central design agency to self-assess its
satisfaction of 32 best practice criteria regarding its earned value
management system. Using the results of this self-assessment to target
our analysis, we then assessed those aspects of the earned value
management system the self-assessment reported as meeting the criteria by
comparing the documentation with relevant DOD guidance and best
practices.3 We selected two projects as case studies to determine the
degree to which earned value management was being implemented: (1) the
2004 OOMA software project and (2) the 2004 NTCSS hardware installation
and integration project (for both OOMA and Optimized NTCSS). We selected
these two because they were the projects for which the Navy provided us
the most earned value management related documentation. To understand the
Navy's reasons for not performing certain elements of earned value
management, we interviewed officials including the Assistant Program
Manager, and officials at the central design agency in Norfolk and the
in-service engineering agency in Charleston.

o To assess reporting capabilities, we reviewed program documentation such
as Acquisition Program Baselines, program deviation reports, and Defense
Acquisition Executive Summary reports. We also reviewed information and
documentation on the Rapid Improvement Team pilot Web site, including a
report that assesses the current health of the program on a monthly basis
and a report that addresses risks for different projects within the program.

o To assess compliance with budget policies and guidance, we compared
NTCSS budget documentation with DOD and Navy financial management policies
and guidance.

o To assess oversight of the program, we interviewed the program manager,
milestone decision authority, functional sponsor, Navy Chief Information
Officer, and a representative of the program's executive steering
committee.

To determine whether the Navy was effectively managing key system
development activities, namely requirements management, testing, and
system maturity indicators, we did the following:

o To assess requirements development and management capabilities, we
reviewed program documentation such as the official list of requirements
and system specifications, and evaluated them against relevant best
practices4 for several characteristics including traceability and
prioritization. We attempted to trace requirements to both higher level
documents and lower level specifications. We also attended the NTCSS Forum
where requirements were gathered and discussed. We interviewed Navy
officials such as the Assistant Program Manager, Commanding Officer and
Executive Director of the central design agency, and the OOMA Functional
Manager to discuss their roles and responsibilities for developing and
managing requirements.

o To assess test management, we reviewed program documentation such as the
test and evaluation master plan, test plans, test reports, and guidance.
We then compared these documents with DOD guidance and best practices and
focused on the effectiveness of developmental testing and the adequacy of
developmental testing documentation.5 Our review also included an audit
report prepared by the Naval Audit Service6 and a test report prepared by
Navy's independent operational test organization.7
We interviewed Navy officials such as the Assistant Program Manager,
Commanding Officer and Executive Director of the central design agency,
OOMA Functional Manager, and an official in the office of the Director of
Navy Test and Evaluation and Technology Requirements to discuss their
roles and responsibilities for test management.

We did not independently validate information on the program's cost and
budget or the number of trouble reports and change proposals.

We conducted our work at DOD headquarters in Arlington, Virginia; at Space
and Naval Warfare Center, San Diego, California; Space and Naval Warfare
Systems Center, Norfolk, Virginia; and Naval Air Systems Command in
Patuxent River, Maryland. We performed our work from September 2004
through November 2005 in accordance with generally accepted government
auditing standards.

Appendix II: Trouble Reports and Change Proposals Assessment

One indicator of system quality, and thus the effectiveness of the
development activities used to produce system products, is the volume and
significance of system problems and change proposals. For the Naval
Tactical Command Support System (NTCSS), trouble reports are prepared to
document system defects, and change proposals are prepared to introduce
additional system functionality. Priority levels are assigned to trouble
reports and change proposals, with 1 being the most critical and 5 being
the least critical. Table 11 defines the 5 priority levels.

Table 11: NTCSS Trouble Report and Change Proposal Priorities

Priority level Definition                                                  
Priority 1     Prevents the accomplishment of an operational or            
                  mission-essential capability; and jeopardizes safety or     
                  security.                                                   
Priority 2     Adversely affects the accomplishment of an operational or   
                  mission-essential capability, and no work-around solution   
                  is available.                                               
Priority 3     Adversely affects the accomplishment of an operational or   
                  mission-essential capability, but a work-around solution is 
                  available.                                                  
Priority 4     Results in user/operator inconvenience or annoyance but     
                  does not affect a required operational or mission-essential 
                  capability.                                                 
Priority 5     Any other effect.                                           

Source: Navy.
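The trends discussed below are based on tallies of open items by priority
level. The sketch below shows the kind of tally involved, using
hypothetical trouble report records rather than actual NTCSS data.

    from collections import Counter

    # Hypothetical trouble report records ("open" means not yet resolved);
    # illustrative only, not actual NTCSS data.
    trouble_reports = [
        {"id": "TR-0001", "priority": 2, "open": True},
        {"id": "TR-0002", "priority": 3, "open": True},
        {"id": "TR-0003", "priority": 2, "open": False},
        {"id": "TR-0004", "priority": 1, "open": True},
        {"id": "TR-0005", "priority": 5, "open": True},
    ]

    # Count open reports by priority level (1 = most critical, 5 = least).
    open_by_priority = Counter(tr["priority"] for tr in trouble_reports if tr["open"])
    for priority in range(1, 6):
        print(f"Priority {priority}: {open_by_priority.get(priority, 0)} open")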

Available data on the number and significance of open trouble reports and
change proposals over the last 2 years do not demonstrate that NTCSS
overall is a high-quality system that is delivering promised or expected
capabilities. Specifically, the data shows that hundreds of open (yet to
be resolved) trouble reports and change proposals have continued to affect
the system.

Trouble Reports

The total number of NTCSS priority 1, 2, and 3 trouble reports has stayed
about the same over the last 2 years-totaling about 700. Of this total,
NTCSS priority 1 and 2 trouble reports have decreased by 117, with
priority 1 trouble reports being virtually eliminated. While this is
movement in a positive direction, about 300 priority 2 trouble reports
still remain open and these by definition are adversely affecting
accomplishment of an operational or mission-essential capability. (See
figs. 1 and 2.)

Figure 1: Total Number of Open NTCSS and OOMA Priority 1, 2, and 3 Trouble
Reports

Figure 2: Open NTCSS Priority 1 and 2 Trouble Reports

Further, open priority 3 trouble reports have increased during this time
to about 250 and, given that priority 3s require work-arounds, they
decrease system capability and performance. Neither the number of priority
2 trouble reports, which continue to be in the hundreds, nor the upward
trend in priority 3 trouble reports is indicative of a maturing,
high-quality system. (See fig. 3.)

Figure 3: Open NTCSS Priority 3 Trouble Reports

With respect to the OOMA application in particular, the trend in the
volume of significant trouble reports shows that this application is
particularly problematic. Specifically, while priority 1 OOMA open trouble
reports have been virtually eliminated, the number of open priority 2 OOMA
trouble reports has risen significantly from 12 to 90 in about the last 2
years. (See fig. 4.)

Figure 4: Open OOMA Priority 1 and 2 Trouble Reports

Moreover, the number of open OOMA priority 3 trouble reports has not
significantly declined over the last 2 years, with these remaining at
roughly 160. (See fig. 5.)

Figure 5: Open OOMA Priority 3 Trouble Reports

Change Proposals

The picture for NTCSS change proposals is similar to that for trouble
reports. Specifically, the total number of open NTCSS priority 1, 2, and 3
change proposals has increased over the last 2 years-going from about 325
to 425. Of this total, NTCSS priority 2 change proposals have increased by
72, with 247 priority 2 proposals still being open. (See figs. 6 and 7.)

Figure 6: Total Number of Open NTCSS and OOMA Priority 1, 2, and 3 Change
Proposals

Figure 7: Open NTCSS Priority 1 and 2 Change Proposals

Further, NTCSS priority 3 change proposals have increased during this time
to about 81, and given that priority 3 change proposals require current
work-arounds, this is not a positive trend. (See fig. 8.)

Figure 8: Open NTCSS Priority 3 Change Proposals

With respect to OOMA specifically, the number of open priority 2 change
proposals has risen slightly from 7 to 12. (See fig. 9.) Similarly, the
number of open priority 3 change proposals has also increased somewhat
from 78 to 97. (See fig. 10.) While the number of priority 2 change
proposals is not large, the trend in these, as well as the trend in the
more significant number of priority 3 change proposals, is not consistent
with those of a maturing system.

Figure 9: Open OOMA Priority 1 and 2 Change Proposals

Figure 10: Open OOMA Priority 3 Change Proposals

Appendix III: Earned Value Management Assessment

Earned value management (EVM) guidance was developed by the American
National Standards Institute/Electronic Industries Alliance.1 This
guidance identifies 32 criteria that reliable EVM systems should meet. The
32 criteria are organized into the following five categories:

o Organization: Activities that define the scope of the effort and assign
responsibilities for the work;

o Planning and budgeting: Activities for planning, scheduling, budgeting,
and authorizing the work;

o Accounting: Activities to accumulate the costs of work and material
needed to complete the work;

o Analysis: Activities to compare budgeted, performed, and actual costs;
analyze variances; and develop estimates of final costs; and

o Revisions and data maintenance: Activities to incorporate internal and
external changes to the scheduled, budgeted, and authorized work.
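Although the 32 criteria address the full management system, the core
quantities an EVM system compares are planned value, earned value, and
actual cost. The sketch below shows the standard variance and index
calculations built on those quantities; the dollar figures are
hypothetical.

    # Standard earned value management (EVM) calculations. The dollar figures
    # are hypothetical; the formulas are the conventional ones applied to EVM
    # data.
    bac = 10_000_000   # budget at completion
    pv = 4_000_000     # planned value: budgeted cost of work scheduled to date
    ev = 3_500_000     # earned value: budgeted cost of work actually performed
    ac = 4_200_000     # actual cost of the work performed

    cost_variance = ev - ac        # negative means over cost (-700,000 here)
    schedule_variance = ev - pv    # negative means behind schedule (-500,000)
    cpi = ev / ac                  # cost performance index (about 0.83)
    spi = ev / pv                  # schedule performance index (0.88)
    eac = bac / cpi                # a common estimate at completion ($12 million)

    print(f"CV = {cost_variance:,}  SV = {schedule_variance:,}")
    print(f"CPI = {cpi:.2f}  SPI = {spi:.2f}  EAC = {eac:,.0f}")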

NTCSS central design agency (CDA) officials provided a self-assessment of
their compliance with each of the criteria, reporting that they met 15 of
the 32 criteria (see table 12). Using the results of their self-assessment
to target our analysis, we then assessed those aspects of the EVM system
the self-assessment reported as meeting the criteria, by comparing the
documentation with relevant Department of Defense (DOD) guidance and best
practices.2 Our assessment indicates that the NTCSS program satisfied two,
and partially satisfied one, of the 32 criteria (see table 12).3

Table 12: Navy Satisfaction of EVM Criteria

                                        

      Criteriaa         Definitions        Self-        GAO       GAO analysis   
                                         assessment  assessment  
  Organization                                                   
  Define the        The work breakdown   Yes         Yes         The EVM reports 
  authorized work   structure is a                               for the OOMA    
  elements for the  direct                                       software        
  program. A work   representation of                            development     
  breakdown         the work scope in                            project and the 
  structure,        the project,                                 NTCSS hardware  
  tailored for      documenting the                              installation    
  effective         hierarchy and                                project had a   
  internal          description of tasks                         work breakdown  
  management        to be performed and                          structure.      
  control, is       the relationship to                          
  commonly used in  the product                                  
  this process.     deliverables. The                            
                    work breakdown                               
                    structure breaks                             
                    down all authorized                          
                    work scope into                              
                    appropriate elements                         
                    for planning,                                
                    budgeting,                                   
                    scheduling, cost                             
                    accounting, work                             
                    authorization,                               
                    measuring progress,                          
                    and management                               
                    control. It also                             
                    ensures the                                  
                    statement of work is                         
                    entirely captured                            
                    and allows for                               
                    integration of                               
                    technical, schedule,                         
                    and cost                                     
                    information.                                 
  Identify the      The organizational   Yes         No          CDA officials   
  program           structure identifies                         have yet to     
  organizational    the organization                             provide         
  breakdown         responsible for each                         documentation   
  structure,        segment of work,                             to demonstrate  
  including the     including                                    satisfaction of 
  major             subcontracted and                            this criterion. 
  subcontractors    intra-organizational                         Such            
  responsible for   effort. In order to                          documentation   
  accomplishing the meet this guideline,                         includes an     
  authorized work,  objective evidence                           organizational  
  and define the    requires a work                              breakdown       
  organizational    breakdown structure                          structure with  
  elements in which intersection with an                         detail          
  work will be      organizational                               regarding       
  planned and       breakdown structure.                         subcontractors. 
  controlled.                                                    
  Provide for the   The integration of   Yes         No          The CDA has yet 
  integration of    planning,                                    to provide      
  the company's     scheduling,                                  documentation   
  planning,         budgeting, work                              to demonstrate  
  scheduling,       authorization, and                           satisfaction of 
  budgeting, work   cost accumulation                            this criterion. 
  authorization,    management processes                         Such            
  and cost          provides the                                 documentation   
  accumulation      capability for                               includes copies 
  processes with    establishing the                             of master,      
  each other and,   performance                                  intermediate,   
  as appropriate,   measurement                                  and detailed    
  the program work  baseline,                                    schedules;      
  breakdown         identifying work                             operational     
  structure and the progress, and                                schedules;      
  program           collecting of actual                         control account 
  organizational    costs for management                         plans;          
  structure.        analysis and                                 performance     
                    corrective actions.                          reports by work 
                                                                 breakdown       
                                                                 structure and   
                                                                 organizational  
                                                                 breakdown       
                                                                 structure;      
                                                                 responsibility  
                                                                 assignment      
                                                                 matrix;         
                                                                 statement of    
                                                                 work; work      
                                                                 authorization   
                                                                 documents; and  
                                                                 work breakdown  
                                                                 structure and   
                                                                 organizational  
                                                                 breakdown       
                                                                 structure       
                                                                 documentation.  
  Identify the      Visibility into      No          No          We did not      
  company           direct and indirect                          analyze this    
  organization or   costs is essential                           criterion       
  function          for successful                               because it was  
  responsible for   management of a                              self-assessed   
  controlling       project. Therefore,                          by the CDA as   
  overhead          project managers                             not being met.  
  (indirect costs). should clearly                               
                    identify managers                            
                    who are responsible                          
                    for controlling                              
                    indirect costs,                              
                    including overhead,                          
                    burden, general and                          
                    administrative                               
                    costs, and who has                           
                    authority to approve                         
                    expenditure of                               
                    resources. They                              
                    should also document                         
                    the process for                              
                    management and                               
                    control of indirect                          
                    costs.                                       
  Provide for       The integration of   No          No          We did not      
  integration of    the work breakdown                           analyze this    
  the program work  structure and                                criterion       
  breakdown         organizational                               because it was  
  structure and the breakdown structure                          self-assessed   
  program           establishes where                            by the CDA as   
  organizational    the performance                              not being met.  
  structure in a    measurement                                  
  manner that       necessary for                                
  permits cost and  project management                           
  schedule          is performed. This                           
  performance       intersection results                         
  measurement by    in designation of a                          
  elements of       focal point for                              
  either or both    management control                           
  structures, as    (the control account                         
  needed.           manager). It is also                         
                    the initiation point                         
                    for work                                     
                    authorization,                               
                    performance                                  
                    management, and                              
                    performance                                  
                    measurement. The                             
                    control account                              
                    manager identifies                           
                    the plan for work                            
                    task accomplishment,                         
                    including defining                           
                    the effort required,                         
                    cost elements                                
                    (labor, material,                            
                    etc.), and the                               
                    resources required                           
                    to do the job.                               
  Planning and                                                   
  budgeting                                                      
  Schedule the      The scheduling of    Yes         Yes         Detailed        
  authorized work   authorized work                              schedule        
  in a manner that  facilitates                                  documents for   
  describes the     effective planning,                          both projects   
  sequence of work  reporting, and                               describe the    
  and identifies    forecasting, which                           sequence and    
  significant task  is critical to the                           interdependence 
  interdependencies success of all                               of work         
  required to meet  projects. An                                 relative to     
  the program       integrated network                           project         
  requirements.     scheduling system                            requirements.   
                    has distinct tasks                           
                    that can be                                  
                    summarized by work                           
                    breakdown structure                          
                    and organizational                           
                    breakdown structure                          
                    identifiers to track                         
                    progress and measure                         
                    performance.                                 
  Identify physical Objective indicators Yes         No          The metrics in  
  products,         enable measurement                           the NTCSS       
  milestones,       of work                                      hardware        
  technical         accomplished,                                installation    
  performance       thereby allowing                             project reports 
  goals, or other   accurate comparison                          contained       
  indicators that   to planned work.                             unexpectedly    
  will be used to   Meaningful                                   and             
  measure progress. performance metrics                          unrealistically 
                    enable better                                large           
                    management insight                           improvements in 
                    and decision making,                         performance     
                    allowing maximum                             that were not   
                    time for management                          explained. In   
                    action to keep the                           addition, the   
                    project on plan.                             program office  
                                                                 told us that    
                                                                 the measurement 
                                                                 data for the    
                                                                 OOMA software   
                                                                 project is      
                                                                 distorted due   
                                                                 to numerous     
                                                                 baseline        
                                                                 changes and     
                                                                 requirements    
                                                                 changes.        
                                                                 Satisfying this 
                                                                 criterion       
                                                                 requires valid  
                                                                 data.           
  Establish and     The assignment of    No          No          We did not      
  maintain a        budgets to scheduled                         analyze this    
  time-phased       segments of work                             criterion       
  budget baseline,  produces a plan                              because it was  
  at the control    against which actual                         self-assessed   
  account level,    performance can be                           by the CDA as   
  against which     compared. This is                            not being met.  
  program           called the                                   
  performance can   performance                                  
  be measured.      measurement                                  
  Budget for        baseline. The                                
  far-term efforts  establishment,                               
  may be held in    maintenance, and use                         
  higher-level      of the performance                           
  accounts until an measurement baseline                         
  appropriate time  are indispensable to                         
  for allocation at effective program                            
  the control       management.                                  
  account level.                                                 
  Initial budgets                                                
  established for                                                
  performance                                                    
  measurement will                                               
  be based on                                                    
  either internal                                                
  management goals                                               
  or the external                                                
  customer                                                       
  negotiated target                                              
  cost, including                                                
  estimates for                                                  
  authorized but                                                 
  undefinitized                                                  
  work. On                                                       
  government                                                     
  contracts, if an                                               
  over-target                                                    
  baseline is used                                               
  for performance                                                
  measurement                                                    
  reporting                                                      
  purposes, prior                                                
  notification must                                              
  be provided to                                                 
  the customer.                                                  
  Establish budgets An essential part of No          No          We did not      
  for authorized    project planning and                         analyze this    
  work with         establishing a                               criterion       
  identification of performance                                  because it was  
  significant cost  measurement baseline                         self-assessed   
  elements (e.g.,   is the establishment                         by the CDA as   
  labor and         of budgets for all                           not being met.  
  material) as      work authorized.                             
  needed for        Identification of                            
  internal          the budget cost                              
  management and    elements documents                           
  for control of    the required                                 
  subcontractors.   resources and                                
                    integrates the work                          
                    scope with the                               
                    performing                                   
                    organization.                                
  To the extent it  The effort contained Yes         No          The CDA has yet 
  is practical to   within a control                             to provide      
  identify the      account is                                   documentation   
  authorized work   distributed into                             to demonstrate  
  in discrete work  either work packages                         satisfaction of 
  packages,         or planning                                  this criterion. 
  establish budgets packages. Work                               Such            
  for this work in  packages are single                          documentation   
  terms of dollars, tasks, assigned to a                         includes        
  hours, or other   performing                                   control account 
  measurable units. organization for                             plans divided   
  Where the entire  completion, and                              into work and   
  control account   should be natural                            planning        
  is not subdivided subdivisions of                              packages, or    
  into work         control account                              control account 
  packages,         effort resulting in                          schedules and   
  identify the      a definable end                              time-phased     
  far-term effort   product or event.                            budgets.        
  in larger         Budgets established                          
  planning packages at the work package                          
  for budget and    level provide the                            
  scheduling        detail for effective                         
  purposes.         execution of the                             
                    baseline plan. This                          
                    approach provides                            
                    meaningful product                           
                    or                                           
                    management-oriented                          
                    events for                                   
                    performance                                  
                    measurement.                                 
  Provide that the  The integrity of the No          No          We did not      
  sum of all work   performance                                  analyze this    
  package budgets,  measurement baseline                         criterion       
  plus planning     is maintained when                           because it was  
  package budgets   the budget of the                            self-assessed   
  within a control  control account                              by the CDA as   
  account, equals   equals the sum of                            not being met.  
  the control       its work and                                 
  account budget.   planning package                             
                    budgets. This                                
                    prevents duplicate                           
                    recording of                                 
                    budgets.                                     
  Identify and      Meaningful events    No          No          We did not      
  control the level are critical for                             analyze this    
  of effort         performance                                  criterion       
  activity by       measurement.                                 because it was  
  time-phased       Measurement of level                         self-assessed   
  budgets           of effort activity                           by the CDA as   
  established for   provides no                                  not being met.  
  this purpose.     visibility into                              
  Only that effort  actual performance.                          
  that is           Level of effort                              
  unmeasurable or   activity is defined                          
  for which         as having no                                 
  measurement is    measurable output or                         
  impractical may   product at the work                          
  be classified as  package level and,                           
  level of effort.  therefore, must be                           
                    limited to avoid                             
                    distorting project                           
                    performance data.                            
  Establish         Indirect costs are   No          No          We did not      
  overhead budgets  for common                                   analyze this    
  for each          activities that                              criterion       
  significant       cannot be                                    because it was  
  organizational    specifically                                 self-assessed   
  component of the  identified with a                            by the CDA as   
  company for       particular project                           not being met.  
  expenses that     or activity and                              
  will become       should typically be                          
  indirect costs.   budgeted and                                 
  Reflect in the    controlled                                   
  program budgets,  separately at the                            
  at the            functional or                                
  appropriate       organization manager                         
  level, the        level. It is                                 
  amounts in        important to have an                         
  overhead accounts indirect budgeting                           
  that are planned  and forecasting                              
  to be allocated   process because                              
  to the program as indirect costs                               
  indirect costs.   account for a major                          
                    portion of the cost                          
                    of any project. As                           
                    such, the budgetary                          
                    control and                                  
                    management of this                           
                    category cannot be                           
                    overlooked or                                
                    minimized.                                   
  Identify          Project managers     No          No          We did not      
  management        need to realize the                          analyze this    
  reserves and      performance                                  criterion       
  undistributed     measurement baseline                         because it was  
  budget.           planning process                             self-assessed   
                    contains risk and                            by the CDA as   
                    identify a                                   not being met.  
                    management reserve                           
                    contingency for                              
                    unplanned activity                           
                    within the project                           
                    scope.                                       
  Provide that the  A project baseline   No          No          We did not      
  program target    that reflects the                            analyze this    
  cost goal is      common agreement                             criterion       
  reconciled with   between the two                              because it was  
  the sum of all    parties provides a                           self-assessed   
  internal program  common reference                             by the CDA as   
  budgets and       point for progress                           not being met.  
  management        assessment. It                               
  reserves.         provides recognition                         
                    of contractual                               
                    requirements and                             
                    precludes                                    
                    unauthorized changes                         
                    to the performance                           
                    measurement                                  
                    baseline.                                    
  Accounting                                                     
  considerations                                                 
  Record direct     A project            No          No          We did not      
  costs in a manner cost-charging                                analyze this    
  consistent with   structure                                    criterion       
  the budgets in a  established in the                           because it was  
  formal system     accounting system                            self-assessed   
  controlled by the ensures that actual                          by the CDA as   
  general books of  direct costs are                             not being met.  
  account.          accumulated and                              
                    reported in a manner                         
                    consistent with the                          
                    way the work is                              
                    planned and                                  
                    budgeted.                                    
  When a work       Actual costs need to No          No          We did not      
  breakdown         be available at all                          analyze this    
  structure is      levels of the work                           criterion       
  used, summarize   breakdown structure                          because it was  
  direct costs from to support project                           self-assessed   
  control accounts  management with                              by the CDA as   
  into the work     performance                                  not being met.  
  breakdown         measurement data.                            
  structure without Cost collection                              
  allocation of a   accounts mapped to                           
  single control    the work breakdown                           
  account to two or structure ensure                             
  more work         performance                                  
  breakdown         measurement data                             
  structure         integrity.                                   
  elements.                                                      
  Summarize direct  To ensure            No          No          We did not      
  costs from the    performance                                  analyze this    
  control accounts  measurement data                             criterion       
  into the          integrity, actual                            because it was  
  contractor's      costs need to be                             self-assessed   
  organizational    available at all                             by the CDA as   
  elements without  levels of the                                not being met.  
  allocation of a   organizational                               
  single control    breakdown structure.                         
  account to two or                                              
  more                                                           
  organizational                                                 
  elements.                                                      
  Record all        All indirect costs   No          No          We did not      
  indirect costs    should be recorded                           analyze this    
  that will be      in the accounting                            criterion       
  allocated to the  system. Allocating                           because it was  
  project.          indirect costs to                            self-assessed   
                    the appropriate                              by the CDA as   
                    direct costs assures                         not being met.  
                    that all projects                            
                    benefiting from                              
                    indirect costs                               
                    receive their fair                           
                    share.                                       
  Identify unit     A manufacturing      Yes         No          The CDA has not 
  costs, equivalent accounting system                            yet provided    
  unit costs, or    capable of isolating                         documentation   
  lot costs when    unit and lot costs                           to demonstrate  
  needed.           in a production                              satisfaction of 
                    environment allows                           this criterion. 
                    the flexibility to                           Such            
                    plan, measure                                documentation   
                    performance, and                             includes a      
                    forecast in a more                           manufacturing   
                    efficient way when                           resource        
                    there are multiple                           planning        
                    projects in the same                         project cost    
                    production line.                             collection      
                                                                 structure or an 
                                                                 enterprise      
                                                                 resource        
                                                                 planning system 
                                                                 that supports   
                                                                 the             
                                                                 identification  
                                                                 of unit costs,  
                                                                 equivalent unit 
                                                                 costs, or lot   
                                                                 costs when      
                                                                 needed          
                                                                 including       
                                                                 differentiation 
                                                                 of work in      
                                                                 process.        
  For EVM, the      Material items       No          No          We did not      
  material          consumed in the                              analyze this    
  accounting system production of                                criterion       
  will provide (1)  project deliverables                         because it was  
  accurate cost     are accounted for                            self-assessed   
  accumulation and  and progress is                              by the CDA as   
  assignment of     measured at the                              not being met.  
  costs to control  point most closely                           
  accounts in a     aligned to the                               
  manner consistent actual consumption.                          
  with the budgets  Material accounting                          
  using recognized, systems should                               
  acceptable,       adhere to these                              
  costing           three                                        
  techniques; (2)   characteristics: (1)                         
  cost performance  the material                                 
  measurement at    accounting system                            
  the point in time provides full                                
  most suitable for accountability and                           
  the category of   effective                                    
  material          measurement of all                           
  involved, but no  material purchased;                          
  earlier than the  (2) material costs                           
  time of progress  should be accurately                         
  payments or       charged to control                           
  actual receipt of accounts using                               
  material; and (3) recognized,                                  
  full              acceptable costing                           
  accountability of techniques; and (3)                          
  all material      when necessary, the                          
  purchased for the use of estimated                             
  program,          actual costs to                              
  including the     ensure accurate                              
  residual          performance                                  
  inventory.        measurement should                           
                    be used.                                     
  Analysis and      
  management        
  reports           
  At least on a     Visibility into      Yes         No          In order to     
  monthly basis,    project performance                          produce         
  generate the      helps the project                            reliable and    
  following         manager focus                                accurate        
  information at    resources on those                           variance        
  the control       areas in need of                             reports, many   
  account and other attention. Accurate                          of the          
  levels as         and reliable EVM                             aforementioned  
  necessary for     data supports                                criteria that   
  management        management control                           our analysis    
  control using     needs by allowing                            showed the CDA
  actual cost data  the project manager                          did not perform
  from, or          to identify root                             must be
  reconcilable      causes for variances                         satisfied.      
  with, the         and establish                                Therefore, this 
  accounting        actions to minimize                          criterion is    
  system: (1)       impact at the                                not being       
  comparison of the control account                              satisfied.      
  amount of planned level.                                       
  budget and the                                                 
  amount of budget                                               
  earned for work                                                
  accomplished                                                   
  (this comparison                                               
  provides the                                                   
  schedule                                                       
  variance) and (2)                                              
  comparison of the                                              
  amount of the                                                  
  budget earned and                                              
  the actual                                                     
  (applied where                                                 
  appropriate)                                                   
  direct costs for                                               
  the same work                                                  
  (this comparison                                               
  provides the cost                                              
  variance).                                                     
  Identify, at      The analysis of      Yes         No          The metrics in  
  least monthly,    deviations from plan                         the NTCSS       
  the significant   for both schedule                            hardware        
  differences       and cost at least                            installation    
  between both      monthly provides                             project reports 
  planned and       management at all                            contained       
  actual schedule   levels the ability                           unexpectedly    
  performance and   to rapidly and                               and             
  planned and       effectively                                  unrealistically 
  actual cost       implement corrective                         large           
  performance and   actions with an                              improvements in 
  provide the       understanding of the                         performance     
  reasons for the   project risk and                             that were not   
  variances in the  causes of risk.                              explained. In   
  detail needed by                                               addition, the   
  program                                                        program office  
  management.                                                    told us that    
                                                                 the measurement 
                                                                 data for the    
                                                                 OOMA software   
                                                                 project is      
                                                                 distorted due   
                                                                 to numerous     
                                                                 baseline        
                                                                 changes and     
                                                                 requirements    
                                                                 changes.        
                                                                 Satisfying this 
                                                                 criterion       
                                                                 requires valid  
                                                                 data.           
  Identify budgeted Ongoing indirect     No          No          We did not      
  and applied (or   cost analysis                                analyze this    
  actual) indirect  provides visibility                          criterion       
  costs at the      into potential                               because it was  
  level and         indirect cost                                self-assessed   
  frequency needed  overruns and the                             by the CDA as   
  by management for opportunity to                               not being met.  
  effective         develop and                                  
  control, along    implement management                         
  with the reasons  action plans to meet                         
  for any           project objectives.                          
  significant                                                    
  variances.                                                     
  Summarize the     Variances provide an No          No          We did not      
  data elements and understanding of the                         analyze this    
  associated        conditions, allowing                         criterion       
  variances through the project manager                          because it was  
  the program       to properly allocate                         self-assessed   
  organization      available resources                          by the CDA as   
  and/or work       to mitigate project                          not being met.  
  breakdown         risk. They also                              
  structure to      identify significant                         
  support           problem areas from                           
  management needs  all levels of the                            
  and any customer  organization and                             
  reporting         project scope of                             
  specified in the  work, derived from                           
  project.          the same data                                
                    sources. Thus,                               
                    variances provide                            
                    valuable management                          
                    information.                                 
  Implement         Earned value data    Yes         No          The metrics in  
  managerial        must be utilized by                          the NTCSS       
  actions taken as  all levels of                                hardware        
  the result of     management for                               installation    
  earned value      effective project                            project reports 
  information.      execution. Because                           contained       
                    of this, the data                            unexpectedly    
                    produced by the EVM                          and             
                    system must be                               unrealistically 
                    available to                                 large           
                    managers on a timely                         improvements in 
                    basis and must be of                         performance     
                    sufficient quality                           that were not   
                    to ensure that                               explained. In   
                    effective management                         addition, the   
                    decisions can be                             program office  
                    made as a result of                          told us that    
                    its analysis.                                the measurement 
                                                                 data for the    
                                                                 OOMA software   
                                                                 project is      
                                                                 distorted due   
                                                                 to numerous     
                                                                 baseline        
                                                                 changes and     
                                                                 requirements    
                                                                 changes.        
                                                                 Satisfying this 
                                                                 criterion       
                                                                 requires valid  
                                                                 data.           
  Develop revised   Estimates at         No          No          We did not      
  estimates of cost completion based on                          analyze this    
  at completion     predictive                                   criterion       
  based on          performance measures                         because it was  
  performance to    increase the                                 self-assessed   
  date, commitment  probability that the                         by the CDA as   
  values for        project can be                               not being met.  
  material, and     executed within the                          
  estimates of      reported estimates                           
  future            at completion. When                          
  conditions.       estimates at                                 
  Compare this      completion are
  information with  analyzed at least                            
  the performance   monthly and updated                          
  measurement       as required, the                             
  baseline to       robustness of the                            
  identify          financial reporting                          
  variances at      requirements is                              
  completion        enhanced, thereby                            
  important to      reducing the                                 
  company           potential for                                
  management and    surprises. Monthly                           
  any applicable    estimates at                                 
  customer          completion reviews                           
  reporting         are essential for                            
  requirements,     management decisions                         
  including         including the                                
  statements of     planning of project                          
  funding           future funding                               
  requirements.     requirements.                                
  Revisions and     
  data maintenance  
  Incorporate       The incorporation of Yes         No          The CDA has yet 
  authorized        authorized changes                           to provide      
  changes in a      in a timely manner                           documentation   
  timely manner,    maintains the                                to demonstrate  
  recording the     integrity of the                             satisfaction of 
  effects of such   performance                                  this criterion. 
  changes in        measurement baseline                         Such            
  budgets and       and thus its                                 documentation   
  schedules. In the effectiveness as a                           includes change 
  directed effort   baseline against                             control logs    
  prior to          which to manage and                          and work        
  negotiation of a  control performance.                         authorization   
  change, base such                                              documents.      
  revisions on the                                               
  amount estimated                                               
  and budgeted to                                                
  the program                                                    
  organizations.                                                 
  Reconcile current Budget changes       Yes         No          The CDA has yet 
  budgets to prior  should be controlled                         to provide      
  budgets in terms  and understood in                            documentation   
  of changes to the terms of scope,                              to demonstrate  
  authorized work   resources, and                               satisfaction of 
  and internal      schedule, and                                satisfaction of
  replanning in the budgets should                               Such            
  detail needed by  reflect current                              documentation   
  management for    levels of authorized                         includes change 
  effective         work. Furthermore,                           documents or    
  control.          budget revisions                             change control  
                    should be traceable                          logs.           
                    to authorized                                
                    contractual targets                          
                    and control account                          
                    budgets.                                     
  Control           Retroactive changes  Yes         No          The CDA has yet 
  retroactive       to the baseline may                          to provide      
  changes to        mask variance trends                         documentation   
  records           and prevent use of                           to demonstrate  
  pertaining to     the performance data                         satisfaction of 
  work performed    to project estimates                         this criterion. 
  that would change of cost and schedule                         Such            
  previously        at completion.                               documentation   
  reported amounts  Retroactive budget                           includes change 
  for actual costs, adjustments may                              control logs or 
  earned value, or  delay visibility of                          approved        
  budgets.          overall project                              retroactive     
  Adjustments       variance from plan,                          change          
  should be made    thus reducing the                            controls.       
  only for          alternatives                                 
  correction of     available to                                 
  errors, routine   managers for project                         
  accounting        redirection or                               
  adjustments,      termination.                                 
  effects of                                                     
  customer or                                                    
  management                                                     
  directed changes,                                              
  or to improve the                                              
  baseline                                                       
  integrity and                                                  
  accuracy of                                                    
  performance                                                    
  measurement data.                                              
  Prevent revisions Changes made outside Yes         No          The CDA has yet 
  to the program    the authorized                               to provide      
  budget except for baseline control                             documentation   
  authorized        processes compromise                         to demonstrate  
  changes.          the integrity of                             satisfaction of 
                    performance trend                            this criterion. 
                    data and delay                               Such            
                    visibility into                              documentation   
                    overall project                              includes change 
                    variance from plan.                          control logs,   
                                                                 control         
                                                                 accounts, and   
                                                                 work package    
                                                                 plans.          
  Document changes  By ensuring that     Yes         Partial     We were
  to the            budget and schedule                          provided        
  performance       revisions are                                documentation   
  measurement       documented and                               showing eight   
  baseline.         traceable, the                               baseline        
                    integrity of the                             changes for the 
                    performance                                  NTCSS hardware  
                    measurement baseline                         installation    
                    is maintained and                            project.        
                    can be verified. The                         However, the    
                    performance                                  program office  
                    measurement baseline                         told us that    
                    should reflect the                           the EVM data    
                    most current plan                            for the OOMA    
                    for accomplishing                            software        
                    the effort.                                  project is      
                    Authorized changes                           distorted due   
                    should be quickly                            to numerous     
                    recorded in the                              baseline        
                    system and                                   changes and     
                    incorporated into                            requirements    
                    all relevant                                 changes.        
                    planning. Planning                           
                    and authorization                            
                    documents must also                          
                    be updated                                   
                    accordingly prior to                         
                    commencement of new                          
                    work.                                        
  Number satisfied                       15          2           
  Number partially                       0           1           
  satisfied                                                      
  Number not                             17          29          
  satisfied                                                      
  Total                                  32          32          

Sources: Navy CDA self-assessment and GAO analysis of Navy-provided data.

aBased on the National Defense Industrial Association Program Management
Systems Committee Intent Guide (January 2005).

Appendix IV: Comments from the Department of Defense

The following are GAO's comments on the Department of Defense's letter
dated November 23, 2005.

1. See the Agency Comments and Our Evaluation section of this report.

2. We disagree. Our report contains numerous instances where the Navy did
not comply with either DOD acquisition policies and guidelines or industry
best practices, in the areas of (1) economic justification; (2)
architectural alignment; (3) project management, including progress
measurement and reporting, funding disclosure, and oversight activities;
and (4) system development, including requirements management and testing.
Moreover, the Navy has not provided any evidence to demonstrate that our
report is incorrect with respect to the level of program discipline and
conformance with applicable policy and guidance in the areas that we
reviewed.

3. We disagree. Knowing that NTCSS is the right solution to meet the Navy's
strategic business and technological needs would require that a frame of
reference articulating these needs be available as a point of comparison.
Such a frame of reference is an enterprise architecture. However, the Navy
stated the system was defined and implemented without a complete and
formal enterprise architecture. Our experience with federal agencies has
shown that investing in an information technology solution without
defining the solution in the context of an architecture often results in
systems that are duplicative, not well integrated, and unnecessarily
costly to maintain and interface. In addition, in February 2005, key
program stakeholders and representatives from user organizations
questioned whether NTCSS as defined was the right solution to cost
effectively meet users' needs. At that time, program officials stated
their intent to develop a new economic analysis to gather the information
needed to determine whether to continue investing in NTCSS. In November
2005, program officials told us that they no longer planned to develop
this economic analysis. Without a well-defined architecture and a reliable
economic analysis, the Navy cannot be sure that NTCSS is the right
solution.

4. See comment 2.

5. We acknowledge DOD's comment but would note that it is contrary to
statements made to us during the audit. For example, officials with the
milestone decision authority stated that, due to staffing reductions, the
office was unable to fully perform oversight activities and has had to
delegate completion of these activities. Also, Naval Cost Analysis
Division officials stated that they only review cost estimates that are
prepared for milestone reviews because staffing limitations do not permit
them to review all cost estimates. Further, Navy officials stated that the
central design agency was unable to effectively execute testing activities
because it did not have a development testing lab.

6. We disagree with this approach because its scope is narrower than our
recommendation. Specifically, we recommended that the Navy develop a
reliable economic analysis of the NTCSS program that includes all viable
alternatives, including the Navy's Enterprise Resource Planning program.
DOD acquisition policy and guidance provide detailed instructions on how
economic analyses should be performed to obtain information that is
critical for decisions regarding investments of scarce resources. Without
such information, the Navy risks that its continued investment in the system
may not be justified.

7. We disagree. With respect to the statement that NTCSS is a "very mature
program," NTCSS has been under development for about 10 years at a cost of
about $1.1 billion, and the Navy plans to spend an additional $348 million
between fiscal years 2006 and 2009. Further, as appendix II of our report
shows, there are hundreds of open trouble reports and change proposals
that need to be addressed before the system can deliver promised or
expected capabilities. In addition, should the OOMA application pass
operational testing and be fielded, there are over 200 sites where the
necessary hardware must be installed and related training must occur.
These two efforts will require a significant investment of time and
resources, and it is therefore critical that the Navy ensure that NTCSS is
the proper system before investing additional funds. With respect to the
statement that "the final application is about to be fielded," there is no
evidence to support this. Since its originally planned fielding date of
2001, OOMA has failed operational testing twice, and the application is
still under development. Therefore, it is premature to assert that the
application will soon pass developmental and follow-on operational
testing.

8. See comment 6. Further, we disagree with the proposal to limit key
stakeholders' involvement in developing the economic justification to
"coordinating" and "briefing." These stakeholders have specific expertise
and roles relative to economically justifying system investments, and this
expertise should be exploited. Until it conducts a complete and disciplined
analysis of the entire NTCSS program (reviewed and approved by the Office of
Program Analysis and Evaluation and the Naval Cost Analysis Division) and
provides this analysis to all key stakeholders, the Navy's investment
decisions will continue to be made without complete and reliable data.

9.We disagree. As discussed in our report, the 2004 economic analysis did
not adhere to five of eight criteria elements contained in the Office of
Management and Budget Circulars A-94 and A-11.

10.We disagree. The 2004 economic analysis that the Navy provided us
focused on three fielding alternatives for the NTCSS program, not just the
OOMA application. The Navy did not provide a 2004 economic analysis for
just OOMA as the final NTCSS application.

11.We disagree. As stated in our report, officials from the Office of
Program Analysis and Evaluation and the Naval Cost Analysis Division told
us that they did not review the 2004 NTCSS economic analysis.

12.See comment 10.

13.We agree that the Navy ERP program did not exist when the original
NTCSS analysis of alternatives was conducted. However, the Navy ERP
program was initiated in 1998 and therefore did exist when the Navy
conducted subsequent analysis of alternatives.

14.See comment 9.

15.We do not question whether these annual reviews occurred or what
resulted from them. However, the point in our report is that NTCSS has not
been defined and developed in the context of a DOD or Navy enterprise
architecture because a well-defined version of either has not existed to
guide and constrain the program. As a result, meaningful analysis showing
how NTCSS aligns to evolving DOD and Navy architecture efforts could not
be produced. This means that the Navy does not have a sufficient basis for
knowing if NTCSS, as defined, properly fits within the context of future
DOD and Navy business operational and technological environments.

16.We disagree. Our recommendation to limit further deployment of NTCSS is
intended to ensure that the Navy takes a "strategic pause" so that decisions
regarding future investment are made using reliable information, which our
report shows has not historically been the case. As long as the Navy does
not appropriately limit work on NTCSS, it continues to invest resources
without having justified doing so.

17.See comment 6.

18.See comment 2.

19.We disagree. As we state in our report, neither the decomposition of
the program into small, fiscal year-based projects nor the absence of a
contractual relationship is a valid reason for not effectively
implementing earned value management. Without reliable, timely, and
auditable earned value management data, the program office cannot
adequately manage technical, cost, and schedule risks and problems.
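
For context, earned value management compares the budgeted value of work
actually performed (earned value) with both the value of work planned to
date and the actual cost incurred. The following Python sketch is
illustrative only; it is not drawn from the NTCSS program or this report,
and all figures in it are hypothetical.

    # Illustrative earned value management (EVM) metrics; hypothetical data only.
    def evm_metrics(pv, ev, ac, bac):
        """Basic EVM indicators from planned value (pv), earned value (ev),
        actual cost (ac), and budget at completion (bac)."""
        cv = ev - ac      # cost variance (negative means over cost)
        sv = ev - pv      # schedule variance (negative means behind schedule)
        cpi = ev / ac     # cost performance index (below 1.0 means over cost)
        spi = ev / pv     # schedule performance index (below 1.0 means behind schedule)
        eac = bac / cpi   # estimate at completion, if current cost efficiency persists
        return {"CV": cv, "SV": sv, "CPI": cpi, "SPI": spi, "EAC": eac}

    # Hypothetical program status: $120M of work planned, $100M earned,
    # $130M spent, against a $500M budget at completion.
    print(evm_metrics(pv=120.0, ev=100.0, ac=130.0, bac=500.0))

In this hypothetical case, a cost performance index of about 0.77 and a
schedule performance index of about 0.83 would tell a program office that
the work performed is costing more and taking longer than planned.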

20.We disagree. The Navy's own self-assessment of compliance with the 32
criteria, detailed in appendix III of our report, showed that 17 of these
criteria were not being satisfied. Further, our assessment showed that the
Navy did not satisfy 29 of the 32 criteria, and program officials did not
provide any evidence to refute the results of our assessment.

21.The Navy did not provide us with a copy of the CDA Software Measurement
Plan.

22.See comment 5. Further, the Navy's position that "key stakeholders of
the NTCSS program do, in fact, have the people, processes and tools to
effectively execute their respective roles and responsibilities," is not
consistent with its comment that this area will be part of a planned
review.

23.We disagree. Although the Navy states that the program is 95 percent
complete, it still plans to spend $348 million over the next three fiscal
years, which is approximately 32 percent of what has been spent on the
program to date. In addition, because the Navy lacks disciplined
acquisition management practices, as discussed in our report, including
earned value management, we question how it is able to reliably determine
what percentage of the work has been completed and what percentage remains
to be done. As stated in our report, the current milestone
decision authority has allowed the program to proceed while a major
application repeatedly failed operational testing, and another application
was cancelled. In addition, the Navy stated its intent to revisit the need
to change milestone decision authority.
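
As a rough check of the 32 percent figure, using the approximately $1.1
billion spent to date cited in comment 7:

    \frac{\$348\ \text{million}}{\$1{,}100\ \text{million}} \approx 0.32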

Appendix V: GAO Contact and Staff Acknowledgments

Randolph C. Hite (202) 512-3439 or [email protected]

In addition to the contact named above, Cynthia Jackson, Assistant
Director; Harold J. Brumm; Calvin L. H. Chang; Jennifer K. Echard; Joanne
Fiorino; Neelaxi Lakhmani; Freda Paintsil; Jamelyn Payan; Karen Richey;
Dr. Karl Seifert; Andrea Smith; and Dr. Rona B. Stillman made key
contributions to this report.

(310287)

www.gao.gov/cgi-bin/getrpt?GAO-06-215

To view the full product, including the scope and methodology, click on the
link above.

For more information, contact Randolph C. Hite at (202) 512-3439 or
[email protected].

Highlights of GAO-06-215, a report to the Subcommittee on Readiness and
Management Support, Committee on Armed Services, U.S. Senate

December 2005

DOD SYSTEMS MODERNIZATION

Planned Investment in the Naval Tactical Command Support System Needs to
Be Reassessed

Because it is important that the Department of Defense (DOD) adheres to
disciplined information technology (IT) acquisition processes to
successfully modernize its business systems, GAO was asked to determine
whether the Naval Tactical Command Support System (NTCSS) is being managed
according to important aspects of DOD's acquisition policies and guidance,
as well as other relevant acquisition management best practices. NTCSS was
started in 1995 to help Navy personnel effectively manage ship, submarine,
and aircraft support activities. To date, about $1 billion has been spent
to partially deploy NTCSS to about one-half its intended ashore and afloat
sites.

What GAO Recommends

GAO is making recommendations to the Secretary of Defense to develop the
analytical basis to determine if continued investment in NTCSS represents
prudent use of limited resources. GAO is also making recommendations to
strengthen management of the program, conditional upon a decision to
proceed with further investment in the program. DOD either fully or
partially concurred with the recommendations. It also stated that while
some of GAO's findings are valid, the overall findings understated and
misrepresented the program's level of discipline and conformance with
applicable guidance and direction.

What GAO Found

The Department of the Navy has not managed its NTCSS program in accordance
with key aspects of the department's policies and related guidance,
including federal and recognized best practice guidance. Collectively,
these policies and guidance are intended to reasonably ensure that
investment in a given IT system represents the right solution to fill a
mission need and, if it is, that acquisition and deployment of the system
are handled in a manner that maximizes the chances of delivering defined
system capabilities on time and within budget. In the case of NTCSS,
neither of these outcomes is being realized. Specifically,

           The Navy has not economically justified its ongoing and planned
           investment in NTCSS. Specifically, it (1) has not reliably
           estimated future costs and benefits and (2) has not ensured that
           independent reviews of its economic justification were performed
           to determine its reliability.

           The Navy has not invested in NTCSS within the context of a
           well-defined DOD or Navy enterprise architecture, which is
           necessary to guide and constrain NTCSS in a way that promotes
           interoperability and reduces redundancy with related and dependent
           systems.

           The Navy has not effectively performed key measurement, reporting,
           budgeting, and oversight activities. In particular, earned value
           management, which is a means for determining and disclosing actual
           performance against budget and schedule estimates, has not been
           implemented effectively, and oversight entities have not had the
           visibility into the program needed to affect its direction.

           The Navy has not adequately conducted requirements management and
           testing activities. For example, requirements were neither
           prioritized nor traced to related documentation to ensure that the
           system delivers capabilities that meet user needs. This
           contributed to failures in developmental testing that have caused
           the latest component of NTCSS to fail operational testing twice
           over the last 4 years.

The reasons the Navy cited for not following policies and guidance ranged
from the policies not being applicable to the NTCSS program, to a lack of
time available to apply them, to plans for strengthening system practices
that would not be applied retroactively. Nevertheless, the Navy has begun
taking steps, and is considering other steps, intended to address some of
the above problems. Until program management improves, NTCSS will remain a
risky program.
*** End of document. ***