Information Technology: DOD Needs to Ensure That Navy Marine	 
Corps Intranet Program Is Meeting Goals and Satisfying Customers 
(08-DEC-06, GAO-07-51). 					 
                                                                 
The Navy Marine Corps Intranet (NMCI) is a 10-year, $9.3 billion 
information technology services program. Through a		 
performance-based contract, the Navy is buying network		 
(intranet), application, and other hardware and software services
at a fixed price per unit (or "seat") to support about 550 sites.
GAO prepared this report under the Comptroller General's	 
authority as part of a continued effort to assist Congress and	 
reviewed (1) whether the program is meeting its strategic goals, 
(2) the extent to which the contractor is meeting service level  
agreements, (3) whether customers are satisfied with the program,
and (4) what is being done to improve customer satisfaction. To  
accomplish this, GAO reviewed key program and contract		 
performance management-related plans, measures, and data and	 
interviewed NMCI program and contractor officials, as well as	 
NMCI customers at shipyards and air depots.			 
-------------------------Indexing Terms------------------------- 
REPORTNUM:   GAO-07-51						        
    ACCNO:   A63967						        
  TITLE:     Information Technology: DOD Needs to Ensure That Navy    
Marine Corps Intranet Program Is Meeting Goals and Satisfying	 
Customers							 
     DATE:   12/08/2006 
  SUBJECT:   Computer services contracts			 
	     Computer support services				 
	     Contract administration				 
	     Customer service					 
	     Government information dissemination		 
	     Information technology				 
	     Interoperability					 
	     Naval procurement					 
	     Performance measures				 
	     Program evaluation 				 
	     Program management 				 
	     Strategic planning 				 
	     Surveys						 
	     Navy Marine Corps Intranet Program 		 

******************************************************************
** This file contains an ASCII representation of the text of a  **
** GAO Product.                                                 **
**                                                              **
** No attempt has been made to display graphic images, although **
** figure captions are reproduced.  Tables are included, but    **
** may not resemble those in the printed version.               **
**                                                              **
** Please see the PDF (Portable Document Format) file, when     **
** available, for a complete electronic file of the printed     **
** document's contents.                                         **
**                                                              **
******************************************************************
GAO-07-51

   

     * [1]Report to Congressional Addressees

          * [2]December 2006

     * [3]INFORMATION TECHNOLOGY

          * [4]DOD Needs to Ensure That Navy Marine Corps Intranet Program Is
            Meeting Goals and Satisfying Customers

     * [5]Contents

          * [6]Results in Brief
          * [7]Background

               * [8]NMCI Purpose, Scope, and Status
               * [9]NMCI Program Management Structure
               * [10]NMCI Contract Description

                    * [11]SLAs
                    * [12]Customer Satisfaction Surveys

                         * [13]End User Satisfaction Survey
                         * [14]Echelon II (Navy) and Major Command (Marine
                           Corps) Commander Survey and Network Operations
                           Leader Survey

               * [15]Previous GAO Work on NMCI

          * [16]Navy Has Not Met NMCI Strategic Goals and Has Not Focused on
            Measuring Strategic Program Outcomes

               * [17]Navy Developed a Performance Management Plan to Measure
                 and Report NMCI Progress in Meeting Strategic Goals but Did
                 Not Implement It
               * [18]NMCI Strategic Goals and Associated Performance Category
                 Targets Have Not Been Met

          * [19]Contractor Has Largely Met Many but Has Not Met Other SLAs

               * [20]Contractor Satisfaction of SLAs Has Varied by Agreement
                 and Seat Type, with Not All Agreements Being Met

                    * [21]Significant Percentage of All Applicable Seat Types
                      Have Met Certain Site-Specific Agreements
                    * [22]Certain Site-Specific Agreements Have Not Been
                      Consistently Met Over Time
                    * [23]Significant Percentage of All Seat Types Have Not
                      Met Certain Site-Specific Agreements
                    * [24]Most Enterprisewide Agreements Have Been Met, but a
                      Few Have Not

               * [25]Contractor Satisfaction of SLAs Relative to
                 Contractually Defined Performance Levels Has Varied

          * [26]NMCI Customer Groups' Satisfaction Levels Vary, but Overall
            Customer Satisfaction Is Low

               * [27]End User Surveys Show Dissatisfaction with NMCI
               * [28]Commander and Network Operator Surveys Show That Both
                 Customer Groups Are Dissatisfied

                    * [29]Commander Survey Results
                    * [30]Network Operations Leaders Survey Results

               * [31]Shipyard and Air Depot Customers Consistently Identified
                 a Range of Concerns and Areas of Dissatisfaction with NMCI

                    * [32]Continued Reliance on Legacy Systems
                    * [33]Loss in Workforce Productivity
                    * [34]Lack of Support of Dynamic Work Environments
                    * [35]Limitations in Help Desk Support
                    * [36]Problems with NMCI Site Preparation and Transition

          * [37]Customer Satisfaction Improvement Efforts Are Not Being
            Guided by Effective Planning
          * [38]Conclusions
          * [39]Recommendations for Executive Action
          * [40]Agency Comments and Our Evaluation

     * [41]Objectives, Scope, and Methodology
     * [42]Customer Satisfaction Survey Questions

          * [43]End User Customer Satisfaction Survey Questions
          * [44]Navy Echelon II Commanders and Marine Corps Major Command
            Commander's Customer Satisfaction Incentive Survey
          * [45]Navy and Marine Corps Network Operations Leaders' Customer
            Satisfaction Incentive Survey

     * [46]SLA Descriptions and Performance
     * [47]Comments from the Department of Defense
     * [48]GAO Contact and Staff Acknowledgments

Report to Congressional Addressees

December 2006

INFORMATION TECHNOLOGY

DOD Needs to Ensure That Navy Marine Corps Intranet Program Is Meeting
Goals and Satisfying Customers

Contents

Tables

Figures

December 8, 2006

Letter

Congressional Addressees

The Navy Marine Corps Intranet (NMCI) program is a multiyear information
technology (IT) services program; its goals are to provide information
superiority and to foster innovation via interoperability and shared
services. The Navy awarded the NMCI services contract--currently valued at
$9.3 billion--to Electronic Data Systems (EDS) in October 2000. The
contract calls for EDS to replace thousands of independent networks,
applications, and other hardware and software^1 with a single, internal
communications network (intranet), and associated desktop, server, and
infrastructure assets and services for Navy and Marine Corps customers
(end users, network operators, and commanders).

Because of the size and importance of NMCI, as well as continuing
widespread congressional interest, we prepared this report under the
Comptroller General's authority as part of a continued effort to assist
Congress and reviewed (1) whether the program is meeting its strategic 
goals, (2) the extent to which the contractor is meeting its service level
agreements (SLA),^2 (3) whether customers are satisfied with the program,
and (4) what is being done to improve customer satisfaction.

To accomplish these objectives, we reviewed program documentation,
analyzed performance data (including those related to SLAs and customer
satisfaction surveys), reviewed collection processes and results, met with
customers at several large NMCI sites (Navy shipyards and air depots) to
discuss their level of satisfaction, and interviewed officials from the
program office, the Navy's Chief Information Officer's (CIO) office, and
EDS. We performed our work from April 2005 to August 2006, in accordance
with generally accepted government auditing standards. Details on our
objectives, scope, and methodology are in appendix I.

Results in Brief

After investing about 6 years and $3.7 billion on NMCI, the Navy has yet
to meet the program's two strategic goals--to provide information
superiority and to foster innovation. A plan that the Navy developed in
2000 to measure various aspects of the program, and thereby gauge program
goal attainment, has not been implemented, and associated performance
reports have not been issued. According to Navy officials, implementing
this plan has not been as high a priority as, for example, deploying NMCI
and measuring contractor performance. While program officials told us that
NMCI has achieved much, they were unable to provide performance data to
demonstrate these achievements relative to either the program's strategic
goals or the nine performance categories that its 2000 performance
measurement plan and other initiatives defined for these goals. Given
this, we mapped contractor performance targets and data to the nine
performance categories and strategic goals, which prompted the Navy to do
the same. The Navy's mapping shows that NMCI has met only 3 of 20
performance targets (15 percent). This means that the mission-critical
information superiority and operational innovation outcomes used to
justify NMCI have yet to be attained.

NMCI contractor performance in meeting SLAs depends on how satisfaction of
the agreements is measured and presented. When we analyzed performance
relative to operational "seats" since September 2004, without regard to
the operational status of any site,^3 we determined that while EDS had
largely met many of the agreements, it had not consistently met others,
and still other agreements were generally not being met. For example,
during March 2006, EDS met its agreement to resolve customer problems
reported to the help desk for 91 percent of the basic seats, but did not
meet this agreement for 52 percent of the mission-critical seats.^4
According to the Navy, it does not measure SLA performance in this manner.
Instead, it measures agreement performance as defined in the contract for
purposes of determining contract incentive payments. Using this approach,
the Navy reports that, as of March 2006, the contractor achieved "full
payment" or "full performance," which are levels of performance that
qualify for increased payments, for approximately 55 percent of the
"eligible" seats. In contrast, the Navy reports that these performance
levels were met for about 94 percent of eligible seats in June 2005. These
views on agreement performance illustrate that, by having robust
performance management efforts and considering a range of perspectives and
metrics, important performance insights can be identified and used.

NMCI customers, which the Navy divides into three groups--end users,
organizational commanders, and network operators--vary in the extent to
which they are satisfied with the program's performance. With respect to
end users, the Navy reports that the percentage satisfied with NMCI rose
from about 54 percent in December 2002, to about 80 percent in September
2005. However, the rate of improvement dropped off after June 2004, and
the percentage of end users that the Navy considers to be satisfied is
below the Navy-wide target of 85 percent. Moreover, the percentage of end
users considered to be satisfied includes many survey responses that fall
at the lower end of the range of scores that the Navy defines as
"satisfied." With respect to commander and network
operator satisfaction, the latest Navy data show that these two customer
groups are not satisfied. For example, on a scale from 0-3, with 0 being
dissatisfied and 1 being slightly satisfied, commanders' responses averaged
0.8 and operators' responses averaged 0.3. In addition, officials
representing customer groups at five shipyard or air depot installations
that we visited expressed a number of concerns and areas of
dissatisfaction with NMCI. For example, they told us that they have had to
continue using their existing IT systems to support daily operations
because NMCI does not adequately meet their needs. Without satisfied
customers, the Navy runs the risk that NMCI will not attain the widespread
acceptance necessary to ever achieve strategic program goals.

NMCI program officials told us that improving customer satisfaction is a
program priority and thus they have invested and continue to invest time
and resources in a variety of improvement activities. For example, they
said that they have expanded NMCI capabilities in a number of ways, such
as the implementation of broadband remote access. However, these
improvement efforts are not being guided by a documented plan or plans
with prioritized initiatives that are defined in terms of activities to be
performed, resources to be committed, schedules to be met, and measurable
results to be achieved. Instead, officials told us that because they have
limited resources, they undertake improvement activities that have not
been prioritized whenever resources become available. Given the importance
of NMCI customer satisfaction, it is important to take a structured and
disciplined approach to managing improvement activities. Without it, the
program office cannot adequately ensure that improvement activities are
cost effectively managed.

To assist the Navy in managing and making informed investment decisions
about the NMCI program, we are making recommendations to the Secretary of
Defense aimed at implementing effective program performance management,
expanding measurement and understanding of SLA performance, effectively
managing customer satisfaction improvement efforts, and deciding whether
performance to date warrants changes to the program.

In written comments on a draft of this report, the Department of Defense
(DOD) stated that it agreed with our recommendations. Nevertheless, the
department also said that the Navy believes that the draft report
contained factual errors, data misinterpretations, and unsupported
conclusions. In this regard, the Navy generally made five points.

oIt said that our review focused on Navy shipyards and air depots and
excluded Marine Corps sites. We disagree. Our scope, as stated throughout
the report, extended to both Navy and Marine Corps sites and customers.

oThe Navy said that NMCI is a strategic success and is meeting its goals
of providing information superiority and fostering innovation. We
disagree. As we show in the report, the Navy's own performance targets,
along with SLA and other performance data, show that NMCI has met only 3
of 20 performance targets associated with its two goals. Meeting
program strategic goals, in our view, should be the measure of a program's
strategic success.

oThe Navy said that we misinterpreted SLA data as they relate to the
contractually-specified performance categories of full payment and full
performance. We disagree. Our use of SLA data relative to the full payment
and full performance categories presents the Navy's own analysis and
includes no GAO interpretations. The analysis of SLA data that we
performed and included in the report decouples these data from these two
performance categories and offers more visibility into and coverage of
contractor performance relative to each individual SLA.

oThe Navy said that our conclusion that certain customers were marginally
satisfied is not supported by the survey responses, which the Navy
contends can only be used to classify customers as either satisfied or
dissatisfied. While we acknowledge that the Navy treats responses of 5.5
or higher on the 10-point scale as indicating satisfied customers, our
point is that this view is too simplistic because it does not
differentiate among degrees of satisfaction. Therefore, our
characterization of responses of 5.5 to 7 as marginally satisfied provides
additional insight and perspective into customers' true level of
satisfaction.

oThe Navy said the program office adequately reports to key program
decision makers. We disagree, as evidenced by the fact that this reporting
has not conveyed the range and magnitude of performance and customer
satisfaction issues that our report contains.

Beyond these major points, the Navy also provided various technical
comments, which we have incorporated as appropriate in this report.

Background

The Department of the Navy is a large and complex organization with a wide
range of mission operations and supporting business functions. For
example, the Navy has about 350,000 active duty officers and enlisted
personnel, 130,000 ready reserve personnel, and 175,000 civilian
employees. The Navy's fleet operations involve approximately 280 ships and
4,000 aircraft
operating throughout the world. Further, the Navy's annual operating
budget is about $120 billion and is used to fund such things as ship and
aircraft operations, air depot maintenance, and Marine Corps operations.

The department's primary organizational components are the Secretary of
the Navy, the Chief of Naval Operations, and the Commandant of the Marine
Corps. The structural relationships among these components are summarized
later and in figure 1.

Figure 1: Simplified Department of the Navy Organization Chart

oSecretary of the Navy: Department of the Navy headquarters recruits,
organizes, supplies, equips, trains, and mobilizes naval forces. Among
other things, this includes construction, outfitting, and repair of Navy
and Marine Corps ships, equipment, and facilities. It also includes
formulating and implementing policies and programs.

oNaval and Marine Corps Operating Forces: The operating forces commanders
and fleet commanders have two chains of command. Administratively, they
report to the Chief of Naval Operations, and are responsible for
providing, training, and equipping naval forces. Operationally, they
provide naval forces and report to the appropriate Unified Combatant
Commanders. The operating forces include a variety of organizations with
diverse missions, such as the Atlantic and Pacific Fleets, Naval Network
Warfare Command, and Naval Reserve Forces.

oNaval shore establishment: The Navy shore establishment includes
facilities and activities for repairing machinery, electronics, ships, and
aircraft; providing communications capabilities; providing training;
providing intelligence and meteorological support; storing repair parts,
fuel, and munitions; and providing medical support. It consists of
organizations such as the Naval Sea Systems Command (which includes
shipyards), Naval Air Systems Command (which includes aviation depots),
Space and Naval Warfare Systems Command, Navy Personnel Command, Naval
Education and Training Command, and the Office of Naval Intelligence.

The Navy's many and dispersed organizational components rely heavily on IT
to help them perform their respective mission operations and business
functions. For fiscal year 2006, the Navy's IT budget was about $5.8
billion, which included funding for the development, operation, and
maintenance of Navy-owned IT systems, as well as funding for
contractor-provided IT services and programs, such as NMCI.

The Assistant Secretary of the Navy for Research, Development and
Acquisition is responsible for Navy acquisition programs. Reporting to the
Assistant Secretary are numerous entities that have authority,
responsibility, and accountability for life-cycle management of
acquisition programs within their cognizance. These entities include
certain program managers, systems commands, and program executive officers.

The Navy Chief Information Officer (CIO) is responsible for developing and
issuing IT management policies and standards in coordination with the
above Assistant Secretary, the system commands, and others. The Navy CIO
is also responsible for ensuring that major programs comply with the
Clinger-Cohen Act (1996)^5 and for recommending to the Secretary of the
Navy whether to continue, modify, or terminate IT programs, such as NMCI.

NMCI Purpose, Scope, and Status

NMCI is a major, Navy-wide IT services program. Its goals are to provide
information superiority--an uninterrupted information flow and the ability
to exploit or deny an adversary's ability to do the same--and to foster
innovative ways of operating through interoperable and shared network
services. The program is being implemented through a multiyear IT services
contract that is to provide desktop, server, infrastructure, and
communications-related services at Navy and Marine Corps sites located in
the United States and Japan. Through this contract, the Navy is replacing
independent local and wide area networks with a single network and related
desktop hardware and software that are owned by the contractor. Among
other things, the contractor is to provide voice, video, and data
services; infrastructure improvements; and customer service. This type of
contract is commonly referred to as "seat management." Generally speaking,
under seat management, contractor-owned desktop and other computing
hardware, software, and related services are bundled and provided on the
basis of a fixed price per unit (or seat).

In October 2000, the Navy's goal was to have between 412,000 and 416,000
seats operational by fiscal year 2004. As of June 2006, the Navy reported
that about 303,000 seats were operational at about 550 sites. According to
the Navy, initial delays in meeting deployment schedules were due to
underestimates in its existing inventory of legacy applications that
needed to be migrated to NMCI. Subsequent delays were attributed to
developing and implementing a certification and accreditation process^6
for all applications, as well as legislation^7 requiring certain analyses
to be completed before seat deployment could exceed specific levels.

The number of seats at each site ranges from a single seat to about
10,000. These sites include small sites, such as office facilities located
throughout the United States, and large sites, such as shipyards and air
depots, which use unique software to assist in repair work.^8

NMCI Program Management Structure

Various organizations in the Navy are responsible for NMCI management and
oversight (see fig. 2). The Program Executive Officer for Enterprise
Information Systems (PEO-EIS) along with the NMCI Program Manager are
responsible for NMCI acquisition and contract management. The program is
also overseen and supported by several groups. One is the Navy's
Information Executive Committee, which provides guidance for, and
oversight of, NMCI and other information issues. The committee is made up
of CIOs from a range of Navy commands, activities, offices, and other
entities within the Navy. Another is the NMCI Executive Committee, which
includes representatives of the heads of a broad cross section of
organizations throughout the Navy, and the contractor. Its mission is to
help in the review, oversight, and management of the Navy's implementation
of NMCI, as well as to assist in identifying and resolving process and
policy impediments within the Navy that hinder an efficient and effective
implementation process. Additionally, the Network Warfare Command
(NETWARCOM)^9 and the Marine Corps Network Operations and Security Command
(MCNOSC),^10 are the two entities primarily responsible for network
operations management in the Navy and Marine Corps, respectively. The Navy
CIO is responsible for overall IT policy.

Figure 2: Organizations Responsible for NMCI Management and Oversight

NMCI Contract Description

On October 6, 2000, the Navy awarded a 5-year contract for NMCI services
to a single service provider--EDS--for an estimated 412,000 to 416,000
seats and minimum value of $4.1 billion. The original contract also
included a 3-year option for an additional $2.8 billion in services,
bringing the potential total contract value to $6.9 billion. The
department and EDS subsequently restructured the contract to be a 7-year,
$6 billion contract with a 3-year option for an additional $2.8 billion
beginning in fiscal year 2008. Following further contract restructuring
and the Navy's decision to exercise the 3-year option, the total contract
period and minimum value is now 10 years and about $9.3 billion. Figure 3
illustrates the value of the NMCI contract.

Figure 3: The Value of the NMCI Contract

The NMCI contract type is commonly referred to as seat management because
pricing for the desktop services is based on a fixed price per "seat."
Seats include desktop computers, as well as other devices, such as
cellular phones. Pricing for these seats varies depending on the services
provided. For example, having classified connectivity, mission-critical
service, additional user accounts, or additonal software installation
increases the amount paid per seat.

The NMCI contract is performance-based, which means that it contains
monetary incentives to provide services at specified levels of quality and
timeliness. The contract includes several types of incentives, including
incentives tied to SLA performance and to customer satisfaction surveys.

SLAs

The contract currently specifies 23 SLAs divided into three tiers: 100
SLAs, 200 SLAs, and 300 SLAs. The 100 tier is referred to as base
agreements, the 200 as transitional agreements, and the 300 as additional
agreements. Examples of agreements for each tier are provided below.

o100--End user services (SLA 103)

o200--Web access services (SLA 206)

o300--Network management services (SLA 328)

SLAs are further categorized as enterprisewide, site-specific, or both.
Unlike site-specific SLAs, enterprisewide SLAs are not analyzed on a
site-by-site basis. See table 1 for a list of agreements organized by tier
and category.

Table 1: List of SLAs Organized by Tier and Category

                                        

                SLA number and name              Site-specific Enterprisewide 
Base agreements                                                            
101-End user problem resolution                     X                      
102-Network problem resolution                      X                      
103-End user services                               X             X        
104-Help desk                                                     X        
105-Move, add, change                               X                      
106-Information assurance incentives                              X        
107-NMCI intranet                                   X                      
Transitional agreements                                                    
203-E-mail services                                               X        
204-Directory services                              X             X        
206-Web access services                             X             X        
211-Unclassified but Sensitive Internet             X             X        
Protocol Router Network (NIPRNET) access                                   
225-Base area network/local area network            X                      
communications services                                                    
226-Proxy and caching services                                    X        
231-System service - Domain name server             X             X        
Additional agreements                                                      
324-Wide area network connectivity                  X                      
325-Base area network/local area network            X                      
communications services                                                    
328-Network management service                      X                      
329-Operational support services                                  X        
332-Application server connectivity                 X                      
333-Security operational services                                 X        
334-Information assurance operational                             X        
service-PKI                                                                
336-Information assurance planning services                       X        

Source: GAO analysis of NMCI SLA data.

Each agreement has one or more performance categories. For example, SLA
102 has 1 performance category (Network Problem Resolution), while SLA 107
has 3 performance categories (NMCI Intranet Availability, Latency/Packet
Loss, and Voice and Video Quality of Service). Collectively, there are 51
performance categories.

Each performance category has specific performance targets that the
contractor must reach in order for the category to be met. An example of a
target is providing e-mail server services to users 99.7 percent of the
time that they are supposed to be available.
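
To illustrate how agreements, performance categories, and targets relate,
the following is a minimal Python sketch. It is illustrative only: the
category names for SLAs 102 and 107 and the 99.7 percent e-mail
availability figure come from the discussion above, while the remaining
numeric targets, and the association of the e-mail figure with SLA 203,
are assumptions made for the example.

    # Illustrative model of SLAs, their performance categories, and targets.
    SLAS = {
        # Category names for SLAs 102 and 107 are from the report; the
        # numeric targets below are assumed except where noted.
        "SLA 102": {"Network Problem Resolution": 95.0},
        "SLA 107": {"NMCI Intranet Availability": 99.0,
                    "Latency/Packet Loss": 98.0,
                    "Voice and Video Quality of Service": 97.0},
        "SLA 203": {"E-mail Server Availability": 99.7},  # cited in the text
    }

    def category_met(measured_percent, target_percent):
        # A performance category is met when measured performance is at or
        # above its target.
        return measured_percent >= target_percent

    # Example: e-mail servers available 99.5 percent of the scheduled time
    # fall short of the 99.7 percent target.
    print(category_met(99.5, SLAS["SLA 203"]["E-mail Server Availability"]))
    # prints False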

The contract currently specifies two levels of performance to be used in
determining, on a site-by-site basis, what performance-based payment
incentives, if any, EDS will earn in a given quarter (3-month period).^11
If neither of these levels of performance is met, the contractor is to be
paid 85 percent of the amount allowed under the contract for each seat
that has been cut over (i.e., is operational).

1.Full payment. To achieve this level for a given seat, the contractor
must meet 100 percent of the applicable SLAs for that seat, and 50 to 90
percent of the planned seats at the site must be cut over. Meeting a
quarterly agreement is defined as performance at or above the applicable
target(s) for either (1) 2 out of the 3 months preceding an invoice or (2)
the current month of the invoice. If these conditions are met, the
contractor is paid 100 percent of the amount allowed per seat. If, in
subsequent months, the contractor fails to achieve 100 percent of the
agreements, the amount paid is 85 percent of the amount allowed per seat.

2.Full performance. To achieve this level for a given seat, the contractor
must meet 100 percent of the applicable SLAs for that seat, and over 90
percent of the planned seats at the site must be cut over. Meeting an
agreement is defined as performance at or above the target(s) for either
(1) 2 out of the 3 months preceding a quarterly invoice or (2) the current
month of the invoice. If these conditions are met, the contractor is paid
100 percent of the amount allowed per seat. Once a site has achieved full
performance, it remains eligible for full payments, regardless of changes
to the numbers of seat orders. However, the contractor is required to
provide "financial credits" to the Navy in the event that the agreements
are not met at some future time.
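
The following minimal Python sketch illustrates this per-seat payment
logic. It is a simplified reading of the contract terms described above,
not the contract's own algorithm: the function names and inputs are ours,
the quarterly test assumes that monthly SLA results have already been
determined, and the provisions on subsequent-month failures and financial
credits are omitted.

    def agreement_met_for_quarter(preceding_months_met, current_month_met):
        # A quarterly agreement is met if performance was at or above target
        # in 2 of the 3 months preceding the invoice, or in the current month.
        return sum(preceding_months_met) >= 2 or current_month_met

    def payment_per_seat(allowed_amount, all_slas_met, percent_cut_over):
        # Full performance: 100 percent of applicable SLAs met and over 90
        # percent of planned seats at the site cut over.
        # Full payment: 100 percent of applicable SLAs met and 50 to 90
        # percent of planned seats cut over.
        # Otherwise, the contractor is paid 85 percent of the allowed amount.
        if all_slas_met and percent_cut_over > 90:
            return allowed_amount                    # full performance
        if all_slas_met and 50 <= percent_cut_over <= 90:
            return allowed_amount                    # full payment
        return 0.85 * allowed_amount                 # neither level met

    # Example: a seat with a $100 allowed amount at a site with 95 percent
    # of planned seats cut over, where one applicable agreement was missed.
    print(payment_per_seat(100.0, all_slas_met=False, percent_cut_over=95))
    # prints 85.0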

Customer Satisfaction Surveys

The contract also provides for administration of three customer
satisfaction surveys: End User, Echelon II/Major Command,^12 and Network
Operations Leaders. These surveys and their related financial incentives
are discussed below.

End User Satisfaction Survey

The contractor began conducting quarterly satisfaction surveys of Navy end
users in June of 2002 and Marine Corps end users in March 2005. These
surveys are administered to a different mix of 25 percent of eligible
users^13 each quarter, with nearly all users being surveyed each year.

Since March 2004, the survey has consisted of 14 questions, with 4 relating
to satisfaction with the NMCI program^14 and 10 focusing on satisfaction
with EDS.^15 For each question, users are asked to indicate their level of
dissatisfaction/satisfaction according to a 10-point scale, with 1-5
denoting levels of dissatisfaction, and 6-10 denoting levels of
satisfaction. The Navy considers end users to be satisfied in general,
with the program, or with the contractor, if the average response across
the 14, 4, or 10 questions, respectively, is 5.5 or higher. The survey
instrument also includes space for additional comments and asks the end
users to identify and rank reasons for dissatisfaction or suggestions for
improvements. See table 2 for a list of the 14 questions.

Table 2: NMCI End User Customer Satisfaction Survey Questions

                                        

What is your satisfaction                                                  
*With having access to the computer hardware you need to accomplish your   
job?                                                                       
With the dependability of the computer you use?                            
*With having access to the software you need to accomplish your job?       
With network reliability?                                                  
With the professionalism of EDS personnel?                                 
With finding and using information about NMCI services?                    
With the accuracy of information describing how to use NMCI services?      
*With training on how to use NMCI effectively?                             
With technical support services provided by the help desk?                 
With technical support services provided by on-site personnel?             
With the timeliness of problem resolution?                                 
With the solution implemented to correct any problem you experienced?      
*With the process to make changes to your IT environment?                  
What is your overall satisfaction with services provided by EDS?           

Source: March 2006 Quarterly Customer Satisfaction Survey Report.

Note: Questions marked with an asterisk are not used for incentive
purposes.

Based on the quarterly survey results, the contractor is eligible for an
incentive payment of $12.50 per seat if 85 to 90 percent of respondents'
average responses are 5.5 or higher, and $25 per seat if greater than 90
percent respond in this way. No incentive is to be paid if fewer than 85
percent respond as being satisfied.
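
A minimal Python sketch of the satisfaction threshold and incentive tiers
described above follows. The 5.5 threshold, the 85 and 90 percent
breakpoints, and the $12.50 and $25 amounts come from the report; the
function names and the sample responses are ours.

    def is_satisfied(average_response):
        # The Navy treats an average response of 5.5 or higher on the
        # 10-point scale as indicating a satisfied end user.
        return average_response >= 5.5

    def end_user_incentive_per_seat(average_responses):
        # $25 per seat if more than 90 percent of respondents are satisfied,
        # $12.50 per seat if 85 to 90 percent are, and nothing otherwise.
        satisfied = sum(1 for r in average_responses if is_satisfied(r))
        percent_satisfied = 100.0 * satisfied / len(average_responses)
        if percent_satisfied > 90:
            return 25.00
        if percent_satisfied >= 85:
            return 12.50
        return 0.00

    # Example: 3 of 5 respondents (60 percent) satisfied yields no incentive.
    print(end_user_incentive_per_seat([7.2, 6.0, 5.5, 4.1, 3.8]))  # prints 0.0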

Echelon II (Navy) and Major Command (Marine Corps) Commander Survey and
Network Operations Leader Survey

In October 2004, the Navy designated two additional categories of
customers--commanders and network operations leaders--and developed
separate satisfaction surveys for each. In general, the commander survey
focuses on whether NMCI is adequately supporting a command's mission needs
and strategic goals; the network operations leader survey focuses on
whether the contractor is meeting certain operational network
requirements. The surveys are administered every 6 months.

The latest commander survey was distributed to the heads of 23 Navy and
Marine Corps command units. The network operations leader survey was
distributed to NETWARCOM and MCNOSC.

Both surveys are organized by major topic and subtopic. For the commander
survey, the major topics and subtopics are as follows:

oWarfighter support--including classified network support, deployable
support, and emergent requirement support.

oCutover services--including planning, preparation, and execution.

oTechnical solutions--including the new service order and delivery
process, and technical performance.

oService delivery--including organizational understanding, customer
service, and issue management.

For the network operations leader surveys, the major topics and subtopics
are as follows:

oMission support and planning--including interoperability support,
continuity of operations, future readiness, and public key infrastructure.

oNetwork management--including network status information, information
assurance, urgent software patch implementation, and data management.

oService delivery--including organizational understanding, communications,
issue management, and flexibility and responsiveness.

Appendix II provides a complete listing of the questions included in the
commander survey and the network operations leader survey.

Responses to the questions in both surveys are solicited on a scale of
0-3, with 0 being dissatisfied and 3 being extremely satisfied. To
aggregate the respective surveys' results, the Navy averages the responses
by command unit and by network operations unit.

Based on the 6-month survey results, the contractor is eligible for an
incentive payment of up to $50 per seat, with average scores of less than
0.5 receiving no incentive, scores of 0.5 to less than 1.5 receiving 25
percent of the incentive, scores of 1.5 to less than 2.25 receiving 50
percent of the incentive, and scores of at least 2.25 receiving 100
percent of the incentive.
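
As a short illustration, the following Python sketch (the function name is
ours) maps an average score on the 0-3 scale to the fraction of the
up-to-$50-per-seat incentive earned under the tiers just described.

    def leadership_incentive_fraction(average_score):
        # Map an average commander or network operations leader survey score
        # (0-3 scale) to the fraction of the incentive earned.
        if average_score < 0.5:
            return 0.00    # no incentive
        if average_score < 1.5:
            return 0.25    # 25 percent of the incentive
        if average_score < 2.25:
            return 0.50    # 50 percent of the incentive
        return 1.00        # 100 percent of the incentive

    # Example: an average score of 0.8 earns 25 percent of the incentive,
    # or $12.50 of the $50-per-seat maximum.
    print(leadership_incentive_fraction(0.8) * 50)  # prints 12.5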

Previous GAO Work on NMCI

We have reported on a number of NMCI issues since the program's inception.
For example, in March 2000, we reported that the Navy's acquisition
approach and implementation plan had a number of weaknesses, and thus
introduced unnecessary program risk. In particular, we said that the Navy
lacked a plan for addressing many program requirements and information on
NMCI's potential impacts on Navy personnel.^16

In October 2002, we reported that NMCI's transition costs for shipyards
and air depots were unclear, which in turn limited the ability of such
industrially funded entities to set the future rates that they would
charge their customers.^17 Accordingly, we recommended that the program,
in collaboration with the Naval Sea Systems Command and the Naval Air
Systems Command, systematically and expeditiously resolve implementation
issues that affect the ability of shipyards and depots to plan and budget.
In response to these recommendations, the Navy took a number of actions,
including establishing an Executive Customer Forum to, among other things,
adjudicate issues requiring collaborative decision making among Navy
component CIOs, including those from the Naval Sea Systems Command and the
Naval Air Systems Command, which represent Navy shipyards and air depots,
respectively.

In April 2003, we reported on the extent to which five DOD IT services
projects, including NMCI, had followed leading commercial outsourcing
practices.^18 For NMCI, we found that while the Navy had employed most of
these practices, it did not follow the key practice related to
establishing an accurate baseline of the existing IT environment, choosing
instead to rely on a preexisting and dated inventory of its legacy
applications. Because of this, we concluded that the Navy substantially
underestimated the number of legacy applications that needed to transition
to NMCI, in turn causing the program's time frame for transitioning to
slip considerably. We recommended that DOD take steps to learn from such
lessons, so that such mistakes are not repeated on future IT outsourcing
projects.

Navy Has Not Met NMCI Strategic Goals and Has Not Focused on Measuring
Strategic Program Outcomes

Consistent with relevant laws and guidance, the Navy defined strategic
goals for its NMCI program and developed a plan for measuring and
reporting on achievement of these goals. However, the Navy did not
implement this plan, choosing instead to focus on defining and measuring
contractually specified SLAs. According to Navy officials, implementing
the goal-oriented plan was not a priority, compared with swiftly deploying
NMCI seats and measuring satisfaction of contract provisions. While
program officials told us that NMCI has produced considerable mission
value and achieved much, they did not have performance data to demonstrate
progress in relation to either the program's strategic goals or nine
performance categories that its plan and related efforts defined relative
to these goals. Given this, we mapped SLAs to the nine performance
categories and two strategic goals, which prompted the Navy to do the
same. The Navy's mapping shows that NMCI has met few of the categories'
performance targets, and thus has yet to meet either of the strategic
goals. This means that the mission-critical information superiority and
operational innovation outcomes that were used to justify investment in
NMCI have yet to be attained. Without effective performance management,
the Navy is increasing the risk that the program will continue to fall
short of its goals and expected results.

Navy Developed a Performance Management Plan to Measure and Report NMCI
Progress in Meeting Strategic Goals but Did Not Implement It

Various laws--such as the Government Performance and Results Act and the
Clinger-Cohen Act--require federal agencies to identify and report on
mission and strategic goals, associated performance measures, and actual
performance. Federal IT guidance^19 also recognizes the importance of
defining program goals and related measures and performance targets, as
well as determining the extent to which targets, measures, and goals are
being met.

In initiating NMCI, the Navy established two strategic goals for the
program. According to the Navy, the program's primary goal is to support
"information superiority," which it characterizes as "providing the
capability to collect, process, and disseminate an uninterrupted flow of
information while exploiting or denying an adversary's ability to do the
same." In this regard, NMCI was to create an integrated network in which
connectivity among all parts of the shore establishment, and with all
deployed forces at sea and ashore, enables all members of the network to
collaborate freely, share information, and interoperate with other
services and nations. The second goal is to "foster innovation" by
providing an interoperable and shared services "environment that supports
innovative ways of integrating doctrine and tactics, training, and
supporting activities into new operational capabilities and more
productive ways of using resources." Related to these goals, the Navy also
cited significant benefits that were to accrue from NMCI, including (1) an
uninterrupted flow of information; (2) improvements to interoperability,
security, information assurance, knowledge sharing, productivity, and
operational performance; and (3) reduced costs.

To determine its progress in meeting these program goals and producing
expected benefits, the Navy included a performance measurement plan in its
"2000 Report to Congress" on NMCI. According to the Navy, the purpose of
this 2000 performance measurement plan was to document its approach to
ensuring that key NMCI outcomes (i.e., results and benefits) and measures
were identified and collected. In this regard, the plan identified eight
strategic performance measurement categories, and related them to the NMCI
strategic program goals. Subsequently, the Navy added a ninth performance
category. According to program office and Navy CIO officials, the nine
performance categories are all relevant to determining program performance
and strategic goal attainment. Moreover, the plan states that these
categories provide for making NMCI an integrated portion of the Navy and
Marine Corps strategic vision, support the principles of using IT to
support people, and focus on the mission value of technology.

These nine categories, including the Navy's definition of each, are as
follows:

oInteroperability: ability of Navy systems and applications to
communicate and share information with, provide services to, and accept
services from, other military services.

oSecurity and information assurance: compliance with relevant DOD, Navy,
and Marine Corps information assurance policies and procedures.

oWorkforce capabilities: ability to (1) increase people's access to
information, (2) provide tools and develop people's skills for obtaining
and sharing information, and (3) support a knowledge-centric and -sharing
culture that is built on mutual trust and respect.

oProcess improvement: role as a strategic enabler for assessment and
benchmarking of business and operational processes, and for sharing of
data, information, applications, and knowledge.

oOperational performance: ability to support improved mission (operational
and business) performance.

oService efficiency: economic effectiveness (i.e., its cost versus
services and benefits).

oCustomer satisfaction: key stakeholders' (e.g., end users) degree of
satisfaction.

oProgram management: ability to (1) meet the seat implementation schedule
and the NMCI budget, (2) achieve specified levels of network performance,
and (3) proactively manage program risks.

oNetwork operations and maintenance: includes such things as virus
detection and repair, upgradeability, scalability, maintainability, asset
management, and software distribution.

The performance plan also included metrics, targets, and comparative
baselines that were to be used for the first annual performance report,
although it noted that progress in meeting some performance targets would
not be measured until after contract award and that some of the cited
measures could at some point cease to provide useful information for
making decisions, while others may need to be collected continuously. The
plan also stated that the Navy would fully develop performance measures
for each of the categories and that it would produce an annual report on
NMCI's performance in each of the categories.

However, the Navy has not implemented its 2000 performance management
plan. For example, the Navy did not develop performance measures for each
of the performance categories and has not reported annually on progress
against performance targets, categories, and goals. Instead, Navy officials
told us that they focused on defining and measuring progress against
contractually specified SLAs, deploying NMCI seats, and reducing the
number of Navy applications that are to run on NMCI workstations.
According to these officials, measuring progress against the program's
strategic goals was not a priority.

Because measurement of goal attainment has not been the Navy's focus to
date, when we sought (from both the program office and the Navy CIO
office) performance data demonstrating progress in meeting NMCI's
strategic goals and performance categories, the Navy was unable to provide
data in this context. Instead, these officials said that data were
available relative to contract performance, to include SLA performance
levels and customer satisfaction survey results. Given this, we mapped the
available contract-related performance data to the nine performance
categories and targets and provided our analysis to the program office and
the Navy CIO office. The Navy provided additional performance data and
revisions to our mappings. Our analysis of the Navy-provided mapping,
including associated fiscal year 2005 data, is discussed in the next
section.

NMCI Strategic Goals and Associated Performance Category Targets Have Not
Been Met

The Navy has not fully met any of its performance categories associated
with achieving NMCI strategic goals and realizing program benefits. For
example, the performance category of "Program management" has four
performance targets relative to cost, schedule, performance, and risk. For
fiscal year 2005, the NMCI program met one of the performance targets. It
did not meet the other three targets and thus did not meet this
performance category. Overall, the Navy defined 20 targets for the 9
performance categories. Of these 20, the Navy met 3, did not meet 13, and
was unable to determine if it met 4. The specific performance targets for
each performance category are described below, along with performance in
fiscal year 2005 against each target. Table 3 summarizes the number of
targets met and not met for each category.

Table 3: NMCI Satisfaction of Performance Targets for Each Performance
Category for Fiscal Year 2005

                                        

            Performance area          Number of Targets met Targets Unable to 
                                        targets             not met determine 
Interoperability                           3           1       1         1 
Security/information assurance             2           0       2         0 
Workforce capabilities                     3           1       1         1 
Process improvement                        2           0       1         1 
Operational performance                    1           0       1         0 
Service efficiency                         2           0       1         1 
Customer satisfaction                      1           0       1         0 
Program management                         4           1       3         0 
Network operations and maintenance         2           0       2         0 
Total                                     20           3      13         4 

Source: GAO analysis of Navy data.

Interoperability: The Navy defined information systems interoperability,
critical joint applications interoperability, and operational testing
targets as its measures of this category. For fiscal year 2005, it met the
information systems interoperability target. However, it did not meet the
critical joint applications interoperability target, and it could not
determine whether it met the operational testing target because of
insufficient data.

oInformation systems interoperability: The target was to be level 2 on the
DOD Levels of Information Systems Interoperability (LISI) Scale.^20 The
Navy reports that NMCI was a level 2.

oCritical joint applications interoperability: The target was for all
critical joint applications to be interoperable with NMCI.^21 In fiscal
year 2005, the Navy did not transition all of its critical joint
applications to NMCI. Moreover, of the 13 applications that were fully or
partially transitioned, one was determined not to be interoperable.

oOperational testing: The target was to be "Potentially Operationally
Effective" and "Potentially Operationally Suitable." However, Navy
reported that the Joint Interoperability Test Command operational testing
did not produce sufficient data to determine this.

Security and information assurance: The Navy identified SLAs and
information assurance incentive targets as its measures of this category.
For fiscal year 2005, it did not meet either target.

oSLAs: The target was to meet 100 percent of all security-related
agreements. The Navy reported that it met this target during 4 months of
the fiscal year but did not meet it for 8 months, including the last 6
months of the fiscal year.

oInformation assurance incentives: The target was to have the contractor
earn 100 percent of the incentive each year. However, the contractor did
not earn 100 percent of the incentive for the last 6 months of this fiscal
year.

Workforce capabilities: The Navy defined the reduction of civilian IT
workforce, percentage of workforce with access to NMCI, and the number of
professional certifications as its measures of this category. For fiscal
year 2005, it reported that it met the reduction of civilian IT workforce
target but did not meet the percent of workforce with access target and
could not determine whether it met the professional certifications target.

oReduction of civilian IT workforce: The target was to have a zero
reduction in its civilian IT workforce. The Navy reported that it met this
target.

oPercent of workforce with access: The target was for 100 percent of its
workforce to have access. As of September 30, 2005, 82 percent of the
applicable workforce had a seat.

oNumber of professional certifications: While Navy officials stated that
the target is professional certifications, they could not provide a
measurable target. Therefore, it cannot be determined whether the target
was met.

Process improvement: The Navy defined certain customer survey and
technology refreshment targets as its measures of this category. For
fiscal year 2005, the Navy did not meet the leadership survey target and
could not determine whether it met the technology refreshment target.

oInformation from customer surveys: The target was to have the contractor
earn 100 percent of the Echelon II survey and the Network Operations
Leaders' survey incentives. However, the contractor earned 25 percent of
the incentive for the Echelon II survey, and 0 percent of the incentive
for the Network Operations Leaders' survey in fiscal year 2005.

oTechnology refreshment: While Navy officials stated that the target is
technology refreshment, they could not provide measurable targets.
Therefore, it cannot be determined whether the target was met.

Operational performance: The Navy identified information from the Network
Operations Leaders' survey as its target for measuring this category. For
fiscal year 2005, it did not meet this target.

oNetwork Operations Leaders' survey: The target was for the contractor to
earn 100 percent of the Network Operations Leaders' survey incentive. The
contractor earned 0 percent of the incentive in fiscal year 2005.

Service efficiency: The Navy defined SLA performance and cost/service
ratio per seat targets as measures of this category. For fiscal year 2005,
the Navy did not meet the SLA performance target, and it could not
determine if it met the cost/service ratio per seat target.

oSLA performance: The target was to have 100 percent of seats at the full
performance or full payment level. As of September 2005, the Navy reported
that 82 percent of seats achieved full payment or full performance. This
is down from March 2005, when the Navy reported that 96 percent of seats
achieved full payment or full performance.

oCost/service per seat: The target was for the cost/service ratio per
seat not to exceed what it was prior to NMCI. According to the Navy, while
the per seat cost for NMCI is higher, the service level is also higher.
However, the Navy did not have sufficient information to determine if the
target was met.

Customer satisfaction: The Navy identified information from the end user
satisfaction survey as a target for measuring this category. It did not
meet this target in fiscal year 2005.

oCustomer satisfaction survey: The target was to have 85 percent of NMCI
end users satisfied. However, the percentage of users reported as
satisfied from December 2004 through September 2005 ranged from 75 to 80
percent.

Program management: The Navy defined cost, schedule, performance, and
risk-related performance targets as measures of this category. For fiscal
year 2005, it reports that it met the cost target because it did not
obligate more than 100 percent of available NMCI funding but did not meet
the schedule, performance, and risk targets.

oCost: The target was to obligate up to 100 percent of program funds on
NMCI in fiscal year 2005. The Navy reports that it obligated 97 percent of
these funds in this fiscal year. Program officials stated that the other 3
percent was spent on legacy IT infrastructure.

oSchedule: The target was to deploy all seats that were scheduled for
deployment in fiscal year 2005. The Navy reports that it deployed 77
percent of these scheduled seats.

oPerformance: The target was to have 100 percent of eligible seats at full
payment or full performance. The Navy reports that, as of September 2005,
82 percent of the seats achieved full payment or full performance.

oRisk: The target was to be "green" in all risk areas.^22 The Navy reports
that it was "yellow" in several risk areas, such as schedule and
organizational change management.

Network operations and maintenance: The Navy defined SLA performance,
leadership survey results, and technology refreshment targets for
measuring this category. For fiscal year 2005, it did not meet the SLA
performance or the leadership survey results targets. Further, it could
not determine if it met the technology refreshment target.

oSLA performance: The target was to have 100 percent of eligible seats at
either full payment or full performance. As of September 2005, the Navy
reported that 82 percent of seats were achieving full payment or full
performance. This is down from March 2005, when the Navy reported that 96
percent of seats achieved full payment or full performance.

oLeadership survey results: The target was to have the contractor earn 100
percent of both the Echelon II and Network Operations Leaders' survey
incentives. Through September 30, 2005, the contractor earned 25 percent
of the Echelon II incentive, and 0 percent of the operator's incentive.

Notwithstanding the above described performance relative to performance
category targets and strategic goals, Navy CIO and program officials
described the program as a major success. CIO officials, for example,
stated that NMCI has significantly improved the Navy's IT environment, and
will increase productivity through greater knowledge sharing and improved
interoperability. They also stated that a review and certification process
for all applications deployed on the network has been implemented and thus
compliance with security and interoperability requirements has been
ensured. According to these officials, NMCI's value has been demonstrated
repeatedly over the last few years. In this regard, they cited the
following examples but did not provide verifiable data to support them.

oImproved security through continuous security assessments, a centralized
distribution of vulnerability information, configuration control of
critical servers, and an improved response to new vulnerabilities/threats.

oImproved continuity of operations (e.g., the Navy reports that it had no
prolonged disruptions due to recent hurricanes and fires on the West
Coast).

oIncreased personnel training and certification by increasing the number
of offerings.

oIdentified opportunities for improving efficiency through the use of
performance metrics.

oImproved software and hardware asset management and implementation of
standard and secure configurations.

oProvided pier-side (waterfront) connectivity and Navy-wide public key
infrastructure.^23

The Navy's mapping of fiscal year 2005 data to performance categories and
targets as summarized above shows that the NMCI program has not yet met
either of its strategic goals. Specifically, the information superiority
and innovation goals that were used to justify the program have yet to be
attained. Further, although the Navy developed a plan to measure and
report on NMCI progress in meeting the strategic goals, this plan was not
implemented. As a result, measurement and reporting of program performance
relative to the strategic goals have not occurred.

Contractor Has Largely Met Many but Has Not Met Other SLAs

Our analysis of Navy contractor performance data since September 2004
shows that the extent to which the site-specific agreements have been met
for all operational seats (regardless of site) varies widely by individual
agreement, with some always being met but others having varied performance
over time and by seat type. Our analysis also showed that, although the
contractor has met most of the enterprisewide agreements during this time
period, it has not met a few. The Navy's analysis and reporting of
contractor performance relative to the SLAs, using data for the same time
period, showed that the percentage of operational seats meeting the
agreements averaged about 89 percent from March 2005 to September 2005,
then declined to 74 percent in October 2005 and averaged about 56 percent
between November 2005 and March 2006. These differences illustrate that
contractor performance against the agreements can be viewed differently
depending on how available data are analyzed and presented. They also
illustrate the importance of having
a comprehensive, transparent, and consistent approach to program
performance management that considers a range of perspectives and metrics.

Contractor Satisfaction of SLAs Has Varied by Agreement and Seat Type,
with Not All Agreements Being Met

For the period beginning October 2004 and ending March 2006, the
contractor's performance relative to site-specific SLAs has varied, with
certain agreements consistently being met regardless of seat type, other
agreements being met to varying degrees over time, and still others
largely not being met for certain seat types.^24 Variability in
performance has also occurred for enterprisewide agreements, although most
have been met.

Significant Percentage of All Applicable Seat Types Have Met Certain
Site-Specific Agreements

Between October 2004 and March 2006, the contractor met, or usually met,
many SLAs for each applicable seat type. For example, the
contractor met SLA 324, which covers wide area network connectivity, for
all seat types all of the time. Also, SLA 325, covering network
communication services, and SLA 332, measuring application server
connectivity, were met for all seat types over the same time period. SLA
225, which measures base area network and local area network performance,
was met for essentially all seat types (see fig. 4). Similarly, SLA 328,
which measures the time to implement new seats and application servers,
was met for 94 percent or more of deployed seat types from January 2005
through March 2006 (see fig. 5). (See app. III for descriptions of each
SLA and figures illustrating levels of performance relative to each
applicable seat type.)

Figure 4: Site Level Performance for SLA 225

Figure 5: Site Level Performance for SLA 328

Certain Site-Specific Agreements Have Not Been Consistently Met Over Time

The contractor has not consistently met certain agreements between October
2004 and March 2006. For example, satisfaction of SLA 102, which covers
response time for network problem resolution, has ranged from a high of
100 percent in March 2005 and June 2005 to a low of 79 percent in February
2006. As of March 2006, this SLA was met by 97 percent of all seat types
(see fig. 6). Also, satisfaction of SLA 107, which is a measure of network
performance in areas of availability, latency/packet loss,^25 and quality
of service in support of videoconferencing and voice-over-IP, has varied
over time. Specifically, satisfaction has ranged from a high of 99 percent
in January 2006 to a low of 71 percent in January 2005. As of March 2006,
this agreement was met by 90 percent of all seat types (see fig. 7).

Figure 6: Site Level Performance for SLA 102

Figure 7: Site Level Performance for SLA 107

Significant Percentage of All Seat Types Have Not Met Certain
Site-Specific Agreements

Between October 2004 and March 2006, the contractor has not met certain
agreements for all seat types. For example, for SLA 101, which is a
measure of the time it takes to resolve NMCI user issues, the percentage
of seats meeting the agreement has varied widely. Specifically, the
percentage of mission-critical seats that met the agreement has been
consistently and significantly lower than was the case for the basic or
high end seats. In particular, as of March 2006, SLA 101 was met for about
90 percent of basic seats, 77 percent of high end seats, and 48 percent of
mission-critical seats (see fig. 8). Similarly, for SLA 103, which is a
measure of performance of end user services, the percentage of basic seats
that met the agreement was consistently and significantly lower than that
of high end or mission-critical seats. In March 2006, SLA 103 was met for
about 63 percent of basic seats, 74 percent of high end seats, and 86
percent of mission-critical seats (see fig. 9).

Figure 8: Percentage of Seats Meeting SLA 101

Figure 9: Percentage of Seats Meeting SLA 103

Most Enterprisewide Agreements Have Been Met, but a Few Have Not

The contractor generally met most of the SLAs that have enterprisewide
applicability. In particular, of the 13 such SLAs, 8 were met each month
between October 2004 and March 2006, and another was met all but 1 month
during this time period. Further, a tenth SLA was met for 14 out of the 18
months during this period.

However, the contractor has not consistently met 3 of the 13
enterprisewide SLAs. Specifically, SLA 103, which covers end user
services, was not met 12 of the 18 months. SLA 104, which covers the help
desks, was not met 11 out of the 18 months, including 8 out of the last 9
months of this period. SLA 106, which covers information assurance
services including identifying incidents, responding to incidents, and
configuration of NMCI, was not met for 11 out of 18 months, including the
last 9 months of the period. (See fig. 10 for a summary of the months in
which the contractor met and did not meet the enterprisewide SLAs.)

Figure 10: Months in Which the Enterprisewide SLAs Were Met and Not Met
between October 2004 and March 2006

Contractor Satisfaction of SLAs Relative to Contractually Defined
Performance Levels Has Varied

NMCI program officials told us that they measure the contractor's
SLA-related performance in terms of the percentage of eligible seats that
have met the contractual definitions of full payment and full performance.
More specifically, they compare the number of seats on a site-by-site
basis that have met these definitions with the number of seats that are
eligible. As discussed earlier, full payment means that the contractor has
met 100 percent of the applicable agreements at a given site, and 50 to 90
percent of the planned seats at that site have been cut over (i.e., are
operational). Full performance means that the contractor has met 100
percent of the applicable agreements at a given site, and over 90 percent
of the planned seats at that site have been cut over. In effect, this
approach focuses on performance for only those seats that are at sites
where at least 50 percent of the planned number of seats are actually
operating. It excludes performance at sites where less than 50 percent of
the ordered seats are operating. Moreover, it combines the results for all
SLAs and, therefore, does not highlight differences in performance among
service areas.

For the period beginning in October 2004 and ending in March 2005, the
contractor's performance in meeting the agreements from a contractual
standpoint increased, with the percentage of operational seats that met
either performance level having jumped markedly between October and
December 2004 (from about 5 percent to 65 percent) and then generally
increasing to a high of about 96 percent in March 2005. Since then, the
percentage of seats
meeting either of the two performance levels fluctuated between 82 and 94
percent through September 2005 and then decreased to 74 percent in October
2005. From November 2005 through March 2006, the percentage of seats
meeting either performance level decreased to 55 percent. (See fig. 11 for
the trend in the percentage of operational seats meeting either the full
payment or full performance levels; see fig. 12 for the number of seats
achieving either performance level versus the number eligible for doing so
for the same time period.)

Figure 11: Trend in the Percentage of Operational Seats Meeting Either the
Full Payment or Full Performance Levels

Figure 12: Number of Seats Achieving Either the Full Payment or Full
Performance Levels Versus the Number of Seats Eligible

The preceding descriptions of SLA performance illustrate that contractor
performance against the agreements can be viewed differently depending on
how relevant data are analyzed and presented. Further, they illustrate the
importance of considering different perspectives and metrics in order to
have a comprehensive, transparent, and consistent approach to program
performance management.

NMCI Customer Groups' Satisfaction Levels Vary, but Overall Customer
Satisfaction Is Low

The Navy's three groups of NMCI customers--end users, organizational
commanders, and network operators--vary in the extent to which they are
satisfied with the program, but collectively these customers are generally
not satisfied. With respect to end users, the Navy reports that overall
satisfaction with NMCI improved between 2003 and 2005; however, reported
satisfaction levels have dropped off since September 2005. In addition,
while the Navy reports that this overall level of end user satisfaction
with contractor-provided services has averaged about 76 percent since
April 2004,^26 this is below the Navy-wide target of 85 percent and
includes many survey responses at the lower end of the range of scores
that the Navy defines as "satisfied." With respect to commanders and
network operations leaders, neither group is satisfied with NMCI. In
addition, officials representing each of the customer groups at five
shipyard or air depot installations that we visited expressed a number of
concerns and areas of dissatisfaction with the program. Without
satisfied customers, the Navy runs the risk that NMCI will not attain the
widespread acceptance necessary to achieve strategic program goals.

End User Surveys Show Dissatisfaction with NMCI

Despite reported improvements in end user satisfaction levels since 2002,
end user responses to quarterly satisfaction surveys have been
consistently at the low end of the range of scores that the Navy defines
as "satisfied," and the percentage of end users that the Navy counts as
being "satisfied" has consistently been below the Navy's satisfaction
target level. Specifically, although the percentage of satisfied users
dropped from about 66 percent in June 2002 to around 54 percent for the
next two quarters (September and December 2002), satisfaction reportedly
rose steadily from March 2003 through September 2005, peaking
at that time at about 80 percent. Since then, the percentage of end users
that the Navy reports to be satisfied has declined, leveling off at around
76 percent over the next several months.^27 This means that even with the
Navy's forgiving definition of what constitutes a satisfied end user, at
least 24 percent of end users are dissatisfied with NMCI. (See fig. 13 for
the trends in end user satisfaction with the program and the contractor.)

Figure 13: Trends in End User Satisfaction Levels Related to Program
Contractor Target Levels

Note: Survey participants varied over time.

Exacerbating this long-standing shortfall in meeting end user satisfaction
expectations is the fact that the Navy considers a "satisfied" end user to
include users that are at best marginally satisfied and arguably somewhat
dissatisfied. That is, the Navy uses an average score of 5.5 or greater
(on its 10-point satisfaction scale, where 1 is dissatisfied, and 10 is
satisfied) as the threshold for categorizing and counting end users as
satisfied. This means that users counted as satisfied may include a large
contingent that are at the low end of the satisfaction range (e.g.,
between 5.5 and 7). When the results of the March 2006 survey are examined
in this context, this is indeed the case: 8 of the 14 questions received an
average score below 7.0.
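
As a purely illustrative sketch (not the Navy's survey tooling), the following
shows how this threshold works in practice: a respondent whose average score is
5.5 on the 10-point scale is counted as satisfied even though that score sits
barely above the midpoint. The respondent scores used here are hypothetical.

    # Purely illustrative sketch; the respondent scores below are hypothetical.
    # The Navy counts a respondent as "satisfied" if his or her average score
    # is 5.5 or higher on the 10-point scale (1 = dissatisfied, 10 = satisfied).

    SATISFIED_THRESHOLD = 5.5

    def percent_satisfied(average_scores):
        satisfied = sum(1 for s in average_scores if s >= SATISFIED_THRESHOLD)
        return 100.0 * satisfied / len(average_scores)

    # Respondents scoring 5.5 and 6.0 are counted as satisfied even though
    # they sit at the low end of the satisfaction range.
    print(percent_satisfied([3.0, 5.5, 6.0, 7.5, 9.0]))   # 80.0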

Additional insights into the degree and nature of end user satisfaction
(and dissatisfaction) are apparent when the reported percentages of
satisfied users are examined from different perspectives, such as by (1)
individual survey questions and (2) organizational units. For example,
Navy-reported end user satisfaction survey results for the quarter ending
March 31, 2006, show that while the percentage of users deemed satisfied
with the program averaged about 74 percent, the percentage reported as
satisfied relative to each survey question ranged from a low of 52 to a high
of 87 percent. These insights into end user sources of satisfaction and
dissatisfaction are summarized as follows:

oVariations in satisfaction levels by question. While the percentages of
end users who are categorized as satisfied with the program and with the
contractor do not differ significantly (74 versus 76 percent,
respectively), variations do exist among the percentages satisfied with the
14 areas that the questions address. For example, far fewer (66 percent)
were satisfied with the reliability of the NMCI network than were
satisfied with the professionalism of EDS personnel (87 percent). (See
table 4 for the percentage of users satisfied and dissatisfied according
to each of the 14 survey questions.)

oVariations in satisfaction levels by organizational unit. The percentage
of end users who were categorized as being satisfied with the NMCI program
varied by organizational unit by as much as 18 percentage points. For
example, about 66 percent of users in the Naval Sea Systems Command were
deemed satisfied with the program, as compared with about 84 percent in the
Commander of Navy Installations. Similarly, the percentage of end users
who were categorized as satisfied with the contractor also varied by 17
percentage points, with the Naval Sea Systems Command and Naval Air
Systems Command each having about 69 percent of their users viewed as satisfied
and the Commander of Navy Installations having about 86 percent. (See
tables 5 and 6 for percentages of satisfied end users by Navy and Marine
Corps, respectively, organizations as of March 31, 2006.)

Table 4: NMCI End User Customer Satisfaction Survey Questions and Results
for the Quarterly Period Ending on March 31, 2006

                                        

               Survey questions             Average Percentage not Percentage 
                                              score      satisfied  satisfied 
With the process to make changes to your     5.5            48%        52% 
IT environment?  ^a                                                        
With training on how to use NMCI             6.5             32         68 
effectively?^a                                                             
With having access to the software you       6.6             33         67 
need to accomplish your job?^a                                             
With having access to the computer           7.0             26         74 
hardware you need to accomplish your                                       
job?^a                                                                     
With network reliability?                    6.4             34         66 
With the timeliness of problem               6.6             32         68 
resolution?                                                                
With the dependability of the computer       6.8             29         71 
you use?                                                                   
What is your overall satisfaction with       6.8             27         73 
services provided by EDS?                                                  
With the solution implemented to correct     7.0             27         73 
any problem you experienced?                                               
With finding and using information about     7.0             23         77 
NMCI services?                                                             
With technical support services provided     7.2             25         75 
by the help desk?                                                          
With the accuracy of information             7.1             22         78 
describing how to use NMCI services?                                       
With technical support services provided     7.1             25         75 
by on-site personnel?                                                      
With the professionalism of EDS              8.0            13%        87% 
personnel?                                                                 

Source: GAO based on Navy-provided data.

^aResponses to these questions were not used to determine levels of
satisfaction with contractor-provided services.

Note: Scores shown may reflect rounding decisions made by the Department
of the Navy regarding the results of its calculations.

Table 5: Percentages of Satisfied End Users by Navy Budget Submitting
Office, as of March 31, 2006

                                        

Navy budget submitting offices     Percentage    Percentage satisfied with 
                                  satisfied with contractor-provided services 
                                    NMCI program                              
Naval Sea Systems Command                 66%                          69% 
Naval Air Systems Command                  67                           69 
Naval Facilities Engineering               68                           71 
Command                                                                    
Space and Naval Warfare                    69                           71 
Systems Command                                                            
Chief of Naval Operations                  72                           75 
Administrative Assistant to                75                           76 
the Under Secretary of the                                                 
Navy                                                                       
Commander, U.S. Pacific Fleet              77                           78 
Reserve Forces                             78                           81 
Manpower, Personnel, Training              79                           81 
and Education                                                              
Commander, U.S. Atlantic Fleet             79                           80 
Aggregated Navy Budget                     80                           81 
Submitting Offices^a                                                       
Naval Supply Systems Command               80                           82 
Commander, Navy Installations             84%                          86% 

Source: GAO based on Navy-provided data.

^aIncludes the Bureau of Medicine, Military Sealift Command, Navy
Engineering Logistics Office, Naval Meteorology and Oceanography Command,
Office of Naval Intelligence, Office of Naval Research, and the Naval
Security Group.

Note: Scores shown may reflect rounding decisions made by the Department
of the Navy regarding the results of its calculations.

Table 6: Percentages of Satisfied End Users by U.S. Marine Corps Major
Command, as of March 31, 2006

                                        

Marine Corps Major  Percentage satisfied         Percentage satisfied with 
        Commands          with NMCI program      contractor-provided services 
Aggregated                           69%                               72% 
Marines^a                                                                  
Training and                          69                                72 
Education Command                                                          
U.S. Marine Forces,                   70                                72 
Atlantic                                                                   
Logistics Command                     71                                73 
U.S. Marine Forces,                   71                                73 
Pacific                                                                    
U.S. Marine Forces,                  77%                               81% 
Reserve                                                                    

Source: GAO based on Navy-provided data.

^aIncludes Enterprise USMC, Headquarters Marine Corps, Marine Corps Combat
Development Center, Marine Corps Recruiting Command and Marine Corps
Systems Command. Surveys were distributed to 1,671 of a total population
of 6,685 end users in these Commands.

Note: Scores shown may reflect rounding decisions made by the Department
of the Navy regarding the results of its calculations.

Commander and Network Operator Surveys Show That Both Customer Groups Are
Dissatisfied

The Navy conducted surveys of commander and network operations leader
units in September 2005 and in March 2006. Overall, survey results show
that neither commanders nor operators are satisfied with NMCI.

Commander Survey Results

The results from the two commander satisfaction surveys conducted to date
show that these customers are not satisfied with NMCI. Specifically, on a
scale of 0 to 3, where 0 means not satisfied and 1 means slightly
satisfied with the contractor's support in meeting the mission needs and
strategic goals of these organizations, the average response from all
organizations was 0.65 in September 2005 and 0.76 in March 2006. The
latest survey results show minor differences in the degree of
dissatisfaction with the four types of contractor services addressed
(cutover services, technical solutions, service delivery, and warfighter
support). (See table 7 for the results of the September 2005 and March
2006 commander satisfaction surveys.)

Table 7: Results for the 6-Month Periods Ending on September 30, 2005, and
March 31, 2006, Commander Surveys

                                        

               September              March   
                 2005                  2006   
  Reporting              War-fighter  Cutover Technical  Service      Average  War-fighter  Cutover Technical  Service      Average 
 organization                support          solutions delivery organization      support          solutions delivery organization 
                                     services                           score              services                           score 
Assistant for                      *        *         2        2         2.00            2        1         1        1         1.25 
Administration                                                                                                                      
to the Under                                                                                                                        
Secretary of                                                                                                                        
the Navy                                                                                                                            
Bureau of                          *        0         1        1         0.67           **       **        **       **          n/a 
Personnel                                                                                                                           
Commander of                       1        0         0        1         0.50            1        0         0        0         0.25 
Navy                                                                                                                                
Installations                                                                                                                       
Chief of Naval                     1        1         1        1         1.00            0        1         0        2         0.75 
Operations                                                                                                                          
(CNO)                                                                                                                               
CNO-Field                         **       **        **       **          n/a            1        2         1        1         1.25 
Support                                                                                                                             
Activity,                                                                                                                           
Pacific                                                                                                                             
Command                                                                                                                             
Commander,                         0        0         0        0         0.00            0        0         1        0         0.25 
Atlantic Fleet                                                                                                                      
Commander,                         1        0         0        0         0.25            0        0         0        0         0.00 
Pacific Fleet                                                                                                                       
Naval Air                          0        0         0        0         0.00            0        0         0        0         0.00 
Systems                                                                                                                             
Command                                                                                                                             
Naval                              0        1         1        0         0.50            0        1         1        0         0.50 
Facilities                                                                                                                          
Engineering                                                                                                                         
Command                                                                                                                             
Naval Sea                          0        0         0        0         0.00            0        0         0        0         0.00 
Systems                                                                                                                             
Command                                                                                                                             
Naval Supply                       1        0         0        0         0.25            0        0         0        0         0.00 
Systems                                                                                                                             
Command                                                                                                                             
Office of                         **       **        **       **          n/a            *        2         1        2         1.67 
Naval Research                                                                                                                      
Reserve Forces                     1        2         2        1         1.50            1        2         2        2         1.75 
Space and                          1        *         1        1         1.00            2        *         1        2         1.67 
Naval Warfare                                                                                                                       
Systems                                                                                                                             
Command                                                                                                                             
Logistics                          0        0         0        0         0.00            0        0         0        0         0.00 
Command                                                                                                                             
Manpower,                         **       **        **       **          n/a            2        1         2        2         1.75 
Personnel,                                                                                                                          
Education and                                                                                                                       
Training                                                                                                                            
Headquarters                       0        0         0        0         0.00            *        1         1        0         0.67 
Marine Corps                                                                                                                        
Marine Corps                       1        2         2        2         1.75            0        1         2        3         1.50 
Combat                                                                                                                              
Development                                                                                                                         
Center                                                                                                                              
Marine Corps                       0        1         0        0         0.25            0        1         1        0         0.50 
Systems                                                                                                                             
Command                                                                                                                             
Marine Corps                      **       **        **       **          n/a            *        0         0        0         0.00 
Recruiting                                                                                                                          
Command                                                                                                                             
Commander,                        **       **        **       **          n/a            1        1         1        1         1.00 
Marine Forces                                                                                                                       
Marine Forces,                     0        0         0        0         0.00           **       **        **       **          n/a 
Atlantic                                                                                                                            
Marine Forces,                     0        1         1        1         0.75            0        0         0        0         0.00 
Pacific                                                                                                                             
Marine Forces,                    **       **        **       **          n/a            0        2         0        0         0.50 
Reserves                                                                                                                            
Military                           0        0         0        0         0.00           **       **        **       **          n/a 
Sealift                                                                                                                             
Command                                                                                                                             
Naval                              *        2         2        2         2.00           **       **        **       **          n/a 
Education and                                                                                                                       
Training                                                                                                                            
Command                                                                                                                             
Training and                       2        2         0        1         1.25            *        3         2        2         2.33 
Education                                                                                                                           
Command                                                                                                                             
Overall                                                                  0.65                                                  0.76 
satisfaction                                                                                                                        
average                                                                                                                             

Legend

"*" no response was provided

"**" the organization was not included in survey report

Source: GAO based on Navy-provided data.
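
The per-organization averages in table 7 appear to reflect the mean of the
non-missing responses across the four service areas (responses marked "*" are
excluded). The short sketch below, which is purely illustrative and not part
of the Navy's survey process, reproduces two of the September 2005 averages
shown above; the same averaging applies to the network operations leader
scores in table 8.

    # Purely illustrative sketch of the averaging apparent in table 7: each
    # organization's score is the mean of its non-missing 0-3 responses
    # (None represents a response marked "*" in the table).

    def average_score(responses):
        answered = [r for r in responses if r is not None]
        return round(sum(answered) / len(answered), 2) if answered else None

    # September 2005 examples drawn from table 7:
    print(average_score([None, None, 2, 2]))   # 2.0  (Assistant for Administration)
    print(average_score([1, 2, 2, 1]))         # 1.5  (Reserve Forces)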

Network Operations Leaders Survey Results

The Navy-reported results of the two network operations leader
satisfaction surveys conducted to date show that these customers are also
not satisfied with NMCI. Specifically, on a scale of 0 to 3, where 0 means
not satisfied and 1 means slightly satisfied with the contractor's support
in meeting the mission needs and strategic goals of these two organizations,
the average of the responses from NETWARCOM in September 2005 was 0.33,
rising to 0.67 in March 2006. For MCNOSC, the average of the responses to
both surveys was 0.00. (See table 8 for these results.) Of the three types
of contractor services addressed in the survey (mission support and
planning, network management, and service delivery), network management
services, which includes information assurance and urgent software
patching, received a score of 0 from both organizations on both surveys.

Table 8: Results for the 6-Month Periods Ending on September 30, 2005, and
March 31, 2006, Network Operations Leaders Survey

                                        

                    September 2005                       March 2006
Reporting       Mission    Network  Service Average   Mission    Network  Service Average 
organization  support &  management delivery   score  support &  management delivery   score 
               planning                               planning 
NETWARCOM          0.00       0.00     1.00    0.33      1.00       0.00     1.00    0.67 
MCNOSC             0.00       0.00     0.00    0.00      0.00       0.00     0.00    0.00 
Overall                                        0.17                                  0.33 
satisfaction                                                                              
average                                                                                   

Source: GAO based on Navy-provided data.

Shipyard and Air Depot Customers Consistently Identified a Range of
Concerns and Areas of Dissatisfaction with NMCI

Consistent with the results of the Navy's customer satisfaction surveys,
officials representing end users, commanders, and network operations
personnel at five shipyards or air depots^28 that we interviewed cited a
number of concerns or sources of dissatisfaction with NMCI. The anecdotal
information that they provided to illustrate their concerns is described
below.

Continued Reliance on Legacy Systems

Shipyard and air depot officials at all five sites told us that they have
continued to rely on their legacy systems rather than NMCI for various
reasons. For example, officials at one air depot stated that NMCI provided
less functionality than their legacy systems and thus they have continued
to use these legacy systems to support mission operations. Also, officials
at one shipyard told us that site personnel lack confidence in NMCI and
thus they continue to use legacy systems. Officials at the other two
shipyards voiced even greater concerns, with officials at one saying that
only NMCI seats (i.e., workstations) are running on the NMCI intranet
(their servers are still running on their legacy network), and officials
at the other saying that NMCI does not support their applications and thus
they primarily use it for e-mail. Similarly, officials at an air depot
stated that NMCI workstations are not capable of supporting certain
applications, such as high-performance modeling, and thus they operate
about 233 other workstations to support their needs.

Loss in Workforce Productivity

According to a December 2005 memo from the Commander of one shipyard to
the Naval Sea Systems Command, NMCI software updates adversely affect the
operation of network applications. Consistent with this, officials at two
of the sites stated that NMCI is hurting workforce productivity, with
officials at one shipyard saying that system downtime, particularly as it
relates to major applications, has deteriorated and is unacceptable, and
officials at another shipyard saying that NMCI response time is slow both
on- and off-site. To illustrate, officials at one air
depot said that personnel cannot download more than one file at a time,
while officials at shipyards stated that "reach back" to legacy systems
through NMCI is slow, sometimes taking 45 minutes to open a document.
Further, officials at shipyards complained that users' profiles do not
follow the user from one workstation to another, causing users to recreate
them, while officials at one air depot stated that NMCI does not provide
them the capability to monitor employees' inappropriate use of the
Internet (e.g., excessive use or accessing unauthorized sites).

Lack of Support of Dynamic Work Environments

Both air depot and shipyard officials described their respective work
environments as dynamic, meaning that they are frequently changing, and
thus require flexibility in moving and configuring workstations. Further,
shipyards operate at the waterfront, which we were told is an environment
that requires quick responses to changing needs. For example, ships come
in, barges are created to service them, and these barges must be outfitted
with computers. Decisions regarding new barge setups and equipment
movements occur in a short amount of time. According to shipyard officials,
NMCI has not been able to support these barge-related requirements because
it is not flexible enough to quickly react to shifting work priorities. As
a result, officials with one shipyard stated that they have had to provide
their own waterfront support using legacy systems. Similarly, officials
with the air depots stated that the NMCI contractor has a difficult time
moving seats fast enough to keep up with changing needs.

Limitations in Help Desk Support

Officials from each of the shipyards and air depots voiced concerns and
dissatisfaction with help desk assistance. According to officials with the
air depots, the quality of help desk support is inconsistent, and thus
they have had to assume more of the burden in dealing with IT system
problems since they transitioned to NMCI. Shipyard officials were even
more critical of help desk support. According to officials at one
shipyard, help desk support is not working, as it is almost impossible to
get a help desk call resolved within 1 hour. Similarly, officials at
another shipyard told us that help desk responsiveness has been poor
because it takes hours, if not days, to get problems fixed. The previously
mentioned memo from the Commander of one shipyard to the Naval Sea Systems
Command cited an average time of 2.4 days to respond to customer inquiries.

Problems with NMCI Site Preparation and Transition

Officials from all five sites expressed concerns with the manner in which
they were prepared for transitioning to NMCI. According to officials at
one air depot, certain seat management requirements were overlooked.
Moreover, because the contractual terms are difficult to follow and
training was not adequate, NMCI users have struggled to understand the
contract processes that govern, for example, how to order new software
and hardware or how to relocate machines. In particular, they said users
do not know whom they should contact to address a given need, and
officials with one air depot noted that NMCI has no solution for their
electronic classroom needs. Officials at one shipyard attributed
inadequate NMCI site preparation to insufficient planning prior to
deploying NMCI
and a lack of transparency in how NMCI was being managed, including how
deployment issues were to be resolved. As stated by officials at another
shipyard, the transition to NMCI was difficult and very disruptive to
operations because they had no control over the contractor transition
team.

NMCI program officials told us that they are aware of the concerns and
sources of dissatisfaction of shipyard and air depot customers; however,
they added that many of these concerns are either not supported by data or
reflect
customers' lack of familiarity with the services available under the
contract. In particular, they said that they have not been provided any
data showing a drop in workforce productivity caused by NMCI. They also
said that continued reliance on legacy systems illustrates a lack of
familiarity with the contract because provisions exist for moving legacy
servers onto NMCI and supporting certain applications, such as
high-performance modeling. Further, they said that the contract supports
monitoring Internet usage, provides waterfront support to shipyards, and
provides help desk service 24 hours a day, 7 days a week. Nevertheless,
they acknowledged that both a lack of customer understanding and customer
perceptions about the program are real issues affecting customer
satisfaction that need to be addressed.

Customer Satisfaction Improvement Efforts Are Not Being Guided by
Effective Planning

The NMCI program office reports that improving customer satisfaction is a
program priority. Accordingly, it has invested and continues to invest
time and resources in a variety of activities that it associates with
customer satisfaction improvement, such as holding user conferences and
focus groups. However, these efforts are not being guided by a documented
plan that defines prioritized improvement projects and associated resource
requirements, schedules, and measurable goals and outcomes. Given the
importance of improved customer satisfaction to achieving NMCI program
goals and benefits, it is important for the Navy to take a structured and
disciplined approach to planning its improvement activities. Without it,
the program office cannot adequately ensure that it is effectively
investing scarce program resources.

As we have previously reported,^29 effectively managing program
improvement activities requires planning and executing such activities in
a structured and disciplined fashion. Among other things, this includes
developing an action plan that defines improvement projects and
initiatives, assigns roles and responsibilities, sets priorities,
identifies resource needs, establishes time lines with milestones, and
describes expected results in measurable terms. The Software Engineering
Institute's IDEAL^SM model, for example, is one recognized approach for
managing process improvement efforts.^30 According to this model,
improvement efforts should include a written plan that serves as the
foundation and basis for guiding improvement activities, including
obtaining management commitment to and funding for the activities,
establishing a baseline of commitments and expectations against which to
measure progress, prioritizing and executing activities and initiatives,
determining success, and identifying and applying lessons learned. Through
such a structured and disciplined approach, improvement resources can be
invested in a manner that produces optimal results. Without such an
approach, improvement efforts can be reduced to trial and error.

The NMCI program office identified seven initiatives that are intended to
increase customer satisfaction with the program. According to program
officials, the initiatives are (1) holding user conferences, (2)
conducting focus groups, (3) administering diagnostic surveys, (4)
strengthening help desk capabilities, (5) expanding network services
(e.g., adding broadband remote access), (6) assessing infrastructure
performance, and (7) initiating a lean six sigma effort.^31 Following are
descriptions of each initiative:

User conferences. The program office has conducted semiannual NMCI user
conferences since 2000. According to program officials, these conferences
provide a forum for users to directly voice to program leaders their
sources of dissatisfaction with NMCI. During the conferences, users ask
questions, participate in issue-focused breakout sessions, and engage in
informal discussions. We attended the June 2005 user conference and
observed that Navy and contractor program officials provided information,
such as updates on current and planned activities and capabilities, while
users had opportunities to provide comments and ask questions. According
to program officials, the conferences are useful in making program
officials aware of customer issues and are used to help diagnose NMCI
problems.

Focus groups. According to program officials, they conduct user focus
groups to, among other things, solicit reasons for customer
dissatisfaction, explore solutions, and test newly proposed end user
satisfaction survey questions. The focus group sessions include invited
participants and are guided by prepared scripts. The results of the
sessions are summarized for purposes of identifying improvements such as
revisions to user satisfaction survey questions.

Diagnostic surveys. The program office performs periodic surveys to
diagnose the sources of user dissatisfaction with specific services, such
as e-mail, printing, and technical support. According to program
officials, these surveys help identify the root causes of user
dissatisfaction and support analysis of areas needing improvement.
However, they could not identify specific examples in which such causes
were identified and addressed and measurable improvements resulted.

Help desk improvement team. The program office established a team to
identify the reasons for declining end user satisfaction survey scores
relative to the technical support services provided by the help desk.
According to program officials, the team traced declining satisfaction
levels to such causes as help desk agents' knowledge, training, and
network privilege shortfalls. To address these limitations, the program
office reports that it has redesigned and restructured help desk
operations to organize help desk agents according to skills and
experience, route calls according to the skill level needed to address the
call, target needed agent training, hold daily meetings with agents to
apprise them of recent issues, and monitor help desk feedback. However,
program officials could not link these efforts to measurable improvements
in help desk performance, and NMCI customers that we interviewed during
our visits to shipyards and air depots voiced concerns with help desk
support.

Expanded network services. NMCI program officials stated that a key
improvement initiative has been expanding the scope of network-related
services that are available under the contract. In particular, they point
to such new services as broadband remote access for all laptop users,
antispam services for all e-mail accounts, and antispyware services for
all accounts as having improved customer satisfaction. Further, they said
that the planned addition of wireless broadband access will increase
customer satisfaction. However, they could not provide data showing how
these added services affected customer satisfaction, or how future
services are expected to affect satisfaction.

Infrastructure performance assessment. Working with EDS, the program
office undertook an NMCI network infrastructure assessment that was
intended to identify and mitigate performance issues. This assessment
included establishing metrics and targets for common user functions such
as opening a Web site, then determining actual network performance at the
Washington Navy Yard and Marine Corps installations in Quantico, Virginia.
According to program officials, assessment results included finding that
network performance could be improved by balancing traffic among firewalls
and upgrading wide area network circuits. As a result of this initial
assessment, the program has begun adjusting network settings and upgrading
hardware at additional NMCI sites. Further, program officials said they
are expanding their use of network infrastructure metrics to all sites.
However, they neither provided us with a plan for doing so nor
demonstrated that these efforts have affected customer satisfaction.

Lean six sigma. Program officials said they are applying lean six sigma
techniques to improve customer satisfaction. In particular, they have
established a customer satisfaction workgroup, which is to define a
process for identifying customer problems and prioritizing improvement
projects. They said that, for each project, they will perform concept
testing using pilot projects and focus groups. They also said that they
plan to establish a steering committee that includes representatives from
the Navy and the contractor. The officials told us that they have
initiated seven projects using lean six sigma techniques, although they
did not provide us with any information about the results of these
projects or their impact on customer satisfaction.

While any or all of these initiatives could result in improvements to
customer satisfaction, the program office could not demonstrate that they
have produced or will produce measurable improvements. Moreover, the
latest customer satisfaction data provided to us show that satisfaction
levels are not improving. Further, it is unclear how these various
initiatives relate to one another, and various aspects of these
initiatives appear redundant, such as the use of multiple teams and venues
to identify root causes and propose solutions.

One reason for this lack of demonstrable improvements and for the
redundancy is the way in which the program office has pursued its improvement
initiatives. In particular, they have not been pursued as an integrated
portfolio of projects that were justified and prioritized on the basis of
relative costs and benefits. Further, they have not been guided by a
well-defined action plan that includes explicit resource, schedule, and
results-oriented baselines, as well as related steps for knowing whether
expected outcomes and benefits have actually accrued. Rather, program
officials stated that customer satisfaction improvement activities have
been pursued as resources become available and in reaction to immediate
issues and concerns.

Without a proactive, integrated, and disciplined approach to improving
customer satisfaction, the Navy does not have adequate assurance that it
is optimally investing its limited resources. While the lean six sigma
techniques that program officials told us they are now applying to
customer satisfaction improvement advocate such an approach, program
officials did not provide us with documentation demonstrating that they
are effectively planning and executing these projects.

Conclusions

IT service programs, like NMCI, are intended to deliver effective and
efficient mission support and to satisfy customer needs. If they do not,
or if they are not managed in a way that makes it possible to know whether
they do, then such programs are at risk. Therefore, it is important for
such programs
to be grounded in outcome-based strategic goals that are linked to
performance measures and targets, and it is important for progress against
these goals, measures, and targets to be tracked and reported to agency
and congressional decision makers. If such measurement does not occur,
then deviations from program expectations will not become known in time
for decision makers to take timely corrective action. The inevitable
consequence is that program results will fall short of those that were
promised and used to justify investment in the program. The larger the
program, the more significant these deviations and their consequences can
be.

NMCI is an enormous IT services program and thus requires highly effective
performance management practices. However, such management has not been
adequate. It should include measuring progress against strategic program
goals; reporting performance against those goals and other important
program aspects to key decision makers; examining service level agreement
satisfaction from multiple vantage points; and ensuring customer
satisfaction. One reason for this inadequacy is that
measurement of progress against strategic program goals has not been a
priority for the Navy on NMCI, giving way to the Navy's focus on deploying
NMCI seats to more sites despite a long-standing pattern of low customer
satisfaction with the program and known performance shortfalls with
certain types of seats. Moreover, despite investing in a range of
activities intended to improve customer satisfaction, plans to effectively
guide these improvement efforts, including plans for measuring the success
of these activities, have not been developed. Given that the Navy reports
that it has already invested about 6 years and $3.7 billion in NMCI, a
comprehensive understanding of the program's performance to date, and of
its prospects for the future, is long overdue.

To its credit, the Navy recognizes the importance of measuring program
performance, as evidenced by its use of service level agreements, its
extensive efforts to survey customers, and its various customer
satisfaction improvement efforts. However, these steps need to be given
the priority that they deserve and be expanded to obtain a full and
accurate picture of program performance. Doing less increases the risk of
inadequately informing ongoing NMCI investment management decisions that
involve huge sums of money and carry important mission consequences.

Recommendations for Executive Action

To improve NMCI performance management and better inform investment
decision making, we recommend that the Secretary of Defense direct the
Secretary of the Navy to ensure that the NMCI program adopts robust
performance management practices that, at a minimum, include (1)
evaluating and appropriately adjusting the original plan for measuring
achievement of strategic program goals and providing for its
implementation in a manner that treats such measurement as a program
priority; (2)
expanding its range of activities to measure and understand service level
agreement performance to provide increased visibility into performance
relative to each agreement; (3) sharing the NMCI performance results with
DOD, Office of Management and Budget, and congressional decision makers as
part of the program's annual budget submissions; and (4) reexamining the
focus, scope, and transparency of its customer satisfaction activities to
ensure that areas of dissatisfaction described in this report are
regularly disclosed to the aforementioned decision makers and that
customer satisfaction improvement efforts are effectively planned and
managed. In addition, we recommend that the Secretary of Defense direct
the Secretary of the Navy, in collaboration with the various Navy entities
involved in overseeing, managing, and employing NMCI, to take appropriate
steps to ensure that the findings in this report and the outcomes from
implementing the above recommendations are used in considering and
implementing warranted changes to NMCI's scope and approach.

Agency Comments and Our Evaluation

In its comments on a draft of this report, signed by the Deputy Assistant
Secretary of Defense (Command, Control, Communications, Intelligence,
Surveillance, Reconnaissance & Information Technology Acquisition
Programs) and reproduced in appendix IV, DOD agreed with our
recommendations and stated that it has implemented, is implementing, or
will implement each of them. In this regard, the department stated that
the report accurately highlights the need to adjust the NMCI strategic
goals and associated measures, and it committed to, among other things,
sharing additional NMCI performance data with decision makers as part of
the annual budget process. Notwithstanding this agreement, DOD also
commented that the Navy believes that our draft report contained factual
errors, misinterpretations, and unsupported conclusions. We do not agree
with the Navy's position. The Navy's points are summarized below along
with our response.

oThe Navy stated that our review focused on Navy shipyards and air depots
to the exclusion of Marine Corps sites. We disagree. As the Objectives,
Scope, and Methodology section of our report points out, the scope of our
review covered the entire NMCI program and extended to Navy and Marine
Corps sites based on data we obtained from program officials. For example,
our work on the extent to which NMCI had met its two strategic goals was
programwide, and our work on SLA performance and customer satisfaction
surveys included Navy and Marine Corps sites at which NMCI was operating
and Navy and Marine Corps customers that responded to the program's
satisfaction surveys.

oThe Navy stated that NMCI is a strategic success, noting that the program
is meeting its goals of providing information superiority (as well as
information security) and fostering innovation. As part of these
statements, the Navy cited such things as the number of users supported
and seats deployed, the types of capabilities fielded, and contracting
actions taken. In addition, the Navy stated that NMCI has thwarted
intrusion attacks that have penetrated other DOD systems, and it concluded
that NMCI represents a major improvement in information superiority over
the Navy's legacy network environment in such areas as virus protection
and firewall architecture. It also noted that more Naval commands now have
access to state-of-the-art workstations and network services, which it
concluded means that NMCI is fostering innovation. While we do not
question these various statements about capabilities, improvements, and
access, we would note they are not results-oriented, outcome-based
measures of success. Moreover, we do not agree with the statements about
NMCI meeting its two strategic goals and being a strategic success. As we
show in our report using the Navy's own performance categories,
performance targets, and actual SLA and other performance data, NMCI met
only 3 of the 20 performance targets spanning nine performance categories
that the Navy established for determining goal attainment. Concerning
these results, the Navy stated that our report's use of SLA performance
data constitutes a recommendation on our part for using such data in
determining program goal attainment, which the Navy said is "awkward"
because SLAs "do not translate well into broad goals." We do not agree
that our report recommends the use of any particular performance data and
targets for determining program goal attainment. Our report uses these
data and targets only because the NMCI program office provided them to us
in response to our request for NMCI performance data relative to the nine
Navy-established performance categories. We are not recommending any
particular performance targets or data. Rather, we are recommending that
the approach for measuring achievement of strategic goals be reevaluated
and adjusted. Accordingly, we support DOD's comment that the Navy needs to
adjust the original NMCI strategic goals and associated measures.

oThe Navy stated that we misinterpreted SLA data as they relate to the
contractor performance categories of full payment and full performance. We
disagree. The report presents a Navy-performed analysis of SLA data
relative to the full payment and full performance categories that offers
no interpretation of these data. However, because the Navy's analysis of
SLA data is an aggregation, we performed a different analysis to provide
greater visibility into individual SLA performance that the Navy's full
payment and full performance analyses tend to hide. Our analysis also
avoids the bundling and averaging concerns that the Navy raised.

oThe Navy stated that some of our customer satisfaction conclusions were
unsupported. Specifically, the Navy said that, given the way it collects
end-user satisfaction responses, a score of 5.5 or higher on a 10-point
scale indicates a satisfied user, and that such a scale is in line with
industry practice. Therefore, the Navy said that user satisfaction survey
responses do not "break out" in a way that supports our conclusion that
scores of 5.5 through 7 indicate marginally satisfied users. We do not
agree. While we
recognize that the Navy's 1-10 scale does not differentiate between
degrees of satisfaction, we believe that doing so would provide insight
and perspective that is lacking from merely counting a user as satisfied
or not satisfied. When we analyzed the responses to individual questions
in terms of degrees of satisfaction, we found that average responses to 10
of 14 survey questions were 5.5 to 7, which is clearly close to the lower
limit of the satisfaction range. Also, with regard to customer
satisfaction, the Navy stated that our inclusion in the report of
subjective statements from shipyard and air depot officials did not
include any data to support the officials' statements and thus did not
support our conclusions. We recognize that the officials' statements are
subjective and anecdotal, and our report clearly identified them as such.
Nevertheless, we included them in the report because they are fully
consistent with the customer satisfaction survey results and thus help
illustrate the nature of NMCI user concerns and areas of dissatisfaction
that the survey results show exist.
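
To illustrate the distinction between the binary satisfied/not satisfied
count and the degrees-of-satisfaction view discussed above, the following
sketch (written in Python purely for illustration, using hypothetical
average scores; only the 5.5 satisfaction threshold and the 5.5-through-7
marginal band come from this report) shows how categorizing average
question scores can reveal how close responses sit to the lower limit of
the satisfaction range.

    # Hypothetical per-question average scores on the survey's 1-10 scale.
    # The 5.5 cutoff is the Navy's satisfaction threshold; the 5.5-7 band
    # reflects the "marginally satisfied" view discussed in this report.
    question_averages = {
        "access to needed hardware": 6.2,
        "network reliability": 5.8,
        "help desk support": 6.9,
        "overall satisfaction": 7.4,
    }

    def categorize(score):
        """Return the binary category and a finer-grained band for a score."""
        binary = "satisfied" if score >= 5.5 else "not satisfied"
        if score < 5.5:
            band = "dissatisfied"
        elif score <= 7.0:
            band = "marginally satisfied"
        else:
            band = "clearly satisfied"
        return binary, band

    for question, average in question_averages.items():
        binary, band = categorize(average)
        print(f"{question}: {average:.1f} -> {binary} ({band})")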

oThe Navy stated that NMCI provides adequate reports to key decision
makers. However, we disagree because the reporting that the Navy has done
has yet to disclose the range of performance and customer satisfaction
issues that our report contains. Our message is that fully and accurately
disclosing program and contractor performance and customer satisfaction to
the various entities responsible for overseeing, managing, and employing
NMCI will serve to strengthen program performance and accountability.

The Navy also provided various technical comments, which we have
incorporated as appropriate.

We are sending copies of this report to interested congressional
committees; the Secretary of Defense; the Secretary of the Navy; the
Commandant of the Marine Corps; and the Director, Office of Management and
Budget. We also will make copies available to others upon request. In
addition, the report will be available at no charge on the GAO Web site at
http://www.gao.gov.

If you have any questions concerning this information, please contact me
at (202) 512-6256 or by e-mail at [email protected] . Contact points for
our Offices of Congressional Relations and Public Affairs may be found on
the last page of this report. Key contributors to this report are listed
in appendix V.

Randolph C. Hite
Director, Information Technology Architecture and Systems
Issues

List of Congressional Addressees

The Honorable John Warner
Chairman
The Honorable Carl Levin
Ranking Minority Member
Committee on Armed Services
United States Senate

The Honorable Susan M. Collins
Chairman
The Honorable Joseph I. Lieberman
Ranking Minority Member
Committee on Homeland Security and Governmental
Affairs
United States Senate

The Honorable Judd Gregg
United States Senate

The Honorable Olympia J. Snowe
United States Senate

The Honorable John E. Sununu
United States Senate

The Honorable Duncan L. Hunter
Chairman
The Honorable Ike Skelton
Ranking Minority Member
Committee on Armed Services
House of Representatives

The Honorable Tom Davis
Chairman
The Honorable Henry A. Waxman
Ranking Minority Member
Committee on Government Reform
House of Representatives

The Honorable Thomas H. Allen
House of Representatives

The Honorable Michael H. Michaud
House of Representatives

Appendix I
Objectives, Scope, and Methodology

Our objectives were to review (1) whether the Navy Marine Corps Intranet
(NMCI) is meeting its strategic goals, (2) the extent to which the
contractor is meeting its service level agreements (SLA), (3) whether
customers are satisfied with the program, and (4) what is being done to
improve customer satisfaction.

To determine whether NMCI is meeting its strategic goals, we

oreviewed documents provided by the Department of the Navy describing the
mission need for NMCI, strategic goals, performance measures, and data
gathered on actual performance,

oconducted interviews with officials from the offices of the Department of
Defense Chief Information Officer (CIO), Department of the Navy CIO, and
Assistant Secretary of the Navy for Research, Development, and
Acquisition, including officials in the NMCI program office,

oidentified the NMCI strategic goals, related performance categories,
associated performance targets, and actual performance data through
document reviews and interviews,

odeveloped an analysis showing NMCI's performance relative to the
strategic goals, performance categories, and targets based upon available
actual performance data, and

oshared our analysis with program officials and adjusted the analysis
based on comments and additional data they provided.

To determine the extent to which performance expectations defined in NMCI
SLAs have been met, we

oconducted interviews with NMCI program office and contractor officials to
gain an understanding of available SLA performance data and potential
analysis methods,

oobtained data on actual SLA performance that the Navy uses as the basis
for making performance-based payments to the contractor; for each SLA,
these data indicated whether one or more measurements were taken and, if
so, whether the measure was met or not, for each seat type (i.e., basic,
high end, and mission-critical), at every site for each month from October
2004 through March 2006,^1

oanalyzed data for site-specific SLAs by calculating the number of seats
that met each agreement at each site for each month; when measurement data
were available by seat type, we calculated the number of seats that met
each agreement for each seat type, and otherwise we calculated the total
number of seats that met each agreement. We counted an agreement as met at
a site if all of the agreement's measured targets were met at the site for
a given month. To calculate the percentage of seats for which an agreement
was met, we divided the total number of seats at all sites for which the
agreement was met by the total number of seats at all sites for which
measurements were made (a simplified sketch of this calculation follows
this list),

oanalyzed data for enterprisewide SLAs by determining whether an agreement
was met at all Navy (excluding the Marine Corps) and all Marine Corps
sites for each month, and we counted an agreement as met if all of the
agreement's measured targets were met for a given month,

ocompared our site-specific and enterprisewide SLA analyses across months
to identify patterns and trends in overall SLA performance. In situations
where an SLA is composed of site-specific and enterprisewide measures, we
did not aggregate our site-specific and enterprisewide results; thus, an
SLA could have been met at the site level but not at the enterprise level,
and vice versa, and

odescribed our analysis method and shared our results with program office
and contractor officials and made adjustments based on their comments.
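
The following sketch (written in Python purely for illustration, with
made-up measurement records; the counting rules are those described in the
bullets above) shows the site-level calculation for a single SLA and
month: an agreement counts as met at a site only if all of its measured
targets were met there, and the percentage of seats meeting the agreement
is the number of seats at sites where it was met divided by the number of
seats at sites where it was measured.

    # Made-up measurement records for one SLA: one entry per site and month,
    # with the site's seat count and the outcome of each measured target.
    records = [
        {"site": "Site A", "month": "2005-01", "seats": 1200, "targets_met": [True, True]},
        {"site": "Site B", "month": "2005-01", "seats": 800, "targets_met": [True, False]},
        {"site": "Site C", "month": "2005-01", "seats": 500, "targets_met": [True, True]},
    ]

    def percent_of_seats_meeting_sla(records, month):
        """Percentage of measured seats at sites where every measured target was met."""
        seats_measured = 0
        seats_met = 0
        for record in records:
            if record["month"] != month or not record["targets_met"]:
                continue  # no measurement was taken for this site and month
            seats_measured += record["seats"]
            if all(record["targets_met"]):  # met only if all measured targets were met
                seats_met += record["seats"]
        return 100.0 * seats_met / seats_measured if seats_measured else None

    # Sites A and C (1,700 of 2,500 measured seats) met the agreement: 68.0 percent.
    print(percent_of_seats_meeting_sla(records, "2005-01"))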

To determine whether NMCI customers are satisfied, we

oobtained and analyzed results of end user surveys conducted from June
2002 through March 2006 and of commander and network operations leader
surveys conducted from September 2005 through March 2006,

oconducted interviews with NMCI program office and contractor officials to
gain an understanding of how the surveys were developed and administered
and their procedures for validating and auditing reported results,

oanalyzed data in the survey reports by comparing actual with desired
results, and we also analyzed the data to identify trends in satisfaction
levels over time and variation in satisfaction by question, organization,
and type of service, and

oconducted interviews with a broad range of NMCI users at Navy sites:
Portsmouth Naval Shipyard, Norfolk Naval Shipyard, Puget Sound Naval
Shipyard, Jacksonville Naval Air Depot, and North Island Naval Air Depot.
We selected these sites because they are among the largest, include
diverse user communities, and represent different stages of program
implementation. Participants in the interviews included officials from
the offices of the Commander, the CIO, and Information Technology and
Communications Services; end users relying on NMCI desktop services in
day-to-day operations; and contractor representatives.

To determine what has been done to improve customer satisfaction, we

ointerviewed program office and contractor officials to identify and
develop an understanding of customer satisfaction improvement efforts; to
determine the results and impact of each effort, we interviewed program
officials and obtained and analyzed relevant documentation,

oresearched best practices for the effective management of improvement
activities and compared the program office's approach with the practices
we identified to evaluate the overall effectiveness of the customer
satisfaction improvement activities, and

oattended the June 2005 NMCI enterprise conference to observe the
proceedings.

We performed our work from April 2005 to August 2006 in accordance with
generally accepted government auditing standards.

Appendix II
Customer Satisfaction Survey Questions

This appendix includes the questions used in the three customer
satisfaction surveys: End User Customer Satisfaction Survey, Navy Echelon
II and Marine Corps Major Command Commander's Incentive Survey, and Navy
and Marine Corps Network Operations Leader's Survey.

End User Customer Satisfaction Survey Questions

The end user customer satisfaction survey consists of 14 questions, 10 of
which are tied to incentives. Users are asked to think only of the
experiences they have had with the services during the prior 3 months. If
a question is not relevant to their experience, they are asked to indicate
that it is not applicable. Otherwise, they are asked to score it on a 1-10
scale with 1-5 being levels of dissatisfaction, and 6-10 being levels of
satisfaction. The survey also currently asks users to provide demographic
information, as well as suggestions for improvement and sources of
dissatisfaction. Table 9 lists the end user customer satisfaction survey
questions.  ^1

Table 9: NMCI End User Customer Satisfaction Survey Questions

                                        

                           What is your satisfaction                          
*With having access to the computer hardware you need to accomplish your   
job?                                                                       
With the dependability of the computer you use?                            
*With having access to the software you need to accomplish your job?       
With network reliability?                                                  
With the professionalism of EDS personnel?                                 
With finding and using information about NMCI services?                    
With the accuracy of information describing how to use NMCI services?      
*With training on how to use NMCI effectively?                             
With technical support services provided by the help desk?                 
With technical support services provided by on-site personnel?             
With the timeliness of problem resolution?                                 
With the solution implemented to correct any problem you experienced?      
*With the process to make changes to your IT environment?                  
What is your overall satisfaction with services provided by EDS?           

Source: March 2006 Quarterly Customer Satisfaction Survey Report.

Note: Questions marked with an asterisk are not used for incentive
purposes.

Navy Echelon II Commanders and Marine Corps Major Command Commander's
Customer Satisfaction Incentive Survey

The commander's customer satisfaction incentive survey consists of four
topics (warfighter support services, cutover services, technology
solutions, and service delivery) corresponding to key mission and/or
business objective-related services or capabilities. Each topic is broken
down into a number of subtopics. Under each subtopic, the survey asks
commanders to indicate whether they agree, disagree, or have no basis to
respond to a series of statements about EDS's performance. The survey also
asks commanders to rate their overall satisfaction with each topic as
"extremely satisfied," "mostly satisfied," "slightly satisfied," "not
satisfied," or "no basis to respond." The last section of each topic
contains two open-ended questions soliciting feedback on satisfaction with
NMCI services.

Table 10 is a condensed version of the commander's customer satisfaction
survey that includes each of the subtopics, statements about EDS's
performance, the overall topic satisfaction question, and the two
open-ended questions.^2

Table 10: Navy Echelon II and Marine Corps Major Command Commander's
Customer Satisfaction Incentive Survey's Questions

                                        

Warfighter support services                                                
1. Please evaluate EDS support for the following warfighter support        
service areas:                                                             
                                                                              
Classified network support                                                 
                                                                              
oEDS understands the requirements unique to SIPRNet Systems.               
                                                                              
oEDS adequately supports SIPRNet operations.                               
                                                                              
oEDS provides timely SIPRNet technical support.                            
                                                                              
oEDS provides adequate remote access to the SIPRNet from NMCI seats.       
                                                                              
Deployable support                                                         
                                                                              
oEDS provides adequate and effective predeployment training.               
                                                                              
oEDS provides deployment process documentation that is readily available,  
clear,      and accurate.                                                  
                                                                              
oEDS provides effective NMCI help desk support to deployed assets.         
                                                                              
oEDS effectively supports the movement of resources out of the NMCI        
environment      for deployment into IT21 and MTDN environments.           
                                                                              
oEDS effectively supports the reintegration of deployed resources into the 
NMCI      environment.                                                     
                                                                              
oEDS provides "Pack up Kits" with appropriate content for supporting       
resources      while deployed.                                             
                                                                              
Emergent requirement support (support for unplanned events)                
                                                                              
oEDS effectively responds to emergent requirements.                        
                                                                              
oEDS provides flexible and responsive support.                             
                                                                              
oEDS is innovative in developing solutions to support emergent             
requirements.                                                              
                                                                              
2. Please rate your overall satisfaction with warfighter support services. 
                                                                              
3. Comments and feedback                                                   
                                                                              
oWhat improvement would most increase your satisfaction?                   
                                                                              
oIf your satisfaction with this service has changed during the past 3      
months, what is      the primary reason for the change?                    
                                Cutover services                              
1. Please evaluate EDS performance for the following cutover services:     
                                                                              
Cutover planning                                                           
                                                                              
oEDS incorporates lessons learned into its cutover planning processes.     
                                                                              
oEDS cutover planning considers requirements unique to specific sites and  
        organizations.                                                        
                                                                              
oEDS accurately identifies infrastructure "build out" requirements.        
                                                                              
Cutover preparation                                                        
                                                                              
oEDS correctly captures site/organization data in support of NMCI asset.   
                                                                              
oEDS coordinates with the designated points of contact prior to asset      
cutover.                                                                   
                                                                              
oEDS infrastructure "build outs" are completed correctly and in            
coordination with the      government designated points of contact.        
                                                                              
Cutover execution                                                          
                                                                              
oEDS delivers complete and accurate services as ordered.                   
                                                                              
oEDS delivers according to agreed upon schedules.                          
                                                                              
oEDS fulfills its "Execution Discipline" obligations.
                                                                              
oEDS effectively deploys specialized assets (classified, deployable, very  
small site      design, etc.).                                             
                                                                              
2. Please rate your overall satisfaction with cutover services.            
                                                                              
3. Comments and feedback                                                   
                                                                              
oWhat improvement would most increase your satisfaction?                   
                                                                              
oIf your satisfaction with this service has changed during the past 3      
months, what is      the primary reason for the change?                    
                              Technology solutions                            
1. Please evaluate EDS performance for the following technology services:  
                                                                              
New service order and delivery process                                     
                                                                              
oEDS makes available new services in a timely manner once they have been   
        added to the contract and approved for operation.                     
                                                                              
oEDS delivers ordered services in a timely fashion.                        
                                                                              
oEDS delivers accurately against submitted task orders.                    
                                                                              
Technical performance                                                      
                                                                              
oEDS NMCI service (hardware/software, help desk, on-site support, and      
        connectivity) are available when and where needed.                    
                                                                              
oEDS provides accurate and dependable technical services.                  
                                                                              
oEDS provides quality technical services.                                  
                                                                              
oEDS technical services are flexible enough to support dynamic             
organizational      needs.                                                 
                                                                              
2. Please rate your overall satisfaction with technical solutions.         
                                                                              
3. Comments and feedback                                                   
                                                                              
oWhat improvement would most increase your satisfaction?                   
                                                                              
oIf your satisfaction with this service has changed during the past 3      
months, what is       the primary reason for the change?                   
                                Service delivery                              
1. Please evaluate EDS performance in the following service delivery       
areas:                                                                     
                                                                              
Organizational understanding                                               
                                                                              
oEDS understands my command's mission requirements.                        
                                                                              
oEDS understands my command's operational processes.                       
                                                                              
oEDS understands my organizational structure and hierarchy.                
                                                                              
Customer service                                                           
                                                                              
oEDS NMCI Help Desk Support (1.866.THE.NMCI) is consistent and effective.  
                                                                              
oEDS NMCI on-site technical support is consistent and effective.           
                                                                              
oEDS communicates relevant information to command personnel in a timely    
       fashion.                                                               
                                                                              
oEDS supports individual command requirements.                             
                                                                              
Issue management                                                           
                                                                              
oEDS coordinates with the appropriate government personnel and             
representatives.                                                           
                                                                              
oEDS responds to issues in a timely manner.                                
                                                                              
oEDS resolves issues timely and effectively.                               
                                                                              
oEDS develops solutions that are transferable throughout the enterprise.   
                                                                              
oEDS appropriately considers command and Department of Navy needs as part  
of     issue prioritization.                                               
                                                                              
oEDS accurately tracks and provides insight into identified issues.        
                                                                              
2. Please rate your overall satisfaction with service delivery.            
                                                                              
3. Comments and feedback                                                   
                                                                              
oWhat improvement would most increase your satisfaction?                   
                                                                              
oIf your satisfaction with this service has changed during the past 3      
months, what is     the primary reason for the change?                     

Source: Navy Echelon II Commands and Marine Corps Command Commanders
Customer Satisfaction Incentive Survey: Period of Performance April 1,
2005, through September 30, 2005.

Navy and Marine Corps Network Operations Leaders' Customer Satisfaction
Incentive Survey

The network operations leaders' customer satisfaction incentive survey
consists of three topics (mission support and planning, network
management, and service delivery) corresponding to key mission and/or
business objective-related services or capabilities. Each topic is broken
down into a number of subtopics. Under each subtopic, the survey asks the
leaders to indicate whether they agree, disagree, or have no basis to
respond to a series of statements about EDS's performance. The survey also
asks the leaders to rate their overall satisfaction with each topic as
"extremely satisfied," "mostly satisfied," "slightly satisfied," "not
satisfied," or "have no basis to respond." The last section of each topic
contains two open-ended questions soliciting feedback on satisfaction with
NMCI services.

Table 11 is an abbreviated version of the network operations leader's
survey that includes each of the subtopics, statements about EDS's
performance, the overall topic satisfaction question, and the two
open-ended questions.^3

Table 11: Navy and Marine Corps Network Operations Leader's Customer
Satisfaction Incentive Survey's Questions

                                        

                          Mission support and planning                        
1. Please evaluate EDS's performance for the following mission support and 
planning services:                                                         
                                                                              
Interoperability support                                                   
                                                                              
oEDS adequately supports internal (Navy/Marine Corps) interoperability.    
                                                                              
oEDS adequately supports external (.mil, .com, Joint, coalition)           
interoperability.                                                          
                                                                              
oEDS provides adequate reach back capabilities to legacy                   
systems/applications.                                                      
                                                                              
oEDS correctly identifies and is able to resolve interoperability issues.  
                                                                              
Continuity of operations                                                   
                                                                              
oEDS is knowledgeable concerning continuity of operations plans.           
                                                                              
oEDS demonstrates the effectiveness of its continuity of operations plans. 
                                                                              
oEDS can effectively recover NMCI systems and data in the event of a       
disaster.                                                                  
                                                                              
oEDS effectively utilizes and supports the NMCI Military Detachment        
Training     Program.                                                      
                                                                              
Future readiness                                                           
                                                                              
oEDS solutions are scalable.                                               
                                                                              
oEDS is flexible in planning for future scenarios.                         
                                                                              
oEDS is innovative in developing solutions to combat emerging IT threats   
to NMCI      operations.                                                   
                                                                              
Public Key Infrastructure (PKI) services                                   
                                                                              
oEDS has developed an efficient and effective PKI solution.                
                                                                              
oEDS provides PKI services that are readily available and reliable.        
                                                                              
oEDS provides PKI services that are easy to understand and use.
                                                                              
oEDS provides adequate PKI related training and other instructional        
       documentation.                                                         
                                                                              
2. Please rate your overall satisfaction with mission support and planning 
services.                                                                  
                                                                              
3. Comments and feedback                                                   
                                                                              
oWhat improvement would most increase your satisfaction?                   
                                                                              
oIf your satisfaction with this service has changed during the past 3      
months, what      is the primary reason for the change?                    
                               Network management                             
1. Please evaluate EDS performance for the following network management    
services:                                                                  
                                                                              
Network status information                                                 
                                                                              
oEDS provides sufficient availability to network performance data.         
                                                                              
oEDS provides sufficiently detailed visibility into network performance    
issues.                                                                    
                                                                              
oEDS provides network performance data that adequately represents live     
network      operations.                                                   
                                                                              
Information Operations Condition (INFOCON)/Information Assurance           
Vulnerability Alert (IAVA) Awareness and Compliance                        
                                                                              
oEDS implements IAVA's in a timely manner.                                 
                                                                              
oEDS understands the requirements associated with each INFOCON level.      
                                                                              
oEDS adjusts to INFOCON changes in a timely manner.                        
                                                                              
Urgent software patch implementation                                       
                                                                              
oEDS efficiently and effectively supports the processes required to get    
urgent     software patches approved so that they can be deployed onto the 
network.                                                                   
                                                                              
oEDS maintains a current knowledge of software patch availability and      
deployment     processes.                                                  
                                                                              
oEDS maintains accurate configuration management of software patch         
deployment      throughout the enterprise.                                 
                                                                              
oEDS provides timely responses to urgent software patch releases.          
                                                                              
Data management                                                            
                                                                              
oEDS effectively manages user account data.                                
                                                                              
oEDS effectively manages systems log data.                                 
                                                                              
oEDS effectively manages system permissions and trust relationships.       
                                                                              
oEDS effectively manages system and user backup data.                      
                                                                              
oEDS effectively manages network architecture diagrams.                    
                                                                              
2. Please rate your overall satisfaction with network management services. 
                                                                              
3. Comments and feedback                                                   
                                                                              
oWhat improvement would most increase your satisfaction?                   
                                                                              
oIf your satisfaction with this service has changed during the past 3      
months, what is     the primary reason for the change?                     
                                Service delivery
1. Please evaluate EDS performance for the following service delivery      
areas:                                                                     
                                                                              
Organizational understanding                                               
                                                                              
oEDS understands organizational mission requirements.                      
                                                                              
oEDS understands organizational policies and operational procedures.       
                                                                              
oEDS understands your organizational structure.                            
                                                                              
Communications                                                             
                                                                              
oEDS effectively communicates with the right people.                       
                                                                              
oEDS effectively communicates planned maintenance and network outages.     
                                                                              
oEDS effectively communicates changes in NMCI configurations.              
                                                                              
oEDS coordinates with the appropriate parties when planning network events 
that     significantly impact the network.                                 
                                                                              
Issue management                                                           
                                                                              
oEDS provides sufficient visibility into the status of open issues.        
                                                                              
oEDS appropriately coordinates issue resolution efforts with network       
operators.                                                                 
                                                                              
oEDS independently identifies and reports to the government network        
related     issues.                                                        
                                                                              
oEDS appropriately considers command and Navy needs in issue               
prioritization.                                                            
                                                                              
oEDS applies lessons learned in order to resolve related issues across the 
NMCI     enterprise.                                                       
                                                                              
2. Please rate your overall satisfaction with EDS service delivery.        
                                                                              
3. Comments and feedback                                                   
                                                                              
oWhat improvement would most increase your satisfaction?                   
                                                                              
oIf your satisfaction with this service has changed during the past 3      
months, what is      the primary reason for the change?                    

Source: Navy and Marine Corps Network Operations Leaders Customer
Satisfaction Incentive Survey: Period of Performance April 1, 2005,
through September 30, 2005.

Appendix III
SLA Descriptions and Performance

This appendix contains descriptions and performance trends for NMCI's
service level agreements. SLAs are measured at the site level,
enterprisewide, or both. Site level SLA performance is based on the
percentage of operational seats that met the SLA, meaning that all of the
SLA's performance targets were met for a particular month. Where
applicable, the percentage of seats meeting an SLA was analyzed by seat
type (i.e., basic, high end, and mission-critical).

Enterprisewide SLA performance is based on whether the SLA was met for a
given month, meaning that all of the SLA's performance targets were met
that month.

SLA 101-End user problem resolution:  This SLA measures the percentage of
all resolved NMCI problems against identified performance target values.
Figure 14 portrays the contractor's historical site level performance with
SLA 101.

Figure 14: Site Level Performance for SLA 101

SLA 102-Network problem resolution: This SLA measures the resolution of
problems associated with the contractor provided network devices and
connections. Figure 15 portrays the contractor's historical site level
performance with SLA 102.

Figure 15: Site Level Performance for SLA 102

SLA 103-End user services: This SLA measures performance with end user
services, including E-mail, Web and Portal, File Share, Print, Network
Logon, Access to Government Applications, and RAS services. Figure 16
portrays the contractor's historical site level performance with SLA 103.
Figure 17 portrays the contractor's historical enterprisewide performance
with SLA 103.

Figure 16: Site Level Performance for SLA 103

Figure 17: Enterprisewide Performance for SLA 103

SLA 104-Help desk: This SLA measures help desk services, including average
speed of answer, average speed of response, call abandonment rate, and
first call resolution. Figure 18 portrays the contractor's historical
enterprisewide performance with SLA 104.

Figure 18: Enterprisewide Performance for SLA 104

SLA 105-Move, add, change (MAC): This SLA measures the time to complete
MAC activity, from the receipt of the MAC request from an authorized
government submitter to the completion of the MAC activity. MACs include
activities such as moving a seat from one location to another and adding
seats at a location. Figure 19 portrays the contractor's historical site
level performance with SLA 105.

Figure 19: Site Level Performance for SLA 105

SLA 106-Information assurance (IA) services: This SLA measures the
contractor's IA services, including security event detection, security
event reporting, security event response, and IA configuration management.
Figure 20 portrays the contractor's historical enterprisewide performance
with SLA 106.

Figure 20: Enterprisewide Performance for SLA 106

SLA 107-NMCI intranet: This SLA measures performance of the NMCI Intranet
in areas of availability, latency/packet loss,^1 and quality of service in
support of videoteleconferencing and voice-over-IP. Figure 21 portrays the
contractor's historical site level performance with SLA 107.

Figure 21: Site Level Performance for SLA 107

SLA 203-E-mail services: This SLA measures the performance of e-mail
transfers. Figure 22 portrays the contractor's historical enterprisewide
performance with SLA 203.

Figure 22: Enterprisewide Performance for SLA 203

SLA 204-Directory services: This SLA measures the availability and
responsiveness of directory services. Directory services include
supporting the management and use of file services, security services,
messaging, and directory information (e.g., e-mail addresses) for users.
Figure 23 portrays the contractor's historical site level performance with
SLA 204. Figure 24 portrays the contractor's enterprisewide performance
with SLA 204.

Figure 23: Site Level Performance for SLA 204

Figure 24: Enterprisewide Performance for SLA 204

SLA 206-Web access services: This SLA measures the performance of user
access to internal and external Web content. Figure 25 portrays the
contractor's historical site level performance with SLA 206. Figure 26
portrays the contractor's historical enterprisewide performance with SLA
206.

Figure 25: Site Level Performance for SLA 206

Figure 26: Enterprisewide Performance for SLA 206

SLA 211-Unclassified but Sensitive Internet Protocol Router Network
(NIPRNET) access: This SLA measures the performance of NIPRNET access,
including latency and packet loss. Figure 27 portrays the contractor's
historical site level performance with SLA 211. Figure 28 portrays the
contractor's historical enterprisewide performance with SLA 211.

Figure 27: Site Level Performance for SLA 211

Figure 28: Enterprisewide Performance for SLA 211

SLA 225-Base area network/local area network (BAN/LAN) communications
services: This SLA measures BAN/LAN performance, including availability
and latency. Figure 29 portrays the contractor's historical site level
performance with SLA 225.

Figure 29: Site Level Performance for SLA 225

SLA 226-Proxy and caching service: This SLA measures the availability of
the proxy and caching services. Proxy servers are located between a client
and a network server and are intended to improve network performance by
fulfilling small requests. Figure 30 portrays the contractor's historical
enterprisewide performance with SLA 226.

Figure 30: Enterprisewide Performance for SLA 226

SLA 231-System service-domain name server: This SLA measures the
availability and latency of Domain Name Server services. The Domain Name
Server translates domain names to IP addresses and vice versa. Figure 31
portrays the contractor's historical site level performance with SLA 231.
Figure 32 portrays the contractor's historical enterprisewide performance
with SLA 231.

Figure 31: Site Level Performance for SLA 231

Figure 32: Enterprisewide Performance for SLA 231

SLA 324-Wide area network connectivity: This SLA measures the percent of
bandwidth used to provide connection to external networks. Figure 33
portrays the contractor's historical site level performance with SLA 324.

Figure 33: Site Level Performance for SLA 324

SLA 325-BAN/LAN communication services: This SLA measures the percent of
bandwidth utilized on shared network segments. Figure 34 portrays the
contractor's historical site level performance for SLA 325.

Figure 34: Site Level Performance for SLA 325

SLA 328-Network management service-asset management: This SLA measures the
time it takes to implement new assets, such as seats and application
servers. Figure 35 portrays the contractor's historical site level
performance with SLA 328.

Figure 35: Site Level Performance for SLA 328

SLA 329-Operational support services: This SLA measures the effectiveness
of NMCI's disaster recovery plan. Figure 36 portrays the contractor's
historical enterprisewide performance with SLA 329.

Figure 36: Enterprisewide Performance for SLA 329

SLA 332-Application server connectivity: This SLA measures both the time
it takes for the contractor to implement the connectivity between the
network backbone and an application server and the percentage of available
bandwidth from an application server to the local supporting backbone.
Figure 37 portrays the contractor's historical site level performance with
SLA 332.

Figure 37: Site Level Performance for SLA 332

SLA 333-NMCI security operational services-general: This SLA measures the
percentage of successful accreditations on the first attempt, based on
compliance with DOD certification and accreditation policies and
procedures. Figure 38 portrays the contractor's historical enterprisewide
performance with SLA 333.

Figure 38: Enterprisewide Performance for SLA 333

SLA 334-Information assurance operational service-PKI: This SLA measures
the timeliness of revoking a PKI certificate when required, the ability
of an NMCI user to obtain the DOD PKI certificate of another NMCI user,
and the time it takes to register a user for DOD PKI within NMCI. Figure 39
portrays the contractor's historical enterprisewide performance with SLA
334.

Figure 39: Enterprisewide Performance for SLA 334

SLA 336-Information assurance planning services: This SLA measures the
time it takes to distribute new or revised security products (hardware and
software). Figure 40 portrays the contractor's historical enterprisewide
performance with SLA 336.

Figure 40: Enterprisewide Performance for SLA 336

Appendix IV
Comments from the Department of Defense

Appendix V
GAO Contact and Staff Acknowledgments

GAO Contact

Randolph C. Hite, 202-512-3439, [email protected]

Staff Acknowledgments

In addition to the individual named above, Mark Bird, Assistant Director;
Scott Borre; Timothy Case; Barbara Collier; Vijay D'Souza; Neil Doherty;
Jim Fields; Mike Gilmore; Peggy Hegg; Wilfred Holloway; George Kovachick;
Frank Maguire; Charles Roney; Sidney Schwartz; Karl Seifert; Glenn
Spiegel; Dr. Rona Stillman; Amos Tevelow; and Eric Winter made key
contributions to this report.

(310602)

www.gao.gov/cgi-bin/getrpt?GAO-07-51.

To view the full product, including the scope 
and methodology, click on the link above.

For more information, contact Randolph C. Hite at 202-512-3439 or
[email protected].

Highlights of [55]GAO-07-51 , a report to congressional addressees

December 2006

INFORMATION TECHNOLOGY

DOD Needs to Ensure That Navy Marine Corps Intranet Program Is Meeting
Goals and Satisfying Customers

The Navy Marine Corps Intranet (NMCI) is a 10-year, $9.3 billion
information technology services program. Through a performance-based
contract, the Navy is buying network (intranet), application, and other
hardware and software services at a fixed price per unit (or "seat") to
support about 550 sites. GAO prepared this report under the Comptroller
General's authority as part of a continued effort to assist Congress and
reviewed (1) whether the program is meeting its strategic  goals, (2) the
extent to which the contractor is meeting service level agreements, (3)
whether customers are satisfied with the program, and (4) what is being
done to improve customer satisfaction. To accomplish this, GAO reviewed
key program and contract performance management-related plans, measures,
and data and interviewed NMCI program and contractor officials, as well as
NMCI customers at shipyards and air depots.

What GAO Recommends

GAO is making recommendations to the Secretary of Defense aimed at
implementing effective program performance management, expanding
measurement and understanding of service level agreement performance,
effectively managing customer satisfaction improvement efforts, and
deciding whether overall performance to date warrants program changes. In
commenting on a draft of this report, DOD agreed with GAO's
recommendations.

NMCI has not met its two strategic goals--to provide information
superiority and to foster innovation via interoperability and shared
services. The Navy developed a performance plan in 2000 to measure and
report progress toward these goals, but it did not implement the plan
because the program was more focused on deploying seats and measuring
contractor performance against contractually specified incentives than on
determining whether the strategic mission outcomes used to justify the
program were being met. GAO's analysis of available performance data,
however, showed that the Navy had met only 3 of 20 performance targets (15
percent) associated with the program's goals and nine related performance
categories. By not implementing its performance plan, the Navy has
invested heavily, and risks continuing to invest heavily, in a program
that is not subject to effective performance management and has yet to
produce expected results.

GAO's analysis also showed that the contractor's satisfaction of NMCI
service level agreements (contractually specified performance
expectations) has been mixed. Since September 2004, a significant
percentage of agreements have been met for all types of seats, but others
have not consistently been met, and still others have generally not been
met. Navy measurement of agreement satisfaction shows that the performance
needed to receive contractual incentive payments was attained for about 55
to 59 percent of all eligible seats during the most recent 5-month period,
a significant drop from the previous 9-month period. GAO's analysis and
the Navy's measurement of agreement satisfaction illustrate the need for
effective performance management, including examining agreement
satisfaction from multiple perspectives to target needed corrective
actions and program changes.
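
For illustration only, the following minimal Python sketch shows the kind
of arithmetic behind a "percentage of eligible seats" figure. The monthly
seat counts are hypothetical assumptions; the 55 to 59 percent range cited
above comes from the Navy's own measurements, not from this calculation.

    # Illustrative sketch: share of eligible seats whose measured
    # performance reached an assumed incentive threshold each month.
    monthly_results = {
        # month: (seats attaining incentive thresholds, eligible seats)
        "2006-02": (168000, 300000),
        "2006-03": (171000, 301000),
        "2006-04": (176000, 299000),
    }

    for month, (attained, eligible) in sorted(monthly_results.items()):
        pct = 100.0 * attained / eligible
        print(f"{month}: thresholds attained for {pct:.1f}% of eligible seats")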

GAO analysis further showed that NMCI's three customer groups (end users,
commanders, and network operators) vary in their satisfaction with the
program. More specifically, end user satisfaction surveys indicated that
the percentage of end users who met the Navy's definition of a satisfied
user has remained consistently below the target of 85 percent (the latest
survey results categorize 74 percent as satisfied). Given that the Navy's
definition of the term "satisfied" includes many marginally satisfied and
arguably somewhat dissatisfied users, this percentage represents a
best-case depiction of end user satisfaction. Survey responses from the
other two customer groups show that neither was satisfied. GAO interviews
with customers at shipyards and air depots also revealed dissatisfaction
with NMCI. Without satisfied customers, the Navy will be challenged in
meeting program goals.
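
For illustration only, the following minimal Python sketch shows why the
cutoff used to define a "satisfied" respondent matters. The 1-to-7 scale,
the cutoffs, and the response counts are hypothetical assumptions, not the
Navy's survey instrument or data.

    # Illustrative sketch: reported satisfaction depends on where the
    # "satisfied" cutoff is drawn on the survey scale.
    responses = {1: 40, 2: 60, 3: 110, 4: 190, 5: 250, 6: 230, 7: 120}
    total = sum(responses.values())

    def satisfied_pct(cutoff):
        """Percent of respondents scoring at or above the cutoff."""
        above = sum(count for score, count in responses.items() if score >= cutoff)
        return 100.0 * above / total

    print(f"Cutoff at 4 (counts marginal responses): {satisfied_pct(4):.0f}%")
    print(f"Cutoff at 5 (excludes marginal responses): {satisfied_pct(5):.0f}%")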

To improve customer satisfaction, the Navy identified various initiatives
that it described as completed, under way, or planned. However, the
initiatives are not being guided by a documented plan(s), thus limiting
their potential effectiveness. This means that after investing about 6
years and $3.7 billion, NMCI has yet to meet expectations, and whether it
will is still unclear.

GAO's Mission

The Government Accountability Office, the audit, evaluation and
investigative arm of Congress, exists to support Congress in meeting its
constitutional responsibilities and to help improve the performance and
accountability of the federal government for the American people. GAO
examines the use of public funds; evaluates federal programs and policies;
and provides analyses, recommendations, and other assistance to help
Congress make informed oversight, policy, and funding decisions. GAO's
commitment to good government is reflected in its core values of
accountability, integrity, and reliability.

Obtaining Copies of GAO Reports and Testimony

The fastest and easiest way to obtain copies of GAO documents at no cost
is through GAO's Web site ( www.gao.gov ). Each weekday, GAO posts
newly released reports, testimony, and correspondence on its Web site. To
have GAO e-mail you a list of newly posted products every afternoon, go to
www.gao.gov and select "Subscribe to Updates."

Order by Mail or Phone

The first copy of each printed report is free. Additional copies are $2
each. A check or money order should be made out to the Superintendent of
Documents. GAO also accepts VISA and Mastercard. Orders for 100 or more
copies mailed to a single address are discounted 25 percent. Orders should
be sent to:

U.S. Government Accountability Office, 441 G Street NW, Room LM,
Washington, D.C. 20548

To order by phone: Voice: (202) 512-6000; TDD: (202) 512-2537;
Fax: (202) 512-6061

To Report Fraud, Waste, and Abuse in Federal Programs

Contact:

Web site: www.gao.gov/fraudnet/fraudnet.htm
E-mail: [email protected]
Automated answering system: (800) 424-5454 or (202) 512-7470

Congressional Relations

Gloria Jarmon, Managing Director, [email protected] (202) 512-4400 U.S.
Government Accountability Office, 441 G Street NW, Room 7125 Washington,
D.C. 20548

Public Affairs

Paul Anderson, Managing Director, [email protected] (202) 512-4800
U.S. Government Accountability Office, 441 G Street NW, Room 7149
Washington, D.C. 20548

*** End of document. ***