Defense Computers: DOD Y2K Functional End-to-End Testing Progress and
Test Event Management (Letter Report, 10/18/1999, GAO/AIMD-00-12).

Pursuant to a congressional request, GAO provided information on the
effectiveness of the Department of Defense's efforts to perform year
2000-related end-to-end tests for its major business functions.

GAO noted that: (1) because year 2000 conversions often involve numerous
large interconnecting systems with many external interfaces and
extensive supporting technology infrastructures, year 2000 testing
should be approached in a structured and disciplined fashion; (2) GAO's
year 2000 guidance recommends that in planning and managing end-to-end
tests, agencies define test boundaries, secure the commitment of data
exchange partners, prepare test procedures and data, define exit
criteria and document test results, among other steps; (3) each of the
individual test events GAO attended and reviewed within the four
functional areas generally satisfied the key processes that GAO's year
2000 test guide defines as necessary to effectively plan, conduct, and
report on end-to-end testing; (4) moreover, while the events' respective
approaches to implementing the key processes varied, these differences
were appropriately based on consideration of the events' scope and
complexity; (5) in addition, overall end-to-end test efforts within
three of the four functional areas were reported to be largely on
schedule and expected to be completed by October 1999; (6) however, at
the time GAO briefed the Communications functional area on the results
of GAO's review, it could not provide complete progress information; and
(7) while information was subsequently provided by Communications, it
showed that the functional area had not yet developed plans to test 31
mission-critical systems.

--------------------------- Indexing Terms -----------------------------

 REPORTNUM:  AIMD-00-12
     TITLE:  Defense Computers: DOD Y2K Functional End-to-End Testing
	     Progress and Test Event Management
      DATE:  10/18/1999
   SUBJECT:  Y2K
	     Computer software verification and validation
	     Systems conversions
	     Data integrity
	     Strategic information systems planning
	     Information resources management
	     Systems compatibility
IDENTIFIER:  Y2K

******************************************************************
** This file contains an ASCII representation of the text of a  **
** GAO report.  Delineations within the text indicating chapter **
** titles, headings, and bullets are preserved.  Major          **
** divisions and subdivisions of the text, such as Chapters,    **
** Sections, and Appendixes, are identified by double and       **
** single lines.  The numbers on the right end of these lines   **
** indicate the position of each of the subsections in the      **
** document outline.  These numbers do NOT correspond with the  **
** page numbers of the printed product.                         **
**                                                              **
** No attempt has been made to display graphic images, although **
** figure captions are reproduced.  Tables are included, but    **
** may not resemble those in the printed version.               **
**                                                              **
** Please see the PDF (Portable Document Format) file, when     **
** available, for a complete electronic file of the printed     **
** document's contents.                                         **
**                                                              **
** A printed copy of this report may be obtained from the GAO   **
** Document Distribution Center.  For further details, please   **
** send an e-mail message to:                                   **
**                                                              **
**                                            **
**                                                              **
** with the message 'info' in the body.                         **
******************************************************************

Report to the Chairman of the Subcommittee on Defense, Committee on
Appropriations, House of Representatives

October 1999

DEFENSE COMPUTERS

DOD Y2K Functional End-to-End Testing Progress and Test Event
Management

GAO/AIMD-00-12

Letter

Appendixes

Appendix I:   Briefing on DOD Y2K Functional End-to-End Testing:
              Progress and Test Event Management

Appendix II:  Objectives, Scope, and Methodology

Appendix III: Comments From the Department of Defense

Appendix IV:  GAO Contact and Staff Acknowledgements

Table 1:  Summary of Recommended End-to-End Test Management Processes

Table 2:  Summary of the Four Functional Area Test Approaches

Table 3:  Summary of Test Events Satisfying GAO Key Processes

DISA    Defense Information Systems Agency

DOD     Department of Defense

JUSE    Joint User Switch Exercise

OMB     Office of Management and Budget

OSD     Office of the Secretary of Defense

PSA     Principal Staff Assistant

Y2K     Year 2000

                                                 Accounting and Information
                                                        Management Division

B-283564

October 18, 1999

The Honorable Jerry Lewis
Chairman, Subcommittee on Defense
Committee on Appropriations
House of Representatives

Dear Mr. Chairman:

Complete and thorough Year 2000 (Y2K) testing is essential to provide
reasonable assurance that new or modified systems process dates correctly
and will not jeopardize an organization's ability to perform core business
operations after the millennium. This is especially true for the
Department of Defense (DOD), which relies on a complex and broad array of
interconnected computer systems--including weapons, command and control,
satellite, inventory management, transportation management, health,
financial, personnel, and payment systems--to carry out its core business
functions and military operations. 
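The underlying risk this testing targets is the classic two-digit-year defect: systems that stored years as "85" or "99" could miscompute date arithmetic at the century rollover. The report itself contains no code; the sketch below is a hypothetical illustration of the defect and of the common "windowing" remediation, with all function and parameter names invented for the example.

```python
def years_of_service(start_yy, current_yy):
    # Two-digit-year arithmetic -- the classic Y2K defect. A record
    # entered in 1985 ("85") and read in 2000 ("00") yields 0 - 85 = -85
    # instead of 15.
    return current_yy - start_yy

def years_of_service_windowed(start_yy, current_yy, pivot=50):
    # A common remediation: "windowing" maps two-digit years onto a
    # fixed 100-year span (here 1950-2049) before subtracting.
    def expand(yy):
        return 1900 + yy if yy >= pivot else 2000 + yy
    return expand(current_yy) - expand(start_yy)

print(years_of_service(85, 0))           # -85 (wrong)
print(years_of_service_windowed(85, 0))  # 15 (correct)
```

Because such defects surface only when interfacing systems exchange and interpret date fields, they can pass unit-level tests yet still fail across system boundaries, which is why end-to-end testing of whole business functions matters.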

At your request, we initiated a review of the effectiveness of DOD's
efforts to perform Year 2000-related end-to-end tests for its major
business functions, including Health Affairs, Communications, Personnel,
and Logistics. Together, these functional areas are performing thousands
of end-to-end tests to ensure that key business processes and systems can
continue operating into the year 2000. Specifically, for each functional
area, we analyzed reported information on the status and progress of all
test events. We also selected and reviewed a critical test event in each
functional area to determine whether it was planned and managed in
accordance with our Year 2000 testing guide./Footnote1/ On September 14,
1999, we briefed you on the results of our review. This report provides a
summary of our briefing and a recommendation to Defense for strengthening
oversight of end-to-end testing for the Communications functional area.
Subsequent to our briefing, Logistics officials submitted additional
information on the inclusion of installation telecommunications providers
in related test events. We have clarified the briefing slides to reflect
this. These clarifications, however, do not affect our overall conclusions
and recommendation. The briefing slides are presented in appendix I, and
our objectives, scope, and methodology are in appendix II. The Office of
the Assistant Secretary of Defense provided written comments on a draft of
this report. These comments are discussed at the end of this report and
reprinted in appendix III. We performed our audit work from March through
September 1999 in accordance with generally accepted government auditing
standards.

Results in Brief

Because Year 2000 conversions often involve numerous large interconnecting
systems with many external interfaces and extensive supporting technology
infrastructures, Year 2000 testing should be approached in a structured
and disciplined fashion. Our Year 2000 guidance recommends that in
planning and managing end-to-end tests, agencies define test boundaries,
secure the commitment of data exchange partners, prepare test procedures
and data, define exit criteria,/Footnote2/ and document test results,
among other steps. Each of the individual test events we attended and
reviewed within the four functional areas generally satisfied the key
processes that our Year 2000 test guide defines as necessary to
effectively plan, conduct, and report on end-to-end testing./Footnote3/
Moreover, while the events' respective approaches to implementing the key
processes varied, these differences were appropriately based on
consideration of the events' scope and complexity. 

In addition, overall end-to-end test efforts within three of the four
functional areas were reported to be largely on schedule and expected to
be completed by October 1999. However, at the time we briefed the
Communications functional area on the results of our review, it could not
provide complete progress information. While information was subsequently
provided by Communications, it showed that the functional area had not yet
developed plans to test 31 mission-critical systems. We are making a
recommendation to Defense to ensure that these systems are tested or that
there is adequate justification for their exclusion from end-to-end test
events. While Defense only partially concurred with this recommendation,
it provided information showing the status of the systems in question. We
did not verify this information.

Background

In August 1998, the Deputy Secretary of Defense recognized the need to
ensure that various key lines of business or functional areas within the
department could continue to operate effectively at and after the turn of
the century. Therefore, the Deputy Secretary directed Office of the
Secretary of Defense focal points, known as Principal Staff Assistants
(PSAs), to verify that all functions would be unaffected by Year 2000
issues. In doing so, the PSAs were to (1) document mission-critical
functions and systems supporting those functions, (2) coordinate,
facilitate, and monitor Year 2000 end-to-end test and evaluation
activities of services, agencies, and commands, and (3) in some cases,
conduct Y2K end-to-end functional testing.

The purpose of end-to-end testing is to verify that a defined set of
interrelated systems, which collectively support an organizational core
business area or function, interoperate as intended in an operational
environment (either actual/Footnote4/ or simulated). These interrelated
systems include not only those owned and managed by the organization, but
also the external systems with which they interface or that otherwise
support the core business area or function.

The boundaries for end-to-end tests can vary depending on a given business
function's system dependencies and criticality to the organizational
mission. Therefore, in managing end-to-end test activities, it is
important to analyze the interrelationships among core business functions
and their supporting systems, and the mission impact and risk of date-
induced systems failures and to use these analyses to define test
boundaries. It is also important to work early and continually with
functional partners to ensure that related end-to-end test activities are
effectively coordinated and integrated. As highlighted in table 1, our
Year 2000 test guide, which has been adopted by the Office of Management
and Budget (OMB), recommends that federal agencies take the following
actions in planning and managing end-to-end tests.

Table 1:    Summary of Recommended End-to-End Test
            Management Processes

-----------------------------------------------------------------------
| Define the system       : Agencies should define boundaries for     |
| boundaries of the end-  : the end-to-end test based on an           |
| to-end test(s)          : assessment of their mission-critical      |
|                         : business functions, inter- and            |
|                         : intraorganization system dependencies,    |
|                         : as well as the probabilities and          |
|                         : impacts of any of these systems           |
|                         : suffering a date-related failure.         |
|---------------------------------------------------------------------|
| Secure the commitment   : Because end-to-end testing addresses      |
| of data exchange        : business areas or functions that          |
| partners                : involve multiple internal and external    |
|                         : organizations, participation by all key   |
|                         : data exchange partners should be          |
|                         : solicited and obtained.                   |
|---------------------------------------------------------------------|
| Establish an            : A team composed of representatives from   |
| interorganizational     : each of the organizations participating   |
| test team               : in the test should be formed to manage    |
|                         : the planning, execution, and reporting    |
|                         : of the test.                              |
|---------------------------------------------------------------------|
| Confirm Year 2000       : In order to execute end-to-end testing    |
| compliance of           : and ensure that all systems in the        |
| telecommunications      : chain of support to core business areas   |
| infrastructure          : function as intended, agencies should     |
|                         : ensure that the telecommunications        |
|                         : infrastructure that interconnects the     |
|                         : systems is compliant and ready for        |
|                         : testing.                                  |
|---------------------------------------------------------------------|
| Schedule and plan end-  : A plan should be developed specifying     |
| to-end test(s)          : key tasks and requirements for test       |
|                         : planning, execution, and validation, as   |
|                         : well as milestones and resources          |
|                         : associated with performing these tasks.   |
|---------------------------------------------------------------------|
| Prepare end-to-end      : Interorganizational test procedures and   |
| procedures and data     : data, including steps, cases, and input   |
|                         : conditions that verify the correct        |
|                         : handling of critical dates, should be     |
|                         : prepared and approved by team             |
|                         : representatives.                          |
|---------------------------------------------------------------------|
| Define end-to-end test  : The conditions or requirements for        |
| exit criteria           : successfully completing end-to-end        |
|                         : testing need to be established.           |
|---------------------------------------------------------------------|
| Execute end-to-end      : Tests should be executed in accordance    |
| test(s)                 : with established plans and procedures.    |
|---------------------------------------------------------------------|
| Document test results   : Test results should be documented so      |
|                         : that the data can be used to validate     |
|                         : that test exit criteria had been met      |
|                         : and to assess and correct problems        |
|                         : discovered during the testing.            |
|---------------------------------------------------------------------|
| Correct Year 2000       : On the basis of interorganization-        |
| defects                 : specified criteria, such as defect        |
|                         : severity and test exit criteria,          |
|                         : defects identified during the test        |
|                         : should be prioritized and corrected.      |
|---------------------------------------------------------------------|
| Ensure that end-to-end  : Test results should be compared to test   |
| test exit criteria are  : exit criteria to ensure that specified    |
| met                     : conditions are met.                       |
-----------------------------------------------------------------------
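The eleven processes above lend themselves to a simple per-event status tally, which is how Table 3 later summarizes our review of the four test events. The following sketch is purely illustrative: it is not DOD's or GAO's actual tooling, the event name is invented, and the process names are abbreviated from Table 1.

```python
from dataclasses import dataclass, field

# Abbreviated from Table 1; one entry per recommended management process.
PROCESSES = [
    "Define test boundaries",
    "Secure data exchange partners",
    "Establish interorganizational test team",
    "Confirm telecommunications compliance",
    "Schedule and plan tests",
    "Prepare test procedures and data",
    "Define exit criteria",
    "Execute tests",
    "Document test results",
    "Correct Year 2000 defects",
    "Ensure exit criteria are met",
]

@dataclass
class TestEvent:
    name: str
    status: dict = field(
        default_factory=lambda: {p: "not started" for p in PROCESSES})

    def summarize(self):
        # Tally processes by status, as in the report's Table 3.
        counts = {}
        for s in self.status.values():
            counts[s] = counts.get(s, 0) + 1
        return counts

event = TestEvent("hypothetical blood-request test event")
for p in PROCESSES[:8]:
    event.status[p] = "fully satisfied"
event.status["Prepare test procedures and data"] = "partially satisfied"
print(event.summarize())
# {'fully satisfied': 7, 'partially satisfied': 1, 'not started': 3}
```

Tracking each process explicitly, rather than a single pass/fail flag per event, is what lets an oversight body distinguish "partially satisfied" and "in progress" from outright gaps.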

The table below explains how the four functional areas included in our
review approached their end-to-end tests.

Table 2:    Summary of the Four Functional Area Test
            Approaches

-----------------------------------------------------------------------
| Function    : Description of decomposition                          |
|---------------------------------------------------------------------|
| Health      : Health Affairs divided its function into three        |
| Affairs     : business processes: patient care, patient             |
|             : administration, and medical logistics. Health         |
|             : Affairs then broke down each process into several     |
|             : sub-processes, termed "threads."                      |
|---------------------------------------------------------------------|
| Communica-  : Because Communications cuts across all                |
| tions       : functional/operational areas, Communications is       |
|             : testing based on system user. Therefore,              |
|             : Communications divided its function into 263          |
|             : mission-critical systems. Various military            |
|             : services, Defense agencies, and commanders-in-chief   |
|             : own these systems.                                    |
|---------------------------------------------------------------------|
| Personnel   : Personnel divided its function into six areas:        |
|             : Army, Navy, Air Force, Marine Corps, Civilian, and    |
|             : DEERS/RAPIDS (personnel systems). Personnel then      |
|             : broke down each area into sub-processes, termed       |
|             : "threads." Personnel is not conducting its own end-   |
|             : to-end tests. Instead, Personnel is participating     |
|             : in and observing service-level testing.               |
|---------------------------------------------------------------------|
| Logistics   : Logistics divided its function into four business     |
|             : processes: requisition, receipt, shipment, and        |
|             : inventory control and asset status. Logistics then    |
|             : broke down each process into several sub-processes,   |
|             : termed "threads." Logistics tested these four         |
|             : processes in two phases: intracomponent (within       |
|             : each military service or Defense agency) and          |
|             : intercomponent (joint testing with military           |
|             : services and Defense agencies).                       |
-----------------------------------------------------------------------

The test events we selected from each area to review ranged from a simple
test involving two information systems located within one organization to
an intricate test of DOD's voice and data telecommunications networks
involving several commands and multiple systems. Specifically, the Health
Affairs test event we reviewed assessed the ability of two interfacing
systems to issue and process blood requests after the calendar year
rollover. The Communications test event that we observed was a portion of
a larger test and assessed whether voice communications could be sent from
Fort Monmouth, New Jersey, to St. Louis, Missouri, using DOD
telecommunications networks and equipment and whether messages could be
exchanged using the Defense Messaging System from the Strategic Command to
the Atlantic Command. The Personnel test event assessed the Army's ability
to create active duty units for deployment from the Army Reserve and Army
National Guard. Finally, the Logistics event focused on intercomponent
testing--between the Army, Air Force, Navy, Marine Corps, and Defense
Logistics Agency--and was designed to verify the Year 2000 readiness of 17
of the 53 total logistics requisition and receipt processes.

End-to-End Tests Reported to Be on Schedule

Available information for the respective areas indicates that, as of
August 1999, end-to-end tests were largely on schedule and expected to be
completed by October 1999. In particular, 

o Health Affairs, which had three primary business processes, completed
  testing for two--patient care and patient administration business
  processes--and was on schedule to complete tests for the third--medical
  logistics--by the end of September 1999. 

o Personnel tests for the Army, Air Force, and Civilian areas had been
  completed, while the Navy tests were scheduled to be done October 17,
  1999. The Marine Corps was behind schedule on one test. However, it
  completed the test by September 9, 1999. 

o Logistics intra- and intercomponent tests, which involve four primary
  business processes--requisition, shipment, receipt, and inventory
  control and asset status--had been completed for intercomponent
  transactions. Tests were scheduled to be done by the end of August 1999
  and, according to Logistics officials, were completed on schedule.

o When we briefed the Communications functional area on the results of
  our review in July 1999, it was unable to provide progress information
  on all of its 263 mission-critical systems. Subsequently,
  Communications reported that 77 mission-critical systems had completed
  testing and 155 systems did not require testing./Footnote5/ The
  functional area also reported that the remaining 31 mission-critical
  systems/Footnote6/ did not yet have plans for testing and were
  considered to be behind schedule.

Selected End-to-End Test Events Were Managed According to GAO Guidance

We selected one test event from each functional area, determined whether
the key processes outlined in our Year 2000 testing guide were followed,
and found that DOD had completed the majority of the processes called for
in the guide. For example, for the four test events reviewed, DOD had
defined test boundaries, defined exit criteria that would be used to
determine when a test was successfully completed, and described how the
test results would be documented. While the events' respective approaches
to implementing the key processes varied, these differences were based on
consideration of the events' scope, complexity, and inherent business
risk. Our test guidance permits such differences when justified
on the basis of business value and risk.

Table 3 summarizes the results of our review. As the table notes, of the
possible 44 key processes spanning the 4 test events, 34 were fully
satisfied while another 2 were partially satisfied. For the remaining 8
key processes, 4 were still in progress, and 4 processes concerning
correcting defects found were "not applicable" because initial testing
results had not yet disclosed Year 2000 defects. However, some of the test
results that were obtained during our review were still being analyzed by
DOD. 

Table 3:    Summary of Test Events Satisfying GAO Key
            Processes

--------------------------------------------------------------------------
| Selected functional  :     Fully :  Partially :      In :  N/A : Total |
| area test event      : satisfied :  satisfied : progress:      :       |
|------------------------------------------------------------------------|
| Health Affairs       :         8 :          1 :       1 :    1 :   11  |
|------------------------------------------------------------------------|
| Communications       :         9 :          0 :       1 :    1 :   11  |
|------------------------------------------------------------------------|
| Personnel            :         9 :          0 :       1 :    1 :   11  |
|------------------------------------------------------------------------|
| Logistics            :         8 :          1 :       1 :    1 :   11  |
|------------------------------------------------------------------------|
| Total                :        34 :          2 :       4 :    4 :   44  |
--------------------------------------------------------------------------

Note: Due to differences in scope and complexity of the test events, the
results of individual functions are not comparable.

In all cases where we determined that the test events' key processes
called for in our guide had only been partially satisfied, the PSAs and
test managers agreed to address our concerns and initiate corrective
actions. For example:

o While Health Affairs prepared procedures for its test event, these
  procedures were not sufficiently detailed and did not define each step
  to be executed or precisely define input data. As a result, it was
  necessary for system operators to augment the test procedures during
  the test's execution. While this approach was satisfactorily carried
  out because the relative simplicity of the test event permitted face-to-
  face coordination and synchronization of the procedures, it was
  unnecessarily risky and could have been easily avoided by ensuring that
  test procedures were complete. Health Affairs officials agreed that
  more detailed procedures should have been established, and they
  committed to ensuring that other Health Affairs test events have them. 

o Although the Logistics function is reliant on telecommunications
  providers such as military installations and the Defense Information
  Systems Agency (DISA), at the time of the test event we observed,
  documentation offering assurances that installations'
  telecommunications infrastructures were Y2K compliant was not provided
  by Logistics functional managers. Our test guide states that, in order
  to ensure that all systems in the chain of support function as
  intended, the telecommunications infrastructure that interconnects the
  systems must be compliant and ready for testing. Subsequent to our
  review, Logistics officials provided information showing that
  installations' telecommunications infrastructures had been included in
  installation test events. Logistics officials agreed, however, that
  they had not yet confirmed the Y2K compliance of the infrastructures,
  and reported that they have subsequently initiated steps to do so.

Conclusions

Given that virtually all Defense business functions and military
operations rely heavily on technology, it is vital that Year 2000 end-to-
end testing efforts be effectively planned and executed. All four of the
individual test events that we reviewed were well-managed because each
either satisfied or had steps underway or planned to address all relevant
end-to-end management key processes specified in our test guide. Moreover,
differences between the functional areas' approaches to implementing these
key processes were generally commensurate with the events' scope and
complexity. Finally, reported functional area status information indicates
that end-to-end tests are generally progressing on schedule. However, DOD
does not yet have assurance that all of its communications systems will be
Year 2000 compliant and, as such, should ensure that all mission-critical
communications systems are tested.

Recommendation

We recommend that the Secretary of Defense direct the Senior Civilian
Official of the Office of the Assistant Secretary of Defense for Command,
Control, Communications, and Intelligence to report to the Deputy
Secretary immediately on plans for end-to-end testing the 31 mission-
critical communications systems, including milestones for executing tests
and reporting test results, or to otherwise justify in writing to the
Deputy Secretary why any of the systems will not be included in an end-to-
end test event. 

Agency Comments and Our Evaluation

DOD concurred with our findings and partially concurred with our
recommendation to report to the Deputy Secretary on the status and plans
for Y2K testing of the 31 mission-critical communications systems
disclosed in our report.

In partially concurring on the recommendation, DOD stated that during the
July through August 1999 period of our review, testing data in the OSD Y2K
database were still evolving and, as a result, test data were incomplete
for many of the 31 systems. Since then, resolution has been reached on the
testing status of the 31 communications systems. DOD reported and provided
documentation to show that (1) Y2K testing for 14 of the 31 systems has
been completed, (2) 9 systems do not process dates and are exempt from end-
to-end test requirements, (3) 4 systems are trusted systems, which cannot
be tested in a Y2K environment due to safety, security, or operational
necessity reasons, (4) 2 systems are developmental systems that will not
be deployed before the millennium rollover, (5) 1 system has been
reclassified as a nonmission-critical system and does not require
additional testing, and (6) 1 system is scheduled to complete testing by
October 15, 1999. We have not verified the status information provided by
DOD. 

We are sending copies of this report to Representative John P. Murtha,
Ranking Minority Member, Subcommittee on Defense, House Appropriations
Committee; Senator John Warner, Chairman, and Senator Carl Levin, Ranking
Minority Member, Senate Committee on Armed Services; Senator Ted Stevens,
Chairman, and Senator Daniel Inouye, Ranking Minority Member, Subcommittee
on Defense, Senate Committee on Appropriations; and Representative Floyd
Spence, Chairman, and Ike Skelton, Ranking Minority Member, House
Committee on Armed Services.

We are also sending copies to the Honorable John Koskinen, Chair of the
President's Year 2000 Conversion Council; the Honorable William Cohen,
Secretary of Defense; the Honorable John Hamre, Deputy Secretary of
Defense; General Henry Shelton, Chairman of the Joint Chiefs of Staff;
Arthur Money, Senior Civilian Official of the Office of the Assistant
Secretary of Defense for Command, Control, Communications, and
Intelligence; and the Honorable Jacob Lew, Director of the Office of
Management and Budget. Copies will also be made available to others upon
request.

Should you or your staff have any questions concerning this report, please
contact me at (202) 512-6240. I can also be reached by e-mail at
[email protected]. Other points of contact and key contributors to this
report are listed in appendix IV.

Sincerely yours,


Jack L. Brock, Jr.
Director, Governmentwide and Defense
Information Systems

--------------------------------------
/Footnote1/-^Year 2000 Computing Crisis: A Testing Guide (GAO/AIMD-
  10.1.21, November 1998).
/Footnote2/-^Exit criteria are test conditions or requirements for
  successfully completing testing.
/Footnote3/-^Our observations are limited to the specific events we
  witnessed, and we cannot draw conclusions regarding end-to-end testing
  from an overall functional area perspective.
/Footnote4/-^Risks of testing in the production environment must be
  thoroughly analyzed and precautions taken to preclude damage to systems
  and data.
/Footnote5/-^The Communications function considers developmental systems,
  systems that do not process dates, and stand-alone systems as not
  requiring end-to-end testing.
/Footnote6/-^According to Communications officials, some of these systems
  are satellite and control systems, which may require waivers.

BRIEFING ON DOD Y2K FUNCTIONAL END-TO-END TESTING: PROGRESS AND TEST EVENT
MANAGEMENT
===========================================================================

(Briefing charts not reproduced in the text version of this report.)

OBJECTIVES, SCOPE, AND METHODOLOGY
==================================

As requested by the House Committee on Appropriations, Subcommittee on
Defense, our objectives were to (1) assess the status and progress of all
test events within four functional areas--Health Affairs, Communications,
Personnel, and Logistics--and (2) review the management effectiveness of a
critical test event for each of the four functional areas. Together, these
functional areas are performing thousands of end-to-end tests to ensure
that key business processes and systems can continue operating into the
year 2000.

To meet our first objective, we obtained status and progress information
for the aforementioned functional areas and compared the reported status
information to milestones contained in individual functional test
plans/Footnote1/ to identify variances. We discussed this information with
DOD officials and personnel from the Office of the Assistant Secretary of
Defense (Health Affairs), the Office of the Assistant Secretary of Defense
for Command, Control, Communications, and Intelligence, the Office of the
Under Secretary of Defense (Personnel and Readiness), and the Deputy
Under Secretary of Defense (Logistics). Also, for each of the functional
areas, where necessary, we obtained updated status and progress
information on end-to-end test events.

To meet our second objective, we selected one specific test event for each
functional area. The four selected test events were based on each Principal
Staff Assistant's (PSA) designation that the test event was of key
importance in ensuring that the function could continue unaffected at and
after the turn of the century.
The selected test events, and the dates and locations we observed the
events, were

o Health Affairs--Patient Care/Issuance and Processing of Blood Requests
  (May 18, 1999) at the Advanced Technology Integration Center in Falls
  Church, Virginia.

o Communications--Joint User Switch Exercise (JUSE-99-Y2K)
  (June 10, 1999) at the Army Communications-Electronics Command, Fort
  Monmouth, New Jersey.

o Personnel--Army Personnel/Mobilization/Reserve Unit (June 17, 1999) at
  the Army Personnel Command, Alexandria, Virginia.

o Logistics--Intercomponent test of requisition and receipt processes
  (June 24-25, 1999) at the Navy Fleet Material Support Office in
  Mechanicsburg, Pennsylvania. 

For the selected test events, we interviewed DOD officials and reviewed
pertinent documentation for each event, including test event
plans,/Footnote2/ procedures, conditions, exit criteria, results, reports,
defects, and corrective action plans; we also observed the actual execution
of each test event. We then compared the particulars of each event to the
end-to-end testing key processes in our year 2000 test guide, identified
variances, and discussed with test officials the reasons for and impacts of
any variances.

To supplement our documentation reviews and observations, we followed up
after our visits with DOD officials, including those from the TRICARE
Military Health System, the Army Communications-Electronics Command, the
Office of the Deputy Chief of Staff for Personnel, and the Office of the
Deputy Under Secretary of Defense (Logistics), as well as with test event
coordinators and test directors. These officials addressed
telecommunications infrastructure year 2000 compliance issues and provided
additional documentation from the test events we witnessed (i.e., test
results, quick look reports, and final reports). Given the time criticality
of the year 2000 deadline, we briefed each Defense PSA and test director on
our observations as we completed our review of each functional area, as
follows:

o Health Affairs--June 15, 1999,

o Communications--July 12, 1999,

o Personnel--July 15, 1999, and

o Logistics--August 17, 1999.

We performed our audit work primarily at DOD headquarters and at the test
event locations described above. We requested and received comments on a
draft of this report from DOD and incorporated those comments as
appropriate. We performed our audit work from March through September 1999
in accordance with generally accepted government auditing standards.

--------------------------------------
/Footnote1/-^Updated plans included in our review were the December 15,
  1998, plan for Communications; the January 1999 plan for Health Affairs;
  the January 31, 1999, plan for Logistics; and the January 28, 1999, plan
  for Personnel.
/Footnote2/-^Test event plans included in our review were the Military
  Health Systems Patient Care Functional Readiness Assessment Test Plan,
  v1.2, April 1999; the Communications Joint User Switch Exercise 99-Y2K
  Exercise Directive, April 1999; the Army Personnel Command Test Plan,
  June 1999; the Army System Test Plans, June 1999, for EDAS, SIDPERS-3,
  and PEPDUS; and the Logistics Exercise Directive, May 1999.

COMMENTS FROM THE DEPARTMENT OF DEFENSE
=======================================

(The text of DOD's comments is not reproduced in the text version of this
report.)

GAO CONTACT AND STAFF ACKNOWLEDGEMENTS
======================================

GAO Contact

Randolph C. Hite, (202) 512-6240

Acknowledgements

In addition to those named above, Ronald B. Bageant, Cristina T. Chaplain,
Katherine I. Chu, Richard B. Hung, Myong S. Kim, Madhav S. Panwar, and
Alicia L. Sommers made key contributions to this report.

(511659)

*** End of document. ***