Defense Computers: Management Controls Are Critical to Effective Year
2000 Testing (Letter Report, 06/30/1999, GAO/AIMD-99-172).
The Defense Department (DOD) has hundreds of Year 2000 end-to-end test and
evaluation activities planned or under way that must be finished in
a relatively short time. So far, DOD is taking steps to ensure that
these related end-to-end activities are effectively coordinated.
However, DOD is far from successfully completing its Year 2000
end-to-end test activities. Much remains to be done. DOD needs to ensure
that it completes efforts to establish end-to-end test management
controls outlined in GAO's Year 2000 test guide, namely, establishing an
independent quality assurance program to ensure that its test
guidance, plans, and standards are being met and that any deviations or
other reasons for low confidence in end-to-end test results are brought
to the attention of senior management. Also, DOD must ensure that it
effectively implements all of the controls it has included in its plans
so that DOD executives receive timely and reliable information on
end-to-end test results and limitations. With such information, DOD can
act swiftly to correct known problems and to fill voids in test
coverage.
--------------------------- Indexing Terms -----------------------------
REPORTNUM: AIMD-99-172
TITLE: Defense Computers: Management Controls Are Critical to
Effective Year 2000 Testing
DATE: 06/30/1999
SUBJECT: Y2K
Computer software verification and validation
Systems conversions
Strategic information systems planning
Information resources management
Computer software
Embedded computer systems
Data integrity
Internal controls
IDENTIFIER: Y2K
DOD Year 2000 Program
United States General Accounting Office

GAO Report to the Chairman, Subcommittee on Defense,
Committee on Appropriations, House of Representatives

June 1999

DEFENSE COMPUTERS: Management Controls Are Critical to Effective Year 2000 Testing

GAO/AIMD-99-172

United States General Accounting Office
Accounting and Information Management Division
Washington, D.C. 20548

B-282625
Letter

June 30, 1999

The Honorable Jerry Lewis
Chairman, Subcommittee on Defense
Committee on Appropriations
House of Representatives

Dear Mr. Chairman:

You requested that we review the Department of Defense's (DOD) efforts to integrate and coordinate its various Year 2000 end-to-end test activities.[1]
DOD's approach to conducting Year 2000 end-to-end testing is to have

* the military services conduct system integration testing,
* the Office of the Secretary of Defense (OSD) coordinate, facilitate, and monitor test and evaluation activities carried out by the military services, Defense agencies, and Commanders in Chief (CINCs)[2] and, in some cases, conduct end-to-end testing for key functional areas such as logistics, communications, and personnel, and
* the CINCs conduct military operational exercises to verify their Year 2000 mission readiness.

An important aspect of effective end-to-end testing is establishing and implementing management controls that help ensure that tests are planned, executed, and reported on, among other things, in an integrated fashion, and that managers receive timely, reliable, and verifiable information on test results and limitations. Thus, we agreed with your staff to determine whether (1) DOD's plans recognize relationships and dependencies among these test and evaluation activities and (2) DOD has established the management controls to ensure that its various Year 2000 end-to-end test and evaluation activities are effectively integrated. As DOD conducts specific test and evaluation events, we will be separately reporting to you on DOD's effectiveness in managing these events, including its implementation of end-to-end test management controls.

[1] End-to-end Year 2000 testing refers to testing performed to verify that a defined set of interrelated systems, which collectively support an organizational core business function or operation, interoperate as intended in a Year 2000 environment. Three other phases of testing should precede end-to-end testing: software unit testing, software integration testing, and system acceptance testing.

[2] CINCs are responsible for DOD's unified combatant commands, which include the Atlantic Command, Central Command, European Command, Pacific Command, United States Forces Korea, Southern Command, Space Command, North American Aerospace Defense Command, Special Operations Command, Strategic Command, and Transportation Command.

We performed our audit work from October 1998 through April 1999 in accordance with generally accepted government auditing standards. For additional information on our objectives, scope, and methodology, see appendix I. The Office of the Assistant Secretary of Defense provided written comments on a draft of this report. These comments are discussed at the end of this report and reprinted in appendix IV.

Results in Brief

DOD's end-to-end test and evaluation plans that were available at the time of
our review recognize relationships and dependencies among various
end-to-end test and evaluation activities. For example, the North
American Aerospace Defense Command (NORAD) operational evaluation
plans[3] linked the various service and Defense agency information
systems to its mission-critical warfighting tasks and operational
evaluation scenarios. Similarly, the Army systems integration test
plan specified five phases of integration testing activities, one
of which was end-to-end testing by the functional areas and
another of which was operational evaluations by the combatant
commands. We also found that OSD and the Joint Chiefs of Staff
(JCS), in order to integrate its various Year 2000 end-to-end test
activities, are establishing test and evaluation management
controls (structures and processes) that are consistent with the
end-to-end test management controls specified in our Year 2000
test guide.[4] For example, in August 1998, the Secretary of
Defense assigned the CINCs responsibility for conducting Year
2000 exercises to verify operational readiness. Later in the same
month, the Deputy Secretary of Defense assigned
interorganizational responsibility and authority for the various
end-to-end test activities to OSD functional area focal points to
ensure Year 2000 readiness for key functional areas that support
the combatant commands' operations. Also, both OSD and JCS
subsequently issued guidance to the military services, Defense
agencies and activities, and the CINCs specifying how these respective Year 2000 test and evaluation activities were to be planned, executed, and reported.

[3] NORAD's plans for the first two phases of its operational evaluations were entitled Vigilant Virgo 99-1 and Amalgam Virgo 99-2.

[4] Year 2000 Computing Crisis: A Testing Guide (GAO/AIMD-10.1.21), issued as an exposure draft in June 1998 and in final form in November 1998.

Further, JCS and OSD have established data bases to
collect specified data on the respective end-to-end test and
evaluation activities. OSD has also established a Year 2000 test
and evaluation function to independently evaluate, among other
things, end-to-end test and evaluation results. To do this, the
designated test director is in the process of defining an
assurance-based approach and metrics for measuring the confidence
that can be attached to specific test event results. However,
this approach and associated metrics have yet to be established,
and little time remains for doing so. While DOD is coordinating its planning efforts to recognize the relationships among end-to-end test and evaluation activities and is establishing controls for managing these relationships, significant challenges still confront DOD in the actual execution of these tests.
The primary challenge, of course, is time. With less than 7 months remaining before the Year 2000 deadline, Defense cannot afford major slippages in its test and evaluation schedule, nor does it have the luxury of redoing tests that prove ineffective or incomplete. Exacerbating this pressure is the fact that, according to Defense, 245 of DOD's 2,038 mission-critical systems, some of which are needed to execute test and evaluation activities, are not yet Year 2000 compliant and thus may require invocation of system contingency plans as part of the test and evaluation event. With so little time remaining for DOD's many
organizational components to conduct hundreds of related end-to-
end test events, it will be important that end-to-end test and
evaluation events are well-managed. In particular, DOD must
ensure that its established controls are effectively implemented
for each test event. Also, we are recommending that DOD establish controls for independently verifying that
CINCs, military services, and Defense agencies adhere to
established end-to-end test and evaluation guidance, plans, and
standards. By doing this, the department's executive leadership
can receive timely and reliable information on test results,
progress, and limitations, such as gaps in the scope of end-to-end
test events due to the unavailability of compliant systems or
tested contingency plans. With such information, DOD leaders can
act swiftly to address mission areas at risk by filling voids in
test coverage either through additional end-to-end test and
evaluation or through contingency planning. In commenting on a
draft of this report, DOD concurred with our recommendations and
noted that it is taking actions to implement a quality assurance program and
reinforce the importance of adhering to testing and evaluation
management controls.

Background

To protect
the security of the United States, DOD relies on a complex array
of computer-dependent and mutually supportive organizational
components, including the military services, CINCs, and Defense
agencies. It also relies on a broad array of computer systems,
which include weapon systems, command and control systems,
satellite systems, inventory management systems, transportation
management systems, health systems, financial systems, personnel
systems, and payment systems. In turn, these systems share
thousands of interface connections with systems belonging to
private contractors, other government agencies, and international
organizations. To effectively ensure that this immense and complex
array of organizational units and supporting computer systems is
ready for the Year 2000, DOD must verify not only that individual
systems function correctly in a Year 2000 environment, but also
that sets of interrelated and interconnected systems properly
interoperate in such an environment. The depth and complexity of DOD's organizational structure and its dependency on computer systems are further illustrated in appendix II.

GAO's Past Work on DOD's Overall Year 2000 Program Has Identified the Need for Management Controls

Over the last 2 years, we have reviewed DOD's Year 2000 efforts and progress, and made recommendations to strengthen program management. In response, DOD has taken steps to implement our recommendations by providing the controls and guidance needed to fix and test individual systems.
It has also appropriately shifted its focus to core business areas
(i.e., functional areas such as logistics and communications, and
combatant commands' operational areas). Also, the Deputy
Secretary has become personally engaged in directing and
monitoring Year 2000 efforts. We recently testified that a key to the success of these steps rested in putting in place (i.e., establishing, implementing, and enforcing) effective management controls so that DOD has timely and reliable information about what is going right and what is going wrong and corrective action can be swift and effective.[5] We also identified the need for DOD to gain greater visibility into each of its core business area's Year 2000 risks and readiness. One of the critical areas of visibility that we cited in this regard was end-to-end test activities.

[5] Year 2000 Computing Crisis: Defense Has Made Progress, But Additional Management Controls Are Needed (GAO/T-AIMD-99-101, March 2, 1999).

End-to-End Testing Is an Essential Part of an Effective Year 2000 Test Program

Complete and thorough Year 2000 testing is essential to provide reasonable, but not absolute, assurance that (1) new or modified systems process dates correctly and (2) an organization's ability to perform core business operations and functions will not be jeopardized after the millennium. To be done
effectively, this testing should be managed in a structured and disciplined fashion.
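As a concrete illustration of the date-processing assurance described above, the following minimal sketch, which is our own generic example and not drawn from the report or any DOD system, probes boundaries that Year 2000 test cases commonly exercised: the century rollover and the year 2000 leap day.

    from datetime import date

    # Boundary dates commonly exercised in Year 2000 test cases: the century
    # rollover itself and the 2000 leap day (2000 was a leap year, a rule
    # that century-based shortcuts often got wrong).
    BOUNDARY_PAIRS = [
        (date(1999, 12, 31), date(2000, 1, 1)),
        (date(2000, 2, 28), date(2000, 2, 29)),
        (date(2000, 2, 29), date(2000, 3, 1)),
    ]

    def rollover_failures(pairs):
        """Return the pairs whose second date is not exactly one day later."""
        return [(a, b) for a, b in pairs if (b - a).days != 1]

    print("date-boundary failures:", rollover_failures(BOUNDARY_PAIRS) or "none")

Our Year 2000 test guide defines a step-by-step framework for managing all Year 2000 test activities. This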
framework sets forth five levels of test activity supported by
continuous management oversight and control. The first level
establishes the organizational key processes needed to effectively
direct and support the next four levels. The other four levels
define key processes for planning, conducting, and reporting on
tests of incrementally larger system components, beginning with
tests of software units and culminating with tests of sets of
interrelated systems, referred to as end-to-end testing. The
purpose of end-to-end testing is to verify that a defined set of
interrelated systems, which collectively support an organizational
core business area or operation, interoperate as intended in an
operational environment (either actual[6] or simulated). These
interrelated systems include not only those owned and managed by
the organization, but also the external systems with which they
interface. The boundaries for end-to-end tests are not fixed or
predetermined, but rather vary depending on a given business
function's or operation's system dependencies and criticality to
the organizational mission. Therefore, in managing end-to-end
test activities, it is important to analyze the interrelationships
among core business functions/operations and their supporting
systems and the mission impact and risk of date-induced systems
failures. It is also important to work early and continually with
functional/operational partners to ensure that related end-to-end
test activities are effectively coordinated and integrated.

[6] Risks of testing in the production environment must be thoroughly analyzed and precautions taken to preclude damage to systems and data.

DOD Has Initiated Year 2000 End-to-End Test and Evaluation Activities

DOD has under way three closely related end-to-end test and evaluation efforts to verify that the department can perform core functional and operational missions in a Year 2000 environment. These are: (1) military service-sponsored system integration tests, (2) functional area Year 2000 end-to-end tests, and (3) CINC operational evaluations. Because
the respective DOD organizational components that are conducting
these test and evaluation efforts, as described earlier, are
mutually dependent, each of these test efforts is also mutually
dependent.

Military Service System Integration Testing

The military services are conducting system integration tests to ensure the correct functioning of the interfaces
between interconnected systems and to demonstrate the Year 2000
readiness of selected business functions and operational
capabilities. The services have developed system integration test
plans that specify high-level test policy and schedules, and that
build upon the individual system renovation and validation
activities that they have already completed. The test plans
specify how the military services will determine whether discrete
systems can work together to perform the military service's
missions, including organizing, training, and equipping their
respective forces. For example, the Army plans to conduct the Air
Defense Operations Test Case to demonstrate that the Air and
Missile Defense Workstation can correctly exchange date/time
information with Battlefield Functional Area Control Systems. As
shown in figure 1, the military services have scheduled system
integration tests from February 1999 through mid-October 1999.
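The Air Defense Operations Test Case described above is, at bottom, a date/time interface check between two systems. The sketch below illustrates what such a check verifies by round-tripping a timestamp through a two-digit-year message field; the field layout and the windowing pivot are hypothetical assumptions for illustration, not details of any Army system.

    from datetime import datetime

    PIVOT = 50  # assumed windowing rule: two-digit years below 50 mean 20xx

    def encode_legacy(ts):
        """Encode a timestamp as a hypothetical legacy interface expects."""
        return ts.strftime("%y%m%d%H%M")

    def decode_legacy(field):
        yy = int(field[:2])
        year = (2000 if yy < PIVOT else 1900) + yy
        return datetime(year, int(field[2:4]), int(field[4:6]),
                        int(field[6:8]), int(field[8:10]))

    def interface_test(ts):
        """The check passes if the receiver reconstructs the sender's time."""
        return decode_legacy(encode_legacy(ts)) == ts

    for probe in (datetime(1999, 12, 31, 23, 59), datetime(2000, 1, 1, 0, 1)):
        print(probe, "PASS" if interface_test(probe) else "FAIL")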
Figure 1: Military Service System Integration Test Schedule (Calendar Year 1999)
[Gantt-style schedule chart covering January through December 1999 for the Army, Navy, Marine Corps, and Air Force, showing primary evaluations; graphic not reproduced.]

Functional Area End-to-End Testing

In August 1998, the Deputy Secretary of Defense directed five OSD focal points, known as Principal Staff Assistants (PSAs), to ensure that their respective lines of
business or functional areas would continue to operate in the Year
2000.

Table 1: Functional Areas Designated for End-to-End Testing

Communications    Includes telecommunications and other systems used to transmit and receive information
Logistics         Includes management of material, operation of supply, maintenance activities, material transportation, base operations and support
Health/Medical    Includes providing medical care to active military personnel, dependents, and retirees
Personnel         Includes recruiting of new personnel, personnel relocation, civilian disability compensation, veterans education assistance, etc.
Intelligence      Includes collection, processing, integration, analysis and interpretation of available information concerning foreign countries or areas

In response to the Deputy
Secretary of Defense's direction, the PSAs, in collaboration with
the military services and Defense agencies, are at various stages
of planning and conducting Year 2000 functional end-to-end tests.
Specifically, the PSAs have directed the appropriate military
service and Defense agency components to identify core business
processes, or "threads," within the respective functional areas.
The PSAs are then to determine whether the military service and
Defense agency testing and/or CINC Year 2000 operational
evaluations (discussed in the next section) adequately assess the
designated functional area threads. If not, the PSAs are to
direct the appropriate military service or Defense agency
component to develop, execute, and report the results of end-to-
end tests to fill gaps in thread test coverage. In some cases,
such as the health/medical functional area, the PSA may develop
and execute the tests. An example of a thread within the logistics
functional area is the process that a soldier in the field follows
to requisition and receive ammunition from the forward ammunition
depot using the unit's automated requisitioning system and the
appropriate distribution system. Testing this thread could
involve the supply, transportation, reordering, and procurement activities.
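As a rough sketch of how such a thread might be exercised, the fragment below models a thread as an ordered chain of steps and reports the first step that breaks it; the step names are invented for illustration and do not identify actual DOD systems.

    # Invented step names; an actual thread would name real DOD systems.
    AMMO_REQUISITION_THREAD = [
        "unit automated requisitioning system",
        "supply system",
        "distribution system",
        "forward ammunition depot receipt",
    ]

    def run_thread(steps, step_passed):
        """Walk the thread in order; return (passed, first failing step)."""
        for step in steps:
            if not step_passed.get(step, False):
                return False, step
        return True, None

    results = dict.fromkeys(AMMO_REQUISITION_THREAD, True)
    results["distribution system"] = False  # simulate a failed Year 2000 check
    print(run_thread(AMMO_REQUISITION_THREAD, results))

Concurrent with the military services' and Defense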
agencies' functional thread designations, the PSAs have drafted
high-level functional area end-to-end test plans and schedules and
coordinated them with the military services and Defense agencies.
As illustrated in figure 2, these plans show that functional area
end-to-end testing of specified threads will occur through October
1999.

Figure 2: End-to-End Testing Schedule for Functional Areas (Calendar Year 1999)
[Gantt-style schedule chart covering January through December 1999 for the logistics, personnel, medical, communications, and intelligence functional areas, showing primary evaluations and backup evaluations (timeframes established to conduct additional or supplementary tests or evaluations, if necessary); graphic not reproduced.]

CINC Operational Evaluations

In August 1998, the Secretary of Defense directed the CINCs to plan and execute a series of simulated Year 2000 operational exercises.[7] According to the department, these
exercises are to assess whether Defense can still perform the
tasks that are critical to carrying out military missions in a
Year 2000 environment (for example, tactical warning;
transportation of goods, equipment, and personnel; deployment and
sustainment of troops; command and control; air refueling; and
aeromedical evacuation). DOD has defined almost 500 of these
tasks. In response to the Secretary's direction, each CINC
designated a particular operational mission(s) to evaluate and
specified the minimum set of tasks needed to perform the
mission(s). The CINCs then identified the minimum number of
automated systems, known collectively as thin lines, that would be
required to complete the critical tasks. For example, NORAD
identified a thin line of 65 specific systems needed to complete
its Integrated Tactical Warning/Attack Assessment task.
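A thin line lends itself to a simple readiness check before an evaluation is run. The sketch below, with invented system names and a far smaller set than NORAD's 65 systems, flags any thin-line member that is not yet compliant as a coverage gap to be disclosed.

    # Invented system names; a real thin line would enumerate actual systems.
    thin_line = {
        "missile warning sensor feed": True,
        "correlation center processing": True,
        "communications relay": False,  # not yet Year 2000 compliant
    }

    gaps = sorted(name for name, ok in thin_line.items() if not ok)
    if gaps:
        print("thin line not fully testable; coverage gaps:", "; ".join(gaps))
    else:
        print("thin line ready for operational evaluation")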
Accordingly, it subsequently planned and conducted an operational
evaluation to assess its capability to perform this task in a Year
2000 environment. That is, NORAD evaluated the capability of its
systems to track and forward missile and space air threats to the
National Military Command Center and Cheyenne Mountain Operations
Center, with the mission support systems' clocks rolled forward to January 1, 2000.[8]

[7] Memorandum from the Secretary of Defense, dated August 7, 1998, to the secretaries of the military departments, Chairman of the Joint Chiefs of Staff, Under Secretaries of Defense, et al., regarding Year 2000 compliance.

[8] As noted in the introduction to this report, we will be reporting separately on DOD's effectiveness in managing this and other test and evaluation events.
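Rolling clocks forward is commonly done in test infrastructure by offsetting the reported time rather than resetting every host clock; the sketch below shows that idea in miniature. It is an assumption about technique for illustration, not a description of NORAD's actual test setup.

    from datetime import datetime

    class RolledClock:
        """Report an offset 'now' so systems under test behave as if it
        were a chosen future moment (here, just past the rollover)."""
        def __init__(self, pretend_now):
            self._offset = pretend_now - datetime.now()
        def now(self):
            return datetime.now() + self._offset

    clock = RolledClock(datetime(2000, 1, 1, 0, 0, 0))
    stamp = clock.now()
    assert stamp.year == 2000, "rolled clock failed to cross the boundary"
    print("simulated evaluation time:", stamp.isoformat(timespec="seconds"))

The CINCs, in collaboration with the military services and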
Defense agencies that support their respective operational
missions, report that they are at varying stages of planning and
executing their Year 2000 operational evaluations. According to
DOD, JCS has scheduled 32 of these operational evaluations through
September 1999 that will exercise a subset of DOD's tasks. As
illustrated in figure 3, as of April 12, 1999, 13 evaluations had
been reported as completed at seven different combatant commands.
Figure 3: CINC Operational Evaluations Schedule (Calendar Year 1999)
[Gantt-style schedule chart covering January through December 1999 for USACOM, USCENTCOM, USEUCOM, USPACOM, USFK, USSOCOM, USSOUTHCOM, USSPACECOM, NORAD, USSTRATCOM, and USTRANSCOM, showing primary and backup evaluations; graphic not reproduced.]

DOD Year 2000 End-to-End Test Plans Recognize Organizational and System Dependencies

The Deputy Secretary of Defense has acknowledged the need to ensure that DOD's Year 2000 end-to-end testing efforts recognize key mission relationships and dependencies between the CINCs, OSD functional areas, military services, and Defense agencies. Moreover, recent DOD Year 2000 test guidance specifies that the test plans should define relevant organizational and system relationships. Unless DOD's end-to-end test plans do so, the likelihood that key operations and functions will be adequately tested is greatly reduced.

We reviewed available
plans for early operational evaluations as well as draft plans for
the initial five functional end-to-end tests and the military
service integration tests. Our review showed that DOD's Year 2000
end-to-end test and evaluation plans recognize relevant
organizational and supporting system relationships and
dependencies. The results of our review of the plans for the
military service integration tests, functional area tests, and
operational evaluations, respectively, are summarized below.
Military Service System Integration Test Plans

The military services have drafted system integration test plans. We reviewed the Army and the Navy system integration plans and found
that they generally described relevant relationships with the
functional area end-to-end test plans and the CINC operational
evaluation plans.[9] For example, the Army plan defined its
integration testing in five phases: (1) individual system testing,
(2) OSD functional end-to-end testing, (3) CINC operational
evaluations, (4) Army operational evaluation (to cover any mission
threads the OSD and CINC testing did not), and (5) contingency
assessment.[10] The Army plan also discussed the need to designate
organizational responsibility for central, interorganizational
coordination of each of the five phases.

Functional Area End-to-End Test Plans

Test plans have been drafted for each of the initial five functional areas: communications, logistics, personnel, health/medical, and intelligence. Our review of drafts of these plans[11] showed that all five generally
addressed relevant relationships with the CINC operational
evaluations. For example, the logistics draft plan described how
some functional threads relate to CINC operational thin lines, and
it defined processes for coordinating and integrating more
detailed test planning, execution, and reporting activities. Also,
the functional draft test plans generally described the
relationships between the respective functional area testing and
the military services' system integration testing. For example,
the logistics test plan specified the military service and Defense agency components that are responsible for planning and conducting specific functional thread tests.

[9] "U.S. Army Operation Order 99-01, Millennium Passage" (January 1999); "Naval Year 2000 Test Master Plan" (March 1999).

[10] An assessment designed to evaluate the ability of DOD to go to war in an environment degraded by Year 2000 failures.

[11] Updated plans included in our review were the December 15, 1998, plan for communications; the January 1999 plan for health and medical; the December 22, 1998, plan for intelligence; the January 31, 1999, plan for logistics; and the January 28, 1999, plan for personnel.

CINC Operational Evaluation Plans

We reviewed the operational evaluation plans for two completed CINC exercises that were performed jointly by
NORAD and the U.S. Strategic Command. The first exercise,[12]
performed from December 2 through 4, 1998, focused primarily on
the missile warning element of NORAD's Integrated Tactical Warning
and Attack Assessment function. The follow-on exercise,[13]
conducted from February 15 through 28, 1999, involved a
comprehensive evaluation of NORAD and the Strategic Command's
thin-line systems for air warning, missile warning, space warning,
and aerospace control. We found that these plans recognized the
CINCs' dependence on various functional areas and systems. For
example, the plans recognized the military service and Defense
agency functional systems needed to support the commands'
respective thin-line operational objectives. However, DOD's
execution of initial operational evaluations did not include
actually testing certain thin-line functional systems, such as
communications and intelligence systems, because the systems were
not yet Year 2000 compliant. According to CINC documents,
evaluations of the performance of these omitted systems will be
included in other DOD organizations' test plans and verified
later. Also, at the time of our review, the DOD operational
evaluations that we reviewed did not test any weapon systems.
This is because DOD had originally chosen to rely on the military
services' weapon systems integration tests. Since then, DOD has
recognized the importance of including weapon systems in selected operational exercises and has expanded the exercises accordingly.

[12] Known as Vigilant Virgo 99-1.

[13] Known as Amalgam Virgo 99-2.

DOD Is Establishing Management Controls for Integrating End-to-End Testing

Our Year 2000 test guide defines management controls for effective Year 2000 test programs. These controls include organizational structures and processes (i.e., policies, procedures, plans, and standards) for ensuring that test activities, including end-to-end testing, are planned, executed, reported, and overseen in a structured and disciplined manner. In
the case of end-to-end testing, our guide discusses the need to
ensure that relationships among organizations and their systems
are effectively managed through interorganizational controls
(structures and processes) that govern how testing will be
planned, executed, reported, and overseen, and how test results
will be used. For example, our guide describes the need to

* clearly establish interorganizational responsibility and accountability for end-to-end test activities;
* establish organizational expectations (i.e., policies and guidance) for planning and executing end-to-end testing, including such things as (1) test coverage, test conditions, test metrics, and test reporting content, format, and frequency, and (2) expectations for integrating and coordinating related test activities; and
* establish mechanisms for ensuring that (1) end-to-end test expectations are being met, including quality assurance[14] controls to validate that collected information is reliable, and (2) collected information is effectively shared and used to take needed corrective action.

Without such controls, organizations risk limiting both the effectiveness and efficiency of their end-to-end test activities.

DOD has taken a number of actions to establish
the management controls needed to integrate and coordinate its
various end-to-end test and evaluation activities that are
consistent with our Year 2000 test guide. First, DOD assigned
interorganizational responsibility and accountability for end-to-
end test activities to the OSD PSAs. Specifically, in August
1998,15 the Deputy Secretary of Defense charged the PSAs with
ensuring that the 14The purpose of this quality assurance is to
independently ensure that test and evaluation activities and
results are complete and accurate and conform to test and
evaluation plans, guidance, and standards. 15Memorandum from the
Deputy Secretary of Defense, dated August 24, 1998, to the
secretaries of the military departments, Chairman of the Joint
Chiefs of Staff, Under Secretaries of Defense, et al., regarding
Year 2000 verification of national security capabilities. Page 12
GAO/AIMD-99-172 Defense Computers B-282625 various functions that
support DOD's operational missions can effectively operate in a
Year 2000 environment. Second, DOD issued guidance and direction
on Year 2000 test planning, execution, and reporting. For
example, in addition to its guidance on creating and executing
operational evaluations, JCS issued draft guidance in October 1998
to the CINCs defining how Year 2000 operational evaluations should
be planned and executed. This guidance, which was updated in
April 1999,[16] addressed the need to ensure that these evaluations
are coordinated with functional end-to-end tests and military
service integration tests, and how the results should be analyzed
and reported. Also, in late 1998, the Office of the Assistant
Secretary of Defense for Command, Control, Communications, and
Intelligence (OASD/C3I) began briefing functional representatives
in Defense agencies and the military services on test
expectations. Further, in March 1999, OASD/C3I issued appendix I
to DOD's Year 2000 Management Plan,[17] which provides additional
guidance on planning, executing, and evaluating functional end-to-
end testing. Third, DOD is establishing mechanisms for collecting
information on end-to-end test progress and results and ensuring
that it is reliable and available for management action. For
example, JCS has developed a central data base to store and
analyze selected data about each operational evaluation that the
CINCs are required to report in their plans and in reports that
are to be submitted to the Joint Chiefs of Staff following the
evaluations.[18] OSD is defining end-to-end functional test metrics
that will be collected from the functional thin line/system integration tests and stored/analyzed in an OSD data base.
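The sketch below suggests the kind of record such collection might produce; the fields and the simple confidence rule are assumptions for illustration, since, as noted, the actual metrics were still being defined at the time of our review.

    from dataclasses import dataclass, field

    @dataclass
    class TestEventRecord:
        """Assumed shape of a collected test-event record (illustrative)."""
        event: str
        threads_assessed: int
        threads_passed: int
        deviations: list = field(default_factory=list)  # e.g. skipped systems

        def confidence(self):
            if self.deviations:
                return "low"  # any deviation should reach senior management
            if self.threads_passed == self.threads_assessed:
                return "high"
            return "medium"

    record = TestEventRecord("logistics end-to-end test (example)", 12, 11,
                             deviations=["noncompliant system skipped"])
    print(record.event, "->", record.confidence())

Also,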
in December 1998, OASD/C3I and JCS began holding biweekly Year
2000 meetings[19] with representatives from OASD/C3I, JCS, the
CINCs, the military services, and the Defense agencies. The
purpose of these meetings is to facilitate coordination and
integration of the various end-to-end test activities that cut
across organizational boundaries.

[16] Joint Staff Year 2000 Operational Evaluation Guide, Version 3.0, April 1, 1999.

[17] DOD Year 2000 Management Plan, Version 2.0, appendix I, Guidelines to Support DOD Y2K Operational Readiness.

[18] Joint Chiefs of Staff guidance requires the CINCs to submit reports 7 days and 30 days after the completion of a Year 2000 test that describe the evaluation; the critical mission(s), task(s), and thin line systems that were assessed; failures that occurred during the evaluation; and actions to correct problems.

[19] Known within Defense as synchronization meetings.
Further, in February 1999, OASD/C3I established a Year 2000 test
and evaluation function to independently evaluate, among other
things, end-to-end test and evaluation results. To do this, the
designated test director is in the process of defining an
assurance-based approach and metrics for measuring the confidence
that can be attached to specific test event results. However,
this quality assurance approach and associated metrics have yet to
be established, and little time remains for doing so.

DOD Must Ensure That Its End-to-End Test Events Effectively Implement Established Management Controls

An effective system of internal management controls requires both timely establishment of such controls (i.e., definition and institutional awareness and understanding) and consistent implementation of the controls (i.e., adherence and enforcement). As discussed above, we found that, with the exception of the end-to-end test and evaluation quality assurance process, DOD has established end-to-end test management controls that are
consistent with our Year 2000 test guide. However, establishing
controls is only part of what DOD needs to do to ensure that its
end-to-end test activities are effectively managed. DOD must also
ensure that these controls are adhered to and enforced in
planning, executing, and reporting the results of actual end-to-
end test events. Fully implementing and enforcing these end-to-end
test management controls would be important even if DOD were conducting
only a handful of Year 2000 end-to-end test events and its
component organizations' missions were not so dependent on
compliant systems. However, DOD is conducting literally hundreds
of end-to-end test activities and events within an intense 9-month
period (February to mid-October 1999), and some of these
activities are closely related. As a result, adherence to these
controls is absolutely critical. To illustrate: as discussed earlier in this report, some systems that are to
be part of the thin-line operational evaluations are not yet
compliant and thus are unavailable for a given test event. As of
March 31, 1999, 245 of 2,038 mission-critical systems, some of
which may be included in an operational evaluation, were reported
as being not yet compliant.[20] In cases where systems are not yet
ready, CINCs can (1) implement the system contingency plan,
(2) postpone the operational evaluation until the necessary thin line of systems is ready, (3) not test the system and assume proper functioning of the thin line of systems, or (4) count on other DOD organizations to verify the missing thin line at a later date.

[20] Appendix III provides examples of key systems that are currently behind schedule and describes their importance to Defense's mission.
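Whichever option is chosen, the disposition and any resulting coverage gap need to be recorded and disclosed so that related test events can account for them; a minimal sketch of such a record, with assumed field names, follows.

    from enum import Enum

    class Disposition(Enum):
        CONTINGENCY_PLAN = 1  # exercise the system contingency plan instead
        POSTPONE = 2          # wait until the thin line is ready
        ASSUME_GOOD = 3       # skip the system, assume proper functioning
        DEFER_TO_OTHERS = 4   # another organization verifies it later

    def record_gap(system, choice):
        """Assumed record shape: options 3 and 4 leave the system unverified
        in this event, which related test events need to know about."""
        return {"system": system,
                "disposition": choice.name,
                "unverified_here": choice in (Disposition.ASSUME_GOOD,
                                              Disposition.DEFER_TO_OTHERS)}

    print(record_gap("communications relay (example)", Disposition.ASSUME_GOOD))

Regardless, these delays and gaps can not only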
affect the particular end-to-end test event, but also can affect
related test events. While DOD is establishing end-to-end test
management controls for identification and disposition of these
delays and gaps in its various end-to-end test events, these
controls must be followed to be effective. To do less could limit
DOD's end-to-end testing effectiveness, and thus its Year 2000
operational readiness.

Conclusions

DOD has hundreds of related Year 2000 end-to-end test and evaluation activities under way or planned that must be completed in a relatively short time. Thus far, DOD is taking steps to ensure that these related
end-to-end activities are effectively coordinated. This is
evidenced by the fact that draft and final test and evaluation
plans for the various functional and operational mission areas
recognize relevant interorganizational relationships and
dependencies, and the fact that important management controls have
either been established or are being established. However, DOD is
far from successfully completing its various Year 2000 end-to-end
test activities, and much remains to be addressed and
accomplished. To effectively do so, DOD must ensure that it
completes efforts to establish end-to-end test management controls
specified in our Year 2000 test guide, namely, establishing an
independent quality assurance program for ensuring that its test
guidance, plans, and standards are being met and that any
deviations or other reasons for low confidence in end-to-end test
results are brought to the attention of senior managers. Also, it
must ensure that it effectively implements all of the controls it
has included in its various plans so that DOD executive leadership
receives timely and reliable information on end-to-end test
results and limitations. With such information, DOD leaders can
act swiftly to correct known problems and to fill voids in test
coverage either through additional end-to-end test and evaluation
or through contingency planning.

Recommendations

We recommend
that the Secretary of Defense (1) direct the Assistant Secretary
for C3I to immediately implement a quality assurance program for
end-to-end test and evaluation activities under the newly
designated Year 2000 test director to
provide independent evaluations of test event results and (2)
reiterate to the OSD, JCS, and military service end-to-end testing
principals the importance of ensuring that established end-to-end
test and evaluation management controls are implemented and
enforced on their respective end-to-end events, and that
deviations from these controls be disclosed through existing Year
2000 reporting mechanisms.

Agency Comments and Our Evaluation

The Office of the Assistant Secretary of Defense provided written comments on a draft of this report, which are reprinted in
appendix IV. DOD concurred with both of our recommendations and
outlined the actions it has planned, or already begun, to
implement them. Regarding our recommendation that Defense
immediately implement a quality assurance program for end-to-end
test and evaluation activities, Defense acknowledged that such a
program should have been implemented in the design phase of its
testing activities and stated that it has initiated steps to
implement a program that will include (1) Inspector General
independent audits of test results, (2) military service
operational test agencies' review of test results, and (3) funding
to support service- and agency-operated independent verification
and validation activities. Regarding our recommendation that the
Deputy Secretary of Defense reiterate the importance of ensuring
that test and evaluation management controls are implemented and
enforced, Defense stated that it has begun implementing our
recommendation by making modifications to its Year 2000 guidance
and by reinforcing the importance of adhering to management and
reporting controls at Year 2000 Executive-Service Principals'
meetings, Year 2000 Steering Committee meetings, and the
synchronization meetings. We are sending copies of this report to
Representative John P. Murtha, Ranking Minority Member,
Subcommittee on Defense, House Appropriations Committee; Senator
John Warner, Chairman, and Senator Carl Levin, Ranking Minority
Member, Senate Committee on Armed Services; Senator Ted Stevens,
Chairman, and Senator Daniel Inouye, Ranking Minority Member,
Subcommittee on Defense, Senate Committee on Appropriations;
Representative Floyd Spence, Chairman, and Ike Skelton, Ranking
Minority Member, House Committee on Armed Services. We are also
sending copies to the Honorable John Koskinen, Chair of the
President's Year 2000 Conversion Council; the Honorable William
Cohen, Secretary of Defense; the Honorable John Hamre, Deputy
Secretary of Defense; General Henry Shelton, Chairman of the Joint
Chiefs of Staff; Arthur Money, Senior Civilian Official of the Office of the Assistant Secretary of Defense
for Command, Control, Communications, and Intelligence; and the
Honorable Jacob J. Lew, Director, Office of Management and Budget.
Copies will also be made available to others upon request. If you have any questions about this report, please call me at (202) 512-6240. Other key contributors to this report are listed in appendix V.

Sincerely yours,

Jack L. Brock, Jr.
Director, Governmentwide and Defense Information Systems
Contents

Letter

Appendix I: Objectives, Scope, and Methodology
Appendix II: Complexity of DOD's Organizational Structure and Reliance on Computer Systems
Appendix III: Examples of Key DOD Mission-Critical Systems Reported to Be Behind Schedule
Appendix IV: Comments From the Department of Defense
Appendix V: GAO Contact and Staff Acknowledgements

Table

Table 1: Functional Areas Designated for End-to-End Testing

Figures

Figure 1: Military Service System Integration Test Schedule (Calendar Year 1999)
Figure 2: End-to-End Testing Schedule for Functional Areas (Calendar Year 1999)
Figure 3: CINC Operational Evaluations Schedule (Calendar Year 1999)
Figure II.1: High-Level DOD Organizational Chart
Figure II.2: High-Level Army Organizational Chart
Figure II.3: High-Level Army Materiel Command Organizational Chart

Abbreviations

C3I       Command, Control, Communications, and Intelligence
CINC      Commanders in Chief
CIO       Chief Information Officer
DOD       Department of Defense
DSN       Defense Switch Network
GCCS      Global Command and Control System
JCS       Joint Chiefs of Staff
NORAD     North American Aerospace Defense Command
OASD/C3I  Office of the Assistant Secretary of Defense for Command, Control, Communications, and Intelligence
OSD       Office of the Secretary of Defense
PSA       Principal Staff Assistant
TBMCS     Theater Battle Management Core System
Appendix I: Objectives, Scope, and Methodology

Our objectives were to determine if (1) DOD's plans for
Year 2000 functional tests, military service integration tests,
and operational evaluations recognize the relationships and
dependencies among these test and evaluation activities and (2)
DOD has established the management controls to ensure that its
various Year 2000 end-to-end test and evaluation activities are
effectively integrated. As such, this report does not address
controls related to other Year 2000-related test activities,
including software unit testing, software integration testing, and
system acceptance testing. Nor does it address the actual
implementation of controls for specific end-to-end test
activities. To accomplish the first objective, we reviewed
Defense's Year 2000 Management Plan (Version 2.0, December 1998).
We also analyzed end-to-end test plans initially issued in the
October 1998 time frame by DOD officials at the direction of the
Deputy Secretary of Defense for five functional areas:
communications, health and medical, intelligence, logistics, and
personnel. Since these plans were considered to be working
documents, we also analyzed updated plans issued from December
1998 through January 1999 for the same five functions.[1] In
addition, we obtained and reviewed test plans for two of the
operational evaluations performed at the North American Aerospace
Defense Command (NORAD) and U.S. Strategic Command, and also
witnessed operational tests conducted during February 1999 at
NORAD. We also reviewed integration testing plans for each of the
military services-the Army, Navy, Air Force, and Marine Corps. We
discussed these plans with the Deputy Secretary of Defense and
other responsible DOD executives, including the Senior Civilian
Official of the Office of the Assistant Secretary of Defense for
Command, Control, Communications, and Intelligence, who serves in
the capacity of the DOD Chief Information Officer (CIO), the
Deputy CIO, Joint Chiefs of Staff and CINC officials, and Defense
agency and military service personnel. To accomplish the second
objective, we reviewed Defense's Year 2000 Management Plan
(Version 2.0, December 1998) and DOD Year 2000 guidance, such as
guidance provided in memoranda regarding the Year 2000 initiative
issued by the Secretary of Defense on August 7, 1998, and the
Deputy Secretary of Defense on August 24, 1998, and other DOD
guidance. We compared DOD's plans and guidance to controls defined
in our Year 2000 test guide[2] as a basis for identifying strengths and weaknesses.

[1] Updated plans included in our review were the December 15, 1998, plan for communications; the January 1999 plan for health and medical; the December 22, 1998, plan for intelligence; the January 31, 1999, plan for logistics; and the January 28, 1999, plan for personnel.

We also discussed Defense's management
controls for Year 2000 testing efforts with the Deputy Secretary
of Defense; CIO officials; Joint Chiefs of Staff and CINC
officials; and Defense agency and military service personnel.
Further, we attended monthly DOD Year 2000 Steering Committee
meetings, Year 2000 synchronization meetings, and Year 2000
training sessions where various efforts to address DOD testing
issues were discussed. We performed our audit work from October
1998 through April 1999 in accordance with generally accepted
government auditing standards.

[2] Year 2000 Computing Crisis: A Testing Guide (GAO/AIMD-10.1.21). Published as an exposure draft in June 1998 and finalized in November 1998.

Appendix II: Complexity of DOD's Organizational Structure and Reliance on Computer Systems

DOD is the largest and most complex organization in
the world. To accomplish its missions, DOD employs a matrixed
organizational structure. Administratively, DOD is organized into
the following major organizational units: the Office of the
Secretary of Defense (OSD); the Joint Chiefs of Staff (JCS); the
unified combatant commands, such as the Atlantic Command and the
Transportation Command; and the military services (Army, Navy, Air
Force, and Marine Corps). (See figure II.1.)

Figure II.1: High-Level DOD Organizational Chart
[Organizational chart showing the Secretary and Deputy Secretary of Defense; the Departments of the Army, Navy, and Air Force; the Office of the Secretary of Defense; the Joint Chiefs of Staff; the DOD field activities and Defense agencies; and the unified combatant commands; graphic not reproduced.]

Under
OSD are numerous large Defense agencies and field activities,
including the Defense Logistics Agency, Defense Finance and
Accounting Service, and Defense Information Systems Agency.
Similarly, under each of the military services are many large
organizational units. For example, the Army has 15 major commands
and numerous other functional activities, such as the Army Materiel Command and the 8th U.S. Army. (See figure II.2.)

Figure II.2: High-Level Army Organizational Chart
[Organizational chart showing the Secretary of the Army, Under Secretary of the Army, Chief of Staff, the Army staff principals, and the Army's major commands; graphic not reproduced.]

All of these Army units are in
turn very large organizations. The Army Materiel Command alone
employs more than 65,000 civilian and military employees at 285
locations worldwide, and ranks in business volume with the top 10 corporations in the U.S. It consists of nine subordinate commands (e.g., the Army Aviation and Missile Command, the Army Communications-Electronics Command, and the Army Research Laboratory) and 11 reporting activities (e.g., the Army Materiel Systems Analysis Activity and Army Materiel Command-Europe). (See figure II.3.)

Figure II.3: High-Level Army Materiel Command Organizational Chart
[Organizational chart showing the Army Materiel Command's major subordinate commands and separate reporting activities; graphic not reproduced.]
Operationally, DOD's combatant forces are assigned to a combatant
command. Each of these combatant commands is responsible for
military operations for specified geographic regions or theaters
of operations. To support each of these commands, DOD has
assigned specific operational support responsibilities to its many
other organizational units, including OSD, the military services,
Defense agencies, and other commands. For example, if a conflict
erupted in the Pacific or Indian Oceans, the Pacific Command would
be the DOD organizational unit responsible for all military
operations in that region, and its CINC would report directly to
the National Command Authority, which consists of the President of
the United States and the Secretary of Defense. Also, the Pacific
Command CINC would be supported by (1) military service components
(e.g., U.S. Army Pacific, Marine Forces Pacific, U.S. Pacific
Fleet, U.S. Pacific Air Forces), (2) subordinate unified commands
(e.g., 8th U.S. Army, U.S. Forces Japan, U.S. Forces Korea), (3)
standing joint task forces (e.g., Joint Interagency Task Force
West, Joint Task Force-Full Accounting), and (4) other supporting
units (e.g., Asia-Pacific Center for Security Studies, Joint
Intelligence Center Pacific). In short, this mix of DOD organizational entities and their supporting systems would interoperate to collectively fulfill the Pacific Command's mission.

DOD's Organizations Are System Reliant

DOD relies extensively on computer systems. Its portfolio includes
weapon systems, command and control systems, satellite systems,
inventory management systems, transportation management systems,
health systems, financial systems, personnel systems, and payment
systems. Collectively, DOD reports that it operates and maintains
more than 1.5 million computers, 28,000 systems, and 10,000
networks. Further, DOD exchanges information with thousands of
public and private sector business partners, exchanges involving
thousands of system and network interfaces. Each of DOD's
organizational units is also system reliant. For example, the
Army depends on about 1,200 systems, of which roughly 400 are
considered by the Army to be mission-critical. Each of its major
commands similarly is system dependent. The Army Materiel
Command, for example, has reported that it depends on
approximately 650,000 computer applications and system
infrastructure devices, about 1,800 of which support weapon
systems (e.g., the AH-64A Apache and AH-64D Apache Longbow attack
helicopters, the M1A2 Abrams tank system, the M2/M3A3 Bradley
fighting vehicle, and the Patriot missile system). The command
also reports that it is responsible for 81 mission-critical business systems that involve 350 data exchange interfaces.

Appendix III: Examples of Key DOD Mission-Critical Systems Reported to Be Behind Schedule

We testified in March 1999[1] and April 1999[2] that
while Defense had recently made progress by providing the controls
and guidance needed to fix and test systems, it was behind
schedule. The following are three examples of some of these
systems. * First, the Global Command and Control System (GCCS)
system is deployed at more than 600 sites worldwide and is
Defense's primary system for generating a common operating picture
of the battlefield for planning, executing, and managing military
operations. Completion of the component-level GCCS at some
locations is currently scheduled for as late as September 30,
1999.

* Second, the Defense Switch Network (DSN), scheduled to be
completed by September 30, 1999, is the primary long-distance
voice communications service for DOD. DSN provides both dedicated
and common-user voice communications services at all priority
levels for command and control and special command and control
users as well as routine service for administrative users
throughout the department.

* Third, the Theater Battle Management
Core System (TBMCS) is being developed by the Air Force and is
intended to replace three Year 2000 non-compliant legacy systems.
TBMCS is to be a primary support tool used by theater commanders
to provide information to the warfighter and for peacetime and
humanitarian operations. Because of developmental problems that
have resulted in schedule slippages, the Air Force does not expect
to fully implement TBMCS until September 30, 1999, at the
earliest. Schedule slippages have also caused the Air Force to
remediate a legacy system, the Contingency Theater Automation
Planning System (scheduled to be completed in September 1999) in the
event of further delays to TBMCS.

[1] Year 2000 Computing Crisis: Defense Has Made Progress, But Additional Management Controls Are Needed (GAO/T-AIMD-99-101, March 2, 1999).

[2] Year 2000 Computing Crisis: Federal Government Making Progress But Critical Issues Must Still Be Addressed to Minimize Disruptions (GAO/T-AIMD-99-144, April 14, 1999).

Appendix IV: Comments From the Department of Defense

[Scanned comment pages not reproduced in this text version.]

Appendix V: GAO Contact and Staff Acknowledgements

GAO Contact

Randolph C. Hite, (202) 512-6240

Acknowledgements

In addition to the above contact,
Ronald B. Bageant, Scott A. Binder, Cristina T. Chaplain,
Katherine I. Chu, Richard B. Hung, Steven M. Hunter, Myong S. Kim,
Robert P. Kissel, Jr., Denice M. Millett, Madhav S. Panwar, Robert
G. Preston, Karen S. Sifford, Alicia L. Sommers, and Yvonne J.
Vigil made key contributions to this report. (511656)