Defense Computers: U.S. Space Command's Management of Its Year 2000
Operational Testing (Letter Report, 11/15/1999, GAO/AIMD-00-30).

Pursuant to a congressional request, GAO provided information on the
Department of Defense's management of various year 2000-related
end-to-end testing activities, focusing on: (1) the U.S. Space Command's
management of its end-to-end test of space control systems essential to
major theater war; and (2) what the results of this test show with
respect to operational risks and readiness.

GAO noted that: (1) year 2000 end-to-end testing is an essential
component of an effective year 2000 testing program since year
2000-related problems can affect so many of the systems owned and
operated by an entity as well as systems belonging to business partners
and infrastructure providers; (2) to be effective, end-to-end testing
should be approached in a structured and disciplined fashion; (3) both
the Joint Chiefs of Staff (JCS) guidance to its combatant commands on
managing year 2000 operational evaluations, and GAO's year 2000 test
guidance define a number of key management controls to employ when
planning, executing, analyzing, and reporting on such test and
evaluation events; (4) GAO found that the Space Command's space control
operational evaluation satisfied 16 of 21 key processes prescribed by
JCS guidance; (5) the Command performed a rehearsal before conducting
the evaluation to ensure that all critical systems and interfaces were
operating correctly and that all staff knew their roles and
responsibilities; (6) in response to GAO's concerns, Space Command has
taken positive actions to address the remaining five key processes; (7)
three of the key processes were addressed during the course of GAO's
review and two were addressed in response to a recommendation GAO made
at its briefing; (8) during the course of its review, Space Command
began ensuring that contingency plans were in place for its
mission-critical systems, which it had not done before conducting the
space control operational evaluation; (9) after GAO found that
configuration management procedures were not always followed while
executing the evaluation, Space Command initiated an effort to ensure
that such procedures are followed in future evaluations; (10) during
GAO's review, the Command amended its report to discuss its decision to
exclude six communications systems from the evaluation and whether this
adversely impacted the ability to draw conclusions about mission
readiness; and (11) at the time of GAO's briefing, Space Command still
needed to address two partially satisfied key processes, which included:
(a) not documenting whether test cases for most intelligence systems met
performance exit criteria; and (b) not ensuring that 1 of 29 systems
included in the evaluation was year 2000 compliant.

--------------------------- Indexing Terms -----------------------------

 REPORTNUM:  AIMD-00-30
     TITLE:  Defense Computers: U.S. Space Command's Management of Its
	     Year 2000 Operational Testing
      DATE:  11/15/1999
   SUBJECT:  Y2K
	     Computer software verification and validation
	     Systems conversions
	     Strategic information systems planning
	     Information resources management
	     Military satellites
	     Interagency relations
	     Systems compatibility
IDENTIFIER:  Y2K


Report to the Chairman of the Subcommittee on Defense, Committee on
Appropriations, House of Representatives

November 1999

DEFENSE COMPUTERS

U.S. Space Command's Management of Its Year 2000 Operational
Testing

GAO/AIMD-00-30

Letter

Appendixes

Appendix I:   Briefing on Results of GAO Review of SPACECOM Space Control
              Y2K OPEVAL

Appendix II:  Objectives, Scope, and Methodology

Appendix III: GAO Contact and Staff Acknowledgements

Table 1:  Summary of JCS Year 2000 Operational Evaluation Criteria

Table 2:  Summary of Space Command's Satisfaction of JCS Evaluation
          Criteria for the Space Control Evaluation

DOD       Department of Defense

JCS       Joint Chiefs of Staff

OPEVAL    operational evaluation

SPACECOM  U.S. Space Command

Y2K       Year 2000

                                                 Accounting and Information
                                                        Management Division

B-282546

November 15, 1999

The Honorable Jerry Lewis
Chairman, Subcommittee on Defense
Committee on Appropriations
House of Representatives

Dear Mr. Chairman:

Complete and thorough end-to-end testing is essential to provide
reasonable assurance that new or modified systems used to collectively
support a core business function or mission operation will not jeopardize
an organization's ability to deliver products and services as a result of
the Year 2000 (Y2K) computing problem. This is especially true for the
Department of Defense (DOD) because it relies on a complex and broad array
of interconnected computer systems--including weapons, command and
control, satellite, inventory management, transportation management,
health, financial, personnel and payment systems--to carry out its military
operations and supporting business functions.

At your request, we reviewed DOD's management of various Year 2000-related
end-to-end testing activities. As part of our efforts, we assessed the
U.S. Space Command's management of its end-to-end test of space control
systems essential to major theater war (one of 16 operational evaluations
for the command) and determined what the results of this test show with
respect to operational risks and readiness./Footnote1/ We briefed Space
Command officials on our findings on October 1, 1999, and made a
recommendation to correct the management weaknesses that we found. Space
Command immediately acted to address our recommendation. We then briefed
your office on our findings and Space Command's actions to address our
recommendation on November 1, 1999. The purpose of this letter is to
summarize our briefing to your office. The briefing slides that we
presented to your office are in appendix I, and the objectives, scope, and
methodology of our review are detailed in appendix II. Space Command
provided oral comments on our briefing slides, and we have incorporated
them as appropriate. We performed our audit work from March through
October 1999 in accordance with generally accepted government auditing
standards.

Results in Brief

Year 2000 end-to-end testing is an essential component of an effective
Year 2000 testing program since Y2K-related problems can affect so many of
the systems owned and operated by an entity as well as systems belonging
to business partners and infrastructure providers. Moreover, to be
effective, end-to-end testing should be approached in a structured and
disciplined fashion. Both the Joint Chiefs of Staff (JCS) guidance to its
combatant commands on managing Year 2000 operational
evaluations/Footnote2/ (the term JCS uses to refer to Year 2000 end-to-end
testing) and our Year 2000 test guidance/Footnote3/ define a number of key
management controls to employ when planning, executing, analyzing, and
reporting on such test and evaluation events.

We found that Space Command's space control operational evaluation
satisfied 16 of the 21 key processes prescribed by JCS guidance. For
example, the Command established a Y2K task force to guide the evaluation
effort, which included satellite/system specialists, test and evaluation
experts, system analysts, military component and service representatives,
and public affairs representatives. Further, the Command performed a
rehearsal before conducting the evaluation to ensure that all critical
systems and interfaces were operating correctly and that all staff knew
their roles and responsibilities.

In response to our concerns, Space Command has taken positive actions to
address the remaining five key processes. Three of the key processes were
addressed during the course of our review and two were addressed in
response to a recommendation we made at our briefing. During the course of
our review, Space Command began ensuring that contingency plans were in
place for its mission-critical systems, which it had not done before
conducting the space control operational evaluation. Also, after we found
that configuration management procedures were not always followed while
executing the evaluation,/Footnote4/ Space Command initiated an effort to
ensure that such procedures are followed in future evaluations.
In addition, during our review, the Command amended its report to discuss
its decision to exclude six communications systems from the evaluation and
whether this adversely impacted the ability to draw conclusions about
mission readiness.

At the time of our October 1, 1999, briefing, Space Command still needed
to address two partially satisfied key processes, which included (1) not
documenting whether test cases for most intelligence systems met
performance exit criteria and (2) not ensuring that 1 of 29 systems
included in the evaluation was Y2K compliant. We therefore recommended
that Space Command amend its final report to JCS to recognize the
uncertainties and risks associated with its failure to take these steps
and the actions underway or planned to address these uncertainties and
risks. Without taking these steps, Space Command could not adequately know
the Year 2000 readiness of critical tasks--collecting surveillance and
intelligence data to disseminate warning messages--associated with
conducting the space control mission. Because Space Command has
subsequently amended its final report and plans to ensure that these
weaknesses are not repeated in a November operational evaluation of its
intelligence mission, we are not making further recommendations at this
time. 

Background

Space Command's mission is to provide direct support to combatant
commanders and military forces through the use of space-based satellites
and other technologies needed for navigation, surveillance and
reconnaissance, communications, environmental and attack warnings during
war and peacetime operations. To perform this mission, Space Command
relies on a wide array of information technology systems, including
command and control systems, geographically dispersed radar sites,
satellites, communications networks, and intelligence systems. 

In August 1998, the Secretary of Defense directed JCS to require its
combatant commands, including Space Command, to plan, execute, analyze,
and report on a series of simulated Year 2000 operational evaluations. The
evaluations, which were to assess whether DOD can continue to perform
critical military operations in a Year 2000 environment, are one of three
DOD end-to-end testing efforts./Footnote5/

The purpose of end-to-end testing is to verify that a defined set of
interrelated systems, which collectively support an organizational core
business area or function, interoperate as intended in an operational
environment (either actual or simulated). These interrelated systems
include not only those owned and managed by an organization, but also the
external systems with which they interface or that otherwise support the
business area or function. The combatant commands' core business areas or
functions are referred to as "thin lines."
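
The sketch below illustrates the concept in simplified form: a
date-stamped record produced by one simulated system is read back by a
second system across the century rollover and the Year 2000 leap-year
date, and the two readings are compared. It is a minimal, hypothetical
Python illustration; the system names, record format, and dates are not
drawn from the Space Command evaluation.

    # Illustrative sketch only: the systems, record format, and dates below
    # are hypothetical stand-ins, not systems from the evaluation.
    from datetime import datetime, timedelta

    CRITICAL_DATES = [
        datetime(1999, 12, 31, 23, 59),  # just before the century rollover
        datetime(2000, 2, 29, 12, 0),    # the Year 2000 leap-year date
    ]

    def upstream_report(ts):
        """Simulated owning system: emits a record with a four-digit year."""
        return ts.strftime("TRACK|%Y-%m-%d %H:%M|OBJECT-00001")

    def downstream_parse(record):
        """Simulated partner system: parses the record it receives."""
        _, stamp, _ = record.split("|")
        return datetime.strptime(stamp, "%Y-%m-%d %H:%M")

    def end_to_end_check():
        """Verify the two systems interoperate across the critical dates."""
        ok = True
        for ts in CRITICAL_DATES:
            for offset in (timedelta(0), timedelta(minutes=2)):  # before/after
                sent = ts + offset
                received = downstream_parse(upstream_report(sent))
                if received != sent:
                    print("FAIL: sent", sent, "but downstream read", received)
                    ok = False
        return ok

    if __name__ == "__main__":
        print("passed" if end_to_end_check() else "failed")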

The boundaries for end-to-end tests can vary depending on a given business
function's system dependencies and criticality to the organizational
mission. Therefore, in managing end-to-end test activities, it is
important to analyze the interrelationships among core business functions
and their supporting systems and the mission impact and risk of date-
induced system failures and to use these analyses to define test
boundaries. It is also important to work early and continually with
functional partners to ensure that related end-to-end test activities are
effectively coordinated and integrated. Table 1 summarizes key processes
recommended by JCS' Year 2000 operational evaluation guidance, which is
consistent with our Year 2000 test guide.

Table 1:  Summary of JCS Year 2000 Operational Evaluation Criteria

Planning     Specify test assumptions and limitations
             Establish a Year 2000 task force
             Identify critical missions/tasks/systems
             Verify that systems essential to mission are Year 2000
               compliant
             Develop an operational evaluation plan to guide event planning
               and execution
             Identify and schedule support from other commands, DOD
               components, etc.
             Determine relevant and necessary resources (e.g., funding,
               personnel, equipment, etc.)
             Ensure approved Year 2000 contingency plans are prepared
             Develop a risk management plan
             Identify simulation needs and establish supporting testing
               environment
             Develop data collection and analysis plan or approaches

Execution    Conduct operational evaluation rehearsal
             Follow configuration management policy
             Perform baseline test for operational evaluation
             Execute required Year 2000 date rollover tests
             Collect and archive all Year 2000-relevant data and ensure
               that systems are reset to current day operations

Analysis     Categorize, document, and report Year 2000 failures
             Determine mission impact of Year 2000 failures
             Ensure exit criteria are met

Reporting    Prepare Year 2000 reports describing mission impact and
               readiness
             Provide reports to JCS within required timeframes
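
Two of the execution processes listed above, performing a baseline test
and then executing the required date rollover tests, follow a
compare-against-baseline pattern: the same test case is run under
current-day conditions and again with the clock advanced to the critical
dates, and the results are compared. The short Python sketch below shows
that pattern using a hypothetical stand-in function; it is illustrative
only and does not represent any system in the evaluation.

    # Illustrative sketch only: "system_under_test" is a hypothetical
    # stand-in, not a system from the space control evaluation.
    from datetime import date, timedelta

    BASELINE_DATE = date(1999, 3, 15)                        # current-day baseline
    ROLLOVER_DATES = [date(2000, 1, 1), date(2000, 2, 29)]   # critical test dates

    def system_under_test(today, days_ahead):
        """Stand-in mission function that schedules an event N days out."""
        return today + timedelta(days=days_ahead)

    def run_case(today, days_ahead=10):
        """Run one test case and return the interval the system produced."""
        return (system_under_test(today, days_ahead) - today).days

    if __name__ == "__main__":
        baseline = run_case(BASELINE_DATE)          # baseline test
        for d in ROLLOVER_DATES:                    # date rollover tests
            rolled = run_case(d)
            status = "PASS" if rolled == baseline else "FAIL"
            print(d.isoformat(), "interval", rolled, "baseline", baseline, status)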

Space Command has already completed 16 operational evaluations to assess
its ability to manage and provide combatant support during a major theater
war. These evaluations covered seven mission areas:
(1) integrated tactical warning and attack assessment, (2) space control,
(3) force enhancement, (4) weather support, (5) command and control of
space forces, (6) space operations support, and (7) space lift. The space
control mission area provides (1) surveillance support to monitor, track,
identify, and catalog all orbiting space objects for collision avoidance
and (2) protection support to monitor, detect, assess, characterize,
track, and issue warnings about threats, both natural and man-made,
against United States and allied space systems. The space control
evaluation was executed between March 11 and March 25, 1999.

Space Command Implemented Most Important Management Processes During Its
Space Control Evaluation

As noted in table 2 below, we found that, for its space control
operational evaluation, Space Command satisfied the majority of the
management process controls (16 of 21) specified in JCS' operational
evaluation guidance. 

Table 2:  Summary of Space Command's Satisfaction of JCS Evaluation
          Criteria for the Space Control Evaluation

Phase          Number of primary        Number of primary
               evaluation criteria      criteria satisfied
Planning                11                        9
Execution                5                        4
Analysis                 3                        2
Reporting                2                        1
Total                   21                       16

Consistent with JCS guidance governing operational evaluation planning,
Space Command established a Year 2000 task force, which included
satellite/system specialists, test and evaluation experts, system
analysts, military component and service representatives, and public
affairs representatives. It identified the 35 critical tasks needed to
carry out the space control mission in support of a major theater war.
Space Command also issued a directive to ensure testing resources would be
made available for operational evaluations and earmarked about $8 million
for operational evaluation activities--including the space control
evaluation. Further, Space Command developed a test plan that documented
participant roles and responsibilities, critical missions and tasks, test
cases, and reporting requirements.

Space Command also took effective steps in executing, analyzing, and
reporting on its evaluation. For instance, before executing the
operational evaluation, Space Command performed a rehearsal to ensure that
all critical systems and interfaces were operating correctly and that all
staff knew their roles and responsibilities. Before resetting systems to
current day operations, Space Command ensured that thin line systems were
assessed, master scenario events were performed and deviations were
identified, and that all data needed to make an assessment of the
command's ability to perform the space control mission were collected and
archived. 

Space Command Acted to Address Three Partially Satisfied Key Processes

Following its operational evaluation, Space Command took action to resolve
three partially satisfied key processes. In doing so, it increased its
assurance with respect to the Y2K readiness of space control critical
tasks involving intelligence and communications systems.

First, before conducting its test, Space Command did not verify that
contingency plans were in place for the 29 systems included in the
evaluation. Instead, Space Command relied exclusively on system owners to
do so. As noted in JCS testing guidance, contingency plans identify
alternative systems or workaround procedures to use when performing a
mission in the event of a system disruption. As such, JCS guidance states
that it is essential that commands ensure that these plans are in place
prior to executing the operational evaluation so that they can be invoked
in the case of system failure. Subsequent to the evaluation, Space Command
began verifying that contingency plans are in place for its mission-
critical systems.

Second, while executing the evaluation, Space Command did not follow
configuration management procedures. JCS guidance specifies that system
configurations not be changed during testing unless authorized by the test
director. During the space control evaluation, changes were made to one
system after the baseline for the evaluation was established and without
authorization from the test director. These changes contributed to a
"hard" failure during testing. (Information on the nature of the system
failure is classified)./Footnote6/ After the evaluation, Space Command
directed the 17th Test Squadron and intelligence unit to review this
deviation and its impact on the command's ability to determine mission
readiness. On September 30, 1999, the intelligence unit and 17th Test
Squadron reported that the deviation did not materially affect mission
readiness. To prevent similar problems in future evaluations, Space
Command directed the 17th Test Squadron and intelligence unit to develop
ways to improve testing documentation and procedures with a special focus
on ensuring that documentation standards, configuration management
procedures, and baseline test requirements are followed. 
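
As footnote 4 notes, configuration management rests on establishing a
baseline and systematically controlling changes to it. One common way to
detect unrecorded changes during a test period, sketched below in Python,
is to record a cryptographic hash of each configuration item when the
baseline is set and to compare current hashes against that record during
the event. The file names and contents here are hypothetical; they are not
the configuration items used in the evaluation.

    # Illustrative sketch only: the configuration items below are
    # hypothetical, not items from the space control evaluation.
    import hashlib
    import json
    from pathlib import Path

    def snapshot(paths):
        """Record a SHA-256 hash for each configuration item."""
        return {p: hashlib.sha256(Path(p).read_bytes()).hexdigest() for p in paths}

    def save_baseline(paths, out="baseline.json"):
        """Establish the approved baseline."""
        Path(out).write_text(json.dumps(snapshot(paths), indent=2))

    def changed_since_baseline(paths, baseline_file="baseline.json"):
        """Return the items whose current hash differs from the baseline."""
        baseline = json.loads(Path(baseline_file).read_text())
        current = snapshot(paths)
        return [p for p in paths if current.get(p) != baseline.get(p)]

    if __name__ == "__main__":
        items = ["radar_site.cfg", "comms_gateway.cfg"]   # hypothetical items
        for name in items:                                # stand-in files for the demo
            Path(name).write_text("mode=operational\n")
        save_baseline(items)                              # baseline established
        Path(items[0]).write_text("mode=test\n")          # an unauthorized change
        print("Changed since baseline:", changed_since_baseline(items))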

Third, in reporting on the evaluation, Space Command did not specify how
its exclusion of six communications systems from the test impacted its
ability to draw conclusions about mission readiness. When planning the
evaluation, Space Command concluded that it would not include six
communications systems in the evaluation due to resource constraints or
because the systems were to be included in a future evaluation. As a
result, Space Command assumed that communications systems would be
available to perform critical tasks and disseminate time-sensitive
warnings to combatant commanders. While Space Command communicated this
assumption to JCS in its operational evaluation plan, it did not report on
how this scope limitation could adversely affect its ability to draw
conclusions about mission readiness. Instead, Space Command reported to
JCS that critical space control tasks could be performed across the
calendar and leap year dates with no significant impact on its mission
readiness. Space Command has since ensured that omitted communications
systems were included in other Year 2000 end-to-end testing or operational
evaluation events and disclosed this limitation in its final report on the
evaluation.

Space Command Is Acting to Address Recommendation Made at the Briefing

At the time of our October 1, 1999, briefing, Space Command had not yet
addressed two partially satisfied key processes. First, in planning the
evaluation, Space Command did not ensure that one intelligence system to
be tested was certified as compliant. Rather, it only verified that the
software application relevant to the evaluation was compliant. Year 2000
compliance of an application in isolation is of very limited value unless
the system platform that it runs on, as well as other applications
operating on the system, is compliant. As such, both JCS guidance and
GAO's end-to-end test guidance define system, not application, compliance
as a precondition to end-to-end testing. 
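
The distinction matters because an application that stores and processes
four-digit years correctly can still produce wrong results if the platform
or another component it depends on hands it a two-digit year. The short
Python sketch below shows that failure mode with hypothetical components;
it is illustrative only and does not depict the intelligence system at
issue.

    # Illustrative sketch only: both components are hypothetical stand-ins.
    from datetime import datetime

    def platform_clock():
        """A non-compliant platform service that reports a two-digit year."""
        return datetime(2000, 1, 3).strftime("%y-%m-%d")    # "00-01-03"

    def record_age_in_years(record_year, clock_reading):
        """A 'compliant' application: it stores four-digit years internally,
        but it trusts whatever year the platform hands it."""
        current_year = int(clock_reading.split("-")[0])      # 0, not 2000
        return current_year - record_year

    if __name__ == "__main__":
        # A record created in 1998 should be about 2 years old in 2000,
        # but the two-digit year from the platform yields -1998.
        print(record_age_in_years(1998, platform_clock()))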

Second, Space Command did not document whether intelligence systems met
system performance exit criteria for all test cases. Specifically, the
command was supposed to show whether it could process a predetermined
number of transactions within specific time constraints. While command
officials contend that this was done, only one-fifth of the transactions
for intelligence critical tasks were documented. Space Command officials
stated that it was too time-consuming for operators to print screens for
these tasks during the evaluation. 
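
A lower-burden alternative to printing screens, in the spirit of the
alternative data collection strategy discussed below, is to have the test
driver itself time each run and write the transaction count, elapsed time,
and pass/fail result to an archive file. The Python sketch below shows the
pattern; the transaction counts, time limit, and file names are
hypothetical and are not the evaluation's actual exit criteria.

    # Illustrative sketch only: the transaction, counts, and time limit are
    # hypothetical; they are not the evaluation's actual exit criteria.
    import csv
    import time

    REQUIRED_TRANSACTIONS = 100      # predetermined number of transactions
    TIME_LIMIT_SECONDS = 5.0         # specified time constraint

    def process_one_transaction(i):
        """Stand-in for a single system transaction."""
        return sum(range(1000))

    def run_test_case(case_id, log_path="exit_criteria_log.csv"):
        """Run one test case and archive the evidence needed to judge it."""
        start = time.perf_counter()
        for i in range(REQUIRED_TRANSACTIONS):
            process_one_transaction(i)
        elapsed = time.perf_counter() - start
        passed = elapsed <= TIME_LIMIT_SECONDS
        with open(log_path, "a", newline="") as f:
            csv.writer(f).writerow([case_id, REQUIRED_TRANSACTIONS,
                                    round(elapsed, 3), passed])
        return passed

    if __name__ == "__main__":
        print("exit criteria met:", run_test_case("space-control-case-01"))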

At our briefing, we recommended that Space Command amend its final report
to JCS to recognize (1) the uncertainties and risks associated with its
failure to fully satisfy these criteria and (2) the actions it had
underway or planned to address these uncertainties and risks. Space
Command agreed with this recommendation. It plans to amend its final
report to disclose these limitations and to pursue an alternative data
collection strategy for its planned November 1999 operational evaluation
of its intelligence mission in order to verify that intelligence
systems/tasks fully meet performance criteria.

Conclusion

By acting swiftly to address the recommendation we made at our October 1,
1999, briefing, Space Command has demonstrated its commitment to improving
management controls over Year 2000 testing activities, enhanced the
effectiveness and value of its operational evaluation, and mitigated the
risks to its ability to operate effectively in the Year 2000. Further, it
has ensured that DOD managers have complete and reliable information to
use in making informed military decisions. As a result, Space Command has
satisfied the intent of our recommendation, and we are not making any
further recommendations at this time.

We are sending copies of this report to Representative John P. Murtha,
Ranking Minority Member, Subcommittee on Defense, House Appropriations
Committee; Senator John Warner, Chairman, and Senator Carl Levin, Ranking
Minority Member, Senate Committee on Armed Services; Senator Ted Stevens,
Chairman, and Senator Daniel Inouye, Ranking Minority Member, Subcommittee
on Defense, Senate Committee on Appropriations; and Representative Floyd
Spence, Chairman, and Ike Skelton, Ranking Minority Member, House
Committee on Armed Services.

We are also sending copies to the Honorable John Koskinen, Chair of the
President's Year 2000 Conversion Council; the Honorable William Cohen,
Secretary of Defense; the Honorable John Hamre, Deputy Secretary of
Defense; General Henry Shelton, Chairman of the Joint Chiefs of Staff;
Arthur Money, Assistant Secretary of Defense for Command, Control,
Communications, and Intelligence; and the Honorable Jacob Lew, Director,
Office of Management and Budget. Copies will also be made available to
others upon request.

Should you or your staff have any questions concerning this report, please
contact me at (202) 512-6240. I can also be reached by e-mail at
[email protected]. Other points of contact and key contributors to this
report are listed in appendix III.

Sincerely yours,


Jack L. Brock, Jr.
Director, Governmentwide and Defense
  Information Systems

--------------------------------------
/Footnote1/ DOD refers to its combatant commands' end-to-end tests as
  operational evaluations.
/Footnote2/ Joint Staff Year 2000 Operational Evaluation Guide, Version
  3.0, April 1, 1999.
/Footnote3/ Year 2000 Computing Crisis: A Testing Guide (GAO/AIMD-10.1.21,
  issued as an exposure draft in June 1998; issued in final in November
  1998).
/Footnote4/ Configuration management involves establishing product
  baselines and systematically controlling changes made to those
  baselines. Without an effective configuration management process,
  organizations can lose control of the software product, potentially
  produce and use inconsistent product versions, and create operational
  problems.
/Footnote5/ In addition to conducting operational evaluations, the
  military services are conducting system integration testing, and the
  functional business areas, such as personnel and health affairs, are
  conducting functional end-to-end tests. Each of these end-to-end testing
  activities is discussed in detail in Defense Computers: Management
  Controls Are Critical to Effective Year 2000 Testing (GAO/AIMD-99-172,
  June 30, 1999).
/Footnote6/ A "hard" failure is a Y2K-related failure that results in an
  obvious adverse impact to the system. For example, the system shuts
  down, displays erroneous data, or performs other unexpected actions.

BRIEFING ON RESULTS OF GAO REVIEW OF SPACECOM SPACE CONTROL Y2K OPEVAL
======================================================================

(The briefing slides are not reproduced in this text version of the
report.)

OBJECTIVES, SCOPE, AND METHODOLOGY
==================================

At the request of the Chairman, House Appropriations Committee,
Subcommittee on Defense, we selected the Space Command Space Control
evaluation for review to determine (1) if the evaluation was planned,
executed, and documented in accordance with DOD guidelines, and
(2) what the evaluation results indicated concerning readiness and risks.
This operational evaluation was selected in collaboration with the Defense
Inspector General to ensure appropriate coverage of all combatant command
operational evaluations and no duplication of effort.

To satisfy our first objective, we reviewed the evaluation plan, testing
documentation and records, and test results and associated reports. We
also interviewed Space Command officials responsible for Year 2000
operational evaluation planning, execution, and reporting tasks. Further,
we examined the century date rollover testing documents for the
operational evaluation and compared Space Command's operational evaluation
planning, execution, analysis, and reporting actions against JCS
operational evaluation guidance and our Year 2000 testing guide.

To satisfy the second objective, we reviewed Space Command's operational
evaluation results, including its 7- and 30-day reports and system problem
tracking reports. We also interviewed Space Command officials and analysts
responsible for developing operational evaluation assessment
methodologies, interpreting evaluation metrics, and ensuring that
evaluation exit criteria were met.

On October 1, 1999, we briefed Space Command leadership on the results of
our review. Space Command provided oral comments on our briefing slides,
and we have incorporated them as appropriate. We performed our work from
March through October 1999 in accordance with generally accepted
government auditing standards.

GAO CONTACT AND STAFF ACKNOWLEDGEMENTS
======================================

GAO Contact

Randolph C. Hite, (202) 512-6240

Acknowledgements

In addition to those named above, Ronald B. Bageant, Cristina T. Chaplain,
Madhav Panwar, and Yvonne Vigil made key contributions to this report.

(511661)
