[Senate Hearing 107-650]
[From the U.S. Government Publishing Office]
S. Hrg. 107-650
IMPROVED MANAGEMENT OF DEPARTMENT OF DEFENSE TEST AND EVALUATION
FACILITIES
=======================================================================
HEARING
before the
SUBCOMMITTEE ON EMERGING THREATS AND CAPABILITIES
of the
COMMITTEE ON ARMED SERVICES
UNITED STATES SENATE
ONE HUNDRED SEVENTH CONGRESS
SECOND SESSION
__________
MAY 21, 2002
__________
Printed for the use of the Committee on Armed Services
U.S. GOVERNMENT PRINTING OFFICE
81-638 PDF WASHINGTON DC: 2002
---------------------------------------------------------------------
For Sale by the Superintendent of Documents, U.S. Government Printing Office
Internet: bookstore.gpo.gov Phone: toll free (866) 512-1800; (202) 512-1800
Fax: (202) 512-2104 Mail: Stop IDCC, Washington, DC 20402-0001
COMMITTEE ON ARMED SERVICES
CARL LEVIN, Michigan, Chairman
EDWARD M. KENNEDY, Massachusetts JOHN WARNER, Virginia
ROBERT C. BYRD, West Virginia STROM THURMOND, South Carolina
JOSEPH I. LIEBERMAN, Connecticut JOHN McCAIN, Arizona
MAX CLELAND, Georgia BOB SMITH, New Hampshire
MARY L. LANDRIEU, Louisiana JAMES M. INHOFE, Oklahoma
JACK REED, Rhode Island RICK SANTORUM, Pennsylvania
DANIEL K. AKAKA, Hawaii PAT ROBERTS, Kansas
BILL NELSON, Florida WAYNE ALLARD, Colorado
E. BENJAMIN NELSON, Nebraska TIM HUTCHINSON, Arkansas
JEAN CARNAHAN, Missouri JEFF SESSIONS, Alabama
MARK DAYTON, Minnesota SUSAN COLLINS, Maine
JEFF BINGAMAN, New Mexico JIM BUNNING, Kentucky
David S. Lyles, Staff Director
Judy A. Ansley, Republican Staff Director
______
Subcommittee on Emerging Threats and Capabilities
MARY L. LANDRIEU, Louisiana, Chairman
EDWARD M. KENNEDY, Massachusetts PAT ROBERTS, Kansas
ROBERT C. BYRD, West Virginia BOB SMITH, New Hampshire
JOSEPH I. LIEBERMAN, Connecticut RICK SANTORUM, Pennsylvania
BILL NELSON, Florida WAYNE ALLARD, Colorado
JEAN CARNAHAN, Missouri TIM HUTCHINSON, Arkansas
MARK DAYTON, Minnesota SUSAN COLLINS, Maine
JEFF BINGAMAN, New Mexico JIM BUNNING, Kentucky
C O N T E N T S
__________
CHRONOLOGICAL LIST OF WITNESSES
Improved Management of Department of Defense Test and Evaluation
Facilities
May 21, 2002
Page
Wynne, Hon. Michael W., Principal Deputy Under Secretary of
Defense for Acquisition, Technology, and Logistics............. 4
Young, Hon. John J., Jr., Assistant Secretary of the Navy for
Research, Development, and Acquisition......................... 10
Christie, Hon. Thomas P., Director, Operational Test and
Evaluation..................................................... 13
Krings, Hon. John E., Member, Defense Science Board Task Force on
Test and Evaluation Capabilities............................... 21
IMPROVED MANAGEMENT OF DEPARTMENT OF DEFENSE TEST AND EVALUATION
FACILITIES
----------
TUESDAY, MAY 21, 2002
U.S. Senate,
Subcommittee on Emerging
Threats and Capabilities,
Committee on Armed Services,
Washington, DC.
The subcommittee met, pursuant to notice, at 9:44 a.m. in
room SR-232A, Russell Senate Office Building, Senator Mary L.
Landrieu (chairman of the subcommittee) presiding.
Committee members present: Senators Landrieu, Levin, Bill
Nelson, Bingaman, and Roberts.
Committee staff members present: David S. Lyles, staff
director.
Majority staff members present: Daniel J. Cox, Jr.,
professional staff member; Kenneth M. Crosswait, professional
staff member; Richard W. Fieldhouse, professional staff member;
Peter K. Levine, general counsel; and Arun A. Seraphin,
professional staff member.
Minority staff members present: Judith A. Ansley,
Republican staff director; Edward H. Edens IV, professional
staff member; Brian R. Green, professional staff member;
William C. Greenwalt, professional staff member; Carolyn M.
Hanna, professional staff member; Mary Alice A. Hayward,
professional staff member; Ambrose R. Hock, professional staff
member; and Thomas L. MacKenzie, professional staff member.
Staff assistants present: Dara R. Alpert and Leah C.
Brewer.
Committee members' assistants present: Marshall A. Hevron
and Jeffrey S. Wiener, assistants to Senator Landrieu; William
K. Sutey, assistant to Senator Bill Nelson; John A. Bonsell,
assistant to Senator Inhofe; George M. Bernier III, assistant
to Senator Santorum; Robert Alan McCurry, assistant to Senator
Roberts; Douglas Flanders, assistant to Senator Allard; James
P. Dohoney, Jr., assistant to Senator Hutchinson; and Derek
Maurer, assistant to Senator Bunning.
OPENING STATEMENT OF SENATOR MARY L. LANDRIEU
Senator Landrieu. Good morning. Our hearing will come to
order on our test and evaluation (T&E) oversight. Let me begin
by thanking all of our witnesses for being here this morning
and thank my Ranking Member for his good work in support of
this subcommittee and for his able leadership on this
subcommittee for many years.
I will get right into my opening statement. We have just
one panel this morning. Both Senator Roberts and I will have
opening statements, we will then hear from the four of you, and
then go into a short round of questions. This is a very
important subject matter to both of us, and it was Senator
Roberts' suggestion that we have this meeting to hear from the
Department of Defense about the recommendations that I have
made on test and evaluation. Our subcommittee is very
interested in making sure that our test and evaluation process
is what it should be, not just for the warfighter and for their
safety, but for the taxpayers who are looking for a strong and
smart military, and it is the goal of our subcommittee to help
get us to that goal.
So because of that goal, this subcommittee 3 years ago
initiated legislation requiring a task force to report on the
state of the Department's test and evaluation facilities. That
report, as you all know, because two of you were involved in
processing it, in December 2000 found that the services had
reduced their institutional funding of the Department's major
test and evaluation ranges by about $1 billion since 1990. As a
result of this inadequate funding the task force concluded,
quote, ``Testing is not being conducted adequately, and there
is growing evidence that the acquisition system is not meeting
expectations as far as delivering high-quality, reliable, and
effective equipment to our forces.''
Just to cite a few of the findings of that report, the
recapitalization rate for the Department's T&E infrastructure
has reached 400 years. The aging T&E infrastructure increases the probability
of failure in test support capabilities that could cause
significant and costly schedule slippages. In recent years, 66
percent of the Air Force programs have stopped operational
testing due to major systems or safety shortcomings, which was
quite alarming.
Since 1996, approximately 80 percent of Army systems tested
failed to achieve reliability requirements during operational
testing. As a result, the Director concluded the acquisition
process failed to deliver systems to the warfighter that meet
reliability and effectiveness requirements, so obviously we
have some work to do here. There are probably a number of ways
that we could correct this deficiency, but if these
deficiencies are acknowledged today, clearly the status quo is
not going to do.
Today we will hear from two representatives of the
Department of Defense, our Director of Operational Test and
Evaluation (DOT&E), Mr. Christie, and from Mr. Jack Krings, the
former Director of Operational Test and Evaluation who played a key
role in this task force.
I want to say that we welcome the Department's views on
this proposed legislation, and we will do our best to address
the legitimate concerns that you raise today. We want to make
sure that we get this legislation right, or that if we
acknowledge that these deficiencies exist, we actually come up
with a way to significantly improve them.
At the same time, I want to say how strongly I share the
views expressed by this report. As it says, we owe it to our
men and women in uniform to ensure that the weapons systems
they carry into battle will work as they are intended. Adequate
testing of weapons is not an abstract concept. Lives depend on
it, and taxpayers, particularly in this day, as we reach and
stretch for every dollar to protect us against
counterterrorism, would demand that we not waste our resources
by putting something in the field and then having to go back to
the test lab.
So this testing is important, and I think the way that we
are funding it, there is a disincentive, because the money
comes out of the procurement, basically, or the other parts of
the program. There is a disincentive for testing that I think
is crucial to the development of these very sophisticated
systems.
So with that, let me call on Senator Roberts to make a
statement. We hope to work with you, gentlemen, to see what we
can work out in this regard.
Senator Roberts.
STATEMENT OF SENATOR PAT ROBERTS
Senator Roberts. Thank you, Madam Chairman. This morning,
the Subcommittee on Emerging Threats and Capabilities meets to
receive testimony on legislative proposals to reform DOD's test
and evaluation infrastructure, as you have already indicated.
The December 2000 Defense Science Board (DSB) report and the
latest annual report by the Director of Operational Test and
Evaluation raise serious concerns about the Department's test
and evaluation capabilities. I commend you, Madam Chairman, for
your attention to these problems.
I think we need to ask the Department whether the solution
that has been put forth in this legislation is the one that
they prefer and can implement in the real world of
transformation and the ever-changing asymmetrical warfare
threats that we face today. During the markup of this year's
defense authorization bill, I expressed some reservations about
this proposal, but the overall goals of the legislation are
indeed very laudable. I had concerns about how this proposal
was developed. The committee had not held hearings or engaged
the Department. For this reason, I wanted to have this hearing
in order to hear the Department's views.
I certainly thank my chairman and my colleague for holding
this hearing, but I still think we may have put the cart before
the horse in addressing this issue. The committee has already
acted on this proposal, and we are now simply holding the
hearing, but the chairman is right, problems have been
identified with the current funding, capabilities, and
facilities in the test and evaluation infrastructure, and that
is something that we should address. However, this subcommittee
needs to adequately discuss the underlying approach of how the
Department tests its weapons systems.
The test and evaluation process has grown up around an
acquisition culture which has been all too content in taking a
15 to 20 year time period to develop and deploy any new weapons
systems. Does the entire test and evaluation process need to be
reevaluated in a period of rapid commercial technology
development, joint experimentation, spiral development, and
rapidly fieldable prototypes and, if so, will the conclusions
reached 2 years ago by the Defense Science Board hold up to
scrutiny under new criteria? We need to adapt testing to new
ways of buying, rather than simply conform our buying to old,
inflexible ways of testing.
I also have substantive reservations about the legislative
proposal contained in the committee-passed bill that I would
like our witnesses to address. For example, the establishment
of the DOD test and evaluation enterprise would continue a
trend of centralizing various service functions. This could
further erode the military services' Title 10 responsibilities
for equipping and also training our forces. If centralization
really is more efficient, then why should the military services
have any acquisition function at all?
I am confident that the military services do add value. I
am going to ask some questions about that, so I am a little
skeptical about the moves to complete centralization of
additional acquisition functions. I am also concerned about the
test waiver provision in the bill, which appears to be somewhat
inflexible. This provision may be establishing a long and
bureaucratic process for a program office to obtain needed and
legitimate waivers. The net effect is that it may take even
longer.
However, I thank you, Madam Chairman, for holding today's
hearing, and I look forward to hearing from our witnesses on
the Department of Defense's efforts to address these
challenges. I hope we can learn from this hearing, and at the
end of the comments, I would tell my colleague and friend, that
I think we might be able to work this out. I have already
talked to Mr. Wynne about the possibility of having the
Department report back to the subcommittee in a very short time
period with regard to how they would implement either this
legislation, or any suggestions that they might make which
would certainly give us a smoother ride when this bill gets to
the floor.
So with that, I thank you very much for the hearing.
Senator Landrieu. Thank you, Senator Roberts.
Senator Bingaman, do you have any opening remarks?
Senator Bingaman. I do not.
Senator Landrieu. Thank you.
Mr. Wynne, if you would proceed please.
STATEMENT OF HON. MICHAEL W. WYNNE, PRINCIPAL DEPUTY UNDER
SECRETARY OF DEFENSE FOR ACQUISITION, TECHNOLOGY, AND LOGISTICS
Mr. Wynne. Thank you very much, Madam Chairman and members
of the subcommittee. Thank you for inviting me here today to
talk about the proposed legislation to improve the management
of the Department of Defense test and evaluation facilities.
Our military is the premier force in the world, and part of
their superiority is due to the systems that support them.
Developing, testing, producing, and supporting these systems is
what we do best. Everyone, the testers, the acquisition
personnel, and the requirements community are all motivated to
provide the very best systems possible.
The perspective I bring is broader: that of acquisition itself.
While testing provides the extra assurance that the system will
work and meet its requirements, test and evaluation is but one
of the many supporting processes that deliver military
capability to our warfighters, and the bottom line is, our
systems work. They are working every day across the world from
training, to peacekeeping, to warfighting, and if you would not
mind, Madam Chairman, I would like to put up a chart to
illustrate what I perceive is the cycle that we are talking
about.
[The chart referred to follows:]
The outer circle is the weapons systems development that
begins with the Joint Requirements Oversight Council, which
consists of the Vice Chairman of the Joint Chiefs of Staff as chair,
and all of the service chiefs as members. They vote on the
requirements that the weapons system has to meet before it goes
to the servicemen and women of this country. They establish the
requirements that the systems have to meet. The testing and
development is a smaller circle, and this is where this
legislation is focused, in that smaller circle, to essentially
correct what is perceived to be deficiencies in the larger
circle.
In fact, test results enhance the development of the larger circle,
because they highlight to the Joint Requirements Oversight
Council some overreaches or underreaches in weapons
development. The point here is that the waiver dispositions are
more important than the waivers themselves, and ultimately
every waiver that is created in test must be disposed of, or
the operational requirement must be changed. To that effect, some
of the items cited for the Army, for example, have shown
astounding reliability increases since this report was
published, based on what was found during developmental and
operational testing.
The acquisition process today is roughly in balance. There
is a natural tendency for the acquisition community to want to
get systems to our warfighters faster so that the warfighter
will have the advantage of the best technology available today.
It is also natural for the test community to have a desire to
hold back systems from being fielded until all problems are
identified, weaknesses are fixed, and the system meets all
requirements.
All parties are part of an open debate, and have a seat at
the table. Decision makers get the best advice available, and
ultimately the warfighter benefits from this process.
Therefore, while this proposal contains some areas of mutual
concern, we believe it will impede the overall weapons systems
development cycle and therefore are opposed to it in its
present form.
We recognize that there are some problems with our current
test process, and many of our facilities appear to lack
appropriate funding, but the proposed legislation will not fix
these problems. The legislation creates an imbalance in the
acquisition process by providing more control to the Director
of Operational Test and Evaluation, raising his authority as a
member of the acquisition team. Shifting control will not
correct our problems. The ranges will not be funded to the
level we prefer, and the test waivers will still be necessary,
and test failures will still recur.
In fact, as we proceed through trying to shorten the cycle
of development to get this technology to our warfighters, we
would anticipate that we would encounter more risk, not less
risk, and that therefore we would be encountering more test
failures, not fewer. I would like to address these
two issues of test waivers and infrastructure funding. The
Defense Science Board and the DOT&E reports highlight specific
programs as problems because of the number of failures or
waivers during operational tests. Tests cannot be viewed as a
pass/fail situation. Test failures can provide valuable
information, and waivers and deviations may not only be
necessary but, because of the technical complexity of the
business we are in, may make good military and business sense.
The proposed legislation attempts to eliminate deviations
from the test and evaluation master plan by requiring approval
of either the DOT&E or the Secretary or Deputy Secretary of
Defense. I do not think we really want the Secretary of Defense
to be approving test plan waivers for acquisition programs. He
has enormous demands on his schedule, and requiring signatures
on test plan waivers at his level would slow testing, and would
slow system fielding at the end of the day. A better solution
would be a notification that is provided in the DOT&E annual
report as it relates to specific systems and disposition of the
waivers and process to provide this committee and others in
Congress feedback on how the waivers and test failures created
actually were fixed later in the cycle.
The DOT&E already must approve the operational test plans
under the test and evaluation master plan, and is aware of test
plan changes. Giving the DOT&E more control over this waiver
process does not mitigate the need for waivers. Waivers are
given because the designers know a system will not meet a
specific requirement, or failed some portion of the test, but
would still provide very useful military capability to the
warfighter.
A good example of why waivers are important is the F-18E/F
program. This system had about 50 waivers once it failed its
operational test. Some of the waivers were due to one of its
subcomponents, the advanced targeting forward-looking infrared
(ATFLIR) system. This system was simply not ready, and
therefore could not be tested with the rest of the aircraft.
Instead, an existing FLIR (forward-looking infrared) system was used, and
those portions of the tests were waived. Once that system
becomes ready, it will be installed and tested. Of those
original 50 waivers, 30 have been tested, and the remainder are
tentatively scheduled for completion by 2006. So I ask, should we have
slowed the process and perhaps impacted the production line of
F-18s until the ATFLIR is ready? I do not think so.
The same could be said of the Predator, which did not pass
tests and thus needed waivers to be fielded. Should the system
be slowed and the fielding delayed of a system that we know to
be better than anything we have used before? It should be okay
to have waivers and failures, because the big picture is that
even if a system does not meet all of its requirements, it may
still have greater capability than anything that currently
exists.
One of the reasons cited for waivers in the DSB and DOT&E
reports is that our systems are not receiving enough
development testing before proceeding to operational testing.
This is happening in some cases. We already have work in
progress to resolve this issue. This is sometimes the case
because there is not enough money to sufficiently test programs
because they were underfunded from the start, or they are
operating under such tight budget constraints that anything
less than a fully successful test program requires additional
testing and, therefore, additional dollars.
There are things we are doing, in fact, to correct this
process. We are realistically pricing programs and also
requiring full funding of our programs. The defense acquisition
executive has mandated that the Cost Analysis Improvement Group
(CAIG) estimates are used unless there is a compelling reason to use
different estimates. This will help ensure that all elements of
the program, to include testing, are not short-changed because
of affordability problems.
Another reason programs are reducing developmental testing
is schedule crunch, a reaction to the tremendous pressure we
put on program managers to speed up systems development. This
pressure comes not just from the Department but also from you,
Congress, to get this technology into our warfighters' hands,
but again, the readiness of a program for tests must be weighed
and balanced against the other concerns for military utility,
the cost, and the schedule, and that balance is currently in
hand, and management is receiving adequate information to make
decisions.
The other major issue cited is inadequate funding for test
infrastructure. The proposed legislation creates a centralized
activity to manage the test ranges, and fences range investment
accounts. The present budget provides the best balance of
funding for the full scope of the DOD's mission. There is just
not enough money in the budget to do everything we want. The
test community has a place at the table when decisions are made
to allocate funding. Frankly, setting up a fenced account will
only move money from other needs, not solve funding shortfall
problems.
Also, centralized management will not resolve the problem
of range management, but will only result in a new office that
will require extra reporting, extra financial management, and
ultimately delay effective management. This amounts to another
agency that could slow down our acquisition cycle times even
more, and it could have an even worse impact on training, which
is a large user of test ranges, but will have little say in
this investment.
In managing our T&E facilities there is a delicate balance
of training and testing, because training for both service and
joint exercises often involves the test ranges. A good example
of this is Nellis Air Force Base, which is one of the Air
Force's primary test facilities for aircraft and weapons
systems, and home to Red Flag, an annual exercise that involves
not only joint services, but also international forces.
In fiscal year 2000, 93 percent of sorties flown at Nellis,
83 percent of sorties flown at Eglin Air Force Base, and 60
percent of the sorties at Edwards Air Force Base were, in fact,
for training, not for tests. Placing the ranges and facilities
under the control of the DOT&E, rather than the services where
it is presently held, could have an impact on the readiness of
our servicemen if it becomes a contentious issue of investment.
We recognize that we have more challenges ahead,
specifically as we continue to emphasize evolutionary
acquisition and spiral development to shorten the weapons
systems acquisition and fielding cycle times. Because the
Secretary and Congress desire to speed up the transition of
technology into usable equipment, we may see more test waivers
as we add iterations of capability. Our desire is to get the
warfighter equipment that is better than anything they have in
order to give them a decisive advantage. We think that the
current balance allows for that.
We must continue to involve the DOT&E in the establishment
and exercise of test programs and spiral development, but just
as we do not want the designer to be responsible for setting
test requirements, the specific design requirements should be
left to the services and not driven by what the tester thinks
must be tested to ensure effectiveness and suitability.
The tests developed do provide the right
kind of management data for good decisions and corrective
actions. Rather than striving for zero waivers, we should
strive for better data and better analysis, which the current
DOT&E provides extremely well.
There are some parts of the legislation that we do support.
We agree on the need to include the test functional community
in our ongoing human capital strategic planning and our
contribution-based workforce demonstration project. We
also agree that we need a DOD-wide accounting system for
testing, but this should be part of the ongoing financial
management renovation and management program that we are
involved in.
In closing, we appreciate the good intentions of Congress.
However, we feel that this proposed legislation will not result
in the goal of an integrated and well-managed T&E process, nor
will it provide a well-managed and integrated acquisition
process, which is the larger circle that we referred to. Our
intention is to provide better capabilities to the warfighter,
faster. We would like to continue to work with you
and the DOT&E to find solutions to the challenges that the
Defense Science Board raised.
Thank you. I am happy to have John Young with me today, the
Navy's senior acquisition executive. He would like to provide a
few comments from the perspective of the services, if you do
not mind.
[The prepared statement of Mr. Wynne follows:]
Prepared Statement by Hon. Michael W. Wynne
Madam Chairman and members of the subcommittee:
Thank you for inviting me here today to talk with you about the
proposed legislation ``Improved Management of Department of Defense
Test and Evaluation (T&E) Facilities.'' The importance of T&E in
ensuring our systems work is critical and we appreciate your interest
in this topic. Our military is the premier force in the world and part
of their superior advantage is due to the exceptional systems that
support them. Developing and fielding systems so that a soldier is
confident a gun will fire when the trigger is pulled, a bomb will find
its correct target, or a communication system will send a call for
reinforcements, is what the Department's acquisition, technology and
logistics workforce does best. There are no second chances in our
business. Testing provides the extra assurance that a system will work
when it has to and under all types of conditions.
Improving our T&E process has been the subject of many studies such
as the Defense Science Board's (DSB) Report on Test and Evaluation
released in September 1999, the DSB's Report on Test and Evaluation
Capabilities released in December 2000, and most recently, the Director,
Operational Test and Evaluation (DOT&E) Annual Report for fiscal year
2001. These reports investigated and identified ways to improve our
process and your legislation reflects many of the recommendations from
those studies.
Some of these same recommendations were reviewed last August and
again in December by Secretary Rumsfeld's Senior Executive Council
(SEC). The SEC, a council made up of the Secretary, Deputy Secretary,
the Under Secretary for Acquisition, Technology and Logistics, and
Service Secretaries, reviewed the issues of centralized funding and
management of the test and evaluation infrastructure. As a result of
the SEC discussions, the Service Secretaries approach is: (1) for the
services to work together to effectively utilize and manage resources
across the three services; (2) that neither a separate OSD range
management office nor centralized funding is necessary; and (3) that
the services already have sufficient incentives to effectively manage
these enterprises and to adequately fund needed facilities. We
presently have in place a Vice-Chief-level Board of Directors that
provides cross-service use and accountability of T&E facilities.
Furthermore, OSD does influence T&E funding through the Defense
Planning Guidance (DPG) and the budget process.
We agree with some areas of the proposed legislation such as
section 235 that calls for human capital planning of the T&E workforce
and section 234 that creates a single DOD-wide accounting system. The
Department is working on both of these areas with our human capital
planning efforts and our initiative to improve financial management
systems across DOD. We have reservations about creating a centralized
activity to manage the test ranges as proposed in section 231 or
creating fenced range investment accounts as proposed in section 232
and section 233. We believe that centralized management likely would
not resolve the problem of range management but could result in a new
office that will require extra reporting, extra financial management,
and ultimately delay effective management.
One key area the proposed legislation fails to recognize is the
fact that our test ranges and facilities also support vital operational
training as well as operational testing. T&E is the insurance policy
that assures the Department that a system will meet its requirements,
and as with any insurance policy, balance is the key. In managing our
T&E facilities, there is a delicate balance of testing and training
because training for both service and joint exercises often involves
the test ranges. A good example of this is Nellis Air Force Base, one
of the Air Force's primary test facilities for aircraft and weapon
systems, and home to Red Flag, an annual exercise that involves not
only joint services, but also international forces. In fiscal year
2000, 93 percent of sorties flown at Nellis, 83 percent at Eglin Air
Force Base, and 60 percent of the sorties at Edwards Air Force Base
were for training. Likewise, the Navy's Atlantic Underwater Test and
Evaluation Center (AUTEC) facilities support T&E of many underwater
systems, but also support sound testing of submarines, necessary for
pre-deployment operational readiness. Both in the DSB report and the
proposed legislation, it is very unclear as to what authority the
central agency holds, but it seems to unbalance the test and training
that each service manages each year.
The legislative proposal is also not clear on the delineation
between developmental testing and operational testing. Developmental
testing is important for learning a system's characteristics and
capabilities and the results of such tests often impact the design.
Operational testing confirms the system's performance once design is
complete. Most of the T&E performed at the test ranges and facilities
is developmental in nature and not within the purview of the DOT&E. In
fiscal year 2001, the Navy's developmental testing accounted for 58
percent of the total workload. Delegating control of the test ranges
and facilities to DOT&E would put developmental testing under the
cognizance of operational test. This would create significant cost and
schedule impacts to crucial developmental testing. The very nature of
the test community is to continue testing until all issues are
resolved. Placing control of the test facilities under the testers
could create an endless do-loop of testing.
The imbalance between test and training, and developmental and
operational testing, would be compounded when money is moved from
program accounts and fenced in range investment accounts as recommended
by section 232. While we are concerned with infrastructure issues such
as better calibration, or getting more data more quickly, placing
investment in a ``frozen account'' might result in unbalanced
investment that will impact training. We are concerned with the
continuing problem surrounding overhead costs and their impact to
program managers (PMs) when they use the test ranges and facilities.
However, a range investment account established as a percentage of the
RDT&E account from each service would essentially result in a tax on
each of the PMs, regardless of their test requirements, and would
introduce certain rigidities into the system that would be undesirable.
The DSB reports highlight a potential management issue with regard
to the quantity of waivers from approved test requirements in the TEMP,
but do not address the actual impact of these waivers on our forces.
The proposed legislation in section 236 eliminates deviations from the
Test and Evaluation Master Plans (TEMPs) without the approval of the
DOT&E or the Secretary or Deputy Secretary of Defense, without re-
delegation, and requires notification to Congress. This provision
removes any flexibility in testing, which is undesirable when we are
weighing a system's readiness against the need to provide it to the
warfighter. DOT&E already must approve test plans under the TEMP and is
aware of test plan changes. The proposed language is not clear as to
the level of deviation that is addressed. Assuming it refers to major
test events and not specific system characteristics, threat
presentations, or other program- or tester-level decisions, an
alternative approach could be to provide notification in the DOT&E
annual report as it relates to specific systems.
We recognize we have more challenges ahead, specifically as we
continue to emphasize evolutionary acquisition and spiral development
to shorten the weapon system development life-cycle. Spiral development
allows us to get militarily useful capability to our warfighters at
less cost by producing and deploying systems based on mature
technologies that will satisfy only a portion of the objective need.
Because the Secretary desires to speed up the transition of technology
into usable equipment through incremental fielding of capability, we
may need the increased flexibility that test waivers can provide as we
add iterations of capability, especially if the performance of that
technology is not completely understood. Additionally, in order to
obtain an early understanding of what we are facing from a support and
maintenance point of view, we may want to deploy equipment that may
require prudent testing waivers. In certain cases, development and
operational testing, by their very nature, cannot exactly replicate the
real world, and we need to gain real world experience to get the most
accurate level of performance. Many times the early gear is for
training units, which is the perfect place to gain feedback and
introduce corrective actions. Our desire is to get to the warfighter
equipment that is better than anything they have. Safety of our people
will always be our number one concern, but beyond safety, we must not
let the best be the enemy of the good when it comes to operational
requirements.
In closing, I want to express my appreciation to Congress for their
support. Congress has long been a valued partner in our quest for
change throughout the Department. The T&E area has been no different.
We appreciate the support of Congress, but we feel this proposed
legislation will not achieve the goal of an integrated and
well-managed T&E process.
Thank you for the opportunity to provide this statement for the
record.
Senator Landrieu. That will be fine. Thank you, Mr. Wynne.
Mr. Young.
STATEMENT OF HON. JOHN J. YOUNG, JR., ASSISTANT SECRETARY OF
THE NAVY FOR RESEARCH, DEVELOPMENT, AND ACQUISITION
Mr. Young. Madam Chairman, distinguished members of the
subcommittee, thank you very much for this opportunity to
discuss the management of the Defense Department's test and
evaluation facilities.
One of the mandated responsibilities of the Service
Secretaries is the requirement to train and equip their
respective services. The acquisition process implied in this
responsibility includes taking the necessary steps to ensure
that the systems that we put in the hands of our soldiers,
sailors, airmen, and marines will operate as intended in combat
situations. Lives depend on it. In order to fulfill this
obligation, test resources and facilities are an integral part
of each service's acquisition process, and must be maintained
by the services in order to provide both acquisition and life
cycle support to our systems.
The most fundamental aspect of our acquisition process is
that we continually conduct test and evaluation of systems
throughout all stages of development. ``Build a little, test a
little, and learn a lot'' does work, and that is how we are doing
business. By ``test a little,'' I really mean a lot of testing
along the way, not just a few large tests at major milestones.
This testing philosophy becomes even more crucial in an
evolutionary or spiral acquisition process as we specifically
strive to deliver capability to the fleet today that is good
enough, while continuing development on the ultimate solution
for the future.
As the Navy's Service Acquisition Executive (SAE), and
speaking for the other SAEs, we are all interested in
optimizing the test infrastructure throughout this entire
process. The Major Range and Test Facility Base (MRTFB) facilities
discussed in this proposed legislation are just one part of the
overall T&E infrastructure that we work very hard to support.
If you take the Arleigh Burke class Guided Missile Destroyer
(DDG) as an example, we support contractor facilities where we
conduct extensive testing, including many developmental tests,
or DT events. We support the Aegis Computer Program Center in
Dahlgren, Virginia, where we do extensive software development
and integration testing. We support the Surface Combatant
Systems Center at Wallops Island, Virginia, where we test and
evaluate developmental and in-service systems together. We
support the mechanical and electrical test facility at the
Surface Ship Engineering Station in Philadelphia, Pennsylvania
where we develop, test, and evaluate integrated engine, damage
control, and navigation systems. We support the Naval Surface
Warfare Center in Dahlgren, Virginia, where we conduct live
fire gun evaluations, and finally, the Navy supports the AUTEC
Range, a MRTFB facility where DDGs undergo test and evaluation
of systems at sea during combat systems qualifications trials.
Each element of this integrated test infrastructure plays
an essential role, and modernization and sustainment decisions
must be made considering the complete test infrastructure. It
is this total integrated testing infrastructure that Secretary
England and his colleagues believe must be managed within each
respective service.
The test resources also go beyond facilities and equipment;
they include the people. Each of the services works hard to
develop officers and civilians who have experience in the test
community, as well as on acquisition programs. New platforms
and weapons benefit greatly from the service-specific
experience of people manning the test ranges. Further, the
entire Defense Department benefits when these skills are
brought to bear on the test programs of other services. It is
not necessary or helpful to centralize the funding and
management of these facilities and of the skilled people who
oversee testing activities.
Like the Army and the Air Force, the Department of the Navy
continually seeks to ensure that there is a balanced, full
spectrum test infrastructure. To break a portion of these
facilities out from the whole and fence the resources that go
with them would lead to suboptimization of the overall
integrated management that I talked about, and would not
recognize all of the facilities that are necessary to carry out
development tests and operational tests. As you have
heard, all of the Service Secretaries felt very strongly about
retaining this T&E facility responsibility and oversight when
the issue was considered before the Senior Executive Council
earlier this year.
In addition to the need for integration across the range of
test ranges and facilities, the services also have integrated
the test facilities into their engineering capabilities. For
example, as part of the previous four rounds of BRAC, the
Department of the Navy has created full spectrum Warfare
Centers. These Warfare Centers support research, development,
test, and evaluation, as well as in-service engineering for our
existing assets. Test resources and facilities are critical to
the way these full spectrum Warfare Centers develop and support
Navy and Marine Corps systems. MRTFB ranges and facilities are
integral parts of many of these Centers. The synergy developed
from this collocation and the sharing of human and equipment
capital has greatly improved Navy and Marine Corps acquisition
programs. An effort to disassociate the test and evaluation
facilities from our Warfare Centers would damage this synergy.
As you have heard, Navy Major Range and Test Facility Bases
are also used for more than operational testing. In fiscal year
2001, development testing was 58 percent of the workload, and 15
percent of the workload was for other Department of Defense
users. In that year, our Navy MRTFB ranges and facilities were
used for F-22, B-2, C-17, and Patriot testing, and only 4
percent of the fiscal year 2001 workload was for Navy
operational testing, while almost 15 percent supported
operational readiness through training and other uses. The
Defense Department is efficiently and very effectively using
all its MRTFB and other test assets.
Within our overall T&E planning, the Navy has a three-step
process to aggressively manage its MRTFB resources and
facilities. First, the MRTFB competition process validates
whether or not newly nominated facilities should be included in
the MRTFB, and revalidates whether the existing
facilities should remain. Second, budget reviews starting at
the individual billet level are conducted to determine the
required usage and the funding that is required for each MRTFB
facility. Finally, rigorous investment reviews are conducted
using documented investment road maps to validate test and
evaluation proposals. Through these processes, the overhead
costs of MRTFB facilities are determined and centrally funded
under the Navy's test and evaluation sponsor, N91. Development
and acquisition programs are charged the incremental costs of
the testing and operations at these facilities.
Finally, Admiral Fallon, the Vice Chief of Naval
Operations, is the Navy's member on the Tri-Service Vice Chief
Board of Directors. This group provides coordinated oversight
and management of the various MRTFB facilities.
The bottom line is that we have a plan and oversight
process, and we support this plan within our Department-wide
priorities, and we maintain the facilities that are used by all
components of the Defense Department.
Likewise, consistent with Secretary Wynne's comments, the
Navy has a specifically defined process for granting waivers to
the testing conducted under a Test and Evaluation Master Plan.
Today's complex weapons consist of multiple integrated
subsystems, and the entire system cannot be stopped for the
delay of a single subsystem. Test exceptions follow a rigorous
review process that includes the Program Manager, Program
Executive Officer, Commander Operational Test and Evaluation
Force, the Resource Sponsor, and the Navy's Executive Agent for
T&E, N91. If the program is under DOT&E oversight, we must gain
written concurrence from DOT&E for exceptions and waivers.
Of over 315 programs, only 12 in the Navy have requested
exemptions, resulting in a total of 93 test requirements waived
or deferred. Mr. Christie has noted that he believes the
services have successfully addressed some of the concerns about
the waiver process.
To summarize, MRTFB facilities are an integral part of a
total test infrastructure for each service. Further, this test
infrastructure is an integral part of our laboratories, warfare
centers, and development programs. The services are budgeting
the cost of operating these facilities within the resource
constraints that affect every program. Finally, when it is time
to test, there are rigorous processes to ensure that all
requirements are tested or appropriately deferred to a future
test.
We want to continue to communicate fully and openly with
Congress, industry, our warfighters, and our acquisition
professionals on these issues. We all share a common goal of
doing everything it takes to make sure our service members are
provided with the safest, most dependable, and highest-
performance equipment as quickly as possible within available
fiscal constraints. We appreciate the support provided by
Congress, and look forward to working together with this
subcommittee toward this goal.
Senator Landrieu. Thank you very much. I appreciate both of
your statements, and would now ask Mr. Christie and Mr. Krings
if you will--and you do know that your full testimony will be
put in the record, so you might want to take this opportunity
just to summarize all your statements so we can get to some
questions.
STATEMENT OF HON. THOMAS P. CHRISTIE, DIRECTOR, OPERATIONAL
TEST AND EVALUATION
Mr. Christie. Yes. I am also pleased, Madam Chairman,
Senator Roberts, and Senator Bingaman, to have this opportunity
to discuss this proposed legislation. As you probably know, I
served on both of the DSB panels that we are talking about the
results of--they made quite a few recommendations--but today I
appear here not as a member of either of those panels, but as
the Department's DOT&E, a position for which this committee
honored me with confirmation nearly a year ago.
Never in my wildest dreams, when I served on the DSB panels
of a couple of years ago, did I dream that I would be called
upon to implement all of those recommendations.
Senator Landrieu. Had you known, you would have made less
of them?
Mr. Christie. No, no, no, I am not saying that. I
understood at the time the difficulty that would ensue. I just
did not realize I would be the stuckee.
I have, in fact, given, as you requested, an assessment of
all of your proposals in my written statement. I am not going
to cover those, but there were 25 recommendations in this last
report, and within my role as DOT&E, I think we have been able
to address 16 of these within the building in the past year--
some with a lot of success, some with less success, and some
with no success, but we have attempted to take them all on. The
other nine lay outside my responsibilities.
Let me just talk about a couple of those that we are in the
process of working that impinge on this entire problem.
Value of testing. This may seem like a strange topic.
However, because of the way testing is currently planned and
funded, articulating its value has become critical to the
survival of the test ranges and adequate test and evaluation.
As more and more of the costs of tests and costs of the ranges
are being charged directly to programs, the ranges find
themselves having to sell their capability to program managers.
As test and evaluation overhead and maintenance costs have
shifted to the individual acquisition programs, the cost of
testing to program managers has risen. Thus, a program manager
who chooses to go to a specific range for testing is charged
not just for the cost of the test, but also for a large
fraction, in some cases, for the upkeep and maintenance costs
of that range.
Needless to say, program managers are not anxious to pay
for more than their direct costs, and I do not blame them.
Unfortunately, too often, program officers have tended to avoid
testing under these circumstances. This is especially true in
development testing, where the record shows that we have
brought too many systems into operational tests--and the
discussions by Mr. Wynne and Mr. Young that went on earlier
dealt with these--before they were ready.
The latest Army estimate is that 75 percent of their
systems fail to meet even 50 percent of their reliability
requirements in operational tests. My office has been working
with the test community in an effort to develop some sort of an
approach to express return on investment from testing for
program managers.
Quality of testing. The DSB found that ``testing is not
being done adequately.'' The quality of testing can suffer when
testing is avoided, when adequate capabilities to test do not
exist, or when the testing is not funded properly in either
magnitude or phasing. The DSB found existing policies that were
being used to avoid or defer some testing and, more
importantly, to avoid evaluation.
I sent a memorandum to the services on this late last year
asking them to cease the unilateral waiving of requirements for
testing (not the waiving of the requirements themselves) and
requiring that all operational requirements be part of their
evaluation. There has been real evidence of change, as John also
spoke to in his statement, and I think we are well on the way to
having solved that problem within the Department.
The last threat to the quality of Government T&E discussed
by the DSB is funding. The DSB considered the magnitude of the
funding allocated to T&E by the services as well as its
phasing, and by phasing I mean that development testing is not
supported well enough or early enough; hence, systems get into
operational tests with too many problems. This may sound as if
it is a developmental test problem. It is in part, but as I
said before, one significant root cause of this problem is how
the tests are funded.
I believe the funding structure has to change to solve this
problem, and again, this is the institutional funding versus
program funding that I am talking about.
The DSB also found the state of the infrastructure, to
include physical plants, ranges, real estate, instrumentation,
and other analysis capabilities--targets, personnel, and so
forth--in need of near-term investment and high-level emphasis.
The report identified three areas just as examples, and I will
not go into those now, but adequate targets was one of the
biggest problems that we found.
Let me turn now to the recommendations that were not
implemented. They center on the management of T&E resources.
The DSB, as part of its response to this committee, recommended
that DOD create a test and evaluation resource enterprise. As
envisioned by the task force, the enterprise would (1) fund
and manage the DOD T&E organization's workforce and
infrastructure; (2) be at the OSD level under my office; (3) be
funded by transferring the appropriate military services'
funding for investment, operations, and maintenance of the MRTFB
test resources and facilities to the enterprise; and (4) allow
the operation of the test facilities to remain under service
control. We are also
addressing this problem in the building.
For example, defense planning for the fiscal year 2004
budget includes two actions that bear on our efforts to improve
T&E policies, procedures, and infrastructures. In that planning
guidance, we are called upon to provide by this fall an
assessment of how best to make the ranges able to support
affordable, adequate testing. We are further asked for a review
of what changes are needed to harmonize the Department's new
acquisition strategies discussed by Mr. Wynne and Mr. Young
with respect to testing policies and procedures.
Both aspects of this guidance are consistent with the
findings of the DSB, and should lead to consideration of many
of the topics that are advanced in the proposed legislation,
because we recognize that current funding policies and
structure can, in fact, work against adequate testing. However,
plans and reviews are neither an implementation nor a solution.
The proposed legislation is a potential solution in line with
the DSB recommendations.
The development of a strategic plan for the maintenance and
modernization of our T&E infrastructure is a much-needed step
in guiding our efforts to provide a robust T&E capability for
the future. Today, we have inequities that surface on a case-
by-case basis where we have one service conducting tests in one
context, and another service with very similar weapons
conducting a test in a different context. We need to adjudicate
these differences and bring some standards to bear. This is
also one of the issues we are looking at very closely.
The second planning item calls for streamlining T&E to
match the goals of streamlined acquisition. There are those
who, after observing DOD programs for the last dozen or so
years, might believe that streamlining T&E is a code word for
testing less. I do not agree with that assertion. However, in
order to streamline, I believe we will have to address
increasing the tempo with which we conduct tests and analyze
the results. Currently, it is almost as if the schedules at the
ranges depend on the systems not being ready for test. In fact,
only about 40 percent of the tests scheduled start on time,
because the systems are not ready.
If the latest acquisition initiatives deliver what we hope
they will, then a greater fraction of programs should be ready
for testing on or near their schedules. In this respect, I fear
the T&E community might not be prepared for success in
acquisition reform. That means the ranges will have to increase
their capacity or improve their responsiveness. Right
now, for example, the Navy has had to pause AIM-9X testing, in
part because the test infrastructure at the Navy's test site
cannot keep up with the demands of that one test. This fall,
there are 15 tests scheduled at the same site.
In sum, many of the items in the proposed legislation would
likely be addressed when future defense plans are implemented,
so what we may have here is a difference in the schedule for
transformation, not necessarily one of different goals.
Addressing an issue, however, does not necessarily mean that
the Department will come up with a solution, much less one that
matches the DSB recommendations very closely. Nevertheless, the
direction that the Department is taking is an acknowledgement
that there is a problem, and improvement is necessary, and you
have my commitment that I will press to find that appropriate
solution.
In summary, then, I can say that the Department largely
supports the thrust of the DSB report recommendations. We have
already had some success in implementing the recommendations of
that report. This legislation seeks to drive that
implementation faster and more thoroughly than what we have
accomplished or planned so far. A review of the legislation
shows that it does match the DSB recommendations in many
respects. It addresses, in some cases more fully, many of the
problems that we identified when we were on the task force.
I thank you for your kind attention to my remarks. I believe
testing to be a critical part of what we must do for our
soldiers, sailors, airmen, and marines, and I believe your
careful consideration of the Defense Science Board
recommendations reflects that same concern.
Thank you.
[The prepared statement of Mr. Christie follows:]
Prepared Statement by Hon. Thomas P. Christie
I am pleased to have this opportunity to discuss the proposed
improved management of Department of Defense Test and Evaluation
Facilities legislation that implements major Defense Science Board
(DSB) recommendations with respect to test and evaluation (T&E). Two
recent DSB reports on T&E, one in September 1999 and another--which
your committee directed--in December 2000, made a number of
recommendations for improving the Department's T&E programs. As you no
doubt know, I served on both of these DSB panels. But I appear here
today, not as a member of either of those panels, but as the
Department's Director, Operational Test and Evaluation (DOT&E), a
position for which this committee honored me with confirmation nearly a
year ago. I must admit that never in my wildest dreams did I believe,
as I participated in those two DSB task forces, that I would have the
opportunity to implement those recommendations.
You have asked me to provide an assessment of the proposed
legislation, the current state of the Department's test and evaluation
facilities, the findings of the DSB task force report and my annual
report, and any other recommendations to address the problems
identified by the DSB task force or my annual report.
While I have some specific comments to make concerning the proposed
legislation, with your forbearance, I would first like to briefly
review what has been accomplished since July of last year when I was
confirmed, with respect to the major recommendations of the December
2000 DSB Report.
That report in essence covered five major areas:
The Value of Testing
Management of T&E Resources
The Quality of Testing
Specific T&E Investments
Use of Training Facilities/Exercises for T&E Events
In all, there were 25 recommendations made with respect to those
topics. I have, within my role as DOT&E, been able to address 16 of
these during this past year--some with more success, some with less,
and some with no success. The other nine lay outside my area of
responsibility. Let me briefly cover some of the steps we have taken to
address some of these recommendations.
THE VALUE OF TESTING
The value of testing may seem like a strange first topic for the
DSB. It should be obvious to everyone that the Department's goal is to
field weapons that work, and that testing is invaluable as a design
tool, a means for verifying performance, and ultimately confirming the
operational effectiveness and suitability of those weapons. But I'm
concerned that the current funding structure works against adequate
testing. Because of the way testing is currently planned and funded,
articulating its value has become critical to the survival of the
ranges and adequate test and evaluation capabilities. As more and more
of the cost of tests and the cost of the ranges are being charged
directly to programs, the ranges find themselves having to ``sell''
their capability to program managers.
As test range overhead and maintenance costs have shifted to the
individual acquisition programs, the cost of testing to program
managers has risen. Thus, a program manager who chooses to go to a
specific range for testing is charged not just for the cost of the
test, but also for a large fraction of the upkeep and maintenance costs
of that range. Needless to say, program offices are not anxious to pay
for more than the direct cost of their testing, and I don't blame them.
Unfortunately, too often program offices tend to avoid testing under
these circumstances. This is especially true of developmental testing,
where the record shows that we have brought into operational test many
systems before they were ready. The latest Army estimate is that 75
percent of the systems fail to meet even 50 percent of their
reliability requirement in their operational tests.
I have heard program managers say: ``A dollar spent on testing is a
dollar spent looking for trouble.'' Under the current funding
structure, one can see why ``articulating the value of testing''
becomes necessary for the ranges. Unfortunately, the ranges have not
been good at it. Government weapons programs do not have the
market-created measures that demonstrate the value of testing in the
private sector, such as warranties, recalls, and class action
lawsuits, which impose a real cost risk on industry that testing
helps reduce.
My office has been working with the Army test community on an
effort to develop an approach for expressing the return on investment
in testing for program managers. These approaches include quantifying the
cost benefit to finding failure modes early to avoid retrofits and the
life cycle cost benefit from improved reliability when the reliability
testing is robust. We have also found interest from, and are working
with, the professional testing organization, the International Test and
Evaluation Association, which this year will sponsor two symposia with
the theme ``The Value of Testing.''
THE QUALITY OF TESTING
The DSB found that ``Testing is not being done adequately.'' The
quality of testing can suffer when testing is avoided, when adequate
capabilities to test don't exist, or when the testing is not funded
properly in either magnitude or phasing.
The DSB found existing policies that were being used to avoid or to
defer some testing and (more importantly) to avoid evaluation. I sent a
memorandum to the services on this, asking them to cease the unilateral
waiving of requirements and requiring that all operational requirements
be part of the evaluation. The specific policy most obvious was a Navy
policy that allowed waivers to test and evaluation. There has been real
evidence of change in specific programs.
Where adequate test capabilities don't exist, they need to be
developed. The Central Test and Evaluation Investment Program (CTEIP)
is part of my responsibility as DOT&E. CTEIP has a number of programs
aimed at developing and fielding needed improvements to our test
capabilities. I'll mention some of these later in the context of the
DSB's recommendations for specific investments, some of which I have
been able to fund with the limited CTEIP budget and other funds
available to me.
The last threat to the quality of government T&E, discussed by the
DSB, is funding. The DSB considered the magnitude of the funding
allocated to T&E by the services as well as its phasing. The DSB
recommended a ``reform of the acquisition process in order to support
the adequate and robust T&E of new weapons systems that work the first
time, all the time.'' By phasing I mean that developmental testing is
not supported well enough or early enough. Hence, systems get into
operational tests with too many problems. This may sound as if it is a
developmental test problem. It is in part. But as I said before, one
significant root cause of the problem is ``how the tests are funded.''
The funding structure has to change to solve the problem.
SPECIFIC T&E INVESTMENTS
The DSB ``found the state of the infrastructure--to include
physical plant, range real estate, instrumentation, data reduction and
analysis capabilities, targets, personnel, among other facets of test
planning and conduct--in need of near-term investment and high-level
emphasis . . .'' Three areas identified--and they were but examples,
and not a complete list--were frequency management, embedded
instrumentation, and more realistic targets.
Frequency Management
With the resources at my disposal, I have been able to invest in
systems for Advanced Range Telemetry (bandwidth-efficient
instrumentation), Joint Advanced Missile Instrumentation (a
spectrum-efficient GPS [Global Positioning System] hybrid system),
and an Enhanced Range Application Program (a flexible data link to
support T&E and training). This last project is an example of how the
test and
training communities can position themselves, with respect to
instrumentation, to work together more closely. This project also
provides a concrete initiative to begin to implement improvement in the
fifth and last area discussed by the DSB.
Embedded Instrumentation
With respect to embedded instrumentation, we planned to initiate
projects to pursue embedded instrumentation enabling technologies, but
funding reductions in our testing technology program last year forced
us to postpone project initiation.
Subsequent to the DSB report, the Department has rewritten the Acquisition
Regulations. One section in the regulations that is getting attention
is embedded instrumentation. The current regulation includes a
requirement for the program manager to consider embedded
instrumentation. The Department's Business Improvement Council is
considering an initiative that would require the program manager to
evaluate embedded instrumentation in the analysis of alternatives. If
embedded instrumentation promises a cost benefit over the life cycle,
it would become a requirement for the system. I note that the DSB came
to its conclusions on embedded instrumentation as it was considering
the connection between testing and training. Embedding instrumentation
could make possible a better link between testing and training.
Realistic Targets
Target problems remain a very serious impediment to realistic
testing (and training for that matter). The Navy needs a self-defense
target ship to permit us to adequately test ship defense systems. Our
missile defense programs need more realistic targets; the target drone
situation for air-to-air missiles testing and training continues to
worsen. These aerial targets are needed for a large number of programs.
Unfortunately again, the way these programs are funded has had a
negative effect. The first program manager who admits he needs these
assets will be the one to bear the major part of their cost.
As I stated earlier, I have addressed some 16 of the 25
recommendations found in the DSB report in my first months in office. I
would say that we have made progress on 13 of the 16. Let me now turn
to the recommendations that were not implemented. They centered on
management of T&E resources.
MANAGEMENT OF T&E RESOURCES
The DSB--as part of its response to this committee--recommended
that DOD create a ``Test and Evaluation Resource Enterprise.'' As
envisioned by the task force, the Enterprise would (1) fund and manage
the DOD T&E organizations, workforce, and infrastructure, (2) be at the
OSD level under the Director, Operational Test and Evaluation, (3) be
funded by transferring the appropriate military service's funding for
investment, operations, and maintenance of Major Range and Test
Facilities Base (MRTFB) test resources and facilities to the
Enterprise, and (4) allow the operations of the test facilities to
remain under service control.
Defense plans for fiscal year 2004 include two actions that bear on
efforts to improve T&E policies, procedures, and infrastructure. We are
called upon to provide by this fall an assessment of how best to make
the ranges able to support affordable, adequate testing. We are further
asked for a review of what changes are needed to harmonize the
Department's new acquisition strategies with testing policy and
procedures.
Both aspects of the guidance are consistent with the findings of
the DSB and should lead to consideration of many of the same topics
advanced in the proposed legislation because we recognize that current
funding policies and structure can work against adequate testing.
The development of a strategic plan for the maintenance and
modernization of our T&E infrastructure is a much-needed step in
guiding our efforts to provide a robust T&E capability for the future.
There may be a number of ways to implement such a plan. Among other
things, it would require us to reconcile testing methodologies between
the services.
For example, this year we examined two weapons test plans by
different services against the same intended target set. One weapon
system was to be tested on an Army range against a moving column of
remotely controlled armored vehicles with realistic countermeasures and
with the potential for dust and obscuration that movement brings. The
other system was to be tested at an Air Force range against a static
array of hulks with hot plates that were to simulate the signature of
hot vehicles. Clearly a more balanced strategic view would preclude
such inequalities.
Today these inequities surface on a case-by-case basis, usually
after the services have done their planning and often only during the
operational test phase. Turning around such planning at that point is
neither streamlined nor efficient. Hopefully, a well-done strategic
plan would change that.
Further, I cannot imagine a strategic plan that did not bring the
test ranges in line with Sec. 907 of the Strom Thurmond National
Defense Authorization Act for Fiscal Year 1999, which aimed at cost-
based management. In that sense, the strategic plan would address the
DSB recommendation for a common financial management system.
Finally, I cannot imagine a strategic plan that did not address
much needed improvements in the T&E workforce, which was yet another
DSB recommendation.
The second planning item calls for streamlining T&E to match the
goals of streamlined acquisition. There are those who, after observing
DOD programs for the last dozen or so years, might believe that
``streamlining T&E'' is a code-word for ``test less.'' I do not agree
with that assertion. However, in order to streamline, I believe we will
have to address increasing the tempo with which we conduct tests and
analyze the results. Currently, it's almost as if the schedules at the
ranges depend on systems not being ready for test. In fact, because
systems are not ready, only about 40 percent of tests start on time. As
I have said before, Lord knows what would happen if all the programs
that claimed to be ready for testing in 2002 actually showed up for
testing. If the latest acquisition initiatives deliver what they hope
for, then a greater fraction of programs should be ready for testing on
or near their schedules. In this respect, I fear the T&E community
might not be prepared for success in acquisition reform. That means the
ranges will have to increase their capacity or improve their
responsiveness. Right now the Navy has had to pause AIM-9X testing in
part because the test infrastructure at the Navy's test site cannot
keep up with the demands of that one test. In the fall, there are 15
tests scheduled for that one site.
In some cases, such as the F-22, the inability of the test
infrastructure to maintain a high tempo of testing, to surge when
needed, may be slowing down the progress of the program. AIM-9X testing
is suffering because U.S. Navy and U.S. Air Force QF-4s and their
ranges are not interoperable. We have also seen delays at the Army's
White Sands Missile Range due to critical infrastructure staffing
shortfalls.
Many of the items in the proposed legislation would likely be
addressed when future Defense plans are implemented. So what we may
have here is a difference in the schedule for transformation, not
necessarily one of different goals. Addressing an issue does not
necessarily mean the Department would come up with a solution, much
less one that matches the DSB or the proposed legislation which, I have
said, follows the DSB recommendations very closely. Nevertheless, the
direction the Department is taking is an acknowledgement that there is
a problem and improvement is necessary. You have my commitment that I
will press to find an appropriate solution.
Let me now comment on the proposed legislation. First, we recognize
that it is crafted to fully implement the recommendations of the
Defense Science Board task force. I can offer you a few observations
based on my personal experience.
One problem area that I can point to is the effect the transfer
will have on the Central Test and Evaluation Investment Program. The
DSB used CTEIP as the model for organization and process. However, the
CTEIP was established to develop tools needed for T&E. It would be
better to keep large-scale operational funds separate from development
of test equipment.
1. Section 236 allows deviation from the approved Test and
Evaluation Master Plan (TEMP) with the approval of the Secretary of
Defense, the Deputy Secretary of Defense, or me, followed by
notification to this committee within 30 days.
On the surface, this seems like a good thing. Any substantial
deviation from a master plan ought to be reviewed carefully, at least
by my office and that of the Under Secretary of Defense for
Acquisition, Technology, and Logistics (AT&L) to ensure that test
adequacy is not jeopardized. So first, there should be a requirement to
notify our offices of any departures.
On the other hand, the acquisition regulations encourage tailoring.
In that context, such tailoring may include no longer producing TEMPs
as we know them. For example, the Air Force has briefed my staff on
plans to forego TEMPs as such, and replace them with a combined
acquisition strategy and testing document. I am concerned that, if
deviations must be reported, the documents themselves will trend to
less and less detail, making deviations more difficult to detect.
2. The legislation requires a report and plan by the Under
Secretary of Defense (AT&L) on improving the T&E workforce.
This section recognizes that most of the individuals doing testing
and evaluation in the Department are part of the Acquisition Corps. I
know that some Senators and Representatives call the Acquisition
Workforce the ``Pentagon buyers,'' and they are constantly pushing the
Department to reduce their numbers. So you have put the Under Secretary
of Defense (AT&L) in a tough spot (not that he isn't in a tough enough
spot already). But the legislation recognizes the fact that most tester
positions are currently under the responsibility of the Acquisition
Corps.
3. The final section I will comment on, Section 231, suggests the
Under Secretary of Defense (AT&L) has responsibility to designate which
ranges comprise the MRTFB (Major Range and Test Facilities Base). For
the last 3 years, that responsibility has been with my office. The
Deputy Secretary signed the new 3200.11 Directive formalizing that
responsibility 2 weeks ago.
In summary then, I can say that the Department largely supports the
thrust of the DSB report. We have already had some success in
implementing the recommendations of that report. This legislation seeks
to accelerate that implementation, making it faster and more thorough
than what we have accomplished and planned so far. A review of the
legislation
shows it to match the DSB recommendations in many respects. However,
the legislation could cause us problems. The Department desires the
opportunity to discuss the proposed Senate legislative objectives
internally as well as with your committee. We believe that together we
can develop a plan, potentially including a legislative proposal that
addresses the recommendations in an effective manner.
I want to thank you for your kind attention to my remarks. I
believe testing is a critical part of what we must do for our soldiers,
sailors, airmen, and marines. Thank you.
Senator Landrieu. Thank you very much, Mr. Christie.
Mr. Krings.
STATEMENT OF HON. JOHN E. KRINGS, MEMBER, DEFENSE SCIENCE BOARD
TASK FORCE ON TEST AND EVALUATION CAPABILITIES
Mr. Krings. Good morning, Madam Chairman and members of
your subcommittee. Thank you for the opportunity to appear
before you to discuss my views on the proposed legislation. I
spent 15 years as a fighter pilot in the Air Force and the Air
National Guard, and 30 years as an experimental test pilot with
McDonnell Aircraft Company before appearing here as the first
DOT&E. I have remained actively engaged in testing since
leaving the Pentagon.
First, I want to congratulate you. From my point of view,
this is the most significant test and evaluation legislation
since 1983, when Congress, and many of the people that are on
this committee, established the position of DOT&E. It addresses
longstanding problems identified more than 30 years ago by the
President's Blue Ribbon Defense Panel, problems that have been
underscored by dozens of studies and reports ever since,
including the 1999 and 2000 reports of the Defense Science
Board Task Force on Test and Evaluation and the DOT&E's Annual Report
for Fiscal Year 2001.
This morning I will comment on the findings, the
recommendations of these studies, the proposed legislation, the
current state of the Department's test and evaluation
facilities, and the basis for the DSB task force findings. From
my point of view, the committee's recommendation to establish
a DOD test and evaluation resource enterprise is the
most important part of the proposed legislation for several
reasons.
The current funding for essential maintenance and
modernization of the test infrastructure is inadequate. We
recognize this, and we understand why: the services do not make
the required investment in test resources because test and
evaluation competes with service programs, which has been
mentioned more than once this morning.
The result is that over a period of decades, service-managed
and funded test and evaluation facilities have deteriorated to
the point where they cannot support adequate testing of today's
systems. These facilities are not able to support adequate test
and evaluation of new, emerging, and leap-ahead systems without
prudent investments in modernization. We did not say large; we
said prudent investments in modernization.
The enterprise envisioned in this proposed legislation will
consolidate funding and modernize the infrastructure by looking
across the Major Range and Test Facility Base and making the
best investments for all of DOD. The net result is, all the
services will get the affordable test resources and facilities
that they need for adequate joint testing of their current and
future weapons systems.
As proposed in the DSB report, in addition to consolidating
the funding, the enterprise will manage these test ranges and
test facilities through a board of directors with
representatives from the MRTFB, the military people from the
MRTFB. This management plan has been discussed, debated, and
validated, and is a major part of the implementation.
Essentially, it requires members of the test community from the
various major ranges and test facilities to participate
effectively in managing the test resource allocation and
investment.
Some will likely argue that the Office of the Secretary of
Defense is just taking away resources from the services and
building another bureaucracy. The reality is, test ranges and
facilities will be better-funded, and they will be intimately
involved in the decision as to how the money will be spent. As
a result, a service program manager will have the entire
national range complex restored and available for testing, not
a single service capability and, most importantly, there will
be accountability and sunshine on the process.
When this administration took office, the defense
transition team asked me to help two members of the team who
had already read the DSB 2000 report on test and evaluation
capability. They said Secretary Rumsfeld wanted a DOT&E that
could implement the DSB recommendations, and he, the Secretary,
believed Tom Christie was the best candidate. Tom did not ask
to be the DOT&E. He had been watching it for years, and knew
that that was not a smart thing to do. He became a candidate
only because the transition team convinced him that he was the
only one that could effectively implement the DSB
recommendation that he helped author. He has continuously
avoided any activities or expression that would suggest that he
is personally seeking additional funding for his own
organization. The transition team then recommended that Mr.
Christie appoint a team of outside experts to write an
implementation plan. I ended up in the same position as Tom,
not ever expecting to have to do any of this, but I was asked
to lead this team.
We wrote and delivered a comprehensive implementation plan
and schedule. We recommended a unified concept like the concept
in the DSB report, giving prominent roles to the technical
directors in the field. The implementation plan includes a
financial accounting system that will enable the Department to
manage and report to Congress the actual cost of testing for
the first time in the history of the Department.
During the course of the 2000 DSB study, we considered many
sources of information: the findings of the DSB 1999 study, as
well as data and insight, equally as important as the data,
gathered at on-site visits to nearly all of the test and
evaluation facilities across the United States, and extensive
briefings from all DOD test and evaluation organizations. The
DSB task force findings and the implementation team's
recommendations are grounded in reality and build upon a solid
foundation of personal and historical experience, data, and
analysis. Tom Christie and I served on both the 1999 and the
2000 DSB task forces. We went everywhere. We heard every word
that has ever been written about testing, believe me.
In closing, I want to say that I agree with the committee
that our soldiers, sailors, airmen, and marines must have
weapons systems that work in combat. Everybody agrees to that.
Their lives depend on it. This vital legislation, like your
previous legislation that created the DOT&E, which took a long
time, and a big hill to climb, is another critical step toward
helping the Department meet its responsibility to adequately
test weapons systems before putting them in the hands of our
servicemen and women.
I sincerely appreciate the work of this subcommittee and
what this legislation will achieve. I will be happy to answer
any questions.
Thank you.
[The prepared statement of Mr. Krings follows:]
Prepared Statement by Hon. John E. Krings
Good morning, Madam Chairman, and members of your subcommittee.
Thank you for the opportunity to appear before you to discuss my views
on the proposed legislation to improve the management of Department of
Defense test and evaluation facilities.
I spent 15 years as a fighter pilot in the Air Force and Air
National Guard, and 30 years as an experimental test pilot with
McDonnell Aircraft Company before appearing here as the first DOT&E. I
have remained actively engaged in T&E since leaving the Pentagon.
First, I want to congratulate you. From my point of view, this is
the most significant test and evaluation legislation since 1983 when
the U.S. Congress established the position of Director, Operational
Test and Evaluation. It addresses long-standing problems identified
more than 30 years ago by the President's Blue Ribbon Defense Panel,
problems that have been underscored by dozens of studies and reports
ever since, including the 1999 and 2000 reports of the Defense Science
Board Task Force on Test and Evaluation; and The Director, Operational
Test and Evaluation's annual report for fiscal year 2001.
This morning, I will comment on the findings and recommendations of
these studies, the proposed legislation, the current state of the
Department's test and evaluation facilities and the basis for the DSB
Task Force's findings.
From my point of view, the committee's recommendation to establish
a Department of Defense Test and Evaluation Resource Enterprise
(T&E/RE) is the most
important part of this proposed legislation for several reasons.
The current funding for essential maintenance and modernization of
the test infrastructure is inadequate. We recognize this and we
understand why. The services don't make the required investments in
test resources and facilities because test and evaluation competes with
service programs. The result is that over a period of decades, service
managed and funded test and evaluation facilities have deteriorated to
the point where they cannot support adequate testing of today's weapon
systems. Sixty-seven percent of the test facilities are more than 30
years old, and 41 percent are over 40 years old. The recapitalization
rate is 400 years! These facilities are not able to support adequate
testing and evaluation of new, emerging, and leap-ahead systems without
prudent investments in modernization.
The enterprise envisioned in this proposed legislation will
consolidate funding and modernize the infrastructure by looking across
the MRTFB, and make the best investments for all of DOD. The net result
is all the services will get the affordable test resources and
facilities they need to adequately and jointly test their current and
future weapon systems.
As proposed in the DSB report, in addition to consolidating the
funding, the T&E/RE will manage the test ranges and test facilities
through a board of directors with representatives from the MRTFB. This
management plan has been discussed, debated, and validated, and is a
major part of the implementation. Essentially, it allows members of the
test community from the various major ranges and test facilities to
participate effectively in managing test resource allocation and
investment.
Some will likely argue that the Office of the Secretary of Defense
is just taking away resources from the services and building another
bureaucracy. The reality is test ranges and facilities will be better
funded, and they will be intimately involved in the decisions as to how
the money will be spent. As a result, a service program manager will
have the entire national range complex resources available for testing,
not a single service capability. Most importantly, there will be
accountability and sunshine on the process.
When this administration took office, the Defense Transition Team
asked me to help two members of the team who had read the DSB 2000
Report on Test and Evaluation Capabilities. They said Secretary
Rumsfeld wanted a DOT&E that could implement the DSB recommendations
and he believed Mr. Christie was the best candidate. Tom Christie
didn't ask to be the DOT&E. Tom became a candidate only because the
transition team convinced him that he was the only one who could
effectively implement the DSB recommendations he authored. He has
continuously avoided any activities or expressions that would suggest
he is personally seeking additional funding for his organization.
The transition team then recommended that Mr. Christie appoint a
team of outside experts to write an implementation plan. I was asked to
lead the team. We wrote and delivered a comprehensive implementation
plan and schedule. We recommended a unified concept, like the concept
in the DSB Report, giving prominent roles to the technical directors in
the field.
The implementation plan includes a financial accounting system that
will enable the Department to track, manage, and report to Congress the
actual cost of testing for the first time in the history of the
Department.
During the course of the 2000 DSB study, we considered many sources
of information: the findings of the DSB 1999 study, as well as data and
insight gathered during on-site visits to nearly all test and
evaluation facilities across the United States; and extensive briefings
from all DOD test and evaluation organizations. The DSB Task Force's
findings, and the implementation team's recommendations are grounded in
reality and built upon a solid foundation of personal and historical
experience, data, and analysis. Tom Christie and I served on both the
1999 and 2000 DSB Task Forces.
In closing I want to say that I agree with the committee that our
soldiers, sailors, airmen, and marines must have weapon systems that
work in combat. Their lives depend on it. This vital legislation, like
your previous legislation that created the DOT&E is another critical
step toward helping the Department meet its responsibility to
adequately test weapons systems before putting them in the hands of our
service men and women.
I sincerely appreciate the work of the subcommittee and what this
legislation will achieve. I will be happy to answer any questions.
Senator Landrieu. Thank you very much, all excellent
statements. I really think it is going to get us off to a good
start for this discussion. The best news I have heard is that
there really does seem to be complete agreement from the
Department and from the gentleman that has led this important
report. Our goal seems to be the same, to have a system where
the incentives are in the right places to do the right things
to get a flexible but thorough testing system for our
Department of Defense so it can support the best military in
the world. We want to be open to new acquisition strategies,
with a testing mechanism that lets us be certain we are getting
the warfighter what they need and the taxpayer the best bargain
and best investment process, so I am very encouraged that all
the panelists share that goal. The questions, of course, are
going to be about how best to get there.
Second, I want to thank you, Mr. Christie and Mr. Krings,
for being very brave, in a sense. I have been in this
business now a long time, and it is very rare that you actually
see someone that will serve on the committee and then volunteer
and go to Washington to try and implement the recommendations
of the task force. That alone is worth commending you both for
your good work and for stepping forward.
Let me begin by asking if there is--since I heard a
consensus of the goal, I want to make sure that we also have a
consensus about the depth or the seriousness of the problem, so
I am going to ask each panelist if you agree with some of the
findings of this report, and I am just going to ask three
questions, just answer yes or no.
Mr. Young and Mr. Wynne, do you agree that the
recapitalization rate of the infrastructure highlighted in this
report is about 400 years, and the architect is about 70 years?
Do you generally agree with that assessment of the condition of
the testing facilities?
Mr. Wynne. Yes, ma'am. I would only say that is adequate
for testing all of the equipment that we have given them, and
the funding is proffered when the testing is inadequate, but we
are trying to get all of our facilities down to a 67-year
recapitalization rate and that is a subject of a separate
committee.
Senator Landrieu. To a 6 to 7 year?
Mr. Wynne. 67.
Senator Landrieu. 67, down from 400?
Mr. Wynne. Yes, ma'am.
Senator Landrieu. Okay. Mr. Young.
Mr. Young. I do not have the specific numbers, but I would
agree with what Mike said, across the infrastructure we have
problems. I do not know that the test infrastructure is an
anomaly but all of the DOD infrastructure needs to be brought
down to, as he said, a 67-year recapitalization rate.
Senator Landrieu. Do you agree with the general finding
that 66 percent of the Air Force programs stopped operational
testing due to a major system or safety shortcoming? Do your
records reflect that or acknowledge that?
Mr. Wynne. I would say that comes from the early 1990s. It
may not reflect what is going on today, but I would say it this
way: airplanes that we deliver to our Air Force go through a
thorough scrub before any operational characteristics are
changed. The stopping or starting of test
is a natural fall-out of essentially trying to aggressively
meet high-G requirements, high bomb accuracy requirements, and
as far as the segmentation into safety versus nonsafety, safety
is our first concern, always, and some of those safety aspects
you do not run into until you get into a serious operational
test, so I cannot agree or disagree that the current stats
would mirror or not mirror that number.
Senator Landrieu. But in both of your testimonies, and this
is the reason I asked those questions, you acknowledge that the
problems do exist and that you are in the process of addressing
them. You seem to acknowledge the fundamental basis of this
report, that there are some shortcomings and areas that need to
be addressed. Is that correct?
Mr. Wynne. Yes, ma'am. In every aspect of the management of
our government we can identify those areas where we can
improve, no doubt about it.
Senator Landrieu. Mr. Christie, let me ask you, since you
have had a long experience in this field, what do you think the
chances are that you would be able to get an agreement with the
Department on some of the issues that you have acknowledged,
either with or without this legislation? If we did not push
forward with some of the pieces or all of the pieces in this
legislation, what do you think the impact on the test and
evaluation will be in 2 or 3 years?
Mr. Christie. As I said, I think we have made some
progress. The waivers process is, in fact, one that we have
addressed within the building, and the United States Navy,
which was identified as the culprit, as I recall, in the
report, has, in fact, changed the process--we are talking about
waivers
now of testing requirements, not waivers of operational
requirements per se. When we have a requirement, an operational
requirement that is on the books, we should at least gather
data that permits one to evaluate whether we are effective in
meeting that requirement. The Navy has changed their process
with respect to that.
As far as 2 or 3 years from now, the sooner we get underway
with making some of these changes, and I think a very important
issue is the institutional funding----
Senator Landrieu. The funding piece. The waivers we seem to
be making progress on.
Mr. Christie. Yes.
Senator Landrieu. It's the funding.
Mr. Christie. The funding piece is another issue. There has
been some progress there as far as proper institutional
funding, but I am not about to say that it would solve the
problems that I think were highlighted in that DSB report. The
services have competing requirements when they put their POMs
together and their budgets together, and I understand that, but
that has led to problems with the ranges and our ability to
conduct adequate tests over the years, and continues to do so.
Senator Landrieu. On the funding issue, and I cannot find
the exact statement, but I remember reading about the--here it
is. On that issue, because we seem to acknowledge that funding
and the--usually competition is good, but I am not certain in
this particular instance this competition between acquisition
and testing is very helpful and that is one of the issues we
are trying to focus on, but according to the budget request
this year, the Army proposed not to increase its testing and
evaluation, but to decrease it from $128 to $123 million. The
Navy did not, even with this report and even with the work,
offer to increase its testing, but it decreased from $123 to
$118 million, and the Air Force did the same, from $125 to $90
million. So the words about the importance of testing, that we
are underfunded and need more money, do not seem to be
reflected in the budget. The amount of funding is a problem,
but the system in which testing funding competes with
acquisition also seems to be a problem.
Mr. Krings, one more question and then I will turn it over
to my colleagues. You spoke very passionately about this
subject. I am always impressed with people who seem to come to
the table with a lot of direct experience. What, in your
experience as a fighter pilot, or in your association with the
contractor that you worked for, led you to be interested in
this, and why you think it is so important that this
subcommittee really try to work with the Department and the
services to try to come up with a better system?
Mr. Krings. I think most of my passion for this particular
effort came when I was the DOT&E. I naively came to this job--
and Senator Bingaman may remember. I suggested at one time
during my hearing that maybe we would just put a DOT in for a
couple of years, and everybody would straighten out, and then
we could walk away and everything would work well. Well, I got
a lot of ridicule about that. It is more like straightening
teeth. You take the wires off and they go right back where they
were again.
So I did not realize at that time the strength of the
competition that exists between the services--which is good at
times, I am not arguing with that--but in this particular case
it does seem to me that shared ownership, making a national
range, would benefit everyone. We do have, as we speak today,
significant
limitations due to resources in the major programs that are
going on.
I just did two Red Team reviews for the Air Force on the F-
22, and we have significant problems in the F-22 in terms of
resources. We can only shoot one Advanced Medium-Range Air-to-
Air Missile (AMRAAM) a month in this country. That is kind of
hard to imagine, but that is because it is not done on a
national basis. So when we began to see unified and joint
operations and
we said we are going to train like we fight, meaning we are
going to train jointly, it seemed natural to say we are going
to test like we fight, which means we would test jointly, and
the concept of a national range, people putting things together
and not duplicating things, it just--the more you look around,
if anyone in this room went on the trip that we went on, the
same result would come up. You do not really have to have all
that experience. You can see it. They will tell you that when
you go out and talk to them.
So this is a response from the people who have to do the
job every day, as opposed to those who might sit back here and
think they know how to do the job every day, and one gets
rather passionate when you see things not being done well, and
the ability to fix it is there.
Senator Landrieu. My time has expired, but this
subcommittee under Senator Roberts' leadership has done a great
job in trying to focus our efforts toward jointness, toward
working together, recognizing that competition is good, but
cooperation is also very good, and the sharing of resources,
minimizing cost, and maximizing the result, so I hope that this
hearing will be helpful to us. We have already identified some
pieces of the legislation that we could agree on, some that
might need additional work, and I look forward to working with
Senator Roberts to try to present to Congress something that
will be really beneficial and continue to move us toward a
reform system.
Senator Roberts.
Senator Roberts. Thank you to my colleague and my chairman.
Are you a chairlady or chairwoman or chairperson----
Senator Landrieu. I answer to just about anything, as long
as you call me and do not forget me.
Senator Roberts. I thank my friend.
Senator Landrieu. Good.
Senator Roberts. Mr. Young, the chairman indicated the
service requests, which were somewhat under last year's, and it
occurred to me that all of your testimony reflects a lack of
funding. We are pretty good at pointing fingers at the
services, and at people like yourself, but Congress has not
always been very supportive of fully funding the test and
evaluation infrastructure, and I know this has been a problem
in recent years. What has been the impact of that?
Mr. Young. Sir, if you will allow me the privilege of
sitting on your side of the table for a minute, because in
working for 10 years for the Senate Appropriations Committee I
was part of making recommendations to the committee and
reviewing the budget. I think those processes have had a
significant effect on the test ranges, and the Department as a
whole does not want to put money at risk when they ask for
money.
For example, in fiscal years 1998, 1999, and 2000, funds
for the Navy and Air Force test ranges, the MRTFB funds that
sustain those ranges were cut $15 to $25 million each year. At
that point, the Department tends to get very concerned about
making sure they can totally defend the budget request. The
services tend not to put resources into activities that
Congress reduces year after year. However, I can tell you the
Army T&E lines in total grow about 13 percent over the FYDP,
the Air Force lines grow 25 percent over the FYDP, and the Navy
lines stay paced just ahead of inflation.
The chairman talked about a couple of specific lines, but
there are three or four lines that pay the bills to operate the
ranges, then there are a couple of lines that modernize the
ranges and those modernization lines do fluctuate, depending on
what equipment you need to buy for a range at a given time.
However, on the whole, the trend is that we have lost money, if
you will, over the last several years on the Hill, and the
Department currently has a budget which reverses that trend to
a pretty good degree, and we are defending those moneys
aggressively.
Senator Roberts. I am tempted to ask you about the attitude
of some of the appropriators, but I will not put you on the
spot.
Mr. Young. They are excellent people, sir. [Laughter.]
Senator Landrieu. Please do not get us in any more trouble
than we are already in.
Senator Roberts. In the House of Representatives, in which
I used to serve, there were times that I felt there should be a
hunting season for appropriators. I love appropriators in the
Senate. I carry their bags, I press their ties, I clean their
windows.
Senator Landrieu. He does not realize I am now one, an
appropriator, you see. I am taking this back.
Senator Roberts. Yes, that is one of the reasons I am
saying this. [Laughter.]
We will talk to Ted and Danny and see if we cannot make
some improvements.
I have a question of Mr. Christie, 16 of 25 is pretty good.
I might add that Kobe Bryant did not hit that many last night,
but maybe Michael Jordan--but at any rate, many are called and
few are chosen, and I want to thank you for your willingness to
take up a position of responsibility where you had been in the
advice category--and I am desperately looking here for your
statement.
On page 9, ``So what we may have here is a difference in
schedule for transformation, not necessarily one of different
goals. Addressing an issue does not necessarily mean the
Department will come up with a solution, much less one that
matches the DSB, or proposed legislation which I have said
follows the DSB recommendations very closely. Nevertheless, the
direction the Department is taking is an acknowledgement there
is a problem and improvement is necessary. You have my
commitment I will find an appropriate solution.'' I want to
thank you for that statement.
Then you also said on page 11, ``A review of the
legislation shows it to match the DSB recommendations in many
respects. However, the legislation could cause us problems. The
Department desires the opportunity to discuss the proposed
Senate legislative objectives internally, as well as with your
committee. We believe that together we can develop a plan. . .
.'' Which is the suggestion of the chairman, and I think is a
good suggestion, so I thank you.
Let me ask you the question. You stated in your annual
report that the organizational and the budgetary
recommendations in the DSB study are needed, though
controversial, and the Department chose not to implement these
recommendations.
Just a real quick summary on why the DOD chose not to
implement the DSB recommendation to establish a department of
test and evaluation resource enterprise. That is quite an
acronym mouthful. That is DTE--never mind.
Mr. Christie. Well, the biggie, which is the DOD test and
evaluation enterprise, was, in fact, brought before the Senior
Executive Council, which consists of the three service
secretaries, the Under Secretary of Defense for Acquisition,
and is chaired by the Deputy Secretary. All major decisions
policywise as well as many of the major budget decisions are,
in fact, put in front of that group.
I and Jack Krings here, who had developed the
implementation plan, had the honor of presenting our proposal
to that group, and met with, not surprisingly, opposition from
the services. That was expected, and we have heard Mr. Young in
particular discuss that today--why the services feel so
strongly about this.
What happened, this was in mid-August, mid to late August----
Senator Roberts. Of last year.
Mr. Christie. The decision was sort of, kick the can down
the road. It was clear the services were adamantly opposed.
There was no decision made, and we will come back and talk
about this at some future date, and then September 11 came, and
there was no further serious discussion again of this issue
before the end of the year.
The fact that the issue is still there in the context of
more adequate T&E, to include possibly this way of doing
business, is borne out by some of the direction that I talk
about in my statement that appears in the planning guidance.
That is for 2004, but that is another year. The planning
guidance says, let us develop a strategic plan to address these
issues and include it in next year's budget.
Senator Roberts. One of the suggestions I am going to make,
and I would inform the chairman, instead of 2004 we do it in 2
weeks. In other words, that you get back to us in 2 weeks, more
especially Mr. Wynne and others, to recommend what you could
live with, how you could implement this legislation, making
some suggestions. I realize that 2 weeks and 2004 is a little
bit off, to say the least, but I think since the legislation is
in the mark, and since it will be on the floor--it is not on
the House side, but we would rather work with you to see if we
could come up with some reasonable agreement, if we possibly
can.
Elliott Cohen said in this month's Foreign Affairs, ``and yet
the Predator, the UAV, one of the technological stars of
Afghanistan and Kosovo, was judged not operationally effective
or suitable by the Pentagon's Office of Testing and Evaluation
in 2001 and this determination had less to do with the
qualities of the Predator than it did with the extraordinary
standards for effectiveness set by the Department. It was a
classic case of impossibly demanding requirements causing the
Pentagon to disparage its own systems, creating pressure to
defer adequate acquisition of what is good today in a perpetual
quest for the extraordinary system that will do anything and
everything tomorrow.''
How true is that statement?
Mr. Christie. Well, let's address the Predator. Yes, we
evaluated that system against the stated operational
requirements on the part of the United States Air Force, and in
fact there was an article yesterday in the Aerospace Daily--I
think it was yesterday or the day before--that discussed the
two recent crashes and the board that had investigated them.
They found two causes for the crashes, a different cause for
each accident.
The first one was--the system I think was operating in
weather--that the deicing system had not worked. That also was
pointed out in our report.
The second cause was that the hand-off between systems was
not executed properly. In fact, during the operational test,
because they could not execute that, they did not test that
aspect, in other words, handing off from one Predator to
another. I think that article states that both of these
deficiencies were highlighted in the DOT&E report. In summary,
it did not meet its operational requirements as spelled out in
the operational requirements document.
This is not to say we should not have deployed it. I am not
saying that.
Senator Roberts. Right, exactly. That is the point I am
trying to make.
Mr. Christie. You still have a capability there, but it is
not what we thought we bought, or what we stated it should have
been doing.
Senator Roberts. There is nothing like a war to make you
change your mind.
Senator Landrieu. That is true, but I want to interject, if
I could, as is my liberty as chair, to say that a solution, or
one of the keys that we want to get to is, if you knew it did
not pass the deicing test, and it went into the battlefield,
you should not have flown it in ice.
Mr. Christie. I do not know that it flew in ice, but it is
very likely it did.
Senator Landrieu. Or whatever. I mean, if that was the
problem. I do not know if that was the problem. It is not a
question of whether you deploy it and keep it in the shop or
send it to the battlefield, but the system, or the testing is
such that the information is passed from the test to the
battlefield, so if it did not pass the test, not to push the
equipment so you hopefully save lives.
Mr. Wynne. Madam Chairman, actually you do push it.
Actually, because you think it may work, and you need that
capability, and the effect is dramatic, and in fact we have not
lost that many Predators in this engagement that would not
allow us to push the envelope, and I do not know about this
particular instance, about the heaviness or the lightness of
it, but in fact in every engagement like this, even with the
results of these two fine gentlemen, we would push the system
and expect to push Predator almost to the limit.
Senator Roberts. For a command decision, if you have a very
important mission, you are going to push the envelope. You are
going to fly the bird. I mean, after all, it is unmanned.
Albeit, you do not want anything to go wrong with it, but it
would depend on the mission and the command.
Actually, my question was, is there a danger that rigid
test criteria imposed by Congress, or internally at DOD, could
harm major systems acquisition reform by making spiral
acquisitions in the development of fieldable prototypes just as
burdensome as the current process?
Mr. Christie. I do not see that happening, in fact. My job
is not to tell the Secretary of Defense or the operational
commanders that they should or should not deploy a system, or
should or should not buy a system. But, if the service, in this
case the United States Air Force, says this is what this
aircraft or this particular system is supposed to do, and
spells that out very explicitly, then we should test against
it, and if it fails, that should be reported, and then the
decisionmaker makes his decision.
Like Mike says, they may have decided, and did, that it has
capability that we need there. I think we will have the same
situations arise in spiral development. We will test the
system--in fact, establish criteria--and then test against
them, and we will report the results.
Mr. Krings. As a professional long-term envelope-pusher, we
never recommend that the field go beyond where the testing has
been, because there may or may not be a cliff there. There may
be a gentle slope. The fact that it is unmanned really does not
make much difference, because there are often people on the
ground, or people relying on that, and I do not think that is
done very often, certainly not successfully very often.
So consequently you are absolutely right, we do not always
get all the testing done, but the key is to tag it and say what
it can do, what it cannot do, do not go past here because we do
not know what the results are, and we put many things into the
field and should and would, and will continue to, before they
are fully developed, or before they are fully tested, but we
have to put a tag on there about what has been done and what
has not been done so that the CINC or whoever is operating it----
Senator Landrieu. Can make an informed decision.
Mr. Krings. Sure.
Mr. Wynne. I have tremendous respect for both Tom Christie,
who I have admired for a long time, and Jack Krings, who I have
admired for a long time, and has been of enormous assistance to
me in the past.
I will say only that we rely on the personalities that are
sitting at this table to be rational, but this legislation
unbalances the balance that is currently in the acquisition,
and in a different setting at a different time the DOT&E could
force the Secretary or Deputy Secretary in each occasion to
make a determination, and I just think that that burdens the
Secretary and puts the system at risk, if you will, for
schedule and for delivery to our soldiers, so I do share
Senator Roberts' opinion on that.
Mr. Young. Can I make one brief comment? The requirements
process we have talked about is not a science. We do our best
to set the requirements, but I think if I understood Senator
Roberts' comment we do in the end want to get systems in the
field. It is very painful when the experts here at the table
say a system is not operationally effective and suitable, but
in the case of Predator it has proven to be operationally
useful, if I could use that word. I think you have seen some
writings of Admiral Blair and other people, that say they want
systems, especially systems that are not directly putting
people's lives in danger out there in the fleet as soon as we
can provide them.
For example, there is electronic warfare software being
developed for surface ships that rode on the Anzio the other
day. We want to deploy it as fast as we can. It has not been
operationally tested, but it would be a tragedy if we do not
get it in the hands of the fleet as soon as possible. So we do
have to look at adjusting the test process to get systems in
the hands of users, assess them fairly, and recognize that the
requirements process is not a science. We may get close but not
over the bar, and yet close was darned good when you need it in
Afghanistan.
Senator Landrieu. Okay. Senator Bingaman, and we are going
to have a vote in a few moments, but my intention is to finish
this round of questioning and then probably go vote, and wrap
up before we go vote.
Go ahead, Senator Bingaman.
Senator Bingaman. Thank you very much. My concern on this
is the current situation, which we have had, really, since I
have been here--I have been on this committee now 20 years, and
I think that the situation has deteriorated as far as
investments during that time, investment in our test
facilities.
The way I am thinking about it--and this is to paraphrase
some of the testimony you have already given here, but just to
see if I have got it right--there is a disincentive on the part
of the services to invest in testing, in resources, and in
facilities. You say in your testimony the services do not make
the required investments in test resources, and so the test and
evaluation competes with service programs. Does anybody
disagree with that?
Mr. Wynne. Sir, there is competition throughout. We just
cannot buy everything that is asked for.
Senator Bingaman. I understand, but it seems to me there is
always a stronger push for the programs than there is for the
testing facilities that have a more general purpose. That
causes the testing facilities to ratchet up their costs,
because they have to find resources somewhere. They add more
and more overhead to the cost of doing tests. That creates a
further disincentive on the part of the services to use those
facilities, so there is a reluctance to test, which is an end
result of the process.
Unless we can find a way to ensure that adequate funding
goes into the infrastructure for this test and evaluation
function, then we cannot break out of this downward spiral, as
I see it, and I think that is what we are trying to do in this
legislation.
I do not know, the only alternative I have heard is that we
are going to do better by trying to get some resources to
these, but I did not really hear that from you, as I understood
it. Your comment was that the resources are about where they
ought to be.
Mr. Wynne. Sir, I would say that the way the President's
budget and the 5-year defense plan lays out, the resources
going into the test and evaluation line are increasing over
time. One of the comments I would make is, this addresses one
part of the test facilities. Secretary Young addressed the
other developmental tests. My partner here, Mr. Christie, also
addressed the developmental test issues. That is not covered by
this legislation, so that it would create another tension and
imbalance in, maybe, that distribution of investment as well.
Senator Bingaman. Let me ask both Mr. Christie and Mr.
Krings to just comment on whether they think an imbalance is
created by trying--as I see what we are trying to do in this
legislation, we are trying to cordon off a certain amount of
resources and say, this should go to basic infrastructure so
that these test facilities do not have to add so much overhead
to the cost of doing test for the services, so that we do not
have the disincentive on the part of the services to do the
testing.
Does that not make some sense, Mr. Christie?
Mr. Christie. Of course, and I am a big supporter, and one
of the big recommendations in that report is to emphasize more
the institutionalized funding of these facilities--and the
facilities are not just hardware or buildings, they are people
also, a big part of that. The disincentive we are talking about
is, as those dollars have gone down for these test facilities,
they have had to charge an increasingly large share to the
programs, the acquisition programs, for their testing. The
acquisition programs have had to pay for a growing share of the
overhead costs, and that, to me, is the disincentive for
testing.
Now, on the front end of that, how much money goes into
those accounts, I do not know that there is a disincentive on
the part of the services to fund those accounts. They compete
with not just the acquisition programs, but with operations and
maintenance and so forth, and yes, I want to see more money
into the institutional funding, such that the programs do not
have to pay that increasing share.
With all due respect, what is in the FYDP, I think, is
growth in the outyears, but we never get there.
Mr. Krings. Also, just to make something clear, we are
talking about the resources and facilities, not the act of
testing, or the act of evaluation. It is clearly done by the
services, but what happens--and all testing is done with these
facilities, development testing, research testing, operational
testing. It is not just operational, all testing is done there,
so all communities that test.
Interestingly enough, a lot of allies come over and test in
these facilities, because we have the best in the world, so
everybody pays for this. The key element is, though, like the
B-2 program, a significant cost in the B-2 test program was
building South Base, a hitherto classified test facility. That
is a lot of money.
So if you need something in your program and it is not
there, guess who gets to pay for it, your program, so that
takes money away from testing. You then have test problems,
which stretch out your testing, and the next thing, there is
not enough money to get the testing finished, and we have many
programs today, as we speak, that are in exactly that same
position. They have had to take money that was allocated for
testing, and use it to build infrastructure because it was
their turn, and it was not there, and they need it, and they
need to get the job done, so it is not uncommon.
Senator Bingaman. I will stop with that, Madam Chairman.
Thank you.
Senator Landrieu. Thank you. We have been joined by
Chairman Levin, and I believe he has a few questions, and we
are very happy, Mr. Chairman, that you have joined us for this
important hearing. I said when you came in we have gotten some
groundwork covered in this hearing, and there seems to be some
consensus about our legislation, but still some areas of
disagreement, and we are hoping to work through them.
Chairman Levin. Thank you very much, Madam Chairman, and
thank you for this hearing. It is a very important subject that
may seem dry or arcane or complex to a lot of people, but there
is an awful lot riding on it, and I just want to congratulate
you, Senator Bingaman and others who have worked so hard on
this issue. I know Senator Roberts has a great interest in this
issue, and hopefully we will be able to maintain the thrust of
this language and do whatever revisions are appropriate, but to
keep the thrust of what we are trying to do here.
I want to just briefly read the paragraph in the Defense
Science Board's task force on the test and evaluation
facilities, and I do not think this paragraph has been read yet
this morning, and here is what it says, and of course, Mr.
Krings is here this morning to represent the Defense Science
Board's report.
``The unwillingness of the services to provide adequate
resources for T&E, while still maintaining substantial
redundant capabilities, suggests that a change is needed. The
current funding structure of the Department's T&E facilities
does not lead to long-range business planning, and it is not
possible for them to make investment decisions based on future
utilization or a business-like return on assets analyses.
Centralized, consolidated management of T&E facilities
within the Department of Defense could overcome many of these
serious problems. A defense T&E resource enterprise evolved
from a central test and evaluation investment program will
significantly improve DOD testing by optimizing test resource
investments and streamlining the management of these vital
assets, including both personnel and facilities.''
So my question is of Mr. Christie, who was a member of that
task force, as to whether he agreed with the task force's
findings and recommendations regarding the establishment of a
T&E resource enterprise at the time the report was written.
Now, I am also going to ask you what your current view is
on it, but at the time the report was written, did you agree
with that report?
Mr. Christie. Of course. I was part of the study, and I
agreed with that.
Chairman Levin. Now, do you agree with those findings
today?
Mr. Christie. Well, I am not disavowing those findings. I
am living in a different world today, and I have to adhere to
decisions that are made in the building, which I am doing, but
I helped author that report, and certainly agree with the
findings.
Chairman Levin. Thank you.
Senator Landrieu. Thank you, Mr. Chairman, for coming. The
vote has been called, and I am going to suggest that we just
give summary remarks and then close this hearing. I think it
has been very helpful and, as you can see, there are many
members of our committee that feel strongly about acknowledging
that the status quo is just not going to do. I mean, there are
clearly some places that need significant improvement, and I do
believe that this legislation helps us to move in that
direction. If there are places that are imperfect, or some
language that we could modify to meet some of the comments made
this morning, I am open to it, but I wanted to see if Senator
Roberts had a couple of suggestions, too, and then we will try
to close up.
Senator Roberts. I was going to ask for the record--and I
am just going to make this statement, and perhaps Mr. Wynne you
can get back to me, or Mr. Young, and Mr. Christie. What would
be the impact of the proposed legislation on planned or ongoing
testing of existing programs, and the ones I picked pretty well
track what we are into in regards to transformation and the war
on terrorism, and the asymmetrical threat that we face, such
as, for example, the Air Force's joint strike fighter, the
Navy's cooperative engagement capability, the V-22 Osprey for
the Marines, and the Army's Comanche attack helicopter. What
parts of the T&E infrastructure are critical to effectively
test these programs?
[The information referred to follows:]
Mr. Wynne, Mr. Young, and Mr. Christie. The following are examples
of major Air Force, Marine Corps, and Navy programs that are under
development and the DOD test and evaluation facilities and ranges that
are being used to support the programs. Also provided are comments on
the potential impact of the Senate's proposed legislation regarding the
management and funding of the Department's test facilities and ranges.
major system--jsf
1. Major System--Air Vehicle/Air System
A. Contractor Test Facility
LM Aero Ft Worth, TX
B. Government Test Facilities
NAWC-AD Patuxent River, MD
NAWC-AD Lakehurst, NJ
Eglin AFB, FL
AFFTC Edwards AFB, CA
C. Government Test Ranges
NAWC-AD Patuxent River, MD
AFFTC Edwards AFB, CA
NAWC-WD China Lake, Pt Mugu, CA
Nellis Test and Training Range, NV
2. Major System--Propulsion
A. Contractor Test Facilities
Pratt and Whitney West Palm Beach, FL and East Hartford, CT
General Electric Evandale, OH and Peebles, OH
B. Government Test Facilities
AEDC Tullahoma, TN
NAWC-AD Patuxent River, MD
AFFTC Edwards AFB, CA
NAWC-WD China Lake, Pt. Mugu, CA
C. Government Test Ranges
NAWC-AD Patuxent River, MD
AFFTC Edwards AFB, CA
NAWC-WD China Lake, Pt Mugu, CA
3. Major System--Mission Systems*
A. Contractor Test Facilities
Northrop Grumman El Segundo, CA and Baltimore, MD
LM Aero Ft Worth, TX
BAE Systems--Sanders Nashua, NH
LMMFC Orlando, FL
Boscombe Down, UK
B. Government Test Facilities
Wright-Patterson AFB, OH
NAWC-AD Patuxent River, MD
NAWC-AD Lakehurst, NJ
AFFTC Edwards AFB, CA
Rome Labs, NY
RFSS Redstone Arsenal, AL
NWSC Crane, IN
Holloman AFB, NM
NAWC-WD China Lake, Pt Mugu, CA
C. Government Test Ranges
NAWC-AD Patuxent River, MD
AFFTC Edwards AFB, CA
NAWC-WD China Lake, Pt Mugu, CA
Nellis Test and Training Range, NV
*Mission Systems includes radar, electronic warfare suite,
distributed aperture system, electro optical targeting system,
communication, navigation and identification subsystems, cockpit
systems, and armament.
4. Legislation Impact
A funding reduction of $123 million (i.e., 0.625 percent of $19.7
billion) across the FYDP would reduce funding below OSD directed
levels, increasing risk in execution of the JSF program and potentially
resulting in schedule delays. Furthermore, funding reductions would
deviate from agreements with JSF international partners.
major system--v22
1. Major System--Air Vehicle/Air System
A. Contractor Test Facilities
Boeing Company Rotorcraft Division, Philadelphia, PA
Bell Helicopter Textron, Ft. Worth, TX
Bell Helicopter Textron, Amarillo, TX
B. Government Test Facilities
NAWCAD PAX River, MD
E3 and lightning facilities, PAX River, MD
Edwards AFB, CA (MOB)
NSWC Dahlgren, VA
Climatic Lab, Eglin AFB, FL
NASA Lewis Research Facility, OH
C. Government Test Range
Atlantic Test Range
Ft. Huachuca, AZ
MCAS Quantico, VA
MCAS New River, NC
MCAS Cherry Point, NC
MCB Twenty Nine Palms, CA
Pope AFB, NC
Ft. A.P. Hill, VA
National Guard Base, Duluth, MN (U.S. Army AQTD)
D. Foreign Government Bases and Ranges
Canadian Forces Base, Shearwater, Nova Scotia, Canada
2. Major System--Propulsion
A. Contractor test Facilities
Rolls Royce Corporation, Indianapolis, IN
B. Government Test Facilities
Naval Air Propulsion Center, Trenton, NJ
3. Major System--Mission Systems
A. Contractor Test Facilities
Boeing Company Rotorcraft Division, Philadelphia, PA
Bell Helicopter Textron, Ft. Worth, TX
B. Government Test Facilities
NAWCAD PAX River, MD
ACETEF, PAX River, MD
E3 and lightning facilities, PAX River, MD
Manned Flight Simulator, PAX River, MD
Edwards AFB, CA (MOB)
Benefield Anachoic Facility, Edwards AFB, CA
Avionics Test and Integration Complex, Edwards AFB, CA
NAWCAD Indianapolis, IN
NAWCAD, Lakehurst, NJ
Air Force Electronic Warfare Evaluation Simulator, Randolph
AFB, San Antonio, TX
Flight Training Device, New River, NC
NSWC Dahlgren, VA
Pt. Mugu, CA
C. Government Test Range
Atlantic Test Range
Nevada Testing and Training Range, NV
Utah Test Range, Hill AFB, UT
White Sands, NM
NAWC, China Lake (Echo Range), CA
Eglin AFB, FL
MCAS New River, NC
MCAS Yuma, AZ
MCAS Cherry Point, NC
Marine Corps Mountain Warfare Training Center, Bridgeport, CA
MCB Twenty Nine Palms, CA
FAA Tech Center, NJ
Ft. Sumner, NM (MOB)
Ft. Bliss, TX
Nellis AFB, NV
Eielson AFB, AK
Robins AFB, Warner Robins, GA
4. Legislation Impact
Design, development, and test for resolution of discrepancies in
the V-22 program are funded in the restructured program in accordance
with Blue Ribbon Panel recommendations. Preservation of this budget is
necessary in order to maintain the recently approved restructured
program. A reduction of RDT&E in fiscal year 2003 will necessarily
result in extending the program. There is no assurance that the
redistribution of these funds among test facilities and ranges will
directly benefit the V-22 program in such a way to mitigate the impact
of loss of funds.
major system--cooperative engagement capability (cec)
1. Contractor Test Facility
Raytheon, St. Petersburg, FL
2. Government Test Facilities
NSWC Dahlgren, Dahlgren, VA (software)
NSWC Crane, Crane, IN (hardware)
Distributed Engineering Plant (DEP) (interoperability)
3. Government Test Ranges
Atlantic Test Range, NAWC--Air Division, NAS Patuxent River, MD
Atlantic Fleet Weapons Training Facility (AFWTF), Puerto Rico
4. Legislation Impact
Post-OPEVAL, the vast majority of CEC testing will be conducted
underway in Navy Operating Areas. CEC will not be a heavy user of
Government Test Ranges. Therefore, the impact of this legislation would
be the diversion of funding from the CEC Test and Evaluation effort to
fund the Military Test Range Infrastructure. As a result less funding
would be available to test and evaluate CEC, thereby increasing the
risk to successful Milestone Decisions and potentially delivering a
less effective and suitable system to the warfighter.
Senator Roberts. I do not want you to answer that now, but
if you could get back to that it would be helpful, and I am
going to make a suggestion, since we have a vote on, that
perhaps, Mr. Wynne, you could get back to us within a 2-week
time frame on some recommendations on how you could live with
and implement the legislation that has been authored by the
chairman, and I think we all agree we support the goals without
question, and work with Mr. Christie and see if you could come
up with some legislative recommendations.
Mr. Wynne. I would be happy to do that, Senator, and in
fact what I would also offer is that we should do a study on
whether the service MRTFB, which is the major test ranges, are,
in fact, paying the operating costs, and whether the programs
when they come in are being unfairly dinged.
My assumption here is that even if I centralize all of the
facilities, if I were to have a unique requirement, such as the
B-2 range construction referred to by Mr. Krings, the program
would still be charged for that unique requirement, because the
central fund will not forecast future unique investment needs.
It just cannot, because we would not tell them in some cases.
Senator Roberts. I think that would be a very important
study, so if you can get back to us in 2 weeks, that would be
much appreciated, and I for one, Madam Chairman, thank the
witnesses for taking time. It is a busy day, it is a busy time,
I know you have other things to do, and I want to thank you for
your leadership, and more especially you, Mr. Christie, because
you have served in an advisory capacity, now you are in the
responsibility saddle, and we will look together for a good
ride.
Senator Landrieu. Thank you all.
Mr. Young. Could I add one comment to something Mr. Krings
said?
Senator Landrieu. Very quickly.
Mr. Young. There is a central test and evaluation
investment program line. It has been there for years. It is
managed and run by OSD. It is within the purview of OSD to
resource that line to modernize for the good of all programs so
I am anxious at the suggestion the services are underresourcing
everything. I think the study that Secretary Wynne talks about
will show that within a few percentage points the ranges are
resourced, and they are appropriately making investments. There
is already an existing structure not unlike the proposed
legislation for OSD to do central investment for the good of
all services.
Senator Landrieu. Well, but the problem is, without a
constituency those lines are hard to sustain themselves through
the process, and that is the system--we are trying to create a
system where there is support for a robust, not tightly
controlled, flexible, smart, robust testing system that gives
our warfighters what they deserve, and we do not have it yet.
That is the point of this hearing, to get something that will
work.
So thank you all very much.
Mr. Wynne. Thank you very much, Madam Senator. Thank you,
Senator Roberts.
Senator Landrieu. We are adjourned.
[Whereupon, at 11:15 a.m., the subcommittee adjourned.]