Rural Development: Assessment of Data Used to Support Non-Housing
Direct Loan Programs Subsidy Cost Estimates (10-APR-01, GAO-01-516R).
								 
The Department of Agriculture's Rural Development (RD) has
long-standing problems with estimating the cost of its credit programs
in accordance with the Federal Credit Reform Act of 1990 and federal
accounting standards, and these problems continue to be a major factor
preventing Agriculture from achieving an unqualified opinion on its
consolidated financial statements. This correspondence focuses on RD's
efforts to improve its credit program cost estimates for its major
non-housing direct loan programs. For RD to prepare reasonable subsidy
cost estimates, being able to draw on reliable data is an important
first step. GAO's testing determined that the data included in RD's
three loan accounting systems that are used to calculate key cash flow
assumptions for the major non-housing direct loan programs are
generally accurate. The assumptions that RD has determined to be key
for calculating the subsidy cost estimates for these programs are the
average borrower interest rate and average loan term. For one of these
programs, RD staff identified the average borrower interest rate as
the key cash flow assumption.
-------------------------Indexing Terms------------------------- 
REPORTNUM:   GAO-01-516R					        
    ACCNO:   A00812						        
    TITLE:   Rural Development: Assessment of Data Used to Support    
             Non-Housing Direct Loan Programs Subsidy Cost Estimates          
     DATE:   04/10/2001 
  SUBJECT:   Accounting standards
             Direct loans
             Federal agency accounting systems
             Government guaranteed loans
             Loan accounting systems
             Projections
             USDA Federal Financing Bank System
             USDA Program Loan Accounting Program
             USDA Rural Electrification Administration System
								 

******************************************************************
** This file contains an ASCII representation of the text of a  **
** GAO Testimony.                                               **
**                                                              **
** No attempt has been made to display graphic images, although **
** figure captions are reproduced.  Tables are included, but    **
** may not resemble those in the printed version.               **
**                                                              **
** Please see the PDF (Portable Document Format) file, when     **
** available, for a complete electronic file of the printed     **
** document's contents.                                         **
**                                                              **
******************************************************************
GAO-01-516R

Report to Congressional Requesters

United States General Accounting Office

GAO

April 2001 ARMY TRAINING Improvements Are Needed in 5-Ton Truck Driver
Training and Supervision

GAO-01-436

Page i GAO-01-436 Army Training

Contents

Letter 1

Appendix I: Objectives, Scope, and Methodology 21

Appendix II: GAO Contact and Staff Acknowledgments 23

Tables

Table 1: Percentage of Students in Formal, Informal, and Reserve
Programs Satisfied With Training 10

Figures

Figure 1: Three M939 Series Trucks 3

Figure 2: Number of Authorized, Assigned, and On-hand Instructors at
Fort Leonard Wood, January-September 2000 6

Figure 3: On-hand Instructors as a Percentage of Assigned Instructors
at Three Schools, January-September 2000 8

Figure 4: Comparison of Two Formal Army Schools, Fiscal Year 2000 9

Figure 5: Percentage of Interviewees Aware of M939 Speed Limit
Restriction 13

Figure 6: Some Recurring Conditions Cited in M939 Accident Reports,
1988-99 17

April 11, 2001

The Honorable Christopher J. Dodd
The Honorable Joseph I. Lieberman
United States Senate

The Honorable Rosa L. DeLauro
House of Representatives

In April 1997, a 5-ton M939 Army truck was involved in a fatal accident
in which two reservists died. The Army has over 30,000 of these trucks,
which are used extensively to carry personnel and pull equipment. You
asked us to report on the M939's accident history and to assess the
training and supervision received by its drivers. We broke your request
into two issues: the first dealing with the accident history and any
inherent mechanical or design defects in the truck itself, and the
second dealing with the safe handling of the truck, that is, the
training and supervision of the truck's drivers. In April 1999, we
reported on the M939's accident history and mechanical soundness. 1 For
this second report, we (1) evaluated the capacity of the Army's 5-ton
truck driver training programs to fully train drivers, (2) determined
whether oversight procedures and processes for these drivers are being
followed, and (3) determined whether and how the Army uses accident
data to improve training, supervision, and safety.

Results in Brief

The 5-ton truck driver training programs we reviewed do not graduate
drivers who are fully trained in all aspects of the instruction program
and in some tasks they may be required to perform. The main reasons for
these shortcomings are instructor shortages; limited environmental
conditions (lack of snow, ice, steep or rocky terrain, etc.) at the
training sites; and certain mission-related driving skills not being
taught. There is also an imbalance between the two formal truck driver
training schools: the larger one is understaffed for the number of
students it teaches, while the other one has smaller classes, conducts
fewer classes per year, and maintains a lower student-teacher ratio. In
addition, some communication problems hinder the flow of information to
instructors, students, supervisors, and licensed drivers.

1 Military Safety: Army M939 5-Ton Truck Accident History and Planned
Modifications (GAO/NSIAD-99-82, Apr. 9, 1999).

United States General Accounting Office Washington, DC 20548

Some supervisory procedures and processes designed to ensure that 5-ton
trucks are operated safely are not being performed or documented as
required. In particular, required annual “check rides” and “sustainment
training” are either not properly performed or not recorded. We
reviewed over 450 driver records and found that more than
three-quarters of them did not contain a required entry indicating that
the driver had received an annual check ride and/or sustainment
training as stipulated in Army regulations.

The Army Safety Center maintains an accident database that has already
proven useful in developing some policies aimed at improving the safe
operation of M939 trucks. We analyzed M939 accident data from 1988
through 1999 and found trends that we believe could be used to improve
driving safety and to better focus training on problem areas. But the
database is not being periodically analyzed for these purposes, and
opportunities are thus being missed. Also, some accident reports have
missing information, limiting the usefulness of the database for
analyses that rely on those fields.

We are making recommendations aimed at improving the quality of truck
driver training, increasing compliance with Army regulations, and
increasing the safety of M939 truck operations. The Department of
Defense concurred with all our recommendations.

Background

The Army has around 97,000 “medium tactical wheeled vehicles” (about
57,000 5-ton trucks and 41,000 2-1/2-ton trucks) in its fleet. The M939
accounts for more than half of its 5-ton trucks. The truck is used to
carry personnel or pull equipment under all weather and road
conditions, including rain, snow, ice, unpaved roads, sand, and mud
(see fig. 1).

Figure 1: Three M939 Series Trucks

Source: Shane G. Deemer.

The active Army uses formal and informal programs to train 5-ton truck
drivers. 2 The formal program is aimed at military personnel whose
official primary occupation will be “88M Motor Transport Operator,” or
truck driver. The program lasts 6 weeks and is taught in schools at
Fort Leonard Wood, Missouri, and Fort Bliss, Texas. Fort Leonard Wood
trains about 90 percent of all 88M students; Fort Bliss for the most
part trains the “overflow” of students that Fort Leonard Wood cannot
accommodate. The formal instruction program calls for about 1 week in
the classroom and 5 weeks of hands-on training. Students who complete
the program do not immediately receive a license to drive a 5-ton
truck; they are licensed at their next duty station after undergoing
additional training and testing there. The Army Transportation Center
and School at Fort Eustis, Virginia, is responsible for the content of
the instruction program used by the formal training schools. It aligns
under the Army Training and Doctrine Command at Fort Monroe, Virginia.

According to Army officials, informal programs are taught at
installations or units that need occasional 3 truck drivers but are not
authorized any or enough 88M drivers to handle their needs. Occasional
drivers do not drive trucks as their primary occupation; they do so on
a part-time or as-needed basis. Informal programs are usually 40 to 120
hours long and combine classroom and driving time. Graduates are not
automatically licensed and must usually meet additional driving and
testing requirements set by their units. Occasional drivers receive the
same license as 88M drivers and, accordingly, may be required to
perform the same driving maneuvers.

The Army Reserve trains both Reserve and National Guard 88M drivers
using a two-part program that contains the same instructional material
as the formal program. The first part (81 hours) is conducted at the
soldier's home station during weekend drills. The second part (120
hours) is usually conducted at a Reserve training center during a
2-week active duty session. Like active Army truck drivers, program
graduates must undergo additional training and testing by their units
before being licensed.

2 We define formal programs as resident training programs taught in a
schoolhouse setting, and informal programs as those conducted by
individual Army units at many different installations.

3 We define an occasional driver as a driver licensed to operate a
5-ton truck but not possessing the military occupational specialty
designator of 88M – Motor Transport Operator.

Some Essential Driving Skills Are Not Taught

Graduates of the Army's truck driver training programs are not skilled
enough to safely handle 5-ton trucks in some situations for which they
should have received training. This is because of instructor shortages
and limited training conditions. Graduates are either partially trained
or untrained in some skills found in the instruction program. In
addition, the schools do not teach driving skills that are essential to
performing the 5-ton truck's primary mission.

Instructor Shortages

One of the Army's two formal truck driver training schools, the school
at Fort Leonard Wood, Missouri, operates with sizable instructor
shortages. Because of this, Fort Leonard Wood operates at a higher
student-instructor ratio than called for in the instruction program. In
fiscal year 2000, the Fort Leonard Wood facility trained nearly 90
percent of the Army's 88M drivers in spite of these shortages.
Instructors at the informal and Reserve programs also said that their
programs suffer from instructor shortages.

During the first 9 months of 2000, Fort Leonard Wood operated with an
average of 53 percent of its authorized instructors on-hand to teach
the program. The main reasons were that (1) fewer personnel were
assigned to teach than were authorized and (2) even fewer were
available (on-hand) than were assigned because of other commitments
(such as bus driving, funeral and parade duty, and leave). Authorized
refers to the number of instructors the Army determines are needed to
teach a program; assigned refers to the number of instructors the Army
allocates to teach a program; and on-hand refers to the number of
instructors that are present and teaching a program. Figure 2 shows the
number of instructors authorized, assigned, and on-hand at Fort Leonard
Wood in the first 9 months of 2000, when on average about 45 of 84
authorized instructors were available.


Figure 2: Number of Authorized, Assigned, and On-hand Instructors at
Fort Leonard Wood, January-September 2000

Source: Our analysis of Army data.

Assuming that (1) the Army continues assigning instructors at about 85
percent of authorized levels and (2) the number of instructors on-hand
remains constant at about 53 percent of those authorized, the Army
would have to increase its present authorized level of instructors from
84 to 158, an increase of 88 percent, in order to have a full
complement on-hand.
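The staffing arithmetic above can be sanity-checked with a quick
back-of-the-envelope sketch. This uses the report's rounded 53-percent
on-hand rate (about 45 of 84 authorized instructors); GAO's own inputs
were presumably unrounded, so small differences are possible.

```python
# Back-of-the-envelope check of the instructor shortfall arithmetic.
# The 53-percent rate is the report's rounded figure (about 45 of 84
# authorized instructors on-hand); exact GAO inputs are not published.
AUTHORIZED = 84
ON_HAND_SHARE = 0.53  # share of authorized instructors actually on-hand

# Authorized level needed so that a full complement (84) is on-hand
required = round(AUTHORIZED / ON_HAND_SHARE)
increase_pct = round(100 * (required - AUTHORIZED) / AUTHORIZED)

print(required, increase_pct)  # matches the report's 158 and 88 percent
```

This reproduces the report's figures: raising the authorized level to
about 158 (an 88-percent increase) would leave roughly 84 on-hand.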

The formal instruction program calls for a 6-to-1 student-instructor
ratio, and Fort Leonard Wood is structured to operate at this ratio
when staffed at 100 percent of its authorized level. In the first 9
months of 2000, our review showed that Fort Leonard Wood operated
overall at a higher ratio of about 9 to 1. Nonetheless, training
officials stated that the school has been conducting the
behind-the-wheel (hands-on) training portion of the program at the
6-to-1 ratio the instruction program calls for. This means one
instructor overseeing three trucks with two students per truck.
However, Army regulations stipulate a 1-to-1 truck-instructor ratio
when a student driver is behind the wheel. In December 1998, Fort
Leonard Wood requested a waiver to allow the 6-to-1 ratio when students
were driving trucks. While the request has yet to be officially
approved, school officials claim that if required to maintain the
1-to-1 ratio, each student might drive as little as 30 miles during the
entire course, instead of the present target of about 100 miles per
student on average.

Effects of Instructor Shortages

Instructor shortages affect the quantity and quality of training.
Students do not get sufficient hands-on driving experience and are not
trained in all the skills required by the instruction program.

Program officials at Fort Leonard Wood said that at times, instructors
could fully teach only about three-quarters of the instruction
program's required tasks. For example, in the second half of fiscal
year 1999, two training modules (driving off-road and basic vehicle
control) were often carried out only in part or demonstrated but not
practiced. These two modules account for almost 93 percent of the 85.5
hours students are supposed to spend driving trucks. Because of
instructor shortages during these two quarters, the average number of
miles driven by each student at Fort Leonard Wood dropped from nearly
100 to less than 50. In addition, hands-on training is presently
limited mostly to driving in controlled settings. Students drive in
convoys on unpaved but graded and regularly maintained training routes
at no more than 25 mph, receiving almost no training in how to drive on
public highways or in suburban settings. One group of trainers stated
that with more instructors, they could take students on more realistic
training rather than the “follow-the-leader” driving students now
receive.

Students are also not being taught all the tasks that 5-ton truck
drivers are expected to perform. Training officials at the two formal
programs stated they thought drivers should be trained in hauling loads
or pulling equipment, the primary mission of 5-ton trucks. While the
instruction program calls for 20 percent of all vehicles to operate
with a load in the cargo area, this is not being done, according to
training officials, because of logistical problems that make this skill
difficult to train. Pulling equipment is not taught because it is not
specified in the instruction program. Therefore, students must learn
these essential skills after graduation and rotation to their next duty
stations.

Comparison of On-hand Instructors

Neither the Marine Corps, which co-trains its 5-ton truck drivers with
the Army at Fort Leonard Wood, nor the smaller Fort Bliss school, which
mostly trains the overflow from Fort Leonard Wood, experiences
instructor shortages as severe as Fort Leonard Wood's. Thus, neither
encounters problems teaching the instruction program in its entirety.
According to Marine Corps training officials, its detachment is
authorized 76 instructors and, in the first 9 months of 2000, averaged
70 instructors assigned and 65 on-hand (93 percent). During that same
period, Fort Bliss training officials stated their school was
authorized 17 instructors but actually had 18 assigned and on-hand (106
percent).

During the first 9 months of 2000, the Marine Corps program averaged a
higher percentage of its assigned instructors on-hand than the Fort
Leonard Wood Army program – 93 percent versus 63 percent (see fig. 3).
This, according to Marine Corps training officials, was mostly because
their instructors did not have other commitments or assignments as Army
instructors did. Also, the average class size for the Marine Corps was
much smaller than that for the Army (44 versus 70 students), and the
Marine Corps had more instructors available to teach (65 on average
versus the Army's 45). Because of the smaller class size and larger
number of on-hand instructors, the Marine Corps can staff each truck at
the 1-to-1 instructor-to-truck ratio regulations call for. This,
according to Marine Corps officials, allows students to gain driving
skills in uncontrolled settings such as driving off-post, on public
highways, and in various urban settings.

On the other hand, the Fort Bliss school actually had a surplus of
instructors: it had 106 percent of its assigned instructors on-hand
(see fig. 3). According to program officials, their instructors also
did not have other commitments and assignments as Fort Leonard Wood
Army instructors did.

Figure 3: On-hand Instructors as a Percentage of Assigned Instructors
at Three Schools, January-September 2000

Source: Our analysis of Army data.


Imbalances Between the Two Formal Army Schools

During fiscal year 2000, Fort Bliss also graduated fewer students,
utilized less of its overall available classroom capacity, averaged
smaller class sizes, and conducted about one-third the classes that
Fort Leonard Wood conducted (see fig. 4).

Figure 4: Comparison of Two Formal Army Schools, Fiscal Year 2000

Source: Our analysis of Army data.

Student Opinions Show Varied Satisfaction With Training Received

We surveyed 139 students at the two formal school programs, 72 students
at 10 informal programs, and 98 students at 1 Army Reserve training
program. We asked them to rate their satisfaction with the training
they were receiving in various driving techniques and conditions. As
table 1 shows, students at Fort Bliss felt better about the training
they received in many driving skills than their counterparts at Fort
Leonard Wood. Students in the Reserve program were the most satisfied
overall with the training they received, while students in the informal
programs were generally the least satisfied.


Table 1: Percentage of Students in Formal, Informal, and Reserve
Programs Satisfied With Training

                                          Fort    Fort Leonard
                                          Bliss   Wood          Informal  Reserve
Backing empty truck                         92      55              69       87
Overall wheel time                         100      64              55       80
Small inclines/slopes – empty truck         38      66              15       50
Wheel time in different weather/
  surfaces – empty truck                     8       7               9       57
Load in cargo area/pulling equipment a     N/A     N/A               2        4
Driving at night                             0      23               9       32

a N/A (not applicable). The formal schools do not teach driving with a
load, and the instruction program does not call for training while
pulling equipment.
Source: Our analysis of survey responses.

Environmental Limitations at Formal Training Schools

According to the instruction program, the majority of driving training
time (about 65 hours) should be dedicated to driving on and off roads
through woods, streams, brush, sand, mud, snow, ice, rocky terrain,
ditches, gullies, and ravines. However, we found that neither of the
two formal schools provides all these conditions in its training
routes.

Students at Fort Bliss are well trained to drive in sand because the
school's training routes have sand. But the school seldom trains in
snow or ice because these conditions seldom occur there. And the
school's training routes we observed were for the most part flat and
unchallenging. One route we drove offered few or no opportunities to
drive through woods and brush, over rocky terrain, or through gullies
and ravines. The problem, according to school officials, is that the
land the training routes are on is too flat and lacking in undergrowth.
Training officials also told us that money constraints and the fact
that Fort Bliss' mission is to handle the overflow of students from
Fort Leonard Wood impede the development of more challenging driving
routes.

Training routes at Fort Leonard Wood also offered limited obstacles or
challenges. We drove what school officials said was the most difficult
training route and found that it did go through some woods and rocky
terrain and over some hills and inclines. However, it contained no
sand, and engineering units maintained the surface the trucks drove on
by routinely smoothing out bumps, ruts, and other obstacles.

Simulators Can Be Useful Training Tools

When adverse weather, dangerous road conditions, or other problems
arise, the formal schools hesitate to allow students to drive because
of safety concerns. However, the Army has determined that simulators
can be used to teach some driving skills that are too dangerous to
teach in actual high-risk driving conditions.

Because of safety concerns, the Fort Leonard Wood command has issued an
oral directive prohibiting students from driving off the installation.
As a result, students do not learn to drive trucks in traffic at
highway speeds or in urban settings. Furthermore, the training command
frequently cancels hands-on driver training in the presence of ice,
snow, or fog because it believes the risk of student drivers having a
serious accident outweighs the benefits of the driving experience.

Not training under adverse weather and road conditions limits the
ability of drivers to handle a truck safely in these situations when
they rotate to their new duty stations and begin to drive. In May 2000,
the Analysis Center at the Army Training and Doctrine Command completed
a study that concluded, among other things, that students graduating
from the formal schools were only about 15 percent proficient 4 in
skills needed to drive in fog, ice, or snow and 27 percent proficient
in skills needed to drive on sand.

The study concluded that simulators could overcome these and other
shortcomings in driver training. It reviewed 31 critical driving tasks
taught at the formal schools and concluded that simulators could help
students reach higher proficiency levels in as many as 22 of them. The
study also concluded that simulators might help reduce the potential
for accidents both during training and, most importantly, during the
first year after training by increasing driving proficiency in fog,
snow, or ice.

4 The study defined proficiency as how well a school graduate performed
a specific driving task when compared to a driver with 1 year of
post-training experience, as assessed by qualified instructors.


Formal training program personnel agreed, stating that they cannot
teach students to drive under some of the more common hazardous
conditions 5 because it is too dangerous. Other Army officials also
said that simulators, especially more advanced ones, can recreate such
situations and give students a sense of driving under these conditions
without putting lives at risk. Training personnel at both formal
schools, Army Transportation School officials, and the simulator study
itself strongly cautioned, however, that simulators should not replace
actual behind-the-wheel driving time.

The private sector uses simulators in its truck driving schools and
considers them very useful. Officials at two commercial driving schools
stated that their simulators help students learn to drive under various
high-risk driving and weather conditions, including braking with a load
on steep inclines or on wet and icy surfaces.

Some Information Not Reaching Its Target

Some safety rules relating to M939 trucks are not being communicated
effectively. Moreover, many informal training programs seem to be
unaware of available assistance from the Army Transportation School.
Better communication is key to improving the flow of this type of
information.

The M939 series trucks are not supposed to be driven over 40 mph, even
under ideal conditions. However, we found that some licensed drivers,
students, instructors, and supervisors alike were either unaware of the
speed limit, had forgotten about it, or did not know this restriction
is still in effect for M939s without anti-lock brake systems. 6
Two-thirds of licensed drivers we interviewed, as well as about
one-third of student drivers in formal training programs and over
two-thirds of student drivers in informal training programs, did not
know or could not recall the 40-mph limit. And none in a group we
interviewed from a recently graduated formal program class were able to
tell us the correct maximum speed limit. Although nearly all the 65
formal and Reserve program instructors we interviewed could state the
correct speed limit, only about two-thirds of informal program
instructors and driver supervisors could do so. By contrast, all of the
nearly 100 students we interviewed at the Army Reserve training program
knew of the speed limit, and for a simple reason: all the M939 trucks
used for training had a dashboard sticker to remind the driver of the
speed limit. (See fig. 5.)

5 In 1995 the Army Deputy Director of Safety concluded that M939 trucks
were involved in a disproportionately high number of accidents in which
a panic stop on a wet surface with a partially loaded truck going over
40 mph was among the contributing factors.

6 For all M939s that have been outfitted with anti-lock brake kits, the
40-mph speed limit restriction is no longer in effect. However, we did
not observe an M939 so equipped during our on-site visits.

Figure 5: Percentage of Interviewees Aware of M939 Speed Limit Restriction

Source: Our analysis of interview responses.

There also appears to be a communication problem between informal
program instructors and the Army Transportation School. Although the
instructors believe their training programs are good ones, they also
stated they do not have enough time to focus on improving and upgrading
these programs and would like more input from “knowledgeable
personnel,” such as those at the Fort Eustis Transportation School who
developed the formal training program. Some said they could have
avoided difficulties they encountered in developing a high-quality
informal program if such expertise had been available. Many suggested
that standardized, Army-wide training packages tailored for each type
of vehicle would be an efficient and economical way of training
informal drivers.

However, none of the instructors we interviewed knew that the
Transportation School has a program available designed specifically for
informal training of M939 drivers. In November 1999, the Transportation
School distributed a CD-ROM driver training program, 7 which includes
lessons on driving and performing operator maintenance on the M939 to
Army standards. Transportation School officials stated that the program
was sent to around 1,800 different Army locations (according to the
number and location of M939 trucks) and is also available through the
Army's web site.

7 Army Model Drivers Training Program M939, 5-Ton Tactical Cargo Truck.

Additional Challenges Facing Informal and Reserve Training Programs

While facing similar instructor shortages and limited driving
conditions, the informal and Reserve training programs we reviewed must
also train drivers in a shorter time than the formal programs. The
reserves also have problems with their equipment.

The 10 informal programs we reviewed ranged from 40 to 120 hours
(compared with 6 weeks for the formal program). As a result,
instructors focus mostly on teaching the basics: driving on surfaced
roads, backing up on flat surfaces, and performing some required
maintenance and service. Instructors teach more difficult skills only
if time and circumstances allow. Several instructors questioned how
their 40- to 80-hour programs could possibly teach as much as was
taught in the 6-week formal course.

The reserves have problems not only with instructor shortages but also
with training equipment. Reserve officials said their 5-ton truck
driver training programs are generally understaffed because of a lack
of senior noncommissioned officers available to teach. Also, because
programs are usually not authorized a fleet of trucks exclusively for
training, units must borrow trucks from the installation where training
is taking place or from other nearby Army installations. The training
unit is responsible for picking up and returning the trucks or for
paying to have the trucks delivered and returned. It also pays an
established usage fee to the units that lend the trucks. This is
costly, especially if a borrowed vehicle needs repair work before it
can pass the safety inspection required for use in training. Reserve
training officials told us that this happens frequently and adversely
affects training.


Army regulations 8 require that truck drivers undergo a so- called
“check

ride” and “sustainment training 9 ” once a year (once
every 2 years for the Army Reserve and National Guard). Performing these
procedures- which are aimed at identifying and correcting poor driving
habits, maintaining high driving proficiency levels, and ensuring safe
driving- is the responsibility of the driver's assigned unit. Both
procedures must also be documented in personnel driving records. However, we
found that they are either not being performed or are not being recorded as
required.

We reviewed over 450 driving records and found that over 80 percent did not
contain an entry indicating a check ride had been performed every year and
for each type of vehicle in which the driver was licensed to drive. Eighty-
five percent of records also did not have an entry documenting that
sustainment training had been given annually as required. Seventy percent of
the drivers we interviewed (both 88M drivers and occasional drivers) stated
they either did not know what a check ride was or had not been given one
annually. Three- quarters of the drivers we interviewed also said they had
not attended an annual sustainment training course.

Supervisors10 are responsible for administering check rides to assess a driver's capabilities and overall driving habits. According to Army officials, unit commanders and supervisors must also develop and implement annual sustainment training programs based, in part, on the results of check rides. A number of supervisors told us that they do not always conduct formal check rides because of personnel shortages and high operating tempo; rather, they try to assess drivers' skills and give corrective guidance, a sort of “informal” check ride, whenever they ride with a driver. None of them knew about the Transportation School's informal driver training program, which includes guidelines for sustainment training.

8 Army Regulation 600-55, Army Driver and Operator Standardization Program (Selection, Training, Testing, and Licensing).

9 Instruction and practice to ensure that mastery of specific skills is maintained.

10 Supervisors are those in the driver's immediate chain of command who oversee and direct the driver's day-to-day activities.

Some Supervisory Procedures Are Not Being Performed or Documented


The Army Safety Center maintains a ground accident database11 that has been used in the past to identify accident anomalies that in turn led to safety improvements involving the operation of M939 series 5-ton trucks. The database, however, is not complete because not all data fields in accident investigation reports are always filled in. The database is also not being analyzed on a regular basis to identify trends or recurring problems.

One of the purposes of the ground database is to provide demographic information that can be used for statistical comparisons. The Army Safety Center did so in 1998 when it compared accident rates of different Army trucks12 and found that the M939 series trucks had a much higher serious accident rate than other similar trucks. In other, earlier studies, the Center reviewed M939 accident data and found a series of recurring accident conditions. On the basis of these studies, the Army Tank-automotive and Armaments Command in December 1992 issued the first of several Army-wide messages13 warning of these problems and imposing the 40-mph speed limit on the M939. Also on the basis of these studies, the Command conducted additional studies on the M939, which in turn led to an estimated $122.4 million in recommended design modifications.14

We analyzed nearly 400 M939 accident reports dating from 1988 through 1999 contained in the Safety Center's database and found that four of the 36 data fields of information we requested for our analysis were often not filled in. Safety Center personnel acknowledged that the missing data could weaken any conclusions reached using these fields. Two fields, “Was the Driver Licensed at the Time of the Accident” and “What Was the Driver's Total Accumulated Army Motor Vehicle Mileage,” contained no information 45 and 50 percent of the time, respectively, and because of this could not be included in the analyses we performed. Two other fields, “What Was the Mistake Made” and “Why Was the Mistake Made,” were also often left blank.

11 Army regulations require that an accident investigation report be filled
out for all class A through C occupational injury accidents and all class A
through D property damage accidents. The classes denote the severity of the
accident, with “A” as the most serious or costly.

12 The M34/35 trucks, the M939 trucks, and the Family of Medium Tactical Vehicle trucks.

13 The Army uses Ground Precautionary Messages and Safety of Use Messages to disseminate service-wide safety information.

14 See Military Safety: Army M939 5-Ton Truck Accident History and Planned Modifications (GAO/NSIAD-99-82, Apr. 9, 1999).

Accident Database Not Used Effectively


Our analysis also revealed patterns that, if studied further, might be useful in improving training programs. For example, many of the reported accidents occurred on wet or slippery surfaces or when the truck was hauling cargo or pulling equipment. Furthermore, three-quarters of the accidents involved occasional drivers (those trained at informal schools). Some patterns we identified are illustrated in figure 6.

Figure 6: Some Recurring Conditions Cited in M939 Accident Reports, 1988-99

Source: Our analysis of Army Safety Center ground accident database.

Conclusions

Instructor shortages are affecting the quality and quantity of truck driver training, especially at Fort Leonard Wood. The end result is that student drivers are not fully trained in all aspects of the instruction program when they graduate. This places an additional burden on the drivers' assigned units, which must further train these drivers, and on supervisors, who must be more vigilant in identifying drivers' shortcomings. If the formal schools had enough instructors on hand, they would presumably be able to teach the entire instruction program.

The student imbalance between the schools at Fort Leonard Wood, which is understaffed, and Fort Bliss, which has smaller class sizes and a lower student-instructor ratio, is an ineffective use of resources. This imbalance places an unnecessarily heavy burden on Fort Leonard Wood. If the annual student load were more equally distributed between the two schools, student graduates from Fort Leonard Wood might receive more complete training.


The formal schools are not adhering to the instruction program, which calls
for some training with trucks carrying cargo. Further, no training is
provided in how to pull equipment. With a high percentage of M939 accidents
taking place under these two conditions, the formal schools should provide
some training in these areas.

Similarly, students are not being trained to drive under different weather and surface conditions. While it is understandable that formal schools hesitate to take the risk of having students drive under hazardous or high-risk conditions, it is also necessary that students receive such training. An Army study concluded that simulators can provide an effective means of safely training drivers in high-risk weather and different road-surface situations.

Because annual check rides and sustainment training are not always being
performed, unsafe driving habits may go undetected. Further, if corrective
oversight or training is not recorded, unit commanders and supervisors
cannot know which drivers need attention. Although performing and recording
check rides and sustainment training may be time-consuming, these
procedures can save lives.

Some important safety information, such as M939 speed limit restrictions, is
not always being passed on to or remembered by drivers, supervisors, and
trainers. Using inexpensive devices, such as dashboard stickers, is a simple
way to remind these personnel of the speed restrictions.

The Safety Center's accident database could be used to identify trends that
may show the need for greater training emphasis in certain driving
maneuvers. A periodic analysis of the database could assist school
officials, instructors, and supervisors in adjusting instruction programs or
mentoring drivers. However, such analysis would prove more useful if all
fields of information contained in the database were complete.

Recommendations

We recommend that the Secretary of the Army direct the Commander of the Training and Doctrine Command to

• review and modify, as needed, instructor levels for the formal training programs to ensure that the programs are adequately staffed to teach the anticipated class size;

• balance the student load between the two schools by bringing the Fort Bliss school up to fuller capacity and/or increasing the number of classes annually taught there, thereby reducing the student load and associated problems at Fort Leonard Wood;

• enforce the instruction program used by the two formal schools to ensure that students receive hands-on training in driving trucks loaded with cargo, and also modify the program to include driving when pulling equipment, two essential skills in performing the primary mission of the 5-ton tactical fleet; and

• consider using simulators at the two formal schools to safely teach known training shortfalls, such as driving under hazardous conditions, with the understanding that simulators not be used to replace hands-on driving conducted under less risky conditions.

We also recommend that the Secretary of the Army issue instructions to all applicable major Army commands to

• require adherence to Army regulations on check rides and sustainment training of licensed truck drivers and

• require that warning stickers indicating speed restrictions be prominently displayed in the cabs of all M939 trucks not equipped with anti-skid brake systems.

We further recommend that the Secretary of the Army direct the Commander of the Army Safety Center to

• ensure that all information fields in accident reports are properly filled in and

• periodically review accident data for trends or anomalies, for the purpose of informing trainers and supervisors of any information that may help them perform their duties or improve safety.

Agency Comments and Our Review

In oral comments on a draft of this report, Department of Defense officials concurred with all of our recommendations.

We are providing copies of this report to the Honorable Donald H. Rumsfeld, Secretary of Defense; the Honorable Joseph W. Westphal, Ph.D., Acting Secretary of the Army; and interested congressional committees. Copies will also be made available to other interested parties upon request.


If you or your staff have questions concerning this report, please call me at (202) 512-5559. Our scope and methodology are explained in appendix I. GAO contacts and staff acknowledgments for this report are listed in appendix II.

Derek B. Stewart
Director, Defense Capabilities and Management

Appendix I: Objectives, Scope, and Methodology


Our objectives were to (1) evaluate the capacity of the Army's 5-ton truck driver training programs to fully train drivers, (2) determine whether oversight procedures and processes for these drivers are being followed, and (3) determine whether and how the Army uses accident data to improve training, supervision, and safety.

To evaluate the capacity of the Army's 5-ton truck driver training programs to fully train drivers, we reviewed applicable training programs in terms of compliance and completeness at both of the Army's formal schools (Fort Leonard Wood and Fort Bliss) and at 10 different informal training facilities located at 4 installations. We also reviewed the training provided at one of eight Army Reserve training centers; Reserve training centers all use the same program of instruction. We reviewed these programs for compliance with existing regulations and standard operating procedures established by the various training components. To assess the completeness of training, we made observations and collected documentation relating to the actual training being conducted and compared that documentation with the training specified in each school's or program's instruction program and with the primary mission of the 5-ton truck fleet. We also discussed these issues with officials responsible for designing the training programs, training command personnel, driving instructors, and student drivers to gain their perspectives. Lastly, we compared the formal Marine Corps 5-ton training program and two commercial sector training programs with the Army's formal program to identify any training techniques and/or devices that might benefit 5-ton training curriculums.

To determine whether oversight procedures and processes for these drivers
are being followed, we documented the duties of supervisors of medium
tactical vehicles as found in Department of Defense and Army guidance,
instructions, procedures, and regulations. Through observations and
discussions with nearly 80 driver supervisors and nearly 200 truck drivers
stationed at 12 different Army and National Guard units, we then assessed
the degree to which they accomplished these responsibilities or followed
required documentary procedures. In addition, at the units visited we
collected over 450 historical driving records for truck operators and
reviewed them for required annual supervisory annotations relating to check
ride and sustainment training specified in Army regulations.

To ensure that we collected information representative of the universe of existing 5-ton truck informal training programs and the administering of driver supervision responsibilities, we selected, for review and observation purposes, four installations aligned under the U.S. Army Forces Command. This major command, according to the Army Materiel Command's Logistic Support Activity, controls 94 percent of the active Army's M939 series 5-ton trucks in the continental United States. Because Army automated record-keeping systems cannot provide 5-ton truck densities or locations below the major command level, we engaged the services of Army Internal Review personnel to assist us. Within the four installations, we requested that Internal Review personnel set up meetings with subordinate commands conducting the majority of 5-ton truck driver training and with commands maintaining the largest concentrations of 5-ton trucks and/or drivers.

In discussing accident data with Army Safety Center personnel, we learned of Army notifications currently in effect and relevant to the safe handling of 5-ton trucks that resulted from past analyses performed on the Center's ground accident database. We reviewed these notifications, including existing Army regulations and procedures pertaining to how this information is to be disseminated Army-wide. We then queried 5-ton truck driver-trainers, student drivers, supervisors, and licensed drivers to gain an understanding of how knowledgeable they were about the restrictions imposed by these notifications.

To determine whether and how the Army uses accident data to improve training, supervision, and safety, we interviewed Safety Center personnel and obtained and reviewed past studies and analyses conducted by the Center. In addition to identifying data that could be useful in improving training or supervision, we analyzed 12 years of demographic accident information pertaining to M939 series 5-ton tactical cargo trucks. Our analysis of this information, compiled for us by Army Safety Center personnel, included class A, B, and C accidents occurring from January 1988 through December 1999 for which some degree of fault was attributable to an M939 driver. This truck series accounts for about one-half of the Army's 5-ton fleet and is the series specifically mentioned in the request letter. We focused on identifying the presence of any demographic anomalies or commonality factors that, when compiled statistically, might prove beneficial to trainers, supervisors, or the safer operation of M939 series trucks. We also discussed the results of our accident analysis with Army Safety Center officials, trainers, and supervisors to obtain their input and/or concurrence.

We performed our work from May 1999 through July 2000 in accordance with
generally accepted government auditing standards.

Appendix II: GAO Contact and Staff Acknowledgments


GAO Contact

Reginald L. Furr, Jr., (202) 512-5426

Acknowledgments

In addition to those named above, Aisha A. Mahmood, Stefano Petrucci, William R. Simerl, Lorelei St. James, and Gerald L. Winterlin made key contributions to this report.

(702001)

Ordering Information

The first copy of each GAO report is free. Additional copies of reports are $2 each. A check or money order should be made out to the Superintendent of Documents. VISA and MasterCard credit cards are also accepted.

Orders for 100 or more copies to be mailed to a single address are discounted 25 percent.

Orders by mail:

U.S. General Accounting Office
P.O. Box 37050
Washington, DC 20013

Orders by visiting:

Room 1100
700 4th St., NW (corner of 4th and G Sts. NW)
Washington, DC 20013

Orders by phone:

(202) 512-6000
fax: (202) 512-6061
TDD: (202) 512-2537

Each day, GAO issues a list of newly available reports and testimony. To receive facsimile copies of the daily list or any list from the past 30 days, please call (202) 512-6000 using a touchtone phone. A recorded menu will provide information on how to obtain these lists.

Orders by Internet:

For information on how to access GAO reports on the Internet, send an e-mail message with “info” in the body to info@www.gao.gov or visit GAO's World Wide Web home page at http://www.gao.gov.

To Report Fraud, Waste, and Abuse in Federal Programs

Contact one:

• Web site: http://www.gao.gov/fraudnet/fraudnet.htm
• E-mail: fraudnet@gao.gov
• 1-800-424-5454 (automated answering system)

A Report to the Chairman, Committee on the Budget, House of Representatives

April 2001

ENVIRONMENTAL LIABILITIES

DOD Training Range Cleanup Cost Estimates Are Likely Understated

GAO-01-479

GAO United States General Accounting Office

Page 1 GAO-01-479 Training Range Cleanup

Contents

Letter 3

Appendix I: Objectives, Scope, and Methodology 26
Appendix II: Comments From the Department of Defense 28
Appendix III: GAO Contact and Staff Acknowledgments 34

Figures

Figure 1: Signs Warning of the Dangers and Presence of Unexploded Ordnance at Fort McClellan 11
Figure 2: Examples of Unexploded Ordnance Found on Training Ranges 12

Abbreviations

CERCLA   Comprehensive Environmental Response, Compensation, and Liability Act
DOD      Department of Defense
DOE      Department of Energy
DUSD(ES) Deputy Under Secretary of Defense for Environmental Security
EPA      Environmental Protection Agency
SARA     Superfund Amendments and Reauthorization Act
SFFAS    Statement of Federal Financial Accounting Standards


April 11, 2001

The Honorable Jim Nussle
Chairman, Committee on the Budget
House of Representatives

Dear Mr. Chairman:

The previous chairman of your Committee expressed concern about the long-term budgetary implications associated with the environmental cleanup of the Department of Defense's (DOD) training ranges. The chairman requested that we review (1) the potential magnitude of the cost to clean up these ranges in compliance with applicable laws and regulations, (2) the scope and reliability of DOD's training range inventory, and (3) the methodologies used to develop cost estimates. This report conveys the results of that review. He also requested a similar review of certain other DOD property that has associated environmental cleanup and disposal costs, on which we will issue a separate report at a later date. This report focuses on DOD's efforts to collect, analyze, and report information on its training ranges and the potential cleanup costs1 of

1 Federal accounting standards define environmental cleanup costs as the cost of removing, containing, and/or disposing of (1) hazardous waste from property or (2) material and/or property that consists of hazardous waste at permanent or temporary closure or shutdown of associated property, plant, and equipment. Hazardous waste is a solid, liquid, or gaseous waste, or combination of these wastes, which because of its quantity, concentration, or physical, chemical, or infectious characteristics may cause or significantly contribute to an increase in mortality or an increase in serious irreversible, or incapacitating reversible, illness or pose a substantial present or potential hazard to human health or the environment when improperly treated, stored, transported, disposed of, or otherwise managed. Cleanup may include, but is not limited to, decontamination, decommissioning, site restoration, site monitoring, closure, and post-closure costs.

United States General Accounting Office
Washington, DC 20548


unexploded ordnance2 or other constituent contamination3 on these training ranges.4

DOD has estimated that millions of acres of training ranges in the United States and its territories are contaminated with unexploded ordnance that could potentially harm the public and the environment if not properly managed or cleaned up. With increased DOD downsizing and the resulting base closures in recent years, large numbers of military properties are being turned over to non-DOD ownership and control. Although DOD has procedures to mitigate the risk to human health and the environment, the transfer of ownership puts the public at greater risk of sickness, injury, or even death from unexploded ordnance or its constituent contamination. DOD is subject to various laws that govern remediation of contamination on military installations and to standards establishing requirements for DOD to recognize and report the costs of managing and cleaning up these properties.

We conducted our work in accordance with generally accepted government
auditing standards from May 2000 through March 2001. Further details on our
scope and methodology are in appendix I.

Results in Brief

DOD does not have the complete and accurate data needed to estimate training range cleanup costs. The two primary elements needed to develop these costs are (1) an accurate and complete training range inventory and (2) a consistent cost methodology. Because DOD does not have a complete inventory and has not used a consistent cost methodology, the amounts reported for training range cleanup cannot be relied upon and are

2 Unexploded ordnance are munitions that have been primed, fused, armed, or otherwise prepared for action and have been fired, dropped, launched, projected, or placed in such a manner as to constitute a hazard to operations, installations, personnel, or material, and that remain unexploded by malfunction, design, or any other cause.

3 Military munitions may contain many constituents that can pollute the soil and water supplies. These constituents can be released by the detonation of ordnance or from damaged or deteriorated unexploded ordnance. Constituents that may be released include propellants, explosives, pyrotechnics, chemical agents, metal parts, and other inert components.

4 The cleanup of unexploded ordnance and other constituent contamination on training ranges will be referred to in this report as training range cleanup. This does not include the cleanup of non-training-range sites containing unexploded ordnance or sites such as manufacturing facilities, munitions burial pits, or open burn and open detonation sites.


likely significantly understated. For example, in its fiscal year 2000 financial statements, DOD reported its liability for the cleanup of training ranges at approximately $14 billion.5 However, other DOD estimates show that its liability for training range cleanup could exceed $100 billion. Without complete and accurate data, it is impossible to determine whether these amounts represent a reasonable estimate of the long-term budget implications of cleaning up DOD's training ranges.

The military services have not performed complete inventories of their ranges, fully identifying the types and extent of the unexploded ordnance present and the associated contamination. Recently, DOD began the initial compilation of training range data in response to the Senate Report accompanying the National Defense Authorization Act for Fiscal Year 2000 (Report 106-50, May 17, 1999), which called for a complete estimate of the current and projected costs for unexploded ordnance remediation6 at active facilities, installations subject to base realignment and closure, and formerly used defense sites, including all training ranges. However, DOD's initial data collection efforts in response to the Senate Report were delayed, in part because DOD did not issue formal guidance to the services for collecting the range information until October 2000, 17 months after the date of the Senate Report, which directed DOD to prepare a report to the congressional defense committees by March 1, 2001. As of March 30, 2001, this report had not been issued. In addition to the delay, the guidance, when issued, was not comprehensive enough to develop a complete and accurate inventory. In an attempt to meet the March 1, 2001, deadline in the Senate Report, DOD officials limited the scope of the information gathered and analyzed. For example, DOD did not direct the services to collect information or report on the unexploded ordnance constituent contamination of soil, ground water, and surface water, or on water ranges. As a result, the March 2001 report will not be complete or accurate.

Federal financial accounting standards have required that DOD report a
liability for the estimated cost of cleaning up its training ranges in its
annual financial statements since fiscal year 1997, although DOD did not
begin to do so until fiscal year 1999. Since DOD had not completed an
inventory of its ranges, the services have used varying methods to estimate

5 DOD Fiscal Year 2000 Agency-wide Financial Statements, February 15, 2001.

6 Unexploded ordnance remediation also includes the cleanup of other constituent contamination associated with unexploded ordnance.


the size and condition of the ranges necessary to estimate the cost of
cleanup for financial statement reporting purposes. For example, in fiscal
year 2000, the Navy estimated training range acreage based upon limited
surveys completed in 1995 through 1997 and applied a cleanup cost factor of
$10,000 an acre to the total. The Army, lacking detailed knowledge of its
ranges, estimated the number of closed ranges and applied historical costs
from other cleanup efforts. These ad hoc measures do not substitute for the
comprehensive inventory of training ranges needed to develop reasonable
environmental liability estimates for the financial statements.

In addition, environmental liability costs reported in the financial
statements for training range cleanup are not consistently calculated and
reported across the services. To date, the services have not been provided
adequate guidance to develop consistent cost estimates. As a result, the
services have independently developed cost estimates and used different
methodologies for estimating the cost of cleaning up training ranges for
financial statement reporting. DOD officials told us that they planned to
use a standard methodology for estimating the cleanup costs in the March
2001 report; however, this methodology was available but not used by the
services for the fiscal year 2000 financial statements. Also, the
assumptions and cost factors DOD planned to use in the model for estimating
the training range cleanup costs for the March 2001 report have not been
independently validated as required by DOD policy to ensure reliable
estimates. DOD is planning to validate this cost estimating model later in
2001.

Service officials have told us they are unsure whether the standard methodology used to estimate training range cleanup costs for the March 2001 report will be used in the future for estimating the cleanup liability reported in the financial statements. However, without a consistent methodology, cleanup costs reported in the financial statements and other reports will not be comparable and will have limited value to management when evaluating cleanup costs of each of the services' training ranges and budgeting for the future.

The problems we have identified with DOD's accumulation of its inventory and cost data on training range cleanup demonstrate that DOD does not have the top management focus and leadership necessary to reliably report estimates of training range cleanup costs. The need for similar programmatic leadership was previously recognized and recommended by the Defense Science Board7 in 1998. The Defense Science Board found that DOD had no specific unexploded ordnance remediation policy, goals, or program. In addition, several members of Congress have recently written letters to the Secretary of Defense to express similar concerns about the need for high-level attention and resources to address training range cleanup issues.

We are making recommendations to address (1) the need for DOD leadership in managing the reporting of training range liabilities and (2) the development and implementation of guidance to ensure that DOD has a complete inventory of all training ranges and that a consistent cost methodology is used in reporting training range cleanup liabilities.

In commenting on a draft of this report, DOD concurred with our recommendations. The additional information that DOD provided in response to one of our recommendations is discussed in the “Agency Comments and Our Evaluation” section.

Background

DOD is subject to various laws, dating back to the Comprehensive Environmental Response, Compensation, and Liability Act (CERCLA) as amended by the Superfund Amendments and Reauthorization Act (SARA) of 1986, that govern remediation (cleanup) of contamination on military installations. DOD must also follow federal accounting standards that establish requirements for DOD to recognize and report the estimated costs for the cleanup of training ranges in the United States and its territories. Increasing public concern about potential health threats has affected not only the present operations of these training ranges but also the management, cleanup, and control of training range land that has been, or is in the process of being, transferred to other agencies and public hands.

Training Range Classification

DOD defines a range as any land mass or water body that is or was used for conducting training, research, development, testing, or evaluation of military munitions or explosives. DOD classifies its ranges into the following five types.

7 The Defense Science Board is a federal advisory committee established to provide independent advice to the Secretary of Defense.


• Active ranges are currently in operation, construction, maintenance, renovation, or reconfiguration to meet current DOD component training requirements and are being regularly used for range activities. Examples include ranges used for bombing, missile, mortar, hand grenade, and artillery testing and practice.

• Inactive ranges are not currently being used as active ranges. However, they are under DOD control, are considered by the military to be potential active range areas in the future, and have not been put to a new use incompatible with range activities.

• Closed ranges have been taken out of service and are still under DOD control, but DOD has decided that they will not be used for training range activities again.

• Transferred ranges have been transferred to non-DOD entities such as other federal agencies, state and local governments, and private parties, and are usually associated with the formerly used defense sites8 program.

• Transferring ranges are in the process of being transferred or leased to other non-DOD entities and are usually associated with the base realignment and closure program.

Congress addressed environmental contamination at federal facilities under SARA in 1986. This legislation established, among other provisions, the Defense Environmental Restoration Program and the Defense Environmental Restoration Account as DOD's funding source under the Act. The goals of the Defense Environmental Restoration Program include (1) identification, investigation, research and development, and cleanup of contamination from hazardous substances, pollutants, and contaminants and (2) correction of other environmental damage, such as detection and disposal of unexploded ordnance, that creates an imminent and substantial danger to the public health or welfare or to the environment. The Office of the Deputy Under Secretary of Defense for Environmental Security (DUSD(ES)) was created in 1993. That office has overall responsibility for environmental cleanup within DOD and includes the Office of Environmental Cleanup, which manages the Defense Environmental Restoration Program.

8 Formerly used defense sites are properties that were formerly owned, leased, possessed, or operated by DOD.

Requirements to Address and Report Training Range Cleanup Liabilities

Carrying out any remediation or removal actions under applicable
environmental laws, including SARA, would likely require the immediate or
future expenditure of funds. Federal accounting standards determine how
those expenditures are accounted for and reported. The Chief Financial
Officers' Act of 1990, as expanded by the Government Management and Reform
Act of 1994, requires that major federal agencies, including DOD, prepare
and submit annual audited financial statements to account for their liabilities, among other things. Two federal accounting standards, Statement
of Federal Financial Accounting Standards (SFFAS) Nos. 5 and 6, establish
the criteria for recognizing and reporting liabilities in the annual
financial statements, including environmental liabilities.

SFFAS No. 5, Accounting for Liabilities of the Federal Government, defines
liability as a probable future outflow of resources due to a past government
transaction or event. SFFAS No. 5 further states that recognition of a
liability in the financial statements is required if it is both probable and
measurable. Effective in 1997, SFFAS No. 5 defines probable as that which is
more likely than not to occur (for example, greater than a 50 percent
chance) based on current facts and circumstances. It also states that a
future outflow is measurable if it can be reasonably estimated. The
statement recognizes that this estimate may not be precise and, in such
cases, it provides for recognizing the lowest estimate of a range of
estimates if no amount within the range is better than any other amount.
SFFAS No. 6, Accounting for Property, Plant, and Equipment, further defines
cleanup costs as costs for removal and disposal of hazardous wastes or
materials that because of quantity, concentration, or physical or chemical
makeup may pose a serious present or potential hazard to human health or the
environment.
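The recognition rule just described is, in effect, a small decision procedure. The sketch below is illustrative only (hypothetical code, not an accounting system or GAO methodology): recognize a liability when it is both probable (more likely than not) and measurable, and, when no amount within a range of estimates is better than any other, recognize the lowest estimate in the range.

```python
# Illustrative sketch of the SFFAS No. 5 recognition rule described above.
# Hypothetical code; all names are invented for this example.

def recognized_liability(probability, estimates, best_estimate=None):
    """Return the liability amount to recognize, or None if the
    probable-and-measurable test is not met.

    probability   -- assessed likelihood of a future outflow (0.0 to 1.0)
    estimates     -- range of reasonable cost estimates
    best_estimate -- supplied when one amount in the range is considered
                     better than the others
    """
    probable = probability > 0.5      # "more likely than not"
    measurable = len(estimates) > 0   # outflow can be reasonably estimated
    if not (probable and measurable):
        return None
    # With no preferred amount, SFFAS No. 5 provides for recognizing
    # the lowest estimate within the range.
    return best_estimate if best_estimate is not None else min(estimates)

# The $40 billion to $140 billion range cited later in this report:
assert recognized_liability(0.6, [40e9, 140e9]) == 40e9
assert recognized_liability(0.4, [40e9, 140e9]) is None  # not probable
```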

The Office of the Under Secretary of Defense (Comptroller) issues the DOD
Financial Management Regulation containing DOD's policies and procedures in
the area of financial management, which require the reporting of
environmental liabilities associated with the cleanup of closed,
transferred, and transferring ranges in the financial statements. 9 DOD has
taken the position that the cleanup of these ranges is probable and
measurable and as such should be reported as a liability in its financial
statements. Under the presumption that active and inactive ranges will operate or be available to operate indefinitely, the DOD Financial Management Regulation does not specify when or if liabilities should be recognized in the financial statements for these ranges.

9 DOD Financial Management Regulation, Volume 4, Chapter 13, Accrued Environmental and Nonenvironmental Disposal Cost Liabilities, and Chapter 14, Accrued Environmental Restoration (Cleanup) Liabilities, October 1999; and Volume 6B, Chapter 10, Notes to the Financial Statements, December 2000.
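Viewed as data, the Regulation's treatment of the five range classifications defined earlier reduces to a simple lookup. The sketch below is hypothetical (not DOD code) and encodes only the policy described above:

```python
# Hypothetical sketch (not DOD code) of the reporting policy described
# above: only closed, transferred, and transferring ranges give rise to a
# cleanup liability reported in DOD's financial statements.

RANGE_STATUSES = {"active", "inactive", "closed", "transferred", "transferring"}
REPORTABLE_STATUSES = {"closed", "transferred", "transferring"}

def cleanup_liability_reported(status: str) -> bool:
    """True if cleanup costs for a range with this status are reported as
    a liability under the DOD Financial Management Regulation."""
    normalized = status.lower()
    if normalized not in RANGE_STATUSES:
        raise ValueError(f"unknown range status: {status!r}")
    return normalized in REPORTABLE_STATUSES

assert cleanup_liability_reported("closed")
assert not cleanup_liability_reported("active")
```

As the report goes on to argue, this policy leaves out active and inactive ranges that may nonetheless meet the federal accounting criteria for recognition.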

Senate Reporting Directive

The Senate Report accompanying the National Defense Authorization Act for Fiscal Year 2000 directed DOD to provide a report to the congressional defense committees, no later than March 1, 2001, that gives a complete
estimate of the current and projected costs for all unexploded ordnance
remediation. As of March 30, 2001, DOD had not issued its report. For the
purposes of the March 2001 report, DOD officials had stated that they would
estimate cleanup costs for active and inactive training ranges just as they
would for closed, transferred, and transferring ranges. Thus, the cleanup
costs shown in this report would have been significantly higher than the
training range liabilities reported in the financial statements, which only
include estimates for closed, transferred, and transferring ranges. However,
in commenting on a draft of our report, DOD officials informed us that they
would not be reporting the cleanup costs of active and inactive training
ranges in their March report.

Training Ranges Pose Significant Risk

As DOD downsizing and base closures have increased in recent years, large numbers of military properties have been, and continue to be, turned over to non-DOD ownership and control, putting the public at greater risk. DOD uses a risk-based approach when transferring ranges from its control to reduce threats to human health and the environment. DOD
attempts to mitigate risk to human health on transferred and transferring
ranges. In instances where DOD has not removed, contained, and/ or disposed
of unexploded ordnance and constituent contamination from training ranges
prior to transfer, it implements institutional controls to restrict access
to transferring ranges and to transferred ranges where risks are found.
Institutional controls include implementing community education and
awareness programs, erecting fences or barriers to control access, and
posting signs warning of the dangers associated with the range. Figure 1
shows signs posted at Fort McClellan, Alabama, warning of unexploded
ordnance. Fort McClellan has been designated for closure under the base
realignment and closure program and, as such, is in the process of
transferring base properties out of DOD control.

Figure 1: Signs Warning of the Dangers and Presence of Unexploded Ordnance
at Fort McClellan

Source: General Accounting Office.

DOD officials have estimated that approximately 16 million acres of
potentially contaminated training ranges have been transferred to the public
or other agencies. The risk to the public was further discussed by an
Environmental Protection Agency (EPA) official in a letter dated April 22,
1999, to DUSD(ES). The EPA official cautioned that many training ranges
known or suspected to contain unexploded ordnance and other hazardous
constituents have already been transferred from DOD control, and many more
are in the process of being transferred, and the risks from many of these
have not been adequately assessed. The letter went on to state that risks
correspondingly increase as ranges that were once remote are encroached upon by
development or as the public increases its use of these properties. An
example of the development of sites adjacent to training ranges is the
planned construction of two schools and a stadium by the Cherry Creek School
District adjacent to the Lowry Bombing Range, a transferred range, near
Denver. Construction is expected to begin in May 2001.

Most training range contamination is a result of weapons systems testing and
troop training activities conducted by the military services. Unexploded
ordnance consists of many types of munitions, including hand grenades,
rockets, guided missiles, projectiles, mortars, rifle grenades, and bombs. Figure 2 shows examples of some of the typical unexploded ordnance
that has been removed from training ranges.

Figure 2: Examples of Unexploded Ordnance Found on Training Ranges

Source: U.S. Army Corps of Engineers.

Risks from this unexploded ordnance can encompass a wide range of possible
outcomes or results, including bodily injury or death, health risks
associated with exposure to chemical agents, and environmental degradation
caused by the actual explosion and dispersal of chemicals or other hazardous
materials to the air, soil, surface water, and groundwater. For example,
according to an EPA report, 10 EPA surveyed 61 current or former DOD
facilities containing 203 inactive, closed, transferred, and transferring
ranges and identified unexploded ordnance “incidents” at 24
facilities. These incidents included five accidental explosions, which
resulted in two injuries and three fatalities. According to an EPA official, the three fatalities identified in the limited survey involved two civilian DOD contractors and one military service member.

10 Used or Fired Munitions and Unexploded Ordnance at Closed, Transferred,
and Transferring Military Ranges (September 2000, EPA 505-R-00-01).


DOD's Reported Cleanup Costs Are Likely Substantially Understated

Although DOD reported its unexploded ordnance cleanup liability on training ranges at about $14 billion in its fiscal year 2000 agencywide financial statements, that liability is likely substantially understated. Further, significant cleanup costs will not be included in the
planned March 2001 report. DOD officials and Members of Congress have
expressed concern over the potential liability the government may be faced
with but are still uncertain how large the liability may be. Various
estimates have shown that cleanup of closed, transferred, and transferring
training ranges could exceed $100 billion. For example:

• In preparation for DOD's planned issuance of the Range Rule, 11 DOD began an analysis of the potential costs that might be incurred if the Rule were implemented. The Rule was intended to provide guidance for performing inventories and to provide cleanup procedures at closed, transferred, and transferring ranges. The Rule was withdrawn in November 2000, and the cost analysis was never formally completed. However, a senior DOD official said that initial estimates in the cost analysis developed in 2000 put the cleanup costs of closed, transferred, and transferring training ranges at about $40 billion to $140 billion.

• DOD estimated that its potential liability for cleanup of unexploded ordnance might exceed $100 billion, as noted in a conference report on the National Defense Authorization Act for Fiscal Year 2001 (Report 106-945, October 6, 2000).

Significant Cleanup Costs Will Not Be Reported in the March 2001 Report

DOD will not respond fully to the Senate Report request for reporting the
costs of cleaning up unexploded ordnance on its training ranges. DOD
officials informed us that due to time constraints, the training range
liability to be reported in the March 2001 report would not be complete or
comprehensive because the required information could not be collected in
time for analysis and reporting. A DUSD(ES) official said that the March
2001 report will include a discussion of the limitations and omissions. DOD
officials stated that they have deferred the collection and analysis of key data elements. Among the items excluded are the costs to clean up soil and groundwater contaminated by unexploded ordnance and its constituents. These omitted costs could be significant.

11 DOD's Range Rule was a proposed regulation that defined a process to identify closed, transferred, and transferring ranges and address risks to human health and the environment posed by unexploded ordnance on these ranges. The Office of Management and Budget, EPA, and federal land managers were extensively involved in the rulemaking process. On November 13, 2000, DOD withdrew the Range Rule from the rulemaking process because DOD, EPA, and federal land managers could not reach consensus on several key issues, including how explosives safety would be handled under the Rule, concurrence on remedial actions, and who decides the remedy.

Further, the March 2001 report will not include information on water ranges.
DOD's 1996 Regulatory Impact Analysis 12 reported that DOD had approximately
161 million acres of water training ranges, almost 10 times the size of the
estimated closed, transferred, and transferring land ranges. In commenting
on a draft of this report, DOD stated that the 161 million acres of water
ranges are active training ranges, the majority of which are open-ocean,
deep water, restricted access areas and most are outside the territorial
waters of the United States. DOD also stated that the majority of water
ranges are not likely to cause an imminent and substantial danger to public
health and safety or the environment. However, until a complete and accurate
inventory is performed, DOD will be unable to determine whether some water
ranges meet the reporting requirement of SFFAS No. 5 and, thus, must be
reported in the financial statements.
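The acreage comparison above can be verified with simple arithmetic (figures as given in the report):

```python
# Scale comparison drawn from the figures above: DOD's estimated water
# training range acreage versus the estimated closed, transferred, and
# transferring land range acreage.

water_range_acres = 161_000_000
land_range_acres = 16_000_000

ratio = water_range_acres / land_range_acres
assert round(ratio) == 10   # roughly 10 times the land acreage
```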

Financial Statement Liability Also Understated

The DOD Comptroller has revised the DOD Financial Management Regulation to
clarify DOD's fiscal year 2000 financial statement reporting requirements
for training range cleanup costs. The revision includes guidance that
requires the reporting of the cleanup costs of closed, transferred, and
transferring ranges as liabilities in the financial statements. DOD has
indicated that the costs to clean up these training ranges are probable and
measurable and as such should be reported as a liability in the financial
statements. We concur with DOD that these costs should be reported in the
financial statements as liabilities because they are probable and
measurable.

Specifically, they are probable because DOD is legally responsible for
cleaning up closed, transferred, and transferring ranges which were
contaminated as a result of past DOD action. For example, under SARA, DOD is
responsible for the cleanup of sites that create an imminent and substantial
danger to public health and safety or the environment. In addition, these
training range cleanup efforts are measurable. DOD has prior experience in
training range cleanup under the formerly used defense sites program and has used this experience to develop a methodology to estimate future cleanup costs. However, as explained later in this report, DOD has not based its reported financial statement liability for cleanup of these ranges on a complete inventory or a consistent cost methodology, resulting in estimates that range from $14 billion to over $100 billion.

12 DOD's 1996 Regulatory Impact Analysis was an analysis of the estimated costs to implement the Range Rule when it was first proposed in 1997. As stated earlier, a more recent analysis was never formally completed due to DOD's withdrawal of the Range Rule.

In addition, we believe that certain active and inactive sites may have
contamination that should also be recorded as a liability in the financial
statements because these sites meet criteria in federal accounting standards
for recording a liability. The DOD Financial Management Regulation does not
include instructions for recognizing a liability for training range cleanup
costs on active and inactive ranges in the financial statements. Although
cleanup of active and inactive ranges would not generally be recognized as a
liability in the financial statements, there are circumstances when an
environmental liability should be recognized and reported for these ranges.
A liability should be recognized on active and inactive ranges if the
contamination is government related, the government is legally liable, and
the cost associated with the cleanup efforts is measurable. For example,
contaminants from an active training range at the Massachusetts Military
Reservation threaten the aquifer that produces drinking water for nearby
communities. The problem was so severe that in January 2000, EPA issued an
administrative order under the Safe Drinking Water Act requiring DOD to
clean up several areas of the training range. According to a DOD official,
the cleanup effort could cost almost $300 million. As a result, the cleanup
of this contamination is probable (since it is legally required) and
measurable. Thus, this liability should be recognized in the financial
statements under SFFAS No. 5.

Training Range Inventories Are Not Complete

Although DOD and the services have collected information on other
environmental contamination under the Defense Environmental Restoration
Program for years, they have not performed complete inventories of training
ranges to identify the types and extent of contamination present. To
accurately compute the training range liabilities, the military services
must first perform in-depth inventories of all of their training ranges.
Past data collection efforts were delayed because the services were waiting
for the promulgation of the Range Rule which has been withdrawn. DOD
recently began collecting training range data to meet the reporting
requirements for the Senate Report. However, as stated previously, DOD has
limited its data collection efforts and will not be reporting on the cleanup
of water ranges or the unexploded ordnance constituent contamination of the
soil and water.

Previous Inventory Initiatives

The Army, under direction from DUSD(ES), proposed guidance for the
identification of closed, transferred, and transferring ranges with the
preparation and attempted promulgation of the Range Rule. In anticipation of
the Range Rule, DOD prepared a Regulatory Impact Analysis report in 1996,
recognizing that the cleanup of its closed, transferred and transferring
training ranges was needed and that the cleanup costs could run into the
tens of billions of dollars.

To address inventories of its active and inactive ranges, DOD issued
Directive 4715.11 for ranges within the United States and Directive 4715.12
for ranges outside the United States in August 1999. These directives
required that the services establish and maintain inventories of their
ranges and establish and implement procedures to assess the environmental
impact of munitions use on DOD ranges. However, the directives neither provided the guidance necessary to inventory the ranges nor established any completion dates. Although the directives assigned responsibility for
developing guidance to perform the inventories, DOD has not developed the
necessary guidance specifying how to gather the inventory information or how
to maintain inventories of the active and inactive training ranges.

Since fiscal year 1997, federal accounting standards have required the
recognition and reporting of cleanup costs, as mentioned earlier. However,
DOD did not report costs for cleaning up closed, transferred, and
transferring training ranges until the services estimated and reported the
training range cleanup costs in DOD's agencywide financial statements for
fiscal year 1999. Agencywide financial statements are prepared in accordance
with the DOD Financial Management Regulation, which is issued by the DOD
Comptroller and incorporates Office of Management and Budget guidance on
form and content of financial statements.

Senate Report Expedited DOD Inventory Data Collection

In an attempt to comply with the mandates in the Senate Report, DOD embarked
on a special effort to collect training range data necessary to estimate
potential cleanup costs. The Senate Report directed DOD to report all known
projected unexploded ordnance remediation costs, including training ranges,
by March 1, 2001, and to report subsequent updates in the Defense
Environmental Restoration Program annual report to Congress. While the
Senate Report did not expressly direct DOD to identify an inventory of
training ranges at active facilities, installations subject to base
realignment and closure, and formerly used defense sites, the data necessary
to fully estimate costs of unexploded ordnance, normally located on training ranges, could only be attained in conjunction with the performance of a complete and accurate inventory that includes training ranges.

Although the Senate Report's directives were dated May 1999, DOD did not
provide formal guidance to the services for collecting training range data
until October 2000, 17 months later. As a first step, in February 2000 the Under Secretary of Defense for Acquisition, Technology, and Logistics assigned responsibility to the Office of the Director of Defense Research and Engineering, in coordination with DUSD(ES), for obtaining the range data and preparing the report. On October 23, 2000, DUSD(ES) issued
specific guidance to the military services instructing them to gather range
information and detailing some of the specific information needed. Although
DOD instituted an Unexploded Ordnance Inventory Working Group in March 2000
to work with the services to develop specific guidance, service officials
told us that DOD had not clearly told them what was required or when it was
required until shortly before the official tasking was issued on October 23,
2000. Once officially tasked to gather range information, the services were
given until January 5, 2001, to gather and provide it to DOD for analysis by
a DOD contractor.

Lacking specific guidance from DOD to inventory their ranges, but
recognizing that they would eventually be tasked to gather range information
in anticipation of the Range Rule or for the Senate Report, each of the
services developed its own survey questionnaires to begin gathering range
information before the formal guidance was issued. The Navy took a proactive
approach and began developing a questionnaire in late 1999. The
questionnaire was issued to the Navy commands in December 1999. The Army and
the Air Force also developed their own questionnaires and issued them in
September 2000. Because the formal guidance was issued after the services
had begun their initial data collection, the services had to collect
additional data from their respective units or other sources. According to
DOD officials, the training range inventory information gathered from these
questionnaires for the March 2001 report will also be used in the future as
a basis for financial statement reporting.

Range Identification Is Difficult and Costly

Although the scope of ranges in the United States and its territories is not fully known (because DOD does not have a complete inventory of training ranges), DOD estimates that over 16 million acres of land on closed, transferred, and transferring ranges are potentially contaminated with unexploded ordnance. DOD also estimates that it has about 1,500 contaminated sites. Many former military range sites were transferred to other federal agencies and private parties. Training ranges must be
identified and investigated to determine type and extent of contamination
present, risk assessments performed, cleanup plans developed, and permits
obtained before the actual cleanup begins. These precleanup costs can be substantial. For example, the Navy estimates that these investigative
costs alone are as much as $3.96 million per site.

Identifying the complete universe of current and former training ranges is a
difficult task. Ranges on existing military bases are more easily
identifiable and accessible. More problematic, however, are ranges that existed decades ago and have since been transferred to other agencies or the public; records of these ranges' existence or of the ordnance used on them cannot always be found. Special investigative efforts may be necessary
to identify those locations and ordnance used. In preparing for World War I
and World War II, many areas of the country were used as training ranges. In
some instances, documentation on the location of and/ or the types of
ordnance used on these ranges is incomplete or cannot be found. For example,
unexploded ordnance was unexpectedly found by a hiker in 1999 at Camp Hale
in Colorado, a site used for mountain training during World War II and since
transferred to the U.S. Forest Service. Because additional live rifle
grenades were found in 2000, the Forest Service has closed thousands of
acres of this forest to public use pending further action. This site also
serves as an example of the difficulty in identifying and cleaning up
unexploded ordnance in rough mountain terrain and dense ground cover.

Cost Methodologies Are Inconsistent

In addition to not having an accurate and complete inventory of its training ranges, DOD has only recently focused on developing a consistent methodology for estimating training range cleanup costs.
However, DOD is using different methodologies for estimating cleanup costs
for the annual financial statements and the March 2001 report. While DOD is
using a standard methodology for estimating and reporting its cleanup costs
for the March 2001 report, that methodology was not used to estimate the
training range cleanup costs for the fiscal year 2000 financial statements.
In addition, each of the services is using different methodologies for
calculating cleanup cost estimates for reporting its liabilities in the
financial statements. Without a consistent methodology, cleanup costs reported in the financial statements and other reports will not be comparable and will have limited value to management in evaluating the cleanup costs of each of the services' training ranges and budgeting for the future.

Fiscal Year 2000 Financial Statement Liabilities

Because the military services do not apply a consistent cost methodology to
compute the liabilities for their financial statements, any comparison among
the training range liabilities across the services will not be meaningful.
DOD is reporting a liability of about $14 billion for fiscal year 2000 for
cleaning up closed, transferred, and transferring training ranges. Of the
$14 billion, the Navy is reporting a liability of $53.6 million. The Navy,
based on limited surveys completed in 1995 through 1997, estimated the
number and size of its training ranges and applied a $10,000-per-acre cleanup
cost factor to compute its liability. The Navy based its estimates on the
assumption of cleaning up its closed, transferred, and transferring ranges
to a “low” cleanup/remediation level. The low cleanup/remediation level means that the training ranges would be classified as
“limited public access” and be used for things such as livestock
grazing or wildlife preservation, but not for human habitation.
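As a worked check of the Navy figures above, note that the acreage derived below is implied by the two reported numbers; it is not stated in the report:

```python
# The Navy applied a $10,000-per-acre cost factor to its estimated range
# acreage to compute its reported $53.6 million liability. The acreage
# below is implied by those two figures, not stated in the report.

COST_PER_ACRE = 10_000        # dollars per acre, the Navy's factor
reported_liability = 53.6e6   # dollars, fiscal year 2000

implied_acres = reported_liability / COST_PER_ACRE
assert implied_acres == 5_360
```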

The Army recognized the largest training range cleanup liability for fiscal
year 2000. It reported a $13.1 billion liability for cleaning up closed,
transferred, and transferring ranges. The $13.1 billion consisted of $8
billion to clean up transferred ranges, $4.9 billion for the cleanup of
closed ranges, and $231 million for the cleanup of transferring ranges. 13
The Army used an unvalidated cost model to compute the $8 billion costs of
cleaning up transferred ranges and used a different cost methodology for
estimating the $4.9 billion for closed ranges. The Air Force reported a
liability of $829 million for both fiscal years 1999 and 2000 based on a
1997 estimate of 42 closed ranges, using a historical cost basis for
estimating its liability.
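The Army components above sum, with rounding, to the reported total (amounts in billions of dollars):

```python
# Arithmetic check of the Army's reported fiscal year 2000 liability:
# the three components sum to the reported $13.1 billion (rounded).

transferred_ranges = 8.0     # $ billions
closed_ranges = 4.9          # $ billions
transferring_ranges = 0.231  # $ billions ($231 million)

total = transferred_ranges + closed_ranges + transferring_ranges
assert round(total, 1) == 13.1
```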

DOD Used a Standard Cost Methodology for March 2001 Report

According to DOD officials, DOD has standardized its methodology for
estimating and reporting the unexploded ordnance cleanup costs that will be
reported in the March 2001 report. DOD's cost model used to compute the
unexploded ordnance cleanup costs from its training ranges has not been
validated. The original cost model was initially developed by the Air Force
in 1991 and has been used by government agencies and the private sector to
estimate other environmental cleanup costs not associated with training
range cleanup. A new module was recently added to the cost model to estimate
costs for removing unexploded ordnance and its constituents from former training ranges. The new module uses cost data developed by the U.S. Army Corps of Engineers from past experience in cleaning up training ranges on formerly used defense sites.

13 The amount reported for transferred and transferring ranges included the cleanup of nontraining range sites containing unexploded ordnance, such as ordnance disposal sites and ordnance manufacturing facilities.

DOD officials told us that they believe that this model is the best one
available to compute the cleanup costs. However, the assumptions and cost
factors used in the model were not independently validated to ensure
accurate and reliable estimates. DOD Instruction 5000.61 requires that cost
models such as this be validated to ensure that the results produced
can be relied upon. We did not evaluate this model, but we were informed
that DOD is in the process of developing and issuing a contract to have this
model validated. A DOD official also informed us that DOD is currently
considering requiring that the cost model be used as a standard for the
military services' valuation of their cleanup cost estimates used to report
liabilities in the financial statements.

Until DOD standardizes and validates the costing methodology used for estimating and reporting training range cleanup costs, and requires its use DOD-wide, it has no assurance that the military services will compute their cleanup costs using the same
methodology. As a result, the services will in all probability continue to
produce unreliable and differing estimates for their various reporting
requirements.

Lack of Leadership and Focus Hinders DOD Progress in Reporting Training Range Cleanup Costs

DOD lacks leadership in reporting on the cleanup costs of training ranges. DUSD(ES) was created in 1993 as the office responsible for environmental
cleanup within DOD. However, this office has focused its principal efforts
on the cleanup of other types of environmental contamination, not unexploded
ordnance. Although requirements for reporting a training range environmental
liability have existed for years, DOD has not established adequate or
consistent policies to reliably develop the cost of the cleanup of training
ranges and to oversee these costing efforts.

Defense Science Board Findings and Recommendations

Similar to the problems noted previously in this report concerning the
inventory delays and lack of guidance, the Defense Science Board, in 1998,
reported that DOD had not met its management responsibility for unexploded
ordnance cleanup. It reported that there were no specific DOD- wide
unexploded ordnance cleanup goals, objectives, or management plans. The
report went on to say that unexploded ordnance cleanup decisions are made
within the individual services, where remediation requirements are forced to
compete against traditional warfighting and toxic waste cleanup requirements. This competition has resulted in unexploded ordnance cleanup efforts being relegated to “housekeeping duties” at the activity or installation level, according to the Board's report.

To address DOD's unmet management responsibilities for unexploded ordnance
cleanup, the Defense Science Board recommended the establishment of an
Office of Secretary of Defense focal point for oversight of unexploded
ordnance cleanup activities within DOD. This recommendation was made even
though DUSD(ES) had overall responsibility for environmental cleanup under
the Defense Environmental Restoration Program. According to the Director of
DOD's Environmental Cleanup Program, a single focal point for managing the
cleanup of unexploded ordnance has still not been formally designated. A
focal point with the appropriate authority could be a single point of
contact who could manage and oversee the development of a complete and
accurate training range inventory, the development of a consistent cost
methodology across all services, and the reporting of the training range
liability for the financial statements and other required reports.

Department of Energy Process to Identify and Estimate the Cost to Clean Up Hazardous Waste Sites Provides a Useful Example

The Department of Energy (DOE) has been successful in its identification and reporting of thousands of environmentally contaminated sites, with cleanup liabilities reported at $234 billion in fiscal year 2000. Initially, in the early 1990s, DOE was unable to report the estimated cleanup costs. However, through substantial effort and the support of DOE leadership, DOE was able to receive a clean, or unqualified, audit opinion14 for its fiscal year 1999 and 2000 financial statements. DOE's efforts provide a useful example to DOD in its efforts to identify and report cost estimates on its contaminated sites.

After 50 years of U.S. production of nuclear weapons, DOE was tasked with managing the largest environmental cleanup program in the world. DOE has identified approximately 10,500 release sites from which contaminants could migrate into the environment. DOE has made substantial progress in defining the technical scope, schedules, and costs of meeting this challenge, and in creating a plan to undertake it. DOE officials told us that building a reliable database and management program for contaminated sites requires a significant investment in time and manpower.

14 An unqualified, or clean, audit opinion means that the auditor believes that the information presented in the financial statements as a whole is presented fairly, in all material respects, in accordance with generally accepted accounting principles.

DOE officials stated that they began their data collection and management
program process in the early 1990s and are continuing to build and update
their database. However, they emphasized that their efforts, similar to
DOD's current efforts, started with an initial data call to collect
preliminary information to identify the sites. They said the next step
involved sending teams to each of the sites to actually visit and observe
the site, sometimes taking initial samples, to further identify and confirm
the contaminants, and to help assess the risk associated with the site
contamination. The information gathered was entered into a central database
in 1997 to be used for management and reporting purposes. In 1999, DOE
completed entering baseline data for all known cleanup sites.

In addition to the above steps, once a site was selected for cleanup, a much more involved process was undertaken to further test for and remove the contaminants. Until a site is fully cleaned up, it is reviewed annually, its cost estimates are updated, and any changes in conditions are recorded in the central database.

DOE officials told us that in addition to providing the necessary leadership
and guidance to inventory and manage their sites, another key to this
success was establishing a very close working relationship between the
program office and the financial reporting office to ensure consistent and
accurate reporting of their cleanup liabilities.

Conclusions

As military land, including training ranges, is transferred to the public domain, the public must have confidence that DOD has the necessary leadership and information to address the human health and environmental risks associated with training range cleanup. The Congress also needs related cost information to make funding decisions. DOD's recent efforts to develop the information needed to report training range cleanup costs for the required March 2001 report represent an important first step in gathering the needed data. However, accurate and complete reporting can only be achieved if DOD compiles detailed inventory information on all of its training ranges and uses a consistent and valid cost methodology. Because of the complexity of the data gathering process and the many issues involved in the cleanup of training ranges, top management leadership and focus are essential. A senior-level official with appropriate management authority and resources is key to effectively leading these efforts to produce meaningful and accurate reports on training range cleanup costs.

Recommendations

We recommend that the Secretary of Defense designate a focal point with the appropriate authority to oversee and manage the reporting of training range liabilities.

We also recommend that the Secretary of Defense require the designated focal point to work with the appropriate DOD organizations to develop and implement guidance for inventorying all types of training ranges, including active, inactive, closed, transferred, and transferring training ranges. We recommend that this guidance, at a minimum, include the following requirements:

• collection of key site characterization information on training ranges for unexploded ordnance removal;

• identification of other constituent contamination in the soil and/or water;

• performance time frames, including the requirement to perform the necessary site visits to confirm the type and extent of contamination; and

• the necessary policies and procedures for the management and maintenance of the inventory information.

We further recommend that the Secretary of Defense require the designated focal point to work with the appropriate DOD organizations to develop and implement a consistent and standardized methodology for estimating training range cleanup costs, to be used in reporting training range cleanup liabilities in DOD's agency-wide annual financial statements and other reports as required. In addition, we recommend that the Secretary of Defense require that the designated focal point validate the cost model in accordance with DOD Instruction 5000.61.

Further, we recommend that the Secretary of Defense require the DOD Comptroller to revise the DOD Financial Management Regulation to include guidance for recognizing and reporting a liability in the financial statements for the cleanup costs on active and inactive ranges when such costs meet the criteria for a liability found in the federal accounting standards.

Agency Comments and Our Evaluation

In commenting on a draft of this report, DOD stated that it has made significant progress in estimating and reporting environmental liabilities on its financial statements; however, much work remains to be done. DOD's response also indicated that as the department increases its knowledge related to this area, the appropriate financial and functional policies will be updated to incorporate more specific guidance for recognizing and reporting environmental liabilities.

DOD concurred with our recommendations, but provided several comments in
response to our recommendation that the Secretary of Defense require the DOD
Comptroller to revise the DOD Financial Management Regulation to include
guidance for recognizing and reporting a liability in the financial
statements for the cleanup costs on active and inactive ranges when such
costs meet the criteria for a liability.

DOD stated that it revised Volume 6B, Chapter 10, of the DOD Financial
Management Regulation to clarify instances when a liability should be
recognized for an active or inactive range on an active installation.
However, this revision of the DOD Financial Management Regulation does not
address the recognition of an environmental liability at active and inactive
ranges in accordance with the criteria of SFFAS No. 5. For example, as
stated in our report, the total $300 million cleanup cost estimate on the
active range at the Massachusetts Military Reservation should be recognized
as a liability in accordance with the criteria in SFFAS No. 5.

DOD further stated that since it intends to continue to use its active and inactive ranges for the foreseeable future, the removal of ordnance to maintain safety and usability is considered an ongoing maintenance expense. DOD stated that this expense is not accrued as a liability except in those few specific instances in which an environmental response action, beyond what is necessary to keep the range in operation, is probable and the costs of such a response are measurable. Although this position is consistent with SFFAS No. 5, it is not specifically indicated in the DOD Financial Management Regulation.

Finally, DOD stated that as the Department gains additional experience in
this area, it will review appropriate chapters in the DOD Financial
Management Regulation to determine what, if any, additional specific
guidance may need to be included regarding recognizing and reporting
liabilities. While we agree that such a review is appropriate, we continue to recommend that the DOD Financial Management Regulation be revised to include guidance in those instances when active and inactive ranges meet the criteria in SFFAS No. 5.

DOD also provided several technical comments, which we have incorporated in
the report as appropriate.

We are sending copies of this report to the Honorable John Spratt, Ranking
Minority Member, House Committee on the Budget, and to other interested
congressional committees. We are also sending copies to the Honorable Donald
H. Rumsfeld, Secretary of Defense; the Honorable David R. Oliver, Acting
Under Secretary of Defense for Acquisition, Technology, and Logistics; and
the Honorable Mitchell E. Daniels, Jr., Director of the Office of Management
and Budget. Copies will be made available to others upon request.

Please contact me at (202) 512-9095 if you or your staff have any questions about this report. Other GAO contacts and key contributors to this report are listed in appendix III.

Sincerely yours,

Gregory D. Kutz
Director, Financial Management and Assurance

Appendix I: Objectives, Scope, and Methodology


Our objectives were to review DOD's ongoing efforts to (1) gather and
collect information on its training ranges and issues affecting the
successful completion of the inventory and (2) recognize environmental
liabilities associated with the cleanup of unexploded ordnance from its
training ranges, including DOD's efforts to develop and implement a
methodology to develop cost estimates. The focus of our review was on DOD
efforts to gather and collect information on its training ranges and the
environmental costs associated with the cleanup of the training ranges. As a
result, other sites containing unexploded ordnance were not included in the
scope of our review. These sites include munitions manufacturing facilities,
munitions burial pits, and open burn and open detonation sites used to
destroy excess, obsolete, or unserviceable munitions. To accomplish these
objectives, we:

• reviewed relevant standards and guidance applicable to environmental liabilities, including Statement of Federal Financial Accounting Standards (SFFAS) No. 5, Accounting for Liabilities of the Federal Government; SFFAS No. 6, Accounting for Property, Plant, and Equipment; and DOD Financial Management Regulation, Volume 6B, Chapter 10, and Volume 4, Chapters 13 and 14;

• reviewed DOD guidance to the military services for performing the training range inventory survey;

• reviewed the military services' survey documents used to collect information on training ranges;

• interviewed officials from the Deputy Under Secretary of Defense for Environmental Security (DUSD(ES)); the Director, Defense Research and Engineering; the U.S. Army Corps of Engineers; and the Army, Navy, and Air Force involved in planning and conducting the data collection efforts and analyzing the data;

• interviewed an official from the Office of the Under Secretary of Defense (Comptroller);

• interviewed officials from the U.S. Environmental Protection Agency;

• interviewed environmental officials from the states of Colorado and Alabama;

• interviewed officials from the Department of Energy;

• interviewed the contractor selected by DOD, which assisted in planning and analyzing the data and preparing the cost analysis for the March 2001 report; and

• visited two locations, the Lowry Bombing Range in Denver, Colorado, and Ft. McClellan in Anniston, Alabama, to gain insight into the complexities involved in estimating liabilities for training range cleanup.

We did not audit DOD's financial statements and therefore we do not express
an opinion on any of DOD's environmental liability estimates for fiscal year
1999 or 2000. We conducted our work in accordance with generally accepted
government auditing standards from May 2000 through March 2001. On March 29,
2001, DOD provided us with written comments on our recommendations, which
are discussed in the “Agency Comments and Our Evaluation”
section and are reprinted in appendix II. DOD also provided comments on
several other matters, which we have incorporated in the report as
appropriate but have not reprinted.

Appendix II: Comments From the Department of Defense


Appendix III: GAO Contact and Staff Acknowledgments

GAO Contact

Dianne Guensberg, (202) 512-5285

Acknowledgments

Staff making key contributions to this report were Paul Begnaud, Roger Corrado, Francine DelVecchio, and Stephen Donahue.

(918991)

Ordering Information

The first copy of each GAO report is free. Additional copies of reports are $2 each. A check or money order should be made out to the Superintendent of Documents. VISA and MasterCard credit cards are also accepted.

Orders for 100 or more copies to be mailed to a single address are discounted 25 percent.

Orders by mail:

U.S. General Accounting Office
P.O. Box 37050
Washington, DC 20013

Orders by visiting:

Room 1100
700 4th St., NW (corner of 4th and G Sts. NW)
Washington, DC 20013

Orders by phone:

(202) 512-6000
fax: (202) 512-6061
TDD: (202) 512-2537

Each day, GAO issues a list of newly available reports and testimony. To receive facsimile copies of the daily list or any list from the past 30 days, please call (202) 512-6000 using a touchtone phone. A recorded menu will provide information on how to obtain these lists.

Orders by Internet:

For information on how to access GAO reports on the Internet, send an e-mail message with “info” in the body to:

Info@www.gao.gov

or visit GAO's World Wide Web home page at:

http://www.gao.gov

To Report Fraud, Waste, and Abuse in Federal Programs

Contact one:

• Web site: http://www.gao.gov/fraudnet/fraudnet.htm
• E-mail: fraudnet@gao.gov
• 1-800-424-5454 (automated answering system)

United States General Accounting Office
Washington, D.C. 20548-0001

Official Business
Penalty for Private Use $300

Address Correction Requested

Presorted Standard
Postage & Fees Paid
GAO
Permit No. GI00

United States General Accounting Office

GAO

Report to the Chairman and Ranking Member, Subcommittee on Readiness and Management Support, Committee on Armed Services, U.S. Senate

April 2001

BEST PRACTICES: DOD Teaming Practices Not Achieving Potential Results

GAO-01-510
Page i GAO-01-510 Best Practices

Contents

Letter 1

Executive Summary 2

Chapter 1: Introduction 8
  The Rise of IPTs in Product Development 9
  Adoption of the IPT Concept in DOD 12
  Objectives, Scope, and Methodology 14

Chapter 2: IPTs Help Programs Achieve Better Outcomes 19
  IPTs Can Improve the Decision-Making Process 19
  Less Effective Teams Had a More Sequential Decision-Making Process 21
  Improved Product Outcomes Attributed to IPTs 25

Chapter 3: Authority and Knowledge Are Key to IPT Effectiveness 28
  The Best IPTs Had the Knowledge, Authority, and Other Elements to Be Effective 28
  Most DOD Teams Did Not Possess the Key Ingredients of IPTs 35

Chapter 4: Differences in DOD and Commercial Teaming Approach Reflect Different Environments 39
  Commercial Firms Provided a Different, Supportive Environment for IPTs 39
  DOD Environment Not As Conducive to IPTs 44
  Amphibious Vehicle Program Found Ways to Provide a More Supportive IPT Environment 49
  Previous GAO Recommendations and DOD Actions Are Aimed at Making the Weapon System Environment More Conducive to Best Practices 51

Chapter 5: Conclusions and Recommendations 53
  Conclusions 53
  Recommendations for Executive Action 54
  Agency Comments and Our Evaluation 55

Appendix I: Comments From the Department of Defense 56

Appendix II: GAO Contacts and Staff Acknowledgments 59

Related GAO Products 60

Tables

Table 1: Effective IPTs Also Had Successful Product Development Outcomes 25

Figures

Figure 1: Functional Approach to Product Development 10
Figure 2: IPT Approach to Product Development 11
Figure 3: Players in Commercial Product Development 13
Figure 4: Players in DOD's Product Development Programs 13
Figure 5: Decision-Making Process Employed by 3M's Pluto Team 20
Figure 6: Decision Process Followed by the Advanced Amphibious Assault Vehicle Firepower IPT 21
Figure 7: Sequential Decision-Making Process for Adding Vehicles to Accommodate More Weapon System Weight 22
Figure 8: How Advanced Amphibious Assault Vehicle Gun Calibration Decision Would Have Been Made Without IPTs 24
Figure 9: Organization of 3M's Pluto IPT 29
Figure 10: Hewlett-Packard Printer 31
Figure 11: The Marine Corps Advanced Amphibious Assault Vehicle 33
Figure 12: DaimlerChrysler's Town and Country Minivan 40
Figure 13: 3M Dental Products 42
Figure 14: Percentage of DOD IPT Members That Perceived Key Product Elements As Outside the Team's Control 46

April 10, 2001

The Honorable James Inhofe
Chairman
The Honorable Daniel Akaka
Ranking Member
Subcommittee on Readiness and Management Support
Committee on Armed Services
United States Senate

As you requested, this report examines how best practices can help the
Department of Defense maximize the benefits of integrated product teams in
its development of weapon systems. It examines the factors that are critical
to making integrated product teams effective, including the environment in
which such teams operate. We make recommendations to the Secretary of
Defense on how to better support the use of integrated product teams on
weapon system programs.

We are sending copies of this report to the Honorable Donald H. Rumsfeld,
Secretary of Defense; the Honorable Joseph W. Westphal, Acting Secretary of
the Army; the Honorable Robert B. Pirie, Jr., Acting Secretary of the Navy;
the Honorable Lawrence Delaney, Acting Secretary of the Air Force; the
Honorable Mitchell E. Daniels, Jr., Director, Office of Management and
Budget; and to interested congressional committees. We will also make copies
available to others upon request.

If you have any questions regarding this report, please call me at (202) 512-4841. Other key contacts are listed in appendix II.

Katherine V. Schinasi
Director, Acquisition and Sourcing Management

United States General Accounting Office
Washington, DC 20548

Executive Summary

Although the Department of Defense (DOD) has boosted its annual weapon system investment from about $80 billion 4 years ago to about $100 billion for fiscal year 2001, its buying power will be weakened if weapons continue to cost more and take longer to develop than planned. DOD wants to improve program outcomes by reducing weapon system development cost and time, while still producing weapons that meet user needs. It has a long way to go; long-standing practices that impede delivery of new weapons within estimates have proven resistant to reform. GAO issued a series of reports on the success leading commercial firms have had in significantly reducing the time and money it takes to develop new and more sophisticated products, the kinds of results that DOD seeks. Leading commercial firms find that integrated product teams, teams that are responsible for all the activities of development from design to manufacturing, are key to achieving such results. The practices of leading commercial firms can help DOD maximize the benefits of integrated product teams in its development of weapon systems.

Purpose

In response to a request from the Chairman and the Ranking Member, Subcommittee on Readiness and Management Support, Senate Committee on Armed Services, GAO examined (1) whether and how integrated product teams affect decision-making and product outcomes, (2) what factors are key to creating effective integrated product teams, and (3) how the environment in which products are managed affects the prospects for effective integrated product teams.

Background

Integrated product teams bring together the different professions or areas of expertise needed to design and manufacture a new product, such as engineering, manufacturing, purchasing, and finance. The essence of the integrated product team approach is to concentrate this expertise in a single organization, together with the authority to design, develop, test, manufacture, and deliver a product. The hallmark of these teams is their ability to efficiently make decisions that cross lines of expertise. In contrast, when the people with the necessary expertise reside in separate organizations, they tend to work on new products sequentially. For example, a product might be handed off from a concept group to a design group, a cost group, a test group, and a manufacturing group before being delivered to the customer. Often, factors such as how to manufacture or repair the product are assessed after it has been designed and tested, forcing redesign and rework from the preceding groups.

Commercial firms came to see this approach as taking too long and being too costly, and in the 1980s they began using integrated product teams as a way to get better results faster. In 1995, DOD adopted integrated product teams
in an attempt to improve its weapon system acquisitions. DOD's intention was
to use the teams in the same manner as commercial firms- to integrate
different functional disciplines into a team responsible for all aspects of
a new weapon.

To gain insights on how DOD's implementation of integrated product teams
compares with the practices of leading commercial firms, GAO conducted eight
case studies: three from leading commercial firms; four from DOD programs
experiencing cost, schedule, and performance problems; and one from a DOD
program that has been meeting its objectives. Within these case studies, GAO
examined 18 teams in detail, including over 100 interviews with team members
and leaders.

Results in Brief

Integrated product teams work. Effective integrated product teams can make
significant product development decisions quickly and without relying heavily on consultations with organizations outside the team. These teams have developed and delivered superior products within predicted time frames and budgets, often cutting calendar time in half compared with earlier products developed without such teams. Officials from the more successful programs GAO reviewed (three commercial and one from DOD) all cited integrated product teams as a main factor in achieving such results. In the four DOD programs that were not meeting cost and schedule objectives, GAO found that the teams did not operate as effectively. Their decision-making processes were sequential and involved numerous outside consultations for information and approval.

Two elements are essential to determining whether a team is in fact an integrated product team: the knowledge and authority needed to recognize problems and make cross-cutting decisions expeditiously. Knowledge is sufficient when the team has the right mix of expertise to master the different facets of product development. Authority is present when the team is responsible both for making day-to-day decisions and for delivering the product. In the programs experiencing problems, the teams lacked either the authority or the right mix of expertise to be considered integrated product teams. If a team lacks expertise, it will miss opportunities to recognize potential problems early; without authority, it can do little about them. Although these teams were called integrated product teams, by and large they were not.

Leading commercial firms took steps to create an environment more conducive to the integrated product team approach. They committed to making the approach integral to the product development process and backed up that commitment through actions to ensure that implementation was not left to chance. Importantly, the pressures of competing in the commercial market meshed well with the decision-making advantages of integrated product
teams. While DOD endorses the integrated product team approach, it has not
taken steps to ensure that the approach is implemented at the program
execution level. In essence, the approach has been left to germinate in an
unchanged environment that is not necessarily conducive to integrated
product teams. For example, the pressures to launch and fund new programs
create incentives that pose obstacles for integrated product teams.
Implementation is thus more dependent on the ingenuity of individuals
working on the programs.

GAO makes recommendations on how DOD can better support the implementation
of integrated product teams on weapon system programs.

Principal Findings

Integrated Product Teams Help Programs Achieve Better Outcomes

Integrated product teams improved both the speed and quality of the
decision-making process. These teams made decisions involving significant trade-offs without relying unduly on other organizations for information or approval. For example, a 3M team developing a new dental material decided, based on its own analyses, to trade off some sophistication in the material to get it to market sooner. Officials from the Advanced Amphibious Assault Vehicle Program report that their teams reduced the time needed to make a system design decision from 6 months to about a week. The teams at the four remaining DOD programs had a less efficient decision-making approach. When these teams faced a significant issue beyond their knowledge and authority, they went through a lengthy, sequential process to obtain information and approval. On one program, for example, a trade-off between reducing performance requirements and increasing weight took a team 6 months and numerous consultations with other teams, the contractor, program managers, and service officials.

GAO observed a consistency between the effectiveness of teams and product
outcomes on the eight cases studied: programs that were meeting product
development objectives had more effective teams than the programs that were
having problems. In addition to meeting objectives, the successful programs often surpassed their predecessors in both time to market and performance. For example, Hewlett-Packard officials stated that an integrated product team cut cycle time and increased productivity six-fold. The four programs with less effective teams were experiencing problems including cost growth, schedule delays, and/or performance difficulties. While not unusual for weapon system programs, these are the kinds of problems DOD hoped integrated product teams could help solve.

Expertise and Authority Are Key to Effective Integrated Product Teams

Integrated product teams in leading commercial firms and the Advanced
Amphibious Assault Vehicle program had the right mix of expertise to develop new products. Their teams were responsible for developing and delivering the product and making day-to-day decisions on cost, design, performance, quality, test, and manufacturing issues. The combination of product responsibility and expertise put the teams in a position to have enough information to tackle crucial issues, like trade-offs, without having to rely heavily on outside organizations. The 3M team's decision on the dental material is an excellent example.

Other factors significantly enhanced team effectiveness. Collocating key members facilitated communication, built trust, and contributed to unity of purpose, all key elements of effective decision-making. For example, at the Advanced Amphibious Assault Vehicle Program, because representatives from the contractor and the DOD program office are located in the same building, there is little or no delay in getting answers or sharing information to make decisions. Where physical collocation is not possible, leading firms link team members through electronic means, such as shared databases and software. On the more effective teams, team leaders selected members rather than having members assigned by another organization. This allowed team members to demonstrate commitment to, and alignment with, the team's goals.

GAO examined 12 teams in detail from the DOD programs experiencing development problems. Seven of these teams did not have responsibility for day-to-day decisions on the range of product development issues, nor did they bear responsibility for delivering the product. Rather, they were limited to a segment of the product development process, such as monitoring system performance requirements, testing the system, or providing logistics during fielding. The remaining five teams that could claim product responsibility were missing representatives from key areas of expertise, such as cost and testing, or from key organizations, like the contractor.
Regardless of whether product responsibility or expertise was lacking, the effect on a team was the same: it was not capable of identifying problems and resolving them expeditiously through a collaborative decision-making process. Moreover, the teams did not enjoy collocation and control over membership.

Corporate leaders from DaimlerChrysler, 3M, and Hewlett-Packard demonstrated their commitment to integrated product teams by reorganizing to better align their structure with the teams and by making targeted investments in physical assets, training, and other forms of help. These changes helped ensure success at the working level. The firms delegated considerable power to the teams and held the teams accountable for delivering on set goals. They made it possible for the typical program manager to succeed in managing with integrated product teams. DOD did not go much beyond policy statements to create a supportive environment for integrated product teams. On the weapon programs experiencing problems, implementation often meant changing team labels rather than altering lines of authority or team dynamics. Little training was provided, and then only at the initiative of the program. Program teams were often not involved in setting key product goals, and program officials observed that unrealistic goals were set before the teams were formed. Regardless of their efforts, the teams could not make up for the unachievable goals.

Differences in DOD and Commercial Teaming Approach Reflect Different
Environments

Differences in how commercial firms and DOD managers measure success and in
the pressures they face in starting programs significantly affect the
environment for integrated product teams. Commercial products' success is
measured in terms of the customer's acceptance of the final product and
cycle times short enough to beat the competition. These conditions create
incentives for gaining knowledge early, forming realistic goals and
estimates, and holding teams accountable for delivering the product, all of
which favor an integrated product team approach. In DOD, the pressures to
successfully launch new programs and protect their funding, coupled with
long cycle times, create incentives to be overly optimistic in setting
program goals and to focus on process concerns like obtaining incremental
funding. DOD's necessary reliance on defense contractors introduces another
complication for integrated product teams because the two organizations
share responsibility for the product but do not necessarily share the same
incentives.

The Marine Corps Advanced Amphibious Assault Vehicle program has many of the
teaming characteristics of leading commercial firms. This accomplishment was
made possible by the unique environment, or culture, that the program's
initial manager created to center around the integrated product team
approach. Unlike the other DOD cases, the teams were not made to fit among
standing organizations and procedures.

Recommendations for Executive Action

GAO recommends that the Secretary of Defense designate as integrated product
teams only those teams that will have the day-to-day responsibility for
developing and delivering a product, such as a weapon system, and the cross
section of expertise to do so. GAO recommends that the Secretary of Defense
use the practices and characteristics described in this report to develop
and communicate standards for what constitutes an integrated product team.
GAO also recommends that the Secretary of Defense put weapon system program
offices in a better position to create and sustain effective integrated
product teams, such as by giving them responsibility for a deliverable
product, authority to make decisions on that product, and representation
from the critical areas of expertise. Finally, GAO recommends that the
Secretary of Defense help program managers and team leaders become catalysts
for implementing the integrated product team approach by (1) devoting
professional education to making these individuals capable of creating the
culture necessary to foster integrated product teams and (2) drawing lessons
from programs like the Advanced Amphibious Assault Vehicle for bridging the
barriers between program offices and contractors.

Agency Comments

DOD agreed with the report and most of its recommendations. DOD partially
concurred with the recommendation that only those teams with day-to-day
responsibility for a product and the necessary cross section of expertise be
designated as integrated product teams. It noted that while such teams are
unique and require certain conditions and investments, the designation
"integrated product team" has spread throughout the workforce and has
benefited other teams as well. DOD does not want to lose those benefits by
limiting the designation. DOD's position reflects the practical reality that
the designation of integrated product teams is now difficult to restrict.
Given the Department's recognition that program office integrated product
teams require certain conditions and investments to succeed that other
integrated product teams may not need, GAO believes that if the Department
takes the actions contained in GAO's other recommendations, the objective of
the recommendation will be achieved. DOD's comments appear in full in
appendix I.

Chapter 1: Introduction

Reflecting its urgency to acquire new weapon systems to replace those seen
as outdated and too costly to operate, the Department of Defense (DOD) has
boosted its annual weapon system investment from about $80 billion 4 years
ago to about $100 billion for fiscal year 2001. Over the next 5 years, DOD
plans to spend over $500 billion developing and acquiring weapon systems.
DOD would like to get the most out of this investment and has set goals to
develop new weapons in half the traditional time and within budget.
Historically, DOD has not received predictable returns on weapon system
investments. Although these systems provide superior capability, they have
cost significantly more and taken much longer to complete than originally
estimated. When one program needs more money than planned, unplanned
trade-offs, such as delaying or canceling other programs, may be necessary.
As a result of such recurrent problems, about 5 years ago we began a body of
work to examine weapon system acquisition issues from a different, more
cross-cutting perspective: one that draws lessons learned from the best
commercial product development efforts to see if they can be applied to
weapon system development. Leading commercial firms have developed
increasingly sophisticated products in significantly less time and at lower
costs, the kinds of results that DOD wants.

Our previous work has shown that leading commercial firms expect their
program managers to deliver high-quality products on time and within
budget.1 Accordingly, the firms have created an environment and adopted
practices that put their program managers in a good position to succeed in
meeting these expectations. We have also reported on the importance of
having knowledge about a product's technology, design, and producibility at
key junctures in the product development process. A key vehicle leading
commercial firms employ to attain such knowledge is the integrated product
team (IPT). Although organizations may employ various types of teams to
develop new products, an IPT is a particular type of team vested with (1)
the knowledge from the different areas of expertise needed to design,
develop, and manufacture a new product and (2) the authority to use that
knowledge in making decisions about the product. According to leading
commercial firms, IPTs have proven essential to improving product
development outcomes. IPTs have enabled firms like DaimlerChrysler to
significantly reduce the time it takes to develop a new product, by as much
as 50 percent, while at the same time yielding a product more sophisticated
and possessing higher quality

1 Best Practices: Better Management of Technology Development Can Improve
Weapon System Outcomes (GAO/NSIAD-99-162, July 30, 1999).

than its predecessors. This report identifies best practices for creating
effective IPTs, such as those used by leading commercial firms, which can
help DOD develop and produce better weapon systems significantly faster and
at less cost.

Product development, whether for commercial or defense application, is a
complex undertaking. The process begins with a concept or idea for meeting a
customer's need, the idea is converted to detailed design drawings, and the
design is translated into articles or prototypes that can be tested. During
the product's development, the processes for manufacturing the product must
also be identified and tested. The development process is characterized
by a tension between competing demands on the product. These demands include
the desire for the highest performance and the most features, the lowest
cost and shortest time to market, and the ease with which the product can be
produced in both quantity and quality. Trade-offs between these demands
must be made to provide the customer a desirable product quickly and at a
reasonable price. If performance features are allowed to dominate, the
product may become too expensive. If costs are cut too much, then the
product's quality may suffer. A product design that ignores the limits of
manufacturing processes may never make it into the hands of the customer.

Taking a product from idea to delivery requires expertise from a number of
different professions or functions, which can vary depending on the type of
product. To illustrate, designing a product's features may require the
collaboration of people with expertise in areas such as mechanical,
electrical, materials, and software engineering. People with a financial
management background are needed to accurately estimate the cost of the
product and to keep track of the budget. People expert in test and
evaluation are needed to objectively assess the performance of product
prototypes. Production engineers make sure the design lends itself to proven
manufacturing processes, even developing new processes when necessary.
Quality assurance experts ensure that defects are kept out of the product
design and manufacturing processes. Yet another group, often part of the
marketing function in commercial industry, is responsible for understanding
and representing the customer's needs.

The Rise of IPTs in Product Development

In commercial industry, how the knowledge of these experts and the authority
for making decisions are brought to bear on the product development process
has evolved considerably. Years ago, as companies grew and additional
products were developed, many tended to organize work around departments and
divisions that represented areas of expertise, referred to as a functional
approach to product development. This approach, with some illustrative
functions, is shown in figure 1 below.

Figure 1: Functional Approach to Product Development

Source: GAO.

In this approach, knowledge was segregated or distributed by function, as
was authority. Each organization managed and made decisions on its piece of
a number of different products. Development of a product occurred
sequentially, with people from each function doing their work on the product
and then handing it over to the people from the next function. While each
function attained a high level of expertise, the knowledge needed to
recognize a potential problem often resided in a function that came later in
the product development process. Thus, proposed solutions had to be reworked
in the preceding functions. For example, if the manufacturing group for an
automobile found that the engine compartment was not large enough to hold
the engine, the automobile would be turned back to the design engineers. The
engineers would have to redesign the engine compartment, the financial staff
would have to reassess the costs, and the test and evaluation people might
have to reevaluate the vehicle's crash protection performance. After this
additional time and effort (rework), the automobile could once again proceed
to manufacturing.
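The sequential hand-off pattern just described can be sketched in a few lines. The function names mirror figure 1; the `develop_sequentially` helper and its rework logic are illustrative, not part of the report.

```python
# Illustrative sketch of the sequential, functional approach described above.
# Function names mirror figure 1; the rework mechanics are hypothetical.

FUNCTIONS = ["Engineering", "Finance", "Test and Evaluation", "Manufacturing"]

def develop_sequentially(problem_found_at=None, problem_origin=None):
    """Walk a product through each function in order.

    If a downstream function detects a problem that originated upstream,
    the product is sent back and every function in between repeats its
    work (rework), as in the engine-compartment example.
    """
    steps = []
    i = 0
    reworked = False
    while i < len(FUNCTIONS):
        steps.append(FUNCTIONS[i])
        if (not reworked and problem_found_at is not None
                and FUNCTIONS[i] == problem_found_at):
            # Problem detected late: return to the originating function.
            i = FUNCTIONS.index(problem_origin)
            reworked = True
            continue
        i += 1
    return steps

# Engine compartment too small, found in Manufacturing, fixed in Engineering:
trace = develop_sequentially("Manufacturing", "Engineering")
print(len(trace))  # 8 hand-offs instead of 4: rework doubles the passes
```

The point of the sketch is that a problem caught by the last function forces every intermediate function to repeat its work, which is the rework loop the figure depicts.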


In the 1980s, companies began to look for better ways to bring the knowledge
of the people in different functions together in the design phase of a new
product to reduce rework and shorten cycle times. They organized teams made
up of a cross section of the different functional disciplines and gave them
responsibility for developing an entire product. These efforts evolved into
the IPT approach as it is known today. In the 1990s, Boeing received acclaim
for the success of its 777 aircraft, which was developed using design/build
teams, a form of IPT.

The essence of the IPT approach is to concentrate in a single organization
the different areas of expertise needed to develop a product, together with
the authority and responsibility to design, develop, test, and manufacture
the product. Figure 2 illustrates some of the areas of expertise that can be
brought into the structure of an IPT organization.

Figure 2: IPT Approach to Product Development

Source: GAO.

Under the IPT approach, each team possesses the knowledge to collaboratively
identify problems and propose solutions, minimizing the amount of rework
that has to be done. When this knowledge is accompanied by the authority to
make key product decisions, IPTs can make trade-offs between competing
demands and more quickly make design changes, if necessary. For example,
design engineers on a Caterpillar IPT initially proposed that very large
differential gears be used to transmit power from the engine to the rear
wheels on a large vehicle. While other team members did not see a problem,
an experienced production engineer on the team noted that no gear
manufacturer made a gear that large and that to create such a production
capability would be risky. Consequently, the design engineers revised the
design to enable
existing differential gears to be used, saving significant time in the
process. In the functional approach to product development, this design
problem might not have been discovered until late in product development,
when the manufacturing organization got involved.

Adoption of the IPT Concept in DOD

DOD accepts IPTs as a vehicle for getting better acquisition outcomes. This
acceptance was formalized in May 1995, when the Secretary of Defense
directed that the concept of IPTs be applied throughout the acquisition
process to the maximum extent possible. DOD employs three basic levels of
IPTs: (1) the Overarching IPT works above the program level, and its primary
responsibility is to advise the Defense Acquisition Executive on issues
related to all of the programs the executive is responsible for; (2) the
Working-Level IPT also works above the program level and links the program
manager to the Overarching IPT; and (3) the Program IPT represents the
program level and executes the tasks to design, develop, and manufacture a
weapon system. The first two types of IPTs perform oversight on a program
and, other than the program manager, do not typically include people from
the program office. Within DOD, IPTs were to become the main element of an
overall management approach that calls for considering all aspects of a
weapon system, including performance features, manufacturing processes, and
logistic support, throughout design and development.
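The three IPT levels above can be condensed into a small lookup table. A minimal sketch: the structure and the `oversight_ipts` helper are illustrative, with the role descriptions condensed from the text.

```python
# The three DOD IPT levels described above, as a small lookup table.
# Wording condensed from the report; the structure itself is illustrative.
DOD_IPT_LEVELS = {
    "Overarching IPT": {
        "level": "above program",
        "role": "advises the Defense Acquisition Executive on all programs",
    },
    "Working-Level IPT": {
        "level": "above program",
        "role": "links the program manager to the Overarching IPT",
    },
    "Program IPT": {
        "level": "program",
        "role": "executes the tasks to design, develop, and manufacture "
                "a weapon system",
    },
}

def oversight_ipts():
    """Return the IPT types that oversee, rather than execute, a program."""
    return [name for name, info in DOD_IPT_LEVELS.items()
            if info["level"] == "above program"]
```

The lookup makes the report's distinction explicit: only the Program IPT sits at the program level; the other two perform oversight.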

While the basic DOD and commercial product development processes are
similar, the number and responsibilities of key players differ. In the
commercial world, there are two main players in product development: the
product developer and the customer. Figure 3 describes the role of the
product developer and the customer in commercial product development.

Figure 3: Players in Commercial Product Development

Source: GAO.

The DOD process for product development and acquisition is somewhat more
complex because it involves at least three major players: the DOD customer,
the DOD program office, and the product developer, as illustrated in figure
4.

Figure 4: Players in DOD's Product Development Programs

Source: GAO.

The additional player complicates the task of taking a product from concept
to delivery because the knowledge and authority to accomplish the tasks are
distributed between the product developer and the program management office.
Thus, to concentrate knowledge and authority in IPTs for weapon systems,
organizational and functional barriers must be bridged.

[Figure 3 roles. Customer: an organization that is buying a product to be
built to the buyer's specifications and needs. Product developer: a
manufacturing firm that is responsible for developing and producing the
product; the firm has a product team with the knowledge, capabilities, and
resources to translate the customer's specifications and needs into a
product that can be designed and produced with agreed-upon resources.]

[Figure 4 roles. Customer: the warfighting community that uses weapons to
perform combat missions and creates the demand for new weapons. DOD program
office: the DOD acquisition workforce that develops a strategy for acquiring
a new weapon and marshals the resources to execute the strategy. Product
developer: the defense firm that designs, develops, and manufactures the
weapon system.]

Objectives, Scope, and Methodology

The Chairman and the Ranking Member, Subcommittee on Readiness and
Management Support, Senate Committee on Armed Services, requested that we
conduct a body of work to examine various aspects of the acquisition process
to identify best practices that can improve the outcomes of weapon system
programs. To date, we have issued reports on advanced quality concepts,
earned value management techniques used to assess progress on major
acquisition programs, management of a product's transition from development
to production, management of the supplier base, technology maturation,
training program offices on the application of best practices, testing and
evaluation, and setting product requirements (see related GAO products at
the end of this report).

This report covers the use of IPTs in new product development. Our overall
objective was to evaluate best practices for creating effective IPTs that
can help DOD better manage weapon system programs. Specifically, we examined
(1) whether and how integrated product teams affect decision-making and
product outcomes, (2) what factors are key to creating effective integrated
product teams, and (3) how the environment in which products are managed
affects the prospects for effective integrated product teams.

We follow a similar overall methodology for conducting best practices
reviews of DOD's process for developing new weapon systems. We start by
identifying an aspect of weapon system development (in this report, the use
of integrated product teams) that has been shown to have a significant
impact on the outcomes of new product developments. Our sources for such
information include the large body of individual weapon system reviews we
have conducted over many years; studies from other sources, such as the
Defense Science Board; and discussions with defense experts, including past
and current DOD officials, defense industry representatives, and analysts
from private organizations that study defense issues. Before beginning a
review of a particular topic, we confirm with DOD officials that the topic
is one in which the potential for improvement is significant. Once we have
identified the topic, we use a case study approach because case studies
provide the in-depth knowledge needed to understand individual practices.
They show how practices affect program outcomes as well as issues
surrounding their adoption and implementation. In selecting case studies, we
look for examples of (1) excellent practices from leading commercial firms,
(2) typical or prevailing practices within DOD organizations, and (3) where
possible, DOD organizations that exhibit excellent practices.

To obtain information about teaming practices and identify the best
practices in the use of IPTs in the commercial world, we conducted
literature searches and contacted university faculty, industry associations,
and consultants in the use of product development teams. We selected several
companies known for their exceptional use of integrated teams in product
development that resulted in better product performance and reduced
development cycle time. We visited each company to discuss (1) the way teams
contribute to better product development outcomes, (2) the structure and
organization of teams, and (3) the organizational support and commitment
needed to enable teams to achieve their potential. In addition, we obtained
an understanding of the overall teaming process and the practices that the
companies believed were critical for successful teams. We selected at least
one team from each company for an in-depth review. After our visits, we
prepared individual company summaries, from which we developed a model that
represents best teaming practices. The firms we visited and the teams we
selected follow.

• DaimlerChrysler, an automobile manufacturer located in Auburn Hills,
Michigan. DaimlerChrysler's Minivan Platform team is responsible for the
design, development, and production of new minivans. Team Epic, part of the
Minivan Platform team, designs and develops electric vehicles.

• 3M, a manufacturer of a variety of industrial and consumer products,
located in St. Paul, Minnesota. 3M's Pluto team is responsible for the
development of a new dental material.

• Hewlett-Packard, a high-technology electronic products manufacturer
located in Palo Alto, California. Hewlett-Packard's Snakes Program is
responsible for developing new computer workstations.

Our report summarizes a number of best commercial practices in the use of
IPTs. As such, we do not suggest that all commercial firms use best
practices or imply that all commercial practices represent the best. Due to
the highly competitive nature of these firms' businesses, we do not always
attribute an individual practice to a specific company.

To obtain insights into the dynamics of IPTs used in new weapon system
development efforts, we conducted case studies of five DOD weapon systems.
At the program offices, we interviewed key managers for an overall
perspective of the program. We focused our work on the teams responsible for
executing the development of the weapon system, and we interviewed members
of those teams. We selected at least three different teams at each program
office. The programs were the Advanced Amphibious Assault Vehicle, CH-60S
Fleet Combat Support Helicopter, Extended Range Guided Munitions, Global
Broadcast Service, and Land Warrior. A
description of each program follows.

• The Advanced Amphibious Assault Vehicle is an Acquisition Category I 2
Marine Corps program. It is a high-speed amphibious armored personnel
carrier that will replace the current family of amphibious assault vehicles.
Its purpose is to transport troops from ships to the shore. The vehicle is
estimated to weigh about 37 tons and to carry 17 combat-equipped Marines
plus a crew of 3. It is to travel in excess of 20 knots in the water and
over land at 45 mph. Its armament includes a 7.62 mm machine gun and a 30 mm
cannon. Total budgeted program cost is about $8.7 billion for 1,013
vehicles. It is expected to begin fielding in fiscal year 2006.

• The CH-60S Fleet Combat Support Helicopter is an Acquisition Category I
Navy program. The CH-60S helicopter is the replacement for the current
CH-46D. It is a combination of the Army's UH-60 Blackhawk and the Navy's
SH-60 Seahawk and is designed to provide the Navy with a capability to
replenish forces at sea and to perform search and rescue and airborne mine
countermeasures missions. Program costs are estimated at $4.3 billion. The
program began in 1998 with initial fielding expected in 2002.

• The Extended Range Guided Munition is an Acquisition Category II Navy
program. The weapon is a projectile, 5 inches in diameter, that is fired
from guns aboard Navy surface ships. The projectile incorporates a rocket
motor, an internal global positioning system, and an inertial navigation
system. These systems will give the projectile guidance and control to a
fixed target location determined prior to firing. The rocket motor will
provide greater range capabilities than current projectiles. The program
began in 1996 and is expected to begin fielding in 2004.

2 DOD makes distinctions among categories of weapon systems, primarily
according to the level of investment required. Acquisition Category I
programs are defined as major defense acquisition programs estimated to cost
over $365 million for research, development, test and evaluation, or to have
procurement costs of more than $2.190 billion (both in fiscal year 2000
constant dollars). Acquisition Category II programs are defined as
acquisition programs estimated to cost over $140 million for research,
development, test and evaluation, or to have procurement costs of more than
$660 million (both in fiscal year 2000 constant dollars).
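The thresholds in footnote 2 amount to a simple classification rule. A minimal sketch, assuming a straightforward threshold comparison; the function and constant names are invented for illustration.

```python
# Hedged sketch of the Acquisition Category thresholds quoted in footnote 2
# (fiscal year 2000 constant dollars). Names are illustrative.
ACAT_I_RDTE = 365_000_000          # research, development, test and evaluation
ACAT_I_PROCUREMENT = 2_190_000_000
ACAT_II_RDTE = 140_000_000
ACAT_II_PROCUREMENT = 660_000_000

def acquisition_category(rdte_cost, procurement_cost):
    """Classify a program by the investment thresholds in footnote 2."""
    if rdte_cost > ACAT_I_RDTE or procurement_cost > ACAT_I_PROCUREMENT:
        return "I"
    if rdte_cost > ACAT_II_RDTE or procurement_cost > ACAT_II_PROCUREMENT:
        return "II"
    return "below ACAT II thresholds"
```

For example, a program with $200 million in RDT&E costs and no procurement above $660 million would fall into Category II, matching the band the Extended Range Guided Munition occupies.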


• The Global Broadcast Service is an Acquisition Category I joint-service
program. It will augment and interface with other communications systems and
provide a continuous, high-speed, one-way flow of high-volume data, audio,
imagery, and video information to forces around the globe. It consists of
satellites, fixed and transportable transmitters, and fixed and
transportable receivers. The program began in 1996 and is expected to begin
production in late 2002.

• The Land Warrior is an Acquisition Category II Army program. It is an
integrated fighting system for dismounted combat soldiers. It consists of
five subsystems: computer/radio, software, protective clothing and
individual equipment, integrated helmet assembly, and weapons. The Army
currently expects to procure 34,000 units for a total cost of about $2.1
billion. The Land Warrior is designed to enhance the lethality, command and
control, survivability, mobility, and sustainment of individual soldiers and
infantry units. The program began in 1996, and production is expected to
begin in fiscal year 2003.

To select DOD programs, we identified programs from each of the services
that had experienced problems in meeting cost, schedule, or performance
goals, and at least one program that was meeting its development objectives.
We selected programs in the engineering, manufacturing, and development
phase of the acquisition process so that the teaming practices being
reviewed would reflect those of a single prime contractor and enough
progress would have been made to determine whether the program was meeting
its objectives. We also selected weapon programs that entered this phase
after the 1995 policy was implemented, to ensure that the programs had a
reasonable chance to implement the IPT policy. To address potential
variances due to program size, we selected programs from different
acquisition category levels. The Land Warrior program provided a range of
analytical information: the performance problems described in the report
occurred in 1999 and earlier and revealed limitations in the program's
teaming arrangements, and the report also covers the current program
manager's efforts to overcome these limitations through teaming and other
actions.

We selected three teams from each program office to review in detail. We
used a structured questionnaire to interview approximately 80 IPT leaders,
members, and contractors. Individual team leaders and members also completed
a survey we had prepared regarding how respondents viewed their teams and
their role on the teams. The information collected from the interviews and
the survey was compiled into a database to facilitate a comparative
analysis. Through the analysis, we determined each team's composition,
product responsibility, and the environment in which the team
operates. In addition, we collected information regarding the process used
by the teams to make significant decisions. We flowcharted commercial and
DOD program office team decisions to identify the number and level of
organizations required to make actual decisions, as well as the length of
time it took to make the decision.
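The comparative analysis described above, counting the organizations and elapsed time behind each decision, can be sketched as follows. The record format, the `decision_metrics` helper, and the sample figures are all hypothetical, not data from our review.

```python
# Illustrative sketch of the comparative analysis described above: for each
# recorded decision, summarize the organizations involved and elapsed days.
# The record layout and the sample numbers are hypothetical.
from statistics import mean

decisions = [
    {"team": "commercial", "organizations": 1, "days": 2},
    {"team": "commercial", "organizations": 2, "days": 5},
    {"team": "dod",        "organizations": 5, "days": 30},
    {"team": "dod",        "organizations": 7, "days": 90},
]

def decision_metrics(records, team):
    """Average the organizations consulted and days elapsed per decision."""
    rows = [r for r in records if r["team"] == team]
    return {
        "avg_organizations": mean(r["organizations"] for r in rows),
        "avg_days": mean(r["days"] for r in rows),
    }
```

Comparing `decision_metrics(decisions, "commercial")` against `decision_metrics(decisions, "dod")` is the kind of side-by-side view the flowcharting was designed to support.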

To better understand the environment under which IPTs operate, we reviewed
current DOD policy directives and guidance on using IPTs in weapon system
program offices. We met with officials from the Office of the Secretary of
Defense and the services responsible for the implementation of IPTs in DOD.
We analyzed studies on IPTs conducted by external organizations such as the
Center for Naval Analyses and the Institute for Defense Analyses. We
reviewed evaluations conducted on implementation of IPTs by the Army, the
Navy, and the National Center for Advanced Technologies.

We also drew on knowledge gained from our prior best practices work. In
particular, we have developed a good base of knowledge regarding differences
in the commercial and DOD environments as they relate to developing new
products. We applied this knowledge in our assessment of the environmental
factors that affect the implementation of IPTs in both sectors.

We conducted our review from December 1999 through February 2001 in
accordance with generally accepted government auditing standards.

Chapter 2: IPTs Help Programs Achieve Better Outcomes

Of the eight programs we reviewed, four exhibited the characteristics
considered the hallmark of IPTs- the ability to efficiently make product
development decisions that cut across different lines of expertise. Compared
with the other four programs, these IPTs had the knowledge and authority to
make decisions in less time, with fewer consultations outside the team, and
with fewer reviews and approvals. At the time of our review, their products
were developed or were being developed within the time frames and budgets
originally estimated. In most cases, the IPTs developed products that
outperformed previous products that were developed without IPTs. The IPTs
were credited with making these results possible; they were thus not only
more efficient but also more effective.

The teams from the remaining four programs relied heavily on consultations
with individuals outside of the team to obtain the knowledge and approval to
make significant decisions. While no team is expected to operate in
isolation, the degree of outside consultations made the decision- making
process of these teams much longer and less efficient- much like the process
that predated IPTs. The four programs with less effective teams all
experienced difficulty in meeting product development objectives- manifested
by cost growth, schedule delays, and/ or performance problems. While it is
difficult to prove a direct cause and effect relationship, in some cases,
managers cited the teams' ineffectiveness as directly contributing to the
problems; in other cases, the teams were not in a position to solve or
prevent problems.

IPTs Can Improve the Decision- Making Process

Officials at the leading commercial firms and DOD's Advanced Amphibious
Assault Vehicle program believe that because of the knowledge and authority
that resided in their IPTs, the teams required fewer external reviews and
approvals. Consequently, the decision- making process was significantly
shortened- from months to a week or less. While IPTs do not work in
isolation, effective IPTs are self- sufficient, containing the variety of
expertise to recognize early when decisions are needed, such as trade- offs
between competing demands, and the authority necessary to make these
decisions. An example is 3M's Pluto IPT, which was responsible for
developing a technology capable of producing a low- shrinkage dental
material. The team needed to make a choice concerning the first product to
be marketed based on this new technology. The choice was between producing a
simpler material that could be delivered to customers sooner or taking more
time to deliver a more technically advanced material. The team assessed the
trade- offs and decided in favor of delivering the simpler product sooner
and the more technically advanced material later. Figure 5 depicts the
decision- making process used
by this IPT.

Figure 5: Decision- Making Process Employed by 3M's Pluto Team

3M's Pluto IPT had the knowledge and authority to make a significant trade-
off between product performance and schedule.

Source: GAO.

Pluto team members represented the key areas of expertise needed for the
product and were able to assess the technical feasibility of the two options
as well as the cost and schedule trade- offs. The team conducted the
necessary evaluations and research to make its decision, which was supported
by upper management.

Officials from the Advanced Amphibious Assault Vehicle program cite similar
experiences with IPTs. For example, the Firepower IPT developing the 30- mm
gun challenged a requirement that the targeting system maintain accurate
calibration for several days at a time. The team's design engineers believed
the requirement would be costly to achieve. The team's user representative
reported that gun operators, as a standard practice, calibrated the
targeting system daily before each mission. Therefore, the requirement for
maintaining accuracy could be reduced to 1 day- a trade- off
that made for a less sophisticated and less costly design. Such a decision
can prevent problems later in the development cycle that additional
technical sophistication can cause. Figure 6 depicts the decision- making
process used by this IPT.

Figure 6: Decision Process Followed by the Advanced Amphibious Assault
Vehicle Firepower IPT

The Firepower IPT made a significant decision to trade performance and
reduce cost. Source: GAO.

The mix of expertise on the Firepower IPT provided the knowledge to identify
the problem and reach a decision to make the trade- off between cost and
performance. The IPT was able to make the decision in 1 week and only had to
consult with one organization outside of the team- the group that set the
original performance requirements.

Less Effective Teams Had a More Sequential Decision- Making Process

The teams at the four remaining DOD programs had a less efficient decision-
making approach. These teams had to routinely consult with several
organizations because the knowledge and authority to make significant
decisions did not reside within the team. When these teams were faced with a
significant issue that outstripped their knowledge and authority, decision-
making involved a lengthy and inefficient sequential
process to obtain information and approval.

In one case, a contractor team working on a weapon system found that a
performance requirement could not be met without increasing the weight of
the weapon system. The consequence of the increased weight was that more
vehicles would be required to transport the system, increasing the logistic
burden on the users. The problem was referred to the program office team
that was responsible for ensuring that the contractor met performance
requirements. The team lacked the authority to change the performance
requirement or the type of vehicles required. After 6 months and numerous
requests
for knowledge and authority, the decision was made to accept the added
weight and to increase the number of vehicles. Figure 7 depicts the
decision- making process the team used.

Figure 7: Sequential Decision- Making Process for Adding Vehicles to
Accommodate More Weapon System Weight

This DOD team, when faced with a significant trade- off issue, required 6
months to reach a decision and had to involve many players from various
levels outside of the team.

Source: GAO.

The program office team expended a great deal of effort to collect, analyze,
and exchange information with six other teams within the program office
(such as logistics and testing), the program manager level, the prime
contractor, and representatives from all three military services that were
to use the weapon system. Moreover, the team had to consult with these
organizations each time new information was obtained- a form of
rework. Despite this complicated process, a representative from the prime
contractor observed that the decision should have taken less time to make.
Furthermore, the representative stated that the contractor was not
significantly involved in making the decision and questioned the decision
because it increased the complexity of the design and placed additional
vehicle and manpower burdens on the system's user.

Officials from the Advanced Amphibious Assault Vehicle program painted a
similarly complex picture when they analyzed how the gun calibration
decision would have been made without IPTs. They estimated that the decision
would have taken 6 months to reach because the required knowledge and
authority would have been much more widely dispersed among other teams and
organizations. Figure 8 depicts the program office's assessment of how the
decision to meet the gun calibration requirement would have been made
without an IPT.


Figure 8: How Advanced Amphibious Assault Vehicle Gun Calibration Decision
Would Have Been Made Without IPTs

Making the same gun calibration decision on the Advanced Amphibious Assault
Vehicle program without an IPT would have required more time to obtain
knowledge and authority outside the team, particularly above the program
level.

Source: GAO.

This decision would have required sequential consultation with six different
functional organizations, the program management office, the user community,
and the defense contractor. When facing such a decision path, it is
understandable that the team that had the original idea might decide not to
propose the change, finding it easier to pursue the technical solution
rather than the requirement trade- off. Representatives from the other
weapon system programs that experienced problems meeting development
objectives described similar processes that required the teams to consult
with multiple organizations for information, concurrence, or authorization
before a decision could be made.


Improved Product Outcomes Attributed to IPTs

We observed a consistency between the effectiveness of teams and product
outcomes on the eight cases we studied: programs that were meeting product
development objectives had more effective IPTs than the programs that were
having problems. Table 1 depicts the product outcomes for the eight cases.

Table 1: Effective IPTs Also Had Successful Product Development Outcomes

Effective IPTs

  DaimlerChrysler
    Cost: Product costs lowered
    Schedule: Development cycle decreased by 50 percent
    Performance: Improved vehicle designs

  Hewlett- Packard
    Cost: Decreased cost by over 60 percent
    Schedule: Decreased development schedule by over 60 percent
    Performance: Improved system integration and product designs

  3M
    Cost: Outperformed cost goals
    Schedule: Product delivery estimates shortened by 12 to 18 months
    Performance: Improved performance by 80 percent in comparison to
    current products

  Advanced Amphibious Assault Vehicle
    Cost: Current product unit cost lower than original product estimate
    Schedule: Ahead of original development schedule
    Performance: Demonstrated 5- fold increase in speed

Less Effective IPTs

  CH- 60S helicopter (a)
    Schedule: Schedule delayed
    Performance: Software and structural difficulties

  Extended Range Guided Munition
    Cost: Increases in development costs
    Schedule: Schedule slipped 3 years
    Performance: Redesigning due to technical difficulties

  Global Broadcast Service
    Cost: Experiencing cost growth
    Schedule: Schedule slipped 1 1/2 years
    Performance: Software and hardware design shortfalls

  Land Warrior (b)
    Cost: Cost increase of about 50 percent
    Schedule: Schedule delayed 4 years
    Performance: Overweight equipment, inadequate battery power and design

(a) Program official told us that program costs increased due to a
requirement for additional capabilities and an increase in the number of
helicopters.

(b) The Land Warrior performance problems cited here primarily reflect the
first version of the system, circa 1999. The system has since been
redesigned but had not completed testing at the time of our review.

Source: GAO analysis of commercial and DOD data.

In addition to meeting product development objectives, the successful
programs were often surpassing the performance of their predecessors in both
time to market and performance. These improvements were attributed in large
part to the effectiveness of the IPTs. The four programs with less effective
teams were experiencing the kinds of problems that, while not unusual for
weapon system programs, DOD hoped IPTs could help solve.


Effective IPTs Helped Reduce Product Cost and Cycle Time

Officials at leading commercial firms and the Advanced Amphibious Assault
Vehicle program attribute their successful product outcomes directly to
their IPTs. Specifically, DaimlerChrysler officials attributed reduced cycle
time, improved product performance, and better market success to their
switch to IPTs. Hewlett- Packard officials stated that the company's teaming
approach resulted in higher product quality, better design results, and
improved system integration. A Hewlett- Packard official stated that the
Snakes Program team simultaneously developed three computer workstations in
9 months, half the time normally required, with four times the performance
of existing workstations. A Hewlett- Packard IPT developing printer
equipment increased productivity six- fold, despite using one- quarter
fewer employees, and reduced the product defect rate to 2 percent- most of
which were cosmetic defects. In another example, company officials said
that in the past, test equipment was developed at a cost between $25,000 and
$70,000 and required up to 4 years to develop- which was well behind the
performance of their competitors. Now, their IPT approach enables the
company to develop a higher quality product in two- thirds less time and
with a price of $10,000 to $25,000.

A 3M official in the dental products division stated that the Pluto IPT
created a revolutionary dental material that surpasses similar products on
the market. The team leader reported that members developed a material that
shrinks 50 percent less than current materials and can withstand 80 percent
more stress. In addition, team members filed five patents, of which four
have been issued- a valuable benefit to the company. The team leader
credits the IPT's decision- making- including the trade- off between
product performance and schedule- with shortening the product development
time by as much as 18 months. In addition, the team leader believes that the
IPT will outperform the competition because 3M's patents make it difficult
for other companies to bring a product to market in a similar technology
area. Lastly, Advanced Amphibious Assault Vehicle program officials believe
their IPT approach was critical to the program's ability to meet or exceed
its cost, schedule, and performance objectives since it began in 1995-
atypical for large DOD programs.

Programs With Less Effective IPTs Experienced Poor Outcomes

The remaining four DOD cases we reviewed experienced problems, including
schedule delays, cost overruns, or a failure to meet performance objectives.
For example, the schedule and cost targets were increased for the Extended
Range Guided Munition Program because key performance requirements proved
too difficult to meet within the original estimates. Also, the Land Warrior
program manager restructured the program's
operations, including selecting a new contractor, after the initial version
of the equipment proved too heavy and ineffective in testing. Restructuring
the program and redesigning, developing, and testing an improved version of
the equipment added cost and time to the effort. Finally, the CH- 60S and
Global Broadcast Service programs also experienced schedule delays when
technical problems were revealed as the software or systems were tested; GBS
also experienced cost growth.

It is difficult to isolate a cause and effect relationship between less
effective teams and program problems. However, in some cases, program
officials and team members did link ineffective teams to poor product
outcomes. Several team members attributed poor outcomes to one program's IPT
structure or the IPT's ineffective decision- making process. Specifically,
one team member stated that the inability of the IPT- which was led and
primarily staffed by contractor employees- to make a decision on a key
technical component resulted in an overall program schedule delay, cost
increase, and reduced performance requirements. A team leader from another
program observed that the program's structure- which required teams to
report to one another- slowed the decision- making process and resulted in
difficulties in establishing performance requirements. He added that some of
the discrepancies in the requirements could have been avoided. In another
case, a team member on a program experiencing cost and schedule increases
identified a potential technical issue and proposed a change in the weapon's
design. The contractor, who had final authority, refused the change and
moved forward with the original design. This design was ultimately deemed
unacceptable by the user.

Chapter 3: Authority and Knowledge Are Key to IPT Effectiveness


The Best IPTs Had the Knowledge, Authority, and Other Elements to Be
Effective

Effective IPTs possess the knowledge and authority essential to the kind of
decision- making that is their hallmark. Knowledge is sufficient when the
team has the right mix of expertise to master the different facets of
product development. Authority is present when the team is responsible for
making both day- to- day decisions and delivering the product. These two
elements are essential to determining whether a team is in fact an IPT.
Other factors significantly enhance an IPT's effectiveness. For the programs
we studied, effective IPTs had key members physically collocated where
possible to facilitate the communication, interaction, and overall
operations. When physical collocation was not possible, resources were
provided to connect members through virtual means, such as shared software.
Effective IPTs were also given control over selecting members, and changes
in membership were driven by the team's need for different knowledge or
skills.

In the programs experiencing product development problems, the teams either
did not have responsibility for product development or were missing key
areas of expertise. Although called IPTs, in reality, they were not. If a
team is missing either the knowledge or the authority to recognize and make
difficult decisions, it is ill- equipped to carry out the role expected from
an IPT. Some of these programs had separate DOD and defense contractor
teams, which further dispersed knowledge and authority. Moreover, DOD did
not routinely collocate team members. Less effective teams also did not have
control over their composition. Team membership fluctuated often but did not
appear to be directly tied to the needs of the project; members left and
joined the team due to personnel rotation policies or other reasons.

Product Responsibility and Cross- functional Membership Are Essential IPT
Elements

Research shows that product development responsibility and cross- functional
membership are fundamental IPT elements. If a team lacks expertise, it will
miss opportunities to recognize potential problems early; without authority,
it can do little about them. IPTs in leading commercial firms and the
Advanced Amphibious Assault Vehicle program had the right cross section of
functional disciplines to develop new products. Their IPTs were responsible
for developing and delivering the product and making day- to- day decisions
on cost, design, performance, quality, test, and manufacturing issues. The
combination of product responsibility and expertise equipped the IPTs with
the information needed to tackle crucial issues- like trade- offs- without
having to rely heavily on organizations outside the IPT. Once the IPTs were
so equipped, the collocation of team members and control over the selection
of members made them even better.


Along with being responsible for developing a complex new dental material,
3M's Pluto IPT had the authority to conduct research, select material
attributes based on customer needs, determine the delivery schedule,
estimate the cost of the material, and perform and evaluate the scientific
experiments to create the material. To meet these expectations, the team
possessed all key areas of expertise. Figure 9 illustrates the variety and
types of expertise found on the IPT.

Figure 9: Organization of 3M's Pluto IPT

3M's Pluto IPT has representation from all of the functional disciplines
needed to design, develop, and produce the new dental material.

Source: GAO analysis based on discussions with 3M.

Figure 9 shows the team's composition: a team leader and members from
marketing, technical/ professional service, a material specialist, a
dentist, a hardgoods engineer, a regulation specialist, a manufacturing
engineer, and a chemist.

Hewlett- Packard's Snakes IPT consisted of representatives from research and
development, marketing, quality, leadership, finance, and
manufacturing. Collectively, the IPT is responsible for designing,
developing, and building new computer workstations. Company officials noted
that the breadth of knowledge on the IPT not only speeds the pace of
development but also increases the amount of innovation. They also stated
that IPTs
may also include customers and suppliers.

Similarly, DaimlerChrysler's Minivan platform team comprises design
engineers and representatives from planning, finance, marketing,
procurement, and manufacturing. The team is vested with full authority to
design, develop, and produce new vehicle lines. Given the complexity of
developing a vehicle, smaller IPTs concentrate on developing component
parts, such as the door. Even the door IPT includes specialists for sheet
metal, glass, hardware, wiring, electrical switches, customer liaison, and
manufacturing. This IPT addresses day- to- day issues on designing door
features, determining performance characteristics, and constructing the
door. Equally important, the IPT is responsible for ensuring the entire door
is ready when production of the vehicle starts. If it is not, the IPT could
delay the entire delivery schedule.

Similarly, the Firepower IPT on the Advanced Amphibious Assault Vehicle
program has responsibility for designing, developing, prototyping, and
testing the gun system, including the barrel, ammunition feeder, and the
gunner's station. The IPT has members from engineering, testing, logistics,
cost estimating, manufacturing, and modeling and simulation. Importantly,
these members are drawn from the Marine Corps acquisition workforce, weapon
system operators, and the defense contractor and subcontractors responsible
for building the system.


Collocated Members Provide Benefits to IPTs

All of the IPTs that were producing good outcomes had their core team
members physically working in the same location. Based on actual results,
officials from the three commercial firms and the Advanced Amphibious
Assault Vehicle office shared the view that collocation provided many
benefits and cited it as a key factor in positioning an IPT for success. For
example, collocated IPT members can raise issues earlier, perform tasks
faster, and reach decisions more quickly than core members who are
geographically dispersed.

Figure 10: Hewlett- Packard Printer

Hewlett- Packard uses special software, the internet, and communication
devices to virtually collocate remote IPT members that develop new products
such as the printer shown here.

Source: Hewlett- Packard.

When collocated, team members can have frequent ad hoc meetings to share
information and identify issues that could require trade- offs. Face- to-
face informal communication greatly adds to information flow, better
cohesion, and a full understanding of other members' roles- all of which
help foster team unity and performance. Company representatives told us that
the regular informal interaction reduces the need for formal team meetings-
such meetings account for a small percentage of an IPT member's time.
Lastly,
company officials told us that collocated teams are able to build trust,
which can improve their functioning.

Company leaders observed negative team dynamics when members were
geographically dispersed. Without constant face-to-face interaction, team
members were inclined to have separate discussions and make decisions
regarding product development without involving one another. On occasion,
this resulted in a disconnect and took the members in opposing directions.
Representatives from one company observed that when members are at remote
locations, it is difficult to have team cohesion, and the individuals must
work harder to achieve the same level of efficiency as the collocated
representatives. Several officials stated that remote IPT members can be
excluded from spontaneous informal communications or interactions. As a case
in point, the leader of one IPT is very concerned that a contemplated move
of some members 1 mile from the core team could damage the team's
effectiveness.

The three commercial firms and the Advanced Amphibious Assault Vehicle
program went to great lengths to collocate IPT members. When DaimlerChrysler
relocated its operations, the firm constructed a facility to house all
platform team members in one location, including 800 to 900 permanent
engineers; 300 to 500 contract engineers; representatives from planning,
finance, procurement, and manufacturing; and some key suppliers. Officials
at 3M report that they also constructed a facility to collocate their IPT
members.


Figure 11: The Marine Corps Advanced Amphibious Assault Vehicle

Collocation has facilitated communications between DOD and contractor
personnel during the development of the Advanced Amphibious Assault Vehicle
shown here.

Source: DOD.

The Advanced Amphibious Assault Vehicle program office and the defense
contractor for the vehicle are collocated, which is atypical for DOD
programs. The original program manager believed that IPTs were the right
way to manage the program and that collocation was essential to effective
IPTs. As a result, he required the contractor to lease a facility, at DOD's
cost, to house its research and development operation on one floor and the
DOD program office staff on another floor. A Marine Corps IPT member
explained that a contractor, when not collocated, may work in isolation and
periodically brief DOD. Much of the work done to that point might need to be
redone if issues arose. He added that working side by side with the
contractor has eliminated the need for formal meetings or briefings because
DOD and contractor members are equally informed regarding the program's
status and progress.

Given that the commercial firms we reviewed have worldwide facilities,
physical collocation of every team member, particularly key suppliers, is
not always feasible. When team members cannot be physically collocated,
leading firms connect remote members through virtual means. The best of
these are shared software and databases that enable team members in one
location to see the results of work done in another location. For example,
if the product designs are stored in a computer database, when one team
member makes a change, the other members can see it in near real-time.
Other companies use advanced equipment to improve their video conferencing
capabilities or enable online team meetings through the Internet. Still,
officials at 3M and DaimlerChrysler believe that virtual collocation does
not replicate the benefits of face-to-face interactions. As a result,
companies temporarily collocate remote members during key phases of product
development to enable them to work side by side with their team.

Control Over Membership Can Enhance IPTs

IPTs at leading commercial firms are given control over selecting members.
The firms believe it is very important that a team have the right expertise.
As a result, team leaders are hand-selected by upper management based on
reputation, knowledge, and/or expertise. In turn, team leaders select team
members they believe have the expertise and the interpersonal skills that
match the team's needs.

At 3M, when a new team is formed, an announcement is sent to employees
notifying them of the team's purpose, time frame, and skills needed.
Employees are allowed to volunteer for teams, and team leaders select team
members from the pool of volunteers. According to company officials, the
self-nomination process allows staff to demonstrate commitment and
alignment with the team's goals and ensures that the team members share a
common purpose.

Membership on commercial IPTs can change for different reasons, such as
attrition or promotion. However, we found that the predominant reason to
change members was to meet the changing needs of the team. At one company,
as the product moved through the development phases, the mix of expertise
was sometimes changed as the team's need for knowledge and skills changed.
For example, conceptual staff, such as design engineers, are needed in the
initial stages but may be replaced with test engineers as the product
proceeds.


Most DOD Teams Did Not Possess the Key Ingredients of IPTs

The four DOD programs that were experiencing problems had teams that lacked
the key elements found in the successful cases: product responsibility and
cross-functional representation. In our view, these teams were IPTs in name
only. Most of these teams did not have responsibility for decisions on
product development issues or for delivering the product. Teams that could
claim product responsibility did not have sufficient cross-functional
representation. Regardless of whether product responsibility or expertise
was lacking, the teams were incapable of identifying problems and resolving
them expeditiously through a collaborative decision-making process.
Moreover, the teams did not exhibit other characteristics that contribute
to effectiveness: collocation and control over membership. Neither
characteristic appears to be required or encouraged by DOD policy, and team
leaders and members perceived difficulties in adopting these
characteristics within DOD.

DOD Teams Lacked the Knowledge and Authority Necessary for IPTs

Seven of the 12 teams we studied were not responsible for the delivery of a
weapon system or a component, nor were they responsible for day-to-day
decisions on product development issues. Instead, the teams were responsible
for monitoring or managing a part of the development process. For example,
several teams exclusively managed the test process under DOD guidance,
including reviewing the contractor's test procedures, scheduling the system
for developmental and operational tests, and ensuring that the test
certification requirements were met. Other teams were responsible for
monitoring the contractor to ensure performance requirements were met,
addressing logistics issues when the system was fielded, tracking system
costs, or handling contract management issues. Still other teams primarily
focused on planning, coordinating, or developing acquisition strategies and
program schedules, and bore no direct responsibility for delivering the
weapon system or one of its components.

The remaining five teams that believed they had product responsibility for
the most part excluded representatives from critical product development
functions such as design or manufacturing. Instead of being integrated into
the team, members from the missing functions were consulted by the team as
issues arose, which made decisions take longer. For example, one team
co-leader stated that his team's responsibility was limited to technical
issues; people from other key disciplines, such as cost, were not team
members. When a cost issue arises, the leader must contact cost experts for
their input. Another team leader from the same program stated that his team
is primarily composed of mechanical engineers with responsibility for many
issues, including design, requirements, manufacturing, schedule, and
production. However, representatives from cost, test, quality, or logistics
were not team members. Those not represented are invited to participate in
team activities only when an issue arises. We were told that a team for one
program had all of the key functional disciplines, including members from
the cost and testing functions. However, after meeting with those
individuals, it was clear that they were not real members; they were either
unaware of the team's existence or had not attended a meeting for a long
period of time.

Knowledge and authority were further dispersed on the 12 DOD teams because
their programs operated with two sets of teams: one belonging to DOD and
the other belonging to the contractor. The DOD teams interacted with the
defense contractor staff to solve problems or to provide periodic updates
but did not routinely include representation from the contractors. When DOD
program officials did participate on contractor IPTs, they typically served
as customer representatives, not as fully participating team members.
Program office and contractor teams met separately and addressed issues
independently, and involvement was limited to sequential reviews.

When limited by lack of product responsibility or lack of requisite
expertise, a team must go to other teams and organizations to get the
knowledge and authority needed to make decisions. The result is a sequential
decision-making process, with numerous rework loops. Program managers and
team leaders put in extra effort to overcome these limitations. One program
manager created temporary teams on an ad hoc basis to address specific
product issues, such as difficulties in meeting a weight requirement. Team
leaders in other programs informed us that they frequently invite
individuals from other disciplines to participate in their team meetings on
an as- needed basis to obtain a broader perspective. The Land Warrior
program manager went so far as to create a “shadow” IPT
organization to manage the program on a day-to-day basis, while leaving
the formal, functionally organized teams in place. He noted that the formal
structure had been set up and members assigned before he became the program
manager. Finding this structure difficult to manage effectively, he created
the shadow organization and staffed it with team members of his choice.

DOD Teams Typically Did Not Collocate

Most of the DOD teams on the less successful programs were not fully
collocated, and none of the teams were collocated with their contractor
counterparts. Many of the team members found that this made it difficult to
communicate on a real-time basis, and they had to work harder to operate
well. One leader stated that he is not always aware of what other teams are
doing that may affect his team. A contractor official observed that the DOD
team she interfaced with could have resolved issues in less time had it
been collocated and able to work the issues side by side.

Despite its advantages, DOD guidance does not address collocation as a
means to enhance IPT effectiveness. DOD officials cite the cost and
logistical difficulties of relocating geographically dispersed programs and
defense contractors as the primary reason for not collocating core team
members. For example, one program manager noted that his program involves a
variety of DOD agencies, commands, and all three services located
throughout the United States and in several foreign countries, including
Italy and Korea. The program manager thought it impractical to collocate
all of the organizations. Appropriately, DOD still supports developing the
capabilities for shared databases and other technical means to replicate
collocation. Marine Corps and contractor officials from the amphibious
vehicle program had the same initial misgivings about collocation; today,
they told us, they cannot imagine running a program any other way.
Officials from leading commercial firms stated that they also confronted
cost and logistical issues but believed that the investment to collocate
was warranted relative to the investment made in a new product development.

DOD Teams Do Not Control Membership

Most team leaders had little say in the composition of their teams. Team
members also had little input into the teams they were assigned to, as the
functional organizations made the assignments. If defense contractor
representatives are included on a team, they are typically chosen by their
own organizations without DOD's involvement.

Although some team leaders stated that they would like the opportunity to
select the members, DOD does not routinely empower teams to do so. In fact,
the DOD Integrated Product and Process Development Handbook states that the
“selection of team members for IPTs often lies outside the direct
control of the IPT leader.” According to DOD guidance, IPT members
should be drawn from a functional discipline: organizations such as
engineering and financial management that operate independently of weapon
system programs. Generally, the functional leaders assign team members to
IPTs, and while some negotiation can occur, program and IPT leaders have
little say over choosing members.

DOD teams also did not have control over membership changes. In general, we
found that membership fluctuated frequently and that the majority of the
team members were not original members. For example, at one program office,
71 percent of the team members were not original members. One team member
told us that his team has had four different team leaders since he became a
member 2 years ago. Another team member had four different program managers
within 4 years. Unlike commercial firms, where changes in membership were
driven by changes in the team's needs, the turnover in DOD teams was seldom
driven by the needs of the team. For example, some military personnel
stated that they join and leave teams frequently because military policy is
to rotate people every 3 years, though rotations can occur as often as
every 18 months. Unsurprisingly, the majority of military personnel stated
that it was not likely that they would be involved on the IPT through the
program's life cycle, which can last 15 years.

Regardless of the reason, team members and leaders observed that frequent
turnover results in a loss of corporate knowledge and sets the team back.
For example, one member stated that when new members join the team, there is
an inclination to revisit issues and past decisions, which can slow the
team's progress. Another team member noted that when the IPT is initially
launched, goals and mission statements are established. When original
members rotate, the IPT can lose sight of the objectives and lose some of
the advances gained in the early stages.

Chapter 4: Differences in DOD and Commercial Teaming Approach Reflect
Different Environments


Differences in the environment in which teams operate can have a significant
effect on successfully implementing the IPT approach. We found that leading
commercial firms provided a more supportive foundation for IPTs. Company
leaders committed to the IPT approach and backed up that commitment through
actions designed to ensure that implementation was not left to chance. In
short, they created a different, more conducive environment for IPTs. While
DOD endorses the IPT approach and has issued policies and other guidance, it
has not taken steps to ensure that IPTs are implemented at the program
execution level. In essence, the IPT approach has been left to germinate in
an unchanged environment that is not necessarily conducive to IPTs.
Successful implementation is thus more dependent on the ingenuity of
individuals working on programs.

Differences in how commercial firms and DOD managers measure success, and
in the pressures they face in starting programs, significantly affect the
environment for integrated product teams. Commercial products' success is
measured in terms of the customer's acceptance of the final product and
cycle times short enough to beat the competition. These conditions create
incentives for gaining knowledge early, forming realistic goals and
estimates, and holding teams accountable for delivering the product, all of
which favor an IPT approach. In DOD, the pressures to successfully launch
new programs and protect their funding, coupled with long cycle times,
create incentives to be overly optimistic in setting program goals and to
focus on process concerns rather than product concerns. DOD's necessary
reliance on defense contractors introduces another complication for IPTs
because two major organizations (DOD and defense contractors) are
responsible for the product, and they do not necessarily share the same
incentives. Notably, the amphibious vehicle program has overcome these
obstacles and made IPTs work in the DOD environment.

Commercial Firms Provided a Different, Supportive Environment for IPTs

DaimlerChrysler, 3M, and Hewlett-Packard all provided an environment that
supported the IPT approach to product development. Corporate leaders not
only embraced the IPT approach but also demonstrated their commitment by
reorganizing to better align their structures with IPTs and by making
targeted investments in physical assets, training, and other forms of help.
The firms delegated considerable power to IPTs, such as in setting product
development goals, but held the teams accountable for delivering on those
goals. In addition, the pressures of successfully competing in the
marketplace, which foster realism, short cycle times, and customer
satisfaction, play well to the strengths of the IPT approach.

Although DaimlerChrysler and 3M did not plan to restructure their
organizations when they decided to implement IPTs, they found that their
former organizations were at odds with the IPT approach. For example, in
the 1980s, DaimlerChrysler (then Chrysler) had separate organizations for
key functions, such as engineering, finance, and manufacturing. Moreover,
engineers were organized around types of components, such as climate
control, rather than around product types. This organization made it
difficult even for all of the engineers working on a particular vehicle to
talk with one another, let alone interact with functions other than
engineering, such as finance. DaimlerChrysler realized that IPTs could not
simply be patched across such organizations. This realization was followed
by a corporate reorganization along platform lines (classes of vehicles) to
reinforce the emphasis on products rather than on functions or components.

Figure 12: DaimlerChrysler's Town and Country Minivan

DaimlerChrysler reorganized around platform teams to better support its IPT
approach.

Source: DaimlerChrysler.

Leading Commercial Firms Demonstrate Commitment Through Action

The companies took other steps to reinforce their commitment to IPTs. For
example, DaimlerChrysler officials noted that some employees were resistant
to the IPT approach. To encourage employee acceptance and ensure that
organizational and product goals were achieved, the firm instituted a
two-pronged performance appraisal process that solicited input from both an
IPT member's immediate supervisor and the other members and organizations
the member interfaced with. We found that at 3M and Hewlett-Packard, the
IPT leader either prepared the members' performance evaluations or provided
significant input to them. Officials noted that capturing an individual's
performance on an IPT was a driving factor in garnering acceptance of the
IPT approach.

In addition to the physical infrastructure investments made to collocate
and integrate the workplace, the companies invested other resources to
ensure that IPTs were successfully implemented at the product development
level. In an earlier report on best training practices, we noted that
leading firms focus on a few key initiatives at any one time 1 and deliver
targeted, hands-on training to ensure that implementation is successful at
the product development level. DaimlerChrysler, 3M, and Hewlett-Packard
took the same approach. These companies offered extensive front-end
planning assistance. For example, Hewlett-Packard helps new teams plan and
define their priorities and track their progress. A company official
believed this help could reduce a project's time by 10 to 20 percent. At
3M, the company provided team sponsors, top managers who established the
IPT and assisted the team with leadership and high-level decision-making.
In some cases, the companies provided facilitators who gave IPTs hands-on
guidance to enhance their daily performance.

1 Best Practices: DOD Training Can Do More to Help Weapon System Programs
Implement Best Practices (GAO/NSIAD-99-206, Aug. 16, 1999).


Figure 13: 3M Dental Products

3M top managers served as team sponsors.

Source: 3M.

Companies Give IPTs Control Over the Product but Hold Them Accountable

According to officials from the leading commercial firms, achievable,
clear, and shared goals are vital to an IPT's success. The goals include
the timing of bringing a product to market, its features, and a cost that
will appeal to customers yet yield an acceptable profit. The three
companies routinely involve their IPTs early in the product development
process, giving them the opportunity and authority to affect a product's
goals. While subject to some constraints, IPT leaders and members were
given the flexibility to make trade-offs between competing objectives. The
role 3M's Pluto team played in trading off product sophistication for an
earlier delivery date is a prime example of a team being given both the
opportunity to be involved in goal-setting and the authority to affect the
goals.

There is a consequence for IPTs having such influence over product goals:
the product's success is readily measurable, and the teams are held
accountable for that success. If the product is delivered late, does not
perform as expected, or costs more than it could sell for profitably, the
IPT is responsible. If a product fails because it does not meet one or more
of its goals, the team is held accountable for that failure. This
consequence encourages the team to be aggressive but realistic in setting
goals and motivates it to achieve those goals as it develops the product.

Incentives in Commercial Environment Are Conducive to IPTs

Based on our current and previous work on best practices, 2 the demands
leading commercial firms make of new product developments create a set of
incentives that mesh well with the IPT approach. These firms insist on a
solid business case for starting a new product, which centers on designing
and manufacturing a product that will sell well enough to make an
acceptable profit. Barring an unforeseen change in the market, if the firm
delivers the right product on time and at the right price, the customer
will buy it and the product succeeds. To ensure success, leading commercial
firms insist on having high levels of knowledge about the technological,
design, and production content of the product. In particular, before a new
product development is launched, leading firms ensure that technology
development is complete and that immature technology is not allowed onto a
product. To meet market demands and stay competitive, the firms consciously
limit cycle time, the length of time it takes to develop a new product. The
leading commercial firms we have visited had cycle times that ranged from
18 months to just over 4 years.

With product success clearly defined in terms of customer acceptance and
cycle times kept short, accountability is readily established in terms of
delivering a quality product on time. This reality helps keep an IPT
focused on the product itself, can minimize membership changes, and fosters
trade-offs. Candor in recognizing risks early and realism in making
estimates are fostered because doing otherwise, such as overselling product
performance or delivery dates, can set the team up for disappointing the
customer and failing. Similarly, the leading firms' insistence on
demonstrated knowledge about technology maturity and other aspects of the
product reinforces realism because knowledge is more directly linked to
product success than promises or projections. The IPT, with its full cross
section of expertise, is ideally suited to having the key aspects of
product knowledge on hand to provide realism, minimize surprises, and
quickly respond to potential problems.

2 See related GAO products.


DOD Environment Not As Conducive to IPTs

In the programs we reviewed, DOD's environment was not conducive to IPTs.
DOD has not backed up its commitment to the IPT approach with investments
and other actions to ensure success at the program execution level. Teams
typically were not involved in the goal-setting process and could not
reasonably be held accountable for goals that were unrealistic. Moreover,
the pressures of launching and funding new programs created incentives that
posed obstacles for IPTs. Shared responsibility between program offices and
contractors further complicated the environment for IPTs.

DOD-Level Support for IPTs Does Not Extend Much Beyond Policy

After DOD formally adopted the IPT approach in 1995, it mandated the use of
IPTs on all weapon programs to the extent possible and made a significant
amount of IPT information, instructions, directives, and manuals available.
However, some of the information is too vague to be practical for
implementation at the program execution level. Moreover, the policy is not
coupled with top-level action; instead, implementation falls on the
shoulders of the program offices. For example, the 1995 policy memorandum
directing program offices to implement IPTs does not include the factors
essential for an effective IPT. Other IPT policies designate as IPTs teams
that have a legitimate purpose but cannot practically operate as effective
IPTs. For example, the Overarching and Working-Level IPTs are oversight in
nature and cannot be expected to execute the day-to-day responsibilities of
an IPT. Also, the policies specify that some IPTs be composed of a single
functional discipline or profession, such as test and evaluation or cost,
that by definition does not possess the mix of expertise to make the
cross-functional decisions expected of an IPT. It is not that these teams
should not exist, but assigning the designation of “IPT” to
teams for which it should not apply dilutes the designation and contributes
to the view that IPTs are nothing new. On that point, over half of the team
members we interviewed stated that DOD's adoption of IPTs resulted in
little change at the program execution level; most saw IPTs as simply a new
name for an old approach.

No DOD organization ensures or monitors implementation of IPTs, leaving
implementation dependent on the circumstances of the individual program and
the capability of the government and contractor managers. According to a
representative from the DOD organization that writes IPT guidance, the
organization's role is not to ensure or monitor the program offices'
implementation of the guidance. An IPT point of contact for one of the
services informed us that his office makes IPT information available but
that implementation is left up to the programs. While a reasonable amount
of latitude for IPTs is good, DOD has not provided top-level attention, as
leading commercial firms have, to ensure that the guidance is followed at
the program execution level. At the program execution level, resources
provided to IPTs varied and were not part of a systematic approach to
ensure IPT effectiveness. For example, 71 percent of the team members we
interviewed said that if program offices provided support, it was usually
in response to a team request.

While commercial firms went through an organizational transformation to
support IPTs, in many respects DOD maintained the status quo after it
adopted the IPT policy. Organizations set up years ago around functional
disciplines, such as engineering and financial management, continue to
write guidelines for their functions and to hire, train, and manage the
career progression of the acquisition workforce. Thus, they still wield
considerable control over members of program teams. When a program office
is set up to develop and produce a new weapon system, the staff is drawn
from these organizations but maintains its professional ties to them.
Program offices and their IPTs are in essence superimposed over the
standing functional organizations. Performance appraisals for staff working
on the weapons we reviewed are still largely controlled by the functional
organizations, not the program teams. Nearly 80 percent of the team members
we interviewed said that they continue to be evaluated by superiors in
their parent functional organization, not the IPT leader. Furthermore, most
team members did not know whether their performance appraisers received
input from the IPT supervisors.

Teams Did Not Have Control Over Goals

In the programs experiencing developmental problems, DOD did not
systematically involve IPTs in setting product development goals. Often,
these goals were not realistic and resulted from overselling a new program
in its early stages. This made it unlikely for any team, regardless of
capability, to meet the goals and difficult to hold team members
accountable for results. According to most IPT members we interviewed, key
product goals, such as system cost, delivery schedule, and performance
requirements, were often fixed and outside of the team's control. Their
responses are captured in figure 14.


Figure 14: Percentage of DOD IPT Members That Perceived Key Product Elements
As Outside the Team's Control

Over half of the team members believe that the key product elements (cost,
schedule, and requirements) are outside of the team's control.

Source: GAO analysis.

In some cases, the key product goals were established during the initial
concept stage- years before IPTs were set up. We have previously reported
that such goals are often set optimistically, reducing the probability that
they can be achieved despite best efforts. One program manager told us that
overselling at the concept stage locked the program and its IPTs into
unachievable goals. Of the IPT members interviewed, 45 percent stated that
the fixed elements hampered their decision-making ability. For example, an
IPT leader told us that during the concept stage, a requirement was set that
the program would use commercial off- the- shelf technology. When a key
piece of commercial software unexpectedly became unavailable and there were
uncertainties regarding the reliability of other commercial products, the
IPT could not change the requirement.

Figure 14 data: three elements fixed, 51 percent; two elements fixed, 18
percent; one element fixed, 17 percent; no elements fixed, 14 percent.


It was the team leader's opinion that the requirement led to problems that
contributed to significant delays in the delivery schedule.

DOD teams were not routinely held accountable for product outcomes. Two
program managers informed us that instead their teams were held accountable
for how well they managed aspects of the acquisition process, such as test
and evaluation. A program manager with cost and schedule overruns was
hesitant to hold the teams accountable because the original goals were never
achievable. Similarly, a team leader unable to maintain the program schedule
stated that his team was not a failure because the schedule goal was
unrealistic and out of the team's control. Another team member noted that
IPTs can influence key program elements but that the teams are not penalized
when their actions lead to cost and schedule overruns.

Incentives in DOD Environment Create Obstacles for IPTs

DOD's incentives for managing weapon systems do not put IPTs in as good a
position to succeed as their commercial counterparts. Programs are started
with a legitimate desire for an improved combat capability. However, the
intense competition for funds needed to launch a new weapon system program
encourages the conceptualization of a new weapon that offers significantly
greater, even unique, performance relative to its predecessor. As a result,
new programs are often started with immature technologies that deny
managers and teams the high levels of product knowledge that are important
to realism. Moreover, new programs must fit into forecasts of available
funding; as a result, incentives are strong to make optimistic estimates of
cost and cycle time. Because actual cycle times can be very long, lasting 10
to 15 years, the more tangible goals for teams become securing the next
increment of funding and getting approval for moving into the next stage of
development: process goals rather than product goals. Weapon system
programs are developed in a more critical environment in which evidence of
problems, such as an unreachable performance goal, can invite criticism and
a potential loss of funding and other support. Thus, the candor needed to
identify and resolve trade-offs, for which IPTs are ideally suited, is
implicitly discouraged.

Accountability for meeting weapon system goals is difficult to establish.
Unlike the commercial environment, in DOD the customer is very involved
throughout the development cycle and becomes increasingly vested in a
particular weapon. The DOD customer is thus not likely to walk away from a
weapon even if it took longer, cost more, and performed worse than
anticipated. Long cycle times, coupled with DOD's policy of rotating
military personnel, also impair accountability. Military personnel,
including most program managers, stay with a program for a limited amount of
time and then are rotated to new assignments. A DOD analysis shows that a
program with an 11-year development cycle will have, on average, four
program managers. Most of the military personnel we interviewed did not
expect to participate on a team throughout the program's development. Other
team members believed that, as a result, military personnel may have less
accountability or commitment to the team.

DOD's employment of defense contractors to design and build its weapon
systems is an additional complication for IPTs that commercial firms do not
face. With the exception of two programs, the DOD and contractor personnel
worked on separate teams, usually in different geographic areas. Team
leaders and members informed us that a lack of trust between the DOD program
office and the contractor might inhibit effective teaming. Team members also
perceive that the DOD program office teams and the contractor might have
conflicting incentives or competing interests. For example, a team leader
noted that contractors are paid to participate in IPT activities. As a
result, the leader believed the contractor had an incentive to generate
meaningless IPT documents to receive credit for the activities. According to
a member of another team, the contractor was not interested in a proposal
made by the DOD team that could reduce the program schedule because it would
have reduced the contractor's payment. On the other hand, a contractor
representative cited the contractor's inability to convince DOD team members
that increasing the test schedule would unnecessarily extend the program
schedule.

In other cases, DOD team members believed the contractors resisted
trade-offs and other changes that could have prevented problems because the
contractor would be paid more to correct problems than to prevent them.
Consistent with the competitive pressures at the start of a program, one
team co-leader stated that the contract proposal process may encourage
contractors to understate cost and schedule estimates and overstate
performance expectations to win the contract. Those estimates then
contribute to unrealistic program baselines. The point is not so much
whether the team members' statements are accurate, but that their
perceptions pose obstacles to effective IPTs and further blur accountability
for successfully delivering the product.


Amphibious Vehicle Program Found Ways to Provide a More Supportive IPT
Environment

The Advanced Amphibious Assault Vehicle program has many of the teaming
characteristics of leading commercial firms. This accomplishment was made
possible by the unique environment, or culture, that the program's initial
manager created around the IPT approach. The IPTs were responsible for
delivering components of the vehicle and had the knowledge and authority to
make trade-offs, such as reducing the calibration requirement for the 30-mm
gun. 3 Unlike the other DOD cases, Advanced Amphibious Assault Vehicle IPTs
were not made to fit among standing organizations and procedures. Clearly,
the program manager's vision and entrepreneurship were the driving forces
behind the program's success with IPTs, traits that one cannot reasonably
expect to find across the board. However, his recognition of the need to
create a culture for such teams, the steps he took to create that culture,
and the other conditions that helped make IPTs successful on this program
are both observable and replicable.

The original program manager saw IPTs as the key to the new vehicle's
success and collocation as the only way to break down the barriers between
DOD program offices and contractors. By making collocation a requirement in
the request for contract proposals, he forced the contractor and DOD program
office staff to work in the same facility, making this the first DOD weapon
system program office to do so. Moreover, he created one set of teams,
composed of both Marine Corps and contractor staff. Officials and team
members from both the Marine Corps and the contractor were adamant that the
IPT structure, bolstered by collocation, created a positive working
relationship: it helped break down barriers to trust, improved
communication, and created common goals between the Marine Corps and the
program office for developing the vehicle.

The program manager actively sought to create a shared understanding of the
customer's needs among Marine Corps and contractor staff. He provided
opportunities for the contractor's engineers to learn first hand the user's
conditions and needs. For example, he had the contractor's staff spend a
night aboard an amphibious ship and stay in the troop compartment.
Contractor staff drove the existing amphibious assault vehicle to understand
the environment that Marines were operating in and the limitations of the
current vehicle. Contractors also took Marine Corps leadership training
classes, where they heard the experiences of a Marine Corps corporal who
almost drowned in the current vehicle. Moreover, because the prototypes for
the new vehicle are being built in the facility where the IPTs work, the
teams can see and experience first hand the results of their design efforts.
This provides immediate feedback to the team members and fosters a sense of
ownership.

3 Another example of trade-offs made by amphibious vehicle IPTs is the
choice between the baseline transmission and the six-speed transmission.
Although the six-speed transmission required a greater investment during the
development phase of the program, it is expected to yield a 21-to-1 rate of
return over the life cycle of the vehicle. Additional benefits of the
six-speed transmission are a 5 percent decrease in fuel consumption and a 40
percent increase in the interval before maintenance is required.

Other features of the program were conducive to effective IPTs. Because the
program was the most important acquisition for the Marine Corps, it had full
and stable funding, and the program manager had the full backing of the
Marine Corps hierarchy. This enabled the program to provide financial
incentives, including bonuses to the contractor personnel, when key
performance requirements were exceeded and to make significant investments
in training and information systems. We have previously reported that the
program emulated the best practices of leading commercial firms in targeting
hands-on training to staff on key initiatives, including IPTs. 4 The
program also developed a paperless communication system; a virtual product
model of the amphibious vehicle; and an on-line, real-time shared data
source that enabled teams to operate from the same set of records. Moreover,
the program is one of the few we have found that matured key technologies,
most notably the propulsion system, before the program was started. 5
Finally, the program has had very low turnover in key personnel; the
original program manager stayed with the program for 10 years. Both he and
the deputy program manager worked on the enabling technologies in a science
and technology effort before the program began.

4 Best Practices: DOD Training Can Do More to Help Weapon System Programs
Implement Best Practices (GAO/NSIAD-99-206, Aug. 16, 1999).

5 Best Practices: Better Management of Technology Development Can Improve
Weapon System Outcomes (GAO/NSIAD-99-162, July 30, 1999).


The effect the environment has on the success of IPTs is consistent with
what we have observed for other best practices. In previous reports on how
best practices can improve the weapon acquisition process, we have
consistently pointed out that practices are adopted because they help a
program, commercial or defense, succeed in its environment (see Related GAO
Products). Thus, identifying specific best practices from commercial firms
and simply recommending that DOD adopt them will not produce the desired
improvement. Rather, one must first address how to change the factors that
reinforce the prevailing, and suboptimal, practices. In our reports, we have
identified a number of actions DOD needs to take to make the weapon system
acquisition environment more conducive to adopting best practices. DOD has
agreed with these reports and with the need to make changes in its
environment. Perhaps the most significant action DOD has taken to date has
been to revamp the policies that guide weapon acquisitions to emulate some
of the conditions that encourage best practices. DOD's success in
implementing these policies on individual weapon system programs will affect
several conditions important to creating effective IPTs.

Previous GAO Recommendations and DOD Actions Are Aimed at Making the Weapon
System Environment More Conducive to Best Practices

Actions we have previously recommended or suggested DOD take that we believe
can help IPTs operate successfully are summarized in the points that follow:

• Mature key technologies and match available resources with weapon system
requirements before launching a development program.

• Develop an initial version of a weapon system that provides a worthwhile
capability and introduce more advanced capabilities in later versions as the
enabling technologies mature.

• Make the acquisition process knowledge-based by focusing on attaining key
aspects of product knowledge (technology maturity, design maturity, and
production process maturity) at the right times.

• Keep weapon system development cycle times to 5 years or less and tie a
program manager's tenure to the full cycle.

• Target training on key improvements to acquisition management to program
offices and ensure that it is delivered to the program office work site.

• Involve key suppliers, that is, those firms that make key components and
subsystems for prime contractors, early in the design and development
process.

• Send the right signals on individual weapon system decisions, that is,
decisions that reinforce the above principles rather than make exceptions to
them.

These actions, collectively, can (1) lower the pressures to oversell weapon
system performance and underestimate costs and schedules at program launch,
(2) infuse more knowledge of a new weapon system earlier and throughout the
development process, (3) make people more capable of delivering the weapon
system as promised, and (4) hold people more accountable for delivering the
weapon system.

DOD has made important changes to its policies that guide how weapon systems
are acquired and managed, commonly referred to as the “5000 series.” Among
these changes are (1) changing the launch point for new weapon system
programs and calling for technologies to be mature before they are included
in a program and (2) adopting an evolutionary approach to developing new
weapons, allowing better versions with more advanced technologies to be
fielded when they are ready. In other statements and memoranda, DOD has
called for limiting cycle time to 5 to 7 years, agreed to take steps to
better match technology and other resources with requirements before
launching new programs, and revamped its professional education for the
acquisition workforce to be more responsive to managers' needs and more
capable of providing needed help to the workplace.

These are positive steps toward creating a better environment for best
practices. Clearly, more steps remain to be taken. Perhaps the most
important of these is implementation at the service and individual program
level. Thus far, the positive changes we have seen on specific programs,
including IPTs, have been the result of extraordinary effort on the part of
individual executives and managers. The systemic pressures and incentives
that reinforce the practices of the past have been slow to change.

Chapter 5: Conclusions and Recommendations Page 53 GAO-01-510 Best Practices

Conclusions

When properly armed with knowledge and authority, IPTs improve
decision-making and help develop better products more quickly. They do this
by reducing rework in product planning, design, and manufacturing; reducing
cycle time and costs; and improving first-time product quality. At issue is
not so much whether to employ IPTs but rather how to employ them
effectively. Leading commercial firms have been successful with IPTs because
(1) they have given their teams the key elements of IPTs, (2) they have
taken action and made investments at the corporate level to ensure
implementation occurs at the program execution level, and (3) their
competitive environment creates incentives that align well with IPTs. In
addition, their IPTs have worked in conjunction with other good management
practices, such as maturing technologies before they are turned over to a
team responsible for delivering the final product.

DOD has rightly endorsed IPTs as a vehicle to improve management of the
development of weapon systems. On the programs that were experiencing
problems in meeting their objectives, IPTs were not effective because they
did not have the knowledge or authority to recognize problems early and
resolve them. The teams were at a disadvantage because they did not possess
the key elements of IPTs and in fact were IPTs in name only; DOD did not
back policies up with actions to ensure IPTs were implemented at the program
execution level; and the DOD environment for managing weapon systems created
obstacles, not incentives, for IPTs.

In leading commercial companies, the corporate environment has become
conducive to IPTs, so that the typical program manager can employ IPTs
effectively. In DOD, it takes a rare program manager to make IPTs work. Much
of the success of IPTs on the amphibious vehicle program, for example, can
be attributed to extraordinary individual efforts and unique circumstances,
rather than to a systematic DOD approach. The challenge for DOD is to create
the conditions under which the resources and tools typically provided to
most weapon system program managers will enable the effective use of IPTs.
We have previously reported on how DOD can create conditions more conducive
to adopting best practices and DOD has taken initial actions to do so. These
changes could make the DOD environment more conducive to IPTs. If DOD is
successful in implementing changes at the individual program execution
level, it will be more likely that the typical program manager will be able
to create effective IPTs. Still, DOD must take specific steps to put program
offices in a better position to create the elements of effective IPTs.

Recommendations for Executive Action

We recommend that the Secretary of Defense designate as IPTs only those
teams that will have the day-to-day responsibility for developing and
delivering a product, such as a weapon system, and the cross-section of
expertise to do so. For those teams so designated, we recommend the
Secretary of Defense use the IPT practices and characteristics in this
report to develop and communicate to program offices standards for defining
what constitutes an effective IPT. Such standards could then be used to (1)
determine the extent that IPTs have been effectively implemented in weapon
system programs and (2) track progress in implementing IPTs.

We also recommend that the Secretary of Defense put program offices in a
better position to create and sustain effective IPTs by

• refining the IPT designation to be used exclusively for new product
development teams encompassing core components;

• ensuring IPTs have sufficient knowledge and authority by (1) giving them
responsibility for a deliverable product, along with the authority to make
decisions on that product, and (2) providing representation from each
functional area of expertise critical to product design, development, and
manufacture;

• enabling IPT leaders to participate in program goal setting and holding
the teams accountable for achieving those goals;

• encouraging and supporting program managers' efforts to collocate team
members, including contractor personnel;

• providing program managers and team leaders with greater authority and
control over selection of IPT members, rating authority, and rotation of
members; and

• establishing indicators to enable program and team management to evaluate
the performance of IPTs, such as the efficiency of the decision-making
process employed by a team.

Finally, we recommend that the Secretary of Defense help program managers
and team leaders become catalysts for IPT implementation by

• devoting professional education to make existing and prospective program
managers and IPT leaders aware of, and capable of creating, the culture
necessary to foster IPTs in weapon system programs;

• drawing lessons from programs like the Advanced Amphibious Assault Vehicle
to (1) bridge barriers between program offices and contractors and (2) use
collocation to break down barriers and create trust; and

• supporting IPTs with the resources, such as information technology,
training, and expert help, needed to maximize their effectiveness.

Agency Comments and Our Evaluation

DOD concurred with a draft of this report and most of its recommendations
and agreed to emphasize the practices and characteristics discussed in the
report concerning the operation of program offices' integrated product
teams. (See app. I.)

DOD partially concurred with the recommendation that only those teams with
day-to-day responsibility for a product and the necessary cross-section of
expertise be designated as integrated product teams. It noted that while
such teams are unique and require certain conditions and investments, the
designation “integrated product team” has spread throughout the workforce
and has benefited other teams as well. DOD did not want to lose those
benefits by limiting the designation. DOD's position reflects the practical
reality that the designation of integrated product teams is now difficult to
restrict. Given the Department's recognition that program office integrated
product teams require certain conditions and investments to succeed that
other integrated product teams may not need, we believe that if the
Department takes the actions contained in our other recommendations, the
objective of the recommendation will be achieved.

Appendix I: Comments From the Department of Defense

Page 56 GAO-01-510 Best Practices


Now on pp. 7 and 54. Now on pp. 7 and 54.


Now on pp. 7, 54, and 55. Now on pp. 7 and 54.

Appendix II: GAO Contacts and Staff Acknowledgments

Page 59 GAO-01-510 Best Practices

GAO Contacts

Jack L. Brock (202) 512-4841 Paul L. Francis (202) 512-2811

Acknowledgments

In addition to those named above, Russ Allen, Kathleen Joyce, Gordon Lusby,
Marco Martinez, Elisabeth Ryan, and Yelena Thompson made key contributions
to this report.

Related GAO Products Page 60 GAO-01-510 Best Practices

Best Practices: Better Matching of Needs and Resources Will Lead to Better
Weapon System Outcomes (GAO-01-288, Mar. 8, 2001).

Best Practices: A More Constructive Test Approach Is Key to Better Weapon
System Outcomes (GAO/NSIAD-00-199, July 31, 2000).

Defense Acquisition: Employing Best Practices Can Shape Better Weapon System
Decisions (GAO/T-NSIAD-00-137, Apr. 26, 2000).

Best Practices: DOD Training Can Do More to Help Weapon System Programs
Implement Best Practices (GAO/NSIAD-99-206, Aug. 16, 1999).

Best Practices: Better Management of Technology Development Can Improve
Weapon System Outcomes (GAO/NSIAD-99-162, July 30, 1999).

Defense Acquisitions: Best Commercial Practices Can Improve Program Outcomes
(GAO/T-NSIAD-99-116, Mar. 17, 1999).

Defense Acquisition: Improved Program Outcomes Are Possible
(GAO/T-NSIAD-98-123, Mar. 17, 1998).

Best Practices: DOD Can Help Suppliers Contribute More to Weapon System
Programs (GAO/NSIAD-98-87, Mar. 17, 1998).

Best Practices: Successful Application to Weapon Acquisition Requires
Changes in DOD's Environment (GAO/NSIAD-98-56, Feb. 24, 1998).

Major Acquisitions: Significant Changes Underway in DOD's Earned Value
Management Process (GAO/NSIAD-97-108, May 5, 1997).

Best Practices: Commercial Quality Assurance Practices Offer Improvements
for DOD (GAO/NSIAD-96-162, Aug. 26, 1996).

(707465)

Ordering Information

The first copy of each GAO report is free. Additional copies of reports are
$2 each. A check or money order should be made out to the Superintendent of
Documents. VISA and MasterCard credit cards are also accepted.

Orders for 100 or more copies to be mailed to a single address are
discounted 25 percent.

Orders by mail:

U.S. General Accounting Office, P.O. Box 37050, Washington, DC 20013

Orders by visiting:

Room 1100, 700 4th St., NW (corner of 4th and G Sts. NW), Washington, DC
20013

Orders by phone:

(202) 512-6000; fax: (202) 512-6061; TDD: (202) 512-2537

Each day, GAO issues a list of newly available reports and testimony. To
receive facsimile copies of the daily list or any list from the past 30
days, please call (202) 512-6000 using a touchtone phone. A recorded menu
will provide information on how to obtain these lists.

Orders by Internet:

For information on how to access GAO reports on the Internet, send an e-mail
message with “info” in the body to: Info@www.gao.gov or visit GAO's World
Wide Web home page at: http://www.gao.gov

To Report Fraud, Waste, and Abuse in Federal Programs

Contact one:

• Web site: http://www.gao.gov/fraudnet/fraudnet.htm

• E-mail: fraudnet@gao.gov

• 1-800-424-5454 (automated answering system)

GAO-01-516R RD Loan Validation

United States General Accounting Office Washington, DC 20548

April 10, 2001

The Honorable Ann Veneman
The Secretary of Agriculture

Subject: Rural Development: Assessment of Data Used to Support Non-Housing
Direct Loan Programs Subsidy Cost Estimates

Dear Madam Secretary:

Rural Development's (RD) long-standing problems with estimating the cost of
its credit programs in accordance with the Federal Credit Reform Act of 1990
and federal accounting standards (credit reform implementation) continue to
be a major factor preventing the Department of Agriculture (USDA) from
achieving an unqualified opinion on its consolidated financial statements.
In addition, these problems materially affect USDA's budget submissions
because the same cost estimates are generally used for both budget
preparation and financial reporting.

Since April 1999, we have been assessing RD's credit reform implementation
efforts in such areas as (1) identifying key cash flow assumptions, (2)
improving cash flow models, (3) assessing cash flow model data, and (4)
implementing other procedures to enhance the credit subsidy estimation
process. RD has divided its credit programs into three areas: housing direct
loans, non-housing direct loans, and guaranteed loans. Our efforts to date
have primarily focused on the non-housing and guaranteed loan programs. This
letter is part of a series of status reports 1 on RD's efforts to improve
its credit program cost estimates, and it focuses solely on RD's major
non-housing direct loan programs, 2 which RD reported at $40.6 billion in
loans outstanding as of September 30, 2000. 3

1 See Credit Reform: Improving Rural Development's Credit Program Cost
Estimates (GAO/AIMD-00-286R, August 22, 2000) and two related
correspondences: Credit Reform: Rural Development's Efforts to Improve Loan
Cost Estimates, December 17, 1999, and Credit Reform: Improving Rural
Development's Loan Cost Estimates, June 25, 1999.

2 The major non-housing direct loan programs were Water and Waste Disposal,
Federal Financing Bank Electric, Municipal Electric, Telecommunications
Hardship, and Electric Hardship. The criteria used to identify these major
programs included outstanding loan balances, obligation trends, and subsidy
cost.

3 Rural Development's Consolidated Financial Statements for Fiscal Year
2000.

The reasonableness of RD's loan program cost estimates is affected by the
quality of its cash flow assumptions, which are calculated based on data
recorded in
RD's loan accounting systems. As part of our ongoing work, this letter
provides our assessment of the accuracy of the data that RD uses to
calculate key cash flow assumptions, that is, the assumptions that have the
greatest impact on the program's estimated subsidy cost. RD accounts for the
major non-housing direct loan programs in the Program Loan Accounting System
(PLAS), the Rural Electrification Administration system (REA), and the
Federal Financing Bank system (FFB).

Results in Brief

For RD to prepare reasonable subsidy cost estimates, being able to draw on
reliable data is an important first step. Our testing determined that the
data in RD's PLAS, REA, and FFB loan accounting systems that are used to
calculate key cash flow assumptions for the major non-housing direct loan
programs are generally accurate. The assumptions that RD has determined to
be key for calculating the subsidy cost estimates for these programs are the
average borrower interest rate and average loan term, except for the FFB
electric program. 4 For that program, RD staff identified the average
borrower interest rate as the key cash flow assumption.

In commenting on a draft of this letter, RD officials agreed with our
finding. We have incorporated their comments as appropriate.

Background

The Federal Credit Reform Act of 1990; the related accounting standard,
Statement of Federal Financial Accounting Standards (SFFAS) No. 2, 5
Accounting for Direct Loans and Loan Guarantees; and various budget
guidance, 6 together known as credit reform, were established to more
accurately measure the government's costs of federal credit programs. As
part of implementing credit reform, agencies are required to estimate the
net cost of extending or guaranteeing credit, generally referred to as the
subsidy cost, based on the present value 7 of estimated net cash flows over
the life of the loan, excluding administrative costs.

4 Based on our previous work to assess RD's credit reform implementation
efforts, we agree with the key cash flow assumptions RD has identified for
its major non-housing direct loan programs.

5 SFFAS No. 2 was amended by SFFAS No. 18, Amendments to Accounting
Standards for Direct Loans and Loan Guarantees. The objective of the
amendments was to improve financial reporting related to subsidy costs and
performance of federal credit programs.

6 Office of Management and Budget Circulars A-11 and A-34 include guidance
for implementing credit reform, including estimating credit subsidy costs.

7 Present value is the worth of a future stream of returns or costs in terms
of money paid immediately. In calculating present value, prevailing interest
rates provide the basis for converting future amounts into their
“money now” equivalents.

RD management is responsible for accumulating sufficient, relevant, and
reliable data

on which to base its estimated net cash flows. RD's process for estimating
its net cash flows and calculating its subsidy costs is shown in figure 1.

Figure 1: Rural Development's Process for Calculating Subsidy Costs for Its
Non-Housing Direct Loan Programs

Notes:
1. Except where indicated, these processes are automated.
2. Model B is the cash flow model for RD's non-housing direct loan programs.
3. SAS, a statistical analysis system, is an integrated suite of software
designed to perform various functions such as data access, data management,
data analysis, and data presentation. RD uses this SAS program to summarize
and analyze loan accounting system data related to the key cash flow
assumptions.

4. In conjunction with the Model B calculations, RD uses the Credit Subsidy
Calculator, a computer software program developed by the Office of
Management and Budget (OMB), to calculate the subsidy costs for the
non-housing direct loan programs. The OMB Credit Subsidy Calculator was
developed to provide a consistent approach to calculating the present
values of credit program costs.

Source: Rural Development.

As RD makes loans to borrowers, data related to each loan (including the
applicable term and interest rate) are entered in RD's loan accounting
systems, and the associated loan documents are filed in headquarters or
regional offices. From the loan accounting systems, key data such as loan
amount or borrower interest rate are captured by an automated program that
calculates RD's average borrower interest rate and average loan term
assumptions. These assumptions are then manually entered into RD's cash
flow model, a computer-based spreadsheet referred to as Model B, 8 which
calculates an estimate for the subsidy cost of the program.
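In concept, the automated summarization works like the sketch below; the
record fields and loan values are hypothetical, and this is not RD's actual
SAS program:

```python
# Hypothetical sketch of summarizing loan accounting records into the two
# key assumptions: average borrower interest rate and average loan term.
# The field names and loan values are invented for illustration.

from datetime import date

loans = [
    {"rate": 0.0500, "advance_date": date(1998, 3, 1),  "maturity_date": date(2028, 3, 1)},
    {"rate": 0.0475, "advance_date": date(1999, 7, 15), "maturity_date": date(2039, 7, 15)},
    {"rate": 0.0525, "advance_date": date(2000, 1, 10), "maturity_date": date(2030, 1, 10)},
]

# Average borrower interest rate across the portfolio.
avg_rate = sum(loan["rate"] for loan in loans) / len(loans)

# Average loan term in years, derived from advance and maturity dates.
avg_term = sum(
    (loan["maturity_date"] - loan["advance_date"]).days / 365.25 for loan in loans
) / len(loans)

print(f"average borrower interest rate: {avg_rate:.4f}")
print(f"average loan term (years): {avg_term:.1f}")
```

The sketch also shows why the underlying interest rates, advance dates, and
maturity dates must be accurate: an error in any record flows directly into
the averages entered into Model B.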

In response to prior audit findings, USDA organized a credit reform
implementation task force and developed a detailed implementation plan. The
task force has made

8 RD has three cash flow models: Model A for direct housing loan programs,
Model B for non-housing direct loan programs, and Model C for guaranteed
loan programs.

[Figure 1 diagram: Loan Files -> (manual process) -> Rural Development Loan
Accounting System -> SAS Program reads and extracts data -> SAS report on
key cash flow assumptions -> (manual process) -> Model B estimates net cash
flows -> Subsidy costs]

progress in several areas, including completing and documenting sensitivity
analyses 9

for the non-housing direct loan program's cost estimates. As reported in
our August 22, 2000, letter, 10 we assessed RD's approach to performing
sensitivity analyses in order to identify key cash flow assumptions. RD
determined that there were two key cash flow assumptions for the major
non-housing direct loan programs: average borrower interest rate and
average loan term. 11 Based on our prior review, we agreed with RD's
determination that these were the most significant cash flow assumptions.
Because variations in these assumptions have the greatest impact on subsidy
cost estimates, it is critical that the assumptions be based on reliable
data and be correctly calculated.
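The idea behind such a sensitivity analysis can be illustrated with a toy
model; the subsidy formula, loan amount, rates, and discount rate below are
all hypothetical and are not RD's Model B:

```python
# Toy sensitivity analysis: vary each assumption and observe the change in
# a simplified subsidy estimate. The "subsidy" here is the loan amount
# minus the present value of level annual repayments; every figure is
# hypothetical, and this is not RD's Model B.

def subsidy(amount, borrower_rate, term_years, discount_rate=0.06):
    # Level annual payment for a fully amortizing loan.
    payment = amount * borrower_rate / (1 - (1 + borrower_rate) ** -term_years)
    # Present value of those payments at the government's discount rate.
    pv = sum(payment / (1 + discount_rate) ** t for t in range(1, term_years + 1))
    return amount - pv

base = subsidy(1_000_000, 0.05, 30)
rate_up = subsidy(1_000_000, 0.06, 30) - base    # borrower rate +1 point
term_down = subsidy(1_000_000, 0.05, 25) - base  # loan term -5 years
print(base, rate_up, term_down)
```

In this toy example a one-point change in the borrower interest rate moves
the estimate far more than a five-year change in the loan term, which is
how a sensitivity analysis identifies the assumptions that matter most.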

Scope and Methodology

In order to assess the reliability of the data in the loan accounting
systems that support the calculation of key cash flow assumptions, which
are entered in RD's cash flow model, we selected a random sample of loans
(Water and Waste Disposal Program) or advances (all other programs) from
the appropriate systems (PLAS, REA, and FFB) as of June 30, 2000. Prior to
selecting these sample transactions, we determined that the unpaid
principal balances recorded in the loan accounting systems agreed with the
June 30 general ledger trial balances for each loan program. For the FFB
electric loan program, our population comprised 631 loans. For the REA loan
programs (Municipal Electric, Telecommunications Hardship, and Electric
Hardship), our population comprised 4,052 loans. For the Water and Waste
Disposal loan program, our population comprised 5,844 loans.

For the sampled loans, we obtained documentation from the RD loan files
supporting the loan accounting systems' data that are used to calculate the
average borrower interest rate and average loan term assumptions used in
the cash flow model. The documentation obtained included loan or bond
agreements, Financial Requirement and Expenditure Statements, Voucher and
Schedule of Payments forms, FFB Interest Rate Confirmation Notices, and
Quarterly Federal Register Municipal Interest Rate Schedules. We then
compared the data in the loan accounting systems to the supporting
documents to verify that the recorded interest rates and the components of
the loan terms in the automated records agreed with the supporting
documents.

We conducted our work in Washington, D.C., and at selected RD field offices
from August 2000 through February 2001 in accordance with generally
accepted government auditing standards. We received oral comments on a
draft of our letter from the Office of the Chief Financial Officer, Rural
Development.

9 Sensitivity analysis is a process used to identify the assumptions that,
when adjusted, have the greatest impact on the credit subsidy estimate.

10 GAO/AIMD-00-286R.

11 However, for the FFB electric program, RD staff identified the average
borrower interest rate as the key cash flow assumption.


Loan Accounting Systems Data Are Generally Accurate

Our testing determined that the data included in the PLAS, REA, and FFB
systems that are used to calculate key cash flow assumptions for RD's major
non-housing direct loan programs are generally accurate. For the PLAS and
REA programs, RD has determined that the average borrower interest rate and
average loan term are the key assumptions used in its calculation of
subsidy costs. For the FFB electric program, RD has determined that the
average borrower interest rate is the key assumption.

RD uses the borrower interest rates recorded in the three loan accounting
systems to calculate the average interest rate assumption. In 323 of the
325 total sample loans 12 tested, the borrower interest rate recorded in
the loan accounting systems agreed with the supporting documentation in the
loan files. For two sample loans, RD could not locate the documentation to
support the borrowers' interest rates. As a result, we counted those two
items as errors because RD had no assurance that the rates for those loans
were correct. However, these errors were not material, as shown in
enclosure I; therefore, we concluded that the interest rate data in RD's
loan accounting systems for the major non-housing direct loan programs were
generally accurate.

RD uses the loan maturity date and the loan advance date 13 recorded in the
three loan accounting systems to calculate the average loan term
assumption. In 323 of the 325 total sample loans tested, the advance date
agreed with the supporting documentation in the loan files. As with the
interest rate testing, the two errors related to missing documentation. In
320 of the 325 total sample loans tested, the maturity dates in the systems
agreed with the supporting documentation. Of the five cases in error, the
missing documentation mentioned above accounted for two, and there were
three cases in which the maturity dates in the systems did not agree with
those in the loan files. These three errors related to the Water and Waste
Disposal loans included in RD's PLAS system. In two of the three cases, the
loan system's maturity date was 1 year later than that in the loan file
documentation. In the remaining case, the loan system's maturity date was 5
years later than that in the loan file documentation. However, these errors
were not material, as shown in enclosure I; therefore, we concluded that
the loan term data for the major non-housing direct loan programs were
generally accurate.

Conclusion

Having reliable non-housing direct loan programs' data with which to
prepare subsidy cost estimates represents important progress toward
achieving more reasonable estimates of the cost of these programs. Our
testing showed that the information used to

12 The 325 total sample loans consisted of 114 REA loans, 97 FFB loans, and
114 Water and Waste Disposal loans.

13 The advance date reflects the date the agency releases a portion of the
total loan amount to the borrower.

calculate key cash flow assumptions for the major non-housing direct loan
programs was reliable.

Agency Comments

In commenting on a draft of this letter, RD officials agreed with our
finding. We have incorporated their comments as appropriate.

We are sending copies of this letter to Patricia Healy, Acting Chief
Financial Officer, Department of Agriculture, and R. Mack Gray, Acting
Deputy Under Secretary for Natural Resources and Environment. This letter
will also be available on GAO's home page at http://www.gao.gov.

If you have any questions about this letter, please contact me at (202)
512-9508 or McCoy Williams, Acting Director, at (202) 512-6906. Key
contributors to this assignment are listed in enclosure II.

Sincerely yours,

Linda M. Calbom
Director, Financial Management and Assurance

Enclosures

Enclosure I

Sample Results

We selected a random sample of loans from RD's major non-housing direct
loan programs in the Program Loan Accounting System (PLAS), the Rural
Electrification Administration (REA) system, and the Federal Financing Bank
(FFB) system. Table 1 and the accompanying notes identify the random
samples selected, the number of errors found, and a projection of these
errors to the population of the major non-housing direct loan programs in
the three loan accounting systems.

Table 1: Summary of Sample Results for Rural Development's Major
Non-Housing Direct Loan Programs

Loan accounting system | Major non-housing direct loan program | Population size | Sample size | Errors found in the interest rate data | Errors found in the loan term data | Conclusion
PLAS | Water and Waste Disposal | 5,844 | 114 | 0 | 3 maturity date errors a | Not material
REA | Municipal Electric, Telecommunications Hardship, Electric Hardship | 4,052 | 114 | 2 b | 2 advance date and maturity date errors b | Not material
FFB | Federal Financing Bank | 631 | 97 | 0 | 0 | No errors found

a For three of the sampled loans, the maturity dates did not agree with
those in the loan files. When projecting these three errors to the
population of 5,844, we are 95 percent confident that the number of errors
in the population is between 42 and 389 loans. Our best estimate is that
154 loans have maturity dates that do not agree with those in the loan
files. This fell below our tolerable amount of errors, which was 526 loans
(a 9 percent tolerable error rate) for the Water and Waste Disposal sample.

b For two of the sampled loans, RD could not locate the documentation to
support the borrowers' interest rates, advance dates, and maturity dates.
As a result, we counted those two items as errors because RD had no
assurance that the system data for those loans were correct. When
projecting these two errors to the population of 4,052, we are 95 percent
confident that the number of errors in the population is between 13 and 220
loans. Our best estimate is that 71 loans have no supporting documentation.
This fell below our tolerable amount of errors, which was 365 loans (a 9
percent tolerable error rate) for the REA sample.
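The point estimates and tolerable-error thresholds cited in notes a and b
follow from simple proportion arithmetic, sketched below; the 95 percent
confidence bounds come from the attribute sampling distribution and are not
recomputed here:

```python
# Project sample errors to the population (point estimate) and compute the
# tolerable amount of errors at a 9 percent tolerable error rate, matching
# the figures cited in notes a and b.

def project(errors_found, sample_size, population, tolerable_rate=0.09):
    """Return (projected errors in the population, tolerable error amount)."""
    point_estimate = round(errors_found / sample_size * population)
    tolerable = round(tolerable_rate * population)
    return point_estimate, tolerable

print(project(3, 114, 5844))  # Water and Waste Disposal: (154, 526)
print(project(2, 114, 4052))  # REA: (71, 365)
```

In both cases the projected error count falls well below the tolerable
amount, which is the basis for the "not material" conclusions in table 1.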

Enclosure II

Staff Acknowledgments

Dan Blair, Marcia Carlsen, Carla Lewis, Jerry Pennington, and Ronda Price
made key contributions to this report.

(913912)