Information Technology: Customs Automated Commercial Environment 
Program Progressing, but Need for Management Improvements	 
Continues (14-MAR-05, GAO-05-267).				 
                                                                 
The Department of Homeland Security (DHS) is conducting a	 
multiyear, multibillion-dollar acquisition of a new trade	 
processing system, planned to support the movement of legitimate 
imports and exports and strengthen border security. By		 
congressional mandate, plans for expenditure of appropriated	 
funds on this system, the Automated Commercial Environment (ACE),
must meet certain conditions, including GAO review. This study	 
addresses whether the fiscal year 2005 plan satisfies these	 
conditions, describes the status of DHS's efforts to implement	 
prior GAO recommendations for improving ACE management, and	 
provides observations about the plan and DHS's management of the 
program.							 
-------------------------Indexing Terms------------------------- 
REPORTNUM:   GAO-05-267 					        
    ACCNO:   A19236						        
  TITLE:     Information Technology: Customs Automated Commercial     
Environment Program Progressing, but Need for Management	 
Improvements Continues						 
     DATE:   03/14/2005 
  SUBJECT:   Accountability					 
	     Appropriated funds 				 
	     Budget outlays					 
	     Cost overruns					 
	     Federal funds					 
	     Federal procurement				 
	     Funds management					 
	     International trade				 
	     Performance measures				 
	     Program evaluation 				 
	     Schedule slippages 				 
	     Trade policies					 
	     National preparedness				 
	     Customs administration				 
	     Procurement planning				 
	     Strategic planning 				 
	     Border security					 
	     Customs Service Automated Commercial		 
	     Environment System 				 
                                                                 

******************************************************************
** This file contains an ASCII representation of the text of a  **
** GAO Product.                                                 **
**                                                              **
** No attempt has been made to display graphic images, although **
** figure captions are reproduced.  Tables are included, but    **
** may not resemble those in the printed version.               **
**                                                              **
** Please see the PDF (Portable Document Format) file, when     **
** available, for a complete electronic file of the printed     **
** document's contents.                                         **
**                                                              **
******************************************************************
GAO-05-267

                 United States Government Accountability Office

GAO Report to Congressional Committees

March 2005

                                  INFORMATION
                                   TECHNOLOGY

Customs Automated Commercial Environment Program Progressing, but Need for
                       Management Improvements Continues


GAO-05-267


March 2005

INFORMATION TECHNOLOGY

Customs Automated Commercial Environment Program Progressing, but Need for
Management Improvements Continues

What GAO Found

The fiscal year 2005 ACE expenditure plan, including related program
documentation and program officials' statements, largely satisfies the
legislative conditions imposed by the Congress. In addition, some of the
recommendations that GAO has previously made to strengthen ACE management
have been addressed, and DHS has committed to addressing those that
remain. However, much remains to be done before these recommendations are
fully implemented. For example, progress has been slow on implementing the
recommendation that the department proactively manage the dependencies
between ACE and related DHS border security programs. Delays in managing
the relationships among such programs will increase the chances that later
system rework will be needed to allow the programs to interoperate.

Among GAO's observations about the ACE program and its management are
several regarding DHS's approach to addressing previously identified cost
and schedule overruns. DHS has taken actions intended to address these
overruns (such as revising its baselines for cost and schedule, as GAO
previously recommended); however, it is unlikely that these actions will
prevent future overruns, because DHS has relaxed system quality standards,
meaning that milestones are being passed despite material system defects.
Correcting such defects will require the program to use resources (e.g.,
people and test environments) at the expense of later system releases.
Until the ACE program is held accountable not only for cost and schedule
but also for system capabilities and benefits, the program is likely to
continue to fall short of expectations.

Finally, the usefulness of the fiscal year 2005 expenditure plan for
congressional oversight is limited. For example, it does not adequately
describe progress against commitments (e.g., ACE capabilities, schedule,
cost, and benefits) made in previous plans, which makes it difficult to
make well-informed judgments on the program's overall progress. Also, in
light of recent program changes, GAO questions the expenditure plan's
usefulness to the Congress as an accountability mechanism. The expenditure
plan is based largely on the ACE program plan of July 8, 2004. However,
recent program developments have altered some key bases of the ACE program
plan and thus the current expenditure plan. In particular, the expenditure
plan does not reflect additional program releases that are now planned or
recent changes to the roles and responsibilities of the ACE development
contractor and the program office. Without complete information and an
up-to-date plan, meaningful congressional oversight of program progress
and accountability is impaired.


Contents

Letter
    Compliance with Legislative Conditions
    Status of Open Recommendations
    Observations on Management of ACE
    Conclusions
    Recommendations for Executive Action
    Agency Comments

Appendixes
    Appendix I:   Briefing to Subcommittees on Homeland Security, House
                  and Senate Committees on Appropriations
    Appendix II:  Comments from the U.S. Department of Homeland Security
    Appendix III: Contacts and Staff Acknowledgments
                  GAO Contacts
                  Staff Acknowledgments

Abbreviations

ACE Automated Commercial Environment
ACS Automated Commercial System
CBP U.S. Customs and Border Protection
CBPMO Customs and Border Protection Modernization Office
CIO chief information officer
CMU Carnegie Mellon University
EA enterprise architecture
eCP e-Customs Partnership
EVM earned value management
IDIQ indefinite-delivery/indefinite-quantity
IEEE Institute of Electrical and Electronics Engineers
IRB Investment Review Board
ITDS International Trade Data System
IV&V independent verification and validation
JAR Java Archive
OIG Office of Inspector General
OIT Office of Information and Technology
ORR operational readiness review
OTB Over Target Baseline
PRR production readiness review
PTR program trouble report
SA-CMM(R) Software Acquisition Capability Maturity Model
SAT system acceptance test
SDLC systems development life cycle
SEI Software Engineering Institute
SIT system integration test
SWIT software integration test
TRR test readiness review
UAT user acceptance test
US-VISIT United States Visitor and Immigrant Status Indicator Technology


This is a work of the U.S. government and is not subject to copyright
protection in the United States. It may be reproduced and distributed in
its entirety without further permission from GAO. However, because this
work may contain copyrighted images or other material, permission from the
copyright holder may be necessary if you wish to reproduce this material
separately.


United States Government Accountability Office Washington, D.C. 20548

March 14, 2005

The Honorable Judd Gregg
Chairman
The Honorable Robert C. Byrd
Ranking Minority Member
Subcommittee on Homeland Security
Committee on Appropriations
United States Senate

The Honorable Harold Rogers
Chairman
The Honorable Martin Olav Sabo
Ranking Minority Member
Subcommittee on Homeland Security
Committee on Appropriations
House of Representatives

In November 2004, U.S. Customs and Border Protection (CBP), within the
Department of Homeland Security (DHS), submitted to the Congress its
fiscal year 2005 expenditure plan for the Automated Commercial
Environment (ACE) program. ACE is to be CBP's new import and export
processing system. The program's goals include facilitating the movement
of legitimate trade through more effective trade account management and
strengthening border security by identifying import and export
transactions that could pose a threat to the United States. DHS currently
plans to acquire and deploy ACE in 11 increments, referred to as releases,
over 9 years. The first 3 releases are deployed and operating. The fourth
release is in the final stages of testing. Later releases are in various
stages of definition and development. The risk-adjusted ACE life-cycle
cost estimate is about $3.3 billion,1 and through fiscal year 2004, about
$1 billion in ACE-appropriated funding has been provided.

As required by DHS's fiscal year 2005 appropriations,2 we reviewed the
ACE fiscal year 2005 expenditure plan. Our objectives were to
(1) determine whether the expenditure plan satisfies certain legislative
conditions, (2) determine the status of our open ACE recommendations,

1CBP's ACE life-cycle cost estimate not adjusted for risk is about $3.1 billion.

2Pub. L. 108-334 (Oct. 18, 2004).

and (3) provide any other observations about the expenditure plan and
DHS's management of the ACE program.

On December 20, 2004, we briefed your offices on the results of this
review. This report transmits the results of our work. The full briefing,
including our scope and methodology, can be found in appendix I.

Compliance with Legislative Conditions

The fiscal year 2005 expenditure plan satisfied or partially satisfied the
conditions specified in DHS's appropriations act. Specifically, the plan,
including related program documentation and program officials' statements,
satisfied or provided for satisfying all key aspects of (1) meeting the
capital planning and investment control review requirements of the Office
of Management and Budget (OMB) and (2) review and approval by DHS and OMB.
The plan partially satisfied the conditions that specify (1) compliance
with the DHS enterprise architecture3 and (2) compliance with the
acquisition rules, requirements, guidelines, and systems acquisition
management practices of the federal government.

Status of Open Recommendations

CBP is working toward addressing our open recommendations. Each
recommendation, along with the status of actions to address it, is
summarized below.

o 	Develop and implement a rigorous and analytically verifiable
cost-estimating program that embodies the tenets of effective estimating
as defined in the Software Engineering Institute's (SEI) institutional and
project-specific estimating models.4

The CBP Modernization Office's (CBPMO) implementation of this
recommendation is in progress. CBPMO has (1) defined and documented

3An enterprise architecture is an institutional blueprint for guiding and
constraining investments in programs like ACE.

4SEI's institutional and project-specific estimating guidelines are
defined respectively in Robert E. Park, Checklists and Criteria for
Evaluating the Cost and Schedule Estimating Capabilities of Software
Organizations, CMU/SEI-95-SR-005, and A Manager's Checklist for Validating
Software Cost and Schedule Estimates, CMU/SEI-95-SR-004 (Pittsburgh, Pa.:
Carnegie Mellon University Software Engineering Institute, 1995).

processes for estimating expenditure plan costs (including management
reserve costs); (2) hired a contractor to develop cost estimates,
including contract task orders, that are independent of the ACE
development contractor's estimates; and (3) tasked a support contractor
with evaluating the independent estimates and the development contractor's
estimates against SEI criteria. According to the summary-level results of
this evaluation, the independent estimates either satisfied or partially
satisfied the SEI criteria, and the development contractor's estimates
satisfied or partially satisfied all but two of the seven SEI criteria.

o 	Ensure that future expenditure plans are based on cost estimates that
are reconciled with independent cost estimates.

CBPMO's implementation of this recommendation is complete with respect to
the fiscal year 2005 expenditure plan. In August 2004, CBP's support
contractor completed an analysis comparing the cost estimates in the
fiscal year 2005 expenditure plan (which are based on the ACE development
contractor's cost estimates) with the estimate prepared by CBPMO's
independent cost estimating contractor; this analysis concluded that the
two estimates are consistent.

o 	Immediately develop and implement a human capital management strategy
that provides both near- and long-term solutions to the program office's
human capital capacity limitations, and report quarterly to the
appropriations committees on the progress of efforts to do so.

CBPMO's implementation of this recommendation is in progress, and it has
reported on its actions to the Congress. Following our recommendation,
CBPMO provided reports dated March 31, 2004, and June 30, 2004, to the
appropriations committees on its human capital activities, including
development of a staffing plan that identifies the positions it needs to
manage ACE. However, in December 2004, CBPMO implemented a reorganization
of the modernization office, which makes the staffing plan out of date. As
part of this reorganization, CBP transferred government and contractor
personnel who have responsibility for the Automated Commercial System,5
the Automated Targeting System,6 and ACE training

5The Automated Commercial System is CBP's system for tracking,
controlling, and processing imports to the United States.

6The Automated Targeting System is CBP's system for identifying import
shipments that warrant further attention.

from non-CBPMO organizational units to CBPMO. According to CBPMO, this
change is expected to eliminate redundant ACE-related program management
efforts.

o 	Have future ACE expenditure plans specifically address any proposals or
plans, whether tentative or approved, for extending and using ACE
infrastructure to support other homeland security applications, including
any impact on ACE of such proposals and plans.

CBP's implementation of this recommendation is in progress. In our fiscal
year 2004 expenditure plan review,7 we reported that CBPMO had discussed
collaboration opportunities with DHS's United States Visitor and Immigrant
Status Indicator Technology (US-VISIT) program8 to address the potential
for ACE infrastructure, data, and applications to support US-VISIT. Since
then, ACE and US-VISIT managers have again met to identify potential areas
for collaboration between the two programs and to clarify how the programs
can best support the DHS mission. The US-VISIT and ACE programs have
formed collaboration teams that have drafted team charters, identified
specific collaboration opportunities, developed timelines and next steps,
and briefed ACE and US-VISIT program officials on the teams' progress and
activities.

o 	Establish an independent verification and validation (IV&V) function to
assist CBP in overseeing contractor efforts, such as testing, and ensure
the independence of the IV&V agent.

CBP has completed its implementation of this recommendation. To ensure
independence, CBPMO has selected an IV&V contractor that, according to CBP
officials, has had no prior involvement in the modernization program. The
IV&V contractor is to be responsible for reviewing ACE products and
management processes and is to report directly to the CBP chief
information officer.9

7GAO, Information Technology: Early Releases of Customs Trade System
Operating, but Pattern of Cost and Schedule Problems Needs to Be
Addressed, GAO-04-719 (Washington, D.C.: May 14, 2004).

8US-VISIT is a governmentwide program to collect, maintain, and share
information on foreign nationals in order to enhance national security and
facilitate legitimate trade and travel while adhering to U.S. privacy
laws.

9According to a CBP official, the IV&V contract was awarded on December
30, 2004.

o 	Define metrics, and collect and use associated measurements, for
determining whether prior and future program management improvements are
successful.

CBPMO's implementation of this recommendation is in progress. CBPMO has
implemented a program that generally focuses on measuring the ACE
development contractor's performance through the use of earned value
management,10 metrics for the timeliness and quality of deliverables, and
risk and issue disposition reporting. Additionally, it is planning to
broaden its program to encompass metrics and measures for determining
progress toward achieving desired business results and acquisition process
maturity. The plan for expanding the metrics program is scheduled for
approval in early 2005.

o 	Reconsider the ACE acquisition schedule and cost estimates in light of
early release problems, including these early releases' cascading effects
on future releases and their relatively small size compared to later
releases, and in light of the need to avoid the past levels of concurrency
among activities within and between releases.

CBP has completed its implementation of this recommendation. In response
to the cost overrun on Releases 3 and 4, CBPMO and the ACE development
contractor established a new cost baseline of $196 million for these
releases, extended the associated baseline schedule, and began reporting
schedule and cost performance relative to the new baselines. Additionally,
in July 2004, a new version of the ACE Program Plan was developed that
rebaselined the ACE program, extending delivery of the last ACE release
from fiscal year 2007 to fiscal year 2010, adding a new screening and
targeting release, and increasing the ACE life-cycle cost estimate by
about $1 billion to $3.1 billion. Last, the new program schedule reflects
less concurrency between future releases.

o 	Report quarterly to the House and Senate Appropriations Committees on
efforts to address open GAO recommendations.

CBP's implementation of this recommendation is in progress. CBP has
submitted reports to the committees on its efforts to address open GAO

10Earned value management is a method of measuring contractor progress
toward meeting deliverables by comparing the value of work accomplished
during a given period with that of the work expected in that period.

recommendations for the quarters ending March 31, 2004, and June 30, 2004.
CBPMO plans to submit a report for the quarter ending September 30, 2004,
after it is approved by DHS and OMB.

Observations on Management of ACE

We made observations related to ACE performance, use, testing,
development, cost and schedule performance, and expenditure planning. An
overview of the observations follows:

Initial ACE releases have largely met a key service level agreement.

According to a service level agreement between the ACE development
contractor and CBPMO, 99.9 percent of all ACE transactions are to be
executed successfully each day. The development contractor reports that
ACE has met this requirement on all but 11 days since February 1, 2004. It
attributed one problem, which accounted for 5 successive days on which the
agreement was not met, to CBPMO's focus on meeting schedule commitments.
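
For context, the arithmetic behind this service level is straightforward;
the sketch below (in Python) checks a single day's transactions against
the 99.9 percent threshold. The daily volumes shown are hypothetical, not
reported ACE figures.

    # Checks one day's ACE transaction results against the 99.9 percent
    # service level described above; the volumes in the example are made up.

    SLA_THRESHOLD = 0.999  # 99.9 percent of each day's transactions must succeed

    def meets_sla(successful, total):
        """Return True if the day's success rate meets the service level agreement."""
        if total == 0:
            return True  # nothing processed, nothing failed
        return successful / total >= SLA_THRESHOLD

    print(meets_sla(successful=99_850, total=100_000))  # False: 99.85% falls short
    print(meets_sla(successful=99_950, total=100_000))  # True:  99.95% meets it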

Progress toward establishing ACE user accounts has not met expectations.
CBPMO established a goal of activating 1,100 ACE importer accounts by
February 25, 2005, when Release 4 is to become operational. Weekly targets
were established to help measure CBPMO's progress toward reaching the
overall goal. However, CBPMO has not reached any of its weekly targets,
and the gap between the actual and targeted number of activated accounts
has continued to grow. To illustrate, as of November 26, 2004, the goal
was 600 activated accounts and the actual number was 311.

Release 3 testing and pilot activities were delayed and have produced
system defect trends that raise questions about decisions to pass key
milestones and about the state of system maturity. Release 3 test phases
and pilot activities were delayed and revealed system defects, some of
which remained open at the time decisions were made to pass key lifecycle
milestones. In particular, we observed the following:

o 	Release 3 integration testing started later than planned, took longer
than expected, and was declared successful despite open defects that
prevented the system from performing as intended. For example, the test
readiness milestone was passed despite the presence of 90 severe defects.

o 	Release 3 acceptance testing started later than planned, concluded
later than planned, and was declared successful despite having a material
inventory of open defects. For example, the production readiness milestone
was passed despite the presence of 18 severe defects.

o 	Release 3 pilot activities, including user acceptance testing, were
declared successful, despite the presence of severe defects. For example,
the operational readiness milestone was passed despite the presence of 6
severe defects.

o 	The current state of Release 3 maturity is unclear because defect data
reported since user acceptance testing are not reliable.

Release 4 test phases were delayed and overlapped, and revealed a higher
than expected volume and significance of defects, raising questions about
decisions to pass key milestones and about the state of system maturity.
In particular, we observed the following:

o 	Release 4 testing revealed a considerably higher than expected number
of material defects. Specifically, 3,059 material defects were reported,
compared with the 1,453 estimated, as of the November 23, 2004, production
readiness milestone.

o 	Changes in the Release 4 integration and acceptance testing schedule
resulted in tests being conducted concurrently. As we previously reported,
concurrent test activities increase risk and have contributed to past ACE
cost and schedule problems.

o 	The defect profile for Release 4 shows improvements in resolving
defects, but critical and severe defects remain in the operational system.
Specifically, as of November 30, 2004, which was about 1.5 weeks from
deployment of the Release 4 pilot period, 33 material defects were
present.

Performance against the revised cost and schedule estimates for Releases 3
and 4 has been mixed. Since the cost and schedule for Releases 3 and 4
were revised in April 2004, work has been completed under the budgeted
cost, but it is being completed behind schedule. In order to improve the
schedule performance, resources targeted for later releases have been
retained on Release 4 longer than planned. While this has resulted in
improved performance against the schedule, it has adversely affected cost
performance.
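
The mixed cost and schedule results described above are the kind of
outcome that earned value measures (see footnote 10) are designed to
expose. The sketch below uses hypothetical dollar figures rather than
actual ACE data to show how cost and schedule variances are derived from
planned value, earned value, and actual cost.

    # Earned value arithmetic with hypothetical figures (millions of dollars);
    # these are not actual ACE program numbers.

    def earned_value_metrics(planned_value, earned_value, actual_cost):
        """Return cost variance, schedule variance, and the two performance indexes."""
        cost_variance = earned_value - actual_cost        # positive: under budgeted cost
        schedule_variance = earned_value - planned_value  # negative: behind schedule
        cost_index = earned_value / actual_cost
        schedule_index = earned_value / planned_value
        return cost_variance, schedule_variance, cost_index, schedule_index

    # Hypothetical period: $90 million of work scheduled, $80 million worth
    # performed, $75 million actually spent; under cost but behind schedule,
    # the mixed pattern described above.
    cv, sv, cpi, spi = earned_value_metrics(planned_value=90.0, earned_value=80.0, actual_cost=75.0)
    print(cv, sv, round(cpi, 2), round(spi, 2))  # 5.0 -10.0 1.07 0.89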

The fiscal year 2005 expenditure plan does not adequately describe
progress against commitments (e.g., ACE capabilities, schedule, cost, and
benefits) made in previous plans. In the fiscal year 2004 expenditure
plan, CBPMO committed to, for example, acquiring infrastructure for ACE
releases and to defining and designing an ACE release that was intended to
provide additional account management functionality. However, the current
plan described neither the status of infrastructure acquisition nor
progress toward defining and designing the planned account management
functionality. Also, the current plan included a schedule for developing
ACE releases, but neither reported progress relative to the schedule
presented in the fiscal year 2004 plan nor explained how the individual
releases and their respective schedules were affected by the rebaselining
that occurred after the fiscal year 2004 plan was submitted.

Some key bases for the commitments made in the fiscal year 2005
expenditure plan have changed, raising questions as to the plan's currency
and relevance. Neither the expenditure plan nor the program plan reflected
several program developments, including the following:

o 	A key Release 5 assumption made in the program and expenditure plans
regarding development, and thus cost and delivery, of the multimodal
manifest functionality is no longer valid.

o 	Additional releases, and thus cost and effort, are now planned that
were not reflected in the program and expenditure plans.

o 	The current organizational change management approach is not fully
reflected in program and expenditure plans, and key change management
actions are not to be implemented.

o 	Significant changes to the respective roles and responsibilities of the
ACE development contractor and CBPMO are not reflected in the program and
expenditure plans.

Conclusions

DHS and OMB have largely satisfied four of the five conditions associated
with the fiscal year 2005 ACE expenditure plan that were legislated by the
Congress, and we have satisfied the fifth condition. Further, CBPMO has
continued to work toward implementing our prior recommendations aimed at
improving management of the ACE program and thus the program's chances of
success. Nevertheless, progress has been slow in addressing some of our
recommendations, such as the one encouraging proactive management of the
relationships between ACE and other DHS border security programs, like
US-VISIT. Given that these programs have made and will continue to make
decisions that determine how they will operate, delays in managing their
relationships will increase the chances that later system rework will
eventually be required to allow the programs to interoperate.

Additionally, while DHS has taken important actions to help address ACE
release-by-release cost and schedule overruns that we previously
identified, it is unlikely that the effect of these actions will prevent
the past pattern of overruns from recurring. This is because DHS has met
its recently revised cost and schedule commitments in part by relaxing
system quality standards, so that milestones are being passed despite
material system defects, and because correcting such defects will
ultimately require the program to expend resources, such as people and
test environments, at the expense of later system releases (some of which
are now under way).
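
One way to make the quality dimension explicit, consistent with the
recommendation in this report that readiness milestones adequately
consider the severity of open defects, is to gate milestone exit on
open-defect counts. The sketch below is a hypothetical illustration; the
severity labels and thresholds are assumptions, not CBP's actual
readiness-review criteria.

    # Hypothetical milestone exit gate based on open-defect severity; the
    # labels and thresholds are illustrative assumptions, not CBP criteria.

    from collections import Counter

    def ready_to_exit(open_defect_severities, max_critical=0, max_severe=0):
        """Return True only if open critical and severe defects are within limits."""
        counts = Counter(open_defect_severities)
        return counts["critical"] <= max_critical and counts["severe"] <= max_severe

    # Under a zero-severe-defect gate, a production readiness review with 18
    # severe defects still open (as reported for Release 3) would not pass.
    print(ready_to_exit(["severe"] * 18))                 # False
    print(ready_to_exit(["minor", "minor", "moderate"]))  # True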

In the near term, cost and schedule overruns on recent releases are being
somewhat masked by the use of less stringent quality standards;
ultimately, efforts to fix these defects will likely affect the delivery
of later releases. Until accountability for ACE is redefined and measured
in terms of all types of program commitments (system capabilities,
benefits, costs, and schedules), the program will likely experience more
cost and schedule overruns.

During the last year, DHS's accountability for ACE has been largely
focused on meeting its cost and schedule baselines. This focus is revealed
by the absence of information in the latest expenditure plan on progress
against all commitments made in prior plans, particularly with regard to
measurement and reporting on such things as system capabilities, use, and
benefits. It is also shown by the program's insufficient focus on system
quality, as demonstrated by its willingness to pass milestones despite
material defects, and by the absence of attention to the current defect
profile for Release 3 (which is already deployed).

Moreover, the commitments that DHS made in the fiscal year 2005
expenditure plan have been overcome by events, which limits the currency
and relevance of this plan and its utility to the Congress as an
accountability mechanism. As a result, the prospects of holding the program
accountable for delivering against its capability, benefit, cost, and
schedule commitments are limited. Therefore, it is critically important
that DHS define for itself and the Congress an accountability framework for
ACE, and that it manage and report in accordance with this framework. If it
does not, the effects of the recent rebaselining of the program will be
short-lived, and the past pattern of ACE costing more and taking longer
than planned will continue.

Recommendations for Executive Action

To strengthen accountability for the ACE program and better ensure that
future ACE releases deliver promised capabilities and benefits within
budget and on time, we recommend that the DHS Secretary, through the Under
Secretary for Border and Transportation Security, direct the Commissioner,
Customs and Border Protection, to define and implement an ACE
accountability framework that ensures

o 	coverage of all program commitment areas, including key expected or
estimated system (1) capabilities, use, and quality; (2) benefits and
mission value; (3) costs; and (4) milestones and schedules;

o 	currency, relevance, and completeness of all such commitments made to
the Congress in expenditure plans;

o  reliability of data relevant to measuring progress against commitments;

o 	reporting in future expenditure plans of progress against commitments
contained in prior expenditure plans;

o 	use of criteria for exiting key readiness milestones that adequately
consider indicators of system maturity, such as severity of open defects;
and

o 	clear and unambiguous delineation of the respective roles and
responsibilities of the government and the prime contractor.

Agency Comments

In written comments on a draft of this report signed by
the Acting Director, Departmental GAO/OIG Liaison, DHS agreed with our
findings concerning progress in addressing our prior recommendations. In
addition, the department agreed with the new recommendations we are making
in this report and described actions that it plans to take to enhance
accountability for the program. These planned actions are consistent with
our recommendations. DHS's comments are reprinted in appendix II.

We are sending copies of this report to the Chairmen and Ranking Minority
Members of other Senate and House committees and subcommittees that have
authorization and oversight responsibilities for homeland security. We are
also sending copies to the Secretary of Homeland Security, the Under
Secretary for Border and Transportation Security, the CBP Commissioner,
and the Director of OMB. In addition, the report will be available at no
charge on the GAO Web site at http://www.gao.gov.

Should you or your offices have any questions on matters discussed in this
report, please contact me at (202) 512-3459 or at [email protected]. Other
contacts and key contributors to this report are listed in appendix III.

Randolph C. Hite
Director, Information Technology Architecture and Systems Issues

Appendix I

Briefing to Subcommittees on Homeland Security, House and Senate
Committees on Appropriations

Information Technology: Customs Automated Commercial Environment Program
Progressing, but Need for Management Improvements Continues

Briefing to the Staffs of the
Subcommittees on Homeland Security,
Senate and House Committees on Appropriations

December 20, 2004


Appendix I Briefing to Subcommittees on Homeland Security, House and
Senate Committees on Appropriations

Introduction
Objectives
Results in Brief
Background
Results

o  Legislative Conditions

o  Status of Recommendations

o  Observations

Conclusions
Recommendations
Agency Comments
Attachment 1: Scope and Methodology

Appendix I Briefing to Subcommittees on Homeland Security, House and
Senate Committees on Appropriations

The Department of Homeland Security's (DHS) Bureau of Customs and Border
Protection (CBP)1 is over 3 years into its second attempt to introduce new
trade processing capability, known as the Automated Commercial Environment
(ACE). The goals of ACE are to

o 	facilitate the movement of legitimate trade through more effective
trade account management;

o 	strengthen border security by identifying import/export transactions
that have an elevated risk of posing a threat to the United States or of
violating a trade law or regulation; and

o 	provide a single system interface between the trade community2 and the
federal government,3 known as the International Trade Data System (ITDS),
and thereby reduce the data reporting burden placed on the trade community
while also providing federal agencies with the data and various
capabilities to support their respective international trade and
transportation missions.

1CBP was formed from the former U.S. Customs Service and other entities
with border protection responsibility.

2Members of the trade community include importers and exporters, brokers
and trade advisors, and carriers.

3Includes federal agencies responsible for managing international trade
and transportation processes.

Appendix I Briefing to Subcommittees on Homeland Security, House and
Senate Committees on Appropriations

The Department of Homeland Security Appropriations Act, 2005,1 states that
DHS may not obligate any funds for ACE until DHS submits for approval to
the House and Senate Committees on Appropriations a plan for expenditure
that

1. meets the capital planning and investment control review requirements
established by the Office of Management and Budget (OMB), including
Circular A-11, part 7,2

2. complies with DHS's enterprise architecture;

3. complies with the acquisition rules, requirements, guidelines, and
systems acquisition management practices of the federal government;

4. is reviewed and approved by the DHS Investment Review Board (IRB),3
Secretary of Homeland Security, and OMB; and

5. is reviewed by GAO.

1Pub. L. 108-334 (Oct. 18, 2004).

2OMB Circular A-11 establishes policy for planning, budgeting,
acquisition, and management of federal capital assets.

3The purpose of the Investment Review Board is to integrate capital
planning and investment control, budgeting, acquisition, and management of
investments. It is also to ensure that spending on investments directly
supports and furthers the mission and that this spending provides optimal
benefits and capabilities to stakeholders and customers.

Appendix I Briefing to Subcommittees on Homeland Security, House and
Senate Committees on Appropriations

In the Department of Homeland Security Appropriations Act for fiscal year
2005, the Congress appropriated approximately $321.7 million for the ACE
program.1

DHS submitted its fiscal year 2005 expenditure plan for $321.7 million on
November 8, 2004, to its House and Senate Appropriations Subcommittees on
Homeland Security.

DHS currently plans to acquire and deploy ACE in 11 increments, referred
to as releases. The first three releases are deployed and operational. The
fourth release is in the final stages of testing. Other releases are in
various stages of definition and development.

1Pub. L. 108-334 (Oct. 18, 2004).

Appendix I Briefing to Subcommittees on Homeland Security, House and
Senate Committees on Appropriations

Objectives

As agreed, our objectives were to

o 	determine whether the ACE fiscal year 2005 expenditure plan satisfies
the legislative conditions,

o  determine the status of our open recommendations on ACE, and

o 	provide any other observations about the expenditure plan and DHS's
management of the ACE program.

We conducted our work at CBP headquarters and contractor facilities in the
Washington, D.C., metropolitan area from April 2004 through December 2004,
in accordance with generally accepted government auditing standards.
Details of our scope and methodology are provided in attachment 1.

Appendix I Briefing to Subcommittees on Homeland Security, House and
Senate Committees on Appropriations

Objective 3: Observations

o  Initial ACE releases have largely met a key service level agreement.

o  Progress toward establishing ACE user accounts has not met
expectations.

o 	Release 3 testing and pilot activities were delayed and have produced
system defect trends that raise questions about decisions to pass key
milestones and about the state of system maturity.

o 	Release 3 integration testing started later than planned, took longer
than expected, and was declared successful despite open defects that
prevented the system from performing as intended.

o 	Release 3 acceptance testing started later than planned, concluded
later than planned, and was declared successful despite a material
inventory of open defects.

o 	Release 3 pilot activities, including user acceptance testing, were
declared successful despite severe defects remaining open.

o 	Current state of Release 3 maturity is unclear because defect data
since user acceptance testing are not reliable.

Appendix I Briefing to Subcommittees on Homeland Security, House and
Senate Committees on Appropriations

Objective 3: Observations

o 	Release 4 test phases were delayed and overlapped, and revealed a
higher than expected volume and significance of defects, raising questions
about decisions to pass key milestones and about the state of system
maturity.

o 	Release 4 testing revealed a considerably higher than expected number
of material defects.

o 	Release 4 integration and acceptance testing schedule changes resulted
in tests being conducted concurrently.

o 	Release 4 defect profile shows improvements in resolving defects, but
critical and severe defects remain in operational system.

o 	Performance against the revised cost and schedule estimates for
Releases 3 and 4 has been mixed.

o 	The fiscal year 2005 expenditure plan does not adequately describe
progress against commitments (e.g., ACE capabilities, schedule, cost, and
benefits) made in previous plans.

Appendix I Briefing to Subcommittees on Homeland Security, House and
Senate Committees on Appropriations

Objective 3: Observations

o 	Some key bases for the commitments made in the fiscal year 2005
expenditure plan have changed, raising questions as to the plan's currency
and relevance.

o 	A key Release 5 assumption underpinning program and expenditure plans
is no longer valid.

o 	Additional release(s) are now planned that were not reflected in the
program and expenditure plans.

o 	The current organizational change management approach is not fully
reflected in program and expenditure plans, and key change management
actions are not to be implemented.

o 	Recent changes to the respective roles and responsibilities of the ACE
development contractor and CBP's Modernization Office are not reflected in
the program and expenditure plans.

Appendix I Briefing to Subcommittees on Homeland Security, House and
Senate Committees on Appropriations

We are making recommendations to the DHS Secretary to strengthen
accountability for the ACE program and better ensure that future ACE
releases deliver expected capabilities and benefits within budget and on
time.

In their oral comments on a draft of this briefing, DHS and CBP officials,
including the DHS Chief Information Officer (CIO), the Border and
Transportation Security CIO, and the CBP Acting CIO, generally agreed with
our findings, conclusions, and recommendations and stated that the briefing
was fair and balanced. They also provided clarifying information that we
incorporated as appropriate in this briefing.

Appendix I Briefing to Subcommittees on Homeland Security, House and
Senate Committees on Appropriations

ACE is to support eight major CBP business areas.

1.	Release Processing: Processing of cargo for import or export; tracking
of conveyances, cargo and crew; and processing of in-bond, warehouse,
Foreign Trade Zone, and special import and export entries.

2.	Entry Processing: Liquidation and closeout of entries and entry
summaries related to imports, and processing of protests and decisions.

3.	Finance: Recording of revenue, performance of fund accounting, and
maintenance of the general ledger.

4.	Account Relationships: Maintenance of trade accounts, their bonds and
CBP-issued licenses, and their activity.

5.	Legal and Policy: Management of legal, regulatory, policy and
procedure, and rulings issues related to imports and exports.

6.	Enforcement: Enforcement of laws, regulations, policies and procedures,
and rulings governing the import and export of cargo, conveyances, and
crew.

Appendix I Briefing to Subcommittees on Homeland Security, House and
Senate Committees on Appropriations

7.	Business Intelligence: Gathering and reporting data, such as references
for import and export transactions, for use in making admissibility and
release decisions.

8.	Risk: Decisionmaking about admissibility and compliance of cargo using
risk-based mitigation, selectivity, and targeting.

Appendix I Briefing to Subcommittees on Homeland Security, House and
Senate Committees on Appropriations

The ACE technical architecture is to consist of layers or tiers of
computer technology:

o  The Client Tier includes user workstations and external system
   interfaces.

o  The Presentation Tier provides the mechanisms for the user workstations
   and external systems to access ACE.

o  The Integration Services Tier provides the middleware for integrating
   and routing information between ACE software applications and legacy
   systems.

o  The Applications Tier includes software applications comprising
   commercial products (e.g., SAP1) and custom-developed software that
   provide the functionality supporting CBP business processes.

o  The Data Tier provides the data management and warehousing services for
   ACE, including database backup, restore, recovery, and space management.

Security and data privacy are to be embedded in all five layers.

1SAP is a commercial enterprise resource planning software product that
has multiple modules, each performing separate but integrated business
functions. ACE will use SAP as the primary commercial, off-the-shelf
product supporting its business processes and functions. CBP's
Modernization Office is also using SAP as part of a joint project with its
Office of Finance to support financial management, procurement, property
management, cost accounting, and general ledger processes.
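
The tiered structure described above can be pictured as a simple layering
of components, with each tier depending only on the tier beneath it. The
following minimal sketch (in Python) is purely illustrative: the class
names, the transaction fields, and the pass-through methods are
hypothetical and are not drawn from the actual ACE design, which relies
largely on SAP and other commercial products.

    # Purely illustrative sketch of the five-tier layering described above;
    # names and fields are hypothetical, not taken from the ACE design.

    from dataclasses import dataclass

    @dataclass
    class TradeTransaction:          # hypothetical record flowing through the tiers
        account_id: str
        entry_number: str
        duties_owed: float

    class DataTier:
        """Data management and warehousing services (backup/recovery not shown)."""
        def __init__(self):
            self._store = {}
        def save(self, txn):
            self._store[txn.entry_number] = txn

    class ApplicationsTier:
        """Business functionality (commercial and custom software in the real system)."""
        def __init__(self, data):
            self._data = data
        def record_transaction(self, txn):
            self._data.save(txn)

    class IntegrationServicesTier:
        """Middleware routing information between applications and legacy systems."""
        def __init__(self, apps):
            self._apps = apps
        def route(self, txn):
            self._apps.record_transaction(txn)

    class PresentationTier:
        """Access point for user workstations and external system interfaces."""
        def __init__(self, integration):
            self._integration = integration
        def submit(self, txn):
            self._integration.route(txn)

    # The client tier (a user workstation or external interface) submits a
    # request that passes down through each layer in turn.
    ace = PresentationTier(IntegrationServicesTier(ApplicationsTier(DataTier())))
    ace.submit(TradeTransaction("IMPORTER-001", "ENTRY-0001", 1500.00))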

Appendix I Briefing to Subcommittees on Homeland Security, House and
Senate Committees on Appropriations

Simplified View of ACE Technical Architecture

Appendix I Briefing to Subcommittees on Homeland Security, House and
Senate Committees on Appropriations

CBP's Modernization Office (CBPMO) is responsible for acquiring and
implementing ACE through a contract awarded on April 27, 2001, to IBM
Global Services. IBM and its subcontractors are collectively called the
e-Customs Partnership (eCP).

CBPMO's initial strategy provided for acquiring ACE in four increments
deployed over 4 years. In September 2002, the modernization office
modified this strategy to acquire and deploy the first three increments in
six releases; all four increments were to be deployed over 4 years. In
October 2003, CBPMO changed its plans, deciding to acquire and deploy ACE
in 10 releases over 6 years.

Subsequently, between January and July 2004, CBPMO and eCP conducted a
planning project called the Global Business Blueprint. It was intended to
define how ACE will use SAP and other technologies to perform CBP business
processes in Releases 5, 6, and 7; to define the functional scope of these
releases; and to develop updated program schedule and cost estimates.
Following the blueprint, CBP changed its acquisition strategy again. It
currently plans to acquire and deploy ACE in 11 releases over 9 years.

Appendix I Briefing to Subcommittees on Homeland Security, House and
Senate Committees on Appropriations

The functionality associated with, status of, and plans for the 11 ACE
releases are as follows.

Release 1 (ACE Foundation): Provide IT infrastructure (computer hardware
and system software) to support subsequent system releases. This release
was deployed in October 2003 and is operating.

Release 2 (Account Creation): Give initial group of CBP national account
managers1 and importers access to account information, such as trade
activity. This release was deployed in October 2003 and is operating.

Release 3 (Periodic Payment): Provide additional account managers and
importers, as well as brokers and carriers,2 access to account
information; provide initial financial transaction processing and CBP
revenue collection capability, allowing importers and their brokers to
make monthly payments of duties and fees.

1CBP national account managers work with the largest importers.

2Brokers obtain licenses from CBP to conduct business on behalf of the
importers by filling out paperwork and obtaining a bond; carriers are
individuals or organizations engaged in transporting goods for hire.

Appendix I Briefing to Subcommittees on Homeland Security, House and
Senate Committees on Appropriations

This release was deployed in July 2004 and is operating. As a result, CBP
reports that importers can now obtain a national view of their
transactions on a monthly statement and can pay duties and fees on a
monthly basis for the first time since CBP and its predecessor
organizations were established in 1789. Additionally, according to CBP,
Release 3 provides a national view of trade activity, thus greatly
enhancing its ability to accomplish its mission of providing border
security while facilitating legitimate trade and travel. CBP also reports
that as of December 6, 2004, it had processed 27,777 entries and collected
over $126.5 million using Release 3.

Release 4 (e-Manifest: Trucks): Provide truck manifest1 processing and
interfacing to legacy enforcement systems and databases. This release is
under development and scheduled for deployment beginning in February 2005.

Screening S1 (Screening Foundation): Establish the foundation for
screening and targeting cargo and conveyances by centralizing criteria and
results into a single standard database; allow users to define and
maintain data sources and business rules. This release is scheduled for
deployment beginning in September 2005.

1Manifests are lists of passengers or invoices of cargo for a vehicle,
such as a truck, ship, or plane.

Appendix I Briefing to Subcommittees on Homeland Security, House and
Senate Committees on Appropriations

Screening S2 (Targeting Foundation): Establish the foundation for advanced
targeting capabilities by enabling CBP's National Targeting Center to
search multiple databases for relevant facts and actionable intelligence.
This release is scheduled for deployment beginning in February 2006.

Release 5 (Account Revenue and Secure Trade Data): Leverage SAP
technologies to enhance and expand accounts management, financial
management, and postrelease functionality, as well as provide the initial
multimodal manifest1 capability. This release is scheduled for deployment
beginning in November 2006.

Screening S3 (Advanced Targeting): Provide enhanced screening for
reconciliation, intermodal manifest, Food and Drug Administration data,
and in-bond, warehouse, and Foreign Trade Zone authorized movements;
integrate additional data sources into targeting capability; provide
additional analytical tools for screening and targeting data. This release
is scheduled for deployment beginning in February 2007.

1The multimodal manifest involves the processing and tracking of cargo as
it transfers between different modes of transportation, such as cargo that
arrives by ship, is transferred to a truck, and then is loaded onto an
airplane.

Appendix I Briefing to Subcommittees on Homeland Security, House and
Senate Committees on Appropriations

Screening S4 (Full Screening and Targeting): Provide screening and
targeting functionality supporting all modes of transportation and all
transactions within the cargo management lifecycle, including enhanced
screening and targeting capability with additional technologies. This
release is scheduled for deployment beginning in February 2009.

Release 6 (e-Manifest: All Modes and Cargo Security): Provide enhanced
postrelease functionality by adding full entry processing; enable full
tracking of cargo, conveyance, and equipment; enhance the multimodal
manifest to include shipments transferring between transportation modes.
This release is scheduled for deployment beginning in February 2009.

Release 7 (Exports and Cargo Control): Implement the remaining ACE
functionality, including Foreign Trade Zone warehouse; export, seized
asset and case tracking system; import activity summary statement; and
mail, pipeline, hand carry, drawback, protest, and document management.
This release is scheduled for deployment beginning in May 2010.

The graphic on the following slide illustrates the planned schedule for
ACE.

Appendix I Briefing to Subcommittees on Homeland Security, House and
Senate Committees on Appropriations

                              Current ACE Schedule

Appendix I Briefing to Subcommittees on Homeland Security, House and
Senate Committees on Appropriations

ACE is intended to support CBP in satisfying the provisions of Title VI of
the North American Free Trade Agreement Implementation Act, commonly known
as the Modernization Act. Subtitle B of the Modernization Act contains the
various automation provisions that were intended to enable the government
to modernize international trade processes and permit CBP to adopt an
informed compliance approach with industry. The following table
illustrates how each ACE release is to fulfill the requirements of
Subtitle B.

Appendix I Briefing to Subcommittees on Homeland Security, House and
Senate Committees on Appropriations

               ACE Satisfaction of Modernization Act Requirements

Appendix I Briefing to Subcommittees on Homeland Security, House and
Senate Committees on Appropriations

Since March 2001, six ACE expenditure plans have been submitted.1
Collectively, the six plans have identified a total of $1,401.5 million in
funding.

o 	On March 26, 2001, CBP submitted to its appropriations committees the
first expenditure plan seeking $45 million for the modernization contract
to sustain CBPMO operations, including contractor support. The
appropriations committees subsequently approved the use of $45 million,
bringing the total ACE funding to $50 million.

o 	On February 1, 2002, the second expenditure plan sought $206.9 million
to sustain CBPMO operations; define, design, develop, and deploy Increment
1, Release 1 (now Releases 1 and 2); and identify requirements for
Increment 2 (now part of Releases 5, 6, and 7 and Screenings 1 and 2). The
appropriations committees subsequently approved the use of $188.6 million,
bringing total ACE funding to $238.6 million.

1In March 2001, appropriations committees approved the use of $5 million
in stopgap funding to fund program management office operations.

Appendix I Briefing to Subcommittees on Homeland Security, House and
Senate Committees on Appropriations

o 	On May 24, 2002, the third expenditure plan sought $190.2 million to
define, design, develop, and implement Increment 1, Release 2 (now
Releases 3 and 4). The appropriations committees subsequently approved the
use of $190.2 million, bringing the total ACE funding to $428.8 million.

o 	On November 22, 2002, the fourth expenditure plan sought $314 million
to operate and maintain Increment 1 (now Releases 1, 2, 3, and 4); to
design and develop Increment 2, Release 1 (now part of Releases 5, 6, and
7 and Screening 1); and to define requirements and plan Increment 3 (now
part of Releases 5, 6, and 7 and Screenings 2, 3, and 4). The
appropriations committees subsequently approved the use of $314 million,
bringing total ACE funding to $742.8 million.

Appendix I Briefing to Subcommittees on Homeland Security, House and
Senate Committees on Appropriations

                    Chronology of Six ACE Expenditure Plans

o 	On January 21, 2004, the fifth expenditure plan sought $318.7 million
to implement ACE infrastructure; to support, operate, and maintain ACE;
and to define and design Release 6 (now part of Releases 5, 6, and 7) and
Selectivity 2 (now Screenings 2 and 3). The appropriations committees
subsequently approved the use of $316.8 million, bringing total ACE
funding to $1,059.6 million.

o 	On November 8, 2004, CBP submitted its sixth expenditure plan, seeking
$321.7 million for detailed design and development of Release 5 and
Screening 2, definition of Screening 3, Foundation Program Management,
Foundation Architecture and Engineering, and ACE Operations and
Maintenance.
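
The running totals cited in the bullets above can be reproduced with
simple arithmetic. The sketch below uses only the dollar figures quoted in
this chronology (in millions); the $5 million entry is the March 2001
stopgap funding noted earlier.

    # Reproduces the cumulative ACE funding figures quoted above (millions of dollars).

    approved = [5.0, 45.0, 188.6, 190.2, 314.0, 316.8]           # stopgap plus amounts approved through the fifth plan
    identified = [5.0, 45.0, 206.9, 190.2, 314.0, 318.7, 321.7]  # stopgap plus amounts sought in the six plans

    running = 0.0
    for amount in approved:
        running += amount
        print(f"cumulative approved ACE funding: ${running:,.1f} million")
    # The final line prints $1,059.6 million, matching the total above.

    print(f"total identified in the six plans: ${sum(identified):,.1f} million")  # $1,401.5 million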

Appendix I Briefing to Subcommittees on Homeland Security, House and
Senate Committees on Appropriations

DHS and OMB satisfied or partially satisfied each of their legislative
conditions; GAO satisfied its legislative condition.

Condition 1. The plan, in conjunction with related program documentation
and program officials' statements, satisfied the capital planning and
investment control review requirements established by OMB, including
Circular A-11, part 7, which establishes policy for planning, budgeting,
acquisition, and management of federal capital assets.

The table that follows provides examples of the results of our analysis.

Appendix I Briefing to Subcommittees on Homeland Security, House and
Senate Committees on Appropriations

Condition 2. The plan, including related program documentation and program
officials' statements, partially satisfied this condition by providing for
future compliance with DHS's enterprise architecture (EA).

DHS released version 1.0 of the architecture in September 2003.1 We
reviewed the initial version of the architecture and found that it was
missing, either partially or completely, all the key elements expected in
a well-defined architecture, such as a description of business processes,
information flows among these processes, and security rules associated
with these information flows.2 Since we reviewed version 1.0, DHS has
drafted version 2.0 of its EA. We have not reviewed this draft.

According to CBPMO officials, they have been working with the DHS EA
program office in developing version 2.0 to ensure that ACE is aligned
with DHS's evolving EA. They also said that CBP participates in both the
DHS EA Center of Excellence and the DHS Enterprise Architecture Board.3

1Department of Homeland Security Enterprise Architecture Compendium
Version 1.0 and Transitional Strategy.

2GAO, Homeland Security: Efforts Under Way to Develop Enterprise
Architecture, but Much Work Remains, GAO-04-777 (Washington, D.C.: Aug. 6,
2004).

3The Center of Excellence supports the Enterprise Architecture Board in
reviewing component documentation. The purpose of the Board is to ensure
that investments are aligned with the DHS EA.

Appendix I Briefing to Subcommittees on Homeland Security, House and
Senate Committees on Appropriations

In August 2004, the Center of Excellence approved CBPMO's analysis
intended to demonstrate ACE's architectural alignment, and the Enterprise
Architecture Board subsequently concurred with the center's approval.
However, DHS has not yet provided us with sufficient documentation to
allow us to understand DHS's architecture compliance methodology and
criteria (e.g., definition of alignment and compliance) or with verifiable
analysis justifying the approval.

Appendix I Briefing to Subcommittees on Homeland Security, House and
Senate Committees on Appropriations

Condition 3. The plan, in conjunction with related program documentation,
partially satisfied the condition of compliance with the acquisition
rules, requirements, guidelines, and systems acquisition management
practices of the federal government.

The Software Acquisition Capability Maturity Model (SA-CMM(R)), developed
by Carnegie Mellon University's Software Engineering Institute (SEI), is
consistent with the acquisition guidelines and systems acquisition
management practices of the federal government, and it provides a
management framework that defines processes for acquisition planning,
solicitation, requirements development and management, project management,
contract tracking and oversight, and evaluation.

In November 2003, SEI assessed ACE acquisition management against the
SA-CMM and assigned a level 2 rating, indicating that CBPMO has instituted
basic acquisition management processes and controls in the following
areas: acquisition planning, solicitation, requirements development and
management, project management, contract tracking and oversight, and
evaluation.

Appendix I Briefing to Subcommittees on Homeland Security, House and
Senate Committees on Appropriations

In June 2003, the Department of the Treasury's Office of Inspector General
(OIG) issued a report on the ACE program's contract, concluding that the
former Customs Service, now CBP, did not fully comply with Federal
Acquisition Regulation requirements in the solicitation and award of its
contract because the ACE contract is a multiyear contract and not an
indefinite-delivery/indefinite-quantity (IDIQ) contract. Further, the
Treasury OIG found that the ACE contract type, which it determined to be a
multiyear contract, is not compatible with the program's stated needs for
a contract that can be extended to a total of 15 years, because multiyear
contracts are limited to 5 years. Additionally, the Treasury OIG found
that Customs combined multiyear contracting with IDIQ contracting
practices. For example, it plans to use contract options to extend the
initial 5-year performance period.

CBP disagrees with the Treasury OIG conclusion.

To resolve the disagreement, DHS asked GAO to render a formal decision. We
are currently reviewing the matter.

Appendix I Briefing to Subcommittees on Homeland Security, House and
Senate Committees on Appropriations

Condition 4. DHS and OMB satisfied the condition that the plan be reviewed
and approved by the DHS IRB, the Secretary of Homeland Security, and OMB.

On August 18, 2004, the DHS IRB reviewed the ACE program, including ACE
fiscal year 2005 cost, schedule, and performance plans. The DHS Deputy
Secretary, who chairs the IRB, delegated further review of the fiscal year
2005 efforts, including review and approval of the fiscal year 2005 ACE
expenditure plan, to the Under Secretary for Management, with support from
the Chief Financial Officer, Chief Information Officer, and Chief
Procurement Officer, all of whom are IRB members. The Under Secretary for
Management approved the expenditure plan on behalf of the Secretary of
Homeland Security on November 8, 2004.

OMB approved the plan on October 15, 2004.

Appendix I Briefing to Subcommittees on Homeland Security, House and
Senate Committees on Appropriations

Condition 5. GAO satisfied the condition that it review the plan. Our
review was completed on December 17, 2004.

Appendix I Briefing to Subcommittees on Homeland Security, House and
Senate Committees on Appropriations

Open recommendation 1: Develop and implement a rigorous and analytically
verifiable cost estimating program that embodies the tenets of effective
estimating as defined in SEI's institutional and project-specific
estimating models.1

Status: In progress

CBPMO has taken several steps to strengthen its cost estimating program.
First, the program office has defined and documented processes for
estimating expenditure plan costs (including management reserve costs).
Second, it hired a contractor to develop cost estimates, including
contract task orders, that are independent of eCP's estimates. Third, it
tasked a support contractor with evaluating the independent and eCP
estimates against SEI criteria. According to the summary-level results of
this evaluation, the independent estimates either satisfied or partially
satisfied the SEI criteria, and eCP's estimates satisfied or partially
satisfied all but two of the seven SEI criteria (these were the criteria
for calibration of estimates using actual experience and for adequately
reflecting program risks in estimates). CBPMO officials have not yet
provided us with the detailed results of this analysis because the
results have not yet been approved.

1For these models, see SEI's Checklists and Criteria for Evaluating the
Cost and Schedule Estimating Capabilities of Software Organizations and A
Manager's Checklist for Validating Software Cost and Schedule Estimates.

Appendix I Briefing to Subcommittees on Homeland Security, House and
Senate Committees on Appropriations

Open recommendation 2: Ensure that future expenditure plans are based on
cost estimates that are reconciled with independent cost estimates.

Status: Complete1

In August 2004, CBP's support contractor completed an analysis comparing
the cost estimates in the fiscal year 2005 expenditure plan, which are
based on eCP's cost estimates, with the estimate prepared by CBPMO's
independent cost estimating contractor. This analysis, which was completed
3 months before the fiscal year 2005 expenditure plan was submitted to the
Appropriations Committees, states that the two estimates are consistent.

1With respect to the fiscal year 2005 expenditure plan.

Appendix I Briefing to Subcommittees on Homeland Security, House and
Senate Committees on Appropriations

Open recommendation 3: Immediately develop and implement a human capital
management strategy that provides both near-and long-term solutions to
program office human capital capacity limitations, and report quarterly to
the appropriations committees on the progress of efforts to do so.

Status: In progress

According to the expenditure plan, CBPMO has since developed a
modernization staffing plan that identifies the positions and staff it
needs to effectively manage ACE. However, CBPMO did not provide this plan
to us because it was not yet approved. Moreover, program officials told us
that the staffing plan is no longer operative because it was developed
before December 2004, when a modernization office reorganization was
implemented. As part of this reorganization, CBP transferred into CBPMO
the government and contractor personnel responsible for the Automated
Commercial System,1 the Automated Targeting System,2 and ACE training
from non-CBPMO organizational units. This change is expected to eliminate
redundant ACE-related program management efforts.

1The Automated Commercial System is CBP's system for tracking,
controlling, and processing imports to the United States.

2The Automated Targeting System is CBP's system for identifying import
shipments that warrant further attention.

Appendix I Briefing to Subcommittees on Homeland Security, House and
Senate Committees on Appropriations

Following our recommendation, CBPMO provided reports dated March 31, 2004,
and June 30, 2004, to the appropriations committees on its human capital
activities, including development of the previously mentioned staffing
plan and related analysis to fully define CBPMO positions. Additionally,
it has reported on efforts to ensure that all modernization office staff
members complete a program management training program.

Appendix I Briefing to Subcommittees on Homeland Security, House and
Senate Committees on Appropriations

Open Recommendation 4: Have future ACE expenditure plans specifically
address any proposals or plans, whether tentative or approved, for
extending and using ACE infrastructure to support other homeland security
applications, including any impact on ACE of such proposals and plans.

Status: In progress

The ACE Program Plan states that ACE provides functions that are directly
related to the "passenger business process" underlying the U.S. Visitor
and Immigrant Status Indicator Technology (US-VISIT) program,1 and
integration of certain ACE and US-VISIT components is anticipated. In
recognition of this relationship, the expenditure plan states that CBPMO
and US-VISIT are working together to identify lessons learned, best
practices, and opportunities for collaboration.

1US-VISIT is a governmentwide program to collect, maintain, and share
information on foreign nationals for enhancing national security and
facilitating legitimate trade and travel, while adhering to U.S. privacy
laws and policies.

Appendix I Briefing to Subcommittees on Homeland Security, House and
Senate Committees on Appropriations

Specifically:

o 	In February 2004, ACE and US-VISIT managers met to identify potential
areas for collaboration between the two programs and to clarify how the
programs can best support the DHS mission and provide officers with the
information and tools they need. During the meeting, US-VISIT and ACE
managers recognized that the system infrastructure built to support the
two programs is likely to become the infrastructure for future border
security processes and system applications. Further, they identified four
areas of collaboration: business cases; program management; inventory; and
people, processes, and technology. These areas were later refined to be as
follows:

o 	Program Management coordination, which includes such activities as
creating a high-level integrated master schedule for both programs and
sharing acquisition strategies, plans, and practices;

o 	Business Case coordination, including such business case activities as
OMB budget submissions and acquisition management baselines;

Appendix I Briefing to Subcommittees on Homeland Security, House and
Senate Committees on Appropriations

o 	Inventory, which includes identifying connections between legacy
systems and establishing a technical requirements and architecture team to
review, among other things, system interfaces, data formats, and system
architectures; and

o 	People, Processes, and Technology, which includes establishing teams to
review deployment schedules and establishing a team and process to review
and normalize business requirements.

According to CBPMO, scheduling and staffing constraints prevented any
collaboration activities from taking place between February and July 2004.
In August 2004, the US-VISIT and ACE programs tasked their respective
contractors to form collaboration teams to address the four areas
identified at the February meeting. Nine teams were formed:

o 	DHS investment management
o 	Business
o 	Organizational change management
o 	Facilities
o 	Information and data
o 	Technology
o 	Privacy and security
o 	Deployment, operations, and maintenance
o 	Program management

Appendix I Briefing to Subcommittees on Homeland Security, House and
Senate Committees on Appropriations

In September 2004, the teams met to develop team charters, identify
specific collaboration opportunities, and develop timelines and next
steps. In October 2004, CBPMO and US-VISIT program officials were briefed
on the progress and activities of the collaboration teams.

Appendix I Briefing to Subcommittees on Homeland Security, House and
Senate Committees on Appropriations

Open recommendation 5: Establish an IV&V function to assist CBP in
overseeing contractor efforts, such as testing, and ensure the
independence of the IV&V agent.

Status: Complete

According to ACE officials, to ensure independence they have selected an
IV&V contractor that has had no prior involvement in the modernization
program. These officials stated that the IV&V contractor will be
responsible for reviewing ACE products and management processes, and will
report directly to the CBP CIO. Award of this contract is to occur on
December 30, 2004.

Appendix I Briefing to Subcommittees on Homeland Security, House and
Senate Committees on Appropriations

Open recommendation 6: Define metrics, and collect and use associated
measurements, for determining whether prior and future program management
improvements are successful.

Status: In progress

CBPMO has implemented a metrics program that generally focuses on
measuring eCP's performance through the use of earned value management
(EVM), deliverable timeliness and quality metrics, and risk and issue
disposition reporting. Additionally, CBPMO is planning to broaden its
program to encompass metrics and measures for determining progress toward
achieving desired business results and acquisition process maturity. The
plan for expanding the metrics program is scheduled for approval in early
2005.

One part of CBPMO's metrics program that it has implemented relates to EVM
for its contract with eCP. EVM is a widely accepted best practice for
measuring contractor progress toward meeting deliverables by comparing the
value of work accomplished during a given period with that of the work
expected in that period. Differences from expectations are measured in the
form of both cost and schedule variances.

Appendix I Briefing to Subcommittees on Homeland Security, House and
Senate Committees on Appropriations

o 	Cost variances compare the earned value of the completed work with the
actual cost of the work performed. For example, if a contractor completed
$5 million worth of work and the work actually cost $6.7 million, there
would be a -$1.7 million cost variance. Positive cost variances indicate
that activities are costing less than budgeted, while negative variances
indicate that activities are costing more than budgeted.

o 	Schedule variances, like cost variances, are measured in dollars, but
they compare the earned value of the work completed to the value of work
that was expected to be completed. For example, if a contractor completed
$5 million worth of work at the end of the month, but was budgeted to
complete $10 million worth of work, there would be a -$5 million schedule
variance. Positive schedule variances show that activities are being
completed sooner than planned. Negative variances show activities are
taking longer than planned.

In accordance with EVM principles, eCP reports on its financial
performance monthly. These reports provide detailed information on cost
and schedule performance on work segments in each task order. Cost and
schedule variances that exceed a certain threshold are further examined to
determine the root cause of the variance, the impact on the program, and
mitigation strategies.
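To make the variance arithmetic above concrete, the following minimal
sketch (in Python) computes cost and schedule variances and flags those
that exceed a reporting threshold. The dollar amounts are the
hypothetical figures from the examples above, and the threshold value is
an illustrative assumption, not eCP's actual reporting rule.

    # Minimal sketch of the EVM variance arithmetic described above.
    # Dollar figures are the hypothetical values from the text, in millions;
    # the reporting threshold is an illustrative assumption.

    def cost_variance(earned_value, actual_cost):
        # CV = value of work accomplished minus what that work actually cost
        return earned_value - actual_cost

    def schedule_variance(earned_value, planned_value):
        # SV = value of work accomplished minus value of work planned to date
        return earned_value - planned_value

    cv = cost_variance(earned_value=5.0, actual_cost=6.7)         # -1.7
    sv = schedule_variance(earned_value=5.0, planned_value=10.0)  # -5.0

    THRESHOLD = 1.0  # hypothetical variance threshold, in millions
    for name, value in (("cost", cv), ("schedule", sv)):
        if abs(value) > THRESHOLD:
            print(f"{name} variance of {value:+.1f}M warrants root cause analysis")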

Appendix I Briefing to Subcommittees on Homeland Security, House and
Senate Committees on Appropriations

Open recommendation 7: Reconsider the ACE acquisition schedule and cost
estimates in light of early release problems, including these early
releases' cascading effects on future releases and their relatively small
size compared to later releases, and in light of the need to avoid the
past levels of concurrency among activities within and between releases.

Status: Complete

As we previously reported, the cost estimate for Releases 3 and 4 had
grown to $185.7 million, which was about $36.2 million over the contract
baseline, and further overruns were likely.1 Subsequently, the Release 3
and 4 cost overrun grew to an estimated $46 million, resulting in CBPMO
and eCP establishing a new cost baseline for Releases 3 and 4 of $196
million. eCP began reporting performance against this new baseline in
April 2004. Further, in July 2004, CBPMO and eCP changed the associated
contract task order baseline completion date from September 15, 2004, to
May 30, 2005, revised the associated interim task order milestones, and
began reporting schedule performance relative to the new baselines.

1GAO, Information Technology: Early Releases of Customs Trade System
Operating, but Pattern of Cost and Schedule Problems Needs to Be
Addressed, GAO-04-719 (Washington, D.C.: May 14, 2004).

Appendix I Briefing to Subcommittees on Homeland Security, House and
Senate Committees on Appropriations

In July 2004, eCP also rebaselined the ACE program, producing a new
version of the ACE Program Plan. The new baseline extends delivery of the
last ACE release from fiscal year 2007 to fiscal year 2010 and adds a new
screening and targeting release. The new program plan also provides a new
ACE life-cycle cost estimate of $3.1 billion,1 which is a $1 billion
increase over the previous life-cycle cost estimate. According to the
expenditure plan, the new schedule reflects less concurrency between
releases. The following figure compares previous and current schedules for
ACE releases and shows a reduction in the level of concurrency between
releases.

1CBP's ACE life-cycle cost estimate adjusted for risk is about $3.3
billion.

Appendix I Briefing to Subcommittees on Homeland Security, House and
Senate Committees on Appropriations

ACE Schedule as of October 2003 Compared with November 2004 Version

Appendix I Briefing to Subcommittees on Homeland Security, House and
Senate Committees on Appropriations

Open recommendation 8: Report quarterly to the House and Senate
Appropriations Committees on efforts to address open GAO recommendations.

Status: In progress

CBPMO submitted reports to the Committees on its efforts to address open
GAO recommendations for the quarters ending March 31, 2004, and June 30,
2004. CBPMO plans to submit a report for the quarter ending September 30,
2004, after it is approved by DHS and OMB.

Appendix I Briefing to Subcommittees on Homeland Security, House and
Senate Committees on Appropriations

For each day that the system did not meet the service level agreement, eCP
identified the root cause. For example, one of the incidents was due to
insufficient shutdown and startup procedures and another was caused by an
incorrectly configured Java Archive (JAR) file.1 eCP also reported on
actions taken to prevent a recurrence of the problem. For example, eCP
reported that it amended the startup and shutdown procedures and made
operators aware of the changes, and that it implemented steps for
correctly capturing changes to JAR file configurations.

The November 10 to November 14 incidents were all attributed to a single
cause: a defect in a software update that allowed some trade users to
inappropriately view account information on other trade accounts.
According to the root cause analysis report, eCP corrected the software
error and then manually reviewed each account to ensure that permissions
had been set appropriately. However, this report also raised questions as
to whether system updates were being executed without regard to risk
mitigation in order to meet mandated schedules.

1JavaTM Archive (JAR) files bundle multiple class files and auxiliary
resources associated with applets and applications into a single archive
file.

Appendix I Briefing to Subcommittees on Homeland Security, House and
Senate Committees on Appropriations

Observation 2: Progress toward establishing ACE user accounts has not met
expectations.

CBPMO established a goal of activating 1,100 ACE importer accounts by
February 25, 2005, which is when Release 4 is to become operational.
According to CBP, it is expected that the 1,100 accounts will represent
more than 50 percent of total import duty collected at ports.

To help measure progress toward reaching the overall goal of 1,100
accounts, CBPMO established weekly targets. One target was to have 600
accounts activated by November 26, 2004. However, CBPMO reported that
only 311 accounts had been activated as of that date, about 48 percent
below the interim target. In addition, since October 1, 2004, CBPMO has
not reached any of its weekly targets, and the gap between the actual and
targeted number of activated accounts has grown. As of December 15, 2004,
CBPMO reported that 347 accounts had been activated. Further, CBPMO
officials said that they expect rapid growth in activated accounts as
Release 4 is deployed. The following figure shows the trend in target
versus actual accounts activated.

Appendix I Briefing to Subcommittees on Homeland Security, House and
Senate Committees on Appropriations

Target Versus Actual Activated ACE Accounts

Appendix I Briefing to Subcommittees on Homeland Security, House and
Senate Committees on Appropriations

CBPMO officials stated that they are currently analyzing the reasons for
the lower than expected number of user accounts. They also stated that
they have initiated more aggressive techniques to inform the trade
community about ACE benefits and to clarify the steps to participate.
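For reference, the 48 percent shortfall against the November 26 interim
target cited in this observation follows directly from the reported
figures, as in the brief Python sketch below.

    # Shortfall against the November 26, 2004, interim target for
    # activated ACE accounts, using the figures reported above.

    target, actual = 600, 311
    shortfall_pct = 100.0 * (target - actual) / target
    print(round(shortfall_pct))  # 48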

Appendix I Briefing to Subcommittees on Homeland Security, House and
Senate Committees on Appropriations

Observation 3: Release 3 testing and pilot activities were delayed and
have produced system defect trends that raise questions about decisions to
pass key milestones and about the state of system maturity.

Development of each ACE release includes system integration and system
acceptance testing, followed by a pilot period that includes user
acceptance testing. Generally, the purpose of these tests is to identify
defects or problems either in meeting defined system requirements or in
satisfying system user needs. The purpose of the associated readiness
reviews is to ensure that the system satisfies criteria for proceeding to
the next stage of testing or operation.

Tests and their related milestones are described in the following table.

Appendix I Briefing to Subcommittees on Homeland Security, House and
Senate Committees on Appropriations

Release 3 integration testing started later than planned, took longer
than expected, and was declared successful despite open defects that
prevented the system from performing as intended.

In September 2003, Release 3 system integration testing (SIT) was
scheduled to start on December 24, 2003, and last for 43 days. However,
the start of SIT testing was delayed until February 18, 2004, or about 2
months, and it lasted 56 days, or about 2 weeks longer than planned.

CBPMO officials attributed the delays in Release 3 testing to Release 2
testing delays, which caused the shared test environments to be delivered
late to Release 3, and to staff being held on Release 2 longer than
planned. These officials also explained that the additional 2 weeks for
Release 3 integration testing were due to the aforementioned late
delivery of test environments, as well as to last-minute design and
development changes.

Appendix I Briefing to Subcommittees on Homeland Security, House and
Senate Committees on Appropriations

Release 3 SIT consisted of 85 test cases, all of which reportedly either
passed or passed with exceptions. Those tests passing with exceptions
generated defects, but because none of the test cases were judged to have
completely failed, SIT was declared to be successfully executed. The test
readiness review (TRR) milestone approval was granted because the approval
criteria did not stipulate that all critical and severe defects had to be
resolved, but rather that they either had to be resolved or have approved
work-off plans in place. As a result, TRR approval occurred on April 26,
2004, even though CBPMO reported that 2 critical and 90 severe defects
were open at this time. Of these 92 open defects, the 2 critical ones
were reported to have been closed 2 days after TRR, and 77 of the
remaining severe defects were closed within the next 2 weeks. The
remaining severe defects were largely closed, according to CBP, 4 weeks
after TRR, with the final 3 closed on June 21, 2004, 8 weeks after TRR.

Given that critical defects by definition prevent the system from
performing mission-essential operations or jeopardize safety and
security, among other things, and severe defects prevent the system from
working as intended or produce errors that degrade system performance,
using criteria that permit one phase of testing to be concluded and
another phase to begin despite a large number of such problems introduces
unnecessary risk.

Appendix I Briefing to Subcommittees on Homeland Security, House and
Senate Committees on Appropriations

Moreover, using such exit criteria represents a significant change from
the practice CBPMO followed on prior ACE releases, in which TRR could not
be passed if any critical defects were present, and Production Readiness
Review (PRR) could not be passed if any critical or severe defects were
present. In effect, this change in readiness review exit criteria creates
hidden overlap among test phases, as work to resolve defects from a prior
phase of testing occurs at the same time that work is under way to execute
a subsequent phase of testing. As we have previously reported, such
concurrency among test phases has contributed to a recurring pattern of
ACE release commitments not being met.
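The change in exit criteria described above can be expressed as a simple
check. The sketch below, in Python, contrasts a strict rule (no
unresolved blocking defects at the milestone; under the prior practice
the blocking set was critical defects at TRR and critical or severe
defects at PRR) with the relaxed Release 3 rule (blocking defects are
acceptable if covered by approved work-off plans). It is purely
illustrative, with hypothetical defect records, and is not CBPMO's actual
review tooling.

    # Illustrative contrast of strict versus relaxed readiness-review exit
    # criteria; hypothetical defect records, not CBPMO's actual process.

    from dataclasses import dataclass

    @dataclass
    class Defect:
        severity: str           # "critical", "severe", "moderate", or "minor"
        resolved: bool
        has_workoff_plan: bool  # an approved plan to resolve the defect later

    def passes_strict(defects, blocking):
        # Prior practice: no unresolved blocking defects may remain open
        # (critical at TRR; critical or severe at PRR).
        return all(d.resolved for d in defects if d.severity in blocking)

    def passes_relaxed(defects, blocking):
        # Release 3 practice: blocking defects must be resolved or have an
        # approved work-off plan.
        return all(d.resolved or d.has_workoff_plan
                   for d in defects if d.severity in blocking)

    open_defects = [Defect("critical", resolved=False, has_workoff_plan=True),
                    Defect("severe", resolved=False, has_workoff_plan=True)]

    print(passes_strict(open_defects, blocking=("critical",)))            # False
    print(passes_relaxed(open_defects, blocking=("critical", "severe")))  # True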

Appendix I Briefing to Subcommittees on Homeland Security, House and
Senate Committees on Appropriations

Release 3 acceptance testing started later than planned, concluded later
than planned, and was declared successful despite a material inventory of
open defects.

Release 3 system acceptance testing (SAT) was planned to start on March 5,
2004, and last for 38 days. Because of delays caused by changes to the
requirements baseline affecting the development of test cases, SAT began
on May 7, 2004, about 2 months later than planned, and before all severe
SIT defects were closed. In order to avoid further Release 3 schedule
delays and maintain the PRR date of May 28, 2004, the SAT period was
shortened from 38 to 20 days, or approximately half of the originally
planned period. CBPMO officials noted that the program completed SAT in
the compressed schedule by investing the additional resources needed to
conduct tests 7 days a week, often for up to 12 hours each day.

Release 3 SAT consisted of 28 test cases, all of which reportedly passed
successfully. During the SAT test period from May 7 to May 27, 2004, 3
critical, 129 severe, and 19 moderate defects were found.

Appendix I Briefing to Subcommittees on Homeland Security, House and
Senate Committees on Appropriations

The exit criteria for Release 3 PRR also stipulated that all critical and
severe defects either be resolved or have work-off plans identified. At
the time of the PRR on May 28, 2004, CBP reported that 18 severe defects
remained open. According to CBP, because these defects were determined not
to pose an unacceptable risk to the system, their closure was
intentionally delayed until after PRR. However, such defects, according to
CBPMO's own definition, preclude the system from working as intended or
produce errors that degrade system performance. This is one reason why
guidance on effective test practices generally advocates closing such
defects before concluding one phase of testing and beginning the next.

Appendix I Briefing to Subcommittees on Homeland Security, House and
Senate Committees on Appropriations

Release 3 pilot activities, including user acceptance testing, were
declared successful, despite severe defects remaining open.

Two major activities conducted during the Release 3 Pilot Performance
Period were training for CBP and trade users and user acceptance testing
(UAT). This pilot period lasted from PRR on May 28, 2004, until ORR on
August 25, 2004.

In training to prepare users to operate Release 3, business scenarios were
used that reflected daily job functions; training was conducted over an
8- or 4-week period for CBP and trade users, respectively. This training
received an average user satisfaction score of about 4 on a 1 to 5 scale,
which is defined as "very good."

Release 3 UAT consisted of CBP and trade users executing 19 and 23 test
cases, respectively, and rating the release in several areas, again using
a 1 to 5 scale (with 1 indicating "very dissatisfied" and 5 indicating
"very satisfied"). The test areas were to address the major functionality
that is new or was significantly changed from Release 2.

Appendix I Briefing to Subcommittees on Homeland Security, House and
Senate Committees on Appropriations

UAT average user satisfaction scores were 4.0 ("satisfied") for trade
users and 3.5 ("somewhat satisfied") for CBP users. According to CBPMO
officials, the target score was 4.0. A reason cited for the lower scores
for CBP users was that testing included a large number of less experienced
users, who tended to be more critical of ACE than users who had more
experience with the system.

The pilot period also produced a total of 191 defects, including 5
critical, 74 severe, 48 moderate, and 64 minor defects. CBPMO reported
that 6 of the 74 severe defects remained open at ORR on August 25, 2004.

Similar to the TRR and PRR exit criteria, the criteria for passing Release
3 ORR stipulated that all critical and severe defects either be resolved
or have work-off plans in place at the time of ORR. According to CBPMO,
all defects that were open at ORR either had an acceptable work-around in
place, or CBPMO expected that they would not adversely affect the use of
the system. However, by definition, severe defects adversely affect system
performance, and if an acceptable workaround exists, they are categorized
as moderate defects, not severe defects.

Appendix I Briefing to Subcommittees on Homeland Security, House and
Senate Committees on Appropriations

Trends in Defects during the Release 3 Testing Period, Including the
Number of Open Defects by Severity Classification at the Time of the
Readiness Reviews

Appendix I Briefing to Subcommittees on Homeland Security, House and
Senate Committees on Appropriations

Current state of Release 3 maturity is unclear because defect data since
user acceptance testing are not reliable.

Having current and accurate information on system defect density is
necessary to adequately understand system maturity and to make informed
decisions about allocation of limited resources in meeting competing
priorities. Since the Release 3 ORR, available data show that Release 3 is
operating with longstanding defects and that new defects have not been
closed. For example, the defect data as of November 30, 2004, show that 18
defects that were open at TRR were still open (11 moderate and 7 minor);
33 defects open at PRR were still open (16 moderate and 17 minor); and 92
defects open at ORR were still open (2 severe, 43 moderate, and 47 minor).
In addition, the data show that 43 defects opened since ORR (23 severe, 8
moderate, and 12 minor) were still open as of November 30, 2004. However,
CBPMO officials told us that these data are not reliable because the focus
has been on completing Release 4 testing and pilot activities, at the
expense of keeping Release 3 defect data current and accurate. As a
result, CBPMO does not currently have a complete picture of the maturity
of each of its releases on which to base internal resource allocation
decisions.

Appendix I Briefing to Subcommittees on Homeland Security, House and
Senate Committees on Appropriations

Observation 4: Release 4 test phases were delayed and overlapped, and
revealed a higher than expected volume and significance of defects,
raising questions about decisions to pass key milestones and about the
state of system maturity.

As previously discussed, each ACE release is subject to SIT and SAT, which
are conducted by eCP. Each release also undergoes UAT, which is conducted
by CBP. Generally, the purpose of these tests is to identify defects or
problems in either meeting defined system requirements or in satisfying
system user needs. Defects are documented as PTRs that are classified by
severity. The four severity levels are (1) critical, (2) severe, (3)
moderate, and (4) minor.

Appendix I Briefing to Subcommittees on Homeland Security, House and
Senate Committees on Appropriations

Release 4 testing revealed a considerably higher than expected number of
material defects.

Before initiating Release 4 testing, eCP forecasted and planned for
resolution of an expected number of defects. Specifically, 2,018 total
defects were estimated to be found by the time of PRR. Of the 2,018, 343
were to be critical, 1,110 severe, 383 moderate, and 182 minor. However,
at the time of PRR on November 23, 2004, 3,757 total defects were
reported, which is about 86 percent more than expected. Moreover, the
significance of the defects was underestimated; 835 critical defects were
reported (143 percent more than expected), and 2,224 severe defects were
reported (100 percent more than expected).
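As a check on the percentages cited above, the following short Python
sketch reproduces the "percent more than expected" figures from the
reported defect counts.

    # Reproduces the "percent more than expected" figures cited above from
    # the estimated and actual Release 4 defect counts.

    expected = {"critical": 343, "severe": 1110, "moderate": 383, "minor": 182}
    actual_total, actual_critical, actual_severe = 3757, 835, 2224

    def percent_over(actual, estimate):
        return 100.0 * (actual - estimate) / estimate

    print(round(percent_over(actual_total, sum(expected.values()))))   # 86
    print(round(percent_over(actual_critical, expected["critical"])))  # 143
    print(round(percent_over(actual_severe, expected["severe"])))      # 100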

The following figure depicts the estimated and actual Release 4 defects
according to their severity level.

Appendix I Briefing to Subcommittees on Homeland Security, House and
Senate Committees on Appropriations

Release 4 Expected Versus Actual Defects by Severity

Appendix I Briefing to Subcommittees on Homeland Security, House and
Senate Committees on Appropriations

eCP officials attributed the difference between estimated and actual
Release 4 defects to their underestimating the complexity of developing
the release, and thus underestimating the likely number of defects.

As a result of this significantly higher than expected number and severity
of defects, eCP drew resources from a later release and, as discussed
later, passed PRR with 5 critical and 37 severe defects.

The following figure depicts the total number of expected Release 4
defects in comparison to the actual number of defects identified.

Appendix I Briefing to Subcommittees on Homeland Security, House and
Senate Committees on Appropriations

Release 4 Expected Versus Actual Defects over Time

Appendix I Briefing to Subcommittees on Homeland Security, House and
Senate Committees on Appropriations

Release 4 integration and acceptance testing schedule changes resulted in
tests being conducted concurrently.

According to the testing schedule, Release 4 SIT was scheduled to start on
May 12, 2004, and to finish on October 1, 2004. However, SIT was started
on June 28, 2004 (approximately 7 weeks later than planned) and completed
on November 23, 2004 (approximately 8 weeks later than planned).

According to the same testing schedule, SAT was scheduled to start on
October 19, 2004, and to last 39 days. However, SAT was started on
November 1, 2004, and was completed on November 23, 2004, thus lasting for
23 days. According to eCP's actual testing schedule, the SAT period was
shortened by 16 days, in order to reduce the impact of previous schedule
delays and conduct the planned PRR by November 23.

Further, the testing schedule planned for no concurrency between SIT and
SAT. However, SIT and SAT were actually conducted concurrently, which, as
we previously reported, increases risk and contributed to past ACE cost
and schedule problems (see next slide). According to program officials,
rather than waiting for SIT to be fully completed before starting SAT,
they began SAT on Release 4 functionality that had successfully completed
SIT.

Appendix I Briefing to Subcommittees on Homeland Security, House and
Senate Committees on Appropriations

Release 4 SIT and SAT Time Frames

Appendix I Briefing to Subcommittees on Homeland Security, House and
Senate Committees on Appropriations

Release 4 defect profile shows improvements in resolving defects, but
critical and severe defects remain in operational system.

The number of open Release 4 defects peaked on October 8, 2004, when there
were 59 critical, 243 severe, and 59 moderate defects open. CBPMO reports
that since then, many of these defects have been closed.

CBPMO's criteria for successfully passing PRR require that all critical
and severe defects be resolved or have work-off plans. At the time of PRR
on November 23, 2004, CBPMO reported that most defects were closed, with
the exception of 5 critical and 37 severe defects for which it had
established or intended to establish work-off plans. However, as of
November 30, 2004, about 1.5 weeks from deployment of the Release 4
pilot, 3 critical defects and 30 severe defects remained open.

The following graph shows the number of defects open each week during
Release 4 testing.

Appendix I Briefing to Subcommittees on Homeland Security, House and
Senate Committees on Appropriations

Release 4 Defect Trend

Appendix I Briefing to Subcommittees on Homeland Security, House and
Senate Committees on Appropriations

Observation 5: Performance against the revised cost and schedule estimates
for Releases 3 and 4 has been mixed.

Because the Release 3 and 4 contract was experiencing significant cost
and schedule overruns, CBPMO established a new baseline, referred to as
the Over Target Baseline (OTB), in April 2004. Program performance
against the
OTB is measured using EVM cost variances and schedule variances. Release 3
and 4 cost performance against the new baseline has been positive, but the
schedule performance has not.

The chart on the following slide illustrates the cumulative cost variance
on Release 3 and 4 since the OTB was established.

Appendix I Briefing to Subcommittees on Homeland Security, House and
Senate Committees on Appropriations

As shown on the previous slide, eCP recovered about $1.4 million of the
schedule variance between August 2004 and October 2004 but still has not
completed $1.5 million worth of scheduled work. According to eCP, the
recent improvement in schedule performance reflects recent completion of
such work as Release 4 testing.

While cost performance on Release 3 and 4 has been positive since the new
baseline was established, schedule performance has not. In order to meet
Release 4 schedule commitments, resources have been held on Release 4
longer than planned to complete testing and resolve defects. While this
has resulted in an improvement in schedule performance in September and
October 2004, it has also contributed to a slip in cost performance in
October 2004. Continuing to devote extra resources to meet the Release 4
schedule could further erode the currently positive cost variance.

Appendix I Briefing to Subcommittees on Homeland Security, House and
Senate Committees on Appropriations

Observation 6: The fiscal year 2005 expenditure plan does not adequately
describe progress against commitments (e.g., ACE capabilities, schedule,
cost, and benefits) made in previous plans.

ACE is intended to provide greater security at our nation's borders while
improving import and export processing, and its latest life-cycle cost
estimate is about $3.1 billion. Given ACE's immense importance and sizable
cost and complexity, the Congress has placed limitations on the use of
program funds until it is assured, through the submission of periodic
expenditure plans, that the program is being well managed.

As we have previously reported, to permit meaningful congressional
oversight, it is important that expenditure plans describe how well CBP is
progressing against the commitments made in prior expenditure plans.1
However, the fiscal year 2005 expenditure plan did not adequately describe
such progress. In particular, in its fiscal year 2004 expenditure plan,
CBPMO committed to, for example,

o 	acquiring infrastructure (e.g., system environments, facilities,
telecommunications, and licenses) for ACE releases and

1GAO, Information Technology: Homeland Security Needs to Improve Entry
Exit System Expenditure Planning, GAO-03-563 (Washington, D.C.: June 9,
2003).

Appendix I Briefing to Subcommittees on Homeland Security, House and
Senate Committees on Appropriations

                                  Observations

o 	defining and designing the ACE release (designated Release 6 at the
time) that is intended to provide additional account management
functionality.

The fiscal year 2005 plan, however, did not address progress against these
commitments. For example, the plan did not describe the status of
infrastructure acquisition, nor did it discuss the expenditure of the
$106.6 million requested for this purpose. While the plan did discuss the
status of the initial ACE releases, it did not describe progress toward
defining and designing the functionality that was to be in the former
Release 6.

Also, the fiscal year 2005 expenditure plan included a schedule for
developing ACE releases, but neither reported progress relative to the
schedule presented in the fiscal year 2004 plan nor explained how the
individual releases and their respective schedules were affected by the
rebaselining that occurred after the fiscal year 2004 plan was submitted.

Further, while the fiscal year 2005 expenditure plan contained high-level
descriptions of the functionality provided by Releases 1 and 2, it did not
describe progress toward achieving the benefits they are expected to
provide.

Without such information, meaningful congressional oversight of CBP
progress and accountability is impaired.

Appendix I Briefing to Subcommittees on Homeland Security, House and
Senate Committees on Appropriations

Observation 7: Some key bases for the commitments made in the fiscal year
2005 expenditure plan have changed, raising questions as to the plan's
currency and relevance.

The ACE fiscal year 2005 expenditure plan is based largely on the July 8,
2004, ACE Program Plan. This July plan represents the program's
authoritative and operative guiding document, or plan of action. Briefly,
it describes such things as the ACE release construct, development
methodology, deployment strategy, organizational change approach, training
approach, and role/responsibility assignments. It also identifies key
assumptions made in formulating the plan, provides a schedule for
accomplishing major program activities, and contains estimates of costs
for the total program and major activities.

Recent program developments and program changes have altered some key
bases (e.g., assumptions, release construct, organizational change
management approach, and roles and responsibilities) of the ACE program
plan, and thus the current expenditure plan. As a result, questions arise
as to the extent to which the expenditure plan's commitments remain
current and relevant.

Appendix I Briefing to Subcommittees on Homeland Security, House and
Senate Committees on Appropriations

A key Release 5 assumption underpinning the program and expenditure plans
is no longer valid.

Release 5 is to include the capability to receive a multimodal manifest
that can be screened for risk indicators. According to the ACE program
plan, delivery of this capability is to be accomplished using the SAP
software product, which the SAP vendor was expected to enhance because its
product does not currently contain the functionality to accommodate
multimodal manifests. This expectation for product enhancement, within
certain time and resource constraints, was an assumption in the ACE
program plan, and was to be accomplished under a contract between eCP and
the SAP vendor.

Following the program plan's approval, initial development of Release 5
began (e.g., planning for the release, negotiations to enhance the SAP
product, development of release initiation documents, conduct of release
functionality workshops). However, CBPMO has recently decided not to use
SAP to provide the multimodal manifest functionality, thus rendering a key
assumption in the program plan and the expenditure plan invalid. CBPMO has
since suspended all work to develop the multimodal manifest functionality
until a new approach to developing it is established. According to ACE
officials, this change is intended to result in providing the multimodal
manifest functionality faster and at lower cost.

Appendix I Briefing to Subcommittees on Homeland Security, House and
Senate Committees on Appropriations

Additional release(s) are now planned that were not reflected in the
program and expenditure plans.

CBPMO now plans to add at least one new ACE release. According to CBPMO
officials, the need for additional Release 4 functionality was expressed
by various user groups during the development of this release. This
functionality was not in the scope of Release 4 and includes, for
example, the capability for trade users to look up transactions and for
carriers to receive feedback on the release of vehicles. In addition, the
need for ACE to more easily accommodate new legislative mandates was
identified. Therefore, a Release 4 enhancement, referred to as Release
4.1, has been added to the ACE release construct.

In October 2004, CBPMO defined high-level functional requirements for
Release 4.1, and it is currently defining more detailed requirements.
However, this additional release, including its scope, costs, and
schedule, is not reflected in the current ACE program plan or the fiscal
year 2005
expenditure plan. According to program officials, any enhancement releases
will not be reflected in the program plan until its next major update
(August 2005), which is after CBPMO anticipates having implemented Release
4.1, and the first expenditure plan that could recognize it is the fiscal
year 2006 plan.

Appendix I Briefing to Subcommittees on Homeland Security, House and
Senate Committees on Appropriations

ACE officials also stated that the costs of Release 4.1 and any additional
releases will be funded by operations and maintenance funds provided for
in the expenditure plan.

Appendix I Briefing to Subcommittees on Homeland Security, House and
Senate Committees on Appropriations

The current organizational change management approach is not fully
reflected in program and expenditure plans, and key change management
actions are not to be implemented.

As we have previously reported, best practices for acquiring and
implementing commercial component-based systems include ensuring that the
organizational impact of introducing functionality embedded in the
commercial software products, like SAP, is proactively managed.1
Accordingly, about 2 years ago we first discussed with ACE program
executives the need to proactively prepare users for role, responsibility,
and business process changes associated with ACE implementation. To its
credit, the ACE program plan describes the organizational change approach
that is to be pursued to position CBP for these changes. Specifically, the
plan discusses three primary activities that are to be performed:
communicating and reaching out to stakeholders; providing training; and
establishing a performance measurement structure.

On August 10, 2004, a revised organizational change approach was
introduced. This new approach introduces new change management
activities. As of November 2004, some of these activities are being or
are planned to be implemented.

1GAO, Information Technology: DOD's Acquisition Policies and Guidance Need
to Incorporate Additional Best Practices and Controls, GAO-04-722
(Washington, D.C.: July 2004).

Appendix I Briefing to Subcommittees on Homeland Security, House and
Senate Committees on Appropriations

These activities include conducting a communications campaign, mapping
employee roles with position descriptions, and providing learning aids and
help desk support.

However, because this revised organizational change approach was finalized
more than a month after the ACE Program Plan was completed, neither the
program plan nor the fiscal year 2005 expenditure plan fully reflects the
changes.

Moreover, because the ACE funding request for fiscal year 2005 did not
fully reflect the revised approach to managing organizational change, key
actions associated with the revised approach are not planned for
implementation in fiscal year 2005. For example, one key action was to
establish and communicate ACE usage targets, which would both encourage
ACE usage and permit performance to be measured. This is important,
according to eCP, because users may continue to rely on ACS, which would
preclude accrual of full ACE benefits. CBPMO officials stated that each of
the key actions that will not be implemented introduces risks that must be
mitigated. Formal program risks and associated mitigation plans are
currently under development. The following slide summarizes change
management actions in the revised approach that are not planned for
implementation and their associated risks.

Appendix I Briefing to Subcommittees on Homeland Security, House and
Senate Committees on Appropriations

Actions not planned for implementation and associated risk statements:

o 	Action: Establish and communicate targets for ACE usage to encourage
users to use ACE rather than ACS.
Risk: If ACS remains available to ACE users, they may continue to use
the legacy system, and as a result the full benefits of ACE will not be
realized.

o 	Action: Before training, make users aware of the major differences
between ACS and ACE.
Risk: If ACE users do not understand the differences between the legacy
systems and ACE, then the users will not understand how best to use
ACE, which may result in resistance to the new system and processes.

o 	Action: Discuss the future needs of CBP to establish new roles and
responsibilities within the Office of Information and Technology (OIT).
Risk: If future roles of the OIT are not established, then OIT may not
be prepared to provide technical support when ACE is transferred from
eCP to OIT.

o 	Action: Send staff to visit ports to build critical occupational
knowledge regarding change objectives.
Risk: If staff do not have adequate access to representatives of
organizational groups at each port, then communications, training, and
deployment efforts cannot be customized to each group's needs. This may
delay or disrupt ACE adoption.

Source: CBP.

Appendix I Briefing to Subcommittees on Homeland Security, House and
Senate Committees on Appropriations

Recent changes to the respective roles and responsibilities of the ACE
development contractor and CBPMO are not reflected in the program and
expenditure plans.

As previously mentioned, on April 27, 2001, eCP was awarded a contract to
develop and deploy ACE. The strategy was for the government to play the
role of the system acquirer and to leverage the expertise of eCP, which
was to be the system developer. Accordingly, CBPMO has since been
responsible for performing system acquisition functions (e.g., contract
tracking and oversight, evaluation of acquired products and services, and
risk management), and eCP has been responsible for system development
functions (e.g., requirements development; design, development, testing,
and deployment of Releases 1, 2, 3, and 4; and related services, including
architecture and engineering). These respective roles and responsibilities
are reflected in the ACE program plan, and thus the fiscal year 2005
expenditure plan.

According to CBPMO officials, these respective roles and responsibilities
are being realigned so that CBPMO and eCP will share ACE development
duties. That is, CBPMO will be responsible for certain ACE development and
deployment efforts as well as for oversight of the development efforts for
which eCP will retain responsibility. eCP will also provide support to
CBPMO's development efforts.

Appendix I Briefing to Subcommittees on Homeland Security, House and
Senate Committees on Appropriations

More detailed information on how this change in roles and responsibilities
will be operationalized was not yet available. Moreover, this change in
approach is not reflected in either the ACE program plan or the fiscal
year 2005 expenditure plan.

Nevertheless, this change in approach is significant, and thus it is
important that it be managed carefully. As we previously reported,
effective management of a large-scale systems modernization program, like
ACE, requires a clear allocation of the respective roles and
responsibilities of the government and the contractor,1 particularly with
regard to responsibility for integrating system components developed by
different parties. The extent to which these are made explicit and
unambiguous will go a long way in ensuring proper accountability for
performance.

1GAO, Tax Systems Modernization: Results of Review of IRS' Initial
Expenditure Plan, GAO/AIMD/GGD-99-206 (Washington, D.C.: June 1999).

Appendix I Briefing to Subcommittees on Homeland Security, House and
Senate Committees on Appropriations

DHS and OMB have largely satisfied four of the five conditions associated
with the fiscal year 2005 ACE expenditure plan that were legislated by the
Congress, and we have satisfied the fifth condition. Further, CBPMO has
continued to work toward implementing our prior recommendations aimed at
improving management of the ACE program and thus the program's chances of
success. Nevertheless, progress has been slow in addressing some of our
recommendations, such as the one encouraging proactive management of the
relationships between ACE and other DHS border security programs, like
US-VISIT. Given that these programs have made and will continue to make
decisions that determine how they will operate, delays in managing their
relationships will increase the chances that later system rework will
eventually be required to allow the programs to interoperate.

Additionally, while DHS has taken important actions to help address the
ACE release-by-release cost and schedule overruns that we previously
identified, these actions are unlikely to prevent the past pattern of
overruns from recurring. This is because DHS has met
its recently revised cost and schedule commitments in part by relaxing
system quality standards, so that milestones are being passed despite
material system defects, and because correcting such defects will
ultimately require the program to expend resources, such as people and
test environments, at the expense of later system releases (some of which
are now under way).

Appendix I Briefing to Subcommittees on Homeland Security, House and
Senate Committees on Appropriations

In the near term, cost and schedule overruns on recent releases are being
somewhat masked by the use of less stringent quality standards;
ultimately, efforts to fix these defects will likely affect the delivery
of later releases. Until accountability for ACE is redefined and measured
in terms of all types of program commitments (system capabilities,
benefits, costs, and schedules), the program will likely experience more
cost and schedule overruns.

During the last year, DHS's accountability for ACE has been largely
focused on meeting its cost and schedule baselines. This focus is revealed
by the absence of information in the latest expenditure plan on progress
against all commitments made in prior plans, particularly with regard to
measurement and reporting on such things as system capabilities, use, and
benefits. It is also shown by the program's insufficient focus on system
quality, as demonstrated by its willingness to pass milestones despite
material defects, and by the absence of attention to the current defect
profile for Release 3 (which is already deployed).

Appendix I Briefing to Subcommittees on Homeland Security, House and
Senate Committees on Appropriations

Moreover, the commitments that DHS made in the fiscal year 2005
expenditure plan have been overcome by events, which limits the currency
and relevance of this plan and its utility to the Congress as an
accountability mechanism. As a result, the prospects of greater
accountability in delivering against its capability, benefit, cost, and
schedule commitments are limited. Therefore, it is critically important
that DHS define for itself and the Congress an accountability framework
for ACE, and that it manage and report in accordance with this framework.
If it does not, the effects of the recent rebaselining of the program will
be short lived, and the past pattern of ACE costing more and taking longer
than planned will continue.

Appendix I Briefing to Subcommittees on Homeland Security, House and
Senate Committees on Appropriations

To strengthen accountability for the ACE program and better ensure that
future ACE releases deliver promised capabilities and benefits within
budget and on time, we recommend that the DHS Secretary, through the Under
Secretary for Border and Transportation Security, direct the Commissioner,
Customs and Border Protection, to define and implement an ACE
accountability framework that ensures

o 	coverage of all program commitment areas, including key expected or
estimated system (1) capabilities, use, and quality; (2) benefits and
mission value; (3) costs; and (4) milestones and schedules;

o 	currency, relevance, and completeness of all such commitments made to
the Congress in expenditure plans;

o  reliability of data relevant to measuring progress against commitments;

o 	reporting in future expenditure plans of progress against commitments
contained in prior expenditure plans;

o 	use of criteria for exiting key readiness milestones that adequately
consider indicators of system maturity, such as severity of open defects;
and

o 	clear and unambiguous delineation of the respective roles and
responsibilities of the government and the prime contractor.

In their oral comments on a draft of this briefing, DHS and CBP officials,
including the DHS Chief Information Officer (CIO), the Border and
Transportation Security CIO, and the CBP Acting CIO, generally agreed with
our findings, conclusions, and recommendations and stated that the briefing
was fair and balanced. They also provided clarifying information, which we
incorporated in this briefing as appropriate.

Attachment 1

                             Scope and Methodology

To accomplish our objectives, we analyzed the ACE fiscal year 2005
expenditure plan and supporting documentation, comparing them to relevant
federal requirements and guidance, applicable best practices, and our
prior recommendations. We also interviewed DHS and CBP officials and ACE
program contractors. In particular, we reviewed

o  DHS and CBP investment management practices, using OMB A-11, part 7;

o 	DHS and CBP activities for ensuring ACE compliance with the DHS
enterprise architecture;

o  DHS and CBP acquisition management efforts, using SEI's SA-CMM;

o 	CBP cost estimating program and cost estimates, using SEI's
institutional and project-specific estimating guidelines;1

1 SEI's institutional estimating guidelines are defined in Checklists and
Criteria for Evaluating the Cost and Schedule Estimating Capabilities of
Software Organizations, and SEI's project-specific estimating guidelines
are defined in A Manager's Checklist for Validating Software Cost and
Schedule Estimates.

o  CBP actions to coordinate ACE with US-VISIT using program
documentation;

o 	ACE testing plans, activities, system defect data, and system
performance data using industry best practices;

o 	independent verification and validation (IV&V) activities using the
Institute of Electrical and Electronics Engineers Standard for Software
Verification and Validation;1

o 	CBP establishment and use of performance measures using the draft
Performance Metrics Plan and eCP's cost performance reports;

o  ACE's performance using service level agreements;

1 Institute of Electrical and Electronics Engineers (IEEE) Standard for
Software Verification and Validation, IEEE Std 1012-1998 (New York: Mar. 9,
1998).

o 	CBP's progress toward increasing the number of ACE user accounts,
against established targets;

o 	ACE's quality, using eCP defect data and testing results for Releases 3
and 4; and

o 	cost and schedule data and program commitments from program management
documentation.

For DHS-, CBP-, and contractor-provided data that our reporting
commitments did not permit us to substantiate, we have made appropriate
attribution indicating the data's source.

We conducted our work at CBP headquarters and contractor facilities in the
Washington, D.C., metropolitan area from April 2004 through December 2004,
in accordance with generally accepted government auditing standards.

Appendix II

Comments from the U.S. Department of Homeland Security

Appendix III

                       Contacts and Staff Acknowledgments

GAO Contact	Mark T. Bird, (202) 512-6260

Staff Acknowledgments	In addition to the person named above, Carol Cha,
Barbara Collier, William Cook, Neil Doherty, Nnaemeka Okonkwo, and Shannin
O'Neill made key contributions to this report.

GAO's Mission	The Government Accountability Office, the audit, evaluation
and investigative arm of Congress, exists to support Congress in meeting
its constitutional responsibilities and to help improve the performance
and accountability of the federal government for the American people. GAO
examines the use of public funds; evaluates federal programs and policies;
and provides analyses, recommendations, and other assistance to help
Congress make informed oversight, policy, and funding decisions. GAO's
commitment to good government is reflected in its core values of
accountability, integrity, and reliability.

Obtaining Copies of GAO Reports and Testimony	The fastest and easiest way
to obtain copies of GAO documents at no cost is through GAO's Web site
(www.gao.gov). Each weekday, GAO posts newly released reports, testimony,
and correspondence on its Web site. To have GAO e-mail you a list of newly
posted products every afternoon, go to www.gao.gov and select "Subscribe to
Updates."

Order by Mail or Phone	The first copy of each printed report is free.
Additional copies are $2 each. A check or money order should be made out
to the Superintendent of Documents. GAO also accepts VISA and MasterCard.
Orders for 100 or more copies mailed to a single address are discounted 25
percent. Orders should be sent to:

U.S. Government Accountability Office, 441 G Street NW, Room LM,
Washington, D.C. 20548

To order by Phone:	Voice: (202) 512-6000 TDD: (202) 512-2537 Fax: (202)
512-6061

To Report Fraud, Waste, and Abuse in Federal Programs	Contact:
Web site: www.gao.gov/fraudnet/fraudnet.htm
E-mail: [email protected]
Automated answering system: (800) 424-5454 or (202) 512-7470

Congressional Relations	Gloria Jarmon, Managing Director, [email protected],
(202) 512-4400, U.S. Government Accountability Office, 441 G Street NW,
Room 7125, Washington, D.C. 20548

Public Affairs	Paul Anderson, Managing Director, [email protected],
(202) 512-4800, U.S. Government Accountability Office, 441 G Street NW,
Room 7149, Washington, D.C. 20548
*** End of document. ***