Information Technology: Census Bureau Needs to Improve Its Risk  
Management of Decennial Systems (11-DEC-07, GAO-08-259T).	 
                                                                 
For Census 2010, automation and information technology (IT) are  
expected to play a critical role. The Census Bureau plans to	 
spend about $3 billion on automation and technology that are to  
improve the accuracy and efficiency of census collection,	 
processing, and dissemination. From February 2006 through June	 
2009, the Bureau is holding a "Dress Rehearsal" during which it
plans to conduct operational testing that includes decennial	 
systems acquisitions. In October 2007, GAO reported on its review
of four key 2010 Census IT acquisitions to (1) determine the	 
status and plans, including schedule and cost, and (2) assess	 
whether the Bureau is adequately managing associated risks. This 
testimony summarizes GAO's report on these key acquisitions and  
describes GAO's preliminary observations on the performance of	 
handheld mobile computing devices used during the Dress 	 
Rehearsal.							 
-------------------------Indexing Terms------------------------- 
REPORTNUM:   GAO-08-259T					        
    ACCNO:   A78789						        
  TITLE:     Information Technology: Census Bureau Needs to Improve   
Its Risk Management of Decennial Systems			 
     DATE:   12/11/2007 
  SUBJECT:   Census						 
	     Cost analysis					 
	     Data collection					 
	     Information management				 
	     Information technology				 
	     IT acquisitions					 
	     Operational testing				 
	     Program evaluation 				 
	     Risk assessment					 
	     Risk management					 
	     Schedule slippages 				 
	     Source data automation				 
	     Strategic planning 				 
	     2010 Decennial Census				 

GAO-08-259T

   

     * Results in Brief
     * Background

          * Role of IT in the Decennial Census

     * Decennial IT Acquisitions Were at Various Stages of Development and
       Showed Mixed Progress against Schedule and Cost Baselines

          * MTAIP Was Completing Improvements on Schedule and at Estimated Cost
          * FDCA Had Provided Deliverables but Had Delayed Functionality and
            Was Experiencing Cost Increases
          * After a Schedule Revision, DRIS Was Delivering Reduced
            Functionality at Projected Cost
          * DADS II Contract Had Recently Been Awarded after a Delay
          * Delayed Functionality Increases the Importance of Further
            Operational Testing

     * The Bureau Was Making Progress in Risk Management Activities but
       Critical Weaknesses Remained

          * Project Teams Had Usually Established Risk Preparation
            Activities, but Some Improvements in These Activities Were Needed
          * The Project Teams Had Identified and Analyzed Risks but Not All
            Key Risks Were Identified
          * Three of Four Project Teams' Risk Mitigation Plans and Monitoring
            Activities Were Incomplete
          * Project Teams Were Inconsistent in Reporting Risk Status to
            Executive-Level Management

     * Implementation of GAO Recommendations Should Help Improve the
       Bureau's Risk Management
     * GAO's Mission
     * Obtaining Copies of GAO Reports and Testimony

          * Order by Mail or Phone

     * To Report Fraud, Waste, and Abuse in Federal Programs
     * Congressional Relations
     * Public Affairs

Testimony

Before the Subcommittee on Information Policy, Census, and National
Archives, Committee on Oversight and Government Reform, U.S. House of
Representatives

United States Government Accountability Office

GAO

For Release on Delivery
Expected at 2:00 p.m. EST
Tuesday, December 11, 2007

INFORMATION TECHNOLOGY

Census Bureau Needs to Improve Its Risk Management of Decennial Systems

Statement of David A. Powner
Director, Information Technology Management Issues

Mathew J. Scire
Director, Strategic Issues

GAO-08-259T

Mr. Chairman and Members of the Subcommittee:

Thank you for the opportunity to participate in today's hearing on the
2010 Decennial Census Information Technology (IT) acquisitions that are an
integral part of the reengineered census. As you know, the decennial
census is mandated by the U.S. Constitution and provides data that are
vital to the nation. These data are used to reapportion the seats of the
U.S. House of Representatives, realign the boundaries of the legislative
districts of each state, allocate billions of dollars in federal financial
assistance, and provide a social, demographic, and economic profile of the
nation's people to guide policy decisions at each level of government.

Carrying out the census is the responsibility of the Department of
Commerce's Census Bureau, which is now preparing for the 2010 Census. The
Bureau is required to count the population on April 1, 2010, and the
Secretary of Commerce is required to report to the President on the
tabulation of total population by state within 9 months of that date.^1

The Bureau plans to rely on automation and technology to improve the
coverage, accuracy, and efficiency of the 2010 Census, and has awarded
four key IT contracts to that end. It is also holding what it refers to as
a Dress Rehearsal, from February 2006 through June 2009, a period
centering around a mock Census Day on April 1, 2008.^2 Planned Dress
Rehearsal activities include operational testing of the 2010 Census
systems in a census-like environment. The Bureau estimates that its IT
acquisitions will absorb about $3 billion of the total $11.5 billion cost
of the entire census.

As requested, our testimony today will summarize our report on the four
key IT acquisitions. In the report, we (1) determined the status and
plans, including schedule and costs, for four key IT acquisitions; and (2)
assessed whether the Bureau is adequately managing the risks facing these
key system acquisitions.^3 The report contains a detailed overview of the
scope and methodology we used. As you also requested, our testimony today
describes GAO's preliminary observations on the performance of handheld
mobile computing devices used during address canvassing activities in the
Dress Rehearsal.^4 The preliminary observations presented in this testimony
are based on field work we have conducted at the two Dress Rehearsal sites
(Stockton, CA and Fayetteville, NC), as well as a review of Bureau
documentation of its own observations of the Dress Rehearsal. The work on
which this testimony is based was performed in accordance with generally
accepted government auditing standards.

^1 13 U.S.C. 141(a) and (b).

^2Since issuance of our report in October 2007, the Bureau has tentatively
moved the mock Census Day from April 1, 2008 to May 1, 2008.

^3GAO, Information Technology: Census Bureau Needs to Improve Its Risk
Management of Decennial Systems, GAO-08-79 (Washington, D.C.: Oct. 5,
2007).

Results in Brief

As of October 2007, three key systems acquisitions for the 2010 Census
were in process, and a fourth contract had recently been awarded:

           o In one project, the Bureau is modernizing the database that
           provides address lists, maps, and other geographic support
           services for the census. This project is on schedule to complete
           improvements by the end of fiscal year 2008 and is meeting cost
           estimates.

           o In a second project, the Bureau is acquiring systems, equipment,
           and infrastructure for field staff to use in collecting census
           data. Deliverables provided to date include handheld mobile
           computing devices and installation of key support infrastructure.
           However, the schedule for this acquisition has been revised,
           resulting in delays in system development and testing of
           interfaces. Also, the life-cycle cost estimates for this program
           have increased, and we projected an $18 million cost overrun by
           December 2008. According to the contractor, the overrun is due
           primarily to an increase in the number of system requirements.

           o In a third project, the Bureau is acquiring a system for
           integrating paper, telephone responses, and field operations. The
           software development and testing are on schedule to provide (by
           December 2007) an initial system to process the major census forms
            during the Dress Rehearsal activities. However, the system
            development schedule was revised after the contract was awarded in
            October 2005, which is delaying some functionality. For example, a
            telephone-assistance system that was originally intended to be
            completed by fiscal year
           2008 has been delayed. This acquisition is meeting current cost
           estimates.

            o Finally, the award of a contract to replace the current system
            used to tabulate and disseminate census data was delayed by about
            a year (the contract was ultimately awarded in September 2007). As
            a result of the delay, the Dress Rehearsal activities will use the
            current tabulation and dissemination system rather than a
            modernized version.
			  
^4Address canvassing is a field operation to build a complete and accurate
address list. In this operation, census field workers go door to door
verifying and correcting addresses for all households and street features
contained on decennial maps.

           The delays mean that the Dress Rehearsal operational testing will
           take place without the full complement of systems and
           functionality that was originally planned. As a result, further
           system testing will be necessary to ensure that the decennial
           systems work as intended. However, as of October 2007, Bureau
           officials had not finalized their plans for testing all the
           systems, and it is not clear whether these plans would include
           testing to address all interrelated systems and functionality,
           such as end-to-end testing.^5 According to officials, these plans
           will not be finalized until February 2008. Without sufficient
           testing of all systems and their functionality, the Bureau
           increases the risk that costs will increase further, that
           decennial systems will not perform as expected, or both.

           As of October 2007, the four project teams managing the
           acquisitions had performed many practices associated with
           establishing sound and capable risk management processes. However,
           critical weaknesses remained. Specifically, three of the four
           project teams had developed risk management strategies identifying
           the scope of their risk management efforts; however, three project
           teams had weaknesses in identifying risks, establishing mitigation
           plans that identified planned actions and milestones, and
           reporting risk status to executive-level officials. For example,
           one project team did not adequately identify risks associated with
           performance issues experienced by handheld mobile computing
            devices. Further, in May and June 2007, both we and the Census
            Bureau observed the use of the handheld mobile computing devices
            in census-like conditions, and these observations revealed a
            number of performance issues, such as slow and inconsistent
            data processing. The magnitude of these performance issues remains
           unclear. The Field Data Collection Automation (FDCA) contract
           anticipates the Bureau's need for data on the performance of the
           handheld mobile computing device; however, the Bureau has not
           fully specified the performance data it will use for the devices.
           As we have previously reported, a root cause of weaknesses in
           completing key risk management activities is the lack of policies
           for managing major acquisitions at the Bureau.^6 Until the project
           teams implement key risk management activities, they face an
           increased probability that decennial systems will not be delivered
           on schedule and within budget or perform as expected.

^5End-to-end testing is a form of operational testing that is performed to
verify that a defined set of interrelated systems that collectively
support an organizational core business function interoperate as intended
in an operational environment. The interrelated systems include not only
those owned and managed by the organization, but also the external systems
with which they interface.

           Because the entire complement of systems will not be available for
           Dress Rehearsal activities as originally planned, we recommended
           that the Census Bureau plan for and perform end-to-end testing so
           that all systems are tested in a census-like environment. Further,
           to help ensure that the three key acquisitions for the 2010 Census
           operate as intended, we recommended that the project teams
           strengthen risk management activities, including those associated
           with risk identification, mitigation, and oversight.

           In written comments on a draft of our report, the department
           agreed to examine additional ways to manage risks and prepare a
           formal action plan in response to our final report. However, the
           department said it had a major disagreement with our findings with
           regard to not conducting operational testing on a full complement
           of the key decennial systems, stating it plans to test all
           critical systems and interfaces during the Dress Rehearsal or
           later. Nonetheless, the Bureau's test plans have not been
           finalized, and it remains unclear whether testing will address all
           interrelated systems and functionality in a census-like
           environment, as would be provided by end-to-end testing.
           Consistent with our recommendation, following up with documented
           test plans to do end-to-end testing will help ensure that
           decennial systems will work as intended.
			  
Background

           Conducting the decennial census is a major undertaking involving
            many interrelated steps, including

           o identifying and correcting addresses for all known living
           quarters in the United States (known as "address canvassing");
           o sending questionnaires to housing units;
           o following up with nonrespondents through personal interviews;
           o identifying people with nontraditional living arrangements;
            o managing a large workforce responsible for follow-up
           activities;
           o collecting census data by means of questionnaires, calls, and
           personal interviews;
           o tabulating and summarizing census data; and
           o disseminating census analytical results to the public.

^6GAO, Census Bureau: Important Activities for Improving Management of Key
2010 Decennial Acquisitions Remain to be Done, GAO-06-444T
(Washington, D.C.: Mar. 1, 2006).

Role of IT in the Decennial Census

           The Bureau estimates that it will spend about $3 billion on
           automation and IT for the 2010 Census, including four major
           systems acquisitions that are expected to play a critical role in
           improving coverage, accuracy, and efficiency. Figure 1 shows the
           key systems and interfaces supporting the 2010 Census, and
           highlights the four major IT systems we discuss today. As the
           figure shows, these four systems are to play important roles with
           regard to different aspects of the process.

Figure 1: Key 2010 Census Systems and Interfaces

Note: Shaded boxes indicate systems discussed in the report.

To establish where to count (as shown in the top section of fig. 1), the
Bureau will depend heavily on a database that provides address lists,
maps, and other geographic support services. The Bureau's address list,
known as the Master Address File (MAF), is associated with a geographic
information system containing street maps known as the Topologically
Integrated Geographic Encoding and Referencing (TIGER®) database.^7 The
MAF/TIGER database is the object of the first major IT acquisition--the
MAF/TIGER Accuracy Improvement Project (MTAIP).

^7TIGER is a registered trademark of the U.S. Census Bureau.

To collect respondent information (a process depicted in the middle
section of fig. 1), the Bureau is pursuing two initiatives. First, the
Field Data Collection Automation (FDCA) program is expected to provide
automation support for field data collection operations as well as reduce
costs and improve data quality and operational efficiency. This
acquisition includes the systems, equipment, and infrastructure that field
staff will use to collect census data, such as handheld mobile computing
devices.^8 Second, the Decennial Response Integration System (DRIS) is to
provide a system for collecting and integrating census responses from all
sources, including forms, telephone interviews, and handheld mobile
computing devices in the field. DRIS is expected to improve accuracy and
timeliness by standardizing the response data and providing it to other
Bureau systems for analysis and processing.

To provide results (see the bottom section of fig. 1), the Data Access and
Dissemination System II (DADS II) acquisition is to replace legacy systems
for tabulating and publicly disseminating data. The DADS II program is
expected to provide comprehensive support to DADS. Replacement of the
legacy systems is expected to

           o maximize the efficiency, timeliness, and accuracy of tabulation
           and dissemination products and services;
           o minimize the cost of tabulation and dissemination; and
           o increase user satisfaction with related services.
			  
^8Handheld mobile computing devices will be used to update the Bureau's
address list, to perform follow-up at addresses for which no questionnaire
was returned, and to perform activities to measure census coverage.

           Table 1 provides a brief overview of the four acquisitions.

           Table 1: Four Key IT Acquisitions Supporting Census 2010
			  
IT acquisition: MAF/TIGER Accuracy Improvement Project (MTAIP); 
Purpose: Modernize the system that provides the address list, maps, and 
other geographic support services for the Census and other Bureau 
surveys. 

IT acquisition: Field Data Collection Automation (FDCA); 
Purpose: Provide automated resources for supporting field data
collection, including handheld mobile computing devices to collect data,
such as address and map data, in the field.

IT acquisition: Decennial Response Integration System (DRIS); 
Purpose: Provide a solution for data capture and respondent assistance. 

IT acquisition: Data Access and Dissemination System II (DADS II);
Purpose: Develop a replacement for the DADS legacy tabulation and 
dissemination systems. 			  

           Source: GAO analysis of Census Bureau data.

           Responsibility for these acquisitions lies with the Bureau's
           Decennial Management Division and the Geography Division. Each of
           the four acquisitions is managed by an individual project team
           staffed by Bureau personnel. Additional information on the
           contracts for these four systems is provided in appendix I of the
           report.

            In preparation for the 2010 Census, the Bureau plans to conduct a
            series of tests of its new and existing operations and systems in
            different environments, as well as what it refers to as
           the Dress Rehearsal. During the Dress Rehearsal period, which runs
           from February 2006 through June 2009, the Bureau plans to conduct
           development and testing of systems, run a mock Census Day, and
           prepare for Census 2010, which will include opening offices and
           hiring staff. These Dress Rehearsal activities are to provide an
           operational test of the available system functionalities in a
           census-like environment, as well as other operational and
           procedural activities.
			  
Decennial IT Acquisitions Were at Various Stages of Development and Showed
Mixed Progress against Schedule and Cost Baselines

           As of October 2007, three key decennial systems acquisitions were
            in process and a fourth contract had recently been awarded. The
            ongoing acquisitions showed mixed progress in providing
            deliverables while adhering to planned schedules and cost
            estimates. Two of the ongoing projects (FDCA and DRIS) had
            experienced schedule delays; the date for awarding the fourth
            contract was postponed
           several times. In addition, we estimated that one of the ongoing
           projects (FDCA) will incur about $18 million in cost overruns. In
           response to schedule delays as well as other factors, including
           cost, the Bureau made schedule adjustments and planned to delay
           certain system functionality. As a result, Dress Rehearsal
           operational testing will not address the full complement of
           systems and functionality that was originally planned, and the
           Bureau has not yet finalized its plans for further system tests.
           Delaying functionality increases the importance of operational
           testing after the Dress Rehearsal to ensure that the decennial
           systems work as intended.
			  
MTAIP Was Completing Improvements on Schedule and at Estimated Cost

           MTAIP is a project to improve the accuracy of the MAF/TIGER
           database, which contains information on street locations, housing
           units, rivers, railroads, and other geographic features. We
           reported that MTAIP was on schedule to complete improvements by
           the end of fiscal year 2008 and was meeting cost estimates.

           As of October 2007, the acquisition was in the second and final
           phase of its life cycle. In Phase II, which began in January 2003
           and is ongoing, the contractor is developing improved maps for all
           3,037 counties in the United States. We reported that the
           contractor had delivered more than 75 percent of these maps, which
            are due by September 2008. Contract maintenance is to begin in
            fiscal year 2008, and contract closeout activities are scheduled
            for fiscal year 2009.
			  
FDCA Had Provided Deliverables but Had Delayed Functionality and Was
Experiencing Cost Increases

           FDCA is to provide the systems, equipment, and infrastructure that
           field staff will use to collect census data. At the peak of the
           2010 Census, about 4,000 field operations supervisors, 40,000 crew
           leaders, 500,000 enumerators and address listers, and several
           thousand office employees are expected to use or access FDCA.

           As of October 2007, the contractor was in the process of
           developing and testing FDCA software for the Dress Rehearsal
           Census Day, and had delivered 1,388 handheld mobile computing
           devices to be used in address canvassing for the Dress Rehearsal.
           Also, key FDCA support infrastructure had been installed,
           including the Security Operation Center. In future contract
           phases, the project will continue development, deploy systems and
           hardware, support census operations, and perform operational and
           contract closeout activities.

           However, the Bureau revised FDCA's original schedule and delayed
           or eliminated some of its key functionality from the Dress
           Rehearsal, including the automated software distribution system.
           According to the Bureau, it revised the schedule because it
           realized it had underestimated the costs for the early stages of
           the contract, and that it could not meet the contractor's
           estimated level of first-year funding because the fiscal year 2006
           budget was already in place. According to the Bureau, this initial
           underestimate led to schedule changes and overall cost increases.

           According to the Bureau, FDCA was meeting all planned milestones
           on the revised schedule. For example, all sites for Regional
           Census Centers and Puerto Rico Area Offices had been identified.
           According to the Bureau, it is on schedule to open all these
           offices in January 2008.

            The project life-cycle costs had increased. At contract award in
            March 2006, the total cost of FDCA was estimated not to exceed
            $596 million. By September 2006, the estimated life-cycle cost had
            risen to $624 million, and in May 2007 it rose by a further
            $23 million because of increasing system requirements, resulting
            in an estimated life-cycle cost of about $647 million.
           Table 2 shows the life-cycle cost estimates for FDCA as of October
           2007.

Table 2: FDCA Life-Cycle Cost Estimates

Execution period: Baseline planning period; 
Start date: March 31, 2006; 
End date: June 30, 2006; 
Cost estimates (in millions) September 2006: $11; 
Cost estimates (in millions) May 2007: $11. 

Execution period: Execution Period 1; 
Start date: July 1, 2006; 
End date: December 31, 2008; 
Cost estimates (in millions) September 2006: $200; 
Cost estimates (in millions) May 2007: $225. 

Execution period: Execution Period 2; 
Start date: January 1, 2009; 
End date: September 30, 2011; 
Cost estimates (in millions) September 2006: $319; 
Cost estimates (in millions) May 2007: $318. 

Execution period: Execution Period 3; 
Start date: August 1, 2010; 
End date: End of contract; 
Cost estimates (in millions) September 2006: $10; 
Cost estimates (in millions) May 2007: $10. 

Execution period: Leased equipment; 
Start date: N/A; 
End date: N/A; 
Cost estimates (in millions) September 2006: $12; 
Cost estimates (in millions) May 2007: $12. 

Execution period: Management reserve; 
Start date: N/A; 
End date: N/A; 
Cost estimates (in millions) September 2006: $7; 
Cost estimates (in millions) May 2007: $5. 

Execution period: Award fee; 
Start date: N/A; 
End date: N/A; 
Cost estimates (in millions) September 2006: $65; 
Cost estimates (in millions) May 2007: $65. 

Execution period: Total; 
Cost estimates (in millions) September 2006: $624; 
Cost estimates (in millions) May 2007: $647. 

Source: GAO analysis of Census Bureau data.

Note: Total may not add due to rounding.

In addition, FDCA had already experienced $6 million in cost overruns, and
both our analysis and the contractor's analysis expected FDCA to
experience additional cost overruns. Based on our analysis of cost
performance reports (from July 2006 to May 2007), we projected that the
FDCA project would experience further cost overruns by December 2008. We
estimated the overrun at between $15 million and $19 million, with the
most likely value being about $18 million. The contractor, in
contrast, estimated about a $6 million overrun by December 2008.
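
The report does not detail how these projections were derived; cost
performance reports of this kind generally contain earned value data, and
one common way to project an estimate at completion is from the cumulative
cost performance index. The sketch below is a minimal, hypothetical
illustration of that kind of projection; the formula is the standard
CPI-based estimate, and the figures are invented, not the actual FDCA
numbers.

# Minimal sketch of a CPI-based estimate-at-completion projection.
# The figures are hypothetical and are not the FDCA contract data.
def project_overrun(budget_at_completion, earned_value, actual_cost):
    """Project the cost overrun implied by cost performance to date."""
    cpi = earned_value / actual_cost              # cost performance index
    estimate_at_completion = budget_at_completion / cpi
    return estimate_at_completion - budget_at_completion

# Example: a $200 million budget with 90 cents of work earned per dollar spent.
overrun = project_overrun(budget_at_completion=200.0,
                          earned_value=90.0,
                          actual_cost=100.0)
print(f"Projected overrun: about ${overrun:.0f} million")  # about $22 million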

According to the contractor, the major cause of projected cost overruns
was the system requirements definition process. For example, in December
2006, the contractor noted a significant increase in the requirements for
the Dress Rehearsal Paper Based Operations in Execution Period 1.
According to the cost performance reports, this increase has meant that
more work must be performed and more staff assigned to meet the Dress
Rehearsal schedule.

The Bureau agreed that cost increases occurred in some cases because of
the addition of new requirements, most of which related to the security of
IT systems, but added that in other cases, increases occurred from the
process of the contractor converting high-level functional requirements
into more detailed specific requirements. However, the process of
developing detailed requirements from high-level functional requirements
does not inevitably lead to cost increases if the functional requirements
were initially well-defined.

The FDCA schedule changes have increased the likelihood that the systems
testing at the Dress Rehearsal will not be as comprehensive as planned.
The inability to perform comprehensive operational testing of all
interrelated systems increases the risk that further cost overruns will
occur and that decennial systems will experience performance shortfalls.

After a Schedule Revision, DRIS Was Delivering Reduced Functionality at
Projected Cost

DRIS is to provide a system for collecting and integrating census
responses, standardizing the response data, and providing it to other
systems for analysis and processing. The DRIS functionality is critical
for providing assistance to the public via telephone and for monitoring
the quality and status of data capture operations.

Although DRIS was currently on schedule to meet its December 2007
milestone, the Bureau revised the original DRIS schedule after the
contract was awarded in October 2005. Under the revised schedule, the
Bureau delayed or eliminated some functionality that was expected to be
ready for the Dress Rehearsal mock Census Day.

According to Bureau officials, they delayed the schedule and eliminated
functionality for DRIS when they realized they had underestimated the
fiscal years 2006 through 2008 costs for development. As shown in table 3,
the government's funding estimates for DRIS Phase I were significantly
lower than the contractor's.

Table 3: DRIS Cost Estimates for Phase I (as of March 2006)

Fiscal year: 2006; 
Cost estimates (in millions) Contractor: $18.6; 
Cost estimates (in millions) Government: $11.2. 

Fiscal year: 2007; 
Cost estimates (in millions) Contractor: $53.3; 
Cost estimates (in millions) Government: $23.8. 

Fiscal year: 2008; 
Cost estimates (in millions) Contractor: $48.7; 
Cost estimates (in millions) Government: $31.5. 

Fiscal year: Total; 
Cost estimates (in millions) Contractor: $120.6; 
Cost estimates (in millions) Government: $66.5. 

Source: GAO analysis of Census Bureau data.

Originally, the DRIS solution was to include paper, telephone, Internet,
and field data collection processing; selection of data capture sites; and
preparation and processing of 2010 Census forms. However, the Bureau
reduced the scope of the solution by eliminating the Internet
functionality. In addition, the Bureau has stated that it will not have a
robust telephone questionnaire assistance system in place for the Dress
Rehearsal. As of October 2007, the Bureau was also delaying selecting
sites for data capture centers, preparing data capture facilities, and
recruiting and hiring data capture staff.

Although Bureau officials told us that the revisions to the schedule
should not affect meeting milestones for the 2010 Census, the delays mean
that more systems development and testing will need to be accomplished
later. Given the immovable deadline of the decennial census, the Bureau is
at risk of reducing functionality or increasing costs to meet its
schedule.

The DRIS project was not experiencing cost overruns, and our analysis of
cost performance reports from April 2006 to May 2007 projected no cost
overruns by December 2008. As of May 2007, the DRIS contract value had not
increased.

DADS II Contract Had Recently Been Awarded after a Delay

The DADS II acquisition is to replace the legacy DADS systems, which
tabulate and publicly disseminate data from the decennial census and other
Bureau surveys.^9 The DADS II contractor is also expected to provide
comprehensive support to the Census 2000 legacy DADS systems.

The DADS II contract award date had been delayed multiple times. The award
date was originally planned for the fourth quarter of 2005, but the date
changed to August 2006. On March 8, 2006, the Bureau estimated it would
delay the award of the DADS II contract from August to October 2006 to
gain a clearer sense of budget priorities before initiating the request
for proposal process. The Bureau then delayed the contract award again by
about another year. In January 2007, the Bureau released the DADS II
request for proposal, and the contract was finally awarded in September
2007. Because of these delays, DADS II will not be developed in time for
the Dress Rehearsal. Instead, the Bureau will use the legacy DADS system
for tabulation during the Dress Rehearsal. Nonetheless, the Bureau plans
to have the DADS II system available for the 2010 Census.

Delayed Functionality Increases the Importance of Further Operational Testing

Operational testing helps verify that systems function as intended in an
operational environment. However, for operational system testing to be
comprehensive, system functionality must be completed. Further, for
multiple interrelated systems, end-to-end testing is performed to verify
that all interrelated systems, including any external systems with which
they interface, are tested in an operational environment. However, as
described above, two of the projects had delayed planned functionality to
later phases, and one project contract had just recently been awarded in
September 2007. As a result, the operational testing that is to occur
during the Dress Rehearsal period around April 1, 2008, will not include
tests of the full complement of decennial census systems and their
functionality. As of October 2007, the Bureau had not yet finalized its
plans for system tests. If further delays occur, the importance of these
system tests will increase. Delaying functionality and not testing the
full complement of systems increases the risk that costs will rise
further, that decennial systems will not perform as expected, or both.

^9The DADS II contract was originally planned to establish a new Web-based
system that would serve as a single point for public access to all census
data and integrate many dissemination functions currently spread across
multiple Bureau organizations.

The Bureau Was Making Progress in Risk Management Activities but Critical
Weaknesses Remained

The project teams varied in the extent to which they followed disciplined
risk management practices. For example, three of the four project teams
had developed strategies to identify the scope of the risk management
effort. However, three project teams had weaknesses in identifying risks,
establishing adequate mitigation plans, and reporting risk status to
executive-level officials. These weaknesses in completing key risk
management activities can be attributed in part to the absence of Bureau
policies for managing major acquisitions, as we described in an earlier
report.^10 Without effective risk management practices, the likelihood of
project success is decreased.

According to the Software Engineering Institute (SEI), the purpose of risk
management is to identify potential problems before they occur. When
problems are identified, risk-handling activities can be planned and
invoked as needed across the life of a project in order to mitigate
adverse impacts on objectives. Effective risk management involves early
and aggressive risk identification through the collaboration and
involvement of relevant stakeholders. Based on SEI's Capability Maturity
Model® Integration (CMMI®), risk management activities can be
divided into four key areas:

           o preparing for risk management,
           o identifying and analyzing risks,
           o mitigating risks, and
           o executive oversight.

           The discipline of risk management is important to help ensure that
           projects are delivered on time, within budget, and with the
           promised functionality. It is especially important for the 2010
           Census, given the immovable deadline.
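
           As a simple illustration of the second of these areas, the sketch
           below scores hypothetical risks by likelihood and consequence and
           ranks them by exposure. The 1-to-5 scales and the example risks
           are assumptions for illustration only; they are not drawn from
           CMMI or from the Bureau's risk registers.

# Hypothetical illustration of risk identification and analysis: each risk
# is scored by likelihood and consequence (1-5 scales assumed here) and
# ranked by exposure. The risks and scales are illustrative only.
risks = [
    {"id": "R1", "description": "New interfaces added after Dress Rehearsal",
     "likelihood": 4, "consequence": 5},
    {"id": "R2", "description": "Contractor staff turnover as program matures",
     "likelihood": 2, "consequence": 3},
    {"id": "R3", "description": "Handheld device transmission failures",
     "likelihood": 3, "consequence": 5},
]

for risk in risks:
    risk["exposure"] = risk["likelihood"] * risk["consequence"]

# Highest-exposure risks first, so mitigation planning starts with them.
for risk in sorted(risks, key=lambda r: r["exposure"], reverse=True):
    print(f'{risk["id"]}: exposure {risk["exposure"]} - {risk["description"]}')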

^10GAO-06-444T.

Project Teams Had Usually Established Risk Preparation Activities, but Some
Improvements in These Activities Were Needed

           Risk preparation involves establishing and maintaining a strategy
           for identifying, analyzing, and mitigating risks. The risk
           management strategy addresses the specific actions and management
           approach used to perform and control the risk management program.
           It also includes identifying and involving relevant stakeholders
           in the risk management process. Table 4 shows the status of the
           four project teams' implementation of key risk preparation
           activities as of October 2007.^11

           Table 4: Risk Management Preparation Activities Completed for the
           Key 2010 Census Systems

Specific practices: Determine risk sources and categories; 
MTAIP: practice not implemented; 
FDCA: practice fully implemented; 
DRIS: practice fully implemented; 
DADS: practice fully implemented. 

Specific practices: Define parameters used to analyze and categorize 
risks and parameters used to control risk management efforts; 
MTAIP: practice fully implemented; 
FDCA: practice fully implemented; 
DRIS: practice fully implemented; 
DADS: practice fully implemented. 

Specific practices: Establish and maintain the strategy to be used for 
risk management; 
MTAIP: practice partially implemented; 
FDCA: practice fully implemented; 
DRIS: practice fully implemented; 
DADS: practice fully implemented. 

Specific practices: Identify and involve the relevant stakeholders of 
the risk management process as planned; 
MTAIP: practice partially implemented; 
FDCA: practice partially implemented; 
DRIS: practice fully implemented; 
DADS: practice partially implemented. 

           Source: GAO analysis of project data.

           As the table shows, three project teams had established most of
           the risk management preparation activities. However, the MTAIP
           project team had implemented the fewest practices. The team did
           not adequately determine risk sources and categories or adequately
           develop a strategy for risk management. As a result, the project's
           risk management strategy was not comprehensive and did not fully
           address the scope of the risk management effort, including
           discussing techniques for risk mitigation and defining adequate
           risk sources and categories. In addition, three project teams
           (MTAIP, FDCA, and DADS II) had weaknesses regarding stakeholder
           involvement. The three teams did not provide sufficient evidence
           that the relevant stakeholders were involved in risk
           identification, analysis, and mitigation activities; reviewing the
           risk management strategy and risk mitigation plans; or
           communicating and reporting risk management status.
			  
^11This analysis primarily addresses project teams' implementation of risk
management processes. According to our analysis, the contractors for the
three contracts awarded (MTAIP, FDCA, and DRIS) had implemented adequate
risk management processes involving risk preparation, risk identification
and analysis, and risk mitigation.			  

           These weaknesses can be attributed in part to the absence of
           Bureau policies for managing major acquisitions, as we described
           in our earlier reports.^12 Without adequate preparation for risk
           management, including establishing an effective risk management
           strategy and identifying and involving relevant stakeholders,
           project teams cannot properly control the risk management process.
			  
The Project Teams Had Identified and Analyzed Risks but Not All Key Risks
Were Identified

           Risks must be identified and described in an understandable way
           before they can be analyzed and managed properly. This includes
           identifying risks from both internal and external sources and
           evaluating each risk to determine its likelihood and consequences.
           Table 5 shows the status of the four project teams' implementation
           of key risk identification and evaluation activities at the time
           of our October 2007 report.

           Table 5: Risk Identification and Evaluation Activities Completed
           for the Key 2010 Census Systems

Specific practices: Identify and document the risks; 
MTAIP: practice fully implemented;
FDCA: practice partially implemented; 
DRIS: practice fully implemented; 
DADS: practice partially implemented. 

Specific practices: Evaluate and categorize each identified risk using 
the defined risk categories and parameters, and determine its relative 
priority; 
MTAIP: practice partially implemented; 
FDCA: practice fully implemented; 
DRIS: practice fully implemented; 
DADS: practice fully implemented. 

           Source: GAO analysis of project data.
			  
           As of July 2007, the MTAIP and DRIS project teams were adequately
           identifying and documenting risks, including system interface
           risks. For example, the MTAIP project team identified significant
           risks regarding potential changes in funding and the turnover of
           contractor personnel as the program nears maturity, and the DRIS
           project team identified significant risks regarding new system
           security regulations, changes or increases to Phase II baseline
           requirements, and new interfaces after Dress Rehearsal.

           In contrast, the FDCA project team had not identified or
           documented any significant risks related to the handheld computers
           that will be used in the 2010 Census, despite problems arising
           during the Dress Rehearsal. The computers are designed to automate
           operations for field staff and eliminate the need to print
           millions of paper questionnaires and maps used by temporary field
           staff to conduct address canvassing and nonresponse follow-up.
           Automating operations may allow the Bureau to reduce the cost of
           operations; thus, it is critical that the risks surrounding the
           use of the handheld computers be closely monitored and effectively
           managed to ensure their success. However, the Bureau has not
           identified or documented risks associated with a variety of
           handheld computers performance problems that we identified through
           field work conducted at your request. Specifically, we found that
           during Dress Rehearsal activities between May 2007 and June 2007,
           as the Bureau tested a prototype of the handheld computers, field
           staff experienced multiple problems. For example, the field staff
           told us that they experienced slow and inconsistent data
           transmissions from the handheld computers to the central data
           processing center. The field staff reported the device was slow to
            process addresses that were part of a large assignment area.
           Bureau staff reported similar problems with the handheld computers
           in observation reports, help desk calls, and debriefing reports.
           In addition, our own analysis of Bureau documentation revealed
           problems with the handheld computers:

                        o Bureau observation reports revealed that the Bureau
                        most frequently observed problems with slow
                        processing of addresses, large assignment areas, and
                        transmission.

                        o The help desk call log revealed that field staff
                        most frequently reported issues with transmission,
                         the device freezing, map spotting, and assignment
                         areas.

                        o Debriefing reports illustrated the impact of the
                        handheld mobile computing problems on address
                        canvassing. For example, one participant commented
                        that the field staff struggled to find solutions to
                        problems and wasted precious time in replacing the
                        devices.

                        o A time-and-motion study conducted by the Census
                        Bureau indicated that field staff reported
                        significant downtime in two test locations--about 23
                        percent in one location and about 27 percent in
                        another location. The study, which is a draft that is
                        subject to change, also described occurrences of
                        failed transmissions and field staff attempts to
                        resolve transmission problems.

           Collectively, the observation reports, help desk calls, debriefing
           reports, and time-and-motion study raised serious questions about
           the performance of the handheld computers during the address
           canvassing operation. According to the Bureau, the contractor has
           used these indicators to identify and address underlying problems
            during the Dress Rehearsal. Still, the magnitude of the handheld
            computers' performance issues throughout the Dress Rehearsal
            remains unclear. For example, the Bureau received analyses from
            the contractor on average transmission times. However, the
            contractor has not provided analyses that show the full range of
            transmission times or how those times may have changed over the
            course of the operation.

            In addition, the Bureau has not fully specified how it will
            measure the performance of the handheld computers, even though the
            FDCA contract anticipates the Bureau's need for such data and
            outlines the type of performance data the contractor is to
            provide to the Bureau. Specifically, sections of
           the FDCA contract require the handheld computers to have a
           transmission log with what was transmitted, the date, time, user,
           destination, content/data type, and the outcome status. Another
           section of the Bureau's FDCA contract states that the FDCA
           contractor shall provide near real time reporting and monitoring
           of performance metrics and a "control panel/dash board"
           application to visually report those metrics from any Internet
           enabled PC. However, the contractor and the Bureau are not using a
           dashboard for Dress Rehearsal activities. Rather, during the Dress
            Rehearsal, the Bureau plans to identify what data and performance
            measures it will need to track the performance of the handheld
            computers in 2010 operations.
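
           The contract language above implies a simple record-and-summary
           structure for such performance data. The sketch below is a
           hypothetical illustration only; the field names and the summary
           calculation are assumptions and are not taken from the FDCA
           contract or the contractor's systems.

# Hypothetical sketch of a handheld transmission log record and a simple
# summary of the kind a performance dashboard might display. Field names
# are illustrative assumptions, not the FDCA contract's actual schema.
from dataclasses import dataclass
from datetime import datetime
from typing import List

@dataclass
class TransmissionLogEntry:
    payload: str             # what was transmitted (e.g., "address updates")
    timestamp: datetime      # date and time of the transmission
    user: str                # field staff identifier
    destination: str         # receiving system or data center
    content_type: str        # content/data type
    outcome: str             # outcome status, e.g., "success" or "failed"
    duration_seconds: float  # transmission time

def summarize(entries: List[TransmissionLogEntry]) -> dict:
    """Summarize success rate and the full range of transmission times."""
    times = [e.duration_seconds for e in entries]
    successes = sum(1 for e in entries if e.outcome == "success")
    return {
        "transmissions": len(entries),
        "success_rate": successes / len(entries),
        "min_seconds": min(times),
        "max_seconds": max(times),
        "avg_seconds": sum(times) / len(times),
    }

# Example with two hypothetical entries.
log = [
    TransmissionLogEntry("address updates", datetime(2007, 5, 7, 9, 0),
                         "lister-042", "data center", "MAF updates",
                         "success", 45.0),
    TransmissionLogEntry("address updates", datetime(2007, 5, 7, 9, 30),
                         "lister-042", "data center", "MAF updates",
                         "failed", 310.0),
]
print(summarize(log))

           A summary of this kind would show not only average transmission
           times but also their full range, which is the gap left by the
           analyses the Bureau has received so far.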

           In order for the Bureau to ensure that the FDCA handheld computers
            are ready for full-scale operations, it will have to identify
            risks within a tight time frame. We recommended in a report on the
           Bureau's earlier version of the handheld computers that the Bureau
           define specific, measurable performance requirements for the
            handheld computer and other census-taking activities that address
            such important measures as productivity, cost savings,
            reliability, and durability, and that the Bureau test the device's
            ability to meet those requirements in 2006.^13 We also recommended
           in a March 2006 testimony that the Bureau validate and approve
           FDCA baseline requirements.^14 The Bureau is working within a
           compressed time frame. By law, the decennial census must occur on
           April 1, 2010, and the results must be submitted to the President
           in December 2010. These dates cannot be altered, even if
           preparations are delayed. Access to real-time performance metrics
           via a "control panel/dash board" would assist Bureau management in
           assessing the handheld computer's performance and maximize the
           amount of time the Bureau and the contractor would have to remedy
           any problems identified during operations. Further, the Bureau's
           tight 2010 Decennial Operations Schedule allows little time for
           fixing problems with the device, raising the importance of the
           Bureau's access to these performance indicators. Such data would
           help fully inform stakeholders of the risks associated with the
           handheld computer, and allow project teams to develop mitigation
           activities to help avoid, reduce, and control the probability of
           these risks occurring.

            Finally, the FDCA and DADS II project teams did not provide
            evidence that specific system interface risks are being adequately
            identified to ensure that risk-handling activities will be invoked
            should the systems fail during the 2010 Census. For example,
            although DADS II will not be available for the Dress Rehearsal, the
           project team did not identify any significant interface risks
           associated with this system.

           One reason for these weaknesses, as mentioned earlier, is the lack
           of Bureau policies for managing major acquisitions. If risks are
            not adequately identified and analyzed, management may be unable
            to monitor and track them and take appropriate mitigation actions,
            increasing the probability that the risks will materialize and
            magnifying the extent of damage incurred if they do.
			  
^13GAO, 2010 Census: Basic Design Has Potential, but Remaining Challenges
Need Prompt Resolution, GAO-05-9 (Washington, D.C.: Jan. 12, 2005).

^14GAO-06-444T.

Three of Four Project Teams' Risk Mitigation Plans and Monitoring
Activities Were Incomplete

           Risk mitigation involves developing alternative courses of action,
           workarounds, and fallback positions, with a recommended course of
           action for the most important risks to the project. Mitigation
           includes techniques and methods used to avoid, reduce, and control
           the probability of occurrence of the risk; the extent of damage
           incurred should the risk occur; or both. Table 6 shows the status
           of the four project teams' implementation of key risk mitigation
           activities.

           Table 6: Risk Mitigation Activities Completed for Key 2010 Census
           Systems

Specific practices: Develop a risk mitigation plan for the most 
important risks to the project, as defined by the risk management 
strategy; 
MTAIP: practice partially implemented; 
FDCA: practice partially implemented; 
DRIS: practice fully implemented; 
DADS: practice not implemented. 

Specific practices: Monitor the status of each risk periodically and 
implement the risk mitigation plan as appropriate; 
MTAIP: practice partially implemented; 
FDCA: practice partially implemented; 
DRIS: practice fully implemented; 
DADS: practice partially implemented. 
	
           Source: GAO analysis of project data.

           Three project teams (MTAIP, FDCA, and DADS II) had developed
           mitigation plans that were often untimely or included incomplete
           activities and milestones for addressing the risks. Some of these
           untimely and incomplete activities and milestones included the
           following:

           o The FDCA project team had developed mitigation plans for the
           most significant risks, but the plans did not always identify
           milestones for implementing mitigation activities. Moreover, the
           plans did not identify any commitment of resources, several did
           not establish a period of performance, and the team did not always
           update the plans with the latest information on the status of the
           risk. In addition, the FDCA project team did not provide evidence
           of developing mitigation plans to handle the other significant
            risks as described in its risk mitigation strategy. (These risks
           included a lack of consistency in requirements definition and
           insufficient FDCA project office staffing levels).

           o The mitigation plans for DADS II were incomplete, with no
           associated future milestones and no evidence of continual progress
           in working towards mitigating a risk. In several instances, DADS
           II mitigation plans were listed as "To Be Determined."

           With regard to the second practice in the table (periodically
           monitoring risk status and implementing mitigation plans), the
           MTAIP, FDCA, and DADS II project teams were not always
           implementing the mitigation plans as appropriate. For example,
           although the MTAIP project team has periodically monitored the
            status of risks, its mitigation plans do not include detailed
           action items with start dates and anticipated completion dates;
           thus, the plans do not ensure that mitigation activities are
           implemented appropriately and tracked to closure. The FDCA and
            DADS II project teams did not identify system interface risks or
           prepare adequate mitigation plans to ensure that systems will
           operate as intended. Because they did not develop complete
           mitigation plans, the MTAIP, FDCA, and DADS II project teams
           cannot ensure that for a given risk, techniques and methods will
           be invoked to avoid, reduce, and control the probability of
           occurrence.
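
           To make concrete the elements the report found missing, the sketch
           below outlines what a complete mitigation-plan record might
           capture: dated milestones, committed resources, a period of
           performance, and a status that is kept current. The field names
           are hypothetical and are not drawn from Bureau documents.

# Minimal sketch of a mitigation-plan record capturing the elements the
# report found missing (milestones, resources, period of performance, and
# status). Field names are hypothetical, not from Bureau documents.
from dataclasses import dataclass, field
from datetime import date
from typing import List, Optional, Tuple

@dataclass
class Milestone:
    description: str
    start: date
    due: date
    status: str = "open"          # e.g., open, in progress, closed

@dataclass
class MitigationPlan:
    risk_id: str                  # identifier from the risk register
    planned_actions: List[Milestone] = field(default_factory=list)
    committed_resources: str = "" # staff, funding, or contractor support
    period_of_performance: Tuple[Optional[date], Optional[date]] = (None, None)
    last_status_update: Optional[date] = None

# Example: a hypothetical interface risk with one dated mitigation action.
plan = MitigationPlan(
    risk_id="FDCA-INT-01",
    planned_actions=[Milestone("Define interface test cases",
                               date(2007, 11, 1), date(2008, 1, 15))],
    committed_resources="Two staff from the FDCA project office",
    period_of_performance=(date(2007, 11, 1), date(2008, 3, 31)),
    last_status_update=date(2007, 12, 1),
)
print(plan.risk_id, len(plan.planned_actions), plan.last_status_update)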
			  
Project Teams Were Inconsistent in Reporting Risk Status to
Executive-Level Management

           Reviews of the project teams' risk management activities, status,
           and results should be held on a periodic and event-driven basis.
           The reviews should include appropriate levels of management, such
           as key Bureau executives, who can provide visibility into the
           potential for project risk exposure and appropriate corrective
           actions. Table 7 shows the status of the four project teams'
           implementation of activities for senior-level risk oversight at
           the time of our prior report.

           Table 7: Executive-Level Risk Oversight Activities Completed for
           the Key 2010 Decennial Systems

Specific practices: Review the activities, status, and results of the 
risk management process with executive-level management, and resolve 
issues; 
MTAIP: practice not implemented; 
FDCA: practice not implemented; 
DRIS: practice fully implemented; 
DADS: practice fully implemented. 

           Source: GAO analysis of project data.

           The project teams were inconsistent in reporting the status of
           risks to executive-level officials. DRIS and DADS II did regularly
           report risks; however, the FDCA and MTAIP projects did not provide
           sufficient evidence to document that these discussions occurred or
           what they covered. Failure to report a project's risks to
           executive-level officials reduces the visibility of risks to
           executives who should be playing a role in mitigating them.
			  
Implementation of GAO Recommendations Should Help Improve the
Bureau's Risk Management

           To help ensure that the Bureau's four key acquisitions for the
           2010 Census operate as intended, we made several recommendations
           in our report. First, to ensure that the Bureau's decennial
           systems are fully tested, we recommended that the Secretary of
           Commerce require the Director of the Census Bureau to direct the
           Decennial Management Division and Geography Division to plan for
           and perform end-to-end testing so that the full complement of
           systems is tested in a census-like environment.
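
           The following minimal sketch (Python, using hypothetical stand-in
           components rather than the Bureau's actual systems or test plans)
           illustrates what end-to-end testing adds over component-level
           testing: a single census-like scenario is passed across every
           interface in the chain, so a break at any handoff surfaces in one
           test.

# Illustrative sketch only: the three functions are stand-ins for a chain of
# interdependent systems, not the Bureau's software.
def collect(scenario):          # stand-in for field data collection
    return {"responses": scenario["households"]}

def integrate(captured):        # stand-in for response integration
    return {"records": captured["responses"], "validated": True}

def tabulate(integrated):       # stand-in for tabulation and dissemination
    return {"count": len(integrated["records"])}

def test_end_to_end():
    scenario = {"households": ["HH-001", "HH-002", "HH-003"]}
    result = tabulate(integrate(collect(scenario)))
    # The assertion spans every interface: a break anywhere in the chain
    # fails this single end-to-end check.
    assert result["count"] == len(scenario["households"])

test_end_to_end()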

           In written comments on a draft of our final report, the department
           disagreed with our findings that a full complement of systems
           would not be tested, stating it plans to do so during the Dress
           Rehearsal or later. Nonetheless, the Bureau's test plans have not
           been finalized, and it remains unclear whether testing will
           address all interrelated systems and functionality in a
           census-like environment, as would be provided by end-to-end
           testing. Consistent with our recommendation, following up with
           documented test plans for end-to-end testing will help ensure
           that decennial systems will work as intended.

           Further, we recommended that the Secretary direct the Director of
           the Census Bureau to ensure that project teams strengthen risk
           management activities associated with risk identification,
           mitigation, and oversight. The department agreed to examine
           additional ways to manage risks and is working on an action plan
           to strengthen risk management activities.

           In summary, the IT acquisitions planned for the 2010 Census will
           require continued oversight to ensure that they are achieved on
           schedule and at planned cost levels. As of October 2007, the
           MTAIP and DRIS acquisitions were meeting cost estimates, but
           FDCA was not. In addition, while the Bureau was making
           progress developing systems for the Dress Rehearsal, it was
           deferring certain functionality, with the result that the Dress
           Rehearsal operational testing would address less than a full
           complement of systems. Delaying functionality increases the
           importance of later development and testing activities, which will
           have to occur closer to the census date. It also raises the risk
           of cost increases, given the immovable deadline for conducting the
           2010 Census.

           Further, the Bureau's project teams for each of the four
           acquisitions had implemented many practices associated with
           establishing sound and capable risk management processes, but they
           were not always consistent: the teams had not always identified
           risks, developed complete risk mitigation plans, or briefed
           senior-level officials on risks and mitigation plans. At this
           stage, we are particularly concerned about managing the risks
           associated with the handheld mobile computing devices, the
           numerous systems interfaces, and the remaining systems testing.
           Regarding the handheld mobile computing devices, it is critical
           that the performance of these devices be clearly specified and
           measured and that any deficiencies in performance be effectively
           addressed.
           Until the project teams and the Decennial Management Division
           implement appropriate risk management activities, they face an
           increased probability that decennial systems will not be delivered
           on schedule and within budget or perform as expected.
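
           As a simple illustration of specifying and measuring device
           performance, the sketch below (Python; the 60-second transmission
           limit and 2 percent failure-rate limit are hypothetical values,
           not Bureau requirements) compares measured transmission samples
           against a stated specification and reports whether it is met.

# Illustrative sketch only: thresholds and metric names are assumptions.
from statistics import median

SPEC_MAX_TRANSMISSION_SECONDS = 60   # hypothetical requirement
SPEC_MAX_FAILURE_RATE = 0.02         # hypothetical requirement

def evaluate_transmissions(samples):
    """samples: list of (duration_seconds, succeeded) tuples from devices."""
    durations = [seconds for seconds, ok in samples if ok]
    failures = sum(1 for _, ok in samples if not ok)
    report = {
        "median_seconds": median(durations) if durations else None,
        "failure_rate": failures / len(samples) if samples else 0.0,
    }
    report["meets_spec"] = (
        report["median_seconds"] is not None
        and report["median_seconds"] <= SPEC_MAX_TRANSMISSION_SECONDS
        and report["failure_rate"] <= SPEC_MAX_FAILURE_RATE
    )
    return report

# Example with three hypothetical samples, one of which failed.
print(evaluate_transmissions([(45, True), (120, True), (30, False)]))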

           Mr. Chairman and members of the subcommittee, this concludes our
           statement. We would be happy to respond to any questions that you
           or members of the subcommittee may have at this time.

           If you have any questions on matters discussed in this testimony,
           please contact David A. Powner at (202) 512-9286 or Mathew Scire
           at (202) 512-6806 or by e-mail at [24][email protected] or
           [25][email protected]. Other key contributors to this testimony
           include Mathew Bader, Thomas Beall, Jeffrey DeMarco, Richard Hung,
           Barbara Lancaster, Andrea Levine, Signora May, Cynthia Scott, Niti
           Tandon, Amos Tevelow, Jonathan Ticehurst, and Timothy Wexler.
			  
Appendix I: Key 2010 Census Information Technology Acquisitions: 

IT acquisition: MAF/TIGER Accuracy Improvement Project (MTAIP); 
Contractor: Harris Corporation; 
Purpose: Modernize the system that provides the address list, maps, and 
other geographic support services for the Census and other Bureau 
surveys; 
Contract type: Cost plus award fee; 
Contract award: June 2002. 

IT acquisition: Field Data Collection Automation (FDCA); 
Contractor: Harris Corporation; 
Purpose: Provide automated resources for supporting field data 
collection, including the provision of handheld mobile computing 
devices to collect data in the field, including address and map data; 
Contract type: Cost plus award fee with some firm fixed price elements; 
Contract award: March 2006. 

IT acquisition: Decennial Response Integration System (DRIS); 
Contractor: Lockheed Martin Corporation; 
Purpose: Provide a solution for data capture and respondent assistance; 
Contract type: Cost plus award fee with some firm fixed price elements; 
Contract award: October 2005. 

IT acquisition: Data Access and Dissemination System (DADS II); 
Contractor: IBM; 
Purpose: Develop a replacement for the DADS legacy tabulation and 
dissemination systems; 
Contract type: To be determined; 
Contract award: September 2007. 

Source: GAO analysis of Census Bureau data. 		

           GAO's Mission	  

           The Government Accountability Office, the audit, evaluation, and
           investigative arm of Congress, exists to support Congress in
           meeting its constitutional responsibilities and to help improve
           the performance and accountability of the federal government for
           the American people. GAO examines the use of public funds;
           evaluates federal programs and policies; and provides analyses,
           recommendations, and other assistance to help Congress make
           informed oversight, policy, and funding decisions. GAO's
           commitment to good government is reflected in its core values of
           accountability, integrity, and reliability.
			  
			  Obtaining Copies of GAO Reports and Testimony

           The fastest and easiest way to obtain copies of GAO documents at
           no cost is through GAO's Web site ( [26]www.gao.gov ). Each
           weekday, GAO posts newly released reports, testimony, and
           correspondence on its Web site. To have GAO e-mail you a list of
           newly posted products every afternoon, go to [27]www.gao.gov and
           select "E-mail Updates."
			  
			  Order by Mail or Phone

           The first copy of each printed report is free. Additional copies
           are $2 each. A check or money order should be made out to the
           Superintendent of Documents. GAO also accepts VISA and Mastercard.
           Orders for 100 or more copies mailed to a single address are
           discounted 25 percent. Orders should be sent to:

           U.S. Government Accountability Office 441 G Street NW, Room LM
           Washington, DC 20548

           To order by Phone: Voice: (202) 512-6000
			  TDD: (202) 512-2537
			  Fax: (202) 512-6061
			  
			  To Report Fraud, Waste, and Abuse in Federal Programs

           Contact:

           Web site: [28]www.gao.gov/fraudnet/fraudnet.htm
			  E-mail: [29][email protected]
			  Automated answering system: (800) 424-5454 or (202) 512-7470
			  
			  Congressional Relations

           Gloria Jarmon, Managing Director, [30][email protected] , (202)
           512-4400 U.S. Government Accountability Office, 441 G Street NW,
           Room 7125 Washington, DC 20548
			  
			  Public Affairs

           Chuck Young, Managing Director, [31][email protected] , (202)
           512-4800 U.S. Government Accountability Office, 441 G Street NW,
           Room 7149 Washington, DC 20548


This is a work of the U.S. government and is not subject to copyright
protection in the United States. The published product may be reproduced
and distributed in its entirety without further permission from GAO.
However, because this work may contain copyrighted images or other
material, permission from the copyright holder may be necessary if you
wish to reproduce this material separately.

To view the full product, including the scope
and methodology, click on [37]GAO-08-259T .

For more information, contact David A. Powner at (202) 512-9286 or
[email protected].

Highlights of [38]GAO-08-259T , a testimony before the Subcommittee on
Information Policy, Census, and National Archives, Committee on Oversight
and Government Reform, U.S. House of Representatives

December 11, 2007

INFORMATION TECHNOLOGY

Census Bureau Needs to Improve Its Risk Management of Decennial Systems

For Census 2010, automation and information technology (IT) are expected
to play a critical role. The Census Bureau plans to spend about $3 billion
on automation and technology that are to improve the accuracy and
efficiency of census collection, processing, and dissemination. From
February 2006 through June 2009, the Bureau is holding a ``Dress
Rehearsal'' during which it plans to conduct operational testing that
includes decennial systems acquisitions.

In October 2007, GAO reported on its review of four key 2010 Census IT
acquisitions to (1) determine the status and plans, including schedule and
cost, and (2) assess whether the Bureau is adequately managing associated
risks. This testimony summarizes GAO's report on these key acquisitions
and describes GAO's preliminary observations on the performance of
handheld mobile computing devices used during the Dress Rehearsal.

[39]What GAO Recommends

In its report, GAO made recommendations that the Bureau strengthen its
systems testing and risk management activities, including risk
identification and oversight. The Bureau agreed to examine additional ways
to manage risks, but disagreed with the view that a full complement of
systems would not be tested, stating it planned to do so during the Dress
Rehearsal or later; however, the test plans have not been finalized and it
remains unclear whether this testing will be done.

As of October 2007, three key systems acquisitions for the 2010 Census
were in process, and a fourth contract had recently been awarded. The
ongoing acquisitions showed mixed progress in meeting schedule and cost
estimates. Two of the projects were not on schedule. The fourth contract,
originally scheduled to be awarded in 2005, was awarded in September
2007. In addition, one project had incurred cost overruns and increases to
its projected life-cycle cost. As a result of the schedule changes, the
full complement of systems and functionality that were originally planned
will not be available for upcoming Dress Rehearsal operational testing.
This limitation increases the importance of further system testing to
ensure that the decennial systems work as intended.

The Bureau's project teams for each of the four IT acquisitions had
performed many practices associated with establishing sound and capable
risk management processes, but critical weaknesses remained. Three project
teams had developed a risk management strategy that identified the scope
of the risk management effort. However, not all project teams had
identified risks, established mitigation plans, or reported risks to
executive-level officials. For example, one project team did not
adequately identify risks associated with performance issues experienced
by handheld mobile computing devices, even though Census field staff
reported slow and inconsistent data transmissions with the device during
the spring Dress Rehearsal operations. The magnitude of these difficulties
is not clear, and the Bureau has not fully specified how it plans to
measure the performance of the devices. Until the project teams implement
key risk management activities, they face an increased probability that
decennial systems will not be delivered on schedule and within budget or
perform as expected.

Performance of Risk Management Activities by Key Census Acquisition
Projects

Legend (figure symbols not reproduced in this text version): practice fully
implemented; practice partially implemented; practice not implemented.

Source: GAO analysis of Census project data against industry standards.

References

Visible links
  22. http://www.gao.gov/cgi-bin/getrpt?GAO-08-79
  23. http://www.gao.gov/cgi-bin/getrpt?GAO-06-444T
  24. mailto:[email protected]
  25. mailto:[email protected]
  26. http://www.gao.gov/
  27. http://www.gao.gov/
  28. http://www.gao.gov/fraudnet/fraudnet.htm
  29. mailto:[email protected]
  30. mailto:[email protected]
  31. mailto:[email protected]
  32. http://www.gao.gov/cgi-bin/getrpt?GAO-06-444T
  33. http://www.gao.gov/cgi-bin/getrpt?GAO-05-661
  34. http://www.gao.gov/cgi-bin/getrpt?GAO-06-444T
  35. http://www.gao.gov/cgi-bin/getrpt?GAO-05-9
  36. http://www.gao.gov/cgi-bin/getrpt?GAO-06-444T
  37. http://www.gao.gov/cgi-bin/getrpt?GAO-08-259T
  38. http://www.gao.gov/cgi-bin/getrpt?GAO-08-259T
*** End of document. ***