Yucca Mountain: DOE's Planned Nuclear Waste Repository Faces	 
Quality Assurance and Management Challenges (25-APR-06, 	 
GAO-06-550T).							 
                                                                 
The Department of Energy (DOE) is working to obtain a license	 
from the Nuclear Regulatory Commission (NRC) to construct a	 
nuclear waste repository at Yucca Mountain in Nevada. The	 
project, which began in the 1980s, has been beset by delays. In  
2004, GAO raised concerns that persistent quality assurance	 
problems could further delay the project. Then, in 2005, DOE	 
announced discovery of employee e-mails suggesting quality	 
assurance problems. Quality assurance, which establishes	 
requirements for work to be performed under controlled conditions
that ensure quality, is critical to making sure the project meets
standards for protecting public health and the environment. This 
testimony, which summarizes GAO's March 2006 report (GAO-06-313),
provides information on (1) the history of the project's quality 
assurance problems, (2) DOE's tracking of these problems and	 
efforts to address them since GAO's 2004 report, and (3)	 
challenges facing DOE as it continues to address quality	 
assurance issues within the project.				 
-------------------------Indexing Terms------------------------- 
REPORTNUM:   GAO-06-550T					        
    ACCNO:   A52421						        
  TITLE:     Yucca Mountain: DOE's Planned Nuclear Waste Repository   
Faces Quality Assurance and Management Challenges		 
     DATE:   04/25/2006 
  SUBJECT:   Accountability					 
	     Licenses						 
	     Nuclear waste disposal				 
	     Nuclear waste management				 
	     Nuclear waste storage				 
	     Performance measures				 
	     Program evaluation 				 
	     Program management 				 
	     Quality assurance					 
	     Schedule slippages 				 
	     DOE Yucca Mountain Project (NV)			 
	     Yucca Mountain (NV)				 


     

     * Background
     * DOE Has a Long History of Quality Assurance Problems at Yucca
       Mountain
     * DOE Cannot Be Certain Its Efforts to Improve Quality Assurance Have
       Been Effective Because of Weaknesses in Tracking Progress and
       Identifying Problems
     * DOE's New Path Forward to Submitting a License Application Faces
       Substantial Challenges
     * Conclusions
     * GAO Contact and Staff Acknowledgments
     * GAO's Mission
     * Obtaining Copies of GAO Reports and Testimony
          * Order by Mail or Phone
     * To Report Fraud, Waste, and Abuse in Federal Programs
     * Congressional Relations
     * Public Affairs

Testimony

Before the Subcommittee on the Federal Workforce and Agency Organization,
Committee on Government Reform, House of Representatives

United States Government Accountability Office

GAO

For Release on Delivery Expected at 2:00 p.m. EDT

Tuesday, April 25, 2006

YUCCA MOUNTAIN

DOE's Planned Nuclear Waste Repository Faces Quality Assurance and
Management Challenges

Statement of Jim Wells, Director, Natural Resources and Environment

GAO-06-550T

Mr. Chairman and Members of the Subcommittee:

I am pleased to be here today to discuss our work concerning quality
assurance and other management challenges facing the Department of Energy
(DOE) as it prepares to construct a deep geological repository at Yucca
Mountain in Nevada for the nation's nuclear wastes. My testimony is based
on our March 2006 report entitled Yucca Mountain: Quality Assurance at
DOE's Planned Nuclear Waste Repository Needs Increased Management
Attention.1

DOE is preparing an application for a license from the Nuclear Regulatory
Commission (NRC) to construct an underground geological repository at
Yucca Mountain for the permanent storage of highly radioactive nuclear
waste. Nuclear waste is a by-product of the production of nuclear power,
which provides about 20 percent of U.S. electricity. About 50,000 metric
tons of nuclear waste are stored at 72 sites around the country,
principally at commercial nuclear power plants. These wastes have been
accumulating for several decades in surface storage designed to be
temporary. The Nuclear Waste Policy Act of 1982 required DOE to construct
a repository for permanent storage and to begin accepting these wastes by
January 31, 1998. In 2002, after more than 15 years of scientific study,
the President recommended and the Congress approved Yucca Mountain as a
suitable location for the repository. However, DOE continues to encounter
delays, and it is not certain when it will apply for the license to
construct the repository.

The licensing process requires DOE to demonstrate to NRC that its plans
for the repository will meet Environmental Protection Agency standards for
protecting public health and the environment from harmful exposure to the
radioactive waste. To show that it can meet these standards, DOE has been
conducting scientific and technical studies at the Yucca Mountain site
that will provide supporting documentation for its planned license
application. DOE has also established a quality assurance program to meet
NRC requirements and ensure that its work and the technical information it
produces are accurate and defensible. To accomplish this goal, the program
established procedures that require scientific, design, engineering,
procurement, record keeping, and other work to be performed under
controlled conditions that ensure quality and enable the work to be
verified by others. However, persistent problems implementing these
procedures and resulting questions about the quality of the work are
significantly contributing to delays in DOE's submission of the license
application. Resolving these quality assurance issues is essential to
proceeding with construction.

1 GAO-06-313 (Washington, D.C.: Mar. 17, 2006).

In April 2004, we reported that recurring quality assurance problems at
the Yucca Mountain project could delay the licensing and operation of the
repository. At that time, we also reported that DOE had completed
efforts-known as Management Improvement Initiatives (Initiatives)-to
better manage quality assurance problems, but could not assess their
effectiveness because its performance goals lacked objective measures and
time frames for determining success.2 Then, in early 2005, DOE reported
that it had discovered a series of e-mail messages written in the late
1990s by U.S. Geological Survey (USGS) employees working on the Yucca
Mountain project under a contract with DOE that appeared to imply that
workers had falsified records for scientific work. Several of these
messages appeared to show disdain for the project's quality assurance
program and its requirements. In October 2005, DOE began planning an
aggressive series of changes-known as the "new path forward"-to the
facility design, organization, and management of the Yucca Mountain
project. These efforts are intended to address quality assurance and other
challenges, including those associated with the USGS e-mails, and advance
the license application process. However, in December 2005 and again in
February 2006, some project work was stopped due to continuing quality
assurance problems.

Our March 2006 report examined (1) the history of the project's quality
assurance problems since its start in the 1980s, (2) DOE's tracking of
quality problems and progress implementing quality assurance requirements
since our April 2004 report, and (3) challenges that DOE faces as it
continues to address quality assurance issues with the project. To
determine the history of quality assurance problems, we reviewed previous
GAO, DOE, and NRC documents, visited the Yucca Mountain site, and
interviewed officials from DOE, NRC, and Bechtel/SAIC Company, LLC (BSC),
which is DOE's management contractor for the Yucca Mountain project. To
assess DOE's tracking of quality-related problems and progress in
addressing them, we examined management tools and associated
documentation, and interviewed BSC and DOE officials regarding those
tools. To identify current quality assurance and other challenges, we
attended quarterly NRC management meetings, interviewed the Acting
Director and other senior managers of the DOE project, and gathered
information on management turnover. The work on our report was conducted
from July 2005 through January 2006 in accordance with generally accepted
government auditing standards.

2 GAO, Yucca Mountain: Persistent Quality Assurance Problems Could Delay
Repository Licensing and Operation, GAO-04-460 (Washington, D.C.: Apr. 30,
2004).

In summary, we found the following:

           o  DOE has had a long history of quality assurance problems at the
           Yucca Mountain project. In the late 1980s and early 1990s, DOE had
           problems assuring NRC that it had developed adequate plans and
           procedures related to quality assurance. As we reported in 1988,
           NRC had found that DOE's quality assurance procedures were
           inadequate and its efforts to independently identify and resolve
           weaknesses in the procedures were ineffective. By the late 1990s,
           DOE had largely addressed NRC's concerns about its plans and
           procedures, but its own audits identified quality assurance
           problems with the data, software, and models used in the
           scientific work supporting its potential license application. For
           example, in 1998, a team of project personnel determined that 87
           percent of the models used to simulate the site's natural and
           environmental conditions, and to demonstrate the future
           repository's performance over time, did not comply with
           requirements for demonstrating their accuracy in predicting
           geologic events. More recently, DOE has relied on costly and
           time-consuming rework to resolve lingering quality assurance
           concerns. Specifically, in the spring of 2004, DOE implemented a
           roughly $20 million, 8-month project called the Regulatory
           Integration Team to ensure that scientific work was sufficiently
           documented and explained to support the license application. This
           effort involved about 150 full-time employees from DOE, USGS, and
           multiple national laboratories, such as Sandia and Los Alamos,
           working to inspect technical documents to identify and resolve
           quality problems.
           o  DOE cannot be certain that its efforts to improve quality
           assurance have been effective because the management tools it
           adopted did not target existing management concerns and did not
           track progress in addressing significant and recurring problems.
           DOE announced in 2004 that it was making a commitment to
           continuous quality assurance improvement and that its efforts
           would be tracked by performance indicators that would enable it to
           assess progress and direct management attention as needed;
           however, its management tools have not been effective for this
           purpose. Specifically, its one-page summary, or "panel," of
           selected performance indicators that project managers used in
           monthly management meetings was not an effective tool for
           assessing progress because the indicators poorly represented the
           major management concerns and were changed frequently. For
           example, the panel did not include an indicator to represent the
           management concern about unclear roles and responsibilities-a
           problem that could undermine accountability within the project.
           Use of the indicator panel was discontinued in late 2005, and DOE
           is deciding on a tool to replace it. Moreover, a second management
           tool-trend evaluation reports-also did not track relevant
           concerns. The reports generally had technical weaknesses in
           identifying recurrent and significant problems and inconsistently
           tracked progress toward resolving the problems. For example,
           lacking reliable data and an appropriate performance benchmark for
           determining the significance of human errors as a cause of quality
           problems, DOE's trend reports offered no clear basis for tracking
           progress on such problems. In addition, under the trend reports'
           rating categories, the rating assigned to convey the significance
           of a problem was overly influenced by a judgment in the report
           that there were already ongoing management actions to address the
           problem, rather than solely assessing the problem's significance.
           For example, the trend report's rating of one particular problem
           at the lowest level of significance did not accurately describe
           the problem or sufficiently draw management's attention to it.
           o  DOE's aggressive "new path forward" effort faces substantial
           quality assurance and other challenges, as it prepares to submit
           the license application to construct the repository at Yucca
           Mountain. First, the March 2005 announcement of the discovery of
           USGS e-mails suggesting the possible falsification of quality
           assurance records has resulted in extensive efforts to restore
           confidence in scientific documents, and DOE is conducting a
           wide-ranging review of approximately 14 million e-mails to
           determine whether they raise additional quality assurance issues.
           Such a review creates a challenge not just because of the sheer
           volume of e-mails to be reviewed, but also because DOE will have
           to decipher their meaning and determine their significance,
           sometimes without clarification from authors who have left the
           project. Furthermore, if any of the e-mails raise quality
           assurance concerns, further review, inspection, or additional work
           may need to be performed. Second, DOE faces quality assurance
           challenges associated with an inadequate requirements management
           process-the process responsible for ensuring that broad plans and
           regulatory requirements affecting the project are tracked and
           incorporated into specific engineering details. In December 2005,
           DOE issued a stop-work order on some design and engineering work
           until it can determine whether the requirements management process
           has been improved. Third, DOE continues to be challenged by
           managing a changing and complex program and organization. The
           significant project changes initiated under the new path forward
           create the potential for confusion over accountability as roles
           and responsibilities change-a situation DOE found to contribute to
           quality assurance problems during an earlier transition period.
           For example, one proposed reorganization-establishing a lead
           laboratory to assist the project-would not only have to be
           effectively managed, but also would introduce a new player whose
           accountability DOE would have to ensure. DOE has also experienced
           turnover in 9 of 17 key management positions since 2001-including
           positions related to quality assurance-which has created management
           continuity challenges. For example, three individuals have
           directed the project since 1999, and the position is currently
           occupied by an acting director. Since DOE is still formulating its
           plans, it is too early to determine whether its new path forward
           effort will resolve these challenges.

           In our report, we recommend that DOE strengthen its management
           tools by (1) improving the tools' coverage of the Initiatives'
           areas of concern, (2) basing the tools on projectwide analysis of
           problems, (3) establishing quality guidelines, (4) making
           indicators and analyses more consistent over time, and (5)
           focusing rating categories on problem significance rather than a
           judgment on the need for management action. In commenting on the
           report, DOE agreed with our recommendations.

                                   Background

           The Congress enacted the Nuclear Waste Policy Act of 1982 to
           establish a comprehensive policy and program for the safe,
           permanent disposal of commercial spent nuclear fuel and other
           highly radioactive wastes in one or more mined geologic
           repositories. The act charged DOE with (1) establishing criteria
           for recommending sites for repositories; (2) "characterizing"
           (investigating) three sites to determine each site's suitability
           for a repository (1987 amendments to the act directed DOE to
           investigate only the Yucca Mountain site); (3) recommending one
           suitable site to the President, who would submit a recommendation
           of such site to the Congress if he considered the site qualified
           for a license application; and (4) upon approval of a recommended
           site, seeking a license from NRC to construct and operate a
           repository at the site. The Yucca Mountain project is currently
           focused on preparing an application for a license from NRC to
           construct a repository. DOE is compiling information and writing
           sections of the license application, conducting technical
           exchanges with NRC staff, and addressing key technical issues
           identified by NRC to ensure that sufficient supporting information
           is provided.

           In February 2005, DOE announced that it does not expect the
           repository to open until 2012 at the earliest, which is more than
           14 years later than the 1998 goal specified by the Nuclear Waste
           Policy Act of 1982. More recently, the conference report for DOE's
           fiscal year 2006 appropriations observed that additional
           significant delays to submitting a license application are likely.
           In October 2005, the project's Acting Director issued a memorandum
           calling for the development of wide-ranging plans for the "new
           path forward" to submitting the license application. The plans
           address the need to review and replace USGS work products,
           establish a lead national laboratory to assist the project, and
           develop a new simplified design for the waste canisters and
           repository facilities, among other things. In addition, DOE
           announced, in April 2006, that it was proposing legislation
           intended to accelerate licensing and operations. For example, the
           legislation provides that if NRC authorizes the repository,
           subsequent licensing actions would be conducted using expedited,
           simplified procedures.

           Given the delays, the Congress has considered other options for
           managing existing and future nuclear wastes, such as centralized
           interim storage at one or more DOE sites. In addition, the
           conference report for DOE's fiscal year 2006 appropriations
           directed DOE to develop a spent nuclear fuel recycling plan to
           reuse the spent fuel. However, according to the Nuclear Energy
           Institute, which represents the nuclear energy industry, none of
           the technological options being considered will eliminate the need
           to
           ultimately dispose of nuclear waste in a geologic repository.

     DOE Has a Long History of Quality Assurance Problems at Yucca Mountain

           DOE has had a long history of quality assurance problems at the
           Yucca Mountain project. In the project's early stages, DOE had
           problems assuring NRC that it had developed adequate quality
           assurance plans and procedures. By the late 1990s, DOE had largely
           addressed NRC's concerns about its plans and procedures, but its
           own audits identified quality assurance problems with the data,
           software, and models used in the scientific work supporting its
           potential license application. Although DOE has largely resolved
           these quality problems, it is now relying on costly and
           time-consuming rework to ensure the traceability and transparency
           of several
           technical work products that are key components of the license
           application.

           As we reported in 1988, NRC reviewed DOE's quality assurance
           program for the Yucca Mountain project and concluded that it did
           not meet NRC requirements3 and that DOE's quality assurance audits
           were ineffective. In 1989, NRC concluded that DOE and its key
           contractors had yet to develop and implement an acceptable quality
           assurance program. However, by March 1992, NRC determined that DOE
           had made significant progress in improving its quality assurance
           program, noting, among other things, that all of the contractor
           organizations had developed and were in the process of
           implementing quality assurance programs that met NRC requirements,
           and that DOE had demonstrated its ability to evaluate and correct
           deficiencies in the overall quality assurance program.

3 GAO, Nuclear Waste: Repository Work Should Not Proceed Until Quality
Assurance Is Adequate, GAO/RCED-88-159 (Washington, D.C.: Sept. 29, 1988).

           By the late 1990s, however, the DOE quality assurance program
           began detecting new quality problems in three areas critical to
           demonstrating the repository's successful performance over time:
           data management, software management, and scientific models.

           o  Data management. In 1998, DOE identified quality assurance
           problems with the quality and traceability of data, specifically
           that some data had not been properly collected or tested to ensure
           its accuracy and that data used to support scientific analysis
           could not be properly traced back to its source. DOE found similar
           problems in April and September 2003.
           o  Software management. DOE quality assurance procedures require
           that software used to support analysis and conclusions about the
           performance and safety of the repository be tested or created in
           such a way to ensure that it is reliable. From 1998 to 2003,
           multiple DOE audits found recurring quality assurance problems
           that could affect confidence in the adequacy of software.
           o  Model validation. In 1998, a team of project personnel
           evaluated the mathematical models used to simulate natural and
           environmental conditions and determined that 87 percent of them
           did not comply with validation requirements to ensure they
           accurately predict geologic events. In 2001, and again in 2003,
           DOE audits found that project personnel were not properly
           following procedures, specifically in the areas of model
           documentation, model validation, and checking and review. Further,
           the 2003 audit concluded that previous corrective actions designed
           to improve validation and reduce errors in model reports were not
           fully implemented.

           After many years of working to address these quality assurance
           problems with data, software, and models, DOE had mostly resolved
           these problems by February 2005.

           As DOE prepares to submit the Yucca Mountain project license
           application to NRC, it is relying on costly and time-consuming
           rework to ensure that the documents supporting its license
           application are accurate and complete. Although the department had
           known for years about quality assurance problems with the
           traceability and transparency of technical work products called
           Analysis and Model Reports (AMR)-a key component of the license
           application-DOE did not initiate a major effort to address these
           problems until 2004. AMRs contain the scientific analysis and
           modeling data that demonstrate the safety and performance of the
           planned repository and, among other quality requirements, must be
           traceable to their original source material and data and be
           transparent in justifying and explaining their underlying
           assumptions, calculations, and conclusions. In 2003, based in part
           on these problems, as well as DOE's long-standing problems with
           data, software, and modeling, NRC conducted an independent
           evaluation of three AMRs to determine if they met NRC requirements
           for being traceable, transparent, and technically appropriate for
           their use in the license application. In all three AMRs, NRC found
           significant problems with both traceability and transparency.4 NRC
           concluded that these findings suggested that other AMRs may have
           similar problems and that such problems could delay NRC's review
           of the license application, as it would need to conduct special
           inspections to resolve any problems it found with the quality of
           technical information.

4 U.S. Nuclear Regulatory Commission, U.S. Nuclear Regulatory Commission
Staff Evaluation of U.S. Department of Energy Analysis Model Reports, Process
Controls, and Corrective Actions (Washington, D.C., Apr. 7, 2004).

           To address problems of traceability and transparency, DOE
           initiated an effort in the spring of 2004 called the Regulatory
           Integration Team (RIT) to perform a comprehensive inspection and
           rework of the AMRs and ensure they met NRC requirements and
           expectations.5 According to DOE officials, the RIT involved
           roughly 150 full-time personnel from DOE, USGS, and multiple
           national laboratories such as Sandia, Los Alamos, and Lawrence
           Livermore. The RIT decided that 89 of the approximately 110 AMRs
           needed rework. According to DOE officials, the RIT addressed or
           corrected over 3,700 problems, and was completed approximately 8
           months later at a cost of about $20 million. In a February 2005
           letter to DOE, the site contractor stated that the RIT effort had
           successfully improved the AMRs' traceability and transparency.

5 In addition, the RIT edited the AMRs to assure consistency and ease of
technical and regulatory reviews.

           Subsequently, however, DOE identified additional problems with
           traceability and transparency that required further inspections
           and rework. DOE initiated a review of additional AMRs that were
           not included in the scope of the 2004 RIT review after a March
           2005 discovery of e-mails from USGS employees written between May
           1998 and March 2000 implying that employees had falsified
           documentation of their work to avoid quality assurance standards.
           These additional AMRs contained scientific work performed by the
           USGS employees and had been assumed by the RIT to meet NRC
           requirements for traceability and transparency. However, according
           to DOE officials, DOE's review determined that these AMRs did not
           meet NRC's standards, and rework was required. DOE identified
           similar problems as the focus of the project shifted to the design
           and engineering work required for the license application. In
           February 2005, the site contractor determined that, in addition to
           problems with AMRs, similar traceability and transparency problems
           existed in the design and engineering documents that constitute
           the Safety Analysis Report-the report necessary to demonstrate to
           NRC that the repository site will meet the project's health,
           safety, and environmental goals and objectives. In an analysis of
           this problem, the site contractor noted that additional resources
           were needed to inspect and rework the documents to correct the
           problems.

    DOE Cannot Be Certain Its Efforts to Improve Quality Assurance Have Been
 Effective Because of Weaknesses in Tracking Progress and Identifying Problems

           DOE's management tools for the Yucca Mountain project have not
           enabled it to effectively identify and track progress in
           addressing significant and recurring quality assurance problems.
           Specifically, its panel, or one-page summary, of selected
           performance indicators did not highlight the areas of management
           concern covered by its Management Improvement Initiatives
           (Initiatives) and had weaknesses in assessing progress because the
           indicators kept changing. Its trend reports also did not focus on
           tracking these management concerns, had technical weaknesses in
           identifying significant and recurrent problems, and inconsistently
           tracked progress in resolving problems. Furthermore, the
           trend reports have sometimes been misleading as to the
           significance of the problems being presented because their
           significance ratings tend to be lower if corrective actions were
           already being taken, without considering the effectiveness of the
           actions or the problem's importance to the project.

           In April 2004, DOE told us it expected that the progress achieved
           with its Initiatives for improving quality assurance would
           continue and that its performance indicators would enable it to
           assess further progress and direct management attention as needed.
           By that time, the actions called for by the Initiatives had been
           completed and project management had already developed a "panel"
           of indicators to use at monthly management meetings to monitor
           project performance. The panel was a single page composed of
           colored blocks representing selected performance indicators and
           their rating or level of performance. For example, a red block
           indicated degraded or adverse performance warranting significant
           management attention, a yellow block indicated performance
           warranting increased management attention or acceptable
           performance that could change for the worse, and a green block
           indicated good performance. The panel represented a hierarchy of
           indicators where the highest-level indicators were visible, but
           many lower-level indicators that determined the ratings of the
           visible indicators were not shown. Our review analyzed a subset of
           these indicators that DOE designated as the best predictors in
           areas affecting quality.

           We found that the panel was not effective for assessing continued
           progress because its indicators poorly represented the management
           concerns identified by the Initiatives. The Initiatives had raised
           concerns about five key areas of management weakness as adversely
           affecting the implementation of quality assurance requirements,
           and had designated effectiveness indicators for these areas.
           (These areas of concern are described in app. I.) However, two of
           the Initiatives' five key areas of concern-roles and
           responsibilities as well as work procedures-were not represented
           in the panel's visible or underlying indicators. In other cases,
           the Initiatives' effectiveness indicators were represented in
           underlying lower-level indicators that had very little impact on
           the rating of the visible indicator. For example, the Initiatives'
           indicator for timely completion of employee concerns was
           represented by two lower-level indicators that together
           contributed 3 percent of the rating for an indicator visible in
           the panel.
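           To make the rollup mechanics concrete, the following is a minimal
           sketch, in Python, of how a hierarchical panel of this kind can
           roll weighted lower-level indicator scores up into a visible
           color rating. The indicator names, weights, scores, and
           thresholds are hypothetical; our report does not give the
           project's actual formula. The sketch shows why lower-level
           indicators that together carry only 3 percent of the weight can
           collapse without visibly changing the panel.

               # Minimal sketch of a weighted indicator rollup; all names and
               # numbers are hypothetical.

               def rollup(children):
                   """Weighted average of (score, weight) pairs; weights sum to 1.0."""
                   return sum(score * weight for score, weight in children)

               def color(score, red_below=60, yellow_below=85):
                   """Map a 0-100 score to a red/yellow/green panel rating."""
                   if score < red_below:
                       return "red"     # degraded or adverse performance
                   if score < yellow_below:
                       return "yellow"  # warrants increased attention
                   return "green"       # good performance

               # Lower-level indicators feeding one visible indicator; the two
               # employee-concerns items together carry 3 percent of the weight.
               children = [
                   (90, 0.50),   # e.g., corrective actions closed on schedule
                   (88, 0.47),   # e.g., audit findings resolved on time
                   (10, 0.015),  # employee concerns completed on time (part 1)
                   (10, 0.015),  # employee concerns completed on time (part 2)
               ]

               score = rollup(children)
               print(round(score, 1), color(score))
               # Prints "86.7 green": near-total failure on the 3-percent items
               # still leaves the visible block green.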

           Another shortcoming of the panel was that frequent changes to the
           indicators hindered the ability to identify problems for
           management attention and track progress in resolving them. The
           indicators could change in many ways, such as how they were
           defined or calculated. Such changes made it difficult to measure
           progress because changes in indicator ratings could reflect only
           the changes in the indicators rather than actual performance
           changes. Some of the indicators tracking quality issues changed
           from one to five times during the 8-month period from April 2004
           through November 2004. Even after a major revision of the panel in
           early 2005, most of the performance indicators tracking quality
           issues continued to change over the next 6 months-that is, from
           March 2005 through August 2005. Only one of the five indicators
           tracking quality issues did not change during this period, and
           one changed four times, leaving it different in more months than
           it stayed the same. Moreover, the panel was
           not always available to track problems. It was not created for
           December 2004 through February 2005, and it has not been created
           since August 2005. In both cases, the panel was undergoing major
           revisions. In December 2005, a senior DOE official told us that
           the project would begin to measure key activities, but without use
           of the panel.

           According to DOE, a second management tool, the project's
           quarterly trend evaluation reports, captured some aspects of the
           Initiatives' areas of concern and their associated effectiveness
           indicators that were not represented in the performance
           indicators. However, the trend reports are designed more to
           identify emerging and unanticipated problems than to monitor
           progress with already identified problems, such as those addressed
           by the Initiatives. In developing these reports, trend analysts
           seek to identify patterns and trends in condition reports, which
           document problematic conditions through the project's Corrective
           Action Program. For example, analysis might reveal that most
           occurrences of a particular type of problem are associated with a
           certain organization.

           In practice, DOE missed opportunities to use trend reports to
           assess progress in the Initiatives' areas of concern. For example,
           DOE missed an opportunity to use trend reports to discuss the
           Initiatives' goal that the project's work organizations become
           more accountable for self-identifying significant problems. The
           August 2005 trend report briefly cited an evaluation of a
           condition report highlighting the low rate of self-identification
           of significant problems during the previous quarter and reported
           the evaluation's conclusion that it was not a problem warranting
           management attention. However, the trend report did not mention
           that about 35 percent of significant problems were self-identified
           during the previous quarter, while the Initiatives' goal was that
           80 percent of significant problems would be self-identified.
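           As a rough illustration of the kind of analysis involved, the
           sketch below groups hypothetical condition-report records by
           originating organization and checks the rate at which significant
           problems were self-identified against the Initiatives' 80 percent
           goal. The record fields and data are invented; they do not
           reflect the project's actual Corrective Action Program data.

               from collections import Counter

               # Hypothetical condition reports:
               # (organization, significant problem?, self-identified?)
               reports = [
                   ("Engineering", True,  True),
                   ("Engineering", True,  False),
                   ("Engineering", False, False),
                   ("Science",     True,  False),
                   ("Science",     False, True),
                   ("Licensing",   True,  False),
               ]

               # Trend question: which organization accounts for most reports?
               by_org = Counter(org for org, _, _ in reports)
               print(by_org.most_common(1))  # [('Engineering', 3)]

               # Goal tracking: share of significant problems self-identified,
               # measured against the Initiatives' 80 percent goal.
               significant = [r for r in reports if r[1]]
               rate = sum(1 for r in significant if r[2]) / len(significant)
               goal = 0.80
               print(f"self-identified: {rate:.0%} (goal {goal:.0%})")
               # Prints "self-identified: 25% (goal 80%)": well short of the goal.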

           Beyond whether they effectively track the Initiatives' areas of
           concern, trend reports generally face serious obstacles to
           adequately identifying recurrent and significant problems. For
           example, trend analysis tends to focus on the number of condition
           reports issued, but the number of reports does not necessarily
           reflect the significance of a problem. For instance, the number of
           condition reports involving requirements management decreased by
           over half from the first quarter to the second quarter of fiscal
           year 2005. However, this decrease was not a clear sign of
           progress. Not only did the number rise again in the third quarter,
           but the May 2005 trend report also noted that the number of all
           condition reports had dropped during the second quarter. According
           to the report, the volume of condition reports had been high in
           the first quarter because of reviews of various areas, including
           requirements management.
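           The count-versus-significance problem can be made concrete by
           comparing raw quarterly counts of requirements management
           condition reports with the same counts normalized by the total
           volume of condition reports. The numbers below are hypothetical;
           the trend reports give only the direction of the changes.

               # Hypothetical quarterly counts:
               # (requirements management condition reports, all condition reports)
               quarters = {
                   "FY05 Q1": (40, 400),
                   "FY05 Q2": (18, 150),  # raw count drops by more than half...
                   "FY05 Q3": (30, 280),  # ...then rises again
               }

               for quarter, (rm, total) in quarters.items():
                   print(f"{quarter}: {rm:3d} raw, {rm / total:.1%} of all reports")

               # Raw counts suggest progress in Q2, but the share of all reports
               # barely moves (10.0% -> 12.0% -> 10.7%): the Q2 drop tracked the
               # overall decline in report volume, not improvement in
               # requirements management.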

           Due, in part, to these obstacles, trend reports have not
           consistently determined the significance of problems or performed
           well in tracking progress in resolving them. For example, trend
           reports have questionably identified human performance as a
           significant problem for resolution and ineffectively tracked
           progress in resolving it because there was (1) no clearly
           appropriate or precise benchmark for performance, (2) a changing
           focus on the problem, and (3) unreliable data on cause codes. The
           February 2004 trend report identified a human performance problem
           based on Yucca Mountain project data showing the project's
           proportion of skill-based errors to all human performance errors
           was two times higher than benchmark data from the Institute of
           Nuclear Power Operations (INPO).6 Interestingly, the report
           cautioned that other comparisons with these INPO data may not be
           appropriate because of differences in the nature, complexity, and
           scope of work performed, but did not explain why this caution did
           not apply to the report's own comparison. While this comparison
           has not appeared in trend reports since May 2004, a November 2004
           trend report changed the focus of the problem to the predominance
           of human performance errors in general, rather than the
           skill-based component of these errors. (Later reports
           reinterpreted this predominance as not a problem.) The report
           cited an adverse trend based on the fact that the human
           performance cause category accounted for over half of the total
           number of causes for condition reports prepared during the
           quarter. Nevertheless, by February 2005, trend reports began
           interpreting this predominance as generally appropriate, given the
           type of work done by the project. That is, the project's work
           involves mainly human efforts and little equipment, while work at
           nuclear power plants involves more opportunities for errors caused
           by equipment. In our view, this interpretation that a predominance
           of human performance errors would be expected implies an imprecise
           benchmark for appropriate performance.

6 Skill-based errors are defined in trend reports as unintentional errors
that occur in the performance of routine tasks.
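           The benchmark comparison at issue reduces to a simple ratio test,
           sketched below with hypothetical tallies; only the roughly
           two-to-one relationship comes from the February 2004 trend
           report, which itself cautioned that comparisons with the INPO
           data may not be apt.

               # Hypothetical error tallies; only the ~2x relationship is drawn
               # from the February 2004 trend report.
               project_skill_based = 120
               project_human_performance_total = 200
               inpo_benchmark_share = 0.30  # hypothetical INPO proportion

               project_share = project_skill_based / project_human_performance_total
               ratio = project_share / inpo_benchmark_share

               print(f"project: {project_share:.0%}, benchmark: "
                     f"{inpo_benchmark_share:.0%}, ratio: {ratio:.1f}x")
               # A 2.0x ratio signals an adverse trend only if the benchmark
               # population is comparable, which is the caveat the report raised
               # for other comparisons but did not apply to its own.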

           Further, although trend reports continued to draw conclusions
           about human performance problems, the February 2005 report
           indicated that any conclusions were hard to justify because of
           data reliability problems with cause coding. For example, the
           majority of problems attributed to human performance causes are
           minor problems, such as not completing a form, that receive
           less-rigorous cause analysis. This less-rigorous analysis tends to
           reveal only individual human errors-that is, human performance
           problems-whereas more-rigorous analysis tends to reveal
           less-obvious problems with management and procedures.

           Another shortcoming of the trend reports was that their rating
           categories made it difficult to adequately determine the
           significance of some problems. Specifically, trend reports
           sometimes assigned a problem a lower significance than justified
           because corrective actions were already being taken. The rating
           categories for a problem's significance also involve an assessment
           of the need for management action. In their current formulation,
           DOE's rating categories cannot accurately represent both these
           assessments, and the designated rating category can distort one or
           the other assessment. For instance, a November 2005 trend report
           rated certain requirements management issues as a "monitoring
           trend"-defined as a small perturbation in numbers that does not
           warrant action but needs to be monitored closely. However, this
           rating did not accurately capture the report's simultaneous
           recognition that significant process problems spanned both BSC and
           DOE and the fact that the numbers and types of problems were
           consistently identified over the previous three quarters. A more
           plausible explanation for why the problem received a low
           rating is that designating the problem at any higher level of
           significance would have triggered guidelines involving the
           issuance of a condition report, which, according to the judgment
           expressed in the report, was not needed. Specifically, the report
           indicated that existing condition reports had already identified
           the problem and were evaluating and resolving it, thereby
           eliminating the need to issue a new condition report.

           However, by rating the problem at the lowest level of
           significance, the trend report did not sufficiently draw
           management's attention to the problem. At about the same time the
           trend report judged no new condition reports were necessary, a
           separate DOE investigation of requirements management resulted in
           14 new condition reports-3 at the highest level of significance
           and 8 at the second-highest level of significance. These condition
           reports requested, for instance, an analysis of the collective
           significance of the numerous existing condition reports and an
           assessment of whether the quality assurance requirement for
           complete and prompt remedial action had been met. As a result of
           the investigation and a concurrent DOE root cause analysis,7 DOE
           stated during the December 2005 quarterly management meeting with
           NRC that strong actions were required to address the problems with
           its requirements management system and any resulting uncertainty
           about the adequacy of its design products.
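           Our recommendation that rating categories focus on a problem's
           significance, separate from any judgment about the need for
           management action, can be modeled as two independent fields
           rather than one conflated scale. The category names below are
           hypothetical; under such a scheme, the November 2005 requirements
           management problem could be rated on its merits even though
           corrective actions were already under way.

               from dataclasses import dataclass
               from enum import Enum

               class Significance(Enum):  # rates the problem itself
                   MONITORING = 1         # small perturbation; watch closely
                   SIGNIFICANT = 2
                   MAJOR = 3

               @dataclass
               class TrendRating:
                   problem: str
                   significance: Significance  # how important is the problem?
                   action_needed: bool         # separately: is new action needed?
                   rationale: str

               # With the two judgments separated, the same facts yield:
               rating = TrendRating(
                   problem="requirements management",
                   significance=Significance.SIGNIFICANT,  # spans BSC and DOE;
                                                           # recurred three quarters
                   action_needed=False,  # existing condition reports already cover
                                         # evaluation and resolution
                   rationale="recurring cross-organizational process problems",
               )
               print(rating.significance.name, "| new action needed:",
                     rating.action_needed)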

           I would now like to update you on the project's February 2006
           stop-work order, which occurred too late to be included in our
           report. We believe this incident is an example of how the
           project's management tools have not been effective in bringing
           quality assurance problems to top management's attention. After
           observing a DOE quality assurance audit at the Lawrence Livermore
           National Laboratory in August 2005, NRC expressed concern that
           humidity gauges used in scientific experiments at the project were
           not properly calibrated-an apparent violation of quality assurance
           requirements. According to an NRC official, NRC communicated these
           findings to BSC and DOE project officials on six occasions between
           August and December 2005, and issued a formal report and letter to
           DOE on January 9, 2006. However, despite these communications and
           the potentially serious quality assurance problems involved, the
           project's acting director did not become aware of the issue until
           January 2006, after reading about it in a news article. Due to
           concerns that quality assurance requirements had not been followed
           and the length of time it took top management to become aware of
           the issue, BSC issued a February 7, 2006, stop-work order
           affecting this scientific work. Project officials have begun a
           review of the issue.

 DOE's New Path Forward to Submitting a License Application Faces Substantial
                                  Challenges

           In pursuing its new path forward, DOE faces significant quality
           assurance and other challenges, including (1) determining the
           extent of problems and restoring confidence in the documents
           supporting the license application after the discovery of e-mails
           raising the potential of falsified records, (2) settling the
           design issues and the associated problems with requirements
           management, and (3) replacing key personnel and managing the
           transition of new managers and other organizational challenges.

           The early 2005 discovery of USGS e-mails suggesting possible
           noncompliance with the project's quality assurance requirements
           has left lingering concerns about the adequacy of USGS's
           scientific work related to the infiltration or flow of water into
           the repository and whether other work on the project has similar
           quality assurance problems. As part of its new path forward, DOE
           has taken steps to address these concerns. It is reworking
           technical documents created by USGS personnel to ensure that the
           science underlying the conclusions on water infiltration is
           correct and supportable. In addition, DOE is conducting an
           extensive review of approximately 14 million e-mails to determine
           whether they raise additional quality assurance concerns.
           According to NRC on-site representatives, screening these millions
           of e-mails to ensure that records were not falsified will be
           challenging. Further, many of the e-mails were written by
           employees who no longer work at the project or may be deceased,
           making it difficult to learn their true meaning and context.
           Moreover, if additional e-mails raise quality assurance concerns,
           DOE may have to initiate further review, inspections, or rework.
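           Our report does not describe DOE's actual review method. Purely
           to illustrate the shape of the screening problem, the sketch
           below flags e-mails containing quality assurance-related keywords
           for human follow-up; the keywords and messages are invented, and
           a real review of 14 million messages would require far more than
           keyword matching, particularly where authors are no longer
           available to clarify meaning.

               import re

               # Hypothetical keyword screen; a real review would be far broader.
               KEYWORDS = re.compile(
                   r"\b(QA|quality assurance|falsif\w*|fabricat\w*)\b", re.I)

               emails = [
                   {"author": "A", "text": "Schedule slip on the drift scale test."},
                   {"author": "B", "text": "I made up the dates so QA would pass it."},
                   {"author": "C", "text": "Send the calibration records to QA."},
               ]

               flagged = [m for m in emails if KEYWORDS.search(m["text"])]
               for m in flagged:
                   print(m["author"], "->", m["text"])

               # Messages B and C are both flagged, but only B raises a genuine
               # concern; no automated screen can make that distinction, hence
               # the challenge of deciphering meaning and significance at scale.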

           DOE officials have stated that the department will need to resolve
           long-standing quality assurance problems involving requirements
           management before it can perform the design and engineering work
           needed to support the revised project plans called for by its new
           path forward. According to a 2005 DOE root cause analysis report,
           low-level documents were appropriately updated and revised to
           reflect high-level design changes through fiscal year 1995.
           However, from 1995 through 2002, many of these design documents
           were not adequately maintained and updated to reflect current
           designs and requirements. Further, a document that is a major
           component of the project's requirements management process was
           revised in July 2002, but has never been finalized or approved.
           Instead, the project envisioned a transition to a new requirements
           management system after the submission of the license application,
           which at that time was planned for December 2004. However, for
           various reasons, the license application was not submitted, and
           the transition to a new requirements management system was never
           implemented. The DOE report described this situation as
           "completely dysfunctional" and identified the root cause of these
           conditions as DOE's failure to fund, maintain, and rigidly apply a
           requirements management system. According to an NRC on-site
           representative, repetitive and uncorrected issues associated with
           the requirements management process could have direct implications
           for the quality of DOE's license application.

           In December 2005, DOE issued a stop-work order on design and
           engineering for the project's surface facility and certain other
           technical work. DOE stated that a root cause analysis and an
           investigation into employee concerns had revealed that the project
           had not maintained or properly implemented its requirements
           management system, resulting in inadequacies in the design control
           process. The stop-work order will be in effect until, among other
           things, the lead contractor improves the requirements management
           system, validates that processes exist and are being followed,
           and ensures that requirements are appropriately traced to
           implementing mechanisms and products. Further, DOE will establish
           a team to take other
           actions necessary to prevent inadequacies in requirements
           management and other management systems from recurring.
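           Requirements management, as described here, is at bottom a
           traceability problem: each high-level plan or regulatory
           requirement must map to the specific engineering products that
           implement it. The sketch below shows the basic check such a
           system performs; the requirement identifiers and document names
           are invented and do not reflect DOE's actual system.

               # Hypothetical traceability check: every requirement must trace
               # to at least one implementing design product.
               requirements = {
                   "REQ-001": "seismic design criteria, surface facility",
                   "REQ-002": "waste canister handling limits",
                   "REQ-003": "ventilation standards, emplacement drifts",
               }

               # Requirement ID -> implementing design products (hypothetical).
               trace_matrix = {
                   "REQ-001": ["DWG-1040", "CALC-220"],
                   "REQ-002": ["DWG-1107"],
                   # REQ-003 has no entry: an untraced requirement.
               }

               untraced = [rid for rid in requirements if not trace_matrix.get(rid)]
               for rid in untraced:
                   print(f"{rid} ({requirements[rid]}) is not traced to any "
                         f"design product")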

           Finally, DOE continues to be challenged to effectively manage a
           changing and complex program and organization. The significant
           project changes initiated under the new path forward create the
           potential for confusion over accountability as roles and
           responsibilities change-a situation DOE found to contribute to
           quality assurance problems during an earlier transition period. An
           important part of this challenge is ensuring that accountability
           for quality and results is effectively managed during the
           transition to the new path forward. For example, DOE's plan to
           establish a lead laboratory to assist the project would not only
           have to be effectively managed, but also would introduce a new
           player whose accountability DOE would have to ensure. According to
           one DOE manager, transitioning project work to a lead laboratory
           under a direct contract with DOE could pose a significant
           challenge for quality assurance because the various laboratories
           assisting with the project are currently working under BSC quality
           assurance procedures and will now have to develop their own
           procedures.

           In addition, the project faces management challenges related to
           ensuring management continuity at the project. DOE has experienced
           turnover in 9 of 17 key management positions since 2001. For
           example, in the past year, the project has lost key managers
           through the departures of the Director of Project Management and
           Engineering, the Director of the License Application and Strategy,
           the Director of Quality Assurance, and the contractor's General
           Manager. To ensure that the right managers move the project
           forward to licensing, the project has launched a recruitment
           effort to replace key departing managers. Further, the director
           position for the project
           has been occupied by three individuals since 1999 and is currently
           filled by an acting director. The current Acting Director took his
           position in summer 2005, and initiated the new path forward in
           October 2005. DOE is currently awaiting congressional confirmation
           of a nominee to take the director position. However, the current
           Acting Director told us he expects that the new path forward will
           be sustained after the new director assumes the position because
           it has been endorsed by the Secretary of Energy.

                                  Conclusions

           DOE has a long history of trying to resolve quality assurance
           problems at its Yucca Mountain project. Now, after more than 20
           years of work, DOE once again faces serious quality assurance and
           other challenges while seeking a new path forward to a fully
           defensible license application. Even as DOE faces new quality
           assurance challenges, it cannot be certain that it has resolved
           past problems. It is clear that DOE has not been well served by
           management tools that have not effectively identified and tracked
           progress on significant and recurring problems. As a result, DOE
           has not had a strong basis to assess progress in addressing
           management weaknesses or to direct management attention to
           significant and recurrent problems as needed. Unless these quality
           assurance problems are addressed, further delays on the project
           are likely.

           Mr. Chairman, this concludes my prepared statement. I would be
           happy to respond to any questions that you or other Members of the
           Subcommittee may have at this time.

                   GAO Contact and Staff Acknowledgments

           For further information about this testimony, please contact Jim
           Wells at (202) 512-3841 or [email protected]. Casey Brown, John
           Delicath, Terry Hanford, and Raymond Smith also made key
           contributions to this statement.

           The Department of Energy's Management Improvement Initiatives
           (Initiatives) identified five key areas of management weakness as
           adversely affecting the implementation of quality assurance
           requirements at the Yucca Mountain project:

                        1. Roles and responsibilities were becoming confused
                        as the project transitioned from scientific studies
                        to activities supporting licensing. The confusion
                        over roles and responsibilities was undermining
                        managers' accountability for results. The
                        Initiatives' objective was to realign DOE's project
                        organization to give a single point of responsibility
                        for project functions, such as quality assurance and
                        the Corrective Action Program, and hold the project
                        contractor more accountable for performing the
                        necessary work in accordance with quality, schedule,
                        and cost requirements.
                        2. Product quality was sometimes being achieved
                        through inspections by the project's Office of
                        Quality Assurance rather than being routinely
                        implemented by the project's work organizations. As a
                        result, the Initiatives sought to increase work
                        organizations' responsibility for being the principal
                        means for achieving quality.
                        3. Work procedures were typically too burdensome and
                        inefficient, which impeded work. The Initiatives
                        sought to provide new user-friendly and effective
                        procedures, when necessary, to allow routine
                        compliance with safety and quality requirements.
                        4. Multiple corrective action programs existed,
                        processes were burdensome and did not yield useful
                        management reports, and corrective actions were not
                        completed in a timely manner. The Initiatives sought
                        to implement a single program to ensure that problems
                        were identified, prioritized, and documented and that
                        timely and effective corrective actions were taken to
                        preclude recurrence of problems.
                        5. The importance of a safety-conscious work
                        environment that fosters open communication about
                        concerns was not understood by all managers and
                        staff, and they had not been held accountable for
                        inappropriately overemphasizing the work schedule,
                        inadequately attending to work quality, or acting
                        inconsistently in practicing the desired openness
                        about concerns. Through issuing a work environment
                        policy, providing training on the policy, and
                        improving the Employee Concerns Program, the
                        Initiatives sought to create an environment in which
                        employees felt free to raise concerns without fear of
                        reprisal and with confidence that issues would be
                        addressed promptly and appropriately.

                                 GAO's Mission

           The Government Accountability Office, the audit, evaluation and
           investigative arm of Congress, exists to support Congress in
           meeting its constitutional responsibilities and to help improve
           the performance and accountability of the federal government for
           the American people. GAO examines the use of public funds;
           evaluates federal programs and policies; and provides analyses,
           recommendations, and other assistance to help Congress make
           informed oversight, policy, and funding decisions. GAO's
           commitment to good government is reflected in its core values of
           accountability, integrity, and reliability.

              Obtaining Copies of GAO Reports and Testimony

           The fastest and easiest way to obtain copies of GAO documents at
           no cost is through GAO's Web site ( www.gao.gov ). Each weekday,
           GAO posts newly released reports, testimony, and correspondence on
           its Web site. To have GAO e-mail you a list of newly posted
           products every afternoon, go to www.gao.gov and select "Subscribe
           to Updates."

                           Order by Mail or Phone

           The first copy of each printed report is free. Additional copies
           are $2 each. A check or money order should be made out to the
           Superintendent of Documents. GAO also accepts VISA and Mastercard.
           Orders for 100 or more copies mailed to a single address are
           discounted 25 percent. Orders should be sent to:

           U.S. Government Accountability Office 441 G Street NW, Room LM
           Washington, D.C. 20548

           To order by Phone: Voice: (202) 512-6000 TDD: (202) 512-2537 Fax:
           (202) 512-6061

         To Report Fraud, Waste, and Abuse in Federal Programs

           Contact:

           Web site: www.gao.gov/fraudnet/fraudnet.htm E-mail:
           [email protected] Automated answering system: (800) 424-5454 or
           (202) 512-7470

                          Congressional Relations

           Gloria Jarmon, Managing Director, [email protected] (202) 512-4400
           U.S. Government Accountability Office, 441 G Street NW, Room 7125
           Washington, D.C. 20548

                                Public Affairs

           Paul Anderson, Managing Director, [email protected] (202)
           512-4800 U.S. Government Accountability Office, 441 G Street NW,
           Room 7149 Washington, D.C. 20548

                        1. Roles and responsibilities were becoming confused
                        as the project transitioned from scientific studies
                        to activities supporting licensing. The confusion
                        over roles and responsibilities was undermining
                        managers' accountability for results. The
                        Initiatives' objective was to realign DOE's project
                        organization to give a single point of responsibility
                        for project functions, such as quality assurance and
                        the Corrective Action Program, and hold the project
                        contractor more accountable for performing the
                        necessary work in accordance with quality, schedule,
                        and cost requirements.
                        2. Product quality was sometimes being achieved
                        through inspections by the project's Office of
                        Quality Assurance rather than being routinely
                        achieved by the project's work organizations. As a
                        result, the Initiatives sought to increase work
                        organizations' responsibility for serving as the
                        principal means of achieving quality.
                        3. Work procedures were typically too burdensome and
                        inefficient, which impeded work. The Initiatives
                        sought to provide new user-friendly and effective
                        procedures, when necessary, to allow routine
                        compliance with safety and quality requirements.
                        4. Multiple corrective action programs existed,
                        processes were burdensome and did not yield useful
                        management reports, and corrective actions were not
                        completed in a timely manner. The Initiatives sought
                        to implement a single program to ensure that problems
                        were identified, prioritized, and documented and that
                        timely and effective corrective actions were taken to
                        preclude recurrence of problems.
                        5. Not all managers and staff understood the
                        importance of a safety-conscious work environment
                        that fosters open communication about concerns,
                        and they had not been held accountable for
                        inappropriately overemphasizing the work schedule,
                        inadequately attending to work quality, or
                        practicing the desired openness about concerns
                        inconsistently. Through issuing a work environment
                        policy, providing training on the policy, and
                        improving the Employee Concerns Program, the
                        Initiatives sought to create an environment in
                        which employees felt free to raise concerns
                        without fear of reprisal and with confidence that
                        issues would be addressed promptly and
                        appropriately.

3. GAO, Nuclear Waste: Repository Work Should Not Proceed Until Quality
Assurance Is Adequate, GAO/RCED-88-159 (Washington, D.C.: Sept. 29, 1988).

4. U.S. Nuclear Regulatory Commission, U.S. Nuclear Regulatory Commission
Staff Evaluation of U.S. Department of Energy Analysis Model Reports,
Process Controls, and Corrective Actions (Washington, D.C.: Apr. 7, 2004).

5. In addition, the RIT edited the AMRs to ensure consistency and ease of
technical and regulatory reviews.

6. Skill-based errors are defined in trend reports as unintentional errors
resulting from people not paying attention to the task at hand.

7. A root cause analysis seeks to determine the root cause of a problem,
which is the underlying cause that must change in order to prevent the
problem from recurring.

(360683)

This is a work of the U.S. government and is not subject to copyright
protection in the United States. It may be reproduced and distributed in
its entirety without further permission from GAO. However, because this
work may contain copyrighted images or other material, permission from the
copyright holder may be necessary if you wish to reproduce this material
separately.

GAO's Mission

The Government Accountability Office, the audit, evaluation and
investigative arm of Congress, exists to support Congress in meeting its
constitutional responsibilities and to help improve the performance and
accountability of the federal government for the American people. GAO
examines the use of public funds; evaluates federal programs and policies;
and provides analyses, recommendations, and other assistance to help
Congress make informed oversight, policy, and funding decisions. GAO's
commitment to good government is reflected in its core values of
accountability, integrity, and reliability.

Obtaining Copies of GAO Reports and Testimony

The fastest and easiest way to obtain copies of GAO documents at no cost
is through GAO's Web site (www.gao.gov). Each weekday, GAO posts newly
released reports, testimony, and correspondence on its Web site. To have
GAO e-mail you a list of newly posted products every afternoon, go to
www.gao.gov and select "Subscribe to Updates."

Order by Mail or Phone

The first copy of each printed report is free. Additional copies are $2
each. A check or money order should be made out to the Superintendent of
Documents. GAO also accepts VISA and MasterCard. Orders for 100 or more
copies mailed to a single address are discounted 25 percent. Orders should
be sent to:

U.S. Government Accountability Office
441 G Street NW, Room LM
Washington, D.C. 20548

To order by phone:
Voice: (202) 512-6000
TDD: (202) 512-2537
Fax: (202) 512-6061

To Report Fraud, Waste, and Abuse in Federal Programs

Contact:
Web site: www.gao.gov/fraudnet/fraudnet.htm
E-mail: [email protected]
Automated answering system: (800) 424-5454 or (202) 512-7470

Congressional Relations

Gloria Jarmon, Managing Director, [email protected], (202) 512-4400
U.S. Government Accountability Office, 441 G Street NW, Room 7125
Washington, D.C. 20548

Public Affairs

Paul Anderson, Managing Director, [email protected], (202) 512-4800
U.S. Government Accountability Office, 441 G Street NW, Room 7149
Washington, D.C. 20548

www.gao.gov/cgi-bin/getrpt?GAO-06-550T.

To view the full product, including the scope
and methodology, click on the link above.

For more information, contact Jim Wells at (202) 512-3841 or
[email protected].

Highlights of GAO-06-550T, a testimony before the Subcommittee on the
Federal Workforce and Agency Organization, Committee on Government Reform,
House of Representatives

April 25, 2006

YUCCA MOUNTAIN

DOE's Planned Nuclear Waste Repository Faces Quality Assurance and
Management Challenges

The Department of Energy (DOE) is working to obtain a license from the
Nuclear Regulatory Commission (NRC) to construct a nuclear waste
repository at Yucca Mountain in Nevada. The project, which began in the
1980s, has been beset by delays. In 2004, GAO raised concerns that
persistent quality assurance problems could further delay the project.
Then, in 2005, DOE announced discovery of employee e-mails suggesting
quality assurance problems. Quality assurance, which establishes
requirements for work to be performed under controlled conditions that
ensure quality, is critical to making sure the project meets standards for
protecting public health and the environment.

This testimony, which summarizes GAO's March 2006 report (GAO-06-313),
provides information on (1) the history of the project's quality assurance
problems, (2) DOE's tracking of these problems and efforts to address them
since GAO's 2004 report, and (3) challenges facing DOE as it continues to
address quality assurance issues within the project.

What GAO Recommends

In its March 2006 report, GAO recommended actions DOE can take to improve
the project's management tools and their use in identifying and addressing
quality assurance and other problems. In commenting on a draft of the
report, DOE agreed with GAO's recommendations.

DOE has had a long history of quality assurance problems at the Yucca
Mountain project. In the 1980s and 1990s, DOE had problems assuring NRC
that it had developed adequate plans and procedures related to quality
assurance. More recently, as it prepares to submit a license application
for the repository to NRC, DOE has been relying on costly and
time-consuming rework to resolve lingering quality assurance problems
uncovered during audits and after-the-fact evaluations.

In 2004, DOE announced a commitment to continuous quality assurance
improvement, with its efforts to be tracked by performance indicators that
would enable it to assess progress and direct management attention as
needed. However, GAO found that the project's
performance indicators and other key management tools were not effective
for this purpose. For example, the management tools did not target
existing areas of concern and did not track progress in addressing them.
The tools also had weaknesses in detecting and highlighting significant
problems for management attention.

DOE continues to face quality assurance and other challenges. First, DOE
is engaged in extensive efforts to restore confidence in scientific
documents because of the quality assurance problems suggested in the
discovered e-mails between project employees, and it has about 14 million
more project e-mails to review. Second, DOE faces quality assurance
challenges in resolving design control problems associated with its
requirements management process, which is intended to ensure that
high-level plans and regulatory requirements are incorporated into
specific engineering details. Problems with the process led to the
December 2005 suspension of certain project work. Third, DOE continues to
be challenged to manage a complex program and organization. Significant
personnel and project changes initiated in October 2005 create the
potential for earlier problem areas, such as confusion over roles and
responsibilities, to recur.

View of Yucca Mountain and the Exploratory Tunnel for the Repository
*** End of document. ***