Yucca Mountain: Quality Assurance at DOE's Planned Nuclear Waste
Repository Needs Increased Management Attention (17-MAR-06, GAO-06-313).

The Department of Energy (DOE) is working to obtain a license from the
Nuclear Regulatory Commission (NRC) to construct a nuclear waste
repository at Yucca Mountain in Nevada. The project, which began in the
1980s, has been beset by delays. In a 2004 report, GAO raised concerns
that persistent quality assurance problems could further delay the
project. Then, in 2005, DOE announced the discovery of employee e-mails
suggesting quality assurance problems, including possible falsification
of records. Quality assurance, which establishes requirements for work
to be performed under controlled conditions that ensure quality, is
critical to making sure the project meets standards for protecting
public health and the environment. GAO was asked to examine (1) the
history of the project's quality assurance problems, (2) DOE's tracking
of these problems and efforts to address them since GAO's 2004 report,
and (3) challenges facing DOE as it continues to address quality
assurance issues within the project.
-------------------------Indexing Terms-------------------------
REPORTNUM:  GAO-06-313
    ACCNO:  A49318
    TITLE:  Yucca Mountain: Quality Assurance at DOE's Planned
            Nuclear Waste Repository Needs Increased Management Attention
     DATE:  03/17/2006
  SUBJECT:  Accountability
            Licenses
            Nuclear waste disposal
            Nuclear waste management
            Nuclear waste storage
            Performance measures
            Program evaluation
            Quality assurance
            Schedule slippages
            DOE Yucca Mountain Project (NV)
            Yucca Mountain (NV)

GAO-06-313

     

     * Report to the Chairman, Subcommittee on the Federal Workforce and
       Agency Organization, Committee on Government Reform, House of
       Representatives
          * March 2006
     * YUCCA MOUNTAIN
          * Quality Assurance at DOE's Planned Nuclear Waste Repository Needs
            Increased Management Attention
     * Contents
          * Results in Brief
          * Background
          * DOE Has a Long History of Quality Assurance Problems at Yucca
            Mountain and Is Relying on Costly and Time-Consuming Measures to
            Correct Problems Before Submitting Its License Application for
            the Repository
               * DOE Has Had Problems Implementing and Maintaining an
                 Effective Quality Assurance Program
               * Recurring Issues with Project Data, Models, and Software
                 Illustrate DOE's Difficulties Addressing Quality Assurance
                 Problems
               * DOE Is Now Relying on Costly and Time-Consuming Rework to
                 Resolve Additional Problems
          * DOE Cannot Be Certain Its Efforts to Improve Quality Assurance
            Have Been Effective Because of Weaknesses in Tracking Progress
            and Identifying Problems
               * The Panel's Focus and Frequent Changes Hindered the Tracking
                 of Progress with Management Concerns and Quality Problems
               * Trend Evaluation Reports Have Not Specifically Tracked the
                 Initiatives' Management Concerns and Have Had Weaknesses
                 Tracking Significant and Recurrent Problems for Management
                 Attention
               * DOE Has Not Adequately Tracked Problems Being Addressed by
                 Ongoing Management Actions
          * DOE's "New Path Forward" for Preparing to Submit Its License
            Application Faces Substantial Quality Assurance and Other
            Challenges
               * Determining the Extent of Problems with Relevant Documents
                 Will Delay DOE's Submission of the License Application
               * Ongoing Design and Requirements Management Issues Could
                 Delay DOE's Submission of the License Application
               * DOE Faces Challenges in Managing the Transition, Complexity,
                 and Continuity of Its "New Path Forward"
          * Conclusions
          * Recommendations for Executive Action
          * Agency Comments
     * Objectives, Scope, and Methodology
     * Yucca Mountain Project Employee Concerns Programs
     * GAO Contact and Staff Acknowledgments

Report to the Chairman, Subcommittee on the Federal Workforce and Agency
Organization, Committee on Government Reform, House of Representatives

March 2006

YUCCA MOUNTAIN

Quality Assurance at DOE's Planned Nuclear Waste Repository Needs
Increased Management Attention


March 17, 2006

Letter

The Honorable Jon C. Porter
Chairman
Subcommittee on the Federal Workforce and Agency Organization
Committee on Government Reform
House of Representatives

Dear Mr. Chairman:

The nuclear waste created as a by-product of the nuclear power process in
reactors can remain highly radioactive for hundreds of thousands of years,
and will require proper disposal to protect public health and the
environment. Over 50,000 metric tons of this waste is currently being
stored at 72 sites around the country, principally at commercial nuclear
power plants. These wastes have been accumulating for several decades in
surface storage designed to be temporary. The Nuclear Waste Policy Act of
1982 obligated the Department of Energy (DOE) to construct an underground
geological repository for permanent storage and begin accepting these
wastes by January 31, 1998. However, it was not until 2002, after more
than 15 years of scientific study, that Yucca Mountain in Nevada was
approved by Congress as a suitable location for the repository. DOE is
continuing to experience delays, and it does not currently have a schedule
for when construction of the repository will begin. The project to build
and operate a repository at Yucca Mountain is highly complex. It is also
highly controversial among some members of the public, in large part
because of concern that the repository may not be adequate, over the long
term, to prevent the release of radioactive material to the environment.
DOE has established quality assurance procedures to ensure that its work
on the project and the technical information it produces are of high
quality and defensible. However, persistent problems with implementing
these procedures and resulting questions about the quality of the work
have significantly contributed to project delays. Resolving these quality
issues is essential to proceeding with construction.

To construct a repository at the Yucca Mountain site, DOE must obtain a
license from the Nuclear Regulatory Commission (NRC). As part of the
licensing process, DOE must demonstrate to NRC that its plans for the
repository will meet standards for protecting public health and the
environment from harmful exposure to the radioactive waste. The
Environmental Protection Agency (EPA) set these standards in 2001, but as
a result of a 2004 court ruling, EPA is proposing to revise the standards
to extend the protection period from 10,000 years to 1 million years.1

To demonstrate that it can meet these standards, DOE has been conducting
scientific and technical studies at the Yucca Mountain site that will
serve as supporting documentation for DOE's planned license application.
For example, it has developed mathematical models to measure the
probability that various combinations of natural and engineered
(human-made) features of a repository could safely contain waste for the
long term; the models take into account possible water infiltration
through the mountain (see fig. 1), earthquakes, volcanic action, or other
scenarios. Thus, one of DOE's most important tasks during the licensing
process will be to demonstrate the adequacy of its data, software, and
models. Accordingly, NRC requires nuclear facilities to develop a quality
assurance program that ensures that the technical information submitted in
support of a license application-such as scientific data, models, and
details on design and construction-is well documented and defensible. More
specifically, data used to support conclusions about the safety and design
of the repository must meet transparency and traceability standards. That
is, the data must be clear in justifying and explaining any underlying
assumptions, calculations, and conclusions, and must be capable of being
traced back to original source materials.

Figure 1: A Yucca Mountain Project Scientist Conducts Water Infiltration
Tests inside Yucca Mountain

To meet NRC's requirements, DOE established a quality assurance program
for the Yucca Mountain project. The program establishes requirements that
scientific, design, engineering, and other work, such as procurement and
record keeping, is to be performed under controlled conditions that ensure
quality and enable the work to be verified by others. For example, the
program establishes general requirements for calibrating equipment before
conducting tests, stipulating when and how the equipment should be
calibrated and how to document the results. The project's line
organizations, which are responsible for carrying out various functions or
aspects of the work, then create their own policies and procedures to
implement the requirements.

Project employees are required to follow such procedures to help ensure
the reliability of project information. Quality assurance auditors
periodically verify that the procedures have been followed. Project
employees, including quality assurance auditors, are required to identify
when procedures are not being followed or when they encounter problems
with the procedures. These problems can be identified in "condition
reports" under the project's Corrective Action Program, which establishes
procedures for the prompt identification and correction of problems.
Alternatively, project employees can submit problems for resolution
through the Employee Concerns Program, which allows for submissions to be
confidential or anonymous.

Because quality assurance plays a key role in ensuring that the
information DOE uses to support its license application is of high quality
and fully defensible, problems in this area raise concerns about delays to
DOE's submission and NRC's review of the license application. In April
2004, for example, we reported that recurring quality assurance problems
at the Yucca Mountain project could delay the licensing and operation of
the repository.2 As we noted, a 2004 NRC evaluation found quality
assurance problems such as data that could not be readily traced back to
their sources. NRC indicated that unless DOE rectified such problems
before submitting the license application, NRC could be in the position of
requesting large volumes of additional information, which could prevent it
from making a decision on the license within the time required by law.
Then, in early 2005, DOE reported it had discovered a series of e-mail
messages among some U.S. Geological Survey (USGS) employees working on the
Yucca Mountain project under a contract with DOE; the messages appeared to
imply that workers had falsified records for scientific work. Several of these
messages, written in the late 1990s, appeared to show disdain for the
project's quality assurance program and its requirements. As a result of
these e-mails, DOE is engaging in an extensive review of records to
restore confidence in scientific documents that will be used to support
its license application.

DOE's recent efforts to better manage quality assurance problems include
its Management Improvement Initiatives (Initiatives), which began in 2002
and were reported completed in April 2004. The Initiatives' purpose was to
ensure that work and products consistently met quality objectives and were
fully defensible by establishing a foundation for continuous improvement
in areas of identified management weaknesses. In our 2004 report, we
concluded that, while DOE considered the Initiatives to have been
completed, it could not assess their effectiveness in addressing the
management weaknesses because its performance goals lacked objective
measures and time frames for determining success.3 By the end of the
Initiatives, DOE had established two tools to alert management about
quality-related and other problems: (1) a one-page summary of performance
indicators for key project activities and processes (the summary, which
DOE refers to as a "panel," is prepared monthly for discussion and action
by project managers) and (2) quarterly trend evaluation reports analyzing
patterns and trends in problems identified through the Corrective Action
Program. Then, in October 2005, DOE initiated planning for an aggressive
series of changes to the facility design, organization, and management of
the Yucca Mountain project. This effort, known as the "new path forward,"
is intended to address quality assurance and other challenges prior to
submission of a license application. According to the project's Acting
Director, DOE will be considering changes in performance indicators and
other management tools to better support the new path forward.

In this context, you requested that we provide additional information on
the project's quality assurance problems and DOE's efforts to correct
them. As agreed with your office, this report discusses (1) the history of
the project's quality assurance problems since its start in the 1980s, (2)
DOE's tracking of quality problems and progress implementing quality
assurance requirements since our April 2004 report, and (3) challenges DOE
faces as it continues to address quality assurance issues at the project.
In addition, you asked for information about concerns raised in recent
years through the project's Employee Concerns Program, which is provided
in appendix II.

To determine the history of quality assurance problems, we reviewed
previous GAO, DOE, and NRC documents, visited the project, and interviewed
officials from DOE, NRC, and Bechtel/SAIC Company, LLC (BSC), which is
DOE's management contractor for the Yucca Mountain project. To assess
DOE's tracking of quality-related problems and progress in addressing
them, we examined management tools and associated documentation, such as
monthly indicator panels and quarterly trend reports, and interviewed BSC
and DOE officials regarding those tools. To identify current quality
assurance and other challenges, we attended quarterly NRC management
meetings, interviewed the Acting Director and other senior managers of the
DOE project, and gathered information on management turnover. Due to the
criminal investigation under way related to possible falsification of
records implied in USGS e-mail exchanges, we did not examine the
investigated issues beyond confirming that a concern about the e-mails had
been submitted to the Employee Concerns Program. However, to determine if
concerns about other instances of potential falsification of records had
been raised by project employees, we reviewed employee concerns filed with
the project's Employee Concerns Program from January 2004 to December
2005. More information on our scope and methodology is provided in
appendix I. We conducted our work from July 2005 through January 2006 in
accordance with generally accepted government auditing standards.

Results in Brief

DOE has had a long history of quality assurance problems at the Yucca
Mountain project. In the late 1980s and early 1990s, DOE had problems
assuring NRC that it had developed adequate plans and procedures related
to quality assurance. For example, as GAO reported in 1988, NRC had found
that DOE's quality assurance procedures were inadequate and its efforts to
independently identify and resolve weaknesses in the procedures were
ineffective. By the late 1990s, DOE had largely addressed NRC's concerns
about its plans and procedures, but its own audits identified quality
assurance problems with the data, software, and models used in the
scientific work supporting its potential license application. For example,
in 1998, a team of project personnel determined that 87 percent of the
models used to simulate the site's natural and environmental conditions,
and to demonstrate the future repository's performance over time, did not
comply with requirements for demonstrating their accuracy in predicting
geologic events. More recently, as it prepares to submit the license
application for the planned repository to NRC, DOE has been relying on
costly and time-consuming rework to resolve lingering quality assurance
concerns. For example, to address problems with the transparency and
traceability of scientific work in technical documents, DOE implemented,
in the spring of 2004, a roughly $20 million, 8-month project called the
Regulatory Integration Team. This effort involved about 150 full-time
employees from DOE, USGS, and multiple national laboratories, such as
Sandia, Los Alamos, and Lawrence Livermore, working to inspect technical
documents to identify and resolve quality problems.

DOE cannot be certain that its efforts to improve the implementation of
its quality assurance requirements have been effective because it adopted
management tools that did not target existing management concerns and did
not track progress with significant and recurring problems. Although DOE
announced, in 2004, that it was making a commitment to continuous quality
assurance improvement and that its efforts would be tracked by performance
indicators that would enable it to assess progress and direct management
attention as needed, its adopted management tools have not been effective
for this purpose. Specifically, the one-page summary, or "panel," of
selected performance indicators that project managers used in monthly
management meetings was not an effective tool for assessing progress. The
indicators selected for the panel poorly represented the major management
concerns and changed frequently. For example, the panel did not include an
indicator to represent the management concern about unclear roles and
responsibilities-a problem that could undermine accountability within the
project. Use of the indicator panel was discontinued in late 2005, and DOE
is deciding on a tool to replace it. Moreover, a second management
tool-trend evaluation reports-also did not track relevant concerns. The
reports generally had technical weaknesses in identifying recurrent and
significant problems and inconsistently tracked progress in resolving the
problems. For example, lacking reliable data and an appropriate
performance benchmark for determining the significance of human errors as
a cause of quality problems, DOE's trend reports offered no clear basis
for tracking progress on such problems. In addition, under the trend
reports' rating categories, the rating assigned to convey the significance
of a problem was overly influenced by whether management actions to
address the problem were already under way, rather than by the problem's
significance alone. For example, the trend report's
rating of one particular problem at the lowest level of significance did
not accurately describe the problem or sufficiently draw management's
attention to it.

Before DOE submits a license application, its aggressive "new path
forward" effort faces substantial quality assurance and other challenges.
First, the March 2005 announcement of the discovery of USGS e-mails
suggesting the possible falsification of quality assurance records has
resulted in extensive efforts to restore confidence in scientific
documents, and DOE is conducting a wide-ranging review of approximately 14
million e-mails to determine whether they raise additional quality
assurance issues. Such a review creates a challenge not just because of
the sheer volume of e-mails to be reviewed, but also because DOE will have
to decipher their meaning and determine their significance, sometimes
without clarification from authors who have left the project. Furthermore,
if any of the e-mails raise quality assurance concerns, further review,
inspection, or rework may need to be performed to resolve any newfound
problems. Second, DOE faces quality assurance challenges in resolving
design control problems associated with an inadequate requirements
management process-the process responsible for ensuring that broad plans
and regulatory requirements affecting the project are tracked and
incorporated into specific engineering details. In December 2005, DOE
issued a stop-work order on some design and engineering work until DOE can
determine that the requirements management process has been improved.
Third, DOE continues to face the challenge of managing a changing and
complex program and organization. Significant project changes initiated in
October
2005 under the new path forward create the potential for confusion over
accountability as roles and responsibilities change-a situation DOE found
to contribute to quality assurance problems during an earlier transition
period. For example, a proposed reorganization, establishing a lead
laboratory to assist the project, not only would have to be effectively
managed, but also would introduce a new player whose accountability DOE
would have to ensure. DOE has also experienced turnover in 9 of 17 key
management positions since 2001, including positions related to quality
assurance, which has created management continuity challenges. For example,
the director position for the project has been occupied by three
individuals since 1999 and is currently occupied by an acting director.
Since DOE is still formulating its plans, it is too early to determine
whether its new effort will effectively resolve these challenges.

We are making recommendations to DOE aimed at improving the effectiveness
of its management tools for monitoring performance in key areas, including
quality assurance, by improving the tools' ability to identify problems
and track progress in addressing them. We provided DOE and NRC with draft
copies of this report for their review and comment. In comments, DOE
agreed with our recommendations. Both DOE and NRC provided technical and
editorial comments that we incorporated into the report, as appropriate.

Background

Congress enacted the Nuclear Waste Policy Act of 1982 to establish a
comprehensive policy and program for the safe, permanent disposal of
commercial spent fuel and other highly radioactive wastes in one or more
mined geologic repositories. The act charged DOE with (1) establishing
criteria for recommending sites for repositories; (2) "characterizing"
(investigating) three sites to determine each site's suitability for a
repository (1987 amendments to the act directed DOE to investigate only
the Yucca Mountain site); (3) recommending one suitable site to the
President, who, if he considered the site qualified for a license
application, would submit a recommendation to Congress; and (4) seeking a
license from NRC to construct and operate a repository at the approved
site. The act created the Office of Civilian Radioactive Waste Management
within DOE to manage its nuclear waste program.

Since the 1980s, DOE has spent years conducting site characterization
studies at the Yucca Mountain site to determine whether it is suitable for
a high-level radioactive waste and spent nuclear fuel repository. DOE, for
example, has completed numerous scientific studies of the mountain and its
surrounding region for water flow and the potential for rock movement,
including volcanoes and earthquakes that might adversely affect the
performance of the repository. To allow scientists and engineers greater
access to the rock being studied, DOE excavated two tunnels for studying
the deep underground environment: (1) a 5-mile main tunnel that loops
through the mountain, with several research areas or alcoves connected to
it; and (2) a 1.7-mile tunnel that crosses the mountain (see fig. 2). This
second tunnel allows scientists to study properties of the rock and the
behavior of water near the potential repository area. In July 2002,
Congress approved the President's recommendation of the Yucca Mountain
site for the development of a repository.

Figure 2: The 1.7-Mile Tunnel Built for Scientific Studies near the
Potential Repository Area

The Yucca Mountain project is currently focused on preparing an
application to obtain a license from NRC to construct a repository. The
required application information includes both repository design work and
scientific analyses. DOE is engaged in necessary tasks such as compiling
information and writing sections of the license application, and is
conducting technical exchanges with NRC staff and addressing key technical
issues identified by NRC to ensure that sufficient supporting information
is provided. It also plans to further develop the design of the
repository, including revised designs for the repository's surface
facilities and canisters to hold the waste. DOE is also identifying and
preparing potentially relevant documentary material that it is required to
make available on NRC's Web-based information system, known as the
Licensing Support Network. This is a critical step because DOE is required
to certify that the documentary material has been identified and made
electronically available no later than 6 months in advance of submitting
the license application.4

In February 2005, DOE announced that it does not expect the repository to
open until 2012 at the earliest, which is more than 14 years later than
the 1998 goal specified by the Nuclear Waste Policy Act of 1982. More
recently, the conference report for DOE's fiscal year 2006 appropriations
observed that further significant schedule slippages for submitting a
license application are likely. Further delays could arise from factors
such as the time needed for EPA to establish revised radiation standards
for Yucca Mountain and for DOE to revise its technical documents in
response. Such delays could be costly because nuclear utilities, which pay
for most of the disposal program through a fee on nuclear power, have sued
DOE, seeking damages for not starting the removal of spent nuclear fuel
from storage at commercial reactors by the 1998 deadline. Estimates of the
potential damages vary widely, from DOE's estimate of about $5 billion to
the nuclear industry's estimate of about $50 billion, but the cost of the
damages will likely rise if there are further delays to opening the
repository.

Given these schedule slippages, Congress has considered other options for
managing existing and future nuclear wastes, such as centralized interim
storage at one or more DOE sites. The conference report for DOE's fiscal
year 2006 appropriations directed DOE to develop a spent nuclear fuel
recycling plan to reuse the fuel. However, according to the policy
organization of the nuclear energy industry, no technological option
contemplated will eliminate the need to ultimately dispose of nuclear
waste in a geologic repository.

In October 2005, the project's Acting Director issued a memorandum calling
for the development of wide-ranging plans for the "new path forward,"
DOE's effort to address quality assurance and other challenges prior to
applying for a license. To restore confidence in scientific documents that
will support the license application, some of the plans address reviewing
and replacing USGS work products, requiring USGS to certify its scientific
work products, and establishing a lead national laboratory to assist the
project. Other plans are focused on a new
simplified design for the waste canisters and repository facilities, a
design that is expected to improve the safety and operation of the
repository by eliminating the need to directly handle and process the
spent fuel at the repository. Further, this aggressive effort called for
management changes, including a transition plan; more rigorous project
management, including a new baseline schedule; rescoping existing
contracts and developing new contracts; tracking project hiring actions; a
financial plan; and new reporting indicators.

After DOE submits the license application, NRC plans to take 90 days to
examine the application for completeness to determine whether DOE has
addressed all NRC requirements. One of the reviews for completeness will
include an examination of DOE's documentation of the quality assurance
program to assess whether it addresses all NRC criteria. These criteria
include, among other things, organization, design control, document
control, corrective actions, quality assurance records, and quality
audits. If it deems any part of the application incomplete, NRC may
either reject the application or require that DOE furnish the necessary
documentation before proceeding with the detailed technical review of the
application. If it deems the application complete, NRC will docket the
application, indicating its readiness for a detailed technical review.5

Once the application is accepted and placed on the docket, NRC will
conduct its 18-month technical review of the application to determine if
the application meets all NRC requirements, including the soundness of
scientific analyses and preliminary facility design, and NRC quality
assurance criteria. If NRC discovers problems with the technical
information used to support the application, it may conduct specific
reviews, including inspections, to determine the extent and effect of the
problem. Because the data, models, and software used in modeling
repository performance are integral parts of this technical review,
quality assurance plays a key role since it is the mechanism used to
verify the accuracy of the information DOE presents in the application.
NRC may conduct reviews, including inspections, of the quality assurance
program if technical problems are identified that are attributable to
quality problems. NRC will hold public hearings chaired by its Atomic
Safety and Licensing Board to examine specific topics. After completing
the proceedings, the board will forward its initial decision to the NRC
commissioners for their review. Finally, within 3 to 4 years from the date
that NRC dockets the application, NRC will make a decision to grant the
construction authorization, reject the application, or grant the
construction authorization with conditions.6 NRC will grant a construction
authorization only if it concludes from its reviews that the repository
would meet its reasonable expectation that the safety and health of
workers and the public would be protected.

DOE Has a Long History of Quality Assurance Problems at Yucca Mountain and
Is Relying on Costly and Time-Consuming Measures to Correct Problems
Before Submitting Its License Application for the Repository

DOE has repeatedly experienced quality assurance problems with its work on
the Yucca Mountain project. In the late 1980s, DOE struggled to develop
adequate plans and procedures related to quality assurance. By the late
1990s, audits by GAO, DOE, and others identified
recurring quality assurance problems with several aspects of key
scientific data, models, and software. Currently, in preparing to submit
the license application to NRC, DOE is relying on costly and
time-consuming rework to resolve lingering quality assurance problems with
the transparency and traceability of data and in project design and
engineering documents uncovered during audits and after-the-fact
evaluations.

DOE Has Had Problems Implementing and Maintaining an Effective Quality
Assurance Program

DOE has a long-standing history of attempting to address NRC concerns
about its quality assurance program. Although NRC will have responsibility
for regulating the construction, operation, and decommissioning (closure)
phases of the project, its regulatory and oversight role does not begin
until DOE submits a license application. As a result, NRC's role in the
project has been limited to providing guidance to DOE to ensure that DOE
understands NRC regulations and that the years of scientific and technical
work will not later be found inadequate for licensing purposes.
Specifically, since 1984, NRC has agreed to point out problems it
identifies with the quality assurance program so that DOE can take timely
corrective action. Initially, this NRC guidance was mainly focused on
ensuring that DOE had the necessary quality assurance organization, plans,
and procedures.

As we reported in 1988, NRC had reviewed DOE's quality assurance plans and
procedures comprising the principal framework of its quality assurance
program, and concluded that they were inadequate and did not meet NRC
requirements.7 NRC also concluded that DOE's efforts to independently
identify and resolve weaknesses in the plans and procedures were
ineffective. After observing DOE quality assurance audits, NRC determined
that the audits were ineffective for measuring whether quality assurance
procedures were being effectively implemented. Further, during the 1980s,
NRC identified additional concerns about DOE management and organizational
deficiencies in the quality assurance program.
Specifically, among other things, NRC found the following:

o DOE had a small staff and relied heavily on contractors to provide
quality assurance oversight. Based on its experience in regulating nuclear
power plants, NRC found that these types of organizations frequently
developed major quality-related problems.

o DOE had indirect project control, with administrative and functional
control over the project split between different offices. NRC found that
such project control arrangements tend to have serious quality
assurance-related problems because conflicts can arise between quality and
other organizational goals, such as cost and schedule.

o During a 1984 NRC visit to Nevada, DOE project participants had
expressed the opinion that quality assurance is "unnecessary, burdensome,
and an imposition." Further, in 1986, DOE issued a stop-work order to the
USGS based on a determination that USGS staff did not appreciate the
importance of quality assurance and that USGS work would not meet NRC
expectations. NRC believed that organizational attitudes can indicate
whether a project is likely to experience problems relating to quality
assurance and found such examples troublesome.

Finally, based in part on the information obtained from its oversight
activities, NRC concluded, in 1989, that DOE and its key contractors had
yet to develop and implement an acceptable quality assurance program.

However, by March 1992, NRC came to the conclusion that DOE had made
significant progress in improving its quality assurance program. NRC noted
that DOE had addressed many of its concerns, specifically that, among
other things, (1) all of the contractor organizations had developed and
were in the process of implementing quality assurance programs that met
NRC requirements, (2) quality assurance management positions had been
filled with full-time DOE personnel with appropriate knowledge and
experience, and (3) DOE had demonstrated that it was capable of evaluating
and correcting deficiencies in the overall quality assurance program.
Nevertheless, in October 1994, NRC found problems with quality assurance,
particularly with the site contractor's ability to effectively implement
corrective actions and DOE's ability to oversee the site contractor's
quality assurance program.

Recurring Issues with Project Data, Models, and Software Illustrate DOE's
Difficulties Addressing Quality Assurance Problems

As DOE's quality assurance program matured, it resolved NRC concerns about
its organization, plans, and procedures, and in the late 1990s began
successfully detecting new quality assurance problems in three areas
critical to the repository's successful performance: the adequacy of the
data sources, the validity of scientific models, and the reliability of
computer software developed at the site. These problems surfaced in 1998
when DOE began to run the initial version of its performance assessment
model. Specifically, DOE was unable to ensure that critical project data
had been properly collected and could be traced back to the original
sources. In addition,
DOE did not have a standardized process for developing scientific models
used to simulate a variety of geologic events or an effective process for
ensuring that computer software used to support the scientific models
would work properly. As required by DOE's quality assurance procedures,
the department conducted a root cause analysis and issued a corrective
action plan in 1999. After corrective actions were taken, DOE considered
the issues resolved.

However, in 2001, similar deficiencies associated with models and software
resurfaced. DOE attributed the recurrence to ineffective procedures and
corrective actions, improper implementation of quality procedures by line
managers, and personnel who feared reprisal for expressing quality
concerns. Recognizing the need to correct these recurring problems, DOE
conducted a comprehensive root cause analysis that included reviews of
numerous past self-assessments and independent program assessments, and
identified weaknesses in management systems, quality processes, and
organization roles and responsibilities. Following the analysis, in July
2002, DOE issued its Management Improvement Initiatives (Initiatives) that
addressed quality problems with software and models. In addition, DOE
added other corrective actions to address management weaknesses that it
found in areas such as roles and responsibilities, quality assurance
processes, written procedures, corrective action plans, and work
environment.

However, DOE continued to face difficulties in resolving quality assurance
problems concerning the data, software, and modeling to be used in support
of the license application:

o Data management. As part of NRC's quality assurance requirements, data
used to support conclusions about the safety and design of the repository
must be either collected under a quality assurance program or subjected to
prescribed testing procedures to ensure the data are accurate for their
intended use. In addition, the data supporting these conclusions must also
be traceable back to their original sources. In 1998, DOE identified quality
assurance problems with the quality and traceability of data-specifically
that some data had not been properly collected or tested to ensure their
accuracy and that data used to support scientific analysis could not be
properly traced back to their source. DOE again found similar problems in
April and September 2003, when a DOE audit revealed that some data sets
did not have the documentation necessary to trace them back to their
sources; the processes for data control and management were
unsatisfactory; and faulty definitions were developed, which allowed
unqualified data to be used.

o Software management. DOE quality assurance procedures require that
software used to support analysis and conclusions about the performance
and safety of the repository be tested or created in such a way as to
ensure that it is reliable. From 1998 to 2003, multiple DOE audits found
recurring quality assurance problems that could affect confidence in the
adequacy of software codes. For example, in 2003, DOE auditors found
problems related to software similar to those found previously in areas
such as technical reviews, software classification, planning, design, and
testing. Further, a team of industry professionals hired by DOE to assess
quality assurance problems with software reported in February 2004 that
these problems kept recurring because DOE did not assess the effectiveness
of its corrective actions and did not adequately identify the root causes
of the problems.

o Model validation. Models are used to simulate natural and environmental
conditions at Yucca Mountain, and to demonstrate the performance of the
future repository over time. However, before models can be used to support
the license application, DOE must demonstrate through a process called
validation that the models are able to accurately predict geologic events.
In 1998, a team of project personnel evaluated the models and determined
that 87 percent did not comply with the validation requirements. In 2001,
and again in 2003, DOE audits found that project personnel were not
properly following procedures-specifically in the areas of model
documentation, model validation, and checking and review. Further, the
2003 audit concluded that previous corrective actions designed to improve
validation and reduce errors in model reports were not fully implemented.

After many years of working to address these quality assurance problems
with data, software, and models, DOE had mostly resolved these problems
and closed the last of the associated condition reports by February 2005.

DOE Is Now Relying on Costly and Time-Consuming Rework to Resolve
Additional Problems

As DOE prepares to submit the Yucca Mountain project license application
to NRC, it has relied on costly and time-consuming rework to ensure that
the documents supporting the application are accurate and complete.
Specifically, DOE has relied on inspections and rework by DOE personnel to
resolve quality assurance problems with the traceability and transparency
of technical work products. These efforts to deal with quality problems at
the end, rather than effectively ensuring that work organizations are
producing quality products from the beginning, add to the project's cost
and could potentially delay DOE's submission of the license application to
NRC. In addition, DOE's efforts indicate that some corrective actions have
been ineffective in resolving problems with the quality assurance process.
Further, DOE is now detecting quality assurance problems in design and
engineering work that are similar to the quality assurance problems it
experienced with its scientific work in the late 1990s.

Although DOE did not initiate its major effort to address these problems
until 2004, the department and NRC for years had known of quality
assurance problems with the traceability and transparency of technical
work products called Analysis and Model Reports (AMR). AMRs are a key
component of the license application, and contain the scientific analysis
and modeling data demonstrating the safety and performance of the planned
repository. Among other quality requirements, AMRs must be traceable back
to their original source material and data, and must also be transparent
in justifying and explaining their underlying assumptions, calculations,
and conclusions. In 2003, based in part on these problems as well as DOE's
long-standing problems with data, software, and modeling, NRC conducted an
independent evaluation of three AMRs. The scope of the review was to
determine if the AMRs met NRC requirements for being traceable,
transparent, and technically appropriate for their use in the license
application. NRC found significant problems.8 First, in some cases DOE was
not transparent in explaining the basis on which it was reaching
conclusions. For example, in some circumstances, DOE selected a single
value from a range of data without sufficient justification. At other
times, DOE did not explain how a range of experimental conditions was
representative of repository conditions. Second, where DOE did
sufficiently explain the basis for a conclusion, it did not always provide
the necessary technical information, such as experimental data, analysis,
or expert judgment, to trace the support for that explanation back to
source materials. For example, DOE did not explain how information on one
type of material provided an appropriate comparison for another material.
Moreover, while DOE had identified similar problems in the past, the
actions taken to correct them did not identify and resolve other
deficiencies. NRC concluded that these findings suggested that other AMRs
possibly had similar problems, and that if not resolved, such problems
could delay NRC's review of the license application as it would need to
conduct special inspections to resolve any issues it found with the
quality of technical information.

To address problems of traceability and transparency, DOE in the spring of
2004 initiated an effort called the Regulatory Integration Team (RIT) to
perform a comprehensive inspection and rework of the AMRs to ensure they
met NRC requirements and expectations.9 According to DOE officials, the
RIT involved roughly 150 full-time personnel from DOE, USGS, and multiple
national laboratories such as Sandia, Los Alamos, and Lawrence Livermore.
First, the RIT screened all of the approximately 110 AMRs and prioritized
its efforts on 89 that needed additional rework. Ten AMRs were determined
to be acceptable, and 11 were canceled because they were no longer needed
to support the license application. According to DOE officials,
approximately 8 months later, the RIT project was completed at a cost of
about $20 million, with a total of over 3,700 problems and issues
addressed or corrected. In February 2005, in a letter to DOE, the site
contractor stated that the RIT effort was successful and that the AMRs had
been revised to improve traceability and transparency.

Subsequently, however, additional problems with traceability and
transparency have been identified, requiring further inspections and
rework. For example, after the March 2005 discovery of e-mails from USGS
employees written between May 1998 and March 2000 implying that employees
had falsified documentation of their work to avoid quality assurance
standards, DOE initiated a review of additional AMRs that were not
included in the scope of the 2004 RIT review. The additional AMRs
contained scientific work performed by the USGS employees and had been
assumed by the RIT to meet NRC requirements for traceability and
transparency. However, according to DOE officials, DOE's review determined
that these AMRs did not meet NRC's standards, and additional rework was
required. Further, similar problems were identified as the focus of the
project shifted to the design and engineering work required for the
license application. In February 2005, the site contractor determined that
in addition to problems with AMRs, similar traceability and transparency
problems existed in the design and engineering documents that comprise the
Safety Analysis Report-the report necessary for demonstrating to NRC how
the facilities and other components of the repository site will meet the
project's health, safety, and environmental goals and objectives. In a
root cause analysis of this problem, the site contractor noted that
additional resources were needed to inspect and rework the documents to
correct the problems.

DOE Cannot Be Certain Its Efforts to Improve Quality Assurance Have Been
Effective Because of Weaknesses in Tracking Progress and Identifying
Problems

DOE cannot be certain that it has met continuous improvement goals for
implementing its quality assurance requirements, a commitment DOE made at
the closure of its Management Improvement Initiatives (Initiatives) in
April 2004. At that time, DOE told us it expected that the progress
achieved with the initiatives would continue and that its performance
indicators would enable it to assess further progress and direct
management attention as needed. However, DOE's performance indicators, as
well as a second management tool-trend evaluation reports-have not been
effective for this purpose. More specifically, the indicators panel did
not highlight the areas of concern covered by the Initiatives and had
weaknesses in assessing progress because the indicators kept changing. The
trend evaluation reports also did not focus on tracking the concerns
covered by the Initiatives, had technical weaknesses in identifying
significant and recurring problems, had inconsistently tracked progress in
addressing problems, and could not fully analyze projectwide problems.10
In addition, the trend reports' tracking of problems for which corrective
actions were already being taken was at times overly influenced by
judgments about whether additional management action was warranted rather
than by the problems' significance.

The Panel's Focus and Frequent Changes Hindered the Tracking of Progress
with Management Concerns and Quality Problems

By the time that the actions called for by the Initiatives had been
completed in April 2004, project management had already developed the
indicators panel, which DOE refers to as the annunciator panel, to use at
monthly management meetings to monitor project performance. The panel was
a single page composed of colored blocks representing selected performance
indicators and their rating or level of performance. A manager viewing the
panel would be able to quickly see the color rating of each block or
indicator. For example, red indicated degraded or adverse performance
warranting significant management attention; yellow indicated performance
warranting increased management attention or acceptable performance that
could change for the worse; and green indicated good performance. The
panel represented a hierarchy of indicators in which the highest level, or
primary, indicators were shown; secondary indicators that determined the
primary indicators' ratings were shown for some primary indicators; but
lower third- or fourth-level indicators were not shown. Our review
analyzed a subset of these indicators that DOE designated as best
predicting performance in areas affecting quality. While
we were conducting our review, DOE suspended preparation of the panel
after August 2005 while it reconsiders its use of indicators to monitor
project performance. DOE had also suspended preparation of the panel from
late 2004 to early 2005 in order to make substantial revisions. These
revisions were made, in part, to emphasize fewer, more important
indicators for management attention.
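
To make the panel's structure concrete, the sketch below models a
hierarchy of color-rated indicators in Python. It is a minimal
illustration only: the indicator names, the numeric ordering of the
colors, and the worst-case roll-up rule are assumptions made for the
example, since the report describes the panel's hierarchy and color
ratings but not the exact rule by which secondary indicators determined
primary indicators' ratings.

    from dataclasses import dataclass, field
    from enum import IntEnum

    class Rating(IntEnum):
        GREEN = 0   # good performance
        YELLOW = 1  # warrants increased management attention, or acceptable
                    # performance that could change for the worse
        RED = 2     # degraded or adverse performance warranting significant
                    # management attention

    @dataclass
    class Indicator:
        name: str
        rating: Rating = Rating.GREEN  # used directly by leaf indicators
        children: list["Indicator"] = field(default_factory=list)

        def rolled_up(self) -> Rating:
            # Leaf indicators report their own rating; a higher-level
            # indicator takes the worst rating among the indicators beneath
            # it (an assumed roll-up rule, not one stated in the report).
            if not self.children:
                return self.rating
            return max(child.rolled_up() for child in self.children)

    # Hypothetical primary indicator with two secondary indicators beneath
    # it; only the primary block would appear on the one-page panel.
    corrective_action = Indicator("Corrective action timeliness", children=[
        Indicator("Average time to resolve problems", Rating.YELLOW),
        Indicator("Delinquent corrective actions", Rating.GREEN),
    ])
    print(corrective_action.name, corrective_action.rolled_up().name)
    # -> Corrective action timeliness YELLOW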

The Initiatives identified five key areas of management weakness that
adversely affected the implementation of quality assurance requirements:

1. Roles and responsibilities were becoming confused as the project
transitioned from scientific studies to activities supporting licensing.
The confusion over roles and responsibilities was undermining managers'
accountability for results. The Initiatives' objective was to realign
DOE's project organization to provide a single point of responsibility for
project functions, such as quality assurance and the Corrective Action
Program, and to hold the project contractor more accountable for
performing the necessary work in accordance with quality, schedule, and
cost requirements.

2. Product quality was sometimes being achieved through inspections by the
project's Office of Quality Assurance rather than being routinely
implemented by the project's work organizations. As a result, the
Initiatives sought to increase work organizations' responsibility for
being the principal means of achieving quality.

3. Work procedures were typically too burdensome and inefficient, which
impeded work. The Initiatives sought to provide new user-friendly and
effective procedures, when necessary, to allow routine compliance with
safety and quality requirements.

4. Multiple corrective action programs existed, processes were burdensome
and did not yield useful management reports, and corrective actions were
not completed in a timely manner. The Initiatives sought to implement a
single program to ensure that problems were identified, prioritized, and
documented and that timely and effective corrective actions were taken to
preclude recurrence of problems.

5. Not all managers and staff understood the importance of a
safety-conscious work environment that fosters open communication about
concerns, and they had not been held accountable for inappropriately
overemphasizing the work schedule, inadequately attending to work quality,
or inconsistently practicing the desired openness about concerns. Through
issuing a work environment policy, providing training on
the policy, and improving the Employee Concerns Program, the Initiatives
sought to create an environment in which employees felt free to raise
concerns without fear of reprisal and with confidence that issues would be
addressed promptly and appropriately.

As shown in table 1, the Initiatives' effectiveness indicators for
tracking progress in addressing these management weaknesses did not have
equivalent performance indicators visible in the annunciator panel when it
was prepared for the last time, using August 2005 data.

Table 1: Visibility of Management Improvement Initiatives' Effectiveness
Indicators in Annunciator Panel When Last Prepared (Using August 2005 Data)

The table's four columns are: the key area of management weakness
identified in the Initiatives; the effectiveness indicators from the
Initiatives; DOE's response on coverage of the management weakness in the
panel's performance indicators; and GAO's comments and observations.

Key area: Roles, responsibilities, accountability, authority

o Indicator: An improving trend in quality and work schedule performance.
  DOE response: No integrated analysis of trends in quality and schedule
  performance.
  GAO comments: No indicator was visible in, or underlies, the panel. Some
  indicators measured aspects of quality or schedule, but provided no
  integrated analysis of these trends.

o Indicator: A consistently decreasing trend in quality problems related
  to roles and responsibilities.
  DOE response: No aspect measured.
  GAO comments: No indicator was visible in, or underlies, the panel.

Key area: Quality assurance programs and processes

o Indicator: The numbers of high-priority (significant) quality problems
  that are self-identified are at least 80 percent of all significant
  quality problems.
  DOE response: One indicator looked at work organizations' identification
  of problems, including less significant ones.
  GAO comments: No indicator, focused only on significant problems, was
  visible in the panel. One fourth-level indicator tracked work
  organizations' identification of significant problems.

o Indicator: A decreasing trend in average time to resolve significant
  quality problems and in number of delinquent corrective actions for
  significant quality problems.
  DOE response: A new timeliness measure has been developed.a
  GAO comments: No indicator was visible in panel. Aspect of fourth-level
  indicator tracked average time of resolution.

Key area: Work procedures

o Indicator: A decreasing number of quality problems related to
  ineffective procedures.
  DOE response: No aspect measured.
  GAO comments: No indicator was visible in, or underlies, the panel.

o Indicator: A decreasing trend in time needed to revise procedures.
  DOE response: No aspect measured.
  GAO comments: No indicator was visible in, or underlies, the panel.

o Indicator: A decreasing trend in average time of interim procedure
  changes.
  DOE response: No aspect measured.
  GAO comments: No indicator was visible in, or underlies, the panel.

Key area: Corrective Action Program

o Indicator: A decreasing trend in number of repetitive quality problems.
  DOE response: No aspect measured.
  GAO comments: No indicator was visible in, or underlies, the panel.

o Indicator: A decreasing trend in average time to resolve significant
  quality problems.
  DOE response: A new timeliness measure has been developed.a
  GAO comments: No indicator was visible in panel. Aspect of fourth-level
  indicator tracked average time of resolution.

o Indicator: Less than 10 percent of quality problems are resolved late.
  DOE response: A new timeliness measure has been developed.a
  GAO comments: No indicator was visible in the panel. A third-level
  indicator tracked percentage of
                                                             problems with    
                                                             timely           
                                                             resolution.      
Work environment    A decreasing number  Aspects of this  No indicator was 
                       of substantiated     issue are        visible in       
                       employee concerns    measured by work panel.           
                       for harassment,      environment                       
                       retaliation,         indicators.      A third-level    
                       intimidation, and                     indicator        
                       discrimination.                       measured this    
                                                             performance.     
                       Evaluation of        Goals have       No indicator was 
                       routine employee     remained at 30   visible in       
                       concerns in less     and 90 days.     panel.           
                       than 30 days, or                                       
                                                             Third-level      
                       90 days for complex                   indicators       
                       employee concerns                     measured the     
                       involving harassment                  timely           
                       or intimidation.                      completion of    
                                                             routine and      
                                                             other concerns.  
                       External evaluation  External         No indicator was 
                       of work environment  evaluation is    visible in       
                       shows positive       accomplished     panel.           
                       changes.             through                           
                                            independent      Four third-level 
                                            employee         indicators were  
                                            surveys,         based on the     
                                            reflected in     employee         
                                            third-level      surveys.         
                                            indicators.      

Source: GAO analysis of DOE data.

aNew timeliness indicator was not implemented by the time of the final
panel using August 2005 data.

Two of the Initiatives' key areas of concern-(1) roles, responsibilities,
authority, and accountability; and (2) work procedures-and their
associated effectiveness indicators were not represented in the panel's
visible or underlying indicators. The Initiatives' effectiveness indicator
for tracking trends in recurring problems also was not represented. In
other cases, the Initiatives' effectiveness indicators were represented in
underlying lower-level indicators that had very little impact on the
rating of the visible indicator. An example is the Initiatives' indicator
for timely completion of employee concerns. The panel's related visible
indicator was work environment, whose rating was based on 4 second-level
and 23 third-level indicators. Of the third-level indicators, two tracked
the timely completion of employee concerns, and together they contributed
only 3 percent toward the rating of the work environment indicator.
Because so many underlying indicators were weighted together, the rating
of an individual lower-level indicator could diverge from that of the
visible indicator. For example, in August 2005, the work environment
indicator
showed good performance. However, the ratings of four underlying
indicators from the project's employee survey on the work
environment-collectively accounting for 25 percent of the work environment
indicator's score-indicated the need for increased management attention.
Moreover, some of the Initiatives' indicators, such as the work
organizations' self-identification of significant problems, had their
impact on visible indicators diluted by the inclusion of other indicators
that were not focused solely on the detection of significant problems.
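
To illustrate the arithmetic of such dilution, the following minimal
sketch in Python computes a weighted-average rollup of lower-level
ratings. The indicator names, weights, and ratings are hypothetical,
loosely patterned on the 3 percent and 25 percent weights described
above; they do not represent DOE's actual formula or data.

    # Minimal sketch of a weighted-average indicator rollup.
    # All names, weights, and ratings below are hypothetical.
    def rollup(subindicators):
        # Each entry: (name, weight in percent, rating 0-100, higher is better)
        total_weight = sum(weight for _, weight, _ in subindicators)
        return sum(weight * rating
                   for _, weight, rating in subindicators) / total_weight

    subindicators = [
        ("employee survey results",  25, 40),  # poor rating, 25 percent weight
        ("timely concern closure",    3, 50),  # poor rating, 3 percent weight
        ("all other sub-indicators", 72, 95),  # good ratings dominate the score
    ]
    print(round(rollup(subindicators)))  # prints 80: composite looks good

Even with a quarter of the weight rated poorly, the composite score
remains high, which mirrors the August 2005 result in which the visible
work environment indicator showed good performance despite troubled
survey sub-indicators.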

Another shortcoming of the annunciator panel was that frequent changes to
the indicators hindered the ability to identify problems for management
attention and track progress in resolving them. Indicators could change
in many ways: their definitions, calculations, or underlying data sources
could be revised, or subindicators could be added or deleted. When such
changes were made to the indicators, progress
became less clear because changes in reported performance levels may have
been the result of the indicator changes rather than actual performance
changes. Some of the indicators for key project processes with quality
elements changed from one to five times during the 8-month period from
April 2004 through November 2004. Even after the major revision of the
panel in early 2005, most of the performance indicators tracking quality
issues continued to change over the next 6 months-that is, from March 2005
through August 2005. As shown in table 2, only one of the five relevant
indicators did not change during this period. One indicator changed four
times during the 6-month period, so that it differed from month to month
more often than it stayed the same.

Table 2: Key Indicators for Processes with Quality Elements, Their
Intended Focus, and Number of Times They Changed (March through August
2005)

o Performance improvement (changed 4 times): effectiveness of
self-assessment of quality and other issues, lessons learned, and the
Corrective Action Program.

o Work management (changed 1 time): quality of work products and
documents.

o Safety-conscious work environment (changed 0 times): worker confidence
in management support for raising quality and other concerns without fear
of retaliation; management effectiveness in detecting and preventing
retaliation for raising concerns; and effectiveness of normal and
alternative problem resolution.

o Human performance (changed 3 times): preventing, detecting, and
correcting human errors.

o Quality performance (changed 1 time): composite of quality indicators
in the areas of engineering products, self-assessment, the Corrective
Action Program, and work products and documents.

Source: GAO analysis of DOE data.

Moreover, the panel was not always available to identify problems and
track progress. The panel was not created for December 2004, January 2005,
and February 2005 because it was undergoing a major revision. At that
time, DOE told NRC that the performance indicators for the panel were
revised to reflect the change in the work as the project moved into the
engineering, procurement, and construction phase. DOE also reduced the
total number of visible indicators from 60 to 30 to focus on fewer, more
critical aspects of project management. Panels with the new indicators
were then produced for 6 months, from March 2005 through August 2005,
after which production stopped again. This second interruption resulted
from another major revision to the indicators, this time to make them
congruent with project work under DOE's "new path forward" and, again, to
focus on fewer, more important activities. In December 2005, a
senior DOE official told us that the project would begin to measure key
activities, but without use of the panel.

Trend Evaluation Reports Have Not Specifically Tracked the Initiatives'
Management Concerns and Have Had Weaknesses Tracking Significant and
Recurrent Problems for Management Attention

According to DOE, some of the Initiatives' areas of concern and their
associated effectiveness indicators-for example, trends in quality
problems related to roles and responsibilities-were being captured, at
least partially, in the project's quarterly trend evaluation reports
rather than in the performance indicators. However, the trend reports are
a management tool designed more to identify emerging and unanticipated
problems than to monitor progress with already identified problems, such
as those addressed by the Initiatives. In developing these reports, trend
analysts seek to identify patterns and trends in condition reports (CR),
which document problematic conditions through the project's Corrective
Action Program. The trend reports analyze CRs for more significant
problems (Levels A and B) and minor problems (Level C), but not at Level D
(opportunities for improvement). The trend analysis typically separates
the reported problems into categories such as organizational unit, type of
problem, and cause. These categories are intended to provide insights into
the problems. For example, analysis might reveal that most occurrences of
a particular type of problem are associated with a certain organization.

In practice, DOE missed opportunities to use trend reports to call
attention to progress in the Initiatives' areas of concern. For example,
the Initiatives sought to clarify roles and responsibilities within and
between DOE and BSC to ensure clear accountability for project results
during the project's transition from scientific studies to the design and
engineering activities necessary to license a repository. Similar
organizational transition problems were identified in the November 2004
trend report. While that report attributed increases in the number of
causal factors associated with change management, supervisory methods, and
work organization to recent BSC reorganizations and changes in the project
from science-based to design and engineering activities, it did not
specifically mention roles and responsibilities or note that they were an
area of concern under the Initiatives. However, an
analysis of the cause of the problems noted in various significant
condition reports, which is performed for certain condition reports and
outside of the process of developing trend reports, found evidence of
weaknesses in the organizational interfaces among BSC organizations, as
well as between BSC and DOE. According to this cause analysis, these
organizational interface weaknesses were associated with some manner of
change and represented weaknesses in the definition of roles and
responsibilities. Trend reports are generally based on condition reports,
and problems with roles and responsibilities seem to be identified in
cause analyses rather than in the condition reports themselves.

Similarly, DOE missed an opportunity to use trend reports to discuss the
Initiatives' goal that the project's line or work organizations become
more accountable for self-identifying significant problems. The August
2005 trend report briefly cited an evaluation of a CR highlighting the low
rate of self-identification of significant problems during the previous
quarter and reported the evaluation's conclusion that it was not a problem
warranting management attention. However, the trend report did not mention
that about 35 percent of significant problems were self-identified during
the previous quarter, while the Initiatives' goal was that 80 percent of
significant problems would be self-identified. Thus, the trend report
missed an opportunity to either raise a performance problem or pose the
question of whether the Initiatives' goal needed to be reassessed.

Beyond their tracking of the Initiatives' areas of concern, trend reports
face important general obstacles to adequately identifying recurrent and
significant problems:

o Recurring or similar conditions can be difficult to clearly identify for
management's attention and resolution. A trend report noted that there
would be few cases where recurrent conditions are obvious because each
condition differs slightly.

o Trend analysis tends to focus on the number of CRs issued, but the
number of CRs does not necessarily reflect the significance of a problem.
For example, the number of CRs involving requirements management decreased
by over half from the first quarter to the second quarter of fiscal year
2005. However, this decrease was not a clear sign of progress. Not only
did the number rise again in the third quarter, but the May 2005 trend
report also noted that the number of all condition reports had dropped
during the second quarter. According to the report, the volume of CRs in
the first quarter had been high because of reviews of various areas,
including requirements management. Another example is the records
management problem. The November 2005 trend report stated that a records
management problem identified in various CRs, despite accounting for about
50 percent of all business administration problems, reflected an
underlying error rate of less than 1 percent and thus was not a
significant problem.

o The lack of an increasing trend in the number of reported problems does
not necessarily mean the lack of a significant problem for management
attention. Knowing the appropriate level of performance, regardless of the
trend, is difficult without having clearly appropriate benchmarks from
organizations engaged in activities similar to the Yucca Mountain project.
Such benchmarks would clarify, for example, whether a project's
percentages of human performance errors compare favorably, regardless of
whether the numbers are increasing. Similarly, the trend in the number and
types of CRs during any period is not necessarily a sign of improvement or
worsening conditions. Trends can be attributed to various factors,
including increases in the number of audits or self-assessments, which can
lead to more CRs being issued.

o At the time of analysis, some trend data may not be sufficiently
reliable or complete to ensure sound findings for management's attention.
For example, although some actions were taken in December 2004 to ensure
that cause and other codes were properly assigned, a BSC audit in June
2005 again raised questions about the consistency of the coding. With
respect to completeness, the fourth quarter report for 2005 noted that 28
percent of the Level B CRs did not have a cause code at the time of the
trend analysis, and one finding was presented even though two-thirds of
the relevant data were missing (see the sketch following this list).
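
As a minimal illustration of how such gaps could be guarded against, the
Python sketch below checks the completeness of cause coding before a
trend finding is drawn. The field name and the 50 percent threshold are
illustrative assumptions, not project requirements.

    # Minimal sketch of a data-completeness check before trending.
    # Field name and threshold are illustrative assumptions only.
    def trend_ready(condition_reports, field="cause_code", max_missing_share=0.50):
        """Return True only if enough reports carry the coded field."""
        missing = sum(1 for cr in condition_reports if not cr.get(field))
        return missing / len(condition_reports) <= max_missing_share

    reports = [{"cause_code": "A1"}, {"cause_code": None}, {"cause_code": None}]
    print(trend_ready(reports))  # False: two-thirds of the cause codes are missing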

Due in part to these obstacles and to changes in how the analysis was
done, trend reports have not consistently determined the significance of
problems or tracked progress in resolving them. For example, the reports'
identification of significant human performance problems was
questionable, and their tracking of progress in resolving the problem was
ineffective, because no clearly appropriate or precise performance
benchmark existed, the focus on the problem was inconsistent, and cause
code data were unreliable.

The February 2004 trend report identified a human performance problem
based on Yucca Mountain project data showing the project's proportion of
skill-based errors to all human performance errors was two times higher
than benchmark data from the Institute of Nuclear Power Operations
(INPO).11 The report used this comparison to suggest that the project
needed to adopt successful commercial nuclear practices for addressing
skill-based errors. However, the report cautioned that other comparisons
with these INPO data may not be appropriate because of differences in the
nature, complexity, and scope of work performed, but did not explain why
the report's comparison of INPO data for skill-based errors to the Yucca
Mountain project should be an exception to this caution. The May 2004
trend report repeated this comparison to INPO, finding skill-based errors
three times higher than the benchmark data. However, this INPO benchmark
has not been used in subsequent reports.

The November 2004 trend report redefined the problem as the predominance
of human performance errors in general, rather than the skill-based
component of these errors-but later reports reinterpreted this
predominance as not a problem. The problem with skill-based errors was
unclear in the November 2004 report because these errors showed a
decreasing trend, a finding the report attributed to the likely
unreliable assignment of cause codes. Instead, the report cited an adverse
trend based on the fact that the human performance cause category
accounted for over half of the total number of causes for condition
reports prepared during the quarter. Under the project's trend analysis
guidelines, this large predominance of human performance causes-in
contrast to management, communication or procedure, and other cause
categories-was designated an adverse trend. Nevertheless, by February
2005, trend reports began interpreting this predominance as generally
appropriate, given the type of work done by the project. That is, the
project's work involves mainly human efforts and little equipment, while
work at nuclear power plants involves more opportunities for errors caused
by equipment. In our view, this interpretation that a predominance of
human performance errors would be expected implies an imprecise benchmark
for appropriate performance.

Although trend reports continued to draw conclusions about human
performance problems, the February 2005 report indicated that any
conclusions were hard to justify because of data reliability problems with
cause coding. For example, the majority of problems attributed to human
performance causes are minor (Level C) problems, such as not completing a
form, which receive less rigorous cause analysis. This less rigorous
analysis tends to reveal only individual human errors-that is, human
performance problems-whereas more rigorous analysis tends to reveal less
immediately obvious problems with management and procedures.

Trend reports have also inconsistently tracked progress in resolving the
problem associated with the "flow-down" of requirements into the project's
procedures-that is, with ensuring that program, regulatory, and statutory
requirements are identified, allocated, and assigned to the project
organizations that are responsible for applicable activities. Such
requirements management problems can result in inadequate control over
design inputs and, possibly, inputs to scientific models. Progress with
this problem was less clear because of inconsistent methods of
categorizing requirements management problems over time. Initially, based
on reviews of annual trends in condition reports, the September 2004 and
November 2004 trend reports observed a systemic and continuing problem in
the flow-down of requirements from BSC's Project Requirements Document and
identified this as an adverse trend. In subsequent reports, the
requirements flow-down problem was variously treated as an aspect of
requirements management or records management, or as a latent management
weakness or weak change management. When treated as an aspect of these
broader problems, the significance of the original flow-down problem and
any progress in resolving it became diluted and less clear. The primary
focus eventually became requirements management, which the February 2005
trend report designated as a potential trend, whereas the flow-down
problem had earlier been designated an adverse trend. As a result of this
change, the flow-down of requirements received less direct attention and
analysis; for example, it received only a footnote in the August 2005
trend report, stating that the April 2004 condition report issued to
address the adverse trend was still overseeing implementation of
corrective actions.

In addition, because trend reports examine only condition reports issued
to BSC, they do not always assess the projectwide significance of problems
such as requirements management.12 When analyzing one category of issues
associated with requirements management, the November 2005 report stated
that BSC and DOE shared the process problems, which cannot be adequately
addressed by just one of the organizations. However, for a second category
of these issues, the report did not analyze most of the condition reports
because 6 of the 10 relevant reports were assigned to DOE. For a third
category of issues, no analysis or recommendation was provided because all
of the reports were assigned to DOE and therefore did not fall within the
scope of the trend report.

DOE Has Not Adequately Tracked Problems Being Addressed by Ongoing
Management Actions

The tracking of problems for which corrective actions were already being
taken appeared at times to be overly influenced by judgments about
whether additional management action was warranted, rather than by the
problems' significance. As a result, problems might be rated as less
significant than they were, or not tracked further.

Lower ratings of a problem's significance apparently resulted from the
fact that a single rating served simultaneously as an assessment of the
problem's significance and of the need for management action. As
currently formulated, DOE's rating categories cannot accurately represent
both a problem's significance and a judgment that additional actions are
not needed; the designated rating category will distort one or the other.
For instance, the November
2005 trend report analyzed the four categories of requirements management
issues and designated one category that included problems with
requirements flow-down as a "monitoring trend"-defined as a small
perturbation in numbers that does not warrant action but needs to be
monitored closely. Describing this trend as a small perturbation did not
accurately reflect the report's simultaneous recognition that significant
process problems spanned both BSC and DOE, or the fact that similar
numbers and types of problems had been identified consistently over the
previous three quarters. A more
understandable explanation for the low rating is that designating the
problem at any higher level of significance would have triggered
guidelines involving the issuance of a condition report, which, according
to the judgment expressed in the report, was not needed. Specifically, the
report indicated that existing condition reports have already identified
and were evaluating and resolving the problem, thereby eliminating the
need to issue a new condition report.
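
The distortion arises because one field is asked to carry two judgments.
A minimal sketch of the alternative, recording significance and the need
for action as separate fields, follows; the record structure and category
names are hypothetical, not DOE's actual trend-report schema.

    # Minimal sketch separating a problem's significance from the
    # judgment about further action; structure and names are hypothetical.
    from dataclasses import dataclass

    @dataclass
    class TrendAssessment:
        significance: str        # e.g., "adverse", "potential", "monitoring"
        new_action_needed: bool  # whether a new condition report is warranted
        rationale: str

    flow_down = TrendAssessment(
        significance="adverse",   # the condition itself is significant
        new_action_needed=False,  # existing condition reports already cover it
        rationale="Problems span BSC and DOE; existing CRs are resolving them.",
    )
    print(flow_down)

Recording the two judgments separately would let a report acknowledge
that existing corrective actions make a new condition report unnecessary
without understating the problem's significance.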

By rating the problem at the lowest level of significance and not calling
for additional actions, the trend report did not sufficiently draw
management's attention to the problem. The trend report's assessment did
not convey that other serious problems might have been raised by the
additional condition reports. At about the same time that the trend report
judged that no new condition reports were necessary, an Employee Concerns
Program's investigation of requirements management resulted in 14 new
condition reports-3 at the highest level of significance and 8 at the
second-highest level of significance.13 For example, the Employee
Concerns Program's
investigation resulted in condition reports calling for an analysis of the
collective significance of the numerous existing condition reports and an
assessment of whether the quality assurance requirement for complete and
prompt remedial action had been met.14 As a result of the investigation
and a concurrent DOE root cause analysis, during the December 2005
Quarterly Management Meeting with NRC, DOE stated that strong actions were
required to address the problems with its requirements management system
and any resulting uncertainty about the adequacy of its design products.15

The February 2005 trend report identified significant problems, but
subsequent reports did not continue to track them after a separate
analysis identified ongoing improvement actions. According to the trend
report,
Level B condition reports collectively indicated organizational weaknesses
associated with change management involving cross-departmental interfaces.
The trend report recommended that management focus on these problems, and
cited a condition report that would further investigate them. The cause
analysis for that condition report and a related condition report found
that the problems were well-known, in part through a BSC review, and
related to a variety of ongoing BSC improvement actions. Since this was a
broad category of problems with many initiatives under way, the cause
analysis recommended no new actions other than for management to remain
aware of the problems. However, the trend reports that followed provided
no further analyses to focus management's awareness on these problems or
to assess progress in resolving them.

DOE's `New Path Forward' for Preparing to Submit Its License Application
Faces Substantial Quality Assurance and Other Challenges

In October 2005, DOE announced an aggressive series of proposed changes to
the design, organization, and management of the Yucca Mountain project,
but this effort-known as the "new path forward"-will face substantial
challenges. Some key challenges facing DOE are (1) determining the extent
of problems and restoring confidence in the documents supporting the
license application after the discovery of e-mails raising the potential
of falsified records, (2) settling design issues and associated problems
with requirements management, and (3) replacing key personnel and managing
the transition of new managers and other organizational challenges. The
current Acting Director of the Office of Civilian Radioactive Waste
Management (OCRWM) stated that DOE will not announce a schedule for
submitting a license application until DOE addresses these important
quality assurance and other challenges. Since DOE is still formulating its
plans, it is too early to determine whether the new path will resolve
these challenges.

Determining the Extent of Problems with Relevant Documents Will Delay
DOE's Submission of the License Application

Since announcing in March 2005 the discovery of USGS e-mails suggesting
possible violations of quality assurance requirements, including the
falsification of records, DOE has taken steps to address lingering
concerns about the adequacy of the scientific work related to the flow of
water into the repository and about whether similar quality assurance
problems are evident in other e-mails relevant to the license
application.
Specifically, DOE is (1) conducting an extensive review of approximately
14 million e-mails to determine whether these e-mails raise additional
quality assurance concerns and whether they might be relevant to the
licensing process, and (2) reworking the technical documents created by
USGS personnel to ensure that the science underlying the conclusions on
water infiltration is correct and supportable in the license application.
The Acting Director of OCRWM has stated that DOE will not submit a license
application until these efforts are complete. Consequently, given the
early planning stage of these efforts, it is unknown how long this will
delay the submission of a license application.

As part of the licensing process, DOE is required to publicly disclose all
documents relevant to the licensing application, including e-mails, by
posting them on DOE's public Web site, which is accessible through the
NRC-sponsored, Internet-based Licensing Support Network (LSN). To satisfy
schedule requirements, DOE must certify that relevant documents have been
posted to the network and made available for public review 6 months before
the submission of the license application. In preparation for submitting
the license application by December 2004, in June of that year, DOE
submitted almost 700,000 e-mails to the LSN that had been reviewed by
their original authors and determined to be relevant to the licensing
process. They were part of a group of approximately 6 million archived
e-mails authored by individuals still associated with the project.
However, in August 2004, NRC's Atomic Safety and Licensing Board ruled
that DOE had not met its regulatory obligation to make all relevant
documentary material available. Specifically, DOE had not reviewed a group
of approximately 4 million archived e-mails authored by individuals no
longer affiliated with the project to determine whether the e-mails were
relevant to the licensing process. As part of its effort to address the
board's ruling, BSC began a review of e-mails authored by employees who
were not currently working at the project. During this review, the
contractor discovered and brought forward e-mails between USGS scientists
working on water infiltration models that raised questions of the
potential falsification of technical information in order to sidestep
quality assurance requirements.

Following the discovery of the e-mails, DOE conducted a search to
determine if there were similar e-mails in the approximately 1 million
e-mails previously determined relevant for licensing. However, the DOE
Inspector General reported in November 200516 that there was no evidence
that the project requirements for identifying and addressing conditions
adverse to quality, such as those contained in the USGS e-mails, were
considered during the initial review of e-mails. Further, among the
approximately 10 million e-mails that had already been reviewed for the
licensing process, the Inspector General found additional e-mails
identifying possible conditions adverse to quality that project personnel
had not flagged as requiring further review. The DOE Inspector General
recommended, among other things, that DOE (1) expand the review of
archived e-mails to include both those deemed relevant and those deemed
not relevant to the licensing process, and ensure that conditions adverse
to quality are appropriately identified, investigated, reported, and
resolved; and (2) ensure that current and future e-mails are reviewed for
possible conditions adverse to quality and that such conditions are
appropriately addressed under the Corrective Action Program (CAP) system.
DOE accepted the Inspector General's recommendations. Specifically, DOE
agreed to develop a corrective action plan to expand the review of
archived e-mails to ensure that conditions adverse to quality are
appropriately identified and processed under the CAP system. In addition
to this review, the DOE Inspector General opened a criminal investigation
into the USGS e-mails in March 2005. As of December 2005, the
investigation was still in progress.

According to NRC on-site representatives, completing these e-mail reviews
will be challenging because DOE now has to screen millions of e-mails to
ensure that records were not falsified. Further, many of these e-mails
were written by employees who no longer work at the project or may be
deceased, making it difficult to determine the e-mails' true meaning and
context.
Moreover, if additional e-mails are found that raise quality assurance
concerns, DOE may have to initiate further review, inspections, or rework
to address the newfound problems. NRC officials stated that the agency
takes the issue of potentially falsified documents by USGS employees very
seriously,
wants a full understanding of the situation regarding the USGS e-mails,
and will conduct follow-up in this area. Because NRC wants DOE to submit a
high-quality license application, it has encouraged DOE to take the time
and actions necessary to fully and adequately resolve these and other
quality assurance issues.

Immediately following the discovery of the USGS e-mails, DOE undertook a
scientific investigation into the technical documents created by USGS
personnel. In October 2005, DOE began developing an action plan for
reviewing, validating, augmenting, and replacing USGS work products that
had come under scrutiny. Although the plan is not yet complete, the Acting
Director told us that the license application would not be submitted until
the USGS work is replaced and there is confidence that all requirements
have been met. In an effort to ensure that the scientific work underlying
water infiltration modeling is accurate, DOE is working to corroborate the
original work by engaging multiple agencies and organizations to rework
the models. For example, DOE has (1) had its lead project contractor work
with the Idaho National Laboratory to extensively review the software
and data used in the original science work, (2) engaged Sandia National
Laboratories to rework the model and calculations using different software
than was used originally, and (3) also asked USGS to rework the models.
Consequently, when this additional rework is completed, DOE will have
four sets of analyses (including the original scientific work) with which
it can evaluate, compare, and corroborate results. DOE will then select
one set of analyses for inclusion in the license application and work to
explain and defend its choice.

Ongoing Design and Requirements Management Issues Could Delay DOE's
Submission of the License Application

In October 2005, DOE announced significant changes to the design of the
Yucca Mountain repository to simplify the project and improve its safety
and operation. However, these changes will also require additional design
and engineering work that will add uncertainty about the timing of the
submission of a license application. DOE had been considering a design
where radioactive waste would be shipped to the Yucca Mountain site,
removed from its shipping container, placed and sealed in a special
disposal container, and finally moved into the underground repository.
Under this approach, the waste would be handled up to four separate
times.
In late 2003, DOE engineers began identifying potential safety problems
with this approach. First, fissures or holes accidentally created in the
cladding surrounding the spent nuclear fuel during handling could allow
air to reach the fuel and oxidize it; this radioactive oxidized material
could then leak and be dispersed into the air. Second, DOE engineers
determined that the original
facility design would not be able to adequately control the levels of
radioactivity in the buildings where the waste would be repackaged before
being moved in the repository. To address these problems, DOE researched a
series of options, including only accepting radioactive waste that had
already decayed to the point where oxidization would not be problematic,
and testing the waste shipments for oxidization and treating them at
another site before they arrived at the repository. DOE also considered
changing the design by filling the processing buildings with
inert gas to prevent oxidization and revising the electrical and
ventilation systems. According to a DOE official, these options were
impractical or added complexity to the design.

However, in October 2005, DOE proposed a new design that relies on uniform
canisters that would be filled and sealed before being shipped,
eliminating the need for direct handling of the waste prior to being
placed in the repository. As a result, DOE will not have to construct
several extremely large buildings costing millions of dollars for handling
radioactive waste. DOE believes this change will improve the safety,
operation, and long-term performance of the repository. However, this
change will also pose a challenge to the project because of the widespread
implications and the unknown time and effort required to implement it. For
example, to implement the new design, DOE will need to, among other
things,

o get approval from the Energy Systems Acquisition Advisory Board17 for a
new project plan, which, among other things, includes details on the
conceptual design, cost estimates, risk management efforts, and
acquisition strategies;

o plan, design, and produce standardized canisters for the transportation
of waste;

o coordinate this new approach with commercial nuclear power plants, NRC,
and government organizations that plan to ship waste to the project; and

o revise procurement and contracting plans to support the new design.

Finally, DOE will need to perform the detailed design and engineering work
required to implement the new design, and create new technical documents
to support the license application. However, DOE officials have stated
that, before the department can present its new plans and perform this
design and engineering work, it will need to resolve long-standing
quality assurance problems involving requirements management.
Requirements management is the
process that ensures the broad plans and regulatory requirements affecting
the project are tracked and incorporated into specific engineering
details. According to DOE's root cause analyses, low-level documents were
appropriately updated and revised to reflect high-level design changes
through fiscal year 1995. However, from 1995 through 2002, many of these
design documents were not adequately maintained and updated to reflect
current designs and requirements. Further, a document that is a major
component of the project's requirements management process was revised in
July 2002, but has never been finalized or approved. Instead, the project
envisioned a transition to a new requirements management system after the
planned submission of the license application in December 2004. However,
for various reasons, the license application was not submitted at that
time, and the transition to a new requirements management system was never
implemented. As a result, the document refers to the out-of-date NRC
regulations contained in 10 CFR part 60, and not the regulations in 10 CFR
part 63 that were finalized in October 2002.
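
To make the flow-down idea concrete, the following minimal sketch traces
high-level requirements to implementing documents and flags any that are
unallocated. The requirement identifiers, document names, and mapping
structure are hypothetical illustrations, not the project's actual
requirements management system.

    # Minimal sketch of a requirements flow-down (traceability) check.
    # Requirement IDs and document names are hypothetical.
    requirements = {"REQ-001", "REQ-002", "REQ-003"}
    allocations = {
        "REQ-001": ["design_spec_A"],  # traced to an implementing document
        "REQ-002": [],                 # identified but never allocated
    }

    # A requirement with no implementing document is a flow-down gap.
    unallocated = {req for req in requirements if not allocations.get(req)}
    print(sorted(unallocated))  # ['REQ-002', 'REQ-003']

A check of this kind would surface requirements that never reached the
organizations responsible for implementing them, which is the failure
the reviews described below repeatedly identified.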

The scope and cause of requirements management problems have been
identified in multiple DOE and NRC reviews.

o Multiple condition reports issued in 2004 and 2005 have identified
problems with requirements management. Due to these condition reports and
NRC concerns that repetitive deficiencies and the failure to implement
timely corrective actions could have direct implications on the quality of
the planned license application, NRC performed a review of Corrective
Action Program documents related to the requirements management program in
the late summer of 2005. NRC determined that these reports identified
approximately 35 deficiencies related to requirements management. Because
the requirements management documents are not current and the new
requirements management system has not been implemented, NRC concluded
that there does not appear to be a requirements management mechanism in
place. Further, based on the number of reports and other issues identified
by DOE audits, NRC concluded that the project's Corrective Action Program
was not effective in, among other things, eliminating the repeated
identification of deficiencies relating to requirements management or
initiating the actions to identify and appropriately address the root
cause of these problems.

o In September 2005, DOE began reviewing the root causes associated with
CR-6278, a condition report identifying problems with requirements
management. As part of the review, DOE personnel analyzed 135 condition
reports and other events and allegations. Among other things, this review
found that DOE expectations for requirements management were diluted and
eventually neglected, that DOE reduced funding for requirements management
due to reductions in its annual budget, and that these and other events
caused the requirements management process to become "completely
dysfunctional" from July 2002 to the time of the review in the fall of
2005. The analysis identified the root causes of these conditions as DOE's
failure to fund, maintain, and rigidly apply a requirements management
system.

o In November 2005, a team of DOE personnel concluded an investigation
into an employee's concerns regarding requirements management. The team
substantiated all of the concerns it investigated and found instances of
failures and breakdowns in the requirements management process. For
example, among other things, the team found that no procedure was
developed to describe how requirements management was to occur; some
existing requirements management procedures were not implemented; and
project management was aware of these conditions but corrective actions
were deferred because the planned requirements management system was
expected to address the problem. As a contributing factor, the team also
observed that the project's lead contractor had not implemented a
"traditional systems engineering approach" as it did not have, among other
things, typical engineering management plans or a separate systems
engineering organization responsible for requirements management. As a
result of the investigation, the team initiated 14 condition reports, 13
of which identified quality-related problems.

To address these problems, on December 19, 2005, DOE issued a stop-work
order on design and engineering for the surface facility and certain other
technical work. DOE stated that the root cause analysis for CR 6278 and
the investigation into employee concerns revealed that the project has not
maintained or properly implemented its requirements management system,
resulting in inadequacies in the design control process. This stop-work
order will be in effect until, among other things, the project's lead
contractor improves the requirements management system; validates that
processes exist and are being followed; and ensures that requirements are
appropriately traced to implementing mechanisms and products. Further,
DOE will
establish a team to take other actions necessary to prevent inadequacies
in requirements management and other management systems from recurring.

An example of the potential risks of a breakdown with requirements
management was noted during a BSC audit on the design process in March
2005. NRC on-site representatives observing this audit reported that the
audit team noted inconsistencies between the design documents of the
planned fuel-handling facility that would receive, prepare, and package
the waste before it is placed in the repository.
The original set of requirements specified that no water from a fire
protection system was to be used in the fuel-handling areas of the
facility because under certain scenarios, water used for fire suppression
could facilitate an accidental nuclear reaction, a condition known as
criticality. Later, as the project began to review the design of the
fuel-handling facility, the design was changed to allow the use of water
sprinklers in the fuel-handling areas of the facility to suppress possible
fires. NRC noted that personnel working on the design knew of the
inconsistencies between older and newer design documents, but no formal
tracking mechanism had been provided to ensure that those issues were
rectified. According to an NRC on-site representative in December 2005,
this was an example of a concern with requirements management, and
repetitive and uncorrected issues associated with the requirements
management process could have direct implications for the quality of the
license application.

While the project may be able to resolve these inconsistencies through an
informal process, the lack of a formal design control and requirements
management process increases the risk that not all such problems will be
addressed. These requirements management problems are potentially
significant because if the high-level engineering needs of the project are
not accurately or completely reflected in the detailed design, then the
quality of the license application may be compromised and cause delays in
the license application review process. For example, according to a 1989
speech prepared by NRC's Office of General Counsel stressing the
importance of quality assurance, a West Coast nuclear power plant
experienced similar quality assurance problems with requirements
management. After a license was issued by NRC, power plant personnel
discovered that the wrong diagrams were used to develop design
requirements. As a result of this and other quality assurance weaknesses
identified by NRC, the license was suspended and the power plant was
required to initiate an independent program to verify the correctness of
the design. Further, NRC reopened hearings on the issue of the adequacy of
the power plant's quality assurance program related to the plant's design.

DOE Faces Challenges in Managing the Transition, Complexity, and
Continuity of Its `New Path Forward'

In October 2005, DOE announced a "new path forward" that would create a
new project schedule and financial plan to address the completion of
scientific and engineering work in support of a license application.
However, DOE faces challenges to successfully implementing the new path,
in terms of managing the transition, program and organizational
complexities, and the continuity of management. According to DOE managers
involved with planning the new path forward, the organizational transition
could take several months to complete. It is too early to determine
whether DOE's new effort will resolve quality assurance issues and move
the project forward to the submission of a license application.

Accountability for quality and results, which was identified as a
significant transition issue in the Initiatives, will likely pose a
challenge for managing the transition to the new path forward. The
Initiatives sought to clarify roles and responsibilities within and
between DOE and contractor organizations to ensure clear accountability
for results and quality during the transition from OCRWM's organization,
processes, procedures, and skills supporting scientific studies to those
supporting the activities necessary to license a repository. As the
project realigns organizations, processes, procedures, and skills to
support the new path forward, it will also be faced with the challenge of
ensuring that accountability is not undermined during the transition. For
instance, according to one DOE manager, transitioning project work to a lead
laboratory under a direct contract with DOE could pose a significant
challenge for quality assurance because the laboratories are currently
working under BSC quality assurance procedures and will now have to
develop their own procedures.

Implicitly recognizing the importance of accountability, elements of the
new path forward seek to address issues that can negatively affect
quality assurance and project management in general. For instance, the new
path includes plans for developing and transmitting requirements to USGS
for the certification of scientific work. In addition, a senior project
official told us that the lead laboratory would provide a single point of
accountability that will enhance the quality of the science work. The
Acting Director indicated that OCRWM's management structure may have to be
reorganized to have a single manager clearly accountable for each of the
new path's major tasks in science, engineering, and licensing. Moreover,
the project is developing new performance indicators to allow the project
to assess important activities under the new path forward. Outside of the
new path, as the result of a September 2005 DOE Inspector General report18
on accountability problems with managing contract incentives, OCRWM agreed
to develop a comprehensive corrective action plan to provide clearer and
more objective performance standards in the BSC contract.

Program complexity and other project characteristics are also likely to
pose challenges to managing quality assurance. Based on its experience
with licensing and regulating nuclear power plants, NRC observed in the
mid-1980s that the Yucca Mountain project's characteristics, such as a
large and complicated program, increased the likelihood of major
quality-related problems. Although the new path is intended to simplify
design, licensing, and construction, the project remains a complicated
program that seeks to both restore confidence in its scientific studies
and pursue new design and engineering activities. As a result, the project
has to manage quality assurance issues simultaneously in both areas.
Moreover, the project involves a complicated organizational structure.
The project will continue contracting work with BSC, USGS, and Sandia
National Laboratories, which involves working with organizations in various
locations. In our 1988 report, we noted that the geographic distance
between the various organizations may hamper OCRWM's quality assurance
communication and oversight objectives.

The project also faces challenges in ensuring management continuity,
since DOE has experienced turnover in 9 of 17 key management positions
since 2001. To ensure the right
managers move the project forward to licensing, the project has a
recruitment effort for replacing key departing managers. In the past year,
the project has lost key managers through the departures of the director
of Project Management and Engineering, the director of the License
Application and Strategy, the director of Quality Assurance, and the
contractor's general manager. According to NRC on-site representatives in
August and October 2005, management turnover is a concern for NRC because
it would like to see continuity of qualified managers rather than a series
of acting managers. Recruiting replacement managers can disrupt project
continuity, and newly appointed acting managers may not take full charge
of project tasks. However, the Acting Director told us that the
recruitment process
is an opportunity to improve project managers and staff, but recruiting
the right people is challenging for various reasons-for example,
government salaries are less than those in industry, and employment
clauses restrict subsequent employment in related industries.

Finally, since new directors sometimes give new direction to the project,
a critical issue for sustaining the current new path forward is continuity
in the OCRWM director position, which was occupied by three individuals
between late 1999 and early 2005. The last OCRWM director assumed the
position in April 2002, started the Management Improvement Initiatives in
2002, and left the position in February 2005. The current Acting Director
took up his duties in the summer of 2005 and initiated the
new path forward in October 2005. DOE is currently awaiting Senate
confirmation of a nominee to fill the director position. However, the
Acting Director told us he expects that the new path forward will be
sustained because it has been endorsed by the Secretary of Energy.

Conclusions

DOE's Yucca Mountain project has been wrestling with quality assurance
problems for a long time. Now, after more than 20 years of project work,
DOE is again faced with substantial quality assurance and other challenges
to submit a fully defensible license application to NRC. Unless these
challenges are effectively addressed, further delays on the project are
likely. Furthermore, even as DOE faces new quality assurance challenges,
it cannot be certain it has resolved past problems, largely because its
management tools-specifically, its performance indicators and trend
evaluation reports-have not effectively identified and tracked progress
on significant and recurring problems. First, the management tools have
provided limited coverage of
the areas of concern identified in the Management Improvement Initiatives
and thus have not enabled DOE managers to effectively monitor progress in
these important areas. Second, the tools have often not reflected the full
extent or significance of problems because their scope has been limited
and not based on projectwide analysis. Third, the trend evaluation reports
have, at times, not accurately characterized problems because reliable and
complete data and appropriate performance benchmarks were not available at
the time of analysis. Fourth, frequent changes in performance indicators
and the way analysis is done have made it difficult to accurately identify
trends over time. Fifth, the tools' rating categories have sometimes been
misleading about the significance of problems because ratings tended to
be skewed by the fact that corrective actions were already under way,
without considering either those actions' effectiveness or the problem's
significance on its own terms. These shortcomings limit
project managers' ability to direct and oversee such a large and complex
undertaking as constructing an underground repository for nuclear wastes.
Further complicating DOE's ability to manage the project are the vacancies
in key managerial positions for the quality assurance program and
elsewhere on the project. The tools become even more important for new
managers who need to quickly understand project management issues.

Recommendations for Executive Action

To improve the effectiveness of DOE's efforts to monitor performance in
key areas at the Yucca Mountain project, including quality assurance, we
recommend that the Secretary of Energy direct the Director, Office of
Civilian Radioactive Waste Management, to take the following five actions
to strengthen the project's management tools:

o Reassess the coverage that the management tools provide for the areas of
concern identified in the Management Improvement Initiatives and ensure
that performance in these important areas is effectively monitored,
especially in light of the more recent condition reports and associated
cause analyses, trend reports, and other reviews indicating continuing
problems.

o Base future management tools, such as the trend evaluation reports, on
projectwide analysis of problems, unless there are compelling reasons for
a lesser scope.

o Establish quality guidelines for trend evaluation reports to ensure
sound analysis when reporting problems for management's attention. Such
guidelines should address, among other things, having reliable and
complete data and appropriate benchmarks.

o To the extent practicable, make analyses and indicators of performance
consistent over time so that trends or progress can be accurately
identified and, where changes to analyses or indicators are made for
compelling reasons, provide a clear history of the changes and their
impact on measuring progress.

o Focus the management tools' rating categories on the significance of the
monitored condition, not on a judgment of the need for management action.

Agency Comments

We provided DOE and NRC with draft copies of this report for their review
and comment. In oral comments, DOE agreed with our recommendations and
provided technical and editorial comments that we have incorporated in the
report, as appropriate. We also incorporated, as appropriate, NRC's oral
editorial comments, which primarily served to clarify its role.

As agreed with your office, unless you publicly announce the contents of
this report earlier, we plan no further distribution until 30 days from
the report date. At that time, we will send copies to interested
congressional committees and Members of Congress, the Secretary of Energy,
and the Chairman of the Nuclear Regulatory Commission. We will also make
copies available to others upon request. In addition, the report will be
available at no charge on the GAO Web site at www.gao.gov.

If you or your staff have any questions about this report, please contact
me at (202) 512-3841 or [email protected]. Contact points for our Offices of
Congressional Relations and Public Affairs may be found on the last page
of this report. GAO staff who made major contributions to this report are
listed in appendix III.

Sincerely yours,

Jim Wells
Director, Natural Resources and Environment

Appendix I: Objectives, Scope, and Methodology

The objectives of this review were to determine (1) the history of the
Yucca Mountain project's quality assurance problems since the project's
start in the 1980s, (2) the Department of Energy's (DOE) tracking of
quality problems and progress implementing quality assurance requirements
since our April 2004 report, and (3) challenges that DOE faces as it
continues to address quality assurance issues within the project. In
addition, we were asked to provide information about implementation of the
project's Employee Concerns Program and the types of concerns raised in
recent years through the program.

To determine the history of the project's quality assurance problems, we
reviewed our prior reports and those of DOE's Office of the Inspector
General concerning the Yucca Mountain project. We also reviewed internal
DOE evaluations and audit reports written about the quality assurance
program and Nuclear Regulatory Commission (NRC) reports and NRC-prepared
summaries of NRC and DOE quarterly management meetings, technical exchange
meetings, and quality assurance meetings dating to early 2004. In
addition, we reviewed letters and other communications between DOE and NRC
regarding quality assurance, available in NRC's Web archives, dating from
the late 1980s.
Furthermore, we reviewed plans for the Regulatory Integration Team (RIT)
and subsequent correspondence between Bechtel/SAIC Company, LLC (BSC),
DOE's management contractor for the Yucca Mountain project, and DOE.
Moreover, we discussed quality assurance issues with officials of DOE's
Office of Civilian Radioactive Waste Management (OCRWM), including the
Acting Director and Deputy Director, at DOE headquarters in Washington,
D.C., and at its field office in Las Vegas. In addition, we interviewed
representatives of Navarro Quality Services, a DOE subcontractor, and of
BSC, as well as NRC officials at the agency's field office in Las Vegas,
Nevada, and at its headquarters in Rockville, Maryland.

To determine DOE's tracking of quality problems and progress implementing
quality assurance requirements since our April 2004 report, we interviewed
OCRWM, BSC, and NRC officials about the status of these efforts since the
issuance of our prior report. We also reviewed DOE's Management
Improvement Initiatives (2002), DOE's Management Improvement Initiatives
Transition Approach (2003), and our 2004 report to understand the history
of the improvement efforts. To understand the management tools DOE uses to
monitor problems and progress, we reviewed the available performance
indicator panels from April 2004 through August 2005, when the panel was
last produced; the documentation on the individual indicators applied to
August 2005 data; and the quarterly trend reports from the fourth quarter
of fiscal year 2003 through the fourth quarter of fiscal year 2005. We also
reviewed information from condition reports and examined documentation on
DOE's Quality Assurance Requirements and Description (issued in August
2004), BSC's Trend Evaluation and Reporting, and DOE's Procedure:
Condition Reporting and Resolution (issued in November 2005).

To determine challenges that DOE faces as it continues to address quality
assurance issues within the project, we reviewed information from
condition reports, NRC on-site representative reports, DOE Inspector
General reports, and an investigative report by OCRWM's Office of the
Concerns Program Manager on past quality assurance problems and DOE's
efforts to address them. We obtained information on turnover in key
management positions at
DOE and BSC since 2000. In addition, we discussed with DOE and NRC
officials DOE's difficulties in addressing recurring quality assurance
problems and the quality assurance implications of the Yucca Mountain
project moving from the site characterization phase to design and
licensing. Also, to better understand issues and challenges, we attended
quarterly meetings held between DOE and NRC in Rockville in September 2005
and Las Vegas in December 2005.

To identify recent employee concerns related to quality assurance, such as
falsification of records and a safety-conscious work environment, as well
as to identify the actions taken to address those concerns, we reviewed
all concerns received by the OCRWM and BSC Employee Concerns Programs from
January through November 2005. For the OCRWM program, we reviewed all
employee concerns files to identify concerns related to quality assurance.
For the BSC program, we first read summary descriptions of each concerns
file and reviewed the full files only for those concerns we identified as
related to quality assurance. We then conducted a content analysis of all
the concerns files we reviewed. Next, three team members reached consensus
on the classification of each concern as a quality assurance problem, such
as potential falsification of records. Finally, through a second review of
the concerns files, we verified our recorded information for those
concerns that appeared to be important illustrations of problems. We also
spot-checked a sample of OCRWM and BSC concerns received in 2005 to verify
that they were accurately placed in the various concerns categories and
found that they generally were.

We performed our work from July 2005 through January 2006 in accordance
with generally accepted government auditing standards.

Appendix II: Yucca Mountain Project Employee Concerns Programs

NRC expects licensees to establish a safety-conscious work
environment-that is, one in which (1) employees are encouraged to raise
concerns either to their own management or to NRC without fear of
retaliation and (2) employees' concerns are resolved in a timely and
appropriate manner according to their importance. NRC encourages but does
not require licensees to establish employee concerns programs to help
achieve such a work environment, and both DOE and BSC have established
such programs.1 DOE's Employee Concerns Program is currently operated
under the requirements of DOE Order 442.1A, but the department, in
anticipation of becoming a licensee, is working to bring the program in
line with NRC expectations.

DOE and contractor employees at the Yucca Mountain project may raise
concerns about quality, safety, or other work environment issues-such as
harassment, intimidation, retaliation, and discrimination-through various
means. Employees are encouraged to resolve concerns at the lowest possible
level in the organization, in the following order:

o Use normal supervisory channels, such as by raising an issue to a
manager for resolution.

o Initiate a condition report through the Corrective Action Program-a
process in which any employee can formally identify a problem on the
project, such as with policies, procedures, or the work environment, and
have the issue investigated and, if necessary, fixed through corrective
actions.

o Submit a concern via e-mail, by telephone, or in person to one of the
project's two Employee Concerns Programs-a BSC program for BSC and
subcontractor employees and a DOE-run program for either DOE or BSC
employees.

o Contact NRC directly.

The DOE and BSC concerns programs are intended to supplement rather than
replace the resolution of problems through managers or the Corrective
Action Program.

DOE and BSC Employee Concerns Programs have each established a
communication network to allow employees to register concerns. These
networks include brochures and regular newsletters on the programs and
numerous links to the program on the project's intranet, where employees
can obtain concerns forms. Both the DOE and BSC concerns programs of the
Yucca Mountain project have four main steps, sketched schematically after
the list:

1. Employees notify the concerns program staff about issues that they feel
should be corrected, such as safety or health issues; harassment,
intimidation, retaliation, or discrimination; concerns raised through the
Corrective Action Program; and quality assurance problems.

2. The concerns program staff document and handle the concern in
accordance with the requirements of DOE Order 442.1A.

3. The concerns program notifies the employee of the results of the
investigation and notifies management of any deficiencies.

4. Project management develops corrective actions for deficiencies, and
the program validates that the concerns have been effectively addressed by
those actions.
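
To make this flow concrete, the following minimal sketch models a
concern's progression through the four steps. It is our illustration
only-the step names and the advance function are assumptions introduced
for this example, not part of either program's actual systems.

    from enum import Enum, auto

    class Step(Enum):
        # The four main steps common to the DOE and BSC concerns programs,
        # as described above (illustrative labels only).
        NOTIFIED = auto()          # 1. employee notifies concerns program staff
        HANDLED = auto()           # 2. staff document and handle the concern
        RESULTS_REPORTED = auto()  # 3. program reports results to employee and management
        VALIDATED = auto()         # 4. corrective actions are developed and validated

    def advance(step):
        """Move a concern to the next step in the workflow; VALIDATED is terminal."""
        order = list(Step)
        i = order.index(step)
        return order[min(i + 1, len(order) - 1)]

    # Example: walk a single concern through the full workflow.
    s = Step.NOTIFIED
    while s is not Step.VALIDATED:
        s = advance(s)
    print(s)  # Step.VALIDATED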

Under DOE Order 442.1A, concerns may be addressed through an investigation
by the concerns program staff, an independent investigation, a referral, a
transfer, or a dismissal of the concern. Employees can request or waive
confidentiality. If a concern is submitted anonymously, interpreting the
main issues and problems is left up to the concerns program staff, and
action on the concern may be limited if the submitted information does not
clearly or sufficiently define the concern.

The concerns program may conduct its own investigation of the concern.
Alternatively, it may refer the concern to another project organization
for investigation or resolution. After the results of the investigation or
resolution are reported to the concerns program within a specified period,
the concerns program accepts the results or requires additional actions.
In other cases, concerns may be transferred to another organization with
the appropriate subject matter responsibility or expertise, such as the
Office of Human Relations, Office of General Counsel, or Office of the
Inspector General.

After investigating a concern, the concerns program determines whether the
concern is substantiated, partially substantiated, unsubstantiated, or
indeterminate. If a concern is substantiated or partially substantiated,
the investigation results are presented to the responsible senior
managers. A concern is considered indeterminate when the evidence is
insufficient to substantiate it or to allow a conclusion to be drawn. Some
concerns can be resolved through a noninvestigative resolution, a method
for addressing concerns promptly when minimal effort is required for
resolution. Some resolutions involve the development of management
corrective action plans that are tracked until they are closed. In
addition, for deficiencies that identify systemic problems, the concerns
programs may file a condition report through the Corrective Action
Program. Moreover, DOE and contractor employees are required to report
certain conditions or alleged conditions to DOE's Office of the Inspector
General under DOE Order 221.1, which covers waste, fraud, and abuse. The
concerns program handles some employee concerns in this way. The sketch
following this paragraph illustrates the routing logic just described.
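
As with the workflow sketch above, the following Python fragment is purely
illustrative: the Finding names and the route function are our
assumptions, restating in executable form the disposition rules described
in the text.

    from enum import Enum, auto

    class Finding(Enum):
        # The four possible investigation outcomes described in the text.
        SUBSTANTIATED = auto()
        PARTIALLY_SUBSTANTIATED = auto()
        UNSUBSTANTIATED = auto()
        INDETERMINATE = auto()

    def route(finding, systemic, waste_fraud_abuse):
        """Return the follow-on actions the text describes for a completed investigation."""
        actions = []
        if finding in (Finding.SUBSTANTIATED, Finding.PARTIALLY_SUBSTANTIATED):
            # Substantiated results go to the responsible senior managers.
            actions.append("present results to responsible senior managers")
        if systemic:
            # Systemic deficiencies are captured in the Corrective Action Program.
            actions.append("file a condition report through the Corrective Action Program")
        if waste_fraud_abuse:
            # Waste, fraud, and abuse must be reported under DOE Order 221.1.
            actions.append("report to DOE's Office of the Inspector General")
        return actions

    # Example: a substantiated, systemic concern triggers two follow-on actions.
    print(route(Finding.SUBSTANTIATED, systemic=True, waste_fraud_abuse=False))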

From January through November 2005, DOE's concerns program opened 139
employee concerns for investigation, and the BSC concerns program opened
112 concerns for investigation.2 DOE's concerns program places concerns
into 14 categories, while the BSC program uses 20 categories.3 For both
DOE and BSC, the category receiving by far the most concerns during this
period was management: "management/mismanagement" for DOE and
"management practices" for BSC. According to DOE, management concerns
generally involved conditions related to management behavior, policy
practice, budget allocation, or use of resources. According to the manager
of BSC's program, about half of the concerns in the management practices
category involve hiring and human relations issues and the other half
involve organizational policies and other issues. The "quality" category
accounts for a relatively small portion of total concerns-18 percent of
concerns for the DOE program and 4 percent for the BSC program. Tables 3
and 4 show the concerns received by the DOE and BSC programs for January
through November 2005.

Table 3: Employee Concerns Opened for Investigation under DOE's Employee
Concerns Program by Category of Concern, January through November 2005


Concern category                                   Percentage of total concerns
Management/mismanagement                                                42
Workplace violence                                                       0
Harassment, intimidation, retaliation, and discrimination                6
Reprisal                                                                 0 
Chilling effect                                                          5 
Security                                                                 0 
Health                                                                   0 
Safety                                                                   4 
Environment                                                              0 
Fraud, waste and abuse                                                   4 
Human resources                                                         12 
Equal Employment Opportunity                                             2 
Quality                                                                 18 
Other                                                                    8 
Total                                                                  100 

Source: DOE.

Note: Percentages may not add to 100 because of rounding.

Table 4: Employee Concerns Opened for Investigation under BSC's Employee
Concerns Program by Category of Concern, January through November 2005


              Concern category                   Percentage of total concerns 
Management practices                                                    48 
Industrial                                                               1 
Health                                                                   4 
Fraud                                                                    3 
Fitness for duty                                                         1 
Ethics                                                                   5 
Cyber                                                                    0 
Access authorization                                                     0 
Environmental                                                            1 
Employee relations                                                       5 
Intimidation                                                             1 
Harassment                                                               0 
Discrimination                                                           4 
Chilling effect                                                          4 
Abuse                                                                    4 
Training                                                                 1 
Safety-conscious work environment                                        3 
Retaliation                                                              4 
Quality                                                                  4 
Other                                                                    6 
Total                                                                  100 

Source: BSC.

Note: Percentages may not add to 100 because of rounding.

The Employee Concerns Programs, which are designed to provide an
alternative to raising issues through the Corrective Action Program and
issuing condition reports, have played an active and sometimes key role in
identifying and addressing quality assurance problems, as the following
examples show:

o As part of an effort to identify e-mails that are relevant to the
licensing process and therefore should be included in the Licensing
Support Network, BSC employees in late 2004 discovered e-mails suggesting
potential falsification of technical records. The e-mails were submitted
to the Employee Concerns Program in March 2005 and were eventually
reported to the DOE Inspector General for investigation. The quality
assurance issues raised by the e-mails have resulted in a substantial
effort by DOE to restore confidence in the quality of technical documents
that will support its license application to construct the repository.

o In mid-2005, the DOE concerns program referred to the project's senior
management an employee's allegation that the project's schedule was taking
priority over quality in the review of technical documents. In this
instance, the Office of Concerns Program Manager negotiated with senior
management to address the time and resource needs for ensuring quality
assurance, rather than simply communicating to the organization that
quality should take priority over the schedule.

o As a result of an employee's concerns, referred to DOE by NRC in
mid-2005, the Employee Concerns Program initiated an extensive
investigation of issues related to requirements management. The
investigation substantiated the employee's concerns and led to the
issuance of 14 condition reports for problem resolution. Signifying the
importance of this issue, DOE discussed the requirements management
problems with NRC at their quarterly meeting in December 2005.

The Employee Concerns Programs' role in identifying and addressing quality
assurance and other issues depends on employees' willingness to submit
concerns, but that willingness has sometimes been in doubt. A late 2004
DOE survey of project employees indicated, for example, that fewer than
two-thirds of employees were confident that submitted concerns would be
thoroughly investigated and appropriately resolved. DOE recognizes the
need to improve employees' trust in and willingness to use the concerns
programs, and both the DOE and BSC programs are engaged in outreach
efforts. However, employees' willingness to submit concerns may be
affected by factors outside the programs' control. According to a DOE
manager, the project's recent and pending workforce reductions may account
for the decreasing number of concerns submitted to the DOE program in late
2005. Based on OCRWM Employee Concerns Program data, the program averaged
about 13 concerns a month from January through November 2005, but the
number dropped to 5 in October and 3 in November 2005.
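
As a quick arithmetic check of the reported average, the following
one-off snippet-using only figures taken from the text-confirms that 139
concerns over 11 months works out to roughly 13 a month, well above the
October and November counts.

    # Figures taken from the text: DOE's program opened 139 concerns over
    # the 11 months from January through November 2005.
    total_concerns = 139
    months = 11
    print(round(total_concerns / months, 1))  # 12.6 -- i.e., "about 13" a month
    # The last two months fell far below that average:
    print({"October 2005": 5, "November 2005": 3})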

During our review of concerns opened for investigation from January 2004
through November 2005, we did not identify any concerns alleging problems
similar to the falsification of technical records suggested by the USGS
e-mails. Although we found records of an early 2004 concern about an
instance of inappropriate handling of a technical document, the instance
was resolved and did not appear to involve an intentional or systematic
effort to falsify records. The manager of the BSC program told
us of a concern raised about another set of e-mails, but this concern was
not about record falsification. The manager of the DOE program told us
that she had not seen any reportable allegations of falsification of
technical records since she took her position in July 2004.

Appendix III: GAO Contact and Staff Acknowledgments

Jim Wells, (202) 512-3841 or [email protected]

In addition to the contact named above, Raymond Smith (Assistant
Director), Casey Brown, John Delicath, James Espinoza, and Terry Hanford
made key contributions to this report.

(360611)

GAO's Mission

The Government Accountability Office, the audit, evaluation, and
investigative arm of Congress, exists to support Congress in meeting its
constitutional responsibilities and to help improve the performance and
accountability of the federal government for the American people. GAO
examines the use of public funds; evaluates federal programs and policies;
and provides analyses, recommendations, and other assistance to help
Congress make informed oversight, policy, and funding decisions. GAO's
commitment to good government is reflected in its core values of
accountability, integrity, and reliability.

Obtaining Copies of GAO Reports and Testimony

The fastest and easiest way to obtain copies of GAO documents at no cost
is through GAO's Web site (www.gao.gov). Each weekday, GAO posts newly
released reports, testimony, and correspondence on its Web site. To have
GAO e-mail you a list of newly posted products every afternoon, go to
www.gao.gov and select "Subscribe to Updates."

Order by Mail or Phone

The first copy of each printed report is free. Additional copies are $2
each. A check or money order should be made out to the Superintendent of
Documents. GAO also accepts VISA and Mastercard. Orders for 100 or more
copies mailed to a single address are discounted 25 percent. Orders should
be sent to:

U.S. Government Accountability Office 441 G Street NW, Room LM Washington,
D.C. 20548

To order by Phone: Voice: (202) 512-6000 TDD: (202) 512-2537 Fax: (202)
512-6061

To Report Fraud, Waste, and Abuse in Federal Programs

Contact:

Web site: www.gao.gov/fraudnet/fraudnet.htm E-mail: [email protected]
Automated answering system: (800) 424-5454 or (202) 512-7470

Congressional Relations

Gloria Jarmon, Managing Director, [email protected] (202) 512-4400 U.S.
Government Accountability Office, 441 G Street NW, Room 7125 Washington,
D.C. 20548

Public Affairs

Paul Anderson, Managing Director, [email protected] (202) 512-4800 U.S.
Government Accountability Office, 441 G Street NW, Room 7149 Washington,
D.C. 20548

To view the full product, including the scope and methodology, see
www.gao.gov/cgi-bin/getrpt?GAO-06-313.

For more information, contact Jim Wells at (202) 512-3841 or
[email protected].

Highlights of GAO-06-313, a report to the Chairman, Subcommittee on the
Federal Workforce and Agency Organization, Committee on Government Reform,
House of Representatives

March 2006

YUCCA MOUNTAIN

Quality Assurance at DOE's Planned Nuclear Waste Repository Needs
Increased Management Attention

What GAO Recommends

GAO recommends five actions DOE can take to improve the project's
management tools and identify and address quality assurance and other
problems.

In oral comments, DOE agreed with GAO's recommendations.

What GAO Found

DOE has had a long history of quality assurance problems at the Yucca
Mountain project. In the 1980s and 1990s, DOE had problems assuring NRC
that it had developed adequate plans and procedures related to quality
assurance. More recently, as it prepares to submit a license application
for the repository to NRC, DOE has been relying on costly and
time-consuming rework to resolve lingering quality assurance problems
uncovered during audits and after-the-fact evaluations.

DOE announced, in 2004, that it was making a commitment to continuous
quality assurance improvement and that its efforts would be tracked by
performance indicators that would enable it to assess progress and direct
management attention as needed. However, GAO found that the project's
performance indicators and other key management tools were not effective
for this purpose. For example, the management tools did not target
existing areas of concern and did not track progress in addressing them.
The tools also had weaknesses in detecting and highlighting significant
problems for management attention.

DOE continues to face quality assurance and other challenges. First, DOE
is engaged in extensive efforts to restore confidence in scientific
documents because of the quality assurance problems suggested in the
discovered e-mails between project employees, and it has about 14 million
more project e-mails to review. Second, DOE faces quality assurance
challenges in resolving design control problems associated with its
requirements management process-the process for ensuring that high-level
plans and regulatory requirements are incorporated into specific
engineering details. Problems with the process led to the December 2005
suspension of certain project work. Third, DOE continues to be challenged
to manage a complex program and organization. Significant personnel and
project changes initiated in October 2005 create the potential for
confusion over roles and responsibilities-a situation DOE found to
contribute to quality assurance problems during an earlier transition.

View of Yucca Mountain and the Exploratory Tunnel for the Repository
*** End of document. ***