Federal Research: DOE Is Providing Independent Review of the Scientific
Merit of Its Research (Letter Report, 04/25/2000, GAO/RCED-00-109).
Pursuant to a congressional request, GAO provided information on the
Department of Energy's (DOE) merit review practices, focusing on: (1)
what procedures DOE has established for performing merit reviews; and
(2) whether DOE could document that it has followed the merit review
procedures it has established.
GAO noted that: (1) both the Office of Basic Energy Sciences and the
Office of Power Technologies have established procedures for merit
reviews, setting out what types of review will be performed, who will
perform the reviews, what criteria will be used in the evaluations, and
how individual reviewers' comments will be used by those making award
decisions; (2) the offices differ in the specifics, however, reflecting
DOE's belief that a "one size fits all" approach is not appropriate in
an agency with components that have such varying research objectives;
(3) for example, since the Office of Basic Energy Sciences conducts
basic research, it focuses its reviews on the merits of the science and
the qualifications of the researchers; (4) in contrast, since the Office
of Power Technologies conducts applied research, it focuses its merit
reviews on research outcomes and management of the projects and
programs; (5) these approaches are consistent with the Office of Science
and Technology Policy's view that peer review practices should be
flexible and tailored to agencies' missions and types of research; (6)
on the basis of GAO's analysis of programs and more than 150 specific
projects funded by the two offices in fiscal years 1998 and 1999, the
Office of Basic Energy Sciences and the Office of Power Technologies
documented that they were following the procedures they had established
for merit reviews; and (7) all projects are reviewed except those that
are provided project-specific funding through congressional mandates.
--------------------------- Indexing Terms -----------------------------
REPORTNUM: RCED-00-109
TITLE: Federal Research: DOE Is Providing Independent Review of
the Scientific Merit of Its Research
DATE: 04/25/2000
SUBJECT: Research and development
Quality assurance
Research programs
Performance measures
Research program management
Appendix I: Overview of Merit Review Practices in the Department of Energy
Appendix II: Comments From the Department of Energy
Appendix III: Key Contacts and Staff Acknowledgments
DOE Department of Energy
GAO General Accounting Office
Resources, Community, and
Economic Development Division
B-282892
April 25, 2000
The Honorable F. James Sensenbrenner, Jr.
Chairman
The Honorable Ralph M. Hall
Ranking Minority Member
Committee on Science
House of Representatives
Like the scientific community as a whole, federal agencies normally subject
their research programs to a peer review process. While there is no precise
definition, federal officials have characterized peer review as "a process
that includes an independent assessment of the technical, scientific merit
of research by peers who are scientists with knowledge and expertise equal
to that of the researchers whose work they review." Individual agencies vary
in their approach to peer review.
In March 1999, we reported on peer review procedures in 12 federal science
agencies, finding that all of them were using peer review to assess research
proposals.1 One of these agencies was the Department of Energy (DOE), which
referred to its process as "merit review with peer evaluation." You
subsequently asked that we conduct a follow-up study to determine whether
DOE had implemented the "merit review" procedures the agency said it had
established. You requested that we determine (1) what procedures DOE has
established for performing merit reviews and (2) whether DOE could document
that it has followed the merit review procedures it has established. As
agreed with your offices, the scope of our work was limited to DOE's Office
of Basic Energy Sciences in the Office of Science and the Office of Power
Technologies in the Office of Energy Efficiency and Renewable Energy. We
reviewed the merit review procedures and practices of these two offices and
analyzed the merit review files for selected projects and programs at
headquarters, one DOE operations office, one DOE field office, and two DOE
laboratories.
Both the Office of Basic Energy Sciences and the Office of Power
Technologies have established procedures for merit reviews, setting out what
types of review will be performed, who will perform the reviews, what
criteria will be used in the evaluations, and how individual reviewers'
comments will be used by those making award decisions. The offices differ in
the specifics, however, reflecting DOE's belief that a "one size fits all"
approach is not appropriate in an agency with components that have such
varying research objectives. For example, since the Office of Basic Energy
Sciences conducts basic research, it focuses its reviews on the merits of
the science and the qualifications of the researchers. In contrast, since
the Office of Power Technologies conducts applied research, it focuses its
merit reviews on research outcomes and management of the projects and
programs. These approaches are consistent with the Office of Science and
Technology Policy's view that peer review practices should be flexible and
tailored to agencies' missions and types of research.2
On the basis of our analysis of programs and more than 150 specific projects
funded by the two offices in fiscal years 1998 and 1999, the Office of Basic
Energy Sciences and the Office of Power Technologies documented that they
were following the procedures they had established for merit reviews.
Currently, all projects are reviewed except those that are provided
project-specific funding through congressional mandates.
The federal government is a primary source of funding for research and
development, which accounted for about $80 billion in the fiscal year 1999
budget. This research is performed by the government's own scientists as
well as by external organizations receiving federal financial assistance.
DOE is one of the largest federal agencies funding research, accounting for
$7.0 billion in research and development funds in the fiscal year 1999
budget. Research is carried out by universities, nonprofit organizations,
and industry through financial assistance awards or by the contractors that
operate DOE's national laboratory system. Non-DOE organizations submit
proposals for financial assistance in response to solicitations by DOE or
because they have identified areas of research they wish to pursue and
believe are compatible with DOE's research objectives. For the laboratory
projects, the laboratory contractors identify the projects they believe need
to be carried out by their respective laboratories and combine them into
"field work proposals" to DOE for funding consideration.
Within DOE, two of the offices funding research are the Office of Basic
Energy Sciences in the Office of Science and the Office of Power
Technologies in the Office of Energy Efficiency and Renewable Energy.3 The
two offices differ in their research objectives, however, with the Office of
Basic Energy Sciences focusing on basic research, while the Office of Power
Technologies focuses on applied research. The purpose of basic research is
to obtain greater knowledge of the fundamental aspects of phenomena and
observable facts without specific applications toward processes or products.
The purpose of applied research is to gain the knowledge or understanding
necessary for determining the means by which a recognized and specific need
may be met. Said another way, basic research aims at expanding knowledge,
while applied research aims at solving practical problems.
The Office of Basic Energy Sciences' mission is to foster and support
fundamental research in the natural sciences and engineering that will
provide a basis for (1) developing new and improved energy technologies and
(2) understanding and mitigating the environmental impacts of energy use.
This research is subdivided into four broad subprograms--materials science,
chemical science, engineering and geosciences, and energy biosciences. As
part of its mission, the Office of Basic Energy Sciences plans, constructs,
and operates major scientific user facilities to serve more than 2,400
researchers in universities, other nonprofit organizations, national
laboratories, and industry. In fiscal year 1999, the Office of Basic Energy
Sciences' research and development budget was $779.2 million. Approximately
$639.2 million, or 82 percent, supported research at DOE laboratories, while
$134.4 million, or 17.2 percent, went for financial assistance projects
funded through grants, contracts, and cooperative agreements. The remaining
$5.6 million, or 0.7 percent, went for all other types of projects.
The Office of Power Technologies' mission is to work with electric service
providers and related industries to advance clean, competitive, and reliable
power technologies. The Office of Power Technologies develops renewable
energy technologies that use solar, wind, hydropower, geothermal, and
biomass energy resources and conducts research and development aimed at
creating a hydrogen energy infrastructure. The Office of Power Technologies
also develops advanced technologies--including high-temperature
superconducting materials, real-time power system controls, and energy
storage--that will improve the reliability, energy efficiency, and
cost-effectiveness of the nation's electric transmission and distribution
systems. Finally, the Office of Power Technologies facilitates the export of
renewable energy power generation internationally. The Office of Power
Technologies' research and development budget for fiscal year 1999 was
$270.7 million. Approximately $181.4 million, or 67 percent, of the funding
supported research at DOE laboratories, while the remainder was for
financial assistance projects.
Like other federal research agencies and the scientific community in
general, DOE supports the use of independent peer review of the research it
sponsors. DOE practices peer review as "merit review with peer evaluation,"
which DOE officials say is a formal, competent, and objective evaluation
process using specified criteria and the review and advice of qualified
peers. DOE uses merit reviews to guide the direction of research and to
assess its progress. DOE defines merit review in 10 C.F.R. 600.3 as "a
thorough, consistent, and objective examination of applications based on
pre-established criteria by persons who are independent of those submitting
the applications and who are knowledgeable in the field of endeavor for
which support is requested." These individuals may come from any source,
including industry, academia, private and nongovernmental institutions,
government agencies, and their associated laboratories.
DOE does not have a single agencywide set of policies and procedures for
merit reviews. Rather, there are specific procedures set out for the
agency's various programs and functions. According to the Deputy Secretary
of Energy, DOE's research grants, cooperative research and development, and
other financial programs supporting research and development are governed by
policies--including guidance on merit reviews--set out in 10 C.F.R. parts
600 through 605. Similarly, for research and development programs conducted
through contractual mechanisms and competitive procurements, policies and
procedures requiring objective review are established by statute,
regulation, practice, and culture. The laboratories are expected to apply
merit review procedures as set out in the agreements between DOE and the
laboratory contractors.
Our March 1999 report on peer review practices in 12 federal science
agencies contained an appendix on DOE. We have included the information on
DOE in appendix I of this report.
Merit Reviews
Both the Office of Basic Energy Sciences and the Office of Power
Technologies have established merit review procedures for their research
programs. The specific procedures vary between the two offices, however,
reflecting DOE's view that merit reviews should be tailored to the specific
office or program involved rather than having a "one size fits all" policy
agencywide. This view is consistent with the Office of Science and
Technology Policy's belief--as discussed in our earlier report--that peer
review practices should be flexible and tailored to agencies' missions and
types of research.
The Office of Basic Energy Sciences requires that all of its
research--except for projects mandated by the Congress--be subjected to
merit review prior to being funded. The Office of Basic Energy Sciences'
primary objective in these merit reviews is to "provide an independent
assessment of the scientific and/or technical merit of research by peers
having knowledge and expertise equal to that of the researchers whose work
they review."
The Office of Basic Energy Sciences' merit reviews are project-specific. The
reviewers normally are persons familiar with the science required by the
project but not closely associated with the particular research or
organizations involved in the research. The criteria used by the reviewers
in evaluating the proposals are set out in the regulations and DOE's
published procedures and are essentially the same for each review. The
reviewers normally provide independent evaluations in narrative form.
Reviews Are Project-Specific
The Office of Basic Energy Sciences' policy is to perform a project-specific
merit review prior to awarding funds for any financial assistance or
laboratory project unless that project was mandated by the Congress. In
fiscal year 1998, there were no congressionally mandated projects in the
Office of Basic Energy Sciences' research budget of $645 million. In fiscal
year 1999, the Office of Basic Energy Sciences' research budget of $779.2
million included only one congressionally mandated project, accounting for
$487,000, or 0.06 percent of the overall budget.
According to procedures issued by the Office of Science, financial
assistance awards--which typically are awarded for multiple years--are to be
merit-reviewed before original approval and at every renewal. While the
merit review on a renewal can be waived, no financial assistance project may
be renewed for more than 6 years without a review. The Office of Basic
Energy Sciences provides that laboratory projects funded through field work
proposals are to be merit-reviewed prior to approval and generally every 3
to 4 years thereafter. Under certain circumstances, the Office of Basic
Energy Sciences may allow laboratory projects to be extended up to 6 years
without additional merit reviews.
Officials from the Office of Basic Energy Sciences noted that the office's
research projects are often subject to other types of external review. One
of these is the peer review process to which scientific papers are subjected
before they can be included in scientific journals. These constitute peer
reviews in their own right and also are made available to the merit
reviewers selected by the Office of Basic Energy Sciences for their use in
evaluating the projects in question.
Reviewers Have Expertise in the Science Rather Than in the Specific Area of
Research
An Office of Basic Energy Sciences merit review team must comprise three or
more professionally and technically qualified persons, and the reviewers
themselves must be free from conflict of interest.4 Office of Basic Energy
Sciences officials told us that program managers--who typically choose the
reviewers--must keep their knowledge current in the fields in which they
work. This would include knowing who is qualified to serve as a merit
reviewer and being familiar with their previous reviews. Over time, the
Office of Basic Energy Sciences has developed a cadre of reviewers that it
can call on to carry out specific reviews. The reviewers receive no
additional pay but may be reimbursed for travel expenses.
The Office of Basic Energy Sciences prefers to use merit reviewers from
outside DOE. While they did not have a precise definition of what
constitutes a conflict of interest for outside reviewers, Office of Basic
Energy Sciences officials said that they would not use any persons who
themselves were submitting a grant proposal for the project under review.
However, they might choose another employee of the same organization if he
or she was sufficiently removed from the project. For example, if a
university submits a proposal for a grant, a reviewer could be from the same
university but not from the same department that would be performing the
research.
For both financial assistance and laboratory projects, procedures issued by
the Office of Science and the Office of Basic Energy Sciences would exclude
as reviewers DOE or laboratory contractor personnel from the same laboratory
or from another laboratory working on the research in question. DOE
employees could not be reviewers if they were the contracting officer or
were responsible for managing, auditing, or providing technical assistance
on the project in question.
The Office of Basic Energy Sciences conducts each merit review using one of
four basic methods:
- Field readers. Under this method, Office of Basic Energy Sciences program
managers send project packages to three or more reviewers. The reviewers
then return written comments to the program manager. The readers do not have
contact with one another. This method can be used for both financial
assistance proposals and laboratory projects.
- On-site or off-site panel reviews. The Office of Basic Energy Sciences may
request three or more reviewers to meet as a panel to evaluate laboratory
projects. The reviewers are required to document their findings to the
program manager.
- Standing committees. The Office of Science has the authority to establish
and use a standing committee to review financial assistance projects. The
choice of a standing committee is appropriate when required by legislation
or when (1) there are enough applications on specific topics received on a
regular basis, (2) there are persons available on the committee to serve as
reviewers, or (3) the legislative authority for the project involved extends
beyond 1 year.
- Ad hoc committees. The Office of Basic Energy Sciences may use ad hoc
committees when it determines that a proposal for either a financial
assistance or laboratory project has special review requirements. Such
requirements might include construction or facility operation; subject
matter complexity involving several areas of expertise; consideration of
several projects on a similar topic; or a subject matter of a special
nonrecurring nature.
Review Criteria Are Standardized
The Office of Basic Energy Sciences uses standardized criteria for its merit
reviews. The criteria that reviewers are to consider for financial
assistance proposals are set out in 10 C.F.R. 605.10, in descending order of
importance, as follows:
- scientific and/or technical merit or the educational benefits of the
project,
- appropriateness of the proposed method or approach,
- competency of applicant's personnel and adequacy of proposed resources,
- reasonableness and appropriateness of the proposed budget, and
- other appropriate factors established and set forth by the Office of
Science in a notice of availability or a special solicitation.
The criteria for performing merit reviews on laboratory projects are set out
in Office of Basic Energy Sciences procedures and mirror those set out for
financial assistance projects. The only differences are that (1) the first
criterion for laboratory projects omits the terminology "educational
benefits of the project," (2) the final criterion is worded "other
appropriate factors established and set forth by the Office of Basic Energy
Sciences," and (3) the criteria do not specify their order of importance.
Office of Basic Energy Sciences officials said that the use of standardized
criteria is essential because each proposal involves basic research and is
evaluated on its own merit. The Office of Basic Energy Sciences' concern is
that each project is "good science," that an organization submitting a
proposal is qualified and capable, and that merit reviewers look at each
project in the same manner.
Reviewers Provide Independent Narrative Assessments
Office of Basic Energy Sciences merit reviewers are required to provide a
written evaluation or analysis to the program manager. Reviewers are
independent and, in preparing their narrative comments, are not required to
follow any particular format or even comment on each of the individual
criteria. Reviewers normally do not assign numerical scores or rank
proposals against one another. Also, reviewers normally do not provide a
consensus analysis of the proposals. In the case of field readers, the
reviewers do not even know who the other reviewers were or what they
reported. In the case of panels, there may be a summary of the individual
reviewers' reports.
The program manager is responsible for providing a narrative analysis and
funding recommendation on each proposal to the selecting official. While the
program manager uses the merit reviewers' comments in making his or her own
decision and includes a summary of the comments in the narrative, he or she
does not show a consensus of the reviewers' views. Instead, according to
Office of Basic Energy Sciences officials, the program manager considers the
quality of the technical insights in each reviewer's comments. The program
manager also considers the reviewer's reputation and expertise as well as
the program manager's previous experience with the reviewer. For example, a
short paragraph from a highly respected expert in the area might carry more
weight than several pages from a less experienced scientist. In addition,
the program manager must consider the reviewers' reports in comparison with
one another.
As with the Office of Basic Energy Sciences, Office of Power Technologies
research projects--other than those mandated by the Congress--are subjected
to merit review prior to being funded. Generally, Office of Power
Technologies projects and programs are reviewed annually as a part of the
individual program reviews used to formulate annual operating plans. In
addition, financial assistance projects and competitively bid laboratory
subcontracts are subjected to their own merit reviews.
The Office of Power Technologies differs from the Office of Basic Energy
Sciences in its approach to merit reviews. Office of Power Technologies
officials said that because they focus on applied rather than basic
research, the scientific merit of a program or project normally has already
been established. They are more concerned with whether the research will
achieve the desired objectives and thus concentrate merit reviews on the
anticipated results of the research and management of the program or
project. To this end, the reviews tend to be program-oriented, rely on
persons and panels with specific knowledge in the particular field of
research, use review criteria that are designed specifically for the project
or program under review, and make use of numerical scoring and consensus
reporting.
Reviews Are Program-Oriented and Multilevel
The Office of Power Technologies takes an integrated programmatic approach
to merit review, and research projects are subjected to merit review at
multiple points in the planning process. These include the long-range plan
for determining where the technology is headed, multiyear and annual plans
for establishing DOE program direction, and the award process for individual
projects.
As a part of its long-range planning efforts, the Office of Power
Technologies develops "technology road maps" for individual programs to
define how the technology is expected to develop over some period. The
driving force for the road map is the industry--for example, wind,
photovoltaics, superconductivity--behind the particular technology.
Generally, the road map is put together by persons in the top echelons of
the industry with input and assistance from the Office of Power
Technologies. The period covered by the road map depends on the technology
and the window for its development. In the photovoltaics area, for example,
the road map covers 20 to 25 years, as the technology is still being refined
and developed. The road map in the superconductivity area is much
shorter--about 5 years--as there is a better idea of the end result and what
is needed to get there.
In addition to the technology road maps, the Office of Power Technologies
develops multiyear plans that set out goals, objectives, and strategies over
a shorter period of about 5 years. The multiyear plan is developed by the
office with industry input and review. Office of Power Technologies
officials said that both the technology road maps and the multiyear plans
are a form of peer or merit review--although not labeled as such--because
they involve outside experts helping the office determine its research
priorities and objectives.
Because of its focus on applied research, the Office of Power Technologies'
merit review process includes annual programmatic reviews as well as
individual project reviews. In addition, peer review of various long-range
planning documents for each of these programs is also considered part of the
merit review process.
The Office of Power Technologies has 12 programs--solar buildings,
photovoltaics, concentrating solar power, wind energy, geothermal,
hydropower, biopower, high-temperature superconductivity, hydrogen, energy
storage, transmission reliability, and distributed power--funding research
by DOE laboratories and external organizations. The Office of Power
Technologies develops annual operating plans for each program that, among
other things, set out particular research projects that are to be added,
modified, or dropped. Prior to developing the plan, the officials' practice
is to conduct program reviews of the individual programs and their projects.
While these reviews vary in format among the programs, the general approach
is to assemble cognizant DOE, DOE laboratory contractor, and industry
personnel at a common location and have the individuals responsible for the
various projects make presentations and answer questions raised by the
persons in attendance. As part of this process, the Office of Power
Technologies puts together a panel charged with rating each project on its
own merits and in comparison with the other projects. This review--which in
effect constitutes merit review for the individual projects as well as the
entire program--is then made available to the Office of Power Technologies
management to develop the annual operating plan and decide the budget for
various projects.
For financial assistance awards, the Office of Power Technologies in the
Office of Energy Efficiency and Renewable Energy requires a merit review on
each specific project. Prior to May 1998, this often was the case only for
competitive awards. However, in response to congressional concerns about the
number of noncompetitive awards it was making, the Office of Energy
Efficiency and Renewable Energy changed its procedures to encourage more
competitive awards. This change resulted in the Office of Power
Technologies' reducing the level of noncompetitive awards from 14 percent in
fiscal year 1998 to 5.9 percent in fiscal year 1999. Also, in May 1998, the
Office of Energy Efficiency and Renewable Energy reemphasized, through a
Federal Register notice, the requirement that all discretionary financial
assistance awards are to be subjected to merit reviews regardless of whether
they are competitive or noncompetitive.
For research conducted by DOE laboratories, individual projects normally are
not merit-reviewed separately because, according to Office of Power
Technologies officials, they already have been subjected to the program
review process. If the laboratory subcontracts part of the research, these
subcontracts typically are subjected to merit reviews if the subcontracts
are competitively bid. Noncompetitive subcontracts generally are subjected
to reviews by the laboratory management team only.
Office of Power Technologies officials said that they did not require merit
reviews of congressionally mandated projects. Such projects accounted for
9.4 percent of the office's research budget in fiscal year 1998 and 6.7
percent in fiscal year 1999.
Office of Power Technologies officials stressed that there are many other
occasions on which programs or projects may be subjected to merit review on
an ad hoc and postaward basis. In 1994, for example, a review team from what
is now the Office of Science performed a review of 115 research projects
sponsored by the photovoltaics program. Similarly, program or laboratory
management may convene a special merit review team at any time they believe
they need the assistance. Also, papers on research results submitted for
publication normally are subjected to a rigorous peer review process by the
scientific journals to which they are submitted.
Reviewers Have Expertise in the Specific Area of Research
As with the Office of Basic Energy Sciences, an Office of Power Technologies
merit review team must comprise three or more persons who are competent and
free from any conflict of interest. Unlike the Office of Basic Energy
Sciences, however, the Office of Power Technologies requires that its
reviewers be more closely aligned with the field of research and makes
greater use of DOE and laboratory contractor personnel.
Office of Power Technologies officials said they choose team members
knowledgeable about the program or project in question because team members
need to be familiar with the specific research and with the persons or
organizations involved in the research. While they did not have a precise
definition of what would constitute a conflict of interest, they said it did
not mean that individuals who worked in a related area would be precluded
from serving as reviewers. They said they would not use a person who worked
for one of the organizations competing for the award nor would they use the
selecting official, as these persons would not be sufficiently removed from
the process and would have a potential conflict of interest. Each reviewer
is required to sign a statement that he or she is free from conflicts of
interest on the subject review.
Office of Power Technologies officials said they normally require that the
merit review team members meet as a panel, regardless of whether the merit
review is for a program or a project. The teams can vary in size, normally
depending on the scope of work they are asked to perform and the complexity
of the project or program. Except in limited instances, the team members
receive no additional compensation but may be reimbursed for travel
expenses. In general, the Office of Power Technologies attempts to group
reviews and other meetings in such a way that the additional expense of a
particular merit review is kept to a minimum.
Review Criteria Vary by Program and Project
Unlike the Office of Basic Energy Sciences, the Office of Power Technologies
does not require its reviewers to apply standardized criteria in evaluating
programs and projects. Rather, Office of Power Technologies or laboratory
officials set criteria tailored to their needs on the particular merit
review. Office of Power Technologies officials said that each program and
project is different and that the flexible criteria are consistent with
having reviews aimed at evaluating projects for their ability to achieve the
desired results. However, they noted that typical evaluation criteria
frequently focus on the project's approach, the technical merit of the
project, and the capabilities of the applicant and key personnel.
Panels Provide Numerical Scores and Consensus Views
Unlike the Office of Basic Energy Sciences, the Office of Power Technologies
generally requires that its merit reviewers use numerical scoring sheets and
that the panels tabulate the results and reach consensus opinions. The
results are to be summarized by the leader of the team and provided to the
selecting officials, who then use them in making the final selections.
Office of Power Technologies officials said that they believe numerical
scoring and consensus opinions add consistency to the reviews; without them,
someone else must interpret how the reviewers rated the project. Numerical
scoring also helps the reviewers identify their differences, both in their
perceptions of the applicant's qualifications and in the relative scale each
reviewer uses, which is helpful in discussing the proposals and reaching
consensus opinions.
The summary prepared by the team for a program review or a competitively bid
award typically ranks the various applicants in comparison with each other.
In some cases, the reviewers may be asked to set cutoff points to show which
applicants the reviewers thought were qualified to receive an award.
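To illustrate the mechanics of numerical scoring, ranking, and consensus
described above, the following Python sketch shows one way a panel's scores
could be tabulated and checked for reviewer disagreement. It is a
hypothetical illustration only; the proposal names, the 1-to-10 criterion
scale, and the summary statistics are assumptions, not DOE's actual scoring
procedure.

    # Hypothetical tabulation of panel scores (illustrative only; not a DOE tool).
    # Each reviewer scores each proposal against the review criteria; the summary
    # averages the totals and ranks the proposals for the selecting official.
    from statistics import mean

    # scores[proposal][reviewer] = criterion scores on an assumed 1-to-10 scale
    scores = {
        "Proposal A": {"R1": [8, 7, 9], "R2": [7, 8, 8], "R3": [9, 8, 7]},
        "Proposal B": {"R1": [5, 6, 4], "R2": [6, 5, 6], "R3": [4, 6, 5]},
    }

    summary = []
    for proposal, by_reviewer in scores.items():
        totals = [sum(criteria) for criteria in by_reviewer.values()]
        summary.append((proposal, mean(totals), min(totals), max(totals)))

    # Rank proposals by average total score; the min-max spread flags reviewer
    # disagreement that a panel would discuss before settling on a consensus score.
    for rank, (proposal, avg, low, high) in enumerate(
            sorted(summary, key=lambda item: item[1], reverse=True), start=1):
        print(f"{rank}. {proposal}: average {avg:.1f} (range {low}-{high})")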
On the basis of our review of available documentation from program and
project files for fiscal years 1998 and 1999, the Office of Basic Energy
Sciences and the Office of Power Technologies are following the merit review
procedures they have established. Both offices are performing merit reviews
on projects or programs, are selecting reviewers with the requisite
knowledge of the research, are requiring those reviewers to apply
appropriate criteria in making their evaluations, and are using the merit
review evaluations in making award decisions. The two offices vary, however,
in the methods they employ to achieve these results.
Two recently issued internal studies--one by DOE's Inspector General and the
other by DOE's Laboratory Operations Board5--agree with our findings that
DOE is following its merit review procedures. The Board suggested ways for
DOE to standardize and strengthen its management of the review process, but
according to the Deputy Secretary of Energy, DOE has elected to maintain its
policy of having flexible procedures that can be adapted to the needs of the
particular offices.
In our analysis of documentation for 100 randomly selected projects funded
in fiscal year 1998 by the Office of Basic Energy Sciences, we found that 96
had been subjected to a merit review. The remaining four were not reviewed
because, according to office officials, the regulations would not have
subjected these projects to merit reviews at the time they were awarded.
Many of the projects we reviewed had been subjected to more than one merit
review because they were ongoing projects and subject to additional reviews
on a 3- to 4-year cycle. Overall, we identified 216 separate merit reviews
on the 96 projects.
The merit review files on the projects we selected did not include specific
information showing why the reviewers chosen were considered to have the
proper technical qualifications. However, they generally did show the
organizations with which the reviewers were affiliated, and these
organizations--such as domestic and foreign research universities, DOE
laboratories, other federal laboratories, and for-profit corporations--would
have had expertise in the broad areas of science involved in the research.
Overall, 48 of the 96 projects that were merit-reviewed included at least
one reviewer from a DOE laboratory, 12 included at least one reviewer from
another federal agency, and 91 included at least one reviewer from another
organization, such as a university. In most cases, the project files we
reviewed did not include conflict of interest statements from reviewers
because the Office of Basic Energy Sciences did not require such
documentation. However, the files did indicate that the reviewers were
external to the organizations submitting the proposals and thus appeared to
meet the Office of Basic Energy Sciences' requirements on conflict of
interest.
For each of the 96 projects in our sample for which the Office of Basic
Energy Sciences performed a merit review, the project file included
documentation indicating that at least three reviewers were involved in the
evaluation. The files also generally included the reviewers' written
evaluations of the proposals. In cases in which the individual written
evaluations were not in the file, there was other documentation--such as the
program manager's summary--indicating that individual written evaluations
had been submitted. Even though the evaluations varied in form and content,
the reviewers generally addressed the specific criteria established by DOE's
regulations and procedures.
Like the Office of Basic Energy Sciences, the Office of Power Technologies
was following its merit review procedures, based on our review of program
and project files for projects funded in fiscal years 1998 and 1999. The
primary mechanism was the program review. In fiscal year 1999, for example,
the Office of Power Technologies performed program reviews on 10 of its 12
programs. The remaining two programs were not subjected to merit reviews
because they were in the early stages of development. Each of the 10 program
reviews used panels of experts that reviewed all of the projects in the
particular program. Six of the panels assigned numerical scores.
The Office of Power Technologies also was performing separate, preaward
merit reviews on all financial assistance projects we reviewed that were
funded since May 1998. This was consistent with the Office of Energy
Efficiency and Renewable Energy's publication of a notice in the Federal
Register, as discussed earlier, requiring that all future awards for
financial assistance be merit-reviewed, regardless of whether they were
competitively bid.
At the two DOE laboratories we visited, the laboratory contractors were
conducting merit reviews of subcontracts only if they were competitively
bid. Any reviews of noncompetitive subcontracts were carried out by
laboratory management personnel in charge of the projects. The reasons given
by officials from the Office of Power Technologies and the laboratory
contractor for not having outside panels review noncompetitive subcontracts
were that (1) the awards were normally small, (2) the projects had already
been merit-reviewed in the program review of which the subcontract was a
part, (3) the laboratory contract itself was competitively bid and subject
to merit review, and (4) the laboratory contractor was charged with
following "best practices" in making the award and was in the best position
to determine the merits of the project and the qualifications of the
subcontractor.
The Office of Power Technologies is not required to include specific
information in its files showing why particular reviewers were selected or
why they were considered to have the requisite expertise. However, the files
for the 52 projects we reviewed did show the organizations and DOE units
from which the reviewers came, and we found that the panels were made up of
individuals who had an association with, but not a direct involvement in,
the research or organizations that were the subject of the reviews. We also
found that the reviewers were required to sign statements showing they had
no conflicts of interest and that these statements were included in the
project files. The Office of Power Technologies used merit reviewers from
diverse backgrounds. As would be expected for applied research, a larger
proportion of the reviewers on the Office of Power Technologies projects we
reviewed came from the DOE laboratories involved in the specific or related
research.
The criteria used by the Office of Power Technologies' reviewers in
evaluating proposals were more extensive than those used by the Office of
Basic Energy Sciences, required reviewers to consider a range of ranking
factors, and focused on anticipated results and management capabilities. For
example, in one financial assistance solicitation for a photovoltaics
research project in fiscal year 1998, the Office of Power Technologies set
out 21 separate categories in which proposals were to be ranked by the merit
reviewers. These were grouped into broader areas such as identification and
description of the proposed project; statement of work for the proposed
product and applications development; applicant and participant roles,
capabilities, and organization; market potential; and commercialization.
The project files we analyzed included individual reviewers' evaluations as
well as summaries of the panels' comments. An individual reviewer's
evaluation typically showed the score the reviewer assigned to each
dimension of the criteria as well as any narrative comments the reviewer
believed were warranted. Similarly, the summary showed a consensus score for
each proposal as well as a narrative showing the panel's overall assessment.
Typically, the summary included a ranking of the projects, showing the
recommended order of funding.
Two internal DOE studies produced findings that are consistent with our own
on the agency's use of merit reviews. Both the Inspector General and the
Laboratory Operations Board have issued reports concluding that DOE has
established merit review procedures and applied them consistently.
In April 1998, DOE's Inspector General issued a report on merit review
programs at three DOE laboratories--the National Renewable Energy
Laboratory, the Pacific Northwest National Laboratory, and the Los Alamos
National Laboratory. The Inspector General concluded that DOE had
established and was managing a peer review process for scientific and
technical projects at the three laboratories.
In March 1999, DOE's Laboratory Operations Board issued a report on DOE's
overall use of merit reviews. The report noted that DOE was making broad use
of merit reviews in all areas of research and that these reviews
appropriately use review mechanisms that match the specific objectives of
individual programs and projects. The report supported DOE's practice of
having different merit review procedures for individual offices and
programs, stating that a "one-size-fits-all approach would undermine the
legitimacy of the evaluation."
The report concluded that DOE should follow through on earlier commitments
it made to strengthen its management of the review process. Some of these
commitments were as follows:
- establishment of guidelines for conducting reviews at various levels of
management,
- periodic and random sampling of the use and effectiveness of the reviews,
and
- development of a process for linking review principles and methods to
other evaluation activities.
The report also noted that the reestablishment of the Office of Program
Analysis within the Undersecretary's Office would help institutionalize
these commitments and serve as a resource for program offices and
laboratories. The report said general agreement should be reached on how to
characterize the different types of merit review, noting that having a
common lexicon would help DOE better explain its extensive use of reviews.
In a February 18, 2000, letter responding to our request for information on
DOE's response to the Laboratory Operations Board's report, the Deputy
Secretary of Energy stated that DOE was generally supportive of the report's
findings that DOE was using merit reviews throughout the agency. He also
said that the report underscores DOE's position that the application of
merit reviews must be flexible and tailored to the nature of the individual
research and development programs, performers, missions, and objectives.
However, he did not see the need for additional guidance for different
levels of management or a centralized authority or office directing the
merit review process, as suggested by the Board.
The Deputy Secretary said that, in accordance with these views, no
additional periodic or random sampling of the use and effectiveness of merit
review has been initiated since the Board issued its report. He would not
rule out the possibility of such activities in the future. He said that the
other proposals made by the Board were best considered and implemented at
the program level, where differences in the nature of research and mission
objectives best determine the specifics of merit review procedures.
We provided a draft of this report to the Department of Energy for its
review and comment. The Department concurred with our report, stating that
it accurately describes the various types of peer reviews that the
Department uses to manage its programs and provides a good description of
the differences in peer review and merit review strategies that are used
between the basic science and applied science programs. The full text of the
Department's comments is in appendix II.
Our work focused on the Office of Basic Energy Sciences within the Office of
Science and the Office of Power Technologies within the Office of Energy
Efficiency and Renewable Energy. We reviewed policies and files at the two
offices' headquarters in Germantown, Maryland, and Washington, D.C.; the
Golden Field Office in Golden, Colorado; the Oak Ridge Operations Office in
Oak Ridge, Tennessee; the National Renewable Energy Laboratory in Golden;
and the Oak Ridge National Laboratory in Oak Ridge. Our review focused on
nondefense projects.
To determine what procedures DOE has established for performing merit
reviews, we obtained information describing these procedures for the
selected program offices, operations and field offices, and laboratories. We
also interviewed DOE and laboratory officials, analyzed formal and informal
policies and procedures, and reviewed merit review documentation in program
and project files.
To determine whether DOE has followed the merit review procedures it has
established, we selected Office of Basic Energy Sciences and Office of Power
Technologies program and project files at DOE headquarters, operations and
field offices, and laboratories for detailed examination. Our examination
efforts focused on those projects that had been funded in fiscal years 1998
and 1999. Because Office of Basic Energy Sciences files are maintained in
Germantown, Maryland, we were able to randomly select 100 files for review
from the 1,289 projects that were funded in fiscal year 1998. Our review of
the Office of Basic Energy Sciences files consisted of examining them for
documentation in accordance with established merit review criteria in the
regulations and DOE procedures. The projects in our sample of 100 had the
following characteristics:
- Seventy-five projects were financial assistance awards funded through
grants, 24 were laboratory projects funded through field work proposals, and
1 was a laboratory project mandated by the Congress.
- Sixty-nine of the award recipients were institutions of higher education,
25 were DOE laboratories, 5 were other nonprofit organizations, and 1 was a
small business.
- Ninety-seven of the projects were ongoing projects, while three were being
funded for the first time.
Within the Office of Power Technologies, we could not make a random
selection of project files because the files were not centrally located.
However, during our visits to the one field office, one operations office,
and two laboratories, we judgmentally selected and reviewed 52 Office of
Power Technologies projects funded in fiscal year 1998. We also reviewed the
most recent program reviews through fiscal year 1999 for various Office of
Power Technologies programs. As with the Office of Basic Energy Sciences,
our review of the Office of Power Technologies files consisted of examining
them for documentation in accordance with established merit review criteria
in the regulations and DOE procedures.
Overall, we focused our review efforts on whether documentation existed to
demonstrate that DOE was following the merit review procedures it has
established. We did not assess the quality or use of the merit reviews
performed. Because our work was limited to project files from the Office of
Basic Energy Sciences and the Office of Power Technologies, the results
cannot be projected agencywide. The results from our review of 100 Office of
Basic Energy Sciences files can be generalized to all 1,289 of the office's
projects funded in fiscal year 1998, however, as the projects selected were
a random sample of the 1,289 projects in the universe. The maximum margin of
error for estimated proportions is plus or minus 10 percent at the
95-percent confidence level.
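The reported sampling error can be approximated with the standard formula
for a simple random sample, including a finite population correction for the
1,289-project universe. The short calculation below is an illustrative check
only, assuming the conservative (maximum-variance) proportion of 0.5; it is
not drawn from GAO's methodology documentation.

    # Illustrative margin-of-error check (assumes p = 0.5; not from the report).
    import math

    N = 1289   # Office of Basic Energy Sciences projects funded in FY 1998
    n = 100    # randomly selected project files reviewed
    z = 1.96   # z-value for a 95-percent confidence level
    p = 0.5    # conservative proportion giving the maximum margin of error

    fpc = math.sqrt((N - n) / (N - 1))        # finite population correction
    margin = z * math.sqrt(p * (1 - p) / n) * fpc
    print(f"margin of error: +/- {100 * margin:.1f} percentage points")
    # prints about 9.4, consistent with the reported maximum of plus or minus
    # 10 percent at the 95-percent confidence level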
We also obtained information on prior reviews of DOE's merit review process
by internal DOE organizations, including the Office of the Inspector
General, and the status of any recommendations made in such reviews.
We conducted our work from June 1999 through March 2000 in accordance with
generally accepted government auditing standards.
As arranged with your offices, unless you publicly announce its contents
earlier, we plan no further distribution of this report until 30 days after
the date of this letter. At that time, we will send copies of the report to
the appropriate House and Senate committees; interested Members of Congress;
the Honorable Bill Richardson, the Secretary of Energy; the Honorable Jacob
J. Lew, Director, Office of Management and Budget; and other interested
parties. We will also make copies available to others on request. If you or
your staff have any questions or need additional information, please call me
at (202) 512-3841.
(Ms.) Gary L. Jones
Associate Director, Energy,
Resources, and Science Issues
Overview of Merit Review Practices in the Department of Energy
The information in this appendix was included in our March 1999 report,
Federal Research: Peer Review Practices at Federal Science Agencies Vary
(GAO/RCED-99-99). The term "peer review" is used throughout--even though DOE
commonly uses the term "merit review" in referring to its own
procedures--because the report from which the appendix was extracted was
comparing peer review practices among 12 agencies.
The following presents a description of the U.S. Department of Energy's
(DOE) peer review and other quality assurance review practices.
Created in 1977, DOE's mission is to foster a secure and reliable energy
system that is environmentally and economically sustainable, to be a
responsible steward of the nation's nuclear weapons, to clean up its
facilities, and to support continued U.S. leadership in science and
technology. The agency conducts research and development on a variety of
topics, including fossil, fusion, and nuclear energy production; energy
conservation; renewable energy; biological and environmental research;
materials science; engineering and geoscience; advanced computing;
high-energy and nuclear physics; nuclear waste management; environmental
remediation; radiation; nuclear stockpile management; nuclear
nonproliferation; and the Human Genome Project.
DOE's research can affect a broad spectrum of federal policies and
regulations. For example, DOE generates federal energy-efficiency rules for
the manufacture, testing, and labeling of major home appliances and certain
commercial products. The Environmental Protection Agency's Office of
Radiation Protection and the Nuclear Regulatory Commission have used the
results of DOE's research as part of the background used to set radiation
standards. In addition, agency research was used to set standards for mobile
pollution sources and fuel regulations under the Motor Vehicle Information
and Cost Savings Act.
DOE's research and development budget for fiscal year 1999 is $7.8 billion.
Approximately 80 percent of the budget will support research, research
facilities, and related activities within the Department and its national
laboratory system. The remaining 20 percent will support external research
conducted by industry, universities, public and private research
institutions, not-for-profit organizations, and research and development
consortia through Department-awarded grants, cooperative agreements and
contracts, and laboratory-awarded research subcontracts.
Because of its diversity, DOE's peer review practices are guided by a
variety of laws and regulations. The Federal Acquisition Regulation, the DOE
Acquisition Regulation, and the Competition in Contracting Act guide the
agency's peer review practices for research and development contracts.
Research grants and cooperative agreements, which are awarded through a
merit-based selection process, follow the Department's Financial Assistance
Rules, as promulgated in the Code of Federal Regulations (10 C.F.R. Part
600).
DOE has no formal definition of peer review, but practices peer review as
merit review with peer evaluation--a formal, competent, and objective
evaluation process using specified criteria and the review and advice of
qualified peers. Peers must be technically competent in the scientific or
technical field under review and must be free from conflict of interest.
Peers may come from any source, including industry, academia, private and
nongovernmental institutions, government agencies, and their associated
laboratories.
DOE uses merit review with peer evaluation to guide research direction and
to assess research progress. External research is peer-reviewed in
conjunction with the preaward competitive selection process. This research
is also reviewed as part of the award renewal process. Reviews of laboratory
research occur at both the laboratory and departmental oversight levels. In
addition, laboratories, user facilities, and major research divisions have
committees of outside experts that provide periodic peer reviews of research
relevance and quality. Research results are also extensively published in
peer-reviewed journals. The methods for conducting reviews are tailored to
each situation. The following provides examples of the different peer review
practices among DOE's programs.
Reviews of Research Proposals
With few exceptions, merit review with peer evaluation guides DOE research,
including that by its research laboratories. For example, regulations
governing the Financial Assistance Program require peer review and
competitive selection. The regulations specify that each grant proposal
normally receives a minimum of three reviews by technically qualified
experts in the proposed field, followed by a peer review panel. Proposals
are peer-reviewed for scientific excellence. In the Office of Science and
Technology within the Environmental Management Program, project selection
reviews for new research and development activities combine the judgments
of technical peers and potential users of the results. In addition, research
subcontracted by DOE's national laboratories to outside researchers is
governed by contract provisions, unless otherwise justified through formal
documentation. These provisions require competitive selection processes,
including merit review with peer evaluation.
Peer review is applied to the selection and approval of most laboratory
field work proposals. Field work proposals are the means by which the
laboratories formally propose future work and seek authorization for
expending research and development funds. In the Office of Science, all
field work proposals are required to be peer-reviewed for quality by
external, independent experts. Each laboratory research program is reviewed
annually. For example, the Technology Development Program of the Office of
Environmental Management uses teams of subject matter specialists from
technical, regulatory, business, and stakeholder perspectives. In addition,
peer review is used to allocate available time and to select the experiments
conducted at specialized research facilities located at DOE's laboratories.
Such facilities include accelerators for the study of high-energy physics
and the world's most powerful computers and lasers.
At the laboratories, each director's discretionary research and development
program and the laboratory field work proposals are reviewed. The Laboratory
Directed Research and Development Program provides certain laboratory
directors discretionary funds (up to 6 percent of their laboratory's budget)
to develop new scientific ideas and opportunities and to initiate new
directions. The laboratories rely on individual scientific investigators and
the scientific leadership of the laboratory to identify opportunities that
will contribute to scientific and institutional goals.
Reviews of In-Progress Research
Peer review is also used in conjunction with the evaluation of ongoing
research. While the substance of the reviews is similar (each considers the quality and relevance of the research and the investigator's or research group's record of accomplishment), the nature of the reviews can differ. For
example, the Office of International Health Programs uses independent,
external review panels to conduct in-progress reviews. The Office of Science and Technology within the Environmental Management Program conducts technical reviews of continuing projects when they reach their third year of support, when they reach the engineering demonstration stage, or when they are considered a new start; these reviews follow a formal process managed externally by the American Society of Mechanical Engineers, whose selected reviewers assess technical excellence, relevance, progress, and productivity. In addition, for new environmental management technologies, midyear progress reviews are held annually for each program element, with potential users assessing applicability and performance requirements.
Reviews of Publication
Publication in open literature constitutes another form of peer review.
Publication of original work is considered essential at DOE, and the
scientists it supports (both external and internal) are continually
evaluated by the quality of their original research, as indicated, in part,
by publications in archival, peer-reviewed journals.
Other Peer Reviews
Scientists who are independent of the laboratory conduct retrospective reviews of laboratory research in conjunction with program reviews and
advisory committee oversight. These reviews provide advice on the quality,
relevance, and productivity of laboratory-conducted research. The following
are three examples of such reviews.
- The Office of Science regularly conducts retrospective peer reviews of
research and development programs throughout the Department, which include
an evaluation of a sampling of research projects. Individual programs also
conduct reviews.
- The Office of Defense Programs uses an Inertial Confinement Fusion
Advisory Committee, constituted under the Federal Advisory Committee Act,
which reports directly to the Assistant Secretary for Defense Programs, to
assess program results. For highly classified research, the Department
interacts with the Department of Defense for customer feedback on program
performance.
- The Office of Civilian Radioactive Waste Management uses peer review to
help assess the quality and validity of completed technical work and to
ensure the quality of data for use in adjudicatory hearings. Because of the
U.S. Nuclear Regulatory Commission's role under the Nuclear Waste Policy
Act, the Commission has provided guidance on the conduct of peer review. A
primary selection criterion for peer reviewers is independence. When there
is a potential or an apparent conflict of interest that may bring the
independence of a participant into question, a documented rationale is
included in the peer review report.
Many of DOE's energy technology development and related research and
development programs are deliberately designed to accommodate industrial
partners. In various ways, these industrial partners provide opportunities for external merit review by participating fully in planning, executing, and commercializing the research and development. Such reviews extend beyond the peer review procedures that characterize science programs. In addition, most major technology development programs are required to formulate and enforce a comprehensive Quality Assurance Program.
For the Energy Efficiency Program, quality control involves three stages:
peer review for basic research, merit review for applied research, and
market review for judging commercial application.
Under reforms begun in 1994, all of the Department's new contracts for the
management and operation of its national laboratories require regular,
performance-based merit reviews of the contractor's performance. Colleagues,
laboratory superiors, and administrators at DOE headquarters evaluate the
research and development projects. The nine multiprogram national
laboratories also have various industrial advisory panels to review
research. In addition, all research subcontracted by the laboratories to
outside researchers is governed by contract provisions that generally
require periodic evaluations of the subcontractor's performance.
Panels constituted under the Federal Advisory Committee Act frequently
advise DOE program administrators on program content, quality, future
directions, and priorities. For example, the Office of Science uses advisory
committees for recommendations on basic energy sciences,
biological and environmental research, high-energy physics, nuclear
sciences, and fusion energy. Similarly, the Office of Civilian Radioactive
Waste Management has standing advisory committees and just completed a
2-year participatory peer review.
For classified nuclear weapons design-related research, where no broad
industrial, university, or other independent source of expertise exists, a
process of merit review exists within DOE's Defense Programs laboratories.
For example, every 5 years, with annual updates, the three Defense Programs
laboratories review the nuclear weapons in the active stockpile through a
formal internal peer review known as the Weapons Appraisal Process. The University of
California, the contractor that operates the Lawrence Livermore and Los
Alamos laboratories, also uses a President's Council Panel on National
Security to assess the nuclear weapons program. Each of the laboratories'
directors also appoints review committees for each of the laboratories'
divisions, with members coming almost exclusively from industry and academia
but sometimes from DOE and its contractors. The committees report to the
laboratory directors with an assessment of the division's technical and
scientific quality. The directors, in turn, file a self-assessment with a
review council convened by the president of the University of California.
From this process, the president reports to DOE on the laboratories'
technical and scientific quality. Finally, additional reviewing bodies such
as JASON (a civilian science advisory group), the National Academy of
Sciences, the Nuclear Weapons Council, and other senior advisory groups
review DOE's Defense Programs' research and development program.
According to DOE officials, most congressional mandates and earmarks, which
designate projects and the institutions to conduct them, are not subject to
the peer review process in deference to the congressional directives.
However, once a grant is funded, it is likely to receive merit review before being competitively renewed, unless the review is waived through a written determination by the project administrator. When merit review is not conducted before an
award's renewal, the award must be considered to be noncompetitive and must
meet different selection requirements.6 Whenever the merit review system is
not used for applications and proposals, the Director of Grants and
Contracts must obtain written prior approval for a different review
procedure. Very rarely are contracts peer-reviewed when sole-source
selection is used, but the administrator making this decision must justify
this process. In addition, nonreviewed grants cannot be extended for more
than 6 years; periodic reviews of the research results are another check.
Comments From the Department of Energy
Key Contacts and Staff Acknowledgments
John P. Hunt, Jr., (404) 679-1822
Frankie Fulton, (404) 679-1805
In addition to those named above, Lynn Musser, Deborah Ortega, Paul Rhodes,
and Mindi Weisenbloom made key contributions to this report.
(141329)
1. Federal Research: Peer Review Practices at Federal Science Agencies Vary
(GAO/RCED-99-99, Mar. 17, 1999).
2. In 1976, the Congress established the Office of Science and Technology
Policy to serve as a source of scientific, engineering, and technological
analysis and judgment for the President and to assist him in providing
leadership and coordination for federal research and development programs.
3. A number of other offices also fund research. Two examples are the Office
of Fusion Energy Sciences within the Office of Science and the Office of
Industrial Technologies within the Office of Energy Efficiency and Renewable
Energy.
4. According to 10 C.F.R. 605.10, reviewers are to be selected "on the basis
of their professional qualifications and expertise" and are "to comply with
all applicable DOE rules or directives concerning the use of outside
evaluators."
5. In April 1995, the Secretary of Energy established the Laboratory
Operations Board to provide focused, regular attention to issues facing
DOE's laboratory complex.
6. 10 C.F.R. 600.6(c).
*** End of document. ***