Human Capital: Implementing Pay for Performance at Selected	 
Personnel Demonstration Projects (23-JAN-04, GAO-04-83).	 
                                                                 
There is a growing understanding that the federal government	 
needs to fundamentally rethink its current approach to pay and to
better link pay to individual and organizational performance.	 
Federal agencies have been experimenting with pay for performance
through the Office of Personnel Management's (OPM) personnel	 
demonstration projects. GAO identified the approaches that	 
selected personnel demonstration projects have taken to implement
their pay for performance systems. These projects include: the	 
Navy Demonstration Project at China Lake (China Lake), the	 
National Institute of Standards and Technology (NIST), the	 
Department of Commerce (DOC), the Naval Research Laboratory	 
(NRL), the Naval Sea Systems Command Warfare Centers (NAVSEA) at 
Dahlgren and Newport, and the Civilian Acquisition Workforce	 
Personnel Demonstration Project (AcqDemo). We selected these	 
demonstration projects based on factors such as status of the	 
project and makeup of employee groups covered. We provided drafts
of this report to officials in the Department of Defense (DOD)	 
and DOC for their review and comment. DOD provided written	 
comments concurring with our report. DOC provided minor technical
clarifications and updated information. We provided a draft of	 
the report to the Director of OPM for her information.		 
-------------------------Indexing Terms------------------------- 
REPORTNUM:   GAO-04-83						        
    ACCNO:   A09142						        
  TITLE:     Human Capital: Implementing Pay for Performance at       
Selected Personnel Demonstration Projects			 
     DATE:   01/23/2004 
  SUBJECT:   Compensation					 
	     Evaluation criteria				 
	     Performance measures				 
	     Personnel evaluation				 
	     Personnel management				 
	     Productivity in government 			 
	     Program evaluation 				 

******************************************************************
** This file contains an ASCII representation of the text of a  **
** GAO Product.                                                 **
**                                                              **
** No attempt has been made to display graphic images, although **
** figure captions are reproduced.  Tables are included, but    **
** may not resemble those in the printed version.               **
**                                                              **
** Please see the PDF (Portable Document Format) file, when     **
** available, for a complete electronic file of the printed     **
** document's contents.                                         **
**                                                              **
******************************************************************
GAO-04-83

United States General Accounting Office

                     GAO Report to Congressional Requesters

January 2004

HUMAN CAPITAL

 Implementing Pay for Performance at Selected Personnel Demonstration Projects


GAO-04-83

Highlights of GAO-04-83, a report to congressional requesters

There is a growing understanding that the federal government needs to
fundamentally rethink its current approach to pay and to better link pay
to individual and organizational performance. Federal agencies have been
experimenting with pay for performance through the Office of Personnel
Management's (OPM) personnel demonstration projects.


The demonstration projects took a variety of approaches to designing and
implementing their pay for performance systems to meet the unique needs of
their cultures and organizational structures, as shown in the table below.

Demonstration Project Approaches to Implementing Pay for Performance

Using competencies to evaluate employee performance.

High-performing organizations use validated core competencies as a key
part of evaluating individual contributions to organizational results. To
this end, AcqDemo and NRL use core competencies for all positions. Other
demonstration projects, such as NIST, DOC, and China Lake, use
competencies based on the individual employee's position.

Translating employee performance ratings into pay increases and awards.

Some projects, such as China Lake and NAVSEA's Newport division, established
predetermined pay increases, awards, or both depending on a given performance
rating, while others, such as DOC and NIST, delegated the flexibility to
individual pay pools to determine how ratings would translate into performance
pay increases, awards, or both. The demonstration projects made some
distinctions among employees' performance.

Considering current salary in making performance-based pay decisions.

Several of the demonstration projects, such as AcqDemo and NRL, consider an
employee's current salary when making performance pay increases and award
decisions to make a better match between an employee's compensation and
contribution to the organization.

Managing costs of the pay for performance system.

According to officials, salaries, training, and automation and data systems
were the major cost drivers of implementing their pay for performance systems.
The demonstration projects used a number of approaches to manage the costs.

GAO identified the approaches that selected personnel demonstration projects
have taken to implement their pay for performance systems. These projects
include: the Navy Demonstration Project at China Lake (China Lake), the
National Institute of Standards and Technology (NIST), the Department of
Commerce (DOC), the Naval Research Laboratory (NRL), the Naval Sea Systems
Command Warfare Centers (NAVSEA) at Dahlgren and Newport, and the Civilian
Acquisition Workforce Personnel Demonstration Project (AcqDemo). We selected
these demonstration projects based on factors such as status of the project
and makeup of employee groups covered.

We provided drafts of this report to officials in the Department of
Defense (DOD) and DOC for their review and comment. DOD provided written
comments concurring with our report. DOC provided minor technical
clarifications and updated information. We provided a draft of the report
to the Director of OPM for her information.

www.gao.gov/cgi-bin/getrpt?GAO-04-83.

To view the full product, including the scope and methodology, click on
the link above. For more information, contact J. Christopher Mihm at (202)
512-6806 or [email protected].

Providing information to employees about the results of performance appraisal
and pay decisions.

To ensure fairness and safeguard against abuse, performance-based pay
programs should have adequate safeguards, including reasonable transparency
in connection with the results of the performance management process. To this
end, several of the demonstration projects publish information, such as the
average performance rating, performance pay increase, and award.

Source: GAO.

GAO strongly supports the need to expand pay for performance in the
federal government. How it is done, when it is done, and the basis on
which it is done can make all the difference in whether such efforts are
successful. High-performing organizations continuously review and revise
their performance management systems. These demonstration projects show an
understanding that how to better link pay to performance is very much a
work in progress at the federal level. Additional work is needed to
strengthen efforts to ensure that performance management systems are tools
that help agencies manage on a day-to-day basis. In particular, there are
opportunities to (1) use organizationwide competencies to evaluate employee
performance in ways that reinforce the behaviors and actions that support the
organization's mission, (2) translate employee performance ratings so that
managers make meaningful distinctions between top and poor performers using
objective and fact-based information, and (3) provide information to
employees about the results of performance appraisals and pay decisions to
ensure that reasonable transparency and appropriate accountability mechanisms
are in place.

Contents

    Letter                                                                  1 
                                  Results in Brief                          4 
                                     Background                             6 
              Selected Demonstration Projects Took Various Approaches to 
                    Implement Their Pay for Performance Systems             9 
                              Concluding Observations                      39 
                                  Agency Comments                          40 

Appendixes

     Appendix I:    Objective, Scope, and Methodology                       41
     Appendix II:   Demonstration Project Profiles                          43
                      Navy Demonstration Project at China Lake
                        (China Lake)                                        43
                      National Institute of Standards and Technology
                        (NIST)                                              47
                      Department of Commerce (DOC)                          51
                      Naval Research Laboratory (NRL)                       55
                      Naval Sea Systems Command Warfare Centers
                        (NAVSEA)                                            59
                      Civilian Acquisition Workforce Personnel
                        Demonstration Project (AcqDemo)                     63
     Appendix III:  Comments from the Department of Defense                 67
     Appendix IV:   GAO Contacts and Staff Acknowledgments                  68
                      GAO Contacts                                          68
                      Acknowledgments                                       68

Tables

     Table 1:  Selected GS Funding Sources Available for Employee
               Salary Increases                                              8
     Table 2:  China Lake's Pay Increase Distribution (2002)                16
     Table 3:  NAVSEA Newport Division's Pay Increase and Award
               Distribution (2002)                                          19
     Table 4:  DOC's Pay Increase and Award Distribution (2002)             22
     Table 5:  Cumulative Percentage Increase in Average Salaries for
               Demonstration Project and Comparison Group Employees by
               Year of the Project, as Reported by the Demonstration
               Projects                                                     27
     Table 6:  Direct Inflation-Adjusted Cost of Training in the First
               5 Years of the Demonstration Projects (in 2002 Dollars),
               as Reported by the Demonstration Projects                    33
     Table 7:  Inflation-Adjusted Cost of Automation and Data Systems
               for Selected Demonstration Projects (in 2002 Dollars),
               as Reported by the Demonstration Projects                    35

Figures

     Figure 1:  China Lake's Rating and Pay Distribution Structure          14
     Figure 2:  China Lake's Rating Distribution by Numerical Rating
                (2002)                                                      15
     Figure 3:  NAVSEA Newport Division's Rating and Performance Pay
                Distribution Structure                                      17
     Figure 4:  NAVSEA Newport Division's Rating Distribution (2002)        18
     Figure 5:  DOC's Rating Distribution (2002)                            21
     Figure 6:  AcqDemo's Consideration of Current Salary in Making
                Performance Pay Decisions                                   24
     Figure 7:  Funding Sources Linked to Pay Decisions in Selected
                Personnel Demonstration Projects as of Fiscal Year 2003     26
     Figure 8:  Pay Bands, Intervals, and Corresponding Permanent Pay
                Increases for NIST's Scientific and Engineering Career
                Path                                                        30
     Figure 9:  Sample of NAVSEA Newport Division's Rating Category
                Distribution Data Provided to Employees                     37
     Figure 10: Sample of NIST's Distribution of Average Performance
                Rating Scores Provided to Employees                         38
     Figure 11: Selected Employee Attitude Data for China Lake              45
     Figure 12: Selected Employee Attitude Data for NIST                    49
     Figure 13: Selected Employee Attitude Data for DOC                     53
     Figure 14: Selected Employee Attitude Data for NRL                     57
     Figure 15: Selected Employee Attitude Data for NAVSEA                  61
     Figure 16: Selected Employee Attitude Data for AcqDemo                 65

Abbreviations

AcqDemo  Civilian Acquisition Workforce Personnel Demonstration Project
CPDF     Central Personnel Data File
DOC      Department of Commerce
DOD      Department of Defense
FEPCA    Federal Employees Pay Comparability Act of 1990
GPI      general pay increase
GS       General Schedule
NAVSEA   Naval Sea Systems Command Warfare Centers
NIST     National Institute of Standards and Technology
NRL      Naval Research Laboratory
OPM      Office of Personnel Management
QSI      quality step increase
WGI      within-grade increase

This is a work of the U.S. government and is not subject to copyright
protection in the United States. It may be reproduced and distributed in
its entirety without further permission from GAO. However, because this
work may contain copyrighted images or other material, permission from the
copyright holder may be necessary if you wish to reproduce this material
separately.


United States General Accounting Office Washington, D.C. 20548

January 23, 2004

The Honorable George V. Voinovich
Chairman
Subcommittee on Oversight of Government Management, the Federal

Workforce, and the District of Columbia
Committee on Governmental Affairs
United States Senate

The Honorable Jo Ann Davis
Chairwoman
Subcommittee on Civil Service and Agency Organization
Committee on Government Reform
House of Representatives

To successfully transform themselves, high-performing organizations have
found that they must fundamentally change their cultures so that they are
more results-oriented, customer-focused, and collaborative in nature, and
have recognized that an effective performance management system can
help them drive internal change and achieve desired results. Our prior
work, done at your request, has identified nine key practices for
effective
performance management based on experiences in public sector
organizations both in the United States and abroad.1 The key practices are
as follows:

1. Align individual performance expectations with organizational goals.

2. Connect performance expectations to crosscutting goals.

3. Provide and routinely use performance information to make program
improvements.

4. Require follow-up actions to address organizational priorities.

5. Use competencies to provide a fuller assessment of performance.

6. Link pay to individual and organizational performance.

1U.S. General Accounting Office, Results-Oriented Cultures: Creating a
Clear Linkage between Individual Performance and Organizational Success,
GAO-03-488 (Washington, D.C.: Mar. 14, 2003).

7. Make meaningful distinctions in performance.

8.	Involve employees and stakeholders to gain ownership of performance
management systems.

9. Maintain continuity during transitions.

Among these practices, there is a growing understanding that the federal
government needs to fundamentally rethink its current approach to pay and
better link pay to individual and organizational performance. To this end,
Congress has taken important steps to implement results-oriented pay
reform and modern performance management systems across government. Most
recently, Congress provided the Department of Defense (DOD) flexibility to
revise its performance management system to better link pay to performance
and required DOD to incorporate employee involvement, provide ongoing
performance feedback, and include effective safeguards to ensure fairness
and equity, among other things, in DOD's revised system.

Congress also established a Human Capital Performance Fund to reward
agencies' highest performing and most valuable employees. To be eligible,
agencies are to submit plans for approval by the Office of Personnel
Management (OPM) that incorporate a link between pay for performance and
the agency's strategic plan, employee involvement, ongoing performance
feedback, and effective safeguards to ensure fair management of the
system, among other things. In the first year of implementation, up to 10
percent of the amount appropriated is to be available to train those
involved on making meaningful distinctions in performance. In addition,
Congress created a wider, more open pay range for senior executive
compensation, thus allowing for pay to be more directly tied to individual
performance, contribution to the agency's performance, or both, as
determined under a rigorous performance management system that, as designed
and applied, makes meaningful distinctions based on relative performance.

Further, in November 2002, Congress established the Department of Homeland
Security and provided it human capital flexibilities to design a
performance management system and specifically to consider different
approaches to pay. We reported that the department's effort to design its
system could be particularly instructive in light of future requests for

human capital flexibilities.2 Legislation is currently pending, which you
sponsored and introduced, that would provide GAO additional authority to
more fully link employees' annual salary increases to performance.

Federal agencies have been experimenting with pay for performance through
OPM's personnel demonstration projects. Over the past 25 years, OPM has
approved 17 projects, 12 of which have implemented pay for performance
systems. At your request, this report identifies the approaches that 6 of
these personnel demonstration projects have taken to implement their pay
for performance systems. These projects are

o  the Navy Demonstration Project at China Lake (China Lake),

o  the National Institute of Standards and Technology (NIST),

o  the Department of Commerce (DOC),

o  the Naval Research Laboratory (NRL),

o 	the Naval Sea Systems Command Warfare Centers (NAVSEA) at Dahlgren and
Newport, and

o 	the Civilian Acquisition Workforce Personnel Demonstration Project
(AcqDemo).

To address the objective of this report, we focused on OPM's personnel
demonstration projects because they are required to prepare designs,
conduct employee feedback, and complete evaluations of their results,
among other things. We selected these demonstration projects based on
factors such as status of the project and makeup of employee groups
covered. We analyzed Federal Register notices outlining the major features
of each demonstration project, operating manuals, annual and summative
evaluations, employee attitude survey results, project briefings, training
materials, rating and payout data, and cost data as reported by the
agencies without verification by GAO, as well as other relevant
documentation. We also interviewed cognizant officials from OPM;
demonstration project managers, human resource officials, and

2U.S. General Accounting Office, Human Capital: DHS Personnel System
Design Effort Provides for Collaboration and Employee Participation,
GAO-03-1099 (Washington, D.C.: Sept. 30, 2003).

participating supervisors and employees; and union and other employee
representatives. We did not independently evaluate the effectiveness of
the demonstration projects. We assessed the reliability of cost, salary,
rating, and performance pay distribution data provided by the
demonstration projects and determined that the data were sufficiently
reliable for the purposes of this report, with the exception of the DOC
salary data, which we do not present.

We performed our work in the Washington, D.C., metropolitan area from
December 2002 through August 2003 in accordance with generally accepted
government auditing standards. Appendix I provides additional information
on our objective, scope, and methodology. Appendix II presents profiles of
the demonstration projects, including selected elements of their
performance management systems, employee attitude data, and reported
effects.

Results in Brief

We found that the demonstration projects took a variety
of approaches to designing and implementing their pay for performance
systems to meet the unique needs of their cultures and organizational
structures. Specifically, the demonstration projects took different
approaches to

o  using competencies to evaluate employee performance,

o 	translating employee performance ratings into pay increases and awards,

o  considering current salary in making performance-based pay decisions,

o  managing costs of the pay for performance system, and

o 	providing information to employees about the results of performance
appraisal and pay decisions.

Using competencies to evaluate employee performance. High-performing
organizations use validated core competencies as a key part of evaluating
individual contributions to organizational results. Core competencies
applied organizationwide can help reinforce employee behaviors and actions
that support the organization's mission, goals, and values and can provide
a consistent message to employees about how they are expected to achieve
results. AcqDemo and NRL use core competencies for all positions across
the organization to evaluate performance. Other

demonstration projects, such as NIST, DOC, and China Lake, use
competencies based primarily on the individual position. (See p. 9.)

Translating employee performance ratings into pay increases and awards.
High-performing organizations seek to create pay, incentive, and reward
systems that clearly link employee knowledge, skills, and contributions to
organizational results. These organizations make meaningful distinctions
between acceptable and outstanding performance of individuals and
appropriately reward those who perform at the highest level. To this end,
the demonstration projects took different approaches in translating
individual employee performance ratings into permanent pay increases,
one-time awards, or both in their pay for performance systems. Some
projects, such as China Lake and NAVSEA's Newport division, established
predetermined pay increases, awards, or both depending on a given
performance rating, while others, such as DOC and NIST, delegated the
flexibility to individual pay pools to determine how ratings would
translate into pay increases, awards, or both. While the demonstration
projects made some distinctions among employees' performance, the data and
experience show that making such meaningful distinctions remains a work in
progress. (See p. 12.)

Considering current salary in making performance-based pay decisions.
Several of the demonstration projects consider an employee's current
salary when making pay increase and award decisions. By considering salary
in such decisions, the projects intend to make a better match between an
employee's compensation and his or her contribution to the organization.
Thus, two employees with comparable contributions could receive different
performance pay increases and awards depending on their current salaries.
For example, AcqDemo determines if employees are "appropriately
compensated," "under-compensated," or "over-compensated" when it compares
employee contribution scores to salary. (See p. 23.)

Managing costs of the pay for performance system. According to OPM, the
increased costs of implementing alternative personnel systems should be
acknowledged and budgeted for up front. Based on data the demonstration
projects provided, direct costs associated with salaries, training, and
automation and data systems were the major cost drivers of implementing
their pay for performance systems. The demonstration projects used a
number of approaches to manage the direct costs of implementing and
maintaining pay for performance systems. In making their pay decisions,
some of the demonstration projects use funding

sources such as the annual general pay increase and locality pay
adjustment. Several demonstration projects managed salary costs by
considering fiscal conditions and the labor market when determining how
much to budget for pay increases, managing movement through the pay band,
and providing a mix of one-time awards and permanent pay increases. (See
p. 25.)

Providing information to employees about the results of performance
appraisal and pay decisions. We have observed that a more
performance-based pay system should have adequate safeguards to ensure
fairness and guard against abuse. One such safeguard is to ensure
reasonable transparency and appropriate accountability mechanisms in
connection with the results of the performance management process. To this
end, several of the demonstration projects publish information for
employees on internal Web sites about the results of performance appraisal
and pay decisions, such as the average performance rating, the average pay
increase, and the average award for the organization and for each
individual department, while other demonstration projects publish no
information on the results of the performance cycle. (See p. 36.)

We provided drafts of this report to the Secretaries of Defense and
Commerce for their review and comment. DOD's Principal Deputy, Under
Secretary of Defense for Personnel and Readiness, provided written
comments, which are presented in appendix III. DOD concurred with our
report and stated that it is a useful summary of the various approaches
that the demonstration projects undertook to implement their pay for
performance systems and that their experiences provide valuable insight
into federal pay for performance models. DOD also noted that the NAVSEA
demonstration project training and automation cost data are estimated
rather than actual costs. We made the appropriate notation. While DOC did
not submit written comments, DOC's Classification, Pay, and HR
Demonstration Program Manager provided minor technical clarifications and
updated information. We made those changes where appropriate. We provided
a draft of the report to the Director of OPM for her information.

Background

Congress granted OPM the authority to conduct personnel
demonstration projects under the Civil Service Reform Act of 1978 to test
new personnel

and pay systems.3 A federal agency is to obtain the authority from OPM to
waive existing laws and regulations in Title 5 to propose, develop, test,
and evaluate alternative approaches to managing its human capital. Under
the demonstration project authority, no waivers of law are to be permitted
in areas of employee leave, employee benefits, equal employment
opportunity, political activity, merit system principles, or prohibited
personnel practices. The law also contains certain limitations and
requirements, including

o  5-year time limit for duration of projects,

o  5,000 employee cap on participation,

o  restriction to 10 concurrent demonstration projects governmentwide,

o  union and employee consultation,

o  published formal project plan in the Federal Register,

o 	notification of Congress and employees of the demonstration project,
and

o  project evaluations.

OPM guidance requires that agencies conduct at least three evaluations (after
implementation, after at least 3 and a half years, and after the original
scheduled end of the project) that are to address the following questions:

o 	Did the project accomplish the intended purpose and goals? If not, why
not?

o 	Was the project implemented and operated appropriately and accurately?

o  What were the costs, relative to the benefits of the project?

3Two governmentwide initiatives were intended to implement pay for
performance systems for supervisors and managers. The Merit Pay System was
established under the Civil Service Reform Act of 1978 and ended in 1984.
Its successor, the Performance Management and Recognition System, ended in
1993.

o 	What was the impact on veterans and other equal employment opportunity
groups?

o 	Were merit systems principles adhered to and prohibited personnel
practices avoided?

o 	Can the project or portions thereof be generalized to other agencies or
governmentwide?

The demonstration projects can link some or all of the funding sources for
pay increases available under the current federal compensation system, the
General Schedule (GS), to an employee's level of performance.4 Table 1
defines selected funding sources.

Table 1: Selected GS Funding Sources Available for Employee Salary Increases

Funding source            Description

General pay increase      Established under the Federal Employees Pay
(GPI)                     Comparability Act of 1990 (FEPCA), the GPI is to be
                          determined annually and delivered automatically and
                          uniformly to GS employees. The GPI is to be based on
                          the Employment Cost Index, which is a statistical
                          measure maintained by the Bureau of Labor Statistics
                          that considers changes in private sector labor costs.

Locality pay adjustment   Established under FEPCA, locality pay is to address
                          any gap between federal and nonfederal salaries and
                          is to be determined annually and delivered
                          automatically and uniformly to most GS employees
                          within a given locality. Locality pay is to
                          supplement the rate of basic pay in the 48 contiguous
                          states where nonfederal pay exceeds federal pay by
                          more than 5 percent. The President's Pay Agent,
                          comprised of the Secretary of Labor and the Directors
                          of the Office of Management and Budget and OPM, is to
                          recommend and the President is to approve what, if
                          any, the percentage of increase should be.

Within-grade increase     The WGI, also known as a "step increase," is a
(WGI)                     periodic increase in a GS employee's rate of basic
                          pay to the next higher pay level or "step" of that
                          grade. To receive a WGI, an employee must wait a
                          prescribed amount of time and be performing at an
                          acceptable level of competence. OPM reports that the
                          WGI is designed to reward experience and loyalty and
                          is based on a judgment that the employee's work is of
                          an "acceptable level of competence" but does not
                          distinguish between very good and moderately good
                          performance.a

Quality step increase     A QSI is to recognize high-quality performance.
(QSI)                     Similar to a WGI, a QSI advances the employee to the
                          next higher step but ahead of the required waiting
                          period. To receive a QSI, an employee must
                          demonstrate sustained high-quality performance.

Career ladder promotion   Federal employees may be appointed to positions with
                          "career ladders," a series of developmental positions
                          of increasing difficulty, through which an employee
                          may be promoted to higher grade levels without
                          competition.

Source: OPM.

aU.S. Office of Personnel Management, A Fresh Start for Federal Pay: The
Case for Modernization (Washington, D.C.: April 2002).

4The GS is the federal government's main pay system for "white-collar"
positions. The GS is composed of 15 grade levels. Each grade is divided into
10 specific pay levels called "steps."

Selected Demonstration Projects Took Various Approaches to Implement Their
Pay for Performance Systems

High-performing organizations seek to create pay, incentive, and reward
systems based on valid, reliable, and transparent performance management
systems with adequate safeguards and link employee knowledge, skills, and
contributions to organizational results. To that end, we found that the
demonstration projects took a variety of approaches to designing and
implementing their pay for performance systems to meet the unique needs of
their cultures and organizational structures. Specifically, the
demonstration projects took different approaches to

o  using competencies to evaluate employee performance,

o 	translating employee performance ratings into pay increases and awards,

o  considering current salary in making performance-based pay decisions,

o  managing costs of the pay for performance system, and

o 	providing information to employees about the results of performance
appraisal and pay decisions.

Using Competencies to Evaluate Employee Performance

High-performing organizations use validated core competencies as a key
part of evaluating individual contributions to organizational results.
Competencies define the skills and supporting behaviors that individuals
are expected to demonstrate and can provide a fuller picture of an
individual's performance. To this end, we found that the demonstration
projects took different approaches to evaluating employee performance.
AcqDemo and NRL use core competencies for all positions across the
organization. Other demonstration projects, such as NIST, DOC, and China
Lake, use competencies based primarily on the individual employee's
position.

Applying competencies organizationwide. Core competencies applied
organizationwide can help reinforce employee behaviors and actions that
support the organization's mission, goals, and values and can provide a
consistent message to employees about how they are expected to achieve
results. AcqDemo evaluates employee performance against one set of
"factors," which are applied to all employees. "Discriminators" and
"descriptors" further define the factors by career path and pay band.
According to AcqDemo, taken together, the factors, discriminators, and
descriptors are relevant to the success of a DOD acquisition
organization.5

AcqDemo's six factors are (1) problem solving, (2) teamwork/cooperation,
(3) customer relations, (4) leadership/supervision, (5) communication, and
(6) resource management. Discriminators further define each factor. For
example, discriminators for problem solving include scope of
responsibility, creativity, complexity, and independence. Descriptors
identify contributions by pay band. For example, a descriptor for problem
solving at one pay band level is "resolves routine problems within
established guidelines," and at a higher level, a descriptor is
"anticipates problems, develops sound solutions and action plans to ensure
program/mission accomplishment."

All factors must be used and cannot be supplemented. While the pay pool
manager may weight the factors, according to an official, no organization
within AcqDemo has weighted the factors to date. Managers are authorized
to use weights sparingly because contributions in all six factors are
important to ensuring AcqDemo's overall success as well as to developing
the skills of the acquisition workforce. If weights are used, they are to
be applied uniformly across all positions within the pay pool. The six

5See U.S. General Accounting Office, An Evaluation Framework for Improving
the Procurement Function (Exposure Draft) (Washington, D.C.: October
2003), for more information on a framework to enable a high-level,
qualitative assessment of the strengths and weaknesses of agencies'
procurement functions.

factors are initially weighted equally and no factor can be weighted less
than one-half of its initial weight. Employees are to be advised of the
weights at the beginning of the rating period.
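
To make these constraints concrete, the following Python sketch checks a
proposed factor weighting against the rules described above. It is an
illustration only, not AcqDemo's own tooling; the function name is ours, and
it assumes that any reweighting preserves the total weight across the six
factors.

    # Illustrative sketch of AcqDemo's stated weighting constraints.
    FACTORS = [
        "problem solving", "teamwork/cooperation", "customer relations",
        "leadership/supervision", "communication", "resource management",
    ]

    def weights_are_allowed(weights: dict) -> bool:
        """Return True if a proposed factor weighting honors the stated rules."""
        if set(weights) != set(FACTORS):
            return False  # all six factors must be used and none may be added
        initial = sum(weights.values()) / len(FACTORS)  # the equal, initial weight
        # No factor may be weighted at less than one-half of its initial weight.
        return all(w >= initial / 2 for w in weights.values())

    equal_weights = {factor: 1.0 for factor in FACTORS}
    print(weights_are_allowed(equal_weights))                             # True
    print(weights_are_allowed({**equal_weights, "communication": 0.25}))  # False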

While AcqDemo applies organizationwide competencies across all employees,
NRL has established "critical elements" for each career path and allows
supervisors to add individual performance expectations. The critical
elements are the key aspects of work that supervisors are to consider in
evaluating employee performance. Each critical element has discriminators
and descriptors. Specifically, for the Science and Engineering
Professionals career path, one critical element is "scientific and
technical problem solving." That element's discriminators are (1) level of
oversight, (2) creativity, (3) technical communications, and (4)
recognition. For recognition, the descriptors include "recognized within
own organization for technical ability in assigned areas" as one level of
contribution and "recognized internally and externally by peers for
technical expertise" as the next level of contribution.

NRL's system allows supervisors to supplement the descriptors to further
describe what is expected of employees. According to an NRL demonstration
project official, this flexibility allows the supervisor to better
communicate performance expectations. Further, pay pool panels may weight
the critical elements, including a weight of zero. Weighted elements are
to be applied consistently to groups within a career path, such as Bench
Level, Supervisor, Program Manager, or Support for the Science and
Engineering Professionals career path. According to an NRL official,
panels commonly weight critical elements but rarely weight an element to
zero. Further, panels use weighting most often for the Science and
Engineering Professionals career path.

Determining individual position-based competencies. Other demonstration
projects determine competencies based primarily on the individual
position. NIST and DOC identify "critical elements" tailored to each
individual position.6 According to a DOC demonstration project official,
DOC tailors critical elements to individual positions because their duties
and responsibilities vary greatly within the demonstration project.

6At DOC, all managerial and supervisory employees are also evaluated on
core critical elements, such as recommending or making personnel
decisions; developing and appraising subordinates; and fulfilling
diversity, equal opportunity, and affirmative action responsibilities, in
addition to program responsibilities.

Each employee's performance plan is to have a minimum of two and a maximum
of six critical elements along with the major activities to accomplish the
element. Supervisors are to assign a weight to each critical element on
the basis of its importance, the time required to accomplish it, or both.
According to NIST and DOC officials, weighting is done at the supervisory
level and is not tracked at the organizational level.

To evaluate the accomplishment of critical elements, DOC uses its
organizationwide Benchmark Performance Standards. They range from the
highest standard of performance, "objectives were achieved with maximum
impact, through exemplary work that demonstrated exceptional originality,
versatility, and creativity" to the lowest, "objectives and activities
were not successfully completed, because of failures in quality, quantity,
completeness, or timeliness of work." Supervisors can develop supplemental
performance standards as needed.

Similarly, each China Lake employee has a performance plan that includes
criteria tailored to individual responsibilities. The criteria are to be
consistent with the employee's work unit's goals and objectives and can be
set in two ways, depending on the nature of the position. The "task
approach" defines an individual's output. The "function approach" defines
the required skills and how well they are to be performed. Employees and
supervisors choose from a menu of skills, such as planning, analysis,
coordination, and reporting/documentation. A China Lake official stated
that some of its work units require core competencies, such as teamwork
and self-development, for all employees. According to the official, while
developing core competencies sends a message about what is important to
the organization, tailoring individual performance plans can focus
employees' attention on changing expectations.

Translating Employee Performance Ratings into Pay Increases and Awards

High-performing organizations seek to create pay, incentive, and reward
systems that clearly link employee knowledge, skills, and contributions to
organizational results. These organizations make meaningful distinctions
between acceptable and outstanding performance of individuals and
appropriately reward those who perform at the highest level. Performance
management systems in these leading organizations typically seek to
achieve three key objectives: (1) provide candid and constructive feedback
to help individual employees maximize their potential in understanding and
realizing the goals and objectives of the agency, (2) provide management
with the objective and fact-based information it needs to reward top

performers, and (3) provide the necessary information and documentation to
deal with poor performers.

To this end, the demonstration projects took different approaches in
translating individual employee performance ratings into permanent pay
increases, one-time awards, or both in their pay for performance systems.
Some projects, such as China Lake and NAVSEA's Newport division,
established predetermined pay increases, awards, or both depending on a
given performance rating. Others, such as DOC and NIST, delegated the
flexibility to individual pay pools to determine how ratings translate
into pay increases, awards, or both. Overall, while the demonstration
projects made some distinctions among employees' performance, the data and
experience to date show that making such meaningful distinctions remains a
work in progress.

Setting predetermined pay increases and awards. China Lake's assessment
categories translate directly to a predetermined range of permanent pay
increases, as shown in figure 1.7 Supervisors are to rate employees in one
of three assessment categories and recommend numerical ratings, based on
employees' performance and salaries, among other factors. For employees
receiving "highly successful" ratings, a Performance Review Board assigns
the numerical ratings. For "less than fully successful" ratings, the
first-line supervisor and a second-level reviewer assign the numerical
ratings, based on a problem-solving team's findings and a personnel
advisor's input. The numerical rating determines how many "increments" the
employee will receive. An increment is a permanent pay increase of about
1.5 percent of an employee's base salary.

7China Lake gives managers discretion in determining how awards are
distributed among employees with ratings of "fully successful" or above.

Figure 1: China Lake's Rating and Pay Distribution Structure

Source: GAO, based on DOD data.

Note: All employees receive the locality pay adjustment regardless of
assessment category or numerical rating.
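
To illustrate the increment arithmetic described above, the short Python
sketch below applies a given number of increments, each worth about
1.5 percent of base salary, to a base salary. The salary and increment count
are hypothetical, and the sketch assumes increments are applied additively to
the pre-increase base salary.

    # Illustrative sketch; the 1.5 percent increment size is described above,
    # but the salary and increment count here are hypothetical.
    INCREMENT_RATE = 0.015  # one increment is about 1.5% of base salary

    def apply_increments(base_salary: float, increments: int) -> float:
        """Return the new base salary after a whole number of increments,
        assuming each increment adds 1.5% of the pre-increase base salary."""
        return base_salary * (1 + INCREMENT_RATE * increments)

    # Example: a numerical rating worth 3 increments on a $60,000 base salary
    print(apply_increments(60_000, 3))  # 62700.0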

China Lake made some distinctions in performance across employees'
ratings, as shown in figure 2:8

o 	11.3 percent of employees received a "1," the highest numerical rating,
and

o 	a total of six employees (0.2 percent) were rated "less than fully
successful" and received numerical ratings of "4" or "5."

8As a point of comparison, in 2002, about 48 percent of GS employees
across the executive branch under a similar five-level rating system were
rated in the highest category and less than 1 percent were rated as less
than fully successful.

Figure 2: China Lake's Rating Distribution by Numerical Rating (2002)

Highly successful-1             415 employees     (11.3%)
Highly successful-2             1,639 employees   (44.6%)
Fully successful-3              1,617 employees   (44.0%)
Less than fully successful-4    4 employees       (0.1%)
Less than fully successful-5    2 employees       (0.1%)

Source: GAO analysis of DOD data.

Note: Percentages total more than 100 percent due to rounding.

At China Lake, the average pay increase rose with performance, as shown in
table 2.

o  The average permanent pay increase ranged from 1.8 to 5.3 percent.

o 	Six employees were rated as "less than fully successful" and thus were
to receive no performance pay increases and half or none of the GPI.
According to a China Lake official, employees rated as "less than fully
successful" are referred to a problem-solving team, consisting of the
supervisor, reviewer, personnel advisor, and other appropriate officials,
that determines what corrective actions are necessary.

Table 2: China Lake's Pay Increase Distribution (2002)

                                          Number of            Increase as a percentage
                                          employees receiving  of base pay
Assessment category     Numerical rating  permanent pay
                                          increases            Average  Lowest  Highest
Highly successful       1                 191                  5.3      1.5
                        2                 929                  3.4      1.5
Fully successful        3                 526                  1.8      1.3
Less than fully         4                 0                    N/A      N/A     N/A
successful              5                 0                    N/A      N/A     N/A
Total                                     1,646

Source: DOD.

Legend: N/A= data are not applicable.

Notes: Data do not include the GPI or the locality pay adjustment.

Employees whose salaries are at the top of the pay band cannot receive
permanent pay increases; therefore, the number of employees receiving pay
increases differs from those receiving ratings.

Similar to China Lake, at NAVSEA's Newport division, a performance rating
category translates directly to a predetermined range of permanent pay
increases, one-time awards, or both, as shown in figure 3. Newport
translates ratings into pay increases and awards in three steps. First,
supervisors are to rate employees as "acceptable" or "unacceptable."
Employees rated as unacceptable are not eligible for pay increases or
awards. Employees rated as acceptable are to be further assessed on their
performance relative to their salaries. Supervisors assess acceptable
employees into three rating categories: contributors, major contributors,
or exceptional contributors. Supervisors also make recommendations for the
number of pay points to be awarded, from 0 to 4, depending on the rating
category and the employees' salaries. Pay pool managers review and
department heads finalize supervisor recommendations. A pay point equals
1.5 percent of the midpoint salary of the pay band. Pay points may be
permanent pay increases or one-time awards.

Figure 3: NAVSEA Newport Division's Rating and Performance Pay
Distribution Structure

Source: GAO, based on DOD data.

Note: All employees receive the full GPI and locality pay adjustment
regardless of rating category.

Newport allows for some flexibility in deciding whether employees receive
permanent pay increases, one-time awards, or both. Newport's guidelines
state that those who make greater contributions should receive permanent
increases to base pay, while employees whose contributions are
commensurate with their salaries receive one-time awards. In addition,
employees whose salaries fall below the midpoint of the pay band are more
likely to receive permanent pay increases, while employees above the
midpoint of the pay band are more likely to receive one-time awards.
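
As a rough illustration of the pay point arithmetic above, the Python sketch
below converts a recommended number of pay points (0 to 4) into a dollar
amount using the 1.5 percent-of-band-midpoint definition. The midpoint salary
and point count are hypothetical, and the split between permanent increase
and one-time award is left to the division's guidelines described above.

    # Illustrative sketch; a pay point equals 1.5% of the midpoint salary of
    # the employee's pay band. The figures below are hypothetical.
    PAY_POINT_RATE = 0.015

    def pay_point_dollars(band_midpoint_salary: float, points: int) -> float:
        """Dollar value of 0-4 Newport pay points for a given pay band."""
        if not 0 <= points <= 4:
            raise ValueError("supervisors recommend 0 to 4 pay points")
        return points * PAY_POINT_RATE * band_midpoint_salary

    # Example: 3 pay points in a band with a $70,000 midpoint salary; the
    # amount could be delivered as a permanent increase, a one-time award,
    # or a mix of the two.
    print(pay_point_dollars(70_000, 3))  # 3150.0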

NAVSEA's Newport division made some distinctions in performance across
employees' ratings.9 As shown in figure 4,

o 	about 80 percent of employees were rated in the top two categories
(exceptional contributor and major contributor) and

o  no employees were rated unacceptable.

9As a point of comparison, in 2002, about 92 percent of GS employees
across the executive branch under a similar four-level rating system were
rated in the top two categories and about 0.1 percent were rated as
unacceptable.

Figure 4: NAVSEA Newport Division's Rating Distribution (2002)

Exceptional contributor    837 employees   (39.6%)
Major contributor          851 employees   (40.3%)
Contributor                423 employees   (20.0%)
Unacceptable               0 employees     (0%)

Source: GAO analysis of DOD data.

Note: Percentages total less than 100 percent due to rounding.

In addition, at NAVSEA's Newport division, the average pay increase and
award amount rose with performance, as shown in table 3.

o  The average permanent pay increase ranged from 1.6 to 2.9 percent.

o  The average performance award ranged from $1,089 to $2,216.

 Table 3: NAVSEA Newport Division's Pay Increase and Award Distribution (2002)
                          Permanent pay increase                       Performance award
                          Number of     Increase as a                  Number of     Performance award
                          employees     percentage of base pay         employees     amount
Rating                    receiving                                    receiving
                          permanent pay Average  Lowest  Highest       performance   Average  Lowest  Highest
                          increases                                    awards
Exceptional contributor   686           2.9      0.1     7.0           615           $2,216   $561    $5,680
Major contributor         602           2.0      0.9     5.3           613           1,592    561     4,260
Contributor               124           1.6      1.2     1.8           143           1,089    519     2,212
Unacceptable              0             N/A      N/A     N/A           0             N/A      N/A     N/A
Total                     1,412                                        1,371

Source: DOD.

Legend: N/A= data are not applicable.

Notes: Data do not include the GPI or locality pay adjustment.

Employees can receive their pay as permanent increases or one-time awards;
therefore, the number of employees receiving pay increases and awards
differs from those receiving ratings.

Delegating pay decisions to pay pools. Some demonstration projects, such
as NIST and DOC, delegate the flexibility to individual pay pools to
determine how ratings translate into permanent pay increases and one-time
awards. For example, supervisors are to evaluate employees on a range of
performance elements on a scale of 0 to 100. Employees with scores less
than 40 are to be rated as "unsatisfactory" and are not eligible to
receive performance pay increases, awards, the GPI, or the locality pay
adjustment. Employees with scores over 40 are to be rated as "eligible";
receive the full GPI and locality pay adjustment; and be eligible for a
performance pay increase, award, or both.

Pay pool managers have the flexibility to determine the amount of the pay
increase, award, or both for each performance score, depending on where
they fall within the pay band. Employees lower in the pay band are
eligible for larger pay increases as a percentage of base pay than
employees higher in the pay band, and employees whose salaries are at the
top of the pay band and who therefore can no longer receive permanent
salary increases may receive awards.
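
A minimal Python sketch of the eligibility logic described above follows. It
assumes a score of exactly 40 counts as eligible (consistent with the 40-49
ratings shown in figure 5); the function name and example scores are
illustrative rather than DOC's or NIST's own terminology.

    # Illustrative sketch of the 0-100 rating threshold described above.
    def rating_outcome(score: float) -> dict:
        """Summarize what a 0-100 performance score makes an employee eligible for."""
        if not 0 <= score <= 100:
            raise ValueError("scores are assigned on a 0 to 100 scale")
        eligible = score >= 40  # below 40 is rated "unsatisfactory"
        return {
            "rating": "eligible" if eligible else "unsatisfactory",
            "receives_gpi_and_locality_pay": eligible,
            "considered_for_performance_pay_or_award": eligible,
        }

    print(rating_outcome(85))  # eligible; the pay pool decides the increase, award, or both
    print(rating_outcome(35))  # unsatisfactory; no GPI, locality pay, increase, or award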

According to our analysis, in its 2002 rating cycle, DOC made few
distinctions in performance in its distribution of ratings.10 As shown in
figure 5,

o 	100 percent of employees scored 40 or above and over 86 percent of
employees scored 80 or above and

o  no employees were rated as unsatisfactory.

10As a point of comparison, in 2002, about 99.9 percent of GS employees
across the executive branch under a similar two-level rating system passed
and about 0.1 percent failed.

Figure 5: DOC's Rating Distribution (2002)

90-100            1,094 employees   (41.7%)
80-89             1,183 employees   (45.1%)
70-79             289 employees     (11.0%)
60-69             44 employees      (1.7%)
50-59             6 employees       (0.2%)
40-49             8 employees       (0.3%)
Unsatisfactory    0 employees       (0%)

Source: GAO analysis of DOC data.

According to a DOC official, a goal of the demonstration project is to
address poor performance early. An official also noted that poor
performers may choose to leave the organization before they receive
ratings of unsatisfactory or are placed on a performance improvement plan.
Employees who are placed on a performance improvement plan and improve
their performance within the specified time frame (typically less than 90
days) are determined to be eligible for the GPI and locality pay
adjustment for the remainder of the year.

Our analysis also shows that DOC made few distinctions in performance in
its distribution of awards. As shown in table 4, 10 employees who scored
from 60 to 69 received an average performance award of $925, while

employees who scored from 70 to 79 received an average of $742. Our
analysis suggests that DOC's policy of delegating flexibility to
individual pay pools to determine performance awards could explain why,
without an independent reasonableness review, some employees with lower
scores receive larger awards than employees with higher scores. According
to DOC, it reviews pay pool decisions within but not across organizational
units.

Table 4: DOC's Pay Increase and Award Distribution (2002)

                      Permanent pay increase                       Performance award
                      Number of     Increase as a                  Number of     Performance award
                      employees     percentage of base pay         employees     amount
Rating                receiving                                    receiving
                      permanent pay Average  Lowest  Highest       performance   Average  Lowest  Highest
                      increases                                    awards
Eligible
  90-100              1,014         3.9      0.7     15.0          1,079         $1,781   $250    $7,500
  80-89               1,121         3.1      0.02    11.0          1,099         1,117    100     6,000
  70-79               250           2.4      0.2     9.0           181           742      50      2,000
  60-69               18            0.9      0.2     3.2           10            925      300     2,500
  50-59               1             1.2      1.2     1.2           1             300      300
  40-49               0             N/A      N/A     N/A           1             200      200
Unsatisfactory        0             N/A      N/A     N/A           0             N/A      N/A     N/A
Total                 2,404                                        2,371

Source: GAO analysis of DOC data.

Legend: N/A= data are not applicable.

Notes: Data do not include the GPI or the locality pay adjustment.

Not all employees who receive ratings receive pay increases or awards;
therefore, the number of employees receiving pay increases or awards
differs from those receiving ratings.

NIST also delegates pay decisions to individual pay pools. The NIST
100-point rating system is similar to DOC's system. Employees with scores
under 40 are rated as "unsatisfactory" and do not receive the GPI,
locality pay adjustment, or performance pay increases or awards. Employees
with scores over 40 receive the full GPI and locality pay adjustment and
are eligible to receive performance pay increases, awards, or both.
Similar to DOC, in its 2002 rating cycle, NIST made few distinctions in
performance in its distribution of ratings. Specifically,

o 	99.9 percent of employees scored 40 or above, and nearly 78 percent of
employees scored 80 or above, and

o  0.1 percent, or 3 employees, were rated as unsatisfactory.

Considering Current Salary in Making Performance-Based Pay Decisions

Several of the demonstration projects consider an employee's current
salary when making decisions on permanent pay increases and one-time
awards. By considering salary in such decisions, the projects intend to
make a better match between an employee's compensation and his or her
contribution to the organization. Thus, two employees with comparable
contributions could receive different pay increases and awards depending
on their current salaries.

At AcqDemo, supervisors recommend and pay pool managers approve employees'
"contribution scores." Pay pools then plot contribution scores against the
employees' current salaries and a "standard pay line" to determine if
employees are "appropriately compensated," "under-compensated," or
"over-compensated," given their contributions.11 Figure 6 shows how
AcqDemo makes its performance pay decisions for employees who receive the
same contribution scores but earn different salaries.

11The "standard pay line" spans from the dollar equivalent of GS-1, step
1, to the dollar equivalent of GS-15, step 10. Appropriately compensated
employees' salaries fall within the "normal pay range," which encompasses
an area of +/- 4.0 points from the standard pay line.

 Figure 6: AcqDemo's Consideration of Current Salary in Making Performance Pay
                                   Decisions

Source: DOD.
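
The Python sketch below is one way to picture the classification shown in
figure 6. It assumes the standard pay line can be expressed as an expected
contribution score for a given salary and uses the +/- 4.0-point normal pay
range from the note above; the function and its inputs are illustrative, not
AcqDemo's actual algorithm.

    # Illustrative sketch of classifying an employee against the standard pay
    # line. "expected_score" stands in for the score the standard pay line
    # associates with the employee's current salary; the +/- 4.0-point band is
    # the "normal pay range" described above.
    NORMAL_PAY_RANGE = 4.0

    def compensation_category(contribution_score: float, expected_score: float) -> str:
        """Classify an employee as appropriately, under-, or over-compensated."""
        if contribution_score > expected_score + NORMAL_PAY_RANGE:
            return "under-compensated"       # contributing more than current pay reflects
        if contribution_score < expected_score - NORMAL_PAY_RANGE:
            return "over-compensated"        # contributing less than current pay reflects
        return "appropriately compensated"   # within the normal pay range

    # Hypothetical example: two employees with the same contribution score but
    # different salaries (and therefore different expected scores) can land in
    # different categories.
    print(compensation_category(75.0, 68.0))  # under-compensated
    print(compensation_category(75.0, 73.0))  # appropriately compensated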

AcqDemo has reported that it has made progress in matching employees'
compensation to their contributions to the organization. From 1999 to
2002, appropriately compensated employees increased from about 63 percent
to about 72 percent, under-compensated employees decreased from about 30
percent to about 27 percent, and over-compensated employees decreased from
nearly 7 percent to less than 2 percent.

NRL implemented a similar system intended to better match employee
contributions with salary. Data from NRL show that it has made progress in
matching employees' compensation to their contributions to the
organization. From 1999 to 2002, "normally compensated" employees, or
employees whose contributions match their compensation, increased from
about 68 percent to about 81 percent; under-compensated employees
decreased from about 25 percent to about 16 percent; and over-compensated
employees decreased from about 7 percent to about 3 percent.

Similar to AcqDemo's and NRL's approach, NAVSEA's Dahlgren division
recently redesigned its pay for performance system to better match
compensation and contribution. Because Dahlgren implemented its new system
in 2002, performance data were not available. Less systematically, China
Lake and NAVSEA's Newport division consider current salary in making pay
and award decisions. For example, at Newport, supervisors within each pay
pool are to list all employees in each pay band by salary before a rating
is determined and then evaluate each employee's contribution to the
organization considering that salary. If their contributions exceed
expectations, employees are considered for permanent pay increases. If
contributions meet expectations, employees are considered for one-time
awards.

Managing Costs of the Pay for Performance System

Salary Costs

OPM reports that the increased costs of implementing alternative personnel
systems should be acknowledged and budgeted for up front.12 Based on the
data the demonstration projects provided us, direct costs associated with
salaries, training, and automation and data systems were the major cost
drivers of implementing their pay for performance systems. The
demonstration projects reported other direct costs, such as evaluations
and administrative expenses. The demonstration projects used a number of
approaches to manage the direct costs of implementing and maintaining
their pay for performance systems.

Under the current GS system, federal employees annually receive the GPI
and, where appropriate, a locality pay adjustment, as well as periodically
receiving WGIs. The demonstration projects use these and other funding
sources under the GS to make their pay decisions, as shown in figure 7.

12U.S. Office of Personnel Management, Demonstration Projects and
Alternative Personnel Systems: HR Flexibilities and Lessons Learned
(Washington, D.C.: September 2001).

Figure 7: Funding Sources Linked to Pay Decisions in Selected Personnel
Demonstration Projects as of Fiscal Year 2003

Figure 7 is a matrix showing, for each demonstration project (China Lake,
NIST, DOC, NRL, NAVSEA, and AcqDemo), whether the GPI, the locality pay
adjustment, WGI and QSI, and career ladder promotions are funding sources
linked to its pay decisions. The check marks in the matrix are not
reproduced in this text version; see the PDF. The "a" footnote below
applies to AcqDemo's GPI entry.

Source: GAO.

a According to AcqDemo officials, some AcqDemo organizational units
guaranteed the GPI for the first year to assure employees' understanding
and fair implementation of the process and others guaranteed the GPI for
additional, but limited, years to obtain local union agreement to enter
the demonstration project.

The aggregated average salary data that some of the demonstration projects
were able to provide do not allow us to determine whether total salary
costs for the demonstration projects are higher or lower than their GS
comparison groups. However, our analysis shows that the demonstration
projects' cumulative percentage increases in average salaries varied
relative to their GS comparison groups. For example, as shown in table 5,
after the first year of each demonstration project's implementation, the
differences in cumulative percentage increase in average salary between
the demonstration project employees and their GS comparison group ranged
from -2.9 to 2.7 percentage points.

Table 5: Cumulative Percentage Increase in Average Salaries for
Demonstration Project and Comparison Group Employees by Year of the
Project, as Reported by the Demonstration Projects

                     Year 1             Year 2             Year 3             Year 4             Year 5
                D%    C%  Diff.    D%    C%  Diff.    D%    C%  Diff.    D%    C%  Diff.    D%    C%  Diff.
China Lake    10.3   7.6   2.7   17.7  14.6   3.1   24.7  20.1   4.6   28.0  23.5   4.5   31.6  27.7   3.9
NIST           4.2   2.7   1.5   10.1   7.1   3.0   17.3  12.1   5.2   24.2  16.6   7.6   31.1  21.9   9.2

Diff. = difference between D and C, in percentage points. Rows for the
other demonstration projects in the printed table are not reproduced in
this text version; see the PDF for the complete table.

Source: GAO analysis of OPM, DOC, and DOD data.

Legend: D = demonstration project; C = comparison group for the
demonstration project in the GS system.

Notes: We calculated the percentage increase in average salaries using the
demonstration project's or comparison group's aggregated average salary in
the year prior to the project's implementation as the baseline.

Data are as reported by the demonstration projects without verification by
GAO.

Shaded areas indicate that the demonstration project has not yet reached
those years.

Based on our review of the DOC salary data, we determined that the data
were not adequate for use in our comparative analyses of salary growth.
Therefore, we do not present DOC's salary data.

According to a demonstration project official, AcqDemo does not collect
comparable salary data due to its constantly changing and growing
participant base. Therefore, we do not present AcqDemo's average salary
data. AcqDemo reports that demonstration project salaries increased 0.7
percent more than GS salaries in fiscal years 2000 (year 1) and 2001
(year 2) and 0.9 percent more in fiscal year 2002 (year 3).
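
The cumulative increases in table 5 are computed as described in the table
notes: each year's average salary is compared with the average salary in
the year before the project's implementation. A minimal Python sketch of
that calculation follows; the function name and the dollar figures in the
example are illustrative and are not data from the table.

    def cumulative_percentage_increase(baseline_avg_salary, avg_salary_in_year):
        """Cumulative percentage increase in average salary relative to the
        average salary in the year prior to the project's implementation."""
        return (avg_salary_in_year / baseline_avg_salary - 1.0) * 100.0

    # Hypothetical example: a baseline average salary of $60,000 that grows
    # to $66,180 is a cumulative increase of 10.3 percent.
    print(round(cumulative_percentage_increase(60_000, 66_180), 1))  # 10.3

    # The "Difference" column in table 5 is simply the demonstration
    # project's cumulative increase minus the comparison group's, in
    # percentage points.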

The demonstration projects used several approaches to manage salary costs,
including (1) choosing the method of converting employees into the
demonstration project, (2) considering fiscal conditions and the labor
market, (3) managing movement through the pay band, and (4) providing a
mix of awards and performance pay increases.

Choosing the method of converting employees into the demonstration
project. When the demonstration projects converted employees from the GS
system to the pay for performance system, they compensated each employee
for the portion of the WGI that the employee had earned either as a
permanent increase to base pay or a one-time lump sum payment. Four of
the six demonstration projects (China Lake, NRL,
NAVSEA, and AcqDemo) gave employees permanent increases to base pay, while
the remaining two demonstration projects (NIST and DOC) gave employees
one-time lump sum payments.

Both methods of compensating employees have benefits and drawbacks,
according to demonstration project officials. Giving permanent pay
increases at the point of conversion into the demonstration project
recognizes that employees had already earned a portion of the WGI, but a
drawback is that the salary increases are compounded over time, which
increases the organization's total salary costs. However, the officials
said that giving permanent pay increases garnered employees' support for
the demonstration project because employees did not feel like they would
have been better off under the GS system.

Considering fiscal conditions and the labor market. In determining how
much to budget for pay increases, demonstration projects considered the
fiscal condition of the organization as well as the labor market. For
example, China Lake, NIST, NRL, and NAVSEA receive a portion of their
funding from a working capital fund and thus must take into account fiscal
conditions when budgeting for pay increases and awards. These
organizations rely, in part, on sales revenue rather than direct
appropriations to finance their operations. The organizations establish
prices for their services that allow them to recover their costs from
their customers. If the organizations' services become too expensive
(i.e., salaries are too high), they become less competitive with the
private sector.

A demonstration project official at NAVSEA's Newport division said that as
an organization financed in part through a working capital fund, it has an
advantage over organizations that rely completely on appropriations
because it can justify adjusting pay increase and awards budgets when
necessary to remain competitive with the private sector. Newport has had
to make such adjustments. In fiscal year 2002, the performance pay
increase and award pools were funded at lower levels (1.4 percent and 1.7
percent of total salaries for pay increases and awards, respectively) than
in 2001 (1.7 percent and 1.8 percent, respectively) because of fiscal
constraints. As agreed with one of its unions, Newport must set aside a
minimum of 1.4 percent of salaries for its pay increases, which is equal
to historical spending under GS for similar increases.

NAVSEA's Newport division also considers the labor market and uses
regional and industry salary information compiled by the American
Association of Engineering Societies when determining how much to set
aside for pay increases and awards. In fiscal year 2001, Newport funded
pay increases and awards at a higher level (1.7 percent and 1.8 percent of
total salaries, respectively) than in fiscal year 2000 (1.4 percent and
1.6 percent, respectively) in response to higher external engineer,
scientist, and information technology personnel salaries.

Managing movement through the pay band. Because movement through the pay
band is based on performance, demonstration project employees could
progress through the pay band more quickly than under the GS. Some
demonstration projects have developed ways intended to manage this
progression to prevent all employees from eventually migrating to the top
of the pay band and thus increasing salary costs.

NIST and DOC manage movement through the pay band by recognizing
performance with larger pay increases early in the pay band and career
path and smaller increases higher in the pay band and career path. Both of
these demonstration projects divided each pay band into five intervals.
The intervals determine the maximum percentage increase employees could
receive for permanent pay increases. The intervals, shown in figure 8,
have helped NIST manage salary costs, according to a NIST official.

Figure 8: Pay Bands, Intervals, and Corresponding Permanent Pay Increases
for NIST's Scientific and Engineering Career Path

    Pay band (GS equivalent)    Interval    Permanent pay increase (percent)
    V (GS 15)                       5                  0 - 4
                                    4                  0 - 4
                                    3                  0 - 4
                                    2                  0 - 5
                                    1                  0 - 6
    IV (GS 13-14)                 4-5                  0 - 6
                                    3                  0 - 6
                                    2                  0 - 8
                                    1                  0 - 10
    III (GS 11-12)                4-5                  0 - 7
                                    3                  0 - 7
                                    2                  0 - 12
                                    1                  0 - 15
    II (GS 7-10)                  4-5                  0 - 8
                                    3                  0 - 8
                                    2                  0 - 16
                                    1                  0 - 20
    I (GS 1-6)                    4-5                  0 - 7
                                    3                  0 - 7
                                    2                  0 - 12
                                    1                  0 - 14

Source: DOC.
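
As an illustration of how such interval caps could be applied, the sketch
below encodes the ranges shown in figure 8 and limits a recommended
increase accordingly. The data structure and function are illustrative,
not NIST's actual payout tool; only the cap values are taken from the
figure.

    # Maximum permanent pay increase (percent) by pay band and interval,
    # as shown in figure 8 for NIST's Scientific and Engineering career path.
    MAX_INCREASE = {
        "I":   {1: 14, 2: 12, 3: 7, 4: 7, 5: 7},
        "II":  {1: 20, 2: 16, 3: 8, 4: 8, 5: 8},
        "III": {1: 15, 2: 12, 3: 7, 4: 7, 5: 7},
        "IV":  {1: 10, 2: 8,  3: 6, 4: 6, 5: 6},
        "V":   {1: 6,  2: 5,  3: 4, 4: 4, 5: 4},
    }

    def cap_pay_increase(band, interval, recommended_percent):
        """Limit a recommended permanent pay increase to the maximum allowed
        for the employee's pay band and interval (illustrative only)."""
        return min(recommended_percent, MAX_INCREASE[band][interval])

    print(cap_pay_increase("II", 1, 25))  # 20: capped at the interval maximum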

Similarly, some of the demonstration projects, including China Lake and
NAVSEA's Dahlgren division, have checkpoints or "speed bumps" in their pay
bands intended to manage salary costs as well as ensure that employees'
performance coincides with their salaries as they progress through the
band. These projects established checkpoints designed to ensure that only
the highest performers move into the upper half of the pay band. For
example, when employees' salaries at China Lake reach the midpoint of the
pay band, they must receive ratings of highly successful, which are
equivalent to exceeding expectations, before they can receive additional
salary increases. A Performance Review Board, made up of senior
management, is to review all highly successful ratings.

Providing a mix of awards and pay increases. Some of the demonstration
projects intended to manage costs by providing a mix of one-time awards
and permanent pay increases. Rewarding an employee's performance with an
award instead of an equivalent increase to base pay can reduce salary
costs in the long run because the agency only has to pay the amount of the
award one time, rather than annually. For example, at NAVSEA's Newport
division, as employees move higher into the pay band, they are more likely
to receive awards than permanent increases to base pay. According to a
Newport official, expectations increase along with salaries, and thus it
is more likely that employees' contributions will meet, rather than
exceed, expectations.
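
The cost difference between the two approaches is easy to see with a
small, hypothetical calculation; the salary, percentage, and time horizon
below are illustrative and are not drawn from any of the projects.

    def cumulative_cost(base_salary, reward_percent, years, permanent=True):
        """Total cost over `years` of rewarding performance either as a
        permanent increase to base pay (paid every subsequent year) or as an
        equivalent one-time award (paid once). Illustrative only; ignores
        later raises, compounding of further increases, and benefits."""
        reward = base_salary * reward_percent / 100.0
        return reward * years if permanent else reward

    # Hypothetical example: a 3 percent reward on a $70,000 salary.
    print(cumulative_cost(70_000, 3, years=5, permanent=True))   # 10500.0 over 5 years
    print(cumulative_cost(70_000, 3, years=5, permanent=False))  #  2100.0 paid once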

To manage costs, China Lake allows pay pools to transfer some of their
budgets for permanent pay increases to their budgets for awards. A China
Lake official said that because China Lake receives a portion of its
funding from a working capital fund, it is not only important to give
permanent salary increases to high-performing employees, but also to give
increases China Lake can afford the next year. China Lake does not track
how much funding is transferred from performance pay increase budgets to
awards budgets.

Training Costs	We have reported that agencies will need to invest
resources, including time and money, to ensure that employees have the
information, skills, and competencies they need to work effectively in a
rapidly changing and complex environment.13 This includes investments in
training and developing employees as part of an agency's overall effort to
achieve cost-effective and timely results. Agency managers and supervisors
are often aware that investments in training and development initiatives
can be quite large. However, across the federal government, evaluation
efforts have often been hindered by the lack of accurate and reliable data
to document the total costs of training efforts. Each of the demonstration
projects trained employees on the performance management system prior to
implementation to make employees aware of the new approach, as well as
periodically after implementation to refresh employee familiarity with the
system. The training was designed to help employees understand
competencies and performance standards; develop performance plans; write
self-appraisals; become familiar with how performance is evaluated and
how pay increase and award decisions are made; and know the roles and
responsibilities of managers, supervisors, and employees in the appraisal
and payout processes.

13U.S. General Accounting Office, Human Capital: A Guide for Assessing
Strategic Training and Development Efforts in the Federal Government
(Exposure Draft), GAO-03-893G (Washington, D.C.: July 1, 2003).

Generally, demonstration projects told us they incurred direct and
indirect costs associated with training. Direct training costs that the
demonstration projects reported included costs for contractors, materials,
and travel related to developing and delivering training to employees and
managers. As shown in table 6, total direct costs that the demonstration
projects reported for training through the first 5 years of the projects'
implementation range from an estimated $33,000 at NAVSEA's Dahlgren
division to more than $1 million at China Lake.14 (NIST reported no direct
costs associated with training.) Training costs, as indicated by the cost
per employee, were generally higher in the year prior to implementation,
except for AcqDemo's, which increased over time.

14All dollars were inflation-adjusted to 2002 dollars because the
demonstration projects took place over a variety of years.
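
The adjustment works by scaling each nominal amount by the ratio of the
2002 price level to the price level in the year the cost was incurred. A
minimal Python sketch follows; the index values in the example are
placeholders, not the actual Consumer Price Index figures used for this
report.

    def to_2002_dollars(nominal_dollars, cpi_in_year, cpi_2002):
        """Convert a nominal dollar amount from its original year into 2002
        dollars by scaling with the ratio of the 2002 price index to the
        original year's index (placeholder values; illustrative only)."""
        return nominal_dollars * (cpi_2002 / cpi_in_year)

    # Hypothetical example: $100,000 spent in a year with a price index of
    # 160, restated in 2002 dollars with an index of 180.
    print(round(to_2002_dollars(100_000, cpi_in_year=160, cpi_2002=180)))  # 112500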

Table 6: Direct Inflation-Adjusted Cost of Training in the First 5 Years
of the Demonstration Projects (in 2002 Dollars), as Reported by the
Demonstration Projects

Demonstration        Cost per demonstration project employee                            Total cost, prior to
project              Year prior to     Year 1   Year 2   Year 3   Year 4   Year 5       implementation
                     implementation                                                     through year 5
China Lake                     $203       $21  No data  No data  No data  No data            $1,226,000
NIST                              0         0        0        0        0        0                     0
DOC                             $12        $6       $5       $8       $8  No data              $105,000
NAVSEA-Dahlgren                 $17         0        0        0        0        0               $33,000
  (all estimates)
AcqDemo                     No data        $8      $10       $9      $20      $19              $458,000

Source: GAO analysis of DOC and DOD data.

Notes: The cost per demonstration project employee is based on the number
of employees in the demonstration project at the same time each year, not
the actual number of employees trained on the demonstration project,
because the demonstration projects do not collect this information.

Data are as reported by the demonstration projects without verification by
GAO.

Shaded squares indicate that the demonstration project has not yet reached
those years.

While the demonstration projects did not report indirect costs associated
with training employees on the demonstration project, officials stated
that indirect costs, such as employee time spent developing, delivering,
or attending training, could nonetheless be significant. Likewise, the
time spent on the "learning curve" until employees are proficient with the
new system could also be significant. For example, although NIST did not
capture its indirect training costs, agency officials told us that prior
to implementation, each NIST employee was in training for 1 day. Since its
implementation, NIST has offered optional half-day training three times a
year for all employees. AcqDemo offered 8 hours of training for employees
prior to implementation and a minimum of 4 hours of training after
implementation. All potential new participants also received 8 hours of
training prior to implementation at their site. Supervisors and human
resources professionals at AcqDemo were offered an additional 8 hours of
training each year after the demonstration project was implemented.
According to a DOC official, prior to conversion to the demonstration
project, DOC provided a detailed briefing to approximately 400 employees
to increase employee understanding of the project. In addition, employees

could schedule one-on-one counseling sessions with human resources staff
to discuss individual issues and concerns.

Some of the demonstration projects, including China Lake, DOC, and
NAVSEA's Dahlgren and Newport divisions, managed training costs by relying
on current employees to train other employees on the demonstration
project. According to demonstration project officials, while there are
still costs associated with developing and delivering in-house training,
total training costs are generally reduced by using employees rather than
hiring contractors to train employees. For example, China Lake took a
"train the trainer" approach by training a group of employees on the new
flexibilities in the demonstration project and having those employees
train other employees. According to a demonstration project official, an
added benefit of using employees to train other employees is that if the
person leading the training is respected and known, then the employees are
more likely to support the demonstration project. The official said that
one drawback is that not all employees are good teachers, so their skills
should be carefully considered.

AcqDemo used a combination of contractors and in-house training to
implement its training strategy. According to an AcqDemo official, the
relatively higher per demonstration project employee costs in years 4 and
5 are a result of AcqDemo's recognition that more in-depth and varied
training was needed for current AcqDemo employees to refresh their
proficiency in the system; for new participants to familiarize them with
appraisal and payout processes; as well as for senior management, pay pool
managers and members, and human resources personnel to give them greater
detail on the process.

Automation and Data Systems Costs	As a part of implementing a pay for
performance system, some of the demonstration projects installed new or
updated existing automated personnel systems. Demonstration projects
reported that total costs related to designing, installing, and
maintaining automation and data systems ranged from an estimated $125,000
at NAVSEA's Dahlgren division to an estimated $4.9 million at AcqDemo, as
shown in table 7.

Table 7: Inflation-Adjusted Cost of Automation and Data Systems for
Selected Demonstration Projects (in 2002 Dollars), as Reported by the
Demonstration Projects

Dollars in thousands

                        China     NIST     DOC      NRL     NAVSEA-      NAVSEA-      AcqDemo
                        Lake(a)                             Dahlgren     Newport
Prior to               No data       0       0   $1,467        $125         $333
implementation                                            (estimate)  (estimate)
Cumulative cost        No data       0  $2,317    2,166           0          463       $4,871
since implementation                                      (estimate)  (estimate)   (estimate)
Total                  No data       0  $2,317   $3,633        $125         $796       $4,871
                                                           (estimate)  (estimate)   (estimate)

Source: GAO analysis of DOC and DOD data.

Notes: Data are as reported by the demonstration projects without
verification by GAO.

Costs may not sum to totals due to rounding.

aAutomation and data systems were not widely used when the China Lake
demonstration project was implemented in 1980.

To manage data system costs, some demonstration projects modified existing
data systems rather than designing completely new systems to meet their
information needs. For example, NAVSEA's divisions worked together to
modify DOD's existing Defense Civilian Personnel Data System to meet their
needs for a revised performance appraisal system. Similarly, DOC imported
the performance appraisal system developed by NIST and converted the
payout system to a Web-based system. While NIST reported that it incurred
no direct costs for automation and data systems, officials told us it used
in-house employees, NIST's Information Technology Laboratory staff, to
develop a data system to automate performance ratings, scores, increases,
and awards.

NRL used a combination of in-house employees and contractors to automate
its performance management system. While reported automation and data
systems' costs were higher for NRL than for most other demonstration
projects, NRL reports that its automated system has brought about savings
each year of an estimated 10,500 hours of work, $266,000, and 154 reams of
paper since the demonstration project was implemented in 1999.

Providing Information to Employees about the Results of Performance
Appraisal and Pay Decisions

We have observed that a performance management system should have adequate
safeguards to ensure fairness and guard against abuse. One such safeguard
is to ensure reasonable transparency and appropriate accountability
mechanisms in connection with the results of the performance management
process. To this end, NIST, NAVSEA's Newport division, NRL, and AcqDemo
publish information for employees on internal Web sites about the results
of performance appraisal and pay decisions, such as the average
performance rating, the average pay increase, and the average award for
the organization and for each individual unit. Other demonstration
projects publish no information on the results of the performance cycle.

NAVSEA's Newport division publishes results of its annual performance
cycle. Newport aggregates the data so that no individual employee's rating
or payout can be determined to protect confidentiality. Employees can
compare their performance rating category against others in the same unit,
other units, and the entire division, as shown in figure 9.

Figure 9: Sample of NAVSEA Newport Division's Rating Category Distribution
Data Provided to Employees

The bar chart is not reproduced in this text version. For the division as
a whole (n=2,020) and for each of 14 units (n ranging from 32 to 718), it
shows the percentage of employees rated in each of three categories:
contributor, major contributor, and exceptional contributor.

Source: DOD.
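
The kind of aggregation Newport describes can be sketched as follows; the
data layout and the minimum-unit-size threshold used to suppress small
units are assumptions made here for illustration, not the division's
documented procedure.

    from collections import Counter

    def rating_distribution_by_unit(ratings, min_unit_size=10):
        """Summarize (unit, rating_category) pairs as the percentage of each
        unit's employees in each rating category, so that no individual's
        rating can be identified. Units with fewer than `min_unit_size`
        employees are suppressed to protect confidentiality. Illustrative
        sketch only; the threshold and data layout are assumptions."""
        by_unit = {}
        for unit, category in ratings:
            by_unit.setdefault(unit, Counter())[category] += 1
        summary = {}
        for unit, counts in by_unit.items():
            n = sum(counts.values())
            if n < min_unit_size:
                summary[unit] = None  # too few employees; no data reported
            else:
                summary[unit] = {cat: round(100 * c / n, 1)
                                 for cat, c in counts.items()}
        return summary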

Until recently, only if requested by an employee would NIST provide
information such as the average rating, pay increase, and award amount for
the employee's pay pool. NIST officials told us that, to be more open,
transparent, and responsive to employees, NIST began publishing the
results of the performance cycle on its internal Web site for the first
time in 2003. NIST published averages of the performance rating scores,
as shown
in figure 10, as well as the average recommended pay increase amounts and
the average awards by career path, for the entire organization, and for
each organizational unit. According to one NIST official, the first day
the results were published on the internal Web site, the Web site was
visited more than 1,600 times.

Figure 10, showing NIST's published average performance rating scores for
the organization and each organizational unit, is not reproduced in this
text version.

Source: DOC.

aIndicates that there were not enough employees in the unit to protect
confidentiality; therefore, no data are reported.

Publishing the results of the performance management process can provide
employees with the information they need to better understand the
performance management system. However, according to an official, DOC does
not currently publish performance rating and payout results even though
DOC's third year evaluation found that demonstration project participants
continued to raise concerns that indicated their lack of understanding
about the performance appraisal process. According to the evaluation,
focus group and survey results indicated the need for increased
understanding on topics such as how pay pools work, how salaries are
determined, and how employees are rated. Employees were also interested in
knowing more about the results of the performance appraisal process. One
union representative told us that a way to improve the demonstration
project would be to publish information. In past years, according to
employee representatives, some employees and union representatives at DOC
have used the Freedom of Information Act to request and obtain the
information. According to a DOC official, DOC plans to discuss the
publication of average scores by each major unit and look for options to

increase employee understanding of the performance management system at
upcoming Project Team and Departmental Personnel Management Board
meetings.

Concluding Observations

Linking pay to performance is a key practice for effective performance
management. As Congress, the administration, and federal agencies continue
to rethink the current approach to federal pay to place greater emphasis
on performance, the experiences of personnel demonstration projects can
provide insights into how some organizations within the federal government
are implementing pay for performance. The demonstration projects took
different approaches to using competencies to evaluate employee
performance, translating performance ratings into pay increases and
awards, considering employees' current salaries in making performance pay
decisions, managing costs of the pay for performance systems, and
providing information to employees about the results of performance
appraisal and pay decisions. These different approaches were intended to
enhance the success of the pay for performance systems because the systems
were designed and implemented to meet the demonstration projects' unique
cultural and organizational needs.

We strongly support the need to expand pay for performance in the federal
government. How it is done, when it is done, and the basis on which it is
done can make all the difference in whether such efforts are successful.
High-performing organizations continuously review and revise their
performance management systems to achieve results, accelerate change, and
facilitate two-way communication throughout the year so that discussions
about individual and organizational performance are integrated and
ongoing. To this end, these demonstration projects show an understanding
that how to better link pay to performance is very much a work in progress
at the federal level.

Additional work is needed to strengthen efforts to ensure that performance
management systems are tools to help the demonstration projects manage on
a day-to-day basis. In particular, there are opportunities to use
organizationwide competencies to evaluate employee performance in ways
that reinforce behaviors and actions supporting the organization's
mission; to translate employee performance so that managers can make
meaningful distinctions between top and poor performers using objective
and fact-based information; and to provide information to employees about
the results of performance appraisals and pay decisions so that
reasonable transparency and appropriate accountability mechanisms are in
place.

Agency Comments	We provided drafts of this report to the Secretaries of
Defense and Commerce for their review and comment. DOD's Principal Deputy
Under Secretary of Defense for Personnel and Readiness provided written
comments, which are presented in appendix III. DOD concurred with our
report and stated that it is a useful summary of the various approaches
that the demonstration projects undertook to implement their pay for
performance systems and that their experiences provide valuable insight
into federal pay for performance models. DOD also noted that the NAVSEA
demonstration project training and automation cost data are estimated
rather than actual costs. We made the appropriate notation. While DOC did
not submit written comments, DOC's Classification, Pay, and HR
Demonstration Program Manager provided minor technical clarifications and
updated information. We made those changes where appropriate. We provided
a draft of the report to the Director of OPM for her information.

As agreed with your offices, unless you publicly announce its contents
earlier, we plan no further distribution of this report until 30 days
after its date. At that time, we will provide copies of this report to
other interested congressional parties, the Secretaries of Defense and
Commerce, and the Director of OPM. We will also make this report available
to others upon request. In addition, the report will be available at no
charge on the GAO Web site at http://www.gao.gov.

If you have any questions about this report, please contact me or Lisa
Shames at (202) 512-6806. Other contributors are acknowledged in appendix
IV.

J. Christopher Mihm Director, Strategic Issues

Appendix I

                       Objective, Scope, and Methodology

To meet our objective to identify the approaches that selected personnel
demonstration projects have taken to implement their pay for performance
systems, we chose the following demonstration projects: the Navy
Demonstration Project at China Lake (China Lake), the National Institute
of Standards and Technology (NIST), the Department of Commerce (DOC), the
Naval Research Laboratory (NRL), the Naval Sea Systems Command Warfare
Centers (NAVSEA) at Dahlgren and Newport, and the Civilian Acquisition
Workforce Personnel Demonstration Project (AcqDemo). We selected these
demonstration projects based on our review of the projects and in
consultation with the Office of Personnel Management (OPM). Factors we
considered in selecting these demonstration projects included the type of
pay for performance system, type of agency (defense or civilian), status
of the project (ongoing, permanent, or complete), date the project was
implemented, and number and type of employees covered (including employees
covered by a union).

To identify the different approaches that the demonstration projects took
in implementing their pay for performance systems, we analyzed Federal
Register notices outlining the major features and regulations for each
demonstration project, operating manuals, annual and summative
evaluations, employee attitude survey results, project briefings, training
materials, rating and payout data, cost data, rating distribution data
from OPM's Central Personnel Data File (CPDF), and other relevant
documentation. In addition, we spoke with cognizant officials from OPM;
demonstration project managers, human resource officials, and
participating supervisors and employees; and union and other employee
representatives.

We prepared a data collection instrument to obtain actual and estimated
cost data from the six demonstration projects. We tested the instrument
with a demonstration project official to ensure that the instrument was
clear and comprehensive. After revising the instrument based on the
official's recommendations, we administered the instrument via e-mail and
followed up with officials via telephone, as necessary. Officials from the
six demonstration projects provided actual cost data where available and
estimated data when actual data were not available. Cost data reported are
actual unless otherwise indicated. We adjusted cost data for inflation to
2002 dollars using the Consumer Price Index. We provide average salary
data, as reported by the demonstration projects and OPM without
verification by GAO. The aggregated average salary data do not allow us to
determine whether total salary costs for the demonstration projects are
higher or lower than their General Schedule (GS) comparison groups.


We did not independently evaluate the effectiveness of the demonstration
projects or independently validate the data provided by the agencies or
published in the evaluations. We assessed the reliability of cost, salary,
rating, and performance pay distribution data provided by the
demonstration projects by (1) performing manual and electronic testing of
required data elements, (2) reviewing existing information about the data,
and (3) interviewing agency officials knowledgeable about the data. We
determined that the data were sufficiently reliable for the purposes of
this report, with the exception of the DOC salary data, which we do not
present. Based on our review of the DOC salary data we determined that the
data were not adequate for use in our comparative analyses of salary
growth. An evaluation of the DOC demonstration project reported that data
were missing in critical fields, such as pay and performance scores.1

We did not independently verify the CPDF data for September 30, 2002.
However, in a 1998 report (OPM's Central Personnel Data File: Data Appear
Sufficiently Reliable to Meet Most Customer Needs, GAO/GGD-98-199, Sept.
30, 1998), we reported that governmentwide data from the CPDF for key
variables, such as GS-grade, agency, and career status, were 97 percent or
more accurate. However, we did not verify the accuracy of employee
ratings.

We performed our work in the Washington, D.C., metropolitan area from
December 2002 through August 2003 in accordance with generally accepted
government auditing standards.

1Booz Allen Hamilton, Department of Commerce Personnel Management
Demonstration Project Evaluation Year Four Report (McLean, Va.: September
2003).

Appendix II

                         Demonstration Project Profiles

Navy Demonstration Project at China Lake (China Lake)

Source: GAO analysis of DOD and OPM data.

Purpose The Navy Demonstration Project1 was to

o 	develop an integrated approach to pay, performance appraisal, and
classification;

o  allow greater managerial control over personnel functions; and

o 	expand the opportunities available to employees through a more
responsive and flexible personnel system.

Selected Elements of the Performance Management System

Competencies: Competencies are tailored to an individual's position. The
employees and their supervisors are to develop performance plans, which
identify the employees' responsibilities and expected results. In
addition, all supervisors are to include certain management competencies
from a menu of managerial factors that best define their responsibilities,
such as developing objectives, organizing work, and selecting and
developing people.

Feedback: Supervisors are to conduct two progress reviews of employees'
performance, set at 5 and 9 months in the performance cycle.

1The Navy Demonstration Project was also implemented at the Space and
Naval Systems Command in San Diego, California.


Self-assessment: Employees are strongly encouraged to list accomplishments
for their supervisors' information when determining the performance
rating.

Levels of performance rating: The levels are highly successful (rating
levels 1 or 2), fully successful (rating level 3), or less than fully
successful (rating levels 4 or 5).

Safeguards:

o 	Second-level review: Second-level supervisors are to review all
assessments. In addition, an overall assessment of highly successful is to
be sent to the appropriate department's Performance Review Board for the
assignment of an official rating of "1" or "2." The supervisor and
reviewer are to assign a "4" or "5" rating based on a problem-solving
team's findings and a personnel advisor's input.

o 	Grievance process: Generally, employees may request reconsideration of
their ratings in writing to the third-level supervisor and indicate why a
higher rating is warranted and what rating is desired. The third-level
supervisor can either grant the request or request that a recommending
official outside of the immediate organization or chain of authority be
appointed. The employee is to receive a final decision in writing within
21 calendar days.


Selected Employee Attitude Data

           Figure 11: Selected Employee Attitude Data for China Lake
       "Under the present system,                                             
      financial rewards are seldom                                            
related to employee performance."                                          
Baseline (1979) "Pay raises depend                                         
     on how well employees perform                                            
         their jobs." 2003 "Job       Demonstration group   Comparison group  
    satisfaction-your pay." "All in   Agree % Disagree % N Agree % Disagree % 
all, I am satisfied with my pay."  37 39 2,221 40 40    N No data No data  
Baseline (1993) 2003 "I feel that  1,149 59 28 1,200 52 No data No data No 
       my supervisor will rate my     25 1,149 63 14 2,221  data No data No   
performance (and set my pay) in a  56 26 1,200 29 No     data No data No   
fair, impartial manner." Baseline  data 2,221 71 13 No   data No data No   
     (1979) "My performance rating    data                  data No data No   
     represents a fair and accurate                         data No data No
        picture of my actual job                            data No data No
performance." 1993 "I am in favor                        data No data N/A
     of the demonstration project."                         N/A N/A N/A N/A
          Baseline (1979) 1998                                    N/A

Source: DOD.

Legend: N/A = data are not applicable; N = number of respondents.

Other Interventions	Reduction in force. To allow for increased retention
of high-performing employees at all levels by ranking employees based on
performance for retention standings.

Salary flexibility. To set entry-level salaries to take into account
market conditions.

Selected Reported Effects A demonstration project evaluation reported the
following effects.2

2Source: U.S. Office of Personnel Management, A Summary Assessment of the
Navy Demonstration Project (Washington, D.C.: February 1986).


o 	Employees viewed performance improvements within their control and
reported increased recognition of individual performance.

o 	The perception of a pay-performance link was significantly strengthened
under the demonstration pay for performance system, but not in the
comparison group.

o 	Pay satisfaction increased slightly at the demonstration sites and
declined at the control laboratories.

o 	Employees and supervisors cited improved communication, a more
objective focus, and clearer performance expectations as major system
benefits.

o 	Employees and supervisors perceived their performance appraisal system
to be more flexible than the comparison group's, to focus more on actual
work requirements, and thus to be more responsive to laboratory needs.

o 	Employees at the demonstration project reported having more input into
the development of performance plans than employees in the comparison
group.

Sources for Additional Information	http://www.nawcwpns.navy.mil/~hrd/demo.htm (Last accessed on Nov. 7, 2003)

http://www.opm.gov/demos/main.asp (Last accessed on Nov. 7, 2003)


National Institute of Standards and Technology (NIST)

Source: GAO analysis of DOC and OPM data.

Purpose	The demonstration project at NIST, which was formerly known as
the National Bureau of Standards, was to

o 	improve hiring and allow NIST to compete more effectively for
high-quality researchers,

o  motivate and retain staff,

o  strengthen the manager's role in personnel management, and

o  increase the efficiency of personnel systems.

Selected Elements of the Performance Management System

Competencies: Competencies, called "critical elements," are based on the
individual position. Employee performance plans are to have a minimum of
two and a maximum of six critical elements, which the supervisor weights,
based on the importance of the critical element, the time required to
accomplish the critical element, or both. Managers' and supervisors'
performance plans are to include a critical element on diversity, and it
must be weighted at least 15 points.


Feedback: Supervisors are to conduct midyear reviews of all employees to
discuss accomplishments or deficiencies and modify the initial performance
plans, if necessary.

Self-assessment: Employees are to submit lists of accomplishments for
their supervisors' information when determining the performance ratings.

Levels of performance rating: The levels are "eligible" or
"unsatisfactory." On a scale of 0 to 100, employees who receive scores
over 40 are rated eligible and those with scores below 40 unsatisfactory.

Safeguards:

o 	Second-level review: Pay pool managers are to review recommended scores
from supervisors and select a payout for each employee. Pay pool managers
are to present the decisions to the next higher official for review if the
pay pool manager is also a supervisor. The organizational unit director is
to approve awards and review all other decisions.

o 	Grievance procedure: Employees may grieve their performance ratings,
scores, and pay increases by following DOC's Administrative Grievance
Procedure or appropriate negotiated grievance procedures.


Selected Employee Attitude Data

              Figure 12: Selected Employee Attitude Data for NIST
     "Under the present system,                                               
    financial rewards are seldom                                              
         related to employee                                                  
       performance." Baseline                                                 
(1987/1988) 1995 "All in all, I Demonstration group                        
     am satisfied with my pay."    Agree % Disagree % N                       
    Baseline (1987/1988) 1995 "My  38 47 2,319 24 56 a 49                     
performance rating represents a 39 2,319 56 27 a 58 29                     
fair and accurate picture of my 2,319 56 21 a 71 17     Comparison group
    actual performance." Baseline  2,319 65 18 a 47 18    Agree % Disagree %
(1987/1988) 1995 "I have trust  2,319 70 8 a           N 36 44 396 44 36 a
        and confidence in my                               34 56 395 42 39 a
    supervisor." Baseline (1987)                           61 20 397 56 25 a
     1995 "I am in favor of the                           No data No data No
    demonstration project." 1989                           data 58 23 a N/A
                1995                                      N/A N/A N/A N/A N/A

Sources: U.S. Office of Personnel Management, Implementation Report
National Institute of Standards and Technology Personnel Management
Demonstration Project (Washington, D.C.: Aug. 18, 1989) and Summative
Evaluation Report National Institute of Standards and Technology
Demonstration Project: 1988-1995 (Washington, D.C.: June 27, 1997).

Legend: N/A = data are not applicable; N = number of respondents.

aOPM reported that 47 percent of 3,200 NIST employees and 44 percent of
2,392 comparison group employees responded to the survey.

Other Interventions	Reduction in force. To credit an employee with an
overall performance score in the top 10 percent of scores within a peer
group with 10 additional years of service for retention purposes.

Supervisory differential. To establish supervisory intervals within a pay
band that allow for a maximum rate up to 6 percent higher than the maximum
rate of the nonsupervisory intervals within the pay band.

Hiring flexibility. To provide flexibility in setting initial salaries
within pay bands for new appointees, particularly for hard-to-fill
positions in the Scientific and Engineering career path.


Extended probation. To require employees in the Scientific and Engineering
career path to serve a probationary period of 1 to 3 years.

Selected Reported Effects A demonstration project evaluation reported the
following effects.3

o 	Recruitment bonuses were used sparingly but successfully to attract
candidates who might not have accepted federal jobs otherwise.

o 	NIST has become more competitive with the private sector and employees
are less likely to leave for reasons of pay.

o 	NIST was able to provide significant performance-based awards, some
with merit increases as high as 20 percent. NIST succeeded in retaining
more of its high performers than the comparison group.

o 	Managers reported significantly increased authority over hiring and pay
decisions.

o 	Managers reported that they felt significantly less restricted by
personnel rules and regulations than other federal managers.

Source for Additional Information	http://www.opm.gov/demos/main.asp (Last accessed on Nov. 7, 2003)

3Source: U.S. Office of Personnel Management, Summative Evaluation Report
National Institute of Standards and Technology Demonstration Project:
1988-1995 (Washington, D.C.: June 27, 1997).


Department of Commerce (DOC)

Source: GAO analysis of DOC and OPM data.

Purpose	The DOC demonstration project was to test whether the
interventions of the NIST demonstration project could be successful in
environments with different missions and different organizational
hierarchies.

Selected Elements of the Performance Management System

Competencies: Competencies, called "critical elements," are tailored to
each individual position. Performance plans are to have a minimum of two
and a maximum of six critical elements. The supervisor is to weight each
critical element, based on the importance of the element, the time
required to accomplish it, or both, so that the total weight of all
critical elements is 100 points. Organizationwide benchmark performance
standards are to define the range of performance, and the supervisor may
add supplemental performance standards to a performance plan. Performance
plans for managers and supervisors are to include critical elements such
as recommending or making personnel decisions; developing and appraising
subordinates; fulfilling diversity, equal opportunity, and affirmative
action responsibilities; and program and managerial responsibilities.
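
As an illustration of the weighting scheme described above, the following
Python sketch checks that critical-element weights total 100 points and
combines element scores into an overall 0-100 score. The scoring
mechanics, element names, and numbers are hypothetical, not DOC's actual
procedure.

    def overall_score(elements):
        """Compute an overall 0-100 performance score from critical elements.
        `elements` maps element name -> (weight, points_awarded), where the
        weights must sum to 100 and the points awarded for an element cannot
        exceed its weight. The scoring mechanics here are an assumption for
        illustration; the project's operating manuals define the actual
        procedure."""
        weights = [weight for weight, _ in elements.values()]
        if sum(weights) != 100:
            raise ValueError("critical-element weights must total 100 points")
        if any(points < 0 or points > weight
               for weight, points in elements.values()):
            raise ValueError("points awarded must be between 0 and the weight")
        return sum(points for _, points in elements.values())

    # Hypothetical example for a supervisor's performance plan.
    plan = {
        "program results":         (40, 34),
        "developing subordinates": (25, 20),
        "personnel decisions":     (20, 16),
        "diversity/EEO":           (15, 13),
    }
    print(overall_score(plan))  # 83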

Feedback: Supervisors are to conduct midyear reviews of all employees to
discuss accomplishments or deficiencies and modify the initial performance
plans, if necessary.


Self-assessment: Employees are to submit lists of accomplishments for
their supervisors' information when determining the performance ratings.

Levels of performance rating: The levels are "eligible" or
"unsatisfactory." On a scale of 0 to 100, employees who receive scores
over 40 are rated eligible and those with scores below 40 unsatisfactory.

Safeguards:

o 	Second-level review: The pay pool manager is to review recommended
scores from subordinate supervisors and select a payout for each employee.
The pay pool manager is to present the decisions to the next higher
official for review if the pay pool manager is also a supervisor.

o 	Grievance procedure: Employees may request reconsideration of
performance decisions, excluding awards, by the pay pool manager through
DOC's Administrative Grievance Procedure or appropriate negotiated
grievance procedures.


Selected Employee Attitude Data

               Figure 13: Selected Employee Attitude Data for DOC
"Pay raises depend on how well                                             
you perform." Baseline (1998)                                              
       2001 "All in all I am                                                  
      satisfied with my pay."                              Comparison group   
      Baseline (1998) 2001 "My                            Agree % Disagree %  
performance rating represents                           N 34 44 512 33 40  
a fair and accurate picture of                         609 41 39 512 46 41 
      my actual performance."     Demonstration group     609 58 23 512 54 26 
    Baseline (1998) 2002 "I have  Agree % Disagree % N 36 609 60 23 512 59 25
     trust and confidence in my   39 1,024 52 28 1,112 47 609 25 13 512 22 24
    supervisor." Baseline (1998)  35 1,024 58 30 1,112 59         609
     2001 "I am in favor of the   22 1,024 56 24 1,112 59 
      demonstration project."     22 1,024 62 23 1,112 37 
        Baseline (1998) 2001      26 1,024 47 29 1,112    

Source: Booz Allen Hamilton, Department of Commerce Personnel Management
Demonstration Project Evaluation Operational Year Technical Report
(Washington, D.C.: Oct. 8, 2002).

                       Legend: N = number of respondents.

Other Interventions	Reduction in force. To credit employees with
performance scores in the top 30 percent of a career path in a pay pool
with 10 additional years of service for retention purposes. Other
employees rated "eligible" receive 5 additional years of service for
retention credit.

Supervisory performance pay. To offer employees who spend at least 25
percent of their time performing supervisory duties pay up to 6 percent
higher than the regular pay band.

Probationary period. To require a 3-year probationary period for newly
hired science and engineering employees performing research and
development duties.


Selected Reported Effects A demonstration project evaluation reported the
following effects.4

o 	The pay for performance system continues to exhibit a positive link
between pay and performance. For example, in year 4 of the demonstration
project, employees with higher performance scores were more likely to
receive pay increases and on average received larger pay increases than
employees with lower scores.

o 	Some of the recruitment and staffing interventions have been
successful. For example, supervisors are taking advantage of their ability
to offer more flexible starting salaries. Additionally, the demonstration
project has expedited the classification process. DOC's evaluator
recommended that DOC more fully implement the recruitment and
staffing interventions.

o 	The 3-year probationary period for scientists and engineers continues
to be used, but assessing its utility remains difficult.

o 	On the other hand, some retention interventions receive little use or
have not appeared to affect retention. For example, the supervisor
performance pay intervention is not affecting supervisor retention.

Sources for Additional Information	http://ohrm.doc.gov/employees/demo_project.htm (Last accessed Nov. 7, 2003)

http://www.opm.gov/demos/main.asp (Last accessed Nov. 7, 2003)

4Source: Booz Allen Hamilton, Department of Commerce Personnel Management
Demonstration Project Evaluation Year Four Report (McLean, Va.: September
2003).


Naval Research Laboratory (NRL)

Source: GAO analysis of DOD and OPM data.

Purpose The NRL demonstration project was to

o  provide increased authority to manage human resources,

o  enable NRL to hire the best qualified employees,

o 	compensate employees equitably at a rate that is more competitive with
the labor market, and

o 	provide a direct link between levels of individual contribution and the
compensation received.

Selected Elements of the Performance Management System

Competencies: Each career path has two to three "critical elements." Each
critical element has generic descriptors that explain the type of work,
degree of responsibility, and scope of contributions. Pay pool managers
may weight critical elements and may establish supplemental criteria.

Feedback: Supervisors and employees are to, on an ongoing basis, hold
discussions to specify work assignments and performance expectations. The
supervisor or the employee can request a formal review during the
appraisal process.


Self-assessment: Employees are to submit yearly accomplishment reports for
the supervisors' information when determining the performance appraisals.

Levels of performance rating: The levels are acceptable or unacceptable.
Employees who are rated acceptable are then determined to be
"over-compensated," "under-compensated," or within the "normal pay range,"
based on their contribution scores and salaries.

Safeguards:

o 	Second-level review: The pay pool panel and pay pool manager are to
compare element scores for all of the employees in the pay pool; make
adjustments, as necessary; and determine the final contribution scores and
pay adjustments for the employees.

o 	Grievance procedure: Employees can grieve their appraisals through a
two-step process. Employees are to first grieve their scores in writing,
and the pay pool panel reviews the grievances and makes recommendations to
the pay pool manager, who issues decisions in writing. If employees are
not satisfied with the pay pool manager's decisions, they can then file
formal grievances according to NRL's formal grievance procedure.


Selected Employee Attitude Data

               Figure 14: Selected Employee Attitude Data for NRL
"Pay raises depend on how well                                             
     I perform." Baseline (1996)                                              
       2001 "All in all, I am                              Comparison group   
       satisfied with my pay."                           Agree % Disagree % N 
      Baseline (1996) 2001 "My     Demonstration group    No data No data No  
performance rating represents a Agree % Disagree % N  data No data No data 
fair and accurate picture of my 41 38 1,656 61 22 678  No data No data No  
    actual performance." Baseline  41 40 1,663 48 35 675 data No data No data 
    (1996) 2001 "I have trust and  63 22 1,656 59 19 676  No data No data No  
    confidence in my supervisor."  61 20 1,639 67 16 674 data No data No data 
Baseline (1996) 2001 "I prefer  667 46 28              No data No data No  
       the following personnel                           data No data No data 
     system." 2001 Demonstration                          No data No data No  
    system Traditional personnel                         data No data N/A N/A
               system                                            N/A

Source: DOD.

Legend: N/A = data are not applicable; N = number of respondents.

Other Interventions	Reduction in force. To credit an employee's basic
Federal Service Computation Date with up to 20 years based on the results
of the appraisal process.

Hiring flexibility. To provide opportunities to consider a broader range
of candidates and flexibility in filling positions.

Extended probationary period. To extend the probationary period to 3 years
for certain occupations.


Selected Reported Effects	A demonstration project evaluation reported the
following effects.5 From 1996 to 2001:

o 	Managers' satisfaction with authority to determine employees' pay and
job classification increased from 10 percent of managers to 33 percent.

o 	Employees' satisfaction with opportunities for advancement increased
from 26 percent to 41 percent.

o 	The perceived link between pay and performance strengthened under the
demonstration project: agreement that pay raises depend on performance
increased from 41 percent to 61 percent.

o 	On the other hand, the percentage of employees who agreed that other
employers in the area paid more than the government for the kind of work
that they do increased from 67 to 76 percent.

Sources for Additional Information

http://hroffice.nrl.navy.mil/personnel_demo/index.htm (Last accessed on
Nov. 7, 2003)

http://www.opm.gov/demos/main.asp (Last accessed on Nov. 7, 2003)

5Sources: U.S. Office of Personnel Management, 2002 Summative Evaluation
DOD S&T Reinvention Laboratory Demonstration Program (Washington, D.C.:
August 2002), and DOD. The OPM report evaluated all of the projects in the
Science and Technology Reinvention Laboratory Demonstration Program and
presented the results together, rather than by demonstration project. Data
are based on survey information provided by DOD.

                               Naval Sea Systems
                                Command Warfare
                                Centers (NAVSEA)

Source: GAO analysis of DOD and OPM data.

Purpose The NAVSEA demonstration project was to

o  develop employees to meet the changing needs of the organization;

o  help employees achieve their career goals;

o  improve performance in current positions;

o  retain high performers; and

o 	improve communication with customers, colleagues, managers, and
employees.

Selected Elements of the Performance Management System	Competencies: Each
division may implement regulations regarding the competencies and criteria
by which employees are rated. NAVSEA's Dahlgren division uses three
competencies for all employees, and the Newport division uses eight
competencies.

Feedback: Each division may implement regulations regarding the timing and
documentation of midyear feedback. Dahlgren requires at least one
documented feedback session at midyear. Beginning in fiscal year 2004,
Newport requires a documented midyear feedback session.

Self-assessment: Each division has the flexibility to determine whether
and how employees document their accomplishments. Dahlgren requires
employees to provide summaries of their contributions for their
supervisors' information. Newport encourages employees to provide
self-assessments.

Levels of performance rating: All of the divisions use the ratings
"acceptable" and "unacceptable."

Safeguards:

o 	Second-level review: Divisions are to design the performance appraisal
and payout process. Supervisors at Dahlgren's division and department
levels review ratings and payouts to ensure that the competencies are
applied uniformly and salary adjustments are distributed equitably. At
Newport, second-level supervisors review recommendations by direct
supervisors, make changes to achieve balance and equity within the
organization, then submit the recommendations to pay pool managers, who
are to go through the same process and forward the recommendations to the
department head for final approval.

o 	Grievance procedure: Divisions are to design their grievance
procedures. Dahlgren and Newport have informal and formal reconsideration
processes. In Dahlgren's informal process, the employee and supervisor are
to discuss the employee's concern and reach a mutual understanding, and
the pay pool manager is to approve any changes. If the employee is not
satisfied with the result of the informal process, the employee is to
submit a formal request to the pay pool manager, who is to make the final
decision. In Newport's informal process, the employee is to submit a
written request to the pay pool manager, who may revise the rating and
payout decision or confirm it. If the employee is not satisfied with the
result of the informal process, the employee may formally appeal to the
department head, who is to render a decision.

Selected Employee Attitude Data

             Figure 15: Selected Employee Attitude Data for NAVSEA

                                                        Demonstration group
Statement                                            Agree %  Disagree %      N
"Pay raises depend on how well I perform."
  Baseline (1996)                                         23          56  6,372
  2001                                                    50          32  2,606
"All in all, I am satisfied with my pay."
  Baseline (1996)                                         34          47  6,397
  2001                                                    49          32  2,603
"My performance rating represents a fair and
  accurate picture of my actual performance."
  Baseline (1996)                                         56          27  6,400
  2001                                                    55          25  2,609
"I have trust and confidence in my supervisor."
  Baseline (1996)                                         57          22  6,350
  2001                                                    64          18  2,602
"I prefer the following personnel system." (2001; N = 2,563)
  Demonstration system                                    43
  Traditional personnel system                            36

Comparison group: no data were reported.

Source: DOD.

Legend: N/A = data are not applicable; N = number of respondents.

Other Interventions	Advanced in-hire rate. To set, upon initial
appointment, an individual's pay anywhere within the band level consistent
with the qualifications of the individual and requirements of the
position.

Scholastic achievement appointments. To employ an alternative examining
process that provides NAVSEA the authority to appoint undergraduates and
graduates to professional positions.

Selected Reported Effects A demonstration project evaluation reported the
following effects.6

From 1996 to 2001:

o 	The percentage of people who agreed that their managers promote
effective communication among different work groups increased from 31 to
43 percent.

o 	On the other hand, the percentage of NAVSEA employees who agreed with
the statement "High performers tend to stay with this organization" stayed
constant at about 30 percent during this period.

o 	Additionally, the percentage of employees who said that they have all
of the skills needed to do their jobs remained essentially unchanged, at
59 percent in 1996 and 62 percent in 2001.

Sources for Additional Information

http://www.nswc.navy.mil/wwwDL/XD/HR/DEMO/main.html (Last accessed on Nov.
7, 2003)

http://www.opm.gov/demos/main.asp (Last accessed on Nov. 7, 2003)

6Sources: U.S. Office of Personnel Management, 2002 Summative Evaluation
DOD S&T Reinvention Laboratory Demonstration Program (Washington, D.C.:
August 2002), and DOD. The OPM report evaluated all of the projects in the
Science and Technology Reinvention Laboratory Demonstration Program and
presented the results together, rather than by demonstration project. Data
are based on survey information provided by DOD.

Civilian Acquisition Workforce Personnel Demonstration Project (AcqDemo)

Source: GAO analysis of DOD and OPM data.

aPub. L. No. 105-85 removed the 5,000-employee participant cap at AcqDemo.

Purpose AcqDemo was to

o  attract, motivate, and retain a high-quality acquisition workforce;

o  achieve a flexible and responsive personnel system;

o  link pay to employee contributions to mission accomplishment; and

o  gain greater managerial control and authority over personnel processes.

Selected Elements of the Performance Management System	Competencies: Six
core contribution "factors," as well as "discriminators" and
"descriptors," are used to evaluate every employee.

Feedback: AcqDemo requires at least one formal feedback session annually
and encourages informal and frequent communication between supervisors and
employees, including discussion of any inadequate contribution. Each
service, agency, or organization may require one or more additional formal
or informal feedback sessions.

Self-assessment: Employees can provide a list of contributions for each
factor.

Levels of performance rating: The levels are "appropriately compensated,"
"over-compensated," and "under-compensated."

Safeguards:

o 	Second-level review: The supervisors and the pay pool manager are to
ensure consistency and equity across ratings. The pay pool manager is to
approve the employee's overall contribution score, which is calculated
based on the employee's contribution ratings (an illustrative sketch of
one possible calculation follows this list).

o 	Grievance procedure: Employees may grieve their ratings and actions
affecting the general pay increase or performance pay increases. An
employee covered by a negotiated grievance procedure is to use that
procedure to grieve his or her score. An employee not under a negotiated
grievance procedure is to submit the grievance first to the rating
official, who will submit a recommendation to the pay pool panel. The pay
pool panel may accept the rating official's recommendation or reach an
independent decision. The pay pool panel's decision is final unless the
employee requests reconsideration by the next higher official above the
pay pool manager, who would then render the final decision on the
grievance.
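
The report does not specify how AcqDemo combines the ratings on the six
contribution factors into the overall contribution score that the pay pool
manager approves. The following sketch (in Python) is a purely
hypothetical illustration that assumes numeric factor ratings combined by
a weighted average; the factor names and weights are invented for the
example.

    # Hypothetical illustration only; AcqDemo's actual scoring rules are not
    # described in this report. Assumes each contribution factor receives a
    # numeric rating and that a weighted average (weights invented here)
    # yields the overall contribution score.
    FACTOR_WEIGHTS = {
        "problem_solving": 0.20,
        "teamwork": 0.15,
        "customer_relations": 0.15,
        "leadership": 0.20,
        "communication": 0.15,
        "resource_management": 0.15,
    }

    def overall_contribution_score(ratings: dict) -> float:
        """Weighted average of the factor ratings (weights sum to 1.0)."""
        return sum(FACTOR_WEIGHTS[factor] * ratings[factor]
                   for factor in FACTOR_WEIGHTS)

    ratings = {"problem_solving": 75, "teamwork": 60, "customer_relations": 65,
               "leadership": 80, "communication": 70, "resource_management": 68}
    print(overall_contribution_score(ratings))   # about 70.45 for these inputs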

Selected Employee Attitude Data

             Figure 16: Selected Employee Attitude Data for AcqDemo

                                        Demonstration group     Comparison group
Statement                              Agree % Disagree %    N Agree % Disagree %   N
"In this organization, my pay raises depend on my contribution to the
  organization's mission."
  Baseline (1998)                          20         54 2,748     12         67  470
  2003                                     59         28 2,027     18         60  275
"All in all, I am satisfied with my pay."
  Baseline (1998)                          49         32 2,748     45         39  470
  2003                                     57         32 2,027     52         35  275
"My performance rating represents a fair and accurate picture of my actual
  performance."
  Baseline (1998)                          75         14 2,748     62         26  470
  2003                                     54         33 2,027     66         20  275
"I have trust and confidence in my supervisor."
  Baseline (1998)                          63         18 2,748     60         20  470
  2003                                     66         18 2,027     77          9  275
"I am in favor of the demonstration project."
  Baseline (1998)                          25         48 2,748    N/A        N/A  N/A
  2003                                     52         30 2,027    N/A        N/A  N/A

Source: DOD.

Legend: N/A = data are not applicable; N = number of respondents.

Other Interventions	Voluntary emeritus program. To provide a continuing
source of corporate knowledge and valuable on-the-job training or
mentoring by allowing retired employees to voluntarily return without
compensation and without jeopardizing retirement pay.

Extended probationary period. To extend the probationary period by a
length of time equal to any education and training assignments served
outside the supervisor's review, so that managers can properly assess the
contribution and conduct of new hires in the acquisition environment.

Scholastic achievement appointment. To provide the authority to appoint
degreed candidates meeting desired scholastic criteria to positions with
positive education requirements.

Flexible appointment authority. To allow an agency to make a modified term
appointment to last from 1 to 5 years when the need for an employee's
services is not permanent.

Selected Reported Effects A demonstration project evaluation reported the
following effects.7

o 	Attrition rates for over-compensated employees increased from 24.1
percent in 2000 to 31.6 percent in 2002. Attrition rates for appropriately
compensated employees increased from 11.5 percent in 2000 to 14.1 percent
in 2002. Attrition rates for under-compensated employees decreased from
9.0 percent in 2000 to 8.5 percent in 2001 and then increased to 10.2
percent in 2002.

o 	Increased pay-setting flexibility has allowed organizations in AcqDemo
to offer more competitive salaries, which has improved recruiting.

o 	Employees' perception of the link between pay and contribution
increased, from 20 percent reporting that pay raises depend on their
contribution to the organization's mission in 1998 to 59 percent in 2003.

Sources for Additional Information

http://www.acq.osd.mil/acqdemo/ (Last accessed on Nov. 7, 2003)

http://www.opm.gov/demos/index.asp (Last accessed on Nov. 7, 2003)

7Source: Cubic Applications, Inc., DOD Civilian Acquisition Workforce
Personnel Demonstration Project: Interim Evaluation Report Volume I -
Management Report (Alexandria, Va.: July 2003).

                                  Appendix III

                    Comments from the Department of Defense

Appendix IV

                     GAO Contacts and Staff Acknowledgments

GAO Contacts	J. Christopher Mihm, (202) 512-6806 or [email protected]
Lisa Shames, (202) 512-6806 or [email protected]

Acknowledgments	In addition to the individuals named above, Michelle
Bracy, Ron La Due Lake, Hilary Murrish, Adam Shapiro, and Marti Tracy made
key contributions to this report.

GAO's Mission	The General Accounting Office, the audit, evaluation and
investigative arm of Congress, exists to support Congress in meeting its
constitutional responsibilities and to help improve the performance and
accountability of the federal government for the American people. GAO
examines the use of public funds; evaluates federal programs and policies;
and provides analyses, recommendations, and other assistance to help
Congress make informed oversight, policy, and funding decisions. GAO's
commitment to good government is reflected in its core values of
accountability, integrity, and reliability.

Obtaining Copies of GAO Reports and Testimony

The fastest and easiest way to obtain copies of GAO documents at no cost
is through the Internet. GAO's Web site (www.gao.gov) contains abstracts
and full-text files of current reports and testimony and an expanding
archive of older products. The Web site features a search engine to help
you locate documents using key words and phrases. You can print these
documents in their entirety, including charts and other graphics.

Each day, GAO issues a list of newly released reports, testimony, and
correspondence. GAO posts this list, known as "Today's Reports," on its
Web site daily. The list contains links to the full-text document files.
To have GAO e-mail this list to you every afternoon, go to www.gao.gov and
select "Subscribe to e-mail alerts" under the "Order GAO Products"
heading.

Order by Mail or Phone	The first copy of each printed report is free.
Additional copies are $2 each. A check or money order should be made out
to the Superintendent of Documents. GAO also accepts VISA and Mastercard.
Orders for 100 or more copies mailed to a single address are discounted 25
percent. Orders should be sent to:

U.S. General Accounting Office 441 G Street NW, Room LM Washington, D.C.
20548

To order by Phone:	Voice: (202) 512-6000 TDD: (202) 512-2537 Fax: (202)
512-6061

To Report Fraud, Waste, and Abuse in Federal Programs	Contact:
Web site: www.gao.gov/fraudnet/fraudnet.htm
E-mail: [email protected]
Automated answering system: (800) 424-5454 or (202) 512-7470

Public Affairs	Jeff Nelligan, Managing Director, [email protected] (202)
512-4800 U.S. General Accounting Office, 441 G Street NW, Room 7149
Washington, D.C. 20548

                               Presorted Standard
                              Postage & Fees Paid
                                      GAO
                                Permit No. GI00

United States
General Accounting Office
Washington, D.C. 20548-0001

Official Business
Penalty for Private Use $300

Address Service Requested
*** End of document. ***