Law Enforcement: Better Performance Measures Needed to Assess
Results of Justice's Office of Science and Technology (14-NOV-03,
GAO-04-198).
The mission of the Office of Science and Technology (OST), within
the Department of Justice's National Institute of Justice (NIJ),
is to improve the safety and effectiveness of technology used by
federal, state, and local law enforcement and other public safety
agencies. Through NIJ, OST funds programs in forensic sciences,
crime prevention, and standards and testing. To support these
programs, Congress increased funding for OST from $13.2 million
in 1995 to $204.2 million in 2003 (in constant 2002 dollars). GAO
reviewed (1) the growth in OST's budgetary resources and the
changes in OST's program responsibilities; (2) the types of
products OST delivers and the methods used for delivering them;
and (3) how well OST's efforts to measure the success of its
programs in achieving intended results meet applicable
requirements.
-------------------------Indexing Terms-------------------------
REPORTNUM: GAO-04-198
ACCNO: A08857
TITLE: Law Enforcement: Better Performance Measures Needed to
Assess Results of Justice's Office of Science and Technology
DATE: 11/14/2003
SUBJECT: Information disclosure
Law enforcement agencies
Performance measures
Product safety
Research and development
Technical assistance
Budget activities
Budget allowances
Funds management
Funding increases
Public safety
******************************************************************
** This file contains an ASCII representation of the text of a **
** GAO Product. **
** **
** No attempt has been made to display graphic images, although **
** figure captions are reproduced. Tables are included, but **
** may not resemble those in the printed version. **
** **
** Please see the PDF (Portable Document Format) file, when **
** available, for a complete electronic file of the printed **
** document's contents. **
** **
******************************************************************
GAO-04-198
United States General Accounting Office
GAO
Report to the Honorable Jane Harman,
House of Representatives
November 2003
LAW ENFORCEMENT
Better Performance Measures Needed to Assess Results of Justice's Office of
Science and Technology
GAO-04-198
Highlights of GAO-04-198, a report to the Honorable Jane Harman, House of
Representatives
The mission of the Office of Science and Technology (OST), within the
Department of Justice's National Institute of Justice (NIJ), is to improve
the safety and effectiveness of technology used by federal, state, and
local law enforcement and other public safety agencies. Through NIJ, OST
funds programs in forensic sciences, crime prevention, and standards and
testing. To support these programs, Congress increased funding for OST
from $13.2 million in 1995 to $204.2 million in 2003 (in constant 2002
dollars). GAO reviewed (1) the growth in OST's budgetary resources and the
changes in OST's program responsibilities; (2) the types of products OST
delivers and the methods used for delivering them; and (3) how well OST's
efforts to measure the success of its programs in achieving intended
results meet applicable requirements.
GAO recommends that the Director of NIJ reassess the measures used to
evaluate OST's progress toward achieving its goals and better focus on
outcome measures to assess results where possible. In those cases where
measuring outcomes is, after careful consideration, deemed infeasible, GAO
recommends developing appropriate intermediate measures that will help to
discern program effectiveness.
www.gao.gov/cgi-bin/getrpt?GAO-04-198.
To view the full product, including the scope and methodology, click on
the link above. For more information, contact Laurie Ekstrand at (202)
512-8777 or [email protected].
November 2003
LAW ENFORCEMENT
Better Performance Measures Needed to Assess Results of Justice's Office of
Science and Technology
OST's budgetary resources grew significantly in recent years, along with
the range of its program responsibilities. From fiscal year 1995 through
fiscal year 2003, OST received over $1 billion through Department of
Justice appropriations and the reimbursement of funds from other federal
agencies in exchange for OST's agreement to administer these agencies'
projects. Of the over $1 billion that OST received, approximately $749
million, or 72 percent, was either directed to specific recipients or
projects by public law, subject to guidance in congressional committee
reports, or directed through reimbursable agreements. At the same time that
spending expanded, OST's program responsibilities changed from primarily
law enforcement and corrections to broader public safety technology.
OST delivers three groups of products through various methods. The three
groups include (1) information dissemination and technical assistance; (2)
the application, evaluation, and demonstration of existing and new
technologies for field users; and (3) technology research and development.
According to OST, as of April 2003, it had delivered 945 products since
its inception. Furthermore, OST identified an additional 500 products
associated with ongoing awards. OST makes its products available through a
variety of methods, such as posting information on its Web site and
providing research prototypes to field users for testing and evaluation.
OST has been unable to fully assess its performance in achieving its goals
as required by applicable criteria because it does not use outcome
measures to assess the extent to which it achieves the intended results of
its programs. OST's current measures primarily track outputs (the goods
and services produced); in some cases OST uses intermediate measures,
which are a step toward developing outcome measures. The Government
Performance and Results Act of 1993 provides that federal agencies measure
or assess the results of each program activity. While developing outcome
measures for the types of activities undertaken by OST is difficult, we
have previously reported on various strategies that can be used to develop
outcome measures, or at least intermediate measures, for similar types of
activities.
Figure: OST's annual budgetary resources in constant 2002 dollars, fiscal
years 1995-2003 (dollars in millions; figure not reproduced).
Source: OST.
Contents

Letter
  Results in Brief
  Background
  OST's Budgetary Resources Have Grown and Program Responsibilities Have Changed
  OST Delivers Three Groups of Products Through Various Methods
  OST's Performance Measurement Efforts Do Not Fully Meet Requirements
  Conclusions
  Recommendation
  Agency Comments and Our Evaluation

Appendix I     Scope and Methodology
Appendix II    Budgetary Resources for OST's Programs in Current Year Dollars
Appendix III   OST's 10 Categories of Products
Appendix IV    OST's Portfolio Areas
Appendix V     OST's Operations
Appendix VI    OST's Goals in its Fiscal Year 2004 Performance Plan and GAO's Assessment
Appendix VII   Comments from the Department of Justice
Appendix VIII  GAO Contacts and Staff Acknowledgments
               GAO Contacts
               Staff Acknowledgments

Tables
  Table 1: Flow of Budgetary Resources to OST's Programs
  Table 2: Budgetary Resources in Constant 2002 Dollars for OST's Programs by NIJ Allocation, Fiscal Years 1995-2003
  Table 3: Budgetary Resources in Constant 2002 Dollars for OST's Investigative and Forensic Sciences by NIJ Allocation, Fiscal Years 1995-2003
  Table 4: GAO's Assessment of the 42 Measures OST Developed for 11 of Its Initiatives
  Table 5: OST's Outside Studies of Its Initiatives
  Table 6: Budgetary Resources in Current Dollars for OST's Programs by NIJ Allocation, Fiscal Years 1995-2003
  Table 7: GAO's Groupings of OST's Categories of Products and Examples of Each Category
  Table 8: Total Funds Awarded for the Operations, Maintenance, and Technical Support of OST's 10 Technology Centers, Fiscal Years 1995-2003
  Table 9: OST's Technology Centers, Their Affiliated Partners, and the Amounts Awarded to Support the Centers
  Table 10: OST's Performance Goals, Initiatives, and Measures for Fiscal Year 2004, and GAO's Assessment

Figures
  Figure 1: OST's Budgetary Resources in Constant 2002 Dollars, Fiscal Years 1995-2003
  Figure 2: GAO's Grouping of OST's 945 Delivered Products, as of April 2003
  Figure 3: OST's Organizational Structure
  Figure 4: OST's 10 Technology Centers and the Regions They Serve
  Figure 5: Stakeholders and Customers that Contribute to the Setting of OST's Priorities
Abbreviations
AAG Assistant Attorney General
CITA Crime Identification Technology Act
CLIP Crime Lab Improvement Program
CODIS Combined DNA Index System
COPS Community-Oriented Policing Services
DNA deoxyribonucleic acid
DOD Department of Defense
FBI Federal Bureau of Investigation
GAO General Accounting Office
GPRA Government Performance and Results Act
LLEBG Local Law Enforcement Block Grant
NFSIA Paul Coverdell National Forensic Sciences Improvement Act
NIJ National Institute of Justice
NLECTC National Law Enforcement and Corrections Technology Centers
OJP Office of Justice Programs
OMB Office of Management and Budget
OST Office of Science and Technology
R&D research and development
SLLEA State and Local Law Enforcement Assistance
This is a work of the U.S. government and is not subject to copyright
protection in the United States. It may be reproduced and distributed in
its entirety without further permission from GAO. However, because this
work may contain copyrighted images or other material, permission from the
copyright holder may be necessary if you wish to reproduce this material
separately.
United States General Accounting Office Washington, DC 20548
November 14, 2003
The Honorable Jane Harman House of Representatives
Dear Ms. Harman:
To enhance public safety and bring criminals to justice, it is important
for law enforcement officials to benefit from the latest advances in
science and technology. The mission of the Office of Science and
Technology (OST), within the Department of Justice's National Institute of
Justice (NIJ), is to improve the safety and effectiveness of technology
used by federal, state, and local law enforcement, corrections, and other
public safety agencies. OST awards funds to research and develop more
effective technology and improve access to technology in a wide range of
areas. For example, OST funds programs in the areas of crime prevention
technologies, investigative and forensic sciences, and electronic crime.
Examples of products resulting from OST's programs include a guide on
school safety, an evaluation of police protective gear, a prototype for
ground-penetrating radar, and a report on gunshot residue detection and
interpretation. To support OST's programs, Congress has significantly
increased its funding, from $13.2 million in fiscal year 1995 to $204.2
million in fiscal year 2003 (in constant 2002 dollars).
In response to your interest in whether OST's programs are achieving
their intended results, we reviewed certain aspects of OST's operations.
Specifically, this report assesses (1) the growth in OST's budgetary
resources, from fiscal year 1995 to fiscal year 2003, and changes in OST's
program responsibilities; (2) what types of products OST delivers and the
methods used to deliver these products to public safety agencies; and (3)
how well OST's efforts to measure the success of its programs in achieving
intended results meet applicable requirements.
To address our objectives, we collected and analyzed relevant data and
reports and interviewed OST officials and NIJ officials, including NIJ
executive staff and the Assistant NIJ Director for OST, division chiefs,
and managers. We also collected data and interviewed officials at OST
technology centers in Rockville, Maryland; and El Segundo and San Diego,
California. Appendix I contains detailed information on the scope and
methodology we used for this assessment. We conducted this engagement in
accordance with generally accepted government auditing standards.
Results in Brief
OST has grown in terms of both budgetary resources and the range of
programs it operates.1 From fiscal year 1995 through fiscal year 2003, OST
received over $1 billion through several Department of Justice (Justice)
appropriations accounts as well as the reimbursement of funds from other
federal agencies in exchange for OST's agreement to administer these
agencies' projects. Of the over $1 billion that OST has received,
approximately $749.7 million, or about 72 percent, was either directed for
specific recipients or projects by public law, subject to guidance in
congressional committee reports designating specific recipients or
projects, or directed from reimbursable agreements with other federal
agencies for OST to manage their projects. At the same time that spending
has expanded, OST's program responsibilities have changed-from primarily
law enforcement and corrections technologies to broader public safety
technologies, including safe school initiatives.
OST delivers three groups of products through various methods. The three
groups include (1) information dissemination and technical assistance; (2)
the application, evaluation, and demonstration of existing and new
technologies for field users; and (3) technology research and development
(R&D). According to OST, as of April 2003, it had delivered 945 products
since its inception. Furthermore, OST identified an additional 500
products associated with ongoing awards. Depending on its research agenda,
OST makes its products available through a variety of methods, such as
posting information on its Web site and providing research prototypes to
field users for testing and evaluation. While OST does not directly
commercialize the results of its technology R&D, it does help link
prototypes with potential developers.
OST has been unable to fully assess its performance in achieving its goals
because it does not measure the extent to which it achieves the intended
outcomes of its programs. OST's current measures primarily track outputs
(goods and services produced). In some cases OST uses intermediate
measures (a step closer to developing outcome measures) but has not taken
this step toward better measurement in many cases where it may be
possible to do so. The Government Performance and Results Act of 1993
(GPRA) provides, among other things, that federal agencies establish
performance measures, including the assessment of relevant outputs and
outcomes of each program activity. Office of Management and Budget (OMB)
guidance suggests that, to the extent possible, federal agencies measure
or assess the extent to which they are achieving the intended outcomes of
their programs. As part of Justice's efforts to comply with GPRA, OST
established goals and developed output and some intermediate measures to
track its progress. While developing outcome measures for the types of
activities undertaken by OST is difficult, we have previously reported on
various strategies that can be used to develop outcome measures or at
least intermediate measures for activities that are similar to those in
OST's portfolio of programs.

1We are using "programs" to indicate the broad categories of OST's
individual projects. NIJ and OST have referred to these categories as both
portfolio areas and programs. Our use of the term "programs" encompasses
"portfolio areas" (see app. IV for OST's portfolio areas) and the safe
school technology, counterterrorism technology, and corrections technology
programs. NIJ and OST delineations between the various programs and
portfolio areas are flexible. For example, some of the projects to develop
metal detectors and personnel locator devices would apply to both the
school safety technologies and corrections technologies programs and
therefore could be placed in different portfolio areas.
So that OST does all that is possible to assess whether its programs are
achieving their intended results, we are recommending that the Attorney
General instruct the Director of NIJ to reassess OST's performance
measures to better focus on outcome measures. In commenting on a draft of
this report, the Assistant Attorney General (AAG) for Justice's Office of
Justice Programs (OJP) agreed with our recommendation. The AAG made
additional comments concerning the challenge of developing outcome
measures for R&D activities, OST's overall performance record, and the
amount of OST's funds that are directed for specific recipients and
projects. We respond to these comments in the Agency Comments and Our
Evaluation section of the report. OJP also provided technical comments,
which have been incorporated in this report where appropriate.
Background

The Office of Science and Technology (OST) was created in
fiscal year 1995 following a long history of science and technology
efforts within the National Institute of Justice (NIJ).2 NIJ is a
component of the Office of Justice Programs (OJP), a Justice agency that,
among other things, provides assistance to state, tribal, and local
governments. In establishing OST's objectives and allocating funds for
OST's programs, the NIJ Director considers the priorities of many
stakeholders, including the President, Congress, Justice, and state and
local law enforcement and public safety agencies.
2NIJ was established in statute by the Justice System Improvement Act of
1979 (P.L. 96-157, 93 Stat. 1167 (1979)), which, among other things,
amended the Omnibus Crime Control and Safe Streets Act of 1968 (P.L.
90-351, 82 Stat. 197 (1968)).
OST Established in Statute by the Homeland Security Act of 2002
In November 2002, Congress established OST and its mission and duties in
statute as part of the Homeland Security Act of 2002 (the Act).3 The Act
specified OST's mission "to serve as the national focal point for work on
law enforcement technology; and to carry out programs that, through the
provision of equipment, training, and technical assistance, improve the
safety and effectiveness of law enforcement technology and improve access
to such technology by federal, state, and local law enforcement agencies."
The Act defined the term "law enforcement technology" to include
"investigative and forensic technologies, corrections technologies, and
technologies that support the judicial process."4 The Act also specified
OST's duties to include the following, among others:
o establishing and maintaining advisory groups to assess federal, state,
and local technology needs;
o establishing and maintaining performance standards, and testing,
evaluating, certifying, validating, and marketing products that conform to
those standards;
o carrying out research, development, testing, evaluation, and
cost-benefit analysis of certain technologies; and
o developing and disseminating technical assistance and training
materials.
OST's Operations
OST's operations have multiple levels of internal organization and
multiple kinds of external partners. (For a more detailed description of
OST's operations, see app. V.) OST's multiple levels of organization
include a Washington, D.C., office and a network of 10 technology centers
that provide technical assistance to OST's customers around the country.5
To fulfill its mission, OST also collaborates with entities such as the
Departments of Defense and Energy and public and private laboratories to
take advantage of established technical expertise and resources.

3P.L. 107-296, 116 Stat. 2135, 2159 (2002). This mission and these duties
are not unlike those OST had been carrying out previously. The Act
codified the mission and duties in statute.

4According to NIJ, forensic science is the application of established
scientific techniques to the identification, collection, and examination
of evidence from crime scenes; the interpretation of laboratory findings;
and the presentation of reported findings in judicial proceedings.

5These 10 technology centers are OST's National Law Enforcement and
Corrections Technology Center (NLECTC) system.
NIJ has three main types of awards for funding OST's programs: grants,
interagency agreements, and cooperative agreements.6
o Grants are generally awarded annually by NIJ to state and local
agencies or private organizations for a specific product and amount.
o Interagency agreements are used by NIJ for creating partnerships with
federal agencies.
o Cooperative agreements are a type of NIJ grant to nonfederal entities
that prescribes a higher level of monitoring and federal involvement.
NIJ also uses memorandums of understanding (MOU) to coordinate programs
and projects between agencies. The MOUs specify the roles,
responsibilities, and funding amounts to be provided by participating
agencies. Through NIJ, OST can provide supplemental funding to interagency
and cooperative agreements that may be used to contract for special
projects.
OST awards are administered by managers at its Washington, D.C., office
who have final oversight and management responsibility. These managers may
delegate some responsibility to another federal R&D agency receiving the
award. In March 2003, 21 managers were responsible for overseeing 336
active awards totaling $636 million.
Guidance has been established for measuring the performance of government
operations. To help Justice comply with the Government Performance and
Results Act of 1993 (GPRA),7 OST establishes goals and develops
performance measures to track its progress. In addition, in May 2002, the
White House Office of Management and Budget (OMB) and Office of Science
and Technology Policy issued a memorandum setting forth R&D investment
criteria that departments and agencies should implement. The investment
criteria require an explanation of why the investment is important, how
funds will be allocated to ensure quality, and how well the investment is
performing.

6We did not include contracts because NIJ uses them for the purchase of
goods and services rather than for awarding funds for carrying out OST
programs and projects.

7P.L. 103-62, 107 Stat. 285 (1993).

According to the memorandum, program managers must define appropriate
outcome measures and milestones
that can be used to track progress toward goals and assess whether funding
should be enhanced or redirected. The memorandum encourages federal R&D
agencies to make the processes they use to satisfy GPRA consistent with
these criteria.
OST's Budgetary Resources Have Grown and Program Responsibilities Have
Changed

OST's budgetary resources have grown and the range of program
responsibilities has changed. Budgetary resources for OST increased
significantly, from $13.2 million in fiscal year 1995 to $204.2 million in
fiscal year 2003 (in constant 2002 dollars), totaling over $1 billion.8
This increase can be attributed to the introduction of new allocations and
large increases for existing ones. The NIJ director decides how to
allocate certain appropriated funds to the various NIJ components,
including OST.
About $749.7 million, or 72 percent, of OST's total budgetary resources
was either directed to specific recipients or projects by public law,
subject to congressional committee report guidance designating specific
recipients or projects, or directed from the reimbursements from other
Justice and federal agencies in exchange for OST managing their projects.
Corresponding with the designation of spending for specific recipients and
projects, the range of OST's programs changed, from primarily law
enforcement and corrections to include broader public safety technology
R&D, such as for improving school safety and combating terrorism.
Budgetary Resources for OST's Programs

OST's budgetary resources9 include both funding received via Justice
appropriations accounts and reimbursements from other Justice and federal
agencies. First, OST receives funding via three appropriations accounts
enacted in the appropriations law for the Justice Department. From these
appropriations accounts, OJP allocates amounts to NIJ. The NIJ director
suballocates part of the NIJ funds for OST programs. In addition, OST
receives reimbursements from other Justice and federal agencies in
exchange for OST's management of specific projects of those agencies, such
as ballistic imaging evaluation for the FBI. Table 1 lists NIJ allocations
from the Justice appropriations accounts that go toward funding OST
programs.

8Figures do not include funding for management and administration
expenses, salaries, and unobligated balances carried from one year to the
next.

9For the purposes of this report, OST's budgetary resources refers both to
the funds OST receives via several Justice appropriations accounts
(referred to as NIJ allocations) and to the reimbursements OST receives
from other agencies.
Table 1: Flow of Budgetary Resources to OST's Programs

Justice appropriation account: Justice Assistance
NIJ's allocations to OST programs:
  NIJ Base: NIJ uses base funds for research, development, demonstration,
  and dissemination activities.
  Counterterrorism R&D:a NIJ sponsors research, development, and
  evaluations and tools to help criminal justice and public safety
  agencies deal with critical incidents, including terrorist acts.

Justice appropriation account: State and Local Law Enforcement Assistance (SLLEA)
NIJ's allocations to OST programs:
  Local Law Enforcement Block Grant (LLEBG): NIJ allots its R&D portion of
  LLEBG funds to OST to assist local units of government to identify,
  select, develop, modernize, and purchase new technologies for law
  enforcement use.

Justice appropriation account: Community Oriented Policing Services (COPS)
NIJ's allocations to OST programs:
  Crime Identification Technology Act (CITA): CITA activities include
  upgrading and integrating national, state, and local criminal justice
  record and identification systems, funding multi-jurisdictional,
  multi-agency communications systems, and improving forensic science
  capabilities, including DNA analysis.
  Safe Schools Technology R&D: OST's Safe Schools Technology R&D program
  uses three methods for improving school safety: needs assessments and
  development of technical partners, technology R&D, and technical
  assistance.
  Crime Lab Improvement Program (CLIP): CLIP activities include providing
  equipment, supplies, training, and technical assistance to state and
  local crime laboratories to establish or expand their capabilities and
  capacities to perform various types of forensic analyses.
  DNA Backlog Reduction: This allocation seeks to eliminate public crime
  laboratories' backlogs of DNA evidence as soon as possible.
  Paul Coverdell National Forensic Sciences Improvement Act (NFSIA): This
  allocation provides funding to state and local laboratories to improve
  the quality, timeliness, and credibility of forensic science services
  for criminal justice purposes.

Reimbursements of funds from other Justice Department and federal
agencies' accounts: Reimbursable activities have included ballistic
imaging evaluation from the FBI, a study of communications
interoperability (the ability to communicate across different public
safety agencies and jurisdictions) requirements from the Defense Advanced
Research Projects Agency, and death investigator guidelines from the
Centers for Disease Control and Prevention.
Source: GAO analysis of OST data.
aIn fiscal year 1999, OST's counterterrorism R&D programs received funding
through the Justice Department's Counterterrorism Fund appropriation
account.
OST's budgetary resources almost quadrupled from fiscal year 1995 to 1996,
increased 70 percent from fiscal year 1999 to 2000, and increased 63
percent from fiscal year 2001 to 2002. While resources decreased
24 percent from fiscal year 2002 to 2003, OST's fiscal year 2003 level
still represents a 157 percent increase over the fiscal year 1999 level.
Figure 1: OST's Budgetary Resources in Constant 2002 Dollars, Fiscal Years
1995-2003 (dollars in millions; figure not reproduced)
Source: OST.
Notes: Figures do not include funding for management and administration
expenses, such as salaries.
The $103.4 million increase from fiscal year 2001 to 2002 is largely
attributable to increases of $55.6 million in reimbursable agreements,
$24.3 million in the DNA Backlog Reduction allocation, and $15.4 million
in the Crime Lab Improvement Program allocation.
The sharp decrease in OST's budgetary resources from fiscal years 2002 to
2003 is largely attributed to the elimination of the counterterrorism R&D
allocation (from $45.3 million in fiscal year 2002), which moved to the
Department of Homeland Security, and a decrease of $26.2 million from
reimbursable agreements.

Our analysis of OST's yearly budgetary resources from fiscal year 1995 to
fiscal year 2003 showed that the overall increase can be attributed to the
introduction of new NIJ allocations and large increases for existing ones.
The NIJ allocations that contributed most notably to the overall increase
in OST's budgetary resources are the Crime Lab Improvement Program, DNA
Backlog Reduction, Safe Schools Technology R&D, and Counterterrorism R&D
allocations. Table 2 shows figures for all years in constant 2002 dollars.
Certain Allocations Contributed to the Increase in Budgetary Resources
since 1995
All dollar figures used in this narrative are in constant 2002 dollars,
except as noted otherwise. (An illustrative calculation of the percentage
changes cited here appears after the bulleted figures below.)
Fiscal years 1995-1996: The $39.4 million (298 percent) increase from
$13.2 million to $52.6 million primarily came from two NIJ allocations
totaling $35.4 million.
o Local Law Enforcement Block Grant (LLEBG) initiated with $22.2 million.
o Reimbursement of funds increased by $13.2 million (471 percent) from
$2.8 million to $16.0 million.
Fiscal years 1999-2000: The $55.6 million (70 percent) increase from $79.5
million to $135.1 million primarily came from three NIJ allocations
totaling $51.7 million.
o DNA Backlog Reduction initiated with $15.6 million.
o Safe Schools Technology R&D allocation initiated with $15.6 million.10
o Counterterrorism R&D increased by $20.5 million (193 percent) from
$10.6 million to $31.1 million.
Fiscal years 2001-2002: The $103.4 million (63 percent) increase from
$164.6 million to $268.0 million primarily came from three NIJ allocations
totaling $95.3 million.
o Reimbursement of funds increased by $55.6 million (209 percent) from
$26.6 million to $82.2 million.
o DNA Backlog Reduction increased by $24.3 million (227 percent) from
$10.7 million to $35 million.
o Crime Lab Improvement Program increased by $15.4 million (79 percent)
from $19.6 million to $35 million.
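The percentage changes cited above can be reproduced directly from the
constant 2002 dollar totals reported in this section and in table 2. The
following is a minimal illustrative sketch in Python, not part of GAO's
methodology; the only inputs are the totals stated in the narrative.

    # Illustrative only: recompute the year-to-year changes cited in the
    # narrative from OST's constant 2002 dollar totals (dollars in millions).
    totals = {1995: 13.2, 1996: 52.6, 1999: 79.5, 2000: 135.1,
              2001: 164.6, 2002: 268.0, 2003: 204.2}

    def pct_change(start, end):
        """Percentage change between the totals for two fiscal years."""
        return (totals[end] - totals[start]) / totals[start] * 100

    print(round(pct_change(1995, 1996)))  # about 298 (percent increase)
    print(round(pct_change(1999, 2000)))  # about 70
    print(round(pct_change(2001, 2002)))  # about 63
    print(round(pct_change(2002, 2003)))  # about -24 (percent decrease)
    print(round(pct_change(1999, 2003)))  # about 157 (increase over 1999)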
To be consistent with the report narrative and to show trends, figures in
table 2 are in constant 2002 dollars. A table with the figures in current
dollars can be found in appendix II.
10In fiscal year 1999, NIJ used the LLEBG allocation to meet congressional
guidance to spend $10 million on a new Safe School Initiative. The
following year NIJ's Safe Schools Technology R&D funding was introduced
with $15 million. The OST funding was not reduced as a result of the $15
million increase for the Safe Schools Technology R&D.
Table 2: Budgetary Resources in Constant 2002 Dollars for OST's Programs by NIJ
Allocation, Fiscal Years 1995-2003
Dollars in millions

NIJ allocations for OST programs                  1995  1996  1997  1998  1999  2000  2001  2002  2003  Totala
NIJ Base                                          10.4  13.3  12.7  14.8  20.3  19.1  29.0  27.1  32.3   179.1
Local Law Enforcement Block Grant (LLEBG)            0  22.2  21.7  21.4  21.2  20.8  20.2  20.0  19.6   167.1
Crime Identification Technology Act (CITA)           0     0     0     0     0   4.4   4.3   1.4     0    10.1
Safe Schools Technology Research and Development     0     0     0     0     0  15.6  17.7  17.0  16.6    66.9
Crime Lab Improvement Program (CLIP)                 0   1.1   3.3  13.4  15.9  15.6  19.6  35.0  39.6   143.4
DNA Backlog Reductionb                               0     0     0     0     0  15.6  10.7  35.0  35.2    96.5
Paul Coverdell National Forensic Sciences
  Improvement Act (NFSIA)b                           0     0     0     0     0     0     0   5.0   4.9     9.9
Counterterrorism R&D                                 0     0  10.9  12.9  10.6  31.1  36.5  45.3     0   147.2
Reimbursements from other Justice and
  federal agencies                                 2.8  16.0     0   8.9  11.5  13.0  26.6  82.2  56.0   217.1
Totala                                            13.2  52.6  48.6  71.4  79.5 135.1 164.6 268.0 204.2  1037.1

Source: GAO analysis of OST data.
aTotals might not add due to rounding.
bIn fiscal years 2000 and 2001, DNA Backlog Reduction was funded as DNA
Combined DNA Index System (CODIS) Backlog Reduction. In fiscal years 2002
and 2003, both the DNA Backlog Reduction and Coverdell NFSIA allocations
were funded within DNA CODIS Backlog Reduction.
OST had a $63.8 million (24 percent) decrease in total budgetary resources
from fiscal years 2002 to 2003, largely attributed to its not receiving
fiscal year 2003 Counterterrorism R&D resources, which totaled $45.3
million in fiscal year 2002. According to OST, its counterterrorism
resources were transferred to the Department of Homeland Security's Office
of Domestic Preparedness. There was also a $26.2 million decrease in the
reimbursement of funds from other agencies. However, OST's fiscal year
2003 level still represents a 157 percent increase from fiscal year 1999.
Range of OST's Program Responsibilities Has Changed

The range of OST's program responsibilities has changed over the years
from primarily law enforcement and corrections to include broader public
safety technology R&D. This has happened as more and more of OST's
budgetary resources were directed to be spent on specific recipients and
projects. Appropriated funds, for example, are sometimes designated for
specific recipients or projects in public law. In addition, guidance on
the
spending of appropriated funds may be provided through congressional
committee reports. Of the more than $1 billion (in constant 2002 dollars)
that OST programs received from fiscal years 1995 to 2003, $532.6 million,
or 51 percent, was designated for specific recipients and projects in
public law or subject to guidance in committee reports designating
specific recipients or projects.11 Of the $532.6 million, $249.8 million
was designated in public law for specific recipients or projects while
$282.8 million was specified in committee report guidance for specific
recipients or projects.12
In addition to the $532.6 million designated in public law for specific
recipients or projects or subject to guidance in committee reports for
specific recipients or projects, another $217.1 million was reimbursements
from other Justice and federal agencies in exchange for OST's management
of specific projects of those agencies. Thus, the total spending either
directed for specific recipients and projects through public law, subject
to committee report guidance designating specific recipients or projects,
or received as reimbursements, amounts to $749.7 million, or 72 percent,
of OST's total budgetary resources.
The range of OST's program responsibilities has changed to include such
areas as school safety and counterterrorism. In fiscal year 1999, a Safe
Schools Initiative program was established pursuant to conference
committee report guidance13 with $10 million14 directing NIJ to develop
school safety technologies. In another example, OST's counterterrorism R&D
program, initially funded by public law in fiscal year 1997,15 received
$147.3 million through fiscal year 2002, $96.6 million of which was
specified in conference report guidance for three recipients from fiscal
years 2000 to 2002:16 the Oklahoma City National Memorial Institute for
the Prevention of Terrorism ($37.8 million), Dartmouth College's Institute
for Security Technology Studies ($51.8 million), and New York University's
Center for Catastrophe Preparedness and Response ($7 million).

11We separated reimbursements from this total because they included
projects that were not originally allocated to OST, although those
projects also may have been specified in public law and committee reports.

12Included in the $249.8 million was $143.5 million for the CLIP project.
Committee report guidance further designated $107.0 million of that $143.5
million for specific recipients. Given that we have included the $107.0
million in the amounts designated in public law for specific recipients or
projects, we excluded it from the committee report guidance category to
avoid double counting.

13H.R. Conf. Rep. No. 105-825, at 1020-21 (1998).

14For this effort, NIJ initially allocated Local Law Enforcement Block
Grant funds to OST.

15P.L. 104-208, 110 Stat. 3009, 3009-13 (1996).
OST's program responsibilities have also changed to expand the focus on
investigative and forensic sciences. Our review of OST's budgetary
resources for fiscal years 1995 through 2003 shows that budgetary
resources for investigative and forensic sciences equaled at least $342.1
million in constant fiscal year 2002 dollars,17 or about one-third of its
$1 billion in budgetary resources, as shown in table 3. The proportion of
investigative and forensic sciences annual funding to total OST funding
rose from 6 percent ($800,000) in fiscal year 1995 to 52 percent ($106.0
million) in fiscal year 2003.
16H.R. Conf. Rep. No. 106-479, at 161 (1999); H.R. Conf. Rep. No.
106-1005, at 226 (2000); and H.R. Conf. Rep. No. 107-278, at 86-87 (2001).
17The total amount of budgetary resources for investigative and forensic
sciences is likely to be larger. However, because of the limitations in
detail in the budget documents we received from OST, we could not
determine the amount of funding for investigative and forensic sciences
within certain NIJ Base and LLEBG projects, such as within OST's
technology center network and unspecified NIJ-directed projects.
Table 3: Budgetary Resources in Constant 2002 Dollars for OST's
Investigative and Forensic Sciences by NIJ Allocation, Fiscal Years
1995-2003
Dollars in millions

NIJ allocation containing funds for
investigative and forensic sciences   1995  1996  1997  1998  1999  2000  2001  2002  2003  Totala
NIJ Base                               0.6   0.6   0.4   1.5   6.2   5.6   5.5   5.0   4.3   29.6
LLEBG                                    0     0     0     0     0   1.1     0     0     0    1.1
CITA                                     0     0     0     0     0   0.8   1.3     0     0    2.0
Safe Schools Technology R&D              0     0     0     0     0     0     0     0     0      0
CLIP                                     0   1.1   3.3  13.4  15.9  15.6  19.6  35.0  39.6  143.4
DNA Backlog Reduction                    0     0     0     0     0  15.6  10.7  35.0  35.2   96.5
Coverdell NFSIA                          0     0     0     0     0     0     0   5.0   4.9    9.9
Counterterrorism R&D                     0     0     0     0     0     0     0     0     0      0
Reimbursement of funds from other
  agencies                             0.2   8.9     0     0     0   1.6   1.1  25.4  22.0   59.1
Totala                                 0.8  10.5   3.6  14.9  22.1  40.2  38.5 105.4 106.0  342.1

Source: GAO analysis of OST data.
aTotals might not add due to rounding.
OST Delivers Three Groups of Products Through Various Methods

OST delivers many products, which we categorized into three groups, and
uses various methods to deliver them. These three groups are
(1) information dissemination and technical assistance; (2) the
application, evaluation, and demonstration of existing and new
technologies for field users; and (3) technology R&D. According to OST, as
of April 2003, it had delivered 945 products since its inception.18
Furthermore, OST identified an additional 500 products expected from
ongoing awards. Figure 2 shows our distribution of OST's delivered
products by group. We recognize, as OST officials told us, that the groups
overlap and there is not a clean division between them. For example, while
reports are associated with information dissemination, they may also
result from the technology R&D group. OST has reviewed our classification
of products and agrees that it is generally accurate. Because
classification of some products is based on a judgment call, the
proportions of products in each group should be considered approximations.
18Because NIJ's science and technology efforts predate OST's establishment
in fiscal year 1995, some of the products listed as delivered have award
years prior to 1995. The earliest listed is 1983.
OST's Range of Products
The following examples, while not exhaustive, indicate the wide range of
OST's products.
o Reports on topics such as analysis of DNA typing data, linguistic
methods for determining document authorship, a pepper spray projectile and
disperser, and gunshot residue detection and interpretation.
o Prototypes of products including ground-penetrating radar, ballistics
matching using 3-dimensional images of bullets and cartridge cases, and an
optical recognition system to identify and track stolen vehicles.
o Evaluations of technology including prison telemedicine networks,
police vehicles, and protective gear.
o Guides on topics such as electronic crime scene investigation, use of
security technologies in schools, and antennas for radio communications.
For a more detailed description of OST's products and further examples,
see appendix III.
Figure 2: GAO's Grouping of OST's 945 Delivered Products, as of April 2003
(figure not reproduced)

Information dissemination and technical assistance: 593
Application, evaluation, and demonstration of existing and new technology
for field users: 191
Technology R&D: 161

Source: GAO analysis of OST data.
Notes: See appendix III, table 7 for examples of the products within each
group. Proportions should be considered approximations because some
products overlap categories.
Information Dissemination and Technical Assistance

Information dissemination and technical assistance represents about
63 percent of OST's delivered products. OST
provides information to its customers in a variety of ways. For example,
OST provides guidance to R&D laboratories on the needs of public safety
practitioners. To public safety practitioners, OST recommends certain
public safety practices, tools, and technologies. Through its Office of
Law Enforcement Standards,
OST develops performance standards to ensure that commercially available
public safety equipment, such as handheld and walk-through metal
detectors, meets minimum performance requirements. OST also helps its
customers enhance their technical capacities by providing them with
training and technical assistance through its Crime Lab Improvement
Program (which also provides supplies and equipment), DNA Backlog
Reduction Program, and network of technology centers. OST also uses the
R&D expertise and experience of already established laboratories and other
R&D organizations to provide additional guidance for managing specialized
technology projects. Further, OST helps its customers receive surplus
federal equipment by acting as their liaison to the equipment transfer
program of the Department of Defense. For example, equipment transferred
ranges from armored vehicles to boots and uniforms.
In addition, OST sponsors conferences, workshops, and forums that bring
together its customers, technologists, and policymakers. For example, it
sponsors the Mock Prison Riot, an annual event demonstrating emerging
technologies in riot training scenarios held at the former West Virginia
Penitentiary in Moundsville, West Virginia. This event brings together
corrections officers and vendors for technology showcases and training
exercises. Also, OST sponsors the Innovative Technologies for Community
Corrections Annual Conference, among others.
Application, Evaluation, and Demonstration of New and Existing
Technologies
Another OST product group is the application, evaluation, and
demonstration of new and existing technologies, which represents about 20
percent of OST's delivered products. Some of OST's programs apply existing
technology solutions in new ways to assist public safety operations.
Examples of the application of new and existing technologies include
developing methods for the collection and analysis of chemical trace
evidence left from explosives and providing a handheld computer device to
bomb technicians so they can access bomb data at the scene of incidents.
In addition, OST tests commercially available products through
NIJ-certified laboratories to determine whether they are in accordance
with national performance standards. Examples of products evaluated
against standards include body armor, handcuffs, and semiautomatic
pistols. OST's evaluations also include conducting field tests to compare
different commercially available products of the same type to allow users
to select the product that best suits their needs. OST also demonstrates
technology resulting from R&D directly to its customers through
OST-sponsored events. For example, the Critical Incident Response
Technology Seminar, formerly known as Operation America, demonstrates
live-fire simulation for bomb technicians. The annual Mock
Prison Riot demonstrates emerging technologies for use by corrections
officers and tactical team members.
Technology R&D
About 17 percent of OST's delivered products were related to technology
R&D, which involves the development of prototype devices, among other
efforts.19 According to OST, R&D in its early stages includes development
of prototypes and demonstration that a principle can be proven. Applied
R&D, which also involves the development of prototypes, includes
technologies that are made available to public safety agencies, generally
through OST-assisted commercialization. Examples of products resulting
from OST's applied R&D range from a bomb threat training simulator and
facial recognition technology for Internet-based gang tracking to a personal
alarm and location monitoring system for corrections officers.
According to OST, R&D in its early stages begins with testing technology
concepts, exploring solutions, and deciding whether continued development
is warranted. If OST decides to support product development and if it has
available funds, it awards funding to develop, demonstrate, and evaluate
an experimental prototype, which is then further developed into an initial
engineering prototype, and then demonstrated and evaluated. If the
prototype proves successful, OST demonstrates a "near commercial" model to
its customers for their evaluation.
While OST does not directly commercialize the results of its technology
R&D, it does provide prototypes to local users for field-testing and
assists in linking prototypes with potential commercial developers. OST
officials believe it would be a conflict of interest and therefore
inappropriate for them to promote one vendor or technology over another or
try to dictate what equipment their customers should purchase. OST's role
in commercialization is to bring technologies and potential manufacturers
together so that the manufacturers can determine the feasibility of
commercializing the technologies.
19While some of the products resulting from technology R&D are similar to
those of the application, demonstration, and evaluation of new and
existing technologies group, the primary distinction is that the former
includes the development of prototypes and the latter generally does not.
OST's Methods for Delivering its Products
OST delivers its products to its customers through a variety of methods.
(We recognize that products are sometimes delivery methods. For example, a
publication can be both a product resulting from research and a method of
information dissemination.) Besides publications, OST's methods for
delivering information and technical assistance include mass mailings;
downloadable material from its Web site; panels, boards, and working
groups; training, support, and presentations; and programs to enhance the
capacity of public safety agencies.
OST also delivers its products related to application, evaluation, and
demonstration through various means. For example, private industry
provides new and existing technologies to OST; in turn, OST informs its
customers of the results of using these technologies in new ways. OST
publishes user guides and the test results of its evaluations of
commercially available equipment (both standards-based and
comparison-based). Seeking to further educate its customers, OST
demonstrates new technology at technology fairs, providing "hands-on"
opportunities to use it.
For its R&D products, OST may test "near commercial" prototypes in
particular settings. For example, OST may install in a police agency a
prototype technology that facilitates communications among public safety
agencies and across jurisdictions. If the technology is effective, the
police agency may incorporate the technology directly into its operations,
before the technology has become a commercial product.
OST's Performance Measurement Efforts Do Not Fully Meet Requirements

OST's efforts to measure its performance results, including the usefulness
and effectiveness of its products, do not fully meet applicable
requirements. To help Justice comply with the Government Performance and
Results Act of 1993 (GPRA), OST establishes goals and develops performance
measures to track its progress. GPRA, which mandates performance
measurements by federal agencies, requires, among other things, that each
agency measure or assess relevant outputs and outcomes of each program
activity.20 According to GPRA, the Office of Management and Budget (OMB),
and GAO, outcomes assess actual results as compared with the intended
results or consequences that occur from carrying out a program or
activity. Outputs count the goods and services produced by a program or
organization. Intermediate measures can be used to show progress toward
achieving intended results. Subsequent OMB and committee report guidance
on GPRA and previous GAO reports21 recognize that output measures can
provide important information in managing programs. However, committee
report guidance emphasizes using outcome measures to aid policymakers
because such measures are key to assessing intended results.

20Performance measures are to be included in the agency performance plan
covering each program activity set forth in the budget of such agency.
Program activity, in this case, refers to projects and activities that are
listed in program and financing schedules of the annual Budget of the
United States Government.
OST Performance Measures Do Not Measure Results
The performance measures that OST has developed do not measure results.
According to the NIJ director, the Assistant Attorney General (AAG) in
April 2002 issued a memorandum requiring NIJ, including OST, to develop
outcome measures for fiscal year 2004. In August 2002, the NIJ Director
responded by stating that OST had indeed developed outcome measures for
its programs. In its fiscal year 2004 performance plan,22 OST established
goals for 11 of its initiatives23 and developed 42 measures for assessing
the achievement of those goals. However, based on our review of OST's
performance plan, OMB guidance on GPRA, and GAO definitions of outcome,
output, and intermediate measures, we determined that of the 42 measures,
none were outcome-oriented, 28 were output-oriented, and 14 were
intermediate. See table 4 for GAO's determination of the measures and
appendix VI for further details of our results.
21U.S. General Accounting Office, Managing for Results: An Agenda to
Improve the Usefulness of Agencies' Annual Performance Plans,
GAO/GGD/AIMD-98-228 (Washington, D.C.: Sept. 8, 1998).
22Annual performance plans describe a department component's goals and
performance targets in support of the department's long-term strategic
goals and targets. In its fiscal year 2004 performance plan, OST reported
actual performance data for fiscal year 2002, enacted plans for fiscal
year 2003, and performance plans for fiscal year 2004.
23Initiatives in this sense encompass portfolio areas, programs, and
projects.
Table 4: GAO's Assessment of the 42 Measures OST Developed for 11 of Its
Initiatives

                                                                   Type of measure
OST's initiatives                                             Output  Intermediate  Outcome
1. Convicted Offender DNA Backlog Reduction Program                0             3        0
2. No Suspect DNA Backlog Reduction Program                        0             1        0
3. Paul Coverdell National Forensic Sciences Improvement
   Grants Program                                                  0             1        0
4. Critical Incident Response Technology Initiative                4             1        0
5. DNA Research & Development                                      4             0        0
6. Law Enforcement Technology Research and Development             4             1        0
7. School Safety Technology                                        3             0        0
8. Crime Lab Improvement Program                                   4             6        0
9. Office for Law Enforcement Standards                            3             0        0
10. Smart Gun                                                      4             0        0
11. OST's network of regional centers (known as the National
    Law Enforcement and Corrections Technology Center system)      2             1        0

Source: GAO analysis of OST data.
According to Justice officials, R&D activities present measurement
challenges because outcomes are difficult or costly to measure. As the NIJ
Director pointed out, a May 2002 White House OMB and Office of Science
and Technology Policy memorandum concluded that agencies should not have
the same expectations for measuring the results of basic R&D as they do
for applied R&D.24 According to NIJ, relatively little of OST's work is
basic R&D. As shown earlier, most of OST's products are related to
information dissemination and technical assistance and the application,
evaluation, and demonstration of existing and new technologies for field
users.
24According to the OMB document, Budget of the United States Government
(Analytical Perspectives) for fiscal year 2004, basic R&D is defined as
systematic study directed toward greater knowledge or understanding of
fundamental aspects of phenomena and of observable facts without specific
applications toward processes or products in mind. Applied R&D is defined
as systematic study to gain knowledge or understanding necessary to
determine the means by which a recognized and specific need may be met.
We recognize that OST's task in relation to measuring the results of even
non-basic research is complex in part because of the wide array of
activities it sponsors, and because of inherently difficult measurement
challenges involved in assessing the types of programs it undertakes. For
example, programs that are intended to deter crime face measurement issues
in assessing the extent to which something (crime) does not happen.
Nevertheless, improvement in measurement of program results is important
to help OST ensure it is doing all that is possible to achieve its goals.
It is worth noting that an outcome measure in relation to one OST program
was discussed by the NIJ Director in a May 2002 statement to Congress. In
this statement, the Director provided an example of an outcome from the
Convicted Offender DNA Backlog Reduction Program. The Director stated that
as a direct result of the program, approximately 400,000 convicted
offender samples and almost 11,000 cases with no suspect were analyzed.
According to the NIJ Director, as of May 14, 2002, more than 900 "hits"
had been made on the FBI's Combined DNA Index System (CODIS) database as a
direct result of the program, that is, 900 cases previously unsolved had
been reopened. This information indicates how the program is achieving its
intended results in addressing unsolved cases. Although this example seems
to be a credible outcome measure, it is not included in OST's fiscal year
2004 performance plans.
Limitations in OST's Efforts to Measure Effectiveness of Information
Dissemination
OST's efforts to measure information dissemination effectiveness have been
limited. One of the purposes of GPRA is to improve federal program
effectiveness and public accountability by promoting a new focus on
results, service quality, and customer satisfaction. Surveys to gauge
customer satisfaction represent one step toward finding out whether
customers have received information and whether they deem it of value.
However, these surveys have limitations in determining the extent to which
the information has been acted upon and resulted in intended improvements.
Thus, surveys such as these are more likely to be intermediate measures
(Did information get transferred?) than outcome measures (Did information
get transferred, acted upon, and achieve a result?).
In 1998, NIJ initiated an effort to report the results of surveys to
measure the satisfaction of participants at all conferences, workshops,
and seminar series.25 OST reported on the "grantee level of satisfaction
with NIJ
25The surveys were done to determine if participants were satisfied with
the conference as a vehicle of information dissemination.
conferences" for fiscal years 1998-2000. However, in the fiscal years
20012004 GPRA performance plans, OST discontinued tracking the surveys
because OJP and NIJ had ceased tracking these data as a performance
measure.
In fiscal year 2001, OST attempted to evaluate the effectiveness and value
of its TECHbeat newsletter. The survey sample of 5,500 was taken from a
distribution of major readership groups on TECHbeat's mailing list of
20,674. According to OST, the response rate for the survey was too low to
produce statistically valid results: only 124 completed or substantially
completed responses were collected. The surveyors also experienced a very
low return on follow-up phone queries. According to the study, the primary
reason for the exceedingly low response rate was that so many individuals
on the mailing list had either changed jobs or were completely unfamiliar
with TECHbeat. Given these results, OST is trying to improve the
management and distribution of TECHbeat.26
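The report does not state the implied response rate, but it follows from
the figures above: 124 responses from a sample of 5,500 is roughly a
2 percent return. A minimal illustrative calculation, assuming only the
sample and response counts reported in this paragraph:

    # Illustrative only: response rate implied by the TECHbeat survey figures
    # reported above (5,500 sampled; 124 completed or substantially completed).
    sample_size = 5500
    responses = 124
    print(f"{responses / sample_size * 100:.1f} percent")  # about 2.3 percent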
In fiscal year 2001, OST attempted to launch another effort to measure
program results, service quality, and customer satisfaction, but funding
for the effort was not provided. OST requested funding for an evaluation
to measure the success of its outreach efforts, including those by its
technology centers. The evaluation was to determine customer satisfaction
with its strategies for outreach and communication and with its products.
Specifically, OST planned to measure user satisfaction with the content,
format, and delivery mechanisms of its efforts, such as technology
information and assistance.
26To address issues with the mailing lists, the technology centers have
shipped a larger portion of copies to agencies, in bulk, and to
individuals who have actively requested copies and supplied their
addresses; continued to purchase the most current version of the National
Directory of Law Enforcement Administrators, Correctional Institutions and
Related Agencies to update their mailing list; and modified mailing labels
to include the addressee and "...or Training Officer" in case the
addressee is no longer with that agency.
Most Studies of Other OST Initiatives Have Focused Primarily on Process
In fiscal years 1998 and 1999, OST funded eight outside studies of some of
its science and technology initiatives (see table 5).27 Our review of
these studies showed that seven of the eight studies focused on management
and organizational processes, and one was outcome-oriented.28 Management
and process evaluations can be useful tools for examining how a program is
operating and can offer insights into best practices. They do not assess
whether a program is achieving its intended results.
27Initiatives in this sense encompass portfolio areas, programs, and
projects.
28GPRA establishes two approaches for assessing an agency's performance:
annual measurement of program performance against goals outlined in a
performance plan and program evaluations to be conducted by the agency as
needed. Evaluations can play a critical role in helping to address
measurement and analysis challenges. Performance measurement is the
ongoing monitoring and reporting of program accomplishments, particularly
progress toward established goals. Program evaluations are individual
systematic studies conducted periodically or on an ad hoc basis to assess
how well a program is working. See U.S. General Accounting Office,
Performance Measurement and Evaluation: Definitions and Relationships,
GAO/GGD-98-26 (Washington, D.C.: April 1998).
Table 5: OST's Outside Studies of Its Initiatives
Outside study topics | Focus of study | Type of study | Date completed
1. National Law Enforcement and Corrections Technology Centers (NLECTC) Programa | Management, oversight, structure, organization, and operations | Process | October 1998
2. Counterterrorism Technology Portfolio | Organization, funding, program process | Process | June 1999
3. Investigative and Forensic Sciences Technology Portfolio | Program and structure, management, policies, procedures, lines of control, and funding | Process | August 1999
4. Less-Than-Lethal Technology Portfolio | Management, processes, and organization | Process | September 1999
5. Southwest Border States Antidrug Information System | Program efficacy, including awareness of the program, and its value and usefulness or benefits to customers | Outcome | October 1999
6. Law Enforcement and Corrections Technology Advisory Council and Technology Portfolio Interaction | Priorities, management and coordination, processes, organizational challenges | Process | February 2000
7. Critical Incident Response and Management Crime fighting Technology Program for State and Local First Responder Teams | Options for planning, organization, mission, management, budget, and recommendations | Process | September 2000
8. Standards Initiative | Recommendations for the planning, organization, and management of the proposed initiative expected to be a part of #7 above | Process | September 2000
Source: OST.
aIn this report we refer to the National Law Enforcement and Corrections
Technology Centers as OST's network of technology centers.
Efforts Are Under Way to Address Performance Measurement of Technology Centers
The Homeland Security Act of 2002 requires NIJ29 to transmit to Congress
by late November 2003 a report assessing the effectiveness of OST's
existing system of technology centers and to identify the number of
centers necessary to meet the technology needs of federal, state, and
local law enforcement in the United States. According to NIJ, in response
to the Homeland Security Act requirement, it has initiated a study to
assess the impact and effectiveness of the technology center system and
how it can be enhanced to meet the evolving science and technology
research and technology needs of the state and local public safety
community. NIJ also stated that the report would address the functions
that the technology center system must provide to transfer NIJ's research
and development results to practice in the criminal justice system. NIJ
and OST have failed to provide us with information detailing the
methodology of the study, so we cannot comment on the likelihood that this
study will produce the information sought by Congress. Additionally,
according to OJP, the technology centers are in the process of developing
outcome measures to demonstrate the impact of their activities.
29The Homeland Security Act actually directs the "Director" of OST to
transmit the report. After reorganizing in early 2003, NIJ now calls this
position the assistant NIJ director for science and technology.
According to NIJ, OJP has implemented additional performance measures
developed in May 2003 that will apply to NIJ, including OST. However, OJP
said it would defer implementing the measures related to the technology
centers until the results of the technology center study are known and NIJ
has a chance to take action, if warranted.
Conclusions
Measuring Results Is Difficult but Feasible
We acknowledge that measuring results using outcome measures is difficult,
and may be especially so in relation to some of the types of activities
undertaken by OST. Indeed, given the types and wide range of program goals
for OST efforts-solving old crimes, saving lives, and reducing property
loss-it may be the case that for some programs intermediate measures
represent the best feasible measure of results. We note that approximately
63 percent of OST's products fall into the category of information
dissemination and technical assistance, aimed at informing customers and
ultimately encouraging adoption of research results that lead to increased
efficiency and effectiveness. There are strategies available that have
been used by other federal agencies to take steps toward assessing the
effectiveness of information dissemination and technical assistance
efforts. For example, a recent GAO report30 outlines various strategies to
assess media campaigns and informational seminars, including immediate
post-workshop and follow-up surveys and the use of logic models to define
measures of a program's progress toward intended results and long-term
goals.
30U.S. General Accounting Office, Program Evaluation: Strategies for
Assessing How Information Dissemination Contributes to Agency Goals,
GAO-02-923 (Washington, D.C.: Sept. 30, 2002).
Given the wide range of its products, OST has the potential to
significantly improve the technological capabilities of federal, state,
and local public safety agencies. However, the lack of information about
the results of program efforts, or the assessment of progress toward
goals, means that little is known about their effectiveness. While
developing outcome
measurements in many research-related programs is extremely difficult,
there are various performance measurement strategies that other federal
programs have used for assessing information dissemination, technical
assistance, and other R&D activities and that might be applied to OST's
programs. It is important to develop outcome measurements where feasible,
or intermediate measurements where appropriate, to assist Congress, OST
and NIJ management, and OST's customers in better assessing whether
investment in OST's programs is paying off with improved law enforcement
and public safety technology.
Recommendation
To help ensure that OST does all that is possible to measure its progress
in achieving goals through outcome-oriented measures, we recommend that
the Attorney General instruct the Director of NIJ to reassess the measures
OST uses to evaluate its progress toward achieving its goals and to better
focus on outcome measures to assess results where possible. In those cases
where measuring outcome is, after careful consideration, deemed
infeasible, we recommend developing appropriate intermediate measures that
will help to discern program effectiveness.
Agency Comments and Our Evaluation
We provided a copy of a draft of this report to the Attorney General of
the United States for review and comment. In an October 30, 2003, letter,
the Assistant Attorney General (AAG) for OJP commented on the draft. Her
comments are summarized below and presented in their entirety in appendix
VII. OJP also provided technical comments, which have been incorporated
into this report where appropriate.
In the AAG's written response, the Justice Department concurred with our
recommendation that NIJ reassess the measures OST uses to assess program
outcomes. In response to our recommendation, the AAG reported that she has
directed the NIJ Director to reassess NIJ's performance measures for OST
and to refine them, where possible, in order to focus them more toward
measuring outcomes.
While the AAG agreed with our recommendation, she also made several other
comments. First, she commented that developing numerical outcome measures
like those urged by GAO is a particular challenge for R&D activities. As
stated in our report, we recognize that measuring results using outcome
measures is difficult and may be especially so in relation to some of the
types of activities undertaken by OST. Our reference to a numerical
measure is meant only as an example of how one of OST's program activities
can be linked to intended results. We believe
that further consideration of measures, both quantitative and qualitative,
could improve the assessment of results for R&D as well as other OST
programs. Our report also notes that relatively little of OST's work is
R&D. The majority of OST's products are in the category of information
dissemination and technical assistance.
Second, the AAG noted that GAO did not reach any conclusions in its
discussion of OST's growth in budgetary resources, changes in program
responsibilities, management of programs, and delivery of its products.
The AAG noted that Justice believed that OST's record is outstanding.
Neither OST nor we can determine whether OST's efforts in these areas are
successful or otherwise, given that OST has not developed measures to
assess their outcomes. Therefore, it is not possible to draw conclusions.
Third, the AAG indicated that GAO did not discuss in detail that over
one-half of OST's funds were designated by Congress for specific recipients
and projects. She noted that GAO missed an opportunity to inform the
requester of the impact of Congress' recent decisions regarding OST. We
reported that 51 percent of OST's budgetary resources were designated for
specific recipients and projects in public law or subject to guidance in
committee reports.
As agreed with your office, unless you publicly announce its contents
earlier, we plan no further distribution of this report until 10 days from
its issue date. At that time, we will send copies of the report to the
Attorney General, appropriate congressional committees and other
interested parties. We will also make copies available to others upon
request. In addition, the report will be available at no charge on GAO's
Web site at http://www.gao.gov. Major contributors to this report are
listed in appendix VIII.
If you or your staff have any questions concerning this report, contact me
on (202) 512-8777.
Sincerely yours,
Laurie E. Ekstrand
Director, Homeland Security and Justice Issues
Appendix I: Scope and Methodology
To answer our objectives, we interviewed National Institute of Justice
(NIJ) and Office of Science and Technology (OST) officials and collected
documents at OST's office in Washington, D.C., and at three of OST's
technology centers-the National center in Rockville, Maryland; West center
in El Segundo, California; and Border Research and Technology Center in
San Diego, California. We selected the Rockville center because of its
proximity to Washington, D.C., and the other two centers because of their
locations and particular areas of technology and technical concentrations.
We also interviewed a small group of OST's customers- federal, state, and
local law enforcement, and corrections and public safety officials-who
were selected by officials at the El Segundo and San Diego centers. In
addition, we analyzed information that is available on the National
Institute of Justice's public Web site.
To determine OST's budgetary resources and amounts from fiscal year 1995
to fiscal year 2003 and the changes in OST program responsibilities, we
reviewed NIJ and OST budget documents, interviewed officials in OST's
Technology Management and the OJP's Office of Budget and Management
Services, and reviewed pertinent appropriations laws and committee reports
covering that period. To determine the amount of OST budgetary resources
that were directed to specific recipients and projects, we compared OST's
budget documents that listed individual recipients and projects with the
public laws and reports. We defined directed spending as spending for
specific recipients and projects designated in appropriations laws or
subject to congressional committee report guidance designating specific
recipients or projects. We did not determine the amount of reimbursable
projects designated in public laws or specified in committee reports
because those projects were not originally allocated to OST. Instead, we
considered all the reimbursable projects to be specific projects for which
OST was directed pursuant to its agreements with other agencies on
spending its reimbursable funds.
To determine the changes in OST's program responsibilities, we analyzed
the year-to-year changes in its budget and program scope. To determine the
amount of OST's budgetary resources used for investigative and forensic
sciences for fiscal years 1995-2003, we compared OST's portfolio
description and NIJ's definition of forensic sciences with the individual
budget program and project items listed in OST's budget documents for each
fiscal year. However, while we recognize that OST's technology centers and
their technical partners include investigative and forensic sciences in
their provision of technical assistance, we did not attempt to determine
the amount of center funds associated with investigative and forensic
sciences because the budget documents we received from OST
did not break out such amounts within the funding awarded to the centers.
Therefore, our determination that $342.1 million of OST's total funding
supported investigative and forensic sciences did not include such
amounts.
To determine the amounts of funding awarded to the technology centers, we
analyzed databases on all of the products OST has produced through April
2003 and the associated grants, interagency agreements, and cooperative
agreements and their amounts.
To determine the composition of OST's products and how OST delivers the
products to its customers, we analyzed OST documents and a database of all
the products associated with its past and ongoing awards, from inception
through April 2003, that were delivered or anticipated to be delivered.
While the database included the award amounts associated with the
products, it was not possible to reliably associate the award amounts for
each product or type of product because multiple types of products could
result from individual awards. We also conducted interviews with the
parties mentioned above.
For the budget and product data that OST provided us, we assessed the
reliability of these data by examining relevant fields for missing data,
conducting range checks to identify any potentially erroneous outliers, and
inspecting a subset of selected data elements that were common to two or
more data sets. In addition, we independently verified selected budget
data back to appropriations legislation and Committee reports. In
conducting our analyses, we identified some potential data errors or
reliability problems. When this occurred, we contacted agency officials to
address and resolve these matters. However, we did not verify the budget
or product data back to source materials. Overall, we determined that the
budget and product data provided to us are adequate for the descriptive
purposes for which they are used in this report.
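To illustrate the kinds of checks described above, the sketch below shows, in Python, how missing-data checks, range checks, and comparisons of data elements common to two data sets might be carried out. It is purely illustrative: the field names (award_id, fiscal_year, award_amount) and the range limits are hypothetical and are not drawn from OST's actual data files.

    # Illustrative data-reliability checks (hypothetical field names and limits).
    def check_missing(records, required_fields):
        """Return records that lack a value for any required field."""
        return [r for r in records
                if any(r.get(f) in (None, "") for f in required_fields)]

    def check_range(records, field, low, high):
        """Return records whose numeric field is absent, non-numeric, or outside [low, high]."""
        flagged = []
        for r in records:
            try:
                value = float(r[field])
            except (KeyError, TypeError, ValueError):
                flagged.append(r)
                continue
            if not low <= value <= high:
                flagged.append(r)
        return flagged

    def check_consistency(records_a, records_b, key):
        """Return key values that appear in one data set but not the other."""
        return {r.get(key) for r in records_a} ^ {r.get(key) for r in records_b}

    if __name__ == "__main__":
        budget = [{"award_id": "A-1", "fiscal_year": 1999, "award_amount": 1.2},
                  {"award_id": "A-2", "fiscal_year": 2002, "award_amount": None}]
        products = [{"award_id": "A-1"}, {"award_id": "A-3"}]
        print(check_missing(budget, ["award_id", "fiscal_year", "award_amount"]))
        print(check_range(budget, "award_amount", 0.0, 250.0))
        print(check_consistency(budget, products, "award_id"))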
We examined OST's efforts to measure performance by interviewing officials
on this matter at OJP, NIJ, and OST in the Washington, D.C., office along
with officials and staff at the technology centers, and current and
previous Advisory Council officials. We also reviewed related agency
documents, such as the OJP mission statement and performance plans; NIJ
strategic planning documents and website pages, annual performance plans
and performance reports, and GPRA documents; policies and procedures,
Department of Justice memoranda, OST internal planning and reporting
documents, program descriptions and documentation, and other related
documents.
As part of our examination, we reviewed OST's fiscal year 1997 to 2004
goals and measures as presented in OST's GPRA performance plans.1 We
focused our review on OST's fiscal year 2004 performance plan and
measures. As part of our review of these goals and measures, we made a
determination as to whether the performance measure was output, outcome,
or intermediate-oriented. To make this determination about the types of
performance measures contained in OST's performance plans, we compared the
measures used in the plans with the requirements of GPRA, its accompanying
committee report, OMB's guidance on performance measurement challenges
(Circular A-11), Justice's guidance to its components for preparing
performance measures, and previous GAO work on GPRA.2
Also included in our examination of OST performance measurement efforts
were studies prepared by external parties under grants from OST that
reviewed selected OST initiatives such as portfolio areas, projects, and
programs. In response to our request for all of OST's efforts to assess
its programs, OST provided eight outside studies funded in fiscal years
1998 and 1999. For example, the Pymatuning Group, Inc., conducted an
"Assessment of the National Law Enforcement and Corrections Technology
Center (NLECTC) Program," which described the operations of the OST's
regional technology centers network. We reviewed all eight of the outside
studies for performance information on the OST initiatives being examined
in the report. We examined the studies to determine whether they provided
information that would be considered consistent with an outcome-oriented
evaluation as defined by our criteria.3
The scope of this review was limited to OST, and therefore we cannot
compare OST's efforts to measure the performance of its programs or the
amount of funding directed to specific recipients and projects with the
efforts and funding of any other federal R&D agencies. We performed our
1To determine the goal for each OST program included in the plan, we used
the stated public benefit statement provided in the plan, except for the
Law Enforcement Technology R&D program.
2See U.S. General Accounting Office, Agency Performance Plans: Examples of
Practices That Can Improve Usefulness to Decisionmakers,
GAO/GGD/AIMD-99-69 (Washington, D.C.: Feb. 26, 1999) for our guidance
concerning intermediate-oriented measures, and Managing for Results:
Critical Issues for Improving Federal Agencies' Strategic Plans,
GAO/GGD-97-180 (Washington, D.C.: Sept. 16, 1997).
3See U.S. General Accounting Office, Performance Measurement and
Evaluation: Definitions and Relationships, GAO/GGD-98-26 (Washington,
D.C.: April 1998).
audit work from September 2002 to September 2003 in Washington, D.C., and
other cited locations in accordance with generally accepted government
auditing standards.
Appendix II: Budgetary Resources for OST's Programs in Current Year Dollars
Table 6: Budgetary Resources in Current Dollars for OST's Programs by NIJ
Allocation, Fiscal Years 1995-2003
Dollars in millions
NIJ allocations for OST programs | 1995 | 1996 | 1997 | 1998 | 1999 | 2000 | 2001 | 2002 | 2003 | Totala
NIJ Base | 9.2 | 12.0 | 11.7 | 13.8 | 19.2 | 18.4 | 28.6 | 27.1 | 32.8 | 172.9
Local Law Enforcement Block Grant (LLEBG) | 0 | 20.0 | 20.0 | 20.0 | 20.0 | 20.0 | 20.0 | 20.0 | 19.9 | 159.8
Crime Identification Technology Act (CITA) | 0 | 0 | 0 | 0 | 0 | 4.2 | 4.2 | 1.4 | 0 |
Safe Schools Technology Research and Development | 0 | 0 | 0 | 0 | 0 | 15.0 | 17.5 | 17.0 | 16.9 | 66.4
Crime Lab Improvement Program (CLIP) | 0 | 1.0 | 3.0 | 12.5 | 15.0 | 15.0 | 19.4 | 35.0 | 40.3 | 141.1
DNA Backlog Reductionb | 0 | 0 | 0 | 0 | 0 | 15.0 | 10.6 | 35.0 | 35.8 | 96.3
Paul Coverdell National Forensic Sciences Improvement Act (NFSIA)b | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 5.0 | 5.0 | 10.0
Counterterrorism R&D | 0 | 0 | 10.0 | 12.0 | 10.0 | 30.0 | 36.0 | 45.3 | 0 | 143.3
Reimbursements from other Justice and federal agencies | 2.5 | 14.5 | 0 | 8.3 | 10.9 | 12.5 | 26.3 | 82.2 | 56.9 | 214.1
Totala | 11.7 | 47.5 | 44.7 | 66.6 | 75.1 | 130.2 | 162.6 | 268.0 | 207.6 | 1,013.8
Source: GAO analysis of OST data.
aTotals might not add due to rounding.
bIn fiscal years 2000 and 2001, the DNA Backlog Reduction allocation was
funded as DNA CODIS Backlog Reduction. In fiscal years 2002 and 2003, both
the DNA Backlog Reduction and Coverdell NFSIA allocations were funded
within DNA CODIS Backlog Reduction.
Appendix III: OST's 10 Categories of Products
While we divided OST's products into three groups for our reporting
purposes, OST divides them into 10 categories. (See table 7 for GAO's 3
groupings of OST's 10 categories.) In regrouping OST's 10 categories, we
recognized, as OST officials told us, that the 10 categories overlap and
there is not a clean division between them. We also recognized that many
of OST's products could also be considered a delivery method. For example,
publications, such as the TECHbeat newsletter, are OST products that can
also represent a method of delivery for OST technology information. OST
has reviewed our classification of products and agrees that it is
generally accurate.
Table 7: GAO's Groupings of OST's Categories of Products and Examples of
Each Category
GAO's 3 groupings of OST's 10 categories | OST's 10 categories | Examples of products
1. Technology R&D
1. Results of the early stages of technology R&D include the development
of prototypes and demonstration that a principle or concept can be proven.
Results of investigating forensic techniques, studying potential
less-than-lethal incapacitation technologies, and researching advanced
weapons detection.
2. New applied technologies made available to public safety agencies,
generally through commercialization.
Improved bomb robots and electromagnetic concealed weapons detection.
2. Application, evaluation, and demonstration of new and existing technologies for field users
3. Existing technologies applied to new situations.
Communications interoperability (the ability to communicate across
different public safety agencies and jurisdictions), handheld computer
devices for bomb investigators, and software tools to measure levels of
school safety.
4. Product evaluations based on voluntary national performance standards
and comparisons with like products.
Ballistic and stab-resistant body armor, handcuffs, semi-automatic
pistols, walk-through metal detectors; patrol vehicles, patrol vehicle
tires, and replacement brake pads; cut-, puncture-, and pathogen-resistant
protective gloves.
5. Technology demonstrations.
Annual Mock Prison Riot meeting demonstrates emerging technologies for use
in hands-on riot training scenarios, and the annual Critical Incident
Response Technology seminar (formerly called Operation America), in which
bomb technicians practice live-fire simulations.
3. Information dissemination and technical assistance
6. Information and guidance for public safety practitioners and those in
R&D.
Needs assessments of what public safety practitioners require, such as for
combating electronic crime and terrorism; funding requirements for
forensic science; investigative, selection, and application guides; and
technology and training for small agencies.
7. Standards to ensure that commercially available public safety
equipment meets minimum performance.
Ballistic resistance of personal body armor and handheld and walk-through
metal detectors.
8. Enhanced capacity that gives agencies access to technologies and tools
they otherwise might not have had funding for or access to.
9. Conferences, forums, and workshops that bring together practitioners,
technologists, and policymakers to form partnerships, communicate needs,
and educate participants.
10. Technical expertise and oversight of technology projects provide
additional oversight and guidance.
Technology assistance provided to OST's customers by its regional centers;
Crime Lab Improvement Program for establishing or expanding laboratories'
capacities for forensic analysis; the DNA Backlog Reduction Program for
helping to eliminate DNA backlog, leading to the resolution of unsolved
crimes.
Technical working groups of experienced practitioners and researchers
working to improve investigation techniques and issue procedural guides.
Panels and councils of public safety leaders, experts, and policymakers
assisting OST and its regional centers in setting development priorities,
launching technologies, identifying equipment problems, and enhancing
understanding of technological issues and advances. Commercialization
planning workshops involving developers and entrepreneurs interested in
commercializing public safety technologies.
Space and Naval Warfare Systems Command providing oversight, contracting,
and administrative support for the NIJ User Centric Information Technology
Program and Critical Incident Management System Testbed; the U.S. Air Force
Research Laboratory providing oversight and administrative support to the
NIJ Concealed Weapons Detection and Personnel Location Technology Programs
and hosting the NIJ-sponsored National Cyberscience Laboratory.
Source: GAO analysis of OST data.
Appendix IV: OST's Portfolio Areas
OST has organized its individual projects to develop, improve, and
implement technology for public safety agencies into nine portfolio areas.
As of April 2003, these portfolio areas included
o critical incident technology, for first responders and investigators
protecting the public in the event of critical incidents such as natural
disasters, industrial accidents, or terrorist acts;
o communications interoperability1 and information sharing, enhancing
communication among public safety agencies through wired
links, wireless radios, and information networks, even when disparate
systems are involved;
o electronic crime, supporting computer forensic laboratories,
publishing guides for handling electronic evidence, and developing
computer forensic tools;
o investigative and forensic sciences, funding at the state and local
levels for DNA-typing of convicted offenders and use of DNA-typing in the
investigation of unsolved cases, and developing tools for forensic
casework;
o crime prevention technologies, including contraband detectors, sensors
and surveillance systems, and biometric technologies;
o protective systems technologies, including body armor; "smart"
handguns, which fire only upon recognition of, for example, a certain
handprint or password; puncture resistant gloves; better handcuffs; better
concealed weapon detection; and personnel tracking and location
technologies;
o less-than-lethal technologies, developing alternatives to lethal
force, including technologies involving electrical or chemical effects,
light barriers, vehicle stopping, and blunt trauma, and evaluating and
modeling the effects of these technologies;
o learning technologies, developing technology tools for agencies to use
in training their personnel, including use of the internet, CD-ROMs, and
video-based and interactive simulations; and
1Interoperability of communications is the ability to communicate across
different public safety agencies and jurisdictions.
o standards and testing, ensuring that the equipment public safety
agencies buy is safe, dependable, and effective.
Appendix V: OST's Operations
As with other federal agencies, OST's operations involve multiple levels
of internal organization and multiple kinds of external partners. OST's
multiple levels of organization include a Washington, D.C., office that
manages its technology programs and a network of technology centers around
the country that provide technical assistance to OST's regional customers.
OST also collaborates with other R&D entities, such as those in the
Departments of Defense and Energy and public and private laboratories, by
forming technical partnerships in order to leverage already established
technical expertise and resources to support their program efforts.
Another aspect of OST's complex operations is the need to determine OST's
own priorities and the priorities of its customers, which involves
Washington and regional center staff collaborating formally and informally
with a myriad of federal, state, and local officials, as well as with one
another.
OST Has Multiple Levels of Organization
OST's multiple levels of organization include a Washington, D.C., office
and technology centers, as well as technical partnerships with government,
public and private R&D and public safety organizations. As of September
2003, OST's Washington office consisted of 25 full-time-equivalent Justice
staff divided into three divisions under the Assistant NIJ Director
for OST.1 Responsibility for managing these programs is divided among the
three divisions. (See figure 3 for OST's organizational structure.)
1In addition, there were 2 federal detailees, 2 visiting scientists, and
32 on-site contractors supporting OST.
Figure 3: OST's Organizational Structure
o Research and Technology Development Division manages electronic crime
(including cybercrime), critical incidents and counterterrorism,
communications interoperability and information sharing, crime prevention,
learning technology tools, less-than-lethal technologies, standards
development, school safety, and corrections technologies.
o Investigative and Forensic Sciences Division manages DNA-related R&D
and other investigative and forensic sciences, such as fingerprint
analysis, and includes the Crime Laboratory Improvement Program projects,
DNA Backlog Reduction projects, and DNA research and development projects.
o Technology Assistance Division, through the technology center network,
provides training and technical advice to, and identifies technologies
for, OST's customers, and oversees OST's network of 10 technology centers
(see figure 4). The technology centers are another source of technical
advice for OST's customers.
o The Office of Law Enforcement Technology Commercialization, Inc.,
assists inventors and developers, among others, in commercializing
technologies.
o The Border Research and Technology Center aids in the development of
technologies for agencies concerned with law enforcement at the northern
and southern borders.
o The Rural Law Enforcement Technology Center aids rural and
small-community law enforcement and corrections agencies.
Table 8: Total Funds Awarded for the Operations, Maintenance, and
Technical Support of OST's 10 Technology Centers, Fiscal Years 1995-2003
Dollars in millions
Regional centers Total funding
National, Rockville, Md.
Northeast, Rome, N.Y.
Southeast, North Charleston, S.C.
Northwest, Anchorage, Alaska
Rocky Mountain, Denver, Colo.
West, El Segundo, Calif.
Specialty centers
Border Research Technology Center, San Diego, Calif.
Office of Law Enforcement Standards, Gaithersburg, Md.
Office of Law Enforcement Technology Commercialization, Wheeling, W.Va.
Rural Law Enforcement Technology Center, Hazard, Ky.
Total funding $171.7
Source: OST.
Notes: Figures are based on the current year values of each award.
According to OST documents, the first award year for the Office of Law
Enforcement Standards in support of OST efforts was 1994. All of the
centers had award years of 1995 or later.
OST's Technical Partnerships for Long-Term Support
In addition to forming divisions and technology centers, OST has also
formed partnerships with governmental, public, and private R&D
organizations, agencies, and working groups. According to OST officials,
an advantage of these partnerships is that OST can leverage the expertise
and resources of already established R&D facilities without having to
create its own. These partners have included
o corporations, such as Georgia Tech Research Corporation and L-3
Communications Analytics Corporation;
o state and local agencies, such as the Houston Police Department and
the Washington Metropolitan Area Transit Authority;
o academic institutions, such as the University of Virginia and Syracuse
University;
o other federal government agencies, such as the Department of Defense's
Army Training and Doctrine Command, and the Department of Transportation's
Federal Aviation Administration; and
o foreign government organizations, such as the Royal Canadian Mounted
Police, the United Kingdom Police Scientific Development Branch, and the
government of Israel.
Each of OST's technology centers is affiliated with one or more of OST's
technical partners. These technical partners are awarded funding in
exchange for providing staff and facilities to the technology centers.
Table 9 lists OST's partners and their affiliations, and funding they
received to support the centers through June of fiscal year 2003.
Table 9: OST's Technology Centers, Their Affiliated Partners, and the Amounts
Awarded to Support the Centers
Dollars in millions
Technology centers | Affiliated OST partner | Amount awarded to support center
Regional centers
National, Rockville, Md. | Aspen Systems Corporation, Rockville, Md.
Northeast, Rome, N.Y. | Air Force Research Laboratory, U.S. Air Force, Rome, N.Y.
Southeast, North Charleston, S.C. | South Carolina Research Authority, North Charleston, S.C.
 | Space and Naval Warfare Systems Center, U.S. Navy, Columbia, S.C.
 | Oak Ridge National Laboratory, U.S. Department of Energy, Oak Ridge, Tenn.
 | Savannah River Site, Department of Energy, Aiken, S.C.
Northwest, Anchorage, Alaska | Chenega Technology Services Corporation, and National Business Center, U.S. Department of Interior, Anchorage, Alaska
Rocky Mountain, Denver, Colo. | University of Denver - Colorado Seminary, Denver, Colo.
West, El Segundo, Calif. | Aerospace Corporation, El Segundo, Calif.
Specialty centers
Border Research Technology Center, San Diego, Calif. | Aerospace Corporation, El Segundo, Calif.
 | Space and Naval Warfare Systems Center, U.S. Navy, San Diego, Calif.
 | Sandia National Laboratories, U.S. Department of Energy, Albuquerque, N. Mex.
 | U.S. Attorney's Office, Southern District of California, Department of Justice, San Diego, Calif. | 0.0a
Office of Law Enforcement Standards, Gaithersburg, Md. | National Institute of Standards and Technology, U.S. Department of Commerce, Gaithersburg, Md. | 53.6
Office of Law Enforcement Technology Commercialization, Wheeling, W.Va. | OLETC, Inc., Wheeling, W.Va. | 2.8
 | Wheeling Jesuit University, Wheeling, W.Va. | 14.0
 | National Aeronautics and Space Administration | 2.8
Rural Law Enforcement Technology Center, Hazard, Ky. | Eastern Kentucky University, Hazard, Ky. | 3.0
Total fundingb | $171.7
Source: OST.
Note: Figures are based on the current year values of each award. Award
amounts are for the operations, maintenance and technical support of the
centers.
aActual amount is $25,000.
bTotal might not add due to rounding.
OST Collaborates with Many Customers and Partners to Determine Program
Priorities
To determine its program priorities, OST collaborates with its many
customers and partners. Staff at both OST's Washington, D.C., office and
its technology centers are involved in helping OST to set program
priorities. The staff report the results of their collaboration through
formal meetings, periodic reports, and informal communication. Input is
exchanged continually between OST's customers and staff and within its
multiple levels of organization. Using their input, the NIJ Director
determines OST's program priorities. (See figure 5 for the stakeholders,
partners, and customers that contribute to the setting of OST's
priorities.)
Figure 5: Stakeholders and Customers that Contribute to the Setting of
OST's Priorities
OST Collaborates with Government Agencies, Research and Professional Communities, and Centers
OST's three divisions collaborate with other U.S. government agencies, the
research and professional communities, and its technology centers to
solicit input for setting priorities. Also, the divisions work with public
safety practitioners at the state and local levels by, for example,
meeting with grantees and assessing their needs.
o The Investigative and Forensic Sciences Division collaborates with,
and receives input from, researchers, academia, and the forensic
laboratory community to help set program priorities. It also collaborates
with, for example, the FBI and the interagency Technical Support Working
Group.
o The Research and Technology Development Division receives input
through its collaboration with other federal agencies, such as the FBI,
Drug Enforcement Administration, U.S. Secret Service, and White House
Office of National Drug Control Policy. The division also participates in
interagency working groups, such as for school safety and the Technical
Support Working Group. Through these collaborations, OST can develop and
share technologies used by both its customers and other agencies. For
example, OST works with the Department of Defense to conduct
less-than-lethal weapons R&D for law enforcement.
o The Technology Assistance Division is primarily responsible for
receiving input from OST's technology centers. The centers solicit input
from customers through their outreach efforts, such as technical
assistance, e-mail exchanges, and telephone calls. The centers are also
required to use OST's web-based reporting system to record information on
their customers' requests for technical assistance. The centers are also
required to submit monthly reports on their activities and finances.
Advisory Councils and Federal, State, and Local Public Safety Agencies
Collaborate with OST's Technology Centers
OST's technology centers solicit input from the national and regional
advisory councils that OST created to determine and advocate for the
particular needs of its customers. Members of the national advisory
council are selected by the technology centers and represent federal,
state, and local public safety agencies, as well as international criminal
justice organizations. Among its duties, this national advisory council
identifies the present and future equipment and technology needs of OST's
customers and reviews the programs of the technology centers. In addition,
the national advisory council recommends (1) ways to improve the
technology centers' programs' relevance to the needs of the centers'
customers and (2) broad priorities for the technology center network and
OST that are consistent with the needs of their customers.
Each technology center has a regional advisory council. The regional
advisory councils consist of a cross-section of law enforcement and other
public safety officials who represent the interests of state and local
officials. The regional advisory councils solicit input from the state and
local agencies serviced in their regions, advise and support their
respective center directors on their customers' problems and needs, and
advocate for resource support and improvements required by their
customers. Through this method of sharing information, OST can better
understand the needs of its customers. For example, OST's regional
councils can represent the unique needs of their customers that the
national advisory council or the technology centers might not be aware of.
Appendix VI: OST's Goals in its Fiscal Year 2004 Performance Plan and
GAO's Assessment
Table 10: OST's Performance Goals, Initiatives, and Measures for Fiscal
Year 2004, and GAO's Assessment
OST's initiatives | Goals for initiatives | Measures for assessing achievement of goals | Type of measure (Output, Intermediate, or Outcome)
1. Convicted Offender DNA Backlog Reduction Program Reduce DNA backlog and
support a functioning, active system, which can solve old crimes and
prevent new ones from occurring.
1. Number of labs demonstrating improved access to external capabilities
and increased lab capabilities.
2. Number of samples (1) analyzed using the selected DNA markers that are
required by the FBI's national Combined DNA Index System (CODIS) database,
and (2) made available for CODIS.
3. Number of states that have experienced an increase in the number of
samples they have contributed to the national database.
X
X
X
2. No Suspect DNA Backlog Reduction Program Reduce DNA backlog and support
a functioning, active system, which can solve old crimes and prevent new
ones from occurring.
4. Number of DNA samples from cases where there is no known suspect.
X
3. Paul Coverdell National Forensic Sciences Improvement Grants Program
Improve quality, timeliness, and credibility of forensic science services.
5. Number of forensic labs with improved analytic and technological
resources.
X
4. Critical Incident Response Technology Initiative Improve the ability of
public safety responders, including law enforcement and corrections
officers, to deal with critical incidents, save lives, and reduce property
loss.
6. Number of technology demonstrations and test indicators that describe
the goods and services produced.
7. Number of prototype technologies developed.
8. Number of guides, standards, and assessments in progress.
9. Number of guides, standards, and assessments completed.
10. Number of technologies introduced in law enforcement and corrections
agencies.
X
X X X
X
5. DNA Research & Development Develop faster and more powerful tools and
techniques for the analysis of DNA evidence. These new tools and
techniques will result in more crimes prevented and solved and more
perpetrators brought to justice.
11. Number of projects researching new forensic DNA markers.
X
12. Number of development/validation studies for forensic DNA techniques.
X
13. Number of computer programs developed for forensic DNA analysis.
X
14. Number of prototypes and tools for forensic DNA analysis.
X
6. Law Enforcement Technology Research and Development Assist in applying
technology to reduce the vulnerability of critical infrastructure; detect
weapons and other contraband; improve technologies to locate and
differentiate between individuals in structures; leverage information
technology to enhance the responder community's ability to anticipate and
deal with critical incidents; identify and respond to terrorist attacks
involving chemical, biological, and other unconventional weapons; and
develop needed standards.a
15. Number of technology demonstrations and tests.
16. Number of prototype technologies developed.
17. Number of guides, standards, and assessments in progress.
18. Number of guides, standards, and assessments completed.
19. Number of technologies introduced in law enforcement and corrections
agencies.
X X X X
X
7. School Safety Technology Assist school administrators and law
enforcement in creating a safer and more productive learning environment.
Safe, effective, appropriate, and affordable technologies can affect the
perception and reality of safe schools.
20. Number of technology demonstrations. X
21. Number of conferences and forums. X
22. Number of school safety technology products.
X
8. Crime Lab Improvement Program Provide immediate results in solving more
crimes, bringing to justice more criminals, and improving administration
of justice through the presentation of strong, reliable forensic evidence
at trial.
23. Number of crime labs receiving specialized forensic services.
24. Number of capacity-building forensic R&D and validation projects
funded.
25. Number of forensic technology training tools developed and
distributed.
26. Number of labs providing continuing education or advanced training to
crime analysts.
27. Number of crime labs with increased capacity for implementation of new
forensic capabilities (including DNA analysis).
28. Number of capacity-building forensic R&D and validation projects
completed and impacting crime labs.
29. Number of labs establishing new forensic capabilities.
30. Number of labs expanding current forensic capabilities.
31. Number of labs experiencing a reduction in time needed for evidence
analysis.
32. Number of labs experiencing a reduction in backlogged evidentiary
sample analysis.
X X X X
X
X
X X X X
9. Office for Law Enforcement Standards Help the public safety community
make informed decisions about products being marketed for public safety
personnel.
33. Number of methods for examining evidentiary materials developed.
34. Number of standards for equipment and operating procedures developed.
35. Law enforcement technology deliverables (standards, product
performance evaluations, product guides).
X
X
X
10. Smart Gun Develop a firearm that could save the lives of law
enforcement officers and members of the public that they encounter while
performing their duties.
36. Successful demonstration of prototype recognition system for smart
gun. X
37. Failure mode analysis for prototype recognition system for smart
gun. X
38. Incorporation and demonstration of recognition system into firearm
(where applicable). X
39. Identification of appropriate biometric solutions for recognition
system (where applicable). X
11. OST's network of technology centers (known as the National Law
Enforcement and Corrections Technology Center system) Help state and local
law enforcement, corrections, and public safety personnel do their jobs
more safely and efficiently, thereby leading to greater administrative
efficiencies, more crimes solved, and more lives saved.
40. Number of technology information documents distributed.
41. Number of practitioners trained through the Crime Mapping Program.
42. Savings to criminal justice agencies through the DOD's Section 1033
Military Surplus Program. Section 1033 of the National Defense
Authorization Act for Fiscal Year 1997b authorizes DOD to transfer excess
military property to federal and state agencies to support law enforcement
activities including counterdrug and counterterrorism activities.
X X
X
Source: OST.
aBecause the goal for this initiative was not outcome-oriented according
to our criteria, we used the initiative's mission statement as the goal.
bP.L. 104-201, 110 Stat. 2422 (1996).
Appendix VII: Comments from the Department of Justice
Appendix VIII: GAO Contacts and Staff Acknowledgments
GAO Contacts
Laurie Ekstrand (202) 512-8777 Weldon McPhail (202) 512-8644
Staff Acknowledgments
In addition to those named above, the following individuals contributed to
this report: Samuel L. Hinojosa, Debra L. Picozzi, Katherine M. Davis,
Richard Hung, Geoffrey R. Hamilton, Denise M. Fantone, Kristeen McLain,
Elizabeth H. Curda, Rebecka Derr, Thomas M. Beall, and Leo M. Barbour.
(440165)
GAO's Mission
The General Accounting Office, the audit, evaluation and investigative arm
of Congress, exists to support Congress in meeting its constitutional
responsibilities and to help improve the performance and accountability of
the federal government for the American people. GAO examines the use of
public funds; evaluates federal programs and policies; and provides
analyses, recommendations, and other assistance to help Congress make
informed oversight, policy, and funding decisions. GAO's commitment to
good government is reflected in its core values of accountability,
integrity, and reliability.
Obtaining Copies of GAO Reports and Testimony
The fastest and easiest way to obtain copies of GAO documents at no cost
is through the Internet. GAO's Web site (www.gao.gov) contains abstracts
and fulltext files of current reports and testimony and an expanding
archive of older products. The Web site features a search engine to help
you locate documents using key words and phrases. You can print these
documents in their entirety, including charts and other graphics.
Each day, GAO issues a list of newly released reports, testimony, and
correspondence. GAO posts this list, known as "Today's Reports," on its
Web site daily. The list contains links to the full-text document files.
To have GAO e-mail this list to you every afternoon, go to www.gao.gov and
select "Subscribe to e-mail alerts" under the "Order GAO Products"
heading.
Order by Mail or Phone
The first copy of each printed report is free.
Additional copies are $2 each. A check or money order should be made out
to the Superintendent of Documents. GAO also accepts VISA and Mastercard.
Orders for 100 or more copies mailed to a single address are discounted 25
percent. Orders should be sent to:
U.S. General Accounting Office 441 G Street NW, Room LM Washington, D.C.
20548
To order by Phone: Voice: (202) 512-6000 TDD: (202) 512-2537 Fax: (202)
512-6061
To Report Fraud, Waste, and Abuse in Federal Programs
Contact:
Web site: www.gao.gov/fraudnet/fraudnet.htm
E-mail: [email protected]
Automated answering system: (800) 424-5454 or (202) 512-7470
Public Affairs
Jeff Nelligan, Managing Director, [email protected] (202) 512-4800
U.S. General Accounting Office, 441 G Street NW, Room 7149
Washington, D.C. 20548
*** End of document. ***