Elections: Federal Programs for Accrediting Laboratories That	 
Test Voting Systems Need to Be Better Defined and Implemented	 
(09-SEP-08, GAO-08-770).					 
                                                                 
The 2002 Help America Vote Act (HAVA) created the Election	 
Assistance Commission (EAC) and assigned both it and the National
Institute of Standards and Technology (NIST) responsibilities for
accrediting laboratories that test voting systems. NIST assesses 
a laboratory's technical qualifications and makes recommendations
to EAC, which makes a final accreditation decision. In view of	 
the continuing concerns about voting systems and the important	 
roles that NIST and EAC play in accrediting the laboratories that
test these systems, GAO was asked to determine whether each	 
organization has defined an effective approach for accrediting	 
laboratories that test voting systems and whether each is	 
following its defined approach. To accomplish this, GAO compared 
NIST and EAC policies, guidelines, and procedures against	 
applicable legislation and guidance, and reviewed both agencies' 
efforts to implement them.					 
-------------------------Indexing Terms------------------------- 
REPORTNUM:   GAO-08-770 					        
    ACCNO:   A84074						        
  TITLE:     Elections: Federal Programs for Accrediting Laboratories 
That Test Voting Systems Need to Be Better Defined and		 
Implemented							 
     DATE:   09/09/2008 
  SUBJECT:   Documentation					 
	     Elections						 
	     Eligibility determinations 			 
	     Evaluation criteria				 
	     Federal regulations				 
	     Institution accreditation				 
	     Laboratories					 
	     Operational testing				 
	     Program evaluation 				 
	     Program management 				 
	     Standards						 
	     Systems analysis					 
	     Systems evaluation 				 
	     Systems integrity					 
	     Systems testing					 
	     Technology 					 
	     Testing						 
	     Voting						 
	     NIST National Voluntary Laboratory 		 
	     Accreditation Program				 
                                                                 
	     Electronic Voting System				 

******************************************************************
** This file contains an ASCII representation of the text of a  **
** GAO Product.                                                 **
**                                                              **
** No attempt has been made to display graphic images, although **
** figure captions are reproduced.  Tables are included, but    **
** may not resemble those in the printed version.               **
**                                                              **
** Please see the PDF (Portable Document Format) file, when     **
** available, for a complete electronic file of the printed     **
** document's contents.                                         **
**                                                              **
******************************************************************
GAO-08-770

   

This text file was formatted by the U.S. Government Accountability 
Office (GAO) to be accessible to users with visual impairments, as part 
of a longer term project to improve GAO products' accessibility. Every 
attempt has been made to maintain the structural and data integrity of 
the original printed product. Accessibility features, such as text 
descriptions of tables, consecutively numbered footnotes placed at the 
end of the file, and the text of agency comment letters, are provided 
but may not exactly duplicate the presentation or format of the printed 
version. The portable document format (PDF) file is an exact electronic 
replica of the printed version. We welcome your feedback. Please E-mail 
your comments regarding the contents or accessibility features of this 
document to [email protected]. 

This is a work of the U.S. government and is not subject to copyright 
protection in the United States. It may be reproduced and distributed 
in its entirety without further permission from GAO. Because this work 
may contain copyrighted images or other material, permission from the 
copyright holder may be necessary if you wish to reproduce this 
material separately. 

Report to the Chairman, Committee on House Administration, House of 
Representatives: 

United States Government Accountability Office: 

GAO: 

September 2008: 

Elections: 

Federal Programs for Accrediting Laboratories That Test Voting Systems 
Need to Be Better Defined and Implemented: 

Accreditation of Voting System Testing Laboratories: 

GAO-08-770: 

GAO Highlights: 

Highlights of GAO-08-770, a report to the Chairman, Committee on House 
Administration, House of Representatives. 

Why GAO Did This Study: 

The 2002 Help America Vote Act (HAVA) created the Election Assistance 
Commission (EAC) and assigned both it and the National Institute of 
Standards and Technology (NIST) responsibilities for accrediting 
laboratories that test voting systems. NIST assesses a laboratory's 
technical qualifications and makes recommendations to EAC, which makes 
a final accreditation decision. In view of the continuing concerns 
about voting systems and the important roles that NIST and EAC play in 
accrediting the laboratories that test these systems, GAO was asked to 
determine whether each organization has defined an effective approach 
for accrediting laboratories that test voting systems and whether each 
is following its defined approach. To accomplish this, GAO compared 
NIST and EAC policies, guidelines, and procedures against applicable 
legislation and guidance, and reviewed both agencies' efforts to 
implement them. 

What GAO Found: 

NIST has largely defined and implemented an approach for accrediting 
voting system testing laboratories that incorporates many aspects of an 
effective program. In particular, its approach addresses relevant HAVA 
requirements and reflects relevant laboratory accreditation guidance, 
including standards accepted by the international standards community. 
However, NIST's defined approach does not, for example, cite explicit 
qualifications for the persons who conduct accreditation technical 
assessments, as called for in federal accreditation program guidance. 
Instead, NIST officials said that they rely on individuals who have 
prior experience in reviewing such laboratories. Further, even though 
EAC requires that laboratory accreditation be based on demonstrated 
capabilities to test against the latest voting system standards, NIST's 
defined approach has not always cited these current standards. As a 
result, two of the four laboratories accredited to date were assessed 
using assessment tools that were not linked to the latest standards. 
Moreover, available documentation for the four laboratory assessments 
was not sufficient to determine how the checklists were applied and how 
decisions were reached. According to NIST officials, the four 
laboratories were consistently assessed. In addition, they said that they 
intend to evolve NIST's accreditation approach to, for example, clearly 
provide for sufficient documentation of how accreditation reviews are 
conducted and decisions are reached. However, they had yet to develop 
specific plans for accomplishing this. EAC recently developed a draft 
laboratory accreditation program manual, but this draft manual does not 
adequately define all aspects of an effective approach, and it was not 
used in the four laboratory accreditations performed to date. 
Specifically, while this draft manual addresses relevant HAVA 
requirements, such as the requirement for the commissioners to vote on 
the accreditation of any laboratory that NIST recommends for 
accreditation, it does not include a methodology governing how 
laboratories are to be evaluated or criteria for granting 
accreditation. Because the manual was not approved at the time EAC 
accredited four laboratories, these accreditations were governed by a 
more broadly defined accreditation review process that was described in 
correspondence sent to each laboratory and a related document receipt 
checklist. As a result, these accreditations were based on review steps 
that were not sufficiently defined to permit them to be executed in a 
repeatable manner. According to EAC officials, including the official 
who conducted the accreditation reviews for the four laboratories, 
using the same person to conduct the reviews ensured that the steps 
performed on the first laboratory were repeated on the other three. 
However, because neither the steps nor the results were documented, 
GAO could not verify this. EAC officials stated that they intend to 
evolve the program manual over time and apply it to future 
accreditations and reaccreditations. However, they did not have 
specific plans for accomplishing this. Further, although EAC very 
recently approved an initial version of its program manual, this did 
not occur until after EAC had provided comments on, and GAO had 
finalized, this report. 

What GAO Recommends: 

GAO is making recommendations to NIST and EAC aimed at further defining 
and implementing their respective accreditation programs in a way that 
better ensures that voting system laboratory accreditations are 
performed consistently and are verifiable. NIST and EAC generally 
agreed with the need for their respective programs to continuously 
improve, and both sought clarification on the report's recommendations, 
which GAO has added. 

To view the full product, including the scope and methodology, click on 
[hyperlink,http://www.gao.gov/cgi-bin/getrpt?]. For more information, 
contact Randolph C. Hite at (202) 512-3439 or [email protected]. 

[End of section] 

Contents: 

Letter: 

Results in Brief: 

Background: 

NIST Has Defined and Implemented an Accreditation Approach That 
Reflects Relevant Standards but Is Missing Details Needed for 
Consistent and Verifiable Implementation: 

EAC's Recently Drafted Accreditation Approach and Its Earlier Performed 
Laboratory Accreditations Lack Key Effectiveness Factors and Features: 

Conclusions: 

Recommendations for Executive Action: 

Agency Comments and Our Evaluation: 

Appendix I: Objectives, Scope, and Methodology: 

Appendix II: Comments from the National Institute of Standards and 
Technology: 

Appendix III: Comments from the Election Assistance Commission: 

Appendix IV: GAO Contact and Staff Acknowledgments: 

Tables: 

Table 1: EAC and NIST Responsibilities under HAVA: 

Table 2: Summary of Extent to Which EAC Draft Approach Addresses NIST- 
Identified Accreditation Factors: 

Table 3: Summary of Extent to Which EAC Satisfies Features Common to 
Federal Accreditation Programs: 

Figures: 

Figure 1: A Voting System Life Cycle Model: 

Figure 2: Overall NIST and EAC Accreditation Processes: 

Figure 3: VSTL Accreditation Program Activities: 

Abbreviations: 

EAC: Election Assistance Commission: 

HAVA: Help America Vote Act: 

ISO: International Organization for Standardization: 

NIST: National Institute of Standards and Technology: 

NVLAP: National Voluntary Laboratory Accreditation Program: 

VSTL: voting system testing laboratory: 

VSS: Voting Systems Standards: 

VVSG: Voluntary Voting System Guidelines: 

United States Government Accountability Office: 

Washington, DC 20548: 

September 9, 2008: 

The Honorable Robert A. Brady: 
Chairman: 
Committee on House Administration: 
House of Representatives: 

Dear Mr. Chairman: 

In the wake of the 2000 and 2004 general elections, we issued a series 
of reports and testified[Footnote 1] on virtually every aspect of our 
nation's overall election system, including the many challenges and 
opportunities associated with various types of voting systems. In this 
regard, we emphasized that voting systems alone were neither the sole 
contributor nor solution to the problems that were experienced during 
the 2000 and 2004 elections, and that the overall election system 
depended on the effective interplay of people, process, and technology 
and involved all levels of government. Among many things, we 
specifically reported in 2001[Footnote 2] that no federal entity was 
responsible for accrediting the laboratories that tested voting 
systems, and we raised the establishment of such an entity as a matter 
for congressional consideration. 

Subsequently, Congress passed the Help America Vote Act (HAVA), which 
created the Election Assistance Commission (EAC) and assigned both it 
and the National Institute of Standards and Technology (NIST) separate 
but related responsibilities for accrediting laboratories that test 
voting systems.[Footnote 3] In general, NIST is responsible for 
assessing a laboratory's technical qualifications and making an 
accreditation recommendation to EAC, while EAC is to use the assessment 
results and recommendation, along with its own review of related 
laboratory capabilities, to reach an accreditation decision. NIST and 
EAC established their voting system testing laboratory accreditation 
programs in 2004 and 2007, respectively. To date, EAC has accredited four 
laboratories. In view of the continuing concerns about voting systems 
and the important roles that both NIST and EAC play in accrediting the 
laboratories that test these systems, you asked us to determine whether 
NIST and EAC have each defined an effective laboratory accreditation 
approach and whether each is following its defined approach. 

To accomplish this, we reviewed NIST and EAC policies, guidelines, and 
procedures governing voting system testing laboratory accreditation, 
deaccreditation, and reaccreditation and compared them, as appropriate, 
to applicable statutes, such as HAVA, and to guidance published by NIST, 
the International Organization for Standardization, and us. We then 
compared NIST and EAC actions and artifacts that were used for 
accrediting four voting system testing laboratories to their respective 
policies, guidelines, and procedures. We did not review a fifth 
laboratory because NIST was in the process of assessing it when we 
started our review, and had yet to recommend the laboratory to EAC for 
final accreditation. In addition, we interviewed officials from NIST, 
EAC, and the four laboratories to understand and clarify approaches 
taken, documentation provided, and decisions reached. 

We conducted this performance audit at EAC and NIST offices in 
Washington, D.C., and Gaithersburg, Maryland, respectively, from 
September 2007 to September 2008 in accordance with generally accepted 
government auditing standards. Those standards require that we plan and 
perform the audit to obtain sufficient, appropriate evidence to provide 
a reasonable basis for our findings and conclusions based on our audit 
objectives. We believe that the evidence obtained provides a reasonable 
basis for our findings and conclusions based on our audit objectives. 
Further details of our objectives, scope, and methodology are included 
in appendix I. 

Results in Brief: 

NIST has largely defined and implemented an approach for accrediting 
voting system testing laboratories that incorporates many aspects of an 
effective program. In particular, its approach addresses relevant HAVA 
requirements and reflects relevant laboratory accreditation standards 
that have been accepted by the international community. However, NIST's 
defined approach does not cite explicit qualifications or training 
requirements for accreditation technical assessors, which, according to 
NIST, is a characteristic of an effective accreditation program. 
Instead, the program has relied, in part, on a small and specialized 
field of potential assessors in the voting system arena. Further, until 
recently, NIST's laboratory accreditation guidance cited different 
versions of required standards. As a result, two of the four accredited 
laboratories were assessed using checklists linked to the current 
standards and two were not. Moreover, NIST's documentation of these 
assessments was not sufficient to determine how the checklists were 
applied and how decisions were reached. According to NIST officials, 
the four laboratories were consistently assessed, and they intend to 
ensure that sufficient documentation is produced to show how 
assessments are conducted and decisions are reached. However, they said 
that they do not have a documented plan for accomplishing this. 

EAC recently developed a draft of a voting system testing laboratory 
accreditation program manual, but the draft manual does not adequately 
define all aspects of an effective approach and was not used in 
accrediting the four laboratories. EAC's draft manual addresses 
applicable HAVA requirements, but does not include either a methodology 
governing how laboratories are to be reviewed or certain criteria 
relevant to granting accreditation. Because the draft manual was not 
available for laboratory assessments until recently, EAC instead used a 
more broadly defined accreditation review process contained in 
correspondence with the laboratories and a related checklist that were 
not specific enough to ensure that review steps were executed in a 
repeatable manner. According to EAC officials, using the same person to 
conduct the reviews ensured that the reviews were consistently 
performed. However, because neither the steps nor the results were 
documented, we could not verify this. In addition, EAC officials told 
us that they intend to evolve the program manual over time and apply it 
to future accreditations and reaccreditations; however, they do not 
have a documented plan for accomplishing this.[Footnote 4] 

To assist NIST and EAC in evolving their respective voting system 
testing laboratory accreditation programs, we are making 
recommendations to NIST and EAC to develop and execute plans, with 
specific tasks, milestones, resources, and measures, which are aimed at 
adding consistency and specificity to their defined approaches and 
ensuring that the approaches are fully implemented and documented. 

NIST and EAC provided written comments on a draft of this report, 
signed by the Deputy Director of NIST and the Executive Director of 
EAC, respectively. More specifically, NIST stated that it appreciated 
our careful review of its voting system testing laboratory 
accreditation program and added that it generally concurs with our 
findings that this program must continue to evolve and improve. NIST 
also provided comments that were intended to clarify the current status 
of the program relative to three of our findings. For various reasons 
discussed in the agency comments section of this report, we do not 
believe that these comments affect any of the three findings. 
Therefore, we have not modified the report's presentation of them. In 
light of NIST's recent actions to address one of the findings, we 
updated our report to reflect these actions and have removed the 
associated recommendation from our final report. Further, in order to 
avoid the possibility of any misunderstanding about the actions needed 
to address one other finding, we have slightly modified the 
recommendation associated with it. 

With respect to EAC's comments, the commission described our review and 
report as helpful to its efforts to fully develop and implement its 
voting system testing laboratory accreditation program. 
It also stated that it agrees with the report's conclusions that 
additional written internal procedures, standards, and documentation 
are needed to ensure more consistent and repeatable implementation of 
the program. Further, it stated that it generally accepts our 
recommendations, adding that it will work hard to implement them. 
However, it sought clarification about two of the recommendations. In 
response, we have slightly modified both recommendations to avoid any 
confusion as to their intent. Both the NIST and EAC comments are 
discussed in detail in the agency comments section of this report, and 
are reprinted in their entirety in appendixes II and III, respectively. 

Background: 

All levels of government share responsibility for the overall U.S. 
election system. At the federal level, Congress has authority under the 
Constitution to regulate presidential and congressional elections and 
to enforce prohibitions against specific discriminatory practices in 
all federal, state, and local elections. Congress has passed 
legislation that addresses voter registration, absentee voting, 
accessibility provisions for the elderly and persons with disabilities, 
and prohibitions against discriminatory practices.[Footnote 5] At the 
state level, individual states are responsible for the administration 
of both federal elections and their own elections. States regulate the 
election process, including, for example, the adoption of voluntary 
voting system guidelines, the state certification and acceptance 
testing of voting systems, ballot access, registration procedures, 
absentee voting requirements, the establishment of voting places, the 
provision of election day workers, and the counting and certification 
of the vote. 

In total, the overall U.S. election system can be seen as an assemblage 
of 55 distinct election systems--those of the 50 states, 4 U.S. 
territories, and the District of Columbia. Further, although election 
policy and procedures are legislated primarily at the state level, 
states typically have decentralized election systems, so that the 
details of administering elections are carried out at the city or 
county levels, and voting is done at the local level. As we reported in 
2001,[Footnote 6] local election jurisdictions number more than 10,000, 
and their sizes vary enormously--from a rural county with about 200 
voters to a large urban county, such as Los Angeles County, where the 
total number of registered voters for the 2000 elections exceeded the 
registered voter totals in 41 states. Further, these thousands of 
jurisdictions rely on many different types of voting methods that 
employ a wide range of voting system makes, models, and versions. 
Because of the prominent role played by electronic voting systems, 
testing these systems against national standards is critical to 
ensuring their security and reliability. Equally critical is ensuring 
that the laboratories that perform these tests are competent to carry 
out testing activities. 

The Overall U.S. Election System Depends on Effective Interactions 
among People, Processes, and Technology: 

In the United States today, most votes are cast and counted by 
electronic voting systems, and many states require use of systems that 
have been certified nationally or by state authorities. However, voting 
systems are but one facet of a multifaceted, continuous overall 
election system that involves the interplay of people, processes, and 
technology during the entire life of a system. All levels of 
government, as well as commercial voting system manufacturers and 
system testing laboratories, play key roles in ensuring that voting 
systems perform as intended. 

Electronic voting systems are typically developed by manufacturers, 
then purchased as commercial, off-the-shelf products and operated by 
state and local election administrators. Viewed at a high level, these 
activities make up three phases of a system life cycle: product 
development, acquisition, and operations. (See fig. 1.) Key processes 
that span these life cycle phases include managing the people, 
processes, and technologies within each phase and across phases, and 
testing the systems and components during and at the end of each phase. 
Additionally, voting system standards are important through all of the 
phases because they provide criteria for developing, testing, and 
acquiring voting systems, and they specify the necessary documentation 
for operating the systems. 

Figure 1: A Voting System Life Cycle Model: 

This figure is a horizontal flow chart showing a voting system life 
cycle model. 

[See PDF for image] 

Source: GAO analysis of EAC, NIST, and Institute of Electrical and 
Electronics Engineers (IEEE) publications. 

[End of figure] 

* The product development phase includes activities such as 
establishing requirements for the system, designing a system 
architecture, developing software, and integrating components. 
Activities in this phase are performed by the system vendor. 

* The acquisition phase includes activities such as publishing a 
solicitation, evaluating offers, choosing a voting technology and a 
vendor, and awarding and administering contracts. For voting systems, 
activities in this phase are primarily the responsibility of state and 
local governments but entail some responsibilities that are shared with 
the system vendor (e.g., entering into the contract). 

* The operations phase consists of activities such as ballot design and 
programming, setup of systems before voting, pre-election testing, vote 
capture and counting during elections, recounts and system audits after 
elections, and storage of systems between elections. Responsibility for 
activities in this phase typically resides with local jurisdictions, 
whose officials may, in turn, rely on or obtain assistance from system 
vendors for aspects of these activities. 

* Standards for voting systems, as will be discussed in a later 
section, were developed at the national level by the Federal Election 
Commission in 1990 and 2002 and were updated by EAC in 2005. In the 
product development phase, voting system standards serve as 
requirements to meet for developers to build systems. In the 
acquisition phase, they also provide a framework that state and local 
governments can use to evaluate systems. In the operations phase, they 
specify the necessary documentation for operating the systems. 

* Testing processes are conducted throughout the life cycle of a voting 
system. Voting system vendors conduct product testing during 
development of the system and its components. Federal certification 
testing of products submitted by system vendors is conducted by 
national voting system testing laboratories (VSTL). States may conduct 
evaluation testing before acquiring a system to determine how well 
products meet their state-specific specifications, or they may conduct 
certification testing to ensure that a system performs its functions as 
specified by state laws and requirements. Once a voting system is 
delivered by the system vendor, states and local jurisdictions may 
conduct acceptance testing to ensure that the system satisfies 
functional requirements. Finally, local jurisdictions typically conduct 
logic and accuracy tests related to each election and sometimes subject 
portions of the system to parallel testing during each election to 
ensure that the system components perform accurately. 

* Management processes ensure that each life cycle phase produces a 
desirable outcome. Typical management activities that span the system 
life cycle include planning, configuration management, system 
performance review and evaluation, problem tracking and correction, 
human capital management, and user training. These activities are 
conducted by the responsible parties in each life cycle phase. 

In 2004, we reported[Footnote 7] that the performance of electronic 
voting systems, like any type of automated information system, can be 
judged on several bases, including their security, accuracy, ease of 
use, efficiency, and cost. We also reported that voting system 
performance depends on how the system was designed, developed, and 
implemented. 

Laboratory Accreditation Plays an Important Role in Ensuring Accurate, 
Reliable, and Secure Voting Systems: 

Since the passage of HAVA, the use of electronic voting systems has 
increased and become the predominant method of voting. However, 
concerns have been raised about the security and reliability of these 
systems. As we have previously reported,[Footnote 8] testing and 
certifying voting systems is one critical step in acquiring, deploying, 
operating, and administering voting systems, which better ensures that 
they perform securely and reliably. Among other things, rigorous 
execution and careful documentation of system testing is a proven way 
to help ensure that system problems are found before the systems are 
deployed and used in an election. To accomplish this, it is vital that 
the organizations that test the systems be qualified and competent to 
do so. For voting systems, a key testing organization is a federally 
accredited, national VSTL. 

In general, accreditation is the formal recognition that a laboratory 
is competent to carry out specific types of tests or calibrations. 
Federally accredited laboratories perform many different types of 
testing and related activities on various products, ranging from 
inspecting grain to certifying maritime cargo gear. Laboratory 
accreditation programs arose largely from agencies' need to assure 
themselves of the competency of the organizations responsible 
for testing products or services that involve the use of federal funds. 

To provide national recognition for competent laboratories, the NIST 
Director established the National Voluntary Laboratory Accreditation 
Program (NVLAP) in 1976 at the request of the private sector. Under 
this program, which is based on internationally accepted standards, 
NIST accredits laboratories that it finds competent to perform specific 
types of tests or calibrations. In June 2004, NVLAP announced the 
establishment, in accordance with HAVA, of an accreditation program for 
laboratories that test voting systems using standards determined by 
EAC. 

HAVA Assigned EAC and NIST Responsibility for Accrediting VSTLs: 

Enacted in October 2002, HAVA affected nearly every aspect of the 
voting process, from voting technology to provisional ballots and from 
voter registration to poll worker training.[Footnote 9] In particular, 
the act authorized $3.86 billion in funding over several fiscal years 
to replace punch card and mechanical lever voting equipment, improve 
election administration and accessibility, train poll workers, and 
perform research and pilot studies. HAVA also established EAC, provided 
for the appointment of four commissioners, and specified the process 
for selecting an executive director. Generally speaking, EAC is to 
assist in the administration of federal elections and provide 
assistance in administering certain federal election laws and programs. 

Since the passage of HAVA in 2002, the federal government has taken 
steps to implement the act's provisions. For example, after beginning 
operations in January 2004, EAC updated the existing federal voluntary 
standards for voting systems, including strengthening provisions 
related to security and reliability. Additionally, EAC established an 
interim VSTL accreditation program that leveraged a predecessor program 
run by the National Association of State Election Directors, and EAC 
and NIST then established companion accreditation programs that 
replaced the interim program. 

EAC Updated the Federal Voluntary Standards for Voting Systems: 

Federal standards for voting systems were first issued by the Federal 
Election Commission in 1990.[Footnote 10] These federal standards 
identified minimum functional and performance requirements for 
electronic voting equipment, which states were free to adopt in whole, 
in part, or not at all, and specified test procedures 
to ensure that the equipment met those requirements. In 2002, the 
Federal Election Commission issued its Voting System Standards (VSS), 
which updated the 1990 standards to reflect more modern voting system 
technologies. In 2005, we reported[Footnote 11] that these standards 
identified minimum functional and performance requirements for voting 
systems but were not sufficient to ensure secure and reliable voting 
systems. As a result, we recommended that EAC work to define specific 
tasks, measurable outcomes, milestones, and resource needs to improve 
the voting system standards. Until such improvements were made, 
election administrators were 
at risk of relying on voting systems that were not developed, acquired, 
tested, operated, or managed in accordance with rigorous security and 
reliability standards--potentially affecting the reliability of future 
elections and voter confidence in the accuracy of the vote count. 

Following the enactment of HAVA in 2002 and the establishment of EAC in 
2004, EAC adopted the Voluntary Voting System Guidelines (VVSG) in 
2005.[Footnote 12] The VVSG specify the functional requirements, 
performance characteristics, documentation requirements, and test 
evaluation criteria for the national certification of voting systems. 
Accredited testing laboratories are to use the VVSG to develop test 
plans and procedures for the analysis and testing of systems in support 
of EAC's voting system certification program.[Footnote 13] The VVSG are 
also used by voting system manufacturers as the basis for designing and 
deploying systems that can be federally certified. 

EAC Established an Interim Accreditation Program: 

We reported in 2001[Footnote 14] that the National Association of State 
Election Directors was accrediting independent test authorities to 
test voting equipment against the Federal Election Commission 
standards. Under this program, three laboratories were accredited. 
Under HAVA, NIST is to recommend laboratories for EAC accreditation. In 
2006, NIST notified EAC that its initial recommendations might not be 
available until sometime in 2007. As a result, EAC initiated an interim 
accreditation program and invited the three laboratories accredited by 
the state election directors to apply.[Footnote 15] As part of the 
interim program, laboratories were required to attest to a set of EAC- 
required conditions and practices, including certifying the integrity 
of personnel, the absence of conflicts of interest, and the financial 
stability of the laboratory. In August and September 2006, EAC granted 
interim accreditation to two of the three laboratories invited to 
apply. EAC terminated its interim program in March 2007. 

EAC and NIST Have Established Separate but Related Laboratory 
Accreditation Programs: 

HAVA assigned responsibilities for laboratory accreditation to both EAC 
and NIST. In general, to reach an accreditation decision, NIST is to 
focus on assessing laboratory technical qualifications, while EAC is to 
use those assessment results and recommendations and augment them with 
its own review of related laboratory capabilities. See table 1 for the 
two agencies' HAVA responsibilities. 

Table 1: EAC and NIST Responsibilities under HAVA: 

HAVA responsibility: Provide for the testing, certification, 
decertification, and recertification of voting system hardware and 
software by accredited laboratories; 
Responsible entity: EAC: X; 
Responsible entity: NIST: [Empty]. 

HAVA responsibility: Conduct evaluations of independent, nonfederal 
laboratories and submit to EAC a list of those laboratories proposed 
for accreditation; 
Responsible entity: EAC: [Empty]; 
Responsible entity: NIST: X. 

HAVA responsibility: Vote on the accreditation of any laboratory, 
taking into consideration the NIST recommendation for it. No laboratory 
may be accredited unless its accreditation is approved by a vote of 
EAC; 
Responsible entity: EAC: X; 
Responsible entity: NIST: [Empty]. 

HAVA responsibility: Publish an explanation for the accreditation of 
any laboratory not included on the list submitted for recommendation by 
NIST; 
Responsible entity: EAC: X; 
Responsible entity: NIST: [Empty]. 

HAVA responsibility: Monitor and review the performance of laboratories 
accredited by EAC and make recommendations to EAC with respect to the 
continuing accreditation of laboratories, including recommendations to 
revoke the accreditation of laboratories; 
Responsible entity: EAC: [Empty]; 
Responsible entity: NIST: X. 

HAVA responsibility: Revoke the accreditation of a laboratory only when 
approved by a vote of EAC; 
Responsible entity: EAC: X; 
Responsible entity: NIST: [Empty]. 

Source: GAO analysis of United States Code. 

[End of table] 

The tasks that NIST is to perform in meeting HAVA's requirements are 
addressed in an annual interagency agreement between the institute and 
EAC. For example, the 2008 interagency 
agreement states that NVLAP will continue to assess VSTLs and will 
coordinate with EAC to continually monitor and review the performance 
of the laboratories. Additionally, the agreement states that the two 
agencies will coordinate to maintain continuity between their 
respective accreditation programs. 

The NIST and EAC accreditation programs can be viewed together as 
forming a federal VSTL accreditation process that consists of a series 
of 12 complementary steps. These steps are depicted in figure 2, where 
the numbers correspond to a detailed narrative description below. 

Figure 2: Overall NIST and EAC Accreditation Processes: 

This figure is a flowchart of overall NIST and EAC accreditation 
processes. 

[See PDF for image] 

Source: GAO analysis. 

[End of figure] 

1. Laboratory Application to NIST: 

The accreditation process begins when a laboratory submits a completed 
application to NIST, along with administrative information about the 
laboratory, the scope of accreditation being applied for, and an 
agreement to the conditions of accreditation (i.e., practices that must 
be followed to obtain and maintain accreditation). In addition, the 
laboratory submits documentation that supports the application, 
including the laboratory's quality control manual. 

2. NIST Preassessment: 

Using the application and supporting documentation, NIST conducts a 
preassessment review. Among other things, this review includes 
comparing the quality control manual against the requirements in NIST 
accreditation program guidance. If deficiencies in the documentation 
are found, NIST requests corrections to satisfy program requirements. 

3. NIST On-site Assessment: 

Following satisfactory completion of the preassessment review, a NIST 
team visits the laboratory facilities to conduct an on-site assessment. 
This assessment includes staff interviews, reviews of laboratory 
records and audit reports, and demonstrations of staff competence to 
execute planned test methods and procedures. It concludes with the NIST 
team presenting its findings to laboratory management. While conducting 
the on-site assessment, the team records its observations and comments. 

4. NIST Nonconformity Resolution: 

NIST prepares a final report, including a list of any nonconformities, 
and provides it to the laboratory. The laboratory has 30 days to 
respond as to how it will address the areas of nonconformity. NIST 
evaluates the laboratory's response and determines whether the 
nonconformities have been sufficiently addressed. If so, NIST renders 
an accreditation decision. If not, NIST may contact the laboratory for 
additional information or may deny accreditation. If a laboratory is 
denied accreditation, it may reapply to NIST. 

5. NIST Accreditation Decision: 

When a laboratory has no areas of nonconformity, the voting systems 
program manager makes an accreditation recommendation to the Chief of 
NVLAP, who is responsible for all NVLAP accreditation decisions and 
issues all NVLAP accreditation certificates. 

6. NIST Recommendation to EAC: 

In addition to granting the NVLAP accreditation, the Chief provides a 
recommendation to the Director of NIST. The recommendation is reviewed 
by NIST's general counsel, and then a letter is sent to EAC that 
recommends the laboratory for accreditation as a VSTL in accordance 
with HAVA. 

7. Accreditation Application to EAC: 

After receiving the NIST recommendation, EAC sends the recommended 
laboratory an invitation to apply to the EAC accreditation program. In 
the letter, EAC specifies a list of information and documentation that 
the laboratory must provide. 

8. EAC Accreditation Review: 

The laboratory submits an application and supporting information to 
EAC. EAC staff review the application package for completeness. In 
addition, staff review the supporting materials against accreditation 
program requirements. During the course of the review, staff may 
contact the laboratory to clarify the information provided or to inform 
the laboratory of requirements that are not sufficiently addressed. 

9. EAC Nonconformity Resolution: 

EAC submits correspondence (generally through e-mail) to the laboratory 
identifying areas of nonconformity. The laboratory then provides the 
missing and/or clarifying documentation. EAC staff determine whether the 
provided information adequately addresses the nonconformity issues, 
contacting the laboratory as needed. 

10. Recommendation to EAC Commissioners: 

The EAC accreditation program director, through the EAC executive 
director, makes a recommendation to the EAC commissioners as to whether 
the laboratory should be accredited by EAC. Along with the 
recommendation, the program director provides the review results, as 
well as laboratory-provided materials. 

11. Vote by EAC Commissioners: 

The commissioners review the material provided and may request 
additional clarification, as needed. At a public meeting, the 
commissioners vote on whether to accredit the laboratory. Should the 
EAC commissioners vote to deny the accreditation, the laboratory must 
wait for EAC to invite the laboratory to reapply. 

12. EAC Accreditation Granted: 

When the commissioners vote to accredit a laboratory, EAC's executive 
director issues an accreditation certificate identifying the scope and 
effective dates of the VSTL accreditation. In addition, the program 
director makes information about the laboratory's accreditation 
publicly available via the EAC Web site. At this point, the laboratory 
is authorized to operate as a VSTL under EAC's testing and 
certification program. 

Postaccreditation Monitoring Activities: 

Once a laboratory has been accredited, both NIST and EAC are to monitor 
its compliance with the terms of its accreditation. In doing so, NVLAP 
staff may visit a laboratory at any time, whether for cause or on a 
random selection basis, and these visits can be either scheduled in 
advance with the laboratory or unannounced. If a laboratory is found 
not to be in compliance, its accreditation may be either 
suspended[Footnote 16] or revoked, depending on the nature of the 
issues involved. A suspension provides the opportunity for the 
laboratory to address the identified issues. EAC also monitors the 
procedures and practices of accredited laboratories through 
documentation reviews and visits. If a VSTL is unable to remedy 
identified compliance issues, the EAC program director can propose that 
the accreditation be suspended and ultimately revoked. As provided for 
under HAVA, the EAC commissioners would vote on any proposed 
revocation. 

Status of Completed VSTL Accreditation Activities: 

As of May 2008, EAC had accredited four laboratories: SysTest Labs, 
LLC; Wyle Laboratories, Inc.; iBeta Quality Assurance; and InfoGard 
Laboratories, Inc. A fifth laboratory, CIBER Inc., has been granted 
NVLAP accreditation and has been recommended to, but not yet accredited 
by, EAC. InfoGard Laboratories, Inc., whose NVLAP accreditation expires 
in June 2008, recently notified NIST and EAC that it would not apply to 
renew its 
accreditation, citing the volatility of the voting system environment 
as one reason. The timeline for each of these accreditations, and other 
accreditation program activities, is found in figure 3. 

Figure 3: VSTL Accreditation Program Activities: 

This figure is a chart depicting VSTL accreditation program activities. 

[See PDF for image] 

Source: GAO, based on NIST- and EAC-provided data. 

[End of figure] 

NIST Has Defined and Implemented an Accreditation Approach That 
Reflects Relevant Standards but Is Missing Details Needed for 
Consistent and Verifiable Implementation: 

NIST's defined approach to accrediting voting system laboratories 
largely reflects applicable HAVA requirements and relevant 
international standards, both of which are necessary to an effective 
program. However, this approach is continuing to evolve based on issues 
identified during NIST's implementation experience to date. In 
particular, because NIST's defined program does not, for example, 
specify the nature and extent of assessment documentation to be 
generated and retained or the version of the voting system standards to be 
used, our analysis of NIST's efforts in accrediting four laboratories 
could not confirm that the agency has consistently followed its defined 
accreditation program. NIST officials stated that these limitations are 
due in part to the relative newness of the program and that they will 
be addressed by updating the accreditation program handbook. However, 
they said that they do not have documented plans to accomplish this. 
Until these limitations are addressed, NIST will be challenged in 
accrediting voting system laboratories in a consistent and verifiable 
manner. 

NIST Voting System Accreditation Program Reflects HAVA Requirements: 

NIST has defined its voting system accreditation program to address 
relevant HAVA requirements. According to HAVA, NIST is to: 

* conduct reviews of independent, nonfederal voting system testing 
laboratories and submit to EAC a list of proposed voting system testing 
laboratories and: 

* monitor and review the performance of those proposed laboratories 
that EAC accredits, including making recommendations to EAC regarding 
accreditation continuance and revocation. 

NIST's defined voting system accreditation program satisfies both of 
these requirements. With respect to the first, NIST announced in June 
2004 the establishment of its voting system testing laboratory 
accreditation program as part of NVLAP, a statutorily created program 
for unbiased, third parties to establish the competence of national 
independent laboratories. As such, NIST adopted its NVLAP 
handbook[Footnote 17] as the basis for its defined approach to 
reviewing VSTLs and has supplemented it with a handbook that is 
specific to voting system testing.[Footnote 18] 

With respect to the second HAVA requirement, the supplemental handbook 
cited above states that the NIST Director will recommend NVLAP- 
accredited VSTLs to EAC for subsequent commission accreditation. 
Additionally, NIST's handbooks provide for both monitoring accredited 
laboratories and for making recommendations regarding a laboratory's 
continued accreditation. For example, the handbook states that a 
monitoring visit may occur at both scheduled and unscheduled times and 
the scope may be limited to a few items or include a full review. It 
also states that a reaccreditation review shall be conducted in 
accordance with the procedures used to initially accredit laboratories. 
Further, the handbook identifies accreditation or reaccreditation 
decision options, including granting, denying, or modifying the scope 
of an accreditation. 

According to NIST officials, these HAVA requirements are relevant and 
important to defining an effective voting system testing laboratory 
accreditation program. By incorporating them, NIST has reflected one 
key aspect of an effectively defined program. 

NIST Has Incorporated Relevant International Accreditation Standards 
into Its VSTL Accreditation Program: 

NIST's VSTL accreditation program reflects internationally recognized 
standards for establishing and conducting accreditation activities. 
These standards are published by the International Organization for 
Standardization (ISO), and the two that are germane to this 
accreditation program are (1) ISO/IEC 17011,[Footnote 19] which 
establishes general requirements for accreditation bodies, and (2) 
ISO/IEC 17025,[Footnote 20] which establishes the general requirements 
for reviewing the competence of laboratories. According to NIST program 
documentation, this allows NVLAP both to operate as an unbiased, 
third-party accreditation body and to utilize a quality management system 
compliant with international standards. As a result, NIST has 
incorporated key aspects of an effective accreditation body into its 
voting system accreditation program. 

NIST Program Meets ISO Accreditation Body Requirements: 

ISO/IEC 17011 requires that an accrediting body have, among other 
things, (1) a management system for accreditation activities, (2) a 
policy defining the types of records to be retained and how those 
records will be maintained, (3) a clear description of the 
accreditation process that covers the rights and responsibilities of 
those seeking accreditation, and (4) a clear description of the 
accreditation activities to be performed. 

NIST VSTL accreditation program-related documentation, including its 
program handbooks, satisfies each of these requirements. In fact, NIST 
has cross-referenced its documentation to each ISO/IEC 17011 
requirement. Specifically, the first requirement is cross-referenced to 
the NVLAP Management System Manual,[Footnote 21] which describes the 
overall accreditation program's management policies and control 
structure, and the second is cross-referenced to the program's record 
keeping policy, which specifies what types of records should be 
maintained and how they should be maintained. The third and fourth 
requirements are cross-referenced to the accreditation process 
descriptions in both the Management System Manual and the general 
handbook. Together, these documents contain, for example, (1) the 
rights of laboratories applying for accreditation and (2) the scope of 
accreditation activities to be performed, including a preassessment 
review, an on-site review, and a final on-site assessment report. 

NIST Program Meets ISO Laboratory Accreditation Review Requirements: 

ISO/IEC 17025 requires that accreditation reviews cover specific 
topics. These include (1) laboratory personnel independence and 
conflicts of interest; (2) a laboratory system for quality control 
(i.e., a framework for producing reliable results and continuous 
improvement to laboratory procedures); and (3) a laboratory mechanism 
for collecting and responding to customer complaints. Additionally, the 
standard establishes basic technical requirements that a laboratory has 
to meet, and thus that reviews are to cover, including (1) competent 
laboratory personnel who are capable of executing the planned tests, 
(2) appropriate tests and test methods, and (3) clear and accurate test 
result documentation. 

NIST voting system testing laboratory accreditation program-related 
documents, including its program handbooks, satisfy these requirements. 
First, the general handbook defines the requirement for a laboratory to 
have personnel that are independent and free of any conflict of 
interest. Second, the handbook requires that a laboratory have a 
management quality control system and that this system provide for 
reliable results and continuous improvement to laboratory procedures. 
Third, the handbook requires that a laboratory have a mechanism for 
receiving and responding to customer complaints. Last, the handbook 
establishes certain technical requirements that a laboratory must meet, 
such as having competent laboratory personnel capable of executing the 
planned tests, using appropriate tests and test methods, and 
documenting test results in a clear and accurate manner. 

For several of these requirements, NIST's voting-specific supplemental 
handbook augments the general handbook. For example, this supplemental 
handbook requires laboratories to submit a quality control manual, as 
well as information to demonstrate the competence of laboratory 
administrative and technical staff. Further, it requires that a 
laboratory's training program be updated so that staff can be retrained 
as new versions of voting system standards are issued. 

NIST Voting System Accreditation Program Does Not Reflect Its Own 
Findings on the Need for Assessor Qualifications and Training and Key 
Assessment Criteria: 

NIST has reported on the importance of ensuring that those persons who 
perform accreditation assessments are sufficiently qualified and that 
the assessments themselves are based on explicitly defined criteria and 
are adequately documented. Nevertheless, NIST has not fully reflected 
key aspects of these findings in its defined approach to accrediting 
voting system testing laboratories. For example, it has not specified 
the basis for determining the qualifications of its accreditation 
assessors, and while a draft update to its handbook now includes the 
specific voting system standards to be used when performing an 
accreditation assessment, this handbook was only recently approved. 
According to NIST officials, these gaps are due to the newness of the 
accreditation program and will be addressed in the near future. Because 
these gaps have confused laboratories as to what standards they were to 
meet, and may have resulted in differences in how accreditations have 
been performed to date, it is important that the gaps be addressed. 

NIST Has Not Specified Requirements for Assessor Qualifications and 
Training: 

NIST has reported[Footnote 22] on the importance of having competent 
and qualified human resources to support accreditation programs. 
According to these findings, an accreditation program should, among 
other things, provide for: 

* having experienced and qualified assessors to perform accreditation 
activities; 

* demonstrating an assessor's qualifications using defined 
documentation and explicit criteria that encompass the person's 
education, experience, and training; and: 

* training (initial and continuing) for assessors. 

NIST's defined approach to VSTL accreditation does not provide for all 
these requirements. To NIST's credit, its program handbook identifies the 
need for experienced and qualified assessors in the execution of 
accreditation activities and provides for each assessor's 
qualifications to be documented. Further, it has defined generic 
training that applies to all of its accreditation assessors. For 
example, the NVLAP Assessor Training Syllabus includes training on ISO/ 
IEC 17011 and 17025, as well as training on the NVLAP general handbook. 
In addition, the VSTL accreditation program manager stated that new 
assessors receive training on the 2002 VSS and 2005 VVSG and that 
periodic training seminars are provided to assessors on changes to 
either the general handbook or the 2005 VVSG. 

In addition, the program manager told us that candidate assessors must 
submit some form of documentation (e.g., a resume), and that this 
documentation is used to evaluate, rank, and select candidates that are 
best qualified. The NIST VSTL assessors that we interviewed confirmed 
that they were required to submit such documentation at NIST's request. 

However, NIST's defined approach does not cite the explicit 
capabilities and qualifications that an assessor must meet or the 
associated documentation needed to demonstrate these capabilities and 
qualifications. According to the program manager, this is because the 
field of potential assessors in the voting system arena is small and 
specialized and because they focused on defining other aspects of the 
program that were higher priorities. Further, NIST has not defined and 
documented the specific training requirements needed to be a VSTL lead 
assessor or a technical assessor for the VSTL program. According to the 
program manager, this is because these assessors receive all the 
training they need by working on the job with more experienced 
assessors. Not specifying criteria governing assessor qualifications 
and training is of concern because differences in assessors' 
capabilities could cause inconsistencies in how assessments are 
performed. 

NIST's Approach Does Not Fully Specify Criteria for Evaluating and 
Documenting VSTL Capabilities: 

NIST recognizes the importance of specifying explicit criteria against 
which all candidate laboratories will be assessed and fully documenting 
the assessments that are performed. Specifically, the general handbook 
provides the criteria and requirements that will be used to evaluate 
basic laboratory capabilities. It also states that technical 
requirements specific to a given field of accreditation are published 
in program-specific handbooks. To that end, NIST published a 
supplemental program-specific handbook in December 2005 that provided 
the voting-specific requirements to be used to evaluate VSTLs, 
additional guidance, and related interpretive information.[Footnote 23] 

NIST's 2005 supplemental handbook does not contain sufficient criteria 
against which to evaluate VSTLs. It identifies specific requirements 
that laboratories are to demonstrate relative to the 2002 VSS but not 
the 2005 VVSG. For example, the handbook states that laboratories are 
expected to develop, validate, and document test methods that meet the 
2002 VSS. However, it does not refer to the 2005 VVSG. In addition, the 
program-specific checklist that accompanies this version of the 
handbook does not identify all the 2005 VVSG standards against which 
laboratories are evaluated. Specifically, this checklist references 
the 2005 VVSG for only a few of its requirements. 

According to the NIST program manager, the 2005 handbook did not refer 
to the 2005 VVSG requirements because only the 2002 VSS requirements 
were mandatory at the time it was published. He further stated that, 
despite the fact that the 2005 VVSG requirements were not included in 
that handbook, NIST assessors were expected to use them when performing 
the first laboratory assessments. Representatives for two laboratories 
stated that because these requirements were not documented or 
identified in the NIST handbooks, they did not learn that they would be 
required to demonstrate 2005 VVSG-based capabilities until the NIST on- 
site assessment teams arrived. 

In December 2007, NIST released draft revisions of the voting program- 
specific handbook and checklist, stating that laboratories are expected 
to meet both the 2002 VSS and the 2005 VVSG. In addition, the 2007 
draft handbook 
clearly specifies that laboratories must demonstrate how developed test 
methods and planned tests trace back to and satisfy both the 2002 VSS 
and the 2005 VVSG. Taken together, the new handbook and checklist 
should better identify the requirements and criteria used to evaluate a 
laboratory and document the results. According to NIST, the new 
handbook and checklist have recently been finalized, and both are now 
in use.[Footnote 24] 

Available Documentation Does Not Show That NIST Has Consistently 
Followed All Aspects of Its Defined Accreditation Approach: 

NIST has found that reliable and accurate documentation provides 
assurance that laboratory accreditation activities have been 
effectively fulfilled.[Footnote 25] However, for the four VSTLs that 
NIST has accredited to date, documentation of the assessments does not 
show that NIST fully followed its defined accreditation approach. While 
we could not determine whether this is due to incomplete documentation 
of the steps performed and the decisions made during an assessment or 
due to steps not being performed as defined, this absence of verifiable 
evidence raises questions about the consistency of the assessments and 
the resultant accreditations. Without adequately documenting each 
assessment, including all steps performed and the basis for any steps 
not performed, such questions may continue to be raised. 

To NIST's credit, available documentation shows that it consistently 
followed some aspects of its defined approach in accrediting the four 
laboratories. For example, we verified that NIST received an 
application from each of the laboratories as required, and our review 
of completed checklists and summary reports shows that preassessment 
reviews and on-site assessments were performed for each laboratory, as 
was required. According to a lead assessor, the preassessment review 
usually focused on the laboratories' quality assurance manuals. 
Moreover, the completed checklists identified, for each listed 
requirement, whether it was met and, in some cases, included comments 
on how a laboratory addressed it. Also as required, NIST received 
laboratory responses describing how unmet requirements were addressed 
within specified time frames, used the responses in making 
accreditation decisions, and notified EAC of its decisions via letters 
of recommendation. Furthermore, NIST has recently begun reaccreditation 
reviews at two laboratories, as required. 

However, documentation does not show that NIST has consistently 
followed other aspects of its defined approach. Our analysis of the 
checklists that are to be used to both guide and document a given 
assessment, including identifying unmet requirements and capturing 
assessor comments and observations, shows some differences. For 
example: 

* One type of checklist (the supplemental handbook checklist) was 
prepared for only two of the four laboratory assessments. According to 
the program manager, this is because even though a draft revision of 
this checklist was actually used to assess the other two laboratories, 
the assessment results were recorded on a different checklist (the 
general handbook checklist). While this is indicated on one of the two 
checklists, it is not indicated on the other. 

* On the checklist used for one laboratory, an assessor marked several 
sections as "TA" with no explanation as to what this means. Also, the 
checklist used for another laboratory did not identify whether most of 
the requirements were met or not met. Further, the checklist for a 
third laboratory had one section marked as "not applicable" but 
included no explanation as to why that section did not apply, while the 
checklist for a different laboratory marked the same section as "not 
applicable" but included a reason for doing so. 

Notwithstanding these differences, the program manager told us that 
each laboratory was assessed using the same requirements and all 
assessments to date were performed in a consistent manner. On the basis 
of available documentation, however, we could not verify that this is 
the case. As a result, it is not clear that NIST has consistently 
followed its defined approach. 

Available documentation also does not show that NIST followed other 
aspects of its approach. For example: 

* The program handbook states that each laboratory is to identify the 
requested scope of accreditation in its application package. However, 
our analysis of the four application packages shows that two 
laboratories did not specify a requested scope of accreditation. 
According to the program manager, the scope of accreditation for all 
laboratories was the 2002 VSS and 2005 VVSG because, even though the 
latter standards were not yet in effect at the time, they were 
anticipated to be in effect in the near future.[Footnote 26] However, 
NIST did not have documentation that notified the laboratories of this 
scope of accreditation or that indicated whether this scope was 
established by EAC, NIST, or the laboratories. 

* The program handbook states that after receiving a laboratory's 
application package, NIST will acknowledge its receipt in writing and 
will inform the laboratory of the next steps in the accreditation 
process. However, NIST did not have documentation demonstrating that 
this was done. According to the program manager, this was handled via 
telephone conversations. However, representatives for several 
laboratories noted that these calls did not clearly establish 
expectations, adding that some expectations were not communicated until 
the NIST team assessors arrived to conduct the on-site assessment. 

The program manager stated that these deviations from the defined 
approach are attributable to the relative newness of the program and 
that, despite these discrepancies, each laboratory was assessed 
consistently. 
However, we could not verify this, and thus it is not clear that NIST 
has consistently followed its defined approach. According to this 
official, future versions of the program handbook would address these 
limitations. However, documented plans for doing so have not been 
developed. 

EAC's Recently Drafted Accreditation Approach and Its Earlier Performed 
Laboratory Accreditations Lack Key Effectiveness Factors and Features: 

EAC has recently defined its voting system laboratory accreditation 
approach in a draft program manual. However, this draft manual omits 
important content. While addressing relevant HAVA requirements, the 
draft manual does not adequately define key accreditation factors that 
NIST has identified or a key accreditation feature that we have 
previously reported as integral to an effective accreditation program. 
Moreover, not all factors and features that the draft manual 
does include have been defined to a level that would ensure thorough, 
consistent, and verifiable implementation. Because this manual was not 
available for EAC to use on the four laboratory accreditations that it 
has completed, the accreditations were performed using a largely 
undocumented series of steps. As a result, the thoroughness and 
consistency of these accreditations are not clear. According to EAC 
officials, these gaps are due to the agency's limited resources being 
focused on other issues, and will be addressed as its accreditation 
program evolves. However, they said that they do not yet have 
documented plans to accomplish this. Until EAC fully defines a 
repeatable VSTL accreditation approach, it will be challenged in its 
ability to treat all laboratories consistently and produce verifiable 
results. 

EAC Has Defined a Draft Accreditation Approach that Meets HAVA 
Requirements: 

In February 2008, EAC issued a draft version of a VSTL accreditation 
program manual[Footnote 27] for public comment. According to HAVA, 
EAC's accreditation program is to meet certain requirements. 
Specifically, it is to provide for voting system hardware and software 
testing, certification, decertification, and recertification by 
accredited laboratories. Additionally, it is to base laboratory 
accreditation decisions, including decisions to revoke an 
accreditation, on a vote of the commissioners, and it is to provide for 
a published explanation of any commission decision to accredit any 
laboratory that was not first recommended for accreditation by NIST. 

To EAC's credit, its draft accreditation program manual addresses each 
of these requirements. First, the manual defines the role that the 
laboratories are to play relative to voting system testing, 
certification, recertification, and decertification, and it incorporates 
by reference an EAC companion voting system certification 
manual[Footnote 28] that defines requirements and process steps for 
voting system testing and certification-related activities. 

With respect to the remaining three HAVA requirements, the draft EAC 
accreditation manual also requires (1) that the commissioners vote on 
the accreditation of laboratories recommended by NIST for 
accreditation, (2) that EAC publish an explanation for the 
accreditation of any laboratory not recommended by NIST for 
accreditation, and (3) that the commissioners vote on the proposed 
revocation of a laboratory's accreditation. 

According to EAC officials, its draft approach incorporates HAVA 
requirements because the commission is focused on meeting its legal 
obligations in all aspects of its operations, including VSTL 
accreditation. In doing so, EAC has addressed one important aspect of 
having an effective accreditation program. 

EAC Draft Approach Does Not Adequately Define Key Accreditation-Related 
Steps and Decision Criteria: 

Beyond addressing relevant HAVA requirements, EAC's draft accreditation 
manual defines an accreditation process, including program phases, 
requirements, and certain evaluation criteria. However, it does not do 
so in a manner that fully satisfies factors that NIST has reported can 
affect the effectiveness of accreditation programs.[Footnote 29] 
Moreover, it does not adequately address a set of features that our 
research shows are common to federal accreditation programs[Footnote 
30] and that can influence a program's effectiveness. According to EAC 
officials, these factors and features are not fully addressed in the 
draft program manual because its accreditation program is still in its 
early stages of development and is still evolving. Until they are fully 
addressed, EAC's accreditation program's effectiveness will be limited. 

Key NIST Accreditation Factors Not Fully Addressed: 

According to NIST, having confidence in and ensuring appropriate use of 
an accredited testing laboratory requires that accreditation 
stakeholders have an adequate understanding of the accreditation 
process, scope, and related criteria. NIST further reports that 
confidence in the accreditation process can be traced to a number of 
factors that will influence the thoroughness and competence of 
accreditation programs, and thus these factors can be viewed as 
essential accreditation program characteristics. They include having: 

* published procedures governing how the accreditation program is to be 
executed, such as procedures for granting, maintaining, modifying, 
suspending, and withdrawing accreditation; 

* specific instructions, steps, and criteria for those who conduct an 
accreditation assessment (assessors) to follow, such as a test 
methodology that is acceptable to the accreditation program; 

* knowledgeable and experienced assessors to execute the instructions 
and steps and apply the related criteria; and: 

* complete records on the data collected, results found, and reports 
prepared relative to each assessment performed. 

EAC's draft accreditation program manual addresses one of these factors 
but it does not fully address the other three. (See table 2.) For 
example, while the manual requires that EAC maintain records, it only 
addresses the retention of records associated with the testing of 
voting systems and not those associated with the accreditation of 
laboratories. EAC officials told us that testing records are meant to 
include accreditation records, although they added that this is not 
explicit in the manual and needs to be clarified. Further, the manual 
is silent on the steps to be followed and criteria to be applied in 
reviewing a laboratory's application and the qualifications required 
for accreditation reviewers. By not fully addressing these factors, EAC 
increases the risk that its accreditation reviews will not be performed 
consistently and comprehensively. 

Table 2: Summary of Extent to Which EAC Draft Approach Addresses NIST- 
Identified Accreditation Factors: 

Accreditation program factor: Published accreditation program 
procedures; Addressed by EAC?: Yes. 

Accreditation program factor: Specific accreditation instructions; 
Addressed by EAC?: No. 

Accreditation program factor: Established accreditation personnel 
qualifications; Addressed by EAC?: No. 

Accreditation program factor: Adequate maintenance of records; 
Addressed by EAC?: Partially. 

Source: GAO analysis of EAC data. 

[End of table] 

All but One Key GAO-Reported Accreditation Program Feature Has Been 
Addressed: 

As we have previously reported,[Footnote 31] the nature and focus of 
federal programs for accrediting laboratories vary, but these programs 
nevertheless include certain common features. In particular, they require 
laboratories to provide certain information to the accrediting body, 
and they provide for evaluation of this information by the accrediting 
body in making an accreditation determination. As we reported, the 
required information is to include, among other things, the 
laboratory's (1) organizational information, (2) records and record- 
keeping policy, (3) test methods and procedures, (4) conflict of 
interest policy, and (5) financial stability. 

To its credit, EAC's draft accreditation manual provides for 
laboratories to submit information relative to each of these features 
that are common to federal accreditation programs. For example, it 
provides for laboratories to submit organizational information, such as 
location(s), ownership, and organizational chart; a written policy for 
maintaining accreditation-related records for 5 years; conflict of 
interest policies and procedures; test-related policies and procedures, 
as well as system-specific test plans; and financial information needed 
to demonstrate stability. Moreover, for four of the five features, the 
manual identifies the specific types of information needed for 
accreditation and how the information is to be evaluated, including the 
criteria that are to be used in evaluating it. However, for the 
financial stability feature, the manual does not describe what specific 
documents are required from the laboratory to satisfy this requirement, 
nor does the manual indicate how information provided by a laboratory 
will be evaluated. 

Table 3: Summary of Extent to Which EAC Satisfies Features Common to 
Federal Accreditation Programs: 

Accreditation program feature: Organizational information; 
EAC requires information?: Yes; 
EAC specifies information scope and level of detail?: Yes; 
EAC specifies how information is to be evaluated and criteria to be 
used?: Yes. 

Accreditation program feature: Records and record-keeping; 
EAC requires information?: Yes; 
EAC specifies information scope and level of detail?: Yes; 
EAC specifies how information is to be evaluated and criteria to be 
used?: Yes. 

Accreditation program feature: Test methods and procedures; 
EAC requires information?: Yes; 
EAC specifies information scope and level of detail?: Yes; 
EAC specifies how information is to be evaluated and criteria to be 
used?: Yes. 

Accreditation program feature: Conflict of interest policy; 
EAC requires information?: Yes; 
EAC specifies information scope and level of detail?: Yes; 
EAC specifies how information is to be evaluated and criteria to be 
used?: Yes. 

Accreditation program feature: Assurance of financial stability; 
EAC requires information?: Yes; 
EAC specifies information scope and level of detail?: No; 
EAC specifies how information is to be evaluated and criteria to be 
used?: No. 

Source: GAO analysis of EAC data. 

[End of table] 

At the time of our review, EAC's Director of Voting System Testing and 
Certification[Footnote 32] told us that the draft accreditation manual 
was to be submitted for approval and that this draft did not address 
all of the limitations cited above.[Footnote 33] For example, it would 
not specify the information needed, or the evaluation approach and 
criteria to be used, in making determinations about financial 
stability, because this decision is to be based on what the director 
referred to as a "reasonableness" test in which EAC evaluates the 
information relative to that provided by other laboratories. Further, 
while EAC officials said that they plan to evolve their approach to 
VSTL accreditation and to address these gaps, EAC does not have 
documented plans for accomplishing this. Without clearly defining 
information to be used and how it is to be used, EAC increases the risk 
that financial stability determinations will not be consistently and 
thoroughly made. 

Available Documentation Does Not Demonstrate EAC's Basis for 
Accrediting Laboratories to Date: 

As of May 2008, EAC has accredited four laboratories,[Footnote 34] but 
the documentation associated with each of these accreditations is not 
sufficient to recreate a meaningful understanding of how each 
evaluation was performed and how decisions were made; thus, the basis 
for each accreditation is not clear. Specifically, each of the 
accreditations occurred before EAC had defined its approach for 
conducting them. Because of this, EAC performed each one using a 
broadly defined process outlined in a letter to each laboratory and an 
associated checklist that only indicated whether certain documents were 
received. Our analysis of these letters showed that the correspondence 
sent to each laboratory was identical, identifying three basic 
review steps to be performed and citing a list of documents that the 
laboratories were to provide as part of their applications.[Footnote 
35] However, the letters did not describe in any manner how EAC would 
review the submitted material, including the criteria to be used. 

According to EAC officials, the review steps were not documented. 
Instead, they were derived by a single reviewer using (1) the 
applications and accompanying documents submitted by the laboratories, 
(2) familiarity with the materials used by the accreditation program 
sponsored by the state election directors, and (3) the reviewer's own 
judgment. 
Further, while the reviews were supported by a checklist that covered 
each of the items that was to be included in the laboratory 
applications and provided space for the reviewer(s) to make notes 
relative to each of these items, the checklists did not include any 
guidance or methodology, including criteria, for evaluating the 
submitted items. Rather, the EAC accreditation program director told us 
that he was the reviewer on all the accreditations and he applied his 
own, but undocumented, tests for reasonableness in deciding on the 
submissions' adequacy and acceptability. 

Our analysis of the checklists for each laboratory accreditation showed 
that while the same checklist was used for each laboratory, the 
checklists did not provide a basis for evaluating the submitted 
documents or for documenting their sufficiency. In some cases, the 
reviewer and the laboratory communicated further to obtain additional 
documents. However, no documentation was available to 
demonstrate what standards or other criteria the laboratories were held 
to or how their submissions were otherwise reviewed. For example, each 
checklist indicated that the laboratory provided "a copy of the 
laboratory's conflict of interest policy." However, they did not 
specify, for example, whether the policy adequately addressed 
particular requirements. Nevertheless, for three of the four accredited 
laboratories, documentation shows that EAC sought clarification on or 
modification to the policies provided, thus suggesting that some form 
of review was performed against more detailed requirements. Similarly, 
while the checklists indicate that the laboratories disclosed their 
respective coverage limits for general liability insurance policies, 
and in one case EAC communicated to the laboratory that the limits 
appeared to be low, no documentation specifies the expected coverage 
limits. According to the EAC Director of Voting System Testing and 
Certification, this determination was made after comparing limits among 
the laboratories and was not based on any predetermined threshold. 
Further, while the checklists indicate that each laboratory provided 
audited financial statements, there is no documentation indicating how 
these statements were reviewed. 

According to the EAC program director, the lack of documentation 
demonstrating the basis for EAC's laboratory accreditations is due to 
the need at the time to move quickly in accrediting the laboratories 
and the fact that the same individual performed each accreditation 
review, which negated the need for greater documentation. Without such 
documentation, however, we could not fully establish how the 
accreditations were performed, including whether there was an adequate 
basis for the accreditation decisions reached and whether they were 
performed consistently. 

Conclusions: 

The effectiveness of our nation's overall election system depends on 
many interrelated and interdependent variables, including the security 
and reliability of voting systems. Both NIST and EAC play critical 
roles in ensuring that the laboratories that test voting systems for 
these qualities have the capability, experience, and competence 
necessary to test a voting system against the relevant standards. NIST 
has recently 
established an accreditation program that largely accomplishes this, 
and while EAC is not as far along, it has a foundation upon which it 
can build. 

However, important elements are still missing from both programs. 
Specifically, the current NIST approach does not define requirements 
for assessor qualifications and training or ensure that assessments are 
fully documented. Additionally, EAC has not developed program 
management practices that are fully consistent with what NIST has found 
to be hallmarks of an effective accreditation program, nor has the 
agency adequately specified how evaluations are to be performed and 
documented. As a result, opportunities exist for NIST and EAC to 
further define and implement their respective programs in ways that 
promote greater consistency, repeatability, and transparency--and thus 
improve the results achieved. It is also important for NIST and EAC to 
follow through on their stated intentions to evolve their respective 
programs, building on what they have already accomplished, through the 
development and execution of well-defined plans of action. If they do 
not, both will be challenged in their ability to consistently provide 
the American people with adequate assurance that accredited 
laboratories are qualified to test the voting systems that will 
eventually be used in U.S. elections. 

Recommendations for Executive Action: 

To help NIST in evolving its VSTL accreditation program, we recommend 
that the Director of NIST ensure that the accreditation program manager 
develops and executes plans that specify tasks, milestones, resources, 
and performance measures that provide for the following two actions: 

* Establish and implement transparent requirements for the technical 
qualifications and training of accreditation assessors. 

* Ensure that each laboratory accreditation review is fully and 
consistently documented in accordance with NIST program requirements. 

To help EAC in evolving its VSTL accreditation program, we recommend 
that the Chair of the EAC ensure that the EAC Executive Director 
develops and executes plans that specify tasks, milestones, resources, 
and performance measures that provide for the following action: 

* Establish and implement practices for the VSTL accreditation program 
consistent with accreditation program management guidance published by 
NIST and GAO, including: 

- documentation of specific accreditation steps and criteria to guide 
assessors in conducting each laboratory review; 

- transparent requirements for the qualifications of accreditation 
reviewers; 

- requirements for the adequate maintenance of records related to the 
VSTL accreditation program; and: 

- requirements for determining laboratory financial stability. 

Agency Comments and Our Evaluation: 

Both NIST and EAC provided written comments on a draft of this report, 
signed by the Deputy Director of NIST and the Executive Director of 
EAC, respectively. These comments are described below along with our 
response to them. 

In its comments, NIST stated that it appreciates our careful review of 
its VSTL program and generally concurs with our conclusions that its 
program must continue to evolve and improve. However, NIST also 
provided comments to clarify the current status of the program relative 
to three of our findings. 

* With respect to our finding that NIST's defined approach for 
accrediting VSTLs does not cite explicit qualifications for the persons 
who conduct the technical assessments, the institute stated that it 
does explicitly cite assessor qualifications for its overall national 
laboratory accreditation program, adding that this approach to 
specifying assessor qualifications has a proven record of success. It 
also stated that the overall program's management manual requires all 
assessors to meet defined criteria in such areas as laboratory 
experience, assessment skills, and technical knowledge, and that 
candidate assessors must submit information addressing each of these 
areas as well as factors addressing technical competence in a given 
laboratory's focus area (e.g., voting systems). Further, it stated that 
candidate assessors' qualification ratings and rankings are captured in 
work sheets. 

In response, we do not disagree with any of these statements. However, 
our finding is that NIST's defined approach for VSTL accreditation does 
not specify requirements for persons who assess those laboratories that 
specifically test voting systems. Indeed, NIST's own written comments 
confirm this finding, stating that specific requirements for assessors 
are not separately documented for each of its national laboratory 
accreditation programs, such as the VSTL program. Therefore, we have 
not modified this finding or the related recommendation. 

* Regarding our finding that NIST's defined approach for accrediting 
VSTLs has not always cited the current voting system standards, the 
institute affirmed this in its comments by stating that the VSTL 
program handbook that it provided to us only cites the 2002 system 
standards, as these were the only standards in place when the handbook 
was published. However, NIST also noted that when the 2005 system 
guidelines were adopted in December 2005, it began the process of 
updating the handbook and associated assessment checklist, and that the 
handbook update was recently finalized for publication and is now being 
used. 

In response, we stand by our finding that NIST's defined approach has 
not always cited the current voting system standards, which NIST 
acknowledges in its comments. However, we also recognize that NIST has 
recently addressed this inconsistency by finalizing its new handbook 
and the associated assessment checklist. In light of NIST's recent 
actions, we have updated the report to acknowledge the finalization of 
the handbook and checklist, and removed the associated recommendation 
that was contained in our draft report for NIST to ensure that its 
defined approach addresses all required voting system standards. 

* Regarding our finding that available documentation from completed 
accreditations does not show that NIST has consistently followed all 
aspects of its defined approach, the institute stated that, among other 
things, all required documents for its VSTL accreditation program are 
currently in use and reflect the recent update to its handbook and 
checklist, and that all these documents are securely maintained. 

In response, we do not question these statements; however, they are not 
pertinent to our finding. Specifically, our finding is that the four 
completed accreditations that we reviewed were not consistently 
documented. As we state in our report, we reviewed the documentation 
associated with the accreditation assessments for these four 
laboratories, and we found that the four were not documented in a 
consistent manner, even though they were based on the same version of the 
program handbook. For example, neither the laboratory notifications of 
the scope of the assessment nor the next steps in the accreditation 
process were consistently documented. Therefore, we have not modified 
our finding, but have slightly modified our recommendation to make it 
clear that its intent is to ensure that all phases of the accreditation 
review are fully and consistently documented. 

In its comments, EAC described our review and report as being helpful 
to the commission as it works to fully develop and implement its VSTL 
program. It also stated that it agrees with the report's conclusions 
that additional written internal procedures, standards, and 
documentation are needed to ensure more consistent and repeatable 
implementation of the program. The commission added that it generally 
accepts our recommendations and will work hard to implement them. To 
assist it in doing so, it sought clarification about two of our 
recommendations, as discussed below. 

* EAC stated that the recommendation in our draft report for the 
commission to develop specific accreditation steps and criteria was 
broadly worded, and thus the recommendation's intent was not clear. EAC 
also stated that it interpreted the recommendation to mean that it 
should define internal instructions to guide assessors in performing an 
accreditation, and that the recommendation was not intended to have any 
impact on its published requirements and procedures governing, for 
example, granting, suspending, or withdrawing an accreditation. We 
agree with EAC's interpretation, as it is in line with the intent of 
our recommendation. To avoid the potential for any future 
misunderstanding, we have modified the wording of the recommendation to 
clarify its intent. 

* EAC stated that the recommendation in our draft report for the 
commission to develop transparent technical requirements for the 
qualifications of its assessors may be confusing because, as we state 
in our report, only NIST performs a technical accreditation review, as 
EAC's review is administrative and nontechnical in nature. To avoid the 
potential for any confusion, we have modified the wording of the 
recommendation to eliminate any reference to technical qualification 
requirements. 

We are sending copies of this report to the Ranking Member of the House 
Committee on House Administration, the Chairman and Ranking Member of 
the Senate Committee on Rules and Administration, the Chairmen and 
Ranking Members of the Subcommittees on Financial Services and General 
Government, Senate and House Committees on Appropriations, and the 
Chairman and Ranking Member of the House Committee on Oversight and 
Government Reform. We are also sending copies to the Chair and 
Executive Director of EAC, the Secretary of Commerce, the Deputy 
Director of NIST, and other interested parties. We will also make 
copies available to others on request. In addition, this report will be 
available at no charge on the GAO Website at [hyperlink, 
http://www.gao.gov]. 

Should you or your staffs have any questions on matters discussed in 
this report, please contact me at (202) 512-3439 or at [email protected]. 
Contact points for our Offices of Congressional Relations and Public 
Affairs may be found on the last page of this report. Key contributors 
to this report are listed in appendix IV. 

Sincerely yours, 

Randolph C. Hite: 
Director, Information Technology Architecture and System Issues: 

[End of section] 

Appendix I: Objectives, Scope, and Methodology: 

Our objectives were to determine whether the National Institute of 
Standards and Technology (NIST) and the Election Assistance Commission 
(EAC) have defined effective voting system testing laboratory (VSTL) 
accreditation approaches, and whether each is following its defined 
approach. 

To determine whether NIST has defined an effective accreditation 
approach, we reviewed documentation from its VSTL accreditation 
program, such as handbooks and program manuals for the National 
Voluntary Laboratory Accreditation Program (NVLAP), of which the VSTL 
accreditation program is a part. In doing so, we compared these 
documents with applicable statute, guidance, and best practices, 
primarily the Help America Vote Act of 2002 (HAVA), internationally 
recognized standards from the International Organization for 
Standardization (ISO), and federal accreditation program management 
guidance published by NIST. We compared program documentation with 
HAVA's NIST-specific accreditation requirements to determine the extent 
to which the agency was fulfilling its HAVA responsibilities. We also 
reviewed program documentation against ISO/IEC 17011,[Footnote 36] 
which establishes general requirements for accreditation bodies, and 
ISO/IEC 17025,[Footnote 37] which establishes the general requirements 
for assessing the competence of laboratories, to determine the extent 
to which NIST's accreditation program was based on internationally 
recognized standards. We also compared the documentation against NIST 
publication NISTIR 6014,[Footnote 38] which contains sections that 
provide guidance for laboratory accreditation programs, to determine 
whether the VSTL accreditation program had defined other elements of 
effective accreditation programs. We also interviewed the voting 
accreditation program manager to determine how these documents were 
used to guide the program. 

To determine whether NIST has followed its defined approach, we 
examined artifacts from the accreditation assessments of five VSTLs, 
including one laboratory accredited by NVLAP, but not yet recommended 
to EAC. This material included completed assessment checklists derived 
from the accreditation program handbooks, additional documents 
supporting the assessments, and laboratory accreditation applications 
and supporting documentation. We compared artifacts from these 
assessments to program guidance to determine the extent to which the 
defined process was followed. In addition, we interviewed officials 
from NIST and NIST contract assessors and officials from EAC and the 
four EAC-accredited VSTLs to understand how the NIST process was 
implemented and how it related to the process managed by EAC. 

To determine whether EAC has defined an effective accreditation 
approach, we reviewed documentation from its VSTL accreditation 
program, such as the draft Voting System Test Laboratory Accreditation 
Program Manual.[Footnote 39] In doing so, we compared this document 
with applicable statute and best practices, primarily HAVA and federal 
accreditation program management guidance published by NIST. We 
compared the draft program manual with HAVA's EAC-specific 
accreditation requirements to determine the extent to which the agency 
was fulfilling its HAVA responsibilities. We also compared the 
documentation against the accreditation guidance in NISTIR 6014 to 
determine whether the accreditation program had defined other elements 
of effective accreditation programs. We also interviewed the EAC voting 
program director and executive director to determine how these 
documents were used to guide the program and to understand EAC's 
defined accreditation approach prior to the development of the draft 
manual. 

To determine whether EAC has followed its defined approach, we examined 
artifacts from the accreditation reviews of four VSTLs. We did not 
review a fifth laboratory, which had been accredited by NVLAP, but not 
yet recommended to EAC. The materials reviewed included checklists 
completed by EAC in the absence of an approved program manual. In doing 
so, we compared the review artifacts to accreditation program 
requirements, as communicated to the laboratories, to determine the 
extent to which the agency followed its process, as verbally described 
to us. We did not compare accreditation submissions or EAC review 
artifacts with the draft accreditation manual because agency officials 
stated that the draft manual had not been used in the review of any 
laboratory. In addition, we interviewed officials from NIST, EAC, and 
the four EAC-accredited VSTLs[Footnote 40] to understand how the EAC 
process was implemented and how it related to the process managed by 
NIST. 

To assess data reliability, we reviewed program documentation to 
substantiate data provided in interviews with knowledgeable agency 
officials. We have also made appropriate attribution indicating the 
data's sources. 

We conducted this performance audit at EAC and NIST offices in 
Washington, D.C., and Gaithersburg, Maryland, respectively, from 
September 2007 to September 2008 in accordance with generally accepted 
government auditing standards. Those standards require that we plan and 
perform the audit to obtain sufficient, appropriate evidence to provide 
a reasonable basis for our findings and conclusions based on our audit 
objectives. We believe that the evidence obtained provides a reasonable 
basis for our findings and conclusions based on our audit objectives. 

[End of section] 

Appendix II: Comments from the National Institute of Standards and 
Technology: 

United States Department Of Commerce: 
National Institute of Standards and Technology: 
Gaithersburg, Maryland 20899-0001: 
Office Of The Director: 

Memorandum For: 
Randolph C. Hite: 
Director, Information Technology Architecture and System Issues: 
Government Accountability Office: 

From: 
Signed by: 
James M. Turner, Ph.D.: 
Deputy Director:  

Subject: Comments on Government Accountability Office (GAO) Draft 
Report Entitled "Elections: Federal Programs for Accrediting 
Laboratories that Test Voting Systems Need to Be Better Defined and 
Implemented" (GAO-08-770): 

This is in response to your draft report dated March 2008, entitled 
"Elections: Federal Programs for Accrediting Laboratories that Test 
Voting Systems Need to Be Better Defined and Implemented." Thank you 
for the opportunity to review and comment on this draft. 

I appreciate the GAO team's careful review of the NIST/NVLAP program to 
accredit voting system testing laboratories under the EAC program. I am 
pleased that the GAO review found that the NIST voting system 
accreditation program reflects HAVA requirements, incorporates relevant 
international accreditation standards and meets International 
Organization for Standardization (ISO) accreditation body requirements, 
and laboratory accreditation review requirements. 

The draft report also states, however, that the NIST program is missing 
details needed for consistent and verifiable implementation. While NIST 
generally concurs with the GAO findings that the Voting System Testing 
Laboratory (VSTL) program must continue to evolve and be enhanced over 
time, we think it is important to clarify the current state of the 
program, particularly the state of the documentation associated with 
the program's consistent and verifiable implementation. In the 
attachment to this memorandum, we respond to the three areas of 
specific concern identified in the draft report regarding NIST's role 
in the voting system accreditation process provided for under HAVA. 

I am confident that the accreditation program implemented by NVLAP 
currently fully meets the requirements of HAVA and will lead to 
increased confidence in the electronic voting process. NIST remains 
committed to improving the operation of the program as we move forward. 

We are looking forward to receiving your final report. Please contact 
Steve Willett on (301) 975-8707 should you have any questions regarding 
this response. 

Attachment: 

Below is the NIST response to the three areas of specific concern 
identified in the draft report regarding NIST's role in the voting 
system accreditation process provided for under HAVA. 

(1) First, the draft findings state that NIST has not specified 
requirements for assessor qualifications and training. Specifically, 
"NIST's defined approach does not, for example, cite explicit 
qualifications for the persons who conduct accreditation technical 
assessments, as called for in federal accreditation program guidance." 

Response - The NVLAP program does cite explicit qualifications for 
technical assessors, as required under the provisions of ISO/IEC 
17011:2004, the international standard that establishes general 
requirements for accreditation bodies. This standard is incorporated in 
the NVLAP Management System Manual. Section 6.3.2.1 of the manual, 
which addresses assessor selection criteria, states, "NVLAP uses the 
services of assessors and technical experts who meet defined criteria 
in the areas of laboratory experience, communication, assessment 
skills, technical knowledge, quality management, professional 
activities, and education and training." Candidate assessors must 
submit biographies and supporting information that address each of the 
above criteria, in each case specifically addressing factors that 
demonstrate technical competence relevant to a specific NVLAP 
laboratory accreditation program or programs, e.g., voting system 
testing. 

To evaluate a potential assessor, NVLAP relies on the assessor 
biography, interviews, and recommendations to prepare an assessor 
summary sheet that documents the basis for selection. Rating work 
sheets are used to rank the assessor's knowledge and ability in the 
areas of: education, lab experience, communication skills, assessment 
experience, lab management experience, and professional association 
affiliations. While specific requirements for assessors are not 
separately documented for each NVLAP program, the procedure currently 
in use has a proven record of success within NVLAP and other 
accreditation bodies. NVLAP also monitors its assessors and solicits 
written comments from customer laboratories covering many issues, 
including assessor technical competence. Initial qualification, 
monitoring and laboratory feedback information is documented and 
maintained in the assessor files in a secured file room. 

The GAO report also states that, "NIST officials said that they rely on 
individuals who have prior experience in reviewing such laboratories." 
All NVLAP programs rely on experts that have prior experience; this is 
viewed as an asset. Assessors are selected for their technical 
expertise and trained in assessment techniques and NIST Handbook 150 
(NVLAP's version of ISO/IEC 17025). This is accomplished by NVLAP 
through workshops, on the job training, and/or one-on-one training. The 
NVLAP technical assessor supporting the voting system testing 
laboratory program has both extensive prior experience in voting system 
testing and evaluating technical competence through the National 
Association of State Election Directors (NASED) assessment program and 
current experience as an EAC evaluator. This is documented in his 
assessor file. The voting system lead assessor has over 10 years of 
experience as a NVLAP lead management system assessor and as a 
technical assessor in the area of electromagnetic compatibility and 
telecommunications. He has attended courses offered periodically by 
NVLAP to its assessors and has been qualified as a NVLAP assessor with 
the appropriate certificate. This is documented in his assessor file. 

(2) Second, the draft findings state that the current NIST approach 
does not fully specify criteria for evaluating and documenting VSTL 
capabilities. Specifically, "Even though the EAC requires that 
laboratory accreditation be based on demonstrated capabilities to test 
against the latest voting system standards, NIST's defined approach has 
not always cited these current standards." The draft report cites 
instances where the technical (program specific) checklists were not 
always consistent for the on-site assessments and notes that some 
laboratories stated they did not know what was expected of them. This 
finding refers to the development and implementation of the program 
specific handbook and checklist. 

Response - When the NVLAP voting system accreditation program was first 
made available to interested laboratories in June 2005, the 2002 VVS 
was the only approved standard in existence. Therefore, NVLAP's initial 
program specific handbook took only this standard into account. The EAC 
formally adopted the 2005 VVSG in December 2005, with an effective date 
of December 2007. NVLAP informed the candidate VSTLs in early 2006 that 
their scope of accreditation would include both the 2002 VVS and 2005 
VVSG. The process was begun at that time to revise the initial program 
specific handbook to reflect both standards. 

Candidate VSTLs were clearly told by NIST that the accreditation would 
be to both the 2002 VVS and 2005 VVSG. The labs were informed that in 
order to be accredited as Voting System Testing Laboratories by NVLAP 
they would have to be accredited to the core set of test methods 
contained in both standards. This was communicated to the labs during 
the pre- assessments, in e-mail and phone communications, and at the on-
site assessment visit. In addition the candidate VSTLs should have 
known this since the same information was available on the EAC website. 

NVLAP conducted no-cost pre-assessments of each candidate VSTL, 
beginning in February 2006. In parallel, a program specific checklist 
was being developed to encompass both standards. During these pre-
assessments (lasting an average of two days), the laboratory's 
technical and quality systems were reviewed, personnel interviewed and 
issues discussed. The program specific handbook was updated and 
approved at the NVLAP level in December 2007. It was denoted as being 
in draft form at that time because it had not been through a NIST 
internal editorial review process needed for hardcopy publication. This 
editorial process has been completed and the handbook has been 
finalized for publication. Final versions of the program specific 
handbook and checklist are now in use. Each VSTL has been assessed to 
both the 2002 VVS and 2005 VVSG. 

NVLAP's general requirements are found in NIST Handbook 150. Program 
specific handbooks and checklists are provided by NVLAP as a tool for 
identifying and documenting assessment findings and nonconformities 
relative to the test methods being assessed. They incorporate the same 
numbering system and section headings as Handbook 150. The difference 
between the general and program specific handbooks is that the program 
specific handbook (Handbook 150-22 in the case of the voting program) 
provides additional guidance for coverage of specific test methods 
without repeating the general criterion in Handbook 150. In the case of 
the voting system testing program, the specific test methods are found 
in 2002 VVS and 2005 VVSG. There are roughly one thousand test methods 
in the 2005 VVSG alone. The goal of Handbook 150-22 is not to reiterate 
every requirement in the technical standards, since these are already 
stated in the standards themselves, but to provide a framework so the 
laboratory will know where elements of the standard will be considered 
and recorded in our documentation. NVLAP's function is not to educate 
the laboratories in the standard but to document their competency in 
applying voting system standards to equipment under test. 

(3) The third finding was that available documentation does not show 
that NIST has consistently followed all aspects of its defined 
accreditation approach. Specifically, the GAO report recommends that 
the NIST/NVLAP program manager needs to provide actions for, "Ensuring 
that each laboratory accreditation review is fully documented." 

Response: All required documents for the voting system testing program 
are currently in use and include the most recent, up-to-date Handbook 
150-22 with its checklist. The documentation used for accreditation 
review by NVLAP includes: the lab application, laboratory management 
system manual, on-site report, lab responses to nonconformities, and 
documentation to support the laboratory's response. The on-site report 
consists of the general operations checklist, program specific 
checklist, signature sheet, narrative summary, and assessment summary. 
All these documents remain on file in a secure NVLAP file room. 

Every NVLAP program documents the evaluation using the NVLAP On-site 
Assessment Review form. Additionally, as part of the on-site assessment 
report, the assessor completes a narrative summary addressing the 
overall findings of each section of the standard as well as the 
assessment checklists for both the NVLAP general and program specific 
handbooks. The assessment process requires that the assessor leave 
copies of the assessment report with the lab on the last day of the on-
site assessment. 

Unlike NVLAP's other programs, where strict confidentiality is 
observed, the voting program must exhibit a level of transparency in 
order to remain open to the public. To accomplish this, the on-site 
report, lab responses, and evaluation form are made available on the 
NIST voting website. Proprietary information in the nonconformity 
response documentation is not posted. 

Over the past several years NIST has worked to provide the American 
public with a comprehensive accreditation program capable of assessing 
the competency of voting system testing laboratories to current 
standards while also maintaining a process which will allow the program 
to accommodate the development of new standards. NIST recognizes the 
need for an unbiased assessment of its process and commends the GAO for 
providing this service. The NVLAP voting system testing laboratory 
program has gone through a maturation process since 2004. NIST is 
confident that the accreditation program implemented by NVLAP currently 
fully meets the requirements of HAVA and will lead to increased 
confidence in the electronic voting process. We remain committed to 
improving the operation of the program as we move forward. In 
particular, as recommended by the GAO, we will continue to enhance the 
transparency of NVLAP requirements for the technical qualifications and 
training of assessors, ensure that all required voting system standards 
are addressed, and ensure that each laboratory accreditation review is 
fully documented. 

[End of section] 

Appendix III: Comments from the Election Assistance Commission: 

U.S. Election Assistance Commission: 
Office Of The Executive Director: 

1225 New York Avenue, NW, Suite 1100: 
Washington, DC. 20005: 

Mr. Randolph C. Hite, Director: 
Information Technology Architecture and Systems: 
United States Government Accountability Office:  

Re: Draft Report GAO 08-770, Elections: Federal Programs for 
Accrediting Laboratories that Test Voting Systems Need to Be Better 
Defined and Implemented: 

Thank you for the opportunity to provide comments on GAO Report 08-770 
regarding Federal voting system test laboratory accreditation programs. 
The U.S. Election Assistance Commission (EAC) appreciates the time and 
effort put forth by GAO during the preparation of this document and the 
willingness of GAO staff to discuss pertinent issues at length with the 
EAC, NIST and other key stakeholders in the voting system testing 
community. The EAC has found both the review process and report helpful 
as it works to fully develop and implement its new Laboratory 
Accreditation Program. 

GAO recognized that EAC has successfully "published procedures 
governing how the accreditation program is to be executed" (pgs 31-32) 
and that its program includes requirements for all five recognized 
accreditation program features (pgs 33-34). The EAC agrees with the 
report's conclusion that while EAC's accreditation program has 
effectively laid out its program requirements, additional written 
internal procedures, standards and documentation are needed to ensure a 
more consistent and repeatable review process. The EAC is currently 
developing written, internal procedures to better implement and 
document its Laboratory Accreditation Manual. As the EAC implements the 
various aspects of its accreditation program, it will incorporate the 
recommendations provided in the GAO report. 

The report provides four recommendations for the EAC to better conform 
to accreditation program management guidance published by NIST and GAO. 
Generally, the EAC accepts the recommendations provided with little 
comment. However, in one instance the phrasing of a recommendation is 
questioned in light of the report's findings. Additionally, the EAC 
expresses concern that certain technical responsibilities under the 
purview of NIST are placed upon the EAC. The following are EAC's 
comments in response to each recommendation. 

1. Documentation of specific accreditation steps and criteria. This 
recommendation is stated broadly and, as a result, is unclear in light 
of the content of the report. Despite its broad language, the EAC does 
not read this recommendation as impacting the program's laboratory 
accreditation requirements, as the report has found that the EAC has 
properly "published procedures governing how the accreditation program 
is to be executed, such as procedures for granting, maintaining, 
modifying, suspending, and withdrawing accreditation." (pg 31-32). 
Rather, based upon the discussion on page 31-32 of the report, the EAC 
reads this recommendation as advising that EAC develop additional 
"instructions, steps, and criteria for those who conduct an 
accreditation assessment." 

Based upon this reading, the EAC concurs with GAO's recommendation to 
develop a more formalized internal process for reviewing accreditation 
applications. The EAC has a checklist in place to guide the process; 
however we will review the document with an eye toward improving it by 
developing additional internal procedures and criteria for reviewing 
applications. 

2. Transparent technical requirements for the qualifications of 
accreditation reviewers. EAC is concerned that this recommendation may 
confuse readers of this report regarding the distinct roles of EAC and 
NIST in the laboratory accreditation process. According to HAVA, NIST 
is responsible for the technical review of voting system test 
laboratories; the EAC performs a "nontechnical" review of laboratories 
after the technical review by NIST is completed. As the GAO report 
notes, "NIST is to focus on assessing laboratory technical 
qualifications, while the EAC is to use those assessment results and 
recommendations, and augment them with its own review of related 
laboratory capabilities, to reach an accreditation decision." (pg 13). 
EAC's review is administrative and is initiated only after the EAC 
receives a laboratory accreditation recommendation from NIST. 

After it has received a technical recommendation from NIST, the EAC 
conducts an administrative review of the applicant laboratory. This 
includes collecting: (1) laboratory information required by Section 
3.4.1 of the EAC Draft Voting System Test Laboratory Program Manual 
(Accreditation Manual); (2) a signed agreement affirming that the 
laboratory will meet all elements of the EAC Accreditation Program 
(Chapter 2 of the Accreditation Manual); and (3) a Certification of 
Laboratory Conditions and Practices (Attachment A of the Accreditation 
Manual) documenting that the laboratory has certain conditions and 
required policies in place (specifically: a conflict of interest 
policy, a personnel policy, a recordkeeping policy and evidence of 
sufficient resources and financial stability). Thus, the EAC's 
nontechnical review is limited to (1) determining whether required 
information and the signed agreement and certification are, in fact, 
provided and (2) determining whether the certifications of the 
laboratory (concerning its conflict of interest policy, personnel 
policy, recordkeeping policy and resources and financial stability) are 
acceptable by comparing the policies provided with the EAC program 
requirements. 

EAC's staff has the appropriate credentials for nontechnical reviews of 
laboratory applicants in accordance with the accreditation process 
contained in HAVA. These qualifications include a detailed 
understanding of EAC's Accreditation Program and its requirements. The 
EAC will document this qualification, and any others deemed necessary, 
in the internal procedures it is developing (as noted above). 

3. Requirements for the adequate maintenance of records related to the 
VSTL accreditation program. This recommendation is similar to 
recommendations 1 and 2, above. EAC agrees that more formalized review 
procedures would benefit its accreditation program. Documentation is a 
key element of such action. The adoption of EAC's Accreditation Manual 
(anticipated approval date: July 2008) will go a long way toward 
improving the laboratory accreditation process. The manual prescribes 
an application 
package which includes a certification form, an agreement document and 
specific information. As noted above, EAC is already using an 
application checklist which it plans to amend consistent with GAO 
recommendations. Moreover, the Commission will ensure that its internal 
procedures require formal documentation of each EAC action and basis 
for it. The EAC is making documentation a priority throughout its 
Laboratory Accreditation Program. 

The EAC has made it a priority to provide the public with program 
information, including correspondence and many other program-related 
materials. This information is readily available at [hyperlink, 
http://www.eac.gov], and stakeholders including State and local 
election officials, voting system manufacturers, test laboratories and 
the voting public are notified frequently about program updates. The 
EAC also provides links to program material generated by NIST's 
technical reviews of the laboratories. 

4. Requirements for determining laboratory financial stability. The GAO 
Report specifies five program features (or areas of required 
information) that an accreditation program should include. As the GAO 
report notes, EAC's Accreditation Program Manual contains requirements 
for each of these features, including: organizational information, 
records and record keeping, test methods and procedures, conflict of 
interest policy and assurance of financial stability. However, for the 
last requirement, GAO has recommended that the EAC provide additional 
criteria for review and obtain additional laboratory information. 

EAC's Draft Accreditation Manual provides a concise requirement for 
laboratory financial stability. "As a condition of accreditation, all 
VSTLs shall allocate sufficient resources to enable the laboratory to 
properly use and maintain its test equipment, personnel, and facility 
and to satisfactorily perform all required laboratory functions. The 
laboratory shall maintain insurance policies sufficient to indemnify 
itself against financial liabilities or penalties that may result from 
its operations." (Accreditation Manual § 2.14). The EAC agrees that this 
requirement should look more like its other four program features and 
provide more specific criteria. The EAC will amend its draft manual to 
reflect GAO's recommendations before the document's final publication 
in July. In addition, the EAC will require additional information with 
respect to the issue of financial stability. Presently, the draft 
manual requires the submission of insurance, workers' compensation, 
staffing, facilities, and financial reporting information. 
(Accreditation Manual § 3.4.1). 

The EAC recognizes that the Federal government's first voting system 
test laboratory accreditation program must have a solid foundation. The 
EAC continues to evaluate its progress and identify issues of concern 
to ensure that this new program is robust and thorough. For instance, 
the EAC recently asked NIST to address and report back on areas of 
concern noted in the EAC's review of VSTL work products (attachment: 
EAC letter to NIST Standards Services Division Chief Mary Saunders, 
March 13, 2008; also available at [hyperlink, http://www.eac.gov] under 
the Voting Systems Center). We also requested that NIST 
review the laboratories' management process for assigning appropriate 
levels of staff, the qualifications of laboratory employees and how 
federal testing assignments are prioritized. Further, we requested a 
review of the laboratories' test methods and the validation process 
used to ensure that "NIST remains confident of each VSTL's ability to 
test voting systems." 

The EAC will continue to work with NIST to ensure that the laboratory 
accreditation process is monitored very closely, and after NIST 
produces this additional information, we will provide it to GAO, 
Congress, and 
the public. 

Keeping stakeholders and the general public informed about program 
activities, milestones and challenges continues to be a top priority. A 
May 2, 2008, memo to election officials from Program Director Brian 
Hancock (attached and available at [hyperlink, http://www.eac.gov] 
under the Voting System Center) indicated that challenges remain in 
the certification process, such as the length of time it takes 
accredited laboratories to evaluate voting systems. Some participants 
in the certification program are experiencing challenges meeting the 
requirements of a more robust and thorough testing process, as 
evidenced by the submission of nine test plans from a single testing 
engagement. All draft and final test plans as well as test reports are 
available at [hyperlink, http://www.eac.gov] under the Voting System 
Center. 

The EAC thanks GAO for its work in assisting the Commission in its 
efforts to improve and develop the Voting System Laboratory 
Accreditation Program. The EAC is committed to continuous improvement 
in all of its programs and will work hard to implement the 
recommendations made in this report, with the goal of making its 
accreditation process part of a world-class testing and certification 
program. 

Sincerely, 

Signed by: 

Thomas R. Wilkey: 
Executive Director: 

[End of section] 

Appendix IV: GAO Contact and Staff Acknowledgments: 

GAO Contact: 

Randolph C. Hite, (202) 512-3439 or [email protected]: 

Staff Acknowledgments: 

Paula Moore, Assistant Director; Justin Booth; Timothy Case; Neil 
Doherty; Timothy Eagle; Nancy Glover; Dave Hinchman; Rebecca LaPaze; 
Freda Paintsil; Nik Rapelje; and Jeffrey Woodward made key 
contributions to this report. 

[End of section] 

Footnotes: 

[1] See, for example, GAO, Elections: Perspectives on Activities and 
Challenges Across the Nation, GAO-02-3 (Washington, D.C.: Oct. 15, 
2001); Elections: Status and Use of Federal Voting Equipment Standards, 
GAO-02-52 (Washington, D.C.: Oct. 15, 2001); Elections: A Framework for 
Evaluating Reform Proposals, GAO-02-90 (Washington, D.C.: Oct. 15, 
2001); Elections: Federal Efforts to Improve Security and Reliability 
of Electronic Voting Systems Are Under Way, but Key Activities Need to 
Be Completed, GAO-05-956 (Washington, D.C.: Sept. 21, 2005); and 
Elections: All Levels of Government Are Needed to Address Electronic 
Voting System Challenges, GAO-07-576T (Washington, D.C.: Mar. 7, 2007). 

[2] GAO-02-52. 

[3] 42 U.S.C. § 15371. 

[4] In August 2008, after EAC provided comments on, and we had 
finalized, this report, the commission announced that it had approved 
its draft accreditation program manual. As a result, we did not review 
the approved manual. 

[5] GAO-02-3. 

[6] GAO-02-3. 

[7] GAO, Elections: Electronic Voting Offers Opportunities and Presents 
Challenges, GAO-04-975T (Washington, D.C.: July 20, 2004). 

[8] GAO-05-956. 

[9] Help America Vote Act, Pub. L. No. 107-252 (Oct. 29, 2002). 

[10] Federal Election Commission, Performance and Test Standards for 
Punchcard, Marksense, and Direct Recording Electronic Voting Systems 
(January 1990). 

[11] GAO-05-956. 

[12] The VVSG did not take effect until December 2007. 

[13] We have ongoing work to review EAC's certification program for the 
House Committee on House Administration. 

[14] GAO-02-90. 

[15] The state elections directors' accreditation program was 
discontinued in July 2006. 

[16] A suspension provides the laboratory an opportunity to address the 
identified issues. 

[17] NIST, NIST Handbook 150: National Voluntary Laboratory 
Accreditation Program Procedures and General Requirements 
(Gaithersburg, Md.: February 2006). 

[18] NIST, NIST Handbook 150-22: National Voluntary Laboratory 
Accreditation Program: Voting System Testing (Gaithersburg, Md.: 
December 2005). 

[19] ISO, ISO/IEC 17011: Conformity Assessment: General Requirements 
for Accreditation Bodies Accrediting Conformity Assessment Bodies 
(Geneva, Switzerland: Feb. 15, 2005). 

[20] ISO, ISO/IEC 17025: General Requirements for the Competence of 
Testing and Calibration Laboratories (Geneva, Switzerland: May 15, 
2005). 

[21] NIST, National Voluntary Laboratory Accreditation Program: 
Management System Manual, Revision 2 (Gaithersburg, Md.: Mar. 25, 
2008). 

[22] NIST, The ABC's of the U.S. Conformity Assessment System, NISTIR 
6014 (Gaithersburg, Md.: April 1997). 

[23] NIST, NIST Handbook 150-22. 

[24] NIST, Handbook 150-22 2008 Edition: National Voluntary Laboratory 
Accreditation Program: Voting System Testing (Gaithersburg, Md.: May 
2008); NIST Handbook 150-22 Checklist: Voting System Testing Program 
(Rev. 2008-06-25). 

[25] NIST, National Voluntary Laboratory Accreditation Program: 
Management System Manual, Revision 2. 

[26] As mentioned earlier, these guidelines became effective in 
December 2007. 

[27] EAC, Voting System Test Laboratory Accreditation Program Manual 
(draft, Washington, D.C.: February 2008). 

[28] EAC, Testing and Certification Program Manual (Washington, D.C.: 
January 2007). 

[29] NIST, The ABC's of the U.S. Conformity Assessment System, NISTIR 
6014. 

[30] GAO, Laboratory Accreditation: Requirements Vary Throughout the 
Federal Government, GAO/RCED-89-102 (Washington, D.C.: Mar. 28, 1989). 

[31] GAO/RCED-89-102. 

[32] The EAC Director of Voting System Testing and Certification also 
manages EAC's VSTL accreditation program. 

[33] On August 4, 2008, EAC reported that it approved an initial 
version of its program manual. However, we did not evaluate this 
initial version to determine the extent to which it addresses 
limitations that we found in the draft manual because it was approved 
after EAC provided comments on a draft of, and we had finalized, this 
report. 

[34] According to the NIST program manager, a fifth laboratory has been 
accredited by NVLAP but not yet recommended to EAC. 

[35] The steps were to: 1) provide information, such as the 
laboratory's conflict of interest policy, evidence of insurance 
coverage limits, and audited financial statements; 2) provide a signed, 
standardized letter of agreement to abide by the EAC program terms; and 
3) provide a signed certification of laboratory practices and 
conditions, such as having policies in place with respect to personnel 
practices, record-keeping requirements, and financial stability. 

[36] ISO, ISO/IEC 17011: Conformity Assessment: General Requirements 
for Accreditation Bodies Accrediting Conformity Assessment Bodies 
(Geneva, Switzerland: Feb. 15, 2005). 

[37] ISO, ISO/IEC 17025: General Requirements for the Competence of 
Testing and Calibration Laboratories (Geneva, Switzerland: May 15, 
2005). 

[38] NIST, The ABC's of the U.S. Conformity Assessment System, NISTIR 
6014 (Gaithersburg, Md.: April 1997). 

[39] EAC, Voting System Test Laboratory Accreditation Program Manual 
(draft, Washington, D.C.: February 2008). EAC approved the program 
manual for publication in July 2008; however, this was accomplished 
too late for GAO to review the manual's contents for this report. 

[40] In June 2008, NIST granted NVLAP accreditation to a fifth 
laboratory, CIBER Inc., and recommended it for EAC accreditation. As of 
August 2008, that accreditation had yet to be granted. 

GAO's Mission: 

The Government Accountability Office, the audit, evaluation and 
investigative arm of Congress, exists to support Congress in meeting 
its constitutional responsibilities and to help improve the performance 
and accountability of the federal government for the American people. 
GAO examines the use of public funds; evaluates federal programs and 
policies; and provides analyses, recommendations, and other assistance 
to help Congress make informed oversight, policy, and funding 
decisions. GAO's commitment to good government is reflected in its core 
values of accountability, integrity, and reliability.  

Obtaining Copies of GAO Reports and Testimony: 

The fastest and easiest way to obtain copies of GAO documents at no 
cost is through GAO's Web site [hyperlink, http://www.gao.gov]. Each 
weekday, GAO posts newly released reports, testimony, and 
correspondence on its Web site. To have GAO e-mail you a list of newly 
posted products every afternoon, go to [hyperlink, http://www.gao.gov] 
and select "E-mail Updates."  

Order by Mail or Phone: 

The first copy of each printed report is free. Additional copies are $2 
each. A check or money order should be made out to the Superintendent 
of Documents. GAO also accepts VISA and Mastercard. Orders for 100 or 
more copies mailed to a single address are discounted 25 percent. 
Orders should be sent to:  

U.S. Government Accountability Office: 
441 G Street NW, Room LM: 
Washington, D.C. 20548:  

To order by Phone: 
Voice: (202) 512-6000: 
TDD: (202) 512-2537: 
Fax: (202) 512-6061:  

To Report Fraud, Waste, and Abuse in Federal Programs:  

Contact:  

Web site: [hyperlink, http://www.gao.gov/fraudnet/fraudnet.htm]: 
E-mail: [email protected]: 
Automated answering system: (800) 424-5454 or (202) 512-7470:  

Congressional Relations:  

Ralph Dawn, Managing Director, [email protected]: 
(202) 512-4400: 
U.S. Government Accountability Office: 
441 G Street NW, Room 7125: 
Washington, D.C. 20548:  

Public Affairs: 

Chuck Young, Managing Director, [email protected]: 
(202) 512-4800: 
U.S. Government Accountability Office: 
441 G Street NW, Room 7149: 
Washington, D.C. 20548: 

*** End of document. ***