[House Hearing, 110 Congress]
[From the U.S. Government Publishing Office]
CERTIFICATION AND TESTING OF ELECTRONIC VOTING SYSTEMS
=======================================================================
HEARING
before the
SUBCOMMITTEE ON INFORMATION POLICY,
CENSUS, AND NATIONAL ARCHIVES
of the
COMMITTEE ON OVERSIGHT
AND GOVERNMENT REFORM
HOUSE OF REPRESENTATIVES
ONE HUNDRED TENTH CONGRESS
FIRST SESSION
__________
MAY 7, 2007
__________
Serial No. 110-13
__________
Printed for the use of the Committee on Oversight and Government Reform
Available via the World Wide Web: http://www.gpoaccess.gov/congress/index.html
http://www.oversight.house.gov
----------
U.S. GOVERNMENT PRINTING OFFICE
36-750 PDF WASHINGTON : 2007
For sale by the Superintendent of Documents, U.S. Government Printing
Office Internet: bookstore.gpo.gov Phone: toll free (866) 512-1800;
DC area (202) 512-1800 Fax: (202) 512-2104 Mail: Stop IDCC,
Washington, DC 20402-0001
COMMITTEE ON OVERSIGHT AND GOVERNMENT REFORM
HENRY A. WAXMAN, California, Chairman
TOM LANTOS, California
EDOLPHUS TOWNS, New York
PAUL E. KANJORSKI, Pennsylvania
CAROLYN B. MALONEY, New York
ELIJAH E. CUMMINGS, Maryland
DENNIS J. KUCINICH, Ohio
DANNY K. DAVIS, Illinois
JOHN F. TIERNEY, Massachusetts
WM. LACY CLAY, Missouri
DIANE E. WATSON, California
STEPHEN F. LYNCH, Massachusetts
BRIAN HIGGINS, New York
JOHN A. YARMUTH, Kentucky
BRUCE L. BRALEY, Iowa
ELEANOR HOLMES NORTON, District of Columbia
BETTY McCOLLUM, Minnesota
JIM COOPER, Tennessee
CHRIS VAN HOLLEN, Maryland
PAUL W. HODES, New Hampshire
CHRISTOPHER S. MURPHY, Connecticut
JOHN P. SARBANES, Maryland
PETER WELCH, Vermont

TOM DAVIS, Virginia
DAN BURTON, Indiana
CHRISTOPHER SHAYS, Connecticut
JOHN M. McHUGH, New York
JOHN L. MICA, Florida
MARK E. SOUDER, Indiana
TODD RUSSELL PLATTS, Pennsylvania
CHRIS CANNON, Utah
JOHN J. DUNCAN, Jr., Tennessee
MICHAEL R. TURNER, Ohio
DARRELL E. ISSA, California
KENNY MARCHANT, Texas
LYNN A. WESTMORELAND, Georgia
PATRICK T. McHENRY, North Carolina
VIRGINIA FOXX, North Carolina
BRIAN P. BILBRAY, California
BILL SALI, Idaho
Phil Schiliro, Chief of Staff
Phil Barnett, Staff Director
Earley Green, Chief Clerk
David Marin, Minority Staff Director
Subcommittee on Information Policy, Census, and National Archives
WM. LACY CLAY, Missouri, Chairman
PAUL E. KANJORSKI, Pennsylvania
CAROLYN B. MALONEY, New York
JOHN A. YARMUTH, Kentucky
PAUL W. HODES, New Hampshire

MICHAEL R. TURNER, Ohio
CHRIS CANNON, Utah
BILL SALI, Idaho
Tony Haywood, Staff Director
C O N T E N T S
----------
Hearing held on May 7, 2007
Statement of:
    Davidson, Donetta L., chairman, U.S. Election Assistance
      Commission; and Mark W. Skall, chief, Software Diagnostics
      and Conformance Testing Division, National Institute of
      Standards and Technology
    Kellner, Douglas A., co-chair, New York State Board of
      Elections; Dr. David Wagner, associate professor, Computer
      Science Division, University of California, Berkeley;
      Lawrence Norden, Brennan Center for Justice, New York
      University School of Law; John Washburn, VOTETRUSTUSA
      Voting Technology Task Force; and Mac J. Slingerlend,
      president and CEO, CIBER, Inc., accompanied by John Pope,
      vice president for contracts
Letters, statements, etc., submitted for the record by:
    Clay, Hon. Wm. Lacy, a Representative in Congress from the
      State of Missouri, prepared statement of
    Davidson, Donetta L., chairman, U.S. Election Assistance
      Commission, prepared statement of
    Kellner, Douglas A., co-chair, New York State Board of
      Elections, prepared statement of
    Maloney, Hon. Carolyn B., a Representative in Congress from
      the State of New York:
        Information concerning CIBER
        Prepared statement of
    Norden, Lawrence, Brennan Center for Justice, New York
      University School of Law, prepared statement of
    Skall, Mark W., chief, Software Diagnostics and Conformance
      Testing Division, National Institute of Standards and
      Technology, prepared statement of
    Slingerlend, Mac J., president and CEO, CIBER, Inc.,
      information concerning CIBER
    Wagner, Dr. David, associate professor, Computer Science
      Division, University of California, Berkeley, prepared
      statement of
    Washburn, John, VOTETRUSTUSA Voting Technology Task Force,
      prepared statement of
CERTIFICATION AND TESTING OF ELECTRONIC VOTING SYSTEMS
----------
MONDAY, MAY 7, 2007
House of Representatives,
Subcommittee on Information Policy, Census, and
National Archives,
Committee on Oversight and Government Reform,
New York, NY.
The subcommittee met, pursuant to notice, at 9:30 a.m., in
City Council Chambers, New York City Hall, 131 Duane Street,
New York, NY, Hon. Wm. Lacy Clay (chairman of the subcommittee)
presiding.
Present: Representatives Clay and Maloney.
Staff present: Tony Haywood, staff director/counsel; Adam
C. Bordes, professional staff member; and Nidia Salazar, staff
assistant.
Mr. Clay. The Subcommittee on Information Policy, Census,
and National Archives of the House Committee on Oversight and
Government Reform will now come to order.
Today's hearing will examine issues relating to the
certification and testing of electronic voting systems under
the Help America Vote Act of 2002.
Without objection, the Chair and other Members present will
have 5 minutes to make opening statements, and without
objection, Members and witnesses may have 5 legislative days to
submit a written statement, or extraneous material for the
record.
Let me say, first of all, that it is a pleasure to be here
in the Big Apple to discuss a topic of tremendous importance to
New Yorkers and the Nation as a whole: the need for effective
and transparent certification and testing of electronic voting
systems. I want to thank my distinguished friend and colleague,
Congresswoman Carolyn Maloney, for inviting us to New York and
I want to thank City Council Speaker Christine Quinn for making
the City Council Chambers available to us. This is a wonderful
venue for a hearing.
And this is the subcommittee's second hearing on electronic
voting systems. During an April 18th hearing in Washington, the
subcommittee heard testimony concerning widespread
vulnerabilities in modern electronic voting systems. Those
weaknesses are a major concern for Congress, for the State and local
entities that administer the electoral process, and for all
Americans who value their stake in our democracy. Passed in
response to reports of serious voting irregularities during the
November 2000 Presidential election, HAVA established the first
set of uniform minimum standards and requirements for the
administration of Federal elections.
The law authorized $3.86 billion in funding. The bulk of
this funding was provided to enable States to replace punch
card or mechanical voting equipment, improve their election
administration capabilities, meet new election requirements and
improve access for disabled voters.
Beginning in fiscal year 2003, many States used HAVA funds
to procure new electronic voting systems. In 2005, the EAC
approved new voting system standards, the 2005 Voluntary Voting
System Guidelines, for States to use as a reference when
procuring new machines under HAVA.
Unfortunately, numerous States have reported problems with
new voting systems, as well as difficulty ensuring that their
systems comply with the evolving HAVA standards.
Voting system problems include software vulnerabilities
that impair security or reliability, and the inability to
confirm voter intent in the case of systems that lack an
independent audit component, such as a verifiable paper trail.
A change in requirements has left some States out of
compliance with HAVA standards because their systems were
designed and procured before current standards took effect.
In addition, there have been serious problems relating to
the EAC's accreditation and oversight of the labs that test and
certify voting systems for compliance with HAVA.
In January, for example, the New York State Board of
Elections suspended CIBER, Inc., a lab that has reportedly
tested 70 percent of the Nation's voting systems, due to
ineffective internal controls in CIBER's certification
practices and a lack of transparency in its testing process.
CIBER also has failed to win accreditation by the EAC. New
York has decided to postpone the procurement of new voting
systems until there is a more independent and transparent
certification program to identify system vulnerabilities and
ensure HAVA compliance before systems are marketed to States.
We rely upon our voting systems to record each and every
vote accurately. Uniform testing standards and vigorous
oversight of the certification process for voting systems are
necessary to ensure that these systems operate reliably and
securely; without this, we risk eroding the public
confidence that is necessary for active voter participation and
a healthy democracy.
We have invited today's witnesses here to shed light on the
factors that have impeded the earnest efforts of States like New
York to improve accuracy, reliability, and security in their
voting systems, while complying with HAVA requirements.
I want to thank all of our witnesses for appearing before
the subcommittee today, particularly those who traveled long
distances and adjusted their busy schedules to be with us. I
welcome all of you and look forward to an informative and frank
discussion of these important issues. And now I will turn to
my colleague and dear friend, Mrs. Carolyn Maloney. Thank you.
[The prepared statement of Hon. Wm. Lacy Clay follows:]
[GRAPHIC(S) NOT AVAILABLE IN TIFF FORMAT]
Mrs. Maloney. Thank you so much, Lacy Clay, for your
leadership on this and so many other important issues before
Congress and for traveling all the way to be here in New York
City on this very important issue. Truly, nothing is more
important to our democracy than the accuracy, the reliability,
the trust that our people have in our voting systems, and the
fact that they are reliable and dependable and transparent.
I do want to say that Rush Holt had hoped to be with us,
but was not able to. He brings his greetings. He says we will
be marking up his bill that he has worked 8 years on, H.R.
811, tomorrow in Congress. It will be moving forward with
tremendous and important funding--$1 billion for new voting
machines, $100 million for auditing and making sure that the
voting machines work--and it also calls for an independent audit,
a paper trail. It's very important legislation. I support it.
I know that Lacy and I have some ideas to make it even
better. But it is a compromise. I'm thrilled that it's moving
forward and I thank all of our attendees today.
It shows that you care about our democracy, and most
importantly, I thank all of our witnesses for coming and for
the hard work that they're doing on this subject.
And I really especially appreciate all the hard work done
by Mr. Clay and his staff on an issue that is very important to
me, and I would say to every American, the accuracy and
security of the Nation's voting systems.
In recent years, considerable concern has been expressed
about the security and reliability of the electronic voting
systems. Reports from governmental agencies, testimony before
Congress, and academic studies, have indicated serious
vulnerabilities that call for immediate attention.
I must add that this is one of the issues that people
literally walk up to me about on the street, at events, at meetings.
They come up and express their concern over voting machines.
This is a critical issue to my constituents and I would say to
every American across this country.
Penetration testing done by independent computer security
experts has demonstrated that election results can be altered
in a manner that cannot be detected by normal election security
procedures. Independent reviews commissioned by State election
officials have revealed serious security vulnerabilities in the
software architecture of voting systems now in use.
Typically, when concerns about the security and reliability
of voting systems are raised, supporters argue that these
systems have been tested to Federal standards. However, at a
recent hearing of this subcommittee, the Government
Accountability Office reported, ``The tests performed by
independent testing authorities, and State and local election
officials, do not adequately assess electronic voting systems
security and reliability. These concerns are intensified by a
lack of transparency in the testing system.''
The GAO, which is an independent bipartisan governmental
agency, noted weak and insufficient system testing, source code
reviews and penetration testing. They pointed out that most of
the systems that exhibited the weak security controls had been
nationally certified after testing by an independent testing
authority.
Now that is scary. They're saying you cannot trust them and
they've been certified. Last summer, the EAC undertook a review
of the laboratories that had been testing under the NASED
program. The assessment review of one of these labs, CIBER,
concluded, ``CIBER has not shown the resources to provide a
reliable product.'' The report also noted, ``CIBER reports
provide limited or no descriptions of the testing performed, so
a reader or reviewer cannot tell if all the testing was
completed.''
This is very serious. One of the things that we
want to accomplish at this hearing is to determine how we can rectify this.
Here, in New York, an independent review--and I want to
applaud the elections board of New York; they went out and got
an independent review, which many States did not, but New York State
is so concerned about this issue--an independent review
of CIBER's test plans revealed that CIBER did
not document the methodologies, procedures, and processes
necessary to ensure that all testing is done in a structured
and repeatable way.
It is estimated that CIBER has tested the software in more
than 70 percent of the voting machines used last November. So
what the GAO and the independent review in New York are telling
us is that 70 percent of those voting machines that are out
there being used, really have not been tested adequately and
have not been certified adequately, and may have serious flaws.
Estimated, because there is no way to know for sure which lab
tested which system, and apparently there's also no way of
knowing, for sure, if any testing was done at all. Trusting the
word of the ITA or testing labs, election officials across the
country used taxpayer money to purchase equipment, believing
that this equipment was in conformance with Federal standards.
Apparently, we have no way of knowing whether the equipment
actually does meet Federal standards. CIBER hides behind a
cloak of confidentiality, and personally, I believe that in
something as important as the reliability of our voting
machines, there should be no confidentiality; it should be
transparent and open to the election officials, and I would say
the public.
Because test methods are considered proprietary, the public
and election officials cannot verify that procedures were done
properly. When a system fails a test, there is no public
announcement. Why in the world aren't they telling people, if
certain systems are failing these tests? We have a right to
know this.
Many States went out and bought these machines, thinking
they were reliable. If they had known that they had failed
tests, or hadn't even been certified, they would never have
bought them.
Further, if the system subsequently passes, there is no way
to identify what changes the manufacturer made, if any, to
enable the system to pass. Considering that CIBER certified 70
percent of the machines that were used last November, we have a
real dilemma. Do we keep using machines that were certified by
these testing labs that did not meet the standards for
accreditation, or do we have to start all over and recertify?
That is a basic question before this committee today.
I am very pleased that CIBER will be here today to respond
to our concerns. The National Testing and Certification Program
has been vital to the sales and acceptance of voting machines
in most States. Experience is often the best test, and a great
many jurisdictions are finding problems with the machines
that the testing labs seem to have missed.
Several States have moved forward quickly to buy touch
screen voting machines, and they are realizing that the
machines they bought do not work very well.
New Mexico--the State of New Mexico--decided to switch to
optical scan voting statewide in 2006, including in
four counties that had spent nearly $4 million on touch screen
machines. Last month, Maryland switched to optical scan. They
even took the extraordinary step of having paper ballot votes
because they didn't trust the machines.
This month, Florida followed suit, and incidentally, there
will be hearings in Washington on the contested ``Florida 13''
because of the missing votes. New York is looking pretty smart
these days. We were criticized for not going out there and
buying those machines. There were court suits against us. But I
think New York looks pretty smart, because New York focused on
standards and refused to jump quickly into untested technology.
Our elected officials may have saved taxpayers a great deal of
money. We didn't buy machines that we have to change, and the
New York delegation, led by Congressman Serrano, is working
very hard to restore the $50 million that was taken away from
New York State.
It was part of a bill that was moving forward, that has
been vetoed; but we believe we will be successful in restoring
that money.
We need meaningful testing to make sure equipment meets the
2005 standards. This hearing provides an opportunity to examine
the current state of voting systems testing and certification
in this great Nation. It can also serve as a step toward a more
transparent and trustworthy process in the future. Unless we
improve our certification process, we are in danger of losing
the confidence of American voters.
And I want to really thank the advocates and citizens that
turned out today. Your constant questions, e-mails, and
phone calls to me are among the reasons that I have reached
out to the chairman of the appropriate committee to hold these
hearings, and he has done a magnificent job. I am sure he
will not stop until he is satisfied that we have safe,
reliable, transparent voting machines. So I thank everyone,
especially the chairman.
[The prepared statement of Hon. Carolyn B. Maloney
follows:]
[GRAPHIC(S) NOT AVAILABLE IN TIFF FORMAT]
Mr. Clay. Thank you so much, Representative Maloney. Let me
also say that I represent Missouri, which is known as the
``Show Me State,'' and Representative Maloney has certainly
laid down the marker for the intent of this hearing and
future hearings on transparency. So it is time that the
people who produce election machines, those who monitor them, and those
who have authority over them, show the people of this country
that the process is transparent, and show them that their votes will be
counted accurately.
And let me say that on our first panel, we will hear from
the Honorable Donetta Davidson, Chair of the U.S. Election
Assistance Commission and Mr. Mark W. Skall, chief of the
Software Diagnostics and Conformance Testing Division within
the Information Technology Laboratory of the National Institute
of Standards and Technology.
And we also have our newest commissioner of the Election
Assistance Commission, Rosemary Rodriguez. Thank you for being
here, Ms. Rodriguez. Let me thank all of you for being here
today before the subcommittee and it is the policy of the
Committee on Oversight and Government Reform to swear in all
witnesses before they testify.
I would like to ask you both to stand and raise your right
hands.
[Witnesses sworn.]
Mr. Clay. Thank you. You may be seated. Let the record
reflect that the witnesses answered in the affirmative and I
will ask you both to give a brief summary of your testimony and
to keep the summary under 5 minutes in duration, and those
lights in front of you will indicate when you get down to 1
minute, and then when it turns red, that means your 5 minutes
is up.
Your complete written statements will be included in the
hearing record.
Ms. Davidson, we will begin with you. Please proceed.
STATEMENTS OF DONETTA L. DAVIDSON, CHAIRMAN, U.S. ELECTION
ASSISTANCE COMMISSION; AND MARK W. SKALL, CHIEF, SOFTWARE
DIAGNOSTICS AND CONFORMANCE TESTING DIVISION, NATIONAL
INSTITUTE OF STANDARDS AND TECHNOLOGY
STATEMENT OF DONETTA L. DAVIDSON
Ms. Davidson. Thank you very much, Mr. Chairman. We are
here to discuss the reliability of voting systems. With the
committee's permission, I think it's important to talk, just
for a moment, about how equipment has been tested in the past.
The National Association of State Election Directors [NASED] tested
voting equipment against the guidelines created by the Federal
Election Commission. They did this as a volunteer process and
without any Federal funding.
The testing was done to the 2002 standards, but the Federal
Government, at that time and up until just recently, did not
certify voting equipment.
It wasn't until the Help America Vote Act--we also know it
as HAVA--that this was put into place, where we could
test equipment, and I would like to go further into that with
questions, because my statement won't allow time, but we'll go
further into it.
HAVA requires the EAC to create voting system guidelines, and it
also accredits the labs which will test voting systems.
The Commission adopted the Voluntary Voting System Guidelines
in December 2005. Our certification program got underway to
test voting equipment this year. And let me be absolutely
clear. We did not grandfather any vendors or test labs into the
process.
The National Institute of Standards and Technology is EAC's
valuable partner in both of these areas. NIST evaluates the
test labs and provides recommendations to the EAC.
After review, NIST recommends, and we conduct additional
reviews before the Commission
makes the final decision. As of today, we have two accredited
labs. There are nine manufacturers or vendors that have
registered for our program. Five systems have been submitted
for certification. Information about these labs and the
manufacturers is on our Web site at www.eac.gov. The EAC will hold
the vendors and the labs accountable for doing their jobs and make sure they
take responsibility.
We do have the ability to decertify in both cases. We have set
up a quality monitoring program, and we will work hard with the
States to investigate reports of voting system
irregularities and share this information with election
officials and the public.
So what does the future hold for voting systems? We are
working with NIST on the next iteration of guidelines and we
expect to receive it a little later this year.
Just like the 2005 guidelines, this version will further
increase security requirements. However, no matter how thoroughly
we test voting machinery, people ultimately ensure the voting
equipment is reliable. People remove the ballots from the
ballot boxes. People unlock the optical scan machines and
remove the ballots. And people program all voting equipment.
To successfully compromise a voting system, any voting
system on election day, you must have two things--knowledge of
that system and access to that system.
Focusing on the security of voting machines in a laboratory
is not enough. No voting system, ballot box, touch screen or
optical scan, should be trusted unless officials store them in
secure locations, prevent tampering, conduct logic and accuracy
testing as well as all other testing, have well-trained
workers--in other words, your poll workers--audit the results,
and let the public observe the process.
I have spent most of my career in elections, and some
things never change. Details matter, whether we are using paper
ballots, touch screens, or DREs, the direct
record equipment. It is important to remember that the voting equipment
must work properly, and also that we have procedures and make sure
that the people are well-trained to control access and
maintain the equipment properly.
Thank you. I look forward to your questions.
[The prepared statement of Ms. Davidson follows:]
[GRAPHIC(S) NOT AVAILABLE IN TIFF FORMAT]
Mr. Clay. Thank you so much, Ms. Davidson, for your
testimony.
Mr. Skall, you may proceed.
STATEMENT OF MARK W. SKALL
Mr. Skall. Thank you. Chairman Clay and members of the
subcommittee, thank you for the opportunity to testify today. I
am Mark W. Skall, chief of the Software Diagnostics and
Conformance Testing Division of NIST, part of the Technology
Administration of the Department of Commerce. I will discuss
NIST's role in voluntary voting system guidelines and
testing.
Some of the major items assigned to NIST by HAVA included
chairing and providing technical support to the Technical
Guidelines Development Committee [TGDC], in order to develop
voluntary voting system guidelines and conducting an evaluation
of independent non-Federal laboratories, in order to submit to
the EAC a list of those laboratories that NIST proposes to be
accredited by the EAC to test voting systems.
These voluntary voting system guidelines [VVSG], contain
requirements for vendors when developing voting systems, and
for laboratories when testing whether the systems meet the
requirements of the guidelines.
The TGDC provides technical direction to NIST in the form
of TGDC resolutions, and reviews and approves research material
written by NIST researchers. The TGDC ultimately is responsible
for approving the guidelines and submitting them to the EAC.
HAVA provided for the creation of the TGDC and mandated
that the first set of recommendations for voluntary voting
system guidelines be delivered to the EAC 9 months after the
final creation of the TGDC.
To meet this very aggressive schedule, NIST and the TGDC
conducted workshops, meetings, and numerous teleconferences to
gather input, pass resolutions and review and approve NIST-
authored material.
This was done in a fully transparent process, with meetings
conducted in public and draft materials available over the Web.
These guidelines built upon the strengths of the previous
voting system standards, enhanced areas needing improvement,
and included new material, primarily in usability,
accessibility and security.
The resultant document, now known as the VVSG 2005, was
delivered on schedule to the EAC in May 2005.
Immediately after completing its work on the VVSG 2005,
NIST and the TGDC began working on the next iteration of the
VVSG which is currently planned for delivery to the EAC in July
2007.
The new VVSG will be a larger, more comprehensive standard,
with much more thorough treatment of security areas and
requirements for equipment reliability. This VVSG will include
updated requirements for accessibility and requirements for
usability based on performance benchmarks. It prohibits radio
frequency wireless communications, which includes the use of
common wireless local area networks.
In December 2006, the TGDC approved a resolution to include
requirements in the VVSG only for those voting systems that are
software independent. This essentially means that the voting
system can be audited through the use of voter-verified paper
records, so that election fraud and errors that would result in
changes to election outcomes can be reliably detected.
To encourage innovations in voting systems that could
produce more usable, accessible and reliable designs, the new
VVSG will include an innovation class. Some innovations
resulting from this class could result in secure voting systems
that do not rely on voter-verified paper records.
NIST is also developing open, comprehensive test suites, so
that the requirements in the draft VVSG can be tested uniformly
and consistently by all of the testing labs.
NIST has been directed to recommend qualified testing
laboratories to the EAC for accreditation. In order to
accomplish this, NIST is utilizing its National Voluntary
Laboratory Accreditation Program [NVLAP]. Simply stated, NVLAP
offers an unbiased third party evaluation and formal
recognition that a laboratory is competent to carry out
specific tests or calibrations.
NIST first accredits voting system testing laboratories
according to NVLAP's criteria and then recommends them to the
EAC.
In January 2007, NIST proposed that iBeta Quality
Assurance and SysTest Labs be accredited by the EAC under the
provisions of HAVA. Currently, NVLAP is proceeding with the
evaluation of five other laboratory applicants.
In conclusion, NIST is pleased to be working on this matter
of national importance with our EAC and TGDC partners. Thank
you for the opportunity to testify. I would be happy to answer
any questions the subcommittee might have.
[The prepared statement of Mr. Skall follows:]
[GRAPHIC(S) NOT AVAILABLE IN TIFF FORMAT]
Mr. Clay. Thank you so much, Mr. Skall.
We will now proceed to the questioning period under the 5-
minute rule, and I will start with Ms. Davidson.
Ms. Davidson, I am aware of your background in the area of
systems certification, through your work as Secretary of State
in Colorado, and through the National Association of State
Election Directors.
With this expertise, I am hopeful that you can offer some
explanation and potential solutions. What activities has the
Technical Guidelines Development Committee of the EAC, in
concert with NIST, and the vendor community, undertaken to
bring uniformity to the accreditation process of certification
labs?
Where is the EAC in determining whether to reinstate the
labs that lost their interim accreditation in 2006?
Ms. Davidson. Currently, Mr. Chair, we have set up a
temporary process to get us through last year's election,
to make sure that we were able to test just software, not
systems, because of State laws changing, or maybe a piece of
equipment failed and needed some software change, or the other
issue is a name on a ballot came off, or a court case. So
State law, that type of thing, would cause that. We had three
that were tested, only three minor changes. In that process, we
said that underneath what the--and we did this at a public
meeting in August 2005, where our Standards Board and our
Advisory Boards were there, and we went through the process of
saying this is what we will do if we cannot get laboratories
that have been recommended by the NIST/NVLAP process.
Because of their thorough process, we were told that it was
going to take over a year to get them through the process. It
is a very thorough process, to get it really worked through. So
in January, we allowed the three labs that NASED had actually
accredited as independent test labs, and we allowed them to
qualify, you know, to actually register to go through the steps
and the procedures.
In that, two labs were named, in October, and they
testified in a public meeting that we had. So there was a
public meeting with the two labs that had met that criteria, it
was SysTest and it was Wyle.
At that time, CIBER had applied, they also applied, but
they had not met all of the requirements that we felt they
should. We went through the same process that was set up by
NVLAP with NIST, and really tried to make sure that the labs
would meet the needs that we needed. And this was only to 2002
requirements, not to 2005.
We weren't checking voting systems, only the software, at
that time. So we are still in the process with CIBER, if they
meet that, you know, that interim process. But at this time, if
they do not meet that--and we expect to have that, you know,
information before too long--then they'll continue going
through the NVLAP process and trying to get the letter from
Dr. Jeffrey at NIST to come to us and
recommend that they be accredited. They are one of the
five labs that have registered, that has not gone through the
full process with NVLAP at this time.
Does that answer your question thoroughly enough?
Mr. Clay. Well, wait a minute now. Are you comfortable with
the other two labs that have gained certification? Are you
confident that they are doing what is necessary to check these
systems throughout the country?
Ms. Davidson. The two labs that have the accreditation, now
the full accreditation, because we received a letter in January
from NIST, recommending that we accredit SysTest, which was
one of those labs, and the other one is iBeta, and those labs
have gone through the whole process, through NIST, and with
that process I think Congress was very wise in putting NIST in
control of that, because they go through that process with all
different kinds of labs. They are really very qualified to do
that.
So in moving forward, yes, I feel that our labs will be
able to test to the standards that have been developed, and
they currently--because we did not grandfather anything in--
they can test to 2002 or 2005.
The equipment that is out there right now has the
recommendation from the NASED association, which was a
volunteer association, no Federal money. So the two labs that
are there now, yes, I feel that they definitely can.
And one of the things that we do is any time we set new
standards, NVLAP will go back out to make sure that they meet
that, and in our requirements, we also put that we can go into
the labs at any time and verify the process they are using, to
make sure that they are doing the job correctly.
Mr. Clay. Thank you for such a thorough answer. Let me ask
one more and then I will turn it over to Representative
Maloney.
What is the commission doing about the system flaws that
were reported during the 2006 election cycle? In particular,
what will it do with reports of significant flaws or failures
in systems certified under NASED for the 2007 and 2008 election
cycles? Will the commission decertify NASED systems, if
warranted?
Ms. Davidson. In our process, they have to go through our
process for us to be able to decertify. But one of the things
that we are doing is if there is something that has come in for
certification--as we said, we have five different systems that
are in now--if that is one of them that had issues, we have sent
that manufacturer a letter, asking them if they are addressing
that in the new process that they have gone through with the
test labs.
So that the laboratories will be aware of it, and any time
we get anything from the States, if the system is going through
it we make the laboratories aware of what the issues are.
So we are definitely making sure that if they are going
through our process, we feel that we have authority at that
time.
Mr. Clay. So you all actually report to the certification
board, to NIST, if there are flaws or problems, and they are
brought to your attention?
Ms. Davidson. We will notify the laboratories themselves,
if we are aware of any problem, so that they can check, too,
what the problems are--whether it is a State issue or whether it is an
issue that has been, you know, really gone through a process
some other way, we will definitely notify the labs of the
issues.
Mr. Clay. Thank you for that response.
Representative Maloney, please proceed.
Mrs. Maloney. Thank you.
I would like to start with Mr. Skall, and if you would like
to also answer, Ms. Davidson. Thank you very much for being
here and for all your hard work, both of you.
Considering that CIBER certified 70 percent of the machines
in use last November, and that now they have been suspended for
inadequate certification and testing, we have a huge challenge
in front of us. Do we keep using machines that were certified
by the ITA, or testing labs that did not meet the standards for
accreditation? Or do we have to start over and recertify? What
are we going to do with those 70 percent that--Mr. Skall?
Mr. Skall. Thank you. Now of course at NIST, we are a
technical agency and don't make policy decisions like that. I
guess we are very lucky not to be in that situation. But I will
give you my perspective from a technical analysis.
Mrs. Maloney. Yes.
Mr. Skall. Making sure that voting systems work correctly
is a very complex process. It starts with a standard. For the
most part, you can only test to the standard. You can do some testing outside of
the standard--you can look through the source code and find
security glitches.
But the vast array of detailed testing is what we call
functional testing, and it starts with having a comprehensive
well-specified standard. So in my opinion, until you actually
have really precise, detailed standards in place, which have
tremendously precise and accurate requirements for security and
accessibility, it is very difficult to get systems tested
thoroughly. So the first step is to have the standards in
place.
Mrs. Maloney. Do we have those standards in place now?
Mr. Skall. We have one standard in place, the 2005
standard. We are about to deliver to the EAC the much more
comprehensive standard. We are planning to deliver that to the
EAC in July 2007.
Mrs. Maloney. So you are going to come out with it. See,
what happens, though--and I just have to jump ahead--you keep
improving the standards, and then, if the States go out and buy
these machines, then they have to totally change them to the
new standard. So that is a problem for States, and so could you
address that.
Mr. Skall. Yes; absolutely.
HAVA mandated that we produce the first set of voluntary
voting system guidelines in 9 months. By definition, that meant
we can only do an incremental update to the existing standards.
We knew, right away, that we needed a more comprehensive
standard. The one in 2007 is the comprehensive standard. I
don't have any plans, and I do not believe the EAC does, to
change that standard for a long, long time. This will be the
standard in place for many, many years.
It won't be a moving target. It is the one that is going to
have all the requirements that we and the TDGC felt were
necessary.
Mrs. Maloney. And that will be in place. And where
specifically does it change from the 2005 standard?
Mr. Skall. Oh, it is much more comprehensive in the areas
of security, access control, cryptographic requirements, what
I mentioned before, software independence, which allows for the
voter to verify his or her vote. This concept of an innovation
class, which is going to allow, hopefully in the future, for
automated solutions to voter verification, much more detailed
requirements in usability for performance benchmarks, to allow
much more innovative designs to meet the performance
benchmarks, reliability, accuracy, tremendously--much more
comprehensive.
Mrs. Maloney. Sounds great. But based on your statement,
then, we haven't really scientifically certified these 70
percent of machines that are being used.
So I guess the question goes to the policymaker. Ms.
Davidson, are we going to keep using machines that were
certified by the ITA, that did not meet the standards for
accreditation, or do we have to start all over?
Ms. Davidson. We felt like we had to start over.
Mrs. Maloney. So you're starting all over to recertify
them.
Ms. Davidson. In January, we asked all the vendors--letters
went to all of them--asking them to come back in and be
retested, because, as you have stated, most of the States are
using equipment that meets the 2002 guidelines and not
the 2005, because of the deadline that was set in HAVA.
So many of the States have purchased that equipment and we
feel that it does need to be retested, and if they want our
seal--it is a volunteer program--but if the States want the
seal, where we can then go back and decertify if there are
issues, we have asked for that equipment to come in.
We have five that have already got their equipment in; we
expect many more. We expect another lab, within just a short
time, from NVLAP; they are also through. So we are moving
forward. We feel it has to go through the process that we have
set up.
Mrs. Maloney. OK. Is there any reason--again I'll start
with Mr. Skall--why the testing process and test reports should
be done in secret? Why shouldn't the public be able to verify
that testing was done properly?
And we have some of these vendors saying everything we do
has to be in secret. Well, how in the world do you certify that
they're doing it properly? So my question is, is there any
reason why the testing process and test reports should be done
in secret?
Mr. Skall. Again, let me give the technical answer to that.
Right now, the problem, in my opinion, from a technical
standpoint, is there is no uniform set of tests across all the
labs, no publicly available uniform set of tests. Labs develop
their own tests, they're proprietary, whether they should be
proprietary or not I guess is a legal and policy question, but
what we're doing at NIST is developing, starting in fiscal year
2007, a comprehensive set of test suites that all the labs can
use. They will be publicly available, there will be tremendous
transparency, and once this test suite is done----
Mrs. Maloney. OK. Let's go to another point. Why should the
labs be doing the testing? That's like the fox in the chicken
house. I mean, why should the manufacturers be doing this
testing? They have been certifying--or it is changing now,
money is going to go to EAC and then go to the labs----
Mr. Skall. Yes. So you are getting into the question of
whether, in fact, the vendor should pay the test labs to do
testing. Again, it's--would you like to----
Mrs. Maloney. So do you see any reason, once we come out
with a uniform set of tests, that this testing should be done
in secret? Is there any reason why----
Mr. Skall. Oh, no, it should not be done in secret, and, in
fact, eventually there will not be proprietary test suites,
because we will develop them; they will be in the public
domain, they will be completely open for everyone to see.
Mrs. Maloney. That is great news. That is great news. Ms.
Davidson, would you like to respond?
Ms. Davidson. The one thing I believe I would like to add
is we do support Congress giving us authority to collect
the money, and then, whether it is by lot or whatever the case
may be, we set up a procedure and it is an open procedure. We
have hearings on issues that we bring into procedures.
So there would be a process set up where we would collect
the money and then the lab would be selected for that
manufacturer or vendor.
So we see that would improve it, because it is a conflict,
and there is a lot of the public that is very concerned about
it, as are we.
Mrs. Maloney. I ask the chairman, may I have an additional
2 minutes to ask a question.
Mr. Clay. Please proceed.
Mrs. Maloney. OK. I would like to ask Commissioner
Davidson, and Mr. Skall, if you would like to comment, in
Section 202 of HAVA, Congress tasked the EAC with serving as a
clearinghouse of information on the experiences of State and
local governments in implementing the guidelines and in
operating voting systems, in general.
And when a security vulnerability or a system flaw is
revealed, or when your assessor determined that the main
testing lab is not testing adequately, why hasn't the EAC made
every effort to share this information with election officials
and the public? Restoring the trust of the American voter
should not be a public relations effort. The trust of the
American public must be earned through transparency and
accountability, and if you are--you're tasked to be a
clearinghouse, but I have heard concerns that this type of
information, when it comes in, does not get sent out to the
election officials and to the public.
Ms. Davidson. Currently, the EAC is reviewing how we can
move forward, because, you know, when we get things from third
parties, if it is not coming from the State, how do we make
sure that it's reliable information and correct information?
And that is one of the things we feel is a responsibility of
the EAC, that is, make sure that it is correct.
We thought about setting up a review panel. We have given
consideration, you know, how do we, you know, actually walk
through this process? Because it will happen in the future.
Mrs. Maloney. But Commissioner, if a report comes in from a
State election official, I mean, that is a pretty serious
thing, and the question is why are you not sharing that with
other State election officials? Maybe they would not have
bought some of these faulty machines, if they knew some of the
problems that were coming in from other States.
We want to get good machines out there and a good system
out there. So if information's coming into the clearinghouse, I
would say it is true that you have to verify that it is true. But
if it is coming in from a State election official, from a
Secretary of State or whatever, this is a very serious piece of
information and what I am being told is that you are not
sharing it with other States, the election officials or the
public.
Ms. Davidson. We have taken the position, now that we have
started certifying, yes, that type of information will be
shared, and because I mean, we have just now----
Mrs. Maloney. Now you will be sharing it. OK.
Ms. Davidson. That is right. That is correct. If it comes
from a Secretary of State, and if it comes from a county
official, we feel like we have to get on the ground and see
what the issue was with that. Because many times, whether
it was a poll worker, whether it was actually somebody that did
the setup of the election--you know, we have to make sure
whether it is a machine problem or something else, and report whatever that
issue might be.
Mrs. Maloney. And last, Commissioner, was there any
communication between the White House and the EAC concerning
the release of the voter fraud, voter intimidation report, or
any of the other reports that have been submitted to the EAC?
Ms. Davidson. Because of everything that was brought up in
that, and, you know, it is such a hotly contested issue, we
have asked our Inspector General to do a full audit of our
process and of those reports, and to give a report and we would
be more than happy to give you that once that is done. We also
will be changing----
Mrs. Maloney. When do you expect that to be done?
Ms. Davidson. You know, they haven't given us a timetable
but I would say, hopefully, it's done within a month.
Mrs. Maloney. Within a month. But the question, was there
any communication between the White House and the EAC? That is
a simple question.
Ms. Davidson. Yes. Not that I know of, but, you know, I
know that they have kind--they have put a gag order on us
talking to anybody else within our own office. So for me to ask
somebody, I--you know, they are going through all of our e-
mails, they are going through all the records, paper records,
everything, to see if there was any communication with--whether
it was a Congress Member or whether it was the White House.
Mrs. Maloney. Do you know of any communication with DOJ,
the FEC or the RNC?
Ms. Davidson. I am not aware of any.
Mrs. Maloney. Thank you.
Mr. Clay. Thank you, Representative.
Mrs. Maloney. By the way, Mr. Skall, would you like to
comment on the clearinghouse question of information? This is a
concern that many State governments have brought to Mr. Clay
and myself, that they want this information coming out from the
clearinghouse, that they were tasked by HAVA.
Could you comment on that aspect.
Mr. Skall. You know, again, as sort of the technical arm of
developing the standards and tests, it's just not an area we
have much expertise in.
Mrs. Maloney. All right. Thank you very much for your
testimony and thank you for your work.
Mr. Clay. Thank you.
Mrs. Maloney. Both of you. Thank you.
Mr. Clay. Mr. Skall, let me ask you, are there time limits
for labs to address problems found during the pre-assessment,
assessment or monitoring phases of accreditation? What steps
does NIST take if these time limits are not met?
Mr. Skall. No; there are no time limits. The way NVLAP
works is the NVLAP accreditation very much depends on the
readiness of the labs. Some labs are further along, some labs
are not very far along, and it takes them a lot of time to do
remedial type actions to get up to speed, and NVLAP will not
issue an accreditation until we are 100 percent confident that
the lab can perform its services.
So in the procedures there is no time limit, nor do we ask
the labs to move faster, because we want them to do it
correctly.
Mr. Clay. Thank you.
Ms. Davidson, can you explain the rationale by the EAC to
exempt off-the-shelf products from the VVSG guidelines for
testing and certification purposes, since so much of the
software and components used in voting systems are COTS
products? Isn't there an effective way to evaluate these
products?
Ms. Davidson. You know, I think that the technical portion
of your question Mr. Skall should answer. Really----
Mr. Clay. I'll go back to him and let me hear what the
rationale is from EAC.
Ms. Davidson. All right. We actually are doing exactly what
the standards are saying, the voluntary voting system
standards, that we don't take a position because we feel that
is an independent body, the Technical Guidelines Development Committee,
setting up what the guidelines should be in those arenas, and
we have not taken a position on that ourselves as an EAC.
Mr. Clay. OK. Mr. Skall, is there an effective way to
evaluate these products?
Mr. Skall. Yes. COTS, as they are commonly called, commercial
off-the-shelf systems, have had an exemption, a limited
exemption, throughout the history of voting standards. The
reason for this exemption--and the exemption has to do--it is
not a total exemption, they are tested, but some aspects of the
source code are not tested mainly because we can't acquire
them.
Typically Microsoft, for instance, and other large
commercial off-the-shelf vendors are not going to give their
source code. That's a tremendous proprietary interest to them
and they will not give out and make public their source code.
So there are limitations in what we can acquire.
We, in the VVSG 2007, are really tightening this loophole.
We are looking much more closely at which types of systems get
exemptions and we are limiting the type of exemptions. So we
are going to test these systems as much as possible within the
confines of the amount of source code we can get.
Mr. Clay. Thank you for that. I would like to hear some of
your thoughts on the new VVSG guidelines that are scheduled to
go into effect at the end of this year.
I think we all agree that a good certification process is
meaningless, if the standards being used are incomplete.
What is the status of development for the 2007 Voluntary
Voting System Guidelines? And are there any major topics,
originally planned for this edition, that will be deferred to a
later version of the guidelines?
Mr. Skall. Yes. Let me first say, I agree 100 percent. We
look at the viability of software and hardware as sort of a
three-legged stool. You have the standards, you have the tests,
and then you have the implementation, in this case the voting
system, and if one of those legs falls over, the whole system
falls over.
So you need a good standard, you need good tests, and then
you need a good implementation based on that.
The VVSG 2007, as I mentioned before, is very
comprehensive. We are on schedule to complete it. There is
nothing that I know of, that will not be in the VVSG 2007, that
we want to be there. So it will be a complete standard. Now we
may discover in the future, there are more minor things, and
those can be added by probably maintenance to the standard.
But there are no major areas or functionality I know of,
that will be missing.
Mr. Clay. Ms. Davidson, would you like to comment.
Ms. Davidson. Yes, sir. I certainly would. I appreciate
that. Once they are delivered, by law, to the EAC, we have to
publish that in a public register, at least for 90 days. The
last one, we got 6,500 comments that had to be vetted. From the
time it was delivered to the EAC to the time that it was
adopted--that was July, I believe, or it was delivered in May
2005--it took until the middle of December to get that actually
vetted, and we feel this process will take longer.
We feel we need to have some open meetings. We are not sure
what it is going to take the manufacturers to build this new
equipment. This, as Mr. Skall has discussed, is very complex,
and adds a lot of details to the voting equipment. It is the
future of voting systems.
How long will it take to develop that? Also we need to know
from the State officials and county officials, in a hearing,
what kind of timeframe we are looking at for
replacing equipment, and how long we need to keep our
2005 guidelines, because, like you said, you can't constantly require States to
purchase new equipment.
We need to get information from them. This needs to be a
very public process. We need to hear from the advocacy
community. So as we move forward in this process, we expect it
to take some time, because it has to be vetted; the public has
to have their right to input in public meetings and
hearings, and to be able to send in their comments to
the EAC.
So we will work with NIST, as we did last time, once these
comments come in, to make sure that the best product comes out,
because we want the very same thing that you want. We want
reliability. We want our elections to be a success in the
future.
Mr. Clay. Thank you for that response.
Ms. Davidson, since New York failed to procure new systems
by 2006, it is my understanding that they will lose
approximately $50 million in HAVA funds.
Due to the circumstances facing New York, will the EAC be
offering the State a waiver to use the funds, once their
technical concerns are satisfied? And if not, why not?
Also, can you tell us if there are other States that might
not have spent their HAVA funds due to concerns over the
accreditation and certification processes.
Ms. Davidson. You know, we follow the law. Right now, the
law says they have to return the money but we are aware that
there is a bill, as mentioned by the Congresswoman, that they
would be able to keep that money and obviously, with that going
through the process, we would not be moving forward with that.
I kind of feel like the Congresswoman. I think that is
going to be a process that gives us the ability in the law, that
says that States that did not spend their money can retain it.
I think it's until 2008, is what is in the bill currently. But
we will follow the law.
The law is what is there but, obviously, we try to make
ourselves always aware of new legislation.
Mr. Clay. So right now, the commission couldn't
administratively give the waiver to the State of New York
or----
Ms. Davidson. We cannot give the waiver but, obviously, we
know that there is a process moving forward, so we have not
sent out any letters.
Mr. Clay. Are there any other States that are also kind of
caught in limbo as far as the certification process?
Ms. Davidson. As far as other States, they are not caught
in limbo. They have bought equipment, but maybe one county
didn't, like in Pennsylvania, I believe there is one county,
one individual county, so they were going to have to return
a very small amount.
There are other States--Arkansas--that have to return a very
small amount. But New York is the big one, in that they didn't
move forward and buy equipment; it was because of other
issues that some of the others didn't purchase equipment.
Mr. Clay. In New York's case, they didn't move forward
because they were cautious, because they wanted to make sure
they got this done correctly, I mean, and I'm sure we will make
the case for this State in Congress. But I mean, you do
understand that they moved very cautiously, which I can
appreciate. I think others can too.
Ms. Davidson. We definitely understand their position. We
asked for reports from States, like the law asks us to, and we
have a full list, if you want that, of States, what kind of
funds they still have out there, because it does affect more
than one State, when you're passing that legislation.
Mr. Clay. Sure. We would love to see the list, if you
could provide it to the subcommittee.
Ms. Davidson. OK.
Mr. Clay. Representative Maloney, any other questions for this panel?
Mrs. Maloney. Very briefly. I just wanted to comment on
your statement, Commissioner Davidson, that ultimately it is a
human hand and human accountability. I looked at one machine
that Smartmatic manufactured under the Sequoia name, and they
literally had a yellow button on the back of the machine where
you could change the vote. It was unbelievable.
So when I inquired, what do you do to make sure that
someone's not changing the vote on the back of the machine? and
the answer was, well, we will have people watching to make sure
that no one is changing the vote on the back of the machine.
So I feel that we should not have machines like Sequoia's,
with a yellow button you can use to change the vote, but that there
still has to be a human element, and I hope Mr. Skall's guidelines will help
remove the need for that. I have been in some New York
elections where absolutely every voting machine has had a
citizen-watcher to make sure that everything is done properly.
But back to your statement that everything should be
public. When a system fails a test, there is no public
announcement. Wouldn't that be helpful for the public and for
Mr. Skall, and others, to know that this system has failed? And
then, ultimately, when you test, you are testing to standards.
What about the hackers? It is the hackers that are getting into
these machines.
There are reports in the paper that one from Princeton
hacked in, and you're not really testing to prevent the hackers
from getting in there and doing their thing.
Your response?
Ms. Davidson. Well, currently, the only ones that we are
aware of that have been hacked into have been at Princeton in a
lab, and not in a polling location. We are not aware of any
equipment being hacked into on election day.
Mrs. Maloney. But that is the point. You are not aware of
anyone hacking in. It doesn't mean that someone hasn't hacked
in, and the testing doesn't really prevent hacking or look at
the hacking approach. It looks at the standards and tests the
standards as opposed to how a hacker goes in and sees what's
missing and how to get in there.
I mean, since we haven't tested against hackers, we don't
really know whether they have gotten in on election day or any
other time.
Ms. Davidson. And I think that is the reason why NIST and
the TGDC have definitely put a lot of work into security and
into cryptography, as Mr. Skall mentioned.
That is why the new guidelines have really gone into that
area. But, you know, I think you're going to get a far more
detailed answer from Mr. Skall than from myself, if you would
like.
Mrs. Maloney. But on a policy statement, when a system
fails a test I'm told there is no public announcement. Maybe
that is the type of thing that should go into the
clearinghouse, so that election officials across the country
will know what systems are failing and why, and be on the alert
for it.
So my question is when a system fails a test, there is no
public announcement. Why not? Why aren't we putting that in the
clearinghouse and getting it out to election officials?
Ms. Davidson. As I stated before, that will be a process
that we are looking at, is how do we get it out, how do we make
sure it's reliable. As you said, if it comes from a State or
election official, it needs to be out there.
Also, it has been our policy to do a
newsletter, and the newsletter also goes to our oversight
committees on the Hill, and we try to make that available not
only to election officials in the Nation but to our oversight. I
believe that NIST is on the list. We add anybody that would like to be
put on to our list for our newsletter.
Mrs. Maloney. Thank you.
Mr. Skall, on the hacking question, how do we know they
haven't hacked in on election day, if we're not testing
antihacking----
Mr. Skall. OK. Let me answer that in a couple of ways. We
are testing security requirements. So the standard itself, the
new standard will have something called requirements for open-
ended vulnerability testing.
This is precisely to check, to see whether, in fact,
hackers have hacked in. Now it is well beyond the state-of-the-
art to prove and to be certain that someone hasn't hacked in,
just like it is beyond the state-of-the-art to prove the
software works correctly. You can't prove it. You can only get
an indication of reliability and of security.
So we will have more comprehensive tests. There are some
tests now, the examination of source code, for that very
reason. We will have more tests, more requirements.
Can we be sure someone has not hacked in? No. Will we have
a better feel, a better confidence that they haven't? Yes.
So we're at the point where we can be more comprehensive
but we can never be sure, and we never will be able to.
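[Note.--The limits of testing that Mr. Skall describes can be
illustrated with a short sketch in Python. The tally routine, ballot
format, and trial count below are hypothetical assumptions chosen for
illustration only; passing many randomized trials raises confidence in
the software but, as the witness states, cannot prove the absence of
flaws. The sketch follows:]

    import random

    def tally(ballots):
        """Count votes per candidate from a list of ballot selections."""
        totals = {}
        for choice in ballots:
            totals[choice] = totals.get(choice, 0) + 1
        return totals

    def fuzz_tally(trials=10_000, candidates=("A", "B", "C")):
        """Feed random ballots to tally() and check a basic invariant:
        per-candidate totals must sum to the number of ballots cast.
        Passing every trial is evidence of reliability, not proof."""
        for _ in range(trials):
            ballots = [random.choice(candidates)
                       for _ in range(random.randint(0, 500))]
            totals = tally(ballots)
            assert sum(totals.values()) == len(ballots), "invariant violated"
        return trials

    if __name__ == "__main__":
        print(fuzz_tally(), "randomized trials passed; this raises")
        print("confidence but proves nothing about untested inputs.")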
Mrs. Maloney. My time is up. I want to thank both of you. I
would also like to comment that Congress is very concerned
about moving forward with helping overseas residents vote, and
helping our men and women in the military vote, and that is
something that we'll possibly be looking at at a later time,
because as we go into more of a global economy, many of our
Americans are living overseas and they report they are having
difficulty voting. So that is another concern.
Anyway, thank you very much for coming and thank you for
all your hard work.
Mr. Skall. Thank you.
Ms. Davidson. Thank you.
Mr. Clay. Thank you, Representative Maloney, and that will
conclude the testimony for panel one.
Thank you, Ms. Davidson, and thank you, Mr. Skall, for your
testimony and you may be excused.
I would like to now invite our second panel of witnesses to
come forward, and then we will take a recess. This panel will
address voting systems from a variety of important perspectives.
Mr. Douglas Kellner, co-chair of the New York State Board
of Elections, an attorney at the law firm of Kellner Herlihy,
Getty and Friedman. Welcome.
Mr. David Wagner, professor of computer science at the
University of California at Berkeley. Thank you for making the
trip, sir.
Mr. Lawrence Norden of the Brennan Center for Justice at
New York University School of Law. Thank you for being here.
And Mr. John Washburn, software quality consultant and
member of the VoteTrustUSA Voting Technology Task Force.
And Mr. Mac J. Slingerlend, president and CEO of CIBER,
Inc., located in Denver, CO.
Gentlemen, welcome to all of you. In addition, I understand
that Mr. Slingerlend is accompanied by CIBER, Inc.'s vice
president for contracts, Mr. John Pope, and thank you for being
here.
It is the policy of the Committee on Oversight and
Government Reform to swear in all witnesses before they
testify. At this time I would like to ask all of the witnesses
to stand and raise your right hands. Mr. Pope, you intend to
speak on the record. I would like you to join the invited
witnesses in being sworn.
[Witnesses sworn.]
Mr. Clay. Thank you, and let the record reflect that all of
the witnesses answered in the affirmative. I will now ask all
of you to give an oral summary of your testimony and to keep
the summary under 5 minutes in duration.
Your complete written testimony will be included in the
hearing record, and Mr. Kellner, we will begin with you.
STATEMENTS OF DOUGLAS A. KELLNER, CO-CHAIR, NEW YORK STATE
BOARD OF ELECTIONS; DR. DAVID WAGNER, ASSOCIATE PROFESSOR,
COMPUTER SCIENCE DIVISION, UNIVERSITY OF CALIFORNIA, BERKELEY;
LAWRENCE NORDEN, BRENNAN CENTER FOR JUSTICE, NEW YORK
UNIVERSITY SCHOOL OF LAW; JOHN WASHBURN, VOTETRUSTUSA VOTING
TECHNOLOGY TASK FORCE; AND MAC J. SLINGERLEND, PRESIDENT AND
CEO, CIBER, INC., ACCOMPANIED BY JOHN POPE, VICE PRESIDENT FOR
CONTRACTS
STATEMENT OF DOUGLAS A. KELLNER
Mr. Kellner. Thank you, Congressman. I thank you for
calling us to testify today. I have read some of the statements
that you have made at prior hearings, and I am grateful,
because I believe that you do understand, very well, the issues
that we need to address in order to assure that we have
uniform, accurate, transparent, and verifiable elections. And I
also thank Congress Member Maloney who has also worked so hard
on this issue, and for her contribution on this, particularly
in shedding light on Sequoia Pacific earlier this year and the
fine work that she has been doing.
I believe that since it is clear to me that you understand
the fundamentals, I will skip that part of my testimony and go
directly to what we have done in New York.
The key thing is that we can have all these fine principles
about how elections should be done, and I endorse the
principles involved in the Voter Confidence and Increased
Accessibility Act of 2007, H.R. 811, which is sponsored by
Congressman Holt, because those are important principles to
assure that we have verifiable and transparent elections.
But I add the caveat that we have to pay careful attention
to the timetable for implementation of any new law, and that good
intentions alone do not make wise legislation. The timing
for implementation of new voting systems under HAVA was
fundamentally flawed by putting the cart before the horse. We
required States to replace their punch card and lever voting
machines before setting the standards for new voting systems.
And as we have heard in the testimony from NIST and from the
EAC, none of the systems that are in use today have been
certified to the 2005 standards that have been set by the
Election Assistance Commission, let alone the 2007 standards
which are still in development.
And what New York has found is that the system for
certifying under the 2002 standards, which were very weak and
very summary, itself was flawed, and that there is good reason
to question all of the 2002 certifications that were made by
NASED.
And specifically, what have we found on this? Well, I
pointed out that in the process of New York adopting its own
independent testing process, that we learned that ES&S, which
is one of the major suppliers of election systems throughout
the country, came to New York and said we want a waiver from
the 2005 standards with respect to source code, and the reason
you should give us that waiver is that there was no change in
that particular requirement from the 2002 standards and we got
certification from NASED under those standards. So why should
you make us comply now?
Well, that raised questions in my mind, and I went and
inquired, well, how is it that they didn't comply with the 2002
standard and still got certification?
The answer is nobody knows. In asking the NASED
officials who were in charge of the certification process, they
said, well, we got a report from CIBER that recommended
certification, and there was nothing in that report that
indicated that they were not in compliance with all of the
applicable standards.
And then we go back and, in fact, the States that purchased
this equipment were relying on the NASED certification, that
relied on CIBER, and CIBER never reported the fact that they
had not even tested for that particular requirement with
respect to the source code.
So that is one piece of evidence questioning the 2002
certification standards.
The second thing is that we had these reports that Congress
Member Maloney referred to before, where computer scientists at
Princeton showed how they could hack into the Diebold optical
scanning system. Computer scientists at the University of
Connecticut did it from a different approach and also showed
the vulnerability of the system.
The Maryland election authorities had also commissioned a
study that showed the security vulnerabilities. And these
reports show that, again, that Diebold scanning system was
certified to the 2002 standards even though none of the
security requirements in the 2002 standards had been tested,
again by CIBER, which did the independent testing report that
was given to NASED. NASED certified that Diebold scanning
system as well as other Diebold systems--the Diebold DREs share
the same types of flaws, as pointed out in these studies--and
they were certified to those 2002 standards, which themselves
were inadequate, even though there was no testing for those
particular requirements under those standards.
Now as Commissioner Davidson has indicated, the EAC does
not decertify equipment that was certified by the National
Association of State Election Directors. They only decertify
equipment that they themselves have certified.
So the bottom line is that most of the equipment that
is in use in this country now, has never been properly
certified, and the certification process that is in place now,
to the 2002 standards, is meaningless.
Now at this time, not a single voting system has been
certified to the 2005 standards and there is only one system,
at least according to the EAC Web site, that has even applied
for certification to the 2005 standards. The other five
applications are all to the old 2002 standards.
So we really do have a crisis, in the sense that the voting
equipment that is in use now does not meet current standards,
and if Congress is going to require States to upgrade their
voting equipment, and I certainly support that process, and I
support what Congressman Holt is trying to do in H.R. 811, we
have to first make sure, before we spend all this money, that
we're spending it for equipment that meets proper standards,
and that is what I would urge you to do.
In my written testimony, I have enumerated how the New York
law actually incorporates a lot of these principles that
Congressman Holt has in his bill. New York already
requires every voting system to produce a voter-verifiable
paper audit trail.
New York requires that there be an audit of the paper trail
of at least 3 percent of the voting machines in each county,
and authorizes the escalation of the audit to a greater number
of machines where errors or the closeness of the results
warrant.
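[Note.--A minimal sketch, in Python, of the kind of escalating audit
rule Mr. Kellner describes. The 3 percent starting point comes from
the testimony; the escalation thresholds, step sizes, and function
name are illustrative assumptions, not New York's actual regulation.
The sketch follows:]

    def machines_to_audit(total_machines, margin_pct, discrepancies_found,
                          base_rate=0.03):
        """Return how many machines to hand-audit in a county.

        Start at base_rate (3 percent). Escalate when the reported
        margin is close or when the audit so far has found mismatches.
        """
        rate = base_rate
        if margin_pct < 1.0:          # assumed threshold: very close race
            rate = max(rate, 0.10)
        if discrepancies_found > 0:   # assumed rule: any mismatch widens it
            rate = max(rate, 0.25)
        # Audit at least one machine, never more than all of them.
        return min(total_machines, max(1, round(total_machines * rate)))

    # Example: 200 machines, a 0.5 percent margin, 2 discrepancies so far.
    print(machines_to_audit(200, 0.5, 2))   # -> 50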
New York already prohibits any device or functionality
potentially capable of externally transmitting or receiving
data via the Internet or radio waves, and New York requires
that the manufacturer or vendor of each voting machine escrow a
complete copy of all programming, source coding and software.
New York is one of only two States that now has that
requirement, and North Carolina, the other State, is not
enforcing its requirement.
So New York will actually be the first to effectively
require at least the escrow of source coding.
New York has also adopted a number of other reforms in the
regulations that it has adopted, including being the only State
so far to require compliance with the 2005 voting system
guidelines.
New York requires every vendor to disclose all political
contributions. New York requires and provides for public access
to observe usability testing of the systems, and--OK.
Mr. Clay. Mr. Kellner, we will let you summarize.
Mr. Kellner. All right. I will wrap up, Congressman. So the
bottom line, to emphasize, is that there is no voting system
on the market today that complies with the current Federal
standards, that you can't rely on the adequacy of the old
certifications, and that Congress should keep that in mind as it
requires jurisdictions to upgrade their voting equipment.
[The prepared statement of Mr. Kellner follows:]
[GRAPHIC(S) NOT AVAILABLE IN TIFF FORMAT]
Mr. Clay. Thank you so much, Mr. Kellner. I would like to
remind the witnesses, let's attempt to keep it at the 5-minute
rule. Thank you.
Dr. Wagner, please.
STATEMENT OF DR. DAVID WAGNER
Dr. Wagner. Chairman Clay, Representative Maloney, thank
you for the opportunity to testify today.
In my research into electronic voting, I have come to the
conclusion that the Federal certification process is not
getting the job done. The testing labs, as we have already
heard today, are failing to weed out insecure and unreliable
voting systems.
The testing labs have approved systems that have lost
thousands of votes, they have approved systems that are
unreliable, they have approved systems with serious security
vulnerabilities.
For instance, in the past few years, independent security
researchers have discovered security vulnerabilities in voting
systems that are used throughout the country, vulnerabilities
that were not detected by State and Federal certification
processes.
In my own research, I too have found serious problems in
federally certified voting systems, systems that remain
certified and in use today.
The bottom line is election officials rely upon the Federal
certification process to ensure quality; but the process has
failed them.
Part of the problem is that the testing labs are not doing
as good a job as they could. But part of the problem is more
fundamental. Paperless voting machines are incredibly hard to
certify. When we use paperless voting machines, a single flaw
in the software can potentially cause undetectable errors in the
election outcome, and that places an impossible burden on
vendors and testing labs because it requires perfection.
A single overlooked defect can be enough to render the
whole system insecure, unreliable or inaccurate, and experience
has proven that it is easy for even the most capable experts to
overlook flaws and defects in software.
Given the complexity of modern election technology, it is
unreasonable to expect perfection from vendors or testing labs.
If the voting system is completely reliant upon software,
failures and security flaws are inevitable. Therefore, one of
the best ways to solve this problem may be to reduce our
reliance upon software.
Our election system must be software independent. It must
not rely upon the correct functioning of software. The good
news is that there are solutions to these problems. The most
effective solution today is to adopt voter-verified paper
records and perform routine audits of those records.
These audits provide a way to independently check whether
the software has counted the votes correctly. This would reduce
our reliance upon the software and, in my opinion, it would
make the shortcomings of the certification process less
critical.
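[Note.--A minimal sketch, in Python, of the independent check Dr.
Wagner describes: comparing a hand count of voter-verified paper
records against the electronic tally for the same precincts. The
precinct names and vote counts are made-up examples. The sketch
follows:]

    def audit_discrepancies(electronic, hand_count):
        """Return precincts whose paper hand count disagrees with the
        electronic tally, so the mismatch can be investigated."""
        flagged = {}
        for precinct, machine_totals in electronic.items():
            paper_totals = hand_count.get(precinct)
            if paper_totals is not None and paper_totals != machine_totals:
                flagged[precinct] = (machine_totals, paper_totals)
        return flagged

    electronic = {"Ward 1": {"A": 412, "B": 388},
                  "Ward 2": {"A": 305, "B": 301}}
    hand_count = {"Ward 1": {"A": 412, "B": 388},
                  "Ward 2": {"A": 301, "B": 305}}   # mismatch: investigate

    print(audit_discrepancies(electronic, hand_count))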
Audits are not perfect. Because they can detect problems
after the fact but cannot prevent them, we will need a
certification process that is capable of weeding out
problematic voting systems.
In my testimony, I discuss a number of steps we could take
to improve the certification process, including eliminating
conflicts of interest, increasing transparency and embracing
open-ended vulnerability testing.
In particular, I would like to draw your attention to a
conflict of interest in the testing process. Today, vendors
choose and pay the testing labs, and this creates a perverse
incentive for the labs to place the vendors' interests above
the public interest.
One potential solution would be for Congress to act to give
the EAC the authority it would need to collect fees from
vendors, so that EAC can choose and hire testing labs itself.
As I mentioned, the good news is that solutions are
available; however, the bad news is that only a minority of
States have adopted these solutions. My understanding is that
27 States use voter-verified paper records throughout the
State, but only 13 of them audit those records.
Adopting voter-verified paper records and routine audits
more widely would reduce the pressure on our certification
process and would provide greater transparency and confidence
for voters. I believe it is the single most effective thing we
could do to improve the reliability and security and
trustworthiness of e-voting. Thank you.
[The prepared statement of Dr. Wagner follows:]
[GRAPHIC(S) NOT AVAILABLE IN TIFF FORMAT]
Mr. Clay. Thank you so much, Doctor.
Mr. Norden, please proceed.
STATEMENT OF LAWRENCE NORDEN
Mr. Norden. Thank you, Chairman Clay, and Congresswoman
Maloney, for holding this hearing on what is certainly an
extremely important topic.
For 18 months, I chaired the Brennan Center's Task Force on
Voting System Security, and that was a task force made up of
the leading computer scientists and security professionals in
both the private and public sector in the United States.
It included David Wagner as well as scientists from NIST,
the former chief security officer from Microsoft, and the
former cyber security czar for President George W. Bush.
What the task force found is no longer, I think, a matter
of debate among security experts that have looked at these
voting machines, and that is that they have serious security
and reliability vulnerabilities.
As David Wagner mentioned, the good news is that there is
substantial agreement among these experts, about what we can do
to address these vulnerabilities, and among the most important
things we can do is to ensure that we have an independent
voter-verified record such as a paper ballot or paper trail,
and that after the polls have closed, we use those paper
records to check the electronic tallies.
These steps are certainly important, given the problems
that we are aware of with the machines today and their
certification. But I would echo what David Wagner said, and say
that these steps are important, no matter how well we do the
certification process or accredit labs.
That is not to say that certification or accreditation
isn't extremely important. We want to catch flaws before the
elections, before the systems are certified, obviously, and to
maximize the chance that we catch those flaws, we have to fix
what is a broken certification and accreditation process.
That process, I should say, is in transition right now, as
we have heard today, and I think there is good reason to
believe that it is being substantially improved. Still, there
are certain things that need to be done. I detail a number of
them in my written testimony. I am just going to talk about a
few in the remaining time that I have.
I would say one of the most important things we can do is
something that Congresswoman Maloney touched upon and David
Wagner touched upon, and that is to eliminate the process where
vendors choose and pay the labs that judge and certify them.
For obvious reasons, this is a conflict of interest and creates
perverse incentives for vendors to certify machines where they
are relying on--excuse me--for testing authorities to certify
machines. They are relying on those same vendors for future
business.
I should add that Congressman Holt's bill, H.R. 811, does
end this system along the lines of what David Wagner suggested.
The second thing we can do is add an important step to testing
machines, and this has also come up a little bit in some of the
testimony we have heard today.
Right now, what we do is we test to guidelines. We test
under normal conditions to satisfy a check list. This is
certainly important to do but good security testing, as
Congresswoman Maloney touched upon, will try to ensure that a
system does not fail when it is attacked or misused.
There are a couple of things we can do. One of the things
that we can do is what Mr. Skall suggested, which is to have
independent security experts perform open-ended research and
search for vulnerabilities on these machines to exploit.
This is how many of the most serious flaws in voting
machines have been discovered. Unfortunately, because it wasn't
part of a certification process, this isn't something we
discovered until after the machines were in use.
Something else we can do is require vendors to demonstrate
how they will defeat a standard set of threats that could be
developed by an organization like NIST.
We should also make sure that the process for certifying
machines, for evaluating machines, excuse me, does not end with
certification.
The EAC is now accepting anomaly reports from election
officials and that is a good step. Unfortunately, it is not
accepting such reports from voters or from technical experts that
are performing field studies on these systems.
And I would say that is a problem, for a number of reasons,
not least of which is that voters themselves, and technical
experts, are often going to be in a better position than
election officials to know if the machines aren't working when
they are voting on them.
We should use their reports to investigate machines, to
amend guidelines and to require machine changes, where
necessary.
Finally, one thing I would urge Congress to do is to make sure
that we fund the EAC and the certification process adequately.
The EAC is charged with some of the most important
administrative tasks in Federal elections. If we are going to
keep them in charge of those tasks, it is important that we
give them enough funds and enough employees to do them.
In 2006, the EAC had a budget of just $15 million and less
than 30 employees, and that is simply not enough, given the
responsibilities that they have.
Thank you.
[Note.--The Brennan Center Task Force on Voting System
Security publication entitled, ``The Machinery of Democracy:
Protecting Elections in an Electronic World,'' may be found in
subcommittee files.]
[The prepared statement of Mr. Norden follows:]
[GRAPHIC(S) NOT AVAILABLE IN TIFF FORMAT]
Mr. Clay. Thank you so much, Mr. Norden.
Mr. Washburn, please proceed.
STATEMENT OF JOHN WASHBURN
Mr. Washburn. Thank you, Chairman Clay, and Mrs. Maloney,
Congresswoman Maloney, for having this hearing and for giving
me this opportunity to present testimony to you on testing and
certification of voting systems.
I have worked in the field of software quality assurance
since 1994, and for the 10 years prior to that, I was a
commercial programmer developing commercial software.
It is important to consider both past testing done under
NASED and the present testing process of the EAC, for two
reasons. First, as has been mentioned, all the equipment
currently in use has been tested under the former NASED
process, and most of this equipment will be used again in the
subsequent years, in this year, and 2008.
Second, while the new EAC program has made some steps toward
greater transparency and oversight, it retains some of the
systemic flaws of the NASED program. The NASED and EAC testing
and certification frameworks suffer from three systemic flaws.
Both systems are opaque to most primary stakeholders in the
election process. These stakeholders are State election
officials, local election officials, candidates for public
office, and most importantly, the voters themselves, and due to
the lack of transparency and accountability, neither system
adequately assures the public that rigorous, thorough and
effective testing has actually been done, and neither system
permits or encourages the reporting of system defects, nor do
they include a responsive corrective action plan.
Under the NASED system, the entire process was a private
sector transaction between the manufacturer and the testing
laboratory, shielded from public oversight by vigorously
enforced nondisclosure agreements.
The reports of test results as well as documentation of the
testing undertaken to confirm a voting system's compliance with
standards are considered the property of the manufacturer of
that system. It is extremely rare for citizens to gain access
to these reports.
For jurisdictions without their own State level testing
programs, all that is available is a list of systems which have
been granted a certification number, and the assurance that
NASED has ruled that the certified system is in conformance
with the standards.
Without test plans, and results of the test executions,
there is no evidence, there is just an appeal to authority, and
with the reports from the New York Board of Elections and the
nonconformances revealed in penetration analysis and academic
reviews, this authority has been called into question.
Over the last several years, numerous security and design
defects have been uncovered, and each of these discoveries has
left unanswered the simple question: How did these noncompliant
systems ever get certified?
For example, use of a programming technique called
interpreted code is prohibited by both the 1990 and 2002
standards, yet it is in use in the Diebold systems.
The vote tabulation software found in ES&S equipment varies
from machine to machine and from election to election and from
jurisdiction to jurisdiction.
For each election, a new and unique version of the vote
tabulation software is created. If the software changes from
election to election and jurisdiction to jurisdiction, how can
there be any version that is the certified version? The central
election management system for Sequoia, which accumulates vote
totals on election night, includes both source code and the
compiler for that source code.
The source code and compiler combination make it easy to
change the operation of this software ``on the fly,'' and in
the field. This is a violation of both the 2002 and 2005
standards.
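[Note.--Mr. Washburn asks how there can be a certified version if the
fielded software changes from election to election. One common
safeguard is to compare a cryptographic hash of the fielded software
against the digest recorded at certification or escrow time. The
Python sketch below shows that comparison only; the file path and the
escrowed digest are hypothetical. The sketch follows:]

    import hashlib

    def file_digest(path):
        """Return the SHA-256 hex digest of a file, read in chunks."""
        h = hashlib.sha256()
        with open(path, "rb") as f:
            for chunk in iter(lambda: f.read(65536), b""):
                h.update(chunk)
        return h.hexdigest()

    def matches_escrow(fielded_path, escrowed_digest):
        """True only if the fielded binary is bit-for-bit what was escrowed."""
        return file_digest(fielded_path) == escrowed_digest

    # Hypothetical usage:
    # matches_escrow("/opt/ems/tabulator.bin", "9f2c...certified digest")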
These examples of nonconformance, though, went undetected
for multiple rounds of testing over several years. So it is not
just a one-time miss here.
The profound and real world consequences of not following
these standards, even as weak as they are, are found at the one
hour and 9 minute mark of the documentary, Hacking Democracy,
which I have included with my testimony. In this realistic
simulation of an election, the outcome of the mock election was
altered in spite of the election official following all of the
correct administrative procedures.
This manipulation was only possible because that system did
not follow the standards. The NASED testing framework provided
no mechanism to report problems and no way to receive
suggestions for improvement. The EAC has created a new--for
example, I think some of the Sequoia systems don't have
sufficient accessibility for the ADA. That is my opinion; but
who am I going to tell that to?
The EAC has created a new program called the Quality
Monitoring Program. The Quality Monitoring Program, though,
limits itself to fielded systems. As Commissioner Davidson had
pointed out, a fielded system is defined as a system which is
certified by the EAC and used in a Federal election.
Since the EAC has not yet certified any systems, there are
no fielded systems. The Quality Monitoring Program also records
only anomalies, but the definition of anomaly in this section
is exceptionally narrow and permits the dismissal of any report
on the basis the report is due to administrative error or a
procedural defect.
So, for example, a programming error in Pottawattamie
County, IA, caused the election system to incorrectly tally the
results of the June 6, 2006 primary election. This error,
though, does not meet the EAC's definition of an anomaly,
because the preelection testing done by the county auditor was
insufficient and thus is a procedural deficiency.
The failure of the system to correctly tally votes is
not considered an anomaly by this definition, and further, only
credible reports will be published and distributed to other
election officials. Information in a credible report must first
meet this narrow definition of anomaly, second, must only come
from an election official, and third, the events included in
the report have to have occurred during an election.
If an election official discovers a defect in a voting
system during preelection testing, or during other testing, or
were to undertake an independent review, the results would not
be shared with other election officials.
The Quality Monitoring Program fails to meet the mandate
laid upon the EAC in section 202, to be a clearinghouse of
information on all voting systems, not just those systems which
meet the limited definitions of fielded, anomaly and credible
reports.
There is not much time before the 2008 Presidential
election, and because of the short time, the EAC should use its
authority already granted to the commission under section 242,
to set up a second parallel testing framework. A suggestion for
that is in my written testimony.
So, in conclusion, the NASED testing framework is opaque to
every stakeholder in the elections, except, it seems, the
election manufacturers. It gives the illusion of rigorous
testing without the substance and resists reports of problems
and resists suggestions for improvement.
The new EAC testing framework has these same deep flaws. In
the meantime, an alternate framework needs to be created, which
is more nimble, more effective and more efficient than either
the NASED or EAC framework.
I would like to add, as a software test professional, that
it offends me that the activities over the last several years
have been allowed to be called software testing.
[Note.--The U.S. Election Assistance Commission publication
entitled, ``Testing and Certification Program Manual,'' may be
found in subcommittee files.]
[The prepared statement of Mr. Washburn follows:]
[GRAPHIC(S) NOT AVAILABLE IN TIFF FORMAT]
Mr. Clay. Thank you.
STATEMENT OF MAC J. SLINGERLEND
Mr. Slingerlend. I will loan a couple of my minutes to a
couple colleagues that used a couple extra minutes, so we can
stay on track here. We realize that we didn't predeliver a
standard written statement, and thank you, Mr. Chairman and
Mrs. Maloney, for having us here today.
This was not to offend or otherwise indicate a lack of
cooperation on CIBER's part. A letter by the committee was sent
to us 10 days ago, faxed last Saturday, handed to me last
Monday afternoon, but for me, last week was a board of
directors meeting and a shareholders meeting, so as soon as
those were over, I began to work on this activity.
That said, I contacted Tony Haywood and discussed today's
hearing, changed my schedule and that of John Pope, who is to the
left of me. I spent the weekend preparing and getting further
updated on what has been going on in this activity of our
company, so I could be here.
Ladies and gentlemen, we have nothing to hide. We are a 33-
year-old New York Stock Exchange billion dollar IT services
company with 8,000 people in 18 countries and a 96 percent
customer satisfaction rating.
The business we are here to discuss represents about one-
quarter of 1 percent of what we do. That said, we take all of
our business seriously. I am, and have been, at least generally
familiar with the questions asked of us in the chairman's
letter to be here today. I cannot say I know every detail of
any one project but I have prepared and believe I can speak
with you today about the matters you are asking.
With respect to the New York Board of Elections, and Mr.
Kellner in particular--and I have read his criticisms, in
part, of us, or of one of our counsel--we have nothing except good
things to say about the State of New York's activity with
respect to electronic voting.
They have taken their responsibility seriously. They picked
a good company to do the work for them and they have been
victims, I believe as have we, of circumstances primarily
beyond our control since some time in 2006 in particular.
We have done good work for them and it is currently on
hold. In our opinion, we should either finish the work or
perhaps be paid and asked to go away, but in any case, we are
happy to do either, as directed.
With respect to the EAC, this is a more complicated
situation. The EAC, like us and our customer, has been caught
in the middle of changing responsibilities, changing
technology, changing test procedures, likely a lack of
sufficient funding for the EAC, and changing testers.
Specifically, we have dealt with moving targets, slow turn-
around times on assessments, and a general lack of sufficient
direct EAC resources, such that they have to rely on others,
and then part-time others, nondirect, and inexperienced
auditing, in part, to help them with their systems and their
accreditation.
In conclusion, some of the tabloids have been accurate;
some not. I think some of the statements Mrs. Maloney made this
morning weren't exactly--I would say accurate, from the
standpoint that you were led in the wrong direction, not that I
would criticize anything you had to say, but relying on some
statements that weren't accurate. Therefore, your questions
came from that standpoint.
It appears that there are multiple agendas that our
customer, the New York State Board of Elections, and we, are
affected by, and perhaps this meeting this morning will push
these to resolution.
Thank you for having us here today.
Mr. Clay. Thank you so much, Mr. Slingerlend, and Mr. Pope,
for being here. We appreciate your accommodating the committee.
Let me go to the 5-minute questioning now and I will start with
Mr. Kellner, and let me first thank the entire panel of
witnesses for the expert testimony that you have just provided.
Mr. Kellner, in light of CIBER's inability to earn interim
accreditation from the EAC last year, what are the major issues
New York is currently facing in using the nationally accredited
Voting System Test Labs for the upcoming election cycle?
What are the timelines that are necessary to adequately
address the EAC's accreditation process in order to ensure a
smooth election cycle for 2008?
Mr. Kellner. Congressman, the New York State Board of
Elections has issued an RFP to accredited laboratories and the
deadline for response to that is next week or so, and we will
be very shortly then evaluating our options on restarting the
testing process as soon as possible.
We would hope that within the next couple of months, we
would be able to restart the testing process.
Now hopefully, the vendors have used this time delay of the
testing process to get their equipment up to snuff, so that
when the testing process resumes, the equipment will pass, and
if that happens, then we expect that we would be able to
certify to the county boards of elections acceptable voting
systems by this December, and that would be in sufficient time
for them to acquire new voting systems for the 2008 primary in
September and the general election in November 2008.
Mr. Clay. Pardon my ignorance. Is New York involved in a
February 5, 2008----
Mr. Kellner. That is correct, Congressman.
Mr. Clay. OK. So they will not be ready for----
Mr. Kellner. That is correct; not for February.
Mr. Clay. OK. One topic that I believe does not get enough
attention in the larger debate over system integrity and
security is the topic of information sharing about system
flaws.
As the national clearinghouse for election information,
what role should the EAC play in developing stronger mechanisms
for sharing information among election officials about system
flaws that are identified by officials or industry
stakeholders?
And anyone on the panel can attempt to answer that.
To follow up, should the EAC work together and disseminate
information about flaws not found through its prescribed
national certification process, including NASED
qualification, for upcoming elections?
Yes, Mr. Washburn. You may start.
Mr. Washburn. My customary experience with software
testing, when you are reporting and recording defects, is to
record everything and then categorize later. That is why I am
particularly disturbed with the gatekeeping functions on the
definition of anomaly.
So I guess my opinion would be that the EAC should take
a report of everything, from everyone, and vet those out, and
then categorize them as credible, not credible, after the fact,
because many times, it's in the pattern of the minutia, in the
pattern of the many reports, that you actually see something--
ah, there is a recurring issue here in some administrative--you
know, even though it may be an administrative error, it is one
that everyone's having.
So the general custom in software testing is to record
everything immediately and then categorize, prioritize and
essentially cite its significance later.
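[Note.--A minimal sketch, in Python, of the intake practice Mr.
Washburn describes: record every report first, and categorize and
prioritize afterward. The fields and categories below are generic
software-quality conventions assumed for illustration, not the EAC's
actual Quality Monitoring Program schema. The sketch follows:]

    from dataclasses import dataclass, field
    from datetime import datetime

    @dataclass
    class Report:
        source: str                 # election official, voter, researcher
        system: str                 # which voting system is involved
        description: str
        received: datetime = field(default_factory=datetime.utcnow)
        category: str = "uncategorized"   # assigned later, not at intake
        severity: str = "unrated"         # assigned later, not at intake

    class ReportLog:
        def __init__(self):
            self.reports = []

        def record(self, source, system, description):
            """Accept every report unconditionally; no gatekeeping here."""
            self.reports.append(Report(source, system, description))

        def triage(self, index, category, severity):
            """Categorize and prioritize after patterns emerge."""
            self.reports[index].category = category
            self.reports[index].severity = severity

    log = ReportLog()
    log.record("voter", "ExampleVote 2.1", "summary screen dropped a choice")
    log.triage(0, "usability", "medium")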
Mr. Clay. Do you think the response time is quick enough?
Is it timely, to flaws and problems?
Mr. Washburn. We are under a very short timeframe for the
2008 election cycle. I am not sure, even if they started
setting up a very high end, you know, defect reporting system
like ClearCase, you know, tomorrow, I doubt that the
responses--it would be better, but I don't know if it would be
enough to correct the systems. But it would at least allow the
local election officials to know what problems to watch for and
perhaps adopt local procedures to help avoid them and mitigate
them.
Mr. Clay. Anyone else? Mr. Norden.
Mr. Norden. Chairman Clay, I just wanted to add a couple of
things. Certainly, there should be some process for all
systems, including NASED-certified systems, to get reports from
election officials, and I would add, as I said before, also
from voters who are voting on these machines and are actually
using them on election day, about things that go wrong with the
system.
Another thing I would add is that as I understand it right
now, if election officials file a report with the EAC and that
report is deemed credible, there is no way for the election
official to have that complaint made anonymous, and that seems
to me to be a problem, for a couple of reasons.
No. 1, the election official that may be filing the
complaint is often the one who bought the system. So they might
have an incentive for not wanting that to be attributed to
them. They are also reliant on the vendors for technical
assistance in the future, and we have instances in the past,
where there has been retribution against election officials for
making complaints, or showing the vulnerabilities in voting
systems.
So I would say three critical things would be providing
some way for there to be anonymous publication of these
complaints from election officials if they request it,
including voters in the complaints that are taken, and making
sure that there is a clearinghouse for all systems, not just
the ones that have been, or are going to be, certified in the
near future.
Mr. Clay. That is a great point. In Congress, we also deal
with that same issue when it comes to HAVA, from the original
authors who don't want any alteration of HAVA, but we know it
is much overdue and needed.
Let me ask Dr. Wagner, many people compare computerized
voting machines to bank ATM machines. They argue that these
bank machines are perfectly safe and accepted by the public.
Therefore, we should have the same confidence in
computerized voting machines. Are these voting machines
constructed with the same security as bank machines and is the
physical security of voting machines the same? What are the
differences in the security and reliability standards and would
using such security standards enable us to better test and
evaluate e-voting systems?
Dr. Wagner. Thank you. First, I would say that our voting
systems are not up to the standards in the financial system
that we are using to protect our bank ATMs.
Second, I would say that the voting problem is a much more
challenging problem than the problem of securing bank ATMs
because of the secret ballot. If we didn't have a secret
ballot, we might be able to apply some of the techniques from the
financial world, which include associating names with
transactions, keeping multiple paper trails, and auditing and
cross-checking them.
But because of the requirement for a secret ballot, we are
much more constrained in the voting world by what kinds of
audit logs we can keep, so it is much more challenging to
provide the necessary level of security in the voting world.
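[Note.--A minimal sketch, in Python, of the constraint Dr. Wagner
describes: a bank's audit log can tie each transaction to an account,
but an election record must not tie a ballot to a voter, which rules
out much of the cross-checking banks rely on. The record layouts
below are illustrative assumptions only. The sketch follows:]

    import secrets
    from datetime import datetime

    def atm_log_entry(account_id, amount):
        """Bank-style record: identity is the point of the audit trail."""
        return {"when": datetime.utcnow().isoformat(),
                "account": account_id, "amount": amount}

    def ballot_log_entry(selections):
        """Election-style record: no voter identity, no precise timestamp,
        and a random token so entries cannot be linked to individuals."""
        return {"token": secrets.token_hex(8), "selections": selections}

    print(atm_log_entry("ACCT-1042", -60.00))
    print(ballot_log_entry({"Mayor": "Candidate A"}))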
Mr. Clay. Thank you so much for that response.
Representative Maloney, your turn.
Mrs. Maloney. Thank you, and I really thank all of the
panelists. I particularly would like to thank Mr. Kellner and
Mr. Norden who are from the district and communities that I
have had the honor to represent, and they have been
longstanding advocates for voter reform, machine reform,
honesty in voting, and I congratulate all of your efforts.
I am just more familiar with theirs since they are from my city.
Mr. Slingerlend, I understand that you have already
responded to many of the concerns raised in the initial EAC
assessment review from last July. However, the EAC review and
the NYSTEC review commissioned by the New York Board of
Elections, described the state of your testing methods and
procedures that prevailed during the period in which you were
testing most of the voting system software in use across the
country. These independent reviews suggest that CIBER is unable
to adequately document the testing undertaken to establish the
conformance of voting systems to Federal standards.
Are you able to document the test plans, methods and
results of testing performed under the NASED/ITA program?
Mr. Slingerlend. Thank you. I think the answer to that
question is yes. If I may, in kind of a broader sense, say how
we got to where we were, and my comment, by the way, on one of
your earlier comments that we have certified machines: we have
never certified machines. And unfortunately for Commissioner
Clay, it says we regained accreditation from the EAC; well, we
have never had accreditation from the EAC, so we don't regain
that either.
But in some respects, we have been involved in this
business for a decade. We have been involved in the business
under the NASED leadership and it was completely voluntary
because I think the Federal Government just did not adequately
approach this subject, historically, and consequently the
States found it necessary to take it on themselves, although
there were a few Federal standards that they were identified
with.
I have talked to myself, if you will, about this, over the
weekend, saying that, you know, we were lulled to sleep by the
process, which wasn't our fault. The fact that we slept
probably was our fault. I think the individual, in particular,
that was leading this effort for us, was like a cook that
doesn't have recipes. He knew the systems very well. He knew
the vendors very well. He knew everything very well. He behaved
in pretty much the same manner for the last 5 or 10 years, as
far as how he was testing machines, and going through his
procedures.
But the documentations of his efforts were not what you or
I would call ``buttoned up,'' to a standard that would be
acceptable, and when the EAC came around last summer with
respect to testing to a standard, it was a new standard, hadn't
been used previously, which was OK. I would say that we weren't
documenting things, that we were physically doing. Nobody has
ever questioned the quality of our work, or the fact that we
have tested things, or attested to things accurately.
The documentation of that fact, though, is not as
good as it should have been. We spent the summer, probably
early fall, after we were told about this, getting things, if
you will, buttoned up, perhaps not completely but substantially
better. The EAC came back--and I am feeling like I am running
out on my answer but there is a timeline here. The EAC came
back in early December and asked to review what progress we had
made, and said you guys have made tremendous progress, but now
we also need you to meet the 2005 standards. So the people that
were certified by the EAC, last summer, weren't asked to meet
the 2005 standards, and we have buttoned ourselves up for 2002.
We were then told we had to be--2005. Then it was February
before we got another response. We turned back in a--and were asked
by the EAC to respond by March 5th. We further responded on
February 26th, which is--you can, you know, take the months
now, but it is 2\1/2\ months, or whatever that might be. We
still have not heard back on the status of that submission.
So, you know, we feel for the State of New York. You might
even say we feel for ourselves. But I do believe at this point,
we are fully capable of meeting the 2002 standards as the other
currently accredited companies are doing, or have been
accredited to.
Mrs. Maloney. OK. I would like to submit a formal request
on behalf of the subcommittee for documentation related to the
testing by CIBER of NASED-qualified systems, and it is a
documentation request for each of the systems listed before. If
you would produce the following set of records.
Mr. Clay. Without objection.
Mrs. Maloney. I would like to submit it to you, and to the
record. Thank you.
[The information referred to follows:]
[GRAPHIC(S) NOT AVAILABLE IN TIFF FORMAT]
Mrs. Maloney. Mr. Slingerlend, is there any reason why the
testing process and test reports should be done in secret?
Mr. Slingerlend. I have listened to some of the comments
about the--I will probably say no to the question, with the
exception that it is a very iterative process, and one can
draw conclusions. It is a little bit like Donetta Davidson was
saying earlier, that you are not always sure the information
that you are getting is accurate, so you are not quite sure you
want to publish it, until you have the ability, yourself, to
verify whether it is accurate.
And for Mr. Washburn and I--and he and I obviously don't
know each other--but I am sure that he has been through lots of
testings of software, over time, just based on his testimony,
and it is an iterative process.
What we have found, and what has been explained to me about
what we have done with the vendors in the past, they may give
us something, we say, well, that doesn't meet Federal
guidelines. And so you go back and forth, and back and forth.
You may do it 50 times.
I don't know that it is healthy, or wise, or necessary, to
indicate the status of what is sort of an iterative process between a
vendor and a testing lab, whether it is NIST, whether it is
ourselves, etc. And by the way, we have no problem with the
concept that any vendor money would go to NIST or EAC, and then
they would select people to do testing. That means nothing to
us.
Mrs. Maloney. But once you have tested and sent the results
to EAC, why shouldn't the public be able to verify that the
testing, see what the testing was, to see if it was done
properly or not? Why keep that secret? When you are in a ``give
and take,'' I can understand. But once you have made a decision
and relayed it to EAC, why not have that open to the public, as
the prior two panelists said, should be open to the public?
Mr. Slingerlend. I think that sounds fine with us. I mean,
I think from our standpoint, we have never certified that any
machine works. We have attested to the fact that it has met
Federal guidelines. The fact that we say something meets
Federal guidelines, we have no problem with that information
being public, ourselves.
Mrs. Maloney. OK. Did CIBER serve as the independent
testing authority for the ES&S Unity System that was certified
by the National Association of State Election Directors in 2003
and 2004?
Mr. Pope. Yes, ma'am. I believe that is correct.
Mrs. Maloney. OK. Did CIBER do a review, at that time, to
determine if the source code used in the ES&S Unity System
complied with the 2002 voting system standards?
Mr. Pope. I am not the technical expert on that. We would
have to ask our technical folks about that.
Mrs. Maloney. But you were reviewing and testing to see if
they met 2002 standards; right?
Mr. Pope. Yes.
Mrs. Maloney. But you can't say whether or not you tested
to see whether they met 2002 voting system standards?
Mr. Pope. I believe that is a correct statement but I would
like to have the chance to verify that.
Mrs. Maloney. Well, could you verify and get back to the
committee on whether or not you tested to see if they met the
2002 standards?
Mr. Pope. Yes, ma'am.
Mrs. Maloney. Now you testified that you believe they did
since it was certified in 2003 and 2004. So my question is
really, how does CIBER explain the ES&S request to the New York
State Board of Elections for a waiver of these standards? So
when they came to New York, they asked for a waiver of the 2002
voting system standards.
Mr. Pope. That issue is between ES&S and the State of New
York, not between us and ES&S.
Mrs. Maloney. Well, were there other standards in the 2002
voting system standards, that CIBER did not test? We are
talking about testing--70 percent of the voting machines out
there now were tested by CIBER. Now, because of the GAO report,
and it is not my words, I was quoting from the GAO report, the
GAO report said that they were not done properly. We just
heard, from the prior two panelists from the Election
Commission, that they are not going to have to recertify all of
those voting machines to the standards.
So I want to know, are there standards in the 2002 voting
system standards that CIBER did not test?
Now you testified earlier that you are working now to get
up to the 2005 standards. But were there some standards that
you eliminated, or did not test in the 2002 voting system
standards?
Mr. Slingerlend. Ma'am, I don't think we have ever--first
of all, I do believe we tested everything with respect to 2002.
Nobody has ever indicated that we haven't tested everything
with 2002. The issue has been with the documentation with
respect to the testing, not the fact that testing wasn't done,
or that the systems didn't work to Federal standards.
Mrs. Maloney. OK. Then if I could have an additional minute
for one question, Mr. Chairman.
Mr. Clay. Please proceed.
Mrs. Maloney. What individual, or individuals, are
responsible for carrying out and supervising the testing of
voting systems at CIBER?
Mr. Slingerlend. Historically, that responsibility has
fallen, in Huntsville, AL, to an individual named Sean Southworth.
Mrs. Maloney. Prior to serving in this capacity, what were
Mr. Southworth's qualifications and how was he chosen for this
role?
Mr. Slingerlend. Ma'am, I can't tell you that. I can tell
you that he has been doing it for approximately 10 years. We
made an acquisition in October 2001, and this was a small
portion of that company, and it was an ongoing activity of that
company. It wasn't the target of the acquisition but was an
ongoing activity of the company at the time. They had been
doing it for several years, are very familiar with NASED, the
people involved in NASED, and continue to do the work they had
been doing prior to the acquisition, after the acquisition.
Mrs. Maloney. Could you please provide the subcommittee
with Mr. Southworth's biography, resume, and documents attesting to
his qualifications to perform voting system testing?
Mr. Slingerlend. Sure.
Mr. Clay. Thank you very much, Representative.
Mr. Slingerlend, first, could you please characterize for
us the meaning of the term ``confidential, competition
sensitive.'' Does this mean these documents have trade secrets
or proprietary information? Why was there not adequate
justification made to the board for these designations?
Mr. Slingerlend. My understanding, in part, with respect to
the software work that we perform, is that we believe the way we
performed the work was unique to ourselves and, consequently,
as a business competing with other businesses that also do
testing, you don't like to release those testing procedures to
other companies in particular.
I think the whole activity that--now that EAC is here, now
that NIST is here, I think that whole program can change. I
don't have any particular reason, other than just that we didn't
find it necessary to disclose how, if you will, Sean
Southworth was doing his work to our competitors.
Mr. Clay. Now according to the New York State Board of
Elections, CIBER had been submitting reports to the board, that
were paid for with New York State funds, but were somehow
restricted from public disclosure.
It seems to me as though CIBER was looking to prevent
public scrutiny of its work.
Mr. Slingerlend. Yes. I don't think there was any intent to
do that. I do believe that Mr. Kellner talked to one of our
attorneys, but Mr. Kellner, I did not verify that. I am happy
for you to comment on this, and I believe that the
discussion between Mr. Kellner and our attorney was such, that
we removed the confidential labeling of the documents and they
were made public. If that is not the case--I don't know that is
not the case.
Mr. Clay. OK. Well, we will let Mr. Kellner respond. Go
ahead.
Mr. Kellner. Mr. Chairman, I think the problem is that the
habit of CIBER was to keep everything secret and confidential,
and New York's process has been to keep everything open to the
public, and CIBER really wasn't prepared to deal with that, and
I was not satisfied with the way my requests were handled in
terms of telling them, look, you have marked all this stuff
confidential, I want to release it.
And we had a report that had been very carefully negotiated
between New York's independent technical experts, the New York
State Technology Enterprise Corp., and CIBER, on the extent of
the COTS exemptions for source code testing, and CIBER insisted
that agreement that they had be marked confidential, and then
the lawyer at CIBER, when I protested this, rewrote the report,
not the experts but the lawyer rewrote the report, and then
said, here you can release this version that I've cleaned up.
And I really thought that was an inappropriate way to deal
with an expert report, and of course the New York State board
then, following the complicated legal procedures in our State
law, disclosed the report, but only after we went through the
formal procedures to determine that CIBER had no right to claim
confidentiality for the agreed report.
Mr. Clay. Thank you. Thank you for that response.
Mr. Washburn, any commentary or thoughts about the
testimony?
Mr. Washburn. It is my amateur legal opinion, but I don't
think trade secrets apply in voting systems for the test
procedures, because that is the evidence that it does conform.
You are talking about public moneys spent for the, you know,
spent by public officials to administer public elections, for
candidates to public office. What part of that is private?
And so I don't think half of the trade secret definition is
met, because part of the definition of a trade secret is that it
be subject to reasonable efforts to keep it secret, and it is
unreasonable to keep secrets here.
Mr. Clay. Thank you for that response.
Mr. Slingerlend, I picked up on something that you said,
that I am really concerned about, when you say that there were
first 2002 standards and now there are 2005 standards, like
this, and it seems to me like there is a moving ball or a
moving target that the industry has to keep up with.
But what I find to be so disconcerting is that, you know,
we are talking about the public's voting rights, the integrity
of elections, making sure that we get it right once, the first
time, making sure that people's votes are accurately tallied,
that they are actually counted.
I mean, is this a process that we will never be able to
satisfy? Or can we get this right?
Mr. Slingerlend. Sir, I believe it certainly can be done.
If I took off my CIBER hat for a second and I just put on my
American hat----
Mr. Clay. Put on your American cap.
Mr. Slingerlend. I do think that when you look back, then,
at how this was done over time--and you should give credit to
Ms. Davidson and the other people of NASED, who took their time,
unpaid, etc., to work on having these machines certified to
some level of Federal standard over the last decade--I think
this has just been, you know, the minister's kids without
shoes. You know, it is just basically a system that has been
neglected, in an official sense, relative to how it should have
been done, over time.
I don't know that the two thousand and--you know, we were
certified to the 1997 standards, the 2002 standards were better
but certainly not adequate, we are sitting here today being
told the 2005 standards are better, but by July 2007 there are
going to be even better ones.
And when we were asked--we had never been asked to behave in
a certain manner; as I said, we were kind of lulled to sleep,
which is not our fault, but the fact that we slept is--when we
were asked last July to go through a testing process that our
guys hadn't done before, they weren't behaving in a manner that
would qualify as ``our fault,'' but that doesn't necessarily
mean that they hadn't been, you know, basically steered in that
direction.
When we came back for retesting, it was yet a different set
of rules; after submitting first answers, there was then a
different set of rules; and now it has been from February 26th
to May 6th or 7th, and we haven't heard about our last response,
because the EAC really hasn't had the funding or the full-time
auditors, and NIST isn't quite on the ground--and that is not
meant to criticize NIST.
I think you have an evolving process here that is going to
be much better, very quickly. But it has been not a great
process over the last couple of years or the last few to
several years.
Mr. Clay. Thank you for that response.
Mr. Washburn, can you identify specific examples of e-
voting systems that had previously been certified by the former
NASED program even though they were not compliant with the
appropriate standards? If so, can you offer examples of the
types of problems with each system, and are any of these
systems still being used by local election boards?
Mr. Washburn. Well, all of the ones I gave, that I cited in
my oral statement and also in my written statement, are currently
in use. So the use of interpreted code is prohibited by section
5.3 of the 1990 standards, it is prohibited by section 4.2.2 of
the 2002 standards, and there were, I believe, 11 systems that
have that property, that were tested over the course of about 4
years. I could get you the actual numbers, if you would like,
of the systems involved.
Similarly, because of an open records request in
California, it was discovered that one of the members of the
technical subcommittee of the NASED voting systems board stated
that the ES&S scanners have a unique executable for every
election, and there is no single version of the firmware. It
changes from election to election, because it incorporates the
election information as a commingled, integral part. You cannot
separate the ballot definition from the scanner firmware. So it
is always different.
And similarly, the Sequoia system, Win EDS, which is still
in use in a number of jurisdictions, has source code in the
form of Transact SQL, as well as the compiler for it, which is
Enterprise Manager.
And what this means is that you can alter the behavior of
the stored procedures, triggers--I am probably getting a little
technical here--but what the Win EDS system does is just call
them by name. So whatever SQL is behind that name, that is
what gets executed at that moment in time, and it may not be
the same stuff that was delivered, it may not be the same SQL
that was certified, and it may not be the same stuff that you
audit the day after.
So those are currently in use.
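[An illustrative sketch of the mechanism Mr. Washburn
describes follows. The procedure, table, and column names below
are hypothetical and are not taken from the certified Win EDS
source; the sketch assumes only standard Transact SQL behavior,
in which an application that invokes a stored procedure by name
executes whatever definition is current in the database at that
moment, which need not be the definition that was certified.]

    -- Hypothetical stored procedure as it might stand at
    -- certification time.
    CREATE PROCEDURE dbo.TallyVotes
    AS
    BEGIN
        SELECT CandidateId, COUNT(*) AS Votes
        FROM dbo.CastBallots
        GROUP BY CandidateId;
    END;
    GO

    -- After certification, the same name can be redefined with
    -- any SQL client (for example, Enterprise Manager). The
    -- compiled application is unchanged; it simply executes
    -- whatever now sits behind the name TallyVotes.
    ALTER PROCEDURE dbo.TallyVotes
    AS
    BEGIN
        SELECT CandidateId, COUNT(*) AS Votes
        FROM dbo.CastBallots
        -- Silently altered logic: some ballots are now excluded.
        WHERE BallotId NOT IN
            (SELECT BallotId FROM dbo.SuppressedBallots)
        GROUP BY CandidateId;
    END;
    GO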
Mr. Clay. Mr. Norden, what systems send out alarms for you?
Mr. Norden. I think Mr. Washburn did a pretty good job
there.
Mr. Clay. Got everything that you were concerned with?
Mr. Norden. Yes.
Mr. Clay. And how about you, Doctor?
Dr. Wagner. Well, I am a technologist, and I consider the
question of what meets the certification standards a policy
question. But I believe there is room for serious concern about
a number of the systems from three of the four major vendors
out there. The Princeton vulnerability testing has demonstrated
serious security problems in machines from one vendor, and I
think there is a credible argument that those problems violate
the standards.
The problem that we face today is that there has been no
process and no attempt to investigate these claims. This has
been a bit of a political ``hot potato'' that no one wants to
touch, because if we were to--if there were to be a finding
that these systems did not comply with the standards, local
election officials would be in a major bind.
So for that reason, the EAC has been reluctant to
investigate these claims about--they perhaps reasonably have
said NASED certified these systems, let NASED deal with its
mess. NASED has been silent on this issue.
So we haven't come to terms. There has been no serious
attempt to grapple with these allegations.
Mr. Clay. Thank you so much for that response.
Representative Maloney, do you have any questions?
Mrs. Maloney. That is truly horrifying, that there has not
been any serious attempt to grapple with this, and everyone's
hiding behind the fact that NASED certified it.
So I would like to ask Mr. Slingerlend, since he is
involved in testing, is it fair to say that having
certification from the National Association of State Election
Directors does not necessarily mean that the voting equipment
complies with each and every one of the voting standards?
Maybe let me back up a little bit. Did CIBER test the
Diebold AccuVote TS optical scan terminals that were the
subject of the reports by computer scientists at Princeton and
the University of Connecticut that Dr. Wagner mentioned? Did
CIBER test them?
Mr. Slingerlend. Do you know? I don't know.
Mr. Pope. We have tested Diebold systems, but I am not
certain about the particular one that you mention.
Mrs. Maloney. Well, the Diebold system is the one that
Princeton and Connecticut hacked into.
Mr. Slingerlend. As Mr. Washburn said, is it the one that
was tested, the one that was delivered, the one that was
implemented, or as other people were saying, we have--they have
been a client from time to time. That specific item, we would
have to check into, ma'am.
Mrs. Maloney. Well, maybe you could check into it and get
back to the committee.
Mr. Slingerlend. Let me just make sure that I get the right
question, so I get them the right answer.
[The information referred to follows:]
[GRAPHIC(S) NOT AVAILABLE IN TIFF FORMAT]
Mrs. Maloney. And isn't it true, that those reports showed
security vulnerabilities that were not tested in the
certification process, obviously, in the Connecticut and
Princeton tests?
Mr. Slingerlend. Can I address what you are--the general
topic of what you are saying right now. I believe with 100
percent, you know, certainty--and again, I guess, put the word
``believe'' there--but I believe we have done a fine and good
job of testing the software in machines, not the hardware of
machines, because we have never been asked to test the hardware
of the machines--but the software of the machines, to meet the
2002 standards that are out there.
That does not mean to say that the 2002 standards were as
great as they should have been, or that they weren't changed in
2005, and it sounds like they are changing again in 2007. But I
do believe that if we were asked to certify something, or
attest to something, as to how it worked to 2002 standards, we
did our job properly.
Those standards may not have been sufficient and that may
be exactly your point.
Mrs. Maloney. Prior to the time that you were suspended
from further testing in New York State, did any of the voting
systems submitted pass each of the tests that were given? Did
any of the voting systems pass prior?
Mr. Pope. With regard to the State of New York, all the
systems that we have tested are still in an incomplete state.
Mrs. Maloney. All right. Let me go back to the question
that, really, the point that Dr. Wagner raised, and just go
down the panel, starting with Mr. Kellner, and let everybody
answer.
Is it fair to say that having certification from the
National Association of State Election Directors does not
necessarily mean that the voting equipment complies with each
and every one of the voting standards?
Can you reply to that, Mr. Kellner?
Mr. Kellner. I think that is completely true. I think that
everyone has to follow California's lead. California's
Secretary of State has announced that she is going to retest
every single piece of voting equipment, and that is based on the
bankruptcy of the 2002 standards testing that was done under
NASED supervision--that NASED certification is meaningless.
Mrs. Maloney. That is a powerful statement.
Dr. Wagner.
Dr. Wagner. Representative Maloney, I think it is indeed
fair to say. I would concur with your assessment.
Mrs. Maloney. That NASED certification is meaningless. OK.
Mr. Norden.
Mr. Norden. Yes, I would agree with that, and I would add a
couple of things. That is one reason why having software-
independent records and audits is so critical.
And in addition, something that Mr. Kellner mentioned I
think bears some further explanation. I am troubled by the fact
that this system has been so--on top of everything else, and
certainly, the security of our elections is the most important
thing.
But on top of everything else, it has been an incredibly
inefficient system, and we have States like New York and States
like California not trusting Federal certification and having
to run very expensive tests on their own. This is expensive to,
obviously, the people of the State of New York, to the people
of California, it is expensive to the vendors.
And what I would like to see is that, at some point, when
we get these standards right on the Federal level, that this
isn't just voluntary, that it is a mandatory thing that all of
the States comply with, and that we can actually trust the
certification process, so we don't have to go through what we
have gone through in New York, so that we don't have to do the
kind of additional testing that we do in California, unless
there are very specific reasons for doing so.
Mr. Washburn. I too would agree that a certification number
has no connection at all to whether a system complies or
doesn't comply with the standards. And echoing Mr. Norden's
point on the testing, the proposal I was talking about, that is
in my written testimony, would propose that a consortium of
States buy a pool of election equipment exactly as bought by
election officials and essentially allow anyone who would like
to test it to do so, with access similar to what an election
official has, the stipulation being that everything you do has
to be videotaped and audio recorded, so there is no dispute
about what they did, what they didn't do, and what the findings
were, good, bad or ugly, whatever the result is.
And then that information could be made public and help
election officials evaluate changes in the local security
procedures.
Mrs. Maloney. That is a very strong statement, if I
understand what you said. You said no certification system up
to this point can verify that the voting machines are meeting
the required standards of 2002, not to mention 2005, that they
are now required to meet.
Mr. Washburn. Well, I haven't looked at all of them. I
looked at most of those that are sold in the State of
Wisconsin. But I find problems with all of--I can find a
section of the standards that the system does not meet for
every one of those in Wisconsin.
Mrs. Maloney. And Mr. Slingerlend, do you agree with the
comments of Mr. Kellner, Dr. Wagner, Mr. Norden, and Mr.
Washburn, that the certification from the National Association
of State Election Directors is not a certification you can rely
on? Is it fair to say you are saying it is not workable, it is
not doing the job?
Mr. Slingerlend. If you knew me better, you would probably
know I disagree with most anybody. But I would go back to what
these gentlemen were saying, and your question earlier was are
they meaningless, and I think I would say these are good people
doing unpaid work, not sufficiently funded or done by the
Federal Government, doing the best they could.
I would say it wasn't sufficiently meaningful, but I'm not
going to say it was meaningless.
Mrs. Maloney. But back to Dr. Wagner's statement, you were
saying that the EAC would not go back and look at these systems
because they were certified by the National Association of
State Election Directors. Is that what you said?
Dr. Wagner. I can't speak for the EAC of course, but my
understanding is that the position the EAC has taken is that
they will not go back to investigate these allegations and
systems that were certified by NASED, that they are developing
a new process. If manufacturers choose to submit their systems
to the EAC's new process, then the EAC will investigate
reports and may consider decertification, if that is warranted.
They have developed a new process with these safeguards but
those safeguards don't apply to the old NASED process.
Mrs. Maloney. That is very discouraging. I would like, Mr.
Kellner, to just go down the line, for each of you to comment
on what you have examined in voting systems that were
certified. Do you think they are fine? Can we trust them? What
are your statements? I will just get you on the record.
Mr. Kellner. I certainly subscribe to the view that Debra
Bowen in California has adopted, which is that we need to have
recertification of every voting system that is in use in this
country, and that is a responsibility Congress should give the
EAC, and I would add that we shouldn't be spending a lot of new
money to buy voting equipment until that process has been
completed.
Dr. Wagner. It is a difficult question with a complex
answer. I would say despite the flaws and the deficiencies in
the certification process, I believe that many of the systems
out there, for instance, the systems that provide a voter-
verified paper record, if they are used appropriately, can
provide a good basis for trust in our elections.
However, I have serious concerns about the use of paperless
e-voting systems.
Mr. Norden. I would echo exactly what David Wagner just
said. If we are going to continue using these systems, and I
think to a certain extent there is no choice, that for the next
few elections we have to, we need to ensure that we have paper
records and that we are using those paper records to check the
electronic tallies that we get at the end of election day.
Mr. Washburn. I once knew a whitewater outfitter who used
to say there comes a point in the river where there is no way
out but through, and I think we are at that point with the
current crop of systems. There is no way it is going to be
fixed in time.
But that said, as Mr. Wagner said, certain systems are less
vulnerable than others, and specifically what you want is a
system that provides an objective record, that the voter has
made, that might possibly contradict what the electronics are
telling you. Systems that don't have that are inherently more
vulnerable.
Mr. Slingerlend. I think paper systems are great for Third
World countries. I like your comment about, if you can't find a
way out, you go through it. I think we are on the cusp, with
EAC and NIST, of making progress in an area that was never
sufficiently addressed before, and you should press on.
I mean, I think that this country should press on with
electronic voting systems, and you have smart people who care,
who are in charge of this activity now. Go with it. That would
be my recommendation.
Mrs. Maloney. Thank you.
Mr. Clay. Thank you, Mrs. Maloney.
Let me thank all of the panel for their testimony today,
and thank our gracious host, again, Representative Maloney, for
inviting us here today. I think that the hearing brought out
the fact that we must be able to verify the reliability and
security of our Nation's voting machinery.
The EAC, the States, and local election authorities, must
work hand in hand to ensure that our elections are conducted in
a manner that gives our citizens the utmost confidence in the
election process.
Vendors of election machines should not be paying labs, and
all machines must have a verifiable paper trail.
H.R. 811, introduced by Representative Holt of New Jersey,
would apparently give us that extra protection, and Congress
needs to move on it.
The certification process must be transparent, and sunshine
must be allowed to expose the process. We must get the voting
procedure correct the first time in New York and across this
Nation, and I will yield to my friend for closing remarks.
Mrs. Maloney. I want to thank all of the panelists for
coming, and my colleague and good friend, Mr. Clay, for having
this Federal hearing. It is obviously a critical issue. What is
more important than the security of our voting machines? And it
is a part of our democracy, it is a top priority and one that
we will continue to pursue as a Congress and as a committee.
I am delighted that tomorrow, Congressman Holt's bill, on
which he has worked for 8 years, will be marked up in committee,
and I hope it will move to the floor and be passed. It will
strengthen the system and address many of the issues that you
brought up today: the need for a verifiable paper trail to check
the electronic voting; the need for checking conflicts of
interest, so that the payment by vendors will go to the EAC,
which will then select the testing labs to find out how accurate
they are.
It provides funding for purchasing these machines, and for
audits. It is very important to have an independent audit, to
see if they are working properly.
All of you have helped move this country forward to a
safer, more reliable voting system, and I thank all of you for
your tremendous contributions to it. Nothing is more upsetting
than hearing questions about more people voting than were
registered and more people voting than signed up to vote on the
machine, and all types of really questionable items, that
really, you expect to be happening in Third World countries,
not in the great democracy of the United States.
So we need to correct it, we need to all continue with
oversight, and to continue with our eye on making sure that
these elections are as safe as they possibly can be, and I want
to thank all of you for your research, your time, for being
here today, and for your continued commitment to safe and
reliable voting machines and election systems in the United
States. And all the advocates.
Mr. Clay. Thank you so much, Representative, and at this
time we will excuse the panel, gavel the committee to a close,
and hold an impromptu press conference with Representative
Maloney and myself for members of the press.
Without objection, the hearing is adjourned.
[Whereupon, at 11:30 a.m., the subcommittee was adjourned.]
[Additional information submitted for the hearing record
follows:]
[GRAPHIC(S) NOT AVAILABLE IN TIFF FORMAT]