c/o 3548 Beechwood Boulevard
Pittsburgh, Pennsylvania 15217-2767
October 25, 2006
United States Election Assistance Commission
1225 New York Avenue, N.W., Suite 1100
Washington, D.C. 20005
By fax: 202-566-3127
Re: Comments on Voting System Testing and Certification Program
Ladies and Gentlemen:
VoteAllegheny is a volunteer organization committed to safeguarding voters’ rights. As such, we wish to offer testimony toward implementing the best possible system for testing and certifying voting systems. Our comments below address the Voluntary Voting System Guidelines (VVSG) and the Voting System Standards (VSS), as well as the Testing and Certification Program Manual 2006.
We, as a group, seek to define and circumscribe a universally acceptable ideal for a reliable system, which would include the following: voter verifiability; accessibility to the broadest range of individuals for secure, private voting; system functioning that is not unnecessarily complex or opaque; all system source code open to review by the public; software verification; distinct and detailed testing of systems before, during, and after use; accurate counting of votes; mandatory auditing of a randomly selected five percent of precincts; all portions of vote compilation open to the public; completely secure retention of voter-verified paper records for recounts and audits; and freedom from vulnerability to the greatest extent possible.
Toward that end, we feel that the VVSG and VSS should address these issues to the fullest and broadest extent possible.
1. Voter Verifiability. There must always be some method by which voters can verify their votes, apart from blind faith that a computer is recording them correctly. Ideally, at this point in the evolution of voting system technology, we believe the best method is a paper record which the voter has verified and which is retained at the polling place as a legally accepted ballot. The paper ballot must be durable enough to remain viable for at least two years. Further, we propose that such a paper ballot also comply with the laws of the Commonwealth of Pennsylvania, which preclude consecutive voter records on a continuous roll of paper.
2. Accessibility and Privacy. Any voting system must provide accessibility and privacy to the broadest range of individuals possible with current technology. The standard should be the highest level available – accessible to sip-and-puff quadriplegics, the blind, and those otherwise incapacitated – rather than the minimal accommodation some manufacturers currently offer the blind: lengthy audio recitation of long candidate lists and extremely slow input. Additionally, privacy, in the sense of not requiring assistance from others at the polling place, is of importance.
3. Transparent System. A voting system should incorporate few, if any, software or hardware products made by manufacturers other than the one selling the system, and none whose proprietary status would keep its source code from the public. The manufacturer must own all portions of its system and must be willing to demonstrate them openly. Further, no part of the system should carry outside ports or programs enabling connection to the internet, outside devices, and the like, except where such connection is a necessary and unavoidable part of the voting system.
4. Open Source Code. There must be no reason why the public may not inspect the source code which is governing our elections. Further, there must be mandatory submission by each manufacturer of source code and hardware designs to a central authority (the EAC) that is then responsible for publishing copies of said code for the purposes of audit verification. Lacking this independent chain of custody, individual jurisdictions have little ability to verify that what they receive is what was presented for certification.
5. Software Verification and System Testing. We must have the ability to conduct open hardware and software inspections without requiring that manufacturer representatives be present. Further, thorough security analyses should be mandatory, conducted publicly by agencies (the EAC) both fiscally and managerially independent of the manufacturers.
6. Vote Counting Accuracy. Any system must demonstrably be able to compute vote totals with provable accuracy. Any compiling software must be considered part of the system for purposes of such demonstration, and must be shown not to be corruptible by the insertion of text files from outside sources while final totals are being tallied.
7. Mandatory Audits. Although not a consideration of the EAC at this time, we feel that in order to keep all systems honest, each voting jurisdiction should be encouraged to implement a mandatory random audit of at least two percent (and preferably five percent) of the voting precincts in any election. This would require hand counting the paper ballots and comparing the results with the final totals produced by the voting system.
8. Public Scrutiny of All Processes. There should never be contractually arranged situations where the manufacturer denies the public access to witnessing the processes utilized by the voting system, including its programming and preparation.
9. Freedom from Vulnerability. Certification of any system must always include testing and scrutiny of its sensitivity to attack and its vulnerability to failure. There must be a set, extremely low tolerance for these sensitivities and vulnerabilities, and any system that does not pass the standard acceptable tests may not be certified.
The very fact that slot machines currently undergo greater scrutiny than our voting machines is worrisome to us.
10. Secure Retention of Paper Ballots. Any voting system must deliver a voter-verified paper ballot – by default in an optical-scan system, by design in other systems – which any voting jurisdiction should be able to retain securely. Further, the paper should be of a type that retains its integrity, and the integrity of its data, for at least two years.
We feel that the most important advancement to be achieved with these new guidelines is total transparency in the ITA process, where none has existed previously. To date, manufacturers have been able to “shop” their wares around until they found a favorable review by an ITA, which of course they would then purchase and not have to release to the public. Inasmuch as the various jurisdictions ultimately purchasing the equipment rely heavily on federal and state “certification,” the entire results of such reviews must be made completely public in all cases.
Overall, we feel that the draft document presents a good outline of the process for registering, monitoring, certifying, reviewing, and maintaining documentation for voting system manufacturers and their products. We would comment on several points of that basic outline.
Paragraph 18.104.22.168 indicates that EAC certification does not qualify as a determination that a system will meet all requirements of the Help America Vote Act (“HAVA”) when fielded. It is our experience that local governmental jurisdictions rely heavily upon federal “certification” and state “certification” to indicate the complete and total compliance of systems with HAVA. Therefore, we believe that it behooves the federal certification procedure to do just that, leaving only state certification as to state laws, and nothing but purchase considerations to the local voting jurisdictions.
Furthermore, as new HAVA standards come into effect, we feel that manufacturers must be required to seek certification to the new standards, even if their systems are already deployed in voting jurisdictions.
Paragraph 22.214.171.124 indicates that EAC certification is not a determination that the system is ready for use in an election. We object to this blanket statement entirely, and feel that any system which is certified by the EAC should be “ready for use in an election.”
We do not feel that systems qualified previously by NASED should pass without current review, as outlined in Paragraph 3.3.
Paragraph 8.4 deals with quality monitoring and indicates that the third method would be field performance issues with certified systems. This provision may place too much reliance on local ability to monitor what system and software the jurisdiction is employing. At present, we note, only one county in the entire United States verifies its software – Snohomish County, Washington. Responsibility for such quality monitoring should rest not with thousands of individual jurisdictions but with one federal-level entity. A federal-level repository of manufacturers’ information would make it much easier for local jurisdictions to scrutinize their systems as deployed.
Paragraph 8.5, concerning manufacturing site review, should include, in addition to review of all physical components of production and manufacture, complete review and comparison of the software/firmware being produced.
Fielded System Review
Similar to the paragraph immediately above, any review of a fielded system should also contain complete review and comparison of software/firmware being employed, on all portions of the system.
Field Anomaly Reporting
Reporting anomalies in the field should not be limited to reports by official voting jurisdiction election officials. Citizens and advocacy groups should also be given credence in such reporting. In many jurisdictions, advocacy groups and designated members of political parties are given the responsibility of watching the logic and accuracy testing of machines. Additionally, we have found that election officials are often extremely difficult to convince that the manufacturer providing their systems does not have everything completely under control. Whether because they have succumbed to the salesmanship of the manufacturer’s representatives or because they have faith in their own selection of product, the result is that bringing observed anomalies to such officials’ attention is next to impossible.
Paragraph 10.3 covers trade secrets. We do not believe that the source code of voting systems should be considered a trade secret, nor that any adjacent software should be used which is not owned by the manufacturer and whose source code therefore cannot also be disclosed. Additionally, any information of the type that would be submitted to the Library of Congress for copyright, or disclosed to the Securities and Exchange Commission by a public company, should be a required submission for any manufacturer of voting machines and its products and systems, regardless of whether such a copyright submission has been or ever will be made, and regardless of whether the manufacturer is a public company or even a corporation at all.
Additionally, we feel that the document itself should explain how the Voluntary Voting System Guidelines and Voting System Standards are developed. Better still would be including the actual guidelines and standards themselves.
We remain hopeful that employing new procedures and revamping the certification process will not prove to be locking the barn after the horse has escaped. Most jurisdictions in the United States have made often rash purchasing decisions for voting systems in order to comply with HAVA, relying heavily on faith in “certifications” by upper levels of government, and a great deal of money has already been spent. Unless (a) more federal tax dollars are offered for upgrades, (b) manufacturers are forced to upgrade systems at minimal cost to the end users, or (c) systems begin failing in massive numbers, we are therefore likely to find great reluctance toward further purchases by local jurisdictions, obviating much of this work for the next several years.
By: ______/s/ Audrey N. Glickman______
Audrey N. Glickman, Secretary pro tem