Every Vote Counts (Hopefully)

As electronic voting machines come into vogue, Kirsten Anderson takes a look at the security problems that have plagued these devices over the last several years.

The following piece was produced through OffTheBus, a citizen journalism project hosted at the Huffington Post and launched in partnership with NewAssignment.Net. For more information, read Arianna Huffington's project introduction. If you'd like to join our blogging team, sign up here. If you're interested in other opportunities, you can see the list here.

This is the first installment of a two-part report by Kirsten Anderson.

Which of the following best describes taking the SAT and voting?

A. They both take about four hours to complete.

B. They both require a thorough knowledge of a specific issue.

C. Their results are affected by the number of people participating.

D. Their results are affected by the weather.

Did you choose D? Here's why you should have: on August 24th, the College Board and NCS Pearson announced a tentative settlement of a class-action suit brought by about four thousand students whose October 2005 SATs were scored incorrectly. The College Board maintains and administers the SAT. NCS Pearson scores the tests.

In announcing the settlement, the College Board also described new quality control measures that have been put into place to reduce the likelihood of similar errors: all answer sheets will now be scanned twice, using a different machine each time, and tests will be kept in low-humidity areas, because high moisture had apparently contributed to the faulty reading of answers as the tests were scored by optical scanners. (A refresher for those of you who haven't taken a standardized test in a while: you fill in the answer bubble on a test sheet, shading it as dark as you can. The answer sheet is fed into an optical scanner, which records your answers and stores the results.)
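To make that failure mode concrete, here is a toy sketch of how an optical mark reader decides what you answered. The offsets and threshold below are invented for illustration; no real Pearson or ES&S scanner is this simple. The point is that the scanner looks for ink only where a bubble is supposed to be, so anything that shifts the marks, such as a humidity-stretched sheet, can make a clearly shaded answer invisible.

```python
# Toy sketch of optical mark recognition (OMR). The bubble offsets and
# darkness threshold are invented, not taken from any real scanner.

BUBBLE_OFFSETS = {"A": 10, "B": 30, "C": 50, "D": 70}  # expected pixel offsets
DARKNESS_THRESHOLD = 0.5  # fraction of ink needed to register a mark

def read_answer(scan_line):
    """Return the choice whose expected bubble is dark enough, else None.

    `scan_line` maps pixel offset -> measured darkness (0.0 to 1.0).
    The scanner only looks where a bubble is *supposed* to be.
    """
    for choice, offset in BUBBLE_OFFSETS.items():
        if scan_line.get(offset, 0.0) >= DARKNESS_THRESHOLD:
            return choice
    return None

# A fully shaded "C" lands at offset 50 on a normal sheet...
print(read_answer({50: 0.9}))  # -> "C"

# ...but if humidity stretches the paper by ~8%, the same ink sits at
# offset 54, where the scanner never looks, and the mark is lost.
print(read_answer({54: 0.9}))  # -> None
```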

So what does this have to do with voting? NCS Pearson sells its own line of optical scan test-scoring machines, but one of the models sold under the Pearson name is made by Chatsworth Data Corporation (CDC). Another CDC client that sells a line of optical scanners under its own name is Election Systems & Software. ES&S is one of the primary suppliers of optical scanners used for voting in elections.

Now it's not likely that states holding primaries on February 5th, 2008, will be afflicted with humidity high enough to stretch the ballots running through the scanners, causing the votes on them to be counted incorrectly. But it's not impossible either.

It's been a great summer for fans of voting technology follies. On August 3rd, California Secretary of State Debra Bowen announced that the state was decertifying a number of DRE (direct-recording electronic) voting machines after a two-month review revealed security flaws in machines made by major voting systems players Diebold and Sequoia, as well as the smaller company Hart InterCivic. In addition, another group of machines was disqualified after it was revealed that their maker, ES&S, sold them in California before those models were certified by the state. ES&S faces an investigation on this matter and up to $10,000 in fines per uncertified machine. Meanwhile, counties all over California are scrambling to replace the machines they can no longer use.

Florida came close to facing its own decertification nightmare. A study by Florida State University researchers revealed security flaws in a model of Diebold optical scanner that was set to be used in upcoming elections. The state threatened to decertify the machines if a solution was not presented by August 17th, leaving time to get them ready for local elections in Sarasota County. In a letter to Diebold dated August 10th, Florida Secretary of State Kurt Browning stated that the problems had been resolved and the machines were certified for use.

An August 20th report from CNET.com described how the secret ballot in Ohio isn't so secret. A state open records law gives people the right to go to an election office and ask to see both a list of voter sign-ins and a list of votes cast--a list which is time-stamped. By lining the two lists up, someone could make a reasonable match between voters, in sign-in order, and the time-stamped votes.

The maker of the machine that applies the time stamp, ES&S, protested that variations in how long it takes to vote could throw the lists out of order. That may be true in a major election with a large turnout and many questions and candidates on the ballot. In a small local election, however, where voters come in at a slower pace, the lists would probably line up quite easily--and it is exactly there that being able to identify voters is most problematic. With a ballot full of questions that matter to neighbors, colleagues, and friends, finding out who voted how could lead to political or economic punishment or reward.
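A rough sketch makes the risk plain. The names, times, and ballot question below are invented, but each list stands in for a record the CNET report says is available under the open records law. The matching step assumes voters finish voting in roughly the order they signed in, which is most plausible in exactly the slow, small elections described above.

```python
# Sketch of the Ohio re-identification risk. Both lists are invented;
# each models a public record described in the CNET report.

# Open-records request #1: the sign-in sheet, in order of arrival.
sign_ins = ["A. Alvarez", "B. Baker", "C. Chen"]

# Open-records request #2: cast votes, each time-stamped by the machine.
ballots = [
    ("09:31", "Yes on Levy"),
    ("09:02", "Yes on Levy"),
    ("09:15", "No on Levy"),
]

# Sort ballots by timestamp and pair them with voters by position.
for voter, (time, vote) in zip(sign_ins, sorted(ballots)):
    print(f"{voter} likely voted '{vote}' at {time}")
```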

In addition to this problem, Ohio Secretary of State Jennifer Brunner is reviewing all models of voting machines used in the state. Some of the models under review are Diebold machines that were decertified in California. Kentucky also recently discovered that its largest county has been using uncertified Diebold machines. In Colorado, Jefferson County is reconsidering its choice of machines.

And remember--the presidential primaries are about five months away.

The way votes are cast and counted has been in flux since the rocky 2000 presidential election. In October 2002, the enthusiastically named Help America Vote Act (HAVA) was passed. The intent of HAVA, as described in its opening paragraph, was: "To establish a program to provide funds to States to replace punch card voting systems, to establish the Election Assistance Commission to assist in the administration of Federal Elections and to otherwise provide assistance with the administration of certain Federal election laws and programs, to establish minimum election administration standards for States and units of local government with responsibility for the administration of Federal elections, and for other purposes." HAVA also stated that new voting systems should allow voters the opportunity to check their vote before it is cast and counted; that there should be a paper trail for audit purposes; that machines should be accessible to people with disabilities; and that alternative language choices should be available.

To help states change their voting systems, HAVA authorized more than three billion dollars, distributed to states in amounts based on their voting-age populations. States quickly began spending this money to come into compliance with HAVA. Three companies--Diebold, ES&S, and Sequoia--took advantage of the new electronic voting machine gold rush, jumping into the lead and dominating the nation's purchases of new election equipment.

The Election Assistance Commission (EAC) was formed to help oversee HAVA. One of its jobs is implementing and funding a system to federally certify voting machines. Previously, there was a voluntary system for federal certification, overseen by the National Association of State Election Directors (NASED), a nonpartisan group that was not federally funded. As described by the Maryland State Board of Elections, NASED approved independent testing authorities (ITAs), and the ITAs tested voting systems. The ITA that tested a given system, though, was chosen and paid by the system's manufacturer. This, of course, leads to questions of conflict of interest and reliability; one of these ITAs, Ciber Inc., which tested a large number of the systems in use throughout the country, was shut down earlier this year over questions of quality control and failure to complete all required tests.

States are not required to participate in the EAC's new federal lab certification program. However, most states demand federal certification in addition to state certification, and are therefore likely to use it. Unfortunately, the EAC's first group of labs was not accredited until February 2007, which means the various new voting systems sold since 2002 have been certified under a questionable patchwork of systems. The results have been predictably wild.

TrueVoteMD, a non-partisan organization, sent trained observers to precincts all over Maryland to watch voting in the elections on November 2, 2004. They reported a variety of problems that resulted in lost or incorrect votes. There were memory card failures and hard drive crashes. Touch screens were so sensitive that voters complained they couldn't tell whom they had voted for. Review screens went blank before voters could check their choices. Ballots were submitted before the voter had made any selections. Polling places opened late because of trouble booting up machines, and votes were lost because people could not wait. Inadequate staffing meant voters weren't able to get help when they needed it.

VotersUnite!, a national election watch group, collected information about voting problems in the November 2006 midterm elections. Difficulties starting machines again contributed to precincts opening late. Voters saw machines register different candidates than the ones they had selected; staffers told them it was their fault for not knowing how to use a computer. At the end of the day, tallying the votes became a nightmare. Election workers reported machines adding votes multiple times, subtracting votes instead of adding them, and reporting tallies greater than the number of people who had voted. Some had trouble retrieving any information from the memory cards. Others got a different set of results every time they tried to get a total. One frustrated worker in North Carolina said that when they called the ES&S home office to get help with their machine, the phone was off the hook.

Most notably that year, in Sarasota, Florida, it was discovered that roughly 18,000 people who had cast votes on other ballot questions had not marked a vote in the race for the District 13 congressional seat. The resulting 13% "undervote"--roughly one ballot in eight--is considered an astonishing anomaly. According to a report on the incident by Common Cause, nearby Manatee County experienced only a two percent undervote, and a typical presidential election usually registers an undervote of less than one percent.

Essentially, 18,000 votes were missing, which is particularly meaningful in light of the fact that the winner, Vern Buchanan, beat his opponent, Christine Jennings, by only 369 votes. Lawsuits by Jennings and local voters led the state to investigate the election, and it determined the machines were not at fault. However, a review of the state investigation by electronic voting experts David Dill and Dan Wallach found that it was incomplete, improperly conducted, and carried out in an unrealistic setting.

Electronic voting systems were supposed to make elections more secure, more convenient, and easier for voters. Instead, the 2006 elections showed that the new systems were just as trouble-plagued as the banished punch-card and lever machines. With presidential primaries on the horizon, the questions of what makes electronic voting machines so insecure, and how they can be fixed, had to be answered.

Tomorrow, Kirsten Anderson investigates some of the security challenges to electronic voting machines.

