Pwning* democracy

Pronounced "pon"
1.  (especially in video gaming) utterly defeat (an opponent or rival); completely get the better of.
"I can't wait to pwn some noobs in this game"

The Register (UK)


It took DEF CON hackers minutes to pwn these US voting machines

We've got three years to shore up election security
By Iain Thomson in San Francisco
DEF CON After the debacle of the 2000 presidential election count, the US invested heavily in electronic voting systems – but not, it seems, the security to protect them.
This year at the DEF CON hacking conference in Las Vegas, 30 computer-powered ballot boxes used in American elections were set up in a simulated national White House race – and hackers got to work physically breaking the gear open to find out what was hidden inside.
In less than 90 minutes, the first cracks in the systems' defenses started appearing, revealing an embarrassingly low level of security. Then one was hacked wirelessly.
“Without question, our voting systems are weak and susceptible. Thanks to the contributions of the hacker community today, we've uncovered even more about exactly how,” said Jake Braun, who sold DEF CON founder Jeff Moss on the idea earlier this year.
“The scary thing is we also know that our foreign adversaries – including Russia, North Korea, Iran – possess the capabilities to hack them too, in the process undermining principles of democracy and threatening our national security.”
The machines – from Diebold to Sequoia and WinVote equipment – were bought on eBay or from government auctions, and an analysis of them at the DEF CON Voting Village revealed a sorry state of affairs. Some were running very outdated and exploitable software – such as unpatched versions of OpenSSL and Windows XP and CE. Some had physical ports open that could be used to install malicious software to tamper with votes.
It's one thing to physically nobble a box in front of you, which isn't hard for election officials to spot and stop. It's another to do it over the air from a distance. Apparently, some of the boxes included poorly secured Wi-Fi connectivity. A WinVote system used in previous county elections was, it appears, hacked via Wi-Fi and the MS03-026 vulnerability in WinXP, allowing infosec academic Carsten Schurmann to access the machine from his laptop using RDP. Another system could be potentially cracked remotely via OpenSSL bug CVE-2011-4109, it is claimed.
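The common thread in both attacks is software left years behind its patches. A minimal, defensive sketch of the kind of check auditors run: flag a reported OpenSSL version string as predating a known fix. The "0.9.8s" fixed-in version used here is an assumption for illustration only; the real fixed release must be confirmed against the official OpenSSL security advisory for the CVE in question.

```python
import re

# Illustrative sketch only: flag OpenSSL builds older than the release
# assumed here to carry the fix. The fixed-in version "0.9.8s" is an
# assumption for illustration; confirm against the official advisory.

def parse(v: str):
    """Split e.g. '0.9.8r' into a comparable tuple: (0, 9, 8, 'r')."""
    m = re.fullmatch(r"(\d+)\.(\d+)\.(\d+)([a-z]?)", v)
    if not m:
        raise ValueError(f"unrecognized version: {v}")
    major, minor, patch, letter = m.groups()
    return (int(major), int(minor), int(patch), letter)

def predates_fix(version: str, fixed_in: str = "0.9.8s") -> bool:
    """True if the reported build is older than the assumed fixed release."""
    return parse(version) < parse(fixed_in)

print(predates_fix("0.9.8r"))  # True: older than the assumed fix
print(predates_fix("1.0.0"))   # False: newer release line
```

Tuple comparison does the work here: numeric components compare numerically and the trailing patch letter compares lexicographically, which matches OpenSSL's old lettered-release scheme.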

We're told the WinVote machine was not fully secured, and that the intrusion would have been detected and logged, so don't panic too much. And not all the attacked equipment is used in today's elections. However, it does reveal the damage that could be done if computer ballot box makers and local election officials are not on top of physical and remote security, especially with a growing interest from Russia and other states. Think of it as a wakeup call.
“Elections have always been the concern and constitutional responsibility of state and local officials. But when Russia decided to interlope in 2016, it upped the ante,” said Douglas Lute, former US Ambassador to NATO and now principal at Cambridge Global Advisors.
“This is now a grave national security concern that isn't going away. In the words of former FBI Director James Comey, ‘They're coming after America. They will be back.’” ®
The Hill
States ditch electronic voting machines
Cory Bennett
States have abandoned electronic voting machines in droves, ensuring that most voters will be casting their ballots by hand on Election Day.
With many electronic voting machines more than a decade old, and states lacking the funding to repair or replace them, officials have opted to return to the pencil-and-paper voting that the new technology was supposed to replace.
Nearly 70 percent of voters will be casting ballots by hand on Tuesday, according to Pamela Smith, president of election watchdog Verified Voting.
"Paper, even though it sounds kind of old school, it actually has properties that serve the elections really well," Smith said.
It’s an outcome few would have predicted after the 2000 election, when the battle over “hanging chads” in the Florida recount spurred a massive, $3 billion federal investment in electronic voting machines.
States at the time ditched punch cards and levers in favor of touch screens and ballot-scanners, with the perennial battleground state of Ohio alone spending $115 million on upgrades.
Smith said the mid-2000s might go down as the “heyday” of electronic voting.
Since then, states have failed to maintain the machines, partly due to budget shortfalls.
“There is simply no money to replace them,” said Michael Shamos, a computer scientist at Carnegie Mellon University who has examined computerized voting systems in six states.
The lack of spending on the machines is a major problem because the electronic equipment wears out quickly. Smith recalled sitting in a meeting with Missouri election officials in 2012 where they complained 25 percent of their equipment had malfunctioned in preelection testing.
“You’re dealing with voting machines that are more than a decade old,” Smith said.
Roughly half of the states that significantly adopted electronic voting following the cash influx have started to move back toward paper.
The Presidential Commission on Election Administration in January warned that the deterioration of voting machines is an “impending crisis,” but House Republicans say the issue should be left to the states.
Rep. Candice Miller (R-Mich.), who chairs the House committee that oversees federal elections and is a former Michigan secretary of State, said the cash infusion to the states in the mid-2000s was "unprecedented."
"State and local election officials should not rely on the federal government to replace voting machines that may be nearing the end of its useful life. Therefore, state and local election officials should recognize that they are responsible for upgrading their voting equipment as needed, and hopefully they are budgeting accordingly," Miller said in a statement to The Hill. 
Some voters might welcome the return to paper voting, given that researchers have repeatedly proved the fallibility of individual e-voting machines.
One group from Princeton needed only seven minutes and simple hacking tools to install a computer program on a voting machine that took votes for one candidate and gave them to another.
More whimsically, two researchers showed they could install Pac-Man onto a touch-screen voting machine, leaving no detectable traces of their presence.
But concerns of widespread tampering are overblown, Shamos said.
“It’s something you can demonstrate under lab conditions,” he said. To translate it to an election-altering hack, “you would have to commit the perfect crime.”
“There's never been a proven case of manipulation of an electronic voting machine,” he said.
Voting machines are not connected to any network and not connected to each other, making them difficult to tamper with.
“These machines are not hooked up or networked in any way that would make them vulnerable to external access,” said Matt McClellan, press secretary for the Ohio secretary of State. “We’re confident that process is secure and the integrity is being maintained.”
“There’s no mechanism whereby viruses can pass from one machine to another,” Shamos agreed. Best-case scenario, “maybe I could fool a few people” and get several hundred votes “for my guy.”
Bryan Whitener, director of communications for the U.S. Election Assistance Commission, noted that all electronic voting machines are tested and certified.
Many states, like Colorado, keep their machines under video surveillance with detailed records of when software is being installed.
When Ohio made the $115 million statewide switch to e-voting, it passed a law that all voting methods, including touch screens, must also generate a paper trail.
“It’s not just solely an electronic vote,” McClellan said.
More than 60 percent of states passed similar laws with the electronic switch. Some states moved preemptively; others acted only after a failure.
An electronic machine in North Carolina lost roughly 4,500 votes in a 2004 statewide race after it simply stopped recording votes. The race was ultimately decided by fewer than 2,000 votes.
“Now what do you do?” Smith asked. “You can’t really do a recount. There’s nothing to count.”
Within a year, the state passed a law requiring a paper back-up.
Paper trails are simply “more resilient,” Smith said.
Shamos said he expects the move back to paper ballots to continue, unless there’s a high-profile crisis similar to the 2000 election.
Still, he predicted the drumbeat for Internet and mobile voting will grow.
“Eventually [a generation is] going to have the thought that it’s idiotic for me not to be able to vote using my cell phone,” Shamos said.
Then all bets are off.
Wired Magazine
Broken Machine Politics
Introducing the User-Friendly, Error-Free, Tamper-Proof Voting Machine of the Future!
(WARNING: Satisfaction not guaranteed if used before 2006.)

 Wired Staff
On a cool afternoon last February, five politicians gathered in the heart of Silicon Valley for a meeting of the Santa Clara County Board of Supervisors. Their task: to replace the county's antiquated punch card voting system with $20 million worth of touchscreen computers.
Executives and lobbyists from three different voting-tech vendors were scheduled to present their wares, but the outcome was practically predetermined. Supervisors on the board's finance committee had already anointed a winner: Sequoia Voting Systems, based 35 miles north in Oakland. It was all over but the voting.
And then the computer scientists showed up: Peter Neumann, principal computer scientist at R&D firm SRI; Barbara Simons, past president of the Association for Computing Machinery; and Stanford computer science professor David Dill. They had been fidgeting in the front of the room through three hours of what Dill would later call "garbage." Finally, they stood up and, one by one, made their case.
Voting, they explained, is too important to leave up to computers - at least, these types of computers. They're vulnerable to malfunction and mischief that could go undetected. Where they'd already been adopted, the devices - known in the industry as DREs, short for direct recording electronic - had experienced glitches that could have called into question entire elections. And they keep no paper record, no backup. "We said, 'Slow down. You've got a problem,'" recalls Neumann.
It felt odd - computer scientists inveighing against their own technology in the tone of geniuses lecturing undergraduates. They had been lobbying for months, and now "it was like they were making a last stand at Santa Clara," says one person who was at the meeting. The supervisors listened politely. "But the board didn't seem to see what it had to do with anything," says Liz Kniss, a supervisor who shared the concerns raised by the scientists.
In the end, Kniss and her colleagues voted 3 to 2 to award the contract. The last stand had failed - almost. At the final moment, the supes insisted that Sequoia be ready to produce DREs with a paper backup, should the county ever ask for them. It seemed like a sop to the geeks, but months later it would prove to be the smartest thing the board did that afternoon.
After Florida and the chaos of the 2000 presidential election, the nation's voting masters vowed: Never again. Never again would an election be jeopardized because the mechanics failed, and never again would parsing a winner be left to human discretion. Officials have scrambled to update voting equipment, creating a weird, three-pointed confluence of interests: civil servants, suits, and geeks.
Thanks to Florida, local governments find themselves sitting on piles of fix-it money - millions from city and county coffers and $3.9 billion from Congress, thanks to the Help America Vote Act of 2002. The companies that make voting equipment are rushing to produce machines; at the same time, big players like Diebold, with almost $2 billion in revenue last year, are touting transparent, efficient, and chad-free elections. Meanwhile, some of the nation's elite computer experts and election watchdogs are hyperventilating. They see a fumbled opportunity - instead of using the tech to make democracy secure and accurate for the first time, we're building an electoral infrastructure with more holes than a punch card ballot. This future is getting hashed out not in Washington (the Feds don't run elections) but in the nooks and crannies of American politics, like that Silicon Valley board meeting. "Every year there are legislative proposals that make election administrators' eyes roll," says Warren Slocum, chief elections officer for San Mateo County, just south of San Francisco. "The voting registrar's life has become wildly complex."
That's ironic, because electronic voting is supposed to make elections easier. The systems themselves are as simple to use as an ATM, and overvotes - one of the problems in Florida - are impossible. You can't select a second candidate without deselecting the first. The interface notes skipped races or ballot questions. With the addition of a simple numeric keypad and headphones, the visually impaired can vote independently.
Electoral officials get their own set of benefits. For example, some precincts in Southern California print ballots in Spanish, Vietnamese, Korean, and Tagalog, among other languages; registrars must guess how many of each to print before election day. And printed ballots often show candidates who have dropped out (such as Arianna Huffington in the California recall). By contrast, touchscreens can be quickly reprogrammed with new languages and ballot changes. When the polls close, an election worker inserts an "ender" card that tells the DRE it's time to aggregate the votes. The machine saves the tally in its internal memory and copies it to a flash memory card, which the worker removes and transports to a separate server. That's where the official count takes place - fast enough for TV networks to name a winner before the bars close.
Yet for all the ostensible advantages, digital voting's recent history plays like a Marx Brothers movie. In Southern California's Riverside County in 2000 - the state's first use of touchscreen DREs - a Sequoia server unaccountably froze, then began counting backward. In the central coast community of San Luis Obispo in 2002, a machine spontaneously began reporting totals with five hours left in the election. In Louisiana, humidity and overheating caused constant crashes. Last November, in Indiana, DREs reported more than 144,000 votes cast in Boone County, which has fewer than 19,000 registered voters.
With stories like these, it's not hard to suspect something sinister. One of evoting's harshest, most vocal critics has been an online journalist named Bev Harris, a loquacious 52-year-old from the Seattle area with a keen nose for gossip. Her site chronicles a litany of malfeasance and incompetence. Faulty equipment, miscounts, and tapping into vote-counting machines with cell phones are just the start. Harris rails against the right-wing Christian Ahmanson family's sponsorship of Election Systems & Software and the failure of US senator Chuck Hagel (R-Nebraska) to disclose financial ties to the company while its machines were being used to vote for him. And then there's the now-infamous smoking gun: a fundraising letter from Diebold's chief executive, Walden O'Dell, assuring fellow Ohio Republicans, "I am committed to helping Ohio deliver its electoral votes to the president next year."
Harris has particularly focused on Georgia in 2002, the first statewide elections using Diebold machines. There, a local IT worker claims he was hired to install state-certified software in new machines from Diebold but spent most of a sweaty election eve patching in fixes straight from the company. A Diebold spokesperson says the company has obeyed the election laws in every jurisdiction it serves. In that same election, aides to secretary of state Cathy Cox worried that they would have no way of knowing whether any given machine at a poll had blown a fuse. So they distributed 8,000 night-lights to polling stations with instructions to plug the machines into the lights and plug the lights into the wall as an improvised tattle.
Lately, DRE critics have targeted the safety of the actual votes. When Stanford professor Dill put questions about cryptography to a Sequoia executive at a meeting of California's task force on evoting, the vendor said that they'd written the crypto themselves. "Do-it-yourself crypto is a bad idea," says Dill. A Sequoia spokesperson says the company uses "publicly tested and accepted" encryption standards.
All said, you don't have to be a conspiracy theorist to spin out a nightmare scenario. A well-orchestrated attack could change the outcome of an election for the entire country (see "5 Worst-Case Scenarios," below), and we might never know it happened. Imagine the "Manchurian Programmer": a domestic political dirty tricks operation or, let's say, the People's Republic of China - technologically sophisticated, deep-pocketed, and with previous attempts at election trickery on its rap sheet - finds a programmer working for a vote-tech market leader and flips him to the cause. He instructs the machines to record a vote for one candidate as a vote for another. Normally, safeguards - so-called Logic-and-Accuracy tests - would catch such a problem. But the Manchurian Programmer writes his code bomb to operate only between certain hours on election day. The tests miss the flaw. Just a few votes get swapped, but that's enough. Paul Kocher, CEO of computer security firm Cryptography Research, says changing only half a percent of the votes cast could have given the House to the Democrats in the last election.
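The reason a time-conditioned code bomb slips past Logic-and-Accuracy testing is simple: the tests exercise the machine before election day, when the malicious branch is dormant. A minimal sketch of the idea - candidate names, dates, and the trigger window are all invented for illustration, and this resembles no real voting system's code:

```python
from datetime import datetime

# Hypothetical "Manchurian Programmer" time bomb. The election date and
# trigger window are invented; honest behavior is shown at all other times.

ELECTION_DAY = (2004, 11, 2)     # assumed election date, for illustration
TRIGGER_HOURS = range(10, 16)    # bomb is live only 10:00-15:59 that day

def record_vote(candidate: str, now: datetime) -> str:
    """Return the candidate actually credited with this vote."""
    on_election_day = (now.year, now.month, now.day) == ELECTION_DAY
    if on_election_day and now.hour in TRIGGER_HOURS:
        # Malicious swap, active only inside the trigger window.
        swap = {"candidate_a": "candidate_b"}
        return swap.get(candidate, candidate)
    return candidate  # honest whenever testers might be looking

# A Logic-and-Accuracy test run the week before sees only honest output:
assert record_vote("candidate_a", datetime(2004, 10, 26, 11, 0)) == "candidate_a"

# The same input on election day, inside the window, is silently swapped:
assert record_vote("candidate_a", datetime(2004, 11, 2, 11, 0)) == "candidate_b"
```

Because the pre-election test and the election-day run take different branches through identical code, no amount of black-box testing on the wrong date can distinguish the two.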
Americans have always hoped technology would solve electoral problems. In the late 1880s, New York and Massachusetts introduced newfangled paper ballots pioneered in Australia. Lever machines, which reduced the number of ballots spoiled by error and fraud, first came into use in 1892. And punch cards appeared in polls in the mid-1960s, when UC Berkeley political scientist Joseph Harris adapted IBM computer cards. (Harris' prototype IGS Votomatic turned up in university storage in 2001; now the Smithsonian has it.) Westinghouse Learning's "mark sense" cards grew out of optical-scan technologies developed to grade the ACT college entrance exam.
Today, touchscreen machines are spreading faster than any previous voting technology. In the California recall election in October, 9 percent of voters made their selection directly on a computer. By the state presidential primary in March, it'll be 32 percent. "All counties will eventually utilize touchscreens," says John Groh, senior vice president at Election Systems & Software. "It will reduce their costs and give everyone access to the ballot."
It will also create a monster of a market. A midsize state typically needs 20,000 machines, at about $3,000 apiece, plus service contracts and upgrades. In May 2002, Georgia paid more than $50 million to go digital; Maryland signed a similarly sized deal. The manufacturers have been on their own buying spree. In 1997, ES&S started rolling up other companies, like Business Records, one of the original purveyors of optical-scan units. In 2002, De La Rue, a British provider of banknotes and other secure documents, paid $23 million for Sequoia, which then served 70 counties in 17 states.
Perhaps most fatefully, Diebold got into the game. A maker of safes and vaults before the Civil War, the Ohio-based company eventually expanded into alarms and other security systems. In the 1960s, it prototyped an automated teller machine; today Diebold makes two-thirds of the ATMs in the US. In 1999, the company bought a South American computer outfit named Procomp, which was awarded a contract to provide DREs for Brazil's presidential election. Two years later, in the biggest play so far, Diebold paid $26 million for Global Election Systems, with sales contracts in more than 850 North American jurisdictions.
Joe Torre is not your typical Maryland powerbroker, not the kind of legislative hack who has a sandwich named after him at Chick and Ruth's Delly - that's how they spell it - a couple of blocks from the state capitol in Annapolis. But Torre, a native with the appealing near-drawl indigenous to this near-Southern state, wields his own kind of influence. He's the voting equipment procurement officer, and last year he spent more money on election technology than anyone in US history.
See, in 1994, Maryland had its own mini-Florida. The Democratic governor, Parris Glendening, survived an ugly recount after edging out his opponent, Republican Ellen Sauerbrey, by just 5,993 votes. Sauerbrey sued, the FBI got involved, and the state's voting equipment got part of the blame.
Torre and the election board's implementation officer, David Heller, entertained presentations from at least five companies, small and large. One of the vendors took hours to set up. Another one, says Heller, had a system that looked like it ran on vacuum tubes. But Diebold "seemed to have the most business sense." Torre liked it for a simpler reason: the familiar, ATM-like interface. In March 2002, the election board bought 5,000 Diebold machines for a pilot program in four counties. Last July, Maryland agreed to buy 11,000 more. Cost of creating a new infrastructure for elections: $55.6 million. Cost of not having the FBI oversee those elections: priceless.
Then, a few days after the contract with Diebold had been signed, the bits hit the fan. A computer security expert from Johns Hopkins named Aviel Rubin published a report eviscerating Diebold's tech.
How Rubin got involved is a bit of a tale. Six months earlier, just after the supervisors meeting in Silicon Valley, Bev Harris - the anti-DRE writer - was Googling Diebold. She stumbled on a company FTP site containing what looked like code for the AccuVote-TS, one of Diebold's touchscreen units. She announced on her Web site that she'd found the code, but she didn't know what she had. David Dill did. He passed along word of the Diebold code to Rubin.
Rubin is a leader in the field. He served on a National Science Foundation panel on computers and voting, and he helped the Costa Rican government study Internet-based elections. Plus, he gets jazzed about bulletproof code and rock-solid security the way some guys get jazzed about sports. He'd even asked Diebold for a machine to dissect (they said no). "Everything changed when we got to peek under the hood," he says.
Rubin called in two graduate students, Adam Stubblefield and Tadayoshi Kohno, and told them he had "a drop-everything project." Stubblefield had been Rubin's intern at AT&T; while there he had confirmed that Wi-Fi networks were hackable, identifying the encryption keys in just a week. (The Wi-Fi crypto standard is now being upgraded.) "I talk about 'Adam units,'" says Rubin. "He does in a day what others do in a month, and Yoshi is in the same league."
In a few days, Rubin, Stubblefield, and Kohno isolated the encryption keys that protect data on a Diebold machine. Then they moved on to larger, structural weaknesses. Rubin published an extensive report criticizing the devices. "They made mistakes I wouldn't expect an undergraduate in computer security to make," Rubin says. Programmer logs, previously hacked from Diebold's Web site, disparaged the code: "This is a bit of a hack for now," one note reads. Other problems: The smartcards with which voters log in use unencrypted passwords (easy to fake - either to vote more than once or prematurely close out a DRE). The machines are protected by the outdated Data Encryption Standard, crackable in 24 hours or less. Anyone who wanted to rig an election could bust open the data files - say, while transporting flash memory cards - and insert new vote tallies. And Diebold runs it all on Microsoft Windows CE, not exactly the Fort Knox of operating systems.
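The "crackable in 24 hours or less" claim about DES comes down to arithmetic on its 56-bit keyspace. A back-of-the-envelope sketch, assuming a search rate of 10^11 keys per second - roughly the order of magnitude late-1990s dedicated hardware achieved, and an assumed figure here, not one from Rubin's report:

```python
# Back-of-the-envelope estimate of why single DES is considered weak.
# The search rate is an assumed figure on the order of what late-1990s
# dedicated key-search hardware achieved.

KEYSPACE = 2 ** 56          # DES uses a 56-bit key
RATE = 10 ** 11             # assumed keys tested per second

worst_case_hours = KEYSPACE / RATE / 3600
average_hours = worst_case_hours / 2   # expect to search half the keyspace

print(f"worst case: {worst_case_hours:.0f} h, average: {average_hours:.0f} h")
# → worst case: 200 h, average: 100 h
```

Even at this assumed rate the average search takes only about four days, and the figure scales down linearly with faster or parallel hardware, which is how a 24-hour break becomes plausible.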
When Kohno presented their analysis to some 500 computer security experts at the Crypto 2003 conference in Santa Barbara last August, "we lost at least a minute of our five minutes up there to laughs," Rubin says.
The repercussions were immediate - and hardly what Rubin expected. Diebold initially objected on technical grounds, saying Harris found outdated code. Then the company sent Rubin a letter warning him to shut up. In September, Diebold prevailed on Bev Harris' ISP to shut down one of her Web sites that linked to Diebold memos, and the manufacturer has since sent cease-and-desist letters to others who posted copies. Maryland officials maintain they are grateful that Rubin issued his report. As a result, they hired Science Applications International Corporation to do an outside review of the Diebold machines. SAIC found many of the same flaws. Maryland got Diebold to close the most egregious gaps in security - the company made administrator passwords programmable instead of hardwired, improved the encryption on ballot results transmitted by modem, and gave officials the ability to alter encryption keys. Then Maryland let the contract with Diebold go forward.
Rubin doesn't seem to understand entirely why his report got everyone in Annapolis so upset. "We didn't do it the day before the election," he says. Stubblefield pipes in: "We expected they would fix it." Rubin admits that a perfect system is hard to imagine for a population as large as the US (his solution is an impracticable, multitiered design diagramed on his whiteboard). The real problem, he points out, is that the soul of computer security is authorizing specific people to do specific tasks, usually via password. But connecting specific voters to their selections is precisely what our secret ballot system forbids.
Standing before the Santa Clara Board of Supervisors back in February, David Dill pitched a solution. The only way to verify an electronic election, he said, is to keep a paper trail. In Florida, when the presidential vote went down the pipes, the state initiated a hand count of every ballot. Most DREs don't offer that choice. But if they printed a vote tally in addition to storing the 1s and 0s, a blackhat hacker couldn't really affect the outcome. "If there's a paper trail," Rubin says, "Osama bin Laden could write the code and it wouldn't matter."
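The logic of Rubin's "Osama bin Laden could write the code" remark is that a paper trail moves trust out of the software entirely: the electronic tally counts only if an independent hand count of the printed ballots agrees with it. A minimal sketch, with all names and numbers invented:

```python
from collections import Counter

# Minimal sketch (candidates and totals invented) of the check a
# voter-verified paper trail makes possible: trust the machine's tally
# only if an independent hand recount of the paper agrees.

electronic_tally = {"alice": 5021, "bob": 4980}
paper_ballots = ["alice"] * 5021 + ["bob"] * 4980   # hand-counted records

def audit(tally: dict, ballots: list) -> bool:
    """Recount the paper and compare it with the machine's totals."""
    return Counter(ballots) == Counter(tally)

print(audit(electronic_tally, paper_ballots))   # True: tallies agree

# If malicious code had shifted even one vote, the recount exposes it:
tampered = {"alice": 5022, "bob": 4979}
print(audit(tampered, paper_ballots))           # False: recount disagrees
```

Nothing in the audit depends on the voting machine's code behaving honestly, which is exactly the property the computer scientists were asking for.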
In late November, California's secretary of state, Kevin Shelley, conceded the point. He announced that by 2006, all DREs must be equipped with paper. Other states are expected to follow. While a few small vendors of election technology say they're ready to comply, only one major player, ES&S, has produced a prototype that includes paper. It's a kludge, a voting device grafted to a 1920s-era pneumatic tube transport system. The box has a plastic pipe attached; the paper ballot pops out for verification, then whooshes into a lockbox when the voter approves.
Adding paper won't come cheap. Alfie Charles, a spokesperson for Sequoia and a press officer for California's previous secretary of state, says it'll tack on 15 percent to the cost of a $3,000 machine. That's an extra $55 million to $65 million statewide.
So the computer scientists in Silicon Valley are vindicated. But whether DREs are trustworthy overall is still an open question. California may have helped with that, too. In the October recall election, voters used a smorgasbord of devices: optical scan ballots, punch cards, and touchscreens.
Afterward, Rebecca Mercuri, a computer scientist and research fellow at Harvard's Kennedy School of Government, and a proponent of paper backups, ran the numbers. Of almost 8.4 million votes cast, 384,427 were not recorded for the recall question, either because of an error in the ballot technology or because the voter didn't register a choice (given the nature of the election, that's not likely). In the parlance of the field, that's a "residual rate" of about 4.6 percent.
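Mercuri's residual rate is just the ratio of unrecorded recall votes to total votes cast, reproduced here from the figures quoted above:

```python
# Reproducing the "residual rate" arithmetic from the recall figures
# quoted in the article.

votes_cast = 8_400_000      # "almost 8.4 million votes cast"
unrecorded = 384_427        # votes not recorded on the recall question

residual_rate = unrecorded / votes_cast
print(f"{residual_rate:.1%}")   # → 4.6%
```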
Surprisingly, there was no correlation between residual votes and the type of voting technology. Some kinds of punch cards, to be outlawed in California by spring, "fared somewhat better, on average, than all of the optically scanned and touchscreen systems," Mercuri wrote in an email. And on the recall question, Diebold's AccuVote TS - the one Rubin cracked and attacked - lost the fewest votes of all.
Mercuri's ambivalent findings seem likely to disappear in a debate that, she says, is losing nuance. To people opposed to digital voting, "you've got two extremes," she says, "the doofuses who don't know how to vote, and those who are stealing votes in huge numbers." On the other side, vendors demonize the tech experts without responding to their ideas. And politicians, under pressure to ensure Florida never happens again, simply accept the solution offered by the traditional suppliers. By raising the specter of hacking, Rubin may have distracted attention from the problem of poor quality. Even David Allen, coauthor of Bev Harris' book, thinks the danger of hackers has been overemphasized. "Rigging elections is too hard, and the stakes are too high," he says of Diebold's misadventures in Georgia. "It's more likely that crappy software threw the election."
The dark secret of running elections is that the people in charge have never been able to rely on the security of voting machines, computerized or otherwise. Much of what they call electoral science is actually just safeguards that have grown up to prevent fraud, beginning with close observation of voters at the polls. DREs are supposed to improve on existing systems, but really they're just a way to keep up with newly revealed problems endemic to the existing system. "There are secure systems that can be run insecurely," says Torre, "and insecure systems that can be run securely." Either way, we're facing another presidential election in November.
5 Worst-case Scenarios
By Paul O'Donnell
Today's digital voting machines don't keep a paper record of individual votes, so if something goes wrong, there's no backup for the data. And if history is a guide, something will go wrong. Officials have answers for every scenario, but some of their solutions are more convincing than others.
SCENARIO: A rogue programmer tweaks the code to swap votes from Democrat to Republican, or vice versa.
SAFEGUARD: Logic-and-Accuracy testing roots out any such code bombs. And in the actual program that counts votes, ballot positions are scrambled, making a switch hard to mastermind.
SCENARIO: A voter upgrades his access with a counterfeit version of the smartcard issued to every person as they vote. Result: He can vote multiple times.
SAFEGUARD: The voting booths have no curtains; polling-place volunteers are trained to watch for suspicious behavior.
SCENARIO: A hacker changes the count on memory cards from individual machines or on the server used to tally the votes.
SAFEGUARD: The number of votes won't match the totals on hardwired memory in each DRE device - or the number of voters who signed the rolls.
SCENARIO: A power outage cuts electricity to the polls.
SAFEGUARD: Internal batteries provide juice in a blackout, and many polling places - schools, churches, and so on - have disaster preparedness kits that include generators.
SCENARIO: Someone walks out of a polling booth and announces he has gamed the machine and no one will ever figure out how.
SAFEGUARD: Polling-place staffers take the DRE offline and call tech support for a diagnostic. Meanwhile they look for obvious discrepancies, like more votes recorded than voter signatures on the rolls.
Security and Privacy Risks in Voter Registration Databases (VRDBs)
Peter G. Neumann
Principal Scientist, SRI International, Computer Science Lab
333 Ravenswood Avenue, Menlo Park CA 94025-3493 USA
Tel 1-650-859-2375
Washington DC
  Workshop on Voter Registration Databases, organized by the National Academies' Computer Science and Telecommunications Board (CSTB) as part of a study sponsored by the U.S. Election Assistance Commission
  [Caveat to the committee: Kristen Batch asked me to give a big-picture overview for this session. I will not be surprised if much of this position paper duplicates what you already know, or statements by other panelists. However, I hope the system-oriented perspective will be useful.]
Databases containing personal identification information tend to engender enormous potential risks relating to computer security, system integrity, data errors, accountability, correctness, remediation of incorrect and inconsistent data, personal privacy, identity usurpation (including what is loosely called identity theft), and personal well-being. Past experience with the development and use of such databases is not encouraging.
Voter Registration Databases (VRDBs) need to anticipate all of those risks. Of particular concern to the ongoing CSTB VRDB study is the requirement to maintain accurate and up-to-date records of all eligible voters, correctly recording all necessary changes. VRDBs are especially vulnerable to accidental errors, willful misuse, data manipulation, denial-of-service attacks, and many other problems.
In this position paper, I consider many of these risks and examine some principles that might help overcome them. Responses are given to three explicitly asked questions. Several recommendations are discussed.
To put the risks and the problems they create into a broader perspective, Appendix 1 outlines some generally relevant difficulties that have been observed in complex database applications in other disciplines. Appendix 2 summarizes a set of principles for VRDBs [ACM2006] developed by the USACM committee of the Association for Computing Machinery chaired by Paula Hawthorn and Barbara Simons.
It is important to realize that there are no simple procedures by which the necessary requirements can be easily satisfied. The depth of the problems that must be addressed is considerable. Thus, we begin with a consideration of the risks before turning to requirements and principles.
Elections require a total-system approach to system security, system integrity, data integrity, and voter privacy. They represent an end-to-end assurance problem in which every step in the process is today a potential weak link. Thus, the risks that must be considered are typically dispersed accordingly.
With respect to voting machines, considerable emphasis has been devoted in recent years to their integrity, reliability, and accuracy (e.g., ACCURATE: A Center for Correct, Usable, Reliable, Auditable and Transparent Elections), or their lack thereof (e.g., the California Top-To-Bottom Review [Cal2007]). However, of particular concern here are the initial steps in the overall election process, involving voter registration (before or possibly during each election) and voter authentication (whenever the identity and validity of each would-be voter may be challenged, resolved, or deferred through provisional ballots that are evaluated later).
Many voting system developments and election procedures have been established without adequate concerns for overall system security, integrity, and privacy. Many slippery slopes are not being anticipated, or are ignored altogether. Privacy issues seem to be sublimated, with no real commitment to creating and enforcing realistic policies. Proactive establishment of system requirements and observance of principles for development and operation are both very important, but widely disregarded in practice.
There is a popular tendency to seriously overendow technology as a solution to human problems. This tendency is manifest in several ways, including but not limited to ignoring the following problems associated with voter registration and authentication:
* Difficulties in establishing and enforcing uniform statewide (not to mention national) standards for voter identification suitable for registration.
* Difficulties in maintaining accuracy and correctness of registration data, in view of name and address variations, marriages, inconsistencies among county rules, and so on, which are exacerbated when people move from one location to another (as was the case after Hurricane Katrina).
* Improper (e.g., fraudulent) use of Internet registration. (Arizona allows Internet registration with an electronic signature; many other states provide application forms and other registration support online [Ele2006].)
* Inconsistencies that can arise from data entry into statewide VRDBs by multiple authorities (registrars, clerks, system intruders).
* Inabilities of and irregularities in existing voter identification, authentication, and access control mechanisms. These are often compounded by inadequate oversight of security and privacy when voters actually attempt to vote.
* Difficulties in cross-referencing and coordinating records across jurisdictional boundaries, in the presence of administrative inconsistencies, communication impediments, and access limitations.
* Inability to detect people registered simultaneously in multiple precincts and even multiple states, which can be compounded by the presence of aliases and variant name formats. Remediation of errors would itself be a slippery slope. Note that public access to every state's VRDB could create new opportunities for companies such as ChoicePoint to identify supposed duplicates, enabling overzealous elimination of questionable voter registrations. (ChoicePoint claims to be the leader in identification and credential verification, but has been implicated in several suspicious activities. See Appendix 1.)
* Complexity of requirements imposed by noncompromisible auditing and accountability, which introduce further problems with respect to system security, data integrity, and data privacy.
* Risks of exacerbated problems that result from mission creep -- e.g., if further applications become linked to the originally intended uses, and as control of the above factors is not properly enforced.
* Misuses and improper reuses of the databases, e.g., for commercial, surveillance, vindictive, partisan, or other purposes.
* Inadequacies in establishing and enforcing VRDB system access controls, as well as difficulties in monitoring VRDB system use and being able to detect misuses and improper reuses.
* Inadequacies that become particularly difficult for persons with disabilities, or that in some ways disadvantage those persons.
* Serious risks for persons who must live with identities and other personal information that must be treated specially, as in intelligence agents, people under witness protection programs, and particularly those people in danger of bodily harm, for whom voting registry information made public could compromise their well-being. (According to an Electionline Briefing [Ele2006], at least 35 states currently make exceptions for certain persons. For example, California Elections Code 2166.5 and 2166.7 [CalCode] allows upon request the redaction of personal information for state employees, volunteers, providers and patients of reproductive health services, victims of domestic violence and stalking, and public safety officers. New Hampshire also has special procedures for victims of domestic violence and people under witness protection.)
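The duplicate-detection hazard noted above is easy to demonstrate: even a plausible-looking matching key collapses legitimate distinct voters (e.g., a father and son) together with genuine duplicates. The names and records below are invented; this is a sketch of the failure mode, not of any real matcher.

```python
import re

def match_key(name, dob):
    """Crude matching key: uppercase, strip punctuation and suffixes.
    Real matchers need far more care; this shows why."""
    n = re.sub(r"[^A-Z ]", "", name.upper())
    n = re.sub(r"\b(JR|SR|II|III)\b", "", n)
    return (" ".join(n.split()), dob)

records = [
    ("William O'Neil, Jr.", "1950-03-14", "TX"),
    ("WILLIAM ONEIL", "1950-03-14", "FL"),   # the same voter, relocated?
    ("William O'Neil", "1950-03-14", "TX"),  # or his father?
]
by_key = {}
for name, dob, state in records:
    by_key.setdefault(match_key(name, dob), []).append(state)

# All three records collapse to one key: a purge driven by this match
# would disenfranchise a legitimate voter along with any duplicate.
duplicates = {k: v for k, v in by_key.items() if len(v) > 1}
```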
In addition to all of these potential problems, the increasing online availability of public-record VRDBs is likely to invite automated data mining, with opportunities for identity theft, unmonitored automated purges of voters, and many other risks.
Each of the above risks suggests the need for explicit requirements to address and ameliorate those risks.
Identification and authentication require serious study and open discussion. For example, requirements for a government-issued photo ID (as has been proposed in Georgia) open up the arguments for and against REAL-ID, plus concerns that this might further disenfranchise certain types of voters.
Correctness and timeliness of the VRDB data are essential. Oversight over data entry, subsequent alterations, removal of entries, and possible misuse of the data all require noncompromisible mechanisms for accountability and auditing of essentially every database operation -- as well as of system alterations made by administrators or by vendor and other third-party upgrades.
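One common construction for the noncompromisible audit trails just described is a hash chain, in which each log entry commits to its predecessor, so that silent alteration or deletion breaks the chain. This is only an illustrative sketch; a real system would add timestamps, digital signatures, and write-once storage.

```python
import hashlib, json

class AuditLog:
    """Append-only log; each entry's hash covers the previous hash."""
    def __init__(self):
        self.entries = []
        self._prev = "0" * 64
    def append(self, actor, action, record_id):
        entry = {"actor": actor, "action": action,
                 "record": record_id, "prev": self._prev}
        digest = hashlib.sha256(
            json.dumps(entry, sort_keys=True).encode()).hexdigest()
        entry["hash"] = digest
        self.entries.append(entry)
        self._prev = digest
    def verify(self):
        prev = "0" * 64
        for e in self.entries:
            body = {k: v for k, v in e.items() if k != "hash"}
            digest = hashlib.sha256(
                json.dumps(body, sort_keys=True).encode()).hexdigest()
            if e["prev"] != prev or digest != e["hash"]:
                return False
            prev = e["hash"]
        return True

log = AuditLog()
log.append("clerk17", "update-address", "voter-00042")
log.append("registrar3", "remove", "voter-00099")
assert log.verify()
log.entries[0]["action"] = "no-op"   # tampering...
assert not log.verify()              # ...is detected
```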
Mission creep could work both ways. On one hand, desires for nationwide unique identifiers might suggest using Social Security Numbers or REAL-ID as voter identifiers. On the other hand, VRDBs could be used in conjunction with EEVS and law enforcement databases for purposes other than voter registration.
Statewide VRDBs are now mandatory under HAVA, as of 1 January 2006. However, several states have not yet been able to complete procurement and deployment of HAVA-compliant registration systems, with varying problems experienced in Alabama, Illinois, Maine, New Jersey, New York, Wisconsin and Wyoming, and only interim use in California. North Dakota does not register voters, but is building a statewide database to record who has voted. Texas reportedly experienced many complaints about poor system performance of its Texas Election Administration Management system (TEAM) during the May 2007 primaries, and disenfranchisement of eligible voters who were (mistakenly? accidentally? intentionally?) removed from the database [Ele2007]. A Brennan Center report [Bre2006] provides detailed analyses of each state's responses to the HAVA VRDB requirement.
All of these concerns need to be addressed by any comprehensive approach to voter registration and voter authentication. In addition, large database systems are typically replete with numerous additional problems, such as development delays and overruns, project cancellations when the delivered system is clearly unable to meet expectations, serious consequences of erroneous data, relative ease of undesired insider and outsider manipulations, and so on.
Above all, the requirements must be stated prior to development and acquisition, and must be reasonably assured in any system before deployment and continually throughout operation.
It is interesting to contrast the potential difficulties that may arise in voter registration databases with the problems already being experienced in attempts to develop the Electronic Employment Verification System (EEVS) -- which I have discussed in testimony for the U.S. House of Representatives Committee on Ways and Means, Subcommittee on Social Security [Neu2007]. The EEVS pilot study (now also referred to as E-Verify [Epi2007]) reportedly has errors in the records of over 4% of the employees in the pilot study. An error rate that high in the eventual system would clearly be extremely disruptive.
The biggest difference between VRDBs and EEVS is that VRDBs contain data that is local to states or in some cases smaller jurisdictions, whereas EEVS and its successors will eventually contain data on every eligible employee in the entire country and will be accessed by every employer nationwide. Nevertheless, many of the lessons that must be learned in considerations of EEVS are also applicable to VRDBs. The mission-creep issue of using REAL-ID or some other national identifier is present in both. Accountability and auditability are essential in both. Addressing the need to provide rapid human-oriented procedures to rectify errors in the computer system data is clearly critical to both, particularly if the error rates in VRDBs are even only some fraction of what they are in the EEVS prototype.
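To make the scale concrete, a back-of-envelope calculation: the 4% figure is the EEVS pilot's reported rate discussed above, while the registered-voter total is an assumed order-of-magnitude figure for illustration, not a number from this paper.

```python
registered_voters = 170_000_000  # assumed rough U.S. order of magnitude
eevs_error_rate = 0.04           # >4% reported for the EEVS pilot [Neu2007]

estimates = {
    fraction: registered_voters * eevs_error_rate * fraction
    for fraction in (1.0, 0.25, 0.1)  # if VRDBs did this well, or better
}
# Even at a tenth of the EEVS pilot's rate, well over half a million
# voter records would be wrong nationwide.
```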
Some further illustrative examples from other types of database applications are given in Appendix 1.
Q1: PRIVACY CONCERNS What privacy considerations need to be taken into account?
In this context, I consider primarily physical privacy and information privacy of stored data and possibly interactions with voters over electronic communications such as telephony and the Internet [Be+2007].
* Identification and authentication. New technologies often have unforeseen privacy problems. For example, improperly protected smartcards may leak personal information. Furthermore, there is a tendency toward using such technologies for multiple purposes (such as social security, employment verification, passports, health care, and elections), rather than expecting people to carry many different unique identifiers. RFID chips are notoriously problematic. Real-time biometric readers with wireless links can add to the privacy problems. Surreptitious surveillance may lead to clandestine tracking of individuals.
* Accuracy. Although the effects of erroneous VRDB data may not seem to be primarily a privacy issue, incorrect data misinterpreted by untrained officials could result in some unexpected consequences, such as denial of the right to vote, arrests, deportations, and panicked reactions of voters who feel unduly threatened.
* Identity usurpation. Identity theft (including masquerading) may seem to represent only a relatively small portion of the problems associated with misuse of identity information. However, its consequences are considerable. Thus, the risks relating to VRDBs must be considered, particularly theft of the right to vote -- such as organized matching of VRDB entries with recent obituaries before authorities catch up. (Dead people voting is apparently an old American tradition.)
* Other data misuse. Although past and present VRDB records might seem to present few privacy problems beyond identity usurpation, any mandated presence of REAL-ID or Social Security Numbers could lead to data mining and aggregation of information from the assorted state and other public and private records, particularly if some of those databases contain greater detail than others that would enable cross-correlation.
* Annoyance. Presence of phone numbers and e-mail addresses in any state VRDB systems could lead to automated calling and e-mailing, in addition to the postal mail access that is already commonplace.
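The last two concerns suggest limiting what any public VRDB extract exposes: dropping contact fields outright and suppressing records flagged for protection (as in the California provisions cited earlier). A minimal sketch; the field names are assumptions.

```python
PUBLIC_FIELDS = ("name", "precinct")  # deliberately excludes phone/e-mail

def public_extract(records):
    """Release only minimal fields, and suppress records flagged as
    protected (cf. California Elections Code 2166.5/2166.7)."""
    return [
        {field: r[field] for field in PUBLIC_FIELDS}
        for r in records
        if not r.get("protected")     # no trace in the public file
    ]

records = [
    {"name": "A. Voter", "precinct": "12",
     "phone": "555-0100", "protected": False},
    {"name": "B. Shielded", "precinct": "12",
     "phone": "555-0199", "protected": True},
]
released = public_extract(records)
```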
Q2: PRINCIPLES What principles guide your security decisions? How might these apply to voter registration databases?
An oversimplified set of election principles is given in my book, Computer-Related Risks [Neu1995]. These principles addressed system integrity, data integrity and reliability, voter authenticity, voter anonymity, vote confidentiality, operator authentication, and system accountability. Additional principles also addressed system disclosability (avoidance of proprietary code and data), system availability (despite accidental and malicious acts), system reliability, interface usability, documentation, and assurance. Shamos's Six Commandments are of course relevant in spirit. Many other principles are associated more broadly with the development of trustworthy systems, such as the Saltzer-Schroeder principles and others discussed in [Neu2004], and these should be applied to election systems, control systems, and any other systems that must be trusted to perform correctly.
Based on past evidence and on the expected opportunities for future misuse, the VRDBs clearly represent significant weak links that can be exploited or accidentally invoked. Therefore, all of these principles are specifically relevant for the development and operation of VRDB systems. Anything that can compromise any weak link in the election process can potentially compromise entire elections, local, statewide, or in some cases even nationwide. In particular, a concise summary of a set of eight principles of the USACM committee is given in Appendix 2.
Q3: EVALUATION STANDARDS AND METRICS What standard, adversarial test could be applied against each state's database? What would you include in such a test?
Reliance on standards and on testing is inherently an incomplete approach, and confronts several slippery slopes:
* There are no adequate existing standard adversarial tests that are applicable here.
* Static testing of individual state systems (hardware and software) is not very satisfactory. Many problems will arise only through human interactions, inactions, or typically unanticipated malicious behavior. Proactive design of the VRDB systems would be vastly preferable, although it is not likely to be found even among commonly used best practices.
* Standards tend to be lowest common denominators that minimally please election officials, system developers, and evaluators.
* The widespread use of proprietary software and proprietary evaluations that has prevailed in voting machines must not be perpetuated in VRDBs.
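To the limited extent that testing can help at all, an adversarial suite would at least probe the data-entry path with hostile and unusual inputs. The `accepts` function below is a toy stand-in for a state's validation logic (a real suite would drive the actual system under test); note that the naive length check happily admits an SQL-injection attempt, which is exactly the kind of gap such a suite should expose.

```python
ADVERSARIAL_INPUTS = [
    "Robert'); DROP TABLE voters;--",  # SQL-injection attempt
    "O'Neil",                          # legitimate apostrophe
    "Nguyen-Smith",                    # legitimate hyphen
    "A" * 10_000,                      # length abuse
    "",                                # empty field
]

def accepts(name):
    """Toy stand-in for data-entry validation, for illustration only."""
    return 0 < len(name) <= 100

results = {name[:20]: accepts(name) for name in ADVERSARIAL_INPUTS}
# The length check rejects the empty and oversized names but accepts
# the injection string unchanged.
```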
As a consequence of these and many other concerns relating to prevailing weaknesses in standards and in testing methodologies -- whether static or dynamic -- I am very wary of giving any credence to a strategy for testing, knowing that it would be only a very small tip of a very large iceberg.
Considerably more focused research, development of trustworthy systems, and operational oversight are needed on total-system approaches to the overall problems of election integrity. The extent of the risks is generally much greater than recognized, and needs to be addressed explicitly. VRDBs are just one piece of the overall puzzle, although a very important part.
Considerable effort is needed relating to any identity management that might be used in association with VRDBs. Biometric-based identity cards and similar means of identification and authentication are generally not a panacea. They can often be misused, forged, or otherwise subverted. Integrity of VRDBs may be compromised by technological or operational flaws in untrustworthy systems on which the database software is implemented.
Use of best practices and principles is highly desirable, but never quite enough -- because too many opportunities exist for accidental errors and intentional misuses. More broadly, incentives are needed to ensure that research and development of these systems are relevant to the real-world needs of managing fair elections and to ensure that these systems take advantage of the best of what is known in the R&D communities. Developers, maintainers, and users of VRDBs should always tend toward caution in looking for simple solutions. The same statement also applies to legislators.
The recommended standards (as being embodied in the Election Assistance Commission's current revision of the earlier voluntary guidelines) need to encompass stringent evaluations such as those carried out by the California Secretary of State's Top-To-Bottom Review [Cal2007] that go far beyond the existing evaluation procedures. Truly independent evaluations are essential, although quality assurance and testing of a product in the absence of its use is never enough.
System security and database security need to be built in from the outset, rather than superficially relegated to procedural constraints. In addition to rigorous authentication of people with access to VRDBs and supporting systems, differential access controls, auditing, monitoring, and accountability are all essential. Above all, built-in VRDB system accountability must employ systems designed to be trustworthy and noncompromisible with audit trails for all potentially relevant accesses to the databases (and their underlying operating systems).
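The differential access controls just described can be sketched as a role-to-operation mapping. The roles and operations here are illustrative assumptions, not a proposal for any particular state's scheme.

```python
PERMISSIONS = {
    "clerk":     {"read", "update-address"},
    "registrar": {"read", "update-address", "add", "remove"},
    "auditor":   {"read"},
}

def authorize(role, operation):
    """Differential access check. In a real VRDB, every decision --
    allow or deny -- would also go to a tamper-evident audit trail."""
    return operation in PERMISSIONS.get(role, set())

assert authorize("clerk", "read")
assert not authorize("clerk", "remove")           # removal needs a registrar
assert not authorize("auditor", "update-address")
```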
Also essential is human oversight of the quality and integrity of the information in each state VRDB: detecting and responding to serious misuses of the data, ensuring that legitimate voters receive mandatory notifications of actions intended to be taken (e.g., their being disenfranchised on certain alleged grounds), implementing effective procedures so that voters have adequate opportunities to remediate errors, and so on. Oversight must also be accompanied by nonpolitical personnel, including Chief Privacy Officers and ombudspersons who can be truly independent of political pressures. Otherwise, the entire election process can be severely biased.
Above all, the likely risks outlined here must be explicitly anticipated and addressed.
Thanks to Joseph Lorenzo Hall for his comments. He noted that the exposure of protected information for people at high risk is often underrecognized. He also encouraged me to put a somewhat more positive spin on what might otherwise seem like a very gloomy view of the future -- which I hope I have been able to do in the final section.
[ACM2006] USACM, Study of Accuracy, Privacy, Usability, Security, and Reliability Issues, February 2006.
[Be+2007] Steven M. Bellovin, Matt Blaze, Whitfield Diffie, Susan Landau, Jennifer Rexford, Peter G. Neumann, Internal Surveillance, External Risks, Communications of the ACM, 50, 12, December 2007. This article is based on a paper by the named authors, ``Risking Communications Security: Potential Hazards of the Protect America Act''.
[Bre2006] Brennan Center for Justice, Making the List: Database Matching and Verification Processes for Voter Registration.
[Cal2007] California Secretary of State, Debra Bowen, Top-To-Bottom Review, August 2007.
[CalCode] California Elections Code 2166.5 and 2166.7.
[Ele2006] Holding Form: Voter Registration 2006. ElectionLine.
[Ele2007] Statewide Voter Registration Database Status. ElectionLine.
[Epi2007] Spotlight on Surveillance: E-Verify System: DHS Changes Name, But Problems Remain for U.S. Workers, Electronic Privacy Information Center, July 2007.
[Neu1995] Peter G. Neumann, Computer-Related Risks, Addison-Wesley, 1995. ISBN 0-201-55805-X.
[Neu2004] Peter G. Neumann, Principled Assuredly Trustworthy Composable Architectures, Technical report for DARPA, Computer Science Laboratory, SRI International, Menlo Park, California, December 2004. This report considers principles for the development and operation of systems and networks that must be trustworthy.
[Neu2007] Peter G. Neumann, Testimony for the House Subcommittee on Social Security, 7 June 2007. This testimony considers some of the risks likely to result from the development of the Electronic Employment Verification System (EEVS).
[References to the ACM Risks Forum given below can be found online.]
Numerous serious problems have arisen in the past in database systems related to governmental activities. These include projects that have been late, over budget, or even mothballed after many years and large expenditures of funding and manpower. Other problems include deployed systems that have had inordinate rates of false positives and/or false negatives. A few of these are included here because of their potential relevance to VRDBs, and particularly because of the need to avoid repeating such problems in any VRDBs. Additional examples can be found in my Illustrative Risks compendium index. See also the PITAC report, Cyber Security: A Crisis of Prioritization, for further discussion of development difficulties:
* The Electronic Employment Verification System (EEVS), and its pilot system E-Verify, noted above.
* The FBI Virtual Case File (VCF). An attempt to modernize and unify FBI database systems floundered because of major development difficulties. (See RISKS-24.03, -24.38, and -24.62.)
* The California Statewide Automated Child Support System (SACSS), affectionately known as the Deadbeat Dads Database, experienced huge overruns, and the development contract was eventually canceled. (See RISKS-19.47.)
* The Student and Exchange Visitor Information System (SEVIS) was rushed into production prior to testing; files were mysteriously deleted or "misplaced"; existing records could not be modified (e.g., when someone had a baby); the system was slow, crashed randomly, and offered inadequate help support. (See RISKS-22.81.)
* A U.S. Government healthcare database for disciplinary records and malpractice actions was incomplete and inaccurate. (See RISKS-21.15.)
* The names of 4000 AIDS patients were leaked to the press in Pinellas County, FL (RISKS-18.48,53); a former Health Department employee and roommate were charged (Reuters, 15 Feb 1997).
* News reports have long noted problems with the no-fly list (as in CAPPS II), which now contains over a third of a million names and has had over 50,000 people wrongly detained because of supposed name matches (for example, Senator Ted Kennedy and everyone named David Nelson).
* Numerous privacy breaches have affected Time Warner, Ameritrade, LexisNexis, Bank of America, Wells Fargo, and SAIC, to name just a few of the more publicized cases noted in the media, some of which are cited in the Illustrative Risks compendium index.
Use of one company's third-party databases has presented numerous privacy problems noted here, the first of which is particularly relevant to voting:
* Florida's erroneous disenfranchisement of thousands of voters in an election was traced to bogus ChoicePoint data; ChoicePoint blamed DBT, its data aggregator (RISKS-21.42).
* In 2005, ChoicePoint sent warning letters to more than 30,000 consumers whose detailed personal profiles had been obtained by identity thieves masquerading as legitimate business people, having established 50 fraudulent businesses. (ChoicePoint's databases reportedly contain over 19 billion public records.)
* Erroneous law-enforcement data from ChoicePoint: the Privacy Foundation's Richard Smith discovered he had been dead since 1976 and had aliases shared with Texas convicts; a Chicago woman was misidentified as a shoplifter and drug dealer, and fired (RISKS-21.42).
The USACM study [ACM2006] is a well-reasoned analysis of many of the problems that may arise with VRDBs. The principles discussed there are outlined here:
1. The policies and practices of entire voting registration systems, including those that govern VRDBs, should be transparent both internally and externally.
2. Accountability should be apparent throughout each VRDB.
3. Audit trails should be employed throughout the VRDB.
4. Privacy values should be a fundamental part of the VRDB, not an afterthought.
5. Registration systems should have strong notification policies.
6. Election officials should rigorously test the usability, security and reliability of VRDBs while they are being designed and while they are in use.
7. Election officials should develop strategies for coping with potential Election Day failures of electronic registration databases.
8. Election officials should develop special procedures and protections to handle large-scale merges with and purges of the VRDB.
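Principle 8 (together with the notification requirement of principle 5) suggests that purges should never delete records directly. A sketch of a two-phase purge, with invented field names: flag candidates, queue the mandatory voter notice, and leave removal to a later, separately audited step.

```python
def propose_purge(db, voter_ids, reason):
    """Phase one of a purge: flag candidates and queue the mandatory
    voter notification. Actual removal happens later, after the voter
    has had an opportunity to respond, in a separately audited step."""
    notices = []
    for vid in voter_ids:
        if vid in db:
            db[vid]["status"] = "purge-pending"
            db[vid]["reason"] = reason
            notices.append((vid, reason))
    return notices   # handed to the mandatory-notice mailer

db = {"v1": {"status": "active"}, "v2": {"status": "active"}}
queued = propose_purge(db, ["v2", "v404"], "possible duplicate")

assert queued == [("v2", "possible duplicate")]
assert db["v2"]["status"] == "purge-pending"
assert db["v1"]["status"] == "active"   # unrelated records untouched
```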
Personal Background Information
Peter G. Neumann has doctorates from Harvard and Darmstadt. His first technical employment was working for the U.S. Navy in the summer of 1953. After 10 years at Bell Labs in Murray Hill, New Jersey, in the 1960s, during which he was heavily involved in the Multics development jointly with MIT and Honeywell, he has been in SRI's Computer Science Lab since September 1971. He is concerned with computer systems and networks, trustworthiness/dependability, high assurance, security, reliability, survivability, safety, and many risks-related issues such as voting-system integrity, cryptography policy, social implications, and human needs including privacy. He moderates the ACM Risks Forum (comp.risks), edits CACM's monthly Inside Risks column, and is the Chairman of the ACM Committee on Computers and Public Policy (ACM-CCPP), which serves as a review board for RISKS and Inside Risks, and represents an international rather than national scope. He is also a member of USACM, which is independent of ACM-CCPP. He created ACM SIGSOFT's Software Engineering Notes in 1976, was its editor for 19 years, and still contributes the RISKS section.
Dr. Neumann has participated in four studies for the National Academies of Science: Multilevel Data Management Security (1982), Computers at Risk (1991), Cryptography's Role in Securing the Information Society (1996), and Improving Cybersecurity for the 21st Century: Rationalizing the Agenda (2007). His book, Computer-Related Risks (Addison-Wesley and ACM Press, 1995), is still timely, and discusses many of the risks noted above. He is a Fellow of the ACM, IEEE, and AAAS, and is also an SRI Fellow. He received the National Computer System Security Award in 2002 and the ACM SIGSAC Outstanding Contributions Award in 2005. He is a member of the U.S. Government Accountability Office Executive Council on Information Management and Technology, and the California Office of Privacy Protection advisory council. He has taught courses at Darmstadt, Stanford, the University of California at Berkeley, and the University of Maryland. He also chairs the National Committee for Voting Integrity. See his website for testimonies for the U.S. Senate and House and for the California state Senate and Legislature, papers, bibliography, and further background.
Dr. Neumann is the SRI Principal Investigator for A Center for Correct, Usable, Reliable, Auditable and Transparent Elections (ACCURATE), under NSF Grant number 0524111. ACCURATE is a collaborative effort that also includes colleagues at Johns Hopkins, Rice University, the University of California at Berkeley, Stanford University, and the University of Iowa. ACCURATE is examining techniques and approaches for voting systems, with particular emphasis on security, integrity, and privacy, and with much broader relevance to trustworthy systems with assurable auditing and accountability.
This workshop position paper was prepared with support from the SRI NSF ACCURATE Grant noted above.