Transcript #2

Video Interview #2 with Dan Wallach
Tuesday, November 18th, 2003, 9am

Participants: Dan Wallach (DW), Christopher Kelty (CK), Hannah Landecker (HL)

[ ] = overlapping speech
{ } = unsure of spelling, unable to understand speech, needs verification
DW: Every time I say you know, a thousand volts will go flinging up my spine…

CK: All right, well I don't have my "uh-huh" zapper on but…

DW: I was reading that transcript and I was horrified by what I saw.

CK: Well Valerie got really into it because… (I shouldn't say this on tape, probably, she'll feel bad about it, but) she really wanted to do it like…. In anthropology there's a sub-discipline called ethnomethodology, which is all about paying very close attention to exactly those kinds of details. Like, the phatic responses, the fact that I say "uh-huh, uh-huh", gets you…. talking.

DW: [You know,] you know. Uh-huh, you know.

CK: So it's actually worthwhile to write all those things down, but…

DW: I know, but seeing that that's how I talk…

CK: [so that's how you]

DW: It's not how I want to…

CK: Wait 'til we show you what you look like on video (laughter).

DW: Oh no… at least I hope I helped correct all of your spelling errors.

CK: Yes you did, that was good. Although I think we're going to leave Bite-Code B-I-T-E.

DW: Bite-code (laughter). Bite my code.

CK: Alright, so let's see…
DW: Are we rolling?

CK: We're rolling, we're on. There are a buncha different areas I thought I'd focus on today, one of which is gonna be voting, I think, so we'll have you talk about the voting stuff, um, but I thought I would seize on immediately the gift-economy stuff, since you brought it up last time…

DW: Mmm hmm hmm.

CK: And since it was something everybody knew something about, and so… um… you said last time, we were talking about you getting a giant computer from Sun for having found bugs.

DW: [Mm-huh.]

CK: And you said it was like a gift economy, because I asked whether this was quid pro quo, whether this was a signal from them that said you can have this computer as long as you keep doing things the way you're doing it, and you said it's more like a gift economy, right…

DW: Course, I don't actually know formally what a gift economy… I mean, probably what I know about a gift economy is what I read from Eric Raymond's "The Cathedral and the Bazaar"…

CK: Probably… that's the only way it circulates, it turns out.

DW: Which may or may not have any correlation to what anthropologists consider to be a gift economy…

CK: It's… sort of… it's close, he's not entirely wrong actually, I've thought a lot about this actually. One of the things about the gift economy classically is that it's a constant flow, it's not just "I give you a gift, you give me a gift," but what it does is it creates an obligation between people, it creates a relationship. And then, and it can actually include more than just two parties, but an entire… it was a way of theorizing society originally, that it included everybody, and so I give you a gift, you give Hannah a gift, Hannah gives someone else a gift, and it produces a social bond between all of us, right. But the question then that I think wanted to come u… that came up was: we wanted to know what the difference is, in some ways, between, say, working for a company that pays you to find bugs…

DW: Mm-huh.

CK: and maintaining the kind of impartiality that the University gives you to find bugs for whoever.

DW: Mm-huh.

CK: Now you still clearly have a relationship with these corporate entities, but they're not paying you to find bugs….

DW: Right.

CK: So what do you gain from maintaining this impartiality? Why not go work for a company and find bugs, where you could presumably be paid better?
DW: Well the university pays me quite well, better than many of these companies might. And it's not all that exciting to continue looking for bugs over and over again. If that's your job, then that's what they expect you to do. To keep finding bugs, and that's a real drag. The hope, which may or may not have any correlation to the reality, the hope is that you find a bunch of bugs and convince them that there's something broken with their process, and therefore they need to build a better system such that you won't be able to find bugs anymore… that's the hope.

CK: And do y… do you see it as something where you actually get to work on that part? Is that the exciting part? Re-building the system, or coming up with a better way of doing it?

DW: [Well,] So there are two exciting parts. The first exciting part is realizing that there's something broken, not just with the software but with the process that created the software. That the people who made the software didn't understand the ramifications of what they were doing… and here's an opportunity to convince them that they need to take security more seriously, and what have you, and maybe you need to convince them by demonstration, because oth… other things don't work as well. And once you've convinced them, then in theory they've learned and they'll go off and do the right thing, and now you can move on to either another target, or… I mean, I have a whole talk about this… My research pipeline is: Step 1, find a bug somewhere. Step 2, write a paper about it. Step 3, build a better mousetrap, which is what every other academic in the world does. And then Step 4, look for a new victim. (laughter) So that's my research pipe… and… finding bugs is a nice way of motivating the rest of the work.

CK: Mm-huh.

DW: If anything it helps convince the rest of computer science academia that, when I'm writing a paper that says "Here's how to do better Java security," I've already convinced them that Java security is a problem. Whereas if you go straight to Step 3, they're like: "Hey, why are you wasting our time on this? Why is this a problem?" I didn't realize this when I was doing it, but it all makes perfect sense now.

CK: Right.

DW: It's analogous actually to phone companies selling services to help people call you up at home at 9pm when you'd rather be eating dinner, and then selling you services to block the people from calling you. (Laughter) I mean, you create a market for your own products…

CK: Uh-huh, uh-huh.

DW: So, it's sort of an academic variant on the same theme.
CK: Sure. Is there, I'm thinking of, I mean we'll get to the voting stuff in a minute, but the denunciations from Diebold have been pretty vitriolic, and from other places as well…

DW: And vacuous.

CK: (Laughter) Let's just stick with vitriolic for right now, 'cause the question I want to ask is: do you think that by focusing on commercial products and products that are out in the world, rather than on academic projects, other, you know, your peers and other things, that you risk compromising the impartiality that you have here, in the university?

DW: Do I risk compromising my impartiality by looking at commercial vs. non-commercial products?

CK: Yeah.

DW: I don't think so, 'cause I also look at non-commercial things. I've been working with Peter Druschel on security for peer-to-peer systems, and we've been looking not at the commercial peer-to-peer things like Kazaa or Gnutella; we've been looking at our own homebrew stuff—Pastry and various other things built here in house. So that's an example of where I could have gone out and beaten up on the real-world systems, but instead I was beating up on our local ones, 'cause the real-world ones are too easy to beat up on.

CK: Mm-huh.

DW: Not enough challenge there… they were never designed to be robust, so instead we're beating up on our own systems. So…

CK: So do you… so you think you can then say that you pick commercial products based on whether or not they have some claim to being robust or secure?

DW: I s'pose. There's no fun beating up on an easy target.

CK: Uh huh, certainly.

DW: And it's better to… hm. What's the way to say this? (Computer makes a bleeping noise, as if in response.) That isn't the way to say it. (Laughter and cackling) Um. If so. If it's a matter of expectations, if the people who are using the system have an expectation of security, and that expectation has no grounding in reality, then that makes it an interesting system to analyze. The emperor has no clothes. (Computer makes another noise.) I am gonna shut this thing up now. OK. (Laughter) So it has nothing to do with what's commercial and noncommercial. It has to do with: where do a substantial number of people have expectations that aren't met by reality?
CK: It makes sense. So, a sort of partially related question to this: you mentioned last time that part of the ethical issue in doing this kind of work to the commercial companies is learning how to maintain a good relationship with them. Right? Like learning how to release the results in a way that isn't damaging to them but still, you know, supposed to [shows that they're]…

DW: You don't want to… The companies are actually secondary to their customers. I mean, if I damage a company, you know, I'm sorry. But if the customers who are dependent on their products are damaged by something that I did, then that's a problem. So do I really care about Diebold's financial health? No. Do I care about the voters who are using Diebold equipment? Hell yes. I care about the democracy for which Diebold has sold products. I could really care less about Diebold's bottom line. And particularly since they have not exactly played ball with us, I feel no particular need to play ball with them.

CK: Right. So you don't always need a good relationship then. You just need to figure out…. By good relationship you mean you don't wanna hurt the people that they are trying to serve as customers.

DW: Yeah. I, if you will, I serve the masses, not the…. choose your favorite, like, Marxist term, "bourgeois elite" [laughter].

CK: Running Dog Lackeys of the Capitalist Elite? (Laughter) Alright.

DW: I can't use those words without snickering. I am sorry.
CK: OK. Well we won't put them in your mouth. When you, when we talked about it last time, you had used, as an example, the idea of blackmailing a company with a list of bugs. You suggested there was a system that allowed one to exploit something that was, I suppose, in some way the good will of Sun, you know, "find us a bug and we'll pay you." So, um, is there a clear line for you between the place where, um, uh, finding bugs turns into a kind of exploitation of a process? Or is it something that you kinda feel, and have to know intuitively, case by case?

DW: Ohh, it is hard to make a real generalization. I think when you are doing something whose sole purpose seems to be to line your own pockets… then, of course, that's that's capitalism. Right? But in this case, it seems that… just blackmail, I shouldn't have to, it's it's clearly unethical. You know, blackmail in the traditional world. You know… Give us money… "give us a lot of money or your daughter dies," you know, that sort of thing. It is not just unethical, it's illegal. Um. I guess, I've always just come at this from… yeah. I guess I worry about who gets hurt. And well, I mentioned earlier that I obviously have no love lost for a company like Diebold. I am not trying to hurt them. I am not deliberately trying to injure them. I am trying to protect all of their customers. And if that requires fatally wounding Diebold to protect their customers, that's fine. So in the context of Java, protecting all the millions of people surfing the web is what it's all about. And if it's more effective to protect them by wounding the vendor, then that's fine. If it is more effective to protect them by helping the vendor, then that is preferable. Because the vendor has a lot more leverage than I do. The vendor can ship a new product. You know, it would be great if Diebold, if I could somehow talk them into having this epiphany that they've been wrong all along, you know, but we guessed, quite possibly correctly, that Diebold wouldn't be amenable to having a friendly discussion with us about our results. We assumed, without any information but almost certainly correctly, that if we'd approached Diebold in advance they would have tried to shut us up with a cease and desist before we ever got out the door. And had they done that, that would have cost more, effectively more injury to their customers, who we actually care about. So therefore, that was how we came to the decision that we weren't gonna talk to Diebold; we instead just came right out the door with our results.
CK: Well maybe since we're on this topic, you can back up and sort of tell the story of deciding to work on e-voting, you know, with reference to what we've already talked about: how you decided to pick this area and this company in particular…

DW: Okay, let's rewind to just after the Fa… the November 2000 elections in Florida, which is the widely held example that "oh my God, election technology does not always work."

CK: Hanging chad!

DW: Yeah. Suddenly the whole world had this concept of hanging chad, pregnant chad, what have you… And likewise, the world became cognizant of issues of policies and procedures. How do you define how to count a… an ambiguous ballot? And it turned out that Florida had really awful procedures for that sort of thing. Likewise, the whole butterfly ballot issue; likewise, issues of voter registration… pretty much everything that could have gone wrong went wrong in Florida. Everything that they could have messed up, they did. It's quite astonishing. I am glad I don't live there. Or if I did, God knows what I'd be involved in right now. Anyway, so the Florida story is the backdrop. The immediate aftermath of that was that Congress passed this bill, I mean, they {fast tracked something that had been in the works} for a long time, called the Help America Vote Act (HAVA). And HAVA among other things allocated billions of federal dollars to help states upgrade, modernize, replace antiquated voting hardware. All the mechanical voting systems, all the punched card systems, everything…. And "everything" is the key part of the problem here. For the vendors who have been… Diebold, ES&S, Sequoia, and smaller firms, like the one that sold the system to Houston, Hart InterCivic, etc., these were small beans, or they were companies that had been selling prior technologies and had these software-based systems on the back burner, but very little adoption. All that got flipped around 'cause suddenly the states had money, federal money, a one-time shot to buy new gear, and… there was this perception that paper is bad, therefore lack of paper is good. And, oh by the way, whenever you talk to an election official, the people who manage elections for a living… they hate paper. It weighs a lot, you have to store it, you know, if it gets humid it sticks, and oh God, it can be ambiguous. So they would just love to do away with paper, and then you have these vendors who of course will sell whatever their customers, which is to say election officials, want to buy. Regardless of what the real customers, the electorate, would prefer to use. The electorate doesn't matter at all, of course; it's all about the election officials. I'm letting a little bit of cynicism seep into the discussion here. Um, and the checks and balances are supposed to come from these independent testing authorities. But it comes… are you familiar with the concept of regulatory capture?

CK: It sounds fairly obvious, but explain it to me.
DW: So the concept is… We see this a lot with the modern Bush administration, where "oh, all you polluting people, why don't you figure out your own standards and regulate yourself." Or if there is a federal organization that's responsible for setting the standards, and who are the people setting the standards but the people who are being regulated. You know, if they're setting their own standards, if they're defining what clean means in their own terms, whether that is, you know, smokestack emissions or auto emissions, or all the other things the EPA is messing up right about now. That's an example of regulatory capture. You know, where the people who are being regulated have effectively taken over the people regulating them. So the voting industry has that problem in spades. What standards there are, are weak; the people who certify them don't know what they are doing; and Federal Election Commission standards that are meant to be minimal standards, like, God, anything ought to be at least this good, are considered to be maximal standards instead, you know, anybody who has passed this bar is worth buying. Everything is completely messed up. So the checks and balances that are supposed to keep this {simplification} feedback loop between the vendors and their customers in check, the testing authorities that are supposed to balance that out, aren't. So that's the state of affairs that we're in, plus this huge injection of money. Election officials don't understand computer security; they don't understand computers. They don't know what a computer is. They still think that computers are like The Jetsons or something. Or maybe they think it's like what they see on, you know, Law and Order….

CK: CSI (laughing)

DW: Yeah. They think that… if you watch CSI, hell, just last night someone was saying: "oh, that video is clearly proof that this woman wasn't at the crime scene. We checked the time stamps: she was definitely there." Like, of course, that could be faked. But, you know, they just… if they get all the rest of their science as wrong as they get their computer science, oh, the horror!

CK: [Right.] (laughing)
DW: But I digress. So that's the situation we're in. So fast forward to June-ish of 2001. Harris County had just decided that they were purchasing Hart InterCivic's system—the eSlate voting system. That's little "e" capital "S" no space. And there was a hearing being held by then mayor pro-tem Jew Don Boney. Spell that one right. Google, google (laughing)… um… I'm torturing your poor transcriptionist. So, Jew Don Boney was holding a hearing and invited a number of experts to give testimony. In particular, he invited Peter Neumann (N-e-u-m-a-n-n) from SRI and Rebecca Mercuri (M-e-r-c-u-r-i), who is, let's see, at the time she was a professor at Bryn Mawr (go ahead, spell that!).

CK: Alright, enough teasing… (Laughter)

DW: Currently a fellow at Harvard, and what exactly that means, I'm not sure, for she's a jolly good… fellow. So, Jew Don Boney invited two national experts, and apparently Neumann didn't actually want to bother to fly out to Houston. Instead he said, "Well, there's that Wallach guy, he knows a little bit, he's a good security guy… give him a call." They ended up getting all three of us, and that was my first exposure to these voting systems. And right away I clued in to this lack-of-paper problem. In fact I was opening their system and pulling out this flash card and waving it in front of City Council, saying, "Look, I can do this, so could somebody else… this is bad." But of course nothing happened, and we bought it anyway. I think it got a brief internal mention in the Houston Chronicle. I tried to sell a story to the Houston Press and they weren't buying it. They did eventually run a story written by, who is this guy, Connelly? The one who is currently {} the Houston Press column where they rip on local politicians.

CK: Right.

DW: Had an article saying how, you know, Grandma wasn't going to be able to comprehend this thing, written tongue in cheek. And I wrote a letter to the editor, and even though I mercilessly hacked it down, because I knew that they tended to have very short letters, they still hacked my short letter down to the point where it was almost incomprehensible and didn't have any of my attributions. It just said Dan Wallach, Houston, TX. As opposed to Professor Dan Wallach, Rice University. Normally when it's a letter from an expert, they usually include the affiliation. So they sort of denied me my, they denied my affiliation, which pissed me off. And that was about it in terms of my voting involvement. You know, Hart InterCivic won that little battle, and I got to learn who on the City Council had a clue and who on the City Council was clueless. I got to learn that Beverly Kaufman, our County commissioner, is … (silence) … insert string of adjectives that shouldn't be printed. I can't think of any polite thing to say about her, so I'll just stop right there.
CK: Did they at any point {actually let you look at these machines or anything}, or was it just like, you're an expert on security, so we'll put you on it?

DW: The latter: you're an expert on security, here they are sitting in front of you. At the time I had challenged Bill Stotesbury {garbled}, saying I would like to audit these machines, I would like to read their source code. They said, "Only under a non-disclosure agreement." I said, "No way." They said, "No deal." {garbled} I mean, they would be happy to let me look at their source code under a non-disclosure, but that means I can't tell anybody else the results, and that {violates the point} of me wanting to look at it, which is that the public should know about what they're voting on. That's the whole point of doing an audit, it seems to me. Although, {it turns out that} these independent testing authorities that I mentioned, their reports are classified. Not military kind of classified, but only election officials get to read them. They are not public. So you, joe-random voter, you know, are told, "Just trust us! It's certified." But you can't even go… not only can't you look at the source code, but you can't even read the certification document. So you have absolutely no clue why it was certified, or you don't even know who you are supposed to trust. You're just told, "Trust us." Needless to say, this sets off all kinds of warning bells in the back of my head. But there's very little I could do about it. So fast forward to early summer 2003. In the meantime, I've been involved in other things. The whole DMCA/SDMI thing, in particular, happened in between. Or actually, the DMCA thing was going on during that particular hearing.

[someone opens the door]

DW: Could you come back later, please? Thank you. Custodians. [Is there a trash can back there somewhere?]

CK: [They're the ones with the master key that we talked about last time.]

DW: Yeah, that aforementioned lack of privacy. Where was I? So I was, the DMCA was wrapping down, wrapping… was finishing up at the time. Wrapping down? Finishing up? The paper was presented in August of 2001, and that was more or less the end of the whole DMCA thing.

CK: But summer of 2003, or 2002?
DW: Summer of 2003. David Dill, professor at Stanford, he does, had been doing formal verification, which is an increasingly important corner of computer science that focuses on just using the brute-force power of computers to try to exhaustively study the design of software, the design of chips, or what have you, to prove that they don't have bugs. It's the sort of thing that 20 years ago would be inconceivable, but now that computers are so big and fast, you can actually consider proving that your divider always gets you the right answer for every possible input. It's almost unbelievable, but they can do it. That's where he had been, that was his contribution to science. You know, that field got a real boost after the Intel divider bug that they had… that was the early 90s?

CK: Later than that…

DW: Yeah, Intel shipped a chip where the divider would get an error once in a while. And unlike a piece of software, fixing a bug in hardware is very expensive. They have to recall… they had a multi-hundred-million dollar write-off as a result of that. So the sorts of things David Dill worked on became very popular after that, because they solved that problem. Anyway. So, David Dill is a professor at Stanford; lives in Santa Clara County, and Santa Clara County was in the process of purchasing new voting machines based on this HAVA requirement. HAVA actually has a, it had a carrot and a stick. The carrot is: "here's a pile of money." The stick is: you must have new gear before the 2004 elections. Carrot – stick. So Santa Clara County was trying to decide what it was going to buy, and Dill, like any other educated computer scientist, said: "What do you mean there's no paper? The software could do the wrong thing." It's intuitively obvious to a computer scientist that this is a bad thing – having all this trust placed in software, because software is malleable. So Dill was somewhat late to the party, but boy, he is an activist. I mean he's a very competent activist. And he's, he was on sabbatical at the time. So he just threw all of his energy behind this voting issue, and went from nothing to being one of the premier experts in the area. Because it's not really that deep of an area to get an understanding of. So Dill contacted Peter Neumann and Rebecca Mercuri, existing experts in the area, and they also put him in touch with me, and we wrote an FAQ all about why paper in your voting system is a good thing, and why we're advocating on behalf of this voter-verifiable audit trail. So, we wrote that FAQ; it was originally something that was meant to be submitted just to the Santa Clara County people who were making this decision. But somehow they managed to suck Dill into the nationwide issue and he set up this, I think it's verifiedvoting.org. And… basically he's turned his sabbatical issue into a crusade and he's very competently using his mantle of Stanford professor. He's {leveraging his prestige} in all the right ways. So. One of the… so a number of other activists got involved in this issue, and a very curious woman named Bev Harris, out of Seattle, claims that while she was Google searching, she stumbled across a Diebold FTP site that had gigabytes and gigabytes of crap there for the downloading. Is that actually true? I don't know. It really doesn't matter, because it was there and she grabbed it all. And in July of 2003, she had written sort of an exposé explaining how the "GEMS" Global Election Management System, the back-end computer with the database that tabulates all the votes… She explained exactly how simple it was to compromise that machine: it just used a simple Microsoft Access database, no passwords; if it was online, anyone on the Internet could connect to it, edit the votes; no audit log, or the audit log that there was was easy to work around, etc. And at the same time, she had announced that, oh by the way, here's a website in New Zealand that has all those… gigabytes of original material that somehow was liberated from Diebold. That was intriguing.

CK: (Laughter)
DW: So Bev Harris is, subscribes to the conspiracy-theorist opinion that clearly these voting machines are being used to compromise elections, and here is further evidence of the evils of the Republicans, what have you. And the problem with that sort of an attitude is, whether or not it's true, it tends to make { people }

CK: Right.

DW: So, one of the things that… So Dill said, this is an interesting opportunity, here's all this code. And we could try to leverage that whole academic prestige thing into making a statement that can't be as easily ignored as things that Bev Harris does herself. So Dill got, you know, sort of started shaking the grapevine, and ended up organizing a nice group of people to do the analysis, which ended up, it was initially a much larger group, including the people at Johns Hopkins, people here, some other folks in California, and we got the EFF involved very early, the Electronic Frontier Foundation, to help make sure that whatever, that, one of the things that we learned as a result of the earlier SDMI work is that it's very, very important to dot your i's, cross your t's, to understand exactly the ramifications of everything that you do, to cover your ass properly, in advance of any lawsuit. Another important lesson we learned at the time is that if you're gonna do research that might drag your ass into court, get your university counsel onboard BEFORE you do anything. Yes, okay, a brief digression to the SDMI story. Yeah, I didn't go to talk to Rice Legal until the cease and desist letter had shown up.

CK: Um hm.
DW: And the day I walked over there, they were having Richard Zansitis's welcome-to-Rice party.

CK: (Laughter)

DW: Welcome to Rice… help. And, you know, the aftermath to that story, which I haven't really told yet, the aftermath was, you know, "What you did was OK, Dan, next time just please come talk to us in advance." So this time, I was sort of slowly ramping up, getting my research group ready, y'know, made sure everybody was comfortable with the legal ramifications of what they were doing, that there were, there were still some legal risks involved, made sure everybody was cool with that. Meanwhile, the Hopkins people were racing ahead. Adam Stubblefield and Tadayoshi Kohno were busily pulling all-nighters analyzing the code, and Avi Rubin calls me on like a Thursday or a Friday, says, y'know, the state of Maryland just announced they're purchasing 56-odd million dollars worth of Diebold gear, we've gotta get this out the door, like, now. I still haven't got Rice Legal to buy off, what are you talkin' about? "It's gonna go out the door in like a couple days, do you want your name on there or what?" AAAh. So, we go chugging down to Rice Legal, I'm like, I need an answer, NOW.

CK: Um hm. They knew about this, they were just, like, trying to figure out what to do.

DW: Well, yeah. Y'know, so, I ended up, let's see, this was like August. August in Houston, nobody's here. My department chair, out of town. Zansitis, out of town. Malcolm Gillis, out of town. Sydney Burris, out of town. So what ended up happening was, I ended up putting together a meeting where I had Tony Elam, Jordan Konisky, me, Corky Cartwright as the acting CS chair, and Joe Davidson from the Rice legal office, conferenced in with Cindy Cohn (C-O-H-N) from the EFF; she's their head counsel, director, whatever, she's the big cheese over at the EFF. And I got her and Joe Davidson on the phone. Joe Davidson was skeptical in advance of this conversation. He's like, you know, violation of a trade secret is a class C felony in Texas, go to jail, you wanna do that? So he was, he was very skeptical. And I got her {sic} and Cindy on the phone together. And the rest of us were sitting around listening to the speaker phone. All of us were in Tony's office except for, uh, Joe, who was in his office in Rice Legal, and Cindy, who's in California. So, Joe and Cindy, after they got past the "hi, nice to meet you," very quickly spiraled into dense legalese. The rest of us could understand there was a conversation going on, and could pick out lots of words that we understood, but it was clearly a high-bandwidth legal-jargon conversation. And at the conclusion of that conversation, Joe said, "Go for it." Well, actually, he didn't say that; he said, "I'm gonna go talk to Malcolm." And Malcolm was off fly-fishing in Colorado or something. I sort of have this vision of Malcolm Gillis as Teddy Roosevelt. You know, hunting in the African bush. He is sort of a modern-day Roosevelt.
CK: I suppose…

DW: He just needs the monocle, and a bigger mustache, right?

CK: Right, bigger mustache…

DW: I'll have to get the Thresher backpage people on it.

CK: Right.

DW: I'm sure they can conjure something.

CK: Exactly.

DW: That's actually a pretty good idea. Um, anyway, so Joe basically briefed Malcolm, and Malcolm's opinion was: this is core to what the University's about; the University will support you. 'Cause, I could do whatever I want, I, it's academic freedom.

CK: Um hm.

DW: The University can't stop me. But what I was asking was, I wanted the University to stand behind me.

CK: Um hm.

DW: I was try, I wanted, if Malcolm were ever called by somebody from the press…

CK: Um hm.

DW: I wanted him to say, "We support Wallach."

CK: Um hm.

DW: We think what he's do… I wanted that in place, in advance.

CK: Right.

DW: And all this was happening concurrently with Avi Rubin and company talking to a reporter from the New York Times.

CK: Hm.

DW: Before I'd even got the sign-off from Malcolm.

CK: Um hm.

DW: And in fact the original New York Times article didn't mention Rice.

CK: Um hm.

DW: Because…

CK: It was unclear…

DW: It was unclear, and by the time we told the reporter, yes, Rice is one of the co-authors, it was too late. It, the paper had already, the article was out of his hands into the editor's hands, and oh, by the way, the editors hacked the hell out of the story. Hopkins had thought they'd negotiated, in exchange for an exclusive, that they'd get page A-1, front cover. And then, you know, something blew up in Iraq or whatever, and, y'know, we got bumped to like page A-8 or something, on the front of the National section, which is buried inside. We weren't happy about that. But dealing with the media is another topic entirely.

CK: Um hm….

DW: Something about which I've learned an awful lot in the last couple of years. Um. Yeah, so, it took me a little while to get Rice on board, but Rice, y'know, all the way from Malcolm on down, agreed that my work was valid research and should go on. And so, you know, the final paper had my name on it. And about, y'know, a quarter to a third of the reporters actually noticed that it's not just the Hopkins paper but it's the Hopkins-Rice paper, and y'know, that's OK. The Hopkins people did most of the heavy work in it anyway. So, okay, ethical issues.
CK: Well, let's, um, maybe you can talk a little bit about, um, what was involved with getting a Diebold machine, doing the research…

DW: We never got a Diebold machine.

CK: You never did…

DW: I've still never touched one in person.

CK: Ah.

DW: Never. What we had was the source code to a Diebold machine. So the, the legal decision hinged on what we could and could not analyze from this chunk of stuff that we'd downloaded from this webserver. Ah, this is complicated. There were all kinds of different files. Some of them were just, you could read them directly; some of them were in encrypted zip files.

CK: Um hm.

DW: Zip encryption is not very strong. Y'know, in fact the passwords were posted on some website. And they were people's first names. So it wasn't like we didn't have the technical ability to read any of these files. But the legal analysis hinged on three different theories of how we could get into trouble. Copyright. Just straight old boring copyright. Well, if we shipped our, if we shipped the Rice voting machine using their code, that would be a clear violation of copyright. But reading their code and quoting from it, that's fair use. DMCA. The only DMCA theory that could apply is if somehow there was an anticircumvention device that we worked around. So encrypted zip files are kind of like an anticircumvention device. So the legal ruling was: don't look at those. Not that you can't, just don't.

CK: Um hm.

DW: And the third theory was that we were violating their trade secrets. Now a trade secret, as it happens, is an artifact of state law, not federal law. So it's different everywhere. And in Texas in specific, violation of trade secret, and the definition of what it means to violate it, is this complicated thing. Which was sufficiently vague for us that it would really come down to a judge.

CK: Um hm.

DW: But violation of trade secret is a class C felony.

CK: Um hm.

DW: Go to jail.

CK: Right. But in Maryland?

DW: Maryland it's not. Maryland it's a civil issue. Texas it's a criminal issue.

CK: Um hm.

DW: So, it's, these things vary by state.

CK: For sure.

DW: Maryland has other intellectual property issues. They have UCITA. Which is…

CK: Oh yeah.

DW: this DMCA extension of doom.
CK: Um, and, because it's a federally, I don't know, what's the federal role here in terms of creating, ah, um, voting machines? Or are they, is it still a state-by-state thing?

DW: Um…

CK: What's the regulation? Even if the regulatory agencies are captured, presumably they're federal regulatory agencies…

DW: Uh, that's really an orthogonal problem. It has nothing to do with our exposure to legal liability.

CK: Right.

DW: So let's just leave that off the table for now.

CK: OK. Alright. It's just a question of whether, I'm wondering, because the, the right to keep trade secrets around this, these machines, is presumably something that the regulatory agencies are handing over to these companies. Rather than saying, you have to

DW: [This is]

CK: [independently]

DW: This is sort of a, a separate problem. Now it's a question of… it's a question of: should our election systems be transparent? Or should, or should vendors be allowed to keep trade secrets about how their systems work? That question, I mean, obviously I prefer transparency. But that question is anything but obvious, what the right answer is. Well, I mean, I think it's obvious that there should, that there should be transparency, but a lot of people seem to think that a company should be allowed to make proprietary products and sell them for voting. A while back I was talking on the phone to a filmmaker from the U.K. who wants to do a movie about, a documentary about, all these voting things I've been involved in, and, y'know, he said: you realize those words together, "the voting industry" (laughter), those two words just don't appear together anywhere else except in the U.S. Which is sort of a good point, I suppose. But back to our legal liability. So the conclusion was that the trade secret issue was only an issue inasmuch as it was actually still a secret. But here it was on a website in New Zealand, where the whole world could download it, and it was very publicly announced that it was on this website in New Zealand, and it had been there for about three weeks at the time that we were forced to make a decision. Cindy Cohn called the New Zealand ISP saying, "Have you gotten any cease and desist letters?" And they said, "No, and we have what on our website?" (laughter) So that told us that Diebold hadn't been taking any actions to protect their trade secrets. And one of the other things that I, as a computer scientist, you think: either it is or it's not a secret. That isn't the way lawyers think. Lawyers think that, y'know, it's kind of a secret, and how much of a secret depends on how much effort you spend to defend the secret. So if you aggressively go after those leaks, and try to contain it, then you can still make trade secret claims, but they're, they're diluted somehow, but they're still valid.

CK: Even if it's out there.

DW: Even if it's out there. But the longer it's out there…

CK: Um hm.

DW: If you don't, if you're not taking aggressive steps to stop it, then it's not, it loses its secrecy. But there was still some vagueness. And that was the, the calculated risk that we had to take. That we were prepared to do the work despite that calculated risk, and were, and y'know, the EFF was signed up to defend us in the event that we landed in court based on that. And you know, that, like I said, you know. So our decision was to go for it. Jordan Konisky remarked while we were in this meeting that in his, whatever, 20, 30 years at Rice, it's the most interesting decision he'd ever been involved in.

CK: Hm.

DW: I'm not sure what that means. And this guy's a biologist. He's involved in all sorts of wacky ethical decisions that those guys have to make, right?

CK: Right.

DW: Is it ethical to stick hot pokers in mice? Cute little rabbits.
CK: So when you made the decision to do this trade secret, to, to go with the research…

DW: Yeah…

CK: And not worry about the trade secret issue…

DW: Yes, our decision was: reading the source code was close enough to being, you know, legally OK…

CK: Uh huh.

DW: That it was, that the risk was acceptable.

CK: Right. But back up a second. I mean, from a security researcher's standpoint, is it sufficient to have only the source code to do this research?

DW: Diebold would say no. We would say yes.

CK: Uh huh. I mean, in terms of other projects.

DW: Well, certainly. In our paper, we had to say in several places: we don't know how this would exactly work, but it would be one of these three ways, and no matter what, it would still be vulnerable because of this.

CK: Um hm.

DW: So we, our paper had a number of caveats in it…

CK: Um hm.

DW: Just by necessity. But despite those caveats, our results were absolutely devastating to Diebold.

CK: Um hm.

DW: So in addition to all the usual sorts of editing that happens to a paper before you ship it out the door, we also had Cindy Cohn reading over the paper, and mutating our language in subtle but significant ways. In fact, one of her, one of her sort of subtle additions had to do with our overall stance. She added a sentence along the lines of, you know, this system is far below the standards of any other high-assurance system. Far below what you'd expect for anything else. I forget her exact words, but I saw that sentence quoted extensively.

CK: Sure.

DW: She's good. Um, so our paper was, it generated quite a splash…

CK: Um hm.

DW: Particularly coming only three or four days after Maryland had announced this 56-odd-million-dollar deal with Diebold. As a direct consequence of our paper, the Maryland state government commissioned an, an independent study with a vendor who we since think might have had a conflict of interest. That's SAIC. And SAIC's report, which took them like a month to write, they reached, the full report was 200-some pages. The Maryland state government then redacted, excuse me, redacted it down to about 40 pages. And even from what we could read, it said: there are some serious problems here.

CK: Um hm.

DW: And then the Maryland officials said, "It's great! We're gonna go ahead and use it."

CK: Um hm.

DW: And we're like, it doesn't say you should go ahead and use it. That's not what that says! Just recently, like last week, there was a hearing in the State of Maryland government that had Avi Rubin, like, my coauthor from Johns Hopkins…

CK: Um hm.

DW: Testifying before the, whatever, Maryland House of Representatives…

CK: Um hm.

DW: Whatever it's called. And so he was testifying, and this woman from their election commission was testifying. And as far as I could tell from the articles I read, he absolutely destroyed her. And all but one of the people who were on that commission, or panel, or whatever, appeared to favor our position.

CK: Um hm.

DW: There was even a quote from somebody saying they expected Avi to be more of a wiseass, but in fact they were surprised at how coherent, and what have you, he was.

CK: Um hm.

DW: Nonetheless, election officials are still storming ahead and purchasing this stuff.
CK: How much of things like this, like either the paper, or a public hearing like that, or, um, or even the journalism about the whole issue, actually mentioned what you mentioned at the outset, which is that vendors are allowed to keep this stuff secret, and that there's no way for you to actually legitimately test this, that nobody can actually legitimately test it? And then, on the flip side, how much of the case against you made by people like Diebold is that you stole the stuff?

DW: Diebold's initial response was that we were looking at an old, outdated thing, not related to the current software.

CK: Um hm.

DW: Which SAIC helpfully disproved.

CK: Um hm.

DW: Diebold went after us every which way they could. All of which was wholly without merit. There… Avi Rubin, as it turns out, is on a number of technical advisory boards, including for a company called VoteHere.

CK: Um hm.

DW: Which was trying to sell software to firms like Diebold to put in their voting hardware. And they had just announced a deal with Sequoia, one of Diebold's competitors. So an independent reporter had figured out that Avi had this potential conflict of interest, and Avi had literally forgotten that he was a member of that advisory board, because they hadn't contacted him, you know, nothing, he had no financial benefit. So he disavowed the whole thing, gave them back their stock, resigned from that technical advisory board. But it's, it's common in responses that we've seen, where they try to say that our work has no merit because we had a conflict of interest.

CK: Um hm.

DW: We hear a lot of that. What we don't hear is how many of these election officials were given campaign donations by members of the election industry, like Diebold.

CK: Sure.

DW: There's a certain amount, not a certain amount, there's a vast amount of hypocrisy…

CK: Right.

DW: In terms of conflict-of-interest accusations.

CK: Sure, sure. But it seems like one of the central things is just the fact that you can't legitimately do this research. No one can legitimately do this research, right?

DW: Yeah.

CK: I mean, that seems to me like a more powerful message than the fact that the machines are compromised.

DW: It's also a harder message to get out.

CK: Uh huh.

DW: Saying that, I mean, this is the argument in favor of transparency.

CK: Sure.

DW: There's no reason why every aspect of our election system shouldn't be public and open to scrutiny.

CK: Um hm.

DW: And by keeping these things hidden, you're not, there's no hidden strength that you're protecting.

CK: Sure.

DW: What you're hiding are weaknesses…

CK: Right.

DW: And it's certainly the case that anybody who might wish to compromise the election can learn all the details.

CK: Um hm.

HL: I'm gonna stop you there to change the tape.

DW: All right. Bathroom break!

{…tape break}
DW: …by eating potato chips! (Crunch crunch) (Laughter)

HL: I wanted to {ask you…} it's so interesting hearing about all the things you have to learn that you would think that a computer scientist would never have to learn, you know. So you're learning the intricacies of wording, um, {} of things that you know are going to be really high profile, and watching how everyone does that; you're learning the intricacies of state law about trade secrets and those kinds of things… What's the experience of, all of a sudden, you know, the kind of work that you do with technical objects all of a sudden being tied in to all of these legal and political…

DW: Honestly, I think it's a hoot.

HL: Mm-hm.

DW: The Rice Media Relations office, this is mentioned in the transcript last time, I mentioned I was going there {to} be briefed. Yeah, they were gonna teach me how to deal with the media. They set up a video camera, and Jade Boyd interviewed me about voting security, and then we sat down and watched the videotape.

HL: Mm-hm.

DW: And Jade Boyd and Margot Dimond were pointing out things that I could have done better, how I could have kept my answers more precise, by saying… just answer the ques… you know, they told me all these things that I never would have thought about. Like, on a TV interview, the question from the interviewer is never aired, ever, 'cause they're always cutting things so tight.

CK: Right.

DW: So, to say what you… there's {garble} makes sense why politicians never actually answer the question that was asked of them, 'cause they know that the question is hardly ever gonna be heard next to the answer. So instead they, they stay on message. And so {garble} know is how to stay on message: have your 3 points, know what they are, be able to state them precisely; the better you know them, the less you will say "uh," "um," "oh," etc.

CK: Right.

DW: So I got briefed in all that, and then I read my own transcript and, horror of horrors, I'm reading myself doing all the things I'm not supposed to be doing!

HL: But we're not journalists.

DW: Right.

CK: So we actually don't care if you do that. We're actually, as anthropologists, we're diametrically opposed to journalists; we prefer the long meandering story to the, to the simple points, 'cause there are always simple points, we can get simple points from the media; what we can't get is the, complex point…

DW: Well, I'm hardly as refined as a modern-day politician is. So I don't think you should be too worried about me overly filtering what I'm saying.

CK: Right.

DW: But I do think that I don't like to see what I say being as ungrammatical as it can be… So the experience has been… every computer geek is fascinated by where technology intersects with public policy and the law. The DMCA really raised a lot of people's consciousness. These election issues are a regular issue on Slashdot. I mean, Slashdot's a good barometer of what matters to your friendly neighborhood geek. And intellectual property law is actually very intriguing to your average geek, even though to the general population it's almost entirely irrelevant. And especially when that technology intersects with things like, god-dammit, I want my MP3s…

CK: Mm hm.

DW: And when you start saying, no, I can't have my MP3s, instead I have to get whatever stupid crap {system} you're pushing because it will somehow make it {} more difficult for teenagers to pirate music. That pisses the technologist off. And turns them into some kind of an activist.
HL: So does that mean that you enjoy these experiences so much that you would continue to actively seek out things which happen at this intersection of technology and public policy, or is the type of work you do just inevitably going to lead you to…

DW: Mmm. That's a curious question. I don't say… I don't think seeking it out is the right answer, but it seeks me out. This election stuff, definitely… the people who were more involved, people like David Dill and all them, literally sought me out and said, {we need help}.

HL: Right.

DW: And I can't say no to something as good as that. Most of my work is boring academic stuff that ten people in the world care about, just like what most academics do. I mean…

CK: And not all of it {garble}, or some very small portion of it, comes from work on practical issues like this…

DW: Yeah. This is, you go look at my vita, and you'll see most of my papers have nothing to do with tearing down the Man (laughter).

CK: Right. But can I ask a sort of inverse question there, to the one Hannah's asking, which is: do you think that computer science as a discipline, or the people that, your peer group, rather, within computer science, um, are reluctant to see work derived from practical concerns as being legitimate? I mean, one of the things you mentioned in the last interview was the issue, the fact that stack inspection, which was a result of the Java stuff, didn't really get any attention for seven years, and at the time people said it was just a hack {garble}. Do you think that's systematic in computer science, or?

DW: Umm, yes. One, in the {systems} community, there's a widespread belief that we, the systems people, blew it because we didn't invent the Web. We should have invented the Web, and it came from those physics people! Damn them! But part of the reason why the systems community didn't {invent the web} is that it was too simple.

CK: Uh huh.

DW: Of course, they're busy fixing that problem (laughter). But in its original conception it was so simple that there was nothing to it, and that's exactly why it took off. Whereas {systems people} would have tried to solve all the Web's problems—{garble}, make it scalable, so you don't have the Slashdot effect; they would have tried to deal with the {broken links issue} so you'd never have broken links; and they would have had this large unwieldy mess that never went anywhere. So… in a certain sense, when you start addressing practical problems, the solutions often aren't technically deep. There's nothing deep about how to build a good voting machine; nonetheless we still need them. So… a lot of the work I do is technically deep and obscure, and addresses threats that may or may not have any correlation to what would happen in the real world.

CK: Mm hm.

DW: But. When something as topical and relevant as this comes along, I certainly think now that it's part of my duty to do something about it. In the same way that I imagine that the folks over in political science work on things that are much more obscure, the economists… you know, if you're an economist and you spend all your time, you know, studying the widget economy, and then somebody comes to you and says: what do you think about the recovery? Is it real or not? You know, it's not really what you're working on and publishing papers about, but you know, you know enough about macroeconomics, and you can read the paper probably better than other people and read between the lines, and you can speak with some authority, so you do. So, in the same way, I feel I'm doing exactly what the rest of those folks do.
CK: Mm hm.
2230
DW: Which is unlike what most computer scientists do. Cause computer sci… I mean, computer scientists as a bunch are anti-social. Very, very few computer scientists do well in front of a camera… I'm learning. My advisor Ed Felten was a master. But he wasn't when we started. But he developed into it, and got very good at it. Ridiculously good. And… in a certain sense I'm following in the footsteps of my advisor. Seeing that I can build a successful career, I just have to maintain a balance between meaty, technical, nerdy stuff and… ethically ambiguous… 'am I doing the right thing, am I doing it in exactly the right way, if I have to tell the world about it, save the world from whatever…'
CK: Well, one of the things you said in the last interview that everyone seized on who read it was this phrase, "doing the right thing ethically is almost always doing the right thing professionally," and I think it's interesting, in terms of computer science, that all the obscure work tends not to fall into that hopper, because maybe occasionally there are the same kinds of ethical issues, but I think what you were referring to was these more high-profile ones…
DW: Yeah, I guess what I probably meant to say with that quote was, when we're talking about boring technical things, "look, I made a web server run 10% faster," there are no ethical dilemmas there; but when you're talking about publishing a paper that criticizes a major voting industry firm, now there are some ethical questions that you have to resolve. And you have to make some decisions about what you're going to do. And in terms of the question of what's right ethically and what's right professionally – what would it mean to do the wrong thing professionally? Well, you know, I could have tried to blackmail Diebold, I could have tried, I could have just ignored university counsel's advice and done other things – there's a lot of different ways I could have approached the problem. But the way we did approach the problem produced results that, y'know, professionally people have applauded us for, and everything that we did was very carefully reasoned about, y'know, with lawyers and the works, to make sure that it was legal and ethical and all that. The earlier drafts of the paper had much more inflammatory language than the final paper that went out. Is that an ethical issue? I'm not sure, but by keeping a very cool professional tone, it helped set us apart from the conspiracy theorists and gave our voice more authority.
CK: Mm-hm.
DW: One of the things my fiancée rightfully criticized me about in some of my early interviews: I would use phrases like, "these voting machines are scary." She was like, "scary is, like, a juvenile word, that's a word that kids use. Say, these voting systems are unacceptable."
(Laughter)
DW: That's a, that's a professorial word.
CK: Right.
DW: And simple changes in the wording – to me, 'scary' and 'unacceptable' are synonyms, but to the general public, those words – scary is a juvenile word and unacceptable is an adult word. You know, unacceptable hits you in a way that scary doesn't. On my door, I've got a copy of This Modern World, by Tom Tomorrow, where he had Sparky his penguin dressing up as a Diebold voting machine for Halloween – because it's scary.
CK: (Laughs). That’s good. Well you mentioned last time too, that the voting thing has made
2271
you pay a lot more attention to your words because – I love the example where you said, “we
2272
found a flaw, that’s great, wait, no that’s bad!” Like, part of it seems to me to be the good and
2273
bad thing, but also this is particularly fraught because of the Republican/Liberal or the
2274
Republican/Democrat or any of those things. Can you say more about your problem with
2275
watching your words?
2276
DW: Well, you will never in public see me say anything in support of or against a political candidate of either stripe now. Of course I have political opinions, but you're not going to hear me say them, because… I absolutely have to be non-partisan; it's a non-partisan issue. And I don't care who you want to vote for; if you believe at all in our democracy, you should believe {everyone gets} exactly one vote, and those get tallied correctly. As it turns out, I've never actually registered for a political party. Now, good thing, I can actually say: "I'm non-partisan. I'm not a Democrat. I am not a Republican."
CK: And in terms of the technical analysis, does it descend to that level too? I haven't read the paper you guys [wrote, but I will now, because] I'm interested.
DW: [It's important… yeah… yeah] I mean, technically it's important for me to be able to say, you know: this is a machine, these are its goals, this is the security threat it has to be robust against. It's not. And I need to be able to say that in a very clinical fashion. Have people used it to compromise elections? I have no clue. Is it technically feasible? You bet. And that's where I have to draw the line.
CK: Sure – but even in saying that it's technically feasible, don't you – this is like imagining how to be a really good bad guy – don't you have to be able to imagine some specifics?
DW: Oh sure, and we go into a lot of those specifics. Yeah, we hypothesize: perhaps you have a precinct that's heavily biased in favor of one candidate over the other. If you could disable all those voting machines, you could disenfranchise that specific set of voters and swing the election to the other candidate. When I say it that way, I'm not talking about the Democrat and the Republican. In a paper that we just shipped this weekend, there was a demo voting system that we built called Hack-A-Vote, to demonstrate how voting systems could be compromised. In fact, it was a class project in my security course. Phase One, the students were responsible for adding Trojan Horses to the code.
CK: Uh-huh.
DW: Their assignment was: you work for the voting company, you've been hired to compromise the election, do it. And then Phase Two: you're an auditor; here are some other students' group projects; find all of the flaws… can you do it? So in that assignment I put students in both roles within the same assignment.
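
To make the Phase One idea concrete: the sort of Trojan Horse a student might hide in a tally routine can be tiny. The following is a purely illustrative Python sketch, not code from the actual Hack-A-Vote assignment:

    def tally(ballots, favored="Monty Python Party"):
        """Count ballots -- with a hidden bias an auditor is meant to find."""
        counts = {}
        for i, choice in enumerate(ballots):
            # Trojan Horse: every 25th ballot is silently reassigned to the
            # favored party -- small enough to survive a casual code review.
            if i % 25 == 24:
                choice = favored
            counts[choice] = counts.get(choice, 0) + 1
        return counts

Phase Two of the assignment is then exactly the hunt for lines like the if-statement above in somebody else's code.
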
CK: Sure…
DW: And we wrote a paper about that assignment, and one of the things we were very careful to do in… we have some screen shots… in the initial version of the paper, one of my students had just gone and figured out who was running in the 2000 presidential election and had Gore, Bush, Nader, etc. I said, "nope, get rid of it." We have another set of ballots where we have the Monty Python Party, the Saturday Night Live Party, and the Independents, so you could have an election between John Cleese and Adam Sandler and Robin Williams. And that's something that lets you avoid any sense of partisanship.
CK: Sure.
DW: Yet it's – I like the sense that it's less serious. Hart InterCivic, when they were demonstrating their own voting machines, used a historical ballot: they had Washington running against Jefferson and Madison. I much prefer the comedian ballot.
CK: Indeed. So let’s see here. One of my questions which kind of came up after thinking about
2318
some of this stuff - I guess I’d never gotten a clear answer on it - was whether or not you have
2319
actually have a sense that software can be perfectly secure?
2320
DW: It can’t.
2321
CK: It can’t?
2322
DW: Ever.
2323
CK: So how do you formalize that as a researcher, how do you, is it always just a contextual
2324
thing, or, is it “secure in this case.”
2325
DW: Well, secure with respect to some threat model.
CK: Uh-huh.
DW: You can build a server that's secure with respect to people trying to communicate with it over the network, but can you build it secure with respect to people attacking it with a sledgehammer? Probably not. That's a threat that's outside of our threat model, so we don't try to do it. Although for some systems that is part of the threat model. Military people worry about that sort of thing.
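
The point generalizes: in security work, "secure" is always shorthand for "secure with respect to an explicit threat model." A minimal Python sketch of that idea (the names and threats here are hypothetical):

    class ThreatModel:
        """An explicit list of attacker capabilities a system claims to resist."""
        def __init__(self, name, in_scope):
            self.name = name
            self.in_scope = set(in_scope)

        def covers(self, threat):
            # Claims of "security" only make sense for threats in scope.
            return threat in self.in_scope

    server = ThreatModel("web server", ["network eavesdropping", "malformed requests"])
    print(server.covers("malformed requests"))  # True: designed against this
    print(server.covers("sledgehammer"))        # False: outside the model
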
CK: So is part of the problem then, in terms of dealing with corporations who have software they're selling to customers, trying to figure out what their threat model is?
DW: Well, part of the problem is when industries – the threat model they've built the system to be robust against isn't the threat model it's going to face. Most of these voting systems are built to be robust against being dropped on the floor, or losing electricity for the election; you can put a big lead-acid battery in there, and you can test it by dropping it on the floor ten times and it'll be ro… you can build a system that's physically robust against those sorts of issues.
CK: Mm-hm.
DW: But if your threat is that somebody tries to reprogram the machine, or if your threat is that somebody brings their own smart card rather than using the official one – they didn't appear to have taken any of those threats seriously, despite the fact that those are legit threats that they should be robust against. What galls us is that the answer is so simple: it's this voter-verifiable audit trail. Once you print it on paper, then your software can't mess with it anymore; it's out of the software's hands. And the voter sees it, and the voter says, yes, that's what I wanted, or no, that's not what I wanted. That means you don't care if the software's correct anymore. What could be better than that?
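
The voter-verifiable audit trail argument sketches naturally as a short protocol: the software's electronic record is advisory, and only a printed ballot the voter has inspected goes in the box. A hypothetical Python sketch of that flow (not any vendor's design):

    class BallotBox:
        """Holds printed ballots; once on paper, the software can't touch them."""
        def __init__(self):
            self.papers = []

    def cast_vote(choice, box, voter_confirms):
        printed = choice                  # the machine prints what it recorded
        if voter_confirms(printed):       # the voter reads the physical ballot
            box.papers.append(printed)    # the paper is the canonical record
            return True
        return False                      # spoiled ballot; the voter tries again

    box = BallotBox()
    cast_vote("John Cleese", box, lambda p: p == "John Cleese")
    print(box.papers)  # ['John Cleese']

Even a buggy or Trojaned electronic tally doesn't matter for a recount, because the recount counts the papers, each of which a voter checked.
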
CK: (Laughs) Really. You've just given them free rein to write the crappiest software that they can.
DW: Yeah, now you've designed the system where they can write crap software!
CK: All you need to do is have a print button!
DW: Given the quality that we know most commercial software is, it's cheaper for them to produce software that… we don't care is correct, because the greater system around it is robust, than trying to somehow build perfect software and somehow guarantee that there's no opportunity for a bad guy to insert Trojan Horses in that otherwise perfect software. That's a very difficult threat; relative to that, producing paper ballots that a voter can read is a much simpler solution to a hard problem. So to me it's obvious, but somehow that doesn't work.
CK: Mm hm.
DW: And why it doesn't work is some complicated amalgam of whiny election officials who don't want to have to deal with paper, poorly educated election officials who don't want to lose face over having made a very bad decision, similarly vendors who don't want to lose face after having told the world that their systems are perfect, and this whole, you know, voting-industrial complex… where the whole process from soup to nuts is simply broken; too many people have too much vested in that process, and you know, we're saying, you know, we're saying that the emperor has no clothes.
CK: Right.
DW: And saying it in a cool, professional, technical language…
CK: Mm hm.
DW: Hopefully sends chills down the right people's spines. Of course what we've found is, in fact, that that isn't enough; we have to go out and give talks, and testify, and talk to the press.
CK: How far would you be willing to push that, maybe not you personally, but even just in terms of lending your support? We talked last time about the Blaster Worm, which you clearly said you were opposed to as a procedure of, say, let's call it, civil disobedience.
DW: Yeah.
CK: Um, how far would you be willing to push the voting stuff? In terms of, you know, if someone decided, you know, using your paper, that they could go in and rig it so that Monty Python did win, you know, that John Cleese did win an election – is that something that you think would demonstrate the insecurity of these systems properly, or is that a dangerous thing to do?
DW: I think it's dangerous to go dork with actual live elections where there's real candidates being elected. First off, it's very, very illegal. And oftentimes, with people who try to prove a point that way – like the character who went and left some box cutters inside airplane bathrooms – the security people didn't have much of a sense of humor about it.
CK: (Laughter) Surprise!
DW: Yeah, surprise. And I don't really want to go to jail, and if I can prove my point without needing to go to jail, then that's all the better. And enough people are taking what we're saying seriously that I don't feel the need to do a stunt like that, 'cause that's all it is, it's just a stunt. Instead, I'd rather be in a debate where you've got me and you've got Beverly Kaufman or somebody {and me on}… the other side, where I can debate them in a public forum, and I wanna win or lose a debate.
CK: Uh-huh.
DW: I was invited to Atlanta for just such a debate, but the secretary of state and everybody else on that side refused to show. It was being hosted by this guy {Hans} Klein, who's in the, what school is he in… public policy, I think, at Georgia Tech. And he organized this thing, and I flew in, and another guy from Iowa, who's, who's also on our side… This guy, his name is Doug Jones, and he's a professor of computer science, and he's on Iowa's board of election examiners, so he has a foot in both camps. He absolutely supports the voter-verifiable audit trail, thinks it's the right answer. And he and I were invited; they were gonna try to {have} people from the other side. Nobody on the other side wanted to show up. So it's like they're ducking the issue.
CK: Mm-huh.
DW: And that isn't gonna work long-term for them. The issue has far greater legs than anybody could have predicted in advance. Never could I have predicted that one week of research would turn into six months of talking to the press. Never.
CK: What do you think about what's happening with the stuff that, the stuff that was on the New Zealand web-server, which is now being mirrored everywhere because… as a kind of act of electronic civil disobedience?
DW: Well, so the stuff that was on the New Zealand web-server is gone; I don't think it's online anywhere now.
CK: Uh-huh.
DW: The stuff that's mirrored everywhere is… subsequent to our paper, some hacker broke into some Diebold system and carted away a bunch of private email.
CK: So those aren't the same thing, then?
DW: Those are separate.
CK: Oh, OK. That's important to know.
DW: Yeah. So some hacker carted away all this, this treasure trove of internal Diebold emails, and that's what's now being mirrored all over the place. So I have absolutely no involvement in that.
CK: And the stuff that you downloaded from the New Zealand… that stuff is basically gone, that's not part of this?
DW: Well, Diebold has since sent some sort of cease and desist to them, and in fact it did go away.
CK: OK.
DW: And in fact, Diebold sent us a cease and desist much later in the game, but I wandered over to Rice legal with this, and I had all three of the Rice lawyers reading it and laughing at it, saying "Is that the best they can do? And what took them so long?" (Laughter).
CK: A little lawyer posturing there.
DW: Well, their attitude was, "This is a joke." And…
CK: But unfortunately, it's so not, since it, like, involves one of a handful of corporations that's building voting machines…
DW: Right, right. But the cease and desist that they sent to me, and a similar one to the Hopkins people, legally, was a joke. What they accused us of and what they threatened us with – well, they said, you know, "Stop that! Or… we'll say 'Stop that' again!" You know (laughter). It didn't really have… they didn't say "stop this or we will bring these laws down on your head," which is the normal way you write one of those things.
CK: Right, sure, sure.
DW: What they basically said is, "We're watching you! You damn kids." But it didn't say much.
CK: Right. So where does it stand now in terms of trying to negotiate with any of these voting companies to get access to the source code to do a legitimate study? Is that part of the negotiation now?
DW: I think… none of the voting companies really want to talk to us {anymore}.
CK: Right [Are you, but]
DW: [They didn't want to talk to us in advance.]
CK: [But if it becomes] a bigger and bigger, if it actually becomes a bigger and bigger issue, isn't the right thing for them to do, or the right thing for you to do, to try and negotiate to do a legitimate study of this, you know, or another one, or to repeat it?
DW: Right, but the voting companies don't want to give it to us on our terms.
CK: Mm-huh.
DW: Our terms, I mean, I [would go…]
CK: [But I mean] in some ways that's fine, except that they're gonna give it to someone else, on other terms, right?
DW: Well, if the terms are non-disclosure, then who knows who they've given it to?
CK: Right.
DW: If the terms are, anybody in the world can see it, then I'm interested.
CK: Mm-huh.
DW: And in fact VoteHere, which I mentioned earlier, has allowed researchers to look at their source code. I'm not involved in that effort, but a number of other researchers are looking at VoteHere's software.
CK: Mm-huh.
DW: And, you know, I applaud that company for that, and since Sequoia, one of the big three firms, is apparently going to be using VoteHere's software in some {very} capacity, VoteHere is doing something good. And, you know, likewise, there are limits to what I or any other researcher can do. The community as a whole needs to be looking at these things, and you know, I'm doing my part. I can't do everything.
CK: Sure, sure. What about the federal issue of actually setting standards for these kinds of things – where, who's worrying about that?
DW: So, there is a bill winding its way through Congress, it's House bill 2239; the initial author is Rush Holt of NJ. It's got 50-60-odd Democratic sponsors and no Republican sponsors…
CK: Why is that?
DW: Good question. And what that bill is, what it would require if it became law, would be this voter-verifiable audit trail. And the vendors, whe… back before we published the paper that trashed Diebold, these vendors initially would say things like: voter-verifiable audit trail, dumb idea; we don't need that, we don't want it, it's bad. As we put more and more pressure on them, they became agnostic: well, if we're required to do it, and if there's a standard that says how we'll do it, then we'll do it. So that's what's… that's more or less where we stand now; all of the major vendors are now saying, "if we're required to do it, and if there's at all some notion of a standard for how we're gonna do it, then we will." And so actually, behind the scenes, there's a furious battle going on inside the IEEE standards committee on defining exactly what these standards might or might not be. And again this idea of regulatory capture comes up. And that's the sort of un-sexy, boring part of it. Dill and Mercuri are involved in that process, as are a number of other people. I decided that I don't have the time to get wrapped up in the standards effort… I just don't have enough time.
CK: Right.
DW: I mean, I was, they were gonna have a meeting in San Antonio, and I was gonna show up, and then, like, two weeks before the meeting they moved it, to a time where I already had a conflict. I was gonna be at a conference somewhere in upstate New York. People had bought tickets for this thing… and they just arbitrarily moved the date! Who's "they," and why did they do it? I don't know, but it's another example of sleazy behavior.
HL: And when you teach this as an exercise in this class that you were telling us about, I presume you do that partly because it's something that you're working on, so you get some ability to teach what you're working on, but also do you see it as teaching these students about {these things} you were talking about, about technology and public policy?
DW: It’s an assignment that covers a lot of ground compressed into a small assignment. In prior
2491
years, I had an assignment where phase one was to design a Coke-machine that you pay with a
2492
smart card. And phase two was you’d get somebody else’s Coke Machine and beat up on it.
2493
And then Phase three you’d take that feedback from the other groups in phase two and make
2494
your thing better. And the genius of that assignment was among other things it avoided slow
2495
delay of grading, because you got immediate feedback from other groups, because they, they
2496
had their own deadlines. So I just took that idea and I replaced the Coke-machine with a voting
2497
machine. And I had you start off being evil rather than starting off being good, but otherwise it’s,
2498
it, it mirrors an assignment that I’d already carefully designed to have this duality of being on both
2499
sides of a problem. So I kept that duality but I stirred in a topical problem… the students have
2500
mentioned to me that they really enjoyed being able to discuss their problem with their peers.
2501
Everybody likes working on something that is topical or cool or somehow meaningful to people
2502
who aren’t conversant in what their major is. I do this when I teach COMP 314, a sophomore
2503
level software engineering class. The COMP 314 projects, one of them is, implement a
2504
newspaper-style web site that reads in a bunch of articles and then outputs the articles with
2505
cross links, and here’s the business section, here’s the metro section, all that, just like a real
2506
website does, and then you can show your friends: “Look, I built something that does something
2507
real.” Another one of the assignments is, using genetic algorithms, which is a sort of obscure
2508
topic in computer science, but using them to generate pictures, you’re breeding pictures
2509
together… and you have this interface, where you can pick one that’s pretty and say “Make
2510
more.” And it then breeds them and you get more pretty pictures and then you click on the pretty
2511
ones and say “make more.” And it’s something your roommates can play with even if they don’t
2512
understand it, they can still think it’s cool.
2513
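
The interactive-evolution idea described here is simple to sketch: each picture is a genome, the user's clicks act as the fitness function, and "make more" is crossover plus mutation. A toy Python version, with number vectors standing in for pictures (hypothetical, not the actual COMP 314 code):

    import random

    def crossover(a, b):
        # Child takes each "gene" from one parent at random.
        return [random.choice(genes) for genes in zip(a, b)]

    def mutate(genome, rate=0.1):
        # Occasionally nudge a gene; this is what keeps new variety appearing.
        return [g + random.gauss(0, 0.2) if random.random() < rate else g
                for g in genome]

    def make_more(picked, population=9):
        """Breed a new generation from the pictures the user clicked on."""
        return [mutate(crossover(random.choice(picked), random.choice(picked)))
                for _ in range(population)]

    generation = [[random.random() for _ in range(8)] for _ in range(9)]
    favorites = generation[:2]          # pretend the user clicked these two
    generation = make_more(favorites)   # "Make more" -- repeat until pretty
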
CK: Mm-huh.
DW: And I've found that students will work harder when they consider their work to be cool. So in a certain sense, I'm trying to trick people into doing more work (laughter).
CK: We knew you were evil!!
DW: Mwah-hahh-hah (but it works!)…
CK: Okay I think we’ve pretty much the end. Um, but I just noticed this while we sitting here,
2519
cause you talked about it, I just want to share on camera…
2520
DW: Oh yeah…
2521
CK: {Tips} for effective Message Control…
2522
DW: These are handouts from the Rice Media Center people.
2523
CK: That’s so [Orwellian…]
2524
DW: This, this was authored by Terry Shepard, I believe.
2525
CK: Okay, good lord. (Reading) Pre-Interview exercises. Find out as much as you can about the
2526
news media outlet interviewing you, and the…and …(reading) Did you do all these things?
2527
DW: I don’t remember getting an email from you?
2528
CK: “Try describing you program or job in one sentence using clear concise language. Write
2529
down and repeat to yourself 1-3 points you want to get across, and practice answering questions
2530
in advance, especially ones you don’t….” What I like are the During the Interview ones… “Stay
2531
focused, don’t confuse your interviewer by bringing in extraneous issues.” Why do computer
2532
scientists have so many toys on their desk?
2533
DW: Uhhhh…
2534
CK (Laughter). “Use flag word: The most important thing to remember… the three most
2535
important things… for example… etc.” “Be interesting, tell a brief anecdote that will illustrate your
2536
point.” We’re opposed to that one as well… (we say) Tell a long complex, meandering anecdote
2537
to illustrate your point, so that we have something to WORK with!
2538
DW: Laughing…
2539
CK: “Stay positive. Don’t repeat any part of a negative question when you are answering it.”
2540
“Speak from the public’s point of view.” That’s hard, that’s a really hard one. How do you know
2541
when you’re in the public? “Speak past the interviewer, directly to the public.” “State your
2542
message, then restate it in a slightly different way. Do this as many times as you can without
2543
sounding like a broken record.” (Laughter) That’s a good one, isn’t it? And I like this one
2544
especially: “Bridge from an inappropriate question to what you really want to say.” It all comes
2545
down to like, “say the same thing over and over again, and don’t answer the questions you don’t
2546
want to.”
2547
DW: Yeah, I mean, this is what your politicians call "staying on message."
CK: Right. Effective message control.
DW: And here's some more…
CK: Oh, Ethics and Procedures! "Tell the truth, the whole truth, and nothing but the truth." "Stay on the record." "Face to face."
DW: I like this: "You are the University."
CK: Wow.
DW: I mean, they emphasize this because, when I'm talking to a reporter, the distinction between a professor… Professor X says, "blah!"—people say, "Oh, Rice University says, 'blah!'" People read between the lines, so you are speaking on behalf of your employer whether you want to be or not…
CK: Right. You Are The University! On that note… thanks again, for your time.
DW: No problem.