HSS-EncryptionAdvantage

Notes/Explanation
This is an advantage stem that can be read with the Surveillance State Repeal Act and Secure Data Act affirmatives (and
others that we haven’t worked on yet). It accesses many of the other impact areas that we have developed in other
advantage files. This file contains the internal link stem for the advantage; the impact scenarios are found in other
advantage files and the solvency cards for a particular plan are found in that affirmative file.
1AC Options
1AC — Encryption Advantage Stem
Contention __ is Encryption
First, U.S. government attacks on encryption leave everyone vulnerable,
jeopardizing privacy and data security. There is no such thing as a “good guys
only” backdoor.
Doctorow 14 — Cory Doctorow, journalist and science fiction author, Co-Editor of Boing Boing, Fellow at the
Electronic Frontier Foundation, former Canadian Fulbright Chair for Public Diplomacy at the Center on Public Diplomacy
at the University of Southern California, recipient of the Electronic Frontier Foundation’s Pioneer Award, 2014 (“Crypto
wars redux: why the FBI's desire to unlock your private life must be resisted,” The Guardian, October 9th, Available Online
at http://www.theguardian.com/technology/2014/oct/09/crypto-wars-redux-why-the-fbis-desire-to-unlock-your-private-life-must-be-resisted, Accessed 06-24-2015)
Eric Holder, the outgoing US attorney general, has
joined the FBI and other law enforcement
agencies in calling for the security of all computer systems to be fatally
weakened. This isn’t a new project – the idea has been around since the early 1990s, when the NSA classed all strong
cryptography as a “munition” and regulated civilian use of it to ensure that they had the keys to unlock any technological
countermeasures you put around your data.
In 1995, the Electronic Frontier Foundation won a landmark case establishing that code was a form of protected
expression under the First Amendment to the US constitution, and since then, the whole world has enjoyed relatively
unfettered access to strong crypto.
How strong is strong crypto? Really, really strong. When properly implemented
and secured by relatively long keys, cryptographic algorithms can protect your
data so thoroughly that all the computers now in existence, along with all the
computers likely to ever be created, could labour until the sun went nova
without uncovering the keys by “brute force” – ie trying every possible permutation of password.
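[Editor's note: the "really, really strong" claim above can be made concrete with back-of-envelope arithmetic. The figures below are illustrative assumptions (a 256-bit key and a hypothetical global effort of a quintillion guesses per second), not numbers from the article:]

```python
# Rough illustration of why brute-forcing a modern key is infeasible.
# The key size and guessing rate are illustrative assumptions.

KEY_BITS = 256                     # e.g., the AES-256 keyspace
total_keys = 2 ** KEY_BITS         # every possible key

guesses_per_second = 10 ** 18      # a generous, hypothetical global effort
seconds_per_year = 60 * 60 * 24 * 365

years_to_try_all = total_keys / (guesses_per_second * seconds_per_year)
print(f"{years_to_try_all:.2e} years to exhaust the keyspace")
# prints roughly 3.67e+51 years -- the sun's remaining lifetime is ~5e9 years
```

Even shaving fifty orders of magnitude off the guessing rate assumption leaves the search hopeless, which is the sense in which brute force "could labour until the sun went nova."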
The “crypto wars” of the early 1990s were fuelled by this realisation – that computers were
changing the global realpolitik in an historically unprecedented way. Computational crypto made
keeping secrets exponentially easier than breaking secrets, meaning that, for the
first time in human history, the ability for people without social or political
power to keep their private lives truly private from governments, police, and
corporations was in our grasp.
The arguments then are the arguments now. Governments invoke the Four
Horsemen of the Infocalypse (software pirates, organised crime, child
pornographers, and terrorists) and say that unless they can decrypt bad guys’
hard drives and listen in on their conversations, law and order is a dead letter.
On the other side, virtually every security and cryptography expert tries
patiently to explain that there’s no such thing as “a back door that only the good
guys can walk through” (hat tip to Bruce Schneier). Designing a computer that bad guys
can’t break into is impossible to reconcile with designing a computer that good
guys can break into.
If you give the cops a secret key that opens the locks on your computerised
storage and on your conversations, then one day, people who aren’t cops will get
hold of that key, too. The same forces that led to bent cops selling out the public’s personal information to Glen
Mulcaire and the tabloid press will cause those cops’ successors to sell out access to the world’s computer systems, too,
only the numbers of people who are interested in these keys to the (United) Kingdom will be much larger, and they’ll
have more money, and they’ll be able to do more damage.
That’s really the argument in a nutshell. Oh, we can talk about whether the danger is
as grave as the law enforcement people say it is, point out that only a tiny number of
criminal investigations run up against cryptography, and when they do, these
investigations always find another way to proceed. We can talk about the fact
that a ban in the US or UK wouldn’t stop the “bad guys” from getting perfect
crypto from one of the nations that would be able to profit (while US and UK business
suffered) by selling these useful tools to all comers. But that’s missing the point:
even if every crook was using crypto with perfect operational security, the
proposal to back-door everything would still be madness.
Because your phone isn’t just a tool for having the odd conversation with your friends –
nor is it merely a tool for plotting crime – though it does duty in both cases. Your phone, and all the other
computers in your life, they are your digital nervous system. They know
everything about you. They have cameras, microphones, location sensors. You
articulate your social graph to them, telling them about all the people you know and how you know
them. They are privy to every conversation you have. They hold your logins and
passwords for your bank and your solicitor’s website; they’re used to chat to your therapist and the STI clinic and
your rabbi, priest or imam.
That device – tracker, confessor, memoir and ledger – should be designed so that
it is as hard as possible to gain unauthorised access to. Because plumbing leaks at the seams,
and houses leak at the doorframes, and lilos lose air through their valves. Making something airtight is
much easier if it doesn’t have to also allow the air to all leak out under the right
circumstances.
There is no such thing as a vulnerability in technology that can only be used by
nice people doing the right thing in accord with the rule of law. The existing
“back doors” in network switches, mandated under US laws such as CALEA, have become the
go-to weak-spot for cyberwar and industrial espionage. It was Google’s lawful interception
backdoor that let the Chinese government raid the Gmail account of dissidents. It was the lawful interception backdoor in
Greece’s national telephone switches that let someone – identity still unknown – listen in on the Greek Parliament and
prime minister during a sensitive part of the 2005 Olympic bid (someone did the same thing the next year in Italy).
The most shocking Snowden revelation wasn’t the mass spying (we already knew about
that, thanks to whistleblowers like Mark Klein, who spilled the beans in 2005). It was the fact that the UK
and US spy agencies were dumping $250,000,000/year into sabotaging operating
systems, hardware, and standards, to ensure that they could always get inside
them if they wanted to. The reason this was so shocking was that these spies
were notionally doing this in the name of “national security”– but they were
dooming everyone in the nation (and in every other nation) to using products
that had been deliberately left vulnerable to attack by anyone who
independently discovered the sabotage.
There is only one way to make the citizens of the digital age secure, and that is to
give them systems designed to lock out everyone except their owners. The
police have never had the power to listen in on every conversation, to spy upon
every interaction. No system that can only sustain itself by arrogating these
powers can possibly be called “just.”
Second, these government programs are an attack on encryption itself. The
damage is done even if agencies don’t abuse backdoors.
Gillmor 14 — Dan Gillmor, Director of the Knight Center for Digital Media Entrepreneurship at the Walter Cronkite
School of Journalism and Mass Communication at Arizona State University, Fellow at the Berkman Center for Internet &
Society at Harvard University, recipient of the Electronic Frontier Foundation’s Pioneer Award, 2014 (“Law Enforcement
Has Declared War on Encryption It Can’t Break,” Future Tense—a Slate publication, October 1st, Available Online at
http://www.slate.com/blogs/future_tense/2014/10/01/law_enforcement_has_declared_war_on_encryption_it_can_t_b
reak.html, Accessed 06-24-2015)
Suppose two suspected criminals happened to be the last two people on Earth
who could understand and speak a certain language. Would law enforcement
argue that they should only be permitted to speak in a language that others
could understand?
That's an imperfect but still useful analogy to a "debate" now taking place in American
policy circles, as law enforcement reminds us that it will never be satisfied until
it can listen in on everything we say and read everything we create—in real time
and after the fact.
The spark for this latest tempest is Apple's announcement that iPhones will
henceforth be encrypted by default. Data on the device will be locked with a key that only the phone
user controls, not Apple. Meanwhile, Google's Android operating system—which, unlike Apple's iOS, has
offered this functionality for several years now as an option—will also make device encryption the
default setting in its next version.
Attorney General Eric Holder and FBI Director James Comey are leading the how-dare-you chorus, with close harmony from the usual authoritarian acolytes. Their
goal can't only be to get Apple and Google to roll back this latest move, because as many people have pointed out, almost
all of what people keep on their phones is also stored in various corporate cloud computers that law enforcement can pry
open in a variety of ways, including a subpoena, secret order in alleged national security cases, or outright hacking.
No, the longer-range objective seems plain enough. This
is the launch of the latest, and most
alarming, attack on the idea of encryption itself—or at least encryption the government can't
easily crack. In particular, as the latest push to control crypto makes clear, law
enforcement wants so-called back doors into users' devices: technology that users
can't thwart, just in case the police want to get in.
Never mind that Congress has already said it doesn’t expect communications providers to provide such capabilities in
most cases. Here's relevant language from the law:
A telecommunications carrier shall not be responsible for decrypting, or ensuring the government’s ability to
decrypt, any communication encrypted by a subscriber or customer, unless the encryption was provided by the
carrier and the carrier possesses the information necessary to decrypt the communication.
What's discouraging to people who believe in digital security—that is, security for our
devices and our communications from criminals, business competitors, and
government spies who flout the Constitution, among others—is law enforcement's
disconnection from the reality that back doors increase vulnerability. University of
Pennsylvania researcher and security expert Matt Blaze put it this way: “Crypto
backdoors are dangerous even if you trust the government not to abuse them.
We simply don't know how to build them reliably.”
The hackers of the world—criminals, foreign governments, you name it—will be thrilled if
Holder, Comey and the no-privacy-for-you backup singers get their way.
The other, even worse, disconnect is the implicit notion that there is no measure
we shouldn’t take to guarantee our ability to stop and punish crime. The
Constitution, and especially the Bill of Rights, says we do take some additional risks in
order to have liberty. Why have we become so paranoid and fearful as a society
that we’d even entertain the notion that civil liberties mean next to nothing in the
face of our fear?
[Insert Impact Module(s) and Solvency Card(s)]
1AC — Long “Open Letter” Card
[If reading this impact, plug-in the relevant terminal impact cards.]
U.S. government attacks on encryption decimate cybersecurity, crush tech
industry competitiveness, and undermine global Internet freedom.
Open Letter 15 — An Open Letter to President Obama co-signed by 36 civil society organizations (including the
American Civil Liberties Union, Electronic Frontier Foundation, Electronic Privacy Information Center, and the Free
Software Foundation), 48 technology companies and trade associations (including Apple, Facebook, Google, Microsoft,
and Yahoo), and 58 security and policy experts (including Jacob Applebaum, Eric Burger, Joan Feigenbaum, and Bruce
Schneier), the full list of signatories is available upon request under the “FYI: Open Letter To Obama” header, 2015 (Open
Letter to Obama, May 19th, Available Online at https://static.newamerica.org/attachments/3138-113/Encryption_Letter_to_Obama_final_051915.pdf, Accessed 06-29-2015, p. 1-2)
We the undersigned represent a wide variety of civil society organizations dedicated to
protecting civil liberties, human rights, and innovation online, as well as
technology companies, trade associations, and security and policy experts. We are
writing today to respond to recent statements by some Administration officials regarding the deployment
of strong encryption technology in the devices and services offered by the U.S. technology industry. Those officials
have suggested that American companies should refrain from providing any
products that are secured by encryption, unless those companies also weaken
their security in order to maintain the capability to decrypt their customers’ data
at the government’s request. Some officials have gone so far as to suggest that
Congress should act to ban such products or mandate such capabilities.
We urge you to reject any proposal that U.S. companies deliberately weaken the
security of their products. We request that the White House instead focus on
developing policies that will promote rather than undermine the wide adoption
of strong encryption technology. Such policies will in turn help to promote and
protect cybersecurity, economic growth, and human rights, both here and
abroad.
Strong encryption is the cornerstone of the modern information economy’s
security. Encryption protects billions of people every day against countless
threats—be they street criminals trying to steal our phones and laptops, computer criminals
trying to defraud us, corporate spies trying to obtain our companies’ most valuable trade secrets, repressive
governments trying to stifle dissent, or foreign intelligence agencies trying to compromise our and
our allies’ most sensitive national security secrets.
Encryption thereby protects us from innumerable criminal and national security
threats. This protection would be undermined by the mandatory insertion of any
new vulnerabilities into encrypted devices and services. Whether you call them “front doors”
or “back doors”, introducing intentional vulnerabilities into secure products for the
government’s use will make those products less secure against other attackers.
Every computer security expert that has spoken publicly on this issue agrees on
this point, including the government’s own experts.
In addition to undermining cybersecurity, any kind of vulnerability mandate would
also seriously undermine our economic security. U.S. companies are already
struggling to maintain international trust in the wake of revelations about the
National Security Agency’s surveillance programs. Introducing mandatory
vulnerabilities into American products would further push many customers—be
they domestic or international, [end page 1] individual or institutional—to turn away
from those compromised products and services. Instead, they—and many of the
bad actors whose behavior the government is hoping to impact—will simply rely
on encrypted offerings from foreign providers, or avail themselves of the wide
range of free and open source encryption products that are easily available
online.
More than undermining every American’s cybersecurity and the nation’s
economic security, introducing new vulnerabilities to weaken encrypted
products in the U.S. would also undermine human rights and information
security around the globe. If American companies maintain the ability to unlock
their customers’ data and devices on request, governments other than the United
States will demand the same access, and will also be emboldened to demand the
same capability from their native companies. The U.S. government, having
made the same demands, will have little room to object. The result will be an
information environment riddled with vulnerabilities that could be exploited by
even the most repressive or dangerous regimes. That’s not a future that the
American people or the people of the world deserve.
The Administration faces a critical choice: will it adopt policies that foster a
global digital ecosystem that is more secure, or less? That choice may well define
the future of the Internet in the 21st century. When faced with a similar choice at
the end of the last century, during the so-called “Crypto Wars”, U.S. policymakers
weighed many of the same concerns and arguments that have been raised in the current debate,
and correctly concluded that the serious costs of undermining encryption
technology outweighed the purported benefits. So too did the President’s Review
Group on Intelligence and Communications Technologies, who unanimously
recommended in their December 2013 report that the US Government should “(1) fully
support and not undermine efforts to create encryption standards; (2) not in any
way subvert, undermine, weaken, or make vulnerable generally available
commercial software; and (3) increase the use of encryption and urge US
companies to do so, in order to better protect data in transit, at rest, in the cloud,
and in other storage.”
We urge the Administration to follow the Review Group’s recommendation and
adopt policies that promote rather than undermine the widespread adoption of
strong encryption technologies, and by doing so help lead the way to a more
secure, prosperous, and rights-respecting future for America and for the world.
1AC — Short “Open Letter” Card
[If reading this impact, plug-in the relevant terminal impact cards.]
U.S. government attacks on encryption hurt cybersecurity, the economy, and
human rights.
Open Letter 15 — An Open Letter to President Obama co-signed by 36 civil society organizations (including the
American Civil Liberties Union, Electronic Frontier Foundation, Electronic Privacy Information Center, and the Free
Software Foundation), 48 technology companies and trade associations (including Apple, Facebook, Google, Microsoft,
and Yahoo), and 58 security and policy experts (including Jacob Applebaum, Eric Burger, Joan Feigenbaum, and Bruce
Schneier), the full list of signatories is available upon request under the “FYI: Open Letter To Obama” header, 2015 (Open
Letter to Obama, May 19th, Available Online at https://static.newamerica.org/attachments/3138-113/Encryption_Letter_to_Obama_final_051915.pdf, Accessed 06-29-2015, p. 1)
We the undersigned represent a wide variety of civil society organizations dedicated to
protecting civil liberties, human rights, and innovation online, as well as
technology companies, trade associations, and security and policy experts. We are
writing today to respond to recent statements by some Administration officials regarding the deployment
of strong encryption technology in the devices and services offered by the U.S. technology industry. Those officials
have suggested that American companies should refrain from providing any
products that are secured by encryption, unless those companies also weaken
their security in order to maintain the capability to decrypt their customers’ data
at the government’s request. Some officials have gone so far as to suggest that
Congress should act to ban such products or mandate such capabilities.
We urge you to reject any proposal that U.S. companies deliberately weaken the
security of their products. We request that the White House instead focus on
developing policies that will promote rather than undermine the wide adoption
of strong encryption technology. Such policies will in turn help to promote and
protect cybersecurity, economic growth, and human rights, both here and
abroad.
1AC — Civil Liberties Module
[If reading this impact, plug-in appropriate terminal impacts — privacy,
journalism, bigotry, etc.]
Encryption is a fundamental human right. Without it, privacy and freedom of
speech are impossible.
O’Neill 14 — Patrick Howell O'Neill, Reporter who covers the future of the internet for The Daily Dot—the
“hometown newspaper of the internet,” 2014 (“Encryption shouldn’t just be an election issue—it should be a human
right,” Kernel Magazine, October 19th, Available Online at http://kernelmag.dailydot.com/issue-sections/staff-editorials/10587/encryption-human-right/#sthash.zpesaOpG.dpuf, Accessed 06-24-2015)
We are all born with certain inalienable rights. Now, the notion of free speech—core to the entire
idea of human rights—must be constantly re-examined in the face of a rapidly changing world where the most important
speech increasingly takes place on the Internet.
Free speech isn’t merely about the abstract idea of saying whatever you want. It’s the
freedom to speak, ask questions, and seek knowledge without anyone, hidden or
otherwise, looking over your shoulder. In the digital age, free speech is about being
able to use the Internet unmolested by the greedy eyes of corporations with
incentive to sell your personal data, hackers wanting to steal or destroy it, and
massive regimes collecting it all.
Everything a person does online is saved as data. The messages you send, the
websites you look at, the files you save are all data that can live on forever. The
only way to protect your most personal data—your location, credit card numbers, political
thoughts, medical queries, and everything else you do on a phone, tablet, or computer—from malicious spying
eyes is through encryption.
That’s why encryption should be considered a human right.
Modern encryption—at its core, really good mathematics that make it possible to protect your data so that no computer
can decrypt it without your go ahead—is legally protected by a 1995 American court decision that declared computer
code constitutionally protected free speech. But many of the world’s top cops continue to demonize encryption, saying it
will inevitably enable and immunize criminals on massive scales.
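[Editor's note: the "really good mathematics" described above can be sketched in miniature. The toy one-time pad below, built only from Python's standard library, shows the core idea that data is unrecoverable without the key; it is illustrative only, and real systems use vetted ciphers such as AES rather than this sketch:]

```python
import secrets

# Toy one-time-pad sketch: XOR the message with a random key of the same
# length. Whoever lacks the key sees only uniformly random bytes.
message = b"meet at the usual place"
key = secrets.token_bytes(len(message))   # known only to the data's owner

ciphertext = bytes(m ^ k for m, k in zip(message, key))
recovered = bytes(c ^ k for c, k in zip(ciphertext, key))

assert recovered == message   # only the key holder can decrypt
```

A mandated "back door" amounts to a second copy of `key` held by someone other than the owner, which is why the experts quoted below treat it as a vulnerability by construction.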
Criticisms of encryption coming from corners of power are louder now than
they’ve been in two decades. Earlier this week, James B. Comey, director of the Federal Bureau of
Investigation (FBI), said that law enforcement agencies should have access to encrypted communications, because “the
law hasn’t kept pace with technology, and this disconnect has created a significant public safety problem.” He believes
Apple’s new smartphone encryption puts its users “beyond the law.”
What is bound to be one of the great political issues of this century is only now
beginning to enter the mainstream in a clear way. One day in the not-too-distant
future, cryptography will be an election issue. Prominent politicians will debate
louder than ever about privacy and secrecy while voters—even moms and dads—will
cast ballots with strong encryption on their minds.
That’s not to say the debate hasn’t already begun. The war over encryption is four decades old. It
dates back to the 1970s when a new invention called public key cryptography definitively broke the government’s
monopoly on secrets. All of a sudden, using free software that utilized clever mathematics and powerful cryptography,
normal people were able to keep data private from even the most powerful states on the planet.
The war over encryption—most notably the so-called “crypto wars” of the 1990s—saw the American
government try to make strong encryption a military-grade weapon in the eyes
of the law. Opposed chiefly by the Electronic Frontier Foundation, courts declared
computer code to be free speech and said the government’s regulations were
unconstitutional.
Despite the landmark legal victory, the war over encryption has continued to this
day.
John J. Escalante, chief of detectives for the Chicago Police Department, has called
encryption mostly a tool of pedophiles—a claim that’s disingenuous and
misleading, if not outright dangerous. For one thing, many city and federal police
agents use encryption tools regularly, and encryption stymied a total of nine
police investigations last year. There are plenty of ways to investigate crimes
involving cryptography that don’t involve banning or curtailing it.
There’s no denying that these tools have some very ugly users. However, for the
few billion of us who want to keep our digital lives private from unwanted
eavesdroppers and hackers, being forcefully grouped in with terrorists and
pedophiles is a hard insult to stomach.
Encryption works to protect you—and everyone else—online. More than that, it’s the
best protection you have. There are simply no other options that can compare.
If, for some reason, you assume a hack will never happen to you, let me give you some
perspective on the current state of digital security. 2014 is known in information technology circles as
“the year of the breach” because it has boasted some of the biggest hacks in
history. 2013 had a nickname too: The year of the breach. Come to think of it, 2012 was
called something eerily similar: The year of the breach.
2011? You get the idea.
This isn’t merely one year of massive security breaches, it’s an era of profound digital
insecurity in which sensitive personal data—the information that can be put together to add up to a
startlingly complete picture of our lives and thoughts—is under attack by criminals, corporations,
and governments whose sophistication, budget, and drive is only growing.
Consider the following, put forth by Eben Moglen, a law professor at Columbia University, in 2010: “Facebook
holds and controls more data about the daily lives and social interactions of half
a billion people than 20th-century totalitarian governments ever managed to
collect about the people they surveilled.”
The Internet’s “architecture has also made it possible for businesses and
governments to fill giant data vaults with the ore of human existence—the
appetites, interests, longings, hopes, vanities, and histories of people surfing the
Internet, often unaware that every one of their clicks is being logged, bundled,
sorted, and sold to marketers,” the New York Times journalist Jim Dwyer wrote in his new book, More
Awesome Than Money. “Together, they amount to nothing less than a full psyche scan,
unobstructed by law or social mores.”
When people like Comey suggest that law enforcement should have a “back door” or
“golden key” that allows cops to easily access all encrypted communication, they
are willfully ignoring the reality shouted to them by the vast majority of the
information technology industry.
“You can’t build a ‘back door’ that only the good guys can walk through,”
cryptographer Bruce Schneier wrote recently. “Encryption protects against cybercriminals,
industrial competitors, the Chinese secret police, and the FBI. You’re either
vulnerable to eavesdropping by any of them, or you’re secure from
eavesdropping from all of them.”
When encryption becomes a campaign issue, that’s going to go on the bumper stickers: You either have real
privacy and security for everyone or for no one.
“The existing ‘back doors’ in network switches, mandated under U.S. laws such as CALEA, have
become the go-to weak-spot for cyberwar and industrial espionage,” author Cory
Doctorow wrote in the Guardian. “It was Google’s lawful interception backdoor that let the Chinese government raid the
Gmail account of dissidents. It was the lawful interception backdoor in Greece’s national telephone switches that let
someone—identity still unknown—listen in on the Greek Parliament and prime minister during a sensitive part of the
2005 Olympic bid (someone did the same thing the next year in Italy).”
If, like many Americans, you say you don’t mind if the U.S. government watches what
you do online, take a step back and consider the bigger picture.
The American government is not the only government—nevermind other
organizations—watching and hacking people on the Internet. China, Russia,
Iran, Israel, the U.K., and every other nation online decided long ago that
cyberspace is a militarized country. All the states with the necessary resources
are doing vast watching and hacking as well.
Encryption proved a crucial help to protesters during the Arab Spring. It helps
Iranian liberals push against their oppressive theocracy. From African free
speech activists to Chinese pro-democracy organizers to American cops
investigating organized crime, strong encryption saves lives, aids law
enforcement (ironic, huh?), protects careers, and helps build a more free and verdant
world. Journalists—citizen and professional alike—depend on encryption to keep
communications and sources private from the people and groups they report on,
making it essential to an independent and free press.
The right to privacy, the right to choose what parts of yourself are exposed to the world, was described over a century ago
by the U.S. Supreme Court and held up as an issue of prime importance last year by U.N. human rights chief Navi Pillay.
It’s something we all need to worry about.
Lacking good law, privacy is best defended by good technology. You cannot truly
talk about online privacy without talking about encryption. That’s why many of
the world’s biggest tech firms such as Google, Apple, and Yahoo are adding strong
encryption to some of their most popular products.
“There is only one way to make the citizens of the digital age secure, and that is
to give them systems designed to lock out everyone except their owners,” Doctorow
wrote. “The police have never had the power to listen in on every conversation, to
spy upon every interaction. No system that can only sustain itself by arrogating
these powers can possibly be called ‘just.’”
In the digital age, encryption is our only guarantee of privacy. Without it, the
ideal of free speech could be lost forever.
1AC — Cybersecurity Module
U.S. government attacks on encryption destroy cybersecurity.
Open Letter 15 — An Open Letter to President Obama co-signed by 36 civil society organizations (including the
American Civil Liberties Union, Electronic Frontier Foundation, Electronic Privacy Information Center, and the Free
Software Foundation), 48 technology companies and trade associations (including Apple, Facebook, Google, Microsoft,
and Yahoo), and 58 security and policy experts (including Jacob Applebaum, Eric Burger, Joan Feigenbaum, and Bruce
Schneier), the full list of signatories is available upon request under the “FYI: Open Letter To Obama” header, 2015 (Open
Letter to Obama, May 19th, Available Online at https://static.newamerica.org/attachments/3138-113/Encryption_Letter_to_Obama_final_051915.pdf, Accessed 06-29-2015, p. 1)
Strong encryption is the cornerstone of the modern information economy’s
security. Encryption protects billions of people every day against countless
threats—be they street criminals trying to steal our phones and laptops, computer criminals
trying to defraud us, corporate spies trying to obtain our companies’ most valuable trade secrets, repressive
governments trying to stifle dissent, or foreign intelligence agencies trying to compromise our and
our allies’ most sensitive national security secrets.
Encryption thereby protects us from innumerable criminal and national security
threats. This protection would be undermined by the mandatory insertion of any
new vulnerabilities into encrypted devices and services. Whether you call them “front doors”
or “back doors”, introducing intentional vulnerabilities into secure products for the
government’s use will make those products less secure against other attackers.
Every computer security expert that has spoken publicly on this issue agrees on
this point, including the government’s own experts.
Cyber attacks are frequent and devastating. Every attack increases the risk of
existential catastrophe.
Nolan 15 — Andrew Nolan, Legislative Attorney at the Congressional Research Service, former Trial Attorney at the
United States Department of Justice, holds a J.D. from George Washington University, 2015 (“Cybersecurity and
Information Sharing: Legal Challenges and Solutions,” CRS Report to Congress, March 16th, Available Online at
http://fas.org/sgp/crs/intel/R43941.pdf, Accessed 07-05-2015, p. 1-3)
Introduction
Over the course of the last year, a host of cyberattacks1 have been perpetrated on a
number of high profile American companies. In January 2014, Target announced
that hackers, using malware,2 had digitally impersonated one of the retail giant’s
contractors,3 stealing vast amounts of data—including the names, mailing addresses, phone numbers
or email addresses for up to 70 million individuals and the credit card information of 40 million shoppers. 4
Cyberattacks in February and March of 2014 potentially exposed contact and login information of eBay’s customers, prompting the online retailer to ask its more than 200 million users
to change their passwords.5 In September, it was revealed that over the course of five
months cyber-criminals tried to steal the credit card information of more than
fifty million shoppers of the world’s largest home improvement retailer, Home Depot.6 One
month later, J.P. Morgan Chase, the largest U.S. bank by assets, disclosed that
contact information for about 76 million households was captured in a
cyberattack earlier in the year.7 In perhaps the most infamous cyberattack of 2014, in
late November, Sony Pictures Entertainment suffered a “significant system disruption”
as a result of a “brazen cyber attack”8 that resulted in the leaking of the personal
details of thousands of Sony employees.9 And in February of 2015, the health care
provider Anthem Blue Cross Blue Shield [end page 1] disclosed that a “very
sophisticated attack” obtained personal information relating to the company’s
customers and employees.10
The high profile cyberattacks of 2014 and early 2015 appear to be indicative of a
broader trend: the frequency and ferocity of cyberattacks are increasing,11
posing grave threats to the national interests of the United States. Indeed, the attacks
on Target, eBay, Home Depot, J.P. Morgan-Chase, Sony Pictures, and Anthem
were only a few of the many publicly disclosed cyberattacks perpetrated in 2014 and 2015.12
Experts suggest that hundreds of thousands of other entities may have suffered
similar incidents during the same period,13 with one survey indicating that 43% of
firms in the United States had experienced a data breach in the past year.14 Moreover,
just as the cyberattacks of 2013—which included incidents involving companies
like the New York Times, Facebook, Twitter, Apple, and Microsoft15—were
eclipsed by those that occurred in 2014,16 the consensus view is that 2015 and
beyond will witness more frequent and more sophisticated cyber incidents.17 To
the extent that its expected rise outpaces any corresponding rise in the ability to
defend against such attacks, the result could be troubling news for countless
businesses that rely more and more on computers in all aspects of their
operations, as the economic losses resulting from a single cyberattack can be
extremely costly.18 And the resulting effects of a cyberattack can have effects
beyond a single company’s bottom line. As “nations are becoming ever more
dependent on information and information technology,”19 the threat posed by
any one cyberattack [end page 2] can have “devastating collateral and cascading
effects across a wide range of physical, economic and social systems.”20 With
reports that foreign nations—such as Russia, China, Iran, and North Korea—may
be using cyberspace as a new front to wage war,21 fears abound that a cyberattack
could be used to shut down the nation’s electrical grid,22 hijack a commercial
airliner,23 or even launch a nuclear weapon with a single keystroke.24 In short,
the potential exists that the United States could suffer a “cyber Pearl Harbor,” an
attack that would “cause physical destruction and loss of life”25 and expose—in the
words of one prominent cybersecurity expert—“vulnerabilities of staggering proportions.”26
Cybersecurity outweighs terrorism.
CSM 14 — Christian Science Monitor, 2014 (“Feds hacked: Is cybersecurity a bigger threat than terrorism?,” Byline
Harry Bruinius, November 10th, Available Online at http://www.csmonitor.com/USA/2014/1110/Feds-hacked-Is-cybersecurity-a-bigger-threat-than-terrorism-video, Accessed 07-06-2015)
While the terrestrial fears of terrorism and Ebola have dominated headlines, American
leaders are fretting about what may be even more serious virtual threats to the
nation’s security.
This year, hundreds of millions of private records have been exposed in an
unprecedented number of cyberattacks on both US businesses and the federal
government.
On Monday, just as President Obama arrived in Beijing to begin a week-long summit with regional leaders, Chinese
hackers are suspected to have breached the computer networks of the US Postal Service, leaving the personal data of
more than 800,000 employees and customers compromised, The Washington Post reports.
The data breach, which began as far back as January and lasted through mid-August, potentially exposed 500,000 postal
employees’ most sensitive personal information, including names, dates of birth, and Social Security numbers, the Postal
Service said in a statement Monday. The data of customers who used the Postal Service’s call center from January to
August may have also been exposed.
"The FBI is working with the United States Postal Service to determine the nature and scope of this incident," the federal
law enforcement agency said in a statement Monday. Neither the FBI nor the Postal Service, however, confirmed it was
the work of Chinese hackers.
The breach did not expose customer payment or credit card information, the Postal Service said, but hackers did gain
access to its computer networks at least as far back as January. The FBI informed the Postal Service of the hack in mid-September.
“It is an unfortunate fact of life these days that every
organization connected to the Internet is a
constant target for cyber intrusion activity,” said Postmaster General Patrick Donahoe in a statement.
“The United States Postal Service is no different. Fortunately, we have seen no evidence of malicious use of the
compromised data and we are taking steps to help our employees protect against any potential misuse of their data.”
But the reported breach comes as both
intelligence officials and cybersecurity experts say
computer hackers now pose a greater threat to national security than terrorists.
Since 2006, cyber-intruders have gained access to the private data of nearly 90 million people in federal networks, the
Associated Press reported in a major investigation published Monday.
Hackers have also accessed 255 million customer records in retail networks during this time, 212 million customer records
in financial and insurance industry servers, as well as 13 million records of those in educational institutions, the AP
reported.
“The
increasing number of cyber-attacks in both the public and private sectors is
unprecedented and poses a clear and present danger to our nation’s security,”
wrote Rep. Elijah Cummings (D) of Maryland, ranking member of the House Committee on Oversight and Government
Reform, in a letter to Postmaster General Donahoe on Monday.
Only strong encryption can preserve cybersecurity.
Kehl et al. 15 — Danielle Kehl, Senior Policy Analyst at the Open Technology Institute at the New America
Foundation, holds a B.A. in History from Yale University, with Andi Wilson, Policy Program Associate at the Open
Technology Institute at the New America Foundation, holds a Master of Global Affairs degree from the Munk School at
the University of Toronto, and Kevin Bankston, Policy Director at the Open Technology Institute at the New America
Foundation, former Senior Counsel and Director of the Free Expression Project at the Center for Democracy &
Technology, former Senior Staff Attorney at the Electronic Frontier Foundation, former Justice William Brennan First
Amendment Fellow at the American Civil Liberties Union, holds a J.D. from the University of Southern California Law
School, 2015 (“Doomed To Repeat History? Lessons From The Crypto Wars of the 1990s,” Report by the Open Technology
Institute at the New America Foundation, June, Available Online at https://static.newamerica.org/attachments/3407-125/Lessons%20From%20the%20Crypto%20Wars%20of%20the%201990s.882d6156dc194187a5fa51b14d55234f.pdf,
Accessed 07-06-2015, p. 19)
Strong Encryption Has Become A Bedrock Technology That Protects The
Security Of The Internet
The evolution of the ecosystem for encrypted communications has also enhanced the
protection of individual communications and improved cybersecurity. Today,
strong encryption is an essential ingredient in the overall security of the modern
network, and adopting technologies like HTTPS is increasingly considered an industry best-practice among major
technology companies.177 Even the report of the President’s Review Group on
Intelligence and Communications Technologies, the panel of experts appointed
by President Barack Obama to review the NSA’s surveillance activities after the 2013 Snowden
leaks, was unequivocal in its emphasis on the importance of strong encryption to
protect data in transit and at rest. The Review Group wrote that:
Encryption is an essential basis for trust on the Internet; without such
trust, valuable communications would not be possible. For the entire
system to work, encryption software itself must be trustworthy. Users of
encryption must be confident, and justifiably confident, that only those people
they designate can decrypt their data…. Indeed, in light of the massive
increase in cyber-crime and intellectual property theft on-line, the use of
encryption should be greatly expanded to protect not only data in transit,
but also data at rest on networks, in storage, and in the cloud.178
The report further recommended that the U.S. government should:
Promote security[] by (1) fully supporting and not undermining efforts to
create encryption standards; (2) making clear that it will not in any way
subvert, undermine, weaken, or make vulnerable generally available
commercial encryption; and (3) supporting efforts to encourage the greater
use of encryption technology for data in transit, at rest, in the cloud, and
in storage.179
1AC — Tech Competitiveness Module
U.S. government attacks on encryption destroy tech industry competitiveness.
Open Letter 15 — An Open Letter to President Obama co-signed by 36 civil society organizations (including the
American Civil Liberties Union, Electronic Frontier Foundation, Electronic Privacy Information Center, and the Free
Software Foundation), 48 technology companies and trade associations (including Apple, Facebook, Google, Microsoft,
and Yahoo), and 58 security and policy experts (including Jacob Applebaum, Eric Burger, Joan Feigenbaum, and Bruce
Schneier), the full list of signatories is available upon request under the “FYI: Open Letter To Obama” header, 2015 (Open
Letter to Obama, May 19th, Available Online at https://static.newamerica.org/attachments/3138-113/Encryption_Letter_to_Obama_final_051915.pdf, Accessed 06-29-2015, p. 1-2)
In addition to undermining cybersecurity, any kind of vulnerability mandate would
also seriously undermine our economic security. U.S. companies are already
struggling to maintain international trust in the wake of revelations about the
National Security Agency’s surveillance programs. Introducing mandatory
vulnerabilities into American products would further push many customers—be
they domestic or international, [end page 1] individual or institutional—to turn away
from those compromised products and services. Instead, they—and many of the
bad actors whose behavior the government is hoping to impact—will simply rely
on encrypted offerings from foreign providers, or avail themselves of the wide
range of free and open source encryption products that are easily available
online.
Only the plan can restore tech competitiveness. Federal action is needed.
Castro and McQuinn 15 — Daniel Castro, Vice President of the Information Technology and Innovation
Foundation—a nonprofit, non-partisan technology think tank, former IT Analyst at the Government Accountability
Office, holds an M.S. in Information Security Technology and Management from Carnegie Mellon University and a B.S. in
Foreign Service from Georgetown University, and Alan McQuinn, Research Assistant with the Information Technology
and Innovation Foundation, holds a B.S. in Political Communications and Public Relations from the University of Texas-Austin, 2015 (“Beyond the USA Freedom Act: How U.S. Surveillance Still Subverts U.S. Competitiveness,” Report by the
Information Technology & Innovation Foundation, June, Available Online at http://www2.itif.org/2015-beyond-usa-freedom-act.pdf?_ga=1.61741228.1234666382.1434075923, Accessed 07-05-2015, p. 7)
Conclusion
When historians write about this period in U.S. history it could very well be that one
of the themes will be how the United States lost its global technology leadership to
other nations. And clearly one of the factors they would point to is the long-standing
privileging of U.S. national security interests over U.S. industrial and commercial
interests when it comes to U.S. foreign policy.
This has occurred over the last few years as the U.S. government has done
relatively little to address the rising commercial challenge to U.S. technology
companies, all the while putting intelligence gathering first and foremost. Indeed,
policy decisions by the U.S. intelligence community have reverberated
throughout the global economy. If the U.S. tech industry is to remain the leader
in the global marketplace, then the U.S. government will need to set a new course
that balances economic interests with national security interests. The cost of
inaction is not only short-term economic losses for U.S. companies, but a wave of
protectionist policies that will systematically weaken U.S. technology
competitiveness in years to come, with impacts on economic growth, jobs, trade
balance, and national security through a weakened industrial base. Only by
taking decisive steps to reform its digital surveillance activities will the U.S.
government enable its tech industry to effectively compete in the global market.
1AC — Internet Freedom Module
U.S. government attacks undermine global Internet freedom. The plan is key
to bolster U.S. credibility.
Open Letter 15 — An Open Letter to President Obama co-signed by 36 civil society organizations (including the
American Civil Liberties Union, Electronic Frontier Foundation, Electronic Privacy Information Center, and the Free
Software Foundation), 48 technology companies and trade associations (including Apple, Facebook, Google, Microsoft,
and Yahoo), and 58 security and policy experts (including Jacob Applebaum, Eric Burger, Joan Feigenbaum, and Bruce
Schneier), the full list of signatories is available upon request under the “FYI: Open Letter To Obama” header, 2015 (Open
Letter to Obama, May 19th, Available Online at https://static.newamerica.org/attachments/3138-113/Encryption_Letter_to_Obama_final_051915.pdf, Accessed 06-29-2015, p. 2)
More than undermining every American’s cybersecurity and the nation’s
economic security, introducing new vulnerabilities to weaken encrypted
products in the U.S. would also undermine human rights and information
security around the globe. If American companies maintain the ability to unlock
their customers’ data and devices on request, governments other than the United
States will demand the same access, and will also be emboldened to demand the
same capability from their native companies. The U.S. government, having
made the same demands, will have little room to object. The result will be an
information environment riddled with vulnerabilities that could be exploited by
even the most repressive or dangerous regimes. That’s not a future that the
American people or the people of the world deserve.
Independently, weakening encryption makes global authoritarianism easier.
Sanchez 14 — Julian Sanchez, Senior Fellow specializing in technology, privacy, and civil liberties at the Cato
Institute, former Washington Editor for Ars Technica, holds a B.A. in Philosophy and Political Science from New York
University, 2014 (“Old Technopanic in New iBottles,” Cato at Liberty—a Cato Institute blog, September 23rd, Available
Online at http://www.cato.org/blog/old-technopanic-new-ibottles, Accessed 06-29-2015)
Second, and at the risk of belaboring the obvious, there
are lots of governments out there that no
freedom-loving person would classify as “the good guys.” Let’s pretend—for the sake
of argument, and despite everything the experts tell us—that somehow it were possible to design a
backdoor that would open for Apple or Google without being exploitable by hackers
and criminals. Even then, it would be awfully myopic to forget that our own
government is not the only one that would predictably come to these companies
with legal demands. Yahoo, for instance, was roundly denounced by American legislators
for coughing up data the Chinese government used to convict poet and dissident Shi
Tao, released just last year after nearly a decade in prison. Authoritarian governments, of course, will
do their best to prevent truly secure digital technologies from entering their
countries, but they’ll be hard pressed to do so when secure devices are being
mass-produced for western markets. An iPhone that Apple can’t unlock when
American cops come knocking for good reasons is also an iPhone they can’t
unlock when the Chinese government comes knocking for bad ones. A backdoor
mandate, by contrast, makes life easy for oppressive regimes by guaranteeing that
consumer devices are exploitable by default—presenting U.S. companies with a presence in those
countries with a horrific choice between enabling repression and endangering their foreign employees.
Case Backlines
They Say: “Freedom Act Solves”
The Freedom Act wasn’t enough.
Castro and McQuinn 15 — Daniel Castro, Vice President of the Information Technology and Innovation
Foundation—a nonprofit, non-partisan technology think tank, former IT Analyst at the Government Accountability
Office, holds an M.S. in Information Security Technology and Management from Carnegie Mellon University and a B.S. in
Foreign Service from Georgetown University, and Alan McQuinn, Research Assistant with the Information Technology
and Innovation Foundation, holds a B.S. in Political Communications and Public Relations from the University of Texas-Austin, 2015 (“Beyond the USA Freedom Act: How U.S. Surveillance Still Subverts U.S. Competitiveness,” Report by the
Information Technology & Innovation Foundation, June, Available Online at http://www2.itif.org/2015-beyond-usa-freedom-act.pdf?_ga=1.61741228.1234666382.1434075923, Accessed 07-05-2015, p. 1)
Almost two
years ago, ITIF described how revelations about pervasive digital
surveillance by the U.S. intelligence community could severely harm the
competitiveness of the United States if foreign customers turned away from U.S.-made technology and services.1 Since then, U.S. policymakers have failed to take
sufficient action to address these surveillance concerns; in some cases, they have
even fanned the flames of discontent by championing weak information security
practices.2 In addition, other countries have used anger over U.S. government
surveillance as a cover for implementing a new wave of protectionist policies
specifically targeting information technology. The combined result is a set of
policies both at home and abroad that sacrifices robust competitiveness of the
U.S. tech sector for vague and unconvincing promises of improved national
security.
ITIF estimated in 2013 that even a modest drop in the expected foreign market
share for cloud computing stemming from concerns about U.S. surveillance
could cost the United States between $21.5 billion and $35 billion by 2016.3 Since then, it has
become clear that the U.S. tech industry as a whole, not just the cloud
computing sector, has underperformed as a result of the Snowden revelations.
Therefore, the economic impact of U.S. surveillance practices will likely far
exceed ITIF’s initial $35 billion estimate. This report catalogues a wide range of
specific examples of the economic harm that has been done to U.S. businesses. In
short, foreign customers are shunning U.S. companies. The policy implication of
this is clear: Now that Congress has reformed how the National Security Agency (NSA)
collects bulk domestic phone records and allowed private firms—rather than the
government—to collect and store approved data, it is time to address other
controversial digital surveillance activities by the U.S. intelligence community.4
They Say: “Companies Solve”
Government action is needed — companies can’t do it alone.
Kehl et al. 14 — Danielle Kehl, Senior Policy Analyst at the Open Technology Institute at the New America
Foundation, holds a B.A. in History from Yale University, with Kevin Bankston, Policy Director at the Open Technology
Institute at the New America Foundation, former Senior Counsel and Director of the Free Expression Project at the Center
for Democracy & Technology, former Senior Staff Attorney at the Electronic Frontier Foundation, former Justice William
Brennan First Amendment Fellow at the American Civil Liberties Union, holds a J.D. from the University of Southern
California Law School, Robyn Greene, Policy Counsel specializing in surveillance and cybersecurity at the Open
Technology Institute at the New America Foundation, holds a J.D. from Hofstra University School of Law, and Robert
Morgus, Program Associate with the Cybersecurity Initiative and International Security Program at the New America
Foundation, 2014 (“Surveillance Costs: The NSA’s Impact on the Economy, Internet Freedom & Cybersecurity,” Report by
the Open Technology Institute of the New America Foundation, July, Available Online at
https://static.newamerica.org/attachments/184-surveillance-costs-the-nsas-impact-on-the-economy-internet-freedom-and-cybersecurity/Surveilance_Costs_Final.pdf, Accessed 07-05-2015, p. 13)
It is abundantly clear that the NSA
surveillance programs are currently having a serious,
negative impact on the U.S. economy and threatening the future competitiveness
of American technology companies. Not only are U.S. companies losing overseas
sales and getting dropped from contracts with foreign companies and
governments—they are also watching their competitive advantage in fast-growing industries like cloud computing and webhosting disappear, opening
the door for foreign companies who claim to offer “more secure” alternative
products to poach their business. Industry efforts to increase transparency and
accountability as well as concrete steps to promote better security by adopting
encryption and other best practices are positive signs, but U.S. companies cannot
solve this problem alone. “It’s not blowing over,” said Microsoft General
Counsel Brad Smith at a recent conference. “In June of 2014, it is clear it is getting worse, not
better.”98 Without meaningful government reform and better oversight, concerns
about the breadth of NSA surveillance could lead to permanent shifts in the
global technology market and do lasting damage to the U.S. economy.
They Say: “Alt Causes To Cybersecurity”
Encryption is vital to every aspect of Internet security.
Schneier 15 — Bruce Schneier, Chief Technology Officer for Counterpane Internet Security, Fellow at the Berkman
Center for Internet and Society at Harvard Law School, Program Fellow at the New America Foundation's Open
Technology Institute, Board Member of the Electronic Frontier Foundation, Advisory Board Member of the Electronic
Privacy Information Center, interviewed by Rob Price, 2015 (“Bruce Schneier: David Cameron's proposed encryption ban
would 'destroy the internet',” Business Insider, July 6th, Available Online at http://www.businessinsider.com/bruce-schneier-david-cameron-proposed-encryption-ban-destroy-the-internet-2015-7, Accessed 07-20-2015)
BI: Are there any less obvious ways in which encryption helps people on a day-to-day basis?
BS: Encryption
secures everything we do on the Internet. It secures our
commerce. It secures our communications. It secures our critical infrastructure.
It secures our persons from criminal attack, and it secures our countries from
nation-state attack. In many countries, it helps journalists, dissidents, and
human rights workers stay alive. In a world of pretty bad computer security, it is
the one thing that works well.
Backdoors devastate cybersecurity — inherent complexity and concentrated
targets.
Crypto Experts 15 — Harold Abelson, Professor of Electrical Engineering and Computer Science at the
Massachusetts Institute of Technology, Fellow of The Institute of Electrical and Electronics Engineers, Founding Director
of Creative Commons and the Free Software Foundation, holds a Ph.D. in Mathematics from the Massachusetts Institute
of Technology, et al., with Ross Anderson, Steven M. Bellovin, Josh Benaloh, Matt Blaze, Whitfield Diffie, John Gilmore,
Matthew Green, Susan Landau, Peter G. Neumann, Ronald L. Rivest, Jeffrey I. Schiller, Bruce Schneier, Michael Specter,
and Daniel J. Weitzner, qualifications of these co-authors available upon request, 2015 (“Keys Under Doormats:
Mandating insecurity by requiring government access to all data and communications,” Massachusetts Institute of
Technology Computer Science and Artificial Intelligence Laboratory Technical Report (MIT-CSAIL-TR-2015-026), July 6th,
Available Online at http://dspace.mit.edu/bitstream/handle/1721.1/97690/MIT-CSAIL-TR-2015-026.pdf, Accessed 07-20-2015, p. 2-3)
The goal of this report is to similarly analyze the newly proposed
requirement of exceptional
access to communications in today’s more complex, global information infrastructure. We find that it would pose
far more grave security risks, imperil innovation, and raise thorny issues for
human rights and international relations.
There are three general problems. First, providing exceptional access to communications would force
a U-turn from the best practices now being deployed to make the Internet more
secure. These practices include forward secrecy — where decryption keys are deleted immediately after use, so that
stealing the encryption key used by a communications server would not compromise earlier or later communications. A
related technique, authenticated encryption, uses the same temporary key to guarantee confidentiality and to verify that
the message has not been forged or tampered with.
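To make the two techniques named above concrete, here is a stdlib-only Python sketch — an illustration of the concepts, not the ciphers real systems deploy (production systems use vetted primitives like AES-GCM and ephemeral Diffie-Hellman, not a hash-based keystream). It shows authenticated encryption (encrypt-then-MAC) and the forward-secrecy idea of deleting a per-session key immediately after use:

```python
import hashlib
import hmac
import secrets

def keystream(key: bytes, length: int) -> bytes:
    """Derive a pseudorandom keystream from a session key (CTR-style hashing)."""
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:length]

def encrypt_then_mac(session_key: bytes, plaintext: bytes):
    """Authenticated encryption: XOR keystream for confidentiality, plus an
    HMAC tag over the ciphertext so forgery or tampering is detectable."""
    ct = bytes(p ^ k for p, k in zip(plaintext, keystream(session_key, len(plaintext))))
    tag = hmac.new(session_key, ct, hashlib.sha256).digest()
    return ct, tag

def decrypt_and_verify(session_key: bytes, ct: bytes, tag: bytes) -> bytes:
    """Refuse to decrypt unless the tag proves the message is untampered."""
    expected = hmac.new(session_key, ct, hashlib.sha256).digest()
    if not hmac.compare_digest(tag, expected):
        raise ValueError("message was forged or tampered with")
    return bytes(c ^ k for c, k in zip(ct, keystream(session_key, len(ct))))

# Forward secrecy in miniature: a fresh session key per conversation,
# deleted immediately after use. Stealing the server's long-term keys
# later cannot decrypt this recorded ciphertext.
session_key = secrets.token_bytes(32)
ct, tag = encrypt_then_mac(session_key, b"confidential message")
assert decrypt_and_verify(session_key, ct, tag) == b"confidential message"
del session_key  # the ephemeral key is gone; the transcript stays sealed
```

This is exactly why exceptional access forces a "U-turn": a mandated decryption capability requires keys to be retained rather than deleted, undoing the forward-secrecy property the last line depends on.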
Second, building
in exceptional access would substantially increase system
complexity. Security researchers inside and outside government agree that complexity is
the enemy of security — every new feature can interact with others to create
vulnerabilities. To achieve widespread exceptional access, new technology
features would have to be deployed and tested with literally hundreds of
thousands of developers all around the world. This is a far more complex
environment than the electronic surveillance now deployed in telecommunications and
Internet access services, which tend to use similar technologies and are more likely to have the resources to manage
vulnerabilities that may arise from new features. Features
to permit law enforcement exceptional
access across a wide range of Internet and mobile computing applications could
be particularly problematic because their typical use would be surreptitious —
making security testing difficult and less effective.
Third, exceptional access would create concentrated targets that could attract bad
actors. Security credentials that unlock the data would have to be retained by the
platform provider, law enforcement agencies, or some other trusted third party.
If law enforcement’s keys guaranteed access to everything, an attacker who
gained access to these keys would enjoy the same privilege. Moreover, law
enforcement’s stated need for rapid access to data would make it impractical to
store keys offline or split keys among multiple keyholders, as security engineers would
normally do with extremely high-value credentials. Recent attacks on the United States Government Office of
Personnel Management (OPM) show how much harm can arise when many organizations
rely on a single institution that itself has security vulnerabilities. In the case of
OPM, numerous federal agencies lost sensitive data because OPM had insecure
infrastructure. If service providers implement exceptional [end page 2] access
requirements incorrectly, the security of all of their users will be at risk.
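The card's point about splitting "keys among multiple keyholders" can be sketched in a few lines of stdlib Python. This is an illustration of simple XOR secret sharing, not any specific escrow proposal: every share is individually random noise, so no single keyholder (or stolen database) exposes the key — which is precisely the protection that law enforcement's demand for rapid, always-available access would forgo:

```python
import secrets

def split_key(key: bytes, n: int) -> list:
    """XOR secret sharing: produce n shares, each individually random.
    ALL n shares are required to reconstruct the key."""
    shares = [secrets.token_bytes(len(key)) for _ in range(n - 1)]
    last = key
    for share in shares:
        last = bytes(a ^ b for a, b in zip(last, share))
    shares.append(last)
    return shares

def combine_shares(shares: list) -> bytes:
    """Rebuild the key by XOR-ing every share together."""
    key = shares[0]
    for share in shares[1:]:
        key = bytes(a ^ b for a, b in zip(key, share))
    return key

master_key = secrets.token_bytes(32)
shares = split_key(master_key, 3)  # e.g., held by vendor, court, escrow agent
assert combine_shares(shares) == master_key
assert combine_shares(shares[:2]) != master_key  # a partial subset reveals nothing
```

Keeping the full key assembled in one place for rapid access — as exceptional-access mandates would require — collapses this design back into the single concentrated target the OPM breach exemplifies.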
They Say: “Alt Causes To Tech Industry”
The plan restores trust in U.S. companies by prohibiting attacks on encryption.
Kehl et al. 14 — Danielle Kehl, Senior Policy Analyst at the Open Technology Institute at the New America
Foundation, holds a B.A. in History from Yale University, with Kevin Bankston, Policy Director at the Open Technology
Institute at the New America Foundation, former Senior Counsel and Director of the Free Expression Project at the Center
for Democracy & Technology, former Senior Staff Attorney at the Electronic Frontier Foundation, former Justice William
Brennan First Amendment Fellow at the American Civil Liberties Union, holds a J.D. from the University of Southern
California Law School, Robyn Greene, Policy Counsel specializing in surveillance and cybersecurity at the Open
Technology Institute at the New America Foundation, holds a J.D. from Hofstra University School of Law, and Robert
Morgus, Program Associate with the Cybersecurity Initiative and International Security Program at the New America
Foundation, 2014 (“Surveillance Costs: The NSA’s Impact on the Economy, Internet Freedom & Cybersecurity,” Report by
the Open Technology Institute of the New America Foundation, July, Available Online at
https://static.newamerica.org/attachments/184-surveillance-costs-the-nsas-impact-on-the-economy-internet-freedom-and-cybersecurity/Surveilance_Costs_Final.pdf, Accessed 07-05-2015, p. 40-41)
The U.S. government should not require or request that new surveillance
capabilities or security vulnerabilities be built into communications technologies
and services, even if these are intended only to facilitate lawful surveillance.
There is a great deal of evidence that backdoors fundamentally weaken the
security of hardware and software, regardless of whether only the NSA
purportedly knows about said vulnerabilities, as some of the documents suggest. A policy
statement from the Internet Engineering Task Force in 2000 emphasized that “adding a requirement for wiretapping will
make affected protocol designs considerably more complex. Experience has shown that complexity almost inevitably
jeopardizes the security of communications.”355 More recently, a May 2013 paper from the Center for Democracy and
Technology on the risks of wiretap modifications to endpoints concludes that “deployment of an intercept capability in...
communications services, systems and applications poses serious security risks.”356 The authors add that “on balance
mandating that endpoint software vendors build intercept functionality into their products will be much more costly to
personal, economic and governmental security overall than the risks associated with not being able to wiretap all
communications.”357 While NSA programs such as SIGINT Enabling—much like proposals from domestic
law enforcement agencies to update the Communications Assistance for Law Enforcement Act (CALEA) to require digital
wiretapping capabilities in modern Internet-based communications services358—may
aim to [end page 40]
promote national security and law enforcement by ensuring that federal agencies
have the ability to intercept Internet communications, they do so at a huge cost
to online security overall.
Because of the associated security risks, the U.S. government should not mandate
or request the creation of surveillance backdoors in products, whether through
legislation, court order, or the leveraging of industry relationships to convince
companies to voluntarily insert vulnerabilities. As Bellovin et al. explain, complying with
these types of requirements would also hinder innovation and impose a “tax” on
software development in addition to creating a whole new class of
vulnerabilities in hardware and software that undermines the overall security
of the products.359 An amendment offered to the NDAA for Fiscal Year 2015 (H.R. 4435) by Representatives Zoe
Lofgren (D-CA) and Rush Holt (D-NJ) would have prohibited inserting these kinds of vulnerabilities outright.360 The
Lofgren-Holt proposal aimed to prevent “the funding of any intelligence agency,
intelligence program, or intelligence related activity that mandates or requests
that a device manufacturer, software developer, or standards organization build
in a backdoor to circumvent the encryption or privacy protections of its products,
unless there is statutory authority to make such a mandate or request.”361 Although that measure was not adopted as
part of the NDAA, a similar amendment sponsored by Lofgren along with Representatives Jim Sensenbrenner (R-WI) and
Thomas Massie (R-KY), did make it into the House-approved version of the NDAA—with the support of Internet
companies and privacy organizations362—passing on an overwhelming vote of 293 to 123.363 Like Representative
Grayson’s amendment on NSA’s consultations with NIST around encryption, it remains to be seen whether this
amendment will end up in the final appropriations bill that the President signs. Nonetheless, these legislative
efforts are a heartening sign and are consistent with recommendations from the
President’s Review Group that the U.S. government should not attempt to
deliberately weaken the security of commercial encryption products. Such
mandated vulnerabilities, whether required under statute or by court order or
inserted simply by request, unduly threaten innovation in secure Internet
technologies while introducing security flaws that may be exploited by a variety
of bad actors. A clear policy against such vulnerability mandates is necessary to
restore international trust in U.S. companies and technologies.
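The security logic of this card can be illustrated with a toy sketch (NOT real cryptography, and purely hypothetical): if every device in a product line embeds the same escrowed "backdoor" key, that key is a single master secret, and anyone who obtains it — lawfully or not — can decrypt every user's traffic.

```python
# Toy illustration only -- a deliberately simplified cipher, not a real one.
# The point: an escrowed backdoor key is a single point of failure. One
# leak, insider, or breach exposes every message ever protected with it.
import hashlib
import secrets


def keystream(key: bytes, nonce: bytes, length: int) -> bytes:
    """Derive a deterministic toy keystream from key + nonce (illustrative only)."""
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + nonce + counter.to_bytes(4, "big")).digest()
        counter += 1
    return out[:length]


def encrypt(key: bytes, message: bytes) -> tuple[bytes, bytes]:
    """XOR the message with the keystream; return (nonce, ciphertext)."""
    nonce = secrets.token_bytes(16)
    ct = bytes(a ^ b for a, b in zip(message, keystream(key, nonce, len(message))))
    return nonce, ct


def decrypt(key: bytes, nonce: bytes, ciphertext: bytes) -> bytes:
    """XOR again with the same keystream to recover the plaintext."""
    return bytes(a ^ b for a, b in zip(ciphertext, keystream(key, nonce, len(ciphertext))))


# Hypothetical mandate: every device ships with one escrowed key
# "for law enforcement access."
ESCROW_KEY = secrets.token_bytes(32)

nonce, ct = encrypt(ESCROW_KEY, b"private medical records")

# A criminal who steals the escrow key reads everything, everywhere.
# There is no way to make the keyhole work for "good guys" alone.
stolen_key = ESCROW_KEY
recovered = decrypt(stolen_key, nonce, ct)
```

The sketch assumes invented names (`ESCROW_KEY`, `keystream`) for illustration; real deployments use vetted ciphers, but the structural flaw — a shared master secret — is the same one the card describes.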
The plan restores confidence in the U.S. tech industry.
Sensenbrenner et al. 15 — Jim Sensenbrenner, Member of the United States House of Representatives (R-WI),
with Thomas Massie, Member of the United States House of Representatives (R-KY), and Zoe Lofgren, Member of the
United States House of Representatives (D-CA), 2015 (“Sensenbrenner, Massie & Lofgren Introduce Secure Data Act,”
Press Release, February 4th, Available Online at
https://lofgren.house.gov/news/documentsingle.aspx?DocumentID=397873, Accessed 06-30-2015)
These "backdoors" can also be detrimental to American jobs. Other countries buy
less American hardware and software and favor their domestic suppliers in
order to avoid compromised American products.
The Secure Data Act fixes this by prohibiting any agency from requesting or
compelling backdoors in services and products to assist with electronic
surveillance.
They Say: “Alt Causes To Economy”
Encryption backdoors destroy the Internet and the global economy.
Venezia 15 — Paul Venezia, Senior Contributing Editor at InfoWorld, has worked as a Network and Systems
Engineer for over 15 years, 2015 (“Encryption with backdoors is worse than useless -- it's dangerous,” InfoWorld, July 13th,
Available Online at http://www.infoworld.com/article/2946064/encryption/encryption-with-forced-backdoors-is-worse-than-useless-its-dangerous.html, Accessed 07-20-2015)
The fact that these officials
are even having this discussion is a bald demonstration that
they do not understand encryption or how critical it is for modern life. They're
missing a key point: The moment you force any form of encryption to contain a
backdoor, that form of encryption is rendered useless. If a backdoor exists, it will
be exploited by criminals. This is not a supposition, but a certainty. It's not an
American judge that we're worried about. It's the criminals looking for exploits.
We use strong encryption every single day. We use it on our banking sites,
shopping sites, and social media sites. We protect our credit card information
with encryption. We encrypt our databases containing sensitive information (or at
least we should). Our economy relies on strong encryption to move money around in
industries large and small.
Many high-visibility sites, such as Twitter, Google, Reddit, and YouTube, default to SSL/TLS encryption now. When
there were bugs in the libraries that support this type of encryption, the IT world moved heaven and earth to patch them
and eliminate the vulnerability. Security pros were sweating bullets for the hours, days, and in some cases weeks between
the hour Heartbleed was revealed and the hour they could finally get their systems patched -- and now politicians
with no grasp of the ramifications want to introduce a fixed vulnerability into
these frameworks. They are threatening the very foundations of not only
Internet commerce, but the health and security of the global economy.
Put simply, if backdoors are required in encryption methods, the Internet would
essentially be destroyed, and billions of people would be put at risk for identity
theft, bank and credit card fraud, and any number of other horrible outcomes.
Those of us who know how the security sausage is made are appalled that this is
a point of discussion at any level, much less nationally on two continents. It’s
abhorrent to consider.
They Say: “Alt Causes To Privacy”
Encryption is the lynchpin of freedom of opinion and expression. Without it,
intellectual privacy is impossible.
Fulton quoting Kaye 15 — Deirdre Fulton, Staff Writer for Common Dreams, quoting David Kaye, UN Special
Rapporteur and author of a report about encryption released by the United Nations Office of the High Commissioner for
Human Rights, 2015 (“No Backdoors: Online 'Zone of Privacy' is a Basic Human Right,” Common Dreams, May 29th,
Available Online at http://www.commondreams.org/news/2015/05/29/no-backdoors-online-zone-privacy-basic-human-right, Accessed 06-29-2015)
Encryption and anonymity tools, which help protect individuals' private data and
communications, are essential to basic human rights, according to a report released
Friday by the United Nations Office of the High Commissioner for Human Rights.
Issued while U.S. lawmakers are engaged in heated debates over online privacy, data collection, and so-called 'back-door'
surveillance methods, the
document recommends holding proposed limits on
encryption and anonymity to a strict standard: "If they interfere with the right to
hold opinions, restrictions must not be adopted."
The report, written by UN Special Rapporteur David Kaye, is based on questionnaire responses submitted by 16
countries, opinions submitted by 30 non-government stakeholders, and statements made at a meeting of experts in
Geneva in March.
The document reads, in part: "Encryption and anonymity, today’s leading vehicles for online
security, provide individuals with a means to protect their privacy, empowering
them to browse, read, develop and share opinions and information without
interference and enabling journalists, civil society organizations, members of
ethnic or religious groups, those persecuted because of their sexual orientation or
gender identity, activists, scholars, artists and others to exercise the rights to
freedom of opinion and expression."
Kaye makes specific mention of tech tools such as Tor, a free software that directs Internet
traffic through a free, worldwide, volunteer network consisting of more than 6,000 servers to conceal users' location and
usage from anyone conducting online surveillance.
Such tools, the report continues, "create a zone of privacy to protect opinion and belief.
The ability to search the web, develop ideas and communicate securely may be
the only way in which many can explore basic aspects of identity, such as one’s
gender, religion, ethnicity, national origin or sexuality."
Among its many recommendations for states and corporations, the document advises that
governments "avoid all measures that weaken the security that individuals may
enjoy online."
It cites "broadly intrusive measures" such as the back-doors that tech companies
build into their products in order to facilitate law enforcement access to
encrypted content. In the U.S., FBI Director James Comey and NSA chief Adm. Michael Rogers have both
expressed support for backdoors and other restrictions on encryption.
The problem with all of those approaches is that they "inject a basic vulnerability
into secure systems," Kaye told the Washington Post. "It results in insecurity for everyone
even if intended to be for criminal law enforcement purposes."
Encryption is crucial to intellectual privacy.
Richards 15 — Neil M. Richards, Professor of Law at the Washington University School of Law, former law clerk to
Chief Justice William H. Rehnquist, holds a J.D. and an M.A. in Legal History from the University of Virginia, 2015 (“How
encryption protects our intellectual privacy (and why you should care),” Wired UK, January 7th, Available Online at
http://www.wired.co.uk/magazine/archive/2015/05/features/encryption-intellectual-privacy, Accessed 07-16-2015)
As the debates about government surveillance and privacy continue, some influential
politicians (including the prime minister) have started to pick on encryption.
Encrypted communications and files have long been an important element of digital security, but strong crypto is
coming under fire from those who claim it is a threat to our safety, and that the
security services must have mandated back doors to our encrypted devices and
communications.
The political argument is framed as one of safety couched in fear. Bad people want to hurt us,
and our security services claim a need to peer into the most intimate corners of our lives. Encryption matters
because it protects our intellectual privacy -- our ability to be protected from
surveillance or interference when we are making sense of the world by thinking,
reading and speaking privately with those we trust. More and more, the acts of
reading, thinking, and private communication are mediated by electronic
technologies like computers, tablets, ebooks and smartphones. Whenever we shop, read, speak,
and think, we now do so using devices that create records of these activities.
When we are watched, tracked and monitored, we act differently. There's an
increasing body of evidence that internet surveillance stops us from reading
unpopular or controversial ideas. Remember that our most cherished ideas -- that
people should control the government, that heretics should not be burned at the
stake and that all people are equal -- were once unpopular and controversial
ideas. A free society should not fear dangerous ideas, and does not need
complete intellectual surveillance. Existing forms of surveillance and policing
are enough.
Encryption and intellectual privacy will of course make it more difficult for the
security services to do their jobs, but so too do our other civil liberties of free
speech, democratic control of the police and military, and requiring a warrant
before the government enters our homes or reads our mail.
We are more secure when we have hope than when we are filled with fear and
treated like potentially naughty children. Encryption promotes this kind of
political security. It promotes other kinds of security as well; after all, a back door
for the government can also be a back door for criminal hackers.
Fundamentally, intellectual privacy stops our lives from becoming completely
transparent, leaving room for diversity, eccentricity and being able to think for
ourselves. The technologies of strong encryption are a powerful practical
protection for these most vital of civil liberties. Contrary to the politicians, a
world without crypto would definitely be less secure. It would also be less free
and much less interesting.
Privacy Outweighs Security
Freedom outweighs security in the context of encryption.
Zdziarski 14 — Jonathan Zdziarski, considered one of the world’s foremost experts in iOS related digital forensics
and security, Principal Software Engineer at Accessdata Group LLC, former Senior Forensic Scientist at Viaforensics,
former Senior Research Scientist at Barracuda Networks, 2014 (“The Politics Behind iPhone Encryption and the FBI,”
Zdziarski's Blog of Things—Jonathan Zdziarski’s blog, September 25th, Available Online at
http://www.zdziarski.com/blog/?p=3894, Accessed 07-20-2015)
While perhaps a heartfelt utopian ideal, the fact is that demanding
back doors for law enforcement
is selfish. It’s selfish in that they want a backdoor to serve their own interests.
Non law-enforcement types, such as Orin Kerr, a reporter who wrote a piece in the Washington Post
supporting FBI backdoors (and then later changed his mind), are being selfish by demanding that
others give up their privacy to make them feel safer. This is the absolute opposite
of a society where law enforcement serves the public interest. What Kerr, and
anyone supporting law enforcement back doors, really wants, is a society that
caters to their fears at the expense of others’ privacy.
While we individually choose to trust the law enforcement we come in contact
with, government only works if we inherently and collectively distrust it on a
public level. Our public policies and standards should distrust those we have put
in a position to watch over us. Freedom only works when the power is balanced
toward the citizens; providing the government with the ability to choose to
invade our 1st, 4th and 5th Amendment rights is only an invitation to lose control of our
own freedom. Deep inside this argument is not one of public safety, but a
massive power grab away from the people’s right to privacy. Whether everyone involved
realizes that, it is privacy itself that is at stake.
Our founding fathers were aware that distrusting government was essential to
freedom, and that’s why they used encryption. In fact, because of their own example in concealing
correspondence, one can make an even stronger case supporting encryption as an instrument of free speech. The
constitution is the highest law of the land – it’s above all other laws. Historically,
our founding fathers guarded all instruments available that protect our freedom
as beyond the law’s reach: The Press, Firearms, Assembly. These things provided information, teeth, and
consensus. Modern encryption is just as essential to our freedom as a country as
firearms, and are the teeth that guarantees our freedom of speech and freedom
from fear to speak and communicate.
Encryption today still just as vital to free speech as it was in the 1700s, and to
freedom itself. What’s at stake here is so much bigger than solving a crime. The
excuse of making us safer has been beaten to death as a means to take away
freedoms for hundreds of years. Don’t be so naive to think that this time, it’s any
different.
Cybersecurity Outweighs Terrorism
Cybersecurity costs of weakened encryption outweigh counter-terrorism
benefits.
Doctorow 14 — Cory Doctorow, journalist and science fiction author, Co-Editor of Boing Boing, Fellow at the
Electronic Frontier Foundation, former Canadian Fulbright Chair for Public Diplomacy at the Center on Public Diplomacy
at the University of Southern California, recipient of the Electronic Frontier Foundation’s Pioneer Award, 2014 (“A World
of Control and Surveillance,” Information Doesn't Want to Be Free: Laws for the Internet Age, Published by McSweeney’s,
ISBN 9781940450285, p. 125-126)
The Edward Snowden leaks left much of the world in shock. Even the most
paranoid security freaks were astounded to learn about the scope of the
surveillance apparatus that had been built by the NSA, along with its allies in the "Five Eyes"
countries (the UK, Canada, New Zealand, and Australia).
The nontechnical world was most shocked by the revelation that the NSA was snaffling up such unthinkable mountains
of everyday communications. In some countries, the NSA is actively recording every single cell-phone conversation,
putting millions of indisputably innocent people under surveillance without even a hint of suspicion.
But in
the tech world, the real showstopper was the news that the NSA and the UK's spy
agency, the GCHQ, had been spending $250 million a year on two programs of
network and computer sabotage — BULLRUN, in the USA, and EDGEHILL, in the UK.
Under these programs, technology companies are bribed, blackmailed, or
tricked into introducing deliberate flaws into their products, so that spies can
break into them and violate their users' privacy. The NSA even sabotaged U.S.
government agencies, such as the National Institute for Standards and Technology (NIST), a rock-ribbed
expert body that produces straightforward engineering standards to make sure that our digital infrastructure doesn't fall
over. NIST
was forced to recall one of its cryptographic standards after it became
apparent that the NSA had infiltrated its process and deliberately weakened the
standard — an act akin to deliberately ensuring that the standard for electrical
wiring was faulty, so that you could start house fires in the homes of people you
wanted to smoke out during armed standoffs.
The sabotage shocked so many technology experts because they understood that
there was no such thing as a security flaw that could be [end page 125] exploited by
"the good guys" alone. If you weaken the world's computer security — the
security of our planes and nuclear reactors, our artificial hearts and our
thermostats, and, yes, our phones and our laptops, devices that are privy to our
every secret — then no amount of gains in the War on Terror will balance out
the costs we'll all pay in vulnerability to crooks, creeps, spooks, thugs, perverts,
voyeurs, and anyone else who independently discovers these deliberate flaws
and turns them against targets of opportunity.
The benefits of encryption outweigh the costs.
Friedersdorf 15 — Conor Friedersdorf, Staff Writer for The Atlantic, 2015 (“Do Encrypted Phones Threaten
National Security?,” The Atlantic, July 15th, Available Online at
http://www.theatlantic.com/politics/archive/2015/07/does-encryption-threaten-national-security/398573/, Accessed
07-20-2015)
But even
granting that there are scenarios in which strong encryption could result
in costs that the public wouldn’t have had to bear in an alternative world with
less privacy, a civil-liability framework would have to account for the opposite
sort of case, in which costs were imposed on consumers or society by
insufficiently strong encryption.
Say a foreign government, a terrorist organization, or black-hat hackers
compromise the smartphone exchanges of David Petraeus or Hillary Clinton or the
head of security at a nuclear power plant or a systems administrator at the New
York Stock Exchange. What costs will the public bear that wouldn’t exist in an
alternative world with end-to-end encryption? And what of the costs born by
average citizens when their privacy is compromised? I strongly suspect the costs of
insecurity are much higher than the costs that would be born in a world of end-to-end encryption, though of course there’s no way to prove that judgment definitively.
FYI: Open Letter To Obama
Civil society organizations that signed the Open Letter to Obama include:
Access
Advocacy for Principled Action in Government
American-Arab Anti-Discrimination Committee (ADC)
American Civil Liberties Union
American Library Association
Benetech
Bill of Rights Defense Committee
Center for Democracy & Technology
Committee to Protect Journalists
The Constitution Project
Constitutional Alliance
Council on American-Islamic Relations
Demand Progress
Defending Dissent Foundation
DownsizeDC.org, Inc.
Electronic Frontier Foundation
Electronic Privacy Information Center (EPIC)
Engine
Fight for the Future
Free Press
Free Software Foundation
Freedom of the Press Foundation
GNOME Foundation
Human Rights Watch
The Media Consortium
New America's Open Technology Institute
Niskanen Center
Open Source Initiative
PEN American Center
Project Censored/Media Freedom Foundation
R Street
Reporters Committee for Freedom of the Press
TechFreedom
The Tor Project
U.S. Public Policy Council of Association for Computing Machinery
World Privacy Forum
X-Lab
Companies and trade associations that signed the Open Letter to Obama
include:
ACT | The App Association
Adobe
Apple Inc.
The Application Developers Alliance
Automattic
Blockstream
Cisco Systems
Coinbase
Cloud Linux Inc.
CloudFlare
Computer & Communications Industry Association
Consumer Electronics Association (CEA)
Context Relevant
The Copia Institute
CREDO Mobile
Data Foundry
Dropbox
Evernote
Facebook
Gandi.net
Golden Frog
Google
HackerOne
Hackers/Founders
Hewlett-Packard Company
Internet Archive
Internet Association
Internet Infrastructure Coalition (i2Coalition)
Level 3 Communications
LinkedIn
Microsoft
Misk.com
Mozilla
Open Spectrum Inc.
Rackspace
Rapid7
Reform Government Surveillance
Sonic
ServInt
Silent Circle
Slack Technologies, Inc.
Symantec
Tech Assets Inc.
TechNet
Tumblr
Twitter
Wikimedia Foundation
Yahoo
Security and policy experts that signed the Open Letter to Obama include:
Hal Abelson, Professor of Computer Science and Engineering, Massachusetts Institute of Technology
Ben Adida, VP Engineering, Clever Inc.
Jacob Appelbaum, The Tor Project
Adam Back, PhD, Inventor, HashCash, Co-Founder & President, Blockstream
Alvaro Bedoya, Executive Director, Center on Privacy & Technology at Georgetown Law
Brian Behlendorf, Open Source software pioneer
Steven M. Bellovin, Percy K. and Vida L.W. Hudson Professor of Computer Science, Columbia University
Matt Bishop, Professor of Computer Science, University of California at Davis
Matthew Blaze, Director, Distributed Systems Laboratory, University of Pennsylvania
Dan Boneh, Professor of Computer Science and Electrical Engineering at Stanford University
Eric Burger, Research Professor of Computer Science and Director, Security and Software Engineering Research Center
(Georgetown), Georgetown University
Jon Callas, CTO, Silent Circle
L. Jean Camp, Professor of Informatics, Indiana University
Richard A. Clarke, Chairman, Good Harbor Security Risk Management
Gabriella Coleman, Wolfe Chair in Scientific and Technological Literacy, McGill University
Whitfield Diffie, Dr. sc. techn., Center for International Security and Cooperation, Stanford University
David Evans, Professor of Computer Science, University of Virginia
David J. Farber, Alfred Fitler Moore Professor Emeritus of Telecommunications, University of Pennsylvania
Dan Farmer, Security Consultant and Researcher, Vicious Fishes Consulting
Rik Farrow, Internet Security
Joan Feigenbaum, Department Chair and Grace Murray Hopper Professor of Computer Science, Yale University
Richard Forno, Jr. Affiliate Scholar, Stanford Law School Center for Internet and Society
Alex Fowler, Co-Founder & SVP, Blockstream
Jim Fruchterman, Founder and CEO, Benetech
Daniel Kahn Gillmor, ACLU Staff Technologist
Robert Graham, creator of BlackICE, sidejacking, and masscan
Jennifer Stisa Granick, Director of Civil Liberties, Stanford Center for Internet and Society
Matthew D. Green, Assistant Research Professor, Johns Hopkins University Information Security Institute
Robert Hansen, Vice President of Labs at WhiteHat Security
Lance Hoffman, Director, George Washington University, Cyber Security Policy and Research Institute
Marcia Hofmann, Law Office of Marcia Hofmann
Nadim Kobeissi, PhD Researcher, INRIA
Joseph Lorenzo Hall, Chief Technologist, Center for Democracy & Technology
Nadia Heninger, Assistant Professor, Department of Computer and Information Science, University of Pennsylvania
David S. Isenberg, Producer, Freedom 2 Connect
Douglas W. Jones, Department of Computer Science, University of Iowa
Susan Landau, Worcester Polytechnic Institute
Gordon Fyodor Lyon, Founder, Nmap Security Scanner Project
Aaron Massey, Postdoctoral Fellow, School of Interactive Computing, Georgia Institute of Technology
Jonathan Mayer, Graduate Fellow, Stanford University
Jeff Moss, Founder, DEF CON and Black Hat security conferences
Peter G. Neumann, Senior Principal Scientist, SRI International Computer Science Lab, Moderator of the ACM Risks
Forum
Ken Pfeil, former CISO at Pioneer Investments
Ronald L. Rivest, Vannevar Bush Professor, Massachusetts Institute of Technology
Paul Rosenzweig, Professorial Lecturer in Law, George Washington University School of Law
Jeffrey I. Schiller, Area Director for Security, Internet Engineering Task Force (1994-2003), Massachusetts Institute of
Technology
Bruce Schneier, Fellow, Berkman Center for Internet and Society, Harvard Law School
Micah Sherr, Assistant Professor of Computer Science, Georgetown University
Adam Shostack, author, “Threat Modeling: Designing for Security”
Eugene H. Spafford, CERIAS Executive Director, Purdue University
Alex Stamos, CISO, Yahoo
Geoffrey R. Stone, Edward H. Levi Distinguished Service Professor of Law, The University of Chicago
Peter Swire, Huang Professor of Law and Ethics, Scheller College of Business, Georgia Institute of Technology
C. Thomas (Space Rogue), Security Strategist, Tenable Network Security
Dan S. Wallach, Professor, Department of Computer Science and Rice Scholar, Baker Institute of Public Policy
Nicholas Weaver, Researcher, International Computer Science Institute
Chris Wysopal, Co-Founder and CTO, Veracode, Inc.
Philip Zimmermann, Chief Scientist and Co-Founder, Silent Circle
CPs
CISA CP
CISA doesn’t solve cybersecurity.
Greenberg 15 — Andy Greenberg, Senior Writer at Wired covering security, privacy, information freedom, and
hacker culture, 2015 (“CISA Security Bill: An F for Security But an A+ for Spying,” Wired, March 20th, Available Online at
http://www.wired.com/2015/03/cisa-security-bill-gets-f-security-spying/, Accessed 07-06-2015)
More False Warnings Than Real Threats
For those who value security over privacy, CISA’s surveillance compromises might seem acceptable. But questions
persist about whether CISA would even do much to improve security. Robert
Graham, a security researcher and an early inventor of intrusion prevention systems, says CISA
will lead to sharing of more false positives than real threat information. Skilled
hackers, he says, know how to evade intrusion prevention systems, intrusion
detection systems, firewalls, and antivirus software. Meanwhile, most data alerts
from systems shared under CISA will be false alarms. “If we had seen the
information from the Sony hackers ahead of time, we still wouldn’t have been
able to pick it out from the other information we were getting,” Graham says, in reference
to the epic hack of Sony Pictures Entertainment late last year. “The reality is that even if you have the
information ahead of time, you really can’t pick the needle from the haystack.”
Graham points to the more informal information sharing that already occurs in
the private sector thanks to companies that manage the security of large client
bases. “Companies like IBM and Dell SecureWorks already have massive
‘cybersecurity information sharing’ systems where they hoover up large
quantities of threat information from their customers,” Graham wrote in a blog post
Wednesday. “This rarely allows them to prevent attacks as the CISA bill promises. In
other words, we’ve tried the CISA experiment, and we know it doesn’t really work.”
In his statement excoriating CISA last week, Senator Ron Wyden—the only member of the
intelligence committee to vote against the bill—agreed. He wrote that CISA not only lacks privacy protections,
but that “it will have a limited impact on US cybersecurity.”
But Wyden went further than calling CISA ineffective. Citing its privacy loopholes, he questioned the
fundamental intention of the legislation as it’s currently written. “If information-sharing legislation
does not include adequate privacy protections then that’s not a cybersecurity bill,” he wrote. “It’s a surveillance
bill by another name.”
CISA is a net-negative for cybersecurity.
Granick 15 — Jennifer Granick, Director of Civil Liberties at the Stanford Center for Internet and Society, former
Civil Liberties Director at the Electronic Frontier Foundation, holds a J.D. from the University of California Hastings
College of the Law, 2015 (“Sloppy Cyber Threat Sharing Is Surveillance by Another Name,” Just Security, June 29th,
Available Online at http://justsecurity.org/24261/sloppy-cyber-threat-sharing-surveillance/, Accessed 07-07-2015)
Normally, your email provider wouldn’t be allowed to give this information over without your consent or a search
warrant. But that could soon change. The
Senate may soon make another attempt at passing the
Cybersecurity Information Sharing Act, a bill that would waive privacy laws in the name of
cybersecurity. In April, the US House of Representatives passed by strong majorities two similar “cyber threat”
information sharing bills. These bills grant companies immunity for giving DHS information about network attacks,
attackers, and online crimes.
Sharing information about security vulnerabilities is a good idea. Shared vulnerability
data empowers other system operators to check and see if they, too, have been attacked, and also to guard against being
similarly attacked in the future. I’ve spent most of my career fighting for researchers’ rights to share this kind of
information against threats from companies that didn’t want their customers to know their products were flawed.
But, these bills gut legal protections against government fishing expeditions
exactly at a time when individuals and Internet companies need privacy laws to
get stronger, not weaker.
Worse, the bills aren’t needed. Private companies share threat data with each other,
and even with the government, all the time. The threat data that security
professionals use to protect networks from future attacks is a far more narrow
category of information than those included in the bills being considered by
Congress, and will only rarely contain private information.
And none of the recent cyberattacks — not Sony, not Target, and not the devastating grab
of sensitive background check interviews on government employees at the Office of Personnel Management —
would have been mitigated by these bills.
None of this has stopped private companies from crowing about their need for corporate immunity, but it should stop
Congress from giving it to them. We
don’t need to pass laws gutting privacy rights to save
cybersecurity.
These bills aren’t needed and aren’t designed to encourage sharing the right
kind of information. These are surveillance bills masquerading as security bills.
Instead of removing (non-existent) barriers to sharing — and undermining American privacy in the process — Congress
should consider how to make sharing worthwhile. I’ve been told by many entities, corporate and academic, that they
don’t share with the government because the government doesn’t share back. Silicon Valley engineers have wondered
aloud what value DHS has to offer in their efforts to secure their employer’s services. It’s not like DHS is setting a great
security example for anyone to follow. OPM’s Inspector General warned the government about security problems that,
left unaddressed, led to the OPM breach.
And there’s a very serious trust issue. We recently learned that the NSA is sitting on the domestic Internet backbone,
searching for foreign cyberthreats, helping the FBI and thinking about how to get authority to scan more widely. You can
see the lifecycle now. Vulnerable company reports a threat to DHS, NSA programs its computers to search for that threat,
vulnerable company’s proprietary data gets sucked in by FBI. Any company has to think at least twice about sharing how
they are vulnerable with a government that hoards security vulnerabilities and exploits them to conduct massive
surveillance.
Cybersecurity is a serious problem, but it’s not going to get better with Congress doing
whatever it politically can instead of doing what it should. It’s not going to get better by neutering the few
privacy protections we have. Good security is supposed to keep your
information safe. But these laws will make your private emails and information
vulnerable. Lawmakers have got to start listening to experts, and experts are saying the same
thing. Don’t just do something, do the right thing. And if you can’t do the right
thing, then don’t do anything at all.
NIST “Golden Key” CP
Permute: do both. Research on a Golden Key can continue, but backdoors
should be prohibited now.
Permute: do the counterplan. It’s plan plus — the “except if a Golden Key
works” provision isn’t textually or functionally competitive because a Golden
Key is mathematically impossible. This doesn’t sever: the counterplan is no
different than “except if 2+2=5.”
Doesn’t solve tech competitiveness. Holding out hope for a Golden Key sends
the signal that the U.S. is committed to backdooring encryption. In response,
international customers will choose non-U.S. companies that can guarantee
products without backdoors.
A Golden Key is mathematically impossible — experts.
Geller 15 — Eric Geller, Deputy Morning Editor at The Daily Dot—the “hometown newspaper of the Internet,” 2015
(“The rise of the new Crypto War,” The Daily Dot, July 10th, Available Online at
http://www.dailydot.com/politics/encryption-crypto-war-james-comey-fbi-privacy/, Accessed 07-20-2015)
To researchers who have spent their careers studying code, the FBI’s belief that it
can shut down the development of strong cryptography is ludicrous. Code, after
all, is just math.
“This all requires an idea that people just won’t innovate in areas where the
government doesn’t like them to,” said Cohn. “And that’s really never been the case.”
Hall said, “You’re basically trying to prevent people from doing certain kinds of
math.”
Philip Zimmerman, the inventor of the widely used PGP encryption scheme, neatly summed up the problem with
government encryption limits when he told a 1996 Senate subcommittee hearing, “Trying
to stop this is like
trying to legislate the tides and the weather.”
The mathematical nature of encryption is both a reassurance that it cannot be
banned and a reminder that it cannot be massaged to fit an agenda—even
agendas ostensibly meant to save lives. Software engineers simply cannot build
backdoors that do what the FBI wants without serious security vulnerabilities,
owing to the fundamental mathematical nature of cryptography. It is no more
possible for the FBI to design a secure backdoor than it is for the National
Weather Service to stop a hurricane in its tracks.
“No amount of presto change-o is going to change the math,” Cohn said. “Some
people say time is on their side; I think that math is on our side when it comes to
crypto.”
* Cohn = Cindy Cohn, Executive Director and former Legal Director and General Counsel of the Electronic Frontier
Foundation, holds a J.D. from the University of Michigan Law School; Hall = Joseph Hall, chief technologist at the Center
for Democracy & Technology
Creating a “law enforcement-only” backdoor is technically impossible.
Schneier 15 — Bruce Schneier, Chief Technology Officer for Counterpane Internet Security, Fellow at the Berkman
Center for Internet and Society at Harvard Law School, Program Fellow at the New America Foundation's Open
Technology Institute, Board Member of the Electronic Frontier Foundation, Advisory Board Member of the Electronic
Privacy Information Center, interviewed by Rob Price, 2015 (“Bruce Schneier: David Cameron's proposed encryption ban
would 'destroy the internet',” Business Insider, July 6th, Available Online at http://www.businessinsider.com/bruce-schneier-david-cameron-proposed-encryption-ban-destroy-the-internet-2015-7, Accessed 07-20-2015)
BI: Is
there really no way to keep users' data secure while providing backdoors to
law enforcement?
BS: Yes, there really is no way.
Think of it like this. Technically, there is no such thing as a "backdoor to law
enforcement." Backdoor access is a technical requirement, and limiting access to
law enforcement is a policy requirement. As an engineer, I cannot design a
system that works differently in the presence of a particular badge or a signed
piece of paper. I have two options. I can design a secure system that has no
backdoor access, meaning neither criminals nor foreign intelligence agencies nor
domestic police can get at the data. Or I can design a system that has backdoor
access, meaning they all can. Once I have designed this less-secure system with
backdoor access, I have to install some sort of policy overlay to try to ensure that
only the police can get at the backdoor and only when they are authorized. I can
design and build procedures and other measures intended to prevent those bad
guys from getting access, but anyone who has followed all of the high-profile
hacking over the past few years knows how futile that would be.
There is an important principle here: we have one world and one Internet. Protecting
communications means protecting them from everybody. Making
communications vulnerable to one group means making them vulnerable to all.
There just isn't any way around that.
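Schneier's point that "backdoor access is a technical requirement" while "limiting access to law enforcement is a policy requirement" can be made concrete with a toy sketch. This is a deliberately insecure illustration, not real cryptography, and every name in it (the escrow key, the messages) is a hypothetical assumption, not anything from the card: once a second key can decrypt the data, decryption depends only on possessing that key, and nothing in the math can check a badge or a warrant.

```python
# Toy illustration (NOT real cryptography) of Schneier's two options:
# access is a property of key possession, not of authorization.
import hashlib


def keystream(key: bytes, n: int) -> bytes:
    """Derive n pseudorandom bytes from a key (toy SHA-256 counter mode)."""
    out = b""
    counter = 0
    while len(out) < n:
        out += hashlib.sha256(key + counter.to_bytes(4, "big")).digest()
        counter += 1
    return out[:n]


def encrypt(key: bytes, plaintext: bytes) -> bytes:
    """XOR the plaintext with the keystream."""
    return bytes(a ^ b for a, b in zip(plaintext, keystream(key, len(plaintext))))


decrypt = encrypt  # XOR stream cipher: the same operation both ways

# A "backdoored" design escrows a second key that also decrypts everything.
user_key = b"user-secret"
escrow_key = b"law-enforcement-escrow"

msg = b"private message"
ct_for_user = encrypt(user_key, msg)
ct_for_escrow = encrypt(escrow_key, msg)  # a copy sealed to the escrow key

# Whoever holds escrow_key -- police, a foreign service, or a thief who
# stole it -- decrypts identically. Nothing in the math checks a warrant.
assert decrypt(escrow_key, ct_for_escrow) == msg
```

The sketch has exactly Schneier's structure: either `escrow_key` exists (and anyone holding it gets in) or it doesn't (and nobody does); "only with a signed piece of paper" is not a branch the code can take.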
A “golden key” system would create inherent vulnerabilities — experts.
Crypto Experts 15 — Harold Abelson, Professor of Electrical Engineering and Computer Science at the
Massachusetts Institute of Technology, Fellow of The Institute of Electrical and Electronics Engineers, Founding Director
of Creative Commons and the Free Software Foundation, holds a Ph.D. in Mathematics from the Massachusetts Institute
of Technology, et al., with Ross Anderson, Steven M. Bellovin, Josh Benaloh, Matt Blaze, Whitfield Diffie, John Gilmore,
Matthew Green, Susan Landau, Peter G. Neumann, Ronald L. Rivest, Jeffrey I. Schiller, Bruce Schneier, Michael Specter,
and Daniel J. Weitzner, qualifications of these co-authors available upon request, 2015 (“Keys Under Doormats:
Mandating insecurity by requiring government access to all data and communications,” Massachusetts Institute of
Technology Computer Science and Artificial Intelligence Laboratory Technical Report (MIT-CSAIL-TR-2015-026), July 6th,
Available Online at http://dspace.mit.edu/bitstream/handle/1721.1/97690/MIT-CSAIL-TR-2015-026.pdf, Accessed 07-20-2015, p. abstract)
Twenty years ago, law enforcement organizations lobbied to require data and communication services to engineer their
products to guarantee law enforcement access to all data. After lengthy debate and vigorous predictions of enforcement
channels “going dark,” these attempts to regulate the emerging Internet were abandoned. In the intervening years,
innovation on the Internet flourished, and law enforcement agencies found new and more effective means of accessing
vastly larger quantities of data. Today we are again hearing calls for regulation to mandate the provision of exceptional
access mechanisms. In this report, a group of computer scientists and security experts, many of whom participated in a
1997 study of these same topics, has convened to explore the likely effects of imposing extraordinary access mandates.
We have found that the
damage that could be caused by law enforcement exceptional
access requirements would be even greater today than it would have been 20
years ago. In the wake of the growing economic and social cost of the
fundamental insecurity of today’s Internet environment, any proposals that alter
the security dynamics online should be approached with caution. Exceptional
access would force Internet system developers to reverse “forward secrecy”
design practices that seek to minimize the impact on user privacy when systems
are breached. The complexity of today’s Internet environment, with millions of
apps and globally connected services, means that new law enforcement
requirements are likely to introduce unanticipated, hard to detect security
flaws. Beyond these and other technical vulnerabilities, the prospect of globally
deployed exceptional access systems raises difficult problems about how such an
environment would be governed and how to ensure that such systems would
respect human rights and the rule of law.
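The "forward secrecy" design practice the report says exceptional access would force developers to reverse can be sketched with a toy Diffie-Hellman exchange. This is a minimal illustration under assumed toy parameters (the prime is far too small for real use), not the report's code: because each session's secrets are generated fresh and thrown away, there is no long-term key that could be escrowed and later used to unseal past traffic.

```python
# Toy sketch of forward secrecy via ephemeral Diffie-Hellman (NOT for real use).
import hashlib
import secrets

P = 2**127 - 1  # a Mersenne prime; fine for a toy, far too small in practice
G = 3


def ephemeral_session_key() -> bytes:
    """Derive a one-session key from fresh, throwaway DH secrets."""
    a = secrets.randbelow(P - 2) + 1      # Alice's ephemeral secret
    b = secrets.randbelow(P - 2) + 1      # Bob's ephemeral secret
    shared_a = pow(pow(G, b, P), a, P)    # what Alice computes
    shared_b = pow(pow(G, a, P), b, P)    # what Bob computes
    assert shared_a == shared_b           # both sides agree on the secret
    # a and b are discarded when this function returns; once they are gone,
    # no long-term (escrowable) key can recompute this session's key later.
    return hashlib.sha256(str(shared_a).encode()).digest()


k1 = ephemeral_session_key()
k2 = ephemeral_session_key()
assert k1 != k2  # fresh secrets per session: a breach today reveals nothing past
```

An exceptional-access mandate inverts this: some party must retain a key that decrypts everything, so every past session becomes exposed the moment that key leaks, which is the widened attack surface the report describes.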
Prefer our evidence — we cite the world’s leading experts. They cite people
who don’t understand math.
Crypto Experts 15 — Harold Abelson, Professor of Electrical Engineering and Computer Science at the
Massachusetts Institute of Technology, Fellow of The Institute of Electrical and Electronics Engineers, Founding Director
of Creative Commons and the Free Software Foundation, holds a Ph.D. in Mathematics from the Massachusetts Institute
of Technology, et al., with Ross Anderson, Steven M. Bellovin, Josh Benaloh, Matt Blaze, Whitfield Diffie, John Gilmore,
Matthew Green, Susan Landau, Peter G. Neumann, Ronald L. Rivest, Jeffrey I. Schiller, Bruce Schneier, Michael Specter,
and Daniel J. Weitzner, qualifications of these co-authors available upon request, 2015 (“Keys Under Doormats:
Mandating insecurity by requiring government access to all data and communications,” Massachusetts Institute of
Technology Computer Science and Artificial Intelligence Laboratory Technical Report (MIT-CSAIL-TR-2015-026), July 6th,
Available Online at http://dspace.mit.edu/bitstream/handle/1721.1/97690/MIT-CSAIL-TR-2015-026.pdf, Accessed 07-20-2015, p. 1)
Political and law enforcement leaders in the United States and the United Kingdom have called
for Internet systems to be redesigned to ensure government access to information
— even encrypted information. They argue that the growing use of encryption will neutralize
their investigative capabilities. They propose that data storage and communications systems
must be designed for exceptional access by law enforcement agencies. These
proposals are unworkable in practice, raise enormous legal and ethical
questions, and would undo progress on security at a time when Internet
vulnerabilities are causing extreme economic harm.
As computer scientists with extensive security and systems experience, we
believe that law enforcement has failed to account for the risks inherent in
exceptional access systems. Based on our considerable expertise in real-world
applications, we know that such risks lurk in the technical details. In this report we
examine whether it is technically and operationally feasible to meet law enforcement’s call for exceptional access without
causing large-scale security vulnerabilities. We take no issue here with law enforcement’s desire to execute lawful
surveillance orders when they meet the requirements of human rights and the rule of law. Our
strong
recommendation is that anyone proposing regulations should first present
concrete technical requirements, which industry, academics, and the public can
analyze for technical weaknesses and for hidden costs.
Even the President’s Review Group votes aff.
Open Letter 15 — An Open Letter to President Obama co-signed by 36 civil society organizations (including the
American Civil Liberties Union, Electronic Frontier Foundation, Electronic Privacy Information Center, and the Free
Software Foundation), 48 technology companies and trade associations (including Apple, Facebook, Google, Microsoft,
and Yahoo), and 58 security and policy experts (including Jacob Applebaum, Eric Burger, Joan Feigenbaum, and Bruce
Schneier), the full list of signatories is available upon request under the “FYI: Open Letter To Obama” header, 2015 (Open
Letter to Obama, May 19th, Available Online at https://static.newamerica.org/attachments/3138-113/Encryption_Letter_to_Obama_final_051915.pdf, Accessed 06-29-2015, p. 2)
The Administration faces a critical choice: will it adopt policies that foster a
global digital ecosystem that is more secure, or less? That choice may well define
the future of the Internet in the 21st century. When faced with a similar choice at
the end of the last century, during the so-called “Crypto Wars”, U.S. policymakers
weighed many of the same concerns and arguments that have been raised in the current debate,
and correctly concluded that the serious costs of undermining encryption
technology outweighed the purported benefits. So too did the President’s Review
Group on Intelligence and Communications Technologies, who unanimously
recommended in their December 2013 report that the US Government should “(1) fully
support and not undermine efforts to create encryption standards; (2) not in any
way subvert, undermine, weaken, or make vulnerable generally available
commercial software; and (3) increase the use of encryption and urge US
companies to do so, in order to better protect data in transit, at rest, in the cloud,
and in other storage.”
We urge the Administration to follow the Review Group’s recommendation and
adopt policies that promote rather than undermine the widespread adoption of
strong encryption technologies, and by doing so help lead the way to a more
secure, prosperous, and rights-respecting future for America and for the world.
They Say: “Research Still Good”
The debate has already been decided. On our side is every expert. On their side
is wishful thinking. More research is pointless.
McLaughlin 15 — Jenna McLaughlin, Reporter and Blogger covering surveillance and national security for The
Intercept, former national security and foreign policy reporter and editorial fellow at Mother Jones, 2015 (“FBI Director Says
Scientists Are Wrong, Pitches Imaginary Solution to Encryption Dilemma,” The Intercept, July 8th, Available Online at
https://firstlook.org/theintercept/2015/07/08/fbi-director-comey-proposes-imaginary-solution-encryption/, Accessed
07-20-2015)
Testifying before two Senate committees on Wednesday about the threat he says
strong encryption presents to law enforcement, FBI Director James Comey didn’t so much
propose a solution as wish for one.
Comey said he needs some way to read and listen to any communication for which he’s gotten a court order. Modern
end-to-end encryption — increasingly common following the revelations of mass surveillance by NSA whistleblower
Edward Snowden — doesn’t allow for that. Only the parties on either end can do the decoding.
Comey’s problem is the nearly universal agreement among cryptographers,
technologists and security experts that there is no way to give the government
access to encrypted communications without poking an exploitable hole that
would put confidential data, as well as entities like banks and power grids, at
risk.
But while speaking at Senate Judiciary and Senate Intelligence Committee hearings on Wednesday, Comey
repeatedly refused to accept that as reality.
“A whole lot of good people have said it’s too hard … maybe that’s so,” he said to the Intelligence Committee. “But my
reaction to that is: I’m not sure they’ve really tried.”
In a comment worthy of climate denialists, Comey told one senator: “Maybe the
scientists are right. Ennnh, I’m not willing to give up on that yet.”
He described his inability to make a realistic proposal as the act of a humble public servant. “We’re trying to show
humility to say we don’t know what would be best.”
Comey said American technologists are so brilliant that they surely could come
up with a solution if properly incentivized.
Julian Sanchez, a senior fellow at the Cato Institute, was incredulous about Comey’s insistence that experts are wrong:
“How does his head not explode from cognitive dissonance when he repeats he has no tech expertise, then insists
everyone who does is wrong?” he tweeted during the hearing.
Prior to the committee hearings, a group of the world’s foremost cryptographers
and scientists wrote a paper including complex technical analysis concluding
that mandated backdoor keys for the government would only be dangerous for
national security. This is the first time the group has gotten back together since 1997, the previous instance in
which the FBI asked for a technical backdoor into communications.
But no experts were invited to testify, a fact that several intelligence committee members brought up,
demanding a second hearing to hear from them.
Comey got little pushback from the panel, despite his lack of any formal plan
and his denial of science. Sen. Martin Heinrich, D-N.M., thanked him for his display of “humility” in not
presenting a solution, while Committee Chairman Richard Burr, R-N.C., said “I think you deserve a lot of credit for your
restraint.”
Comey at one point briefly considered the possibility of a world not like the one he imagined, then concluded: “If that’s
the case, then I think we’re stuck.”
This arg is something anti-vaxxers would say. “Researching the impossible is
good” is not a net-benefit.
Whittaker 15 — Zack Whittaker, Writer-Editor for ZDNet, CNET, and CBS News, 2015 (“Why are we still talking
about backdoors in encryption? No, really,” ZDNet, July 8th, Available Online at http://www.zdnet.com/article/because-there-is-no-such-thing-as-a-secure-backdoor-gosh-darn-it/, Accessed 07-20-2015)
It's the conversation that just won't end.
In case you hadn't noticed, in recent months there's been fervent discussion both for and against encryption, which has
law enforcement and experts at odds.
The debate began in the wake of the Edward Snowden affair after Apple began encrypting iPhones and iPads running the
latest software. By cutting out Apple from the data demand process, law enforcement must go directly to the device
owner. That -- as you might imagine -- incensed law enforcement and intelligence agencies, because that requires
informing suspects that they're under investigation.
Without any congressional backing, the feds asked for the only thing they could: "backdoor" access to tech companies'
systems to pull data on suspected criminals and terrorists.
On Tuesday, more
than a dozen elite cryptographers and security experts penned a
letter effectively calling on the feds to stop flogging an already dead horse.
From the report in The New York Times:
"The group -- 13 of the world's preeminent cryptographers, computer scientists and security specialists -- will
release the paper, which concludes there is no viable technical solution that would allow the American and
British governments to gain 'exceptional access' to encrypted communications without putting the world's most
confidential data and critical infrastructure in danger."
Their argument is simple: there is no such thing as a secure backdoor. If there's a
way in that allows the feds to access data at will (even with a warrant or court order), hackers will inevitably find it and use it for their own gain.
From Business Insider, widely-respected security expert Bruce Schneier explained why
there "really is no way" to keep users' data safe while providing backdoors. He said:
"I have two options. I can design a secure system that has no backdoor access, meaning neither criminals nor
foreign intelligence agencies nor domestic police can get at the data. Or I can design a system that has backdoor
access, meaning they all can."
And that
should be the end of it. If the brightest minds in the world cannot come
up with something the FBI wants in its wildest dreams, someone has to back
down. You can't just push ahead with something if it's not technically feasible.
But that's not stopping FBI director James Comey, who on Wednesday made his case -- for the millionth
time -- that encryption will prevent his agency (and others) from finding the bad guys.
In a brief (yet still rambling) opinion piece for Lawfare, the FBI director aimed for "healthy discussion" but failed to retain
his point once he promised he was "not a maniac (or so at least [his] family says so)" -- which, by the way, is exactly what a
maniac would say.
Comey wrote:
"In universal strong encryption, I see something that is with us already and growing every day that will
inexorably affect my ability to do that job. It may be that, as a people, we decide the benefits here outweigh the
costs and that there is no sensible, technically feasible way to optimize privacy and safety in this particular
context, or that public safety folks will be able to do their job well enough in the world of universal strong
encryption."
At least he managed to avoid using the word "backdoor" in his piece. He redeemed himself when, in Wednesday's
testimony, Comey said that the FBI was "not seeking a backdoor." (For those not watching C-SPAN, he then proceeded to
describe a backdoor.)
At one point during his testimony, the FBI director specifically said, dodging the question by Sen John Cornyn (R-TX),
that he doesn't want to "scare people by saying I'm certain people will die."
He, and others, are blind to the fact that undermining encryption by installing
backdoors won't prevent crime. In a tweet, security expert and researcher Matt Blaze, one of the
cryptographers who also signed the aforementioned letter, said: "'Crypto causes crime' deserves no
more consideration than 'vaccinations cause disease'."
The debate on encryption that Comey wanted has come and gone – and he lost.
He failed at the first hurdle. It's time to let it go.
Every proposal will get shot down. Researching them is a stalling tactic.
Geller 15 — Eric Geller, Deputy Morning Editor at The Daily Dot—the “hometown newspaper of the Internet,” 2015
(“The rise of the new Crypto War,” The Daily Dot, July 10th, Available Online at
http://www.dailydot.com/politics/encryption-crypto-war-james-comey-fbi-privacy/, Accessed 07-20-2015)
You can’t legislate the weather
Security researchers don’t often wade into highly charged political fights. They
work with code, and code is apolitical. But the security researchers who have
followed the encryption debate say they’re tired of watching the FBI either
misunderstand or dismiss the technical reality surrounding backdoors.
“A lot of the arguments you hear from the FBI and the NSA come from people
most separated from technical reality,” Hall said.
One major frustration is that the government hasn’t laid out the specifics of its
desired intercept solution.
Comey and Deputy Attorney General Sally Yates continued to duck questions
about specifics at a July 8 Senate Judiciary Committee hearing, one of two Senate encryption hearings at which
Comey testified that day.
"We're not ruling out a legislative solution," said Yates, the second-ranking official at the Justice Department, "but we
think that the more productive way to approach this … is to work with the industry" on per-company solutions.
“The government’s position was that bad guys could use crypto and so therefore they needed to regulate it, and our
position was good guys need crypto in order to protect us from bad guys.”
Later in the hearing, Comey suggested "some sort of regime where it's easier to compel people to unlock their devices,"
though he acknowledged the inherent Fifth Amendment concerns of self-incrimination.
Perhaps the Snowden leaks and general public anxiety about surveillance have made the government cautious. Perhaps it
wants to cut deals with specific segments of the tech industry and it worries that going public with its ideas will make
private negotiations harder. Schoen said that he had heard of “informal contacts” between the government and tech
companies, with officials “contacting businesses and saying, ‘We find such and such product objectionable’ or ‘We want
to make such and such a request about a product.’”
But there is another possibility. Specific
proposals expose the government to specific
criticism, and when that criticism comes from respected researchers, it can kill a
proposal in its infancy. For such a small device, the Clipper chip certainly casts a long shadow.
“If you say, ‘Well, I think that people’s crypto keys should be copied in this way and transformed in this way, and then
stored by this entity,’” Schoen said, “then it’s very easy for security experts to say, ‘That’s a terrible risk for reasons X, Y,
and Z, because attackers can attack it in this way.”
Instead of offering specifics, FBI officials say they want more dialog with Congress, the
security community, and the public. But that appears to be little more than a stalling tactic.
They Say: “Crime/Terror Net-Benefit”
Backdoors necessarily make systems insecure and increase the risk of crime.
Sanchez 14 — Julian Sanchez, Senior Fellow specializing in technology, privacy, and civil liberties at the Cato
Institute, former Washington Editor for Ars Technica, holds a B.A. in Philosophy and Political Science from New York
University, 2014 (“Old Technopanic in New iBottles,” Cato at Liberty—a Cato Institute blog, September 23rd, Available
Online at http://www.cato.org/blog/old-technopanic-new-ibottles, Accessed 06-29-2015)
First, as Kerr belatedly acknowledges in a follow-up post, there
are excellent security reasons not to
mandate backdoors. Indeed, had he looked to the original Crypto Wars of the 90s, he would have seen that this
was one of the primary reasons similar schemes were almost uniformly rejected by technologists and security experts.
More or less by definition, a backdoor for law enforcement is a deliberately
introduced security vulnerability, a form of architected breach: It requires a
system to be designed to permit access to a user’s data against the user’s wishes,
and such a system is necessarily less secure than one designed without such a
feature. As computer scientist Matthew Green explains in a recent Slate column (and, with several
eminent colleagues, in a longer 2013 paper) it is damn near impossible to create a security
vulnerability that can only be exploited by “the good guys.” Activist Eva Galperin puts the
point pithily: “Once you build a back door, you rarely get to decide who walks
through it.” Even if your noble intention is only to make criminals more
vulnerable to police, the unavoidable cost of doing so in practice is making the
overwhelming majority of law-abiding users more vulnerable to criminals.
It is impossible to build a “law enforcement only” backdoor. Attacks on
encryption make crime easier, not harder.
Cohn 14 — Cindy Cohn, Executive Director and former Legal Director and General Counsel of the Electronic Frontier
Foundation, holds a J.D. from the University of Michigan Law School, 2014 (“EFF Response to FBI Director Comey's
Speech on Encryption,” Electronic Frontier Foundation, October 17th, Available Online at
https://www.eff.org/deeplinks/2014/10/eff-response-fbi-director-comeys-speech-encryption, Accessed 06-24-2015)
FBI Director James Comey
gave a speech yesterday reiterating the FBI's nearly twenty-year-old talking points about why it wants to reduce the security in your devices,
rather than help you increase it. Here's EFF's response:
The FBI should not be in the business of trying to convince companies to offer
less security to their customers. It should be doing just the opposite. But that's
what Comey is proposing—undoing a clear legal protection we fought hard for in the 1990s.1 The law
specifically ensures that a company is not required to essentially become an
agent of the FBI rather than serving your security and privacy interests. Congress
rightly decided that companies (and free and open source projects and anyone else
building our tools) should be allowed to provide us with the tools to lock our digital
information up just as strongly as we can lock up our physical goods. That's
what Comey wants to undo.
It's telling that his remarks echo so closely the arguments of that era. Compare them, for example, with this comment
from former FBI Director Louis Freeh in May of 1995, now nearly twenty years ago:
[W]e're in favor of strong encryption, robust encryption. The country needs it, industry needs it. We just want
to make sure we have a trap door and key under some judge's authority where we can get there if somebody is
planning a crime.
Now just as then, the
FBI is trying to convince the world that some fantasy version of
security is possible—where "good guys" can have a back door or extra key to
your home but bad guys could never use it. Anyone with even a rudimentary
understanding of security can tell you that's just not true. So the "debate" Comey
calls for is phony, and we suspect he knows it. Instead, Comey wants everybody to have
weak security, so that when the FBI decides somebody is a "bad guy," it has no
problem collecting personal data.
That's bad science, it's bad law, it's bad for companies serving a global
marketplace that may not think the FBI is always a "good guy," and it's bad for
every person who wants to be sure that their data is as protected as possible—
whether from ordinary criminals hacking into their email provider, rogue
governments tracking them for politically organizing, or competing companies
looking for their trade secrets.
Perhaps Comey's speech is saber rattling. Maybe it's an attempt to persuade the American people that we've undertaken
significant reforms in light of the Snowden revelations—the U.S. government has not—and that it's time for the
"pendulum" to swing back. Or maybe by putting this issue in play, the FBI may hope to draw our eyes away from, say, its
attempt to water down the National Security Letter reform that Congress is considering. It's difficult to tell.
But if
the FBI gets its way and convinces Congress to change the law, or even if it
convinces companies like Apple that make our tools and hold our data to weaken the security they offer to us, we'll
all end up less secure and enjoying less privacy. Or as the Fourth Amendment puts it: we'll be
less "secure in our papers and effects."
For more on EFF's coverage of the "new" Crypto Wars, read this article focusing on the security issues we wrote last week
in Vice. And going back even earlier, a broader update to a piece we wrote in 2010, which itself was based on our
fights in the 90s. If the FBI wants to try to resurrect this old debate, EFF will be in strong opposition, just as we were 20
years ago. That's because—just like 20 years ago—the
Internet needs more, not less, strong
encryption.
Even if a backdoor “worked,” it wouldn’t enable real-time access.
Geller 15 — Eric Geller, Deputy Morning Editor at The Daily Dot—the “hometown newspaper of the Internet,” 2015
(“The rise of the new Crypto War,” The Daily Dot, July 10th, Available Online at
http://www.dailydot.com/politics/encryption-crypto-war-james-comey-fbi-privacy/, Accessed 07-20-2015)
Exploitation isn’t the only thing that worries security
researchers. They also remain unconvinced
that a backdoor with the other necessary qualities can even be built.
The FBI often cites situations like child abductions or active terrorist plots when
it calls for backdoors. But such ongoing threats would require real-time access to
the backdoor mechanism. That real-time access, experts said, is not realistic given
how backdoors are built and maintained.
“You’re going to need to essentially perform very sensitive security operations
that are going to require a bunch of human steps really, really fast,” said Hall. “A lot
of these are going to be in airgapped facilities [where servers are deliberately cut off from the
Internet], and you’re going to have to jump the airgaps to get the keying material
together in one place that’s probably also an airgapped facility.”
Daniel Weitzner argued that there was simply no way to reconcile a backdoor’s dual
requirements of security and accessibility. If you physically disperse keys across
the country to make them easier for law enforcement to reach, you add more
venues for exploitation, he said. If you put one hardware security module in the
FBI’s heavily guarded Washington headquarters, you prevent disparate law-enforcement groups from quickly accessing it to launch real-time monitoring
operations.
“I’m not even sure we’re good at doing that, keeping keys like that technically
secure,” Green said. “I’m not sure we have any hardware that’s ever been put to that
test.”
* Hall = Joseph Hall, chief technologist at the Center for Democracy & Technology; Weitzner = Daniel Weitzner, lecturer
in the computer science department at the Massachusetts Institute of Technology; Green = Matthew Green, assistant
research professor at the Johns Hopkins Information Security Institute
Terrorism/Crime DA
They Say: “Statistics/Anecdotes”
Empirically, strong encryption doesn’t foil law enforcement. Their evidence is
baseless fearmongering.
Schneier 14 — Bruce Schneier, Chief Technology Officer for Counterpane Internet Security, Fellow at the Berkman
Center for Internet and Society at Harvard Law School, Program Fellow at the New America Foundation's Open
Technology Institute, Board Member of the Electronic Frontier Foundation, Advisory Board Member of the Electronic
Privacy Information Center, 2014 (“Stop the hysteria over Apple encryption,” CNN, October 31st, Available Online at
http://www.cnn.com/2014/10/03/opinion/schneier-apple-encryption-hysteria/index.html, Accessed 06-29-2015)
Last week Apple announced that it is closing a serious security vulnerability in the iPhone. It used to be that the phone's
encryption only protected a small amount of the data, and Apple had the ability to bypass security on the rest of it.
From now on, all the phone's data is protected. It can no longer be accessed by criminals, governments, or rogue
employees. Access to it can no longer be demanded by totalitarian governments. A user's iPhone data is now more secure.
To hear U.S. law enforcement respond, you'd think Apple's move heralded an
unstoppable crime wave. See, the FBI had been using that vulnerability to get into people's iPhones. In the
words of cyberlaw professor Orin Kerr, "How is the public interest served by a policy that only thwarts lawful search
warrants?"
Ah, but that's the thing: You
can't build a "back door" that only the good guys can walk
through. Encryption protects against cybercriminals, industrial competitors, the
Chinese secret police and the FBI. You're either vulnerable to eavesdropping by
any of them, or you're secure from eavesdropping from all of them.
Back-door access built for the good guys is routinely used by the bad guys. In 2005,
some unknown group surreptitiously used the lawful-intercept capabilities built into the Greek cell phone system. The
same thing happened in Italy in 2006.
In 2010, Chinese hackers subverted an intercept system Google had put into Gmail to comply with U.S. government
surveillance requests. Back doors in our cell phone system are currently being exploited by the FBI and unknown others.
This doesn't stop the FBI and Justice Department from pumping up the fear.
Attorney General Eric Holder threatened us with kidnappers and sexual predators.
The former head of the FBI's criminal investigative division went even further,
conjuring up kidnappers who are also sexual predators. And, of course, terrorists.
FBI Director James Comey claimed that Apple's move allows people to "place themselves beyond the law" and also
invoked that now overworked "child kidnapper." John J. Escalante, chief of detectives for the Chicago police department,
now holds the title of most hysterical: "Apple will become the phone of choice for the pedophile."
It's all bluster. Of the 3,576 major offenses for which warrants were granted for
communications interception in 2013, exactly one involved kidnapping. And, more
importantly, there's no evidence that encryption hampers criminal investigations in
any serious way. In 2013, encryption foiled the police nine times, up from four
in 2012 – and the investigations proceeded in some other way.
This is why the FBI's scare stories tend to wither after public scrutiny. A former
FBI assistant director wrote about a kidnapped man who would never have been
found without the ability of the FBI to decrypt an iPhone, only to retract the
point hours later because it wasn't true.
We've seen this game before. During the crypto wars of the 1990s, FBI Director Louis
Freeh and others would repeatedly use the example of mobster John Gotti to illustrate
why the ability to tap telephones was so vital. But the Gotti evidence was
collected using a room bug, not a telephone tap. And those same scary criminal
tropes were trotted out then, too. Back then we called them the Four Horsemen
of the Infocalypse: pedophiles, kidnappers, drug dealers, and terrorists. Nothing
has changed.
Official statistics disprove the link.
Franceschi-Bicchierai 15 — Lorenzo Franceschi-Bicchierai, Staff Writer covering hacking, information
security, and digital rights at VICE Motherboard, former writer at Mashable and Danger Room—the Wired blog, holds an
M.S. in Journalism from Columbia University, 2015 (“Data Shows Little Evidence for FBI's Concerns About Criminals
'Going Dark',” Motherboard, July 1st, Available Online at http://motherboard.vice.com/read/data-shows-little-evidence-for-fbis-concerns-about-criminals-going-dark, Accessed 07-20-2015)
In the last few months, several government
officials, led by the FBI’s Director James Comey, have
been complaining that the rise of encryption technologies would lead to a “very
dark place” where cops and feds can’t fight and stop criminals.
But new numbers released by the US government seem to contradict this doomsday
scenario.
In 2014, encryption thwarted four wiretaps out of 3,554, according to an annual
report published on Wednesday by the US agency that oversees federal courts.
The report reveals that state law enforcement agencies encountered encryption in 22
wiretaps last year. Out of those, cops were foiled on only two occasions. As for the
feds, they encountered encryption in just three wiretaps, and could not decipher
the intercepted communications in two of them.
“They're blowing it out of proportion,” Hanni Fahkoury, an attorney at the digital rights
group Electronic Frontier Foundation (EFF), told Motherboard. “[Encryption] was only a
problem in five cases of the more than 3,500 wiretaps they had up. Second, the
presence of encryption was down by almost 50 percent from the previous year.
“So this is on a downward trend, not upward,” he wrote in an email.
In fact, cops found less encryption last year than in the year prior. In 2013, state
authorities encountered encryption in 41 cases, versus 22 in 2014. At the federal
level, there were three cases of encryption in 2014, against none in 2013. (The report
also refers to five federal wiretaps conducted in “previous years” but only reported in 2014. Of those, the feds were able
to crack the communications in four of the five.)
The FBI did not respond to Motherboard’s request for comment.
Yet, other experts warn that the Wiretap Report is only a small window into the world of government surveillance.
First of all, the FBI has been railing against encryption not just when it’s used for communications, but especially when
it’s used to safeguard data on the phone or computer. The whole recent debate was spurred by Apple’s announcement
that it wouldn’t be able to unlock phones for the police anymore, and that new iPhones would be encrypted by default.
Wiretaps aren’t used to get that kind of data, but cover mostly communications.
Moreover, the FBI has said in the past that it doesn’t apply for wiretaps when it knows it can’t intercept the targeted
communications, according to Albert Gidari, a lawyer at Perkins Coie who has worked with technology firms on
surveillance matters, and Jonathan Mayer, a computer scientist and lawyer at Stanford University.
“The report is suggestive, but hardly conclusive,” Mayer told Motherboard. “Much
more telling, in my view, is
that law enforcement and intelligence officials remain unable to provide
episodes where encryption frustrated an investigation.”
So far, the FBI has yet to put forth a valid example where encryption really
thwarted an investigation. In fact, some of the examples cited by Comey have been
debunked in media reports.
“This crypto debate continues to be a red herring because we really are uninformed about the facts that the FBI contends
supports their position,” Gidari said.
The Wiretap Report contains other interesting information that sheds light on government surveillance practices. Out of
the more than 3,554 wiretaps authorized by judges, the vast majority of them (3,409 or 89 percent) were for drug related
offenses. Homicide, in turn, was the reason behind only 4 percent of the wiretaps. And virtually all of them (96%)
were for “portable devices,” such as cellphones.
Even if the Wiretap Report is just a small peek behind the scenes of government
surveillance, it shows that for now, at least when it comes to wiretapping, the
FBI isn’t really going dark.
Law enforcement can still get a warrant — no link.
Cohn et al. 14 — Cindy Cohn, Executive Director and former Legal Director and General Counsel of the Electronic
Frontier Foundation, holds a J.D. from the University of Michigan Law School, with Jeremy Gillula, Staff Technologist at
the Electronic Frontier Foundation, holds a Ph.D. in Computer Science from Stanford University, and Seth Schoen, Senior
Staff Technologist at the Electronic Frontier Foundation, 2014 (“What Default Phone Encryption Really Means For Law
Enforcement,” Vice News, October 8th, Available Online at https://news.vice.com/article/what-default-phone-encryption-really-means-for-law-enforcement, Accessed 07-05-2015)
The common misconception among the hysteria is that this decision will put vital
evidence outside the reach of law enforcement. But nothing in this encryption change
will stop law enforcement from seeking a warrant for the contents of a phone,
just as they seek warrants for the contents of a laptop or desktop computer.
Whether or not a person can be required to unlock the device is a complicated
question — intertwined with the right of due process and the right to avoid self-incrimination — that ought to be carefully considered by a court in the context of
each individual situation.
Strong encryption decreases crime — empirical data.
Kehl et al. 15 — Danielle Kehl, Senior Policy Analyst at the Open Technology Institute at the New America
Foundation, holds a B.A. in History from Yale University, with Andi Wilson, Policy Program Associate at the Open
Technology Institute at the New America Foundation, holds a Master of Global Affairs degree from the Munk School at
the University of Toronto, and Kevin Bankston, Policy Director at the Open Technology Institute at the New America
Foundation, former Senior Counsel and Director of the Free Expression Project at the Center for Democracy &
Technology, former Senior Staff Attorney at the Electronic Frontier Foundation, former Justice William Brennan First
Amendment Fellow at the American Civil Liberties Union, holds a J.D. from the University of Southern California Law
School, 2015 (“Doomed To Repeat History? Lessons From The Crypto Wars of the 1990s,” Report by the Open Technology
Institute at the New America Foundation, June, Available Online at https://static.newamerica.org/attachments/3407-125/Lessons%20From%20the%20Crypto%20Wars%20of%20the%201990s.882d6156dc194187a5fa51b14d55234f.pdf,
Accessed 07-06-2015, p. 19)
Moreover, there
is now a significant body of evidence that, as Bob Goodlatte argued back in
1997, “Strong encryption prevents crime.”180 This has become particularly true as
smartphones and other personal devices that store vast amounts of user data have
risen in popularity over the past decade. Encryption can stop or mitigate the
damage from crimes like identity theft and fraud targeted at smartphone users.181
They Say: “Going Dark”
No “going dark” link — it is just rhetoric.
Kerry 14 — Cameron F. Kerry, Distinguished Fellow in Governance Studies at the Center for Technology Innovation
at the Brookings Institution, former Visiting Scholar with the MIT Media Lab, former General Counsel and Acting
Secretary of the United States Department of Commerce, holds a J.D. from Boston College Law School, 2014 (“The Law
Needs To Keep Up With Technology But Not At The Expense Of Civil Liberties,” Forbes, November 6th, Available Online
at http://www.forbes.com/sites/realspin/2014/11/06/the-law-needs-to-keep-up-with-technology-but-not-at-the-expense-of-civil-liberties/2/, Accessed 07-05-2015)
“Going dark” makes a good sound bite. But the neat phrase ignores the
fundamental reality that law enforcement and intelligence agencies have been
“going bright.” The digital era provides troves of evidence that never existed
before. IBM has estimated that 90% of the data in the world has been created in
the last two to three years and EMC projects this volume will increase sevenfold by 2020. This vast flood of information—emails, surveillance and traffic
cameras, transactions, mobile phone location data, and many other sources—
provides law enforcement with digital trails everywhere all the time. Expanded
use of encryption may obscure a portion of such data, but that still leaves
exabytes of information from which to seek evidence.
Fears that encryption will cause agencies to “go dark” are wrong and explained
by loss aversion and the endowment effect.
Swire and Ahmad 11 — Peter Swire, C. William O’Neill Professor of Law at the Moritz College of Law of the
Ohio State University, served as the Chief Counselor for Privacy in the Office of Management and Budget during the
Clinton Administration, holds a J.D. from Yale Law School, and Kenesa Ahmad, Legal and Policy Associate with the
Future of Privacy Forum, holds a J.D. from the Moritz College of Law of the Ohio State University, 2011 (“‘Going Dark’
Versus a ‘Golden Age for Surveillance’,” Center for Democracy & Technology, November 28th, Available Online at
https://cdt.org/blog/%E2%80%98going-dark%E2%80%99-versus-a-%E2%80%98golden-age-for-surveillance%E2%80%99/, Accessed 06-24-2015)
What explains the agencies’ sense of loss when the use of wiretaps has expanded,
encryption has not been an important obstacle, and agencies have gained new
location, contact, and other information? One answer comes from behavioral economics
and psychology, which has drawn academic attention to concepts such as “loss
aversion” and the “endowment effect.” “Loss aversion” refers to the tendency to
prefer avoiding losses to acquiring gains of similar value. This concept also helps
explain the “endowment effect” – the theory that people place higher value on
goods they own versus comparable goods they do not own. Applied to
surveillance, the idea is that agencies feel the loss of one technique more than
they feel an equal-sized gain from other techniques. Whether based on the
language of behavioral economics or simply on common sense, we are familiar
with the human tendency to “pocket our gains” – assume we deserve the good
things that come our way, but complain about the bad things, even if the good
things are more important.
A simple test can help the reader decide between the “going dark” and “golden age
of surveillance” hypotheses. Suppose the agencies had a choice of a 1990-era
package or a 2011-era package. The first package would include the wiretap
authorities as they existed pre-encryption, but would lack the new techniques for
location tracking, confederate identification, access to multiple databases, and
data mining. The second package would match current capabilities: some
encryption-related obstacles, but increased use of wiretaps, as well as the
capabilities for location tracking, confederate tracking and data mining. The
second package is clearly superior – the new surveillance tools assist a vast range
of investigations, whereas wiretaps apply only to a small subset of key
investigations. The new tools are used far more frequently and provide granular
data to assist investigators.
Conclusion
This post casts new light on government agency claims that we are “going dark.” Due
to changing
technology, there are indeed specific ways that law enforcement and national
security agencies lose specific previous capabilities. These specific losses,
however, are more than offset by massive gains. Public debates should
recognize that we are truly in a golden age of surveillance. By understanding
that, we can reject calls for bad encryption policy. More generally, we should
critically assess a wide range of proposals, and build a more secure computing
and communications infrastructure.
Encryption won’t make law enforcement “go dark.” We’re in a Golden Age of
Surveillance.
Sanchez 14 — Julian Sanchez, Senior Fellow specializing in technology, privacy, and civil liberties at the Cato
Institute, former Washington Editor for Ars Technica, holds a B.A. in Philosophy and Political Science from New York
University, 2014 (“Old Technopanic in New iBottles,” Cato at Liberty—a Cato Institute blog, September 23rd, Available
Online at http://www.cato.org/blog/old-technopanic-new-ibottles, Accessed 06-29-2015)
Fourth and finally, we
should step back and maintain a little perspective about the supposedly
dire position of 21st century law enforcement. In his latest post in the Apple series, Kerr
invokes his influential “equilibrium adjustment theory” of Fourth Amendment law.
The upshot of Kerr’s theory, radically oversimplified, is that technological changes over time can confer advantages on
both police investigators and criminals seeking to avoid surveillance, and the law adjusts over time to preserve a balance
between the ability of citizens to protect their privacy and the ability of law enforcement to invade it with sufficiently
good reason. As I hope some of my arguments above illustrate, technology
does not necessarily
provide us with easy Goldilocks policy options: Sometimes there is just no good
way to preserve capabilities to which police have grown accustomed without
imposing radical restrictions on technologies used lawfully by millions of
people—restrictions which are likely to prove as futile in the long run as they are
costly. But this hardly means that evolving technology is bad for law
enforcement on net.
On the contrary, even if we focus narrowly on the iPhone, it seems clear that what Apple taketh away
from police with one hand, it giveth with the other: The company’s ecosystem
considered as a whole provides a vast treasure trove of data for police even if
that trove does not include backdoor access to physical devices. The ordinary,
unsophisticated criminal may be more able to protect locally stored files than he was a decade ago, but in a thousand
other ways, he can expect to be far more minutely tracked in both his online and offline activities. An encrypted text
messaging system may be worse from the perspective of police than an unencrypted one, but is it really any worse than
a system of pay phones that allow criminals to communicate without leaving any record for police to sift through after the
fact? Meanwhile activities
that would once have left no permanent trace by default—
from looking up information to moving around in the physical world to making
a purchase—now leave a trail of digital breadcrumbs that would have sounded
like a utopian fantasy to an FBI agent in the 1960s. Law enforcement may moan
that they are “going dark” when some particular innovation makes their jobs
more difficult (while improving the security of law-abiding people’s private data), but when we
consider the bigger picture, it is far easier to agree with the experts who have
dubbed our era the Golden Age of Surveillance. Year after year, technology opens a thousand
new windows to our government monitors. If we aim to preserve an “equilibrium” between government power and
citizen privacy, we should accept that it will occasionally close one as well.
Encryption won’t jeopardize law enforcement.
Schneier 14 — Bruce Schneier, Chief Technology Officer for Counterpane Internet Security, Fellow at the Berkman
Center for Internet and Society at Harvard Law School, Program Fellow at the New America Foundation's Open
Technology Institute, Board Member of the Electronic Frontier Foundation, Advisory Board Member of the Electronic
Privacy Information Center, 2014 (“Stop the hysteria over Apple encryption,” CNN, October 31st, Available Online at
http://www.cnn.com/2014/10/03/opinion/schneier-apple-encryption-hysteria/index.html, Accessed 06-29-2015)
As for law enforcement? The recent
decades have given them an unprecedented ability to
put us under surveillance and access our data. Our cell phones provide them with a
detailed history of our movements. Our call records, email history, buddy lists,
and Facebook pages tell them who we associate with. The hundreds of
companies that track us on the Internet tell them what we're thinking about.
Ubiquitous cameras capture our faces everywhere. And most of us back up our
iPhone data on iCloud, which the FBI can still get a warrant for. It truly is the
golden age of surveillance.
After considering the issue, Orin Kerr rethought his position, looking at this in terms of a technological-legal trade-off. I
think he's right.
Given everything that has made it easier for governments and others to intrude
on our private lives, we need both technological security and legal restrictions to
restore the traditional balance between government access and our
security/privacy. More companies should follow Apple's lead and make encryption the easy-to-use default. And
let's wait for some actual evidence of harm before we acquiesce to police
demands for reduced security.
They Say: “Above The Law”
No “above the law” link — bad argument.
Masnick 14 — Mike Masnick, Founder and Chief Executive Officer of Floor64—a software company, Founder and
Editor of Techdirt, 2014 (“FBI Director Angry At Homebuilders For Putting Up Walls That Hide Any Crimes Therein,”
Techdirt, September 26th, Available Online at https://www.techdirt.com/articles/20140925/17303928647/fbi-director-angry-homebuilders-putting-up-walls-that-hide-any-crimes-therein.shtml, Accessed 07-05-2015)
On Thursday, FBI boss James Comey
displayed not only a weak understanding of privacy
and encryption, but also what the phrase "above the law" means, in slamming Apple and
Google for making encryption a default:
"I am a huge believer in the rule of law, but I am also a believer that no one in this country is above the law,"
Comey told reporters at FBI headquarters in Washington. "What concerns me about this is companies
marketing something expressly to allow people to place themselves above the law."
[....]
"There will come a day -- well it comes every day in this business -- when it will matter a great, great deal to the
lives of people of all kinds that we be able to with judicial authorization gain access to a kidnapper's or a
terrorist or a criminal's device. I just want to make sure we have a good conversation in this country before that
day comes. I'd hate to have people look at me and say, 'Well how come you can't save this kid,' 'how come you
can't do this thing.'"
First of all, nothing in what either Apple
or Google is doing puts anyone "above the law."
It just says that those companies are better protecting the privacy of their users.
There are lots of things that make law enforcement's job harder that also better
protect everyone's privacy. That includes walls. If only there were no walls, it
would be much easier to spot crimes being committed. And I'm sure some crimes
happen behind walls that make it difficult for the FBI to track down what
happened. But we don't see James Comey claiming that homebuilders are allowing
people to be "above the law" by building houses with walls.
"I get that the post-Snowden world has started an understandable pendulum swing," he said. "What I'm worried about is,
this is an indication to us as a country and as a people that, boy, maybe that pendulum swung too far."
Wait, what? The
"pendulum" hasn't swung at all. To date, there has been no legal
change in the surveillance laws post-Snowden. The pendulum is just as far over
towards the extreme surveillance state as it has been since Snowden first came on
the scene. This isn't the pendulum "swinging too far." It's not even the pendulum
swinging. This is just Apple and Google making a tiny shift to better protect
privacy.
As Christopher Soghoian points out, why isn't Comey screaming about the manufacturers of
paper shredders, which similarly allow their customers to hide papers from
"lawful surveillance?"
They Say: “Only For Criminals”
Not “only for criminals” — ridiculous argument.
Masnick 14 — Mike Masnick, Founder and Chief Executive Officer of Floor64—a software company, Founder and
Editor of Techdirt, 2014 (“FBI Director Angry At Homebuilders For Putting Up Walls That Hide Any Crimes Therein,”
Techdirt, September 26th, Available Online at https://www.techdirt.com/articles/20140925/17303928647/fbi-director-angry-homebuilders-putting-up-walls-that-hide-any-crimes-therein.shtml, Accessed 07-05-2015)
But, of course, the freaking out continues. Over in the Washington Post, there's this bit of insanity:
“Apple will become the phone of choice for the pedophile,” said John J. Escalante, chief of detectives for
Chicago’s police department. “The average pedophile at this point is probably thinking, I’ve got to get an Apple
phone.”
Um. No. That's just ridiculous. Frankly, if pedophiles are even thinking about encryption, it's likely that they already are
using one of the many encryption products already on the market. And, again, this demonizing
of
encryption as if it's only a tool of pedophiles and criminals is just ridiculous.
Regular everyday people use encryption every single day. You're using it if you visit this
very website. And it's increasingly becoming the standard, because that's just good
security.
They Say: “Helps Terrorists”
Encryption doesn’t foster terrorism.
Schneier 15 — Bruce Schneier, Chief Technology Officer for Counterpane Internet Security, Fellow at the Berkman
Center for Internet and Society at Harvard Law School, Program Fellow at the New America Foundation's Open
Technology Institute, Board Member of the Electronic Frontier Foundation, Advisory Board Member of the Electronic
Privacy Information Center, interviewed by Rob Price, 2015 (“Bruce Schneier: David Cameron's proposed encryption ban
would 'destroy the internet',” Business Insider, July 6th, Available Online at http://www.businessinsider.com/bruce-schneier-david-cameron-proposed-encryption-ban-destroy-the-internet-2015-7, Accessed 07-20-2015)
BI: Won't
the proliferation of encryption help terrorists?
BS: No. It's the exact opposite: encryption is one of the things that protects us from
terrorists, criminals, foreign intelligence, and every other threat on the Internet,
and against our data and communications. Encryption protects our trade secrets,
our financial transactions, our medical records, and our conversations. In a world
where cyberattacks are becoming more common and more catastrophic,
encryption is one of our most important defenses.
In 2010, the US Deputy Secretary of Defense William Lynn wrote: "Although the threat to intellectual property is less
dramatic than the threat to critical national infrastructure, it may be the most significant cyberthreat that the United States
will face over the long term." Encryption
protects against intellectual property theft, and it
also protects critical national infrastructure.
What you're asking is much more narrow: won't terrorists be able to use encryption to protect
their secrets? Of course they will. Like so many other aspects of our society, the benefits of
encryption are general and can be enjoyed by both the good guys and the bad
guys. Automobiles benefit both long-distance travelers and bank robbers.
Telephones benefit both distant relatives and kidnappers. Late-night all-you-can-eat buffets benefit both hungry students and terrorists plotting their next moves.
This is simply reality. And there are two reasons it's okay. One, good people far
outnumber bad people in society, so we manage to thrive nonetheless. And two, the
bad guys trip themselves up in so many other ways that allowing them access to
automobiles, telephones, late-night restaurants, and encryption isn't enough to
make them successful.
Most of the time we recognize that harming the overwhelming number of honest people in
society to try to harm the few bad people is a dumb trade-off. Consider an analogy:
Cameron is unlikely to demand that cars redesign their engines so as to limit their speeds to 60 kph so bank robbers can't
get away so fast. But he doesn't understand the comparable trade-offs in his proposed legislation.
Defense: terrorists will evade backdoors. Offense: they’ll exploit them to
commit attacks.
Venezia 15 — Paul Venezia, Senior Contributing Editor at InfoWorld, has worked as a Network and Systems
Engineer for over 15 years, 2015 (“Encryption with backdoors is worse than useless -- it's dangerous,” InfoWorld, July 13th,
Available Online at http://www.infoworld.com/article/2946064/encryption/encryption-with-forced-backdoors-is-worse-than-useless-its-dangerous.html, Accessed 07-20-2015)
The general idea coming from these camps is that terrorists use encryption to communicate.
Thus, if there are backdoors, then law enforcement can eavesdrop on those
communications. Leaving aside the massive vulnerabilities that would be
introduced on everyone else, it’s clear that the terrorists could very easily modify
their communications to evade those types of encryption or set up alternative
communication methods. We would be creating holes in the protection used for
trillions of transactions, all for naught.
Citizens of a city do not give the police the keys to their houses. We do not
register our bank account passwords with the FBI. We do not knowingly or
specifically allow law enforcement to listen and record our phone calls and
Internet communications (though that hasn’t seemed to matter). We should definitely not
crack the foundation of secure Internet communications with a backdoor that
will only be exploited by criminals or the very terrorists that we’re supposedly
trying to thwart.
Remember, if the government can lose an enormous cache of extraordinarily
sensitive, deeply personal information on millions of its own employees, one can
only wonder what horrors would be visited upon us if it somehow succeeded in
destroying encryption as well.
They Say: “Helps ISIS”
No, encryption isn’t needed to stop ISIS. But it is vital to U.S. cybersecurity.
Landau 15 — Susan Landau, Professor of Cybersecurity Policy in the Department of Social Science and Policy
Studies at Worcester Polytechnic Institute, serves on the Computer Science Telecommunications Board of the National
Research Council, former Senior Staff Privacy Analyst at Google, former Distinguished Engineer at Sun Microsystems,
former faculty member at the University of Massachusetts at Amherst and at Wesleyan University, has held visiting
positions at Harvard, Cornell, Yale, and the Mathematical Sciences Research Institute, holds a Ph.D. in Mathematics from
the Massachusetts Institute of Technology, 2015 (“Director Comey and the Real Threats,” Lawfare, July 3rd, Available
Online at http://www.lawfareblog.com/director-comey-and-real-threats, Accessed 07-06-2015)
Conflation obscures issues. That's what's happening now with FBI Director Comey's
arguments regarding ISIS, Going Dark, and device encryption. On Wednesday, Ben,
quoting the director, discussed how the changes resulting from ISIS means we ought to reexamine the whole encryption
issue. "Our job is to find needles in a nationwide haystack, needles that are increasingly invisible to us because of end-to-end encryption," Comey said. "This is the 'going dark' problem in high definition."
Nope. Comey is looking at the right issue but in the wrong way. The
possibility of ISIS attacks on US
soil is very frightening. But as the New York Times reports, though the organization
inspires lone wolf terrorists, it doesn't organize them to conduct their nefarious
acts.
Encryption is not the difficulty in determining who the attackers might be and
where their intentions lie. But encryption is important in combating our most
serious national security concerns. I've quoted William Lynn here before, but the point he made is
directly relevant, and it bears repeating. The Deputy Secretary of Defense wrote, "the threat to intellectual
property is less dramatic than the threat to critical national infrastructure, [but] it
may be the most significant cyberthreat that the United States will face over the long
term."
The way you protect against such threats is communications and computer
security everywhere. This translates to end-to-end encryption for
communications, securing communications devices, etc. This is why, for example, NSA has
supported technological efforts to secure devices, communications, and networks in the private sector.
Thoughts of an armed thug wielding a machete or shooting a semiautomatic rifle
at a Fourth of July parade or picnic are terrifying. But one thing we expect out of
government officials is rational thought and a sense of priorities. Tackling ISIS
domestically is difficult, but there is no evidence that being able to listen to
communications would have helped prevent the attacks on Charlie Hebdo, in
Tunisia, or other ISIS-inspired efforts. Meanwhile there is plenty of evidence that
securing our communications and devices would have prevented the breaches at
Anthem, OPM, and elsewhere. The latter are serious long-term national security
threats.
Securing the US means more than protecting against a knife-wielding fanatic; it
includes securing the economy and developing the infrastructure that protects
against long-term threats. We expect our leaders to prioritize, putting resources
to the most important threats and making the choices that genuinely secure our
nation. Director Comey's comments mixing ISIS with discussions about
communications security and encryption do not rise to that level.
There’s no evidence that encryption empowers ISIS.
Wheeler 15 — Marcy Wheeler, independent journalist writing about national security and civil liberties, has written
for the Guardian, Salon, and the Progressive, author of Anatomy of Deceit about the CIA leak investigation, Recipient of the
2009 Hillman Award for blog journalism, holds a Ph.D. in Comparative Literature from the University of Michigan, 2015
(“Jim Comey May Not Be a Maniac, But He Has a Poor Understanding of Evidence,” Empty Wheel—Dr. Marcy Wheeler’s
blog, July 6th, Available Online at https://www.emptywheel.net/2015/07/06/jim-comey-may-not-be-a-maniac-but-he-has-a-poor-understanding-of-evidence/, Accessed 07-20-2015)
Apparently, Jim Comey wasn’t happy with his stenographer, Ben Wittes. After having Ben write up Comey’s concerns on
encryption last week, Comey has written his own explanation of his concerns about encryption at Ben’s blog.
Here are the 3 key paragraphs.
2. There are many benefits to this. Universal strong encryption will protect all of us—our innovation, our
private thoughts, and so many other things of value—from thieves of all kinds. We will all have lock-boxes in
our lives that only we can open and in which we can store all that is valuable to us. There are lots of good
things about this.
3. There are many costs to this. Public safety in the United States has relied for a couple centuries on the ability
of the government, with predication, to obtain permission from a court to access the “papers and effects” and
communications of Americans. The Fourth Amendment reflects a trade-off inherent in ordered liberty: To
protect the public, the government sometimes needs to be able to see an individual’s stuff, but only under
appropriate circumstances and with appropriate oversight.
4. These two things are in tension in many contexts. When the government’s ability—with appropriate
predication and court oversight—to see an individual’s stuff goes away, it will affect public safety. That tension
is vividly illustrated by the current ISIL threat, which involves ISIL operators in Syria recruiting and tasking
dozens of troubled Americans to kill people, a process that increasingly takes part through mobile messaging
apps that are end-to-end encrypted, communications that may not be intercepted, despite judicial orders under
the Fourth Amendment. But the tension could as well be illustrated in criminal investigations all over the
country. There is simply no doubt that bad people can communicate with impunity in a world of universal
strong encryption.
Comey admits encryption lets people lock stuff away from criminals (and supports
innovation), and admits “there are lots of good things about this.” He then introduces
“costs,” without enumerating them. In a paragraph purportedly explaining how
the “good things” and “costs” are in tension, he raises the ISIL threat as well as —
as an afterthought — “criminal investigations all over the country.”
Without providing any evidence about that tension.
As I have noted, the recent wiretap report raises real questions, at least about the
“criminal investigations all over the country,” which in fact are not being
thwarted. On that ledger, at least, there is no question: the “good things” (AKA, benefits)
are huge, especially with the million or so iPhones that get stolen every year, and
the “costs” are negligible, just a few wiretaps law enforcement can’t break.
I conceded we can’t make the same conclusions about FISA orders — or the FBI
generally — because Comey’s agency’s record keeping is so bad (which is consistent with all
the rest of its record-keeping). It may well be that we’re not able to access ISIL
communications with US recruits because of encryption, but simply invoking the
existence of ISIL using end-to-end encrypted mobile messaging apps is not
evidence (especially because so much evidence indicates that sloppy end-user
behavior makes it possible for FBI to crack this).
Especially after the FBI’s 0-for-40 record about making claims about terrorists
since 9/11.
It may be that the FBI is facing increasing problems tracking ISIL. It may even be
— though I’m skeptical — that those problems would outweigh the value of making
stealing iPhones less useful.
But even as he calls for a real debate, Comey offers not one bit of real evidence to counter
the crappy FBI reporting in the official reports to suggest this is not more FBI
fearmongering.
This argument is baseless fearmongering.
McLaughlin 15 — Jenna McLaughlin, Reporter and Blogger covering surveillance and national security for The
Intercept, former national security and foreign policy reporter and editorial fellow at Mother Jones, 2015 (“FBI and Comey
Find New Bogeyman for Anti-Encryption Arguments: ISIS,” The Intercept, July 7th, Available Online at
https://firstlook.org/theintercept/2015/07/07/fbi-finds-new-bogeyman-anti-encryption-arguments-isis/, Accessed 07-20-2015)
After months of citing hypothetical crimes as a reason to give law enforcement a magical key to unlock encrypted digital
messages, FBI Director James Comey
has latched onto a new bogeyman: ISIS.
In a speech he gave in October 2014 as part of his coordinated push to make the case that the FBI is “going dark,”
Comey leaned on examples of kidnappers and child abusers who texted details of their
violent plots that law enforcement agents weren’t privy to.
But, as The Intercept reported shortly after, those examples were largely bogus and had
nothing to do with encryption.
Now, in a preview of his appearance Wednesday before the Senate Intelligence Committee, Comey is playing
the ISIS card, saying that it is becoming impossible for the FBI to stop the group’s recruitment and planned attacks.
(He uses an alternate acronym, ISIL, for the Islamic State.)
“The current ISIL threat … involves ISIL operators in Syria recruiting and tasking dozens of troubled Americans to kill
people, a process that increasingly takes part through mobile messaging apps that are end-to-end encrypted,
communications that may not be intercepted, despite judicial orders under the Fourth Amendment,” Comey wrote on
Monday in a blog post on the pro-surveillance website Lawfare.
While providing no specific, independently confirmable examples, Comey has
claimed that FBI agents are currently encountering problems because of encrypted
communications as they track potential ISIS sympathizers and radicals.
Comey has long argued that sophisticated encryption technology being implemented by tech giants, including Google
and Apple, will make it harder and harder for the FBI to track its targets. Encryption scrambles the contents of digital
communications, making it impossible for users without the “key” to read messages in plain language.
Comey has vaguely indicated that he wants tech companies to build a special entrance to communications: a specific
passcode or key — or combination of keys — that only law enforcement can use, when appropriate.
Privacy and cryptology experts have come out strongly against Comey’s
suggestion, arguing that encryption makes people safer, and that creating a hole in encryption for law enforcement
creates a hole for criminals to go through, too.
They also note that law enforcement can thwart encryption in most cases, and can
supplement investigations with traditional methods not involving surveillance.
“The FBI have been trying to argue that the internet is ‘going dark’ for several years now, and Congress has not yet
bought into their propositions,” Amie Stepanovich, the U.S. policy manager for digital rights at Access, an international
pro-privacy organization, wrote in an email to The Intercept. “Terrorist threats are harder to substantiate and easier to use
as justifications for additional funding,” she wrote.
According to a Federal Courts report on wiretapping in 2014 published last week, law
enforcement personnel at the state and federal level were only stymied by
encryption on four wiretaps all year.
Neema Singh Guliani, legislative counsel for the American Civil Liberties Union, said she thinks
that report might be a reason Comey has switched from arguing about the restrictions on federal
law enforcement to focusing on the dangers posed by ISIS. “According to the report,
encryption has not been a significant impediment for law enforcement,” Guliani
wrote in an email. “This represents a decrease from prior years. Given this report,
Comey’s prior contention that backdoors are needed for federal law enforcement
needs is unpersuasive.”
Tiffiny Cheng, co-founder of Fight for the Future, a nonprofit dedicated to privacy rights, said that
Comey is fueling a culture of fear that enriches both defense contractors and the
agencies they support. “The U.S. government has not looked at data on efficacy to decide where the line
between security and liberty should be, instead they just shoot whatever they want from the mouth in order to stay in the
game,” she wrote in an email.
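To make concrete McLaughlin’s description above — encryption “scrambles the contents of digital communications, making it impossible for users without the ‘key’ to read messages in plain language” — here is a toy one-time-pad sketch in Python. It is illustrative only: the XOR cipher and every name in it are simplifications of my own, not the authenticated ciphers (e.g., AES) that Apple and Google actually deploy.

```python
import secrets

def xor_cipher(data: bytes, key: bytes) -> bytes:
    """One-time-pad XOR: the same operation both encrypts and decrypts."""
    return bytes(b ^ k for b, k in zip(data, key))

message = b"meet at the usual place"
key = secrets.token_bytes(len(message))   # the secret "key", held only by the communicating parties
ciphertext = xor_cipher(message, key)     # scrambled content; unreadable without the key

# Only the holder of the key recovers the plaintext.
assert xor_cipher(ciphertext, key) == message
# Any other key produces only gibberish.
assert xor_cipher(ciphertext, secrets.token_bytes(len(message))) != message
```

In these terms, the “backdoor” dispute in the cards above is a dispute over who else gets a copy of `key`: any escrowed copy held for law enforcement is one more thing that criminals or foreign governments can steal.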
They Say: “Garland, TX (Elton Simpson)”
The Garland (Elton Simpson) example goes aff.
McLaughlin 15 — Jenna McLaughlin, Reporter and Blogger covering surveillance and national security for The
Intercept, former national security and foreign policy reporter and editorial fellow at Mother Jones, 2015 (“FBI and Comey
Find New Bogeyman for Anti-Encryption Arguments: ISIS,” The Intercept, July 7th, Available Online at
https://firstlook.org/theintercept/2015/07/07/fbi-finds-new-bogeyman-anti-encryption-arguments-isis/, Accessed 07-20-2015)
To the extent that Comey has mentioned any specific ISIS-related investigation, it
is one that doesn’t support his argument.
In May, after two gunmen arrived at a controversial anti-Muslim exhibition in
Garland, Texas, and were slain by law enforcement before they could carry out their attack, Comey publicly
announced that the FBI had been tracking one of the would-be attackers, Elton Simpson, for
months.
“This is the ‘going dark’ problem in living color. There are Elton Simpsons out there that I have not found and I cannot
see,” he said.
But FBI surveillance didn’t stop Elton Simpson — the Garland Police
Department did. The local police never got the FBI’s email, and if they had,
Garland’s Police Chief Bates told NPR, the response would not have been any
different: “Please note that the contents of that email would not have prevented
the shooting nor would it have changed the law enforcement response in any
fashion.”
They Say: “Benjamin Wittes”
Wittes’s argument is terrible.
Cushing 15 — Tim Cushing, Staff Writer for Techdirt, 2015 (“NSA Apologist Offers Solutions To 'Encryption'
Problem, All Of Which Are Basically 'Have The Govt Make Them Do It',” Techdirt, July 15th, Available Online at
https://www.techdirt.com/articles/20150714/20210431643/nsa-apologist-offers-solutions-to-encryption-problem-all-which-are-basically-have-govt-make-them-do-it.shtml, Accessed 07-20-2015)
Benjamin Wittes,
one of the NSA apologists ensconced at Lawfare, has written a
long piece in defense of FBI head James Comey's assertions that there must be some
way tech companies can give him what he wants without compromising the
privacy and security of every non-terrorist/criminal utilizing the same broken
encryption.
What he suggests is highly problematic, although he obviously pronounces that
word as "pragmatic." He implies the solution is already known to tech
companies, but that their self-interest outweighs the FBI's push for a "greater
good" fix.
The theory is that companies have every incentive for market reasons to protect consumer privacy, but no
incentives at all to figure out how to provide law enforcement access in the context of doing so.
There's some truth to this theory. Tech companies are particularly wary of appearing to be complicit in government
surveillance programs as a couple of years of leaks have done considerable damage to their prospects in foreign markets.
Wittes suggests the government isn't doing much to sell this broken encryption plan, despite Comey's multiple statements
on the dangers posed by encrypted communications. And he's right. If the government truly wants a "fix," it needs to start
laying the groundwork. It can't just be various intel/law enforcement heads stating "we're not really tech guys" and
suggesting tech companies put the time and effort into solving their problems for them.
If we begin—as the computer scientists do—with a posture of great skepticism as to the plausibility of any
scheme and we place the burden of persuasion on Comey, law enforcement, and the intelligence community to
demonstrate the viability of any system, the obvious course is government-sponsored research. What we need
here is not a Clipper Chip-type initiative, in which the government would develop and produce a complete
system, but a set of intellectual and technical answers to the challenges the technologists have posed. The goal
here should be an elaborated concept paper laying out how a secure extraordinary access system would work
in sufficient detail that it can be evaluated, critiqued, and vetted; think of the bitcoin paper here as a model.
Only after a period of public vetting, discussion, and refinement would the process turn to the question of what
sorts of companies we might ask to implement such a system and by what legal means we might ask.
Thus ends the intelligent suggestions in Wittes' thinkpiece. Everything else is exactly the sort of thing Comey keeps
hinting at, but seems unwilling to actually put in motion. It's the government-power elephant in the room. Actually,
several elephants. It's the underlying, unvocalized threat that lies just below the surface of Comey's government-slanted
PR efforts. Wittes just goes through the trouble of vocalizing them.
First, he gives
Comey's chickenshit, ignorant sales pitch a completely disingenuous,
self-serving reading. Comey has refused to acknowledge the fact that what he's
seeking is not actually possible. He claims he doesn't have the tech background
to make more informed assertions while simultaneously insisting the solution
exists – and could easily be found if only these tech companies were willing to
apply themselves.
[Comey] is talking in very different language: the language of performance requirements. He wants to leave the
development task to Silicon Valley to figure out how to implement government's requirements. He wants to
describe what he needs—decrypted signal when he has a warrant—and leave the companies to figure out how
to deliver it while still providing secure communications in other circumstances to their customers.
The advantage to this approach is that it potentially lets a thousand flowers bloom. Each company might do it
differently. They would compete to provide the most security consistent with the performance standard. They
could learn from each other. And government would not be in the position of developing and promoting
specific algorithms. It wouldn't even need to know how the task was being done.
In Wittes' estimation, Comey is being wise and promoting open innovation,
rather than just refusing to openly acknowledge that his desire to access and
intercept communications far exceeds his desire to allow millions of non-criminals access to safer connections and communications.
Wittes goes on to offer a handful of "solutions" to the Second Crypto War. Not a single one
includes the government growing up and learning to deal with the new,
encrypted status quo. He follows up the one useful suggestion – government research exploring the feasibility
of the proposed encryption bypass – with one of his worst ideas:
If you simply require the latter [law enforcement access] as a matter of law, [tech companies] will devote
resources to the question of how to do so while still providing consumer security. And while the problems are
hard, they will prove manageable once the tech giants decide to work them hard—rather than protesting their
impossibility.
There's not a worse idea out there than making certain forms of encryption
illegal to use in the United States. But Wittes tries his hardest to find equally awful
ideas. Like this one, which would open tech companies to an entire new area of liability.
Another, perhaps softer, possibility is to rely on the possibility of civil liability to incentivize companies to focus
on these issues. At the Senate Judiciary Committee hearing this past week, the always interesting Senator
Sheldon Whitehouse posed a question to Deputy Attorney General Sally Yates about which I've been thinking
as well: "A girl goes missing. A neighbor reports that they saw her being taken into a van out in front of the
house. The police are called. They come to the home. The parents are frantic. The girl's phone is still at home."
The phone, however, is encrypted.
Wittes quotes Whitehouse's statements, in which he compares encryption to industrial pollution
and suggests tech companies – not the criminal in question; not the investigators who are seemingly unable
to explore other options – be held liable for the criminal's actions. Wittes poses a rhetorical question –
one that assumes most of America wants what Comey wants.
Might a victim of an ISIS attack domestically committed by someone who communicated and plotted using
communications architecture specifically designed to be immune, and specifically marketed as immune, from
law enforcement surveillance have a claim against the provider who offered that service even after the director
of the FBI began specifically warning that ISIS was using such infrastructure to plan attacks? To the extent such
companies have no liability in such circumstances, is that the distribution of risk that we as a society want?
Holding companies responsible for the actions of criminals is completely stupid.
Providing encryption to all shouldn't put companies at risk of civil suits. The
encryption isn't being provided solely for use by bad guys. It makes no more
sense than holding FedEx responsible for shipments of counterfeit drugs. And yet,
we've seen our government do exactly that, in essence requiring every affected private company to act as deputized law
enforcement entities, despite there being no logical reason to put them in this position. Wittes
feels the best
solutions involve the government forcing companies to bend to its will, and
provide compromised encryption under duress.
The final solution proposed by Wittes is to let everything go to hell and assume the political
landscape – along with tech companies' "sympathies" – will shift accordingly. This would be the "let's hope for
the tragic death of a child" plan:
[W]e have an end-to-end encryption issue, in significant part, because companies are trying to assure customers
worldwide that they have their backs privacy-wise and are not simply tools of NSA. I think those politics are
likely to change. If Comey is right and we start seeing law enforcement and intelligence agencies blind in
investigating and preventing horrible crimes and significant threats, the pressure on the companies is going to
shift. And it may shift fast and hard. Whereas the companies now feel intense pressure to assure customers that
their data is safe from NSA, the kidnapped kid with the encrypted iPhone is going to generate a very different
sort of political response. In extraordinary circumstances, extraordinary access may well seem reasonable.
If this does happen, Wittes' assumption will likely be correct. Politicians have never been shy about capitalizing on
tragedies to nudge the government power needle. This will be no different. One wonders why no one has come forward
with a significantly compelling tragedy by this point, considering the wealth of encryption options currently on the
market. A logical person would assume this lack of compelling anecdotal evidence would suggest encryption really hasn't
posed a problem yet -- especially considering the highly-motivated sales pitches that have been offered nonstop since
Google and Apple's announcement of their encryption-by-default plans. The
"problem" Comey and
others so desperately wish to "solve" remains almost entirely theoretical at this
point.
But the FBI and others aren't going to wait until the next tragedy. They want the
path of least resistance now. The solutions proposed by Wittes are exactly the
sort of thing they'd be interested in: expanded government power and increased
private sector liability. This is why Comey has no solution to offer. There is
none. There is only the option of making companies do what he wants, but he's
too wary of public backlash to actually say these things out loud. Wittes has
saved him the trouble and proven himself no more trustworthy than those who
want easy access, no matter the negative implications or unintended
consequences of these actions.
Wittes relies on a flawed analogy. Encrypted communications aren’t an
ungovernable space.
Friedersdorf 15 — Conor Friedersdorf, Staff Writer for The Atlantic, 2015 (“How Dangerous Is End-to-End
Encryption?,” The Atlantic, July 14th, Available Online at http://www.theatlantic.com/politics/archive/2015/07/nsa-encryption-ungoverned-spaces/398423/, Accessed 07-12-2015)
Technologists warn that there is no way to build in a “backdoor” just for law
enforcement, much as there’s no way to outfit a safe with a backdoor that only
the FBI can open. If encryption is weakened so that government can, in theory,
access anyone’s data with a warrant, then in practice, everyone’s
communications will be vulnerable to Chinese hackers, the Russian government,
and NSA employees operating beyond constitutional bounds without
individualized warrants.
Over at Lawfare, Benjamin Wittes, who tends to favor increasing rather than circumscribing governmental
powers that relate to national security, muses at length on this question. “Would it be a good idea to have a
world-wide communications infrastructure that is, as Bruce Schneier has aptly put it, secure from all attackers?” he asks.
“That is, if we could snap our fingers and make all device-to-device communications perfectly secure against interception
from the Chinese, from hackers, from the FSB but also from the FBI even wielding lawful process, would that be
desirable? Or, in the alternative, do we want to create an internet as secure as possible from everyone except government
investigators exercising their legal authorities with the understanding that other countries may do the same?”
He finds it useful to consider that question before deciding whether it is even possible to secure the Internet against
everyone except lawful government investigators.
For now, let’s play along.
He says that the answer is not a close call.
“The belief in principle in creating a giant world-wide network on which surveillance is technically impossible is really an
argument for the creation of the world's largest ungoverned space,” he writes. “I understand why techno-anarchists find
this idea so appealing. I can't imagine for moment, however, why anyone else would.”
He goes on to attempt an analogy:
Consider the comparable argument in physical space: the creation of a city in which authorities are entirely
dependent on citizen reporting of bad conduct but have no direct visibility onto what happens on the streets
and no ability to conduct search warrants (even with court orders) or to patrol parks or street corners. Would
you want to live in that city? The idea that ungoverned spaces really suck is not controversial when you're
talking about Yemen or Somalia. I see nothing more attractive about the creation of a worldwide architecture in
which it is technically impossible to intercept and read ISIS communications with followers or to follow child
predators into chatrooms where they go after kids.
Even at this conceptual level, before
even considering whether a government-only
backdoor is possible and cost-effective, it seems to me that Wittes’s analysis is flawed.
The problem lies in the limits of his analogy.
In an ungoverned territory like Somalia, bad actors can take violent physical
actions with impunity—say, seizing a cargo ship, killing the captain, and taking hostages. If authorities
were similarly helpless on America’s streets—if gangs could rob or murder pedestrians as they
pleased, and police couldn’t see or do a thing—that would, indeed, be dystopian. But when
communications are encrypted, the “ungoverned territory” does not encompass
actions, violent or otherwise, just thoughts and their expression.
No harm is done within the encrypted space.
To be sure, plots planned inside that space can do terrible damage in the real
world—but so can plots hatched by gang members on public streets whispering
into one another’s ears, or Tony Soprano out on his boat, having swept it for FBI
bugs.
Wittes’s argument justifies government surveillance of everyone’s thoughts.
His argument is terrible.
Friedersdorf 15 — Conor Friedersdorf, Staff Writer for The Atlantic, 2015 (“How Dangerous Is End-to-End
Encryption?,” The Atlantic, July 14th, Available Online at http://www.theatlantic.com/politics/archive/2015/07/nsa-encryption-ungoverned-spaces/398423/, Accessed 07-12-2015)
I wonder what Wittes would make of a different analogy.
In the absence of end-to-end encryption—indeed, even if it becomes a universally available tool—
the largest ungoverned space in the world won’t be Somalia or the Internet, but
the aggregate space between the ears of every human being on planet earth.
No authority figure can see into my brain, or the brain of Chinese human-rights
activist Liu Xiaobo, or the brains of ISIS terrorists, or the brains of Black Lives
Matter protestors, or the brains of child pornographers, or the brains of Tea
Partiers or progressive activists or whoever it is that champions your political
ideals. If government had access to all of our thoughts, events from the American
Revolution to the 9/11 terrorist attacks would’ve been impossible. Reflecting on
this vast “ungoverned territory,” does Wittes still regard as uncontroversial the
notion that “ungoverned spaces really suck”? More to the point, if it’s ever technically
possible, would he prefer a world in which government is afforded a “backdoor”
into my brain, yours, and those of the next Osama bin Laden, MLK, and Dylann
Storm Roof?
To be clear, I don’t mean to assert that “backdoor” access to digital communications is just like equivalent access to our
brains. But say
that end-to-end encryption is the norm going forward. Do readers
think that America would be more like Somalia? Or more like today’s America,
only with greater privacy for thoughts, papers, and personal effects that enables
both significant goods and harms?
As in contemporary America—and unlike in Somalia—terrorists, child
pornographers, and other serious criminals would have to operate outside
“ungoverned spaces” to harm any innocents. The threats they pose can be
adequately addressed there.
They Say: “Cyrus Vance”
Vance is wrong.
O’Connor 14 — Nuala O’Connor, President and Chief Executive of the Center for Democracy & Technology,
former Global Privacy Leader at General Electric, former Vice President of Compliance & Consumer Trust and Associate
General Counsel for Data & Privacy Protection at Amazon.com, former Deputy Director of the Office of Policy & Strategic
Planning, Chief Privacy Officer, and Chief Counsel for Technology at the United States Department of Commerce, former
Chief Privacy Officer at the Department of Homeland Security, holds a J.D. from Georgetown University Law Center,
2014 (“Apple and Google are helping to protect our privacy,” Letter To The Editor — Washington Post, October 2nd,
Available Online at http://www.washingtonpost.com/opinions/apple-and-google-are-helping-to-protect-our-privacy/2014/10/02/ea35524a-4824-11e4-a4bf-794ab74e90f0_story.html, Accessed 07-05-2015)
When law enforcement officials ask technology companies to make technology
less secure in the name of public safety, they are asking for weakened privacy
protections that leave citizens open to hacking and fraud.
In his Sept. 28 Sunday Opinion piece, “Can you catch me now? Good,” Cyrus R. Vance Jr., the district attorney of
Manhattan, argued that Apple and Google are preventing law enforcement from
obtaining critical evidence stored on smartphones as a result of the companies’ new encryption-by-default approach. This is not true. Law enforcement, with a warrant, can access
phone backups stored on a hard drive or in the “cloud.” Law enforcement also
can access with a warrant any data stored by a company on behalf of a user.
Further, a court can order a person to unlock his or her phone.
People are aware of threats to privacy and personal information stored
electronically, including from unscrupulous hackers and thieves and invasive
government surveillance. We should encourage stronger device security to
reduce crime, rather than support weaker standards that open people to outside
attacks.
Apple and Google deserve kudos for stepping up to better secure and protect our
personal communications and privacy.
They Say: “Ronald Hosko”
Hosko is wrong.
Lee 14 — Timothy B. Lee, Senior Editor covering technology at Vox, previously covered technology policy for the
Washington Post and Ars Technica, former Adjunct Scholar at the Cato Institute, holds a Master’s in Computer Science from
Princeton University, 2014 (“The government says iPhone encryption helps criminals. They're wrong.,” Vox, September
29th, Available Online at https://www.vox.com/2014/9/29/6854679/iphone-encryption-james-comey-government-backdoor, Accessed 07-05-2015)
Comey suggested that law enforcement access to the contents of smartphones would be
essential to saving lives in terrorism and kidnapping cases. But his speech was short
on specific examples where encryption actually thwarted — or would have thwarted — a
major police investigation.
Last week, former FBI official Ronald Hosko wrote an op-ed in the Washington Post offering a
concrete example of a case where smartphone encryption would have thwarted a law
enforcement investigation and cost lives. "Had this technology been in place," Hosko wrote, "we
wouldn’t have been able to quickly identify which phone lines to tap. That delay would have cost us our victim his life."
There's just one problem: Hosko was wrong. In the case he cited, the police had
not used information gleaned from a seized smartphone. Instead, they used
wiretaps and telephone calling records — methods that would have been
unaffected by Apple's new encryption feature. The Washington Post was forced
to issue a correction.
Indeed, while law enforcement groups love to complain about ways that encryption
and other technologies have made their jobs harder, technology has also provided the
police with vast new troves of information to draw upon in their investigations. With the assistance
of cell phone providers, law enforcement can obtain detailed records of a suspect's every move. And consumers
increasingly use cloud-computing services that store emails, photographs, and other private information on servers where
they can be sought by investigators.
So while smartphone encryption could make police investigations a bit more
difficult, the broader trend has been in the other direction: there are more and
more ways for law enforcement to gain information about suspects. There's no
reason to think smartphone encryption will be a serious impediment to solving
crimes.
They Say: “L. Gordon Crovitz”
Crovitz is totally wrong.
Masnick 14 — Mike Masnick, Founder and Chief Executive Officer of Floor64—a software company, Founder and
Editor of Techdirt, 2014 (“Ridiculously Misinformed Opinion Piece In WSJ Asks Apple And Google To Make Everyone
Less Safe,” Techdirt, November 24th, Available Online at
https://www.techdirt.com/articles/20141124/07011329232/ridiculously-misinformed-opinion-piece-wsj-asks-apple-google-to-make-everyone-less-safe.shtml, Accessed 07-20-2015)
Former Wall Street Journal publisher L. Gordon Crovitz
still gets to publish opinion pieces in the
WSJ. And while I often find them interesting, any time he touches on technology in almost any
manner, he seems to fall flat on his face, often in embarrassing ways – such as the time he
insisted the internet was invented by companies without government support (yes, he really argued that). Crovitz
has also been strongly pro-surveillance state for years. He's attacked Wikileaks and Chelsea Manning by
blatantly taking quotes out of context, and then last year, writing a column about the Snowden leaks
that showed he doesn't understand even the basic facts. Crovitz tends to see the
world the way he wants to see it, rather than the way it really is.
His latest is no exception, repeating a bunch of bogus or debunked claims to argue
that the tech industry should happily insert back doors into technology to aid in
surveillance. He kicks it off by both repeating the false claim concerning how
"subway bomber" Najibullah Zazi was caught, but also totally misunderstanding the difference
between encrypting data on a device and encrypting data in transit:
It’s a good thing Najibullah Zazi didn’t have access to a modern iPhone or Android device a few years ago
when he plotted to blow up New York City subway stations. He was caught because his email was tapped by
intelligence agencies—a practice that Silicon Valley firms recently decided the U.S. government is no longer
permitted.
Apple, Google, Facebook and others are playing with fire, or in the case of Zazi with a plot to blow up subway
stations under Grand Central and Times Square on Sept. 11, 2009. An Afghanistan native living in the U.S., Zazi
became a suspect when he used his unencrypted Yahoo email account to double-check with his al Qaeda
handler in Pakistan about the precise chemical mix to complete his bombs. Zazi and his collaborators, identified
through phone records, were arrested shortly after he sent an email announcing the imminent attacks: “The
marriage is ready.”
Except, no. It wouldn't have mattered if he had a modern iPhone or Android
device because whether or not email is encrypted is entirely unrelated to whether
or not data on the device is encrypted. What Apple and Google are promising
now is to encrypt data on the device. Even if that was turned on, if you send an
unencrypted email, it's still available to be viewed. Crovitz is comparing two
completely different things and doesn't seem to realize it. What kind of standards
does the WSJ have when it allows such false arguments to be published
uncritically?
Furthermore, the fact that Zazi sent an unencrypted email via Yahoo was a different issue. And Yahoo encrypted all its
email connections a while ago, and no one freaked out at the time. Even so, that's unimportant, because law
enforcement and the intelligence community can and do still read emails with a
warrant. And, as was made clear by many in the analysis of the Zazi case, he had been watched by
law enforcement for a while. The phone encryption that Google and Apple are
discussing would have had no impact whatsoever on the Zazi case. So why even bring it
up, other than pure surveillance state FUD?
But, to
someone as ignorant of the basics as Crovitz, it's an opportunity to double
down.
The Zazi example (he pleaded guilty to conspiracy charges and awaits sentencing) highlights the risks that
Silicon Valley firms are taking with their reputations by making it impossible for intelligence agencies or law
enforcement to gain access to these communications.
Except, again, that's not true. Intelligence
agencies and law enforcement would still have
access to communications in transit – just not data held on his phone directly
(which they wouldn't have unless they got the phone itself). Second, it still wouldn't be "impossible" to
get the information. They could either crack the encryption or issue a subpoena
ordering the phone's owner to unlock the data (or potentially face a contempt of court
ruling). While there are some 5th Amendment concerns with that latter route, it's still not "impossible." And it's not about
communications. Crovitz
is just totally ignorant of what he's writing about.
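Masnick's core distinction — that encrypting data at rest on a device and encrypting data in transit over the network are independent layers — can be sketched in a few lines of code. This is a purely illustrative toy (the XOR function is a stand-in for real cryptography, and the strings are invented), showing why device encryption would not have hidden Zazi's unencrypted email:

```python
import secrets

def toy_xor_cipher(data: bytes, key: bytes) -> bytes:
    """Toy repeating-key XOR -- illustration only, NOT real cryptography."""
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

# Layer 1: encryption AT REST protects data stored on the device itself.
device_key = secrets.token_bytes(16)
note_on_phone = toy_xor_cipher(b"contact list and photos", device_key)

# Layer 2: encryption IN TRANSIT is a separate, independent layer. An email
# sent in plaintext crosses the network readable, no matter how thoroughly
# the sender's device storage is encrypted.
email_on_the_wire = b"The marriage is ready"

# The stored copy is scrambled, and only the device key recovers it...
assert toy_xor_cipher(note_on_phone, device_key) == b"contact list and photos"
# ...but the plaintext email is untouched by the device's at-rest encryption.
assert email_on_the_wire == b"The marriage is ready"
```

Turning on at-rest encryption changes only the first layer; anything sent unencrypted remains readable in transit, which is the confusion Masnick identifies in Crovitz's argument.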
Crovitz’s argument requires banning math.
Kocher 15 — Paul Kocher, President and Chief Scientist of Cryptography Research, Inc.—a cryptography company
specializing in applied cryptographic engineering, Member of the National Academy of Engineering and the National
Cyber Security Hall of Fame, 2015 (“It’s Still 2+2=4 for NSA and ISIS,” Wall Street Journal — Letter to the Editor, July 14th,
Available Online at http://www.wsj.com/articles/its-still-2-2-4-for-nsa-and-isis-1436908138, Accessed 07-20-2015)
L. Gordon Crovitz
is puzzled that Silicon Valley can’t stop terrorists from using
strong encryption (“Why Terrorists Love Silicon Valley,” Information Age, July 6). The reason is
simple. Encryption methods are nothing more than mathematics. Silicon Valley
companies cannot make mathematics work differently for terrorists.
Mr. Crovitz imagines our government might seek to ban unbreakable crypto.
Software, compilers, math textbooks and other widely available materials that
make it simple for individuals to create their own crypto would all have to be
tightly regulated. Terrorists could, of course, still use codewords or abstract poetry,
so these would naturally need to be banned. Identical laws would also have to be
passed and enforced in every country.
Crovitz’s argument doesn’t make sense. The FBI still has many investigative
tools.
Yakowicz 15 — Will Yakowicz, Staff Writer for Inc. magazine, holds a B.A. in Journalism and English Literature
from New York University, 2015 (“What the Government Gets Wrong About Cybersecurity,” Inc., July 7th, Available
Online at http://www.inc.com/will-yakowicz/is-data-encryption-really-the-enemy-fbi-claims.html, Accessed 07-20-2015)
A Senate Judiciary Committee hearing on Wednesday is set to put a spotlight on the fine line tech companies must walk
between keeping users' data private and making it available to the government in matters of national security.
Edward Snowden's
revelations about the National Security Agency's surveillance programs, coupled
with a deluge of data breaches, led to an outcry for better privacy and security
measures. Tech companies listened, and have been beefing up security features in
consumer products. In May, Apple, Facebook, Google, and others sent a letter to President Obama urging him to shoot
down proposals that would force them to enfeeble the security of their products to help law enforcement access
encrypted data more easily.
On the other side of the debate, FBI director James Comey
has waged a campaign against
encryption, stating that if his agency cannot intercept internet communications, it will not be able to save citizens
from terror attacks. During a speech, Comey said, "encryption threatens to lead all of us to a very dark place."
Earlier this week, L. Gordon Crovitz, a former publisher of The Wall Street Journal, echoed
Comey's
concerns in an op-ed in the WSJ entitled, "Why Terrorists Love Silicon Valley." The column contends that
tech companies have made consumer products too secure with end-to-end encryption. Crovitz says when the FBI asked
tech companies to find a way to balance privacy encryption and court-ordered legal searches, the technologists said it was
impossible.
"Terror attacks are increasingly planned online, outside the reach of intelligence and law enforcement," Crovitz writes.
"Once a recruit is identified, ISIS tells him to switch to an encrypted smartphone. Legal wiretaps are useless because the
signal is indecipherable. Even when the devices are lawfully seized through court orders, intelligence and law-enforcement agencies are unable to retrieve data from them."
Not only is this kind of language irresponsible, linking encryption to terrorism,
Crovitz's fearmongering article seems to hold up Comey's campaign against
encryption as the unvarnished truth. Data encryption will not result in the U.S.
getting attacked. Compelling tech businesses to tear down basic privacy
measures in the service of fighting terrorism is a move back to a surveillance
state.
Post-9/11 fear led to a decade of mass surveillance. Considering how little was
accomplished, it's clear that relying on access to data and monitoring electronic
communications is not the only way to prevent terrorist plots. Cyberattacks have
started to cross over into the physical world—because smartphones, appliances, and our country's
infrastructure are all connected to the internet, our nation's security actually depends on
encryption.
Meanwhile, encryption appears to have done little to foil law enforcement's use of
wire and electronic surveillance to bring down terrorists. According to the United
States Courts' Wiretap Report 2014, instances of encrypted devices interfering with
wiretaps decreased nearly by half from 2013 to 2014. Moreover, the report also found
that last year the vast majority of wiretaps were granted for investigations into
drug deals, not potential terrorist plots.
The FBI's claim of being crippled if tech companies don't allow them to monitor
electronic communications is a stretch, says David Gorodyansky, co-founder of virtual private network
provider AnchorFree. "It would presume the FBI was completely useless before the
internet was created," he says.
AnchorFree helps its 350 million users surf the Web anonymously to prevent hackers from stealing their identity,
advertising firms from selling their search history, and governments from censoring online activity. By default, the
company does not collect any data on its users. When law enforcement agencies serve AnchorFree with subpoenas, it
complies but has little information to share.
Gorodyansky says greater dialogue between businesses, consumers, and government agencies can help to achieve a
balance between privacy and public safety. He says if he were to be involved in these conversations, he would propose a
system where every internet user has a default right to privacy and encryption. But that right can be lost if a user sets off a
trigger like visiting a terrorist website, talking about or searching ways to make bombs, or other actions of that nature.
He estimates that only
1 percent of people use the internet for criminal purposes. So
why should everyone else be subject to invasions of privacy? To illustrate his point, he
mentions a friend of his who traveled to North Korea, where he noticed that none of the homes had curtains or blinds in
the windows. When he asked why, his guide told him that if you cover your windows you have something to hide.
"In
North Korea, you have to keep your windows wide open so people can look
inside," Gorodyansky says. "Is that the society we want to create here?"
Crovitz is lying about the Louisiana case.
Masnick 14 — Mike Masnick, Founder and Chief Executive Officer of Floor64—a software company, Founder and
Editor of Techdirt, 2014 (“Ridiculously Misinformed Opinion Piece In WSJ Asks Apple And Google To Make Everyone
Less Safe,” Techdirt, November 24th, Available Online at
https://www.techdirt.com/articles/20141124/07011329232/ridiculously-misinformed-opinion-piece-wsj-asks-apple-google-to-make-everyone-less-safe.shtml, Accessed 07-20-2015)
Since then, U.S. and British officials have made numerous trips to Silicon Valley to explain the dangers. FBI
Director James Comey gave a speech citing the case of a sex offender who lured a 12-year-old boy in Louisiana
in 2010 using text messages, which were later obtained to get a murder conviction. “There should be no one in
the U.S. above the law,” Mr. Comey said, “and also no places within the U.S. that are beyond the law.”
Again, different issue. The
Louisiana case? That was debunked a day later, and it wasn't
because of access to the kind of information that would now be encrypted. As noted
by the Intercept:
In another case, of a Louisiana sex offender who enticed and then killed a 12-year-old boy, the
big break
had nothing to do with a phone: The murderer left behind his keys and a
trail of muddy footprints, and was stopped nearby after his car ran out of
gas.
Next thing you know, Crovitz will argue that all shoes should come pre-muddied so that law enforcement can track them. After all, how could law enforcement track
down criminals who don't leave a trail of muddy footprints?
Then Crovitz shifts to his own personal worldview -- insisting that the public actually doesn't want privacy or protection
from the snooping eyes of government. He insists that, in truth, the public really wants to be spied on.
It looks like Silicon Valley has misread public opinion. The initial media frenzy caused by the Edward Snowden
leaks has been replaced by recognition that the National Security Agency is among the most lawyered agencies
in the government. Contrary to initial media reports, the NSA does not listen willy-nilly to phone and email
communications.
Last week, the Senate killed a bill once considered a sure thing. The bill would have created new barriers to the
NSA obtaining phone metadata to connect the dots to identify terrorists and prevent their attacks. Phone
companies, not the NSA, would have retained these records. There would have been greater risks of leaks of
individual records. An unconstitutional privacy advocate would have been inserted into Foreign Intelligence
Surveillance Court proceedings.
First off, no, the USA Freedom Act was never "a sure thing." From the very beginning, it was considered a massive long
shot. And, no, it would not have "created new barriers" -- it would have merely made it clear that the NSA can't simply
collect everyone's data in the hopes of magically sifting through the haystack and finding connections. Also, Crovitz is flat
out wrong (again!) that this would have led to a "greater risk" because the phone companies held the data. While this was
the key talking point among those who voted against it, it's simply incorrect. The telcos already retain that information.
The bill made no changes to what information telcos could and would retain. It only said that they shouldn't also have to
ship all that data to the NSA as well. There was no increased risk. Saying so is -- once again -- trumpeting Crovitz's
ignorance.
Furthermore, the idea that the public is miraculously comfortable with the government spying on them... based on the
government voting against curtailing government surveillance is simply ludicrous. It doesn't even pass a basic laugh test.
The Pew Research poll that tracks this issue most closely continues to show that the vast majority of people are against
NSA surveillance on American data, and the numbers who feel that way have been growing consistently since the first of
the Snowden revelations.
But let me repeat the assertion Crovitz made here, just to remind everyone of how idiotic it is: he's saying that the public
is now comfortable with surveillance because Congress voted down surveillance reform. And he thinks this is obvious.
The lesson of the Snowden accusations is that citizens in a democracy make reasonable trade-offs between
privacy and security once they have all the facts. As people realized that the rules-bound NSA poses little to no
risk to their privacy, there was no reason to hamstring its operations. Likewise, law-abiding people know that
there is little to no risk to their privacy when communications companies comply with U.S. court orders.
Facts, huh? It's kind of funny that he'd argue for the facts when he seems to be lacking in many of them. And he's wrong.
There is tremendous risk to privacy, as illustrated by the fact that the NSA regularly abused its powers to spy on
Americans. Furthermore, he ignores (or is ignorant of the fact) that much of the data the NSA collects is also freely
available to the CIA and FBI -- and that the FBI taps into it so often that it doesn't even track how many times it dips into
the database.
And of course, none of this even bothers to point out that the reason why Google and Apple are increasing encryption is
because it makes us all much safer from actual everyday threats -- including the very threats that the NSA and others in
law enforcement keep warning us about. Making us all safer is a good thing, though, not to L. Gordon Crovitz,
apparently.
Crovitz is either woefully clueless and misinformed or he's purposely
misleading the American public. Neither reflects well on him or the Wall Street
Journal.
They Say: “Stewart Baker”
Baker is wrong.
Cohn et al. 14 — Cindy Cohn, Executive Director and former Legal Director and General Counsel of the Electronic
Frontier Foundation, holds a J.D. from the University of Michigan Law School, with Jeremy Gillula, Staff Technologist at
the Electronic Frontier Foundation, holds a Ph.D. in Computer Science from Stanford University, and Seth Schoen, Senior
Staff Technologist at the Electronic Frontier Foundation, 2014 (“What Default Phone Encryption Really Means For Law
Enforcement,” Vice News, October 8th, Available Online at https://news.vice.com/article/what-default-phone-encryption-really-means-for-law-enforcement, Accessed 07-05-2015)
After Apple announced it was expanding the scope of what types of data would be encrypted on devices running iOS 8,
the law enforcement community was set ablaze with indignation. When Google followed suit and announced that
Android L would also come with encryption on by default, it added fuel to the fire.
Law enforcement officials have angrily decried Apple and Google's decisions,
using all sorts of arguments against the idea of default encryption (including the classic
"Think of the children!" line of reasoning). One former NSA and Department of Homeland Security
official even suggested that because China might forbid Apple from selling a
device with default encryption, the US should forbid Apple from doing so here.
A former high-ranking American security official claiming the US should match
China in restricting the use of privacy-enhancing technology is disconcerting, to
put it mildly.
They Say: “Washington Post ‘14”
The Washington Post is wrong.
Cohn et al. 14 — Cindy Cohn, Executive Director and former Legal Director and General Counsel of the Electronic
Frontier Foundation, holds a J.D. from the University of Michigan Law School, with Jeremy Gillula, Staff Technologist at
the Electronic Frontier Foundation, holds a Ph.D. in Computer Science from Stanford University, and Seth Schoen, Senior
Staff Technologist at the Electronic Frontier Foundation, 2014 (“What Default Phone Encryption Really Means For Law
Enforcement,” Vice News, October 8th, Available Online at https://news.vice.com/article/what-default-phone-encryption-really-means-for-law-enforcement, Accessed 07-05-2015)
Unfortunately, that hasn't stopped law enforcement from twisting the nature of the Apple and Google announcements in
order to convince the public that default encryption on mobile devices will bring about a doomsday scenario of criminals
using "technological fortresses" to hide from the law. And sadly, some people seem to be buying this propaganda. Last
week, the
Washington Post published an editorial calling for Apple and Google to use "their
wizardry" to "invent a kind of secure golden key they would retain and use only when a court has
approved a search warrant."
While the Post's Editorial Board may think technologists can bend cryptography
to their every whim, it isn't so. Cryptography is about math, and math is made
up of fundamental laws that nobody, not even the geniuses at Apple and Google, can break.
One of those laws is that any key, even a golden one, can be stolen by ne'er-do-wells. Simply put, there is no such thing as a key that only law enforcement can use
— any key creates a new backdoor that becomes a target for criminals, industrial
spies, or foreign adversaries.
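The EFF authors' point that any key "becomes a target" is structural: a key-escrow design concentrates risk in one secret. Here is a minimal sketch of why (the user names and the toy XOR key-wrapping are invented for illustration; real escrow proposals use asymmetric key-wrapping, but the single-point-of-failure property is the same):

```python
import secrets

def toy_xor_cipher(data: bytes, key: bytes) -> bytes:
    """Toy repeating-key XOR -- illustration only, NOT real cryptography."""
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

# The hypothetical "golden key" escrow: every user's device key is wrapped
# under one master key retained for law enforcement use.
golden_key = secrets.token_bytes(16)

user_keys = {name: secrets.token_bytes(16) for name in ("alice", "bob", "carol")}
escrow_db = {name: toy_xor_cipher(key, golden_key)
             for name, key in user_keys.items()}

# Nothing in the math distinguishes a warrant-holder from a thief: whoever
# obtains the golden key recovers EVERY user's key from the escrow database.
stolen_key = golden_key
recovered = {name: toy_xor_cipher(wrapped, stolen_key)
             for name, wrapped in escrow_db.items()}
assert recovered == user_keys
```

One compromised secret unlocks the entire user base at once, which is why the cards above treat a "golden key" as just another backdoor.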
The “Golden Key” is impossible.
Poulsen 14 — Kevin Poulsen, Senior Editor at Wired, former hacker who developed SecureDrop, 2014 (“Apple’s
iPhone Encryption Is a Godsend, Even if Cops Hate It,” Wired, October 8th, Available Online at
http://www.wired.com/2014/10/golden-key/, Accessed 07-05-2015)
However it got there, Apple has come to the right place. It’s a basic axiom of information security that “data at rest”
should be encrypted. Apple should be lauded for reaching that state with the iPhone. Google should be praised for
announcing it will follow suit in a future Android release.
And yet, the argument for encryption backdoors has risen like the undead. In
a much-discussed editorial
that ran Friday, The Washington Post sided with law enforcement. Bizarrely, the Post
acknowledges backdoors are a bad idea—“a back door can and will be exploited by bad guys, too”—
and then proposes one in the very next sentence: Apple and Google, the paper says, should invent
a “secure golden key” that would let police decrypt a smartphone with a warrant.
The paper doesn’t explain why this “golden key” would be less vulnerable to
abuse than any other backdoor. Maybe it’s the name, which seems a product of the same
branding workshop that led the Chinese government to name its Internet censorship system the “golden shield.” What’s
not to like? Everyone loves gold!
Implicit in the Post’s argument is the notion that the existence of the search
warrant as a legal instrument obliges Americans to make their data accessible:
that weakening your crypto is a civic responsibility akin to jury duty or paying
taxes. “Smartphone users must accept that they cannot be above the law if there is a valid search warrant,” writes the
Post.
This talking point, adapted from Comey’s press conference, is an insult to anyone savvy
enough to use encryption. Both Windows and OS X already support strong full-disk crypto, and using it is a
de facto regulatory requirement for anyone handling sensitive consumer or medical data. For the rest of us, it’s
common sense, not an unpatriotic slap to the face of law and order.
This argument also misunderstands the role of the search warrant. A search
warrant allows police, with a judge’s approval, to do something they’re not
normally allowed to do. It’s an instrument of permission, not compulsion. If the
cops get a warrant to search your house, you’re obliged to do nothing except stay
out of their way. You’re not compelled to dump your underwear drawers onto
your dining room table and slash open your mattress for them. And you’re not
placing yourself “above the law” if you have a steel-reinforced door that doesn’t
yield to a battering ram.
They Say: “Washington Post ‘15”
The Post editorial board has zero technical expertise.
Whittaker 15 — Zack Whittaker, Writer-Editor for ZDNet, CNET, and CBS News, 2015 (“After Washington Post
rolls out HTTPS, its editorial board bemoans encryption debate,” ZDNet, July 19th, Available Online at
http://www.zdnet.com/article/after-washington-post-rolls-out-https-its-editorial-board-decries-encryption-debate/,
Accessed 07-20-2015)
There's hope that by the time the Washington Post's editorial board takes a third
crack at the encryption whip, it might say something worthwhile.
Late on Saturday, The Washington Post's editorial board published what initially read as a scathing anti-encryption
and pro-government rhetoric opinion piece that scolded Apple and Google (albeit a somewhat incorrect assertion) for
providing "end-to-end encryption" (again, an incorrect assertion) on their devices, locking out federal authorities
investigating serious crimes and terrorism.
Read to the end, and you'll find the
editorial came up with nothing.
It was a bland and mediocre follow-up to a similar opinion piece, which was called
"staggeringly dumb" and "seriously embarrassing" for proposing a "golden key"
to bypass encryption.
Critically, what the Post gets out of this editorial remains widely unknown, perhaps with the exception of riling up
members of the security community. It's not as though the company is particularly invested in either side. Aside the
inaccuracies in the board's opinion, and the fair (and accurate) accusation that the article said "nothing" (one assumes that
means nothing of "worth" or "value"), it's hypocritical to make more than one statement on this matter while at the same
time becoming the first major news outlet to start encrypting its entire website.
The board's follow-up sub-600-word note did not offer anything new, but reaffirmed its desire to
see both tech companies and law enforcement "reconcile the competing imperatives" for privacy and data access,
respectively. (It's worth noting the board's opinion does not represent every journalist or reporter working at the national
daily, but it does reflect the institution's views on the whole.)
Distinguished security researcher Kenn White dismissed the editorial in just three
words: "Nope. No need."
Because right now, there is no viable way to allow both encrypted services while
allowing police and federal agencies access to that scrambled information
through so-called "backdoor" means. Just last week, a group of 13 of the world's preeminent
cryptographers and security researchers released a paper (which White linked to in his tweet)
explaining that "such access will open doors through which criminals and
malicious nation-states can attack the very individuals law enforcement seeks to
defend."
In other words: if there's a secret way in for the police and the feds, who's to say a hacker won't find it, too?
The Post's own decision to roll out encryption across its site seems bizarre
considering the editorial board's conflicting views on the matter.
Such head-scratching naivety prompted one security expert to ask anyone who
covers security at the Post to "explain reality" to the board. Because, clearly, the
board isn't doing its job well if on two separate occasions it's fluffed up reporting
on a subject with zero technical insight.
If the board, however, needs help navigating the topic, there is no doubt a virtual long line of security experts, academics,
and researchers lining up around the block ready to assist. At least then there's hope the board can strike it third-time
lucky in covering the topic.
The Post cites no evidence — ignore it.
Cushing 15 — Tim Cushing, Staff Writer for Techdirt, 2015 (“Washington Post Observes Encryption War 2.0 For
Several Months, Learns Absolutely Nothing,” Techdirt, July 20th, Available Online at
https://www.techdirt.com/articles/20150719/19031331697/washington-post-observes-encryption-war-20-several-months-learns-absolutely-nothing.shtml, Accessed 07-20-2015)
Last October, Apple and Google's announcement of encryption-by-default for iOS and Android devices was greeted
with law enforcement panic, spearheaded by FBI director James Comey, who has yet to find the perfect dead
child to force these companies' hands.
The Washington Post editorial board found Comey's diatribes super-effective! It
published a post calling for some sort of law enforcement-only, magical hole in
Apple and Google's encryption.
How to resolve this? A police “back door” for all smartphones is undesirable — a back door can and will be
exploited by bad guys, too. However, with all their wizardry, perhaps Apple and Google could invent a kind of
secure golden key they would retain and use only when a court has approved a search warrant. Ultimately,
Congress could act and force the issue, but we’d rather see it resolved in law enforcement collaboration with
the manufacturers and in a way that protects all three of the forces at work: technology, privacy and rule of law.
When is a "back door" not a "back door?" Well, apparently when an editorial board
spells it G-O-L-D-E-N K-E-Y. It's the same thing, but in this particular pitch, it magically
isn't, because good intentions. Or something.
Months later, the debate is still raging. But it's boiled down to two arguments:
1. This is impossible. You can't create a "law enforcement only" backdoor in
encryption. It's simply not possible because a backdoor is a backdoor and can be
used by anyone who can locate the door handle.
2. No, it isn't. Please see below for citations and references:
[the article includes several paragraph breaks to indicate that there are, in fact, no citations or references]
The FBI is at an impasse. Comey firmly believes this is possible, despite openly
admitting he has zero evidence to back this claim up. When asked for specifics,
Comey defers to "smart tech guys" and their warlock-like skills.
Sensing James Comey might be struggling a bit, the editorial board of the Washington Post is
once again riding to the rescue. And they've brought the same level of
cluelessness with them. (h/t to Techdirt reader Steve R.)
Mr. Comey’s assertions should be taken seriously. A rule-of-law society cannot allow sanctuary for those who
wreak harm. But there are legitimate and valid counter arguments from software engineers, privacy advocates
and companies that make the smartphones and software. They say that any decision to give law enforcement a
key — known as “exceptional access” — would endanger the integrity of all online encryption, and that would
mean weakness everywhere in a digital universe that already is awash in cyberattacks, thefts and intrusions.
They say that a compromise isn’t possible, since one crack in encryption — even if for a good actor, like the
police — is still a crack that could be exploited by a bad actor. A recent report from the Massachusetts Institute
of Technology warned that granting exceptional access would bring on “grave” security risks that outweigh the
benefits.
After providing some statements opposing its view on the matter -- most notably an actual research paper written by
actual security researchers -- the editorial board continues on to declare this all irrelevant.
The tech companies are right about the overall importance of encryption, protecting consumers and ensuring
privacy. But these companies ought to more forthrightly acknowledge the legitimate needs of U.S. law
enforcement.
And by "forthrightly acknowledge," the board means "give law enforcement what it wants, no matter the potential
damage." After all, what's PERSONAL safety, security and a handful of civil liberties compared to "legitimate needs of
law enforcement?"
All freedoms come with limits; it seems only proper that the vast freedoms of the Internet be subject to the same
rule of law and protections that we accept for the rest of society.
Your rights end where law enforcement's "legitimate needs" begin. Except they don't. The needs of law enforcement don't
trump the Bill of Rights. The needs of law enforcement don't automatically allow it to define the acceptable parameters of
the communications of US citizens.
The editorial finally wraps up by calling for experts in the field to resolve this issue:
This conflict should not be left unattended. Nineteen years ago, the National Academy of Sciences studied the
encryption issue; technology has evolved rapidly since then. It would be wise to ask the academy to undertake
a new study, with special focus on technical matters, and recommendations on how to reconcile the competing
imperatives.
The WaPo editorial board is no better than James Comey. It can cite nothing in
support of its view, and yet it still believes it's right. And just like Comey, the board
is being wholly disingenuous in its "deferral" to security researchers and tech
companies. It, like Comey, wants to hold two contradictory views:
1. Tech/security researchers are dumb when they say this problem can't be solved.
2. Tech/security researchers are super-smart and can solve this problem.
So, they (the board and Comey) want to ignore the "smart guys" when they say this is
impossible, but both are willing to listen if they like the answers they're hearing.
Politics DA
No Link — Under The Radar
Encryption flies under the radar — no link.
Geller 15 — Eric Geller, Deputy Morning Editor at The Daily Dot—the “hometown newspaper of the Internet,” 2015
(“The rise of the new Crypto War,” The Daily Dot, July 10th, Available Online at
http://www.dailydot.com/politics/encryption-crypto-war-james-comey-fbi-privacy/, Accessed 07-20-2015)
The encryption hearing attracted scant attention on Capitol Hill—certainly
nowhere near as much press as the Republican Party’s endless Benghazi hearings.
While many representatives lambasted Hess and Conley for their dubious arguments, the
issue failed to break out into the mainstream. Encryption is not a sexy issue,
even if it has huge ramifications for privacy and civil liberties. Higher-profile,
more partisan fights are consuming Washington right now; lawmakers would
rather attend hearings and deliver speeches about those issues. That’s the way to
rile up voters, score endorsements, and secure donations.
No Link — Obama
Either Obama won’t spend political capital on encryption or his new position
will quickly end the debate.
Geller 15 — Eric Geller, Deputy Morning Editor at The Daily Dot—the “hometown newspaper of the Internet,” 2015
(“The rise of the new Crypto War,” The Daily Dot, July 10th, Available Online at
http://www.dailydot.com/politics/encryption-crypto-war-james-comey-fbi-privacy/, Accessed 07-20-2015)
Divided government
As Comey, Rogers, and other national-security officials campaign for backdoors, one important voice has been largely absent from the debate.
“I lean probably further in the direction of strong encryption than some do inside of law enforcement,” President Barack
Obama told Recode’s Kara Swisher on Feb. 15, shortly after Obama spoke at the White House summit on cybersecurity
and consumer protection in Silicon Valley.
Obama’s interview with Swisher marked a rare entrance for the president into the
backdoor debate, which has pitted his law-enforcement professionals against the
civil libertarians who were encouraged by his historic 2008 election and disappointed
by his subsequent embrace of the surveillance status quo.
If the president felt strongly enough about strong encryption, he could swat
down FBI and NSA backdoor requests any time he wanted. White House
advisers and outside experts have offered him plenty of policy cover for doing
so. The President’s Review Group on Intelligence and Communications Technologies, which Obama convened in the
wake of the Snowden disclosures, explicitly discouraged backdoors in its final report.
The review group recommended “fully supporting and not undermining efforts to create encryption standards,” ...
“making clear that [the government] will not in any way subvert, undermine, weaken, or make vulnerable generally
available commercial encryption,” and “supporting efforts to encourage the greater use of encryption technology for data
in transit, at rest, in the cloud, and in storage.”
The report also warned of “serious economic repercussions for American businesses” resulting from “a growing distrust
of their capacity to guarantee the privacy of their international users.” It was a general warning about the use of electronic
surveillance, but it nevertheless applies to the potential fallout from a backdoor mandate.
The White House’s own reports on cybersecurity and consumer privacy suggest
that the president generally supports the use of encryption. “To the extent that
you’ve heard anything from the White House and from the president, it’s in
favor of making sure that we have strong encryption and that we’re building
secure, trustworthy systems,” said Weitzner, who advised Obama as U.S. deputy chief technology officer
for Internet policy from 2011 to 2012.
Weitzner pointed out that the president had subtly quashed a push for backdoors by the previous FBI director, Robert
Mueller.
Mueller “hoped that the administration would end up supporting a very substantial [Internet-focused] expansion of
CALEA,” Weitzner said. “That didn’t happen, and … despite the fact that you had the FBI director come out very
strongly saying [criminals] were going dark, the administration never took a position as a whole in support of that kind of
statutory change. You can read between the lines.”
Obama’s reluctance to directly confront his FBI chief reflects the bureau’s long
history of autonomy in debates over law-enforcement powers, said a former Obama
administration official.
“It’s pretty well understood that the FBI has a certain amount of independence when they’re out in the public-policy debate advocating for whatever they think is important,” the former official said.
The White House is reviewing "the technical, geopolitical, legal, and economic implications" of various
encryption proposals, including the possibility of legislation, administration officials told the Washington Post
this week. Weitzner said that Obama may also be waiting until the latest round of the Crypto Wars has progressed further.
“The White House tends to get involved in debates once they’ve matured,” he said.
“If you jump in on everything right up front, the volume can become
unmanageable. I know that there’s a lot of attention being paid, and I think that’s the right thing to do at this
point.”
The president’s noncommittal stance has earned him criticism from pro-encryption lawmakers who say that their fight would be much easier if the
commander-in-chief weighed in. “The best way to put all this to bed,” Hurd said,
“would be for the president to be very clear saying that he is not interested in
pursuing backdoors to encryption and believes that this is the wrong path to go,
in order to squash the debate once and for all.”
If Obama ever formally came out against backdoors, it would represent a
significant shift away from decades of anti-encryption government policies,
including undermining industry-standard security tools and attacking tech
companies through public bullying and private hacking.