Encryption

1AC — Secure Data Act
First, encryption is a fundamental human right. Without it, privacy and
freedom of speech are impossible.
O’Neill 14 — Patrick Howell O'Neill, Reporter who covers the future of the internet for The Daily Dot—the
“hometown newspaper of the internet,” 2014 (“Encryption shouldn’t just be an election issue—it should be a human
right,” Kernel Magazine, October 19th, Available Online at http://kernelmag.dailydot.com/issue-sections/staffeditorials/10587/encryption-human-right/#sthash.zpesaOpG.dpuf, Accessed 06-24-2015)
We are all born with certain inalienable rights. Now, the notion of free speech—core to the entire
idea of human rights—must be constantly re-examined in the face of a rapidly changing world where the most important
speech increasingly takes place on the Internet.
Free speech isn’t merely about the abstract idea of saying whatever you want. It’s the
freedom to speak, ask questions, and seek knowledge without anyone, hidden or
otherwise, looking over your shoulder. In the digital age, free speech is about being
able to use the Internet unmolested by the greedy eyes of corporations with
incentive to sell your personal data, hackers wanting to steal or destroy it, and
massive regimes collecting it all.
Everything a person does online is saved as data. The messages you send, the
websites you look at, the files you save are all data that can live on forever. The
only way to protect your most personal data—your location, credit card numbers, political
thoughts, medical queries, and everything else you do on a phone, tablet, or computer—from malicious spying
eyes is through encryption.
That’s why encryption should be considered a human right.
Modern encryption—at its core, really good mathematics that make it possible to protect your data so that no computer
can decrypt it without your go-ahead—is legally protected by a 1995 American court decision that declared computer
code constitutionally protected free speech. But many of the world’s top cops continue to demonize encryption, saying it
will inevitably enable and immunize criminals on massive scales.
Criticisms of encryption coming from corners of power are louder now than
they’ve been in two decades. Earlier this week, James B. Comey, director of the Federal Bureau of
Investigation (FBI), said that law enforcement agencies should have access to encrypted communications, because “the
law hasn’t kept pace with technology, and this disconnect has created a significant public safety problem.” He believes
Apple’s new smartphone encryption puts its users “beyond the law.”
What is bound to be one of the great political issues of this century is only now
beginning to enter the mainstream in a clear way. One day in the not-too-distant
future, cryptography will be an election issue. Prominent politicians will debate
louder than ever about privacy and secrecy while voters—even moms and dads—will
cast ballots with strong encryption on their minds.
That’s not to say the debate hasn’t already begun. The war over encryption is four decades old. It
dates back to the 1970s when a new invention called public key cryptography definitively broke the government’s
monopoly on secrets. All of a sudden, using free software that utilized clever mathematics and powerful cryptography,
normal people were able to keep data private from even the most powerful states on the planet.
The war over encryption—most notably the so-called “crypto wars” of the 1990s—saw the American
government try to make strong encryption a military-grade weapon in the eyes
of the law. Opposed chiefly by the Electronic Frontier Foundation, courts declared
computer code to be free speech and said the government’s regulations were
unconstitutional.
Despite the landmark legal victory, the war over encryption has continued to this
day.
John J. Escalante, chief of detectives for the Chicago Police Department, has called
encryption mostly a tool of pedophiles—a claim that’s disingenuous and
misleading, if not outright dangerous. For one thing, many city and federal police
agents use encryption tools regularly, and encryption stymied a total of nine
police investigations last year. There are plenty of ways to investigate crimes
involving cryptography that don’t involve banning or curtailing it.
There’s no denying that these tools have some very ugly users. However, for the
few billion of us who want to keep our digital lives private from unwanted
eavesdroppers and hackers, being forcefully grouped in with terrorists and
pedophiles is a hard insult to stomach.
Encryption works to protect you—and everyone else—online. More than that, it’s the
best protection you have. There are simply no other options that can compare.
If, for some reason, you assume a hack will never happen to you, let me give you some
perspective on the current state of digital security. 2014 is known in information technology circles as
“the year of the breach” because it has boasted some of the biggest hacks in
history. 2013 had a nickname too: The year of the breach. Come to think of it, 2012 was
called something eerily similar: The year of the breach.
2011? You get the idea.
This isn’t merely one year of massive security breaches, it’s an era of profound digital
insecurity in which sensitive personal data—the information that can be put together to add up to a
startlingly complete picture of our lives and thoughts—is under attack by criminals, corporations,
and governments whose sophistication, budget, and drive is only growing.
Consider the following, put forth by Eben Moglen, a law professor at Columbia University, in 2010: “Facebook
holds and controls more data about the daily lives and social interactions of half
a billion people than 20th-century totalitarian governments ever managed to
collect about the people they surveilled.”
The Internet’s “architecture has also made it possible for businesses and
governments to fill giant data vaults with the ore of human existence—the
appetites, interests, longings, hopes, vanities, and histories of people surfing the
Internet, often unaware that every one of their clicks is being logged, bundled,
sorted, and sold to marketers,” the New York Times journalist Jim Dwyer wrote in his new book, More
Awesome Than Money. “Together, they amount to nothing less than a full psyche scan,
unobstructed by law or social mores.”
When people like Comey suggest that law enforcement should have a “back door” or
“golden key” that allows cops to easily access all encrypted communication, they
are willfully ignoring the reality shouted to them by the vast majority of the
information technology industry.
“You can’t build a ‘back door’ that only the good guys can walk through,”
cryptographer Bruce Schneier wrote recently. “Encryption protects against cybercriminals,
industrial competitors, the Chinese secret police, and the FBI. You’re either
vulnerable to eavesdropping by any of them, or you’re secure from
eavesdropping from all of them.”
When encryption becomes a campaign issue, that’s going to go on the bumper stickers: You either have real
privacy and security for everyone or for no one.
“The existing ‘back doors’ in network switches, mandated under U.S. laws such as CALEA, have
become the go-to weak-spot for cyberwar and industrial espionage,” author Cory
Doctorow wrote in the Guardian. “It was Google’s lawful interception backdoor that let the Chinese government raid the
Gmail account of dissidents. It was the lawful interception backdoor in Greece’s national telephone switches that let
someone—identity still unknown—listen in on the Greek Parliament and prime minister during a sensitive part of the
2005 Olympic bid (someone did the same thing the next year in Italy).”
If, like many Americans, you say you don’t mind if the U.S. government watches what
you do online, take a step back and consider the bigger picture.
The American government is not the only government—nevermind other
organizations—watching and hacking people on the Internet. China, Russia,
Iran, Israel, the U.K., and every other nation online decided long ago that
cyberspace is a militarized country. All the states with the necessary resources
are doing vast watching and hacking as well.
Encryption proved a crucial help to protesters during the Arab Spring. It helps
Iranian liberals push against their oppressive theocracy. From African free
speech activists to Chinese pro-democracy organizers to American cops
investigating organized crime, strong encryption saves lives, aids law
enforcement (ironic, huh?), protects careers, and helps build a more free and verdant
world. Journalists—citizen and professional alike—depend on encryption to keep
communications and sources private from the people and groups they report on,
making it essential to an independent and free press.
The right to privacy, the right to choose what parts of yourself are exposed to the world, was described over a century ago
by the U.S. Supreme Court and held up as an issue of prime importance last year by U.N. human rights chief Navi Pillay.
It’s something we all need to worry about.
Lacking good law, privacy is best defended by good technology. You cannot truly
talk about online privacy without talking about encryption. That’s why many of
the world’s biggest tech firms such as Google, Apple, and Yahoo are adding strong
encryption to some of their most popular products.
“There is only one way to make the citizens of the digital age secure, and that is
to give them systems designed to lock out everyone except their owners,” Doctorow
wrote. “The police have never had the power to listen in on every conversation, to
spy upon every interaction. No system that can only sustain itself by arrogating
these powers can possibly be called ‘just.’”
In the digital age, encryption is our only guarantee of privacy. Without it, the
ideal of free speech could be lost forever.
Second, U.S. government attacks on encryption leave everyone vulnerable,
jeopardizing privacy and data security. There is no such thing as a “good guys
only” backdoor.
Doctorow 14 — Cory Doctorow, journalist and science fiction author, Co-Editor of Boing Boing, Fellow at the
Electronic Frontier Foundation, former Canadian Fulbright Chair for Public Diplomacy at the Center on Public Diplomacy
at the University of Southern California, recipient of the Electronic Frontier Foundation’s Pioneer Award, 2014 (“Crypto
wars redux: why the FBI's desire to unlock your private life must be resisted,” The Guardian, October 9th, Available Online
at http://www.theguardian.com/technology/2014/oct/09/crypto-wars-redux-why-the-fbis-desire-to-unlock-yourprivate-life-must-be-resisted, Accessed 06-24-2015)
Eric Holder, the outgoing US attorney general, has
joined the FBI and other law enforcement
agencies in calling for the security of all computer systems to be fatally
weakened. This isn’t a new project – the idea has been around since the early 1990s, when the NSA classed all strong
cryptography as a “munition” and regulated civilian use of it to ensure that they had the keys to unlock any technological
countermeasures you put around your data.
In 1995, the Electronic Frontier Foundation won a landmark case establishing that code was a form of protected
expression under the First Amendment to the US constitution, and since then, the whole world has enjoyed relatively
unfettered access to strong crypto.
How strong is strong crypto? Really, really strong. When properly implemented
and secured by relatively long keys, cryptographic algorithms can protect your
data so thoroughly that all the computers now in existence, along with all the
computers likely to ever be created, could labour until the sun went nova
without uncovering the keys by “brute force” – ie trying every possible permutation of password.
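Doctorow’s “sun goes nova” claim is easy to sanity-check with back-of-envelope arithmetic. The sketch below (in Python) is our illustration, not his; the guess rate of one quintillion keys per second is an assumption chosen to be absurdly generous to the attacker:

# Back-of-envelope check (our arithmetic, not the article's) on brute-forcing
# a modern key, assuming an attacker testing 1e18 keys per second.

SECONDS_PER_YEAR = 60 * 60 * 24 * 365

def years_to_search(key_bits: int, guesses_per_second: float = 1e18) -> float:
    """Expected years to try half the keyspace of a key_bits-bit key."""
    keyspace = 2 ** key_bits
    return (keyspace / 2) / guesses_per_second / SECONDS_PER_YEAR

for bits in (56, 128, 256):
    print(f"{bits}-bit key: ~{years_to_search(bits):.1e} years")

# 56-bit key:  ~1.1e-09 years (DES-era keys fall in a fraction of a second)
# 128-bit key: ~5.4e+12 years (hundreds of times the age of the universe)
# 256-bit key: ~1.8e+51 years (the sun goes nova first, with room to spare)

Each added key bit doubles the search, which is why no plausible growth in computing budgets closes the gap against well-chosen long keys.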
The “crypto wars” of the early 1990s were fuelled by this realisation – that computers were
changing the global realpolitik in an historically unprecedented way. Computational crypto made
keeping secrets exponentially easier than breaking secrets, meaning that, for the
first time in human history, the ability for people without social or political
power to keep their private lives truly private from governments, police, and
corporations was in our grasp.
The arguments then are the arguments now. Governments invoke the Four
Horsemen of the Infocalypse (software pirates, organised crime, child
pornographers, and terrorists) and say that unless they can decrypt bad guys’
hard drives and listen in on their conversations, law and order is a dead letter.
On the other side, virtually every security and cryptography expert tries
patiently to explain that there’s no such thing as “a back door that only the good
guys can walk through” (hat tip to Bruce Schneier). Designing a computer that bad guys
can’t break into is impossible to reconcile with designing a computer that good
guys can break into.
If you give the cops a secret key that opens the locks on your computerised
storage and on your conversations, then one day, people who aren’t cops will get
hold of that key, too. The same forces that led to bent cops selling out the public’s personal information to Glenn
Mulcaire and the tabloid press will cause those cops’ successors to sell out access to the world’s computer systems, too,
only the numbers of people who are interested in these keys to the (United) Kingdom will be much larger, and they’ll
have more money, and they’ll be able to do more damage.
That’s really the argument in a nutshell. Oh, we can talk about whether the danger is
as grave as the law enforcement people say it is, point out that only a tiny number of
criminal investigations run up against cryptography, and when they do, these
investigations always find another way to proceed. We can talk about the fact
that a ban in the US or UK wouldn’t stop the “bad guys” from getting perfect
crypto from one of the nations that would be able to profit (while US and UK business
suffered) by selling these useful tools to all comers. But that’s missing the point:
even if every crook was using crypto with perfect operational security, the
proposal to back-door everything would still be madness.
Because your phone isn’t just a tool for having the odd conversation with your friends –
nor is it merely a tool for plotting crime – though it does duty in both cases. Your phone, and all the other
computers in your life, they are your digital nervous system. They know
everything about you. They have cameras, microphones, location sensors. You
articulate your social graph to them, telling them about all the people you know and how you know
them. They are privy to every conversation you have. They hold your logins and
passwords for your bank and your solicitor’s website; they’re used to chat to your therapist and the STI clinic and
your rabbi, priest or imam.
That device – tracker, confessor, memoir and ledger – should be designed so that
it is as hard as possible to gain unauthorised access to. Because plumbing leaks at the seams,
and houses leak at the doorframes, and lilos lose air through their valves. Making something airtight is
much easier if it doesn’t have to also allow the air to all leak out under the right
circumstances.
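To make the leaky-seams analogy concrete, here is a minimal sketch of a key-escrow (“golden key”) design, our own illustration rather than any agency’s actual proposal, written against the third-party Python “cryptography” package. The escrow master key is the one seam through which everything can leak:

# A minimal sketch (our illustration, not any agency's actual design) of a
# key-escrow "golden key" scheme, using the third-party "cryptography"
# package (pip install cryptography).
from cryptography.fernet import Fernet

escrow_master = Fernet.generate_key()  # the "golden key" someone must guard forever

def encrypt_with_backdoor(plaintext: bytes) -> tuple[bytes, bytes]:
    """Encrypt for the owner, but also wrap the data key for the escrow holder."""
    data_key = Fernet.generate_key()
    ciphertext = Fernet(data_key).encrypt(plaintext)
    escrowed_key = Fernet(escrow_master).encrypt(data_key)  # the back door
    return ciphertext, escrowed_key

ciphertext, escrowed_key = encrypt_with_backdoor(b"message to my solicitor")

# Whoever obtains escrow_master (cop, crook, or foreign service) recovers the
# data key, and with it the plaintext, for every message ever escrowed.
recovered_key = Fernet(escrow_master).decrypt(escrowed_key)
assert Fernet(recovered_key).decrypt(ciphertext) == b"message to my solicitor"

The design point to notice: the back door is not a bug but a second, parallel decryption path, and its security reduces entirely to keeping one key secret forever.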
There is no such thing as a vulnerability in technology that can only be used by
nice people doing the right thing in accord with the rule of law. The existing
“back doors” in network switches, mandated under US laws such as CALEA, have become the
go-to weak-spot for cyberwar and industrial espionage. It was Google’s lawful interception
backdoor that let the Chinese government raid the Gmail account of dissidents. It was the lawful interception backdoor in
Greece’s national telephone switches that let someone – identity still unknown – listen in on the Greek Parliament and
prime minister during a sensitive part of the 2005 Olympic bid (someone did the same thing the next year in Italy).
The most shocking Snowden revelation wasn’t the mass spying (we already knew about
that, thanks to whistleblowers like Mark Klein, who spilled the beans in 2005). It was the fact that the UK
and US spy agencies were dumping $250,000,000/year into sabotaging operating
systems, hardware, and standards, to ensure that they could always get inside
them if they wanted to. The reason this was so shocking was that these spies
were notionally doing this in the name of “national security”– but they were
dooming everyone in the nation (and in every other nation) to using products
that had been deliberately left vulnerable to attack by anyone who
independently discovered the sabotage.
There is only one way to make the citizens of the digital age secure, and that is to
give them systems designed to lock out everyone except their owners. The
police have never had the power to listen in on every conversation, to spy upon
every interaction. No system that can only sustain itself by arrogating these
powers can possibly be called “just.”
Third, these government programs are an attack on encryption itself. The
damage is done even if agencies don’t abuse backdoors.
Gillmor 14 — Dan Gillmor, Director of the Knight Center for Digital Media Entrepreneurship at the Walter Cronkite
School of Journalism and Mass Communication at Arizona State University, Fellow at the Berkman Center for Internet &
Society at Harvard University, recipient of the Electronic Frontier Foundation’s Pioneer Award, 2014 (“Law Enforcement
Has Declared War on Encryption It Can’t Break,” Future Tense—a Slate publication, October 1st, Available Online at
http://www.slate.com/blogs/future_tense/2014/10/01/law_enforcement_has_declared_war_on_encryption_it_can_t_break.html, Accessed 06-24-2015)
Suppose two suspected criminals happened to be the last two people on Earth
who could understand and speak a certain language. Would law enforcement
argue that they should only be permitted to speak in a language that others
could understand?
That's an imperfect but still useful analogy to a "debate" now taking place in American
policy circles, as law enforcement reminds us that it will never be satisfied until
it can listen in on everything we say and read everything we create—in real time
and after the fact.
The spark for this latest tempest is Apple's announcement that iPhones will
henceforth be encrypted by default. Data on the device will be locked with a key that only the phone
user controls, not Apple. Meanwhile, Google's Android operating system—which, unlike Apple's iOS, has
offered this functionality for several years now as an option—will also make device encryption the
default setting in its next version.
Attorney General Eric Holder and FBI Director James Comey are leading the how-dare-you chorus, with close harmony from the usual authoritarian acolytes. Their
goal can't only be to get Apple and Google to roll back this latest move, because as many people have pointed out, almost
all of what people keep on their phones is also stored in various corporate cloud computers that law enforcement can pry
open in a variety of ways, including a subpoena, secret order in alleged national security cases, or outright hacking.
No, the longer-range objective seems plain enough. This
is the launch of the latest, and most
alarming, attack on the idea of encryption itself—or at least encryption the government can't
easily crack. In particular, as the latest push to control crypto makes clear, law
enforcement wants so-called back doors into users' devices: technology that users
can't thwart, just in case the police want to get in.
Never mind that Congress has already said it doesn’t expect communications providers to provide such capabilities in
most cases. Here's relevant language from the law:
A telecommunications carrier shall not be responsible for decrypting, or ensuring the government’s ability to
decrypt, any communication encrypted by a subscriber or customer, unless the encryption was provided by the
carrier and the carrier possesses the information necessary to decrypt the communication.
What's discouraging to people who believe in digital security—that is, security for our
devices and our communications from criminals, business competitors, and
government spies who flout the Constitution, among others—is law enforcement's
disconnection from the reality that back doors increase vulnerability. University of
Pennsylvania researcher and security expert Matt Blaze put it this way: “Crypto
backdoors are dangerous even if you trust the government not to abuse them.
We simply don't know how to build them reliably.”
The hackers of the world—criminals, foreign governments, you name it—will be thrilled if
Holder, Comey and the no-privacy-for-you backup singers get their way.
The other, even worse, disconnect is the implicit notion that there is no measure
we shouldn’t take to guarantee our ability to stop and punish crime. The
Constitution, and especially the Bill of Rights, says we do take some additional risks in
order to have liberty. Why have we become so paranoid and fearful as a society
that we’d even entertain the notion that civil liberties mean next to nothing in the
face of our fear?
Fourth, this attack on encryption harms privacy and security. The impact is
global.
Cohn 14 — Cindy Cohn, Executive Director and former Legal Director and General Counsel of the Electronic Frontier
Foundation, holds a J.D. from the University of Michigan Law School, 2014 (“EFF Response to FBI Director Comey's
Speech on Encryption,” Electronic Frontier Foundation, October 17th, Available Online at
https://www.eff.org/deeplinks/2014/10/eff-response-fbi-director-comeys-speech-encryption, Accessed 06-24-2015)
FBI Director James Comey
gave a speech yesterday reiterating the FBI's nearly twenty-year-old talking points about why it wants to reduce the security in your devices,
rather than help you increase it. Here's EFF's response:
The FBI should not be in the business of trying to convince companies to offer
less security to their customers. It should be doing just the opposite. But that's
what Comey is proposing—undoing a clear legal protection we fought hard for in the 1990s. The law
specifically ensures that a company is not required to essentially become an
agent of the FBI rather than serving your security and privacy interests. Congress
rightly decided that companies (and free and open source projects and anyone else
building our tools) should be allowed to provide us with the tools to lock our digital
information up just as strongly as we can lock up our physical goods. That's
what Comey wants to undo.
It's telling that his remarks echo so closely the arguments of that era. Compare them, for example, with this comment
from former FBI Director Louis Freeh in May of 1995, now nearly twenty years ago:
[W]e're in favor of strong encryption, robust encryption. The country needs it, industry needs it. We just want
to make sure we have a trap door and key under some judge's authority where we can get there if somebody is
planning a crime.
Now just as then, the
FBI is trying to convince the world that some fantasy version of
security is possible—where "good guys" can have a back door or extra key to
your home but bad guys could never use it. Anyone with even a rudimentary
understanding of security can tell you that's just not true. So the "debate" Comey
calls for is phony, and we suspect he knows it. Instead, Comey wants everybody to have
weak security, so that when the FBI decides somebody is a "bad guy," it has no
problem collecting personal data.
That's bad science, it's bad law, it's bad for companies serving a global
marketplace that may not think the FBI is always a "good guy," and it's bad for
every person who wants to be sure that their data is as protected as possible—
whether from ordinary criminals hacking into their email provider, rogue
governments tracking them for politically organizing, or competing companies
looking for their trade secrets.
Perhaps Comey's speech is saber rattling. Maybe it's an attempt to persuade the American people that we've undertaken
significant reforms in light of the Snowden revelations—the U.S. government has not—and that it's time for the
"pendulum" to swing back. Or maybe by putting this issue in play, the FBI may hope to draw our eyes away from, say, its
attempt to water down the National Security Letter reform that Congress is considering. It's difficult to tell.
But if
the FBI gets its way and convinces Congress to change the law, or even if it
convinces companies like Apple that make our tools and hold our data to weaken the security they offer to us, we'll
all end up less secure and enjoying less privacy. Or as the Fourth Amendment puts it: we'll be
less "secure in our papers and effects."
For more on EFF's coverage of the "new" Crypto Wars, read this article focusing on the security issues we wrote last week
in Vice. And going back even earlier, a broader update to a piece we wrote in 2010, which itself was based on our
fights in the 90s. If the FBI wants to try to resurrect this old debate, EFF will be in strong opposition, just as we were 20
years ago. That's because—just like 20 years ago—the Internet needs more, not less, strong encryption.
Fifth, fears that encryption will cause agencies to “go dark” are wrong and
explained by loss aversion and the endowment effect.
Swire and Ahmad 11 — Peter Swire, C. William O’Neill Professor of Law at the Moritz College of Law of the
Ohio State University, served as the Chief Counselor for Privacy in the Office of Management and Budget during the
Clinton Administration, holds a J.D. from Yale Law School, and Kenesa Ahmad, Legal and Policy Associate with the
Future of Privacy Forum, holds a J.D. from the Moritz College of Law of the Ohio State University, 2011 (“‘Going Dark’
Versus a ‘Golden Age for Surveillance’,” Center for Democracy & Technology, November 28th, Available Online at
https://cdt.org/blog/%E2%80%98going-dark%E2%80%99-versus-a-%E2%80%98golden-age-forsurveillance%E2%80%99/, Accessed 06-24-2015)
What explains the agencies’ sense of loss when the use of wiretaps has expanded,
encryption has not been an important obstacle, and agencies have gained new
location, contact, and other information? One answer comes from behavioral economics
and psychology, which has drawn academic attention to concepts such as “loss
aversion” and the “endowment effect.” “Loss aversion” refers to the tendency to
prefer avoiding losses to acquiring gains of similar value. This concept also helps
explain the “endowment effect” – the theory that people place higher value on
goods they own versus comparable goods they do not own. Applied to
surveillance, the idea is that agencies feel the loss of one technique more than
they feel an equal-sized gain from other techniques. Whether based on the
language of behavioral economics or simply on common sense, we are familiar
with the human tendency to “pocket our gains” – assume we deserve the good
things that come our way, but complain about the bad things, even if the good
things are more important.
A simple test can help the reader decide between the “going dark” and “golden age
of surveillance” hypotheses. Suppose the agencies had a choice of a 1990-era
package or a 2011-era package. The first package would include the wiretap
authorities as they existed pre-encryption, but would lack the new techniques for
location tracking, confederate identification, access to multiple databases, and
data mining. The second package would match current capabilities: some
encryption-related obstacles, but increased use of wiretaps, as well as the
capabilities for location tracking, confederate tracking and data mining. The
second package is clearly superior – the new surveillance tools assist a vast range
of investigations, whereas wiretaps apply only to a small subset of key
investigations. The new tools are used far more frequently and provide granular
data to assist investigators.
Conclusion
This post casts new light on government agency claims that we are “going dark.” Due
to changing
technology, there are indeed specific ways that law enforcement and national
security agencies lose specific previous capabilities. These specific losses,
however, are more than offset by massive gains. Public debates should
recognize that we are truly in a golden age of surveillance. By understanding
that, we can reject calls for bad encryption policy. More generally, we should
critically assess a wide range of proposals, and build a more secure computing
and communications infrastructure.
The United States federal government should enact the Secure Data Act.
First, the plan prevents law enforcement and intelligence agencies from
weakening encryption. This is crucial to safeguard overall cybersecurity.
McQuinn 14 — Alan McQuinn, Research Assistant with the Information Technology and Innovation Foundation,
holds a B.S. in Political Communication from the University of Texas-Austin, 2014 (“The Secure Data Act could help law
enforcement protect against cybercrime,” The Hill, December 19th, Available Online at
http://thehill.com/blogs/congress-blog/technology/227594-the-secure-data-act-could-help-law-enforcement-protectagainst, Accessed 06-24-2015)
Last Sunday, Sen. Ron Wyden (D-Ore.) wrote an op-ed describing the role that U.S. law enforcement should play in
fostering stronger data encryption to make information technology (IT) systems more secure. This op-ed explains
Wyden’s introduction of the
Secure Data Act, which would prohibit the government from
mandating that U.S. companies build “backdoors” in their products for the
purpose of surveillance. This legislation responds directly to recent comments
by U.S. officials, most notably the Federal Bureau of Investigation (FBI) Director James Comey,
chastising Apple and Google for creating encrypted devices to which law
enforcement cannot gain access. Comey and others have argued that U.S. tech
companies should design a way for law enforcement officials to access consumer
data stored on those devices. In this environment, the Secure Data Act is a home run
for security and privacy and is a good step towards reasserting U.S.
competitiveness in building secure systems for a global market.
By adopting its position on the issue the FBI is working against its own goal of
preventing cybercrime as well as broader government efforts to improve
cybersecurity. Just a few years ago, the Bureau was counseling people to better
encrypt their data to safeguard it from hackers. Creating backdoor access for law
enforcement fundamentally weakens IT systems because it creates a new
pathway for malicious hackers, foreign governments, and other unauthorized
parties to gain illicit access. Requiring backdoors is a step backwards for
companies actively working to eliminate security vulnerabilities in their
products. In this way, security is a lot like a ship at sea: the more holes you put in the
system—government mandated or not—the faster it will sink. The better solution
is to patch up all the holes in the system and work to prevent any new ones.
Rather than decreasing security to suit its appetite for surveillance, the FBI
should recognize that better security is needed to bolster U.S. defenses against
online threats.
The Secure Data Act is an important step in that direction because it will stop U.S.
law enforcement agencies from requiring companies to introduce vulnerabilities
in their products. If this bill is enacted, law enforcement will be forced to use
other means to solve crimes, such as by using metadata from cellular providers,
call records, text messages, and even old-fashioned detective work. This will also
allow U.S. tech companies, with the help of law enforcement, to continue to
strengthen their systems, better detect intrusions, and identify emerging
threats. Law enforcement, such as the recently announced U.S. Department of Justice Cybersecurity Unit—a
unit designed solely to “deter, investigate, and prosecute cyber criminals,” should work in cooperation
with the private sector to create a safer environment online. A change of course is
also necessary
to restore the ability of U.S. tech companies to compete globally,
where mistrust has run rampant following the revelations of mass government
surveillance.
With the 113th Congress at an end, Wyden has promised to reintroduce the Secure Data Act in the next Congress.
Congress should move expediently to advance Senator Wyden’s bill to promote
security and privacy in U.S. devices and software. Furthermore, as Congress marks up the
legislation and considers amendments, it should restrict not just government access to devices,
but also government control of those devices. These efforts will move the efforts
of our law enforcement agencies away from creating cyber vulnerabilities and
allow electronics manufacturers to produce the most secure devices imaginable.
Second, the SDA is crucial to end the U.S.’s war on encryption. This bolsters
data security and boosts tech industry competitiveness.
Wyden 14 — Ron Wyden, United States Senator (D-OR) who serves on the Senate Select Committee on Intelligence
and who is known as “the internet’s Senator” for his leadership on technology issues, former Member of the United States
House of Representatives, holds a J.D. from the University of Oregon School of Law, 2014 (“With hackers running
rampant, why would we poke holes in data security?,” Los Angeles Times, December 14th, Available Online at
http://www.latimes.com/opinion/op-ed/la-oe-1215-wyden-backdoor-for-cell-phones-20141215-story.html, Accessed 06-24-2015)
Hardly a week goes by without a new report of some massive data theft that has
put financial information, trade secrets or government records into the hands of
computer hackers.
The best defense against these attacks is clear: strong data encryption and more
secure technology systems.
The leaders of U.S. intelligence agencies hold a different view. Most prominently, James
Comey, the FBI director, is lobbying Congress to require that electronics
manufacturers create intentional security holes — so-called back doors — that would
enable the government to access data on every American's cellphone and
computer, even if it is protected by encryption.
Unfortunately, there are no magic keys that can be used only by good guys for
legitimate reasons. There is only strong security or weak security.
Americans are demanding strong security for their personal data. Comey and
others are suggesting that security features shouldn't be too strong, because this
could interfere with surveillance conducted for law enforcement or intelligence
purposes. The problem with this logic is that building a back door into every
cellphone, tablet, or laptop means deliberately creating weaknesses that hackers
and foreign governments can exploit. Mandating back doors also removes the
incentive for companies to develop more secure products at the time people
need them most; if you're building a wall with a hole in it, how much are you
going to invest in locks and barbed wire? What these officials are proposing would
be bad for personal data security and bad for business and must be opposed by
Congress.
In Silicon Valley several weeks ago I convened a roundtable of executives from America's most
innovative tech companies. They made it clear that widespread availability of data
encryption technology is what consumers are demanding.
It is also good public policy. For years, officials of intelligence agencies like the
NSA, as well as the Department of Justice, made misleading and outright inaccurate
statements to Congress about data surveillance programs — not once, but
repeatedly for over a decade. These agencies spied on huge numbers of law-abiding Americans, and their dragnet surveillance of Americans' data did not
make our country safer.
Most Americans accept that there are times their government needs to rely on
clandestine methods of intelligence gathering to protect national security and
ensure public safety. But they also expect government agencies and officials to
operate within the boundaries of the law, and they now know how egregiously
intelligence agencies abused their trust.
This breach of trust is also hurting U.S. technology companies' bottom line,
particularly when trying to sell services and devices in foreign markets. The
president's own surveillance review group noted that concern about U.S.
surveillance policies “can directly reduce the market share of U.S. companies.”
One industry estimate suggests that lost market share will cost just the U.S. cloud
computing sector $21 billion to $35 billion over the next three years.
Tech firms are now investing heavily in new systems, including encryption, to
protect consumers from cyber attacks and rebuild the trust of their customers. As
one participant at my roundtable put it, “I'd be shocked if anyone in the industry takes the foot off the pedal in terms of
building security and encryption into their products.”
Built-in back doors have been tried elsewhere with disastrous results. In 2005, for
example, Greece discovered that dozens of its senior government officials' phones had
been under surveillance for nearly a year. The eavesdropper was never identified, but
the vulnerability was clear: built-in wiretapping features intended to be
accessible only to government agencies following a legal process.
Chinese hackers have proved how aggressively they will exploit any security
vulnerability. A report last year by a leading cyber security company identified
more than 100 intrusions in U.S. networks from a single cyber espionage unit in
Shanghai. As another tech company leader told me, “Why would we leave a back door lying around?”
Why indeed. The U.S. House of Representatives recognized how dangerous this idea was and
in June approved, 293-123, a bipartisan amendment that would prohibit the
government from mandating that technology companies build security
weaknesses into any of their products. I introduced legislation in the Senate to
accomplish the same goal, and will again at the start of the next session.
Technology is a tool that can be put to legitimate or illegitimate use. And
advances in technology always pose a new challenge to law enforcement
agencies. But curtailing innovation on data security is no solution, and certainly
won't restore public trust in tech companies or government agencies. Instead we
should give law enforcement and intelligence agencies the resources that they
need to adapt, and give the public the data security they demand.
Finally, the SDA establishes a binding legal framework that prevents agencies
from backdooring encryption. The status quo doesn’t solve.
Mukunth 15 — Vasudevan Mukunth, Science Editor at The Wire—a public interest news publication, holds a
degree in Mechanical Engineering from the Birla Institute of Technology and Science and a post-graduate degree from the
Asian College of Journalism, 2015 (“A Call For a New Human Right, the Right to Encryption,” The Wire, June 2nd,
Available Online at http://thewire.in/2015/06/02/a-call-for-a-new-human-right-the-right-to-encryption/, Accessed 06-24-2015)
DUAL_EC_DRBG is the name of a program that played an important role in the National Security Agency’s infiltration of
communication protocols, which was revealed by whistleblower Edward Snowden. The program, at the time, drew the
suspicion of many cryptographers who wondered why it was being used instead of the NIST’s more advanced standards.
The answer arrived in December 2013: DUAL_EC_DRBG was a backdoor.
A backdoor is a vulnerability deliberately inserted into a piece of software to
allow specific parties to decrypt it whenever they want to. When the NSA wasn’t
forcibly getting companies to hand over private data, it was exploiting pre-inserted backdoors to enter and snoop around. Following 9/11, the Patriot Act
made such acts lawful, validating the use of programs like DUAL_EC_DRBG that put
user security and privacy at stake to defend the more arbitrarily defined
questions of national security.
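For readers who want the mechanics, the toy below is our illustration of the underlying idea only; the real DUAL_EC_DRBG trapdoor works through related elliptic-curve constants, not a seeded general-purpose PRNG, and the constant and counter here are hypothetical stand-ins. It shows how a designer-held secret turns “random” keys into keys the designer can re-derive:

# A toy illustration (ours, not the actual elliptic-curve construction) of
# the DUAL_EC_DRBG problem: if the designer of a random-number generator
# keeps a secret linking its internal state to its outputs, every "random"
# key drawn from it is predictable to that designer.
import random

DESIGNER_SECRET = 0xC0FFEE  # stands in for the designer's trapdoor knowledge

def backdoored_keygen(counter: int) -> int:
    """Generate a 128-bit 'random' key whose state the designer can re-derive."""
    rng = random.Random(DESIGNER_SECRET ^ counter)
    return rng.getrandbits(128)

user_key = backdoored_keygen(counter=42)      # the user believes this is random
designer_key = backdoored_keygen(counter=42)  # the designer re-derives it exactly
assert user_key == designer_key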
However, the use of such weakened encryption standards is a Trojan horse that lets
in the weaknesses of those standards as well. When engineers attempt to use those standards for
something so well-defined as the public interest, such weaknesses can undermine that definition. For example, one
argument after Snowden’s revelations was to encrypt communications such that only the government could access them.
This was quickly dismissed because it’s
open knowledge among engineers that there are no
safeguards that can be placed around such ‘special’ access that would deter anyone
skilled enough to hack through it.
It’s against this ‘power draws power’ scenario that a new report from the UN Office of the High
Commissioner for Human Rights (OHCHR) makes a strong case – one which the influential
Electronic Frontier Foundation has called “groundbreaking”. It says, “requiring encryption
back-door access, even if for legitimate purposes, threatens the privacy necessary
to the unencumbered exercise of the right to freedom of expression.” Some may think
this verges on needless doubt, but the report’s centre of mass rests on backdoors’ abilities to compromise individual
identities in legal and technological environments that can’t fully protect those identities.
On June 1, those provisions of the Patriot Act that justified the interception of
telephone calls expired and the US Senate was unable to keep them going. As Anuj
Srivas argues, it is at best “mild reform” that has only plucked at the low-hanging
fruit – reform that rested on individuals’ privacy being violated by unconstitutional means. The provisions will be
succeeded by the USA Freedom Act, which sports some watered-down notions of
accountability when organisations like the NSA trawl data.
According to the OHCHR report, however, what we really need are proactive
measures. If decryption is at the heart of privacy violations, then strong
encryption needs to be at the heart of privacy protection – i.e. encryption must
be a human right. Axiomatically, as the report’s author, Special Rapporteur David Kaye writes,
individuals rely on encryption and anonymity to “safeguard and protect their
right to expression, especially in situations where it is not only the State creating
limitations but also society that does not tolerate unconventional opinions or
expression.” On the same note, countries like the US that intentionally compromise
products’ security, and the UK and India which constantly ask for companies to hand over the keys to their data
to surveil their citizens, are now human rights violators.
By securing the importance of strong encryption and associating it with securing
one’s identity, the hope is to insulate it from fallacies in the regulation of
decryption – such as in the forms of the Patriot Act and the Freedom Act. Kaye argues,
“Privacy interferences that limit the exercise of the freedoms of opinion and expression … must not in any event interfere
with the right to hold opinions, and those that limit the freedom of expression must be provided by law and be necessary
and proportionate to achieve one of a handful of legitimate objectives.”
This anastomosis in the debate can be better viewed as a wedge that was created around 1995. The FBI Director at the
time, Louis Freeh, had said that the bureau was “in favor of strong encryption, robust encryption. The country needs it,
industry needs it. We just want to make sure we have a trap door and key under some judge’s authority where we can get
there if somebody is planning a crime.”
Then, in October 2014, then FBI Director James Comey made a similar statement: “It makes more sense to address any
security risks by developing intercept solutions during the design phase, rather than resorting to a patchwork solution
when law enforcement comes knocking after the fact.” In the intervening decades, however, awareness of the
vulnerabilities of partial encryption has increased while the law has done little to provide recourse for the gaps in online
protection. So, Comey’s arguments are more subversive than Freeh’s.
Kaye’s thesis is from a human rights perspective, but its conclusions apply to
everyone – to journalists, lawyers, artists, scholars, anyone engaged in the
exploration of controversial information and with a stake in securing their
freedom of expression. In fact, a corollary of his thesis is that strong encryption will
ensure unfettered access to the Internet. His report also urges Congress to pass the
Secure Data Act, which would prevent the US government from forcibly
inserting backdoors in software to suit its needs.
The Encryption Advantage Stem
Privacy — with a twist
Journalism
Tech Competitiveness
Cybersecurity — government (OPM) and non-government
Sectors — banking, medical, etc.
Law Enforcement — link turn
NSA’s Bullrun Program
What it is
https://www.propublica.org/article/the-nsas-secret-campaign-to-crack-undermine-internet-encryption
Why ProPublica published the story.
Engelberg and Tofel 13 — Stephen Engelberg, Editor-in-Chief and Founding Managing Editor of
ProPublica—an independent, non-profit, investigative news organization, former Managing Editor of The Oregonian, and
Richard Tofel, President and Founding General Manager of ProPublica, former Assistant Publisher of The Wall Street
Journal, 2013 (“Why We Published the Decryption Story,” ProPublica, September 5th, Available Online at
https://www.propublica.org/article/why-we-published-the-decryption-story, Accessed 06-24-2015)
ProPublica is today publishing a story in partnership with the Guardian and The
New York Times about U.S. and U.K. government efforts to decode enormous amounts of
Internet traffic previously thought to have been safe from prying eyes. This story
is based on documents provided by Edward Snowden, the former intelligence community
employee and contractor. We want to explain why we are taking this step, and why we believe it is in the public interest.
The story, we believe, is an important one. It shows that the expectations of millions
of Internet users regarding the privacy of their electronic communications are
mistaken. These expectations guide the practices of private individuals and
businesses, most of them innocent of any wrongdoing. The potential for abuse of
such extraordinary capabilities for surveillance, including for political purposes,
is considerable. The government insists it has put in place checks and balances to
limit misuses of this technology. But the question of whether they are effective is
far from resolved and is an issue that can only be debated by the people and their
elected representatives if the basic facts are revealed.
It’s certainly true that some number of bad actors (possibly including would-be
terrorists) have been exchanging messages through means they assumed to be
safe from interception by law enforcement or intelligence agencies. Some of these
bad actors may now change their behavior in response to our story.
In weighing this reality, we have not only taken our own counsel and that of our publishing partners, but have also
conferred with the government of the United States, a country whose freedoms give us remarkable opportunities as
journalists and citizens.
Two possible analogies may help to illuminate our thinking here.
First, a historical event: In 1942, shortly after the World War II Battle of Midway, the Chicago Tribune published an article
suggesting, in part, that the U.S. had broken the Japanese naval code (which it had). Nearly all responsible journalists we
know would now say that the Tribune’s decision to publish this information was a mistake. But today’s story bears no
resemblance to what the Tribune did. For one thing, the U.S. wartime code-breaking was confined to military
communications. It did not involve eavesdropping on civilians.
The second analogy, while admittedly science fiction, seems to us to offer a clearer parallel. Suppose for a moment
that the U.S. government had secretly developed and deployed an ability to read
individuals’ minds. Such a capability would present the greatest possible
invasion of personal privacy. And just as surely, it would be an enormously valuable
weapon in the fight against terrorism.
Continuing with this analogy, some might say that because of its value as an intelligence
tool, the existence of the mind-reading program should never be revealed. We do
not agree. In our view, such a capability in the hands of the government would pose
an overwhelming threat to civil liberties. The capability would not necessarily
have to be banned in all circumstances. But we believe it would need to be
discussed, and safeguards developed for its use. For that to happen, it would
have to be known.
There are those who, in good faith, believe that we should leave the balance between civil liberty and security entirely to
our elected leaders, and to those they place in positions of executive responsibility. Again, we do not agree. The
American system, as we understand it, is premised on the idea—championed by such men as
Thomas Jefferson and James Madison—that government run amok poses the greatest
potential threat to the people’s liberty, and that an informed citizenry is the
necessary check on this threat. The sort of work ProPublica does—watchdog journalism—is a
key element in helping the public play this role.
American history is replete with examples of the dangers of unchecked power
operating in secret. Richard Nixon, for instance, was twice elected president of this
country. He tried to subvert law enforcement, intelligence and other agencies for
political purposes, and was more than willing to violate laws in the process. Such
a person could come to power again. We need a system that can withstand such
challenges. That system requires public knowledge of the power the government
possesses. Today’s story is a step in that direction.
The Crypto Wars
Export controls / dual use
Pretty Good Privacy (PGP)
Open Source vs. Proprietary Software
Netscape/SSL
Lawsuits/First Amendment
The Clipper Chip
“Going Dark”
Four Horsemen of the Infocalypse
Major Court Cases
Bernstein v. United States
Junger v. Daley
Fletcher 99 — Betty Binns Fletcher, Federal Judge on the United States Court of Appeals for the Ninth Circuit, 1999
(Decision in Bernstein v. U.S. Department of Justice, Ninth Circuit Court of Appeals, Case Number 97-16686, May 6th,
Available Online at https://www.epic.org/crypto/export_controls/bernstein_decision_9_cir.html, Accessed 06-24-2015)
Second, we note that the
government's efforts to regulate and control the spread of
knowledge relating to encryption may implicate more than the First Amendment
rights of cryptographers. In this increasingly electronic age, we are all required
in our everyday lives to rely on modern technology to communicate with one
another. This reliance on electronic communication, however, has brought with it a
dramatic diminution in our ability to communicate privately. Cellular phones
are subject to monitoring, email is easily intercepted, and transactions over the
internet are often less than secure. Something as commonplace as furnishing our
credit card number, social security number, or bank account number puts each of
us at risk. Moreover, when we employ electronic methods of communication, we
often leave electronic "fingerprints" behind, fingerprints that can be traced back to us.
Whether we are surveilled by our government, by criminals, or by our neighbors,
it is fair to say that never has our ability to shield our affairs from prying eyes been at
such a low ebb. The availability and use of secure encryption may offer an
opportunity to reclaim some portion of the privacy we have lost. Government
efforts to control encryption thus may well implicate not only the First
Amendment rights of cryptographers intent on pushing the boundaries of their
science, but also the constitutional rights of each of us as potential recipients of
encryption's bounty. Viewed from this perspective, the government's efforts to
retard progress in cryptography may implicate the Fourth Amendment, as well
as the right to speak anonymously, see McIntyre v. Ohio Elections Comm'n, 115 S. Ct. 1511, 1524 (1995),
the right against compelled speech, see Wooley v. Maynard, 430 U.S. 705, 714 (1977), and the
right to informational privacy, see Whalen v. Roe, 429 U.S. 589, 599-600 (1977). While we leave for another
day the resolution of these difficult issues, it is important to point out that Bernstein's is a suit not merely
concerning a small group of scientists laboring in an esoteric field, but also
touches on the public interest broadly defined.
Basic Encryption Terms
Cryptography
Encryption
Plaintext
Ciphertext
Algorithm
Key
Symmetric-key encryption vs. Public-key (or Asymmetric) encryption (illustrated in the sketch after this list)
Public key
Private key
Man in the Middle Attack
Key escrow (“Golden Key”)
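A minimal sketch tying several of these terms together, assuming the third-party Python “cryptography” package (pip install cryptography); the message and key names are illustrative:

# A minimal sketch of the vocabulary above, using the third-party
# "cryptography" package (pip install cryptography).
from cryptography.fernet import Fernet
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import rsa, padding

plaintext = b"meet at the usual place"  # plaintext: the readable message

# Symmetric-key encryption: one shared key both encrypts and decrypts.
shared_key = Fernet.generate_key()                   # key
ciphertext = Fernet(shared_key).encrypt(plaintext)   # ciphertext
assert Fernet(shared_key).decrypt(ciphertext) == plaintext

# Public-key (asymmetric) encryption: anyone may encrypt with the public
# key, but only the holder of the private key can decrypt.
private_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
public_key = private_key.public_key()
oaep = padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                    algorithm=hashes.SHA256(), label=None)
ciphertext2 = public_key.encrypt(plaintext, oaep)
assert private_key.decrypt(ciphertext2, oaep) == plaintext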
Tools You Can Use
Signal (Open Whisper Systems)
HTTPS Everywhere
Mymail-Crypt for Gmail
Off-the-Record (OTR) Messaging
Tor