HSS Encryption Advantage Negative

Explanation/Notes
This file contains negative materials for debating encryption advantages. Additional materials can be found in the
relevant case and advantage negative files.
Case
Encryption Advantage Stem
Encryption is growing — the status quo solves.
Valerio 15 — Pablo Valerio, International Business and IT Consultant, holds an M.S. in Electrical Engineering from
The Ohio State University, 2015 (“Data Encryption On The Rise,” Network Computing, January 23rd, Available Online at
http://www.networkcomputing.com/applications/data-encryption-on-the-rise/a/d-id/1318719, Accessed 07-06-2015)
With corporate demand for data security growing, full data encryption is
becoming a default feature in the mobile space.
Because of hacking scandals and government snooping, many companies are
starting to encrypt all their data. Both Apple and Google have committed to help
users encrypt data in their devices and in all communications. Some experts
project that more than 20% of Internet traffic will be encrypted by the end of this
year.
Mobile devices are the biggest battlefield for security and encryption services.
Many executives carry massive amounts of confidential data on their devices, usually unencrypted. While most laptops
used by people working with confidential data have some form of disk encryption, and/or remote management, that
doesn’t apply to smartphones and tablets.
In order to gain corporate customers’ confidence, smartphone and tablet
manufacturers are working hard to increase the security of their devices. BlackBerry
was among the first to employ full encryption of communications on its popular BlackBerry messenger. Samsung is
committed to provide full encryption and data locks with its Knox platform.
Now, both
Apple and Google are providing full encryption as a default option on
their mobile operating systems with an encryption scheme they are not able to
break themselves, since they don’t hold the necessary keys.
Companies will be forced to offer unencrypted products — business and
international pressure.
Baker 15 — Stewart Baker, former General Counsel of the National Security Agency, former Assistant Secretary for
Policy at the Department of Homeland Security, 2015 (“Data Access Shouldn’t Be Up to Companies Alone,” Room for
Debate—a New York Times expert blog, January 12th, Available Online at
http://www.nytimes.com/roomfordebate/2014/09/30/apple-vs-the-law/data-access-shouldnt-be-up-to-companies-alone, Accessed 07-05-2015)
Apple is a lot like a teenager getting Edward Snowden's name tattooed up her arm.
The excitement will die, but the regrets will last. For all of us.
Most Americans believe in privacy from government searches, but not for
criminals. The Constitution protects a citizen's “houses, papers and effects” only
until a judge finds probable cause that the citizen has committed a crime. This year,
the Supreme Court ruled that the police need a warrant to search cellphones seized at the time of arrest. But with
Apple's new encryption, probable cause and a warrant will be of little help to the
police who seize a suspect’s iPhone and want to search it.
That decision should not be left to Apple alone. And it won't be.
Companies do not want to give their employees the power to roam corporate networks in secrecy. And even if they did,
their regulators wouldn't let them. If
Apple wants to sell iPhones for business use, it will
have to give companies a way to read their employees’ business
communications. Corporate IT departments won’t welcome a technology that
could help workers hide misdeeds from their employer.
And as
a global company, Apple is subject to regulation and market pressure
everywhere. If China doesn't like Apple's new policy, it can ban the iPhone or
simply encourage China's mobile carriers to slow Apple's already weak sales
there. Even democracies like India, and U.S. allies like the United Arab Emirates, have
shown the determination and the clout to force changes in phone makers'
security choices.
So if Apple wants to sell its iPhone everywhere, it will have to compromise. But
then what? Will it really give China's authoritarian regime more access to iPhone
data than it gives to American police trying to stop crimes in this country?
And if so, how will its management sleep at night?
Privacy Advantage
More encryption means less legal protection of metadata. This turns privacy.
Kerr 14 — Orin Kerr, Fred C. Stevenson Research Professor at The George Washington University Law School, former
trial attorney in the Computer Crime and Intellectual Property Section at the U.S. Department of Justice, holds a J.D. from
Harvard University, 2014 (“Julian Sanchez on encryption, law enforcement, and the balance of power,” The Volokh
Conspiracy—a Washington Post scholarly blog, September 24th, Available Online at
http://www.washingtonpost.com/news/volokh-conspiracy/wp/2014/09/24/julian-sanchez-on-encryption-law-enforcement-and-the-balance-of-power/, Accessed 06-29-2015)
If Julian is right, though, I
don’t think it means that the widespread use of encryption will have
no effect on Fourth Amendment law. Even if the increasing use of encryption
will help to restore the status quo balance of power, that restoration is likely to
have a significant influence in terms of changes not taken. Consider the debate over the
plain view exception to the Fourth Amendment. I’ve argued that the courts should eliminate the plain view exception for
digital searches because computer searches bring so much information into plain view. It’s an equilibrium-adjustment
idea: The existing plain view doctrine gives the government too much power given the new technological facts of
computers. If
broad use of encryption ends up making it much harder for the
government to get access to contents, however, I suspect courts will be less inclined to
take away that law enforcement power. Perhaps the same is true for the metadata debate. If the
widespread use of encryption ends up broadly limiting access to contents even
with a warrant, perhaps courts become less likely to impose new Fourth
Amendment restrictions on access to metadata.
Cybersecurity Advantage
No cyber impact — it’s exaggerated.
Healey 13 — Jason Healey, Director of the Cyber Statecraft Initiative at the Atlantic Council, President of the Cyber
Conflict Studies Association, Lecturer in Cyber Policy at Georgetown University, Lecturer in Cyber National Security
Studies at the School of Advanced International Studies at Johns Hopkins University, former Director for Cyber
Infrastructure Protection at the White House, holds a Master’s in Computer Security from James Madison University,
2013 (“No, Cyberwarfare Isn't as Dangerous as Nuclear War,” U.S. News & World Report, March 20th, Available Online at
http://www.usnews.com/opinion/blogs/world-report/2013/03/20/cyber-attacks-not-yet-an-existential-threat-to-the-us, Accessed 01-27-2015)
America does not face an existential cyberthreat today, despite recent warnings.
Our cybervulnerabilities are undoubtedly grave and the threats we face are severe but far from
comparable to nuclear war.
The most recent alarms come in a Defense Science Board report on how to make
military cybersystems more resilient against advanced threats (in short, Russia or China).
It warned that the "cyber threat is serious, with potential consequences similar in
some ways to the nuclear threat of the Cold War." Such fears were also expressed by Adm. Mike
Mullen, then chairman of the Joint Chiefs of Staff, in 2011. He called cyber "The single biggest
existential threat that's out there" because "cyber actually more than theoretically, can
attack our infrastructure, our financial systems."
While it is true that cyber attacks might do these things, it is also true they have
not only never happened but are far more difficult to accomplish than
mainstream thinking believes. The consequences from cyber threats may be similar in some ways to
nuclear, as the Science Board concluded, but mostly, they are incredibly dissimilar.
Eighty years ago, the generals of the U.S. Army Air Corps were sure that their bombers would easily topple other
countries and cause their populations to panic, claims which did not stand up to reality. A
study of the 25-year
history of cyber conflict, by the Atlantic Council and Cyber Conflict Studies Association, has shown a
similar dynamic where the impact of disruptive cyberattacks has been consistently
overestimated.
Rather than theorizing about future cyberwars or extrapolating from today's concerns, the history of
cyberconflicts that have actually been fought shows that cyber incidents have so
far tended to have effects that are either widespread but fleeting or persistent but
narrowly focused. No attacks, so far, have been both widespread and persistent.
There have been no authenticated cases of anyone dying from a cyber attack.
Any widespread disruptions, even the 2007 disruption against Estonia, have been short-lived,
causing no significant GDP loss.
Moreover, as with conflict in other domains, cyberattacks can take down many targets but keeping them down over time
in the face of determined defenses has so far been out of the range of all but the most dangerous adversaries such as
Russia and China. Of course, if the United States is in a conflict with those nations, cyber will be the least important of the
existential threats policymakers should be worrying about. Plutonium trumps bytes in a shooting war.
This is not all good news. Policymakers have recognized the problems since at least 1998 with little significant progress.
Worse, the threats and vulnerabilities are getting steadily more worrying. Still, experts
have been warning
of a cyber Pearl Harbor for 20 of the 70 years since the actual Pearl Harbor.
The transfer of U.S. trade secrets through Chinese cyber espionage could someday accumulate into an existential threat.
But it doesn't seem so just yet, with only handwaving estimates of annual losses of 0.1 to 0.5 percent to the total U.S.
GDP of around $15 trillion. That's bad, but it doesn't add up to an existential crisis or "economic
cyberwar."
Other countries will demand a golden key regardless of U.S. policy.
Wittes 15 — Benjamin Wittes, Senior Fellow in Governance Studies at the Brookings Institution, Editor in Chief of
Lawfare, Member of the Task Force on National Security and Law at the Hoover Institution, 2015 (“Thoughts on
Encryption and Going Dark, Part II: The Debate on the Merits,” Lawfare—a national security blog curated by the
Brookings Institution, July 12th, Available Online at http://www.lawfareblog.com/thoughts-encryption-and-going-dark-part-ii-debate-merits, Accessed 07-13-2015)
I am certain that the computer scientists are correct that foreign governments will move in this direction, but I think they
are misreading the consequences of this. China
and Britain will do this irrespective of what
the United States does, and that fact may well create potential opportunity for the U.S.
After all, if China and Britain are going to force U.S. companies to think through the
problem of how to provide extraordinary access without compromising general
security, perhaps the need to do business in those countries will provide much of the
incentive to think through the hard problems of how to do it. Perhaps countries
far less solicitous than ours of the plight of technology companies or the privacy interests of their users
will force the research that Comey can only hypothesize. Will Apple then take the view that it
can offer phones to users in China which can be decrypted for Chinese authorities when they require it but that it's
technically impossible to do so in the United States?
(Note: this potentially hurts the NIST CP’s solvency. I would read this in the
block only if I was not extending the NIST CP.)
CISA CP
This is an advantage CP to solve cybersecurity.
1NC — CISA CP
The United States federal government should enact the Cybersecurity
Information Sharing Act.
CISA solves cybersecurity — info sharing is key.
Keating 15 — Frank Keating, President and Chief Executive Officer of the American Bankers Association—a
national lobbying organization for the banking industry, 2015 (Open Letter to the Senate Regarding the Cybersecurity
Information Sharing Act, April 14th, Available Online at
http://www.aba.com/Advocacy/LetterstoCongress/Documents/LettertoSenateReCISASupport.pdf, Accessed 07-06-2015)
I am writing on behalf of the members of the American Bankers Association (ABA) to
urge you to support the Cybersecurity Information Sharing Act (CISA) when it is brought to the Senate floor
this month.
CISA is bipartisan legislation introduced by Chairman Burr and Vice Chairman Feinstein, and was reported by a strong
14-1 vote out of the Senate Intelligence Committee. It
will enhance ongoing efforts by the private
sector and the Federal government to better protect our critical infrastructure
and protect Americans from all walks of life from cyber criminals. CISA facilitates
increased cyber intelligence information sharing between the private and public
sectors, and strikes the appropriate balance between protecting consumer
privacy and allowing information sharing on serious threats to our nation’s
critical infrastructures.
Cybersecurity is a top priority for the financial services industry. Banks invest hundreds
of millions of dollars every year to put in place multiple layers of security to protect sensitive data. Protecting customers
has always been and will remain our top priority.
We look forward to working with Chairman Burr, Vice Chairman Feinstein and all members of
the Senate on this important issue and urge you to support their efforts to bring
this bill to the floor and pass it as soon as possible to better protect America’s
cybersecurity infrastructure against current and future threats.
Solvency
CISA boosts cybersecurity without jeopardizing privacy.
Feinstein 14 — Dianne Feinstein, Member of the United States Senate (D-CA), Chair of the Senate Intelligence
Committee, 2014 (“Dianne Feinstein: Cybersecurity Information Sharing Act will help protect us,” Mercury News, July 21st,
Available Online at http://www.mercurynews.com/opinion/ci_26188154/dianne-feinstein-cybersecurity-information-sharing-act-will-help, Accessed 07-06-2015)
Every week, millions of computer networks come under attack by hackers, cyber criminals
and hostile foreign nations. These networks include banks and retail outlets, nuclear power plants and dams, even critical
military hubs.
The cost to consumers, companies and the government reaches into the billions of dollars, and the attacks are
increasing in severity.
This threat extends into core sectors of our society. Theft of personal and financial information from retailers is all too
common. Up to 70 million Target customers had their personal information stolen last year. Denial-of-service attacks,
where hackers essentially disable websites, have made it difficult for millions of Americans to bank online.
Beyond the threat to consumers, intellectual property pilfered from innovative American companies is estimated to cost
the economy hundreds of billions of dollars. And hostile foreign governments routinely use cyberspace to steal sensitive
national security information.
The potential targets are countless, the potential damage immense. Nearly everything in our society is linked to
computers, from GPS satellites to the electricity grid to Wall Street.
As chairman of the Senate Intelligence Committee, I have listened for several years as the intelligence community has
described the growing threat. It
is past time for Congress to take meaningful action to
address this danger.
To counter this peril, private sector companies need clear authority to take
defensive actions on their own networks, and the private sector and government
must be allowed to share information about cyber attacks to develop the best
methods to counter them.
Over the past year I have worked closely with Sen. Saxby Chambliss of Georgia, vice chairman of the Senate
Intelligence Committee, on legislation to address this issue. The committee recently approved the
Cybersecurity Information Sharing Act with a strong bipartisan vote.
While critics have focused on concerns of privacy advocates, I believe the 12-3 committee vote represents the broad
support this bill will have in the Senate and the country.
It was written in consultation with a wide range of stakeholders, including the
executive branch, privacy advocates, the financial industry, tech companies and
others. Its evolving text was made public twice in the weeks leading up to committee debate, and changes were made
based on suggestions.
Like all substantial legislation, this bill includes compromises. But the privacy protections and incentives for information
sharing are balanced and will help us identify and overcome threats.
The bill requires the government to share more information about cyber threats
so the private sector can better defend itself, including classified information
where possible. It also only allows companies to share information for
cybersecurity purposes. It is not an intelligence collection bill and doesn't alter
requirements for conducting surveillance.
Furthermore, the government is barred from using information shared with it for any
regulatory action or criminal investigations not relating specifically to
cybersecurity.
All provisions were drafted with the goal of protecting personal information. All companies that share
information are required to remove any irrelevant personal information before
sending it. And all information about cyber threats shared with the government
is subject to privacy protections and includes strong penalties for abuses. The
government's receipt and use of shared information will receive multiple levels
of oversight.
Finally, the bill is voluntary. No company is compelled to share any information.
Companies that participate are only authorized to look for cyber threats on their
own networks and those of their consenting customers.
CISA is key to cybersecurity.
Interested Parties 15 — An Open Letter by the Agricultural Retailers Association, American Bankers
Association, American Chemistry Council, American Fuel & Petrochemical Manufacturers, American Gas Association,
American Insurance Association, American Petroleum Institute, American Public Power Association, American Water
Works Association, ASIS International, Association of American Railroads, Council of Insurance Agents & Brokers,
CTIA–The Wireless Association, Edison Electric Institute, Federation of American Hospitals, Financial Services
Roundtable–BITS, GridWise Alliance, Internet Commerce Coalition, Large Public Power Council, National Association of
Chemical Distributors, National Association of Corporate Directors, National Association of Manufacturers, National
Business Coalition on e-Commerce & Privacy, National Cable & Telecommunications Association, National Rural Electric
Cooperative Association, NTCA–The Rural Broadband Association, Property Casualty Insurers Association of America,
Securities Industry and Financial Markets Association, Society of Chemical Manufacturers & Affiliates,
Telecommunications Industry Association, U.S. Chamber of Commerce, United States Telecom Association, and Utilities
Telecom Council, 2015 (Letter to Congress from Interested Parties Supporting the Cybersecurity Information Sharing Act,
January 27th, Available Online at http://www.pciaa.net/docs/default-source/industry-issues/letter_to_congress_supporting_cybersecurity_information_sharing_act.pdf?sfvrsn=2, Accessed 07-07-2015)
Our organizations, which represent nearly every sector of the American
economy, strongly urge the Senate to quickly pass a cybersecurity information-sharing bill. Cybersecurity is a top priority of our associations and members.
The Select Committee on Intelligence passed a smart and workable bill in July 2014,
which earned broad bipartisan support. Recent cyber incidents underscore the need for
legislation to help businesses improve their awareness of cyber threats and to
enhance their protection and response capabilities.
Above all, we need Congress to send a bill to the president that gives businesses
legal certainty that they have safe harbor against frivolous lawsuits when
voluntarily sharing and receiving threat indicators and countermeasures in real
time and taking actions to mitigate cyberattacks.
The legislation also needs to offer protections related to public disclosure, regulatory, and antitrust matters in order to
increase the timely exchange of information among public and private entities.
Our organizations also believe that legislation needs to safeguard privacy and civil liberties and establish appropriate
roles for civilian and intelligence agencies. The cybersecurity measure approved last year by the Select Committee on
Intelligence reflected practical compromises among many stakeholders on these issues.
Cyberattacks aimed at U.S. businesses and government entities are being
launched from various sources, including sophisticated hackers, organized
crime, and state-sponsored groups. These attacks are advancing in scope and
complexity.
We are committed to working with lawmakers and their staff members to get cybersecurity
information-sharing legislation swiftly enacted to strengthen our national
security and the protection and resilience of U.S. industry. Congressional action
cannot come soon enough.
NIST “Golden Key” CP
1NC — NIST “Golden Key” CP
The United States federal government should cease encouraging or requiring
manufacturers of electronic devices or software to build into such devices or
software a mechanism that allows the United States federal government to
bypass encryption technology for the purposes of domestic surveillance unless
this mechanism has been published publicly for more than twelve months
and certified to be unbreakable by the National Institute of Standards and
Technology.
This solves the case. Either it results in the plan or it results in a Golden Key
that avoids our “encryption bad” disads.
Rosenzweig 15 — Paul Rosenzweig, Founder of Red Branch Consulting PLLC—a homeland security consulting
company, Senior Advisor to The Chertoff Group—a consulting group specializing in homeland security issues,
Distinguished Visiting Fellow at the Homeland Security Studies and Analysis Institute, Professorial Lecturer in Law at
George Washington University, Senior Editor of the Journal of National Security Law & Policy, Visiting Fellow at The
Heritage Foundation, former Deputy Assistant Secretary for Policy in the Department of Homeland Security, 2015
(“Testing Encryption Insecurity: A Modest Proposal,” Lawfare—a national security blog curated by the Brookings
Institution, July 7th, Available Online at http://www.lawfareblog.com/testing-encryption-insecurity-modest-proposal,
Accessed 07-20-2015)
Co-blogger Susan Landau has yet another thoughtful post on the insecurity of back door encryption requirements – what
she calls mandating insecurity. She calls it magical thinking, which is perhaps a bit harsh. But her point is well taken --
one ground for opposing mandated back doors is, as I've said, that they seem to be
technologically infeasible to maintain only for government access.
So here is a thought experiment -- a useful immaculate back door would, in theory, be one that
is openable by the government and only by the government. In other words one
which, even if the methodology for how it was created and implemented was
publicly known, could not be broken. We actually have a real world model for that
construct – public key encryption itself. The solution is robust because the "how"
of it (the mathematics) can be widely published and yet the particular applications of it
remain impenetrable – cracking private messages to me encrypted with my public key remains
impossible.
So I propose a simple rule (perhaps even one that Congress could enact into law) – encryption
providers may be required to adopt a government sponsored "back door"
technology if, and only if, the methodology for that technology has been
published publicly for more than 12 months and no efforts to subvert or defeat it
have been successful. NIST gets to judge success. That way if the NSA/FBI have
a real solution that can withstand public scrutiny (and, I assume, sustained attack) they can
use it. Absent that ... the risks outweigh the rewards.
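Analytic note (not evidence): the public-key property Rosenzweig invokes can be shown with a toy RSA sketch. The math below uses deliberately tiny, insecure primes for illustration only. Anyone can encrypt with the published key (e, n), yet only the holder of the private exponent d can decrypt, even though every step of the method is public.

```python
# Toy RSA: the algorithm is fully public, yet messages stay secure.
# Tiny primes for illustration only -- real RSA uses 2048-bit moduli.
p, q = 61, 53
n = p * q                  # public modulus (3233)
phi = (p - 1) * (q - 1)    # Euler's totient of n
e = 17                     # public exponent, coprime with phi
d = pow(e, -1, phi)        # private exponent (requires Python 3.8+)

message = 42
ciphertext = pow(message, e, n)    # anyone can encrypt with (e, n)
recovered = pow(ciphertext, d, n)  # only the holder of d can decrypt
print(recovered == message)        # True
```

A certifiable government "golden key" would need the analogous property: a publicly vetted method whose private component only the government holds, which is what the counterplan asks NIST to test.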
Overview/Explanation
The CP solves the case and avoids the crime and terror DAs. Before banning a
“Golden Key” system for law enforcement access, the government should first
encourage technologists to develop one that works. The CP alone keeps the
pressure on industry to develop a workable solution that balances the costs
and benefits of encryption. Either it will result in the same outcome as the plan
or it will result in a workable Golden Key that avoids our disads. Either way,
the counterplan is the best option.
We have comparative evidence. A new study is the best way to balance the
costs and benefits of encryption.
Washington Post 15 — Washington Post Editorial Board, 2015 (“Putting the digital keys to unlock data out of
reach of authorities,” July 18th, Available Online at https://www.washingtonpost.com/opinions/putting-the-digital-keys-to-unlock-data-out-of-reach-of-authorities/2015/07/18/d6aa7970-2beb-11e5-a250-42bd812efc09_story.html,
Accessed 07-20-2015)
This conflict should not be left unattended. Nineteen years ago, the National Academy
of Sciences studied the encryption issue; technology has evolved rapidly since then. It
would be wise to ask the academy to undertake a new study, with special focus
on technical matters, and recommendations on how to reconcile the competing
imperatives.
This avoids our net-benefit. Encryption hampers effective law enforcement. A
“secure golden key” should be required.
Washington Post 14 — Washington Post Editorial Board, 2014 (“Compromise needed on smartphone
encryption,” October 3rd, Available Online at http://www.washingtonpost.com/opinions/compromise-needed-on-smartphone-encryption/2014/10/03/96680bf8-4a77-11e4-891d-713f052086a0_story.html, Accessed 07-05-2015)
Law enforcement officials deserve to be heard in their recent warnings about the
impact of next-generation encryption technology on smartphones, such as Apple’s new iPhone.
This is an important moment in which technology, privacy and the rule of law are colliding.
Apple announced Sept. 17 that its latest mobile operating system, iOS 8, includes encryption so thorough that the
company will not be able to unlock it for law enforcement. The encryption is to be set by the user, and Apple will not
retain the key. Google’s next version of its popular Android operating system also will be unlockable by the company.
Both insist they are giving consumers ironclad privacy protection. The moves are in large part a response to public
worries about National Security Agency surveillance of Internet and telephone metadata revealed by former government
contractor Edward Snowden.
What has the law enforcement community up in arms is the prospect of losing
access to the data on these smartphones in cases where they have a valid, court-approved search warrant. The technology firms, while pledging to honor search warrants in other situations,
say they simply won’t possess the ability to unlock the smartphones. Only the owner of the phone, who set up the
encryption, will be able to do that. Attorney General Eric H. Holder Jr. said this could imperil investigations in
kidnapping and other cases; FBI Director James B. Comey said he could not understand why the tech companies would
“market something expressly to allow people to place themselves beyond the law.”
This is not about mass surveillance. Law enforcement authorities are not asking
for the ability to surveil everyone’s smartphone, only those relatively few cases
where there is a court-approved search warrant. This seems reasonable and not
excessively intrusive. After all, the government in many other situations has a right
— and responsibility — to set standards for products so that laws are followed.
Why not smartphones? Moreover, those worried about privacy can take solace from the Supreme Court’s decision in June
in Riley v. California, which acknowledged the large amount of private information on smartphones and said a warrant is
generally required before a search.
Law enforcement will not be entirely without tools in criminal investigations. Data stored in the “cloud” and other
locations will still be available; wiretaps, too. But smartphone
users must accept that they cannot
be above the law if there is a valid search warrant.
How to resolve this? A police “back door” for all smartphones is undesirable — a back door can and will be exploited by
bad guys, too. However, with
all their wizardry, perhaps Apple and Google could invent
a kind of secure golden key they would retain and use only when a court has
approved a search warrant. Ultimately, Congress could act and force the issue, but
we’d rather see it resolved in law enforcement collaboration with the
manufacturers and in a way that protects all three of the forces at work: technology,
privacy and rule of law.
Only the counterplan ensures that law enforcement doesn’t “go dark.”
Washington Post 15 — Washington Post Editorial Board, 2015 (“Putting the digital keys to unlock data out of
reach of authorities,” July 18th, Available Online at https://www.washingtonpost.com/opinions/putting-the-digital-keys-to-unlock-data-out-of-reach-of-authorities/2015/07/18/d6aa7970-2beb-11e5-a250-42bd812efc09_story.html,
Accessed 07-20-2015)
A contentious debate about encryption of data on smartphones and elsewhere has become
even more intense in recent weeks. A collision is unfolding between law enforcement
devoted to fighting crime and terrorism and advocates of privacy and secure
communications. In these chaotic digital times, both are vital to the national
interest, and it is imperative that experts invest serious time and resources into
finding ways to reconcile the conflict.
FBI Director James B. Comey has raised alarms — most recently in testimony to Congress this month —
about the prospect of law enforcement “going dark,” meaning that it would be unable to obtain
information, with a court order, from encrypted communications on smartphones and other sources. The reasons for this
alarm are the decisions by Apple and Google to provide end-to-end encryption as the default in their operating systems,
meaning that only the holder of the device can unlock the information. The technology companies say they do not hold
the key (although they do hold data in servers that would still be accessible by law enforcement). Mr. Comey
warns that this encryption could provide a safe haven for terrorists and criminals.
He has not proposed a specific fix, but insisted that the problem of “going dark” is “grave, growing and extremely
complex.”
Mr. Comey’s
assertions should be taken seriously. A rule-of-law society cannot
allow sanctuary for those who wreak harm. But there are legitimate and valid counterarguments
from software engineers, privacy advocates and companies that make the smartphones and software. They say that any
decision to give law enforcement a key — known as “exceptional access” — would endanger the integrity of all online
encryption, and that would mean weakness everywhere in a digital universe that already is awash in cyberattacks, thefts
and intrusions. They say that a compromise isn’t possible, since one crack in encryption — even if for a good actor, like
the police — is still a crack that could be exploited by a bad actor. A recent report from the Massachusetts Institute of
Technology warned that granting exceptional access would bring on “grave” security risks that outweigh the benefits.
Last October in this space, we
urged Apple and Google, paragons of innovation, to create
a kind of secure golden key that could unlock encrypted devices, under a court
order, when needed. The tech sector does not seem so inclined. Two major industry
associations for hardware and software wrote President Obama in June, “We are opposed to any policy actions or
measures that would undermine encryption as an available and effective tool.” The tech companies are
right about the overall importance of encryption, protecting consumers and
insuring privacy. But these companies ought to more forthrightly acknowledge
the legitimate needs of U.S. law enforcement. All freedoms come with limits; it
seems only proper that the vast freedoms of the Internet be subject to the same
rule of law and protections that we accept for the rest of society.
They Say: “Permute – Do Both”
Links to the Terror and Crime DAs — immediately prohibiting all Golden
Keys even if they work jeopardizes law enforcement and national security. The
counterplan alone maintains the possibility of a working Golden Key.
Leverage DA — the counterplan keeps the pressure on tech companies to
develop a workable Golden Key. The perm removes the incentive to make it
work.
Wittes 15 — Benjamin Wittes, Senior Fellow in Governance Studies at the Brookings Institution, Editor in Chief of
Lawfare, Member of the Task Force on National Security and Law at the Hoover Institution, 2015 (“Thoughts on
Encryption and Going Dark, Part II: The Debate on the Merits,” Lawfare—a national security blog curated by the
Brookings Institution, July 12th, Available Online at http://www.lawfareblog.com/thoughts-encryption-and-going-dark-part-ii-debate-merits, Accessed 07-13-2015)
Which of these approaches is the right way to go? I would pursue several of them simultaneously. At least for
now, I would hold off on any kind of regulatory mandate, there being just too
much doubt at this stage concerning what's doable. I would, however, take a hard look at the role that
civil liability might play. I think the government, if it's serious about creating an
extraordinary access scheme, needs to generate some public research
establishing proof of concept. We should watch very carefully how the
companies respond to the mandates they will receive from governments that will
approach this problem in a less nuanced fashion than ours will. And Comey
should keep up the political pressure. The combination of these forces may well
produce a more workable approach to the problem than anyone can currently
envision.
Sequencing DA — vetting a Golden Key first is key.
Wittes 15 — Benjamin Wittes, Senior Fellow in Governance Studies at the Brookings Institution, Editor in Chief of
Lawfare, Member of the Task Force on National Security and Law at the Hoover Institution, 2015 (“Thoughts on
Encryption and Going Dark, Part II: The Debate on the Merits,” Lawfare—a national security blog curated by the
Brookings Institution, July 12th, Available Online at http://www.lawfareblog.com/thoughts-encryption-and-going-dark-part-ii-debate-merits, Accessed 07-13-2015)
If we begin—as the computer scientists do—with a posture of great skepticism as
to the plausibility of any scheme and we place the burden of persuasion on
Comey, law enforcement, and the intelligence community to demonstrate the viability of any
system, the obvious course is government-sponsored research. What we need here
is not a Clipper Chip-type initiative, in which the government would develop and produce a complete system, but a
set of intellectual and technical answers to the challenges the technologists have
posed. The goal here should be an elaborated concept paper laying out how a
secure extraordinary access system would work in sufficient detail that it can be
evaluated, critiqued, and vetted; think of the bitcoin paper here as a model. Only after a period
of public vetting, discussion, and refinement would the process turn to the
question of what sorts of companies we might ask to implement such a system
and by what legal means we might ask.
They Say: “Permute – Do The CP”
This severs the certainty of the plan. The counterplan doesn’t prohibit
government backdoors in all circumstances; the plan does. Severance
permutations should be rejected because they undermine clash and
debatability.
They Say: “Golden Key Impossible”
This describes the status quo. The counterplan creates pressure on industry to
come up with a workable solution.
Even if a Golden Key is impossible, the counterplan is better than the plan. Any
chance that a workable system exists is enough to justify adding the condition.
A Golden Key system is technically feasible, but only in the world of the
counterplan.
Crovitz 15 — L. Gordon Crovitz, Columnist and Former Publisher of The Wall Street Journal, former Executive Vice-President of Dow Jones, 2015 (“Why Terrorists Love Silicon Valley,” Wall Street Journal, July 5th, Available Online at
http://www.wsj.com/articles/why-terrorists-love-silicon-valley-1436110797, Accessed 07-20-2015)
Silicon Valley never met a problem it couldn’t solve—until now. Top Internet
companies say there is no technical way for them to protect their users’ legitimate
privacy with encryption while also enabling intelligence agents and law
enforcement to gain access to what terrorists plot online.
FBI director James Comey says Silicon Valley’s no-can-do attitude is “depressing.”
Terror attacks are increasingly planned online, outside the reach of intelligence
and law enforcement. Islamic State reaches an estimated 200,000 prospects daily
through posts on social media. Once a recruit is identified, ISIS tells him to switch to
an encrypted smartphone. Legal wiretaps are useless because the signal is
indecipherable. Even when the devices are lawfully seized through court orders,
intelligence and law-enforcement agencies are unable to retrieve data from them.
“Our job is to find needles in a nationwide haystack, needles that are increasingly invisible to us because of end-to-end
encryption,” FBI Director Comey recently told CNN. The
FBI says it can no longer estimate how
many recruits in the U.S. are communicating with ISIS and other terrorist groups.
Benjamin Wittes, who edits the Lawfare blog for the Brookings Institution, summarized the
problem last week: “As a practical matter, that means there are people in the U.S.
whom authorities reasonably believe to be in contact with ISIS for whom
surveillance is lawful and appropriate but for whom useful signals interception
is not technically feasible. That’s a pretty scary thought.” Mr. Wittes asks “whether we really
want ISIS to be able to recruit people on Twitter and then have secure communications with them elsewhere. And if we
don’t want that, what then?”
Silicon Valley acts as if it’s above having to comply with legal searches. Last year,
Apple announced that its iPhone operating system would encrypt data by default. “Unlike our competition,” Apple
bragged, “it’s not technically feasible for us to respond to government warrants.” Google now has similar encryption for
Android devices, as does Facebook for its WhatsApp app.
Congress requires traditional telephone companies to retain records so that they
can be disclosed under court orders. These records and wiretaps are basic tools
for preventing terrorism. Silicon Valley executives and lobbyists want their
technologies to be exempt.
They need to be reminded that the Fourth Amendment of the Constitution permits reasonable
searches and seizures. That creates duties for responsible companies to be able to
comply. The original justification for national regulation of electronic communication was self-defense. The
Communications Act of 1934, which created the Federal Communications Commission, said “the purpose of the national
defense” justified federal oversight of “interstate and foreign commerce in communication by wire and radio.”
Earlier this year Mr. Comey asked Congress to act unless technologists come up with ways of allowing encryption to
protect privacy from hackers while also allowing government agencies to conduct legal searches. “Technical people say
it’s too hard,” he said. “My reaction to that is: Really? Really too hard? Too hard for the people we have in this country to
figure something out? I’m not that pessimistic.”
The head of the National Security Agency, Michael Rogers, proposes that technology
companies create digital keys to allow access to smartphones and other devices,
with the key divided into pieces so that no company or government agency (or
hacker) can gain access to the information on its own. “I don’t want a back door,” he said,
meaning unfettered ability to bypass device security. “I want a front door. And I want the front
door to have multiple locks.”
Internet executives should know better than anyone that using multiple keys can
work—because multiple keys are the means by which the Internet itself is
protected. The engineers who designed the Internet applied this approach to
ensure the integrity of global Internet addresses and names.
The Internet Corporation for Assigned Names and Numbers, or Icann, requires people acting together to gain access to
the core of the Internet. Seven people from around the world meet several times a year, with each bringing an old-fashioned metal key to unlock a safe-deposit box that contains a smartcard. The group then verifies that there has been no
tampering with the root zone of Internet names and addresses. Old-fashioned physical keys could likewise be used by
Internet companies and government officials acting together to enable court-ordered access.
This week, the Senate Judiciary Committee holds a hearing, “Going Dark: Encryption, Technology, and the Balance
Between Public Safety and Privacy.” Lawmakers
aren’t giving up on finding a way to
accommodate both of these important interests. Neither should Silicon Valley.
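Crovitz's card leans on the split-key idea: no single party can unlock anything alone. A minimal sketch of that idea is an n-of-n XOR secret split, shown below. This is an illustration only, not the actual NSA or Icann design (a real proposal would presumably use a threshold scheme such as Shamir's secret sharing so that k of n holders suffice):

```python
import secrets
from functools import reduce

def split_key(key: bytes, n: int) -> list[bytes]:
    """Split key into n shares; ALL n are required to reconstruct it."""
    # The first n - 1 shares are pure randomness...
    shares = [secrets.token_bytes(len(key)) for _ in range(n - 1)]
    # ...and the final share is the key XORed with all of them, so any
    # subset of fewer than n shares is statistically indistinguishable
    # from random noise and reveals nothing about the key.
    last = bytes(reduce(lambda a, b: a ^ b, xs) for xs in zip(key, *shares))
    return shares + [last]

def combine_shares(shares: list[bytes]) -> bytes:
    """XOR every share together to recover the original key."""
    return bytes(reduce(lambda a, b: a ^ b, xs) for xs in zip(*shares))
```

On this toy model, seven officials (echoing the Icann key ceremony the card describes) would each hold one share, and only all seven acting together could recover the access key.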
Prefer our evidence — crypto experts aren’t trying hard enough. The
counterplan forces them to actually try.
Wittes 15 — Benjamin Wittes, Senior Fellow in Governance Studies at the Brookings Institution, Editor in Chief of
Lawfare, Member of the Task Force on National Security and Law at the Hoover Institution, 2015 (“Thoughts on
Encryption and Going Dark, Part II: The Debate on the Merits,” Lawfare—a national security blog curated by the
Brookings Institution, July 12th, Available Online at http://www.lawfareblog.com/thoughts-encryption-and-going-dark-part-ii-debate-merits, Accessed 07-13-2015)
Consider the report issued this past week by a group of computer security experts
(including Lawfare contributing editors Bruce Schneier and Susan Landau), entitled "Keys Under Doormats: Mandating
Insecurity By Requiring Government Access to All Data and Communications." The report does not make
an in-principle argument or a conceptual argument against extraordinary access.
It argues, rather, that the effort to build such a system risks eroding cybersecurity in
ways far more important than the problems it would solve. The authors, to summarize,
make three claims in support of the broad claim that any exceptional access system would "pose . . . grave security risks
[and] imperil innovation." What are those "grave security risks"?
* "[P]roviding exceptional access to communications would force a U-turn from the best practices now being deployed to
make the Internet more secure. These practices include forward secrecy—where decryption keys are deleted immediately
after use, so that stealing the encryption key used by a communications server would not compromise earlier or later
communications. A related technique, authenticated encryption, uses the same temporary key to guarantee confidentiality
and to verify that the message has not been forged or tampered with."
* "[B]uilding in exceptional access would substantially increase system complexity" and "complexity is the enemy of
security." Adding code to a system increases its attack surface, and a certain number of additional
vulnerabilities come with every marginal increase in system complexity. So by requiring a potentially complicated new
system to be developed and implemented, we'd be effectively guaranteeing more vulnerabilities for malicious actors to
hit.
* "[E]xceptional access would create concentrated targets that could attract bad actors." If we require tech companies to
retain some means of accessing user communications, those keys have to be stored somewhere, and that storage then
becomes an unusually high-stakes target for malicious attack. Their theft then compromises, as did the OPM hack, large
numbers of users.
The strong implication of the report is that these issues are not resolvable, though the
report never quite says that. But at a minimum, the authors raise a series of important questions about
whether such a system would, in practice, create an insecure internet in general—rather than one whose general security
has the technical capacity to make security exceptions to comply with the law.
There is some reason, in my view, to suspect that the picture may not be quite as
stark as the computer scientists make it seem. After all, the big tech companies
increase the complexity of their software products all the time, and they
generally regard the increased attack surface of the software they create as a
result as a mitigatable problem. Similarly, there are lots of high-value intelligence
targets that we have to secure and would have big security implications if we
could not do so successfully. And when it really counts, that task is not hopeless.
Google and Apple and Facebook are not without tools in the cybersecurity
department.
The real question, in my view, is whether a system of the sort Comey imagines could be built in a fashion in which the
security gain it would provide would exceed the heightened security risks the extraordinary access would involve. As
Herb Lin puts it in his excellent, and admirably brief, Senate testimony the other day, this is ultimately a question without
an answer in the absence of a lot of new research. "One
side says [the] access [Comey is seeking]
inevitably weakens the security of a system and will eventually be compromised by a
bad guy; the other side says it doesn’t weaken security and won’t be
compromised. Neither side can prove its case, and we see a theological clash of
absolutes." Only when someone actually does the research and development
and tries actually to produce a system that meets Comey's criteria are we going
to find out whether it's doable or not.
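The "forward secrecy" best practice the Keys Under Doormats authors cite (deleting decryption keys immediately after use, so a later compromise cannot expose earlier messages) can be sketched as a toy hash ratchet. This is an illustration only; real deployments such as Signal's ratchet are far more elaborate:

```python
import hashlib

class HashRatchet:
    """Toy forward-secrecy ratchet: every message gets a fresh key, and the
    chain key that produced it is immediately overwritten, so seizing the
    device's current state cannot recover keys for messages already sent."""

    def __init__(self, shared_seed: bytes):
        self._chain_key = hashlib.sha256(b"chain:" + shared_seed).digest()

    def next_message_key(self) -> bytes:
        # Derive this message's key from the current chain key...
        msg_key = hashlib.sha256(b"msg:" + self._chain_key).digest()
        # ...then ratchet forward, destroying the old chain key. SHA-256 is
        # one-way, so neither the previous chain key nor msg_key can be
        # recomputed from the new state.
        self._chain_key = hashlib.sha256(b"step:" + self._chain_key).digest()
        return msg_key
```

Two endpoints initialized with the same seed derive identical key sequences, but an investigator (or attacker) who later obtains the ratchet state gets only future keys, never past ones. That one-way property is exactly what makes a retained "exceptional access" key a U-turn from these practices.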
Politics Net-Benefit
The counterplan avoids the link to politics. Our link is based on law
enforcement backlash to the plan. The counterplan pacifies law enforcement
by pushing for a Golden Key.
Independently, Cameron will pressure Obama to oppose encryption, forcing
his hand.
WSJ 15 — Wall Street Journal, 2015 (“U.K.’s Cameron to Lobby Obama on Encryption,” Byline Danny Yadron and
Devlin Barrett, January 14th, Available Online at http://www.wsj.com/articles/u-k-s-cameron-to-lobby-obama-to-criticize-tech-company-encryption-1421277542, Accessed 07-23-2015)
British Prime Minister David Cameron
plans to lobby President Barack Obama this week to more
publicly criticize U.S. technology companies, such as Facebook Inc., that offer encrypted
communications that can’t be unscrambled even with a court order, two people familiar with the matter said.
The move would extend Mr. Cameron’s efforts to help intelligence and security
officials access the information they say they need to help counter terror. One issue he and intelligence officials
have highlighted in recent days is the growing use of encryption and the difficulty it poses for law enforcement. The U.S.
Justice Department has also sought a way to access encrypted communications with a court order, but has been rebuffed
by civil-liberties concerns.
Mr. Cameron, a Conservative, could
put pressure on Mr. Obama to pick a side in a fight
between privacy advocates and law enforcement over secret messaging in the digital age.
“Are we going to allow a means of communications which it simply isn’t possible to read?” Mr. Cameron said in a speech
Monday. “No. We must not.”
The counterplan avoids this link because it relieves the pressure on Obama.
He can tell Cameron he’s doing his best to achieve a workable Golden Key.
Terrorism/Crime DA
Terrorism DA Links
Encryption decimates effective law enforcement. The impact is rampant
terrorism and crime.
Glasser 14 — Ellen Glasser, President of the Society of Former Special Agents of the Federal Bureau of Investigation,
Adjunct Professor in the Criminology & Criminal Justice Department at the University of North Florida, served as an FBI
Agent for 24 years, 2014 (“Tech companies are making it harder for the nation's law enforcement,” The Baltimore Sun,
November 6th, Available Online at http://www.baltimoresun.com/news/opinion/oped/bs-ed-fbi-apple-20141106-story.html, Accessed 07-05-2015)
FBI Director Comey has been on the job for just over a year and is working to change perceptions. In addressing the
myriad challenges that face our nation, he brings
a positive, reasoned approach to the public discussion of
privacy versus safety. While appreciating the public's concern over privacy, he has
been very clear that the marketing of these new devices will seriously impede law
enforcement's ability to protect Americans. Put simply, legal access to
unencrypted mobile device information is needed to keep our citizens and our
country safer.
Here is some FBI reality. We live in an apocalyptic, post-9/11 world, where the FBI is
confronted with a dizzying array of threats from terrorist bombings to
beheadings of innocent victims. Over the years, the FBI has also responded to anthrax
attacks, shoe and underwear bombers, White House fence jumpers, child
molesters, school shootings, human trafficking, kidnappings and massive
fraud schemes. The FBI investigates these matters within the scope of the law
and with great, abiding respect for the right of individuals to privacy.
Let me bring this close to home. What if your child was abducted, and the FBI developed
mobile device information and had a court order, but FBI agents were unable to
access the critical, time-sensitive, unencrypted information that was necessary to
save your child's life? Thankfully, most people will never be in a life-or-death
situation like this, but it does happen. When it does — any FBI agent can tell you from
experience — people want help. Let's start by helping them now.
Public perception needs to change so the focus is on handcuffing the bad guys,
not tying the hands of the good guys. Please contact your elected representatives to tell them that
corrective legislation is necessary to require companies like Apple and Google to
work with law enforcement and find a solution to this problem.
Law enforcement can’t break strong encryption. ISIS loves this.
Wittes 15 — Benjamin Wittes, Senior Fellow in Governance Studies at the Brookings Institution, Editor in Chief of
Lawfare, Member of the Task Force on National Security and Law at the Hoover Institution, 2015 (“Thoughts on
Encryption and Going Dark: Part I,” Lawfare—a national security blog curated by the Brookings Institution, July 9th,
Available Online at http://www.lawfareblog.com/thoughts-encryption-and-going-dark-part-i, Accessed 07-13-2015)
FBI Director James Comey
has been on a public offensive of late, arguing against end-to-end
encryption that prevents law enforcement access to communications even when
authorities have appropriate legal process to capture those communications. The
offensive began with a speech at Brookings some months ago. More recently, Comey made these comments on CNN,
these comments in a private conversation with me, and wrote this piece for Lawfare.
Yesterday, he was on Capitol Hill, testifying both before the Senate Judiciary Committee (video at this link, prepared
statement here) and before the Senate Select Committee on Intelligence (video below):
[Video Omitted]
Comey made some news yesterday. For one thing, he stated very clearly to the Judiciary Committee—and with
evident reluctance—that some of the encryption the bureau is now facing is beyond its
capacity to crack:
[I]f we intercept data in motion between two encrypted devices or across an encrypted mobile messaging app
and it's strongly encrypted, we can't break it.
Now, this is sometimes—I hate that I'm here saying this, but I actually think the problem is severe enough that I
need to let the bad guys know that. That's the risk in what we're talking about here. The bad—I'm just
confirming something for the bad guys.
Sometimes people watch TV and think, "Well, the FBI must have some way
to break that strong encryption." We do not, which is why this is such an
important issue.
At another point, he stated that while some companies have designed systems that they lack the capacity to decrypt, in
other instances, some companies have simply declined to assist investigators in decrypting signal even where decryption
was possible—a matter on which at least one senator sought further information. (See Comey's comments at 1:17:00 and
his subsequent exchange with Senator Sheldon Whitehouse at 1:20:00 of the Judiciary Committee hearing.)
All in all, Comey's reception on the Hill was significantly warmer than I expected. The Bureau has clearly done a lot of
quiet behind-the-scenes work with members to familiarize them with the problem as the FBI sees it, and many members
yesterday seemed to require little persuasion.
But Comey has a very heavy lift ahead of him if he is to make progress on the "Going Dark" problem. For one thing, it's
not entirely clear what constitutes progress from the Bureau's perspective. The administration is, at this stage, not asking
for legislation, after all. It's merely describing an emergent problem.
But this is a bit of a feint. The core of that emergent problem, at least as Comey's joint statement with Deputy Attorney
General Sally Yates frames it, is that CALEA—which mandates that telecommunications providers retain the capacity for
law enforcement to get access to signal for lawful wiretapping—does not reach internet companies. So even if Apple and
Google were to voluntarily retain encryption keys, some other actor would very likely not do so. Absent a legal
requirement that companies refrain from making true end-to-end encrypted
services available without a CALEA-like stop-gap, some entity will see a market hole and
provide those services. And it's fair to assume that ISIS and the most sophisticated bad
actors will gravitate in the direction of that service provider.
In other words, I think Comey and Yates inevitably are asking for legislation, at least in the longer term. The
administration has decided not to seek it now, so the conversation is taking place at a somewhat higher level of
abstraction than it would if there were a specific legislative proposal on the table. But the
current discussion
should be understood as an effort to begin building a legislative coalition for
some sort of mandate that internet platform companies retain (or build) the ability
to permit, with appropriate legal process, the capture and delivery to law
enforcement and intelligence authorities of decrypted versions of the signals they
carry.
The plan risks catastrophic terrorism.
Weissmann 14 — Andrew Weissmann, Senior Fellow at the Center for Law and Security and the Center on the
Administration of Criminal Law at New York University, former General Counsel for the Federal Bureau of Investigation,
holds a J.D. from Columbia Law School, 2014 (“Apple, Boyd, and Going Dark,” Just Security, October 20th, Available
Online at http://justsecurity.org/16592/apple-boyd-dark/, Accessed 07-05-2015)
To my mind — although, as in many areas of the law, there is no perfect solution — the cost of a system
where we may be more at risk to illegal hacking is outweighed by the vital role
lawful electronic interception plays in thwarting crime – including devastating
terrorist attacks. Law enforcement and intelligence officials, including most recently FBI
Director James Comey, have noted that we all – including criminals — increasingly use
non-telephonic means to communicate. The ability to monitor electronic
communications is decreasing with every new encryption tool on such
communication systems. Law enforcement authorities in the US and overseas rightfully
note how such data is critical to solving everyday crimes, such as kidnapping,
fraud, child pornography and exploitation, among many others. And at least as
important, preventing terrorist attacks requires such ability, as intelligence agencies
note (although due to the Snowden leaks, resulting in the public perception that the intelligence community has too
much, not too little, access to information, the ramifications from encryption on traditional law enforcement is likely to be
relied on by the government in the public debate on this issue).
This is a judgment Congress needs to make, and soon. In weighing the interests, however, it is no answer to say that the
government should revert to means other than lawful intercepts obtained through court orders based on probable cause
to prevent crimes. The
reality of electronic communications is here to stay and plays a
vital role in how crimes are perpetrated by allowing people to communicate
with conspirators and to carry out their nefarious plans. In this regard, the government and
privacy advocates both need to be consistent in their arguments: it is the latter who usually remind us that the advent of
smartphones and “big data” makes traditional Fourth Amendment line-drawing obsolete. And they have a point, as the
Supreme Court is starting to recognize. But by the same token, it
is increasingly important to have an
ability to monitor such communications, after meeting the necessary Fourth
Amendment standard upon a showing to an independent Article III court.
The plan substantially increases the risk of catastrophic crime and terrorism.
Rubin 14 — Jennifer Rubin, Columnist and Blogger for the Washington Post, holds a J.D. from the University of
California-Berkeley, 2014 (“Silicon Valley enables terrorists and criminals,” Right Turn—a Washington Post blog, October
19th, Available Online at http://www.washingtonpost.com/blogs/right-turn/wp/2014/10/19/silicon-valley-enables-terrorists-and-criminals/, Accessed 07-05-2015)
Google chairman Eric Schmidt likes to brag that his company is “on the right side of history.” He pats himself on the
back for pulling out of China because of that country’s censoring practices. His company even has a slogan,
“Don’t be evil,” meant to remind Google employees that they aspire to the highest ethical standards. But, to be
blunt, Google is violating its own “don’t be evil” rule by insisting on encryption
technology which locks out anti-terrorist and law enforcement agencies. That
gives terrorists and common criminals alike huge protection and puts their
fellow Americans at risk.
Benjamin Wittes of the Brookings Institution explains this is not about “encryption,” as some reports
characterize it. No one is talking about eliminating encryption, he explains, “Without it, you
couldn’t have electronic commerce. Nobody wants to get rid of encryption.” He explains, “ The only question is
whether there should be government access with lawful process — or not.”
In a scantly covered speech this week, FBI Director James Comey explained:
The issue is whether companies not currently subject to the Communications Assistance for
Law Enforcement Act should be required to build lawful intercept capabilities for
law enforcement. We aren’t seeking to expand our authority to intercept communications. We are
struggling to keep up with changing technology and to maintain our ability to actually collect the
communications we are authorized to intercept.
And if the challenges of real-time interception threaten to leave us in the dark, encryption threatens
to lead all of us to a very dark place.
Encryption is nothing new. But the challenge to law enforcement and national security officials is markedly
worse, with recent default encryption settings and encrypted devices and networks—all designed to increase
security and privacy.
With Apple’s new operating system, the information stored on many iPhones and other Apple devices will be
encrypted by default. Shortly after Apple’s announcement, Google announced plans to follow suit with its
Android operating system. This means the companies themselves won’t be able to unlock phones, laptops, and
tablets to reveal photos, documents, e-mail, and recordings stored within.
That is a problem that is not solved, as Apple claims, by providing access to the cloud. “But uploading to the cloud
doesn’t include all of the stored data on a bad guy’s phone, which has the potential to create a black hole for law
enforcement,” Comey said. “And if the bad guys don’t back up their phones routinely, or if they opt out of uploading to
the cloud, the data will only be found on the encrypted devices themselves. And it is people most worried about what’s
on the phone who will be most likely to avoid the cloud and to make sure that law enforcement cannot access
incriminating data.”
In fact, the blocked phones are simply part of a marketing pitch to cater to young people who are misinformed and
paranoid about what information the government has access to. Comey observed that “it will have very serious
consequences for law enforcement and national security agencies at all levels. Sophisticated criminals will
come to count on these means of evading detection. It's the equivalent of a closet
that can’t be opened. A safe that can’t be cracked. And my question is, at what cost?”
Well, some terrorists will use it to plan and execute murderous schemes, organized
crime will use it to hide from law enforcement and the American people will be
less safe and less secure.
Maybe the president (whose party benefits from liberal high-tech donors) should call these people in for a chat and
explain why they should stop this. Alternatively, Congress should hold open hearings and have these execs explain why
they want to give terrorists an e-hideout. Then again, maybe concerned Americans who want to combat terrorists should
simply not use these products. (Hang onto your old phone until they drop the “locked safe,” for example.) What
President Obama, Congress and the American people should not do is sit idly by while they put us at risk for pecuniary
gain.
Comey went out of his way to be nice to these companies: “Both companies are run by good people, responding to
what they perceive is a market demand.” Too nice, in my mind. Instead he should have just told them flat
out, “Don’t be evil.”
Especially true of ISIS.
AP 15 — Associated Press, 2015 (“US Officials: Encryption Hinders Monitoring Extremists,” Byline Eric Tucker, June
4th, Available Online at http://www.forensicmag.com/news/2015/06/us-officials-encryption-hinders-monitoring-extremists, Accessed 07-06-2015)
The growing
use of encrypted communications and private messaging by
supporters of the Islamic State group is complicating efforts to monitor terror
suspects and extremists, U.S. law enforcement officials said Wednesday.
Appearing before the House Homeland Security Committee, the officials said that even as thousands of
Islamic State group followers around the world share public communications on
Twitter, some are exploiting social media platforms that allow them to shield
their messages from law enforcement.
"There are 200-plus social media companies. Some of these companies build their business model
around end-to-end encryption," said Michael Steinbach, head of the FBI's
counterterrorism division. "There is no ability currently for us to see that"
communication, he said.
Encryption helps terrorists — Zazi proves.
Crovitz 14 — L. Gordon Crovitz, Columnist and Former Publisher of The Wall Street Journal, former Executive Vice-President of Dow Jones, 2014 (“Terrorists Get a Phone Upgrade,” Wall Street Journal, November 23rd, Available Online at
http://www.wsj.com/articles/gordon-crovitz-terrorists-get-a-phone-upgrade-1416780266, Accessed 07-20-2015)
It’s a good thing Najibullah Zazi didn’t have access to a modern iPhone or Android
device a few years ago when he plotted to blow up New York City subway stations. He
was caught because his email was tapped by intelligence agencies—a practice that
Silicon Valley firms recently decided the U.S. government is no longer permitted to use.
Apple, Google, Facebook and others are playing with fire, or, in the case of Zazi,
with a plot to blow up subway stations under Grand Central and Times Square on Sept. 11, 2009. An
Afghanistan native living in the U.S., Zazi became a suspect when he used his unencrypted
Yahoo email account to double-check with his al Qaeda handler in Pakistan about the
precise chemical mix to complete his bombs. Zazi and his collaborators,
identified through phone records, were arrested shortly after he sent an email
announcing the imminent attacks: “The marriage is ready.”
The Zazi example (he pleaded guilty to conspiracy charges and awaits sentencing) highlights the risks
that Silicon Valley firms are taking with their reputations by making it impossible for
intelligence agencies or law enforcement to gain access to these communications.
In September, marketers from Apple bragged of changes to its operating system so that it will not comply with judicial
orders in national-security or criminal investigations.
“Unlike our competitors,” Apple announced, “it’s not technically feasible for us to respond to government warrants.”
This encryption was quickly matched by Google and the WhatsApp messaging service owned by Facebook.
In a private meeting last month, Deputy Attorney General James Cole asked the general counsel of Apple why the
company would want to market to criminals. As the Journal reported last week, Mr. Cole gave the hypothetical of the
police announcing that they would have been able to rescue a murdered child if only they could have had access to the
killer’s mobile device. Apple’s response was that the U.S. can always pass a law requiring companies to provide a way to
gain access to communications under court orders.
Since then, U.S. and British officials have made numerous trips to Silicon Valley to explain the dangers. FBI Director
James Comey gave a speech citing the case of a sex offender who lured a 12-year-old boy in Louisiana in 2010 using text
messages, which were later obtained to get a murder conviction. “There should be no one in the U.S. above the law,” Mr.
Comey said, “and also no places within the U.S. that are beyond the law.”
Robert Hannigan, the head of Britain’s electronic-intelligence agency, Government Communications Headquarters,
warned in a Financial Times op-ed earlier this month: “However
much they may dislike it,” Silicon
Valley firms “have become the command-and-control networks of choice for
terrorists and criminals.”
Even without terrorism attacks that could have been prevented, Mr. Hannigan said, he thought Internet users may be
“ahead” of Silicon Valley: “They do not want the media platforms they use with their friends and families to facilitate
murder or child abuse.”
It looks like Silicon Valley has misread public opinion. The initial media frenzy caused by the Edward Snowden leaks has
been replaced by recognition that the National Security Agency is among the most lawyered agencies in the government.
Contrary to initial media reports, the NSA does not listen willy-nilly to phone and email communications.
Last week, the Senate killed a bill once considered a sure thing. The bill would have created new barriers to the NSA
obtaining phone metadata to connect the dots to identify terrorists and prevent their attacks. Phone companies, not the
NSA, would have retained these records. There would have been greater risks of leaks of individual records. An
unconstitutional privacy advocate would have been inserted into Foreign Intelligence Surveillance Court proceedings.
The lesson of the Snowden accusations is that citizens in a democracy make reasonable
trade-offs between privacy and security once they have all the facts. As people
realized that the rules-bound NSA poses little to no risk to their privacy, there
was no reason to hamstring its operations. Likewise, law-abiding people know
that there is little to no risk to their privacy when communications companies
comply with U.S. court orders.
Finding no willingness by Silicon Valley to rethink its approach without being required by law, FBI Director Comey
recently asked Congress to update the Communications Assistance for Law Enforcement Act of 1994. This requires
traditional phone companies to comply with court orders to provide access to records. He wants the law updated to cover
Apple, Google and other digital companies.
Silicon Valley firms should find ways to comply with U.S. court orders or expect
Congress to order them to do so. They also shouldn’t be surprised if their
customers think less of companies that go out of their way to market technical
solutions to terrorists and criminals.
Crime DA Links
Encryption transforms the internet into an ungovernable space. Law
enforcement would become powerless.
Wittes 15 — Benjamin Wittes, Senior Fellow in Governance Studies at the Brookings Institution, Editor in Chief of
Lawfare, Member of the Task Force on National Security and Law at the Hoover Institution, 2015 (“Thoughts on
Encryption and Going Dark, Part II: The Debate on the Merits,” Lawfare—a national security blog curated by the
Brookings Institution, July 12th, Available Online at http://www.lawfareblog.com/thoughts-encryption-and-going-darkpart-ii-debate-merits, Accessed 07-13-2015)
Consider the conceptual question first. Would
it be a good idea to have a world-wide
communications infrastructure that is, as Bruce Schneier has aptly put it, secure from all
attackers? That is, if we could snap our fingers and make all device-to-device
communications perfectly secure against interception from the Chinese, from
hackers, from the FSB but also from the FBI even wielding lawful process, would
that be desirable? Or, in the alternative, do we want to create an internet as secure as
possible from everyone except government investigators exercising their legal
authorities with the understanding that other countries may do the same?
Conceptually speaking, I am with Comey on this question—and the matter does
not seem to me an especially close call. The belief in principle in creating a giant
world-wide network on which surveillance is technically impossible is really an
argument for the creation of the world's largest ungoverned space. I understand
why techno-anarchists find this idea so appealing. I can't imagine for a moment, however,
why anyone else would.
Consider the comparable argument in physical space: the creation of a city in
which authorities are entirely dependent on citizen reporting of bad conduct but
have no direct visibility onto what happens on the streets and no ability to
conduct search warrants (even with court orders) or to patrol parks or street
corners. Would you want to live in that city? The idea that ungoverned spaces
really suck is not controversial when you're talking about Yemen or Somalia. I
see nothing more attractive about the creation of a worldwide architecture in
which it is technically impossible to intercept and read ISIS communications
with followers or to follow child predators into chatrooms where they go after
kids.
Encryption empowers criminals and terrorists — it makes prosecution
impossible.
Vance 14 — Cyrus Vance Jr., District Attorney of Manhattan, holds a J.D. from the Georgetown University Law
Center, 2014 (“Apple and Google threaten public safety with default smartphone encryption,” Washington Post, September
26th, Available Online at http://www.washingtonpost.com/opinions/apple-and-google-threaten-public-safety-withdefault-smartphone-encryption/2014/09/25/43af9bf0-44ab-11e4-b437-1a7368204804_story.html, Accessed 07-05-2015)
Apple and Google, whose operating systems run a combined 96.4 percent of smartphones worldwide,
announced last week that their new operating systems will prevent them from
complying with U.S. judicial warrants ordering them to unlock users’ passcode-protected mobile devices.
Each company tweaked the code of its new and forthcoming mobile operating systems — iOS 8 and Android “L,”
respectively — for this explicit purpose. “Apple cannot bypass your passcode and therefore cannot access this data. So it’s
not technically feasible for us to respond to government warrants for the extraction of this data from devices in their
possession,” reads a new section of Apple’s Web site. “Keys are not stored off of the device, so they cannot be shared with
law enforcement,” a Google spokeswoman stated.
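The design both statements describe can be sketched as follows: the decryption key is derived on the device from the user's passcode and never leaves it, so the vendor holds nothing it could surrender in response to a warrant. This is a toy sketch under stated assumptions — the XOR cipher, iteration count, and passcode are illustrative, not Apple's or Google's actual scheme:

```python
import hashlib
import secrets

def derive_key(passcode: str, salt: bytes) -> bytes:
    # Deliberately slow key derivation, so guessing passcodes is expensive.
    return hashlib.pbkdf2_hmac("sha256", passcode.encode(), salt, 200_000)

def xor_cipher(key: bytes, data: bytes) -> bytes:
    # Toy cipher for illustration: XOR with a SHA-256-derived keystream.
    stream = b""
    i = 0
    while len(stream) < len(data):
        stream += hashlib.sha256(key + i.to_bytes(8, "big")).digest()
        i += 1
    return bytes(a ^ b for a, b in zip(data, stream))

salt = secrets.token_bytes(16)      # stored on the device
key = derive_key("1234", salt)      # derived from the user's passcode on-device
ciphertext = xor_cipher(key, b"contacts, photos, messages")

# The vendor never receives the passcode or the derived key, so it holds
# nothing that would let it decrypt the stored data. A wrong passcode
# derives a different key and yields garbage.
wrong_key = derive_key("0000", salt)
assert xor_cipher(wrong_key, ciphertext) != b"contacts, photos, messages"
assert xor_cipher(key, ciphertext) == b"contacts, photos, messages"
```

On this design, "not technically feasible for us to respond to government warrants" is literal: the company could only attack its own key derivation by brute force, the same as any outside attacker.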
While these maneuvers may be a welcome change for those who seek greater privacy controls, the
unintended
victors will ultimately be criminals, who are now free to hide evidence on their
phones despite valid warrants to search them.
On the losing end are the victims of crimes — from sexual assault to money
laundering to robbery, kidnapping and homicide — many of whom undoubtedly are these
companies’ own loyal customers.
When news of these changes was reported, I did a brief survey of my office’s
recent cases to see which defendants Apple and Google would have protected had
their passcode-locked smartphones been running iOS 8 or Android “L” at the time of their arrests. I found:
* Multiple violent gang members who discussed in a smartphone video their
plans to shoot a rival. The video was taken shortly before the members mistakenly shot an innocent
bystander. The evidence would later be used to implicate two dozen of the gang’s members in
additional murders and shootings.
* A vile “up-skirter” who was observed by police inside a major subway station
walking up and down stairs behind women wearing skirts, with two iPhones angled upward in his hands. A warrant
allowed us to search the phones, which revealed exactly what you would think, as recorded by the perpetrator at multiple
stations throughout New York.
* An
identity thief whose smartphone contained the bank account numbers,
blank check images, account activity screen shots and tax return information of
several individuals.
Today, nearly every criminal case has a digital component. Much of the
evidence required to identify, locate and prosecute criminals is stored on
smartphones. None of the above cases could be prosecuted as effectively if the
perpetrators had smartphone software incorporating Apple and Google’s
privacy guarantees.
Apple and Google have brought their products to a new level of privacy, and of course privacy is critically important to
our society. But the protection of privacy is found in the Constitution, which requires warrants issued by neutral,
detached judges and supported by probable cause before law enforcement can obtain information from a mobile device.
Absent certain narrow exceptions, my office cannot search a mobile device without a warrant. Neither can the other
thousands of state and local prosecutors offices throughout the country. The warrant requirement assures that peoples’
possessions and privacy remain secure in all but exceptional circumstances.
Apple’s and Google’s software updates, however, push mobile devices beyond the
reach of warrants and thus beyond the reach of government law enforcement.
This would make mobile devices different from everything else. Even bank
security boxes — the “gold standard” of the pre-digital age — have always been searchable
pursuant to a judicial warrant. That’s because banks keep a key to them.
I am aware of no plausible reason why these companies cannot reverse these dangerous maneuvers in their next
scheduled updates to iOS 8 and Android “L.” Apple’s
and Google’s software should not
provide aid and comfort to those who commit crimes. This is not a matter of good or bad
corporate citizenship. It is a matter of national public safety.
When threats to the common public safety arise, we ask Congress to do
everything within its constitutional authority to address them. The provision of
cloaking tools to murderers, sex offenders, identity thieves and terrorists
constitutes such a threat.
Absent remedial action by the companies, Congress should act appropriately.
Encryption could get us all killed.
Hosko 14 — Ronald T. Hosko, President of the Law Enforcement Legal Defense Fund, Former Assistant Director of
the Criminal Investigative Division at the Federal Bureau of Investigation, 2014 (“Don’t Create Virtual Sanctuaries for
Criminals,” Room for Debate—a New York Times expert blog, October 1st, Available Online at
http://www.nytimes.com/roomfordebate/2014/09/30/apple-vs-the-law/dont-create-virtual-sanctuaries-for-criminals,
Accessed 07-05-2015)
When the director of the F.B.I. recently voiced his concerns about the impact of newly designed smartphone encryption
systems on our security, the unsurprising, reflexive response of many was, “Too bad, government. We can’t trust you.”
But, like most issues that divide our nation, it’s more complicated than that.
As a former insider, I watched the painful flow of Snowden
disclosures with dismay – those leaks, I’m
confident, have weakened our national security. It is no leap to suggest that those who already aim to
harm us are bolstered by the information they have learned from the revelations. Even a moderately savvy
criminal will closely observe the actions taken by law enforcement during their
pursuit or conviction, later using such tactics themselves in future, often more
dangerous crimes.
Popular culture has also skewed the public’s view of how law enforcement
actually solves crimes. As happy as we would all be to close complex cases
within a 60-minute time slot using cutting-edge technology, fighting crime nearly
always requires exhaustive investigation and not simply the use of fancy
gadgetry. In fact, investigators are limited in their technological capabilities, and I'm
concerned that the gap between public perception and reality could only make
matters worse.
The virtual sanctuaries constructed and marketed by certain companies will not
only attract ordinary Americans seeking to protect their private communications,
but also criminals and conspirators who wish to destroy our nation or to do
great harm to others. Creating these technological fortresses will have our
intelligence and law enforcement communities scurrying to penetrate them. On
occasion, they’ll succeed. But on others, time or cost will defeat them, and people
will be hurt or killed. While we debate the delicate balance of privacy and the
lawful need to intrude, we leave American lives at risk.
Encryption exponentially increases the risk of catastrophic crime and terrorism.
Hosko 14 — Ronald T. Hosko, President of the Law Enforcement Legal Defense Fund, Former Assistant Director of
the Criminal Investigative Division at the Federal Bureau of Investigation, 2014 (“Apple and Google’s new encryption
rules will make law enforcement’s job much harder,” Washington Post, September 23rd, Available Online at
http://www.washingtonpost.com/posteverything/wp/2014/09/23/i-helped-save-a-kidnapped-man-from-murderwith-apples-new-encryption-rules-we-never-wouldve-found-him/, Accessed 07-05-2015)
Last week, Apple and Google announced that their new operating systems will be encrypted by default. Encrypting
a phone doesn’t make it any harder to tap, or “lawfully intercept” calls. But it does
limit law enforcement’s
access to data, contacts, photos and email stored on the phone itself.
That kind of information can help law enforcement officials solve big cases
quickly. For example, criminals sometimes avoid phone interception by
communicating plans via Snapchat or video. Their phones contain contacts, texts,
and geo-tagged data that can help police track down accomplices. These new
rules will make it impossible for us to access that information. They will create
needless delays that could cost victims their lives.*
Law enforcement officials rely on all kinds of tools to solve crimes and bring
criminals to justice. Most investigations don’t rely solely on information from
one source, even a smartphone. But without each and every important piece of the
investigative puzzle, criminals and those who plan acts destructive to our
national security may walk free.
In my last FBI assignments, I was privy to information that regularly demonstrated
how criminal actors adapted to law enforcement investigative techniques – how
drug conspirators routinely “dropped” their cellphones every 30 days or so, estimating the
time it takes agents to identify and develop probable cause on a new device before seeking interception authority; how
child predators migrated to technologies like the Onion Router to obfuscate who’s
posting and viewing online images and videos of horrific acts of child
abuse.
We shouldn’t give them one more tool.
But the long-used cellular service selling points of clarity and coverage have been overtaken by a new one – concealment.
Capitalizing on fears stoked by the Snowden disclosures, Apple and Android have pitched this as a move to protect consumers’
privacy. Don’t misunderstand me — I, too, place a great value on personal privacy. I have little interest in the
government collecting and storing all of my texts and e-mails or logging all of my calls.
But Apple’s and Android’s new
protections will protect many thousands of criminals
who seek to do us great harm, physically or financially. They will protect those
who desperately need to be stopped from lawful, authorized, and entirely
necessary safety and security efforts. And they will make it impossible for police
to access crucial information, even with a warrant.
As Apple and Android trumpet their victories over law enforcement efforts, our citizenry, our Congress, and our media
ought to start managing expectations about future law enforcement and national security success. We’ve
lived in
an era where the term “connecting the dots” is commonly used. If our cutting
edge technologies are designed to keep important dots out of the hands of our
government, we all might start thinking about how safe and secure we will be
when the most tech-savvy, dedicated criminals exponentially increase their own
success rates.
* Editor’s note: This story incorrectly stated that Apple and Google’s new encryption rules would have hindered law
enforcement’s ability to rescue the kidnap victim in Wake Forest, N.C. This is not the case. The piece has been corrected.
DA Turns Case
DA turns the case — people will demand backdoors in response to horrible
crimes.
Wittes 15 — Benjamin Wittes, Senior Fellow in Governance Studies at the Brookings Institution, Editor in Chief of
Lawfare, Member of the Task Force on National Security and Law at the Hoover Institution, 2015 (“Thoughts on
Encryption and Going Dark, Part II: The Debate on the Merits,” Lawfare—a national security blog curated by the
Brookings Institution, July 12th, Available Online at http://www.lawfareblog.com/thoughts-encryption-and-going-darkpart-ii-debate-merits, Accessed 07-13-2015)
There's a final, non-legal factor that may push companies to work this problem as energetically as they are now moving
toward end-to-end encryption: politics. We are at a very particular moment in the cryptography debate, a moment in which
law enforcement sees a major problem as having arrived but the tech companies see that problem as part of the solution
to the problems the Snowden revelations created for them. That is, we have an end-to-end encryption issue, in significant
part, because companies are trying to assure customers worldwide that they have their backs privacy-wise and are not
simply tools of NSA. I think those politics are likely to change. If
Comey is right and we start seeing
law enforcement and intelligence agencies blind in investigating and preventing
horrible crimes and significant threats, the pressure on the companies is going
to shift. And it may shift fast and hard. Whereas the companies now feel intense
pressure to assure customers that their data is safe from NSA, the kidnapped kid
with the encrypted iPhone is going to generate a very different sort of political
response. In extraordinary circumstances, extraordinary access may well seem
reasonable. And people will wonder why it doesn't exist.
They Say: “Friedersdorf”
Friedersdorf is wrong — encryption is an unprecedented attack on the rule of
law.
Wittes 15 — Benjamin Wittes, Senior Fellow in Governance Studies at the Brookings Institution, Editor in Chief of
Lawfare, Member of the Task Force on National Security and Law at the Hoover Institution, 2015 (“Conor Friedersdorf
Takes the Bait on Encryption,” Lawfare—a national security blog curated by the Brookings Institution, July 15th, Available
Online at http://www.lawfareblog.com/conor-friedersdorf-takes-bait-encryption, Accessed 07-20-2015)
Over at the Atlantic, Conor Friedersdorf has taken my bait on encryption. The other day, I
proposed separating the end-to-end encryption discussion into two distinct questions: First, we should ask whether a
world of end-to-end strong encryption, in which whole categories of surveillance are impossible, is a desirable outcome.
Then, second, we should ask whether the costs of retaining the possibility of surveillance are too high in security and
privacy terms. On the first question, I wrote the following:
Conceptually speaking, I am with Comey on this question—and the matter does not seem to me an especially
close call. The belief in principle in creating a giant world-wide network on which surveillance is technically
impossible is really an argument for the creation of the world's largest ungoverned space. I understand why
techno-anarchists find this idea so appealing. I can't imagine for a moment, however, why anyone else would.
Consider the comparable argument in physical space: the creation of a city in which authorities are entirely
dependent on citizen reporting of bad conduct but have no direct visibility onto what happens on the streets
and no ability to conduct search warrants (even with court orders) or to patrol parks or street corners. Would
you want to live in that city? The idea that ungoverned spaces really suck is not controversial when you're
talking about Yemen or Somalia. I see nothing more attractive about the creation of a worldwide architecture in
which it is technically impossible to intercept and read ISIS communications with followers or to follow child
predators into chatrooms where they go after kids.
I separated this question precisely because I think it's
important in the encryption discussion to
smoke out the in-principle libertarianism that asserts some deep-seated right to
encrypted communication and distinguish it from the practical concerns that
worry that the costs of retaining law enforcement access are too high relative to the
benefits. The former is an ideological position to which facts and technological
possibility are irrelevant. The latter is one that might be highly responsive to
technological development.
Friedersdorf, as I knew someone would, has taken me up on the principled challenge:
The problem lies in the limits of his analogy.
In an ungoverned territory like Somalia, bad actors can take violent physical actions with impunity–say, seizing
a cargo ship, killing the captain, and taking hostages. If authorities were similarly helpless on America’s streets–
if gangs could rob or murder pedestrians as they pleased, and police couldn’t see or do a thing–that would,
indeed, be dystopian. But when communications are encrypted, the “ungoverned territory” does not
encompass actions, violent or otherwise, just thoughts and their expression.
No harm is done within the encrypted space.
To be sure, plots planned inside that space can do terrible damage in the real world–but so can plots hatched by
gang members on public streets whispering into one another’s ears, or Tony Soprano out on his boat, having
swept it for FBI bugs.
I wonder what Wittes would make of a different analogy.
In the absence of end-to-end encryption–indeed, even if it becomes a universally available tool–the largest
ungoverned space in the world won’t be Somalia or the Internet, but the aggregate space between the ears of
every human being on planet earth.
No authority figure can see into my brain, or the brain of Chinese human-rights activist Liu Xiaobo, or the
brains of ISIS terrorists, or the brains of Black Lives Matter protestors, or the brains of child pornographers, or
the brains of Tea Partiers or progressive activists or whoever it is that champions your political ideals. If
government had access to all of our thoughts, events from the American Revolution to the 9/11 terrorist attacks
would’ve been impossible. Reflecting on this vast “ungoverned territory,” does Wittes still regard as
uncontroversial the notion that “ungoverned spaces really suck”? More to the point, if it’s ever technically
possible, would he prefer a world in which government is afforded a “backdoor” into my brain, yours, and
those of the next Osama bin Laden, MLK, and Dylann Storm Roof?
Friedersdorf puts forward two ideas in these paragraphs, both of them underdeveloped and worth exploring.
The first is his idea that cyberspace involves a neat division between thought and action—that is, that you can encrypt
thought and expression, but you cannot encrypt deeds. So an ungoverned space consisting of encrypted communications
is fundamentally unlike the ungoverned space of Somalia, because you have to step outside of it in order to do any actual
harm. And once you're in the real world, you can be investigated and captured.
As a preliminary matter, it is worth noting that Friedersdorf here is
dead wrong in his empirical
assertion. Code—including malware—which involves not the expression of ideas but making
computers, and sometimes things other than computers, do things, can be sent using
encrypted channels. Stolen intellectual property is routinely exfiltrated in Friedersdorf's words
"within the encrypted space." Child pornography is moved around the world
and stored using encryption. Illicit transactions are conducted routinely online.
Money can be stolen online (think about someone breaking into your Paypal account). The problem
of cybersecurity would be a rather easy one were Friedersdorf correct that "the
'ungoverned territory' does not encompass actions, violent or otherwise, just
thoughts and their expression."
It is true, so far anyway, that the realm of violent action normally involves recourse
to the kinetic world. But that is growing less true by the day. Drones, after all, are
operated remotely using communications channels. In our book on The Future of Violence,
Gabby and I describe "sextortion" cases that amount to a sort of online coerced sexual
activity that takes place entirely remotely. Friedersdorf sounds very 20th Century
when he suggests that you can't do someone any real harm without stepping into
the "real world."
But let's assume for a moment that he is correct—that the online world is a world of ideas,
thought, and communication only. Does it follow conceptually that we should be comfortable with its
being unsurveillable? Certain communications, after all, involve commands between people
("Shoot that guy") or conspiracies between people ("Let's blow up the Congress
of the United States, okay?"). A conspiracy, after all, is an agreement (communication)
between two or more people, plus an overt act. Even if we accept that the overt
act cannot take place by means of an encrypted channel, the agreement certainly
can and increasingly does. We did not accept, in the case of telecommunications, that
because the vast majority of phone calls are innocent and many of them are
sensitive and one cannot do violence over the phone, we should make
telephone surveillance technically impossible. Even assuming Friedersdorf's erroneous premise,
why should we aim conceptually to create a worldwide telecommunications
infrastructure in which terrorists, hackers, and child pornographers are as secure
as dissidents and democratic citizens? To be clear, I understand why cybersecurity needs may require
us to accept this outcome in practical terms. I fail, however, to understand why we should aspire to it.
This brings me to Friedersdorf's rather interesting effort to turn the tables and offer a hypothetical world of his own: "In
the absence of end-to-end encryption—indeed, even if it becomes a universally available tool—the largest ungoverned
space in the world won’t be Somalia or the Internet, but the aggregate space between the ears of every human being on
planet earth." Friedersdorf
means this as a dystopian vision, but it is quite literally the
world we have always lived in. Communications of all types have always been
possible to surveil. Have a private conversation with someone; someone else
may be listening unbeknownst to either party. Write a letter; it can be intercepted
and read. Use a code; it can be broken. Talk on the phone; you might be
wiretapped. The innovation here lies not in the idea of the possibility of
interception; it lies, rather, in the idea of the impossibility of decryption, in the
notion of a norm of almost-perfectly secure communications.
As such, Friedersdorf's
suggestion that I am pushing ever further into people's heads is a
little bit mischievous. The insides of our heads, after all, are truly the arena of pure
ideas (you cannot think an act of violence) and, as such, are unregulated. And the
question before us is not whether that zone of impunity will shrink. The question,
rather, is how much it will grow and whether we will for the first time contemplate
a global information infrastructure that is in many instances (but for the possibility of
betrayal by a communications partner) as secure as that space between our ears. There are, as I have
discussed, practical reasons why we might have to extend the zone of impunity for the first time beyond the individual's
head. But we
should not underestimate the gravity of that step or kid ourselves that
it will be without big consequences.