CryptoNeg (Backdoors/Encryption)

Case - General
CyberAttacks Inevitable
Cybersecurity is vulnerable because DHS lacks a strategy and interconnectivity is increasing;
systems were not built to deal with this
Vicinanzo 15 (http://www.hstoday.us/briefings/daily-news-analysis/single-article/dhs-leaves-federalfacilities-open-to-cyber-attacks/5f3d0085da0b04a7918f9b41a80874c1.html, DHS Leaves Federal Facilities
Open To Cyber Attacks)//A.V.
Amid reports that US Central Command’s social media accounts were attacked by hackers claiming allegiance to the Islamic State (IS), the Government
Accountability Office (GAO) issued an audit report indicating DHS
is unprepared to address the increasing vulnerability of
federal facilities to cyber attacks. GAO found the Department of Homeland Security (DHS)—the agency responsible for
protecting federal facilities—lacks a strategy to address cyber risk to building and access control systems in
federal facilities. Consequently, the nearly 9,000 federal facilities protected by Federal Protection Services (FPS) remain vulnerable to cyber
threats. “Federal facilities contain building and access control systems—computers that monitor and control
building operations such as elevators, electrical power, and heating, ventilation, and air conditioning—that are
increasingly being connected to other information systems and the Internet,” GAO said. The GAO auditors added,
“The increased connectivity heightens their vulnerability to cyber attacks, which could compromise security
measures, hamper agencies’ ability to carry out their missions, or cause physical harm to the facilities or their
occupants.” The increase in the connectivity of these systems has also led to an increase in vulnerability to
cyber attacks. For example, in 2009, a security guard in a Dallas-area hospital uploaded malware to a hospital computer which controlled the
heating, air conditioning, and ventilation for two floors. Court documents indicate that the breach could have interfered with patient treatment,
highlighting the danger cyber intrusions pose to building and access control systems. A
cyber expert told GAO these systems were not
designed with cybersecurity in mind. “Security officials we interviewed also said that cyber attacks on systems in federal facilities could
compromise security countermeasures, hamper agencies’ ability to carry out their missions, or cause physical harm to the facilities and their occupants,”
GAO said. Sources of cyber threats to building and access control systems include corrupt employees, criminal groups, hackers and terrorists. In
particular, insiders—which include disgruntled employees, contractors or other persons abusing their positions of trust—represent a significant threat to
these systems.
Cyber incidents reported to DHS involving industrial control systems increased from 140 to 243 incidents, a 74
percent increase, between fiscal years 2011 and 2014. Despite this increase, DHS does not have a strategy to
address cyber risk to building and access control systems. Specifically, DHS lacks a strategy that defines the problem, identifies the
roles and responsibilities, analyzes the resources needed, and identifies a methodology for assessing cyber risk to building and access control systems.
According to GAO, “The absence of a strategy that clearly defines the roles and responsibilities of key
components within DHS has contributed to the lack of action within the department. For example, no one within DHS is
assessing or addressing cyber risk to building and access control systems particularly at the nearly 9,000 federal facilities protected by FPS.” DHS’s
failure to develop a strategy has led to confusion among several components within DHS about their roles and
responsibilities. For example, FPS’s Deputy Director for Policy and Programs indicated FPS’s authority includes cybersecurity. However, the official
said that FPS is not assessing cyber risk because FPS does not have the expertise. Moreover, DHS lacks clear guidance on how federal
agencies should report cybersecurity incidents. Before 2014, for instance, DHS did not specify that information
systems included industrial control systems. DHS clarified this guidance in 2014 in part because of questions GAO auditors asked during
their review. “By not assessing the risk to these systems and taking steps to reduce that risk, federal facilities may
be vulnerable to cyber attacks,” GAO said.
Intel Security considers backdoors irrelevant for security; a flurry of other factors is creating
holes in the system that hackers can exploit
Tang 15 (http://www.techtradeasia.info/2015/04/enteprrises-mostly-unprepared-for.html, Enterprises
mostly unprepared for cyberattacks: Intel Security)//A.V.
A new report, Tackling Attack Detection and Incident Response* from the Enterprise Strategy Group (ESG), has found that security professionals are
inundated with security incidents, averaging 78 investigations per organisation in the last year. Over a quarter (28%) of those incidents involved targeted
attacks – one of the most dangerous and potentially damaging forms of cyberattacks. According
to the IT and security professionals
surveyed in the study, commissioned by Intel Security (formerly McAfee), better detection tools, better
analysis tools, and more training on how to deal with incident response issues are the top ways to improve the
efficiency and effectiveness of the information security staff. “When it comes to incident detection and response, time has an
ominous correlation to potential damage,” said Jon Oltsik, Senior Principal Analyst at ESG. “The longer it takes an organisation to
identify, investigate, and respond to a cyberattack, the more likely it is that their actions won’t be enough to
preclude a costly breach of sensitive data. With this in mind, Chief Information Security Officers (CISOs) should remember that collecting
and processing attack data is a means toward action -- improving threat detection and response effectiveness and efficiency.” Nearly 80% of the
people surveyed believe the lack of integration and communication between security tools creates bottlenecks
and interferes with their ability to detect and respond to security threats. Real-time, comprehensive visibility is especially
important for rapid response to targeted attacks, and 37% called for tighter integration between security intelligence and IT operations tools. In
addition, the top time-consuming tasks involved scoping and taking action to minimise the impact of an attack, activities that can be accelerated by
integration of tools. These
responses suggest that the very common patchwork architectures of dozens of individual
security products have created numerous silos of tools, consoles, processes and reports that prove very time
consuming to use. These architectures are creating ever greater volumes of attack data that drown out
relevant indicators of attack. Security professionals surveyed claim that real-time security visibility suffers from limited understanding of user
behaviour and network, application, and host behaviour. While the top four types of data collected are network-related, and
30% collect user activity data, it’s clear that data capture isn’t sufficient. Users need more help to contextualise
the data to understand what behaviour is worrisome. This gap may explain why nearly half (47%) of
organisations said determining the impact or scope of a security incident was particularly time consuming. Users
understand they need help to evolve from simply collecting volumes of security event and threat intelligence data to more effectively making sense of
the data and using it to detect and assess incidents. Fifty-eight
percent said they need better detection tools, such as static
and dynamic analysis tools with cloud-based intelligence to analyse files for intent. Fifty-three percent say they
need better analysis tools for turning security data into actionable intelligence. One-third called for better
tools to baseline normal system behaviour so teams can detect variances faster. People who took the survey admitted to a
lack of knowledge of the threat landscape and security investigation skills, suggesting that even better visibility through technical integration or
analytical capabilities will be inadequate if incident response teams cannot make sense of the information they see. For instance, only 45% of
respondents consider themselves very knowledgeable about malware obfuscation techniques, and 40%
called for more training to
improve cybersecurity knowledge and skills. The volume of investigations and limited resources and skills contributed to a strong desire
among respondents for help with incident detection and response. Forty-two percent reported that taking action to minimise the impact of an attack was
one of their most time-consuming tasks. Twenty-seven percent would like better automated analytics from security intelligence tools to speed real-time
comprehension, while 15% want automation of processes to free up staff for more important duties. “Just as the medical profession must deliver heart-attack patients to the hospital within a ‘golden hour’ to maximise likelihood of survival, the security industry must work towards reducing the time it
takes organisations to detect and deflect attacks, before damage is inflicted,” said Chris Young, General Manager at Intel Security. “This requires that we
ask and answer tough questions on what is failing us, and evolve our thinking around how we do security.” The
ESG believes that there is a
hidden story within the Intel Security research that hints at best practices and lessons learned. This data
strongly suggests that CISOs:
· Create a tightly-integrated enterprise security technology architecture. CISOs must replace individual security point tools with an integrated security architecture. This strategy works to improve the sharing of attack information and cross-enterprise visibility into user, endpoint, and network behaviour, not to mention more effective, coordinated responses.
· Anchor their cybersecurity strategy with strong analytics, moving from volume to value. Cybersecurity strategies must be based upon strong security analytics. This means collecting, processing, and analysing massive amounts of internal and external data. Internal data includes logs, flows, packets, endpoint forensics, static/dynamic malware analysis, and organisational intelligence (i.e., user behaviour, business behaviour, etc.), while external data would cover threat intelligence and vulnerability notifications, among others.
· Automate incident detection and response whenever possible. Because organisations will always struggle to keep up with the most recent attack techniques, CISOs must commit to more automation such as advanced malware analytics, intelligent algorithms, machine learning, and the consumption of threat intelligence to compare internal behaviour with indicators of compromise (IoCs) and tactics, techniques, and procedures (TTPs) used by cyber-adversaries.
· Commit to continuous cybersecurity education. CISOs should require ongoing cyber-education for their security teams, including an annual series of courses that provide individual professionals more depth of understanding of threats and best practices for efficient and effective incident response.
Cyberattacks will hit all critical sectors because of a sheer lack of security; backdoors are
irrelevant
AFP 14 (http://news.yahoo.com/us-unprepared-cyber-attack-9-11-report-authors-190538960.html, US
unprepared for cyber-attack: 9/11 report authors)//A.V.
Washington (AFP) - The United
States has failed to sufficiently adapt to new cyber-security threats, exposing itself to
potential terror strikes as devastating as September 11, authors of the report on the 2001 attacks warned Wednesday.
In July 2004, the independent 9/11 commission issued a comprehensive, nearly 600-page report with
numerous recommendations for upgrading the US security apparatus to avoid a new catastrophe. A decade later
the commission's former members have released a blunt follow-up, pointing out gaps in US security that increase the risk of
cyber-attacks on infrastructure, including energy, transport and finance systems, and the theft of intellectual
property from the private sector. After exhaustive meetings with national security officials, "every single one
of them said we're not doing what we should be doing to protect ourselves against cyber-security" threats, former
9/11 commission co-chair Tom Kean told a House homeland security panel. "And because this stealing of information is so invisible to the American
public, they don't realize what a disaster it is." The new report, released Monday, warned that the fight against terrorism was entering a "new and
dangerous phase" marked by a sense of "counterterrorism
fatigue" that masked the urgency needed to address emerging
threats. "We are at September 10th levels in terms of cyber preparedness," it quoted a former senior national security
leader as saying. The former commissioners pointed to the difficulties in beefing up a security posture that brings government and the private sector
into cooperation. "The
government is doing much better protecting itself and its systems than it is helping the
private sector protect itself. We think our vulnerability in the latter area is greater," former commission member Jamie
Gorelick said. "We are uncomfortable with having our national security apparatus operating in the private sector," she added. "But if you think about
what the real threats are, an enemy who would shut down our power grid for example, those are real threats to which I don't believe we have great
answers at the moment." For months
Congress and the White House have debated cyber-security legislation that
would improve information sharing about cyber threats between federal authorities and companies in
strategic sectors. But such coordination remains contentious, amid concerns about the confidentiality of
personal data.
(not good card) Backdoors aren’t the problem; weak security systems in general allow
hackers to get in regardless
Lever 14 (http://www.businessinsider.com/afp-cyberattacks-to-worsen-in-2015-mcafee-researchers-201412, Cyberattacks Are Just Going To Get Worse From Here)//A.V.
Health care hacks McAfee said it is already seeing hackers targeting devices such as webcams with weak security
and industrial control systems. But it sees health care as an especially worrisome sector. "With the increasing
proliferation of healthcare IoT devices and their use in hospitals, the threat of the loss of information contained on those devices
becomes increasingly likely," the report said. It noted that health care data "is even more valuable than credit card data" on hacker black
markets. McAfee says other threats will also grow, including "ransomware," which locks down data and forces the victim to pay a ransom to retrieve it,
and attacks on mobile phone operating systems. In
the retail sector, digital payments may cut the risk of credit-card
skimmers, but hackers may be able to exploit wireless systems such as Bluetooth and near field
communications (NFC) used for mobile payments. "With consumers now sending payment information over a protocol with known
vulnerabilities, it is highly likely that attacks on this infrastructure will emerge in 2015," the report said. The report comes in the wake of news about
large-scale cyberattacks that have been linked to Russia or China, and a major infiltration of Sony Pictures which stole massive amounts of data. In retail,
Home Depot and others have reported data breaches affecting millions of customers. "The year 2014 will be remembered as the year of shaken trust,"
said Vincent Weafer, senior vice president at Intel-owned McAfee. "Restoring trust in 2015 will require stronger industry collaboration, new standards
for a new threat landscape, and new security postures that shrink time-to-detection."
Cyberattackers will use more sophisticated tactics to attack servers regardless of backdoors
Lever 14 (http://www.businessinsider.com/afp-cyberattacks-to-worsen-in-2015-mcafee-researchers-201412, Cyberattacks Are Just Going To Get Worse From Here)//A.V.
Washington (AFP) - A series of spectacular cyberattacks drew headlines this year, and the situation will only
worsen in 2015 as hackers use more advanced techniques to infiltrate networks, security researchers said Tuesday.
McAfee Labs' 2015 Threats Predictions report sees increased cyber-warfare and espionage, along with new
strategies from hackers to hide their tracks and steal sensitive data. "Cyber espionage attacks will continue to
increase in frequency," the report said. "Long-term players will become stealthier information gatherers, while
newcomers will look for ways to steal money and disrupt their adversaries." McAfee said small nations and terror groups
will become even more active and will "attack by launching crippling distributed denial of service attacks or using malware that wipes the master boot
record to destroy their enemies' networks." At
the same time, cybercriminals will use better methods to remain hidden on
a victim's network, to carry out long-term theft of data without being detected, the researchers said. "In this
way, criminals are beginning to look and act more like sophisticated nation-state cyberespionage actors, who
watch and wait to gather intelligence," the report said. The report also said hackers are looking to target more connected devices,
including computers in the farming, manufacturing, and health care sectors. "The number and variety of devices in the Internet of Things (IoT) family is
growing exponentially. In the consumer space, they are now seen in appliances, automobiles, home automation, and even light bulbs," McAfee said.
SQ Solves
The Comprehensive National Cybersecurity Initiative is already reinforcing cybersecurity;
there is no need for the plan.
White House 2010
"The Comprehensive National Cybersecurity Initiative.” (https://www.whitehouse.gov/issues/foreignpolicy/cybersecurity/national-initiative)
The CNCI consists of a number of mutually reinforcing initiatives with the following major goals designed to help
secure the United States in cyberspace: To establish a front line of defense against today’s immediate threats by creating or enhancing shared
situational awareness of network vulnerabilities, threats, and events within the Federal Government—and ultimately with state, local, and tribal governments and private
sector partners—and the ability to act quickly to reduce our current vulnerabilities and prevent intrusions. To
defend against the full spectrum of
threats by enhancing U.S. counterintelligence capabilities and increasing the security of the supply chain for
key information technologies. To strengthen the future cybersecurity environment by expanding cyber education; coordinating
and redirecting research and development efforts across the Federal Government; and working to define and
develop strategies to deter hostile or malicious activity in cyberspace. In building the plans for the CNCI, it was quickly realized that
these goals could not be achieved without also strengthening certain key strategic foundational capabilities within the Government. Therefore, the CNCI includes
funding within the federal law enforcement, intelligence, and defense communities to enhance such key
functions as criminal investigation; intelligence collection, processing, and analysis; and information assurance
critical to enabling national cybersecurity efforts. The CNCI was developed with great care and attention to
privacy and civil liberties concerns in close consultation with privacy experts across the government. Protecting
civil liberties and privacy rights remain fundamental objectives in the implementation of the CNCI. In accord
with President Obama’s declared intent to make transparency a touchstone of his presidency, the Cyberspace
Policy Review identified enhanced information sharing as a key component of effective cybersecurity. To improve
public understanding of Federal efforts, the Cybersecurity Coordinator has directed the release of the following summary description of the CNCI.
No Backdoor Mandate
FBI Backdoor proposal is disliked by both parties; no chance of passing
Tyler Lee, 10-22-2014, "Congress Shuts Down FBI Director's Decryption Law Proposal," Ubergizmo,
http://www.ubergizmo.com/2014/10/congress-shuts-down-fbi-directors-decryption-law-proposal/
We’re sure many weren’t too pleased that the FBI Director had hinted that the agency could be thinking about taking action against the likes of Apple
and Google, both of whom have recently introduced encryption features that would basically make it impossible for
them to unlock a user’s smartphone from their end. This is great for users as it means that law enforcement agencies will
not be able to spy on their phones, at least not as easily as before, and now what seems like even more great news is that Congress
does not appear to have FBI Director James Comey’s back. While official action has yet to be filed, members of Congress
have tweeted that they doubt such a law would have a chance of being passed. According to
Republican Darrell Issa, “The FBI and Justice Department must be more accountable – tough sell for them to
now ask the American people for more surveillance power.” Democrat Zoe Lofgren added that Comey’s
proposal would have a “zero chance” of being passed, with Senator Ron Wyden stating that he did not believe
more than a handful of lawmakers would actually get behind such legislation. Now in the face of opposition from both
members of the government and the public, will Comey be backing down from his quest? We guess we’ll have to wait and see, but in the meantime it
looks like users can rest assured that their privacy is still protected.
No Cyber War
Cyberwar will stay on computers and won’t lead to real-world conflict.
Gewirtz 2015
David Gewirtz is an author, U.S. policy advisor and computer scientist. He is one of America's foremost cybersecurity experts. He is also director of the U.S. Strategic Perspective Institute as well as the founder of ZATZ
Publishing. David is a member of FBI InfraGard, the Cyberwarfare Advisor for the International Association for
Counterterrorism & Security Professionals, a columnist for The Journal of Counterterrorism and Homeland
Security, and has been a regular CNN contributor. "Why the next World War will be a cyberwar first, and a
shooting war second," ZDNet, http://www.zdnet.com/article/the-next-world-war-will-be-a-cyberwar-first-anda-shooting-war-a-distant-second/
Shooting wars are very expensive and very risky. Tremendous amounts of material must be produced and
transported, soldiers and sailors must be put into harm's way, and incredible logistics and supply chain
operations must be set up and managed on a nationwide (or multi-national level). Cyberwar is cheap. The
weapons are often co-opted computers run by the victims being targeted. Startup costs are minimal. Individual
personnel risk is minimal. It's even possible to conduct a cyberwar without the victims knowing (or at least
being able to prove) who their attackers are.
Empirics prove cyberwar won’t happen; the probability of cyberwar in the squo is low
Rid 2012 (Thomas Rid is a professor in the Department of War Studies at King’s College London, 2012,
October 05, Cyber War Will Not Take Place, Journal of Strategic Studies, p. 6)
But is it? Are the Cassandras of cyber warfare on the right side of history? Is cyber war really coming? This
article argues that cyber war will not take place. That statement does not come with a Giraudouxian twist and
irony. It is meant literally – as a statement about the past, the present, and the likely future: Cyber war has
never happened in the past. Cyber war does not take place in the present. And it is highly unlikely that cyber
war will occur in the future. Instead, all past and present political cyber attacks are merely sophisticated
versions of three activities that are as old as warfare itself: subversion, espionage, and sabotage. That is
unlikely to change in the years ahead.
Cyberwar doesn’t meet the criteria of ‘war’
Rid 2012 (Thomas Rid is a professor in the Department of War Studies at King’s College London, 2012,
October 05, Cyber War Will Not Take Place, Journal of Strategic Studies, p. 7-8)
Clausewitz still offers the most concise concept of war. It has three main elements. Any aggressive or defensive
action that aspires to be a stand-alone act of war, or may be interpreted as such, has to meet all three criteria.
Past cyber-attacks do not. The first element is war’s violent character. ‘War is an act of force to compel the
enemy to do our will’, wrote Carl von Clausewitz on the first page of On War.7 All war, pretty simply, is violent.
If an act is not potentially violent, it is not an act of war. Then the term is diluted and degenerates to a mere
metaphor, as in the ‘war’ on obesity or the ‘war’ on cancer. A real act of war is always potentially or actually
lethal, at least for some participants on at least one side. Unless physical violence is stressed, war is a
hodgepodge notion, to paraphrase Jack Gibbs.8 In Clausewitz’s thinking, violence is the pivotal point of all war.
Both enemies – he usually considered two sides – would attempt to escalate violence to the extreme, unless
tamed by friction, imponderables, and politics. 9 The second element highlighted by Clausewitz is war’s
instrumental character. An act of war is always instrumental. To be instrumental, there has to be a means and
an end. Physical violence or the threat of force is the means. The end is to force the enemy to accept the
offender’s will. Such a definition is ‘theoretically necessary’, Clausewitz argued.10 To achieve the end of war, one
opponent has to be rendered defenseless. Or, to be more precise: the opponent has to be brought into a
position, against his will, where any change of that position brought about by the continued use of arms would
bring only more disadvantages for him, at least in that opponent’s view. Complete defenselessness is only the
most extreme of those positions. Both opponents use violence in this instrumental way, shaping each other’s
behavior, giving each other the law of action, in the words of the Prussian philosopher of war.11 The
instrumental use of means takes place on tactical, operational, strategic, and political levels. The higher the
order of the desired goal, the more difficult it is to achieve. As Clausewitz put it, in the slightly stilted language
of his time: ‘The purpose is a political intention, the means is war; never can the means be understood without
the purpose.’12 This leads to another central feature of war. The third element that Clausewitz identified is
war’s political nature. An act of war is always political. The objective of battle, to ‘throw’ the enemy and to
make him defenseless, may temporarily blind commanders and even strategists to the larger purpose of war.
War is never an isolated act. War is never only one decision. In the real world, war’s larger purpose is always a political purpose. It transcends the use of force.
This insight was captured by Clausewitz’s most famous phrase, ‘War is a mere continuation of politics by other
means.’13 To be political, a political entity or a representative of a political entity, whatever its constitutional
form, has to have an intention, a will. That intention has to be articulated. And one side’s will has to be
transmitted to the adversary at some point during the confrontation (it does not have to be publicly
communicated). Any violent act and its larger political intention also has to be attributed to one side at some
point during the confrontation. History does not know acts of war without eventual attribution.
No Cyberterror
No cyberterrorism impact—threats exaggerated
Quigley, Burns, and Stallard 15
(Kevin, Calvin, Kristen, 3/26/15, Government Information Quarterly, “‘Cyber Gurus’: A rhetorical analysis of the
language of cybersecurity specialists and the implications for security policy and critical infrastructure
protection,” http://cryptome.org/2015/05/cyber-gurus.pdf, 7/16/16, SM)
While these are four prevalent types of cybersecurity issues, there is evidence to suggest that the threat is exaggerated and
oversimplified for some. Many note the lack of empirical evidence to support the widespread fear of cyber-terrorism
and cyber-warfare, for instance (Cavelty, 2007; Hansen & Nissenbaum, 2009; Lewis, 2003; Rid, 2013; Stohl, 2007).
According to Stohl (2007), there is little vulnerability in critical infrastructure that could lead to violence or fatalities. Secondly, there are few
actors who would be interested in or capable of exploiting such vulnerabilities. Thirdly, and in relation to cyber-terrorism
in particular, the expenses necessary to carry out cyber-attacks are greater than traditional forms of terrorism,
limiting the utility of cyber-attacks compared to other available measures (Stohl, 2007). Instead, technology is most often used by terrorists to provide
information, solicit financial support, network with like-minded terrorists, recruit, and gather information; in other words, “terrorist groups are simply
exploiting modern tools to accomplish the same goals they sought in the past” (Stohl, 2007, p. 230).
No cyberterrorism threat—attacks are easy to prevent, and terrorists use the Internet only for
promotional purposes
Cluley 14
(Graham, 10/20/14, The State of Security, “GCHQ Spokesperson Says Cyber Terrorism Is ‘Not a Concern’,”
former employee of Sophos, McAfee, Dr. Solomon’s, inducted into the InfoSecurity Europe Hall of Fame,
http://www.tripwire.com/state-of-security/security-data-protection/gchq-spokesperson-says-cyber-terrorismis-not-a-concern/, 7/17/15, SM)
Yes, a terrorist could launch a denial-of-service attack, or write a piece of malware, or hack into a sensitive system, just as easily as the next (non-terrorist), but there
is no reason to believe that an attack launched by a terrorist living in his secret HQ in the mountain caves of
Afghanistan would be any harder to stop than the hundreds of thousands of other attacks launched each day.
That’s not to say that launching an Internet attack wouldn’t have attractive aspects for those behind a terror campaign. Put bluntly, it’s a heck of a lot easier
(and less physically dangerous) to write a Trojan horse to infect a computer on the other side of the world, than to drive a lorry loaded up with Semtex
outside a government building’s front door. Furthermore, terrorists are often interested in making headlines, to focus the world’s attention on what
they believe to be their plight. If innocent people die during a terrorist action that certainly does help you make the newspapers, but it’s very bad for
public relations, and is going to make it a lot harder to convince others to sympathise with your campaign. The good news about pretty much all
Internet attacks, of course, is that they don’t involve the loss of life. Any damage done is unlikely to leave individuals maimed or bleeding, but can still
bloody the nose of a government that should have been better protected or potentially disrupt economies. But still, such terrorist-initiated
Internet attacks should be no harder to protect against than the financially-motivated and hacktivist attacks
that organisations defend themselves against every day. So, when a journalist asks me if I think cyber terrorism is a big concern, I
tend to shrug and say “Not that much” and ask them to consider why Al Qaeda, for instance, never bothered to launch a
serious Internet attack in the 13 years since September 11. After all, if it is something for us all to fear – why
wouldn’t they have done it already? So, I was pleased to have my views supported last week – from a perhaps surprising source. GCHQ,
the UK intelligence agency which has become no stranger to controversy following the revelations of NSA whistleblower Edward Snowden, appears to
agree that cyber terrorism is not a concern. Or at least that’s what they’re saying behind closed doors, according to SC Magazine.
The report quoted an unnamed GCHQ spokesperson at a CSARN (City Security And Resilience Networks) forum held last week
in London, debunking the threat posed by cyber terrorists: “Quite frankly we don’t see cyber terrorism. It hasn’t occurred…but we have to guard
against it. For those of you thinking about strategic threats, terrorism is not [a concern] at this point in time,” although he added that the agency was
‘very concerned’ on a possible attack at the time of the 2012 London Olympics. He said that while it is clear that terrorism groups –
such as ISIS and Al-Qaeda – are technically adept, there’s been no sign of them venturing into cyber beyond
promotional purposes. “For some reason, there doesn’t seem intent to use destructive cyber capability. It’s clearly a theoretical
threat. We’ve not seen – and we were very worried around London Olympics – but we’ve never seen it. We’ll continue to keep an eye on it.” In a
time when the potential threat posed by terrorism is often used as an excuse for covert surveillance by intelligence agencies, such as GCHQ, and the UK
government raising the “threat level” to “Severe” at the end of August due to conflict in Iraq and Syria, one has to wonder if the spokesperson quoted
was speaking entirely “on-message.”
Vulnerability Inevitable
US Infrastructure vulnerable to Cyber-Terrorism
Weekly Analysis, 7-15-2015, "Official: Greatest cyber risks to national security involve handful of sectors,"
Inside Cybersecurity, http://insidecybersecurity.com/Cyber-General/Cyber-Public-Content/official-greatest-cyber-risks-to-national-security-involve-handful-of-sectors/menu-id-1089.html
The greatest cyber risks to U.S. national security involve about a third of the country's 16 critical infrastructure
sectors, according to an FBI official. The bureau's cybersecurity outreach program for critical infrastructure is focused on six sectors –
banking and finance, energy, transportation, information technology, communications and public health – the
program's leader, Stacy Stevens, said during a June 9 public meeting of cybersecurity professionals organized by the Department of Homeland Security
in Cambridge, MA. The FBI official's comments, as well as documents obtained by Inside Cybersecurity under the Freedom of Information Act, shed new
light on how U.S. authorities view cyber risks in industry, a subject shrouded in secrecy that some argue is excessive. An Obama administration adviser,
Richard Danzig, last year urged greater disclosure of cyber risks facing various sectors in the interest of enabling better policymaking. Stevens told Inside
Cybersecurity that the FBI and DHS have a shared understanding of which sectors are associated with the greatest cyber-related national security risks.
This hierarchy enables the FBI cybersecurity outreach unit to prioritize its resources. The unit has focused on banking and finance, energy,
transportation, information technology and communications since it was established in 2013 and added public health to the list more recently, she said.
President Obama has repeatedly urged improvements in cybersecurity for critical infrastructure, including in an executive order issued in 2013. Obama's
speech at the White House cybersecurity summit in February mentioned most of the sectors cited by Stevens. " Much
of our critical
infrastructure -- our financial systems, our power grid, health systems -- run on networks connected to the Internet, which is
hugely empowering but also dangerous, and creates new points of vulnerability that we didn't have before,"
Obama said. "Foreign governments and criminals are probing these systems every single day. We only have to
think of real-life examples -- an air traffic control system going down and disrupting flights, or blackouts that
plunge cities into darkness -- to imagine what a set of systematic cyber attacks might do." But DHS has been tight-lipped about which infrastructure sectors and assets face the most significant cyber risks. In response to Obama's 2013 executive order, the agency
produced an unclassified "for official use only" report in July 2013 to identify critical infrastructure where a cybersecurity incident could cause
"catastrophic" regional or national damage to public health or safety, economic security or national security. Inside Cybersecurity obtained a redacted
version of the report through the Freedom of Information Act. It omits the names of the specific sectors and infrastructure deemed most vulnerable, but
reveals that a DHS working group identified "61 entities in five critical infrastructure sectors where a cybersecurity
incident could reasonably result in catastrophic regional or national effects on public health or safety,
economic security, or national security." The DHS study also identified "13 sectors, subsectors, or modes, where a cybersecurity incident
on a single entity would not be expected to result in catastrophic regional or national effects." "A cybersecurity incident is possible in all
sectors," DHS wrote in its 2013 report, "but not all cybersecurity incidents would generate the catastrophic consequences required for consideration
under [Obama's February 2013 executive order]." "As technology and business practices change, greater cyber dependence will likely increase the
impact of potential consequences of cybersecurity incidents," the report states, noting the agency would annually re-evaluate the list of infrastructure at
greatest risk from a cybersecurity incident. Non-catastrophic risks can still be significant. The electrical grid, finance sector, water supply, and
telecommunications systems are the "big four targets" of cyber attacks intended to have a distinct and immediate impact, Richard Bejtlich, chief security
strategist for FireEye, recently testified before Congress.
Cyber insurgency inevitable
Rosenzweig 13 (Paul Rosenzweig, 2013 “Cyber warfare: how conflicts in cyberspace are challenging
America and changing the world,” pg 49)
The same cannot, unfortunately, be said of cyber intrusions by nonstate actors. Unconstrained by the limits of sovereignty,
devoid of any territory to protect, and practically immune from retaliation, these groups pose a significant
danger to stability. We might think of them as cyber terrorists, but perhaps a better conception is that of a cyber insurgent. A good way to
look at this is through the prism of the challenge to social and governmental authority by WikiLeaks and its founder,
Julian Assange, and its support by the hacktivist group Anonymous. Their story is one of both enhanced information transparency and, more significantly
for our purposes, the ability to wage combat in cyberspace.
Cyberterror will adapt- anonymity and growing capabilities
Rosenzweig 13 (Paul Rosenzweig, 2013 “Cyber warfare: how conflicts in cyberspace are challenging
America and changing the world,” pg 65)
This description of the correlation of forces in cyberspace is, in many ways, congruent with similar analyses of the physical world. Terrorists enabled by
asymmetric power (IEDs and box cutters) have likewise challenged traditional state authorities. And, just
as Americans must learn to deal
with these kinetic insurgent challenges, so too must they respond to cyber insurgency. • Current capabilities of
nonstate actors are weak but improving. The current capabilities of organized nonstate actors in cyberspace are relatively modest.
While DDoS attacks can be a significant annoyance, they are not an existential threat. This state of affairs is
unlikely to hold for long. As the Stuxnet computer virus demonstrates, significant real-world effects can
already be achieved by sophisticated cyber actors. It is only a matter of time until less sophisticated nonstate actors achieve the same
capability. • Attribution is always a challenge. Determining the origin of an attack can be problematic. Sending a
message from a digital device to a provider is akin to mailing a letter. The service provider acts as an electronic carrier that sends the message through
routers and servers which deliver the message to the targeted computer. The
attacking computers may have been hijacked and be
under the control of a server in another country. An attacker may disguise its locations by circuitous routing or by masking the
message's source identification, similar to fudging a letter's return address and postmark. A cyber insurgent may strike several
countries, multiple Internet service providers, and various telecommunications linkages, all subject to varying
legal requirements and reporting standards, which makes tracing the source extremely difficult. Overcoming these
difficulties by technical means alone is a vexing problem and an unnecessary one. As the scope of conflicts in cyberspace develops, governments around
the world will use all techniques in their arsenal to exploit the weaknesses of the nonstate actors who are part of the threat.
Chinese Cyberattacks overhyped
NSA is reverse-engineering Chinese programs to prevent cyber attacks
Lee 13 (China's Cyber-War: Don't Believe the Hype, http://www.counterpunch.org/2013/03/18/chinas-cyber-war-dont-believe-the-hype/)//A.V.
The United States apparently feels that it can "win the Internet" by harnessing the power of the invincible American technological know-how to the anti-Chinese cyber-crusade. In another of the seemingly endless series of self-congratulatory backgrounders given by US government insiders, the godlike
powers of the National Security Agency were invoked to Foreign Policy magazine in an article titled Inside the Black Box: How the NSA is helping US
companies fight back against Chinese hackers: In
the coming weeks, the NSA, working with a Department of Homeland
Security joint task force and the FBI, will release to select American telecommunication companies a wealth of
information about China’s cyber-espionage program, according to a US intelligence official and two
government consultants who work on cyber projects. Included: sophisticated tools that China uses,
countermeasures developed by the NSA, and unique signature-detection software that previously had been
used only to protect government networks. Very little that China does escapes the notice of the NSA, and
virtually every technique it uses has been tracked and reverse-engineered. For years, and in secret, the NSA
has also used the cover of some American companies – with their permission – to poke and prod at the
hackers, leading them to respond in ways that reveal patterns and allow the United States to figure out, or
“attribute,” the precise origin of attacks. The NSA has even designed creative ways to allow subsequent attacks
but prevent them from doing any damage. Watching these provoked exploits in real time lets the agency learn
how China works. And amid the bluster, a generous serving of bullshit: Now, though, the cumulative effect of Chinese economic warfare –
American companies’ proprietary secrets are essentially an open book to them – has changed the secrecy calculus. An American official who has been
read into the classified program – conducted by cyber-warfare technicians from the Air Force’s 315th Network Warfare Squadron and the CIA’s secret
Technology Management Office – said that China has become the “Curtis LeMay” of the post-Cold War era: “It is not abiding by the rules of statecraft
anymore, and that must change.”
Chinese cyber threat is over-exaggerated
Jon R. Lindsay 15 (is an assistant research scientist at the University of California, San Diego. In the summer
of 2015, he will become Assistant Professor of Digital Media and Global Affairs at the University of Toronto
Munk School of Global Affairs. http://belfercenter.ksg.harvard.edu/publication/25321/exaggerating_the_chinese_cyber_threat.html,
"Exaggerating the Chinese Cyber Threat")//A.V.
BOTTOM LINES Inflated Threats and Growing Mistrust. The United States and China have more to gain than lose through their intensive use of the
internet, even as friction in cyberspace remains both frustrating and inevitable. Threat misperception heightens the risks of miscalculation in a crisis and
of Chinese backlash against competitive U.S. firms. The U.S. Advantage. For every type of Chinese cyber threat—political, espionage, and military—there
are also serious Chinese vulnerabilities and countervailing U.S. strengths. Protection of Internet Governance. To ensure the continued high performance
of information technology firms and the mutual benefits of globalization, the United States should preserve liberal norms of open interconnection and
the multistakeholder system—the loose network of academic, corporate, and governmental actors managing global technical protocols. INFLATED
THREATS AND GROWING MISTRUST Policymakers
in the United States often portray China as posing a serious
cybersecurity threat. In 2013 U.S. National Security Adviser Tom Donilon stated that Chinese cyber intrusions
not only endanger national security but also threaten U.S. firms with the loss of competitive advantage. One U.S.
member of Congress has asserted that China has "laced the U.S. infrastructure with logic bombs." Chinese critics, meanwhile, denounce Western
allegations of Chinese espionage and decry National Security Agency (NSA) activities revealed by Edward Snowden. The People's Daily newspaper has
described the United States as "a thief crying 'stop thief.'" Chinese commentators increasingly call for the exclusion of U.S. internet firms from the
Chinese market, citing concerns about collusion with the NSA, and argue that the institutions of internet governance give the United States an unfair
advantage. The
rhetorical spiral of mistrust in the Sino-American relationship threatens to undermine the mutual
benefits of the information revolution. Fears about the paralysis of the United States' digital infrastructure or
the hemorrhage of its competitive advantage are exaggerated. Chinese cyber operators face underappreciated organizational
challenges, including information overload and bureaucratic compartmentalization, which hinder the weaponization of cyberspace or absorption of
stolen intellectual property. More important, both
the United States and China have strong incentives to moderate the
intensity of their cyber exploitation to preserve profitable interconnections and avoid costly punishment. The
policy backlash against U.S. firms and liberal internet governance by China and others is ultimately more worrisome for U.S. competitiveness than
espionage; ironically,
it is also counterproductive for Chinese growth. The United States is unlikely to experience
either a so-called digital Pearl Harbor through cyber warfare or death by a thousand cuts through industrial
espionage. There is, however, some danger of crisis miscalculation when states field cyberweapons. The secrecy of cyberweapons'
capabilities and the uncertainties about their effects and collateral damage are
as likely to confuse friendly militaries as they are to
muddy signals to an adversary. Unsuccessful preemptive cyberattacks could reveal hostile intent and thereby encourage retaliation with
more traditional (and reliable) weapons. Conversely, preemptive escalation spurred by fears of cyberattack could encourage the target to use its
cyberweapons before it loses the opportunity to do so. Bilateral dialogue is essential for reducing the risks of misperception between the United States
and China in the event of a crisis. THE U.S. ADVANTAGE The secrecy regarding the cyber capabilities and activities of the United States and China creates
difficulty in estimating the relative balance of cyber power across the Pacific. Nevertheless,
the United States appears to be gaining an
increasing advantage. For every type of purported Chinese cyber threat, there are also serious Chinese
vulnerabilities and growing Western strengths. Much of the international cyber insecurity that China generates reflects internal
security concerns. China exploits foreign media and digital infrastructure to target political dissidents and minority populations. The use of national
censorship architecture (the Great Firewall of China) to redirect inbound internet traffic to attack sites such as GreatFire.org and GitHub in March 2015 is
just the latest example of this worrisome trend. Yet
prioritizing political information control over technical cyber defense
also damages China's own cybersecurity. Lax law enforcement and poor cyber defenses leave the country
vulnerable to both cybercriminals and foreign spies. The fragmented and notoriously competitive nature of the Communist Party
state further complicates coordination across military, police, and regulatory entities. There is strong evidence that China continues to
engage in aggressive cyber espionage campaigns against Western interests. Yet it struggles to convert even
legitimately obtained foreign data into competitive advantage, let alone make sense of petabytes of stolen
data. Absorption is especially challenging at the most sophisticated end of the value chain (e.g., advanced fighter
aircraft), which is dominated by the United States. At the same time, the United States conducts its own cyber espionage against
China, as the Edward Snowden leaks dramatized, which can indirectly aid U.S. firms (e.g., in government trade
negotiations). China's uneven industrial development, fragmented cyber defenses, erratic cyber tradecraft, and the market
dominance of U.S. technology firms provide considerable advantages to the United States. Despite high levels of
Chinese political harassment and espionage, there is little evidence of skill or subtlety in China's military cyber operations. Although Chinese strategists
describe cyberspace as a highly asymmetric and decisive domain of warfare, China's military cyber capacity does not live up to its doctrinal aspirations.
A disruptive attack on physical infrastructure requires careful testing, painstaking planning, and sophisticated
intelligence. Even experienced U.S. cyber operators struggle with these challenges. By contrast, the Chinese
military is rigidly hierarchical and has no wartime experience with complex information systems. Further,
China's pursuit of military "informatization" (i.e., emulation of the U.S. network-centric style of operations)
increases its dependence on vulnerable networks and exposure to foreign cyberattack. To be sure, China engages in
aggressive cyber campaigns, especially against nongovernmental organizations and firms less equipped to defend themselves than government entities.
These activities, however, do not constitute major military threats against the United States, and they do
nothing to defend China from the considerable intelligence and military advantages of the United States.
PROTECTION OF INTERNET GOVERNANCE Outmatched by the West in direct cyber confrontation yet eager to maintain the global connectivity
supporting economic growth, China (together with Russia and other members of the Shanghai Cooperation Organization) advocates for internet
governance reform. These changes, predicated on so-called internet sovereignty, would replace the current multistakeholder system and its liberal
norms of internet openness with a formal international regulator, such as the United Nations' International Telecommunication Union, and strong
norms of noninterference with sovereign networks. Chinese complaints of U.S. internet hegemony are not completely unfounded: the internet
reinforces U.S. dominance, but it does so through a light regulatory touch that relies on the self-interest of stakeholders—academic scientists,
commercial engineers, government representatives, and civil society organizations. The internet expands in a self-organized fashion because adopters
have incentives to pursue increasing returns to interconnection. The profit-driven expansion of networks and markets through more reliable and
voluminous transactions and more innovative products (e.g., cloud services, mobile computing, and embedded computing) tends to reinforce the
economic competitiveness of the United States and its leading information technology firms. Many Western observers fear that cyber reform based on
the principle of internet sovereignty might legitimize authoritarian control and undermine the cosmopolitan promise of the multistakeholder system.
China, however, benefits too much from the current system to pose a credible alternative. Tussles around internet governance are more likely to result
in minor change at the margins of the existing system, not a major reorganization that shifts technical protocols and operational regulation to the United
Nations. Yet this is not a foregone conclusion, as China moves to exclude U.S. firms such as IBM, Oracle, EMC, and Microsoft from its domestic markets
and attempts to persuade other states to support governance reforms at odds with U.S. values and interests. CONCLUSION Information technology has
generated tremendous wealth and innovation for millions, underwriting the United States' preponderance as well as China's meteoric rise. The
costs
of cyber espionage and harassment pale beside the mutual benefits of an interdependent, globalized
economy. The inevitable frictions of cyberspace are not a harbinger of catastrophe to come, but rather a sign
that the states inflicting them lack incentives to cause any real harm. Exaggerated fears of cyberwarfare or an
erosion of the United States' competitive advantage must not be allowed to undermine the institutions and
architectures that make the digital commons so productive.
Case Warming
A2 Warming 1NC
Time frame is 100 years - Latest IPCC models
Ridley 14, (Matt Ridley is the author of The Rational Optimist, a columnist for the Times (London) and a
member of the House of Lords. He spoke at Ideacity in Toronto on June 18. "IPCC commissioned models to see
if global warming would reach dangerous levels this century. Consensus is ‘no’” , [ http://tinyurl.com/mgyn8ln
] , //hss-RJ)
The debate over climate change is horribly polarized. From the way it is conducted, you would think that only
two positions are possible: that the whole thing is a hoax or that catastrophe is inevitable. In fact there is room
for lots of intermediate positions, including the view I hold, which is that man-made climate change is real but
not likely to do much harm, let alone prove to be the greatest crisis facing humankind this century. After more
than 25 years reporting and commenting on this topic for various media organizations, and having started out
alarmed, that’s where I have ended up. But it is not just I that hold this view. I share it with a very large
international organization, sponsored by the United Nations and supported by virtually all the world’s
governments: the Intergovernmental Panel on Climate Change (IPCC) itself. The IPCC commissioned four
different models of what might happen to the world economy, society and technology in the 21st century and what each would mean for the
climate, given a certain assumption about the atmosphere’s “sensitivity” to carbon dioxide. Three of the models show a moderate, slow
and mild warming, the hottest of which leaves the planet just 2 degrees Centigrade warmer than today in
2081-2100. The coolest comes out just 0.8 degrees warmer. Now two degrees is the threshold at which warming starts to
turn dangerous, according to the scientific consensus. That is to say, in three of the four scenarios considered
by the IPCC, by the time my children’s children are elderly, the earth will still not have experienced any harmful
warming, let alone catastrophe. But what about the fourth scenario? This is known as RCP8.5, and it produces
3.5 degrees of warming in 2081-2100. Curious to know what assumptions lay behind this model, I decided to
look up the original papers describing the creation of this scenario. Frankly, I was gobsmacked. It is a world that
is very, very implausible. For a start, this is a world of “continuously increasing global population” so that there
are 12 billion on the planet. This is more than a billion more than the United Nations expects, and flies in the
face of the fact that the world population growth rate has been falling for 50 years and is on course to reach
zero – i.e., stable population – in around 2070. More people mean more emissions. Second, the world is
assumed in the RCP8.5 scenario to be burning an astonishing 10 times as much coal as today, producing 50% of its
primary energy from coal, compared with about 30% today. Indeed, because oil is assumed to have become scarce, a lot of liquid fuel would then be
derived from coal. Nuclear and renewable technologies contribute little, because of a “slow pace of innovation” and hence “fossil fuel technologies
continue to dominate the primary energy portfolio over the entire time horizon of the RCP8.5 scenario.” Energy efficiency has improved very little.
These are highly unlikely assumptions. With abundant natural gas displacing coal on a huge scale in the United States today, with the price
of solar power plummeting, with nuclear power experiencing a revival, with gigantic methane-hydrate gas resources being discovered on the seabed,
with energy efficiency rocketing upwards, and with population growth rates continuing to fall fast in virtually every country in the world, the one thing
we can say about RCP8.5 is that it is very, very implausible. Notice,
however, that even so, it is not a world of catastrophic pain.
The per capita income of the average human being in 2100 is three times what it is now. Poverty would be
history. So it’s hardly Armageddon. But there’s an even more startling fact. We now have many different studies of climate sensitivity
based on observational data and they all converge on the conclusion that it is much lower than assumed by the IPCC in these models. It has to be,
otherwise global temperatures would have risen much faster than they have over the past 50 years. As Ross McKitrick noted on this page earlier this
week, temperatures have not risen at all now for more than 17 years. With these much more realistic estimates of sensitivity (known as “transient
climate response”), even RCP8.5 cannot produce dangerous warming. It manages just 2.1C of warming by 2081-2100. That is to say, even if you pile
crazy assumption upon crazy assumption till you have an edifice of vanishingly small probability, you cannot even manage to make climate change cause
minor damage in the time of our grandchildren, let alone catastrophe. That’s not me saying this – it’s the IPCC itself. But what strikes me as truly
fascinating about these scenarios is that they tell us that globalization, innovation and economic growth are unambiguously good for the environment.
At the other end of the scale from RCP8.5 is a much more cheerful scenario called RCP2.6. In this happy world, climate change is not a problem at all in
2100, because carbon dioxide emissions have plummeted thanks to the rapid development of cheap nuclear and solar, plus a surge in energy efficiency.
The RCP2.6 world is much, much richer. The average person has an income about 15 times today’s in real terms, so that most people are far richer than
Americans are today. And it achieves this by free trade, massive globalization, and lots of investment in new technology. All the things the green
movement keeps saying it opposes because they will wreck the planet. The answer to climate change is, and always has been, innovation. To worry
now in 2014 about a very small, highly implausible set of circumstances in 2100 that just might, if climate
sensitivity is much higher than the evidence suggests, produce a marginal damage to the world economy,
makes no sense. Think of all the innovation that happened between 1914 and 2000. Do we really think there will be less in this century? As for
how to deal with that small risk, well there are several possible options. You could encourage innovation and trade. You could put a modest but growing
tax on carbon to nudge innovators in the right direction. You could offer prizes for low-carbon technologies. All of these might make a little sense. But
the one thing you should not do is pour public subsidy into supporting old-fashioned existing technologies that produce more carbon dioxide per unit of
energy even than coal (bio-energy), or into ones that produce expensive energy (existing solar), or that have very low energy density and so require
huge areas of land (wind). The IPCC produced two reports last year. One said that the cost of climate change is likely to be less than 2% of GDP by the
end of this century. The other said that the cost of decarbonizing the world economy with renewable energy is likely to be 4% of GDP. Why do
something that you know will do more harm than good?
Warming's the biggest hoax ever - latest studies and their data are wrong
Hoft 2/8 (Jim, writer for the Gateway pundit – “Report: Global Warming Biggest Science Hoax Ever “
http://www.thegatewaypundit.com/2015/02/304076/)//GV
The Global warming hoax is shaping up to be the greatest science scandal ever. The Telegraph reported: When future
generations look back on the global-warming scare of the past 30 years, nothing will shock them more than
the extent to which the official temperature records – on which the entire panic ultimately rested – were
systematically “adjusted” to show the Earth as having warmed much more than the actual data justified. Two
weeks ago, under the headline “How we are being tricked by flawed data on global warming”, I wrote about Paul
Homewood, who, on his Notalotofpeopleknowthat blog, had checked the published temperature graphs for three weather stations in Paraguay against the temperatures that
had originally been recorded. In
each instance, the actual trend of 60 years of data had been dramatically reversed, so
that a cooling trend was changed to one that showed a marked warming. This was only the latest of many
examples of a practice long recognised by expert observers around the world – one that raises an ever larger question mark over
the entire official surface-temperature record. Following my last article, Homewood checked a swathe of other South American weather stations around the original three.
In each case he found the same suspicious one-way “adjustments”. First these were made by the US government’s Global Historical
Climate Network (GHCN). They were then amplified by two of the main official surface records, the Goddard Institute for Space Studies (Giss) and the National Climate Data
Center (NCDC), which
use the warming trends to estimate temperatures across the vast regions of the Earth where
no measurements are taken. Yet these are the very records on which scientists and politicians rely for their
belief in “global warming”. Homewood has now turned his attention to the weather stations across much of the Arctic,
between Canada (51 degrees W) and the heart of Siberia (87 degrees E). Again, in nearly every case, the same
one-way adjustments have been made, to show warming up to 1 degree C or more higher than was indicated
by the data that was actually recorded. This has surprised no one more than Traust Jonsson, who was long in charge of climate research for the Iceland
met office (and with whom Homewood has been in touch). Jonsson was amazed to see how the new version completely “disappears” Iceland’s “sea ice years” around 1970,
when a period of extreme cooling almost devastated his country’s economy.
Their consensus is flawed
Tol 14 Richard Tol is a professor of economics at the University of Sussex, and a professor of the economics of
climate change at the Vrije Universiteit Amsterdam. He is a member of the Academia Europaea. He was a
contributor for the IPCC before he withdrew due to their exaggeration. He has a PhD from VU University
Amsterdam. (“The claim of a 97% consensus on global warming does not stand up Consensus is irrelevant in
science. There are plenty of examples in history where everyone agreed and everyone was wrong”,
http://www.theguardian.com/environment/blog/2014/jun/06/97-consensus-global-warming, 6/6/2014)
Kerwin
Dana Nuccitelli writes that I “accidentally confirm the results of last year’s 97% global warming consensus
study”. Nothing could be further from the truth. I show that the 97% consensus claim does not stand up. At best,
Nuccitelli, John Cook and colleagues may have accidentally stumbled on the right number. Cook and co selected some 12,000 papers
from the scientific literature to test whether these papers support the hypothesis that humans played a
substantial role in the observed warming of the Earth. 12,000 is a strange number. The climate literature is
much larger. The number of papers on the detection and attribution of climate change is much, much smaller.
Cook’s sample is not representative. Any conclusion they draw is not about “the literature” but rather about
the papers they happened to find. Most of the papers they studied are not about climate change and its
causes, but many were taken as evidence nonetheless. Papers on carbon taxes naturally assume that carbon
dioxide emissions cause global warming – but assumptions are not conclusions. Cook’s claim of an increasing consensus
over time is entirely due to an increase of the number of irrelevant papers that Cook and co mistook for evidence. The abstracts of the 12,000 papers
were rated, twice, by 24 volunteers. Twelve rapidly dropped out, leaving an enormous task for the rest. This shows. There are
patterns in the
data that suggest that raters may have fallen asleep with their nose on the keyboard. In July 2013, Mr Cook claimed to
have data that showed this is not the case. In May 2014, he claimed that data never existed. The
data is also ridden with error. By
Cook’s own calculations, 7% of the ratings are wrong. Spot checks suggest a much larger number of errors, up
to one-third [of the data is wrong]. Cook tried to validate the results by having authors rate their own papers.
In almost two out of three cases, the author disagreed with Cook’s team about the message of the paper in
question. Attempts to obtain Cook’s data for independent verification have been in vain. Cook sometimes claims that the raters are interviewees
who are entitled to privacy – but the raters were never asked any personal detail. At other times, Cook claims that the raters are not interviewees but
interviewers. The 97% consensus paper rests on yet another claim: the raters are incidental, it is the rated papers that matter. If
you measure
temperature, you make sure that your thermometers are all properly and consistently calibrated.
Unfortunately, although he does have the data, Cook does not test whether the raters judge the same paper in
the same way. Consensus is irrelevant in science. There are plenty of examples in history where everyone
agreed and everyone was wrong. Cook’s consensus is also irrelevant in policy. They try to show that climate
change is real and human-made. It does not follow whether and by how much greenhouse gas emissions
should be reduced. The debate on climate policy is polarised, often using discussions about climate science as
a proxy. People who want to argue that climate researchers are secretive and incompetent only have to point
to the 97% consensus paper. On 29 May, the Committee on Science, Space and Technology of the US House of Representatives examined the
procedures of the UN Intergovernmental Panel on Climate Change. Having been active in the IPCC since 1994, serving in various roles in all its three
working groups, most recently as a convening lead author for the fifth assessment report of working group II, my testimony to the committee briefly
reiterated some of the mistakes made in the fifth assessment report but focused on the structural faults in the IPCC, notably the selection of authors and
staff, the weaknesses in the review process, and the competition for attention between chapters. I highlighted that the IPCC is a natural monopoly that
is largely unregulated. I recommended that its assessment reports be replaced by an assessment journal. In an article on 2 June, Nuccitelli ignores the
subject matter of the hearing, focusing instead on a brief interaction about the 97% consensus paper co-authored by… Nuccitelli. He unfortunately
missed the gist of my criticism of his work. Successive literature reviews, including the ones by the IPCC, have time and again established that there has
been substantial climate change over the last one and a half centuries and that humans caused a large share of that climate change. There is
disagreement, of course, particularly on the extent to which humans contributed to the observed warming. This is part and parcel of a healthy scientific
debate. There is widespread agreement, though, that climate change is real and human-made. I
believe Nuccitelli and colleagues are
wrong about a number of issues. Mistakenly thinking that agreement on the basic facts of climate change
would induce agreement on climate policy, Nuccitelli and colleagues tried to quantify the consensus, and
failed. In his defence, Nuccitelli argues that I do not dispute their main result. Nuccitelli fundamentally misunderstands research. Science is not a
set of results. Science is a method. If the method is wrong, the results are worthless. Nuccitelli’s pieces are two of a series
of articles published in the Guardian impugning my character and my work. Nuccitelli falsely accuses me of journal shopping, a despicable practice. The
theologist Michael Rosenberger has described climate protection as a new religion, based on a fear for the
apocalypse, with dogmas, heretics and inquisitors like Nuccitelli. I prefer my politics secular and my science
sound.
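Tol's rater-consistency objection is a standard inter-rater reliability question: do two raters score the same abstract the same way, beyond chance? As an illustration only (the ratings below are hypothetical, not Cook's data), chance-corrected agreement can be checked with Cohen's kappa:

```python
from collections import Counter

def cohens_kappa(r1, r2):
    """Chance-corrected agreement between two raters over the same items."""
    assert len(r1) == len(r2)
    n = len(r1)
    # Observed raw agreement rate
    observed = sum(a == b for a, b in zip(r1, r2)) / n
    # Agreement expected by chance, from each rater's label frequencies
    c1, c2 = Counter(r1), Counter(r2)
    expected = sum(c1[k] * c2[k] for k in set(r1) | set(r2)) / n ** 2
    return (observed - expected) / (1 - expected)

# Hypothetical abstract ratings: 1 = "endorses AGW", 0 = "no position"
rater_a = [1, 1, 0, 1, 0, 1, 1, 0, 1, 1]
rater_b = [1, 0, 0, 1, 0, 1, 1, 1, 1, 1]
print(round(cohens_kappa(rater_a, rater_b), 2))  # -> 0.52
```

A kappa near 1 means the raters apply the same standard; values this far below 1 are exactly the kind of inconsistency Tol says Cook never tested for.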
A2 Warming EXT Time Frame
Extend: the time frame is 100 years. The card references IPCC models saying 3 of 4
scenarios happen by 2100, and the worst-case scenario before that relies on faulty
evidence assuming an indefinitely stable population growth rate and 10 times the coal consumption.
Their models artificially inflate heat records
Spencer 12 (Roy, former NASA climatologist and author, “McKitrick & Michaels Were Right: More Evidence of Spurious Warming in the IPCC Surface Temperature
Dataset,” 3/30 http://www.drroyspencer.com/page/2/)
The supposed gold standard in surface temperature data is that produced by Univ. of East Anglia, the so-called
CRUTem3 dataset. There has always been a lingering suspicion among skeptics that some portion of this IPCC
official temperature record contains some level of residual spurious warming due to the urban heat island effect.
Several published papers over the years have supported that suspicion. The Urban Heat Island (UHI) effect is
familiar to most people: towns and cities are typically warmer than surrounding rural areas due to the
replacement of natural vegetation with manmade structures. If that effect increases over time at thermometer
sites, there will be a spurious warming component to regional or global temperature trends computed from the
data. Here I will show based upon unadjusted International Surface Hourly (ISH) data archived at NCDC that the
warming trend over the Northern Hemisphere, where virtually all of the thermometer data exist, is a function
of population density at the thermometer site.
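Spencer's claim reduces to a simple regression of station temperature trend on population density. A minimal sketch with made-up numbers (both the data and the positive slope here are illustrative assumptions, not Spencer's ISH results):

```python
def ols_slope(xs, ys):
    """Ordinary least-squares slope of ys on xs."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    return cov / sum((x - mx) ** 2 for x in xs)

# Hypothetical stations: log10 population density vs. temperature trend (deg C/decade)
log_density = [0.5, 1.0, 1.5, 2.0, 2.5, 3.0]
trend = [0.12, 0.15, 0.19, 0.24, 0.28, 0.33]

# A slope near zero would mean no urban heat island contamination;
# a positive slope means denser sites show extra warming
print(round(ols_slope(log_density, trend), 3))  # -> 0.085
```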
A2 Warming AT: Consensus
Extend Tol 14- They cherry-picked 12,000 papers out of a much larger lit base ->
paper-by-paper analysis shows most papers mistake assumption for conclusion- they would
count carbon tax reports as confirmation of warming, and up to 1/3 of the ratings are wrong.
Scientific consensus doesn’t matter – they all use their own and each other’s false research,
assumptions, and models
Spencer 9 (Roy, former climatologist at NASA, “The MIT Global Warming Gamble,” May, http://www.drroyspencer.com/2009/05/the-mit-global-warming-gamble/)
True, there are many scientists who really do think our tinkering with the climate system through our
greenhouse gas emissions is like playing Russian roulette. But the climate system tinkers with itself all the time,
and the climate has managed to remain stable. There are indeed internal, chaotic fluctuations in the climate system that might
appear to be random, but their effect on the whole climate system are constrained to operate within a certain range. If
the climate system really was that sensitive, it would have forced itself into oblivion long ago. The MIT research group
pays lip service to relying on “peer-reviewed science”, but it looks like they treat peer-reviewed scientific publications as random events, too. If 99 papers have
been published which claim the climate system is VERY sensitive, but only 1 paper has been published that
says the climate system is NOT very sensitive, is there then a 99-in-100 (99%) chance that the climate system is
very sensitive? NO. As has happened repeatedly in all scientific disciplines, it is often a single research paper
that ends up overturning what scientists thought they knew about something. In climate research, those 99 papers typically will
all make the same assumptions, which then pretty much guarantees they will end up arriving at the same conclusions. So, those 99 papers do not
constitute independent pieces of evidence. Instead, they might be better described as evidence that ‘group
think’ still exists. It turns out that the belief in a sensitive climate is not because of the observational evidence, but in spite of it. You can start to learn more about the
evidence for low climate sensitivity (negative feedbacks) here. As the slightly-retouched photo of the MIT research group shown above suggests, I predict that it is
only a matter of time before the climate community placing all its bets on the climate models is revealed to be
a very bad gamble.
Case Economy
1NC – Cloud Computing
Cloud is growing rapidly because the benefits are too high; midsized firms are picking up
big-business slack overseas
McCue 14 (http://www.forbes.com/sites/tjmccue/2014/01/29/cloud-computing-united-states-businesseswill-spend-13-billion-on-it/, Cloud Computing: United States Businesses Will Spend $13 Billion On It, global
market for cloud equipment to reach $80 billion by 2018)//A.V.
Instead of a slow-moving fluffy white cloud image, the cloud computing industry should use a tornado – that
might be a better way to visualize how fast cloud computing is growing today. Amazon is dominating, but is
followed by the usual suspects: IBM, Apple, Cisco, Google, Microsoft,
Salesforce, and Rackspace, to name a few. (Disclosure: I am on the paid blogger team for IBM Midsize Insider, which
covers technology pertinent to midsize companies, including the cloud, among other topics.) “The cloud” is frequently in the news, but there is also a
fair amount of confusion, outside of technology teams. What is cloud computing and why do businesses need to care? IBM
published this
handy infographic: 5 Reasons Businesses Use The Cloud. Among the reasons: Collaboration, better access to
analytics, increasing productivity, reducing costs, and speeding up development cycles. By 2015, end-user
spending on cloud services could be more than $180 billion. It is predicted that the global market for cloud
equipment will reach $79.1 billion by 2018. If given the choice of only being able to move one application to the cloud, 25% of
respondents would choose storage. By 2014, businesses in the United States will spend more than $13 billion on cloud
computing and managed hosting services. According to Jack Woods at Silicon Angle, there’s some serious
growth forecasted and he lists 20 recent cloud computing statistics you can use to make your case for why you
need the cloud or to understand why you should consider it for your business. The above bullet points come
from his post. At the simplest level, cloud computing lets you share files and resources via the web. If you run a small business, you read about
cloud storage as a way to back up and protect your data. Providers include Google Docs (now Drive), DropBox, Carbonite, Cubby, and a slew of others.
If you need to choose an online backup provider, take a look at Tim Fisher’s post: 40 Online Backup Services Reviewed. Looking beyond online backup, to
understanding where the cloud is going, I look to Louis Columbus who writes extensively about the cloud here on Forbes and elsewhere. After reading
Louis’ Big Data post a couple of weeks ago, 2014: The Year Big Data Adoption Goes Mainstream In The Enterprise, (linked in left column) it made it clear
that big data takes big crunching power and is part of why many midsized and large enterprises are not leveraging in-house servers – there is no
leverage. You want to leverage a cloud provider like those listed above that can scale up and down with your computing needs. Well known tech
guru, Om Malik, can help you wrap your head around cloud computing and its enormous, rapid growth in his
interview with Amazon CTO Werner Vogels. Vogels spells out how the big impact of the cloud will take place
overseas, outside the USA, especially for small and midsized businesses. You can read additional 2014 coverage from GigaOm here.
Forbes contributor, Joe McKendrick, penned a useful post as I was writing this today: 9 ‘Worst Practices’ To Avoid With Cloud Computing. It includes
some great ideas to help you evaluate whether the cloud can help your business or not (listed to the left). From
improved collaboration to
productivity increases, the cloud has the potential to change your business. The cloud is rapidly growing,
moving quickly – if you’ve been on the fence around investing in new infrastructure, you should look to the
cloud. This post from Network Computing lists out a bunch of recent stats around big data and the cloud. Final note: The Myth Of Online Backup by
Tony Bradley here on Forbes is excellent and highlights how you need to read the fine print for any cloud service you use. He explains a not-so-extreme
example of a photographer using Carbonite to back up her 2.5 Terabytes of important, wouldn’t-want-to-lose customer data. Worthy read.
Cloud computing industry is inherently more secure than server storage and generates 1.5
times the profit compared with alternatives
Papersave 15 (http://www.papersave.com/blog/bid/209222/Cloud-computing-is-growing-rapidly-despitesome-misguided-concerns, Cloud computing is growing rapidly, despite some misguided concerns)//A.V.
The use of cloud computing to manage business documents and transactions is becoming more prominent
throughout the world. The technology has proven to lead to increased profitability for those who have implemented it, but in some
cases, unwarranted uncertainties are preventing companies from recognizing this advantage. Despite high usage
rates, apprehension exists in the European market The cloud has already expanded in a significant way across most of Europe. According to statistics
from EuroStat, Nordic countries are leading the way in adopting cloud solutions. Finland, Sweden and Denmark all rated in the top four in terms of
having the highest proportions of enterprises that used the technology, with 51 percent of Finnish companies indicating they used some sort of cloud-based service. The source found that the average rate of usage throughout the continent was 19 percent, and that there were 10 nations that rose
above this number. Among cloud subscribers, EuroStat noted that email leads the way for all services, with 66 percent using it. The second-most popular
solution is cloud storage, with 53 percent overall indicating use, and three northwest European countries - Ireland at 74 percent, Iceland at 74 percent
and the U.K. at 71 percent - lead the way. Despite widespread acclimation globally, a certain mystique continues to surround the cloud. The source
remarked that of those enterprises which have not yet begun to move any services to the cloud, the most common reason cited was a lack of knowledge
about the technology, at 42 percent. Furthermore, 37 percent indicated apprehension regarding risk of a security breach, and 33 percent were uncertain
about the location of cloud-based data. All
of these concerns should be alleviated by the fact that, according to Systems
and Software, cloud data centers are inherently more safe than on-premise servers, due to their advanced
security methodologies and initiatives. However, a certain "cloud," so to speak, continues to hang over the
industry, as many potential users remain unjustifiably skeptical about security. Meanwhile, overall
implementation is skyrocketing. An IDC white paper, entitled "Successful Cloud Partners 2.0," surveyed over
700 Microsoft partners worldwide and found that those with more than half of their revenue coming from the
cloud made 1.5 times the gross profit when compared to other partners. The study noted that on average, the respondents
make 26 percent of their income from cloud-related products and services, a number that is expected to surpass 40 percent in 2016. While the North
American and Western European economies have largely implemented cloud computing over the past decade or so, in emerging markets like Latin
America and Asia, the concept is still fairly new. The IDC study predicted that the cloud will grow in developing places 1.8 times faster than established
markets, and will close the gap on them. By 2017,
the source projected that newcomers will account for 21.3 percent of
the public cloud market. "Spending on public IT cloud services will grow to $108 billion in 2017." The source
predicted that spending on public IT cloud services will grow from $47.4 billion in 2013 to $108 billion in 2017.
Software as a service is projected to remain the highest-grossing cloud solution category through 2017, as it is
expected to make up 57.9 percent of the overall industry. However, IDC said that the platform-as-a-service and infrastructure-as-a-service sectors will grow faster than SaaS over the next five years. It appears as though fewer companies are viewing the cloud in a skeptical manner
with each passing year, and, eventually, the technology will become the standard for business practices throughout the world. Developing nations are
catching up to the traditional powerhouses in North America and Western Europe, so it's only a matter of time before the world is predominantly run via
cloud computing.
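As a quick sanity check on the IDC figures quoted above ($47.4 billion in 2013 growing to $108 billion in 2017), the implied compound annual growth rate works out to roughly 23 percent per year:

```python
def cagr(start, end, years):
    """Compound annual growth rate implied by a start and end value."""
    return (end / start) ** (1 / years) - 1

# IDC projection quoted above: public IT cloud services, 2013 -> 2017
print(round(cagr(47.4, 108.0, 4) * 100, 1))  # -> 22.9 (percent per year)
```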
No Impact to Econ collapse- countries turn inward
Bennet and Nordstrom 2k (D. Scott Bennett and Timothy Nordstrom, Department of Political Science at
Penn State, February 2000, The Journal of Conflict Resolution, Vol. 44, Iss. 1, “Foreign policy substitutability
and internal economic problems in enduring rivalries,” p. Proquest)
Military competition between states
requires large amounts of resources, and rivals require even more attention. Leaders may choose to negotiate a settlement that ends a
rivalry to free up important resources that may be reallocated to the domestic economy. In a “guns versus butter”
world of economic trade-offs, when a state can no longer afford to pay the expenses associated with competition in
a rivalry, it is quite rational for leaders to reduce costs by ending a rivalry. This gain (a peace dividend) could be achieved at any time by ending a rivalry.
However, such a gain is likely to be most important and attractive to leaders when internal conditions are bad and the leader is
seeking ways to alleviate active problems. Support for policy change away from continued rivalry is more likely to
develop when the economic situation sours and elites and masses are looking for ways to improve a worsening
situation. It is at these times that the pressure to cut military investment will be greatest and that state leaders will
be forced to recognize the difficulty of continuing to pay for a rivalry. Among other things, this argument also encompasses the view that the cold war
ended because the Union of Soviet Socialist Republics [USSR] could no longer compete economically with the United States.
Conflict settlement is also a distinct route to dealing with internal problems that leaders in rivalries may pursue when faced with internal problems.
Economic decline doesn’t cause war – best and most recent data
Drezner 14 [Daniel, American Professor of International Politics at the Fletcher School of Law and Diplomacy at Tufts University, April, “The
System Worked: How the World Stopped Another Great Depression,” Oxford University Press, pg. 37-8/AKG]
A closer look at the numbers, however, reveals more encouraging findings. What seemed to be an inexorable
increase in piracy, for example, turned out to be a blip. By September 2013, the total numbers of piracy attacks had
fallen to their lowest levels in seven years. Attacks near Somalia, in particular, declined substantially; the total number of attacks
fell by 70 percent in 2012 and an additional 86 percent in the first nine months of 2013. Actual hijackings were down 43 percent compared to 2008/9
levels. 47 The US Navy’s figures reveal similar declines in the number and success rate of pirate attacks. 48 Security
concerns have not
dented the opening of the global economy. As for the effect of the Great Recession on political conflict, the aggregate effects were
surprisingly modest. A key conclusion of the Institute for Economics and Peace in its 2012 report was that “the average level of
peacefulness in 2012 is approximately the same as it was in 2007.” 49 The institute’s concern in its 2013 report about a
decline in peace was grounded primarily in the increase in homicide rates— a source of concern, to be sure, but not exactly
the province of global governance. Both interstate violence and global military expenditures have declined
since the start of the financial crisis. Other studies confirm that the Great Recession has not triggered any increase in
violent conflict. Looking at the post-crisis years, Lotta Themnér and Peter Wallensteen conclude, “The pattern is one of relative
stability when we consider the trend for the past five years.” 50 The decline in secular violence that started with the end of the
Cold War has not been reversed. Rogers Brubaker observes that “the crisis has not to date generated the surge in protectionist
nationalism or ethnic exclusion that might have been expected.” 51
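The two piracy declines Drezner cites compound: a 70 percent fall followed by a further 86 percent fall leaves only about 4 percent of the original attack volume, i.e. a roughly 96 percent cumulative drop:

```python
# Compounding the sequential declines quoted above (70% in 2012,
# then an additional 86% in the first nine months of 2013)
remaining = (1 - 0.70) * (1 - 0.86)
print(round((1 - remaining) * 100, 1))  # -> 95.8 (percent cumulative decline)
```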
Trade is irrelevant for war
Katherine Barbieri 13, Associate Professor of Political Science at the University of South Carolina, Ph.D. in
Political Science from Binghamton University, “Economic Interdependence: A Path to Peace or Source of
Interstate Conflict?” Chapter 10 in Conflict, War, and Peace: An Introduction to Scientific Research, google
books
How does interdependence affect war, the most intense form of conflict? Table 2 gives the empirical results. The rarity of wars makes any analysis of
their causes quite difficult, for variations in interdependence will seldom result in the occurrence of war. As in the case of MIDs, the log-likelihood ratio tests for each model suggest that the inclusion of
the various measures of interdependence and the control variables improves our understanding of the factors affecting the occurrence of war over that obtained from the null model. However, the
individual interdependence variables, alone, are not statistically significant. This is not the case with contiguity and relative capabilities, which are both statistically significant. Again, we see that
contiguous dyads are more conflict-prone and that dyads composed of states with unequal power are more pacific than those with highly equal power. Surprisingly, no evidence is provided to support the
commonly held proposition that democratic states are less likely to engage in wars with other democratic states.¶ The evidence from the pre-WWII period provides
support for those arguing that economic factors have little, if any, influence on affecting leaders’ decisions to
engage in war, but many of the control variables are also statistically insignificant. These results should be interpreted with caution, since the sample does not contain a sufficient number of wars
to allow us to capture great variations across different types of relationships. Many observations of war are excluded from the sample by virtue of not having the corresponding explanatory measures. A
variable would have to have an extremely strong influence on conflict, as does contiguity, to find significant results.¶ 7. Conclusions: This study provides little empirical
support for the liberal proposition that trade provides a path to interstate peace. Even after controlling for the
influence of contiguity, joint democracy, alliance ties, and relative capabilities, the evidence suggests that in
most instances trade fails to deter conflict. Instead, extensive economic interdependence increases the likelihood that dyads engage in
militarized disputes; however, it appears to have little influence on the incidence of war. The greatest hope for peace appears to arise from symmetrical
trading relationships. However, the dampening effect of symmetry is offset by the expansion of interstate linkages. That is, extensive economic linkages, be they symmetrical or asymmetrical, appear to
pose the greatest hindrance to peace through trade.
Ext tech companies growing
Extend: the cloud computing industry is growing. The evidence gives stats for every major
company growing, growth will hit $80 billion in 3 years, and small cloud firms will pick up
the slack even if bigger ones stagnate; that's McCue 14. This is because the cloud
computing industry generates 1.5 times the profit; that's Papersave 15.
Cloud industry revenues grew by 60% after Snowden; companies just don't give a shit
Relander 15 (http://www.investopedia.com/articles/investing/032715/cloudcomputing-industryexponential-growth.asp, Cloud-Computing: An Industry In Exponential Growth)//A.V.- 60% growth, continued growth over the next 5
years, growth to $40 million by 2018
Investors today have far more options available than in the past. Among these options is the cloud. Nasdaq
reported that last year revenues for cloud services grew by 60 percent. Furthermore, cloud computing is
anticipated to continue growing at a robust rate over the course of the next five years. If you are considering investing in
a technology-based company, there are certainly many advantages. By owning stock in a company offering cloud-based services, you will not only be
able to follow the latest trends but also have the opportunity to make money from the explosive growth of this industry. Before getting involved in cloud
computing investing, however, it is important to understand what is involved, what is driving the growth in this industry, and the best way to plan your
cloud investments. What Is Cloud Computing? Before you consider investing in cloud computing, it is a good idea to have a basic understanding of
exactly what it is. While many of us enjoy the ability to upload photos, documents, and videos to "the cloud" and then retrieve them at our convenience,
the concept of cloud computing is somewhat abstract. The heart of cloud computing is fairly simple. Companies providing cloud services make it possible
to store data and applications remotely, and then access those files via the Internet. (For a background on the Internet industry from which cloud
computing has emerged, see article: The Industry Handbook: The Internet Industry.) Cloud computing is primarily comprised of three services:
infrastructure as a service (IaaS), software as a service (SaaS), and platform as a service (Paas). Software as a service is expected to experience the
fastest growth, followed by infrastructure as a service. According to research conducted by Forrester, the cloud computing market is anticipated to grow
from $58 billion in 2013 to more than $191 billion by the year 2020. Software as a Service (SaaS) Deployed online, SaaS involves the licensure of an
application to customers. Licenses are typically provided through a pay-as-you-go model or on-demand. This rapidly growing market could provide an
excellent investment opportunity, with Goldman Sachs reporting that SaaS software revenues are expected to reach $106 billion by next year.
Infrastructure as a Service (IaaS) Infrastructure as a service involves a method for delivering everything from operating systems to servers and storage
through IP-based connectivity as part of an on-demand service. Clients can avoid the need to purchase software or servers, and instead procure these
resources in an outsourced on-demand service. Platform as a Service (PaaS) Of the three layers of cloud-based computing, PaaS is considered the most
complex. While PaaS shares some similarities with SaaS, the primary difference is that instead of delivering software online, it is actually a platform for
creating software that is delivered via the Internet. Forrester research indicates that PaaS solutions are expected to generate $44 billion in revenues by
the year 2020. What Is Driving Growth in Cloud Computing The rise of cloud-based software has offered companies from all sectors a number of
benefits, including the ability to use software from any device, either via a native app or a browser. As a result, users are able to carry over their files and
settings to other devices in a completely seamless manner. Cloud
computing is about far more than just accessing files on
multiple devices, however. Thanks to cloud-computing services, users can check their email on any computer and even store files using
services such as Dropbox and Google Drive. Cloud-computing services also make it possible for users to back up their
music, files, and photos, ensuring that those files are immediately available in the event of a hard drive crash.
Driving the growth in the cloud industry is the cost savings associated with the ability to outsource the
software and hardware necessary for tech services. According to Nasdaq, investments in key strategic areas
such as big data analytics, enterprise mobile, security and cloud technology, is expected to increase to more
than $40 million by 2018. With cloud-based services expected to increase exponentially in the future, there
has never been a better time to invest, but it is important to make sure you do so cautiously. (See article: A Primer On
Investing In The Tech Industry.)
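The Forrester projection in the card ($58 billion in 2013 to more than $191 billion by 2020) implies an annual growth rate of about 18.6 percent. A small sketch, assuming constant compounding (the source does not give a year-by-year path):

```python
def implied_rate(start, end, years):
    """Annual growth rate implied by two endpoint values."""
    return (end / start) ** (1 / years) - 1

rate = implied_rate(58.0, 191.0, 7)  # Forrester endpoints, 2013 -> 2020
print(round(rate * 100, 1))          # -> 18.6 (percent per year)

# Year-by-year projection under constant growth ($bn, an assumption)
for year in range(2014, 2021):
    print(year, round(58.0 * (1 + rate) ** (year - 2013)))
```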
EXT No Resources for war
Empirics flow neg on this
DeMause 2
(Lloyd deMause, director of The Institute for Psychohistory, “Nuclear War as an Anti-Sexual Group Fantasy”
Updated December 18th 2002, http://www.geocities.com/kidhistory/ja/nucsex.htm)
The nation "turns inward" during this depressed phase of the cycle. Empirical studies have clearly demonstrated that
major economic downswings are accompanied by "introverted" foreign policy moods, characterized by fewer armed
expeditions, less interest in foreign affairs in the speeches of leaders, reduced military expenditures, etc. (Klingberg, 1952; Holmes,
1985). Just as depressed people experience little conscious rage--feeling "I deserve to be killed" rather than "I want to
kill others" (Fenichel, 1945, p. 393)--interest in military adventures during the depressed phase wanes, arms expenditures decrease
and peace treaties multiply.
Ext Empirics Collapse =/= war
93 crises prove no war
Miller 2k (Morris, Economist, Adjunct Professor in the Faculty of Administration – University of Ottawa,
Former Executive Director and Senior Economist – World Bank, “Poverty as a Cause of Wars?”,
Interdisciplinary Science Reviews, Winter, p. 273) NR
The question may be reformulated. Do wars spring from a popular reaction to a sudden economic crisis that
exacerbates poverty and growing disparities in wealth and incomes? Perhaps one could argue, as some scholars do, that it is some dramatic
event or sequence of such events leading to the exacerbation of poverty that, in turn, leads to this deplorable denouement. This exogenous
factor might act as a catalyst for a violent reaction on the part of the people or on the part of the political leadership who would then possibly
be tempted to seek a diversion by finding or, if need be, fabricating an enemy and setting in train the process leading to war. According to
a study undertaken by Minxin Pei and Ariel Adesnik of the Carnegie Endowment for International Peace,
there would not appear to be any merit in this hypothesis. After studying ninety-three episodes of
economic crisis in twenty-two countries in Latin America and Asia in the years since the Second World War they concluded that:19
Much of the conventional wisdom about the political impact of economic crises may be wrong ... The severity of
economic crisis – as measured in terms of inflation and negative growth - bore no relationship to the collapse of regimes ... (or, in
democratic states, rarely) to an outbreak of violence ... In the cases of dictatorships and semidemocracies, the ruling elites responded
to crises by increasing repression (thereby using one form of violence to abort another).
ZERO uptick in conflict after the ’08 collapse proves
Barnett 9 (Senior managing director of Enterra Solutions LLC and a contributing editor/online
columnist for Esquire magazine, columnist for World Politics Review, Thomas P.M. “The New Rules:
Security Remains Stable Amid Financial Crisis,” World Politics Review,
8/25/2009, http://www.aprodex.com/the-new-rules--security-remains-stable-amid-financial-crisis-398bl.aspx)
When the global financial crisis struck roughly a year ago, the blogosphere was ablaze with all sorts of scary
predictions of, and commentary regarding, ensuing conflict and wars -- a rerun of the Great Depression leading to world war,
as it were. Now, as global economic news brightens and recovery -- surprisingly led by China and emerging markets -- is the talk of the day, it's interesting to look back over the past year and realize how
globalization's first truly worldwide recession has had virtually no impact whatsoever on the international
security landscape. None of the more than three-dozen ongoing conflicts listed by GlobalSecurity.org can be
clearly attributed to the global recession. Indeed, the last new entry (civil conflict between Hamas and Fatah in the Palestine) predates the economic crisis by a year, and three
quarters of the chronic struggles began in the last century. Ditto for the 15 low-intensity conflicts listed by Wikipedia (where the latest entry is the Mexican "drug war" begun in 2006). Certainly, the Russia-Georgia
conflict last August was specifically timed, but by most accounts the opening ceremony of the Beijing Olympics was the most important external trigger (followed by the U.S. presidential campaign) for that sudden spike
in an almost two-decade long struggle between Georgia and its two breakaway regions. Looking over the various databases, then, we see a most familiar picture: the usual mix of civil conflicts, insurgencies, and
liberation-themed terrorist movements. Besides the recent Russia-Georgia dust-up, the only two potential state-on-state wars (North v. South Korea,
Israel v. Iran) are both tied to one side acquiring a nuclear weapon capacity -- a process wholly unrelated to
global economic trends. And with the United States effectively tied down by its two ongoing major interventions (Iraq and Afghanistan-bleeding-into-Pakistan), our involvement
elsewhere around the planet has been quite modest, both leading up to and following the onset of the
economic crisis: e.g., the usual counter-drug efforts in Latin America, the usual military exercises with allies across Asia, mixing it up with pirates off Somalia's coast. Everywhere else
we find serious instability we pretty much let it burn, occasionally pressing the Chinese -- unsuccessfully -- to do something. Our new Africa Command, for example, hasn't led us to anything beyond advising and
training local forces. So, to sum up: No significant uptick in mass violence or unrest (remember the smattering of urban riots last year in places like Greece, Moldova and
Latvia?); The usual frequency maintained in civil conflicts (in all the usual places); Not a single state-on-state war
directly caused (and no great-power-on-great-power crises even triggered); No great improvement or
disruption in great-power cooperation regarding the emergence of new nuclear powers (despite all that diplomacy); A modest
scaling back of international policing efforts by the system's acknowledged Leviathan power (inevitable given the strain); and No serious efforts by any rising great power to
challenge that Leviathan or supplant its role. (The worst things we can cite are Moscow's occasional deployments of strategic assets to the Western hemisphere and its weak efforts to outbid the United States on basing
rights in Kyrgyzstan; but the best include China and India stepping up their aid and investments in Afghanistan and Iraq.) Sure, we've finally seen global defense spending surpass the previous world record set in the late
1980s, but even that's likely to wane given the stress on public budgets created by all this unprecedented "stimulus" spending. If anything, the friendly cooperation on such stimulus packaging was the most notable great-
power dynamic caused by the crisis. Can we say that the world has suffered a distinct shift to political radicalism as a result of
the economic crisis? Indeed, no. The world's major economies remain governed by center-left or center-right
political factions that remain decidedly friendly to both markets and trade. In the short run, there were attempts across the board to insulate
economies from immediate damage (in effect, as much protectionism as allowed under current trade rules), but there was no great slide into "trade wars." Instead, the World Trade
Organization is functioning as it was designed to function, and regional efforts toward free-trade agreements have not slowed. Can we say Islamic radicalism was inflamed by the economic crisis? If it was, that shift was
clearly overwhelmed by the Islamic world's growing disenchantment with the brutality displayed by violent extremist groups such as al-Qaida. And looking forward, austere economic times are just as likely to breed connecting evangelicalism as disconnecting fundamentalism. At the end of the day, the economic crisis did
not prove to be sufficiently frightening to provoke major economies into establishing global regulatory
schemes, even as it has sparked a spirited -- and much needed, as I argued last week -- discussion of the continuing viability of the U.S. dollar as the world's primary reserve currency. Naturally, plenty of
experts and pundits have attached great significance to this debate, seeing in it the beginning of "economic
warfare" and the like between "fading" America and "rising" China. And yet, in a world of globally integrated production chains and interconnected
financial markets, such "diverging interests" hardly constitute signposts for wars up ahead. Frankly, I don't welcome a world in which America's fiscal profligacy goes undisciplined, so bring it on -- please! Add it all up and
it's fair to say that this global financial crisis has proven the great resilience of America's post-World War II
international liberal trade order. Do I expect to read any analyses along those lines in the blogosphere any time soon? Absolutely not. I expect the fantastic fearmongering to proceed apace. That's what the Internet is for.
Ext Trade Doesn’t Solve War
Trade doesn’t stop war - 1NC Barbieri uses rigorous statistical analysis controlling for a ton of
variables; reject their studies that don’t rely on empirical support
Their authors have causality backward---war prevents trade
Omar M. G. Keshk 10, senior lecturer in the Political Science Department at, and PhD in Political
Science from, Ohio State University; Rafael Reuveny, prof of international political economy and
ecological economics at and PhD from Indiana University; and Brian M. Pollins, emeritus Associate Prof
of Political Science at Ohio State; “Trade and Conflict: Proximity, Country Size, and Measures,” Conflict
Management and Peace Science 2010 27: 3, SAGE journals
In all, any signal that “trade brings peace” remains weak and inconsistent, regardless of the way proximity is modeled in
the conflict equation. The signal that conflict reduces trade, in contrast, is strong and consistent. Thus, international politics
are clearly affecting dyadic trade, while it is far less obvious whether trade systematically affects dyadic politics, and if it does, whether that
effect is conflict dampening or conflict amplifying. This is what we have termed in KPR (2004) “The Primacy of Politics.” ¶ 7. Conclusion ¶ This study revisited the simultaneous equations model we
presented in KPR (2004) and subjected it to four important challenges. Two of these challenges concerned the specification of the conflict equation in our model regarding the role of inter-capital distance and the sizes of both
sides in a dyad; one questioned the bilateral trade data assumptions used in the treatment of zero and missing values, and one challenge suggested a focus on fatal MIDs as an alternative indicator to the widely used all-MID
measure. ¶ The theoretical and empirical analyses used to explore proposed alternatives to our original work were instructive and the empirical results were informative, but there are certainly other legitimate issues that the trade
and conflict research community may continue to ponder. For example, researchers may continue to work on questions of missing bilateral trade data, attempt to move beyond the near-exclusive use of the MIDs data as we
contemplate the meaning of “military conflict,” and use, and extend the scope of, the Harvey Starr GIS-based border data as one way to treat contiguity with more sophistication than the typical binary variable. ¶ The single
greatest lesson of this study is that future work studying the effect of international trade on international military conflict needs to employ a simultaneous specification of the relationship between the two forces. The
results we obtained under all the 36 SEM alternatives we estimated yielded an important, measurable effect of conflict on trade.
Henceforth, we would say with high confidence that any study of the effect of trade on conflict that ignores this reverse fact is
practically guaranteed to produce estimates that contain simultaneity bias. Such studies will claim that
“trade brings peace,” when we now know that in a much broader range of circumstances, it is “peace
that brings trade.” ¶ Our message to those who would use conflict as one factor in a single-equation model of trade is only slightly less cautionary. They too face dangers in ignoring the other side of the coin.
In one half of the 36 permutations we explored, the likelihood of dyadic military conflict was influenced by trade flows. In most tests where this effect surfaced, it was positive, that is, trade made conflict more likely. But the
direction of this effect is of no consequence for the larger lesson: trade modelers ignore the simultaneity between international commerce and political enmity at their peril. They too run no small risk of finding themselves
deceived by simultaneity bias. ¶ Our empirical findings show clearly that international politics pushes commerce in a much broader range of circumstances than the reverse. In fact, we could find no combination of model choices,
indicators, or data assumptions that failed to yield the result that dyadic conflict reduces dyadic trade. Liberal claims regarding the effect of dyadic trade on
dyadic conflict simply were not robust in our findings. They survived in only 8 of the 36 tests we ran, and failed to hold up when certain data assumptions were
altered, and were seriously vulnerable to indicator choices regarding inter-capital distance, conflict, and national size.
Trade and interdependence don’t prevent war
Zachary Keck 13, Associate Editor of The Diplomat, monthly columnist for The National Interest,
7/12/13, “Why China and the US (Probably) Won’t Go to War,” http://thediplomat.com/2013/07/why-china-and-the-us-probably-wont-go-to-war/
Xinhua was the latest to weigh in on this question ahead of the Strategic and Economic Dialogue this
week, in an article titled, “China, U.S. Can Avoid ‘Thucydides Trap.’” Like many others, Xinhua’s
argument that a U.S.-China war can be avoided is based largely on their strong economic relationship.
This logic is deeply flawed both historically and logically. Strong economic partners have gone to war in
the past, most notably in WWI, when Britain and Germany fought on opposite sides despite being each
other’s largest trading partners.
More generally, the notion of a “capitalist peace” is problematic at best. Close trading ties can raise the
cost of war for each side, but any great power conflict is so costly already that the addition of a
temporary loss of trade with one’s leading partner is a small consideration at best.
And while trade can create powerful stakeholders in each society who oppose war, just as often trading
ties can be an important source of friction. Indeed, the fact that Japan relied on the U.S. and British
colonies for its oil supplies was actually the reason it opted for war against them. Even today, China’s
allegedly unfair trade policies have created resentment among large political constituencies in the
United States.
AT: Gartzke
Gartzke’s model has significant missing values which biases its findings
Zhen Han 12, MA, Political Science, University of British Columbia, March 2012, “The Capitalist Peace
Revisited: A New Liberal Peace Model and the Impact of Market Fluctuations,”
https://circle.ubc.ca/bitstream/handle/2429/41809/ubc_2012_spring_han_zhen.pdf?sequence=1
The missing value problem needs serious attention for students who study liberal peace models.
Dafoe finds that missing values in Gartzke’s models are systematically associated with its major
explanatory variable (market openness),96 thus leading to a biased conclusion. For example, China, the
U.S.S.R., and North Korea were involved in several militarized interstate conflicts, but a significant
part of the market openness data is missing for these countries,97 and excluding these cases from the
model leads to a bias. While Dafoe assigns value 1 (least open to financial market) to all the missing values of China, the U.S.S.R., and
North Korea, he finds that market openness lost its significance and democracy became significant again.98 But Dafoe’s approach can be
problematic as well, because these nations may be open to each other while staying closed to the West or the global financial markets. In
the case of North Korea, foreign capital from the U.S.S.R. and China was pivotal to the survival of the regime.
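The statistical mechanism this card describes can be illustrated with a toy simulation (an illustrative sketch only: the variable names, the data-generating process, and the direction of the bias are assumptions of the example, not a reconstruction of Gartzke's or Dafoe's actual models). When the cases that go missing depend on both the explanatory variable and the outcome, as with conflict-prone closed economies like China, the U.S.S.R., and North Korea, listwise deletion can manufacture a slope even where the true effect is zero.

```python
import random

random.seed(42)

# Toy data: "openness" and "conflict" are generated INDEPENDENTLY,
# so the true regression slope is exactly zero.
n = 50_000
openness = [random.gauss(0, 1) for _ in range(n)]
conflict = [random.gauss(0, 1) for _ in range(n)]

def ols_slope(xs, ys):
    """Bivariate OLS slope: cov(x, y) / var(x)."""
    mx = sum(xs) / len(xs)
    my = sum(ys) / len(ys)
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = sum((x - mx) ** 2 for x in xs)
    return num / den

# Full sample: the estimated slope is ~0, as it should be.
slope_full = ols_slope(openness, conflict)

# Listwise deletion mimicking the critique: the cases that go missing
# are not random -- observations that are both "closed" (low openness)
# and "conflictual" (high conflict) are dropped from the sample.
kept = [(x, y) for x, y in zip(openness, conflict) if not (x < 0 and y > 0)]
slope_deleted = ols_slope([x for x, _ in kept], [y for _, y in kept])

print(f"full-sample slope:       {slope_full:+.3f}")
print(f"after listwise deletion: {slope_deleted:+.3f}")
```

Here the deletion rule happens to produce a spurious positive slope; a different missingness pattern would push the estimate the other way. The direction is incidental to the card's point, which is that non-random missingness of the kind Dafoe identifies biases the estimate regardless of sign.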
Correcting for those missing values proves market liberalization causes conflict--Gartzke’s backwards
Zhen Han 12, MA, Political Science, University of British Columbia, March 2012, “The Capitalist Peace
Revisited: A New Liberal Peace Model and the Impact of Market Fluctuations,”
https://circle.ubc.ca/bitstream/handle/2429/41809/ubc_2012_spring_han_zhen.pdf?sequence=1
Model 1 replicates Model 5 of Gartzke’s capitalist peace paper.100 A major difference between the findings of
Model 1 and Gartzke’s capitalist peace Model 5 is that Model 1 of this paper shows that a higher level of financial market
openness is positively associated with more conflict, while Gartzke finds his market openness index is negatively associated
with conflict.101 As Dafoe points out, Gartzke’s finding can be damaged by the missing values in his market
openness variable, and the temporal dependence and cross-sectional dependence are not properly
controlled.102 Model 1 pays close attention to these problems and finds that, at least in this period, market
openness is positively associated with more conflicts. As the data of this paper focuses on a different time period, this result does not suggest
Gartzke is wrong, but further explanation of why market openness is positively associated with more conflict is necessary.
The low value of democracy is negatively associated with conflicts, and this finding is consistent with the argument of democratic peace theory. The positive impact of the high value of
democracy possibly shows that a discrepant dyad—when the democracy low value is controlled—is more likely to fight each other. As Choi points out, the interpretation of the
democracy high variable is often difficult, but it seems the democratic peace theory is well supported by this data.
The traditional commercial peace theory, which focuses on the trade dependency created by international commodity trade, is also supported by this model. Development makes
noncontiguous states more likely to fight each other, as the development facilitated the capacity of states to project power to a longer distance, but development also makes contiguous
states less likely to fight each other103. This finding supports that the interaction effect between contiguity and development is also robust in this period. Being a major power makes the
state more likely to be involved in conflicts. Similar to this finding, a state is more likely to be involved in MIDs if its national power index is higher. However, formal alliances have no
significant impact on the probability of MIDs.
Model 2 replaces the high value of democracy with the democracy distance variable.104 Since the democracy distance variable is a linear transformation of the high value of
democracy,105 this replacement produces identical results to Model 1, but the interpretation of democratic peace in this model is much easier. The positive and significant impact of the
democracy distance variable supports the expectation from Choi: politically different countries— the authoritarian states and the democratic states—are more likely to fight each other
106. Different political ideology can be the underlying reason for tension. As this paper suggests before, since many pacifying mechanisms available for democracies do not exist in
autocratic and discrepant dyads, the same democracy distance should have different impact in different types of dyad. Model 3.1 applies this proposal and makes the lower value of
democracy interact with the democracy distance variable. The findings are impressive: The negative coefficient of the lower value of democracy becomes significant again; the coefficient
of democracy distance loses its significance, but the interaction effects between these two variables are positively significant. This finding supports the democratic peace argument:
countries are less likely to fight if they both are highly democratic, but this pacifying effect has been mitigated if the democracy distance is getting bigger. Figure 1 presents a prediction
of the probability of conflict based on Model 3.1. It shows that the probability of conflict is almost the same for autocratic and discrepant dyads, and both of them are much higher than
the probability for democratic dyads.
Model 3.2 replaces the low value of democracy with a three-category indicator of dyad type107 and makes the dyad type indicator interact with the democracy distance variable. The
result shows that, compared with the base category (democratic dyad), the risk of fighting is higher in the other two types of dyads. In the base category, democratic distance does not
have significant impact on their chance of fighting. Figure 2 shows how the predicted probability of conflict, based on Model 3.2, changes across different dyad types. The predicted
probability shows that one can confidently claim that democratic dyads are more peaceful than other types of dyad, but the upward trend, which is similar to the trend showing in the
predicted chance of fighting for autocracies, shows that bigger democracy distance leads to more conflicts in these two types of dyads. The discrepant dyad group generally behaves
similarly to the autocracy group, except for the downward trend of the curve, which shows that, instead of fighting over democratic ideology, discrepant dyads often fight for
other reasons. However, the confidence interval of the discrepant dyad group largely overlaps with the confidence interval of the autocracy group, so more data are needed to
distinguish whether discrepant dyads behave differently from autocracy dyads.
In conclusion, this paper argues that the democratic peace model can be improved by interacting the democracy distance variable with the other democracy measurement of the dyad.
Findings from these interaction models support the dyadic claim that “democratic countries are unlikely to fight each other”, but they also suggest one cannot extend this claim to the
monadic level. Democratic countries are not more peaceful, as the chance of conflicts is high in a discrepant dyad. Increasing ideological differences, as measured by the democracy
distance variable in these models, can increase the chances of conflicts.
On the commercial peace aspect, Model 1 of this paper suggests that higher
market openness can lead to more
conflicts. This positive correlation might be explained by the spillover effect of market fluctuation. In order to
capture the impact of market fluctuation, Model 4 added a set of variables related to the measurement of foreign capital net inflows to the
model. The results show that, once the capital flow factors are considered in the model, the market openness variable loses its significance,
and a higher level of capital net inflow is positively associated with more interstate conflicts.
The missing value indicator of capital net inflows is included in the model to control the damage caused by missing data in the capital net
inflow variable. This missing indicator is positive and significant, suggesting that missing economic data are systematically associated with
militarized conflicts. The lagged capital net inflows variable, measured as the percentage of GDP, is included in the model, and higher level
of capital net inflows is associated with a higher risk of conflicts. The
change of capital net inflow variable, which is
measured by the level of current capital net inflows minus the level of the one-year lagged capital net inflows, is also positively
associated with more conflicts, meaning the risk of conflict is higher if there is more foreign capital
pouring into the country. These findings support the theory of this paper that large capital inflows can
destabilize the domestic economy and cause crises, but they are also contrary to the conventional understanding that
foreign capital will leave the conflicting region. However, it can be explained by the following reasons.
A2 data localization
The cost of data localization is not worth the gains – both small and big companies
won’t leave
Marel, Lee-Makiyama, and Bauer in 14 (Published May 2014 by Erik van der Marel, Hosuk Lee-Makiyama, and Matthias Bauer, ECIPE, 7-15-2015, "The Costs of Data Localisation: A Friendly Fire on
Economic Recovery," http://www.ecipe.org/publications/dataloc/ )
This paper aims to quantify the losses that result from data localisation requirements and related data privacy and security laws that discriminate
against foreign suppliers of data, and downstream goods and services providers, using GTAP8. The study looks at the effects of
recently proposed or enacted legislation in seven jurisdictions, namely Brazil, China, the European Union
(EU), India, Indonesia, South Korea and Vietnam. Access to foreign markets and globalised supply chains
are the major sources of growth, jobs and new investments – in particular for developing economies.
Manufacturing and exports are also dependent on having access to a broad range of services at
competitive prices, which depend on secure and efficient access to data. Data localisation potentially affects any
business that uses the internet to produce, deliver, and receive payments for their work, or to pay their salaries and taxes. The impact of
recently proposed or enacted legislation on GDP is substantial in all seven countries: Brazil (-0.2%), China (-1.1%),
EU (-0.4%), India (-0.1%), Indonesia (-0.5%), Korea (-0.4%) and Vietnam (-1.7%). These changes significantly affect post-crisis
economic recovery and can undo the productivity increases from major trade agreements, while
economic growth is often instrumental to social stability. If these countries would also introduce economy-wide data localisation requirements that apply across all sectors of the economy, GDP losses would be even
higher: Brazil (-0.8%), the EU (-1.1%), India (-0.8%), Indonesia (-0.7%), Korea (-1.1%). The impact on overall domestic investments is also considerable:
Brazil (-4.2%), China (-1.8%), the EU (-3.9%), India (-1.4%), Indonesia (-2.3%), Korea (-0.5%) and Vietnam (-3.1%). Exports of China and Indonesia
also decrease by -1.7% as a consequence of direct loss of competitiveness. Welfare losses (expressed as actual economic losses by the citizens)
amount to up to $63 bn for China and $193 bn for the EU. For India, the loss per worker is equivalent to 11% of the average month salary, and
almost 13 percent in China and around 20% in Korea and Brazil. The findings show that
the negative impact of disrupting cross-border data flows should not be ignored. The globalised economy has made unilateral trade restrictions
a counterproductive strategy that puts the country at a relative loss to others, with no possibilities to
mitigate the negative impact in the long run. Forced localisation is often the product of poor or one-sided economic analysis, with the surreptitious objective of keeping foreign competitors out. Any gains
stemming from data localisation are too small to outweigh losses in terms of welfare and output in the
general economy.
Big tech companies are not going to localize their data
Miller in 14 (Claire Cain Miller, 1-24-2014, "Google Pushes Back Against Data
Localization," Bits Blog, http://bits.blogs.nytimes.com/2014/01/24/google-pushes-back-against-data-localization/, NB)
The big tech companies have put forth a united front when it comes to pushing back against the
government after revelations of mass surveillance. But their cooperation goes only so far. Microsoft this week suggested that
it would deepen its existing efforts to allow customers to store their data near them and outside the United States. Google, for its part, has been
fighting this notion of so-called data localization. “If data localization and other efforts are successful, then what we will
face is the effective Balkanization of the Internet and the creation of a ‘splinternet’ broken up into smaller national
and regional pieces, with barriers around each of the splintered Internets to replace the global Internet we know today,” Richard Salgado, Google’s
director of law enforcement and information security, told a congressional panel in November. Data crisscrosses the globe among data centers,
and companies often store redundant copies of data in different places in case of natural disaster or technical failure. In most cases, companies
cannot even pinpoint precisely where certain data is located. At the same time, the United States government is tapping
the fiber-optic network that connects data centers worldwide, according to leaked documents. So even if
data is stored outside the United States, it could be intercepted during its travels. Still, Microsoft and other
tech companies are trying to prevent foreign customers from switching to services outside the United
States. In the next three years, the cloud computing industry could lose $180 billion, 25 percent of its
revenue, because of such defections, according to Forrester, a research company. Yet even though Google
faces these same risks and requests from foreign customers, its policy position is for surveillance reform
instead of data localization, according to a person briefed on Google’s policy who would speak only anonymously. Though Google at
one time tried to offer customers the ability to store their data in one location in response to requests, it does not offer that feature now because
it determined it was illogical, the person said. Google
decided data is more secure if it is stored in multiple locations
and that storing it in one location slows Google services and makes accessing the data less convenient for
customers, the person said. Mr. Salgado said a proposed law in Brazil that would require all data of Brazilian citizens and companies to be
stored in the country would be so difficult to comply with that Google “could be barred from doing business in one of the
world’s most significant markets.” For a great many around the globe, the Snowden disclosures revealed a disturbing
relationship between the major U.S. technology firms and the American national security establishment.
Specifically, the disclosures showed that Yahoo, Google, and other large American tech companies had provided the
NSA with access to the data of the users of their services. Although there were many programs that tied the major American
firms to the NSA, three in particular drew special ire: the much-discussed PRISM7 program, a collaborative
effort between the NSA and the FBI which compelled Internet companies to hand over data held within
servers located on U.S. soil in response to a subpoena issued by a special intelligence court, and two
programs known as MUSCULAR and TEMPORA,89 both of which allowed the NSA (in partnership with
Britain’s signals intelligence agency, the GCHQ) to access information transmitted through the data
communication links of American-owned firms located outside the U.S., where statutory limitations on
data collection are far less stringent.10 The fact that American companies provided the U.S. government with information and access
to data (knowingly in some cases, apparently unwittingly in others) has led many foreign leaders to conclude that only
domestic firms – or at least non-American firms – operating exclusively within local jurisdictions, can be
trusted to host the data of their citizens. Prominent political voices around the globe have been anything but subtle in their
articulation of this assessment. Following the publication of the PRISM program in the Guardian newspaper, German Interior Minister
Hans-Peter Friedrich declared that, “whoever fears their communication is being intercepted in any way
should use services that don't go through American servers.”11 France’s Minister for the Digital Economy similarly insisted
that it was now necessary to “locate datacenters and servers in [French] national territory in order to better ensure data security.”12 Brazilian
President Dilma Rousseff agreed, insisting that, "there
is a serious problem of storage databases abroad. That certain
situation we will no longer accept."13 Unsurprisingly, these declarations from government officials at the
ministerial level and higher, and the policy responses those declarations suggest, are profoundly troubling
to American technology companies. U.S. firms have issued dire warnings in response,14 predicting that they could
lose tens of billions of dollars in revenue abroad as distrustful foreign governments and customers move – either by choice or by legal mandate
– to non-U.S. alternatives. Firms
fear that the anti-American backlash and potentially resulting data localization
laws (depending on the specifics of the rules enacted) will mean that they will be forced out of certain
markets, or forced to build expensive – and oftentimes unnecessarily redundant – data centers abroad.
Analysts are suggesting the fallout could mirror what happened to Huawei and ZTE, the Chinese technology and telecommunications firms that
were forced to abandon some U.S. contracts when American lawmakers accused the companies of planting in their products coding “backdoors”
for the Chinese People’s Liberation Army and intelligence services. 15 A much-cited estimate16 by the Information Technology and
Neg – A2 data localization.
Lewis, 2014
James A. Lewis Senior fellow and program director at the Center for Strategic and International Studies
(CSIS). Before joining CSIS, he worked at the Departments of State and Commerce. He was the
Rapporteur for the 2010 and the 2012–13 United Nations Group of Governmental Experts on
Information Security (2014), Heartbleed and the State of Cybersecurity, American Foreign Policy Interests:
The Journal of the National Committee on American Foreign Policy, 36:5, 294-299
Far from disappearing,
nation-states are adopting and adjusting—as they have done before—to a new technological
environment. Part of that process has been the gradual extension of sovereign control into what was formerly perceived as a borderless
domain—cyberspace. This extension of sovereignty has major implications for international security, affecting strategies for both offense and
defense, and for negotiations on norms, transparency, and obligations. The perception that sovereignty did not apply to cyberspace disguised a
set of thorny issues on state responsibilities, the role of neutral parties in cyberspace, and the nature of defense today. The
implications
of sovereignty and territorialization, leading to the creation of some kind of ‘‘multipolar’’ Internet, will
become central issues for international security as governments extend sovereign control over
cyberspace and develop the technologies and policies to implement that control. The 1990s view of cyberspace as a global commons 2
foreclosed some policy options for cybersecurity. As the belief in a commons is discarded, these options come back
into play. Nation-states are defining their borders in cyberspace and will now move to assert their control over them. Conversely, the
extension of sovereignty over the Internet and the growing role of the state in cyberspace both provide
new opportunities for better cybersecurity. Countries did not abandon their rights and duties when they adopted the Internet,
they simply did not exercise them. In the last few years, nations have discovered that they can, in fact, extend their sovereign control into
cyberspace. The reason: cyberspace is a physical, man-made creation, not a natural domain. It is created by an assembly of interconnected
computers. The speed at which these computers connect gave the illusion that there were no borders. The
physical underpinnings
of cyberspace—the computers, fiber-optic cables, and other devices— are all located within national
territory and thus subject to national laws. The few satellites or undersea cables not located in national territory are still subject
to the jurisdiction of some nation. Cyberspace has borders within which nations can assert their rights and responsibilities. This is not the
‘‘balkanization’’ of the Internet. Balkanization is a pejorative term applied by defenders of the status quo
(e.g., those supportive of a largely unregulated space dominated by the private interests and the political and cultural norms of a few
countries). The Internet will be no more balkanized than any physical terrain is now. We are unaccustomed to the exercise of sovereign control
in cyberspace; once such control is in place, clearly the Internet and its users will adjust. What nations do within their own territories and on the
networks and infrastructures located within those territories is their own business, subject to their international commitments on interstate
relations and human rights. Two factors are driving the extension of sovereign control. The
first is that the existing set of rules
and institutions that ‘‘manage’’ cyberspace are too weak and too limited for what has become a global
infrastructure. The many failings of cybersecurity and of privacy highlight this weakness, and governments are seeking to exercise their
responsibilities for public safety and national security purposes. The second is the discomfort with the implicit extension of American norms
and values across cyberspace. Cyberspace was shaped and governed by American beliefs, particularly on the freedom of speech (and by
implication, unrestricted access to content). Many
countries—and not just authoritarian states—find this
undesirable.3 Asserting sovereignty is a means to push back against being subsumed politically and
commercially by the superpower. Had a situation where governments were largely absent and where private actors exercised
control been able to deliver security, the status quo might possibly have remained unchallenged, but as nations perceived growing
risk in cyberspace because of their growing economic dependence and military (and, for some, political)
vulnerability in cyberspace, their initial reaction was (and continues to be) to assert control.
Distrust won’t undercut global connectivity
Consumer distrust of banks in the squo hasn’t triggered impacts
The Financial Brand 12, (a digital publication focused on marketing and strategy issues affecting
retail banks and credit unions), Consumers Distrust Banks More Than Any Other Industry, March 20,
2012, http://thefinancialbrand.com/22896/edelman-banking-financial-services-consumer-trust-study/,
The study, conducted by research firm StrategyOne on behalf of Edelman, involved 20-minute online
interviews with 30,000 people in 25 countries. Edelman describes participants in the survey as members
of the “informed public” — college graduates whose household income is in the top quartile for their
age, and who follow newsworthy issues several times a week. Since 2008, consumers’ confidence in
banks’ ability to “do the right thing” has plummeted — a stunning 46% in the US, and an equally shocking 30% in the UK. Meanwhile, in other markets like India and China where consumers already
have high levels of trust, faith in financial firms’ integrity has increased as much as 12% over the same
three-year period. Respondents ranked “has ethical business practices” (76%), “listens to customer
needs and feedback” (74%), and “places customers ahead of profits” (73%) as the most important
actions financial firms should take to rebuild trust. 31% of consumers think more regulations are needed
to curb irresponsible business practices. 25% want the government more involved to ensure companies
are behaving responsibly. Edelman says the concepts of trust and reputation are inseparable. A financial
institution’s reputation (i.e., its brand) is a consumer’s aggregate feelings about its past behaviors. Past
performance creates future expectations, which in turn determines the degree of trust a consumer
places in an organization. The Financial Trust Barometer, now in its 12th year, seals Edelman’s position
as one of the world’s foremost authorities on reputation tracking in the banking sector. Consumers in
the U.S. are more trusting of financial services brands than many other developed countries, however,
the industry is only trusted by a majority of consumers in nine of 25 countries studied. A report released
earlier this year by consulting firm Oliver Wyman found that 63% of people trust no one but themselves
to manage their retirement savings. Banks were trusted by less than 8% of those surveyed in the U.S.
and U.K. In order to regain trust, banks will have to offer simpler products that appear to offer value for
money, according to the Oliver Wyman study. To do that, lenders and insurers will probably cut one-quarter of their costs over the next eight years by delivering products online instead of through
branches.
Cloud Industry in Exponential Growth
The cloud industry is rapidly growing
McCue 14 (http://www.forbes.com/sites/tjmccue/2014/01/29/cloud-computing-united-statesbusinesses-will-spend-13-billion-on-it/, Cloud Computing: United States Businesses Will Spend $13
Billion On It)//A.V.
Instead of a slow-moving fluffy white cloud image, the cloud computing industry should use a tornado –
that might be a better way to visualize how fast cloud computing is growing today. Amazon is
dominating, but is followed by the usual suspects: IBM, Apple, Cisco, Google, Microsoft, Salesforce, and Rackspace, to name a few. (Disclosure: I am on the paid
blogger team for IBM Midsize Insider, which covers technology pertinent to midsize companies, including the cloud, among other topics.) “The
cloud” is frequently in the news, but there is also a fair amount of confusion, outside of technology teams. What is cloud computing and why do
businesses need to care? IBM
published this handy infographic: 5 Reasons Businesses Use The Cloud. Among
the reasons: Collaboration, better access to analytics, increasing productivity, reducing costs, and
speeding up development cycles. By 2015, end-user spending on cloud services could be more than
$180 billion. It is predicted that the global market for cloud equipment will reach $79.1 billion by 2018. If given the choice of only being able to move one application to the cloud, 25% of respondents would choose storage. By 2014, businesses
in the United States will spend more than $13 billion on cloud computing and managed hosting services.
According to Jack Woods at Silicon Angle, there’s some serious growth forecasted and he lists 20 recent
cloud computing statistics you can use to make your case for why you need the cloud or to understand
why you should consider it for your business. The above bullet points come from his post. At the simplest
level, cloud computing lets you share files and resources via the web. If you run a small business, you read about cloud storage as a way to
back up and protect your data. Providers include Google Docs (now Drive), DropBox, Carbonite, Cubby, and a slew of others. If you need to
choose an online backup provider, take a look at Tim Fisher’s post: 40 Online Backup Services Reviewed. Looking beyond online backup, to
understanding where the cloud is going, I look to Louis Columbus who writes extensively about the cloud here on Forbes and elsewhere. After
reading Louis’ Big Data post a couple of weeks ago, 2014: The Year Big Data Adoption Goes Mainstream In The Enterprise, (linked in left
column) it became clear that big data takes big crunching power and is part of why many midsized and large enterprises are not leveraging in-house servers – there is no leverage. You want to leverage a cloud provider like those listed above that can scale up and down with your
computing needs. Well
known tech guru, Om Malik, can help you wrap your head around cloud computing
and its enormous, rapid growth in his interview with Amazon CTO Werner Vogels. Vogels spells out how
the big impact of the cloud will take place overseas, outside the USA, especially for small and midsized
businesses. You can read additional 2014 coverage from GigaOm here. Forbes contributor, Joe McKendrick, penned a useful post as I was writing this
today: 9 ‘Worst Practices’ To Avoid With Cloud Computing. It includes some great ideas to help you evaluate whether the cloud can help your
business or not (listed to the left). From
improved collaboration to productivity increases, the cloud has the
potential to change your business. The cloud is rapidly growing, moving quickly – if you’ve been on the
fence around investing in new infrastructure, you should look to the cloud. This post from Network Computing lists
out a bunch of recent stats around big data and the cloud. Final note: The Myth Of Online Backup by Tony Bradley here on Forbes is excellent
and highlights how you need to read the fine print for any cloud service you use. He explains a not-so-extreme example of a photographer using
Carbonite to back up her 2.5 Terabytes of important, wouldn’t-want-to-lose customer data. Worthy read.
Cloud industry set to exponentially grow because it cuts costs for businesses
Relander 15 (http://www.investopedia.com/articles/investing/032715/cloudcomputing-industryexponential-growth.asp, Cloud-Computing: An Industry In Exponential Growth)//A.V. – 60% growth last year; continued growth over the next five years; $40 million by 2018
Investors today have far more options available than in the past. Among these options is the cloud.
Nasdaq reported that last year revenues for cloud services grew by 60 percent. Furthermore, cloud
computing is anticipated to continue growing at a robust rate over the course of the next five years. If you
are considering investing in a technology-based company, there are certainly many advantages. By owning stock in a company offering cloud-
based services, you will not only be able to follow the latest trends but also have the opportunity to make money from the explosive growth of
this industry. Before getting involved in cloud computing investing, however, it is important to understand what is involved, what is driving the
growth in this industry, and the best way to plan your cloud investments. What Is Cloud Computing? Before you consider investing in cloud
computing, it is a good idea to have a basic understanding of exactly what it is. While many of us enjoy the ability to upload photos, documents,
and videos to "the cloud" and then retrieve them at our convenience, the concept of cloud computing is somewhat abstract. The heart of cloud
computing is fairly simple. Companies providing cloud services make it possible to store data and applications remotely, and then access those
files via the Internet. (For a background on the Internet industry from which cloud computing has emerged, see article: The Industry Handbook:
The Internet Industry.) Cloud computing is primarily comprised of three services: infrastructure as a service (IaaS), software as a service (SaaS),
and platform as a service (PaaS). Software as a service is expected to experience the fastest growth, followed by infrastructure as a service.
According to research conducted by Forrester, the cloud computing market is anticipated to grow from $58 billion in 2013 to more than $191
billion by the year 2020. Software as a Service (SaaS) Deployed online, SaaS involves the licensure of an application to customers. Licenses are
typically provided through a pay-as-you-go model or on-demand. This rapidly growing market could provide an excellent investment
opportunity, with Goldman Sachs reporting that SaaS software revenues are expected to reach $106 billion by next year. Infrastructure as a
Service (IaaS) Infrastructure as a service involves a method for delivering everything from operating systems to servers and storage through IP-based connectivity as part of an on-demand service. Clients can avoid the need to purchase software or servers, and instead procure these
resources in an outsourced on-demand service. Platform as a Service (PaaS) Of the three layers of cloud-based computing, PaaS is considered
the most complex. While PaaS shares some similarities with SaaS, the primary difference is that instead of delivering software online, it is
actually a platform for creating software that is delivered via the Internet. Forrester research indicates that PaaS solutions are expected to
generate $44 billion in revenues by the year 2020. What Is Driving Growth in Cloud Computing The rise of cloud-based software has offered
companies from all sectors a number of benefits, including the ability to use software from any device, either via a native app or a browser. As a
result, users are able to carry over their files and settings to other devices in a completely seamless manner. Cloud
computing is about
far more than just accessing files on multiple devices, however. Thanks to cloud-computing services, users can check
their email on any computer and even store files using services such as Dropbox and Google Drive. Cloud-computing services also
make it possible for users to back up their music, files, and photos, ensuring that those files are
immediately available in the event of a hard drive crash. Driving the growth in the cloud industry is the
cost savings associated with the ability to outsource the software and hardware necessary for tech
services. According to Nasdaq, investments in key strategic areas such as big data analytics, enterprise
mobile, security and cloud technology, is expected to increase to more than $40 million by 2018. With
cloud-based services expected to increase exponentially in the future, there has never been a better
time to invest, but it is important to make sure you do so cautiously. (See article: A Primer On Investing In The Tech
Industry.)
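The remote store-and-retrieve model the card describes (upload from one device, fetch from any other) can be sketched as a toy example. The `CloudStore` class and its `put`/`get` methods are illustrative assumptions, not any real provider's API:

```python
# A minimal in-memory stand-in for a cloud storage backend, sketching the
# store-remotely / retrieve-anywhere model the article describes. All names
# here are hypothetical; a real service would sit behind authenticated HTTP.

class CloudStore:
    """Toy cloud storage: maps object paths to their byte contents."""

    def __init__(self):
        self._objects = {}  # path -> bytes

    def put(self, path: str, data: bytes) -> None:
        # In a real service this would be an authenticated HTTP PUT.
        self._objects[path] = data

    def get(self, path: str) -> bytes:
        # Any client holding credentials can retrieve the same object,
        # which is what makes files available across devices.
        return self._objects[path]

store = CloudStore()
store.put("photos/cat.jpg", b"\xff\xd8...")      # upload from device A
assert store.get("photos/cat.jpg") == b"\xff\xd8..."  # fetch from device B
```

The same interface, offered at different layers of the stack, roughly distinguishes the three service models the card lists: SaaS exposes finished applications, PaaS exposes a platform for building them, and IaaS exposes the underlying compute and storage.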
Cloud computing industry is inherently more secure than server storage and generates
1.5 times profit compared with alternatives
Papersave 15 (http://www.papersave.com/blog/bid/209222/Cloud-computing-is-growing-rapidlydespite-some-misguided-concerns, Cloud computing is growing rapidly, despite some misguided
concerns)//A.V.
The use of cloud computing to manage business documents and transactions is becoming more
prominent throughout the world. The technology has proven to lead to increased profitability for those who have implemented it,
but in some cases, unwarranted uncertainties are preventing companies from recognizing this
advantage. Despite high usage rates, apprehension exists in the European market The cloud has already expanded in a significant way
across most of Europe. According to statistics from EuroStat, Nordic countries are leading the way in adopting cloud solutions. Finland, Sweden
and Denmark all rated in the top four in terms of having the highest proportions of enterprises that used the technology, with 51 percent of
Finnish companies indicating they used some sort of cloud-based service. The source found that the average rate of usage throughout the
continent was 19 percent, and that there were 10 nations that rose above this number. Among cloud subscribers, EuroStat noted that email
leads the way for all services, with 66 percent using it. The second-most popular solution is cloud storage, with 53 percent overall indicating
use, and three northwest European countries - Ireland at 74 percent, Iceland at 74 percent and the U.K. at 71 percent - lead the way. Despite
widespread acclimation globally, a certain mystique continues to surround the cloud. The source remarked that of those enterprises which have
not yet begun to move any services to the cloud, the most common reason cited was a lack of knowledge about the technology, at 42 percent.
Furthermore, 37 percent indicated apprehension regarding risk of a security breach, and 33 percent were uncertain about the location of cloud-based data. All
of these concerns should be alleviated by the fact that, according to Systems and Software,
cloud data centers are inherently more safe than on-premise servers, due to their advanced security
methodologies and initiatives. However, a certain "cloud," so to speak, continues to hang over the
industry, as many potential users remain unjustifiably skeptical about security. Meanwhile, overall
implementation is skyrocketing An IDC white paper, entitled "Successful Cloud Partners 2.0," surveyed
over 700 Microsoft partners worldwide and found that those with more than half of their revenue
coming from the cloud made 1.5 times the gross profit when compared to other partners. The study noted
that on average, the respondents make 26 percent of their income from cloud-related products and services, a number that is expected to
surpass 40 percent in 2016. While the North American and Western European economies have largely implemented cloud computing over the
past decade or so, in emerging markets like Latin America and Asia, the concept is still fairly new. The IDC study predicted that the cloud will
grow in developing places 1.8 times faster than established markets, and will close the gap on them. By
2017, the source projected
that newcomers will account for 21.3 percent of the public cloud market. "Spending on public IT cloud
services will grow to $108 billion in 2017." The source predicted that spending on public IT cloud
services will grow from $47.4 billion in 2013 to $108 billion in 2017. Software as a service is projected to
remain the highest-grossing cloud solution category through 2017, as it is expected to make up 57.9
percent of the overall industry. However, IDC said that the platform-as-a-service and infrastructure-as-a-service sectors will grow
faster than SaaS over the next five years. It appears as though fewer companies are viewing the cloud in a skeptical manner with each passing
year, and, eventually, the technology will become the standard for business practices throughout the world. Developing nations are catching up
to the traditional powerhouses in North America and Western Europe, so it's only a matter of time before the world is predominantly run via
cloud computing.
Cloud computing = tech industry
UNC NDG (Cloud Computing,
http://www.unc.edu/courses/2010spring/law/357c/001/cloudcomputing/examples.html)//A.V.
Types of Cloud Computing Major corporations including Amazon, Google, IBM, Sun, Cisco, Dell, HP, Intel,
Novell, and Oracle have invested in cloud computing and offer individuals and businesses a range of
cloud-based solutions. For a list of the top 150 players in cloud computing, see the Cloud Computing Journal website:
http://cloudcomputing.sys-con.com/node/770174 Social Networking Perhaps the most famous use of cloud computing,
which does not strike people as "cloud computing" at first glance is social networking Websites,
including Facebook, LinkedIn, MySpace, Twitter, and many, many others. The main idea of social networking is to
find people you already know or people you would like to know and share your information with them. Of course, when you share your
information with these people, you're also sharing it with the people who run the service.
Data Localization Inevitable
5 alternative causes of data localization; we will re-read the rest of the 1AC article you discarded
Chander and Le 15 (Director, California International Law Center, Professor of Law and Martin Luther
King, Jr. Hall Research Scholar, University of California, Davis; Free Speech and Technology Fellow,
California International Law Center; A.B., Yale College; J.D., University of California, Davis School of Law,
Anupam Chander and Uyên P. Lê, DATA NATIONALISM, EMORY LAW JOURNAL, Vol. 64:677,
http://law.emory.edu/elj/_documents/volumes/64/3/articles/chander-le.pdf)//A.V.
II. Analysis The country studies above reveal the pervasive efforts across the world to erect barriers to the
global Internet. But can these measures that break the World Wide Web be justified by important
domestic policy rationales? Governments offer a variety of arguments for data localization, from
avoiding foreign surveillance to promoting users’ security and privacy to bolstering domestic law
enforcement and securing domestic economic development. We consider below these four
justifications, as well as the costs they will impose on the economic development and political and social freedom across the world. We
leave for a later study a crucial additional concern—the fundamental tension between data localization and trade liberalization obligations. 168
Data localization makes impossible the forms of global business that have appeared over the last two decades, allowing the provision of
information services across borders. Moreover, protectionist policies barring access to foreign services only invite reciprocal protectionism from
one’s trading partners, harming consumers and businesses alike in the process by denying them access to the world’s leading services. A.
Foreign Surveillance Beginning on June 5, 2013, the British newspaper The Guardian shocked the world
with revelations that the U.S. National Security Agency (NSA) had been secretly intercepting personal
data of individuals and dignitaries domestically and abroad. 169 Through internal records released by
Edward Snowden, a technical specialist working for the NSA, the NSA was accused of monitoring more
than thirty-five world leaders 170 and intercepting communications from more than 50,000 computer systems
worldwide. 171 Anger at disclosures of U.S. surveillance abroad has led some countries to respond by attempting to keep data from leaving
their shores, lest it fall into U.S. or other foreign governmental hands. For example, India’s former Deputy National Security Advisor, Nehchal
Sandhu, reportedly sought ways to route domestic Internet traffic via servers within the country, arguing that “[s]uch an arrangement would
limit the capacity of foreign elements to scrutinize intra-India traffic.” 172 The BRICS nations (Brazil, Russia, India, China, and South Africa) are
seeking to establish an international network of cables that would create “a network free of US eavesdropping.” 173 But does data localization
in fact stave off foreign surveillance? There are significant reasons to be skeptical of this claim. First, the United States, like many countries,
concentrates much of its surveillance efforts abroad. Indeed, the Foreign Intelligence Surveillance Act is focused on gathering information
overseas, limiting data gathering largely only when it implicates U.S. persons. 174
The recent NSA surveillance disclosures
have revealed extensive foreign operations. 175 Indeed, constraints on domestic operations may well
have spurred the NSA to expand operations abroad. As the Washington Post reports, “Intercepting communications
overseas has clear advantages for the NSA, with looser restrictions and less oversight.” 176 Deterred by a 2011 ruling by the Foreign
Intelligence Surveillance Court barring certain broad domestic surveillance of Internet and telephone traffic, 177 the NSA may have increasingly
turned its attention overseas. Second,
the use of malware eliminates even the need to have operations on the
ground in the countries in which surveillance occurs. The Dutch newspaper NRC Handelsblad reports that the NSA has
infiltrated every corner of the world through a network of malicious malware. 178 A German computer expert noted that “data was intercepted
here [by the NSA] on a large scale.” 179 The NRC Handelsblad suggests that the NSA has even scaled the Great Firewall of China, 180
demonstrating that efforts to keep information inside a heavily secured and monitored ironclad firewall do not necessarily mean that it cannot
be accessed by those on the other side of the earth. This is a commonplace phenomenon on the Internet, of course. The recent enormous
security breach of millions of Target customers in the United States likely sent credit card data of Americans to servers in Russia, perhaps
through the installation of malware on point of-sale devices in stores. 181 Third, while governments denounce foreign surveillance on behalf of
their citizens, governments routinely share clandestinely intercepted information with each other. 182The Guardian reports that Australia’s
intelligence agency collects and shares bulk data of Australian nationals with its partners—the United States, Britain, Canada, and New Zealand
(collectively known as the “5-Eyes”). 183 Even while the German government has been a forceful critic of NSA surveillance, the German
intelligence service has been described as a “prolific partner” of the NSA. 184Der Spiegel reports that the German foreign intelligence agency
Bundesnachrichtendienst (BND) has been collaborating with the NSA, passing about 500 million pieces of metadata in the month of December
2012 alone. 185 The NSA has collaborated with the effort led by the British intelligence agency Government Communications Headquarters
(GCHQ) to hack into Yahoo!’s webchat service to access unencrypted webcam images of millions of users. 186 A German computer expert
observes, “We know now that data was intercepted here on a large scale. So limiting traffic to Germany and Europe doesn’t look as promising
as the government and [Deutsche Telekom] would like you to believe.” 187 Fourth, far from making surveillance more difficult for a foreign
government, localization requirements might in fact make it easier. By compelling companies to use local services rather than global ones,
there is a greater likelihood of choosing companies with weak security measures. By their very nature, the global services are subject to intense
worldwide competition, while local services—protected by the data localization requirements—might have less need to offer stronger security
to attract customers, and fewer resources to do so, than companies with a global scale. Weaker security makes such systems easier targets for
foreign surveillance. This is what we call the “Protected Local Provider” problem. Fifth, data localization might actually facilitate foreign
surveillance. Centralizing information about users in a locality might actually ease the logistical burdens of foreign intelligence agencies, which
can now concentrate their surveillance of a particular nation’s citizens more easily. We call this the “Jackpot” problem. Finally, we note that the
United States is hardly alone in laws empowering authorities to order corporations to share data of private persons. A recent study shows that
such powers are widespread. 188 Indeed, some other states permit access to data without requiring a court order. 189 That is, one state could
require a multinational Internet service provider to store all its data on local servers, but that fact does not bar another state from requiring the
same multinational provider to turn over data on those servers. One data localization measure—South Korea’s requirement that mapping data
be stored in the country—seems especially difficult to defend. After all, under the rules, one can access South Korean maps from abroad freely,
as long as services are themselves based in South Korea. 190 Thus, if a foreigner wants to access online maps of South Korea, it simply needs to
turn to Naver and Daum, services that use servers located in that country. 191 As Yonsei University Business School Professor Ho Geun Lee
stated, should North Korea want, it can use Naver and Daum’s services to view street maps and photographs of streets. 192 In sum, as Emma
Llansó of the U.S.-based Center for Technology and Democracy warns with respect to Brazil’s attempt to block information from leaving that
country, data localization “would not necessarily keep Brazilians’ data out of the NSA’s hands.” 193 One security professional observes, “The
only way to really make anything that is NSA proof is to not have it connect to the Internet.” 194 B.
Privacy and Security Closely
related to the goal of avoiding foreign surveillance through data localization is the goal of protecting the
privacy and security of personal information against nongovernmental criminal activities. As the country
studies above show, the laws of many countries make it difficult to transfer personal data outside of national borders in the name of privacy
and security. While these laws are not explicitly designed to localize data, by creating significant barriers to the export of data, they operate as
data localization measures. The irony is that such efforts are likely to undermine, not strengthen, the privacy and security of the information.
195 First, localized data servers reduce the opportunity to distribute information across multiple servers in different locations. As we have
noted above, the information gathered together in one place offers a tempting jackpot, an ideal target for criminals. As some computer experts
have noted, “Requirements to localize data . . . only make it impossible for cloud service providers to take advantage of the Internet’s
distributed infrastructure and use sharding and obfuscation on a global scale.” 196 Sharding is the process in which rows of a database table are
held separately in servers across the world—making each partition a “shard” that provides enough data for operation but not enough to re-identify an individual. 197 “The correct solution,” Pranesh Prakash, Policy Director with India’s Centre for Internet and Society, suggests, “would
be to encourage the creation and use of de-centralised and end-to-end encrypted services that do not store all your data in one place.” 198
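The sharding idea described above can be illustrated with a minimal sketch. This is not any cited provider's implementation; the shard locations and the hash-based placement scheme are hypothetical, chosen only to show how rows of one logical table end up distributed across servers in different jurisdictions:

```python
import hashlib

# Hypothetical data-center locations; the names are illustrative only.
SHARDS = ["us-east", "eu-west", "ap-south", "sa-east"]

def shard_for(record_id: str) -> str:
    """Deterministically map a record to one of several
    geographically distributed shards via a stable hash."""
    digest = hashlib.sha256(record_id.encode("utf-8")).hexdigest()
    return SHARDS[int(digest, 16) % len(SHARDS)]

# Rows of a single logical table are scattered across servers
# worldwide, so no one location holds the complete data set.
table = {
    "user-1": {"purchase": "book"},
    "user-2": {"purchase": "phone"},
    "user-3": {"purchase": "ticket"},
}

placement: dict[str, dict] = {}
for row_id, row in table.items():
    placement.setdefault(shard_for(row_id), {})[row_id] = row
```

The point of the sketch is the article's argument in miniature: a data localization mandate would collapse `SHARDS` to a single in-country location, eliminating exactly the cross-border distribution that makes any one server an unattractive target.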
Second, as we noted above, the Protected Local Provider offering storage and processing services may be more likely to have weak security
infrastructure than companies that continuously improve their security to respond to the ever-growing sophistication of cyberthieves. As a
recent cover feature of the IEEE Computer Society magazine observes, “The most common threats to data in the cloud involve breaches by
hackers against inadequately protected systems, user carelessness or lack of caution, and engineering errors.” 199 Information technology
associations from Europe, Japan, and the United States have echoed this observation, arguing that “security is a function of how a product is
made, used, and maintained, not by whom or where it is made.” 200 When Australia was contemplating a rule requiring health data to remain
in the country (a rule that was subsequently implemented), Microsoft made a similar argument. Microsoft argued that the rule might
undermine the security of Australian health information by limiting consumer choice among potential providers and wrote, “Consumers should
have the ability to personally control their [personal electronic health records] by choosing to have their [personal electronic health records]
held by an entity not located within Australia’s territorial boundaries if they believe that entity can provide to them a service that meets their
individual needs.” 201 Indeed,
countries pushing for data localization themselves are sometimes hotbeds of
cybercrimes. According to experts, “Cyber security is notoriously weak in Indonesia.” 202 Indeed, the nation has been called a “hacker’s
paradise.” 203 One 2013 report on Vietnam suggests that “2,045 agency and business websites were hacked this year, but the number of cyber
security experts was too small to cope with all of them.” 204 Another account suggests that “Brazil is among the main targets of virtual threats
such as malware and phishing.” 205 For example, in 2011, hackers stole one billion dollars from companies in Brazil, as Forbes put it, the “worst
prepared nation to adopt cloud technology.” 206 At times, a cybertheft can begin with a domestic burglary, as in the case of one recent
European episode. 207 Or cyberthefts can be accomplished with a USB “thumb” drive. In January 2014, information about more than 100
million South Korean credit cards was stolen, likely through an “inside job” by a contractor armed with a USB drive. 208 Most fundamentally,
there is little reason to believe that the personal information of British Columbians is more secure just because it is stored on a government
computer in Vancouver than one owned by IBM, a few miles further south.

C. Economic Development

Many governments
believe that by forcing companies to localize data within national borders, they will increase investment
at home. Thus, data localization measures are often motivated, whether explicitly or not, by desires to
promote local economic development. In fact, however, data localization raises costs for local businesses, reduces access to
global services for consumers, hampers local start-ups, and interferes with the use of the latest technological advances. In an Information Age,
the global flow of data has become the lifeblood of economies across the world. While some in Europe have raised concerns about the transfer
of data abroad, the European Commission has recognized “the critical importance of data flows notably for the transatlantic economy.” 209
The Commission observes that international data transfers “form an integral part of commercial exchanges across the Atlantic including for new
growing digital businesses, such as social media or cloud computing, with large amounts of data going from the EU to the US.” 210 Worried
about the effect of constraints on data flows on both global information sharing and economic development, the Organisation for Economic Co-operation and Development (OECD) has urged nations to avoid “barriers to the location, access and use of cross-border data facilities and
functions” when consistent with other fundamental rights, in order to “ensure cost effectiveness and other efficiencies.” 211 The worry about
the impact of data localization is widely shared in the business community as well. The value of the Internet to national economies has been
widely noted. 212 Regarding Brazil’s attempt to require data localization, the Information Technology Industry Council, an industry association
representing more than forty major Internet companies, had argued that “in-country data storage requirements would detrimentally impact all
economic activity that depends on data flows.” 213 The Swedish government agency, the National Board of Trade, recently interviewed fifteen
local companies of various sizes across sectors and concluded succinctly that “trade cannot happen without data being moved from one
location to another.” 214 Data localization, like most protectionist measures, leads only to small gains for a few local enterprises and workers,
while causing significant harms spread across the entire economy. The domestic benefits of data localization go to the few owners and
employees of data centers and the few companies servicing these centers locally. Meanwhile, the harms of data localization are widespread,
felt by small, medium, and large businesses that are denied access to global services that might improve productivity. In response to Russia’s
recently passed localization law, the NGO Russian Association for Electronic Communications stressed the potential economic consequences,
pointing to the withdrawal of global services and substantial economic losses caused by the passing of similar laws in other countries. 215 For
example, besides the loss of international social media platforms, localization would make it impossible for Russians to order airline tickets or
consumer goods through online services. Localization requirements also seriously affect Russian companies like Aeroflot because the airline
depends on foreign ticket-booking systems. 216 Critics worried, at the time, that the Brazilian data localization requirement would “deny[]
Brazilian users access to great services that are provided by US and other international companies.” 217 Marilia Marciel, a digital policy expert
at Fundação Getulio Vargas in Rio de Janeiro, observes, “Even Brazilian companies prefer to host their data outside of Brazil.” 218 Data
localization affects domestic innovation by denying entrepreneurs the ability to build on top of global services based abroad. Brasscom, the
Brazilian Association of Information Technology and Communication Companies, argues that such obligations would “hurt[] the country’s ability
to create, innovate, create jobs and collect taxes from the proper use of the Internet.” 219 Governments implementing in-country data
mandates imagine that the various global services used in their country will now build infrastructure locally. Many services, however, will find it
uneconomical and even too risky to establish local servers in certain territories. 220 Data centers are expensive, all the more so if they have the
highest levels of security. One study finds Brazil to be the most expensive country in the Western hemisphere in which to build data centers.
221 Building a data center in Brazil costs $60.9 million on average, while building one in Chile and the United States costs $51.2 million and $43
million, respectively. 222 Operating such a data center remains expensive because of enormous energy and other expenses—averaging
$950,000 in Brazil, $710,000 in Chile, and $510,000 in the United States each month. 223 This cost discrepancy is mostly due to high electricity
costs and heavy import taxes on the equipment needed for the center. 224 Data centers employ few workers, with energy making up three-quarters of the costs of operations. 225 According to the 2013 Data Centre Risk Index—a study of thirty countries on the risks affecting
successful data center operations—Australia, Russia, China, Indonesia, India, and Brazil are among the riskiest countries for running data
centers. 226 Not only are there significant economic costs to data localization, the potential gains are more limited than governments imagine.
Data server farms are hardly significant generators of employment, populated instead by thousands of computers and few human beings. The
significant initial outlay they require is largely in capital goods, the bulk of which is often imported into a country. The diesel generators, cooling
systems, servers, and power supply devices tend to be imported from global suppliers. 227 Ironically, it is often American suppliers of servers
and other hardware that stand to be the beneficiaries of data localization mandates. 228 One study notes, “Brazilian suppliers of components
did not benefit from this [data localization requirement], since the imported products dominate the market.” 229 By increasing capital
purchases from abroad, data localization requirements can in fact increase merchandise trade deficits. Furthermore, large data farms are
enormous consumers of energy, 230 and thus often further burden overtaxed energy grids. They thereby harm other industries that must now
compete for this energy, paying higher prices while potentially suffering limitations in supply of already scarce power. Cost, as well as access to
the latest innovations, drives many e-commerce enterprises in Indonesia to use foreign data centers. Daniel Tumiwa, head of the Indonesian E-Commerce Association (IdEA), states that “[t]he cost can double easily in Indonesia.” 231 Indonesia’s Internet start-ups have accordingly often
turned to foreign countries such as Australia, Singapore, or the United States to host their services. One report suggests that “many of the
‘tools’ that start-up online media have relied on elsewhere are not fully available yet in Indonesia.” 232 The same report also suggests that a
weak local hosting infrastructure in Indonesia means that sites hosted locally experience delayed loading time. 233 Similarly, as the Vietnamese
government attempts to foster entrepreneurship and innovation, 234 localization requirements effectively bar start-ups from utilizing cheap
and powerful platforms abroad and potentially handicap Vietnam from “join[ing] in the technology race.” 235 Governments worried about
transferring data abroad at the same time hope, somewhat contradictorily, to bring foreign data within their borders. Many countries seek to
become leaders in providing data centers for companies operating across their regions. In 2010, Malaysia announced its Economic
Transformation Program 236 to transform Malaysia into a world-class data center hub for the Asia-Pacific region. 237 Brazil hopes to
accomplish the same for Latin America, while France seeks to stimulate its economy via a “Made in France” digital industry. 238 Instead of
spurring local investment, data localization can lead to the loss of investment. First, there’s the retaliation effect. Would countries send data to
Brazil if Brazil declares that data is unsafe if sent abroad? Brasscom notes that the Brazilian Internet industry’s growth would be hampered if
other countries engage in similar reactive policies, which “can stimulate the migration of datacenters based here, or at least part of them, to
other countries.” 239 Some in the European Union sympathize with this concern. European Commissioner for the Digital Agenda, Neelie Kroes,
has expressed similar doubts, worrying about the results for European global competitiveness if each country has its own separate Internet. 240
Then there’s the avoidance effect. Rio de Janeiro State University Law Professor Ronaldo Lemos, who helped write the original Marco Civil and
is currently Director of the Rio Institute for Technology and Society, warns that the localization provision would have caused foreign companies
to avoid the country altogether: “It could end up having the opposite effect to what is intended, and scare away companies that want to do
business in Brazil.” 241 Indeed, such burdensome local laws often lead companies to launch overseas, in order to try to avoid these rules
entirely. Foreign companies, too, might well steer clear of the country in order to avoid entanglement with cumbersome rules. For example,
Yahoo!, while very popular in Vietnam, places its servers for the country in Singapore. 242 In these ways we see that data localization mandates
can backfire entirely, leading to avoidance instead of investment. Data localization requirements place burdens on domestic enterprises not
faced by those operating in more liberal jurisdictions. Countries that require data to be cordoned off complicate matters for their own
enterprises, which must turn to domestic services if they are to comply with the law. Such companies must also develop mechanisms to
segregate the data they hold by the nationality of the data subject. The limitations may impede development of new, global services. Critics
argue that South Korea’s ban on the export of mapping data, for example, impedes the development of next-generation services in Korea:
Technology services, such as Google Glass, driverless cars, and information programs for visually-impaired users, are unlikely to develop and
grow in Korea. Laws made in the 1960s are preventing many venture enterprises from advancing to foreign markets via location/navigation
services. 243 The harms of data localization for local businesses are not restricted to Internet enterprises or to consumers denied access to
global services. As it turns out, most of the economic benefits from Internet technologies accrue to traditional businesses. A McKinsey study
estimates that about seventy-five percent of the value added created by the Internet and data flow is in traditional industries, in part through
increases in productivity. 244 The potential economic impact across the major sectors—healthcare, manufacturing, electricity, urban infrastructure, security, agriculture, retail, etc.—is estimated at $2.7 to $6.2 trillion per year. 245 This is particularly important for emerging
economies, in which traditional industries remain predominant. The Internet raises profits as well, due to increased revenues, lower costs of
goods sold, and lower administrative costs. 246 With data localization mandates, traditional businesses will lose access to the many global
services that would store or process information offshore. Data localization requirements also interfere with the most important trends in
computing today. They limit access to the disruptive technologies of the future, such as cloud computing, the “Internet of Things,” and data-driven innovations (especially those relying on “big data”). Data localization sacrifices the innovations made possible by building on top of
global Internet platforms based on cloud computing. This is particularly important for entrepreneurs operating in emerging economies that
might lack the infrastructure already developed elsewhere. And it places great impediments to the development of both the Internet of Things
and big data analytics, requiring costly separation of data by political boundaries and often denying the possibility of aggregating data across
borders. We discuss the impacts on these trends below. Cloud Computing. Data localization requirements will often prevent access to global
cloud computing services. As we have indicated, while governments assume that global services will simply erect local data server farms, such
hopes are likely to prove unwarranted. Thus, local companies will be denied access to the many companies that might help them scale up or
go global. 247 Many companies around the world are built on top of existing global services. Highly successful companies with Indian origins
such as Slideshare and Zoho relied on global services such as Amazon Web Services and Google Apps. 248 A Slideshare employee cites the
scalability made possible by the use of Amazon’s cloud services, noting, “Sometimes I need 100 servers, sometimes I only need 10.” 249 A
company like Zoho can use Google Apps, while at the same time competing with Google in higher value-added services. 250 Accessing such
global services thus allows a small company to maintain a global presence without having to deploy the vast infrastructure that would be
necessary to scale as needed. The Internet of Things. As the world shifts to Internet-connected devices, data localization will require data flows
to be staunched at national borders, requiring expensive and cumbersome national infrastructures for such devices. This erodes the promise of
the Internet of Things—where everyday objects and our physical surroundings are Internet-enabled and connected—for both consumers and
businesses. Consumer devices include wearable technologies that “measure some sort of detail about you, and log it.” 251 Devices such as
Sony’s Smartband allied with a Lifelog application to track and analyze both physical movements and social interactions 252 or the Fitbit 253
device from an innovative start-up suggest the revolutionary possibilities for both large and small manufacturers. The connected home and
wearable computing devices are becoming increasingly important consumer items. 254 A heart monitoring system collects data from patients
and physicians around the world and uses the anonymized data to advance cardiac care. 255 Such devices collect data for analysis typically on
the company’s own or outsourced computer servers, which could be located anywhere across the world. Over this coming decade, the Internet
of Things is estimated to generate $14.4 trillion in value that is “up for grabs” for global enterprises. 256 Companies are also adding Internet
sensors not just to consumer products but to their own equipment and facilities around the world through RFID tags or through other devices.
The oil industry has embraced what has come to be known as the “digital oil field,” where real-time data is collected and analyzed remotely.
257 While data about oil flows would hardly constitute personal information, such data might be controlled under laws protecting sensitive
national security information. The Internet of Things shows the risks of data localization for consumers, who may be denied access to many of
the best services the world has to offer. It also shows the risk of data localization for companies seeking to better monitor their systems around
the world. Data Driven Innovation (Big Data). Many analysts believe that data-driven innovations will be a key basis of competition, innovation,
and productivity in the years to come, though many note the importance of protecting privacy in the process of assembling ever-larger
databases. 258 McKinsey even reclassifies data as a new kind of factor of production for the Information Age. 259 Data localization threatens
big data in at least two ways. First, by limiting data aggregation by country, it increases costs and adds complexity to the collection and
maintenance of data. Second, data localization requirements can reduce the size of potential data sets, eroding the informational value that
can be gained by cross-jurisdictional studies. Large-scale, global experiments technically possible through big data analytics, especially on the
web, may have to give way to narrower, localized studies. Perhaps anonymization will suffice to comport with data localization laws and thus
still permit cross-border data flow, but this will depend on the specifics of the law.

D. Domestic Law Enforcement

Governments
have an obligation to protect their citizens, including both preventing harms and punishing those who
have committed crimes. Widespread fear of terrorist attacks in particular has led some countries to
widen surveillance efforts. The United States expanded its surveillance authority in the wake of the 2001
terrorist attacks with the USA PATRIOT Act 260 and subsequently with other measures such as the Foreign Intelligence
Surveillance Act of 1978 Amendments Act of 2008. 261 After the 2008 Mumbai attack in which the terrorists used BlackBerry devices, the
Indian government sought access to telecommunications providers’ data and asked certain
telecommunications providers to locate their servers in India to facilitate access to data by law
enforcement. 262 More recently, after the revelations of widespread NSA spying, the Internet Service Providers Association of India, which
represents India’s domestic Internet Service Providers, asked the government to require foreign internet companies to offer services in that
country through local servers, citing concerns for their consumers’ privacy. 263
France recently adopted its law on military programming, permitting certain ministries to see “electronic and digital communications” in “real time.” 264 In Vietnam, meanwhile, government officials justify Decree 72 as necessary for law
enforcement, including the enforcement of copyright laws regarding news publications and aiding
investigation of defamation on social networks. 265 After a draft of this paper was made available online, we learned that
the United States government has, on occasion, exercised its authority to review foreign investments into United States telecommunications
infrastructure to require data localization from some of the telecommunications companies. 266 The obligations seem to have arisen as part of
the informal “Team Telecom” review of such investments. Team Telecom consists of representatives from the Departments of Justice, Defense,
and Homeland Security, as well as the Federal Bureau of Investigation. 267 The inconsistent and varying nature of these obligations—
sometimes requiring only prior notice for the use of a foreign service and other times requiring data storage in the United States—suggests that
the law enforcement needs are exaggerated. There is no reason to suspect that a criminal is more likely to use one telecommunications
provider over another. Equally important, it seems unlikely that data localization will prove an effective means to ensure that data about their
residents is available to law enforcement personnel when they want it. Moreover, other alternatives are reasonably available to assist law
enforcement access to data—alternatives that are both less trade restrictive and more speech-friendly than data localization. Data localization
will not necessarily provide law enforcement better access to a criminal’s data trail because localization requirements are extremely hard to
enforce. They might simply end up driving potential wrongdoers abroad to less compliant and more secretive services. Indeed, the most law-abiding companies will follow costly data localization rules, while others will simply ignore them, comforted by the knowledge that such laws
are difficult to enforce. Any success with gaining information from these companies will likely prove temporary, as, over time, potential
scofflaws will become aware of the monitoring and turn to services that intentionally skirt the law. The services avoiding the law will likely be
foreign ones, lacking any personnel or assets on the ground against which to enforce any sanction. Thus, understood dynamically, the data
localization requirement will only hamper local and law-abiding enterprises, while driving some citizens abroad. Law enforcement is, without
doubt, a laudable goal, so long as the laws themselves do not violate universal human rights. Many governments already have authority under
their domestic laws to compel a company operating in their jurisdictions to share data of their nationals held by that company abroad. A recent
study of ten countries concluded that the government already had the right to access data held extraterritorially in the cloud in every
jurisdiction examined. 268 Although the process varied, “every single country . . . vests authority in the government to require a Cloud service
provider to disclose customer data in certain situations, and in most instances this authority enables the government to access data physically
stored outside the country’s borders.” 269 Even if companies refuse to comply with such orders, or if the local subsidiary lacks the authority to
compel its foreign counterpart to share personal data, governments can resort to information-sharing agreements. For example, the
Convention on Cybercrime, which has been ratified by forty-four countries including the United States, France, and Germany, 270 obliges
Member States to adopt and enforce laws against cybercrimes and to provide “mutual assistance” to each other in enforcing cyberoffenses.
271 Many states have entered into specific Mutual Legal Assistance Treaties (MLATs) with foreign nations. These treaties establish a process
that protects the rights of individuals yet gives governments access to data held in foreign jurisdictions. Currently, the United States has MLATs
in force with fifty-six countries. 272 The United States also entered into a Mutual Legal Assistance Agreement (MLAA) with China and Taiwan.
273 All the countries discussed in the country studies above, with the exception of Indonesia, Kazakhstan, and Vietnam, have MLAT
arrangements in force with the United States. Generally, MLATs “specify which types of requested assistance must be provided, and which may
be refused.” 274 Requests for assistance may be refused typically when the execution of such request would be prejudicial to the state’s
security or public interest; the request relates to a political offense; there is an absence of reasonable grounds; the request does not conform to
the MLAT’s provisions; or the request is incompatible with the requested state’s law. 275 The explanatory notes to the MLAT between the
United States and the European Union observe that a request for data shall only be denied on data protection grounds in “exceptional cases.”
276 At the same time, there are procedural requirements to help ensure that the information gathering is supporting a proper governmental
investigation. For example, Article 17 of the U.S.–Germany MLAT provides that the government requesting assistance must do so in writing and
must specify the evidence or information sought, authorities involved, applicable criminal law provisions, etc. 277 An effective MLAT process
gives governments the ability to gather information held on servers across the world. The International Chamber of Commerce has recognized
the crucial role of MLATs in facilitating the lawful interception of cross-border data flow and stressed the need to focus on MLATs instead of
localization measures. 278 Similarly, the European Commission has recently stressed that the rebuilding of trust in the U.S.–E.U. relationship
must focus in part on a commitment to use legal frameworks such as the MLATs. 279 Mutual cooperation arrangements are far more likely to
prove effective in the long run to support government information gathering efforts than efforts to confine information within national
borders.

E. Freedom

Information control is central to the survival of authoritarian regimes. Such regimes
require the suppression of adverse information in order to maintain their semblance of authority. This is
because “even authoritarian governments allege a public mandate to govern and assert that the
government is acting in the best interests of the people.” 280 Information that disturbs the claim of a
popular mandate and a beneficent government is thus to be eliminated at all costs. Opposition
newspapers and television stations are routinely targeted, with licenses revoked or printing presses confiscated. The
Internet has made this process of information control far more difficult by giving many dissidents the
ability to use services based outside the country to share information. The Internet has made it harder,
though not impossible, for authoritarian regimes to suppress their citizens from both sharing and
learning information. 281 Data localization will erode that liberty-enhancing feature of the Internet. The
end result of data localization is to bring information increasingly under the control of the local
authorities, regardless of whether that was originally intended. The dangers inherent in this are plain. Take the following
cases. The official motivation for the Iranian Internet, as set forth by Iran’s head of economic affairs Ali Aghamohammadi, was to create an
Internet that is “a genuinely halal network, aimed at Muslims on an ethical and moral level,” which is also safe from cyberattacks (like Stuxnet)
and dangers posed by using foreign networks. 282 However, human rights
activists believe that “based on [the
country’s] track record, obscenity is just a mask to cover the government’s real desire: to stifle dissent
and prevent international communication.” 283 An Iranian journalist agreed, “[t]his is a ploy by the
regime,” which will “only allow[] [Iranians] to visit permitted websites.” 284 More recently, even Iran’s
Culture Minister Ali Janati acknowledged this underlying motivation: “We cannot restrict the advance of
[such technology] under the pretext of protecting Islamic values.” 285 Well aware of this possibility, Internet
companies have sought at times to place their servers outside the country in order to avoid the information held therein being used to target
dissidents. Consider one example: when it began offering services in Vietnam, Yahoo! made the decision to use servers outside the country,
perhaps to avoid becoming complicit in that country’s surveillance regime. 286 This provides important context for the new Vietnamese decree
mandating local accessibility of data. While the head of the Ministry of Information’s Online Information Section defends Decree 72 as
“misunderstood” and consistent with “human rights commitments,” 287 the Committee to Protect Journalists worries that this decree will
require “both local and foreign companies that provide Internet services . . . to reveal the identities of users who violate numerous vague
prohibitions against certain speech in Vietnamese law.” 288 As Phil Robertson of Human Rights Watch argues, “This is a law that has been
established for selective persecution. This is a law that will be used against certain people who have become a thorn in the side of the
authorities in Hanoi.” 289
Case - Cybersecurity
1NC
Systems were not built to handle cyberattacks
Vicinanzo 15 (http://www.hstoday.us/briefings/daily-news-analysis/single-article/dhs-leaves-federalfacilities-open-to-cyber-attacks/5f3d0085da0b04a7918f9b41a80874c1.html, DHS Leaves Federal
Facilities Open To Cyber Attacks, high system interconnectivity = vulnerable)//A.V.
Amid reports that US Central Command’s social media accounts were attacked by hackers claiming allegiance to the Islamic State (IS), the
Government Accountability Office (GAO) issued an audit report indicating DHS
is unprepared to address the increasing
vulnerability of federal facilities to cyber attacks. GAO found the Department of Homeland Security (DHS)—the agency
responsible for protecting federal facilities—lacks a strategy to address cyber risk to building and access
control systems in federal facilities. Consequently, the nearly 9,000 federal facilities protected by Federal Protection Services (FPS)
remain vulnerable to cyber threats. “Federal facilities contain building and access control systems—computers
that monitor and control building operations such as elevators, electrical power, and heating,
ventilation, and air conditioning—that are increasingly being connected to other information systems
and the Internet,” GAO said. The GAO auditors added, “The increased connectivity heightens their
vulnerability to cyber attacks, which could compromise security measures, hamper agencies’ ability to
carry out their missions, or cause physical harm to the facilities or their occupants.” The increase in the
connectivity of these systems has also led to an increase in vulnerability to cyber attacks. For example, in 2009,
a security guard in a Dallas-area hospital uploaded malware to a hospital computer which controlled the heating, air conditioning, and
ventilation for two floors. Court documents indicate that the breach could have interfered with patient treatment, highlighting the danger
cyber intrusions pose to building and access control systems. A
cyber expert told GAO these systems were not designed
with cybersecurity in mind. “Security officials we interviewed also said that cyber attacks on systems in federal facilities could
compromise security countermeasures, hamper agencies’ ability to carry out their missions, or cause physical harm to the facilities and their
occupants,” GAO said. Sources of cyber threats to building and access control systems include corrupt employees, criminal groups, hackers and
terrorists. In particular, insiders—which include disgruntled employees, contractors or other persons abusing their positions of trust—represent
a significant threat to these systems. Cyber
incidents reported to DHS involving industrial control systems
increased from 140 to 243 incidents, a 74 percent increase, between fiscal years 2011 and 2014. Despite
this increase, DHS does not have a strategy to address cyber risk to building and access control systems.
Specifically, DHS lacks a strategy that defines the problem, identifies the roles and responsibilities, analyzes the resources needed, and
identifies a methodology for assessing cyber risk to building and access control systems. According
to GAO, “The absence of a
strategy that clearly defines the roles and responsibilities of key components within DHS has contributed
to the lack of action within the department. For example, no one within DHS is assessing or addressing cyber risk to building
and access control systems particularly at the nearly 9,000 federal facilities protected by FPS.” DHS’s failure to develop a strategy
has led to confusion among several components within DHS about their roles and responsibilities. For
example, FPS’s Deputy Director for Policy and Programs indicated FPS’s authority includes cybersecurity. However, the official said that FPS is
not assessing cyber risk because FPS does not have the expertise. Moreover, DHS
lacks clear guidance on how federal
agencies should report cybersecurity incidents. Before 2014, for instance, DHS did not specify that
information systems included industrial control systems. DHS clarified this guidance in 2014 in part because of questions
GAO auditors asked during their review. “By not assessing the risk to these systems and taking steps to reduce that
risk, federal facilities may be vulnerable to cyber attacks,” GAO said.
Backdoors aren’t the problem- system design is
Tang 15 (http://www.techtradeasia.info/2015/04/enteprrises-mostly-unprepared-for.html, Enterprises mostly unprepared for cyberattacks: Intel Security, lack of communication creates security gaps, information overload drowns out needed info)//A.V.
A new report, Tackling Attack Detection and Incident Response* from the Enterprise Strategy Group (ESG), has found that security
professionals are inundated with security incidents, averaging 78 investigations per organisation in the last year. Over a quarter (28%) of those
incidents involved targeted attacks – one of the most dangerous and potentially damaging forms of cyberattacks. According
to the IT
and security professionals surveyed in the study, commissioned by Intel Security (formerly McAfee),
better detection tools, better analysis tools, and more training on how to deal with incident response
issues are the top ways to improve the efficiency and effectiveness of the information security staff.
“When it comes to incident detection and response, time has an ominous correlation to potential damage,” said Jon Oltsik, Senior Principal
Analyst at ESG. “The
longer it takes an organisation to identify, investigate, and respond to a cyberattack, the
more likely it is that their actions won’t be enough to preclude a costly breach of sensitive data. With this in
mind, Chief Information Security Officers (CISOs) should remember that collecting and processing attack data is a means toward action - improving threat detection and response effectiveness and efficiency.” Nearly
80% of the people surveyed believe the lack
of integration and communication between security tools creates bottlenecks and interferes with their
ability to detect and respond to security threats. Real-time, comprehensive visibility is especially important for rapid response
to targeted attacks, and 37% called for tighter integration between security intelligence and IT operations tools. In addition, the top time-consuming tasks involved scoping and taking action to minimise the impact of an attack, activities that can be accelerated by integration of
tools. These
responses suggest that the very common patchwork architectures of dozens of individual
security products have created numerous silos of tools, consoles, processes and reports that prove very
time consuming to use. These architectures are creating ever greater volumes of attack data that drown
out relevant indicators of attack. Security professionals surveyed claim that real-time security visibility suffers from limited
understanding of user behaviour and network, application, and host behaviour. While the top four types of data collected are
network-related, and 30% collect user activity data, it’s clear that data capture isn’t sufficient. Users
need more help to contextualise the data to understand what behaviour is worrisome. This gap may
explain why nearly half (47%) of organisations said determining the impact or scope of a security
incident was particularly time consuming. Users understand they need help to evolve from simply collecting volumes of security
event and threat intelligence data to more effectively making sense of the data and using it to detect and assess incidents. Fifty-eight
percent said they need better detection tools, such as static and dynamic analysis tools with cloud-based intelligence to analyse files for intent. Fifty-three percent say they need better analysis tools for
turning security data into actionable intelligence. One-third called for better tools to baseline normal
system behaviour so teams can detect variances faster. People who took the survey admitted to a lack of knowledge of the
threat landscape and security investigation skills, suggesting that even better visibility through technical integration or analytical capabilities
will be inadequate if incident response teams cannot make sense of the information they see. For instance, only 45% of respondents consider
themselves very knowledgeable about malware obfuscation techniques, and 40% called
for more training to improve
cybersecurity knowledge and skills. The volume of investigations and limited resources and skills contributed to a strong desire
among respondents for help with incident detection and response. Forty-two percent reported that taking action to minimise the impact of an
attack was one of their most time-consuming tasks. Twenty-seven percent would like better automated analytics from security intelligence
tools to speed real-time comprehension; while 15% want automation of processes to free up staff for more important duties. “Just as the
medical profession must deliver heart-attack patients to the hospital within a ‘golden hour’ to maximise likelihood of survival, the security
industry must work towards reducing the time it takes organisations to detect and deflect attacks, before damage is inflicted,” said Chris Young,
General Manager at Intel Security. “This requires that we ask and answer tough questions on what is failing us, and evolve our thinking around
how we do security.” The
ESG believes that there is a hidden story within the Intel Security research that hints
at best practices and lessons learned. This data strongly suggests that CISOs: · Create a tightly-integrated
enterprise security technology architecture CISOs must replace individual security point tools with an integrated security
architecture. This strategy works to improve the sharing of attack information and cross-enterprise visibility into user, endpoint, and network
behaviour, not to mention more effective, coordinated responses
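The “baseline normal system behaviour” capability the respondents call for can be sketched with a simple z-score check: learn the mean and spread of an activity metric during quiet periods, then flag readings that deviate too far. This is an illustrative sketch only, not any vendor’s actual tool; the event counts and the three-sigma threshold are hypothetical.

```python
import statistics

def build_baseline(samples):
    """Compute mean and sample standard deviation of normal event rates."""
    return statistics.mean(samples), statistics.stdev(samples)

def is_anomalous(value, mean, stdev, threshold=3.0):
    """Flag a reading more than `threshold` standard deviations from the
    learned baseline (a simple z-score test)."""
    if stdev == 0:
        return value != mean
    return abs(value - mean) / stdev > threshold

# Hourly login counts observed during a quiet week (hypothetical data)
normal = [12, 15, 11, 14, 13, 12, 16, 14]
mean, stdev = build_baseline(normal)

print(is_anomalous(13, mean, stdev))   # a typical hour -> not flagged
print(is_anomalous(400, mean, stdev))  # a burst worth investigating
```

Real products layer far more context (user, host, application behaviour) on top of this idea, which is exactly the gap the survey identifies.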
Fundamental flaws in the system make cyberattacks on military bases and satellites
inevitable
Gonsalves 14 (http://www.csoonline.com/article/2146021/cyber-attacks-espionage/major-security-flaws-threaten-satellite-communications.html, Major security flaws threaten satellite communications)//A.V.
Security researchers find major flaws in satellite communication gear used worldwide in aeronautics, the energy and maritime industries,
emergency services and by government agencies and the military. An
analysis of satellite communication gear from more
than a half-dozen major manufacturers has uncovered critical vulnerabilities that could be exploited to
disrupt military operations and ship and aircraft communications. The flaws were found in software and
ground-based satellite systems used worldwide and manufactured by U.S.-based Harris Corp., Hughes
and Iridium Communications; U.K.-based Cobham and Inmarsat; Thuraya, headquartered in Dubai, United Arab Emirates, and the
Japan Radio Co., security firm IOActive reported in a technical white paper released this week. Satellite communication (SATCOM) networks are
critical in aeronautics, the energy and maritime industries, emergency services and the media. Government agencies and the military also
depend on such networks. From October to December 2013, IOActive researchers reverse engineered the publicly available firmware updates of SATCOM products from the manufacturers. What
the researchers found
were major vulnerabilities that could let a cyberattacker intercept, manipulate or block
communications, and in some cases, remotely take control of the physical device. The findings were serious
enough for the vendor to recommend that SATCOM manufacturers and resellers "immediately remove all publicly accessible copies of device
firmware updates from their websites, if possible, and strictly control access to updates in the future." IOActive has notified the vendors of the
flaws and is working with the government CERT Coordination Center. CERT, which stands for Computer Emergency Response Team, is a part of
the Software Engineering Institute (SEI), which is a U.S.-funded research and development center at the Carnegie Mellon University. Specific
details needed to replicate or test the vulnerabilities will not be released publicly until the second half of the year to give the vendors time to
develop patches for their products. So far, only Iridium was working on a fix, Cesar Cerrudo, chief technology officer for IOActive Labs, said
Friday. "Government
agencies are aware of the situation, but we don't know how hard they are pressuring
vendors to get the vulnerabilities fixed." The classes of vulnerabilities uncovered by IOActive included
hardcoded credentials, undocumented protocols, insecure protocols and backdoors. Many of the problems
were discovered in Broadband Global Area Network satellite receivers. BGAN is an Internet and voice network often used in military operations.
The system was used in efforts to locate the Malaysian passenger plane that crashed last month. The equipment analyzed was also used in
accessing Inmarsat-C and FleetBroadband, both maritime communication systems; SwiftBroadband, an IP-based data and voice aeronautical
system that has been approved by the International Civil Aviation Organization (ICAO) for aircraft safety services; and Classic Aero Service, an
aeronautical system used for voice, fax and data services. To
exploit the vulnerabilities, an attacker would have to first
compromise or gain physical access to a PC connected to one of the above networks, Cerrudo said. Once in the control of the attacker, the computer could then be used to compromise
vulnerable devices without needing a user name or password. "The impact will depend on the scenario, if the devices are
compromised when they are really needed then the impact would be bigger and maybe cause
accidents," Cerrudo said.
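Of the vulnerability classes IOActive names, hardcoded credentials are the simplest to see: the secret ships inside every firmware image, so one reverse-engineered binary unlocks the whole fleet. A minimal sketch of that flaw class; the username and password here are hypothetical, not taken from any real device.

```python
# Illustrative sketch of the "hardcoded credentials" vulnerability class.
HARDCODED_USER = "admin"       # identical on every shipped unit
HARDCODED_PASS = "factory123"  # recoverable by anyone who dumps the firmware

def firmware_login(user: str, password: str) -> bool:
    """A vulnerable check: one fixed secret baked into the binary,
    so extracting a single firmware image unlocks every device."""
    return user == HARDCODED_USER and password == HARDCODED_PASS

# An attacker who reverse engineers one public firmware update can now
# authenticate to any deployed device without guessing anything:
print(firmware_login("admin", "factory123"))
```

This is why the researchers urged vendors to pull public firmware downloads: the updates themselves leak the keys.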
Britain is putting backdoors into products
Geller 15 (May 28, http://www.dailydot.com/politics/uk-surveillance-backdoors-law-tech-companies/, The U.K. is about to force tech companies to decrypt protected user data)//A.V.
The British government is about to give its spy agencies the power to force technology companies to
decrypt their customers' protected data, marking the first move by a major Western power to mandate
the use of so-called "backdoors" in commercial technology. "Under the proposed new powers," the Telegraph reports,
"the spy agencies will be able to obtain a warrant from the Home Secretary that will oblige an internet
companies [sic] to break down its encryption protection on a suspect and allow access to his or her
communications." In order to comply with such an order, tech companies will need to build backdoors in their
services' encryption. Backdoors provide universal access to encrypted data, bypassing the encryption solutions that companies
advertise. They are the software equivalent of the master keys that apartment supervisors use to access any unit in a building. Security experts
universally condemn the implementation of backdoors, calling them serious vulnerabilities in encrypted products and warning that they make
tempting targets for hackers. A common refrain from the security and privacy communities is that, in the words of Sen. Ron Wyden (D-Ore.),
"There’s no such thing as a magic door that can only be used by the good people for worthwhile reasons." "Backdoors
and other
government efforts to weaken encryption undermine the security of the Internet for everyone," said Alex
Abdo, a staff attorney at the American Civil Liberties Union. "Strong encryption is especially important for those most at risk of governmental
suppression, such as journalists, dissidents, and human-rights activists. In an era of mass surveillance and crippling cyberattacks, strong
encryption is more important than ever." Drew Mitnick, policy counsel at the international privacy group Access, agreed. "The
U.K.
proposals to expand surveillance powers are an affront to the rights to privacy and expression and
create entirely new risks to the security of everyday Internet users," Mitnick said. "Requiring companies to
build encryption backdoors makes users and the technology they depend on more vulnerable to
malicious attacks. Law enforcement has many tools to compel the production of necessary information without weakening digital
security or limiting fundamental rights."
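The “master key” analogy in the evidence can be made concrete with a toy key-escrow construction: each user has their own key, but the provider wraps every user key under one escrow key, so whoever holds that single key can read anyone’s data. This is a deliberately simplified sketch (a toy XOR stream cipher, not a real one); all names and keys are hypothetical.

```python
import hashlib

def keystream(key: bytes, length: int) -> bytes:
    """Derive a repeatable keystream from a key (toy construction,
    only to illustrate the escrow problem, not for real use)."""
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + counter.to_bytes(4, "big")).digest()
        counter += 1
    return out[:length]

def encrypt(key: bytes, plaintext: bytes) -> bytes:
    return bytes(a ^ b for a, b in zip(plaintext, keystream(key, len(plaintext))))

decrypt = encrypt  # XOR is its own inverse

escrow_key = b"master-escrow-key"           # the "master key" to the building
user_key = b"alice-personal-key"
wrapped = encrypt(escrow_key, user_key)     # stored alongside the user's data

ciphertext = encrypt(user_key, b"private message")

# Whoever holds the escrow key can recover ANY user's key, then the data:
recovered_key = decrypt(escrow_key, wrapped)
print(decrypt(recovered_key, ciphertext))
```

The security critique in the card follows directly: the escrow key is a single point of failure, and anyone who steals it inherits the same universal access.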
No impact to military cyberattacks- even the worst attack on US military networks in 2008 caused no lasting damage
Alternet 9 (Worst cyber attack on US military came via flash drive: US, http://www.alternet.org/rss/breaking_news/271139/worst_cyber_attack_on_us_military_came_via_flash_drive%3A_us)//A.V.
The most serious cyber attack on the US military's networks came from a tainted flash drive in 2008,
forcing the Pentagon to review its digital security, a top US defense official said Wednesday. The thumb drive, which was inserted in a military laptop in the Mideast, contained
malicious code that "spread undetected on both classified and unclassified systems, establishing what
amounted to a digital beachhead, from which data could be transferred to servers under foreign
control," Deputy Defense Secretary William Lynn wrote in the journal Foreign Affairs. The code was placed on the drive by "a foreign
intelligence agency," Lynn wrote. "It was a network administrator's worst fear: a rogue program operating
silently, poised to deliver operational plans into the hands of an unknown adversary." Previous media
reports speculated that the attack may have originated from Russia. The Pentagon had never openly discussed the
incident, but Lynn chose to reveal the details of the attack as officials try to raise public awareness of the growing threat posed to government
computer networks. The incident served as a wake-up call for the Pentagon and prompted major changes in how the department handled digital
threats, including the formation of a new cyber military command, Lynn said. After the 2008 assault, the Pentagon banned its work force from
using flash drives, but recently eased the prohibition. Since the attack, the military has developed methods to uncover intruders inside its
network, or so-called "active defense systems," according to Lynn. But
he added that drafting rules of engagement for
defending against cyber attack was "not easy," as the laws of war were written before the advent of a
digital battlefield.
Big Advantage flaw- no evidence the NSA has backdoors on satellites or our own military systems- backdoors were only put in US Silicon Valley tech companies
Mccullagh 13 (http://www.cnet.com/news/nsa-has-backdoor-access-to-internet-companies-databases/, NSA has backdoor access to Internet companies' databases, Declan McCullagh is the chief
political correspondent for CNET. You can e-mail him or follow him on Twitter as declanm. Declan
previously was a reporter for Time and the Washington bureau chief for Wired and wrote the Taking
Liberties section and Other People's Money column for CBS News' Web site. See full bio)//A.V.
A top-secret surveillance program gives the National Security Agency surreptitious access to customer
information held by Microsoft, Yahoo, Apple, Google, Facebook, and other Internet companies,
according to a pair of new reports. The program, code-named PRISM, reportedly allows NSA analysts to
peruse exabytes of confidential user data held by Silicon Valley firms by typing in search terms. PRISM reports have
been used in 1,477 items in President Obama's daily briefing last year, according to an internal presentation to the NSA's Signals Intelligence
Directorate obtained by the Washington Post and the Guardian newspapers.
Heg isn’t key to peace
Christopher J. Fettweis 11, Department of Political Science, Tulane University, 9/26/11, Free Riding or Restraint? Examining European
Grand Strategy, Comparative Strategy, 30:316–332, EBSCO
Assertions that without the combination of U.S. capabilities, presence and commitments instability would
return to Europe and the Pacific Rim are usually rendered in rather vague language. If the United States were to decrease
its commitments abroad, argued Robert Art, “the world will become a more dangerous place and, sooner or later, that will
redound to America’s detriment.”53 From where would this danger arise? Who precisely would do the fighting,
and over what issues? Without the United States, would Europe really descend into Hobbesian anarchy?
Would the Japanese attack mainland China again, to see if they could fare better this time around? Would the Germans and
French have another go at it? In other words, where exactly is hegemony keeping the peace? With one exception,
these questions are rarely addressed. That exception is in the Pacific Rim. Some analysts fear that a de facto surrender of U.S.
hegemony would lead to a rise of Chinese influence. Bradley Thayer worries that Chinese would become “the language of diplomacy, trade and
commerce, transportation and navigation, the internet, world sport, and global culture,” and that Beijing would come to “dominate science and
technology, in all its forms” to the extent that soon the world would witness a Chinese astronaut who not only travels to the Moon, but “plants the communist flag on Mars, and perhaps other planets in the future.”54 Indeed China is the only other major power that has increased its
military spending since the end of the Cold War, even if it still is only about 2 percent of its GDP. Such levels of effort do not suggest a desire to
compete with, much less supplant, the United States. The much-ballyhooed, decade-long
military buildup has brought
Chinese spending up to somewhere between one-tenth and one-fifth of the U.S. level. It is hardly
clear that a restrained United States would invite Chinese regional, much less global, political expansion. Fortunately
one need not ponder for too long the horrible specter of a red flag on Venus, since on the planet Earth, where war is no longer the dominant
form of conflict resolution, the threats posed by even a rising China would not be terribly dire. The dangers contained in the terrestrial security
environment are less severe than ever before.
Believers in the pacifying power of hegemony ought to keep in mind a rather basic tenet: When it comes to policymaking, specific threats are more significant than vague, unnamed dangers. Without specific risks, it is just as plausible to interpret U.S. presence as redundant, as overseeing a peace that has already arrived.
Strategy should not be based upon vague images emerging from the dark reaches of the
neoconservative imagination. Overestimating Our Importance One of the most basic insights of cognitive
psychology provides the final reason to doubt the power of hegemonic stability: Rarely are our actions
as consequential upon their behavior as we perceive them to be. A great deal of experimental evidence exists to
support the notion that people (and therefore states) tend to overrate the degree to which their behavior is
responsible for the actions of others. Robert Jervis has argued that two processes account for this overestimation, both
of which would seem to be especially relevant in the U.S. case.55 First, believing that we are responsible for their actions
gratifies our national ego (which is not small to begin with; the United States is exceptional in its exceptionalism). The hubris of the
United States, long appreciated and noted, has only grown with the collapse of the Soviet Union.56 U.S. policymakers famously
have comparatively little knowledge of—or interest in—events that occur outside of their own
borders. If there is any state vulnerable to the overestimation of its importance due to the
fundamental misunderstanding of the motivation of others, it would have to be the United States.
Second, policymakers in the United States are far more familiar with our actions than they are with the decision-making processes of our allies.
Try as we might,
it is not possible to fully understand the threats, challenges, and opportunities that our
allies see from their perspective. The European great powers have domestic politics as complex as ours, and they also have
competent, capable strategists to chart their way forward. They react to many international forces, of which U.S.
behavior is only one. Therefore, for any actor trying to make sense of the action of others, Jervis notes, “in the absence of strong
evidence to the contrary, the most obvious and parsimonious explanation is that he was responsible.”57 It is natural, therefore, for U.S.
policymakers and strategists to believe that the behavior of our allies (and rivals) is shaped largely by what
Washington does. Presumably Americans are at least as susceptible to the overestimation of their ability as any other people, and
perhaps more so. At the very least, political psychologists tell us, we are probably not as important to them as we think.
The importance of U.S. hegemony in contributing to international stability is therefore almost
certainly overrated . In the end, one can never be sure why our major allies have not gone to, and do not even plan for, war. Like
deterrence, the hegemonic stability theory rests on faith; it can only be falsified, never proven . It does not
seem likely, however, that hegemony could fully account for twenty years of strategic decisions made in allied capitals if the international
system were not already a remarkably peaceful place. Perhaps these states have no intention of fighting one another to begin with, and our commitments are redundant.
Ext Systems too weak
Extend that regardless of backdoors, infrastructure will be destroyed by cyberattacks- 1NC Vicinanzo 15 says too much interconnectivity collapses the system quickly and these systems were not built with attacks in mind; Tang 15 indicates that the lack of security integration drowns out warnings of cyberattacks, creating holes for attacks
3 More alt causes
Hoffman 15 (Jan 27, http://www.energypost.eu/vulnerability-electric-utility-system-cyber-attacks/, The vulnerability of our electric utility system to cyber attacks, more cyberhackers, increased countermeasure costs, no manpower)//A.V.
Is it difficult to provide this cyber protection? The simple answer is yes, for several reasons: the growing
numbers of wireless networks and cyber hackers, the cost of counteracting malicious hacking, the lack
of newly trained professionals to address the hacking issue, and what I have long considered a major
problem: the inability to focus enough attention on cyber security issues. Capable hackers Let me
discuss each of the barriers in order. Wireless networking is growing because it offers many advantages
– reduced wiring requirements and related costs, remote operation and reduced manpower
requirements, ability to monitor more variables continuously and control systems to a finer degree.
Disadvantages arise when inadequate attention is paid to preventing hacker penetration into the
network, thus allowing disruption of normal operations or allowing hackers to take control of the
network. Also, the number of capable hackers is increasing rapidly. Many schemes have been proposed
for restricting unauthorized access to a network, usually using passwords, but often these passwords are
not adequate to stop an experienced hacker and most people are resistant to remembering long,
complicated passwords. Many companies are also not yet convinced of the need to spend the money on
sophisticated protection systems, and some may see the consequences of a hacking as less costly than
the required investment. At some level we can all relate to this mindset. Costs are inherent in any attempt to prevent hacking, ranging
from software and hardware costs to labor costs. There is some indication that SONY, an electronics
company, spent too little on protection costs by underestimating the potential threat to its cyber
systems. It is a mistake it won’t make again, and should serve as a wake-up call to other corporate and
government bodies. The trained manpower issue is a critical one. As a vice president of Oracle
Corporation noted in Congressional testimony: the vast majority of people available today to address
cyber security issues are the ones who designed and implemented the current vulnerable information
technology system. Should they be the ones to try and fix it, or do we need newly-trained cyber experts
who are not so closely linked to today’s operating modes? Clearly there are people who have the
requisite high level skills – think NSA – but are they available broadly on a global basis? Looks like a good
field to get into as soon as possible.
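Hoffman’s point that passwords “are not adequate to stop an experienced hacker” is easy to quantify with back-of-the-envelope entropy arithmetic. The sketch below is illustrative; the 10-billion-guesses-per-second offline attack rate is a hypothetical figure, not a measured one.

```python
import math

def entropy_bits(alphabet_size: int, length: int) -> float:
    """Bits of entropy for a password chosen uniformly at random."""
    return length * math.log2(alphabet_size)

def avg_crack_seconds(bits: float, guesses_per_second: float) -> float:
    """Expected brute-force time: on average, half the keyspace is searched."""
    return (2 ** bits) / 2 / guesses_per_second

# 8 lowercase letters vs. 16 characters drawn from the full printable set;
# 1e10 guesses/second is a hypothetical offline attack rate.
weak = entropy_bits(26, 8)     # ~37.6 bits
strong = entropy_bits(94, 16)  # ~104.9 bits

print(round(avg_crack_seconds(weak, 1e10)))    # ~10 seconds
print(avg_crack_seconds(strong, 1e10) > 1e12)  # effectively uncrackable
```

The arithmetic explains the tension the card describes: only long, complex passwords resist offline attacks, and those are exactly the ones users resist remembering.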
Cyberattackers will use more sophisticated tactics to attack servers regardless of
backdoors
Lever 14 (http://www.businessinsider.com/afp-cyberattacks-to-worsen-in-2015-mcafee-researchers-2014-12, Cyberattacks Are Just Going To Get Worse From Here)//A.V.
Washington (AFP) - A series of spectacular cyberattacks drew headlines this year, and the situation will
only worsen in 2015 as hackers use more advanced techniques to infiltrate networks, security researchers said
Tuesday. McAfee Labs' 2015 Threats Predictions report sees increased cyber-warfare and espionage, along
with new strategies from hackers to hide their tracks and steal sensitive data. "Cyber espionage attacks
will continue to increase in frequency," the report said. "Long-term players will become stealthier
information gatherers, while newcomers will look for ways to steal money and disrupt their
adversaries." McAfee said small nations and terror groups will become even more active and will "attack by launching crippling distributed
denial of service attacks or using malware that wipes the master boot record to destroy their enemies' networks." At the same time,
cybercriminals will use better methods to remain hidden on a victim's network, to carry out long-term
theft of data without being detected, the researchers said. "In this way, criminals are beginning to look
and act more like sophisticated nation-state cyberespionage actors, who watch and wait to gather
intelligence," the report said. The report also said hackers are looking to target more connected devices, including computers in the
farming, manufacturing, and health care sectors. "The number and variety of devices in the Internet of Things (IoT) family is growing
exponentially. In the consumer space, they are now seen in appliances, automobiles, home automation, and even light bulbs," McAfee said.
Ext Satellites and military vulnerable
Computer dependency fundamentally hurts satellite security- aff won't solve that
Stuart 15 (http://www.ft.com/cms/s/0/659ab77e-c276-11e4-ad89-00144feab7de.html#axzz3hb6C1a7o, Comment: Satellite industry must invest in cyber security)//A.V.
In 2007 the Sri Lankan government noticed that propaganda for the Tamil Tigers, the rebel group, was being broadcast regionally from an
Intelsat satellite over the Indian Ocean. This was not an unconventional business arrangement between the US satellite communications
company and the rebels, but the result of a cyber attack. Two years earlier, Tamil fighters had hacked the satellite and proceeded to use its
signal sporadically for their own political purposes. Events
targeting satellites are not isolated and they are likely to
increase. The industry needs to spend money on security, to avoid both potential larger financial loss
and threats to safety. There are some 1,000 functioning satellites orbiting the Earth relaying and amplifying information sent through
radio frequencies from one point on Earth to another. They form part of the infrastructure that we rely on for safety and quality of life. Many
people do not realise how pervasive satellite services are in contemporary society. Atomic clocks on global positioning system (GPS) satellites
allow the financial industry to co-ordinate trading across multiple time zones. Aircraft, ships, cars and military personnel use navigation
satellites and broadband internet can reach rural locations and moving objects such as trains. Telecommunications satellites provide audio and
video connections. Earth observation satellites provide imagery used by the military and governments in early warning weather systems and to
monitor the environment. Given
that satellites are computer-dependent, they are susceptible to cyber security
attacks. Two components of the satellite infrastructure are vulnerable: the ground-based components
and on-board computers. Attackers may be individuals, organisations or hostile governments. Activity could be politically motivated
(sabotage, espionage, censorship, propaganda, terrorism) or financially driven (industrial competition, theft of data or services). Hackers could
simply be rogue thrill seekers or vandals. One can imagine the consequences of significant interference with satellites. The outcome of a denial-of-service attack or manipulation of location data that affects airline navigation systems could be fatal. Disrupting global transactions by
targeting satellites could have serious economic outcomes, including the freezing of leading trading hubs. Jamming (intentionally blocking or
interfering with a signal) can range from being an inconvenience to a grave concern, for example denying satellite information to military
personnel. Where jamming disrupts commercial services, companies risk reputational erosion and consumer backlash. This is a sector with
estimated global revenue of $195.2bn in 2013, according to the Satellite Industry Association. Hackers can potentially commandeer a satellite
to distribute their content or to manoeuvre the hardware in a way that disables it — effectively turning it into a piece of space junk. (The latter
event has yet to occur, but hackers did take control of a Nasa Terra Earth Observation satellite in 2008.) Are
hacking and jamming
scenarios a realistic concern? Experience and recent research suggests that the risk is relatively high. The
box below lists only a selection of known incidents. A report by IOActive, a security consultancy, in 2014,
found that vulnerabilities remain across many services, but that the satellite industry has been reluctant
to respond.
Ext Foreign Backdoors
Russia putting backdoors into companies
Victor 15 (june 1, http://www.phonearena.com/news/Surprise-surprise-Russian-YotaPhone-comeswith-a-backdoor-for-the-KGB-to-spy-on-you-Update-Russian-court-order-required_id69930, Surprise,
surprise: Russian YotaPhone comes with a backdoor for the KGB to spy on you (Update: Russian court
order required)//A.V.
Hey, color us surprised: the Russian YotaPhone, an Android smartphone best known for having a
secondary rear-facing e-ink display in addition to the regular color front display, has another special
feature: a backdoor that the former KGB (now known as FSB, or Federal Security Service) can use to spy
on customers' private affairs. What's particularly shocking is that the admission comes in perfect honesty and good will from Sergey
Chemezov, the head of Russian Government-sponsored Rostech corporation that is making the phone. "The FSB will have access [to
users' information]. We don't have the right to sell phones on the market in any other way - otherwise,
the devices could be used by terrorists, criminals," said Chemezov. Chemezov
doesn't skip on the opportunity of trashing Apple as well,
claiming the iPhone has become 'the choice of terrorists'. In an epic climax of broken logic and hypocrisy, Chemezov then
goes on to say that equipping the phone with a backdoor is necessary so that the phone doesn't become a choice of terrorists. Like, you know,
that irresponsible, terrorist-prone iPhone (singled out by Chemezov himself) and other devices that ship with system-wide encryption and
without an actual backdoor for governments to use (at least not with one that is publicly admitted by Apple, Google, or government officials).
France is installing backdoors
Geniar 15 (May 5 , https://ma.ttias.be/french-law-forces-backdoors-on-isps/, French Law Forces
Backdoors On ISPs)//A.V.
This law just got approved in France. 438 votes in favor, 86 against, 42 abstained. The original, French, version is here:
ASSEMBLÉE NATIONALE 2669. A Google-translated version reads as follows. Article 6 [...] It is also stated that operators and
service providers will, if necessary, be able to observe the provisions governing the secrecy of national
defense. Finally, Article L. 871-4 provides that CNCTR the members and agents can penetrate, for control purposes, on the premises of
operators and service providers. Article 7 also moves, adapting in the new Book VIII of the Code of internal security of existing criminal
provisions, including the fact that repress from revealing that information technology is implemented or refusal to transmit login data whose
collection has been authorized. Every
ISP or hosting provider in France should be worried. OVH, one of the
biggest hosting providers in the world, was already threatening to leave France. If they're looking to
store their servers physically nearby, maybe we can partner up. But all kidding aside, I'm curious what they'll do now,
since the law has been approved. A translated post on OVH's statement shows some of the real dangers of this law. ... Requiring
French hosts to accept real-time capture of connection data and the establishment of "black boxes" with
blurred in infrastructure, means giving French intelligence services access and visibility into all data
traveling over networks. This unlimited access insinuate doubt among customers of hosting providers on
the use of these "black boxes" and the protection of their personal data. OVH.com We can all argue in favor of end-to-end encryption, SSL everywhere, ... but that doesn't change the fact that your government is forcing the internet providers
in your country to install and maintain a backdoor, so French law enforcement can intervene and spy at
their own choosing. It's a sad day for France and a sad day for ISPs and hosting providers in general.
China is putting backdoors in to strengthen security against terrorists
Leyden 15 (http://www.theregister.co.uk/2015/03/05/obama_criticises_china_tech_rules_backdoor_terrorism/, Obama criticises China's mandatory backdoor tech import rules)//A.V.
US prez Barack Obama has criticised China's new tech rules, urging the country to reverse the policy if it wants a business-as-usual situation
with the US to continue. As previously reported,
proposed new regulations from the Chinese government would
require technology firms to create backdoors and provide source code to the Chinese government
before technology sales within China would be authorised. China is also asking that tech companies
adopt Chinese encryption algorithms and disclose elements of their intellectual property. The new
requirements, laid out in a 22-page document approved late last year, are supposedly geared towards
strengthening the cyber security of critical Chinese industries and guarding against terrorism. In an
interview with Reuters, Obama said Beijing's far-reaching counter-terrorism law would require
technology firms to hand over encryption keys as well as installing "backdoors" into systems, thus
granting Chinese authorities access in the process. "We have made it very clear that this is something they are going to have
to change if they are to do business with the United States," Obama said. "This is something that I’ve raised directly with President Xi." The
proposed laws "would essentially force all foreign companies, including US companies, to turn over to
the Chinese government mechanisms where they can snoop and keep track of all the users of those
services," Obama added. "As you might imagine, tech companies are not going to be willing to do that," he said. Aside from user privacy
concerns, Western business groups such as the US Chamber of Commerce have criticised China's policies as protectionist. The proposed rules
extend the scope of recently adopted financial industry regulations that effectively encouraged Chinese banks to buy from domestic technology
vendors. The
Chinese government is pushing these anti-terrorism rules as vital in protecting state and
business secrets. The disagreement marks another cyber security and technology policy difference between US and China, with relations
not yet healed from ongoing complaints about Chinese cyber espionage and the Snowden revelations.
EXT Heg D
Extend Fettweis- Heg isn't key to peace; their authors overestimate the importance
of the US in stopping conflict and can only rest on faith because their theory is
falsifiable but not provable- their war scenarios are too vague and rely on unique
circumstances in the past that are inapplicable now.
U.S. primacy isn’t key to peace---their data is flawed
Christopher Preble 10, director of Foreign Policy Studies at the CATO Institute, August 3, 2010, “U.S.
Military Power: Preeminence for What Purpose?,” online: http://www.cato-at-liberty.org/u-s-militarypower-preeminence-for-what-purpose/
Most in Washington still embraces the notion that America is, and forever will be, the world’s
indispensable nation. Some scholars, however, questioned the logic of hegemonic stability
theory from the very beginning. A number continue to do so today. They advance arguments
diametrically at odds with the primacist consensus. Trade routes need not be policed by a
single dominant power; the international economy is complex and resilient. Supply
disruptions are likely to be temporary, and the costs of mitigating their effects should be
borne by those who stand to lose — or gain — the most. Islamic extremists are scary, but
hardly comparable to the threat posed by a globe-straddling Soviet Union armed with
thousands of nuclear weapons. It is frankly absurd that we spend more today to fight Osama
bin Laden and his tiny band of murderous thugs than we spent to face down Joseph Stalin
and Chairman Mao. Many factors have contributed to the dramatic decline in the number of
wars between nation-states; it is unrealistic to expect that a new spasm of global conflict would
erupt if the United States were to modestly refocus its efforts, draw down its military power,
and call on other countries to play a larger role in their own defense, and in the security of
their respective regions.
But while there are credible alternatives to the United States serving in its current dual role as
world policeman / armed social worker, the foreign policy establishment in Washington has
no interest in exploring them. The people here have grown accustomed to living at the center
of the earth, and indeed, of the universe. The tangible benefits of all this military spending
flow disproportionately to this tiny corner of the United States while the schlubs in fly-over
country pick up the tab.
Empirics devastate them
Christopher J. Fettweis 11, Department of Political Science, Tulane University, 9/26/11, Free Riding or
Restraint? Examining European Grand Strategy, Comparative Strategy, 30:316–332, EBSCO
It is perhaps worth noting that there is no evidence to support a direct relationship between the relative level of U.S. activism and
international stability. In fact, the limited data we do have suggest the opposite may be true. During the 1990s,
the United States cut back on its defense spending fairly substantially. By 1998, the United States was spending $100
billion less on defense in real terms than it had in 1990.51 To internationalists, defense hawks and believers in
hegemonic stability, this irresponsible “peace dividend” endangered both national and global security. “No serious analyst of American military capabilities,” argued
Kristol and Kagan, “doubts that the defense budget has been cut much too far to meet America’s responsibilities to itself and to world peace.”52 On the other hand, if the pacific trends were not
based upon U.S. hegemony but a strengthening norm against interstate war, one would not have
expected an increase in global instability and violence. The verdict from the past two decades is fairly plain: The world
grew more peaceful while the United States cut its forces. No state seemed to believe that its security was
endangered by a less-capable United States military, or at least none took any action that would suggest such a belief.
No militaries were enhanced to address power vacuums, no security dilemmas drove insecurity or arms
races, and no regional balancing occurred once the stabilizing presence of the U.S. military was
diminished. The rest of the world acted as if the threat of international war was not a pressing concern,
despite the reduction in U.S. capabilities. Most of all, the United States and its allies were no less safe. The incidence and magnitude of
global conflict declined while the United States cut its military spending under President Clinton, and kept declining
as the Bush Administration ramped the spending back up. No complex statistical analysis should be necessary to reach
the conclusion that the two are unrelated. Military spending figures by themselves are insufficient to disprove a connection between overall U.S. actions and international
stability. Once again, one could presumably argue that spending is not the only or even the best indication of hegemony, and that it is instead U.S. foreign political and security commitments that maintain stability. Since neither
was significantly altered during this period, instability should not have been expected. Alternately, advocates of hegemonic stability could believe that relative rather than absolute spending is decisive in bringing peace. Although
the United States cut back on its spending during the 1990s, its relative advantage never wavered. However, even if it is true that either U.S. commitments or relative spending account for global pacific trends, then at the very least
stability can evidently be maintained at drastically lower levels of both. In other words, even if one can be allowed to argue in the alternative for a moment and suppose that there is in fact a level of engagement below which the
United States cannot drop without increasing international disorder, a rational grand strategist would still recommend cutting back on engagement and spending until that level is determined. Grand strategic decisions are never
final; continual adjustments can and must be made as time goes on. Basic logic suggests that the United States ought to spend the minimum amount of its blood and treasure while seeking the maximum return on its investment.
And if the current era of stability is as stable as many believe it to be, no increase in conflict would ever occur irrespective of U.S. spending, which would save untold trillions for an increasingly debt-ridden nation. It is also perhaps
worth noting that if opposite trends had unfolded, if other states had reacted to news of cuts in U.S. defense spending with more aggressive or insecure behavior, then internationalists would surely argue that their expectations had been fulfilled. If increases in conflict would have been interpreted as proof of the wisdom of internationalist strategies, then logical consistency demands that the lack thereof should at least pose a problem. As it stands, the only evidence we have regarding the likely systemic reaction to a more restrained United States suggests that the current peaceful trends are unrelated to U.S. military spending. Evidently the rest of the world can operate quite effectively without the presence of a global policeman. Those who think otherwise base their view on faith alone.
AT: Transition Wars
No transition wars
Parent 11—assistant prof of pol sci, U Miami. PhD in pol sci, Columbia—and—Paul MacDonald—assistant prof of pol sci, Williams (Joseph, Graceful Decline?: The Surprising Success of Great Power Retrenchment, Intl. Security, Spring 2011, p. 7)
Some observers might dispute our conclusions, arguing that hegemonic transitions are more conflict prone than other moments of
acute relative decline. We counter that there are deductive and empirical reasons to doubt this argument. Theoretically,
hegemonic powers should actually find it easier to manage acute relative decline. Fallen hegemons still
have formidable capability, which threatens grave harm to any state that tries to cross
them. Further, they are no longer the top target for balancing coalitions, and recovering hegemons
may be influential because they can play a pivotal role in alliance formation. In addition,
hegemonic powers, almost by definition, possess more extensive overseas commitments; they should be able to
more readily identify and eliminate extraneous burdens without exposing vulnerabilities or
exciting domestic populations.
We believe the empirical record supports these conclusions. In particular, periods of hegemonic transition do not appear more conflict prone than those of acute decline. The last reversal at the pinnacle of power was the Anglo-American transition, which took place around 1872 and was resolved without armed confrontation. The tenor of that transition may have been influenced by a number of factors: both states were democratic maritime empires, the United States was slowly emerging from the Civil War, and Great Britain could likely coast on a large lead in domestic capital stock. Although China and the United States differ in regime type, similar factors may work to cushion the impending Sino-American transition. Both are large, relatively secure continental great powers, a fact that mitigates potential geopolitical competition. 93 China faces a variety of domestic political challenges, including strains among rival regions, which may complicate its ability to sustain its economic performance or engage in foreign policy adventurism. 94
No transition wars---best empirics and theory
Paul K. MacDonald 11, Assistant Professor of Political Science at Williams College, and Joseph M.
Parent, Assistant Professor of Political Science at the University of Miami, Spring 2011, “Graceful
Decline?: The Surprising Success of Great Power Retrenchment,” International Security, Vol. 35, No. 4, p.
7-44
Our findings are directly relevant to what appears to be an impending great power transition between China
and the United States. Estimates of economic performance vary, but most observers expect Chinese GDP to surpass U.S. GDP sometime in
the next decade or two.91 This prospect has generated considerable concern. Many scholars foresee major conflict during a
Sino-U.S. ordinal transition. Echoing Gilpin and Copeland, John Mearsheimer sees the crux of the issue as irreconcilable goals: China
wants to be America's superior and the United States wants no peer competitors. In his words, "[N]o amount [End Page 40] of goodwill can
ameliorate the intense security competition that sets in when an aspiring hegemon appears in Eurasia."92 Contrary
to these
predictions, our analysis suggests some grounds for optimism. Based on the historical track record of great
powers facing acute relative decline, the United States should be able to retrench in the coming decades. In the next few
years, the United States is ripe to overhaul its military, shift burdens to its allies, and work to decrease costly
international commitments. It is likely to initiate and become embroiled in fewer militarized disputes
than the average great power and to settle these disputes more amicably. Some might view this prospect
with apprehension, fearing the steady erosion of U.S. credibility. Yet our analysis suggests that retrenchment
need not signal weakness. Holding on to exposed and expensive commitments simply for the sake of one's
reputation is a greater geopolitical gamble than withdrawing to cheaper, more defensible frontiers. Some observers
might dispute our conclusions, arguing that hegemonic transitions are more conflict prone than other moments of acute
relative decline. We counter that there are deductive and empirical reasons to doubt this argument. Theoretically,
hegemonic powers should actually find it easier to manage acute relative decline. Fallen hegemons still
have formidable capability, which threatens grave harm to any state that tries to cross them. Further, they
are no longer the top target for balancing coalitions, and recovering hegemons may be influential because they can
play a pivotal role in alliance formation. In addition, hegemonic powers, almost by definition, possess more extensive
overseas commitments; they should be able to more readily identify and eliminate extraneous burdens without
exposing vulnerabilities or exciting domestic populations. We believe the empirical record supports these
conclusions . In particular, periods of hegemonic transition do not appear more conflict prone than those of acute
decline. The last reversal at the pinnacle of power was the Anglo-American transition, which took place
around 1872 and was resolved without armed confrontation. The tenor of that transition may have been influenced by a
number of factors: both states were democratic maritime empires, the United States was slowly emerging from the Civil War, and Great Britain
could likely coast on a large lead in domestic capital stock. Although China and the United [End Page 41] States differ in regime type, similar
factors may work to cushion the impending Sino-American transition. Both are large, relatively secure
continental great powers, a fact that mitigates potential geopolitical competition.93 China faces a variety of
domestic political challenges, including strains among rival regions, which may complicate its ability to sustain its economic performance or
engage in foreign policy adventurism.94 Most important, the United States is not in free fall. Extrapolating the data into the future, we anticipate the United States will experience a "moderate" decline, losing from 2 to 4 percent of its share of great power GDP in the five years after being surpassed by China sometime in the next decade or two.95 Given the relatively gradual rate of U.S. decline relative to China, the incentives for either side to run risks by courting conflict are minimal. The United States would still possess upwards of a third of the share of great power GDP, and would have little to gain from provoking a crisis over a peripheral issue. Conversely, China has few incentives to exploit U.S. weakness.96 Given the importance of the U.S. market to the Chinese economy, in addition to the critical role played by the dollar as a global reserve currency, it is unclear how Beijing could hope to consolidate or expand its increasingly advantageous position through direct confrontation. In short, the United States should be able to reduce its
foreign policy commitments in East Asia in the coming decades without inviting Chinese expansionism. Indeed, there is evidence that a policy of
retrenchment could reap potential benefits. The drawdown and repositioning of U.S. troops in South Korea, for example, rather than fostering
instability, has resulted in an improvement in the occasionally strained relationship between Washington and Seoul. 97 U.S. moderation on
Taiwan, rather than encouraging hard-liners in Beijing, resulted in an improvement in cross-strait relations and reassured U.S. allies that
Washington would not inadvertently drag them into a Sino-U.S. conflict. 98 Moreover, Washington’s support for the development of
multilateral security institutions, rather than harming bilateral alliances, could work to enhance U.S. prestige while embedding China within a
more transparent regional order. 99 A
policy of gradual retrenchment need not undermine the credibility of U.S.
alliance commitments or unleash destabilizing regional security dilemmas . Indeed, even if Beijing harbored
revisionist intent, it is unclear that China will have the force projection capabilities necessary to take and
hold additional territory. 100 By incrementally shifting burdens to regional allies and multilateral
institutions, the United States can strengthen the credibility of its core commitments while accommodating
the interests of a rising China. Not least among the benefits of retrenchment is that it helps alleviate an
unsustainable financial position. Immense forward deployments will only exacerbate U.S. grand
strategic problems and risk unnecessary clashes. 101
Recent trends prove
Joshua Goldstein 11, professor emeritus of IR, American U. PhD in pol sci from MIT. Former visiting
professor emeritus at Yale, Sept 2011, Think Again: War,
http://www.foreignpolicy.com/articles/2011/08/15/think_again_war
Nor do shifts in the global balance of power doom us to a future of perpetual war. While some political scientists
argue that an increasingly multipolar world is an increasingly volatile one -- that peace is best assured by the predominance of a
single hegemonic power, namely the United States -- recent geopolitical history suggests otherwise. Relative U.S. power and
worldwide conflict have waned in tandem over the past decade. The exceptions to the trend, Iraq and
Afghanistan, have been lopsided wars waged by the hegemon, not challenges by up-and-coming new
powers. The best precedent for today's emerging world order may be the 19th-century Concert of Europe, a collaboration of great powers that largely maintained the peace for a
century until its breakdown and the bloodbath of World War I.
AT: Lashout
Lashout is a joke unsupported by studies—prefer rigorous analysis
Paul K. MacDonald 11, Assistant Professor of Political Science at Williams College, and Joseph M.
Parent, Assistant Professor of Political Science at the University of Miami, Spring 2011, “Graceful
Decline?: The Surprising Success of Great Power Retrenchment,” International Security, Vol. 35, No. 4, p.
7-44
How do great powers respond to acute decline? The erosion of the relative power of the United States has scholars and policymakers reexamining
this question. The central issue
is whether prompt retrenchment is desirable or probable. Some pessimists
counsel that retrenchment is a dangerous policy, because it shows weakness and invites attack. Robert Kagan, for example,
warns, "A reduction in defense spending . . . would unnerve American allies and undercut efforts to gain greater
cooperation. There is already a sense around the world, fed by irresponsible pundits here at home, that the United States is in terminal
decline. Many fear that the economic crisis will cause the United States to pull back from overseas commitments. The announcement of a defense
cutback would be taken by the world as evidence that the American retreat has begun."1 Robert Kaplan likewise argues, "Husbanding our
power in an effort to slow America's decline in a post-Iraq and post-Afghanistan world would mean avoiding debilitating land entanglements and
focusing instead on being
more of an offshore balancer. . . . While this may be in America's interest, the very signaling of such
an aloof intention may encourage regional bullies. . . . [L]essening our engagement with the world would have devastating
consequences for humanity. The disruptions we witness today are but a taste of what is to come should our country flinch from its international
responsibilities."2 The consequences of these views are clear: retrenchment should be avoided and forward defenses maintained into the indefinite
future.3
Other observers advocate retrenchment policies, but they are pessimistic [End Page 7] about their prospects.4 Christopher Layne, for instance,
predicts, "Even as the globe is being turned upside down by material factors, the foreign policies of individual states are shaped by the ideas leaders
hold about their own nations' identity and place in world politics. More than most, America's foreign policy is the product of such ideas, and U.S.
foreign-policy elites have constructed their own myths of empire to justify the United States' hegemonic role."5 Stephen Walt likewise advocates
greater restraint in U.S. grand strategy, but cautions, "The United States . . . remains a remarkably immature great power, one whose rhetoric is
frequently at odds with its conduct and one that tends to treat the management of foreign affairs largely as an adjunct to domestic politics. . . .
[S]eemingly secure behind its nuclear deterrent and oceanic moats, and possessing unmatched economic and military power, the United States
allowed its foreign policy to be distorted by partisan sniping, hijacked by foreign lobbyists and narrow domestic special interests, blinded by lofty
but unrealistic rhetoric, and held hostage by irresponsible and xenophobic members of Congress."6 Although retrenchment is a preferable policy,
these arguments suggest that great powers often cling to unprofitable foreign commitments for parochial reasons of national culture or domestic
politics.7
These arguments have grim implications for contemporary international politics. With the rise of new powers,
such as China, the international pecking order will be in increasing flux in the coming decades.8 Yet, if the pessimists are correct,
politicians and interests groups in the United States will be unwilling or unable to realign resources
with overseas commitments. Perceptions of weakness and declining U.S. credibility will
encourage policymakers to hold on to burdensome overseas commitments, despite their high
costs in blood and treasure.9 Policymakers in Washington will struggle to retire from profitless military
engagements and restrain ballooning current accounts and budget deficits.10 For some observers, the wars in Iraq and Afghanistan
represent the ill-advised last gasps of a declining hegemon seeking to bolster its plummeting position.11
In this article, we question the logic and evidence of the retrenchment pessimists. To date there has been neither a comprehensive study of great power retrenchment nor a study that lays out the case for retrenchment as a practical or probable policy. This article fills these gaps by systematically examining the relationship between acute relative decline and the responses of great powers. We examine eighteen cases of acute relative decline since 1870 and advance three main arguments. First, we
challenge the retrenchment pessimists' claim that domestic or international constraints inhibit the
ability of declining great powers to retrench. In fact, when states fall in the hierarchy of great
powers, peaceful retrenchment is the most common response, even over short time spans. Based on
the empirical record, we find that great powers retrenched in no less than eleven and no more than fifteen of the eighteen cases, a range of 61-83
percent. When
international conditions demand it, states renounce risky ties, increase reliance
on allies or adversaries, draw down their military obligations, and impose adjustments on domestic
populations.
Second, we find that the magnitude of relative decline helps explain the extent of great power retrenchment. Following the dictates of neorealist
theory, great
powers retrench for the same reason they expand: the rigors of great power politics
compel them to do so.12 Retrenchment is by no means easy, but [End Page 9] necessity is the mother of invention, and declining
great powers face powerful incentives to contract their interests in a prompt and
proportionate manner. Knowing only a state's rate of relative economic decline explains its corresponding degree of retrenchment in
as much as 61 percent of the cases we examined.
Third, we argue that the rate of decline helps explain what forms great power retrenchment will take. How fast great powers fall contributes to
whether these retrenching states will internally reform, seek new allies or rely more heavily on old ones, and make diplomatic overtures to
enemies. Further, our analysis suggests that great
powers facing acute decline are less likely to initiate or
escalate militarized interstate disputes. Faced with diminishing resources, great powers
moderate their foreign policy ambitions and offer concessions in areas of lesser strategic value.
Contrary to the pessimistic conclusions of critics, retrenchment neither requires aggression nor
invites predation. Great powers are able to rebalance their commitments through compromise,
rather than conflict. In these ways, states respond to penury the same way they do to plenty: they seek to adopt policies that maximize
security given available means. Far from being a hazardous policy, retrenchment can be successful. States that
retrench often regain their position in the hierarchy of great powers. Of the fifteen great powers that adopted
retrenchment in response to acute relative decline, 40 percent managed to recover their ordinal rank. In contrast, none of the declining powers
that failed to retrench recovered their relative position.
Case Grid
No Grid Advantage
The grid is not vulnerable to cyber attacks – federal standards solve.
Hayden, 14,
Michael Hayden, former director of the Central Intelligence Agency and National
Security Agency, "Cybersecurity and the North American Electric Grid: New Policy Approaches to
Address an Evolving Threat", Bipartisan Policy Center, 2-28-2014,
http://bipartisanpolicy.org/events/cybersecurity-and-north-american-electric-grid-new-policyapproaches-address-evolving/
In many ways, the
electric power sector is in a stronger position to address cyber threats than other sectors,
as it already has mandatory standards—enforced at the federal level in the United States—that apply to
the bulk power system.10 The bulk power system is generally composed of high-voltage transmission
facilities and large generation facilities, but does not include small electric generators or the distribution systems that are used to
distribute power to local customers. The reliability of such local distribution facilities and small generators is
governed by state government regulators (provincial in Canada), or in the case of municipalities and rural electric cooperatives,
local government boards and commissions. Mandatory federal cybersecurity standards therefore apply only to the bulk
power system facilities and do not extend to the electric distribution system. Given this complex regulatory structure, achieving an
adequate level of cybersecurity across the electric grid as a whole is a challenge for industry and policymakers.
Their evidence is all hype.
Lawson, 2013,
Sean, Department of Communication, University of Utah. "Beyond Cyber-Doom: Assessing the Limits of
Hypothetical Scenarios in the Framing of Cyber-Threats." Journal of Information Technology & Politics
10.1 (2013): 86-103.
But we
have not seen anything close to the kinds of scenarios outlined by Yoran, Ergma, Toffler, and others.
Terrorists did not use cyberattack against the World Trade Center; they used hijacked aircraft. And the
attack of 9/11 did not lead to the long-term collapse of the U.S. economy; we would have to wait for the impacts of
years of bad mortgages for a financial meltdown. Nor did the cyberattacks on Estonia approximate what happened on
9/11 as Yoran has claimed, and certainly not nuclear warfare as Ergma has claimed. In fact, a scientist at the
NATO Co-operative Cyber Defence Centre of Excellence, which was established in Tallinn, Estonia in response to the 2007
cyberattacks, has written that the immediate impacts of those attacks were “minimal” or “nonexistent,”
and that “no critical services were permanently affected” (Ottis, 2010: 72).
Cyber doom impacts are not supported by evidence and their authors have a financial
incentive to make the impacts seem worse than they are.
Lawson, 2013,
Sean, Department of Communication, University of Utah. "Beyond Cyber-Doom: Assessing the Limits of
Hypothetical Scenarios in the Framing of Cyber-Threats." Journal of Information Technology & Politics
10.1 (2013): 86-103.
Nonetheless, many
cybersecurity proponents continue to offer up cyber-doom scenarios that not only make
analogies to weapons of mass destruction (WMDs) and the terrorist attacks of 9/11, but also hold out
economic, social, and even civilizational collapse as possible impacts of cyberattacks. A report from the Hoover
Institution has warned of so-called “eWMDs” (Kelly & Almann, 2008); the FBI has warned that a cyberattack could have the same impact as a
“well-placed bomb” (FOXNews.com, 2010b); and official DoD documents refer to “weapons of mass disruption,” implying that cyberattacks
might have impacts comparable to the use of WMD (Chairman of the Joint Chiefs of Staff 2004, 2006). John Arquilla, one of the first to theorize
cyberwar in the 1990s (Arquilla & Ronfeldt, 1997), has spoken of “a grave and growing capacity for crippling our tech-dependent society” and
has said that a “cyber 9/11” is a matter of if, not when (Arquilla, 2009). Mike McConnell, who has claimed that we are already in an ongoing
cyberwar (McConnell, 2010), has even predicted that a cyberattack could surpass the impacts of 9/11 “by an order of magnitude” (The Atlantic,
2010). Finally, some have even compared the impacts of prospective cyberattacks to the 2004 Indian Ocean tsunami that killed roughly a
quarter million people and caused widespread physical destruction in five countries (Meyer, 2010); suggested that cyberattack could pose an
“existential threat” to the United States (FOXNews.com 2010b); and offered the possibility that cyberattack threatens not only the continued
existence of the United States, but all of “global civilization” (Adhikari, 2009). In response,
critics have noted that not only has
the story about who threatens what, how, and with what potential impact shifted over time, but it has
done so with very little evidence provided to support the claims being made (Bendrath, 2001, 2003; Walt, 2010).
Others have noted that the cyber-doom scenarios offered for years by cybersecurity proponents have yet to
come to pass and question whether they are possible at all (Stohl, 2007). Some have also questioned the motives of
cybersecurity proponents. Various think tanks, security firms, defense contractors, and business leaders who
trumpet the problem of cyber attacks are portrayed as self-interested ideologues who promote
unrealistic portrayals of cyber-threats (Greenwald, 2010).
Can't solve grid—too many operational burdens
Parthemore & Rogers, ‘10
[Christine, Fellow, Will, Bacevich Fellow, Center for New American Security, “Nuclear Reactors on
Military Bases May Be Risky,” Center for a New American Security, 5-20,
http://www.cnas.org/node/4502]
Many serious
complications must be weighed as well. Military base personnel often do not have the
necessary training in nuclear reactor management, oversight and regulatory credentials. Nuclear
reactors would necessitate additional qualified personnel and improved physical security requirements
to meet the 24/7 operations needs. As with siting for all energy production, local public resistance could be
problematic. When considering the impact of a reactor casualty, the resulting impact on the operational mission
effectiveness of the tenant commands on the base must also be considered so as to avoid a single point
vulnerability that disables all military operations on site. And while many private companies are touting new designs for
small reactors that would work well in this capacity, the technology may still be years away from fully meeting
technical requirements and federal regulatory standards.13 Proliferation considerations would also need to be part of any
adjudication of what types of reactors are most suitable for these purposes.
Grids resilient – backup solves
Wood 12 -- Senior Communications Advisor at Business Roundtable (Carter, 8/2/12, "The grid: After
India, America? No, but still…" http://businessroundtable.org/blog/the-grid-after-india-america-no-butstill/)
A blackout of such scale could not happen in the United States. For one thing, we don't have 600 million people. And
America's electrical grid is certainly much more resilient than the one in India, a still-developing country with
ineffective governments. Still, as The Washington Post reports today, "Aging power grid on overload as U.S. demands more electricity." At
CNBC, Jim Cramer asked Thomas F. Farrell II, Chairman, President
& CEO of Dominion Resources, about India.
Could the same thing happen in the United States? Farrell responded: Our system has a lot more rigor to it and
partly because we have reserve margins, meaning we have more power stations than we need to run at
any particular moment in time, so that if a power station goes out, there's a back-up to help keep the grid
stable. They don't have that much excess power in India, and when they get to the root cause, they'll probably find that
was somewhere in there.
Cyberattacks won’t occur on sensitive targets
Martin C. LIBICKI 2009 (Senior Policy Analyst – RAND Corporation, “Cyberdeterrence and Cyberwar”
http://www.rand.org/pubs/monographs/2009/RAND_MG877.pdf)
Some targets may be too risky or messy to be good targets. The risky targets include nuclear command-and-control systems (lest nervous adversaries conclude that they must use it or lose it) and space
systems (many of which are also strategic). Targets that give pause because of the mess their confusion
may cause include those whose malfunctioning may lead to civilian deaths, those whose disruption can
create vast environmental damage, and those whose integrity and accuracy can be very difficult to
restore when peace resumes (e.g., managers of bank and billing records). It would be good to think that
such systems are unassailable (or at least engineered to fail safely) precisely because they are sensitive.
Might a better reason to leave targets untouched be that restraint might persuade the other side to do
likewise, thereby limiting mutual destruction? 13 Mutually respected safe zones may even provide a
path for both sides to de-escalate.
No Meltdown Impact
Meltdowns don’t kill people. Fukushima proves.
Watts, 15,
Geoff Watts spent five years in medical research, working on cancer and on the effects of lasers on the
eye. But he abandoned an academic career in favour of science and medical journalism. Member of the
UK Government’s Human Genetics Commission, and a Fellow of the Academy of Medical Sciences. "Is
your fear of radiation irrational?", Mosaic, 7-14-2015, http://mosaicscience.com/story/your-fearradiation-irrational
To appreciate the measure of our hot-button fixation with radioactivity, recall the events of 2011 in Japan. The magnitude 9
earthquake and subsequent tsunami that hit the country on 11 March was by any measure a disaster. 20,000 people
died and more than 500 square kilometres of land were flooded. Families lost their homes, their businesses and their livelihoods. It didn’t
take long for the media to discover that one of the casualties, in pole position when the tsunami struck, was the Fukushima
nuclear power station. From that moment the story ceased to be about a natural event and became, in effect, about a man-made one.
It became that chilling scenario: a nuclear disaster. Of the 20,000 deaths, some were directly due to the earthquake itself, while others were
caused by drowning. How
many deaths were the result of radiation from the damaged plant? None. In its
section on the health consequences of the Fukushima tragedy, the report by the UN’s Scientific Committee on the Effects of
Atomic Radiation says: “No radiation-related deaths or acute diseases have been observed among the workers and
general public exposed to radiation from the accident.” The dose to the public, the report goes on to say, was generally
low or very low. “No discernible increased incidence of radiation-related health effects are expected among exposed members of the
public or their descendants.” This is not to play down the impact of the event. Three of the nuclear plant’s reactors suffered
damage to their cores, and a large amount of radioactive material was released into the environment.
Twelve workers are thought to have received doses of iodine-131 that will increase their risk of developing cancer of the thyroid gland. A
further 160 workers experienced doses sufficient to increase their risk of other cancers. “However,” says the report, “any increased incidence of
cancer in this group is expected to be indiscernible because of the difficulty of confirming such a small incidence against the normal statistical
fluctuations in cancer incidence.” In short, while
a terrifying natural event had killed many thousands of people, the
focus of attention in Japan and round the world was on one component of the tragedy that killed no one at the
time. Radiation exposure may have shortened the lives of some of those directly involved, but its effects are likely to be so small that we may
never know for sure whether they are related to the accident or not. When it comes to disaster, nuclear trumps natural. Our
sense of the
relative importance of things is absurdly skewed.
Chernobyl demonstrates very low risk of death even in catastrophic meltdowns.
Watts, 15,
Geoff Watts spent five years in medical research, working on cancer and on the effects of lasers on the
eye. But he abandoned an academic career in favour of science and medical journalism. Member of the
UK Government’s Human Genetics Commission, and a Fellow of the Academy of Medical Sciences. "Is
your fear of radiation irrational?", Mosaic, 7-14-2015, http://mosaicscience.com/story/your-fearradiation-irrational
Chernobyl, of course, was much worse. A poorly designed reactor operating under weak safety arrangements in a bureaucratic and secretive society was a
recipe for disaster. On 26 April 1986 all the ingredients came together – ironically during an experimental and bungled safety check. One of the
reactors overheated, caught fire, exploded and released a large quantity of radioactive material into the
atmosphere. 116,000 people were evacuated; another 270,000 found themselves living in a zone described as “highly contaminated”. It sounds bad.
For 134 of the workers involved in the initial cleanup, it was very bad. The dose they received was enough to cause acute radiation sickness, and 28 of them soon
died. Then, distrust of official information together with rumours of the dire consequences to be expected created a disproportionate fear. One
rumour circulating during the period immediately following the accident claimed that 15,000 nuclear victims had been buried in a mass
grave. Nor did such rumours die away; another in 2000 held that 300,000 people had by that time died of radiation. The reality, though hardly
inconsequential, was less catastrophic. A World Health Organization expert group was set up to examine the aftermath of the disaster and
to calculate its future health consequences. On the basis of average radiation exposure for the evacuees, the people who weren’t evacuated and the many more
thousands of workers later involved in the cleanup, the report
concluded that cancer deaths in these three groups will increase by no
more than 4 per cent. The report’s conclusions have been, and still are, contested – but the weight of orthodox opinion continues to line up behind the
expert group’s calculations.
Case – Heg/Tech Leadership
1NC
Spending cuts are an alt cause to tech leadership and the economy
Alivisatos et al 13 (Paul Alivisatos is director of Lawrence Berkeley National Laboratory. Eric D. Isaacs
is director of Argonne National Laboratory. Thom Mason is director of Oak Ridge National Laboratory)
(PAUL ALIVISATOS, ERIC D. ISAACS, AND THOM MASON, MAR 12 2013, The Sequester Is Going to
Devastate U.S. Science Research for Decades,
http://www.theatlantic.com/politics/archive/2013/03/the-sequester-is-going-to-devastate-us-scienceresearch-for-decades/273925/)
Less than one percent of the federal budget goes to fund basic science research -- $30.2 billion out of
the total of $3.8 trillion President Obama requested in fiscal year 2012. By slashing that fraction even
further, the government will achieve short-term savings in millions this year, but the resulting gaps in
the innovation pipeline could cost billions of dollars and hurt the national economy for decades to come.
As directors of the Department of Energy's National Laboratories, we have a responsibility both to
taxpayers and to the thousands of talented and committed men and women who work in our labs. We
are doing everything we can to make sure our scientists and engineers can keep working on our nation's
most pressing scientific problems despite the cuts. It's not yet clear how much funding the National Labs
will lose, but it will total tens of millions of dollars. Interrupting -- or worse, halting -- basic research in
the physical, biological, and computational sciences would be devastating, both for science and for the
many U.S. industries that rely on our national laboratory system to power their research and
development efforts.
Instead, this drop in funding will force us to cancel all new programs and research initiatives, probably
for at least two years. This sudden halt on new starts will freeze American science in place while the
rest of the world races forward, and it will knock a generation of young scientists off their stride,
ultimately costing billions in missed future opportunities.
New ideas, new insights, new discoveries -- these are the lifeblood of science and the foundation of
America's historic culture of innovation and ingenuity. The science community recognizes the
importance of those new ideas, so we have systems in place to make sure great new ideas get a chance
to thrive. Every ongoing federally funded science program is reviewed regularly to make sure it's on
track and likely to yield results. Each year, stalled programs are terminated to make room for more
promising lines of research. Under sequestration, we will continue to review and cull unsuccessful
research efforts, but we won't be able to bring in new ideas to take their place.
Every federal agency that supports basic scientific research is facing this impossible dilemma. The
National Science Foundation -- which funds 20 percent of all federally supported basic research at
American colleges and universities -- just announced it is cutting back on 1,000 new research grants it
had planned to award this year. The Department of Energy's Office of Science, the nation's largest
supporter of basic research in the physical sciences, will have to shut the door on hundreds of new
proposals as well. The impact will multiply as long-planned and overdue supercomputer upgrades and
other necessary investments in our scientific infrastructure are stretched out, delayed, or put on hold
indefinitely.
The National Laboratories aren't just crucial to America's scientific infrastructure. They are also powerful
engines of economic development. Nobel Prize-winning economist Robert Solow has calculated that
over the past half century, more than half of the growth in our nation's GDP has been rooted in scientific
discoveries -- the kinds of fundamental, mission-driven research that we do at the labs. This early-stage
research has led to extraordinary real-world benefits, from nuclear power plants to compact fluorescent
bulbs to blood cholesterol tests. Because the United States has historically valued scientific inspiration,
our government has provided creative scientists and engineers with the support, facilities, and time they
need to turn brilliant ideas into real-world solutions.
Basing funding decisions solely on short-term fiscal goals risks the heart of America's scientific enterprise
and long-term economic growth -- diminishing our world leadership in science, technology and in the
creation of cutting-edge jobs.
Sequestration won't have an immediate, visible impact on American research. Laboratories will continue
to open their doors, and scientists and engineers will go to work. But as we choke off our ability to
pursue promising new ideas, we begin a slow but inexorable slide to stagnation. We can't afford to lose
a generation of new ideas and forfeit our national future.
Foreign countries are set to outpace the United States in tech leadership regardless of
the plan
Lempert 8
(Richard O., the Eric Stein Distinguished University Professor of Law and Sociology at The University of
Michigan Law School and a Research Professor in the George Washington Institute of Public Policy,
“Maintaining U.S. Scientific Leadership”, http://www.scienceprogress.org/2008/03/maintaining-usscientific-leadership/, 3/3)
If these advantages were not enough, the competition for science leadership was weak thanks to the
devastation that Europe suffered in two world wars and the slow rebuilding of European economies in
the post-war era. The upshot: U.S. science leadership is not natural and inevitable, but the loss of that
leadership may be. Countries much larger than the United States, most notably India and China, are
experiencing economic growth that outstrips ours, and as they grow in wealth they are rapidly
improving their educational systems and basic science infrastructures. Moreover, as globalization leads
companies born in the United States to move research and production capacity abroad, market demand
for trained scientists and engineers is increasing elsewhere while it is being dampened here. Even if the
United States retains a per capita education and investment advantage over India and China, population
differences alone mean that the number of trained scientists and engineers in these countries will soon
dwarf the number in America, with differences in the quantity and quality of science innovation likely to
follow. Added to the Asian challenge is a Europe that can no longer be seen as a set of discrete countries
when it comes to science. Rather, cross-border research teams are being encouraged, and European
Union-wide funding mechanisms are being established. In short, several decades from now we may find
that we are not the world’s number one country when it comes to science, however measured, but
perhaps no. 4 behind China, India, and the EU. We may also find that being in fourth place is not
altogether bad. When children in China are vaccinated against polio, they are not worse off because the
vaccine was invented in the United States. When an Indian inventor draws on two decades of U.S.
government-funded research to achieve a technological breakthrough, her accomplishment will not be
lessened because it would not have happened had research in the United States not paved the way.
Heg isn’t key to peace
Christopher J. Fettweis 11, Department of Political Science, Tulane University, 9/26/11, Free Riding or Restraint? Examining European
Grand Strategy, Comparative Strategy, 30:316–332, EBSCO
Assertions that without the combination of U.S. capabilities, presence and commitments instability would
return to Europe and the Pacific Rim are usually rendered in rather vague language. If the United States were to decrease
its commitments abroad, argued Robert Art, “the world will become a more dangerous place and, sooner or later, that will
redound to America’s detriment.”53 From where would this danger arise? Who precisely would do the fighting,
and over what issues? Without the United States, would Europe really descend into Hobbesian anarchy?
Would the Japanese attack mainland China again, to see if they could fare better this time around? Would the Germans and
French have another go at it? In other words, where
exactly is hegemony keeping the peace? With one exception,
these questions are rarely addressed. That exception is in the Pacific Rim. Some analysts fear that a de facto surrender of U.S.
hegemony would lead to a rise of Chinese influence. Bradley Thayer worries that Chinese would become “the language of diplomacy, trade and
commerce, transportation and navigation, the internet, world sport, and global culture,” and that Beijing would come to “dominate science and
technology, in all its forms” to the extent that soon theworldwould witness a Chinese astronaut who not only travels to the Moon, but “plants
the communist flag on Mars, and perhaps other planets in the future.”54 Indeed China is the only other major power that has increased its
military spending since the end of the Cold War, even if it still is only about 2 percent of its GDP. Such levels of effort do not suggest a desire to
compete with, much less supplant, the United States. The much-ballyhooed, decade-long
military buildup has brought
Chinese spending up to somewhere between one-tenth and one-fifth of the U.S. level. It is hardly
clear that a restrained United States would invite Chinese regional, much less global, political expansion. Fortunately
one need not ponder for too long the horrible specter of a red flag on Venus, since on the planet Earth, where war is no longer the dominant
form of conflict resolution, the threats posed by even a rising China would not be terribly dire. The dangers contained in the terrestrial security
environment are less severe than ever before. Believers in the pacifying power of hegemony ought to keep in mind
a rather basic tenet: When it comes to policymaking, specific threats are more significant than vague, unnamed
dangers. Without specific risks, it is just as plausible to interpret U.S. presence as redundant, as overseeing a peace that has already arrived.
Strategy should not be based upon vague images emerging from the dark reaches of the
neoconservative imagination. Overestimating Our Importance One of the most basic insights of cognitive
psychology provides the final reason to doubt the power of hegemonic stability: Rarely are our actions
as consequential upon their behavior as we perceive them to be. A great deal of experimental evidence exists to
support the notion that people (and therefore states) tend to overrate the degree to which their behavior is
responsible for the actions of others. Robert Jervis has argued that two processes account for this overestimation, both
of which would seem to be especially relevant in the U.S. case. 55 First, believing that we are responsible for their actions
gratifies our national ego (which is not small to begin with; the United States is exceptional in its exceptionalism). The hubris of the
United States, long appreciated and noted, has only grown with the collapse of the Soviet Union.56 U.S. policymakers famously
have comparatively little knowledge of—or interest in—events that occur outside of their own
borders. If there is any state vulnerable to the overestimation of its importance due to the
fundamental misunderstanding of the motivation of others, it would have to be the United States.
Second, policymakers in the United States are far more familiar with our actions than they are with the decision-making processes of our allies.
Try as we might,
it is not possible to fully understand the threats, challenges, and opportunities that our
allies see from their perspective. The European great powers have domestic politics as complex as ours, and they also have
competent, capable strategists to chart their way forward. They react to many international forces, of which U.S.
behavior is only one. Therefore, for any actor trying to make sense of the action of others, Jervis notes, “in the absence of strong
evidence to the contrary, the most obvious and parsimonious explanation is that he was responsible.”57 It is natural, therefore, for U.S.
policymakers and strategists to believe that the behavior of our allies (and rivals) is shaped largely by what
Washington does. Presumably Americans are at least as susceptible to the overestimation of their ability as any other people, and
perhaps more so. At the very least, political psychologists tell us, we are probably not as important to them as we think.
The importance of U.S. hegemony in contributing to international stability is therefore almost
certainly overrated. In the end, one can never be sure why our major allies have not gone to, and do not even plan for, war. Like
deterrence, the hegemonic stability theory rests on faith; it can only be falsified, never proven. It does not
seem likely, however, that hegemony could fully account for twenty years of strategic decisions made in allied capitals if the international
system were not already a remarkably peaceful place. Perhaps these states have no intention of fighting one another
to begin with, and our commitments are redundant.
Ext Alt Causes
Funding cuts are destroying the tech talent pool
Anderson ’10 [Professor of Biology, CalTech. David,
http://www.nytimes.com/2010/01/13/opinion/l13science.html]
As a professor of biology at the California Institute of Technology, where a large fraction of our students
are Asian, I found that your article rang true to me. Recently, one of my star Chinese graduate students
announced to my surprise and consternation that he would return to China after completing his training.
Every move requires not only a pull, but also a push. By focusing on a scientist, Shi Yigong, who was a
well-funded Howard Hughes Medical Institute investigator, the article did not mention a major force
driving Asian and other foreign-born life scientists out of the United States: the difficulty in obtaining
research funding from the National Institutes of Health. For example, the N.I.H. currently prohibits its
funds from being used to support the graduate or the postdoctoral training of foreign-born science
students, who make up the majority of our talent pool in many fields. In addition, the N.I.H.’s
conservative review panels make it very difficult for young scientists to secure funding for creative, risky
basic research, once they establish their own labs. This affects senior scientists as well. Recently, for
example, two of my colleagues, both tenured full professors, left Caltech to take positions with
Germany’s well-funded Max Planck Institute. I myself came close to accepting such an offer. Our
government research funding policies are driving some of our best scientists out of the country. The
brain gain that we achieved after World War II is gradually being turned into a brain drain.
EXT Heg D
Extend Fettweis- Heg isn't key to peace. Their authors overestimate the importance
of the US in stopping conflict, and their theory rests on faith because it can only be
falsified, never proven- their war scenarios are too vague and rely on unique past
circumstances that no longer apply.
U.S. primacy isn’t key to peace---their data is flawed
Christopher Preble 10, director of Foreign Policy Studies at the CATO Institute, August 3, 2010, “U.S.
Military Power: Preeminence for What Purpose?,” online: http://www.cato-at-liberty.org/u-s-militarypower-preeminence-for-what-purpose/
Most in Washington still embraces the notion that America is, and forever will be, the world’s
indispensable nation. Some scholars, however, questioned the logic of hegemonic stability
theory from the very beginning. A number continue to do so today. They advance arguments
diametrically at odds with the primacist consensus. Trade routes need not be policed by a
single dominant power; the international economy is complex and resilient. Supply
disruptions are likely to be temporary, and the costs of mitigating their effects should be
borne by those who stand to lose — or gain — the most. Islamic extremists are scary, but
hardly comparable to the threat posed by a globe-straddling Soviet Union armed with
thousands of nuclear weapons. It is frankly absurd that we spend more today to fight Osama
bin Laden and his tiny band of murderous thugs than we spent to face down Joseph Stalin
and Chairman Mao. Many factors have contributed to the dramatic decline in the number of
wars between nation-states; it is unrealistic to expect that a new spasm of global conflict would
erupt if the United States were to modestly refocus its efforts, draw down its military power,
and call on other countries to play a larger role in their own defense, and in the security of
their respective regions.
But while there are credible alternatives to the United States serving in its current dual role as
world policeman / armed social worker, the foreign policy establishment in Washington has
no interest in exploring them. The people here have grown accustomed to living at the center
of the earth, and indeed, of the universe. The tangible benefits of all this military spending
flow disproportionately to this tiny corner of the United States while the schlubs in fly-over
country pick up the tab.
Empirics devastate them
Christopher J. Fettweis 11, Department of Political Science, Tulane University, 9/26/11, Free Riding or
Restraint? Examining European Grand Strategy, Comparative Strategy, 30:316–332, EBSCO
It is perhaps worth noting that there is no evidence to support a direct relationship between the relative level of U.S. activism and
international stability. In fact, the limited data we do have suggest the opposite may be true. During the 1990s,
the United States cut back on its defense spending fairly substantially. By 1998, the United States was spending $100
billion less on defense in real terms than it had in 1990.51 To internationalists, defense hawks and believers in
hegemonic stability, this irresponsible “peace dividend” endangered both national and global security. “No serious analyst of American military capabilities,” argued
Kristol and Kagan, “doubts that the defense budget has been cut much too far to meet America’s responsibilities to itself and to world peace.”52 On the other hand, if the pacific trends were not
based upon U.S. hegemony but a strengthening norm against interstate war, one would not have
expected an increase in global instability and violence. The verdict from the past two decades is fairly plain: The world
grew more peaceful while the United States cut its forces. No state seemed to believe that its security was
endangered by a less-capable United States military, or at least none took any action that would suggest such a belief.
No militaries were enhanced to address power vacuums, no security dilemmas drove insecurity or arms
races, and no regional balancing occurred once the stabilizing presence of the U.S. military was
diminished. The rest of the world acted as if the threat of international war was not a pressing concern,
despite the reduction in U.S. capabilities. Most of all, the United States and its allies were no less safe. The incidence and magnitude of
global conflict declined while the United States cut its military spending under President Clinton, and kept declining
as the Bush Administration ramped the spending back up. No complex statistical analysis should be necessary to reach
the conclusion that the two are unrelated. Military spending figures by themselves are insufficient to disprove a connection between overall U.S. actions and international
stability. Once again, one could presumably argue that spending is not the only or even the best indication of hegemony, and that it is instead U.S. foreign political and security commitments that maintain stability. Since neither
was significantly altered during this period, instability should not have been expected. Alternately, advocates of hegemonic stability could believe that relative rather than absolute spending is decisive in bringing peace. Although
the United States cut back on its spending during the 1990s, its relative advantage never wavered. However, even if it is true that either U.S. commitments or relative spending account for global pacific trends, then at the very least
stability can evidently be maintained at drastically lower levels of both. In other words, even if one can be allowed to argue in the alternative for a moment and suppose that there is in fact a level of engagement below which the
United States cannot drop without increasing international disorder, a rational grand strategist would still recommend cutting back on engagement and spending until that level is determined. Grand strategic decisions are never
final; continual adjustments can and must be made as time goes on. Basic logic suggests that the United States ought to spend the minimum amount of its blood and treasure while seeking the maximum return on its investment.
And if the current era of stability is as stable as many believe it to be, no increase in conflict would ever occur irrespective of U.S. spending, which would save untold trillions for an increasingly debt-ridden nation. It is also perhaps
worth noting that if opposite trends had unfolded, if other states had reacted to news of cuts in U.S. defense spending with more aggressive or insecure behavior, then internationalists would surely argue that their expectations
had been fulfilled. If increases in conflict would have been interpreted as proof of the wisdom of internationalist
strategies, then logical consistency demands that the lack thereof should at least pose a problem. As it stands, the
only evidence we have regarding the likely systemic reaction to a more restrained United States
suggests that the current peaceful trends are unrelated to U.S. military spending. Evidently the rest of the world
can operate quite effectively without the presence of a global policeman. Those who think otherwise base
their view on faith alone.
AT: Transition Wars
No transition wars
Parent 11—assistant prof of pol sci, U Miami. PhD in pol sci, Columbia—and—Paul MacDonald—
assistant prof of pol sci, Williams (Joseph, "Graceful Decline? The Surprising Success of Great Power
Retrenchment," International Security, Spring 2011, p. 7)
Some observers might dispute our conclusions, arguing that hegemonic transitions are more conflict prone than other moments of
acute relative decline. We counter that there are deductive and empirical reasons to doubt this argument. Theoretically,
hegemonic powers should actually find it easier to manage acute relative decline. Fallen hegemons still
have formidable capability, which threatens grave harm to any state that tries to cross
them. Further, they are no longer the top target for balancing coalitions, and recovering hegemons
may be influential because they can play a pivotal role in alliance formation. In addition,
hegemonic powers, almost by definition, possess more extensive overseas commitments; they should be able to
more readily identify and eliminate extraneous burdens without exposing vulnerabilities or
exciting domestic populations.
We believe the empirical record supports these conclusions. In particular, periods of hegemonic transition do not appear more conflict prone than
those of acute decline. The last reversal at the pinnacle of power was the Anglo-American transition, which
took place around 1872 and was resolved without armed confrontation. The tenor of that transition may have been
influenced by a number of factors: both states were democratic maritime empires, the United States was slowly emerging from the Civil War, and Great Britain could likely coast on a large lead in
domestic capital stock. Although China and the United States differ in regime type, similar factors may work to cushion the impending
Sino-American transition. Both are large, relatively secure continental great powers, a fact that
mitigates potential geopolitical competition.93 China faces a variety of domestic political challenges, including strains
among rival regions, which may complicate its ability to sustain its economic performance or engage in foreign policy
adventurism.94
No transition wars---best empirics and theory
Paul K. MacDonald 11, Assistant Professor of Political Science at Williams College, and Joseph M.
Parent, Assistant Professor of Political Science at the University of Miami, Spring 2011, “Graceful
Decline?: The Surprising Success of Great Power Retrenchment,” International Security, Vol. 35, No. 4, p.
7-44
Our findings are directly relevant to what appears to be an impending great power transition between China
and the United States. Estimates of economic performance vary, but most observers expect Chinese GDP to surpass U.S. GDP sometime in
the next decade or two.91 This prospect has generated considerable concern. Many scholars foresee major conflict during a
Sino-U.S. ordinal transition. Echoing Gilpin and Copeland, John Mearsheimer sees the crux of the issue as irreconcilable goals: China
wants to be America's superior and the United States wants no peer competitors. In his words, "[N]o amount [End Page 40] of goodwill can
ameliorate the intense security competition that sets in when an aspiring hegemon appears in Eurasia."92 Contrary
to these
predictions, our analysis suggests some grounds for optimism. Based on the historical track record of great
powers facing acute relative decline, the United States should be able to retrench in the coming decades. In the next few
years, the United States is ripe to overhaul its military, shift burdens to its allies, and work to decrease costly
international commitments. It is likely to initiate and become embroiled in fewer militarized disputes
than the average great power and to settle these disputes more amicably. Some might view this prospect
with apprehension, fearing the steady erosion of U.S. credibility. Yet our analysis suggests that retrenchment
need not signal weakness. Holding on to exposed and expensive commitments simply for the sake of one's
reputation is a greater geopolitical gamble than withdrawing to cheaper, more defensible frontiers. Some observers
might dispute our conclusions, arguing that hegemonic transitions are more conflict prone than other moments of acute
relative decline. We counter that there are deductive and empirical reasons to doubt this argument. Theoretically,
hegemonic powers should actually find it easier to manage acute relative decline. Fallen hegemons still
have formidable capability, which threatens grave harm to any state that tries to cross them. Further, they
are no longer the top target for balancing coalitions, and recovering hegemons may be influential because they can
play a pivotal role in alliance formation. In addition, hegemonic powers, almost by definition, possess more extensive
overseas commitments; they should be able to more readily identify and eliminate extraneous burdens without
exposing vulnerabilities or exciting domestic populations. We believe the empirical record supports these
conclusions. In particular, periods of hegemonic transition do not appear more conflict prone than those of acute
decline. The last reversal at the pinnacle of power was the Anglo-American transition, which took place
around 1872 and was resolved without armed confrontation. The tenor of that transition may have been influenced by a
number of factors: both states were democratic maritime empires, the United States was slowly emerging from the Civil War, and Great Britain
could likely coast on a large lead in domestic capital stock. Although China and the United [End Page 41] States differ in regime type, similar
factors may work to cushion the impending Sino-American transition. Both are large, relatively secure
continental great powers, a fact that mitigates potential geopolitical competition.93 China faces a variety of
domestic political challenges, including strains among rival regions, which may complicate its ability to sustain its economic performance or
engage in foreign policy adventurism.94 Most important, the United States is not in free fall. Extrapolating the data into the future, we
anticipate the
United States will experience a "moderate" decline, losing from 2 to 4 percent of its share of great power GDP in
the relatively gradual rate of U.S.
decline relative to China, the incentives for either side to run risks by courting conflict are minimal. The
United States would still possess upwards of a third of the share of great power GDP, and would have little to
gain from provoking a crisis over a peripheral issue. Conversely, China has few incentives to exploit U.S.
weakness.96 Given the importance of the U.S. market to the Chinese economy, in addition to the critical role
played by the dollar as a global reserve currency, it is unclear how Beijing could hope to consolidate or expand its
increasingly advantageous position through direct confrontation. In short, the United States should be able to reduce its
the five years after being surpassed by China sometime in the next decade or two.95 Given
foreign policy commitments in East Asia in the coming decades without inviting Chinese expansionism. Indeed, there is evidence that a policy of
retrenchment could reap potential benefits. The drawdown and repositioning of U.S. troops in South Korea, for example, rather than fostering
instability, has resulted in an improvement in the occasionally strained relationship between Washington and Seoul. 97 U.S. moderation on
Taiwan, rather than encouraging hard-liners in Beijing, resulted in an improvement in cross-strait relations and reassured U.S. allies that
Washington would not inadvertently drag them into a Sino-U.S. conflict. 98 Moreover, Washington’s support for the development of
multilateral security institutions, rather than harming bilateral alliances, could work to enhance U.S. prestige while embedding China within a
more transparent regional order. 99 A
policy of gradual retrenchment need not undermine the credibility of U.S.
alliance commitments or unleash destabilizing regional security dilemmas. Indeed, even if Beijing harbored
revisionist intent, it is unclear that China will have the force projection capabilities necessary to take and
hold additional territory. 100 By incrementally shifting burdens to regional allies and multilateral
institutions, the United States can strengthen the credibility of its core commitments while accommodating
the interests of a rising China. Not least among the benefits of retrenchment is that it helps alleviate an
unsustainable financial position. Immense forward deployments will only exacerbate U.S. grand
strategic problems and risk unnecessary clashes. 101
Recent trends prove
Joshua Goldstein 11, professor emeritus of IR, American U. PhD in pol sci from MIT. Former visiting
professor at Yale, Sept 2011, Think Again: War,
http://www.foreignpolicy.com/articles/2011/08/15/think_again_war
Nor do shifts in the global balance of power doom us to a future of perpetual war. While some political scientists
argue that an increasingly multipolar world is an increasingly volatile one -- that peace is best assured by the predominance of a
single hegemonic power, namely the United States -- recent geopolitical history suggests otherwise. Relative U.S. power and
worldwide conflict have waned in tandem over the past decade. The exceptions to the trend, Iraq and
Afghanistan, have been lopsided wars waged by the hegemon, not challenges by up-and-coming new
powers. The best precedent for today's emerging world order may be the 19th-century Concert of Europe, a collaboration of great powers that largely maintained the peace for a
century until its breakdown and the bloodbath of World War I.
AT: Lashout
Lashout is a joke unsupported by studies—prefer rigorous analysis
Paul K. MacDonald 11, Assistant Professor of Political Science at Williams College, and Joseph M.
Parent, Assistant Professor of Political Science at the University of Miami, Spring 2011, “Graceful
Decline?: The Surprising Success of Great Power Retrenchment,” International Security, Vol. 35, No. 4, p.
7-44
How do great powers respond to acute decline? The erosion of the relative power of the United States has scholars and policymakers reexamining
this question. The central issue
is whether prompt retrenchment is desirable or probable. Some pessimists
counsel that retrenchment is a dangerous policy, because it shows weakness and invites attack. Robert Kagan, for example,
warns, "A reduction in defense spending . . . would unnerve American allies and undercut efforts to gain greater
cooperation. There is already a sense around the world, fed by irresponsible pundits here at home, that the United States is in terminal
decline. Many fear that the economic crisis will cause the United States to pull back from overseas commitments. The announcement of a defense
cutback would be taken by the world as evidence that the American retreat has begun."1 Robert Kaplan likewise argues, "Husbanding our
power in an effort to slow America's decline in a post-Iraq and post-Afghanistan world would mean avoiding debilitating land entanglements and
focusing instead on being
more of an offshore balancer. . . . While this may be in America's interest, the very signaling of such
an aloof intention may encourage regional bullies. . . . [L]essening our engagement with the world would have devastating
consequences for humanity. The disruptions we witness today are but a taste of what is to come should our country flinch from its international
responsibilities."2 The consequences of these views are clear: retrenchment should be avoided and forward defenses maintained into the indefinite
future.3
Other observers advocate retrenchment policies, but they are pessimistic [End Page 7] about their prospects.4 Christopher Layne, for instance,
predicts, "Even as the globe is being turned upside down by material factors, the foreign policies of individual states are shaped by the ideas leaders
hold about their own nations' identity and place in world politics. More than most, America's foreign policy is the product of such ideas, and U.S.
foreign-policy elites have constructed their own myths of empire to justify the United States' hegemonic role."5 Stephen Walt likewise advocates
greater restraint in U.S. grand strategy, but cautions, "The United States . . . remains a remarkably immature great power, one whose rhetoric is
frequently at odds with its conduct and one that tends to treat the management of foreign affairs largely as an adjunct to domestic politics. . . .
[S]eemingly secure behind its nuclear deterrent and oceanic moats, and possessing unmatched economic and military power, the United States
allowed its foreign policy to be distorted by partisan sniping, hijacked by foreign lobbyists and narrow domestic special interests, blinded by lofty
but unrealistic rhetoric, and held hostage by irresponsible and xenophobic members of Congress."6 Although retrenchment is a preferable policy,
these arguments suggest that great powers often cling to unprofitable foreign commitments for parochial reasons of national culture or domestic
politics.7
These arguments have grim implications for contemporary international politics. With the rise of new powers,
such as China, the international pecking order will be in increasing flux in the coming decades.8 Yet, if the pessimists are correct,
politicians and interests groups in the United States will be unwilling or unable to realign resources
with overseas commitments. Perceptions of weakness and declining U.S. credibility will
encourage policymakers to hold on to burdensome overseas commitments, despite their high
costs in blood and treasure.9 Policymakers in Washington will struggle to retire from profitless military
engagements and restrain ballooning current accounts and budget deficits.10 For some observers, the wars in Iraq and Afghanistan
represent the ill-advised last gasps of a declining hegemon seeking to bolster its plummeting position.11
In this article, we question the logic and evidence of the retrenchment pessimists. To date there has been
neither a comprehensive study of great power retrenchment nor a study that lays out the case for
retrenchment as a practical or probable policy. This article fills these gaps by systematically
examining the relationship between acute relative decline and the responses of great powers. We
examine eighteen cases of acute relative decline since 1870 and advance three main arguments.
First, we
challenge the retrenchment pessimists' claim that domestic or international constraints inhibit the
ability of declining great powers to retrench. In fact, when states fall in the hierarchy of great
powers, peaceful retrenchment is the most common response, even over short time spans. Based on
the empirical record, we find that great powers retrenched in no less than eleven and no more than fifteen of the eighteen cases, a range of 61-83
percent. When
international conditions demand it, states renounce risky ties, increase reliance
on allies or adversaries, draw down their military obligations, and impose adjustments on domestic
populations.
Second, we find that the magnitude of relative decline helps explain the extent of great power retrenchment. Following the dictates of neorealist
theory, great
powers retrench for the same reason they expand: the rigors of great power politics
compel them to do so.12 Retrenchment is by no means easy, but [End Page 9] necessity is the mother of invention, and declining
great powers face powerful incentives to contract their interests in a prompt and
proportionate manner. Knowing only a state's rate of relative economic decline explains its corresponding degree of retrenchment in
as much as 61 percent of the cases we examined.
Third, we argue that the rate of decline helps explain what forms great power retrenchment will take. How fast great powers fall contributes to
whether these retrenching states will internally reform, seek new allies or rely more heavily on old ones, and make diplomatic overtures to
enemies. Further, our analysis suggests that great
powers facing acute decline are less likely to initiate or
escalate militarized interstate disputes. Faced with diminishing resources, great powers
moderate their foreign policy ambitions and offer concessions in areas of lesser strategic value.
Contrary to the pessimistic conclusions of critics, retrenchment neither requires aggression nor
invites predation. Great powers are able to rebalance their commitments through compromise,
rather than conflict. In these ways, states respond to penury the same way they do to plenty: they seek to adopt policies that maximize
security given available means. Far from being a hazardous policy, retrenchment can be successful. States that
retrench often regain their position in the hierarchy of great powers. Of the fifteen great powers that adopted
retrenchment in response to acute relative decline, 40 percent managed to recover their ordinal rank. In contrast, none of the declining powers
that failed to retrench recovered their relative position.
Case Solvency
Solvency
Cyber attacks happen all the time (reasons)
- OPM data hack, no retaliation against China
- Attribution problem, no one to blame
Backdoors aren’t the problem; weak security systems in general allow hackers to get
in regardless
Lever 14 (http://www.businessinsider.com/afp-cyberattacks-to-worsen-in-2015-mcafee-researchers-2014-12, Cyberattacks Are Just Going To Get Worse From Here)//A.V.
Health care hacks McAfee said it is already seeing hackers targeting devices such as webcams with weak
security and industrial control systems. But it sees health care as an especially worrisome sector. "With the
increasing proliferation of healthcare IoT devices and their use in hospitals, the threat of the loss of information contained on
those devices becomes increasingly likely," the report said. It noted that health care data "is even more valuable than credit
card data" on hacker black markets. McAfee says other threats will also grow, including "ransomware," which locks down data and forces the
victim to pay a ransom to retrieve it, and attacks on mobile phone operating systems. In
the retail sector, digital payments may
cut the risk of credit-card skimmers, but hackers may be able to exploit wireless systems such as
Bluetooth and near field communications (NFC) used for mobile payments. "With consumers now sending payment
information over a protocol with known vulnerabilities, it is highly likely that attacks on this infrastructure will emerge in 2015," the report said.
The report comes in the wake of news about large-scale cyberattacks that have been linked to Russia or China, and a major infiltration of Sony
Pictures which stole massive amounts of data. In retail, Home Depot and others have reported data breaches affecting millions of customers.
"The year 2014 will be remembered as the year of shaken trust," said Vincent Weafer, senior vice president at Intel-owned McAfee. "Restoring
trust in 2015 will require stronger industry collaboration, new standards for a new threat landscape, and new security postures that shrink
time-to-detection."
Cyberattackers will use more sophisticated tactics to attack servers regardless of
backdoors
Lever 14 (http://www.businessinsider.com/afp-cyberattacks-to-worsen-in-2015-mcafee-researchers-2014-12, Cyberattacks Are Just Going To Get Worse From Here)//A.V.
Washington (AFP) - A series of spectacular cyberattacks drew headlines this year, and the situation will
only worsen in 2015 as hackers use more advanced techniques to infiltrate networks, security researchers said
Tuesday. McAfee Labs' 2015 Threats Predictions report sees increased cyber-warfare and espionage, along
with new strategies from hackers to hide their tracks and steal sensitive data. "Cyber espionage attacks
will continue to increase in frequency," the report said. "Long-term players will become stealthier
information gatherers, while newcomers will look for ways to steal money and disrupt their
adversaries." McAfee said small nations and terror groups will become even more active and will "attack by launching crippling distributed
denial of service attacks or using malware that wipes the master boot record to destroy their enemies' networks." At the same time,
cybercriminals will use better methods to remain hidden on a victim's network, to carry out long-term
theft of data without being detected, the researchers said. "In this way, criminals are beginning to look
and act more like sophisticated nation-state cyberespionage actors, who watch and wait to gather
intelligence," the report said. The report also said hackers are looking to target more connected devices, including computers in the
farming, manufacturing, and health care sectors. "The number and variety of devices in the Internet of Things (IoT) family is growing
exponentially. In the consumer space, they are now seen in appliances, automobiles, home automation, and even light bulbs," McAfee said.
No Solvency – Secure Data Act
Loopholes exist for the FBI and NSA
Cushing 14
Tim Cushing, Techdirt contributor, 12-5-14, "Ron Wyden Introduces Legislation Aimed At Preventing FBI-Mandated Backdoors In Cellphones And Computers," Techdirt.,
https://www.techdirt.com/articles/20141204/16220529333/ron-wyden-introduces-legislation-aimed-preventing-fbi-mandated-backdoors-cellphones-computers.shtml//SRawal
Here's the actual wording of the backdoor ban [pdf link], which has a couple of loopholes in it. (a) IN GENERAL.—Except as
provided in subsection (b), no agency may mandate that a manufacturer, developer, or seller of covered products design or alter the security functions in its product
or service to allow the surveillance of any user of such product or service, or to allow the physical search of such product, by any agency. Subsection
(b)
presents the first loophole, naming the very act that Comey is pursuing to have amended in his agency's favor. (b)
EXCEPTION.—Subsection (a) shall not apply to mandates authorized under the Communications Assistance
for Law Enforcement Act (47 U.S.C. 1001 et seq.). Comey wants to alter CALEA or, failing that, get a few legislators to
run some sort of encryption-targeting legislation up the Congressional flagpole for him. Wyden's bill won't thwart these
efforts and it does leave the NSA free to continue with its pre-existing homebrewed backdoor efforts -- the
kind that don't require mandates because they're performed off-site without the manufacturer's
knowledge.
The NSA will still have access- the government can still influence companies- no incentive
to listen to the plan
Newman 14
Lily Hay Newman, 12-5-2014, "Senator Proposes Bill to Prohibit Government-Mandated Backdoors in
Smartphones," Slate Magazine,
http://www.slate.com/blogs/future_tense/2014/12/05/senator_wyden_proposes_secure_data_act_to_
keep_government_agencies_from.html//SRawal
It's worth noting, though, that the Secure Data Act doesn't actually prohibit backdoors—it just prohibits agencies
from mandating them. There are a lot of other types of pressure government groups could still use to
influence the creation of backdoors, even if they couldn't flat-out demand them. Here's the wording in the bill:
"No agency may mandate that a manufacturer, developer, or seller of covered products design or alter the security functions in its product or
service to allow the surveillance of any user of such product or service, or to allow the physical search of such product, by any agency."
0-Day Vulnerability Circumvention
The NSA and DOD buy 0-day vulnerabilities and use them to surveil US citizens.
Eliminating backdoors does not stop this.
Smith 13
Ms. Smith (not her real name), a freelance writer and programmer with a special and
somewhat personal interest in IT privacy and security issues, 5-12-2013, "U.S. government is 'biggest
buyer' of zero-day vulnerabilities, report claims," Network World,
http://www.networkworld.com/article/2224620/microsoft-subnet/u-s--government-is--biggest-buyer-of-zero-day-vulnerabilities--report-claims.html
When it comes to exploiting zero-days for cyberweapons and cyber-spying, China's not the only "devil"...we are too, according to
a Reuters report that claimed the U.S. government is the "biggest buyer in a burgeoning gray market where hackers
and security firms sell tools for breaking into computers." For the first time, the Pentagon's annual report to Congress [pdf]
accused the Chinese government and military of targeting U.S. government and defense networks as well as using cyber espionage to get its
hands on cutting-edge military technology. While this is not new news, it's new for the Pentagon's report. Also not new was China's
reaction, calling the accusations "groundless" and "hype." Some security experts were openly skeptical of the Pentagon's report, which did not
mention what cyberspying the U.S. does in return. The U.S. spends billions every year on "cyberdefense and constructing
increasingly sophisticated cyberweapons." NSA chief General Keith Alexander told Congress that the U.S. is "creating more than a dozen
offensive cyberunits, designed to mount attacks, when necessary, at foreign computer networks." Reuters added that the
NSA and
Department of Defense are "spending so heavily for information on holes in commercial computer systems,
and on exploits taking advantage of them, that they are turning the world of security research on its head." An
unnamed, former defense contractor for the U.S. said, "My job was to have 25 zero-days on a USB stick,
ready to go." Although intelligence agencies code some of their own zero-day exploits, Reuters said, "Private companies have also sprung
up that hire programmers to do the grunt work of identifying vulnerabilities and then writing exploit code. The starting rate for a zero-day is
around $50,000, some buyers said, with the price depending on such factors as how widely installed the targeted software is and how long the
zero-day is expected to remain exclusive." Vendors at secret snoop conferences pimp a variety of products to assist law enforcement and
intelligence agencies in spying. Vupen, which is well-known for selling zero-day exploits exclusively to governments, recently sent 12
researchers to an offensive hacking techniques conference where attendees heard talks such as "Advanced Heap Manipulation in Windows 8."
While the U.S. government sent more than a dozen people to the conference, it's unknown whether the acquired knowledge
or products are intended for spying on other governments or deploying malware for eavesdropping and
remote searches, aka virtual force and Trojan horse warrants. ReVuln is newer to the exploits-for-sale game than Vupen, but "specializes
in crafting exploits for industrial control systems that govern everything from factory floors to power generators." When "asked if they would
be troubled if some of their programs were used in attacks that caused death or destruction, they said: 'We don't sell weapons, we sell
information. This question would be worth asking to vendors leaving security holes in their products'." The Reuters article warned there are
"unintended consequences" when the U.S. chooses to exploit a vulnerability instead of warning the public about
the hole. If and when other countries discover the vulnerability, it could be reverse-engineered and used against U.S. corporations. One such
example was the Duqu malware that was "designed to steal industrial-facility designs from Iran." Duqu was copied by cybercriminals, who then
"rolled it into 'exploit kits,' including one called Blackhole and another called Cool." Despite Microsoft issuing a patch, F-Secure reported that
"hackers used it last year to attack 16 out of every 1,000 U.S. computers and an even greater proportion in some other countries." Security
researchers are increasingly reluctant to publicly report zero-days for no compensation. The vulnerability might be worth a small fortune after
all; and even
though Google and Facebook pay "bounties," the companies say "they are hard-pressed to
compete financially with defense-industry spending."
The USFG will use 0-Day exploits to gather intelligence information on US citizens
instead of disclosing them to software developers.
Wessler and Cortez 15
Nathan Freed Wessler, Staff Attorney, ACLU Speech, Privacy & Technology Project
& Dyan Cortez, Legal Assistant, ACLU Human Rights Program and Speech, Privacy, and Technology
Project, 6-26-2015, "FBI Releases Details of 'Zero-Day' Exploit Decisionmaking Process," American Civil
Liberties Union, https://www.aclu.org/blog/free-future/fbi-releases-details-zero-day-exploit-decisionmaking-process
In response to an ACLU Freedom of Information Act request, the FBI has released a set of internal slides that shed new light on the
federal government’s process for evaluating how to handle so-called “zero-day” vulnerabilities in software and
internet platforms. This new document helps inform the raging national debate about how to secure the nation’s infrastructure against
malicious hacking and foreign espionage. Zero-day
exploits are computer code that take advantage of security flaws
in software that are unknown to the software’s programmers and users. While security vulnerabilities can be
exploited by criminals and perpetrators of cyber-attacks, they can also be exploited by governments for
military, intelligence, and law-enforcement purposes. Zero-day exploits can, for example, be used to gain unauthorized
access to a computer system in order to deliver spyware or download sensitive user data. The most effective way to protect
systems from security exploits is for software developers to release a patch fixing the underlying flaw, but this is only possible if they are
notified of the security hole. Without
a patch, users remain vulnerable and potential targets can do very little to
protect themselves. This is what makes zero-day exploits so alarming. Back in April 2014, we filed a FOIA request to
obtain documents related to the policies followed by government agencies when they discover or acquire zero-day vulnerabilities and software exploits. In particular, we sought to learn how the government balances its own intelligence needs with
the importance of making sure that the software used by Americans is as secure as possible. After more than a year, the FBI
finally responded to our request by releasing a heavily redacted document that outlines key details of the
government’s internal process for dealing with zero-days. While scant on details, the document is significantly more illuminating than the
responses we and other groups have received from elsewhere in the government. As we know from reports about the formal US policy
regarding zero-day exploits, the
government reserves the right to keep a vulnerability secret—leaving all users
of the affected software open to attack—and use it offensively instead. The new document sheds light on the
process through which the government makes that decision. According to the document, after a federal agency learns of a zero-day vulnerability, representatives of “all concerned” US government agencies are informed and then
“participate in discussions” to decide how to balance the US government missions of “cybersecurity, information
assurance, intelligence, counterintelligence, law enforcement, military operations and critical
infrastructure protection.” The FBI recognizes that “in most circumstances” there will be a conflict between serving the interests of
“intelligence collection, investigative matters and information assurance.” But it fails to explain who will decide which interests to
privilege and what safeguards are in place to prevent the country’s many intelligence and law enforcement agencies
from overriding other voices. We remain concerned that the offensive intelligence needs of agencies will be prioritized
above cybersecurity. Fixing flaws, rather than stockpiling them, is the best way to make the Internet more secure. At a time when our
leaders in Washington seem to be more focused on the threat of cyber-attacks than ever before, it is vital that the intelligence community not
undermine efforts to improve the security of the computer systems upon which so much of our economy depends. Vulnerabilities should be
reported and fixed as soon as they are discovered, and the government has the capacity and obligation to assist software developers in doing
this. The
government’s practice of soliciting, purchasing, hoarding, and using zero-day exploits raises
serious concerns. We hope that the process described in the FBI’s document will lead to the right outcomes, but
we fear it will often not. Key questions remain unanswered, and the government should provide additional explanation so the public
can participate meaningfully in this important debate.
The NSA has historically used 0-Days for the same purposes as backdoors, Heartbleed
proves.
Zetter 15
Kim Zetter is an award-winning, senior staff reporter at Wired covering cybercrime, privacy, and
security. She is writing a book about Stuxnet, a digital weapon that was designed to sabotage Iran's
nuclear program, 3-30-2015, "US Used Zero-Day Exploits Before It Had Policies for Them," WIRED,
http://www.wired.com/2015/03/us-used-zero-day-exploits-policies/
AROUND THE SAME time the US and Israel were already developing and unleashing Stuxnet on computers in Iran, using five zero-day exploits to
get the digital weapon onto machines there, the government realized it needed a policy for how it should handle zero-day vulnerabilities,
according to a new document obtained by the Electronic Frontier Foundation. The
document, found among a handful of heavily redacted pages released after the civil liberties group sued the Office of the Director of National Intelligence to obtain them, sheds light on the
backstory behind the development of the government’s zero-day policy and offers some insight into the motivations
for establishing it. What the documents don’t do, however, is provide support for the government’s assertions that
it discloses the “vast majority” of zero-day vulnerabilities it discovers instead of keeping them secret and
exploiting them. “The level of transparency we have now is not enough,” says Andrew Crocker, a legal fellow at EFF. “It doesn’t answer a
lot of questions about how often the intelligence community is disclosing, whether they’re really following this process, and who is involved in
making these decisions in the executive branch. More transparency is needed.” The
timeframe around the development of the
policy does make clear, however, that the government was deploying zero-days to attack systems long
before it had established a formal policy for their use. Task Force Launched in 2008 Titled “Vulnerability Equities Process
Highlights,” (.pdf) the document appears to have been created July 8, 2010, based on a date in its file name. Vulnerability equities
process in the title refers to the process whereby the government assesses zero-day software security holes that
it either finds or buys from contractors in order to determine whether they should be disclosed to the
software vendor to be patched or kept secret so intelligence agencies can use them to hack into systems as
they please. The government’s use of zero-day vulnerabilities is controversial, not least because when it withholds information about
software vulnerabilities to exploit them in targeted systems, it leaves every other system that uses the same software also vulnerable to being
hacked, including U.S. government computers and critical infrastructure systems. According to the document, the equities process grew out of
a task force the government formed in 2008 to develop a plan for improving its ability “to use the full spectrum of offensive capabilities to
better defend U.S. information systems.” Making use of offensive capabilities likely refers to one of two things: either encouraging the
intelligence community to share information about its stockpile of zero-day vulnerabilities so the holes can be patched on government and
critical infrastructure systems; or using the NSA’s cyber espionage capabilities to spot and stop digital threats before they reach U.S. systems.
This interpretation seems to be supported by a second document (.pdf) released to EFF, which describes how, in 2007, the government realized
it could strengthen its cyber defenses “by providing insight from our own offensive capabilities” and “marshal our intelligence collection to
prevent intrusions before they happen.” One of the recommendations the task force made was to develop a vulnerabilities equities process.
Some time in 2008 and 2009 another working group, led by the Office of the Director of National Intelligence, was established to address this
recommendation with representatives from the intelligence community, the U.S. attorney general, the FBI, DoD, State Department, DHS and,
most notably, the Department of Energy. The Department of Energy might seem the odd-man-out in this group, but the DoE’s Idaho National
Lab conducts research on the security of the nation’s electric grid and, in conjunction with DHS, it also runs a control system security
assessment program that involves working with the makers of industrial control systems to uncover vulnerabilities in their products. Industrial
control systems are used to manage equipment at power and water plants, chemical facilities and other critical infrastructure. Although there
have long been suspicions that the DoE program is used by the government to uncover vulnerabilities that the intelligence community then
uses to exploit in the critical infrastructure facilities of adversaries, DHS sources have insisted to WIRED on a number of occasions that the
assessment program is aimed at getting vulnerabilities fixed and that any information uncovered is not shared with the intelligence community
for purposes of exploiting vulnerabilities. When a significant vulnerability in an industrial control system is discovered by the Idaho lab, it’s
discussed with members of an equities group—formed by representatives of the intelligence community and other agencies—to determine if
any agency that might already be using the vulnerability as part of a critical mission would suffer harm if the vulnerability were disclosed. Of
course, it should be noted that this also allows such agencies to learn about new vulnerabilities they might want to exploit, even if that’s not
the intent. Following the working group’s discussions with DoE and these other agencies throughout 2008 and 2009, the government produced
a document titled “Commercial and Government Information Technology and Industrial Control Product or System Vulnerabilities Equities
Policy and Process.” Note the words “Industrial Control” in the title, signaling the special importance of these types of vulnerabilities. The end
result of the working group’s meetings was the creation of an executive secretariat within the NSA’s Information Assurance Directorate, which
is responsible for protecting and defending national security information and systems, as well as the creation of the vulnerabilities equities
process for handling the decision-making, notification procedures and the appeals process around the government’s use and disclosure of zerodays. We now know, however, that the
equities process established by the task force was flawed, due to statements made
last year by a government-convened intelligence reform board and by revelations that the process had to
undergo a reboot or “reinvigoration” following suggestions that too many vulnerabilities were being
withheld for exploitation rather than disclosed. Equities Process Not Transparent The equities process was not
widely known outside the government until last year when the White House publicly acknowledged for
the first time that it uses zero-day exploits to hack into computers. The announcement came only after
the infamous Heartbleed vulnerability was discovered and Bloomberg erroneously reported that the
NSA had known about the hole for two years and had remained silent about it in order to exploit it. The
NSA and the White House disputed the story. The latter referenced the equities process, insisting that any time the
NSA discovers a major flaw in software, it must disclose the vulnerability to vendors to be patched—that is, unless
there is “a clear national security or law enforcement” interest in using it
Case Other
A2 Anonymity
Anonymity is a bad idea
Joel Brenner, 2014
Served as the inspector general of the National Security Agency from 2002-2006 and as senior counsel
of that agency from 2009-2010. In between, he served as the national counterintelligence executive
under the Director of National Intelligence, in charge of counterintelligence strategy and policy across
the federal government. "Mr. Wemmick's Condition; Or, Privacy as a Disposition, Complete with
Skeptical Observations Regarding Various Regulatory Enthusiasms" (Lawfare Research Paper Series)
http://www.lawfareblog.com/joel-brenner-mr-wemmicks-condition-or-privacy-disposition-complete-skeptical-observations-regarding
Another drum major in the parade of bad privacy ideas is the assertion, sometimes violently made by
people wearing masks in public places, that anonymity is a core aspect of privacy. Equating privacy with
anonymity turns privacy’s core instinct – the wish to be let alone – into an asserted right to interact
with others in a faceless, traceless, and unaccountable manner, even if one causes harm. This is a
fundamental contradiction and it turns privacy inside out. I’m not saying anonymity is irrelevant to
privacy. It protects my ability to go places, buy things, and read things I do not want others to know
about and do not want to admit doing. That is, anonymity protects hypocrisy.
This is hard to swallow if you believe hypocrisy is always evil. But that’s a shallow and moralistic
position. In some contexts hypocrisy is called courtesy, kindness, or regard for another’s feelings. In
other contexts it’s “the first Maxim of worldly Wisdom,” as Adams put it, and simply means concealing
from others sentiments and intentions such “as others have not a right to know.” Which is a pretty good
gloss on privacy. But in all these cases we are implicitly or explicitly dissembling. And in electronic
villages, as in physical villages, dissembling is difficult. This isn’t a matter of preference, it’s just a fact.
In political speech, anonymity protects security, not privacy. In some countries it protects political
speech and the right to organize and can be the difference between life and death. In American history
the anonymous pamphlet and petition played an important role in the struggle for independence, and we
value the secret ballot as proof against intimidation, though it is not guaranteed by the Constitution and
was not uniformly used in American Presidential elections until 1884 (twelve years after the British
adopted it). In China, relative anonymity in electronic communications has been a significant
impediment to the Chinese government’s ability to police free expression, which is why they have
outlawed it. But it is confused, and wrong, to think that protecting political organizing and political
speech from tyrants is a question of privacy rather than politics, which is a quintessentially public affair.
A free, civilized order requires that people be permitted to communicate any way they want, in relative
anonymity or otherwise, unless they harm someone else. This is one of the oldest and most basic
principles of both Roman and Common law, classically expressed as Sic utere tuo ut alienum non lædas,
which loosely means you must use what’s yours so as not to injure the lawful rights of others.118 When
people in a free and civilized order do interfere with the lawful rights of others, they have no right to
anonymity, even if anonymity were technically achievable.
Offcase
China DA
1NC China DA
Data localization key to Chinese innovation- NSA used as an excuse to develop
domestic companies
Tiezzi 14 (http://thediplomat.com/2014/12/chinas-quest-to-oust-foreign-tech-firms/, China's Quest to
Oust Foreign Tech Firms
If the Chinese government has its way, foreign technology will be absent from “key sectors” in China by
2020. Her main focus is on China, and she writes on China’s foreign relations, domestic politics, and
economy. Shannon previously served as a research associate at the U.S.-China Policy Foundation, where
she hosted the weekly television show China Forum. She received her A.M. from Harvard University and
her B.A. from The College of William and Mary. Shannon has also studied at Tsinghua University in
Beijing.)//A.V.
China wants to eliminate the use of foreign technology in key sectors by 2020, Bloomberg reported Wednesday,
citing people familiar with the campaign. The goal is to have domestic Chinese technologies in place in banks, state-owned enterprises, key government agencies, and the military. The Bloomberg report merely solidifies other signs that
China was looking to replace foreign technology firms with domestic ones, both for security reasons and
as part of China’s push to ramp up domestic innovation in the high-tech sector. Back in February, at the first
meeting of the Internet security and informatization group, President Xi Jinping called developing domestic technologies and
ensuring cyber-security “two wings of a bird” – equally crucial parts of China’s cyber strategy moving
forward. To be a cyber power, Xi said, China must develop its own technology. This became an urgent goal for China
after revelations from Edward Snowden about U.S. cyber espionage on Chinese institutions. The Snowden leaks crystallized longstanding Chinese uneasiness about over-reliance on foreign firms like Microsoft and IBM for its
technology needs. There’s also a growing sense of confidence in China that domestic alternatives are
now realistic in a way they weren’t in the past. China released its first home-grown operating system
this year with much fanfare. Bloomberg’s report notes that the recent drive to eliminate foreign tech by 2020
ramped up after a successful test where Microsoft’s Windows was replaced with China’s own NeoKylin
OS and foreign servers were replaced with Chinese-made ones. China has shown signs of a new determination to oust
foreign firms for months. Back in May, China targeted U.S. technology companies in what appeared to be a concerted attempt to lessen their
market share. Beijing banned the use of Windows 8 on government computers and encouraged Chinese banks to switch away from IBM
servers. Microsoft,
Apple, Cisco, and other firms came under heavy fire as national security risks. These
moves were partially in retaliation for the U.S. decision to indict five PLA officers on charges of economic
espionage, but the Bloomberg report suggests the campaign against U.S. tech firms was only part of a
larger strategy. More recently, Microsoft and Qualcomm became targets of a broader anti-monopoly
campaign that some argued was a thinly-veiled pretext for hamstringing the operations of foreign firms
within China. For China, the push to replace foreign technology with domestic tech is a perfect fusion of
national security interests and economic goals. Part of China’s economic reform involves transitioning
away from being a top manufacturer for other global brands in favor of fostering Chinese innovation –
especially in the technology sector. The Snowden leaks provided the perfect cover for a concerted push
in this direction. The U.S. can complain about overt economic favoritism giving an unfair advantage to
Chinese domestic firms, and could even potentially bring suit against China in the WTO. However,
national security concerns provide a blanket excuse for China to justify its actions. After all, the U.S. itself all but
blocked the Chinese firm Huawei from the U.S. market citing security concerns; China can argue it is simply returning the
favor. Eliminating foreign tech from key sectors by 2020 is an ambitious goal — but even if China doesn’t make that deadline, it will eventually
get there. Xi has placed an unusual emphasis on establishing China as a “cyber power,” both in terms of
market share and in terms of political influence. The campaign to switch over to domestic technologies
is one part of this, as is the new high-profile push for global acceptance of China’s “internet sovereignty”
concept. With the emphasis Xi is putting on this goal, don’t look for this campaign to go away any time soon — bad news for
Western companies like Microsoft and IBM, but great for their Chinese counterparts.
Growth slowdown collapses CCP Stability, the Global Economy and lashout
Lewis, 7 (Research Director of the Economic Research Council (Dan, April 19, 2007, “The nightmare of
a Chinese economic collapse,” World Finance, http://www.economicpolicycentre.com/wp-content/uploads/2010/10/The-nightmare-of-a-Chinese-economic-collapse.pdf)
A reduction in demand for imported Chinese goods would quickly entail a decline in China’s
economic growth rate. That is alarming. It has been calculated that to keep China’s society stable – i.e.
to manage the transition from a rural to an urban society without devastating unemployment – the minimum growth rate is 7.2 percent. Anything less than that and unemployment will rise and
the massive shift in population from the country to the cities becomes unsustainable. This is
when real discontent with communist party rule becomes vocal and hard to ignore. It doesn’t end
there. That will at best bring a global recession. The crucial point is that communist authoritarian states have at least
had some success in keeping a lid on ethnic tensions – so far. But when multi-ethnic communist countries fall apart
from economic stress and the implosion of central power, history suggests that they don’t
become successful democracies overnight. Far from it. There’s a very real chance that China might go
the way of Yugoslavia or the Soviet Union – chaos, civil unrest and internecine war. In the very
worst case scenario, a Chinese government might seek to maintain national cohesion by going to
war with Taiwan – whom America is pledged to defend.
2NC O/V
Data localization in China is key to developing a domestic tech sector that can
innovate on its own and drive the economy. A slowdown in economic growth causes
internal upheaval and global economic collapse; the CCP will lash out at Taiwan to pacify the
population, drawing the US into a confrontation.
AT: Startups In Dumps
China’s tech innovation is booming because of Domestic R&D investment
Weisfeld 15 (Feb, Editor’s note: Zack Weisfeld is general manager at Microsoft Ventures Global
Accelerators. , http://techcrunch.com/2015/02/03/is-there-a-tech-bubble-in-china/, Is There A Tech
Bubble In China?)//A.V.
I recently returned from my latest business trip to China, where I had the opportunity to meet with local entrepreneurs, VCs, and
representatives from China’s Silicon Valley, Zhong Guan Cun. This reinforced my belief that China
has become a hub for high-growth startups. The recent international success stories of Jack Ma’s Alibaba and Xiaomi, which just
raised $1.1 billion, are just a few examples. When I mentioned China’s growing number of technology startup companies to
some of the leading local VCs, they agreed that there had been an explosion in the market and mentioned the
increasing number of billion-dollar companies joining the ecosystem each year — most of which are
geared toward serving domestic customers. Many leading local VCs also acknowledged that the Chinese investment
market is currently on fire. An article in Forbes illustrates just that. The article points out that in the third quarter of 2014,
there were 107 venture investments in China’s technology sector with an aggregate deal value of $4.66
billion. This easily surpasses the 98 deals with a total value of $4.45 billion that occurred in 2013. VCs admit
that they are going with a “spray and pray” strategy, investing in as many early-stage startups as they can, for fear of missing out on the next
billion-dollar company. In a quick-to-copy, fast-to-roll-out market like China’s, the real competition is between giants like Baidu and Alibaba.
Smaller companies are often acquired to speed up growth or to eliminate the competition. This explains the extremely high number of M&As in
the technology sector in 2014 in China: 851 deals worth $47.5 billion, up 62 percent from 2013 ($29.3 billion via 654 deals) and the highest
volume on record for the sector (based on information from Dealogic.com). One
of the venture capitalists I spoke to
compared Chinese market investments to high-stakes poker: big risk and big reward. Seed investments
in China are exceptionally high and can range from several millions to tens of millions of dollars. But there is
a reason behind these unusual investment amounts: Chinese companies don’t have the necessary infrastructure that Western ecosystems
have, meaning that most companies need to build everything from scratch. This requires a bigger investment. In addition, most Chinese
startups serve local markets, which are usually easier to cater to. Lucky for them, their market happens to be the biggest in the world. The
Chinese entrepreneurs are a bit different than the typical startup founders. From everything I’ve learnt during my visits to China,
Chinese entrepreneurs are more independent, mature, and mentally stable – as well as armed with
strong interpersonal skills – than those found in other countries. Investors refer to them as “The new entrepreneur for
the modern lifestyle,” and while some investors may describe them as not being as creative as Western entrepreneurs, where they excel is
when it comes to execution. In China, the best operator, not the best innovator, carries the day. However, what
surprised me the
most during my last visit to China was to see how the entire entrepreneurial landscape in the country is
shifting. There is less bureaucracy involved in starting a new business than there was even a few years
ago, with a simultaneous increase in the amount of resources being assigned to research and
development (R&D). Additionally, entrepreneurial education and culture are developing – albeit slowly – and there are clear efforts
among numerous governmental authorities to encourage more private-sector involvement. While barriers continue to exist in the form of taxes
and regulations, China
is taking action to support innovation and growth for the long run. The Chinese
market is extremely hot. While startups go from nothing to billions and can often stage IPOs rather
quickly, they are also able to attract customers and produce revenue (when they are not bought), as is the
case with many of the companies coming out of our accelerator program in China.
AT: China Econ in Dumps
China’s economy is in a state of turbulence; innovation is critical to sustaining growth
Red Herring 15 (April 28, http://www.redherring.com/finance/chinese-tech-sector-evolving-amid-economic-turbulence/, Chinese tech sector evolving amid economic turbulence)//A.V.
Whatever it might be called, China is facing an unfamiliar period of economic turbulence, at least in the
modern era. Beijing has responded by emphasizing what has historically been considered a weakness:
innovation. Li Keqiang used the word twelve times at the World’s Economic Forum Meeting in Davos this past January, where he described
a specific vision for China’s future: “To foster a new engine of growth, we will encourage mass entrepreneurship
and innovation…[they are], in our eyes, a ‘gold mine’ that provides constant source of creativity and
wealth.” Se Yan, the Peking University professor, classifies innovation as belonging to one of two types. “Maker innovation” is
practiced chiefly in the U.S. “If it changes the way people live, it’s maker innovation,” says Yan, offering the
most concise definition possible. Then there is “process innovation,” which perfects existing production processes and has fueled economic
growth in Germany and Japan. With a move away from manufacturing, and a rising population of college graduates, Yan
believes that
innovation of the “maker” kind is needed if China is to realize its full potential. The Innoway project, located in
Beijing’s Haidian district, is one such initiative aimed at making China more technologically innovative. A joint venture between the
Zhongguancun Science Park and a privately held asset management group, together they have purchased an alleyway that was previously lined
with bookstores and converted the area into incubator space. The
goal is to become the center of China’s emerging start-up culture. According to the most recent data from the China Internet Network Information Report, the country has nearly 650 million
internet users. Around 560 million of them connect to the Internet through their smartphones. The
opportunity in front of Innoway is clearly rich, but the question of how the group plans to capture it is
still being worked out. Many of the incubators in Innoway have the same model; that is, they are actually coffee shops. Member
companies gain access to Wi-Fi, the camaraderie of other start-ups, and advice from the investors and sponsored events that are affiliated with
the cafes, all for the price of a cup of coffee. Almost a year old (it opened officially last June), around 20% of the early stage companies that
have frequented Innoway’s cafes have gone on to raise funding, according to the incubator’s PR officials. It
remains to be seen
whether an organic approach can produce the “unicorn companies” that will fuel new growth within the
Chinese economy. But in a place where people must use VPNs to connect to American websites, and euphemisms to search for
information related to 1989’s Tiananmen Square Incident, perhaps a lack of structure is exactly what the country’s technology sector needs.
EXT localization k2 China Econ/AT L/Turn
Extend: data localization is key to the Chinese economy. China is using foreign security
threats from the NSA as a pretext to promote domestic tech firms that can innovate on
their own and become the drivers of the economy.
Pushing out foreign firms is key to developing domestic firms with partnerships
Stratfor 15 (http://www.kvue.com/story/news/world/stratfor/2015/07/09/in-china-network-security-and-economic-interests-align/29920653/, In China, network security and economic interests align)//A.V.
China's defensive efforts in cyberspace will remain an uphill battle because of its continued reliance on foreign technology, increasing
cyberespionage concerns and enormous population of Internet users. China
will merge its network security imperatives
with ambitions to bolster its own domestic technology sector. China's need to boost network security
and tap into its own domestic demand will drive Beijing to become increasingly active in developing its
domestic industries through partnerships and overseas acquisitions of technology firms. Unlike the United
States, which must seek partnerships with the private sector and international partners in defending its online interests, Beijing assumes sole
responsibility for managing security over its immense population of Internet users and devices. This is not an easy task for China, which is
encountering a surge in domestic cybercriminals and foreign state actors with highly sophisticated offensive capabilities. Though
China
faces an uphill battle in developing its defenses in cyberspace, its security needs and virtual threats
from the West will help propel its ambition to expand into high-tech industries, thus combining China's
network security imperatives with its economic ambitions. China's National People's Congress passed a new national
security law July 1 comprising an extensive list of national security directives that addresses, among other concerns, network security. The
document is meant to provide top-level guidance, so its overall language is vague. However, the law highlights China's growing emphasis on
safeguarding against cyberattacks and espionage, as well as its drive to develop native technologies to support network security. Like the
United States, China is still developing its own view and strategy of defense in cyberspace. China's Internet Space In little over a decade, China's
Internet presence has experienced explosive growth that has made it the world's leader in terms of Internet users and devices. In 2004, there
were some 96 million Internet users in China. By 2014, this number had risen to 641 million, more than North America's entire population. Even
with this many users, less than half of China's population has regular Internet access. China's new national security law calls for extending
Internet services to rural areas, indicating the significant growth in Internet usage that could occur in the coming years. The rising number of
Internet users in China has paved the way for the expansion of social media platforms, online commerce giants and demands for high-tech
services and goods. Mobile Internet devices such as smartphones and tablets are extremely popular in China; in 2014, more Internet users
logged on via mobile devices than personal computers. China now boasts the largest online retail market in the world, with nearly $450 billion
in retail sales in 2014 (about a 50 percent increase from 2013). The scope
of China's Internet presence has helped Chinese
companies like Alibaba and Tencent become behemoths in the online world. Foreign technology
companies find the significant and growing demand for tech services and devices in China irresistible,
particularly since China's Internet space still stands on foreign-made technology. However, this demand also drives the Chinese
government's overarching push to develop indigenous high-tech industries and eventually become
innovative itself. Moreover, as a result of China's exploding online population, China's network security concerns have
grown, creating a need for online security technologies that China eventually could develop
domestically. The Problems As China's online economy surges, the government is encountering many of the same network security
challenges that other countries face. However, China also faces its own particular challenges. The country's increasing Internet population
means that the scale at which China must strictly monitor and control the information disseminated online, particularly via social media outlets,
is growing substantially. Thus, China has a pressing need to continue developing technologies to control such a large online crowd. China has
already created control strategies such as the "Great Firewall," which controls China's cyberspace links with the global Internet, and the Green
Dam, a software package installed on personal computers to monitor online activity. As attractive as China's Internet population is to both
domestic and foreign firms trying to meet the growing demand, it is also an attractive target for both domestic and foreign criminals. By some
estimates, the proliferation of malware on Chinese computing devices is among the most extreme in the world. In a 2014 report by Panda
Security, China's malware infection rate was estimated at 49 percent, the highest of countries listed in the report. The Chinese are significant
targets of cybercrime. A common form of cyberattack used to steal personal information, login credentials and deploy malware is known as
"phishing." Phishing is a tactic in which an attacker attempts to coax an unwitting victim into visiting a malicious website, often through e-mails
that entice users to visit a link. A critical component of such an attack often involves registering an Internet domain name to point to a
compromised or malicious server. According to a study by the Anti-Phishing Working Group, of the 27,253 malicious domain registrations
around the world in the second half of 2014, 84 percent were registered specifically to phish Chinese Internet users, particularly individuals
using Taobao, the Industrial and Commercial Bank of China, the Bank of China and Alipay services. A significant contributor to these security
issues is the substantial proliferation of pirated software. According to one estimate, 74 percent of software installed by Chinese users is
unlicensed (compared to 18 percent in the United States). Virtually all mainstream software platforms, from operating systems like Microsoft
Windows to graphics editors such as Adobe Photoshop, are pirated through end users and Chinese vendors. Users of pirated software often are
cut off from critical security updates, increasing the risk of being targeted for cyberattacks including the use of malware. Given the high rate of
unlicensed software and the large population of Internet users, China is ripe for lone actors and organized crime to target with malware.
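The lookalike-domain tactic described above can be sketched with a toy defensive heuristic. This is illustrative only: the brand list and similarity threshold are hypothetical, not drawn from the Anti-Phishing Working Group study, and real phishing detection uses far richer signals.

```python
import difflib

# Hypothetical brand list mirroring the services named in the APWG study.
TARGET_BRANDS = ["taobao.com", "icbc.com.cn", "boc.cn", "alipay.com"]

def looks_like_phish(domain: str, threshold: float = 0.8) -> bool:
    """Flag domains suspiciously similar to (but not equal to) a known brand."""
    domain = domain.lower().strip(".")
    for brand in TARGET_BRANDS:
        if domain == brand or domain.endswith("." + brand):
            return False  # exact match or legitimate subdomain: not flagged
        ratio = difflib.SequenceMatcher(None, domain, brand).ratio()
        if ratio >= threshold:
            return True   # near-miss of a brand name: likely lookalike
    return False

print(looks_like_phish("taobao.com"))    # legitimate domain
print(looks_like_phish("tao-bao.com"))   # lookalike registration
```

The heuristic captures only the domain-registration component of the attack; the malicious server and lure e-mail described in the card are outside its scope.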
Foreign states, particularly the U.S. government, also pose a substantial threat to China as cyberespionage escalates year after year. Online
espionage has led to Beijing's growing distrust of foreign-made devices and software. The classified documents leaked by former NSA
contractor Edward Snowden bolstered this distrust by revealing that the United States had compromised widely used software packages and
networking devices prior to their export to foreign countries. This tactic is a particular concern for China given its reliance on foreign-made
technology to prop up its cyberspace. However, China is constrained by its relative lack of domestic high-tech capabilities. China's
Strategy The recent national security law is the latest step in China's efforts to bolster its defenses in cyberspace, and one that further
legislative announcements are likely to follow in the coming months. The law essentially calls for the combination of
China's ambitions in the technology sector with its network security needs. To accomplish this, China intends to
leverage the factors that make network security a more important priority — namely, China's enormous Internet presence and
market, which attracts foreign technology companies, and its distrust of foreign-made technology,
which raises demand for alternatives. China's efforts to become an innovator in information technology reach back more than a
decade; connecting those efforts with network security is more recent. In February 2014, President Xi Jinping assumed the role as head of the
newly created Central Internet Security and Informatization Leading Group charged with overseeing China's national cybersecurity. During Xi's
assumption of the role, he
likened network security and the development of indigenous information
technology to "two wings of a bird and two wheels of an engine." Following the creation of the Central Internet
Security and Informatization Leading Group, the government aimed to replace foreign-made technology with
domestically built technology in its banking sector, military, government agencies and state-owned
enterprises. In September 2014, the China Banking Regulatory Commission mandated that by 2019, 75 percent of Chinese financial
institutions' information technology infrastructure would rely on "secure and controllable" technologies, implying a switch to domestic
technology or the use of foreign technology for which the government has thorough access to the associated intellectual property. However,
the mandate has since been delayed, probably in part because of foreign pressure but also because Chinese-built systems are simply not
effective yet and would even likely increase security vulnerabilities. To
shed its dependence on foreign-made technology,
China will have to develop its high-tech industry to the point of becoming an innovator itself. China's efforts
to combine domestic innovation and cybersecurity have worried many foreign tech firms operating in China, which see the strategy as an
indicator that China will become even more aggressive in acquiring sensitive intellectual property to drive the development of its domestic
industry.
China is going through the typical stages of development for East Asian economies: first
exporting low-end manufactured goods, then moving into middle-end manufacturing while licensing
foreign technology, next imitating or developing technology independent of licensing, then focusing on
middle- and high-end manufacturing before finally beginning its own innovations. In some sectors, such as
building low- and high-end servers, China has already begun its attempts to compete with foreign companies in the domestic market through
Chinese companies like Inspur, which is using fears of foreign technology bolstered by the Snowden leaks to persuade Chinese consumers to
move away from its competitor, IBM. In
another example of China attempting to push away foreign technology,
China banned the installation of the Microsoft Windows 8 operating system on government computers
in May 2014. It should be noted that Snowden's documents specified that Microsoft allowed backdoor
access to encrypted communication in its Outlook.com service. Beijing would have been aware of this for at least a year
before banning Windows 8. China's need to boost cybersecurity and eventually begin serving its own technology
demands will motivate the country to become increasingly active in developing its domestic industries
through overseas acquisitions of technology firms and through partnerships. To the ire of foreign firms,
partnerships with foreign technology companies will serve China's strategy to "co-innovate" and "re-innovate" — in other words, to imitate or develop technology independent of licensing. Moreover, industrial espionage and intellectual
property theft will continue. For many foreign technology companies, the sheer size of China's growing technology demands makes having a
stake in the market a necessity. This limits companies' ability to push back against China's ambitions. The revelations
from Snowden's documents and growing network security concerns give China a pretext to push foreign partners to further disclose sensitive
intellectual property and possibly even to install backdoors. In
this way, China will be able to accelerate its progress
toward developing its own domestic technology sector, giving domestic firms the ability to meet the
country's demand for technology while greatly improving China's cyberdefense capabilities.
Best empirical, controlled study confirms promoting domestic partnerships over
foreign firms is key to Chinese economic innovation
Gong et al. 15 ( http://www.businessspectator.com.au/article/2015/7/27/china/why-chinas-techsector-taking, Why China's tech sector is taking off)//A.V.
Prior to its WTO accession in 2001, China’s external trade policy was rather restrictive, particularly so with respect to foreign investment, and
more particularly so with respect to the influence foreign firms might exert on their Chinese affiliates. Keeping
foreign investment
under tight control was perceived as an inevitable necessity in order to stay on track with the main
emphases of China’s investment policy -- to produce exports and to promote technology transfer. Thus, the
Catalogue of Industries for Guiding Foreign Investment, issued in 1995, classified FDI into the categories of 'prohibited', 'restricted', 'permitted'
or 'encouraged', and imposed numerous restrictions and a cumbersome examination and approval system on it, particularly in certain key
industrial sectors. In the run-up to WTO accession, the Chinese government reverted to a policy of making investment in China more attractive
to foreign investors. Successive amendments of the Catalogue increased the number of sectors open to foreign investment (from 186 sectors in
1997 to 262 in 2002), allowed for wholly foreign-owned enterprises and removed requirements for these such as exporting a stipulated
percentage (which used to be 70 per cent!) of their Chinese production. To further accelerate the pace of introducing advanced technologies
from abroad, China particularly encouraged FDI to its high-tech industries, including electronics and information, software, aeronautics and
astronautics. It then turned out that promoting advanced technology and exporting seemed to be closely linked to each other, as the products
incorporating the new technologies have become the key to the rapid expansion of China’s exports. And wholly foreign-owned enterprises have
become the most popular form of FDI in China. FDI, technology upgrading and exports One question now is whether the specific FDI ownership
structure, particularly the emergence of many wholly foreign-owned enterprises, has affected China’s technology upgrading and its exports.
Two contradicting arguments could be raised. Either, one may expect a higher share of foreign
ownership to lead to higher levels of investment in technology and skills, to technology upgrading and to
higher export activity. The underlying idea would be that foreign high-tech firms might prefer
introducing their valuable new technologies via wholly owned affiliates rather than via joint ventures.
Or, one may make a case for a higher share of foreign ownership leading to lower levels of technology
investment and skill upgrading. This could be the case because foreign firms might tend to cherry pick
the best targets for wholly-owned takeovers, such as those with little need for technology upgrading, or
because they might like to integrate wholly-owned affiliates completely into their international
production network, stripping them of their R&D activities and relocating these to the headquarters.
The theoretical expectation is, therefore, ambiguous and needs to be decided by empirical evidence.
Our recent study analyses such effects of foreign owned firms on China’s take-off after the lifting of the
hitherto existing restrictions. Based on the Annual Reports of Industrial Enterprise Statistics for the
period 2001 to 2007, by the China National Bureau of Statistics, some 27,500 Chinese firms are analysed
that had no exporting and no R&D activities prior to 2002. The study then looks at the likelihood of
these firms to take up exporting and R&D activities, conditional on five different types of ownership:
Continuous domestic ownership (foreign ownership share 0 per cent; 26,004 cases)
Small minority acquisition (foreign ownership share < 25 per cent; 152 cases)
Minority acquisition (foreign ownership share 25 per cent – 49 per cent; 497 cases)
Majority acquisition (foreign ownership share 50 per cent – 99 per cent; 349 cases)
Whole acquisition (foreign ownership share 100 per cent; 511 cases)
The data allow us to further
differentiate between foreign acquisition by ethnic Chinese or truly foreign investors, also between private or state ownership for the
remaining domestic share of the acquired firm. The
study then sets out to assess the specific effects on the exporting
and R&D performing behaviour for the group of firms being acquired by foreign investors, as distinct
from the control group of firms remaining completely in domestic ownership. The study takes selection
biases into consideration -- it considers that the very characteristics that may make a firm attractive for
partial or whole acquisitions may also influence its general likelihood of starting to export or to perform
R&D, irrespective of it being actually acquired by a foreign owner. These characteristics may distinguish it from the
domestic-owned firms, may impair the utility of the latter to serve as control group, and may lead to biased estimations. The study deals with
such selection biases in the observables by adopting propensity score reweighting in combination with covariate adjusted regressions, to obtain
a so-called doubly robust estimator. Figure 1 displays the results of the study, the differential effects of the various kinds of foreign ownership
as compared to complete domestic ownership. These effects are estimated for the acquisition year, one year after, and two years after
acquisition.
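The "doubly robust" method the study describes (propensity score reweighting combined with covariate-adjusted regression) can be sketched on synthetic data. This is a generic AIPW illustration, not the authors' code; the firm characteristics, outcome model, and true effect of 2.0 are all simulated assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: X = firm characteristics, T = foreign acquisition (0/1),
# Y = outcome (e.g., export activity). True treatment effect is 2.0.
n = 5000
X = rng.normal(size=(n, 2))
true_propensity = 1 / (1 + np.exp(-(0.5 * X[:, 0] - 0.5 * X[:, 1])))
T = rng.binomial(1, true_propensity)
Y = 1.0 + X[:, 0] + 0.5 * X[:, 1] + 2.0 * T + rng.normal(size=n)

def fit_propensity(X, T, iters=25):
    """Logistic regression by Newton's method (the propensity score model)."""
    Xb = np.column_stack([np.ones(len(X)), X])
    w = np.zeros(Xb.shape[1])
    for _ in range(iters):
        p = 1 / (1 + np.exp(-Xb @ w))
        grad = Xb.T @ (T - p)
        H = (Xb * (p * (1 - p))[:, None]).T @ Xb
        w += np.linalg.solve(H, grad)
    return 1 / (1 + np.exp(-Xb @ w))

def fit_outcome(X, Y, mask):
    """Covariate-adjusted linear outcome model, fit on one treatment arm."""
    Xb = np.column_stack([np.ones(int(mask.sum())), X[mask]])
    beta, *_ = np.linalg.lstsq(Xb, Y[mask], rcond=None)
    return np.column_stack([np.ones(len(X)), X]) @ beta

e = fit_propensity(X, T)          # estimated propensity scores
mu1 = fit_outcome(X, Y, T == 1)   # predicted outcome if acquired
mu0 = fit_outcome(X, Y, T == 0)   # predicted outcome if domestic

# AIPW (doubly robust) estimate: consistent if EITHER the propensity
# model OR the outcome model is correctly specified.
ate = np.mean(mu1 - mu0 + T * (Y - mu1) / e - (1 - T) * (Y - mu0) / (1 - e))
print(f"estimated treatment effect: {ate:.2f}")
```

The "doubly robust" label comes from the final line: the reweighting term corrects the outcome model's errors, and vice versa, which is why the study pairs the two techniques.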
Foreign-owned firms are more likely to start exporting after being acquired than firms that
remain purely domestic-owned (blue columns being all positive). While this is true for all types of foreign-owned firms, the export
starting effect is a little less for firms with majority ownership than for all others. With regard to the effect of whole ownership on exporting
behaviour, no obvious advantage or disadvantage over ownership with smaller shares can be detected. The
estimated effects are
percentage changes, e.g., a wholly-acquired affiliate is about 8 per cent more likely to start to export in
the acquisition year than a purely domestic control group firm. Moreover, further calculations reveal that exporting
behaviour is usually not increased if the foreign investor is of Chinese ethnicity, but is frequently spurred if the remaining domestic part of the
firm is state-owned. As to taking up R&D activities, the findings are more mixed (red columns being either positive or negative). In
several
cases, the likelihood of taking up R&D increases after the foreign acquisition, most strongly in the case
of minority acquisitions. The estimated effects also tend to increase over time. Yet, at least for whole acquisitions, it is
even less likely to start R&D activities than for the still domestic-owned firms. Thus, in this case, foreign
investors seem to count on their home facilities of R&D rather than investing in their Chinese affiliate for
this purpose. Chinese ethnicity of the foreign investor, again, mostly does not help improve the uptake of R&D activities in China, and
state ownership of the remaining domestic part of the acquired firm gives mixed results. Summarising, the liberalisation of FDI in China seems a
success story from the perspective of Chinese policymakers. The
inflows of foreign investment thereafter have
contributed considerably to the policy objective of raising Chinese export activities. And they have also
contributed to the policy objective of increasing R&D activities in China, though only in the case of joint
ventures and not in that of wholly-owned firms. The results provide solid evidence that joint ventures
between foreign owners and Chinese firms can contribute positively to China’s science and technology
take-off.
AT: Tech Not key to their econ
Promoting domestic tech firms in China is the key internal link to every facet of their
economy
Whyman 14 (September 10, http://www.brookings.edu/blogs/techtank/posts/2014/09/10-chinatech-challenge, China’s Tech Challenge and Opportunity)//A.V.
Can China become a global power in technology? China’s ambitious effort to build its own tech sector is
poised to achieve success. China wants to improve its tech industry: first to advance its economic
development and increase innovation, and second to increase national security. China wants its own
tech sector. Its efforts towards this goal will have a big impact on global economic growth and US-China
relations. China’s Disruptive Rise China is structuring its markets to give its domestic champions space to
grow. In recent weeks antitrust actions against Qualcomm and Microsoft have come to a head, Cisco and
IBM have faced persistent market access challenges in China, and Apple’s iPhone has appeared on CCTV
as a potential national security threat. Global tech companies fear they are losing access to one of the most important tech
markets. Meanwhile, governments are trading cyber security recriminations. Yet, China can accelerate growth and innovation
in the global tech sector too. China Tech is Different This is not just a “trade issue.” It is systemic. China’s
efforts to support its own technology sector are engineered into the very sinews of its economy,
spanning tax, subsidies, antitrust, procurement, standards, localization, and technology transfer. The
government shapes the tech market in deep and granular ways. Further, China is leveraging access to its
large domestic market; whereas, Japan and Korea have emphasized global export-led strategies and had
more incentive to accept global rules. From an American perspective, earlier economic disputes with Japan and Korea were
contained within more developed political relationships and institutions. Big Consequences The Chinese market is often the largest
source of growth for global tech companies and is the largest market for PCs, mobile phones, and e-commerce. The tech supply chain
has migrated to China, increasing interdependence. Huawei, ZTE, Lenovo, Xiaomi, Inspur, Baidu, Alibaba,
and Tencent have grown into major companies. They drive growth in the Chinese economy and their
entrance into global markets will transform the competitive landscape. Increasing Commercial Conflicts It is critical
that the world’s top two economies resolve these trade-related issues:
China has launched anti-trust investigations against tech giants Qualcomm and Microsoft, and is becoming more assertive in global mergers, e.g. the pending review of Applied Materials and Tokyo Electron.
Cross-border investments: China’s Lenovo purchased IBM’s server business and Motorola. Yet, it is not clear whether foreign companies can purchase or fully own Chinese tech companies.
IBM, CSCO, Oracle, EMC, and others fear they’re being shut out of large parts of the Chinese market. Huawei and ZTE have reciprocal concerns.
Intellectual property enforcement and mandated technology transfer.
The planned investments and policies of the Chinese government in the semiconductor industry.
Internet governance, such as differing perspectives on domain names.
There are
no easy answers, but the world’s two largest economies
need to find ways to manage these differences. The success of America and China in addressing these
issues will make the world richer or poorer, and safer or more insecure.
EXT China Lashout
Unrest in China causes global econ collapse and hyper-nationalism that
guarantees nuclear lashout
Bleicher, 11 – principal in his consulting firm, The Strategic Path LLC, adjunct professor at Georgetown
University Law School, and contributor to Foreign Policy In Focus (9/13, Samuel, “Is China Heading for
Collapse?,” Inter-Hemispheric Resource Center Press, ProQuest)//schnall
The West and China's Decline A great deal of evidence suggests that China's current growth is unsustainable for a variety of reasons. If so, it
faces hard times ahead, regardless of the character of the governing regime. China's
decline, especially if it results in chaotic
conditions there, is likely to damage the economic well-being of the United States and the international
community, which are depending on China to be the engine of global economic growth and a partner for
peaceful cooperation on nuclear proliferation. Long-term economic stagnation or decline carries important implications for
the foreign policies of the United States and the international community. Over the last three decades the United States and Europe have
followed a range of policies designed to integrate China into the global economic and political systems. In recent years, however, a growing
chorus in Congress has argued that it is time, now that China is so strong, to insist that it change its self-interested policies on currency
valuation, intellectual property protection, and human rights. But changing U.S. and European direction at this time could be just the wrong
move. Although
China may look like a rising economic and political competitor today, that situation could
quickly reverse. The wildly erroneous predictions of "Japan as #1" three decades ago should warn the outside world not to over-react to
the "China threat." Punitive U.S. measures in response to China's mercantilist trade and currency policies and
disregard for intellectual property rights, however justifiable on the merits, could create the impression in China that the US
has created, or at least hastened and deepened its economic stagnation. The United States should avoid
providing the CPC regime any excuse to claim the United States is the cause of China's woes. If the
Chinese people as a whole ever adopt that view, U.S.-China relations could be poisoned for decades.
The United States and the international community should also recognize that, as China's economy
deteriorates, any confrontational military maneuvers are likely to be met with escalation rather than
compromise. Confrontation would tempt a struggling CPC regime to adopt a jingoistic, rally-'round-the-flag patriotism. The CPC regime already inculcates and makes political use of anti-Japanese feelings among Chinese born long after the
end of World War II. "Foreign threats" would serve both to encourage public unity and to justify crushing
whatever real or perceived internal opposition exists. It would also favor increased military expenditures
and distract people from adverse economic and ecological developments. The United States should make serious efforts to
avoid becoming the new enemy. It will need to tread carefully to avoid making China's economic decline
worse both for China and the rest of the world.
Topicality
T – Domestic
Cybersecurity is foreign.
Fred H. Cate, Professor of Law at Indiana, 2015,
“Edward Snowden and the NSA: Law, Policy, and Politics” in the Snowden Reader ed. Fidler, David P.
ProQuest ebrary
While the president ordered federal agencies to strengthen cyber defenses, and the intelligence
community identified weaknesses in those defenses as part of the greatest threat the country faces,
documents leaked by Snowden suggest the NSA has actually been weakening cyber security. According
to a number of the documents, in its quest for tools to break into the communications of other
countries and industries, the NSA has worked deliberately to weaken cyber security.
T – Vulnerabilities = foreign
Backdoors are used for foreign surveillance.
Bellovin, 2014,
Steven M. Bellovin is a professor of computer science at Columbia University., et al. "Lawful hacking:
Using existing vulnerabilities for wiretapping on the Internet." Northwestern Journal of Technology and
Intellectual Property. 12 (2014).
The question of when to report vulnerabilities that are being exploited is not new for the U.S.
government. In particular, the National Security Agency (NSA) has faced this issue several times in its history, as we discuss below. 153
The NSA performs two missions for the U.S. government: the well-known mission of signals intelligence,
or SIGINT, which involves “reading other people’s mail,”231 and the lesser-known mission of
communications security, COMSEC, which involves protecting U.S. military and diplomatic
communications.232 In principle, it is extremely useful to house the U.S. signals intelligence mission in the same agency as the U.S.
communications security mission because each is in a position to learn from the other. SIGINT’s ability to penetrate certain communication
channels could inform COMSEC’s knowledge of potential weaknesses in our own and COMSEC’s awareness of security problems in certain
communications channels might inform SIGINT’s knowledge of a target’s potential weakness.
154 Reality is in fact very different. COMSEC’s awareness of the need to secure certain communications channels has often been thwarted by
SIGINT’s desire that patching be delayed so that it can continue to exploit traffic using the vulnerability in question. How
this
contradictory situation is handled depends primarily on where the vulnerable communications system is
operating. If the insecure communications system is being used largely in the U.S. and in smaller nations
that are unlikely to harm the U.S., then patching would not hurt the SIGINT mission. In that situation,
COMSEC is allowed to inform the vendor of the vulnerability. In most other instances, informing the vendor is
delayed so that SIGINT can continue harvesting product. Although this was never a publicly stated NSA
policy, this modus operandi was a fairly open secret.233
T Surveillance
1. Interpretation – Surveillance is systematic and routine attention to personal details
for the purposes of influence or direction.
Richards, 2013 Neil M. Professor of Law, Washington University. "The Dangers of Surveillance" Harv.
L. Rev. 126 1934 http://www.harvardlawreview.org/wp-content/uploads/pdfs/vol126_richards.pdf
What, then, is surveillance? Scholars working throughout the English-speaking academy have produced a thick descriptive literature examining
the nature, causes, and implications of the age of surveillance.6 Working under the umbrella term of “surveillance studies,” these scholars
represent both the social sciences and humanities, with sociologists making many of the most significant contributions.7 Reviewing the vast
surveillance studies literature, Professor David Lyon concludes that surveillance is primarily about power, but it is also about personhood.8 Lyon
offers a definition of surveillance as “the
focused, systematic and routine attention to personal details for
purposes of influence, management, protection or direction.”9 Four aspects of this definition are noteworthy, as they
expand our understanding of what surveillance is and what its purposes are. First, it is focused on learning information about
individuals. Second, surveillance is systematic; it is intentional rather than random or arbitrary. Third,
surveillance is routine — a part of the ordinary administrative apparatus that characterizes modern
societies.10 Fourth, surveillance can have a wide variety of purposes — rarely totalitarian domination,
but more typically subtler forms of influence or control.11
2. Violation – Backdoors are not surveillance because there is no observation of
personal information.
David Omand 15, 3-19-2015, visiting professor at King’s College London. "Understanding Digital
Intelligence and the Norms That Might Govern It," Global Commission On Internet Governance Paper
Series: no. 8 — March 2015 , https://www.cigionline.org/publications/understanding-digitalintelligence-and-norms-might-govern-it
The second issue concerns how an invasion of privacy of digital communications is defined. Is it when
the computer of an intercepting agency accesses the relevant packets of data along with the rest of the streams of
digital information on a fibre optic cable or other bearer? Or is it when a sentient being, the intelligence analyst, can actually
see the resulting information about the communication of the target? Perhaps the most damaging loss of trust from
the Snowden allegations has come from the common but unwarranted assumption that access in bulk to large volumes of digital
communications (the “haystack”) in order to find the communications of intelligence targets (the wanted “needles”) is evidence of mass
surveillance of the population, which it is not.
The distinction is between authorizing a computer to search through bulk data on the basis of some
discriminating algorithm to pull out sought-for communications (and discard the rest) and authorizing an
analyst to examine the final product of the material thus filtered and selected. It is the latter step that
governs the extent of, and justification for, the intrusion into personal privacy. The computer filtering is, with the
right discriminator, capable (in theory, of course, not in actual practice) of selecting out any sought-for communication. But that does not mean
the population is under mass surveillance.47 Provided the
discriminator and selection program chosen and used by
the accessing computer only selects for human examination the material that a warrant has authorized,
and the warrant is legally justified, then the citizens’ privacy rights are respected. Of course, if the selectors were
set far too broadly and trawled in too much for sentient examination, then the exercise would fail to be proportionate (and would be unlawful,
therefore, in most jurisdictions).
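The two-stage distinction Omand draws — automated filtering of bulk data against warranted selectors, followed by human examination of only the filtered residue — can be sketched in code. This is purely an illustrative model, not anything from the source: the message fields, selector matching, and proportionality threshold below are all hypothetical assumptions.

```python
# Illustrative sketch of Omand's two-stage model: a computer applies
# a warrant's selectors to bulk traffic and discards non-matches;
# only the filtered remainder reaches a human analyst, subject to a
# proportionality check.
from dataclasses import dataclass

@dataclass
class Message:
    sender: str
    recipient: str
    body: str

def machine_filter(bulk_stream, warranted_selectors):
    """Stage 1: automated filtering. Messages matching no selector
    are discarded and never seen by a person."""
    return [m for m in bulk_stream
            if m.sender in warranted_selectors
            or m.recipient in warranted_selectors]

def analyst_queue(filtered, warrant_limit):
    """Stage 2: human examination. If the selectors trawled in too
    much for sentient review, the exercise fails as disproportionate."""
    if len(filtered) > warrant_limit:
        raise ValueError("selectors set too broadly; not proportionate")
    return filtered

bulk = [Message("a@x", "b@y", "hello"),
        Message("target@z", "c@y", "meet at noon")]
seen_by_analyst = analyst_queue(machine_filter(bulk, {"target@z"}),
                                warrant_limit=10)
```

On this model, the privacy intrusion Omand cares about occurs only at stage 2; the negative's violation argument is that stage 1 alone involves no observation of personal information.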
3. Prefer our interpretation
A. Limits are necessary for negative preparation and clash, and their interpretation
makes the topic too big.
B. Including data gathering as surveillance makes the topic unmanageable and
undebatable for the negative.
4. Topicality is a voting issue.
Links
1NC Terror Link
Without access to backdoors, law enforcement won’t have the capacity to collect
intelligence data because of increasingly complex encryption
AP 7/8 (Eric Tucker, “FBI, JUSTICE DEPT. TAKE ENCRYPTION CONCERNS TO CONGRESS” Associated
Press,
http://hosted.ap.org/dynamic/stories/U/US_FBI_ENCRYPTION?SITE=AP&SECTION=HOME&TEMPLATE=DEFAULT&CTIME=2015-07-08-06-22-03)
WASHINGTON (AP) -- Federal law enforcement officials warned Wednesday that data encryption is making it
harder to hunt for pedophiles and terror suspects, telling senators that consumers' right to privacy is not
absolute and must be weighed against public-safety interests. The testimony before the Senate Judiciary Committee
marked the latest front in a high-stakes dispute between the Obama administration and some of the world's most influential tech companies,
placing squarely before Congress an ongoing discussion that shows no signs of an easy resolution. Senators, too, offered divided opinions. FBI
and Justice Department officials have repeatedly asserted that encryption technology built into
smartphones makes it harder for them to monitor and intercept messages from criminal suspects, such
as Islamic State sympathizers who communicate online and child predators who conceal pornographic
images. They say it's critical that they be able to access encrypted communications during investigations,
with companies maintaining the key to unlock such data. But they face fierce opposition from Silicon Valley companies
who say encryption safeguards customers' privacy rights and offers protections from hackers, corporate spies and other breaches. The
companies in recent months have written to the Obama administration and used public speeches to argue for the value of strong encryption.
FBI Director James Comey, who has pressed his case repeatedly over the last year before think tanks and in other settings, sought
Wednesday to defuse some of the tension surrounding the dispute. He told senators that he believed technology companies were
fundamentally on the same page as law enforcement, adding, "I am not here to fight a war." "Encryption is a great thing. It keeps us all safe. It
protects innovation," Comey said. "It protects my children. It protects my health care. It is a great thing." But he warned that criminals were
using encryption to create a safe zone from law enforcement. He
said that concern was especially acute at a time when
the Islamic State has been recruiting sympathizers through social media and then directing them to
encrypted platforms that federal agents cannot access. "Our job is to look at a haystack the size of this
country for needles that are increasingly invisible to us because of end-to-end encryption," he said.
Terror DA – Link Encryption
Encryption causes terrorism.
Geller, 2015
Eric Geller, Deputy Morning Editor, The Daily Dot, 7-10-2015, "The rise of the new Crypto War," Daily
Dot, http://www.dailydot.com/politics/encryption-crypto-war-james-comey-fbi-privacy/
If universal encryption became the norm, the FBI director said in a May 20 Q&A session at Georgetown Law School, “all of
our lives, including the lives of criminals and terrorists and spies, will be in a place that is utterly
unavailable to the court-ordered process.” It was in this climate of heated rhetoric and government-versus-industry sparring
that the House Committee on Oversight and Government Reform convened a hearing on April 29 called “Encryption Technology and Potential
U.S. Policy Responses.” Amy Hess, executive assistant director of the FBI’s Science and Technology Branch, represented the bureau. Joining her
in calling for backdoors was Daniel Conley, the district attorney for Suffolk County, Massachusetts. Hess
repeatedly invoked the
specter of terrorism when she described the dangers of strong encryption. She described the warrant
process for encrypted electronic devices as “an exercise in futility,” adding, “Terrorists and other
criminals know this and will increasingly count on these means of evading detection.”
Encryption makes it hard to track terrorist activity.
David Ignatius 15,
David Ignatius writes a twice-a-week foreign affairs column for the Washington Post and is an associate
editor, 6-16-2015, "The nation’s point man against cyberattacks," Washington Post,
http://www.washingtonpost.com/opinions/the-nations-point-man-against-cyberattacks/2015/06/16/4be6e950-1466-11e5-89f3-61410da94eb1_story.html
The place where security cooperation is most needed, and hardest to get these days, is Silicon Valley.
Post-Snowden, U.S. tech companies are running scared and embracing strong encryption in an effort to convince customers they can protect
their privacy. This backlash has gone so far that it’s hurting the country, Johnson believes. In a speech in San
Francisco in April, he appealed to tech companies to think of the public’s safety, as well as their corporate reputations on privacy issues.
“Encryption
is making it harder for your government to find criminal activity and potential terrorist
activity,” he warned. The latest cyberassault on OPM records shows just how vulnerable the U.S. government and private companies
are to hacking. “We need each other, and we must work together. There are things government can do for you, and there are things we need
you to do for us,” he told the San Francisco audience. Johnson is right, and he needs to say it louder. In
cyber and terror attacks, the
“bad guys” are getting stronger, and demoralized Western governments are getting weaker. This
imbalance needs to be fixed.
Terror DA Link – Zero days
The plan is unilateral disarmament.
Lewis, 2014
James A. Lewis, Senior Fellow and Program Director at the Center for Strategic and International Studies
(CSIS). Before joining CSIS, he worked at the Departments of State and Commerce. He was the
Rapporteur for the 2010 and the 2012–13 United Nations Group of Governmental Experts on
Information Security. (2014) “Heartbleed and the State of Cybersecurity,” American Foreign Policy Interests:
The Journal of the National Committee on American Foreign Policy, 36:5, 294-299
The torrent of leaks about how the NSA evaded private defenses points to the limitation of any technical solution for cybersecurity. There
is
no silver bullet. When the United States decontrolled encryption in 1999, the technology community
celebrated a victory in the ‘‘crypto wars.’’5 It was, in fact, an intelligence coup: the public and potential
opponents thought they had won and were safe when they were actually incredibly vulnerable. Major
intelligence agencies are formidable opponents, employing thousands of people and spending hundreds of millions of dollars to penetrate
online defenses. They have decades of experience and have access to sophisticated technologies—not just supercomputers— unavailable to
the private sector. They are also not bound by the laws of their foreign targets.
The NSA is not alone in this—Russia’s Federal
Security Service (FSB) is as good if not better. A thriving global black market in software vulnerabilities
has emerged, allowing hackers to sell unknown vulnerabilities to companies and intelligence agencies.
Some have called for the United States to refrain from buying from the black market, but this would be a
form of unilateral disarmament most likely resulting in a less-effective NSA. It would also save our
opponents some money by lowering the price of black market exploits (since demand would lessen somewhat). In
any case, most software products contain so many flaws that hackers and spies do not need a black market to find flaws or use ‘‘supply chain
attacks’’ to build in backdoors. That
the United States faces intractable opponents has serious implications for
policy. This situation was not anticipated by the American designers of the Internet and poses
unanticipated problems for security and for governance. The political foundation of cyberspace reflects the thinking of the
1990s about the future of international relations: the end of inter-state conflict and borders, a globalized economy based on shared political
values, the decline of the Westphalian state, and the belief that civil society could manage some key public functions better than governments.
They did not expect serious international competition or hostility to reemerge nor did they expect the
explosion of threats from violent non-state actors. Thus they designed an open system with weak governance and security.
The NSA (and some of its European counterparts) also took advantage of this, but to reinforce rather than
undermine the security of the United States and its allies—a point that is often overlooked.
Politics Link
Congress is more likely to support law enforcement.
Meinrath & Vitka 2014
Sascha Meinrath is Director of X-Lab, Sean Vitka is Federal Policy Manager of the Sunlight Foundation,
“Crypto War II” Critical Studies in Media Communication Vol. 31, Iss. 2, 2014
Such battles are likely to migrate to one of the most powerful, and least prepared, venues for
technological debate on the planet—the U.S. Congress. Within this arena, law enforcement’s influence
is more powerful. The consistent argument is that encryption and anonymity endanger society (Clapper,
2013). With this new corporate interest, industry lobbyists will simultaneously argue that encryption is
undermining their intellectual property and other business interests, and that users freely accept
surveillance via the purchase of their products and use of their applications. Their narrative regarding
consumer discontent is that unhappy users could always “vote with their feet” and switch providers.
PTX Link – Zero Days
Congressional oversight of zero days is unpopular.
Fidler 2014
Mailyn Fidler, Marshall Scholar, Department of Politics and International Relations May 2014 “Anarchy
or Regulation: Controlling the Global Trade In Zero-Day Vulnerabilities” A Thesis Submitted To The
Interschool Honors Program in International Security Studies, Center for International Security and
Cooperation, Freeman Spogli Institute for International Studies, Stanford University
https://decryptedmatrix.com/wp-content/uploads/2014/06/Fidler-Zero-Day-Vulnerability-Thesis.pdf
Congressional action could also implement controls on when and how zero-days can be bought and
used. Congressional action could be used to impose the limits discussed in the executive oversight
section: limits on purchase, use, and disclosure of zero-day vulnerabilities. It could also require reporting
to relevant Congressional committees when a zero-day is not disclosed. Congressional oversight
provides an avenue for longer-lasting oversight regimes, in contrast with more easily alterable executive
orders, and also could be accompanied by additional funding for oversight or the threat of cutting off
appropriations if the executive branch fails to follow oversight rules.
However, congressional oversight is likely politically difficult to achieve. Snowden has made most cyber
topics politically fraught, and Congress is currently generally perceived as dysfunctional. Beyond these
political considerations, congressional oversight has traditionally been reserved for oversight programs
with a broader purview, such as establishing principles that apply to all foreign intelligence activities or
covert operations, not principles that apply to just a specific tool.
XO CP
XO CP – Backdoors/Encryption
Presidential action alone solves.
Geller, 2015
Eric Geller, Deputy Morning Editor, The Daily Dot, 7-10-2015, "The rise of the new Crypto War," Daily
Dot, http://www.dailydot.com/politics/encryption-crypto-war-james-comey-fbi-privacy/
As Comey, Rogers, and other national-security officials campaign for backdoors, one important voice has
been largely absent from the debate. “I lean probably further in the direction of strong encryption than some do inside of law
enforcement,” President Barack Obama told Recode’s Kara Swisher on Feb. 15, shortly after Obama spoke at the White House summit on
cybersecurity and consumer protection in Silicon Valley. Obama’s
interview with Swisher marked a rare entrance for the
president into the backdoor debate, which has pitted his law-enforcement professionals against the civil libertarians who were
encouraged by his historic 2008 election and disappointed by his subsequent embrace of the surveillance status quo. If the president felt
strongly enough about strong encryption, he could swat down FBI and NSA backdoor requests any time
he wanted. White House advisers and outside experts have offered him plenty of policy cover for doing so. The President’s Review
Group on Intelligence and Communications Technologies, which Obama convened in the wake of the
Snowden disclosures, explicitly discouraged backdoors in its final report. The review group recommended “fully
supporting and not undermining efforts to create encryption standards,” ... “making clear that [the
government] will not in any way subvert, undermine, weaken, or make vulnerable generally available
commercial encryption,” and “supporting efforts to encourage the greater use of encryption technology for data in transit, at rest, in
the cloud, and in storage.” The report also warned of “serious economic repercussions for American businesses” resulting from “a growing
distrust of their capacity to guarantee the privacy of their international users.” It was a general warning about the use of electronic surveillance,
but it nevertheless applies to the potential fallout from a backdoor mandate. The White House’s own reports on cybersecurity and consumer
privacy suggest that the president generally supports the use of encryption. “To the extent that you’ve heard anything from the White House
and from the president, it’s in favor of making sure that we have strong encryption and that we’re building secure, trustworthy systems,” said
Weitzner, who advised Obama as U.S. deputy chief technology officer for Internet policy from 2011 to 2012. Weitzner pointed out that the
president had subtly quashed a push for backdoors by the previous FBI director, Robert Mueller. Mueller “hoped that the administration would
end up supporting a very substantial [Internet-focused] expansion of CALEA,” Weitzner said. “That didn’t happen, and … despite the fact that
you had the FBI director come out very strongly saying [criminals] were going dark, the administration never took a position as a whole in
support of that kind of statutory change. You can read between the lines.” Obama’s reluctance
to directly confront his FBI
chief reflects the bureau’s long history of autonomy in debates over law-enforcement powers, said a former
Obama administration official. “It’s pretty well understood that the FBI has a certain amount of independence when they’re out in the public-policy debate advocating for whatever they think is important,” the former official said. The
White House is reviewing "the
technical, geopolitical, legal, and economic implications" of various encryption proposals, including the
possibility of legislation, administration officials told the Washington Post this week. Weitzner said that Obama
may also be waiting until the latest round of the Crypto Wars has progressed further. “The White House tends to get involved in
debates once they’ve matured,” he said. “If you jump in on everything right up front, the volume can become unmanageable. I
know that there’s a lot of attention being paid, and I think that’s the right thing to do at this point.” The president’s noncommittal
stance has earned him criticism from pro-encryption lawmakers who say that their fight would be much
easier if the commander-in-chief weighed in. “The best way to put all this to bed,” Hurd said, “would be for
the president to be very clear saying that he is not interested in pursuing backdoors to encryption and
believes that this is the wrong path to go, in order to squash the debate once and for all.” If Obama ever
formally came out against backdoors, it would represent a significant shift away from decades of anti-encryption government policies, including undermining industry-standard security tools and attacking
tech companies through public bullying and private hacking.
XO CP – Zero Days
The executive is the best way to regulate Zero Days.
Fidler 2014
Mailyn Fidler, Marshall Scholar, Department of Politics and International Relations May 2014 “Anarchy
or Regulation: Controlling the Global Trade In Zero-Day Vulnerabilities” A Thesis Submitted To The
Interschool Honors Program in International Security Studies, Center for International Security and
Cooperation, Freeman Spogli Institute for International Studies, Stanford University
https://decryptedmatrix.com/wp-content/uploads/2014/06/Fidler-Zero-Day-Vulnerability-Thesis.pdf
Theoretically, oversight of executive
branch action has the best chance of encouraging U.S. government
agencies to disclose rather than stockpile vulnerabilities, and of resetting the equities calculus between
defense and offense. Additional oversight could reduce the number of vulnerabilities the U.S. government purchases. Alternatively, if
oversight encourages a reduction in exploited vulnerabilities (by encouraging disclosure) but the
government does not decrease purchasing, the government could function like a white-market
clearinghouse – a government bug bounty – continuing to buy vulnerabilities but ensuring most of them get disclosed and
fixed.440 However, if oversight encourages a reduction in the number of vulnerabilities the government
purchases, it is an open question as to what might happen to the vulnerabilities the government
otherwise would have purchased. A decrease in U.S. government buying could reduce demand and
prices, and could lead gray-market companies to diversify their product offerings or turn back to white-market options. Or, gray-market
companies may instead find buyers for the surplus offerings who are less favorable to U.S. interests. From the options examined here,
pursuing an expanded program of executive branch oversight of zero-day use and procurement seems
the most feasible and effective option. Such oversight mechanisms would address U.S. government
participation in the market, a critical component of the gray-market system, and would represent a
politically attainable and adaptable approach to regulating the zero-day trade.
PPD 20 proves that the executive can create guidelines for zero days.
Fidler 2014
Mailyn Fidler, Marshall Scholar, Department of Politics and International Relations May 2014 “Anarchy
or Regulation: Controlling the Global Trade In Zero-Day Vulnerabilities” A Thesis Submitted To The
Interschool Honors Program in International Security Studies, Center for International Security and
Cooperation, Freeman Spogli Institute for International Studies, Stanford University
https://decryptedmatrix.com/wp-content/uploads/2014/06/Fidler-Zero-Day-Vulnerability-Thesis.pdf
The last domestic approach, oversight of executive branch activities concerning zero-days, could come in
many forms, including transparency-building mechanisms or substantive rules about what executive
branch agencies can do with purchased or discovered zero-days. For instance, an executive order or
presidential policy directive could provide common definitions, harmonize policies across agencies,
encourage inter-agency transparency to drive zero-day acquisition costs down, and establish substantive
rules on purchasing, disclosing, and using zero-days. Models for this kind of executive branch action on
cybersecurity can be found in Presidential Policy Directive 20, which organized U.S. government
activities on offensive and defensive cyber operations, and Executive Order 13636 (Improving Critical
Infrastructure Cybersecurity), which created policy momentum for strengthening cybersecurity in critical
infrastructure sectors.
Such oversight of executive branch activities would have two aims. It would seek to change government
behavior to be more responsive to the perceived downsides of the zero-day trade. It would also seek to
alter the market in positive ways. For instance, it could seek to alter gray-market demand or pricing in
ways that would make the white market a more attractive option for sellers, particularly by modulating
the purchasing behavior of one of the major gray market buyers, the U.S. government. Alternatively,
oversight could encourage the U.S. government to continue purchasing vulnerabilities, but build
mechanisms to disclose more of these vulnerabilities immediately. The particular market effects of an
executive-branch oversight approach will depend on how exactly this oversight is structured and what
such oversight requires of relevant actors.
CP solves and avoids ptx.
Fidler 2014
Mailyn Fidler, Marshall Scholar, Department of Politics and International Relations May 2014 “Anarchy
or Regulation: Controlling the Global Trade In Zero-Day Vulnerabilities” A Thesis Submitted To The
Interschool Honors Program in International Security Studies, Center for International Security and
Cooperation, Freeman Spogli Institute for International Studies, Stanford University
https://decryptedmatrix.com/wp-content/uploads/2014/06/Fidler-Zero-Day-Vulnerability-Thesis.pdf
This analysis means that the best approach to improving oversight rests with the executive branch.
Executive branch oversight offers traction because it is an existing approach, is politically feasible, and
can be calibrated to address the government’s purchase, disclosure, and exploitation of zero-days. Still,
current U.S. policy towards zero-days raises concerns about relying on this approach. The Obama
administration’s language in explaining its zero-day policy sparks questions about whether zero-days
purchased by the U.S. government are covered under current policy, or whether only vulnerabilities
discovered in-house are included. If purchased vulnerabilities are exempt, then a significant gap exists in
current oversight. In addition, improved executive branch oversight may also need to address periodic
post-use or post-stockpiling review of zero-day vulnerabilities discovered, purchased, and stockpiled by
the U.S. government. Doing so would ensure such oversight continuously addresses the cybersecurity
problems associated with zero days and their trade and would offer a route to oversight of purchased
vulnerabilities.