Wake CM 2NC v GSU GS R6 UGA

We are way ahead of China and Russia.
Greg Grant, Defense Tech, “The End of U.S. Maritime Dominance? Not So Fast Argues Naval Strategist,” 9/2/2010, http://defensetech.org/2010/09/02/the-endof-u-s-maritime-dominance-not-so-fast/
In a new research paper I came across, Geoffrey
Till, a top-notch historian and naval strategist, looks at the notion that this is
shaping up to be China’s century, not America’s, and that the maritime decline of the U.S. is a foregone
conclusion. He contends that predicting the rise and fall of great power maritime dominance is a bit trickier and harder
to measure than many claim.
The debate itself is being driven by economics, of course; China’s GDP witnessed an astonishing 10-fold increase between 1978 and 2004. A “highly effective”
government stimulus program and massive credit expansion meant China recovered quickly from the financial crisis; the U.S. has not. In 2000, U.S. GDP was eight
times larger than China’s; now it’s only four times larger. “Historically, growth in GDP has a high correlation with naval expenditure,” writes Till.
The other major driver is China’s
“growing and absolute dependence on overseas commodities, energy and markets.” That
fact alone means China “has little choice but to become more maritime in its orientation.” Some of China’s naval modernization
can be seen as making up for decades of neglect.
What about the naval balance? While the Navy’s planned expansion to 313 ships may never happen, its current level of 280 ships seems overwhelming, says Till.
Numbers of ships don’t tell the whole story. Borrowing from Bob Work’s analysis, Till
cites tonnage as a better indicator of fleet strength
as “the offensive and defensive power of an individual unit is usually a function of size.”
Tonnage-wise, the U.S. battle fleet has a 2.63:1 advantage over a combined Chinese-Russian fleet. Factoring in
the advantage the U.S. Navy possesses in its vertical launch magazines (actual strike power), an enormous 20:1 power superiority exists. That’s not all:
“Its 56 SSN/SSGN nuclear-powered submarine fleet might on the face of it seem overpowered by the world’s
other 220 SSNs and SSKs but the qualitative advantages of the U.S. submarine force are huge. It is much the
same story with regard to the U.S. Navy’s amphibious and critical support fleets, in its capacity to support
special forces operations, in its broad area maritime surveillance capabilities, in its U.S. Coast Guard (the equivalent of many of
the world’s navies) and in the enormous advantage conferred by the experience of many decades of 24/7 oceanic operations.”
The real strength of a navy, Till argues, should be measured not by the number of units but by how well those units meet the requirements of the missions they are intended to perform.
Till also questions Chinese shipbuilding prowess, noting deficiencies in quality assurance, innovation and
experience expected in “an industry in the first flush of youth.” While China’s manufacturing success is
impressive, it remains mostly confined to labor-intensive, low-priced consumer items. It remains far behind Germany and
Japan in technological innovation and the export of machinery.
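A quick editorial gloss on the growth arithmetic Till cites above (a back-of-the-envelope sketch, not part of the quoted evidence; it assumes “now” in the card means the article’s 2010 publication date):

% A tenfold GDP increase over 1978-2004 (26 years) implies a compound
% annual growth rate of roughly 9.3 percent:
\[ g = 10^{1/26} - 1 \approx 0.093 \]
% The U.S.-to-China GDP ratio halving from 8:1 (2000) to 4:1 (circa 2010)
% implies China's economy grew about 7.2 percent per year faster:
\[ \frac{1+g_{\text{China}}}{1+g_{\text{US}}} = 2^{1/10} \approx 1.072 \]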
US forces will never withdraw.
Brooks and Wohlforth – 2 (Stephen Brooks and William Wohlforth, Both are Associate Professors in the Government Department at Dartmouth,
“American Primacy in Perspective,” Foreign Affairs, July / August 2002)
Historically, the major forces pushing powerful states toward restraint and magnanimity have been the limits of their
strength and the fear of overextension and balancing. Great powers typically checked their ambitions and deferred to others not because they
wanted to but because they had to in order to win the cooperation they needed to survive and prosper. It is thus no surprise that today's
champions of American moderation and international benevolence stress the constraints on American power rather than the
lack of them. Political scientist Joseph Nye, for example, insists that "[the term] unipolarity is misleading because it exaggerates the degree to which
the United States is able to get the results it wants in some dimensions of world politics. . . . American power is less effective than it might first
appear." And he cautions that if the United States "handles its hard power in an overbearing, unilateral manner," then others might be provoked into
forming a balancing coalition. Such arguments are unpersuasive, however, because they fail to acknowledge the true
nature of the current international system. The United States cannot be scared into meekness by warnings of
inefficacy or potential balancing. Isolationists and aggressive unilateralists see this situation clearly, and their domestic opponents need to
as well. Now and for the foreseeable future, the United States will have immense power resources it can bring to
bear to force or entice others to do its bidding on a case-by-case basis.
No chance of European war
Ronald Asmus, Senior Transatlantic Fellow at the German Marshall Fund, Council on Foreign Relations Senior
Adjunct Fellow, former Deputy Assistant Secretary of State for European Affairs, Sept/Oct 2003, Foreign
Affairs, p. Lexis
Several factors make the recent collapse in transatlantic cooperation surprising. The crisis came on the heels of the
alliance’s renaissance in the 1990s. Following deep initial differences over Bosnia at the start of the decade, the United States and Europe
came together to stem the bloodshed in the Balkans in 1995 and again in 1999. Led by Washington, NATO expanded to
include central and eastern Europe as part of a broader effort to secure a new post-Cold War peace. This initiative was also
accompanied by the creation of a new NATO partnership with Russia. As a result, Europe today is more
democratic, peaceful, and secure than ever. For the first time in a century, Washington need not worry about a major
war on the continent -- a testimony to the success in locking in a post-Cold War peace over the last decade.
No risk of US involvement in European wars
Christopher Layne, visiting scholar at the Center for International Studies at the University of Southern
California, MacArthur Foundation Fellow in Global Security, October 25, 1999, Cato Policy Analysis, “Faulty
Justifications and Ominous Prospects,” http://www.cato.org/pubs/pas/pa357.pdf
Instability in its peripheries may affect Europe, but, contrary to the U.S. foreign policy establishment’s conventional wisdom, it has never been true that Europe’s
wars invariably affect America’s security interests. Most of Europe’s wars—even wars involving the great
powers—have not affected American security. Moreover, the counterhegemonic strategy much more accurately delineates the requirements of America’s
European strategy than does the current strategy of reassurance and stabilization. The kinds of small-scale conflicts that have occurred this decade in
the Balkans do not threaten America’s security interests because such conflicts do not raise the single strategic
danger that Europe could pose to the United States: the emergence of a continental hegemon. Thus, the “new” NATO
represents a radical transformation of the alliance’s strategic mission— and of America’s role in NATO.
No risk of breakout – fears of a tipping point are empirically overblown
Francis Gavin, Tom Slick Professor of International Affairs and Director of the Robert S. Strauss Center for International Security and Law, Lyndon B. Johnson
School of Public Affairs, University of Texas at Austin, 2010. International Security, “Same As It Ever Was; Nuclear Alarmism, Proliferation, and the Cold War,” p.
Lexis
Fears of a tipping point were especially acute in the aftermath of China's 1964 detonation of an atomic bomb: it was predicted that India,
Indonesia, and Japan might follow, with consequences worldwide, as "Israel, Sweden, Germany, and other potential nuclear
countries far from China and India would be affected by proliferation in Asia." 40 A U.S. government document identified "at least eleven nations (India, Japan,
Israel, Sweden, West Germany, Italy, Canada, Czechoslovakia, East Germany, Rumania, and Yugoslavia)" with the capacity to go nuclear, a number that would soon
"grow substantially" to include "South Africa, the United Arab Republic, Spain, Brazil and Mexico." 41 A top-secret, blue-ribbon committee established to craft the
U.S. response contended that "the [1964] Chinese nuclear explosion has increased the urgency and complexity of this problem by creating strong pressures to develop
independent nuclear forces, which, in turn, could strongly influence the plans of other potential nuclear powers." 42
These predictions were largely
wrong. In 1985 the National Intelligence Council noted that for "almost thirty years the Intelligence Community has been writing about which nations might next get
the bomb." All of these estimates based their largely pessimistic and ultimately incorrect estimates on factors such as the
increased "access to fissile materials," improved technical capabilities in countries, the likelihood of "chain
reactions," or a "scramble" to proliferation when "even one additional state demonstrates a nuclear capability." The 1985 report goes on, "The most striking
characteristic of the present-day nuclear proliferation scene is that, despite the alarms rung by past Estimates, no additional overt proliferation of nuclear weapons has
actually occurred since China tested its bomb in 1964.” Although
“some proliferation of nuclear explosive capabilities and other major proliferation-related developments have taken place in the past two decades,” they did not have “the damaging, systemwide impacts
that the Intelligence community generally anticipated they would." 43 In his analysis of more than sixty years of failed
efforts to accurately predict nuclear proliferation, analyst Moeed Yusuf concludes that "the pace of proliferation has
been much slower than anticipated by most." The majority of countries suspected of trying to obtain a nuclear
weapons capability "never even came close to crossing the threshold. In fact, most did not even initiate a weapons
program." If all the countries that were considered prime suspects over the past sixty years had developed nuclear weapons, "the world would have at least 19
nuclear powers today." 44 As Potter and Mukhatzhanova argue, government and academic experts frequently "exaggerated the scope
and pace of nuclear weapons proliferation." 45 Nor is there compelling evidence that a nuclear proliferation chain reaction will
ever occur. Rather, the pool of potential proliferators has been shrinking. Proliferation pressures were far greater
during the Cold War. In the 1960s, at least twenty-one countries either had or were considering nuclear weapons research
programs. Today only nine countries are known to have nuclear weapons. Belarus, Brazil, Kazakhstan, Libya, South Africa, Sweden, and
Ukraine have dismantled their weapons programs. Even rogue states that are/were a great concern to U.S. policymakers--Iran, Iraq, Libya, and North Korea--began their nuclear weapons programs before the Cold War had ended. 46 As far as is known, no nation has started a
new nuclear weapons program since the demise of the Soviet Union in 1991. 47 Ironically, by focusing on the threat of rogue states,
policymakers may have underestimated the potentially far more destabilizing effect of proliferation in “respectable” states such as Germany, Japan, South Korea, and
Taiwan.
No warming – their models are incorrect and satellite data disproves.
Steven F. Hayward, F.K. Weyerhaeuser fellow at the American Enterprise Institute, 3-15-2010, The Weekly Standard, “In Denial,”
http://www.weeklystandard.com/print/articles/denial
This central pillar of the climate campaign is unlikely to survive much longer, and each repetition of the “science-is-settled”
mantra inflicts more damage on the credibility of the climate science community. The scientist at the center of the Climategate scandal at East
Anglia University, Phil (“hide the decline”) Jones dealt the science-is-settled narrative a huge blow with his candid admission in a BBC interview
that his surface temperature data are in such disarray they probably cannot be verified or replicated, that the
medieval warm period may have been as warm as today, and that he agrees that there has been no statistically
significant global warming for the last 15 years—all three points that climate campaigners have been bitterly contesting. And Jones specifically
disavowed the “science-is-settled” slogan: BBC: When scientists say “the debate on climate change is over,” what exactly do they mean, and what don’t they mean?
Jones: It would be supposition on my behalf to know whether all scientists who say the debate is over are saying that for the same reason. I don’t believe the vast
majority of climate scientists think this. This is not my view. There is still much that needs to be undertaken to reduce uncertainties, not just for the future, but for the
instrumental (and especially the palaeoclimatic) past as well [emphasis added]. Judith Curry, head of the School of Earth and Atmospheric Sciences at Georgia Tech
and one of the few scientists convinced of the potential for catastrophic global warming who is willing to engage skeptics seriously, wrote February 24: “No one
really believes that the ‘science is settled’ or that ‘the debate is over.’ Scientists and others that say this seem to want to advance a
particular agenda. There is nothing more detrimental to public trust than such statements.” The next wave of climate revisionism is likely to reopen most of the central
questions of “settled science” in the IPCC’s Working Group I, starting with the data purporting to prove how much the Earth has warmed over the last century. A
London Times headline last month summarizes the shocking revision currently underway: “World May Not Be Warming, Scientists Say.” The
Climategate
emails and documents revealed the disarray in the surface temperature records the IPCC relies upon to validate its
claim of 0.8 degrees Celsius of human-caused warming, prompting a flood of renewed focus on the veracity and handling of surface temperature data. Skeptics such as
Anthony Watts, Joseph D’Aleo, and Stephen McIntyre have been pointing out the defects in the surface temperature record for years, but the media and the IPCC
ignored them. Watts and D’Aleo have painstakingly documented (and in many cases photographed) the huge number of temperature
stations that have
been relocated, corrupted by the “urban heat island effect,” or placed too close to heat sources such as air conditioning
compressors, airports, buildings, or paved surfaces, as well as surface temperature series that are conveniently left out of the
IPCC reconstructions and undercut the IPCC’s simplistic story of rising temperatures. The compilation and statistical
treatment of global temperature records is hugely complex, but the skeptics such as Watts and D’Aleo offer compelling critiques showing that most of the
reported warming disappears if different sets of temperature records are included, or if compromised station
records are excluded. The puzzle deepens when more accurate satellite temperature records, available starting in 1979, are considered. There is a glaring
anomaly: The satellite records, which measure temperatures in the middle and upper atmosphere, show very little
warming since 1979 and do not match up with the ground-based measurements. Furthermore, the satellite readings of
the middle- and upper-air temperatures fail to record any of the increases the climate models say should be happening in
response to rising greenhouse gas concentrations. John Christy of the University of Alabama, a contributing author to the IPCC’s Working
Group I chapter on surface and atmospheric climate change, tried to get the IPCC to acknowledge this anomaly in its 2007 report but was ignored. (Christy is
responsible for helping to develop the satellite monitoring system that has tracked global temperatures since 1979. He received NASA’s Medal for Exceptional
Scientific Achievement for this work.) Bottom line: Expect some surprises to come out of the revisions of the surface temperature records that will take place over the
next couple of years. Eventually the climate modeling community is going to have to reconsider the central question: Have
the models the IPCC uses for its predictions of catastrophic warming overestimated
the climate’s sensitivity to greenhouse gases? Two recently published
studies funded by the U.S. Department of Energy, one by Brookhaven Lab scientist Stephen Schwartz in the Journal of Geophysical Research, and
one by MIT’s Richard Lindzen and Yong-Sang Choi in Geophysical Research Letters, both argue for vastly lower climate sensitivity to
greenhouse gases. The models the IPCC uses for projecting a 3 to 4 degree Celsius increase in temperature all assume large positive (that is, temperature-magnifying) feedbacks from a doubling of carbon dioxide in the atmosphere; Schwartz, Lindzen, and Choi discern strong negative (or
temperature-reducing) feedbacks in the climate system, suggesting an upper-bound of future temperature rise of no
more than 2 degrees Celsius. If the climate system is less sensitive to greenhouse gases than the climate campaign believes, then what is causing plainly
observable changes in the climate, such as earlier arriving springs, receding glaciers, and shrinking Arctic Ocean ice caps? There have been alternative explanations in
the scientific literature for several years, ignored by the media and the IPCC alike. The IPCC downplays theories of variations in solar activity, such as sunspot activity
and gamma ray bursts, and although there is robust scientific literature on the issue, even the skeptic community is divided about whether solar activity is a primary
cause of recent climate variation. Several studies of Arctic warming conclude that changes
in ocean currents, cloud formation, and wind
patterns in the upper atmosphere may explain the retreat of glaciers and sea ice better than greenhouse gases.
Another factor in the Arctic is “black carbon”—essentially fine soot particles from coal-fired power plants and forest fires, imperceptible to the naked eye but reducing
the albedo (solar reflectivity) of Arctic ice masses enough to cause increased summertime ice melt. Above all, if the medieval warm period was
indeed as warm or warmer than today, we cannot rule out the possibility that the changes of recent decades are
part of a natural rebound from the “Little Ice Age” that followed the medieval warm period and ended in the
19th century. Skeptics have known and tried to publicize all of these contrarian or confounding scientific findings, but the compliant news media routinely
ignored all of them, enabling the IPCC to get away with its serial exaggeration and blatant advocacy for more than a decade.
No warming – it’s just random fluctuation and data collection errors
Jonathan Leake, 2-14-2010, Times Online, “World may not be warming, say scientists,”
http://www.timesonline.co.uk/tol/news/environment/article7026317.ece?print=yes&randnum=1269060067737
The United Nations climate panel faces a new challenge with scientists
casting doubt on its claim that global temperatures are rising
inexorably because of human pollution. In its last assessment the Intergovernmental Panel on Climate Change (IPCC) said the evidence that the
world was warming was “unequivocal”. It warned that greenhouse gases had already heated the world by 0.7C and that there could be 5C-6C more warming by 2100,
with devastating impacts on humanity and wildlife. However, new
research, including work by British scientists, is casting doubt on such claims.
Some even suggest the world may not be warming much at all. “The temperature records cannot be relied on as
indicators of global change,” said John Christy, professor of atmospheric science at the University of Alabama in Huntsville,
a former lead author on the IPCC. The doubts of Christy and a number of other researchers focus on the thousands of weather stations
around the world, which have been used to collect temperature data over the past 150 years. These stations, they believe, have been seriously
compromised by factors such as urbanisation, changes in land use and, in many cases, being moved from site to
site. Christy has published research papers looking at these effects in three different regions: east Africa, and the American states of California and Alabama. “The
story is the same for each one,” he said. “The popular data sets show a lot of warming but the apparent temperature rise was actually caused by local factors affecting
the weather stations, such as land development.” The IPCC faces similar criticisms from Ross McKitrick, professor of economics at the University of Guelph, Canada,
who was invited by the panel to review its last report. The experience turned him into a strong critic and he has since published a research paper questioning its
methods. “We concluded, with overwhelming statistical significance, that the
IPCC’s climate data are contaminated with surface effects
from industrialisation and data quality problems. These add up to a large warming bias,” he said. Such warnings are
supported by a study of US weather stations co-written by Anthony Watts, an American meteorologist and climate change sceptic. His study, which has not been peer
reviewed, is illustrated with photographs of weather stations in locations where their readings are distorted by heat-generating equipment. Some are next to air-conditioning units or are on waste treatment plants. One of the most infamous shows a weather station next to a waste incinerator. Watts has also found examples
overseas, such as the weather station at Rome airport, which catches the hot exhaust fumes emitted by taxiing jets. In Britain, a weather station at Manchester airport
was built when the surrounding land was mainly fields but is now surrounded by heat-generating buildings. Terry Mills,
professor of applied statistics and econometrics at
Loughborough University, looked at the same data as the IPCC. He found that the warming trend it reported over
the past 30 years or so was just as likely to be due to random fluctuations as to the impacts of greenhouse gases.
Mills’s findings are to be published in Climatic Change, an environmental journal. “The earth has gone through warming spells like these at
least twice before in the last 1,000 years,” he said.
Warming may not be good, but it won’t end the world – their impact is fear mongering.
Ross Douthat, New York Times, “Cap-and-trade not right for global warming threat,” 8/2/2010,
http://www.omaha.com/article/20100802/NEWS0802/708029983
The ’70s were a great decade for apocalyptic enthusiasms, and none was more potent than the fear that human population growth had
outstripped the Earth’s carrying capacity. According to a chorus of credentialed alarmists, the world was entering an age of
sweeping famines, crippling energy shortages and looming civilizational collapse.
It was not lost on conservatives that this analysis led inexorably to left-wing policy prescriptions — a government-run energy sector at home, and population control for
the teeming masses overseas.
Social conservatives and libertarians, the two wings of the American Right, found common ground resisting these
prescriptions. And time was unkind to the alarmists. The catastrophes never materialized, and global living
standards soared. By the turn of the millennium, the developed world was worrying about a birth dearth.
This is the lens through which most conservatives view the global warming debate. Again, a doomsday scenario
has generated a crisis atmosphere, which is being invoked to justify taxes and regulations that many left-wingers would support anyway.
History, however, rarely repeats itself exactly — and conservatives who treat global warming as just another scare story are
almost certainly mistaken.
Rising temperatures won’t “destroy” the planet, as fear-mongers and celebrities like to say. But the evidence that carbon
emissions are altering the planet’s ecology is too convincing to ignore. Conservatives who dismiss climate change as a hoax are
making a spectacle of their ignorance.
No impact and stability impossible
Diana West, Contributing Analyst for Townhall.com, 2009.
“Let Afghanistan Go,” http://townhall.com/columnists/DianaWest/2009/04/23/let_afghanistan_go
The point is, the United States is getting a lot of bang for a lot of buck but not much else. Don't get me wrong:
If killing small bands of Taliban is in
the best interest of the United States, I'm for it. But I do not believe it is -- and certainly not as part of the grand strategy conceived first by
the Bush administration and now expanded by the Obama administration to turn Afghanistan into a state capable of warding off what is daintily known as "extremism,"
but is, in fact, bona-fide jihad to advance Sharia (Islamic law). Anybody remember Sisyphus? Well, trying
to transform Afghanistan into an anti-jihad, anti-Sharia player -- let alone functional nation -- is like trying to roll Sisyphus’ rock up the hill. This is not to
suggest that there is no war or enemies to fight, which is what both the Left and the Paleo-Right will say; there most certainly are. But sinking all possible
men, materiel and bureaucracy into Afghanistan, as the Obama people and most conservatives favor, to try to bring a corrupt Islamic
culture into working modernity while simultaneously fighting Taliban and wading deep into treacherous Pakistani wars is no
way to victory -- at least not to U.S. victory. On the contrary, it is the best way to bleed and further degrade U.S. military
capabilities. Indeed, if I were a jihad chieftain, I couldn't imagine a better strategy than to entrap tens of thousands of America's very best young men in an open-
ended war of mortal hide-and-seek in the North West Frontier. I decided to ask someone with real military experience how we could fend off jihad without further
digging ourselves into Central Asia. I called up retired Maj.
Gen. Paul Vallely, one of the few top military leaders who talks on the record, to ask for his strategy recommendation
for Afghanistan. “Basically, let it go,” he said. Let Afghanistan go -- music to my ears, particularly given the
source is no Hate-America-First professor or Moveon-dot-org-nik, but a lifelong patriotic conservative warrior.
“There’s nothing to win there,” he explained, engaging in an all-too-exotic display of common sense. “What do you get for it? What’s the return?
Well, the return's all negative for the United States."
Rights
Extinction outweighs.
Schell, policy analyst and proliferation expert, 2000 (Jonathan, “The Fate of the Earth”, p. 94-5)
To say that human extinction is a certainty would, of course, be a misrepresentation—just as it would be a
misrepresentation to say that extinction can be ruled out. To begin with, we know that a holocaust may not
occur at all. If one does occur, the adversaries may not use all their weapons. If they do use all their weapons, the global effects, in the ozone and elsewhere, may be moderate. And if the effects are not moderate
but extreme, the ecosphere may prove resilient enough to withstand them without breaking down catastrophically. These are all substantial reasons for supposing that mankind will not be extinguished in a nuclear holocaust,
or even that extinction in a holocaust is unlikely, and they tend to calm our fear and reduce our sense of urgency. Yet at the same time we are compelled to admit that
there may be a holocaust, that the adversaries may use all their weapons, that the global effects, including effects of which we are as yet unaware, may be severe, that the ecosphere may suffer
catastrophic breakdown, and that our species may be extinguished. We are left with uncertainty, and are forced to make our
decisions in a state of uncertainty. If we wish to act to save our species, we have to muster our resolve in spite of our awareness that the life of the species may not now in fact be jeopardized.
On the other hand, if we wish to ignore the peril, we have to admit that we do so in the knowledge that the species may be
in danger of imminent self-destruction. When the existence of nuclear weapons was made known, thoughtful people everywhere in the world realized that if the great powers entered
into a nuclear-arms race the human species would sooner or later face the possibility of extinction. They also realized that in the absence of international agreements preventing it an arms race would probably occur. They
knew that the path of nuclear armament was a dead end for mankind. The discovery of the energy in mass—of “the basic power of the universe”—
and of a means by which man could release that energy altered the relationship between [humans] and the
source of [their] life, the earth. In the shadow of this power, the earth became small and the life of the human
species doubtful. In that sense, the question of human extinction has been on the political agenda of the world
ever since the first nuclear weapon was detonated, and there was no need for the world to build up its present tremendous arsenals before starting to worry about it. At just
what point the species crossed, or will have crossed, the boundary between merely having the technical knowledge to destroy itself and actually having the arsenals at hand, ready to be used at any second, is not precisely
knowable. But it is clear that at present, with some twenty thousand megatons of nuclear explosive power in existence, and with
more being added every day, we have entered into the zone of uncertainty, which is to say the zone of risk of
extinction. But the mere risk of extinction has a significance that is categorically different from, and
immeasurably greater than, that of any other risk, and as we make our decisions we have to take that
significance into account. Up to now, every risk has been contained within the frame of life; extinction
would shatter the frame. It represents not the defeat of some purpose but an abyss in which all human
purposes would be drowned for all time. We have no right to place the possibility of this limitless, eternal
defeat on the same footing as risks that we run in the ordinary conduct of our affairs in our particular
transient moment of human history. To employ a mathematical analogy, we can say that although the risk of
extinction may be fractional, the stake is, humanly speaking, infinite, and a fraction of infinity is still infinity.
In other words, once we learn that a holocaust might lead to extinction we have no right to gamble, because if
we lose, the game will be over, and neither we nor anyone else will ever get another chance. Therefore, although, scientifically
speaking, there is all the difference in the world between the mere possibility that a holocaust will bring about extinction and the certainty of it, morally they are the same, and we have no choice but
to address the issue of nuclear weapons as though we knew for a certainty that their use would put an end to our
species. In weighing the fate of the earth and, with it, our own fate, we stand before a mystery, and in tampering with the earth we tamper with a mystery. We are in deep ignorance. Our ignorance should dispose us to
wonder, our wonder should make us humble, our humility should inspire us to reverence and caution, and our reverence and caution should lead us to act without delay to withdraw the threat we now pose to the earth and to
ourselves.
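Schell’s “fraction of infinity” analogy amounts to an expected-value claim; a minimal editorial sketch of that reading (an illustration, not part of the card):

% Expected-value reading: any nonzero probability p of an unbounded
% (infinite) loss yields an unbounded expected loss,
\[ \mathbb{E}[\text{loss}] = p \cdot \infty = \infty \quad \text{for all } p > 0 \]
% so on this reading a fractional risk of extinction carries the same
% decision weight as certainty.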
Terrorists prefer conventional weapons to bioterror
John Parachini, analyst for RAND, 2001, Anthrax Attacks, Biological Terrorism and Preventive Responses, p.11-12
The use of disease and biological material as a weapon is not a new method of warfare. What is surprising is how infrequently it has been used.
Biological agents may appeal to the new terrorist groups because they affect people indiscriminately and unnoticed, thereby sowing panic. A pattern is emerging that
terrorists who perpetrate mass and indiscriminate attacks do not claim responsibility.5 In contrast to the turgid manifestos issued by terrorists in the 1960s, 1970s and
1980s, recent mass casualty terrorists have not claimed responsibility until they were imprisoned. Biological agents enable terrorists to preserve their anonymity
because of their delayed impact and can be confused with natural disease outbreaks. Instead of the immediate gratification of seeing an explosion or the glory of
claiming credit for disrupting society, the biological weapons terrorist may derive satisfaction from seeing society’s panicked response to their actions. If this is the
case, this is a new motive for the mass casualty terrorist. There are a number of countervailing disincentives for states and terrorists to use
biological weapons, which help explain why their use is so infrequent. The technical and operational challenges biological weapons pose are
considerable. Acquiring the material, skills of production, knowledge of weaponization, and successfully delivering the weapon to the
target is difficult. In cases where the populations of the terrorist supporters and adversaries are mixed, biological weapons risk inadvertently hitting the same people
for whom terrorists claim to fight. Terrorists may also hesitate in using biological weapons specifically because breaking the taboo on their
use may evoke considerable retaliation. The use of disease as a weapon is widely recognized in most cultures as a means of killing that
is beyond the bounds of a civilized society. From a psychological perspective, terrorists may be drawn to explosives as arsonists are
drawn to fire. The immediate gratification of explosives and the thrill of the blast may meet a psychological need of terrorists that the
delayed effects of biological weapons do not. Causing slow death of others may not offer the same psychic thrill achieved by killing with firearms or
explosives. Perhaps the greatest alternative to using biological weapons is that terrorists can inflict (and have inflicted) many more fatalities and
casualties with conventional explosives than with unconventional weapons. Biological weapons present technical and operational challenges that
determined killers may not have the patience to overcome or they may simply concentrate their efforts on more readily available
alternatives.
Multiple technical barriers to bioterror
Jonathan Tucker, visiting fellow, Hoover Institution, Stanford University, CURRENT HISTORY, April 2000, p.151. (MHSOLT2456)
Although some terrorist groups may be motivated by the desire to inflict mass casualties and a subset may be capable of avoiding premature arrest, the technical
challenges associated with the production and efficient dissemination of chemical or biological agents make
catastrophic attacks unlikely. Acquiring such a capability would require terrorists to overcome a series of major
hurdles: hiring technically trained personnel with the relevant expertise, gaining access to specialized chemical
weapon ingredients or virulent microbial strains, obtaining equipment suitable for the mass-production of
chemical or biological agents, and developing wide-area delivery systems. Toxic weapons also entail hazards and operational
uncertainties much greater than those associated with firearms and explosives.
Prefer our evidence -- theirs is clear exaggeration
Andrew O’Neil, lecturer in Politics and International Relations at Flinders University, April 2003, Australian Journal of International Affairs, Vol. 57, No. 1,
ebscohost, p. 109
Given the high stakes involved, it
is all too easy to exaggerate possible scenarios involving terrorists using WMD. Yet it is
equally easy to dismiss possible threat scenarios as being unduly alarmist. As the head of the United Nation’s Terrorism
Prevention Branch has remarked, the greatest challenge in evaluating the WMD terrorist threat is ‘walking the fine line between fear and paranoia on the one hand, and
prudence and disbelief on the other’ (Schmid 2000: 108). One of the most prevalent features in mainstream discussions of WMD terrorism has been the conflation of
motive and capability. All too often observers assume that simply because terrorist groups are motivated to acquire WMD they will be successful in doing so. A related
assumption is that once terrorists gain access to WMD materials they will, ipso facto, be able to build a weapon and deliver it against assigned targets. The prevalence
of this approach has meant that insufficient attention has been paid to addressing the key issue of accessibility to nuclear, chemical, and biological weapons on the part
of terrorist groups and the likelihood of such groups actually using WMD. Consequently, the challenging nature of assessing the threat of WMD terrorism has
frequently been overlooked in much of the academic literature. Simply accepting at face value the hypothesis that WMD terrorism is only ‘a matter of time’ is no
substitute for detailed and measured threat assessment. As I have argued, the issue is complex and not one that lends itself to hard and fast conclusions. On the one
hand, I demonstrated that it remains very difficult for all but the most technologically advanced terrorist organisations to
successfully weaponise nuclear material and CW and BW agents for delivery against targets. This is particularly
the case with respect to nuclear weapons, but also holds true for chemical and biological weapons. In the case of
biological weapons—which have become the most feared category of WMD in terms of likely terrorist use—although the requisite material for devising BW agents is widely available, the skill
and expertise for effectively weaponising a BW agent is still seemingly beyond terrorist groups. Overall,
acquiring WMD capabilities for delivery against targets is a lot harder for terrorists than is generally
acknowledged in the literature.
Acquisition, production, and delivery barriers prevent effective bioterror
Paul Cornish, fellow at Chatham House, February 2007, The CBRN System: Assessing the Threat of Terrorist Use of Chemical, Biological, Radiological and
Nuclear Weapons in the UK, p. http://www.chathamhouse.org.uk/publications/papers/view/-/id/434/
It is important, however, not to exaggerate the availability of BW. BW production involves four stages – acquisition, production, weaponization and
delivery – the first three of which are progressively more difficult: 1. Acquisition. It would not be easy to acquire the seed stock of a pathogen or a
toxin-producing organism, but it would not be impossible either. One option would be theft, from one of the 1,500 germ banks dotted around the world, or from a
research laboratory, hospital or public health laboratory.32 Not all of these facilities can be expected to maintain the highest possible levels of physical and human
security. It is conceivable that a scientist or technician with legitimate access to key materials and organisms might be suborned by or volunteer to assist a terrorist
group, or even set out as a one-man band to avenge some terrible grievance. Another option would be fraud. In one celebrated example, in 1995 an American white
supremacist, Larry Wayne Harris, applied to the American Type Culture Collection for the bubonic plague bacterium. Harris’s application was, fortunately, found to be
fraudulent and he was prosecuted and imprisoned. For the most sophisticated BW proliferator, gene synthesis might offer another option: ‘Armed with a fake e-mail
address, a would-be terrorist could probably order the building blocks of a deadly biological weapon online, and receive them by post within weeks. […] Dozens of
biotech firms now offer to synthesise complete genes from the chemical components of DNA.’ Often, the companies concerned are lax in their security screening of
requests for DNA sequences, with the result that ‘terrorists could order genes that confer virulence to dangerous pathogens such as the Ebola virus, and engineer them
into another virus or bacterium.’33 The prospect of genetic modification (GM) of BW has begun to capture the imagination in recent years: GM bacteria, viruses or
prions could be more infective; resistant to antibiotics and vaccines; targeted at specific organs; able to lie dormant without detection before causing disease (stealth
organisms); and have greater environmental stability and survivability. GM anthrax, a modified smallpox immune response that would render current smallpox vaccines
ineffective, and a synthetic poliomyelitis virus are prime candidates.34 2. Production. The manufacturing of BW agents is not straightforward. Bulk
production, in particular, would be demanding and dangerous. 3. Weaponization. Weaponizing a BW agent is yet more challenging, for two
reasons. First, the health and safety of those involved in BW production could scarcely be more at risk. Quite apart from the hazard of handling highly
dangerous micro-organisms and toxins, in a covert production process the personal protection of laboratory workers and
the reliability of laboratory
equipment are unlikely to be considered a priority. It might prove difficult to persuade scientists and laboratory workers to agree to work
under such conditions. Second, it would not be a simple matter to produce a stable device with a predictable effect. BW agents are, in general,
vulnerable to environmental and weather conditions. Some BW agents present specific challenges; anthrax spores, for example, are known to
‘clump’ too easily when inadequately aerosolized. It must be borne in mind, however, that for a terrorist group seeking a ‘single-shot’ BW attack, safety, reliability and
predictability in both production and weaponization might not be of great concern.
And, there’s no incentive for extinction-threatening bioweapons.
John D. Steinbruner, senior fellow at the Brookings Institution, Winter 1997, Foreign Policy, n109 p85(12), “Biological weapons: a plague upon all houses,”
p. infotrac
A lethal pathogen that could efficiently spread from one victim to another would be capable of initiating an intensifying cascade of disease that
might ultimately threaten the entire world population. The 1918 influenza epidemic demonstrated the potential for a global contagion of this sort but not
necessarily its outer limit. Nobody really knows how serious a possibility this might be, since there is no way to measure it reliably. Before the first atomic device was
tested, there was genuine concern that such an explosion could ignite the Earth’s atmosphere. American physicists were able to provide a credible calculation that
proved the contrary. It would be comparably important to establish that no conceivable pathogen could kill a substantial portion of the entire human population, but
current scientific knowledge simply cannot support such a determination. If anything, the balance of uncertain judgment would probably have to lean the other way.
The unique and unusually ominous characteristics of biological agents have been reflected in the military and diplomatic history of the subject. While aggressively
developing other lethal technologies, most of the major military establishments of this century have treated biological agents with distinct
caution. They have conducted dedicated programs both to develop biological agents for offensive use and to devise methods of protection against them. The offensive
efforts have focused on agents such as anthrax and tularemia, however, that are highly lethal to those directly exposed but do not efficiently spread
from one victim to another and thus do not present the problem of an uncontrollable chain reaction. There has been as yet no authoritative
indication of a deliberate attempt to develop the type of biological agent that would be most sensationally destructive – one that combines
lethality with efficiency of propagation. Moreover, with the exception of limited experimentation by Japanese forces in China prior to World War II, the major
militaries have not attempted to use any biological agents in actual battles. Nor have they provisionally deployed them to the extent that they did with chemical and
nuclear weapons.