Wake CM 2NC v. Emory CP Kentucky r7.doc

CP
Species loss won’t snowball
Mark Sagoff, Senior Research Scholar at the Maryland School of Public Policy, Pew Scholar in Conservation
and the Environment, June 1997, Atlantic Monthly, “Do We Consume Too Much?”
http://www.chem.brown.edu/chem12/readings/atlantic/consume.html
There is no credible argument, moreover, that all or even most of the species we are concerned to protect are
essential to the functioning of the ecological systems on which we depend. (If whales went extinct, for
example, the seas would not fill up with krill.) David Ehrenfeld, a biologist at Rutgers University, makes this
point in relation to the vast ecological changes we have already survived. “Even a mighty dominant like the
American chestnut,” Ehrenfeld has written, “extending over half a continent, all but disappeared without
bringing the eastern deciduous forest down with it.” Ehrenfeld points out that the species most likely to be
endangered are those the biosphere is least likely to miss. “Many of these species were never common or
ecologically influential; by no stretch of the imagination can we make them out to be vital cogs in the
ecological machine.”
Collapse of biodiversity won’t collapse the entire ecosystem and many species are redundant.
Carlos Davidson, conservation biologist, 5-1-2k [Bioscience, Vol. 50, No. 5, lexis]
Biodiversity limits. The original rivet metaphor (Ehrlich and Ehrlich 1981) referred to species extinction and biodiversity loss as a limit to human population and the
economy. A wave of species extinctions is occurring that is unprecedented in human history (Wilson 1988, 1992, Reid and
Miller 1989). The decline of biodiversity represents irreplaceable and incalculable losses to future generations of humans. Is biodiversity loss a case of limits, as
suggested by the rivet metaphor, or is it a continuum of degradation with local tears, as suggested by the tapestry metaphor? In the rivet metaphor, it is not the loss of species by itself that is the proposed limit but rather some sort of ecosystem collapse that would be triggered by the species loss. But it is unclear that biodiversity loss will lead to ecosystem collapse. Research in this area is still in its infancy, and results from the limited experimental studies are mixed. Some studies show a positive relationship between diversity and some aspect of ecosystem function, such as the rate of nitrogen cycling (Kareiva 1996, Tilman et al. 1996). Others support the redundant species concept (Lawton and Brown 1993, Andren et al. 1995), which holds that above some low number, additional species are redundant in terms of ecosystem function. Still other studies support the idiosyncratic species model (Lawton 1994), in which loss of some species reduces some aspect of ecosystem function, whereas loss of others may increase that aspect of ecosystem function. The relationship between biodiversity and ecosystem function is
undoubtedly more complex than any simple metaphor. Nonetheless, I believe that the tapestry metaphor provides a more useful view of biodiversity loss than the rivet
metaphor. A species extinction is like a thread pulled from the tapestry. With each thread lost, the tapestry gradually becomes threadbare. The loss of some species may
lead to local tears. Although everything is linked to everything else, ecosystems
are not delicately balanced, clocklike mechanisms in
which the loss of a part leads to collapse. For example, I study California frogs, some of which are
disappearing. Although it is possible that the disappearances signal some as yet unknown threat to humans (the
miner's canary argument), the loss of the frogs themselves is unlikely to have major ecosystem effects. The
situation is the same for most rare organisms, which make up the bulk of threatened and endangered species.
For example, if the black toad (Bufo exsul) were to disappear from the few desert springs in which it lives, even
careful study would be unlikely to reveal ecosystem changes. To argue that there are not limits is not to claim that biodiversity losses do
not matter. Rather, in calling for a stop to the destruction, it is the losses themselves that count, not a putative cliff that humans will fall off of somewhere down the
road.
Technological advances prevent spillover effects.
Simon, prof. of business administration at Univ. of Maryland, 2-9-94 [Julian, The Ultimate Resource II: People,
Materials, and Environment, Chapter 31,
http://www.juliansimon.org/writings/Ultimate_Resource/TCHAR31.txt]
The issue first came to scientific prominence in 1979 with Norman Myers's book The Sinking Ark. It then was brought to an international public and onto the U.S. policy agenda by the 1980
Global 2000 Report to the President. These still are the canonical texts. Unlike the story in chapter 9 about the loss of farmland scare, where the crisis has vanished instead of the farmland, the
scare about extinction of species was not quickly extinguished when its statistical basis was shown not to exist in the early 1980s, but instead continued bigger than ever. The Global 2000
forecast extraordinary losses of species between 1980 and 2000. "Extinctions of plant and animal species will increase dramatically. Hundreds of thousands of species -- perhaps as
many as 20 percent of all species on earth -- will be irretrievably lost as their habitats vanish, especially in tropical forests," it said. Yet the data
on the
observed rates of species extinction are wildly at variance with common belief, and do not provide support for
the various policies suggested to deal with the purported dangers. Furthermore, recent scientific and technical advances -- especially seed banks and genetic engineering, and perhaps electronic mass-testing of new drugs -- have rendered
much less crucial the maintenance of a particular species of plant life in its natural habitat than would have been the case in earlier years.
ADV 1
Other countries are falling behind and US exports are surging – no brink to their advantage.
Thomas White, Global Investment Consulting Firm, “U.S.: Agricultural Sector Emerges as Bright Spot,” 9/17/2010, http://www.thomaswhite.com/explorethe-world/postcard/2010/us-agriculture-sector.aspx
Recent estimates released by the United States Department of Agriculture (USDA) show that agricultural exports are predicted to rise to $107.5 billion for the fiscal year 2010, the second highest year ever, with the previous such high recorded in 2008, when a commodity
boom fueled agriculture exports worldwide. A delighted USDA stated that “agriculture is one of the only major sectors of the American economy with a trade surplus –
expected to be $30.5 billion this year.” Next
year is supposed to be even better with exports increasing to $113 billion.
The U.S. has benefited from a fall in fortunes from key competitors like Russia, whose wheat crop has been scorched by record
heat, and even the Ukraine and Kazakhstan have struggled to match export grain targets. This leaves the U.S. in the enviable position
of being able to take advantage of rising demand from markets in the Middle East and North Africa – markets that the
U.S. so far has failed to dominate. And then there is the giant called China, whose voracious demand is forecasted to make it America’s second-largest market,
overtaking Mexico in the process. The
U.S. is benefiting from China’s transition from being a net exporter of corn to a net
importer of corn. Chinese corn imports are expected to grow to 5.8 million tons in 2011, up from 1.7 million tons this year. Figures like these have
helped the agricultural sector sustain its momentum in what has been a rocky year for the U.S. economy. The
agriculture industry, in its quiet way, has been generating jobs, sustaining rural America’s economy. The
USDA estimates that ‘every billion dollars worth of agricultural exports supports more than 8,000 jobs and generates
$1.4 billion in economic activity.’
As the world economy continues to recover, global trade in grain, meat and commodities is only going to rise. The U.S. has
worked itself up to this lofty position, and it is an advantage it is not going to surrender anytime soon. There are risks,
no doubt. A double-dip recession in the U.S., a widening trade surplus with China or an economic plunge in the European Union. But having weathered
many a storm in the past three years, there is a steely resilience to this sector. Strange as it sounds, American
agriculture is truly emerging as the leading light at the end of a long recession tunnel.
Rising food prices are inevitable – energy costs and biofuel demand.
Thomas Dawson, January 5, 2006, American Chronicle, “Food for Thought and the Price of Food,”
http://www.americanchronicle.com/articles/viewArticle.asp?articleID=4533
It may seem to many that we are living in a period in which there are potentially insurmountable problems facing us on every side. Certainly the
world is on
the precipice of a population explosion that we will be unable to sustain. The consumption of our natural resources and the destruction of our
environment continue on a scale never imagined by the majority of us. However, nearly every generation of mankind has seen periods of hard times and some of us
have experienced some very good times as well. The very nature of life on earth has been a history of turmoil and upheaval, from subsistence and mere survival to
prosperity and a degree of security, and sometimes, back again. Don’t expect things to change for the better in the very near future regardless of our sophisticated
economy. Consider the single aspect of food prices in the western world. Food
has been relatively inexpensive in the western world, except in war-
torn areas for the entire lifetime of our generation. This will probably not be the case for the next generation. It was only a few years ago
that the population explosion was in the news all the time, almost to the same extent that we are currently preoccupied with the energy crunch usually referred to as
“peak oil”, and the erosion of the western standard of living by “globalization”. The media let up on the problems of population growth because people got tired of
hearing about it. After all, the western world didn’t appear to be particularly affected by it. The population explosion has since been generally ignored in the news until
recently. That is not to infer that the problem went away. It took thousands of years of human history to produce and sustain a population of a billion people by the early
nineteenth century. In the past 200 years, we have multiplied that population by six. There
are now over six billion people in the world and we
will add the next billion people in only about a dozen years. With the advent of the industrial revolution, the western world became trade oriented
over the last couple of centuries. Since the cold war has ended, our international companies have seized opportunities to sharply increase their profits by arbitraging the
labor markets of Asia while selling products at home; sometimes referred to as globalization. This employment of large numbers of people has given impetus and
acceleration to the already rising prosperity of a small percentage of the population in various parts of Asia. This small increase in prosperity affecting such large
numbers of people has spawned a demand for resources and commodities around the world. Suddenly, a few people in the more populated parts of the world have the
monetary wherewithal to improve their standard of living and have hopes for a better life for their children. They have needs of infrastructure, electricity and
transportation as well as food. Now the western world finds itself competing for limited resources, especially energy. The most efficient forms of energy are oil and gas.
The owners of oil and gas find themselves in an enviable position where they have an asset worthy of preservation. They will probably never
again allow the prices to fall very much for any extended period of time. The cost of energy and fertilizer (usually made
from natural gas) are substantial costs in food production, not to mention the cost of transporting that food. The
2006 crops will be affected by the recent increase of prices in oil and gas. Expect food prices to accelerate their
rise in the next year and continue to rise thereafter. To exacerbate the problem, many farmers around the world
can now make more money raising crops for bio-diesel fuels than they can make raising food. Across South Asia, in the
Amazon and elsewhere, farmers are razing the forests to plant crops capable of making biofuels. Even in this country, laws will be
2NC EXT – No War
Studies show no correlation between decline and war.
Morris Miller, economist, adjunct professor in the University of Ottawa’s Faculty of Administration, consultant on international development issues, former
Executive Director and Senior Economist at the World Bank, Winter 2000, Interdisciplinary Science Reviews, Vol. 25, Iss. 4, “Poverty as a cause of wars?”
The question may be reformulated. Do
wars spring from a popular reaction to a sudden economic crisis that exacerbates
poverty and growing disparities in wealth and incomes? Perhaps one could argue, as some scholars do, that it is some dramatic event or
sequence of such events leading to the exacerbation of poverty that, in turn, leads to this deplorable denouement. This exogenous factor might act as a catalyst for a
violent reaction on the part of the people or on the part of the political leadership who would then possibly be tempted to seek a diversion by finding or, if need be,
fabricating an enemy and setting in train the process leading to war. According to a study undertaken by Minxin Pei and Ariel Adesnik of the Carnegie Endowment for
International Peace,
there would not appear to be any merit in this hypothesis. After studying ninety-three episodes of
economic crisis in twenty-two countries in Latin America and Asia in the years since the Second World War
they concluded that: Much of the conventional wisdom about the political impact of economic crises may be
wrong ... The severity of economic crisis - as measured in terms of inflation and negative growth - bore no relationship to the
collapse of regimes ... (or, in democratic states, rarely) to an outbreak of violence ... In the cases of dictatorships and semidemocracies, the
ruling elites responded to crises by increasing repression (thereby using one form of violence to abort another).
World War II analogies are false.
Ferguson – 6 (Niall, prof. of history, Foreign Affairs, “The Next War of the World”, lexis)
Nor can economic crises explain the bloodshed. What may be the most familiar causal chain in modern
historiography links the Great Depression to the rise of fascism and the outbreak of World War II. But that simple
story leaves too much out. Nazi Germany started the war in Europe only after its economy had recovered. Not
all the countries affected by the Great Depression were taken over by fascist regimes, nor did all such regimes
start wars of aggression. In fact, no general relationship between economics and conflict is discernible for the
century as a whole. Some wars came after periods of growth, others were the causes rather than the
consequences of economic catastrophe, and some severe economic crises were not followed by wars.
Historical models predicting war are no longer applicable – economics have become decoupled from
military power.
Daniel Deudney, Hewlett Fellow in Science, Technology, and Society at the Center for Energy and Environmental Studies at Princeton University, “Environment
and Security: Muddled Thinking,” April 1991, The Bulletin of the Atomic Scientists, Vol. 47, No. 3, Ebsco
Poverty wars. In a second scenario, declining living standards first cause internal turmoil, then war. If groups at all levels of affluence protect their standard of living by
pushing deprivation on other groups, class war and revolutionary upheavals could result. Faced with these pressures, liberal democracy and free market systems could
increasingly be replaced by authoritarian systems capable of maintaining minimum order. If authoritarian regimes are more war-prone because they lack democratic
control, and if revolutionary regimes are war-prone because of their ideological fervor and isolation, then the world is likely to become more violent. The
record
of previous depressions supports the proposition that widespread economic stagnation and unmet economic
expectations contribute to international conflict.
Although initially compelling, this scenario has major flaws. One is that it is arguably based on unsound economic theory. Wealth is
formed not so much by the availability of cheap natural resources as by capital formation through savings and
more efficient production. Many resource-poor countries, like Japan, are very wealthy, while many countries with more extensive resources are poor.
Environmental constraints require an end to economic growth based on growing use of raw materials, but not necessarily an end to growth in the production of goods
and services.
In addition, economic
decline does not necessarily produce conflict. How societies respond to economic decline
may largely depend upon the rate at which such declines occur. And as people get poorer, they may become less
willing to spend scarce resources for military forces. As Bernard Brodie observed about the modern era, “The predisposing
factors to military aggression are full bellies, not empty ones.” The experience of economic depressions over the
last two centuries may be irrelevant, because such depressions were characterized by under-utilized production
capacity and falling resource prices. In the 1930s, increased military spending stimulated economies, but if
economic growth is retarded by environmental constraints, military spending will exacerbate the problem.
Power wars. A third scenario is that environmental degradation might cause war by altering the relative power of states; that is, newly stronger
states may be tempted to prey upon the newly weaker ones, or weakened states may attack and lock in their
positions before their power ebbs further. But such alterations might not lead to war as readily as the lessons of
history suggest, because economic power and military power are not as tightly coupled as in the past. The
economic power positions of Germany and Japan have changed greatly since World War II, but these changes
have not been accompanied by war or threat of war. In the contemporary world, whole industries rise, fall, and
relocate, causing substantial fluctuations in the economic well-being of regions and peoples, without producing
wars. There is no reason to believe that changes in relative wealth and power caused by the uneven impact of environmental degradation would inevitably lead to
war.
Economic decline forces conflict settlement to free up resources.
D. Scott Bennett and Timothy Nordstrom, department of political science at Penn State, “Foreign Policy Substitutability and Internal Economic Problems
in Enduring Rivalries,” Feb 2000, Journal of Conflict Resolution, Vol. 44, No. 1, JSTOR
Conflict settlement is also a distinct route to dealing with internal problems that leaders in rivalries may pursue
when faced with internal problems. Military competition between states requires large amounts of resources, and rivals require even more attention. Leaders may choose to negotiate a settlement that ends a rivalry to free up important resources that may be reallocated to the domestic economy. In a "guns versus butter" world of economic trade-offs, when a state can no longer afford to pay the expenses associated with competition in a rivalry, it is quite rational for leaders to reduce costs by ending a rivalry. This gain (a peace dividend) could be achieved at any time by ending a rivalry. However, such a gain is likely to be most important and attractive to leaders when internal conditions are bad and the leader is seeking ways to alleviate active problems. Support for policy change away from continued rivalry is more
likely to develop when the economic situation sours and elites and masses are looking for ways to improve a
worsening situation. It is at these times that the pressure to cut military investment will be greatest and that state
leaders will be forced to recognize the difficulty of continuing to pay for a rivalry. Among other things, this argument
also encompasses the view that the cold war ended because the Union of Soviet Socialist Republics could no
longer compete economically with the United States.
EXT – Econ Resilient
The economy is resilient.
Avery – 5 (Susan, Purchasing, “U.S. economy: 'amazingly resilient': report from ISM: forecast is optimistic.(Institute of Supply Management)”,
http://www.highbeam.com/doc/1G1-133570318.html)
The U.S. economy is amazingly resilient and purchasing and supply chain executives can take credit. Those are the
thoughts of two economists who spoke before purchasing professionals attending this year's Institute of Supply Management (ISM) conference in San Antonio.
Both highlighted the resiliency of the U.S. economy to withstand such shocks as rising oil prices, the wars in Iraq
and Afghanistan and the tragic events of 9/11. Thomas Siems, senior economist and policy adviser at the Federal Reserve Bank of Dallas, credits
the ability of purchasing professionals to manage the supply chains for the economy's recent performance. "I am very optimistic about the
economy," Siems told the audience, highlighting a chart that shows U.S. business-cycle expansion and contraction going back to 1948 as an illustration of the
resilience of the economy. He also says that economists look to three areas to define the resilience of the economy: good
policies (fiscal and/or monetary), good luck (with the threat of terrorism, the wars and energy prices, the economy hasn't had much lately), and good
practices. Siems sees good practices, particularly those performed by purchasing professionals, as key to
economic performance. He also used a comparison of sales-growth volatility to production-growth volatility to
back up his statement. Production-growth volatility measures activity at the beginning of the supply chain while sales-growth volatility tracks activity at the
end. Looking at the indicator over 10 years, he finds that currently the two are virtually the same, attributing the convergence to better management practices,
specifically recent improvements in inventory-management techniques. Looking ahead, Siems
has some concerns surrounding "knowledge,
skills, education and our ability to compete globally. I have problems with protectionist ideas, policies and agendas where we are not thinking
systemically and looking at the big picture." He closed by urging the audience "to think globally and act globally." Robert Fry, senior associate economist at DuPont,
picked up on Siems' theme of the economy's resiliency in the face of a series of events that could be considered bad luck. (Fry calls them headwinds.) At DuPont, he
assists the company's businesses with interpreting economic data and using it to forecast their performance. He presented DuPont's outlook for the global economy to
the ISM members in attendance. "As Tom points out, the
U.S. economy has been amazingly resilient," said Fry. "We've been hit by
these headwinds one after another. We suffered a mild recession. People are bemoaning the weakness of the
recovery, but when you have a mild recession, you can't have a strong recovery."
US economy is empirically resilient.
Michael Dawson, US Treasury Deputy Secretary for Critical Infrastructure Protection and Compliance Policy, January 8, 2004, Remarks at
the Conference on Protecting the Financial Sector and Cyber Security Risk Management, “Protecting the Financial Sector from Terrorism and Other Threats,”
http://www.ustreas.gov/press/releases/js1091.htm
Fortunately, we are starting from a very strong base. The
American economy is resilient. Over the past few years, we have seen
that resilience first hand, as the American economy withstood a significant fall in equity prices, an economic
recession, the terrorist attacks of September 11, corporate governance scandals, and the power outage of August
14-15. There are many reasons for the resilience of the American economy. Good policies – like the President’s Jobs and
Growth Initiative – played an important part. So has the resilience of the American people. One of the reasons our economy is so
resilient is that our people are so tough, so determined to protect our way of life. Like the economy as a whole,
the American financial system is resilient. For example, the financial system performed extraordinarily well
during the power outage last August. With one exception, the bond and major equities and futures markets were open the next day at their regular
trading hours. Major market participants were also well prepared, having invested in contingency plans, procedures, and equipment such as backup power generators.
This resilience mitigates the
economic risks of terrorist attacks and other disruptions, both to the financial system itself and to the American
economy as a whole.
The U.S. financial sector withstood this historic power outage without any reported loss or corruption of any customer data.
Allies prevent decline.
Brooks and Wohlforth – 2 (Stephen Brooks and William Wohlforth, Both are Associate Professors in the Government Department at Dartmouth,
“American Primacy in Perspective,” Foreign Affairs, July / August 2002)
Previous historical experiences of balancing, moreover, involved groups of status quo powers seeking to contain a
rising revisionist one. The balancers had much to fear if the aspiring hegemon got its way. Today, however, U.S. dominance is
the status quo. Several of the major powers in the system have been closely allied with the United States for
decades and derive substantial benefits from their position. Not only would they have to forego those benefits if
they tried to balance, but they would have to find some way of putting together a durable, coherent alliance
while America was watching. This is a profoundly important point, because although there may be several
precedents for a coalition of balancers preventing a hegemon from emerging, there is none for a group of
subordinate powers joining to topple a hegemon once it has already emerged, which is what would have to
happen today.
US forces will never withdraw.
Brooks and Wohlforth – 2 (Stephen Brooks and William Wohlforth, Both are Associate Professors in the Government Department at Dartmouth,
“American Primacy in Perspective,” Foreign Affairs, July / August 2002)
Historically, the major forces pushing powerful states toward restraint and magnanimity have been the limits of their
strength and the fear of overextension and balancing. Great powers typically checked their ambitions and deferred to others not because they
wanted to but because they had to in order to win the cooperation they needed to survive and prosper. It is thus no surprise that today's
champions of American moderation and international benevolence stress the constraints on American power rather than the
lack of them. Political scientist Joseph Nye, for example, insists that "[the term] unipolarity is misleading because it exaggerates the degree to which
the United States is able to get the results it wants in some dimensions of world politics. . . . American power is less effective than it might first
appear." And he cautions that if the United States "handles its hard power in an overbearing, unilateral manner," then others might be provoked into
forming a balancing coalition. Such arguments are unpersuasive, however, because they fail to acknowledge the true
nature of the current international system. The United States cannot be scared into meekness by warnings of
inefficacy or potential balancing. Isolationists and aggressive unilateralists see this situation clearly, and their domestic opponents need to
as well. Now and for the foreseeable future, the United States will have immense power resources it can bring to
bear to force or entice others to do its bidding on a case-by-case basis.
ADV 2
Industrial agriculture solves famine – all data agrees
Staniford, scientific consultant and peak oil expert, physics PhD, 3/10/2008
(Stuart, “Food to 2050,” http://www.theoildrum.com/node/3702#more)
The first gasoline powered tractor to be mass produced was introduced by Ford in 1917. Yet the yield take-off doesn't begin until 1940, and is
almost certainly due to the agricultural innovations that comprise the green revolution. As The Future of Crop Yields and
Cropped Area explains it:
The Green Revolution strategy emerged from a surprising confluence of different lines of agricultural research (Evans, 1998) – the
development of cheap nitrogenous fertilizers, of dwarf varieties of major cereals, and of effective weed control. Nitrogenous fertilizers
increase crop production substantially, but make plants top-heavy, causing them to fall over. The development of dwarf varieties solves this problem, but at the cost of
making plants highly susceptible to weeds, which grow higher than the dwarf plants, depriving them of light. The development of effective herbicides removed this
problem. Further Green Revolution development focused on crop breeding to increase the harvest index – the ratio of the mass of grain to total above-ground biomass.
Secondly, anyone
who wants to suggest that the world can be fed other than through industrial agriculture
has some explaining to do about this data. Every crop shows yields prior to the green revolution that were flat
and a small fraction of modern yields. If we returned to yields like that, either a lot of us would be starving, or we'd be
terracing and irrigating most of the currently forested hillsides on the planet for food. While shopping for locally grown produce at your nearest
organic farmer's market, stop and give a moment of thanks for the massive productivity of the industrial enterprise that
brings you, or at least your fellow citizens, almost all of your calorie input.
Only continued high yields prevent human misery that outweighs any impact in history
Easterbrook, contributing editor of The Atlantic, January 1997
(Gregg, “Forgotten Benefactor of Humanity,” http://www.theatlantic.com/issues/97jan/borlaug/borlaug.htm)
His opponents may not know it, but Borlaug has long warned of the dangers of population growth. "In my Nobel lecture," Borlaug says, "I suggested we had
until the year 2000 to tame the population monster, and then food shortages would take us under. Now I believe
we have a little longer. The Green Revolution can make Africa productive. The breakup of the former Soviet Union has caused its grain output to plummet,
but if the new republics recover economically, they could produce vast amounts of food. More fertilizer can make the favored lands of Latin America -- especially
Argentina and Brazil -- more productive. The cerrado region of Brazil, a very large area long assumed to be infertile because of toxic soluble aluminum in the soil, may
become a breadbasket, because aluminum-resistant crop strains are being developed." This last is an example of agricultural advances and environmental protection
going hand in hand: in the past decade the deforestation rate in the Amazon rain forest has declined somewhat, partly because the cerrado now looks more attractive.
Borlaug continues, "But
Africa, the former Soviet republics, and the cerrado are the last frontiers. After they are in use,
the world will have no additional sizable blocks of arable land left to put into production, unless you are willing to level
whole forests, which you should not do. So future food-production increases will have to come from higher yields. And
though I have no doubt yields will keep going up, whether they can go up enough to feed the population monster
is another matter. Unless progress with agricultural yields remains very strong, the next century will
experience sheer human misery that, on a numerical scale, will exceed the worst of everything that has
come before."
[Borlaug = Nobel-Prize winning advocate of industrial agriculture and professor at Texas A&M]
Organic is by definition less productive
DeGregori, professor of economics at the University of Houston, 2003
(Thomas R., “Shiva the Destroyer?,” http://www.butterfliesandwheels.com/articleprint.php?num=17)
Dr. Vandana Shiva, in a book-length diatribe against the Green Revolution, frequently refers to its voracious demand for chemical
fertilizers and indicates that there are alternative ways, more benign, of achieving these outputs (Shiva 1991). Plants need
ingredients (nutrients) in order to grow. If a molecule is in the plant, it or its constituent elements must come from somewhere. Except for carbon dioxide from
the atmosphere, plants derive their nutrients from the soil, or in the case of nitrogen from atmospheric nitrogen mediated by cyanobacteria (other than that from
fertilizer). More plant
output means more nutrient input. The often repeated claim that Green Revolution plants
need more fertilizer has about as much meaning as saying that it takes more food to raise three children than it
does to raise one. If sufficient nutrient is not in the soil, it must be added. Shiva's argument in essence is that one
can grow plants without nutrients or that one can achieve the same output as Green Revolution seeds yield without providing nutrient input other than
available "organic" sources. This is patently nonsensical and violates our fundamental knowledge of physics. Shiva has
made a number of preposterous statements over the years about yields in traditional Indian agriculture or traditional agriculture elsewhere such as among the Maya. Even
before the Green Revolution dramatically increased the demand for and use of synthetic fertilizer, there was a large difference between
the nutrients extracted from the soil in India and the "organic" nutrients available to be returned to it. In fact, nearly twice as much
nutrient was being withdrawn from the soil as was being returned. Contrary to Shiva's assertions, this process was not sustainable. Given
the dramatic increases in Indian agricultural output over the last four decades (which more than accommodated a doubling of the population), the deficit in "organic"
nutrient must be vastly greater today. Shiva cites Sir Albert Howard, whose vitalist ideas on "organic" agriculture were developed in colonial India (Howard 1940). But
though he was a strong proponent of composting ("Indore method"), Howard recognized the need for additional synthetic fertilizer and improved seeds, which means he
might have favored GM crops if he were alive today.
High yield is net beneficial for the environment
Staniford, scientific consultant and peak oil expert, physics PhD, 3/10/2008
(Stuart, “Food to 2050,” http://www.theoildrum.com/node/3702#more)
Which raises a third important point. Food = Area Cropped x Average Yield. If average yields had not
increased like this, humanity's impact on natural ecosystems would be much greater. It's true that industrial
agriculture has a lot of impacts (nitrogen runoff and the like). However, the alternative would probably have
been worse, since it would have required us to intensively exploit enormous areas of fragile, and currently less
intensively exploited, land.
Trends are good on species because of industrial farming – we control uniqueness
Avery, director of the Center for Global Food Issues at the Hudson Institute, November 26 2003
(Dennis T., “Giving Thanks for Abundant Food,”
http://www.hudson.org/index.cfm?fuseaction=publication_details&id=3114)
I am grateful that the rate of wild species extinction is now as low as it has been since the sixteenth century. We
lost only half as many known species of birds, animals and/or fish in the last third of the twentieth century (twenty species) as
we did during the last third of the nineteenth century (forty species), according to the UN Environmental Program. I am startled to
realize that only during the last sixty years or so has the world been able to feed all people without either hunting
wild species to extinction or stealing hunting/farming land from other people.
Turn: increased genetic diversity and safeguards require monoculture – peer-reviewed science proves ***
DeGregori, professor of economics at the University of Houston, 7/14/2003
(Thomas R., “The Anti-Monoculture Mania,” http://www.butterfliesandwheels.com/articleprint.php?num=28)
The argument that the Green Revolution crops have led to a diminution of genetic diversity, with a potential for a
disease or pest infestation engendering a global crop loss catastrophe, is taken as axiomatic in many circles as one more
threat that modern science imposes upon us. In fact, there is a sizeable and growing body of solidly based, scientific, peer-reviewed research that finds the exact opposite of the conventional wisdom (CIMMYT 1996, Evenson and Gollin 1994 &
1997, Gollin and Smale 1998, Rice et al. 1998, Smale 1997 & 1998, Smale et al. 1996 & 2002 and Wood and Lenné, 1999). Findings for wheat, for example, "suggest
that yield
stability, resistance to rusts, pedigree complexity, and the number of modern cultivars in farmers' fields have all
increased since the early years of the Green Revolution" (Smale and McBride 1996). The conclusion that the "trends in genetic diversity of
cereal crops are mainly positive" is warranted by the evidence. Moreover, this diversity does more than "just protect against
large downside risk for yields." It was "generated primarily as a byproduct to breeding for yield and quality improvement and
provides a pool of genetic resources for future yield growth." Consequently, the "threat of unforeseen, widespread, and
catastrophic yield declines striking as the result of a narrow genetic base must be gauged against this reality"
(Rosegrant and Hazell 2000, 312). Most critics do not seem to realize that the Green Revolution was not a one-shot
endeavor for wheat and rice, but an ongoing process of research for new varieties and improved agricultural
practices. There is an international network of growers, extension agents, local, regional, national and international research
stations, often linked by satellite, that has successfully responded to disease outbreaks which in earlier times
could well have resulted in a global crisis. Historically, the farmer had access to only a limited number of local varieties of seeds. Today,
should there be a disease or other cropping problem, the farmer can be the beneficiary of a new variety drawn from
seed bank accessions that number into the hundreds of thousands for major crops like rice. With transgenic technology, the options for the
cultivators are becoming vastly greater. Monoculture today is in fact not only consistent with an incredible diversity of means for
crop protection, it is the sine qua non for them, because it is not possible to have such resources for all the less widely
planted crops. In a world of 6 billion people, with over 2 billion of them in agriculture, it is not difficult to cherry-pick
instances of major crop disease outbreaks, but the issue is how representative are these examples, and what should our response to
them be? Too often, narratives such as that in Newsweek are used to condemn the Green Revolution, which has increased food production
by 2.7 times on about the same land under cultivation, accommodating a doubling of the population over the last 40 years, while creating more
stable food production in areas that have been historically most prone to crop failures and famine.
Science and history prove no threat from monoculture
DeGregori, professor of economics at the University of Houston, 5/2/2004
(Thomas R., “Green Myth vs. the Green Revolution,”
http://www.butterfliesandwheels.com/articleprint.php?num=50)
The Green Revolution seeds turn out to be more disease resistant (as plant breeders have added multiple disease
resistant genes - gene stacking), requiring less pesticides. "Increasingly, scientists breed for polygenic (as opposed to monogenic)
resistance by accumulating diverse, multiple genes from new sources and genes controlling different mechanisms of resistance within single
varieties" (Smale 1997, 1265, see also Cox and Wood 1999, 46). The coefficient of variation for rice production has been steadily decreasing for the last forty years,
which would seem to indicate the new technologies in agricultural production are not as fragile as some would have us believe (Lenné and Wood 1999, see also Wood
and Lenné 1999a&b and Evenson and Gollin 1997). This has also been the case for wheat. "Yield stability, resistance to rusts, pedigree complexity, and the number of
modern cultivars in farmers' fields have all increased since the early years of the Green Revolution" (Smale and McBride 1996). Modern "monoculture" is
central to the unverified claims about modern varieties being less disease resistant (DeGregori 2003c). The "natural
ecosystems" from which important cereals were domesticated were often moncultures - "extensive, massive
stands in primary habitats, where they are dominant annuals." This includes the "direct ancestors of our cereals Hordeum spontaneum
(for barley), Triticum boeoticum (for einkorn wheat) and Triticum dicoccoides (for emmer wheat)" which "are common wild plants in the Near East" (Wood and Lenné
1999, 445). This was not unique to the Near East but was a prevailing pattern of the time. In the transition from Pleistocene to the Holocene, "climatic changes in
seasonal regimes decreased diversity, increased zonation of plant communities, and caused a shift in net antiherbivory defense strategies" (Guthrie 1984, 260, cited in
DeGregori 2003c). The "ecological richness of late Pleistocene" in many of the areas that humans were first to develop agriculture, gave way to "relative ecological homogeneity during the succeeding Holocene" (Guthrie 1984, 251, cited in DeGregori 2003c). Critics of modern agriculture who fear the susceptibility to disease from monoculture continually hark back to the southern corn-leaf blight in the U.S. in 1970 since they cannot come up with any other comparable loss in the last half century in corn or wheat or rice, the staples that provide about two-thirds of the world's food production. The $1 billion in losses of about 15 to 25% of the 1970 corn crop was substantial, but these losses should be considered against the fact that corn yields had more than doubled over the previous two decades and that the crop year following the blight was one of record yields. When not using the corn blight, the critics go back over 150 years to the Irish potato famine. And they simply ignore the crop losses and famine that have been the lot of humankind since the beginning of conventional, largely "organic" agriculture.
No warming – their models are incorrect and satellite data disproves.
Steven F. Hayward, F.K. Weyerhaeuser fellow at the American Enterprise Institute, 3-15-2010, The Weekly Standard, “In Denial,”
http://www.weeklystandard.com/print/articles/denial
This central pillar of the climate campaign is unlikely to survive much longer, and each repetition of the “science-is-settled”
mantra inflicts more damage on the credibility of the climate science community. The scientist at the center of the Climategate scandal at East
Anglia University, Phil (“hide the decline”) Jones dealt the science-is-settled narrative a huge blow with his candid admission in a BBC interview
that his surface temperature data are in such disarray they probably cannot be verified or replicated, that the
medieval warm period may have been as warm as today, and that he agrees that there has been no statistically
significant global warming for the last 15 years—all three points that climate campaigners have been bitterly contesting. And Jones specifically
disavowed the “science-is-settled” slogan: BBC: When scientists say “the debate on climate change is over,” what exactly do they mean, and what don’t they mean?
Jones: It would be supposition on my behalf to know whether all scientists who say the debate is over are saying that for the same reason. I don’t believe the vast
majority of climate scientists think this. This is not my view. There is still much that needs to be undertaken to reduce uncertainties, not just for the future, but for the
instrumental (and especially the palaeoclimatic) past as well [emphasis added]. Judith Curry, head of the School of Earth and Atmospheric Sciences at Georgia Tech
and one of the few scientists convinced of the potential for catastrophic global warming who is willing to engage skeptics seriously, wrote February 24: “No
one
really believes that the ‘science is settled’ or that ‘the debate is over.’ Scientists and others that say this seem to want to advance a
particular agenda. There is nothing more detrimental to public trust than such statements.” The next wave of climate revisionism is likely to reopen most of the central
questions of “settled science” in the IPCC’s Working Group I, starting with the data purporting to prove how much the Earth has warmed over the last century. A
London Times headline last month summarizes the shocking revision currently underway: “World May Not Be Warming, Scientists Say.” The
Climategate
emails and documents revealed the disarray in the surface temperature records the IPCC relies upon to validate its
claim of 0.8 degrees Celsius of human-caused warming, prompting a flood of renewed focus on the veracity and handling of surface temperature data. Skeptics such as
Anthony Watts, Joseph D’Aleo, and Stephen McIntyre have been pointing out the defects in the surface temperature record for years, but the media and the IPCC
ignored them. Watts and D’Aleo have painstakingly documented (and in many cases photographed) the huge number of temperature
stations that have
been relocated, corrupted by the “urban heat island effect,” or placed too close to heat sources such as air conditioning
compressors, airports, buildings, or paved surfaces, as well as surface temperature series that are conveniently left out of the
IPCC reconstructions and undercut the IPCC’s simplistic story of rising temperatures. The compilation and statistical
treatment of global temperature records is hugely complex, but the skeptics such as Watts and D’Aleo offer compelling critiques showing that most of the
reported warming disappears if different sets of temperature records are included, or if compromised station
records are excluded. The puzzle deepens when more accurate satellite temperature records, available starting in 1979, are considered. There is a glaring
anomaly: The satellite
records, which measure temperatures in the middle and upper atmosphere, show very little
warming since 1979 and do not match up with the ground-based measurements. Furthermore, the satellite readings of
the middle- and upper-air temperatures fail to record any of the increases the climate models say should be happening in
response to rising greenhouse gas concentrations. John Christy of the University of Alabama, a contributing author to the IPCC’s Working
Group I chapter on surface and atmospheric climate change, tried to get the IPCC to acknowledge this anomaly in its 2007 report but was ignored. (Christy is
responsible for helping to develop the satellite monitoring system that has tracked global temperatures since 1979. He received NASA’s Medal for Exceptional
Scientific Achievement for this work.) Bottom line: Expect some surprises to come out of the revisions of the surface temperature records that will take place over the
next couple of years. Eventually the climate modeling community is going to have to reconsider the central question: Have the models the IPCC uses for its predictions of catastrophic warming overestimated the climate’s sensitivity to greenhouse gases? Two recently published studies funded by the U.S. Department of Energy, one by Brookhaven Lab scientist Stephen Schwartz in the Journal of Geophysical Research, and one by MIT’s Richard Lindzen and Yong-Sang Choi in Geophysical Research Letters, both argue for vastly lower climate sensitivity to greenhouse gases. The models the IPCC uses for projecting a 3 to 4 degree Celsius increase in temperature all assume large positive (that is, temperature-magnifying) feedbacks from a doubling of carbon dioxide in the atmosphere; Schwartz, Lindzen, and Choi discern strong negative (or temperature-reducing) feedbacks in the climate system, suggesting an upper-bound of future temperature rise of no more than 2 degrees Celsius. If the climate system is less sensitive to greenhouse gases than the climate campaign believes, then what is causing plainly observable changes in the climate, such as earlier arriving springs, receding glaciers, and shrinking Arctic Ocean ice caps? There have been alternative explanations in the scientific literature for several years, ignored by the media and the IPCC alike. The IPCC downplays theories of variations in solar activity, such as sunspot activity and gamma ray bursts, and although there is robust scientific literature on the issue, even the skeptic community is divided about whether solar activity is a primary cause of recent climate variation. Several studies of Arctic warming conclude that changes in ocean currents, cloud formation, and wind patterns in the upper atmosphere may explain the retreat of glaciers and sea ice better than greenhouse gases. Another factor in the Arctic is “black carbon”—essentially fine soot particles from coal-fired power plants and forest fires, imperceptible to the naked eye but reducing the albedo (solar reflectivity) of Arctic ice masses enough to cause increased summertime ice melt. Above all, if the medieval warm period was indeed as warm or warmer than today, we cannot rule out the possibility that the changes of recent decades are part of a natural rebound from the “Little Ice Age” that followed the medieval warm period and ended in the 19th century. Skeptics have known and tried to publicize all of these contrarian or confounding scientific findings, but the compliant news media routinely ignored all of them, enabling the IPCC to get away with its serial exaggeration and blatant advocacy for more than a decade.
No motivation for agroterror – history is on our side
Peter Chalk, transnational terrorism expert for RAND, 2/9/2001, The US Agricultural Sector: A New Target for Terrorism?, Jane's Intelligence Digest, p.
http://www.janes.com/security/international_security/news/jir/jir010209_1_n.shtml
Despite the ease and potentially severe implications of carrying out biological attacks against agriculture, to date only a handful of actual or
threatened incidents have occurred. If there are no real technological or psychological constraints to employing biological weapons against
agriculture, why haven't terrorists made more use of this modus operandi, especially given its potential to cause significant economic, political and social
upheaval? One reason could be that terrorists haven't thought through the full implications of deliberately targeting agricultural livestock and
produce. According to this interpretation, it may only be a matter of time before more instances of this type of aggression take place. Another possibility may be that
deliberate sabotage is traditionally not something health officials have actively looked for when investigating crop or animal disease outbreaks. The implication here is
that more acts may have actually taken place than are known about. Animal and plant health officials in Washington concede this is a possibility, acknowledging that in
most countries (including the USA) the tendency is to automatically assume that disease outbreaks are naturally occurring events. The inevitable consequence has been
epidemiological investigations that seldom consider the possibility of deliberate pathogenic introduction. Finally, it could be that terrorists consider this form of
aggression to be too 'dry' in comparison to traditional bombings, in the sense that attacks against crops and animals do not produce
immediate, visible effects. The impact, while significant, is delayed, lacking a single point for the media to focus on. As such, the fact that
biological agro-terrorism has not emerged as more of a problem is perhaps understandable.
Detection prevents successful agroterror
Gus R. Douglass, former president of the National Association of State Departments of Agriculture, 2/1/2004, Charleston Gazette, p. lexis
I write in response to Charles A. Nichols' assertion that the American public is unprotected against mad cow disease or otherwise adulterated food. While Mr. Nichols is
free to differ, the history of agriculture in this country is one of a safe and bountiful food supply. It is true that agriculture has become big business.
But big business in the Midwest is small business here in West Virginia - 12,000 small businesses producing cattle that feed our nation and bring new dollars into our
state. Our society has changed dramatically since I was a boy. Everybody lived on a farm back then. Today, few people live on farms, and large business conglomerates
produce vast amounts of food in a highly efficient, and, yes, safe, manner. Today we have 24-hour supermarkets and fast-food restaurants that feed a vibrant and productive nation. As agriculture has grown and evolved, so too have government regulation and institutionalized attention to safety, even more so with the threat of agroterrorism. The ban on downed cattle is a thing whose time has come - for health reasons, if nothing else - even though the large
majority of downer animals will likely have only broken bones. Although there is always room for improvement in any system, the early detection and rapid
response of the U.S. Department of Agriculture to the bovine spongiform encephalopathy incident shows that the plan we have in place is
an effective system for dealing with animal emergencies. Granted, it would be easier for the mad cow investigators now if there were a national animal
identification system in place. It is a measure I have advocated in the past. And it is likely to become a reality in the near future. There is always room for improvement.
But the fact is, despite all the attention given to it, mad cow disease never really posed a human health threat. But it did show that America is
vulnerable to foreign animal diseases, even ones it has successfully protected itself against for years. Disease surveillance needs to be a priority in this state and nation.
No impact to bioterror – a single assault rifle is more dangerous than a massive bio attack
Stratfor.com 12/21/2007 (“Bioterrorism: Sudden Death Overtime?,” http://www2.stratfor.com/analysis/bioterrorism_sudden_death_overtime)
In this season of large college bowl games and the National Football League playoffs in the United States, and large nonsporting events such as the New Year’s Eve
celebration in New York’s Times Square — not to mention the upcoming Olympic Games in Beijing — a discussion of bioterrorism and the threat it poses might be of
interest. First, it must be recognized that during the past several decades of the modern terrorist era, biological weapons have been used very
infrequently — and there are some very good reasons for this. Contrary to their portrayal in movies and television shows, biological agents are difficult to
manufacture and deploy effectively in the real world. In spite of the fear such substances engender, even in cases in which they have been somewhat effective
they have proven to be less effective and more costly than more conventional attacks using firearms and explosives. In fact, nobody even noticed what was
perhaps the largest malevolent deployment of biological agents in history, in which thousands of gallons of liquid anthrax and
botulinum toxin were released during several attacks in a major metropolitan area over a three-year period. This use of biological agents was perpetrated by
the Japanese apocalyptic cult Aum Shinrikyo. An examination of the group’s chemical and biological weapons (CBW) program provides some important insight into
biological weapons, their costs — and their limitations. In the late 1980s, Aum’s team of trained scientists spent millions of dollars to develop a series of state-of-the-art biological weapons research and production laboratories. The group experimented with botulinum toxin, anthrax, cholera and Q fever and even tried to acquire the
Ebola virus. The group hoped to produce enough biological agent to trigger a global Armageddon. Between April of 1990 and August of 1993, Aum conducted seven
large-scale attacks involving the use of thousands of gallons of biological agents — four with anthrax and three with botulinum toxin. The group’s first attempts at
unleashing mega-death on the world involved the use of botulinum toxin. In April of 1990, Aum used a fleet of three trucks equipped with aerosol sprayers to release
liquid botulinum toxin on targets that included the Imperial Palace, the Diet and the U.S. Embassy in Tokyo, two U.S. naval bases and the airport in Narita. In spite
of the massive quantities of agent released, there were no mass casualties and, in fact, nobody outside of the cult was even aware
the attacks had taken place. When the botulinum operations failed to produce results, Aum’s scientists went back to the drawing board and retooled their
biological weapons facilities to produce anthrax. By mid-1993, they were ready to launch attacks involving anthrax, and between June and August of 1993 the group
sprayed thousands of gallons of aerosolized liquid anthrax in Tokyo. This time Aum not only employed its fleet of sprayer trucks, but also used sprayers mounted on the
roof of its headquarters to disperse a cloud of aerosolized anthrax over the city. Again, the attacks produced no results and were not even noticed. It was only after the
group’s successful 1995 subway attacks using sarin nerve agent that a Japanese government investigation discovered that the 1990 and 1993 biological attacks had
occurred. Aum Shinrikyo’s team of highly trained scientists worked under ideal conditions in a first-world country with a virtually
unlimited budget. The team worked in large, modern facilities to produce substantial quantities of biological weapons. Despite the millions of dollars the
group spent on its bioweapons program, it still faced problems in creating virulent biological agents, and it also found it difficult to dispense those agents
effectively. Even when the group switched to employing a nerve agent, it only succeeded in killing a handful of people. A comparison between
the Aum Shinrikyo Tokyo subway attack and the jihadist attack against the Madrid trains in 2004 shows that chemical/biological attacks are more expensive to produce
and yield fewer results than attacks using conventional explosives. In the March 1995 Tokyo subway attack — Aum’s most successful — the group placed 11 sarinfilled plastic bags on five different subway trains and killed 12 people. In the 2004 Madrid attack, jihadists detonated 10 improvised explosive devices (IEDs) and killed
191 people. Aum’s CBW program cost millions and took years of research and effort; the Madrid bombings only cost a few thousand dollars, and the IEDs were
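To make that cost-benefit ratio concrete: the card gives only "millions" for Aum's program and "a few thousand dollars" for Madrid, so the dollar figures below are illustrative assumptions, not sourced numbers.
Tokyo subway (sarin): assume ~$10,000,000 spent / 12 killed ≈ $833,000 per fatality
Madrid trains (IEDs): assume ~$10,000 spent / 191 killed ≈ $52 per fatality
Even if either assumed figure is off by an order of magnitude, the per-casualty cost of the conventional attack remains orders of magnitude lower, which is the asymmetry the card identifies.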
assembled in a few days. The most deadly biological terrorism attack to date was the case involving a series of letters containing anthrax in the
weeks following the Sept. 11 attacks — a case the FBI calls Amerithrax. While the Amerithrax letters did cause panic and result in companies all across the country
temporarily shutting down if a panicked employee spotted a bit of drywall dust or powdered sugar from doughnuts eaten by someone on the last shift, in practical
terms, the attacks were very ineffective. The Amerithrax letters resulted in five deaths; another 22 victims were infected but recovered after receiving medical
treatment. The letters did not succeed in infecting senior officials at the media companies targeted by the first wave of letters, or Sens. Tom Daschle and Patrick Leahy,
who were targeted by a second wave of letters. By way of comparison, John Allen Muhammad, the so-called “D.C. Sniper,” was able to cause mass panic and
kill twice as many people (10) by simply purchasing and using one assault rifle. This required far less time, effort and expense than producing the anthrax
spores used in the Amerithrax case. It is this cost-benefit ratio that, from a militant’s perspective, makes firearms and explosives more attractive
weapons for an attack. This then is the primary reason that more attacks using biological weapons have not been executed: The cost is higher than the
benefit. Certainly, history has shown that militant organizations and homegrown militants are interested in large sporting events as venues for terror; one needs to
look no further than the 1972 Munich Massacre, the 1996 Olympic Park bombing or even the 2005 incident in which University of Oklahoma student Joel Hinrichs died
after a TATP-filled backpack he was wearing exploded outside a football game at Oklahoma Memorial Stadium, to see this. Because of this, vigilance is needed.
However, militants planning such attacks will be far more likely to use firearms or IEDs in their attacks than they will biological agents. Unfortunately, in the
real world guns and suicide bombs are far more common — and more deadly — than air horns filled with creepy bioterror.
Even if used, bioweapons won’t spread or cause epidemics
Gregg Easterbrook, senior fellow at The New Republic, July 2003, Wired, “We’re All Gonna Die!”
http://www.wired.com/wired/archive/11.07/doomsday.html?pg=2&topic=&topic_set=
3. Germ warfare! Like chemical agents, biological weapons have never lived up to their billing in popular culture. Consider the 1995 medical thriller
Outbreak, in which a highly contagious virus takes out entire towns. The reality is quite different. Weaponized smallpox escaped from a Soviet laboratory
in Aralsk, Kazakhstan, in 1971; three people died, no epidemic followed. In 1979, weapons-grade anthrax got out of a Soviet facility in Sverdlovsk
(now called Ekaterinburg); 68 died, no epidemic. The loss of life was tragic, but no greater than could have been caused by a single conventional
bomb. In 1989, workers at a US government facility near Washington were accidentally exposed to Ebola virus. They walked around the
community and hung out with family and friends for several days before the mistake was discovered. No one died. The fact is, evolution has spent
millions of years conditioning mammals to resist germs. Consider the Black Plague. It was the worst known pathogen in history, loose
in a Middle Ages society of poor public health, awful sanitation, and no antibiotics. Yet it didn’t kill off humanity. Most people who were caught in the
epidemic survived. Any superbug introduced into today’s Western world would encounter top-notch public health, excellent sanitation,
and an array of medicines specifically engineered to kill bioagents. Perhaps one day some aspiring Dr. Evil will invent a bug that bypasses the immune
system. Because it is possible some novel superdisease could be invented, or that existing pathogens like smallpox could be genetically altered to make them more
virulent (two-thirds of those who contract natural smallpox survive), biological agents are a legitimate concern. They may turn increasingly troublesome as time passes
and knowledge of biotechnology becomes harder to control, allowing individuals or small groups to cook up nasty germs as readily as they can buy guns today. But no
superplague has ever come close to wiping out humanity before, and it seems unlikely to happen in the future.