No scenario for great power war – laundry list - openCaselist 2015-16

Plan text
Plan: The United States Congress should amend the Dickey-Wicker amendment to
legalize the sale of human organs produced through human embryonic stem cell
regenerative research.
1AC 1
Contention one is biotech
The stem cell market is a major stepping stone for technological innovation- solves all
impacts
Lane and Matthews 13 (Neal F. Lane, Ph.D. Senior Fellow in Science and Technology Policy Kirstin
R.W. Matthews, Ph.D. Fellow in Science and Technology Policy “2013 POLICY RECOMMENDATIONS FOR
THE OBAMA ADMINISTRATION” http://bakerinstitute.org/media/files/Research/ab3a2eb0/STP-pubPolicyRecommendations.pdf)
Human embryonic stem cell (hESC) research is an emerging field of biomedical research that started in 1998 with the
derivation of the first cell line. Scientists look forward to the possibility that hESCs, along with other types of stem cells found in adults,
can advance research in areas as diverse as developmental biology, cancer research, and regenerative
medicine. Early in his administration, President Obama issued an executive order that directed NIH to develop new guidelines for regulating federally funded
hESC research.13 As with the previous administration, and consistent with the “Dickey-Wicker Amendment” appropriation rider, the NIH only funds research that
uses hESC lines previously approved by an ethics review committee and created through private funds. As of November 2012, there are 184 hESC lines eligible for
federal funding, a nine-fold increase from 21 lines in 2008.14 Following the adoption of the new NIH guidelines, a lawsuit was brought against the federal
government, Sherley v. Sebelius, which challenged the new NIH guidelines. A
district court judge subsequently granted a
preliminary injunction halting all funding of hESC research at NIH. Ultimately, the injunction was dismissed as well as the case,
but scientists had already begun to question the sustainability of this type of research. During the past 15 years, each
presidential administration—Presidents Bill Clinton, George W. Bush, and now Obama—has created its own
stem cell policies using executive orders. While there was consistency during each administration, the executive orders were
altered when a new president was elected. This inconsistency is negatively impacting stem cell
research, causing scientists to shy away from the field and making them unsure about the
area’s funding future. Working with Congress, the president should create new legislation
that will make his 2009 executive order permanent. The law should:
• Support research on all types of human stem cells, including embryonic and adult.
• Authorize federal funding of hESC research on lines derived according to NIH ethical guidelines, regardless of the date the cell lines were derived or created.
• Clarify which research is eligible for federal funding (i.e., research utilizing approved hESCs) and which research is not (i.e., the creation of hESC lines).
This law would assure scientists that federal policy
would remain the same year-to-year and administration-to-administration.
Conclusion
Science and technology impact most areas of public policy, including domestic and national security,
energy and climate change, the environment, health and safety, agriculture,
transportation, education, and, of most immediate concern, the economy and jobs for
Americans. From federal investments in science and engineering R&D, particularly basic and applied research,
we obtain new knowledge and technologies that improve the ability of our nation to meet
its economic, security, and social goals. In the United States, scientific discoveries and
technological breakthroughs have been shown to drive innovation, which plays a vital role in
sustainable economic growth. The second term of the Obama administration will provide a
unique opportunity to keep the nation on track to advance U.S. science and technology
research and ensure its applications to societal goals. That will require that the administration’s S&T
team give particular attention to the integrity of scientific advice to government, research funding, STEM education, the creation of
a permanent U.S. stem cell policy, and the development of new tools for science policy.
Legalization of stem cell research revolutionizes the biotech industry – allows for
innovation and lowers costs- causes spillover of genetic methods to biotech
applications
Cadden 08 (Laura Cadden, investment strategist at Today’s Financial News, 11/5/08, “New
administration could mean advances — and profits — for U.S. stem cell biotech companies”,
http://www.todaysfinancialnews.com/us-stocks-and-markets/new-administration-could-mean-profitsfor-us-stem-cell-companies-5269.html)
One of the biggest impacts of the new administration could be in stem cell research, according to Laura
Cadden. Support for the use of embryonic stem cells would open a whole world of
opportunity for specialist biotech firms . Laura picks 5 small-cap biotech stocks that would make
huge gains on new stem cell legislation. This from Today’s Financial News: The theoretical benefits of
stem cell therapy could have a revolutionary effect on biotechnology and medicine. Stem cell
technology could create a renewable source of specifically differentiated cells to replace and regenerate
cells and tissues damaged by conditions such as heart disease, Alzheimer’s, diabetes, spinal cord injury,
Parkinson’s etc… It could provide tools for the identification and (hopefully) prevention of the causes of
abnormal cell division that lead to birth defects and cancer. And it could change the way we test
new medications e ntirely. Adult vs. embryonic stem cells As a general rule, adult stem cells can only
be relied upon to divide and replenish into cell types of their original tissue. This is fine in situations
where a patient’s own cells can be used and such treatment thereby avoids immune rejection.
Embryonic stem cells, on the other hand, can develop into any and all cell types. And they are much
easier to grow in culture as compared to adult stem cells. What does this mean for stem cell biotech
companies? Most of the smaller American companies engaged in stem cell research have had to focus
on a specific adult stem cell for narrow applications because of limitations to Federal funding for
new stem cell cultures. For example… StemCells, Inc. (NASDAQ:STEM), is currently focused on human
neural stem cell and human liver engrafting cells. Stem Cell Therapeutics Corp. (CVE:SSS) and BrainStorm
Cell Therapeutics (OTC:BCLI) take cells from patients’ own bone marrow in order to treat Parkinson’s,
ALS, spinal cord injury, etc. Transitions Therapeutics Inc. (NASDAQ:TTHI) and Ixion Biotechnology, Inc.
focus on using islet beta cells in the pancreas to treat diabetes. The addition of embryonic stem
cells to genetic therapy has the potential to revolutionize the revolutionary. Imagine… rather
than focusing all their money and time on one specific type of cell, these companies could apply their science to cells
affecting areas throughout the body. These unspecified embryonic cells can (again, in theory) be
specialized to fix whatever ails you, once the science catches up. The tiny biotech firms would no longer
have to rely on qualified adult donors (think of all the restrictions the Red Cross now has regarding
acceptable blood donors!). And with the relative ease of embryonic stem cell culture proliferation,
experimentation can reach new levels. Obama Administration to support stem cell research:
President-elect Barack Obama has clearly stated his opinion, “… we must all work together to expand
federal funding of stem cell research and continue moving forward in our fight against disease by
advancing our knowledge through science and medicine.” And that could mean lower costs and higher
ROI for these small biotech companies (many of which are trading under $2 today!).
Warming causes extinction- only genetic biotechnology applications solve adaptation
and bioengineered pathogen release
Baum and Wilson 13 (Seth D. Baum* and Grant S. Wilson Global Catastrophic Risk Institute * ‘The
Ethics of Global Catastrophic Risk from Dual-Use Bioengineering’ Ethics in Biology, Engineering and
Medicine, 4(1):59-72 (2013). Pg lexis)
Note: “GCR”: Global Catastrophic Risk
In addition to itself being a GCR, bioengineering can also reduce the chances that other GCRs will occur.
One such GCR is climate change. Catastrophic
climate change scenarios could involve sea level rise of up to 10 meters,
droughts, increased extreme weather events, loss of most threatened and endangered species, and temperature increases
of 6 degrees Celsius.37 Still worse than that would be outcomes in which large portions of the land surface on Earth
become too warm for mammals (including humans) to survive.38 And the worst scenario could involve climate engineering
backfiring to result in extremely rapid temperature increase.39 Despite the risks of climate change, the
international community has struggled to satisfactorily address the issue, for a variety of political, technological, and economic reasons.
Bioengineering may be able to help. An army of bioengineered algae that is specifically designed to
convert carbon dioxide into a “biocrude” fuel ready to be made into fuel for any vehicle type – a
technology that Craig Venter’s Synthetic Genomics, Inc. is developing with a $600 million investment from ExxonMobil –
could remove greenhouse gases from the atmosphere and provide a plentiful, carbon-neutral fuel source
that does not pose many of the downsides of today’s biofuel options (although this technology has its own risks).40 Or, despite
being a bizarre proposition, humans could be genetically engineered to reduce our CO2 output, such
as by engineering humans to be intolerant to meat or to be smaller in size.41 Likewise, while a
deadly bioengineered virus has the potential to escape from a laboratory and cause a global
catastrophe, such research may be necessary to create vaccines for viruses that could cause
worldwide pandemics. For example, the Influenza Pandemic of 1918-1919 (the Spanish flu) killed about 50 million people
worldwide.42 Would modern bioengineering technology have been able to avoid this global catastrophe?
In fact, researchers justified the airborne H5N1 virus research, discussed above, as helping to prevent the spread of a similar strain that could mutate
naturally. Overall, there is a dynamic relationship between bioengineering and other GCRs that should be assessed when considering how to
respond to these risks.
Warming outweighs and turns every impact- guarantees extinction
Sharp and Kennedy, 14 – is an associate professor on the faculty of the Near East South Asia Center
for Strategic Studies (NESA). A former British Army Colonel he retired in 2006 and emigrated to the U.S.
Since joining NESA in 2010, he has focused on Yemen and Lebanon, and also supported NESA events into
Afghanistan, Turkey, Egypt, Israel, Palestine and Qatar. He is the faculty lead for NESA’s work supporting
the UAE National Defense College through an ongoing Foreign Military Sales (FMS) case. He also directs
the Network of Defense and Staff Colleges (NDSC) which aims to provide best practice support to
regional professional military and security sector education development and reform. Prior to joining
NESA, he served for 4 years as an assistant professor at the College of International Security Affairs
(CISA) at National Defense University where he wrote and taught a Masters' Degree syllabus for a
program concentration in Conflict Management of Stability Operations and also taught strategy,
counterterrorism, counterinsurgency, and also created an International Homeland Defense Fellowship
program. At CISA he also designed, wrote and taught courses supporting the State Department's Civilian
Response Corps utilizing conflict management approaches. Bob served 25 years in the British Army and
was personally decorated by Her Majesty the Queen twice. After graduating from the Royal Military
Academy, Sandhurst in 1981, he served in command and staff roles on operations in Northern Ireland,
Kosovo, Gulf War 1, Afghanistan, and Cyprus. He has worked in policy and technical staff appointments
in the UK Ministry of Defense and also UK Defense Intelligence plus several multi-national organizations
including the Organization for Security and Cooperation in Europe (OSCE). In his later career, he
specialized in intelligence. He is a 2004 distinguished graduate of the National War College and holds a
master’s degree in National Security Strategy from National Defense University, Washington, D.C. AND is
a renewable energy and climate change specialist who has worked for the World Bank and the Spanish
Electric Utility ENDESA on carbon policy and markets (Robert and Edward, 8-22, “Climate Change and
Implications for National Security” http://www.internationalpolicydigest.org/2014/08/22/climatechange-implications-national-security/)djm
Our planet is 4.5 billion years old. If that whole time was to be reflected on a single one-year calendar then the dinosaurs died off sometime
late in the afternoon of December 27th and modern humans emerged 200,000 years ago, in the final minutes before midnight on December 31st.
Therefore, human life on earth is very recent. In those final minutes humans made the first fires – wood fires – neutral in the carbon
balance. Now reflect on those most recent 200,000 years again on a single one-year calendar and you might be surprised to learn that the
industrial revolution began only a few hours ago during the middle of the afternoon on December 31st, 250 years ago, coinciding with the
discovery of underground carbon fuels. Over the 250 years carbon fuels have enabled tremendous technological advances including a
population growth from about 800 million then to 7.5 billion today and the consequent demand to extract even more carbon. This has occurred
during a handful of generations, which is hardly noticeable on our imaginary one-year calendar. The release of this carbon – however –
is changing our climate at such a rapid rate that it threatens our survival and presence on earth. It
defies imagination that so much damage has been done in such a relatively short time. The
implications of climate change are the single most significant threat to life on earth and, put simply, we
are not doing enough to rectify the damage. This relatively very recent ability to change our climate is an inconvenient truth; the science is sound. We know of
the complex set of interrelated national and global security risks that are a result of global warming
and the
velocity at which climate change is occurring. We worry it may already be too late. Climate change writ large has informed few, interested
some, confused many, and polarized politics. It has already led to an increase in natural disasters
including but not limited to droughts, storms, floods, fires etc. The year 2012 was among the 10 warmest years on record according to an American Meteorological
Society (AMS) report. Research suggests that climate
change is already affecting human displacement; reportedly 36 million
people were displaced in 2008 alone because of sudden natural disasters. Figures for 2010 and 2011 paint a grimmer picture of people
displaced because of rising sea levels, heat and storms. Climate change affects all natural systems. It
impacts temperature and consequently it affects water and weather patterns. It contributes to desertification,
deforestation and acidification of the oceans. Changes in weather patterns may mean droughts in one area and
floods in another. Counter-intuitively, perhaps, sea
levels rise but perennial river water supplies are reduced because
glaciers are retreating. As glaciers and polar ice caps melt, there is an albedo effect, which is a double
whammy of less temperature
regulation because
of less surface area of ice present. This means that more heat absorption occurs and also there is less reflection of the sun’s light. A potentially critical wild card could be runaway climate
change due to the release of methane from melting tundra. Worldwide permafrost soils contain about 1,700 Giga Tons of carbon, which is
about four times more than all the carbon released through human activity thus far. The
planet has already adapted itself to dramatic climate change including a wide range of distinct geologic periods
and multiple extinctions, and at a pace that it can be managed. It is human intervention that
has accelerated the pace dramatically: An increased surface temperature, coupled with more severe
weather and changes in water distribution, will create uneven threats to our agricultural systems
and will foster and support the spread of insect borne diseases like Malaria, Dengue and the West Nile virus.
Rising sea levels will increasingly threaten our coastal population and infrastructure centers and,
with more than 3.5 billion people – half the planet – depending on the ocean for their primary source of
food, ocean acidification may dangerously undercut critical natural food systems which would result in reduced rations. Climate change also
carries significant inertia. Even if emissions were completely halted today, temperature increases would continue for some time. Thus
the impact is not only to the environment, water, coastal homes, agriculture and fisheries as mentioned, but
also would lead to conflict and thus impact national security. Resource wars are inevitable as
countries respond, adapt and compete for the shrinking set of available resources. These wars
have arguably already started and will continue in the future because climate
change will force countries to act for national survival; the so-called Climate Wars. As early as 2003 Greenpeace alluded to a report which it claimed was commissioned by the
Pentagon titled: An Abrupt Climate Change Scenario and Its Implications for U.S. National Security. It painted a picture
of a world in turmoil because global warming had accelerated. The scenario outlined was both abrupt and alarming. The
report offered recommendations but backed away from declaring climate change an immediate problem, concluding that it would actually be
more incremental and measured; as such it would be an irritant, not a shock for national security systems. In 2006 the Center for Naval
Analyses (CNA) – Institute of Public Research – convened a board of 11 senior retired generals and admirals to assess
National Security and the Threat of Climate Change. Their initial report was published in April 2007 and made no mention of
the potential acceleration of climate change. The team found that climate change was a serious threat to
national security and that it was: “most likely to happen in regions of the world that are already fertile ground for extremism.” The
team made recommendations from their analysis of regional impacts which suggested the following.
Europe would experience some fracturing because of border migration. Africa would need more stability and humanitarian
operations provided by the United States. The Middle East would experience a “loss of food and water
security (which) will increase pressure to emigrate across borders.” Asia would suffer from “threats to
water and the spread of infectious disease.” In 2009 the CIA opened a Center on Climate Change and National Security to
coordinate across the intelligence community and to focus policy. In May 2014, CNA again convened a Military Advisory Board
but this time to assess National Security and the Accelerating Risk of Climate Change. The report
concludes that climate change is no longer a future threat but occurring right now, and the authors
appeal to the security community, the entire government and the American people to not only build resilience against projected
climate change impacts but to form agreements to stabilize climate change and also to integrate climate
change across all strategy and planning. The calm of the 2007 report is replaced by a tone of anxiety concerning the
future coupled with calls for public discourse and debate because “time and tide wait for no man.” The report notes a key distinction between
resilience (mitigating the impact of climate change) and agreements (ways to stabilize climate change) and states that: Actions
by the
United States and the international community have been insufficient to adapt to the challenges associated with projected climate
change. Strengthening
resilience to climate impacts already locked into the system is critical, but this will reduce long-term risk
only if improvements in resilience are accompanied by actionable agreements on ways to
stabilize climate change. The 9/11 Report framed the terrorist attacks as less of a failure of intelligence than a failure of imagination.
Greenpeace’s 2003 account of the Pentagon’s alleged report describes a coming climate Armageddon which to
readers was unimaginable and hence the report was not really taken seriously. It described:
A world thrown into turmoil by drought, floods, typhoons. Whole countries rendered uninhabitable. The capital of the Netherlands
submerged. The borders of the U.S. and Australia patrolled by armies firing into waves of starving boat people
desperate to find a new home. Fishing boats armed with cannon to drive off competitors. Demands for access to water
and farmland backed up with nuclear weapons. The CNA and Greenpeace/Pentagon reports are both mirrored by
similar analysis by the World Bank which highlighted not only the physical manifestations of climate change, but also
the significant human impacts that threaten to unravel decades of economic development, which will
ultimately foster conflict. Climate change is the quintessential “Tragedy of the Commons,” where the cumulative impact of
many individual actions (carbon emission in this case) is not seen as linked to the marginal gains available to each individual action and not seen
as cause and effect. It is simultaneously huge, yet amorphous and nearly invisible from day to day. It is occurring very fast in geologic time
terms, but in human time it is (was) slow and incremental. Among environmental problems, it is uniquely global. With our planet and culture
figuratively and literally honeycombed with a reliance on fossil fuels, we face systemic challenges in changing the reliance across multiple layers
of consumption, investment patterns, and political decisions; it will be hard to fix!
Warming is real, anthropogenic, rapid and leads to extinction- scientific consensus
Prothero 12 – Donald R. Prothero is a Professor of Geology at Occidental College and Lecturer in Geobiology at the California Institute of Technology.
(“How We Know Global Warming is Real and Human Caused”, 3/1/2012, http://www.skeptic.com/eskeptic/12-02-08/)
How do we know that global warming is real and primarily human caused? There are numerous lines of evidence that converge to this conclusion. Carbon Dioxide
Increase. Carbon
dioxide in our atmosphere has increased at an unprecedented rate in the past 200 years. Not one data set
collected over a long enough span of time shows otherwise. Mann et al. (1999) compiled the past 900 years’ worth of
temperature data from tree rings, ice cores, corals, and direct measurements of the past few centuries, and the sudden
increase of temperature of the past century stands out like a sore thumb. This famous graph (see Figure 1 above) is now known as the “hockey stick” because it is
long and straight through most of its length, then bends sharply upward at the end like the blade of a hockey stick. Other graphs show that climate was very stable
within a narrow range of variation through the past 1000, 2000, or even 10,000 years since the end of the last Ice Age. There were minor warming events during the
Climatic Optimum about 7000 years ago, the Medieval Warm Period, and the slight cooling of the Little Ice Age from the 1700s and 1800s. But the
magnitude and rapidity of the warming represented by the last 200 years is simply unmatched in all of human history. More revealing,
the timing of this warming coincides with the Industrial Revolution, when humans first began massive deforestation and released carbon dioxide by burning coal,
gas, and oil.
Melting Polar Ice Caps. The polar icecaps are thinning and breaking up at an alarming rate. In 2000, my former graduate advisor Malcolm McKenna was one of the
first humans to fly over the North Pole in summer time and see no ice, just open water. The Arctic ice cap has been frozen solid for at least the past 3 million years
and maybe longer3, but now the entire ice sheet is breaking up so fast that by 2030 (and possibly sooner) less than half of the Arctic will be ice covered in the
summer.4 As one can see from watching the news, this is an ecological disaster for everything that lives up there, from the polar bears to the seals and walruses to
the animals they feed upon, to the 4 million people whose world is melting beneath their feet. The Antarctic is thawing even faster. In February–March 2002, the
Larsen B ice shelf—over 3000 square km (the size of Rhode Island) and 220 m (700 feet) thick—broke up in just a few months, a story typical of nearly all the ice
shelves in Antarctica. The
Larsen B shelf had survived all the previous ice ages and interglacial warming episodes for the past 3
million years, and even the warmest periods of the last 10,000 years—yet it and nearly all the other thick ice sheets on the Arctic, Greenland, and
Antarctic are vanishing at a rate never before seen in geologic history.
Melting Glaciers. Glaciers are all retreating at the highest rates ever documented. Many of those glaciers, especially in the Himalayas, Andes, Alps, and Sierras,
provide most of the freshwater that the populations below the mountains depend upon—yet this fresh water supply is vanishing. Just think about the percentage of
world’s population in southern Asia (especially India) that depend on Himalayan snowmelt for their fresh water. The implications are staggering. The permafrost
that once remained solidly frozen even in the summer has now thawed, damaging the Inuit villages on the Arctic coast and threatening all our pipelines to the North
Slope of Alaska. This is catastrophic not only for life on the permafrost, but as it thaws, the permafrost releases huge amounts of greenhouse gases and is one of the
major contributors to global warming. Not only is the ice vanishing, but we have seen record heat waves over and over again, killing thousands of people, as each
year joins the list of the hottest years on record. (2010 just topped that list as the hottest year, surpassing the previous record in 2009, and we shall know about
2011 soon enough). Natural animal and plant populations are being devastated all over the globe as their environment changes.5 Many animals respond by moving
their ranges to formerly cold climates, so now places that once did not have to worry about disease-bearing mosquitoes are infested as the climate warms and
allows them to breed further north.
Sea Level Rise. All that melted ice eventually ends up in the ocean, causing sea level to rise, as it has many times in the geologic past. At present, sea
level is
rising about 3–4 mm per year, more than ten times the rate of 0.1–0.2 mm/year that has occurred over the past 3000 years. Geological data show that
sea level was virtually unchanged over the past 10,000 years since the present interglacial began. A few millimeters here or there doesn’t impress people, until you
consider that the rate is accelerating and that most scientists predict sea level will rise 80–130 cm in just the next century. A sea level rise of 1.3 m (almost 4 feet)
would drown many of the world’s low-elevation cities, such as Venice and New Orleans, and low-lying countries such as the Netherlands or Bangladesh. A number
of tiny island nations such as Vanuatu and the Maldives, which barely poke out above the ocean now, are already vanishing beneath the waves. Eventually their
entire population will have to move someplace else.6 Even a small sea level rise might not drown all these areas, but they are much more vulnerable to the large
waves of a storm surge (as happened with Hurricane Katrina), which could do much more damage than sea level rise alone. If sea level rose by 6 m (20 feet), most of
the world’s coastal plains and low-lying areas (such as the Louisiana bayous, Florida, and most of the world’s river deltas) would be drowned.
Most of the world’s population lives in coastal cities such as New York, Boston, Philadelphia, Baltimore, Washington, D.C., Miami, Shanghai, and London. All of those
cities would be partially or completely under water with such a sea level rise. If all the glacial ice caps melted completely (as they have several times before during
past greenhouse episodes in the geologic past), sea level would rise by 65 m (215 feet)! The entire Mississippi Valley would flood, so you could dock your boat in
Cairo, Illinois. Such a sea level rise would drown nearly every coastal region under hundreds of feet of water, and inundate New York City, London and Paris. All that
would remain would be the tall landmarks, such as the Empire State Building, Big Ben, and the Eiffel Tower. You could tie your boats to these pinnacles, but the rest
of these drowned cities would be deep under water.
Climate Deniers’ Arguments and Scientists’ Rebuttals
Despite the overwhelming evidence there are many people who remain skeptical. One reason is that they have been fed lies, distortions, and misstatements by the
global warming denialists who want to cloud or confuse the issue. Let’s examine some of these claims in detail:
“It’s just natural climatic variability.” No, it is not. As I detailed in my 2009 book, Greenhouse of the Dinosaurs, geologists and paleoclimatologists know a lot about
past greenhouse worlds, and the icehouse planet that has existed for the past 33 million years. We have a good understanding of how and why the Antarctic ice
sheet first appeared at that time, and how the Arctic froze over about 3.5 million years ago, beginning the 24 glacial and interglacial episodes of the “Ice Ages” that
have occurred since then. We know how variations in the earth’s orbit (the Milankovitch cycles) controls the amount of solar radiation the earth receives, triggering
the shifts between glacial and interglacial periods. Our current warm interglacial has already lasted 10,000 years, the duration of most previous interglacials, so if it
were not for global warming, we would be headed into the next glacial in the next 1000 years or so. Instead, our pumping greenhouse gases into our atmosphere
after they were long trapped in the earth’s crust has pushed the planet into a “super-interglacial,” already warmer than any previous warming period. We can see
the “big picture” of climate variability most clearly in the EPICA cores from Antarctica (see Figure 2 below), which show the details of the last 650,000 years of
glacial-interglacial cycles. At no time during any previous interglacial did the carbon dioxide levels exceed 300 ppm, even at their very warmest. Our atmospheric
carbon dioxide levels are already close to 400 ppm today. The atmosphere is headed to 600 ppm within a few decades, even if we stopped releasing greenhouse
gases immediately. This
is decidedly not within the normal range of “climatic variability,” but clearly unprecedented in human history.
Anyone who says this is “normal variability” has never seen the huge amount of paleoclimatic data that show otherwise. “It’s just another warming episode, like the
Mediaeval Warm Period, the Holocene Climatic Optimum, or the end of the Little Ice Age.” Untrue. There were numerous small fluctuations of
were numerous small fluctuations of
warming and cooling over the last 10,000 years of the Holocene. But in the case of the Mediaeval Warm Period (about 950–1250 A.D.), the temperatures
increased by only 1°C, much
less than we have seen in the current episode of global warming (see Figure 1). This episode was also only a local
warming in the North Atlantic and northern Europe. Global temperatures over this interval did not warm at all, and actually cooled by more than 1°C. Likewise, the
warmest period of the last 10,000 years was the Holocene Climatic Optimum (5000–9000 B.C.) when warmer and wetter conditions in Eurasia caused the rise of the
first great civilizations in Egypt, Mesopotamia, the Indus Valley, and China. This was largely a Northern Hemisphere-Eurasian phenomenon, with 2–3°C warming in
the Arctic and northern Europe. But there was almost no warming in the tropics, and cooling or no change in the Southern Hemisphere.7 To the Eurocentric world,
these warming events seemed important, but on a global scale the effect is negligible. In addition, neither of these warming episodes is related to increasing
greenhouse gases. The Holocene Climatic Optimum, in fact, is predicted by the Milankovitch cycles, since at that time the axial tilt of the earth was 24°, its steepest
value, meaning the Northern Hemisphere got more solar radiation than normal—but the Southern Hemisphere less, so the two balanced. By contrast, not only is
the warming observed in the last 200 years much greater than during these previous episodes, but it is also global and bipolar, so it is not a purely local effect. The
warming that ended the Little Ice Age (from the mid-1700s to the late 1800s) was due to increased solar radiation prior to 1940. Since 1940, however, the amount
of solar radiation has been dropping, so the only candidate for the post-1940 warming has to be carbon dioxide.8
“It’s
just the sun, or cosmic rays, or volcanic activity or methane.” Nope, sorry. The amount of heat that the sun
provides has been decreasing since 1940,9 just the opposite of the denialists’ claims. There is no evidence (see Figure 3 below) of
increase in cosmic radiation during the past century.10 Nor is there any clear evidence that large-scale volcanic events
(such as the 1815 eruption of Tambora in Indonesia, which changed global climate for about a year) have any long-term effect that would explain 200
years of warming and carbon dioxide increase. Volcanoes erupt only 0.3 billion tonnes of carbon dioxide each year, but humans emit over 29 billion tonnes a year11,
roughly 100 times as much. Clearly, we have a bigger effect. Methane is a more powerful greenhouse gas, but there is 200 times more carbon dioxide than
methane, so carbon dioxide is still the most important agent.12 Every other alternative has
been looked at, but the only clear-cut relationship is
between human-caused carbon dioxide increase and global warming. “The climate records since 1995 (or 1998) show cooling.” That’s a
deliberate deception. People who throw this argument out are cherry-picking the data.13 Over the short term, there was a slight
cooling trend from 1998–2000 (see Figure 4 below), because 1998 was a record-breaking El Niño year, so the next few years look cooler by comparison. But since
2002, the overall long-term trend of warming is unequivocal. This statement is a clear-cut case of using out-of-context data in an attempt to deny reality. All
of
the 16 hottest years ever recorded on a global scale have occurred in the last 20 years. They are (in order of hottest first):
2010, 2009, 1998, 2005, 2003, 2002, 2004, 2006, 2007, 2001, 1997, 2008, 1995, 1999, 1990, and 2000.14 In other words, every year since 2000 has been in the Top
Ten hottest years list, and the rest of the list includes 1995, 1997, 1998, 1999, and 2000. Only 1996 failed to make the list (because of the short-term cooling
mentioned already).
“We had record snows in the winters of 2009–2010, and in 2010–2011.” So what? This is nothing more than the difference between weather (short-term seasonal
changes) and climate (the long-term average of weather over decades and centuries and longer). Our local weather tells us nothing about another continent, or the
global average; it is only a local effect, determined by short-term atmospheric and oceanographic conditions.15 In fact, warmer global temperatures mean more
moisture in the atmosphere, which increases the intensity of normal winter snowstorms. In this particular case, the climate denialists forget that the early winter of
November–December 2009 was actually very mild and warm, and then only later in January and February did it get cold and snow heavily. That warm spell in early
winter helped bring more moisture into the system, so that when cold weather occurred, the snows were worse. In addition, the snows were unusually heavy only
in North America; the rest of the world had different weather, and the global climate was warmer than average. And the summer of 2010 was the hottest on record,
breaking the previous record set in 2009.
“Carbon dioxide is good for plants, so the world will be better off.” Who do they think they’re kidding? The people who promote this idea clearly don’t know much
global geochemistry, or are trying to cynically take advantage of the fact that most people are ignorant of science. The Competitive Enterprise Institute (funded by
oil and coal companies and conservative foundations16) has run a series of shockingly stupid ads concluding with the tag line “Carbon dioxide: they call it pollution,
we call it life.” Anyone who knows the basic science of earth’s atmosphere can spot the deceptions in this ad.17 Sure, plants take in carbon dioxide that animals
exhale, as they have for millions of years. But the whole point of the global warming evidence (as shown from ice cores) is that the delicate natural balance of
carbon dioxide has been thrown out of whack by our production of too much of it, way in excess of what plants or the oceans can handle. As a consequence, the
oceans are warming18 and absorbing excess carbon dioxide making them more acidic. Already we are seeing a shocking decline in coral reefs (“bleaching”) and
extinctions in many marine ecosystems that can’t handle too much of a good thing. Meanwhile, humans are busy cutting down huge areas of temperate and
tropical forests, which not only means there are fewer plants to absorb the gas, but the slash and burn practices are releasing more carbon dioxide than plants can
keep up with. There is much debate as to whether increased carbon dioxide might help agriculture in some parts of the world, but that has to be measured against
the fact that other traditional “breadbasket” regions (such as the American Great Plains) are expected to get too hot to be as productive as they are today. The
latest research19 actually shows that increased carbon dioxide inhibits the absorption of nitrogen into plants, so plants (at least those that we depend upon today)
are not going to flourish in a greenhouse world. Anyone who tells you otherwise is ignorant of basic atmospheric science.
“I agree that climate is changing, but I’m skeptical that humans are the main cause, so we shouldn’t do anything.” This is just fence sitting. A lot of reasonable
skeptics deplore the “climate denialism” of the right wing, but still want to be skeptical about the cause. If they want proof, they can examine the huge array of data
that directly points to humans causing global warming.20 We can directly measure the amount of carbon dioxide humans are producing, and it tracks exactly with
the amount of increase in atmospheric carbon dioxide. Through carbon isotope analysis, we can show that this carbon dioxide in the atmosphere is coming directly
from our burning of fossil fuels, not from natural sources. We can also measure oxygen levels that drop as we produce more carbon that then combines with oxygen
to produce carbon dioxide. We have satellites in space that are
measuring the heat released from the planet and can actually see
the atmosphere get warmer. The most crucial proof emerged only in the past few years: climate models of the greenhouse effect predict
that there should be cooling in the stratosphere (the upper layer of the atmosphere above 10 km (6 miles) in elevation), but
warming in the troposphere (the bottom layer of the atmosphere below 10 km (6 miles)), and that’s exactly what our space probes
have measured. Finally, we can rule out any other culprits (see above): solar heat has been decreasing since 1940, not increasing, and there are no measurable
increases in cosmic radiation, methane, volcanic gases, or any other potential cause. Face it—it’s our problem.
Why Do People Deny Climate Change? Thanks to all the noise and confusion over the debate, the general public has only a vague idea of what the debate is really
about, and only about half of Americans think global warming is real or that we are to blame.21 As in the debate over evolution and creationism, the scientific
community is virtually unanimous on what the data demonstrate about anthropogenic global warming. This has been true for over a decade. When science
historian Naomi Oreskes
surveyed all peer-reviewed papers on climate change published between 1993 and 2003 in the world’s leading
scientific journal, Science, she found that there were 980 supporting the idea of human-induced global warming and none
opposing it. In 2009, Doran and Kendall Zimmerman23 surveyed all the climate scientists who were familiar with the data. They found that 95–99%
agreed that global warming is real and that humans are the reason. In 2010, the prestigious Proceedings of the National Academy of
Sciences published a study that showed that 98% of the scientists who actually do research in climate change are in agreement with anthropogenic global
warming.24 Every major scientific organization in the world has endorsed the conclusion of anthropogenic climate change as well. This
is a rare degree
of agreement within such an independent and cantankerous group as the world’s top scientists. This is the same degree of scientific consensus
that scientists have achieved over most major ideas, including gravity, evolution, and relativity. These and only a few other topics in science can
claim this degree of agreement among nearly all the world’s leading scientists, especially among everyone who is close to the scientific data and knows the problem
intimately. If it were not such a controversial topic politically, there would be almost no interest in debating it, since the evidence is so clear-cut. If the climate
science community speaks with one voice (as in the 2007 IPCC report, and every report since then), why is there still any debate at all? The answer has been
revealed by a number of investigations by diligent reporters who got past the PR machinery denying global warming, and uncovered the money trail. Originally,
there were no real “dissenters” to the idea of global warming among scientists who are actually involved with climate research. Instead, the forces with vested interests
in denying global climate change (the energy
companies, and the “free-market” advocates) followed the strategy of tobacco companies:
create a smokescreen of confusion and prevent the American public from recognizing scientific consensus. As the famous memo25 from the
tobacco lobbyists said, “Doubt is our product.” The denialists generated an anti-science movement entirely out of thin air and PR. The evidence for this PR conspiracy
has been well documented in numerous sources. For example, Oreskes and Conway revealed from memos leaked to the press that in April 1998 the right-wing
Marshall Institute, SEPP (Fred Seitz’s lobby that aids tobacco companies and polluters), and ExxonMobil, met in secret at the American Petroleum Institute’s
headquarters in Washington, D.C. There they planned a $20 million campaign to get “respected scientists” to cast doubt on climate change, get major PR efforts
going, and lobby Congress that global warming isn’t real and is not a threat.
The right-wing
institutes and the energy lobby beat the bushes to find scientists—any scientists—who might
disagree with the scientific consensus. As investigative journalists and scientists have documented over and over again,26 the denialist conspiracy essentially
paid for the testimony of anyone who could be useful to them. The day that the 2007 IPCC report was released (Feb. 2, 2007), the British newspaper The Guardian
reported that the conservative American Enterprise Institute (funded largely by oil companies and conservative think tanks) had offered $10,000 plus travel
expenses to scientists who would write negatively about the IPCC report.27
We are accustomed to the hired-gun “experts” paid by lawyers to muddy up the evidence in the case they are fighting, but this is extraordinary—buying scientists
outright to act as shills for organizations trying to deny scientific reality. With
this kind of money, however, you can always find a fringe
scientist or crank or someone with no relevant credentials who will do what they’re paid to do. The NCSE satirized this tactic of composing phony “lists of
scientists” with their “Project Steve.”28 They showed that there were more scientists named “Steve” than their entire list of “scientists who dispute evolution.” It
may generate lots of PR and a smokescreen to confuse the public, but it doesn’t change the fact that scientists who actually do research in climate change are
unanimous in their insistence that anthropogenic global warming is a real threat. Most scientists I know and respect work very hard for little pay, yet they still
cannot be paid to endorse some scientific idea they know to be false.
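The volcano-versus-human emissions comparison in the Prothero evidence above is simple arithmetic and can be sanity-checked directly. Both annual figures (0.3 and 29 billion tonnes of CO2) are taken from the card itself; the script is only an illustrative check, not an independent data source:

```python
# Sanity check of the CO2 emission figures cited in the evidence above.
volcanic_co2 = 0.3   # billion tonnes CO2 per year from volcanoes (figure from the card)
human_co2 = 29.0     # billion tonnes CO2 per year from human activity (figure from the card)

ratio = human_co2 / volcanic_co2
print(f"Human CO2 emissions are about {ratio:.0f} times volcanic emissions")
```

The exact quotient is about 97, which the card rounds to "roughly 100 times as much."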
Bioengineered pathogen release causes extinction
Myhrvold 13 – Nathan Myhrvold founded Intellectual Ventures after retiring as chief strategist and
chief technology officer of Microsoft Corporation. During his 14 years at Microsoft, Nathan founded
Microsoft Research and numerous technology groups. He has always been an avid inventor. To date, he
has been awarded hundreds of patents and has hundreds of patents pending. Before joining Microsoft,
Nathan was a postdoctoral fellow in the department of applied mathematics and theoretical physics at
Cambridge University, and he worked with Professor Stephen Hawking. He earned a doctorate in
theoretical and mathematical physics and a master's degree in mathematical economics from Princeton
University, and he also has a master's degree in geophysics and space physics and a bachelor's degree in
mathematics from UCLA. (“Strategic Terrorism: A Call to Action”, July 2013,
http://papers.ssrn.com/sol3/papers.cfm?abstract_id=2290382)
Even more so than with nuclear weapons, the
cost and technical difficulty of producing biological arms have dropped
precipitously in recent decades with the boom in industrial molecular biology. A small team of people with the necessary technical training
and some cheap equipment can create weapons far more terrible than any nuclear bomb. Indeed, even a single individual might do so. Taken
together, these
trends utterly undermine the lethality-versus-cost curve that existed throughout all of human history.
Access to extremely lethal agents—even to those that may exterminate the human race—will be available to nearly anybody. Access to mass
death has been democratized; it has spread from a small elite of superpower leaders to nearly anybody with modest resources. Even the leader
of a ragtag, stateless group hiding in a cave—or in a Pakistani suburb—can potentially have “the button.” Turning Life Against the Living The
first and simplest kinds of biological weapons are those that are not contagious and thus do not lead to epidemics. These have been developed
for use in military conflicts for most of the 20th century. Because the pathogens used are not contagious, they are considered controllable: that
is, they have at least some of the command-and-control aspects of a conventional weapon. Typically, these pathogens have been
“weaponized,” meaning bred or refined for deployment by using artillery shells, aerial bombs, or missiles much like conventional explosive
warheads. They can be highly deadly. Anthrax is the most famous example. In several early- 20th-century outbreaks, it killed nearly 90% of
those infected by inhaling bacterial spores into their lungs. Anthrax was used in the series of mail attacks in the United States in the fall of 2001.
Even with advanced antibiotic treatment, 40% of those who contracted inhalational anthrax died during the 2001 attacks.1 That crime is
believed to have been the work of a lone bioweapons scientist who sought to publicize the threat of a biological attack and boost funding for
his work on anthrax vaccines. This conclusion is consistent with the fact that virtually no effort was made to disperse the bacterium— indeed,
the letters carrying the spores thoughtfully included text warning of anthrax exposure and recommending that the recipient seek immediate
treatment. Despite this intentional effort to limit rather than spread the infection, a surprising amount of trouble was caused when the fine
anthrax powder leaked from envelopes and contaminated other mail. Before this episode, nobody would have guessed that letters mailed in
New Jersey to addresses in Manhattan and Washington, D.C., could kill someone in Connecticut, but they did. And no one would have predicted
that a domestic bioterrorist launching multiple attacks, including one against the U.S. Congress, would elude the FBI for years. But that is what
happened. What if such an attack were made not by some vigilante trying to alert the world to the dangers of bioweapons but instead by a real
sociopath? Theodore J. Kaczynski, better known as the “Unabomber,” may have been such a person. He was brilliant enough to earn a Ph.D. in
mathematics from the University of Michigan yet was mentally disturbed enough to be a one-man terrorist cell: His mail bombs claimed victims
over nearly two decades. Kaczynski certainly had enough brains to use sophisticated methods, but because he opposed advanced technology,
he made untraceable low-tech bombs that killed only three people. A future Kaczynski with training in microbiology and genetics, and an
eagerness to use the destructive power of that science, could be a threat to the entire human race. Indeed, the world has already experienced
some true acts of biological terror. Aum Shinrikyo produced botulinum toxin and anthrax and reportedly released them in Tokyo on four
separate occasions. A variety of technical and organizational difficulties frustrated these attacks, which did not cause any casualties and went
unrecognized at the time for what they were, until the later Sarin attack clued in the authorities.2 Had the group been a bit more competent,
things could have turned out far worse. One 2003 study found that an airborne release of one kilogram of an anthrax-spore-containing aerosol
in a city the size of New York would result in 1.5 million infections and 123,000 to 660,000 fatalities, depending on the effectiveness of the
public health response.3 A 1993 U.S. government analysis determined that 100 kilograms of weaponized anthrax, if sprayed from an airplane
upwind of Washington, D.C., would kill between 130,000 and three million people.4 Because anthrax spores remain viable in the environment
for more than 30 years,1 portions of a city blanketed by an anthrax cloud might have to be abandoned for years while extensive cleaning was
done. Producing enough anthrax to kill 100,000 Americans is far easier to do—and far harder to detect—than is constructing a nuclear bomb of
comparable lethality. Anthrax, moreover, is rather benign as biological weapons go. The pathogen is reasonably well understood, having been
studied in one form or another in biowarfare circles for more than 50 years. Natural strains of the bacterium are partially treatable with long
courses of common antibiotics such as ciprofloxacin if the medication is taken sufficiently quickly, and vaccination soon after exposure seems to
reduce mortality further.5 But bioengineered anthrax that is resistant to both antibiotics and vaccines is known to have been produced in both
Soviet and American bioweapons laboratories. In 1997, a group of Russian scientists even openly published the recipe for one of these
superlethal strains in a scientific journal.6 In addition, numerous other agents are similar to anthrax in that they are highly lethal but not
contagious. The lack of contagion means that an attacker must administer the pathogen to the people he wishes to infect. In a military context,
this quality is generally seen as a good thing because the resulting disease can be contained in a specific area. Thus, the weapon can be directed
at a well-defined target, and with luck, little collateral damage will result. Unfortunately, many
biological agents are
communicable and so can spread beyond the people initially infected to affect the entire population. Infectious pathogens are
inherently hard to control because there is usually no reliable way to stop an epidemic once it starts. This property makes such biological
agents difficult to use as conventional weapons. A nation that starts an epidemic may see it spread to the wrong country—or even to its own
people. Indeed, one cannot target a small, well-defined population with a contagious pathogen; by its nature, such a pathogen may infect the
entire human race. Despite this rather severe drawback, both the Soviet Union and the United States, as well as Imperial Japan, investigated
and produced contagious bioweapons. The logic was that their use in a military conflict would be limited to last-ditch, “scorched earth”
campaigns, perhaps with a vaccine available only to one side. Smallpox is the most famous example. It is highly contagious and spreads through
casual contact. Smallpox was eradicated in the wild in 1977, but it still exists in both U.S. and Russian laboratories, according to official
statements.7 Unofficial holdings are harder to track, but a number of countries, including North Korea, are believed to possess covert smallpox
cultures. Biological weapons were strictly regulated by international treaty in 1972. The United States and the Soviet Union agreed not to
develop such weapons and to destroy existing stocks. The United States stopped its bioweapons work, but the Russians cheated and kept a
huge program going into the 1990s, thereby producing thousands of tons of weaponized anthrax, smallpox, and far more exotic biological
weapons based on genetically engineered viruses. No one can be certain how far either the germs or the knowledge has spread since the
collapse of the Soviet Union. Experts estimate that a large-scale, coordinated smallpox attack on the United States might kill 55,000 to 110,000
people, assuming that sufficient vaccine is available to contain the epidemic and that the vaccine works.8, 9 The death toll may be far higher if
the smallpox strain has been engineered to be vaccine-resistant or to have enhanced virulence. Moreover, a smallpox attack on the United
States could easily broaden into a global pandemic, despite the U.S. stockpile of at least 300 million doses of vaccine. All it would take is for one
infected person to leave the country and travel elsewhere. If New York City were attacked with smallpox, infections would most likely appear
on every continent, except perhaps Antarctica, within two weeks. Once these beachheads were established, the epidemic would spread almost
without check because the vaccine in world stockpiles and the infrastructure to distribute it would be insufficient. That is particularly true in the
developing world, which is ill equipped to handle its current disease burden, to say nothing of a return of smallpox. Even if “only” 50,000
people were killed in the United States, a million or more would probably die worldwide before the disease could be contained, and
containment would probably require many years of effort. As horrible as this would be, such a pandemic is by no means the worst attack one
can imagine, for several reasons. First, most of the classic bioweapons are based on 1960s and 1970s technology because the 1972 treaty
halted bioweapons development efforts in the United States and most other Western countries. Second, the Russians, although solidly
committed to biological weapons long after the treaty deadline, were never on the cutting edge of biological research. Third and most
important, the science and technology of molecular biology have made enormous advances, utterly transforming
the field in the last few decades. High school biology students routinely perform molecular-biology manipulations that would have been
impossible even for the best superpower-funded program back in the heyday of biological-weapons research. The biowarfare methods of the
1960s and 1970s are now as antiquated as the lumbering mainframe computers of that era. Tomorrow’s terrorists will have vastly more deadly
bugs to choose from. Consider this sobering development: in
2001, Australian researchers working on mousepox, a
nonlethal virus that infects mice (as chickenpox does in humans), accidentally discovered that a simple genetic modification
transformed the virus.10, 11 Instead of producing mild symptoms, the new virus killed 60% of even those mice already immune to the
naturally occurring strains of mousepox. The new virus, moreover, was unaffected by any existing vaccine or antiviral drug. A team of
researchers at Saint Louis University led by Mark Buller picked up on that work and, by late 2003, found a way to improve on it: Buller’s
variation on mousepox was 100% lethal, although his team of investigators also devised combination vaccine and antiviral therapies that were
partially effective in protecting animals from the engineered strain.12, 13 Another saving grace is that the genetically altered virus is no longer
contagious. Of course, it is quite possible that future tinkering with the virus will change that property, too. Strong reasons exist to believe that
the genetic modifications Buller made to mousepox would work for other poxviruses and possibly for other classes of viruses as well. Might the
same techniques allow chickenpox or another poxvirus that infects humans to be turned into a 100% lethal bioweapon, perhaps one that is
resistant to any known antiviral therapy? I’ve asked this question of experts many times, and no one has yet replied that such a manipulation
couldn’t be done. This case is just one example. Many
more are pouring out of scientific journals and conferences every
year. Just last year, the journal Nature published a controversial study done at the University of Wisconsin–Madison in which
virologists enumerated the changes one would need to make to a highly lethal strain of bird flu to make it easily
transmitted from one mammal to another.14 Biotechnology is advancing so rapidly that it is hard to keep track of all the new potential
threats. Nor is it clear that anyone is even trying. In addition to lethality and drug resistance, many other parameters can be played with, given
that the infectious power of an epidemic depends on many properties, including the length of the latency period during which a person is
contagious but asymptomatic. Delaying the onset of serious symptoms allows each new case to spread to more people and thus makes the
virus harder to stop. This dynamic is perhaps best illustrated by HIV, which is very difficult to transmit compared with smallpox and many other
viruses. Intimate contact is needed, and even then, the infection rate is low. The balancing factor is that HIV can take years to progress to AIDS,
which can then take many more years to kill the victim. What makes HIV so dangerous is that infected people have lots of opportunities to
infect others. This property has allowed HIV to claim more than 30 million lives so far, and approximately 34 million people are now living with
this virus and facing a highly uncertain future.15 A virus genetically engineered to infect its host quickly, to generate symptoms slowly—say,
only after weeks or months—and to spread easily through the air or by casual contact would be vastly more devastating than HIV. It could
silently penetrate the population to unleash its deadly effects suddenly. This type of epidemic would be almost impossible to combat because
most of the infections would occur before the epidemic became obvious. A technologically sophisticated terrorist group could develop such a
virus and kill a large part of humanity with it. Indeed, terrorists may not have to develop it themselves: some scientist may do so first and
publish the details. Given the rate at which biologists are making discoveries about viruses and the immune system, at some point in the near
future, someone
may create artificial pathogens that could drive the human race to extinction. Indeed, a
detailed species-elimination plan of this nature was openly proposed in a scientific journal. The ostensible purpose of that particular research
was to suggest a way to extirpate the malaria mosquito, but similar techniques could be directed toward humans.16 When I’ve talked to
molecular biologists about this method, they are quick to point out that it is slow and easily detectable and could be fought with biotech
remedies. If you challenge them to come up with improvements to the suggested attack plan, however, they have plenty of ideas. Modern
biotechnology will soon be capable, if it is not already, of bringing about the demise of the human race— or at least of killing a sufficient
number of people to end high-tech civilization and set humanity back 1,000 years or more. That terrorist groups could achieve this level of
technological sophistication may seem far-fetched, but keep in mind that it takes only a handful of individuals to accomplish these
tasks. Never has lethal power of this potency been accessible to so few, so easily. Even more dramatically than nuclear proliferation, modern
biological science has frighteningly undermined the correlation between the lethality of a weapon and its cost, a fundamentally stabilizing
mechanism throughout history. Access to extremely lethal agents—lethal enough to exterminate Homo sapiens—will be available to anybody
with a solid background in biology, terrorists included. The 9/11 attacks involved at least four pilots, each of whom had sufficient education to
enroll in flight schools and complete several years of training. Bin Laden had a degree in civil engineering. Mohammed Atta attended a German
university, where he earned a master’s degree in urban planning—not a field he likely chose for its relevance to terrorism. A
future set of
terrorists could just as easily be students of molecular biology who enter their studies innocently enough but later put their
skills to homicidal use. Hundreds of universities in Europe and Asia have curricula sufficient to train people in the skills necessary to make a
sophisticated biological weapon, and hundreds more in the United States accept students from all over the world. Thus it seems likely that
sometime in the near future a small band of terrorists, or even a single misanthropic individual, will overcome our best defenses and do
something truly terrible, such as fashion a bioweapon that could kill millions or even billions of people. Indeed,
the creation of such
weapons within the next 20 years seems to be a virtual certainty. The repercussions of their use are hard to
estimate. One approach is to look at how the scale of destruction they may cause compares with that of other calamities that the human race
has faced.
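The latency dynamic the Myhrvold evidence describes (each day a pathogen spreads while carriers are still asymptomatic multiplies the outbreak before anyone knows it exists) can be sketched with a toy exponential-growth model. The parameters below, a single index case, 0.5 secondary infections per infectious person per day, and 3- versus 30-day symptom onset, are hypothetical illustrations, not epidemiological estimates:

```python
# Toy model of undetected spread: everyone infected so far remains
# asymptomatic and contagious until symptoms appear at `latent_days`,
# the point at which the outbreak would first be noticed.

def infections_before_detection(latent_days, daily_transmission=0.5):
    """Cumulative infections by the day the index case shows symptoms."""
    infectious = 1.0   # a single index case
    total = 1.0
    for _ in range(latent_days):
        new = infectious * daily_transmission
        total += new
        infectious += new   # new cases are also asymptomatic and contagious
    return total

short = infections_before_detection(3)    # flu-like: symptoms in ~3 days
long = infections_before_detection(30)   # engineered delay: symptoms in ~30 days
print(f"3-day latency:  ~{short:.0f} infections before first symptoms")
print(f"30-day latency: ~{long:.0f} infections before first symptoms")
```

In this toy model the total grows as 1.5^n, so a 3-day latency yields only a handful of infections before detection while a 30-day latency yields nearly 200,000, which is the card's point about delayed symptom onset in compressed form.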
Genetic research key to biotech agriculture- solves food instability
Zilberman 14 (David Zilberman is a professor and holds the Robinson Chair in the Department of
Agricultural and Resource Economics at UC Berkeley. He is also a member of the Giannini Foundation of
Agricultural Economics. The research leading to this paper was supported by the Energy Biosciences
Institute and Cotton, Inc. The author thanks Scott Kaplan, Eunice Kim, and Angela Erickson for their
assistance. The Economics of Sustainable Development
http://ajae.oxfordjournals.org/content/96/2/385.short)
The major applications of the new bioeconomy considered here include genetic modification, biofuels, and green chemistry. Genetic
modification has had a large range of applications in medicine and is a foundation of the fast-growing medical biotechnology industry (Lebkowski et al. 2001). Agricultural biotechnology has also grown rapidly. However, the
use of genetically modified crops (GMOs) is a subject of restrictive regulation, and their utilization has been limited to four major crops (corn,
soybeans, cotton, and rapeseed). Furthermore, the
United States, Brazil, and Argentina are the major users of GM
technology in these four crops, and China and India have adopted GM cotton. In spite of its limited use,
GM technology already provides major benefits by increasing the estimated supply of corn and
soybeans by 13% and 20%, respectively, and reducing their estimated prices by 20% and 30%, respectively
(Barrows, Sexton, and Zilberman 2013). The adoption of GM varieties in Europe and Africa, and the expansion of its
use to major food crops like wheat and rice, is likely to significantly reduce the food price inflation
seen in recent years (Sexton and Zilberman 2011). Some of the key elements of the new bioeconomy are listed below, and include
genetic modification, biofuels, and developments in green chemistry. Genetic modification: Genetic modification of crops is a major contributor to sustainable development. Existing GM varieties significantly reduce crop damage (Qaim and Zilberman 2003), greenhouse gas emissions, and the footprint of agriculture (Barrows, Sexton, and Zilberman 2013). Today GMOs are in their infancy, but they provide new and more precise means to improve crops and adapt to changing conditions. New innovations instituted at various stages of development are likely to increase
the input use efficiency of water and fertilizers in crop production and of grains as sources of animal feed. The development and adoption of
these innovations has stalled because of regulations (Bennett et al. 2013). Nonetheless, GMOs
improve the speed of
development or modification of crop varieties and thus can provide a means of adapting to climate
change (Zilberman, Zhao, and Heiman 2012). Biofuels: For millennia, wood, dung, and oils supplied energy for cooking, heating, and other
functions. Here we refer to the agricultural (broadly defined) production of feedstocks and their industrial processing for modern applications.
Examples include the production of ethanol, biodiesel, and wood chips to replace fossil fuels. The production of biofuels for transport fuel was
motivated by the high price of oil and other fuels, balance of trade considerations, and concerns about climate change (Rajagopal and
Zilberman 2007). However, direct and indirect effects on food prices (Zilberman et al. 2013) and on the environment (greenhouse gas emissions and deforestation; Khanna and Crago 2012) raised questions about biofuels. Yet liquid fuels have relative advantages in major applications and
are most likely to be produced sustainably through biofuels. Learning by doing in sugarcane and corn biofuels production has improved their
environmental and economic performance (Khanna and Crago 2012). Research on second and third generation biofuels is promising, and
several will be produced on nonagricultural lands in the foreseeable future (Youngs and Somerville 2012). The evolution of biofuels is
dependent on policy, and the emergence of clean and efficient biofuels is more likely to be followed by continued investment in research and
appropriate pricing of carbon (Chen and Khanna 2013).
The future of biofuels is also affected by the future of GMOs. Policy changes that will enable the introduction and large-scale adoption of GMO rice and
wheat varieties, which will increase rice and wheat yields by more than 10%, and the adoption of GM
traits in Africa and Europe may reduce food commodity prices and free up lands that will allow the
adoption of sugarcane for biofuel in India and other developing countries. Greater acceptance of
transgenic technology is likely to increase its utilization in biofuel feedstock production and improve the
productivity of sugarcane, grasses, and trees considered for the production of
second-generation biofuels.
The design of
biofuel policy and the interaction of biofuels and biotechnology policies are subjects for future research. Green chemistry (broadly defined):
Green chemistry represents a transition from petroleum-based chemicals to biomass-based chemicals (Clark, Luque, and Matharu 2012). Green
chemistry emphasizes a reduction in the toxicity of outputs, recycling, energy efficiency, and production of decomposable products with
minimal waste. Its principles of operation are consistent with some of the concepts associated with sustainable development elucidated above.
The reliance on biomass suggests that the transition to green chemistry will lead to a more spatially distributed network of bio-refineries
instead of the highly centralized refinery systems in place today, suggesting that the transition to green chemistry will be an engine for regional
development. Increased reliance on plant and animal feedstocks will enhance investment in bio-prospecting in order to discover new
feedstocks and valuable chemicals.
Research to develop advanced biotechnology methods and products will be crucial to the development of the bioeconomy. For example, one of the impediments to using many crops
as feedstocks is their high lignin content, and the development of varieties with lower lignin content will reduce the cost and increase the range
of products that can serve as feedstock for fuel and other applications.
Extinction inevitable- try or die for sustainable GM food production
Trewavas 2000 [Anthony, Institute of Cell and Molecular Biology – University of Edinburgh, “GM Is
the Best Option We Have”, AgBioWorld, 6-5, http://www.agbioworld.org/biotech-info/articles/biotech-art/best_option.html]
But these are foreign examples; global warming is the problem that requires the UK to develop GM technology. 1998 was the warmest year in the last one thousand
years. Many think global warming will simply lead to a wetter climate and be benign. I do not. Excess rainfall in northern seas has been predicted to halt the Gulf
Stream. In this situation, average UK temperatures would fall by 5 degrees centigrade and give us Moscow-like winters. There are already worrying signs of salinity
changes in the deep oceans. Agriculture would be seriously damaged and necessitate the rapid development of new crop varieties to secure our food supply. We
would not have much warning. Recent detailed analyses of Arctic ice cores have shown that the climate can switch between stable states in fractions of a decade.
Even if the climate is only wetter and warmer new crop pests and rampant disease will be the consequence. GM
technology can enable new
crops to be constructed in months and to be in the fields within a few years. This is the unique benefit GM offers. The UK populace needs to be much more positive about GM or we may pay a very heavy price. In 535A.D. a volcano near the present Krakatoa exploded with the force of
200 million Hiroshima A bombs. The dense cloud of dust so reduced the intensity of the sun that for at least
two years thereafter, summer turned to winter and crops here and elsewhere in the Northern hemisphere failed completely. The population
survived by hunting a rapidly vanishing population of edible animals. The after-effects continued for a decade and human history was changed irreversibly. But the
planet recovered. Such examples of benign nature's wisdom, in full flood as it were, dwarf and make minuscule the tiny modifications we make upon our
environment. There are apparently 100
such volcanoes round the world that could at any time unleash forces as great. And
even smaller volcanic explosions change our climate and can easily threaten the security of our food supply. Our hold on this
planet is tenuous. In the present day an equivalent 535A.D. explosion would destroy much of our civilisation. Only those with
agricultural technology sufficiently advanced would have a chance at survival. Colliding asteroids are another problem that requires us to be forward-looking, accepting that technological advance may be the only buffer between us and annihilation.
1AC 2
Legalization solves commercialization of stem-cell therapy- key to disease research
Medina 8 (Joanne, J.D. candidate, University of the Pacific, McGeorge School of Law, to be
conferred 2008. “ Is Stem Cell Research a One-Way Ticket to the Island of Dr. Moreau?
Singapore and the United States' Differing Paths” 21 Pac. McGeorge Global Bus. & Dev. L.J.
125 2008 pg. lexis)
Preliminary findings suggest that stem cell research is the most promising avenue for finding cures and treatments for many debilitating diseases that afflict millions of people worldwide. The U.S. government must reevaluate its current policies that restrict human embryonic stem cell research. The United States must expand stem cell research support and funding or face being technologically and economically disadvantaged. If the United States does not lift restrictions on stem
cell research imposed by President Bush, American scientists will not be able to fully explore
the potential of stem cell research. The current U.S. policy puts the health of millions of
Americans at risk. The United States cannot afford to lose its top scientists and researchers to
other countries, like Singapore, which encourages and funds their research. The United States
cannot risk not having access to new treatments and therapies that will inevitably be discovered
through stem cell research. The United States has a long and proud record of being
a leader in science and medicine. The United States also has a long and proud
record of establishing a high standard of ethics in science and medicine. The United
States must take a lead role in the international arena to ensure that the research is conducted
ethically and for the greatest public good. This can only be effectuated by enacting
domestic legislation that can be harmonized with the rest of the world and with
international guidelines. The U.S. government should not lose sight of the ultimate goal of stem
cell research-to improve the quality of human lives. Although the debates continue about "when
does life begin?" it is important to realize that it is not only the life of an embryo that must be
considered; consideration must also be given to those already living. The benefits of stem cell
research substantially outweigh the costs of the research and it is thus important that we create
guiding principles and requirements so that stem cell research develops in the most effective
and beneficial way possible.
Key to pandemic modeling, stops the diseases most likely to kill humanity
McGough 1 (McGough, Robert E. J.D. 2001, The Catholic University of America, Columbus School of
Law. Associate, Morgan, Lewis & Bockius LLP."Case for Federal Funding of Human Embryonic Stem Cell
Research: The Interplay of Moral Absolutism and Scientific Research, A." J. Contemp. Health L. & Pol'y 18
(2001): 147.)
Laboratory success like that achieved in the studies described above apparently represents only the tip of the
proverbial iceberg of the potential applications of ES and EG cell therapies. The ability of ES and EG cells to perpetuate
themselves indefinitely in culture, as well as their potential to develop into virtually any tissue type in the human body, suggests staggering
possibilities. In a paper released in November 1999, the American Association for the Advancement of Science details several examples of
disorders that are potentially treatable using ES and EG stem cell therapy. These disorders include Type 1 diabetes in children, nervous system diseases, immunodeficiency diseases, including immune deficiencies suffered as a result of Acquired Immune Deficiency Syndrome (AIDS), diseases of bone and cartilage, and cancer.96 The paper also describes the
research potential of embryonic stem cell biology, including a greater understanding of human
developmental biology, as well as a better understanding of pathogenic viruses, transplantation and gene
therapy.97
Disease leads to extinction- no burnout
Keating, 9 -- Foreign Policy web editor
(Joshua, "The End of the World," Foreign Policy, 11-13-9,
www.foreignpolicy.com/articles/2009/11/13/the_end_of_the_world?page=full, accessed 9-7-12,
mss)
How it could happen: Throughout history, plagues have brought civilizations to their knees. The Black Death killed off more than half of Europe's population in the Middle
world's population, a far greater impact than the just-concluded World War I. Because of
globalization, diseases today spread even faster - witness the rapid worldwide spread of H1N1
currently unfolding. A global outbreak of a disease such as ebola virus -- which has had a 90
percent fatality rate during its flare-ups in rural Africa -- or a mutated drug-resistant form of the
flu virus on a global scale could have a devastating, even civilization-ending impact. How
likely is it? Treatment of deadly diseases has improved since 1918, but so have the diseases.
Modern industrial farming techniques have been blamed for the outbreak of diseases, such as
swine flu, and as the world’s population grows and humans move into previously unoccupied
areas, the risk of exposure to previously unknown pathogens increases. More than 40 new
viruses have emerged since the 1970s, including ebola and HIV. Biological weapons
experimentation has added a new and just as troubling complication.
1AC 3
Contention three: no war
No scenario for great power war – laundry list
Deudney and Ikenberry ‘9 (Professor of Political Science at Johns Hopkins AND Albert G. Milbank
Professor of Politics and International Affairs at Princeton University (Jan/Feb, 2009, Daniel Deudney
and John Ikenberry, “The Myth of the Autocratic Revival: Why Liberal Democracy Will Prevail,” Foreign
Affairs)
This bleak outlook is based on an exaggeration of recent developments and ignores powerful countervailing factors and forces. Indeed, contrary to what the
revivalists describe, the
most striking features of the contemporary international landscape are the
intensification of economic globalization, thickening institutions, and shared problems of
interdependence. The overall structure of the international system today is quite unlike that of the nineteenth
century. Compared to older orders, the contemporary liberal-centered international order provides
a set of constraints and opportunities-of pushes and pulls-that reduce the likelihood of severe
conflict while creating strong imperatives for cooperative problem solving. Those invoking the nineteenth
century as a model for the twenty-first also fail to acknowledge the extent to which war as a path to conflict resolution and great-power expansion has become largely obsolete. Most important, nuclear weapons have transformed
great-power war from a routine feature of international politics into an exercise in national suicide. With all of the
great powers possessing nuclear weapons and ample means to rapidly expand their deterrent forces, warfare among
these states has truly become an option of last resort. The prospect of such great losses has
instilled in the great powers a level of caution and restraint that effectively precludes major
revisionist efforts. Furthermore, the diffusion of small arms and the near universality of nationalism
have severely limited the ability of great powers to conquer and occupy territory inhabited by resisting
populations (as Algeria, Vietnam, Afghanistan, and now Iraq have demonstrated). Unlike during the days of empire building in the nineteenth century,
states today cannot translate great asymmetries of power into effective territorial control; at most, they
can hope for loose hegemonic relationships that require them to give something in return. Also unlike in the nineteenth century, today the density
of trade, investment, and production networks across international borders raises even more the
costs of war. A Chinese invasion of Taiwan, to take one of the most plausible cases of a future
interstate war, would pose for the Chinese communist regime daunting economic costs, both
domestic and international. Taken together, these changes in the economy of violence mean that the
international system is far more primed for peace than the autocratic revivalists acknowledge.
U.S. nuclear primacy solves all conflict- superior capabilities to both Russia and China
are only increasing
Engdahl ’14 (William Engdahl is an award-winning geopolitical analyst and strategic risk consultant
whose internationally best-selling books have been translated into thirteen foreign languages, “US
missile shield: ‘Russian Bear sleeping with one eye open’”, http://rt.com/op-edge/us-missile-shield-russia-361/, February 17, 2014)
US nuclear primacy
In a 2006 interview with London’s Financial Times, then US Ambassador to NATO, former Cheney advisor Victoria Nuland— the same person today
disgraced by a video of her phone discussion with US Ukraine Ambassador Pyatt on changing the Kiev government (“Fuck the EU”) — declared that the US wanted a “globally deployable
military force” that would operate everywhere – from Africa to the Middle East and beyond—“all across our planet.” Nuland then declared that it would include Japan and Australia as well as
the NATO nations. She added, “It’s a totally different animal.” She was referring to BMD plans of Rumsfeld’s Pentagon.
As nuclear strategy experts warned at
that time, more than eight years ago, deployment of even a minimal missile defense , under the
Pentagon’s then-new CONPLAN 8022, would give the US what the military called “Escalation Dominance”—the ability to win a war at any level of violence, including nuclear war. As the authors of a
seminal Foreign Affairs article back in April 2006 noted: “Washington's continued refusal to eschew a first strike and the country's development of a limited missile-defense capability take on a
new, and possibly more menacing, look… A nuclear war-fighting capability remains a key component of the United States' military doctrine and nuclear primacy remains a goal of the United
States.” The two authors of the Foreign Affairs piece, Lieber and Press, went on to outline the real consequences of the current escalation of BMD in Europe (and as well against China in Japan): “. . . [T]he sort of missile defenses that the United States might plausibly deploy would be valuable primarily in an offensive context, not a defensive one—as an adjunct to a US first-strike capability, not as a stand-alone shield. If the United States launched a nuclear attack against Russia (or China), the targeted country would be left with only a tiny surviving arsenal—if any at all. At that point, even a relatively modest or inefficient missile defense system might well be enough to protect against any retaliatory strikes.” They concluded, “Today, for the first time in almost 50 years, the United States stands on the
verge of attaining nuclear primacy. It will probably soon be possible for the United States to
destroy the long-range nuclear arsenals of Russia or China with a first strike. This dramatic shift in the
nuclear balance of power stems from a series of improvements in the United States' nuclear
systems, the precipitous decline of Russia's arsenal, and the glacial pace of modernization of
China's nuclear forces.”
Even the newest scientific data doesn’t support nuclear winter
Seitz ‘11 (Russell, served as an Associate of The Center for International Affairs and a Fellow of the
Department of Physics at Harvard. He is presently chief scientist at Microbubbles LLC, Nuclear winter
was and is debatable, Nature, 7 July 2011, vol 475)
The nuclear winter concept is itself debatable (Nature 473, 275–276; 2011). This potential climate disaster, popularized in Science in 1983, rested on the output of a one-dimensional model that was later shown to overestimate the smoke a nuclear holocaust might engender. More
refined estimates, combined with advanced three-dimensional models (see go.nature.com/ kss8te), have
dramatically reduced the extent and severity of the projected cooling. Despite this, Carl Sagan, who co-authored the 1983 Science paper, went so far as to posit “the extinction of Homo sapiens” (C. Sagan Foreign Affairs 63, 75–77; 1984).
Some regarded this apocalyptic prediction as an exercise in mythology. George Rathjens of the
Massachusetts Institute of Technology protested: “Nuclear winter is the worst example of the
misrepresentation of science to the public in my memory,” (see go.nature.com/yujz84) and climatologist Kerry
Emanuel observed that the subject had “become notorious for its lack of scientific integrity” (Nature 319, 259;
1986). Robock's single-digit fall in temperature is at odds with the subzero (about -25°C) continental cooling
originally projected for a wide spectrum of nuclear wars. Whereas Sagan predicted darkness at noon
from a US-Soviet nuclear conflict, Robock projects global sunlight that is several orders of magnitude
brighter for a Pakistan-India conflict — literally the difference between night and day. Since 1983, the
projected worst-case cooling has fallen from a Siberian deep freeze spanning 11,000 degree- days
Celsius (a measure of the severity of winters) to numbers so unseasonably small as to call the very term
‘nuclear winter’ into question.
Counter-forcing solves escalation of wars
Mueller ‘9 (Woody Hayes Chair of National Security Studies and Professor of Political Science at Ohio
State University (John, “Atomic Obsession: Nuclear Alarmism from Hiroshima to Al-Qaeda” p. 8, Google
Books)
To begin to approach a condition that can credibly justify applying such extreme characterizations as societal annihilation, a full-out attack with hundreds, probably thousands, of thermonuclear bombs would be required. Even in such extreme cases, the area actually devastated by the bombs' blast and thermal pulse effects would be limited: 2,000 1-MT explosions with a destructive radius of 5 miles each would directly demolish less than 5 percent of the territory of the United States, for example. Obviously, if major population centers were targeted, this sort of attack could inflict massive casualties. Back in cold war days, when such devastating events sometimes seemed uncomfortably likely, a number of studies were conducted to estimate the consequences of massive thermonuclear attacks. One of the most prominent of these considered several probabilities. The most likely scenario--one that could be perhaps considered at least to begin to approach the rational--was a "counterforce" strike in which well over 1,000 thermonuclear weapons would be targeted at America's ballistic missile silos, strategic airfields, and nuclear submarine bases in an effort to destroy the country’s strategic ability to retaliate. Since the attack would not directly target population centers, most of the ensuing deaths would be from radioactive fallout, and the study estimates that from 2 to 20 million, depending mostly on wind, weather, and sheltering, would perish during the first month.15
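Mueller's "less than 5 percent" figure can be checked with back-of-the-envelope geometry. The sketch below is illustrative only, not part of the card; it assumes non-overlapping circular blast footprints and a US territory of roughly 3.8 million square miles:

```python
import math

# Mueller's figures: 2,000 one-megaton explosions, each with a
# destructive radius of about 5 miles (circular footprint assumed).
bombs = 2000
radius_miles = 5
us_area_sq_miles = 3_800_000  # approximate US territory (assumption)

# Total area demolished if no footprints overlap (an upper bound).
demolished = bombs * math.pi * radius_miles ** 2
share = demolished / us_area_sq_miles

print(f"{share:.1%}")  # prints 4.1%
```

Even this upper bound, which ignores overlapping footprints, lands near 4 percent, consistent with the card's claim.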
No miscalc or escalation or loose nukes—every crisis ever disproves and neither side
would escalate
Quinlan ‘9 (Michael, Former Permanent Under-Sec. State – UK Ministry of Defense, “Thinking about
Nuclear Weapons: Principles, Problems, Prospects”, p. 63-69) *we don’t endorse gendered language
Even if initial nuclear use did not quickly end the fighting, the supposition of inexorable momentum in a developing
exchange, with each side rushing to overreaction amid confusion and uncertainty, is implausible. It fails to consider what the
situation of the decisionmakers would really be.
Neither side could want escalation. Both would be appalled at
what was going on. Both
would be desperately looking for signs that the other was ready to call a halt. Both, given the
capacity for evasion or concealment which modem delivery platforms and vehicles can possess, could have in reserve
significant forces invulnerable enough not to entail use-or-lose pressures. (It may be more open to question, as noted
earlier, whether newer nuclear-weapon possessors can be immediately in that position; but it is within reach of any substantial state with advanced technological
capabilities, and attaining it is certain to be a high priority in the development of forces.) As a result, neither
side can have any predisposition
to suppose, in an ambiguous situation of fearful risk, that the right course when in doubt is to go on copiously launching
weapons. And none of this analysis rests on any presumption of highly subtle or pre-concerted rationality. The rationality required is plain. The argument is reinforced if we consider the possible reasoning of an aggressor at a more dispassionate level. Any substantial nuclear
armoury can inflict destruction outweighing any possible prize that aggression could hope to seize. A state attacking the
possessor of such an armoury must therefore be doing so (once given that it cannot count upon destroying the armoury pre-emptively)
on a judgement that the possessor would be found lacking in the will to use it. If the attacked possessor
used nuclear weapons, whether first or in response to the aggressor's own first use, this judgement would begin to look dangerously
precarious. There must be at least a substantial possibility of the aggressor leaders' concluding that their initial
judgement had been mistaken—that the risks were after all greater than whatever prize they had been seeking, and that for their own country's
survival they must call off the aggression. Deterrence planning such as that of NATO was directed in the first place to preventing the initial
misjudgement and in the second, if it were nevertheless made, to compelling such a reappraisal. The former aim had to have primacy, because it could not be taken
for granted that the latter was certain to work. But there was no ground for assuming in advance, for all possible scenarios, that the chance of its working must be
negligible. An
aggressor state would itself be at huge risk if nuclear war developed, as its leaders would
know. It may be argued that a policy which abandons hope of physically defeating the enemy and simply hopes to get him to desist is pure gamble, a matter of
who blinks first; and that the political and moral nature of most likely aggressors, almost ex hypothesi, makes them the less likely to blink. One response to this is to
ask what is the alternative—it can only be surrender. But a more positive and hopeful answer lies in the fact that the criticism is posed in a political vacuum. Real-life
conflict would have a political context. The context which concerned NATO during the cold war, for example, was one of defending vital interests against a
postulated aggressor whose own vital interests would not be engaged, or would be less engaged. Certainty is not possible, but a clear asymmetry of vital interest is
a legitimate basis for expecting an asymmetry, credible to both sides, of resolve in conflict. That places upon statesmen, as page 23 has noted, the key task in
deterrence of building up in advance a clear and shared grasp of where limits lie. That was plainly achieved in cold-war Europe. If vital interests have been defined in
a way that is dear, and also clearly not overlapping or incompatible with those of the adversary, a credible basis has been laid for the likelihood of greater resolve in
resistance. It was also sometimes suggested by critics that whatever might be indicated by theoretical discussion of political will and interests, the military
environment of nuclear warfare—particularly difficulties of communication and control—would drive escalation with overwhelming probability to the limit. But it is
obscure why matters should be regarded as inevitably so for every possible level and setting of action. Even if the history of war suggested (as it scarcely does) that
military decision-makers are mostly apt to work on the principle 'When in doubt, lash out', the nuclear revolution creates an utterly new situation. The pervasive
reality, always plain to both sides during the cold war, is ‘If this goes on to the end, we are all ruined’. Given
that inexorable escalation would
mean catastrophe for both, it would be perverse to suppose them permanently incapable of framing
arrangements which avoid it. As page 16 has noted, NATO gave its military commanders no widespread delegated authority, in peace or war, to
launch nuclear weapons without specific political direction. Many types of weapon moreover had physical safeguards such as
PALs incorporated to reinforce organizational ones. There were multiple communication and control
systems for passing information, orders, and prohibitions. Such systems could not be totally guaranteed against disruption if at a
fairly intense level of strategic exchange—which was only one of many possible levels of conflict— an adversary judged it to be in his interest to weaken political
control. It was far from clear why he necessarily should so judge. Even then, however, it
remained possible to operate on a general fail-safe presumption: no authorization, no use. That was the basis on which NATO operated. If it is feared that the arrangements which a
nuclear-weapon possessor has in place do not meet such standards in some respects, the logical course is to continue to improve them rather than to assume escalation to be certain and uncontrollable, with all the enormous inferences that would have to flow from such an assumption. The likelihood of escalation can never be 100 per cent, and never zero. Where between those two extremes it may lie can never be precisely calculable in advance; and even were it so calculable, it would not be uniquely fixed—it would stand to vary hugely with circumstances. That there should be any risk at all of escalation to widespread nuclear war must be deeply disturbing, and decision-makers would always have to
weigh it most anxiously. But a pair of key truths about it need to be recognized. The first is that the risk of escalation to large-scale nuclear war is inescapably
present in any significant armed conflict between nuclear-capable powers, whoever may have started the conflict and whoever may first have used any particular
category of weapon. The initiator of the conflict will always have physically available to him options for applying more force if he meets effective resistance. If the
risk of escalation, whatever its degree of probability, is to be regarded as absolutely unacceptable, the necessary inference is that a state attacked by a substantial
nuclear power must forgo military resistance. It must surrender, even if it has a nuclear armoury of its own. But the companion truth is that, as page 47 has noted,
the risk of escalation is an inescapable burden also upon the aggressor. The exploitation of that burden is the crucial route, if conflict does break out, for managing
it, to a tolerable outcome--the only route, indeed, intermediate between surrender and holocaust, and so the necessary basis for deterrence beforehand. The
working out of plans to exploit escalation risk most effectively in deterring potential aggression entails further and complex issues. It is for example plainly desirable,
wherever geography, politics, and available resources so permit without triggering arms races, to make provisions and dispositions that are likely to place the onus
of making the bigger, and more evidently dangerous steps in escalation upon the aggressor who wishes to maintain his attack, rather than upon the defender. (The
customary shorthand for this desirable posture used to be 'escalation dominance'.) These issues are not further discussed here. But addressing them needs to start
from acknowledgement that there are in any event no certainties or absolutes available, no options guaranteed to be risk-free and cost-free. Deterrence is not
possible without escalation risk; and its presence can point to no automatic policy conclusion save for those who espouse outright pacifism and accept its
consequences.
Accident and Miscalculation
Ensuring the safety and security of nuclear weapons plainly needs to be taken most seriously. Detailed information is understandably not published, but such direct evidence as there is suggests that it
always has been so taken in every possessor state, with the inevitable occasional failures to follow strict
procedures dealt with rigorously. Critics have nevertheless from time to time argued that the possibility of accident involving nuclear weapons is
so substantial that it must weigh heavily in the entire evaluation of whether war-prevention structures entailing their existence should be tolerated at all. Two sorts
of scenario are usually in question. The first is that of a single grave event involving an unintended nuclear explosion—a technical disaster at a storage site, for
example, or the accidental or unauthorized launch of a delivery system with a live nuclear warhead. The second is that of some event—perhaps such an explosion or launch, or some other mishap such as malfunction or misinterpretation of radar signals or computer systems—initiating a sequence of response and counter-response that culminated in a nuclear exchange which no one had truly intended. No event that is physically possible can be said to be of absolutely zero probability
(just as at an opposite extreme it is absurd to claim, as has been heard from distinguished figures, that nuclear-weapon use can be guaranteed to happen within
some finite future span despite not having happened for over sixty years). But human affairs cannot be managed to the standard of either zero or total probability.
We have to assess levels between those theoretical limits and weigh their reality and implications against other factors, in security planning as in everyday life.
There have certainly been, across the decades since 1945, many known accidents involving nuclear weapons, from
transporters skidding off roads to bomber aircraft crashing with or accidentally dropping the weapons they carried (in past days when such carriage was a frequent feature of readiness arrangements—it no longer is). A few of these accidents may have released
into the nearby environment highly toxic material. None however has entailed a nuclear detonation. Some commentators suggest that
this reflects bizarrely good fortune amid such massive activity and deployment over so many years. A more rational deduction from the facts of this long experience
would however be that the probability of any accident triggering a nuclear explosion is extremely low. It might be further noted that the mechanisms needed to set off such an explosion are technically demanding, and that in a large number of ways the past sixty years have seen extensive improvements in safety arrangements for both the design and the handling of weapons. It is undoubtedly possible to see respects in which, after the cold war, some of the factors bearing
upon risk may be new or more adverse; but some are now plainly less so. The years which the world has come through entirely without
accidental or unauthorized detonation have
included early decades in which knowledge was sketchier,
precautions were less developed, and weapon designs were less ultra-safe
than they later became, as
well as substantial periods in which weapon numbers were larger, deployments more widespread and
diverse, movements more frequent, and several aspects of doctrine and readiness arrangements more
tense. Similar considerations apply to the hypothesis of nuclear war being mistakenly triggered by false alarm. Critics again point to the fact, as it is understood,
of numerous occasions when initial steps in alert sequences for US nuclear forces were embarked upon, or at least called for, by indicators mistaken or misconstrued. In none of these instances, it is accepted, did matters get at all near to nuclear launch—extraordinary good fortune again, critics have suggested. But the rival and more logical inference from hundreds of events stretching over sixty years of experience presents itself once more: that the probability of initial misinterpretation leading far towards mistaken launch is remote. Precisely because any nuclear-weapon possessor recognizes the vast gravity of any launch, release sequences have many steps, and human decision is repeatedly interposed as well as capping the sequences. To convey that because a first step was prompted the world somehow came close to accidental nuclear war is wild hyperbole, rather like asserting, when a tennis champion has lost his opening service game, that he was
nearly beaten in straight sets. History anyway scarcely
offers any ready example of major war started by accident even
before the nuclear revolution imposed an order-of-magnitude increase in caution. It was occasionally conjectured
that nuclear war might be triggered by the real but accidental or unauthorized launch of a strategic nuclear-weapon delivery system in the direction of a potential
adversary. No
such launch is known to have occurred in over sixty years. The probability of it is therefore very
low. But even if it did happen, the further hypothesis of it initiating a general nuclear exchange is far-fetched. It
fails to consider the real situation of decision-makers as pages 63-4 have brought out. The notion that cosmic
holocaust might be mistakenly precipitated in this way belongs to science fiction.
1AC Solvency
Organs can be produced through stem cell bioprinting now, but a clear regulatory
framework is key to the sales market
Gwinn 14 (James Gwinn is a rising senior at the University of Kentucky – Paducah Campus, where he
will graduate with dual degrees in Economics and Mechanical Engineering. ASME helps the global
engineering community develop solutions to real world challenges. Founded in 1880 as the American
Society of Mechanical Engineers, ASME is a not‐for‐profit professional organization that enables
collaboration, knowledge sharing and skill development across all engineering disciplines, while
promoting the vital role of the engineer in society. ASME codes and standards, publications,
conferences, continuing education and professional development programs provide a foundation for
advancing technical knowledge and a safer world. ASME’s mission is to serve diverse global communities
by advancing, disseminating and applying engineering knowledge for improving the quality of life; and
communicating the excitement of engineering*. “Bioprinting: Organs on Demand” pg. 15)
Breakthroughs in bioprinting are being made regularly, but there is currently no clearly defined regulatory framework in place to ensure the safety of these products. (8) Many of the best and brightest minds in the world are working to bring bioprinted products to the marketplace; however, the potential and functional limitations of bioprinting are not yet fully understood. The technology, as a whole, is so new that public policy has not had the opportunity to catch up to the current state of the industry. (9) Products made via bioprinting technology span a number of product review divisions within the FDA due to the wide range of potential applications. (10) Additionally, FDA regulations for biosimilar biologics† do not yet address biosimilarity between human embryonic stem cells (hESCs) and induced pluripotent stem cells (iPS cells). The FDA evaluates all devices, including any that utilize 3‐D printing technology, for safety and effectiveness, and appropriate benefit and risk determination, regardless of the manufacturing technologies used.
In the US, a number of regulatory and legislative hurdles must be cleared before the first lab‐printed kidney, liver, or heart implant will make it to market. As it is with all biologics, the critical regulatory challenges with bioprinted organs will revolve around demonstrating the safety of the final product and establishing consistent manufacturing methods. (11) There are also a number of technological advancements: software needs refinement; advances in regenerative medicine must be made; more sophisticated printers must be developed; and thorough testing of the products must be conducted. (12) To ensure that bioprinted products reach the marketplace in a safe and timely manner, an effective game plan will need to be enacted. (13) This plan would include clearly identified goals, well‐established short‐ and long‐term expectations, and the creation of models and actions for linking investments to outputs. Additionally, the plan ought to clearly identify roles and responsibilities, milestones and metrics, and reasonable time frames.
Only Congress clarifying the Dickey-Wicker amendment to legalize stem cell markets
generates long-term stability for research scientists
Cummings 10 (Layla, JD UNC School of Law. “SHERLEY V. SEBELIUS: A CALL TO CONGRESS TO
EXPLICITLY SUPPORT MEDICAL RESEARCH ON HUMAN EMBRYONIC STEM CELLS” NORTH CAROLINA
JOURNAL OF LAW & TECHNOLOGY 12 N.C. J.L. & TECH. ON. 77 (2010) pg lexis)
It is imperative that Congress changes the language of the Dickey-Wicker Amendment while
there is an appeal pending and before the preliminary injunction can be reinstated. With the latest advances,
including the start of an FDA-approved trial using hESCs, it is more important than ever to secure funding for this type of research. Francis Collins, Director of the NIH and a named defendant in Sherley, stated in reference to the current state
of hESC research, "[t]his is one of the most exciting areas of the broad array of engines of discovery that NIH supports. This decision has just poured sand into that engine of discovery." In order to reverse
the negative consequences the district court's decision has had on the scientific community,
Congress needs to amend or repeal the Dickey-Wicker Amendment. This would give scientists the confidence
and stability necessary to pursue research that can potentially benefit those with currently incurable conditions. The decision in Sherley rested
on the definition of the word "research" as definitively meaning "a systematic investigation."7 In the government's memorandum to the U.S.
Court of Appeals in support of a stay of the preliminary injunction, defendants' counsel contests the overly-broad definition of the word
"research" in favor of a more narrow definition or, in the alternative, reading the word "research" in context of the surrounding text." While
such textualism is common practice in the judicial arena, it gives the appearance of splitting hairs over something necessarily subjective and
relatively insignificant. 9 Potential clinical
treatments involving stem cell research can take years to develop
and serious commitment on the part of researchers in the field. 90 Given the social and political debate
surrounding this issue and the high stakes of the research involved, it is probable that the loser at the
appellate level will try to take the issue to the Supreme Court. This will likely be a long and drawn
out process. To let this issue play out in the courts where opposing counsel will argue over
the scope of the word "research" will result in a loss of confidence in the federal
government. Congress has had the opportunity to clarify the language of the Dickey-Wicker
Amendment every fiscal year for over a decade, but has failed to do so in favor of permitting
the long-standing agency interpretation. The Rabb memorandum presented a way for HHS and the NIH to fund critical
research in the face of an explicit appropriations limitation. However, an
agency's adoption of a legal opinion does not
have the same force as direct congressional action. At this point, only the courts or the legislature
have the authority to decide if the new Guidelines can be implemented. As a matter of policy, this is
more appropriately settled by the legislature where it can be debated and viewed as a
whole issue rather than as a matter of statutory interpretation.
C. Congress Should Pass a Comprehensive Stem Cell Bill
In addition to narrowing the Dickey-Wicker Amendment, Congress should pass a
bill that will expressly state its intentions regarding embryonic stem cell research. Preferably,
such a bill will codify President Obama's executive order and open the door to a transparent set of rules that will
regulate future hESC research. There is already a bill in Committee that would accomplish this objective. The Stem Cell Research
Advancement Act of 2009 was introduced to the House of Representatives on March 10, 2010, one year after President Obama's executive
order was signed. 92 On September 13, 2010, a companion bill was introduced to the Senate, similarly titled the Stem Cell Research
Advancement Act of 2010.93 These bills call for the support of stem cell research, explicitly including embryonic stem cell research where the
stem cells were derived from excess embryos donated from in vitro fertilization clinics. 94 Additionally, the bills require that a consultation with
the donors be conducted to ensure that the embryos would otherwise be discarded and those individuals donating their embryos provide
written, informed consent without receiving financial or other inducement. 95 The legislation would require the NIH to maintain guidelines and
update them every three years or as "scientifically warranted."96 These provisions, if enacted, would provide a codification of President
Obama's executive order. The only notable difference between the bills is that the more recent Senate bill explicitly states that this act shall not
supersede section 509 of the most recent HHS appropriations bill. 97 This is a direct reference to the language of the Dickey-Wicker
Amendment restricting funding of research that results in the destruction of embryos. Both the House and Senate bills would amend the Public
Health Service Act98 "[n]otwithstanding any other provision of law," so technically the additional language found in the Senate bill would make
no applicable difference. 99 However, the new language does evidence the Senate's intent to comply with the Dickey-Wicker Amendment. In
order for this bill to have its intended effect, the appellate courts will have to uphold the Rabb interpretation of the Dickey-Wicker
Amendment. The Rabb interpretation is essentially a legal workaround that should not serve as a permanent solution, but rather should have
worked as merely a stopgap until Congress could act.
Congress should make it a priority to get this legislation
through committee and onto the floor for debate. Also, those representatives who support stem
cell research should make it a goal to amend the Dickey-Wicker Amendment to allow for research on
pre-implantation stage embryos obtained under the ethical boundaries set forth in the Act. The Bush Administration succeeded in
delaying important research, and President Obama tried to reverse the policy through an executive order. Unfortunately, with the recent court
ruling, the Obama Administration's effort may fail. Relying
on the appellate process to vindicate the agency
interpretation of the Amendment is a gamble. The more appropriate solution is to have our
elected representatives pass legislation that will reassure the research community of
continued funding.
V. CONCLUSION
The decision in Sherley v. Sebelius is a setback for potentially life-saving medical research. Public
funding of embryonic stem cell research benefits the public welfare. Therefore, Congress
should take action that will explicitly
support this type of research. In this economy, the absence of public funding could truly hinder
further medical breakthroughs involving human embryonic stem cells.
Judge Lamberth rested his
holding in the case on the language of the
Dickey-Wicker Amendment. The Amendment, which was added to the appropriations bill
before embryonic stem cells could be isolated and grown in culture, is outdated and should be narrowed to allow for
research that has been implicitly approved by Congress and the Executive Branch for over ten years.
Furthermore, Congress should finally pass a stem cell bill that reflects the executive order issued by President Obama in 2009. Scientists
still have a lot to learn from studying human embryonic stem cells, and we should allow those scientists
the opportunity to decide which cells to study and to what extent." American scientists should have the
best opportunity to follow through on the promising research they have been conducting and an
incentive to continue with progressive research. Otherwise, the denial of federal funds could
mean the denial of hope
for many Americans struggling with debilitating diseases.
Certainty key- otherwise, scientists simply won’t research embryonic stem cells
Levine 11 (Aaron, School of Public Policy and Institute of Bioengineering and Bioscience, Georgia
Institute of Technology. Policy Uncertainty and the Conduct of Stem Cell Research
http://www.sciencedirect.com/science/article/pii/S1934590911000038)
One consequence of the ethical controversy inspired by human embryonic stem cell (hESC) research has been an
atypically uncertain policy environment. For stem cell scientists in the United States and, in particular, those scientists working with hESCs, frequent policy changes have made the years since these cells were first derived (Thomson et al., 1998) something of a roller coaster. Similar challenges have faced stem cell scientists around the world, as numerous countries in
Europe, South America, and Asia, as well as the European Union as a whole, have engaged in protracted debates over stem cell policy (see
Gottweis et al., 2009 for a discussion of global stem cell policy debates). In
the United States, scientists have faced several
hESC policy changes (reviewed in Gottweis, 2010). First, following a legal review, the Clinton Administration adopted a policy in August
2000 that permitted federal funding of hESC research, but not the derivation of new hESC lines (65 Fed. Reg. 51,975). Before any grants could
be funded, however, the Bush Administration put this policy on hold and President Bush announced a new policy in August 2001 limiting
federal funding to research using hESC lines derived prior to the date of his speech. Although this policy remained in place for nearly eight
years, uncertainty persisted. Congress, for instance, twice passed legislation to overturn the temporal restrictions central to the policy, yet President Bush vetoed both these bills. During the Bush Administration, stem cell policy was frequently addressed at the state level, with some states supporting stem cell research and others restricting it, creating one of the many heterogeneous "policy patchworks" that have become typical of the field, even on an international scale (Caulfield et al., 2009). Supportive state policies aimed to provide a workaround for scientists affected by federal funding restrictions, yet even these programs were plagued by uncertainty, as legal challenges and state budget problems hindered their implementation.
California's stem cell program, for instance, was delayed for nearly 2 and a half years by litigation, causing difficulties for scientists considering
starting new stem cell projects or moving to new institutions. California's funding is now flowing and the state has awarded more than $1
billion, yet the future of this program remains uncertain as the end of its 10 year term approaches (see Karmali et al., 2010 for a recent review
of state stem cell funding). More recently, at the federal level, the Obama Administration adopted a new stem cell research policy in
July 2009 (74 Fed. Reg. 32,170), only to throw the field into chaos when scientists realized the limited number of hESC lines that had been
eligible for federal funding during the Bush Administration were no longer on the approved list and needed to be reevaluated. Key hESC lines,
including the two most heavily studied lines, have since been added to the registry, but not before months of uncertainty during which some
scientists were placed in the awkward position of choosing to delay projects until their preferred cell lines were approved or switching to other
lines and facing the delays associated with reoptimizing experimental protocols. A
legal challenge filed following the
promulgation of the Obama Administration's policy adds additional uncertainty to the field. This
challenge claims that the Obama Administration's policy violates the Dickey-Wicker Amendment, a rider
added to the Department of Health and Human Services appropriations bill each year since fiscal year 1996. This lawsuit received minimal
attention from the scientific community until August 23, 2010 when U.S. District Court Judge Royce Lamberth granted the plaintiffs' request for
a preliminary injunction barring implementation of the Obama Administration's policy pending the outcome of the court case. This
ruling
led the NIH to suspend funding and review of pending hESC research proposals as well as evaluation of new hESC
lines (see Gottweis, 2010 for a general discussion, U.S. NIH Notice NOT-OD-10-126 for details). The Obama Administration appealed and on
September 9, 2010 the U.S. Court of Appeals for the District of Columbia enjoined the preliminary injunction, allowing the NIH to resume
funding hESC research while the case proceeded. Both
the ultimate outcome of this case and the length of time
before the outcome is known are uncertain, placing some scientists in the situation of checking the news each day to
determine the legal status of their research (Harmon, 2010). Although the ultimate outcome of the litigation will depend on
statutory interpretation of the Dickey-Wicker Amendment, much of the legal wrangling thus far has focused
on the issue of potential harm to stem cell scientists associated with these policy changes. In his ruling announcing the injunction, Judge
Lamberth concluded that the plaintiffs—two adult stem cell scientists—would “suffer irreparable injury in the absence of the injunction” due to
increased competition for limited federal research funding, while the ruling “would not seriously harm ESC researchers because the injunction
would simply preserve the status quo and would not interfere with their ability to obtain private funding for their research” (U.S. District Court
for the District of Columbia). In its appeal, the Obama Administration disagreed, arguing that the harm to the plaintiffs was speculative and
“cannot outweigh the disruption or ruin of research into promising treatments for the most debilitating illnesses and injuries” caused by the
preliminary injunction (U.S. Court of Appeals for the D.C. Circuit). Despite this ongoing legal debate in the United States and the prevalence of
policy uncertainty in this field around the world, relatively few empirical studies address these issues. In order to begin to fill this gap, this
Forum reports responses from 370 individuals who participated in a survey of U.S. stem cell scientists in November 2010 and assesses the
reported impact of the preliminary injunction and ongoing uncertainty about the future of federal funding for hESC research on their work (see
Supplemental Information available online for details of survey design and analysis strategies employed). These data show that both Judge
Lamberth's ruling and the
ongoing uncertainty have had a substantial impact on stem cell scientists
and illustrate that this impact extends beyond hESC scientists to affect, often negatively, a larger group of stem cell scientists.
Status quo state and private efforts fail and destroy collaboration- only federal
regulation commercializes stem cell organ technology
Simson 2009 (Sylvia E. Simson B.A., New York University; J.D., Brooklyn Law School (expected 2009);
Executive Articles Editor, Brooklyn Journal of International Law (2008-2009) NOTE: BREAKING BARRIERS,
PUSHING PROMISE: AMERICA'S NEED FOR AN EMBRYONIC STEM CELL REGULATORY SCHEME 34
Brooklyn J. Int'l L. 531 pg lexis)
Despite the fact that embryonic stem cells are regarded as the holy grail of medicine, there is still
no American federal regulatory scheme in place to deal with such research. n8 During his
administration, President George W. Bush twice vetoed legislation that would support, promote, and fund embryonic stem cell research, n9
and consequently,
individual [*533] states have chosen not to wait. n10 Several states have not only legalized
embryonic stem cell research, but also authorized millions in funding and [*534] developed institutes to administer state
stem cell research programs. n11 California, for example, passed Proposition 71 in November of 2004, n12 which provided $ 3 billion in funding
for [all kinds of] stem cell research at California universities and research institutions . . . and called for the establishment of a new state agency
[the California Institute for Regenerative Medicine] to make grants and provide loans for stem cell research, research facilities and other vital
research opportunities. n13 However, there are
also states that specifically prohibit most or all forms of embryonic
stem cell research, like Arkansas, Louisiana, North Dakota, and South Dakota. n14 Other states, like Iowa, permit embryonic stem cell
research but do not fund the effort. n15 States like North Carolina and West Virginia have no law on the subject. n16 And some states are
[*535] deadlocked on the issue, like Florida. n17 Clearly, "a void of nationally cohesive regulation on the issue [of embryonic stem cell research] remains," n18 and this lack of uniformity among states will cause only some states to prosper economically and medically, with others lagging behind.
n19 It
will also be difficult for researchers to engage in interstate collaboration. n20 Most
importantly, however, this wide range of embryonic stem cell research policy and regulation makes the
United States look polarized and in disarray, with the more "blue" states surging ahead with research and the more "red"
states sticking to a conservative approach, likely due to religious influence. n21 It
is necessary for the United States to
construct a federal framework of rules and guidelines to govern the use of embryos for research
purposes, particularly since American society is one that aspires towards both government [*536] monitoring and a green light for research. Countries like the United Kingdom n22 have thorough regulation for
embryonic stem cell research, and even Germany, which has been notoriously "conservative about genetic research," n23 has passed the Stem
Cell Act of 2008, which allows for the importation of "human embryonic stem cell lines that were extracted before May 1, 2007." n24 In
order for the United States to stay at the forefront of medical research, be able to develop new drugs to cure disease, and be able to pioneer new technologies to aid in the transplantation
of organs and tissue, our nation needs to
dispel ambiguities and
unite our country's states with
thorough regulation that supports and funds embryonic stem cell research.